Our second Ableton Spaces workshop at Together 5 featured Data Garden explaining how they’re using the program to create music with signals generated by plants.
Our Data Garden hosts began by explaining that plants have more refined and developed senses than human beings do. Plants sense chemical changes in the air, and the data they output can change based on any number of changes in a room. Multiple plants in a room will put out different data than a single plant, and a person entering a room can change the signal too. All of this was conveniently demonstrated by a crowd passing by the presentation space and completely changing the data output from the plant on hand.
A series of sensors attached to that plant was run through small pieces of hardware that send MIDI data to Ableton, essentially allowing the plant to “play” Live’s native instruments. Various MIDI effects, an arpeggiator, and Live’s Simple Delay in Repitch mode were also employed to create and process the plant’s sounds. The audience’s reaction to the sounds being generated was also changing the data output, creating an elaborate cycle of biofeedback in the music.
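Data Garden didn’t detail their exact mapping, but the general idea of turning a fluctuating biosignal into MIDI notes can be sketched as follows. This is a hypothetical illustration, not their actual code: it assumes a normalized sensor reading between 0 and 1 and quantizes it to a pentatonic scale, the kind of mapping that keeps an unpredictable signal sounding musical.

```python
# Hypothetical sketch: quantizing a plant's normalized sensor reading to a
# MIDI note number. The scale choice, range, and normalization are all
# assumptions for illustration, not Data Garden's published method.

C_MAJOR_PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees, in semitones


def reading_to_midi_note(reading, lo=0.0, hi=1.0, base_note=48, octaves=3):
    """Map a sensor reading in [lo, hi] to a note in a pentatonic scale."""
    # Clamp and normalize the reading into [0, 1)
    t = min(max((reading - lo) / (hi - lo), 0.0), 0.999999)
    # Pick one of (octaves * scale length) evenly spaced scale steps
    steps = octaves * len(C_MAJOR_PENTATONIC)
    idx = int(t * steps)
    octave, degree = divmod(idx, len(C_MAJOR_PENTATONIC))
    return base_note + 12 * octave + C_MAJOR_PENTATONIC[degree]
```

A stream of such note numbers could then be sent as MIDI note-on messages into Live, where the arpeggiator and delay described above do the rest of the shaping.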
Data Garden cited Brian Eno’s Music For Airports as a major inspiration for what they do. They described Eno’s process of composing the pioneering ambient work, winding tape loops of various lengths around an entire room and taking a nap in the middle of it, and called it similar to their own system: they set it up and simply allow the plants to play, uninterrupted by the human element.
Their systems work best on hardier tropical plants. Sensors must be attached to leaves of a certain thickness and shape, such that the plant is not harmed in the process. Plants, Data Garden explained, are much like pets. Humans have domesticated them and brought them into our lives for a reason.
Data Garden also previewed their next project: the MIDI Sprout, a device that converts plant data into MIDI to drive a keyboard or synthesizer without a computer as an intermediary.