Our second Ableton Spaces workshop at Together 5 featured Data Garden explaining how they’re using the program to create music with signals generated by plants.
Our Data Garden hosts began by explaining that plants have senses that are, in some respects, more refined and developed than our own. Plants sense chemical changes in the air, and the data they output shifts with any number of changes in a room: multiple plants in a room will put out different data than a single plant, and a person entering the room can change the signal too. All of this was conveniently demonstrated when a crowd passed by the presentation space and completely changed the data output from the plant on hand.
A series of sensors attached to that plant were run through small pieces of hardware to send MIDI data to Ableton, essentially allowing the plant to “play” Ableton’s native instruments. Various MIDI effects, an arpeggiator and Live’s Simple Delay in Repitch mode were also employed to create and process the plant’s sounds. Human reactions to the sounds being generated were also changing the data output, creating an elaborate cycle of biofeedback in the music.
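Data Garden didn’t walk through the specifics of their mapping, but the core idea – turning a fluctuating biosignal into MIDI notes – can be sketched in a few lines of Python. Everything below (the scale choice, the ranges, the function names) is a hypothetical illustration, not their implementation:

```python
# Hypothetical sketch: map fluctuations in a plant sensor's normalized
# readings to MIDI note numbers, quantized to a C-based pentatonic scale.
# Scale, thresholds, and names are invented for illustration only.

PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees, in semitones above the root

def reading_to_note(reading, lo=0.0, hi=1.0, root=48, octaves=3):
    """Map a sensor reading in [lo, hi] to a pentatonic MIDI note number."""
    reading = max(lo, min(hi, reading))       # clamp out-of-range readings
    steps = len(PENTATONIC) * octaves         # total notes available
    idx = min(int((reading - lo) / (hi - lo) * steps), steps - 1)
    octave, degree = divmod(idx, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]

def delta_to_velocity(prev, curr, gain=400):
    """Bigger jumps in the signal produce louder notes (velocity 1-127)."""
    return max(1, min(127, int(abs(curr - prev) * gain)))

# A short stream of fake readings, as if taken from the leaf sensor:
stream = [0.10, 0.12, 0.35, 0.80, 0.78]
notes = [reading_to_note(r) for r in stream]
```

In a real rig, the note and velocity values would be sent as MIDI note-on messages into Live, where the arpeggiator and delay described above do the rest.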
Data Garden cited Brian Eno’s Music For Airports as a major inspiration for what they do. They described Eno’s process of composing the pioneering ambient work by winding various lengths of tape loops around an entire room and taking a nap in the middle of it, and called it similar to their own system. They set it up and simply allow the plants to play, uninterrupted by the human element.
Their systems work best on hardier tropical plants. Sensors must be attached to leaves of a certain thickness and shape, such that the plant is not harmed in the process. Plants, Data Garden explained, are much like pets. Humans have domesticated them and brought them into our lives for a reason.
Data Garden also previewed their next project: the MIDI Sprout. It’s a device that processes plant data into MIDI data to drive a keyboard or synthesizer, without a computer as a middleman.
As part of the Ableton Spaces workshops at Together 5 this weekend, District Hall hosted singer/songwriter/producer Natasha Kmeto for a demonstration and discussion of how she employs Ableton for live performance.
Ms. Kmeto first outlined her hardware setup, which includes a Korg Mini synthesizer and an Akai MPD. Drum pads on the Akai controller are set to trigger loops and scenes organized in Ableton’s Session View. Elsewhere on the MPD, faders are assigned to control Ableton’s reverb, delay and beat repeat effects on her vocals. She also employs side-chain compression to create a “pumping” audio effect. Kmeto mentioned that the vocal effects – delay in particular – are useful for transitions in and between songs.
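Side-chain compression ducks one signal (a vocal or pad, say) whenever another signal – typically the kick drum – gets loud, which is what produces the “pumping” effect Kmeto described. A toy sketch of the idea in Python, not her actual effect chain (real compressors use smoothed envelopes with attack and release times, which are omitted here):

```python
# Toy side-chain ducking: reduce the sustained signal's gain wherever the
# trigger (kick) signal exceeds a threshold. Illustrative only; values and
# names are invented, and envelope smoothing is deliberately left out.

def sidechain_duck(signal, trigger, threshold=0.5, ratio=4.0):
    """Return signal with gain divided by `ratio` wherever |trigger| > threshold."""
    out = []
    for s, t in zip(signal, trigger):
        if abs(t) > threshold:
            out.append(s / ratio)   # duck while the kick is hitting
        else:
            out.append(s)           # pass through otherwise
    return out

pad  = [0.8, 0.8, 0.8, 0.8]   # sustained pad at a constant level
kick = [1.0, 0.2, 1.0, 0.0]   # kick hits on beats 1 and 3
ducked = sidechain_duck(pad, kick)
```

The pad drops to a quarter of its level on every kick hit and swells back in between – the characteristic pump.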
Kmeto also uses Ableton’s looping functions for vocals, with buttons on the MPD mapped for recording, starting and stopping the loops. She builds each song from pre-existing stems during live performances, rather than looping all of the synth and vocal parts live. This, she said, makes for a less tedious and more immediate set. Still, her MPD is set up to allow flexibility and improvisation in her performances.
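The record/start/stop button mapping Kmeto described can be pictured as a small state machine. The sketch below is a generic illustration of that idea – Ableton handles all of this internally once the buttons are MIDI-mapped, and none of these names reflect her actual rack:

```python
# Minimal looper state machine, illustrating how mapped record/play/stop
# buttons might control a vocal loop. Purely illustrative; frame contents
# stand in for audio, and the class is not part of any real Ableton API.

class Looper:
    def __init__(self):
        self.buffer = []        # recorded "frames" (stand-ins for audio)
        self.state = "stopped"  # stopped -> recording -> playing -> stopped

    def press_record(self):
        self.buffer = []        # a new recording replaces the old loop
        self.state = "recording"

    def feed(self, frame):
        if self.state == "recording":
            self.buffer.append(frame)

    def press_play(self):
        if self.buffer:         # can only play once something is recorded
            self.state = "playing"

    def press_stop(self):
        self.state = "stopped"

looper = Looper()
looper.press_record()
for frame in ["la", "la", "ooh"]:
    looper.feed(frame)
looper.press_play()
```

Mapping each `press_*` transition to a pad is exactly the kind of one-button control that keeps a live looping set immediate rather than tedious.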
As far as specific sample packs and VSTs go, Kmeto cited Goldbaby’s 808 and 909 sets as her favorite drum packs and Valhalla as her go-to reverb. Software effects, she explained, made touring and performing live much more cost-effective than traveling with complex rack-mounted gear.
Kmeto described her creative process as starting with a title, a mood or an emotion before melody and structure enter the equation. Her ultimate goal, she said, was for her music to be “emotionally honest.” She composes and records using analog synthesizers and Logic, and ultimately bounces her stems as audio to Ableton to adapt the finished songs for live performance. Ableton’s flexible nature makes it the ideal performance tool for Kmeto, and a perfect solution for DJing as well, she said.
Together 5’s not over yet – be sure to check the schedule for daytime events at District Hall throughout the rest of this weekend.