by Daniel Cole

Making waves with Suzanne Ciani

Synth and experimental music pioneer Suzanne Ciani talks about her background, approach to performing live and her collaborative work in this in-depth conversation.

As one of the pioneering synth and experimental music figureheads in the US, Suzanne Ciani’s influence has reverberated through generations of musicians and sound designers. But it was the influence of another synth legend that would prove pivotal in her own musical life. While studying music composition at Berkeley in the late ’60s, Ciani met analogue instrument designer Don Buchla. Fascinated with the sonic possibilities of his creations, she would become Buchla’s apprentice, helping him assemble the instruments that share his name. Ciani’s prowess on the Buchla became the basis for a career, leading to her composing scores for numerous TV campaigns. Most famously, it’s Ciani’s synth-craft behind the iconic “pop and pour” sound effect of ’70s Coca-Cola adverts.

 

Native Instruments called Ciani up at her Californian beachside retreat to discuss her performance at Terraforma festival — a dramatic showcase in the gardens of Villa Arconati outside Milan, in which her set was cut ten minutes short by a huge power cut — as well as her technical approach, her work at Berklee College of Music, and her experience with collaborative performances.

When I call up Ciani, it is one year to the day since Don Buchla passed away. In the background, you can hear the sea crash against the coastline, and it is these two elements — the instruments of Buchla and the sound of waves — that have tied her work together. Since Andy Votel began re-releasing her work through Finders Keepers Records in 2012, Ciani has become a more prominent force through her performances and talks. Earlier this year, a documentary about Ciani’s life and work — A Life in Waves — premiered at South by Southwest.

As we talk, Ciani often pauses and asks me to listen to the ocean. “Can you hear it?”

 

A little bit.

Well, I live in this sound, and it’s no accident that all my concerts come from the ocean. That’s been the sonic reference for practically my whole life. My first album was called Seven Waves; it was the waves connecting the pieces; the pieces born out of waves.

 

How long have you been by the coast?

For 25 years. I was living in New York City for 19 years, and then one day I said, “I’ve got to get out.”

 

And this was when you had the composition company Ciani/Musica?

Yes. I was at the top of my game, but I had a scare with breast cancer, so I moved from the middle of the city to the coast. That was as big a change as I could make.

 

For Terraforma, how much did they let you prepare in the environment in which you were going to play?

It was outdoors, and that can always be a little bit risky. I was a little freaked because before soundcheck, it just started to rain, and I have only one Buchla [synth] — Don Buchla passed away exactly a year ago today, [and] it’s irreplaceable.

 

Can you explain how your quadraphonic setup works? Were you confident that it’d be suitable for the venue at Terraforma?

They picked this garden, and the beauty of this setup was that it was more conducive to a surround sound experience, with people standing around me.

The important thing with the quadraphonic is that all four speakers are equal. I noticed in my travels that [this isn’t widely understood]. In the early days of quadraphonic, the industry didn’t know what to do about it, so they minimised the importance of the back speakers. They said, “Oh, we’re replicating a concert hall, and we’re going to give you some of the reverberant, back-of-the-room experience.”

Therefore, right from the start, quadraphonic was doomed because it wasn’t interesting for acoustic music. But the natural domain of electronic music is spatial. Electronic sound naturally wants to move. If the sound sits there, it’s not expressing — it’s not alive. Once you start to move the sound in a purposeful way, it stops being random. Controlling space is a conscious set of choices and from the beginning, with Don Buchla in the late ’60s, we always played in quadraphonic.

 

How do you control quadraphonic sound?

Don [Buchla] had a quadraphonic interface for the output. You had a voltage-controlled spatial location parameter called Swirl. You can push a button and go into Swirl mode, which is continuous panning that can go right to left, left to right. With voltage control, you can speed it up and slow it down. So it’s all very malleable.
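
To picture what Swirl does in practice, here is a minimal software sketch of continuous four-channel panning at a variable rate. It uses an equal-power cross-fade of my own choosing rather than Buchla’s actual circuitry, and the function names are made up for illustration.

```python
import math

# Four speakers around the listener, one every 90 degrees:
# front-left, front-right, rear-right, rear-left.
NUM_SPEAKERS = 4

def swirl_gains(phase: float) -> list[float]:
    """Equal-power gains for a sound circling the four speakers.

    `phase` runs 0..1 for one full revolution. Only the two speakers
    adjacent to the current position are active, cross-faded with a
    constant-power (sin/cos) law so loudness stays steady as the sound moves.
    """
    pos = (phase % 1.0) * NUM_SPEAKERS              # which gap between speakers we are in
    idx = int(pos)                                  # the speaker we are leaving
    frac = pos - idx                                # progress towards the next speaker
    gains = [0.0] * NUM_SPEAKERS
    gains[idx] = math.cos(frac * math.pi / 2)                       # fades out
    gains[(idx + 1) % NUM_SPEAKERS] = math.sin(frac * math.pi / 2)  # fades in
    return gains

def render_swirl(mono: list[float], sample_rate: int, rate_hz: float) -> list[list[float]]:
    """Spread a mono signal across four channels, circling `rate_hz` times per second.

    Varying `rate_hz` over time is the software stand-in for putting the
    swirl speed under voltage control: it can be sped up or slowed down.
    """
    return [[sample * g for g in swirl_gains(n * rate_hz / sample_rate)]
            for n, sample in enumerate(mono)]
```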

Another really important control has to do with discrete location as opposed to continuous. This is very effective because in spatial environments there’s this phenomenon that we all know called [auditory] masking. If you are near a speaker, you’re going to hear that one before you hear the other speakers, but with discrete location you very quickly move a sound from one discrete location to another. It’s a rhythmic motion that anybody can hear because the sound is only in one speaker at a time, pretty much.

If it’s too discrete, if the sound’s precisely in one speaker or some place, it’s a little unsettling, so what I do is have two spatial setups. One of them gives me the option of being discrete, and the other creates fill, so there’s never a completely naked spot.
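
Here is the same kind of sketch for the discrete approach, again under assumptions of my own: one layer hops the voice from speaker to speaker on a clock, while a quiet fill keeps every speaker alive so there is never a naked spot.

```python
import random

NUM_SPEAKERS = 4
FILL_LEVEL = 0.15   # assumed low-level fill so no speaker ever goes completely silent

def discrete_step_gains(step: int, jump_randomly: bool = False) -> list[float]:
    """Gains for one clock step: the voice sits in a single speaker,
    while every speaker carries a quiet fill underneath it."""
    target = random.randrange(NUM_SPEAKERS) if jump_randomly else step % NUM_SPEAKERS
    gains = [FILL_LEVEL] * NUM_SPEAKERS
    gains[target] = 1.0
    return gains

# One bar of eighth notes: the sound hops around the room one speaker at a
# time, which reads as a rhythm rather than a blur.
for step in range(8):
    print(step, discrete_step_gains(step))
```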

The other part of the spatial is the processing. In the early days we had voltage-controlled reverb, which meant that the space could go close and far away instantaneously — it was amazing. We don’t have that today. I use two Eventide H9 boxes for my processing. They’re nice and small, and I’m trying to get Eventide to add voltage control for the reverb mix, so that I can add and remove the effect.

 

How do you integrate the Eventide units into your setup? And can you contrast this with voltage-controlled reverb?

When I started using Eventides and I had four channels, I realised the [Eventide] boxes are stereo. At first, I thought I needed two boxes to process more channels, but then I thought, “Let’s simplify this and do the [reverb] processing before the spatial movements.” Now what I do is I go from a mixer into an H9 and I come out of the H9 into this [quadraphonic] spatial processor.

It works, but what you don’t get, yet, is the ability to change the perspective. You can add reverb and make a large space. You can take reverb away and make a small space. But what we used to do [with voltage-controlled reverb] is do that on a beat. You’re able to instantaneously control the amount of reverb, and it becomes a rhythmic experience, creating an illusory space that’s constantly shifting dimensions.

Instead of sitting in a big reverbed room and then in a quiet non-reverb room, you are going from one to the other instantly and it’s an amazing experience.
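
The H9 doesn’t currently offer that kind of control, so the following is only a toy illustration of the idea: flipping the wet/dry mix on the beat so the apparent room size keeps changing. The values and names are placeholders, not H9 parameters.

```python
# Toy illustration of "reverb on a beat": instead of sitting in one room,
# the wet/dry mix flips between a large and a small space on every step.
def reverb_mix_for_step(step: int, big: float = 0.8, small: float = 0.05) -> float:
    """Reverb mix (0 = dry, 1 = fully wet) for a given step of the beat."""
    return big if step % 2 == 0 else small

def apply_mix(dry: float, wet: float, mix: float) -> float:
    """Crossfade one sample between the dry signal and its reverberated copy."""
    return (1.0 - mix) * dry + mix * wet

# Eight steps: far away, close, far away, close...
print([reverb_mix_for_step(s) for s in range(8)])
```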

How do you control the Eventide units at big outdoor events, like at Terraforma?

The H9 is an amazing little box with an interface. I have one of them interfaced to an iPhone, and another one with an iPad. So I have graphic control of all the parameters in any particular preset. I can do rhythmic tapping and can instantly tune my delays to the rhythm of the music.

When I settle on a tempo, I just go to the H9 and I tap it in, and then I have my delays set for whatever they are. I might play an intermittent sequence and you’ll hear it answering on the other side of the room.
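
The arithmetic behind tap tempo is simple enough to sketch. This is not Eventide’s implementation, just the basic conversion from tapped intervals to a tempo and to musical delay times.

```python
def tapped_tempo_bpm(tap_times_s: list[float]) -> float:
    """Average the intervals between taps (in seconds) into a tempo in BPM."""
    intervals = [b - a for a, b in zip(tap_times_s, tap_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def delay_times_ms(bpm: float) -> dict[str, float]:
    """Common musical delay times derived from the tapped tempo."""
    quarter = 60_000.0 / bpm
    return {"quarter": quarter, "eighth": quarter / 2, "dotted eighth": quarter * 0.75}

# Four taps half a second apart -> 120 BPM -> a 500 ms quarter-note delay.
print(delay_times_ms(tapped_tempo_bpm([0.0, 0.5, 1.0, 1.5])))
```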

 

I’ve seen you perform a few times, and you move the iPad around.

That’s another part of the H9. It has various graphic interfaces. It has one called Tilt where you just tilt the iPad and it changes the parameters that you’ve chosen. You get to choose what it is you’re going to be impacting. There’s an X axis and a Y axis, and you just tilt it. I use it particularly on the wave sounds.

I think it’s a wonderful thing to see that action and hear the change. There’s also this shifting of the sound, the delays mostly. I love what’s going on with graphic interfaces.
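
As a rough picture of the Tilt idea, and only that, two tilt angles can be normalised into two control values. The parameter names below are placeholders chosen for the example, not specific H9 mappings.

```python
def tilt_to_params(tilt_x_deg: float, tilt_y_deg: float) -> dict[str, float]:
    """Map -45..+45 degree tilts on two axes to 0..1 control values.

    The parameter names are placeholders chosen for the example,
    not actual H9 preset mappings."""
    def norm(angle: float) -> float:
        return min(max((angle + 45.0) / 90.0, 0.0), 1.0)
    return {"delay_feedback": norm(tilt_x_deg), "reverb_size": norm(tilt_y_deg)}

print(tilt_to_params(10.0, -30.0))   # a small tilt on one axis, a larger one on the other
```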

 

How has technology impacted the logistics of travelling and performing?

Well, a couple of things happened. I’ve had a long relationship with Eventide, going back to the beginning with Richard Factor and now with Tony Agnello. You used to buy a box and get a delay [effect] for $3,000. Now you get the H9 and get everything they ever developed in a tiny little box with this graphic interface.

The other thing that happened for me was I felt very vulnerable playing the Buchla because it can break at any time. I learned about Animoog [Moog’s iOS synth app] in 2011, which fits the bill when you travel, as I really believe you have to travel light. In the old days, I travelled with truckloads of stuff, and I’m never going to do that again. I want to be able to get on an airplane, by myself, get off the airplane and set up. I don’t want to have an entourage.

So I’ve gone into a less-is-more mode ever since I moved out here from New York. I came out here to a cabin on the ocean where I was alone, so I had to downsize. I built a studio here and my only requirement was that there be no patchbay because I didn’t want all that wiring. I was tired of that.

 

Did the minimalist approach of the New York avant-garde scene influence your setup as well as your music?

Well, that minimalism is something that we all shared in that generation. I mean, Philip Glass is a little bit older than I am, but we all went through the academic system. I have a master’s degree in music composition from the University of California at Berkeley and considered going to Nadia Boulanger in Paris, where everybody went.

What happened was that the academic trajectory of music composition got disconnected from basic human reference points. It became superhuman; it was: “Let’s see how complicated you can write something.”

A lot of us said, “Wait a minute – let’s get back to basics and the experience of music.” Philip Glass didn’t use electronic music to start out, but there was a strong influence of electronics on his compositional thinking. With electronics, being difficult became easy, and so that whole goal was really undermined anyway. Academia thrived on its egocentric expression of complication; there was a revolution against that, and it was called minimalism.

In the studio sense, I think minimalism is appropriate if you’re one person. I don’t want to have a huge console where I have to have an engineer. My little setup for the moment is four speakers, the Buchla, and no patchbay. I also have the two H9s. I just got this MakeNoise system that I’m experimenting with. Then, downstairs I have a Moog 15 and Mother 32. I also have a lot of keyboard instruments. And I’m a huge fan of Digital Performer for recording.

 

How do you factor randomness and uncertainty into everything?

The thing about electronic music is that it is very adept at fluid evolution. Everything morphs into something else. With minimal resources, if you add random elements, the machine can generate something that’s really very interesting. You take a single sequence, shift the octave transposition on some of the notes randomly and your ear starts to hear counterpoints of melodies, even though there’s only one note in a sequence going at one time. Any note that happens two octaves up, you start to connect those in a melody – you start to hear, in the various ranges, a counterpoint.
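
That trick is easy to sketch in code. The pitches below are placeholders rather than Ciani’s rows, but the principle is the same: randomly kick some steps of a single sequence up an octave or two, and the ear strings the transposed notes into a separate line.

```python
import random

# A 16-step sequence; the MIDI note numbers are placeholders, not Ciani's rows.
BASE_SEQUENCE = [60, 62, 63, 67, 60, 58, 63, 65, 60, 62, 67, 70, 63, 62, 58, 60]

def randomize_octaves(seq: list[int], chance: float = 0.3) -> list[int]:
    """Randomly transpose some steps up one or two octaves (12 or 24 semitones).

    Only one note ever sounds at a time, but the ear links the transposed
    notes into a second, higher line: an implied counterpoint."""
    return [note + 12 * random.choice([1, 2]) if random.random() < chance else note
            for note in seq]

print(randomize_octaves(BASE_SEQUENCE))
```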

 

I’m interested in the collaboration you did with Kaitlyn Aurelia Smith. How does it work when you’re incorporating elements of randomness?

That was a live performance — we did that here in my little studio. We set the two Buchlas on top of the piano. We synchronised the clock; one clock drove the other clock, so we were locked. I gave her two sequences, two rows of 16 pitches, and I gave myself two rows of 16 pitches, and those sequences are designed to work together.

I wrote a paper in 1976 about how to play the Buchla, and [in it] I documented performance techniques and four sequence rows, which are the same sequences I’m using today. Once you are locked together rhythmically and have your tonal materials in place, they’re going to work together. Once you get a good, solid starting place, you can dance within it and relate to each other in the process.

I did a class at Berklee with five modular setups and it was fabulous. I gave the students the same sequences from 1976. Their clocks were all locked and then we… It was amazing. I have to say, I was just flabbergasted. I loved being at Berklee because you get to try out a lot of things. They have all these modular setups that are identical and they’ll put two kids at each station, and then they have an interface for communicating and tempo control. Any one of the people can change the tempo of the whole group and send a text message letting them know when they’re going to do it. Matthew Davidson designed all this interactivity and leads the performance class.
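
The clock-locking Ciani describes can also be pictured with a short sketch, again with placeholder pitches rather than the rows from the 1976 paper: two 16-step sequences advanced from one shared step counter can never drift apart.

```python
# Two 16-step pitch rows driven from one shared clock, the software picture
# of slaving one Buchla's clock to the other. The pitches are placeholders.
ROW_A = [60, 63, 65, 67, 70, 67, 65, 63, 60, 58, 55, 58, 60, 63, 67, 63]
ROW_B = [48, 48, 51, 53, 48, 46, 51, 53, 48, 48, 55, 53, 51, 48, 46, 48]

def play_locked(rows: list[list[int]], steps: int) -> list[tuple[int, ...]]:
    """Advance every row from the same step counter so they can never drift apart."""
    return [tuple(row[step % len(row)] for row in rows) for step in range(steps)]

# Both rows land back on step 0 together every 16 steps.
for pair in play_locked([ROW_A, ROW_B], 16):
    print(pair)
```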

How much more are you learning through collaboration?

My first outing was with Sean Canty [of Demdike Stare] and Andy Votel. Andy is the one that brought me back to this because he released the 1975 Buchla concerts, and then said, “Well, why don’t we go out and play?” I got my feet wet with them and it was so lovely. I can’t tell you how relaxing it was to be playing with these two guys. There’s always a huge stress level when you’re playing live on modular because stuff can go wrong. When you have collaborators in performance, it’s really more relaxing.

 

What was the setup there?

We played at the Lincoln Center in New York. Sean and Andy had their DJ setup and would use vinyl records as the source material that they then transformed, looped, and repitched. They did all kinds of filtering and processing and whatever else, so the source material was just a starting point. It was all very fly-by-the-seat-of-your-pants when we started doing that. I had a very different Buchla setup than I have now.

 

What prompted you to change it?

Well, it changes all the time. At the Lincoln Center I had a small 12-panel unit, and for Sunergy, for instance, and my early solo touring, I had a bigger system — two separate units. One was a 12-panel Buchla unit and the other was something called a Skylab, which is a 10-panel unit that you can carry on an airplane. That’s how I was touring, carrying a 10-panel unit and checking in a small case. One time I had trouble on a small airplane; they wouldn’t let me carry the Skylab on, and I said, “Uh-oh, this isn’t going to work. I can’t use this approach.” So I shifted to an 18-panel, checkable Buchla unit that wasn’t too heavy.

Another change was that I got a clone of the 248 MARF (Multiple Arbitrary Function Generator). I talked to [Don] Buchla and said, “Don, I really can’t perform without the 248 and you don’t make it, and there aren’t any.” He said, “Fine.” I’ve gotten some criticism, but I did clear it with Don first.

Our call cuts out briefly, reconnecting to the sound of the waves in the background.

You know, this reminds me of when the Terraforma concert had the electricity cut out. It was like a very appropriate way to end an electronic concert.

That was so apt for the moment. It’s also a great signal for us to end our conversation. Thank you for your time and your music.

 

Photo credits: Michela Di Savino
