Live coding meets augmented instruments

‘Fermata’ for bass clarinet and Threnoscope

Improvisation, 33 min.

Thor Magnusson – live electronics (live-coded, 4-channel Threnoscope)
Pete Furniss – augmented bass clarinet

Recorded live on 27th June 2015 at Alison House, Edinburgh College of Art, University of Edinburgh, UK (stereo mix)

The Threnoscope

Icelandic musician and researcher Thor Magnusson is based at the University of Sussex, where his work centres on the impact of digital technologies on musical creativity and practice. The Threnoscope has been in development for several years and will shortly be released publicly for use by composers and performers alike. I first witnessed it in action in London at workshops and performances as part of the New Interfaces for Musical Expression (NIME 2014) conference and was struck by its immediacy, nuanced configurability and beauty of design.

A highly dynamic, configurable audio-visual system that allows the performer/composer to manipulate a set of drones by means of live coding, the Threnoscope is at once a composition and performance tool, an instrument, and a live score.


The Threnoscope: what you see is what you hear.

See a demo here: vimeo.com/63335988

What’s very much missing in this audio recording is the visual aspect, which contributes significantly to the liveness of the performance. A series of rotating coloured bars represents the various pitches in the electronics, as well as their volume intensity and spatial position within the speaker array. This creates both a visual focus for the audience and a kind of ‘score’ for the co-improviser to follow, complement or subvert (or even ignore).

The movement of the bars creates both a trace and an anticipation of real-time musical events and gestures (Magnusson 2014), affording an acute sense of temporal liveness (Sanden 2013) and a pleasing balance of recognition and expectation for both audience and performer(s).

Magnusson has described live coding as “an art practice merging the act of musical composition and performance into a public act of projected writing” (Magnusson 2014). The Threnoscope-specific or SuperCollider code is entered in real-time by the laptop performer and is displayed to the right of the circular graphical interface, just above a status window.

This simple, tripartite interface affords transparency, immediacy and what might be called a textual (or, in a broader sense, semiotic) liveness to the proceedings (Sanden 2013).
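For readers who have never seen live coding, a minimal sketch of what such projected writing can look like may help. This is plain SuperCollider of my own devising, not the Threnoscope’s actual micro-language, whose commands I won’t attempt to reproduce here:

```supercollider
// Hypothetical SuperCollider drones, standing in for the Threnoscope's
// own micro-language. Each statement is typed and evaluated mid-performance.
s.boot;

// a fundamental drone on the low B flat used in Fermata (58.2 Hz)
~fundamental = { SinOsc.ar(58.2, 0, 0.1) ! 2 }.play;

// add the third partial, rotating slowly around the field
~partial3 = {
    Pan2.ar(SinOsc.ar(58.2 * 3, 0, 0.05), LFSaw.kr(0.05))
}.play;

// later in the set: fade the partial out over ten seconds
~partial3.release(10);
```

Because each statement is entered and evaluated live, the code window doubles as a running record of the decisions taken, which is precisely what the audience sees projected beside the circular score.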


The complete Threnoscope interface, here showing a single tone, with the live coding and status windows

Thor occasionally likes to bring other improvisers into his Threnoscope performances and in 2015 we played two dates together, firstly in Glasgow’s Centre for Contemporary Arts (CCA) as part of xCoAx 2015, followed by a performance in Edinburgh, promoted jointly by Dialogues Festival and the ECA/Reid School of Music at the University of Edinburgh.

About this collaboration, he writes:

Fermata is a piece written [sic] for a microtonal drone instrument called Threnoscope and an acoustic instrument. It is a framework for improvisation of microtonal music, where both the live coder and the instrumentalist contribute equally to the piece’s development.

The Threnoscope is notated through live coding, with sounds being represented on a graphical score next to the coding terminal. Its visual appearance illustrates the harmonics of a fundamental tone, as well as speaker locations. Musical notes move around the spectral and physical space, long in duration, and sculptable by the performer.

Fermata has been performed with Adriana Sá (London), Miguel Mira (Lisbon), Iñigo Ibaibarriaga (Bilbao), Áki Ásgeirsson (Reykjavik), Alexander Refsum Jensenius (Oslo), and now with Pete Furniss on the bass clarinet.

[programme note, xCoAx 2015]


Negotiating new performance ecologies

“Through a constant visual feedback and limited syntax of the micro-language, the performer is habituating a space of musical possibilities, … where the surroundings and physical setup of equipment is seen as the design of performance affordances. Due to the spatio-visual nature of the Threnoscope, improvising with it is in many ways similar to the situation of an instrumentalist who has shaped or composed their instrument and surroundings with external technology and extended technique, and explored the musical potential of the particular setup.” (Magnusson 2014)

Managing a 30-minute drone-based improvisation can be tricky. There needs to be a sense of embarking on a long journey, without knowing quite where it’s going to take you. Both the relatively long duration and the resolute pacing of the drone device require a more measured sense of space than I’m generally used to.

A direct response to the tonality of the Threnoscope emerges only after both a long solo from Thor and several minutes of breath, articulation and spit sounds, magnified by use of variable compression gain mapped to a foot pedal. This unpitched-but-not-quite-unsemantic approach serves to draw the audience into the unfamiliar performance ecology (Bowers 2002; Di Scipio 2003; Waters 2007), blending a live-coded, visually represented, electronic sound synthesis instrument with a live acoustic instrument, augmented by amplification and digital processing, in an immersive multichannel environment.

The bass clarinet is augmented here with a very limited number of simple delay, reverb, compression and distortion effects, using Ableton Live. I felt this was a help in bridging the potentially stark acoustic/electroacoustic divide (Emmerson 1998, 2007; McNutt 2003), which is always a primary concern of mine. The acoustic instrument can feel dry or impoverished in an electroacoustic environment, even bullied: stripped of the nuanced resonances that form part of its sound by the very different qualities of what emerges from the loudspeakers.
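For the curious: the actual processing lived in an Ableton Live set, but a roughly analogous chain sketched in SuperCollider (with placeholder parameter values, not the concert settings) might look like this:

```supercollider
// A rough SuperCollider analogue of the Ableton Live chain:
// delay, reverb, gentle compression and light saturation on the clarinet.
(
~clarinet = {
    var dry, sig;
    dry = SoundIn.ar(0);                            // pickup/microphone input
    sig = dry + CombC.ar(dry, 0.5, 0.35, 2, 0.3);   // simple feedback delay
    sig = FreeVerb.ar(sig, mix: 0.25, room: 0.7);   // modest reverb
    sig = Compander.ar(sig, sig, 0.3, 1, 0.5, 0.01, 0.1); // soft compression
    sig = (sig * 2).tanh * 0.5;                     // mild 'distortion'
    sig ! 2
}.play;
)
```

Keeping the chain this small is the point: just enough colouring to let the instrument sit inside the electronics rather than beside them.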

It’s not just about levels, but bringing a sense of cohesion to the whole and allowing each performer to express themselves to an optimal extent.

Managing feedback and bleed was also rather an issue (the drones inevitably leak into the two microphones), which meant a rather different emphasis in comparison to some of my clarinet performances. I didn’t at this point have a direct input fitting for my K1X pickup mic, which affected my overall sound and technique (see other performances and sketches), so I’m carefully ‘testing the waters’ of playing level and effect selection in a process of learning-by-doing in performance.

This on-the-fly adaptation to surroundings is something musicians do all the time in acoustic performance (managing room reverberation and instrumental balance within it, for example). What’s new here is undergoing a process of familiarisation with technologised performance ecologies, and allowing space for musical instincts and priorities within them.

The only planning here was with regard to the approximate duration of the piece and a configuration of pitch and temperament for the drones. B flat was chosen as a tonal centre as it suits the range and inherent resonances of the bass clarinet. Interestingly, and pleasingly from my point of view, the pitch of the fundamental drone was configured slightly lower than standard (58.2 Hz as opposed to 58.27 Hz). This allowed a) consideration for the particular tuning of my instrument and b) a more ‘seated’ feel in dealing with tuning issues in response to the justly intonated partials of the electronics (which are nevertheless malleable via the real-time coding).
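To make the numbers concrete (my arithmetic, not anything specified by the piece): the low B flat in question is MIDI note 34, which at A = 440 Hz gives 440 × 2^((34 − 69)/12) ≈ 58.27 Hz in equal temperament; the drone sat roughly two cents under that, and its just partials are simply integer multiples of whichever fundamental is chosen. In SuperCollider terms:

```supercollider
// Checking the tuning figures quoted above (my calculation).
34.midicps;                          // -> 58.27 Hz: equal-tempered low B flat

// just-intoned partials are integer multiples of the chosen fundamental
58.2 * (1..8);                       // -> [58.2, 116.4, 174.6, 232.8, ...]

// how far below equal temperament is 58.2 Hz? about two cents
(58.2 / 34.midicps).ratiomidi * 100; // -> roughly -2.1 cents
```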

Quite a good example of ‘humanising’ in a meeting of electronic music and traditional instrumental practice.


Thanks to Marcin Pietruszewski for the recording and assistance with the event.
Also to xCoAx, CCA and Martin Parker at ECA for making it happen.

__

References:

Bowers, J. (2002). Improvising machines: Ethnographically informed design for improvised electro-acoustic music. ARiADA Texts (4).

Di Scipio, A. (2003). “Sound is the interface”: from interactive to ecosystemic signal processing. Organised Sound, 8(3), 269–277.

Emmerson, S. (1998). Acoustic/electroacoustic: the relationship with instruments. Journal of New Music Research, 27(1-2), 146–164.

Emmerson, S. (2007). Living electronic music. Aldershot, Hants, England; Burlington, VT: Ashgate.

McNutt, E. (2003). Performing electroacoustic music: a wider view of interactivity. Organised Sound, 8(3), 297–304.

Magnusson, T. (2013). The Threnoscope: A musical work for live coding performance. In ICSE Live 2013.

Magnusson, T. (2014). Improvising with the Threnoscope: integrating code, hardware, GUI, network, and graphic scores. In Proceedings of the New Interfaces for Musical Expression conference. London: Goldsmiths, University of London.

Sanden, P. (2013). Liveness in Modern Music: Musicians, Technology, and the Perception of Performance. New York: Routledge.

Waters, S. (2007). Performance Ecosystems: Ecological approaches to musical interaction. EMS: Electroacoustic Music Studies Network.
