Experiments with convolution and concatenative synthesis: Three illuminated videos
In this post, I present three work-in-progress studio recordings that chart the composition of a work for classical guitar and live electronics alongside its accompanying Max patch.
I use these videos as a site of analysis and reflection, through text annotations and voiceover; as a means of identifying areas for further development; and as a way to note and synthesise ideas that will be elaborated upon in future writings and research narratives.
I will discuss the conceptual inspiration and rationale for this piece at a later time. At this point in its composition, the work’s live electronics employ two different processes. The first, convolution, is a type of “cross synthesis”, a family of processes that “accurately impart the characteristic timbres of spaces and objects on other signals” (Brown 2019). Most commonly, convolution is used to model digital reverbs on real spaces, such as concert halls and churches: convolution reverb plug-ins process the input signal through a recording of the space, called an impulse response. In this piece, field recordings of waves are used as impulse responses.
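The underlying operation can be illustrated outside of Max with a minimal Python sketch using numpy. This shows only the general principle of convolving a dry signal with an impulse response, not the patch itself; the function name `convolve_with_ir` and its `mix` parameter are my own illustrative choices, and a real-time convolution reverb would process overlapping blocks rather than whole files.

```python
import numpy as np

def convolve_with_ir(dry, ir, mix=0.5):
    """Offline FFT-based convolution of a dry signal with an impulse response.

    Linear convolution yields len(dry) + len(ir) - 1 samples; the wet
    result is peak-normalised and blended with the (zero-padded) dry signal.
    """
    n = len(dry) + len(ir) - 1
    # Convolution in the time domain is multiplication in the frequency domain.
    wet = np.fft.irfft(np.fft.rfft(dry, n) * np.fft.rfft(ir, n), n)
    wet /= np.max(np.abs(wet)) or 1.0        # normalise to avoid clipping
    dry_padded = np.pad(dry, (0, n - len(dry)))
    return (1 - mix) * dry_padded + mix * wet

# Toy example: a single click "played" through a short decaying noise burst,
# standing in for a field-recording impulse response.
sr = 44100
dry = np.zeros(sr)
dry[0] = 1.0                                  # unit impulse (a click)
ir = np.random.randn(sr // 2) * np.exp(-np.linspace(0, 8, sr // 2))
out = convolve_with_ir(dry, ir)
print(out.shape)                              # (66149,) == len(dry) + len(ir) - 1
```

Because the impulse response here is a decaying noise burst, the output is effectively the click surrounded by a short reverb tail, which is exactly the sense in which an impulse response "imparts" its character on another signal.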
The piece also uses concatenative synthesis, in which a ‘corpus’ of prerecorded samples is segmented, categorised and then played back in an order that best matches a live input signal: in this case, my classical guitar.
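To make the segment-categorise-match cycle concrete, here is a minimal Python sketch of the concatenative idea, using numpy only. The frame size, the two-feature descriptor (RMS loudness and spectral centroid), and all function names are my own illustrative assumptions; they are not the descriptors or API of FluCoMa or Data Knot.

```python
import numpy as np

FRAME = 1024  # analysis/playback grain size in samples

def frames(signal):
    """Segment a signal into non-overlapping frames of FRAME samples."""
    n = len(signal) // FRAME
    return signal[: n * FRAME].reshape(n, FRAME)

def describe(frame):
    """A toy two-number timbral descriptor: loudness (RMS) and spectral centroid."""
    rms = np.sqrt(np.mean(frame ** 2))
    mag = np.abs(np.fft.rfft(frame))
    centroid = np.sum(np.arange(len(mag)) * mag) / (np.sum(mag) + 1e-12)
    return np.array([rms, centroid])

def concatenate(corpus, live):
    """For each live frame, play back the corpus frame whose descriptor
    is nearest (Euclidean distance) to the live frame's descriptor."""
    corpus_frames = frames(corpus)
    corpus_desc = np.array([describe(f) for f in corpus_frames])
    out = []
    for f in frames(live):
        d = describe(f)
        nearest = np.argmin(np.linalg.norm(corpus_desc - d, axis=1))
        out.append(corpus_frames[nearest])
    return np.concatenate(out)

# Toy demo: a noise "corpus" answers a sine-tone "live input".
rng = np.random.default_rng(0)
corpus = rng.standard_normal(FRAME * 50)
live = np.sin(2 * np.pi * 440 / 44100 * np.arange(FRAME * 4))
result = concatenate(corpus, live)
print(len(result))                            # FRAME * 4 samples of matched grains
```

A real system would use richer descriptors, overlap-add playback, and a pre-built lookup structure over the corpus, but the matching loop above is the core of the technique.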
Both the convolution and concatenation processes have been built in Cycling ’74’s Max, using Rodrigo Constanzo’s machine learning toolkit, Data Knot (Constanzo, n.d.), which is itself built around the Fluid Corpus Manipulation Toolkit (Fluid Corpus Manipulation, n.d.).
Video 1: First convolution test
The first video I will present here documents my first interaction with Data Knot’s convolution, using my chosen impulse responses. Annotations in italics articulate my thoughts during moments of discovery, and those in roman type are more detailed analyses of those discoveries made after the recording.
Video 2: Second convolution test, with revised and additional impulse responses
In the next video, I talk about the phenomena arising through the use of convolution, and the idea of the impulse response as a form of processing.
Video 3: First concatenation test with my own corpus, playing an outline of guitar material
The corpus I am using for concatenation currently consists of four banks of samples:
A classical guitar improvisation focussing on percussive sounds.
The same recording, layered and then manipulated through delays and granular synthesis in Ableton Live.
A short extract of an electric guitar improvisation, again processed through effects in Live. Only the effected sound is included in the corpus.
A voice recording of myself, reading an AI-generated poem called ‘The Camera is a Time Machine’.
References
Brown, Griffin. 2019. ‘The Basics of Convolution in Audio Production’. iZotope. https://www.izotope.com/en/learn/the-basics-of-convolution-in-audio-production.
Butler, Mark J. 2014. Playing with Something That Runs: Technology, Improvisation, and Composition in DJ and Laptop Performance. Oxford University Press.
Chappell, Herbert, dir. 1976. A Life in the Country. Aired on BBC.
Collins, Matt. 2023. ‘Compositional Strategies for Timbral Blend in Mixed Electroacoustic Music and Portfolio of Original Compositions’. PhD thesis, University of Oxford.
Constanzo, Rodrigo. n.d. Data Knot. Accessed 27 September 2025. https://rodrigoconstanzo.com/data-knot/.
Croft, John. 2007. ‘Theses on Liveness’. Organised Sound 12 (1): 59–66. https://doi.org/10.1017/S1355771807001604.
Emmerson, Simon. 1994. ‘“Live” versus “Real-Time”’. Contemporary Music Review 10 (2): 95–101.
Emmerson, Simon. 1996. ‘“Local/Field”: Towards a Typology of Live Electroacoustic Music’. Paper presented at the International Computer Music Conference, Hong Kong (Aesthetics, Philosophy, Criticism).
Fluid Corpus Manipulation. n.d. ‘Fluid Corpus Manipulation Toolkit’. Accessed 29 May 2025. https://www.flucoma.org/download/.
Ingold, Tim. 2016. Lines: A Brief History. Routledge.
Riva, Giuseppe, and Fabrizia Mantovani. 2012. ‘From the Body to the Tools and Back: A General Framework for Presence in Mediated Interactions’. Interacting with Computers 24: 203–10. http://dx.doi.org/10.1016/j.intcom.2012.04.007.

