NEURAL MATERIALS is a new live AV show created by SONAMB, in collaboration with Sean Clarke on visuals. The piece debuted at Modern Art Oxford in spring 2024 and has since been performed at the Creative Machines symposium at Royal Holloway, University of London. The piece is the result of a two-year R&D commission by Cyborg Soloists, a UKRI-funded project at Royal Holloway led by Dr Zubin Kanga. Cyborg Soloists is a four-year programme partnering artists with digital technologies to create new work. I worked with Bela, the technology company behind the Bela board, an ultra-low-latency embedded audio platform for musicians and sound artists creating interactive artworks. I used their Pepper module as part of my modular setup, along with their Trill sensors for gestural interaction.
BACKGROUND & PROPOSED SYSTEM
The project is a further exploration of Machine Learning and Musique Concrète, building upon my 2021 residency with NOVARS, University of Manchester (in collaboration with PRiSM, RNCM), where I developed methodologies for building Musique Concrète training datasets for neural synthesis.
For NEURAL MATERIALS I created a new performance system for AI, modular electronics and sound sculpture: a hybrid setup, with Ableton linked to the modular. On the Bela side I used the Pepper module for sample manipulation, controlled via their gestural Trill sensors. The aim was to make the live setup more hardware-focused, and to explore the ML materials more tangibly through gesture and feedback. To develop sound content for the project I undertook a series of studio experiments and workflows.

SONIC DATASET & RECORDING
My sonic dataset for this project is ‘post-industrial’: each data class contains field recordings of a different material, together telling the story of Manchester’s industrial past and present. Beginning on the outskirts of the city with cotton mill machinery at Quarry Bank Mill and journeying inwards through the canal networks, we finish in the (property-booming) centre of luxury mill apartments and the shiny tower blocks of modern development and gentrification. This journey is represented by sonic data classes of cotton, water and noise.
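For anyone curious about the data preparation side, below is a minimal sketch of how one class of field recordings might be chunked into fixed-length segments for training. The folder paths, chunk length and sample rate here are illustrative assumptions rather than my exact pipeline.

```python
# Minimal sketch: split one data class ('cotton', 'water', 'noise') of
# field recordings into equal-length mono chunks ready for training.
# Paths, chunk length and sample rate are assumptions for illustration.
from pathlib import Path

import librosa
import soundfile as sf

SR = 16000          # SampleRNN-style models are often trained at 16 kHz
CHUNK_SECONDS = 8   # assumed segment length

def chunk_class(src_dir: str, dst_dir: str) -> None:
    """Chunk every recording in one class folder into training segments."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    chunk_len = SR * CHUNK_SECONDS
    for wav in sorted(Path(src_dir).glob("*.wav")):
        audio, _ = librosa.load(wav, sr=SR, mono=True)
        for i in range(len(audio) // chunk_len):
            seg = audio[i * chunk_len:(i + 1) * chunk_len]
            sf.write(out / f"{wav.stem}_{i:04d}.wav", seg, SR)

chunk_class("field_recordings/cotton", "dataset/cotton")
```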

SONIC EXPERIMENT EXAMPLE ONE: COTTON MILL DATA CLASS RECORDINGS & RHYTHM
Recording_ For the ‘Cotton’ class of field recordings, I was thrilled to record the machinery at Quarry Bank Mill. Built in 1784, it is one of the best-preserved textile factories of the Industrial Revolution, now owned by the National Trust. I was privileged to gain access to the Cotton Processing Room and the Weaving Shed to record the unique sounds of the machinery, using a mixture of directional and stereo mics to capture the looped rhythms, unique timbres and drones, and the power-up and power-down of these amazing historical artefacts, including the Draw Frame, the Slub Roving machine, the Carding Machine, the Ring Spinning Frame and the Lancashire Loom. My Dad trained as an apprentice electrician here in the 70s, cycling from Hulme to Styal every day, so it’s a special place for me.
Pattern Generating_ The dataset was used to train the PRiSM SampleRNN model. Post-training, I was interested in what ‘future rhythms’ the ML model would generate. Through feature extraction, would rhythms collide and be sonically interesting, or would they replicate the staccato analogue loops in the dataset? In the Ableton still below, I am analysing and comparing rhythmic patterns from the original machine field recordings with the exported ML audio post-training. You can see the audio is classified by epoch, meaning the number of times the model has trained on the full set of data. You can also see the ‘temperature’ setting, a hyperparameter used to control the randomness of the model’s predictions.
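To make the temperature idea concrete, here is a generic sketch of temperature-scaled sampling, the mechanism this hyperparameter controls. It is an illustration only, not PRiSM SampleRNN’s actual sampling code; the logit size and values are made up.

```python
# Illustration of the 'temperature' hyperparameter: it rescales the model's
# output logits before sampling, so low values make predictions conservative
# and high values make them wilder. Generic example, not the real model code.
import numpy as np

def sample_with_temperature(logits: np.ndarray, temperature: float) -> int:
    """Draw one audio-sample class from the model's output distribution."""
    scaled = logits / temperature          # t < 1 sharpens, t > 1 flattens
    probs = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

fake_logits = np.random.randn(256)  # e.g. 256 quantisation levels (8-bit mu-law)
print(sample_with_temperature(fake_logits, temperature=0.9))
```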

I enjoyed listening into the sounds, seeing if I could identify particular machines from the dataset. A MIDI-drum conversion was the clearest way to listen back to the rhythmic material, which was eclectic, brilliant nonsense that a human drummer could not play; you can see this in the velocity hits in the piano roll at the bottom. I sent these pulses as CV triggers to the modular and looped the ML sections to locate areas I wanted to work with. In this audio clip you can hear the different generated machine rhythms falling in and out of each other.
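For the technically minded, the sketch below shows the gist of the MIDI-drum conversion: detect onsets in the generated ML audio and write them out as drum hits that can be auditioned in a piano roll or sent on as triggers. The file names and velocity mapping are assumptions, not my exact Ableton workflow.

```python
# Hedged sketch of a MIDI-drum conversion: onset detection on the generated
# audio, with onset strength approximating hit velocity. File names and the
# snare-only mapping are illustrative assumptions.
import librosa
import numpy as np
import pretty_midi

y, sr = librosa.load("ml_output_epoch20.wav", sr=None, mono=True)

frames = librosa.onset.onset_detect(y=y, sr=sr, units="frames")
strength = librosa.onset.onset_strength(y=y, sr=sr)
times = librosa.frames_to_time(frames, sr=sr)

pm = pretty_midi.PrettyMIDI()
drums = pretty_midi.Instrument(program=0, is_drum=True)
peak = strength.max() or 1.0
for f, t in zip(frames, times):
    vel = int(np.clip(127 * strength[f] / peak, 1, 127))
    drums.notes.append(pretty_midi.Note(velocity=vel, pitch=38,  # snare
                                        start=float(t), end=float(t) + 0.05))
pm.instruments.append(drums)
pm.write("ml_rhythms.mid")
```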
SONIC EXPERIMENT EXAMPLE TWO: GEOMETRIC SCULPTURE & TONAL ASPECTS


I’m working with a new metal sound sculpture, another in my series of AURA MACHINE icons (a visual symbolic language I’ve developed to represent the sound object in latent space). I’m interested in the acoustic potential of these metal sculptures and how these transmutational objects can be amplified, activated and fed back into the system for live performance. I mapped the frequency response of the physical sculpture at different points across its geometry using contact microphones positioned on each plane. There was a clear fundamental on each plane, but harmonics and partials that recur throughout the object were also evident. Here is the 3D model with the mapped frequencies.
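As a rough idea of the frequency-mapping step, here is a small sketch that takes a contact-mic recording of one plane, computes a magnitude spectrum and reports the strongest peaks (the fundamental plus recurring partials). The file name is a placeholder.

```python
# Sketch of the frequency-mapping step for one plane of the sculpture:
# magnitude spectrum of a contact-mic recording, then crude peak picking.
import numpy as np
import librosa

y, sr = librosa.load("plane_A_contact_mic.wav", sr=None, mono=True)

spectrum = np.abs(np.fft.rfft(y * np.hanning(len(y))))
freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)

# Bins louder than both neighbours, reported strongest first.
peaks = [i for i in range(1, len(spectrum) - 1)
         if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]]
peaks.sort(key=lambda i: spectrum[i], reverse=True)
for i in peaks[:6]:
    print(f"{freqs[i]:7.1f} Hz  (magnitude {spectrum[i]:.1f})")
```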
SONIC EXPERIMENT EXAMPLE THREE: MODULAR DIY_ BELA PEPPER & EMF FEEDBACK
Using the Bela Pepper module, we wrote a Pure Data patch that controls the playback and manipulation of the AI cotton mill samples via touch. We used two Trill sensors to do this: the square sensor controlled volume and speed on an XY axis, and the circular sensor controlled a granulated loop through touch. The sensors were fitted to my sculpture and used for live improvisation throughout the set. Another DIY aspect of the modular setup was my EMF circuit, which sampled and amplified the live electricity of the modular and fed it back into the system. We designed and laser-cut three bespoke units: a breakout board for the Trill sensors, a CV-controlled light-reactive unit and a holster for the EMF circuit. Special thanks to Chris Ball and Luke Dobbin for helping me with the coding on this!
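The patch itself runs in Pure Data on the Bela, but to give a flavour of the control mapping, here is a hedged Python sketch of the logic: the square sensor’s XY position maps to speed and volume, and the circular sensor’s touch position places a short granulated loop window. The ranges are illustrative assumptions, not the patch’s actual values.

```python
# Illustrative control-mapping logic only; the real version lives in a
# Pure Data patch on the Bela Pepper. Ranges are assumed for the sketch.

def map_square_xy(x: float, y: float) -> tuple[float, float]:
    """Square Trill: X axis (0..1) -> playback speed, Y axis -> volume."""
    speed = 0.25 + x * 1.75     # assumed range: quarter speed to 2x
    volume = y                  # 0 = silent, 1 = full level
    return speed, volume

def map_ring(position: float, loop_len: float = 0.5) -> tuple[float, float]:
    """Circular Trill: touch position (0..1) -> start of a granulated loop."""
    loop_start = position * (1.0 - loop_len)
    return loop_start, loop_start + loop_len

print(map_square_xy(0.5, 0.8))   # near-normal speed, fairly loud
print(map_ring(0.2))             # loop window placed early in the sample
```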

AUDIO REACTIVE VISUALS WITH SEAN CLARKE
I collaborated with digital artist Sean Clarke on this project, who expertly took my visual source materials and crafted them into a new audio-reactive system for live performance using TouchDesigner. I love working and performing with Sean; his artwork has so much feel and energy, and he conveyed the sense of the materials in the dataset and the moods and changes I was trying to evoke.
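For readers wondering how audio reactivity works under the hood, here is a generic envelope-follower sketch: the live audio level is smoothed into a control value that can drive a visual parameter such as brightness or displacement. This is plain Python standing in for the equivalent network inside TouchDesigner, not Sean’s actual patch.

```python
# Generic envelope follower: one smoothed level value per audio block,
# rising quickly and falling slowly, suitable for driving a visual parameter.
import numpy as np

def envelope(block: np.ndarray, prev: float,
             attack: float = 0.3, release: float = 0.05) -> float:
    """Smooth the block's RMS level toward the previous envelope value."""
    level = float(np.sqrt(np.mean(block ** 2)))  # RMS of the block
    coeff = attack if level > prev else release
    return prev + coeff * (level - prev)

env = 0.0
for block in np.abs(np.random.randn(10, 512)) * 0.1:  # stand-in audio blocks
    env = envelope(block, env)
    print(f"visual intensity: {env:.3f}")
```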

NEXT STEPS
I am still experimenting with this new performance system; the shows so far were the first outings for the piece, and I plan to keep experimenting with the sculpture (especially acoustically) and to develop the improvisational aspects for any new NEURAL MATERIALS shows. In 2025 I plan to develop my practice into the realm of spatial installation, drawing on my methodologies and research from the past five years in sonic AI. More to come!
THANK YOU TO MY COLLABORATORS AND SUPPORTERS!
Cyborg Soloists, Royal Holloway, University of London: Zubin Kanga, Caitlin Rowley, Mark Dyer, Jonathan Packham
Sean Clarke: Live Visuals
Chris Ball & Luke Dobbin: PureData coding support
Nik Void: Music Mentor
Quarry Bank Mill: Suzanne Kellett and team!
PRiSM, RNCM: neural network training
Flux Metal & Neon Creations: Sculptural Fabrication
Dan Hulme, EMPRES, University of Oxford
NEURAL MATERIALS is dedicated to my inspirational Dad, Vince Clarke who passed away July 2024.