AURA MACHINE RESEARCH SITE_ MACHINE LEARNING & MUSIQUE CONCRETE RESIDENCY_NOVARS_UNI OF MANCHESTER

Homepage AURA MACHINE research website

The AURA MACHINE research website is now live. The platform collates the research and creative outputs from my artist residency over the past two years with NOVARS, the centre for innovation in sound at the University of Manchester, where I have been exploring the collision of musique concrete and machine learning. The residency was supported by the European Art Science Technology Network for Digital Creativity and Arts Council England, and was delivered in collaboration with the PRiSM Lab (Practice and Research in Science and Music) at the Royal Northern College of Music.

Framed by my research question “How can concrete materials and neural networks project future sonic realities?”, the site sets out the critical rationale for the exploration: could machine learning, specifically sample-based neural synthesis, be a tool for contemporary musique concrete?

Throughout the research I developed skills in building sonic concrete datasets, technical knowledge of neural synthesis, and methodologies for listening, composition and performance when working with these materials. The musical output was my piece AURA MACHINE, whose first iteration was a 10-minute AV piece showcased at the FutureMusic#3 event at the RNCM.

AURA MACHINE, AV piece FutureMusic#3 RNCM

‘What happens to the sound object when processed by a neural network?’
‘A sound object has an AURA; can a machine generate an AURA?’

AURA MACHINE

The piece took the listener on a journey through training a neural network, starting with the raw concrete materials (representing the sonic dataset), through transmutation (model training), to the purely generated AI materials. Conceptually the piece played with both Pierre Schaeffer’s idea of the sound object and Walter Benjamin’s ‘The Work of Art in the Age of Mechanical Reproduction’, with its idea of a piece of art being authentic to a time and place. The AV piece was further developed into a 20-minute live performance with sound sculpture and audio-reactive visuals (a collaboration with Sean Clarke), with performances at the Manchester Science Museum and a lead feature for Patch Notes, FACT Magazine.

Patch Notes, FACTmag, live performance AURA MACHINE

The residency was also supported by Arts Council England, which meant I could develop new systems for algorithmic sound sculpture and circuit design, combining the creation of an ancient alchemical dataset for visual generation (I found many parallels between alchemy and modern-day ML) with contemporary prototyping tools, including 3D geometric design in Blender and 3D-printed models ahead of professional fabrication.

AURA#2 NEON glass sculpture
AURA#1 STEEL sound sculpture

One aim of the residency was to demystify machine learning: to break open the hidden black box and, by examining both the inherent barriers and the currently accessible tools, share insights and approaches for other artists creating with sonic AI. With Brighter Sound I created ‘DREAMING WITH MACHINES’, an educational project for young women, to support others in using these tools and techniques. We looked at the technical history of AI, inherent bias in datasets, and the positive aspects of the creative AI turn and its potential to imagine future societies and realities. We discussed our relationship with machines, and the participants created their own datasets and audiovisual pieces, which premiered online as ‘dream transmissions’ and were showcased at the Unsupervised#3 event at the University of Manchester.

SLEEPSTATES DEBUT ALBUM_NOV 2022

SLEEPSTATES is the debut release by SONAMB, the new music project from Manchester sound artist Vicky Clarke, scheduled for release on November 18th 2022. Exploring machine addiction, broken radio transmissions and algorithmic sleep territories, the record takes the listener through a cycle of dream transmissions from the electrical imaginary: the techno-emotional states of slumber we experience between humans and machines late at night.

SLEEPSTATES album artwork

Created over the past three years across networked locations, its sonic states combine transmission recordings between Manchester, Berlin and St Petersburg with lockdown internet noise experiments, bringing together her processes of sound creation with DIY electronics, sound sculpture, musique concrete and machine learning. The record explores the sonic materiality of our technologies, the sanctity of sleep, and ideas of autonomy and control as we meld into our machines, blurring the conscious boundaries of our waking and networked selves.

The release will be accompanied by SLEEPSTATES.NET, a net-art piece featuring music from the record. 

All tracks are written, produced and mixed by SONAMB, except track 7, mixed by Francine Perry, and tracks 2 & 9, co-mixed by SONAMB and Francine Perry. SLEEPSTATES is mastered by Katie Tavini at Weird Jungle.


The album is self-released on Control Data Records and is available on cassette tape and digital download direct from Bandcamp – pre-order now!

View Press Release here

Circular Sferics_ East St Arts residency

For the East St Arts / Convention House residency I am undertaking research for my project ‘Circular Sferics’. This will explore the duality of our technologies as both of the earth and used as devices for detection and measurement, through a circular technological approach combining sound and sculptural process. I’m using DIY environmental sensing technologies to record the sounds of the natural environment, with self-built radio antennae and my AURA MACHINE VLF and EMF detector circuits.

The recordings will focus on sferics (atmospherics), the natural electrical phenomena detected by radio antennae, which will be analysed as data to develop a methodology for visualising them as 3D geometries (see the sketch below). Working at Convention House I will learn new skills in 3D ceramic printing and 3D scanning to create new sculptural objects – made out of clay – completing the circle as new material forms.
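
As a minimal, purely illustrative sketch of how this analysis stage could work, assuming a sferics recording saved as a WAV file: the spectrum is unwrapped into a time/frequency/magnitude point cloud and written out as an OBJ file that can be opened in Blender. The file name, scaling factors and point-cloud mapping here are placeholder assumptions, not the final methodology.

```python
# Illustrative sketch only: one possible way to turn a sferics recording into
# 3D geometry (time x frequency x magnitude), exported as an OBJ point cloud
# that could be taken into Blender. File names and scaling are placeholders.
import numpy as np
import soundfile as sf

audio, sr = sf.read("sferics_ysp_test.wav")        # hypothetical recording
if audio.ndim > 1:
    audio = audio.mean(axis=1)                     # fold to mono

# Short-time Fourier transform: rows = time frames, columns = frequency bins
frame, hop = 2048, 512
window = np.hanning(frame)
frames = [audio[i:i + frame] * window
          for i in range(0, len(audio) - frame, hop)]
spectrum = np.abs(np.fft.rfft(np.array(frames), axis=1))   # (time, freq)

# Map each time/frequency/magnitude triple to an (x, y, z) vertex
t = np.repeat(np.arange(spectrum.shape[0]), spectrum.shape[1])
f = np.tile(np.arange(spectrum.shape[1]), spectrum.shape[0])
z = np.log1p(spectrum).ravel()                     # compress dynamic range

with open("sferic_geometry.obj", "w") as obj:      # point cloud for Blender
    for x, y, height in zip(t * 0.01, f * 0.01, z):
        obj.write(f"v {x:.4f} {y:.4f} {height:.4f}\n")
```

The resulting point cloud is just one candidate mapping; the actual geometries for the ceramic prints will come out of the phase-two work at Convention House.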

I am looking forward to working with clay, the original technological material, and to representing unseen sferic phenomena as physical matter. The cyclical format of the residency explores themes around the circular economy, deep time and technological e-waste through sonic materiality. I will also be running an electronics workshop and developing my knowledge of environmental sculpture through mentoring, and I look forward to getting involved and contributing to the Convention House community.

BACKGROUND

The work is inspired by my research and reading into natural radio and environmental sculpture. In lockdown I undertook the Radio x Rectangle course, learning to build shortwave radios, crystal oscillators and antennae; in tandem I became interested in listening to natural electricity through reading Earth Sound Earth Signal by Douglas Kahn and A Geology of Media by Jussi Parikka, and was inspired by the great Alvin Lucier’s experiments recording sferics in the 1980s.

PHASE ONE: FABRICATION | ELECTRONICS | RECORDINGS

For the first stage I have been collaborating with the wonderful Daniel Simpkins (a fellow Rogue Studios artist) at KUNSTRUCT, who worked with me to fabricate and realise my designs. We began with the material, opting for a dark grey Valchromat with yellow flecks, an ethically sourced wooden composite that, when varnished, takes on the look of stone. Dan meticulously hand-crafted each block for the sculpture, the forms of which were algorithmically generated and then further designed by myself (this sculpture is AURA#3 Wood, the third in the series of my aura machine sculptures developed via my methodology of neurally generating sculptural works). The sculpture also needed to be functional, with an in-built speaker chamber and a detachable loop antenna, and as an outdoor piece it needed to be mobile for recording in fields.

The electronics and interface for the piece comprise a VLF (very low frequency) circuit, a patchable mixer where you can opt to plug in the EMF detector, and stereo amplification through a bass speaker and two mids. There is a detachable loop antenna and, for grounding, a large earth spike. I worked with long-term collaborator and Creative Technologist Chris Ball, who helped me with circuit amplification and with testing at Yorkshire Sculpture Park for our first recordings, which attracted a fair few cows! You can listen to these first test recordings below.

recording at Yorkshire Sculpture Park, August 2022

Thanks to Yorkshire Sculpture Park and Invisible Flock for hosting me for these first recordings, and to Shazia and Matt at East St Arts for their ongoing support.

In phase two I will be translating these sounds into 3D geometries for new sculptural ideas and working with the clay 3D printer in the Convention House maker space.

This work has been supported by East Street Arts and explored at Convention House in Leeds.

PATCH NOTES_ FACTmag performance

FACT magazine invited me to present a live performance of AURA MACHINE for their legendary PATCH NOTES series. Directed by Pedro Kuster and filmed at 180 Studios, The Strand, this special film is the latest AURA MACHINE collaboration with Sean Clarke. The performance represents the final piece in my musique concrete and machine learning residency with NOVARS, University of Manchester. It was wonderful to showcase the visuals Sean and I made on a fantastic LED screen, and the film really captures the materiality and mood of the sonic machine learning textures and movements throughout the piece. Huge thanks to Pedro and the whole team at FACT – it was a real privilege.

You can watch the entire episode of Patch Notes and read the full article on FACTmag.com

What happens to the sound object when processed by a neural network? The AURA MACHINE piece takes the listener on a journey through training a neural network, starting with the concrete training data, through transmutation as the model trains, and ending with the purely generated AI output.

Here are some clips from the 20 min set.

Artefacts of Clay and Information

Artefacts of Clay and Information explores the limits of our material perception, questioning the things we see, hear and believe by tuning in to unseen forces just beyond the threshold of our reality. Traversing the borderline between the geologically real and the electrical imaginary, these artefacts of clay and information help us to explore this moment of transference and our place as human transmitters and receivers within the electromagnetic spectrum of existence.

Artefacts of Clay and Information is a collaboration between Vicky Clarke (UK) and Joaquina S (Argentina). This international collaboration is part of AMPLIFY DAI, connecting women working in digital and sound arts, and we have been developing the piece entirely online throughout the pandemic. This prototype was created for MUTEK Forum 2021 and is a first-stage proof of concept. Our collaboration is now in a new phase of development, realising the piece for live audiences, with updates coming in 2022.

AURA MACHINE

AURA MACHINE_ Experiment #1 Neural Synthesis PRiSMSampleRNN

AV piece from my musique concrete and machine learning research residency with NOVARS, the centre for innovation in sound at the University of Manchester, in collaboration with PRiSM. The piece premiered at the PRiSM FutureMusic#3 event at the Royal Northern College of Music in June 2020, alongside works from our UNSUPERVISED machine learning for music working group.

About the piece_

‘The genuineness of a thing is the quintessence of everything about it since its creation that can be handed down, from its material duration to the historical witness that it bears.’

The Work of Art in the Age of Mechanical Reproduction, Walter Benjamin

A sound object has an aura.

Taking as its starting point the sound object, a sonic fragment or atom of authentic matter: what happens to this materiality when processed by a neural network? What new sonic materials and aesthetics will emerge? Can the AI system project newly distilled hybrid forms, or will the process of data compression result in a lo-fi statistical imitation?

For this piece, my first experiment with neural synthesis, I sought to collide the two disciplines of musique concrete and machine learning to take the listener on a journey through the process of training a SampleRNN model. 

A tale of two states, AURA MACHINE begins with the training data, the original source material comprising the concrete dataset. Field recordings were categorised into distinct classes: ‘Echoes of Industry’ (Manchester mill spaces), ‘Age of Electricity’ (DIY technology, noise & machinery) and ‘Materiality’ (glass fragments and metal sound sculptures). The second state is the purely generated AI output audio.

Can a machine produce an aura?

AURA MACHINE residency article written for PRiSM, Royal Northern College of Music: READ HERE

AURA MACHINE RESIDENCY BLOG AND RESEARCH SITE HERE.

This project is kindly supported by Arts Council England.

Residency at NOVARS: Machine learning & Musique concrete

I’m joining the NOVARS Research Centre for Innovation in Sound at the University of Manchester as artist in residence for 2020-2021, exploring systems for machine learning and musique concrete. The residency is part of the European Art Science Technology Network for Digital Creativity, a partnership across 14 European institutions. My research question is “How can concrete materials and neural networks project future sonic realities?”

Building upon my conceptual research into ML and music, the residency will give me the time, space and support to develop my technical skills and create my own systems to realise the idea for the AURA MACHINE neural network. The AURA MACHINE architecture takes inspiration from Russian projectionism, thinking around Walter Benjamin’s aura of an object, musique concrete and sample culture.

I’ll be creating a training dataset of concrete materials and echoes of industry and processing it through generative neural network architectures, considering process, acousmatic sound and issues of bias, labour and automation at each stage of system development. A rough sketch of what the dataset-building step might look like is below.
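
As a minimal, purely illustrative sketch (not the PRiSM SampleRNN tooling itself, and with placeholder folder names and chunk length), here is one way long field recordings could be split into short, normalised, class-labelled training examples in Python:

```python
# A minimal sketch (not the PRiSM SampleRNN tooling itself) of how a concrete
# training dataset might be prepared: long field recordings are split into
# fixed-length, normalised chunks, organised by class. Folder names below are
# placeholders for my own recording categories.
from pathlib import Path
import numpy as np
import soundfile as sf

SOURCE = Path("recordings")        # e.g. recordings/echoes_of_industry/*.wav
OUTPUT = Path("dataset")
CHUNK_SECONDS = 8                  # assumed training-example length

for wav_path in SOURCE.rglob("*.wav"):
    audio, sr = sf.read(wav_path)
    if audio.ndim > 1:
        audio = audio.mean(axis=1)                  # fold to mono
    audio = audio / (np.abs(audio).max() + 1e-9)    # peak normalise
    chunk_len = CHUNK_SECONDS * sr
    out_dir = OUTPUT / wav_path.parent.name         # keep the class label
    out_dir.mkdir(parents=True, exist_ok=True)
    for i, start in enumerate(range(0, len(audio) - chunk_len, chunk_len)):
        chunk = audio[start:start + chunk_len]
        sf.write(out_dir / f"{wav_path.stem}_{i:03d}.wav", chunk, sr)
```

The resulting folders of short WAV files are the kind of class-labelled material a sample-based neural synthesis model can then be trained on.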

I’ll be using Python and Max to create ML systems for sound sculpture and live electronics, and I’m really excited to be collaborating with the wonderful people at PRiSM, the Centre for Practice & Research in Science & Music at the Royal Northern College of Music, through our new ML and music working group across the two institutions. I am really looking forward to being part of this new creative community for machine learning in Manchester and gaining support and insight from academic specialists in electroacoustic music and data science.

I will be starting a research website/blog for the project; until this is ready, here is a talk I gave at the ZKM inSonic Festival in December 2020 setting out the research framework for the residency.

inSonic talk at ZKM

ANU_Autonomous Noise Unit. Noise Orchestra

autonomousnoiseunit.co.uk

News from Noise Orchestra in 2020: we’ve been working on ANU, a new project to develop technology that helps musicians play together online. ANU is a hardware unit that runs JackTrip software on a Raspberry Pi single-board computer; musicians who have an ANU unit can easily patch in with their instrument or mic, find other ANU users and improvise live over the internet via our NOISE SERVER. Think of it as an online rehearsal room.

The project was born out of the pandemic: Dave and Sam were testing the available online tools for networked jamming and finding a mix of latency issues and not-so-great audio quality through the well-known video-chat services. The best open-source software out there, JackTrip (developed at Stanford), provided the best audio solution but was proving a difficult interface and setup for beginners to get started with. The research and development was therefore to explore how JackTrip could be used on a small hardware module with an easy interface for beginners, where musicians could simply plug in and play music together online. We were pleased to be supported by Innovate UK’s COVID-19 emergency response fund to build a hardware prototype, develop a web platform and get musicians involved in testing!
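
For anyone curious what ‘plug in and play’ might look like under the hood, here is a minimal, purely illustrative sketch of a headless unit wrapping the JackTrip client from Python. The server address is a placeholder, it assumes a JACK audio server is already running on the Pi, and it is not the actual ANU firmware:

```python
# Illustrative sketch only: roughly how a headless unit like ANU might wrap
# the JackTrip command line so a musician just plugs in and connects.
# The server hostname is a placeholder; the real ANU setup, dashboard and
# Noise Server logic are not shown here.
import subprocess

NOISE_SERVER = "noise-server.example.net"   # hypothetical hub address

def connect(server: str = NOISE_SERVER, channels: int = 2) -> None:
    """Launch a JackTrip client session against the hub server."""
    cmd = [
        "jacktrip",
        "-c", server,            # client mode, connecting to the given host
        "-n", str(channels),     # number of audio channels to send/receive
    ]
    # Blocks until the session ends; a real unit would supervise/restart this.
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    connect()
```

The real unit also handles user profiles, session discovery and recording via the NOISE SERVER, which this sketch leaves out.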

I worked on the visual design for the project: ANU, god of the skies! We wanted to communicate ideas of transmission, summoning up ancient ritual noise fragments and communing with others across the network. Dave created a fantastic narrative for the piece (remember to click on the ANU logo and read the scroll).

ANU god of the skies narrative

The website platform provides an introduction to the project, a technical ‘how to’ guide, a signal-flow approach and frequently asked questions. Importantly, musicians can create their own login profile, and the server automatically saves the audio of each recorded session for archive and playback. The platform represents the first phase of development; Dave and Sam are continuing live jam tests across the UK, Europe and further afield, and it’s been fascinating to test these geographical potentials. The internet as a space for improvisation is fertile ground for experimentation, bringing into question how we collaborate, perceive and communicate within this dimension, and what that means for the players and the listeners within networked time-based media. We intend to progress the project with more groups and testing, and with further technical developments, including a listen-back page on the website where previous jams can be livestreamed and played back.

Visit the ANU website to learn more about the project, or our Noise Orchestra Blog to read about the technical development stages. It was a brilliant collaborative project, bringing the wonderful Sam Andreae into the Noise Orchestra fold, who did an amazing job coding the hardware and website. We also worked with Tom Ward on the Noise Server and, as ever, the legend Chris Ball, who made the dashing laser-cut boxes for the test modules.

signal flow diagram

We’re enjoying the feedback from musicians and groups who have been using the ANU units, and we’ll keep you posted on the next stages.

ANU_ DEDICATED TO THE ANCIENT ONES!

DISTANT ARCADES MUTEK

SLEEPSTATES EXHIBITED IN DISTANT ARCADES | MUTEK MONTREAL

As part of AmplifyDAI we were invited to present our work within the context of MUTEK Montreal’s Hybrid online exhibition. My sound piece SleepStates featured within Distant Arcades, the festival’s virtual exhibition.

About the piece

SleepStates is a sound work exploring machine addiction, sleep territories and sonic algorithmic control, utilising sound sculpture, DIY electronics, broken radio transmissions and an AI trained on lucid dreams, self-help slumber fragments and cyber-socialist manifestos. The piece is part of my ongoing SLEEP_STATESDOTNET project, a browser-based artwork where the user sleepwalks between different states, or audiovisual moments, of anxiety, inertia and online perpetuity. I collaborated with digital artist Izzy Bolt, who created the video piece working in TouchDesigner.

About Distant Arcades

Distant Arcades is interested in how artists are convening and creating using distance and technology. It features sound works, videos, 360s and virtual reality, echoing the tools and concepts discussed during MUTEK Forum: technology and the city, algorithmic bias, the collection of personal data, technological racism, and the links between technology, machines and human emotions.

Turn it Up

In TURN IT UP we presented our work online to MUTEK audiences; these artist talks were a chance to connect with MUTEK’s global community of artists, technologists and curators.

This opportunity was part of AmplifyDAI, the digital artist development programme I am on for 2020-2021, supporting women in the UK, Argentina and Canada. It was great to join the MUTEK community, and I’m looking forward to future collaborations.

UK-RUSSIA YEAR OF MUSIC

MACHINE LEARNING AND MUSIC AURAMACHINE RESEARCH TRIP

BRITISH COUNCIL | FEBRUARY 2020

As part of the British Council’s UK-Russia Year of Music programme I travelled to Russia to undertake creative research for AURA MACHINE, my (then) speculative project idea: a proposed artwork, in the early stages of development, to create a sound-sculptural machine that uses machine learning algorithms to generate new audio output based on a dataset of material concrete sounds. I wanted to explore the potential of live inputs with neural networks to realise future materialities. Supported by my UK partner FutureEverything and my Russian partner Ilia Symphocat, the main aims of my trip were to:

  • Connect with artists using machine learning in their artistic practice
  • Collaborate and perform with my host partner Ilia Symphocat
  • Sound sculpture and machine design – explore visual aesthetics by visiting collections of technical and sonic objects
  • Field recordings – record urban sounds for ML datasets in St Petersburg and Moscow

St Petersburg

I arrived in beautiful St Petersburg to meet my host partner Ilia Symphocat – ambient composer, Simphonic Silence Inside label owner and curator of the Sound Museum – and to learn about his experiences and perspectives on performing with AI. On the first night Ilia took me to a DIY space for local artists for an exhibition opening by Nikita Panin, an artist working with visuals and machine learning. Nikita took inspiration from the insides of disk drives and considered the synergy of these forms with Russian Orthodox spiritual iconography. He trained a neural network on a dataset of 6,000 sacred images and abstract paintings, presenting the outputs as large-scale videos, prints and sculptures. The aesthetic was vibrant and pretty psychedelic, and through speaking with Ilia’s artist friends I learned that this visual style was quite typical of 90s St Petersburg rave culture. It was great to gain an insight into the local experimental arts scene and to meet artists and musicians who were so welcoming and interested in our arts community in Manchester (especially Joy Division!).

I spent time at the Pushkinskaya Art Centre 10, a legendary alternative complex housing many artist studios, museums and galleries, including the wonderful Sound Museum and the Museum of Nonconformist Art. I met curator Lora Kucher, who took me around the current digital photography exhibition she had worked on, featuring Manchester and Russian artists, and spoke about their work and history.

My host Ilia lived in the Pushkinskaya building, having his studio there and working as the curator and events manager for the Sound Museum. Following some time recording objects in the museum’s collections, Ilia and I spent the day at his studio, talking and sharing our experiences of making music and performing, and about our cities and music communities in Manchester and St Petersburg, which incidentally are twinned cities. Ilia shared his experience of working with AI in live performance: he had worked with datasets of classical and jazz music, which he reworked and improvised with in real time for his live set at Gamma Festival. We discussed how ‘live’ this can really be, the complexity and perspectives of working at this level of technicality when you are a musician, and the importance of collaborating equitably with technologists.

Ilia Symphocat, Sound Museum

We performed at the Museum of Nonconformist Art, a highly improvised and collaborative experience which we both enjoyed immensely, blending granulated field recordings and freezes with radio transmission fragments and some live objects. We had a good, attentive crowd for a noisy Sunday evening in the gallery, with Ilia hosting, introducing me and interpreting for the audience while I talked through the electronics and DIY interfaces I was going to play with. We worked with live visual artist Mikhail Mesyac, who created a live digital backdrop for the set.

Other activities in St Petersburg included research on sculpture design through visiting the hugely inspiring ‘Kinetic Art in Russia’ exhibition at the Grand Exhibition Hall, charting the movement’s origins in constructivism through to artists working with electronics in the present day; field recording around the city and with objects at the Sound Museum; and working on music in my temporary Airbnb studio! I found an amazing radio at a flea market on the outskirts of town, a Russian 303 with no aerial, simply emitting white noise. I used this object as part of my performance with Ilia on the Sunday, combined with my field kit, exploring and sampling the broken frequencies of each area I travelled to. It was my first time in St Petersburg and I loved the elegant city and the sparky people, who were so welcoming and generous.

Moscow

I was excited to travel back to Moscow; it had been five years since I was first there with Noise Orchestra, when Dave and I were researching 1920s noise machines, playing, and making electromagnetic field recordings around town. I took the bullet train from St Petersburg very early in the morning, taking in the terrain between the two cities, and pulling into Moscow I had forgotten just how bombastic it is, the sheer scale and might of it. I enjoyed riding the Garden Ring at night, taking in the dramatically lit architecture and the boulevards stretching off into the distance, and I remembered walks through Gorky Park and quiet times recording on the streets around Tsvetnoy Bulvar and Patriarch’s Ponds.

Helena Nikonole and Nikita Prudnikov

I spent time with artist and curator Helena Nikonole and creative ML musician and technologist Nikita Prudnikov, who are frequent collaborators on a range of projects exploring art and machine learning. One of their most recent collaborations, Bird Language, explores the structure of bird sounds through experimenting with neural networks. I had met Helena previously in Berlin on ‘The Work of Art in the Age of Artificial Intelligence’ project at CTM, and Helena had led the AI Hacklab for Gamma Festival. I was interested to discuss their experience of live sampling with AI, the practicalities of datasets and their applications (for example, soundscapes versus one-shot sample libraries), and potential open-source methodologies.

ML harpsichord score

I learned a lot, practically and conceptually, from Nikita and Helena about different neural network architectures, from GANs to autoencoders, and about types of dataset. Nikita also shared a number of sounds and pieces testing these architectures, some including voice: a call-and-response with an AI that was totally uncanny and quite eerie, a style transfer method featuring Brodski, and footage of a live exhibition (curated by Helena) where he had performed with Katerina, a harpsichord player. The sounds were unlike anything I had heard before, and I loved the concept of working in latent space to seek out timbres. I spoke about my work with radio-frequency searching recordings, and we discussed the parallels with exploring space and working with fragments. It was also good to talk with Helena about our experiences and the importance of genuine collaboration between artists and creative technologists when working in advanced fields such as this.

The twinned cities of Manchester and St Petersburg

Huge thanks to the British Council for enabling me to undertake this trip. It was so inspiring to spend time with such amazing people, build connections between Manchester and St Petersburg/Moscow and learn about this field. In terms of my practice, it has helped immensely with thinking about the next steps in project development for AURA MACHINE. Special thanks to Irini Papadimitriou and the FutureEverything team, who supported me as my UK partner, and to Tom Sweet and Evgenia Gerasimova from the British Council for their wonderful support.