Artefacts of Clay and Information is a collaboration between Vicky Clarke (UK) and Joaquina S (Argentina). The collaboration is part of AMPLIFY DAI, which connects women working in digital and sound arts, and we have been developing the piece entirely online throughout the pandemic. This prototype was created for MUTEK Forum 2021 as a first-stage proof of concept. Our collaboration is now in a new phase of development, realising the piece for live audiences, with updates coming in 2022.
AURA MACHINE_ Experiment #1 Neural Synthesis PRiSMSampleRNN
AV piece from my musique concrete and machine learning research residency with NOVARS, the centre for innovation in sound at the University of Manchester, in collaboration with PRiSM. The piece premiered at the PRiSM FutureMusic3 event at the Royal Northern College of Music in June 2020, alongside works from our UNSUPERVISED machine learning for music working group.
About the piece_
‘The genuineness of a thing is the quintessence of everything about it since its creation that can be handed down, from its material duration to the historical witness that it bears.’
The Work of Art in the Age of Mechanical Reproduction, Walter Benjamin
A sound object has an aura.
Taking the starting point of the sound object, a sonic fragment or atom of authentic matter, what happens to this materiality when processed by a neural network? What new sonic materials and aesthetics will emerge? Can the AI system project newly distilled hybrid forms or will the process of data compression result in a lo-fi statistical imitation?
For this piece, my first experiment with neural synthesis, I sought to collide the two disciplines of musique concrete and machine learning to take the listener on a journey through the process of training a SampleRNN model.
A tale of two states, AURA MACHINE begins with the training data: the original source material comprising the concrete dataset. Field recordings were categorised into distinct classes: ‘Echoes of Industry’ (Manchester mill spaces), ‘Age of Electricity’ (DIY technology, noise and machinery) and ‘Materiality’ (glass fragments and metal sound sculptures). The second state is the purely generated AI output audio.
Can a machine produce an aura?
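The dataset stage described above can be sketched in code. This is a minimal, hypothetical illustration of preparing a “concrete dataset” for SampleRNN-style training, with recordings grouped into the piece’s three classes and sliced into fixed-length chunks; the chunk length, hop size and data layout are illustrative assumptions, not the project’s actual pipeline.

```python
# Hypothetical sketch of preparing a "concrete dataset" for a
# SampleRNN-style model: recordings are grouped by class, then each
# one is sliced into fixed-length, overlapping training chunks.
# Class names mirror the piece; chunk sizes are illustrative.

CLASSES = ["echoes_of_industry", "age_of_electricity", "materiality"]

def chunk_samples(samples, chunk_len=16000, hop=8000):
    """Slice a mono sample array into overlapping fixed-length chunks."""
    chunks = []
    for start in range(0, len(samples) - chunk_len + 1, hop):
        chunks.append(samples[start:start + chunk_len])
    return chunks

def build_dataset(recordings):
    """recordings: dict mapping class name -> list of sample arrays."""
    dataset = []
    for label in CLASSES:
        for rec in recordings.get(label, []):
            for chunk in chunk_samples(rec):
                dataset.append((label, chunk))
    return dataset

# e.g. a 2-second recording at 16 kHz yields three 1-second chunks at 50% overlap
fake_recording = [0.0] * 32000
print(len(chunk_samples(fake_recording)))  # 3
```

The model then learns sample-by-sample statistics from these chunks, which is where the “lo-fi statistical imitation” question above comes into play.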
AURA MACHINE residency article written for PRiSM, Royal Northern College of Music: READ HERE
This project is kindly supported by Arts Council England.
I’m joining NOVARS Research Centre for Innovation in Sound at the University of Manchester as artist in residence, 2020-2021, to explore systems for machine learning and musique concrete. The residency is part of the European Art Science Technology Network for Digital Culture, a partnership of 14 EU institutions. My research question is: “How can concrete materials and neural networks project future sonic realities?”
Building upon my conceptual research into ML and music, the residency will allow me time, space and support to develop my technical skills in order to create my own systems to realise the idea for the AURA MACHINE neural network. The AURA MACHINE architecture takes inspiration from Russian projectionism, thinking around Walter Benjamin’s aura of an object, musique concrete and sample culture.
I’ll be creating a training dataset of concrete materials and echoes of industry and processing it through generative neural network architectures, considering process, acousmatic sound, and issues of bias, labour and automation at each stage of system development.
I’ll be using Python and Max to create ML systems for sound sculpture and live electronics, and am really excited to be collaborating with the wonderful people at PRiSM, the Centre for Practice & Research in Science & Music at the Royal Northern College of Music, through our new ML and music working group across the two institutions. I am really looking forward to being part of this new creative community for machine learning in Manchester and to gaining support and insight from academic specialists in electroacoustic music and data science.
I will be starting a research website/blog for the project; until it is ready, here is a talk I gave setting out the research framework for the residency at the ZKM inSonic Festival in December 2020.
inSonic talk at ZKM
News from Noise Orchestra 2020: we’ve been working on ANU, a new project to develop technology that helps musicians play together online. ANU is a hardware unit that runs Jacktrip software on a Raspberry Pi single-board computer. Musicians who have an ANU unit can easily patch in with their instrument or mic, find other ANU users and improvise live over the internet via our NOISE SERVER. Think of it as a rehearsal room online.
The project was born out of the pandemic: Dave and Sam were testing the available online tools for networked jamming and finding a mix of latency issues and poor audio quality in the well-known videochat services. The best open source software out there, Jacktrip (developed at Stanford), provided the best audio solution but proved a difficult interface and setup for beginners. The research and development therefore explored how Jacktrip could run on a small hardware module with an easy interface, so musicians could simply plug in and play music together online. We were pleased to be supported by Innovate UK’s Covid-19 emergency response fund to build a hardware prototype, develop a web platform and get musicians involved for testing!
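Latency is the core constraint in networked jamming. As a rough illustration of the kind of budget being tested here, audio buffering contributes a fixed delay of (periods × frames ÷ sample rate) at each endpoint, on top of the network path. This is a back-of-envelope sketch with illustrative values, not measured figures from the ANU project.

```python
# Back-of-envelope latency budget for networked jamming. The buffer
# delay at each endpoint is periods * frames / sample_rate; the values
# below are illustrative assumptions, not ANU measurements.

def buffer_latency_ms(frames_per_period, sample_rate, periods=2):
    """One-way latency contributed by the audio buffer, in milliseconds."""
    return periods * frames_per_period / sample_rate * 1000

def round_trip_ms(buffer_ms, network_one_way_ms):
    """Total round trip: both players' buffers plus the network both ways."""
    return 2 * (buffer_ms + network_one_way_ms)

jack_buffer = buffer_latency_ms(128, 48000)      # ~5.3 ms per endpoint
print(round(jack_buffer, 1))                     # 5.3
print(round(round_trip_ms(jack_buffer, 10), 1))  # 30.7, with 10 ms one-way network
```

Small buffers and short network paths are what keep the round trip inside the roughly 25-30 ms window usually considered playable, which is why geography mattered so much in the jam tests described below.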
I worked on the visual design for the project: ANU, god of the skies! We wanted to communicate ideas of transmission, summoning up ancient ritual noise fragments and communing with others across the network. Dave created a fantastic narrative for the piece (remember to click on the ANU logo and read the scroll).
The website platform provides an introduction to the project, a technical ‘how to’ guide and signal flow approach, plus frequently asked questions. Importantly, musicians can create their own login profile, and the server automatically saves an audio file of each recorded session for archive and playback. The platform represents the first phase in development; Dave and Sam are continuing live jam tests across the UK, Europe and further afield, and it’s been fascinating to test these geographical potentials. The internet as a space for improvisation is fertile ground for experimentation, bringing into question how we collaborate, perceive and communicate within this dimension, and what that means for players and listeners within networked time-based media. We intend to progress the project with more groups, more testing and further technical developments, including hosting a listen-back page on the website where previous jams can be livestreamed and played back.
Visit the ANU website to learn more about the project, or our Noise Orchestra Blog to read about the technical development stages. It was a brilliant collaborative project, bringing the wonderful Sam Andreae into the Noise Orchestra fold; he did an amazing job coding the hardware and website. We also worked with Tom Ward on the Noise Server and, as ever, legend Chris Ball, who made the dashing laser-cut boxes for the test modules.
We’re enjoying the feedback from musicians and groups who have been using the ANU units, we’ll keep you posted on next stages.
ANU_ DEDICATED TO THE ANCIENT ONES!
SLEEPSTATES EXHIBITED IN DISTANT ARCADES | MUTEK MONTREAL
As part of AmplifyDAI we were invited to present our work within the context of MUTEK Montreal’s Hybrid online exhibition. My sound piece SleepStates featured within Distant Arcades, the festival’s virtual exhibition.
About the piece
SleepStates is a sound work exploring machine addiction, sleep territories and sonic algorithmic control, utilising sound sculpture, DIY electronics, broken radio transmissions and an AI trained on lucid dreams, self-help slumber fragments and cyber-socialist manifestos. The piece is part of my ongoing SLEEP_STATESDOTNET project, a browser-based artwork where the user sleepwalks between different states, or audiovisual moments, of anxiety, inertia and online perpetuity. I collaborated with digital artist Izzy Bolt, who created the video piece working with TouchDesigner.
About Distant Arcades
“Distant Arcades is interested in how artists are convening and creating using distance and technology. It features sound works, videos, 360s, and virtual-reality, echoing the tools and concepts discussed during MUTEK Forum: technology and the city, algorithmic bias, collection of personal data, technological racism, and the links between technology, machines and human emotions.”
Turn it Up
In TURN IT UP, we presented our work online to MUTEK audiences; these artist talks were a chance to connect with MUTEK’s global community of artists, technologists and curators.
This opportunity was part of AmplifyDAI, the digital artist development programme I am on, 2020-2021, supporting women in the UK, Argentina and Canada. It was great to join the MUTEK community, and I am looking forward to future collaborations.
MACHINE LEARNING AND MUSIC AURAMACHINE RESEARCH TRIP
BRITISH COUNCIL | FEBRUARY 2020
As part of the British Council’s UK-Russia Year of Music programme, I travelled to Russia to undertake creative research for AURA MACHINE, my (then) speculative project idea: a proposed artwork, in the early stages of development, to create a sound sculptural machine that uses machine learning algorithms to generate new audio output based on a dataset of material concrete sounds. I wanted to explore the potential of live inputs with neural networks to realise future materialities. Supported by my UK partner FutureEverything and RU partner Ilia Symphocat, the main aims of my trip were to:
- Connect with artists using machine learning in their artistic practice
- Collaborate and perform with my host partner Ilia Symphocat
- Sound Sculpture and machine design – explore visual aesthetics through visiting collections of technical and sonic objects
- Field Recordings – recording urban sounds for ML datasets in St Petersburg and Moscow
I arrived in beautiful St Petersburg to meet my host partner Ilia Symphocat, ambient composer, Simphonic Silence Inside label owner and curator of the Sound Museum, to learn about his experiences of performing with AI. On the first night Ilia took me to a DIY space for local artists for an exhibition opening by Nikita Panin, an artist working with visuals and machine learning. Nikita took inspiration from the insides of disk drives and considered the synergy of their forms with Russian Orthodox spiritual iconography. He trained a neural network on a dataset of 6,000 sacred images and abstract paintings, presenting the outputs as large-scale videos, prints and sculptures. The aesthetic was vibrant and pretty psychedelic, and through speaking with Ilia’s artist friends I learned that this visual style was quite typical of 90s St Petersburg rave culture. It was great to gain an insight into the local experimental arts scene and to meet artists and musicians who were so welcoming and so interested in our arts community in Manchester (especially Joy Division!)
I spent time at the Pushkinskaya 10 Art Centre, a legendary alternative complex housing many artist studios, museums and galleries, including the wonderful Sound Museum and the Museum of Nonconformist Art. I met curator Lora Kucher, who took me around the current digital photography exhibition she had worked on, featuring Manchester and Russian artists, and spoke about their work and history.
My host Ilia lived in the Pushkinskaya building, with his studio there, working as curator and events manager for the Sound Museum. After some time recording objects in the museum’s collections, Ilia and I spent a day at his studio, talking and sharing our experiences of making music and performing, and about our cities and music communities in Manchester and St Petersburg, which incidentally are twinned cities. Ilia shared his experience of working with AI in live performance: he had worked with datasets of classical and jazz music, which he reworked and improvised with in real time for his live set at Gamma Festival. We discussed how ‘live’ this can really be, the complexity of working at this level of technicality as a musician, and the importance of collaborating equitably with technologists.
We performed at the Museum of Nonconformist Art, a highly improvised and collaborative experience which we both enjoyed immensely, blending granulated field recordings and freezes with radio transmission fragments and some live objects. We had a good, attentive crowd for a noisy Sunday evening in the gallery, with Ilia hosting, introducing me and interpreting for the audience while I talked through the electronics and DIY interfaces I was going to play with. We worked with live visual artist Mikhail Mesyac, who created a live digital backdrop for the set.
Other activities in St Petersburg included research on sculpture design at the hugely inspiring ‘Kinetic Art in Russia’ exhibition at the Grand Exhibition Hall, charting its origins in constructivism through to artists working with electronics in the present day; field recording around the city and with objects at the Sound Art Museum; and working on music in my temporary Airbnb studio! I found an amazing radio at a flea market on the outskirts of town, a Russian 303 with no aerial, simply emitting white noise. I used this object as part of my performance with Ilia on the Sunday, combined with my field kit, exploring and sampling the broken frequencies of each area I travelled to. It was my first time in St Petersburg and I loved the elegant city and the sparky people, who were so welcoming and generous.
I was excited to travel back to Moscow; it had been five years since I was first there with Noise Orchestra, when Dave and I researched 1920s noise machines and played and made electromagnetic field recordings around town. I took the bullet train from St Petersburg very early in the morning, taking in the terrain between the two cities. Pulling into Moscow, I had forgotten just how bombastic it is, the sheer scale and might of it. I enjoyed riding the Garden Ring at night, taking in the dramatically lit architecture and the boulevards stretching off into the distance, and I also remembered walks through Gorky Park and quiet times recording on the streets around Tsvetnoy Bulvar and Patriarch’s Ponds.
I spent time with artist and curator Helena Nikonole and creative ML musician and technologist Nikita Prudnikov, who are frequent collaborators on a range of projects exploring art and machine learning. One of their most recent collaborations, Bird Language, explores the structure of bird sounds through experimenting with neural networks. I had met Helena previously in Berlin on ‘The Work of Art in the Age of Artificial Intelligence’ project at CTM, and Helena had led the AI Hacklab for Gamma Festival. I was interested to discuss their experience of live sampling with AI, the practicalities of datasets and their applications (for example soundscapes vs one-shot sample libraries), and potential open source methodologies.
I learned a lot, practically and conceptually, from Nikita and Helena about different neural network architectures, from GANs to autoencoders, and about types of dataset. Nikita also shared a number of sounds and pieces testing these architectures, some including voice: a call and response with an AI that was totally uncanny and quite eerie, a style transfer method featuring Brodski, and footage of a live exhibition (which Helena had curated) where he had performed with Katerina, a harpsichord player. The sounds were unlike anything I had heard before and I loved the concept of working in latent space to seek timbres. I spoke about my work with radio frequency searching recordings, and we discussed the parallels with exploring space and working with fragments. It was also good to talk with Helena about our experiences and the importance of genuine collaboration between artists and creative technologists when working in advanced fields such as this.
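“Working in latent space to seek timbres” has a concrete mechanical core: an autoencoder compresses each sound to a latent vector, and stepping between two such vectors can decode to hybrid sounds that exist between the originals. A minimal sketch of that interpolation step, using tiny placeholder vectors rather than real trained latents:

```python
# Minimal illustration of seeking timbres in latent space: stepping
# along the line between two latent vectors. The three-number vectors
# are placeholders; real latents come from a trained encoder and would
# be decoded back to audio at each step.

def lerp(a, b, t):
    """Linearly interpolate between latent vectors a and b at position t."""
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

latent_metal = [0.2, 0.9, -0.4]   # placeholder latent for a metal sound
latent_glass = [0.8, -0.1, 0.5]   # placeholder latent for a glass sound

# walk the line between the two sounds in five steps
path = [lerp(latent_metal, latent_glass, t / 4) for t in range(5)]
print(path[0])   # the metal sound's latent
print(path[-1])  # the glass sound's latent
```

The intermediate points are what make the method interesting musically: they decode to timbres that neither source recording contains.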
Huge thanks to British Council for enabling me to undertake this trip. It was so inspiring to get to spend time with such amazing people, build connections between Manchester and St Petersburg/Moscow and learn about this field. In terms of my practice it has helped immensely with thinking about the next steps in project development for AURAMACHINE. Special thanks to Irini Papadimitriou and the FutureEverything team who supported me as my UK partner, and Tom Sweet and Evgenia Gerasimova from the British Council for their wonderful support.
Inspired by internet cafe culture, Yemen: Say Hello to Connect is a travelling digital artwork exploring the interconnected themes of the humanitarian crisis through voice-interactive storytelling and generative art. Commissioned by Imperial War Museum North and FutureEverything, the aim was to use digital technology to engage public audiences in dialogue about the crisis in public spaces across Manchester and to encourage further visits to the ‘Yemen: Inside a Crisis’ exhibition at IWM North.
The piece was a collaboration with spoken word artist Amerah Saleh and creative technologist Chris Ball, and we also worked with KUNSTRUCT to realise the project within a physical pop-up structure. Reflecting curatorial content from the gallery exhibition and working closely with the IWMN team, the pop-up digital artwork explored the interconnected themes of food and water insecurity, childhood and education, and transport and infrastructure, through a storytelling narrative experienced via headphones and voice-responsive visuals.
I undertook early research into Yemen’s digital infrastructure, learning that before the crisis there had been an explosion of internet cafe culture, a burgeoning economy for young independent businesses, and that these cafes also acted as safe meeting spaces for women. Houthi rebels had seized control of internet pipelines in major cities across Yemen, so access to and control of the internet was a contested and political area. Starting with the idea of an internet cafe, I wanted to reflect this need for direct connection and create a physical pop-up space people could come into and connect digitally via the network. Aesthetically I was inspired by nineties cyber cafes, vaporwave vistas and synthwave.
CONNECTING THROUGH STORYTELLING_
The conflict is an extremely complex and sensitive subject, and it was really important to consider tone and balance; working with Amerah and the Yemeni community was key to thinking about how we could best connect to the stories and realities experienced by everyday people. Amerah’s storytelling brought these experiences to life in a personal, one-on-one dialogue in which we encouraged people to ‘take a moment out of their busy day’. The narrative journey begins when a user puts headphones on and speaks into the microphone, saying ‘Hello’ to connect, triggering the opening narrative sequence that builds a picture of the crisis, followed by an unfolding narrative with accompanying collage imagery.
LIVE POP UPS_
Pop-up events took place at Piccadilly train station, Great Northern Warehouse, the University of Manchester and HATCH. Each location had different activations, including live performances and conversation cafes. We connected with the Yemeni Community Association in Manchester, with Amerah running spoken word sessions with youth groups, building towards a live performance at Manchester’s Piccadilly train station. Our partners Reform Radio supported us at our launch at HATCH and broadcast a live ‘Into Continental’ show hosted by Dr Mystery, with live poetry, interviews and a Yemeni soundtrack.
TECHNOLOGY & INTERACTIVITY_
The interactive aspect of the piece encouraged users to put themselves in the place of Yemeni citizens and answer provocations, simple questions such as “How do you use water every day?” or “What would you stop doing first if your water was limited?”. Voice recognition turned each verbal response into triggers for images on screen, creating a uniquely personal generative sequence in real time. To construct the digital piece, ensure connectivity and create the voice-response interactivity, I worked with Chris Ball; we tested various modes of voice recognition, creating word libraries of possible word combinations and keywords that related to collaged imagery. User testing took place at Manchester Technology Centre, and each pop-up ran four stations on mobile wifi networks. This verbal dialogue with the narrator asks us to consider the complexity and interconnectedness of the issues Yemeni people are facing; to bring the reality of the situation home, the artwork poses difficult questions for us to consider in the context of our own lives.
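The keyword-matching step behind the interactivity can be sketched simply: a transcribed response is scanned against word libraries, and matching keywords trigger the associated collage imagery. The word lists and theme names below are illustrative stand-ins, not the project’s actual libraries.

```python
# A sketch of the word-library matching described above: scan a
# transcribed response for keywords and return the image themes they
# trigger. Word lists and theme names are illustrative assumptions.

WORD_LIBRARY = {
    "water": ["water", "drink", "wash", "shower", "rain"],
    "food": ["food", "cook", "eat", "bread", "market"],
    "transport": ["bus", "road", "travel", "fuel", "drive"],
}

def match_images(transcript):
    """Return the image themes triggered by words in a spoken response."""
    words = transcript.lower().split()
    themes = []
    for theme, keywords in WORD_LIBRARY.items():
        if any(word in keywords for word in words):
            themes.append(theme)
    return themes

print(match_images("I drink water and shower every morning"))  # ['water']
```

Because the sequence of matched themes depends entirely on what each visitor says, every generative sequence is unique to that conversation.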
The live pop-up events took place between May and September 2019, and the piece was shortlisted for SXSW 2020. You can read more about the project on the commissioners’ sites: FutureEverything and Imperial War Museum North.
Special thanks to the team Jacquie Reich, Claire Shaw, Joe Ford, Jez Houghton, Joe Whitmore, Irini Papadimitriou and Camilla Thomas.
My work was featured in the new SONIC FUTURES: HOW TECHNOLOGY IS GUIDING MUSIC documentary from FACT magazine and British Council Music.
Images and information about the documentary
The British Council and FACT have released a new mini-documentary about British technological innovation in electronic music that explores how contemporary UK artists are looking to the past, present and future to create new sounds and utopian spaces.
Sonic Futures: How Technology is Guiding Electronic Music speaks to several British artists at the cutting edge of composition, coding, engineering and performance: sound artist and DIY musical interface builder Vicky Clarke; producer Lee Gamble; composer and hacker Venus Ex Machina; interdisciplinary artist and Algorave musician Lizzie Wilson aka digital selves; and queer club collective Tremors.
The film explores how contemporary UK artists are using machines and code to facilitate new forms of collaboration, whether musicians should be afraid of machine learning and artificial intelligence’s movement into composition, and how music technology has the power to liberate individuals and bring marginalised communities together.
“In terms of the British history of electronic music we’re a nation of hobbyists and tinkerers and we like to build and make things,” says Manchester-based sound artist Clarke, whose DIY physical interfaces for controlling digital sound are part of a dialogue with the UK’s electronic music history spanning back to the foundation of the pioneering BBC Radiophonic Workshop in 1958.
Produced by FACT in association with the British Council
Directed by Anoushka Seigler
Edited by Kamil Dymek
Shot by Pawel Ptak and Pedro Kuster
AI: More Than Human footage courtesy of Barbican
Archive footage courtesy of BBC
Program advisors – Claire Lobenfeld and John Twells
MATERIALITY explored sound sculpture as a medium for interfacing physical materials with Ableton music software via Arduino and DIY electronics. Bringing together my love of field recordings, electronics and electronic music production, I researched the sonic and conductive properties of glass and metal at London Sculpture Studio and the National Glass Centre, resulting in new resonant sculptural instruments and a new physical/digital performance system. I collaborated with researchers at the National Graphene Institute to develop a capacitive controller for Ableton, and created an industrial musique concrete sample library from the processes of working with the materials. I performed with the graphene interface at Music Tech Fest Stockholm, Access Space and DINA in Sheffield, and was featured on the Composer-Curator Sound and Music podcast. Full project research blog here.
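The sensor-to-Ableton step in a controller like this is essentially a scaling problem: a raw capacitive reading (say, from an Arduino over serial) has to be mapped into the 0-127 range a MIDI control change expects. A hypothetical sketch, with the sensor range as an illustrative assumption rather than the project’s actual values:

```python
# Hypothetical sketch of mapping a raw capacitive sensor reading into
# the 0-127 MIDI control-change range Ableton expects. The 10-bit
# sensor range (0-1023) is an illustrative assumption.

def to_midi_cc(raw, raw_min=0, raw_max=1023):
    """Scale a raw sensor reading to a clamped 0-127 MIDI CC value."""
    raw = max(raw_min, min(raw_max, raw))            # clamp to sensor range
    return round((raw - raw_min) * 127 / (raw_max - raw_min))

print(to_midi_cc(0))     # 0
print(to_midi_cc(1023))  # 127
print(to_midi_cc(512))   # roughly half way
```

Each CC value would then be sent to Ableton over a MIDI connection and mapped to whichever device parameter the touch should control.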
Noise Orchestra were commissioned by ENLIGHT: European Light Expression Network to create ‘SWARM: Play the Light of the City’, a sound walk in which participants are provided with portable Noise Machines that respond to light in the urban environment. The walks took place in 2018 at Rome Media Art Festival, SPECTRA Aberdeen and Manchester Science Festival. Participants form a walking electronic drone orchestra, re-framing their relationship to architecture, the urban environment and spatial acoustics. The Noise Machines sample the environment and can pitch-bend with light; they also have three light-dependent waveform generators for synthesis in the environment. The technology was developed on residency at Pervasive Media Studios, Bristol, Fondazione Mondo Digitale in Rome and Eagle Labs, Salford. Read more about the project development here.
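A light-to-pitch mapping of the kind the Noise Machines use can be sketched as follows. This is an illustrative model, not the actual circuit: a light sensor reading bends an oscillator’s frequency, and an exponential mapping is assumed so that equal changes in light read as equal musical intervals. All ranges are assumptions.

```python
# Illustrative light-to-pitch mapping: a light-dependent sensor reading
# (0..1023 assumed) sweeps an oscillator between a low and high
# frequency. The exponential curve makes the sweep feel musically even;
# all values here are assumptions, not the Noise Machines' actual design.

def light_to_freq(light, low_hz=55.0, high_hz=1760.0, light_max=1023):
    """Map a 0..light_max sensor reading onto a low..high frequency sweep."""
    t = max(0, min(light_max, light)) / light_max   # normalise to 0..1
    return low_hz * (high_hz / low_hz) ** t         # exponential sweep

print(round(light_to_freq(0), 1))     # 55.0  (darkness -> low drone)
print(round(light_to_freq(1023), 1))  # 1760.0 (bright light -> high pitch)
```

In a walking orchestra, each streetlamp, shadow and lit facade becomes a playable gesture: moving through the city sweeps every participant’s oscillator along this curve.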