SLEEPSTATES.NET: networked dream-intermissions throughout lockdown
Remote Residency, Manchester International Festival 2020

ABOUT_ SLEEPSTATES.NET is a work-in-progress browser-based artwork where the user enters a virtual environment and navigates between different sleep states: a series of audio-visual moments capturing cyclical feelings of inertia, anxiety and online perpetuity experienced under lockdown. These networked dream-intermissions depict our quietly incremental machine addiction, those late-night silent boundaries of questionable consent where algorithmic agents threaten to encroach on our states of sleep: our final autonomous, un-trackable human slumber space.

The sleep states will be realised using newly acquired ‘isolation skills’, learnt via online tutorials in HTML, JavaScript and the Three.js library, creatively exploring my theoretical and technical research into neural networks, machine learning and the ethics of training datasets. The sleep state artwork will develop in content, narrative and complexity as my creative coding skills grow during the remote residency. The work will be available in your browser at some point … in the near future ………………………………………………………

Opening screen featuring Pink Moon April 2020 visualisation_ generated by a StyleGAN machine learning model trained on 480 found internet images of April pink moon uploads.



Life on lockdown has disbanded all regular structures, routines and sense of time. We’re living in free_fall: working from home, the melding of weekdays and weekends, clocks going forward, even simple concepts like bed_time. Our view of the future and the past has changed; this current condition seems hazy and perpetual with no end in sight, a cyclical recurring state where we are ‘on the cusp’ … of something. One thing is certain: our undeniable proximity to our computers and their hold over us.

Pink Moon explorations in latent space, machine learning experiments, April 2020 

Adding to our sense of inertia, we’re endlessly scrolling through the noise of social media, riding the rolling 24-hr apocalyptic news and witnessing daily governmental briefings where experts (yes, they’re back) share data visualisations from outside reality. Sleep is so important to our sense of time, pace and regularity. It’s one of the structures we haven’t lost in this crisis, yet we’re struggling to sleep, sleeping too much, sleeping at unorthodox times or feeling inexplicable tiredness. Machines are systematically encroaching on our states of sleep; what can we do?

///// Dreams become days ///// Days become dreams ///// Zeros become onessssss…………

To try and make sense of this societal free_fall, and as a counter to the mass of data and statistics, SleepStates will document these changing emotions and mental states as we navigate through this intermission. It will collate this audio-visual content in an interactive online artwork that recalls the grammar of early net art and mirrors contemporary digital surveillance tactics, seeking to reflect on our feelings of being online and displaced, connected yet remote, subjects ‘with’ complicity and the state of being in-between time.


Neural Network Architecture for AURA_MACHINE c. 2020

SONAMB corp. is a speculative AI startup which (as will become clear) is powering the SleepStates interface. Whilst the user innocently navigates the different SleepStates, experiencing a variety of sonic emotions, Sonamb is quietly extracting the user’s cognitive particulates, collating the sonic-spatial-spectral frequencies of your auditory dreams to submit to a vast SleepState dataset. Developers are at the ready to beta test a ‘brand new for 2020’ convnet for the forthcoming AURA_MACHINE: a deep learning algorithm that will compute, predict and broadcast live sonic dream intermissions direct to your auditory cortex, in multichannel, for f(r)ee …. whilst you’re sleeping.


For the project I’m learning HTML and CSS to create web content and styling for the different sleep states, combined with P5.js and Three.js for interactive and 3D visual content. Sound design experiments and sketches will be shared to my SoundCloud and combined with the navigable states as they develop. Cyclical narratives and interactivity will be completed before the site is hosted and available on the net.
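The cyclical navigation between states could be sketched as a simple wrap-around state machine in plain JavaScript — a minimal illustration only; the state names below are hypothetical placeholders, not the project’s actual sleep states:

```javascript
// Minimal sketch of cyclical navigation between sleep states.
// State names are illustrative placeholders, not the real ones.
const sleepStates = ['drift', 'inertia', 'anxiety', 'perpetuity'];

function nextState(current) {
  // Wrap around with modulo so navigation recurs endlessly,
  // echoing the looping, in-between quality of the work.
  const i = sleepStates.indexOf(current);
  return sleepStates[(i + 1) % sleepStates.length];
}

// Cycling through every state returns the user to the start.
let state = 'drift';
for (let step = 0; step < sleepStates.length; step++) {
  state = nextState(state);
}
console.log(state); // back to 'drift'
```

In the actual piece this kind of loop would drive scene changes in Three.js or P5.js rather than console output, but the wrap-around logic is the same.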


BACKGROUND ML_RESEARCH is a real-time work-in-progress methodology to creatively explore my current research in music and machine learning, and to pursue a deeper technical and conceptual understanding of this field. In 2019, I participated in ‘The Work of Art in the Age of Artificial Intelligence’ programme at CTM Festival, Berlin, followed by further research into Wekinator ML with Dr Rebecca Fiebrink as part of ‘Decoding DIY’ 2019 from Live Art Development Agency/HOME. In March 2020 I undertook a creative research trip to Moscow and St Petersburg as a selected artist for British Council’s UK-Russia Year of Music, meeting artists and technologists working in this field including Ilia Symphocat, Helena Nikonole and Nikita Prudnikov, supported by FutureEverything. I’m currently studying the ‘Introduction to AI and Neural Networks’ course at the University of Karlsruhe within the KIM ‘Critical Artificial Intelligence’ research group, led by Professor Matteo Pasquinelli, focused on media philosophy, technical history and contemporary ethics around current AI practices and societal implications.


SleepStates work in progress is featured at the end of an ArtistsDIY lockdown feature I did with FACT mag.

Thank you to Manchester International Festival for supporting this remote residency. #MIFCreatives2020

