Sound Asleep


Henri Rousseau, The Sleeping Gypsy (1897)

Sound Asleep is a collaboration between Milton Mermikides and eminent sleep scientists Professor Debra Skene (University of Surrey), Professor Vladyslav Vyazovskiy (University of Oxford) and Professor of Computing Paul Krause (University of Surrey), together with several sleep researchers and video, audio and computer programmers. The project is developing techniques that allow the systematic translation of sleep data into musical compositions, with a number of outputs and events based on this system. The aims of the project are – through the creation of software resources and associated public engagement artworks and events – to reveal the nature of sleep and the hidden lives that we all share. Through the translation of data into sound, such phenomena as the disruption of sleep in the visually impaired, sleep apnoea and the transitions in brainwave activity between sleep states are captured and translated, allowing a musical communication – and aesthetic appreciation – of this information. The project is being disseminated to members of the sleep science, visually impaired, sleep disorder, Art/Sci and wider communities, allowing us all to experience this otherwise hidden – yet vital – part of our lives.

Sound Asleep was presented at the prestigious 2014 European Sleep Research Society conference in Tallinn, Estonia and – supported by a Surrey X-faculty award – will feature in a keynote presentation at the British Sleep Society conference at the Sage Gateshead in October 2015, and at TEDx Groningen in Spring 2016.

The X-faculty award has supported the design of software for the automated conversion of several modes of sleep data into sonic works. These include the conversion of sleep maps, revealing the (a)synchrony between circadian rhythms and clock time. The first subject has 'entrained sleep', in which the melatonin peak occurs at a similar time each day and there is a regular pattern of sleep.

The following subject, however, shows 'phasing' between circadian and clock time, resulting in interrupted sleep (and a correspondingly interrupted melody) and a displacement of rhythmic patterns.
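To give a rough indication of how such a mapping might work – the sketch below is illustrative only and does not reproduce the project's actual translation rules; the data and the onset-to-pitch mapping are hypothetical – a sleep map can be reduced to one onset/offset pair per night and rendered as a simple melody:

```python
# Illustrative sketch only: hypothetical rules for turning a "sleep map"
# (one sleep onset/offset pair per day, in decimal clock hours) into a melody.

# Hypothetical week of data: (sleep onset, wake time) in decimal clock hours.
entrained = [(23.2, 7.1), (23.0, 6.9), (23.4, 7.3), (23.1, 7.0), (23.3, 7.2)]
drifting  = [(23.0, 6.5), (0.8, 8.0), (2.5, 10.1), (4.3, 11.8), (6.0, 13.4)]

def sleep_map_to_melody(nights, base_midi=60):
    """Map each night to a (MIDI pitch, duration-in-beats) pair.

    The onset hour chooses a chromatic pitch class; sleep length sets note duration.
    """
    melody = []
    for onset, wake in nights:
        duration = (wake - onset) % 24               # sleep length in hours
        pitch = base_midi + int(round(onset)) % 12   # onset hour -> chromatic step
        melody.append((pitch, round(duration, 1)))
    return melody

print(sleep_map_to_melody(entrained))  # regular onsets -> repetitive, stable melody
print(sleep_map_to_melody(drifting))   # drifting onsets -> shifting, displaced pattern
```

In a scheme like this, the entrained subject yields an almost unvarying phrase, while the drifting subject produces the kind of displaced, interrupted pattern described above.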

PSG data may also be transformed into a 'score' revealing the interactions of salient events and parameters during one night's sleep. Here is the sound of normal sleep:

And – using exactly the same translation rules – an example rendering of a subject with severe sleep apnoea:

And, again using the same translation system, a subject with restless legs syndrome:
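To give a flavour of how PSG data might become note events – again an illustrative sketch with hypothetical stage-to-pitch rules, not the system used in the project – a hypnogram and a set of respiratory-event markers could be translated as follows:

```python
# Illustrative sketch only: hypothetical rules for turning a hypnogram
# (one sleep stage per 30-second epoch) plus apnoea markers into note events.

STAGE_PITCH = {"W": 72, "REM": 67, "N1": 64, "N2": 60, "N3": 55}  # deeper sleep -> lower pitch

def psg_to_events(hypnogram, apnoea_epochs, epoch_s=30):
    """Return (time_s, MIDI pitch, label) events from a hypnogram and apnoea markers."""
    events = []
    for i, stage in enumerate(hypnogram):
        events.append((i * epoch_s, STAGE_PITCH[stage], stage))
    for i in apnoea_epochs:                      # apnoeas become accented low notes
        events.append((i * epoch_s, 36, "apnoea"))
    return sorted(events)

# Hypothetical data: a short stretch of sleep containing two apnoea events.
hypnogram = ["W", "N1", "N2", "N2", "N3", "N3", "N2", "REM", "REM", "N2"]
print(psg_to_events(hypnogram, apnoea_epochs=[4, 6]))
```

Because the same rules are applied to every recording, a night broken by frequent apnoeas or limb movements sounds audibly different from normal sleep, which is the point of the comparisons above.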

Real-time EEG data recorded during the various phases of sleep may also be translated into sound using a range of algorithms, and each stage of sleep can be identified by its sonic characteristics.
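One common approach to this kind of real-time sonification – shown here only as an assumed, minimal sketch rather than the project's own algorithm – is to compute EEG band power in a sliding window and let the relative power in the delta, theta, alpha and beta bands control the levels of different voices:

```python
# Illustrative sketch only, not the project's algorithm: band power in a window
# of EEG controls the level of four notional sound "voices".
import numpy as np

FS = 256                       # assumed EEG sample rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window):
    """Relative power in each EEG band for one window of samples."""
    freqs = np.fft.rfftfreq(len(window), d=1 / FS)
    power = np.abs(np.fft.rfft(window)) ** 2
    totals = {name: power[(freqs >= lo) & (freqs < hi)].sum()
              for name, (lo, hi) in BANDS.items()}
    total = sum(totals.values()) or 1.0
    return {name: p / total for name, p in totals.items()}

# Synthetic 2-second "EEG" window: a strong slow-wave (delta) component plus noise,
# roughly what deep (N3) sleep looks like.
t = np.arange(2 * FS) / FS
eeg = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.random.randn(len(t))
print(band_powers(eeg))  # delta dominates -> the low-pitched voice would be loudest
```

Under a mapping of this kind, deep slow-wave sleep, REM and wakefulness each acquire a distinct sonic signature, which is what allows the stages to be told apart by ear.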

If you are interested in the project, please contact Milton.
