What does the skyline of New York sound like? How can you make a composition from your sleep patterns or blood cells? Music can be made from anything we find around us, from our names or birth dates to our cells, from atoms to stars. Composer and guitarist Milton Mermikides presents the fascinating origins and history of data sonification – the translation of information or patterns into sound and music – as well as a selection of his own compositions derived from sleep cycles, viruses, paintings, exoplanetary moons, traffic patterns and other ‘non-musical’ data. In addition, a string trio of the Ensemble Montage will demonstrate how these data sound and perform a new composition based on ‘the hidden music’ of Noorderzon Performing Arts Festival. Discover how music can reveal the patterns in the natural world, and give us both a theoretical and aesthetic appreciation of everything around us.
Tickets are € 5,- for students and subscribers of Studium Generale.
The 2nd International Guitar Research Centre Conference (March 18-23, 2016) has attracted speakers and performers from every continent and every guitar style. It’s a fantastic line-up, and the timetable is shaping up.
Translation of pendulum waves to music using a simple pitch-translation system on a 5-limit Yo scale. Watch the right-hand edge and all will make sense! This little experiment turned out so well that I think it deserves a whole project. My head and ears are spinning.
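The pendulum-wave mapping can be sketched in a few lines. This is a hypothetical reconstruction, not the actual system used in the video: I assume the classic pendulum-wave tuning (each pendulum completes one more swing per common cycle than its neighbour), a 60-second realignment cycle, a tonic of 220 Hz, and one plausible 5-limit tuning of the Yo pentatonic (ratios 1/1, 9/8, 4/3, 3/2, 5/3); the piece’s real parameters may differ.

```python
from fractions import Fraction

# Assumed 5-limit just-intonation ratios for the Yo pentatonic scale.
YO_RATIOS = [Fraction(1, 1), Fraction(9, 8), Fraction(4, 3),
             Fraction(3, 2), Fraction(5, 3)]
BASE_HZ = 220.0     # assumed tonic frequency
CYCLE_S = 60.0      # assumed time for the full wave pattern to realign
BASE_SWINGS = 51    # assumed swings of the slowest pendulum per cycle

def pendulum_events(n_pendulums: int, total_s: float):
    """Return sorted (time, frequency) note events for the first
    `total_s` seconds: each pendulum sounds its scale note every time
    it returns to the right-hand edge, i.e. once per full period."""
    events = []
    for i in range(n_pendulums):
        # Classic pendulum-wave tuning: pendulum i swings (51 + i)
        # times per 60-second cycle, so periods shorten gradually.
        period = CYCLE_S / (BASE_SWINGS + i)
        # Walk up the scale, wrapping up an octave every 5 pendulums.
        ratio = YO_RATIOS[i % 5] * 2 ** (i // 5)
        freq = BASE_HZ * float(ratio)
        t = period  # first return to the right-hand edge
        while t <= total_s:
            events.append((round(t, 3), round(freq, 2)))
            t += period
    return sorted(events)
```

Because neighbouring periods differ only slightly, the merged event stream drifts in and out of phase, which is exactly the visual wave the audio mapping makes audible.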
A real pleasure to appear with my sister Alex to talk about the Bloodlines project (and data sonification in general) on BBC Radio 4’s Midweek on Wednesday 28th October hosted by the quite brilliant Libby Purves. Fellow guests included the delightful and inspirational Peggy Seeger and Amati’s James Buchanan.
The next in the series of Hidden Music data sonification works. Data sonification is a long term interest/project/passion of mine, which involves the systematic translation of ‘non-musical’ data into music.
Here I’ve taken Kandinsky’s beautiful 1926 painting Several Circles and translated it systematically into sound. Colour and vertical position are translated into timbre and pitch respectively, as the red cursor scans the image horizontally.
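The scanning idea can be sketched as follows. This is a minimal illustration under assumptions of my own, not the mapping used in the piece: I use a tiny hard-coded ‘image’ of RGB tuples, a hypothetical MIDI pitch range of 36–96, a crude hue-to-timbre lookup, and brightness as amplitude (so near-black pixels fall silent). Column index stands in for the position of the scanning cursor.

```python
import colorsys

# Hypothetical 3x2 'image': rows of (R, G, B) tuples in 0-255.
IMAGE = [
    [(200, 40, 40), (40, 40, 200), (250, 220, 30)],
    [(30, 180, 60), (0, 0, 0), (255, 255, 255)],
]

LOW_MIDI, HIGH_MIDI = 36, 96  # assumed pitch range

def pixel_to_event(x, y, rgb, height):
    """Map one pixel to a note event: vertical position -> pitch,
    hue -> timbre label, brightness -> amplitude."""
    r, g, b = (c / 255.0 for c in rgb)
    hue, _, value = colorsys.rgb_to_hsv(r, g, b)
    # Top of the image = high pitch, bottom = low pitch.
    pitch = HIGH_MIDI - (y / max(height - 1, 1)) * (HIGH_MIDI - LOW_MIDI)
    # Crude timbre choice from hue (placeholder instrument names).
    timbre = ["strings", "winds", "brass"][int(hue * 3) % 3]
    return {"time": x, "pitch": round(pitch, 1), "timbre": timbre, "amp": value}

# Scan column by column, as the red cursor moves left to right.
events = [
    pixel_to_event(x, y, IMAGE[y][x], len(IMAGE))
    for x in range(len(IMAGE[0]))
    for y in range(len(IMAGE))
]
```

For a real painting you would read pixels from an image file instead of a hard-coded grid, but the mapping logic stays the same.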
Whether Kandinsky was a synaesthete is disputed, but his fusion of music and visual art in metaphor, working process and concept is well documented. From the link:
“Our response to his work should mirror our appreciation of music and should come from within, not from its likenesses to the visible world: “Colour is the keyboard. The eye is the hammer. The soul is the piano with its many strings.”
Kandinsky achieved pure abstraction by replacing the castles and hilltop towers of his early landscapes with stabs of paint or, as he saw them, musical notes and chords that would visually “sing” together. In this way, his swirling compositions were painted with polyphonic swathes of warm, high-pitched yellow that he might balance with a patch of cold, sonorous blue or a silent, black void.”
Here’s the first in a long series of data sonification experiments. This Hidden Music series is a long-term interest/project/passion of mine, which involves the systematic translation of ‘non-musical’ data into music. Here’s a simple example: the orbital periods of the planets of the solar system translated into pitch and rhythm. The rhythms are created simply by speeding up the actual orbital periods by 25 octaves (doubling the speed 25 times), and the pitches by transposing them up 37 octaves. I haven’t quantized pitch or rhythm, so it’s both microtonal (to the nearest cent, a 100th of a semitone) and microtemporal (to the nearest millisecond), but I hear a clockwork beauty in this irrational/chaotic collection of ratios nonetheless. Stay tuned for some even more distant harmony from some ex-planets. I recommend a sub-bass speaker to really feel Uranus and Neptune’s drones. Thanks to Rob Scott for his space science brain, and my long-term partner-in-nerd Anna Tanczos for the visuals.
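The octave-transposition arithmetic above can be sketched directly. The orbital periods here are approximate published values, and the function names are my own; the mapping itself (37 doublings for pitch, 25 for rhythm) is as described. Note how Neptune’s ~165-year orbit lands in the mid-20s of hertz, which is why a sub-bass speaker helps.

```python
# Approximate orbital periods in Earth days.
ORBITAL_PERIODS_DAYS = {
    "Mercury": 87.97, "Venus": 224.70, "Earth": 365.26, "Mars": 686.98,
    "Jupiter": 4332.59, "Saturn": 10759.22,
    "Uranus": 30688.5, "Neptune": 60182.0,
}

def period_to_frequency_hz(period_days: float, octaves_up: int = 37) -> float:
    """Treat one orbit as one waveform cycle, then transpose the
    resulting (inaudibly low) frequency up by `octaves_up` doublings."""
    base_freq_hz = 1.0 / (period_days * 86400.0)  # cycles per second
    return base_freq_hz * 2.0 ** octaves_up

def period_to_rhythm_seconds(period_days: float, octaves_up: int = 25) -> float:
    """Speed the orbital period up by `octaves_up` doublings to get a
    repeating rhythmic duration in seconds."""
    return (period_days * 86400.0) / 2.0 ** octaves_up

for planet, period in ORBITAL_PERIODS_DAYS.items():
    f = period_to_frequency_hz(period)
    t = period_to_rhythm_seconds(period)
    print(f"{planet}: {f:.2f} Hz, pulse every {t:.3f} s")
```

No rounding to scale steps or grid positions is applied, which is what keeps the result microtonal and microtemporal.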
The next international conference of the International Guitar Research Centre has been announced. It will take place 18th to 23rd March 2016. The call for papers, keynote speakers and headline concert artists can be found here. The deadline for proposals is midnight GMT on Friday 9th October 2015.
The IGRC has no stylistic or conceptual prejudice: if you are doing work that is innovative, creative and related to the guitar, we are interested. For further info