Bridget and Milton Mermikides will be performing their classical guitar and live electronic project, Tension Blue, at Canterbury Christ Church University, preceded by a talk on Milton’s Hidden Music series. Wednesday 24th January 2018, St Gregory’s Centre for Music (Talk 11.45am, Concert 1.10–2pm), Free Entry.
What does the skyline of New York sound like? How can you make a composition from your sleep patterns or blood cells? Music can be made from anything we find around us, from our names or birth dates to our cells, from atoms to stars. Composer and guitarist Milton Mermikides presents the fascinating origins and history of data sonification – the translation of information or patterns into sound and music – as well as a selection of his own compositions derived from sleep cycles, viruses, paintings, exoplanetary moons, traffic patterns and other ‘non-musical’ data. In addition, a string trio of the Ensemble Montage will demonstrate how these data sound and perform a new composition based on ‘the hidden music’ of Noorderzon Performing Arts Festival. Discover how music can reveal the patterns in the natural world, and give us both a theoretical and aesthetic appreciation of everything around us.
For students and subscribers of Studium Generale, tickets are €5.
A translation of pendulum waves into music, using a simple pitch-mapping system on a 5-limit Yo scale. Watch the right-hand edge and all will make sense! This little experiment turned out so well that I think it deserves a whole project. My head and ears are spinning.
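The mapping above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the system used in the video: I assume the classic pendulum-wave setup (pendulum k completes one more swing than pendulum k−1 over a fixed cycle), a trigger whenever a pendulum reaches its right-hand extreme, and one common 5-limit tuning for the Yo pentatonic (1/1, 9/8, 4/3, 3/2, 5/3); the tonic, pendulum count, and octave layout are all hypothetical choices.

```python
# 5-limit just-intonation ratios for a Yo (anhemitonic pentatonic) scale
# (assumed tuning: 1/1, 9/8, 4/3, 3/2, 5/3 over an arbitrary 220 Hz tonic)
YO_RATIOS = [1/1, 9/8, 4/3, 3/2, 5/3]
TONIC_HZ = 220.0

def pendulum_periods(n, total=60.0, base_cycles=51):
    """Classic pendulum-wave setup: pendulum k completes
    (base_cycles + k) full swings in `total` seconds."""
    return [total / (base_cycles + k) for k in range(n)]

def note_events(n=15, total=60.0):
    """Emit (time, frequency) pairs: each time pendulum k returns to
    its right-hand extreme, sound its scale degree, climbing an
    octave every five pendulums."""
    events = []
    for k, period in enumerate(pendulum_periods(n, total)):
        ratio = YO_RATIOS[k % 5] * 2 ** (k // 5)
        freq = TONIC_HZ * ratio
        t = 0.0
        while t <= total:
            events.append((round(t, 3), round(freq, 2)))
            t += period
    return sorted(events)
```

Because the pendulum periods are all rational fractions of the cycle length, the note stream drifts out of and back into phase, which is where the audible "wave" comes from.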
A real pleasure to appear with my sister Alex to talk about the Bloodlines project (and data sonification in general) on BBC Radio 4’s Midweek on Wednesday 28th October hosted by the quite brilliant Libby Purves. Fellow guests included the delightful and inspirational Peggy Seeger and Amati’s James Buchanan.
The next in the series of Hidden Music data sonification works. Data sonification is a long-term interest/project/passion of mine, which involves the systematic translation of ‘non-musical’ data into music.
Here I’ve taken Kandinsky’s beautiful 1926 painting Several Circles and translated it systematically into sound. Colour and vertical position are translated into timbre and pitch respectively, as the red cursor scans the image horizontally.
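The scan described above can be sketched as follows. This is a toy version under assumptions of my own, not the actual mapping used in the piece: I treat the image as a grid of RGB pixels, skip dark background pixels below a hypothetical brightness threshold, map vertical position exponentially onto a pitch range, and reduce "timbre" to a single hue value in [0, 1); the real translation of colour to timbre is surely richer.

```python
import colorsys

def scan_image(pixels, low_hz=110.0, high_hz=880.0, threshold=40):
    """Scan a grid of RGB pixels column by column (left to right, like
    the red cursor), emitting (column, frequency, brightness, hue)
    events for any pixel bright enough to count as painted.
    Vertical position maps to pitch (top = high); hue stands in
    for a timbre control."""
    height = len(pixels)
    events = []
    for x in range(len(pixels[0])):
        for y in range(height):
            r, g, b = pixels[y][x]
            if max(r, g, b) < threshold:   # skip the dark background
                continue
            # top row (y = 0) -> high_hz, bottom row -> low_hz
            frac = 1 - y / max(height - 1, 1)
            freq = low_hz * (high_hz / low_hz) ** frac
            hue, _, val = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            events.append((x, round(freq, 2), round(val, 3), round(hue, 3)))
    return events
```

Feeding the event stream to a synthesiser, with hue selecting the instrument or filter, gives a crude but recognisable "performance" of the canvas.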
Whether Kandinsky was a synaesthete is disputed, but his fusion of music and visual art, in metaphor, working process and concept, is well documented. From the link:
“Our response to his work should mirror our appreciation of music and should come from within, not from its likenesses to the visible world: “Colour is the keyboard. The eye is the hammer. The soul is the piano with its many strings.”
“Kandinsky achieved pure abstraction by replacing the castles and hilltop towers of his early landscapes with stabs of paint or, as he saw them, musical notes and chords that would visually “sing” together. In this way, his swirling compositions were painted with polyphonic swathes of warm, high-pitched yellow that he might balance with a patch of cold, sonorous blue or a silent, black void.”
Here’s the first in a long series of data sonification experiments. This Hidden Music series is a long-term interest/project/passion of mine, which involves the systematic translation of ‘non-musical’ data into music. Here’s a simple example: the orbital periods of the planets of the solar system translated into pitch and rhythm. The rhythms are created simply by speeding up the actual orbital periods by 25 octaves (doubling the speed 25 times), and the pitches by transposing them up 37 octaves. I haven’t quantized pitch or rhythm, so it’s both microtonal (to the nearest cent, a 100th of a semitone) and microtemporal (to the nearest millisecond), but I hear a clockwork beauty in this irrational/chaotic collection of ratios nonetheless. Stay tuned for some even more distant harmony from some ex-planets. I recommend a sub-bass speaker to really feel Uranus and Neptune’s drones. Thanks to Rob Scott for his space science brain, and my long-term partner-in-nerd Anna Tanczos for the visuals.
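The octave-doubling described above is easy to make concrete. This sketch is mine, not the project’s actual code: the approximate sidereal periods and the two octave counts from the text are the only inputs, and each period becomes a frequency (1/period, doubled 37 times) and a rhythmic loop length (period halved 25 times). Neptune indeed lands in sub-bass territory.

```python
# Approximate sidereal orbital periods in Earth days
PERIODS_DAYS = {
    "Mercury": 87.97, "Venus": 224.70, "Earth": 365.26,
    "Mars": 686.98, "Jupiter": 4332.6, "Saturn": 10759.2,
    "Uranus": 30688.5, "Neptune": 60182.0,
}
SECONDS_PER_DAY = 86400

def orbit_to_music(period_days, pitch_octaves=37, rhythm_octaves=25):
    """Translate one orbital period into a pitch (Hz) and a rhythmic
    loop length (s) by repeated octave doubling, as described."""
    period_s = period_days * SECONDS_PER_DAY
    freq_hz = (1 / period_s) * 2 ** pitch_octaves   # pitch: double 37x
    loop_s = period_s / 2 ** rhythm_octaves         # rhythm: halve 25x
    return freq_hz, loop_s
```

Under these assumptions Earth’s year becomes a tone around 4.4 kHz looping roughly once a second, while Neptune’s sinks to the mid-20s of Hz, hence the sub-bass recommendation.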
I asked my friend and many-time collaborator Anna Tanczos to visualise Villa-Lobos’s New York Skyline Melody for a recent lecture-presentation. The results are fantastic (I predict 1000s of views), and you can see exactly how Villa-Lobos translated the New York skyline into a solo piano work (note the multiple voices, with the foreground and background buildings). This piece has been a big inspiration to me in the field of data sonification. For more on New York Skyline Melody and similar works see here, and for all things data sonification here.