Happy to be involved in this fascinating New Scientist article on MIDI 2.0’s potential.
Lecture and Workshop for Ableton at Glasgow’s Question Session in the beautiful Lighthouse venue. Make music from anything.
Feb 8 2020 Free Entry – For all Info: https://www.questionsession.co.uk
For centuries, composers have been reaching out beyond the musical world into nature, science and other disciplines for inspiration. Pythagoras conceived of a harmony created by the orbiting planets (the Music of the Spheres), Newton attached the seven colours of the rainbow to the notes of a scale, composers like J.S. Bach encoded their names into musical motifs, and Villa-Lobos wrote melodies tracing the New York skyline. This workshop enables musicians of all styles to tap into this vast and profound craft of ‘data-music’. This long-established but niche craft has been given a renaissance by contemporary technology: Ableton Live with bespoke Max for Live devices (available to participants in the workshop and distributed online) allows a world of real-time music creativity beyond the limits of human imagination. We will demonstrate techniques such as the automatic translation of your name into melodies, works of art into rhythms, spider webs into virtual harps, and live weather reports into MIDI controls, among countless other possible translations. Whatever your stylistic interest, this approach lends uniqueness and profound meaning to your music-making, allowing you to tap into an infinite and uncharted universe of musical creativity.
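To give a flavour of the core idea (this is a hypothetical sketch in Python, not the workshop’s actual Max for Live devices), here is the simplest possible ‘name into melody’ translation: each letter becomes a scale degree, and each degree a MIDI note.

```python
# Hypothetical 'data-music' sketch: letters of a name mapped to
# scale degrees of C major (not the workshop's Max for Live devices).

C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI notes C4..B4

def name_to_melody(name):
    """Translate letters to scale degrees (mod 7), skipping non-letters."""
    notes = []
    for ch in name.upper():
        if ch.isalpha():
            degree = (ord(ch) - ord('A')) % 7
            notes.append(C_MAJOR[degree])
    return notes

print(name_to_melody("Bach"))  # → [62, 60, 64, 60]
```

The same mapping trick generalises to any data stream: swap the letters for weather readings or pixel brightnesses and the melody follows the data.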
A Max/MSP/Ableton Live tool I’m building to aid my Hidden Music projects. So many possibilities; these are literally first-go unpolished demos. Images by Bridget Riley.
Trying to push my Push skills up a peg with Peg and Steely Dan’s gorgeous µ harmonies.
Careful at the Rose Main Theatre, Kingston June 5 2018 2pm – This unique dance/theatre performance puts you in the care of five over-stretched nurses as they struggle to balance empathy and efficiency, compassion and clinical proficiency. Inspired by its makers’ experience of long-term hospitalization, Careful celebrates the skill, beauty and toil of professional nursing as seen through the eyes of the patient. Introduced by Professor Karen Norman, a leading expert in nursing, the performance forms part of The Art of Nursing, an annual event hosted by Kingston University and St George’s hospital.
This event is designed for students and professionals of nursing, though members of the public are very warmly welcomed to attend.
Careful was developed in collaboration with the Clinical Skills and Simulation team at Kingston University and St George’s University London. The collaboration has also led to the development of workshops designed to enhance self-awareness and non-technical skills of patient care, which now form part of the Nursing practice curriculum.
Careful is a project by Chimera, an arts company/research network dedicated to making engrossing artworks about, for and with the medical and healthcare sector. Led by Dr Alex Mermikides (Guildhall School of Music & Drama) and Dr Milton Mermikides (University of Surrey), we also create impactful events for students, researchers and the general public. Our work has been supported with funding from the Arts and Humanities Research Council and Arts Council England. www.chimeranetwork.org.
Duration 90 minutes, including introductory talk and post-show discussion. Please note that the event will be filmed for evaluation and publicity purposes. Book FREE tickets here
Here’s another classic process piece used as a ‘Push Etude’: Steve Reich’s Clapping Music. Even though it was written after Piano Phase, it is somewhat simpler (certainly to perform), relying on discrete rather than continuous phasing, so it fits well into the discrete conceptual world of MIDI rhythm. The challenge here is to program the seminal pattern (which, like much of Reich’s Ewe-inspired phase music, can be heard in triple or duple time). You could of course play it in, but I’m trying to roast my Push 2 programming chops. Duplicate the track and then shift it over in steps (you could also set global quantise appropriately and restart one clip at the appropriate metric point, but I wanted to make use of the lovely clip view now available). Unfortunately the Push offers little fine control over the offset: a shift move, as far as I can tell, is always a semiquaver (1/16), so I’ve set the Set to 6/8 rather than 6/4. I’m not sure of a more elegant way to reset the start offset than how I did it; let me know if you can find one!
Being an 8×8 grid (we do generally reside in the normative binary default rhythmic world like it or not), the Push represents the 12 slots over a row and a half (I’d like to be able to move the rows into 6s for example) so imagine it like this:
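The discrete phasing itself is just a rotation of a 12-slot pattern. Here’s a quick sketch (in Python, purely illustrative, nothing to do with the Live Set itself) that prints the composite of the fixed pattern against each rotated copy, which is essentially what the shift moves on the Push are doing:

```python
# Steve Reich's Clapping Music pattern (1 = clap, 0 = rest),
# heard against successively rotated copies of itself.
PATTERN = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]

def rotate(p, n):
    """Shift the pattern left by n semiquavers (what each Push shift does)."""
    n %= len(p)
    return p[n:] + p[:n]

for shift in range(13):  # shift 12 brings the parts back to unison
    p2 = rotate(PATTERN, shift)
    # X = both clap, x = one claps, . = rest in both
    line = ''.join('X' if a and b else 'x' if a or b else '.'
                   for a, b in zip(PATTERN, p2))
    print(f"{shift:2d}  {line}")
```

Each printed row is one section of the piece; after twelve shifts the composite returns to the opening unison.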
You can then apply the pattern to melodic material as I’ve shown later in the video. Enjoy, njoye, joyen, oyenj, yenjo, enjoy.
Ableton Push 2 and Live 10 are incredible tools, both progressive and able to integrate seminal electronic, process-based and generative creative practices. In order to start exploring their potential I’ve been experimenting with recreating classic works as succinctly and fluently as possible. Here’s Steve Reich’s Piano Phase using just one track and Live 10 and Push’s new melodic sequencer layout, which I find hugely valuable.
In essence you can break down the classic theme into its component pitches, and reform them by pitch rather than rhythmic placement.
Here’s the video and Live Set to explore. Piano Phase Push 2 Project
Quick overview: set the scale on Push to E Dorian and form the patterns from above on the 1st, 2nd, 5th, 6th and 7th degrees of the scale respectively. Once you can do this it can be fun to enter them in different orders, add chords to each of them, and of course use them in your own improvisational/compositional practice.
The phasing is super simple (naive, really): each dial completes a rotation, so you can settle on each semiquaver confidently before moving to the next rotation. This could all be done with microtemporal MIDI (creating fewer artefacts) using M4L devices, but I like the ‘in-the-box’ constraint of maximising pre-existing tools.
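Conceptually, each completed rotation displaces the second part by one more semiquaver against the first. A minimal model of that stepwise phasing (a Python sketch, with the pitches of the standard Piano Phase cell written in as MIDI note numbers from the score; the real Set does this with dial rotations, of course):

```python
# Rough model of Piano Phase's stepwise phasing: part 1 holds the
# 12-note loop while part 2 is displaced by a growing number of
# semiquavers. Pitches assumed from the published cell: E4 F#4 B4 C#5 D5...
CELL = [64, 66, 71, 73, 74, 66, 64, 73, 71, 66, 74, 73]

def phased_pair(shift):
    """Return (part1, part2) with part2 rotated by `shift` semiquavers."""
    s = shift % len(CELL)
    return CELL, CELL[s:] + CELL[:s]

p1, p2 = phased_pair(1)
# After one semiquaver of displacement, part 2 starts on F#4 (MIDI 66),
# and after twelve displacements the parts are back in unison.
```

Stepping `shift` from 0 to 12 walks through the whole phase cycle, which is exactly what the dial rotations accomplish one semiquaver at a time.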
Piano Phase Push Project (change the MIDI instrument to whatever you like)
On Friday 23rd March, I’ll be giving an Ableton-hosted workshop at the CCA, Glasgow on Breaking 4/4 – rhythmic shenanigans galore.
Booking here and details below.
Renowned TedX Groningen and Ableton Loop keynote speaker, Dr Milton Mermikides and Ableton Certified Trainer Phelan Kane take a look at some less than conventional ways to generate rhythms and sound. Using Live and custom Max for Live devices, this workshop introduces a range of tools and methods to break out of standard repetitive cycles of electronic music composition. Through a series of exercises using custom-built Max for Live devices, they’ll explore Euclidean sequencers, odd meter, micro timing, hypermeter, swing and latency, with the aim of unleashing your creativity and exploring uncharted territory beyond the standard 4/4 landscape.
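As a taste of one of those tools: a Euclidean sequencer distributes a number of hits as evenly as possible across a cycle of steps, generating many classic world-music rhythms automatically. A minimal sketch of the idea (Python, purely illustrative, not the workshop’s Max for Live devices):

```python
def euclid(k, n):
    """Distribute k onsets as evenly as possible over n steps
    (a simple rotation of the Bjorklund/Euclidean pattern)."""
    return [1 if (i * k) % n < k else 0 for i in range(n)]

# E(3, 8) yields the familiar 'x..x..x.' pattern:
print(euclid(3, 8))  # → [1, 0, 0, 1, 0, 0, 1, 0]
```

Changing `k` and `n` sweeps through a large family of rhythms (E(5,16), E(7,12) and so on), which is one quick route out of the four-on-the-floor default.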
Bridget and Milton Mermikides will be performing their classical guitar and live electronic project, Tension Blue at Canterbury Christ Church University, preceded by a talk on Milton’s Hidden Music series. Wednesday 24th January 2018, St Gregory’s Centre for Music (Talk 11.45am, Concert 1.10-2pm), Free Entry.
Looking forward to being part of this panel discussion (click for tickets and info)
The first event in a series, the AES London Committee present a discussion exploring the relationship between creativity and technology. Chaired by Phelan Kane (Chair of the AES London Regional Committee), the aim is to create a dynamic forum that features free flowing discussion and debate with contribution from panel and audience members alike.
The purpose of this evening is to explore the relationship between technology and creativity within the landscape of modern audio practice. What form does this relationship take? How do modern audio practitioners use technology creatively within their everyday practice and what role does the technology play? How important is the creative output of practitioners within the development of new audio paradigms? How is R&D influenced by current creative workflow trends? Does the realisation of R&D lead to new creative workflows and to what extent do creative workflows influence the R&D process?
Confirmed Panel Members:
On Saturday 6th June 2015, I’ll be performing with John Williams, Gary Ryan and friends at the beautiful Shakespeare’s Globe in London. Among other works, we’ll be performing Phillip Houghton’s sumptuous Light on the Edge by candlelight. I’ll be providing electronics (courtesy of Ableton and one of my many MIDI controllers) and it should be rather magical, unless of course I accidentally play Chloe’s playlist of Wheels on the Bus and other hi-energy toddler classics.
Click the pretty picture for info and tickets.
Digital animator (with a PhD in Chemistry) Anna Tanczos has set my composition ‘The Escher Café’ to video for live performances. Here’s an extract with animated tessellating lizards no less.
Bridget and I will be performing at 1.30pm Sunday March 30th (University of Surrey) at the launch of the International Guitar Research Centre (IGRC) run by Steve Goss and me. We’ll be performing 7 new works for classical guitar and electronics. Not the usual guitar rep. Tickets are £2 for students and £10 for the rest of us. Would be lovely to have some friends (of ours and new music) there.
Times Higher Education has run a well-written feature on the Bloodlines project.
It’s a huge 160-page book of original (and beautiful) arrangements, with 2 CDs of Bridget playing the pieces.
The book launch at London Guitar Studio, with Craig Ogden, Gary Ryan and Amanda Cook playing a selection of the arrangements was too much fun to express in WordPress.
On the 14th October 2013, I’ll be joining the eminent musician and fellow Wodehouse character Peter Gregson at the Luton Music Club to max out on Minimalism. Details to follow, but on the programme is Terry Riley’s seminal work In C in all its heterophonic and generative gorgeousness.
Rather than play it (on my guitar) traditionally with the ensemble, I thought it would be fun (and more interesting) to use Ableton Live (and Push) to rebuild it so it ran generatively. I’ve made a very simple version which adopts most of the instructions; for any errors, kindly forgive me and my future generations.
Here’s a screenshot of the resultant tapestry:
And here’s the WIP for you to download in the spirit of musical democracy.
Hit scene 1 and the clips will tumble through. I’ve weighted the Follow Actions so that shorter phrases average more repeats, which makes sense musically. You can intervene, urging on any stragglers and holding back any clips forging too far ahead. Actually, I may need to consult a statistician, as the deviation in fall rates seems (to my fallible intuition) to suggest something is a bit skewy with the randomising process. But oh well, the variation between performances is intriguing (I’m on listen number 6 and still very happy). You can of course ride volumes, edit instruments and send out effects to your ears’ content. You can also set clip 53 to return to clip 1 rather than stop, so the piece lasts forever with clips lapping each other, adding another dimension to the work. If Follow Actions in Live were more sophisticated, or I had time to render it in Max, the clips could behave more intelligently: grouping together, dropping out and changing velocity more responsively as per the score instructions. But it actually works quite beautifully as is, which is a testament to the power of Riley’s concept.
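For the statistically curious, the spread in fall rates is easy to sanity-check with a quick Monte Carlo sketch (Python, with made-up Follow Action weights standing in for the actual Set’s values). A clip that repeats with chance A and advances with chance B plays a geometrically distributed number of times, and geometric distributions have a wide spread, which would explain the variation:

```python
import random

def plays_of_clip(repeat_weight, next_weight, rng):
    """Simulate one clip under Live-style Follow Action chances:
    each pass it repeats with weight repeat_weight or advances with
    weight next_weight. Returns total plays (geometrically distributed)."""
    plays = 1
    total = repeat_weight + next_weight
    while rng.random() < repeat_weight / total:
        plays += 1
    return plays

rng = random.Random(0)
samples = [plays_of_clip(3, 1, rng) for _ in range(10_000)]
mean = sum(samples) / len(samples)
# With a 3:1 repeat:next weighting, expected plays = (3 + 1) / 1 = 4,
# but the per-clip spread around that mean is large, so performances
# 'fall' through the 53 phrases at quite different rates.
```

So the randomiser is probably behaving; the skew is likely inherent in the geometric distribution that Follow Action chances produce.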
Incidentally, if you’d like to tweak the ‘fall rate’ (and hence the resulting approximate duration), change this number on the clips (you can select multiple), but remember that you may want longer clips with lower numbers. Again, I wish Follow Actions were more sophisticated, which would also allow the pulse to stop after all the other clips have finished, but I await a future version of Ableton Live.
Thanks to user Jeepee on the Ableton forum, whose patch I discovered when googling this idea. I’ve kept many of Jeepee’s clips as I like how he/she played them, but I’m also thinking of doing it PROPERLY in Max and crowdsourcing MIDI and audio clips from the interwebs, when the Earth slows and there are enough hours in the day for such mischief.
For those sensible people not into Ableton or this sort of thing, you can hear a version rendered from this patch here:
(If that doesn’t buffer or you are a suffering iPad browser) -> In C Live
I have a little place in Greece, on a lesser-known corner of the Peloponnese, on a little beach with a derelict and rarely visited acropolis from which the islands of Ψιλι, Πλατεια and (just about) Σπετσες are visible.
It’s a magical (and for me painfully nostalgic) place which, even after we eventually installed a phone (1996), a modem (2006) and wi-fi (2013), seems eerily frozen (well, baked) in time. This part of the world is home to some odd creatures: deafening cicadas, scorpions, flying fish, swordfish and a plant whose fruit explodes at the lightest touch.
One such unusual animal I have yet to (knowingly) see, but I’ve been fascinated by its sound for years. It’s some kind of bird that emits a short tweet at intervals so regular that we use it as a metronome. (It sounds particularly good on beat 4 & in a bossa.)
Here’s an unedited audio sample recorded on Tuesday, 7 July 2009 19:32
Notice how (separated by an unmeasured pause) there is a decent metronomic tempo established. Logic Pro X’s transient detector and beat mapping tools reveal that once a pulse is established it tends to stay within a couple of bpm. I’ve played with far worse time-keepers of the human species. Here are the numbers:
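The calculation behind those numbers is simply tempo from inter-onset intervals. Here’s a sketch of it (Python, with made-up onset times in seconds standing in for the real transient markers from Logic):

```python
# Sketch of the tempo analysis: hypothetical tweet onset times (seconds)
# standing in for the real transients detected in Logic Pro X.
onsets = [0.00, 0.52, 1.035, 1.555, 2.07, 2.59]

def bpms(onsets):
    """Instantaneous tempo (beats per minute) for each inter-onset interval."""
    return [60.0 / (b - a) for a, b in zip(onsets, onsets[1:])]

tempi = bpms(onsets)
drift = max(tempi) - min(tempi)
# With intervals this consistent, the tempo stays within a couple of bpm.
```

Run on the real onset times, the same calculation shows the bird settling into a pulse and holding it to within a few beats per minute.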
To get a feel for it, listen to the same unedited clip with a click track.
Does anyone know what type of bird it is, and what evolutionary pressures gave it such tight timing?
Live lecture/performance of BloodLines at the Dana Centre, Science Museum. Thursday July 18th, 7-9pm.