A couple of weeks ago, my piano teacher and I were bouncing ideas off each other about meta-composing music software. The idea was this:
There is a system that takes MIDI input from a bunch of instruments and pushes output to the speakers and lights. The software running on this system analyzes the MIDI data it's getting and decides which sounds to use based on triggers set up by the composer (e.g., when I play an F7 chord 3 times within 2 seconds, switch from the harpsichord sound to the piano sound), pedals, or actual real-time analysis of the music. It would control the lights based on the performance and the sounds of the instruments in a similar fashion: the musician would only have to vaguely specify what they wanted, and real-time analysis of their playing would do the rest. Procedurally generated music could play along with the musician on the fly as well. Essentially, the software would play along with the performer, with each guiding the other. I imagine it would take some practice to get used to such a system, but it could have quite incredible results.
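To make the trigger idea concrete, here is a minimal sketch of the chord-trigger logic, assuming the Python `mido` library and the system's default MIDI ports. The F7 example, the count of 3, and the 2-second window are just the numbers from the paragraph above; nothing here is from an existing product.

```python
import time
import mido

# Hypothetical trigger from the description above: an F7 chord
# (F, A, C, Eb) struck 3 times within 2 seconds switches the patch.
F7_PITCH_CLASSES = {5, 9, 0, 3}   # F, A, C, Eb as pitch classes (C = 0)
TRIGGER_COUNT = 3
TRIGGER_WINDOW = 2.0              # seconds

def main():
    held = set()    # pitch classes currently held down
    hits = []       # timestamps of recent complete-chord onsets
    inport = mido.open_input()    # default MIDI input port
    outport = mido.open_output()  # default MIDI output port

    for msg in inport:            # blocks, yielding messages as they arrive
        now = time.monotonic()
        if msg.type == 'note_on' and msg.velocity > 0:
            held.add(msg.note % 12)
            if F7_PITCH_CLASSES <= held:      # all chord tones are down
                hits.append(now)
                hits = [t for t in hits if now - t <= TRIGGER_WINDOW]
                if len(hits) >= TRIGGER_COUNT:
                    # General MIDI program 0 = acoustic grand piano
                    outport.send(mido.Message('program_change', program=0))
                    hits.clear()
        elif msg.type == 'note_off' or (msg.type == 'note_on'
                                        and msg.velocity == 0):
            held.discard(msg.note % 12)

if __name__ == '__main__':
    main()
```

A real system would need per-channel state and debouncing so a held chord doesn't re-fire on every extra note, but the event loop would look much the same.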
I'm a big fan of improv jazz. One characteristic of improv that's lacking from other art forms is its transience. A painting can be appreciated 10 or 1000 years after it has been painted, but music (especially extemporized music) is as much about the performance as about the creation. I think the software I described would add a great deal to the performance, since with it, playing the exact same piece would result in a completely different show each time.
So, now for the questions.
Am I crazy?
Does software to do any or all of this exist yet? I've done some research and haven't turned up anything. The key requirement is that the system runs live, during the performance.
Were I to write something like this, would a scripting language such as Python be fast enough for the computations I'd need? Presumably it would be running on a fairly quick machine and could take advantage of the 2^n-core processors Intel keeps releasing.
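For a rough sense of scale, here is a back-of-the-envelope micro-benchmark (my own illustration, not from any existing system): MIDI delivers at most a few thousand events per second, and roughly 10 ms of trigger latency is tolerable in performance, so the question is whether per-event analysis in pure Python fits that budget.

```python
import time

# Micro-benchmark: cost of a simple per-event analysis step in pure
# Python (pitch-class bookkeeping plus a chord-membership test).
EVENTS = 100_000
held = set()
start = time.perf_counter()
for note in range(EVENTS):
    held.add(note % 12)
    _ = {0, 4, 7} <= held      # e.g. test for a C major triad
elapsed = time.perf_counter() - start
print(f"{elapsed / EVENTS * 1e6:.2f} microseconds per event")
```

On any recent machine this lands around a microsecond or less per event, orders of magnitude inside a 10 ms budget. The actual audio synthesis is the part you'd leave to a dedicated engine rather than doing it in Python.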
Can any of you share your experience and advice on interfacing with musical instruments, lights, and the like?
Have any ideas or suggestions? Cold and harsh criticism?
Thanks for your time in reading this, and for any and all advice! (And sorry for the joke in the tags, I couldn't resist.)
People have used Max/MSP to do this kind of thing with MIDI, creating video accompaniment or just MIDI accompaniment. It's a completely domain-specific app, probably inspired by Smalltalk or something, that barely any real programmer could love, but musician-programmers do.
Despite the text on the site I just linked to, and the fact that 'everyone' uses the commercial version, it wasn't always a commercial product. IRCAM eventually released its own branch of the lineage, called jMax. Pure Data, mentioned in another post here, is another rewrite of that lineage.
There's also Csound, which wasn't originally meant for real-time use, but can probably manage it now that computers are far faster than the machines Csound started on.
Some people have also hacked Macromedia Director extensions to do MIDI from Lingo... That's very outdated, though, and some of them have since moved to more modern Adobe environments.
Look at Pure Data. It can do extensive MIDI analysis, and folks use it for performance.
Indeed, here's a video that flashes past a Pure Data screen; it shows someone interacting with a rather complex instrument using Pd.
Also, look at Csound.