Embracing uncertainty, beginning a conversation
I've been using Max/MSP to process audio for a while, and am charmed by its ability to create dynamic, unpredictable results in real time. Having been in a Max mindset for the past few months, I've been finding the transition back to notating music quite difficult. I find myself sitting at the piano in front of a blank sheet of paper, paralyzed by every possibility my fingers feel out on the keyboard.
No, I shouldn't use this chord here. It should come later. But what chord comes now?
I don't want to commit to this harmony.
This rhythm is too active, but I don't know how to slow it down.
I liked this passage when I wrote it, but in context it's way too long.
Where do I go from here? Where do I even start?
These are the frustrating questions every composer faces when working on a new piece, but rather than leaning into them and powering ahead, I shut down when confronted with the uncertainty. But I don't hit these kinds of hurdles when I'm working in Max! I simply keep patching smaller and smaller bits together until I find something I like, and the ideas come as the system develops. I enjoy guiding the process of creation and decision-making, rather than dictating it. The end result usually becomes a template rather than a document: a purpose-built framework for improvisation, for performance, for gesture. It is my deliberate incorporation of unpredictability that frees me from worrying about it! And yet, I love instrumental music; I love working with musicians to interpret my score realized on paper. I enjoy a performer pulling myriad, unconscious threads from a score and infusing them with personal meaning.
When I began putting together a system to iterate different voicings of chords for a piano piece - and, to be honest, as a method of procrastinating *actually* writing the music - I realized that dovetailing these two interests was possible. Maybe I could use my fluidity in Max to free myself from that patient and overpowering abyss of the blank page, using custom-built algorithms to generate material.
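To give a rough sense of what "iterating different voicings of chords" means in code: my actual system lives in Max, but a minimal sketch of the idea in Python might look something like the following (the function name, pitch classes, and MIDI range are purely illustrative, not part of the real patch):

```python
import itertools

def voicings(pitch_classes, low=48, high=84):
    """Yield every voicing of the given pitch classes (0-11, C=0)
    whose notes fall within the MIDI range [low, high]."""
    # For each pitch class, collect all of its octave placements in range.
    candidates = [
        [p for p in range(low, high + 1) if p % 12 == pc]
        for pc in pitch_classes
    ]
    # Every combination of one placement per pitch class is a voicing.
    for combo in itertools.product(*candidates):
        yield tuple(sorted(combo))

# Example: listen through the first few voicings of a C major triad.
for v in itertools.islice(voicings([0, 4, 7]), 5):
    print(v)
```

The point isn't the specific enumeration strategy; it's that once the voicings are generated mechanically, my job shifts from inventing them to auditioning and curating them.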
Algorithmic composition is not a novel idea. John Cage famously used the I Ching to make musical decisions, Iannis Xenakis applied mathematical stochastic processes to composition, and even Mozart created music using dice. Using computers, composers like David Cope have created systems that can compose in the style of other composers. There are lots of tools out there for composers wanting to create music from algorithms, like AC Toolbox and Nodal. Why should I want to reinvent the wheel? Why not learn one of these systems?
Well, I want to create a system which I can develop from scratch with my own compositional language. I will never face a black box in which "I know that my code works, but I don't know why" (or, more frustratingly, not knowing why it doesn't work). I don't aim to program a machine that will spit out finished products for me - rather, I seek to have a two-way conversation with the code that I write. As I listen to the results, I'll curate the output, pulling out the happy accidents that come from the algorithms and refining their specific realizations on paper, recombining and rescoring fragments. And then I can use my observations of the raw output to refine the generator. JALG - what I'm calling this collection of modules at the moment ("Jason Algorithms") - will be teaching me, as I teach it. We'll be developing my language as a composer together.
This series of posts will not only document the gritty details of the technical obstacles and breakthroughs I encounter in building and uncovering the modules, but also help me address the philosophical questions inherent in any kind of composition:
What does it mean to be a composer?
Who is responsible for content?
How do I articulate, quantify, and generalize my musical preferences?
How does the realization of these preferences influence future preferences?
Some code-specific goals for this project:
1. Modularity - each of the tools I'm building should do one thing, and one thing only. For the most part, they should be replicable within a system - I should be able to have multiple copies of the same code operating independently within a patch. As I add more modules to the system, I aim to have them share the same kind of syntax so I don't have to reprogram anything to create something new (see the toy sketch after this list). In a way, I'm trying to build a replica of the Max programming environment itself - objects are "dumb" by themselves, but can be made extremely powerful through interconnection.
2. Flexibility - Though I began building this system to aid my composition of a piano piece, I want to be able to tweak and implement this toolbox in all sorts of situations - generating pitch, rhythmic, timbral, and formal material; creating notation or modifying parameters of electronic instruments. Though all the modules I've built at the moment output data as tempo-synced MIDI, I think they should be able to be incorporated into audio processing as well.
3. "Best practices" - The other goals mentioned above are concomitant with good habits in programming, but I always want to be focused on making code that is elegant, efficient, and easily debugged. This will include ample commenting, cleaning up visual presentation, making sure that arguments and variables are consistent across modules, and many smaller situational items. I have a bad habit of creating slap-dash machines in Max in which I have to copy and recopy the same code, and often put in stop-gap solutions that obscure the clarity of purpose. I'll keep in mind the goal of efficiency and elegance as I work on these modules, keeping in mind my future, forgetful self that will want to pull up old code and spend as little time as possible deciphering my own creations.
Thanks for following along.
Here are some of the first things I've done with the budding system: