Thursday, June 3, 2010

Handle and Blade


The nature of the common console is that the consoles on different laptops have to update and control each other. As a result, it is very easy to fall into endless, escalating feedback loops that can crash the computers involved. As diagram A shows, anything Malek does on his laptop (say, set a tone generator to produce a sine wave at 440 Hz) must be communicated to my laptop so that we are both apprised of the current state of the tone generator. If he turns his dial from 1 to 440, I have to see the dial on my screen go from 1 to 440. By the same token, anything I do on my laptop must be reflected on his. As diagram C shows, unless special measures are taken, the machines can wind up sending messages to each other until they crash.
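Outside Max, the runaway loop of diagram C can be sketched in a few lines of Python (a toy model for illustration, not our actual patch): each console blindly mirrors every change to its peer, and the peer mirrors it right back until the interpreter gives up.

    # Toy model of the runaway loop: each console mirrors every change to its
    # peer, and the peer does the same, so one dial turn ping-pongs forever.
    class Console:
        def __init__(self, name):
            self.name = name
            self.peer = None      # the console running on the other laptop
            self.freq = 0
            self.updates = 0      # count messages so the runaway is visible

        def set_freq(self, freq):
            self.updates += 1
            self.freq = freq
            self.peer.set_freq(freq)   # naive mirroring: tell the peer...

    malek, tad = Console("malek"), Console("tad")
    malek.peer, tad.peer = tad, malek

    try:
        malek.set_freq(440)            # ...which never returns normally
    except RecursionError:
        print("crashed after", malek.updates + tad.updates, "updates")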

Obviously there is a need to distinguish between two kinds of signals. We started calling them control and display signals. The difference is that a control signal propagates itself: it affects both the unit it is sent to and every other unit that unit is connected to. A display signal, on the other hand, does not propagate itself. It is sent to a unit, the unit receives it, and sends nothing on. The signal ends there.
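Continuing the toy Python model (again just an illustration, not the Max patch itself), the two kinds of signal would look something like this: the control message changes the state and passes the value on, while the display message stops where it lands.

    # Same toy model, but with the control/display distinction built in.
    class Console:
        def __init__(self, name):
            self.name = name
            self.peer = None
            self.freq = 0

        def control(self, freq):
            self.freq = freq                # change the state...
            self.peer.display(freq)         # ...and propagate to the other laptop

        def display(self, freq):
            self.freq = freq                # update what is shown; send nothing on

    malek, tad = Console("malek"), Console("tad")
    malek.peer, tad.peer = tad, malek

    malek.control(440)    # one hop across the "network", and the signal ends there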



As it happens, the coders of Max/MSP have built this distinction in. So for example, if you send the number "440" to a dial or a number box, the dial/box will display the number 440 and cause the number 440 to be sent to any other units downstream from it. On the other hand, if you preface (or "prepend", as Max has it) the number with the word "set", as in "set 440", the dial/box will simply show the number 440 without passing anything on.
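The behavior of the "set" prefix can be mimicked outside Max with a rough Python stand-in for a number box (again, an illustration only):

    # A rough stand-in for a Max number box: a bare number is displayed and
    # passed downstream, while a ("set", n) message is displayed only.
    class NumberBox:
        def __init__(self):
            self.shown = None
            self.outlets = []               # downstream units, each with a receive()

        def receive(self, msg):
            if isinstance(msg, tuple) and msg[0] == "set":
                self.shown = msg[1]         # "set 440": show it, pass nothing on
            else:
                self.shown = msg            # "440": show it...
                for unit in self.outlets:
                    unit.receive(msg)       # ...and send it downstream

    box = NumberBox()
    box.receive(440)             # displayed and forwarded
    box.receive(("set", 440))    # displayed only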

However, we found that even though we understood this, we kept on making connections that caused feedback loops. Even relatively simple patches in Max can look tangled, and it is not easy to spot feedback loops, especially when a loop is only completed once a different computer runs the same program on the network. Time and time again we found ourselves giving the other a patch that worked perfectly well when built on a standalone machine, but went completely nuts when it encountered its twin on the network.

It didn't help that the patches we made were nowhere near as simple as the synth above. They were complex and nested things: boxes within boxes, boxes triggering other boxes, and so on. What we needed was a way to "see" what we were doing.

Accordingly, I came up with the idea of likening the units to the parts of a knife, and classifying them as either handles or blades. I also decided to add one more type to our taxonomy of signals. Aside from control and display signals, we would acknowledge OSC signals (Open Sound Control signals), which were sent between OSC transmitters and OSC receivers, and which could be transformed into control and display signals.



According to this scheme, blades receive control signals, but are forbidden from sending OSC. Handles, on the other hand, are a mirror-image of the blade: they send OSC, but are forbidden from receiving control signals. Blades can be controlled both locally and remotely, but handles can only be controlled locally.
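Put as a Python sketch (the class names are mine, invented for the illustration; the real units were Max objects, and osc_send here is just a stand-in for the actual network send):

    class Blade:
        """Receives control signals, locally or from the network; never sends OSC."""
        def __init__(self, name):
            self.name = name
            self.value = None

        def receive_control(self, value):
            self.value = value          # change state; this is what drives the sound

        def receive_display(self, value):
            self.value = value          # update what is shown, nothing more

    class Handle:
        """Sends OSC out to the network; never receives control signals."""
        def __init__(self, name, osc_send):
            self.name = name
            self.osc_send = osc_send    # function that puts a message on the network
            self.value = None

        def move(self, value):          # a local gesture, like turning a dial
            self.value = value
            self.osc_send(self.name, value)

        def receive_display(self, value):
            self.value = value          # display only; note there is no receive_control

    # Wiring a handle to a blade through a stand-in for the network:
    blade = Blade("tone_freq")
    handle = Handle("tone_freq", osc_send=lambda name, v: blade.receive_control(v))
    handle.move(440)                    # the gesture ends up controlling the blade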

By creating this conceptual dichotomy, we acquired the power to judge whether a given unit/structure/connection/mechanism was "loop legal", as it were. Once a given unit was identified as either a handle or a blade, we were free to control blades with other blades, control blades with several handles, send and receive display signals between any units, and so on, without fear of creating a feedback loop.
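The same rules can be read off as a tiny "is this connection loop legal?" check, again as an illustrative sketch rather than anything we actually ran:

    # Loop legality, read straight off the handle/blade rules: display traffic
    # always dies where it lands, only handles send OSC, only blades are controlled.
    def loop_legal(source, signal, destination):
        if signal == "display":
            return True
        if signal == "osc":
            return source == "handle"
        if signal == "control":
            return destination == "blade"
        return False

    assert loop_legal("handle", "osc", "blade")           # a handle driving a blade over the net
    assert loop_legal("blade", "control", "blade")        # blades controlling other blades
    assert loop_legal("blade", "display", "handle")       # display signals are always safe
    assert not loop_legal("blade", "osc", "blade")        # a blade sending OSC invites a loop
    assert not loop_legal("handle", "control", "handle")  # handles are never driven remotely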

I see these concepts and pieces of language as one of the products of Coding Tuesday. I go into the evolution of the concepts in some detail to illustrate how the same kind of punk, DIY outlook that underlies the technical experiments can also generate analytical concepts and technical language: language that is useful and concrete, created out of immediate need, without recourse to books or the internet.

Netconsole

I'm wondering whether there's a taxonomy of interactions somewhere in some book on information theory. I have a growing list that so far contains seven types of interaction that we could implement between machines. I've named them after the image or practice that exemplifies each. The list below was started in the post on Music Of The Lost Cities; I've been modifying it since then. A short sketch of how a few of these types might combine input from several players follows the list.

1) Tabletop. People drumming on the same table, i.e. triggering stuff on a common instrument/program. The program reacts to all the triggers independently. This is basically no different from the situation where individual players play individual instruments.

2) Light Switch. The program/instrument can be in one of a set of mutually exclusive states. Anyone can specify that state, but doing so changes/overwrites whatever anybody else has done.

3) Stompbox. I modify somebody else's signal. A variation would be that my output would pass into the mix in parallel with the original signal.

4) Spirit Glass. The program/instrument's output is determined by multiple inputs that are averaged, or somehow combined, the way that the path of the planchette on an Ouija board is the vector sum of all the pressures exerted on it by the participants' hands. "Spirit Glass" refers to the ordinary drinking glass that Filipinos use as a planchette on an improvised Ouija board to play the game known locally as "Spirit of the Glass."

5) Dragon Dance. Parallel inputs are summed to create an output that is perceived as a single, complex output.

6) Slavery. For instance, video is slaved to the drums, so that it cuts exactly to the snare.

7) Counterpoint. For example, instead of slaving the video to the snare, the video creates a different pattern within the bpm of the drums. It basically acts like another drummer, creating original output based on implicit data.
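Here is the sketch promised above: a single shared parameter combined from several players' inputs under three of these types. The player names and numbers are invented for the example.

    # Three ways a shared parameter might combine input from several players.
    inputs = {"malek": 440.0, "tad": 220.0, "tengal": 330.0}

    def light_switch(inputs, latest="tad"):
        return inputs[latest]                        # last write wins; the rest is overwritten

    def spirit_glass(inputs):
        return sum(inputs.values()) / len(inputs)    # everyone's push averaged, planchette-style

    def dragon_dance(inputs):
        return sum(inputs.values())                  # parallel inputs summed into one complex output

    print(light_switch(inputs))   # 220.0
    print(spirit_glass(inputs))   # 330.0
    print(dragon_dance(inputs))   # 990.0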

For the last couple of weeks, Malek and I have been working on an implementation of the Light Switch type of interaction. We used OSC, Max/MSP and Max for Live, and called the project Netconsole. The idea was to make instruments that everybody on the network could control.

As an example, let's take the case of Malek coding a basic tone generator on his laptop. Although his laptop would be the one generating the actual sound, he would code the interface in such a way that both our laptops would show identical control panels for controlling that sound, just as if we were both looking at the control panel of a single physical machine. As I have already said, this was an instance of the Light Switch paradigm: both of us had access to the controls of a single machine, and any operation on it basically invalidated/deleted all previous operations, just as though we were two people fighting for control of a room's light switch or thermostat.
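For concreteness, here is roughly what that mirroring might look like as a Python sketch using the python-osc library instead of Max/MSP. The IP address, port numbers and the /tone/freq address are invented for the example; each laptop would run the same script, pointed at the other.

    # Each laptop mirrors the tone generator's frequency dial over OSC.
    from pythonosc.udp_client import SimpleUDPClient
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    PEER_IP, PEER_PORT, MY_PORT = "192.168.1.12", 9000, 9000
    peer = SimpleUDPClient(PEER_IP, PEER_PORT)

    shown_freq = 0.0

    def turn_dial(freq):
        """A local gesture (a handle): show it here and send OSC to the peer."""
        global shown_freq
        shown_freq = freq
        peer.send_message("/tone/freq", freq)

    def on_peer_freq(address, freq):
        """OSC arriving from the peer (a blade): display only, send nothing back."""
        global shown_freq
        shown_freq = freq                # the equivalent of Max's "set 440"

    dispatcher = Dispatcher()
    dispatcher.map("/tone/freq", on_peer_freq)
    server = BlockingOSCUDPServer(("0.0.0.0", MY_PORT), dispatcher)
    # server.serve_forever()             # each laptop listens here and calls turn_dial() locally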

What was fruitful about this idea was that it actively courted positive feedback loops and forced us to figure out and describe ways to avoid that. More on that in the next post.

Coding Tuesdays

This will be a record of notes occasioned by Coding Tuesday, a research group/workshop that takes place in Sweetspot, the Pasig City recording studio of the composer Malek Lopez. I am Tad Ermitano, a Filipino media artist (with a focus on interactivity, video and sound). The workshop explores performance over a computer network, and was formed soon after we performed Music of The Lost Cities with the American composer Chris Brown and his wife, the muralist Johanna Poethig in The Living Room in February of this year. Music of The Lost Cities was an audiovisual performance that was the result of several weeks of exploratory emails and two intensive days of coding in Chris and Johanna's hotel room, an interaction I recorded here in another blog.

Malek and I decided to continue exploring Open Sound Control and the performance possibilities of sharing data between computers, an area Chris had explored both as a member of the San Francisco networked music group The Hub and on his own. However, I should note here that we do not deliberately or systematically explore or recapitulate ideas or experiments by either Chris or The Hub. The workshop is meant to explore what networking has to offer in relation to what we are already doing and are interested in doing.

We've recently been joined by Tengal of SABAW media arts kitchen, which works out well, as Malek and I had more or less seen the Gangan Ensemble (a sound performance laboratory series I co-founded with Tengal) as the place where we would show and implement whatever we came up with.