


With two Vive trackers and four speakers, one or two users can control the placement of audio across an “azimuth.”
Demo:
Here’s some footage of people’s first experience with the system. After experimenting to figure out how it works, people start playing around.
System:
Vive > TouchDesigner OpenVR node > TD processing & calibration > OSC to Ableton > 2 Ableton Groups > Envelop.
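The OSC leg is what ties TD to Ableton. For a sense of the kind of messages crossing the wire, here’s a minimal python-osc sketch; the port and address patterns are placeholders, not the ones in my patch (inside TouchDesigner the send itself would typically be an OSC Out CHOP).

    # a minimal python-osc sketch of the OSC leg of the pipeline; the port
    # and address patterns here are placeholders, not the ones in the patch
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # machine running Ableton

    def send_tracker(tracker_id, azimuth, dance_amount):
        # azimuth in degrees from the calibrated center, dance amount in 0..1
        client.send_message(f"/tracker/{tracker_id}/azimuth", azimuth)
        client.send_message(f"/tracker/{tracker_id}/dance", dance_amount)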
Music: Do it Anyway – Locked Groove (broken into 4 stems, using NI Stems + Audacity)
Reactions:
Next Steps:
A year ago I had the pleasure of seeing a new Robert Irwin work installed at Sprüth Magers gallery in Los Angeles. The work is all about light but involves no electricity. The gallery had its interior walls removed, with a layered cube of scrim in the center.
The center cube has points at which the scrim has been spray painted. It composes the scene, offering a frame as you stare into its abyss.
Same goes for the windows surrounding the gallery, where little squares of tinting have been applied.
As you walk around the interior structure, anyone or anything on the opposite end becomes obscured. At various angles, the structure’s layered walls create levels of transparency and reveal the form of the space around you. It’s a strange experience, difficult to put into words, but perhaps demonstrated above by the darkening of the black squares with each subsequent layer.
In terms of focus, the squares and the structure help guide your eye to the effect of the scrim.
With the openness of the space and the polished concrete floors, light pours in, making you acutely aware of the time of day and the quality of light outside. As Irwin says of the Sprüth Magers installation: “What I just did, as far as I’m concerned, has to do with feelings. Theoretically, it makes you really aware of how [darn] beautiful the world is, how interesting it is.”
Through his use of scrim, tinting, and paint, he controls natural light to create an otherworldly effect, drawing out the natural beauty of this gallery space and the outside light. Things shift with the movement of the viewer, as layers of fabric change the quality of the light and the perception of the space.
For my Thesis, and Sound In Space assignment #2, I am testing how users react when their movements, or how they move with others, affect ‘voices’ in a composition. I’m trying to suss out the proper number of people per voice, and the number of voices, for my “interactive dance floor.”
Technical setup: TouchDesigner
I created a ‘track control’ component for TouchDesigner. The component has a number of options.
class trackController:
    """
    trackController description:
    controls one voice in Ableton
    in[0] - dance amount
    in[1] - isPresent
    """

    def __init__(self, ownerComp):
        # the component to which this extension is attached
        self.ownerComp = ownerComp
        # does this launch clips?
        self.ClipLauncher = False
        # does this control racks?
        self.RackController = False
        # TD Ableton track object, for volume and clips
        self.Track = op('abletonTrack1')
        # clip launcher, initialized as off
        self.clipControl = op('abletonClipSelector')
        self.clipControl.allowCooking = False
        # rack component for macro control, initialized as off
        self.rack = op('abletonRack1')
        self.rack.allowCooking = False
        # number of macros
        self.MacroCount = 2
        # store the track id that will be controlled
        self.TrackNumber = 0
        # set the default min/max volumes
        self.minVol = 0.1
        self.maxVol = 0.85
        # create a custom menu for controlling this trackController
        self.MakeMenu()

    def InitRack(self):
        '''
        select the appropriate rack device in Ableton;
        assumes it is the first device in the chain.
        called every time the Track parameter is changed
        '''
        self.rack.par.Device = 1
        self.rack.par.Chain1 = 0

    def MakeMenu(self):
        '''
        create custom parameters on the comp for controlling the track
        '''
        o = self.ownerComp
        # reset the custom pars
        o.destroyCustomPars()
        # add a page for the custom parameters
        p = o.appendCustomPage("Track Options")
        # make a par for which track to control
        m = p.appendMenu("Track")[0]
        # set its menu labels to match the Ableton track's labels
        m.menuSource = "op('./abletonTrack1').par.Track"
        # add a value for the number of macros
        c = p.appendInt("Macrocount")[0]  # [0] because append returns a tuple
        # set the range of the custom par 'Macrocount'
        c.normMax = 8
        c.normMin = 0
        # set the default value
        c.normVal = 1
        # c.clampMax = True
        # c.clampMin = True
        # add values for the presence min/max volumes
        v = p.appendFloat("Maxvol")[0]
        v.normVal = self.maxVol
        v = p.appendFloat("Minvol")[0]
        v.normVal = self.minVol
        # add toggles to enable/disable the clip launcher and rack controller
        p.appendToggle("Rackcontroller")
        p.appendToggle("Clipcontroller")

    def toggleComp(self, comp):
        '''
        turns a component on or off
        returns its latest state
        '''
        c = not comp.allowCooking
        comp.allowCooking = c
        return c

    def ToggleRackControl(self):
        '''
        toggles the rack component
        '''
        self.RackController = self.toggleComp(self.rack)

    def ToggleClipControl(self):
        '''
        toggles the clip component
        '''
        self.ClipLauncher = self.toggleComp(self.clipControl)
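In use, the toggles and the Track menu get wired back to the extension through a Parameter Execute DAT. That wiring isn’t shown above, so treat this as a sketch of what the callback might look like, assuming the extension is attached to the owner component:

    # Parameter Execute DAT callback (a sketch; assumes the trackController
    # extension above is attached to the owner component)
    def onValueChange(par, prev):
        comp = par.owner
        if par.name == 'Rackcontroller':
            comp.ext.trackController.ToggleRackControl()
        elif par.name == 'Clipcontroller':
            comp.ext.trackController.ToggleClipControl()
        elif par.name == 'Track':
            # re-select the rack device whenever the target track changes
            comp.ext.trackController.InitRack()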
Technical setup: Ableton
For a composition in Ableton to be controllable, it needs a few things.
Composing
Still in progress. I need two more voices, and I’m considering adding a drum voice that comes in when all voices are fully active, as a reward for moving collectively.
Demo
TL;DR: I struggled to get Magenta.js to do much. I created the most basic examples of using MusicVAE and the Web MIDI API, separately. It uses the “mel_4bar_small_q” model, which is “A 4-bar, 90-class onehot melody model. Less accurate, but smaller in size than full model. Quantized to 2-byte weights.”
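For reference, the same model family is available in the Python magenta package. Here’s a rough Python-side equivalent of the sampling step, not the Magenta.js code I used; the config name and checkpoint path are stand-ins.

    # sample melodies from a MusicVAE checkpoint and write them out as MIDI.
    # this is a Python stand-in for the Magenta.js demo: 'cat-mel_2bar_big'
    # is a published melody config in the magenta pip package, and the
    # checkpoint path is a placeholder.
    import note_seq
    from magenta.models.music_vae import configs
    from magenta.models.music_vae.trained_model import TrainedModel

    config = configs.CONFIG_MAP['cat-mel_2bar_big']
    model = TrainedModel(config, batch_size=4,
                         checkpoint_dir_or_path='/path/to/checkpoint.ckpt')

    # draw two melodies from the latent space and save them as MIDI files
    samples = model.sample(n=2, length=config.hparams.max_seq_len,
                           temperature=1.0)
    for i, sequence in enumerate(samples):
        note_seq.sequence_proto_to_midi_file(sequence, f'sample_{i}.mid')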
I created a simple holder for an LED matrix. I used some leftover cardboard to test out my design: I cut multiple pieces with a void for the strip and glued them together. I attached those to a base and taped my LEDs in to see how everything fit, which revealed multiple flaws in my design.
We do not know how global warming will affect us, but we know our lives will change because of it. Either we do nothing and adapt to the new world, or we change our lives to lessen the damage.
So we created Climate Twister – a game where two users play Twister, selecting options in response to future scenarios, and receive a prediction for the year 2100 at the end. The game uses 20 soft buttons connected to a laptop running a Processing sketch. It was inspired by the research of Jerome Whitington and his paper “Carbon as a Metric of the Human.”
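The real input handling lives in the Processing sketch, but the idea is simple enough to sketch in Python: assume a hypothetical Arduino streaming the 20 pad states over serial, with the game polling them each frame.

    # illustrative Python version of the input loop (the actual game is a
    # Processing sketch); assumes a hypothetical Arduino printing the 20 pad
    # states as a comma-separated line of 0s and 1s, e.g. "0,1,0,...,0"
    import serial  # pyserial

    PORT = '/dev/ttyACM0'  # placeholder port name

    with serial.Serial(PORT, 9600, timeout=1) as ser:
        while True:
            line = ser.readline().decode('ascii', errors='ignore').strip()
            if not line:
                continue
            states = [c == '1' for c in line.split(',')]
            if len(states) == 20:
                pressed = [i for i, down in enumerate(states) if down]
                print('pressed pads:', pressed)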
Here are some details of the build:
Once we had settled on this idea, we ran a number of user tests. We bought a Twister board, overlaid some ‘options,’ and invented content on the fly.
At a certain point we tried different methods for tracking users and displaying options, but they didn’t work out.
When it came time to fabricate, we ordered a vinyl banner to be our game board, and bought foam and plywood. The plywood was cut down to 4′ × 6′ (the size of our banner), and we made a stencil of the banner so we could lay things out on the plywood.
We got to work laying out the copper pads, cutting wires to the right lengths, gluing, and soldering. We also attached copper tape to our vinyl banner (which would complete the circuit when pressed).
We cut holes into our foam, at first making them too small and then enlarging them.
With everything together, we realized our wires were not going to be long enough. We bought electrician-style screw terminals and cut a channel in the board for them to fit, using them to extend the wires.
Aaand the fabrication was done! The rest of the process was spent debugging the code and designing the interface.
To be updated with video…
End shot first (it doesn’t work most of the time, but sometimes it will when you jiggle it).