Evolution Of Silence

2013, interactive sound installation

The work »Evolution Of Silence« portrays a virtual world populated by artificial creatures. Each creature produces a tone defined by its DNA and is visually represented by a ray of light inside a holographic projection. The population of these creatures evolves continuously under an evolutionary algorithm.
The virtual world provides an inaudible key as an environmental condition. The better a creature's tone fits the key, the louder it sounds, the stronger and broader its ray of light becomes, the more attractive it is to the other creatures and the longer it lives. It can therefore produce many offspring by mating with other creatures. Deviant tones, however, are quiet, have a very thin ray of light and die relatively quickly.
In this way, the population evolves from a disharmonic, chaotic soundscape towards a harmonic structure that fits the key provided by the environment ever more closely.
After two to three minutes the environmental key changes randomly, and the soundscape immediately becomes chaotic again. The population then adapts to the new environment over the course of several generations.
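
The selection idea can be pictured with a minimal Processing-style sketch. This is an illustration, not the original source code: the class and field names (Creature, energy, amp, rayWidth) and the concrete numbers are assumptions. Fitter creatures keep more energy, so they live longer and, as more attractive partners, pass their DNA on more often.

```java
// Minimal sketch of the selection idea, not the original implementation:
// the better a creature's tone fits the current (inaudible) key, the louder
// and brighter it is and the longer it lives. All names are assumptions.
class Creature {
  float pitch;          // tone defined by the creature's DNA, in semitones
  float energy   = 1;   // decreases over time; the creature dies at 0
  float amp      = 0;   // loudness of its tone
  float rayWidth = 1;   // thickness of its ray of light

  // 0..1: how closely the tone matches the nearest pitch of the key's scale
  float fitness(float[] keyScale) {
    float best = Float.MAX_VALUE;
    for (float k : keyScale) {
      float d = abs((pitch - k) % 12);
      best = min(best, min(d, 12 - d));   // distance on the pitch-class circle
    }
    return constrain(1 - best / 6.0f, 0, 1);
  }

  void step(float[] keyScale) {
    float f = fitness(keyScale);
    amp      = map(f, 0, 1, 0.05f, 1.0f); // harmonic tones sound loud
    rayWidth = map(f, 0, 1, 1, 12);       // ...and get a strong, broad ray
    energy  -= (1 - f) * 0.01f;           // deviant tones die quickly
  }
}
```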

The visitor can influence the evolutionary process by touching the rays of light belonging to the creatures and thus “feeding” them. When weak, quiet creatures with deviant, disharmonic tones, a thin ray of light and imminent death are touched, or “fed”, they immediately become strong and loud. This makes them much more attractive to the other creatures, so they too can produce many offspring. The harmonisation driven by the evolutionary algorithm thus slows down and is complemented by disharmonies.
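
In the same sketch terms, “feeding” can be thought of as overriding the fitness-derived values of a touched creature for a while; again an illustrative assumption building on the hypothetical Creature class above, not the actual implementation.

```java
// Illustrative only: a touched ("fed") creature is boosted regardless of how
// well its tone fits the key, so disharmonic tones survive and reproduce.
void feed(Creature c) {
  c.energy   = 1;   // postpone its death
  c.amp      = 1;   // it immediately sounds loud
  c.rayWidth = 12;  // and gets a strong, broad ray of light
}
```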

»Evolution Of Silence« is the result of a series of experiments where I applied evolutionary algorithms to the emergent generation of sounds.
Following John Cage, »Silence« is interpreted here as the occurrence of all the sounds that arise inside an environment through the very existence of its beings and elements – without any intention associated with those sounds: the artificial beings of the virtual world do not sound to fulfill a musical purpose, they sound simply because they exist.
Only the environment, with its progressing evolutionary process, judges every tone and sets it into a harmonic context. This context attributes an intention to every tone, which is, however, only valid within that context.

Technical Information

All software is written in Java, based on the Processing library. The sound is generated with SuperCollider, which is controlled by the Java application through the great SuperCollider client for Processing by Daniel Jones. The tracking is realized with a Kinect and Daniel Shiffman’s OpenKinect library.

List of all libraries used:
Toxiclibs by Karsten “toxi” Schmidt, DNA by André Sier, OpenKinect by Daniel Shiffman, processing-sc by Daniel Jones, oscP5 and controlP5 by Andreas Schlegel, Ani by Benedikt Groß
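
As a rough illustration of the sound path: with the processing-sc client, the Processing sketch creates and controls synth nodes on a running SuperCollider server. A minimal sketch, assuming scsynth is running and a SynthDef named "sine" with "freq" and "amp" arguments has already been loaded on it, looks roughly like this:

```java
import supercollider.*; // processing-sc by Daniel Jones
import oscP5.*;         // used internally by the client

Synth synth;

void setup() {
  // Assumes a running scsynth with a SynthDef "sine" (freq, amp) loaded.
  synth = new Synth("sine");
  synth.set("freq", 440);
  synth.set("amp", 0.3);
  synth.create();        // instantiate the synth node on the server
}

void draw() {
  // e.g. synth.set("freq", ...); to let a creature's tone change over time
}

void stop() {
  synth.free();          // release the node when the sketch quits
  super.stop();
}
```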

Tunnelstudie 1

2013, video, aspect ratio 1:1, 1:36 min

This work evolved from a vvvvorkshop by David Gann at HS Augsburg in 2012. When I found time to work on it again, I added an export mechanism and some parameters that I could change during the export to create an animation.

Myopic Bounds

2012, dance theatre

The dance theatre »Myopic Bounds« deals with the corset of conventions, emotions and socialisation that surrounds us all, and asks questions about the need for freedom, limitation, bounds and isolation with regard to this corset. How much are we restricted by our outer and inner worlds? How can we develop despite, or because of, the different bounds we move within? And where might the imagined yet fortress-like space of emotion that surrounds us lead us?

During the performance, the dancers show mutual approach, the effort of overcoming personal boundaries, and the merging and collapse of community to the point of solitude. The dance refers to models of human behaviour and to repetitive acts in everyday situations.

The stage design consists solely of diverse light situations. The ‘spaces’ described are projected by a video projector whose light is scattered by stage fog. Different spatial situations are built up throughout the piece and changed in real time.

The projections were realized with vvvv in conjunction with TouchOSC.
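
The projections themselves are built in vvvv, which receives the TouchOSC messages directly. Purely as an illustration of the same control idea in Processing terms (with an assumed port and the default TouchOSC fader address), live parameters can be picked up with oscP5 like this:

```java
import oscP5.*;

OscP5 osc;
float faderValue = 0;  // a projection parameter driven live from TouchOSC

void setup() {
  // TouchOSC sends UDP to the port configured in the app; 8000 is assumed here.
  osc = new OscP5(this, 8000);
}

void oscEvent(OscMessage msg) {
  // "/1/fader1" is the default address of the first fader in a TouchOSC layout.
  if (msg.checkAddrPattern("/1/fader1")) {
    faderValue = msg.get(0).floatValue();  // 0..1
  }
}

void draw() {
  // ...use faderValue to morph the projected space in real time...
}
```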

Choreography: Dustin Klein
Dancers: Dustin Klein, Emma Barrowman
Music, Dramaturgy: Simon Karlstetter
Scenery: Matthias Lohscheidt
Costumes: Louise Flanagan
Light: Peter Dürrschmidt
Kindly supported by: Limelight Veranstaltungstechnik, Lab Binaer

Pyramid Cascades

2012, interactive installation, 42.5 × 110 cm

»Pyramid Cascades« is the result of video-mapping experiments during a semester at ECAL, Lausanne. 3D objects made of polystyrene and paper, mounted on servo motors, form the projection surface for a projector. Virtual 2D pyramids are mapped onto these real objects. The user interacts with the installation via a trackpad. Moving a finger across the trackpad from left to right, or vice versa, rotates the real and the virtual pyramids simultaneously, creating the illusion of glowing pyramids. Clicking on the trackpad makes small projected circles fall from the top of the surface and collide with the pyramids. Every collision produces a synthetic bell-like sound, depending on the side and size of the surface that is hit. The circle cascade works like a sequencer: every dropped circle is repeated after two bars. In short, the installation lets you create melodies in a playful way.
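
The sequencer part can be pictured with a short Processing-style sketch. The installation itself is an AS3/AIR application; all names, the tempo and the bar length here are assumptions. Every click stores its position inside a two-bar loop, and the circle is dropped again each time the loop passes that position.

```java
// Sketch of the two-bar sequencer idea only; not the original AS3 code.
class Drop {
  float offset;        // position inside the two-bar loop, in beats
  int   lastLoop = -1; // loop iteration in which it was last replayed
}

ArrayList<Drop> drops = new ArrayList<Drop>();
float bpm = 120, loopBeats = 8;  // two bars of 4/4, tempo assumed

float beats() {
  return millis() / 1000.0f * bpm / 60.0f;
}

void mousePressed() {
  Drop d = new Drop();
  d.offset   = beats() % loopBeats;
  d.lastLoop = (int) (beats() / loopBeats);
  drops.add(d);
  // spawnCircle();  // hypothetical: the click drops a circle onto the pyramids
}

void draw() {
  int   loop = (int) (beats() / loopBeats);
  float pos  = beats() % loopBeats;
  for (Drop d : drops) {
    if (loop != d.lastLoop && pos >= d.offset) {
      d.lastLoop = loop;
      // spawnCircle();  // replay the drop two bars later
    }
  }
}
```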

The piece was realized as an Adobe AIR application that communicates over a serial connection with an Arduino board to control the servo motors. The serial communication is handled mainly with the as3glue library. For the 2D physics, the AS3 port of Erin Catto’s powerful Box2D physics engine was used. The sound is generated synthetically inside the AIR application with the Tonfall AS3 framework by André Michelle. Thanks to all these people for sharing their great work!
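
For the motor side, the idea is a simple serial protocol between the application and the Arduino. The original uses AS3 and as3glue; sketched instead with Processing's standard serial library and an assumed one-byte angle protocol, the same idea looks roughly like this:

```java
import processing.serial.*;

Serial arduino;

void setup() {
  size(400, 100);
  // Assumes the Arduino is the first listed port and runs a sketch that reads
  // one byte per message and maps it to a servo angle of 0..180 degrees.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // Rotate the physical pyramids in sync with the projected ones,
  // here driven by the horizontal position on the trackpad / mouse.
  int angle = (int) map(mouseX, 0, width, 0, 180);
  arduino.write(constrain(angle, 0, 180));  // one-byte protocol, an assumption
}
```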
