Display Mobile Installation
In this work the interaction follows the evolutionary principle of mutation of the graphic and acoustic structure and selection from six available variants. On the visual level, specific formal patterns extracted from the natural world are combined arbitrarily, generating creations that are both familiar and yet never seen before. The momentary state of the graphic objects controls the sound level in "SonoMorphis". The parameters of the graphics must be interpreted from an acoustical viewpoint in such a way that a musical structure emerges from them. In this way automatic compositions arise whose results are functions of their components and vary in the details of their contours, complexity, and behavior. The overlapping of visual and sound levels produces an open structure that can be continually and endlessly reconfigured by each viewer.
The installation is accessible through two coupled access points. The first exists in real space: visitors interact with a virtual organism projected onto a screen via a special interface box. The second access point uses the World Wide Web as its user interface.
In both systems, users evolve a three-dimensional organism created using genetic algorithms. The organism is defined by a genome, a set of components, which is successively mutated by the users. From six randomly generated mutations users select one, which in the succeeding steps becomes the starting point for new mutations. In this way users trace a thread through a space of approximately 10^80 possible forms.
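The mutate-and-select cycle described above can be sketched as follows. This is a minimal illustration, not the installation's actual code: the genome is assumed to be a list of numeric components, and the visitor's choice is simulated by a scoring function.

```python
import random

GENOME_LENGTH = 20  # assumed size; the real genome encodes graphic components
VARIANTS = 6        # six mutations are offered per generation, as in the text

def mutate(genome, rate=0.2, scale=0.5):
    """Return a copy of the genome with some components randomly perturbed."""
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def evolve(genome, generations, choose):
    """Repeatedly present six mutations and continue from the chosen one."""
    for _ in range(generations):
        variants = [mutate(genome) for _ in range(VARIANTS)]
        genome = choose(variants)
    return genome

# Usage: a stand-in "visitor" who always picks the variant with the largest sum.
random.seed(1)
result = evolve([0.0] * GENOME_LENGTH, 10, choose=lambda vs: max(vs, key=sum))
```

With six choices per step, the path quickly branches into the enormous space of forms the text mentions; the user never searches that space exhaustively but threads one trajectory through it.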
In the real space, users additionally change the shape and dynamic behavior of the life-like organism via an interface box. Both systems are coupled and operate on the same data set constituting the genome. Actions in the web space affect the real space and vice versa: when a change happens on the web, the organism in the real space slowly morphs towards the web selection, while a change in real space directly affects the next web action.
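The coupling of the two access points can be sketched as a shared state with two update paths, as below. All names here are hypothetical: a web selection sets a target that the real-space projection approaches gradually, while a change made in real space takes effect immediately.

```python
class SharedGenome:
    """One genome data set driving both the web and the real-space views."""

    def __init__(self, values):
        self.current = list(values)  # what the real-space projection shows
        self.target = list(values)   # where a web selection is steering it

    def web_select(self, values):
        """A web user picked a mutation: the projection morphs toward it."""
        self.target = list(values)

    def real_space_set(self, values):
        """A visitor changed the organism directly: takes effect at once."""
        self.current = list(values)
        self.target = list(values)

    def step(self, rate=0.1):
        """Advance the slow morph of the projection toward the web target."""
        self.current = [c + rate * (t - c)
                        for c, t in zip(self.current, self.target)]
```

The asymmetry mirrors the text: web actions arrive as a gradual morph, real-space actions as an immediate change to the shared data set.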
Both sound and projection relate in equal parts to the same underlying abstract structure, which they make palpable to the user. The sound acoustically represents selected properties of the genomes, i.e. their structure, position, and behavior, in a non-arbitrary way. The easiest way to think of this representation is the metaphor of a musical instrument: a set of rules with associated variables by which to generate sound, including the possibility of controlling these variables in real time according to the structures of the underlying genomes.
Since one of the installation's aesthetic goals is that the generated object make a bodily impression on the user, a sound synthesis technique was needed that can plausibly render a visible object's genuine sound through all its user-induced alterations in shape and space, yet remain abstract enough where needed not to duplicate a real-world artefact. The technique of choice is known as physical modelling; it derives the emerging sound from the physical properties of an assumed object, i.e. its shape, material, excitation mode, etc.
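One of the simplest examples of physical modelling is the Karplus-Strong plucked string, sketched below: the delay-line length stands in for the string's length (pitch) and the damping factor for its material. This is only an illustrative stand-in; the installation's actual physical models are not described in this text.

```python
import random

def karplus_strong(frequency, duration, sample_rate=44100, damping=0.996):
    """Plucked-string synthesis: a noise burst circulating through a
    lowpass-filtered delay line whose length sets the pitch."""
    period = int(sample_rate / frequency)              # "length" of the string
    buf = [random.uniform(-1, 1) for _ in range(period)]  # the pluck: noise
    out = []
    for i in range(int(duration * sample_rate)):
        s = buf[i % period]
        # Average adjacent samples (lowpass) and attenuate (material damping).
        buf[i % period] = damping * 0.5 * (s + buf[(i + 1) % period])
        out.append(s)
    return out

samples = karplus_strong(220.0, 0.5)  # half a second of a 220 Hz "string"
```

Changing `damping` alters how quickly the tone decays, a crude analogue of changing the assumed object's material, while `frequency` corresponds to its size.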
Based on an associative relationship to the genomes' textures, each acoustic representation was first assigned a set of material properties, which determine its basic timbre. Second, the genome's shape is taken into account, controlling the representation's basic modes of vibration and its reaction to parameter-induced deformations. Third, the current spatial positions of the individual graphic objects are mapped into the sound space, rendering their horizontal movement as well as their proximity to the user.
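The three-stage mapping above can be sketched as a small modal-synthesis routine. All names, material values, and formulas below are assumptions for illustration, not the installation's actual code: material selects damping (timbre), shape scales the modal frequencies, and horizontal position becomes stereo panning.

```python
import math

MATERIAL_DAMPING = {"wood": 8.0, "metal": 1.5, "glass": 3.0}  # assumed values

def modal_tone(material, shape_scale, position_x, duration=0.25,
               sample_rate=8000, base_freq=220.0, modes=4):
    """Sum of damped sinusoids; returns (left, right) sample lists.
    material   -> damping per mode (basic timbre)
    shape_scale-> stretches the modal frequencies (object's shape)
    position_x -> pan in [-1, 1] (object's position in the sound space)"""
    damping = MATERIAL_DAMPING[material]
    pan = (position_x + 1.0) / 2.0        # map x in [-1, 1] to pan in [0, 1]
    left, right = [], []
    for i in range(int(duration * sample_rate)):
        t = i / sample_rate
        s = sum(math.exp(-damping * (m + 1) * t)
                * math.sin(2 * math.pi * base_freq * (m + 1) * shape_scale * t)
                for m in range(modes)) / modes
        left.append((1.0 - pan) * s)
        right.append(pan * s)
    return left, right

left, right = modal_tone("metal", shape_scale=1.0, position_x=0.5)
```

Each stage of the mapping is independent, so a change in any one genome property (texture, shape, position) alters exactly one aspect of the sound, keeping the representation non-arbitrary.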
It is possible, and intended, to handle the installation as flexibly as a musical instrument consisting of an image component and a sonic component. Observation of the system's behavior during exhibitions has shown its ability to respond to users' varying approaches, playing styles, and temperaments in a differentiated and recognizable way.