4D.Engine


The 4D.Engine forms the core of a 4DSOUND system configuration. The 4D.Engine is a stand-alone processor that can generate up to twenty-four sound sources in real time. Audio and data streams can come from a variety of external devices and instruments, and are collectively processed in the 4D.Engine into spatial sound sources that can move in a virtually unlimited spatial continuum. The 4D.Engine runs a dedicated multi-channel distribution protocol called 4d.pan, which translates dimensional sound sources, and the interactions between objects, into actual loudspeaker outputs.

4DSOUND is an object-based sound system. In practice, users of the system do not deal directly with the distribution of sound to multiple speakers, but are instead encouraged to think about space in terms of physical dimensions and distances, unlimited by the physical constraints of the room they are in and independent of the positions and number of speakers used. During a working session in 4DSOUND, the user interacts with the 4D.Engine only passively, to monitor audio levels, data flow and performance.
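
To make the object-based idea concrete, here is a minimal sketch in Python of how per-speaker gains could be derived from a virtual source position. The inverse-distance gain law and the speaker layout are assumptions for illustration only; 4DSOUND's actual 4d.pan processing is not documented here.

    # Minimal object-based panning sketch: the source is described by a
    # position, and per-speaker gains are derived from that position.
    import math

    def speaker_gains(source, speakers, rolloff=1.0):
        """Return one gain per speaker for a virtual source position.

        source:   (x, y, z) position of the sound source in metres
        speakers: list of (x, y, z) speaker positions
        rolloff:  assumed distance attenuation exponent
        """
        # Inverse-distance weights: closer speakers get more signal.
        weights = [1.0 / (1.0 + math.dist(source, sp)) ** rolloff
                   for sp in speakers]
        # Normalise so total power is constant regardless of how many
        # speakers the room happens to have.
        norm = math.sqrt(sum(w * w for w in weights))
        return [w / norm for w in weights]

    # The same source coordinates work for four speakers or forty:
    square = [(-2, -2, 0), (2, -2, 0), (2, 2, 0), (-2, 2, 0)]
    print(speaker_gains((1.0, 0.5, 0.0), square))

Because the gains are computed from positions alone, the same source coordinates carry over unchanged to any number or arrangement of speakers.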

4D.Animator


The 4D.Animator is a software application that animates and visualises the sound sources in space. It contains modules for generative movements, scripts for movement trajectories, Lissajous figures and complex shapes for multiple-point delay patterns. These processes are all generated using visual algorithms; the resulting spatial sound data is then communicated to the 4D.Engine.

The 4D.Animator contains a dedicated editing and drawing function for composing paths - trajectories of the sound sources moving in space. Paths can be edited in real time, including changes in speed and dimensionality of the sound source along the course of a trajectory. Paths can then be exported as scripts and applied inside a composition or live performance with 4DSOUND. Editing paths and generating other spatial processes in the 4D.Animator can be done both online and offline from the 4DSOUND system.
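
As an illustration of what such a generated trajectory amounts to, the sketch below computes a Lissajous path for one source and streams it as OSC position updates. The OSC address, port and update rate are hypothetical placeholders rather than 4DSOUND's documented namespace, and it assumes the python-osc package.

    # A sketch of streaming a Lissajous trajectory for one source as
    # OSC, in the spirit of the 4D.Animator's generative movements.
    # The address "/source/1/xyz" and port 9000 are assumptions.
    import math
    import time

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed engine address

    def lissajous(t, a=3, b=2, scale=4.0, phase=math.pi / 2):
        """Classic Lissajous figure traced in the horizontal plane."""
        x = scale * math.sin(a * t + phase)
        y = scale * math.sin(b * t)
        return x, y, 1.5                          # fixed height in metres

    t = 0.0
    while t < 10.0:                               # run the path for ten seconds
        x, y, z = lissajous(t)
        client.send_message("/source/1/xyz", [x, y, z])
        time.sleep(0.02)                          # ~50 position updates per second
        t += 0.02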


The 4D.Animator is used on the one hand as a physics engine for modelling complex spatial processes, and on the other hand as a visual monitor of what happens with sounds in space while you are mixing in 4DSOUND. The 4D.Animator shows the dimensionality of sound sources relative to the size of the speaker configuration you are working with. It shows the real-time transformations applied to the sound sources, such as the shape of spatial movements, changes in the dimensionality of the sources and the architecture of virtual walls that reflect and dampen the sound.

Because the 4D.Animator provides direct, simplified real-time visual feedback on what happens with sounds in space, it supports an intuitive and accessible working process and provides important feedback during a live performance on the 4DSOUND system.


Custom Controllers


iPad & Touchscreen Control

4DSOUND has developed an extensive library of template patches for the iPad using Lemur by Liine and its extended programming language Canvas. Besides the iPad, a growing number of controls have been developed specifically for a large 23” touchscreen using Lemur/Canvas for Android. The integrated OSC-scripting options inside Lemur make it a versatile environment for customising 4DSOUND controllers, tailoring control pages specifically for each performance. Lemur’s integrated physics engine offers an interesting extension for generating spatial movements from within the iPad or touchscreen itself, allowing control over the speed, friction and tension of moving objects, slides and strings.
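
As a suggestion of the kind of behaviour such a physics engine produces, the sketch below (in Python rather than Lemur's own scripting language) steps a point mass pulled toward a target by a damped spring; its position could drive any spatial parameter. The tension and friction constants are illustrative, not Lemur's.

    # A damped spring: each Euler integration step pulls pos toward
    # target, with tension and friction shaping the motion.
    def spring_step(pos, vel, target, tension=8.0, friction=2.0, dt=0.01):
        accel = tension * (target - pos) - friction * vel
        vel += accel * dt
        pos += vel * dt
        return pos, vel

    pos, vel = 0.0, 0.0
    for _ in range(500):            # the object glides toward the target
        pos, vel = spring_step(pos, vel, target=5.0)
    print(round(pos, 3))            # settles near 5.0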

Vladislav Delay working with an iPad as part of a complex synthesis chain, ADE 2014. Image: Georg Schroll


The iPad control for 4DSOUND was initially developed as a remote control for a 4D.Live set, including pages to trigger scenes and clips within 4D.Live and to control selected audio and spatial effects. This enables the performer to move freely through the space while performing on the 4DSOUND system. In practice, interfaces developed with Lemur have been applied in a great variety of ways: from a few hands-on live spatial controllers next to a 4D.Live set, up to completely self-sufficient control systems independent from other software.

The standard 4DSOUND template includes the following controls (a sketch of how they might map to OSC follows the list):

  • control of global position
  • ploding and rotation of the spatial field
  • spatial delay effect
  • live sculpting of spatial acoustics
  • dedicated modules for controlling swarming behaviour of multiple sources
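
One possible way these template controls could be laid out as OSC messages from Lemur is sketched below. Every address and value here is a hypothetical example, not 4DSOUND's actual namespace, and it assumes the python-osc package.

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed engine address

    # Global position, ploding (expansion/contraction) and rotation
    # of the whole spatial field.
    client.send_message("/field/position", [0.0, 2.0, 0.0])
    client.send_message("/field/plode", 0.5)
    client.send_message("/field/rotation", 45.0)       # degrees

    # Spatial delay effect and live sculpting of the virtual acoustics.
    client.send_message("/source/1/delay/time", 0.35)  # seconds
    client.send_message("/acoustics/wall/1/damping", 0.6)

    # Swarm module: spread of a group of sources around a centre point.
    client.send_message("/swarm/1/spread", 1.8)        # metres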

Many variations have been developed from this initial template. 4DSOUND has also used Lemur/Canvas to prototype custom hardware controllers, such as the Space Control 01 and the hybrid spatial DJ-production tool TiS 0.9.


Techno Is Space (TiS) Control 0.9

The TiS Control 0.9 is a prototype 23” touchscreen interface designed by 4DSOUND to give artists more direct access to the possibilities of space in their sound production and performance. Designed as a stand-alone spatial sound controller for performing on the 4DSOUND system, it caters to the immediacy of a DJ mixer while striving to integrate extensive spatial control options within a compact and accessible format.

The TiS Control 0.9 contains eight channels that are used to mix live on the 4DSOUND system. The controller was designed for 4DSOUND: Techno Is Space, our series of experimental club performances challenging artists to rethink the way they design and mix sounds from a spatial perspective. Instead of controlling the volume of a channel, artists control the distance of a sound source; the EQ balance of a channel is shaped by changing the shape and dimensions of the sound source.
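
A minimal sketch of this inversion, assuming a linear fader-to-distance mapping and an inverse-distance loudness model (both invented for illustration; the actual TiS mappings are not documented here):

    # Fader position sets distance, not gain; loudness then falls out
    # of the spatial model instead of being set directly.
    def fader_to_distance(fader, near=0.5, far=25.0):
        """Map a 0..1 fader value to a source distance in metres."""
        fader = max(0.0, min(1.0, fader))
        return far - fader * (far - near)

    def perceived_gain(distance):
        return 1.0 / (1.0 + distance)     # assumed inverse-distance law

    for f in (0.0, 0.5, 1.0):
        d = fader_to_distance(f)
        print(f"fader={f:.1f} -> {d:5.1f} m, gain={perceived_gain(d):.2f}")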


The controller also streamlines a huge range of control parameters in the 4DSOUND environment into four effect knobs per channel/sound source, enabling artists to play live with key spatial attributes during their sets, such as reverbs, spatial delays, spatial movement paths and on-the-fly looping of movements.

To enable artists to construct the framework of a set in less time and to prepare more complex processes for recall during a live set, the touchscreen controller has an integrated memory that allows source and effect settings to be saved and instantly recalled during performance, providing a hybrid spatial DJ-production tool.
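
A rough sketch of such a preset memory, assuming presets are plain snapshots of source and effect settings serialised to a file; the keys and file format are invented for illustration:

    import json

    presets = {}

    def save_preset(name, settings):
        presets[name] = dict(settings)        # snapshot the current state

    def recall_preset(name):
        return presets.get(name, {})

    save_preset("intro", {"source/1/distance": 12.0, "fx/1/reverb": 0.4})
    save_preset("drop",  {"source/1/distance": 0.8,  "fx/1/reverb": 0.1})

    # Persist the bank so a prepared set can be reloaded before a show.
    with open("tis_presets.json", "w") as f:
        json.dump(presets, f, indent=2)

    print(recall_preset("drop"))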


Space Control 01

The dedicated hardware controller Space Control 01 extends the chain of controls and can be used in sequence with, or independently from, a 4D.Live set and iPad control. The device is tailored for manual performance of eight sound sources in space. The joysticks are used for the horizontal positioning, and a side wheel for the vertical positioning, of each source in space. The wheel below allows the physical dimensions of each source to be changed: from an infinitely small point of sound to a large block of sound that fills the entire room. Each source is equipped with an additional knob that sets the amount of reverb from the virtual walls.

The Space Control 01 communicates over OSC, and the addressing, values and scaling of its controls can be fully customised using 4DSOUND’s OSC.Mapping options. Space Control 01 was commissioned by Stimming and specifically designed for his 4DSOUND live show. The hardware was built by Studio360 in Austria.
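
The sketch below illustrates the kind of rescaling and readdressing such a mapping involves: a raw joystick reading is mapped into a room coordinate range and sent on as an OSC message. The addresses, value ranges and engine port are assumptions, and it uses the python-osc package.

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed engine address

    def rescale(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
        t = (value - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)

    def on_joystick(source, raw_x, raw_y):
        # Raw 10-bit joystick values become room coordinates in metres.
        x = rescale(raw_x, 0, 1023, -8.0, 8.0)
        y = rescale(raw_y, 0, 1023, -8.0, 8.0)
        client.send_message(f"/source/{source}/xy", [x, y])

    on_joystick(1, 512, 768)   # centre on x, a few metres forward on y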


Ableton & Max For Live


A set of Max for Live devices provides integral control of the 4DSOUND system inside Ableton Live. Inside a 4D.Live set, every audio channel can be linked with a corresponding MIDI channel that communicates spatial sound data (OSC) to the 4D.Engine. All parameters of the system can be automated on a timeline using Ableton Live’s arrangement view. This mode of working integrates extensive spatial sound control into the established workflow of editing and mixing a music composition or soundtrack.

Ableton’s session view allows further options for spatial sequencing by utilising MIDI clips as spatial sound scripts. Spatial position or movement data can be stored and recalled by writing spatial coordinates or path names into the clip. The clip’s start, stop and loop options can be used to synchronise the speed and scale of spatial movements to fit the timing of audio clips. The clips also allow any parameter of a sound source to be automated, linked or unlinked from the clip’s time selection.
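
As a sketch of the clip-as-script idea, the following treats a list of (beat, note) pairs as a MIDI clip whose notes encode position keyframes, with the clip's loop length setting the period of the movement. The note-to-coordinate encoding is an assumption for illustration only.

    CLIP = [  # (beat, note) pairs, as they might be drawn into a clip
        (0.0, 60), (1.0, 64), (2.0, 67), (3.0, 64),
    ]
    LOOP_BEATS = 4.0

    def note_to_position(note, scale=0.25):
        """Map a MIDI pitch to an x coordinate around the room centre."""
        return (note - 60) * scale            # note 60 sits at the centre

    def position_at(beat):
        """Step-interpolated x position for any beat in the timeline."""
        beat = beat % LOOP_BEATS              # looping the clip loops the path
        x = 0.0
        for b, note in CLIP:
            if b <= beat:
                x = note_to_position(note)
        return x

    for b in (0.0, 1.5, 3.0, 4.5):            # beat 4.5 wraps to beat 0.5
        print(b, position_at(b))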

The options for working in Ableton’s session view are further extended with a dedicated system for storing and recalling spatial sound settings from the clip. This allows users to create their own libraries of pre-composed spatial attributes, movements and processes, and to recall these settings instantly at any given moment and place within the 4D.Live set.

Interactive Sensory Systems


Motion Capture & Gesture Control

A variety of applications have further extended the physical control of the 4DSOUND system with motion capture technology - mapping controller data describing the way the body moves in space to OSC messages for the spatial control of sound.

Multimedia artist Rumex created integrated spatial sound control from the movements of a dancer in her work ‘Sonic Lure’. She used Kinect video motion capture to translate gestural qualities of the body into sounds moving in space, and RiOT sensors capturing the orientation of the limbs and body to control rotations in the sound field. Composers and live-electronics performers Mathieu Chamagne and Hervé Birolini played on 4DSOUND using two Leap Motion sensors and a motion-sensitive Wii controller to perform the spatial behaviour of twenty-four sound sources with their hands and fingers.


Real Time Position Tracking

4DSOUND has developed a real-time position tracking system based on ultra-wideband (UWB) radio with Ubisense sensors and tags. The sensors map the physical space and track up to five tags that can be carried by people moving in the space. 4DSOUND maps the real-time position data of the moving bodies to spatial position information that can then be interpreted inside the 4DSOUND system to move or transform spatial sound images.
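
A sketch of the forwarding step, assuming a placeholder function standing in for the Ubisense feed (real tag data arrives through the vendor's own SDK) and a hypothetical OSC address per tag:

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed engine address

    def read_tag_positions():
        """Placeholder for the UWB feed: tag id -> (x, y, z) in metres."""
        return {1: (2.1, -0.4, 1.2), 2: (-1.3, 3.0, 1.1)}

    def forward_positions():
        # Each tracked person drives one spatial sound position.
        for tag_id, (x, y, z) in read_tag_positions().items():
            client.send_message(f"/tag/{tag_id}/xyz", [x, y, z])

    forward_positions()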


The position tracking system has been used in various projects created in 4DSOUND. In the opera Nikola, singers carry their voices with them as a sonic aura as they move through the space, providing a natural-sounding and localisable amplification of their voices. Performance artist and composer Marco Donnarumma used the real-time position tracking in his 4DSOUND work ‘0:Infinity’ to capture three visitors in a complex interwoven system of sound processing based on their spatial positions, their distance in relation to each other and behavioural derivatives, such as how much attraction visitors developed to each other over time. The position tracking system has also been used to create an interactive spatial sound remake of the well-known video game Pong.


Bio-Physical Feedback Systems

4DSOUND is focused on investigating the relationship between the external environment and our internal physical and physiological space, and how our ability to listen opens up new levels of awareness about those spaces. As a result, a variety of new interfaces and control applications have been developed that establish a direct connection between the inaudible states and transformations of the inner-body and the audible space around us.

The Xth Sense is a bio-wearable instrument that captures inaudible sounds from inside the body, such as the heartbeat, muscle tension and blood flow. The pitch, texture and rhythm of those sounds can then be performed through free movement in space. 4DSOUND combined with the Xth Sense creates a musical instrument that allows one to sonically explore the spatial dimension of one’s inner body, by moving inside it and interacting with it. The Xth Sense allows twelve parameters derived from the inner body to be mapped in a variety of ways to attributes of the sounding space. 4DSOUND and XTH are currently developing a template for integrated spatial sound control with the Xth Sense.
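
The sketch below suggests what such a mapping layer could look like: normalised bio-signal features routed to spatial parameters over OSC. The feature names, target addresses and value ranges are all assumptions, not the actual Xth Sense template.

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed engine address

    # Each entry: bio feature -> (hypothetical OSC address, output range).
    MAPPING = {
        "heart_rate":     ("/source/1/rotation/speed", (0.0, 2.0)),
        "muscle_tension": ("/source/1/size",           (0.2, 6.0)),
        "breath_depth":   ("/source/1/distance",       (0.5, 12.0)),
    }

    def apply_features(features):
        """features: bio feature name -> normalised 0..1 value."""
        for name, value in features.items():
            if name in MAPPING:
                address, (lo, hi) = MAPPING[name]
                client.send_message(address, lo + value * (hi - lo))

    apply_features({"heart_rate": 0.7, "muscle_tension": 0.3})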


In the work ‘Body Echoes’, techno artist Lucy captured the inner movements of the body of yoga teacher Amanda Morelli through a system of custom-built Arduino controllers and close-up microphone capture of Morelli’s breath and heartbeat, and translated the raw data from the body into corresponding sound images, moving the energy of the audible breath in space.

4DSOUND and Lisa Park developed communication between commercial EEG brainwave headsets and the 4DSOUND system through a custom-built smartphone app that translates the real-time brainwave data into OSC messages for the spatial control of sound.

Tactile Sound


4DSOUND and SubPac developed a tactile bass application for spatial sound, integrating the wireless wearable SubPac M1s within the 4DSOUND system. The low frequencies present in the spatial sound environment are translated into touch by the haptic feedback inside the wearable.

With this application, the experience of bass is naturally extended with physical vibrations wherever one moves in the room. In combination with a position tracking system, it can be used for specialised applications of spatial bass, such as personalised experiences of bass, the imitation of low-frequency standing waves in space, and resonant bass tones that emerge according to the distance of listeners in relation to each other.
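
As a sketch of the underlying idea, the following isolates the low end of a signal with a simple one-pole low-pass and weights the felt intensity by a listener's distance from a virtual bass source. Both the filter and the weighting are invented for illustration, not SubPac's or 4DSOUND's actual processing.

    import math

    def one_pole_lowpass(samples, cutoff_hz, sample_rate=48000):
        """Very crude low-pass that keeps roughly the sub-bass region."""
        a = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
        out, y = [], 0.0
        for x in samples:
            y = (1.0 - a) * x + a * y
            out.append(y)
        return out

    def haptic_intensity(low_band_level, listener_pos, bass_pos):
        """Scale felt vibration by proximity to a virtual bass source."""
        return low_band_level / (1.0 + math.dist(listener_pos, bass_pos))

    low = one_pole_lowpass([0.0, 1.0, 0.5, -0.5, -1.0], cutoff_hz=80)
    print(haptic_intensity(max(low), (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))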