Media Control Interfaces

'The Plucker' – a case study in interface specific control signal processing by Zlatko Baracskai
Advanced Media Control Through Drawing: Using a graphics tablet to control complex audio and video data in a live context by Steve Gibson and Justin Love
Embodied Schemas for Cross-modal Mapping in the Design of Gestural Controllers by Mark Stephen Linnane, Linda Doyle, and Dermot Furlong
Touch Interfaces – between Hyperrealism and Invisibility by David Oswald
Dates:
Friday, 16 September, 2011 - 14:45 - 16:05

Chair Person:
Wim van der Plas

Presenters:
Zlatko Baracskai
Steve Gibson
Justin Love
Mark Linnane
Linda Doyle
Dermot Furlong
David Oswald

'The Plucker' – a case study in interface specific control signal processing

by Zlatko Baracskai

In this paper the applications of musical control signal processing algorithms are discussed. A generic classification of these signals is used to provide a set of basic processing strategies. ‘The Plucker’ musical interface is discussed and used to present a variety of interaction models that can be conceived for specific purposes. Basic algorithms are presented for a variety of aspects that can be analyzed from the interface data. These include continuous aspects such as speed, energy, and regularity, as well as discrete time instances such as attacks, direction changes, and activity threshold crossings, among others. The specific controllers in focus are the joysticks, potentiometers, and buttons found on ‘The Plucker’. These are analyzed in the light of physical convenience and virtuosity, as well as available data streams, compatibility, and calibration. The described processing algorithms are real-time implementations in the Max for Live environment and are thereby easily translatable to the Max/MSP and Pure Data environments.
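As a purely illustrative sketch (the paper's implementations are Max for Live patches, not Python), the continuous and event-based descriptors named above might be computed from a sampled control stream as follows; the sample rate, window size, and threshold are assumed values.

```python
import numpy as np

def control_descriptors(signal, rate=100, window=10, threshold=0.5):
    """Derive basic descriptors from a sampled control signal.

    signal: 1-D array of controller values (e.g. one joystick axis),
    sampled at `rate` Hz. Returns continuous descriptors (speed, energy)
    and event indices (direction changes, activity-threshold crossings).
    """
    signal = np.asarray(signal, dtype=float)
    # Continuous aspect: speed, as the first difference in units per second.
    speed = np.diff(signal, prepend=signal[0]) * rate
    # Continuous aspect: short-term energy (RMS of speed over a sliding window).
    padded = np.concatenate([np.zeros(window - 1), speed ** 2])
    energy = np.sqrt(np.convolve(padded, np.ones(window) / window, mode="valid"))
    # Time instance: direction changes, where the sign of the speed flips.
    sign = np.sign(speed)
    direction_changes = np.where(np.diff(sign) != 0)[0] + 1
    # Time instance: upward crossings of an activity threshold on the energy.
    above = energy > threshold
    threshold_crossings = np.where(~above[:-1] & above[1:])[0] + 1
    return speed, energy, direction_changes, threshold_crossings
```

The events are returned as sample indices, which at a known control rate translate directly into the time instances the paper describes.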

Further control signal processing functionalities are presented that form the building blocks of more complex gesture-following and interaction-analysis implementations. Musical aspects of control-rate signal processing are also discussed in the light of the various musical parameters, the hierarchy of achievable descriptive layers, and the mapping. A projection of the future of control signal processing is offered, as commercial software increasingly supports these strategies. The paper presentation includes both technical and musical demonstrations of the interface and of the control signal processing described.

Advanced Media Control Through Drawing: Using a graphics tablet to control complex audio and video data in a live context

by Steve Gibson and Justin Love

This paper presents the results of the authors’ Wacom tablet MIDI user interface. The application enables users’ drawing actions on a graphics tablet to control audio and video parameters in real time. The programming affords five degrees of concurrent control (x, y, pressure, x tilt, y tilt) for use in any audio or video software capable of receiving and processing MIDI data. Drawing gesture can therefore form the basis of dynamic control simultaneously in the auditory and visual realms. This creates a play of connections between parameters in both media, and illustrates a direct correspondence between drawing action and media transformation that is immediately apparent to viewers.
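The application itself is distributed as a compiled tool (see the download link below). As a rough sketch of the mapping idea only, assuming normalized axis values supplied by a tablet driver and using the mido library, with all CC numbers chosen arbitrarily:

```python
import mido

# Hypothetical CC assignments; a real tool would let users choose these.
CC_MAP = {"x": 20, "y": 21, "pressure": 22, "tilt_x": 23, "tilt_y": 24}

outport = mido.open_output()  # open the default MIDI output port

def send_tablet_state(x, y, pressure, tilt_x, tilt_y):
    """Translate one tablet sample (all values normalized to 0.0-1.0)
    into five concurrent MIDI control-change messages."""
    for name, value in (("x", x), ("y", y), ("pressure", pressure),
                        ("tilt_x", tilt_x), ("tilt_y", tilt_y)):
        cc_value = max(0, min(127, int(value * 127)))
        outport.send(mido.Message("control_change",
                                  control=CC_MAP[name], value=cc_value))
```

Any MIDI-capable audio or video software can then map these five controller streams onto its own parameters.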

The paper considers the connection between drawing technique and media control both generally and specifically, postulating that dynamic drawing in a live context creates a performance mode not dissimilar to performing on a musical instrument or conducting with a baton. The use of a dynamic and physical real-time media interface re-inserts body actions into live media performance in a compelling manner. Performers can learn to “draw/play” the graphics tablet as a musical and visual “instrument”, creating a new and uniquely idiomatic form of electronic drawing.

The act of live drawing, though here removed from its traditional reference to a produced “drawing” (either on-screen or in print), is one that allows for dramatic gesture in a way that pressing keys on a computer keyboard or moving a mouse could never hope to achieve. In addition, the fact that the graphics tablet can unite five degrees of control over live audio and video makes it an ideal tool to consolidate the roles of the DJ and the VJ under one control interface.

The paper also discusses the practical programming of the application, and the authors will present examples of its use as a media-manipulation tool.

Links

Audio Demo 1: http://www.telebody.ws/TRACEY/Tablet_demo1.mov

Audio Demo 2: http://www.telebody.ws/TRACEY/Tablet_demo2.mov

Audio-Video Performance Demo: http://www.telebody.ws/TRACEY/Tablet_Demo_Split.mov

Wacom MIDI software download: http://www.roguescience.org/wacom2MIDI.zip

Set up the Wacom MIDI software: http://www.telebody.ws/TRACEY/Wacom_MIDI_Setup_Demo.mov

Load your saved setup: http://www.telebody.ws/TRACEY/Wacom_MIDI_Load_Demo.mov

Embodied Schemas for Cross-modal Mapping in the Design of Gestural Controllers

by Mark Stephen Linnane, Linda Doyle, and Dermot Furlong

Alongside changing processes and theoretical frameworks for making art, artist-researchers working with digital technology are extending the physical interface by incorporating gestural control into their work, looking for ways to read human actions and to exploit how people understand the world through their bodies. Such applications of gestural controllers can be seen in a wide variety of presentation formats, from interactive installation and performance to gaming. These technologies allow for the design of novel interactive experiences, but challenges remain in designing controllers that support expressivity, meaningful interaction, and intuitive control.

Embodied Cognition is an area of cognitive science that treats perception, cognition, and action as profoundly shaped by the human experience of having a body and living in a physical environment and in a culture. It offers an alternative approach that complements current initiatives in the design of interactive technologies and gestural controllers, providing a novel way to analyse the complex interactions between user and technology in terms of the fundamental categories of embodied existence.

Image schemata are fundamental categories: neurally encoded, pre-conceptual structures recruited from experiences of bodily movement and perceptual interaction, including the body’s interaction with its environment in terms of spatial relations and the perception of force and magnitude. These categories are involved in organizing mental representations into meaningful, coherent units and are implicated in the formation of new concepts. This paper looks at the design of meaningful controllers that depend upon empirical knowledge of such fundamental categories, in terms of how the body interacts with and understands its environment. It proposes that these structures may be used to identify correlations between gesture and other phenomena such as sound and colour. Such correlations are the necessary building blocks of a grounded cross-modal mapping schema on which to base the design of controllers that allow for meaningful gestural interaction with music and image.
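As a purely hypothetical illustration (the paper proposes the framework rather than a specific implementation), a schema-grounded cross-modal mapping could be as simple as coupling gesture height to both pitch and colour brightness via the UP-DOWN (VERTICALITY) schema; all ranges below are assumptions:

```python
def verticality_mapping(hand_height):
    """Map a normalized hand height (0.0 = low, 1.0 = high) to sound and
    colour via the UP-DOWN (VERTICALITY) image schema: up means higher
    pitch and brighter colour. The ranges are illustrative assumptions."""
    h = max(0.0, min(1.0, hand_height))
    midi_pitch = 36 + round(h * 48)   # roughly C2..C6
    brightness = round(h * 255)       # 8-bit brightness value
    return midi_pitch, brightness
```

Because the same bodily schema grounds both output dimensions, the gesture-to-sound and gesture-to-colour correlations remain mutually coherent rather than arbitrary.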

In doing so, it presents an interdisciplinary approach to the design of interfaces to digital technology, one that can have considerable impact in the arts and technology domain.

Touch Interfaces – between Hyperrealism and Invisibility

by David Oswald

Multitouch technology has existed for several years now. While big multitouch tables have mostly been found in public places such as exhibitions, small-screen devices such as multitouch smartphones have become an everyday phenomenon. In both cases the context of use differs from that of a desktop computer. Multitouch table systems are often designed for specific content, a specific location, and a fixed context of use. In contrast, smartphone applications must, owing to their mobility, work in any context. With the emergence of medium-sized multitouch devices such as the iPad, more and more digital products known from the work-related desktop context are being redesigned for multitouch use. Beyond that, we will see the emergence of novel media formats and applications that are specific and typical for medium-sized multitouch devices.

Just as the invention of the computer mouse was a prerequisite for, and an activator of, the design of graphical user interfaces, multitouch interaction will be a prerequisite for and activator of novel interfaces. Two trends in multitouch interface design are already apparent. One is the re-introduction of the real-world-metaphor approach of the 1990s, in which symbolic interface elements disappear in favour of hyperrealistic everyday objects in everyday environments. This is said to reduce cognitive complexity and learning time. It is debatable, however, whether learnability should always be the top priority, and whether the destiny of interface design is simply hyperrealistic copies of the real world. The other trend is towards invisibility. In traditional interfaces, interactive elements are visible, and users have learned how to interact with them. With multitouch, and especially with sensor-based interaction (controlling parameters through the spatial orientation or acceleration of the device), the visual cues that indicate interaction possibilities disappear. Hence the famous "pinch" gesture for zooming images is not intuitive at all: the interface shows no affordance that would indicate "pinchability", nor does the idea of a real photograph suggest "zoomability".
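For concreteness, here is a minimal, purely illustrative sketch of how a pinch gesture is commonly recognized (not drawn from the talk): the zoom factor is the ratio of the current distance between two touch points to their distance when the gesture began.

```python
import math

def pinch_scale(start_touches, current_touches):
    """Compute a zoom factor from two two-finger touch samples.
    Each argument is a pair of (x, y) points; the scale is the ratio of
    the current finger distance to the distance at gesture start."""
    def distance(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)
    return distance(current_touches) / distance(start_touches)

# Example: the fingers move apart, so the image zooms in by a factor of 2.
scale = pinch_scale([(100, 100), (200, 100)], [(50, 100), (250, 100)])
```

Nothing in this computation is visible to the user beforehand, which is exactly the affordance problem the talk raises.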

These trends will be demonstrated and discussed in a talk illustrated by examples from research projects and students’ work.