Visual Effects Remixed
Chair: Peter Richardson
2nd Chair: Dr. Steve Gibson
Visual effects (VFX) are the various computer-generated processes by which imagery is created and/or manipulated outside the context of a live-action film shoot. Traditionally, moving-image visual media in a performative / gallery context have been experienced primarily as “playback” media, in which material is fixed in time and is played from beginning to end. Real-time visuals, on the other hand, require the intervention of a performer or a user. In the case of the VJ or live filmmaker, he or she chooses the video clips in real time, selects the options for effects and determines the compositing of images and effects. Recently a number of (traditional) narrative filmmakers have moved away from structural narrative and into the realm of ‘live cinema’, remixing their films for audiences as a performative experience. This raises interesting possibilities for extending the genre with a performative, art-based approach. British directors Peter Greenaway and Mike Figgis increasingly work with this method. The ‘live cinema’ experience is generally limited to pre-shot or captured visuals which are processed or remixed; as yet few have attempted to incorporate ‘live’ visual effects as part of this cinematic experience. “VFX Remixed” seeks to stimulate debate and generate theoretical prototypes for live cinema experiences that utilize the technologies of VFX to create a more immersive cinematic performance experience. Areas of interest include audio-visual performance (VJing), physical and serious gaming, and computer-based installations incorporating real-time visual processing.
Simulating synaesthesia in real-time performance
by Dr. Steve Gibson
In this paper the author will describe and show examples of his live audio-visual work for 3D spatial environments. These projects use motion-tracking technology to enable users to interact with sound, light and video using their body movements in 3D space. Specific video examples of one past project (Virtual DJ) and one current project (Virtual VJ) will be shown to illustrate how subjective and flexible user interaction is enabled through a complex but predictable mapping of 3D space to media control. In these projects audience members can interact with sound, light and video in real time simply by moving around in space with a tracker in hand. Changes in real-time visual effects can be synchronized with changes in sound and/or light (e.g. music volume = light brightness = video opacity). These mappings can be changed dynamically in real time, allowing the user to consolidate the roles of DJ, VJ and lighting designer in one interface. This interaction model simulates the effect of synaesthesia, in which certain people experience light or colour in response to musical tones.
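The kind of mapping described above (e.g. music volume = light brightness = video opacity) could be sketched as follows. This is a minimal, hypothetical illustration only: the function names, room dimensions and parameter assignments are invented for the example and are not the actual Virtual DJ / Virtual VJ implementation.

```python
# Hypothetical sketch: mapping a tracked 3D position to synchronized
# audio, light and video parameters, simulating a synaesthetic link.

def clamp(value, low=0.0, high=1.0):
    """Keep a control value within its valid 0..1 range."""
    return max(low, min(high, value))

def map_position(x, y, z, room=(4.0, 4.0, 3.0)):
    """Normalize a tracker position (metres) within an assumed room size
    and derive media control values from it."""
    nx = clamp(x / room[0])
    ny = clamp(y / room[1])
    nz = clamp(z / room[2])
    # The tracker's height drives one shared intensity value, so music
    # volume, light brightness and video opacity always change together.
    intensity = nz
    return {
        "music_volume": intensity,
        "light_brightness": intensity,
        "video_opacity": intensity,
        "filter_cutoff": nx,      # horizontal axes drive secondary effects
        "video_effect_mix": ny,
    }

params = map_position(2.0, 1.0, 1.5)
```

Because the mapping is a plain function of position, it could be swapped out dynamically at performance time, which is one way to read the paper's claim that the mappings can be changed in real time.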
Responsive Illuminated Architecture
by Stefan Müller Arisona and Christian Schneider
Illumination of buildings with projectors or media facades has become a popular means of visual communication: in the art context, many festivals around the world exhibit pieces, and a growing number of applications can be observed in the commercial context. As technology advances, the scale of illuminated surfaces increases, the visual perception of objects can be altered in real time, and the use of sensors or smart phones enables interaction between visuals and the environment. Parameters such as temperature, movement or gestures drive the visuals and allow people to perceive their body and the environment in a new way. In this paper, we present two academic projects, "Sensitive Tapestry" and "Projected Realities", which aimed at integrating the above techniques into the architecture curriculum and at leveraging the architect's knowledge for a more seamless integration of interaction and visualization. The goal of both projects was to create a novel experience that arises from architecture augmented with digital information, and from architecture that acts as a user interface to reveal information about the building, its occupants and its environment.
Visual Effects Remixed
by Peter Richardson
Traditionally, moving-image visual media in a performative / gallery context have been experienced primarily as “playback” media, in which material is fixed in time and is played from beginning to end. Real-time visuals, on the other hand, require the intervention of a performer or a user. In the case of the VJ or live filmmaker, he or she chooses the video clips in real time, selects the options for effects and determines the compositing of images and effects. Recently a number of (traditional) narrative filmmakers have moved away from structural narrative and into the realm of ‘live cinema’, remixing their films for audiences as a performative experience. This raises interesting possibilities for extending the genre with a performative, art-based approach. British directors Peter Greenaway and Mike Figgis increasingly work with this method. The ‘live cinema’ experience is generally limited to pre-shot or captured visuals which are processed or remixed; as yet few have attempted to incorporate ‘live’ visual effects as part of this cinematic experience. This paper investigates potential methods for incorporating visual effects into ‘live cinema’ experiences.
Bios of the Participants
Steve Gibson is a Canadian media artist, interface designer and media curator. He completed his Ph.D. at SUNY Buffalo, where he studied music composition and electronic music. He was also a postdoctoral researcher in media and technology at Concordia University in Montréal. He currently serves as Senior Lecturer in Interactive Media Design at Northumbria University. He was curator for the media art event Interactive Futures from 2002 to 2007 and currently serves as co-Director of Digital Art Weeks (ETH Zurich). He also holds a Visiting Professorship at Xi’an Academy of Fine Arts, China, and is Artistic Director of Limbic Media Corporation, a company specialising in physical computing applications. Influenced by a diverse body of art and popular movements, his work fuses electronica, immersive art, game art, montage and post-minimalism. He works in a range of media, from live electronic music to immersive and physical installation. His works have been presented in such venues as: Ars Electronica; the Whitney Museum of American Art; Banff Centre for the Arts; Digital Art Weeks; the European Media Arts Festival; ISEA; Interface3, Hamburg; the San Francisco Art Institute; and 4CyberConf and 6CyberConf. He recently co-edited a volume entitled Transdisciplinary Digital Art, published by Springer (Germany) in spring 2008.
Stefan Müller Arisona
Stefan Müller Arisona is a computer scientist and artist whose main interests lie at the intersections of science, art and technology. His research focuses on interactive and generative design tools, on computer-assisted techniques for architectural and urban modelling and simulation, and on real-time multimedia systems. Stefan was a visiting researcher at IRCAM Centre Pompidou (2003), received his PhD from the University of Zurich in 2004, and was a post-doctoral fellow at ETH Zurich (2005–2007) and UC Santa Barbara (2007–2008). Since 2008 he has been a senior researcher at ETH Zurich, and he recently became a principal investigator at ETH’s Future Cities Laboratory in Singapore. Stefan has been the scientific chair of ETH Zurich’s Digital Art Weeks, an annual symposium and festival that explores new movements in digital art, since 2005. As an artist, DJ and VJ, he has performed internationally, and his artworks have appeared worldwide at renowned venues such as Zurich’s Cabaret Voltaire and the Ars Electronica Center in Linz. URL: http://www.arch.ethz.ch/~stefanmu
Peter Richardson is a filmmaker and researcher based in Dundee and London. After graduating from Goldsmiths College in 1989, Peter spent 14 years in the film industry. He has exhibited video works at The Barbican, London; City Racing Gallery, London; and Marian Goodman Gallery, New York. His experimental films have been screened on television and at film festivals worldwide, including ‘Out Takes’ Brazil; the New York, Los Angeles, Cannes, Cork, London and Hamburg film festivals; and the National Review of Live Art at The Tramway, Glasgow. Peter is a Lecturer at Duncan of Jordanstone College of Art & Design in Scotland and is currently Director of the Visual Effects Research Lab (VERL), a European Union-funded project that undertakes transdisciplinary research into high-resolution image technologies.
Christian Schneider studied Computer Science at the University of Neuchâtel in Switzerland and the University of Aarhus in Denmark. After working in New Zealand on a branding campaign for MTV World, his passion for computational design brought him back to Switzerland. Since 2007 he has taught and researched at ETH Zurich, Department of Architecture, in the areas of projection mapping, responsive systems, data visualization and computational design. He currently teaches a course in parametric urban design.