Augmented Reality in Practice: From Motion Tracking to Live Memory Machines

A Mobile System for an Indoor Augmented Reality in Building Services by Benachir Medjoub and Serhat Kut / netBody by Suguru Goto / AR(t) in the Orient by Margriet Schavemaker and Hein Wils / augment_me: algorithmic memory, absence and presence in the cloud by Brad Miller / Augmented Movement-Vision: Moving, Seeing and Sensing by Tyng Shiuh Yap
Dates: 
Thursday, 15 September, 2011 - 17:00 - 18:40
Chair Person: 
John McCormick
Presenters: 
Benachir Medjoub
Serhat Kut
Suguru Goto
Margriet Schavemaker
Hein Wils
Brad Miller
Tyng Shiuh Yap

A Mobile System for an Indoor Augmented Reality in Building Services

by Benachir Medjoub and Serhat Kut

The convergence of ubiquitous computing and visualisation technologies is creating unprecedented opportunities for engineering teams to respond more effectively to the increasing demand for sophisticated, customised and high-quality products and services in industry. Ubiquitous computing allows people to move between gateways to the information world in ways that are appropriate to their current setting. Visualisation-related technologies, from immersive displays to Augmented Reality (AR), are routinely used in industry, with huge potential in engineering, health, heritage, archaeology, architecture and culture.

In this paper we present an indoor augmented reality system for building services maintenance that uses motion-tracking sensors and provides ubiquitous access to 3D models and services specifications, including equipment locations (fan coils, diffusers, etc.), pipe routes and technical specifications, on a variety of devices (HMDs, PDAs and desktop computers). This mobile system is used by engineers for building services maintenance, offering real-time visualisation and co-location of existing service equipment hidden behind ceilings or walls.
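
As a rough illustration of the kind of lookup such a system might perform (not the authors' implementation: the record fields, function names and search radius below are invented for this sketch), the following Python fragment matches a motion-tracked position against a catalogue of concealed service equipment so that nearby items could be rendered as overlays on an HMD, PDA or desktop view:

```python
# Hypothetical sketch: matching a tracked device position against hidden
# building-services equipment so it can be overlaid on the engineer's view.
from dataclasses import dataclass
from math import dist


@dataclass
class EquipmentRecord:
    """A single concealed service item taken from the 3D building model."""
    name: str            # e.g. "fan coil", "diffuser"
    position: tuple      # (x, y, z) in the building coordinate frame, metres
    specification: str   # reference to the technical specification


def find_nearby(records, tracked_position, radius=3.0):
    """Return equipment within `radius` metres of the tracked pose,
    nearest first, ready to be drawn as AR overlays."""
    hits = [r for r in records if dist(r.position, tracked_position) <= radius]
    return sorted(hits, key=lambda r: dist(r.position, tracked_position))


# The motion-tracking sensor reports the engineer's position; matching
# records would then be projected onto the display in use.
inventory = [
    EquipmentRecord("fan coil", (2.0, 5.5, 2.8), "FC-12 spec sheet"),
    EquipmentRecord("diffuser", (9.0, 1.0, 2.9), "DF-04 spec sheet"),
]
print(find_nearby(inventory, tracked_position=(1.5, 5.0, 1.7)))
```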

netBody - “Augmented Body and Virtual Body II” with the system, BodySuit, Powered Suit and Second Life - Its Introduction and The Case Study of An Application of The System

by Suguru Goto

This paper introduces a system that combines BodySuit, Powered Suit and Second Life, and discusses its possibilities and its uses in an artistic application. The proposed system contains both a gesture controller and robots at the same time. In this system, the data suit, BodySuit, controls the avatar in Second Life, and Second Life controls the exoskeleton, Powered Suit, in real time. These are linked to each other through Second Life over the Internet. BodySuit does not contain a hand-held controller: a performer, for example a dancer, wears the suit, and gestures are transformed into electronic signals by sensors. Powered Suit is another suit that a dancer wears, but here gestures are generated by motors; it is a sort of wearable robot. Second Life is software developed by Linden Lab that allows users to create a virtual world and virtual humans (avatars) on the Internet. Working together with BodySuit, Powered Suit and Second Life, the idea behind the system is that the human body is augmented by electronic signals and reflected in a virtual world in order to perform interactively.
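
A minimal sketch of the data flow described above, with invented function names and value ranges (the paper does not specify the actual protocol linking BodySuit, Second Life and Powered Suit):

```python
# Illustrative pipeline: BodySuit sensor data drives the avatar, and the
# avatar pose in turn drives motor targets for the Powered Suit exoskeleton.
def read_bodysuit_sensors():
    """Pretend read: joint bend values from the BodySuit, normalised 0..1."""
    return {"left_elbow": 0.42, "right_elbow": 0.73, "left_knee": 0.10}


def sensors_to_avatar_pose(sensors):
    """Map sensor values onto avatar joint angles (degrees) for the
    virtual body controlled inside the online world."""
    return {joint: value * 180.0 for joint, value in sensors.items()}


def avatar_pose_to_motor_commands(pose):
    """Translate the avatar pose back into motor targets (0..1) for the
    exoskeleton worn by the second performer."""
    return [(joint, angle / 180.0) for joint, angle in pose.items()]


# One iteration of the real-time loop: gesture in, virtual body updated,
# wearable robot actuated.
pose = sensors_to_avatar_pose(read_bodysuit_sensors())
commands = avatar_pose_to_motor_commands(pose)
print(commands)
```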

AR(t) in the Orient

by Margriet Schavemaker and Hein Wils

In 2009 the Stedelijk Museum Amsterdam started the project ARtours, which investigates and produces augmented reality tours around the collection of the Stedelijk and modern art in general. The project aims at both indoor and outdoor applications:

- Indoor: as a replacement for, or addition to, the classic audio tour.

- Outdoor: as a new experience of, and new interactions with, modern art.

In both instances not only is the current collection used; newly developed art forms and new narratives will also be incorporated in the project.

The project addresses a wide range of aspects: technology, user experience, creation and management of information, indoor and outdoor use, and younger and older audiences.

The presentation will cover three subjects:

1. An introduction to the Stedelijk Museum and why this innovative and surprising AR project was started: how do we engage the audience in new user experiences, and how is it changing the museum space itself?

2. An explanation of the platform we are currently building. This platform will be used to create tours in and around our museum and also to commission artists to come up with original AR artworks. During the process of platform development we will create and evaluate business models that suit the public, the art community and its business partners.

3. A presentation of the following ARtours projects:

- Me on the museum-square: a virtual exhibition of original 3D art on Museum Square in Amsterdam.

- The ARtotheque as shown at Lowlands festival and PICNIC: an art-library where the public can borrow (virtual) masterpieces from the Stedelijk's collection and place these themselves.

- The Design Tour: (a) a walk through Amsterdam with design artifacts in historical context; (b) active participation in redesigning the city's beauty by covering up construction sites.

- Two AR exhibitions in the museum space itself. These are commissioned ARt tours.

- Newly developed tours that are currently in concept phase.

augment_me: algorithmic memory, absence and presence in the cloud

by Brad Miller

Can you have a memory without photographing it? Where does a memory live? How do we make sense of them? If it’s online, who owns it? What is association at a global scale? What are the conventions of western time and how might that be visualised or utilised? Can you live forever in the cloud? This paper examines these questions through the media art installation augment_me.

augment_me is a responsive visual database; a memory machine of sorts, but a live and developing one. The images that constitute the database are a sequence of photographs and videos collected over the past eight years, tracking my relationships with people, things, places and scenarios (all of which are viewable on the photo-sharing site Flickr®). They are sequentially embedded with contextual associations and arranged (initially) by time and date. This, combined with the ability of anyone or anything within view of the camera/sensor in the exhibition space to access those images and make them move, appear and disappear, makes manifest the metaphor of memory.
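
Purely as an illustration of the behaviour described above (the class and function names are hypothetical, not the installation's actual software), the sketch below orders an image archive by time and date and surfaces a few images when presence is detected by the camera/sensor:

```python
# Illustrative sketch: a time-ordered image database that responds to
# presence detected in the exhibition space.
from dataclasses import dataclass
from datetime import datetime
import random


@dataclass
class MemoryImage:
    path: str
    taken: datetime
    tags: list           # contextual associations: people, places, scenarios


def initial_sequence(images):
    """The default ordering of the database: by time and date."""
    return sorted(images, key=lambda img: img.taken)


def respond_to_presence(images, presence_detected, count=5):
    """When someone (or something) is detected in front of the work,
    surface a handful of images to appear; otherwise let them recede."""
    if not presence_detected:
        return []
    return random.sample(images, k=min(count, len(images)))


archive = initial_sequence([
    MemoryImage("img_001.jpg", datetime(2004, 3, 1), ["harbour", "friends"]),
    MemoryImage("img_002.jpg", datetime(2009, 7, 12), ["street", "travel"]),
])
print(respond_to_presence(archive, presence_detected=True))
```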

These images can take on a particular and slightly voyeuristic significance. For others, I imagine, they present the generic face of anyone. We recognise these compositions, these tableaux vivants, these experiences – the urban middle-class individual's photographs of leisure and tourism. All that distinguishes this collection from the endless expanse of similar imagery is its artist. The vector of all these moments is the artist's own existence, “which they affirm and erase simultaneously. We are seeing his life through the eyes of an invisible protagonist. Or are we seeing his life flash before his eyes? Is this how it will be at the moment of death? The ordinariness of our existence spread before us. The objects, people and places that have made up our lives flowing, like data, away from the organizing principle of our own subjectivity.”

“The ultimate promise is that the flow of data may restore the flow of life when it is temporarily halted. Biological death becomes a small death, data becomes the through-line that joins old subject to new.”

Augmented Movement-Vision: Moving, Seeing and Sensing

by Tyng Shiuh Yap

With the augmented reality technology now available, the screen can not only be directly fused with our visual field through the use of video glasses, it can also respond to and correlate with our movement in space. Because of the proximity of the video glasses to the eye, the screen can be said to have become embodied. The images on this screen do not have to align with the strict Cartesian perspectival space inscribed in the external world. Instead, this 'embodied' screen has the potential to alter and augment the dimensionality of our perceptual field through the form and content of the overlaid image(s).

Such augmented space would affect the way our body habitually moves and navigates. How would the body re-adjust to movement beyond our usual body-in-space relationship? For instance, fragmented vision is adaptable, coherent and manageable when it is external to the body – as in the multiple viewpoints employed in surveillance systems or in computer games – but it is seemingly unmanageable when embodied by the moving body in augmented vision. Are there thresholds in our movement-vision which the human body cannot overcome? Or is it simply a matter of re-habituation, practice and adjusting our body-mind concepts?

This paper explores such expanded space and our body's capacity, or plasticity, to reconfigure and adapt to movement in space with augmented vision, discussing the author's ongoing media art projects from her PhD research as well as other related media works in this area. The discussion will include cross-examination of studies from the fields of cognitive science, philosophy and media theory.