Chasing Ghosts: Reactive Notation and Extreme Sight Reading

Date: Thursday, 15 September 2011, 09:00–10:30
Chair: Arthur Clay
2nd Chair: Jason Freeman
Presenters: Basak Dilara Ozdemir, Thor Magnusson, Georg Hajdu, Shane Mc Kenna, John Eacott


Over the last decade, a growing number of composers have begun to use what is known as real-time notation in their work, and many have developed diverse systems to facilitate its use in all kinds of performance situations.

Real-time music notation includes any notation, either traditional or graphic, which is formed or created during the actual performance. Other terms such as dynamic music notation, live scoring, virtual scoring, and reactive notation have also been used to describe the same process.

This panel event seeks to convey the excitement of current real-time notation practice to the public by presenting work done in the area by prominent composers, musicians, and researchers. The presenters will explore key issues behind virtual scoring and real-time notation from technical, musical and design perspectives and provide an overview of the various approaches, their systems, and the styles of music that have emerged from them.

Relevant works from the past and present will be discussed to show how real-time notation relates to earlier experiments with open-form and malleable musical scores and with computer-assisted composition, and to explore the connections and boundaries among composers, performers, and the audience.

Participants from the planned accompanying workshop, Interactive Music Basics & RealTime Scoring, will join the panel to discuss their experiences using software and hardware tools to create real-time notation systems and confronting the challenges of extreme sight reading as interpreters.

Above all, the organizers of the panel hope that the events will spark interest and discussion, further the development of a community of practice around virtual scoring and real-time playing, and raise awareness of this new area within contemporary music circles, thereby attracting new people to this exciting field.

Paper Abstracts

Aspects of Realtime Scoring & Extreme Sight Reading

by Art Clay

The relationship between the composer of a work and its interpreter can often be deduced from the type of score at hand and how it is notated. Over the history of Western music, various degrees of freedom have been granted to and taken away from the interpreter. In early music styles, interpreters embellished melodies with ornamentation ranging from simple to elaborate and improvised cadenzas; in contemporary music the amount of freedom an interpreter is given varies, but it is all too often very limited. The following paper introduces, step by step, the concept of malleability in score making and score reading. Key concepts are illustrated with examples of various score types, from malleable paper scores to interactive screen-based ones.

Real-time Notation, Text-based Collaboration, and Laptop Ensembles

by Jason Freeman

For me, dynamically generated music notation is a powerful mechanism for connecting people to each other through interactive music systems. Much of my artistic work is concerned with exploring new relationships among composers, performers, and listeners (often blurring those categories beyond recognition). Over the last six years, I have incorporated dynamic music notation into my compositional practice in three ways. In live performance, I use real-time notation to link the creative activities of audience members to the music performed by instrumental musicians in real time, creating a continuous feedback loop between performers and audience members throughout the performance. On the Internet, I have used dynamically generated scores as a way to incorporate the ideas of web-site visitors into future concert performances of those works. And with laptop orchestras, I have used real-time notation to link the activities of improvising laptop musicians to the music played by instrumental musicians and to share this process with the audience. I bring considerable expertise to the panel in terms of design challenges and potential solutions, technical platforms and implementations of real-time notation, and aesthetic and historical perspectives on its use.

Interpreting Reactive Notation and Extreme Sight Reading

by Basak Dilara Ozdemir

One can approach sight reading and real-time score reading from different points of view, but the basic idea behind both is very similar. When we examine sight reading, it is as important a level of inquiry as harmony, structure, and orchestration, and the tempi of pieces may vary depending on the century in which the music was written. This paper's contribution to the panel focuses on the reading of notated and non-notated music. Real-time music includes surprises: as in aleatoric music, these occur naturally, since the outcome of an interpretation may vary between performances according to the parameters given in the moment. Real-time scores may encompass diverse styles, as there is no obligation to make only contemporary or classical music; there are no boundaries for writing the music or for interpreting it.

Computer Music, Music Languages, Live Coding

by Thor Magnusson

My contribution to the panel will be to present live coding as a new path in the evolution of the musical score. I will argue that live coding practice accentuates the score and that, while it is the perfect vehicle for the performance of algorithmic music, it also transforms the compositional process itself into a live event. As a continuation of 20th-century artistic developments of the musical score, live coding systems often embrace graphical elements and language syntaxes foreign to standard programming languages. I will show how live coding, as a highly technologized artistic practice, is at its core still a scoring practice, and one that can shed light on how non-linearity, play, and generativity will become prominent in future creative media productions. I might demonstrate some live coding systems and show videos of key performers at play.
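
For readers unfamiliar with live coding, the short Python sketch below is offered as an invented illustration of the "code as score" idea, not as an excerpt from any presenter's system: a few lines of code specify a generative pattern, and rewriting the parameters while the loop runs is what changes the music in performance.

# A minimal, hypothetical sketch of code acting as a score.
import itertools
import random

def pattern(scale=(0, 2, 3, 5, 7, 10), octave=60, density=0.8, seed=1):
    """Yield an endless stream of (midi_note_or_None, duration_in_beats) events."""
    rng = random.Random(seed)
    for step in itertools.count():
        if rng.random() < density:
            degree = scale[step % len(scale)]
            yield octave + degree, 0.25   # a note
        else:
            yield None, 0.25              # a rest

# "Performing" the score here just means printing one bar of events;
# in a live-coding environment the same generator would drive a synth,
# and the parameters above would be edited on stage while it plays.
for note, dur in itertools.islice(pattern(), 16):
    print("rest" if note is None else note, dur)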

Real-time Composition, Real-time notation, Spectral Composition

by Georg Hajdu

Schwer…unheimlich schwer (difficult…incredibly difficult) is a piece for bass clarinet, viola, piano, and percussion about the German Red Army Faction member Ulrike Meinhof. It zooms in on the moment when she, in a TV interview, talks publicly about the fate of politically active women and raises the possibility of leaving her children in order to pursue her interests. In this piece, which is based on a transcription of Meinhof's speech and composed in real time by a computer that sends the parts to the performers' screens, the difficulty of extreme sight reading, including microtones and large leaps for viola and clarinet as well as complex harmonies for marimba and piano, conveys a sense of the dilemma that Meinhof is clearly experiencing. The real-time composition was realized with MaxScore, a composition and notation environment developed by Nick Didkovsky and the author.

Animated Graphic Notations

by Shane Mc Kenna

Since the 1950s, composers such as John Cage, Cornelius Cardew, Morton Feldman, and many since have been putting ideas of notational reform into practice. These ideas not only challenge the deterministic nature of traditional notation but also reflect an alternative philosophy behind the creation of music. The use of graphic notation requires a change in the composer-performer relationship and questions the traditional concept of musicality, creating opportunities for more accessible music making for amateur musicians. This essay discusses the author's use of animated graphic notation to encourage collaborative music making for a wide range of performers with different musical backgrounds and levels of experience. It includes an examination of research carried out by the author through an interactive installation, which gives insight into immediate vocal interpretations of moving shapes and symbols by a range of professional and amateur musicians. Understanding the common human associations between visual parameters and musical sounds is an important factor in creating animated graphic notation that is both accessible and engaging. The use of moving shapes, colors, visual rhythms, and textures to engage individuals and groups in creative musical collaboration will be discussed with reference to large ensemble performances and installation works by the author.

Using Live Notation for Musical Sonification Performances

by John Eacott

The musical works Flood Tide and Hour Angle are sonifications of live environmental data. Flood Tide takes data from the flow of tidal water, and Hour Angle uses a computer model of the angular relationship between the Earth and the Sun. Both works are performed by live musicians as the data is collected, so a mechanism for displaying musical notation as it is generated is essential. I have been performing both works regularly since 2008 with ensembles of varying sizes, up to 39 players, which has been an opportunity to gather practical experience in developing and implementing live notation, together with discussions about why it is an important and emerging area of music. The design of a live notation system is challenging and intriguing, as it involves much more than simply displaying conventional notation. My own system, written in SuperCollider, is fairly basic, although it can still be used to generate performances that are musically rich and challenging for performers. I am interested in learning better ways of designing and implementing live notation systems and in helping to define ways in which live notation can be used to produce powerful and meaningful musical performances.
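
As a rough, invented illustration of the kind of data-to-pitch mapping that such sonification works rely on, the Python sketch below quantizes a stream of readings onto a scale and names the resulting notes; the sample data, scale, and ranges are assumptions made for this text and are not taken from Flood Tide, Hour Angle, or the author's SuperCollider system.

# A minimal, hypothetical sonification mapping: environmental readings to pitches.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_name(midi):
    """Convert a MIDI note number to a name like C3."""
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

def reading_to_pitch(value, lo, hi, scale=(0, 2, 4, 7, 9), base=48):
    """Quantize a reading in [lo, hi] onto a pentatonic scale spanning two octaves."""
    frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    step = round(frac * (2 * len(scale) - 1))
    octave, degree = divmod(step, len(scale))
    return base + 12 * octave + scale[degree]

# Pretend these are tide-height readings (in metres) arriving in real time;
# a live notation system would render the resulting notes to players' screens.
for tide in [0.4, 0.9, 1.6, 2.8, 3.5, 2.9, 1.7, 0.8]:
    midi = reading_to_pitch(tide, lo=0.0, hi=4.0)
    print(f"tide {tide:>4.1f} m -> play {midi_to_name(midi)}")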

Bios of the Participants

Art Clay

Art Clay, a sound artist and curator, was born in New York and lives in Basel. He is a specialist in the performance of self-created works using intermedia and has appeared at international festivals and on radio and television in Europe, Asia, and North America. His recent work focuses on media-based works and large performative works and spectacles using mobile devices. He has won prizes for performance, theatre, new media art, and curation. He has taught media and interactive arts at various art schools and universities in Europe and North America, including the University of the Arts in Zurich.

Jason Freeman

Jason Freeman's works break down conventional barriers between composers, performers, and listeners, using cutting-edge technology and unconventional notation to turn audiences and musicians into compositional collaborators. His music has been performed by groups such as the American Composers Orchestra and the Rova Saxophone Quartet and featured in the New York Times and on National Public Radio. Freeman studied at Yale University and Columbia University. He is currently an assistant professor in the School of Music at Georgia Tech in Atlanta.

Georg Hajdu

Georg Hajdu was born in Göttingen, Germany, in 1960. He is among the first composers of his generation dedicated to the combination of music, science, and computer technology. After studies in Cologne and at the Center for New Music and Audio Technologies (CNMAT), he received his PhD from UC Berkeley. In 1996, following residencies at IRCAM and the ZKM Karlsruhe, he co-founded WireWorks, an ensemble specializing in the performance of electro-acoustic music, with his wife Jennifer Hymer. Georg Hajdu has published articles on several topics at the borderline of music and science. His areas of interest include multimedia, microtonality, and algorithmic, interactive, and networked composition. Currently, he is professor of multimedia composition at the Hamburg School of Music and Theater.

Basak Dilara Ozdemir

Basak Dilara Ozdemir, a pianist and composer, began studying piano at the age of seven at Istanbul University State Conservatory; she went on to receive her diploma and master's degree from the Liszt Ferenc Music Academy in Budapest. She also studied at the Paris Conservatory and completed a year-long composition and electronic music course at IRCAM in Paris. Currently, she is pursuing her PhD and working as a lecturer at the music academy in Istanbul.

Shane Mc Kenna

Shane Mc Kenna is a music teacher and musician based in Dublin. He has completed both a Bachelor of Music Education, specialising in performance, and a Master's in Music and Media Technology at Trinity College Dublin. His work seeks to encourage collaborative music making for professional and amateur musicians through graphic notation.

Thor Magnusson

Thor Magnusson is a musician/writer/programmer working in the fields of music and generative art. He is a Senior Lecturer in the School of Art and Media at the University of Brighton and teaches courses on computer music and algorithmic and interactive systems. He is a co-founder of the ixi audio collective.

John Eacott

John Eacott is a trumpeter and composer who changed almost overnight from making ‘traditional’ acoustic music to music made using algorithmic methods. His works Flood Tide and Hour Angle have been performed 14 times, mostly in the UK, including at the Thames Festival in 2009 and at London’s Southbank Centre in 2010. He lectures in music at the University of Westminster, London.