Creative Artificial Intelligence

 Communication Acoustics and Aural Architecture Research Laboratory

CAIRA - A Creative Artificially-Intuitive and Reasoning Agent in the Context of Ensemble Music Improvisation

was a project funded by the National Science Foundation within the CreativeIT program (NSF 1002851). The goal of the CAIRA project was to develop an artificially intelligent system capable of improvising live music alongside human performers. We aimed to achieve this goal using a multi-layer structure that simulates core functions of the auditory pathway, higher cognitive functions, and an efferent pathway to produce actual music. The main layers of the CAIRA structure were to include (a schematic code sketch follows the list):
  1. Stages that perform auditory signal analysis and auditory scene analysis to extract musical sound features such as pitch, loudness, roughness, and tempo.
  2. Mid-level brain functions that organize the perceived auditory streams using machine learning algorithms including Hidden Markov Models (HMM) and Empirical Mode Decomposition (EMD).
  3. Functions representing human cognition. For this stage, we placed a strong focus on first-order logic reasoning with the ability to prove musical theorems. Another goal was CAIRA's ability to switch between different music improvisation strategies, represented by logical reasoning on the one hand and artificial intuition on the other.
  4. An effector that performs audio material to broadcast CAIRA's music. The system was conceived either to record material live from the human performance for reuse by CAIRA or to draw audio material from an audio database. In either case, CAIRA would analyze the audio files using auditory scene analysis and machine learning techniques.
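
The following minimal Python sketch illustrates how such a four-layer structure could be wired together. All class, method, and variable names are hypothetical placeholders chosen for illustration; they are not taken from CAIRA's actual implementation, and each real stage (auditory models, HMM/EMD stream organization, theorem proving) was far more elaborate than the stubs shown here.

```python
# Illustrative sketch of a four-layer pipeline in the spirit of the structure
# described above. All names are hypothetical and not from the CAIRA codebase.

class AuditoryFrontEnd:
    """Layer 1: auditory signal/scene analysis (pitch, loudness, roughness, tempo)."""
    def analyze(self, audio_frame):
        # A real front end would run auditory models on the incoming audio;
        # here we only indicate the shape of the resulting feature set.
        return {"pitch": None, "loudness": None, "roughness": None, "tempo": None}

class StreamOrganizer:
    """Layer 2: mid-level organization of auditory streams (e.g., HMM/EMD-based)."""
    def organize(self, features):
        # Group frame-level features into higher-level musical streams/events.
        return [features]

class CognitiveLayer:
    """Layer 3: reasoning and artificial intuition selecting an improvisation strategy."""
    def decide(self, streams):
        # Placeholder heuristic for switching between a logic-based strategy
        # and an intuition-based one.
        strategy = "logic" if len(streams) > 1 else "intuition"
        return {"strategy": strategy, "streams": streams}

class Effector:
    """Layer 4: renders CAIRA's response from live-recorded or database audio material."""
    def perform(self, decision):
        print("playing material under the '%s' strategy" % decision["strategy"])

def caira_step(audio_frame):
    """One pass through the layered structure for a single block of audio."""
    features = AuditoryFrontEnd().analyze(audio_frame)
    streams = StreamOrganizer().organize(features)
    decision = CognitiveLayer().decide(streams)
    Effector().perform(decision)

caira_step(audio_frame=None)  # dummy call; a real system would pass audio blocks
```
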
One of the main challenges of the CAIRA project was to communicate with other musicians on an audio signal level rather than a symbolic music level. Most existing music improvisation and composition systems operate on a symbolic notation, e.g., by using the MIDI protocol to represent musical notes and their durations. This meant that we would need to extract all relevant musical information from the audio signal itself while monitoring more than one human performer. We also wanted to go beyond a note-based score representation and have CAIRA analyze non-pitch-based musical features such as roughness and musical tension in general, using the approach of focal and global attention. To accomplish these goals, we needed new machine learning tools that could create a meaningful feature space for these cues.

With regard to CAIRA's reasoning capabilities, we were eager to have the system "understand" what a music ensemble is and how to operate within the rules of an improvisational music ensemble using a musical calculus. It was also important to us to see how CAIRA would operate within different genres (free music, traditional jazz, classical music) using different approaches (e.g., reasoning, artificial intuition). Finally, it was important for us to develop a system that can actually perform with musicians, not just a theoretical framework.
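
As a concrete illustration of the signal-level analysis described above, the sketch below extracts a few of the mentioned cues from an audio file using the open-source librosa library. This is not CAIRA's own analysis code: the file name is hypothetical, RMS energy only stands in for loudness, and cues such as roughness or musical tension required dedicated auditory models that are not part of standard toolkits.

```python
# Minimal sketch of signal-level feature extraction with librosa (assumed installed).
import numpy as np
import librosa

y, sr = librosa.load("ensemble_take.wav")  # hypothetical recording of the ensemble

# Fundamental-frequency track via the pYIN algorithm
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr)

# Loudness proxy: short-time RMS energy
rms = librosa.feature.rms(y=y)[0]

# Global tempo estimate from onset strength
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)

print("median f0 (Hz):", np.nanmedian(f0))
print("mean RMS:", rms.mean())
print("tempo estimate (BPM):", tempo)

# Roughness and musical tension have no off-the-shelf estimator here; CAIRA
# relied on dedicated auditory models and machine learning for such cues.
```
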

Project Investigators

Jonas Braasch (PI), Selmer Bringsjord (Co-PI), Pauline Oliveros (Co-PI), Doug Van Nort (Post-Doc)

Student Investigators

Nikhil Deshpande, Simon Ellis, Colin Kuebler, Anthony Parks, Naveen Sundar G., M. Torben Pastore, Joe Valerio
 
Publications
  • Braasch, J. (2013). A precedence effect model to simulate localization dominance using an adaptive, stimulus parameter-based inhibition process. Journal of the Acoustical Society of America, 134(1), 420-435.

  • Braasch, J. (2011). A cybernetic model approach for free jazz improvisations. Kybernetes, 40(7/8), 984-994.

  • Van Nort, D., Braasch, J. and Oliveros, P. (2012). Sound Texture Recognition through Dynamical Systems Modeling of Empirical Mode Decomposition. Journal of the Acoustical Society of America, 132, 2734-2744.

  • Van Nort, D., Oliveros, P. and Braasch, J. (2013). Developing Systems for Improvisation based on Listening. Journal of New Music Research, 42(4), 303-324.

  • Braasch, J. (2013). The Microcosm Project: An Introspective Platform to Study Intelligent Agents in the Context of Music Ensemble Improvisation. In: Bader, R. (ed.), Sound - Perception - Performance. Springer, Berlin, Heidelberg, New York, 257-270.

  • Ellis, S., Haig, A., Sundar G., N., Bringsjord, S., Valerio, J., Braasch, J. and Oliveros, P. (2015). Handle: Engineering Artificial Musical Creativity at the "Trickery" Level. In: Besold, T. R., Schorlemmer, M. and Smaill, A. (eds.), Computational Creativity Research: Towards Creative Machines. Springer, Heidelberg, New York, Dordrecht, London. In press.

  • Braasch, J., Blauert, J., Parks, A. J. and Pastore, M. T. (2013). A cognitive approach for binaural models using a top-down feedback structure. 21st International Congress on Acoustics. Montreal, Canada.

  • Ellis, S., Sundar Govindarajulu, N., Valerio, J., Bringsjord, S., Braasch, J. and Oliveros, P. (2013). Creativity in Artificial Intelligence as a Hybrid of Logic and Spontaneity. Computational Creativity, Concept Invention, and General Intelligence (C3GI) Workshop at the 23rd International Joint Conference on Artificial Intelligence (IJCAI). Beijing, China.

  • Braasch, J., Van Nort, D., Oliveros, P. and Krueger, T. (2013). Telehaptic interfaces for interpersonal communication within a music ensemble. 21st International Congress on Acoustics. Montreal, Canada.

Media


Here are two takes from a recording session with CAIRA @ EMPAC:

Jonas Braasch: Soprano Saxophone
Doug Van Nort: granular-feedback expanded instrument system (GREIS)
CAIRA: real-time music improvisation agent, using audio material from Pauline Oliveros (V-Accordion) for this recording





A Robust Distributed Intelligent System for Telematic Music Applications

was a project funded by the National Science Foundation within the CreativeIT program. Within this project, we were interested in developing intelligent agents for improvisational music collaborations over the internet.

Project Investigators

Jonas Braasch, Pauline Oliveros, Doug Van Nort

TriplePoint
Jonas Braasch, Pauline Oliveros, Doug Van Nort
(photo: Jonathan Chen)


Overview

Complex communication for co-located performers within telepresence applications across networks is still impaired compared to that of performers sharing one physical location. This impairment must be significantly reduced to allow the broader community to participate in complex communication scenarios. To achieve this goal, an avatar in the form of a musical conductor with forms of artificial intelligence will coordinate between co-located musicians. Improvised contemporary live music performed by a larger ensemble, serving as a test bed, is arguably one of the most complex scenarios conceivable, because it requires engaged communication between individuals within a multiple-source sound field that also has to be considered as a whole. The results are expected to inspire solutions for other communication tasks.

The avatar system will actively coordinate co-located improvisation ensembles in a creative way. To achieve this goal, Computational Auditory Scene Analysis (CASA) systems, which allow robust feature recognition, will be combined with evolutionary algorithms for the creative component, forming the first model of its kind. The research results are expected to be significant in themselves and are not bound to telematic applications. With regard to the latter, the proposed system will have a clear advantage over a human musician/conductor, even though intelligent algorithms clearly lag behind human performance in most other applications, especially when it comes to creativity.
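
As a rough illustration of the evolutionary component, the sketch below evolves a small vector of musical control parameters toward a target feature profile such as a CASA front end might supply. The parameter names, fitness function, and target values are invented for this example and do not reflect the project's actual algorithms.

```python
# Toy evolutionary loop over hypothetical musical control parameters.
import random

N_PARAMS = 3  # e.g., note density, register, grain size (all normalized to 0..1)

def random_individual():
    return [random.random() for _ in range(N_PARAMS)]

def fitness(individual, target):
    # Negative squared distance: closer to the target profile = higher fitness.
    return -sum((a - b) ** 2 for a, b in zip(individual, target))

def evolve(target, pop_size=30, generations=50, mutation=0.1):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda ind: fitness(ind, target), reverse=True)
        parents = population[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_PARAMS)
            child = a[:cut] + b[cut:]  # one-point crossover
            child = [min(1.0, max(0.0, g + random.gauss(0.0, mutation)))
                     for g in child]  # Gaussian mutation, clipped to [0, 1]
            children.append(child)
        population = parents + children
    return max(population, key=lambda ind: fitness(ind, target))

best = evolve(target=[0.7, 0.3, 0.5])  # hypothetical target profile
print("evolved control parameters:", best)
```
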


Triple Point

Triple Point (Pauline Oliveros, Jonas Braasch & Doug Van Nort) is a trio founded in 2009 with a post-genre approach closely aligned with the Deep Listening practice. The trio's work builds on the new IT tools developed within their CreativeIT grant from the National Science Foundation. The band derives its name from the thermodynamic point in the phase diagram where all three phases of water coexist. Figuratively, this is where the trio operates, exploring musical spaces and boundary conditions where contrasting ideas and streams can co-exist, while expanding the vocabulary of musical instruments acoustically (Braasch on soprano saxophone) and electronically (Oliveros on digital accordion and the Expanded Instrument System, EIS; Van Nort on laptop and GREIS). For many decades Pauline Oliveros has been actively expanding the voice of her main instrument, the accordion. Given the limited natural possibilities of this instrument with respect to sound (fixed tuning, no pitch bends, narrow variety in overtone spectrum), Oliveros began half a century ago to alter the sound of her instrument using tape delays and other electronic devices. Van Nort's work is based in digital signal processing, transforming Oliveros', Braasch's and his own sounds using granular synthesis, psychoacoustically motivated sound analysis tools and genetic algorithms to explore new musical textures and timbres.
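
To give a flavor of the granular techniques mentioned above, the following NumPy sketch scatters short windowed grains taken from a source signal across an output buffer. It is only a toy illustration of the basic idea; it is unrelated to the actual EIS or GREIS implementations, and the sine-tone source merely stands in for a recorded instrument.

```python
# Toy granular-synthesis sketch: windowed grains from a source signal are
# overlap-added at random positions in a longer output buffer.
import numpy as np

def granulate(source, sr, grain_ms=50, n_grains=400, stretch=1.5):
    grain_len = int(sr * grain_ms / 1000)
    window = np.hanning(grain_len)
    out = np.zeros(int(len(source) * stretch))
    for _ in range(n_grains):
        start = np.random.randint(0, len(source) - grain_len)
        grain = source[start:start + grain_len] * window
        pos = np.random.randint(0, len(out) - grain_len)
        out[pos:pos + grain_len] += grain  # overlap-add at a random output position
    return out / (np.max(np.abs(out)) + 1e-12)  # normalize to avoid clipping

sr = 44100
t = np.arange(2 * sr) / sr
source = np.sin(2 * np.pi * 220 * t)  # placeholder source sound
texture = granulate(source, sr)
# The result could be written to disk with, e.g., soundfile.write("texture.wav", texture, sr)
```
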

Publications

J. Braasch (2009) The Telematic Music System: Affordances for a New Instrument to Shape the Music of Tomorrow, Contemporary Music Review, 28(4): 421-432

J. Braasch, C. Chafe, P. Oliveros, D. Van Nort (2009) Mixing Console Design Considerations for Telematic Music Applications, Proc. 127th Audio Engineering Society Convention, Preprint 7942

J. Braasch (2009) Importance of visual cues in networked music performances, J. Acoust. Soc. Am. (conference abstract), 125, pp. 2516

D. Van Nort, J. Braasch, P. Oliveros (2009)  A system for musical improvisation combining sonic gesture recognition and genetic algorithms, in: Proceedings of the SMC 2009 - 6th Sound and Music Computing Conference, 23-25 July 2009, Porto, Portugal, 131-136.


Music Releases

Triple Point (2009) Sound Shadows (full digital album, six tracks), Jonas Braasch, soprano saxophone; Stuart Dempster, trombone, didjeridu, little instruments; Shane Myrbeck, mixing and mastering; Pauline Oliveros, digital accordion, Expanded Instrument System (EIS, Track 3); Doug Van Nort, laptop, GREIS, additional mixing and spectral processing on Tracks 4 and 6. Deep Listening Records, DL-DD-1

Concerts

March 12, 2009: Triple Point (Jonas Braasch, saxophone; Pauline Oliveros, accordion; Doug Van Nort, live electronics) at the Emily Harvey Foundation, 537 Broadway, NYC, at 8 pm, sponsored by the Deep Listening Institute, Ltd.

April 3, 2009: Triple Point (Jonas Braasch, saxophone; Pauline Oliveros, accordion; Doug Van Nort, live electronics), NYCEMF festival, New York City.

September 14, 2009: Triple Point (Jonas Braasch, saxophone; Pauline Oliveros, accordion; Doug Van Nort, live electronics) at Harvestworks, New York City.

October 18, 2009: Triple Point (Jonas Braasch, saxophone; Pauline Oliveros, accordion; Doug Van Nort, live electronics), Roulette, part of the New York Electronic Art Festival, New York City.

Listen to a recent excerpt from their concert at the New York Electronic Art Festival in New York City (October 18, 2009). Two other works by Moritz Wettstein and Lucky Dragons are also shown in this clip:



Here is another excerpt from our concert at the Emily Harvey Foundation (March 12, 2009, New York City), in which a number of new developments from the CreativeIT project were presented:



Here is a short excerpt from our telematic work:




Tintinnabulate & related Courses

Students are able to participate in the project through seminars that are linked to it. The following excerpt shows a concert at EMPAC and Second Life with the RPI ensemble Tintinnabulate.

Tintinnabulate and our CreativeIT project play an important role in our ongoing seminar on Music Composition, Improvisation and Telematic Music. The current seminar is:

"Experimental Telepresence" (ARTS 4962/6962, instructor: Pauline Oliveros, co-instructors: Jonas Braasch, Doug Van Nort) Spring, 2010

and these were the related seminars in the past:

"Composition, Improvisation & Performance" (ARTS 4964/6964, instructor: Pauline Oliveros, co-instructors: Jonas Braasch, Doug Van Nort), Fall 2009/10.
"Experimental Telepresence"  (ARTS-4962/6962, 4 credits, instructor: Pauline Oliveros, co-instructor: Jonas Braasch), Spring 2009.

"Mixed Reality Seminar" (ARTS-4961/6961, instructor: Pauline Oliveros, co-instructor: Jonas Braasch), Fall 2008/09.

Here is a short clip of Tintinnabulate in a Concert at EMPAC and Second Life:


Useful Links

Telematic Circle







copyright © 2009 - CA3RL, Rensselaer Polytechnic Institute, Troy, New York