Information

Type
Scientific and/or technical conference
Performance location
Ircam, Salle Igor-Stravinsky (Paris)
Duration
57 min
Date
June 2, 2012

Abstract: Although MIR arguably did not start as a research discipline for promoting creativity and music performance, this trend has gained importance in recent years. The possibilities of MIR for supporting musical creation and music education are indeed manifold. While the use of MIR techniques for real-time music creation may help both experts and complete novices explore new creative musical universes lying somewhere between micro-level synthesis control and macro-level remixing, the application of MIR tools to music education and children's musical performance is another unexplored area with radically new possibilities. In this talk we will describe some applications of MIR techniques to music and multimodal creation recently developed at the MTG and at Reactable Systems, and we will explore the potential of MIR for children's music education and performance.

Bio: Sergi Jordà (1961) holds a B.S. in Fundamental Physics and a Ph.D. in Computer Science and Digital Communication. He is a researcher in the Music Technology Group of Universitat Pompeu Fabra in Barcelona, where he specializes in real-time interaction and tabletop interfaces, and an Associate Professor at the same university, where he teaches computer music, HCI, and interactive media arts. He has written many articles and books, given workshops, and lectured throughout Europe, Asia, and America, always seeking to bridge HCI, music performance, and interactive media arts. He has received several international awards, including the prestigious Ars Electronica Golden Nica in 2008. He is currently best known as one of the inventors of the Reactable, a tabletop musical instrument that achieved mass popularity in 2007 after being integrated into Icelandic artist Björk's Volta world tour. He is also one of the founding partners of Reactable Systems, a spin-off company created in 2009 (www.reactable.com).

From the same archive

Introduction - Geoffroy Peeters

June 2, 2012 16 min

Video

Audio descriptors: a major challenge for real-time composition

Abstract: One of the most forward-looking aspects of the instrument/machine relationship lies in the development of means for real-time acoustic analysis of instrumental and vocal sounds, known as audio descriptors. The number of the

June 2, 2012 56 min

Video

VirtualBand, a MIR-approach to interactive improvisation - François Pachet

Abstract: This talk introduces a new music interaction system based on style modeling, in a MIR-oriented perspective called VirtualBand. VirtualBand aims at combining musical realism and quality with real-time interaction, by capturing esse

June 2, 2012 54 min

Video

Round table

June 2, 2012 44 min

Video

Information retrieval and deployment in interactive improvisation systems - Gérard Assayag

Abstract: Interactive Improvisation Systems involve at least three cooperating and concurrent expert agents: machine listening, machine learning, model based generation. Machine listening may occur during the initial learning stage (off-lin

June 2, 2012 59 min

Video

Gestural Re-Embodiment of Digitized Sound and Music - Norbert Schnell

Abstract: Over the past years, music information research has elaborated powerful tools for creating a new generation of applications that redefine the boundaries of music listening and music making. The recent availability of affordable mo

June 2, 2012 28 min

Video

Playing with Music - Tristan Jehan

Abstract: For the past 60 years, machines have been involved in all aspects of music: playing, recording, processing, editing, mixing, composing, analyzing, and synthesizing. However, in software terms, music is nothing but a sequence of nu

June 2, 2012 59 min

Video

Interactive Exploration of Sound Corpora for Music Performance and Composition - Diemo Schwarz

Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexation, and retrieval of music and sound can be easily (ab)used for the creation of new musical material and sound design. Based on automa

June 2, 2012 44 min

Video

IRCAM

1, place Igor-Stravinsky
75004 Paris
+33 1 44 78 48 43

Opening times

Monday through Friday 9:30am-7pm
Closed Saturday and Sunday

Subway access

Hôtel de Ville, Rambuteau, Châtelet, Les Halles

Institut de Recherche et de Coordination Acoustique/Musique

Copyright © 2022 Ircam. All rights reserved.