Casual Creators: How New Tools are Changing Artistic Practices

Watch the replay of the webinar here:

Professional musicians use tools to be creative, but far more amateurs enjoy being creative, albeit in casual ways. During this event, we will explore whether digital and other systems can be designed to help these casual users tap into their creative sides. In addition, we will discuss the design of creativity-augmentation systems that rely on computational creativity.

October 19, 2021 – 11:30 a.m.

Panelists:

This online event is free and open to the public.

Biographies:

  • Arshia Cont has been the CEO and co-founder of the music tech startup Antescofo, the company behind the mobile application Metronaut, since 2016. Antescofo was created where the best of research and entrepreneurship met: its technology was conceived during more than seven years of R&D at IRCAM in Paris, the world’s largest music and technology laboratory, and tested with the world’s greatest musicians and orchestras. After completing a PhD in Computer Music at UC San Diego, he served as a research scientist from 2008 to 2016 at IRCAM, a center dedicated to fostering music technology and artistic creativity, and as head of the MuTant research team at INRIA. During the same period, he acted as director of Research/Creativity Interfaces at the institute, where he founded the IRCAM Musical Research Residency program and brought the IRCAM Forum into the social network era. He created the award-winning Antescofo technology in 2008, joining artificial intelligence and music in a real-time human-computer experience.
  • Keith Groover is a musician, inventor, teacher, and the 2019 winner of the international Guthman Musical Instrument Competition for his instrument, The Glide. The Glide is an accelerometer-based melodic instrument designed to be physically, technically, and financially accessible to all. He lives in Spartanburg, USA, and is a freelance composer, performer, and one-half of the guitar/cello duo WireWood.
  • Edgar Hemery serves as the CEO of Embodme, which brings the richness of musical gesture to the world of synthesizers. Embodying musical expressivity means sensing every finger touch, attack, aftertouch, and preparation gesture; Embodme’s technology fuses pressure-sensitive surfaces with computer vision that captures mid-air gestures. Edgar Hemery’s background is at the crossroads of human-computer interaction, machine learning, and signal processing. Previous experience in academic research (University of Edinburgh, IRCAM, Mines ParisTech) led him to specialize in the design of musical interactions based on gesture recognition, and then to the creation of Embodme.
  • Dr. Adrien Mamou-Mani serves as the CEO of HyVibe, whose mission is to provide excellent sound quality to connected objects by “exciting” them with smart, connected vibration technology. Prior to creating HyVibe, he was the lead researcher of IRCAM’s Instrumental Acoustics team. He holds a PhD in Acoustics and Mechanics from Pierre and Marie Curie University (Paris) and has been a post-doctoral researcher at the Paris Philharmonic Museum and, as a Newton Fellow, at the Open University Acoustics Laboratory (United Kingdom). Adrien is recognized as a world expert in the vibration and control of musical instruments.
  • Jean-Louis Giavitto is a senior computer scientist at CNRS. His work at IRCAM has focused on the representation and manipulation of musical objects, for music analysis, composition and live performance. He designed and developed the real-time reactive programming language of the Antescofo system. This programming language, together with a listening module, is used for the production of mixed music pieces at IRCAM and elsewhere in the world, synchronizing human performance with electronic actions. This technology now benefits everyone, thanks to the creation of a spinoff to develop and distribute an automatic accompaniment application. He was also deputy director of the joint research lab IRCAM-CNRS-Sorbonne University until September 2021.
  • Jason Freeman is a Professor of Music at Georgia Tech and Chair of the School of Music. His artistic practice and scholarly research focus on using technology to engage diverse audiences in collaborative, experimental, and accessible musical experiences. He also develops educational interventions in K-12, university, and MOOC environments that broaden and increase engagement in STEM disciplines through authentic integrations of music and computing. His music has been performed at Carnegie Hall, exhibited at ACM SIGGRAPH, published by Universal Edition, broadcast on public radio’s Performance Today, and commissioned through support from the National Endowment for the Arts. Freeman’s wide-ranging work has attracted over $10 million in funding from sources such as the National Science Foundation, Google, and Turbulence. It has been disseminated through over 80 refereed book chapters, journal articles, and conference publications. Freeman received his B.A. in music from Yale University and his M.A. and D.M.A. in composition from Columbia University.


This event is organized by IRCAM with the support of the Atlanta Office of the Cultural Services of the Embassy of France in the United States. IRCAM, the Institute for Research and Coordination in Acoustics/Music (Institut de recherche et coordination acoustique/musique), is a French institute dedicated to research in music and sound, especially in the fields of avant-garde and electroacoustic art music.