The Brain Orchestra – neural activity sonification

Neural activity data sonification, with Sébastien Wolf of the ENS Institute of Biology.

Data sonification could be an effective tool for neuroscience research, complementing data visualization. Recent advances in brain imaging have made it possible to record the activity of tens of thousands of mammalian neurons simultaneously, in real time. The spatial and temporal dynamics of neuron activation can then be translated into triggered sounds, according to the functional groups to which the neurons belong.

We have developed software to load such datasets as binary matrices and translate them into MIDI messages, triggering notes whose velocity is a function of neuronal activity. To process this vast quantity of data (several tens of thousands of neurons over several tens of thousands of samples), the software allows neurons to be gathered into sub-groups, either following those proposed in common brain atlases or in an arbitrary manner. The same interface can also be used to sonify continuous data sets from electroencephalography (EEG) recordings of human brain activity.
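As a rough illustration of this mapping (the actual software is built in Max; the Python sketch below is only a conceptual analogue, and the file name, group layout and threshold are hypothetical):

```python
# Conceptual sketch, not the actual Max software: sonify a binary
# neural-activity matrix as MIDI notes, one pitch per neuron group,
# with note velocity driven by the group's mean activity.
import time
import numpy as np
import mido

activity = np.load("activity.npy")         # shape: (n_neurons, n_samples), 0/1
groups = {                                  # hypothetical sub-groups -> MIDI pitch
    "forebrain": (slice(0, 5000), 60),
    "midbrain":  (slice(5000, 12000), 64),
    "hindbrain": (slice(12000, None), 67),
}

out = mido.open_output()                    # default MIDI output port
for t in range(activity.shape[1]):
    for name, (idx, pitch) in groups.items():
        level = activity[idx, t].mean()     # fraction of active neurons in group
        if level > 0.01:                    # hypothetical activity threshold
            out.send(mido.Message("note_on", note=pitch,
                                  velocity=min(127, int(level * 127))))
            out.send(mido.Message("note_off", note=pitch))
    time.sleep(0.05)                        # pacing; real playback would follow
                                            # the recording's sample rate
```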

This software, developed with Max, can be used as a stand-alone program, but it can also be loaded directly as a plugin into the Ableton Live digital audio workstation. This makes the software easy to get to grips with, and enables testing of different mappings between neural activity data and musical values: which chords, which harmonic progressions, which orchestration, and so on translate the data set in the most interesting way, from the point of view of scientific understanding and/or musical aesthetics.

Staccato – vibrification of musical sound streams

Vibrotactile transformation of musical audio streams

Vibrification, or the transformation of audio streams into tactile vibration streams, involves developing transformation algorithms that translate perceptible cues in the audible domain into vibratory cues in the tactile domain. Although auditory and tactile perception have similarities, and in particular share part of their sensitive frequency range, they differ in many respects. Consequently, vibrifying an audio signal requires a set of strategies for selecting the elements to be translated into vibrations. To this end, as part of the “Staccato” project, a set of free and open-source tools for the Max software has been developed:

  • a framework based on the “Model-View-Controller” (MVC) pattern to facilitate settings and experimentation
  • a set of algorithms enabling adaptation to different types of vibrotactile transducers and offering a choice among various vibrification strategies, adaptable to the content of the audio signal and the user’s preferences (one such strategy is sketched below).

This set of tools aims to ease the exploration of the vibrotactile modality for musical sound diffusion, supported by a recent boom in technical devices enabling its implementation.
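For example, one common strategy is to follow the audio’s amplitude envelope and use it to modulate a carrier inside the tactile frequency band. A minimal sketch of that idea, assuming NumPy/SciPy (the Staccato tools themselves are Max patches, and the cutoff and carrier values below are illustrative):

```python
# Schematic vibrification strategy (illustrative, not the Staccato Max code):
# extract the audio's amplitude envelope and amplitude-modulate a carrier
# inside the vibrotactile band, where skin sensitivity is strongest.
import numpy as np
from scipy.signal import butter, lfilter

def vibrify(audio, sr, carrier_hz=150.0, env_cutoff_hz=20.0):
    # Envelope follower: full-wave rectification + low-pass filter.
    b, a = butter(2, env_cutoff_hz / (sr / 2), btype="low")
    envelope = lfilter(b, a, np.abs(audio))
    # Amplitude-modulate a carrier within the tactile band (roughly 40-250 Hz).
    t = np.arange(len(audio)) / sr
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
```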

The Staccato project was funded by the French National Research Agency (ANR-19-CE38-0008) and coordinated by Hugues Genevois from the Lutherie-Acoustics-Music team at the ∂’Alembert Institute, Sorbonne University.

Code is available on GitHub.

Reference

V. Goudard and H. Genevois, « Transformation vibrotactile de signaux musicaux », in Actes des Journées d’Informatique Musicale (JIM’24), Laboratoire PRISM / Association Francophone d’Informatique Musicale, Marseille, France, May 2024. [online]

attracteurs étranges / forages

High-resolution digital prints by FORAGES.
Pigment ink on Canson Photo Satin Premium RC 270 g paper, 600 dpi.
Dimensions: 40×60 cm.
Single copies, signed by the artists.
Contact me for price and availability [here].

“Attracteurs étranges” is a series of digital images created with custom-made chaotic algorithms for the performance “FIB_R” by FORAGES [Gladys Brégeon & Vincent Goudard].
The title refers to the mathematical object of the same name, whose nature is both chaotic and organized, through aperiodic states. According to the mathematician David Ruelle, who coined the term, strange attractors should help us to “elucidate the fundamental mechanisms of turbulence, chemical reactions, weather forecasting and bacterial population genetics”.
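For illustration only (the prints were generated with custom algorithms, and the parameters below are not theirs), a generic strange attractor can be rendered by iterating a chaotic two-dimensional map and accumulating the visited points:

```python
# Generic strange-attractor sketch: iterate a chaotic 2-D map (here a
# Clifford attractor, with illustrative parameters) and plot the orbit.
import numpy as np
import matplotlib.pyplot as plt

a, b, c, d = -1.4, 1.6, 1.0, 0.7            # Clifford attractor parameters
x, y = 0.1, 0.0
xs, ys = [], []
for _ in range(200_000):
    # Update both coordinates simultaneously from the previous state.
    x, y = (np.sin(a * y) + c * np.cos(a * x),
            np.sin(b * x) + d * np.cos(b * y))
    xs.append(x)
    ys.append(y)

plt.figure(figsize=(6, 6))
plt.scatter(xs, ys, s=0.05, alpha=0.3, color="black")
plt.axis("off")
plt.show()
```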

Note that the previews below are low-resolution.

attracteur étrange 32
attracteur étrange 39
attracteur étrange 48
attracteur étrange 64
attracteur étrange 70
attracteur étrange 73

AIM-Framework

Designing a complete in-car audio experience requires rapid prototyping solutions within a complex audio configuration, bringing together areas of expertise ranging from sound design and composition down to hardware protection, with every conceivable layer of audio engineering in between, up to A/B comparison setups for end-user perception evaluation in real demonstration vehicles.

The AIM project started as a request from the Active Sound eXperience team at Volvo Cars to meet such goals. To this end, it was decided to develop a framework on top of Max/MSP, so that dedicated audio processing modules could be easily created, with the ability to store presets for various configurations, and to take advantage of Max’s modular design to distribute the complexity of audio engineering among the various expert teams involved in the project.
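As a rough conceptual analogue of this module/preset idea (the actual framework is a set of Max/MSP abstractions; every name below is hypothetical):

```python
# Hypothetical analogue of the framework's module/preset pattern: each
# audio-processing module exposes parameters and named, recallable presets.
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    params: dict = field(default_factory=dict)
    presets: dict = field(default_factory=dict)

    def store_preset(self, preset_name: str):
        self.presets[preset_name] = dict(self.params)  # snapshot current state

    def recall_preset(self, preset_name: str):
        self.params.update(self.presets[preset_name])

# Each expert team maintains its own modules; whole configurations can be
# recalled per demonstration setup (e.g. for A/B comparisons).
eq = Module("cabin_eq", params={"low_gain": 0.0, "high_gain": 0.0})
eq.store_preset("flat")
eq.params["low_gain"] = 3.0
eq.store_preset("bass_boost")
eq.recall_preset("flat")
```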

The core part of the package (building blocks) was presented at the Sound and Music Computing Conference (SMC’22) organized by GRAME in Saint-Étienne, France.

Summary: https://zenodo.org/record/6800815

ReCoDIN [PhD]

ReCoDIN stands for “representation and control in the interactive design of digital musical instruments”, the topic of a doctoral research project started in 2016.

Abstract: Digital musical instruments appear as complex objects: they stand in a continuum with the history of lutherie, while also being marked by the strong disruption brought about by digital technology and its consequences in terms of sonic possibilities, relations between gesture and sound, listening situations, reconfigurability of instruments, and so on. This doctoral work seeks to describe the characteristics originating from the integration of digital technology into musical instruments, drawing notably on musicological reflection, software and hardware development, musical practice, and a number of interactions with other musicians, instrument makers, composers and researchers.

This PhD was conducted under the joint supervision of Jean-Dominique Polack from the Lutherie-Acoustics-Music team at Institut ∂’Alembert (LAM, CNRS-UMR 7190) and Pierre Couprie from the Research Institute in Musicology (IReMus, CNRS-UMR 8223).

Advisor: Hugues Genevois from the Lutherie-Acoustics-Music team at Institut ∂’Alembert (CNRS-UMR 7190).

This research was supported by Collegium Musicæ at Sorbonne Université.

Tools

This research led to the development of various open-source tools and software, some of which are described in academic publications (see below). Feel free to fork them on GitHub!

  • LAM-lib: a random collection of objects and utilities for digital lutherie in Max.
  • ModularPolyphony (MP): a protocol and set of abstractions in Max, allowing expressive control of polyphonic processes connected in a modular way.
  • ModularPolyphony-TUI (MP-TUI): a set of objects and utilities built on top of MP, meant for designing custom multitouch tangible user interfaces (TUI).
  • Sagrada: a library for audio-rate control of modular processes, particularly targeted at granular synthesis.
  • John, the Semi-Conductor: a web-based collective score generator, editor and player, crafted to help collective free improvisation of electroacoustic music.

Related publications
  • V. Goudard, « Représentation et contrôle dans le design interactif des instruments de musique numériques », PhD thesis, 2020. [online]
  • V. Goudard, « Ephemeral instruments », in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME’19), Porto Alegre, Brazil, 2019, p. 349–354. [online]
  • V. Goudard, « John, the Semi-Conductor: A Tool for Comprovisation », in Proceedings of the International Conference on Technologies for Music Notation and Representation – TENOR’18, Montreal, Canada, 2018, p. 43–49. [online]
  • V. Goudard, « Ergonomics of touch-screen Interfaces », in Proceedings of the International Conference on Live Interfaces (ICLI’18), Porto, Portugal, 2018. [online]
  • V. Goudard et H. Genevois, « Mapping modulaire de processus polyphoniques », in Actes des Journées d’Informatique Musicale (JIM’17), 2017. [online]

Xypre — a sound-enhanced multitouch screen interface

Xypre is a multitouch-screen-based interface, enhanced with contact microphones and tactile speakers, which I designed for live performance. It was built in just a few days for an upcoming performance. I play it in FIB_R and with the ONE ensemble.

Making-of timelapse.

 

Sagrada — Sample Accurate Granular Synthesis

 

Sagrada is an open-source Max package performing sample-accurate granular synthesis in a modular way. Grains can be triggered both synchronously and asynchronously, and each grain can have its own effects and envelopes (for instance the first “attack” and last “release” grains of a grain stream).
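For illustration, here is a schematic overlap-add granulator in Python (Sagrada itself is implemented in Max; the grain size, hop and envelope below are illustrative):

```python
# Schematic granular synthesis by overlap-add (not the Sagrada code itself):
# each grain is a windowed slice of the source, placed at an exact sample
# offset in the output, i.e. a synchronous grain stream.
import numpy as np

def granulate(source, sr, grain_ms=50, hop_ms=25, duration_s=2.0):
    grain_len = int(sr * grain_ms / 1000)
    hop = int(sr * hop_ms / 1000)            # inter-grain onset interval
    window = np.hanning(grain_len)           # per-grain amplitude envelope
    out = np.zeros(int(sr * duration_s) + grain_len)
    rng = np.random.default_rng()
    for onset in range(0, int(sr * duration_s), hop):
        start = rng.integers(0, len(source) - grain_len)  # random read position
        out[onset:onset + grain_len] += source[start:start + grain_len] * window
    return out / np.max(np.abs(out))         # normalize the overlap-add result
```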

You can get it from the GitHub repository:

https://github.com/vincentgoudard/Sagrada

Sagrada screenshot
sagrada.play~ plays grains synchronously or asynchronously (click for video demo)

Sagrada multilayers
sagrada.multilayer~ allows for running multiple streams of grains in parallel (click for video demo)

Sagrada was partly developed during my PhD at LAM. It was inspired by the very good GMU tools developed at GMEM (for their sample-rate triggering) and the FTM package developed at IRCAM (for its modularity), not to mention all of Curtis Roads’ work on granular synthesis.

raspiCamGrab for Peauème by Gladys Brégeon

Development of an autonomous microscope-camera device, built with openFrameworks and running on a cased Raspberry Pi, for the art installation peauème by artist Gladys Brégeon.

The program boots on its own, with a hidden boot sequence, so that it can be started easily (and elegantly) in the exhibition space. It allows adjusting hue, saturation, brightness, RGB gains and contrast.
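As a rough Python analogue of the camera-settings side (the actual program is written in C++ with openFrameworks; the values below are illustrative, not the installation’s settings), using the picamera library:

```python
# Illustrative picamera sketch of the adjustable camera settings
# (the real device uses openFrameworks/C++; values are arbitrary examples).
from picamera import PiCamera

camera = PiCamera()
camera.brightness = 55          # 0..100
camera.saturation = 10          # -100..100
camera.contrast = 20            # -100..100
camera.awb_mode = "off"         # disable auto white balance...
camera.awb_gains = (1.6, 1.3)   # ...and set red/blue gains manually (RGB gains)
camera.start_preview()          # full-screen live view for the exhibition
```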

The code is open-source and available here: https://github.com/vincentgoudard/raspiCamGrab

Peauème, ©2019 Gladys Brégeon

John — the semi-conductor (reactive web version)

John (“the semi-conductor”) is an open-source software application designed to support collective free improvisation. It provides a constraint-based score generator and displays screen scores running on distributed, reactive web browsers.
The musicians can then concurrently edit the scores in their own browsers. One of John’s original features is that its design takes care to leave the musicians’ attention as free as possible, relying on large colorful blocks and minimal textual data.
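As a toy illustration of constraint-based score generation (John itself is a web application; the vocabulary and constraints below are hypothetical):

```python
# Toy constraint-based score generator, illustrating the idea only:
# draw simple playing instructions at random, subject to constraints.
import random

DYNAMICS = ["pp", "mf", "ff"]
MODES = ["drone", "pulse", "texture", "silence"]

def generate_score(n_sections=6):
    score, previous = [], None
    for _ in range(n_sections):
        while True:
            section = (random.choice(MODES), random.choice(DYNAMICS))
            # Constraint 1: never repeat the same mode twice in a row.
            if previous and section[0] == previous[0]:
                continue
            # Constraint 2 (hypothetical): silence is always approached quietly.
            if section == ("silence", "ff"):
                continue
            break
        score.append(section)
        previous = section
    return score

print(generate_score())  # e.g. [('pulse', 'mf'), ('drone', 'pp'), ...]
```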

John is used by ONE, an ensemble playing improvised electroacoustic music with digital musical instruments. John was presented at the TENOR’2018 conference in Montreal, Canada [pdf here].

Table Sonotactile Interactive

Image ©Anne Maregiano.

The Interactive Sonotactile Table is a device invented for the Maison des Aveugles (“House of the Blind”) in Lyon by French composer Pascale Criton, in collaboration with Hugues Genevois from the Lutherie-Acoustics-Music team of the Jean Le Rond d’Alembert Institute and Gérard Uzan, a researcher in accessibility. The table was designed by Pierrick Faure (Captain Ludd) in collaboration with Christophe Lebreton (GRAME).

I programmed the embedded Arduino boards as well as the Max patch for the gesture/sound interaction design.

The Table Sonotactile Interactive is part of a larger project: La Carte Sonore by Anne Maregiano at the Villa Saint Raphaël: https://www.mda-lacartesonore.com.