Diving into unknown pleasures

Joy Division - "Unknown Pleasures" album cover

There is something intriguing about the cover of Joy Division's album "Unknown Pleasures". The image is often attributed to Peter Saville, designer for the Factory Records label at the time of the album's release, but all he did was invert, white on black, a drawing that had already appeared in Scientific American.

Continue reading “Diving into unknown pleasures”

The Brain Orchestra – neural activity sonification

Neural activity data-sonification, with Sébastien Wolf of the ENS Institute of Biology.

Data sonification could be an effective tool for neuroscience research, complementing data visualization. Recent advances in brain imaging have made it possible to record the activity of tens of thousands of mammalian neurons simultaneously in real time. The spatial and temporal dynamics of neuron activation can be translated into triggered sound events, according to the functional groups to which the neurons belong.

We have developed software to load such datasets as binary matrices and translate them into MIDI messages, triggering notes whose velocity is a function of neuronal activity. In order to process this vast quantity of data — several tens of thousands of neurons over several tens of thousands of samples — the software enables neurons to be associated in sub-groups, such as those proposed in common atlases, or in an arbitrary manner. The same interface can also be used to sonify continuous data sets from electroencephalography recordings of human brain activity.
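As a rough illustration of this kind of mapping (in Python/NumPy rather than Max, with hypothetical names and a toy threshold), the core idea of turning a neuron-activity matrix into note events whose velocity follows sub-group activity could be sketched as:

```python
import numpy as np

def sonify_matrix(activity, group_ids, notes, threshold=0.2):
    """Map a neuron-activity matrix to MIDI-like note events.

    activity  : (n_neurons, n_samples) array of activity values in [0, 1]
    group_ids : (n_neurons,) array assigning each neuron to a sub-group
    notes     : dict {group_id: MIDI note number} (hypothetical mapping)
    Returns a list of (sample_index, midi_note, velocity) events.
    """
    events = []
    for g, note in notes.items():
        # Mean activity of the sub-group at each time step
        group_mean = activity[group_ids == g].mean(axis=0)
        for t, a in enumerate(group_mean):
            if a >= threshold:
                # Velocity is a function of neuronal activity (1..127)
                velocity = int(round(1 + a * 126))
                events.append((t, note, velocity))
    return events

# Toy example: 4 neurons in 2 sub-groups, 3 time samples
activity = np.array([[1.0, 0.0, 0.5],
                     [1.0, 0.0, 0.5],
                     [0.0, 1.0, 0.0],
                     [0.0, 1.0, 0.0]])
group_ids = np.array([0, 0, 1, 1])
events = sonify_matrix(activity, group_ids, {0: 60, 1: 64})
```

The actual software emits these events as MIDI note-on messages; grouping neurons before mapping is what keeps tens of thousands of channels tractable.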

This software, developed with Max, can be used as a stand-alone program, but can also be loaded directly as a plugin into the Ableton Live digital audio workstation. This makes it easy to get to grips with, and to test different mappings between neural activity data and musical values: which chords, which harmonic progressions, which orchestration, etc. translate the dataset in the most interesting way, from the point of view of scientific understanding and/or musical aesthetics.

Staccato – Vibrification of musical sound streams

Vibrotactile transformation of musical audio streams

Vibrification, the transformation of audio streams into tactile vibration streams, involves developing transformation algorithms that translate perceptible cues in the audible domain into vibratory cues in the tactile domain. Although auditory and tactile perception have similarities, and in particular share part of their sensitive frequency range, they differ in many respects. Consequently, vibrifying an audio signal requires a set of strategies for selecting the elements to be translated into vibrations. To this end, as part of the "Staccato" project, a set of free and open-source tools has been developed for the Max software:

  • a framework based on the "Model-View-Controller" (MVC) pattern, to facilitate settings and experimentation
  • a set of algorithms allowing adaptation to different types of vibrotactile transducers, and offering a choice between various vibrification strategies, adjustable according to the content of the audio signal and the user’s preferences.

This set of tools aims to ease the exploration of the vibrotactile modality for musical sound diffusion, supported by a recent boom in technical devices enabling its implementation.
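As a toy illustration of one such strategy (a Python sketch, not the actual Max implementation), an amplitude-envelope follower can transfer the dynamics of the audio stream onto a sine carrier placed in the skin's most sensitive range, around 200–300 Hz; parameter names and values here are illustrative:

```python
import numpy as np

def vibrify(audio, sr=44100, carrier_hz=250.0, smooth=0.995):
    """Simple vibrification strategy: amplitude-envelope transfer.

    Follows the amplitude envelope of the audio with a one-pole
    low-pass on the rectified signal, then uses that envelope to
    modulate a sine carrier inside the tactile frequency range.
    """
    env = np.empty_like(audio)
    state = 0.0
    for i, x in enumerate(np.abs(audio)):            # rectify
        state = smooth * state + (1.0 - smooth) * x  # one-pole low-pass
        env[i] = state
    t = np.arange(len(audio)) / sr
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return env * carrier

# A 100 ms noise burst becomes a 250 Hz vibration with the same envelope
rng = np.random.default_rng(0)
audio = rng.standard_normal(4410) * np.hanning(4410)
vib = vibrify(audio)
```

Real strategies would of course be content-aware (transients, pitch, spectral balance), which is exactly why the toolset leaves the choice of strategy open.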

The Staccato project was funded by the French National Research Agency (ANR-19-CE38-0008) and coordinated by Hugues Genevois from the Luthery-Acoustics-Music team at the d’Alembert Institute, Sorbonne University.

Code is available on GitHub.

Reference

Vincent Goudard, Hugues Genevois. Transformation vibrotactile de signaux musicaux. Journées d’informatique musicale, Laboratoire PRISM; Association Francophone d’Informatique Musicale, May 2024, Marseille, France. [online]

AIM-Framework

Designing a complete in-car audio experience requires rapid prototyping solutions for a complex audio configuration, bringing together areas of expertise ranging from sound design and composition to hardware protection, with every layer of audio engineering in between, up to A/B comparison setups for end-user perception evaluation in real demonstration vehicles.

The AIM project started as a request from the Active Sound eXperience team at Volvo Cars to meet these goals. To this end, it was decided to build a framework on top of Max/MSP, so that dedicated audio processing modules could be created easily, with the ability to store presets for various configurations, and to take advantage of Max’s modular design to distribute the complexity of audio engineering among the expert teams involved in the project.
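To give a feel for the "modules with storable presets" idea (a hypothetical sketch in Python, not the AIM building blocks themselves, with invented names), a minimal preset-aware module might look like:

```python
import json

class AudioModule:
    """Hypothetical preset-aware processing module, in the spirit of
    (but not identical to) a modular audio framework's building blocks."""

    def __init__(self, name, params):
        self.name = name
        self.params = dict(params)  # e.g. {"limit_db": -3.0, "hpf_hz": 30.0}

    def store_preset(self, path):
        # Persist this module's current configuration
        with open(path, "w") as f:
            json.dump({self.name: self.params}, f, indent=2)

    def recall_preset(self, path):
        # Restore a previously stored configuration
        with open(path) as f:
            self.params.update(json.load(f)[self.name])

# Each expert team owns its own modules; presets capture a configuration
protect = AudioModule("subwoofer_protect", {"limit_db": -3.0, "hpf_hz": 30.0})
protect.store_preset("car_demo.json")
protect.params["limit_db"] = 0.0        # someone tweaks the setting...
protect.recall_preset("car_demo.json")  # ...and the preset restores it
```

The design choice this illustrates is separation of concerns: each module carries its own parameters and presets, so different teams can work on their layer without touching the others.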

The core part of the package (building blocks) was presented at the Sound and Music Computing Conference (SMC’22), organized by GRAME in Saint-Étienne, France.

Summary: https://zenodo.org/record/6800815

Sagrada — Sample Accurate Granular Synthesis


Sagrada is an open-source Max package performing sample-accurate granular synthesis in a modular way. Grains can be triggered both synchronously and asynchronously, and each grain can have its own effects and envelopes (for instance, the first “attack” and last “release” grains of a grain stream).
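The scheduling idea can be sketched in a few lines of Python/NumPy (a simplified illustration, not the Sagrada internals): each grain starts at an exact sample offset, is windowed, and is summed into the output, whether the onsets are regular (synchronous) or arbitrary (asynchronous).

```python
import numpy as np

def granulate(source, onsets, grain_len, pitch=1.0):
    """Sample-accurate grain scheduling sketch.

    Each grain starts exactly at the given sample onset, reads
    grain_len samples from the source (optionally transposed by a
    playback rate), applies a Hann envelope, and is summed into
    the output buffer.
    """
    out = np.zeros(max(onsets) + grain_len)
    window = np.hanning(grain_len)
    for on in onsets:
        # Read from the source starting at the grain's onset position
        idx = (on + np.arange(grain_len) * pitch).astype(int) % len(source)
        out[on:on + grain_len] += source[idx] * window
    return out

source = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
# Synchronous stream: one grain every 512 samples
sync = granulate(source, onsets=list(range(0, 8192, 512)), grain_len=1024)
# Asynchronous stream: irregular onsets
async_ = granulate(source, onsets=[0, 137, 900, 4321], grain_len=1024)
```

Doing this per-sample (rather than on a control-rate clock) is what "sample-accurate" means here: grain onsets are never quantized to a vector boundary.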

You can get it from the Github repository:

https://github.com/vincentgoudard/Sagrada

Sagrada screenshot
sagrada.play~ plays grains synchronously or asynchronously (click for video demo)
Sagrada multilayers
sagrada.multilayer~ allows for running multiple streams of grains in parallel (click for video demo)

Sagrada was partly developed during my PhD at LAM. It was inspired by the very good GMU tools developed at GMEM (for their sample-rate triggering) and the FTM package developed at IRCAM (for its modularity). Not to mention all of Curtis Roads’ work on granular synthesis.

tangible user interfaces @ ICLI, Porto

After presenting “John, the semi-conductor” at the TENOR conference three weeks ago (http://matralab.hexagram.ca/tenor2018), I have just arrived in Porto to attend the International Conference on Live Interfaces, aka ICLI 2018 (http://www.liveinterfaces.org/), and present some work on tangible user interfaces.
Very happy to be here: the city is beautiful and the conference program is pretty exciting!…

Interactive Sonotactile Table

Image ©Anne Maregiano.

The Interactive Sonotactile Table is a device invented for the Maison des Aveugles (“House of the Blind”) in Lyon by French composer Pascale Criton, in collaboration with Hugues Genevois from the Luthery-Acoustics-Music team of the Jean Le Rond d’Alembert Institute and Gérard Uzan, researcher in accessibility. The table was designed by Pierrick Faure (Captain Ludd) in collaboration with Christophe Lebreton (GRAME).

I coded the embedded Arduino boards as well as the Max patch for the gesture/sound interactive design.

The Interactive Sonotactile Table is part of a larger project, La Carte Sonore, by Anne Maregiano at the Villa Saint Raphaël: https://www.mda-lacartesonore.com.

mp.TUI — a Max package for multitouch screen interaction

mp.TUI is a Max package with OpenGL UI components ready for multitouch interaction (using the TUIO protocol).

It was presented at the ICLI’2018 conference in Porto and was used in a series of projects including the Phonetogramme, Xypre and FIB_R.
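For illustration, here is a minimal state tracker for the TUIO /tuio/2Dcur (cursor) profile, the kind of message stream such UI components consume. This Python sketch assumes OSC messages have already been decoded into (address, args) tuples by an OSC library; it is not the mp.TUI implementation:

```python
def handle_tuio_2dcur(cursors, address, args):
    """Minimal TUIO 1.1 /tuio/2Dcur tracker.

    `cursors` maps session_id -> (x, y) in normalized [0, 1] coordinates.
    """
    if address != "/tuio/2Dcur":
        return
    cmd = args[0]
    if cmd == "set":
        # ("set", session_id, x, y, x_vel, y_vel, motion_accel)
        sid, x, y = args[1], args[2], args[3]
        cursors[sid] = (x, y)
    elif cmd == "alive":
        # ("alive", id1, id2, ...) lists the currently active cursors;
        # any cursor absent from the list has been lifted
        alive = set(args[1:])
        for sid in list(cursors):
            if sid not in alive:
                del cursors[sid]

cursors = {}
handle_tuio_2dcur(cursors, "/tuio/2Dcur", ("set", 7, 0.25, 0.5, 0.0, 0.0, 0.0))
handle_tuio_2dcur(cursors, "/tuio/2Dcur", ("alive",))  # finger lifted
```

The set/alive pairing is the heart of TUIO's state model: positions are updated by "set", and disappearance is inferred from "alive" rather than signaled explicitly, which makes the protocol robust to dropped packets.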

Sources available on Github: https://github.com/LAM-IJLRA/ModularPolyphony-TUI

A few reactive UI components.

Dynamic cursors tracking placed objects

Dynamic semi-stable cursors

 

PANAM — accessible tools for digital art pedagogy

PANAM (Pédagogie artistique numérique accessible et multimodale) is a research and development project led by Puce Muse and concerned with the development and analysis of HCI strategies and tools for collective music practice with digital music instruments. It focuses on the accessibility of such tools for disabled people.

As part of the LAM team, several tools have been developed for the mapping, visualisation, and building of digital music instruments. They have been implemented as modules for the Meta-Mallette software (©PuceMuse), and are available as part of the LAM-lib, a software library for Max/MSP.

Publication

[pdf] Vincent Goudard, Hugues Genevois, Lionel Feugère. On the playing of monodic pitch in digital music instruments. In Anastasia Georgaki and Giorgos Kouroupetroglou (eds.), ICMC/SMC 2014, Sep 2014, Athens, Greece. National and Kapodistrian University of Athens, pp. 1418, 2014.

Partners