Gil Sperling

video, stage and music

Progress Report

April 28, 2020 by gilsperling

Final project

A vocal music piece controlled by the user’s breathing – when the user exhales, the singing is played.

Breathing is sensed through sound, using an FFT in Tone.js.
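A minimal sketch of the idea (not the project's actual code): open the microphone with Tone.js, run an FFT on the input, and gate a pre-loaded vocal track so it only sounds while the input level crosses a rough "exhale" threshold. The file name "mahler.mp3", the FFT size, and the -80 dB threshold are placeholders I'm assuming for illustration.

```javascript
const fft = new Tone.FFT(1024);
const mic = new Tone.UserMedia();
const gate = new Tone.Gain(0).toDestination();            // starts silent
const singing = new Tone.Player("mahler.mp3").connect(gate); // placeholder file
singing.loop = true;

// Call from a user gesture (e.g. mousePressed in the p5 sketch):
// browsers only start audio after an interaction.
async function startAudio() {
  await Tone.start();
  await mic.open();
  mic.connect(fft);
  singing.start();
}

// Poll the FFT and open/close the gate based on the average level.
function checkBreath() {
  const bins = fft.getValue();                  // Float32Array of dB values per bin
  const avg = bins.reduce((sum, db) => sum + db, 0) / bins.length;
  const exhaling = avg > -80;                   // placeholder threshold, tune by ear
  gate.gain.rampTo(exhaling ? 1 : 0, 0.1);      // fade in/out over 100 ms
}
setInterval(checkBreath, 50);
```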

The p5 sketch will send OSC data to Unreal Engine.
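Since a browser sketch cannot send UDP packets directly, one common pattern (an assumption here, not necessarily the final setup) is a small Node.js bridge: the p5 sketch posts breath on/off messages over a WebSocket, and the bridge forwards them to Unreal as OSC. The ports, the /breath address, and the ws / node-osc packages are all assumptions for this sketch; Unreal's OSC plugin would listen on the matching UDP port.

```javascript
// Hypothetical bridge script: browser -> WebSocket -> OSC over UDP -> Unreal.
// npm install ws node-osc
const { WebSocketServer } = require("ws");
const { Client } = require("node-osc");

const osc = new Client("127.0.0.1", 8000);        // Unreal's OSC receive port (assumed)
const wss = new WebSocketServer({ port: 8081 });  // the p5 sketch connects here

wss.on("connection", (socket) => {
  socket.on("message", (msg) => {
    const exhaling = msg.toString() === "1" ? 1 : 0;
    osc.send("/breath", exhaling, (err) => {
      if (err) console.error("OSC send failed:", err);
    });
  });
});
```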

Inside Unreal, the audio file of the singing will be played or paused based on the OSC data. Using an audio visualization plugin, the playing track goes through spectrum analysis, which generates the visuals.

Song:

Ich bin der Welt abhanden gekommen, by Gustav Mahler.

Visual environment:

Interiors of the respiratory system.

I experimented with interfaces for breathing (different tubes and mouthpieces), trying to create distinct sounds for exhaling and inhaling. However, when examined through spectrum analysis, the sounds aren't distinct enough. I also tried training Teachable Machine to make the distinction, but it does not give reliable results.
One thing that does work is breathing through a straw into a glass with a small amount of water. The sound of the bubbles has distinct high-frequency content that can be used as an FFT trigger.
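To illustrate that trigger, here is a small helper (assumed values, not the project's code) that measures how much energy sits above a cutoff frequency in one FFT frame. bins is the Float32Array of dB values from fft.getValue(); the cutoff and threshold are placeholders to tune against the bubble sound.

```javascript
// Returns true when the high-frequency band is loud enough to count as "bubbling".
function isBubbling(bins, sampleRate, cutoffHz = 4000, thresholdDb = -70) {
  const binWidth = (sampleRate / 2) / bins.length;   // Hz covered by each FFT bin
  const firstHighBin = Math.floor(cutoffHz / binWidth);
  let sum = 0;
  for (let i = firstHighBin; i < bins.length; i++) sum += bins[i];
  const avgHighDb = sum / (bins.length - firstHighBin);
  return avgHighDb > thresholdDb;
}

// Example use with the Tone.js FFT from above:
// const bubbling = isBubbling(fft.getValue(), Tone.getContext().sampleRate);
```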

The p5 sketch actually works with any vocal production as a trigger.

Next steps:
Add OSC to p5 sketch
Implement OSC input in Unreal Engine
Import audio to Unreal and develop audio visualization
Add audio spatialization (reverb) to the sound

Presentation next week:
I will present the first part (user vocal input triggering music), possibly with a symbolic visual representation.

