
Prototype

March 31, 2020 by gilsperling

Classical voice simulation

Using face-api.js, the user controls the pitch of the singing by raising and lowering their eyebrows, and the volume by opening and closing their mouth.

Code
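To illustrate the kind of mapping described above, here is a rough sketch using face-api.js landmarks and a Tone.js player. The element id, sample file, landmark indices and mapping ranges are all assumptions for illustration, not the actual project code.

    // Sketch (assumptions, not the project code): map face-api.js landmarks
    // to volume (mouth openness) and pitch (eyebrow height).
    const video = document.getElementById("cam"); // assumed <video> showing the webcam
    const player = new Tone.Player("voice-sample.mp3").toDestination(); // placeholder sample
    player.loop = true;

    async function setup() {
      await faceapi.nets.tinyFaceDetector.loadFromUri("/models");
      await faceapi.nets.faceLandmark68Net.loadFromUri("/models");
      await Tone.start();     // audio has to be started from a user gesture
      await Tone.loaded();    // wait for the sample buffer
      player.start();
      track();
    }

    async function track() {
      const result = await faceapi
        .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
        .withFaceLandmarks();
      if (result) {
        const faceHeight = result.detection.box.height;
        // Mouth openness: inner-lip gap, normalized by face size.
        const mouth = result.landmarks.getMouth();
        const open = (mouth[18].y - mouth[14].y) / faceHeight;
        // Eyebrow raise: brow-to-eye distance grows as the brows go up.
        const brow = result.landmarks.getLeftEyeBrow()[2];
        const eye = result.landmarks.getLeftEye()[1];
        const raise = (eye.y - brow.y) / faceHeight;
        player.volume.value = -40 + open * 400;  // more open = louder (dB)
        player.playbackRate = 0.8 + raise * 4;   // higher brows = higher pitch
      }
      requestAnimationFrame(track);
    }

    setup();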

In an attempt to simplify the interaction and make it fully web-compatible, I eliminated any physical computing elements (such as sensing diaphragm/belly/chest motion).

For the sake of simplicity, both for the user and the code, I focused on what I could get out of a single machine learning / computer vision model.
In theory I could have used PoseNet for hand control of pitch and face-api.js for mouth control of volume, but:
a. That would require two separate camera inputs and two ML models running in the same code.
b. An interaction experience focused on the face feels more cohesive.

Using the mouth for volume feels intuitive and is quite easy to control.
The eyebrows don’t afford much accuracy (it’s pretty damn hard to sing a tune!), but their physicality does have an association with pitch, at least for me.

There is a lot of jitter when running face-api.js. If I could find a way to improve the performance and smooth the output, that might improve the experience.
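One possible way to reduce the jitter (not something the current prototype does) would be to low-pass filter the control values before sending them to Tone.js, e.g. a simple exponential moving average; the alpha value here is just a guess to tune.

    // Exponential smoothing of the raw control values to reduce frame-to-frame jitter.
    const alpha = 0.2;                 // 0 = frozen, 1 = no smoothing (assumed value)
    let smoothOpen = 0;
    let smoothRaise = 0;

    function smooth(rawOpen, rawRaise) {
      smoothOpen += alpha * (rawOpen - smoothOpen);
      smoothRaise += alpha * (rawRaise - smoothRaise);
      return { open: smoothOpen, raise: smoothRaise };
    }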

Potential further development:

  • Add note visualization
  • Play with different singing samples and Tone.Loop parameters (see the sketch after this list)
  • See if face-api.js can be used to extract vowels
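As a reference point for the Tone.Loop idea above, a minimal loop that retriggers a sample on every quarter note could look like the following; the sample file and interval are placeholders.

    // Minimal Tone.Loop sketch: retrigger a (placeholder) sample every quarter note.
    const sample = new Tone.Player("phrase.mp3").toDestination();
    const loop = new Tone.Loop((time) => {
      sample.start(time);
    }, "4n").start(0);
    Tone.Transport.start();  // the loop runs on the Transport clock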

Alternative direction for the final –
VR interaction

A musical experience through architecture – using interaction in VR to advance the understanding of musical concepts, or to expand on the meaning and emotional effect of a piece of music.

Ideas:

A. Take a specific piece of music and realize it spatially.
As you explore the space, you unlock different elements of the piece.
Musical material (likely candidate): Wagner’s Ring Cycle.
Represent musical leitmotifs in 3D space, as a physical experience.

B. Continue the singing interaction project – capture the user’s mic audio and create a sound experience and visualization from it.
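For the mic-capture part of idea B, a possible starting point in Tone.js (just a sketch of how I would expect to wire it up, not anything already built) would be:

    // Sketch: open the microphone and read waveform data for a visualization.
    const mic = new Tone.UserMedia();
    const analyser = new Tone.Analyser("waveform", 256);

    async function startMic() {
      await mic.open();      // asks for microphone permission
      mic.connect(analyser);
    }

    function draw() {
      const samples = analyser.getValue();  // Float32Array of the latest window
      // ...draw `samples` to a canvas here...
      requestAnimationFrame(draw);
    }

    startMic().then(draw);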

Sound interactivity in Unreal Engine:

  • Mic input – visualization and effects
  • 3D physics-controlled musical instruments
  • Audio synthesis
