Gil Sperling

video, stage and music

Week 5

October 9, 2019 by gilsperling

Beginnings of a concept

An interactive movable projector: a system that can move a projected image on the x and y axes, guided by some physical input, either:
– hand movements that indicate up/down and left/right motion
– tracking a body or body part to place a projection on it or near it


The key here is that I don’t want to animate an image inside the projector’s frame; rather, I want to move the entire projected image. Covering a large space with projection requires multiple high-luminosity projectors. Dynamically moving a projected image in that space means an area is only “covered” when the projection light source is pointed at it.

The movement is controlled by two servo motors mounted in a pan/tilt bracket. Each servo rotates through a 180-degree range on either the x or the y axis.
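A minimal test sketch for the bracket looks something like the following — the pin numbers are placeholders for illustration, not necessarily the ones I wired:

```cpp
// Minimal pan/tilt test: one servo for pan (x), one for tilt (y).
#include <Servo.h>

Servo panServo;   // rotates the bracket left/right
Servo tiltServo;  // rotates the bracket up/down

void setup() {
  panServo.attach(9);    // pan servo signal pin (assumed)
  tiltServo.attach(10);  // tilt servo signal pin (assumed)
  panServo.write(90);    // start both axes at the middle of the 180-degree range
  tiltServo.write(90);
}

void loop() {
  // Slowly sweep the pan axis to confirm the bracket moves smoothly.
  for (int angle = 0; angle <= 180; angle++) {
    panServo.write(angle);
    delay(15);
  }
  for (int angle = 180; angle >= 0; angle--) {
    panServo.write(angle);
    delay(15);
  }
}
```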

In my initial tests I tried connecting a pico projector to the bracket, but quickly found that even the smallest projector in the ER is too heavy for the servo to move stably, or even hold in place. (It’s possible that I misunderstood what “stall torque” means in the servo’s datasheet, and that I’m making an error by powering it with 3.3V from the Arduino when its rated range is 4.8V–6.0V.) A better approach would be to make a mirror the object controlled by the servos and move the reflection of the projected light around the room (a technique also used to make moving lights).

What is the input guiding the movement? In my basic test setup I controlled each servo with a potentiometer, but what I’m looking for is interaction with a performer or spectator.

In the next step, I connected an ultrasonic range sensor and was able to control one servo by moving a hand towards and away from the sensor. I started from the range sensor example sketch and adjusted the minimum and maximum distances so they correspond to the servo’s minimum and maximum angles. Still, there is no intuitive or aesthetic value to this kind of input.

In further steps I want to track detailed hand movement, like the palm of the hand tilting up or down. That requires a more sophisticated sensor, like a Leap Motion. I also started playing with body tracking through machine learning as an option: using the PoseNet model in ml5 I can track body movement through a webcam. I would then need to write p5.js code that analyzes the relevant data and sends it to the Arduino over a serial connection.
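For reference, the range-sensor step boils down to something like this sketch — the pins and distance bounds here are stand-ins, not my exact values:

```cpp
// Map hand distance (ultrasonic range sensor) to a servo angle.
#include <Servo.h>

const int trigPin = 7;
const int echoPin = 8;
const int minDistCm = 5;    // closest hand position -> 0 degrees
const int maxDistCm = 40;   // farthest hand position -> 180 degrees

Servo servo;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  servo.attach(9);
}

void loop() {
  // Trigger a ping and measure the echo time in microseconds.
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);

  // Convert to centimeters (~29 microseconds per cm, round trip).
  int distance = duration / 29 / 2;

  // Clamp the reading and map the distance range onto the angle range.
  distance = constrain(distance, minDistCm, maxDistCm);
  int angle = map(distance, minDistCm, maxDistCm, 0, 180);
  servo.write(angle);
  delay(30);
}
```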
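If the PoseNet route works out, the Arduino side could stay fairly simple: the p5.js sketch would send a pair of angles over serial, and the Arduino would just parse and apply them. Something along these lines, assuming a "pan,tilt" newline-terminated message format (the format is an assumption for illustration):

```cpp
// Receiving end for the PoseNet idea: p5.js (ml5/PoseNet) sends a line
// like "90,45\n"; the Arduino parses the two numbers and moves the servos.
#include <Servo.h>

Servo panServo;
Servo tiltServo;

void setup() {
  Serial.begin(9600);
  panServo.attach(9);
  tiltServo.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    int pan = Serial.parseInt();   // first number, before the comma
    int tilt = Serial.parseInt();  // second number, after the comma
    if (Serial.read() == '\n') {   // only act on a complete message
      panServo.write(constrain(pan, 0, 180));
      tiltServo.write(constrain(tilt, 0, 180));
    }
  }
}
```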
