Gil Sperling

video, stage and music

Final project concept

November 6, 2019 by gilsperling

The interactive scroll

Project context – exhibit design/educational

The imagined application of this project is an interactive exhibit in a museum or library that owns ancient scrolls which, being valuable and delicate, are inaccessible to the public. The project is a prototype for an installation that would allow visitors to physically interact with a simulated version of the artifact.

How it works

sketch showing the layer structure of the object

The user-facing object is a blank scroll that can be rolled using handles at the sides (see image). The scroll rests on a hard transparent surface, such as acrylic. Text in Hebrew is projected from under the scroll. As the user turns one handle, the scroll advances and the projected text moves with it; turning the other handle scrolls in reverse. Wherever the user places a finger on the scroll, an English translation of the text appears. The more the user touches the scroll, the more layers of content are projected (annotations, illustrations, animated illustrations).
Within the scope of the final, I will deal only with text layers.
Later developments might include response to different kinds of touch on the page (wiping with the entire hand to erase, for example).
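As a rough sketch of the touch-to-translation logic described above: given how far the text has scrolled (derived from the handle rotation) and the finger's x position on the page, we can look up which word is under the finger and project its English translation there. The `Word` structure, the pixel coordinates, and the left-to-right layout are all illustrative assumptions, not a final design (Hebrew layout direction is ignored here for simplicity).

```cpp
#include <string>
#include <vector>

// Hypothetical word record: position and width are measured in pixels
// along the full, unrolled length of the text.
struct Word {
    std::string hebrew;
    std::string english;
    int startPx;   // left edge of the word in the unrolled text
    int widthPx;
};

// Returns the index of the word under the finger, or -1 if none.
// touchX is measured from the left edge of the visible page;
// scrollOffsetPx is how much text has already scrolled past that edge.
int wordUnderFinger(const std::vector<Word>& text,
                    int scrollOffsetPx, int touchX) {
    int absoluteX = scrollOffsetPx + touchX;  // position in the full text
    for (size_t i = 0; i < text.size(); ++i) {
        if (absoluteX >= text[i].startPx &&
            absoluteX <  text[i].startPx + text[i].widthPx)
            return static_cast<int>(i);
    }
    return -1;  // finger is between words or off the text
}
```

The same lookup, keyed by touch count per word, could later drive the additional annotation and illustration layers.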


The implementation requires:
A. Sensing the cumulative rotation of the handles (distinguishing, say, 360 degrees from 720 degrees, so we know how far into the text we've scrolled). This should be possible using rotary encoders connected to the scroll handles.
B. Sensing the position of the user’s finger on the page.
Two possible options are:
1. A touchpad, which would have to be clear to allow projection through it. This seems to be the function of a resistive touchscreen overlay – the linked example has a diagonal size of 7″ and costs $15, so this may limit the scale of the installation.
2. A camera that can detect the finger's position. This option would limit the ability to sense different kinds of touch on the page.
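For option A, a quadrature rotary encoder reports two out-of-phase signals (A and B), and accumulating their state transitions gives a signed count that never wraps, so 720 degrees is distinguishable from 360. A minimal decoding sketch, assuming a hypothetical 20-pulse encoder read with x4 decoding (on an Arduino the two states would come from `digitalRead` on the encoder pins; here `update` just takes the 2-bit state directly):

```cpp
#include <cstdint>

// Sketch of a quadrature decoder for one scroll handle.
// COUNTS_PER_REV = pulses per revolution x 4 transitions per pulse
// (20 pulses assumed here -- check the actual encoder's datasheet).
struct EncoderDecoder {
    static const int COUNTS_PER_REV = 80;
    long count = 0;        // signed, cumulative: does not wrap at 360
    uint8_t prevState = 0; // 2-bit state: (A << 1) | B

    // Each valid A/B transition adds or subtracts one count.
    // Index = (prevState << 2) | newState; 0 = no move or invalid jump.
    void update(uint8_t newState) {
        static const int8_t table[16] = {
             0, -1, +1,  0,
            +1,  0,  0, -1,
            -1,  0,  0, +1,
             0, +1, -1,  0 };
        count += table[(prevState << 2) | newState];
        prevState = newState;
    }

    // Cumulative rotation: 720.0 means two full turns, so the code
    // always knows how far into the text the user has scrolled.
    double degrees() const {
        return 360.0 * count / COUNTS_PER_REV;
    }
};
```

On real hardware the `update` calls would typically happen in pin-change interrupts so no transitions are missed while the handle spins.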

Project phases

  1. Construct a prototype object for playtesting, possibly with manually controlled projections
  2. Use playtest feedback to design user interaction scenarios
  3. Research and test sensors and materials
  4. Fabricate the object, integrating the sensors
  5. Integrate video content; create code for video playback that is controlled by Arduino data
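For phase 5, one plausible shape for the Arduino-to-playback link is line-based serial messages that the playback program maps to a horizontal offset of the projected text. The `"ROT:<degrees>"` message format and the pixels-per-degree scale below are assumptions for illustration, not a decided protocol:

```cpp
#include <string>

// Assumed scale: how many pixels the projected text moves per degree
// of handle rotation (would be derived from the scroll circumference).
const double PIXELS_PER_DEGREE = 2.5;

// Parses a hypothetical "ROT:<degrees>" serial line from the Arduino
// and returns the projected-text offset in pixels; returns -1 for any
// other message (e.g. a touch event handled elsewhere).
double rotationToOffsetPx(const std::string& line) {
    const std::string prefix = "ROT:";
    if (line.compare(0, prefix.size(), prefix) != 0) return -1;
    double degrees = std::stod(line.substr(prefix.size()));
    return degrees * PIXELS_PER_DEGREE;
}
```

The playback side would read one such line per update, shift the text layer by the returned offset, and composite the translation/annotation layers on top.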

Posted in Fall '19 - Introduction to Physical Computation
