The design brief was to use an Arduino in a display environment to evoke movement that responds to the viewer. The group designed a basic approach using a servo motor to control the moving eyes and a ping sensor to judge the position of the viewer in relation to the painting.
Code was developed to calibrate the position of the eyes (the angle of the servo) against the position of the viewer (the reading from the ping sensor), creating a convincing effect of the painting's eyes following the viewer.
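A minimal sketch of this distance-to-angle mapping is shown below, assuming a single-pin PING)))-style sensor on pin 7, a servo on pin 9, and an arbitrary calibration range; the project's actual pins and calibration values are not given, so these are placeholders.

#include <Servo.h>

const int pingPin = 7;       // ping sensor signal pin (assumed)
const int servoPin = 9;      // servo signal pin (assumed)

// Assumed calibration range: viewer walks between roughly 30 cm and 200 cm
// from the sensor; the eyes sweep between 40 and 140 degrees.
const long minDistanceCm = 30;
const long maxDistanceCm = 200;
const int minAngle = 40;
const int maxAngle = 140;

Servo eyeServo;

long readDistanceCm() {
  // Trigger the sensor with a short pulse, then time the returning echo.
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  pinMode(pingPin, INPUT);
  long durationUs = pulseIn(pingPin, HIGH);

  // Sound travels roughly 29 microseconds per cm; halve for the round trip.
  return durationUs / 29 / 2;
}

void setup() {
  eyeServo.attach(servoPin);
}

void loop() {
  long distance = constrain(readDistanceCm(), minDistanceCm, maxDistanceCm);

  // Map the viewer's position in front of the display to an eye angle.
  int angle = map(distance, minDistanceCm, maxDistanceCm, minAngle, maxAngle);
  eyeServo.write(angle);

  delay(50);  // brief pause so the servo settles between readings
}

The core of the effect is the map() call, which converts the measured distance directly into a servo angle; in practice the calibration endpoints would be tuned to the geometry of the gallery space.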
One problem to address was how to encourage the viewer to walk back and forth in a straight line relative to a ping sensor hidden off to one side. This was accomplished by creating a gallery environment with two pedestals on either side and rope barriers to guide the viewer's path.