Saturday, February 28, 2009

Microphone Recognition

The key component of user interaction in the M.K. system is knowing where the microphone is being held.

By knowing the X/Y position of the microphone, we can infer a rough position of the user: the user's body center is within 1 meter of the microphone. There are better ways to get user position, but this one is the easiest, since we are leveraging technology the user is ALREADY holding.

The original concept was to build a band of IR LEDs circling the microphone. A problem with this is knowing how close the user is to the camera. The hack solution is to provide a narrow stage that limits where the user can position themselves. A better solution is for the microphone system to report its exact position to the computer.
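As a rough illustration of the narrow-stage hack, one way to turn the tracked mic's pixel position into a stage coordinate is a simple pinhole mapping, assuming a fixed camera-to-stage distance and a known field of view. All values below are made-up assumptions, not measurements from the actual rig:

```python
import math

def pixel_to_stage_x(px, image_width=640, hfov_deg=60.0, stage_dist_m=3.0):
    """Convert a horizontal pixel coordinate to meters along the stage.

    Assumes the user is held at a fixed distance from the camera
    (the narrow stage) and a simple pinhole camera model.
    """
    # Angle of the pixel relative to the camera's optical axis.
    angle = math.radians(hfov_deg) * (px / image_width - 0.5)
    return stage_dist_m * math.tan(angle)

# Center of the image maps to the center of the stage.
center = pixel_to_stage_x(320)   # ~0.0 m
edge = pixel_to_stage_x(640)     # right edge of the camera's view
```

The fixed `stage_dist_m` is exactly the weakness noted above: without it, the X/Y blob position alone cannot tell how far away the user is.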

For this reason, ARToolkit was considered. An ARToolkit fiducial can be rigged as a backlight. This would give X, Y, Z and tilt position: very powerful for user interaction. Unfortunately, ARToolkit is built for Augmented Reality and seems to assume the user and camera are in the same position, as with heads-up or see-through displays. A 7-inch fiducial gives about 50 feet of distance from fiducial to camera. Assuming a 1-foot marker is the largest we can give the user, scaling that ratio puts the camera no more than roughly 85 feet away.
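Treating ARToolkit's tracking range as scaling linearly with marker size (a rough assumption, not something the library documents as exact), the 7 in / 50 ft figure above gives a quick rule of thumb:

```python
# Back-of-the-envelope: tracking range assumed proportional to marker size,
# calibrated from the 7 in -> 50 ft data point above.
RANGE_RATIO = (50.0 * 12.0) / 7.0  # inches of range per inch of marker (~85.7)

def max_range_ft(marker_in):
    """Maximum camera distance (feet) for a marker of the given size (inches)."""
    return marker_in * RANGE_RATIO / 12.0

# A 1-foot (12 in) marker comes out to roughly 85 feet of range.
handheld = max_range_ft(12.0)
```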

In looking at positioning under this system, the ideal is to place the camera and projector as close to one another as possible. This model was suggested by Zach Lieberman, whose work with openFrameworks makes him an expert, for sure. It seems to make sense, since you are minimizing skew between the projection and vision systems. The problem is that projectors typically require 15 feet of throw for proper projection. This of course depends on the size of the projection. Were camera and projector next to each other, the fiducial would be ... about 2 feet big! That would be cumbersome and a huge obstruction.
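To see why the co-located setup inflates the marker, here is a rough sketch: the camera ends up sitting at the projector's throw distance, and the marker must still span enough pixels for the tracker to lock on. Throw ratio, field of view, resolution, and sizes below are all assumed values, not measurements:

```python
import math

def throw_distance_ft(image_width_ft, throw_ratio=1.5):
    """Distance the projector (and co-located camera) sits from the wall."""
    return image_width_ft * throw_ratio

def marker_pixels(marker_ft, camera_dist_ft, hfov_deg=60.0, image_px=640):
    """Pixels the marker spans at the given distance (pinhole approximation)."""
    view_width_ft = 2.0 * camera_dist_ft * math.tan(math.radians(hfov_deg) / 2.0)
    return marker_ft / view_width_ft * image_px

dist = throw_distance_ft(10.0)    # a 10 ft wide projection -> 15 ft back
px = marker_pixels(2.0, dist)     # pixel footprint of a 2 ft marker there
```

The bigger the projection, the farther back the pair must sit, and the larger the marker must grow to keep the same pixel footprint.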

So for now, back to the drawing board.