Saturday, February 28, 2009

Microphone Recognition

The key component of user interaction in the M.K. system is knowing where the microphone is being held.

By knowing the X/Y position of the microphone, we can assume a very general position of the user: essentially, the user's body center is less than 1 meter from the microphone. There are better ways to get user position, but this one is the easiest, since we are leveraging technology the user is ALREADY holding.

The original concept was to build a band of IR LEDs that circles the microphone. A problem with this is knowing how close the user is to the camera. The hack solution is to provide a narrow stage that limits where the user can stand. A better solution is for the microphone system itself to tell the computer its exact position.

For this reason, ARToolkit was considered. An ARToolkit fiducial can be rigged as a backlight. This would give X, Y, Z and tilt position - very powerful for user interaction. Unfortunately, ARToolkit is built for augmented reality, and seems to assume the user and camera are in the same position - heads-up or see-through displays. For a 7 inch fiducial we get about a 50 inch distance from fiducial to camera. Assuming a 1 foot covering is the largest we can give the user, we can expect no more than about a 100 inch distance from the camera, or roughly 8 feet.

In looking at positioning under this system, the ideal is to place the camera and projector as close to one another as possible. This model was suggested by Zach Lieberman, whose work with openFrameworks makes him an expert - for sure. It makes sense, since you are minimizing skew between the projection and vision systems. The problem is that projectors typically require around 15 feet of throw for a proper projection (this of course depends on the size of the projection). Were the camera and projector next to each other, the fiducial would have to be... about 2 feet across! That would be cumbersome and a huge obstruction.
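
As a quick sanity check on those numbers (just back-of-the-envelope arithmetic in AS3, assuming detection range scales roughly linearly with marker size at the ~7:1 ratio implied by the 7 inch / 50 inch figure above - the function names are mine, nothing here comes from ARToolkit):

----------
// Back-of-the-envelope only: assumes range scales linearly with
// marker size at ~7:1 (from the 7 inch -> ~50 inch figure above).
const RANGE_RATIO:Number = 50 / 7;

function maxRangeInches(markerInches:Number):Number {
    return markerInches * RANGE_RATIO;
}

function requiredMarkerInches(rangeInches:Number):Number {
    return rangeInches / RANGE_RATIO;
}

trace(maxRangeInches(12));            // 1 ft marker -> ~86 in (~7 ft), ballpark of the 8 ft above
trace(requiredMarkerInches(15 * 12)); // 15 ft throw -> ~25 in marker, i.e. the ~2 ft above
----------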

So for now, back to the drawing board.

Friday, January 16, 2009

motion tracking in flash

This video shows a pretty cool implementation of motion tracking using Flash.


Tracking Multiple Objects Using a Webcam from chris teso on Vimeo.


Full blog post here.
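
For reference, the basic idea in AS3 looks roughly like this - not the multi-object tracker from the video, just a minimal frame-differencing sketch (the names are mine, and it assumes a webcam is attached):

----------
import flash.display.BitmapData;
import flash.events.Event;
import flash.geom.Rectangle;
import flash.media.Camera;
import flash.media.Video;

var cam:Camera = Camera.getCamera(); // null if no webcam is available
var video:Video = new Video(320, 240);
video.attachCamera(cam);
addChild(video);

var prevFrame:BitmapData = new BitmapData(320, 240, false, 0);
var currFrame:BitmapData = new BitmapData(320, 240, false, 0);

addEventListener(Event.ENTER_FRAME, trackMotion);

function trackMotion(event:Event):void {
    currFrame.draw(video);
    // compare() returns a BitmapData of per-pixel differences,
    // or the number 0 when the two frames are identical
    var diff:Object = prevFrame.compare(currFrame);
    if (diff is BitmapData) {
        // bounding box of every pixel that changed since the last frame
        var moved:Rectangle = BitmapData(diff).getColorBoundsRect(0x00FFFFFF, 0x000000, false);
        trace("motion in", moved);
    }
    prevFrame.draw(video); // current frame becomes the reference for the next one
}
----------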

DEERTANK


One last one... testing some other body types... Let me know if I'm getting too crazy.

I give you lobster crotch.


OK, here's one more before the weekend. I'm experimenting with a color system now also... like the body parts, the coloration should be generated at random.
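
No generator code in this post, but picking the random tint could be as simple as something like this (just a sketch - the `part` sprite below is a stand-in, not the actual DEERTANK code):

----------
import flash.display.Sprite;
import flash.geom.ColorTransform;

// stand-in for a generated body part
var part:Sprite = new Sprite();
part.graphics.beginFill(0xFFFFFF);
part.graphics.drawRect(0, 0, 50, 50);
part.graphics.endFill();
addChild(part);

// tint it with a random 24-bit RGB color
var tint:ColorTransform = new ColorTransform();
tint.color = uint(Math.random() * 0xFFFFFF);
part.transform.colorTransform = tint;
----------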

Wednesday, January 14, 2009

MR TEEEEEEEEEEEETH


I'm trying a new body type here and seeing if you guys can code in something like an IF/THEN statement. Like IF the body is this THEN stop adding parts... I present MR TEEEEEEEEETH.
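
In AS3 that check could look something like this - all the names are made up, it's just a sketch of the IF/THEN being described, not the real generator code:

----------
// hypothetical helper: IF the body is a certain type THEN stop adding parts
function shouldAddPart(bodyType:String, partCount:int):Boolean {
    if (bodyType == "mrTeeth" && partCount >= 4) {
        return false; // this body is full - stop attaching parts
    }
    return true;
}

trace(shouldAddPart("mrTeeth", 5));       // false
trace(shouldAddPart("lobsterCrotch", 5)); // true
----------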

Thursday, January 8, 2009

TITFISH

Here is my latest effort, TITFISH. I am finding more and more that the way to make these things work (multiple character generation) is to remove most of the joints. That way placement is easier to assess.

Tuesday, January 6, 2009

Playing with AS3

One of the first things I ran into was the lack of an onEnterFrame. It seems that event listeners are the way to go.

Here's what I did today. There is no analysis yet, but it keeps a rolling thirty-second history of the mouse's x and y locations (sampling every third frame, so 10 samples per second at 30 frames/second).

I just wanted to check that I am using the best approximation of an onEnterFrame function.

----------
// rolling buffers of the last 300 samples (10 samples/second for 30 seconds)
var xLocations:Array = new Array();
var yLocations:Array = new Array();
var testFreq:int = 3; // sample every 3rd frame = 10 samples/second at 30 fps
var counter:int = 0;

// pre-fill both buffers so they always hold exactly 300 entries
for (var i:int = 0; i < 300; i++) {
    xLocations.push(0);
    yLocations.push(0);
}

addEventListener(Event.ENTER_FRAME, TrackMouse);

function TrackMouse(event:Event):void {
    counter++;
    if (counter >= testFreq) {
        counter = 0;
        // push the newest sample and drop the oldest to keep the length at 300
        xLocations.push(mouseX);
        xLocations.splice(0, 1);
        yLocations.push(mouseY);
        yLocations.splice(0, 1);
    }
}
-------------

-Andy