Hi David,

True 6DOF was solved elegantly by Marek Alboszta's team (Espi) (quaternions are the way to go), and is supplied by Polhemus using a more elaborate set of sensors. Each solution has trade-offs, for good engineering reasons. I wouldn't try to solve it myself. These are not cheap gadgets, for good reason.
On 2011-06-10, at 1:20 PM, David Morris <email@example.com> wrote:
Actually, I wonder if this might do what we need:
Glue rigid wood or fibreglass dowel to iPhone case.
Fix a ring of LEDs around the dowel shaft at two points, so an overhead camera can always pick up two points of light at known distances from the iPhone.

[Xin Wei, inline:] How would these distances be measured? Not enough degrees of freedom to recover 6DOF, but I understand what you have in mind. And at what time and space accuracy and frequency? (Over the years many generations of TMLabbers and I have conceived different solutions, but I'm interested in fresh, feasible approaches.)

So this is a line-of-sight approach. If we use a line-of-sight approach, then I'd simply use the Wii and hack up its IR receiver/sender pair, since it's so well engineered. Ditto Kinect for camera-based tracking. Adrian and I agree it makes sense to buy and hack this inexpensive off-the-shelf gear to get a feel for what can be done. Good practice: "rapid prototyping, quick fail." The question now is who will do the DIY electronics.

In that spirit, before that or meanwhile we can wizard-of-oz walk through some scenarios, and get a lot more juice out of the gloves by using them in many alternative scenarios, I think.

Looking forward to talking with you again, and catching up with people Wed. next week!

Xin Wei
Put the iPhone in the case.
Run GyroOSC on the iPhone.
We now know the tilt of the iPhone relative to 3 axes—that gyroscope is pretty amazing.
From the overhead camera monitoring the LEDs we can get the position of the iPhone in the XY plane, and its orientation within that plane.
And then, I am thinking that by measuring the apparent distance between the two LEDs in the camera image, and combining this with the gyroscope tilt data, we could solve for the height of the iPhone above the floor.
Xin Wei, would the math work here? Does the gyroscope data plus the two LEDs in the image give us a unique 6DOF solution?
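For the height part of that question, a back-of-the-envelope check under a pinhole-camera model suggests the geometry does work: the apparent LED spacing shrinks linearly with distance from the camera, and the gyroscope's tilt tells us how foreshortened the true spacing is. The focal length, camera height, and LED spacing below are made-up numbers for illustration:

```python
import math

def height_from_led_spacing(d_px, focal_px, led_separation_m, tilt_rad, cam_height_m):
    """Estimate the device's height above the floor from apparent LED spacing.

    Pinhole model: apparent size shrinks linearly with distance from the camera.
    tilt_rad is the dowel's angle away from the horizontal image plane (from the
    gyroscope), which foreshortens the true LED separation as seen from above.
    All parameter values here are illustrative assumptions.
    """
    # Foreshortened separation as seen by the overhead camera
    s_perp = led_separation_m * math.cos(tilt_rad)
    # Distance from camera to the LEDs along the optical axis
    depth = focal_px * s_perp / d_px
    # Height of the device above the floor
    return cam_height_m - depth

# Example: camera 3 m up, 800 px focal length, LEDs 0.3 m apart,
# dowel tilted 30 degrees out of the horizontal plane, LEDs appear 80 px apart
h = height_from_led_spacing(80.0, 800.0, 0.3, math.radians(30), 3.0)
print(round(h, 2))  # roughly 0.4 m above the floor
```

With the camera supplying x-y and the gyroscope supplying the three rotations, this height estimate would fill in the sixth degree of freedom, though accuracy will depend heavily on calibration and pixel noise at small apparent spacings.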
From: David Morris [mailto:firstname.lastname@example.org]
Sent: June-10-11 12:57 PM
To: 'Niomi Anna Cherney'; 'zohar'; 'p.a. duquette'; 'Sha Xin Wei'; 'Noah Brender'; 'Tristana Martin Rubio'; 'Andrew Forster'
Subject: Update: Further Research on 'Magic Wand'
I spent some further time today researching possibilities for the ‘magic wand’ tech, first to see if we can get this via the iPhone. Results:
--it turns out that if you want to simulate lightsaber duels on the iPhone, you need to solve the 6-degrees-of-freedom problem: position and tilt in 3 dimensions. The app Lightsaber Duel from THQ apparently lets you have virtual duels through Bluetooth. I guess it’s doing relative position through Bluetooth? There is also iSamurai. I’ve downloaded these, and I am hoping someone else can bring an iPhone, so we can see how this works. What I don’t know is whether this only gives you audio feedback when you hit the other person’s virtual lightsaber, or also when you are pointing their way. (NB, the thing we are trying to do is like Luke in the first Star Wars movie tracking the practice drone while blindfolded through his lightsaber.)
--there are also a bunch of apps for doing read-outs of the gyroscope sensor on the iPhone. One, GyroOSC, sends the data via OSC. If an overhead camera tracking a fiducial on the hand could give us 3D location, then we could use an iPhone or WiiMote to give tilt.
--I also came across this: http://www.free-track.net/english/. It’s a freeware package for Windows that uses LEDs mounted on someone’s head to track 6DOF data through a webcam or Wii camera. Maybe we could mount the LEDs on the end of a stick, and use an overhead camera?
By the way, GyroOSC is a great little app -- built atop the Adrian / John / CNMAT OSC lib. ;) No need for a Polhemus if we use the iPhone? And maybe the familiar form factor works in our favor?
I've downloaded it but don't have time to play with it in Max just yet -- I expect they have demo patches? Have you looked at the data coming out in Max/MSP? I bet it's clean, but I'd like to check the effects of interference and range.
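For eyeballing the stream outside Max, an OSC packet of the kind a gyroscope app would send can be decoded with a few lines of stdlib Python. The address pattern `/gyrosc/gyro` and the three-float layout below are my assumptions about the app's output, not documented fact:

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message: address, type tags, float arguments.

    Handles only ',fff'-style messages like a gyroscope stream;
    this is not a full OSC implementation.
    """
    def read_padded_string(buf, offset):
        end = buf.index(b'\x00', offset)
        s = buf[offset:end].decode('ascii')
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return s, (end + 4) & ~3

    address, offset = read_padded_string(data, 0)
    tags, offset = read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(','):
        if tag == 'f':  # big-endian 32-bit float
            (value,) = struct.unpack_from('>f', data, offset)
            args.append(value)
            offset += 4
    return address, args

def osc_string(s: str) -> bytes:
    raw = s.encode('ascii') + b'\x00'
    return raw + b'\x00' * (-len(raw) % 4)

# Build a sample packet with pitch/roll/yaw in radians, then decode it
packet = osc_string('/gyrosc/gyro') + osc_string(',fff') + struct.pack('>fff', 0.1, -0.2, 1.5)
addr, values = parse_osc_message(packet)
print(addr, [round(v, 2) for v in values])
```

In practice you'd receive the packet from a UDP socket bound to the port the app sends to; the decode step is the same either way.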
If GyrOSC works we can get attitude from the onboard sensors, so the only remaining issue is x-y position on the floor. I hope the demo patches include a simple little calibration to output orientation relative to the local room. ("Let's do the time warp again!")
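That room-relative calibration could be as simple as re-zeroing yaw while the device points at a reference direction. A minimal sketch, assuming yaw arrives in radians from the phone's attitude stream (the class and method names are my own illustration):

```python
import math

class YawCalibrator:
    """Re-zero the gyroscope's yaw so 0 means 'facing the room's reference direction'.

    Assumes yaw arrives in radians from the phone's attitude stream;
    all names here are illustrative, not from any particular app.
    """
    def __init__(self):
        self.offset = 0.0

    def calibrate(self, raw_yaw):
        # Call once while the device points at the room's reference direction
        self.offset = raw_yaw

    def room_yaw(self, raw_yaw):
        # Wrap into (-pi, pi] so headings near the boundary don't jump by 2*pi
        y = raw_yaw - self.offset
        return math.atan2(math.sin(y), math.cos(y))

cal = YawCalibrator()
cal.calibrate(2.0)                   # device pointed at the stage when we pressed "zero"
print(round(cal.room_yaw(2.5), 2))   # 0.5 rad past the reference direction
```

The same offset-and-wrap idea would translate directly into a small Max patch with a stored calibration value.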
Tracking x-y location by LED on the body or handheld accoutrement is delicate -- that's the solution TG2001 used in Linz and Rotterdam, 2000-2002. We abandoned on-body LEDs after 2002 because they impose stiffening constraints on people moving around: people have to keep parts of their bodies or accoutrement presented to the camera. It's quite annoying when the system stops responding to you simply because line of sight is broken, or because of light interference. For our purposes this may be ok for now -- but we'll want to carefully fold the artifactual "stiffening" constraints into our design.
We (mostly) eliminate interference from visible light conditions by tracking in infra-red. CV has improved so much that we just track with cv.jit tools + overhead camera(s). Students and I have dreamed up various schemes for minimally encumbering ID-based tracking. But I have deliberately avoided tracking ID-ed people, expecting interesting ethico-epistemic implications. (For wearables, to be considered minimally encumbering I use jewelry as the criterion. :)
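As a toy illustration of that kind of camera-based tracking (not cv.jit's actual algorithm; the frame and threshold are made up), finding IR-LED centroids in a grayscale frame amounts to threshold-and-flood-fill:

```python
def led_centroids(frame, threshold=200):
    """Find centroids of bright blobs in a grayscale frame (toy IR tracker).

    A stand-in for blob tracking a la cv.jit: threshold, then 4-connected
    flood fill to group bright pixels into blobs, returning each blob's
    (x, y) centroid in scan order.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood fill this blob
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w and frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                mean_y = sum(p[0] for p in pixels) / len(pixels)
                mean_x = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((mean_x, mean_y))
    return blobs

# Toy 6x8 frame with two bright "LEDs"
frame = [[0] * 8 for _ in range(6)]
frame[1][1] = frame[1][2] = 255                 # LED 1
frame[4][5] = frame[4][6] = frame[3][5] = 255   # LED 2
print(led_centroids(frame))
```

With two centroids per frame, the pixel distance between them is exactly the quantity David's height-from-LED-spacing idea needs.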
Speaking of infra-red, it could be interesting to fold our prosthetic sense-organ "back" into the "ordinary" sensorium by using an IR bright source (assuming our photocells can pick it up). Then we have more amplitude for the event design.
On 2011-06-10, at 10:13 PM, Sha Xin Wei wrote: