Author Topic: Dual Wiimote 3DOF single IR source tracking  (Read 15349 times)

Offline nebulus

  • *
  • Posts: 7
  • Karma: +1/-0
    • View Profile
on: February 18, 2008, 06:21:12 AM
Hi All

Here is a very brief video of a demo I'm working on that uses dual wiimotes to track an IR source in 3D.

The blobs in the video represent the positions of the Wiimotes, and the lines coming from the blobs show the direction of the source.  The source is represented as a 3-axis cross, with a trail showing where it has been over the last 120 frames.
 
[youtube]http://www.youtube.com/watch?v=c9eOq-cSdSM[/youtube]

The tracking is pretty responsive, but a bit noisy because the data from the Wiimotes is jittery; it could do with some smoothing.

The application is running on Ubuntu 7.10 using the libwiimote library, FLTK and OpenGL.

Next steps: smooth the data, add more IR sources, and add more Wiimotes to increase the range and accuracy.
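For the smoothing step, one common approach (a sketch of the idea, not the code from the video) is an exponential moving average on each coordinate of the tracked position:

```python
# Exponential moving average: alpha near 1.0 tracks fast but keeps jitter,
# alpha near 0.0 is smooth but laggy.
class EmaFilter:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None  # last smoothed position, or None before first sample

    def update(self, sample):
        if self.state is None:
            self.state = list(sample)
        else:
            self.state = [self.alpha * s + (1 - self.alpha) * p
                          for s, p in zip(sample, self.state)]
        return self.state
```

The trade-off the thread runs into later applies here too: the lower the alpha, the smoother the trail but the more the cross lags behind the real source.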








Offline atomriot

  • *
  • Posts: 177
  • Karma: +16/-0
    • View Profile
Reply #1 on: February 18, 2008, 08:31:43 AM
Great job!

That looks pretty neat, but I am sure you are going to get battered (if you have not already) with a plethora of questions asking for that on Windows.

Do you have your source available somewhere or is it going to be available somewhere?

Details, details. Things to do. Things to get done. Don't bother me with details, just tell me when they're done.
--
James Lionel Price



Offline nebulus

  • *
  • Posts: 7
  • Karma: +1/-0
    • View Profile
Reply #2 on: February 18, 2008, 09:41:57 AM
The app in the video is integrated into a much bigger project that I can't easily open-source, but the ideas and code behind this little tracking section are pretty trivial.

If anyone is interested I can write up what I've done and post it ...



Offline Helza

  • *
  • Posts: 36
  • Karma: +3/-0
    • View Profile
Reply #3 on: February 18, 2008, 11:22:25 AM
If you could release just the code belonging to this part, that would be great :)



Offline nebulus

  • *
  • Posts: 7
  • Karma: +1/-0
    • View Profile
Reply #4 on: February 19, 2008, 05:29:22 AM
Rather than releasing the code, I think the algorithm is more interesting to release, and it's really very easy.

What I'm doing is a simple ray intersection calculation to approximate the position of the IR source.
Take a look at this page on finding the shortest line between two lines:

http://local.wasp.uwa.edu.au/~pbourke/geometry/lineline3d/

In my code I'm making a line that starts at the Wiimote's position, and calculating a direction vector from the Wiimote's IR data.  Because of the orientation of the Wiimote I rotate the direction vector (in my case by 45 degrees).  Adding the direction vector to the start position gives another point that I use as the end of the ray.  Repeat for the other Wiimote and you have two line definitions, which can be fed into the equation from the link above.

So from the drawing on that page you can imagine Wiimote 1 has the line P1->P2 and Wiimote 2 the line P3->P4.

That equation gives you back a line; all I'm doing is using the midpoint of that line as the position of the IR source.

So I'm effectively calculating the point directly between Pa and Pb.

This approximation is good enough as a starting point.  There is probably some extra information that could be derived from the length of the line the equation returns, which would help with error analysis.
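The shortest-line calculation from Bourke's page, plus the midpoint step, can be sketched in a few lines of Python (an illustration of the published formulas, not the poster's actual code):

```python
# Closest points between two 3D lines (P1->P2 and P3->P4), following
# Paul Bourke's "shortest line between two lines in 3D" derivation.
def closest_points(p1, p2, p3, p4):
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))

    d13, d43, d21 = sub(p1, p3), sub(p4, p3), sub(p2, p1)
    d1343, d4321 = dot(d13, d43), dot(d43, d21)
    d1321, d4343, d2121 = dot(d13, d21), dot(d43, d43), dot(d21, d21)

    denom = d2121 * d4343 - d4321 * d4321
    if abs(denom) < 1e-12:
        raise ValueError("lines are parallel")
    mua = (d1343 * d4321 - d1321 * d4343) / denom
    mub = (d1343 + d4321 * mua) / d4343

    pa = [p1[i] + mua * d21[i] for i in range(3)]  # closest point on line 1 (Pa)
    pb = [p3[i] + mub * d43[i] for i in range(3)]  # closest point on line 2 (Pb)
    return pa, pb

def midpoint(pa, pb):
    # Estimated IR source position: halfway between Pa and Pb.
    return [(pa[i] + pb[i]) / 2 for i in range(3)]
```

When the two rays actually intersect, Pa and Pb coincide and the midpoint is exact; the distance between Pa and Pb is the error-analysis hint mentioned above.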









Offline inio

  • Wiki Admin
  • *
  • Posts: 124
  • Karma: +5/-0
    • View Profile
    • my Wii Remote projects
Reply #5 on: February 20, 2008, 10:45:48 AM
How did you do the camera calibration?  Did you just solve for the extrinsic parameters, or also intrinsic, or even better, did you do a full photometric calibration of the cameras?  If you solved for any intrinsic parameters, how different did the two wii remotes end up?



Offline nebulus

  • *
  • Posts: 7
  • Karma: +1/-0
    • View Profile
Reply #6 on: February 22, 2008, 02:18:29 PM
Hi inio

Sorry, you lost me there a bit with all your intrinsics and extrinsics!
Could you describe what you mean in more detail?


I did very little in terms of calibration: the Wiimotes were placed at 45 degrees purely by eye.  I did some measurements to work out the Wiimote field of view, which turned out to be almost exactly the same for the three Wiimotes I have (41 degrees horizontal x 31 degrees vertical), and used this in the calculation of the ray directions.

The distance between the Wiimotes was roughly 30 cm, but this is almost irrelevant, as the source's position is calculated relative to the two Wiimotes' positions.
As long as the Wiimote positions and directions were set reasonably accurately, it works perfectly well.

I was very surprised by how well this very unscientific method worked!

I'm not sure how strict Johnny was with the maths in his original example application; I imagine he was more thorough than me, but in truth I doubt all the other people who have played with the demo have been as strict about getting the numbers correct.  I've never played with Johnny's demo, so I could be mistaken about the need to configure it with accurate measurements.

To be honest, the bigger problem was the noisy, jittery signal that the IR cameras produce.






Offline inio

  • Wiki Admin
  • *
  • Posts: 124
  • Karma: +5/-0
    • View Profile
    • my Wii Remote projects
Reply #7 on: February 22, 2008, 09:55:07 PM
Ah, I figured it was a pretty unscientific approach but you can always hope ;)*

From a computer vision perspective, cameras are described by "intrinsic parameters" (FOV, optical axis, and skew) and "extrinsic parameters" (position and orientation).  For more accurate applications, you also compensate for lens distortions that can't be represented as FOV, optical axis, and skew (typically modelled as a polynomial mapping from the measured distance off the optical axis to the actual distance off the optical axis).

*My research requires very accurately calibrated Wii Remote cameras, so I'm always hoping I'll find someone who's solved that problem for me.  I can't use traditional approaches because it isn't a "camera" and doesn't record images.
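The polynomial lens-distortion model described above can be sketched like this (a generic radial model; the coefficient names k1, k2 and their values here are illustrative, not measured Wiimote parameters):

```python
# Radial distortion: map a measured pixel position to its corrected
# position using a polynomial scale r' = r * (1 + k1*r^2 + k2*r^4),
# where r is the distance from the optical axis (cx, cy).
def undistort(x, y, cx, cy, k1, k2):
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale
```

With k1 = k2 = 0 the mapping is the identity; pixels further from the optical axis are corrected more strongly, which is the usual signature of barrel or pincushion distortion.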



Offline tarantula78

  • *
  • Posts: 6
  • Karma: +0/-0
    • View Profile
Reply #8 on: February 23, 2008, 01:53:30 PM
'Scuse me if this is a noob question, but if one Wiimote can track XYZ, why do you need two Wiimotes? Isn't one enough to capture the depth, drop and height?



Offline defray

  • *
  • Posts: 6
  • Karma: +0/-0
    • View Profile
Reply #9 on: February 24, 2008, 08:45:21 AM
@ tarantula:
The Wiimote cannot track light sources in 3D by infrared, only the other way round (the position of the Wiimote via the sensor bar) or by using the acceleration sensors, but they are too sluggish for reliable and fast tracking.
In infrared the Wiimote essentially only sees a dot of light on a 2D plane, but with two Wiimotes you get two different planes and you can easily calculate a 3D position.

Using this 3D tracking, one could achieve some kind of virtual input (like a keyboard) or other interactions... is somebody working on something like that? This could be used for all kinds of things you'd normally use a touchscreen (or even a multitouch screen) for, e.g. an image viewer.
It would be nice to do something on my TV this way, comfortably from the sofa, with just an IR glove or two.



Offline vadali

  • *
  • Posts: 30
  • Karma: +0/-0
    • View Profile
Reply #10 on: February 24, 2008, 10:05:32 AM
Hey,

In infrared the Wiimote essentially only sees a dot of light on a 2D plane, but with two Wiimotes you get two different planes and you can easily calculate a 3D position.

Could you post a link that explains how to get a 3D position from two different planes?

thanks
Vaadali



Offline nebulus

  • *
  • Posts: 7
  • Karma: +1/-0
    • View Profile
Reply #11 on: February 24, 2008, 02:02:05 PM
@vadali - take a look at my earlier post in this thread (Posted on: February 19, 2008, 06:29:22 AM).

What I'm doing is calculating the object's position in 3D using simple ray casting.
Knowing the field of view of the Wiimote (roughly 41x31 degrees), I can calculate the direction of the source relative to the Wiimote from the numbers it returns.
By comparing the two directions that the Wiimotes return, I can calculate where the source is in 3D; this is done using the shortest-line-between-rays test from the earlier post.

I recommend looking at a textbook section on frustums; the OpenGL red book has a nice section on exactly this type of concept...
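Mapping the Wiimote's reported (x, y) to a direction vector can be sketched as follows, assuming the camera reports pixels on a 1024x768 grid and using the 41x31 degree FOV measured earlier in the thread (a pinhole-camera approximation, not the poster's actual code):

```python
import math

# Wiimote IR camera: 1024x768 pixel grid, roughly 41 deg horizontal by
# 31 deg vertical field of view (as measured earlier in the thread).
RES_X, RES_Y = 1024, 768
FOV_X, FOV_Y = math.radians(41), math.radians(31)

def pixel_to_direction(px, py):
    # Normalise the pixel to [-0.5, 0.5] around the optical axis.
    nx = px / RES_X - 0.5
    ny = py / RES_Y - 0.5
    # Offsets on an image plane one unit in front of the camera.
    dx = nx * 2 * math.tan(FOV_X / 2)
    dy = ny * 2 * math.tan(FOV_Y / 2)
    dz = 1.0
    # Normalise to a unit direction vector in the camera's local frame.
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)
```

The resulting vector is in the camera's own frame; it still has to be rotated by the Wiimote's orientation (e.g. the 45-degree mounting angle mentioned above) and anchored at the Wiimote's position before feeding the two rays into the shortest-line test.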



@ tarantula

Each Wiimote sees the source from one perspective.  Using only one Wiimote and one source lets you calculate the relative direction of the source from the Wiimote, but gives no information about its distance.
A second Wiimote gives me the direction of the source from two perspectives, and using some simple ray calculations (given that I know the positions of the Wiimotes and their FOVs) I can calculate the position of the source with respect to the Wiimotes.

You always need three points of interest, two of which you must know the relative positions of with respect to each other.  For my demo I know the positions/orientations of the two Wiimotes, and I want to find the IR source's position relative to them.  In Johnny's head-tracking VR demo it's the other way round: the Wiimote is static, but you know the positions of the two IR sources on his glasses relative to each other.

It's effectively simple triangle geometry, where you know the length of one side of the triangle and one (or more) of its angles...
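In 2D, that triangle geometry reduces to the law of sines: knowing the baseline between the Wiimotes and the angle at which each one sees the source, the distances fall out directly (a sketch of the general idea, not the demo's code):

```python
import math

# 2D triangulation: two cameras on a baseline of known length, each
# reporting the angle to the source measured from the baseline.
def triangulate(baseline, angle_a, angle_b):
    # The three angles of a triangle sum to pi, so the angle at the source is:
    angle_c = math.pi - angle_a - angle_b
    # Law of sines: side opposite angle_b over sin(angle_b) equals
    # baseline over sin(angle_c), giving the distance from camera A.
    dist_a = baseline * math.sin(angle_b) / math.sin(angle_c)
    # Source position with camera A at the origin, camera B at (baseline, 0).
    return (dist_a * math.cos(angle_a), dist_a * math.sin(angle_a))
```

With both cameras seeing the source at 45 degrees across a baseline of 2, the source sits at (1, 1): equidistant between the cameras, one unit out.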


@inio - OK, I understand now what you mean by the intrinsic/extrinsic stuff!
I come from the 3D graphics and VR side rather than machine vision.
I'm not sure that Wiimotes are really high enough quality for what you want; mine all produce really jittery data, which is a real pain.  I can smooth it over time, but that introduces a big latency, which is a problem for the application I'm using it for...

Incidentally, given you're from the machine vision side of things: I'd like to do what the Wiimote does and convert an image into several blob samples of just x,y coordinates.  I've got a Vaio laptop with an integrated camera, and want to do something along the lines of this:

http://kotaku.com/351539/vr-head-tracking-for-the-ps3

I just need the library/algorithm to convert the image to blob data...
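What the Wiimote does internally - turning an image into a handful of (x, y) blob centroids - boils down to thresholding plus connected-component labelling. A minimal pure-Python sketch of the idea (any real library, such as the OpenCV suggestion below in the thread, will be far faster on live video):

```python
# Threshold a grayscale image (a list of rows of pixel values) and return
# the centroid of each connected bright region - roughly the kind of blob
# data the Wiimote's IR camera reports.
def find_blobs(image, threshold=200):
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                # Flood-fill this blob, collecting its pixels.
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                # Centroid = mean pixel position of the blob.
                n = len(pixels)
                blobs.append((sum(p[0] for p in pixels) / n,
                              sum(p[1] for p in pixels) / n))
    return blobs
```

An intensity-weighted centroid (weighting each pixel by its brightness) would give sub-pixel accuracy, which is what makes the Wiimote's reported coordinates finer than its sensor resolution.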



 





Offline vadali

  • *
  • Posts: 30
  • Karma: +0/-0
    • View Profile
Reply #12 on: February 24, 2008, 02:46:34 PM
@nebulus
For a library, check out OpenCV http://opencvlibrary.sourceforge.net/ - it has a lot of algorithms already built in.

I understood the algorithm, but since I am not that familiar with 3D graphics, I was more concerned with how to get the direction vector for each Wiimote: how do I take the (x,y) point I get from the Wiimote's plane and translate it into a vector?  If I get these two vectors, it will be simple to find the midpoint of their shortest line; I just don't know how to get the direction vectors.

thanks a lot,
vadali



Offline tarantula78

  • *
  • Posts: 6
  • Karma: +0/-0
    • View Profile
Reply #13 on: February 26, 2008, 03:15:08 PM
Ah, of course - you want 3D on one LED, not two. Thanks for the explanation :)



Offline steve6

  • *
  • Posts: 15
  • Karma: +0/-0
    • View Profile
Reply #14 on: June 06, 2008, 07:51:48 AM
Hi nebulus, did you receive my message?