3rd of February, 2012
Over the summer, I worked with the Advanced Interfaces Group at The University of Manchester on a research project investigating Phantom Limb Pain. Previously, Steve Pettifer and Toby Howard (the principal researchers) had tried VR and ping-pong-ball-style tracking to recreate a virtual world and 'mirror' the missing limb. The idea is to create a more immersive version of the Ramachandran Mirror Box: a way to try and unlearn what has been learnt.
Since the Kinect and cheaper VR goggles have become available, it seemed the right time to rebuild this system with modern technology in order to test which factors affect pain relief. Although a mirror box only costs about 2 pounds, you can't easily change its parameters or exercises, so there is definitely something worth investigating.
The technical aspects of the project were quite interesting. The first problem: how do you get from the Kinect to a moving person? The OpenNI team were the first to do this with the OGRE model Sinbad:
The technique is known as skinning. It works by attaching vertices to bones with weights. As the bones are transformed, the vertices follow them by an amount determined by the weight. Imagine a lower-arm bone moving up and down: the vertices in the lower arm are attached to it with high weights and follow it closely, whereas the vertices in the leg carry no weight for that bone and stay put.
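To make that concrete, here's what the weighting step looks like in code. This is just a minimal linear blend skinning sketch rather than the project's actual code, and the Vec3/Mat4 types and the per-vertex weight layout are made up for the example.

```cpp
#include <vector>

// Minimal stand-in types; a real project would use its maths library's
// own vector and matrix classes instead.
struct Vec3 { float x, y, z; };
struct Mat4 {
    float m[16]; // column-major, translation in m[12..14]
    Vec3 transformPoint(const Vec3& p) const {
        return { m[0]*p.x + m[4]*p.y + m[8]*p.z  + m[12],
                 m[1]*p.x + m[5]*p.y + m[9]*p.z  + m[13],
                 m[2]*p.x + m[6]*p.y + m[10]*p.z + m[14] };
    }
};

// One bone influence on a vertex: which bone, and how strongly.
struct Influence { int bone; float weight; };

// Linear blend skinning: the skinned vertex is the weighted sum of the
// bind-pose vertex transformed by every bone that influences it.
// boneMatrices[i] is assumed to already combine the bone's current
// transform with the inverse of its bind pose.
Vec3 skinVertex(const Vec3& bindPosition,
                const std::vector<Influence>& influences,
                const std::vector<Mat4>& boneMatrices)
{
    Vec3 result = { 0.0f, 0.0f, 0.0f };
    for (size_t i = 0; i < influences.size(); ++i) {
        const Influence& inf = influences[i];
        Vec3 moved = boneMatrices[inf.bone].transformPoint(bindPosition);
        result.x += inf.weight * moved.x;  // a lower-arm vertex has a high
        result.y += inf.weight * moved.y;  // weight for the lower-arm bone,
        result.z += inf.weight * moved.z;  // so it follows it almost exactly
    }
    return result;
}
```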
The problem here is that there are no real off-the-shelf APIs for this. The closest modelling toolkit is the FBX SDK, which has support for skinning, textures, animation and, indeed, anything you could possibly need for models. It's my go-to library for this sort of thing.
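To give a flavour of it, reading the skinning data back out of an FBX file goes something like this. It's a rough sketch using the newer FbxManager-style class names (older releases of the SDK prefixed everything with K); "model.fbx" and the single-mesh assumption are mine, and error checking is left out.

```cpp
#include <fbxsdk.h>

int main()
{
    // Boilerplate: create the SDK manager and import a scene.
    FbxManager*    manager  = FbxManager::Create();
    FbxIOSettings* settings = FbxIOSettings::Create(manager, IOSROOT);
    manager->SetIOSettings(settings);

    FbxImporter* importer = FbxImporter::Create(manager, "");
    importer->Initialize("model.fbx", -1, manager->GetIOSettings());

    FbxScene* scene = FbxScene::Create(manager, "scene");
    importer->Import(scene);
    importer->Destroy();

    // Assume the first child of the root node carries the mesh we want.
    FbxNode* node = scene->GetRootNode()->GetChild(0);
    FbxMesh* mesh = node->GetMesh();

    // Each skin deformer holds clusters: one cluster per bone, listing the
    // control points (vertices) it influences and their weights.
    for (int d = 0; d < mesh->GetDeformerCount(FbxDeformer::eSkin); ++d) {
        FbxSkin* skin = static_cast<FbxSkin*>(mesh->GetDeformer(d, FbxDeformer::eSkin));
        for (int c = 0; c < skin->GetClusterCount(); ++c) {
            FbxCluster* cluster = skin->GetCluster(c);
            int*    indices = cluster->GetControlPointIndices();
            double* weights = cluster->GetControlPointWeights();
            int     count   = cluster->GetControlPointIndicesCount();

            // The link node is the bone; its bind-pose transform is needed
            // when building the final skinning matrix.
            FbxAMatrix bindPose;
            cluster->GetTransformLinkMatrix(bindPose);

            for (int i = 0; i < count; ++i) {
                // indices[i] is a vertex index, weights[i] its weight for this bone.
            }
        }
    }

    manager->Destroy();
    return 0;
}
```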
OpenNI provides skeleton tracking, as most of us know by now. In addition, it provides an orientation matrix for each joint. Feeding these matrices into the FBX SDK's bone transforms, one can skin a model in the normal way. First challenge over!
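In OpenNI 1.x terms, pulling one of those matrices out looks roughly like the snippet below. It's written from memory rather than taken from the project, and it assumes a user generator that has already been calibrated and is tracking the user.

```cpp
#include <XnCppWrapper.h>

// Fetch the 3x3 orientation matrix for one joint of a tracked user.
// 'userGenerator' is assumed to be an xn::UserGenerator with the skeleton
// capability enabled and the user already calibrated and being tracked.
bool getElbowOrientation(xn::UserGenerator& userGenerator,
                         XnUserID userId,
                         XnFloat outMatrix[9])
{
    XnSkeletonJointOrientation joint;
    XnStatus status = userGenerator.GetSkeletonCap()
        .GetSkeletonJointOrientation(userId, XN_SKEL_LEFT_ELBOW, joint);

    if (status != XN_STATUS_OK || joint.fConfidence < 0.5f)
        return false;   // tracking lost or unreliable for this joint

    // joint.orientation holds a 3x3 rotation matrix as nine floats; copy it
    // out so it can be turned into the corresponding FBX bone's transform.
    for (int i = 0; i < 9; ++i)
        outMatrix[i] = joint.orientation.elements[i];

    return true;
}
```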
One thing the Kinect can't do is detect rotation about a bone's own axis. Imagine rotating your wrist to look at your palm: that motion cannot be detected, and it's an open problem at the moment. As my video shows, a gyroscope can help to fix that.
The next step was to create the visuals and immerse the user in the scene. Vuzix produce a set of reasonable headsets. The Vuzix VR920 sits in my toolbox most of the time and is superior to the Wrap models that Vuzix are now pushing. The problem is that the VR920 has been discontinued, which means the OS X drivers don't actually work properly when it comes to the gyros and accelerometers. That meant head tracking would be a problem.
Thanks to Adafruit Industries and SparkFun, I decided it would be possible to create a headset-agnostic head-tracking solution using XBee wireless modules. Adafruit provided the USB dev boards for driving the XBees and SparkFun provided the gyro unit, the Atomic IMU. We tried a few other boards, but this one seemed to work the best.
At the time, I was up north and away from London Hackspace. Thankfully, FabLab came in with their laser cutter. Rather than leave the XBee USB boards and the LiPo chargers exposed, I figured some 3D printing might be in order, so I built the box on the laser cutter and 3D printed cases for all the chargers and parts we needed.
Google SketchUp to 3D printing works quite well. Again, thanks to FabLab Manchester for helping out with that.
Converting a gyro and an accelerometer into orientation data is not a trivial task. At first, I thought just a gyro would be fine, but that's not the case at all. A gyro measures angular velocity (how fast it is rotating), not the orientation itself, and an accelerometer measures acceleration along a particular axis. To recover orientation, you need a known starting point and must then integrate the gyro readings over time, which means errors build up. These can be corrected with the accelerometer data, and that is fine, so long as you actually have a gravity reference for that axis. You certainly don't for yaw.
This is known as yaw drift. You can't correct for it with the accelerometer, because gravity acts along the Y axis, the up-down axis, so if you spin around that axis the measured gravity vector doesn't change. Our version doesn't have a compass to correct for it, which is definitely something we can improve on.
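A stripped-down complementary filter shows the problem nicely. This isn't the filter we actually shipped, just a sketch of why gravity can pull pitch and roll back into line while yaw only ever integrates; the names, axis convention and the 0.98 blend factor are all illustrative.

```cpp
#include <cmath>

// Toy complementary filter. Gyro rates are in radians per second, the
// accelerometer vector in g, dt in seconds, angles in radians.
struct Orientation { float pitch, roll, yaw; };

void update(Orientation& o,
            float gyroX, float gyroY, float gyroZ,   // angular rates
            float accX,  float accY,  float accZ,    // gravity + motion
            float dt)
{
    // 1. Integrate the gyro: every axis accumulates a little error.
    o.pitch += gyroX * dt;
    o.roll  += gyroY * dt;
    o.yaw   += gyroZ * dt;

    // 2. Gravity gives an absolute reference for pitch and roll
    //    (assuming Z points up when the sensor is level)...
    float accPitch = std::atan2(accY, std::sqrt(accX * accX + accZ * accZ));
    float accRoll  = std::atan2(-accX, accZ);

    // ...so blend towards it, trusting the gyro in the short term.
    o.pitch = 0.98f * o.pitch + 0.02f * accPitch;
    o.roll  = 0.98f * o.roll  + 0.02f * accRoll;

    // 3. Yaw has no gravity reference: spinning about the vertical axis
    //    leaves the accelerometer reading unchanged, so the integration
    //    error is never corrected. That is the yaw drift described above.
}
```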
Most of the code for this sort of thing actually comes from the Unmanned Aerial Vehicle world. Ultimately, I settled on the Direction Cosine Matrix approach with a set of Kalman filters. Eventually, I'd like to implement this IMU algorithm, as it's likely to improve things drastically. That said, the current version is still not too bad, though the yaw needs to be reset a little too often. We also tested the SparkFun Razor 6DOF, as there were quite a lot of good things said about it.
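The core of the Direction Cosine Matrix approach, with the drift-correction feedback left out, looks roughly like this. It's a sketch of the general technique (the small-angle update plus renormalisation step) rather than our implementation.

```cpp
// 3x3 rotation (direction cosine) matrix, row-major.
struct Mat3 { float m[3][3]; };

// One DCM propagation step: rotate the matrix by the small angle the gyro
// reports over dt, then re-orthonormalise so numerical error doesn't let
// it stop being a rotation. Drift correction (accelerometer or compass
// feedback into the gyro rates) is omitted here.
void dcmUpdate(Mat3& R, float wx, float wy, float wz, float dt)
{
    const float dx = wx * dt, dy = wy * dt, dz = wz * dt;

    // R <- R * (I + [w]x * dt), the small-angle update.
    Mat3 out;
    for (int r = 0; r < 3; ++r) {
        out.m[r][0] = R.m[r][0] + R.m[r][1] * dz - R.m[r][2] * dy;
        out.m[r][1] = R.m[r][1] - R.m[r][0] * dz + R.m[r][2] * dx;
        out.m[r][2] = R.m[r][2] + R.m[r][0] * dy - R.m[r][1] * dx;
    }

    // Renormalise: nudge the first two rows apart by half of their
    // dot-product error, rebuild the third row as their cross product,
    // then rescale each row back towards unit length.
    float err = 0.0f;
    for (int c = 0; c < 3; ++c) err += out.m[0][c] * out.m[1][c];

    float row0[3], row1[3], row2[3];
    for (int c = 0; c < 3; ++c) {
        row0[c] = out.m[0][c] - 0.5f * err * out.m[1][c];
        row1[c] = out.m[1][c] - 0.5f * err * out.m[0][c];
    }
    row2[0] = row0[1] * row1[2] - row0[2] * row1[1];
    row2[1] = row0[2] * row1[0] - row0[0] * row1[2];
    row2[2] = row0[0] * row1[1] - row0[1] * row1[0];

    float* rows[3] = { row0, row1, row2 };
    for (int r = 0; r < 3; ++r) {
        float dot = rows[r][0] * rows[r][0] + rows[r][1] * rows[r][1] + rows[r][2] * rows[r][2];
        float scale = 0.5f * (3.0f - dot);   // first-order approximation of 1/sqrt(dot)
        for (int c = 0; c < 3; ++c) R.m[r][c] = rows[r][c] * scale;
    }
}
```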
Head tracking and limb tracking complete, the next step was the graphics. Shadows were seen as important for giving the user some context about where they appear in the scene. I tried Screen Space Ambient Occlusion and basic shadow mapping, settling on the latter.
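Shadow mapping is the classic two-pass approach: render the scene's depth from the light's point of view into a texture, then compare against it when shading the camera's view. The framebuffer setup below is a generic OpenGL sketch rather than code from the project.

```cpp
#include <GL/glew.h>   // or the platform's own OpenGL headers

// Create a depth-only framebuffer to render the scene into from the
// light's point of view. The resulting texture is sampled in the main
// pass to decide whether each fragment is in shadow.
GLuint createShadowMap(int width, int height, GLuint& depthTexture)
{
    glGenTextures(1, &depthTexture);
    glBindTexture(GL_TEXTURE_2D, depthTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height,
                 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, depthTexture, 0);

    // No colour output is needed for the depth pass.
    glDrawBuffer(GL_NONE);
    glReadBuffer(GL_NONE);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return fbo;
}
```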
One problem, familiar to those who know a little about OpenNI and Kinect tracking, is the PSI pose. This is where calibration becomes an issue: how can you ask a person with one arm to stand with both arms up in the air? Fortunately, we can save a calibration from a similarly built person and reuse it with the participant. The SDK has improved since then, so this should be less of a problem now.
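If I remember the OpenNI 1.x API correctly, reusing a saved calibration looks something like the snippet below. The save/load-to-file calls only appeared in later 1.x releases, so the exact names want checking against the SDK version in use, and the file name is just an example.

```cpp
#include <XnCppWrapper.h>

// Reuse a skeleton calibration recorded from a similarly built volunteer
// so the participant never has to hold the PSI pose themselves.
// "calibration.bin" would have been written earlier with the matching
// SaveCalibrationDataToFile call.
void applySavedCalibration(xn::UserGenerator& userGenerator, XnUserID userId)
{
    if (userGenerator.GetSkeletonCap()
            .LoadCalibrationDataFromFile(userId, "calibration.bin") == XN_STATUS_OK)
    {
        // With the borrowed calibration applied, tracking can start
        // immediately, without the pose step.
        userGenerator.GetSkeletonCap().StartTracking(userId);
    }
}
```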
At the same time, we needed to create some games, since the participant needs to move their phantom limb in order to feel relief. I was joined on the project by a Manchester student, David Edwards, who came up with a few games such as Connect 4. At this point, I needed to consider documentation, APIs and the like: handing over my code to another person and asking them to run with it.
This project covered all the bases for me: working with talented people, making physical things, software engineering, computer graphics and, most importantly, making a difference in someone's life. Our first participant said his pain was going down whilst he was using our program. There's an awful lot left to explore, and I'm sure the AIG will come up with something great.
The paper has been accepted at the GRAPP conference this year:
S. Pettifer, T.L.J. Howard, B. Blundell, and D. Edwards. An immersive virtual environment for phantom limb pain rehabilitation. In Proceedings of the International Conference on Computer Graphics Theory and Applications (GRAPP), February 2012. Accepted for publication.
At the time, I was motorcycling between Southport and Manchester quite regularly. For a few months, I was the leather-clad biker-programmer guy. It did feel pretty damn cool! :)