Kinect + Parrot Drone = ?

We’d really like to do interesting things with indoor UAVs, like the fun little Parrot AR.Drone 2.0.  Outdoors, you have GPS (see ArduPilot), but indoors GPS isn’t accurate enough even if you could get good reception.

Enter the Kinect.  It returns a live 2D + depth image that can be converted to a true 3D point cloud quite easily, at 30 frames per second.  If you have a beefy enough indoor UAV, you can mount the Kinect on the UAV, but the Kinect’s weight, power, and processing requirements make this expensive.  I’m a fan of mounting the Kinect on the wall, off to the side, where it can see most of the room (it’s easy to rotate the 3D points into any coordinate system you like).
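The depth-to-point-cloud conversion is just the pinhole camera model run backwards, and the wall-mounted coordinate change is a rigid transform.  Here’s a minimal sketch in Python/numpy; the intrinsic parameters below are typical ballpark values for a Kinect depth camera, not calibrated ones, so treat them as assumptions:

```python
import numpy as np

# Approximate Kinect depth-camera intrinsics (assumed values --
# calibrate your own unit for real accuracy).
FX, FY = 594.2, 591.0   # focal lengths, in pixels
CX, CY = 339.5, 242.7   # principal point, in pixels

def depth_to_points(depth_m):
    """Convert a 480x640 depth image (meters) to an Nx3 point cloud.

    Pixels with no valid depth reading (0) are dropped.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_m > 0
    z = depth_m[valid]
    x = (u[valid] - CX) * z / FX
    y = (v[valid] - CY) * z / FY
    return np.column_stack((x, y, z))

def to_room_frame(points, R, t):
    """Rigidly transform camera-frame points into room coordinates.

    R is a 3x3 rotation matrix, t a 3-vector: the wall-mounted
    camera's pose in the room frame.
    """
    return points @ R.T + t
```

Once `R` and `t` are measured for the wall mount, every frame’s cloud lands in the same room-fixed coordinate system, which is what you want for controlling the UAV.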

Together, you have a sub-$500 UAV with a 3D position sensor.  The only problem?  The UAV is thin and black, so the Kinect can’t see it beyond about 10 feet, even with the big indoor hull attached:

Parrot AR.Drone 2.0 seen in Kinect field of view.
I’ve circled a Parrot AR.Drone flying in the Kinect depth image. No valid depth samples were returned, so it shows up as black (unknown depth).

One solution is to attach a small reflector, which can be as simple as a piece of paper.  Here, I’ve got a small 8×4 inch pink piece of paper attached to the front of the drone, to give the Kinect something to see.

With an 8×4 inch pink sheet of paper attached, the drone is clearly visible, with valid depth, in the Kinect image.
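Once the paper gives the Kinect valid returns, tracking can be as simple as comparing each frame against a background depth frame captured with no drone in the room, and taking the centroid of whatever moved closer.  A sketch of that idea (the intrinsics and thresholds here are assumed, not from my actual setup):

```python
import numpy as np

def drone_position(depth_m, background_m, fx=594.2, fy=591.0,
                   cx=339.5, cy=242.7, thresh=0.15, min_pixels=50):
    """Estimate the drone's 3D position from a depth frame.

    Flags pixels that are valid in both frames and at least `thresh`
    meters closer than the static background, then returns the 3D
    centroid of those pixels as (x, y, z) in meters -- or None if too
    few pixels changed (drone out of view, or paper not seen).
    """
    valid = (depth_m > 0) & (background_m > 0)
    moved = valid & (background_m - depth_m > thresh)
    if moved.sum() < min_pixels:
        return None
    v, u = np.nonzero(moved)          # pixel rows, columns
    z = depth_m[moved]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x.mean(), y.mean(), z.mean()])
```

The `min_pixels` floor keeps isolated depth noise from being reported as a drone; beyond the Kinect’s ~10 foot range for this airframe, the paper is what keeps `moved` populated at all.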

This works quite reliably, although the paper does mess up the vehicle dynamics somewhat.  Anybody tried this?  Would painting the indoor hull help the Kinect see the UAV?