OpenCV for multi-color tracking and localization

Localization, or “where is my robot?”, is really important, since you can’t tell the robot where to go unless you know where you’re starting.  It’s also a hard problem to solve reliably, especially indoors where you don’t have GPS.  For the 2013 CPS challenge, we used a Kinect to find our flying UAV, but we’d like to support ground vehicle localization too, and that’s not easy in a depth image.

I’ve done webcam color matching for decades, but I’ve always used my own homebrew image access and processing libraries, which makes my code hard to use, port, and modify, even for me!  This month, I’ve finally been learning a standard, portable video and image analysis library: OpenCV.  It even has a simple GUI library built in, so you can add sliders and such.

Here’s the program in action:

The red and green blobs to the right of my head are bright pink and green squares of paper, detected and highlighted in the image.  Note the bad matches on my red chair.

The basic process is to find all the pixels that match a given color, estimate their center of mass, then draw a smaller box around that center of mass for a second pass.  This produces a “trimmed mean” position, which is less sensitive to outliers.
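The two-pass "trimmed mean" idea can be sketched in a few lines of plain Python (the function names and default box size here are mine for illustration, not from the actual tracker, which does this on OpenCV images):

```python
def center_of_mass(points):
    """Average (x, y) of a list of matching pixel coordinates.
    Assumes the list is non-empty."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

def trimmed_centroid(points, box_radius=50):
    """Two-pass 'trimmed mean' position: first estimate the center
    from all matching pixels, then re-estimate using only the pixels
    inside a smaller box around that first estimate, which discards
    far-away outlier matches (like the red chair)."""
    cx, cy = center_of_mass(points)            # first pass: everything
    inside = [(x, y) for x, y in points
              if abs(x - cx) <= box_radius and abs(y - cy) <= box_radius]
    return center_of_mass(inside) if inside else (cx, cy)
```

A tight cluster of matches plus one stray pixel illustrates why this helps: the first-pass centroid is dragged toward the outlier, but the second pass throws the outlier away and lands back on the cluster.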

The output of the program is the average (X,Y) center-of-mass coordinates for each color, plus the number of pixels that matched.  If not enough pixels match, the target object probably isn’t visible.  The program has sliders to adjust the matching tolerances, and you can click the image to set a new target color in hue, saturation, value (HSV) colorspace.
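To make the HSV tolerance matching concrete, here is a stdlib-only Python sketch using `colorsys` (which maps H, S, and V all to the 0–1 range; OpenCV scales hue to 0–179 instead, and the real tracker does the thresholding with OpenCV, so treat the names, tolerances, and `min_pixels` cutoff here as my own illustrative choices):

```python
import colorsys

def hsv_match(rgb, target_hsv, tol=(0.05, 0.3, 0.3)):
    """True if a pixel's RGB color is within the HSV tolerances of
    the target.  Hue is circular (red sits at both 0 and 1), so the
    hue difference has to wrap around."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    th, ts, tv = target_hsv
    dh = abs(h - th)
    dh = min(dh, 1.0 - dh)                    # wrap hue distance
    return (dh <= tol[0] and
            abs(s - ts) <= tol[1] and
            abs(v - tv) <= tol[2])

def count_matches(pixels, target_hsv, tol=(0.05, 0.3, 0.3), min_pixels=30):
    """Count matching pixels; below min_pixels, report the target
    as probably not visible."""
    n = sum(1 for p in pixels if hsv_match(p, target_hsv, tol))
    return n, n >= min_pixels
```

The hue wraparound is the detail that's easy to get wrong: a slightly orange red and a slightly purple red are numerically far apart in hue even though they look nearly identical.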

If you have the target color set exactly right, and your webcam is working well, you can get accuracy better than 0.1 pixel!  But if the colors are dancing around, you might get 2-3 pixel variance in the location.  And if you have the colors wrong, or something else is the same color, you can get entirely the wrong location.

Because the apparent color of a reflective object like a sheet of paper depends on the ambient lighting, I need to allow loose color tolerances or else tweak the target colors to get a good match.  We should try using some diffused LEDs or color lasers to produce inherent emissive colors; back in 2008 David Krnavek used an LED in a white ping-pong ball for color tracking, with good results.

Latency seems good, and I get about 20 frames per second even on my low-end backup laptop and random webcam.  However, the OpenCV graphical output is fairly expensive, so I don’t do it every frame.
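The "don't draw every frame" trick is just a modulo counter in the main loop.  A minimal sketch, with the tracking and display steps stubbed out as parameters (the real loop pulls frames from a webcam and the GUI update is the OpenCV draw-and-show step):

```python
def run_tracker(frames, track, display, display_every=5):
    """Process every frame, but only pay the display cost on every
    Nth frame: tracking results stay per-frame while the expensive
    GUI update is skipped the rest of the time."""
    results = []
    for i, frame in enumerate(frames):
        results.append(track(frame))          # cheap: runs every frame
        if i % display_every == 0:
            display(frame)                    # expensive: 1-in-N frames
    return results
```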

Download the color tracker code on my GitHub.  The plan is to build something like this up into a reliable, web-accessible multi-robot tracking system!

Aero Balance Beam: Control Theory Demo

At our workshop this week, we did some hands-on testing of our control theory knowledge with an aerodynamically driven balance beam: basically just an electrically controlled fan that blows to lift a weighted board.  The idea was to capture the difficulties of UAV control in a simpler and slightly less dangerous system.  Little did we know what we were signing up for!  (Arduino code and data after the break.)
