Rover 5 platform test run

Mike just ordered a SparkFun Rover 5 chassis ($60), and Steven decided to test-drive it at 12V using his new quad-bridge driver board, hooked in turn to an Arduino Mega (for the interrupt lines) and an XBee for wireless serial communication.

http://www.youtube.com/watch?v=ZD30zdlazpc

It’s a nice little platform, big enough to put a laptop and some actuators on, and it moves authoritatively when driven at 12V.  It tracks quite straight even without using the onboard encoders.  Mark, from Mount Edgecumbe, suggested putting on Mecanum wheels, so the chassis can translate sideways too.
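When we do wire up the chassis encoders, reading them should just be a matter of attaching the Mega’s external interrupts.  Here’s a minimal sketch of the idea (the pin choices are placeholders, not our actual wiring, and it only counts one channel per encoder):

// Count ticks from one channel of each wheel encoder using external interrupts.
// Pins 18 and 19 are interrupt-capable pins on the Mega; placeholders only.
volatile long leftTicks = 0, rightTicks = 0;

void leftISR()  { leftTicks++;  }
void rightISR() { rightTicks++; }

void setup() {
  Serial.begin(115200);
  pinMode(18, INPUT_PULLUP);
  pinMode(19, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(18), leftISR, CHANGE);
  attachInterrupt(digitalPinToInterrupt(19), rightISR, CHANGE);
}

void loop() {
  noInterrupts();
  long l = leftTicks, r = rightTicks;   // copy atomically (long is multi-byte on AVR)
  interrupts();
  Serial.print(l);
  Serial.print(' ');
  Serial.println(r);
  delay(100);
}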

We’ll work on a decent mounting system, so all the electronics don’t fall out when the thing tips over!

Arduino Serial Latency & Bandwidth vs. Message Size

We’ve been using Arduinos for all our projects, and it was time I got around to benchmarking the serial communication performance.  It’s actually not very fast; even at the maximum baud rate of 115200 bits per second, delivered performance is only a little over 10KB/second each way, and it only hits this bandwidth when sending over 100 bytes at a time.

Arduino Uno to PC roundtrip serial communication bandwidth, as a function of message size.

The problem for small messages seems to be a 4 millisecond minimum roundtrip latency.  Messages over about 40 bytes seem to take several such latencies, so there’s a stair step pattern to the latency.  Paul Stoffregen says this is due to the Uno firmware’s 4.1 millisecond transmit timeout.

Arduino Uno to PC roundtrip serial communication latency, measured in milliseconds, for various message sizes.

Evidently, the Teensy (with USB direct to the chip, not an onboard USB-to-serial converter) gets about 1ms serial latency.  The same page reported the Duemilanove at a 16ms minimum (62Hz!).

Overall, this means you’re only going to get a 250Hz control rate if you’re shipping sensor data from an Arduino Uno up to a PC, and then sending actuator signals back down to the Arduino.  But 250Hz is enough for most hardware projects we’ve been thinking about.
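For reference, the Arduino half of a roundtrip benchmark like this can be as simple as an echo loop.  Here’s a minimal sketch (assuming 115200 baud; not necessarily the exact code behind the measurements above):

// Echo every byte straight back to the PC.  The PC timestamps the send
// and the completed reply to measure roundtrip latency and bandwidth.
void setup() {
  Serial.begin(115200);
}

void loop() {
  while (Serial.available() > 0) {
    Serial.write(Serial.read());
  }
}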

The other annoying problem?  After opening the serial port, the Arduino Uno takes 1.7 seconds to boot before it responds to serial commands.  Anything you send from the PC before that time seems to be lost, not buffered.  The fix is probably to have the Uno send the PC one byte at startup, so the PC knows the Uno is ready.
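One possible sketch of that handshake (the ready byte and its value are my own choice, not a standard):

// Send one "ready" byte as soon as the sketch starts running, so the
// PC knows the roughly 1.7 second boot delay is over.
void setup() {
  Serial.begin(115200);
  Serial.write('R');   // arbitrary "I'm awake" marker; the PC waits for this
}

void loop() {
  // normal serial command handling goes here
}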

Using Kinect + libfreenect on Modern Linux

Here’s how I got libfreenect working on my Ubuntu 12.04 machine, running Linux kernel 3.2.   Generally, I like libfreenect because it’s pretty small and simple, about 8MB fully installed, and gives you a live depth image without much hassle.  The only thing it doesn’t do is human skeleton recognition; for that you need the much bigger and more complicated OpenNI library (howto here).

Step 0.) Install the needed software:
sudo apt-get install freeglut3-dev libxmu-dev libxi-dev build-essential cmake usbutils libusb-1.0-0-dev git-core
git clone git://github.com/OpenKinect/libfreenect.git
cd libfreenect
cmake CMakeLists.txt
make
sudo make install

Step 1.) Plug in the Kinect, both into the wall and into your USB port. Both the front LED, and power adapter plug LED, should be green (sometimes you need to plug and unplug several times for this). lsusb should show the device is connected:
lsusb
… lots of other devices …
Bus 001 Device 058: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 001 Device 059: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 001 Device 060: ID 045e:02ae Microsoft Corp. Xbox NUI Camera

Step 2.) Run the libfreenect “glview” example code:
cd libfreenect/bin
./glview
Press “f” to cycle through the video formats: lo res color, hi res color, and infrared. The IR cam is very interesting!

The source code for this example is in libfreenect/examples/glview.c.  It’s a decent place to start for your own more complex depth recognition programs: equations to convert depth to 3D points here!
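The usual pinhole-camera conversion looks roughly like the snippet below.  The 11-bit raw-depth-to-meters fit and the camera intrinsics are common community approximations, not calibrated values, so treat this as a sketch:

struct Point3 { float x, y, z; };

// Convert one Kinect depth sample at pixel (u,v) into a 3D point in meters.
// raw11 is the 11-bit raw value from libfreenect's FREENECT_DEPTH_11BIT mode.
Point3 depthToPoint(int u, int v, int raw11) {
    // Approximate raw-to-meters fit (community calibration; assumption).
    float z = 1.0f / (raw11 * -0.0030711016f + 3.3309495161f);
    // Approximate depth-camera intrinsics for a 640x480 image (assumption).
    const float fx = 594.21f, fy = 591.04f;
    const float cx = 339.5f,  cy = 242.7f;
    Point3 p;
    p.x = (u - cx) * z / fx;
    p.y = (v - cy) * z / fy;
    p.z = z;
    return p;
}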

—— Debugging Kinect connection ——

Number of devices found: 1
Could not open motor: -3
Could not open device
-> Permissions problem.
Temporary fix:
sudo chmod 777 /dev/bus/usb/001/*

Permanent fix:
sudo nano /etc/udev/rules.d/66-kinect.rules

Add this text to the file:
 # ATTR{product}=="Xbox NUI Motor"
 SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02b0", MODE="0666"
 # ATTR{product}=="Xbox NUI Audio"
 SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ad", MODE="0666"
 # ATTR{product}=="Xbox NUI Camera"
 SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ae", MODE="0666"

sudo /etc/init.d/udev restart
——————
Number of devices found: 1
Could not claim interface on camera: -6
Could not open device
-> The problem: modern kernels include a video driver (gspca_kinect) that claims the Kinect camera.
Temporary fix:
sudo modprobe -r gspca_kinect

Permanent fix:
sudo nano /etc/modprobe.d/blacklist.conf
Add this line anywhere:
blacklist gspca_kinect
——————
If lsusb only shows the motor, not the audio or camera:
Bus 001 Device 036: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor

This means the Kinect is not powered via the 12V line!

Front green light blinking: Kinect is plugged into USB.
AC plug cable green light: 12V power is connected.

Solution: plug in the power cable!  If it is plugged in, unplug and replug it.


Kinect + Parrot Drone = ?

We’d really like to do interesting things with indoor UAVs, like the fun little Parrot AR.Drone 2.0.  Outdoors, you have GPS (see ArduPilot), but indoors GPS isn’t accurate enough even when you have good reception.

Enter the Kinect.  It returns a live 2D + depth image that can be converted to a true 3D point cloud quite easily, and at 30 frames per second.  If you have a beefy enough indoor UAV, you can mount the Kinect on the UAV, but the Kinect’s weight, power, and processing requirements make this expensive.  I’m a fan of mounting the Kinect on the wall, off to the side, where it can see most of the room (it’s easy to rotate the 3D points into any coordinate system you like).
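Getting the points into room coordinates is just one rotation and one offset per point.  A minimal sketch, with a placeholder mounting pose you would measure for your own wall:

struct Vec3 { float x, y, z; };

// Map a point from Kinect camera coordinates into room coordinates:
//   room = R * cam + T,  where R and T describe how the Kinect is mounted.
// The rotation and offset below are placeholders, not a real calibration.
Vec3 kinectToRoom(const Vec3 &cam) {
    static const float R[3][3] = {
        { 1, 0,  0 },
        { 0, 0, -1 },
        { 0, 1,  0 },
    };                                            // example: camera pitched 90 degrees
    static const Vec3 T = { 0.0f, 2.0f, 0.0f };   // example: mounted 2 m up the wall
    Vec3 room;
    room.x = R[0][0]*cam.x + R[0][1]*cam.y + R[0][2]*cam.z + T.x;
    room.y = R[1][0]*cam.x + R[1][1]*cam.y + R[1][2]*cam.z + T.y;
    room.z = R[2][0]*cam.x + R[2][1]*cam.y + R[2][2]*cam.z + T.z;
    return room;
}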

Together, you have a sub-$500 UAV with a 3D position sensor.  The only problem?  The UAV is thin and black, so the Kinect can’t see it beyond about 10 feet, even with the big indoor hull attached:

Parrot AR.Drone 2.0 seen in Kinect field of view.
I’ve circled a Parrot AR.Drone flying in the Kinect depth image. No valid depth samples were returned, so it shows up as black (unknown depth).

One solution is to attach a small reflector, which can be as simple as a piece of paper.  Here, I’ve got a small 8×4 inch pink piece of paper attached to the front of the drone, to give the Kinect something to see.

With an 8 by 4 inch pink sheet of paper attached, the drone is clearly visible with depth in the Kinect image.

This works quite reliably, although the paper does mess up the vehicle dynamics somewhat.  Anybody tried this?  Would painting the indoor hull help the Kinect see the UAV?

The Arduino-Based Dual Relay Control Board

This project allows us to use an Arduino to control any AC-powered device, including lamps and small heaters.  It’s limited to 20A, but a lot can be done with 20A.  The picture below shows it running two 12V PC case fans.  The Arduino Uno controlling this device is located underneath the board (out of view in the picture).
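On the Arduino side, switching a relay channel is just a digital output.  Here’s a minimal sketch with hypothetical pin numbers (check the datasheet below for the board’s actual control header):

// Toggle two relay channels once per second.
// Pins 7 and 8 are hypothetical; use whichever Arduino pins the relay
// board's control header is actually wired to.
const int RELAY1 = 7;
const int RELAY2 = 8;

void setup() {
  pinMode(RELAY1, OUTPUT);
  pinMode(RELAY2, OUTPUT);
}

void loop() {
  digitalWrite(RELAY1, HIGH);   // channel 1 on
  digitalWrite(RELAY2, LOW);    // channel 2 off
  delay(1000);
  digitalWrite(RELAY1, LOW);
  digitalWrite(RELAY2, HIGH);
  delay(1000);
}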

A video of this device in operation may be found here:

http://www.youtube.com/watch?v=7rWQjZVRCU8

Datasheet, parts list, and Eagle files for this device can be found here.  The GNU General Public License applies to this device and its files.  Please let us hear your comments or questions on this device!

Datasheet:

Relay Board DataSheet

Parts List:

Parts List for Relay Board

Eagle Files:

Relay Board V3


Meet the Team: Steven

I am a Computer Engineering graduate student working toward my PhD.  My focus is on embedded systems, wireless sensor networks, robotics, and networked robots.  My Master’s thesis covered the construction and programming of a small network of tiny robots that could communicate with each other and explore a model building in a cooperative, coordinated manner.

 A video of a successful exploration using 3 robots can be seen here:

http://www.youtube.com/watch?v=GipvllwpOpU

Before I built networks of these robots, I had built just one robot and used it for a college competition known as Micromouse.  Micromouse is a competition in which a robot must explore, map, solve, and then race through a small 16 cell x 16 cell maze.  The robot knows only the size of the maze and the locations of the start and finish cells when it begins its run.  It must start by exploring and mapping the maze to find routes from start to finish, then race along the route that will give it the best time.  The robot with the best time wins the competition.  My team and I have taken 1st place in 5 of the last 6 years at the NW/NE area competitions, and took 2nd the other year.  A video of our winning run from year 5 can be found here:

http://www.youtube.com/watch?v=a11M-kZXASA&feature=relmfu

The exciting speed run occurs at time 3:40, so zoom ahead to that point if you want to skip the slower mapping run.
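For anyone curious how mazes like this get solved, the classic approach is a flood fill: label every cell with its distance to the goal, then always drive toward a neighbor with a smaller label.  Below is a minimal sketch of the flood step over a 16x16 grid, with a stubbed-out wall map; it illustrates the idea, and is not necessarily the code our robot ran:

#include <queue>
#include <cstring>
#include <utility>

const int N = 16;
int dist[N][N];   // distance (in cells) from each cell to the goal

// Stub: replace with a lookup into the robot's mapped walls.
bool wallBetween(int r, int c, int nr, int nc) { return false; }

// Breadth-first flood fill outward from the goal cell.
void floodFill(int goalR, int goalC) {
    memset(dist, -1, sizeof(dist));               // -1 means "not reached yet"
    std::queue<std::pair<int,int> > q;
    dist[goalR][goalC] = 0;
    q.push(std::make_pair(goalR, goalC));
    const int dr[4] = { 1, -1, 0, 0 }, dc[4] = { 0, 0, 1, -1 };
    while (!q.empty()) {
        int r = q.front().first, c = q.front().second;
        q.pop();
        for (int i = 0; i < 4; i++) {
            int nr = r + dr[i], nc = c + dc[i];
            if (nr < 0 || nr >= N || nc < 0 || nc >= N) continue;
            if (wallBetween(r, c, nr, nc)) continue;   // respect known walls
            if (dist[nr][nc] != -1) continue;          // already labeled
            dist[nr][nc] = dist[r][c] + 1;
            q.push(std::make_pair(nr, nc));
        }
    }
}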

For years I’ve been fascinated with the idea of functional art: devices that perform useful tasks yet look good while they do it, or devices that are mostly artistic in nature yet also have some function.  A few years ago I did my first functional art design: the Digilog clock, a clock with an analog design that uses LEDs to represent the hands of the clock.

Lately I’ve also been growing increasingly fascinated with UAVs, an outgrowth of my years as a private pilot and ultralight flight instructor.  I’ve been informally trained to operate the Aeryon Scout, a nimble and easy-to-fly quadcopter, and have been experimenting with open source hardware and software solutions, specifically ArduPilot, the Arduino-based open source autopilot system.  I look forward to working more with these systems in the years ahead.

Meet the Team: Mike

I’m Mike, and I too have a soft spot for robots!  I’m a student seeking a degree in Computer Science at the University of Alaska Fairbanks.

Background

  • Born and raised in Fairbanks, AK.
  • Got my first computer when I was 4.
  • Took my first computer apart when I was 5.
  • Started programming when I was 8.
  • Started building robots when I was 15.

Looking forward to building some new hardware and software!

“TriloBYTE” two-wheeled robot chassis

I fabricated a quick two-wheeled robot chassis tonight, which I’m calling TriloBYTE because of the rounded front.  It’d look more like a Cambrian-era Trilobite if I had the USB cable trailing out the back, but the scuttling motion is definitely Trilobitian.

Red plate with wires and wheels
The TriloBYTE chassis

The chassis fits in a 7″ diameter circle, and is made from 1/2″ thick red HDPE cutting board, scroll saw cut following my chassis template (PDF or SVG).  I always draw up a template in Inkscape and print and glue it down before cutting, drilling, or milling–it’s so much easier to build stuff if you already know where everything is going!

It drives nicely when powered by two tiny Pololu 1:100 gearmotors and 42x19mm wheels. The motors drop into milled slots in the chassis, and are held down with 1/2″ pan head self tapping screws (with 3/32″ holes predrilled).

The motor mount for TriloBYTE. It's just a milled slot in the HDPE, with two pan-head screws holding the motor down.

The motor controller is a SparkFun Ardumoto shield, which seems to work fine when powered only by the Arduino’s 5V line.  With bigger motors I’m sure I’d be stressing the USB connection, but these tiny motors work fine–the stall current is only 200mA or so.

I definitely need mechanical strain relief on the main USB cable, because I’m finding myself putting a lot of stress on it just bumping around. You could definitely argue for putting the Arduino in backwards, so the USB cord trails behind. Surprisingly, the USB cable doesn’t seem to drag the robot around too badly, at least until it tries to climb its own cable. The motors definitely have enough power to hop the cable, but not with the open-loop PWM setup I have.
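The drive code itself is about as simple as open-loop PWM gets.  A minimal sketch, assuming the usual Ardumoto pin mapping (direction on pins 12 and 13, PWM on pins 3 and 11; check your shield’s silkscreen if yours differs):

// Open-loop differential drive through an Ardumoto-style shield.
// Assumed pin mapping: DIR A = 12, DIR B = 13, PWM A = 3, PWM B = 11.
const int DIR_A = 12, DIR_B = 13;
const int PWM_A = 3,  PWM_B = 11;

void setup() {
  pinMode(DIR_A, OUTPUT);
  pinMode(DIR_B, OUTPUT);
}

// left and right are -255..255; the sign sets each motor's direction.
void drive(int left, int right) {
  digitalWrite(DIR_A, left  >= 0 ? HIGH : LOW);
  digitalWrite(DIR_B, right >= 0 ? HIGH : LOW);
  analogWrite(PWM_A, abs(left));
  analogWrite(PWM_B, abs(right));
}

void loop() {
  drive(150, 150);    // scuttle forward
  delay(1000);
  drive(150, -150);   // spin in place
  delay(500);
}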

The cat seems fascinated by it!

My cat sniffing at the TriloBYTE robot. No cats or robots were harmed in the making of this photo. I'm hoping I won't find the robot clawed to shreds tomorrow morning!

Next up: some sensors, and real control software.  Using Firmata to test out new hardware is sure easy!


Meet the Team: Dr. Lawlor

Hi! I’m Dr. Orion Lawlor, and I love robots!  I’m an assistant professor of Computer Science at the University of Alaska Fairbanks, and have helped Dr. Bogosyan set up the CYBER-Alaska project since the basic idea back in 2009.

Facts about Dr. Lawlor:

  • Grew up in Glennallen, Alaska eating moosemeat and salmon, and donating blood to mosquitos.
  • In a construction frenzy during the summer of 2009, built a 2,000 square foot shop with help from his dad Tom.
  • Can weld with stick, MIG, or gas; cast aluminum, zinc, or urethane; cut steel on a manual mill or CNC lathe; and drive heavy equipment… all without leaving his driveway!

I’ve worked on a lot of different computer hardware and robotics projects over the years:

  • From 2009 through today, I’ve worked with a really broad array of students building a solar-powered robot to drive around Greenland.  We’re still waiting on NSF funding for the full build, but work continues on this project in my spare time.
  • In 2007, 2008, and 2010, I worked with some great students building underwater robots for the worldwide MATE ROV contest.   I used ever more complex PIC microcontrollers for this project: in 2007 I used a network of 16F676 chips running software serial code, in 2008 I upgraded to the PICkit 2 programmer and 16F690 chip for more pins and hardware serial communication, and in 2010 I used the 18F2455 for direct onboard USB.
  • In the early 2000s, I wrote usb_pickit, the first open-source driver for the PICkit 1 programmer.  I learned a lot about hardware, including how to laser print and iron on printed circuit boards.
  • Back in 1998, I soldered together an ISA card by hand, to collect analog data at a higher rate than I could get over a serial line.  The card actually still works, although the ISA slot is so old, I can only plug it into an equally ancient machine!

My “day job” is writing software and building high performance simulations, so I’m excited to combine my fabrication, electronics, and software experience to build cutting-edge systems!

Howto: Arduino + Firmata = Easy Hardware!

I’ve been playing with microcontrollers for over a decade now, and have hands-on experience with PIC, ARM, 68HC11, and MSP430 devices.  Usually, it’s a long slow road where you (1) pick a processor to buy, (2) pick a device programmer, (3) pick a compiler, (4) read the processor documentation, (5) find/fix example code until it compiles, (6) program the chip, (7) figure out why the lights aren’t blinking.  It’s usually very painful, takes a few days at the very least, and every new task or chip is just more work.

But this is easy!

  1. Buy an Arduino Uno R3 for about $30.  Lots of places have the hardware, including Amazon (and cable) or SparkFun (and cable).  The cable is an ordinary USB A to B cable, with the thick square end like on most USB printers.
  2. Download and unzip the Arduino 1.0 IDE.  (On Linux, you’ll also need to “sudo apt-get install gcc-avr avr-libc” to get the compiler.)
  3. Open the IDE’s “Drivers” folder.  Right-click “Arduino UNO R3.inf” and hit “Install”.  Plug in the Arduino, and it should show up as a serial port (something like COM3 on Windows, or /dev/ttyACM0 on Linux).
    1. The very similar Arduino Duemilanove 2009 board doesn’t need drivers; it shows up as an ordinary FTDI USB to serial device.
  4. Start up the IDE by double clicking Arduino.exe, and:
    1. Choose File -> Examples -> Firmata -> StandardFirmata.
    2. Hit File -> Upload.  The TX and RX LEDs will flicker as the device is programmed.
    3. The Arduino now responds to the serial Firmata command protocol.
  5. Download the Firmata Test Program, run it, and choose the “Port” from the menu (like COM3 or ttyACM0).
  6. Click pin 13 on and off, and watch the LED blink!  You can set any pin to input (reading low or high voltage) or output (producing low or high voltage), and many pins have other functions available like analog input, PWM (pulse width modulation), or servo control.  Just click and drag to interact with all the pins!
    Pin descriptions for Firmata connected to Arduino.
    Firmata Test showing all the pins for a live connected Arduino.

The beautiful thing about this is you don’t have to figure out how to enable analog inputs, initiate ADC conversions, correctly set the PWM control registers, or set interrupt modes–it’s pure plug and play!
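If you later outgrow the GUI and want to talk to the board from your own program, the Firmata protocol itself is just a few bytes over the serial port.  Here’s a rough Linux sketch that blinks pin 13 by writing raw Firmata messages (the port name is an example; see the Firmata protocol documentation for the full message set):

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main() {
    int fd = open("/dev/ttyACM0", O_RDWR | O_NOCTTY);   // your port name may differ
    if (fd < 0) return 1;
    termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetspeed(&tio, B57600);     // StandardFirmata's default baud rate
    tcsetattr(fd, TCSANOW, &tio);

    sleep(2);                     // wait out the Uno's boot delay

    unsigned char setOutput[] = { 0xF4, 13, 1 };   // SET_PIN_MODE: pin 13 -> OUTPUT
    write(fd, setOutput, sizeof(setOutput));

    for (int i = 0; i < 10; i++) {
        // Digital message for port 1 (pins 8..13); bit 5 of the bitmask is pin 13.
        unsigned char on[]  = { 0x91, 0x20, 0x00 };
        unsigned char off[] = { 0x91, 0x00, 0x00 };
        write(fd, (i % 2) ? on : off, 3);
        sleep(1);                 // LED toggles once per second
    }
    close(fd);
    return 0;
}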