CYBER-Alaska 2013 Workshop Notes

August 5-10, IARC 417

Attendees

Teachers:

  • Larry Ehnert, Lathrop HS.  His Digital Electronics (PLTW) class took the CPS challenge in 2013.  Needs to leave at the end of Tuesday.
  • Mark Nance, Mt. Edgecumbe HS.  His Robotics 1 class took the CPS challenge in 2013 remotely, doing teleconference 2-3 times per week.
  • Mike Backus, Mat-Su homeschool math teacher.  He worked with Mike & Steven in the 2013 ASRA module.  Needs to leave Friday.
  • Mariya Shapran, UAF Alaska Satellite Facility working with Jeremy to do Effie Kokrine robotics outreach.  The student assistant there is Clay Allen, a Lathrop HS grad and FIRST veteran.

Students:

  • Ben Neubauer, UAF EE graduate student (got EE Bachelor’s in 2013).  Working on smart house project.  Leaving in the fall.  Using PCduino to host a webserver for building automation.
  • Bindu Gadamsetty, UAF EE PhD student of Seta.  Self funded, working on UAV/ground vehicle collaboration for a thesis topic.
  • Mike Moss, UAF CS BS/MS student.  Built webserver for 2013 challenge.
  • Shaun Bond, UAF CS BS/MS student.  Working on Kinect and UAV.
  • Ahmet Kuzu, Turkey ITU PhD student.  Working on quadcopter control for his research.

Faculty & Staff:

  • Seta Bogosyan, CYBER-Alaska principal investigator.
  • Rex Estrada, CYBER-Alaska project manager.
  • Orion Lawlor, UAF Computer Science.

Topics for 2013-2014 Challenge

The argument is that a quadcopter would be compelling: both Mikes feel that quadcopter construction would be more engaging for students than a ground vehicle.

Programming could actually be somewhat simpler than for a ground vehicle (no obstacles to avoid) *if* stabilization is already implemented.

Topics for Workshop

  • Quadcopters [Orion, Mike]
  • I2C interfacing [Mike]
  • RGB video analysis [Mike]

Lessons Learned from 2012-2013

For 2013-2014, we definitely need to do less programming, and more hardware: circuit design and fabrication.  Our spring 2013 competition ended up virtually all software due to the tight schedule.  Mike’s simulated-robot web pages were really useful for teaching students robot control, and Mt. Edgecumbe got a lot of good content from the videos.

“Things that got students up and moving in the classroom were more valuable than lectures.”  (Mark)   Hands-on projects are much more useful than lecture.

“Having grad students in the classroom was one of the strengths of the program.”  (Larry)

“These are kids.  They have to *move* now and then!”  (Mike B.)

“The videos are like 10x more valuable than Skype.”  (Mike B.)  One big advantage of the videos is that a student can get information about sensor interfacing *when* they’re working on and interested in sensors, which isn’t the same time for every student.

If the rovers don’t work 100%, we can’t use them in the classroom: we lose students when there’s a hardware issue, even if we can swap parts to fix it immediately.  Much simpler hardware (fewer encoders and motors) might simplify students’ learning process.  From ASRA: we lost basically all the kids when the rovers didn’t work with servos unplugged.  Was this capacitance?

A student said they could make a better robot with scotch tape and cardboard, so we let them!  Students naturally divided into hardware people, doing foam and duct tape work, and software people, doing programming.  Students were actually able to build a working robot from foam.

Regarding 3D printers, one big barrier is the complexity of 3D CAD and the multiple revisions required.  But beyond that, surprisingly, even kids who know the tools mostly don’t seem to actually need anything built.  Only the senior design class was able to make useful parts for a real-world problem.  By contrast, the laser cutter was more compelling, since students could just stick in a phone and engrave their name.

For ASRA, two students worked on the hovercraft, two on mice, one on a spinning LED, two kids on an R/C monster truck, and two kids scratch-built foam-body ground robots.  The advantage of cutting kids loose is that motivated kids can work on things they care about; a top-down task list didn’t excite people.  For next year, Mike Backus is planning on standardizing the basic platform, then having the kids spec sensors and add a 3D sensor package on the front.

Robotics Hardware and Software

Kinect

The Microsoft Kinect sensor is an infrared-based distance sensor.  It returns a 640×480 array of distance values, accessible via libfreenect.  Typically you want to recognize objects like robot parts or the field; this is a really hard problem in general.  We simplified the Kinect data analysis problem for our spring 2013 challenge by treating anything inside a 3D box over the field as *being* the UAV.
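The 3D-box trick takes only a few lines.  A minimal sketch, not our actual challenge code: the `Point3`, `Box3`, and `findUAV` names are made up, and it assumes the depth pixels have already been converted to 3D points in field coordinates.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One Kinect depth pixel, already converted to 3D field coordinates (meters).
struct Point3 { float x, y, z; };

// Axis-aligned 3D box hovering over the field; anything inside it is "the UAV".
struct Box3 {
    float xmin, xmax, ymin, ymax, zmin, zmax;
    bool contains(const Point3 &p) const {
        return p.x >= xmin && p.x <= xmax &&
               p.y >= ymin && p.y <= ymax &&
               p.z >= zmin && p.z <= zmax;
    }
};

// Average all points inside the box to estimate the UAV's position.
// Returns false if no points fell inside (UAV not visible this frame).
bool findUAV(const std::vector<Point3> &cloud, const Box3 &box, Point3 &out) {
    float sx = 0, sy = 0, sz = 0;
    int n = 0;
    for (const Point3 &p : cloud)
        if (box.contains(p)) { sx += p.x; sy += p.y; sz += p.z; n++; }
    if (n == 0) return false;
    out = { sx / n, sy / n, sz / n };
    return true;
}
```

The fragility is all in choosing the box: it has to sit above the field walls and below the ceiling, so only the UAV can ever be inside it.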

Experience: Dr. Lawlor, Bond, Moss.

UAV Hardware

Ahmet has built some APM UAVs, and finds vibration is an important factor–typically the APM is nonrigidly mounted on rubber band mounts, and the motors also are bolted onto O-rings, to absorb vibration.  Steven didn’t need to do anything special about vibration on his hex, although his APM is velcro’d down, so the high frequency vibration might not transmit through.

Onboard Robot Controllers

Basically every robot needs an onboard controller of some sort.

  • No controller onboard: RC transmitter/receiver pair for control, and an FPV system for video.  This could be full teleop, or automated to any extent via ground-station compute power.  Some FPV systems are fairly heavy; one $80 FPV TX/RX pair is about 100 grams, since it includes its own high-power radio.
  • KK2.0 sits between the RC RX and the UAV’s motor controllers.  It’s got gyros and accelerometers, and is built to stabilize a variety of copter geometries.  It has an Arduino-style AVR CPU, 5 input channels, FTDI, and 8 total output channels.  Ahmet has used the KK1.0, and found the expansion fairly limited.  Weight is 55 grams; cost is $30.
  • APM2.5 (or the generic copy HKPilot) also sits between an RC RX and the motor controllers, but it adds GPS and a much more sophisticated software stack.  It can’t really be used indoors: it needs GPS to even arm (after firmware 2.92).  It can take a while to tune the many control parameters for your custom UAV hardware, and many failure modes can result in a crash; typically people set up a testbed where they can fly safely.
  • Arduino is quite slow, but has tons of expansion shields that are very easy to use.  Hooking up a camera to an Arduino is hard, and the data rate is so low (several seconds per frame) that it’s more slideshow than video.
  • Embedded single-board computer options:
    • PCduino is an ARM Linux embedded machine that can hook up sophisticated hardware like a USB webcam, but also has Arduino-style pins for direct motor control, analog in, PWM, etc.  Mike broke the Ethernet port off his board, and we had to scramble to find an HDMI monitor to debug a network problem.  25 grams and $50.
    • BeagleBone Black is another ARM Linux embedded computer.  It has two huge rows of double-wide GPIO pins on each side.
    • Raspberry Pi is an ARM Linux embedded computer that can handle a camera, either the “Pi Eye” or a USB webcam.  Some people really love it (Noah Betzen), others think it’s a pretty typical SBC.
  • The Parrot drone has an onboard ARM processor, HD video, discussed below.  Total platform mass is 400g (payload is very small, under 50g), cost is $300.
  • Any cellphone or tablet has onboard video, a beefy ARM processor, and a variety of network connectivity.  Dr. Lawlor likes the idea of a flying phone, because it’s a friendly and self-contained control system.

Parrot Customization

The Parrot drone is actually a Linux machine, with a master control program “program.elf” doing the flight control.  It runs as a wifi access point, and you can connect via an iOS or Android app.  Folks built an open-source flight control program for it back in 2011.  ROS supports “PTAM” visual navigation from the drone directly (see below).  There’s also a USB host port, and people have soldered an Arduino to the Parrot serial port too.  We used the Parrot successfully in spring 2013, but Mike’s Parrot control program Falconer only commands the target orientation; we didn’t do anything with the onboard control.

KK2-Based UAV System

A possible replacement for the Parrot is to build our own UAV.  This probably wouldn’t save money, but would open up the control system for students to play with.  The basic control system would be:

  • XBee communications.
  • XBee shield.
  • Arduino to decode comms, read sensors, send out servo signals for flight control.
  • KK2 stabilization and control board.
  • R/C motor controllers.
  • Brushless motors.
  • Props.

Video could be via an FPV system: typically you get analog video out, which would need to be digitized.  Video could also be via a cellphone, which is already digital.

Computer Vision

Identifying a color from a video image is a fairly straightforward pixel-by-pixel operation.  It’s a little tricky when lighting conditions are variable (the same reflectance appears as a variety of different brightnesses depending on incoming illumination), but can be made to work reliably, especially when using brightly colored fluorescent targets.  Dr. Lawlor has done color-based target tracking for years, with his own homebrew Linux-based software stack.
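The pixel-by-pixel idea can be sketched as a channel-ratio test plus a centroid.  This is a simplified illustration, not Dr. Lawlor’s actual stack: the `trackRed` name and thresholds are invented, and real code would also normalize for illumination.

```cpp
#include <cassert>
#include <vector>

struct RGB { unsigned char r, g, b; };

// Scan a w×h frame (row-major) and return the centroid of "bright red" pixels:
// red above a brightness floor and at least twice the other two channels.
// Returns false if no pixel matched (no target visible).
bool trackRed(const std::vector<RGB> &frame, int w, int h, float &cx, float &cy) {
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            const RGB &p = frame[y*w + x];
            if (p.r > 128 && p.r > 2*p.g && p.r > 2*p.b) { sx += x; sy += y; n++; }
        }
    if (n == 0) return false;
    cx = (float)sx / n;
    cy = (float)sy / n;
    return true;
}
```

The ratio test (red at least twice green and blue) is what buys lighting tolerance: a dim red target and a bright red target both pass, while white glare does not.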

A much more extensible video analysis software stack is OpenCV.

There are lots of more sophisticated vision algorithms, including PTAM.

Robot Communication

Serial comms were a problem in the classroom in spring 2013.  Mike Moss built a cool packetized self-synchronizing “SerialSync” library.  One correctable issue: on the PC side, we were trying to get the students to use “Serial.write(&byteData,1);”, or the multi-byte “Serial.write(&intData,2);”, instead of the simple value-based “Serial.write(data);”.

Using OrionDev-gk12, “Serial.begin” and “Serial.write” both work fine from the PC side, as long as there’s only one COM port.  If there are multiple ports, call “Serial.Open("COM15");” and “Serial.Set_baud(9600);” first.

For Bluetooth, curiously an Android phone connects fine, and the Arduino IDE also connects fine, but for some reason our older OrionDev serial libraries don’t seem to connect properly from Linux.  Bluetooth seems to work down a pretty long hallway (about 20m range).

Mike Backus uses a single call to “readBytes” to receive two-byte messages, and this works reliably for him; it doesn’t seem to ever desynchronize in practice.
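For a sense of what a packetized, self-synchronizing scheme involves, here is a minimal sketch.  The framing (sync byte 0xA5, XOR checksum) and the `Decoder` name are invented for illustration; this is not Mike Moss’s SerialSync code.

```cpp
#include <cassert>

// Minimal self-synchronizing serial framing: each packet is
//   [0xA5 sync byte][cmd][value][checksum = cmd ^ value].
// The receiver hunts for the sync byte and drops frames with bad
// checksums, so a dropped or inserted byte costs at most a packet or two.
struct Decoder {
    unsigned char buf[4];
    int have = 0;
    // Feed one received byte; returns true when a valid packet completes.
    bool feed(unsigned char b, unsigned char &cmd, unsigned char &value) {
        if (have == 0 && b != 0xA5) return false;  // still hunting for sync
        buf[have++] = b;
        if (have < 4) return false;
        have = 0;                                  // frame complete: start over
        if ((unsigned char)(buf[1] ^ buf[2]) != buf[3]) return false; // corrupt
        cmd = buf[1];
        value = buf[2];
        return true;
    }
};
```

The point for the classroom is that a raw byte stream has no message boundaries: the sync byte plus checksum is the cheapest way to recover boundaries after a glitch.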

Building Infrastructure

Ben’s been working on the “Little House in Alaska”, an instrumented mini-house with a webserver, to control lighting, temperature, and anything else via the web from a cellphone.

He’s using a PCduino as the web server and building controller, reading temperature sensors from four rooms, and driving an array of 8 relays.  The house is heated with one 25W light bulb, with air pushed out to any of four rooms via four independent fans.  Each room also has four RGB LEDs, and there’s a photocell on top of the building.

He’s using a regular Arduino as a data acquisition device, sending serial data to the PCduino, where it’s received by a C++ program using Mike’s web server library.  Mike’s web library sends out data in JSON format to a web page, which requests it using JavaScript.  His code is on GitHub.

On that web page, Ben has a simple charting interface already, and he’s previously done historical data logging.

It’s not clear how we should add real security to this: install it behind a firewall for local-only access (but then you can’t shut your lights off from work if you forget), use a username / password, or use a secure DNS server.  This needs to be built, however.

CPS Challenge

What is a cyber-physical system?  A network of sensors, collected and integrated using an online software control algorithm, with actuators working in a coordinated way.

Possible Locations

Indoor:

  • Localization is via Kinect.
  • UAV could be: Parrot, which has its own camera and comms onboard, and no assembly work, but no real hands-on tasks for kids to do; KK2-based kit would need a camera (FPV or cellphone), comms (XBee and Arduino), and a lot of integration.

Outdoor:

  • Localization is via GPS.  Can’t even test it indoors.  Not easy to run outdoors in Fairbanks in March.
  • We need quite a bit of hard integration work to get a ground vehicle working via GPS.

Possible Challenge Vehicles:

  • Parrot UAV.  Good for integrated vehicle testing, and has a camera already.  Doesn’t have much payload capacity, and it’s an integrated system not designed to be extended.  $300 and 400g; <100g payload.
  • Backus mouse: two servo wheels driven by an Arduino Nano and Bluetooth.  Students can build *everything* themselves, and take the platform home.  $100.
  • Balance beam: one/two BLDC thrusters on a pivoting wood beam.  Makes a good intro to sensors and active control, but may not be very interesting in itself; the minichallenge is to rapidly keep the board level, which is really hard to do by hand!
  • Hovercraft: foam body, BLDC lift and drive thrusters.  High mobility “fun” vehicle for kids to learn control principles.  We’ve never actually built one, but they sound fun!
  • UAV: four BLDC thrusters, stabilizer like KK2, commands via R/C (for initial teleop), then XBee/Arduino (for autonomy).

Possible Task: Warm Body Search and Rescue (Video Challenge)

Find a warm body on the field.  This requires some sort of video, and some sort of data analysis (automated or manual).

Color search challenge:

  • Survey area with quadcopter (possible fallback: Parrot)
  • Find the coordinates of brightly colored area(s) using vision algorithm on video from quadcopter.  Time bonus for time to coordinates.
  • Drive there with a ground vehicle, such as a hovercraft (possible fallback: Rovoduino, Backus mice).  Tons of extra points for autonomous, not teleop.
  • Make the ground vehicle contact the colored area.  Time bonus for time to ground vehicle contact.

Difficulties: this is essentially a software project, again; students only build hardware for the hovercraft.

Possible Task: Build a UAV

Incremental path to a custom UAV:

  1. Hovercraft with basic levitation: Turnigy 8-channel R/C RX unit, plugged into one motor controller, plugged into one lift motor blowing downward.  No programming, plug-and-play.
  2. Hovercraft with float and forward: add a second motor controller and a thrust motor to pull the craft forward.  (Front wheel drive is easier to drive.)
  3. Hovercraft with float, forward, and steering: add a third control channel so you can steer.  Perhaps an R/C servo changing the thrust direction?
  4. Introduction to control: use an accelerometer to keep a board level, using one/two BLDC motors with props.  Board is pinned to the desk, so it can only rotate. This is simple single-axis control, but due to inertia, proportional control isn’t quite enough–you need the derivative term to converge rapidly.
  5. Hovercraft with autonomous stabilization: stabilize position and orientation, probably using gyros and accelerometer.  KK2?  Bare Arduino?
  6. Quadcopter with autonomous stabilization.
  7. Quadcopter with autonomous point-to-point travel.

Another option is to incrementally add control sophistication to an existing system, like a Parrot or a ground R/C car.  The advantage is that the system already works really well (spare parts, no fabrication), so we can take off into higher-level control, like multi-robot interaction.

Possible Task: UAV Teleop Exploration

Students fly a UAV (homemade or Parrot), and map a room remotely to find an oil spill. They send us the coordinates, and we can then send an autonomous ground robot out to clean up the spill.

The “cyber-physical” aspect: combining mapping, sensor collection, and autonomous actuation (ground robot).

Localization: Dead reckoning doesn’t really work in the air.  A webcam or Kinect looking down from the ceiling, GPS style, could provide reliable localization.  The hull should be cut from 2″ white EPS foam for Kinect visibility, but needs to be lightweight and aerodynamic.   Orientation determination is a hard problem; we need to paint the hull different colors or something.

Most of the mapping, charting, and control work would happen on the ground station.  We need to figure out how to get students involved in programming this ground station, if only at the level of tweaking parameters.

Drawing the world: the UAV has detected obstacles and targets.  We need an integrated way to store, display, and process this.  For example, we could store a list of sensor reports:

  • XYZ location of the robot
  • Orientation of the robot (use yaw only)
  • Sensed distance to the obstacle

These could be displayed by walking the list of reports and drawing each one.
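Walking the report list is mostly trigonometry.  A minimal sketch, assuming the rangefinder points straight ahead of the robot and yaw is measured in radians from the +x axis (the `Report` and `obstaclePos` names are hypothetical):

```cpp
#include <cassert>
#include <cmath>

// One sensor report from the UAV's log.
struct Report {
    float x, y;    // robot position on the field (meters)
    float yaw;     // heading, radians from the +x axis
    float dist;    // sensed distance to the obstacle (meters)
};

// Project the sensed obstacle into world coordinates,
// assuming the rangefinder points straight ahead of the robot.
void obstaclePos(const Report &r, float &ox, float &oy) {
    ox = r.x + r.dist * std::cos(r.yaw);
    oy = r.y + r.dist * std::sin(r.yaw);
}
```

Drawing the map is then just looping over the stored reports, projecting each one, and plotting a dot at the result.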

Detecting a wall: walls are lightweight and movable, like plywood.  They can’t be very tall, or the Kinect can’t see the UAV over the top.  Possible wall sensors are ultrasonic (FIRST sensor is $34, and works OK from 30cm up to a little over a meter), infrared (emitter / detector pair used in Mike’s mouse, but these are sub-foot distance resolution), or even virtual software-defined sensors read from the Kinect image.  Physical sensors need to be powered (you could power the Arduino from the UAV’s 12V battery).

Detecting a target: the “oil spill” should be big, like a square foot, and colored brightly (hyperspectral sensors actually can distinguish materials reliably).  Concentric circles would provide a more precise center point.  Would a LEGO color sensor work, could we automatically watch a camera with OpenCV, or could students just watch video manually?

Sending in the ground vehicle: skip it entirely?

Learning Outcomes / Overall Project Goals

For students:

  • Basic motors, leverage, prop size, gearing.
  • The idea of “control”: starting with some hard problem like “keep a UAV in the air,” learn about sensors and actuators to stabilize flight.
    • A simpler version of this would be to keep a single 2×4 board balanced, using one gyro and R/C motor controllers.
  • Autonomous operation requires sensors, code, communication.

For teachers: improve technical skills.

For fellows: improve communication skills, especially when talking about technical info with nontechnical people.

Component-based CYBER-Alaska

We have a ton of different technologies, software, and hardware we can cover, so one option is to divide our content into a selection of components.

Component: basic circuits.  Battery, wires, and lamp.  Battery, wires, and DC motor.  Resistance (V=IR) for LEDs.

Component: 3-phase motors.  One possible sequence: incrementally jamming wires onto a 6V battery [Lawlor], go to manual control via six momentary buttons [Moss] or three SPDT on-off-on switches [Nance], then an Arduino-driven programmable commutator [Backus], then a fully integrated R/C motor controller.
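Every step of that sequence, from jammed wires to buttons to Arduino, walks the same six-step commutation table.  A sketch of that table (the names are ours; a real commutator would also need rotor timing and PWM drive):

```cpp
#include <cassert>

// Six-step commutation for a 3-phase brushless motor: at every step one
// phase is driven high, one low, and one floats.  Stepping through the
// table in order (and repeating) rotates the magnetic field.
enum Drive { DRIVE_HIGH, DRIVE_LOW, DRIVE_FLOAT };
struct Step { Drive a, b, c; };

const Step sequence[6] = {
    { DRIVE_HIGH,  DRIVE_LOW,   DRIVE_FLOAT },  // A+ B-
    { DRIVE_HIGH,  DRIVE_FLOAT, DRIVE_LOW   },  // A+ C-
    { DRIVE_FLOAT, DRIVE_HIGH,  DRIVE_LOW   },  // B+ C-
    { DRIVE_LOW,   DRIVE_HIGH,  DRIVE_FLOAT },  // B+ A-
    { DRIVE_LOW,   DRIVE_FLOAT, DRIVE_HIGH  },  // C+ A-
    { DRIVE_FLOAT, DRIVE_LOW,   DRIVE_HIGH  },  // C+ B-
};
```

With six momentary buttons, each button is one row of this table; the Arduino version just indexes through the rows on a timer.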

Component: Radio Control Servos.  Radio receiver sends a PWM signal to a servo.  The same protocol is used by brushed or brushless motor controllers.

Component: UAV Stabilization and control.  Start with balance beam.  Go to hovercraft, from there to a UAV.

Component: PID control.  I term: “accumulated error over time” (not integral).  D term: “rate of change of error” (not derivative).  Examples of PID-style lagged control problems: keeping car lined up on an icy road (especially over curves); adjusting hot and cold on an unfamiliar shower (especially when somebody flushes!).
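In code, those two plain-language terms map directly onto the I and D computations.  A minimal sketch (the `PID` struct and gains are illustrative, not a tuned controller):

```cpp
#include <cassert>

// One PID controller, in the workshop's plain-language terms:
// the I term uses the "accumulated error over time", and the D term
// uses the "rate of change of error".
struct PID {
    float kP, kI, kD;
    float errorSum = 0, lastError = 0;
    // Call once per control tick; dt is the tick length in seconds.
    float update(float error, float dt) {
        errorSum += error * dt;                      // accumulated error over time
        float errorRate = (error - lastError) / dt;  // rate of change of error
        lastError = error;
        return kP * error + kI * errorSum + kD * errorRate;
    }
};
```

The icy-road and shower analogies both live in the D term: reacting to how fast the error is changing is what keeps you from over-correcting.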

Component: Fabrication, from drilling wood to laser cutting to 3D printers.  (Mis)using PVC pipe fittings to assemble flexible or waterproof structures.

Component: Video processing.  FPV video over radio, digitization and pixels.  Measuring object sizes and velocities from video.  Greenscreen?  Telemetry overlay?

Component: Color digitization and recognition. RGB LEGO sensor, analyzing the colors viewed with RobotC.

Component: Computer Vision.  Object detection (paint it red, or blue, like standard FTC colors). Object tracking.  A bridge too far: 2D to 3D / SLAM?  OpenCV or SimpleCV software?  Pixel traversal?

Component: Kinect 3D Sensing.  Infrared imaging.  It’s big, heavy, electrically and computationally intensive, though!

Component: Sensor interfacing.  Start with basic switches, then simple analog sensors, work up to complex sensors like I2C mouse motion sensor.

Component: Localization.  For outdoors, we have GPS, but only if a few meters is close enough (UAV).  For indoors, a webcam with color analysis can be made to work with bright colors.  The Kinect works over short ranges, although it has problems with range, field of view, and reflectance.  For indoors, we could also use something like Northstar, a two-laser spot system from iRobot?

Component: Cellphone programming.  In general this can be hard, but Mike Backus has connected an Android phone’s accelerometer over bluetooth to his Arduino.

Component: Serial programming in C++, with Mike’s library.  Are there other options?

Component: Robot communication.  R/C PPM/PWM (analog), Bluetooth (short-range, wide-compatibility, low-bandwidth serial port), XBee (medium-to-long-range special radios), Wifi (standardized, PC-scale).

Component: Web-based communication between a microcontroller and the wider internet.  For the classroom, we need a rock-solid reliable system for this.  Hardware: have our own router?  Software:  Building web servers with Mike’s library in C++?   node.js?  nginx?

Component: Building Infrastructure. Actuating real-world utilities from an Arduino-style platform.  This is a good way to apply all the knowledge learned above!

Classroom Plans

Larry’s path through the material for his Digital Electronics class:

  • Brushless motor controller: first hardwired logic, then software sequenced with a microcontroller.
  • Radio R/C comms.
  • Stabilization of a horizontal pivoting beam with direct actuation.
  • Build a “meat tray hovercraft” with stabilization.
  • Hovercraft with stable forward motion.
  • Hovercraft with network comms [needs an onboard controller: maybe a phone?].
  • Remote teleoperation via video.
  • Video object detection [OpenCV?].
  • Quadcopter assembly.
  • UAV stabilization [KK2?].
  • UAV with R/C, then UAV with network comms.

Mike Backus might only get 5 students doing robotics at his school during the year.  He’s planning on covering his little micro-mouse design for Fall; students are also preparing for FTC during late Fall.  One suggestion is to do a series of videos about FTC-relevant sensors.  He really needs to see a parts list by November, and a reliable hovercraft and quadcopter running before Christmas break so he can play with them before January starts; it’s got to be reliable in the classroom.

Mark’s Fall schedule is Introduction to Engineering, a new class that covers a variety of content, including some electrical and some mechanical.  He’s likely to need fellows for intense one- or two-week stretches, not regularly through the semester.  His Spring Robotics 2 class might be ready to handle this stuff; that class will do FTC intensely until February.  Our plan for Robotics 2 is doing CPS content on Fridays; Saturday might be useful too, since Mark’s students are a captive audience.  Mark will be busy preparing a grant until after September 4; he can give us semester plans by September 9 or so; and he wants to meet the second week of September.

Mark’s FTC process:

  • Brainstorm
  • Pick teams & captains (the “person that does everything”)
  • Play with R/C servos, 180 degree (position control) and continuous motion (speed control)
  • Build up chassis and drivetrain
  • Joystick input (Mike’s RobotC joystick library), idea of proportional control
  • Tank control

Mariya will be bringing a group of high school students to the GI machine shop.  They’re a mix of Effie, West Valley, and homeschool volunteers, schedule to be determined.  Again, FTC will dominate their time during fall and early spring.  She can’t give us a weekly timeline before September 16 or so.

Expectations for teachers:

  • Overall week-by-week scheduling info by mid September.
  • Weekly recap / problem report emailed to Seta and Rex.  This can be very brief.
  • Immediate email notification when classes are canceled / rescheduled.

In July, for ASRA, we get 12 students, 8 hours/day for 2 weeks.

To Do

This week:

  • Test Lego ultrasonic sensors talking to an Arduino, and running next to a UAV.
  • Check UAV visibility in Kinect, for various foam hulls.

Over the next month, the CYBER-Alaska team needs to:

  • Tune and document pivoting beam system for reliable operation.  Simulation?
  • Build hovercraft prototype.
  • Build, test, tune, document KK2-based UAV prototype.
  • Build, test, tune, document the UAV-mounted wall sensor package for the CPS challenge.
  • Test and document using OpenCV for video-based localization.  Vehicle will have two LEDs, green in front and red in back, as seen by a webcam overhead.  The reconstructed vehicle center position and orientation (yaw angle) needs to be sent via bluetooth or a web interface.
  • Tune and document our existing Parrot/Kinect piloting system.
  • Prepare/find a huge number of videos.  These should be organized like Mike Backus’ pages.  Mariya recommends adding subtitles, so the videos work even without audio.
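The center-and-orientation reconstruction from the two LEDs is a midpoint plus an atan2.  A minimal sketch (the `poseFromLEDs` name is ours; it assumes the LED blob positions have already been mapped from pixels into field coordinates with y up):

```cpp
#include <cassert>
#include <cmath>

// Vehicle pose from the overhead webcam: the green LED marks the front,
// the red LED the back.  Center is the midpoint of the two blobs; yaw is
// the angle of the back-to-front vector.
void poseFromLEDs(float gx, float gy, float rx, float ry,
                  float &cx, float &cy, float &yaw) {
    cx = (gx + rx) / 2;
    cy = (gy + ry) / 2;
    yaw = std::atan2(gy - ry, gx - rx);  // radians, 0 = facing +x
}
```

Note that raw image coordinates usually have y increasing downward, which flips the sign of yaw; mapping into field coordinates first avoids that trap.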
