Research Coordination

Fall 2013 Research Vision

Major tasks for August 2013:

Vehicles:

  • Build an R/C UAV, decide how to get commands in, stabilize the flight, and get digital telemetry in and out.  Questions to answer:
    • Should we build a biggish (8-10 inch props) or small (5-6 inch props) UAV?  There are tradeoffs for safety (big UAV has more energy in the props), convenience (small UAV doesn’t blow papers around as badly), control (big UAV might have a slower dynamical timescale), and payload (big UAV can add more shields without changing the dynamics much).  We’re leaning toward smaller.
    • How will the KK2.0 work out for control? September: Mike has now built a KK2 test UAV, and it seems to be pretty easy to use but still provides good PID-level customization.
    • How do we interface a KK2.0 to an onboard Arduino?
    • Can we use an ultrasonic distance sensor on a UAV?  Can we use multiple sensors without interfering with the servo outputs used for flight control?  (A minimal single-sensor read sketch follows this list.)
    • Tasks: Lawlor: provide a short component wishlist.
  • Update rover software and hardware.
    • People: Mark Stover, Aisha Peters
  • Build an R/C hovercraft, as a more sophisticated ground vehicle.
    • Tasks: Lawlor: build prototype from foam.  Recruit Jessie, from Lathrop?  Jake?
  • Parrot firmware updates.
    • Tasks: Ahmet will be looking at the existing 2011 open source version of the flight control firmware.  Mike will provide advice on software issues.
  • Aero balance beam.  We should build a good-looking version, from aluminum extrusions.
    • Tasks: Isaac will spec out and machine this.
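
On the ultrasonic question above, a minimal single-sensor read on an Arduino looks something like this sketch.  The pin choices are arbitrary, and note that pulseIn() busy-waits, which is exactly the kind of blocking that could interfere with servo output timing.

```cpp
// Minimal HC-SR04-style ultrasonic read (pins 7/8 are arbitrary choices).
// pulseIn() blocks for up to the timeout, so on a flight controller this
// would need to be replaced with interrupt-driven echo timing.
const int TRIG = 7, ECHO = 8;

void setup() {
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);
  Serial.begin(115200);
}

long readDistanceMM() {
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);  // 10us trigger pulse
  digitalWrite(TRIG, LOW);
  long us = pulseIn(ECHO, HIGH, 30000);  // round-trip echo time, 30ms timeout
  return us * 343L / 2000;  // sound at ~343 m/s, halved for the round trip
}

void loop() {
  Serial.println(readDistanceMM());
  delay(50);
}
```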

Localization:

  • OpenCV image tracking.  Converting camera images into tuned reliable robot location & orientation.  Questions to answer:
    • Reliability: can we reliably locate colored patches, LEDs, lucite glow patches in the images from common webcams?
      • August 26 update: color detection is fairly reliable for garish colors, like fluorescent pink and green.  White balance can be a problem.
    • Latency: how many milliseconds of latency in the camera-OpenCV end-to-end process?
      • August 26 update: latency is quite good (<100ms) when the processing can keep up with the video camera, but poor (several seconds) if the processing is too slow.
    • Accuracy: how many milliradians/millimeters of error do we get from visual tracking?  Clearly this depends on the camera, target, and processing.
      • August 26 update: tracking shows 2-3 pixels of variance with a bad color match, and 0.1 pixel variance with a good color image.
    • Can we get the Parrot’s video output into OpenCV?
    • Tasks: Lawlor: now has reliable color blob detection in OpenCV (a minimal detection sketch follows this list). Need to pass the OpenCV stuff off to Shaun.
  • Kinect tuning and reliability.
    • Tasks: Shaun will keep working on tuning the Kinect, in particular getting the new model working with libfreenect.  Background subtraction would be useful too.
  • iRobot NorthStar hardware for indoor navigation–this is a little box that projects two dots onto the ceiling.  Unfortunately, the receiver side is only available inside a commercial cleaning robot, the iRobot Mint.
  • Could we use laser beams somehow for indoor navigation?  Or for near-range outdoor navigation, like the NASA Sample Return Challenge?
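
The color blob detection mentioned in the tasks above can be sketched in a few lines of OpenCV.  This is a minimal illustration, not our actual tracking code: the HSV range is a rough guess for fluorescent pink, and would need tuning per camera and lighting.

```cpp
// Threshold a garish color in HSV, then report the blob centroid in pixels.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
  cv::VideoCapture cam(0);                   // a common webcam
  cv::Mat frame, hsv, mask;
  while (cam.read(frame)) {
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    // Hue/saturation/value bounds: rough guess for fluorescent pink.
    cv::inRange(hsv, cv::Scalar(150, 80, 80), cv::Scalar(175, 255, 255), mask);
    cv::Moments m = cv::moments(mask, true);  // binary-image moments
    if (m.m00 > 50) {                         // enough matching pixels?
      printf("blob at %.1f, %.1f\n", m.m10 / m.m00, m.m01 / m.m00);
    }
  }
  return 0;
}
```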

Integration Software:

  • Need a central “robot data fusion center”, which receives a list of robot sensor reports:
    • Robot’s current location (XYZ).
    • Robot orientation vector, pointing in the direction of the sensor.  Also a copy of the 2D yaw angle in degrees.
    • Robot sensor report: distance to obstacle, hiker, or another robot.
    • Robot ID.
  • This list of robot sensor reports needs to be accessible via the web, and displayed in a nice little GUI.
  • Tasks: Dr. Lawlor will try building a Mongoose server to accept and produce JSON sensor reports.
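
One hypothetical shape for a single JSON sensor report, covering the fields listed above (field names and values are illustrative assumptions, not a settled format):

```json
{
  "robot": "rover-1",
  "location": [2.41, 0.77, 0.0],
  "orientation": [0.97, 0.24, 0.0],
  "yaw_deg": 14.2,
  "report": { "type": "obstacle", "distance_m": 0.85 }
}
```

The fusion center would accept a list of these (e.g. via HTTP POST) and serve the current list back out for the web GUI.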

Other projects:

  • Educational modules: boards exist, and are waiting for soldering.
    • Sensor board: an embedded-Arduino board with accelerometer, heat, light, etc. sensors.  This would be useful in Lathrop or Mark’s classroom to explore microcontroller-sensor interfacing.  2013-09-24 Isaac helped Zach Krehlik figure out the pins to address for each sensor.
    • Motor controller board (more sophisticated four-motor control from encoders): the board exists, but we’re having some trouble with the onboard Arduino.
    • H-bridge demo board (“you are the H bridge!”): nobody has seen an actual fabricated board yet, but the layout exists.
    • August 26 update: Mike Moss has soldered up a sensor board, and 4-motor control board.
  • SmartHouse: humidity?  Dr. Lawlor is building electrically actuated window shutters, which need a control system.  PIR occupancy detector?  Moose detector?  Door locks?
    • People: Charlie and Ben.

Pre-workshop summer vision

Shared software infrastructure and interface: (Shaun)

  • UAV/UGV shared world: mark map with locations of ground and air vehicles, command tracks
  • Multi-vehicle control: MAVLink?
  • Moving targets & obstacles?

Indoor challenge track: (Shaun, maybe Mike after finals recovery)

  • Tune PID & limits for Kinect/Parrot
  • UGV position determination: video analysis?
  • Trapezoidal field, to better match Kinect FOV?
  • Use an IMU on both ground and air vehicles
    • UGV: one or two optical mouse sensors for ground tracking?
  • UAV yaw control:
    • Kinect depth normals?
    • Control response cross product?

Outdoor challenge track: (Steven)

  • Need a field: a few hundred yards, tops
  • Need an outdoor ground vehicle: ardurover
  • GPS lets us throw away Kinect (woo hoo!)
  • Multi-vehicle points for detect, assist,
  • Fly Steven’s hex, or scout
  • Challenge runs every Friday?

Finish up educational modules: (Ben)

  • Motor control demo board.

2013 Workshop dates are fixed at August 5-10 (Monday through Saturday).

Telecon via skype: meeting Monday, June 3, 11am (AK time).

Overall Research Vision

Networked ground and aerial vehicles, helping each other explore in a semi-autonomous fashion.  Realtime simulation helps the pilot maintain situational awareness, helps interpolate INU readings across position sensor dropouts, and helps us build and deploy the system.

This applies to lots of different research problems:

  • GPS-supported UAV uses its onboard webcam to provide position updates to a GPS-denied UAV or UGV down in a valley.
  • Any 3D localization scheme used to control a network of air or ground robots.

See also:

Classroom CPS Challenge Planning

Schedule

Lathrop: Digital Electronics 4th quarter can be devoted to GK12/CPS basically from March 25 all the way until the middle of May.  5th period time varies: Monday 12:24-1:15; Tuesday after lunch; Wednesday 11:25-1pm (starting 15 minutes late); Friday 12:24-1:15.

Effie: Starting the week of March 4, school runs until about the middle of May.  Students meet Tuesday 2:40-3:40, Thursday same time, and Friday 2-3:40 afternoons. Probably will have Jeremy work with students on Tuesdays and Thursdays, and meet with fellows on Fridays.

See the Challenge Google Doc for details on the Challenge.

Student CPS Challenge Design Choices

Overall vision: we build a UAV, which maintains station over the play field.  The kids build a ground vehicle, which drives around and explores.  We write a software interface and graphical simulator that a pilot (or team of pilots) can use to drive both robots around the field.

Ground Vehicle Electrical Design

*Tons* of options here.  Sorted from most to least plug-and-play:

  • Lego NXT.  Larry’s somewhat preferred choice.  Advantage: simple, reliable, and plug-and-play.  Disadvantage: very similar to FIRST, and kids get no soldering or detailed electrical interfacing.  It’s also not possible to do video.
  • Turtlebot style: basically a laptop interfaced to a commercial mobile platform like the iRobot Create.  Laptop handles wifi and webcam; platform has motors and encoders built in.  Advantage: plug and play.  Disadvantage: again, kids do zero detailed electrical work.
  • Laptop + Arduino: laptop handles wifi and webcam, and feeds motor control commands to onboard Arduino.  Arduino watches motor encoders, drives control commands, reads sensors.  Advantage: good balance, customizable from LunAlaska to ITEST teleoperation platform.  Disadvantage: the laptop’s size and weight (so maybe an Android phone, or a Raspberry Pi?)
  • Raspberry + Arduino: Raspberry Pi single-board computer handles the USB webcam and USB wifi; the Arduino counts encoder ticks and talks to analog sensors.  Smaller and lighter than a laptop.  A BeagleBone would be a similar setup.
  • All-Arduino: one Arduino stacked with shields.  We’d need an RF comm shield like Wifi or XBee, a USB Host board + webcam or a dedicated camera shield, and a two or four motor driver board.  Plus encoders.  Advantage: mostly off-the-shelf stuff.  Disadvantage: very difficult to get them all to work together, since shields tend to use the same pins.  A stack of Arduinos sounds tricky to set up and debug.
  • A custom-everything board with an MSP430, XBee or other RF comms (wifi?), video processing chips, motor drivers, and encoders.  Basically a scaled up micro-mouse.  Advantage: highest integration, lightest weight, tons of interesting EE work to be done.  Disadvantage: lots of engineering effort required, and difficult to integrate kids into the board design/build process.

Votes:

  • Dr. Lawlor prefers the laptop+arduino approach for maximum simplicity, although he notes we can make *any* of them work, and Raspberry + Arduino would be very similar.

Ground Vehicle Mechanical Design

This will be driven mostly by the size and weight of the electronics above, plus their batteries.  Dr. Lawlor feels comfortable mostly leaving this choice to the kids: as long as it’s got 2 or 4 electric motors, we should be able to handle it pretty easily in software.

Mark is leaning toward the Rover 5, with four Mecanum wheels.  This is more motors and encoders than we might want to deal with.

Indoors or Outdoors

Indoors: definitely easier for the contest
– No FAA clearance required (although we’re probably OK for noncommercial)
– Can run in the corner of classroom, next to tools, night and day, in any weather.
– Probably use Parrot AR Drone 2.0 (cheap, kid-safe)
– Commands via UDP packets over Wifi
– Kinect on the wall for 3D realtime localization
– Better match for GPS-denied environments

Outdoors: probably next year
– Better match with Greg’s research goals
– Probably use ArduPilot quad/hexcopter platform (higher payload, more details exposed, more “researchy”)
– Navigation via GPS
– More realistic search and rescue problem

Conclusion: Indoors, at least this year.

Aerial Vehicle Control Interface

One possible design:

  • Aerial vehicle is a Parrot AR Drone, with the indoor hull for safety, possibly painted or modified to be more visible to the Kinect’s IR sensor.
  • 3D position sensor is a Kinect, which updates at 30fps.
  • A desktop machine runs a closed-loop position control algorithm: it takes 3D position commands from student pilot (via a web interface?), reads the current UAV 3D position from the Kinect, and feeds thrust and tilt commands to the UAV’s autopilot to match the two together.
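
The closed-loop step in the last bullet is essentially a PD controller per axis; a minimal sketch, with placeholder gains and names (not the actual implementation):

```cpp
// One axis of PD position control: the Kinect supplies the measured position,
// the pilot supplies the target, and the output becomes a tilt command.
struct PD {
  float kP = 0.8f, kD = 0.4f;  // proportional and derivative gains (tune!)
  float lastErr = 0.0f;
  float update(float target, float measured, float dt) {
    float err = target - measured;
    float dErr = (err - lastErr) / dt;  // derivative term damps overshoot
    lastErr = err;
    return kP * err + kD * dErr;        // tilt command; caller should clamp it
  }
};
// Run once per Kinect frame, so dt is about 1/30 s at the Kinect's 30fps.
```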

Weekly Research Tasks

2012-10-11

Steve: work on RF comms for micromouse, test out Arduino + USB Host shield + webcam performance and reliability.
Mike: work on GUI for micromouse demo
Charlie: Parrot Drone UDP communication
Shaun: try out libfreenect for the Kinect
Pascale: Begin work on a 4-motor Arduino shield

2012-10-18

Fri, October 26, 9:15-10pm: UAF Inside Out, Duckering 208.
Prep meeting Sunday, Integration at 1pm, Lesson plan at 2pm, in 208.

Steve: UAF Inside Out RF comms are operational; will work this weekend to finish up the GUI with Mike (Sunday, 2pm). Suspects Arduino + webcam is going to be a long hard road, for little benefit–nobody seems to have done it yet.  Suggests BeagleBone/BeagleBoard ($90) as a replacement for Raspberry Pi (like Arduino shields, it supports stackable “Beagle Capes”).

Mike: Worked on booting a Raspberry Pi, now has uvcvideo webcam driver working; also added an Arduino-style programming interface for the Raspberry GPIOs.  No analog inputs on the Pi, though, so we’d need an Arduino or something similar for analog interfacing.

Charlie: Read up on Parrot drone communication library.

Shaun: Has libfreenect library installed, and can grab depth image.

Pascale: Working on an H-bridge educational board, for Mark.  Also working on Micromouse.

2012-11-01

Mike’s got an Arduino-style programming interface for the PIC microcontroller; still working on the PWM interface for the PIC.

Steven finished up UAF Inside Out Day presentation with flying colors.  Mike & Steven will do basically the same presentation for Jeremy’s two classes 12:20pm-2:05pm, and 2:10pm-3:38pm Friday, Nov 2.  There’s a scheduling conflict for this: Steven can’t be there until like 1pm, so he’ll need to drop off the maze + robots earlier, then Mike can arrive to start the presentation.

Mike bought and Steven built up a Rover5 platform, using an XBee, Arduino Mega, and Steve’s four-H-bridge board.  It’s a surprisingly robust mechanical platform, and it tracks very straight along a table–only a few mm total curvature after a 1m drive.  It’d be interesting to add (1) a Mecanum-wheels version, and (2) an integrated driver board.

Shaun’s been working on getting input from the Kinect, and can pretty reliably segment moving objects from the background.  Still working on handling undefined (no-data) areas.  He’ll bring a Kinect for next week’s meeting.

Kayla and Pablo were visiting to evaluate the project, and need some interesting projects to work on.

We scheduled a work session, this Sunday at 2pm in 208 & 210 Duckering, to talk about simulation/animation in OpenGL.

Research plans: Get Kinect + Parrot working reliably.  I’ll bring in two Kinects on Sunday, to see how they interact: we could use one for UAV localization, and another for ground obstacle avoidance?

Someday, somebody should build a simple Raspberry Pi pluggable I/O board with a real microcontroller, to do hard realtime stuff like interrupt processing, A/D, and tight control loops like PWM counting.

2012-11-05

1. Decided to contact Julie at Tanana Valley about presenting for her 8th grade class.

2. Discussed past week’s activities:
Mark’s class: Mike discussed variables (numeric and character based), and covered while loops and if/else statements. Will link this presentation into the Prezi.  Larry’s class: Mike did the same discussion as with Mark’s class, then after his “lecture” time, they went and began using the NXTs: they downloaded the RobotC firmware, and that was about it.

3. Coming week’s activities:
Larry’s class: Steven will be talking about functions (as black boxes) and teaching the students how to turn the motors and read the sensors. Then they’ll actually go and work on the NXT.
Mark’s class: Mike will be discussing basic theory on functions in C++.

4. Ideas for hands on class experiments.
A. Basic Electronics (use LEDs to explain how resistance changes voltage and current).  Then plug it into an Arduino and explain how you can use an Arduino to change the average voltage across the LED (see the sketch after item 5 below).
B. Motors: connect two motors and crank one; the other turns.
C. H-bridge motor control; then Arduino H-bridge control.
D. Sensors
E. Batteries

5. Discuss bubbles in CPS Prezi. Biggest problem was where “simulation” fit. In the end it was put in the virtual world.
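
A minimal Arduino sketch for demo A in item 4 above (pin 9 is an arbitrary PWM-capable pin): the PWM duty cycle sets the LED’s average voltage, and the slow ramp makes the brightness change visible.

```cpp
// PWM duty cycle as "average voltage": ramp an LED's brightness up and down.
void setup() { pinMode(9, OUTPUT); }

void loop() {
  for (int duty = 0; duty <= 255; duty += 5) {
    analogWrite(9, duty);   // duty/255 of the 5V supply, averaged by the eye
    delay(20);
  }
  for (int duty = 255; duty >= 0; duty -= 5) {
    analogWrite(9, duty);
    delay(20);
  }
}
```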

2012-11-08

Steven will be building a Color Organ for Larry’s class: analyzing sound (from where?) into frequency bins.   Larry would like to build up some girl-friendly course modules to take to Ryan and other local middle schools, to try to get more girls interested in STEM.  We should contact SparkFun, to see if their Arduino outreach stuff is reusable by us.

Juliet would like us to visit her classroom to show MicroMouse before she’s gone on November 14.  That’s too soon, so we should postpone this until January or so.

Generally, middle school might actually work *better* than high school, while it’s still cool to be excited about stuff.  Even elementary is within the realm of possibility, if we break down the questions far enough.

Dr. Lawlor’s suggestion for Steven’s thesis topic: blink IR emitters, and receive data from *all* IR sensors, to estimate the global illumination mutual reflectance of the world–the hope is that n emitters and m sensors give you n*m floats of information about the world.  These n*m-float snapshots, and their history, should tell you about the 3D structure of the world, possibly even more than you can get from a camera.

Kits: Steven and Pascale are working on an H-Bridge board kit.  It will include both big robust arcade buttons for manual control, and pluggable MOSFETs for automated control.  They’ll just use 5V motor power, and p-Channel high side and n-Channel low side, to avoid needing level shifters to drive the MOSFETs.

Long term, rather than developing the TriloBYTE, we should probably use the Rover 5 instead: the TriloBYTE motors ($15 each) plus encoders ($37) are already more expensive than the bigger and more robust Rover 5. We’re working on designing the Rover 5 driver board, and trying to pick an appropriate microcontroller:

  • Arduino Mega has 4 interrupts, but is big and expensive
  • An Uno would need funky timer code to do latency-limited polling, although 1kHz sampling should be plenty (see the sketch after this list)
  • The Mini uses much less space
  • Even the MSP430 with Energia?
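
The “funky timer code” in question might look like the fragment below: Timer1 in CTC mode firing at 1kHz, so encoder pins get polled on a fixed schedule without per-transition interrupts.  Register values assume a 16MHz AVR like the Uno; pollEncoders() is a hypothetical helper.

```cpp
// 1kHz polling on an Uno: Timer1 in CTC mode with a /64 prescaler.
#include <avr/interrupt.h>

void pollEncoders();  // hypothetical: sample all encoder pins, count edges

void setupTimer1kHz() {
  cli();
  TCCR1A = 0;
  TCCR1B = (1 << WGM12) | (1 << CS11) | (1 << CS10); // CTC mode, clk/64
  OCR1A = 249;            // 16MHz / 64 / (249+1) = exactly 1kHz
  TIMSK1 = (1 << OCIE1A); // enable compare-match interrupt
  sei();
}

ISR(TIMER1_COMPA_vect) {
  pollEncoders();         // must return quickly; it runs 1000 times a second
}
```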

Pluggable sensor modules for the Rover 5 board might include:

  • IR (or ultrasonic) distance sensors.  Basically analog input, 3-wire.
  • Touch sensors/switches.

We need more non-robotics CPS examples:

  • Little house: measure air temperature, automatically apply heat.
  • Houseplant: measure soil conductivity, automatically solenoid-valve watering.

Charlie is still working on Parrot UDP communication, and is considering skipping the SDK’s complexity and just sending out raw packets.

Shaun can now back-project the Kinect’s output to real world 3D locations, in meters.  He can highlight the 1m cube directly in front of the Kinect.  Future work: calibrate with a meterstick to determine the XYZ error bounds, and output a single XYZ center of mass position.
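
For reference, back-projection like this is the standard pinhole model.  A sketch, using commonly published Kinect depth-camera intrinsics rather than our own calibration:

```cpp
// Depth pixel (u,v) with depth z in meters -> camera-space XYZ in meters.
struct Vec3 { float x, y, z; };

Vec3 backProject(int u, int v, float z) {
  // Commonly cited Kinect depth-camera intrinsics (not our calibration):
  const float fx = 594.2f, fy = 591.0f;  // focal lengths, in pixels
  const float cx = 339.5f, cy = 242.7f;  // principal point, in pixels
  return Vec3{ (u - cx) * z / fx, (v - cy) * z / fy, z };
}
```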

For the contest, how much work should we do to support NXT?  We don’t want kids’ focus to be just mechanics–wheels and gears and motors like FIRST.  But on the other hand, it’s clear students in Robotics 1 will need lots of help to get simulations running: they need a compiler, and libraries, and *tons* of example code.  Kids that want more can definitely go underneath and build their own: especially Mark’s kids, who have had a year of programming.

Upcoming course topics:

  • Larry: color organ, and joystick-driven motor.
  • Mark: passing parameters to functions.

For this week, Seta and Steven and Mike will meet at 2pm on *Wednesday*.

2012-11-11: Sunday Research Session

Mike, Steven, Shaun, and Dr. Lawlor built up a little quadcopter UAV simulator.  The source code is here:
http://www.cs.uaf.edu/2012/svn/cyberalaska/aerial_sim/

It’s got:

  • Throttle and thrust vector level control.
  • Autopilot altitude control, with PD control.
  • Random wind direction & magnitude, with official wind drag formula.
  • Multi-UAV support.
  • Basic Newtonian forces, velocity, and position.
  • A simple visual display of the 3D UAV positions and thrust vectors.
  • A simple keyboard user interface (WASD to move camera, IJKL to tilt copter).
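
The “official wind drag formula” above is presumably the standard quadratic drag law, F = ½·ρ·Cd·A·|v|·v, applied to the UAV’s velocity relative to the wind.  A sketch with placeholder constants:

```cpp
// Quadratic drag opposing motion relative to the wind; constants are guesses.
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 dragForce(Vec3 vel, Vec3 wind) {
  const float rho = 1.225f;  // air density at sea level, kg/m^3
  const float Cd  = 1.0f;    // drag coefficient (assumed)
  const float A   = 0.05f;   // frontal area, m^2 (assumed)
  Vec3 rel = { vel.x - wind.x, vel.y - wind.y, vel.z - wind.z };
  float speed = std::sqrt(rel.x*rel.x + rel.y*rel.y + rel.z*rel.z);
  float k = -0.5f * rho * Cd * A * speed;  // negative: opposes relative motion
  return Vec3{ k*rel.x, k*rel.y, k*rel.z };
}
```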

It’s still missing:

  • The ability of the UAV to rotate (it tilts, but doesn’t rotate).
  • A clean way to tweak the simulation parameters (maybe osl/webconfig?)
  • A clean way to simulate sensor lag.
  • A translation into JavaScript, to run in a browser.
  • Kinect input and Parrot output!

The whole repository is:
svn co http://www.cs.uaf.edu/2012/svn/cyberalaska/

2012-11-15

Planning for 2pm Sunday work session:

  • Original goal was to tune lesson plans by delivering them to our fellow undergraduates.
  • This week: 2D/3D OpenGL graphics, by Mike.  Talk about how to get 2D/3D simulations working in a web browser (JavaScript?  .exe?)
  • Different meeting time?  Mike is in his office from 1-5 MWF; Steven is in his office almost all the time.

Mark Nance needs some small, appealing, hands-on example projects fairly rapidly, so he can get more students to come back for Robotics 2. Options:

  • Two motors on a chunk of wood, for generator/motor/gearing/braking demo.
  • Somehow build up something with an NXT/RobotC and simulation.
  • MicroMouse style robots, with a simulator?
  • Teleoperated web-based demo?  It’d be a good way to get our server infrastructure running.
  • Automated plotting / charting / graphing combined with a sensor.  If we started with the Launchpad (or Arduino) as an A/D converter and serial link, then we could do some fun in-class work with a variety of sensors (a minimal streaming sketch follows this list).  This could be a single sensor board to help train kids, with a row of sensors laid out horizontally: thermistor, Cadmium Sulfide (CdS) or solar panel, ultrasonic or IR distance sensor, accelerometer with analog output, gyro (if cheap), hall effect or reed magnetic field sensor, etc.  You could even leave the sensors unlabeled, and have the kids figure out what makes each sensor change.  Tougher for kids to figure out: a pressure sensor.
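
The Arduino side of the plotting idea above is tiny (A0 and the baud rate are arbitrary choices); any serial plotter on the PC can graph the stream:

```cpp
// Stream one analog channel over serial, one value per line, for plotting.
void setup() { Serial.begin(115200); }

void loop() {
  Serial.println(analogRead(A0));  // 10-bit reading, 0-1023
  delay(10);                       // roughly 100 samples per second
}
```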

Mr. Nance does an effective trick where students can’t leave until they answer a question at the end of the class; this raises the spectre of social shaming if you get the wrong answer, which keeps kids alert.

Larry’s Period 1 and Period 4 aren’t quite as far apart as we’d feared.  As soon as students got motors moving, they caught on much faster.

Things that help students remember are:

  • Knowing it matters: that you the instructor will come back, see them again, and talk about related stuff.
  • Having some hands-on experience, even if it’s bad (tried it, didn’t work).  Sometimes failure is actually a good teacher: as long as you can show them it’s possible, they’ll work harder to finish it.
  • Multimedia: hear it, write it down, use it hands-on, say it, sing it, dance it!
  • Hearing things *repeatedly*, at increasing time intervals.

Shaun calibrated the Kinect 3D sensor; it reads a little small, by 2-3%.

Possible new Steven thesis topic: multichannel audio processing on a ground robot, to detect the direction of an overhead UAV.  This reuses the UAV’s unintentional audio emissions for localization.  He’d be doing low-latency coincidence detection not via the usual cross-correlation, but via a “spiking neural network” on an FPGA.  Next steps: compare transducers, speakers, and microphones for input; measure the hexacopter’s audio spectral signature (plus doppler?).  It’s biologically inspired hardware design.

Current boards in progress:

  • Steven’s Dual Relay Board is capable of switching 120VAC at ten amps, driven from an Arduino Uno.  It’s basically done, just needs a house kit (or actual house!) added.  A 3D printed plastic case would be a good way to make it safer for wall current, but would need to be designed.  The same board could be used for a fish tank/plant terrarium (fish runoff feeds the plants; plant runoff feeds the fish); it’d just need sensors.
  • RJ45 to four-motor 12V H-bridge board, originally for Mark’s ROV.  Basically done; needs heatsinks, then can be shipped off to Mark.
  • Manual switch-or-FET single motor H-bridge demo circuit board.  Control lines from FETs are pulled off to a header, so theoretically the thing could be controlled with an Arduino.  Pascale has the layout basically done.
  • MSP430-based programmable quad H-bridge PID demo circuit board, which will come to maybe $25 per board finished.  Isaac has a bill of materials, and is working on a final layout.
  • Rover5 driver board.  Includes two L298 drivers to drive four motors, inputs for four encoders, spare input slots for sensors, an XBee, and an Arduino Mega *or* Pro Mini (although the Pro Mini’s limited pins are a tight fit).
  • Larry wants a 7-segment LED decoder and display driver.

Scheduling: Lathrop: Semester 1 ends Dec 21, students back January 8th.  Mt. Edgecumbe probably starts about 1 week later. Next semester: Robotics 2 is 1:15-2:45, doing FIRST then UAV or AUV; Robotics 1 will basically repeat this semester.

Purchase Order: BeagleBone ($90) plus some Capes.  Rover 5 platform ($60) plus 4 Mecanum wheels ($75).  Parts to populate a half dozen relay boards.

2012-11-18

We should build a teleoperation demo, with:

  • Rover 5 base platform, starting with tracks, but eventually Mecanum wheels.
  • PC to Rover serial protocol: 115200 baud, 8-N-1 (a packet-framing sketch follows this list).
    • Start bytes: 0xBB 0xF0.
    • Motor speed command bytes: FR, FL, BR, BL.  These are unsigned char PWM motor speed targets.
    • Flags byte.  Low bits are 4 direction bits for all four motors.
  • Rover to PC serial protocol: 115200 baud, 8-N-1.
    • Start bytes: 0xAA 0xF0.
    • Encoder count bytes: FR, FL, BR, BL.  These are unsigned char encoder transition counts.
    • Flags byte.  Low bits are 4 direction bits.
    • Sensors: two 10-bit ADC values packed as two-byte little-endian values–front-right and front-left distances.
    • Byte length of extra sensor data, like our 9DOF I2C board.  For no extra data, send 0.
  • Top-down Kinect view, for localization and pilot view of *real* location.
  • Web server to talk to rover via XBee
  • Need a client program to display current state and send commands back to web server.
    • Web client: trivial to install and run anywhere, and easy to get in and change the JavaScript control algorithms.  But harder to interface with joysticks.
    • OpenGL exe: hard to install, build, and change.  But can be very high performance for onboard simulation.
  • Need an “arena”, possibly just a room or a cardboard box.
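
A sketch of the Rover-to-PC framing above, on the Arduino side; the byte layout follows the spec in this list, while the struct and function names are hypothetical.  The PC-to-Rover command packet would be framed the same way, with 0xBB 0xF0 start bytes.

```cpp
// Rover -> PC status packet, per the framing described above.
#include <stdint.h>

struct RoverStatus {
  uint8_t  encoder[4];   // FR, FL, BR, BL transition counts
  uint8_t  flags;        // low 4 bits: per-motor direction
  uint16_t distance[2];  // front-right, front-left 10-bit ADC readings
  uint8_t  extraLen;     // byte count of extra sensor data (0 = none)
};

// Send one status packet over the 115200 8-N-1 link (Arduino Serial assumed).
void sendStatus(const RoverStatus &s) {
  Serial.write(0xAA); Serial.write(0xF0);   // start bytes
  Serial.write(s.encoder, 4);               // encoder counts
  Serial.write(s.flags);                    // direction bits
  for (int i = 0; i < 2; i++) {             // 10-bit values, low byte first
    Serial.write(s.distance[i] & 0xFF);
    Serial.write((s.distance[i] >> 8) & 0x03);
  }
  Serial.write(s.extraLen);                 // 0 when no extra data follows
}
```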

2012-11-29

We’re hoping to get a grant to make some CPS videos, for a variety of example projects.  Possible project topics:

  • Mobile robots for search and rescue: UGV and UAV (our main project).
  • Building infrastructure: heaters, fans, A/C, lighting; with the goal to minimize energy consumption.
  • Cool demos like the LED ball.  And a laser harp: play a tone when a laser line is crossed; adjust timbre based on the height of the crossing.
  • Social connections: facebook, twitter, etc.  Telepresence?
  • Radio wireless signal strength surveys outdoors, via UAV?

For building infrastructure, we need to figure out how to build a house model for use in the classroom.  Lathrop laser cutter is 16×24″, and could fabricate attractive sides if we could make DXF files.  We could also use real buildings, such as Dr. Lawlor’s shop, or a spare building at Poker Flat, although their thermal response is a lot slower, and freeze-ups would be annoying.

Tuesday, we’re hoping to demo some teleoperation for Mark’s class, to get the kids to sign up for Robotics 2.  We definitely have the chassis working, and need to make the software look better.

  • We need a 3D model of the Rover 5, probably a .obj.
  • Ideally, we’ll include encoder feedback to get accurate 2D positions.

For programming, Lathrop kids are struggling but slowly making it–learning takes several repetitions for everybody.  One thing that worked this week: when a kid gets their version working, they get their name on a slip of paper, and then go help somebody else; if that somebody gets it working, then *their* name goes on the slip of paper too, and at the end the slips are collected for extra credit.  This multi-level mentoring will work, although we’re going to need to show up more than once a week in January.

Terminology: one “course module” is about a week of classroom work.

Purchasing: tripod + 1080p handheld video camera, to start making better YouTube footage.  We should also get a Parrot AR Drone 2.0, plus a fast charge pack and aftermarket battery.

2012-12-06

Visiting: Ben.

Teleoperation demo with Sitka: the common question was “How did you make this?”  Mark still had to prod kids into asking questions, but there was definitely much more excitement than just showing slides.  We should have advertised more explicitly, saying “Sign up for Robotics 2 to build your own!”.

Lathrop: robots driving in a square next week (as final exam).

Effie: this Friday, read thermistor and blink LED.  High schoolers have seen code, and *some* of the middle schoolers have too.

Effie is having a “FLL Fun Tournament” this Saturday, 10am-2pm.  This would be a good chance to demo teleoperation.

Shaun and Charlie TODO: need to integrate Kinect and Parrot Drone code.

Steven TODO: detailed purchasing list, and list of classroom activities/lesson plans.

For building the ground control interface software, Bruce, Shaun, Ray, Steven, and Dr. Lawlor need to meet.  It’s not clear how we should build the robot-to-ground-station interface: do we use MAVLink for our ground robots, or our own custom protocol?  It’s also not clear where to tie together the server-side C++ that Mike & Dr. Lawlor prefer, and the Node.js server UI that Bruce and Ray use.

Winter break plans. Gone: Charlie, Seta, Isaac.  Everybody else will be here; our goal over the break will be to finish *all* our lesson plans, including hands-on activities and formative assessment in every class.  Probably this needs to be outlined in a Google Doc before diving into the details in Prezi.

2012-12-13

Availability over the break:

  • Seta will be available via teleconference from Dec 22 through Jan 12; and back in town January 17.
  • Isaac will be back Jan 12.
  • Charlie will be back Jan 3, and will be on email.

Our first winter break meeting will be Monday, December 17 at noon, in Duckering.  Goals for over the break: talk to Steven!

  • Finish up lesson plan design document.
  • Build up the Rover5 into a reliable, production-ready ground rover.
  • Build a useful software simulation environment with WebGL.
  • Integrate the Parrot, Kinect, and ground rover.  They will all communicate with one central ground station (at least to start).
  • Restart the UAV ground station user interface collaboration with Ray, Bruce, and the rest of Greg’s folks.

2012-12-17

Monday noon meeting.  Next GK12 research meeting Thu, Dec 20, 2pm, Chapman 201A.

Todos:

  • Steven: get lesson plans outlined.
  • Lawlor: WebGL UAV simulator.  This will be used in my upcoming Simulations in Graphics class.
  • Shaun: get Parrot control code from Charlie, and get the UAV flying based on Kinect.
  • Mike & Steven will get the Rover5 operational.

Cool hardware: the Arduino MEGA-compatible chipKIT Max32, from Digilent.  It’s got an IDE that looks a lot like the Arduino IDE.  It’d be an interesting pin-compatible alternative if we need a little more compute than an Arduino (midway between ARM and Arduino).

For the Rover 5, Mike found a 6V 4.5Ah sealed lead-acid battery ($15 at Lowes; $11 on eBay with free shipping) instead of a 2S Lipo (also $15 at HobbyKing).  Kind of a toss-up.

2013-01-07

Telecon with Seta.  Updates on:

  • Tuition waiver for “Communicating Science” course: waiting on Greg’s input.  We will make this happen.
  • Software: Mike is using processing.js successfully, in an Arduino style.

Plans with Larry: he’s onboard with Steve’s video delivery ideas.  Lathrop is selecting “top ten” teams today and tomorrow.  Classes are starting now; Mike will start talks on the 9th.  The idea is some simpler CPS system (temperature control, like the house?) in freshman Robotics 1, where kids usually pick their own final project.  For Digital Electronics we can do the full Search & Rescue CPS challenge.  FIRST regionals are Feb 15-16, and FIRST winds down soon afterwards.

Plans with Seta: she’s back after January 21; meetings after 2:30pm anytime that week will work.  We’ll meet with her at 2pm Monday, January 21st.

Whether we get our video-based grant funded or not, we’re definitely going to try using the inverted-structure, video-based-lecture technique.

Charlie and Shaun will meet at Chapman at 5pm sometime this week, the exact day to be coordinated via email.

2013-02-20

Seta and Orion met with Larry Ehnert (Lathrop) and Jeremy Nicoll (Effie Kokrine) to coordinate our CPS course material and plan the Challenge at the end of this semester.

Larry still needs about 3 more weeks with Digital Electronics, at least until Lathrop’s Spring Break on March 11-15 (same as UAF).  Digital Electronics 4th quarter can be devoted to the GK12/CPS contest: basically from March 25 all the way until the middle of May.  It’s taught 5th period, but the time varies: Monday 12:24-1:15; Tuesday after lunch; Wednesday 11:25-1pm (starting 15 minutes late); Friday 12:24-1:15.
Larry spent the last of his budget to buy 40x Arduinos for Robotics 1.  Typical projects:
– Servos
– 7-segment display (scrolling)
– *Some* will do an H-bridge
The L298 dual motor driver, used by Steven, is the official PLTW chip for Digital Electronics.
———-
Jeremy: wants about another week, but starting the week of March 4 would work.  School runs until about the middle of May.
Has them Tuesday 2:40-3:40, Thursday same time, and Friday 2-3:40 afternoons.  Probably will have Jeremy work offline with students on Tuesday/Thursday, and only have our fellows on Fridays.
Jeremy will be teaching Physics at Effie Fall 2013.

2013-03-04

Meetings Thursday will now be converted to work sessions.

Things to finish by the end of spring break, March 18:

UAV needs a web UI:
– Backend: go to XYZ. [Charlie & Steven, starting with example from Mike]
– Front end: HTML + THREE.js. [Shaun]

Arena generator & format.
– Generate Arenas [Hector]
– Accessibility? Hand design?
– Examples to read and show Arenas [Hector & Mike]
– Obstacles
– Floor tiles? Standard 1ft square [Orion]

Build Lost Hikers.
– Electronics designed [Orion]
– Need to design base & head
– Need to print & assemble

Rover5:
– Brackets to hold battery
– PID control for rover velocity [Steven]
– Brackets to hold servos & sensors
– 15 copies need to be printed & assembled (by end of month)

Sensor board:
– Boards ordered
– Need to be built [Michael T & Steven]

Videos:
– Have CPS intro [Steven]
– Have a good start on animation [Mike]

