Magick Smoke
Sometimes you just gotta make stuff. Current project: An autonomous tour guide robot!
Sunday, November 9, 2014
Jeeves has achieved autonomy! There's still more work and tuning to be done before he's ready to become a tour guide, but we've reached an important milestone: Jeeves can traverse from one end of the building to the other with no human intervention, apart from telling him where to go.
Friday, August 22, 2014
Toe the Line!!!
A while back (a couple of months ago, actually), a friend of mine built a line-following robot kit with his son. The kit went together fine but didn't work quite right so he asked if I'd take a quick look. I finally got around to it and after some head-scratching, got the little sucker to do its thing:
Friday, July 4, 2014
Getting in the Flow
FedEx delivered a shiny new PX4FLOW sensor today:
I'm hoping that I'll be able to get reliable odometry from a pair of these little babies, thereby solving a very thorny issue. Cross your fingers.
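The hope above is that flow velocities can be dead-reckoned into a pose estimate. As a rough sketch of what that integration looks like (this is an illustration of the idea, not Jeeves' actual code — the function name and units are my own assumptions, and it presumes the sensor reports body-frame velocities in m/s):

```python
import math

def integrate_flow(pose, vx, vy, yaw_rate, dt):
    """Dead-reckon one step from optical-flow velocities (hypothetical sketch).

    pose is (x, y, heading); vx/vy are body-frame velocities in m/s,
    yaw_rate is rad/s (e.g. from a gyro), dt is the sample period in s.
    """
    x, y, th = pose
    # Rotate the body-frame velocity into the world frame, then integrate.
    x += (vx * math.cos(th) - vy * math.sin(th)) * dt
    y += (vx * math.sin(th) + vy * math.cos(th)) * dt
    th += yaw_rate * dt
    return (x, y, th)

# Drive straight along +x for one second at 0.5 m/s, 100 Hz samples.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_flow(pose, 0.5, 0.0, 0.0, 0.01)
print(pose)  # → roughly (0.5, 0.0, 0.0)
```

The thorny part, of course, is that any velocity error accumulates without bound, which is why two sensors (and fusion with other data) would be attractive.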
Thursday, April 17, 2014
Jeeves' First Map!
Got all of the pieces put together to get the hector_mapping package up and running on Jeeves, then took him for a spin around room EB84 (his current parking spot). Here's the generated map. You can't tell from this single static image, but the localization was impressive. This is using no odometry at all, only the laser scanner:
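For the curious, running hector_mapping without odometry boils down to a short launch file along these lines (a minimal sketch, not Jeeves' actual configuration — the frame and topic names are guesses that would need to match the robot's TF tree; pointing odom_frame at the base frame is the usual trick when no odometry is published):

```xml
<launch>
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
    <param name="base_frame" value="base_link"/>
    <!-- No odometry: reuse the base frame so hector relies on scan matching alone. -->
    <param name="odom_frame" value="base_link"/>
    <param name="map_frame"  value="map"/>
    <param name="scan_topic" value="scan"/>
  </node>
</launch>
```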
Tuesday, September 10, 2013
Sunday, September 8, 2013
What do we want? Truth! When do we want it? Now!
Ground truth, that is.
I'm working on a vision-based gyroscope, in hopes of getting a really solid heading hold on my quad without the use of magnetometers. For prototyping purposes, I'm using Python and OpenCV to test the effectiveness of the various approaches I'm considering. To do that, I'll need ground truth data. I attached a coaster to the shaft of a hobby RC servo as a target, and mounted this assembly onto a piece of 2x6 connected to one of these. The idea is that I can rotate the little disc in a known, repeatable way, then compare the known rotations to the output of my vision-based gyro program. Here's a diagram (because I love diagrams):
Here's some video of the thing. The program you see running on the screen is OpenCV's Lucas-Kanade tracker demo. The little green streamers are the feature tracks that the program is following:
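One way to turn feature tracks like those into a heading change is to fit a best-fit rigid rotation between the tracked point positions in consecutive frames. Here's a minimal NumPy sketch of that step (my own illustration of the idea, not the program in the video — it assumes you already have matched point pairs, e.g. from `cv2.calcOpticalFlowPyrLK`):

```python
import numpy as np

def rotation_between(p0, p1):
    """Least-squares rotation angle (radians) mapping point set p0 onto p1.

    p0, p1 are (N, 2) arrays of matched feature positions from two
    consecutive frames.
    """
    # Center both point clouds so any translation drops out.
    a = p0 - p0.mean(axis=0)
    b = p1 - p1.mean(axis=0)
    # In 2-D the best-fit rotation angle has a closed form:
    # atan2 of the summed cross products over the summed dot products.
    num = np.sum(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0])
    den = np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1])
    return np.arctan2(num, den)

# Synthetic check: rotate random points by 5 degrees, recover the angle.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(20, 2))
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
est = rotation_between(pts, pts @ R.T)
print(np.rad2deg(est))  # → ~5.0
```

With the servo rig above providing known rotations, a comparison like this against the vision estimate is exactly the ground-truth check the post describes.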
Sunday, August 18, 2013
Arms Too Short to Box with God*
Controller yaw authority still isn't as good as it could be, so in the course of my experimentation I decided to try out shorter arms (less rotational inertia). Here's a view of the whole contraption sitting on the table. I moved the landing gear inward and attached a pool noodle for padding. Nice, but this arrangement was very tippy on landing if there was any lateral motion at all. I didn't want to take the time to add little legs to the ends of the arms, so I inserted some square carbon fiber tubes that I had lying around into the pool noodle. Worked like a charm. It adds weight, but I'll worry about that after I've truly solved the heading hold issue:
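The "shorter arms, less rotational inertia" reasoning is worth a back-of-envelope check: treating the motors as point masses at the arm tips, yaw inertia goes as the square of arm length, so even a modest shortening buys a lot. The masses and lengths below are made-up illustration values, not measurements from this quad:

```python
def yaw_inertia(motor_mass, arm_length, n_motors=4):
    """Point-mass yaw moment of inertia about the quad's center: I = sum(m * r^2)."""
    return n_motors * motor_mass * arm_length ** 2

long_arms = yaw_inertia(0.05, 0.30)   # 50 g motors on 30 cm arms
short_arms = yaw_inertia(0.05, 0.20)  # same motors on 20 cm arms
print(short_arms / long_arms)  # → (20/30)^2 ≈ 0.44
```

So cutting arm length by a third cuts the point-mass yaw inertia by more than half, which is why it's a tempting lever for fixing yaw authority.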