Saturday, February 22, 2014

3D Printing Goodness, First-ish Attempt

Here it is, the first print from my new 3D printer... Well, it's the second. The first one stalled out when my laptop went into sleep mode after a minute. This is a quick accuracy checker from Thingiverse: http://www.thingiverse.com/thing:20996


Not bad for a first attempt, I guess.

The printer is a Mix G1, from mixshop.com, and took a night to build, a night to wire, and another few hours to load software and calibrate.

It was pretty straightforward to assemble, with only a few snags. Some of the structural parts were actually injection molded, but a couple of the smaller printed parts delaminated (broke) without any significant stress. Luckily they weren't critical breaks, and it's obviously running well enough to print its own replacements.

Now... to start printing some rover parts!

Wednesday, February 19, 2014

Serendipity

Check this out:

http://www.adafruit.com/products/1722

Cool, no? Not convinced?



I don't really need this for autonomous racing, although there might be applications for telling gravel from grass, or tall shrubs and other veggie-hazards from background noise.

But as a gardening assistant it might give the rover something to do when it's not racing.

I think what this means is that I'll only populate one of the camera eyes in the Pan/Tilt mast, and leave one empty for this upgrade.

Now... off to buy the replacement webcam...

Monday, February 17, 2014

for (;;) { GoodNews(); BadNews(); }

Ever feel like you're stuck in an infinite loop?

Lots going on… let's start with some Good News; I think I got a subscription to Make: Magazine. Nifty. I'll know for sure when the first one shows up…

More Good News: I wired up my Adafruit TTL JPEG camera to an Uno for testing. Not having a Windows workstation, I found that WineBottler could sort of run the Windows-only CommTool. The only thing you need to remember to do is create a symlink from com1 to whatever your tty port is:

ln -s /dev/tty.usbmodemfa131 com1

in the 'dosdevices' folder of the app definition. To find that folder, just right-click commtool.app (or whatever you called it in WineBottler), choose 'Show Package Contents', and navigate down into it.

I did some fakery, copying the serial output through a software-serial port to another Arduino so I could check that the bytes sent and received matched the Adafruit protocol cheat sheet. They did (mostly, I think), and the camera also output NTSC video to an old analog colour monitor I have from my Apple II days.

Here is where the Bad News starts. I never did get CommTool working right: it would issue commands, and I could see the return bytes, but I don't think they ever made it back to CommTool. Maybe one day I'll bother to grab VMware Fusion and all that jazz, but not today. Needless to say, I spent a day futzing around with serial ports and basically cloning some of CommTool's functionality to test the camera, which worked. But it did waste a lot of time.
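For the curious, a stripped-down version of that kind of check, sketched in Go using the github.com/tarm/serial package (the device path is just my usbmodem port from above, and the command bytes are my reading of the cheat sheet, so verify them against the datasheet), looks something like this:

package main

import (
    "fmt"
    "log"
    "time"

    "github.com/tarm/serial"
)

func main() {
    // Open the camera's serial port (same usbmodem device as the symlink above).
    cfg := &serial.Config{Name: "/dev/tty.usbmodemfa131", Baud: 38400, ReadTimeout: 2 * time.Second}
    port, err := serial.OpenPort(cfg)
    if err != nil {
        log.Fatal(err)
    }
    defer port.Close()

    // GEN_VERSION per the cheat sheet: sign 0x56, serial 0x00, command 0x11, no data.
    if _, err := port.Write([]byte{0x56, 0x00, 0x11, 0x00}); err != nil {
        log.Fatal(err)
    }

    // A healthy reply should start 0x76 0x00 0x11 0x00, followed by a version string.
    buf := make([]byte, 64)
    n, _ := port.Read(buf)
    fmt.Printf("reply: % X\n", buf[:n])
}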

More Bad News: No matter what I did, I couldn't coax any colour signal out of the NTSC port. I spent a lot of time looking at every VC0706 data sheet and thread I could find, fiddling with saturation and other parameters, but the picture stayed pure monochrome. It wasn't until the end of that adventure that I found this post:

http://forums.adafruit.com/viewtopic.php?f=25&t=41050&p=203980&hilit=monochrome+camera#p203980

Guess what? It only outputs monochrome, so it was working 100% right from the start!

I'm sure the product description changed in the year since I bought it, because that's not a detail I'd overlook. The entire purpose was to take the NTSC signal into a video-to-USB frame grabber for the Beaglebone Black, as a way to do colour detection of the goal gates in the autonomous robot race here in Calgary. One gate is blue, one gate is green. Oh well...

The only way to get colour images is to take a JPEG snapshot and stream it out over serial at 38400 baud. Yuck, and not happening… at roughly 3,800 bytes per second, even a modest 40-50 KB snapshot works out to ten seconds or more per frame, which is WAY too slow for my tastes.

So I guess that means this $40 camera is now going back into the project box. I might use it as a 'backup camera' for the rover, since it does have one final interesting function (and reason #2 I bought it): It has motion detection built in.

So from a stop, set the motion detection mode and see if it triggers within some short time interval. If it does, then there is something moving behind the rover, where it has fewer and shorter-range sensors. In this mode I wouldn't ever use the NTSC video or JPEG imagery functions at all, just wait for the serial trigger to fire.
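In Go, the check could end up looking something like the sketch below. The command and trigger bytes (0x37 to enable motion monitoring, 0x39 as the asynchronous "motion detected" packet) are my reading of the Adafruit library defines, so treat them as assumptions to double-check against the datasheet:

package rearwatch

import (
    "io"
    "time"
)

// WatchRear arms the camera's motion detection and waits up to `window` for the
// asynchronous trigger packet. It returns true if something moved behind the rover.
// It assumes the port was opened with a short ReadTimeout so Read doesn't block forever.
func WatchRear(port io.ReadWriter, window time.Duration) (bool, error) {
    // Enable motion monitoring: sign 0x56, serial 0x00, COMM_MOTION_CTRL 0x37, one data byte = on.
    if _, err := port.Write([]byte{0x56, 0x00, 0x37, 0x01, 0x01}); err != nil {
        return false, err
    }
    buf := make([]byte, 32)
    deadline := time.Now().Add(window)
    for time.Now().Before(deadline) {
        n, _ := port.Read(buf) // timeouts and short reads are expected here
        for i := 0; i+2 < n; i++ {
            if buf[i] == 0x76 && buf[i+2] == 0x39 { // async "motion detected" header
                return true, nil
            }
        }
    }
    return false, nil
}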

So now the rover is essentially 'blind' until I find a replacement camera system. I'm thinking that I might purchase a webcam, as the prices have come down a lot since the good ol' days and I know that I can go direct into the Beaglebone for video via USB.

Then some Good News. I sat down this morning to write out the code for the low-mounted infrared pan sensors, which would have Atmel ATtiny85s as controllers. As I was cleaning up from the camera debacle I found an old Sharp IR sensor, one that I thought I had fried. On a lark I plugged it in and pointed it at the iSight camera on the iMac. I could see that it was emitting IR light, and a multimeter test showed it was actually not dead at all.

That saved me the purchase cost of a replacement IR sensor, or about $15, which got me thinking that I could now justify purchasing two webcams for stereo vision. I'll have to play around a bit to see if this is wise, since I'm not sure I have the CPU power left to do feature matching across two images.

But it's worth a shot, no?



Tuesday, February 11, 2014

Interesting Tidbit - Wheel Slips & Obstacles

I was thinking about routing over the last week, trying to tie up the loose ends between mapping for routing purposes and dead-reckoning for more detailed driving when GPS doesn't have enough resolution to discern small distances.

Wheel slip is something that I knew would happen in various scenarios, and for dead-reckoning it becomes important to at least apply a guess factor in known situations.

As it happens, I was surfing YouTube and discovered a video from the JPL Mars Yard that I hadn't seen yet (!) and that answered a question on wheel slip when obstacles are involved. Here it is:


The important bit happens after the front wheel comes down off the rock, starting around 0:22. If you watch carefully, the wheels appear to be driven at the same speed, I think. The middle wheel mounts the rock, but the front and rear wheels are held back and slip quite a bit.

I was thinking the correct thing to do would be to drive the middle wheel faster, since it has a larger distance to travel (up and over the rock), but given the slip involved in dragging the middle wheel up and over, it looks like it's simpler to drive them all at the same speed.

At least I know when to expect another wheel-slip situation, and that for dead-reckon driving it might happen more often than expected, i.e. whenever the suspension feedback indicates that a wheel is mounting an obstacle.
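Translated into code, the "guess factor" idea might look something like this in Go (the numbers here are pure guesses for illustration, not measured values):

package deadreckon

// Hypothetical values for illustration; the real ones have to come from testing.
const (
    metresPerTick   = 0.001 // wheel travel per encoder tick
    climbSlipFactor = 0.6   // assume ~40% of commanded travel is lost to slip while climbing
)

// Step estimates the distance covered for a batch of encoder ticks. The
// `climbing` flag comes from the suspension feedback, i.e. a wheel is
// currently mounting an obstacle, so we derate the distance by the slip guess.
func Step(ticks int, climbing bool) float64 {
    d := float64(ticks) * metresPerTick
    if climbing {
        d *= climbSlipFactor
    }
    return d
}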

Saturday, February 8, 2014

Quickie Updates!

I've had a busy month with a few big wins. In rough chronological order they are the completion of a workable DIY steering servo software package, the design of numerous parts in OpenSCAD (because I have a 3D printer on order) and a whole whack of Go software written. Some details:

The Steering Servos

It was a bit harder than I thought to write the code for a simple servo. I think when I imagined it I had assumed the weight of the rover would tend to dampen the motion to a point where 'porpoising' - an unwanted oscillation around the desired position - would not be a problem. But to counteract the dampening I had to drive the motors at 100% power until very near the desired point, which tends to create overshoots... which tends to set up an oscillation.

What I ended up doing was creating two kinds of drive through the L293D: a simple on/off at 100% power when the requested angle is far enough from the current angle, and a pulsed power level related to the angular difference when more accuracy is needed. I also added a bit of a power boost to overcome stalls when the pulsed level is too low to cause motion due to small obstacles or surface friction.
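The control logic boils down to something like the sketch below, written in Go for readability. setMotor stands in for whatever actually drives the L293D pins, and all of the thresholds are invented for the example rather than my tuned values:

package steer

import "math"

// Invented tuning values for the example; the real ones came from trial and error.
const (
    coarseBand = 8.0  // degrees: beyond this, just drive at full power
    deadBand   = 0.5  // degrees: close enough, stop
    minDuty    = 0.25 // duty cycle below which the motor tends to stall
    stallBoost = 0.15 // extra duty added when we're commanding motion but nothing moves
)

// setMotor is a stand-in for the code that drives the L293D: dir is +1 or -1,
// duty is 0..1 (implemented elsewhere as on/off pulses).
var setMotor func(dir int, duty float64)

// update runs once per control tick with the requested and measured angles in
// degrees, and whether the feedback showed any movement since the last tick.
func update(target, current float64, moving bool) {
    err := target - current
    dir := 1
    if err < 0 {
        dir = -1
    }
    switch {
    case math.Abs(err) > coarseBand:
        setMotor(dir, 1.0) // far from the target: 100% power, let the rover's weight damp it
    case math.Abs(err) > deadBand:
        duty := math.Abs(err) / coarseBand // pulse in proportion to the remaining error
        if duty < minDuty {
            duty = minDuty
        }
        if !moving {
            duty += stallBoost // nudge past friction or a small obstacle
        }
        setMotor(dir, duty)
    default:
        setMotor(dir, 0) // inside the dead band: stop
    }
}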

3D Printed Parts

I needed several parts that work with the Servocity/Actobotics aluminum channel; everything from the steering & feedback assembly to simple covers for the pan/tilt sensors. My best guess at how I might assemble these things from their standard catalog meant a TON of redesign time and, in most cases, too much added weight for the job at hand. Especially for gears, it was simpler to create multifunction elements in OpenSCAD.

After the usual amount of research I decided to order a printer. That was 3 weeks ago, and I don't think it's shipped yet. Not happy... the site says 1-2 weeks delay, and they indicated in an email that it should ship last week, but nope, nada, zilch. The company I've ordered from has (apparently) really good post-purchase support so I'm willing to wait, but if I don't see it shipped by the end of the week I'll have to cancel the order and look elsewhere. Time is getting pretty short, so I may have to go with a pre-assembled unit that costs twice as much to make up for the lost time.

Go, Go, Go...

I think I've raved about concurrency in Go before, but I've found another nifty thing about it - interfaces. When you consider that you can, for example, collect variables into a slice (a slice is like an array) and then call the same method name on them, with each type having its own method implementation, things get interesting in a good way. One application I found was command sequences, basically lists of commands: it only took a few hours to create a JSON reader for sequence files, then simply call the Run() method on each one in sequence (or concurrently :)
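A boiled-down version of the idea looks like this (the command types and JSON fields are invented for the example; the real sequence files have more going on):

package main

import (
    "encoding/json"
    "fmt"
    "log"
    "time"
)

// Command is the interface every sequence step satisfies.
type Command interface {
    Run() error
}

type Drive struct {
    Meters float64 `json:"meters"`
}

func (d Drive) Run() error {
    fmt.Printf("driving %.1f m\n", d.Meters)
    return nil
}

type Pause struct {
    Seconds float64 `json:"seconds"`
}

func (p Pause) Run() error {
    time.Sleep(time.Duration(p.Seconds * float64(time.Second)))
    return nil
}

// step is the on-disk shape: a command name plus the raw parameters for it.
type step struct {
    Cmd  string          `json:"cmd"`
    Args json.RawMessage `json:"args"`
}

func loadSequence(data []byte) ([]Command, error) {
    var steps []step
    if err := json.Unmarshal(data, &steps); err != nil {
        return nil, err
    }
    var seq []Command
    for _, s := range steps {
        switch s.Cmd {
        case "drive":
            var d Drive
            if err := json.Unmarshal(s.Args, &d); err != nil {
                return nil, err
            }
            seq = append(seq, d)
        case "pause":
            var p Pause
            if err := json.Unmarshal(s.Args, &p); err != nil {
                return nil, err
            }
            seq = append(seq, p)
        default:
            return nil, fmt.Errorf("unknown command %q", s.Cmd)
        }
    }
    return seq, nil
}

func main() {
    data := []byte(`[{"cmd":"drive","args":{"meters":1.5}},
                     {"cmd":"pause","args":{"seconds":0.5}}]`)
    seq, err := loadSequence(data)
    if err != nil {
        log.Fatal(err)
    }
    for _, c := range seq { // a slice of mixed types, each with its own Run()
        if err := c.Run(); err != nil {
            log.Fatal(err)
        }
    }
}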

Another less glamorous thing I got around to is understanding multiple source files and the package system. Being able to limit the editing to a particular file, and leave the finished ones alone, is huge. I can't count the number of times I've introduced bugs (or 'features'...) as a result of well-meant cleanups.

I also did a fun little toy program to see if it was reasonable to represent a least-cost routing map as a collection of goroutines that distribute routes via channels. It works, but my first stab at it is about 10x slower than a non-concurrent version and takes about 100x the memory. I'm sure it could be made more efficient, but I had to move on, so the routing system will stay with a more traditional implementation.
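For flavour, the shape of the toy was roughly the following (heavily simplified, with invented node/edge types, and tracking only best costs rather than full routes):

package toyroute

import (
    "math"
    "sync"
)

type edge struct {
    to     int
    weight int64
}

// Route gives each map node a goroutine and a channel; cost offers ripple out
// over the channels until nothing improves. Buffers are oversized because a
// ring of full buffers can deadlock this toy - part of why it eats memory.
func Route(graph [][]edge, source int) []int64 {
    n := len(graph)
    best := make([]int64, n)
    in := make([]chan int64, n)
    for i := range in {
        best[i] = math.MaxInt64
        in[i] = make(chan int64, 1024)
    }

    var pending sync.WaitGroup // counts cost offers still in flight
    send := func(to int, cost int64) {
        pending.Add(1)
        in[to] <- cost
    }

    var nodes sync.WaitGroup
    for i := 0; i < n; i++ {
        nodes.Add(1)
        go func(i int) {
            defer nodes.Done()
            for cost := range in[i] {
                if cost < best[i] { // a better route to this node: pass it along
                    best[i] = cost
                    for _, e := range graph[i] {
                        send(e.to, cost+e.weight)
                    }
                }
                pending.Done()
            }
        }(i)
    }

    send(source, 0)
    pending.Wait() // quiescent: no offers left anywhere
    for i := range in {
        close(in[i])
    }
    nodes.Wait()
    return best // best[i] is the least cost from source to node i
}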

That's it for now; hopefully I'll have something with some pretty pictures next time!