I've been carefully going through what's going on with SloBot and what I need to do. That has involved a lot of reading on numpy and other Python-related matters – and there's still a lot to do. Progress has felt quite slow as a result, but at least it's progress, and I'm starting to understand where this robot is at and what's wrong with the software I've put together so far.
I've checked out all the sensors by placing them at a known distance from an obstacle and seeing what they read. That wasn't hard really; here's the setup:
That also allowed me to better estimate the angle of each sensor using an Android protractor app – the outer ones are at ±55º, not ±60º.
All the sensors read 30cm, as they should, so that doesn't explain the weird readings I got when doing 360º rotations the other day:
What I think we're seeing there is a mixture of sonar readings and the current yaw value converted to x and y coordinates. Because the sonar sensors take a finite time to return a result, the code that issues a “ping” doesn't wait around for the answer. If sonar and yaw readings aren't kept paired up, they can, and do, get mixed up, causing all kinds of fun and games. That needs sorting out.
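One way to sort that out is to capture the yaw at the moment each ping goes out, and only ever pair an echo with the yaw from its own ping. Here's a rough sketch of that idea – `PingTracker`, its method names, and the 0.1s staleness cut-off are all my own invention, not SloBot's actual code:

```python
import time
from collections import deque

class PingTracker:
    """Pair each sonar echo with the yaw captured when its ping was sent,
    so a late-arriving echo can't be combined with a newer yaw reading."""

    def __init__(self):
        self.pending = deque()  # entries of (ping_id, yaw_at_ping, time_sent)

    def ping_sent(self, ping_id, yaw_now):
        # Record the yaw at the instant the ping is issued.
        self.pending.append((ping_id, yaw_now, time.monotonic()))

    def echo_received(self, ping_id, distance_cm, max_age_s=0.1):
        # Drop older pings that never produced an echo.
        while self.pending and self.pending[0][0] != ping_id:
            self.pending.popleft()
        if not self.pending:
            return None  # echo for a ping we already gave up on
        _, yaw, t_sent = self.pending.popleft()
        if time.monotonic() - t_sent > max_age_s:
            return None  # echo took too long; don't trust the pairing
        return (distance_cm, yaw)
```

The point is simply that the distance and the yaw it gets converted with always come from the same instant, however long the echo takes to arrive.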
I've been trying to get some kind of control, so I've been focussing on movement in a straight line (you have to start somewhere). Here's a video of some pre-programmed motion – the robot is told to:
- Make the distance from the centre sensor = 30cm
- Wait 2 seconds
- Distance = 40cm
- Wait 2 seconds
- Distance = 60cm
- Wait 1 second
- Distance = 10cm
[embedplusvideo height="480" width="768" editlink="http://bit.ly/1io3Xt5" standard="http://www.youtube.com/v/f6A_7Hc7MfY?fs=1&hd=1" vars="ytid=f6A_7Hc7MfY&width=768&height=480&start=&stop=&rs=w&hd=1&autoplay=0&react=0&chapters=&notes=" id="ep4546" /]
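The scripted run above boils down to a list of (target distance, pause) steps and a loop that walks through them. A minimal sketch – the `drive` callable that actually moves the robot until the centre sensor reads the target is assumed, not SloBot's real interface:

```python
import time

# The pre-programmed run as data: (target distance in cm, pause in seconds).
SEQUENCE = [(30, 2.0), (40, 2.0), (60, 1.0), (10, 0.0)]

def run_sequence(drive, steps=SEQUENCE, sleep=time.sleep):
    """Walk through the scripted moves, pausing between them.

    `drive(target_cm)` is a hypothetical blocking call that moves the
    robot until the centre sonar reads target_cm.
    """
    for target_cm, pause_s in steps:
        drive(target_cm)
        if pause_s:
            sleep(pause_s)
```

Keeping the sequence as plain data makes it easy to swap in different test runs without touching the movement code.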
The delay at the start is because I choose to reset the Gertduino board when starting, which means the MPU-6050 sensor has to be re-initialised. After that (when the yellow LED starts flashing) I have to wait for the MPU-6050 yaw measurement to settle – the MPU-6050 does some internal magic to compensate for gyro drift, but this takes time. Then we're off. Not too bad really.
Clearly, the motors and tyres don't really want to go in a straight line. I was working on using the yaw angle to adjust the motor speeds to maintain a straight line when I hit the yaw/ping mix-up issue.
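The yaw-based correction I was working towards is essentially a proportional controller: trim the left/right motor speeds by an amount proportional to the yaw error. A sketch – the gain, base speed, and sign convention (positive yaw = drifted anticlockwise) are all assumptions, not measured values:

```python
def straight_line_speeds(yaw_deg, target_yaw_deg=0.0,
                         base_speed=0.6, kp=0.02):
    """Return (left, right) motor speeds in 0..1 that steer the robot
    back towards target_yaw_deg.

    Assumes positive yaw means the robot has turned anticlockwise, so
    the left motor must speed up (and the right slow down) to correct.
    kp and base_speed are made-up illustrative values.
    """
    error = target_yaw_deg - yaw_deg
    correction = kp * error
    left = max(0.0, min(1.0, base_speed - correction))
    right = max(0.0, min(1.0, base_speed + correction))
    return left, right
```

With a reliable yaw feed (once the ping mix-up is fixed), calling this each control tick should hold the heading; adding integral/derivative terms is the usual next step if pure proportional control oscillates.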
How accurate is it?
That's about 8 and a bit cm (the camera angle makes it look less than it is; the real value is close to 9cm). Not bad really, since there's no "control" involved and I just stop the robot once it's reached or gone past its target.
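The "reached or gone past" stop condition is direction-dependent: closing in on an obstacle means stopping at or below the target distance, backing away means at or above it. As a tiny sketch (the function and flag names are mine):

```python
def reached(distance_cm, target_cm, moving_towards):
    """True once the robot is at or past its target distance.
    `moving_towards` is True when driving towards the obstacle
    (distance shrinking), False when backing away from it."""
    if moving_towards:
        return distance_cm <= target_cm
    return distance_cm >= target_cm
```

The overshoot this allows is exactly where the ~9cm error comes from; a proper controller would slow down as the target approaches instead of stopping dead past it.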
It's coming along. I'd like to have some basic control over what it does before I try to get SLAM up and running. There are lots of issues to think about (and lots more stuff to read). I would seem to have options like detecting edges in the data (assuming things like walls are made up of straight lines – almost anything can be approximated that way) or just keeping a cloud of points (which is both simpler and more complicated at the same time).
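To make the "walls are straight lines" idea concrete, here's a toy illustration using the numpy I've been reading up on: least-squares fit a line to a batch of (x, y) sonar points and accept it as a wall only if the points are straight enough. The residual threshold is arbitrary, and a real version would handle near-vertical walls (e.g. by swapping axes) and split point batches at corners:

```python
import numpy as np

def fit_wall(points, max_rms_residual=1.0):
    """Fit y = m*x + c to a batch of (x, y) points.

    Returns (m, c) if the points lie close enough to a straight line
    to plausibly be a wall, otherwise None. The threshold is arbitrary
    and in the same units as the coordinates.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    coeffs, res, *_ = np.polyfit(x, y, 1, full=True)
    m, c = coeffs
    # res is the sum of squared residuals (empty array for an exact fit
    # with only two points); convert to an RMS per-point figure.
    rms = np.sqrt(res[0] / len(x)) if res.size else 0.0
    return (m, c) if rms <= max_rms_residual else None
```

The point-cloud alternative skips all this fitting and just accumulates (x, y) hits, which is why it's simpler to build but harder to reason about later.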