I’ve kind of outgrown the capabilities of my Mindstorms, so I’ve decided to get into something a little more powerful!
While I work my way through this, leading up to the VITTA conference in late November (where I’m presenting on building robots!), I’ll keep a step-by-step record on my blog of going from complete n00b to g2us!
I suppose the start of anything is working out the base. My bot needs to be able to cart a fair bit of kit around: it needs to carry some electronics (the brain) and some sensors (eyes, ears, that kind of stuff), and it also needs juice (and since I don’t want to screw the environment any more than I have to, the juice needs to be rechargeable).
So, onto what we’ll call “the platform”. This is essentially what we will screw, bolt, weld, and sticky-tape stuff to.
How will MapBot (I’m calling her MapBot because she will use GPS and MapPoint for some of her recon tasks) move? This is an interesting question. Essentially, she will have motors attached to the platform via mounts. These motors will be connected to what is known as a “speed controller”, which is essentially a device you can talk to over an RS-232 serial interface to control the speed of your motors. All this will need juice, so a set of rechargeable NiMH batteries will be in order, and we’ll bolt those to the platform too.
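To make the speed-controller idea concrete, here’s a rough sketch of framing a command packet before squirting it down the serial port. The packet layout (0xAA header, channel byte, signed speed byte, checksum) is entirely made up for illustration — every real controller defines its own protocol, so check the datasheet for yours.

```python
def make_speed_packet(channel: int, speed: int) -> bytes:
    """Build a hypothetical command packet: header, channel, speed (-127..127), checksum."""
    if not -127 <= speed <= 127:
        raise ValueError("speed out of range")
    body = bytes([0xAA, channel & 0xFF, speed & 0xFF])  # speed stored as a two's-complement byte
    checksum = sum(body) & 0xFF                          # simple additive checksum
    return body + bytes([checksum])

# In practice you'd write the packet to the port, e.g. with the pyserial library:
#   import serial
#   port = serial.Serial("COM1", 9600)
#   port.write(make_speed_packet(0, 64))   # channel 0 at half forward power
```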
So now we have movement, power, and something to bolt stuff onto (very important).
Next, let’s get a brain. I’ve got a Mini-ITX board from MEDC, which is capable of running Windows CE 6.0 (which supports .NET Compact Framework 2.0), amongst other things like SQL Server. This is what will control the motors, based on some (initially basic) AI that will use the sensors to determine how much power to put to the floor, and in what direction.
Oh, while we’re on direction, I’m going to go with the simplest form of turning, which is track-based movement. Essentially, my bot will have tracks, like those of a tractor, and when it wants to “turn”, it will simply reverse one side, drive the other forward, and spin.
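That “reverse one side, drive the other” idea is usually called skid steering, and the mixing is just addition and subtraction. Here’s a minimal sketch; the -100..100 scale and function names are my own assumptions, not tied to any particular speed controller.

```python
def mix_tracks(throttle: float, turn: float) -> tuple[float, float]:
    """Mix a forward/back throttle and a turn value (-100..100) into (left, right) track speeds."""
    left = throttle + turn
    right = throttle - turn
    # Clamp so neither track is asked for more than full power.
    clamp = lambda v: max(-100.0, min(100.0, v))
    return clamp(left), clamp(right)

# Straight ahead: both tracks equal.
# mix_tracks(50.0, 0.0)  -> (50.0, 50.0)
# Spin on the spot: one track full forward, the other full reverse.
# mix_tracks(0.0, 100.0) -> (100.0, -100.0)
```

The nice property is that a single pair of knobs (throttle, turn) covers driving straight, gentle arcs, and the on-the-spot spin described above.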
The other thing the tracks will have is what are known as “encoders”. These are simply discs with some form of hole or slot cut into them, paired with an optical reader. The disc is attached to the rotating wheel or track you want to monitor, and the reader shoots a beam through it. Each time the beam is broken, the reader knows the disc has turned a little further. The time between breaks tells you how quickly you’re moving, and the number of breaks tells you how far you’ve moved. We’ll record all this into the onboard database for real-time analysis.
We will also have some other sensors, such as infrared sensors for “sight” and touch sensors to detect other objects, and perhaps something like a scanning laser range finder to help me get a layout of rooms and such.
Of course MapBot will have to somehow plug into the Microsoft Robotics Studio, but that will be later down the track.
I’m so excited, robots are cool!