The PAA5100JE Near Optical Flow Sensor

A while back I’d tested out the PMW3901 Optical Flow Sensor, a Breakout Garden board from Pimoroni. It’s a vision sensor that contains a tiny camera, typically used on flying drones. Coupled with a drone’s GPS sensor and Inertial Measurement Unit (IMU), an optical flow sensor assists in navigational control by observing the motion of the ground below it, taking successive pictures and comparing how far an image has moved relative to the previous one. Since the camera itself doesn’t know how far it is from the surface it can only provide relative numbers, not distances in measurement units like feet or meters.

Since the PMW3901’s chip was designed for drones, its camera’s effective range is 80mm to infinity. This means that if used on a ground-based robot its lens must be at least 80mm from the ground, and its view must not be impeded. This can pose a challenge for the design of a robot, whose bottom must either be at least 80mm from the ground or contain a large open area for the sensor, ideally right in the center of the robot. But even then we’d be using the sensor at the extreme lower limit of its range; it’s really tuned for being way up in the sky.

PAA5100JE-Q

Now it turns out that PixArt, the maker of the sensor chip, also manufactures a very similar sensor called the PAA5100JE-Q that is tuned for working close to the ground, apparently used in service robots and robot vacuum cleaners. Whereas the Pimoroni carrier board is 24 by 24 mm, the PixArt sensor itself is only 6mm square. Depending on your screen resolution the image to the right is roughly the size of the sensor. According to the PAA5100JE datasheet, the chip is advertised as having a tracking speed of up to 45 inches per second and a range of 10-35mm, which seems ideal for hanging from the bottom of a robot. They consider it suitable for use on “carpet, granite, tiles, wooden floors with dry or wet surfaces.”

Fabricating the new PAA5100JE Near Optical Flow Sensor

So I went ahead and sent an email to Pimoroni, suggesting that since the two sensors are very similar it might be possible to develop a new version of their existing carrier board, but using the PAA5100JE.

I discussed the idea of a ground-based optical flow sensor with the DPRG during the weekly video conference and there was a general consensus that this sounded like a great idea, especially for mecanum-wheeled robots, where wheel odometry is problematic. Its president, Carl, is currently building a mecanum-wheeled robot and the availability of such a sensor would potentially enable visual odometry.

I received an email from Pimoroni indicating that they were going to develop a new Breakout Garden board using the PAA5100JE, and in March told me they’d send me one prior to it being released. The photo of the boards being fabricated was quite exciting to see. So I waited patiently for the package to arrive in the post from the UK. In New Zealand I think we’re accustomed to waiting for overseas shipments to arrive.

The New Sensor Arrives

Last Friday a small mysterious package arrived containing the new PAA5100JE-Q Near Optical Flow Sensor. When placed next to the original PMW3901 it’s clear they’re related; the only discernible differences seem to be that the PAA5100JE has a different lens, the illumination LEDs seem a bit larger (though this may not be significant), and the SMT components to the right of the lens have been reorganised. Perhaps the nicest thing is that I didn’t have to wait for software to be developed, as the open source library for the PMW3901 seemed to work just fine.

Sister Sensors

Ground Testing

So I thought the first thing to do was test the new sensor to see how I might use it for visual odometry. The output is simply an x,y value that is pushed from the sensor when it senses motion. But there are no units to that x,y, nor could there be since its camera doesn’t know how far away the image is. We’ve got to help it a bit.

One of the first surprises was that the optical flow varied quite significantly depending on the surface below it, which seems to be related to the level of surface detail.

The sensor was tested on a variety of surfaces: a treated pine fence paling, concrete, unfinished birch ply, matai hardwood flooring, lawn, a Persian rug, painted wood stripes, bathroom tile, and a kwila hardwood deck.

About a week after the sensor arrived I received an email from Pimoroni stating that their engineers had finished the modifications to the Python library and that it was now available. This uses the same pmw3901 library as before, but you import the PAA5100 class instead, e.g.,

from pmw3901 import PAA5100

I updated the library on my Raspberry Pi and changed the import statement in my test file, then went ahead and retested the sensor across my nine test surfaces. It turns out this made a significant difference in the results.
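For reference, reading motion from the PAA5100 follows the same pattern as the pmw3901 library’s example code. Below is a minimal sketch, assuming the breakout sits in the front Breakout Garden SPI slot; the counts-per-millimetre constant is hypothetical and would have to be calibrated for a given mounting height and surface.

    import time
    from pmw3901 import PAA5100, BG_CS_FRONT_BCM

    flo = PAA5100(spi_port=0, spi_cs_gpio=BG_CS_FRONT_BCM)

    COUNTS_PER_MM = 10.0     # hypothetical: calibrate by pushing the robot a known distance
    tx = ty = 0

    while True:
        try:
            x, y = flo.get_motion()   # relative counts since the last reading
        except RuntimeError:
            continue                  # no fresh motion data before the timeout
        tx += x
        ty += y
        print('moved {:.1f}mm, {:.1f}mm'.format(tx / COUNTS_PER_MM, ty / COUNTS_PER_MM))
        time.sleep(0.01)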

PMW3901 vs. PAA5100JE

It’s clear that the ranges of the PMW3901 (80mm to infinity) and the PAA5100JE (10-35mm) represent two distinct design choices. If your robot’s base is close to the ground, that is, between 10 and 35mm, the latter will work just fine.

If you’re building something like a Mars Rover or some other design where the lowest part of the chassis is more than 35mm from the ground (too high for the PAA5100JE) and you can mount the sensor at least 80mm up, then the PMW3901 would work better. It’s that span between 35mm and 80mm that may present a design problem: either hang the PAA5100JE lower than the robot’s chassis (reducing ground clearance) or somehow position the PMW3901 higher on the robot so that it is at least 80mm from the ground. Not ideal in either case.

This blog entry isn’t really finished as I need to do more testing, but the KROS-Core Python library project has been taking up most of my time of late, so I’m publishing it now just to get it out the door (somebody has asked to see it). So I’ll come back to this later when I get a bit more time…

In Pursuit of the Club Robot

I spent about twenty minutes earlier today with a needle file, carefully and patiently filing away part of a hard metal battery terminal that I’d inadvertently epoxied oh so very slightly too high, such that it was blocking the little tab meant to fit in that hole and keeping me from reattaching the black plastic battery cover of my new Zumo robot. In the photo above you can see that tiny bit of metal peeking up through the hole, with that bit of black plastic to the left less than a millimeter thick. Very fragile polystyrene plastic. One slip and I’d break it.

I’d already filed away some plastic to leave space for the battery wires to exit the battery compartment, which normally fits snug against the PC board. Normally one would just solder the battery terminals directly to the PC board, meaning: permanent connections. I have been avoiding permanent connections. This is not a reflection on my private life, just the Zumo.

To some degree this represents the end of my journey. For over a year now I’ve been searching for a “club robot”, that is, a starter robot with enough appeal for beginners but still with enough growth potential that it wouldn’t sit on the shelf either due to its owner losing interest in robotics or growing out of it into a more functional robot.

It’s a bit of a tall order. My requirements included:

  1. low cost: the overall cost of the project should be under NZ$200 (~US$145)
  2. easy to buy: use a hardware platform that if single-sourced would be from a reputable company with good support, or otherwise could be built from readily-available parts using a recipe
  3. easy to build: anyone should be able to build the robot using common mechanical skills and tools
  4. easy to modify: the robot should permit customisation, such as adding new sensors
  5. programmable in python: preferably programmable in Python rather than C/C++, since the latter’s difficulty and learning curve are significantly higher [one goal of my own robot journey was to learn Python]. Many single board computers and microcontrollers can be programmed in multiple languages; some, like the Arduino, only in C/C++

First Steps

The MakeBlock mBot

My thoughts for a club robot first went to various small robots such as the MakeBlock mBot, which is an attractive option but targeted at primary school children. These are simple robots often designed as part of a larger school learning programme. They would excel at the first three requirements but fail on the fourth and fifth. The mBot is advertised as an “entry-level” robot for beginners. Fair enough. But not for this club’s club robot.

The mBot uses a micro:bit, an inexpensive single board microcontroller designed by the BBC (yes, the British Broadcasting Corporation, that BBC). The one thing about the micro:bit that keeps it somewhat in the running is that it can be programmed using one of two graphical user interfaces, Microsoft MakeCode or Scratch, or using an online tool that edits code in Python and downloads it to the micro:bit to execute.

Assembling the AlphaBot2

Last year I’d helped some friends buy a micro:bit-based Waveshare AlphaBot2 robot for their daughters, which after batteries and charger ended up north of NZ$250. It’s a nicely designed and manufactured robot, but after purchasing the LiPo batteries and a safe battery charger it was somewhat too expensive, and it’s not expandable, kind of a fixed entity. There’s not even a place on the robot to attach any additional features. I haven’t asked lately if it’s gathering dust or if the eldest girl, who is now 11, has shown any ongoing interest in the robot. It seemed like a success at the time, but the whole exercise pointed out the likelihood that, without some guidance and mentoring, a lot of kids might feel a bit left adrift with what is a lot more than simply a toy.

But all three of the avenues for programming the micro:bit are very limited. It’s not the kind of tool set that would appeal much to a teenager, much less an adult, so it’s not really suitable as the processor for a club robot. So the AlphaBot with micro:bit was out, and its Arduino and Raspberry Pi brethren too, since the robot is not expandable and doesn’t have the ability to fit motor encoders or additional sensors.

The Zumo Sumo Robot

Then the idea of the Zumo robot came up, though it uses an Arduino, rather old-school and slow nowadays with its 16MHz ATmega32u4 processor, and only programmable in C/C++. That being said, there are numerous code libraries available in C/C++ for Robot Sumo competitions, line-following and many other challenges and techniques. It’s widely known and widely available, so it certainly qualifies as a viable club robot if I’m willing to give up the Python requirement. But I wasn’t, at least not initially.

The default Zumo Robot for Arduino consists of a chassis, the Arduino shield, a pair of motors, a stainless steel blade to push off combatants, and an optional array of infrared sensors to sense the black and white transition at the edge of the Sumo arena. You’d then add sensors to locate the opponent. If you build it exactly according to that plan it’s a robot with a reasonable cost and reasonable build time.

But I wasn’t going to give up on programming it in Python so I also ordered some other parts that would permit me to swap out the Arduino shield for a Kitronik Robotics Board for the Raspberry Pi Pico. The Pico can be programmed in C/C++ but also in MicroPython.

So I put together an order of Zumo parts from Pololu, as well as the Kitronik/Pico parts from Pimoroni.

What seemed like a very long time later the packages had both arrived.

I myself am pretty happy working in Python and believe it’s a good job skill, and thought: I wouldn’t want to be tasked with teaching any kids C++. I don’t even like it as a language. So unless there’s some way to program a Zumo in Python it’s not in the running.

So why even bother?

I then remembered a conversation from a few weeks back with David Anderson of the Dallas Personal Robotics Group (DPRG), who, out of what I believe was sincere curiosity, had asked me: “Why are you bothering to try to start a club anyway? Why not simply build your own robots?” I had a nice pat answer about enjoying teaching people things and wanting to share some of the knowledge I’d gained over the years. But that question kinda lodged in my skull and has been festering there since then. It’s a good question.

I believe David explained his own motivation as simply wanting to build some great robots with the understanding that if people are interested they’ll be inspired by them and get involved in robotics. Lead by example. Seeing David’s robots in action was actually what spurred me to consider, after almost 40 years, getting back into robotics. David seems to be mostly interested in building robots, not trying to teach anyone or lead a group of people. It occurred to me that he’d also asked: “Why do you care if people are involved in robotics or not?” And he had me there. Why do I care? It’s not like I would benefit if more people built robots. There’s this idealistic notion that the world needs more engineers, and it’s a nice thought that I might inspire someone to take up engineering, but that’s pretty indirect. I’m not a school teacher, I don’t have any students. There are currently no NZPRG members except myself. I’m not even sure I have the time to actually lead a local club. And I’m sure David would rightly say that all the time I’d spend running a club would be time taken away from doing the thing I actually enjoy doing, which is building robots.

My Customised Zumo

Today I finally finished the hardware of my Zumo. I could have simply built it according to the plans, which would likely have taken me less than an hour. But I was in my head exploring this idea that with suitable modifications it might suit as a club robot. The amount of time I invested should itself answer the question of whether it could pose as a possible club robot: absolutely no way. I wasn’t keeping track of the hours, but it was certainly more than an eight hour day’s work. Pretty silly.

The Zumo robot from Pololu comes in two forms: an already-assembled Zumo 32U4 Robot model and a Zumo Robot for Arduino. With the latter the robot effectively plugs upside-down onto the top of an Arduino microcontroller as a shield, the common name for Arduino accessories. The 32U4 version actually has more features built in, like a little LCD display, motor encoders and line-following sensors, whereas the shield is, in theory, more flexible. That theory turned out to be rather illusory.

But for the Arduino shield version that I’d built, I had to figure out how to add motor encoders, which aren’t part of the deal normally. The motors are held in their own small space in the chassis, with no holes even for the twelve wires to exit. Yeah, twelve wires. Each motor has positive and negative power, plus the four wires (A1, B1, A2, B2) for the encoder. The Pololu encoders come in three varieties, but only the bare board version would seem to fit into the space.

Just enough room for the motor encoders. Now where to put twelve wires?

So where to go with twelve wires? If one is going to fit the infrared sensor array it can’t be forward, as the connector fits snugly against the front of the chassis. We can’t go up as that would run into the Arduino shield, nor back as that is the battery compartment. So it had to be down, though that is the ground. There’s a few millimeters of clearance between the bottom of the motor chamber and the actual bottom of the robot, so that’s it.

The twelve wires found their way out. This also shows the Reflectance Sensor Array in place.

So I managed to find a way to connect the motors and encoders and route the wires up to the controller.

This also meant that the twelve wires and two battery connections (now a black and red wire) couldn’t be soldered to the Arduino shield either; I’d have to use Dupont connectors.

So… long story short, I ended up after many hours with an Arduino Leonardo atop the Pololu Zumo shield, all connected with a bunch of wires. The infrared array plugs very cleanly underneath the shield. But I’ve not yet added any ability for the robot to see Sumo opponents, like a pair of Sharp analog infrared sensors or an ultrasonic sensor.

The completed Zumo with motor encoders and a removable shield.

The End

Yeah, also a song by The Doors. But having mostly-completed the hardware for this Zumo brought me to what had been that question at the back of my head all along: why am I doing this? It was all in mind of this “club robot” idea.

The End: a little bit depressing.

I have several much larger, more complicated robots that are outfitted with a Raspberry Pi and lots of sensors, and I’ve been happily programming them in Python. Over the past few years I’ve developed a fair bit of code in support of the hardware, and continue to work on that. Was I now going to start over in C++ on this Zumo? Really?

What would be perhaps an ideal platform for someone else now just felt like a burden. As I said earlier, I don’t even like programming in C++, and while there are available libraries for the Zumo, I’d be treading on territory others have passed through long ago. I’d be going from a robot with more than a dozen sensors to one where even adding a few is rather more difficult. Why would I do this?

What’s after The End?

So what’s next? Well, I could sell the Zumo. It’s perfectly functional and has motor encoders. There’s also the idea that I could swap out that Arduino shield for the Kitronik Robotics Board and program the Raspberry Pi Pico in MicroPython.

The Kitronik Robotics Board for Raspberry Pi Pico

I’d basically be putting the shield and the Leonardo in a plastic storage box and never looking at it again, which seems a bit of a waste. But this was an experiment, a thesis, and my thesis has borne its fruit: the knowledge that while a robot based on the Zumo Robot for Arduino is a perfectly fine small robot for anyone who wants to program in C++ and compete in Mini Sumo competitions — which is what it’s designed for — it is difficult to modify. If one wants to use odometry for navigation the readymade Zumo 32U4 Robot already has motor encoders built in, and would therefore be more suited for more advanced robotics, something beyond a Sumo competition.

But to fulfill my set of requirements, a robot based on the Zumo chassis still remains a real possibility, using a different controller. I’ll explore that in a future post.

Cheap Blob Detection

[No, not detecting cheap blobs, but rather blob detection on the cheap.]

I’m still kinda blown away that Steve McQueen was in this movie… and the theme song was composed by Burt Bacharach!

Way back in mid-May (seems a very long time ago now), I’d done a bit of exploring using the default Raspberry Pi camera, with an eye toward a possible solution for the DPRG Four Corners Competition. The challenge is to have a robot trace out a large square on the ground marked with orange traffic cones/pylons. Even with well-tuned odometry it’s a difficult trick, as the requirements include that the square the robot traces must align with the four markers of the course. You can’t just travel off in any direction, make three 90 degree turns and come back to the start; you have to be able to aim at that first marker accurately. There’s the rub: how to aim at something.

So to address that challenge I’d mounted a Raspberry Pi camera to the front of my KR01 robot and posted a 5mm pink LED as a sentinel three meters in front of the robot. Would it be possible for the robot to aim at that pink LED? The preliminary results from that experiment were largely positive: the robot could at least see the LED at a distance of 3 meters, even in daylight. What I’d managed to achieve, with a deliberately very low resolution image, was detection of a blob of pixels indicating a close match to the hue of the pink LED.

The front of the KR01 robot showing the Raspberry Pi camera and three analog infrared detectors

The next part would be how to aim the robot at that blob. This post further explores that solution, still just using the Raspberry Pi camera.

Building an LED Beacon

Photo of disassembled beacon

First, one thing about LEDs is that they’re largely directional, a bit like a small laser encased in plastic. Some use translucent rather than transparent plastic to diffuse the light a bit, but if the LED is facing away from the robot it will basically disappear from view, except for perhaps its light shining on something else, which wouldn’t be helpful (see more on floor reflections, below).

Photo of assembled beacon

As you do, there was this bit of plastic I’d been saving for, what, over ten years, that I kept trying to find a use for. It was the diffuser off a pretty useless LED camp light: useless as it was large, didn’t put out that much light, its plastic case had gone sticky, and I was already carrying an LED headlamp that was smaller and brighter. So I discarded the rest and kept the omnidirectional diffuser. I also had this rather beautiful polished aluminum hub from an old hard drive, and it turns out the plastic diffuser fit exactly onto the hub.

So I wired up a small PC board with three pink LEDs and a potentiometer to control the voltage to the LEDs, and built a small frame out of 3mm black Delrin plastic to hold it and the USB hub providing the 5 volt power supply. Another thing just lying around the house…

I added a tiny bit of cotton wool inside to further diffuse the light, so it would show up on the Pi’s camera as a pink blob. I opened several images taken from the robot in a graphics program and noted the RGB values of various pixels within the pink of the beacon. One of these colors would be the “target color”.

Blob Detection

Now the bright idea I had was that we didn’t really need much in terms of image data to figure out where the pink blob was.

Given my robot is a carpet crawler, we can make some relatively safe assumptions about the environment it will be in, even if I let it out onto the front deck. Lighting and floor covering may vary, but my floors and the front deck are largely horizontal. Since the floor and the robot are mostly level, we can therefore expect to find the beacon near the vertical center of the image, and safely ignore the top and bottom of the image. All we care about is where, horizontally, that blob appears.

I’ll talk more about the Python class I wrote to handle the blob processing below.

I placed the beacon on the rug, plugged it into a power supply and pointed the robot at it. The Raspberry Pi camera’s resolution can be set in software. Set at High Definition (1920×1080) we see the view the robot sees, with the pink beacon exactly one meter away from the camera:

With the Pi camera set at 1920×1088 we can see the view from the robot
Actual size image

Now, for our purposes there’s no need for that kind of resolution. It’s instead set at what seems to be around the Pi camera’s minimum resolution of 128 x 64 pixels. While irritatingly small for humans, robots really don’t care. Or if they do care, they don’t complain.
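For what it’s worth, grabbing a single low-resolution frame into a numpy array with the picamera library follows the usual pattern shown below. This isn’t the Blob class itself, just a sketch of that one step.

    import time
    import picamera
    from picamera.array import PiRGBArray

    with picamera.PiCamera() as camera:
        camera.resolution = (128, 64)
        time.sleep(2)                         # let the camera warm up
        output = PiRGBArray(camera)
        camera.capture(output, format='rgb')  # one 128 x 64 frame
        image = output.array                  # numpy array of shape (64, 128, 3)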

The Blob Class

The Python class for the blob detector is just named Blob (and available, along with the rest of the code, on GitHub). In a nutshell it does the following:

  • Configure and start up the Raspberry Pi camera.
  • Take a picture as a record, storing it as a JPEG.
  • Process the image by iterating over each row, pixel by pixel, comparing the color distance (hue) of each pixel with the target color (pink). Each pixel’s color distance value is stored in a new array the same size as the image. We enumerate each color distance value and assign a color from a fixed palette, printing a small square character (0xAA) in that color. We do this row by row, displaying a grid-like image representing color distance. Note that there’s a configuration option to start and stop processing at specified rows, effectively ignoring the top and bottom of the image.
  • Sum the color distance values of each column, reducing the entire image to a single line array.
  • Enumerate the distribution of this array so that there are only ten distinct values, replacing each element with this enumerated value. We also use a low-pass filter to eliminate (set to zero) all elements where there wasn’t enough color match to reach that threshold. Like the color distance image we print this single line after the word “sum” in order to see what this process has turned up.
  • Find all the peaks by determining which of the values in this array are the highest of all, given a threshold, and again print this line after the word “peaks”.
  • Find the highest peak: If there are multiple peaks within 5% of the image width of each other we assume they’re coming from the same light source, so we average them together, again drawing a line of squares following the word “peak”.
  • Return a single value from the function (the index of the highest peak in the array), considered as the center of the light source. If there are too many peaks or they’re too far apart we can’t make any assumptions about the location of the beacon and the function returns -1 as an error value.
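To make the middle steps concrete, here’s a rough sketch of the column-sum and peak-merging logic in isolation. This isn’t the actual Blob code, just the same idea with invented names, operating on a 2D array in which higher values mean a closer hue match.

    import numpy as np

    def find_beacon_column(match, threshold=0.25, merge_frac=0.05):
        # collapse the image to a single row by summing each column's match values
        sums = match.sum(axis=0)
        peak = sums.max()
        if peak <= 0.0:
            return -1
        sums = sums / peak
        sums[sums < threshold] = 0.0            # low-pass: discard weak columns
        peaks = np.flatnonzero(sums >= 0.9)     # columns near the highest value
        # if the candidate peaks span more than ~5% of the image width,
        # they probably aren't one light source, so give up
        if peaks.max() - peaks.min() > merge_frac * match.shape[1]:
            return -1
        return int(round(peaks.mean()))         # centre of the merged peak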

The result of this is shown below:

The console output from Blob showing the 128 x 64 pixels, each colorised depending on how close it is in hue to pink (regardless of brightness or saturation). Click on the image for a full size view.

So if the image resolution is 128 x 64, the result will either be a -1 if we can’t determine the location of the beacon, or a value from 0 to 128.

An image taken from the Pi camera of a 1 meter rule at a distance of 1 meter

So what to do with this value? If the robot is aiming straight at the beacon the value will be 64. A lower value means the beacon is off to the left (port); a value higher than 64 means the beacon is off to the right (starboard). I’m playing around right now with figuring out what these values mean in terms of turn angle by comparing what 0 and 128 represent against an image of a one meter rule taken from the robot at a distance of one meter (see above).
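As a first approximation, that metre-rule photo gives a pixels-to-angle conversion. Here’s a sketch; the width visible across the full frame at one meter is a placeholder value I still need to read off that image.

    import math

    IMAGE_WIDTH = 128
    WIDTH_AT_1M_M = 1.0    # placeholder: metres visible across the full frame at 1m

    def bearing_degrees(column):
        # pixel offset from the image centre, converted to metres at a 1m distance,
        # then to an angle: negative is port, positive is starboard
        offset_px = column - IMAGE_WIDTH / 2.0
        offset_m = offset_px * (WIDTH_AT_1M_M / IMAGE_WIDTH)
        return math.degrees(math.atan2(offset_m, 1.0))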

One of the things we’ve talked about in the weekly DPRG videoconferences is the idea that the difference between two PID controllers’ velocities as an integral can be used for steering. Or something like that, I can’t pretend to understand the math yet. But it occurs to me that the value returned from the Blob function as a difference from center could be used to steer the robot towards the beacon.
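I can’t claim this is the same scheme, but the simplest version of the idea would treat that difference from center as the error term of a proportional controller, something like:

    def steer_toward_beacon(column, cruise=0.3, k_p=0.2, image_width=128):
        if column < 0:
            return cruise, cruise       # no beacon found: hold course
        # error ranges from -1.0 (far port) to +1.0 (far starboard)
        error = (column - image_width / 2.0) / (image_width / 2.0)
        port = cruise + k_p * error     # beacon to starboard: port side speeds up
        starboard = cruise - k_p * error
        clamp = lambda v: max(-1.0, min(1.0, v))
        return clamp(port), clamp(starboard)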

Performance & Reliability

In my earlier tests there was clearly an issue of performance. While the Blob processing time is pretty quick the time the Pi camera was taking to create the image was around 600ms for that 128 x 64 pixel image, with of course more time for larger images. It turns out this was due to a bug in my code: I was creating and warming up the camera for each image. The Pi’s camera can stream images at video speed, at least 30 frames per second, so I’ve got a bit of work to figure out how to grab JPEG images from the camera at speed. So the code posted on github as of this writing is okay for a single image but can’t run at full speed. Until I fix the bug. I’ll update this blog post when I do.
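One standard picamera approach would be to keep a single warmed-up camera object and pull JPEG frames off the video port; something along these lines should work, though it’s not what’s in the posted code yet:

    import io
    import time
    import picamera

    with picamera.PiCamera(resolution=(128, 64), framerate=30) as camera:
        time.sleep(2)                       # warm up once, not once per frame
        stream = io.BytesIO()
        for _ in camera.capture_continuous(stream, format='jpeg', use_video_port=True):
            jpeg_bytes = stream.getvalue()  # hand this frame to the Blob processing
            stream.seek(0)
            stream.truncate()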

And as to reliability, the pink LED beacon is not very bright. In high ambient light settings it’s more difficult for the robot to discern. Some of the DPRG’s competitions, which are usually held either outdoors or indoors under bright lighting conditions, use a beer can covered in fluorescent orange tape. Since the Blob color distance method is designed for hue, if the robot were running in a bright room perhaps an orange beer can would work better than a pink LED beacon. If we tuned that algorithm to also include brightness, perhaps a red LED laser dot might work. All room for experimentation.

I’d noted early on that it was somewhat easy to fool the image processor. My house has a wide variety of colors, rugs, books, all manner of things. That particular pink doesn’t show up much however, and if I threw away the times when multiple peaks showed up, basically gave up when I wasn’t sure, then the reliability was relatively high. If the beacon’s light is reflected on another surface (like my couch) the robot might think that was the beacon, so I need to make sure the beacon is not too near another object. I also noted that when placing the beacon on a wood floor the reflection on the floor would show up on camera, but this actually amplified the result, since the reflection will always be directly below the beacon and we’re only concerned with horizontal position.

Next Steps

There’s plenty of room to both optimise and improve the Blob class. I’m still getting either too many false positives or abandoning images where the algorithm can’t make out the beacon.

It occurs to me also that we don’t have to train the Blob class to look for pink LEDs or orange cans. It would be relatively trivial to convert it to a line follower by aiming the camera down towards the front of the robot, altering the color distance method to deal solely with brightness, and using the returned value to tell the robot the location of the line.

Maybe next time.

The Four Corners Challenge

This is just a little ditty that I will update as things progress. It comes about from conversations on the DPRG mailing list about one of their commonly held robot contests, called the Four Corners Competition. Here’s the actual definition from April of 2018:

Objective: The robot will travel a rectangular path around a square course. The corners of the course will be marked with a small marker or cone. Before the robot makes its run, a mark or sticker will be placed on the center front of the robot and on the floor of the course. The objective is to minimize the distance between the two marks at the end of the run.

David Anderson pointed out that this contest goes back to a 1994 University of Michigan Benchmark (called UMBmark), “A Method for Measuring, Comparing, and Correcting Dead-reckoning Errors in Mobile Robots“, developed by J. Borenstein and L. Feng. As David describes it, the “concept is to drive around a large square, clockwise and counter-clockwise, while tracking the robot’s position with odometry, and stop at the starting point and measure the difference between the stopping point and the starting point. This shows how much the odometry is in error and in which direction, and allows calibration of the odometry constants and also the potential difference in size between the two wheels of a differentially driven robot. The DPRG uses this calibration method as a contest.” He even wrote a paper about it.

Well, I keep maintaining the purpose of my robotics journey is not to engage in competition (and I swear that I’m not a competitive person, I’m really not, no I am not), but I do think that this benchmark is a good exercise for fine-tuning a robot’s odometry. Nuthin’ to do with competition, nope, just a challenge.

The key to this challenge seems to be twofold: 1. getting the odometry settings correct; and 2. being able to accurately point the robot at that first marker. As regards the latter, the contest permits the robot to be aimed at the first marker using any method, so long as the method is removed prior to the contest starting. Over the years various approaches to this have been tried: ultrasonics, aiming the robot using a laser pointer, etc. I tried creating a gap between two boards and seeing if my existing VL53L1X sensor could see the gap, but then realised its field of view is 27°, so it’s not going to see a narrow gap at a distance of several meters. I then contemplated mounting a different, more expensive LiDAR-like sensor with a 2-3° field of view, but at 8-15 feet (the typical size of the course) that’s still not accurate enough.
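The rough arithmetic makes the problem clear: the spot a sensor sees is about 2 × distance × tan(fov/2) wide, so even a narrow beam covers a lot of ground at course distances.

    import math

    def spot_width_m(distance_m, fov_degrees):
        # width of the area covered by a conical field of view at a given distance
        return 2.0 * distance_m * math.tan(math.radians(fov_degrees / 2.0))

    print(spot_width_m(3.0, 27.0))   # VL53L1X at 3m: about 1.4m wide
    print(spot_width_m(3.0, 2.5))    # a 2-3 degree sensor at 3m: still about 13cm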

Tactical Hunting Super Mini Red Dot Laser

This challenge has somehow lodged itself in the back of my head, the buzzing sound of a mosquito in a darkened bedroom. As I may resort to aiming the robot using a laser pointer I’ve put in an order from somewhere in China for a tiny “tactical hunting super mini red dot laser” (which kinda says it all). I’ll in any case definitely install the tactical hunting super mini red dot laser just ’cause it will look so cool and dangerous. But a laser pointer feels a bit like cheating: it’s not the robot doing the hard work, it’s like aiming a diapered, blindfolded child towards grandma’s waiting knees and hoping she makes it there. Hardly autonomous.

A Possible Solution: The Pi Camera

I’ve been planning to install a Raspberry Pi camera on the front of the KR01 robot for awhile, and since I was going to have a camera available I thought: heck, the robot will be at very least facing that first marker, so why not use its camera to observe the direction and let it try to figure its own way there? No hand-holding, no laser pointer aiming, no diapers, no grandma. Autonomous.

The Pink LED

So what would I use for a target? How about an LED? What color is not common in nature? What color LED do I have in stock? Pink (or actually, magenta). So I mounted a pink LED onto a board with a potentiometer to adjust brightness, using a 9 volt battery for the power source. Simple enough.

The Raspberry Pi camera’s resolution was set to 640×480. I wrote a Python script to grab a snapshot from the camera as an x,y array of pixels. I’m actually processing only a subset of the rows nearer the center of the image, since the robot is likely to be looking for the target of the Four Corners Challenge somewhat near the vertical center of the image, not closer to the robot or up in the sky.

The Four Corners Course with target LED at the 1st corner

I found an algorithm online to measure the color distance between two RGB values. The color distance is focused mostly on hue (the angle on the color wheel), so if that particular pink is sufficiently unusual in the camera image, the robot should be able to pick it out, regardless of relative brightness. I took a screenshot of the camera’s output, opened it up in GIMP and captured the RGB color of the pink LED. I avoided the center of the LED, which showed up as either white (R255, G255, B255, hue=nil) or very close to white, and instead chose a pixel that really displayed the pinky hue (R151, G55, B180, hue=286).

For each pixel in the array I calculated the color distance between its color and that of our target pink. To be able to see the results of the processing I then printed out not the pixel array of the original image but an enumerated conversion of the color distance — just ten possible values. So magenta is very close, red less close, yellow even less, et cetera down to black (not close at all). So the image is what we might call a “color distance mapping”. I just printed out each row to the console as it was processed, so what you’re seeing is just a screen capture of the console, not a generated image.
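I won’t reproduce the algorithm I found, but a simplified, hue-only stand-in (using the pink sampled above as the target) would look something like this:

    import colorsys

    TARGET_RGB = (151, 55, 180)    # the pink sampled from the LED in GIMP

    def hue_distance(rgb, target=TARGET_RGB):
        # angular distance between two hues on the colour wheel:
        # 0.0 means the same hue, 1.0 means opposite sides of the wheel
        h1, _, _ = colorsys.rgb_to_hsv(*[c / 255.0 for c in rgb])
        h2, _, _ = colorsys.rgb_to_hsv(*[c / 255.0 for c in target])
        d = abs(h1 - h2)
        return min(d, 1.0 - d) * 2.0   # hue wraps around, so take the shorter way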

What the Raspberry Pi camera sees (click to enlarge)

My first attempts were of just the LED against a dark background, enough to try out the color distance code. Since that seemed to work I tried it against a much more complicated background: the bookcase in my study (see photo). The distance from the camera to the pink LED was about 2 meters. Despite several objects on my bookcase being a fairly close match to the LED’s color, things seemed to still work: the LED showed up pretty clearly as you can see below:

The LED can be seen in the upper left quadrant on the bookcase shelf (click to enlarge)

That object just to the left of the LED with the 16 knobs is a metallic hot pink guitar pedal I built as a kit a few years ago. There’s another guitar pedal that same color on the shelf below. There’s enough difference between that hot pink and the magenta hue of the LED that it stands out alone on the shelf. Not bad.

Outdoors Experiments

So today, on a relatively bright day I tried this out on the front deck. There was a lot more ambient light than in my study and I was able to set the LED a full 3 meters (9 feet 10 inches) from the front of the robot. How would we fare in this very different environment?

The view from the robot

The 3mm LED I’d been using turned out to be too small at that distance, so I replaced it with a larger pink LED and turned up the brightness. Surprisingly, the LED is clearly visible below:

This is a pretty happy result: the robot is able to discern a 5mm pink LED at a distance of 3 meters, using the default Raspberry Pi 640×480 camera. This required nothing but a camera I already had and less than a dollar’s worth of parts.

The Python code for this is called pink_led.py and is available in the scripts project on GitHub.

Next step: figure out how to convert that little cluster of pixels into an X coordinate (between 0 and 640), then use that to set the robot’s trajectory. It could be that converting that trajectory into a compass heading and then following that heading might get the robot reliably to that first course marker.

But I’m still going to install that tactical hunting super mini red dot laser.

Facilius Est Multa Facere Quam Diu

KR01 obstacle avoidance

This is another article in the series about the KR01 robot.

The title translates as “It is easier to do many things than one thing consecutively“, attributed to Quintilian, a Roman educator about two thousand years ago. It sounds curiously like a motto for either multi-tasking or multi-threaded processing. But also for how I’ve approached designing and building the KR01 robot.

One thing I’ve learned about building a robot is that, at least for me, the hardware and software are ever-changing. I guess that’s what makes the journey enjoyable. In my last post I ended up with too much philosophising and not enough about the robot, so this one makes up for that and provides an update of where things are at right now.

But before we get into the hardware and software I thought to mention that I’ve been quite happily welcomed into the weekly videoconferences of the Dallas Personal Robotic Group (DPRG) and about a week ago did a presentation to them about the KR01:

The DPRG is one of the longest-standing and most experienced personal robotics clubs in the US, with a great deal of experience across many aspects of robotics. They’re also some very friendly folks, and I’ve really been enjoying chatting with them. New friends!

Hardware

So… the biggest issue with the KR01 was imbalance. There was simply no room on the chassis for the big, heavy Makita 3.0Ah 18 volt power tool battery so I’d, at least temporarily, hung it off the back on a small perforated aluminum plate.

[You can click on any of the images on this page for a larger view.]

Earlier design: the black platform at the aft end held the Makita battery

The KR01 without a battery weighed 1.9 kilograms (4 lbs 3oz), so at 770 grams (1 lb 11 oz) the Makita battery and its holder comprised about 40% of the robot’s total weight. By comparison, my little KRZ01 robot weighs 160 grams, including the battery. Since that photo was taken the robot has gained a bit of weight, now up to 2.1 kg. But that imbalance remained.

With the battery hanging off the back, when trying to spin in place the KR01 would typically sit on one back wheel and rotate around that wheel, but which wheel was almost arbitrary. I kinda knew something like this might happen but I was willing to keep moving forward on other parts of the robot (“facilius est…“), because fixing that problem meant making some big hardware changes.

Since I’d gotten to the point where I was actually testing the robot’s movement, I finally bit the bullet and bought another piece of 3mm Delrin plastic. This time I positioned the battery as close to the physical center of gravity of the robot as possible. Then I spent a lot of time reorganising where things fit, as well as finally adding all of the sensors I’d been planning. I think I might have gone overboard a bit. The current design is shown below.

The redesigned KR01 with space for the battery closer to the center of gravity

The copper shielding is my attempt at cutting down on the amount of high-pitched ambient noise put out by the speaker (hidden underneath the front Breakout Garden Mini, next to the servo). This didn’t seem to make much difference but it looks kinda cool and a bit NASA-like so I’ll leave it for now. Yes, the KR01 can now beep, bark, and make cricket sounds. It also has a small 240 x 240 pixel display screen (visible at top center) and two wire feelers to theoretically protect the upper part of the robot, basically an emergency stop. I have no idea how well that will work. The inside of my house is pretty hazardous for a small defenseless robot.

Side view of the KR01 robot’s chassis, showing the power/enable switches and a Samsung 250gb SSD drive lodged underneath

Earlier versions of the robot had a 15cm range infrared sensor for the center, which being digital, replied with a yes or no. It worked as advertised, but 15cm wasn’t enough distance to keep the robot from running into things, even at half speed, so I’ve since replaced it with a longer range analog infrared sensor (the long horizontal black thing in the cutout in the plastic bumper, shown below) that I’ve coded to react to two separate ranges, “short” (less than 40cm) and “long” (triggered at about 52 cm). This permits the robot to slow down rather than stop when it gets within the longer range of an object.
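The reaction itself is simple; the thresholds below are just the approximate trigger ranges described above:

    SHORT_RANGE_CM = 40    # roughly where the robot should stop
    LONG_RANGE_CM = 52     # roughly where the robot should slow down

    def centre_ir_speed(distance_cm, cruise=0.5, slow=0.25):
        if distance_cm <= SHORT_RANGE_CM:
            return 0.0     # obstacle close: stop
        if distance_cm <= LONG_RANGE_CM:
            return slow    # obstacle within the longer range: slow down
        return cruise      # clear: carry on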

The robot currently has a 15 cm range infrared sensor on each side but I’m planning to replace them with a pair of Sharp 10-150cm analog distance sensors, which hopefully will permit some kind of wall-following behaviour. I’m eagerly awaiting another package in the post…

Front view of the KR01 robot’s chassis, with polycarbonate bumper-sensor and infrared distance sensors

The KR01 now sports a variety of sensors from Adafruit, Pimoroni’s Breakout Garden, Pololu and others, including:

  • a servo-mounted 4m ultrasonic sensor, or
  • a servo-mounted Time of Flight (ToF) laser rangefinder with a range of 4m and accuracy of 25mm
  • four Sharp digital 15cm range infrared distance sensors
  • a Sharp analog 80cm infrared ranging sensor
  • an infrared PIR motion detector (for detecting humans and cats)
  • an X-Band Bi-Static Doppler microwave motion detector (for detecting humans and cats through walls)
  • a 9 Degrees of Freedom (DoF) sensor package that includes Euler and Quaternion orientation (3 axis compass), 3 axis gyroscope, angular velocity vector, accelerometer, 3 axis magnetometer, gravity vector and ambient temperature
  • a 6 channel spectrometer
  • two 5×5 RGB LED matrix displays
  • one 11×7 white LED matrix display
  • an HDMI jack for an external monitor (part of the Raspberry Pi)
  • WiFi capability (part of the Raspberry Pi)
  • a microphone
  • a speaker with a 1 watt amplifier

This is all powered by a Makita 18V power tool battery. The clear polycarbonate bumper (inspired by David Anderson’s SR04 robot) has six lever switches, with two wire feelers protecting the upper part of the robot.

All that just for the territory of my lounge. Or maybe my front deck.

Software

So while working on hardware I’ve also been working on the software. I’ve been writing a Behaviour-Based System (BBS) based on a Subsumption Architecture as the operating system for the KR01 in Python. The concept of a BBS is hardly new. Rodney Brooks and his team at MIT were pioneering this area of research back in the 1980s; it’s an entire field of research in its own right. Here are a few links as a beginning:

The idea with a BBS is that each sensor triggers either a “servo” or a “ballistic” behaviour. Servo Behaviours (AKA “feedback and control systems”) immediately alter the robot’s movement or make a temporary change in its behaviour, such as speeding up or slowing down, in a relatively simple feedback loop. Ballistic Behaviours (AKA “finite state machines”) are small sub-programs that are (theoretically) meant to run from start to completion without interruption. The below video shows a ballistic behaviour that might occur should the robot find itself facing a wall: it backs up, scans the neighbourhood for a place where there’s no barrier, then attempts to drive in that direction. Yes, that is a sonar “ping” you hear.

This video shows a simple obstacle avoidance behaviour.

My understanding (i.e., the way I’m writing the software) is that for every sensor there is an associated servo or ballistic behaviour, and that each of these behaviours is prioritised so that the messages sent by the sensors contend with each other (in the subsumption architecture), the highest priority message being the one that the robot executes. It does this in a 20ms loop, with ballistic behaviours taking over the robot until they are completed or subsumed by a higher-priority ballistic behaviour. It’s the emergent behaviour as a consequence of these programmed behaviours that gives the robot its personality. When the robot has nothing to do it could begin a “cruising around” behaviour, whistle a tune, or go into standby mode awaiting the presence of a cat.
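Stripped of all the real machinery, the arbitration at the heart of this amounts to something like the sketch below. The names are invented for illustration; this is not the KR01’s actual code.

    import time

    class Behaviour:
        def __init__(self, name, priority):
            self.name = name
            self.priority = priority        # lower number means higher priority
        def wants_control(self):
            return False                    # e.g. has this behaviour's sensor fired?
        def act(self, robot):
            pass                            # one 20ms slice of servo or ballistic action

    def arbitrate(behaviours, robot, loop_secs=0.02):
        while True:
            active = [b for b in behaviours if b.wants_control()]
            if active:
                # the highest-priority active behaviour subsumes everything below it
                min(active, key=lambda b: b.priority).act(robot)
            time.sleep(loop_secs)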

Not that my cat pays much attention to the robot. In the robot-plus-cat experiments that have been performed in our home laboratory he sniffs at the robot a bit and nonchalantly stays out of its way.

He wasn’t fooled for a moment by the barking.

From Failure, Success

KRZ-01 Robot

There’s been another mishap around here. I guess building robots has its ups and downs and last week was no different.

I’m kinda ashamed to say that while I was working on the KR01 robot I’ve now managed to burn out two Thunderborg motor controllers and one Ultraborg servo controller. Well, not quite “burn out”. The motor controller parts of the Thunderborgs still work but the RGB LED used to display the battery level has somehow gotten fried on both units, and the Ultraborg (which is used for sonar and servo control) seems to have died during the first Thunderborg catastrophe (sympathetic death). I have no idea really how this has happened, but of course the only real possibility is that I’ve done something wrong. I mean, I’ve been very careful with checking my wiring before applying the power, but at some point I must have got my wires crossed. The PiBorg folks who make these boards have been quite helpful and I’m sending them back to the UK to see if they can figure it out. But that will take awhile.

King Ghidorah anatomy by Shoji Ohtomo
Not to be confused with Monster Zero 1

This means that for at least a few weeks I would be without a robot (the horror)! I really can’t have this happening just as I’m getting the robot operating system up and running. So last weekend I went ahead and built out one of my design prototypes, which I’ve been calling the KRZ-01 (Kiwi Robot Zero), as it’s based on a Raspberry Pi Zero W. It uses a Picon Zero for a motor controller, a Pimoroni Breakout Garden to mount some of its sensors, and a trio of infrared detectors rather than a front bumper.

Happily, the build posed only a few problems and I had it up and running rather quickly. I rewrote the Python modules that had been used to control the KR01’s motors to instead use the Picon Zero and I had it dancing around on the carpet today for the first time:

KRZ-01 Motor Control Demo

The KRZ01 is meant to be small and relatively cheap, but still have the ability to carry some impressive sensors. It actually isn’t a whole lot less capable than its larger sibling, the KR01. Not including shipping, the parts come out to around NZ$250, so it’s not the cheapest robot you could build but it’s got a lot of functionality2.

Side View of KRZ-01
Side View of KRZ-01

It’s based on a Raspberry Pi Zero W, which has 512MB of memory and supports both WiFi and Bluetooth. The OS is Raspbian Linux. The Picon Zero motor controller and a Breakout Garden Mini are both mounted on a Mini Black HAT Hack3r breakout board. This is an extremely compact setup. You can see this on the side view photo.

The sensors include: three Sharp infrared detectors; a VL53L1X Time of Flight (ToF) distance sensor mounted on a micro servo, which can measure distance up to about 4m with a 25mm accuracy (this is the same sensor I used on my night light); and two 298:1 ratio micro gear motors with encoders so we can measure how far we’ve travelled.

Bottom view of KRZ01
Bottom View Showing the Motors and Motor Encoders

There are still two free I²C Breakout Garden sockets so additional sensors can be swapped in and out without any soldering. I added a couple of 11×7 LED Matrix boards as status displays but they’re hardly necessary. The whole thing runs on a common USB battery. The chassis is made out of 3mm Delrin plastic. For locomotion it uses a pair of Moon Buggy wheels, a lightweight plastic ball caster in the front, and a heavier stainless ball in the back (so its balance is towards the back caster).

Since the robot supports WiFi I connect to it remotely using ssh, which is how I’ve been installing and loading its software, starting and stopping programs, and shutting it down 3. Remarkably, the Raspberry Pi Zero W includes a mini HDMI connector so I could plug it into a monitor, but that hardly seems necessary. This seems like a command line robot.

The chassis is 75mm wide and 120mm long. Without a battery the whole thing weighs 120 grams. For comparison, that’s 17 grams less than my iPhone 5. I have a 5200mAh battery that weighs 136 grams and a 4400mAh battery that only weighs 40 grams, so unless battery life is an issue I’ll probably use the smaller battery. I have a 10000mAh battery (200g) that would last many hours but I can’t imagine leaving the robot alone that long. What kind of trouble could it get into?

For more information about the KRZ01 Robot, visit its NZPRG wiki page.

Note: as of today the NZPRG has its own YouTube Channel.

Edit: after some back and forth in email and finally posting the boards back to PiBorg in the UK, I learned from them that what seems to have happened was that the UltraBorg tested as faulty, and that was apparently what was burning out the LEDs on the ThunderBorgs. They’ve since sent me replacements for both and all is working well now. A well-deserved thank you to PiBorg for their patience and help!