While I’ve lately been mostly focused on my KR01 robot, I’ve also been planning a Mecanum-wheeled robot to be called the KRZ02. I long ago decided my robots would incorporate odometry, counting the rotations of each motor in order to track the robot’s location. This is accomplished by either an optical or Hall-effect encoder connected to the left and right motor shafts. On the KR01, since I know the gear ratio of the motor, the size of the wheel (and that it travels 218mm per rotation), and that the encoder sends 494 steps per rotation of the wheel, I can calculate there’ll be 2262 steps per meter.
To be honest, I didn’t actually come up with such a formula but simply measured this by repeatedly having the robot move exactly one meter forward. In science this is called an observation. Observing is generally easier than calculating, but is still a valid means of finding things out. You can make calculations without having a robot, but if your calculation (the model) is flawed or incomplete it won’t be accurate, whereas careful observation of the robot in its actual working environment can be quite accurate. My guess is that if I had gone to the trouble to develop an odometric formula I wouldn’t have ended up with a value of 2262 steps per meter, since a concrete, physical system almost always introduces variables that the model hasn’t accounted for.
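Not that the calculation itself is difficult; using the figures above it’s a one-liner, and it rather neatly demonstrates the point, coming out a few steps away from my measured value:

```python
STEPS_PER_ROTATION = 494    # encoder steps per wheel rotation
WHEEL_TRAVEL_M     = 0.218  # metres the wheel travels per rotation

steps_per_meter = STEPS_PER_ROTATION / WHEEL_TRAVEL_M
print(round(steps_per_meter))  # 2266, not the 2262 observed on the actual robot
```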
But back to the point. Knowing how many steps the motor encoders return, I can calculate both the velocity and distance traveled by each motor, and therefore, with a reasonable degree of accuracy, where the robot is relative to its starting location. David Anderson of the DPRG has made odometry into a fine art: his robots can travel great distances through complex indoor environments and even across difficult mountain terrain, returning to within inches of their starting locations. I find David’s work very inspiring.
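For the curious, the basic dead-reckoning update behind this is simple enough to sketch. This assumes a differential-drive robot like the KR01, my measured steps-per-meter value, and a wheel track figure that is purely a placeholder:

```python
import math

STEPS_PER_METER = 2262.0  # measured value from above
WHEEL_TRACK_M   = 0.26    # distance between wheel centres (placeholder value)

def update_pose(x, y, theta, left_steps, right_steps):
    """Advance an (x, y, theta) pose estimate by one tick of encoder counts."""
    d_left   = left_steps / STEPS_PER_METER    # metres travelled by left wheel
    d_right  = right_steps / STEPS_PER_METER   # metres travelled by right wheel
    d_centre = (d_left + d_right) / 2.0        # distance travelled by robot centre
    d_theta  = (d_right - d_left) / WHEEL_TRACK_M  # heading change in radians
    # integrate along the average heading over the tick
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```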
But all this clever odometry falls apart when using Mecanum wheels, which have a series of rubber rollers that each have an axis of rotation at 45° to the wheel plane and at 45° to the axle line.

Whereas a traditional wheel transmits its force to the ground in a rather predictable way, the rotation of a Mecanum wheel interacts in complex ways with that of the other Mecanum wheels. I’m sure there’s a mathematical formula for this, one that would involve the direction and rotational velocity of each wheel, the total weight and weight distribution of the robot, the hardness of each roller (and therefore how much of it is in contact with the ground), the traction and rolling friction of the wheel rollers, the friction due to contact with the ground, roller slippage, and probably another half dozen unknowns. Call me lazy, but life is too short to even consider trying to create that formula and develop sensors for its various parameters. And we’re back to that abstract-model-versus-concrete-observation issue. If the goal is accurately tracking the robot’s movement over the ground (odometry), we need to come up with another method.
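To be fair, the idealized no-slip version of that formula is well documented; the trouble is that everything on my list above is absent from it. A sketch for a typical X-configuration base, where the signs depend on roller orientation and wheel numbering, and the geometry values are placeholders:

```python
def mecanum_body_velocity(w_fl, w_fr, w_rl, w_rr,
                          wheel_radius=0.024, half_track=0.08, half_base=0.10):
    """Idealised (no-slip) forward kinematics for an X-configuration Mecanum
    base. Wheel speeds are in rad/s; half_track and half_base are half the
    track width and wheelbase (placeholder values). Ignores slip entirely."""
    r = wheel_radius
    vx    = r * ( w_fl + w_fr + w_rl + w_rr) / 4.0   # forward velocity
    vy    = r * (-w_fl + w_fr + w_rl - w_rr) / 4.0   # strafe velocity
    omega = r * (-w_fl + w_fr - w_rl + w_rr) / (4.0 * (half_track + half_base))  # rotation
    return vx, vy, omega
```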
Flying drones can’t do motor-based odometry, but instead use a specialised camera called an optical flow sensor, something we might call a camera-as-sensor. The sensor’s camera looks down at the ground, generating a series of image frames and calculating how far the drone has moved across the landscape by tracking the differences between successive frames. This is returned as an x,y value at the frame rate of the camera. Paired with a LiDAR providing the distance-to-ground measurement needed to scale those values, and a GPS unit for an absolute fix, it’s possible to perform odometry accurately in mid-air.
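That distance-to-ground measurement matters because the flow values are just pixel counts: the same physical movement produces fewer counts the further the camera is from the surface, so the counts only become real distances once scaled by height. Something like this rough sketch, where the calibration constant is a made-up placeholder:

```python
def flow_to_metres(dx_counts, dy_counts, height_m, k=0.0025):
    """Convert raw optical-flow counts into ground displacement in metres.
    Counts scale roughly linearly with height above the surface; k is a
    per-sensor calibration constant (the value here is a placeholder)."""
    return dx_counts * height_m * k, dy_counts * height_m * k
```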

(Photo: 2× actual size)
Now, my robot lives on the ground, but there’s no reason we can’t try a similar trick. The actual sensor I’m using is from PixArt Imaging, designated the PMW3901MB-TXQT. It measures all of 6 x 6 x 3 mm and draws 6 milliamps of current, and that includes the camera and all of the electronics. This reminds me of the sensor used in the VL53L1X Time of Flight sensor, a LiDAR the size of a grain of rice.
These sensors are generally sold on a carrier board so that they can be integrated into a commercial or hobbyist application. The carrier board I’m using is from Pimoroni, called the PMW3901 Optical Flow Sensor Breakout, part of their Breakout Garden series of sensors. The cost is about what we pay for two meals at a local Indian restaurant. You can solder a 7-pin header to the carrier board or simply plug it into an SPI socket on a Breakout Garden board.

The PMW3901 has a frame rate of 121 frames per second and a minimum range of about 80mm. I’m hoping to mount it looking down from the underside of the robot’s upper board, which will be just above that 80mm minimum. The problem is twofold: how to provide the sensor’s camera with a clear view of the ground, and how to mount it at the center of the robot. The latter is necessary so that when the robot rotates around its center the sensor won’t register any translational movement, just a rotational theta.
On the KRZ02 I’m using 48mm Mecanum wheels made by Nexus Robot. They’re high quality steel-framed wheels with rather solid brass hubs and a load capacity of 3kg. When mounted on Pololu Micro Metal Gearmotors, the bottom of the robot’s 3mm thick Delrin plastic lower board sits 30mm from the ground, putting the top of that board 33mm up. Since the sensor needs at least 80mm of clearance from the ground, the PMW3901 needs to be at least 80 − 33 = 47mm above that board.
I built a test rig frame out of a cheap nylon chopping board; a photo of the rig can be found at the top of this post. It holds the PMW3901 at a distance of 90mm from the ground, looking through a 50mm hole cut into a lower board (50mm being the biggest hole cutter I have).

I used a Raspberry Pi Zero W and wrote a Python library and test file. In the test, the PMW3901 sensor returns positive and negative x and y values as it senses movement. Based on those values I’m setting the RGB LEDs to red for a movement to port, green for starboard, cyan for forward and yellow for aft/reverse.
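The heart of such a test isn’t much more than the following sketch. It uses Pimoroni’s pmw3901 Python library rather than my own, and the set_rgb() helper is just a stand-in for whatever drives your LEDs; which axis maps to port or forward depends on how the sensor is mounted:

```python
from pmw3901 import PMW3901, BG_CS_FRONT_BCM

# PMW3901 breakout plugged into the front SPI slot of a Breakout Garden board
flo = PMW3901(spi_port=0, spi_cs=1, spi_cs_gpio=BG_CS_FRONT_BCM)

RED, GREEN, CYAN, YELLOW, OFF = \
    (255, 0, 0), (0, 255, 0), (0, 255, 255), (255, 255, 0), (0, 0, 0)

def set_rgb(r, g, b):
    """Stand-in: drive the RGB LEDs however your hardware requires."""
    pass

while True:
    try:
        x, y = flo.get_motion()   # counts moved since the last reading
    except RuntimeError:
        continue                  # no motion data ready yet
    if x < 0:
        set_rgb(*RED)             # movement to port
    elif x > 0:
        set_rgb(*GREEN)           # movement to starboard
    elif y > 0:
        set_rgb(*CYAN)            # movement forward
    elif y < 0:
        set_rgb(*YELLOW)          # movement aft
    else:
        set_rgb(*OFF)             # at rest
```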

My initial tests quickly ran into trouble. The white plastic of the cutting board produced all sorts of reflections, both of ambient light and of the PMW3901’s illumination LEDs, and this confused the sensor to no end. Even at rest it would indicate what seemed to be random motion. I taped some black matte paper to the board and this almost entirely eliminated the problem. I won’t be using white nylon on the robot but rather black Delrin plastic, which I can sand to a dull matte finish, so hopefully this will be sufficient. If not, there’s a rubberised black fabric used in photography that reflects almost no light.
The following video shows the test rig in operation.
As I noted in the video, the results indicate the test is a success, i.e., mounting a PMW3901 Optical Flow Sensor at about 90mm from the ground can provide odometry information for a ground-based robot.
With this important test out of the way I can now finish the plans and begin building the KRZ02 robot.
Credit where credit’s due: the beautiful fabric you can see in the photos and video is actually a reusable grocery bag designed by one of my favourite New Zealand artists, Michel Tuffery, as part of the Paper Rain Project.