ROBOTS
Robot object detection, side motion and video
To put the electronic cherry on Les Pounder’s finished robot, we add sonar sensing, streaming video and sideways motion.
OUR EXPERT
Les Pounder is associate editor at Tom’s Hardware and a freelance maker. He blogs about hacks and makes at http://bigl.es.
YOU NEED
Robot build from part five
Pi camera
Code and more: https://github.com/lesp/Linux-Format-Robot/archive/refs/heads/main.zip
At the end of part five, our robot had been connected to the Anvil web service and was successfully controlled via a web interface. In this final part, we will unleash the robot in autonomous mode, where it can use its sensor to navigate the world around it. We’ll refine our motor.py module to add further debug features, before finally adding a special sliding mode for our mecanum wheels. We’ll also take a quick look at a video-streaming Python script from Raspberry Pi which will let us view what our robot can see as it traverses the kitchen floor.
Using the sensors
The primary sensor for this build is an HC-SR04P ultrasonic sensor. These are the same kind of sensor that drives a car’s parking sensors. You’re reversing the car and you hear a “beep, beep, beep”, and then as you get nearer to the garage door you hear “BEEP!” That is an ultrasonic distance sensor. It sends a ping of ultrasound at 34,300cm per second (the speed of sound), then waits for the returning echo. The time between sending the ping and receiving its echo is halved, then multiplied by the speed of sound to give a distance in centimetres.
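If you want to sanity-check the sensor on its own, a minimal sketch along these lines will do. It uses gpiozero’s DistanceSensor class; the trigger and echo pin numbers are only placeholders for however your robot’s sensor is actually wired.

# A quick test read of the HC-SR04P using gpiozero's DistanceSensor.
# The pin numbers below are placeholders - use the GPIO pins your
# sensor is really connected to.
from gpiozero import DistanceSensor
from time import sleep

sensor = DistanceSensor(echo=17, trigger=4, max_distance=2)

while True:
    # .distance is reported in metres; multiply by 100 for centimetres.
    # Under the hood this is the same maths as described above: half the
    # echo's round-trip time, multiplied by the speed of sound.
    print(f"Distance: {sensor.distance * 100:.1f}cm")
    sleep(0.5)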
The common distance sensor for Arduinos and the Raspberry Pi is the HC-SR04. But way back in part three, we chose the HC-SR04P (or you can use the + model) because it is compatible with the 3.3V logic of the Raspberry Pi’s GPIO. The older HC-SR04 uses 5V logic and would require a potential divider to drop the Echo pin’s output from 5V to around 3.3V.
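If you do end up with the 5V variant, the divider sums are quick to check. The values below are only an illustration of a common resistor pairing, not something taken from this build.

# Sanity check of a potential divider for a 5V HC-SR04 Echo pin.
# R1 sits between Echo and the GPIO pin, R2 between the GPIO pin and GND.
# The 1k/2k values are just a typical example pairing.
v_in = 5.0          # Echo output of the older HC-SR04
r1 = 1_000          # ohms, Echo -> GPIO
r2 = 2_000          # ohms, GPIO -> GND
v_out = v_in * r2 / (r1 + r2)
print(f"GPIO sees {v_out:.2f}V")   # roughly 3.33V, safe for the Pi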
The job of this sensor is to give the robot ‘sight’ so that it can navigate the world using one basic rule: don’t get too close to an object. In part three we wrote a simple test for this sensor that checked if the distance was less than 50cm from an object. If that was the case, the robot would follow the make_space instructions, which would stop the robot, reverse and then spin left. If the obstacle was still in the way, the robot would repeat make_space until there was nothing in its path. For this final part, we will refine the process, giving the robot a shorter distance as a trigger and a routine to follow for both normal navigation and obstacle avoidance.
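As a rough illustration of that loop, the sketch below reads the sensor and calls a make_space() routine whenever something is closer than a trigger distance. The motor helpers and the 20cm threshold are stand-ins, not the actual names or values from the article’s motor.py module.

# A sketch of the avoidance loop described above. Swap the placeholder
# motor helpers for the real functions your motor.py provides, and tune
# the trigger distance to taste.
from gpiozero import DistanceSensor
from time import sleep

# Placeholder motor helpers - replace with the robot's own motor functions.
def forward():   print("forward")
def backward():  print("backward")
def spin_left(): print("spin left")
def stop():      print("stop")

sensor = DistanceSensor(echo=17, trigger=4, max_distance=2)
TRIGGER_CM = 20  # an illustrative "shorter distance" trigger

def make_space():
    """Stop, back away, then spin left to look for a clear path."""
    stop()
    backward()
    sleep(0.5)
    spin_left()
    sleep(0.5)
    stop()

while True:
    distance_cm = sensor.distance * 100
    if distance_cm < TRIGGER_CM:
        # Keep making space until nothing is inside the trigger distance.
        make_space()
    else:
        forward()
    sleep(0.1)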