Linux Format

Keep your robot busy

What you do with your robot is limited only by your imagination and attention span. But for now, here are a few suggestions to inspire you…


Having constructed our Diddy, we wanted to know more about what it could do. Not content with making glorious hardware, PiBorg also provides some great code examples to get you started. These will need some tweaking to work on other hardware, but should give you some idea of how to talk to the hardware in Python.

Robot kits will provide scripts to deal with all the low-level communication. In our case this is all done through the ThunderBorg.py file (see www.piborg.org/blog/build/thunderborg-build/thunderborg-examples). This handles all the raw I2C coding, so you don't need to worry about that, and provides much more human-friendly functions such as SetMotor1(), which sets the speed of the left-hand wheels.
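To give you a flavour, here's a minimal sketch (not one of the official PiBorg examples) that drives forwards for a couple of seconds and then stops. It assumes ThunderBorg.py has been copied into the same directory as the script; other kits will have similar, if differently named, functions.

#!/usr/bin/env python
# Drive forwards for two seconds, then stop.
# Assumes ThunderBorg.py (from the PiBorg examples) sits alongside this script.
import time
import ThunderBorg

TB = ThunderBorg.ThunderBorg()
TB.Init()             # open the I2C connection to the board
TB.SetMotor1(0.5)     # left-hand wheels at half power
TB.SetMotor2(0.5)     # right-hand wheels at half power
time.sleep(2)
TB.MotorsOff()        # always stop the motors when you're done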

Web control

Assuming your Pi is connected to a wireless network, one slightly roundabout way to control it is to have it run a small webserver with an HTML form. If your robot has a camera attached too, then you can stream the video to this webpage, for a delightful first-person video driving experience.

Creating a streaming video processor in Python is less complicated than you'd think, but more complicated than we'd like to get into in this summary. So study the DiddyBorg web UI example at www.piborg.org/blog/build/diddyborg-v2-build/diddyborg-v2-examples-web-ui to see how the magic happens. If you're lucky enough to own a DiddyBorg, then copy that script to it.
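If you'd rather roll your own, the sketch below shows the general shape of the webserver approach using nothing but Python's standard library. It is not the PiBorg script: there's no video streaming, and the set_speeds() helper is a stand-in for whatever motor calls your own kit provides (such as the SetMotor1()/SetMotor2() functions mentioned earlier).

#!/usr/bin/env python3
# Tiny control server: serves a form with four buttons and turns
# each submission into a pair of wheel speeds.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

PAGE = b"""<html><body><form method="get" action="/">
<button name="go" value="forward">Forward</button>
<button name="go" value="left">Left</button>
<button name="go" value="right">Right</button>
<button name="go" value="stop">Stop</button>
</form></body></html>"""

MOVES = {"forward": (0.5, 0.5), "left": (-0.5, 0.5),
         "right": (0.5, -0.5), "stop": (0.0, 0.0)}

def set_speeds(left, right):
    # Stand-in: replace with your robot library's motor calls.
    print("left=%.1f right=%.1f" % (left, right))

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        move = parse_qs(urlparse(self.path).query).get("go", ["stop"])[0]
        set_speeds(*MOVES.get(move, (0.0, 0.0)))
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()

Point a browser on the same network at port 8080 of your robot's IP address and the four buttons will send requests that the handler turns into speed pairs.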

Gamepad control

Controlling your robot with a gamepad is a little easier to get along with. However, as we discovered, it might need a little persuasion to work when a GUI isn't running (such as when your Pi isn't connected to a monitor). In theory, you could set this up beforehand, or remove the SD card from the robot and boot it in another Pi – the settings should be remembered. If not, we can set this up by SSHing into our robot.

There are two challenges to overcome: the actual Bluetooth pairing and the subsequent setting up of the device nodes. The latter is handled by the joystick package (or evdev, which it depends on) and the former by the bluetoothctl command (this will be installed as standard). After installing the joystick package, run bluetoothctl . This will start a console where we can scan, pair and connect our controller.

First, put the device in pairing mode and then initiate a scan with scan on . You should see a list of all nearby Bluetooth devices and their MAC addresses. Hopefully your controller is in there, in which case copy the address. Deactivate the scan with scan off . Then pair with pair <MAC address> , connect with connect <MAC address> and take your leave with exit .

Now run evtest , which will greet you with a list of detected input devices. Select your desired controller and mash the buttons. You should see a different cryptic response for each button. The DiddyBorg includes an example script for joypad control, which uses the PyGame libraries to listen for the relevant button events.
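To give you an idea of what that script is doing, here's a bare-bones sketch of reading gamepad events with PyGame. It isn't the DiddyBorg example itself; the dummy SDL video driver line is the trick that lets PyGame run happily without a desktop session.

#!/usr/bin/env python
# Print gamepad button presses and stick movements using PyGame.
import os
import pygame

os.environ["SDL_VIDEODRIVER"] = "dummy"   # run without a display attached
pygame.init()
pygame.joystick.init()

pad = pygame.joystick.Joystick(0)         # first detected controller
pad.init()

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN:
            print("button %d pressed" % event.button)
        elif event.type == pygame.JOYAXISMOTION:
            # Sticks report values from -1.0 to 1.0; map these to motor speeds.
            print("axis %d at %.2f" % (event.axis, event.value))
    pygame.time.wait(50)                  # don't hog the CPU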

Image recognition

Our second feature this month is all about machine learning, and if you've read it you'll see we mention running Tensorflow on the Pi. This is all thanks to the work of Sam Abrahams, who's provided precompiled wheel files for Python 2.7 and 3.4. This is good news if you're running the second-to-last (Jessie) version of Raspbian, since that includes Python 3.4. If you're running the latest version (Stretch, which uses Python 3.5), however, then you'll need to use Python 2.7.

Having two different major versions of Python like this is fine (Raspbian ships them both as standard), but you can't have 3.4 and 3.5 installed concurrently, and the 3.4 wheel won't work with Python 3.5. Before we begin, be advised that the Tensorflow models repository is large, and more than once we ran out of space using an 8GB SD card. This can be worked around by removing larger packages such as LibreOffice and Wolfram, but using a 16GB card is recommended. The following commands will set up everything you need:
$ wget https://github.com/samjabrahams/tensorflow-on-raspberry-pi/releases/download/v1.1.0/tensorflow-1.1.0-cp27-none-linux_armv7l.whl
$ sudo apt install python-pip python-dev python-pil python-matplotlib python-lxml
$ sudo pip install tensorflow-1.1.0-cp27-none-linux_armv7l.whl
$ git clone https://github.com/tensorflow/models.git
This last stage will start a roughly 1GB download, so beware. If you run out of space the process can be resumed, once you've cleared some clutter, by running git checkout -f HEAD from the models/ directory. Once it completes successfully, test it with:
$ cd models/tutorials/image/imagenet
$ python2 classify_image.py

This should identify the bundled panda image (which was extracted to /tmp/imagenet/cropped_panda.jpg). The script can also take an --image-file parameter to identify user-supplied images. So with a bit of jiggery-pokery we could adapt things to take a photo and then attempt to identify it. Since the whole process takes about 10 seconds on a Pi 3 (although this could be sped up by keeping the program running in a loop, using C++ or with a more slimline Tensorflow model), we don't really have any hope of classifying things in real time.

Furthermore, it's likely that a photo taken at ground level with the Pi Cam will be tricky for the script to identify. But that's okay, it just adds to the fun. All we need to do is tweak classify_image.py a little. Copy this file to classify_photo.py, then edit the new file. You'll need to import the picamera module early on, then in the main(_) function replace the line that begins with image = with something like:
cam = picamera.PiCamera()
cam.capture('/tmp/picam.jpg')

And finally, change the run_inference_on_image() call to run on our freshly captured picam.jpg. If you're feeling adventurous then why not bind the new script to some button press event? The controller script for the DiddyBorg, which we discussed earlier, could easily be adapted to do this.
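Putting those edits together, the relevant part of our classify_photo.py ends up looking something like the following. The function names come from the TensorFlow tutorial script, but treat this as a guide rather than a drop-in patch.

# Sketch of the edited main() in classify_photo.py
import picamera          # add this near the top of the file

def main(_):
    maybe_download_and_extract()
    # Grab a fresh photo instead of reading FLAGS.image_file
    cam = picamera.PiCamera()
    cam.capture('/tmp/picam.jpg')
    cam.close()
    run_inference_on_image('/tmp/picam.jpg')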

Ball following

OpenCV is a powerful computer vision framework that includes Python bindings. It's actually used to draw the image in the earlier Web UI example, but we can use it for more advanced purposes. It's capable of detecting objects within an image, which means we can make our robot drive towards them. If those objects move, then it will follow them. For this to work, said objects need to be fairly distinct, such as a brightly coloured ball. You'll find just such an example at www.piborg.org/blog/diddyborg-v2-examples-ball-following.
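For a taste of how that works, here's a rough sketch (much simpler than the PiBorg example) that grabs a single frame from the Pi camera, converts it to HSV and looks for a red blob. The threshold values are guesses you'd need to tune for your own ball and lighting.

#!/usr/bin/env python
import cv2
import numpy as np
import picamera
import picamera.array

# Capture a single 320x240 frame from the Pi camera as a BGR array
with picamera.PiCamera() as cam:
    cam.resolution = (320, 240)
    with picamera.array.PiRGBArray(cam) as raw:
        cam.capture(raw, format='bgr')
        frame = raw.array

# Convert to HSV and threshold for red (lower hue band only)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))

# Use image moments to find the centre of the red region
moments = cv2.moments(mask)
if moments['m00'] > 0:
    cx = int(moments['m10'] / moments['m00'])
    print('ball centre at x=%d of %d' % (cx, frame.shape[1]))
else:
    print('no ball found')

From the horizontal position cx you can then decide whether to steer left, steer right or drive straight at the ball.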


Bluetooth gamepads are easy to set up from the Pixel desktop. If only things were so simple from the command line…

Tensorflow figured that this was a rabbit, but then thought Jonni was a sweet potato, Neil a bow tie and Effy a punch bag.

We need to convert the image to HSV so that OpenCV can better detect any red ball-like objects to follow.
