So, my daughter got a toy robot as a Christmas gift, but it turned out to be more of a gift to myself. Ever heard of that happening?
This is the product that I got her: https://amzn.to/3axmTUn
It's a $70 basic robotic kit that gets its controls from a Raspberry Pi and is powered by a couple of 18650 Li-ion batteries. Neither of those is included at the $70 price point, so once you add them in, it's more like $120. But for $70 you get 4 speed-reduction motors, two servos, a baseboard with 8 multicolor LEDs, an 8-channel servo controller, two light sensors, a pair of ultrasonic transceivers, and three IR transceivers. In addition, the package comes with a Raspberry Pi camera and cable. The two additional things you will need to get this working are:
- A Raspberry Pi - you only need the board - no case, no power supply, no cables (they support most versions - I am using a Pi 3B): https://amzn.to/2IGbx4u
- Two 18650 batteries and a charger - I have 3200mAh batteries that can spin the wheels for over an hour and keep the Pi alive for at least 4 hours. You can get higher-rated batteries as well: https://amzn.to/2W5WWr0
- Here is a cheaper option that comes with batteries: https://amzn.to/2vOqD5a - you can get them even cheaper on eBay but those change frequently
The programming is quirky at best: the robot is controlled by a set of Python 2.7 socket applications that drive the I/O over the Raspberry Pi's I2C bus. The nice thing is that it does not use all the GPIO pins of the Pi, so there are plenty of free pins to drive other hardware if you need.
I will post additional information as I work on this project, including adding a robot claw, headlights and taillights with blinkers, etc.
The best part about this product is that it is extremely expandable - it does not use any proprietary parts. The camera is the standard Raspberry Pi camera, and the kit comes with two commonly found servos and 4 "speed-reduction" motors - I am still trying to find part numbers for the motors, but I am hoping they are common off-the-shelf parts as well. The board is well put together and does not become a mess of wires once assembled; the kit even comes with a cable organizer wrap, which is nice. The kit includes a Pi hat that does take up 16 GPIO pins, including the I2C pins of the Raspberry Pi (it uses I2C extensively to control the hardware, so that is out of necessity). Interestingly, though, the Raspberry Pi sits on top of their board and the rest of the GPIO pins are left uncovered, so you can easily expand your project further with the remaining pins. The board also has an 8-port servo controller, of which only 2 ports are used for the camera pan-tilt, so the rest are available for expansion.
Here are the sensors that come packaged (either soldered on the board or attached separately):
- 3 IR pseudo-reflective transceivers soldered onto an attached daughterboard, pointing downward at the front - any light-colored surface close to the IR pins will be detected. This is a nice way to detect a white lane marker on a blacktop road, or a black lane marker on a regular floor. They are at the center front, so you can't really use them to stay within a lane on a road, but they can be used to follow a center line. They provide three channels of true/false inputs (true if light, false if dark).
- 2 photoresistors soldered to the board right at the front (one left, one right) - they detect bright light shining on them. Each goes through an ADC (Analog-to-Digital Converter) and returns a voltage value between 0 and 3.3: 0 for no light, 3.3 for extremely bright light (unlikely - I have gotten 3.1 by shining a flashlight directly on top of a photoresistor from a couple of inches away).
- 2 ultrasonic transceivers that send a sound pulse and listen for the echo to determine the approximate distance to a barrier. The API returns a value in centimeters - not super accurate, but pretty decent. I have seen as much as a 25% error at up to about 60cm (2ft) away. I have not tested beyond about 4ft (120cm), but I suspect the values are not very accurate past that. These sensors, along with the camera, are mounted on a pan-tilt mechanism driven by two servos, so they can be turned to look left/right and up/down.
- The camera is mounted along with the ultrasonic transceivers and is a standard Raspberry Pi camera connecting to the Pi's camera port using a long ribbon cable.
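If you are curious about the math behind the ultrasonic readings, it is just the round-trip time of a sound pulse. Here is a minimal sketch of the conversion - the trigger/echo pin handling is hardware-specific and omitted, and the function name is mine, not the kit's:

```python
# Distance math behind an HC-SR04-style ultrasonic sensor. The sensor
# raises its echo pin for as long as the sound takes to travel to the
# obstacle and back; converting that pulse width to distance is:

SPEED_OF_SOUND_CM_S = 34300  # approximate speed of sound in air

def echo_to_cm(echo_seconds):
    """Convert an echo pulse width (seconds) to a one-way distance in cm."""
    # The pulse covers the round trip, so halve the distance traveled.
    return echo_seconds * SPEED_OF_SOUND_CM_S / 2.0
```

A 3.5ms echo works out to roughly 60cm - right around the range where I start seeing the errors mentioned above.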
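The photoresistor voltages come from scaling the raw ADC counts. A sketch of that scaling, assuming an 8-bit ADC (the resolution is my assumption - check your kit's documentation):

```python
# Converting a raw ADC count into the 0-3.3 V value the API reports.
# Assumption: the onboard ADC is 8-bit (counts 0-255); if yours has a
# different resolution, change the `bits` argument accordingly.

def adc_to_volts(raw, vref=3.3, bits=8):
    """Scale a raw ADC count to a voltage between 0 and vref."""
    return raw * vref / (2 ** bits - 1)
```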
The output and control devices are as follows:
- The robot's transmission is achieved by 4 independently driven motors. This is nice because it gives you pretty good torque at higher speeds - sometimes almost too much torque, causing the wheels to spin out. The other drawback is that there is no simple way to turn: you have to spin the two wheels on one side forward and the two wheels on the other side backward. The API gives full control over all 4 motors and lets you set a speed value from 0 (stopped) to 4000 (full speed), but at about 2000 you get a good combination of speed and torque - a decent pace without over-torquing (if there is such a word). It is simple to extend the API with methods like goForward, goBackward, turnLeft, turnRight, etc.
- The robot has 8 independently controlled RGB LEDs (4 on each side in a strip) - more of a gimmick than actually useful, but I suppose it gives you a pretty cool effect.
- The robot has a single-frequency piezoelectric buzzer in the front. It has just two modes - on or off. The sound is loud but not annoying unless you keep it on. It is nice for scaring the cat, although she is more scared by the sound of the motors.
- Finally, there is a servo controller with 8 servo outputs - two are used by the pan-tilt mechanism (channels '0' and '1': '0' turns the camera left-right and '1' up-down). The angle depends on the mounting accuracy - in my setup I ended up with about 70 degrees for pointing straight, 20 to look about 45 degrees right, and 120 to look 45 degrees left. I don't recommend over-turning, because you may damage the cheap servos or break the flimsy plastic horns. It is simple to create methods like lookForward, lookLeft, lookRight.
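The drive helpers mentioned above can be sketched like this. Note that `set_motors` here is a stand-in for the kit's actual low-level motor call - the real function name and argument order may differ, so treat this as a shape, not a drop-in:

```python
# Sketch of goForward/goBackward/turnLeft/turnRight wrappers.
# `set_motors(fl, rl, fr, rr)` is a hypothetical callable standing in
# for the kit's four-wheel speed call; speeds run from -4000 to 4000.

CRUISE = 2000  # the sweet spot between speed and torque

def _clamp(speed):
    return max(-4000, min(4000, speed))

def make_drive_helpers(set_motors):
    """Build simple drive methods around a low-level set_motors call."""
    def go_forward(speed=CRUISE):
        s = _clamp(speed)
        set_motors(s, s, s, s)

    def go_backward(speed=CRUISE):
        s = _clamp(speed)
        set_motors(-s, -s, -s, -s)

    def turn_left(speed=CRUISE):
        # to turn, spin the left wheels backward and the right forward
        s = _clamp(speed)
        set_motors(-s, -s, s, s)

    def turn_right(speed=CRUISE):
        s = _clamp(speed)
        set_motors(s, s, -s, -s)

    def stop():
        set_motors(0, 0, 0, 0)

    return go_forward, go_backward, turn_left, turn_right, stop
```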
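And the look helpers, built the same way around a hypothetical `set_servo(channel, angle)` call (substitute your kit's actual servo API). The angles are the ones I measured on my build; yours will differ depending on how the horns were mounted:

```python
# lookForward/lookLeft/lookRight wrappers around a stand-in servo call.
# Channel 0 is the left-right pan servo; the angle constants come from
# my own calibration and will vary from build to build.

PAN_CHANNEL = 0
PAN_CENTER = 70   # straight ahead on my robot
PAN_RIGHT = 20    # about 45 degrees right
PAN_LEFT = 120    # about 45 degrees left

def make_look_helpers(set_servo):
    def look_forward():
        set_servo(PAN_CHANNEL, PAN_CENTER)

    def look_left():
        set_servo(PAN_CHANNEL, PAN_LEFT)

    def look_right():
        set_servo(PAN_CHANNEL, PAN_RIGHT)

    return look_forward, look_left, look_right
```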
In our build, we obviously were not satisfied with the outputs, so we added two more servos to control a very simple arm mechanism (unfortunately it had to be mounted in the back, because the camera and ultrasonic sensors with their pan-tilt make it extremely difficult to add anything big in the front). In addition, we added a couple of headlights using two small superbright white LEDs, and two taillights using two bi-color LEDs - I wired the red LEDs independently so the red can be used to create a turn-signal effect.
Most of the software is open-source and available via a GitHub repo. The documentation is also there, as a fairly well-detailed 120-page guide to installing both the hardware and software. Assembling the pan-tilt mechanism was probably the hardest part of the build, but the rest was fairly simple. There is also an Android (yes, Android-only) app called Freenove that you can use to remote-control the robot from an Android device (assuming your Android device and the Pi are on the same network). The Android app, unfortunately, is not in the GitHub repository; the only software there is the Python code.
The Python code is decent - not the greatest, and not very well documented. Moreover, it is written for the (now deprecated) Python 2.7, so it will not run under Python 3; you will need a Python 2.7 interpreter to run it. Thankfully, the Raspberry Pi comes standard with Python 2.7 as the default (and if not, it can easily be switched to make 2.7 the default), so the server-side code is not hard to set up and run. However, depending on what client you choose, your mileage may vary. On Windows, it is not difficult to install Python 2.7 and, following the directions in the guide, not too difficult to set this up. On my Mac, I was able to get Python 2.7 to run but could not get libqt installed correctly no matter what I tried, so I gave up - I did not want to mess up my working Python 3 applications. I guess it may be possible. The Python client app is just a big screen with a bunch of on-screen controls and a window showing the camera output. The Android app is better, with on-screen joystick-like controls. Still, both of them require you to log in to your Pi and start up the server application (which, for some weird reason, also uses Qt and needs a working X Windows graphical environment to run). So you can't just ssh in and start the server, or start it from a cron or init script. But once the server is up and the Pi has a decent, stable wifi connection, both client apps work well and do what they are supposed to do. The Python client has several modes:
- Standard mode - lets you control the robot and the pan-tilt using on-screen buttons.
- Light-following mode - the robot runs when you shine a flashlight at its front - you can make it turn by moving the flashlight beam onto one of the front light sensors.
- Line-tracking mode - the robot runs along a black tape line (the kit actually comes with a coil of electrical tape).
- Ultrasonic mode - the robot continuously sweeps the pan-tilt left, center, right and tries to navigate around barriers, but fails miserably.
My Hardware tweaks
As mentioned above, I have added several things to the robot and they work well.
- Two cheap superbright white LEDs as headlights, and two bi-color (red-green) LEDs in the back as taillights. These are connected directly to standard Pi GPIO ports via a small PCB, with 100-ohm current-limiting resistors.
- Two additional servos, driven by the included 8-port servo controller, to control a very basic up-down, open-close robot arm put together using tongue-depressor sticks and hot glue :-)
- A USB-connected accelerometer - I am using the Phidgetspatial 3/3/3 basic - https://amzn.to/2Q7fsLF
- A very small wireless keyboard that connects directly to one of the USB ports of the Raspberry Pi - there are many options available - this is a very similar one from Amazon: https://amzn.to/2Qmds2r
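Since the red taillight LEDs are wired independently, the turn-signal effect is just a periodic toggle in software. Here is a minimal sketch - `set_led(side, on)` is a stand-in for however you drive your GPIO pins (e.g. RPi.GPIO output calls), and you would call `tick()` from a timer every half second or so:

```python
# A minimal turn-signal blinker for independently wired red taillight
# LEDs. `set_led(side, on)` is a hypothetical callable wrapping the
# actual GPIO writes; call tick() periodically while a signal is active.

class TurnSignal(object):
    def __init__(self, set_led):
        self._set_led = set_led
        self._side = None
        self._on = False

    def start(self, side):
        """Begin blinking 'left' or 'right'."""
        self._side = side
        self._on = False

    def tick(self):
        """Toggle the active LED; call this on a timer."""
        if self._side is None:
            return
        self._on = not self._on
        self._set_led(self._side, self._on)

    def stop(self):
        """Cancel the signal, leaving the LED off."""
        if self._side is not None:
            self._set_led(self._side, False)
        self._side = None
        self._on = False
```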
The best thing about this kit is that the server software is completely open-source. Although it is written in the older Python 2.7 (and I don't have time right now to convert it fully to Python 3), the Raspberry Pi is fully compatible with Python 2.7, which is still completely available, including all the external libraries the software needs (for now). So it is pretty easy to extend the software to support additional functionality or hardware. Here are some of the changes I have made to the software to date:
- The first change I made was to get rid of the UI functionality of the server. There is no reason to have to remotely log in to the Pi just to start up the server. I removed the Qt interface and the Qt import and made the server completely command-line, which means you can start it via cron, init, or any other way you like.
- Used a keyboard library (evdev) to detect keyboard input and control the basic transmission functions - forward, backward, left, right - almost like a joystick. It is not quite a joystick, since you cannot control the speed, but it is a lot better than having to connect to the server over wifi.
- Added keyboard shortcuts to control other hardware as well - headlight, buzzer, robot arm, taillight with turn signal, flasher, etc.
- Added keyboard shortcuts to start the various modes (line following, light following, ultrasonic)
- Rewrote the ultrasonic control to use a simple state-machine-based model, which IMO works a lot better than the stock version, although it still needs a lot of improvement.
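The keyboard-control change boils down to a dispatch table from key names to actions. The real code reads key events with python-evdev on the Pi; the sketch below shows only the key-to-action mapping, with the handlers as hypothetical callables:

```python
# The dispatch idea behind the keyboard-driving change. On the Pi, key
# names like 'KEY_UP' would come from python-evdev events; here the
# mapping logic is isolated so the handlers are plain callables.

def make_key_dispatcher(actions):
    """actions maps key names (e.g. 'KEY_UP') to zero-arg callables."""
    def dispatch(key_name):
        handler = actions.get(key_name)
        if handler is None:
            return False  # unmapped key - ignore it
        handler()
        return True
    return dispatch
```

Adding a new shortcut (headlight, buzzer, arm) is then just another entry in the `actions` dictionary.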
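To give a flavor of the state-machine approach, here is a toy version of the idea: drive forward until a reading is too close, then keep turning until the path clears. The thresholds and state names are illustrative, not the actual values from my code:

```python
# A toy two-state obstacle-avoidance model. Using separate "too close"
# and "clear" thresholds adds hysteresis, so the robot does not flip
# between states on a single noisy ultrasonic reading.

FORWARD, TURNING = "forward", "turning"
TOO_CLOSE_CM = 30  # start turning below this distance
CLEAR_CM = 50      # resume forward motion above this distance

def next_state(state, distance_cm):
    """Return the next state given the current state and a reading."""
    if state == FORWARD and distance_cm < TOO_CLOSE_CM:
        return TURNING
    if state == TURNING and distance_cm >= CLEAR_CM:
        return FORWARD
    return state
```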