Owi robot arm with RPi
How it started
Recently I got a Raspberry Pi (Model B+) as a present for my 27th birthday. The first thing I did was play with the programmable GPIO (general purpose input/output) pins. The first thing I found that supports GPIO was the Python library RPi.GPIO, so I tried a few basic circuits: toggling some LEDs, an RGB LED, making a speaker beep, and so on. A few months earlier a colleague at work had shown me a webpage with some robotic arms that looked catchy. The closest shop that sells robot arms was in the Czech Republic, and it carried only one model: the Owi. So I decided to get it.
Connecting to RPi
The first task was to connect the Owi arm to the Raspberry Pi. The Owi robotic arm comes either with a USB interface or with a joystick. Mine came with a joystick controller, so I had to figure out how to connect it to the RPi. After a few hours of googling I found a video showing how to toggle relays with an RPi, so I got two relay boards and two 3 V DC adapters.
Using a keyboard to control the arm
After wiring all the motors to the RPi through the relays, it was easy to write a small script that toggles the relays and moves all of Owi's joints.
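Such a script can be sketched roughly like this. The pin numbers and the two-relays-per-motor layout are my assumptions, not the actual wiring; the stub class only exists so the sketch runs off the Pi as well.

```python
import time

try:
    import RPi.GPIO as GPIO          # the real library on the Raspberry Pi
except ImportError:
    class GPIO:                      # tiny stand-in so the sketch runs anywhere
        BCM, OUT = "BCM", "OUT"
        HIGH, LOW = 1, 0
        log = []                     # records (pin, state) calls for inspection

        @classmethod
        def setmode(cls, mode): pass

        @classmethod
        def setup(cls, pins, mode, initial=None): pass

        @classmethod
        def output(cls, pin, state): cls.log.append((pin, state))

        @classmethod
        def cleanup(cls): pass

# Hypothetical BCM pin numbers for the two relays driving the base motor,
# one relay per rotation direction (not the author's actual wiring).
BASE_CW, BASE_CCW = 17, 27

GPIO.setmode(GPIO.BCM)
GPIO.setup([BASE_CW, BASE_CCW], GPIO.OUT, initial=GPIO.LOW)

def pulse(pin, seconds):
    """Close a relay for `seconds`, then open it again."""
    GPIO.output(pin, GPIO.HIGH)
    time.sleep(seconds)
    GPIO.output(pin, GPIO.LOW)

pulse(BASE_CW, 0.2)    # rotate the base one way for 0.2 s
pulse(BASE_CCW, 0.2)   # and back the other way
GPIO.cleanup()
```

Binding keys to calls like `pulse(BASE_CW, 0.2)` is all the keyboard control there is: the arm only understands "motor on" and "motor off".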
As this arm is more of a toy than a real device, it has no stepper motors, so the only way to do any keyframing is to store moves as periods of time. After a few moments you realise that it is almost impossible to return the arm to a previous position using only the recorded time for each motor. You can save positions and replay them, but you have to restore the arm to its default position by hand from time to time. The only idea I had for restoring the default position automatically was to use OpenCV and image features to learn what the default position looks like. That might be interesting, but it is also a waste of time; you might as well get a new arm with stepper motors instead.
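The time-based keyframing described above could look roughly like this. `TimedKeyframer` and the `drive` callback are my own names for illustration, not taken from the original script.

```python
class TimedKeyframer:
    """Record moves as (motor, direction, seconds) tuples and replay them.

    Because the DC motors give no position feedback, replaying the same
    durations drifts a little on every run -- hence the need to re-home
    the arm by hand from time to time.
    """

    def __init__(self, drive):
        self.drive = drive       # callback that actually pulses the relays
        self.moves = []

    def record(self, motor, direction, seconds):
        """Perform a move and remember it for later replay."""
        self.moves.append((motor, direction, seconds))
        self.drive(motor, direction, seconds)

    def replay(self):
        """Re-run every recorded move in order."""
        for move in self.moves:
            self.drive(*move)

# Example with a drive callback that just logs instead of touching GPIO.
log = []
kf = TimedKeyframer(lambda m, d, s: log.append((m, d, s)))
kf.record("base", "cw", 0.5)
kf.record("elbow", "up", 1.2)
kf.replay()    # repeats both moves -- with whatever drift the motors add
```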
Control via Wi-Fi
The most exciting part, the one I was looking forward to, was combining computer vision with the Owi arm. The first step was to install the Python OpenCV bindings.
Apart from installing a few libraries, the follow-the-face task was easy: once you can detect a face as a rectangle, you can move the Owi arm towards it.
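A minimal sketch of that idea, assuming the modern `cv2` bindings and their bundled Haar cascade; `steer` is my own helper name, and the camera loop is illustrative rather than the author's code.

```python
def steer(face, frame_w, frame_h, dead_zone=10):
    """Decide which way to move from a face rectangle (x, y, w, h).

    Returns (horizontal, vertical) commands in {-1, 0, 1}, where -1 means
    left/up, 1 means right/down, and 0 means the face is centred enough.
    """
    x, y, w, h = face
    cx, cy = x + w // 2, y + h // 2          # centre of the detected face
    dx, dy = cx - frame_w // 2, cy - frame_h // 2
    horiz = 0 if abs(dx) <= dead_zone else (1 if dx > 0 else -1)
    vert = 0 if abs(dy) <= dead_zone else (1 if dy > 0 else -1)
    return horiz, vert

# Camera loop -- runs only where OpenCV and a camera are actually available.
if __name__ == "__main__":
    import cv2
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 160)   # the resolution the RPi kept up with
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 120)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces):
            h_cmd, v_cmd = steer(faces[0], 160, 120)
            # here you would pulse the corresponding relays for a short time
```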
After trying several resolutions I was a bit disappointed, because the RPi only manages 160x120 px in real time; larger resolutions have unacceptable lag.
The other disappointment was the fixed speed of the motors (it is only a toy, after all). There is no way to set a speed; you can only toggle the motors on and off.
The Owi robotic arm is good for fun, and maybe for simple projects, but I think it is useless for any more complex task.
Download the source code here