Posts

Skate trick recognition using a gyroscope

In this article we're going to describe how to recognize a skateboard trick using a gyroscope. This sensor is already present in most smartphones, but in case you are not familiar with it, here is a description. Before we begin, a small disclaimer: this project originally started as part of a hackathon at Slido, and its purpose was to confirm that it's possible to recognize a trick using a gyroscope. For the sake of simplicity we're not considering the skater's stance on the board, and we're only trying to recognize two simple tricks. So let's begin by splitting the problem into several smaller ones: we need to record the trick, store it, describe it, understand it, and then, of course, recognize it.

Recording the trick

Let's start with recording the trick. To get precise data we need a device with a gyroscope attached directly to the skateboard. We need to place it on the bottom of the board, otherwise it might interfere…
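
To make the recording step concrete, here is a minimal sketch of capturing raw gyroscope samples on an Android device. The TrickRecorder class and its sample buffer are illustrative assumptions, not code from the original project.

```java
// Minimal sketch of the recording step, assuming an Android device with a gyroscope.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import java.util.ArrayList;
import java.util.List;

public class TrickRecorder implements SensorEventListener {

    // One gyroscope sample: timestamp in nanoseconds plus angular speed (rad/s) around x, y, z.
    public static class Sample {
        public final long timestampNs;
        public final float x, y, z;

        Sample(long timestampNs, float x, float y, float z) {
            this.timestampNs = timestampNs;
            this.x = x;
            this.y = y;
            this.z = z;
        }
    }

    private final SensorManager sensorManager;
    private final List<Sample> samples = new ArrayList<>();

    public TrickRecorder(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        samples.clear();
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        // Fastest rate, so quick flip rotations are not undersampled.
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_FASTEST);
    }

    public List<Sample> stop() {
        sensorManager.unregisterListener(this);
        return new ArrayList<>(samples);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        samples.add(new Sample(event.timestamp, event.values[0], event.values[1], event.values[2]));
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

The recorded list of timestamped angular velocities is what the later steps (storing, describing and recognizing the trick) would work with.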

YI Dash Car Camera API

Why

A friend asked me if I could create an Android app for his new Yi Dash Car Camera. The camera's parameters looked very good and the price was quite low. He showed me a native Android app that was already on Google Play. He needed to download a list of video files and search them by a specific time.

About the camera

The camera has its own wifi, so if you want to do anything with it, you need to be connected to that wifi. The camera automatically starts recording videos (each about 3 minutes long) when it's turned on. Recording stops during downloads or while changing settings. The official app also offers an option to take a picture (done via a screenshot from the stream).

What I did

First I tried an SDK from the camera's company, but it did not support this model. After a quick mail to support, I got confirmation that this camera does not have any official SDK. My next move was to download their app and look at the source code. I used an online decompiler…
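
With no official SDK, talking to the camera comes down to plain HTTP requests over its wifi, against whatever endpoints the decompiled app reveals. Below is a minimal, hypothetical sketch of downloading one recording; the host address and the remote path are placeholders, not the camera's real API.

```java
// Hypothetical sketch of downloading one recording over the camera's wifi.
// CAMERA_HOST and remotePath are placeholders; the real endpoints have to be
// taken from the decompiled official app.
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class CameraDownloader {

    private static final String CAMERA_HOST = "http://192.168.1.254"; // placeholder address

    public static void download(String remotePath, String localFile) throws Exception {
        URL url = new URL(CAMERA_HOST + remotePath);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setConnectTimeout(5000);
        connection.setReadTimeout(15000);

        // Stream the video to a local file in small chunks.
        try (InputStream in = connection.getInputStream();
             OutputStream out = new FileOutputStream(localFile)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            connection.disconnect();
        }
    }
}
```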

Toilet light wall

An LED screen made of empty toilet rolls.

Why?

The first idea was to reuse empty toilet rolls and have some fun with an Arduino and a few LEDs. After gluing the first block (4x3 LEDs connected to an Arduino Nano), I decided it would be cool to control it right from Android, so I added a BT adapter and created an Android app. As time went by, more empty toilet rolls appeared, so I built one block after another. After some time I set my goal at 4x4 blocks altogether (192 LEDs). As blocks were added, the functionality of the Android app also grew.

(image: first 6 blocks)

What can it do?

Because it is quite straightforward from the video below, let's just briefly list the features of the Android app:

draw (real-time)
toggle random LEDs (screen saver)
animation (also includes an editor for key-frame animation)
font
loop incoming sound
a simple game using the accelerometer

(image: front view)

Was there something interesting?

Aside from…
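
A minimal sketch of the Android side of the Bluetooth link is below. It assumes a classic Bluetooth serial (SPP) module next to the Arduinos and a simple one-byte-per-LED frame format; both are illustrative guesses rather than the project's actual protocol.

```java
// Illustrative sketch: sending one frame of LED states (192 bytes, one per LED)
// to the wall over a classic Bluetooth (SPP) connection. Requires the BLUETOOTH
// permission and an already paired module; the frame format is an assumption.
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;

import java.io.IOException;
import java.io.OutputStream;
import java.util.UUID;

public class LedWallClient {

    // Standard serial port profile UUID.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    private BluetoothSocket socket;
    private OutputStream output;

    public void connect(String macAddress) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        BluetoothDevice device = adapter.getRemoteDevice(macAddress);
        socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
        socket.connect();
        output = socket.getOutputStream();
    }

    // frame[i] = brightness (0-255) of LED i; 4x4 blocks of 4x3 LEDs = 192 LEDs in total.
    public void sendFrame(byte[] frame) throws IOException {
        output.write(frame);
        output.flush();
    }

    public void close() throws IOException {
        socket.close();
    }
}
```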

Drone following instructions

Reading instructions from QR codes and executing them using an Android application.

Intro

Recently I got an opportunity to build a drone prototype controlled by an Android device. First I had to choose the best candidate. The requirements were small size and an SDK with video streaming. After some research I decided that the Bebop 2 from Parrot would be the best choice. Parrot is one of the few companies that has an open SDK for developers, and they have recently released the 3rd version of their SDK. The first step was to try the Android application example. This example covers almost every basic feature: connecting to the drone, moving around, taking high-quality pictures and accessing the drone's media. One of the steps for the prototype would be autonomous landing onto a pattern. I did some research on existing solutions and found this paper, which describes the theory behind the landing. So I decided to create an Android application that navigates…
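
The QR-reading part can be sketched independently of the Parrot SDK. The snippet below assumes the ZXing library and a small hypothetical instruction vocabulary (FORWARD, TURN_LEFT, LAND, ...): it decodes a QR code from one video frame and maps its text to a command, while the actual piloting calls of the SDK are left out.

```java
// Sketch of decoding a navigation instruction from a QR code in a video frame,
// using the ZXing library. The Instruction enum is a hypothetical vocabulary;
// wiring the result to the drone's piloting commands is omitted.
import com.google.zxing.BinaryBitmap;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.RGBLuminanceSource;
import com.google.zxing.Result;
import com.google.zxing.common.HybridBinarizer;

public class QrInstructionReader {

    public enum Instruction { FORWARD, TURN_LEFT, TURN_RIGHT, LAND, UNKNOWN }

    // pixels: ARGB values of one frame, e.g. from Bitmap.getPixels().
    public static Instruction read(int[] pixels, int width, int height) {
        RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels);
        BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
        try {
            Result result = new MultiFormatReader().decode(bitmap);
            return parse(result.getText());
        } catch (NotFoundException e) {
            return Instruction.UNKNOWN; // no QR code in this frame
        }
    }

    private static Instruction parse(String text) {
        switch (text.trim().toUpperCase()) {
            case "FORWARD":    return Instruction.FORWARD;
            case "TURN_LEFT":  return Instruction.TURN_LEFT;
            case "TURN_RIGHT": return Instruction.TURN_RIGHT;
            case "LAND":       return Instruction.LAND;
            default:           return Instruction.UNKNOWN;
        }
    }
}
```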

Robotic arm with computer vision

Robotic arm with computer vision - picking up an object.

Idea

The main idea was to build an environment with a robotic arm that can execute various commands based on image analysis of a scene. In this article I'm going to describe all parts of the idea. For the first task I chose detecting and moving one object.

Environment

The whole environment consists of a few parts mounted together. For the base I chose an old table and repainted it white to get better contrast with the objects. Onto the middle of the longer side I mounted a robotic arm that I got from eBay. The arm has 6 servo motors, with a rotating base and claws on the other side. The parts are made of aluminium and are quite solid. Then I got some perforated metal ledges, shortened them, mounted them to the corners of the table, and screwed it all together. Then I attached an RGB LED strip to the bottom side of the top part of the construction. In the end I placed a USB camera at the top of the construction so it can…
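
As a rough illustration of the detection step, here is a sketch using OpenCV's Java bindings. It assumes a single dark object lying on the white table top, isolates it with an inverted threshold, and returns the centre of its bounding box in pixel coordinates; translating that into arm coordinates would be a separate calibration step that is not shown.

```java
// Sketch of locating one dark object on the white table using OpenCV's Java API.
// The threshold value and minimum area are illustrative guesses.
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.util.ArrayList;
import java.util.List;

public class ObjectLocator {

    public static Point locate(String imagePath) {
        Mat frame = Imgcodecs.imread(imagePath);
        Mat gray = new Mat();
        Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);

        // White table becomes black, the dark object becomes white.
        Mat binary = new Mat();
        Imgproc.threshold(gray, binary, 120, 255, Imgproc.THRESH_BINARY_INV);

        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(binary, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        // Pick the largest contour and return the centre of its bounding box.
        MatOfPoint largest = null;
        double largestArea = 500; // ignore small specks
        for (MatOfPoint contour : contours) {
            double area = Imgproc.contourArea(contour);
            if (area > largestArea) {
                largestArea = area;
                largest = contour;
            }
        }
        if (largest == null) {
            return null; // nothing found on the table
        }
        Rect box = Imgproc.boundingRect(largest);
        return new Point(box.x + box.width / 2.0, box.y + box.height / 2.0);
    }

    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        Point centre = locate("table.jpg"); // hypothetical sample image
        System.out.println(centre != null ? "Object centre: " + centre : "No object found");
    }
}
```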

Counting dice and train wagons using computer vision

Computer vision exercises with simple preprocessing.

Before the next project I decided to do some computer vision exercises. Each example is based on simple, logic-only image preprocessing; no data structures or learning are required.

Dice

I got this idea while browsing the net and was curious about how hard it could be to write such a script. I'll describe the algorithm in steps (a rough code sketch of the middle steps follows the list).

movement detection: Comparing a few frames against a threshold tells us whether something is moving in the frame. Waiting a short time after the movement stops gives us a more precise result.

remove background: Thresholding the gray frame removes the background and gives us a binary image with the objects.

cropping the objects: Using contours to detect the objects and then separating them by cropping.

detecting dots: Inverting the image gives us objects (the dots) that can again be simply detected using contours.

filtering dots: If a die is also visible from the side, dots from that side can be recognized…
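
Here is a rough sketch of the background-removal, cropping and dot-detection steps using OpenCV's Java bindings. It assumes light dice with dark pips on a darker background; the threshold values and area limits are illustrative, and the movement-detection and side-dot-filtering steps are omitted.

```java
// Sketch of counting pips on dice from a single frame: threshold away the
// background, crop each die via its outer contour, then invert the crop and
// count the dot-sized blobs inside it. All numeric limits are rough guesses.
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Rect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.util.ArrayList;
import java.util.List;

public class DiceCounter {

    public static List<Integer> countPips(String imagePath) {
        Mat gray = Imgcodecs.imread(imagePath, Imgcodecs.IMREAD_GRAYSCALE);

        // Remove background: light dice become white blobs on a black background.
        Mat dice = new Mat();
        Imgproc.threshold(gray, dice, 150, 255, Imgproc.THRESH_BINARY);

        // Detect each die by its outer contour and crop it.
        List<MatOfPoint> diceContours = new ArrayList<>();
        Imgproc.findContours(dice.clone(), diceContours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        List<Integer> pipCounts = new ArrayList<>();
        for (MatOfPoint contour : diceContours) {
            if (Imgproc.contourArea(contour) < 1000) continue; // skip noise
            Rect die = Imgproc.boundingRect(contour);
            Mat crop = new Mat(dice, die);

            // Invert the crop so the dark pips turn into white blobs and count them.
            Mat inverted = new Mat();
            Core.bitwise_not(crop, inverted);
            List<MatOfPoint> dotContours = new ArrayList<>();
            Imgproc.findContours(inverted, dotContours, new Mat(),
                    Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

            int pips = 0;
            for (MatOfPoint dot : dotContours) {
                double area = Imgproc.contourArea(dot);
                Rect d = Imgproc.boundingRect(dot);
                // Background corners left in the bounding box touch the crop border; real pips do not.
                boolean insideDie = d.x > 1 && d.y > 1
                        && d.x + d.width < die.width - 1 && d.y + d.height < die.height - 1;
                if (area > 20 && area < 500 && insideDie) pips++;
            }
            pipCounts.add(pips);
        }
        return pipCounts;
    }

    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        System.out.println("Pips per die: " + countPips("dice.jpg")); // hypothetical sample image
    }
}
```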

Play table

Using proximity sensors to play MIDI tones, combined with LED visualization.

Description

The aim of this project was to create a table-sized device with multiple proximity sensors that can play MIDI tones. Each sensor has a few LEDs that show the distance of a hand above the table. The table can be used by one or more people at once.

Hardware setup

First I created a prototype from cardboard to test the sensors and some of the logic behind them. Then I ordered a custom plotter-cut sticker with a design that was painted with Bare Conductive paint. Afterwards I drilled some holes and connected the Touch Board with 7 Arduino Nanos. Each Arduino was connected to 13 LEDs. At the end I added two potentiometers, one for controlling the volume and one for changing the note setup.

Programming

The programming consists of two parts: a master (Touch Board) program that reads values from the proximity sensors and sends messages to the slaves (Arduinos), and a second program for the slaves that reads messages from the m…
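
The split of responsibilities can be illustrated with a small sketch of the master's logic: mapping a proximity reading to a number of lit LEDs and a MIDI note, and packing a tiny message for one slave. It is written in Java only to match the other sketches in this listing; on the real hardware this logic lives in the Touch Board and Arduino programs, and all ranges and the message format here are assumptions.

```java
// Platform-independent sketch of the master's logic: map a proximity reading to
// a MIDI note and a number of lit LEDs, and pack a small message for one slave.
// The sensor range, base note and 2-byte message format are all assumptions.
public class PlayTableMaster {

    private static final int LEDS_PER_SLAVE = 13;
    private static final int SENSOR_MAX = 1023;   // assumed raw sensor range
    private static final int BASE_NOTE = 60;      // middle C, assumed note setup

    // Closer hand -> higher reading -> more LEDs lit.
    public static int ledsToLight(int sensorValue) {
        return Math.min(LEDS_PER_SLAVE, sensorValue * LEDS_PER_SLAVE / SENSOR_MAX);
    }

    // Each sensor plays a fixed note offset from the base note.
    public static int midiNote(int sensorIndex) {
        return BASE_NOTE + sensorIndex;
    }

    // Hypothetical 2-byte message sent to a slave: [sensor index, LED count].
    public static byte[] encodeMessage(int sensorIndex, int sensorValue) {
        return new byte[] { (byte) sensorIndex, (byte) ledsToLight(sensorValue) };
    }

    public static void main(String[] args) {
        byte[] msg = encodeMessage(3, 700);
        System.out.printf("sensor=%d leds=%d note=%d%n", msg[0], msg[1], midiNote(msg[0]));
    }
}
```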