Posts with «opencv» label

Eye-Controlled Wheelchair Advances from Talented Teenage Hackers

[Myrijam Stoetzer] and her friend [Paul Foltin], 14 and 15 year old students from Duisburg, Germany, are working on an eye-movement-controlled wheelchair. They were inspired by the EyeWriter Project, which we’ve been following for a long time. The EyeWriter was built for Tony Quan a.k.a Tempt1 by his friends. In 2003, Tempt1 was diagnosed with the degenerative nerve disorder ALS and is now fully paralyzed except for his eyes, but he has been able to use the EyeWriter to continue his art.

This is their first big leap up from Lego Mindstorms. The eye tracker consists of a safety glasses frame, a regular webcam, and IR SMD LEDs. They removed the IR-blocking filter from the webcam to make it work in all lighting conditions. The image processing is handled by an Odroid U3 – a compact, low-cost ARM quad-core SBC capable of running Ubuntu, Android, and other Linux-based systems. They initially tried the Raspberry Pi, which managed only about 3 fps, compared to 13–15 fps from the Odroid. The code is written in Python and uses the OpenCV libraries; they are learning Python on the go. An Arduino is used to control the motor via an H-bridge controller, and also to calibrate the eye tracker. Potentiometers connected to the Arduino’s analog ports allow adjusting the tracker to individual requirements.

The webcam video stream is filtered to obtain the pupil position, which is then compared against four presets for forward, reverse, left, and right. The presets can be adjusted using the potentiometers. An enable switch, manually activated at present, ensures the wheelchair moves only when commanded. The plan is to later replace this switch with tongue activation or perhaps cheek-muscle twitch detection.
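Their code hasn’t been released yet, but the pipeline described above maps onto a handful of OpenCV calls. The Python sketch below is a minimal approximation, not the team’s implementation: it assumes a dark-pupil threshold on the IR-lit image, hard-coded placeholder presets, and a serial link to the Arduino, with the port, threshold values, and command letters all made up for illustration.

# Minimal sketch of the described pipeline (not the team's code): threshold the
# IR-lit frame, find the pupil blob, compare its centre to four presets, and
# send a drive command to the Arduino over serial.
import cv2
import serial

arduino = serial.Serial("/dev/ttyUSB0", 9600)   # serial port is an assumption
cap = cv2.VideoCapture(0)

# Placeholder presets (pixel coordinates) for forward, reverse, left, right;
# the real build tunes these via potentiometers read by the Arduino.
PRESETS = {"F": (320, 150), "B": (320, 330), "L": (200, 240), "R": (440, 240)}
TOLERANCE = 60  # pixels

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under IR illumination the pupil shows up as the darkest region.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        continue
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        continue
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # Pick the closest preset; only act if the pupil is near enough to one of them.
    cmd, dist = min(
        ((k, ((cx - x) ** 2 + (cy - y) ** 2) ** 0.5) for k, (x, y) in PRESETS.items()),
        key=lambda kv: kv[1],
    )
    if dist < TOLERANCE:
        # The Arduino only drives the H-bridge while the enable switch is closed.
        arduino.write(cmd.encode())

In the actual build the preset coordinates would come from the potentiometers the Arduino reads, rather than being hard-coded as they are here.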

First tests were done on a small mock-up robotic platform. After winning a local competition, they bought a second-hand wheelchair and started over. This time they tried the Raspberry Pi 2 Model B, which managed about 8–9 fps. Not as fast as the Odroid, but at half the cost it seemed like a workable solution, since their aim is to make the build as cheap as possible. They would appreciate any help improving the performance – perhaps by improving their code or utilising all four cores more efficiently (see the sketch below for one common pattern). For the bigger wheelchair they used recycled car windshield-wiper motors and some relays to switch them, and 3D printed an enclosure for the camera as well as wheels to help turn the wheelchair. Further details are available on [Myrijam]’s blog. They documented their build (German, PDF) and have their sights set on the German National Science Fair. The team is working on an English translation of the documentation and will soon release all design files and source code under a CC BY-NC license.
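One common way to claw back a few frames per second on a multi-core board is to decouple frame capture from processing, so the detection loop always works on the latest frame instead of blocking on the camera. The snippet below is a generic Python/OpenCV threading pattern offered as an example, not a change to their code.

# Generic pattern: grab frames in a background thread so the detection loop
# always processes the latest frame instead of waiting on the camera.
import threading
import cv2

class ThreadedCamera:
    def __init__(self, index=0):
        self.cap = cv2.VideoCapture(index)
        self.frame = None
        threading.Thread(target=self._reader, daemon=True).start()

    def _reader(self):
        while True:
            ok, frame = self.cap.read()
            if ok:
                self.frame = frame

    def read(self):
        return self.frame

cam = ThreadedCamera()
while True:
    frame = cam.read()
    if frame is None:
        continue
    # ... run the pupil detection on `frame` here ...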



Keep your candies safe with Candy Locker and Intel Edison


Candy Locker is a mouth-watering tutorial based on the Intel® Edison and image recognition of objects. You can keep your candy safe from greedy hands with this color-recognition lock and dispenser: it uses a set of five distinct color images, and showing them in a preset pattern unlocks the mechanism and dispenses the candy.
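The tutorial has the full details; as a rough idea of what a color-pattern lock involves, the sketch below classifies the dominant hue of each frame against a few reference colors and checks whether the sequence of cards shown matches a preset pattern. The hue values, the pattern, and the camera index are illustrative assumptions, not values from the project.

# Rough sketch of a color-pattern lock; hue centres and the secret pattern are
# placeholders, not values from the tutorial.
import cv2
import numpy as np

# Approximate OpenCV hue centres (0-179) for five reference colors.
COLORS = {"red": 0, "yellow": 30, "green": 60, "blue": 120, "purple": 150}
PATTERN = ["red", "blue", "green"]          # sequence that unlocks the dispenser

def dominant_color(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    h = int(np.median(hsv[:, :, 0]))        # median hue of the frame
    # Closest reference hue, remembering that hue wraps around at 180.
    return min(COLORS, key=lambda c: min(abs(h - COLORS[c]), 180 - abs(h - COLORS[c])))

cap = cv2.VideoCapture(0)
seen = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    color = dominant_color(frame)
    if not seen or color != seen[-1]:       # record only when the shown color changes
        seen.append(color)
        seen = seen[-len(PATTERN):]
    if seen == PATTERN:
        print("Pattern matched - dispense candy!")   # here the Edison would drive the dispenser
        seen = []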

Follow the link and open the magic door of video and picture recognition: learn how to handle object recognition and the foundations of OpenCV, and invent cool systems of your own.

L33T - Personal service robot


What does it do?

Personal assistance (Butler)

Time for a formal reveal of the project I am working on: L33T. The idea behind L3 is to be a personal service robot, helping out wherever he can and interacting with people in the household, effectively being a robot butler.

When thinking about what L3 should be and what it should look like, I drew a lot of inspiration from R2; even the naming scheme is similar. R2 was able to maneuver well in tight situations, he could 'talk' to and understand people, and he was capable of performing butler-like tasks.

Cost to build

$100,00


Type

wheels


Pixar-style lamp project is a huge animatronics win

Even with the added hardware, the lamp still looks relatively normal. But its behavior is more than remarkable: it interacts with people in an incredibly lifelike way. This is of course inspired by the lamp from Pixar’s Luxo Jr. short film, but there’s a little bit of the Most Useless Machine added just for fun. If you try to shut it off, the lamp uses its shade to flip the switch on the base back on.

[Shanshan Zhou], [Adam Ben-Dror], and [Joss Doggett] developed the little robot as a class project at the Victoria University of Wellington. It uses six servo motors driven by an Arduino to give the inanimate object the ability to move as if it’s alive. There is no light in the lamp as the bulb has been replaced by a webcam. The image is monitored using OpenCV to include face tracking as one of the behaviors. All of the animations are procedural, making use of Processing to convey movement instructions to the Arduino board.
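The project’s own code is written in Processing, but the procedural-animation idea translates into a few lines of any language: instead of playing back canned keyframes, each servo target is recomputed every frame from the tracked face position and eased toward. The Python sketch below is a loose approximation of that idea; the face_offset() helper, serial protocol, channel mapping, and gains are placeholders, not the authors’ values.

# Loose sketch of procedural animation: servo targets are recomputed every frame
# from the tracked face position and eased toward, rather than played back from
# canned keyframes. Port, channel mapping, and gains are illustrative assumptions.
import time
import serial

arduino = serial.Serial("/dev/ttyUSB0", 115200)   # port/baud assumed

pan, tilt = 90.0, 90.0        # current servo angles in degrees
EASE = 0.15                   # fraction of the remaining distance covered per frame

def face_offset():
    """Placeholder: return (dx, dy) of the detected face from the frame centre, in pixels."""
    return 0, 0

while True:
    dx, dy = face_offset()
    target_pan = 90 + dx * 0.1    # made-up gain: 0.1 degrees per pixel
    target_tilt = 90 - dy * 0.1
    # Easing makes the motion look organic instead of snapping to the target.
    pan += (target_pan - pan) * EASE
    tilt += (target_tilt - tilt) * EASE
    arduino.write(f"0,{int(pan)}\n1,{int(tilt)}\n".encode())
    time.sleep(0.02)              # roughly 50 updates per second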

Do not miss seeing the video embedded after the break.

[via Gizmodo]



Face tracking with Arduino, Processing and OpenCV

How this actually works is described below, but also in the video. I suggest you watch the video, as my writing is terrible.

Video : http://www.youtube.com/watch?v=1cEp7duDbNU&feature=plcp

The Processing code: https://www.dropbox.com/sh/v9hkdxuoazoyb0d/fCGQzEfBAK (the OpenCV and Arduino libraries are needed; OpenCV also needs to be installed on the computer).

The Arduino code: https://www.dropbox.com/sh/ujjlahx83ilv1j2/QB0bu8E-EJ
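As a rough outline of what the linked sketches do together, here is a Python stand-in (the original is Processing): OpenCV’s bundled Haar cascade finds a face, and its centre is pushed to the Arduino over serial. The serial port and message format are assumptions, not taken from the Dropbox code.

# Python stand-in for the Processing sketch: detect a face with a Haar cascade
# and send its centre to an Arduino over serial (port and message format assumed).
import cv2
import serial

arduino = serial.Serial("/dev/ttyUSB0", 9600)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        x, y, w, h = faces[0]
        # "x,y\n" for the face centre; the Arduino maps this to pan/tilt servo angles.
        arduino.write(f"{x + w // 2},{y + h // 2}\n".encode())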


Arduino hack gives a second screen to Android phones, isn't very useful (video)

Who knows why tech tinkerers do what they do. We're just happy to see those idle hands try the untested. Like this latest Arduino hack from modder Michael of Nootropic Design, who's seen fit to rig a 16 x 32 LED matrix up to an Android phone for use as a secondary display. The outputted video, downscaled via OpenCV software to an appropriate resolution and 12-bit color, is admittedly unimpressive, as it chugs along at a paltry four frames per second. But that's not the point of this can-do experiment -- it's all about the possibilities, however blurry and pointless they may be (although we're sure Barbara Walters would beg to differ). Ready to see this modjob in motion? Then head on past the break for a brief video demo.
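For a sense of what "downscaled to an appropriate resolution and 12-bit color" means in practice, the snippet below shrinks a frame to 16 x 32 and keeps four bits per channel; it's a generic OpenCV illustration, not Nootropic Design's actual code.

# Generic illustration of the downscaling step: shrink a frame to 16x32 pixels
# and quantize to 12-bit color (4 bits per channel). Not Nootropic Design's code.
import cv2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    small = cv2.resize(frame, (32, 16), interpolation=cv2.INTER_AREA)  # (width, height)
    quantized = (small >> 4) << 4      # keep only the top 4 bits of each 8-bit channel
    print(quantized.shape)             # (16, 32, 3): one 12-bit color per LED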

Arduino hack gives a second screen to Android phones, isn't very useful (video) originally appeared on Engadget on Wed, 25 Jan 2012 09:28:00 EST.