Posts with «gesture» label

Arduino, Accelerometer, and TensorFlow Make You a Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] chose three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to tensors, and used to train a model covering the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
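
To give a feel for the capture side, here is a minimal sketch (ours, not [Charlie]’s code) that reads an MPU6050 over I2C and streams raw accelerometer samples as CSV. Her controller pushes the samples over WiFi from the MKR1000; this version uses the serial port and a baud rate of our choosing just to keep the example self-contained.

```cpp
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;   // default MPU6050 address with AD0 tied low

// Read one big-endian 16-bit value from the already-selected register pointer.
int16_t readWord() {
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Serial.begin(115200);          // baud rate is our choice, not the project's
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1: clear the sleep bit to wake the chip
  Wire.write(0x00);
  Wire.endTransmission();
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // ACCEL_XOUT_H: accelerometer data starts here
  Wire.endTransmission(false);   // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);

  int16_t ax = readWord();
  int16_t ay = readWord();
  int16_t az = readWord();

  // One CSV line per sample; the PC collects runs of these as labeled
  // examples of each move before training the model.
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.println(az);

  delay(10);                     // roughly 100 samples per second
}
```

Each labeled run of those CSV lines becomes one training example for a move.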

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Sensor Lets Gestures and an Arduino Control the Tunes

Every time we watch Minority Report we want to make wild hand gestures at our computer — most of them polite. [Rootsaid] wanted to do the same and discovered that the PAJ7620 is an easy way to read hand gestures. The little sensor has a serial interface and can recognize quite a bit of hand waving. To be precise, the device can read nine different motions: up, down, left, right, forward, backward, clockwise, anticlockwise, and wave.

There are plenty of libraries for reading it on common platforms. If you have an Arduino that can act as a keyboard for a PC, the code almost writes itself. [Rootsaid] uses a dedicated library for the PAJ7620 and another from NicoHood for sending media keys.

With those two libraries, the code is very simple to write. You read a gesture register from the sensor and pick which key to send with the NicoHood library. The sensor talks I2C, and onboard there’s a tiny optical sensor along with an IR LED.
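
As a rough illustration (assuming Seeed’s paj7620 library and NicoHood’s HID-Project on a native-USB board such as a Leonardo or Pro Micro, with a gesture-to-key mapping of our own choosing rather than [Rootsaid]’s), the whole thing boils down to something like this:

```cpp
#include <Wire.h>
#include "paj7620.h"        // Seeed's gesture sensor library
#include "HID-Project.h"    // NicoHood's HID library (Consumer = media keys)

void setup() {
  paj7620Init();            // configures the sensor over I2C
  Consumer.begin();         // start the USB consumer-control interface
}

void loop() {
  uint8_t gesture = 0;
  // Register 0x43 holds the detection flags for the basic gestures.
  if (paj7620ReadReg(0x43, 1, &gesture) == 0) {
    // Key names follow HID-Project's Consumer example; the mapping is ours.
    switch (gesture) {
      case GES_UP_FLAG:      Consumer.write(MEDIA_VOLUME_UP);   break;
      case GES_DOWN_FLAG:    Consumer.write(MEDIA_VOLUME_DOWN); break;
      case GES_LEFT_FLAG:    Consumer.write(MEDIA_PREVIOUS);    break;
      case GES_RIGHT_FLAG:   Consumer.write(MEDIA_NEXT);        break;
      case GES_FORWARD_FLAG: Consumer.write(MEDIA_PLAY_PAUSE);  break;
      default: break;        // ignore the other gestures
    }
  }
  delay(100);
}
```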

Of course, you could send other keys than media controls. We wouldn’t mind going back and forward on web pages with a gesture, for example.

We’ve seen gesture recognition with radar. We’ve also seen it with ultrasonics.


Hack a Day 17 Aug 21:00

Listening for Hand Gestures

[B. Aswinth Raj] wanted to control a VLC player with hand gestures. He turned to two common ultrasonic sensors and Python to do the job. There is also, of course, an Arduino. You can see a video of the results below.

The Arduino code reads the distance from both sensors — one for the left hand and the other for the right. This allows the device to react to single hand gestures that get closer or further away from one sensor as well as gestures involving both hands. For example, raising your left hand and moving it closer or further away will adjust the volume. The right hand controls rewind and fast forward. Raising both hands will start or stop playback.
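
The Arduino side might look something like the sketch below. This is our own illustration rather than [B. Aswinth Raj]’s code; the pin choices, distance thresholds, and command strings are all assumptions, and the actual keystroke simulation happens on the PC.

```cpp
// Read two HC-SR04-style sensors and print simple commands that a script
// on the PC can turn into keystrokes.
const int TRIG_L = 2, ECHO_L = 3;   // left-hand sensor
const int TRIG_R = 4, ECHO_R = 5;   // right-hand sensor

long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long us = pulseIn(echoPin, HIGH, 30000);   // timeout means no hand in range
  return us / 58;                            // microseconds to centimeters
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_L, OUTPUT); pinMode(ECHO_L, INPUT);
  pinMode(TRIG_R, OUTPUT); pinMode(ECHO_R, INPUT);
}

void loop() {
  long left = readDistanceCm(TRIG_L, ECHO_L);
  long right = readDistanceCm(TRIG_R, ECHO_R);

  bool leftRaised = (left > 0 && left < 40);    // hand within ~40 cm
  bool rightRaised = (right > 0 && right < 40);

  if (leftRaised && rightRaised)      Serial.println("PLAY_PAUSE");
  else if (leftRaised && left < 15)   Serial.println("VOL_DOWN");
  else if (leftRaised)                Serial.println("VOL_UP");
  else if (rightRaised && right < 15) Serial.println("REWIND");
  else if (rightRaised)               Serial.println("FORWARD");

  delay(200);
}
```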

Of course, since the Arduino is reading the gestures you could change them to suit you. We might have mounted the sensors further back (or, perhaps, added more sensors) so you could use trigonometry to triangulate the hand’s exact position. Well, perhaps not exact, but you could get an idea of the hand’s motion from right to left as well as forward and backward.

On the host computer side, Python receives serial data from the Arduino and then simulates keystrokes to get the desired result. Of course, this is also highly customizable.

By coincidence, we did a similar project a few years ago using one sensor and the Arduino’s ability to appear like a USB keyboard. We’ve also seen 8 sensors making piano music.


Filed under: Arduino Hacks
Hack a Day 02 Nov 03:00

Making a Gesture

When [krich] switched keyboards, he lost his volume control. So he decided to hack one together out of an Arduino, an old floppy disc case, and a Hover Labs Hover board (not the Back to the Future kind). You can see the result in the videos below.

You’ll notice in the video that the device reads a “spin” motion, mimicking a round volume knob. The program sends simulated keyboard presses to the PC to control the audio. In the write-up, [krich] mentions it may be the first gesture-based volume control. However, we’ve done it before and so have others.

The Hover board does all the hard work, of course. It passes gesture data to the Arduino over I2C. The PC side is handled by 3RVX, an on-screen display utility for Windows.
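
Here is a sketch of just the key-press half, using NicoHood’s HID-Project on a native-USB board. The readSpinSteps() function is a hypothetical placeholder for the Hover board’s own I2C gesture read, which we haven’t reproduced here.

```cpp
#include "HID-Project.h"

int readSpinSteps() {
  // Hypothetical placeholder: return positive ticks for clockwise spin,
  // negative for counterclockwise. In [krich]'s build this value comes
  // from the Hover board over I2C via its own library.
  return 0;
}

void setup() {
  Consumer.begin();         // USB consumer-control (media key) interface
}

void loop() {
  int steps = readSpinSteps();
  for (int i = 0; i < abs(steps); i++) {
    // Key names follow HID-Project's Consumer API.
    Consumer.write(steps > 0 ? MEDIA_VOLUME_UP : MEDIA_VOLUME_DOWN);
    delay(20);
  }
  delay(50);
}
```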

Still, this one seems to work well with the Hover board. Hard to argue against anything that upcycles a floppy disc container.


Filed under: Android Hacks

Hand Waving Unlocks Door

Who doesn’t like the user interface in the movie Minority Report where [Tom Cruise] manipulates a giant computer screen by just waving his hands in front of it? [AdhamN] wanted to unlock his door with hand gestures. While it isn’t as seamless as [Tom’s] Hollywood interface, it manages to do the job. You just have to hold on to your smartphone while you gesture.

The project uses an Arduino and a servo motor to move a bolt back and forth. The gesture part requires a 1Sheeld board, which interfaces with a phone and lets your Arduino program use the phone’s capabilities (in this case, the accelerometer).

The rest should be obvious. The 1Sheeld passes the phone’s accelerometer data to the Arduino, and when the sketch sees the right gesture, it operates the servo. It would be interesting to do this with a smartwatch, which would perhaps look a little less obvious.
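
A bare-bones sketch along those lines might look like this, assuming the 1Sheeld Arduino library’s AccelerometerSensor calls (worth checking against the library’s own examples) and a servo-driven bolt on pin 9. The hard-shake threshold is our guess, not [AdhamN]’s actual trigger.

```cpp
#define CUSTOM_SETTINGS
#define INCLUDE_ACCELEROMETER_SENSOR_SHIELD
#include <OneSheeld.h>
#include <Servo.h>

Servo bolt;
bool locked = true;

void setup() {
  OneSheeld.begin();        // talks to the phone through the 1Sheeld app
  bolt.attach(9);
  bolt.write(0);            // bolt extended = locked
}

void loop() {
  // Phone accelerometer values arrive in m/s^2; ~9.8 is gravity alone.
  float x = AccelerometerSensor.getX();
  if (x > 15.0 || x < -15.0) {          // crude shake/wave threshold (our guess)
    locked = !locked;
    bolt.write(locked ? 0 : 90);        // extend or retract the bolt
    delay(1000);                        // simple debounce: one wave = one toggle
  }
}
```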

We covered the 1Sheeld board a while back. Of course, you could also use NFC or some other sensor technology to trigger the mechanism. You can find a video that describes the 1Sheeld below.


Filed under: Arduino Hacks

Controlling a Quadcopter with Gestures

[grassjelly] has been hard at work building a wearable device that uses gestures to control quadcopter motion. The goal of the project is to design a controller that allows the user to intuitively control the motion of a quadcopter. Based on the demonstration video below, we’d say they hit the nail on the head. The controller runs off an Arduino Pro Mini (5 V version) powered by two small coin cell batteries. It contains an accelerometer and an ultrasonic distance sensor.

The controller allows the quadcopter to mimic the orientation of the user’s hand. The user holds their hand out in front of them, parallel to the floor. When the hand is tilted in any direction, the quadcopter copies the motion and will tilt the same way. The amount of pitch and roll is limited by software, likely preventing the user from over-correcting and crashing the machine. The user can also raise or lower their hand to control the altitude of the copter.
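
The math behind that behavior is straightforward. This illustrative sketch (not [grassjelly]’s code) estimates pitch and roll from the direction of gravity, clamps them in software, and maps an ultrasonic hand-height reading to throttle; the sensor-reading helpers are hypothetical stand-ins, and the limits and scaling are guesses.

```cpp
#include <math.h>

const float MAX_ANGLE_DEG = 15.0;   // software limit on pitch and roll

// Hypothetical placeholders: swap in real accelerometer / ultrasonic reads.
float readAccelX() { return 0.0; }
float readAccelY() { return 0.0; }
float readAccelZ() { return 1.0; }          // 1 g straight down when level
float readHandHeightCm() { return 30.0; }

void setup() {
  Serial.begin(9600);
}

void loop() {
  float ax = readAccelX(), ay = readAccelY(), az = readAccelZ();

  // Estimate the hand's tilt from the direction of gravity.
  float pitch = atan2(-ax, sqrt(ay * ay + az * az)) * 180.0 / M_PI;
  float roll  = atan2(ay, az) * 180.0 / M_PI;

  // Clamp so an over-enthusiastic wrist can't flip the quadcopter.
  pitch = constrain(pitch, -MAX_ANGLE_DEG, MAX_ANGLE_DEG);
  roll  = constrain(roll,  -MAX_ANGLE_DEG, MAX_ANGLE_DEG);

  // Map hand height above the ultrasonic sensor to a throttle command.
  long throttle = map((long)readHandHeightCm(), 10, 60, 1100, 1900);
  throttle = constrain(throttle, 1100, 1900);

  // In the real controller these setpoints would be radioed to the copter.
  Serial.print(pitch); Serial.print(' ');
  Serial.print(roll);  Serial.print(' ');
  Serial.println(throttle);
  delay(50);
}
```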

[grassjelly] has made all of the code and schematics available via GitHub.


Filed under: Arduino Hacks, drone hacks