Posts with «gesture control» label

Low-Cost Computer Gesture Control with an I2C Sensor

Controlling your computer with a wave of the hand seems like something from science fiction, and for good reason. From Minority Report to Iron Man, we’ve seen plenty of famous actors controlling their high-tech computer systems by wildly gesticulating in the air. Meanwhile, we’re all stuck using keyboards and mice like a bunch of chumps.

But it doesn’t have to be that way. As [Norbert Zare] demonstrates in his latest project, you can actually achieve some fairly impressive gesture control on your computer using a $10 PAJ7620U2 sensor. Well, not just the sensor, of course. You need some way to convert the output from the I2C-enabled sensor into something your computer will understand, which is where the microcontroller comes in.

Looking through the provided source code, you can see just how easy it is to talk to the PAJ7620U2. With nothing more exotic than a switch-case statement, [Norbert] is able to pick up on the gesture flags coming from the sensor. From there, it’s just a matter of using the Arduino Keyboard library to fire off the appropriate keycodes. If you’re looking to recreate this, we’d go with a microcontroller that supports native USB, but technically this could be done on pretty much any Arduino. In fact, in this case he’s using the ATtiny85-based Digispark.
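To give a feel for how little code that takes, here’s a minimal sketch of the flag-to-keycode mapping in plain C++. The flag names, bit values, and key choices below are illustrative guesses, not [Norbert]’s actual code; on the Arduino side, a call like Keyboard.press() would replace the returned character.

```cpp
#include <cstdint>

// Illustrative gesture flag bits, modeled on the PAJ7620U2's
// one-bit-per-gesture interrupt flags (these values are assumptions).
enum GestureFlag : uint16_t {
    GES_RIGHT_FLAG = 1 << 0,
    GES_LEFT_FLAG  = 1 << 1,
    GES_UP_FLAG    = 1 << 2,
    GES_DOWN_FLAG  = 1 << 3,
};

// Map a gesture flag to a (hypothetical) keycode the way the
// write-up describes: one switch-case, one keypress per gesture.
char keyForGesture(uint16_t flag) {
    switch (flag) {
        case GES_RIGHT_FLAG: return 'd';  // e.g. next slide
        case GES_LEFT_FLAG:  return 'a';  // e.g. previous slide
        case GES_UP_FLAG:    return 'w';
        case GES_DOWN_FLAG:  return 's';
        default:             return 0;    // unrecognized gesture
    }
}
```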

This actually isn’t the first time we’ve seen somebody use a similar sensor to pull off low-cost gesture control, but so far, none of these projects have really taken off. It seems like it works well enough in the video after the break, but looks can be deceiving. Have any Hackaday readers actually tried to use one of these modules for their day-to-day futuristic computing?

Groovin’ With a Gesture-Controlled MP3 Player

Touchscreens are great, but they’re not always the perfect solution. Trying to operate one with gloves on (even alleged “touchscreen-friendly” ones) can be cumbersome at best, and if the screen is on a publicly-shared device, such as a checkout kiosk, it can easily become a home for bacteria, viruses, and all sorts of other nasty stuff.

That’s what [Norbert Zare] was thinking when he built his gesture-controlled MP3 player. It uses a PAJ7620U2 gesture sensor to register a few intuitive hand motions, including finger twirls to control the volume, hand swipes to skip forward and backwards, and a flat hand to play and pause the song. It even has a motorized knob and cute cutout music notes that move to provide some visual feedback for the gestures, which you can see in action in the video below. If this seems familiar, it’s because on Tuesday we took a look at the camera-based, glance-to-skip-tracks controller he built.

To actually play some music, he gutted an old MP3 player and hooked the solder pads from the control buttons up to an Arduino, which reads gesture information from the sensor and emulates the MP3 player’s buttons by setting the appropriate pins to HIGH and LOW. Finally, he topped the whole thing off with an LCD screen and a case.
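Button emulation like this boils down to picking which solder pad to pulse for each gesture. Here’s a sketch of that idea in plain C++; the gesture names and pin numbers are hypothetical stand-ins for the build’s actual wiring, and on the Arduino itself the “press” would be a short digitalWrite() pulse on the chosen pin.

```cpp
// Hypothetical wiring: each MP3-player button pad is driven by one
// Arduino pin (these pin numbers are assumptions, not the build's).
enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, FLAT_HAND, UNKNOWN };

constexpr int PIN_PREV = 2;  // "previous track" button pad
constexpr int PIN_NEXT = 3;  // "next track" button pad
constexpr int PIN_PLAY = 4;  // "play/pause" button pad

// Pick which button pad to "press" for a gesture. On the Arduino,
// the press itself is a brief HIGH/LOW pulse on the returned pin.
int pinForGesture(Gesture g) {
    switch (g) {
        case SWIPE_LEFT:  return PIN_PREV;
        case SWIPE_RIGHT: return PIN_NEXT;
        case FLAT_HAND:   return PIN_PLAY;
        default:          return -1;  // no button for this gesture
    }
}
```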

The great thing about [Norbert]’s approach is that it isn’t just limited to an MP3 player — it can be extended to replace the buttons on pretty much any device. Because the Arduino only needs to be connected to the button inputs of the device, it should be relatively easy to adapt most existing tactile interfaces to be touch-free. Paired with this gesture-tracking macro keyboard we saw earlier in the year, the days of actually having to touch our tech may soon be behind us.

3D Printed Gesture-Controlled Robot Arm is a Ton of Tutorials

Ever wanted your own gesture-controlled robot arm? [EbenKouao]’s DIY Arduino Robot Arm project covers all the bases involved, but even if a robot arm isn’t your jam, his project has plenty to learn from. Every part is carefully explained, complete with source code and a list of required hardware. This approach to documenting a project is great because it not only makes it easy to replicate the results, but it makes it simple to remix, modify, and reuse separate pieces as a reference for other work.

[EbenKouao] uses a 3D-printable robotic gripper, base, and arm design as the foundation of his build. Hobby servos and a single NEMA 17 stepper take care of the moving, and the wiring and motor driving is all carefully explained. Gesture control is done by wearing an articulated glove upon which is mounted flex sensors and MPU6050 accelerometers. These sensors detect the wearer’s movements and turn them into motion commands, which in turn get sent wirelessly from the glove to the robotic arm with HC-05 Bluetooth modules. We really dig [EbenKouao]’s idea of mounting the glove sensors to this slick 3D-printed articulated gauntlet frame, but using a regular glove would work, too. The latest version of the Arduino code can be found on the project’s GitHub repository.
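The core of turning a flex sensor into a servo command is a linear remap of the ADC reading, the same job Arduino’s map() does. Here’s that math re-implemented in plain C++ as a rough sketch; the 200–800 ADC range is an assumed calibration for illustration, not a value from [EbenKouao]’s code, and real sensors need calibrating per glove.

```cpp
// Arduino's integer map() re-implemented in plain C++: linearly
// rescale x from [inMin, inMax] to [outMin, outMax].
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Convert a flex-sensor ADC reading into a servo angle in degrees.
// The 200-800 input range is an assumption; calibrate per sensor.
int flexToServoAngle(int adc) {
    if (adc < 200) adc = 200;  // clamp to the calibrated range
    if (adc > 800) adc = 800;
    return (int)mapRange(adc, 200, 800, 0, 180);
}
```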

Most of the parts can be 3D printed, how every part works together is carefully explained, and all of the hardware is easily sourced online, making this a very accessible project. Check out the full tutorial video and demonstration, embedded below.

3D printing has been a boon for many projects, especially those involving robotic arms. All kinds of robotic arm projects benefit from the advantages of 3D printing, from designs that focus on utility and function, to clever mechanical designs that reduce part count in unexpected ways.

GesturePod is a clip-on smartphone interface for the visually impaired

Smartphones have become a part of our day-to-day lives, but for those with visual impairments, accessing one can be a challenge. This can be especially difficult if one is using a cane that must be put aside in order to interact with a phone.

The GesturePod offers an alternative interface that attaches to the cane itself. This small unit is controlled by an MKR1000 and uses an IMU to sense hand gestures applied to the cane.

If a user, for instance, taps twice on the ground, a corresponding request is sent to the phone over Bluetooth, causing it to output the time audibly. Five gestures are currently proposed, which could be expanded upon or modified for different functionality as needed.
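To give a feel for the sensing problem, here’s a toy double-tap detector over IMU acceleration-magnitude samples in plain C++. It’s purely illustrative: the threshold and window are guesses, and the actual GesturePod uses a trained machine learning model rather than a hand-written heuristic like this.

```cpp
#include <vector>

// Toy double-tap detector: report true if two sharp acceleration
// spikes occur within a short window of samples. Threshold and
// window values are illustrative guesses, not GesturePod's.
bool isDoubleTap(const std::vector<float>& accelMag,
                 float threshold = 2.5f, int windowSamples = 50) {
    int firstTap = -1;
    bool above = false;  // debounce so each spike counts once
    for (int i = 0; i < (int)accelMag.size(); ++i) {
        if (accelMag[i] > threshold && !above) {
            above = true;
            if (firstTap >= 0 && i - firstTap <= windowSamples) return true;
            firstTap = i;
        } else if (accelMag[i] <= threshold) {
            above = false;
        }
    }
    return false;
}
```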

People using white canes for navigation find it challenging to concurrently access devices such as smartphones. Building on prior research on abandonment of specialized devices, we explore a new touch free mode of interaction wherein a person with visual impairment can perform gestures on their existing white cane to trigger tasks on their smartphone. We present GesturePod, an easy-to-integrate device that clips on to any white cane, and detects gestures performed with the cane. With GesturePod, a user can perform common tasks on their smartphone without touch or even removing the phone from their pocket or bag. We discuss the challenges in building the device and our design choices. We propose a novel, efficient machine learning pipeline to train and deploy the gesture recognition model. Our in-lab study shows that GesturePod achieves 92% gesture recognition accuracy and can help perform common smartphone tasks faster. Our in-wild study suggests that GesturePod is a promising tool to improve smartphone access for people with VI, especially in constrained outdoor scenarios.

Gesture Control for Lunch Money

[Dimitris Platis] wanted to add gesture control to his PC. You’d think that would be expensive, but by combining a diminutive Arduino, a breakout board with a gesture controller, and an interconnect PCB, he managed to pull it off for about $7. That doesn’t include the optional 3D-printed case and we think you could omit the interconnect board if you don’t mind some wires and further cut costs. [Dimitris] calls it Nevma, and you can see how the device works in the video below.

The heart of the project is a sensor that measures light and motion. The chip and the breakout board are just a couple of bucks if you order them from China. You can find them in the US if you don’t mind spending a little bit more. The device has an I2C interface, and [Dimitris] uses a tiny SS Micro for the USB interface and the CPU.

The sensor chip is made for the mobile phone market and can also sense proximity. From its data sheet:

Gesture detection utilizes four directional photodiodes to sense reflected IR energy… The architecture of the gesture engine features automatic activation (based on proximity engine results), ambient light subtraction, cross-talk cancellation, dual 8-bit data converters, power saving inter-conversion delay, 32-dataset FIFO, and interrupt-driven I2C communications.
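The four-photodiode trick is easier to picture with a toy example: whichever axis shows the bigger imbalance between opposite diodes wins. This plain C++ sketch is a drastic simplification of what the chip’s gesture engine actually does (FIFO datasets, crosstalk cancellation, and so on), and the direction naming is an arbitrary choice for illustration.

```cpp
#include <cstdlib>

// Toy swipe classifier over four directional photodiode readings
// (Up, Down, Left, Right): the axis with the larger imbalance wins.
// A deadband filters out noise when no hand is present.
enum class Swipe { Up, Down, Left, Right, None };

Swipe classifySwipe(int u, int d, int l, int r, int deadband = 10) {
    int vert = u - d;  // positive: more IR returned to the Up diode
    int horz = l - r;  // positive: more IR returned to the Left diode
    if (std::abs(vert) < deadband && std::abs(horz) < deadband)
        return Swipe::None;
    if (std::abs(vert) >= std::abs(horz))
        return vert > 0 ? Swipe::Up : Swipe::Down;
    return horz > 0 ? Swipe::Left : Swipe::Right;
}
```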

That seems like a lot of power for a few bucks. Sparkfun has a library (and a matching board) and [Dimitris] uses it. The library is released as beerware. In particular, the documentation says: “The code is beerware; if you see me (or any other SparkFun employee) at the local, and you’ve found our code helpful, please buy us a round!”

We really like Nevma. You don’t have to hold any device in your hand. It also looks slicker than the solutions we’ve seen (and even created) using SONAR.



Create a gesture control unit for your PC using Skywriter and Arduino

While keyboards are great, and custom shortcuts can make things even better, why not do away with buttons and knobs altogether, controlling your computer instead via simple gestures? Maker Ben James has done just this, creating a unique interface using a Skywriter device to pick up finger movements, along with an Arduino Leonardo to emulate a keyboard on his laptop.

Since the Skywriter can detect a number of gestures, James assigned various swipes, taps and circular motions to keyboard commands. As you can see in the video here, the results are pretty neat. 
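Circular motions are the interesting case: the sensor reports continuous rotation, but a keyboard shortcut is discrete, so you accumulate rotation and emit one command per increment. Here’s a minimal sketch of that idea in plain C++; the 90-degrees-per-step figure is an arbitrary choice for illustration, not taken from James’s code.

```cpp
// Turn a continuous circular gesture into discrete volume steps:
// accumulate rotation and emit one step per full 90-degree increment
// (the step size here is an arbitrary illustrative choice).
struct AirwheelVolume {
    int accumulatedDeg = 0;

    // Feed a rotation delta in degrees (positive = clockwise);
    // returns how many volume steps to apply (signed).
    int update(int deltaDeg) {
        accumulatedDeg += deltaDeg;
        int steps = accumulatedDeg / 90;
        accumulatedDeg -= steps * 90;  // keep the remainder
        return steps;
    }
};
```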

More info on this project can be found on his blog post, and its code is available on GitHub.

Computer gesture control via webcam and Arduino

While touchscreens are nice, wouldn’t it be even better if you could simply wave your hand at your computer to get it to do what you want? That’s the idea behind this Iron Man-inspired gesture control device by B. Aswinth Raj.

The DIY system uses an Arduino Nano mounted to a disposable glove, along with Hall effect sensors, a magnet attached to the thumb, and a Bluetooth module. This smart glove uses the finger-mounted sensors as left and right mouse buttons, and has a blue circle in the middle of the palm that the computer can track via a webcam and a Processing sketch to generate a cursor position.
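The tracking idea in miniature: scan a frame for “blue enough” pixels and use their centroid as the cursor position. The real project does this in a Processing sketch on webcam frames; this plain C++ version is a hedged sketch with an arbitrary color threshold standing in for whatever calibration the build actually uses.

```cpp
#include <vector>
#include <utility>

struct Pixel { int r, g, b; };

// Find the centroid of strongly blue pixels in an RGB frame, as a
// stand-in for tracking the blue palm marker. The threshold values
// are illustrative. Returns {-1, -1} when no blue marker is found.
std::pair<int,int> blueCentroid(const std::vector<std::vector<Pixel>>& frame) {
    long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < (int)frame.size(); ++y)
        for (int x = 0; x < (int)frame[y].size(); ++x) {
            const Pixel& p = frame[y][x];
            if (p.b > 150 && p.b > p.r + 50 && p.b > p.g + 50) {
                sumX += x; sumY += y; ++count;
            }
        }
    if (count == 0) return { -1, -1 };
    return { (int)(sumX / count), (int)(sumY / count) };
}
```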

You can see it demonstrated in the video below, drawing a stick man literally by hand, and also controlling an LED on the Nano. Check out this write-up for code and more info on the build!

Arduino Blog 31 May 21:41

These Makers built a gesture-controlled robotic arm

Using a Kinect sensor with MATLAB/Simulink and an Arduino, B. Avinash and J. Karthikeyan made a robotic arm to mimic their every move.

If you need a robotic arm to follow your movements, the Kinect sensor is a great place to start. On the other hand, it’s a long leap programming-wise to go from sensor input to coordinated movement of servo motors. Through a toolchain stretching from the sensor itself, to a computer, and finally to an Arduino Mega controlling the servos directly, Avinash and Karthikeyan did just that.

For their process, the computer takes data from the Kinect sensor, then translates it into servo angles using MATLAB and Simulink. The resulting data is then fed to the Arduino over a serial connection, which controls the robot’s movements with a slight delay.
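The serial hop needs some framing so the Arduino knows where a set of angles begins. Here’s one way to pack servo angles into bytes, sketched in plain C++; the 0xFF start marker and one-byte-per-joint layout are assumptions for illustration, not the build’s actual protocol.

```cpp
#include <vector>
#include <cstdint>

// Pack servo angles for a serial link: a start-of-frame marker byte
// followed by one clamped angle byte per joint. The 0xFF marker and
// the layout are illustrative assumptions, not the build's protocol.
std::vector<uint8_t> packAngles(const std::vector<int>& anglesDeg) {
    std::vector<uint8_t> frame;
    frame.push_back(0xFF);  // start-of-frame marker
    for (int a : anglesDeg) {
        if (a < 0)   a = 0;    // hobby servos accept 0-180 degrees
        if (a > 180) a = 180;
        frame.push_back((uint8_t)a);
    }
    return frame;
}
```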

Be sure to check out the project’s Instructables page to learn more about this awesome build!

Woven's wearable platform for gaming, cool points and a whole lot more (video)

TshirtOS showed us one take on wearable gadgetry earlier this month, and now it's Woven's turn. This particular e-garment packs quite the selection of hardware, as you can see above -- a trio of LilyPad Arduino boards (and some custom ones), a Bluetooth module, 12 x 12 RGB LED "screen", speakers, bend sensors, a heart rate monitor, shake motors and a power pack. You'll need to accessorize, of course, with a smartphone for hardware harmony and to run companion apps. So what's it for, you ask? Well, the creators are touting it primarily as a "pervasive" gaming platform, and even seem to have a working first title in the form of SPOOKY (think gesture-based ghost-fighting). Other uses (which appear a little more conceptual) see Woven as a workout companion, TV remote, Wii controller, social network alerter or simply a fashion accessory. Check out the videos below to see it in action and imagine all the fun you could have in the five minutes before you're ushered into that padded room.



Woven's wearable platform for gaming, cool points and a whole lot more (video) originally appeared on Engadget on Fri, 31 Aug 2012 05:36:00 EST. Please see our terms for use of feeds.
