Posts with «due» label

Communicate using your ear with Orecchio

When conversing face-to-face, a wide range of emotions and inflections is conveyed by our facial and body expressions. But what if you can’t express emotion this way, whether due to a physical impairment, or simply because a covering—like a dust mask—temporarily hides your beautiful smile while your hands are otherwise occupied?

As a solution to this dilemma, a team of researchers has been working on Orecchio, a robotic device that attaches to the ear and bends it to convey emotion. Three motors allow the ear to be bent into 22 distinct poses and movements, indicating 16 emotional states. Control is accomplished via an Arduino Due, linked to a Windows computer running a C# program.

The prototype was implemented using off-the-shelf electronic components, miniature motors, and custom-made robotic arms. The device has a micro gear motor mounted on the bottom of a 3D-printed ear hook loop clip. The motor drives a plastic arm against the side of the helix, bending it toward the center of the ear; rotating the arm back to its rest position lets the helix return to its original form. Near the top of the earpiece, another motor drives a one-joint robotic arm attached to the top of the helix with a round ear clip. Rotating this motor extends the arm from its resting position, bending the top of the helix down toward the center of the ear. The motor and one-joint arm are mounted on a linear track that can be moved vertically through a rack-and-pinion mechanism driven by a third motor; moving the rack upward stretches the helix.
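
To make the control flow concrete, here is a minimal, hypothetical Due sketch that receives a one-byte emotion code from the C# host and moves each of the three actuators to a stored pose. The pin numbers, pose angles, and the use of hobby servos in place of the project’s micro gear motors are all illustrative assumptions, not the team’s actual firmware.

    #include <Servo.h>

    // Stand-ins for the three actuators: side bender, top folder, vertical stretcher.
    Servo sideBend, topFold, stretcher;

    // 16 emotion codes -> three target angles each. Only two placeholder poses
    // are shown; the remaining rows default to the rest position.
    const uint8_t POSES[16][3] = {
      { 90, 90,  0 },
      { 40, 120, 10 },
    };

    void setup() {
      Serial.begin(115200);  // link to the C# program on the Windows host
      sideBend.attach(3);
      topFold.attach(5);
      stretcher.attach(6);
    }

    void loop() {
      if (Serial.available()) {
        int code = Serial.read();  // one byte per emotional state
        if (code >= 0 && code < 16) {
          sideBend.write(POSES[code][0]);
          topFold.write(POSES[code][1]);
          stretcher.write(POSES[code][2]);
        }
      }
    }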

The prototype is demonstrated in the video below, and more info is available in the project’s research paper.

DualPanto is a non-visual gaming interface for the blind

While there are tools that allow the visually impaired to interact with computers, conveying spatial relationships, such as those needed for gaming, is certainly a challenge. To address this, researchers have come up with DualPanto.

As the name implies, the system uses two pantographs for location I/O, and on the end of each is a handle that rotates to indicate direction. One pantograph acts as an output to indicate where an object is located, while the other acts as the player’s input interface. One device is positioned above the other, so the relative position of the two handles in the plane can be gleaned.

The game’s software runs on a MacBook Pro, and an Arduino Due is used to interface the physical hardware with this setup. 
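
As a rough illustration of the Due’s bridging role, the sketch below reports the ‘me’ handle’s joint angles to the host and applies motor commands sent back for the ‘it’ handle. The pins, the CSV-style protocol, and the use of potentiometers as joint sensors are assumptions made for the example, not details from the actual project.

    // Hypothetical joint sensors on the "me" pantograph and PWM outputs
    // driving the "it" pantograph's motors.
    const int ME_JOINT1 = A0, ME_JOINT2 = A1;
    const int IT_MOTOR1 = 9,  IT_MOTOR2 = 10;

    void setup() {
      Serial.begin(115200);  // link to the game running on the MacBook Pro
    }

    void loop() {
      // Send the player's handle position (raw joint readings) upstream.
      Serial.print(analogRead(ME_JOINT1));
      Serial.print(',');
      Serial.println(analogRead(ME_JOINT2));

      // The game replies with two PWM levels that steer the "it" handle.
      if (Serial.available() >= 2) {
        analogWrite(IT_MOTOR1, Serial.read());
        analogWrite(IT_MOTOR2, Serial.read());
      }
      delay(10);  // ~100 Hz update rate for the demo
    }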

DualPanto is a haptic device that enables blind users to track moving objects while acting in a virtual world.

The device features two handles. Users interact with DualPanto by actively moving the ‘me’ handle with one hand and passively holding on to the ‘it’ handle with the other. DualPanto applications generally use the me handle to represent the user’s avatar in the virtual world and the it handle to represent some other moving entity, such as the opponent in a soccer game.

Be sure to check it out in the video below, or read the full research paper here.

Single-handed smartwatch text entry with WrisText

Smartwatches can keep us informed of incoming information at a glance, but responding still requires the use of another hand, which may be occupied by other tasks. Researchers at Dartmouth College are trying to change that with their new WrisText system.

The device divides the outer edge of a Ticwatch 2 into six sections of letters, selected by the movement of one’s wrist. As letters are chosen, possible words are displayed on the screen and are then selected automatically, or by rubbing and tapping gestures between one’s finger and thumb.

The prototype employs an Arduino Due to pass information to a computer, along with proximity and piezo sensors to detect hand and finger movements.

We present WrisText – a one-handed text entry technique for smartwatches using the joystick-like motion of the wrist. A user enters text by whirling the wrist of the watch hand towards six directions, each representing a key in a circular keyboard where the letters are distributed in alphabetical order. The design of WrisText was an iterative process, where we first conducted a study to investigate optimal key size, and found that keys needed to be 55° or wider to achieve over 90% striking accuracy. We then computed an optimal keyboard layout, considering a joint optimization problem of striking accuracy, striking comfort, and word disambiguation. We evaluated the performance of WrisText through a five-day study with 10 participants in two text entry scenarios: hand-up and hand-down. On average, participants achieved a text entry speed of 9.9 WPM across all sessions, and were able to type as fast as 15.2 WPM by the end of the last day.
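
To make the six-key scheme concrete, here is a toy sketch that quantizes a wrist-direction angle into one of six 60° sectors and prints that key’s candidate letters. The alphabetical letter grouping and the idea of receiving an already-computed angle over serial are illustrative assumptions; the real system derives the wrist direction from its proximity and piezo sensors.

    // Hypothetical alphabetical split of the circular keyboard into six keys.
    const char* KEYS[6] = { "abcd", "efgh", "ijklm", "nopq", "rstuv", "wxyz" };

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      if (Serial.available()) {
        int deg = Serial.parseInt();                  // stand-in for the sensed wrist angle
        int sector = ((deg % 360) + 360) % 360 / 60;  // six 60-degree keys
        Serial.println(KEYS[sector]);                 // letters on the struck key
      }
    }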

More information can be found in the project’s research paper, or you can see it demonstrated in the video below.

A HID For Robots

Whether with projects featured here or out in the real world, we have a tendency to focus on the end product: the car, the solar panel, or even the robot. But there’s a lot more going on behind the scenes that needs to be taken care of as well, whether it’s fuel infrastructure to keep the car running, a semiconductor manufacturer to create silicon wafers, or a control system for the robot. This project is one of the latter: a human interface device for a robot arm that is completely DIY.

While robots are often automated, some still need human input, either all the time or initially, to teach the robot how to perform a task that will then be automated. This “keyboard” of sorts built by [Ahmed] comes with a joystick, a potentiometer, and four switch inputs that are all fully programmable via an Arduino Due. With that, you can perform virtually any action with whatever type of robot you need, and since it’s based on an Arduino, it would also be easy to expand.
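
A minimal sketch of how such a controller might read its inputs and report them over serial is shown below. The pin assignments and the simple CSV protocol are assumptions for illustration, not [Ahmed]’s actual firmware.

    // Hypothetical wiring: joystick on two analog pins, potentiometer on a
    // third, and four switches on digital pins using the internal pull-ups.
    const int JOY_X = A0, JOY_Y = A1, POT = A2;
    const int SWITCHES[4] = { 2, 3, 4, 5 };

    void setup() {
      Serial.begin(115200);
      for (int i = 0; i < 4; i++) pinMode(SWITCHES[i], INPUT_PULLUP);
    }

    void loop() {
      // One CSV line per update: joyX,joyY,pot,sw1,sw2,sw3,sw4
      Serial.print(analogRead(JOY_X)); Serial.print(',');
      Serial.print(analogRead(JOY_Y)); Serial.print(',');
      Serial.print(analogRead(POT));
      for (int i = 0; i < 4; i++) {
        Serial.print(',');
        Serial.print(digitalRead(SWITCHES[i]) == LOW ? 1 : 0);  // pressed = 1
      }
      Serial.println();
      delay(20);
    }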

The video below and the project page have all the instructions and a bill of materials if you want to roll your own. It’s a pretty straightforward project, but one worth checking out, since we don’t often feature controllers like this, although we do sometimes see them for controlling telescopes rather than robots.

Scribble is an Arduino-controlled haptic drawing robot

As part of his master’s studies at Eindhoven University, Felix Ros created a haptic drawing interface that uses a five-bar linkage system to not only take input from one’s finger, but also act as a feedback device via a pair of rotary outputs.

“Scribble” uses an Arduino Due to communicate with a computer running software written in openFrameworks.

For over a century we have been driving cars, enabling us to roam our surroundings with little effort. Now, with the introduction of automated driving, machines will become our chauffeurs. But what about getting around road construction, or finding a friend in a crowded area? Or what if you just want to explore and find new places? Will these cars be able to handle such situations, and how can you show your intentions?

Currently there is no middle ground between the car taking the wheel and the driver doing so; this is where Scribble comes in: a haptic interface that lets you draw your way through traffic. You draw a path and the car will follow; you are not driving the car but piloting it. Scribble lets you help your car when it is in need, and lets you wander your surroundings once again.

You can learn more about Ros’ design in his write-up here, including the code needed to compute forward kinematics to set the X/Y position, and inverse kinematics to sense user input.
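
As a sketch of what the forward-kinematics step involves, the function below computes the pen position of a symmetric five-bar linkage from its two motor angles by intersecting the two distal-link circles. The link lengths, base spacing, and choice of intersection branch are illustrative assumptions, not values from Ros’ code.

    #include <math.h>

    // Hypothetical geometry: motors BASE mm apart, proximal links L1 mm,
    // distal links L2 mm.
    const float L1 = 70.0f, L2 = 100.0f, BASE = 60.0f;

    // Returns false if the given motor angles put the pen out of reach.
    bool fiveBarFK(float t1, float t2, float &x, float &y) {
      // Elbow joints driven directly by the two motors (angles in radians).
      float e1x = L1 * cosf(t1),        e1y = L1 * sinf(t1);
      float e2x = BASE + L1 * cosf(t2), e2y = L1 * sinf(t2);

      // The pen joint lies where circles of radius L2 around each elbow meet.
      float dx = e2x - e1x, dy = e2y - e1y;
      float d = sqrtf(dx * dx + dy * dy);
      if (d > 2.0f * L2 || d <= 0.0f) return false;

      float h = sqrtf(L2 * L2 - (d / 2.0f) * (d / 2.0f));
      // Midpoint between the elbows, offset perpendicular to the elbow line;
      // the sign selects the mirror solution matching the linkage assembly.
      x = (e1x + e2x) / 2.0f - h * (dy / d);
      y = (e1y + e2y) / 2.0f + h * (dx / d);
      return true;
    }

    void setup() {
      Serial.begin(115200);
      float x, y;
      if (fiveBarFK(2.0f, 1.1f, x, y)) {  // quick sanity check
        Serial.print(x); Serial.print(','); Serial.println(y);
      }
    }

    void loop() {}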

Be sure to check it out in the video below, where it pilots a virtual car through traffic with ease!

Integrating a Nintendo Power Glove with today’s VR technology

When the Power Glove was released in 1989, the idea that you could control games with hand motions was incredible, but like the Virtual Boy that followed years later, the hardware of the day just couldn’t keep up. Today, hardware has finally gotten to the point where this type of interface could be very useful, so Teague Labs decided to integrate a Power Glove with an HTC Vive VR headset.

While still under development, the glove’s finger sensors have shown great promise for interactions with virtual touchscreen devices, and the team has even come up with a game where you have to counter rock, paper, and scissors with the correct gesture.

Making this all possible is the Arduino Due, which supports the library used to communicate with the Vive tracker.

We took a Power Glove apart, 3D scanned the interfacing plastic parts and built modified parts that hold the Vive Tracker and an Arduino Due on the glove. After some prototyping on a breadboard, we designed a shield for the Due and etched it using the laser-cutter transfer technique. We then soldered all components and spray-painted the whole shield to protect the bare copper. After mounting the tracker and tweaking the code by matzmann666, we had the glove working.
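
For a rough idea of the glove side of this, here is a hypothetical sketch that reads the Power Glove’s resistive finger sensors through voltage dividers and classifies the rock/paper/scissors gestures. The pins, threshold, and gesture rules are guesses for illustration only, not the team’s actual code.

    // Assumed wiring: each flex sensor forms a voltage divider into an analog pin.
    const int FINGERS[4] = { A0, A1, A2, A3 };  // thumb, index, middle, ring

    bool bent(int pin) {
      return analogRead(pin) > 600;  // threshold depends on the divider resistor
    }

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      bool indexBent  = bent(FINGERS[1]);
      bool middleBent = bent(FINGERS[2]);
      bool ringBent   = bent(FINGERS[3]);

      if (indexBent && middleBent && ringBent)         Serial.println("rock");     // fist
      else if (!indexBent && !middleBent && !ringBent) Serial.println("paper");    // open hand
      else if (!indexBent && !middleBent && ringBent)  Serial.println("scissors"); // two fingers out
      delay(100);
    }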

If you’d like to see the details of what has been accomplished so far, check out the Teague Labs team’s design files and code on GitHub.

Project Aslan is a 3D-printed robotic sign language translator

Given the lack of people in Belgium capable of turning written or spoken words into sign language, University of Antwerp master’s students Guy Fierens, Stijn Huys, and Jasper Slaets have decided to do something about it. They built a robot known as Aslan, or Antwerp’s Sign Language Actuating Node, that can translate text into finger-spelled letters and numbers.

Project Aslan, now in the form of a single robotic arm and hand, is made from 25 3D-printed parts and uses an Arduino Due, 16 servos, and three motor controllers. Because it is 3D-printed and built from readily available components, the low-cost design can be produced locally.

The robot works by receiving information over a local network, checking for updated sign languages from all over the world. Users connected to the network can send messages, which then activate the hand, elbow, and finger joints to sign them out.
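
A simplified sketch of how incoming text could be mapped to servo poses might look like the following. The pin numbers, the pose table (only two letters shown), and the use of serial input in place of the network layer are illustrative stand-ins, not the project’s actual firmware.

    #include <Servo.h>

    Servo joints[16];  // one entry per servo in the arm and hand

    // Hypothetical fingerspelling poses: 16 joint angles per letter.
    struct Pose { char letter; uint8_t angle[16]; };
    const Pose POSES[] = {
      { 'a', { 10, 170, 170, 170, 170, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90 } },
      { 'b', { 170, 10, 10, 10, 10, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90 } },
    };

    void setup() {
      Serial.begin(115200);
      for (int i = 0; i < 16; i++) joints[i].attach(22 + i);  // hypothetical pins
    }

    void loop() {
      if (Serial.available()) {
        char c = tolower(Serial.read());
        for (const Pose &p : POSES) {  // look up the letter's hand shape
          if (p.letter == c) {
            for (int i = 0; i < 16; i++) joints[i].write(p.angle[i]);
            delay(600);  // hold the letter briefly before the next one
          }
        }
      }
    }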

Although it is one arm for now, work will continue with future master’s students, focusing on expanding to a two-arm design, implementing a face, and even integrating a webcam into the system. For more info, you can visit the project’s website here as well as its write-up on 3D Hubs.

Ball-on-plate machine uses touchscreen position sensing

Redditor “xmajor9x” has spent several weeks building a three-legged machine to balance a metal ball on top of a plate. The device uses three servos attached to a rectangular surface with linkages that translate servo position into linear displacement of the table. This allows it to keep the ball centered, or rotate around the perimeter in a circle or square pattern.

An Arduino Due controls the ball using a PID loop, and the ball’s position is sensed not by an external camera, but by the top “plate,” which is actually a resistive touchscreen. Although this adds a unique element, it means the ball on top must be quite heavy to be reliably tracked, and its creator is considering switching to a computer vision system in the future.
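
To illustrate the sensing-plus-PID idea for a single axis, here is a minimal sketch that uses the standard four-wire resistive-touchscreen technique to read the ball’s X position and a textbook PID update to drive one servo. The pins, gains, and one-axis simplification are assumptions; the real build coordinates three servos through its linkage geometry.

    #include <Servo.h>

    // Hypothetical four-wire touchscreen connections and one of the servos.
    const int XP = A0, XM = A1, YP = A2;
    Servo legServo;

    float kp = 0.08, ki = 0.01, kd = 0.04;  // made-up gains; tune on hardware
    float integral = 0, prevError = 0;

    int readTouchX() {
      // Drive a voltage gradient across the X plates, read position on a Y plate.
      pinMode(XP, OUTPUT); digitalWrite(XP, HIGH);
      pinMode(XM, OUTPUT); digitalWrite(XM, LOW);
      pinMode(YP, INPUT);
      return analogRead(YP);  // 0..1023 maps to the ball's X position
    }

    void setup() {
      legServo.attach(9);
    }

    void loop() {
      const float dt = 0.02;             // 50 Hz control loop
      float error = 512 - readTouchX();  // setpoint: center of the plate
      integral += error * dt;
      float derivative = (error - prevError) / dt;
      prevError = error;

      float out = kp * error + ki * integral + kd * derivative;
      legServo.write(constrain(90 + out, 60, 120));  // tilt around level
      delay(20);
    }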

Be sure to check out the project’s GitHub page for code and more info on the build!

Building an Arduino-controlled single-pixel scanner

If you’ve seen color sensors such as the TCS34725, you may have considered them for projects that need to pick out one colored object over another. On the other hand, if you were to take one of these sensors, mount it to an Arduino-driven plotter, and then take readings in an X/Y plane, you’d have all the elements needed for a simple single-pixel scanner.

In the video below, Kerry D. Wong does just this, using his hacked HP 7044A plotter to scan a picture, recording RGB color values in a 128 x 128 grid. As the device scans, the Arduino Due used for control passes these values to a computer, which assembles them into a low-resolution image.
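
A stripped-down sketch of the color-sampling side might look like the following, using Adafruit’s TCS34725 library to return one raw RGB reading per request over serial. The integration time, gain, and one-byte-per-sample protocol are assumptions, and the plotter motion is handled separately in Wong’s build.

    #include <Wire.h>
    #include "Adafruit_TCS34725.h"

    // Sensor configured with an assumed integration time and gain.
    Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);

    void setup() {
      Serial.begin(115200);
      if (!tcs.begin()) {
        Serial.println("TCS34725 not found");
        while (1);  // halt if the sensor is missing
      }
    }

    void loop() {
      // The host sends any byte once the plotter has settled on a grid point.
      if (Serial.available()) {
        Serial.read();
        uint16_t r, g, b, c;
        tcs.getRawData(&r, &g, &b, &c);  // one sample for the current pixel
        Serial.print(r); Serial.print(',');
        Serial.print(g); Serial.print(',');
        Serial.println(b);
      }
    }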

You can find more details on the project, including its code, in Wong’s blog post here.

Smartwatch convenience ‘moves’ to the next level

To address the limitations of today’s fixed-face watches, researchers have come up with an actuated smartwatch concept that physically moves itself using an Arduino Due, Bluetooth, and several motors.

Receiving Internet notifications has gone from using a computer, to checking them on your smartphone, to now simply seeing them come in on your wearable device. Even so, you still have to rotate your wrist into the right position to see the screen. Worse yet, if you want to show others what is on your wrist, you may even have to twist your arm awkwardly.

Fortunately, there is a possible solution to this scourge in the form of Cito, which bills itself as “An Actuated Smartwatch for Extended Interactions.” The design can move in five different ways (rotating, hinging, translating, orbiting, and rising), potentially making viewing more convenient, or even providing haptic feedback. Prototype electronics are housed inside a control box on the upper arm, but would presumably become much smaller in a production version.
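
As a toy illustration of how five motion primitives might be commanded, the hypothetical sketch below maps single-character commands from a serial Bluetooth module (assumed to sit on the Due’s Serial1) to one of five actuation routines. Every pin, command letter, and target angle here is an assumption; the paper’s actual control scheme is not documented in this post.

    #include <Servo.h>

    // One stand-in servo per motion primitive; the real device uses several motors.
    Servo rotator, hinge, translator, orbiter, riser;

    void setup() {
      Serial1.begin(9600);  // assumed Bluetooth module on the Due's Serial1
      rotator.attach(3);
      hinge.attach(5);
      translator.attach(6);
      orbiter.attach(9);
      riser.attach(10);
    }

    void loop() {
      if (Serial1.available()) {
        switch (Serial1.read()) {          // one letter per motion primitive
          case 'r': rotator.write(150);    break;  // rotate the face
          case 'h': hinge.write(120);      break;  // hinge up for viewing
          case 't': translator.write(45);  break;  // translate along the band
          case 'o': orbiter.write(135);    break;  // orbit around the wrist
          case 'u': riser.write(60);       break;  // rise for haptic feedback
        }
      }
    }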

You can see the team’s entire paper here, or read this write-up for a more involved summary.

Photo: Jun Gong