Posts with «assistive technology» label

Upgrading a ride-on car to a joystick-controlled assistive device

Child-sized wheelchairs can be difficult to come by, and unfortunately aren’t as much fun as something like a ride-on car. The South Eugene Robotics Team, or FRC2521, decided to address both challenges by building a mini Jeep augmented for kids with limited mobility.

Instructions found here detail how to modify the battery-powered toy, including which parts can be recycled and which extra parts need to be purchased. In the new configuration, the Jeep’s two rear motors provide differential steering, driven by an Arduino Nano through a pair of electronic speed controllers (ESCs).

In this project, a joystick replaces the original pedal and steering wheel, and it looks like a lot of fun when implemented in the similarly outfitted firetruck below.
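
The post doesn’t reproduce the firmware, but the heart of such a conversion is the joystick-to-motor mixing. Below is a minimal sketch of that logic, assuming the joystick axes land on A0/A1 and the ESCs accept standard RC pulses of 1000 to 2000 microseconds; every pin assignment here is illustrative rather than taken from the build.

#include <Servo.h>

Servo leftEsc;
Servo rightEsc;

void setup() {
  leftEsc.attach(9);    // left rear motor's ESC (illustrative pin)
  rightEsc.attach(10);  // right rear motor's ESC (illustrative pin)
}

void loop() {
  // Map each joystick axis from 0..1023 to -1.0..1.0 around its center.
  float x = (analogRead(A0) - 512) / 512.0;  // steering
  float y = (analogRead(A1) - 512) / 512.0;  // throttle

  // Arcade-drive mix: throttle plus/minus steering, clamped to [-1, 1].
  float left  = constrain(y + x, -1.0, 1.0);
  float right = constrain(y - x, -1.0, 1.0);

  // 1500 us is stop; 1000/2000 us are full reverse/forward on typical ESCs.
  leftEsc.writeMicroseconds(1500 + (int)(left * 500));
  rightEsc.writeMicroseconds(1500 + (int)(right * 500));

  delay(20);  // ~50 Hz update rate, matching standard RC servo timing
}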

GesturePod is a clip-on smartphone interface for the visually impaired

Smartphones have become a part of our day-to-day lives, but for those with visual impairments, accessing one can be a challenge. This can be especially difficult if one is using a cane that must be put aside in order to interact with a phone.

The GesturePod offers an alternative interface that attaches to the cane itself. This small unit is controlled by an MKR1000 and uses an IMU to sense hand gestures applied to the cane.

If a user, for instance, taps twice on the ground, a corresponding request is sent to the phone over Bluetooth, causing it to announce the time audibly. Five gestures are currently proposed, which could be expanded upon or modified for different functionality as needed.
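
GesturePod’s real recognition uses a trained machine learning model (see the abstract below), but the sensing side can be illustrated with a much cruder threshold-based double-tap detector. In this sketch the IMU read and the phone link are stubbed out, and the threshold and timing values are placeholders rather than figures from the project.

const float TAP_THRESHOLD = 2.5;       // g; spike indicating a tap on the ground
const unsigned long MAX_GAP_MS = 400;  // max time between the two taps
const unsigned long DEBOUNCE_MS = 80;  // ignore ringing after a tap

unsigned long lastTap = 0;

// Stub: replace with a real read from the cane's IMU, returning |a| in g.
float readAccelMagnitude() {
  return 0.0;
}

// Stub: replace with the actual Bluetooth link to the phone app.
void sendToPhone(const char* msg) {
  Serial.println(msg);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (readAccelMagnitude() > TAP_THRESHOLD) {
    unsigned long now = millis();
    if (lastTap != 0 && now - lastTap < MAX_GAP_MS) {
      sendToPhone("DOUBLE_TAP");  // e.g. the phone speaks the current time
      lastTap = 0;
    } else {
      lastTap = now;              // first tap: start the gap timer
    }
    delay(DEBOUNCE_MS);           // crude debounce so one tap isn't counted twice
  }
}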

People using white canes for navigation find it challenging to concurrently access devices such as smartphones. Building on prior research on abandonment of specialized devices, we explore a new touch-free mode of interaction wherein a person with visual impairment can perform gestures on their existing white cane to trigger tasks on their smartphone. We present GesturePod, an easy-to-integrate device that clips on to any white cane, and detects gestures performed with the cane. With GesturePod, a user can perform common tasks on their smartphone without touch or even removing the phone from their pocket or bag. We discuss the challenges in building the device and our design choices. We propose a novel, efficient machine learning pipeline to train and deploy the gesture recognition model. Our in-lab study shows that GesturePod achieves 92% gesture recognition accuracy and can help perform common smartphone tasks faster. Our in-wild study suggests that GesturePod is a promising tool to improve smartphone access for people with VI, especially in constrained outdoor scenarios.

Voice Controlled Camera for Journalist in Need

Before going into the journalism program at Centennial College in Toronto, [Carolyn Pioro] was a trapeze performer. Unfortunately, a mishap in 2005 ended her career as an aerialist when she severed her spinal cord, leaving her paralyzed from the shoulders down. There are plenty of options in the realm of speech-to-text technology that enable her to write on the computer, but when she tried to find a commercial offering that would let her point and shoot a DSLR camera with her voice, she came up empty.

[Taras Slawnych] heard about [Carolyn’s] need for special camera equipment and figured he had the experience to do something about it. With an Arduino and a couple of servos to drive the pan-tilt mechanism, he came up with a small device which [Carolyn] can now use to control a Canon camera mounted to an arm on her wheelchair. There’s still some room for improvement (notably, focus can’t yet be controlled by voice), but even in this early form the gadget has caught the attention of Canon’s Canadian division.

With a lavalier microphone on the operator’s shirt, simple voice commands like “right” and “left” are picked up and interpreted by the Arduino inside the device’s 3D printed case. The Arduino then moves the appropriate servo motor a set number of degrees. This doesn’t allow for particularly fine-tuned positioning, but when combined with movements of the wheelchair itself, gives the user an acceptable level of control. [Taras] says the whole setup is powered off of the electric wheelchair’s 24 VDC batteries, with a step-down converter to get it to a safe voltage for the Arduino and servos.
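
Recognizing the words themselves is the hard part, typically delegated to a dedicated speech-recognition module or companion device; the servo-stepping half is simple by comparison. Here is a minimal sketch of that half, assuming the recognizer delivers single-character command codes over serial. The pins, step size, and protocol are invented for illustration and are not from [Taras]’s code.

#include <Servo.h>

Servo pan;
Servo tilt;
int panAngle = 90;
int tiltAngle = 90;
const int STEP_DEG = 5;  // fixed step per voice command, as described above

void setup() {
  Serial.begin(9600);
  pan.attach(9);    // illustrative pins
  tilt.attach(10);
  pan.write(panAngle);
  tilt.write(tiltAngle);
}

void loop() {
  if (Serial.available()) {
    char cmd = Serial.read();  // 'L'/'R'/'U'/'D' from the voice recognizer
    if (cmd == 'L') panAngle  = constrain(panAngle - STEP_DEG, 0, 180);
    if (cmd == 'R') panAngle  = constrain(panAngle + STEP_DEG, 0, 180);
    if (cmd == 'U') tiltAngle = constrain(tiltAngle + STEP_DEG, 0, 180);
    if (cmd == 'D') tiltAngle = constrain(tiltAngle - STEP_DEG, 0, 180);
    pan.write(panAngle);
    tilt.write(tiltAngle);
  }
}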

As we’ve seen over the years, assistive technology is one of those areas where hackers seem to have a knack for making serious contributions to the lives of others (and occasionally even themselves). The highly personalized nature of many physical disabilities, with specific issues and needs often unique to the individual, can make it difficult to develop devices like this commercially. But as long as hackers are willing to donate their time and knowledge to creating bespoke assistive hardware, there’s still hope.

[Thanks to Philippe for the tip.]

Communicate using your ear with Orecchio

When conversing face-to-face, our facial and body expressions convey a wide range of emotions and inflections beyond words alone. But what if you can’t express emotion this way, whether due to a physical impairment or simply because a covering, like a dust mask, temporarily hides your beautiful smile, and perhaps your hands are otherwise occupied?

As a solution to this dilemma, a team of researchers has been working on Orecchio, a robotic device that attaches to the ear and bends it to convey emotion. Three motors allow the ear to be bent into 22 distinct poses and movements, indicating 16 emotional states. Control is accomplished via an Arduino Due, linked to a Windows computer running a C# program.

The prototype was implemented using off-the-shelf electronic components, miniature motors, and custom-made robotic arms. The device has a micro gear motor mounted on the bottom of a 3D-printed ear hook loop clip. The motor drives a plastic arm against the side of the helix, bending it toward the center of the ear. Rotating the plastic arm back to its rest position allows the helix to return to its original form. Near the top of the earpiece is another motor that drives a one-joint robotic arm attached to the top of the helix using a round ear clip. Rotating the motor extends the robotic arm from its resting position, bending the top of the helix downward toward the center of the ear. The motor, together with the one-joint robotic arm, is mounted on a linear track that can be moved vertically through a rack-and-pinion mechanism driven by a third motor. Moving the rack upwards stretches the helix.
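
The prototype drives its micro gear motors through those custom arms, but the pose-playback idea can be sketched with hobby servos standing in for the three actuators. The pose table and one-byte serial protocol below are invented for illustration; real pose angles would need careful tuning against an actual ear.

#include <Servo.h>

Servo helixSide;  // bends the side of the helix inward
Servo helixTop;   // folds the top of the helix downward
Servo helixLift;  // raises the rack to stretch the helix

// A few example poses as {side, top, lift} angles; values are placeholders.
const int POSES[4][3] = {
  {0, 0, 0},    // rest
  {60, 0, 0},   // side bend
  {0, 70, 0},   // top fold
  {0, 0, 45},   // stretch
};

void setup() {
  Serial.begin(9600);
  helixSide.attach(9);
  helixTop.attach(10);
  helixLift.attach(11);
}

void loop() {
  if (Serial.available()) {
    int pose = Serial.read() - '0';  // host (e.g. the C# program) sends '0'..'3'
    if (pose >= 0 && pose < 4) {
      helixSide.write(POSES[pose][0]);
      helixTop.write(POSES[pose][1]);
      helixLift.write(POSES[pose][2]);
    }
  }
}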

The prototype is demonstrated in the video below, and more info is available in the project’s research paper.

DualPanto is a non-visual gaming interface for the blind

While there are tools that allow the visually impaired to interact with computers, conveying spatial relationships, such as those needed for gaming, is certainly a challenge. To address this, researchers have come up with DualPanto.

As the name implies, the system uses two pantographs for location I/O, and on the end of each is a handle that rotates to indicate direction. One pantograph acts as an output to indicate where the object is located, while the other serves as the player’s input interface. One device is positioned above the other, so the relative position of each within the same plane can be gleaned.

The game’s software runs on a MacBook Pro, and an Arduino Due is used to interface the physical hardware with this setup. 

DualPanto is a haptic device that enables blind users to track moving objects while acting in a virtual world.

The device features two handles. Users interact with DualPanto by actively moving the ‘me’ handle with one hand and passively holding on to the ‘it’ handle with the other. DualPanto applications generally use the me handle to represent the user’s avatar in the virtual world and the it handle to represent some other moving entity, such as the opponent in a soccer game.
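
The full control stack is detailed in the paper; purely to illustrate how the Due might split the work with the laptop, here is one plausible firmware loop. The serial protocol, encoder and motor helpers, and the gain are all invented stand-ins rather than DualPanto’s actual code.

float targetX = 0, targetY = 0;  // host-requested position for the 'it' handle

// Stubs standing in for the pantographs' encoder reads and motor drive.
float readMeX() { return 0; }
float readMeY() { return 0; }
float readItX() { return 0; }
float readItY() { return 0; }
void driveIt(float fx, float fy) { /* command the 'it' pantograph's motors */ }

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Stream the user's 'me' handle position up to the game on the laptop.
  Serial.print("ME ");
  Serial.print(readMeX());
  Serial.print(' ');
  Serial.println(readMeY());

  // Accept a new 'it' target from the host, e.g. "IT 0.12 -0.30".
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    if (line.startsWith("IT ")) {
      int space = line.indexOf(' ', 3);
      targetX = line.substring(3, space).toFloat();
      targetY = line.substring(space + 1).toFloat();
    }
  }

  // Proportional pull of the 'it' handle toward its target position.
  const float KP = 4.0;
  driveIt(KP * (targetX - readItX()), KP * (targetY - readItY()));
}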

Be sure to check it out in the video below, or read the full research paper here.

Ariadne Headband is a wearable device for haptic navigation

In a new take on haptic navigation, makers Vojtech Pavlovsky and Tomas Kosicek have come up with a novel feedback system called the “Ariadne Headband.”

This device, envisioned for use by people with visual impairments as well as those who simply want to get around without looking down at a phone while walking or biking, uses four vibrating motors arranged in a circle around the wearer’s head to indicate travel direction.

An Arduino Nano provides computing power for the setup, along with a compass module and a Bluetooth link to communicate with a companion smartphone app. The Ariadne Headband is currently a prototype, but this type of interface could one day be miniaturized to the point that it could be placed in a hat, helmet, or other everyday headgear.

Project Ariadne Headband is made up of two parts: the headband and the control app. The typical usage flow is as follows. First, you open the Ariadne Headband Android app and use it to connect to your headband over Bluetooth. Next, the app asks for your current GPS location. Then you open the Google Maps view integrated into the app and select your destination (the place where you want to go).

The Android app computes the geographical azimuth from your current location to the chosen destination. When you are ready, you start navigating by pressing a button, which sends the computed azimuth to the headband on your head.

The headband consists of an Arduino Nano board, a GY-271 compass module, an HC-06 Bluetooth module (selected only for local availability; we will switch to BLE soon), and four vibration motors. The compass module gives us the current azimuth, that is, where the user is looking. All components are housed in a small box on the back of your head. Our aim is to eventually make this so small that you will not even feel it. It is also possible to place everything into a hat or helmet, for example, instead of the rubber headband; we use a rubber headband because it is very easy to work with.

The vibration motors around your head are placed at set directions so they can signal which way you should head. Your heading is computed from your current azimuth and the azimuth sent from the Android app (that is, where you are currently facing and where you should go, respectively).
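
The core of the headband’s firmware, then, is turning the difference between the current and target azimuths into a choice of motor. A minimal sketch of that logic might look like the following, with the compass read stubbed out and the pins chosen arbitrarily.

const int MOTOR_PINS[4] = {3, 5, 6, 9};  // front, right, back, left (illustrative)

float targetAzimuth = 0;  // degrees; in the real build this arrives over Bluetooth

// Stub: replace with an actual GY-271 read returning degrees from north.
float readCompassAzimuth() {
  return 0;
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(MOTOR_PINS[i], OUTPUT);
}

void loop() {
  // Signed error between where you're facing and where you should go,
  // normalized to the -180..180 degree range.
  float error = targetAzimuth - readCompassAzimuth();
  while (error > 180) error -= 360;
  while (error < -180) error += 360;

  // Quantize the error into one of the four motor directions.
  int dir;
  if (error > -45 && error <= 45) dir = 0;         // keep going straight
  else if (error > 45 && error <= 135) dir = 1;    // turn right
  else if (error >= -135 && error <= -45) dir = 3; // turn left
  else dir = 2;                                    // turn around

  for (int i = 0; i < 4; i++) digitalWrite(MOTOR_PINS[i], i == dir ? HIGH : LOW);
  delay(100);
}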

Robust wheelchair model with treads!

Most people accept that a wheelchair is, in fact, a chair with wheels. This, however, didn’t stop recent Galileo Galilei Technical Institute graduate Davide Segalerba from turning this concept on its head and producing a “wheelchair” scale model driven instead by a pair of treads. 

This concept was inspired by Segalerba’s experience using a wheelchair himself while recovering from multiple surgeries, observing that our environment isn’t always conducive to wheeled transportation.

An Arduino board controls the device, and user input is via a joystick or from a smartphone app over Bluetooth. You can read more about the project on Wired Italia, or translated to English here.

Sip and puff Morse code entry with Arduino

Those who need a text entry method other than a traditional keyboard and mouse often use a system where a character is first selected, then input using a sip or puff of air from the user’s mouth. Naturally, this is less than ideal, and the alternative interface shown here instead uses sip/puff air currents to indicate the dots and dashes of Morse code.

The system—which can be seen in action in the video below—uses a modified film container, along with a pair of infrared emitters and detectors to sense air movement. The device was prototyped on an Arduino Mega, and its creators hope to eventually use a Leonardo for direct computer input. 

A tube connected to a custom-made bipolar pressure switch drives an Arduino, which translates puffing and sucking into Morse code and then into text.

Puffs produce repeating short pulses (dots) and sucks repeating longer pulses (dashes), just like ham radio amateurs do with a dual-lever paddle.
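
The project’s own firmware is linked below; as a rough sketch of the decoding idea, this loop records one dot per puff and one dash per suck (rather than the paddle-style auto-repeat described above) and looks up the letter once input pauses. The pins, timings, and active-low switch wiring are assumptions.

const int PUFF_PIN = 2;  // puff contact of the bipolar switch (dot), active low
const int SUCK_PIN = 3;  // suck contact of the bipolar switch (dash), active low
const unsigned long LETTER_GAP_MS = 800;  // idle time that ends a letter

// Morse table for A-Z, indexed alphabetically.
const char* MORSE[26] = {
  ".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..",
  ".---", "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.",
  "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.."
};

String symbols = "";
unsigned long lastInput = 0;

void setup() {
  Serial.begin(9600);
  pinMode(PUFF_PIN, INPUT_PULLUP);
  pinMode(SUCK_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(PUFF_PIN) == LOW) {
    symbols += '.';
    lastInput = millis();
    while (digitalRead(PUFF_PIN) == LOW) {}  // wait for the puff to end
  }
  if (digitalRead(SUCK_PIN) == LOW) {
    symbols += '-';
    lastInput = millis();
    while (digitalRead(SUCK_PIN) == LOW) {}  // wait for the suck to end
  }

  // A pause with no input ends the letter: look it up and print it.
  if (symbols.length() > 0 && millis() - lastInput > LETTER_GAP_MS) {
    for (int i = 0; i < 26; i++) {
      if (symbols == MORSE[i]) Serial.print((char)('A' + i));
    }
    symbols = "";
  }
}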

Code for this open source project can be found on GitHub.

Notable Board Books are an Arduino-powered way to enjoy music

Annelle Rigsby found that her mother, who suffers from Alzheimer’s, is delighted to hear familiar songs. While Annelle can’t always be there to help her enjoy music, she and her husband Mike came up with what they call the Notable Board Book that automatically plays tunes.

The book itself is well laid-out, with song text and familiar photos printed on the pages. Electronics for the book are still at the prototype stage, using an Arduino Uno and an Adafruit Sound Board to store and replay the audio clips.

Page detection is handled by an array of photocells, and the book is meant to turn on automatically when picked up, courtesy of a series of tilt switches. When a switch is triggered, a relay holds the book on until the current song finishes, or for a predetermined amount of time.
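
The Rigsbys’ exact circuit isn’t spelled out, but the page-to-song mapping can be sketched as follows, assuming one photocell per page spread and an Adafruit Audio FX Sound Board whose trigger pins start playback when pulled low. The pin choices and light threshold are placeholders.

const int PHOTOCELLS[4] = {A0, A1, A2, A3};  // one photocell per page spread
const int TRIGGERS[4]   = {4, 5, 6, 7};      // matching sound board trigger pins
const int LIGHT_THRESHOLD = 600;  // an open page exposes its photocell to light

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(TRIGGERS[i], INPUT);  // idle: leave trigger floating (board pulls it up)
  }
}

void loop() {
  for (int i = 0; i < 4; i++) {
    if (analogRead(PHOTOCELLS[i]) > LIGHT_THRESHOLD) {
      // Pulse the matching trigger low to start that page's song.
      pinMode(TRIGGERS[i], OUTPUT);
      digitalWrite(TRIGGERS[i], LOW);
      delay(120);
      pinMode(TRIGGERS[i], INPUT);  // release the trigger
      delay(500);                   // crude debounce while the song starts
    }
  }
}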

14-Year-Old Builds Communication Device for Brain-Injured Friend

Try not to get anything in your eye as you hear this moving story of a teen helping an injured friend communicate with the world again.

Read more on MAKE
