Posts with «due» label

Rudimentary ultrasound machine made with Arduino Due

Ultrasound images are an important tool for medical diagnosis, and while the units used by doctors can be very expensive, getting a basic image doesn’t have to be. Inspired by a $500 ultrasound machine seen here, maker “stoppi71” decided to create his own using a 5 MHz ultrasound transducer salvaged from a paint-thickness gauge.

An Arduino Due provides the computing power to turn sound pulses into images, while a 3.5-inch TFT display shows what it’s examining. Short pulses in the 100-200 nanosecond range are generated with the help of a monoflop and a MOSFET, and the returning echo corresponds to whatever the transducer is “looking” at.
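The pulse-echo principle behind the build reduces to a single line of arithmetic: half the round-trip time of the echo, multiplied by the speed of sound in the medium, gives the depth of the reflecting boundary. A minimal sketch in Python (the 1540 m/s figure is the standard textbook speed of sound in soft tissue; the 26 µs echo time is a made-up example, not a measurement from this project):

```python
# Speed of sound in soft tissue, metres per second (standard textbook value).
SPEED_OF_SOUND_TISSUE = 1540.0

def echo_depth_mm(round_trip_seconds, speed=SPEED_OF_SOUND_TISSUE):
    """Depth of a reflecting boundary from the round-trip echo time.

    The pulse travels to the boundary and back, so the one-way
    distance is half of speed * time; the result is in millimetres.
    """
    return speed * round_trip_seconds / 2.0 * 1000.0

# An echo returning after 26 microseconds corresponds to ~20 mm of tissue.
print(round(echo_depth_mm(26e-6), 2))  # 20.02
```

With a short 100-200 ns transmit pulse, echoes from boundaries only a few millimetres apart remain distinguishable, which is why the skin and bone show up as separate features.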

Although the results are not nearly what you’d expect at the doctor’s office, rudimentary readings of skin and bone are definitely visible. 

I’ve examined different objects, from aluminum cylinders and water-filled balloons to my own body. To see body echoes, the amplification of the signals must be very high; for the aluminum cylinders, a lower amplification is needed. When you look at the pictures, you can clearly see the echoes from the skin and my bone.

So what can I say about the success or failure of this project? It is possible to look inside the body with such simple methods, using parts that aren’t commonly intended for that purpose. But these factors limit the results too: you don’t get the clear, well-structured pictures of commercial solutions.

Relativty is a low-cost VR headset you can build yourself

While you’ve been hearing about virtual reality for the last 20 years or so, the hardware required to build such a rig has finally reached the point where it’s within the reach of consumers. As seen here, Relativty is a SteamVR-compatible headset that can be made for around $100.

Relativty uses a 3D-printed frame to house its 2560 x 1440 LCD screen, along with a pair of 80mm Fresnel lenses to properly focus the image. Control is accomplished via an Arduino Due and an MPU-6050 IMU (accelerometer plus gyroscope), which feeds head-tracking info to an external gaming system.
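Head tracking with an MPU-6050 typically means fusing its two sensors: the gyroscope integrates smoothly but drifts over time, while the accelerometer is noisy but gives a drift-free gravity reference. A common way to combine them is a complementary filter, sketched below in Python; this is a generic illustration of the technique, not Relativty’s actual firmware, and the blend factor 0.98 is a typical but assumed value:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate (deg/s) with an accelerometer-derived angle (deg).

    The gyro term tracks fast motion; the small accelerometer term
    slowly pulls the estimate back toward the drift-free reference.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_pitch_deg(ax, ay, az):
    """Pitch angle implied by a raw accelerometer reading (in g units)."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

# Headset at rest and level: gyro reads 0 deg/s, accelerometer reads 1 g down.
angle = 5.0  # start from a deliberately wrong estimate
for _ in range(200):  # 200 steps at 10 ms = 2 seconds
    angle = complementary_filter(angle, 0.0, accel_pitch_deg(0, 0, 1), 0.01)
print(round(angle, 3))  # the initial error decays toward 0
```

In the real headset the fused orientation is streamed to SteamVR over USB each frame.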

At this point, the device is clean though fairly basic, and will hopefully be the start of a truly excellent open source project as features are added.

Arduino and Pi Share Boardspace

A Raspberry Pi Zero (W) and an Arduino are very different animals: the former has processing power and connectivity, while the latter has analog-to-digital converters (ADCs) and near real-time responsiveness. You can connect them to one another with a USB cable, and for many projects that will happily wed the two. Beyond that, we can interface this odd couple entirely through serial, SPI, I2C, and logic-level signaling. How? Through a device by [cburgess] that is being called an Arduino shield that supports a Pi Zero (W). Maybe it is a HAT that interfaces with an Arduino. The distinction may be moot, since each board has a familiar footprint and both of them are found here.

Depending on how they are set up and programmed, one can take control of the other, or they can each happily do their own thing and just exchange a little information. This board is like a marriage counselor between a Raspberry Pi and an Arduino: it provides the level-shifting so they don’t blow each other up, and libraries so they can speak nicely to one another. If you want to dig a bit deeper into this one, design files and code examples are available.
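Level shifting solves the electrical side; the software side still needs an agreed-upon message format so a corrupted byte on the wire doesn’t silently poison a reading. A hedged sketch of one simple scheme, a framed message with a length byte and XOR checksum (this is a generic pattern, not the actual protocol of [cburgess]’s libraries):

```python
def frame_message(payload: bytes) -> bytes:
    """Wrap a payload as: start byte 0x7E, length, payload, XOR checksum."""
    if len(payload) > 255:
        raise ValueError("payload too long for a one-byte length field")
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([0x7E, len(payload)]) + payload + bytes([checksum])

def parse_message(frame: bytes) -> bytes:
    """Validate a frame produced by frame_message and return its payload."""
    if len(frame) < 3 or frame[0] != 0x7E:
        raise ValueError("bad start byte")
    length = frame[1]
    payload, checksum = frame[2:2 + length], frame[2 + length]
    expected = 0
    for b in payload:
        expected ^= b
    if checksum != expected:
        raise ValueError("checksum mismatch")
    return payload

# Round-trip a reading as it might cross the UART between the two boards.
print(parse_message(frame_message(b"ADC0=512")))  # b'ADC0=512'
```

Either board can then treat the link as a stream of validated packets rather than raw bytes.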

Perhaps we’ll report on this board at the heart of a pinball machine retrofit, a vintage vending machine restoration, or maybe a working prop replica from the retro bar in Back to the Future II.

Communicate using your ear with Orecchio

When conversing face-to-face, a wide range of emotions and inflections is conveyed by our facial and body expressions. But what if you can’t express emotion this way, whether due to a physical impairment, or simply because a covering—like a dust mask—temporarily hides your beautiful smile and your hands are otherwise occupied?

As a solution to this dilemma, a team of researchers has been working on Orecchio, a robotic device that attaches to the ear and bends it to convey emotion. Three motors allow the ear to be bent into 22 distinct poses and movements, indicating 16 emotional states. Control is accomplished via an Arduino Due, linked up with a Windows computer running a C# program.

The prototype was implemented using off-the-shelf electronic components, miniature motors, and custom-made robotic arms. The device has a micro gear motor mounted on the bottom of a 3D-printed ear hook loop clip. The motor drives a plastic arm against the side of the helix, able to bend it towards the center of the ear. Rotating the plastic arm back to its rest position allows the helix to return to its original form. Near the top of the earpiece is another motor that drives a one-joint robotic arm attached to the top of the helix using a round ear clip. Rotating the motor extends the robotic arm from its resting position to bend the top helix downward toward the center of the ear. The motor, together with the one-joint robotic arm, is mounted on a linear track that can be moved vertically through a rack-and-pinion mechanism driven by a third motor. Moving the rack upwards stretches the helix.
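With three actuators, each expressive pose boils down to a triple of motor targets, so the control software can be little more than a lookup table. A sketch of that idea in Python; the emotion names and angle values here are entirely hypothetical placeholders, since the actual pose calibrations used by the Orecchio team are not given in the excerpt above:

```python
# Hypothetical pose table: each emotional cue maps to target positions
# (degrees) for the three motors described above -- the helix-side arm,
# the top-helix arm, and the rack-and-pinion stretcher.  All values
# are placeholders, not the project's real calibration.
POSES = {
    "neutral":   (0, 0, 0),
    "attentive": (0, 0, 40),    # stretch the helix upward
    "sad":       (60, 45, 0),   # fold the helix toward the ear centre
    "surprised": (0, 70, 30),   # bend the top helix down, then stretch
}

def pose_for(emotion):
    """Return the motor targets for an emotion, falling back to neutral."""
    return POSES.get(emotion, POSES["neutral"])

print(pose_for("sad"))        # (60, 45, 0)
print(pose_for("confused"))   # unknown cue falls back to (0, 0, 0)
```

The host PC would then stream the chosen triple to the Due, which drives the motors toward those targets.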

The prototype is demonstrated in the video below, and more info is available in the project’s research paper.

DualPanto is a non-visual gaming interface for the blind

While there are tools that allow the visually impaired to interact with computers, conveying spatial relationships, such as those needed for gaming, is certainly a challenge. To address this, researchers have come up with DualPanto.

As the name implies, the system uses two pantographs for location I/O, and on the end of each is a handle that rotates to indicate direction. One pantograph acts as an output to indicate where an object is located, while the other acts as the player’s input interface. One device is positioned above the other, so the relative position of each in a plane can be gleaned.

The game’s software runs on a MacBook Pro, and an Arduino Due is used to interface the physical hardware with this setup. 

DualPanto is a haptic device that enables blind users to track moving objects while acting in a virtual world.

The device features two handles. Users interact with DualPanto by actively moving the ‘me’ handle with one hand and passively holding on to the ‘it’ handle with the other. DualPanto applications generally use the me handle to represent the user’s avatar in the virtual world and the it handle to represent some other moving entity, such as the opponent in a soccer game.
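One natural computation in such a system is the angle the rotating ‘it’ handle should display, for example the heading of a moving entity derived from its last two positions. A small Python sketch of that geometry (an illustration of the idea, not code from the DualPanto paper):

```python
import math

def heading_deg(prev, curr):
    """Heading (degrees, 0 = +x axis, counter-clockwise) of an entity
    that moved from position prev to position curr, e.g. the angle a
    rotating handle could display for a moving opponent."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Opponent moving to the right -> handle points at 0 degrees.
print(heading_deg((0.0, 0.0), (5.0, 0.0)))   # 0.0
# Opponent moving straight up -> 90 degrees.
print(heading_deg((0.0, 0.0), (0.0, 5.0)))   # 90.0
```

The pantograph itself handles the x/y placement; the handle rotation adds this third, directional channel of information.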

Be sure to check it out in the video below, or read the full research paper here.

Single-handed smartwatch text entry with WrisText

Smartwatches can keep us informed of incoming information at a glance, but responding still takes the use of another hand, potentially occupied by other tasks. Researchers at Dartmouth College are trying to change that with their new WrisText system.

The device divides the outside of a Ticwatch 2 into six sections of letters, selected by the movement of one’s wrist. As letters are chosen, possible words are displayed on the screen, which are then selected automatically, or by rubbing and tapping gestures between one’s finger and thumb. 

The prototype employs an Arduino Due to pass information to a computer, along with proximity and piezo sensors to detect hand and finger movements.

We present WrisText – a one-handed text entry technique for smartwatches using the joystick-like motion of the wrist. A user enters text by whirling the wrist of the watch hand towards six directions, each of which represents a key in a circular keyboard where the letters are distributed in alphabetical order. The design of WrisText was an iterative process, where we first conducted a study to investigate optimal key size, and found that keys needed to be 55° or wider to achieve over 90% striking accuracy. We then computed an optimal keyboard layout, considering a joint optimization problem of striking accuracy, striking comfort, and word disambiguation. We evaluated the performance of WrisText through a five-day study with 10 participants in two text entry scenarios: hand-up and hand-down. On average, participants achieved a text entry speed of 9.9 WPM across all sessions, and were able to type as fast as 15.2 WPM by the end of the last day.
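With only six keys covering 26 letters, each wrist stroke is ambiguous, so candidate words are recovered T9-style by matching the stroke sequence against a dictionary. A Python sketch of that disambiguation step; note the even alphabetical six-way split below is an assumption for illustration, since the paper’s actual layout was optimized for accuracy and comfort:

```python
# Illustrative six-key grouping of the alphabet (an assumed even split;
# the real WrisText layout was computed by optimization).
GROUPS = ["abcde", "fghi", "jklm", "nopq", "rstu", "vwxyz"]
KEY_OF = {letter: i for i, group in enumerate(GROUPS) for letter in group}

def key_sequence(word):
    """Key indices a user would stroke to enter the given word."""
    return [KEY_OF[c] for c in word.lower()]

def disambiguate(sequence, dictionary):
    """All dictionary words whose strokes match the ambiguous sequence."""
    return [w for w in dictionary if key_sequence(w) == sequence]

# "dog" and "cog" share the same three strokes, so both are offered
# as candidates; "fog" starts on a different key and is excluded.
words = ["dog", "cog", "fog"]
print(disambiguate(key_sequence("dog"), words))  # ['dog', 'cog']
```

Remaining ties are resolved by the user, which is where the rubbing and tapping gestures come in.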

More information can be found in the project’s research paper, or you can see it demonstrated in the video below.

A HID For Robots

Whether with projects featured here or out in the real world, we have a tendency to focus most upon the end product. The car, solar panel, or even robot. But there’s a lot more going on behind the scenes that needs to be taken care of as well, whether it’s fuel infrastructure to keep the car running, a semiconductor manufacturer to create silicon wafers, or a control system for the robot. This project is one of the latter: a human interface device for a robot arm that is completely DIY.

While robots are often automated, some still need human input. The human input can be required all the time, or can be used to teach the robot initially how to perform a task which will then be automated. This “keyboard” of sorts built by [Ahmed] comes with a joystick, potentiometer, and four switch inputs that are all fully programmable via an Arduino Due. With that, you can perform virtually any action with whatever type of robot you need, and since it’s based on an Arduino it would also be easy to expand.
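Before a joystick or potentiometer reading becomes a robot command, the raw ADC value needs normalizing and usually a centre deadzone so a slightly off-centre stick doesn’t creep the arm. A Python sketch of that conditioning step (assuming a 12-bit 0-4095 reading, which the Due’s ADC can provide; the deadzone width is an assumed calibration value, and this is not [Ahmed]’s actual code):

```python
def normalize_axis(raw, raw_max=4095, deadzone=0.05):
    """Map a raw ADC reading (0..raw_max) to a -1.0..1.0 axis value
    with a centre deadzone, as a joystick channel on the controller
    might be conditioned before being sent to the robot."""
    value = (raw / raw_max) * 2.0 - 1.0
    if abs(value) < deadzone:
        return 0.0  # treat near-centre readings as exactly centred
    return max(-1.0, min(1.0, value))

print(normalize_axis(2048))  # 0.0 (centred, inside the deadzone)
print(normalize_axis(4095))  # 1.0 (full deflection one way)
print(normalize_axis(0))     # -1.0 (full deflection the other way)
```

The four switches need no such treatment; they map directly to programmable discrete actions.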

The video below and project page have all the instructions and bill of materials if you want to roll out your own. It’s a pretty straightforward project but one that might be worth checking out since we don’t often feature controllers for other things, although we do see them sometimes for controlling telescopes rather than robots.



Hack a Day 02 Jun 06:00

Scribble is an Arduino-controlled haptic drawing robot

As part of his master’s studies at Eindhoven University, Felix Ros created a haptic drawing interface that uses a five-bar linkage system to not only take input from one’s finger, but also act as a feedback device via a pair of rotary outputs.

“Scribble” uses an Arduino Due to communicate with a computer, running software written in OpenFrameworks.

For over a century we have been driving cars, enabling us to roam our surroundings with little effort. Now with the introduction of automated driving, machines will become our chauffeurs. But how about getting us around a road construction, or finding a friend in a crowded area? Or what if you just want to explore and find new places, will these cars be able to handle such situations and how can you show your intentions?

Currently there is no middle ground between the car taking the wheel and the driver doing so; this is where Scribble comes in: a haptic interface that lets you draw your way through traffic. You draw a path and the car will follow; you don’t drive the car, you pilot it. Scribble lets you help your car when it’s in need, and wander your surroundings once again.

You can learn more about Ros’ design in his write-up here, including the code needed to calculate and output forward kinematics to set the X/Y position, and inverse kinematics to sense user input.
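The forward-kinematics half of that problem has a neat closed form: each motor directly places an elbow, and the pen sits at the intersection of two circles swung from those elbows. A Python sketch of the calculation for a symmetric five-bar linkage (the link lengths below are made-up values for illustration, not Scribble’s real geometry):

```python
import math

def five_bar_fk(theta1, theta2, base=0.1, l_prox=0.06, l_dist=0.1):
    """Forward kinematics of a symmetric five-bar linkage.

    Two motors sit at (0, 0) and (base, 0); each drives a proximal
    link of length l_prox, and two distal links of length l_dist meet
    at the end effector.  Angles are radians from the +x axis.
    """
    # Elbow positions, driven directly by the two motor angles.
    e1 = (l_prox * math.cos(theta1), l_prox * math.sin(theta1))
    e2 = (base + l_prox * math.cos(theta2), l_prox * math.sin(theta2))
    # End effector = upper intersection of the two circles of radius
    # l_dist centred on the elbows.
    dx, dy = e2[0] - e1[0], e2[1] - e1[1]
    d = math.hypot(dx, dy)
    if d > 2 * l_dist:
        raise ValueError("pose unreachable: distal links cannot close")
    a = d / 2.0                      # both circles share the same radius
    h = math.sqrt(l_dist ** 2 - a ** 2)
    mx, my = e1[0] + dx / 2.0, e1[1] + dy / 2.0
    return (mx - h * dy / d, my + h * dx / d)

# A mirror-symmetric pose puts the pen exactly on the centreline x = 0.05.
x, y = five_bar_fk(math.radians(120), math.radians(60))
print(round(x, 4), round(y, 4))
```

The inverse problem, recovering motor angles from a sensed pen position, runs the same circle-intersection geometry in the other direction.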

Be sure to check it out in the video below piloting a virtual car through traffic with ease!

Integrating a Nintendo Power Glove with today’s VR technology

When the Power Glove was released in the early 1990s, the idea that you could control games with hand motions was incredible, but like the Virtual Boy that followed years later, the hardware of the day just couldn’t keep up. Today, hardware has finally gotten to the point where this type of interface could be very useful, so Teague Labs decided to integrate a Power Glove with an HTC Vive VR headset.

While still under development, the glove’s finger sensors have shown great promise for interactions with virtual touchscreen devices, and they’ve even come up with a game where you have to counter rocks, paper, and scissors with the correct gesture.
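Since the Power Glove’s finger sensors are simple flex resistors, recognizing a rock/paper/scissors gesture can be as basic as thresholding which fingers are bent. A Python sketch of that classification (the 0.5 threshold is an assumed calibration value, and this is an illustration of the idea rather than Teague Labs’ actual code):

```python
def classify_rps(flex, bent_threshold=0.5):
    """Classify a rock/paper/scissors gesture from normalised flex
    readings (0.0 = straight, 1.0 = fully bent) for the glove's four
    instrumented fingers: thumb, index, middle, ring."""
    bent = [f > bent_threshold for f in flex]
    if all(bent):
        return "rock"        # full fist
    if not any(bent):
        return "paper"       # open hand
    if bent[0] and not bent[1] and not bent[2] and bent[3]:
        return "scissors"    # index and middle extended, rest curled
    return "unknown"

def counter_move(gesture):
    """Move that beats the recognised gesture, as in the demo game."""
    return {"rock": "paper", "paper": "scissors", "scissors": "rock"}.get(gesture)

print(classify_rps([0.9, 0.8, 0.9, 0.7]))                 # rock
print(counter_move(classify_rps([0.9, 0.1, 0.2, 0.8])))   # scissors -> rock
```

In the game, the Vive tracker supplies where the hand is while the flex sensors supply what it is doing.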

Making this all possible is the Arduino Due, which supports the library for communicating with the Vive tracker.

We took a Power Glove apart, 3D-scanned the interfacing plastic parts, and built modified parts that hold the Vive Tracker and an Arduino Due on the glove. After some prototyping on a breadboard, we designed a shield for the Due and etched it using the laser-cutter transfer technique. We then soldered all components and spray-painted the whole shield to protect the bare copper. After mounting the tracker and tweaking the code by matzmann666, we had the glove working.

If you’d like to see the details of what has been accomplished so far, check out the Teague Labs team’s design files and code on GitHub.

Project Aslan is a 3D-printed robotic sign language translator

Given the lack of people in Belgium capable of turning written or spoken words into sign language, University of Antwerp master’s students Guy Fierens, Stijn Huys, and Jasper Slaets have decided to do something about it. They built a robot known as Aslan, or Antwerp’s Sign Language Actuating Node, that can translate text into finger-spelled letters and numbers.

Project Aslan, now in the form of a single robotic arm and hand, is made from 25 3D-printed parts and uses an Arduino Due, 16 servos, and three motor controllers. Because it is 3D-printed and built from readily available components, the low-cost design can be produced locally.

The robot works by receiving information over a local network, and it checks for updated sign languages from all over the world. Users connected to the network can send messages, which then activate the hand, elbow, and finger joints to sign them out.
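At its core, text-to-fingerspelling is a lookup: each character maps to a vector of servo targets, and a message becomes a sequence of hand poses. A Python sketch of that mapping; Aslan drives 16 servos, but for brevity this poses only five hypothetical “finger curl” channels, and all angle values are placeholders rather than the project’s real calibration:

```python
# Hypothetical servo pose table: each fingerspelled letter maps to
# target angles (degrees) for five finger-curl servos.  The real robot
# has 16 servos and its own calibrated poses; these are placeholders.
LETTER_POSES = {
    "a": [0, 170, 170, 170, 170],   # thumb out, four fingers curled
    "b": [160, 10, 10, 10, 10],     # thumb across palm, fingers straight
    "l": [0, 10, 170, 170, 170],    # thumb and index extended
}

def spell(text):
    """Sequence of servo poses to fingerspell the given text,
    skipping characters the table does not cover."""
    return [LETTER_POSES[c] for c in text.lower() if c in LETTER_POSES]

print(len(spell("Bal")))  # poses for 'b', 'a', 'l' -> 3
```

The Due’s job is then to step through the sequence, easing the servos from one pose to the next so the motion reads naturally.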

Although it is a single arm for now, work will continue with future master’s students, focusing on expanding to a two-arm design, implementing a face, and even integrating a webcam into the system. For more info, you can visit the project’s website here as well as its write-up on 3D Hubs.