Store and replay this robot’s movements from your phone

Robotic arms can be interesting, as are robots that roll around—especially on a semi-exotic Mecanum wheel setup. Dejan Nedelkovski’s latest How To Mechatronics build, however, combines both into one package.

This project actually starts out in a previous post, where he constructs the moving base with Mecanum wheels, enabling it to slide and rotate in any direction.

In this final(?) stage, he adds a five-axis robot arm mounted on top of its boxy frame, or six-axis if you count the gripper. Either way, the arm uses a total of six servos for actuation, and the base of the bot travels around under the power of four stepper motors. Everything is controlled by an Arduino Mega through a custom shield, allowing repeatable movements in any direction. These can be stored and replayed via the robot’s custom Android app as desired.
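
For a rough idea of how the store-and-replay side can work, here is a minimal Arduino sketch that buffers servo poses and plays them back on a serial command. It is a sketch of the concept only; the pin numbers, buffer size, and one-character protocol are placeholders, not Nedelkovski’s actual firmware:

```cpp
#include <Servo.h>

const int NUM_SERVOS = 6;
const int MAX_STEPS = 50;              // size of the pose buffer

Servo joints[NUM_SERVOS];
byte poses[MAX_STEPS][NUM_SERVOS];     // stored positions, 0-180 degrees
int stepCount = 0;

// Save a set of target angles as one step of the routine
void recordPose(const byte target[NUM_SERVOS]) {
  if (stepCount >= MAX_STEPS) return;
  memcpy(poses[stepCount++], target, NUM_SERVOS);
}

// Walk back through every stored pose
void replay() {
  for (int s = 0; s < stepCount; s++) {
    for (int i = 0; i < NUM_SERVOS; i++) joints[i].write(poses[s][i]);
    delay(500);                        // crude pacing between poses
  }
}

void setup() {
  Serial.begin(9600);                  // the app would talk over a serial link
  for (int i = 0; i < NUM_SERVOS; i++) joints[i].attach(2 + i); // pins 2-7
}

void loop() {
  if (Serial.available() && Serial.read() == 'r') replay();
}
```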

Star Trek TOS sickbay display on a breadboard

The future envisioned in the original Star Trek included, among other things, a shipboard sickbay with electronic monitors strangely reminiscent of the machines that medical personnel use today. To recreate a functional mini-replica of these displays, YouTuber Xtronical turned to a 2.8” TFT screen, a breadboard, and an Arduino Nano—noting that an Uno would also work.

The LCD display nails the look of Dr. McCoy’s device, and a heartbeat sound can be played along with an onscreen flashing “PULSE” circle. A MAX30100 pulse oximeter sensor and a DS18B20 temperature sensor take body readings, while a second DS18B20 reads ambient conditions for increased accuracy.
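
Reading two DS18B20s on a single bus is straightforward with the common OneWire and DallasTemperature libraries; here is a minimal sketch of that part (the pin choice and the drift-compensation idea are assumptions, not details from Xtronical’s build):

```cpp
#include <OneWire.h>
#include <DallasTemperature.h>

OneWire oneWire(4);                    // both DS18B20s share one data pin (placeholder)
DallasTemperature sensors(&oneWire);

void setup() {
  Serial.begin(9600);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();       // trigger a conversion on every sensor
  float body    = sensors.getTempCByIndex(0); // skin-contact sensor
  float ambient = sensors.getTempCByIndex(1); // second DS18B20, room air
  // One plausible use of the second reading: compensate the body value
  // for ambient drift (an assumption, not necessarily Xtronical's method).
  Serial.print("Body: ");     Serial.print(body);
  Serial.print(" C  Room: "); Serial.println(ambient);
  delay(1000);
}
```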

It’s a fun Trekkie project, and Xtronical plans to elaborate on how it was made in future videos. 

A build of a working original Star Trek display with real sampled heartbeat sound. Uses various sensors to get the readings from your body (via just your fingers) and displays them in the style of the 60’s sick bay screen. This bare “Bones” system could be built into a model unit or even a replica Tricorder.

Arduino Blog 08 Jul 16:19

Safely Measuring Single And Three-Phase Power

There are many reasons why one might want to measure voltage and current in a project; some applications require measuring mains or even three-phase voltage to analyze the characteristics of a device under test, or in a production environment. This led [Michael Klopfer] at the University of California, Irvine, along with a group of students, to develop a fully isolated board to analyze both single and three-phase mains systems.

Each of these boards consists of two sections: one is the high-voltage side, with the single-phase board using the Analog Devices ADE7953 and the three-phase board the ADE9078. The other is the low-voltage, isolated side to which the microcontroller or equivalent connects using either SPI or I2C; each board type comes in both flavors.

Each board can be used to measure line voltage and current, and the Analog Devices IC calculates active, reactive, and apparent energy, as well as instantaneous RMS voltage and current. All of this data can then be read out using the provided software for the Arduino platform.
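
The provided libraries hide the bus details, but a raw register read gives a feel for the interface. Here is a hedged sketch for the SPI flavor, with the register address, read command byte, and SPI mode taken as assumptions from the ADE7953 datasheet rather than from the project’s own code:

```cpp
#include <SPI.h>

const int CS_PIN = 10;
// Address, command byte, and SPI mode below are datasheet-based
// assumptions; the project's Arduino library wraps all of this.
const uint16_t VRMS_REG = 0x021C;      // 24-bit RMS voltage register

uint32_t readReg24(uint16_t addr) {
  SPI.beginTransaction(SPISettings(1000000, MSBFIRST, SPI_MODE3));
  digitalWrite(CS_PIN, LOW);
  SPI.transfer(addr >> 8);             // 16-bit register address, MSB first
  SPI.transfer(addr & 0xFF);
  SPI.transfer(0x80);                  // read flag
  uint32_t value = 0;
  for (int i = 0; i < 3; i++) {        // clock out three data bytes
    value = (value << 8) | SPI.transfer(0x00);
  }
  digitalWrite(CS_PIN, HIGH);
  SPI.endTransaction();
  return value;
}

void setup() {
  pinMode(CS_PIN, OUTPUT);
  digitalWrite(CS_PIN, HIGH);
  SPI.begin();
  Serial.begin(9600);
}

void loop() {
  Serial.println(readReg24(VRMS_REG)); // raw counts; scaling is per datasheet
  delay(1000);
}
```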

The goal of this project is to make it easy for anyone to reproduce their efforts, with board schematics (in Eagle format) and the aforementioned software libraries provided. It is somewhat unfortunate that the documentation is incomplete in places, with basic information such as input and measurement ranges missing. Hopefully this will improve over the coming months, as it does seem like a genuinely useful project for the community.

We’ve covered the work coming out of [Michael]’s lab before, including this great rundown on Lattice FPGAs. They’re doing machine vision, work on RISC-V chips, and more. A stroll through the lab’s GitHub is worth your time.

LoRa security camera detects and transmits trespasser data

Security cameras are a great way to deter theft and vandalism, but what if the camera is out of WiFi range, or would otherwise need long cables to transmit pictures? As explained here, Tegwyn Twmffat has an interesting solution: taking advantage of neural network processing to recognize moving objects, along with a LoRa connection to sound the alarm when there is a potential problem.

Images are captured by a Raspberry Pi and camera, then processed with the help of an Intel Movidius Neural Compute Stick for identification. If it’s something of interest—a human, for example—a relatively small amount of data is transmitted to a MKR WAN 1300 base station, which beeps faster and faster as the person approaches.
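
On the base station side, something like the following could receive the alert and pace the beeps. This is a guess at the scheme using the common arduino-LoRa library, with the single-byte proximity payload invented for illustration:

```cpp
#include <SPI.h>
#include <LoRa.h>   // sandeepmistry/arduino-LoRa

const int BUZZER_PIN = 3;            // placeholder pin

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  // On a MKR WAN 1300 the radio pins may need LoRa.setPins() first.
  if (!LoRa.begin(868E6)) {          // EU band; 915E6 in the US
    while (true);                    // radio init failed, halt
  }
}

void loop() {
  if (LoRa.parsePacket() > 0) {
    // Assume the camera unit sends one byte: a 0-255 proximity score.
    int proximity = LoRa.read();
    tone(BUZZER_PIN, 2000, 50);      // short beep
    delay(map(proximity, 0, 255, 1000, 100)); // closer = faster beeping
  }
}
```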

As seen in the video below, it’s able to properly ignore the ‘test dog,’ while it beeps away when a person approaches! 

This electric soapbox car can reach a top speed of 35 km/h

If Elon Musk were to design a soapbox car, the prototype might look something like this build by David Traum.

Traum’s project is powered by a 500 W motor, fed by a pair of 12 V batteries and a 40 W solar cell, allowing it to attain a top speed of 35 km/h and a range of 10 to 15 km. Although that might not sound like a huge number, it looks pretty fast at the end of the video below!

But that’s not all. The vehicle features a rather unique control system, with front-wheel steering actuated by a stepper motor and cable assembly. An Arduino Mega is the brains of the operation, while user input comes via a small touchscreen, a joystick, and even a steering wheel (equipped with an Uno, a 9 V battery, a radio module, and a gyro sensor) that can work wirelessly as needed—perhaps to park remotely, or simply to drive it like a gigantic RC car.
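
The video doesn’t spell out the radio link, but a gyro-equipped wheel talking to the car could look something like this sketch, which assumes an MPU-6050 for tilt sensing and an nRF24L01 with the RF24 library for the wireless hop (neither module is confirmed by the source):

```cpp
#include <Wire.h>
#include <RF24.h>   // nRF24L01 is an assumption; the build's radio isn't named

RF24 radio(9, 10);                    // CE, CSN (placeholder pins)
const byte PIPE[6] = "steer";
const int MPU = 0x68;                 // MPU-6050 I2C address (assumed sensor)

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU);
  Wire.write(0x6B); Wire.write(0);    // wake the IMU from sleep
  Wire.endTransmission();
  radio.begin();
  radio.openWritingPipe(PIPE);
  radio.stopListening();
}

void loop() {
  Wire.beginTransmission(MPU);
  Wire.write(0x3B);                   // start of accelerometer registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU, 2);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int8_t angle = ax / 256;            // crude wheel-tilt estimate
  radio.write(&angle, sizeof(angle)); // receiver steps the steering motor
  delay(20);
}
```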

The clip here is in German, but you can read more in this English-translated article.

Arduino Blog 02 Jul 19:57

Experience the world like a cat with this whisker-style sensory extension

Imagine if you had whiskers. Obviously, this would make you something of an oddity in today’s society. On the other hand, you’d be able to sense nearby objects via the transmission of force through these hair structures.

In order to explore this concept, Chris Hill has created a whisker assembly for sensory augmentation, with flex sensors standing in for the stiff hairs that we as humans don’t possess. The sensors—four are used here—vary in resistance when bent, furnishing information about their status to the Arduino Uno that controls the wearable device. Forehead-mounted vibratory motors are pulsed via PWM outputs in response, allowing the user to feel what’s going on in the surrounding environment.
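
The sensing loop itself is simple: each flex sensor sits in a voltage divider feeding an analog pin, and the reading is mapped onto a motor’s PWM duty cycle. A minimal sketch of that mapping, with pins and thresholds as placeholders rather than Hill’s values:

```cpp
const int NUM_WHISKERS = 4;
const int flexPins[NUM_WHISKERS]  = {A0, A1, A2, A3}; // voltage-divider outputs
const int motorPins[NUM_WHISKERS] = {3, 5, 6, 9};     // PWM-capable pins on an Uno

void setup() {
  for (int i = 0; i < NUM_WHISKERS; i++) pinMode(motorPins[i], OUTPUT);
}

void loop() {
  for (int i = 0; i < NUM_WHISKERS; i++) {
    int bend = analogRead(flexPins[i]);     // rises as the whisker bends
    // Map the useful part of the range onto motor intensity; the
    // 300-700 window is a guess that would need calibration.
    int strength = map(constrain(bend, 300, 700), 300, 700, 0, 255);
    analogWrite(motorPins[i], strength);    // stronger bend = stronger buzz
  }
  delay(20);
}
```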

If this looks familiar, Hill is quick to credit Nicholas Gonyea’s Whisker Sensory Extension Wearable as the basis for this project. He hopes his take on things improves the original, making it lighter, more cost-effective, and easier to construct. 

The purpose of this project was to focus on the creation of novel, computationally-enriched “sensory extensions” that allow for augmented-sensing of the natural world. My major effort with this project was devoted to the fabrication and implementation of sensory augmentations that will extend a sense through sensors and respond with a tactile output for the user. The intent is to enable anyone to fabricate their own sensory extensions, and thusly map intrinsically human/animal senses onto hardware. Effectively extending our senses in new and exciting ways that will lead to a better understanding of how our brain is able to adapt to new external senses.

Computer 1.0 explores the relationship between textile and technology

While you might have never considered the idea, looms—especially the punchcard-driven Jacquard loom, which helped inform both Ada Lovelace and Charles Babbage’s pioneering work—are an important part of computing history. As reported here, Victoria Manganiello and Julian Goldman have created an awe-inspiring ode to this computing heritage in the form of a handwoven tapestry that constantly changes the way it looks, aptly named “Computer 1.0.”

The tapestry, which was recently on display at the Museum of Arts and Design in New York City, stretches nine meters in length and features tubing woven throughout. An Arduino actuates pumps and valves to produce familiar patterns in this tubing with blue-dyed water and air.
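
In principle the plumbing control reduces to switching relays in timed sequences. A bare-bones sketch of the idea, with the pin assignments, valve count, and timing all invented for illustration:

```cpp
const int PUMP_PIN = 2;                       // relay driving the pump (placeholder)
const int valvePins[] = {3, 4, 5, 6};         // valves spaced along the tubing
const int NUM_VALVES = 4;

// One row per step: which valves open. Alternating slugs of dyed water
// and air are what draw the pattern through the cloth.
const byte pattern[][4] = {{1,0,1,0}, {0,1,0,1}};

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  for (int i = 0; i < NUM_VALVES; i++) pinMode(valvePins[i], OUTPUT);
}

void loop() {
  digitalWrite(PUMP_PIN, HIGH);               // run the pump continuously
  for (unsigned s = 0; s < sizeof(pattern) / sizeof(pattern[0]); s++) {
    for (int i = 0; i < NUM_VALVES; i++) {
      digitalWrite(valvePins[i], pattern[s][i] ? HIGH : LOW);
    }
    delay(2000);                              // hold each step of the sequence
  }
}
```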

These patterns soon become abstract and perhaps more open to interpretation, though with more development it’s noted that images and even smartphone-readable designs could be possible. 

Be sure to see the short demo of this incredible installation in the video below! 

A handwoven textile activated by computer code, Computer 1.0 explores connections between weaving and technology. For the project, Victoria Manganiello invited designer Julian Goldman to collaborate on designing and programming a pump controlled by Arduino microcomputers to move precise sequences of air and liquid through the approximately 2,000 feet of tubing woven through the cloth. The movement of the air and liquid evokes traditional weaving patterns such as bird’s eye, monk’s cloth, and twill. And the operating system—the computer and the pump—is not kept out of sight in the service of the woven screen and the pixelated patterns that run across it, but rather is an integral part of the work; nothing is hidden.


Manganiello’s textile reflects and expands on the obscured history of weaving and coding, calling attention to the “under-over, under-over” movement of thread becoming cloth that originally inspired the “zero-one-zero-one” of binary code. The Jacquard loom of 1801, which used punch cards to program the movement of thread into increasingly complex woven patterns, is a direct, though frequently forgotten, ancestor of modern computers.

This Amazon engineer made an AI-powered flap to keep his cat’s “gifts” outside

Amazon senior product manager Ben Hamm has a cat named Metric. While this adorable feline friend helped with a rat infestation problem in his apartment, he also likes to take his hunting skills out into nature, bringing… whatever home around one out of 10 nights.

To combat this situation, Hamm used an Amazon DeepLens camera to detect the cat, then examine whether or not it’s carrying something extra, based on a machine learning algorithm trained with over 23,000 images.

If the cat is carrying prey, an Arduino locks the cat out for 15 minutes, while the system texts Hamm pictures. It also gives a donation to the National Audubon Society, described by Hamm in his presentation below as “blood money.” Currently it only works with Metric, but could be generalized with more cat data if you’re having the same problem.
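
The Arduino’s job is the easy part: hold the flap shut for 15 minutes whenever the vision side signals prey. A minimal sketch, assuming a servo-driven latch and a plain digital trigger line (Hamm’s actual locking hardware isn’t detailed in the talk):

```cpp
#include <Servo.h>

// A servo-driven latch and a digital trigger line are assumptions.
Servo latch;
const int TRIGGER_PIN = 2;                 // driven HIGH by the DeepLens side
const unsigned long LOCKOUT_MS = 15UL * 60UL * 1000UL; // 15 minutes

void setup() {
  pinMode(TRIGGER_PIN, INPUT);
  latch.attach(9);
  latch.write(0);                          // flap unlocked
}

void loop() {
  if (digitalRead(TRIGGER_PIN) == HIGH) {  // prey detected
    latch.write(90);                       // swing the latch shut
    delay(LOCKOUT_MS);                     // wait out the 15 minutes
    latch.write(0);                        // let Metric back in
  }
}
```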

Blisteringly Fast Machine Learning On An Arduino Uno

Even though machine learning AKA ‘deep learning’ / ‘artificial intelligence’ has been around for several decades now, it’s only recently that computing power has become fast enough to do anything useful with the science.

However, to fully understand how a neural network (NN) works, [Dimitris Tassopoulos] has stripped the concept down to pretty much the simplest example possible – a 3-input, 1-output network – and run inference on a number of MCUs, including the humble Arduino Uno. Miraculously, the Uno processed the network with an impressively fast prediction time of 114.4 μsec!
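
Inference for a network this small boils down to a dot product and an activation function. A sketch of the idea with placeholder weights (training would happen offline, e.g. in Keras; these are not [Dimitris]’s values):

```cpp
#include <math.h>

// Placeholder weights: training happens on a PC and the resulting
// numbers get baked into the sketch.
const float weights[3] = {0.5f, -1.2f, 0.8f};
const float bias = 0.1f;

float predict(const float x[3]) {
  float sum = bias;
  for (int i = 0; i < 3; i++) sum += weights[i] * x[i]; // dot product
  return 1.0f / (1.0f + expf(-sum));                    // sigmoid activation
}

void setup() {
  Serial.begin(9600);
  const float sample[3] = {1.0f, 0.0f, 1.0f};
  unsigned long t0 = micros();
  float y = predict(sample);
  unsigned long dt = micros() - t0;        // time a single inference
  Serial.print(y, 4);
  Serial.print(" predicted in ");
  Serial.print(dt);
  Serial.println(" us");
}

void loop() {}
```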

Whilst we did not test the code on an MCU, we just happened to have Jupyter Notebook installed, so we ran the same code on a Raspberry Pi directly from [Dimitris]’s bitbucket repo.

He explains in the project pages that, now that the hype about AI has died down a bit, it’s the right time for engineers to get into the nitty-gritty of the theory and start using some of the ‘tools’ such as Keras, which have now matured into something fairly useful.

In part 2 of the project, we get to see the guts of a more complicated NN with 3 inputs, a hidden layer with 32 nodes, and 1 output, which runs on an Uno at a much slower speed of 5600 μsec per prediction.

This exploration of ML in the embedded world is NOT ‘high level’ research stuff that tends to be inaccessible and hard to understand. We have covered Machine Learning On Tiny Platforms Like Raspberry Pi And Arduino before, but not with such an easy and thoroughly practical example.

A Doom-esque Port To The ATmega328

Doom holds a special place as one of the biggest games of the 1990s, as well as being one of the foundational blocks of the FPS genre. Long before 3D accelerators hit the market, id Software’s hit was being played on computers worldwide, and later spread to all manner of other platforms. [David Ruiz] decided to build a cut-down version for everyone’s favourite, the ATmega328.

Due to the limited resources available, it’s not a direct port of Doom. [David] instead took some sprites and map data from the original game, and built a raycasting engine similar to that of Wolfenstein 3D. Despite the limited memory and CPU cycles, the basic game runs at between 8 and 11 FPS. There are fancy dithering tricks to help improve the sense of depth, a simplified enemy AI, and even a custom text library for generating the UI.
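
The raycasting trick that makes this feasible on an 8-bit chip is marching one ray per screen column through a grid map and scaling wall height by distance. A bare-bones float version for illustration (not [David]’s code, which relies on fixed-point math and proper sprite handling):

```cpp
const byte MAP_W = 8, MAP_H = 8;
const byte worldMap[MAP_H][MAP_W] = {   // 1 = wall, 0 = empty
  {1,1,1,1,1,1,1,1},
  {1,0,0,0,0,0,0,1},
  {1,0,0,1,1,0,0,1},
  {1,0,0,0,0,0,0,1},
  {1,0,1,0,0,1,0,1},
  {1,0,0,0,0,0,0,1},
  {1,0,0,0,0,0,0,1},
  {1,1,1,1,1,1,1,1},
};

// March a ray from (px, py) along (dx, dy) in small steps and return
// the distance to the first wall cell it enters.
float castRay(float px, float py, float dx, float dy) {
  float dist = 0.0f;
  while (dist < 16.0f) {
    px += dx * 0.05f; py += dy * 0.05f; dist += 0.05f;
    if (worldMap[(byte)py][(byte)px]) break;   // hit a wall
  }
  return dist;
}

void setup() {
  Serial.begin(9600);
  // One ray per "column": on-screen wall height is inversely
  // proportional to the distance the ray travelled.
  for (int col = 0; col < 16; col++) {
    float angle = -0.4f + 0.05f * col;         // narrow field of view
    float d = castRay(4.5f, 4.5f, cos(angle), sin(angle));
    Serial.println((int)(32.0f / d));          // projected wall height
  }
}

void loop() {}
```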

It’s a great example of what can be done with a seemingly underpowered part. We’ve seen similar work before, with Star Fox replicated on the Arduboy. A hacker’s ingenuity truly knows no bounds.

Hack a Day 29 Jun 12:00