Posts with «raspberry pi» label

A SuperCap UPS

If you treat your Pi as a wearable or a tablet, you will already have a battery. If you treat your Pi as a desktop, you will already have a plug-in power supply, but what if you live where mains power is unreliable? Like [jwhart1], you may consider building an uninterruptible power supply into a USB cable. UPSs became a staple for office workers when one-too-many IT headaches were traced back to power outages. The idea is that a battery will keep your computer running while the power gets its legs back. Most commercial UPSs generate an AC waveform, which your computer’s power supply then converts back to DC, but if you can deliver the right DC voltage straight to the board, you skip the inverting and converting steps.

Cheap batteries develop a memory if they’re drained often, but if you have enough space, consider supercapacitors, which can take that abuse. They have a lower energy density than lithium batteries, but that should not be an issue for short power losses. According to [jwhart1], this quick-and-dirty approach will power a full-sized Pi, keyboard, and mouse for over a minute. If power is restored, you get to keep on trucking. If your power doesn’t come back, you have time to save your work and shut down. Spending an afternoon on a power cable could save a weekend’s worth of work, which is not a bad time-gamble.
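
For a rough sense of whether a handful of supercapacitors can really ride out a short outage, take the usable energy between the charged voltage and the converter’s dropout voltage and divide it by the load power. Here is a minimal back-of-envelope sketch of that arithmetic; every value is an assumption for illustration, not a figure from [jwhart1]’s build.

    # Back-of-envelope hold-up time for a supercap UPS.
    # All numbers below are illustrative assumptions, not from the project.
    C = 25.0        # F, e.g. two 50 F / 2.7 V caps in series
    v_full = 5.2    # V, charged bank voltage
    v_min = 3.0     # V, where the boost converter drops out
    eff = 0.85      # assumed boost converter efficiency
    p_load = 3.0    # W, Pi plus keyboard and mouse

    energy = 0.5 * C * (v_full**2 - v_min**2)   # joules in the usable window
    holdup = eff * energy / p_load              # seconds of ride-through
    print(f"{holdup:.0f} s of hold-up")         # roughly a minute with these values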

We see what a supercap UPS looks like, but what about one built into a lightbulb or a feature-rich programmable UPS?

Alma The Talking Dog Might Win Some Bar Bets

Students at the University of Illinois at Urbana-Champaign have built a brain-computer interface that can measure brainwaves. What did they do with it? They gave it to Alma, a golden Labrador, as you can see in the video below. The code and enough information to duplicate the electronics are on GitHub.

Of course, the dog doesn’t directly generate speech. Instead, the circuit watches her brainwaves via an Arduino and feeds the raw data to a Raspberry Pi. A machine learning algorithm determines Alma’s brainwave state and plays prerecorded audio expressing Alma’s thoughts.
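
The pipeline is simple in outline: the Arduino streams raw EEG samples over serial, the Pi classifies the current window, and a matching clip plays. Below is a minimal sketch of that loop; the serial port, the pickled classifier, and the clip names are all assumptions for illustration, not the project’s actual code (which is on GitHub).

    # Minimal sketch of the collar's Pi-side loop (assumed names, not the repo's code).
    import pickle
    import subprocess
    import numpy as np
    import serial

    port = serial.Serial("/dev/ttyACM0", 115200)      # Arduino streaming raw EEG
    clf = pickle.load(open("alma_model.pkl", "rb"))   # hypothetical trained classifier
    CLIPS = {1: "i_want_a_treat.wav"}                 # only the treat state is wired up

    window = []
    while True:
        window.append(float(port.readline()))
        if len(window) == 256:                        # one analysis window
            features = np.abs(np.fft.rfft(window))    # crude spectral features
            state = clf.predict([features])[0]
            if state in CLIPS:
                subprocess.run(["aplay", CLIPS[state]])
            window = []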

Alma’s collar duplicates — to some degree — the fictional collar from the movie Up. Of course, Dug was a bit more loquacious. It isn’t very clear from the video how many states the program classifies. A quick peek at the code reveals five audio clips but only one appears to be wired to the recognizer — the one for a treat. We think it might be a harder problem to figure out when the dog does not want a treat.

The last time we saw a talking dog collar it was phone-controlled. If you really want to probe a brain — canine or human — you could do worse than to check out OpenHardwareExG.

Oh. By the way. Good dog! Very good dog!

A Modern Solution To Tea Bag Inventory Management

Britain is famous as a land of manners and hospitality. Few situations could make an Englishman’s stiff upper lip quiver, short of running out of tea bags while entertaining house guests. Thankfully, [The Gentleman Maker] is here and living up to his name, with a helpful tea monitor to ensure you’re never caught out again.

The Intelli-T, as it has been dubbed, monitors tea inventory by weight. An Arduino Uno combined with an HX711 IC monitors a load cell mounted under a canister, with a reed switch on the lid. When the canister is opened and closed, the Arduino takes a measurement and determines whether tea stocks have dipped below critical levels. If the situation is dire, a Raspberry Pi connected over the serial port will sound an urgent warning to the occupants of the home. If there is adequate tea, the Raspberry Pi will instead offer a helpful tea fact to further educate the users about the hallowed beverage.
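
On the Pi side, the job amounts to watching the serial port for each weight report and choosing between a warning and a tea fact. A minimal sketch follows, with the serial port, threshold, and message format all assumed rather than taken from [The Gentleman Maker]’s code.

    # Minimal sketch of the Pi side of Intelli-T (assumed port, threshold and format).
    import random
    import subprocess
    import serial

    LOW_TEA_GRAMS = 50                   # assumed "critical level"
    FACTS = ["Tea reached Britain in the 1650s.",
             "Green and black tea come from the same plant."]

    port = serial.Serial("/dev/ttyUSB0", 9600)   # Arduino reports grams after each lid close
    while True:
        grams = float(port.readline())
        if grams < LOW_TEA_GRAMS:
            subprocess.run(["espeak", "Warning: tea reserves critically low!"])
        else:
            subprocess.run(["espeak", random.choice(FACTS)])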

It’s a fun project, and one that has scope for further features, given the power of the Raspberry Pi. A little more work could arrange automatic ordering of more tea online, or send alerts through a service like IFTTT. We’ve seen [The Gentleman Maker]’s uniquely British hacks before, such as the umbrella that tells you the weather. Video after the break.
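
On that IFTTT suggestion, the Webhooks service only needs a plain HTTP POST, which an applet can then turn into a phone notification or even a fresh tea order. A hedged sketch, with the event name and key as obvious placeholders:

    # Fire an IFTTT Webhooks event when tea runs low (event name and key are placeholders).
    import requests

    IFTTT_KEY = "your-webhooks-key"
    EVENT = "tea_low"

    requests.post(
        f"https://maker.ifttt.com/trigger/{EVENT}/with/key/{IFTTT_KEY}",
        json={"value1": "Only 40 g of tea left!"},
        timeout=10,
    )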


RVR is a Sphero robot for budding tinkerers

Sphero's been amusing us with its collection of robotic balls, like its adorable BB-8, for eight years. But lately the company has been getting away from the toy aspect of its products and embracing its educational potential. It's had an app that can be used to program many of its current bots for a while now, but that's only for budding coders — what do kids interested in hardware have to tinker with? Well, Sphero is about to release its first robot specifically made to be physically modded, called the RVR.

How To Time Drone Races Without Transponders

Drone racing is nifty as heck, and a need all races share is a way to track lap times. One way to do it is to use transponders attached to each racer, and use a receiver unit of some kind to clock them as they pass by. People have rolled their own transponder designs with some success, but the next step is ditching add-on transponders entirely, and that’s exactly what the Delta 5 Race Timer project does.

A sample Delta 5 Race Timer build (Source: ET Heli)

The open-sourced design has a clever approach. In drone racing, each aircraft is remotely piloted over a wireless video link. Since every drone in a race already requires a video transmitter and its own channel on which to broadcast, the idea is to use the video signal as the transponder. As a result, no external hardware needs to be added to the aircraft. The tradeoff is that using the video signal in this way is trickier than a purpose-made transponder, but the hardware to do it is economical, accessible, and the design is well documented on GitHub.

The hardware consists of RX5808 video receiver PCBs modified slightly to enable them to communicate over SPI. Each RX5808 is attached to its own Arduino, which takes care of low-level communications. The Arduinos are themselves connected to a Raspberry Pi over I2C, allowing the Pi high-level control over the receivers while it serves up a web-enabled user interface. As a bonus, the Pi can do much more than simply act as a fancy stopwatch. The races themselves can be entirely organized and run through the web interface. The system is useful enough that other projects using its framework have popped up, such as the RotorHazard project by [PropWashed] which uses the same hardware design.
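
To give a flavour of the Pi-to-Arduino side of that arrangement, each node can simply be polled over I2C for its latest signal strength and lap timestamp. This is a hedged sketch using smbus2 with made-up addresses, registers, and thresholds, not the actual Delta 5 protocol.

    # Hedged sketch of polling timing nodes over I2C (addresses, registers and the
    # threshold are made up, not the real Delta 5 protocol).
    import time
    from smbus2 import SMBus

    NODE_ADDRESSES = [0x08, 0x09, 0x0A, 0x0B]   # one Arduino per video receiver
    RSSI_REGISTER = 0x01

    with SMBus(1) as bus:                        # I2C bus 1 on the Pi header
        while True:
            for addr in NODE_ADDRESSES:
                rssi = bus.read_byte_data(addr, RSSI_REGISTER)
                if rssi > 150:                   # assumed "drone just passed the gate" level
                    print(f"lap crossing on node 0x{addr:02x} at {time.time():.3f}")
            time.sleep(0.05)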

While rolling one’s own transponders is a good solution for getting your race on, using the video transmission signal to avoid transponders entirely is super clever. The fact that it can be done with inexpensive, off-the-shelf hardware is just icing on the cake.

Machine Learning on Tiny Platforms Like Raspberry Pi and Arduino

Machine learning is starting to come online in all kinds of arenas lately, and the trend is likely to continue for the foreseeable future. What was once only available to operators of supercomputers is now within reach of anyone with a reasonably powerful desktop computer. The downsizing isn’t stopping there, though, as Microsoft is now pushing development of machine learning for embedded systems.

The Embedded Learning Library (ELL) is a set of tools for allowing Arduinos, Raspberry Pis, and the like to take advantage of machine learning algorithms despite their small size and reduced capability. Microsoft intended this library to be useful for anyone, and has examples available for things like computer vision, audio keyword recognition, and a small handful of other implementations. The library should be expandable to any application where machine learning would be beneficial for a small embedded system, though, so it’s not limited to these example applications.

There is one small speed bump to running a machine learning algorithm on your Raspberry Pi, though. The high processor load tends to cause small SoCs to overheat. But adding a heatsink and fan is something we’ve certainly seen before. Don’t let your lack of a supercomputer keep you from exploring machine learning if you see a benefit to it, and if you need more power than just one Raspberry Pi you can always build a cluster to get your task done just a little bit faster, too.

Thanks to [Baldpower] for the tip!

Arduino and Pi Share Boardspace

A Raspberry Pi Zero (W) and an Arduino are very different animals: the former has processing power and connectivity, while the latter has analog-to-digital converters (ADCs) and near-real-time responsiveness. You can connect them to one another with a USB cable, and for many projects that will happily wed the two. Beyond that, we can interface this odd couple entirely through serial, SPI, I2C, and logic-level signaling. How? Through a device by [cburgess] that is being called an Arduino shield that supports a Pi Zero (W). Maybe it is a cape which interfaces with an Arduino. The distinction may be moot, since each board has a familiar footprint and both of them are found here.

Depending on how they are set up and programmed, one can take control of the other, or they could happily do their own thing and just exchange a little information. This board is like a marriage counselor between a Raspberry Pi and an Arduino: it provides the level-shifting so they don’t blow each other up, and libraries so they can speak nicely to one another. If you want to dig a bit deeper into this one, design files and code examples are available.
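
Once the shield handles the level-shifting, the actual conversation between the two boards can be as plain as newline-terminated serial messages. Here is a minimal sketch of the Pi’s end, with the port name and message format assumed rather than taken from [cburgess]’s libraries.

    # Minimal sketch of the Pi talking to the on-board Arduino over the shield's serial
    # link (port and message format are assumptions, not [cburgess]'s protocol).
    import serial

    link = serial.Serial("/dev/ttyS0", 115200, timeout=1)   # Pi Zero UART pins

    link.write(b"READ_A0\n")            # ask the Arduino for an ADC reading
    reply = link.readline().strip()     # e.g. b"512"
    if reply:
        volts = int(reply) * 5.0 / 1023
        print(f"A0 = {volts:.2f} V")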

Perhaps we’ll report on this board at the heart of a pinball machine retrofit, a vintage vending machine restoration, or maybe a working prop replica from the retro bar in Back to the Future II.

A Star-Trek-Inspired Robot With Raspberry Pi and AI

When [314Reactor] got a robot car kit, he knew he wanted to add some extra things to it. At about the same time, he was watching a Star Trek episode that featured exocomps, robots that worked in dangerous areas. He decided to use those fictional devices to inspire his modifications to the car kit. Granted, the fictional robots were intelligent and had a replicator, so you know he wasn’t going to make an actual working replica. But then again, the ones on the TV show didn’t have all that either.

A Raspberry Pi runs TensorFlow using the standard camera. This lets it identify objects of interest (assuming it gets them right) and send the image back to the operator along with some identifying information. The kit already had an Arduino onboard, and the new robot talks to it via a serial port. You can see a video about the project below.
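
The recognition step itself doesn’t need much glue code. Here is a hedged sketch of classifying a single camera frame using a stock MobileNetV2 model, which is an assumption for illustration rather than whatever [314Reactor]’s build actually runs.

    # Hedged sketch: classify one camera frame with a stock ImageNet model
    # (MobileNetV2 is an assumption, not necessarily the project's model).
    import cv2
    import numpy as np
    from tensorflow.keras.applications.mobilenet_v2 import (
        MobileNetV2, preprocess_input, decode_predictions)

    model = MobileNetV2(weights="imagenet")
    cap = cv2.VideoCapture(0)                    # Pi camera exposed via V4L2
    ok, frame = cap.read()
    cap.release()
    if ok:
        rgb = cv2.resize(frame, (224, 224))[:, :, ::-1].astype("float32")  # BGR -> RGB
        preds = model.predict(preprocess_input(rgb[np.newaxis]))
        _, name, score = decode_predictions(preds, top=1)[0][0]
        print(f"I think I see a {name} ({score:.0%})")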

The design is complicated a bit by the fact that the original kit uses a Bluetooth adapter to send and receive serial commands from a mobile device. However, the controller software that comes with the kit allows for extra buttons, so the Arduino can receive commands and relay them to the Pi.

The code for the robot — known as Scorpion — is available on GitHub. The extra commands relate to the camera and also some servos that move pincers to mimic the TV robot. Images return to the operator via the Telegram cloud service.
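
Pushing a captured frame to the operator over Telegram is likewise a single HTTP call to the Bot API; the token, chat ID, and file name below are placeholders.

    # Send a captured frame to the operator via the Telegram Bot API
    # (bot token, chat ID and file name are placeholders).
    import requests

    BOT_TOKEN = "123456:ABC-your-bot-token"
    CHAT_ID = "987654321"

    with open("detection.jpg", "rb") as photo:
        requests.post(
            f"https://api.telegram.org/bot{BOT_TOKEN}/sendPhoto",
            data={"chat_id": CHAT_ID, "caption": "I think I see a Labrador retriever"},
            files={"photo": photo},
            timeout=30,
        )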

We have to admit, the Scorpion isn’t quite the same as an exocomp. But we can see the influence on the design. It wasn’t smart enough to identify itself in the mirror, so we don’t think it achieved sentience.

We’ve seen smart robots using TensorFlow before. If you prefer, you can always try OpenCV.

Artistic Images Made With Water Lens

It’s said that beauty and art can be found anywhere, as long as you look for it. The latest art project from [dmitry] both looks in unassuming places for that beauty, and projects what it sees for everyone to view. Like most of his projects, it’s able to produce its artwork in a very unconventional way. This particular project uses water as a lens, and by heating and cooling the water it produces a changing image.

The art installation uses a Peltier cooler to periodically freeze the water that’s being used as a lens. When light is projected through the frozen water onto a screen, the heat from the light melts the water and changes the projected image. The machine uses an Arduino and a Raspberry Pi to control the Peltier cooler and to move the lens onto the cooler to be frozen. Once frozen, the lens is moved back into the path of the light to project an image through it.
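
The control loop behind the installation is essentially a slow freeze-move-melt cycle. Below is a hedged sketch of how the Pi side might sequence it with RPi.GPIO; the pin numbers, timings, and the servo-style lens carriage are all invented for illustration, not [dmitry]’s actual control code.

    # Hedged sketch of the freeze/project cycle (pins, timings and the servo-style lens
    # carriage are assumptions, not the installation's actual control code).
    import time
    import RPi.GPIO as GPIO

    PELTIER_PIN = 17          # drives a MOSFET/relay for the Peltier cooler
    CARRIAGE_PIN = 18         # PWM for a hobby servo moving the lens

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(PELTIER_PIN, GPIO.OUT)
    GPIO.setup(CARRIAGE_PIN, GPIO.OUT)
    carriage = GPIO.PWM(CARRIAGE_PIN, 50)    # 50 Hz servo signal
    carriage.start(5)                        # park the lens over the cooler

    try:
        while True:
            GPIO.output(PELTIER_PIN, GPIO.HIGH)   # freeze the water lens
            time.sleep(300)
            GPIO.output(PELTIER_PIN, GPIO.LOW)
            carriage.ChangeDutyCycle(10)          # swing the lens into the light path
            time.sleep(600)                       # let the projection melt it
            carriage.ChangeDutyCycle(5)           # back over the cooler
            time.sleep(2)
    finally:
        carriage.stop()
        GPIO.cleanup()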

[dmitry] intended the project to be a take on the cyclical nature of a substance from one state to another, and this is a very creative and interesting way of going about it. Of course, [dmitry]’s work always exhibits the same high build quality and interesting perspective, like his recent project which created music from the core samples of the deepest hole ever drilled.
