Posts with «video hacks» label

Big Spinning Disk Makes a Small Color Video Display

Believe it or not, the Mickey Mouse clip used for this demonstration is actually in the public domain.

The earliest televisions used a spinning disk technology called the Nipkow disk, which is exactly what [Science ‘n’ Stuff] recreated with their Arduino-based mechanical color television (video link, also embedded below). The device reads video and audio from an SD card, and displays the video using a precisely-timed RGB LED visible through a perforated spinning disk. The persistence of vision effect results in a video that is small relative to the size of the disk, but perfectly watchable. The twist? The video is in color!

A Nipkow disk is a fairly simple electromechanical device that relies on precise timing; something a modern microcontroller and an RGB LED are perfectly capable of delivering. In this device, the holes in the disk create 32 vertical scanlines, with 96 “pixels” making up each of those lines. Spinning disk displays were traditionally monochrome, but in this implementation each “pixel” gets its own color by adjusting the RGB LED accordingly.
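To make the timing concrete, here is a minimal Arduino-style sketch of the scheme: measure the disk’s rotation period from a once-per-revolution sync pulse, divide it into 32 × 96 pixel slots, and recolor a single RGB LED in each slot. This is our own illustration, not [Science ‘n’ Stuff]’s firmware; the sync sensor, pin choices, and the readLineFromSd() helper are all assumptions, and at these pixel rates a real build needs a faster LED drive than the stock analogWrite() PWM shown here.

```cpp
// Illustrative sketch of the Nipkow timing idea (not the project's firmware).
// Assumes a once-per-revolution sync pulse on pin 2, e.g. from a photo-interrupter.
const int SYNC_PIN = 2;
const int R_PIN = 9, G_PIN = 10, B_PIN = 11;
const int LINES = 32, PIXELS = 96;

uint8_t lineBuf[PIXELS][3];                // one scanline of RGB values at a time
volatile unsigned long revMicros = 40000;  // measured microseconds per revolution

void readLineFromSd(int line, uint8_t buf[][3]) {
  // Hypothetical placeholder: the real build streams video data off the SD card.
}

void onSync() {                            // re-measure rotation speed every pass
  static unsigned long last = 0;
  unsigned long now = micros();
  revMicros = now - last;
  last = now;
}

void setup() {
  pinMode(SYNC_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(SYNC_PIN), onSync, FALLING);
}

void loop() {
  unsigned long pixelMicros = revMicros / ((unsigned long)LINES * PIXELS);
  for (int line = 0; line < LINES; line++) {
    readLineFromSd(line, lineBuf);
    for (int px = 0; px < PIXELS; px++) {
      analogWrite(R_PIN, lineBuf[px][0]);  // recolor the one LED for whichever
      analogWrite(G_PIN, lineBuf[px][1]);  // hole is sweeping past right now
      analogWrite(B_PIN, lineBuf[px][2]);
      delayMicroseconds(pixelMicros);
    }
  }
}
```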

The first video shows off the device and demonstrates it working; note that while it may look like there are multiple little screens, the center one can be thought of as the “true” display, with the others essentially being artifacts of light leakage. If you’re curious about the nuts and bolts of exactly how a Nipkow disk works, the second video is the one for you: it goes through all the details of how everything functions.

Another neat thing about Nipkow disks is that image acquisition is really not much more complex than image display.

[via Arduino Blog]

 

Eyecam is Watching You in Between Blinks

We will be the first to admit that it’s often hard to be productive while working from home, especially if no one’s ever really looking over your shoulder. Well, here is one creepy way to feel as though someone is keeping an eye on you, if that’s what gets you to straighten up and fly right. The Eyecam research project by [Marc Teyssier] et al. is a realistic, motorized eyeball that includes a camera and hangs out on top of your computer monitor. It aims to spark conversation about the sensors that are all around us already in various cold and clinical forms. It’s an open source project with a paper and a repo, and a how-to video in the works.

The eyebrow-raising design pulls no punches in the uncanny department: the eye behaves as you’d expect (if you could have expected this) — it blinks, looks around, and can even waggle its brow. The eyeball, brow, and eyelids are actuated by a total of six servos that are controlled by an Arduino Nano.
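As an aside, a convincing blink takes surprisingly little code. Here is a minimal sketch along the lines of what the Nano might run; it is purely our own illustration, with pin numbers, angles, and timings guessed rather than taken from the Eyecam firmware.

```cpp
// Illustrative blink routine for two eyelid servos (pins and angles are guesses,
// not values from the Eyecam firmware).
#include <Servo.h>

Servo upperLid, lowerLid;

void setup() {
  upperLid.attach(5);              // hypothetical pin assignments
  lowerLid.attach(6);
  upperLid.write(20);              // lids open
  lowerLid.write(160);
}

void blink() {
  upperLid.write(90);              // close both lids toward the middle...
  lowerLid.write(90);
  delay(120);                      // ...hold briefly...
  upperLid.write(20);              // ...and reopen
  lowerLid.write(160);
}

void loop() {
  blink();
  delay(2000 + random(4000));      // blink at irregular, human-ish intervals
}
```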

Inside the eyeball is a Raspberry Pi camera connected to a Raspberry Pi Zero for the webcam portion of this intriguing horror show. Keep an eye out after the break for the Eyecam infomercial.

Creepy or fascinating, it succeeds in making people think about the vast amount of sensors around us now, and what the future of them could look like. Would mimicking eye contact be an improvement over the standard black and gray oblong eye? Perhaps a pair of eyes would be less unsettling, we’re not really sure. But we are left to wonder what’s next, a microphone that looks like an ear? Probably. Will it have hair sprouting from it? Perhaps.

Yeah, it’s true; two eyes are more on the mesmerizing side, but still creepy, especially when they follow you around the room and can shoot frickin’ laser beams.

Thanks for the tip, [Sven, greg, and Itay]!

Learn Multirotors From First Principles

Multirotors, or drones as they’re popularly called, are so ubiquitous as to have become a $10 toy. They’re no less fun to fly for it though, and learning how they work is no less fascinating. It’s something [Science Buddies] has addressed in a series of videos examining them from first principles. They may be aimed at youngsters, but they’re still an entertaining enough watch for those of advancing years.

Instead of starting with a multirotor control board, the video takes four little DC motors and two popsicle sticks to make a rudimentary drone frame. Then, with the help of dowels and springs, it tethers the craft as the control mechanisms are explained bit by bit: from simple on-off motor control, through proportional control, to adding an Arduino, and finally to how a multirotor stays in flight. It’s instructional and fun to watch, and maybe even a chance for some of us to learn something.
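The jump from on-off to proportional control is the heart of the lesson, and it fits in a few lines of Arduino code. The sketch below is our own illustration of the concept rather than [Science Buddies]’ code: the altitude reading on A0, the target value, and the gain are all stand-ins.

```cpp
// Minimal sketch of the on-off vs. proportional idea from the videos (our own
// illustration). Assumes some altitude reading on A0 and the motors driven
// together from one PWM pin through a transistor.
const int MOTOR_PIN = 9;
const int TARGET = 512;      // desired sensor reading (hypothetical units)
const float KP = 0.8;        // proportional gain, tuned by trial and error

void setup() { pinMode(MOTOR_PIN, OUTPUT); }

void loop() {
  int height = analogRead(A0);

  // On-off ("bang-bang") control: full power below target, none above.
  // int power = (height < TARGET) ? 255 : 0;

  // Proportional control: power scales with distance from the target,
  // so the craft settles instead of oscillating hard.
  int power = constrain((int)(KP * (TARGET - height)), 0, 255);

  analogWrite(MOTOR_PIN, power);
  delay(10);
}
```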

We’ve had multirotor projects aplenty here over the years, but how about something completely different made from popsicle sticks?


How To Time Drone Races Without Transponders

Drone racing is nifty as heck, and a need shared by every race is a way to track lap times. One way to do it is to attach a transponder to each racer and use a receiver unit of some kind to clock them as they pass by. People have rolled their own transponder designs with some success, but the next step is ditching add-on transponders entirely, and that’s exactly what the Delta 5 Race Timer project does.

A sample Delta 5 Race Timer build (Source: ET Heli)

The open-source design has a clever approach. In drone racing, each aircraft is remotely piloted over a wireless video link. Since every drone in a race already requires a video transmitter and its own channel on which to broadcast, the idea is to use the video signal itself as the transponder. As a result, no external hardware needs to be added to the aircraft. The tradeoff is that using the video signal this way is trickier than a purpose-made transponder, but the hardware to do it is economical and accessible, and the design is well documented on GitHub.

The hardware consists of RX5808 video receiver modules modified slightly to enable them to communicate over SPI. Each RX5808 is attached to its own Arduino, which takes care of low-level communications. The Arduinos are themselves connected to a Raspberry Pi over I2C, allowing the Pi high-level control over the receivers while it serves up a web-enabled user interface. As a bonus, the Pi can do much more than simply act as a fancy stopwatch. The races themselves can be entirely organized and run through the web interface. The system is useful enough that other projects using its framework have popped up, such as the RotorHazard project by [PropWashed], which uses the same hardware design.
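The core trick is watching each receiver’s RSSI line, which spikes as the matching drone flies past the timing gate. A bare-bones Arduino sketch of that idea might look like the following; the RSSI pin, threshold, and minimum-lap lockout are invented for the example, and the real Delta 5 firmware adds proper calibration and filtering on top.

```cpp
// Bare-bones RSSI lap detection (illustrative; the Delta 5 firmware on GitHub
// is far more thorough). The RX5808's RSSI output is an analog voltage that
// peaks as its drone passes the gate.
const int RSSI_PIN = A0;
const int THRESHOLD = 300;              // crossing level, calibrated per venue
const unsigned long MIN_LAP_MS = 5000;  // ignore re-triggers within one pass

unsigned long lastLap = 0;
bool overThreshold = false;

void setup() { Serial.begin(115200); }

void loop() {
  int rssi = analogRead(RSSI_PIN);

  if (!overThreshold && rssi > THRESHOLD) {
    overThreshold = true;                    // drone entering the gate zone
    unsigned long now = millis();
    if (now - lastLap > MIN_LAP_MS) {
      if (lastLap > 0) {                     // skip the very first crossing
        Serial.print("Lap time (s): ");      // in the real system this goes
        Serial.println((now - lastLap) / 1000.0, 2);  // upstream to the Pi
      }
      lastLap = now;
    }
  } else if (overThreshold && rssi < THRESHOLD - 20) {
    overThreshold = false;                   // hysteresis so noise doesn't double-count
  }
}
```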

While rolling one’s own transponders is a good solution for getting your race on, using the video transmission signal to avoid transponders entirely is super clever. The fact that it can be done with inexpensive, off-the-shelf hardware is just icing on the cake.

RGB Sensor’s New Job: Cryptocurrency Trade Advisor

[XenonJohn] dabbles in cryptocurrency trading, and when he saw an opportunity to buy an RGB color sensor, his immediate thought — which he admitted to us would probably not be the immediate thought of most normal people — was that he could point it at his laptop screen and have it analyze the ratio of green (buy) orders to red (sell) orders being made for crypto trading. In theory, if at a given moment there are more people looking to buy than there are people looking to sell, the value of a commodity could be expected to go up slightly in the short term. The reverse is true if a lot of sell orders are coming in relative to buy orders. Having this information and possibly acting on it could be useful; then again, it might not. Either way, as far as out-of-left-field project ideas go, promoting an RGB color sensor to Cryptocurrency Trading Advisor is a pretty good one.

Since the RGB sensor only sees what is directly in front of it, [XenonJohn] assembled a sort of simple light guide. By enclosing the area of the screen that contains orders in foil-lined cardboard, the sensor can get a general approximation of the amount of red (sell orders) versus green (buy orders). The data gets read by an Arduino which does a simple analysis and sends alerts when a threshold is crossed. He dubbed it the Crypto-Eye, and a video demo is embedded below.
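The read-and-compare loop itself can be very simple. The sketch below is our own take on the idea, assuming a common TCS34725 breakout (we don’t know exactly which sensor [XenonJohn] scored) and alert thresholds picked purely for illustration.

```cpp
// Illustrative take on the Crypto-Eye logic, assuming a common TCS34725 RGB
// sensor breakout; the exact sensor and thresholds in the real build may differ.
#include <Wire.h>
#include "Adafruit_TCS34725.h"

Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_4X);

void setup() {
  Serial.begin(9600);
  if (!tcs.begin()) {
    Serial.println("No TCS34725 found");
    while (1);
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);

  // Ratio of green (buy) to red (sell) light coming off the order book
  float ratio = (r > 0) ? (float)g / r : 1.0;

  if (ratio > 1.3)      Serial.println("ALERT: buyers dominating");   // thresholds are
  else if (ratio < 0.7) Serial.println("ALERT: sellers dominating");  // illustrative guesses
  delay(1000);
}
```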

Could this have been done purely in software? Certainly, but there’s a certain charm to the Crypto-Eye being a standalone device that uses a simple visual input to make buy and sell predictions like a Speak & Spell.

Inventive crypto trading is just a side project for [XenonJohn]; he’s better known around these parts for his outstanding contributions to one-wheeled electric vehicles, like this 3000W Electric Unicycle, which also happens to feature an Arduino with ’80s-style voice feedback, just like the Crypto-Eye.

Arduino Video isn’t Quite 4K

Video resolution is always on the rise. The days of 640×480 video have given way to 720, 1080, and even 4K resolutions. There’s no end in sight. However, you need a lot of horsepower to process that many pixels. What if you have a small robot powered by a microcontroller (perhaps an Arduino) and you want it to have vision? You can’t realistically process HD video, or even low-grade video, with a small processor. CORTEX systems has an open source solution: a 7-pixel camera with an I2C interface.

The files for SNAIL Vision include a bill of materials and the PCB layout. There’s software for the Vishay sensors used, and provisions for mounting a lens holder to the PCB using glue. The design is fairly simple: in addition to the array of sensors, there’s an I2C multiplexer (which also acts as a level shifter) and a handful of resistors and connectors.
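Reading the whole “image” is then just a matter of stepping the multiplexer through its channels and polling each sensor in turn, something like the hypothetical sketch below. The mux and sensor addresses and the read format are stand-ins; see the SNAIL Vision repo for the real parts and protocol.

```cpp
// Hypothetical sketch of reading a seven-pixel array through an I2C multiplexer.
// Addresses and the read format are stand-ins, not the SNAIL Vision protocol.
#include <Wire.h>

const uint8_t MUX_ADDR = 0x70;     // TCA9548A-style mux address (assumption)
const uint8_t SENSOR_ADDR = 0x10;  // placeholder sensor address

void selectChannel(uint8_t ch) {   // point the mux at one of the seven sensors
  Wire.beginTransmission(MUX_ADDR);
  Wire.write(1 << ch);
  Wire.endTransmission();
}

uint16_t readPixel() {             // placeholder: fetch a 16-bit light level
  Wire.requestFrom(SENSOR_ADDR, (uint8_t)2);
  uint16_t v = Wire.read();
  return (v << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Serial.begin(9600);
}

void loop() {
  for (uint8_t px = 0; px < 7; px++) {   // scan all seven "pixels"
    selectChannel(px);
    Serial.print(readPixel());
    Serial.print(px < 6 ? '\t' : '\n');
  }
  delay(100);
}
```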

Is seven pixels enough to be useful? We don’t know, but we’d love to see some examples of using the SNAIL Vision board, or other low-resolution optical sensors with low-end microcontrollers. This seems like a cheaper mechanism than Pixy. If seven pixels are too much, you could always try one.

Thanks [Paul] for the tip.


Filed under: Arduino Hacks, video hacks

DIY Motion Control Camera Rig Produces Money-Shots On A Budget

Motion control photography allows for stunning imagery, although commercial robotic MoCo rigs are hardly affordable. But what is money? Scratch-built from what used to be mechatronic junk and a hacked Canon EF-S lens, [Howard’s] DIY motion control camera rig produces cinematic footage that just blows us away.

[Howard] started this project about a year ago by carrying out some targeted experiments. These would not only assess the suitability of the components he had gathered together from all directions, but also his own ability to pick up enough knowledge of mechatronics to make the whole thing work. After getting himself accustomed to stepper motors, Teensies, and Arduinos, he converted an old moving-head disco light into a pan and tilt mount for the camera. A linear axis was added, and with more degrees of freedom, more sophisticated means of control became necessary.

Using the Swift programming language, [Howard] wrote a host program that automatically detects the numerous stepper and servo motor based axes and streams position data to their individual Teensy LC based controllers. To a professional motion graphics artist, these shots aren’t just nice, steady footage: the real magic happens when he starts adding perfectly matched layers of CGI. To that end, he also wrote some Python scripts that let him manually control his MoCo rig from a virtual rig in Blender, and also export camera trajectories directly from his 3D scenes.

On top of the 4-axis camera mount and a rotary stage, [Howard] also needed to find an electronic follow-focus mechanism to keep the now moving objects in focus. Since the Canon EF-S protocol had already been reverse engineered, he decided to tap into the SPI control bus between the camera and the lens to make use of its internal ring motor. Although the piezo motors in autofocus lenses aren’t actually built for absolute positioning, a series of tests revealed that a Canon EF-S 17-55mm IS USM lens can be refocused a few hundred times and still return to its starting position close enough. The caveat: [Howard] had to hack open the £600 lens and drill holes in it. In retrospect, he tells us, it’s a miracle that his wife didn’t leave him during the project.

After several iterations of mechanical improvements, the motion control rig is now finished, and the first clips have already been recorded and edited. They’re stunning. Only the 6-axis robot arm hiding in [Howard’s] basement tells us that he’s just warming up for the real game. Enjoy the video below, but don’t miss out on the full 3-part video documentary on how this project came to be.


Filed under: digital cameras hacks, video hacks

The Most Immersive Pinball Machine: Project Supernova

Over at [Truthlabs], a 30-year-old pinball machine was diagnosed with a major flaw in its game design: it could only entertain one person at a time. [Dan] and his colleagues set out to change this, transforming the ol’ pinball legend “Firepower” into a spectacular, immersive gaming experience worthy of the 21st century.

A major limitation they wanted to overcome was screen size. A projector mounted to the ceiling turns the entire wall behind the machine into a massive 15-foot playfield for anyone in the room to enjoy.

 

With so much space to fill, the team developed a visual concept tailored to blend seamlessly with the original storyline of the arcade classic, studying the machine’s artwork and digging deep into the sci-fi archives. They then translated their ideas into 3D graphics using Cinema4D and WebGL along with the usual designer’s toolbox. Lasers and explosions were added, ready to be triggered by game interactions on the machine.

To hook the augmentation into the pinball machine’s own game progress, they came up with an elegant solution, incorporating OpenCV and OCR, to read all five of the machine’s 7-segment displays with a single webcam. An Arduino inside the machine taps into the numerous mechanical switches and indicator lamps, keeping a Node.js server updated about pressed buttons, hits, the “Lane Change”, and plunged balls.
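The display-reading end of that pipeline can be surprisingly compact. Here is a rough C++/OpenCV sketch of the general approach for a single digit: crop a region of the webcam frame, threshold it, and probe where each of the seven segments should be. The ROI coordinates and segment geometry are placeholders, not numbers from the [Truthlabs] code.

```cpp
// Rough sketch of reading one 7-segment digit from a webcam with OpenCV
// (our illustration of the approach; coordinates are placeholders).
#include <opencv2/opencv.hpp>
#include <bitset>
#include <iostream>

int main() {
    cv::VideoCapture cap(0);
    if (!cap.isOpened()) return 1;

    // Placeholder region where one digit of one display appears in the frame
    cv::Rect digitRoi(100, 200, 40, 60);

    // A small probe rectangle inside the digit for each of the seven segments
    std::vector<cv::Rect> segs = {
        {10, 2, 20, 6},  {2, 8, 6, 20},   {32, 8, 6, 20},   // top, top-left, top-right
        {10, 27, 20, 6}, {2, 34, 6, 20},  {32, 34, 6, 20},  // middle, bot-left, bot-right
        {10, 52, 20, 6}                                     // bottom
    };

    cv::Mat frame, gray, bin;
    while (cap.read(frame)) {
        cv::cvtColor(frame(digitRoi), gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, bin, 128, 255, cv::THRESH_BINARY);

        int pattern = 0;
        for (size_t i = 0; i < segs.size(); i++)    // a segment is "lit" when its
            if (cv::mean(bin(segs[i]))[0] > 127)    // probe area is mostly bright
                pattern |= 1 << i;

        std::cout << "segment pattern: " << std::bitset<7>(pattern) << "\n";
    }
    return 0;
}
```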

The result is the impressive demonstration of both passion and skill you can see in the video below. We really like the custom shader effects. How could we ever play pinball without them?

 


Filed under: classic hacks, video hacks

Spit Out VGA with Non-Programmable Logic Chips

It’s not uncommon to bitbang a protocol with a microcontroller in a pinch. I2C is frequently crunched from scratch, same with simple serial protocols, occasionally complex systems like Ethernet, and a whole host of other communication standards. But VGA gets pretty tricky because of the timing requirements, so it’s less common to bitbang. [Sven] completely threw caution to the wind. He didn’t just bitbang VGA on an Arduino, but he went one step further and configured an array of 7400 logic chips to output a VGA signal.

[Sven]’s project is in two parts. In part one, he discusses choosing a resolution and setting up the timing signal. He proceeds to output a simple(-ish) VGA signal that can be displayed on a monitor using a single gate. At that point only a red image was displayed, but getting signal lock from the monitor is a great proof of concept and [Sven] moved on to more intricate display tricks.
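For a sense of what “setting up the timing signal” involves, here are the standard VESA numbers for the common 640×480 at 60 Hz mode. We don’t know which mode [Sven] ultimately settled on, but every VGA mode shares the same visible/front-porch/sync/back-porch structure, and it maps neatly onto counters and comparators built from 7400-series parts.

```c
/* Classic 640x480 @ 60 Hz VGA timing (standard VESA numbers).
   Pixel clock: 25.175 MHz, i.e. 800 x 525 ticks per frame at 60 Hz. */

#define H_VISIBLE 640  /* active pixels per line */
#define H_FRONT    16  /* front porch            */
#define H_SYNC     96  /* hsync pulse width      */
#define H_BACK     48  /* back porch             */
#define H_TOTAL   800  /* 640+16+96+48           */

#define V_VISIBLE 480  /* active lines per frame */
#define V_FRONT    10
#define V_SYNC      2
#define V_BACK     33
#define V_TOTAL   525  /* 480+10+2+33            */

/* HSYNC is asserted while the horizontal pixel counter is in [656, 752),
   which is exactly the kind of range a chain of 74xx counters and a few
   gates can decode with no software at all. */
```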

With the next iteration of the project [Sven] talks about adding in more circuitry to handle things like frame counting, geometry, and color. The graphics that are displayed were planned out in a simulator first, then used to design the 7400 chip configuration for that particular graphic display. It made us chuckle that [Sven] reports his monitor managed to survive this latest project!

We don’t remember seeing non-programmable integrated circuits used for VGA generation before. But bitbanging the signal on an Arduino or from an SD card slot is a great test of your ability to calculate and implement precise timings with an embedded system. Give it a try!


Filed under: video hacks

Arduino Video Over 2 Wires for Under $50: Mesa-Video

If you want video support on your project, you might start from a device like a Raspberry Pi that comes with it built in. [Kevinhub88] doesn’t accept such compromises, so he and his Black Mesa Labs have come up with a whole new way to add video support to devices like the Arduino and other cheap controllers. This project is called Mesa-Video, and it can add digital video at a resolution of up to 800 by 600 pixels to any device that has a single serial output.

The video is created by an FT813, a low cost GPU from FTDI that offers a surprising amount of video oomph from a cheap, low power chip (he has demoed it running from a lemon battery), and he is hoping to be able to sell the Mesa-Video for under $50.

However, Mesa-Video is just the beginning. [Kevinhub88] wanted to get around the problem of stacking shields on Arduinos: add more than one and you get problems. He wanted to create an interface that would be simpler, faster, and more open, so he created the Mesa-Bus. This effectively wraps SPI and I2C traffic together over a simple, fast serial connection that doesn’t require much decoding. This means that you can send power and bi-directional data over a handful of wires, and still connect multiple devices at once, swapping them out as required. You could, for instance, do your development work on a PC talking to the prototype devices over Mesa-Bus, then swap the PC out for an Arduino once you have the first version working in your dev environment. Is the Arduino not cutting it? Because Mesa-Bus is cross-platform and open source, it is easy to swap the Arduino for a Raspberry Pi without having to change your other devices. And, because all the data is going over a simple serial connection in plain text, it is easy to debug.
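To see why plain-text serial is so pleasant to debug, consider a toy frame like the one below. The framing here is invented for the example (the real Mesa-Bus encoding is documented by Black Mesa Labs), but the appeal is the same: everything on the wire is printable hex you can watch in any serial terminal.

```cpp
// Hypothetical illustration of the "bus traffic as plain serial text" idea.
// This framing is made up for the example; the real Mesa-Bus encoding is
// documented by Black Mesa Labs.

void sendMesaFrame(uint8_t slot, const char* payloadHex) {
  Serial.print("F0");          // made-up preamble
  Serial.print(slot, HEX);     // which downstream device the frame is for
  Serial.print(payloadHex);    // the wrapped SPI/I2C payload, as hex text
  Serial.print("\n");          // newline-delimited frames are easy to eyeball
}

void setup() {
  Serial.begin(921600);        // fast UART link; plain text stays debuggable
  sendMesaFrame(0x1, "DEADBEEF");
}

void loop() {}
```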

It’s an ambitious project, and [Kevinhub88] has a way to go: he is currently working on getting his first prototype Mesa-Bus devices up and running, and finalizing the design of the Mesa-Video. But it is an impressive start, and we’ll be keeping a close eye on this work. Hopefully he can avoid that headcrab problem as well, because those things are as itchy as hell.


Filed under: Arduino Hacks, video hacks