Posts with «machine learning» label

Machine Learning on Tiny Platforms Like Raspberry Pi and Arduino

Machine learning is starting to come online in all kinds of arenas lately, and the trend is likely to continue for the foreseeable future. What was once available only to operators of supercomputers is now within reach of anyone with a reasonably powerful desktop computer. The downsizing isn’t stopping there, though, as Microsoft is now pushing development of machine learning for embedded systems.

The Embedded Learning Library (ELL) is a set of tools that lets Arduinos, Raspberry Pis, and the like take advantage of machine learning algorithms despite their small size and reduced capability. Microsoft intends the library to be useful for anyone, and provides examples for things like computer vision and audio keyword recognition, among a handful of others. The library should be expandable to any application where machine learning would benefit a small embedded system, though, so it’s not limited to those examples.
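For a sense of what that looks like in practice, here is a minimal sketch of invoking a compiled ELL classifier from Python on a Raspberry Pi. The generated module name and its predict() call are assumptions based on ELL’s published tutorials rather than a verbatim copy of the API, and the input size is purely illustrative.

```python
# Minimal sketch: run a compiled ELL image classifier on a Raspberry Pi.
# Assumes the ELL toolchain has already compiled a model into a local Python
# wrapper named "model" that exposes predict(); names are illustrative.
import cv2          # pip install opencv-python
import numpy as np
import model        # wrapper generated by the ELL compiler (assumed name)

def classify_frame(frame, input_size=(64, 64)):
    """Resize a camera frame and run it through the compiled model."""
    resized = cv2.resize(frame, input_size)
    input_data = resized.astype(np.float32).ravel()
    predictions = model.predict(input_data)   # assumed ELL wrapper call
    return int(np.argmax(predictions))

if __name__ == "__main__":
    camera = cv2.VideoCapture(0)
    ok, frame = camera.read()
    if ok:
        print("Predicted class index:", classify_frame(frame))
    camera.release()
```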

There is one small speed bump to running a machine learning algorithm on your Raspberry Pi, though: the high processor load tends to make small SoCs overheat. But adding a heatsink and fan is something we’ve certainly seen before. Don’t let your lack of a supercomputer keep you from exploring machine learning if you see a benefit to it, and if you need more power than a single Raspberry Pi can offer, you can always build a cluster to get your task done just a little bit faster, too.

Thanks to [Baldpower] for the tip!

Kinetic Sculpture Achieves Balance Through Machine Learning

We all know how important it is to achieve balance in life, or at least so the self-help industry tells us. How exactly to achieve balance is generally left as an exercise to the individual, however, with varying results. But what about our machines? Will there come a day when artificial intelligences and their robotic bodies become so stressed that they too will search for an elusive and ill-defined sense of balance?

We kid, but only a little; who knows what the future field of machine psychology will discover? Until then, this kinetic sculpture that achieves literal balance might hold lessons for human and machine alike. Dubbed In Medio Stat Virtus, or “In the middle stands virtue,” [Astrid Kraniger]’s kinetic sculpture explores how a simple system can find a stable equilibrium with machine learning. The task seems easy: keep a ball centered on a track suspended by two cables. The length of the cables is varied by stepper motors, while the position of the ball is detected by the difference in weight between the two cables using load cells scavenged from luggage scales. The motors raise and lower each side to even out the forces on each, eventually achieving balance.

The twist here is that rather than a simple PID loop or another control algorithm, [Astrid] chose to apply machine learning to the problem using the Q-Behave library. The system detects when the difference between the two weights is decreasing and “rewards” the algorithm so that it learns what is required of it. The result is a system that gently settles into equilibrium. Check out the video below; it’s strangely soothing.
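The original code isn’t shown, but the reward scheme is simple enough to sketch. Below is a generic tabular Q-learning loop in Python (not the Q-Behave library itself) that rewards the agent whenever the imbalance between the two load-cell readings shrinks; the toy physics and the state/action discretization are invented purely for illustration.

```python
# Generic tabular Q-learning sketch of the balancing idea (illustrative only,
# not the Q-Behave library used in the actual sculpture).
import random
from collections import defaultdict

ACTIONS = ["raise_left", "raise_right", "hold"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = defaultdict(float)  # maps (state, action) -> estimated value

def choose_action(state):
    """Epsilon-greedy policy over the discretized imbalance state."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])

def step(imbalance, action):
    """Toy stand-in for the sculpture: raising one side shifts the ball."""
    if action == "raise_left":
        imbalance -= 1
    elif action == "raise_right":
        imbalance += 1
    return imbalance

imbalance = 5  # discretized difference between the two load-cell readings
for _ in range(2000):
    state = imbalance
    action = choose_action(state)
    new_imbalance = step(imbalance, action)
    # Reward the move when the weight difference shrinks, as described above.
    reward = 1.0 if abs(new_imbalance) < abs(imbalance) else -1.0
    best_next = max(q_table[(new_imbalance, a)] for a in ACTIONS)
    q_table[(state, action)] += ALPHA * (
        reward + GAMMA * best_next - q_table[(state, action)]
    )
    imbalance = new_imbalance

print("Learned action at imbalance +3:", choose_action(3))
```

After enough iterations the table converges on actions that drive the imbalance toward zero, which is essentially the gentle settling into equilibrium seen in the video.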

We’ve seen self-balancing systems before, from ball-balancing Stewart platforms to Segway-like two-wheel balancers. One wonders if machine learning could be applied to these systems as well.

Redeem Your Irresponsible 90s Self

If you were a youth in the 90s, odds are good that you were part of the virtual pet fad and had your very own beeping Tamagotchi to take care of, much to the chagrin of your parents. Without the appropriate amount of attention each day, the pets could become sick or die, and the only way to prevent this was to sneak the toy into class and hope it didn’t make too much noise. A more responsible solution to this problem would have been to build something to take care of your virtual pet for you.

An art installation in Moscow is using an Arduino to take care of five Tamagotchis simultaneously in a virtual farm of sorts. The system is directly wired to all five toys to simulate button presses, and behaves ideally to make sure all the digital animals are properly cared for. Although no source code is provided, it seems to have some sort of machine learning capability that lets it best care for all five pets at the same time. The system also prints out their statuses on a thermal printer, so you can check up on the history of all of the animals.

The popularity of these toys has led to a lot of in-depth investigation of what really goes on inside them, as well as plenty of modifications to the original units and their software. You can get a complete ROM dump of one, build a giant one, or even take care of an infinite number of them. Who would have thought a passing fad would have so much hackability?

Nvidia Jetson TX1 Cat Spotter and Laser Teaser

The Jetson TX1 Cat Spotter uses advanced neural networking to recognize when there's a cat in the room — and then starts teasing it with a laser.

Read more on MAKE

Sorting cucumbers using AI, Raspberry Pi + Arduino

When it comes to farming veggies like cucumbers, the sorting process can often be just as hard and tricky as actually growing them. That’s why Makoto Koike is using Google’s TensorFlow machine learning technology to categorize the cucumbers on his family’s farm by size, shape and color, enabling them to focus on more important and less tedious work.

A camera-equipped Raspberry Pi 3 is used to take images of the cucumbers and send them to a small-scale TensorFlow neural network. The pictures are then forwarded to a larger network running on a Linux server for a more detailed classification. From there, the results are fed to an Arduino Micro, which controls the conveyor belt system that handles the actual sorting, dropping each cucumber into its respective container.
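The write-up doesn’t include source, but the division of labor is clear: capture on the Pi, classify on the server, then tell the Arduino which bin to use. A rough sketch of the Pi-side glue might look like the following; the server endpoint, grade labels, and serial commands are all assumptions made for illustration, not details from the original project.

```python
# Sketch of the Raspberry Pi side of a cucumber-sorting pipeline (illustrative;
# endpoint URL, grade labels, and serial protocol are assumed, not taken from
# the original project).
import requests              # send the photo to the larger TensorFlow model
import serial                # talk to the Arduino Micro driving the conveyor
from picamera import PiCamera
from time import sleep

SERVER_URL = "http://192.168.0.10:8000/classify"   # assumed Linux server endpoint
GRADE_TO_BIN = {"L": b"1", "M": b"2", "S": b"3"}    # assumed grade -> bin command

camera = PiCamera(resolution=(640, 480))
arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=2)

def sort_one_cucumber():
    camera.capture("/tmp/cucumber.jpg")
    with open("/tmp/cucumber.jpg", "rb") as photo:
        # The heavier network on the server returns a size/shape/colour grade.
        response = requests.post(SERVER_URL, files={"image": photo}, timeout=10)
    grade = response.json().get("grade", "M")
    # Tell the Arduino Micro which bin the conveyor should drop this one into.
    arduino.write(GRADE_TO_BIN.get(grade, b"2"))

if __name__ == "__main__":
    while True:
        sort_one_cucumber()
        sleep(1)
```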

You can read all about the Google AI project here, as well as see it in action below!

Machine learning for the maker community

At Arduino Day, I talked about a project my collaborators and I have been working on to bring machine learning to the maker community. Machine learning is a technique for teaching software to recognize patterns using data, e.g. for recognizing spam emails or recommending related products. Our ESP (Example-based Sensor Predictions) software recognizes patterns in real-time sensor data, like gestures made with an accelerometer or sounds recorded by a microphone. The machine learning algorithms that power this pattern recognition are specified in Arduino-like code, while the recording and tuning of example sensor data is done in an interactive graphical interface. We’re working on building up a library of code examples for different applications so that Arduino users can easily apply machine learning to a broad range of problems.
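ESP itself builds on GRT and uses its own Arduino-like syntax, so the snippet below is not ESP code; it is a rough scikit-learn sketch (a library mentioned further down) of the same example-based idea: record a few labeled accelerometer windows, extract simple features, and predict the gesture for new data.

```python
# Example-based gesture classification sketch using scikit-learn (not the ESP
# or GRT APIs); the accelerometer windows and labels are synthetic placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def features(window):
    """Collapse a (samples x 3) accelerometer window into a small feature vector."""
    window = np.asarray(window, dtype=float)
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Pretend we recorded two example windows per gesture in the GUI.
rng = np.random.default_rng(0)
shake = [rng.normal(0, 2.0, size=(50, 3)) for _ in range(2)]   # high variance
tilt  = [rng.normal(1, 0.1, size=(50, 3)) for _ in range(2)]   # steady offset

X = np.array([features(w) for w in shake + tilt])
y = ["shake", "shake", "tilt", "tilt"]

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# Classify a new, unseen window of sensor data.
new_window = rng.normal(1, 0.1, size=(50, 3))
print("Predicted gesture:", clf.predict([features(new_window)])[0])
```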

The project is a part of my research at the University of California, Berkeley and is being done in collaboration with Ben Zhang, Audrey Leung, and my advisor Björn Hartmann. We’re building on the Gesture Recognition Toolkit (GRT) and openFrameworks. The software is still rough (and Mac-only for now), but we’d welcome your feedback. Installation instructions are on our GitHub project page. Please report issues on GitHub.

Our project is part of a broader wave of projects aimed at helping electronics hobbyists make more sophisticated use of sensors in their interactive projects. Also building on the GRT is ml-lib, a machine learning toolkit for Max and Pure Data. Another project in a similar vein is the Wekinator, which is featured in a free online course on machine learning for musicians and artists. Rebecca Fiebrink, the creator of Wekinator, recently participated in a panel on machine learning in the arts and taught a workshop (with Phoenix Perry) at Resonate ’16. For non-real-time applications, many people use scikit-learn, a set of Python tools. There’s also a wide range of related research from the academic community, which we survey on our project wiki.

For a high-level overview, check out this visual introduction to machine learning. For a thorough introduction, there are courses on machine learning from Coursera and Udacity, among others. If you’re interested in a more arts- and design-focused approach, check out alt-AI, happening in NYC next month.

If you’d like to start experimenting with machine learning and sensors, an excellent place to get started is the built-in accelerometer and gyroscope on the Arduino or Genuino 101. With our ESP system, you can use these sensors to detect gestures and incorporate them into your interactive projects!

My virtual robot project

Concept:

read more

Math machine learning

Whenever I observe my daughter, I find that she likes to collect things. The more, the better; usually as much as she can carry. Since she cannot count yet, how can she distinguish between more things and fewer things?

[Image: My daughter with objects in both hands]

read more