Posts with «arduino» label

Create a voice-controlled device with Alexa and Arduino IoT Cloud in 7 minutes

We’re excited to announce the launch of the official Arduino Amazon Alexa Skill. 

You can now securely connect Alexa to your Arduino IoT Cloud projects with no additional coding required. You could use Alexa to turn on the lights in the living room, check the temperature in the bedroom, start the coffee machine, check on your plants, find out if your dog is sleeping in the doghouse… the only limit is your imagination! 

Below are some of the features that will be available:

  • Changing the color and luminosity of lights
  • Retrieving temperature readings and detecting motion activity from sensors
  • Using voice commands to trigger switches and smart plugs

Compatibility with one of the most recognized cloud-based services on the market bridges the communication gap between different applications and processes, and removes many of the tricky aspects that usually accompany wireless connectivity and communication.

Using Alexa is as simple as asking a question — just ask, and Alexa will respond instantly. 

Integrating Arduino with Alexa is as quick and easy as these four simple steps:

1. Add the Arduino IoT Cloud Smart Home skill.

2. Link your Arduino Create account with Alexa.

3. Once linked, go to the device tab in the Alexa app and start searching for devices.

4. The properties you created in the Arduino IoT Cloud now appear as devices!

Boom — you can now start voice controlling your Arduino project with Alexa!

IoT – secure connections

The launch of the Arduino IoT Cloud and Alexa integration brings easy cross-platform communication, customizable user interfaces, and reduced programming complexity. These features allow many different types of users to benefit from the service, creating anything from voice-controlled light dimmers to plant waterers.

While creating IoT applications is a lot of fun, one of the main concerns around IoT is data security. The Arduino IoT Cloud was designed with security as a priority, so our compatible boards come with an ECC508 crypto chip, ensuring that your data and connections remain secure and private to the highest standard.
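If you’re curious, you can talk to the crypto element directly. Here is a minimal sketch, assuming a board fitted with the ECC508/ECC608 chip (e.g. an MKR WiFi 1010) and the ArduinoECCX08 library, that reads the chip’s serial number and asks it for a hardware random number:

    #include <ArduinoECCX08.h>  // driver for the ECC508/ECC608 crypto element

    void setup() {
      Serial.begin(9600);
      while (!Serial) {}

      if (!ECCX08.begin()) {
        Serial.println("No crypto chip found!");
        while (1) {}
      }

      Serial.print("Crypto chip serial number: ");
      Serial.println(ECCX08.serialNumber());
      Serial.print("Hardware random number: ");
      Serial.println(ECCX08.random(65535));  // drawn from the chip's TRNG
    }

    void loop() {}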

The latest update to the Arduino IoT Cloud enables users with a Create Maker Plan subscription to use devices based on the popular ESP8266, such as the NodeMCU and ESPduino. While these devices lack a crypto chip, data in transit is still encrypted over SSL.

Getting started with this integration

In order to get started with Alexa, you need to go through a few simple steps to make things work smoothly:

  • Setting up your Arduino IoT Cloud workspace with your Arduino Create account
  • Getting an IoT Cloud compatible board
  • Installing the Arduino Alexa Skill

Setting up the Arduino IoT Cloud workspace

Getting started with the Arduino IoT Cloud is fast and easy, and by following this tutorial you will get a detailed run-through of the different functionalities and try out some of the examples! Please note that you will need an Arduino Create account and a compatible board in order to use the Arduino IoT Cloud.

Getting an IoT Cloud compatible board

The Arduino IoT Cloud currently supports the following Arduino boards: MKR 1000, MKR WiFi 1010, MKR GSM 1400 and Nano 33 IoT. You can find and purchase these boards from our store.

The following properties in the Arduino IoT Cloud can currently be used with Alexa:

  • Light
  • Dimmable light
  • Colored light
  • Smart plug
  • Smart switch
  • Contact sensor
  • Temperature sensor
  • Motion sensor

Any of these properties can be created in the Arduino IoT Cloud platform. A sketch will be generated automatically to read and set these properties.
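To give a feel for the generated code, here is a minimal sketch in the style of the auto-generated one, assuming a WiFi-equipped board and a single on/off Light property wired to the built-in LED (the credentials, property name, and callback name are placeholders):

    #include <ArduinoIoTCloud.h>
    #include <Arduino_ConnectionHandler.h>

    const char SSID[] = "your-network";   // WiFi network name (placeholder)
    const char PASS[] = "your-password";  // WiFi password (placeholder)

    CloudLight livingRoomLight;           // shows up in Alexa as a "Light" device

    WiFiConnectionHandler ArduinoIoTPreferredConnection(SSID, PASS);

    // Called whenever Alexa (or the dashboard) changes the property
    void onLivingRoomLightChange() {
      digitalWrite(LED_BUILTIN, livingRoomLight ? HIGH : LOW);
    }

    void setup() {
      pinMode(LED_BUILTIN, OUTPUT);
      // Register the property so the cloud can read and set it
      ArduinoCloud.addProperty(livingRoomLight, READWRITE, ON_CHANGE, onLivingRoomLightChange);
      ArduinoCloud.begin(ArduinoIoTPreferredConnection);
    }

    void loop() {
      ArduinoCloud.update();  // keeps properties in sync with the cloud
    }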

Installing the Arduino Alexa Skill

To install the Arduino Alexa Skill, you will need an Amazon account and the latest version of the Alexa app on a smartphone or tablet, or you can use the Amazon web application. You can find the link to the Amazon Alexa app here. Once you’re successfully logged into the app, it’s time to make the magic happen.


To integrate Alexa and Arduino IoT Cloud, you need to add the Arduino skill. Then link your Arduino Create account with Alexa. Once linked, select the device tab in the Alexa app and start discovering devices.

The smart home properties you’ve already created in the Arduino IoT Cloud now appear as devices, and you can start controlling them with the Alexa app or your voice!

For more information, please visit the Arduino Alexa Skill.

Step-by-step guide to connecting Arduino IoT Cloud with Alexa

A simple and complete step-by-step guide showing you how to connect the Arduino IoT Cloud with Alexa is available via this tutorial.

Share your creativity with us!

Community is everything for Arduino, so we would love to see what you create! Make sure you document and share your amazing projects, for example on the Arduino Project Hub, and use the #ArduinoAlexa hashtag to make them discoverable by everyone!

This New Nano Pack Has EVERYthing You Need

The Arduino Nano Every is now available in 3- and 6-packs – perfect for running a course or powering all your projects with Arduino.

For those countless creations requiring a small and easy-to-use microcontroller board, the Nano Every has the tiniest Arduino form factor out there, measuring just 45x18mm. Whether you’re working on a low-cost robotics project for the entire classroom or presenting a complex prototype with many functional blocks, this pack offers exactly what you need – a batch of Nano Every boards at a great price!

This robust little board costs as little as €7.50 each ($9.30 each) in the 6-pack, saving €0.50 ($0.60) per board versus buying singles. It’s now more affordable than ever to forecast the local weather across town by building your own little band of Gnome Weather Forecasters in your class.

If you are interested in the ARDUINO NANO EVERY – PACK, visit the Arduino online store at this link.

Rolling robot transformed into a zip lining contraption

MOREbot is an Arduino-powered educational robotic platform that’s currently available for pre-order. While the base kit is geared (literally and figuratively) towards building a small two-motor robot, MORE Technologies CEO Canon Reeves shows off how it can be reconfigured into an RC zip lining device in the video below.

The project uses the kit’s DC motors for traversing the cable, with the O-rings that normally form the tires removed so the wheels can grip the top of a paracord. Everything is controlled by an Arduino Uno and a motor shield, while a Bluetooth module provides wireless connectivity. Control is via an iPad app, which simply rotates both motors at the same time as needed.
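The MOREbot firmware isn’t published here, but the control loop is conceptually simple. A hedged sketch, assuming an Adafruit Motor Shield v1 (AFMotor library) and a serial Bluetooth module on pins 9/10, could drive both motors from single-character app commands:

    #include <AFMotor.h>          // Adafruit Motor Shield v1 library
    #include <SoftwareSerial.h>

    AF_DCMotor leftMotor(1);      // DC motor on shield port M1
    AF_DCMotor rightMotor(2);     // DC motor on shield port M2
    SoftwareSerial bt(9, 10);     // RX, TX to the Bluetooth module (assumed wiring)

    void setup() {
      bt.begin(9600);
      leftMotor.setSpeed(255);
      rightMotor.setSpeed(255);
    }

    void loop() {
      if (bt.available()) {
        char cmd = bt.read();     // single-character commands from the app
        if (cmd == 'f') {         // spin both motors to traverse the cable
          leftMotor.run(FORWARD);
          rightMotor.run(FORWARD);
        } else if (cmd == 's') {  // coast to a stop
          leftMotor.run(RELEASE);
          rightMotor.run(RELEASE);
        }
      }
    }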

Since the parts are all modular, Reeves plans to add a few other attachments, including a GoPro camera mount and perhaps even a servo that lets him drop a payload, like a water balloon, from it.


ElastImpact brings a bit more realism to VR

If you’ve ever used a VR system and thought that what was really missing was the feeling of being hit in the face, then a team of researchers at National Taiwan University may have just the solution.

ElastImpact takes the form of a head-mounted display with two impact drivers situated roughly parallel to one’s eyes for normal — straight-on — impacts, and another that rotates about the front of your face for side blows.

Each impact driver first stretches an elastic band using a gearmotor, then releases it with a micro servo when an impact is required. The system is controlled by an Arduino Mega, along with a pair of TB6612FNG motor drivers. 
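The team’s firmware isn’t available, but the charge-and-release cycle for a single impactor might look like this sketch, with hypothetical pin assignments for one TB6612FNG channel and the brake servo:

    #include <Servo.h>

    // TB6612FNG channel A pins (hypothetical wiring)
    const int PWMA = 5, AIN1 = 6, AIN2 = 7, STBY = 8;
    Servo brake;                      // micro servo acting as the mechanical brake

    const int BRAKE_LOCKED = 30;      // servo angles are placeholders
    const int BRAKE_RELEASED = 90;

    void chargeImpactor(unsigned long ms) {
      digitalWrite(STBY, HIGH);
      digitalWrite(AIN1, HIGH);       // run the gearmotor to stretch the band
      digitalWrite(AIN2, LOW);
      analogWrite(PWMA, 200);
      delay(ms);
      analogWrite(PWMA, 0);           // the brake now holds the stored tension
    }

    void fireImpactor() {
      brake.write(BRAKE_RELEASED);    // release: the band snaps, delivering impact
      delay(100);
      brake.write(BRAKE_LOCKED);
    }

    void setup() {
      pinMode(PWMA, OUTPUT); pinMode(AIN1, OUTPUT);
      pinMode(AIN2, OUTPUT); pinMode(STBY, OUTPUT);
      brake.attach(9);
      brake.write(BRAKE_LOCKED);
      chargeImpactor(1500);           // pre-tension the band at startup
    }

    void loop() {
      // fireImpactor() would be triggered when the VR scene registers a hit
    }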

Impact is a common effect in both daily life and virtual reality (VR) experiences, e.g., being punched, hit or bumped. Impact force is instantly produced, which is distinct from other force feedback, e.g., push and pull. We propose ElastImpact to provide 2.5D instant impact on a head-mounted display (HMD) for realistic and versatile VR experiences. ElastImpact consists of three impact devices, also called impactors. Each impactor blocks an elastic band with a mechanical brake using a servo motor and extending it using a DC motor to store the impact power. When releasing the brake, it provides impact instantly. Two impactors are affixed on both sides of the head and connected with the HMD to provide the normal direction impact toward the face (i.e., 0.5D in z-axis). The other impactor is connected with a proxy collider in a barrel in front of the HMD and rotated by a DC motor in the tangential plane of the face to provide 2D impact (i.e., xy-plane). By performing a just-noticeable difference (JND) study, we realize users’ impact force perception distinguishability on the heads in the normal direction and tangential plane, separately. Based on the results, we combine normal and tangential impact as 2.5D impact, and performed a VR experience study to verify that the proposed 2.5D impact significantly enhances realism.

TipText enables one-handed text entry using a fingertip keyboard

Today when you get a text, you can respond with a message via an on-screen keyboard. Looking into the future, however, how would you interact unobtrusively with a device that’s integrated into eyeglasses, contacts, or perhaps even something else?

TipText is one solution envisioned by researchers at Dartmouth College, which uses an MPR121 capacitive touch sensor wrapped around one’s index finger as a tiny 2×3 grid QWERTY keyboard.

The setup incorporates an Arduino to process inputs on the grid and propose a number of possible words on a wrist-mounted display, which the user can select by swiping right with the thumb. A new word is automatically started when the next text entry tap is received, allowing for a typing speed of around 12-13 words per minute.
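Neither the decoder nor the firmware is public, but polling the six touch zones is straightforward with Adafruit’s MPR121 library. A sketch along these lines, with a hypothetical zone-to-letter-group mapping, reports each tap:

    #include <Wire.h>
    #include <Adafruit_MPR121.h>

    Adafruit_MPR121 cap;   // capacitive touch controller on I2C

    // Hypothetical mapping of six electrodes to the 2x3 keyboard zones
    const char *ZONES[6] = {"qwe", "rty", "asd", "fgh", "zxc", "vbn"};

    void setup() {
      Serial.begin(9600);
      if (!cap.begin(0x5A)) {             // default MPR121 I2C address
        Serial.println("MPR121 not found");
        while (1) {}
      }
    }

    void loop() {
      uint16_t touched = cap.touched();   // one bit per electrode
      for (int i = 0; i < 6; i++) {
        if (touched & (1 << i)) {
          Serial.println(ZONES[i]);       // a word decoder would resolve taps to words
        }
      }
      delay(50);
    }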

Competition robot picks up (almost) all the balls

For the Warman Design and Build Competition in Sydney last month, Redditor ‘Travman_16’ and team created an excellent Arduino-powered entry. The contest involved picking up 20 payloads (AKA balls) from a trough and delivering them to a target trough several feet away in under 60 seconds.

Their autonomous project uses Mecanum wheels to move in any direction, plus a four-servo arm to collect balls in a box-like scoop made out of aluminum sheet. 

An Arduino Mega controls four DC gear motors via four IBT-4 drivers, while a Nano handles the servos. As seen in the video, it pops out of the starting area, sweeps up the balls, and places them in the correct area in an impressive ~15 seconds.
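The team’s code isn’t posted, but the mecanum mixing that lets the robot translate in any direction is standard kinematics. A sketch of it, with hypothetical pin assignments for the IBT-4 drivers:

    // Hypothetical PWM pin pairs for the four IBT-4 motor drivers
    const int RPWM[4] = {2, 3, 5, 6};    // forward PWM inputs
    const int LPWM[4] = {7, 8, 11, 12};  // reverse PWM inputs

    void setMotor(int i, float speed) {           // speed is -1..1
      speed = constrain(speed, -1.0, 1.0);
      int pwm = (int)(abs(speed) * 255);
      analogWrite(RPWM[i], speed > 0 ? pwm : 0);
      analogWrite(LPWM[i], speed < 0 ? pwm : 0);
    }

    // Classic mecanum mixing: vx strafes, vy drives, w rotates (each -1..1)
    void mecanumDrive(float vx, float vy, float w) {
      setMotor(0, vy + vx + w);   // front-left
      setMotor(1, vy - vx - w);   // front-right
      setMotor(2, vy - vx + w);   // rear-left
      setMotor(3, vy + vx - w);   // rear-right
    }

    void setup() {
      for (int i = 0; i < 4; i++) {
        pinMode(RPWM[i], OUTPUT);
        pinMode(LPWM[i], OUTPUT);
      }
    }

    void loop() {
      mecanumDrive(1.0, 0.0, 0.0);  // strafe sideways with no forward motion or spin
    }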

It managed to secure all but one ball on this run, and although that small omission was frustrating, the robot was still able to take fifth out of 19 teams.


This 3D-printed SCARA robot dispenses ball bearings

SCARA robots are often used in industrial settings to move components in the proper location. In order to demonstrate the concept to students, Nicholas Schwankl has come up with a simple unit that employs three servos and 3D-printed parts to dispense 4.5mm bearings.

The device runs on an Arduino Mega (though an Uno or other model would work) and, as seen in the video below, twists its ‘shoulder’ and ‘elbow’ joints to position its dispenser tube. Once in place, a micro servo releases a bearing, allowing the tiny steel ball to drop into an empty slot.
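The linked write-up has the actual code; for a flavor of how such an arm positions its tube, here is a hedged two-link inverse kinematics sketch with placeholder link lengths, servo pins, and mounting offsets:

    #include <Servo.h>
    #include <math.h>

    Servo shoulder, elbow;
    const float L1 = 80.0, L2 = 80.0;   // link lengths in mm (placeholders)

    // Two-link planar inverse kinematics: aim the dispenser tip at (x, y) in mm
    bool moveTo(float x, float y) {
      float c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2);
      if (c2 < -1.0 || c2 > 1.0) return false;   // target out of reach
      float t2 = acos(c2);                       // elbow angle, radians
      float t1 = atan2(y, x) - atan2(L2 * sin(t2), L1 + L2 * cos(t2));
      shoulder.write(degrees(t1) + 90);          // +90 maps the math angle to servo range
      elbow.write(degrees(t2));
      return true;
    }

    void setup() {
      shoulder.attach(9);    // placeholder pins
      elbow.attach(10);
      moveTo(100.0, 60.0);   // position the tube over a slot
    }

    void loop() {}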

STL files, a parts list, and Arduino code are available in Schwankl’s write-up.

Rock ‘n Roll With 3D-Printed Tonewheels

What can you do with ferromagnetic PLA? [TheMixedSignal] used it to give new meaning to the term ‘musicians’ gear’. He’s made a proof of concept for a DIY tone generator, which is the same revolutionary system that made the Hammond organ sing.

Whereas the Hammond has one tonewheel per note, this project uses an Arduino to drive a stepper at varying speeds to produce different notes. Like we said, it’s a proof of concept. [TheMixedSignal] is proving that tonewheels can be printed, pickups can be wound at home, and together they will produce audible frequencies. The principle is otherwise the same — the protruding teeth of the gear induce changes in the magnetic field of the pickup.
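The physics sets the motor speed: a wheel with N teeth turning at R revolutions per second induces a tone of N×R Hz in the pickup. A minimal sketch using the AccelStepper library (the tooth count and pins are placeholders) turns a target frequency into a step rate:

    #include <AccelStepper.h>

    AccelStepper stepper(AccelStepper::DRIVER, 2, 3);  // STEP pin 2, DIR pin 3

    const int TEETH = 24;            // teeth on the printed tonewheel (placeholder)
    const int STEPS_PER_REV = 200;   // a typical 1.8-degree stepper

    // Spin the wheel so the pickup sees freqHz teeth per second
    void playNote(float freqHz) {
      float revsPerSec = freqHz / TEETH;
      stepper.setSpeed(revsPerSec * STEPS_PER_REV);    // steps per second
    }

    void setup() {
      stepper.setMaxSpeed(4000);
      playNote(440.0);               // concert A
    }

    void loop() {
      stepper.runSpeed();            // step at the constant rate set above
    }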

[TheMixedSignal] fully intends to expand on this project by adding more tonewheels, trying different gear profiles, and replacing the stepper with a brushless motor. We can’t wait to hear him play “Karn Evil 9”. In the meantime, put on those cans and check out the demo/build video after the break.

We don’t have to tell you how great Hammond organs are for making music. But did you know they can also encode secret messages?

Via the Arduino blog.

Fruit identification using Arduino and TensorFlow

By Dominic Pajak and Sandeep Mistry

Arduino is on a mission to make machine learning easy enough for anyone to use. The other week we announced the availability of TensorFlow Lite Micro in the Arduino Library Manager. With this come some cool ready-made ML examples, such as speech recognition, simple machine vision, and even an end-to-end gesture recognition training tutorial. For a comprehensive background, we recommend you take a look at that article.

In this article we are going to walk through an even simpler end-to-end tutorial using the TensorFlow Lite Micro library and the Arduino Nano 33 BLE Sense’s colorimeter and proximity sensor to classify objects. To do this, we will be running a small neural network on the board itself. 

Arduino Nano 33 BLE Sense running TensorFlow Lite Micro

The philosophy of TinyML is doing more on the device with fewer resources – smaller form factors, less energy, and lower-cost silicon. Running inference on the same board as the sensors has benefits in terms of privacy and battery life, and means it can be done independently of a network connection.

The fact that we have the proximity sensor on the board means we get an instant depth reading for an object in front of the board – instead of using a camera and having to determine whether an object is of interest through machine vision.

In this tutorial, when the object is close enough we sample its color – the onboard RGB sensor can be viewed as a one-pixel color camera. While this method has limitations, it provides a quick way of classifying objects using only a small amount of resources. Note that you could indeed run a complete CNN-based vision model on-device. As this particular Arduino board includes an onboard colorimeter, we thought it’d be fun and instructive to start with this approach.

We’ll show how a simple but complete end-to-end TinyML application can be achieved quickly and without a deep background in ML or embedded development. What we cover here is data capture, training, and classifier deployment. This is intended as a demo, but there is scope to improve and build on it should you decide to connect an external camera down the road. We want to give you an idea of what is possible and a starting point with the tools available.

What you’ll need

  • An Arduino Nano 33 BLE Sense board
  • A micro USB cable
  • A desktop or laptop computer with a web browser
  • Some differently colored objects (we’ll use fruit)

About the Arduino board

The Arduino Nano 33 BLE Sense board we’re using here has an Arm Cortex-M4 microcontroller running Mbed OS and a ton of onboard sensors – digital microphone, accelerometer, gyroscope, temperature, humidity, pressure, light, color, and proximity.

While tiny by cloud or mobile standards, the microcontroller is powerful enough to run TensorFlow Lite Micro models and classify sensor data from the onboard sensors.

Setting up the Arduino Create Web Editor

In this tutorial we’ll be using the Arduino Create Web Editor – a cloud-based tool for programming Arduino boards. To use it, you sign up for a free account and install a plugin that allows the browser to communicate with your Arduino board over a USB cable.

You can get set up quickly by following the getting started instructions, which will guide you through the following:

  • Download and install the plugin
  • Sign in or sign up for a free account

(NOTE: If you prefer, you can also use the Arduino IDE desktop application; the setup for it is described in the previous tutorial.)

Capturing training data

Now we will capture data to use to train our model in TensorFlow. First, choose a few differently colored objects. We’ll use fruit, but you can use whatever you prefer.

Setting up the Arduino for data capture

Next we’ll use Arduino Create to program the Arduino board with an application, object_color_capture.ino, that samples color data from objects you place near it. The board sends the color data as a CSV log to your desktop machine over the USB cable.
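The sketch is linked in the steps below, but its heart is just the Nano 33 BLE Sense’s onboard APDS9960 sensor: read proximity, and when an object is close, print normalized color ratios as CSV. A sketch along those lines:

    #include <Arduino_APDS9960.h>   // onboard color + proximity sensor

    void setup() {
      Serial.begin(9600);
      while (!Serial) {}
      if (!APDS.begin()) {
        Serial.println("Error initializing APDS9960 sensor.");
        while (1) {}
      }
      Serial.println("Red,Green,Blue");        // CSV header expected by the notebook
    }

    void loop() {
      int r, g, b;
      if (APDS.proximityAvailable() && APDS.colorAvailable()) {
        int proximity = APDS.readProximity();  // 0 (closest) .. 255 (farthest)
        APDS.readColor(r, g, b);
        // Log only when an object is close and the scene is bright enough
        if (proximity == 0 && (r + g + b) > 10) {
          float sum = r + g + b;
          Serial.print(r / sum, 3); Serial.print(',');
          Serial.print(g / sum, 3); Serial.print(',');
          Serial.println(b / sum, 3);
        }
      }
    }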

To load the object_color_capture.ino application onto your Arduino board:

  • Connect your board to your laptop or PC with a USB cable
    • The Arduino board takes a male micro USB
  • Open object_color_capture.ino in Arduino Create by clicking this link

Your browser will open the Arduino Create web application.

  • Press OPEN IN WEB EDITOR
    • For existing users this button will be labeled ADD TO MY SKETCHBOOK
  • Press Upload & Save
    • This will take a minute
    • You will see the yellow light on the board flash as it is programmed
  • Open the Serial Monitor
    • This opens the Monitor panel on the left-hand side of the web application
    • You will now see color data in CSV format here when objects are near the top of the board

Capturing data in CSV files for each object

For each object we want to classify we will capture some color data. By doing a quick capture with only one example per class we will not train a generalized model, but we can still get a quick proof of concept working with the objects you have to hand! 

Say, for example, we are sampling an apple:

  • Reset the board using the small white button on top.
    • Keep your finger away from the sensor, unless you want to sample it!
    • The Monitor in Arduino Create will say ‘Serial Port Unavailable’ for a minute
  • You should then see Red,Green,Blue appear at the top of the serial monitor
  • Hold the front of the board up to the apple.
    • The board will only sample when it detects an object is close to the sensor and is sufficiently illuminated (turn the lights on or be near a window)
  • Move the board around the surface of the object to capture color variations
  • You will see the RGB color values appear in the serial monitor as comma separated data. 
  • Capture a few seconds of samples from the object
  • Copy and paste this log data from the Monitor to a text editor
    • Tip: untick AUTOSCROLL check box at the bottom to stop the text moving
  • Save your file as apple.csv
  • Reset the board using the small white button on top.

Do this a few more times, capturing other objects (e.g. banana.csv, orange.csv). 

NOTE: The first line of each of the .csv files should read:

Red,Green,Blue

If you don’t see it at the top, you can just copy and paste in the line above. 

Training the model

We will now use Google Colab to train an ML model using the data you captured in the previous section.

  • First open the FruitToEmoji Jupyter Notebook in Colab
  • Follow the instructions in the Colab
    • You will be uploading your *.csv files
    • Parsing and preparing the data
    • Training a model using Keras
    • Outputting a TensorFlow Lite Micro model
    • Downloading this to run the classifier on the Arduino

With that done, you will have downloaded model.h to run on your Arduino board to classify objects!

The Colab will guide you to drop your .csv files into the file window; the result is shown above.
Normalized color samples captured by the Arduino board are graphed in Colab.

Programming the TensorFlow Lite Micro model onto the Arduino board

Finally, we will take the model we trained in the previous stage and compile and upload it to our Arduino board using Arduino Create.

Your browser will open the Arduino Create web application:

  • Press the OPEN IN WEB EDITOR button
  • Import the model.h you downloaded from Colab using Import File to Sketch
  • Compile and upload the application to your Arduino board 
    • This will take a minute
    • When it’s done, you’ll see a confirmation message in the Monitor
  • Put your Arduino’s RGB sensor near the objects you trained it with
  • You will see the classification output in the Monitor:
Classifier output in the Arduino Create Monitor

You can also edit the object_color_classifier.ino sketch to output emojis instead (we’ve left the unicode in the comments in the code!), which you can view in a Mac OS X or Linux terminal by closing the web browser tab with Arduino Create in it, resetting your board, and typing cat /dev/cu.usbmodem[n].
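For a taste of how that works, printing an emoji is just a matter of sending its raw UTF-8 bytes over the serial port (the class-to-emoji mapping here is illustrative):

    // Print a fruit emoji as raw UTF-8 bytes over serial
    const char *FRUIT_EMOJI[3] = {
      "\xF0\x9F\x8D\x8E",   // apple
      "\xF0\x9F\x8D\x8C",   // banana
      "\xF0\x9F\x8D\x8A"    // orange
    };

    void setup() {
      Serial.begin(9600);
      while (!Serial) {}
      int predictedClass = 0;   // would come from the classifier
      Serial.println(FRUIT_EMOJI[predictedClass]);
    }

    void loop() {}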

Output from Arduino serial to Linux terminal using ANSI highlighting and unicode emojis

Learning more

The resources around TinyML are still emerging, but there’s a great opportunity to get a head start and meet experts on 2-3 December 2019 in Mountain View, California, at the Arm IoT Dev Summit. This includes workshops from Sandeep Mistry, Arduino technical lead for on-device ML, and from Google’s Pete Warden and Daniel Situnayake, who literally wrote the book on TinyML. You’ll be able to hang out with these experts and more at the TinyML community sessions there too. We hope to see you there!

Conclusion

We’ve seen a quick end-to-end demo of machine learning running on Arduino. The same framework can be used to sample different sensors and train more complex models. For our classification of objects by color, we could do more, sampling more examples in more conditions to help the model generalize. In future work, we may also explore how to run an on-device CNN. In the meantime, we hope this will be a fun and exciting project for you. Have fun!

Proxino takes your virtual circuit into the real world

While circuit simulation tools become more accessible all the time, at some point it’s necessary to actually build your device and test it. Proxino, developed by researchers at Dartmouth College, takes a different approach and enables you to create a circuit virtually, then test parts of it as needed with electronic components via physical proxies.

To accomplish this, Proxino hardware sits on an Arduino Uno as a shield and generates the virtual circuit’s responses to inputs. This setup allows physical elements like buzzers, lights, and sensors to complement the simulated environment, which can even be shared by remote collaborators in different locations.

Proxino certainly looks like it could be an excellent instructional tool, or perhaps more!