Posts with «nano 33 ble sense» label

Designing a two-axis gesture-controlled platform for DSLR cameras

Holding your phone up to take an occasional picture is no big deal, but for professional photographers who often need to manipulate heavier gear for hours on end, this can actually be quite tiring. With this in mind, Cornell University students Kunpeng Huang, Xinyi Yang, and Siqi Qian designed a two-axis gesture-controlled camera platform for their ECE 4760 final project.

Their device mounts a 3.6 kg (~8 lb) DSLR camera in an acrylic turret, allowing it to look up and down (pitch) as well as left and right (yaw) under the control of two servo motors. The platform is powered by a PIC32 microcontroller, while user input comes from either a gamepad-style SparkFun Joystick Shield or an Arduino Nano 33 BLE Sense.

When in Nano mode, the setup leverages the board's IMU to move the camera along with the user’s hand gestures, while its built-in light and proximity sensors trigger the camera itself.

Our 2-DOF gesture-controlled platform can point the camera in any direction within a hemisphere based on spherical coordinates. It is capable of rotating continuously in the horizontal direction and traversing close to 180 degrees in the vertical direction. It is able to support a relatively large camera system (more than 3 kg in total weight and 40 cm in length), orient the camera accurately (error less than 3 degrees), and respond quickly to user input (traversing 180 degrees in less than 3 seconds). In addition to orienting the camera, the system also offers simple control functionality, such as allowing the user to auto-focus and take photos remotely, which is achieved through the DSLR’s peripheral connections.

At a high level, our design supports three user input modes: the first uses a joystick, while the other two use an inertial measurement unit (IMU). In the first mode, the x- and y-axes of the joystick are mapped to the camera's velocities in the yaw and pitch directions. In the second mode, the roll and pitch angles of the user's hand are mapped to the camera's velocities in the yaw and pitch directions, while the third mode maps those angles to the camera's angular position.
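To make the gesture mapping concrete, here is a small, purely illustrative C++ sketch of the second (velocity) mode; the gain and deadband values are hypothetical and this is not the students' actual PIC32 firmware:

  // Illustrative only: map hand roll/pitch angles from an IMU to yaw/pitch
  // velocities for the camera platform (hypothetical gain and deadband).
  #include <cmath>
  #include <cstdio>

  const float kGain = 2.0f;      // deg/s of camera motion per degree of hand tilt
  const float kDeadband = 5.0f;  // ignore tilts smaller than this, in degrees

  float mapAxis(float angleDeg) {
    if (std::fabs(angleDeg) < kDeadband) return 0.0f;
    return kGain * angleDeg;
  }

  int main() {
    float handRoll = 20.0f, handPitch = -8.0f;   // example IMU readings, in degrees
    float yawRate = mapAxis(handRoll);           // pan (yaw) velocity
    float pitchRate = mapAxis(handPitch);        // tilt (pitch) velocity
    std::printf("yaw %.1f deg/s, pitch %.1f deg/s\n", yawRate, pitchRate);
    return 0;
  }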

Fruit identification using Arduino and TensorFlow

By Dominic Pajak and Sandeep Mistry

Arduino is on a mission to make machine learning easy enough for anyone to use. The other week we announced the availability of TensorFlow Lite Micro in the Arduino Library Manager. With it came some cool ready-made ML examples, such as speech recognition, simple machine vision and even an end-to-end gesture recognition training tutorial. For a comprehensive background, we recommend you take a look at that article.

In this article we are going to walk through an even simpler end-to-end tutorial using the TensorFlow Lite Micro library and the Arduino Nano 33 BLE Sense’s colorimeter and proximity sensor to classify objects. To do this, we will be running a small neural network on the board itself. 

Arduino Nano 33 BLE Sense running TensorFlow Lite Micro

The philosophy of TinyML is doing more on the device with fewer resources – smaller form factors, less energy and lower-cost silicon. Running inference on the same board as the sensors has benefits in terms of privacy and battery life, and means it can be done independently of a network connection. 

The fact that we have the proximity sensor on the board means we get an instant depth reading of an object in front of the board – instead of using a camera and having to determine if an object is of interest through machine vision. 

In this tutorial, when the object is close enough we sample its color – the onboard RGB sensor can be viewed as a one-pixel color camera. While this method has limitations, it provides a quick way of classifying objects using only a small amount of resources. Note that you could indeed run a complete CNN-based vision model on-device; but as this particular Arduino board includes an onboard colorimeter, we thought it would be fun and instructive to start by demonstrating it this way.

We’ll show how a simple but complete end-to-end TinyML application can be achieved quickly, without a deep background in ML or embedded development. What we cover here is data capture, training, and classifier deployment. This is intended to be a demo, but there is scope to improve and build on it should you decide to connect an external camera down the road. We want to give you an idea of what is possible and a starting point with the tools available.

What you’ll need

About the Arduino board

The Arduino Nano 33 BLE Sense board we’re using here has an Arm Cortex-M4 microcontroller running mbedOS and a ton of onboard sensors – digital microphone, accelerometer, gyroscope, temperature, humidity, pressure, light, color and proximity. 

While tiny by cloud or mobile standards, the microcontroller is powerful enough to run TensorFlow Lite Micro models and classify sensor data from the onboard sensors.

Setting up the Arduino Create Web Editor

In this tutorial we’ll be using the Arduino Create Web Editor – a cloud-based tool for programming Arduino boards. To use it, you have to sign up for a free account and install a plugin that allows the browser to communicate with your Arduino board over a USB cable.

You can get set up quickly by following the getting started instructions which will guide you through the following:

  • Download and install the plugin
  • Sign in or sign up for a free account

(NOTE: If you prefer, you can also use the Arduino IDE desktop application. The setup for which is described in the previous tutorial.)

Capturing training data

We will now capture the data we’ll use to train our model in TensorFlow. First, choose a few different colored objects. We’ll use fruit, but you can use whatever you prefer. 

Setting up the Arduino for data capture

Next we’ll use Arduino Create to program the Arduino board with an application object_color_capture.ino that samples color data from objects you place near it. The board sends the color data as a CSV log to your desktop machine over the USB cable.
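Below is a minimal sketch of what such a capture application might look like, built on the Arduino_APDS9960 library that drives the Nano 33 BLE Sense's color and proximity sensor. It is an illustration rather than a copy of object_color_capture.ino, and the proximity and brightness checks are assumptions:

  // Illustrative capture sketch (not a verbatim copy of object_color_capture.ino).
  // Requires the Arduino_APDS9960 library for the Nano 33 BLE Sense.
  #include <Arduino_APDS9960.h>

  void setup() {
    Serial.begin(9600);
    while (!Serial) {}                     // wait for the Serial Monitor
    if (!APDS.begin()) {
      Serial.println("Error initializing APDS9960 sensor.");
      while (1) {}
    }
    Serial.println("Red,Green,Blue");      // CSV header expected by the training colab
  }

  void loop() {
    int r, g, b;

    // wait until both a proximity and a color reading are available
    while (!APDS.proximityAvailable() || !APDS.colorAvailable()) {}

    int proximity = APDS.readProximity();  // 0 = very close, 255 = far away
    APDS.readColor(r, g, b);

    // only log when an object is right up against the sensor and well lit
    if (proximity == 0 && (r + g + b) > 10) {
      float sum = r + g + b;
      Serial.print(r / sum, 3); Serial.print(',');
      Serial.print(g / sum, 3); Serial.print(',');
      Serial.println(b / sum, 3);          // normalized RGB as CSV
    }
  }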

To load the object_color_capture.ino application onto your Arduino board:

  • Connect your board to your laptop or PC with a USB cable
    • The Arduino board takes a male micro USB
  • Open object_color_capture.ino in Arduino Create by clicking this link

Your browser will open the Arduino Create web application (see GIF above).

  • Press OPEN IN WEB EDITOR
    • For existing users this button will be labeled ADD TO MY SKETCHBOOK
  • Press Upload & Save
    • This will take a minute
    • You will see the yellow light on the board flash as it is programmed
  • Open the serial Monitor
    • This opens the Monitor panel on the left-hand side of the web application
    • You will now see color data in CSV format here when objects are near the top of the board

Capturing data in CSV files for each object

For each object we want to classify we will capture some color data. By doing a quick capture with only one example per class we will not train a generalized model, but we can still get a quick proof of concept working with the objects you have to hand! 

Say, for example, we are sampling an apple:

  • Reset the board using the small white button on top.
    • Keep your finger away from the sensor, unless you want to sample it!
    • The Monitor in Arduino Create will say ‘Serial Port Unavailable’ for a minute
  • You should then see Red,Green,Blue appear at the top of the serial monitor
  • Put the front of the board to the apple. 
    • The board will only sample when it detects an object is close to the sensor and is sufficiently illuminated (turn the lights on or be near a window)
  • Move the board around the surface of the object to capture color variations
  • You will see the RGB color values appear in the serial monitor as comma separated data. 
  • Capture a few seconds of samples from the object
  • Copy and paste this log data from the Monitor to a text editor
    • Tip: untick AUTOSCROLL check box at the bottom to stop the text moving
  • Save your file as apple.csv
  • Reset the board using the small white button on top.

Do this a few more times, capturing other objects (e.g. banana.csv, orange.csv). 

NOTE: The first line of each of the .csv files should read:

Red,Green,Blue

If you don’t see it at the top, you can just copy and paste in the line above. 

Training the model

We will now use colab to train an ML model using the data you just captured in the previous section.

  • First open the FruitToEmoji Jupyter Notebook in colab
  • Follow the instructions in the colab
    • You will be uploading your *.csv files 
    • Parsing and preparing the data
    • Training a model using Keras
    • Outputting TensorFlowLite Micro model
    • Downloading this to run the classifier on the Arduino 

With that done you will have downloaded model.h to run on your Arduino board to classify objects!

The colab will guide you to drop your .csv files into the file window, the result shown above
Normalized color samples captured by the Arduino board are graphed in colab

Programming the TensorFlow Lite Micro model onto the Arduino board

Finally, we will take the model we trained in the previous stage, then compile it and upload it to our Arduino board using Arduino Create. 

Your browser will open the Arduino Create web application:

  • Press the OPEN IN WEB EDITOR button
  • Import the  model.h you downloaded from colab using Import File to Sketch: 
Import the model.h you downloaded from colab
The model.h tab should now look like this
  • Compile and upload the application to your Arduino board 
    • This will take a minute
    • When it’s done you’ll see this message in the Monitor:
  • Put your Arduino’s RGB sensor near the objects you trained it with
  • You will see the classification output in the Monitor:
Classifier output in the Arduino Create Monitor

You can also edit the object_color_classifier.ino sketch to output emojis instead (we’ve left the unicode in the comments in the code!), which you will be able to view in a macOS or Linux terminal by closing the web browser tab with Arduino Create open, resetting your board, and typing cat /dev/cu.usbmodem[n] (substituting your board’s device name). 

Output from Arduino serial to Linux terminal using ANSI highlighting and unicode emojis

Learning more

The resources around TinyML are still emerging but there’s a great opportunity to get a head start and meet experts coming up 2-3 December 2019 in Mountain View, California at the Arm IoT Dev Summit. This includes workshops from Sandeep Mistry, Arduino technical lead for on-device ML and from Google’s Pete Warden and Daniel Situnayake who literally wrote the book on TinyML. You’ll be able to hang out with these experts and more at the TinyML community sessions there too. We hope to see you there!

Conclusion

We’ve seen a quick end-to-end demo of machine learning running on Arduino. The same framework can be used to sample different sensors and train more complex models. For our object-by-color classification we could do more, sampling more examples in more conditions to help the model generalize. In future work, we may also explore how to run an on-device CNN. In the meantime, we hope this will be a fun and exciting project for you. Have fun!

Get started with machine learning on Arduino

This post was originally published by Sandeep Mistry and Dominic Pajak on the TensorFlow blog.

Arduino is on a mission to make machine learning simple enough for anyone to use. We’ve been working with the TensorFlow Lite team over the past few months and are excited to show you what we’ve been up to together: bringing TensorFlow Lite Micro to the Arduino Nano 33 BLE Sense. In this article, we’ll show you how to install and run several new TensorFlow Lite Micro examples that are now available in the Arduino Library Manager.

The first tutorial below shows you how to install a neural network on your Arduino board to recognize simple voice commands.

Example 1: Running the pre-trained micro_speech inference example.

Next, we’ll introduce a more in-depth tutorial you can use to train your own custom gesture recognition model for Arduino using TensorFlow in Colab. This material is based on a practical workshop held by Sandeep Mistry and Dan Coleman, an updated version of which is now online.

If you have previous experience with Arduino, you may be able to get these tutorials working within a couple of hours. If you’re entirely new to microcontrollers, it may take a bit longer. 

Example 2: Training your own gesture classification model.

We’re excited to share some of the first examples and tutorials, and to see what you will build from here. Let’s get started!

Note: The following projects are based on TensorFlow Lite for Microcontrollers which is currently experimental within the TensorFlow repo. This is still a new and emerging field!

Microcontrollers and TinyML

Microcontrollers, such as those used on Arduino boards, are low-cost, single-chip, self-contained computer systems. They’re the invisible computers embedded inside billions of everyday gadgets like wearables, drones, 3D printers, toys, rice cookers, smart plugs, e-scooters and washing machines. The trend to connect these devices is part of what is referred to as the Internet of Things.

Arduino is an open-source platform and community focused on making microcontroller application development accessible to everyone. The board we’re using here has an Arm Cortex-M4 microcontroller running at 64 MHz with 1 MB of flash memory and 256 KB of RAM. This is tiny in comparison to cloud, PC, or mobile platforms but reasonable by microcontroller standards.

Arduino Nano 33 BLE Sense board is smaller than a stick of gum.

There are practical reasons you might want to squeeze ML on microcontrollers, including: 

  • Function – wanting a smart device to act quickly and locally (independent of the Internet).
  • Cost – accomplishing this with simple, lower cost hardware.
  • Privacy – not wanting to share all sensor data externally.
  • Efficiency – smaller device form-factor, energy-harvesting or longer battery life.

There’s a final goal which we’re building towards that is very important:

  • Machine learning can make microcontrollers accessible to developers who don’t have a background in embedded development 

On the machine learning side, there are techniques you can use to fit neural network models into memory constrained devices like microcontrollers. One of the key steps is the quantization of the weights from floating point to 8-bit integers. This also has the effect of making inference quicker to calculate and more applicable to lower clock-rate devices. 
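As a rough illustration of what that step means, the snippet below shows affine 8-bit quantization and dequantization of a single weight; in a real TensorFlow Lite model the scale and zero-point are chosen per tensor during conversion, and the values used here are hypothetical:

  // Affine quantization: q = round(w / scale) + zero_point, clamped to the int8 range.
  #include <algorithm>
  #include <cmath>
  #include <cstdint>
  #include <cstdio>

  int8_t quantize(float w, float scale, int zero_point) {
    int q = static_cast<int>(std::lround(w / scale)) + zero_point;
    return static_cast<int8_t>(std::max(-128, std::min(127, q)));
  }

  float dequantize(int8_t q, float scale, int zero_point) {
    return (q - zero_point) * scale;   // recovers an approximation of the weight
  }

  int main() {
    const float scale = 0.02f;         // hypothetical per-tensor scale
    const int zero_point = 0;          // hypothetical zero-point
    float w = 0.731f;                  // an example floating-point weight
    int8_t q = quantize(w, scale, zero_point);
    std::printf("weight %.3f -> int8 %d -> %.3f\n", w, q, dequantize(q, scale, zero_point));
    return 0;
  }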

TinyML is an emerging field and there is still work to do – but what’s exciting is there’s a vast unexplored application space out there. Billions of microcontrollers, combined with all sorts of sensors in all sorts of places, can lead to some seriously creative and valuable TinyML applications in the future.

What you need to get started

The Arduino Nano 33 BLE Sense has a variety of onboard sensors meaning potential for some cool TinyML applications:

  • Voice – digital microphone
  • Motion – 9-axis IMU (accelerometer, gyroscope, magnetometer)
  • Environmental – temperature, humidity and pressure
  • Light – brightness, color and object proximity

Unlike the classic Arduino Uno, the board combines a microcontroller with onboard sensors, which means you can address many use cases without additional hardware or wiring. The board is also small enough to be used in end applications like wearables. As the name suggests, it has Bluetooth LE connectivity so you can send data (or inference results) to a laptop, mobile app or other BLE boards and peripherals.

Tip: Sensors on a USB stick – Connecting the BLE Sense board over USB is an easy way to capture data and add multiple sensors to single board computers without the need for additional wiring or hardware – a nice addition to a Raspberry Pi, for example.

TensorFlow Lite for Microcontrollers examples

The inference examples for TensorFlow Lite for Microcontrollers are now packaged and available through the Arduino Library manager making it possible to include and run them on Arduino in a few clicks. In this section we’ll show you how to run them. The examples are:

  • micro_speech – speech recognition using the onboard microphone
  • magic_wand – gesture recognition using the onboard IMU
  • person_detection – person detection using an external ArduCam camera

For more background on the examples you can take a look at the source in the TensorFlow repository. The models in these examples were previously trained. The tutorials below show you how to deploy and run them on an Arduino. In the next section, we’ll discuss training.

How to run the examples using Arduino Create web editor

Once you connect your Arduino Nano 33 BLE Sense to your desktop machine with a USB cable, you will be able to compile and run the following TensorFlow examples on the board using the Arduino Create web editor:

Compiling an example from the Arduino_TensorFlowLite library.

Focus on the speech recognition example: micro_speech

One of the first steps with an Arduino board is getting the LED to flash. Here, we’ll do it with a twist by using TensorFlow Lite Micro to recognise voice keywords. It has a simple vocabulary of “yes” and “no”. Remember this model is running locally on a microcontroller with only 256KB of RAM, so don’t expect commercial ‘voice assistant’ level accuracy – it has no Internet connection and on the order of 2000x less local RAM available.

Note the board can be battery powered as well. As the Arduino can be connected to motors, actuators and more, this offers the potential for voice-controlled projects.

Running the micro_speech example.

How to run the examples using the Arduino IDE

Alternatively, you can try the same inference examples using the Arduino IDE application.

First, follow the instructions in the next section Setting up the Arduino IDE.

In the Arduino IDE, you will see the examples available via the File > Examples > Arduino_TensorFlowLite menu.

Select an example and the sketch will open. To compile, upload and run the example on the board, click the arrow icon:

For advanced users who prefer a command line, there is also the arduino-cli.

Training a TensorFlow Lite Micro model for Arduino

Gesture classification on Arduino Nano 33 BLE Sense, output as emojis.

Next we will use ML to enable the Arduino board to recognise gestures. We’ll capture motion data from the Arduino Nano 33 BLE Sense board, import it into TensorFlow to train a model, and deploy the resulting classifier onto the board.

The idea for this tutorial was based on Charlie Gerard’s awesome Play Street Fighter with body movements using Arduino and Tensorflow.js. In Charlie’s example, the board streams all sensor data from the Arduino to another machine, which performs the gesture classification in Tensorflow.js. We take this further and “TinyML-ify” it by performing gesture classification on the Arduino board itself. This is made easier in our case as the Arduino Nano 33 BLE Sense board we’re using has a more powerful Arm Cortex-M4 processor and an on-board IMU.

We’ve adapted the tutorial below, so no additional hardware is needed – the sampling starts on detecting movement of the board. The original version of the tutorial adds a breadboard and a hardware button to press to trigger sampling. If you want to get into a little hardware, you can follow that version instead.

Setting up the Arduino IDE

Following the steps below sets up the Arduino IDE application used both to upload inference models to your board and to download training data from it in the next section. There are a few more steps involved than using the Arduino Create web editor because we will need to download and install the specific board and libraries in the Arduino IDE.

  • In the Arduino IDE menu select Tools > Board > Boards Manager…
    • Search for “Nano BLE” and press install on the board 
    • It will take several minutes to install
    • When it’s done close the Boards Manager window
  • Now go to the Library Manager Tools > Manage Libraries…
    • Search for and install the Arduino_TensorFlowLite library

Next search for and install the Arduino_LSM9DS1 library:

  • Finally, plug the micro USB cable into the board and your computer
  • Choose the board Tools > Board > Arduino Nano 33 BLE
  • Choose the port Tools > Port > COM5 (Arduino Nano 33 BLE) 
    • Note that the actual port name may be different on your computer

There are more detailed Getting Started and Troubleshooting guides on the Arduino site if you need help.

Streaming sensor data from the Arduino board

First, we need to capture some training data. You can capture sensor data logs from the Arduino board over the same USB cable you use to program the board with your laptop or PC.

Arduino boards run small applications (also called sketches) which are compiled from .ino format Arduino source code, and programmed onto the board using the Arduino IDE or Arduino Create. 

We’ll be using a pre-made sketch IMU_Capture.ino which does the following:

  • Monitor the board’s accelerometer and gyroscope 
  • Trigger a sample window on detecting significant linear acceleration of the board 
  • Sample for one second at 119Hz, outputting CSV format data over USB 
  • Loop back and monitor for the next gesture

The sensors we choose to read from the board, the sample rate, the trigger threshold, and whether we stream data output as CSV, JSON, binary or some other format are all customizable in the sketch running on the Arduino. There is also scope to perform signal preprocessing and filtering on the device before the data is output to the log – this we can cover in another blog. For now, you can just upload the sketch and get sampling.
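For reference, a condensed sketch along these lines might look as follows; it uses the Arduino_LSM9DS1 library and is an illustration of the idea rather than a verbatim copy of IMU_Capture.ino (the trigger threshold is an assumption):

  // Illustrative capture sketch (not a verbatim copy of IMU_Capture.ino).
  // Requires the Arduino_LSM9DS1 library for the Nano 33 BLE Sense's IMU.
  #include <Arduino_LSM9DS1.h>

  const float accelerationThreshold = 2.5;   // trigger level, in g (assumption)
  const int numSamples = 119;                // ~1 second at 119 Hz

  void setup() {
    Serial.begin(9600);
    while (!Serial) {}
    if (!IMU.begin()) {
      Serial.println("Failed to initialize IMU!");
      while (1) {}
    }
    Serial.println("aX,aY,aZ,gX,gY,gZ");     // CSV header used later in training
  }

  void loop() {
    float aX, aY, aZ, gX, gY, gZ;

    // wait for significant motion before starting a sample window
    while (true) {
      if (IMU.accelerationAvailable()) {
        IMU.readAcceleration(aX, aY, aZ);
        if (fabs(aX) + fabs(aY) + fabs(aZ) >= accelerationThreshold) break;
      }
    }

    // record one window of accelerometer + gyroscope data as CSV
    int samplesRead = 0;
    while (samplesRead < numSamples) {
      if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
        IMU.readAcceleration(aX, aY, aZ);
        IMU.readGyroscope(gX, gY, gZ);
        samplesRead++;
        Serial.print(aX); Serial.print(','); Serial.print(aY); Serial.print(',');
        Serial.print(aZ); Serial.print(','); Serial.print(gX); Serial.print(',');
        Serial.print(gY); Serial.print(','); Serial.println(gZ);
      }
    }
    Serial.println();                        // blank line between gestures
  }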

To program the board with this sketch in the Arduino IDE:

  • Download IMU_Capture.ino and open it in the Arduino IDE
  • Compile and upload it to the board with Sketch > Upload

Visualizing live sensor data log from the Arduino board

With that done, we can now visualize the data coming off the board. We’re not capturing data yet; this is just to give you a feel for how the sensor data capture is triggered and how long a sample window is. This will help when it comes to collecting training samples.

  • In the Arduino IDE, open the Serial Plotter Tools > Serial Plotter
    • If you get an error that the board is not available, reselect the port:
    • Tools > Port > portname (Arduino Nano 33 BLE) 
  • Pick up the board and practice your punch and flex gestures
    • You’ll see it only sample for a one second window, then wait for the next gesture
  • You should see a live graph of the sensor data capture (see GIF below)
Arduino IDE Serial Plotter will show a live graph of CSV data output from your board.

When you’re done be sure to close the Serial Plotter window – this is important as the next step won’t work otherwise.

Capturing gesture training data 

To capture data as a CSV log to upload to TensorFlow, you can use Arduino IDE > Tools > Serial Monitor to view the data and export it to your desktop machine:

  • Reset the board by pressing the small white button on the top
  • Pick up the board in one hand (picking it up later will trigger sampling)
  • In the Arduino IDE, open the Serial Monitor Tools > Serial Monitor
    • If you get an error that the board is not available, reselect the port:
    • Tools > Port > portname (Arduino Nano 33 BLE) 
  • Make a punch gesture with the board in your hand (Be careful whilst doing this!)
    • Make the outward punch quickly enough to trigger the capture
    • Return to a neutral position slowly so as not to trigger the capture again 
  • Repeat the gesture capture step 10 or more times to gather more data
  • Copy and paste the data from the Serial Console to a new text file called punch.csv 
  • Clear the console window output and repeat all the steps above, this time with a flex gesture in a file called flex.csv 
    • Make the inward flex fast enough to trigger capture, returning slowly each time

Note the first line of your two csv files should contain the fields aX,aY,aZ,gX,gY,gZ.

Linux and macOS tip: If you prefer, you can redirect the sensor log output from the Arduino straight to a .csv file on the command line. With the Serial Plotter / Serial Monitor windows closed, use (substituting your board's actual device name, e.g. /dev/ttyACM0 on many Linux systems):

 $ cat /dev/cu.usbmodem[nnnnn] > sensorlog.csv

Training in TensorFlow

We’re going to use Google Colab to train our machine learning model using the data we collected from the Arduino board in the previous section. Colab provides a Jupyter notebook that allows us to run our TensorFlow training in a web browser.

Arduino gesture recognition training colab.

The colab will step you through the following:

  • Set up Python environment
  • Upload the punch.csv and flex.csv data 
  • Parse and prepare the data
  • Build and train the model
  • Convert the trained model to TensorFlow Lite
  • Encode the model in an Arduino header file

The final step of the colab generates the model.h file to download and include in our Arduino IDE gesture classifier project in the next section:
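The generated header is simply the converted TensorFlow Lite model encoded as a C byte array, conceptually like the fragment below (the array name and byte values are placeholders; the real file is produced by the colab and is many kilobytes long):

  // model.h - illustrative fragment only; the real file is generated by the colab
  const unsigned char model[] = {
    0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,   // placeholder bytes ("TFL3" flatbuffer identifier)
    // ...many more bytes...
  };
  const unsigned int model_len = sizeof(model);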

Let’s open the notebook in Colab and run through the steps in the cells – arduino_tinyml_workshop.ipynb

Classifying IMU Data

Next we will use the model.h file we just trained and downloaded from Colab in the previous section in our Arduino IDE project (a condensed sketch of the inference flow follows the steps below):

  • Open IMU_Classifier.ino in the Arduino IDE.
  • Create a new tab in the IDE. When asked, name it model.h
  • Open the model.h tab and paste in the version you downloaded from Colab
  • Upload the sketch: Sketch > Upload
  • Open the Serial Monitor: Tools > Serial Monitor
  • Perform some gestures
  • The confidence of each gesture will be printed to the Serial Monitor (0 = low confidence, 1 =  high confidence)
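Condensed, the inference flow inside a sketch like IMU_Classifier.ino looks roughly like the hedged sketch below. Header paths, class names and the tensor arena size follow the TensorFlow Lite Micro Arduino library of this era and may differ in other versions; the gesture labels are placeholders that must match your training data:

  // Hedged sketch of on-device inference with TensorFlow Lite Micro
  // (not a verbatim copy of IMU_Classifier.ino).
  #include <TensorFlowLite.h>
  #include <tensorflow/lite/micro/all_ops_resolver.h>
  #include <tensorflow/lite/micro/micro_error_reporter.h>
  #include <tensorflow/lite/micro/micro_interpreter.h>
  #include <tensorflow/lite/schema/schema_generated.h>

  #include "model.h"                             // the header you pasted in from Colab

  tflite::MicroErrorReporter errorReporter;
  tflite::AllOpsResolver opsResolver;
  tflite::MicroInterpreter* interpreter = nullptr;
  TfLiteTensor* inputTensor = nullptr;
  TfLiteTensor* outputTensor = nullptr;

  constexpr int tensorArenaSize = 8 * 1024;      // working memory (size is an assumption)
  byte tensorArena[tensorArenaSize];

  const char* GESTURES[] = { "punch", "flex" };  // must match the training labels

  void setup() {
    Serial.begin(9600);
    while (!Serial) {}
    const tflite::Model* tflModel = tflite::GetModel(model);
    interpreter = new tflite::MicroInterpreter(
        tflModel, opsResolver, tensorArena, tensorArenaSize, &errorReporter);
    interpreter->AllocateTensors();
    inputTensor = interpreter->input(0);
    outputTensor = interpreter->output(0);
  }

  void loop() {
    // ...fill inputTensor->data.f[] with one normalized 119-sample window of
    //    aX,aY,aZ,gX,gY,gZ values, captured as in the previous section...

    if (interpreter->Invoke() == kTfLiteOk) {
      for (unsigned int i = 0; i < sizeof(GESTURES) / sizeof(GESTURES[0]); i++) {
        Serial.print(GESTURES[i]);
        Serial.print(": ");
        Serial.println(outputTensor->data.f[i], 6);  // confidence between 0 and 1
      }
    }
  }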

Congratulations, you’ve just trained your first ML application for Arduino!

For added fun, the Emoji_Button.ino example shows how to create a USB keyboard that prints an emoji character in Linux and macOS. Try combining the Emoji_Button.ino example with the IMU_Classifier.ino sketch to create a gesture-controlled emoji keyboard.

Conclusion

It’s an exciting time with a lot to learn and explore in TinyML. We hope this blog has given you some idea of the potential and a starting point to start applying it in your own projects. Be sure to let us know what you build and share it with the Arduino community.

For a comprehensive background on TinyML and the example applications in this article, we recommend Pete Warden and Daniel Situnayake’s new O’Reilly book “TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers.”

BLE central support added to ArduinoBLE

This post is from Sandeep Mistry, Senior Software Engineer at Arduino. 

Today, we are pleased to announce BLE (Bluetooth Low Energy) central support in v1.1.0 of the ArduinoBLE library. This major feature addition allows your Arduino board to scan for and connect to BLE peripheral devices. With one simple library, you can now use BLE to directly connect your Arduino board to:

  • A smartphone, tablet, laptop or PC 
  • BLE peripherals (e.g. TI SensorTag) – NEW!
  • Another Arduino board – NEW!

The ArduinoBLE library and new BLE central feature are supported on the following Arduino boards:

Prior to this release, Arduino only officially supported BLE peripheral functionality on these boards. A BLE peripheral is typically used to expose some sensor data or actuators to another BLE central-capable device, such as a smartphone or PC. With the new BLE central functionality, you’ll be able to wirelessly connect two boards together for communication or connect to a third-party BLE peripheral, such as a TI SensorTag.
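To give a flavor of the new API, a minimal central sketch might scan for a peripheral, connect, and write to one of its characteristics, along the lines of the example below (the UUIDs are placeholders for whatever your own peripheral advertises):

  // Minimal BLE central sketch using ArduinoBLE 1.1.0 or later.
  // The service and characteristic UUIDs below are placeholders.
  #include <ArduinoBLE.h>

  const char* serviceUuid        = "19b10000-e8f2-537e-4f6c-d104768a1214";
  const char* characteristicUuid = "19b10001-e8f2-537e-4f6c-d104768a1214";

  void setup() {
    Serial.begin(9600);
    while (!Serial) {}
    if (!BLE.begin()) {
      Serial.println("Starting BLE failed!");
      while (1) {}
    }
    BLE.scanForUuid(serviceUuid);            // start scanning for our service
  }

  void loop() {
    BLEDevice peripheral = BLE.available();  // check if a matching device was found
    if (!peripheral) return;

    BLE.stopScan();
    if (peripheral.connect() && peripheral.discoverAttributes()) {
      BLECharacteristic characteristic = peripheral.characteristic(characteristicUuid);
      if (characteristic) {
        characteristic.writeValue((byte)0x01);   // e.g. switch a remote LED on
      }
      peripheral.disconnect();
    }
    BLE.scanForUuid(serviceUuid);            // resume scanning
  }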

We think that the ArduinoBLE library is much easier to use than anything else out there and are excited to see what you build with this new capability!

The development journey

Back in 2015, the Arduino 101 was released, based on the Curie module developed by Intel. It was the first official Arduino board with on-board BLE support. The CurieBLE library initially only supported BLE peripheral mode.

After launch, the Arduino and Intel teams worked together to design an Arduino-friendly BLE API that supported both BLE peripheral and central functionality. This was released later in 2016 in v2.0 of the Arduino core for the Arduino 101.

The BLE features of the 101 were also incorporated into the CTC 101 kit in many classrooms around the world. Students used smartphones or tablets for exercises in the kit to interact with project based lessons running on the board. Unfortunately, Intel decided to stop producing the Curie module in 2017, bringing the Arduino 101 board to end of life.

Last year, at Maker Faire Bay Area 2018, Arduino launched two new boards: the MKR WiFi 1010 and Uno WiFi Rev.2. Both boards use the u-blox NINA-W102 as a 2.4 GHz wireless module. Initially both boards only supported WiFi using the WiFiNINA library. However, the ESP32 chip inside the u-blox NINA-W102 supports Bluetooth classic and BLE as well.

Later in 2018, the Arduino core team was tasked with adding BLE support to the MKR WiFi 1010 board so that it could be used with the upcoming Arduino Science Kit Physics Lab product. The Science Kit Physics Lab product is another educational kit, targeted for students in the classroom. We had several choices to move forward, including:

  • Bridging the ESP-IDF’s BLE API’s via RPC to the main MCU that sketches run on
  • Basing things on the industry standard Bluetooth HCI protocol and investing in a Bluetooth HCI host stack as an Arduino library

The first option above was expected to take about the same amount of time as the second, but it would also make the BLE library exposed to users highly dependent on the underlying firmware running on the ESP32, and it was not as portable to other chipsets in the future. Thus, ArduinoBLE was created. The NINA firmware only needed a small change to bridge its virtual Bluetooth HCI controller to the UART pins of the module.

Earlier this year, we released the Arduino Nano 33 IoT and Arduino Nano 33 BLE boards. Since the Arduino Nano 33 IoT uses the same chipset as the MKR WiFi 1010, things worked out of the box. For the Nano 33 BLE, which is based on the Nordic nRF52840 chip, a new Arduino core was developed for this board based on mbed OS (see this blog post for more info). mbed OS includes a radio stack called Cordio, which provides both a Bluetooth HCI link controller and HCI host. Creating a single C++ class that interfaced with Cordio’s Bluetooth HCI link layer allowed us to re-use 95%+ of ArduinoBLE on this board.

After the Nano 33 BLE started shipping, there was even more demand for BLE central support. So, development for the feature was scheduled, and it is now available. It combines the API designed for the Arduino 101 in CurieBLE, ported on top of ArduinoBLE’s Bluetooth HCI host stack.

Many thanks to Tom Igoe, one of the co-founders of Arduino, for providing feedback on the official Arduino BLE libraries throughout the years.

Chirp brings data-over-sound capabilities to your Arduino projects

We are excited to announce a new partnership with Chirp, a London-based company on a mission to simplify connectivity using sound. Chirp’s machine-to-machine communications software enables any device with a loudspeaker or microphone to exchange data via inaudible sound waves. 

Starting today, our Chirp integration will allow Arduino-powered projects to send and receive data wirelessly over sound waves, using just microphones and loudspeakers. Thanks to compatible libraries included in the official Arduino Library Manager and in Arduino Create, as well as comprehensive documentation, tutorials and technical support, it will be easy for anyone to add data-over-sound capabilities to their Arduino projects.

Our new Nano 33 BLE Sense board, with a DSP-optimised Arm Cortex-M4 processor, will be the first board in the Arduino range with the power to transmit and receive Chirp audio signals leveraging the board’s microphone as a receiver. From now on, the Chirp SDK for Arduino will support the following boards in send-only mode: Arduino MKR Zero, Arduino MKR Vidor 4000, Arduino MKR Fox 1200, Arduino MKR WAN 1300, Arduino MKR WiFi 1010, Arduino MKR GSM 1400, Arduino MKR NB 1500 and the Arduino Nano 33 IoT.

Creative applications of Arduino and Chirp include, but certainly are not limited to:

  • Triggering events from YouTube audio
  • Securely unlocking a smart lock with sound 
  • Sending Wi-Fi credentials to bring offline devices onto a Wi-Fi network
  • Having a remote control that only interacts with the gadgets in the same room as you

“Connectivity is a fundamental asset for our users, as the demands of IoT uptake require devices to communicate information seamlessly and with minimal impact for the end user. Chirp’s data-over-sound solution equips our boards with robust data transmission, helping us to deliver enhanced user experiences whilst increasing the capabilities of our hardware at scale,” said Massimo Banzi, Arduino co-founder.  

“Sound is prevailing as a highly effective and versatile means of seamless data transmission, presenting developers with a simple to use, software-defined solution which can connect devices. Working with Arduino to extend the integration of data-over-sound across its impressive range of boards will not only increase the reach of Chirp’s technology, but provide many more developers with an accessible and easily integrated connectivity solution to help them drive their projects forward in all purposes and environments. We can’t wait to see what the Arduino community builds,” commented James Nesfield, Chirp CEO. 

To learn how to send data with sound with an Arduino Nano 33 BLE Sense and Chirp, check out this tutorial and visit the Chirp website here.