Tag: TinyML

  • Training embedded audio classifiers for the Nicla Voice on synthetic datasets


    Reading Time: 2 minutes

    Gathering enough data to classify distinct sounds not captured in a larger, more robust dataset can be very time-consuming. In his write-up, Shakhizat Nurgaliyev describes how he used an array of AI tools to automatically create a keyword spotting dataset without ever needing to speak into a microphone.

    The pipeline is split into three main parts. First, the Piper text-to-speech engine was downloaded and configured via a Python script to output 904 distinct samples of the TTS model saying Nurgaliyev’s last name in a variety of ways to decrease overfitting. Next, background noise prompts were generated with the help of ChatGPT and then fed into AudioLDM, which produced the audio files based on the prompts. Finally, all of the WAV files, along with “unknown” sounds from the Google Speech Commands Dataset, were uploaded to an Arduino ML project.
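    The Piper step can be sketched as a small helper that fans one target word out into many CLI invocations by varying the synthesis parameters. The flag names follow Piper's command-line interface, but the voice model path, the parameter grid, and the output layout are illustrative assumptions, not Nurgaliyev's actual script.

```python
import itertools
import shlex

def build_piper_commands(word, out_dir="samples",
                         length_scales=(0.9, 1.0, 1.1),
                         noise_scales=(0.333, 0.667)):
    """Build one Piper CLI invocation per parameter combination.

    --length_scale and --noise_scale vary speaking rate and variability;
    the model filename is a placeholder for whichever voice you downloaded.
    """
    commands = []
    for i, (ls, ns) in enumerate(itertools.product(length_scales, noise_scales)):
        out = f"{out_dir}/{word}_{i:04d}.wav"
        commands.append(
            f"echo {shlex.quote(word)} | piper "
            f"--model en_US-voice.onnx "
            f"--length_scale {ls} --noise_scale {ns} "
            f"--output_file {out}"
        )
    return commands

# Hypothetical target word; a 3x2 grid yields 6 commands here.
cmds = build_piper_commands("nurgaliyev")
```

    Widening the grid (more scales, or whatever other synthesis parameters the engine exposes) is how a handful of knobs multiplies into the hundreds of distinct samples mentioned above.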

    Training the model for later deployment on a Nicla Voice board was accomplished by adding a Syntiant audio processing block and then generating features to train a classification model. The resulting model could accurately determine when the target word was spoken around 96% of the time — all without the need for manually gathering a dataset.

    Video: https://www.youtube.com/watch?v=4ike-duV0G8

    To read more about this project, you can check out Nurgaliyev’s detailed write-up on Hackster.io.

    The post Training embedded audio classifiers for the Nicla Voice on synthetic datasets appeared first on Arduino Blog.


  • Fall detection system with Nicla Sense ME


    Reading Time: 4 minutes

    The challenge

    Personal safety is a growing concern in a variety of settings: from high-risk jobs where HSE managers must guarantee workers’ security to the increasingly common work and study choices that drive family and friends far apart, sometimes leading to more isolated lives. In all of these situations, having a system capable of sensing and automatically contacting help in case of emergency can not only give people peace of mind, but save lives.

    A particularly interesting case – as the world population ages – regards the increasing number of elderly people who are still healthy enough to be independent yet must also accept the fact their bodies are becoming weaker and their bones more fragile. This specific target is more prone to falls, which can result in fractures, head injuries, and other serious accidents that can severely impact the quality of life. Detecting falls early can allow for prompt medical attention and prevent serious consequences. Additionally, detecting falls can help identify underlying health issues or environmental factors that may be contributing to accidents, allowing for appropriate interventions to be put in place to avoid future falls.

    A variety of person-down systems and fall detection methods exist, ranging from threshold-based algorithms to traditional machine learning applications. The biggest challenge they all share is that they suffer from high false-positive rates. In other words, they cause unnecessary alarm and distress to both the seniors and their caregivers, resulting in unwarranted actions.

    Our solution

    A tiny but mighty deployment device: Nicla Sense ME

    For its project, Aizip selected the Nicla Sense ME: a compact module integrating multiple cutting-edge sensors from Bosch Sensortec, enabling sensor fusion applications directly at the edge. Additionally, the module houses an Arm® Cortex®-M4 microcontroller (nRF52832) leveraging Bluetooth® 4.2. Aizip’s neural network model fits right in with the remaining resources of the microcontroller, thanks to its compact footprint. The result? A small and lightweight device that can be clipped onto one’s belt and worn all day without hassle, able to monitor health parameters and immediately alert assistance in case of a fall, with near-zero latency and full respect for privacy.

    A more accurate fall detection algorithm

    Aizip’s fall detection solution integrates a neural network algorithm with sensor fusion to greatly enhance detection accuracy, while remaining lightweight enough to run in real time on a microcontroller. The neural network within the microcontroller continuously processes sensor readings from the accelerometer (BHI260AP) and the pressure sensor (BMP390). Upon detecting a fall, the device sends an alarm via Bluetooth and activates an on-board LED. In order to minimize frequent false alarms that could significantly affect user experience, the neural network is optimized to differentiate real falls from abrupt movements such as jumping, sprinting, and quickly sitting down. The neural network-based algorithm excels at capturing subtle features in inputs, leading to a substantial reduction in false alarm rates compared to threshold-based approaches or traditional machine learning algorithms.
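    As a rough illustration of what "continuously processes sensor readings" can look like, the sketch below windows fused accelerometer and pressure samples into a few per-window features a small network might consume. The window length and feature choices are assumptions for illustration; Aizip's actual pipeline is proprietary.

```python
import math

def window_features(accel, pressure, win=50):
    """Slide fixed-size, non-overlapping windows over fused accelerometer
    (x, y, z) tuples and pressure samples, emitting simple per-window
    features: impact spread, mean magnitude, and pressure (altitude) change.
    """
    feats = []
    for start in range(0, len(accel) - win + 1, win):
        a = accel[start:start + win]
        p = pressure[start:start + win]
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in a]
        feats.append((
            max(mags) - min(mags),  # large spread suggests an impact
            sum(mags) / win,        # mean acceleration magnitude
            p[-1] - p[0],           # pressure delta hints at a height change
        ))
    return feats
```

    A classifier consuming features like these, rather than a single raw threshold, is what lets the approach separate a genuine fall from jumping or quickly sitting down.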

    Typical neural networks offer superior performance but pose additional challenges when deployed on resource-constrained microcontrollers, due to the extensive computing and memory resources required. The simultaneous need for Bluetooth connectivity and sensor fusion further compounds this issue. However, Aizip’s proprietary, efficient neural network architecture makes this solution stand out because it minimizes resource requirements while maintaining high accuracy. The neural network is quantized to 8-bit and deployed onto the microcontroller using Aizip’s automated design tool. The implemented model achieves a 94% fall detection accuracy and a <0.1% false positive rate, all while using less than 3KB of RAM. A perfect fit for the low-consumption Nicla Sense ME!

    Solving it with Arduino Pro

    Now let’s explore how we could put all of this together and what we would need for deployment both in terms of hardware and software stack. The Arduino Pro ecosystem is the latest generation of Arduino solutions bringing users the simplicity of integration and scalable, secure, professionally supported services.

    Hardware requirements

    • Arduino Nicla Sense ME
    • Single-cell 3.7 V Li-Po or Li-Ion battery
    • Jumper wires (for connecting the board and the battery)

    Software requirements

    Conclusion

    When personal safety is a concern, smart wearables that leverage AI can help. And processing the data required to monitor health conditions and prevent falls doesn’t have to come at the expense of comfort or privacy. Thanks to extremely efficient models like Aizip’s and compact yet high-performance modules like Arduino Pro’s Nicla Sense ME, you can create a discreet and reliable solution able to immediately call for help when needed (and only when needed).


  • This GIGA R1 WiFi-powered wearable detects falls using a Transformer model


    Reading Time: 2 minutes

    For those aged 65 and over, falls can be one of the most serious health concerns, whether due to reduced mobility or declining overall coordination. Recognizing this issue, Naveen Kumar set out to produce a wearable device that aims to increase the speed at which falls are detected by utilizing a Transformer-based model rather than a more traditional recurrent neural network (RNN) model.

    Because this project needed to be both fast and consume only small amounts of current, Kumar went with the new Arduino GIGA R1 WiFi due to its STM32H747XI dual-core Arm CPU, onboard WiFi/Bluetooth®, and ability to interface with a wide variety of sensors. After connecting an ADXL345 three-axis accelerometer, he realized that collecting many hours of samples by hand would be far too time-consuming, so instead, he downloaded the SisFall dataset, ran a Python script to parse the sample data into an Edge Impulse-compatible format, and then uploaded the resulting JSON files into a new project. Once completed, he used the API to split each sample into four-second segments and then used the Keras block edit feature to build a reduced-sized Transformer model.
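    The parsing step can be pictured as wrapping each SisFall recording in Edge Impulse's data-acquisition JSON layout. The field names below follow that format's documented structure, but the device label and interval are placeholder assumptions, and this is a sketch rather than Kumar's actual script.

```python
import json

def sisfall_to_ei(rows, interval_ms=5):
    """Wrap raw (x, y, z) accelerometer rows in the Edge Impulse
    data-acquisition JSON layout so the file can be uploaded to a project.
    'protected'/'signature'/'payload' follow EI's documented envelope;
    an unsigned file uses alg 'none' with a zeroed signature."""
    return json.dumps({
        "protected": {"ver": "v1", "alg": "none"},
        "signature": "0" * 64,
        "payload": {
            "device_type": "SISFALL_IMPORT",   # placeholder label
            "interval_ms": interval_ms,
            "sensors": [{"name": n, "units": "m/s2"}
                        for n in ("accX", "accY", "accZ")],
            "values": [list(r) for r in rows],
        },
    })
```

    One such file per recording, uploaded in bulk, gives the Studio everything it needs to then segment the samples via the API.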

    The result after training was a 202KB model that could accurately determine if a fall occurred 96% of the time. Deployment was then as simple as using the Arduino library feature within a sketch to run an inference and display the result via an LED, though future iterations could leverage the GIGA R1 WiFi’s connectivity to send out alert notifications if an accident is detected. More information can be found here in Kumar’s write-up.

    Video: https://www.youtube.com/watch?v=wPJF7lJrIWw


  • Detect a crying baby with tinyML and synthetic data


    Reading Time: 2 minutes

    When a baby cries, it is almost always due to something that is wrong, which could include, among other things, hunger, thirst, stomach pain, or too much noise. In his project, Nurgaliyev Shakhizat demonstrated how he was able to leverage ML tools to build a cry-detection system without the need for collecting real-world data himself.

    The process is as follows: ChatGPT generates a series of text prompts that all involve a crying baby in some manner. These prompts are then passed to AudioLDM which creates sounds according to the prompts. Finally, Shakhizat used the Arduino Cloud’s Machine Learning Tools integration, powered by Edge Impulse, to train a tinyML model for deployment onto an Arduino Nicla Voice board. To create the sounds themselves, Shakhizat configured a virtual Python environment with the audioldm package installed. His script takes the list of prompts, executes them within an AudioLDM CLI command, and saves the generated sound data as a WAV file.
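    The prompt-to-WAV step might look like the following helper, which turns each ChatGPT-generated prompt into one AudioLDM CLI invocation. The -t/--save_path flags mirror the audioldm package's command-line interface, while the output directory and example prompts are placeholders.

```python
import shlex

def audioldm_commands(prompts, out_dir="dataset/cry"):
    """Build one AudioLDM CLI call per text prompt, saving each generated
    clip under out_dir. Prompts are shell-quoted so multi-word descriptions
    survive intact."""
    return [
        f"audioldm -t {shlex.quote(p)} --save_path {out_dir}"
        for p in prompts
    ]

# Hypothetical prompts of the kind ChatGPT might produce:
cmds = audioldm_commands([
    "a newborn baby crying loudly in a quiet room",
    "soft household background noise with a humming fridge",
])
```

    Running each command (e.g. via subprocess) and labeling the resulting WAV files by prompt category yields a synthetic dataset ready for upload.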

    Once this process was done, he configured a project in the Edge Impulse Studio which trains a classifier model. The result after training completed was a model that could accurately distinguish between background noise and a crying baby 90% of the time, and deploying it onto the Arduino Nicla Voice demonstrated how effective synthetic datasets and embedded models can be in the real world.

    To read more, you can check out Shakhizat’s write-up here on Hackster.io.

    Video: https://www.youtube.com/watch?v=6Qe1PPLstW8


  • Cave exploration made safer with the Nicla Sense ME-powered Sajac Project


    Reading Time: 2 minutes

    The art of cave exploration, or spelunking, can get its practitioners far closer to nature and the land we inhabit, but it also comes with a host of potential dangers. Some include extreme environmental conditions, lack of oxygen/toxic gases, and simply having their path closed off due to rock falls. Seeing these problems, Rifqi Abdillah decided to create the Sajac Project based on the Nicla Sense ME attached to a K-Way jacket with the aim of assisting cavers.

    Because the Nicla Sense ME contains a combination of motion, pressure, and gas sensors onboard, Abdillah used it to gather raw data about the wearer’s surroundings by continuously taking readings and then transmitting the values over BLE to a mobile device. Each sensor fusion sample was then added to the Edge Impulse Studio and labeled with either “safe,” “bad,” or “danger” depending on how harmful the conditions would be. Finally, a Keras classification model was trained and deployed back to the Nicla as an Arduino library, which is used in conjunction with an OLED screen to show the classification result.

    With the model now reporting the sensor readings and whether they indicate safe or unsafe conditions, Abdillah went one step further and developed an app to display them in real time on a Seeed Studio Wio Terminal. Built in MIT’s App Inventor, it allows the user to select the current status as shown by the Nicla and have it appear on the Terminal’s screen. Fellow cavers can be notified in an emergency via a connected LoRa radio that transmits an alert message.
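    The decision logic here is tiny. A minimal sketch, in which the status strings and the "danger"-only alert rule are illustrative assumptions rather than the Sajac Project's actual code:

```python
def caver_status(label):
    """Map the sensor-fusion model's label ('safe', 'bad', or 'danger',
    as in the write-up) to a display string for the companion screen and
    a flag for raising a LoRa alert."""
    display = {"safe": "SAFE", "bad": "CAUTION", "danger": "DANGER"}
    return display[label], label == "danger"
```

    Keeping the mapping separate from the inference call makes it easy to later route the alert flag to the radio without touching the model code.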

    Video: https://www.youtube.com/watch?v=PzvEHBK3Hok

    For more details on this proof of concept, which was shortlisted as part of our K-Way competition, you can read Abdillah’s well-documented write-up on the Arduino Project Hub. It was also featured on our Arduino Day 2023 livestream, which you can see here.


  • Predicting when a fan will fail by listening to it


    Reading Time: 2 minutes

    Embedded audio classification is a very powerful tool when it comes to predictive maintenance, as a wide variety of sounds can be distinguished as either normal or harmful several times per second automatically and reliably. To demonstrate how this pattern recognition could be incorporated into a commercial setting, Kevin Richmond created the Listen Up project that aims to show the current status of a running fan based solely on its noise profile.

    Richmond started by collecting 15 minutes of data for each label, namely background noise, normal operation, soft failure, and severe failure. Once collected, the data was split into two-second samples and uploaded to the Edge Impulse Studio, after which an impulse was configured to use an MFE audio processing block and a Keras classification model. Once trained on the dataset, the model achieved an accuracy of almost 96% using real-world testing data.

    In order to utilize the classifier, Richmond deployed his Edge Impulse project as an Arduino library for use in an Arduino Portenta H7 sketch. In it, an accompanying Portenta Vision Shield’s microphone continuously gathers new audio data before passing it into the classification model to receive a result. The probability of each label is then used to set a corresponding LED color if the probability is greater than 80%, otherwise blue is shown to indicate a failed reading.
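    The LED step reduces to a few lines. Here is a hedged sketch: the 80% threshold and the blue fallback are from the write-up, while the label names and the colour assigned to each label are guesses.

```python
def led_for(probs, threshold=0.8):
    """Pick an LED colour from the classifier's per-label probabilities:
    show the top label's colour only if it clears the confidence threshold,
    otherwise fall back to blue to indicate a failed/uncertain reading."""
    colors = {
        "background": "off",
        "normal": "green",
        "soft_failure": "yellow",
        "severe_failure": "red",
    }
    label, p = max(probs.items(), key=lambda kv: kv[1])
    return colors.get(label, "blue") if p >= threshold else "blue"
```

    Gating on the winning probability, rather than just taking the argmax, is what keeps borderline inferences from flickering the status light.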

    To see the project in action, you can watch Richmond’s video below or read his write-up on Hackster.io.


  • Using sensor fusion and tinyML to detect fires


    Reading Time: 2 minutes

    The damage and destruction caused by structure fires to both people and the property itself is immense, which is why accurate and reliable fire detection systems are a must-have. As Nekhil R. notes in his write-up, the current rule-based algorithms and simple sensor configurations can lead to reduced accuracy, thus showing a need for more robust systems.

    This led Nekhil to devise a solution that leverages sensor fusion and machine learning to make better predictions about the presence of flames. His project began with collecting environmental data consisting of temperature, humidity, and pressure from his Arduino Nano 33 BLE Sense’s onboard sensor suite. He also labeled each sample either Fire or No Fire using the Edge Impulse Studio, which was used to generate spectral features from the three time-series sensor values. This information was then passed along to a Keras neural network that had been configured to perform classification, resulting in an overall accuracy of 92.86% when run on real-world test samples.

    Confident in his now-trained model, Nekhil deployed his model as an Arduino library back to the Nano 33 BLE Sense. The Nano sends a message over its UART pins to an awaiting ESP8266-01 board when a fire has been detected. And in turn, the ESP8266 triggers an IFTTT webhook to alert the user via an email.
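    The ESP8266 side of that chain boils down to matching the Nano's UART message and firing an IFTTT Webhooks request. The trigger URL below is the Maker service's standard format; the event name, key, and "FIRE" message text are placeholder assumptions, not values from Nekhil's sketch.

```python
def ifttt_url(event, key):
    """IFTTT Webhooks (Maker service) trigger URL; requesting it fires
    the applet that sends the email."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def handle_uart(line):
    """Mimic the ESP8266's role: return the webhook URL to request when
    the Nano's fire message arrives over UART, otherwise do nothing."""
    if line.strip() == "FIRE":
        return ifttt_url("fire_detected", "YOUR_KEY")
    return None
```

    On the real board the same logic would sit in a serial-read loop, with an HTTP GET replacing the returned URL.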

    If you would like to learn more about the construction of this fire recognition system, plenty of details can be found on the project page.


  • Add gesture recognition and environmental sensing to your hiking jacket with the Nicla Sense ME


    Reading Time: 2 minutes

    As part of our ongoing collaboration with K-Way, Justin Lutz set out to integrate intelligent electronics into one of the company’s iconic outdoor jackets. Due to the active lifestyle of the brand, Lutz chose to use the Arduino Nicla Sense ME board to detect gestures while hiking as well as monitor the barometric pressure for potential storms.

    The project began by first gathering many samples of either idle motion, walking, or drawing a “C” in the air to set a checkpoint with the Nicla’s onboard accelerometer. Once this data was added to the Edge Impulse Studio, he trained a model to recognize each of the motions and return the corresponding label. Beyond this functionality, the Nicla Sense ME also outputs the detected motion and current pressure reading over Bluetooth to a connected phone.

    Interacting with the wearable device is done completely through an Android app that Lutz created with the help of MIT’s App Inventor online tool. While running in the background, the app checks for new Bluetooth data and marks the current coordinates on a map as a checkpoint whenever the user draws a “C” with their finger. Drops in pressure are also displayed by the app as a large warning that bad weather is on its way.

    More details on the project can be found in Lutz’s Edge Impulse write-up. You can also learn more about the Arduino x K-Way collaboration here.


  • This DIY Apple Pencil writes with gestures


    Reading Time: 2 minutes

    Released in 2015, the Apple Pencil is a technology-packed stylus that allows users to write on iPad screens with variations in pressure and angle — all while communicating with very low latencies. Nekhil Ravi and Shebin Jose Jacob of Coders Café were inspired by this piece of handheld tech to come up with their own pencil concept, except this one wouldn’t need a screen in order to function.

    The pair’s writing utensil relies on recognizing certain gestures as letters, and once one has been detected, outputs the result over USB or Bluetooth® to the host device. They started by first gathering many samples of different letters and how they correlate to the change in motion on the Arduino Nano 33 BLE Sense’s built-in accelerometer. From here, they designed an impulse in the Edge Impulse Studio to extract spectral features from the time series accelerometer data and pass it to a classification Keras neural network. The resulting model could accurately determine the correct letter from each gesture, making it suitable for deployment back to the Nano 33 BLE Sense.

    Before testing their new inferencing code on the hardware, a simple 3D-printed case was designed to fit around the board to look like the real Apple Pencil. Additionally, the team made a simple website that could receive data from the board over BLE and display the corresponding letter within the browser window. To see more about this project, you can watch their video below!


  • Predicting potential motor failures just using sound


    Reading Time: 2 minutes

    Nearly every manufacturer uses a machine at some point in their process, and each of those machines is almost guaranteed to contain at least one motor. In order to maintain uptime and efficiency, these motors must always work correctly, as even a small breakdown can have disastrous effects. Predictive maintenance aims to achieve this goal, without going overboard on preventive servicing, by combining sensors with predictive techniques that can schedule maintenance when a failure is probable.

    Shebin Jose Jacob’s solution utilizes the Arduino Nano 33 BLE Sense, along with its built-in microphone, to capture audio and predict when a motor is about to fail. He achieved this by first creating a new Edge Impulse project and gathering samples for four classes of sound: OK, anomaly 1, and anomaly 2, as well as general background noise. After designing an impulse and training a classification model on the samples, he was able to achieve an impressive accuracy of about 95% on the test samples.

    The final step involved deploying the model as firmware for the Arduino, which would allow it to classify sounds in real-time by continuously reading from the microphone. Whenever an anomaly is detected, a red LED at the top illuminates.

    You can read more about the project here on its Edge Impulse tutorial.


  • Detect vandalism using audio classification on the Nano 33 BLE Sense


    Reading Time: 2 minutes

    Most people hope to avoid having their property broken into and/or destroyed altogether, or at least to catch the perpetrator in the act when it does occur. And as Nekhil R. notes in his project write-up, traditional deterrence and detection methods often fail, meaning that a newer type of solution was necessary.

    Unlike other glass breaking sensors, Nekhil’s project relies on a single, inexpensive Arduino Nano 33 BLE Sense and its onboard digital microphone to record audio, classify it, and then alert a property owner over WiFi via an ESP8266-01 board. The dataset used to train the machine learning model came from two sources: the Microsoft Scalable Noisy Speech Dataset for background noise, and breaking glass recorded on the device itself. Both of these were added to an Edge Impulse project via the Studio and split into two-second samples before being processed by a Mel-filterbank Energy (MFE) algorithm.

    The resulting model, trained over 200 training cycles with slight noise added, achieved an impressive 92% accuracy, with the remaining errors being glass-breaking samples misclassified as mere noise. It was then exported to the Nano 33 BLE Sense as a library for use in a sketch that continually classifies incoming sounds and, with the help of IFTTT, sends an email if breaking glass is detected.

    You can watch Nekhil’s demo video below and read more about this project here on the Edge Impulse blog.


  • Turning a K-Way jacket into an intelligent hike tracker with the Nicla Sense ME


    Reading Time: 2 minutes

    Going for a hike outdoors is a great way to relieve stress, do some exercise, and get closer to nature, but tracking them can be a challenge. Our recent collaboration with K-Way led Zalmotek to develop a small wearable device that can be paired to a jacket to track walking speed, steps taken, and even the current atmospheric conditions.

    At its core, the tracker can be split into having three main functions: weather prediction, step/climbing activity, and a way to gather and send raw data over Bluetooth® Low Energy to the Arduino IoT Cloud for additional processing and training machine learning models. Performing these tasks is a Nicla Sense ME board, which contains an advanced six-axis BHI260AP IMU, a three-axis magnetometer, a pressure sensor, and a BME688 four-in-one gas sensor with temperature and humidity capabilities.

    Zalmotek first used the Edge Impulse Studio to collect barometer samples of both rising and falling air pressure, which predict clear or stormy conditions, respectively. Once finished, a classification model was trained and deployed to the Nicla Sense, where the LEDs indicate which weather pattern is more likely. The activity tracking model, however, was trained using data collected from the IMU and labeled with either walking, climbing, or staying. After integrating them both into a single sketch, Zalmotek created an Arduino IoT Cloud dashboard for displaying these values in real time.

    For a deeper dive into the device, read Edge Impulse’s blog post. You can also discover more about the Arduino x K-Way project here.


  • This wearable cough monitor can help improve respiratory disease detection


    Reading Time: 2 minutes

    A large number of diseases involve coughing as one of their primary symptoms, but none are quite as concerning as chronic obstructive pulmonary disease (COPD), which causes airflow blockages and other breathing problems in those afflicted by it. Consistently monitoring the frequency and intensity of coughing is vital for tracking how well the disease is being treated, yet current solutions are impractical outside of a hospital setting.

    Eivind Holt had the idea to use an Arduino Nano 33 BLE Sense running a custom tinyML model to automatically classify sounds as either a cough or non-cough and report them to a cloud service. Once a total of 647 audio samples had been collected, Eivind trained a Keras neural network using Edge Impulse that could correctly identify the sound about 99% of the time. The program he wrote for the Nano creates a custom BLE service with a single cough counting characteristic that is incremented for each detection.
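    The counting characteristic's behaviour can be modelled in a few lines: increment only on a confident "cough" classification. The label name and confidence threshold are assumptions for illustration, not values from Eivind's sketch.

```python
class CoughCounter:
    """Minimal model of the sketch's single BLE characteristic: an
    unsigned counter that grows by one per confirmed cough detection."""

    def __init__(self):
        self.count = 0

    def on_inference(self, label, prob, threshold=0.8):
        """Feed one classifier result; returns True if the count advanced
        (i.e. when the characteristic value would be re-notified over BLE)."""
        if label == "cough" and prob >= threshold:
            self.count += 1
            return True
        return False
```

    Exposing only the running count, rather than raw audio, is also what keeps the design privacy-friendly.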

    Getting the number of coughs from the local device to the cloud for later analysis and display was accomplished by using the nRF Android app to receive BLE data and transmit it to the nRF Cloud. Meanwhile, a pair of 500mAh batteries were connected and everything was placed into a 3D-printed case that could easily sit near a person’s neck.

    To see more about how Eivind designed and built this valuable project, check out the Edge Impulse docs page.


  • Preventing excessive water consumption with tinyML


    Reading Time: 2 minutes

    As the frequency and intensity of droughts around the world continue to increase, being able to reduce our water usage is vital for maintaining already strained freshwater resources. And according to the EPA, leaving a faucet running, whether intentionally or by accident, for just five minutes can consume over ten gallons of water. However, Naveen has leveraged the power of machine learning to build a device that can automatically detect running faucets and send alerts over a cellular network in response.

    The hardware for this project is primarily centered around a Blues Wireless Notecard for cellular connectivity, a Blues Wireless Notecarrier-B as its breakout board, and a machine learning-capable microcontroller in the form of an Arduino Nano 33 BLE Sense. Beyond merely having a 32-bit Arm Cortex-M4 processor and 1MB of flash storage, its built-in microphone can be used to easily capture audio data. In this project, Naveen uploaded a dataset containing 15 minutes of either faucet noises or background noise into the Edge Impulse Studio before training a 1D convolutional neural network, which achieved an accuracy of 99.2%.

    From here, a new Twilio route was created that allows the Blues Wireless Notecard to generate SMS messages by sending an API request. Now whenever a faucet has been classified as running for too long, the Nano 33 BLE Sense can transmit a simple command over I2C to the Notecard and alert the recipient.
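    On the Notecard side, queuing an outbound alert is a single JSON request. note.add is the Notecard API's standard request for adding a note to an outbound queue; the Notefile name and body fields below are assumptions for this sketch, not Naveen's actual payload.

```python
import json

def notecard_alert(duration_s):
    """Build the JSON request the Nano might hand to the Notecard over
    I2C: add a note to an outbound queue file (.qo) and sync immediately,
    so the Twilio route can turn it into an SMS."""
    return json.dumps({
        "req": "note.add",
        "file": "alerts.qo",   # hypothetical Notefile name
        "sync": True,          # push to Notehub right away
        "body": {"alert": "faucet_running", "seconds": duration_s},
    })
```

    The same request shape works for any event the Nano wants to escalate; only the body changes.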

    For more information about this project, you can read Naveen’s write-up here on Hackster.io.


  • Spotting defects in solar panels with machine learning


    Reading Time: 2 minutes

    Large solar panel installations are vital for a future of energy production without the massive carbon dioxide emissions we currently produce. However, microscopic fractures, hot spots, and other defects on the surface can expand over time, thus leading to reductions in output and even failures if left undetected. Manivannan Sivan’s solution for tackling this issue revolves around using computer vision and machine learning to find small defects at the surface before automatically reporting the information.

    Sivan compiled his dataset by first gathering images of solar panels that have visible cracks using an Arduino Portenta H7 and Vision Shield and then drawing bounding boxes around each one. From here, he trained a MobileNetV2 model with the addition of Edge Impulse’s recent FOMO object detection algorithm for better performance. He was able to improve the model’s accuracy even further by augmenting the dataset with images taken at different camera angles and lighting conditions in order to prevent mistaking the white boundary lines for cracks.

    After testing and deploying the model from the Edge Impulse Studio to his Portenta H7 board, it was able to successfully find cracks in a solar panel’s surface around 80% of the time. In the future, Sivan might add other features that take advantage of the onboard connectivity to communicate with outside services for faster response times. You can read more about the project here.


  • The Smart-Badge recognizes kitchen activities with its suite of sensors


    Reading Time: 2 minutes

    We all strive to maintain healthier lifestyles, yet the kitchen is often the most challenging environment by far because it contains such a wide range of foods and beverages. The Smart-Badge project, created by a team of researchers from the German Research Centre for Artificial Intelligence (DFKI), aims to track just how many times we reach for the refrigerator door or drink water using machine learning and a suite of environmental sensors.

    The wearable device itself consists of a single PCB that houses a pair of microcontrollers: an NXP iMXRT1062 for quickly gathering complex data, and an Arduino Nano 33 BLE Sense for collecting more basic samples. Whether it’s the digital gas sensor, the accelerometer, an IR thermal array, or an air pressure sensor, each reading is compiled into a single stream which updates at 6Hz and can either be stored locally on an SD card or sent via Bluetooth® to a phone.

    After having 10 volunteers perform various tasks around a mock kitchen while wearing the Smart-Badge and then labeling each activity, the researchers were able to collect a sizable dataset. The 791 total data channels were fed through several layers of a neural network that could ultimately classify activities with 92.4% accuracy.

    For more details on the project, you can read the team’s paper here.

    Image credit: Liu and Suh et al.

Categories: Arduino

    Website: LINK

  • This tinyML-powered baby swing automatically starts when crying is detected

    This tinyML-powered baby swing automatically starts when crying is detected

    Reading Time: 2 minutes

    No one enjoys hearing their baby cry, especially when it occurs in the middle of the night or when the parents are preoccupied with another task. Unfortunately, switching on a motorized baby swing requires physically getting up and pressing a switch or button, which is why Manivannan Sivan developed one that can automatically trigger whenever a cry is detected using machine learning.

    Sivan began his project by first gathering real world samples of crying sounds and background noise from an Arduino Portenta H7 and Vision Shield before labeling them accordingly in the Edge Impulse Studio. From here, he created a simple impulse which takes in time-series audio data and generates a spectrogram which is then used to train a Keras neural network model. Once fully trained, the model could accurately distinguish between the two sounds about 98% of the time.
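A spectrogram of the kind Edge Impulse generates can be approximated in pure Python with a naive DFT over overlapping frames. This is only an illustrative sketch of the transform, not Sivan's processing block; real pipelines use an FFT plus filtering, and the frame sizes here are arbitrary:

```python
import cmath

def frame_signal(samples, frame_len, hop):
    """Split a 1-D signal into overlapping frames of length `frame_len`."""
    return [samples[s:s + frame_len]
            for s in range(0, len(samples) - frame_len + 1, hop)]

def dft_magnitudes(frame):
    """Magnitude of each non-negative frequency bin via a naive DFT."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(frame)))
            for k in range(n // 2 + 1)]

def spectrogram(samples, frame_len=64, hop=32):
    """One row of frequency magnitudes per frame: a time-frequency image."""
    return [dft_magnitudes(f) for f in frame_signal(samples, frame_len, hop)]
```

The resulting 2-D array of magnitudes is what makes a convolutional or dense neural network a natural fit: the audio becomes an image-like input.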

    Beyond merely classifying the sounds from the two onboard microphones, Sivan’s custom program also sets a relay to activate for 20 seconds if crying has been detected, after which it turns off until crying is recognized again. He hopes to use this project as a convenient way to assist busy parents with the difficult task of calming a crying baby without the need for constant manual intervention. You can read more about it here on the project’s Edge Impulse docs page.
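That 20-second relay behavior amounts to a small latch: trigger on detection, hold for a fixed period while ignoring further detections, then release. The class below is a hypothetical Python model of that logic, not Sivan's actual Arduino sketch:

```python
class CryRelay:
    """Hold a relay on for `hold_s` seconds after a cry detection,
    ignoring repeat detections until the hold period expires."""

    def __init__(self, hold_s=20):
        self.hold_s = hold_s
        self.off_at = None  # time at which the relay should switch off

    def update(self, now, cry_detected):
        if self.off_at is not None and now >= self.off_at:
            self.off_at = None                 # hold expired: relay off
        if cry_detected and self.off_at is None:
            self.off_at = now + self.hold_s    # start a new hold period
        return self.off_at is not None         # True while the relay is on
```

On the Portenta the equivalent logic would compare `millis()` timestamps and drive the relay pin directly.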

Boards: Portenta
Categories: Arduino

    Website: LINK

  • Add ML-controlled smart suspension adjustment to your bicycle

    Add ML-controlled smart suspension adjustment to your bicycle

    Reading Time: 3 minutes

Some modern cars, trucks, and SUVs have smart active suspension systems that adjust in real time to different terrain conditions in order to maintain safety or performance. But they tend to only come on high-end vehicles because they’re expensive, complicated, and add weight. That’s why it is so impressive that Jallson Suryo was able to add a similar smart suspension adjustment system to his bicycle.

    This system will only work on specific bicycles that have suspension forks that the user can adjust with a knob. A servo-driven mechanism mounts onto the fork and turns the knob to tweak the firmness and rebound of the front suspension. Normally the rider would need to stop and turn that knob by hand when necessary, but this system can perform that adjustment automatically in response to the current conditions. It can recognize and accommodate five different conditions: idle, medium, rough, smooth, and sprint. 

    Suryo’s project is especially interesting because it recognizes the conditions with a machine learning model that monitors an Arduino Nano 33 BLE Sense board’s built-in nine-axis inertial sensor. Suryo didn’t have to program explicit sensor reading classifications. He trained the machine learning model, built with Edge Impulse Studio, on real-world data gathered through the Arduino Science Journal app. He could, for example, ride on a rough trail and tell the model that the inertial sensor readings it sees correspond to that mode.

    The Arduino receives power from a lithium battery via a SparkFun charger/booster board. It runs the trained and deployed Edge Impulse ML model. When it detects inertial sensor readings that indicate a specific terrain or action, it turns the servo to adjust the suspension knob to the ideal setting. 
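The final step, mapping the detected condition to a knob position, might look like the following minimal sketch. The servo angles and confidence threshold are invented for illustration, since the write-up doesn't list Suryo's actual settings:

```python
# Hypothetical servo angles (degrees) per riding condition; the real values
# depend on the travel of the fork's adjustment knob.
SUSPENSION_ANGLES = {"idle": 0, "smooth": 20, "medium": 60, "rough": 120, "sprint": 170}

def pick_setting(scores, threshold=0.6):
    """Return the servo angle for the highest-confidence class,
    or None (leave the knob alone) if no class clears the threshold."""
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    if confidence < threshold:
        return None
    return SUSPENSION_ANGLES[label]
```

Requiring a minimum confidence before moving the servo is a common guard against the knob chattering between settings on ambiguous sensor windows.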

    [youtube https://www.youtube.com/watch?v=qQyYXJHtRaE?feature=oembed&w=500&h=281]
Categories: Arduino

    Website: LINK

  • Count elevator passengers with the Nicla Vision and Edge Impulse

    Count elevator passengers with the Nicla Vision and Edge Impulse

    Reading Time: 3 minutes

Modern elevators are powerful, but they still have a payload limit. Most contain a plaque stating the maximum number of passengers (a number based on their average weight, with plenty of room for error). But hardly anyone reads the capacity limit when stepping into an elevator, or worries about exceeding it. In reality, manufacturers build their elevators to a size that prevents an excessive number of passengers. But as a demonstration, Nekhil R. put together a tutorial that explains how to use the Edge Impulse ML platform with an Arduino Nicla Vision board to count elevator passengers.

    The Nicla Vision is a new board built specifically for computer vision applications — especially those that incorporate machine learning. In its small footprint (less than a square inch), there is a powerful STM32H747AII6 microcontroller, a 2MP color camera, a six-axis IMU, a time of flight sensor, a microphone, WiFi and Bluetooth, and an onboard LiPo battery charger — and it’s officially supported by Edge Impulse, making it well suited for ML projects.

    To build this passenger counter, all you need is the Nicla Vision, a buzzer, an LED, a push button, a power source, and the 3D-printable enclosure. The guide will walk you through how to train and deploy the object detection model, which is what Edge Impulse excels at. It lets you train a model optimized for microcontrollers and then outputs code that is easy to flash onto an Arduino. There are many optimization tricks involved, such as lowering the video resolution and processing the video as grayscale, but Edge Impulse takes care of all of the difficult work for you.
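The two optimizations mentioned, grayscale conversion and reduced resolution, can be sketched in a few lines of Python. Edge Impulse applies its own versions of these steps; the helpers below are illustrative only, operating on an image given as rows of (r, g, b) tuples:

```python
def to_grayscale(rgb_image):
    """Convert an RGB image (rows of (r, g, b) tuples) to 8-bit grayscale
    using the standard Rec. 601 luma weights."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def downsample(image, factor):
    """Reduce resolution by keeping every `factor`-th pixel on each axis."""
    return [row[::factor] for row in image[::factor]]
```

Both steps shrink the model's input tensor, which is what makes object detection feasible in the Nicla Vision's limited RAM.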

    After deploying your model to the Nicla Vision, you can mount this device anywhere in an elevator that gives you a view of the whole car. It keeps a running log of passenger counts, which you can visualize later in graphs or as raw data. If the device sees a passenger count that exceeds the set limit, it will flash the LED and sound the buzzer.

    You probably don’t have a reason to count elevator passengers, but this is a fantastic demonstration of what you can accomplish with the Nicla Vision board and Edge Impulse.

    [youtube https://www.youtube.com/watch?v=yD8CJGDpgfY?feature=oembed&w=500&h=281]

    Website: LINK

  • tinyML device monitors packages for damage while in transit

    tinyML device monitors packages for damage while in transit

    Reading Time: 2 minutes

Arduino Team, September 10th, 2022

    Although the advent of widespread online shopping has been a great convenience, it has also led to a sharp increase in the number of returned items. This can be blamed on a number of factors, but a large contributor to this issue is damage in shipping. Shebin Jose Jacob’s solution involves building a small tracker that accompanies the package throughout its journey and sends alerts when mishandling is detected.

    Jacob started by creating a new Edge Impulse project and collecting around 30 minutes of motion samples from an Arduino Nano 33 BLE Sense’s onboard three-axis accelerometer. Each sample was sorted into one of five categories that range from no motion all the way to a hard fall or vigorous shaking. Features were then generated and used to train a Keras model, which yielded an accuracy of 91.3% in testing.
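Edge Impulse generates its own features from the raw accelerometer windows; as an illustrative stand-in (not Jacob's actual processing block), here is a sketch of a few common time-domain features computed over one window of a single axis:

```python
import math

def window_features(samples):
    """Simple time-domain features for one accelerometer axis:
    RMS energy, peak-to-peak range, and mean absolute value."""
    n = len(samples)
    rms = math.sqrt(sum(x * x for x in samples) / n)
    p2p = max(samples) - min(samples)
    mav = sum(abs(x) for x in samples) / n
    return {"rms": rms, "p2p": p2p, "mav": mav}
```

Features like these compress each raw window into a handful of numbers, which is what lets a small Keras model separate "no motion" from "hard fall" or "vigorous shaking."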

To communicate with the outside world, Jacob added a GSM module that allows the Nano 33 BLE Sense to send alerts over a 3G network to an awaiting Firebase endpoint. When the database updates, new data is propagated to a user-facing webpage that shows the current status of the package along with any important events.

    More details can be found here in Jacob’s project write-up.

    Website: LINK

  • This piece of art knows when it’s being photographed thanks to tinyML

    This piece of art knows when it’s being photographed thanks to tinyML

    Reading Time: 2 minutes

Arduino Team, September 9th, 2022

    Nearly all art functions in just a single direction by allowing the viewer to admire its beauty, creativity, and construction. But Estonian artist Tauno Erik has done something a bit different thanks to embedded hardware and the power of tinyML. His work is able to actively respond to a person whenever they bring up a cell phone to take a picture of it.

At the center are four primary circuits/components, which include a large speaker, an abstract LED sculpture, an old Soviet-style doorbell board, and a PCB housing the control electronics. The circuit contains an Arduino Nano 33 BLE Sense along with an OV7670 camera module that can capture objects directly in front of it. Tauno then trained a machine learning model with the help of Edge Impulse on almost 700 images that were labeled as human-containing, cell phone, or everything else/indeterminate.

    With the model trained and deployed to the Nano 33 BLE Sense, a program was written that grabs a frame from the camera, converts its color space to 24-bit RGB, and sends it to the model for inferencing. The resulting label can then be used to activate the connected doorbell and play various animations on the LED sculpture.
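The OV7670 is commonly configured to output 16-bit RGB565 pixels, so the conversion step likely resembles the sketch below. This is an assumption for illustration; the write-up doesn't specify the camera's pixel format:

```python
def rgb565_to_rgb888(pixel):
    """Expand one 16-bit RGB565 pixel into an (r, g, b) tuple of 8-bit
    values, replicating the high bits into the low bits so the output
    spans the full 0-255 range."""
    r5 = (pixel >> 11) & 0x1F   # top 5 bits: red
    g6 = (pixel >> 5) & 0x3F    # middle 6 bits: green
    b5 = pixel & 0x1F           # bottom 5 bits: blue
    r = (r5 << 3) | (r5 >> 2)
    g = (g6 << 2) | (g6 >> 4)
    b = (b5 << 3) | (b5 >> 2)
    return (r, g, b)
```

Applying this per pixel yields the 24-bit RGB frame the model expects as input.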

    [youtube https://www.youtube.com/watch?v=u6I8PXLvG6I?feature=oembed&w=500&h=281]

    More details about this project can be found here on Tauno’s website.

    Website: LINK

  • Detecting and tracking worker falls with embedded ML

    Detecting and tracking worker falls with embedded ML

    Reading Time: 2 minutes

    Certain industries rely on workers being able to reach high spaces through the use of ladders or mobile standing platforms. And because of their potential danger if a fall were to occur, Roni Bandini had the idea to create an integrated system that can detect a fall and report it automatically across a wide variety of scenarios.

A fall can be sensed by measuring changes in acceleration; therefore, Bandini went with an Arduino Nano 33 BLE Sense board due to its built-in three-axis accelerometer. It also offers low power consumption, meaning that a LiPo battery and accompanying TP4056 charging module could be added for completely wireless operation. Acceleration data was collected by taking several samples within the Edge Impulse Studio and labeling them either “fall” or “stand” when no movement was present. Once tested, the resulting model was integrated into an Arduino sketch, which emits a Bluetooth® advertising packet whenever a fall is detected.
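The classification itself is done by the trained Edge Impulse model, but the signal it learns from can be illustrated with a simple threshold sketch: a near-free-fall dip in total acceleration followed by an impact spike. The thresholds below are invented for illustration:

```python
import math

def detect_fall(samples, free_fall_g=0.4, impact_g=2.0):
    """Flag a fall when a near-free-fall dip in acceleration magnitude is
    followed by an impact spike. `samples` is a list of (x, y, z) readings
    in units of g (so standing still reads roughly 1.0 g)."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    dip_seen = False
    for m in magnitudes:
        if m < free_fall_g:
            dip_seen = True               # body briefly in free fall
        elif dip_seen and m > impact_g:
            return True                   # followed by an impact
    return False
```

A learned model generalizes far better than fixed thresholds, which is why the project trains on labeled samples instead of hand-tuning constants like these.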

    Collecting each of these packets is the responsibility of a central Raspberry Pi server. It runs a Python script that constantly scans for new BLE advertising data and inserts a new record into its database file accordingly. All of this data can then be queried in a separate script and used to create a chart showcasing how many times every worker has fallen.
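A minimal version of that server-side logic, using Python's built-in sqlite3 module, might look like this sketch (the table and column names are assumptions, not taken from Bandini's script):

```python
import sqlite3

def open_db(path=":memory:"):
    """Open (or create) the fall-event database."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS falls (worker TEXT, ts REAL)")
    return conn

def record_fall(conn, worker, ts):
    """Insert one fall event, e.g. when a matching BLE advertisement arrives."""
    conn.execute("INSERT INTO falls VALUES (?, ?)", (worker, ts))
    conn.commit()

def falls_per_worker(conn):
    """Aggregate the log into a per-worker fall count for charting."""
    rows = conn.execute(
        "SELECT worker, COUNT(*) FROM falls GROUP BY worker ORDER BY worker")
    return dict(rows.fetchall())
```

In the real system the `worker` identity would come from the BLE advertising payload, and the aggregate query feeds the chart described above.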

    [youtube https://www.youtube.com/watch?v=bM0FFCcvyPQ?feature=oembed&w=500&h=281]

    More details can be found in Bandini’s project write-up and Edge Impulse’s blog post here.

    The post Detecting and tracking worker falls with embedded ML appeared first on Arduino Blog.

    Website: LINK