Tag: Nicla Vision

  • Using an Arduino Nicla Vision as a drone flight controller

    Using an Arduino Nicla Vision as a drone flight controller

    Reading Time: 2 minutes

    Drone flight controllers do so much more than simply receive signals and tell the drone which way to move. They’re responsible for constantly tweaking the motor speeds in order to maintain stable flight, even with shifting winds and other unpredictable factors. For that reason, most flight controllers are purpose-built for the job. But element14’s Milos Rasic was building his own drone from scratch and found that the Arduino Nicla Vision board makes a great flight controller.

    To perform that critical job of keeping the drone stable, the flight controller needs precise information about the drone’s orientation and any movement in three-dimensional space. Luckily, the Nicla Vision has an integrated six-axis motion sensor that is perfect for the job. It also has a powerful STM32H7 microcontroller, a built-in camera for machine vision and machine learning tasks, onboard Wi-Fi and Bluetooth connectivity, and more. And because it is very small (22.86×22.86mm) and very light, it is a good choice for a drone.

    Rasic designed and built the entire drone from scratch, using 8520 brushed DC motors and a 3D-printed frame. That is cool, but it isn’t uncommon; the Nicla Vision-based flight controller is what stands out.

    Rasic developed a custom PCB for the Nicla Vision that acts as a breakout board and adds a few other useful components, such as power regulation and boost circuitry. It didn’t need much, though, as the Nicla Vision already provides most of the necessary hardware.

    While he could have turned to existing flight controller firmware, Rasic chose to develop his own and that is the most impressive part of this project. That necessitated the creation of three PID (proportional-integral-derivative) controller algorithms for balancing pitch, roll, and yaw. Those work with control inputs to let the drone hover and move stably. The control signals come from a PC over Wi-Fi, with the pilot providing input through a USB flight stick.
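    For a sense of what each of those controllers does, here is a minimal, illustrative single-axis PID loop in Python. The gains, the 250Hz update rate, and the toy "airframe response" are placeholder assumptions for illustration only, not Rasic’s actual firmware.

    # Minimal single-axis PID sketch (illustrative; gains and 250Hz rate are assumed).
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # One controller per axis (pitch, roll, yaw); each correction is mixed into the motor speeds.
    dt = 1 / 250
    pitch_pid = PID(1.2, 0.02, 0.05, dt)

    # Toy example: drive a simulated pitch angle toward level (0 degrees).
    pitch = 12.0
    for _ in range(5):
        correction = pitch_pid.update(0.0, pitch)
        pitch += correction * dt * 10  # crude stand-in for the motor/airframe response
        print(round(pitch, 3))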

    The drone isn’t yet flying well, as PID tuning is a challenge for even the most experienced drone builders. But the foundation is there for Rasic to build on.

    [youtube https://www.youtube.com/watch?v=4CI2XqS5YiA?feature=oembed&w=500&h=281]

    The post Using an Arduino Nicla Vision as a drone flight controller appeared first on Arduino Blog.

    Website: LINK

  • Making a car more secure with the Arduino Nicla Vision

    Making a car more secure with the Arduino Nicla Vision

    Reading Time: 2 minutes

    Shortly after attending a recent tinyML workshop in São Paulo, Brazil, Joao Vitor Freitas da Costa was looking for a way to incorporate some of the technologies and techniques he learned into a useful project. Given that he lives in an area that experiences elevated levels of pickpocketing and automotive theft, he turned his attention to a smart car security system.

    His solution to a potential break-in or theft of keys revolves around the incorporation of an Arduino Nicla Vision board running a facial recognition model that only allows the vehicle to start if the owner is sitting in the driver’s seat. The beginning of the image detection/processing loop involves grabbing the next image from the board’s camera and sending it to a classification model where it receives one of three labels: none, unknown, or Joao, the driver. Once the driver has been detected for 10 consecutive seconds, the Nicla Vision activates a relay in order to complete the car’s 12V battery circuit, at which point the vehicle can be started normally with the ignition.
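    As a rough illustration of that loop, here is an OpenMV-style MicroPython sketch. The model file name, label order, relay pin, and the exact tf-module API are assumptions based on the description above (and vary with firmware version); this is not da Costa’s actual code.

    # Illustrative "driver detected for 10 s, then close the relay" loop.
    import sensor, time, tf
    from machine import Pin

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    net = tf.load("trained.tflite")        # Edge Impulse classification model (assumed name)
    labels = ["none", "unknown", "joao"]   # order must match the deployed model
    relay = Pin("D1", Pin.OUT)             # placeholder pin; depends on the actual wiring
    relay.value(0)

    driver_since = None
    while True:
        img = sensor.snapshot()
        scores = net.classify(img)[0].output()
        best = labels[scores.index(max(scores))]
        if best == "joao":
            if driver_since is None:
                driver_since = time.ticks_ms()
            if time.ticks_diff(time.ticks_ms(), driver_since) > 10000:
                relay.value(1)             # complete the 12V circuit so the car can start
        else:
            driver_since = None
            relay.value(0)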

    Through this project, da Costa was able to explore a practical application of vision models at the edge to make his friend’s car safer to use. To see how it works in more detail, you can check out the video below and delve into the tinyML workshop he attended here.

    [youtube https://www.youtube.com/watch?v=LG1YhM2kelI?feature=oembed&w=500&h=281]

    The post Making a car more secure with the Arduino Nicla Vision appeared first on Arduino Blog.

    Website: LINK

  • Empowering the transportation of the future, with the Ohio State Buckeye Solar Racing Team

    Empowering the transportation of the future, with the Ohio State Buckeye Solar Racing Team

    Reading Time: 3 minutes

    Arduino is ready to graduate its educational efforts in support of university-level STEM and R&D programs across the United States: this is where students come together to explore the solutions that will soon define their future, in terms of their personal careers and more importantly of their impact on the world.

    Case in point: the groundbreaking partnership with the Ohio State University Buckeye Solar Racing Team, a student organization at the forefront of solar vehicle technology, committed to promoting sustainable transportation by designing, building, and racing solar-powered vehicles in national and international competitions. This collaboration will see the integration of advanced Arduino hardware into the team’s cutting-edge solar vehicles, enhancing driver displays, data transmission, and cockpit metric monitoring.

    In particular, the team identified the Arduino Pro Portenta C33 as the best option for their car: “extremely low-powered, high-quality and reliable, it also has a CAN interface – which is how we will be getting data from our sensors,” team lead Vasilios Konstantacos shared.

    We have also provided Arduino Student Kits for prototyping and, most importantly, accelerating the learning curve for new members. “Our goal is to rapidly equip our newcomers with vital skills, enabling them to contribute meaningfully to our team’s progress. Arduino’s hardware is a game-changer in this regard,” Vasilios stated.
    In addition, the team received Nicla Vision, Nicla Sense ME, and Nicla Voice modules to integrate essential sensors in the car, and more Portenta components to make their R&D process run faster (pun intended!): Portenta Breakout to speed up development on the Portenta C33, Portenta H7 to experiment with AI models for vehicle driving and testing, and Portenta Cat. M1/NB IoT GNSS Shield to connect the H7 to the car wirelessly, replacing walkie-talkie communication, and track the vehicle’s location.

    Combining our beginner-friendly approach with the advanced features of the Arduino Pro range is the key to empowering students like the members of the Buckeye Solar Racing Team to learn and develop truly innovative solutions with the support of a qualified industrial partner and high-performance technological products. In particular, the Arduino ecosystem offers a dual advantage in this case: components’ extreme ruggedness, essential for race vehicle operations, paired with the familiarity and ease of use of the Arduino IDE.

    The partnership will empower Ohio State University students to experiment with microcontrollers and sensors in a high-performance setting, fostering a seamless, hands-on learning experience and supporting the institution’s dedication to providing unparalleled opportunities for real-world application of engineering and technology studies. Arduino’s renowned reliability and intuitive interface make it an ideal platform for students to develop solutions that are not only effective in the demanding environment of solar racing but also transferable to their future professional pursuits.

    “We are thrilled to collaborate with the Ohio State University Buckeye Solar Racing Team,” commented Jason Strickland, Arduino’s Higher Education Sales Manager. “Our mission has always been to make technology accessible and foster innovation. Seeing our hardware contribute to advancing solar racing technology and education is a proud moment for Arduino.”

    The post Empowering the transportation of the future, with the Ohio State Buckeye Solar Racing Team appeared first on Arduino Blog.

    Website: LINK

  • This Nicla Vision-powered ornament covertly spies on the presents below

    This Nicla Vision-powered ornament covertly spies on the presents below

    Reading Time: 2 minutes

    Whether it’s an elf that stealthily watches from across the room or an all-knowing Santa Claus that keeps a list of one’s actions, spying during the holidays is nothing new. But when it comes time to receive presents, the more eager among us might want to know, a few days in advance, what awaits under the tree. That is what prompted element14 Presents host Milos Rasic to build a robotic ornament equipped with vision and a compact movement system.

    On the hardware side, Rasic went with an Arduino Nicla Vision board as it contains a camera and the ability to livestream the video feed over the network. A pair of continuous servo motors allow the mobile robot platform to move along the ground while another set of servos open the ornament’s trapdoor to expose the wheels and carefully lower it from the tree through a clever system of bands and thread.

    The livestreaming portion of the project was based on an existing MJPEG RTP example that exposes a web API endpoint for fetching the latest frame from the Nicla’s onboard camera and delivering it via Wi-Fi. To control the robot, including winching, driving, and toggling the lights, Rasic created a Node-RED interface that sent MQTT messages to the Nicla.
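    As a sketch of the receiving side of that control link, the OpenMV-style MicroPython below subscribes to a command topic and dispatches messages to winch, drive, and light handlers. The broker address, topic, and command strings are invented placeholders, and MQTTClient refers to the umqtt-style helper shipped with OpenMV’s networking examples rather than Rasic’s exact script.

    # Hedged sketch of the Nicla side of the MQTT control link.
    import network, time
    from mqtt import MQTTClient   # umqtt-style client from the OpenMV examples

    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect("MY_SSID", "MY_PASSWORD")   # placeholders
    while not wlan.isconnected():
        time.sleep_ms(100)

    def on_command(topic, msg):
        # Node-RED publishes simple commands; these strings are assumptions.
        if msg == b"winch_down":
            pass  # drive the trapdoor/winch servos here
        elif msg == b"drive_forward":
            pass  # spin the continuous-rotation servos
        elif msg == b"lights_toggle":
            pass  # toggle the ornament's lights

    client = MQTTClient("ornament", "192.168.1.10")   # placeholder broker address
    client.set_callback(on_command)
    client.connect()
    client.subscribe("ornament/cmd")                  # placeholder topic

    while True:
        client.check_msg()   # poll for new commands between camera frames
        time.sleep_ms(50)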

    To see more about how this creative device was designed, watch Rasic’s video below or read his full write-up here.

    [youtube https://www.youtube.com/watch?v=UnwMHgpKqw4?feature=oembed&w=500&h=281]

    The post This Nicla Vision-powered ornament covertly spies on the presents below appeared first on Arduino Blog.

    Website: LINK

  • Helping robot dogs feel through their paws

    Helping robot dogs feel through their paws

    Reading Time: 2 minutes

    Your dog has nerve endings covering its entire body, giving it a sense of touch. It can feel the ground through its paws and use that information to gain better traction or detect harmful terrain. For robots to perform as well as their biological counterparts, they need a similar level of sensory input. In pursuit of that goal, the Autonomous Robots Lab designed TRACEPaw for legged robots.

    TRACEPaw (Terrain Recognition And Contact force Estimation Paw) is a sensorized foot for robot dogs that includes all of the hardware necessary to calculate force and classify terrain. Most systems like this use direct sensor readings, such as those from force sensors. But TRACEPaw is unique in that it uses indirect data to infer this information. The actual foot is a deformable silicone hemisphere. A camera looks at that and calculates the force based on the deformation it sees. In a similar way, a microphone listens to the sound of contact and uses that to judge the type of terrain, like gravel or dirt.

    To keep TRACEPaw self-contained, Autonomous Robots Lab chose to utilize an Arduino Nicla Vision board. That has an integrated camera, microphone, six-axis motion sensor, and enough processing power for onboard machine learning. Using OpenMV and TensorFlow Lite, TRACEPaw can estimate the force on the silicone pad based on how much it deforms during a step. It can also analyze the audio signal from the microphone to guess the terrain, as the silicone pad sounds different when touching asphalt than it does when touching loose soil.
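    TRACEPaw’s real force and terrain estimates come from trained TensorFlow Lite models, but the vision half of the idea can be illustrated with plain OpenMV image processing: track how large the silicone pad appears in the frame and treat its growth as a crude proxy for deformation. The color threshold and calibration constant below are invented for illustration; the actual models are in the project’s repository.

    # Illustrative OpenMV sketch: blob area as a crude stand-in for pad deformation.
    import sensor

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    PAD_THRESHOLD = (30, 80, -20, 20, -20, 20)   # LAB threshold for the silicone pad (assumed)
    REST_AREA = 2500                             # pad area in pixels with no load (assumed)

    while True:
        img = sensor.snapshot()
        blobs = img.find_blobs([PAD_THRESHOLD], pixels_threshold=200, merge=True)
        if blobs:
            pad = max(blobs, key=lambda b: b.pixels())
            deformation = max(0, pad.pixels() - REST_AREA)
            force_estimate = deformation * 0.01   # fake calibration constant
            print("approx. force:", force_estimate)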

    More details on the project are available on GitHub.

    The post Helping robot dogs feel through their paws appeared first on Arduino Blog.

    Website: LINK

  • This Nicla Vision-based fire detector was trained entirely on synthetic data

    This Nicla Vision-based fire detector was trained entirely on synthetic data

    Reading Time: 2 minutes

    With the climate ever warming and prolonged droughts greatly increasing the chances of wildfires, being able to quickly detect when a fire has broken out is vital for responding while it’s still in a containable stage. But one major hurdle to collecting machine learning datasets on these types of events is that they can be quite sporadic. In his proof-of-concept system, engineer Shakhizat Nurgaliyev shows how he leveraged NVIDIA Omniverse Replicator to create an entirely generated dataset and then deploy a model trained on that data to an Arduino Nicla Vision board.

    The project started out as a simple fire animation inside of Omniverse, which was soon followed by a Python script that creates a pair of virtual cameras and randomizes the ground plane before capturing images. Once enough images had been generated, Nurgaliyev utilized the zero-shot object detection application Grounding DINO to automatically draw bounding boxes around the virtual flames. Lastly, each image was brought into an Edge Impulse project and used to develop a FOMO-based object detection model.

    By taking this approach, the model achieved an F1 score of nearly 87% while also only needing a max of 239KB of RAM and a mere 56KB of flash storage. Once deployed as an OpenMV library, Nurgaliyev shows in his video below how the MicroPython sketch running on a Nicla Vision within the OpenMV IDE detects and bounds flames. More information about this system can be found here on Hackster.io.
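    On the device, the deployed OpenMV library is used like any other FOMO model: load the .tflite file, run detection on each frame, and mark every flame centroid. The sketch below follows the general shape of Edge Impulse’s OpenMV FOMO example; file names, the confidence threshold, and the exact tf-module API are firmware-dependent assumptions rather than Nurgaliyev’s exact script.

    # Hedged sketch of FOMO flame detection on the Nicla Vision.
    import sensor, math, tf

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    net = tf.load("trained.tflite")   # assumed file name
    min_confidence = 0.5

    while True:
        img = sensor.snapshot()
        detections = net.detect(img, thresholds=[(math.ceil(min_confidence * 255), 255)])
        for class_idx, detection_list in enumerate(detections):
            if class_idx == 0:          # index 0 is the background class in FOMO
                continue
            for d in detection_list:
                x, y, w, h = d.rect()
                img.draw_circle(x + w // 2, y + h // 2, 10, thickness=2)
                print("flame at", x + w // 2, y + h // 2)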

    [youtube https://www.youtube.com/watch?v=OFCwgWvivHo?feature=oembed&w=500&h=375]

    The post This Nicla Vision-based fire detector was trained entirely on synthetic data appeared first on Arduino Blog.

    Website: LINK

  • Intelligently control an HVAC system using the Arduino Nicla Vision

    Intelligently control an HVAC system using the Arduino Nicla Vision

    Reading Time: 2 minutes

    Shortly after setting the desired temperature of a room, a building’s HVAC system will engage and work to either raise or lower the ambient temperature to match. While this approach generally works well to control the local environment, the strategy also leads to tremendous wastes of energy since it is unable to easily adapt to changes in occupancy or activity. In contrast, Jallson Suryo’s smart HVAC project aims to tailor the amount of cooling to each zone individually by leveraging computer vision to track certain metrics.

    Suryo developed his proof of concept as a 1:50 scale model of a plausible office space, complete with four separate rooms and a plethora of human figurines. Employing Edge Impulse and a smartphone, 79 images were captured and had bounding boxes drawn around each person for use in a FOMO-based object detection model. After training, Suryo deployed the OpenMV firmware onto an Arduino Nicla Vision board and was able to view detections in real-time.

    The last step involved building an Arduino library containing the model and integrating it into a sketch that communicates with an Arduino Nano peripheral board over I2C by relaying the number of people per quadrant. Based on this data, the Nano dynamically adjusts one of four 5V DC fans to adjust the temperature while displaying relevant information on an OLED screen. To see how this POC works in more detail, you can visit Suryo’s write-up on the Edge Impulse docs page.
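    The hand-off between the two boards is a small I2C transaction: the Nicla Vision acts as the controller and periodically writes four bytes, one occupancy count per quadrant, to the Nano. The snippet below is an assumed illustration of that link in MicroPython; the bus number, peripheral address, and payload layout follow the description above, not Suryo’s published code.

    # Illustrative sketch: send per-quadrant people counts to the Nano over I2C.
    import time
    from machine import I2C

    NANO_ADDR = 0x08   # assumed peripheral address configured on the Nano
    i2c = I2C(1)       # hardware I2C bus; the actual bus/pins depend on wiring

    def send_counts(counts):
        # counts is a list like [q1, q2, q3, q4], one value per room quadrant
        i2c.writeto(NANO_ADDR, bytes(counts))

    while True:
        counts = [0, 2, 1, 3]   # placeholder; in the real project these come from the FOMO detections
        send_counts(counts)
        time.sleep(1)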

    The post Intelligently control an HVAC system using the Arduino Nicla Vision appeared first on Arduino Blog.

    Website: LINK

  • Meet Arduino Pro at tinyML EMEA Innovation Forum 2023

    Meet Arduino Pro at tinyML EMEA Innovation Forum 2023

    Reading Time: 3 minutes

    On June 26th-28th, the Arduino Pro team will be in Amsterdam for the tinyML EMEA Innovation Forum – one of the year’s major events for the world of tinyML, where AI models meet agile, low-power devices.

    This is an exciting time for companies like Arduino and anyone interested in accelerating the adoption of tiny machine learning: technologies, products, and ideas are converging into a worldwide phenomenon with incredible potential – and countless applications already.

    At the summit, our team will present a selection of demos that leverage tinyML to create useful solutions in a variety of industries and contexts. For example:

    • A fan anomaly detection system based on the Nicla Sense ME. In this solution developed with SensiML, the Nicla module leverages its integrated accelerometer to constantly measure the vibrations generated by a computer fan. Thanks to a trained model, condition monitoring turns into anomaly detection – the system is able to determine whether the fan is on or off, notify users of any shocks, and even alert them if its super precise and efficient sensor detects sub-optimal airflow.
    • A vineyard pest monitoring system with the Nicla Vision and MKR WAN 1310. Machine vision works at the service of smart agriculture in this solution: even in the most remote field, a pheromone is used to attract insects inside a case lined with glue traps. The goal is not to capture all the insects, but to use a Nicla Vision module to take a snapshot of the captured bugs, recognize the ones that pose a real threat, and send updated data on how many specimens were found. New-generation farmers can thus schedule interventions against pests as soon as needed, before the insects get out of control and cause damage to the crops. Leveraging LoRa® connectivity, this application is both low-power and high-efficiency.
    • An energy monitoring-based anomaly detection solution for DC motors, with the Opta. This application developed with Edge Impulse leverages an Opta WiFi microPLC to easily implement industrial-level, real-time monitoring and fault detection – great to enable predictive maintenance, reducing downtime and overall costs. A Hall effect current sensor is attached in series with the supply line of the DC motor to acquire real-time data, which is then analyzed using ML algorithms to identify patterns and trends that might indicate faulty operation. The DC motor is expected to be in one of two statuses – ON or OFF – but different conditions can be simulated with the potentiometer. When unexpected electric consumption is shown, the Opta WiFi detects the anomaly and turns on a warning LED.

    The Arduino Pro team is looking forward to meeting customers and partners in Amsterdam – championing open source, accessibility, and flexibility in industrial-grade solutions at the tinyML EMEA Innovation Forum!

    The post Meet Arduino Pro at tinyML EMEA Innovation Forum 2023 appeared first on Arduino Blog.

    Website: LINK

  • Enabling automated pipeline maintenance with edge AI

    Enabling automated pipeline maintenance with edge AI

    Reading Time: 2 minutes

    Pipelines are integral to our modern way of life, as they enable the fast transportation of water and energy between central providers and the eventual consumers of that resource. However, the presence of cracks from mechanical or corrosive stress can lead to leaks, and thus waste of product or even potentially dangerous situations. Although methods using thermal cameras or microphones exist, they’re hard to use interchangeably across different pipeline types, which is why Kutluhan Aktar instead went with a combination of mmWave radar and an ML model running on an Arduino Nicla Vision board to detect these issues before they become a real problem.

    The project was originally conceived as an arrangement of parts on a breadboard, including a Seeed Studio MR60BHA1 60GHz radar module, an ILI9341 TFT screen, an Arduino Nano for interfacing with the sensor and display, and a Nicla Vision board. From here, Kutluhan designed his own Dragonite-themed PCB, assembled the components, and began collecting training and testing data for a machine learning model by building a small PVC model, introducing various defects, and recording the differences in data from the mmWave sensor. The system is able to do this by measuring the minute variations in vibrations as liquids move around, with increased turbulence often being correlated with defects.

    After configuring a time-series impulse, a classification model was trained with the help of Edge Impulse that would use the three labels (cracked, clogged, and leakage) to see if the pipe had any damage. It was then deployed to the Nicla Vision where it achieved an accuracy of 90% on real-world data. With the aid of the screen, operators can tell the result of the classification immediately, as well as send the data to a custom web application. 

    [youtube https://www.youtube.com/watch?v=ghSaefzzEXY?feature=oembed&w=500&h=281]

    More details on the project can be found here on its Edge Impulse docs page.

    The post Enabling automated pipeline maintenance with edge AI appeared first on Arduino Blog.

    Website: LINK

  • Want to keep accurate inventory? Count containers with the Nicla Vision

    Want to keep accurate inventory? Count containers with the Nicla Vision

    Reading Time: 2 minutes

    Maintaining accurate records of both the quantities and locations of inventory is vital to running any business operation efficiently and at scale. By leveraging new technologies such as AI and computer vision, items in warehouses, on store shelves, and even in a customer’s hand can be better managed and used to forecast changes in demand. As demonstrated by the Zalmotek team, a tiny Arduino Nicla Vision board can be tasked with recognizing different types of containers and sending the resulting data to the cloud automatically.

    The hardware itself was quite simple, as the Nicla Vision already contained the processor, camera, and connectivity required for the proof-of-concept. Once configured, Zalmotek used the OpenMV IDE to collect a large dataset featuring images of each type of item. Bounding boxes were then drawn using the Edge Impulse Studio, after which a FOMO-specific MobileNetV2 0.35 model was trained and could accurately determine the locations and quantities of objects in each test image.

    Deploying the model was simple thanks to the OpenMV firmware export option, as it could be easily incorporated into the main Python script. In essence, the program continually gathers new images, passes them to the model, and gets the number of detected objects. Afterwards, these counts are published via the MQTT protocol to a cloud service for remote viewing.
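    That main loop is short: snapshot, detect, count, publish. Below is a hedged OpenMV-style MicroPython outline of it; the broker, topic, Wi-Fi credentials, and the assumption that the FOMO model is loaded through the tf module are placeholders rather than Zalmotek’s exact script.

    # Rough outline of the count-and-publish loop (placeholder names throughout).
    import sensor, network, math, time, tf
    from mqtt import MQTTClient   # umqtt-style client from the OpenMV examples

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect("MY_SSID", "MY_PASSWORD")   # placeholders
    while not wlan.isconnected():
        time.sleep_ms(100)

    net = tf.load("trained.tflite")                              # assumed model file name
    client = MQTTClient("nicla-inventory", "mqtt.example.com")   # placeholder broker
    client.connect()

    while True:
        img = sensor.snapshot()
        detections = net.detect(img, thresholds=[(math.ceil(0.5 * 255), 255)])
        count = sum(len(d) for d in detections[1:])   # skip index 0, the background class
        client.publish("warehouse/containers", str(count))
        time.sleep(5)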

    You can read more about the proof of concept in much more detail here on the Edge Impulse blog.

    The post Want to keep accurate inventory? Count containers with the Nicla Vision appeared first on Arduino Blog.

    Website: LINK

  • Vineyard pest monitoring with Arduino Pro

    Vineyard pest monitoring with Arduino Pro

    Reading Time: 7 minutes

    The challenge

    Pest monitoring is essential for the proper management of any vineyard as it allows for the early detection and management of any potential pest infestations. By regularly monitoring the vineyard, growers can identify pests at early stages and take action to prevent further damage. Monitoring can also provide valuable data on pest behaviour, seasonality, and population size. This information can be used to adjust management strategies and protect the quality of grapes harvested from the vineyard.

    One of the most effective ways to monitor pests is with pheromone traps. Pheromone traps use synthetic hormone-like compounds to attract specific insects and correctly estimate their overall presence based on their number, preventing major damage and disease to the plants. Using pheromone traps can help protect vines from serious infestations, reduce pesticide use, and ensure a healthy crop. Additionally, these traps can be used to track the activity of a particular species over time which is useful for predicting when pest populations are likely to peak or decline. By knowing when insect pressure is high or low, winemakers can better plan for treatments and cultivate their land accordingly. 

    The value of conservation and pest control initiatives is immeasurable as the effects of climate change, biodiversity loss, and species invasions become more evident. Traps are widely used for detecting populations, tracking progress on projects, and determining management solutions, in addition to assessing treatment performance.

    Popillia japonica

    Vineyard pest monitoring is the practice of monitoring and controlling vineyard pests such as Popillia japonica, a species of scarab beetle native to Japan that feeds on grapevine leaves and can cause significant damage in vineyards. Traditional pest management relies on manually checked traps, including pheromone traps; these methods are labor-intensive and may not provide accurate or timely monitoring and control.

    Our solution

    We propose a solution for estimating Popillia japonica populations in vineyards using pheromone traps and Computer Vision.  

    This system utilizes LoRa® technology to enable remote monitoring of Popillia japonica in vineyards. Arduino Pro allows farmers to monitor Popillia japonica activity with pheromone traps and collect the data remotely. This makes it easier for farmers to detect infestations early and take action, leading to improved efficiency and higher yields. The IoT technology also helps reduce labor costs associated with manual monitoring.

    By using computer vision in combination with LoRa® technology, real-time data on pest activity can be collected. This information allows growers to better understand the dynamics of vineyard pests such as Popillia japonica, helping them make more informed decisions and reduce their environmental impact. With the right monitoring tools, vineyards can be better prepared to face the increased risk of Japanese beetle outbreaks posed by climate change.

    IoT-based pest monitoring is not only cost-effective, but also helps reduce the environmental impact of pesticide applications, making it an important tool for vineyard managers looking to protect their crops in an ever-changing environment. By taking advantage of innovative technologies like this one, vineyard managers can make sure their crops are healthy, protected from infestations, and ready for a successful harvest season year after year.

    To address the challenge we will devise a pest monitoring system based on sensor nodes that monitor areas in the vineyard and send the collected data to a LoRa® gateway that can either display it locally or push it toward a cloud solution where further computation can be done. Either at the gateway level or in the cloud, alerts can be set based on certain thresholds considered relevant. 

    Bug counting

    For monitoring the number of Popillia japonica in each section of the vineyard, we have chosen the Arduino Nicla Vision, which is ideal for this project because of its advanced image processing capabilities. It combines a powerful dual Arm® Cortex® M7/M4 processor with a 2MP color camera that supports tinyML in a compact format. The full datasheet is available here. For training the object detection model, we have chosen the Edge Impulse platform, where we can easily train and deploy a model that detects the number of bugs in the camera’s view. After deployment, no internet connectivity is needed for the camera, and only the number of bugs is relayed to the Arduino MKR WAN 1310 through UART.
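    The link between the two boards is a one-way UART message carrying the latest bug count. Below is a minimal, assumed sketch of the Nicla Vision side in MicroPython; the UART bus number, baud rate, and message format are placeholders that depend on the actual wiring, not the project’s exact code.

    # Illustrative sketch: relay the detected bug count to the MKR WAN 1310 over UART.
    import time
    from machine import UART

    uart = UART(1, 115200)   # bus id and baud rate are assumptions; check the board pinout

    def send_count(count):
        uart.write(str(count) + "\n")   # the MKR WAN 1310 parses the newline-terminated number

    while True:
        bug_count = 3   # placeholder; in the real system this comes from the Edge Impulse model
        send_count(bug_count)
        time.sleep(60)  # infrequent updates keep the LoRa payload duty cycle low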

    Connectivity

    The Arduino MKR WAN 1310 is a powerful and versatile IoT development board based on the ARM Cortex®-M0+ 32-bit processor, perfect for building connected projects. It supports the LoRa® communication protocol, making it suitable for long-range applications such as vineyard pest monitoring. Moreover, it also supports the UART, I2C, and SPI communication protocols so it can easily be interfaced with other devices. Additionally, the MKR WAN 1310 features an integrated LiPo battery charger to keep your project running 24/7. With its compact size and low energy consumption, this board can be used in a wide range of projects where connectivity is required without sacrificing power efficiency.

    Thanks to its radio connectivity via LoRa® radio transceivers, the data can be sent directly to the nearest LoRa® gateway which forwards it to the Arduino IoT Cloud. The gateway, Arduino Pro WisGate Edge Pro powered by RAKwireless™ ensures secure and reliable connectivity for a wide range of professional applications and is suitable for medium-sized to wide area coverage in industrial environments and remote regions. Its high transmission power and 2x fiberglass antennas with 5dBi gain provide extensive coverage in open environments, making it the perfect fit for IoT commercial outdoor deployment – required for example for parking sensors, remote fleet management, livestock tracking and geofencing, and soil monitoring solutions that maximize crops’ yield.

    Solving it with Arduino Pro

    Now let’s explore how we could put all of this together and what we would need for deployment both in terms of hardware and software stack. The Arduino Pro ecosystem is the latest generation of Arduino solutions bringing users the simplicity of integration and scalable, secure, professionally supported services.

    Hardware requirements

    • Arduino Nicla Vision
    • Arduino MKR WAN 1310
    • Arduino Pro WisGate Edge Pro gateway

    Software requirements

    • OpenMV IDE
    • Arduino IDE
    • Arduino IoT Cloud and The Things Stack (TTS) accounts
    • Edge Impulse account

    The Nicla Vision has been programmed in MicroPython, since the Edge Impulse model was created and tested using the OpenMV IDE; the same script also sends the number of detected bugs to the Arduino MKR WAN 1310 via UART.

    The Arduino MKR WAN 1310 has been programmed in C/C++ using the Arduino IDE and the Arduino IoT Cloud, and registered on The Things Stack (TTS) platform. The Arduino MKR WAN 1310 acts as an end device programmed to receive the number of detected Popillia japonica bugs from the Nicla Vision through UART and forward it to the Arduino IoT Cloud through the nearest LoRa® gateway connected to the TTS service.

    Here is a screenshot from a dashboard created directly in the Arduino IoT Cloud showcasing data received from the sensor nodes:

    Here is an overview of the software stack and how a minimum deployment with one of each hardware module communicates to fulfill the proposed solution:

    Conclusion

    By combining computer vision with LoRa® technology, farmers can create a reliable vineyard pest monitoring system that is capable of estimating the population of Popillia japonica quickly and accurately. With this IoT-based solution, farmers can monitor Popillia japonica activity in their vineyard and take action before the beetles cause significant damage, protecting the vineyard from infestations and ensuring higher yields. With vineyard pest monitoring with Arduino Pro, farmers no longer need to rely on labor-intensive manual methods; instead, they can use IoT technology to create an efficient and cost-effective pest monitoring system that provides accurate data about Popillia japonica activity in their vineyards.

    In summary, pheromone traps are an important tool for protecting vineyards from pests and ensuring a healthy harvest season and great wines. Salute! 

    The post Vineyard pest monitoring with Arduino Pro appeared first on Arduino Blog.

    Website: LINK

  • Environmental monitoring of corporate offices with Arduino Pro

    Environmental monitoring of corporate offices with Arduino Pro

    Reading Time: 7 minutes

    The challenge

    The quality of the air we breathe has a direct impact on our health. Poor air quality can cause a variety of health problems, including respiratory infections, headaches, and fatigue. It can also aggravate existing conditions such as asthma and allergies. That’s why it’s so important to monitor the air quality in your office and take steps to improve it if necessary.

    Furthermore, the number of people in an office can have a significant impact on air quality. The more people there are, the greater the chance of contaminants being emitted into the air. This is why environmental monitoring is so important in corporate offices; it helps to ensure that the air quality is safe for all workers.

    The last few years have added yet another layer to this challenge: the COVID-19 pandemic has forced many businesses to re-evaluate their workplace safety protocols. One of the most important considerations is air quality, since poor air quality can lead to a variety of health problems, including respiratory infections.

    Environmental monitoring in buildings refers to the security and privacy practices used to protect workers and office buildings from airborne contaminants. This includes collecting data on air quality, temperature, humidity, and other environmental factors. This data is then used to assess the risk of exposure to hazardous materials and take steps to mitigate or eliminate those risks.

    Our solution

    To address the challenge, we will devise an environmental monitoring system based on sensor nodes that monitor each room and send the collected data to a gateway that can either display it locally or push it toward a cloud solution where further computation can be done. Either at the gateway level or in the cloud, alerts can be set based on certain thresholds considered relevant. 

    Air quality monitoring

    For monitoring the environmental conditions we have chosen the Arduino Nicla Sense ME, which is designed to easily analyze motion and the surrounding environment – hence the “M” and “E” in the name. It measures rotation, acceleration, pressure, humidity, temperature, air quality, and CO2 levels thanks to Bosch Sensortec sensors that are completely new to the market.

    The sensor we are most interested in on the Nicla Sense ME is the BME688, the first gas sensor with artificial intelligence (AI) and integrated high-linearity and high-accuracy pressure, humidity, and temperature sensors. It is housed in a robust yet compact 3.0 x 3.0 x 0.9 mm³ package and specially developed for mobile and connected applications where size and low power consumption are critical requirements. The gas sensor can detect volatile organic compounds (VOCs), volatile sulfur compounds (VSCs), and other gasses such as carbon monoxide and hydrogen in the part per billion (ppb) range.

    The full datasheet is available here.

    People counting

    For monitoring the number of people in each room we have chosen the Arduino Nicla Vision, which combines a powerful STM32H747AII6 dual Arm Cortex-M7/M4 processor with a 2MP color camera that supports tinyML, as well as a smart six-axis motion sensor, integrated microphone, and distance sensor.

    One thing that must be addressed when using cameras is privacy, and for good reason! In our case, the cameras are used to run an edge model that evaluates the number of people in view, and no actual video stream or pictures ever leave the camera. Transmitting only the count makes the system both safe and efficient.

    For this purpose, we have chosen the Edge Impulse platform, where we can easily train and deploy a model that detects the number of people in the camera’s view. After deployment, no internet connectivity is needed for the camera, and only the number of people is relayed to the gateway.

    Both the Nicla Vision and Nicla Sense ME have the same size and PCB format, with the main difference being that one features a camera and the other one an array of sensors. For each, we have created a 3D-printed enclosure to accommodate mounting and fulfilling their primary functions easily.

    Edge computing

    For the gateway we have chosen the Portenta X8, which is a powerful, industrial-grade SOM with Linux OS preloaded onboard, capable of running device-independent software thanks to its modular container architecture. It features an NXP i.MX 8M Mini Cortex-A53 quad-core, up to 1.8GHz per core + 1x Cortex-M4 up to 400MHz, plus the STMicroelectronics STM32H747 dual-core Cortex-M7 up to 480Mhz and M4 32-bit Arm MCU up to 240MHz.

    Since space is not an issue when designing building management systems, we have chosen the Portenta Max Carrier to host and power the Portenta X8 while enhancing its connectivity options and providing easy mounting and power supply plugs. We hosted the devices inside a wall-mountable enclosure sized to fit the hardware.

    The Portenta X8 can gather via BLE the data from quite a few sensor nodes as long as they are in range and not blocked by heavy walls or structures in between and either store the data locally for displaying it via the local server stack or relay it further to the cloud.

    IoT Cloud solution

    Although the Portenta X8 board is capable of storing data locally, there may be times when it is also desirable to send data to the cloud. This can be accomplished by forwarding data from the InfluxDB database on the Portenta X8 board to the Arduino IoT Cloud via MQTT. The arduino-iot-js NPM module makes it easy to set up this connection, and the steps to do so are not covered in this tutorial. For illustrative purposes, however, the diagram below offers a brief overview of our proposed architecture for one potential deployment scenario in a building with multiple rooms.

    Solving it with Arduino Pro

    Now let’s explore how we could put all of this together and what we would need for deployment both in terms of hardware and software stack. The Arduino Pro ecosystem is the latest generation of Arduino solutions bringing users the simplicity of integration and scalable, secure, professionally supported services.

    Hardware requirements

    • Arduino Nicla Vision
    • Arduino Nicla Sense ME
    • Arduino Portenta X8
    • Enclosures

    Software requirements

    • Arduino IDE
    • OpenMV IDE
    • Edge Impulse account

    The Nicla Vision has been programmed in MicroPython, since the Edge Impulse model was created and tested using the OpenMV IDE; the same code also sends the data over BLE using the Python BLE library.

    The Nicla Sense ME has been programmed in C/C++ using the Arduino IDE: reading the sensors and sending their data over BLE is straightforward and fast in compiled code, and there is no heavy computation comparable to the video processing on the Nicla Vision.

    The Portenta X8 with its Linux OS preloaded onboard is fully capable of running Docker and thus containers with a vast array of functionalities. In our case, we found it most useful to use a time series database to store the data and display it locally. There is a pre-built container including InfluxDB, Grafana, and Node-Red that can be easily deployed to achieve this task.

    Here is a screenshot from a dashboard created directly in InfluxDB showcasing data received from the sensor nodes:

    The dashboard can be visualized by accessing the InfluxDB interface on the Portenta X8 IP on port 8086 in a browser on another computer connected to the same WiFi network (for example, http://192.168.1.199:8086/).

    Here is an overview of the software stack and how a minimum deployment with one of each hardware module communicates to fulfill the proposed solution:

    Conclusion

    Environmental monitoring is essential for corporate offices in order to ensure the safety and health of workers. The correlation between the number of people in an office and air quality means that more people can lead to more contaminants in the air. Additionally, data latency can be a challenge when it comes to environmental monitoring. This is why it is important to have systems in place that can collect data quickly and efficiently and make this data available to decision-makers in a timely manner.

    There are many benefits to using this solution. First, it enables building managers to monitor the environmental conditions in each room and take steps to mitigate any risks. Second, it provides a system for collecting data on occupancy, air quality, temperature, humidity, and other environmental factors. This data can be used to assess the risk of exposure to hazardous materials and take steps to mitigate or eliminate those risks. Finally, our solution is easy to use and can be installed in any office building.

    Website: LINK

  • Count elevator passengers with the Nicla Vision and Edge Impulse

    Count elevator passengers with the Nicla Vision and Edge Impulse

    Reading Time: 3 minutes

    Modern elevators are powerful, but they still have a payload limit. Most contain a plaque with the maximum number of passengers (a number based on their average weight, with plenty of room for error). But hardly anyone reads the capacity limit when stepping into an elevator, let alone worries about exceeding it. In reality, manufacturers build their elevators to a size that prevents an excessive number of passengers. But as a demonstration, Nekhil R. put together a tutorial that explains how to use the Edge Impulse ML platform with an Arduino Nicla Vision board to count elevator passengers.

    The Nicla Vision is a new board built specifically for computer vision applications — especially those that incorporate machine learning. In its small footprint (less than a square inch), there is a powerful STM32H747AII6 microcontroller, a 2MP color camera, a six-axis IMU, a time of flight sensor, a microphone, WiFi and Bluetooth, and an onboard LiPo battery charger — and it’s officially supported by Edge Impulse, making it well suited for ML projects.

    To build this passenger counter, all you need is the Nicla Vision, a buzzer, an LED, a push button, a power source, and the 3D-printable enclosure. The guide will walk you through how to train and deploy the object detection model, which is what Edge Impulse excels at. It lets you train a model optimized for microcontrollers and then outputs code that is easy to flash onto an Arduino. There are many optimization tricks involved, such as lowering the video resolution and processing the video as grayscale, but Edge Impulse takes care of all of the difficult work for you.

    After deploying your model to the Nicla Vision, you can mount this device anywhere in an elevator that gives you a view of the whole car. It keeps a running log of passenger counts, which you can visualize later in graphs or as raw data. If the device sees a passenger count that exceeds the set limit, it will flash the LED and sound the buzzer.
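    The alert logic layered on top of the model is straightforward. Here is a hedged MicroPython outline of the count-and-alarm step; the passenger limit, the LED and buzzer pin names, and the count_passengers() helper (standing in for the deployed object detection model running on the grayscale, low-resolution feed) are placeholder assumptions rather than Nekhil R.’s published code.

    # Illustrative alert logic for the passenger counter (placeholder pins and limit).
    import time
    from machine import Pin

    led = Pin("LEDR", Pin.OUT)    # placeholder pin names; the actual wiring may differ
    buzzer = Pin("D2", Pin.OUT)
    LIMIT = 4                     # assumed capacity for the demo
    log = []                      # running log of counts, later visualized as graphs or raw data

    def count_passengers():
        # Hypothetical helper: in the real project this wraps the deployed object detection model.
        return 3

    while True:
        passengers = count_passengers()
        log.append((time.time(), passengers))
        over = passengers > LIMIT
        led.value(1 if over else 0)
        buzzer.value(1 if over else 0)
        time.sleep(1)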

    You probably don’t have a reason to count elevator passengers, but this is a fantastic demonstration of what you can accomplish with the Nicla Vision board and Edge Impulse.

    [youtube https://www.youtube.com/watch?v=yD8CJGDpgfY?feature=oembed&w=500&h=281]

    Website: LINK

  • Reading analog gauges with the Nicla Vision

    Reading analog gauges with the Nicla Vision

    Reading Time: 2 minutes

    Arduino Team, August 13th, 2022

    Analog instruments are everywhere and used to measure pressure, temperature, power levels, and much more. With the advent of digital sensors, many of these instruments quickly became obsolete, and the ones that remain either need to be converted to a digital format or require frequent human monitoring. However, the Zalmotek team has come up with a solution that incorporates embedded machine learning and computer vision in order to read these values autonomously.

    Mounted inside of a custom enclosure, their project relies on an Arduino Pro Nicla Vision board, which takes periodic images for further processing and inference. They began by generating a series of synthetic gauge pictures that have the dial at various positions, and labeled them either low, normal, or high. This collection was then imported into the Edge Impulse Studio and used to train a machine learning model on the 96x96px samples due to the limited memory. Once created, the neural network could successfully determine the gauge’s state about 92% of the time.

    The final step of this project involved deploying the firmware to the Nicla Vision and setting the image size to the aforementioned 96x96px. By using computer vision in this way, frequent readings can be taken while also minimizing cost and power consumption.
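    On the board itself, the deployment boils down to a small classification loop at the 96x96px input size. The snippet below is a hedged sketch following Edge Impulse’s OpenMV classification example; the file name, label order, and tf-module usage are assumptions, not Zalmotek’s exact firmware.

    # Hedged sketch of the low/normal/high gauge classifier on the Nicla Vision.
    import sensor, tf

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.set_windowing((96, 96))       # match the 96x96px training resolution
    sensor.skip_frames(time=2000)

    net = tf.load("trained.tflite")      # assumed model file name
    labels = ["low", "normal", "high"]   # order must match the deployed model

    while True:
        img = sensor.snapshot()
        scores = net.classify(img)[0].output()
        best = max(range(len(scores)), key=lambda i: scores[i])
        print("gauge state:", labels[best], "confidence:", scores[best])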

    More details on Zalmotek’s system can be found here on its Edge Impulse docs page.

    Website: LINK

  • Meet Nikola, a camera-enabled smart companion robot

    Meet Nikola, a camera-enabled smart companion robot

    Reading Time: 2 minutes

    Arduino Team, June 6th, 2022

    For this year’s Embedded Vision Summit, Hackster.io’s Alex Glow created Nikola, a companion robot successor to her previous Archimedes bot. This time, the goal was to embed a privacy-focused camera and microphone system as well as several other components that would increase its adorability.

    The vision system uses a Nicla Vision board to read a QR code within the current frame thanks to the OpenMV IDE and the code Glow wrote. After it detects a code containing the correct URL, it activates Nikola’s red LED to signify that it’s taking a photo and storing it automatically.
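    OpenMV makes the QR step nearly a one-liner via find_qrcodes(). The sketch below shows the general pattern with the onboard red LED; the expected URL and the saved file name are placeholders rather than Glow’s exact script.

    # Illustrative OpenMV sketch of Nikola's QR-triggered photo capture.
    import sensor, time
    from pyb import LED

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    red = LED(1)                             # LED(1) is the red LED on OpenMV boards
    EXPECTED = "https://example.com/nikola"  # placeholder for the URL the code checks for

    while True:
        img = sensor.snapshot()
        for code in img.find_qrcodes():
            if code.payload() == EXPECTED:
                red.on()                     # signal that a photo is being taken
                img.save("capture.jpg")      # store the frame (assumed file name)
                time.sleep_ms(500)
                red.off()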

    Apart from the vision portion, Glow also included a pair of ears that move with the help of two micro servos controlled by a Gemma M0 board from Adafruit, which give it some extra character. And lastly, Nikola features an internal mount that holds a microphone for doing interviews, thus letting the bot itself get held near the interviewee. 

    Nikola is a great compact and fuzzy companion robot that can be used not just for events, but also for interviews and simply meeting people. You can see how Glow made the robot in more detail here on Hackster.io or by watching her video below!

    [youtube https://www.youtube.com/watch?v=E4bL_V7fydc?feature=oembed&w=500&h=281]

    Website: LINK

  • Meet the Nicla Vision: Love at first sight!

    Meet the Nicla Vision: Love at first sight!

    Reading Time: 3 minutes

    Arduino Team, March 8th, 2022

    We’re proud to announce a new addition to the Arduino ecosystem, the Nicla Vision.

    This is a brand new, ready-to-use, 2MP standalone camera that lets you analyze and process images on the edge for advanced machine vision and edge computing applications.

    Now you can add image detection, facial recognition, automated optical inspection, vehicle plate reading, gesture recognition and more to your projects. Nicla Vision has a powerful dual processor and is packed with features that make an infinite number of applications possible in building and industrial automation, safety and security, and prototyping. Everything from business-savvy predictive maintenance (by detecting and analyzing surface wear, for example) to user-friendly smart kiosks that anyone can explore via intuitive gestures. All true to Nicla’s mission to provide a new range of easy-to-use, cost-effective and accessible tools to advanced users and enthusiasts alike.

    Packed with potential, you can use Nicla Vision anywhere because it’s small in size and can be battery-powered. At the far end of your construction site, or in a tight spot in front of analog meters, it works wherever it’s needed.

    What’s more, Nicla Vision is compatible with Portenta and MKR components. It fully integrates with OpenMV, supports MicroPython and features WiFi and Bluetooth® Low Energy connectivity to complement a wide range of professional and consumer equipment. This means you can easily add ready-to-use sensing capabilities such as a camera, microphone, IMU and ToF (time-of-flight) sensor to your projects.

    Prototyping machine vision applications just got a lot faster.

    Nicla Vision Features

    • Part of the Nicla range alongside the Nicla Sense ME, it’s based on Arduino’s smallest form factor at just 22.86 x 22.86 mm.
    • It has a powerful dual core microcontroller and is equipped with a 2MP color camera that supports tinyML, taking image processing capabilities to the next level.
    • Nicla Vision offers both WiFi and Bluetooth® Low Energy connectivity.
    • On top of image detection and recognition, it senses and captures distance, sound, movement and vibration data, thanks to a smart six-axis motion sensor, integrated microphone and distance sensor.
    • It maximizes compatibility: all Nicla products also fit perfectly into any solution based on Portenta or MKR.
    • Need a standalone solution? Nicla Vision can be battery-powered.

    Whether in a dark warehouse or on the other side of your house, the Nicla Vision has the abilities and intelligence to keep an eye on things for you. If you’ve ever dreamed about interacting with machines that simply understand what you need because they can see you, this device is for you. 

    Take a closer look at the Nicla Vision right here, or find out more in our open online documentation. Or you can purchase a Nicla Vision from the Arduino Store and see for yourself.

    Website: LINK