Tag: tech

  • Can tripedal robots actually walk?

    Reading Time: 2 minutes

    Building walking robots is difficult because they need either a lot of legs or some ability to balance throughout their gait. There is a reason the robots designed by companies like Boston Dynamics are so impressive. Plenty of hobbyists have made bipedal and quadrupedal robots, while largely ignoring tripedal designs. To find out if they could be practical, James Bruton created a prototype tripedal robot.

    Compared to a bipedal robot, a tripedal robot is more stable when standing still, but less stable when walking. A biped can keep its center of gravity almost directly above the foot that contacts the ground. A tripedal robot, on the other hand, must balance on two legs while moving the third, with its center of gravity somewhere above the middle of the triangle formed by the three feet. That makes walking gaits difficult to achieve.

    Bruton built this prototype using a 3D-printed body, legs actuated by servo motors, and an Arduino Mega 2560 for control. The three legs are arranged with radial symmetry and each leg has three joints. Bruton attempted to give the robot a gait in which it momentarily balances on two legs while lifting and swinging the third around.
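
    Bruton’s firmware isn’t reproduced in the post, but the structure of such a gait is easy to sketch. The following is a minimal, illustrative Arduino sketch, not Bruton’s actual code; the pin assignments, joint angles, and timings are all assumptions.

    ```cpp
    // Minimal tripedal gait sketch (illustrative, not Bruton's actual firmware).
    // Assumes nine hobby servos on pins 2-10 of a Mega 2560: three joints per
    // leg, with the legs arranged at 120-degree intervals.
    #include <Servo.h>

    const int NUM_LEGS = 3;
    const int JOINTS_PER_LEG = 3;
    Servo joints[NUM_LEGS][JOINTS_PER_LEG];

    // Neutral standing pose, in degrees (placeholder values).
    const int NEUTRAL[JOINTS_PER_LEG] = {90, 120, 60};

    void setPose(int leg, int hip, int knee, int ankle) {
      joints[leg][0].write(hip);
      joints[leg][1].write(knee);
      joints[leg][2].write(ankle);
    }

    void setup() {
      int pin = 2;
      for (int l = 0; l < NUM_LEGS; l++)
        for (int j = 0; j < JOINTS_PER_LEG; j++)
          joints[l][j].attach(pin++);
      for (int l = 0; l < NUM_LEGS; l++)
        setPose(l, NEUTRAL[0], NEUTRAL[1], NEUTRAL[2]);
    }

    void loop() {
      // Cycle through the legs: shift weight onto two legs, then lift and
      // swing the third forward before planting it again.
      for (int swing = 0; swing < NUM_LEGS; swing++) {
        // Lean away from the swing leg by shifting the two stance hips.
        for (int l = 0; l < NUM_LEGS; l++)
          if (l != swing) setPose(l, NEUTRAL[0] + 15, NEUTRAL[1], NEUTRAL[2]);
        delay(300);
        setPose(swing, NEUTRAL[0], NEUTRAL[1] - 40, NEUTRAL[2] + 20); // lift
        delay(300);
        setPose(swing, NEUTRAL[0] + 30, NEUTRAL[1], NEUTRAL[2]);      // swing and plant
        delay(300);
        for (int l = 0; l < NUM_LEGS; l++)  // return everything to neutral
          setPose(l, NEUTRAL[0], NEUTRAL[1], NEUTRAL[2]);
        delay(300);
      }
    }
    ```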

    [youtube https://www.youtube.com/watch?v=wobHLlWGX98]

    But that was very inefficient and clumsy. Bruton believes that he could achieve better results by equipping the robot with an IMU. That would give it a sense of balance, which could help it remain steady on two legs through a gait. With a counterbalancing weight, that could make a big difference. But for now, Bruton is putting this experiment on the back burner.

  • This electronic compass guides you using LEDs and OLED display

    Reading Time: 2 minutes

    Of all the tools humanity has developed for navigation over our history, the humble compass is arguably the most important. A compass is cheap to produce, reliable, and incredibly useful for every kind of navigation. But standard mechanical compasses aren’t perfect; they can stick, are easy for people to misread, and don’t always indicate exact angles well. This simple digital compass built by Mirko Pavleski solves those problems.

    There are, of course, many digital compasses on the market that will work just as well as this device. But this could still be a fun weekend project. It is a digital compass that aids navigation in two ways. First, it indicates magnetic north using an outer ring of LEDs. But there are only 60 LEDs, so that can be off by as much as three degrees in either direction. For that reason, this device also contains an OLED display that shows the exact angle (down to two decimal places) relative to magnetic north.

    This is a very simple device to build and it only requires four components: an Arduino Nano board, an HMC5883L compass module, a ring with 60 WS2812B individually addressable RGB LEDs, and an OLED screen. If you want to use it on the go, a standard 9V battery will provide portable power. Those components fit into a basic 3D-printable case that protects the electronics and puts everything in the correct orientation.
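
    Pavleski’s full code is available through his project, but the core logic is compact enough to sketch. Here is a hedged version of the heading-to-LED mapping, assuming the HMC5883L at its usual I2C address 0x1E and the LED ring on pin 6; calibration, tilt compensation, and the OLED output are omitted.

    ```cpp
    // Hedged sketch of the core compass logic (pin choices are assumptions).
    #include <Wire.h>
    #include <Adafruit_NeoPixel.h>

    const uint8_t HMC5883L_ADDR = 0x1E;
    Adafruit_NeoPixel ring(60, 6, NEO_GRB + NEO_KHZ800);  // 60 LEDs on pin 6

    void setup() {
      Wire.begin();
      Wire.beginTransmission(HMC5883L_ADDR);
      Wire.write(0x02);  // mode register
      Wire.write(0x00);  // continuous measurement mode
      Wire.endTransmission();
      ring.begin();
    }

    void loop() {
      // Data output registers start at 0x03, ordered X, Z, Y (MSB first).
      Wire.beginTransmission(HMC5883L_ADDR);
      Wire.write(0x03);
      Wire.endTransmission();
      Wire.requestFrom(HMC5883L_ADDR, (uint8_t)6);
      int16_t x = (Wire.read() << 8) | Wire.read();
      int16_t z = (Wire.read() << 8) | Wire.read();
      int16_t y = (Wire.read() << 8) | Wire.read();

      float heading = atan2((float)y, (float)x) * 180.0 / PI;
      if (heading < 0) heading += 360.0;

      // 60 LEDs -> 6 degrees per LED; light the one closest to magnetic north.
      int led = ((int)round(heading / 6.0)) % 60;
      ring.clear();
      ring.setPixelColor(led, ring.Color(255, 0, 0));
      ring.show();
      // The exact heading (two decimal places) would also go to the OLED here.
      delay(100);
    }
    ```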

    [youtube https://www.youtube.com/watch?v=4SRXFxqP2Fw]

  • Add peripheral lighting to improve VR immersion

    Reading Time: 2 minutes

    The entire purpose of virtual reality (VR) is to improve immersion beyond what can be achieved with a regular TV or monitor. So it can be frustrating to VR users when the darkness in their peripheral vision reminds them that they are, in fact, wearing a headset. To improve the experience on Valve Index VR headsets without adding too much cost, Staton developed a peripheral ambient lighting system called VR Ambilight.

    VR Ambilight works in the same way as Ambilight ambient backlighting for TVs and monitors. The system looks at the colors of the pixels around the entire outer border of the screen, then sets the colors of LEDs to match. That creates a gentle transition from the screen to the surrounding wall. When applied to VR, it extends the screen content into the user’s periphery. Because the user can’t see anything in their periphery in detail, the colored light is enough to maintain the illusion and eliminate breaks in immersion.

    The only hardware components necessary for this system were an Arduino Nano board and two short strips of WS2812B individually addressable RGB LEDs. The LEDs mount inside the Valve Index VR headset, with a thin sheet of translucent white plastic acting as a diffuser. Prismatik software works with Steam to detect the pixel colors along the screen edges, then uses a simple script to pass that along to the Arduino. The Valve Index has a handy USB port built in, which helps to keep the wiring nice and tidy.
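
    Prismatik can stream colors over serial using the widely supported Adalight protocol, so the Arduino side reduces to a small frame parser. This is a hedged sketch of such a receiver, assuming both strips are chained on pin 4 for a total of 20 LEDs; the LED count and pin are placeholders.

    ```cpp
    // Hedged Adalight-style receiver (LED count and pin are assumptions).
    #include <Adafruit_NeoPixel.h>

    const int NUM_LEDS = 20;
    Adafruit_NeoPixel strip(NUM_LEDS, 4, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      Serial.begin(115200);
    }

    uint8_t blockingRead() {
      while (!Serial.available()) {}
      return Serial.read();
    }

    void loop() {
      // Frame header: 'A' 'd' 'a', then count-high, count-low, checksum.
      if (blockingRead() != 'A') return;
      if (blockingRead() != 'd') return;
      if (blockingRead() != 'a') return;
      uint8_t hi = blockingRead();
      uint8_t lo = blockingRead();
      uint8_t chk = blockingRead();
      if (chk != (hi ^ lo ^ 0x55)) return;  // bad checksum, resync on next header

      for (int i = 0; i < NUM_LEDS; i++) {
        uint8_t r = blockingRead();
        uint8_t g = blockingRead();
        uint8_t b = blockingRead();
        strip.setPixelColor(i, r, g, b);
      }
      strip.show();
    }
    ```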

  • Small-footprint keyword spotting for low-resource languages with the Nicla Voice

    Reading Time: 2 minutes

    Speech recognition is everywhere these days, yet some languages, such as Shakhizat Nurgaliyev and Askat Kuzdeuov’s native Kazakh, lack sufficiently large public datasets for training keyword spotting models. To make up for this disparity, the duo explored generating synthetic datasets using a neural text-to-speech system called Piper, and then extracting speech commands from the audio with the Vosk Speech Recognition Toolkit.

    Beyond simply building a model to recognize keywords from audio samples, Nurgaliyev and Kuzdeuov’s primary goal was to also deploy it onto an embedded target, such as a single-board computer or microcontroller. Ultimately, they went with the Arduino Nicla Voice development board since it contains not just an nRF52832 SoC, a microphone, and an IMU, but an NDP120 from Syntiant as well. This specialized Neural Decision Processor helps to greatly speed up inferencing times thanks to dedicated hardware accelerators while simultaneously reducing power consumption. 

    With the hardware selected, the team began to train their model with a total of 20.25 hours of generated speech data spanning 28 distinct output classes. After 100 learning epochs, it achieved an accuracy of 95.5% and only consumed about 540KB of memory on the NDP120, thus making it quite efficient.

    [youtube https://www.youtube.com/watch?v=1E0Ff0ds160]

    To read more about Nurgaliyev and Kuzdeuov’s project and how they deployed an embedded ML model that was trained solely on generated speech data, check out their write-up here on Hackster.io.

  • VBox is like an AI-powered Pandora boombox

    Reading Time: 2 minutes

    Discovering new music is difficult, which makes it a frustrating experience for listeners and services alike. Identifying what one person liked about a specific song is a challenge when music is so subjective. Two different people may love the same song, but for different reasons that affect their wider tastes. In an attempt to improve the situation, Danning Liang and Artem Laptiev from MIT’s School of Architecture and Planning built a kind of AI-powered boombox called VBox that helps listeners discover music in a new way.

    Most existing services use some combination of listener data and qualitative categorization of songs to aid in music discovery. But those connections are obvious and tend not to identify the factors that actually predict a listener’s enjoyment of a song. Artificial intelligence models, on the other hand, excel at finding connections and patterns that we might not see ourselves. In this case, VBox uses OpenAI’s natural language models to categorize music and find similar songs. As a song plays, VBox will list keywords related to the music. If a specific keyword resonates with the listener, they can select it to influence the next song choice.

    There aren’t a lot of technical details available, but we do know that an Arduino board is somewhere in the mix. It uses RFID to identify genre cards that start the music discovery process. The keywords scroll across an LED matrix display, and a crank handle attached to a rotary encoder lets listeners move through the keyword list. The enclosure is made of gorgeous stamped sheet metal with a leather strap and some 3D-printed internal framework for the electronic components. Music pumps out through a pair of small speakers.

    [youtube https://www.youtube.com/watch?v=6Xez70PCn6s]

    This is more of an art project and an AI experiment than a true attempt at creating an appealing music discovery system, but the idea is novel and it very well could prove useful for some music lovers.

  • This recycling bin sorts waste using audio classification

    Reading Time: 2 minutes

    Although a large percentage of our trash can be recycled, only a small percentage actually makes it to the proper facility due, in part, to being improperly sorted. So in an effort to help keep more of our trash out of landfills without the need for extra work, Samuel Alexander built a smart recycling bin that relies on machine learning to automatically classify the waste being thrown in and sort it into separate internal compartments.

    Because the bin must know what trash is being tossed in, Alexander began this project by first constructing a minimal rig with an Arduino Nano 33 BLE Sense to capture sounds and send them to an Edge Impulse project. From here, the recordings were split into 60 one-second samples for each rubbish type, including cans, paper, bottles, and random background noise. The model, once trained, was then deployed to the Nano as a custom Arduino library.

    With the board now able to determine what type of garbage has been thrown in, Alexander got to work on the remaining portions of the smart bin. The base received a stepper motor that spins the four compartments to line up with a servo-actuated trap door, while a LiPo battery pack provides power to everything for fully wireless operation.
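
    Alexander’s exact code lives in his write-up, but the sorting mechanics can be sketched as follows. The pins, step counts, and the stubbed classifier are assumptions; in the real bin, the class index comes from the deployed Edge Impulse library.

    ```cpp
    // Hedged sketch of the sorting mechanics (pins, step counts, and the
    // classification stub are assumptions).
    #include <Stepper.h>
    #include <Servo.h>

    const int STEPS_PER_REV = 2048;              // common 28BYJ-48 geared stepper
    Stepper carousel(STEPS_PER_REV, 8, 10, 9, 11);
    Servo trapDoor;

    int currentCompartment = 0;

    // Placeholder: in the real bin this index comes from the Edge Impulse
    // classifier (e.g. 0 = cans, 1 = paper, 2 = bottles, 3 = other).
    int classifyLatestSound() {
      return random(0, 4);
    }

    void rotateToCompartment(int target) {
      int delta = (target - currentCompartment + 4) % 4;
      carousel.step(delta * (STEPS_PER_REV / 4));  // quarter turn per compartment
      currentCompartment = target;
    }

    void setup() {
      carousel.setSpeed(10);   // RPM
      trapDoor.attach(6);
      trapDoor.write(0);       // trap door closed
    }

    void loop() {
      int category = classifyLatestSound();
      rotateToCompartment(category);
      trapDoor.write(90);      // open the trap door to drop the item
      delay(1000);
      trapDoor.write(0);
      delay(500);
    }
    ```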

    [youtube https://www.youtube.com/watch?v=roWY29RNFU0]

    To read more about how this bin was created, you can visit Alexander’s write-up here on Hackaday.io.

  • Amazon Echo becomes charming animatronic robot

    Reading Time: 2 minutes

    There is no shame in taking advantage of a voice assistant device, like an Amazon Echo with Alexa. Those devices are useful and can add real convenience to your life. But they lack personality and any feeling of a soul—not just because of the stilted voices, but also because of the boring industrial designs intended for mass market appeal. To inject some life into his Amazon Echo, Workshop Nation turned it into a charming animatronic robot.

    At its heart, this is still an Amazon Echo and it retains all of that functionality. But the Alexa brain now inhabits a body that looks like it was made by a wacky scientist from an ’80s movie featuring robot hijinks. It was cobbled together from salvaged parts, like an old CRT TV, as well as new components. It has 3D-printed animatronic eyes based on a design by Will Cogley and actuated by servo motors. Something akin to a voice waveform appears on the CRT whenever Alexa speaks, which works by using the audio signal voltage to drive one of the electron beam deflection coils.

    An Arduino Mega 2560 board controls the animatronics and also monitors a Useful Sensors Person Sensor. Its purpose is to direct the movement of the eyes to follow any people in the area. The Arduino also lets the users bypass the normal “Alexa” wake word so they can ask questions starting with whatever term they prefer. Those components (the bulk of which belong to the CRT) all attach to a frame made of laser-cut clear acrylic and threaded rods.
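
    The Person Sensor exposes detections over I2C as a simple packet of face bounding boxes. As a hedged illustration (the struct follows the layout published in Useful Sensors’ sample header, and the servo pin is an assumption), eye tracking might look like this:

    ```cpp
    // Hedged sketch of eye tracking from the Person Sensor; treat the packet
    // layout as an assumption drawn from the vendor's sample code.
    #include <Wire.h>
    #include <Servo.h>

    const uint8_t PERSON_SENSOR_ADDR = 0x62;

    typedef struct {
      uint8_t box_confidence;
      uint8_t box_left;    // 0-255 across the sensor's field of view
      uint8_t box_top;
      uint8_t box_right;
      uint8_t box_bottom;
      int8_t  id_confidence;
      int8_t  id;
      uint8_t is_facing;
    } PersonFace;

    Servo eyePan;

    void setup() {
      Wire.begin();
      eyePan.attach(5);
    }

    void loop() {
      // Results packet: 2 reserved bytes, 2-byte payload size, face count,
      // then the face structs (we only read the first one here).
      const uint8_t HEADER = 4;
      Wire.requestFrom(PERSON_SENSOR_ADDR, (uint8_t)(HEADER + 1 + sizeof(PersonFace)));
      for (int i = 0; i < HEADER; i++) Wire.read();  // skip the header
      int8_t numFaces = Wire.read();
      if (numFaces > 0) {
        PersonFace face;
        Wire.readBytes((uint8_t *)&face, sizeof(face));
        int centerX = (face.box_left + face.box_right) / 2;  // 0-255
        eyePan.write(map(centerX, 0, 255, 40, 140));         // look toward the face
      }
      delay(200);  // the sensor updates a few times per second
    }
    ```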

    [youtube https://www.youtube.com/watch?v=bO-DWWFolPw&t=1011]

    The result is a contraption that combines all of the convenience of a modern voice assistant with the aesthetic appeal of a science fair reject.

  • You have 3 ways to meet Massimo Banzi in the UK!

    Reading Time: 3 minutes

    Massimo Banzi and the Arduino Pro team will be crossing the Channel soon for a short tour of Southern England, touching base with long-time partners and meeting many new Arduino fans! 

    On July 11th at 4PM BST, Massimo has been invited to give a Tech Talk at Arm’s headquarters in Cambridge, as part of the company’s ongoing series where “leading experts cover topics across the industry, including artificial intelligence, automotive, consumer technology, infrastructure, and IoT.” Register now to attend the talk remotely, anywhere in the world.

    Fancy a pint and a fireside chat? Come and meet us in London at the Cittie of Yorke in Holborn on July 12th at 6PM. You can learn about Arduino’s latest products and future vision, straight from the co-founder himself. The event is free and no registration is required, but admission will be regulated depending on the venue’s capacity – get there early to save your seat!

    Finally, on July 13th we are excited to announce Arduino Pro will debut with a booth at Hardware Pioneers Max. Come visit us at the Business Design Centre in London, booth #48, to chat with our experts. Not sure where to begin? Our demos make great conversation starters! At the show, look for these:

    • An industrial-grade computer built with a Portenta X8 and Max Carrier. The X8’s hybrid combination of microprocessor and microcontroller yields unprecedented flexibility to simultaneously run Linux apps and perform real-time tasks. Pair that with the Max Carrier and an 8″ screen and you have a secure and powerful computer to deploy advanced AI algorithms and ML on the edge. The Portenta X8 can also act as a multi-protocol gateway: data from onsite sensors and controllers (e.g. temperature, operation time, warning codes) are collected and processed thanks to the module’s supported industrial protocols, then sent to the Cloud or ERP system via Wi-Fi, LoRa®, NB-IoT, or LTE Cat-M1.
    • A vibration-based condition monitoring system to detect anomalies with Nicla Sense ME. Developed in collaboration with SensiML, this solution makes great use of Nicla’s self-learning AI smart sensor – with integrated accelerometer and gyroscope – to measure vibrations generated by a computer fan. With the intelligence of a trained ML model, the system monitors the fan’s conditions and can determine whether it is on or off, if there are any shocks, and even if the airflow is simply sub-optimal.
    • A solution to monitor vineyard pests, thanks to Nicla Vision and MKR WAN 1310. Smart farming leverages machine vision and valuable data on pest behavior, seasonality, and population size to optimize manual interventions against the dangerous Popillia japonica. Insects are attracted by pheromones inside the trap, where a low-power sensing solution leverages an ML model trained, tested and deployed with Edge Impulse to recognize and count insects, sending real-time data via LoRa® connectivity to the Cloud for remote monitoring.

    And don’t miss Massimo’s talk, “Everything you think you know about Arduino is WRONG” at 4PM (see the event agenda). It’s your chance to find out how the brand that made tech accessible for the first generation of makers is now evolving to support a new generation of innovators.

    We can’t wait to see you there!

  • Watch this mesmerizing machine roll a ball into eternity

    Reading Time: 2 minutes

    If you’ve ever found yourself in a trance while watching an episode of How It’s Made, then you understand the appeal of seeing a well-designed machine in action. But what if you didn’t have to turn on your TV or venture into a factory to get that buzz? That’s why JBV Creative built this real-life “oddly satisfying GIF” machine that rolls a ball into eternity.

    This project is a real-life representation of those simulation GIFs that show virtual machines moving balls in endless cycles. A simulation like that is easy to make when you don’t have to adhere to physics, but it takes some doing to accomplish the same thing in real life. Perpetual motion is, of course, impossible. So this machine utilizes motor-driven mechanisms to keep a ball moving around a vertical track. That track is clear laser-cut acrylic, while the rest of the custom parts are 3D-printed.

    An Arduino Nano board controls the machine and it has two jobs: monitoring a sensor and rotating a motor. There are two mechanisms: one to lift the ball to the top of the track and one to move a “bridge” from the top track to the bottom track. A single stepper motor spins gears to actuate both mechanisms, and the Arduino controls that motor through an H-bridge. An infrared sensor tells the Arduino when the lift mechanism arm reaches a specific angle, so the Arduino can vary the motor rotation speed to ensure smooth movement.
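
    As a rough illustration of that control loop (pins, speeds, and the sensor polarity are assumptions, and the real machine likely does more smoothing), the sketch boils down to stepping continuously and slowing near the critical arm angle:

    ```cpp
    // Hedged sketch of the lift-arm speed control (pins and thresholds are
    // assumptions). The stepper is driven through an H-bridge; an IR sensor
    // marks the arm angle where the motor should slow for a smooth handoff.
    #include <Stepper.h>

    const int STEPS_PER_REV = 200;
    Stepper motor(STEPS_PER_REV, 8, 9, 10, 11);  // via the H-bridge inputs

    const int IR_SENSOR_PIN = 2;  // HIGH when the lift arm passes the sensor

    void setup() {
      pinMode(IR_SENSOR_PIN, INPUT);
      motor.setSpeed(30);  // cruising speed, RPM
    }

    void loop() {
      if (digitalRead(IR_SENSOR_PIN) == HIGH) {
        // Arm is at the critical angle: slow down while the ball transfers.
        motor.setSpeed(10);
      } else {
        motor.setSpeed(30);
      }
      motor.step(1);  // keep the mechanism turning continuously
    }
    ```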

    [youtube https://www.youtube.com/watch?v=9-iBA7778u8]

    This is oddly satisfying to watch and the machine would be great on a desk where it could act as a soothing distraction.

  • Predicting soccer games with ML on the UNO R4 Minima

    Reading Time: 2 minutes

    Based on the Renesas RA4M1 microcontroller, the new Arduino UNO R4 boasts 16x the RAM, 8x the flash, and a much faster CPU compared to the previous UNO R3. This means that unlike its predecessor, the R4 is capable of running machine learning at the edge to perform inferencing of incoming data. With this fact in mind, Roni Bandini wanted to leverage his UNO R4 Minima by training a model to predict the likelihood of a FIFA team winning their match.

    Bandini began his project by first downloading a dataset containing historical FIFA matches, including the country, team, opposing team, ranking, and neutral location. Next, the data was added to Edge Impulse as a time-series dataset which feeds into a Keras classifier ML block and produces “win” and “lose/draw” values. Once trained, the model achieved an accuracy of 69% with a loss value of 0.58.

    To make a prediction, the user selects the desired country and rank on a DFRobot LCD shield; these values are then used to populate the model’s input tensor before it gets invoked and returns its classification results. Bandini’s device demonstrates how much more powerful the Arduino UNO R4 is than the R3, and additional information on the project can be found here in his post.
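
    Edge Impulse exports the trained model as an Arduino library whose run_classifier() call works from a plain feature buffer. A hedged fragment of that flow is shown below; the library header name, feature ordering, and placeholder values are assumptions tied to the exported project.

    ```cpp
    // Hedged inference fragment (the header name and feature ordering depend
    // on the exported Edge Impulse project; values are placeholders).
    #include <FIFA_prediction_inferencing.h>  // hypothetical exported library name

    float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

    int get_feature_data(size_t offset, size_t length, float *out_ptr) {
      memcpy(out_ptr, features + offset, length * sizeof(float));
      return 0;
    }

    void predict(float country, float opponent, float rank, float neutral) {
      // Populate the input tensor from the LCD shield selections.
      features[0] = country;
      features[1] = opponent;
      features[2] = rank;
      features[3] = neutral;

      signal_t signal;
      signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
      signal.get_data = &get_feature_data;

      ei_impulse_result_t result;
      if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
        for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
          Serial.print(result.classification[i].label);
          Serial.print(": ");
          Serial.println(result.classification[i].value);  // win vs. lose/draw
        }
      }
    }

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      predict(12, 34, 5, 1);  // placeholder encoded selections
      delay(5000);
    }
    ```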

    [youtube https://www.youtube.com/watch?v=dYTukgY9kEU]

  • UNO R4: The new dimension of making

    Reading Time: 3 minutes

    The revolutionary UNO R4, announced on Arduino Day, is now available on the Arduino Store!

    The fourth version of the iconic, beloved UNO adds a whole new dimension to the world of DIY and making that was revolutionized by the simple 8-bit microcontroller over a decade ago. Take your maker potential to new heights: UNO R4 has a speedy 32-bit Arm® Cortex®-M4 and offers a 16-fold increase in memory, as well as more connectors and connectivity options than ever – in two variants: UNO R4 Minima and UNO R4 WiFi.

    What’s in the new UNO?

    Both variants allow you to start making or easily upgrade UNO R3-based projects with more computational power, memory and speed than previous versions – courtesy of the RA4M1 microcontroller by Renesas – while maintaining the same form factor and 5 V operating voltage. 

    • More memory and faster clock: A 16x increase in memory and 3x the clock speed mean the UNO R4 can perform more precise calculations and handle more complex and sophisticated projects than ever.
    • New built-in peripherals: Among several exciting new peripherals, you get a 12-bit DAC, CAN BUS, OP AMP, and SWD port – each one expanding your maker potential, allowing you to easily tackle increasingly advanced projects.
    • A USB-C® Connector: Embracing the smaller, more powerful, and robust cable standard!
    • An HID device: This feature makes interactive projects a million times cooler, allowing you to create interfaces with minimal effort and in no time (see the example sketch after this list).
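
    The announcement doesn’t ship example code, but HID support means the standard Keyboard and Mouse libraries work out of the box. A minimal sketch of the idea, assuming a pushbutton on pin 2:

    ```cpp
    // Minimal HID sketch for the UNO R4 (a sketch of the idea, not from the
    // announcement): a button on pin 2 types a line on the host computer.
    #include <Keyboard.h>

    void setup() {
      pinMode(2, INPUT_PULLUP);
      Keyboard.begin();
    }

    void loop() {
      if (digitalRead(2) == LOW) {        // button pressed
        Keyboard.println("Hello from UNO R4!");
        delay(500);                       // crude debounce
      }
    }
    ```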

    You can get started on your next automation or audio project with the UNO R4 Minima – a top pick for first-time makers, with its combination of powerful technology and affordable price – or you can opt for the UNO R4 WiFi if you need more specific features.

    Connect to your inner creativity with the UNO R4 WiFi

    The UNO R4 WiFi variant is perfect for all users, from beginners to experts, wanting to explore the forefront of innovation. This full-fledged board features an ESP32-S3 coprocessor that handles Wi-Fi® and Bluetooth® Low Energy connectivity, so the RA4M1 microcontroller can focus on other tasks. On top of all the features in the Minima variant, it offers:

    • Wi-Fi® and Bluetooth® connectivity, plus Cloud compatibility: build IoT projects, create interactive dashboards, and control your project remotely.
    • A 12×8 LED matrix: imagine working on a creative project using animations or plotting sensor data, without any additional hardware! A minimal plotting sketch follows this list.
    • A Qwiic connector to facilitate quick prototyping: thanks to a wide variety of compatible modules that can be connected over I2C, you can easily create custom projects and expand the capabilities of your UNO R4 WiFi.
    • A built-in mechanism that detects operations that might cause a crash: if it catches errors such as a division by zero, the board stops them before they can cause trouble and provides a detailed explanation on the serial monitor.
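
    As a hedged illustration of the LED matrix bullet above, here is a small sketch that plots an analog reading as a scrolling bar graph, assuming the Arduino_LED_Matrix library bundled with the UNO R4 WiFi core and a sensor on A0:

    ```cpp
    // Hedged example of plotting a value on the R4 WiFi's built-in 12x8
    // matrix (the analog pin is an assumption).
    #include "Arduino_LED_Matrix.h"

    ArduinoLEDMatrix matrix;
    uint8_t frame[8][12];  // 8 rows x 12 columns, one byte per pixel (0 or 1)

    void setup() {
      matrix.begin();
    }

    void loop() {
      // Draw a bar graph of an analog reading, one column per loop pass.
      static int col = 0;
      int height = map(analogRead(A0), 0, 1023, 0, 8);
      for (int row = 0; row < 8; row++)
        frame[row][col] = (row >= 8 - height) ? 1 : 0;
      matrix.renderBitmap(frame, 8, 12);
      col = (col + 1) % 12;
      delay(200);
    }
    ```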

    The Arduino ecosystem has evolved, over the last two decades, thanks to the powerful idea of open source and the stellar community of 30+ million makers who keep sharing, building, and improving. Every new hardware product, every new software solution has added new worlds – ready to welcome everyone. Now, the UNO R4 takes the ecosystem and the community to a new dimension of performance. 

    – Massimo Banzi, Arduino co-founder

    So, if you’re just beginning your Arduino journey, the release of the UNO R4 makes it a great time to start!

    Already an UNO fan? The UNO R4 makes it easy to port existing shields and projects, taking them to a whole new level, and selected contributors are already helping us update UNO R3 libraries.

    A final big thank you goes to our community and early adopters, who have supported us these past few months by contributing their efforts to porting libraries.

    What are you waiting for?

  • Intelligently control an HVAC system using the Arduino Nicla Vision

    Reading Time: 2 minutes

    Shortly after someone sets the desired temperature of a room, a building’s HVAC system will engage and work to either raise or lower the ambient temperature to match. While this approach generally works well to control the local environment, it also leads to tremendous waste of energy since it is unable to easily adapt to changes in occupancy or activity. In contrast, Jallson Suryo’s smart HVAC project aims to tailor the amount of cooling to each zone individually by leveraging computer vision to track certain metrics.

    Suryo developed his proof of concept as a 1:50 scale model of a plausible office space, complete with four separate rooms and a plethora of human figurines. Employing Edge Impulse and a smartphone, he captured 79 images and drew bounding boxes around each person for use in a FOMO-based object detection model. After training, Suryo deployed the OpenMV firmware onto an Arduino Nicla Vision board and was able to view detections in real-time.

    The last step involved building an Arduino library containing the model and integrating it into a sketch that communicates with an Arduino Nano peripheral board over I2C by relaying the number of people per quadrant. Based on this data, the Nano dynamically adjusts one of four 5V DC fans to adjust the temperature while displaying relevant information on an OLED screen. To see how this POC works in more detail, you can visit Suryo’s write-up on the Edge Impulse docs page.
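
    The write-up has the full code, but the Nano side of that I2C link is simple enough to sketch. In this hedged version, the peripheral address, fan pins, and the four-byte message format (one people count per quadrant) are all assumptions.

    ```cpp
    // Hedged sketch of the Nano side (I2C address, fan pins, and the
    // one-byte-per-quadrant message format are assumptions).
    #include <Wire.h>

    const int FAN_PINS[4] = {3, 5, 6, 9};   // PWM-capable pins, one per zone
    volatile uint8_t peopleCount[4] = {0, 0, 0, 0};

    void receiveCounts(int numBytes) {
      // The Nicla Vision sends four bytes: people detected in each quadrant.
      for (int i = 0; i < 4 && Wire.available(); i++) {
        peopleCount[i] = Wire.read();
      }
    }

    void setup() {
      Wire.begin(0x08);                 // join the bus as a peripheral
      Wire.onReceive(receiveCounts);
      for (int i = 0; i < 4; i++) pinMode(FAN_PINS[i], OUTPUT);
    }

    void loop() {
      // More occupants -> more cooling; cap at four people per zone.
      for (int i = 0; i < 4; i++) {
        uint8_t c = peopleCount[i];
        if (c > 4) c = 4;
        analogWrite(FAN_PINS[i], map(c, 0, 4, 0, 255));
      }
      delay(500);
    }
    ```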

  • This converted RC car uses a Portenta H7 to drive itself

    Reading Time: 2 minutes

    Jeremy Ellis is a teacher, and as such, he wanted a longer-term project that his students could tackle to learn more about microcontrollers and computer vision/machine learning. What better way than a self-driving car? His idea was to take an off-the-shelf RC car that uses DC motors, add an Arduino Portenta H7 as the MCU, and train a model to recognize target objects that it should follow.

    After selecting the “RC Pro Shredder” as the platform, Ellis added a VNH5019 Motor Driver Carrier, a servo motor for steering, and a Portenta H7 + Vision Shield along with a 1.5” OLED module. After 3D printing a small custom frame to hold the components in the correct orientation, he collected nearly 300 images of double-ringed markers on the floor. These samples were then uploaded to Edge Impulse and labeled with bounding boxes before a FOMO-based object detection model was trained.

    Rather than create a sketch from scratch, Ellis used one the Portenta community had already developed, which grabs new images, performs inferencing, and then steers the car’s servo accordingly while optionally displaying the processed image on the OLED screen. With some minor testing and adjustments, Ellis and his class had built a total of four autonomous cars that could drive all on their own by following a series of markers on the ground.
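
    The heart of that sketch is mapping a detection’s position to a steering angle. A hedged fragment of that logic, with the image width, servo pin, and angle range as assumptions:

    ```cpp
    // Hedged fragment of the steering logic (field-of-view width, servo pin,
    // and angle range are assumptions; detections come from the FOMO model).
    #include <Servo.h>

    Servo steering;
    const int IMG_WIDTH = 96;  // assumed FOMO input width on the Vision Shield

    void setup() {
      steering.attach(7);
    }

    // Called with the centroid x of the highest-confidence marker detection.
    void steerToward(int markerX) {
      // Map the marker position across the image to a steering angle.
      int angle = map(markerX, 0, IMG_WIDTH, 60, 120);  // 90 = straight ahead
      steering.write(angle);
    }

    void loop() {
      // In the real sketch, run_classifier() supplies detections each frame;
      // here we just hold the wheel centered as a placeholder.
      steerToward(IMG_WIDTH / 2);
      delay(100);
    }
    ```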

    For more details on the project, check out Ellis’ Edge Impulse tutorial here.

  • Meet Arduino Pro at tinyML EMEA Innovation Forum 2023

    Reading Time: 3 minutes

    On June 26th-28th, the Arduino Pro team will be in Amsterdam for the tinyML EMEA Innovation Forum – one of the year’s major events in the world of tiny machine learning, where AI models meet agile, low-power devices.

    This is an exciting time for companies like Arduino and anyone interested in accelerating the adoption of tiny machine learning: technologies, products, and ideas are converging into a worldwide phenomenon with incredible potential – and countless applications already.

    At the summit, our team will present a selection of demos that leverage tinyML to create useful solutions in a variety of industries and contexts. For example, we will present:

    • A fan anomaly detection system based on the Nicla Sense ME. In this solution developed with SensiML, the Nicla module leverages its integrated accelerometer to constantly measure the vibrations generated by a computer fan. Thanks to a trained model, condition monitoring turns into anomaly detection – the system is able to determine whether the fan is on or off, notify users of any shocks, and even alert them if its super precise and efficient sensor detects sub-optimal airflow.
    • A vineyard pest monitoring system with the Nicla Vision and MKR WAN 1310. Machine vision works at the service of smart agriculture in this solution: even in the most remote field, a pheromone is used to attract insects inside a case lined with glue traps. The goal is not to capture all the insects, but to use a Nicla Vision module to take a snapshot of the captured bugs, recognize the ones that pose a real threat, and send updated data on how many specimens were found. New-generation farmers can thus schedule interventions against pests as soon as needed, before the insects get out of control and cause damage to the crops. Leveraging LoRa® connectivity, this application is both low-power and high-efficiency.
    • An energy monitoring-based anomaly detection solution for DC motors, with the Opta. This application developed with Edge Impulse leverages an Opta WiFi microPLC to easily implement industrial-level, real-time monitoring and fault detection – great to enable predictive maintenance, reducing downtime and overall costs. A Hall effect current sensor is attached in series with the supply line of the DC motor to acquire real-time data, which is then analyzed using ML algorithms to identify patterns and trends that might indicate faulty operation. The DC motor is expected to be in one of two statuses – ON or OFF – but different conditions can be simulated with the potentiometer. When unexpected electric consumption is shown, the Opta WiFi detects the anomaly and turns on a warning LED.

    The Arduino Pro team is looking forward to meeting customers and partners in Amsterdam – championing open source, accessibility, and flexibility in industrial-grade solutions at the tinyML EMEA Innovation Forum!

  • James Bruton builds a walking AT-AT robot

    Reading Time: 2 minutes

    Was there anything more exciting than watching AT-ATs walk across Hoth towards the Rebel base for the first time? Those massive machines were iconic and helped to solidify The Empire Strikes Back as the best movie set in the Star Wars universe. After experiencing disappointment with AT-AT toys that couldn’t walk, James Bruton built his own AT-AT robot that strolls with the best of them.

    While Bruton’s 3D-printed robot isn’t an exact replica of an AT-AT and doesn’t incorporate all of the design elements, it does walk a lot like what we all saw in the movie. Those AT-ATs had a very distinctive way of moving and the robot does a good job of mimicking their gait. It incorporates two key elements: forward knees and feet that remain level with the ground during a stride. Bruton’s robot has legs made up of parallel linkages in order to replicate that movement.

    Each leg requires three servo motors: two for the hip and one for the knee. That’s a total of 12 servos, which Bruton controlled with an Arduino Mega 2560 board. It receives commands from Bruton’s own universal DSM remote through a DSM radio receiver module. The robot lacks sensors and autonomy, so Bruton has to pilot it himself. After he solved some minor balance issues caused by the weighty head, the mini AT-AT was able to walk very well and it should excite every Star Wars fan.

    [youtube https://www.youtube.com/watch?v=AAgPmHINRJA]

  • This Star Wars-inspired display shows volumetric objects in 3D space

    Reading Time: 2 minutes

    A little over a year ago, Maker Mac70 set out to build an inexpensive DIY version of a volumetric display — a system that enables users to view objects in 3D without the need for special glasses or extremely fancy components. Unlike the holograms of the Star Wars films, this system relies on a projector that can refresh hundreds of times per second, a moving projection screen (a swept volume) to show the image, and some way to synchronize everything, as the ultimate goal was a 3D persistence-of-vision display.

    After modifying a standard DLP projector to output faster, monochrome images by moving the internal color wheel, Mac70 introduced an LCD shutter that either allows or blocks light in less than 1ms and can be used to precisely select the subframes being projected. Because a solenoid or motor would be too slow to move the swept volume back and forth, a pair of old speakers was repurposed; combined with an Arduino Mega 2560, they were able to move in sync with the projector once calibrated.

    The last steps involved creating a slicer utility for transforming 3D models into 2D images, using a LattePanda single-board computer to drive the projector, and making an elegant Star Wars-themed case to house all of the components. For more details about this years-long project, you can read Mac70’s excellent write-up here on Hackster.io or watch the videos below!

    [youtube https://www.youtube.com/watch?v=VTe0p5BMEBM]
    [youtube https://www.youtube.com/watch?v=9dWt7pVx9i8]

  • Build your own human horse race arcade game using Arduino and Strava

    Reading Time: 2 minutes

    Every arcade seems to have one particular game that appeals to the gamblers among us: the horse race game. That game has several model horses, and players can bet on the outcome of their race. The results are either random or controlled by the players through some sort of input mechanism. In a new video, element14’s Lorraine Underwood explains how to recreate that game with an Arduino — but powered by people.

    The game concept is similar to the horse race arcade game, except players run to influence the success of their respective racers. Those racers are little 2D people figures instead of horses. Everyone can run on their own schedule. Then, once everyone completes their run, the game shows the results by moving the figures across the track. This links to Strava, which is an app for tracking runs using popular devices like smartphones and smartwatches. Each player just has to complete a run as part of the group that Underwood set up for this purpose.

    Because the game needs to pull run information from Strava’s API over the internet, Underwood chose to use an Arduino UNO WiFi Rev2 board. It controls the movement of each figure using a micro DC gearmotor with a belt. Those motors came from DFRobot and include encoders, which are helpful for ensuring the movement is exact. The game supports four players, so Underwood added a DFRobot quad DC motor shield to the Arduino to drive the motors.

    The Arduino sketch connects to the local WiFi, then pulls the distance covered by each runner from the Strava API. It then scales that down to the distance of the game track and spins the motors until each figure reaches the corresponding distance.
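
    The scaling step is simple proportional math. Here is a hedged fragment of how it might look, with the race length, encoder counts, and motor pin as assumptions and the Strava API call itself omitted:

    ```cpp
    // Hedged fragment of the distance scaling (track length, motor pin, and
    // encoder counts are assumptions; the Strava API request is omitted).
    const float GOAL_METERS = 5000.0;    // length of the virtual race
    const long TRACK_TICKS = 1200;       // encoder ticks from start to finish
    const int ENCODER_PIN = 2;

    volatile long encoderTicks = 0;

    void onTick() { encoderTicks++; }

    // Convert a runner's distance into a position on the physical track.
    long targetTicks(float runMeters) {
      if (runMeters > GOAL_METERS) runMeters = GOAL_METERS;
      return (long)(runMeters / GOAL_METERS * TRACK_TICKS);
    }

    void moveRacer(int motorPin, float runMeters) {
      long target = targetTicks(runMeters);
      while (encoderTicks < target) {
        analogWrite(motorPin, 180);      // run the gearmotor forward
      }
      analogWrite(motorPin, 0);          // stop at the scaled position
    }

    void setup() {
      pinMode(ENCODER_PIN, INPUT_PULLUP);
      attachInterrupt(digitalPinToInterrupt(ENCODER_PIN), onTick, RISING);
      moveRacer(9, 3200.0);              // e.g. a 3.2 km run in a 5 km race
    }

    void loop() {}
    ```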

    [youtube https://www.youtube.com/watch?v=ToxQFa7qT1I]

  • Join us at Sensors Converge in Santa Clara

    Reading Time: 2 minutes

    We’ll be in California at the Santa Clara Convention Center June 20th-22nd for Sensors Converge, where the building blocks of IoT converge. As North America’s largest electronics event for design engineers, Sensors Converge brings together all the industrial components for innovative end-to-end commercial solution design. Arduino will be there to give our take on open technology for industrial-grade applications at the speed of innovation.

    At Arduino’s booth #1346, our experts will be on hand to showcase the Pro line of products with a few notable demonstrations. Our own Guneet Bedi will deliver a keynote on the innovation that open environments offer businesses, in addition to two other talks on agtech and the CHIPS Act. We look forward to sharing our business solutions for digital transformation with Arduino’s ease of use and open-source philosophy. We invite you to stop by the booth during the show or join us for one of our networking receptions.

    The full schedule of Arduino events at Sensors Converge (all times PT):

    Tuesday, June 20th

    • VIP conference reception sponsored by Arduino, 4-6pm, Hyatt Regency Rooftop.

    Wednesday, June 21st

    • Keynote address “Open Source as the Future of Innovation: An Unconventional and Powerful Approach for Commercial Success,” with featured speaker Guneet Bedi (SVP and GM of the Americas), 2:40-3pm, Main Stage.
    • Agriculture track session “How an Open Ecosystem Promotes Smart Agriculture, Connected Irrigation, and Sustainable Natural Resource Use,” technical session from Allan Gagnon (Solutions Architect), 4:10-4:30pm, Grand Ballroom F.
    • Expo floor reception sponsored by Arduino, 4:30-6pm, Expo Floor.

    Thursday, June 22nd

    • “Opportunities, Challenges and Collaboration: Government Policy Impact on the Semiconductor and Tech Industries,” panel featuring Guneet Bedi (SVP and GM of the Americas), 11:30am-12:10pm, Main Stage.

    Arduino will feature its Arduino Pro ecosystem, designed for industrial and commercial environments. With a robust end-to-end ecosystem of system modules, intelligent sensors, ready-to-deploy open PLCs and cloud services, Pro ensures optimal performance and longevity, making it an ideal choice for businesses seeking dependable solutions. The company will be showcasing its Portenta, Nicla, and Opta products along with several demonstrations.

    To learn more at Sensors Converge about how Arduino Pro is helping companies transform through rapid IoT deployment, schedule a visit to our booth.

  • This Arduino reads punch cards for cloud computing

    Reading Time: 2 minutes

    You already know that computers store and interpret data in binary: ones and zeroes. There are many, many ways to store binary data, because it works with anything that can maintain at least two states. In the early days of computing, punch cards were common. They were paper cards with a grid of points. A hole at any of those points could represent either a one or zero (depending on the system), while the lack of a hole would represent the opposite. To bring old and new together, Nino Ivanov built an Arduino punch card reader for cloud computing.

    Cloud computing turns processing into an internet service, letting customers run whatever calculations or programs they like without needing to maintain their own hardware. In a typical scenario, the customer would simply upload their code or data from a computer to the cloud service for processing. But that data can come from anything, including punch cards. Ivanov’s project uses punch cards (which he seems to have cut by hand) that are read by an Arduino, transferred to a computer, and then entered into the cloud computing terminal interface.

    Each punch card is just a piece of grid paper with some squares cut out. Each row contains six bits, which an Arduino Uno reads using a cobbled-together reader made with LEDs and photoresistors. If there is a square cut out, the light will pass through and the corresponding photoresistor will detect the light. A small servo motor feeds the punch card through the reader. In this case, Ivanov encoded a Common Lisp program to process factorials in the cloud. But the same general concept could apply to any language or data — though most are much less efficient than Lisp and would require a far greater number of punch cards.
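
    Reading one six-bit row then reduces to thresholding six analog inputs. A hedged sketch of that reader, with pin choices, the light threshold, and the continuous-rotation feed servo as assumptions:

    ```cpp
    // Hedged sketch of reading one six-bit row (pins and the light threshold
    // are assumptions; the servo advances the card between rows).
    #include <Servo.h>

    const int BITS = 6;
    const int LDR_PINS[BITS] = {A0, A1, A2, A3, A4, A5};
    const int THRESHOLD = 500;   // above this, light passes through a hole
    Servo feed;

    void setup() {
      Serial.begin(9600);
      feed.attach(9);
    }

    byte readRow() {
      byte value = 0;
      for (int i = 0; i < BITS; i++) {
        // A hole lets LED light reach the photoresistor -> logical 1.
        if (analogRead(LDR_PINS[i]) > THRESHOLD) {
          value |= (1 << i);
        }
      }
      return value;
    }

    void loop() {
      byte row = readRow();
      Serial.write(row);        // forward the six bits to the computer
      feed.write(95);           // nudge the card forward to the next row
      delay(250);
      feed.write(90);           // stop (continuous-rotation servo assumed)
      delay(250);
    }
    ```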

    [youtube https://www.youtube.com/watch?v=Oke-7lsvFug]

  • Recreating the famous Miller-Urey experiment with Arduino

    Reading Time: 2 minutes

    Evolution is fact — at least as much as anything is a “fact” in the scientific sense. What is still very much in question is abiogenesis, which is the origin of life from inorganic compounds. Evolution is the mechanism that took unicellular life to our current biodiversity, but scientists don’t know exactly how the first single-cell organisms came to be. Most reputable theories posit that the conditions on a young Earth allowed organic compounds, like amino acids, to form and then develop into organisms. In 1952, the Miller-Urey experiment confirmed that such a thing is possible. To recreate that famous experiment, M. Bindhammer turned to Arduino.

    The conditions that, theoretically, resulted in the first amino acids were an atmosphere of compounds like methane, water, ammonia, and hydrogen along with electricity in the form of lightning. Stanley Miller, under the supervision of Harold Urey, recreated those conditions in the lab with a closed system that heated the gases, passed them through a chamber with electrodes to produce sparks, then down through a cooler and back to the start. That produced many amino acids that could (again, in theory) come together to form the building blocks of life. M. Bindhammer’s reproduction of this experiment works in a similar way, but with a plasma arc instead of sparks.

    M. Bindhammer needed a way to control the plasma arc and chose an Arduino Uno Rev3 board for the job. The plasma arc oxidizes the nitrogen in the air, so it is crucial to the experiment. Generating the plasma arc required a lot of voltage (45,000V), which M. Bindhammer controlled through a MOSFET connected to the Arduino. That let them modulate power to the electrodes in order to avoid overheating them.
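
    In code, that protection can be as simple as gating the MOSFET with a limited duty cycle and enforced cooling pauses. A hedged sketch, with the pin, duty, and timings as assumptions (the MOSFET switches the low-voltage driver side, never the 45,000V output directly):

    ```cpp
    // Hedged sketch of the duty-cycle modulation (pin, duty, and timings are
    // assumptions; the MOSFET gates the high-voltage driver's supply).
    const int GATE_PIN = 9;       // PWM output to the MOSFET gate
    const int BURST_MS = 2000;    // arc on-time per cycle
    const int REST_MS = 3000;     // cooling pause for the electrodes

    void setup() {
      pinMode(GATE_PIN, OUTPUT);
    }

    void loop() {
      analogWrite(GATE_PIN, 128); // ~50% duty to limit average electrode power
      delay(BURST_MS);
      analogWrite(GATE_PIN, 0);   // arc off, let the electrodes cool
      delay(REST_MS);
    }
    ```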

    This seems to have been a success, and M. Bindhammer reports that the initial experiment produced the hypothesized results.

    [youtube https://www.youtube.com/watch?v=suCMnIyDBho]

  • Creating intelligent IoT devices with ChatGPT and Arduino Cloud: A journey into natural language interaction

    Reading Time: 5 minutes

    Artificial intelligence (AI) and natural language processing (NLP) are changing the way we interact with technology. With advancements in machine learning and data processing, we now have AI-powered virtual assistants, chatbots, and voice recognition systems that can understand and respond to our queries in a natural, human-like way. One such technology is ChatGPT, a large language model developed by OpenAI based on the GPT-3.5 architecture. ChatGPT has the ability to generate coherent, context-aware responses to a wide range of questions, making it an ideal tool for communication.

    Integrating ChatGPT and Arduino Cloud for IoT projects

    Integrating ChatGPT and the Arduino Cloud, a platform that allows you to develop, deploy and manage IoT devices in the easiest way, opens up a brand new world of possibilities for IoT applications. By combining ChatGPT’s natural language processing capabilities with the Arduino Cloud’s IoT platform, we can create intelligent devices that can understand and respond to natural language queries, making the user experience more seamless and intuitive. For example, imagine a smart home system that can be controlled using voice commands, or a chatbot that can provide instant technical support for IoT devices.

    Chat with ChatGPT through Arduino IoT Cloud dashboards

    This project is a simple demonstration of an Arduino IoT Cloud-compatible device, such as an Arduino Nano RP2040 Connect or any ESP32/ESP8266 device, acting as a middleware between the IoT Cloud and OpenAI’s GPT-3.5 language model. The device acts as a bridge by receiving prompts (questions) from the IoT Cloud and forwarding them to the OpenAI API. Once the model processes the prompts, the device receives and parses the replies and sends them back to the IoT Cloud, which displays the response to the user.

    To embark on this project, you will need to create an OpenAI account, create an API key, and have enough credits. Then, you can create your device on the IoT Cloud, program it, and set up the dashboard on the IoT Cloud. The dashboard serves as a user interface, allowing you to write questions (prompts) and receive ChatGPT’s replies.

    Check out the project on Arduino’s Project Hub and get more information about how to build the system yourself.

    As you get into the project, you can explore variable tweaking, defining the maximum number of tokens that ChatGPT will use in generating a response, and keeping in mind the limits on OpenAI API usage. Overall, this project presents a unique opportunity to integrate the cutting-edge capabilities of OpenAI’s language model with the versatile Arduino IoT Cloud, enabling you to create more intelligent and intuitive IoT applications.

    Connect to ChatGPT using MicroPython

    If you are interested in an alternative approach to connecting to ChatGPT, you can do so by using a MicroPython script. If you are familiar with making HTTP requests using Python, this is a great approach.

    To authenticate and successfully make requests with ChatGPT, you will need to first get your API key from OpenAI and construct a POST request. We will use the urequests and ujson modules to simply ask ChatGPT a question and get a response.

    The response is printed on a 128×64 OLED display, and that’s pretty much it. It is a minimal example, but a fun one, and easy to get started with.
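
    For reference, the request itself is only a few lines of MicroPython. This is a hedged sketch (the model name, token limit, and OLED wiring are assumptions, and network setup is omitted):

    ```python
    # Hedged MicroPython sketch (Wi-Fi setup and OLED wiring are omitted;
    # the model name and token limit are assumptions).
    import urequests
    import ujson

    API_KEY = "YOUR_OPENAI_API_KEY"

    def ask_chatgpt(prompt):
        headers = {
            "Authorization": "Bearer " + API_KEY,
            "Content-Type": "application/json",
        }
        body = ujson.dumps({
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 100,
        })
        response = urequests.post(
            "https://api.openai.com/v1/chat/completions",
            headers=headers, data=body)
        reply = response.json()["choices"][0]["message"]["content"]
        response.close()
        return reply

    print(ask_chatgpt("What is an Arduino?"))
    # The reply would then be drawn on the 128x64 OLED, e.g. with an ssd1306 driver.
    ```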

    To get started with MicroPython and ChatGPT, visit this repository which has the code and instructions to get started. 

    This type of integration paves the way for many cool projects. You can, for example, ask ChatGPT to evaluate recently recorded data, or build a companion bot that knows everything the Internet knows.

    Introducing the Arduino Cloud

    The Arduino Cloud is a platform that simplifies the process of developing, deploying, and managing IoT devices. It supports various hardware, including Arduino boards, ESP32, and ESP8266 based boards, and makes it easy for makers, IoT enthusiasts, and professionals to build connected projects without coding expertise. What makes Arduino Cloud stand out is its intuitive interface that abstracts complex tasks, making it accessible to all users. With its low-code approach and extensive collection of examples and templates, Arduino Cloud offers a simple way for users to get started. 

    The platform’s IoT Cloud tool allows for easy management and monitoring of connected devices through customizable dashboards, which provide real-time visualisations of the device’s data. Furthermore, the IoT Cloud can be accessed remotely through the mobile app Arduino IoT Cloud Remote, which is available for both Android and iOS devices, enabling users to manage their devices from anywhere.

    Build your own

    The integration of ChatGPT and Arduino Cloud has opened up a new world of opportunities for IoT applications. These projects are just some examples of how these technologies can be used to create intelligent devices that can understand and respond to natural language queries. 

    If you have been inspired by these projects and want to share your own creation with the community, we encourage you to publish your project on Arduino Project Hub. By doing so, you can showcase your project and share your knowledge with others. Arduino Project Hub is a platform where users can share their Arduino-based projects and find inspiration for new ones. With a global community of makers and enthusiasts, the hub is the perfect place to collaborate, learn and explore the endless possibilities of IoT. So, whether you are a seasoned maker or just starting, we invite you to join our community and share your project with the world!

    Ready to start?

    Ready to unleash the potential of IoT devices and ChatGPT integration? Visit the Arduino IoT Cloud website to access official documentation and resources for the Arduino IoT Cloud. Create an account and start building your own projects today!

  • Launch your model rockets over LoRaWAN with this MKR WAN 1310-powered system

    Reading Time: 2 minutes

    Designing, constructing, and launching your own model rockets is a great hobby for learning more about the world of aerodynamics, computer simulations, and physics as a whole. But when it comes to actually lighting the solid rocket fuel to achieve ignition, the user normally lights a fuse directly or lays out a reel of wire to electrically ignite the propellant, neither of which is ideal.

    Milos Rasic of element14 Presents, in contrast, had the idea to create a remotely operated launching system that allows the user to simply flick a switch and press a button to achieve lift-off without the need for kilometers of wire. His ignition circuit relies on an Arduino MKR WAN 1310 to receive commands over LoRaWAN™; the board, in turn, charges a pair of supercapacitors via a series of MOSFETs, relays, and op-amps until they each reach about 8V. Once everything had been tested on a breadboard, Rasic soldered his components onto perfboard and arranged them inside a custom weatherproof case.

    On the controller side, Rasic grabbed another MKR WAN 1310 and connected a 16×2 LCD display, a rotary encoder for making selections, and an array of switches and buttons for selecting when the system is charging, armed, and igniting the rocket. Better yet, the model rocket was also a custom design along with the launchpad.
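
    On the radio side, the MKRWAN library handles the LoRaWAN™ link. A hedged sketch of the controller’s send path is below; the credentials and command strings are placeholders, and in a full LoRaWAN deployment the command would reach the pad-side board as a downlink routed through the network server.

    ```cpp
    // Hedged controller-side fragment using the MKRWAN library (credentials
    // and command strings are placeholders, not from the project).
    #include <MKRWAN.h>

    LoRaModem modem;

    String appEui = "0000000000000000";                  // placeholder
    String appKey = "00000000000000000000000000000000";  // placeholder

    const int LAUNCH_BUTTON = 2;

    void setup() {
      pinMode(LAUNCH_BUTTON, INPUT_PULLUP);
      modem.begin(EU868);              // or US915, depending on region
      modem.joinOTAA(appEui, appKey);  // join the network before sending
    }

    void loop() {
      if (digitalRead(LAUNCH_BUTTON) == LOW) {
        modem.beginPacket();
        modem.print("FIRE");           // routed to the pad as a downlink command
        modem.endPacket(true);         // confirmed uplink
        delay(2000);
      }
    }
    ```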

    To see this system in action, check out Rasic’s video below!

    [youtube https://www.youtube.com/watch?v=lU-NAV25VCo]
