Tag: arduino

  • Rebuilding a Passap E6000 knitting machine with Arduino and Raspberry Pi

    Reading Time: 2 minutes

    Arduino Team, July 8th, 2020

    Irene Wolf is the owner of a Passap E6000, a computerized knitting machine that features a pair of needle beds, and she decided it was time to give it an upgrade. In particular, she wanted the ability to control its rear needle bed automatically, in a similar manner to the way the front is normally operated, for extra functionality.

    To accomplish this non-trivial task, she’s using a Raspberry Pi 3 as the new controller, along with two Arduino M0 boards to directly handle the machine’s actions via interrupts.

    She’s also driving the device’s motor with a frequency converter and an Uno, as the original control board was broken. Plenty of details are available in Wolf’s write-up and on GitHub, and you can even see it in action (plus a resulting knitted sock) in the video below.

    [youtube https://www.youtube.com/watch?v=wtQ6Al-69Xc?feature=oembed&w=500&h=375]

    Website: LINK

  • Robotic cornhole board guarantees three points every time

    Reading Time: < 1 minute

    Arduino Team, July 8th, 2020

    You may have seen Mark Rober’s automated dartboard or Stuff Made Here’s backboard, which use advanced engineering to create apparatuses that ensure you “can’t miss.” Now that summer is in full swing, what about a robotic cornhole board?

    Michael Rechtin decided to take on this challenge using a webcam pointed at the sky for sensing and DC motors that move the board along an X/Y plane on a set of sliding drawer rails.

    When a bean bag is thrown, the camera feeds video to a laptop running a Processing sketch, which analyzes the bag’s trajectory and passes adjustment info to an Arduino. The Arduino then drives the motors to reposition the board, attempting to predict where the bag will land and guide it into the hole for three points!

    [youtube https://www.youtube.com/watch?v=FkhxhMJtkHA?feature=oembed&w=500&h=281]

    Website: LINK

  • Create a digital clock out of 24 analog clocks

    Reading Time: < 1 minute

    Arduino Team, July 7th, 2020

    What if you were to use the hands of a clock not as an individual display, but as part of an array that moves together to form digits? That’s the idea behind Clockception by “Made by Morgan,” which utilizes 48 servo motors to drive 24 clock-like faces in an 8×3 grid.

    The build uses an Arduino Nano and three servo driver boards for control, along with a DS1302 RTC module to track time. The overall clock is constructed out of stained poplar, while the dial assemblies are 3D-printed.

    Clockception was actually inspired by the ClockClock project by Humans Since 1982, but by using his own design and DIY methods, he was able to get the cost down to around $200.

    [youtube https://www.youtube.com/watch?v=PdpLYvw7i-E?feature=oembed&w=500&h=281]

    Website: LINK

  • Change the volume of any app on your PC with the turn of a knob

    Reading Time: 2 minutes

    Arduino Team, July 7th, 2020

    Overall computer volume control is important, but what if you want to get more granular, adjusting sound from various applications individually? Rather than going through a series of menus and on-screen sliders, Ruben Henares’ Maxmix lets you do this on the fly.

    Based on an Arduino Nano, the simple yet stylish knob takes input from an encoder and button to cycle through and select a program. Just push down and then rotate to turn the volume up or down. Want to switch from Discord to Spotify? Click it again and repeat the process.

    A small OLED screen on the Maxmix shows which app is running, and there’s even an optional LED ring for extra lighting effects. All the electronics are housed inside a nicely designed 3D-printed enclosure.

    You can find the build instructions on Henares’ site and see a demo of it below.

    [youtube https://www.youtube.com/watch?v=SEnBzFcOdMI?feature=oembed&w=500&h=281]

    Website: LINK

  • Build a comment-critiquing keyboard adapter using TensorFlow Lite and Arduino

    Reading Time: 2 minutes

    Arduino Team, July 7th, 2020

    If you’ve ever left an online comment that you later regretted, this anti-troll bot will keep that from happening again by letting you know when you’re being a bit too harsh.

    The device — which was created by Andy of element14 Presents — intercepts raw keyboard inputs using a MKR Zero board and analyzes them using a TensorFlow Lite machine learning algorithm.

    As an output, the Arduino controls the mouth of a rather hilarious human cutout via a servo motor, which as seen in the video below, also features a wisp of black hair and oversized googly eyes. If you’re typing happy thoughts, its mouth turns up into a smile, while mean words produce a frowny face.

    [youtube https://www.youtube.com/watch?v=uxG065trbQQ?feature=oembed&w=500&h=281]

    The project is a great example of running ML code on limited hardware, and more info on the sentiment-analyzing keyboard adapter can be found here.

    Website: LINK

  • Capture cinematic shots with this motorized slider and pan-tilt camera mount

    Reading Time: 2 minutes

    Arduino Team, July 6th, 2020

    DIY camera sliders are a great way to get professional-looking video shots on an amateur budget, but few can compare to the quality of this project by “isaac879.”

    His device features a pan/tilt mechanism outlined in a previous video, but in the clip below he’s attaching it to a piece of aluminum extrusion to enable it to slide as well.

    The build is controlled by an Arduino Nano, which actuates three stepper motors using A4988 drivers. The carriage is pulled along by a belt drive, via a stepper mounted to the carriage itself. This allows for easy disassembly when needed.

    It’s a clever and extremely clean design, and the video shows some great examples of the shots it can take (even when upside down).

    [youtube https://www.youtube.com/watch?v=v1b7Wvu87-U?feature=oembed&w=500&h=281]

    Website: LINK

  • The three pillars of the Arduino CLI

    Reading Time: 5 minutes

    The Arduino CLI is an open source command line application written in Golang that can be used from a terminal to compile, verify and upload sketches to Arduino boards, and that is capable of managing all the software and tools needed in the process. But don’t be fooled by its name: the Arduino CLI can do much more than the average console application, as shown by the Pro IDE and Arduino Create, which rely on it for similar purposes, yet each in a completely different way.

    In this article, we introduce the three pillars of the Arduino CLI, explaining how we designed the software so that it can be effectively leveraged under different scenarios.

    Console applications for humans

    As you might expect, the first way to use the Arduino CLI is from a terminal and by a human, and user experience plays a key role here. The UX is under a continuous improvement process as we want the tool to be powerful without being too complicated. We heavily rely on sub-commands to provide a rich set of different operations logically grouped together, so that users can easily explore the interface while getting very specific contextual help.

    Console applications for robots

    Humans are not the only type of customer we want to support: the Arduino CLI was also designed to be used programmatically — think about automation pipelines or a CI/CD system.

    There are some niceties to observe when you write software that’s supposed to run unattended, and one in particular is the ability to run without a configuration file. This is possible because every configuration option you find in the arduino-cli.yaml configuration file can be provided either through a command line flag or by setting an environment variable. To give an example, the following commands are all equivalent and will proceed to fetch the unstable package index that can be used to work with experimental versions of cores:
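
    (A sketch of two such equivalent invocations, assuming the --additional-urls flag and its ARDUINO_BOARD_MANAGER_ADDITIONAL_URLS environment-variable counterpart; the index URL below is a placeholder.)

    arduino-cli core update-index --additional-urls https://example.com/package_unstable_index.json
    ARDUINO_BOARD_MANAGER_ADDITIONAL_URLS=https://example.com/package_unstable_index.json arduino-cli core update-index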

    See the documentation for details about Arduino CLI’s configuration system.

    Consistent with the previous paragraph, when it comes to providing output the Arduino CLI aims to be user friendly, which also means being slightly verbose, something that doesn’t play well with robots. This is why we added an option to produce output that’s easy to parse. For example, asking for the software version in JSON format looks like this:
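
    (The sample output below is illustrative rather than an exact transcript of the tool’s response.)

    arduino-cli version --format json
    {"Application": "arduino-cli", "VersionString": "0.11.0", "Commit": "...", "Date": "..."}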

    Even if not related to software design, one last feature that’s worth mentioning is the availability of a one-line installation script that can be used to make the latest version of the Arduino CLI available on most systems with an HTTP client like curl or wget and a shell like bash.
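
    (At the time of writing, the documented one-liner looks like this; check the installation page for the current form.)

    curl -fsSL https://raw.githubusercontent.com/arduino/arduino-cli/master/install.sh | sh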

    gRPC is a high-performance RPC framework that can efficiently connect client and server applications. The Arduino CLI can act as a gRPC server (we call it daemon mode), exposing a set of procedures that implement the very same set of features as the command line interface and waiting for clients to connect and use them. To give an idea, a client can be as small as a short Golang program that connects to a running Arduino CLI server instance and retrieves its version number.

    gRPC is language-agnostic: even if such a client were written in Golang, it could just as easily be written in Python, JavaScript or any of the many supported languages, leading to a variety of possible scenarios. The new Arduino Pro IDE is a good example of how to leverage the daemon mode of the Arduino CLI with a clean separation of concerns: the Pro IDE knows nothing about how to download a core, compile a sketch or talk to an Arduino board, and it delegates all of these features to an Arduino CLI instance. Conversely, the Arduino CLI doesn’t even know that the client that’s connected is the Pro IDE, and neither does it care.

    The Arduino CLI is written in Golang and the code is organized in a way that makes it easy to use it as a library by including the modules you need in another Golang application at compile time. Both the first and second pillars rely on a common Golang API, a set of functions that abstract all the functionalities offered by the Arduino CLI, so that when we provide a fix or a new feature, they are automatically available to both the command line and gRPC interfaces. 

    The source modules implementing this API can be imported into other Golang programs to embed a full-fledged Arduino CLI. For example, this is how some backend services powering Arduino Create compile sketches and manage libraries: they import the API and call it directly, for instance to search for a core.

    Embedding the Arduino CLI is limited to Golang applications and requires a deep knowledge of its internals. For the average use case, the gRPC interface might be a better alternative; nevertheless this remains a valid option that we use and provide support for.

    You can start playing with the Arduino CLI right away. The code is open source and we provide extensive documentation. The repo contains example code showing how to implement a gRPC client, and if you’re curious about how we designed the low-level API, have a look at the commands package and don’t hesitate to leave feedback on the issue tracker if you’ve got a use case that doesn’t fit one of the three pillars.

    Website: LINK

  • Old becomes new again with this glowing clock

    Reading Time: 2 minutes

    Arduino Team, July 3rd, 2020

    Whether for work, play, or the various video/voice socials that have been set up lately, “chebe” spends most of the day at their desk, so much so that in some instances they lose track of time. To address the issue, this maker dug out a vintage Arduino Duemilanove circa 2010 to create a unique new clock.

    The build consists of other parts from chebe’s electronics stash as well, including a SparkFun RTC module and a 7-segment, 4-digit display to show the hours and minutes. A LoL Shield that was soldered up nine years ago was also incorporated, producing an extra glowing effect through the unit’s translucent cover.

    Impressively, the only component obtained specifically for the project was a Proto Shield for attaching things together.

    Website: LINK

  • This puck-slapping robot will beat you in table hockey

    Reading Time: 2 minutes

    Arduino Team, July 3rd, 2020

    Mechanical table hockey games, where players are moved back and forth and swing their sticks with a series of knobs, can be a lot of fun; however, could one be automated? As Andrew Khorkin’s robotic build demonstrates, the answer is a definite yes — using an Arduino Mega and a dozen stepper motors to score goals on a human opponent.

    The project utilizes an overhead webcam to track the position of the players and puck on the rink, with a computer used for object detection and gameplay. Each player is moved with two steppers, one of which pushes the control rod in and out, while the other twists the player to take shots.

    Training the game took six months of work, which really shows in the impressive gameplay seen below.

    [youtube https://www.youtube.com/watch?v=ryq2LKFTg3Q?feature=oembed&w=500&h=281]

    Website: LINK

  • Modified printer simplifies home PCB fabrication

    Reading Time: < 1 minute

    Arduino Team, July 2nd, 2020

    In many locations you can get PCBs made fast, cheap, and of very good quality. In Brazil, where Vítor Barbosa lives, this isn’t the case, so he built a “haxmark460” PCB printer to help manufacture circuits at home.

    The build modifies a Lexmark E460dn laser printer to mark PCBs directly, using an aluminum carrier plate instead of its normal paper feed operation.

    An Arduino is utilized to hijack and output printer signals, enabling it to work in a much different way than it was originally designed to. The carrier plate, with blank PCB material taped on, is fed into the front, and the PCB is pulled through and properly marked by the printer. After a dip in acetone to allow the toner to stick to the copper, the board is ready to etch!

    [youtube https://www.youtube.com/watch?v=Cp2N5aJ5IuY?feature=oembed&w=500&h=281]

    Website: LINK

  • Arduino Security Primer

    Reading Time: 5 minutes

    SSL/TLS stack and HW secure element

    At Arduino, we are hard at work to keep improving the security of our hardware and software products, and we would like to run you through how our IoT Cloud service works.

    The Arduino IoT Cloud‘s security is based on three key elements:

    • The open-source library ArduinoBearSSL for implementing TLS protocol on Arduino boards;
    • A hardware secure element (Microchip ATECCX08A) to guarantee authenticity and confidentiality during communication;
    • A device certificate provisioning process to allow client authentication during MQTT sessions.

    ArduinoBearSSL

    In the past, it has been challenging to create a complete SSL/TLS library implementation on embedded (constrained) devices with very limited resources. 

    An Arduino MKR WiFi 1010, for instance, only has 32KB of RAM while the standard SSL/TLS protocol implementations were designed for more powerful devices with ~256MB of RAM.

    As of today, a lot of embedded devices still do not properly implement the full SSL/TLS stack and fail to achieve good security because they misuse or strip functionality from the library. For example, we found that a lot of off-brand boards use code that does not actually validate the server’s certificate, making them an easy target for server impersonation and man-in-the-middle attacks.

    Security is paramount to us, and we do not want to make compromises in this regard when it comes to our offering in both hardware and software. We are therefore always looking at “safe by default” settings and implementations. 

    Particularly in the IoT era, operating without specific security measures in place puts customers and their data at risk.

    This is why we wanted to make sure the security standards adopted nowadays in high-performance settings are ported to microcontrollers (MCUs) and embedded devices.

    Back in 2017, while looking at different SSL/TLS libraries supporting TLS 1.2 and modern cryptography (something that could work with very little RAM/ROM footprint, have no OS dependency, and be compatible with the embedded C world), we decided to give BearSSL a try.

    BearSSL: What is it?

    BearSSL provides an implementation of the SSL/TLS protocol (RFC 5246) written in C and developed by Thomas Pornin.

    Optimized for constrained devices, BearSSL aims at small code footprint and low RAM usage. As per its guiding rules, it tries to find a reasonable trade-off between several partly conflicting goals:

    • Security: defaults should be robust and using patently insecure algorithms or protocols should be made difficult in the API, or simply not possible;
    • Interoperability with existing SSL/TLS servers; 
    • Allowing lightweight algorithms for CPU-challenged platforms; 
    • Be extensible with strong and efficient implementations on big systems where code footprint is less important.

    BearSSL and Arduino

    Our development team picked BearSSL as an excellent starting point and set out to make it fit our Arduino boards, focusing on both security and performance.

    The firmware development team worked hard on porting BearSSL to Arduino, bundling it up as a very nice open-source library: ArduinoBearSSL.

    Because the computational effort of performing a crypto algorithm is high, we decided to offload part of this task to hardware, using a secure element (we often call it a “crypto chip”). Its advantages are:

    • It makes cryptography operations faster;
    • It frees your device’s RAM from these demanding tasks;
    • It stores private keys securely (more on this later);
    • It provides a true random number generator (TRNG).

    How does the TLS protocol work?

    TLS uses both asymmetric and symmetric encryption. Asymmetric encryption is used during the TLS handshake between the client and the server to exchange the shared session key for communication encryption. The algorithms commonly used in this phase are based on Rivest-Shamir-Adleman (RSA) or Diffie-Hellman algorithms. 

    TLS 1.2 Handshake flow

    After the TLS handshake, the client and the server both have a session key for symmetric encryption (e.g. algorithms AES 128 or AES 256).

    The TLS protocol is an important part of our IoT Cloud security model because it guarantees an encrypted communication between the IoT devices and our servers.
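
    As a minimal sketch of what this looks like from user code on a board like the MKR WiFi 1010 (the host name and Wi-Fi credentials are placeholders, and the calls shown follow the ArduinoBearSSL and WiFiNINA examples):

    #include <WiFiNINA.h>
    #include <ArduinoECCX08.h>
    #include <ArduinoBearSSL.h>

    WiFiClient tcpClient;                // plain TCP transport
    BearSSLClient sslClient(tcpClient);  // TLS layer provided by ArduinoBearSSL

    unsigned long getTime() {
      return WiFi.getTime();             // certificate validation needs the current time
    }

    void setup() {
      while (WiFi.begin("my-ssid", "my-pass") != WL_CONNECTED) {  // placeholder credentials
        delay(1000);
      }

      ArduinoBearSSL.onGetTime(getTime); // hand a time source to the TLS stack

      if (sslClient.connect("example.arduino.cc", 443)) {
        // the link is now encrypted and the server certificate has been verified
      }
    }

    void loop() {}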

    The secure element

    In order to save memory and improve security, our development team has chosen to introduce a hardware secure element to offload part of the cryptography algorithms computational load, as well as to generate, store, and manage certificates. For this reason, on the Arduino MKR family, Arduino Nano 33 IoT and Arduino Uno WiFi Rev2, you will find the secure element ATECC508A or ATECC608A manufactured by Microchip.

    How do we use the secure element?

    A secure element is an advanced hardware component able to perform cryptographic functions. We decided to implement it on our boards to guarantee two fundamental security properties in IoT communication:

    • Authenticity: You can trust who you are communicating with;
    • Confidentiality: You can be sure the communication is private.

    Moreover, the secure element is used during the provisioning process to configure the Arduino board for Arduino IoT Cloud. In order to connect to the Arduino IoT Cloud MQTT broker, our boards don’t use standard credential authentication (a username/password pair). Instead, we opted to implement a higher-level authentication known as client certificate authentication.

    How does the Arduino provisioning work?

    The whole process is possible thanks to an API, which exposes an endpoint a client can interact with.

    First, the client requests to register a new device on Arduino IoT Cloud via the API, and the server (API) returns a UUID (Universally Unique IDentifier). At this point, the user can upload the sketch Provisioning.ino to the target board. This code is responsible for multiple tasks:

    • Generating a private key using the ATECCX08A and storing it in a secure slot that can only be read by the secure element;
    • Generating a CSR (Certificate Signing Request), using the device UUID as the Common Name (CN) and signing it with the generated private key;
    • Storing the certificate signed by Arduino, which acts as the authority.

    After the CSR generation, the user sends it via the API to the server and the server returns a certificate signed by Arduino. This certificate is stored, in a compressed format, in a slot of the secure element (usually in slot 10) and it is used to authenticate the device to the Arduino IoT Cloud.
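
    To make the key and CSR steps above more concrete, here is a rough sketch based on the ArduinoECCX08 library (the slot number and Common Name are illustrative, and the real Provisioning.ino adds locking, storage and error handling omitted here):

    #include <ArduinoECCX08.h>
    #include <utility/ECCX08CSR.h>

    void setup() {
      Serial.begin(9600);
      while (!Serial);

      if (!ECCX08.begin()) {
        Serial.println("No ECCX08 secure element present!");
        while (1);
      }

      // Generate a new private key in slot 0 and start building a CSR signed with it
      ECCX08CSR.begin(0, true);
      ECCX08CSR.setCommonName("<device UUID goes here>");  // illustrative Common Name
      String csr = ECCX08CSR.end();

      Serial.println(csr);  // this PEM text is what gets sent to the Arduino API for signing
    }

    void loop() {}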

    Such a human-unfriendly process is hidden from our users thanks to the work our design team did to build a user-friendly, plug-and-play “Getting Started” process in the browser, to help configure the IoT devices and Arduino IoT Cloud. Our users simply connect their Arduino boards and follow the steps.

    In addition, Arduino offers two-factor authentication across all web services, so users can add an additional security layer to their accounts and IoT devices connected to Arduino IoT Cloud.

    Website: LINK

  • This robo-dog sprays poison ivy with weed killer

    Reading Time: < 1 minute

    Arduino Team, July 1st, 2020

    Poisonous plants, like poison ivy, can really ruin your day. In an effort to combat this “green menace,” YouTuber Sciencish decided to create his own quadruped robot.

    The robotic dog is equipped with two servos per leg, for a total of eight, which enable it to move its shoulders and elbows back and forth.

    An Arduino Uno controller determines leg positions via trigonometric calculation, and when in position, it dispenses weed killer via a relay and aquarium pump setup. The reservoir can also be used to hold other liquids, whether for watering duties or even to provide extra fuel to a fire.

    [youtube https://www.youtube.com/watch?v=gm-EslOemfE?feature=oembed&w=500&h=281]

    Website: LINK

  • The Arduino CLI just got some new exciting features

    Reading Time: 3 minutes

    The arduino-cli tool just got some new exciting features with the release of 0.11.0:

    • Command-line completion
    • External programmer support
    • Internationalization and localization support (i18n)

    Command-line completion

    Finally, the autocompletion feature has landed!

    With this functionality, the program automatically fills in partially typed commands by pressing the tab key. For example, with this update, you can type arduino-cli bo:

    And, after pressing the <TAB> key, the command will auto-magically become arduino-cli board.

    There are a few steps to follow in order to make it work seamlessly. We have to generate the required file — to do so, we have added a new command named “completion.” 

    To generate the completion file, you can use:
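
    (Assuming Bash as the target shell; other supported shells work the same way.)

    arduino-cli completion bash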

    By default, this command prints the content of the completion file to the standard output (the shell window). To save it to an actual file, use the ">" redirect symbol. Now you can move the file to the required location (which depends on the shell you are using). Remember to open a new shell! Finally, you can press <TAB><TAB> to get the suggested commands and flags.

    In a future release, we will also be adding completion for core names, libraries, and boards.

    Example with Bash (from the documentation)

    To generate the completion file, use:
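
    (The file name arduino-cli.sh is just the convention used in this example.)

    arduino-cli completion bash > arduino-cli.sh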

    At this point, you can move that file into /etc/bash_completion.d/ (root access is required) with:
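
    (Continuing with the example file name from above.)

    sudo mv arduino-cli.sh /etc/bash_completion.d/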

    An alternative, though not recommended, is to source the completion file in your .bashrc.

    Remember to open a new shell to test the functionality.

    External programmer

    Another brand new feature is support for external programmers!

    Now you can specify the external programmer to use when uploading code to a board. For example, you can use arduino-cli upload … --programmer programmer-id for that. You can list the supported programmers with arduino-cli upload --fqbn arduino:avr:uno --programmer list.

    And if you’re using the external programmer to burn a bootloader, you can do that from arduino-cli as well: arduino-cli burn-bootloader --fqbn …

    Internationalization and localization support

    Now the Arduino CLI messages can be translated to your native language thanks to i18n support! We are currently setting up the infrastructure; however, if you would like to help us with the translation, we will provide you more details in another blog post soon!

    That’s all folks!

    That’s it, we’ve worked hard to add these new features. Check them out by downloading 0.11.0 here. Do you like them? What are your thoughts on the arduino-cli? Are you using it for your projects? Let us know in the comments!

    Website: LINK

  • A tabletop bowling game with automated scoring

    Reading Time: < 1 minute

    Arduino Team, June 24th, 2020

    Do you love to bowl? Are you still unable to do so due to the pandemic? Then this project by high school engineering teacher “lainealison” is right up your alley!

    This rig features a 5½-foot (1.68-meter) lane made of MDF, along with ball bearings used to strike miniature pins. Each pin sits on top of an LDR sensor which, with the help of 10 LEDs shining from overhead, detects whether the pin is still standing or has been knocked over.

    An Arduino Uno uses pin presence information to output game stats, automatically displaying the frame, ball and score on an I2C LCD screen.

    You can find more on the project, including code and construction details, in lainealison’s write-up.

    Website: LINK

  • Machine vision with low-cost camera modules

    Reading Time: 6 minutes

    If you’re interested in embedded machine learning (TinyML) on the Arduino Nano 33 BLE Sense, you’ll have found a ton of on-board sensors — digital microphone, accelerometer, gyro, magnetometer, light, proximity, temperature, humidity and color — but realized that for vision you need to attach an external camera.

    In this article, we will show you how to get image data from a low-cost VGA camera module. We’ll be using the Arduino_OV767X library to make the software side of things simpler.

    Hardware setup

    To get started, you will need an Arduino Nano 33 BLE Sense (with headers), an OV7670 camera module, and a set of jumper wires.

    You can of course get a board without headers and solder instead, if that’s your preference.

    The one downside to this setup is that (in module form) there are a lot of jumpers to connect. It’s not hard but you need to take care to connect the right cables at either end. You can use tape to secure the wires once things are done, lest one comes loose.

    You need to connect each camera pin to the matching pin on the Nano 33 BLE Sense, following the wiring listed in the Arduino_OV767X library and its example sketches.

    Software setup

    First, install the Arduino IDE or register for Arduino Create tools. Once you install and open your environment, the camera library is available in the library manager.

    • Install the Arduino IDE or register for Arduino Create
    • Tools > Manage Libraries and search for the OV767X library
    • Press the Install button

    Now, we will use the example sketch to test the cables are connected correctly:

    • Examples > Arduino_OV767X > CameraCaptureRawBytes
    • Uncomment (remove the //) from line 48 to display a test pattern
    Camera.testPattern();
    • Compile and upload to your board

    Your Arduino is now outputting raw image binary over serial. To view this as an image, we’ve included a special Processing application that visualizes the image output from the camera.

    Processing is a simple programming environment that was created by graduate students at MIT Media Lab to make it easier to develop visually oriented applications with an emphasis on animation and providing users with instant feedback through interaction.

    • Install and open Processing 
    • Paste the CameraVisualizerRawBytes code into a Processing sketch
    • Edit lines 31-37 to match the machine and serial port your Arduino is connected to
    • Hit the play button in Processing and you should see a test pattern (the image update takes a couple of seconds)

    If all goes well, you should see the striped test pattern!

    Next we will go back to the Arduino IDE and edit the sketch so the Arduino sends a live image from the camera to the Processing viewer:

    • Return to the Arduino IDE
    • Comment out line 48 of the Arduino sketch
    // We've disabled the test pattern and will display a live image
    // Camera.testPattern();
    • Compile and upload to the board
    • Once the sketch is uploaded hit the play button in Processing again
    • After a few seconds you should now have a live image.

    Considerations for TinyML

    The full VGA (640×480) output from our little camera is way too big for current TinyML applications. uTensor runs handwriting detection on MNIST, which uses 28×28 images. The person detection example in TensorFlow Lite for Microcontrollers uses 96×96, which is more than enough. Even state-of-the-art ‘Big ML’ applications often only use 320×320 images (see the TinyML book). Also consider that an 8-bit grayscale VGA image occupies 300KB uncompressed while the Nano 33 BLE Sense has 256KB of RAM. We have to do something to reduce the image size!

    Camera format options

    The OV7670 module supports lower resolutions through configuration options. The options modify the image data before it reaches the Arduino. The configurations currently available via the library are:

    • VGA – 640 x 480
    • CIF – 352 x 240
    • QVGA – 320 x 240
    • QCIF – 176 x 144

    This is a good start as it reduces the amount of time it takes to send an image from the camera to the Arduino. It reduces the size of the image data array required in your Arduino sketch as well. You select the resolution by changing the value in Camera.begin. Don’t forget to change the size of your array too.

    Camera.begin(QVGA, RGB565, 1)
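
    For example, a QVGA frame in RGB565 works out to 320 x 240 x 2 bytes, so a matching buffer (the variable name is just an example) would be:

    byte data[320 * 240 * 2];  // 153,600 bytes: one QVGA frame at 2 bytes per pixel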

    The camera library also offers different color formats: YUV422, RGB444 and RGB565. These define how the color values are encoded and all occupy 2 bytes per pixel in our image data. We’re using the RGB565 format which has 5 bits for red, 6 bits for green, and 5 bits for blue:

    Converting the 2-byte RGB565 pixel to individual red, green, and blue values in your sketch can be accomplished as follows:

    // Convert from RGB565 to 24-bit RGB
    uint16_t pixel = (high << 8) | low;

    int red   = ((pixel >> 11) & 0x1f) << 3;
    int green = ((pixel >> 5)  & 0x3f) << 2;
    int blue  = ((pixel >> 0)  & 0x1f) << 3;

    Resizing the image on the Arduino

    Once we get our image data onto the Arduino, we can then reduce the size of the image further. Just removing pixels will give us a jagged (aliased) image. To do this more smoothly, we need a downsampling algorithm that can interpolate pixel values and use them to create a smaller image.

    The techniques used to resample images are an interesting topic in themselves. We found that this downsampling example from Eloquent Arduino works fine with the Arduino_OV767X camera library output.
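
    As a simple illustration of the idea (a generic sketch, not the Eloquent Arduino implementation), a 2x2 box average halves each dimension of an 8-bit grayscale image:

    // Downsample an 8-bit grayscale image by averaging each 2x2 block into one output pixel.
    // src is srcW x srcH; dst must be (srcW / 2) x (srcH / 2).
    void downsample2x2(const byte* src, byte* dst, int srcW, int srcH) {
      int dstW = srcW / 2;
      int dstH = srcH / 2;
      for (int y = 0; y < dstH; y++) {
        for (int x = 0; x < dstW; x++) {
          int sum = src[(2 * y) * srcW + (2 * x)]
                  + src[(2 * y) * srcW + (2 * x + 1)]
                  + src[(2 * y + 1) * srcW + (2 * x)]
                  + src[(2 * y + 1) * srcW + (2 * x + 1)];
          dst[y * dstW + x] = sum / 4;  // average of the four source pixels
        }
      }
    }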

    Applications like the TensorFlow Lite Micro Person Detection example that use CNN based models on Arduino for machine vision may not need any further preprocessing of the image — other than averaging the RGB values in order to remove color for 8-bit grayscale data per pixel.

    However, if you do want to perform normalization, iterating across pixels using the Arduino max and min functions is a convenient way to obtain the upper and lower bounds of input pixel values. You can then use map to scale the output pixel values to a 0-255 range.

    byte pixelOut = map(input[y][x][c], lower, upper, 0, 255); 
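
    A minimal sketch of that normalization pass, assuming the image is stored as input[HEIGHT][WIDTH][CHANNELS] of bytes as in the line above (the dimension constants and the output array are assumptions made for the example):

    const int HEIGHT = 96, WIDTH = 96, CHANNELS = 1;  // assumed dimensions, person-detection style
    byte input[HEIGHT][WIDTH][CHANNELS];
    byte output[HEIGHT][WIDTH][CHANNELS];

    void normalize() {
      byte lower = 255;
      byte upper = 0;

      // First pass: find the darkest and brightest values actually present in the image
      for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
          for (int c = 0; c < CHANNELS; c++) {
            lower = min(lower, input[y][x][c]);
            upper = max(upper, input[y][x][c]);
          }
        }
      }

      if (upper == lower) {
        lower = 0;
        upper = 255;  // flat image: fall back to an identity mapping
      }

      // Second pass: stretch every pixel to the full 0-255 range
      for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
          for (int c = 0; c < CHANNELS; c++) {
            output[y][x][c] = map(input[y][x][c], lower, upper, 0, 255);
          }
        }
      }
    }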

    Conclusion

    This was an introduction to how to connect an OV7670 camera module to the Arduino Nano 33 BLE Sense and some considerations for obtaining data from the camera for TinyML applications. There’s a lot more to explore on the topic of machine vision on Arduino — this is just a start!

    Website: LINK

  • Capture the flag with a GPS, RFID and LoRa twist

    Reading Time: 2 minutes

    Arduino Team, June 23rd, 2020

    Capture the flag can be fun, but Karel Bousson has put a new spin on the game that allows you to compete against neighbors over who can keep a single item — a modified tool case — in their possession the longest.

    The box contains an Arduino Mega that interfaces with an RFID reader to enable the current owner to scan in, plus a GPS module for location data. Additionally, an LDR sensor can be incorporated to set the brightness of an LED matrix on the outside.

    Data is passed along to a Raspberry Pi via LoRa and The Things Network for time-of-possession tracking. The Pi also runs a server that shows game info to the other players, meaning that you’ll have to be very careful to keep the box around!

    Code for the project is available on GitHub.

    Website: LINK

  • Raspberry Pi High Quality Camera powers up homemade microscope

    Reading Time: 3 minutes

    Wow, DIY-Maxwell, wow. This reddit user got their hands on one of our new Raspberry Pi High Quality Cameras and decided to upgrade their homemade microscope with it. The brains of the thing are also provided by a Raspberry Pi.

    Key features

    • Raspberry Pi OS
    • 8 MegaPixel CMOS camera (Full HD 30 fps video)
    • Imaging features from several centimetres to several micrometers without changing the lens
    • 6 stepper motors (X, Y, tilt, rotation, magnification, focus)
    • Variable speed control using a joystick controller or keyboard
    • Uniform illumination for imaging reflective surfaces
    • Modular design: stages and modules can be arranged in any configuration depending on the application

    The original reddit post shows what a penny looks like under this powerful microscope.

    Check out this video from the original reddit post to see the microscope in action.

    Bill of materials

    The user has put together very detailed, image-led build instructions walking you through how to create the linear actuators, camera setup, rotary stage, illumination, tilt mechanism, and electronics.

    The project uses a program written in Python 3 (MicroscoPy.py) to control the microscope, modify camera settings, and take photos and videos, all via keyboard input.

    The original post also includes a quick visual showing the exact ports you need for this project on whatever Raspberry Pi you have.

    In the comments of the original reddit post, DIY_Maxwell explains that the $10 objective lens used in the project limited the Raspberry Pi High Quality Camera’s performance. They predict you can expect even better images with a heavier investment in the lens.

    The project is the result of a team at IBM Research Europe in Zurich, which develops microfluidic technologies for medical applications and needed to provide high-quality photos and videos of its microfluidic chips.

    [youtube https://www.youtube.com/watch?v=PBSYnk9T4o4]

    In a blog for IEEE Spectrum, IBM team member Yuksel Temiz explains: “Taking a photo of a microfluidic chip is not easy. The chips are typically too big to fit into the field of view of a standard microscope, but they have fine features that cannot be resolved using a regular camera. Uniform illumination is also critical because the chips are often made of highly reflective or transparent materials. Looking at publications from other research groups, it’s obvious that this is a common challenge. With this motivation, I devoted some of my free time to designing a multipurpose and compact lab instrument that can take macro photos from almost any angle.”

    Here’s the full story about how the Raspberry Pi-powered creation came to be.

    And for some extra-credit homework, you can check out this document comparing the performance of the microscope using our Raspberry Pi Camera Module v2 and the High Quality Camera. The key takeaway for those wishing to upgrade their old projects with the newer camera is to remember that it’s heavier and 50% bigger, so you’ll need to tweak your housing to fit it in.

    Website: LINK

  • This Arduino-controlled robot leaves messages in the sand

    Reading Time: < 1 minute

    Arduino Team, June 22nd, 2020

    Ivan Miranda has come up with a novel method for drawing messages in the sand, using a tread assembly that prints as it travels along the beach.

    The robot uses a length of square tubing to connect a pair of half tanks, with 50 SG90 micro servos spaced out along the bottom. As it pulls itself along, a total of three Arduino Mega boards control the servos, intermittently extending them into the sand. This creates lines that combine to form individual letters.

    You can see the build process in the video below, including his initial trial at around the 11:00 mark. This is actually Miranda’s second attempt at a “beach drawer,” and his first version, which uses a much different technique, can be seen here.

    [youtube https://www.youtube.com/watch?v=i6IIcoQ3-C0?feature=oembed&w=500&h=281]

    Website: LINK

  • Two-factor authentication on Arduino

    Reading Time: 2 minutes

    Arduino Team, June 22nd, 2020

    Today, we’re announcing a new security feature for our community: two-factor authentication (2FA) on Arduino web services. We have implemented a two-step verification login to arduino.cc, so our users can be sure of their online safety.

    If enabled, two-factor authentication offers an additional security layer to the user’s account, so the user can have better protection of their IoT devices connected to Arduino IoT Cloud. We encourage our users to enable 2FA to improve their online safety.

    How to enable two-factor authentication

    Arduino supports two-factor authentication via authenticator apps such as Authy or Google Authenticator. To enable 2FA on your account:

    1. Go to id.arduino.cc and click on Activate in the Security frame of your account:

    2. Scan the QR code using your own authenticator app (e.g. Authy, Google Authenticator, Microsoft Authenticator, etc.)

    3. Your authenticator app will now show a six-digit code that changes every 30 seconds: copy it into the text field and click Verify.

    4. Important: Save your Recovery code in a safe place and do not lose it. If you lose your 2FA codes (e.g. you misplace or break your phone), you can still restore your account using the recovery code. If you lose both 2FA and recovery codes, you will no longer be able to access your account.

    5. Great! Now you have the Two-Factor Authentication enabled on your Arduino account.

    Website: LINK

  • Building an Arduino-based bipedal bot

    Reading Time: < 1 minute

    Arduino Team, June 21st, 2020

    If you’d like to build a walking biped robot, this 3D-printed design by Technovation looks like a fantastic place to start. Each leg features three servos that actuate it at the hip, knee, and ankle for a total of six degrees of freedom.

    Control is handled by an Arduino Uno board that rides on top of the legs, along with a perfboard to connect to the servos directly.

    Movements are calculated via inverse kinematics, meaning one simply has to input the x and z positions, and the Arduino calculates the proper servo angles. The bot is even able to take steps between two and 10 centimeters without falling over.

    [youtube https://www.youtube.com/watch?v=CxociTjzR4Q?feature=oembed&w=500&h=281]

    Website: LINK

  • Bike signal display keeps riders safe with machine learning

    Reading Time: 2 minutes

    Arduino Team, June 21st, 2020

    Cycling can be fun, not to mention great exercise, but is also dangerous at times. In order to facilitate safety and harmony between road users on his hour-plus bike commute in Marseille, France, Maltek created his own LED backpack signaling setup.

    The device uses a hand-mounted Arduino Nano 33 BLE Sense to record movement via its onboard IMU and runs a TinyML gesture recognition model to translate this into actual road signals. Left and right rotations of the wrist are passed along to the backpack unit over BLE, which shows the corresponding turn signal on its LED panel.

    Other gestures include a backward twist for stop and a forward twist to say “merci,” while a green forward-scrolling arrow is displayed as the default state.

    More details on the project can be found in Maltek’s write-up here.

    [youtube https://www.youtube.com/watch?v=da8K2eS4XyU?feature=oembed&w=500&h=281]

    [youtube https://www.youtube.com/watch?v=w5kqfRDzFDU?feature=oembed&w=500&h=281]

    Website: LINK

  • This ephemeral display shows messages using floating bubbles

    Reading Time: < 1 minute

    Arduino Team, June 19th, 2020

    While electronics and water don’t generally mix, researchers at Ochanomizu University in Japan have come up with an ephemeral display method that uses floating clusters of bubbles to show messages on a liquid surface.

    The device, known as UTAKATA, utilizes a line of seven electrodes under Arduino Uno control that activate to form hydrogen bubbles via electrolysis. When arranged properly, these bubbles can be made to produce letters and words, which as shown in the video below, dissipate as they flow downstream in the container.

    UTAKATA follows previous work in which a static configuration of bubbles was used as the output. The new flowing-water display gives a much better refresh rate, along with an interesting visual effect.

    More details are available in the researchers’ paper.

    [youtube https://www.youtube.com/watch?v=2q6C9iIaIuY?feature=oembed&w=500&h=281]

    Website: LINK