Tag: Gesture Control

  • Using The Force to open a door

    Reading Time: 2 minutes

    Arduino Team, May 28th, 2022

    In the Star Wars universe, the Jedi and Sith use The Force for battle and mayhem. But in the real world, people would use The Force for much more mundane everyday tasks. Obi-Wan even does this in the prequel trilogy when he closes a door using The Force. Star Wars fanatic Nick O’Hara leveraged an Arduino to replicate that trick.

    Despite his wishes, O’Hara lacks the midi-chlorians to actually wield The Force. But he has technology, and that is almost as good. Automatic doors are already a thing, but they open for anyone who walks up, whether or not they possess The Force. O’Hara wanted his door to open only when he waves his hand. To accomplish that, he needed two things: some way to recognize the hand-wave gesture and a method for opening the door.

    The door in question is quite heavy, but O’Hara had a beefy motor with a gearbox to increase the torque. That pulls the door open by reeling in a wire via a pulley. An Arduino Uno controls the motor through a driver board. It receives the command to open the door from a Raspberry Pi running gesture recognition software. The Raspberry Pi looks in front of the door through a webcam and sends the open command to the Arduino when it sees O’Hara wave his hand. It isn’t quite The Force, but it is as close as one can get in our universe.
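
    The build’s code isn’t reproduced in the post, but the Arduino side of the handoff can be sketched as follows: the Uno waits for a single command byte from the Raspberry Pi over serial, then drives the motor through the driver board for a fixed time. The pin numbers, the 'O' command byte, and the timing below are illustrative assumptions, not O’Hara’s actual firmware.

      // Minimal sketch of the Uno side (assumed wiring and protocol):
      // the Raspberry Pi sends 'O' over serial when it sees the hand wave,
      // and the Uno reels in the wire for a fixed time to pull the door open.
      const int MOTOR_PWM_PIN = 9;             // PWM input on the motor driver
      const int MOTOR_DIR_PIN = 8;             // direction input on the motor driver
      const unsigned long OPEN_TIME_MS = 3000; // how long to reel in the wire

      void setup() {
        Serial.begin(9600);                    // UART link to the Raspberry Pi
        pinMode(MOTOR_PWM_PIN, OUTPUT);
        pinMode(MOTOR_DIR_PIN, OUTPUT);
      }

      void loop() {
        if (Serial.available() > 0 && Serial.read() == 'O') { // 'O' = open
          digitalWrite(MOTOR_DIR_PIN, HIGH);   // set the winding direction
          analogWrite(MOTOR_PWM_PIN, 255);     // full power through the gearbox
          delay(OPEN_TIME_MS);
          analogWrite(MOTOR_PWM_PIN, 0);       // stop once the door is open
        }
      }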

    [youtube https://www.youtube.com/watch?v=BHhkDDzGz8A?start=302&feature=oembed&w=500&h=281]

  • This MP3 player is controlled with a twirl of your finger and wave of your hand

    Reading Time: 2 minutes

    Arduino Team, November 9th, 2021

    The classic MP3 player was a truly innovative device for its time; however, with the advent of modern smartphones and other do-it-all gadgets, it has largely fallen by the wayside. To give the format a new twist, Norbert Zare decided to implement an MP3 player that not only responds to user inputs by moving the volume knob and tilting some musical notes to signal the next track, but can also be controlled simply by waving a finger in front of it.

    Gesture control was achieved using the PAJ7620U2 sensor, which can quickly detect movements within a 3D space and output its findings over the I2C bus to a host microcontroller. Zare set up his Arduino Uno’s program to continually check for a new gesture and perform the corresponding action based on the one being read. For example, making a clockwise circle with a single finger will increase the volume, turn the servo attached to the volume knob, and change the text on the attached LCD to match. Other functions include skipping tracks and resuming/pausing.

    When the Uno picks up a motion, it also sends a signal to the button pad of an attached, partially disassembled MP3 player, which controls the actual music being played as well as its volume. You can read more about this project here on Hackaday.io and check out Zare’s video below.
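
    Zare’s exact firmware lives on the Hackaday.io page, but the control loop can be sketched roughly as below. The readGesture() helper is a hypothetical stand-in for whichever PAJ7620U2 driver is used, and the gesture codes, pin assignments, and button wiring (a transistor across the player’s “next” contact) are assumptions for illustration.

      #include <Servo.h>
      #include <LiquidCrystal.h>

      // Hypothetical gesture codes; a real PAJ7620U2 library defines its own.
      enum { GES_NONE = -1, GES_CLOCKWISE, GES_ANTICLOCKWISE, GES_RIGHT, GES_LEFT };

      Servo knobServo;                        // servo that physically turns the volume knob
      LiquidCrystal lcd(12, 11, 5, 4, 3, 2);  // assumed LCD wiring
      const int NEXT_BTN_PIN = 7;             // drives a transistor across the "next" button
      int volume = 50;                        // 0..100, mirrored on the knob and LCD

      int readGesture() {
        // Placeholder: swap in the real PAJ7620U2 driver call here.
        return GES_NONE;
      }

      void pulseButton(int pin) {             // emulate a press on the player's button pad
        digitalWrite(pin, HIGH);
        delay(100);
        digitalWrite(pin, LOW);
      }

      void setup() {
        knobServo.attach(9);
        lcd.begin(16, 2);
        pinMode(NEXT_BTN_PIN, OUTPUT);
      }

      void loop() {
        switch (readGesture()) {
          case GES_CLOCKWISE:                 // clockwise finger circle: volume up
            volume = min(volume + 5, 100);
            knobServo.write(map(volume, 0, 100, 0, 180));
            lcd.clear();
            lcd.print("Volume: ");
            lcd.print(volume);
            break;
          case GES_RIGHT:                     // swipe right: skip to the next track
            pulseButton(NEXT_BTN_PIN);
            lcd.clear();
            lcd.print("Next track");
            break;
          default:
            break;                            // no gesture detected this pass
        }
        delay(50);
      }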

  • Meet Grumpy Hedgehog, an adorable gesture-sensing companion

    Reading Time: 2 minutes

    Arduino Team, November 3rd, 2021

    Detecting shapes and gestures has traditionally been the job of camera systems, thanks to their large arrays of pixels. However, Jean Peradel has come up with a method that uses cheap time-of-flight (ToF) sensors to sense both objects and movement over time. Better yet, his entire project is housed within a 3D-printed “Grumpy Hedgehog” that contains not only the sensors, but a highly interactive 1.44” LCD screen as well.

    Peradel’s smart home companion can pick up several different kinds of movements and patterns to perform a wide variety of actions, such as sending keystrokes to a PC, controlling a light, or actuating a servo motor. This is accomplished with VL53L1X ToF modules, which have a 16×16 scanning array and communicate over the I2C bus. Once the attached Arduino MKR WiFi 1010 has read this data, it can determine whether the object (which appears closer on the grid) has moved up, down, left, or right.
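
    Peradel’s write-up has the full details, but the direction-sensing idea can be sketched like this: sweep the sensor’s field of view as a coarse grid, find the zone where the object appears closest, and compare that zone’s position between frames. The readZoneDistance() helper is hypothetical (the real region-of-interest sweep is library-specific), and the 4×4 grid and thresholds are assumptions, not Peradel’s code.

      // Conceptual sketch: infer swipe direction from a time-of-flight grid.
      const int GRID = 4;                     // coarse 4x4 sweep for speed (assumption)
      int prevX = -1, prevY = -1;             // closest zone in the previous frame

      int readZoneDistance(int x, int y) {
        // Placeholder: point the VL53L1X's region of interest at zone (x, y)
        // and return the measured distance in millimeters.
        return 0;
      }

      void setup() {
        Serial.begin(9600);
      }

      void loop() {
        // Find the zone where the object appears closest.
        int bestX = 0, bestY = 0, bestDist = 32767;
        for (int y = 0; y < GRID; y++) {
          for (int x = 0; x < GRID; x++) {
            int d = readZoneDistance(x, y);
            if (d > 0 && d < bestDist) { bestDist = d; bestX = x; bestY = y; }
          }
        }
        // Compare with the previous frame to infer the movement direction.
        if (prevX >= 0 && bestDist < 400) {   // only react within ~40 cm
          if      (bestX > prevX) Serial.println("right");
          else if (bestX < prevX) Serial.println("left");
          else if (bestY > prevY) Serial.println("down");
          else if (bestY < prevY) Serial.println("up");
        }
        prevX = bestX;
        prevY = bestY;
        delay(100);
      }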

    To make the project a bit more friendly, Peradel designed a small enclosure/stand that houses the VL53L1X near the base. Near the top is a small LCD that shows animated hedgehog faces, the “sensor’s view” of the object, and the associated action being taken.

    You can read more about the Grumpy Hedgehog gesture sensor here on Hackaday.io.

  • Customizable artificial intelligence and gesture recognition

    Reading Time: 2 minutes

    Arduino Team, April 15th, 2021

    In many respects, we think of artificial intelligence as being all-encompassing: one AI will do any task we ask of it. But in reality, even when AI reaches the advanced levels we envision, it won’t automatically be able to do everything. The Fraunhofer Institute for Microelectronic Circuits and Systems has been giving this a lot of thought.

    AI gesture training

    Okay, so you’ve got an AI. Now you need it to learn the tasks you want it to perform. Even today this isn’t an uncommon exercise. But the challenge that Fraunhofer IMS set itself was training an AI without any additional computers.

    As a test case, an Arduino Nano 33 BLE Sense was employed to build a demonstration device. Using only the onboard 9-axis motion sensor, the team built an untethered gesture recognition controller. When a button is pressed, the user draws a number in the air, and corresponding commands are wirelessly sent to peripherals, in this case a robotic arm.

    [youtube https://www.youtube.com/watch?v=ES_Aw7Hq_OA?feature=oembed&w=500&h=281]

    Embedded intelligence

    At first glance this might not seem overly advanced. But consider that it’s running entirely on the device, with just a small amount of memory and an Arduino Nano. Fraunhofer IMS calls this “embedded intelligence,” as it’s not the robot arm that’s clever, but the controller itself.

    This is achieved by training the device with a “feature extraction” algorithm. When a gesture is executed, the artificial neural network (ANN) picks out only the relevant information. This allows for impressive data reduction and a very efficient, compact AI.
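
    Fraunhofer IMS’s actual pipeline isn’t published in this post, so as a rough illustration of feature extraction as data reduction, the sketch below condenses a 100-sample accelerometer window (300 raw floats) into six statistical features that a tiny on-device ANN could classify. It reads the Nano 33 BLE Sense’s onboard IMU via the Arduino_LSM9DS1 library; the window size and feature choice are assumptions.

      #include <Arduino_LSM9DS1.h>            // onboard IMU of the Nano 33 BLE Sense

      const int WINDOW = 100;                 // samples per gesture window (assumption)

      void setup() {
        Serial.begin(9600);
        while (!Serial) {}
        if (!IMU.begin()) {
          Serial.println("IMU init failed");
          while (true) {}
        }
      }

      void loop() {
        float sum[3] = {0}, sumSq[3] = {0};
        int n = 0;
        while (n < WINDOW) {                  // gather one window of raw samples
          float a[3];
          if (IMU.accelerationAvailable() && IMU.readAcceleration(a[0], a[1], a[2])) {
            for (int i = 0; i < 3; i++) {
              sum[i] += a[i];
              sumSq[i] += a[i] * a[i];
            }
            n++;
          }
        }
        // Reduce 300 raw values to 6 features: per-axis mean and variance.
        float features[6];
        for (int i = 0; i < 3; i++) {
          features[i] = sum[i] / WINDOW;
          features[i + 3] = sumSq[i] / WINDOW - features[i] * features[i];
        }
        // features[] is what a compact neural network would classify.
        for (int i = 0; i < 6; i++) {
          Serial.print(features[i], 4);
          Serial.print(' ');
        }
        Serial.println();
      }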

    Fraunhofer IMS Arduino Nano with Gesture Recognition

    Obviously this is just an example use case, but it’s easy to see the massive potential that this kind of compact, learning AI could have, whether in edge control, industrial applications, wearables, or maker projects. If you can train a device to do the job you want, it can offer amazing embedded intelligence with very few resources.

  • Light painting with a gesture-controlled drone

    Reading Time: < 1 minute

    Arduino Team, October 9th, 2020

    Researchers at the Skolkovo Institute of Science and Technology (Skoltech) in Moscow, Russia, have come up with a novel way to interface with a drone via hand movements.

    As shown in the video below, the device can be used to create long-exposure light art, though the possibilities for such an intuitive control system could extend to many other applications as well.

    In this setup, a small Crazyflie 2.0 quadcopter is operated by a glove-like wearable featuring an Arduino Uno, along with an IMU and flex sensor for user input, and an XBee module for wireless communication. The controller connects to a base station running a machine learning algorithm that matches the user’s gestures to pre-defined letters or patterns, and directs the drone to light paint them.
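
    The paper has the full pipeline, but the glove side can be sketched simply: the Uno samples the flex sensor and IMU, then streams readings to the base station through the XBee, which in transparent mode behaves like a plain serial link. Pin choices, the CSV packet format, and the readOrientation() helper (standing in for the actual IMU driver) are illustrative assumptions.

      const int FLEX_PIN = A0;                // flex sensor in a voltage divider

      // Hypothetical helper wrapping whatever IMU driver the glove uses.
      void readOrientation(float &pitch, float &roll) {
        pitch = 0.0f;                         // placeholder: replace with real IMU readout
        roll = 0.0f;
      }

      void setup() {
        Serial.begin(57600);                  // XBee on the UART in transparent mode
      }

      void loop() {
        float pitch, roll;
        readOrientation(pitch, roll);
        int flex = analogRead(FLEX_PIN);      // 0..1023, proportional to finger bend
        // One CSV line per sample; the base station's ML code parses these.
        Serial.print(pitch); Serial.print(',');
        Serial.print(roll);  Serial.print(',');
        Serial.println(flex);
        delay(20);                            // roughly 50 Hz update rate
      }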

    [youtube https://www.youtube.com/watch?v=SdnIqLjtGeU?feature=oembed&w=500&h=281]

    The team’s full research paper is available here.

  • GesturePod is a clip-on smartphone interface for the visually impaired

    Reading Time: 2 minutes

    Arduino Team, November 6th, 2019

    Smartphones have become a part of our day-to-day lives, but for those with visual impairments, accessing one can be a challenge. This can be especially difficult if one is using a cane that must be put aside in order to interact with a phone.

    The GesturePod offers another interface alternative that actually attaches to the cane itself. This small unit is controlled by an MKR1000 and uses an IMU to sense hand gestures applied to the cane.

    If a user, for instance, taps twice on the ground, a corresponding request is sent to the phone over Bluetooth, causing it to output the time audibly. Five gestures are currently proposed, which could be expanded upon or modified for different functionality as needed.
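
    The published system uses a trained classifier (see the abstract below), but a single gesture such as the double tap can be sketched with a simple heuristic: look for two sharp acceleration spikes close together in time. The readAccelMagnitude() helper is a hypothetical stand-in for the IMU driver, and the threshold and timing values are guesses for illustration, not those from the paper.

      const float TAP_THRESHOLD = 2.5;        // in g; a tap shows up as a sharp spike
      const unsigned long MAX_GAP_MS = 500;   // max time between the two taps
      unsigned long lastTapMs = 0;

      float readAccelMagnitude() {
        // Placeholder: return sqrt(ax*ax + ay*ay + az*az) from the IMU driver.
        return 1.0f;
      }

      void setup() {
        Serial.begin(9600);
      }

      void loop() {
        if (readAccelMagnitude() > TAP_THRESHOLD) {
          unsigned long now = millis();
          if (now - lastTapMs < MAX_GAP_MS) {
            // Second spike inside the window: report the gesture to the phone.
            Serial.println("DOUBLE_TAP");     // stand-in for the Bluetooth notification
            lastTapMs = 0;
          } else {
            lastTapMs = now;                  // first spike; wait for a second one
          }
          delay(150);                         // debounce so one tap isn't counted twice
        }
      }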

    [youtube https://www.youtube.com/watch?v=Bq1w7fy4SNw?feature=oembed&w=500&h=281]

    People using white canes for navigation find it challenging to concurrently access devices such as smartphones. Building on prior research on abandonment of specialized devices, we explore a new touch-free mode of interaction wherein a person with visual impairment can perform gestures on their existing white cane to trigger tasks on their smartphone. We present GesturePod, an easy-to-integrate device that clips on to any white cane, and detects gestures performed with the cane. With GesturePod, a user can perform common tasks on their smartphone without touch or even removing the phone from their pocket or bag. We discuss the challenges in building the device and our design choices. We propose a novel, efficient machine learning pipeline to train and deploy the gesture recognition model. Our in-lab study shows that GesturePod achieves 92% gesture recognition accuracy and can help perform common smartphone tasks faster. Our in-wild study suggests that GesturePod is a promising tool to improve smartphone access for people with VI, especially in constrained outdoor scenarios.
