Tag: Gesture Recognition

  • This DIY Apple Pencil writes with gestures

    This DIY Apple Pencil writes with gestures

    Reading Time: 2 minutes

    Released in 2015, the Apple Pencil is a technology-packed stylus that allows users to write on iPad screens with variations in pressure and angle — all while communicating with very low latencies. Nekhil Ravi and Shebin Jose Jacob of Coders Café were inspired by this piece of handheld tech to come up with their own pencil concept, except this one wouldn’t need a screen in order to function.

    The pair’s writing utensil relies on recognizing certain gestures as letters, and once one has been detected, outputs the result over USB or Bluetooth® to the host device. They started by first gathering many samples of different letters and how they correlate to the change in motion on the Arduino Nano 33 BLE Sense’s built-in accelerometer. From here, they designed an impulse in the Edge Impulse Studio to extract spectral features from the time series accelerometer data and pass it to a classification Keras neural network. The resulting model could accurately determine the correct letter from each gesture, making it suitable for deployment back to the Nano 33 BLE Sense.
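    Edge Impulse projects are typically exported as an Arduino library whose run_classifier() call wraps both the spectral-feature DSP block and the trained network. The snippet below is only a minimal sketch of that inference loop on the Nano 33 BLE Sense, assuming a hypothetical gesture_inferencing.h export; the project's actual window size and letter labels aren't given in the post.

    ```cpp
    // Minimal inference loop sketch (the header name and labels are assumptions,
    // not the project's published code).
    #include <Arduino_LSM9DS1.h>      // onboard accelerometer of the Nano 33 BLE Sense
    #include <gesture_inferencing.h>  // hypothetical Edge Impulse Arduino export

    static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

    // Edge Impulse pulls raw samples from this callback
    static int get_feature_data(size_t offset, size_t length, float *out) {
      memcpy(out, features + offset, length * sizeof(float));
      return 0;
    }

    void setup() {
      Serial.begin(115200);
      IMU.begin();
    }

    void loop() {
      // Fill one window with x/y/z acceleration samples
      for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
        while (!IMU.accelerationAvailable()) {}
        IMU.readAcceleration(features[i], features[i + 1], features[i + 2]);
        delay(1000 / EI_CLASSIFIER_FREQUENCY);
      }

      signal_t signal;
      signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
      signal.get_data = &get_feature_data;

      ei_impulse_result_t result;
      if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
        // Pick the letter with the highest confidence
        size_t best = 0;
        for (size_t i = 1; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
          if (result.classification[i].value > result.classification[best].value) best = i;
        }
        Serial.println(result.classification[best].label);
      }
    }
    ```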

    Before testing their new inferencing code on the hardware, the pair designed a simple 3D-printed case that fits around the board and mimics the look of the real Apple Pencil. Additionally, the team made a simple website that receives data from the board over BLE and displays the corresponding letter within the browser window. To see more about this project, you can watch their video below!
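    As a rough illustration of the BLE half, a recognized letter could be exposed as a notify characteristic with the ArduinoBLE library, which a page using the Web Bluetooth API can subscribe to. The UUIDs and names below are placeholders rather than the project's actual values.

    ```cpp
    // Sketch of pushing each recognized letter to the browser over BLE
    // (service/characteristic UUIDs and device name are placeholders).
    #include <ArduinoBLE.h>

    BLEService letterService("180C");                          // placeholder UUID
    BLECharCharacteristic letterChar("2A56", BLERead | BLENotify);

    void setup() {
      BLE.begin();
      BLE.setLocalName("DIY-Pencil");
      BLE.setAdvertisedService(letterService);
      letterService.addCharacteristic(letterChar);
      BLE.addService(letterService);
      BLE.advertise();
    }

    // Call this with the classifier's output; a subscribed web page
    // receives a notification and can print the letter as it arrives.
    void sendLetter(char letter) {
      letterChar.writeValue(letter);
    }

    void loop() {
      BLE.poll();  // keep the BLE stack serviced between inferences
    }
    ```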


    Website: LINK

  • Customizable artificial intelligence and gesture recognition

    Customizable artificial intelligence and gesture recognition

    Reading Time: 2 minutes

    Arduino Team, April 15th, 2021

    In many respects, we think of artificial intelligence as being all-encompassing: one AI will do any task we ask of it. But in reality, even when AI reaches the advanced levels we envision, it won't automatically be able to do everything. The Fraunhofer Institute for Microelectronic Circuits and Systems (Fraunhofer IMS) has been giving this a lot of thought.

    AI gesture training

    Okay, so you’ve got an AI. Now you need it to learn the tasks you want it to perform. Even today this isn’t an uncommon exercise. But the challenge that Fraunhofer IMS set itself was training an AI without any additional computers.

    As a test case, an Arduino Nano 33 BLE Sense was employed to build a demonstration device. Using only the onboard 9-axis motion sensor, the team built an untethered gesture recognition controller. When a button is pressed, the user draws a number in the air, and corresponding commands are wirelessly sent to peripherals. In this case, a robotic arm.
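    A rough outline of that button-gated flow might look like the sketch below; the on-device classifier and the radio link are stubbed out, since the Fraunhofer IMS code itself isn't published in this post.

    ```cpp
    // Outline of the controller flow: sample the IMU only while the button is
    // held, classify the drawn digit once it is released, then forward it.
    // classifyGesture() and sendCommand() are placeholders, not the real code.
    #include <Arduino_LSM9DS1.h>

    const int BUTTON_PIN = 2;                 // assumed wiring for the trigger button

    int classifyGesture() { return 0; }       // placeholder for the embedded ANN
    void sendCommand(int digit) {             // placeholder for the wireless link
      Serial.print("command: ");
      Serial.println(digit);
    }

    void setup() {
      Serial.begin(115200);
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      IMU.begin();
    }

    void loop() {
      if (digitalRead(BUTTON_PIN) == LOW) {   // button held: record the gesture
        float x, y, z;
        while (digitalRead(BUTTON_PIN) == LOW) {
          if (IMU.accelerationAvailable()) {
            IMU.readAcceleration(x, y, z);
            // append x, y, z to the sample window here
          }
        }
        sendCommand(classifyGesture());       // classify after the button is released
      }
    }
    ```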

    Video: https://www.youtube.com/watch?v=ES_Aw7Hq_OA

    Embedded intelligence

    At first glance this might not seem overly advanced. But consider that it's running entirely on the device, with just a small amount of memory and an Arduino Nano. Fraunhofer IMS calls this “embedded intelligence,” as it's not the robot arm that's clever, but the controller itself.

    This is achieved by training the device with a “feature extraction” algorithm. When a gesture is executed, the artificial neural network (ANN) picks out only the relevant information, which allows for impressive data reduction and a very efficient, compact AI.
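    As a generic illustration of why feature extraction shrinks the problem (this is not Fraunhofer IMS's actual algorithm), reducing each axis of a raw sample window to a few summary statistics already cuts the input the network has to process from hundreds of values to a handful:

    ```cpp
    // Illustrative only: summarize one axis of a sample window with four
    // statistics, so the network sees 4 numbers instead of n raw samples.
    #include <math.h>
    #include <stddef.h>

    struct AxisFeatures {
      float mean;
      float stddev;
      float minVal;
      float maxVal;
    };

    AxisFeatures extractFeatures(const float *samples, size_t n) {
      AxisFeatures f = {0.0f, 0.0f, samples[0], samples[0]};
      for (size_t i = 0; i < n; i++) {
        f.mean += samples[i];
        if (samples[i] < f.minVal) f.minVal = samples[i];
        if (samples[i] > f.maxVal) f.maxVal = samples[i];
      }
      f.mean /= n;
      for (size_t i = 0; i < n; i++) {
        float d = samples[i] - f.mean;
        f.stddev += d * d;
      }
      f.stddev = sqrtf(f.stddev / n);
      return f;   // 4 numbers now stand in for n raw samples on this axis
    }
    ```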

    Fraunhofer IMS Arduino Nano with Gesture Recognition

    Obviously, this is just an example use case. It's easy to see the massive potential that this kind of compact, learning AI could have, whether in edge control, industrial applications, wearables, or maker projects. If you can train a device to do the job you want, it can offer amazing embedded intelligence with very few resources.

    Website: LINK

  • Bike signal display keeps riders safe with machine learning

    Bike signal display keeps riders safe with machine learning

    Reading Time: 2 minutes


    Arduino Team, June 21st, 2020

    Cycling can be fun, not to mention great exercise, but it can also be dangerous at times. In order to promote safety and harmony between road users on his hour-plus bike commute in Marseille, France, Maltek created his own LED backpack signaling setup.

    The device uses a hand-mounted Arduino Nano 33 BLE Sense to record movement via its onboard IMU and runs a TinyML gesture recognition model to translate this into actual road signals. Left and right rotations of the wrist are passed along to the backpack unit over BLE, which shows the corresponding turn signal on its LED panel.

    Other gestures include a backward twist for stop and a forward twist to say “merci,” while a green forward-scrolling arrow is shown as the default state.
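    One way the backpack side could look, assuming the hand unit writes a single gesture byte over BLE, is sketched below; the UUIDs, byte codes, and drawing routine are placeholders, and Maltek's write-up has the real implementation.

    ```cpp
    // Sketch of the backpack receiver: map an incoming gesture byte to a
    // display state (all identifiers and values here are hypothetical).
    #include <ArduinoBLE.h>

    enum Signal : uint8_t { FORWARD = 0, LEFT = 1, RIGHT = 2, STOP = 3, MERCI = 4 };

    BLEService signalService("FFE0");                                  // placeholder UUID
    BLEByteCharacteristic signalChar("FFE1", BLEWrite | BLEWriteWithoutResponse);

    void drawOnPanel(uint8_t s) {
      // Replace with the LED-matrix drawing code for each symbol:
      // left/right arrows, stop, "merci", or the default green scrolling arrow.
      Serial.print("signal: ");
      Serial.println(s);
    }

    void setup() {
      Serial.begin(115200);
      BLE.begin();
      BLE.setLocalName("BikeBackpack");
      BLE.setAdvertisedService(signalService);
      signalService.addCharacteristic(signalChar);
      BLE.addService(signalService);
      BLE.advertise();
      drawOnPanel(FORWARD);                    // default state: green forward arrow
    }

    void loop() {
      BLE.poll();
      if (signalChar.written()) {              // new gesture received from the hand unit
        drawOnPanel(signalChar.value());
      }
    }
    ```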

    More details on the project can be found in Maltek’s write-up here.

    Video: https://www.youtube.com/watch?v=da8K2eS4XyU

    Video: https://www.youtube.com/watch?v=w5kqfRDzFDU

    Website: LINK

  • TipText enables one-handed text entry using a fingertip keyboard

    TipText enables one-handed text entry using a fingertip keyboard

    Reading Time: < 1 minute


    Arduino Team, November 11th, 2019

    Today, when you get a text, you can respond with a message via an on-screen keyboard. Looking into the future, however, how would you interact unobtrusively with a device that's integrated into eyeglasses, contacts, or perhaps even something else?

    TipText is one solution envisioned by researchers at Dartmouth College, which uses an MPR121 capacitive touch sensor wrapped around one's index finger as a tiny 2×3-grid QWERTY keyboard.

    The setup incorporates an Arduino to process inputs on the grid and propose a number of possible words on a wrist-mounted display that the user can select by swiping right with the thumb. A new word is automatically started when the next text entry tap is received, allowing for a typing speed of around 12-13 words per minute.
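    For a sense of the input side, the sketch below reads an MPR121 breakout over I2C with the Adafruit library and treats six of its electrodes as the 2×3 grid; the electrode-to-key mapping and the word-prediction step are placeholders rather than the Dartmouth implementation.

    ```cpp
    // Minimal MPR121 read loop: detect new touches on electrodes 0..5 and
    // report the key index (the grid mapping here is an assumption).
    #include <Wire.h>
    #include <Adafruit_MPR121.h>

    Adafruit_MPR121 cap;
    uint16_t lastTouched = 0;

    void setup() {
      Serial.begin(115200);
      cap.begin(0x5A);                           // default MPR121 I2C address
    }

    void loop() {
      uint16_t touched = cap.touched();          // bit i set = electrode i touched
      for (uint8_t i = 0; i < 6; i++) {          // electrodes 0..5 form the 2x3 grid
        bool now  = touched & (1 << i);
        bool then = lastTouched & (1 << i);
        if (now && !then) {
          Serial.print("key ");                  // feed this key index into the
          Serial.println(i);                     // word-prediction step
        }
      }
      lastTouched = touched;
      delay(10);
    }
    ```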

    Video: https://www.youtube.com/watch?v=i3YPZsiHEKM

    Website: LINK