Tag: Smart Glove

  • Iana’s smart glove in real life

    Reading Time: 2 minutes

    Rainbow Six is a series of tactical shooter video games dating back to 1998. The series comprises many individual titles set at various points in time, from historical wars to fictional futures. Rainbow Six Siege (2015) is set in the future, and the character Iana has the ability to project a virtual clone of herself, which she controls using a smart glove. For his most recent project, CiferTech recreated Iana’s smart glove.

    This replica of Iana’s smart glove can control a variety of devices and isn’t intended for any specific purpose. It looks just like the smart glove from Rainbow Six Siege and is a real, functional wearable device. It monitors the movement and orientation of the user’s hand, as well as the positions of their index finger and middle finger. Via radio communication, the user can use their hand and fingers to control any device with a compatible radio receiver.

    CiferTech modeled the smart glove to match what is seen in the game and then 3D-printed the parts. Inside the enclosure is an Arduino Nano board, an MPU-6050 accelerometer/gyroscope sensor to monitor hand movement, and an nRF24 radio transceiver module for communication. Two potentiometers on the finger linkages monitor finger position. The user just needs to equip the end device with another nRF24 module for radio control.
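
    The post doesn’t include CiferTech’s firmware, so as a rough illustration only, here is a minimal Python sketch of how the glove’s readings might be bundled into a fixed-size radio packet (nRF24 payloads are limited to 32 bytes). The field layout, scaling constants, and calibration endpoints are assumptions for illustration, not the project’s actual protocol; the real firmware would be an Arduino C++ sketch.

```python
import struct

# nRF24L01 payloads are limited to 32 bytes, so sensor readings are
# packed into a compact fixed-layout struct before transmission.
# Layout (assumed, not CiferTech's actual protocol):
#   3 x int16  -> hand orientation (roll, pitch, yaw in 0.01-degree units)
#   2 x uint8  -> finger bend, 0-100 % (index, middle)
PAYLOAD_FORMAT = "<hhhBB"  # little-endian, 8 bytes total

def adc_to_percent(adc_value, adc_min=200, adc_max=900):
    """Map a raw 10-bit potentiometer reading to a 0-100 bend percentage.

    The min/max endpoints are hypothetical; a real glove would record
    them while the wearer fully opens and closes the hand.
    """
    pct = (adc_value - adc_min) * 100 // (adc_max - adc_min)
    return max(0, min(100, pct))

def pack_payload(roll_deg, pitch_deg, yaw_deg, index_adc, middle_adc):
    """Bundle orientation angles and finger readings into one radio packet."""
    return struct.pack(
        PAYLOAD_FORMAT,
        int(roll_deg * 100),
        int(pitch_deg * 100),
        int(yaw_deg * 100),
        adc_to_percent(index_adc),
        adc_to_percent(middle_adc),
    )

packet = pack_payload(12.5, -3.0, 90.0, 550, 880)
print(len(packet))  # 8 bytes, well under the 32-byte nRF24 limit
```

On the receiving side, the device with the second nRF24 module would unpack the same layout with `struct.unpack(PAYLOAD_FORMAT, packet)` and act on the angles and bend percentages.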

    The post Iana’s smart glove in real life appeared first on Arduino Blog.

    Website: LINK

  • This glove translates sign language using an array of sensors

    Reading Time: 2 minutes

    For people not familiar with American Sign Language (ASL), being able to recognize what certain hand motions and positions mean is a nearly impossible task. To make this process easier, Hackster.io user ayooluwa98 came up with the idea to integrate various motion, resistive, and touch sensors into a single glove that could convert these signals into understandable text and speech.

    The system is based around a single Arduino Nano board, which is responsible for taking in sensor data and outputting the phrase that best matches the inputs. The orientation of the hand is ascertained by reading values from the X, Y, and Z axes of a single accelerometer and applying a small offset determined during prior calibration. Meanwhile, resistive flex sensors spanning the length of each finger produce a different voltage level according to how far each finger is bent.
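
    As a rough sketch of those two sensing steps, hand tilt can be estimated from the three accelerometer axes with the standard tilt-from-gravity formulas, and a flex sensor’s voltage can be normalized to a bend fraction. The calibration offsets and voltage endpoints below are hypothetical stand-ins, not values from ayooluwa98’s glove:

```python
import math

def tilt_from_accel(ax, ay, az, pitch_offset=0.0, roll_offset=0.0):
    """Estimate hand pitch and roll (degrees) from accelerometer axes (in g).

    Uses the standard tilt-from-gravity formulas; the offsets stand in
    for the per-glove calibration the write-up mentions.
    """
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch - pitch_offset, roll - roll_offset

def flex_to_bend(voltage, v_straight=1.2, v_bent=2.8):
    """Map a flex sensor's divider voltage to a 0.0-1.0 bend fraction.

    v_straight/v_bent are assumed calibration endpoints, not measured
    values from the actual glove.
    """
    frac = (voltage - v_straight) / (v_bent - v_straight)
    return max(0.0, min(1.0, frac))

# Hand lying flat (gravity entirely on Z): no tilt.
print(tilt_from_accel(0.0, 0.0, 1.0))  # (0.0, 0.0)
print(flex_to_bend(2.0))               # 0.5
```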

    At each iteration of the program’s main loop, a series of Boolean statements is evaluated to pick the phrase that best matches the current finger bends and hand orientation, and this data is then output via the UART pins to an attached HC-05 Bluetooth® module. The final component is a connected phone running a custom app that receives the incoming words over Bluetooth® and saves them for text-to-speech output when the button is pressed.
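
    The write-up’s exact conditions aren’t reproduced here, so the matching step might look something like this sketch, where the phrases, thresholds, and finger conditions are illustrative placeholders rather than the project’s actual rules:

```python
def match_phrase(bends, pitch):
    """Pick the phrase whose Boolean condition matches the current pose.

    `bends` is a tuple of five 0.0-1.0 finger-bend fractions
    (thumb..pinky); `pitch` is hand pitch in degrees. The conditions
    and phrases below are placeholders, not the project's actual rules.
    """
    thumb, index, middle, ring, pinky = bends
    if all(b > 0.8 for b in bends):
        return "yes"    # closed fist
    if index < 0.2 and middle < 0.2 and ring > 0.8 and pinky > 0.8:
        return "peace"  # index and middle extended
    if all(b < 0.2 for b in bends) and pitch > 45:
        return "hello"  # open hand, raised
    return None         # no match; send nothing this loop

# Each loop iteration, a matched phrase would be written to the UART
# feeding the HC-05, e.g. serial.write(phrase.encode()).
print(match_phrase((0.9, 0.9, 0.95, 0.85, 0.9), 0.0))  # yes
```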

    To see more about this project, you can read ayooluwa98’s write-up here on Hackster.io.

    The post This glove translates sign language using an array of sensors appeared first on Arduino Blog.

    Website: LINK

  • The MemGlove detects hand poses and recognizes objects

    Reading Time: 2 minutes

    Arduino Team, July 14th, 2020

    Hand movements have long been used as a computer interface method, but as reported here, the MemGlove from a team of MIT CSAIL researchers takes things several steps further. This augmented glove can sense both hand poses and how the hand is applying pressure to an object.

    The wearable uses a novel arrangement of 16 electrodes to detect hand position based on resistance, along with six fluid-filled tubes that transmit pressure depending on how an item is gripped.

    An Arduino Due senses these interactions and passes the information on to a computer for processing. Pose verification is accomplished with a Leap Motion sensor. By training neural networks with TensorFlow, the glove is able to identify various hand poses, as well as distinguish between 30 different household objects that are grasped.
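
    As an illustration of the shape of the problem only (the researchers used TensorFlow-trained neural networks, not the toy classifier below), each grasp can be summarized as a feature vector of 16 electrode resistances plus 6 tube pressures, and classification picks the closest known object. The labels and centroid values here are made up:

```python
import math

# Each grasp becomes a 22-value feature vector:
# 16 electrode resistances + 6 fluid-tube pressures (all normalized).
# A nearest-centroid lookup stands in for the paper's trained neural
# networks; the objects and centroid values below are invented.
CENTROIDS = {
    "mug":    [0.8] * 16 + [0.3] * 6,
    "bottle": [0.4] * 16 + [0.7] * 6,
    "ball":   [0.1] * 16 + [0.9] * 6,
}

def classify_grasp(features):
    """Return the object label whose centroid is nearest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

print(classify_grasp([0.75] * 16 + [0.35] * 6))  # mug
```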

    Video: https://www.youtube.com/watch?v=6OY83PSr40I

    More details on the MemGlove can be found in the researchers’ paper here.

    Website: LINK

  • Bike signal display keeps riders safe with machine learning

    Reading Time: 2 minutes

    Arduino Team, June 21st, 2020

    Cycling can be fun, not to mention great exercise, but is also dangerous at times. In order to facilitate safety and harmony between road users on his hour-plus bike commute in Marseille, France, Maltek created his own LED backpack signaling setup.

    The device uses a hand-mounted Arduino Nano 33 BLE Sense to record movement via its onboard IMU and runs a TinyML gesture recognition model to translate this into actual road signals. Left and right rotations of the wrist are passed along to the backpack unit over BLE, which shows the corresponding turn signal on its LED panel.

    Other gestures include a backward twist for stop and a forward twist to say “merci,” while a green forward-scrolling arrow is shown as the default state.
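
    A minimal sketch of the backpack side of that logic, assuming made-up gesture labels and display-state names (Maltek’s actual BLE messages and LED patterns aren’t given here):

```python
# Maps a gesture label received over BLE to what the LED panel shows.
# The label and display names are assumptions for illustration; the
# actual project defines its own BLE messages and LED patterns.
GESTURE_TO_DISPLAY = {
    "wrist_left":  "LEFT_ARROW",
    "wrist_right": "RIGHT_ARROW",
    "back_twist":  "STOP",
    "fwd_twist":   "MERCI",
}
DEFAULT_DISPLAY = "GREEN_FORWARD_ARROW"  # scrolling arrow, default state

def display_for(gesture):
    """Return the panel state for a gesture, falling back to the default."""
    return GESTURE_TO_DISPLAY.get(gesture, DEFAULT_DISPLAY)

print(display_for("wrist_left"))  # LEFT_ARROW
print(display_for(None))          # GREEN_FORWARD_ARROW
```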

    More details on the project can be found in Maltek’s write-up here.

    Video: https://www.youtube.com/watch?v=da8K2eS4XyU

    Video: https://www.youtube.com/watch?v=w5kqfRDzFDU

    Website: LINK