Tag: Assistive Technology

  • A gamified approach to therapy and motor skills testing

    Reading Time: 2 minutes

    For children who experience certain developmental delays, specific types of physical therapy are often employed to help them improve their balance, coordination, and motor skills. Ivan Hernandez, Juan Diego Zambrano, and Abdelrahman Farag were looking for a way to quantify the progress patients make while simultaneously making the process feel like a game, so they developed a standalone node for equilibrium evaluation that could do both.

    On the hardware side of things, an Arduino Nano 33 BLE Sense Rev2 is responsible for handling all of the incoming motion data from its onboard BMI270 six-axis IMU and BMM150 three-axis magnetometer. New readings are constantly taken, filtered, and fused together before being sent to an external device over Bluetooth Low Energy. The board is also connected to a buzzer and buttons for user input, as well as an RGB LED for real-time status indication.
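
    To make that pipeline concrete, here is a minimal sketch of the read-and-stream loop; the service UUIDs, two-float packet layout, and gravity-only tilt math are illustrative assumptions rather than the team’s actual fusion code.

      #include <ArduinoBLE.h>
      #include <Arduino_BMI270_BMM150.h>  // IMU library for the Nano 33 BLE Sense Rev2

      // Hypothetical UUIDs -- the project's real service layout isn't published here
      BLEService motionService("19B10010-E8F2-537E-4F6C-D104768A1214");
      BLECharacteristic motionChar("19B10011-E8F2-537E-4F6C-D104768A1214", BLERead | BLENotify, 8);

      void setup() {
        if (!IMU.begin() || !BLE.begin()) while (true);  // halt on hardware failure
        BLE.setLocalName("EquilibriumNode");
        BLE.setAdvertisedService(motionService);
        motionService.addCharacteristic(motionChar);
        BLE.addService(motionService);
        BLE.advertise();
      }

      void loop() {
        BLEDevice central = BLE.central();
        while (central && central.connected()) {
          float ax, ay, az;
          if (IMU.accelerationAvailable()) {
            IMU.readAcceleration(ax, ay, az);
            // crude tilt estimate from the gravity vector (real fusion adds gyro/mag data)
            float roll  = atan2(ay, az);
            float pitch = atan2(-ax, sqrt(ay * ay + az * az));
            float packet[2] = { roll, pitch };
            motionChar.writeValue((uint8_t*)packet, sizeof(packet));
          }
        }
      }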

    The patient begins the session by first putting on the wearable and connecting to the accompanying therapist application. Next, a game starts in which the user must move their torso to guide an image of a shark over the image of a stationary fish within a time period — ultimately trying to get the highest score possible. Throughout all of this, a vision system synchronizes its readings with the IMU sensor readings for an ultra-detailed look at how the patient responds to the game over time.

    To read more about the project, you can visit the team’s write-up on Hackaday.io.

    Video: https://www.youtube.com/watch?v=EA0CtgWG24I


  • This small device enables users to feel braille through haptics

    Reading Time: 2 minutes

    For the visually impaired community, most interactions on mobile phones are confined to text-to-speech (TTS) interfaces that read portions of the screen aloud. Dynamic braille displays also exist as a tactile means of communication, but their prices can approach $15,000, putting them out of reach for most people. This is why Instructables user bmajorspin wanted to create an inexpensive, portable alternative that could work with mobile devices.

    Unlike other braille displays that use moving pins, this design leverages a set of six static pins, housed within a 3D-printed enclosure, that vibrate independently. After connecting six haptic motors to an Arduino Nano 33 BLE through MOSFET drivers, bmajorspin mounted the entire circuit onto a small piece of perfboard and then soldered on a micro USB cable for power. Lastly, a spring and 3D-printed cap were placed over each braille dot to isolate the vibrations and prevent the haptic signals from becoming muddled together.

    The Nano 33 BLE is able to display braille characters thanks to it acting as a Bluetooth® Low Energy server that exposes a custom braille reader service. Through it, bmajorspin’s custom Android app can send encoded dot patterns to the device for it to then decode and present with the haptic motors.
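
    That decode step can be sketched in a few lines, assuming a one-byte characteristic in which bits 0 through 5 map to braille dots 1 through 6; the UUIDs and pin choices below are placeholders, not bmajorspin’s published values.

      #include <ArduinoBLE.h>

      // hypothetical UUIDs for the custom braille reader service
      BLEService brailleService("19B10000-E8F2-537E-4F6C-D104768A1214");
      BLEByteCharacteristic cellChar("19B10001-E8F2-537E-4F6C-D104768A1214", BLEWrite);

      const int motorPins[6] = {2, 3, 4, 5, 6, 7};  // MOSFET gates, one per dot

      void setup() {
        for (int p : motorPins) pinMode(p, OUTPUT);
        if (!BLE.begin()) while (true);
        BLE.setLocalName("BrailleHaptics");
        BLE.setAdvertisedService(brailleService);
        brailleService.addCharacteristic(cellChar);
        BLE.addService(brailleService);
        BLE.advertise();
      }

      void loop() {
        BLE.poll();
        if (cellChar.written()) {
          byte cell = cellChar.value();          // bits 0-5 = braille dots 1-6
          for (int i = 0; i < 6; i++)
            digitalWrite(motorPins[i], (cell >> i) & 1);
        }
      }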

    More information about this highly accessible braille reader can be found here on Instructables.


  • This Arduino GIGA R1 WiFi project turns a coffee maker into a more accessible appliance

    Reading Time: 2 minutes

    While many of the things we interact with every day have become more usable by people with disabilities, the kitchen remains one important area of our lives that still lacks many accessibility features. One such commonplace appliance is the coffee maker, with its array of small buttons, or even a touchscreen, that can be hard to see or touch. Orlie on Instructables has developed a set of wireless buttons and an accompanying receiver that translate simple actions into an easy, end-to-end brewing experience.

    Each button started as a custom 3D-printed shell with compartments for an AA battery holder, a large arcade button, and the perfboard that also carried the ESP8266 microcontroller. In this system, the ESP8266 communicates with the Arduino GIGA R1 WiFi board via Wi-Fi and an MQTT message broker running on a host PC. This enables each button to be assigned a unique message that dictates the task to be performed.
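
    The button side of that protocol might look roughly like the following, using the common PubSubClient MQTT library; the network credentials, topic name, and payload are placeholders.

      #include <ESP8266WiFi.h>
      #include <PubSubClient.h>  // widely used Arduino MQTT client

      // placeholder credentials and broker address
      const char* SSID = "home-network";
      const char* PASS = "password";
      const char* BROKER = "192.168.1.10";
      const int BUTTON_PIN = 4;  // GPIO4 (D2 on NodeMCU-style boards)

      WiFiClient wifi;
      PubSubClient mqtt(wifi);

      void setup() {
        pinMode(BUTTON_PIN, INPUT_PULLUP);
        WiFi.begin(SSID, PASS);
        while (WiFi.status() != WL_CONNECTED) delay(100);
        mqtt.setServer(BROKER, 1883);
      }

      void loop() {
        if (!mqtt.connected()) mqtt.connect("brew-button-1");  // unique client ID per button
        mqtt.loop();
        if (digitalRead(BUTTON_PIN) == LOW) {
          mqtt.publish("coffee/commands", "BREW_LARGE");  // this button's unique message
          delay(500);  // crude debounce
        }
      }

    On the receiving end, the GIGA R1 WiFi would subscribe to the same topic and map each payload to a gantry target position.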

    At the coffee maker, the GIGA R1 WiFi was wired into a pair of ULN2003 stepper motor driver modules that move a gantry across a set of linear rails and push the corresponding buttons once the correct position has been reached. Ultimately, this allows those with limited mobility and/or dexterity to select what they want from anywhere in the house, all over Wi-Fi.

    To see how this project was built in greater detail, you can read Orlie’s write-up here on Instructables.


  • Motion control interface facilitates robot operation for those with paralysis

    Reading Time: 2 minutes

    Henry Evans suffered a brain-stem stroke 20 years ago that left him paralyzed with quadriplegia. He can move his head, but other than a small amount of movement in his left thumb, he can’t control the rest of his body. To help Evans live a more independent life, researchers from Carnegie Mellon University’s School of Computer Science developed a motion control interface that lets him operate a mobile robot.

    The robot is a Stretch model from Hello Robot, which can navigate a home on its mobile base, interact with objects using its arm and gripper, and provide a live view through a pair of cameras (one on its head and one on its gripper). But this telepresence robot doesn’t have any provisions for operation by a person with quadriplegia like Evans. That’s where the SCS team came in.

    They created a head-worn motion control interface consisting of an Arduino Nano board, a Bosch BNO055 IMU and an HC-05 Bluetooth module. The Arduino monitors Evans’s head movement with the IMU, then sends cursor movement commands over Bluetooth to the computer running the software that controls the Stretch robot. That lets Evans move the cursor on the screen, and then he can click a mouse button thanks to the limited movement of his left thumb.
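
    A simplified version of that loop, using Adafruit’s BNO055 library and assuming the host software parses plain “dx,dy” lines (the actual wire protocol isn’t documented here), could read:

      #include <Wire.h>
      #include <Adafruit_Sensor.h>
      #include <Adafruit_BNO055.h>
      #include <SoftwareSerial.h>

      Adafruit_BNO055 bno = Adafruit_BNO055(55);  // default I2C address
      SoftwareSerial bt(10, 11);                  // RX, TX wired to the HC-05

      void setup() {
        bt.begin(9600);
        if (!bno.begin()) while (true);  // halt if the IMU isn't found
      }

      void loop() {
        sensors_event_t e;
        bno.getEvent(&e);
        // fold heading (0-360) into a signed angle, clamp both axes to small deltas
        float yaw = e.orientation.x > 180 ? e.orientation.x - 360 : e.orientation.x;
        int dx = constrain((int)yaw, -10, 10);
        int dy = constrain((int)e.orientation.y, -10, 10);
        bt.print(dx); bt.print(','); bt.println(dy);  // host parses "dx,dy" lines
        delay(20);
      }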

    During a week-long testing session, Evans successfully used this system to perform many tasks around his home. He was able to use the robot to pick up tissues and bring them to his face, and even to adjust the blinds on his bedroom window. Clever “Drivers Assistance” software lets the robot operate semi-autonomously in order to complete tasks that would have been difficult for Evans to accomplish through manual control.

    While the Stretch robot is expensive at about $25,000, the HAT (Head-worn Assistive Teleoperation) control interface is affordable. This is just a prototype, but a device like this could help many people around the world living with quadriplegia and other conditions that affect motor control.

    Video: https://www.youtube.com/watch?v=XuQKCFJ3-V8


  • A gaming platform tailored to those with special needs

    Reading Time: 2 minutes

    As a society, we have decided to enact some measures to make our world more accessible to those with disabilities. Wheelchair ramps, for example, are often legal requirements for businesses in many countries. But we tend to drop the ball when it comes to things that aren’t necessities. For instance, entertainment options are an afterthought much of the time. That’s why Alain Mauer developed this LED gaming platform for people with special needs.

    This device offers a lot of flexibility so that builders can tailor it to a specific individual’s own needs and tastes. Mauer designed it for his son, who is 17 years old and lives with non-verbal autism. Entertainment options intended for neurotypical people don’t engage the teen, but toys designed for children fail to hold his interest for long. This game, dubbed “Scott’s Arcade,” is simple to understand and interact with, while still offering a lot of replayability. It is also durable and able to withstand rough handling.

    Scott’s Arcade consists of a “screen” made up of individually addressable RGB LEDs and a faceplate with shape cutouts that act as masks for the LEDs. An Arduino Nano controls the lights and responds to presses of the large buttons beneath the screen. It can trigger sound effects through a DFRobot DFPlayer Mini MP3 player as well.
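
    The reward moment could be sketched along these lines, assuming WS2812-style LEDs and the DFPlayer on a software serial port; the pin numbers, LED count, and track numbers are illustrative.

      #include <Adafruit_NeoPixel.h>
      #include <SoftwareSerial.h>
      #include <DFRobotDFPlayerMini.h>

      Adafruit_NeoPixel strip(64, 6, NEO_GRB + NEO_KHZ800);  // LED count and pin assumed
      SoftwareSerial mp3Serial(10, 11);                      // to the DFPlayer Mini
      DFRobotDFPlayerMini mp3;
      const int BUTTON_PIN = 2;

      void setup() {
        pinMode(BUTTON_PIN, INPUT_PULLUP);
        strip.begin();
        mp3Serial.begin(9600);
        mp3.begin(mp3Serial);
        mp3.volume(20);
      }

      void loop() {
        if (digitalRead(BUTTON_PIN) == LOW) {   // correct answer pressed
          strip.fill(strip.Color(0, 255, 0));   // flash the screen green
          strip.show();
          mp3.play(1);                          // fanfare: track 001 on the SD card
          delay(1000);
          strip.clear();
          strip.show();
        }
      }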

    Video: https://www.youtube.com/watch?v=p7KceTKOyhQ

    Mauer programmed a few simple games for the device, such as a matching game that challenges the player to find the circle of the same color as the triangle. When they succeed, they’re rewarded with fanfare sound effects and flashing lights. Makers can also program their own games to suit the players’ abilities and interests. 


  • Voice-enabled controller makes video games more accessible

    Reading Time: 2 minutes

    Almost all modern video games require either a gamepad or a keyboard and mouse, which means that they’re inaccessible to many people with disabilities that affect manual dexterity. Bob Hammell’s voice-enabled controller lets some of those people experience the joy of video games.

    This is a simplified video game controller with a minimal number of physical buttons, but with special voice-activated virtual buttons to make up the difference. The gamepad only has six physical buttons, plus an analog joystick. That makes it much easier to handle than a typical modern controller, which might have a dozen buttons and two joysticks. If the player has the ability, they can utilize the physical controls and then speak commands to activate the game functions not covered by those buttons.

    The controller’s brain is an Arduino Micro board, which Hammell selected because it can be configured to show up as a standard USB HID gamepad or keyboard when connected to a PC. The physical controls are an Adafruit analog two-axis joystick and tactile switches. An Adafruit 1.3″ OLED screen displays information, including the status of the voice activation.

    An Elechouse V3 Voice Recognition Module performs the voice recognition and it can understand up to 80 different commands. When it recognizes a command, like “menu,” it tells the Arduino to send the corresponding virtual button press to the connected computer. It takes time for a person to speak a command, so those are best suited to functions that players don’t use very often.
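
    Using Elechouse’s VoiceRecognitionV3 library, the recognize-and-inject step might look like this; the wiring, trained record numbers, and key mapping are assumptions.

      #include "VoiceRecognitionV3.h"  // Elechouse library for the V3 module
      #include <Keyboard.h>

      VR vr(2, 3);       // module's TX/RX wired to pins 2 and 3 (assumed)
      uint8_t buf[64];

      void setup() {
        vr.begin(9600);
        Keyboard.begin();
        vr.load((uint8_t)0);  // record 0 = "menu", trained ahead of time
      }

      void loop() {
        if (vr.recognize(buf, 50) > 0) {  // poll with a 50 ms timeout
          if (buf[1] == 0) {              // buf[1] holds the matched record number
            Keyboard.write(KEY_ESC);      // the "menu" virtual button (mapping assumed)
          }
        }
      }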

    If you know someone who would benefit from a controller like this, Hammell posted a full tutorial and all of the necessary files to Hackster.io so you can build your own.


  • RoboCup is an assistive drinking device for people living with cerebral palsy

    Reading Time: 2 minutes

    One of the many realities of living with cerebral palsy is limited upper body dexterity, which means almost every activity requires the help of a caregiver. That includes something that most of us take for granted: drinking water. To restore at least that little bit of independence, Rice University engineering students Thomas Kutcher and Rafe Neathery designed the RoboCup.

    A typical solution for letting people with cerebral palsy drink without assistance is a “giraffe bottle.” That is a water bottle with a long gooseneck straw that extends in front of the user’s mouth. But while that does give them the ability to drink on their own, it is obtrusive and leaves a bulky straw in front of their face. RoboCup eliminates that issue by rotating the straw out of the way when it isn’t in use. To take a drink, the user just needs to push a button or move their finger over a sensor. The straw will then rotate back over to their mouth.

    The best part is that RoboCup is open source, so anyone with a 3D printer and some basic electronics skills can build one for around $100. The key component is an Arduino Nano board. It monitors the tactile button or distance sensor (whichever is appropriate for the user’s capability) and controls a servo motor that rotates the straw. Power comes from a small rechargeable battery, and all of the components, aside from the 3D-printed parts, are off-the-shelf and readily available.
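
    The core behavior fits in a very small sketch; the one below assumes a single toggle button and made-up servo angles rather than the project’s tuned values.

      #include <Servo.h>

      Servo strawServo;
      const int BUTTON_PIN = 2;
      const int REST_ANGLE = 10, MOUTH_ANGLE = 100;  // geometry assumed
      bool atMouth = false;

      void setup() {
        pinMode(BUTTON_PIN, INPUT_PULLUP);
        strawServo.attach(9);
        strawServo.write(REST_ANGLE);
      }

      void loop() {
        if (digitalRead(BUTTON_PIN) == LOW) {
          atMouth = !atMouth;  // each press swings the straw over or away
          strawServo.write(atMouth ? MOUTH_ANGLE : REST_ANGLE);
          delay(400);          // debounce and allow travel time
        }
      }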

    Video: https://www.youtube.com/watch?v=OeWqfF73XDA

    More details on the RoboCup along with instructions are available on the project’s page here.


  • This AI system helps visually impaired people locate dining utensils

    Reading Time: 2 minutes

    People with visual impairments also enjoy going out to a restaurant for a nice meal, which is why it is common for wait staff to place the salt and pepper shakers in a consistent fashion: salt on the right and pepper on the left. That helps visually impaired diners quickly find the spice they’re looking for, and a similar arrangement works for utensils. But what about after the diner sets down a utensil in the middle of a meal? The ForkLocator is an AI system that can help them locate the utensil again.

    This is a wearable device meant for people with visual impairments. It uses object recognition and haptic cues to help the user locate their fork. The current prototype, built by Revoxdyna, only works with forks. But it would be possible to expand the system to work with the full range of utensils. Haptic cues come from four servo motors, which prod the user’s arm to indicate the direction in which they should move their hand to find the fork.

    The user’s smartphone performs the object recognition and should be worn or positioned in such a way that its camera faces the table. The smartphone app looks for the plate, the fork, and the user’s hand. It then calculates a vector from the hand to the fork and tells an Arduino board to actuate the servo motors corresponding to that direction. Those servos and the Arduino attach to a 3D-printed frame that straps to the user’s upper arm.
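
    One way to picture the Arduino side is a simple serial protocol in which the app sends a direction character and the matching servo taps the arm; the “U/D/L/R” characters and pin choices are illustrative guesses, not the published firmware.

      #include <Servo.h>

      // four servos prod the arm: up, down, left, right (pin mapping assumed)
      Servo cue[4];
      const int pins[4] = {3, 5, 6, 9};

      void setup() {
        Serial.begin(9600);  // the phone app bridges to this serial link
        for (int i = 0; i < 4; i++) {
          cue[i].attach(pins[i]);
          cue[i].write(0);
        }
      }

      void loop() {
        if (Serial.available()) {
          char dir = Serial.read();  // 'U','D','L','R' from the app (assumed protocol)
          int idx = (dir == 'U') ? 0 : (dir == 'D') ? 1 : (dir == 'L') ? 2 : (dir == 'R') ? 3 : -1;
          if (idx >= 0) {
            cue[idx].write(45);      // press against the arm
            delay(300);
            cue[idx].write(0);       // release
          }
        }
      }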

    A lot more development is necessary before a system like the ForkLocator would be ready for the consumer market, but the accessibility benefits are something to applaud.

    Video: https://www.youtube.com/watch?v=_TgC0KYyzwI


  • Walk-Bot helps people with visual impairments navigate safely

    Reading Time: 2 minutes

    It is no secret that visual impairments — even those that don’t result in complete blindness — make it very difficult for people to live their lives. White canes can help people get around, but they require physical contact. Seeing eye dogs provide very valuable assistance, but they’re expensive and need care of their own. That’s why Nilay Roy Choudhury designed the Walk-Bot device to help people with visual impairments navigate safely.

    Walk-Bot is a wearable navigation device that uses audible cues and haptic feedback to give visually impaired people a sense of their immediate environment. It has a host of sensors that let it identify nearby obstacles at any height from the floor to the ceiling. Walk-Bot performs onboard trigonometry to determine the distance to any obstacles that might interfere with its user’s ability to walk safely. And it is affordable and easy to build with common components.

    Those components include an Arduino Nano board, two HC-SR04 ultrasonic sensors, a GP2Y0A02YK0F infrared sensor, a vibration motor, a buzzer, an MPU-6050 gyroscope, and an HC-05 Bluetooth module. Those all fit inside a 3D-printed wearable enclosure.

    One ultrasonic sensor faces upwards at a 45-degree angle to detect high obstacles. The second ultrasonic sensor faces directly forwards. The infrared sensor points downwards at a 45-degree angle to detect low obstacles and was chosen because ultrasonic sensors struggle with some common floor surfaces. The gyroscope lets Walk-Bot determine its own orientation in space. When it detects an obstacle, Walk-Bot sounds the buzzer and activates the vibration motor. It also includes a panic button that will tell Walk-Bot to connect to the user’s smartphone through the Bluetooth module to message a chosen contact in the event of an emergency.
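
    A minimal version of the forward-facing ranging and alert logic might look like this; the 1 m threshold and pin assignments are assumptions.

      const int TRIG = 2, ECHO = 3, BUZZER = 8, MOTOR = 9;

      void setup() {
        pinMode(TRIG, OUTPUT);
        pinMode(ECHO, INPUT);
        pinMode(BUZZER, OUTPUT);
        pinMode(MOTOR, OUTPUT);
      }

      float readDistanceCm() {
        digitalWrite(TRIG, LOW);  delayMicroseconds(2);
        digitalWrite(TRIG, HIGH); delayMicroseconds(10);
        digitalWrite(TRIG, LOW);
        long us = pulseIn(ECHO, HIGH, 30000);  // 30 ms timeout
        return us * 0.0343f / 2.0f;            // speed of sound, round trip
      }

      void loop() {
        float d = readDistanceCm();
        bool obstacle = (d > 0 && d < 100.0f);  // assumed alert threshold: 1 m
        digitalWrite(BUZZER, obstacle);
        digitalWrite(MOTOR, obstacle);
        delay(60);
      }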


  • The smartChair is a Nano 33 IoT-based stand-up and walking aid

    Reading Time: 2 minutes

    Arduino Team, September 2nd, 2021

    As people age, they naturally tend to lose some or most of their mobility, which can lead to a need for a wheelchair, walker, or other assistive device. This led hitesh.boghani to submit his project, which he calls the smartChair, to element14’s Design for a Cause 2021 contest. This build features a sort of pseudo-walker that enables a user to transition from a sitting to a standing position with some motorized assistance. Beyond that primary use, Hitesh also wanted to create a “summon” mode that would allow the walker to move on its own to where it’s needed.

    As with every other project submitted to the contest, this too makes use of the Arduino Nano 33 IoT to handle both motor control and communication with a client device. In order to lift the walker from a compacted state to an expanded one, Hitesh began by assembling a wooden frame and then placed a brushless DC motor in line with some gearing to increase torque and reduce the speed. Next, an L293D motor driver IC was connected to a breadboard and a Nano 33 IoT for receiving input signals. And finally, a bit of code was written that spins the motor for a certain number of turns depending on the speed and direction requested.
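
    A stripped-down version of that motor control could look like the following open-loop sketch; the pin mapping and the speed-to-time conversion constant are assumptions for illustration.

      // L293D wiring assumed: EN1 on pin 9 (PWM), IN1/IN2 on pins 7 and 8
      const int EN1 = 9, IN1 = 7, IN2 = 8;

      void setup() {
        pinMode(EN1, OUTPUT);
        pinMode(IN1, OUTPUT);
        pinMode(IN2, OUTPUT);
      }

      // spin the lift motor at a given duty cycle for an approximate number of turns
      void spin(bool up, int duty, float turns) {
        digitalWrite(IN1, up ? HIGH : LOW);
        digitalWrite(IN2, up ? LOW : HIGH);
        analogWrite(EN1, duty);
        // crude open-loop timing: assumes roughly one turn per second at full duty
        unsigned long ms = (unsigned long)(turns * 1000.0f * (255.0f / duty));
        delay(ms);
        analogWrite(EN1, 0);  // stop
      }

      void loop() {
        spin(true, 200, 5.0);   // raise the walker
        delay(5000);
        spin(false, 200, 5.0);  // lower it again
        delay(5000);
      }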

    Unfortunately, time ran out to complete the summon feature, so Hitesh plans to keep improving the project by adding a camera, a motorized base, and a basic smartphone app for controlling the whole thing. But even in its current state, the smartChair is a great assistive tool for anyone who needs extra help getting up from a sitting position.


  • Speak4Me is an eye-to-speech module designed to assist those unable to communicate verbally

    Reading Time: 2 minutes

    Arduino Team, September 1st, 2021

    People who suffer from physical disabilities that leave them unable to speak or communicate effectively can end up frustrated or largely ignored. In response to this issue, Hackaday users MalteMarco and Tim R wanted to create a small device that can turn small eye movements into simple commands and phrases for easier communication, which they call the “Speak4Me.”

    At the most basic level, the Speak4Me consists of an Arduino Nano board that controls a set of four infrared sensors, mounted on a single glasses lens and pointed at the user’s eye. Once every 100 milliseconds, a measurement is taken to determine the location of the pupil, and thus the direction being focused on. The word or phrase is chosen by first selecting a profile containing four groups of four elements each, for a total of sixteen possible combinations per profile. As an example, the caretaker profile has elements such as “yes,” “I want to sit,” and even “I need medical treatment.”

    After a command has been selected, it is sent to a Parallax Emic 2 text-to-speech module that takes in the words and produces the corresponding sounds, which are output via a 3.5mm audio jack.
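
    The selection-to-speech step might be sketched as follows. The IR thresholding is heavily simplified and the phrase table is a stand-in for the real profiles, though sending “S” plus text plus a newline is the Emic 2’s documented say command.

      #include <SoftwareSerial.h>

      SoftwareSerial emic(10, 11);  // RX, TX to the Emic 2
      const int irPins[4] = {A0, A1, A2, A3};

      // stand-in phrase profile; the real device stores four groups of four elements
      const char* phrases[4] = {"yes", "no", "I want to sit", "I need medical treatment"};
      int lastDir = -1;

      int gazeDirection() {
        // strongest IR reflection approximates where the pupil sits (simplified)
        int best = 0;
        for (int i = 1; i < 4; i++)
          if (analogRead(irPins[i]) > analogRead(irPins[best])) best = i;
        return best;
      }

      void setup() {
        emic.begin(9600);
        emic.print('\n');  // wake the Emic 2; it answers with ':' when ready
      }

      void loop() {
        int dir = gazeDirection();
        if (dir != lastDir) {  // speak only when the selection changes
          emic.print('S');     // Emic 2 "say text" command
          emic.print(phrases[dir]);
          emic.print('\n');
          lastDir = dir;
        }
        delay(100);  // the project samples the eye every 100 ms
      }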

    This compact Speak4Me eye-to-speech system has great potential, and you can read more about the project here on Hackaday.io.


  • This dad built an adaptive USB keyboard for his son and other kids with muscular conditions

    Reading Time: 2 minutes

    Arduino Team, June 3rd, 2021

    Having a disability can severely impact one’s ability to perform tasks that others do regularly, such as eating, walking, or even speaking. One maker by the name of ‘gtentacle‘ has a son who suffers from myotubular myopathy, a disease that greatly reduces the strength of his muscles, and who needs to use a ventilator constantly in order to breathe. Due to his condition, he is unable to talk; however, that didn’t stop his father from coming up with a solution. This project uses five Logitech Adaptive Buttons and an Arduino Micro to type letters for a text-to-speech (TTS) system to read.

    Up to 20 letters can be entered in total, and each one can be accessed with a grid-type system. For instance, the letter ‘T’ can be typed by pressing the 3 button followed by the 2 button. The ‘Enter’ command is sent whenever button 5 is the first key pressed. Thanks to the ATmega32u4, the system works with any device that supports a USB keyboard and has TTS software. The project’s creator even used it with Android Talkback. 
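
    That two-press grid scheme might be implemented along these lines; the letter layout below is hypothetical (the post’s ‘T’ example implies a different arrangement), and the pins are placeholders.

      #include <Keyboard.h>

      const int btnPins[5] = {2, 3, 4, 5, 6};  // jacks for the Logitech Adaptive Buttons

      // hypothetical layout -- the actual grid differs (the post maps 'T' to 3 then 2)
      const char grid[4][5] = {
        {'a', 'b', 'c', 'd', 'e'},
        {'f', 'g', 'h', 'i', 'j'},
        {'k', 'l', 'm', 'n', 'o'},
        {'p', 'q', 'r', 's', 't'}
      };

      int firstPress = -1;

      int pressedButton() {
        for (int i = 0; i < 5; i++)
          if (digitalRead(btnPins[i]) == LOW) { delay(200); return i; }  // crude debounce
        return -1;
      }

      void setup() {
        for (int p : btnPins) pinMode(p, INPUT_PULLUP);
        Keyboard.begin();
      }

      void loop() {
        int b = pressedButton();
        if (b < 0) return;
        if (firstPress < 0) {
          if (b == 4) Keyboard.write(KEY_RETURN);  // button 5 first = Enter
          else firstPress = b;                     // buttons 1-4 select a row
        } else {
          Keyboard.write(grid[firstPress][b]);     // second press selects the column
          firstPress = -1;
        }
      }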

    More information on the assistive technology project can be found in gtentacle’s Hackster write-up.


  • Magpie MIDI is an adaptive harmonica-style computer interface

    Reading Time: 2 minutes

    Arduino Team, September 15th, 2020

    For those with certain physical restrictions, interfacing with a computer can be a difficult task. As a possible solution, Shu Takahashi and Pato Montalvo have come up with the Magpie MIDI hands-free interface. The adaptive tool, inspired in part by a harmonica, has 13 air holes that enable its user to “sip” and “puff” all 26 letters of the alphabet.

    The Magpie MIDI also features an integrated joystick and potentiometer, allowing it to function as a USB mouse for navigating a computer screen, as a MIDI controller, and even as a gaming device. Everything is controlled by an Arduino Leonardo, which uses a CD74HC4067 multiplexer to accommodate all of the available inputs.
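
    Scanning all of the holes through the multiplexer might be sketched like this; the 13-channel count comes from the project, while the pins and sip/puff thresholds are guesses.

      // CD74HC4067: four select lines plus one shared analog signal pin (wiring assumed)
      const int S[4] = {2, 3, 4, 5};
      const int SIG = A0;

      int readChannel(int ch) {
        for (int b = 0; b < 4; b++) digitalWrite(S[b], (ch >> b) & 1);
        delayMicroseconds(5);  // let the mux output settle
        return analogRead(SIG);
      }

      void setup() {
        Serial.begin(9600);
        for (int p : S) pinMode(p, OUTPUT);
      }

      void loop() {
        for (int ch = 0; ch < 13; ch++) {  // one channel per air hole
          int v = readChannel(ch);
          if (v > 700)      Serial.println(String("puff:") + ch);  // maps to one letter
          else if (v < 300) Serial.println(String("sip:") + ch);   // maps to another
        }
        delay(10);
      }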

    More info on this amazing assistive technology project can be found in Takahashi’s tutorial, as well as the video below.

    Video: https://www.youtube.com/watch?v=hUoyafEdK-Q

    Magpie MIDI is an affordable adaptive tool that enables people with cerebral palsy and other muscle control disabilities to express themselves in new ways. Meant to be easily customizable to meet different needs and varying degrees of disability, every aspect of its hardware and software is open source, offering new means of creativity in computer games, music, and writing.


  • Upgrading a ride-on car to a joystick-controlled assistive device

    Reading Time: 2 minutes

    Arduino Team, January 9th, 2020

    Child-sized wheelchairs can be difficult to come by, and unfortunately aren’t as much fun as something like a ride-on car. The South Eugene Robotics Team, or FRC2521, decided to address both challenges by building a mini Jeep augmented for kids with limited mobility.

    Instructions found here detail how to modify the battery-powered toy, including what can be recycled and what extra parts will need to be purchased. In the new configuration, the Jeep’s two rear motors are configured for differential control, with the input regulated by an Arduino Nano and a pair of electronic speed controllers (ESCs). 
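
    The differential mixing is the heart of the drive code; a bare-bones version, assuming standard 1000-2000 µs RC ESCs and a joystick on two analog pins, could read:

      #include <Servo.h>  // ESCs accept standard RC servo pulses

      Servo leftEsc, rightEsc;

      void setup() {
        leftEsc.attach(9);
        rightEsc.attach(10);
        leftEsc.writeMicroseconds(1500);   // 1500 us = stop on most ESCs
        rightEsc.writeMicroseconds(1500);
        delay(2000);                       // arming pause
      }

      void loop() {
        int x = analogRead(A0) - 512;      // joystick: steer
        int y = analogRead(A1) - 512;      // joystick: throttle
        // differential mix: throttle plus/minus steering
        int left  = constrain(1500 + (y + x) / 2, 1000, 2000);
        int right = constrain(1500 + (y - x) / 2, 1000, 2000);
        leftEsc.writeMicroseconds(left);
        rightEsc.writeMicroseconds(right);
        delay(20);
      }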

    In this project, a joystick replaces the original pedal and steering wheel, and it looks like a lot of fun when implemented in the similarly-outfitted firetruck below.

    Video: https://www.youtube.com/watch?v=y7esOf6DKY8


  • GesturePod is a clip-on smartphone interface for the visually impaired

    Reading Time: 2 minutes

    Arduino Team, November 6th, 2019

    Smartphones have become a part of our day-to-day lives, but for those with visual impairments, accessing one can be a challenge. This can be especially difficult if one is using a cane that must be put aside in order to interact with a phone.

    The GesturePod offers an alternative interface that attaches to the cane itself. This small unit is controlled by an MKR1000 and uses an IMU to sense hand gestures applied to the cane.

    If a user, for instance, taps twice on the ground, a corresponding request is sent to the phone over Bluetooth, causing it to output the time audibly. Five gestures are currently proposed, which could be expanded upon or modified for different functionality as needed.
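
    The published system uses a trained machine learning pipeline, but the flavor of cane-tap sensing can be shown with a crude accelerometer spike detector; the MPU-6050-style register reads, thresholds, and serial output here are all illustrative assumptions, not GesturePod’s method.

      #include <Wire.h>

      const int MPU = 0x68;  // common IMU breakout address (wiring assumed)

      float readAccelMagG() {
        Wire.beginTransmission(MPU);
        Wire.write(0x3B);              // ACCEL_XOUT_H register
        Wire.endTransmission(false);
        Wire.requestFrom(MPU, 6);
        float sumSq = 0;
        for (int i = 0; i < 3; i++) {
          int16_t hi = Wire.read();
          int16_t v = (hi << 8) | Wire.read();
          float g = v / 16384.0;       // +/-2 g full scale
          sumSq += g * g;
        }
        return sqrt(sumSq);
      }

      void setup() {
        Wire.begin();
        Wire.beginTransmission(MPU);
        Wire.write(0x6B); Wire.write(0);  // wake the IMU
        Wire.endTransmission();
        Serial.begin(9600);
      }

      void loop() {
        static unsigned long lastSpike = 0;
        if (readAccelMagG() > 2.0) {      // a tap shows up as a g-force spike
          unsigned long now = millis();
          if (now - lastSpike < 400) Serial.println("DOUBLE_TAP");  // phone reads the time
          lastSpike = now;
          delay(100);
        }
      }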

    Video: https://www.youtube.com/watch?v=Bq1w7fy4SNw

    People using white canes for navigation find it challenging to concurrently access devices such as smartphones. Building on prior research on abandonment of specialized devices, we explore a new touch-free mode of interaction wherein a person with visual impairment can perform gestures on their existing white cane to trigger tasks on their smartphone. We present GesturePod, an easy-to-integrate device that clips on to any white cane, and detects gestures performed with the cane. With GesturePod, a user can perform common tasks on their smartphone without touch or even removing the phone from their pocket or bag. We discuss the challenges in building the device and our design choices. We propose a novel, efficient machine learning pipeline to train and deploy the gesture recognition model. Our in-lab study shows that GesturePod achieves 92% gesture recognition accuracy and can help perform common smartphone tasks faster. Our in-wild study suggests that GesturePod is a promising tool to improve smartphone access for people with VI, especially in constrained outdoor scenarios.


  • Communicate using your ear with Orecchio

    Reading Time: 2 minutes

    Arduino Team, October 23rd, 2018

    When conversing face-to-face, a wide range of emotions and inflections is conveyed by our facial and body expressions. But what if you can’t express emotion this way, whether due to a physical impairment or because a covering, like a dust mask, temporarily hides your beautiful smile, and perhaps your hands are otherwise occupied?

    As a solution to this dilemma, a team of researchers has been working on Orecchio, a robotic device that attaches to the ear and bends it to convey emotion. Three motors allow the ear to be bent into 22 distinct poses and movements, indicating 16 emotional states. Control is accomplished via an Arduino Due, linked up with a Windows computer running a C# program.

    The prototype was implemented using off-the-shelf electronic components, miniature motors, and custom-made robotic arms. The device has a micro gear motor mounted on the bottom of a 3D-printed ear hook loop clip. The motor drives a plastic arm against the side of the helix, able to bend it towards the center of the ear. Rotating the plastic arm back to its rest position allows the helix to return to its original form.

    Near the top of the earpiece is another motor that drives a one-joint robotic arm attached to the top of the helix using a round ear clip. Rotating the motor extends the robotic arm from its resting position to bend the top of the helix downwards, toward the center of the ear. The motor and the one-joint robotic arm are mounted on a linear track that can be moved vertically through a rack-and-pinion mechanism driven by a third motor. Moving the rack upwards stretches the helix.

    The prototype is demonstrated in the video below, and more info is available in the project’s research paper.

    Video: https://www.youtube.com/watch?v=uGnGSEcgi4E


  • DualPanto is a non-visual gaming interface for the blind

    Reading Time: 2 minutes

    Arduino Team, October 22nd, 2018

    While there are tools that allow the visually impaired to interact with computers, conveying spatial relationships, such as those needed for gaming, is certainly a challenge. To address this, researchers have come up with DualPanto.

    As the name implies, the system uses two pantographs for location I/O, and on the end of each is a handle that rotates to indicate direction. One pantograph acts as an output to indicate where an object is located, while the other acts as the player’s input interface. One device is positioned above the other, so the relative position of each within a shared plane can be gleaned.

    The game’s software runs on a MacBook Pro, and an Arduino Due is used to interface the physical hardware with this setup. 

    DualPanto is a haptic device that enables blind users to track moving objects while acting in a virtual world.

    The device features two handles. Users interact with DualPanto by actively moving the ‘me’ handle with one hand and passively holding on to the ‘it’ handle with the other. DualPanto applications generally use the ‘me’ handle to represent the user’s avatar in the virtual world and the ‘it’ handle to represent some other moving entity, such as the opponent in a soccer game.

    Be sure to check it out in the video below, or read the full research paper here.

    Video: https://www.youtube.com/watch?v=Ot55D3WD-Tc


  • Ariadne Headband is a wearable device for haptic navigation

    Reading Time: 3 minutes

    Arduino Team, September 17th, 2018

    In a new take on haptic navigation, makers Vojtech Pavlovsky and Tomas Kosicek have come up with a novel feedback system called the “Ariadne Headband.”

    This device, envisioned for use by people with visual impairments as well as those who simply want to get around without looking down at a phone while walking or biking, uses four vibrating motors arranged in a circle around the wearer’s head to indicate travel direction.

    An Arduino Nano provides computing power for the setup, along with a compass module and a Bluetooth link to communicate with a companion smartphone app. The Ariadne Headband is currently a prototype, but this type of interface could one day be miniaturized to the point that it could be placed in a hat, helmet, or other everyday headgear.

    Project Ariadne Headband is made up of two parts: the headband and the control app. The typical usage flow is as follows. First, you open the Ariadne Headband Android app. Using this app, you connect via Bluetooth to your Headband. Next, the app will ask for your current GPS location. Then you open Google Maps, integrated into our app, and select your destination (the place where you want to go).

    Our Android app will compute the geographical azimuth from your current location and chosen destination. When you are ready, you start navigating by pressing a button that sends the computed azimuth to the Headband on your head.

    The Headband consists of an Arduino Nano board, a GY-271 compass module, an HC-06 Bluetooth module (we selected this module only for local availability and will switch to BLE soon), and four vibration motors. The compass module lets us know the current azimuth, that is, where the user is looking. All components are placed into a small box on the back of your head. Our aim in the future will be to make this as small as possible so you will not even feel it. It is also possible to place everything into a hat or helmet, for example, instead of the rubber headband. We are using a rubber headband because it is very easy to manipulate.

    The vibration motors around your head are placed in set directions so they can signal where you should head. Your heading is computed by comparing your current azimuth with the azimuth sent from the Android app (that is, where you are currently facing and where you should go, respectively).
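
    The heading comparison reduces to a wrapped angle difference; a sketch of that motor-selection logic follows, with the compass read stubbed out and the sector thresholds assumed.

      const int motorPins[4] = {3, 5, 6, 9};  // front, right, back, left (placement assumed)
      float targetAzimuth = 90.0;             // would arrive over Bluetooth from the app

      float readCompassAzimuth() {
        // placeholder: read the GY-271 over I2C and return atan2(y, x) in degrees
        return 0.0;
      }

      // wrap the difference into (-180, 180]; positive means the target is to the right
      float headingError(float current, float target) {
        float d = target - current;
        while (d > 180)   d -= 360;
        while (d <= -180) d += 360;
        return d;
      }

      void setup() {
        for (int p : motorPins) pinMode(p, OUTPUT);
      }

      void loop() {
        float err = headingError(readCompassAzimuth(), targetAzimuth);
        int motor;
        if (fabs(err) < 30)             motor = 0;  // roughly on course: buzz front
        else if (err > 0 && err < 150)  motor = 1;  // veer right
        else if (err < 0 && err > -150) motor = 3;  // veer left
        else                            motor = 2;  // destination is behind you
        for (int i = 0; i < 4; i++) digitalWrite(motorPins[i], i == motor ? HIGH : LOW);
        delay(100);
      }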


  • Robust wheelchair model with treads!

    Reading Time: < 1 minute

    Arduino Team, September 14th, 2018

    Most people accept that a wheelchair is, in fact, a chair with wheels. This, however, didn’t stop recent Galileo Galilei Technical Institute graduate Davide Segalerba from turning this concept on its head and producing a “wheelchair” scale model driven instead by a pair of treads. 

    This concept was inspired by Segalerba’s experience using a wheelchair himself while recovering from multiple surgeries, observing that our environment isn’t always conducive to wheeled transportation.

    An Arduino board controls the device, and user input comes from a joystick or from a smartphone app over Bluetooth. You can read more about the project on Wired Italia, or translated into English here.


  • Sip and puff Morse code entry with Arduino

    Reading Time: < 1 minute

    Arduino Team, September 10th, 2018

    Those who need a text entry method other than a traditional keyboard and mouse often use a system in which a character is selected, then input using a sip or puff of air from the user’s mouth. Naturally, this is less than ideal, and the alternative interface shown here instead uses sip/puff air currents to indicate the dots and dashes of Morse code.

    The system—which can be seen in action in the video below—uses a modified film container, along with a pair of infrared emitters and detectors to sense air movement. The device was prototyped on an Arduino Mega, and its creators hope to eventually use a Leonardo for direct computer input. 

    A tube connected to a custom made bipolar pressure switch drives an Arduino which translates puffing and sucking into Morse code and then into text.

    Puffs make repeating short pulses (dots) and sucks repeating longer pulses (dashes) just like ham radio amateurs do with a dual-lever paddle.

    Code for this open source project can be found on GitHub.
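
    The translation loop itself can be illustrated with a toy sketch like the one below; the pins, timing constants, and four-letter lookup are assumptions, and the repository above holds the real implementation.

      // Simplified sip/puff Morse entry: two digital inputs from the bipolar
      // pressure switch -- one contact closes on a puff, the other on a sip
      const int PUFF_PIN = 2, SIP_PIN = 3;

      String symbol = "";
      unsigned long lastInput = 0;

      char decode(const String& s) {
        // tiny lookup for illustration; the real table covers the whole alphabet
        if (s == ".-")   return 'A';
        if (s == "-...") return 'B';
        if (s == "-.-.") return 'C';
        if (s == ".")    return 'E';
        return '?';
      }

      void setup() {
        pinMode(PUFF_PIN, INPUT_PULLUP);
        pinMode(SIP_PIN, INPUT_PULLUP);
        Serial.begin(9600);
      }

      void loop() {
        if (digitalRead(PUFF_PIN) == LOW) {        // puff = dot
          symbol += '.'; lastInput = millis(); delay(250);
        } else if (digitalRead(SIP_PIN) == LOW) {  // sip = dash
          symbol += '-'; lastInput = millis(); delay(250);
        } else if (symbol.length() && millis() - lastInput > 600) {
          Serial.print(decode(symbol));            // a pause ends the letter
          symbol = "";
        }
      }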

    Video: https://www.youtube.com/watch?v=Zo1gskJ759Q


  • Notable Board Books are an Arduino-powered way to enjoy music

    Reading Time: < 1 minute

    Arduino Team, August 6th, 2018

    Annelle Rigsby found that her mother, who suffers from Alzheimer’s, is delighted to hear familiar songs. While Annelle can’t always be there to help her enjoy music, she and her husband Mike came up with what they call the Notable Board Book that automatically plays tunes.

    The book itself is well laid-out, with song text and familiar photos printed on the pages. Electronics for the book are in a prototype state using an Arduino Uno and an Adafruit Sound Board to store and replay the audio bits.

    Page detection is handled by an array of photocells, and the book is meant to turn on automatically when picked up, via a series of tilt switches. When a switch is triggered, a relay holds the book on until the song that is playing is done, or for a predetermined amount of time.
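
    The page-to-song mapping could be sketched like this, assuming the Sound Board’s active-low GPIO trigger inputs rather than its serial mode; the pin numbers, light threshold, and page count are placeholders.

      // page sensing with photocells: one analog divider per page (thresholds assumed)
      const int pagePins[3] = {A0, A1, A2};
      const int soundTriggerPins[3] = {4, 5, 6};  // to the Sound Board's trigger inputs

      void setup() {
        for (int p : soundTriggerPins) {
          pinMode(p, OUTPUT);
          digitalWrite(p, HIGH);  // triggers are active-low
        }
      }

      void loop() {
        for (int i = 0; i < 3; i++) {
          // an uncovered photocell (open page) reads bright
          if (analogRead(pagePins[i]) > 600) {
            digitalWrite(soundTriggerPins[i], LOW);   // pulse the matching track
            delay(100);
            digitalWrite(soundTriggerPins[i], HIGH);
            delay(2000);                              // let the song start
          }
        }
      }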

    Videos:
    https://www.youtube.com/watch?v=7jSBl_UZE-c
    https://www.youtube.com/watch?v=07F2prjPebc
    https://www.youtube.com/watch?v=xtVReDjoA2M
