Möbius strips are often used to symbolize infinity, because they are continuous loops with only a single surface. They can’t exist in real life, because every solid object in reality has thickness — even if it is very thin, like a piece of paper. But we can construct similar objects that loop and twist over on themselves. James Bruton demonstrated that concept by building an RC tank with Möbius strip tracks.
This project doesn’t seem to have any real purpose beyond curiosity: Bruton wanted to see how Möbius strip tracks would work, so he constructed this tank to find out. The treads and most of the rest of the tank were 3D-printed, with the tread links getting a special design that lets them pivot relative to each other. They pivot just enough that each track can make a half-twist over the course of eight or nine links. That half-twist is what makes the tracks similar to a Möbius strip: the “outer” surface continuously transitions into the “inner” surface and back again, endlessly.
As is the case for many of Bruton’s creations, this tank has an Arduino Mega 2560 for control. It receives commands from Bruton’s universal remote through an OrangeRX DSM2 radio receiver. A DC gear motor drives each track, providing plenty of torque.
In testing, this tank performed similarly to a standard RC tank, though there is presumably more friction to overcome. With bare plastic treads, the tank slipped a lot on hard surfaces; with grippy pads added, the treads didn’t slip enough for smooth skid steering. Interestingly, the unique geometry of the tracks means that one “side” can be grippy and the other slick, so the track alternates between the two as it twists, even though that doesn’t seem to provide any real benefit.
When the BattleBots TV show first hit the airwaves in 2000, it felt like we were finally living in the future. Engineers and enterprising hobbyists from around the world would compete to build the most destructive robots, which then entered into televised mortal combat within an arena. The original series had many notable robots, but two of the most iconic were DeathRoll and Hydra. Max Imagination replicated those on a small scale for mini living room battles.
BattleBots competitors could win their matches either by damaging their opponents to the point where they could no longer operate, or by making them unable to move. The most popular way to achieve that second goal was by flipping over the opposing robot, and that is the tactic used by both DeathRoll and Hydra. DeathRoll did so with a spinning disc that caught on its opponent’s body, while Hydra used a hydraulic arm, like a pancake spatula, to flip opponents.
Max Imagination wanted to create faithful reproductions of both bots, but at a size small enough to be 3D-printed. Because hydraulics are difficult at this scale, Hydra’s flipping arm is spring-actuated and cocked with a motor-driven gear mechanism. Otherwise, both replicas work in the same way as their bigger ancestors.
Each robot takes advantage of the new Arduino UNO R4 WiFi board for control. Max Imagination programmed those with self-hosted web interfaces, so users can pilot the bots through smartphones. The bodies were designed in Autodesk Fusion 360 to be entirely 3D-printable and Max Imagination is even selling those models for anyone who wants to construct their own fighting robots.
Percussion instruments are likely the first kind that humanity invented, because they’re quite simple: hit a thing and a noise happens. Different things produce different frequencies with different timbres, and glass bottles have a nice xylophonic sound to them. Because glass bottles are easy to find among discarded garbage, Jens of the Jens Maker Adventures YouTube channel took advantage of them to build this awesome robotic instrument.
Jens started by collecting a bunch of different bottles. He tapped each one while searching to get a sense of the note it produced, and he could then lower a bottle’s pitch by adding some water to fine-tune it. Once he had enough bottles to cover a range of notes, he set out to construct a robot to play them.
Solenoid actuators tap each bottle and an Arduino UNO Rev3 board controls that tapping. It does so according to MIDI files created in the popular Ableton software. Jens matched the available notes in Ableton to those produced by the glass bottles, so he could simply compose melodies using those notes knowing that the robot could play them. The Arduino reads the MIDI files output by Ableton and strikes the corresponding bottles.
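While Jens’s actual sketch isn’t shown, the core of such a player is a lookup from incoming MIDI notes to solenoid channels. Here is a minimal sketch of that mapping, assuming a hypothetical C-major tuning for the seven bottles (the tuning and the `bottleForNote` helper are illustrative, not Jens’s code):

```cpp
#include <array>

// Hypothetical tuning: seven bottles covering one C-major octave,
// MIDI notes 60 (C4) through 71 (B4). Only scale tones map to bottles.
constexpr std::array<int, 7> kBottleNotes = {60, 62, 64, 65, 67, 69, 71};

// Returns the solenoid index for a MIDI note-on, or -1 if no bottle
// is tuned to that pitch (such notes are simply skipped).
int bottleForNote(int midiNote) {
    for (int i = 0; i < static_cast<int>(kBottleNotes.size()); ++i) {
        if (kBottleNotes[i] == midiNote) return i;
    }
    return -1;
}
```

On the Arduino side, a matched note would then pulse the corresponding solenoid’s output pin for a few milliseconds to strike the bottle.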
Finally, Jens laser-cut a plywood frame and enclosure that holds the bottles, the Arduino, and the solenoids. The frame accommodates seven bottles, so the machine can play seven notes.
Jens demonstrated that by playing a guitar along with the robotic instrument and the result sounds very pleasant — especially for something made with garbage.
The future we were promised was supposed to include robot maids and flying cars. The future we got has Roomba vacuums and Southwest Airlines. But at least those Roomba vacuum robots work pretty well for keeping floors slightly cleaner. Sadly, they leave elevated surfaces untouched and dust-ridden. To address that limitation, Jared Dilley built this tiny DIY Roomba to clean his desk.
Dilley is a dog owner and so his desk ends up with quite a bit of dust and loose hair, even though his dog is large and doesn’t sit on the desk — a mystery all pet owners will find relatable. Fortunately, Dilley is an engineer and had already created a small Arduino-controlled tank robot a while back. That operated a bit like a Roomba and would drive around until its ultrasonic sensor detected an obstacle, at which point it would turn. Dilley just needed to repurpose that robot into a small, mean cleaning machine.
The 3D-printed robot operates under the control of an Arduino UNO Rev3 through a motor driver shield. Originally, it only had the ultrasonic sensor, which was enough to detect obstacles in front of the robot. But because its new job is to patrol desks and countertops, Dilley had to add “cliff” sensors to keep it from falling off. He chose to put an infrared sensor at each of the front two corners. The Arduino will register the lack of a reflection when one of those sensors goes past an edge, and will then change course. A Swiffer-like attachment on the back of the robot wipes up dust and dog hair.
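The drive logic described above can be sketched as a pure decision function. The exact turn behavior is an assumption about Dilley’s sketch, not his actual code:

```cpp
enum class Action { Forward, TurnLeft, TurnRight, TurnAround };

// Sensor readings reduced to booleans: a cliff flag is true when an IR
// corner sensor gets no reflection (the corner is past an edge); the
// obstacle flag is true when the ultrasonic range drops below a threshold.
Action decide(bool obstacleAhead, bool leftCliff, bool rightCliff) {
    if (leftCliff && rightCliff) return Action::TurnAround; // nose over the edge
    if (leftCliff)  return Action::TurnRight;               // veer away from the drop
    if (rightCliff) return Action::TurnLeft;
    if (obstacleAhead) return Action::TurnAround;
    return Action::Forward;
}
```

Cliff detection takes priority over obstacle avoidance, since driving off the desk is the one unrecoverable failure.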
Modern engineering is increasingly cross-disciplinary, so today’s students often take courses that would have seemed to be “outside their field” a couple of decades ago. Pelochus and their classmates at the University of Granada are studying computer engineering, but had a class that challenged them to build battlebots in order to get some hands-on learning with microcontrollers and embedded systems. To dominate the competition, they used an Arduino to create the Rockobot.
This is a play on a meme that was popular in the 3D printing community recently. For laughs, people would slap a bust of Dwayne “The Rock” Johnson — wrestler and actor extraordinaire — onto just about anything that could be 3D-printed. Pelochus and their team figured that such adornment would increase their chances of success in a battle, and we can smell what they’re cooking.
Below the studly noggin, the Rockobot is a pretty standard tank-style battlebot. It has bent sheet metal plows in the front and back, which are the primary offense and defense. An Arduino Nano board controls the motors that drive the tank treads through a custom PCB populated with L298N H-bridge drivers. Power comes from a 1550mAh 14.8V battery through a step-down converter. Ultrasonic sensors on the front and back, along with infrared sensors on the sides, help the Rockobot navigate autonomously while avoiding collisions.
The spirit of Mr. Johnson must have been inhabiting the Rockobot, because it blew through the competition and took the top position in the class tournament.
The rapid rise of edge AI capabilities on embedded targets has proven that relatively low-resource microcontrollers are capable of some incredible things. And following the recent release of the Arduino UNO R4 with its Renesas RA4M1 processor, the ceiling has gotten even higher as YouTuber Nikodem Bartnik has demonstrated with his lidar-equipped mobile robot.
Bartnik’s project started with a simple question: is it possible to teach a basic robot to make its way around obstacles using only lidar, instead of the more resource-intensive computer vision techniques employed by most other platforms? The chassis and hardware, including two DC motors, an UNO R4 Minima, a Bluetooth® module, and an SD card, were constructed according to Open Robotic Platform (ORP) rules so that others can easily replicate and extend the design. After driving through a series of courses to collect a point cloud from the spinning lidar sensor, Bartnik imported the data and performed a few transformations to keep the resulting classification model small.
Once trained, the model was exported with help from the micromlgen Python package and loaded onto the UNO R4. The setup enables incoming lidar data to be classified into the direction in which the robot should travel, and according to Bartnik’s experiments, this approach worked surprisingly well. Initially, there were a few issues when navigating corners and traveling through a figure-eight track, but additional training data solved them and allowed the vehicle to complete a completely novel course at maximum speed.
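Bartnik’s model was trained offline and exported as C code by micromlgen; as a stand-in for that exported model, here is a minimal nearest-centroid classifier over a downsampled scan, illustrating how a lidar reading can be reduced to a drive direction. The bin count, direction labels, and any centroid values are illustrative assumptions; in a real deployment the centroids would come from training data:

```cpp
#include <array>
#include <cstddef>

constexpr std::size_t kBins = 8;            // downsampled lidar distance bins
enum class Direction { Left = 0, Straight = 1, Right = 2 };

using Scan = std::array<float, kBins>;

// Squared Euclidean distance between two scans (no sqrt needed for argmin).
float squaredDistance(const Scan& a, const Scan& b) {
    float d = 0.0f;
    for (std::size_t i = 0; i < kBins; ++i) {
        float diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// Label a scan with the nearest of three per-direction centroids.
Direction classify(const Scan& scan, const std::array<Scan, 3>& centroids) {
    std::size_t best = 0;
    float bestDist = squaredDistance(scan, centroids[0]);
    for (std::size_t c = 1; c < centroids.size(); ++c) {
        float dist = squaredDistance(scan, centroids[c]);
        if (dist < bestDist) { bestDist = dist; best = c; }
    }
    return static_cast<Direction>(best);
}
```

The appeal of this family of models on a microcontroller is that inference is just a handful of multiply-adds per class, with no dynamic allocation.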
Building a robot is only half the battle, because you also need a way to control it. Something like a rover may work with a simple joystick or even typed commands. But complex robots often have many motors and controlling those all directly becomes a challenge. That’s why Will Cogley chose motion control for his bionic hand.
This is the newest iteration of a project that Cogley first started a few years ago. It is a robotic hand meant to mimic a human hand as much as possible. Human fingers do not contain muscles. Instead, muscles in the forearms and palms pull on tendons to move the fingers. Cogley’s bionic hand works in a similar manner by using servo motors in the forearm to pull on cables that actuate the fingers. An Arduino UNO Rev3 moves the servos according to commands from a PC, but Cogley needed a way to streamline those commands.
Cogley chose a Leap Motion Controller for this job. It can track the motion of the user’s hand in near real-time and update a 3D model on the computer to reflect that. It displays that model in Unity, which is a 3D game engine that has the flexibility to perform in applications like this. Unity can determine the angle of each joint and Cogley was able to take advantage of the Uduino plugin to send servo commands to the Arduino based on those angles.
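However the angles are obtained, the Unity-to-Arduino link ultimately boils down to sending joint angles over serial. A hypothetical framing for one servo update is shown below; Uduino defines its own message conventions, so this format is purely illustrative:

```cpp
#include <string>

// Hypothetical wire format for one servo update: "S<index>:<angle>\n",
// e.g. "S3:120\n" for servo 3 at 120 degrees. The angle is clamped to a
// standard hobby-servo range before being serialized.
std::string servoCommand(int index, int angleDeg) {
    if (angleDeg < 0) angleDeg = 0;
    if (angleDeg > 180) angleDeg = 180;
    return "S" + std::to_string(index) + ":" + std::to_string(angleDeg) + "\n";
}
```

On the receiving end, the Arduino would parse each line back into an index and angle and hand it to the corresponding servo.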
Lots of kids are excited about robotics, and we have the free resources you need to help your children start making robots.
What’s a robot anyway?
Did you know that the concept of robotics dates back to ancient Greece, where a mathematician built a self-propelled flying pigeon to understand bird flight? Today, we have robots assisting people in everything from manufacturing to medicine. But what exactly is a robot? Ask two people, and you might get two different answers. Some may tell you about Star Wars’ C-3PO and R2-D2, while others may tell you about self-driving cars or even toys.
In my view, a robot is a machine that can carry out a series of physical tasks, programmed via a computer. These tasks could range from picking up an object and placing it elsewhere, to navigating a maze, to even assembling a car without human interaction.
Why robotics?
My first encounter with robotics was the Big Trak, a programmable toy vehicle created in 1979. You could program up to 16 commands into Big Trak, which it then executed in sequence. My family and I used the toy to transport items to each other around our house. It was a fun and engaging way to explore the basics of robotics and programming.
Understanding something about robotics is not just for scientists and engineers. It involves learning a range of skills that empower your kids to be creators of our digital world, instead of just consumers.
Robotics combines various aspects of science, technology, engineering, and mathematics (STEM) in a fun and engaging way. It also encourages young people’s problem-solving abilities, creativity, and critical thinking — skills that are key for the innovators of tomorrow.
Machine learning and robotics: A powerful duo
What happens when we add machine learning to robotics? Machine learning is an area of artificial intelligence where people design computer systems so they “learn” from data. This is not unlike how people learn from experience. Machine learning can enable robots to adapt to new situations and perform tasks that only people used to do.
We’ve already built robots that can play chess with you, or clean your house, or deliver your food. As people develop machine learning for robotics further, the possibilities are vast. By the time our children start their careers, it might be normal to have robots as software-driven “coworkers”. It’s important that we prepare children for the possible future that robotics and machine learning could open up. We need to empower them to contribute to creating robots with capabilities that complement and benefit all people.
Kids will learn to create interactive stories, games, and animations, all while getting a taste of physical computing. They’ll explore how to use sound and light, and even learn how to create improvised buttons.
It’s a great way to delve deeper into the world of electronics and programming. The path includes a variety of fun and engaging projects that incorporate crafting and allow children to see the tangible results of their coding efforts.
Build a robot
‘Build a robot’ is a project path that allows young people to create a simple programmable buggy. They can then make it remote-controlled and even transform it so it can follow a line by itself.
This hands-on project path not only teaches the basics of robotics but also encourages problem-solving as kids iteratively improve their robot buggy’s design.
The robot building community
Let’s take a moment to celebrate two young tech creators who love building robots. Selin is a digital maker from Istanbul, Turkey, who is passionate about robotics and AI. Selin’s journey into the world of digital making began with a wish: after her family’s beloved dog Korsan passed away, she wanted to bring him back to life. This led her to design a robotic dog on paper, and to learn coding and digital making to build that robot.
Selin has since built seven different robotics projects. One of them is IC4U, a robotic guide dog designed to help people with impaired sight. Selin’s commitment to making projects that help make the world a better place was recognised when she was awarded the Aspiring Teen Award by Women in Tech.
Jay, a young digital maker from Preston, UK, started experimenting with code at a young age to make his own games. He attended free local coding groups, such as CoderDojo, and was introduced to the block-based programming language Scratch. Soon, Jay was combining his interests in programming with robotics to make his own inventions.
Jay’s dad, Biren, comments: “With robotics and coding, what Jay has learned is to think outside of the box and without any limits. This has helped him achieve amazing things.”
Robotics and machine learning are not just science fiction — they shape our lives today in ways kids might not even realise. Whether your child is just interested in playing with robots, wants to learn more about them, or is considering a career in robotics, our free resources are a great place to start.
If a Greek mathematician was able to build a flying pigeon millennia ago, imagine what children could create today!
Ivan Miranda has a humble dream: he wants to build a massive 3D-printed robot that he can ride upon. In other words, he wants a mech. But that is obviously a very challenging project that will take an incredible amount of time and money. So he decided to test the waters with one piece of the mech: a huge 3D-printed robotic hand.
Miranda designed this robotic hand at the scale necessary for an enormous rideable mech, but he has only built the one hand at this point. This let him test the idea before jumping into the deep end with the full project. The structure and most of the mechanical components were 3D-printed. It has four fingers and a thumb, each with three joints (like a real human hand). It is mostly rigid PLA, but there are some flexible TPU parts that add grip.
Servos actuate all 15 of those joints. Most of them have 11kg-cm of torque, but the base of each finger has a more powerful servo with 25kg-cm of torque. An Arduino Mega 2560 controls all of the servo motors with pulse-width modulation (PWM) signals. Power, of course, comes directly from the power supply and not the Arduino.
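Driving a hobby servo by PWM means mapping a commanded angle onto a pulse width. A minimal sketch of that mapping follows, using common default pulse bounds rather than values taken from Miranda’s build:

```cpp
// Map 0-180 degrees onto a pulse width in microseconds, refreshed at the
// usual 50 Hz servo rate. The 1000-2000 us endpoints are common defaults
// (many servos accept a wider range), not values from Miranda's hand.
int angleToPulseUs(int angleDeg, int minUs = 1000, int maxUs = 2000) {
    if (angleDeg < 0) angleDeg = 0;       // clamp to the servo's valid range
    if (angleDeg > 180) angleDeg = 180;
    return minUs + (maxUs - minUs) * angleDeg / 180;
}
```

On an actual Arduino the Servo library does this mapping internally, which is one reason a Mega 2560 can comfortably command 15 servos at once.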
In testing, the hand seems to work quite well. It can move and grip large objects, though the belts do slip and need to be replaced with a type that can’t stretch. We’re not sure if Miranda will complete the entire mech, but we sure hope that he does!
A popular goal among roboticists is animal-like locomotion. Animals move with a fluidity and grace that is very hard to replicate artificially. That goal has led to extremely complex robots that require a multitude of motors and sensors, along with heavy processing, to walk. But even those don’t quite match biological movement. Taking a new approach, engineers from Carnegie Mellon University and the University of Illinois Urbana-Champaign created a simple bipedal robot named “Mugatu” that walks using a single actuator.
This approach is counter-intuitive, but quite sensible when we actually look at the gaits of real animals. Bipedal animals, such as humans, don’t need to engage many muscles when walking on flat surfaces. We achieve that efficiency with balance and body geometry evolved for this purpose. In a sense, a walking human is always falling forward slightly and redirecting their inertia to take a step. This robot walks in a similar manner and only needs a motor to move one leg forward relative to the other.
The team built Mugatu using 3D-printed legs connected by a servo “hip” joint. An Arduino MKR Zero board controls that motor, moving it with the precise timing necessary to achieve the “continuous falling” gait. This prototype doesn’t utilize it yet, but there is also an IMU in the left leg that could provide useful feedback data in the future. For now, the robot relies on pre-programmed movements.
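The single-actuator gait can be expressed as a periodic setpoint for the hip servo, swinging one leg relative to the other. This is a minimal sketch of that idea, with illustrative amplitude and period rather than the team’s tuned values:

```cpp
#include <cmath>

// Sinusoidal hip setpoint for a "continuous falling" gait: the one servo
// oscillates the relative leg angle, and the body geometry does the rest.
// Amplitude and period here are made-up illustrations, not Mugatu's gains.
double hipAngleDeg(double tSeconds, double amplitudeDeg = 30.0,
                   double periodSeconds = 1.2) {
    const double pi = 3.14159265358979323846;
    return amplitudeDeg * std::sin(2.0 * pi * tSeconds / periodSeconds);
}
```

The control loop would sample this setpoint every few milliseconds and write it to the hip servo; all of the balancing happens passively in the legs’ shape and mass distribution.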
Your dog has nerve endings covering its entire body, giving it a sense of touch. It can feel the ground through its paws and use that information to gain better traction or detect harmful terrain. For robots to perform as well as their biological counterparts, they need a similar level of sensory input. In pursuit of that goal, the Autonomous Robots Lab designed TRACEPaw for legged robots.
TRACEPaw (Terrain Recognition And Contact force Estimation Paw) is a sensorized foot for robot dogs that includes all of the hardware necessary to calculate force and classify terrain. Most systems like this use direct sensor readings, such as those from force sensors. But TRACEPaw is unique in that it uses indirect data to infer this information. The actual foot is a deformable silicone hemisphere. A camera looks at that and calculates the force based on the deformation it sees. In a similar way, a microphone listens to the sound of contact and uses that to judge the type of terrain, like gravel or dirt.
To keep TRACEPaw self-contained, Autonomous Robots Lab chose to utilize an Arduino Nicla Vision board. That has an integrated camera, microphone, six-axis motion sensor, and enough processing power for onboard machine learning. Using OpenMV and TensorFlow Lite, TRACEPaw can estimate the force on the silicone pad based on how much it deforms during a step. It can also analyze the audio signal from the microphone to guess the terrain, as the silicone pad sounds different when touching asphalt than it does when touching loose soil.
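TRACEPaw’s force estimate comes from a learned vision model, but the underlying intuition is that a softer, more deformed pad means more force. As a crude stand-in for that learned mapping, one could treat the silicone hemisphere as a linear spring; the stiffness constant here is invented purely for illustration:

```cpp
// Toy linear-elastic stand-in for TRACEPaw's learned force model: the
// estimated contact force grows with the deformation the camera observes.
// The real system uses TensorFlow Lite; this constant is illustrative.
double estimateForceNewtons(double deformationMm, double stiffnessNPerMm = 4.0) {
    if (deformationMm < 0.0) deformationMm = 0.0; // vision noise can go negative
    return stiffnessNPerMm * deformationMm;
}
```

A learned model improves on this by capturing the silicone’s nonlinearity and the viewing geometry, which a single spring constant cannot.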
More details on the project are available on GitHub.
Static manipulators and mobile robot chassis each have their own advantages, so by combining the two into a single platform, AadhunikLabs was able to realize both at the same time. The base frame comprises four individual wheels, each with its own high-torque geared motor, driven by a pair of VNH3ASP30 DC motor driver boards. Each of the arm’s axes is moved by a single high-torque metal servo motor that can support not only its own weight, but also the weight of an object being picked up by the gripper on the end.
Beyond controlling the geared DC and servo motors, an onboard Arduino Nano RP2040 Connect receives commands over Wi-Fi® from a host PC running the control software. There, the user can view a live camera feed from an ESP32 camera module as well as a virtual 3D view of the robotic arm’s position. Similar to a video game, pressing keyboard keys such as ‘WASD’ moves the chassis while sliding the mouse moves the arm. Meanwhile, other keys allow for manipulating the end effector, moving the arm to default positions, and adjusting the speed.
Soft robotics is a challenging field, because it comes with all of the difficulties associated with conventional robotics and adds in the complexity of designing non-rigid bodies. That isn’t a trivial thing, as most CAD software doesn’t have the ability to simulate the flexibility of the material. You also have to understand how the actuators will perform. That’s why a team of researchers from Zhejiang University and Carnegie Mellon University developed MiuraKit, which is a modular construction kit for pneumatic robots.
MiuraKit isn’t any one robot, but rather a set of tools and designs that can be combined to build robots and shape-changing interfaces. Anything made with MiuraKit will have a few things in common: pneumatic actuation, flexibility, and origami-like structures. Those structures expand or deform in a variety of different ways to suit the application. For example, one type is a simple one-dimensional expander similar to a linear actuator. Another type twists for rotary actuation. By linking different types together, roboticists can achieve complex motion.
Because these structures rely on pneumatic actuation, they need valves to control airflow. MiuraKit works with electromagnetic valves under the control of an Arduino board. That receives commands from a computer over a serial connection, but it can also work on its own with pre-programmed instructions. MiuraKit includes almost everything needed to create a robot: 3D-printable pneumatic connectors, a CAD design tool, laser cutter templates, and the pump with control system. In the coming weeks, the designers plan to give MiuraKit out to design firms and schools for evaluation.
Many kids and adults have an interest in electronics because they want to build robots. But it can be difficult to figure out where to even start. There are hundreds of kits on the market, and the options become endless when you veer into custom territory. But if you’re looking for a tank-style rover that you can control via Bluetooth®, then this robot designed by Mastoras Inc is a fantastic choice.
We like this project because it combines the advantages of robot kits and custom robots. It uses an off-the-shelf chassis to simplify the complicated mechanical parts, but with custom Arduino electronics that allow for customizability and that offer an introduction to coding. It has Bluetooth capability, so you can control it remotely from your smartphone. Mastoras Inc built an Android app, which you can tweak as much as you like. You can also create your own if you want to try your hand at app development.
The project starts with a tracked robot chassis kit, which includes the frame, DC motors, hubs, and tracks. An Arduino Nano Every board controls those motors through an L298N H-bridge driver. An HC-05 module adds connectivity and power comes from a 9V battery. The electronics enclosures are 3D-printable, but you can also use any pre-built project box. If you do have a 3D printer, you can also add a tank turret rotated by a 9g micro servo motor.
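An L298N channel is controlled with two direction pins and a PWM enable pin. The mapping from a signed speed to those pin states might look like the sketch below; the 8-bit speed range is a common Arduino `analogWrite` convention, and the struct layout is an assumption rather than Mastoras Inc’s code:

```cpp
// Pin states for one L298N motor channel: IN1/IN2 set direction, and the
// enable pin gets a PWM duty cycle (0-255, as with Arduino analogWrite).
struct MotorPins { bool in1; bool in2; int pwm; };

// Convert a signed speed in [-255, 255] into direction + duty cycle.
MotorPins driveMotor(int speed) {
    if (speed > 255) speed = 255;
    if (speed < -255) speed = -255;
    if (speed >= 0) return {true, false, speed};   // forward
    return {false, true, -speed};                  // reverse
}
```

Tank-style steering then falls out naturally: equal speeds drive straight, and opposite speeds spin the rover in place.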
This robot won’t make waves at your local hackerspace, but it is a great way to dip your toes into robotics and develop a foundation that you can build upon.
Building walking robots is difficult, because they either need a lot of legs or some ability to balance through their gait. There is a reason that the robots designed by companies like Boston Dynamics are so impressive. But lots of hobbyists have made bipedal and quadrupedal robots, while largely ignoring tripedal robots. To find out if they could be practical, James Bruton created a prototype tripedal robot.
When compared to a bipedal robot, a tripedal robot is more stable when standing still, but less stable when walking. A biped can keep its center of gravity almost directly above the foot that contacts the ground. A tripedal robot, on the other hand, must try to balance on two legs while moving the third, with its center of gravity somewhere above the middle of the triangle formed by the three feet. That makes walking gaits difficult to achieve.
Bruton built this prototype using a 3D-printed body, legs actuated by servo motors, and an Arduino Mega 2560 for control. The three legs are arranged with radial symmetry and each leg has three joints. Bruton attempted to give the robot a gait in which it tries to momentarily balance on two legs, while lifting and swinging the third around.
But that was very inefficient and clumsy. Bruton believes that he could achieve better results by equipping the robot with an IMU. That would give it a sense of balance, which could help it remain steady on two legs through a gait. With a counterbalancing weight, that could make a big difference. But for now, Bruton is putting this experiment on the back burner.
Was there anything more exciting than watching AT-ATs walk across Hoth towards the Rebel base for the first time? Those massive machines were iconic and helped to solidify The Empire Strikes Back as the best movie set in the Star Wars universe. After experiencing disappointment with AT-AT toys that couldn’t walk, James Bruton built his own AT-AT robot that strolls with the best of them.
While Bruton’s 3D-printed robot isn’t an exact replica of an AT-AT and doesn’t incorporate all of the design elements, it does walk a lot like what we all saw in the movie. Those AT-ATs had a very distinctive way of moving and the robot does a good job of mimicking their gait. It incorporates two key elements: forward knees and feet that remain level with the ground during a stride. Bruton’s robot has legs made up of parallel linkages in order to replicate that movement.
Each leg requires three servo motors: two for the hip and one for the knee. That’s a total of 12 servos, which Bruton controlled with an Arduino Mega 2560 board. It receives commands from Bruton’s own universal DSM remote through a DSM radio receiver module. The robot lacks sensors and autonomy, so Bruton has to pilot it himself. After he solved some minor balance issues caused by the weighty head, the mini AT-AT was able to walk very well and it should excite every Star Wars fan.
There are many ways to control a robot arm, with the simplest being a sequential list of rotation commands for the motors. But that method is very inefficient when the robot needs to do anything complex in the real world. A more streamlined technique lets the user move the arm as necessary, which creates a “recording” of the movements that the robot can then repeat. We tend to see that in high-end robots, but Mr Innovative built a robot arm with recording capability using very affordable materials.
This uses an input controller that is roughly the same size and shape as the robot arm, so Mr Innovative can manipulate that controller and the arm will mimic the movements like a puppet. The robot arm will also record those movements so it can repeat them later without any direct oversight. The video shows this in action with a demonstration in which the robot picks up small cylindrical objects and places them at the top of a chute, where they slide back down for the process to continue indefinitely.
An Arduino Nano board drives the servo motors through a custom driver board to actuate the robot arm. It takes input from the controller, which has rotary potentiometers in its joints in the same places that the robot arm has servo motors, so the values from the potentiometers map directly to the desired servo angles. The custom driver board has two buttons: one to activate the gripper and one to record the movements. When Mr Innovative holds down the second button, the Arduino stores all of the movement commands so that it can repeat them.
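The record-and-replay behavior can be sketched as a simple buffer of joint-angle frames: while the record button is held, each sampled set of potentiometer-derived angles is appended, and replay feeds the stored frames back to the servos in order. The joint names and frame layout here are illustrative, not Mr Innovative’s code:

```cpp
#include <cstddef>
#include <vector>

// One sampled pose of the arm, in servo degrees.
struct Frame { int base; int shoulder; int elbow; int gripper; };

// Append frames while recording; read them back in order for replay.
class MotionRecorder {
public:
    void record(const Frame& f) { frames_.push_back(f); }
    std::size_t size() const { return frames_.size(); }
    const Frame& frame(std::size_t i) const { return frames_[i]; }
private:
    std::vector<Frame> frames_;
};
```

On a real Nano, the buffer would live in a fixed-size array (or external storage) rather than a growable vector, since RAM is scarce; the structure of the idea is the same.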
Many people find the subjectivity of art to be frustrating, but that subjectivity is what makes art interesting. Banksy’s self-shredding art piece is a great example of this. The original painting sold at auction for $1.4 million—and then it shredded itself in front of everyone. That increased its value and the now-shredded piece, dubbed “Love Is in the Bin,” sold again at auction in 2021 for a record-breaking $23 million. In a similar vein to that infamous work, this robot destroys the artwork that it produces.
“The Whimsy Artist” is a small robot rover, like the kind you’d get in an educational STEM kit. It is the type of robot that most people start with, because it is very simple. It needs only two DC motors to drive around, and it can detect obstacles with an ultrasonic distance sensor and follow lines with two infrared sensors. An Arduino UNO Rev3 board controls the two motors according to the information it receives from those sensors.
That decision-making is where the artistic elements come into play. When it doesn’t detect any obstacles, the robot will run in “creative” mode. It opens a chute on a dispenser to drop a trail of fine sand while it moves in a pleasant spiral pattern. But if it sees an obstacle with the ultrasonic sensor, it gets angry. In that mode, it reverses direction and uses the IR sensors to follow the line it just created while deploying a brush to destroy its own sandy artwork.
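The two behaviors amount to a small state machine. Here is a sketch of the mode transitions; the calm-down condition (staying angry until the sand line runs out) is an assumption about how the sketch might work, not confirmed logic:

```cpp
enum class Mode { Creative, Angry };

// Creative: spiral outward while dropping sand. Angry: retrace and erase
// the sand line. An obstacle flips the robot into Angry mode, and losing
// the line under the IR sensors returns it to Creative mode.
Mode nextMode(Mode current, bool obstacleSeen, bool lineUnderSensors) {
    if (current == Mode::Creative) {
        return obstacleSeen ? Mode::Angry : Mode::Creative;
    }
    return lineUnderSensors ? Mode::Angry : Mode::Creative;
}
```

Keeping the mode logic separate from the motor code like this makes the “personality” easy to tweak without touching the drive routines.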
While it is easier now than ever before, getting into robotics is still daunting. In the past, aspiring roboticists were limited by budget and inaccessible technology. But today the challenge is an overwhelming abundance of different options. It is hard to know where to start, which is why Saul designed a set of easy-to-build and affordable robots called Bolt Bots.
There are currently five different Bolt Bot versions to suit different applications, and you can assemble all of them with the same set of hardware. Once you finish one, you can repurpose the components to make another. The current designs include a large four-leg walker (V1), a tiny four-leg walker (V2), a robot arm (V3), a car (V4), and a hanging plotter that can draw (V5). They all have a shared design language and utilize 3D-printed mechanical parts with off-the-shelf fasteners.
Every robot has an Arduino Micro board paired with an nRF24L01 radio transceiver module for control. Users can take advantage of existing RC transmitters or build a remote also designed by Saul. The other components include servo motors, an 18650 lithium battery, and miscellaneous parts like wires and screws. Some of the Bolt Bots require different servo motors, like continuous-rotation and mini 1.8g models, but most use standard 9g hobby servo motors.
Because there are five Bolt Bot variations that use the same components, this is an awesome ecosystem for getting started in robotics on a budget — especially for kids and teens.
So much of the research and development in the area of haptic feedback focuses on universal devices that can create a wide range of tactile sensations. But that has proven to be a massive challenge, as it is very difficult to package the number of actuators necessary for that flexibility in a device that is practical for the consumer market. That’s why TactorBots — devised by researchers from the University of Colorado’s ATLAS Institute and the Parsons School of Design — sidesteps the issue with a complete toolkit of robotic touch modules.
TactorBots includes both software and hardware, with the hardware coming in several different modules. Each module is wearable on the user’s wrist and has a unique way of touching their arm. One Tactor module strokes the user’s arm, while another taps them. There are other Tactor modules for rubbing, shaking, squeezing, patting, and pushing. Because each module only needs to perform a single tactile motion, they can do their jobs very well. It is also possible to chain several modules together so the user can feel the different sensations across their arm.
Custom web-based software running on a PC controls the Tactor modules through a host module built around an Arduino Nano board, activating them to match virtual on-screen content. That host module is also wearable on the arm. Each Tactor module has a servo motor that connects directly to the host module through standard JST wires. The module enclosures, along with the sensation-specific mechanisms, were all 3D-printed. The mechanisms differ based on the sensation they were designed to create, but they’re also simple and only require a single servo to operate.
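Since each module is a single servo behind a sensation-specific mechanism, the host's job reduces to mapping a requested sensation to a servo angle sequence. The profiles below are invented for illustration (the real TactorBots host runs on an Arduino Nano and its actual motion profiles aren't published):

```python
# Hypothetical angle sequences (degrees) for a few sensations.
# One servo per module; the 3D-printed mechanism turns the
# rotation into the stroking, tapping, or shaking motion.
PROFILES = {
    "tap":    [0, 60, 0],
    "stroke": [0, 30, 60, 90, 0],
    "shake":  [40, 80, 40, 80, 40],
}

def gesture(sensation, repeats=1):
    """Return the full angle sequence to send to the module's servo."""
    profile = PROFILES[sensation]
    return profile * repeats
```

The web software would then stream each sequence to the host over serial, which writes the angles to the appropriate servo channel.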
If you want a robot arm, either for some practical job or just fun, you have a lot of options. There are many consumer and industrial robot arms on the market, but the models that aren’t glorified toys tend to be pricey. You can also build your own. If you go that route, you’ll want a design that is well-engineered and well-documented. It isn’t free, but the ARCTOS robot arm is a high-quality option that meets both of those criteria.
Based on aesthetics alone, the ARCTOS robot arm looks fantastic. It resembles something you’d see in a lab in a sci-fi movie. But it also offers more than a pretty package. It has six degrees of freedom and a payload of 500 grams, making it suitable for tasks ranging from pick-and-place to packing boxes. Best of all, you can assemble it using easily sourced hardware and 3D-printed parts. Those parts are PLA and just about any modern 3D printer can handle the fabrication.
The ARCTOS design files will set you back €39.95 (about $44) and sourcing all of the parts for the build will cost around $400. Stepper motors actuate the joints through simple belt drives and cycloidal gearboxes. An Arduino Mega 2560 controls those through a standard CNC shield. It runs open-source firmware based on GRBL that will work with a variety of control software options to suit different tasks.
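With steppers driving belts and cycloidal reducers, GRBL-style firmware needs a steps-per-degree figure for each joint. The numbers below (200-step motors, 16x microstepping, and the two reduction ratios) are typical illustrative values, not ARCTOS's published specs, but the arithmetic is the same:

```python
def steps_per_degree(full_steps_per_rev=200, microsteps=16,
                     belt_ratio=3.0, cycloidal_ratio=15.0):
    """Motor microsteps needed to rotate the joint by one degree.

    All ratios here are assumed example values: total motor steps
    for one full joint revolution, divided by 360 degrees.
    """
    total_ratio = belt_ratio * cycloidal_ratio
    return full_steps_per_rev * microsteps * total_ratio / 360.0
```

With these example values the result is 400 steps per degree, which is the kind of constant you'd enter into the firmware's per-axis configuration.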
Octopus tentacles are, essentially, long continuous muscles — a bit like your tongue. That anatomy gives octopuses incredible dexterity, but it is very difficult to replicate in robots. Artificial “muscle” fiber isn’t very practical yet, which is why roboticists turn to more conventional means of actuation. Cable-driven tentacles are popular, but they require many powerful cable motors. For his newest project, James Bruton took a different approach and utilized Stewart platforms.
Stewart platforms are somewhat common in industrial settings, because they can work with hydraulic pistons that handle a lot of weight. Six linear actuators arranged between two plates let the second plate move at any angle relative to the first plate, with the exact angle depending on the current lengths of the actuators. By chaining together several Stewart platforms, Bruton created a tentacle-like structure with complete freedom in every joint.
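The relationship between actuator lengths and plate pose is the standard Stewart platform inverse kinematics: transform each top-plate attachment point by the desired pose, then measure its distance to the matching base point. This Python sketch (restricted to yaw rotation for brevity, with geometry unrelated to Bruton's actual dimensions) shows the calculation:

```python
import math

def leg_lengths(base_pts, top_pts, translation, yaw_deg=0.0):
    """Inverse kinematics of a Stewart platform (z-rotation only).

    Given matching base/top attachment points and the desired pose of
    the top plate, return the required length of each actuator.
    """
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    tx, ty, tz = translation
    lengths = []
    for (bx, by, bz), (px, py, pz) in zip(base_pts, top_pts):
        # Rotate the top point about z, then translate into the base frame.
        wx = c * px - s * py + tx
        wy = s * px + c * py + ty
        wz = pz + tz
        lengths.append(math.dist((bx, by, bz), (wx, wy, wz)))
    return lengths
```

In Bruton's servo-based version, each computed length would map to a servo horn angle rather than a linear actuator extension, but the geometry problem being solved per platform is the same.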
The current prototype only has three Stewart platforms, but those are enough to demonstrate the concept. Bruton used servos instead of linear actuators to keep the costs down. An Arduino Mega 2560 board controls all 18 servo motors. The entire structure is made up of 3D-printed parts.
But expanding this design into a full Doc Ock tentacle (much less four of them) would come with challenges. As with any robot arm, the motors closer to the base experience more load as the weight and the length of the arm increase. Those would probably need to be replaced with beefier models. And with six servos for every joint, even an Arduino Mega 2560 would quickly run out of pins. That could, however, be solved by using multiple Arduino boards or an IO expander.