Yes, the title of this article sounds pretty crazy. But not only is this entirely possible through the lens of physics, it is also practical to achieve in the real world using affordable parts. Jon Bumstead pulled it off with an Arduino, a photoresistor, and an inexpensive portable projector.
Today’s digital camera sensors are the result of a fairly linear progression from the camera obscura up through film cameras. The light from the scene enters through a lens that focuses all of that light onto a 2D plane at the same time. The digital “sensor” is actually a whole grid of tiny sensors that each measure the light they receive. The camera records those values, and reconstructing them as a grid of pixels gives you a digital image.
Bumstead’s “camera” works differently and only records a single point of light at a time. The entire camera is actually just an Arduino Mega 2560 (an UNO also works) with a photoresistor. The photoresistor provides a single analog light measurement and the Arduino reads that measurement, assigns a digital value, and passes the data to a PC.
Here’s the cool part: by only illuminating one point of the scene at a time, the camera can record each “pixel” in sequence. Those pixel values can then be reconstructed into an image. In this case, Bumstead used a portable video projector to provide the illumination. It scans the illumination point across the scene as the Arduino collects data from the photoresistor.
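Bumstead’s own code handles synchronization with the projector, but the Arduino side of the idea fits in a few lines. Here is a minimal sketch of the concept, assuming the photoresistor sits in a voltage divider on pin A0 and a script on the PC advances the projected dot after each reading:

```cpp
// Minimal single-pixel capture loop (assumes a photoresistor voltage
// divider on pin A0). A script on the PC moves the projector's bright
// spot to the next scan position each time it receives a reading.
const int SENSOR_PIN = A0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Average a few readings to reduce noise for this "pixel"
  long sum = 0;
  for (int i = 0; i < 16; i++) {
    sum += analogRead(SENSOR_PIN);
  }
  Serial.println(sum / 16);  // one brightness value per scan position
  delay(50);                 // give the projector time to move the spot
}
```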
Bumstead also experimented with more complex techniques that rely on projected patterns and a lot of very fancy math to achieve similar results.
Finally, Bumstead showed that this also works when the photoresistor doesn’t have line-of-sight to the scene. In that demonstration, light from the scene bounces off a piece of paper, kind of like a mirror. The photodetector only sees the reflected light. But that doesn’t matter — remember, the photodetector is only seeing a single point of light anyway. Whether that light came directly from the surface of objects in the scene or bounced off paper first, the result is the same (just with a bit less quality, because the paper isn’t a perfect reflector).
One reason that fans prefer mechanical keyboards over membrane alternatives is that mechanical key switches provide a very noticeable tactile sensation at the moment a key press registers. Whether consciously or not, users notice that and stop pressing the key all the way through the maximum travel — reducing strain and RSI potential. Developed by researchers at KAIST’s HCI Tech Lab, UltraBoard is a novel wearable that provides similar tactile feedback while typing in virtual reality.
UltraBoard’s designers wanted a device suitable for VR typing that would provide on-demand haptic feedback sensations, without complicated physical actuators. They achieved that with an array of ultrasonic transducers that produce strong soundwaves that the user can feel, but not hear. That array sits below the hand and can project localized soundwaves targeting specific points. So, typing the letter “A” on a virtual reality keyboard would cause the transducer array to blast soundwaves at the tip of the pinky finger.
An UltraBoard straps onto each of the user’s wrists, and a servo motor near the strap tilts the transducer array to match the wrist angle, ensuring that the array is always directly underneath the user’s hand.
The prototype UltraBoard device uses both an Arduino Mega 2560 and an Arduino Micro board. They share duties, with the Micro controlling the servo motor and the transducer board, while the Mega controls an Ultraino driver board. They follow commands from a connected PC, which runs the virtual reality software that the user interacts with through a virtual reality headset.
The results of testing were mixed, as UltraBoard didn’t appear to provide a statistically significant improvement in typing speed. Even so, the concept is interesting and further testing may reveal other benefits, such as a more comfortable typing experience.
If you hear the term “generative art” today, you probably subconsciously add “AI” to the beginning without even thinking about it. But generative art techniques existed long before modern AI came along — they even predate digital computing altogether. Despite that long history, generative art remains interesting as consumers attempt to identify patterns in the underlying algorithms. And thanks to the “Generative Art 1€” vending machine built by Niklas Roy, you can experience that for yourself by spending a single euro.
Roy built this vending machine to display at the “Intelligence, it’s automatic” exhibit, hosted at Zebrastraat in Belgium. Rather than AI, Roy gave the machine more traditional algorithms to generate abstract pieces of line art. Each piece uses the current time as the “seed” for the algorithms, so it will be unique and an identical piece will never appear again. And the current piece, shown on a screen in the machine, always evolves as time passes. If a viewer sees something they like, they’ll need to insert a euro coin immediately or risk losing the opportunity to secure the art.
Once paid, the machine will use a built-in pen plotter to draw the line art on a piece of paper. It will also label the art with a unique identifier: the seed number. Then, it will stamp the paper for authenticity. Finally, it will cut that piece from the roll of paper and dispense the art through a chute at the bottom.
That all happens under the direction of an Arduino Mega 2560 board. It controls the pen plotter, which is a repurposed model called Artima Colorgraf. The coin-op mechanism is an off-the-shelf unit and a Python script, running on a connected laptop, performs the art generation. What message is this vending machine meant to convey? Maybe that art is ethereal or that it has little value — just a euro — to modern society. Whatever the case, it is a work of art in its own right.
A vehicle’s wheel diameter has a dramatic effect on several aspects of performance. The most obvious is gearing, with larger wheels increasing the ultimate gear ratio — though transmission and transfer case gearing can counteract that. But wheel size also affects mobility over terrain, which is why Gourav Moger and Huseyin Atakan Varol’s prototype mobile robot, called Improbability Roller, has the ability to dynamically alter its wheel diameter.
If all else were equal (including final gear ratio), smaller wheels would be better, because they result in less unsprung mass. But that only holds in a hypothetical world of perfectly flat surfaces. As the terrain becomes more irregular, larger wheels become more practical. Stairs are an extreme example: only a vehicle with very large wheels can climb them.
Most vehicles sacrifice either efficiency or capability through wheel size, but this robot doesn’t have to. Each of its wheels is a unique collapsing mechanism that can expand or shrink as necessary to alter the effective rolling diameter. Pulley rope actuators on each wheel, driven by Dynamixel geared motors under the control of an Arduino Mega 2560 board through a Dynamixel shield, perform that change. A single drive motor spins the wheels through a rigid gear set mounted on the axles, and a third omni wheel provides stability.
This unique arrangement has additional benefits beyond terrain accommodation. The robot can, for instance, shrink its wheels in order to fit through tight spaces. It can also increase the size of one wheel, relative to the other, to turn without a dedicated steering rack or differential drive system.
Materials, when exposed to light, will reflect or absorb certain portions of the electromagnetic spectrum, which can give valuable information about their chemical or physical compositions. Traditional setups use a single lamp to emit white light, which is then split apart into a spectrum of colors via a system of prisms, mirrors, and lenses. After the light hits the substance being tested, a sensor gathers this spectral color data for analysis. YouTuber Marb’s Lab realized that by leveraging several discrete LEDs, he could recreate this array of light without the need for the more expensive and complicated optics.
His project uses the AS7341 10-channel spectrometer sensor breakout board from Adafruit due to its adequate accuracy and compact footprint. Once it was attached to the clear sample chamber and wired to a connector, Marb got to work on the electromechanical portion of the system. Here, a stepper motor rotates a ring of six LEDs that are driven by a series of N-channel MOSFETs and a decade counter. Each component was then wired into a custom-designed control board, which acts as a shield when attached to the Arduino Mega 2560 below.
The sketch running on the Mega allows the user to select between photometer (single wavelength) and spectrometer (multiple wavelengths) modes when sampling the substance. Once the data is captured, the user can then choose one of three interpolation modes to get a smooth curve, as demonstrated by measuring chlorophyll.
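Marb’s full sketch isn’t reproduced here, but getting readings out of the sensor with Adafruit’s AS7341 library is straightforward. A minimal sketch of that step (the wiring, settings, and print format are assumptions, not Marb’s code):

```cpp
#include <Adafruit_AS7341.h>

Adafruit_AS7341 as7341;

void setup() {
  Serial.begin(115200);
  if (!as7341.begin()) {
    Serial.println("AS7341 not found");
    while (true) {}
  }
  // Integration time and gain, following the library's example defaults
  as7341.setATIME(100);
  as7341.setASTEP(999);
  as7341.setGain(AS7341_GAIN_256X);
}

void loop() {
  if (as7341.readAllChannels()) {
    // Two of the ten channels as a taste; F1..F8 cover 415-680nm
    Serial.print("415nm: ");
    Serial.println(as7341.getChannel(AS7341_CHANNEL_415nm_F1));
    Serial.print("680nm: ");
    Serial.println(as7341.getChannel(AS7341_CHANNEL_680nm_F8));
  }
  delay(500);
}
```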
The human face is remarkably complex, with 43 different muscles contorting the skin in all kinds of ways. Some of that is utilitarian — your jaw muscles are good for chewing, after all. But a lot of it seems to be the result of evolution giving us fantastic non-verbal communication abilities. That isn’t an easy thing to replicate by artificial means, but Will Cogley managed to make this silicone-skinned animatronic head that is frighteningly realistic.
Most people, when seeing this animatronic head, will feel something between unease and outright disgust or terror. Cogley purposefully exaggerated the movements and proportions to give the head a more cartoonish appearance in an attempt to navigate around the uncanny valley, but it still looks a bit too human to be comfortable.
That is largely the result of the realistic silicone skin that Cogley molded and then attached onto the internal skeleton (skull?) frame using magnets. That attachment method is pretty similar to the way our own skin attaches to our muscles and tendons, and it produces expressions that are quite human.
The internal skeleton was 3D-printed and is actuated by a plethora of small servo motors. Cogley designed a custom shield PCB for an Arduino Mega 2560 to control the servos. He hasn’t yet programmed it beyond a handful of basic movements and facial expressions, but he’s set it up so that there is a great deal of potential for future programmatic upgrades.
Everyone loves looking at exotic animals and most of us only get to do that at zoos. But, of course, there is a lot to be said about the morality of keeping those animals in captivity. So, good zoos put a lot of effort into keeping their animals healthy and happy. For more intelligent animals, like elephants, enrichment through intellectual stimulation is a solid strategy. With that in mind, a team of Georgia Tech students worked with Zoo Atlanta to give elephants a musical toy to enrich their lives.
Like the toys you get for your dog, this device’s purpose is to give the elephants some mental stimulation. It provides them with an activity that they can enjoy, thus improving their lives. It works by playing specific tones (known to please elephant ears) when the elephants stick their trunks in holes in a wall. In essence, it is similar to an electronic toy piano for kids — just optimized for elephant physiology.
An Arduino Mega 2560 board plays the tones through a DY-SV5W media player module, which outputs an audio signal to an outdoor speaker system. Each hole in the wall has a VL53L0X ToF (Time of Flight) sensor to detect trunks. Those sensors were paired with ATtiny85 microcontrollers that tell the Arduino when a trunk is present.
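The team’s firmware isn’t included in the article, but the per-hole sensing logic is easy to picture. A minimal sketch of one sensor node using Adafruit’s VL53L0X library (the threshold distance and signal pin are assumptions, and this is written for a full-size board rather than the ATtiny85):

```cpp
#include <Adafruit_VL53L0X.h>

Adafruit_VL53L0X lox;
const int TRUNK_PIN = 3;       // signal line to the Arduino Mega
const int TRIGGER_MM = 200;    // "trunk present" threshold (assumption)

void setup() {
  pinMode(TRUNK_PIN, OUTPUT);
  lox.begin();
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);
  // RangeStatus 4 means "out of range"; anything valid and closer
  // than the threshold is treated as a trunk in the hole
  bool trunkPresent = (measure.RangeStatus != 4) &&
                      (measure.RangeMilliMeter < TRIGGER_MM);
  digitalWrite(TRUNK_PIN, trunkPresent ? HIGH : LOW);
  delay(50);
}
```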
The researchers also added a real-time clock and an SD card reader to log activity, giving the team the ability to evaluate the response from the elephants. In the same way that you can tell your dog loves his new toy by how much he plays with it, the team was able to determine that the elephants enjoyed their musical device over the course of about a week.
Inventor Charly Bosch and his daughter Leonie have crafted something truly remarkable: a fully electric, Arduino-powered car that’s as innovative as it is sustainable. Called the Batteryrunner, this vehicle is designed with a focus on environmental impact, simplicity, and custom craftsmanship. Get ready to be inspired by a car that embodies the spirit of creativity!
When the Arduino team saw the Batteryrunner up close at our offices in Turin, Italy, we were genuinely impressed – especially knowing that Charly and Leonie had driven over 1,000 kilometers in this unique car! Their journey began on a small island in Spain, took them across southern France, and brought them to Italy before continuing on to Austria.
Building a car with heart – and aluminum
In 2014, Charly took over LORYC – a Mallorca carmaker that became famous in the 1920s for its winning mountain racing team. His idea was to build a two-seater as a tribute to the LORYC sports legacy, but with a contemporary electric drive: that’s how the first LORYC Electric Speedster was born. “We’re possibly the smallest car factory in the world, but have a huge vision: to prove electric cars can be cool… and crazy,” Charly says.
With a passion for EVs rooted in deep environmental awareness, he decided to push the boundaries of car manufacturing with the Batteryrunner: a car where each component can be replaced and maintained, virtually forever.
Indeed, it’s impossible not to notice that the vehicle is made entirely from aluminum: specifically, 5083 aluminum alloy. This material is extremely durable and can be easily recycled, unlike plastics or carbon fiber which end up as waste at the end of their lifecycle.
The car’s bodywork includes thousands of laser-cut aluminum pieces. “This isn’t just a prototype: it’s a real car – one that we’ve already been able to drive across Europe,” Charly says.
The magic of learning to do-it-yourself
“People sometimes ask me why I use Arduino, as if it was only for kids. Simple: Arduino never failed me,” is Charly’s quick reply. After over a decade of experience with a variety of maker projects, it was an easy choice for the core of Batteryrunner’s system.
In addition to reliability, Charly appreciates the built-in ease-of-use and peer support: “The Arduino community helps me with something new every week. If you are building a whole car on your own, you can’t be an expert in every single aspect of it. So, anytime I google something, I start by typing ‘Arduino’, and follow with what I need to know. That’s how I get content that I can understand.”
This has allowed Charly and Leonie to handle every part of the car’s design, coding, and assembly, creating a fully integrated system without needing to rely on external suppliers.
Using Arduino for unstoppable innovation
A true labor of love, the Batteryrunner is now, four years since its inception, a working (and talking!) car, brought to life by 10+ Arduino boards, each with specific functions.
For instance:
• An Arduino Nano is used to manage the speedometer (a.k.a. the “SpeedCube”), in combination with a CAN bus module, stepper motor module, and stepper motor.
• Several Arduino Mega 2560 boards, connected via CAN bus modules, control the dashboard, steering wheel, lights, and blinkers, allowing users to monitor and manage various functions (a minimal sketch of this style of CAN messaging follows this list).
• Arduino UNO R4 boards with CAN bus transceivers are used to handle different crucial tasks – from managing the 400-V battery system and Tesla drive unit to operating the linear windshield wiper and the robotic voice system.
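The Batteryrunner’s firmware isn’t published here, but the messaging pattern is familiar. A minimal sketch of one node broadcasting a frame, assuming common MCP2515-based CAN modules and the mcp_can library (the ID, bit rate, and payload are purely illustrative):

```cpp
#include <SPI.h>
#include <mcp_can.h>

MCP_CAN can(10);  // CS pin for the MCP2515 CAN module (assumption)

void setup() {
  // 500 kbps is a typical automotive bit rate; 8 MHz crystal assumed
  while (can.begin(MCP_ANY, CAN_500KBPS, MCP_8MHZ) != CAN_OK) {
    delay(100);
  }
  can.setMode(MCP_NORMAL);
}

void loop() {
  // Broadcast a hypothetical "speed" frame for the SpeedCube to display
  byte data[2] = { 0x00, 0x42 };       // e.g. speed in km/h, two bytes
  can.sendMsgBuf(0x100, 0, 2, data);   // ID 0x100 chosen for illustration
  delay(100);
}
```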
Charly already plans on upgrading some of the current solutions with additional UNO R4 boards, and combining the GIGA R1 WiFi and GIGA Display Shield for a faster and Wi-Fi®-connected “InfoCube” dashboard.
All in all, the Batteryrunner is more than a car: it’s a rolling platform for continuous innovation, which Charly is eager to constantly improve and refine. His next steps? Integrating smartphone control via Android, adding sensors for self-parking, and experimenting with additional features that Arduino makes easy to implement. “This is a car that evolves,” Charly explains. “I can add or change features as I go, and Arduino makes it possible.”
Driving environmental awareness
Finally, we see Batteryrunner as more than a fun, showstopping car. Given Charly’s commitment to low-impact choices, it’s a way to shift people’s mindset about sustainable mobility. The environmental challenges we face today require manufacturers to go well beyond simply replacing traditional engines with electric ones: vehicles need to be completely redesigned, according to sustainability and simplicity principles. To achieve this, we need people who are passionate about the environment, technology, and creativity. That’s why we fully agree with Charly, when he says, “I love makers! We need them to change the world.”
Follow LORYC on Facebook or Instagram to see Charly and Leonie’s progress, upgrades, and experiments, and stay inspired by this incredible, Arduino-powered journey.
At Cornell University, Dr. Anand Kumar Mishra and his team have been conducting groundbreaking research that brings together the fields of robotics, biology, and engineering. Their recent experiments, published in Science Robotics, explore how fungal mycelia can be used to control robots. The team has successfully created biohybrid robots that move based on electrical signals generated by fungi – a fascinating development in the world of robotics and biology.
A surprising solution for robotics: fungi
Biohybrid robots have traditionally relied on animal or plant cells to control movements. However, Dr. Mishra’s team is introducing an exciting new component into this field: fungi – which are resilient, easy to culture, and can thrive in a wide range of environmental conditions. This makes them ideal candidates for long-term applications in biohybrid robotics.
Dr. Mishra and his colleagues designed two robots: a soft, starfish-inspired walking one, and a wheeled one. Both can be controlled using the natural electrophysiological signals produced by fungal mycelia. These signals are harnessed using a specially designed electrical interface that allows the fungi to control the robot’s movement.
The implications of this research extend far beyond robotics. The integration of living systems with artificial actuators presents an exciting new frontier in technology, and the potential applications are vast – from environmental sensing to pollution monitoring.
At the heart of this innovative project is the Arduino platform, which served as the main interface to control the robots. As Dr. Mishra explains, he has been using Arduino for over 10 years and naturally turned to it for this experiment: “My first thought was to control the robot using Arduino.” The choice was ideal in terms of accessibility, reliability, and ease of use – and allowed for a seamless transition from prototyping with the UNO R4 WiFi to the final solution with the Arduino Mega.
To capture and process the tiny electrical signals from the fungi, the team used a high-resolution 32-bit ADC (analog-to-digital converter) to achieve the necessary precision. “We processed each spike from the fungi and used the delay between spikes to control the robot’s movement. For example, the width of the spike determined the delay in the robot’s action, while the height was used to adjust the motor speed,” Dr. Mishra shares.
The team also experimented with pulse width modulation (PWM) to control the motor speed more precisely, and managed to create a system where the fungi’s spikes could increase or decrease the robot’s speed in real-time. “This wasn’t easy, but it was incredibly rewarding,” says Dr. Mishra.
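The team’s signal-processing code isn’t shown in the article, but the mapping Dr. Mishra describes can be sketched simply. A simplified illustration, assuming spike width and height have already been extracted from the 32-bit ADC stream and that height is normalized to the 0..1 range:

```cpp
// Simplified spike-to-motion mapping. Spike width (ms) and a height
// normalized to 0..1 are assumed to arrive from upstream processing;
// the pin and scale factors are illustrative, not the team's values.
const int MOTOR_PWM_PIN = 9;

void onSpike(float spikeWidthMs, float normHeight) {
  // Wider spike -> longer delay before the robot acts
  unsigned long actionDelayMs = (unsigned long)(spikeWidthMs * 10.0);
  // Taller spike -> faster motor, mapped into the 8-bit PWM range
  int speed = constrain((int)(normHeight * 255.0), 0, 255);
  delay(actionDelayMs);
  analogWrite(MOTOR_PWM_PIN, speed);
}

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
}

void loop() {
  onSpike(120.0, 0.6);  // stand-in values for one detected spike
  delay(1000);
}
```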
And it’s only the beginning. Now the researchers are exploring ways to refine the signal processing and enhance accuracy – again relying on Arduino’s expanding ecosystem, making the system even more accessible for future scientific experiments.
All in all, this project is an exciting example of how easy-to-use, open-source, accessible technologies can enable cutting-edge research and experimentation to push the boundaries of what’s possible in the most unexpected fields – even complex biohybrid experiments! As Dr. Mishra says, “I’ve been a huge fan of Arduino for years, and it’s amazing to see how it can be used to drive advancements in scientific research.”
Every decade or two, humanity seems to develop a renewed interest in humanoid robots and their potential within our world. Because the practical applications are actually pretty limited (given the high cost), we inevitably begin to consider how those robots might function as entertainment. But Jon Hamilton did more than just wonder: he actually built a robotic performer called Syntaxx, and it will definitely make you feel things.
It is hard to describe this robot without sounding like a Mad Libs game filled out by a cyberpunk-obsessed DJ. Hamilton designed it to give performances, primarily in the form of synthetic singing accompanied by electronic music. It looks like a crude Halloween mask given life by a misguided wizard sometime in the 1980s. It is pretty bonkers and you should probably watch the video of it in action to wrap your head around the concept.
Hamilton needed three different Arduino development boards to bring this robot to life. The first, an Arduino Giga R1 WiFi, oversees the robot’s operation and handles voice interaction, as well as audio playback. The second, an Arduino Mega 2560, moves the robot’s neck according to input from two microphones (one on the left, the other on the right). The third, an Arduino Uno R4 WiFi, controls the rest of the servo movement.
The result is a robot that is both impressive and pretty disconcerting.
Imagine playing Half-Life: Alyx and feeling the gun heat up in your hand as you take down The Combine. Or operating a robot through augmented reality and feeling coldness on your fingers when you get close to exceeding the robot’s limits. A prototype device called ThermoGrasp brings that kind of thermal feedback to mixed reality applications.
ThermoGrasp is a wearable thermal feedback system designed for virtual reality and augmented reality, created by Arshad Nasser and Khalad Hasan of the University of British Columbia. It consists of thermoelectric modules attached to the user’s fingers with Velcro straps. Those are capable of creating thermal sensations — both warm and cold — in response to what happens in the virtual world. Those sensations can relate to any condition or event that the developer chooses, whether for immersion or utility.
Nasser and Hasan built the prototype using an Arduino Mega 2560 board, which controls the thermoelectric modules through custom H-bridge drivers. Those thermoelectric modules are Peltier devices, which are normally associated with cooling. They can create a cooling feeling on the skin, but can also do the opposite and produce a warm feeling. The Arduino controls the drivers through pulse-width modulation (PWM), allowing for granular adjustment. The thermoelectric modules are capable of changing temperature at a rate of 3.5°C per second and so can produce a noticeable sensation within just a couple of seconds.
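The researchers’ driver firmware isn’t reproduced here, but the control idea (polarity selects warm or cool, PWM duty sets intensity) can be sketched with a generic H-bridge. The pin assignments and timing below are assumptions:

```cpp
// Driving one Peltier module through an H-bridge: polarity picks
// warm vs. cool, PWM duty sets intensity. Pin numbers are assumptions.
const int IN_A = 7;     // H-bridge direction input A
const int IN_B = 8;     // H-bridge direction input B
const int PWM_PIN = 9;  // H-bridge enable/PWM input

void setThermal(bool cool, int intensity) {  // intensity: 0-255
  digitalWrite(IN_A, cool ? HIGH : LOW);
  digitalWrite(IN_B, cool ? LOW : HIGH);     // reversed polarity = warm
  analogWrite(PWM_PIN, constrain(intensity, 0, 255));
}

void setup() {
  pinMode(IN_A, OUTPUT);
  pinMode(IN_B, OUTPUT);
}

void loop() {
  setThermal(true, 180);   // cool pulse
  delay(2000);
  setThermal(false, 120);  // gentle warm pulse
  delay(2000);
}
```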
In testing, users found that cool sensations were easier to detect than warm sensations, but that both were useful and increased immersion.
For now-college student Joel Grayson, making something that combined his interests in mechanics, electronics, and programming while being simultaneously useful to those around him was a longtime goal. His recent Venderoo project is exactly that, as the creatively named vending machine was designed and built from the ground up to dispense snacks to fellow classmates at his former high school.
Constructing Venderoo started with a sketch that featured the dimensions, vending mechanism, and the electronics panel on the left. Then, through a combination of a CNC router and a jigsaw, Grayson meticulously cut out each plywood panel and assembled them along with clear acrylic sheets so students could observe the machine in action. On the electronics side, an Arduino Mega 2560 is responsible for handling selections on the keypad, displaying commands/feedback to users via the character LCD, accepting money, and rotating the motors when it’s time to dispense.
When a student first approaches Venderoo, they are greeted by a message instructing them to select their snack of choice, after which the price appears and the machine asks for a combination of $1 or $5 bills. Once the balance has met the price, Venderoo finds the location of the snack and spins the appropriate motor thanks to powerful MOSFET drivers.
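Grayson’s code isn’t shown here, but the dispense step he describes boils down to energizing one MOSFET for long enough to turn a spiral once. A minimal sketch of that step, with pins and timing as illustrative assumptions:

```cpp
// Simplified dispense step: once the balance covers the price, spin the
// selected spiral's motor for one revolution. Pins and timing are
// assumptions for illustration.
const int MOTOR_PINS[4] = {4, 5, 6, 7};  // gate pins of the MOSFET drivers
const unsigned long SPIN_MS = 2800;      // time for one full spiral turn

void dispense(int slot) {
  digitalWrite(MOTOR_PINS[slot], HIGH);  // MOSFET on, motor spins
  delay(SPIN_MS);
  digitalWrite(MOTOR_PINS[slot], LOW);
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(MOTOR_PINS[i], OUTPUT);
}

void loop() {
  // In the real sketch this is driven by the keypad and bill acceptor;
  // here we just demonstrate slot 0 once.
  dispense(0);
  while (true) {}
}
```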
To see more about how Grayson’s Venderoo vending machine works, watch the video below!
Who doesn’t want to explore underwater? To take a journey beneath the surface of a lake or even the ocean? But a remotely operated vehicle (ROV), which is the kind of robot you’d use for such an adventure, isn’t exactly the kind of thing you’ll find on the shelf at your local Walmart. You can, however, follow this guide from Ranuga Amarasinghe to build your own ROV for some aquatic fun.
Amarasinghe is a 16-year-old Sri Lankan student and this is actually the second iteration of his ROV design. As such, he’s dubbed it “ROV2” and it appears to be quite capable. All of its electronics sit safely within a 450mm length of sealed PVC tube. That mounts onto the aluminum extrusion frame structure that also hosts the six thrusters powered by drone-style brushless DC motors.
ROV2’s brain is an Arduino Mega 2560 board, and it drives the BLDC motors through six electronic speed controllers (ESCs). It receives control commands from the surface via an umbilical. The operator holds a Flysky transmitter that sends radio signals to a receiver floating on the water. An Arduino UNO Rev3 reads those signals and then communicates the motor commands to the Mega through the tethered serial connection. That limits the maximum length of the tether to about 40 meters, which subsequently limits the maximum operating depth.
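Amarasinghe’s guide has the full firmware; conceptually, the Mega’s job is to translate incoming throttle values into the RC-style pulses that ESCs expect. A stripped-down sketch of that loop, with the pin choices and six-byte serial frame as assumptions:

```cpp
#include <Servo.h>

// Each ESC takes standard RC servo pulses: ~1500 us is stop for
// bidirectional ESCs, above is forward, below is reverse. Pin numbers
// and the serial frame format are assumptions.
Servo thrusters[6];
const int ESC_PINS[6] = {2, 3, 4, 5, 6, 7};

void setup() {
  Serial.begin(57600);  // tethered link from the surface UNO
  for (int i = 0; i < 6; i++) {
    thrusters[i].attach(ESC_PINS[i]);
    thrusters[i].writeMicroseconds(1500);  // arm at neutral
  }
  delay(3000);  // give the ESCs time to arm
}

void loop() {
  // Expect 6 bytes per frame, one throttle value (0-255) per thruster
  if (Serial.available() >= 6) {
    for (int i = 0; i < 6; i++) {
      int v = Serial.read();
      thrusters[i].writeMicroseconds(map(v, 0, 255, 1000, 2000));
    }
  }
}
```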
With the specified lithium battery pack, ROV2 can traverse the depths for 30-45 minutes. And when equipped with the 720p FPV camera, pilots can see and record all of the underwater action.
Elastic use in the textile industry is relatively recent. So, what did garment makers do before elastic came along? They relied on smocking, a technique for bunching up fabric so that it can stretch to better fit the form of a body. Now a team of computer science researchers from Canada’s University of Victoria is turning to smocking to create interesting new “data physicalization” displays for clothing.
These “smocking displays,” part of the researchers’ VISMOCK approach, can convey information through changes in form and changes in color. The practical implementation of this idea would be up to the garment maker, but there are many intriguing possibilities. Imagine, for instance, that your shirt sleeve could get tighter to indicate that it is time for an appointment on your daily calendar. Or if your pants could show the current time.
Both of those concepts — and much more — are entirely feasible. The team made them possible by combining two techniques. The first is impregnating the fabric with thermochromic pigments that change color in the presence of heat. Heating elements embedded in the fabric, controlled by an Arduino Mega 2560 board through MOSFETs, drive that change. Resolution is low, because heat spreads, but it is enough to show quite a bit of information.
The second technique is smocking, but with special SMA (Shape Memory Alloy) wires and springs. Those can be deformed, but will then return to their original shape when current (and heat) runs through them. By integrating SMA into the smocking pattern, the fabric can change shape on-demand. As with the thermochromic heating elements, this occurs under the control of an Arduino.
Nintendo’s Joy-Con controller system is very innovative and generally well-regarded, with one major exception: stick drift. That’s a reliability issue that eventually affects a large percentage of Joy-Cons, to the frustration of gamers. But what if that were intentional and gamepads were designed to deteriorate in short order? That’s the idea behind ICY Interfaces.
Yoonji Lee and Chang Hee Lee at KAIST (Korea Advanced Institute of Science & Technology) created three devices under the ICY Interfaces umbrella: MeltPress, FrostPad, and IceSquish. Each incorporates ice — literal frozen water — in a manner meant to make use of the material’s ephemeral nature. Imagine, for instance, a gamepad with buttons that melt at an increasing rate as you touch them. Or another gamepad with buttons that don’t become accessible until a protective sheet of ice melts away. The ICY Interfaces are experiments in this kind of dynamic design.
Each device contains an Arduino Mega 2560 board to read button presses and control additional hardware, like Peltier coolers. Those are thermoelectric solid-state heat pumps capable of refreezing the ice after it melts.
The researchers developed a simple game, called Iceland: Frozen Journeys, to work with ICY Interfaces. They built that game in Processing in order to take advantage of its strong compatibility with Arduino boards and the Arduino IDE. The game challenges players to build snowmen, capitalizing on the ice theme.
MeltPress has an array of buttons with key caps made of ice. FrostPad has a surface with several capacitive touch pads covered in a layer of ice. IceSquish has buttons made of ice-filled silicone balls, which don’t become flexible enough to press until they’ve melted a bit. All of them make use of ice in an interesting way to explore new gameplay ideas.
Have you ever wanted your very own vending machine? If so, you likely found that they’re expensive and too bulky to fit in most homes. But now you can experience vending bliss thanks to this miniature vending machine designed by m22pj, which you can craft yourself using an Arduino and other materials lying around the house.
This project is fun because it gives makers the opportunity to experiment with vending machine features without a big budget. That even includes more modern payment options, like one might see on a college campus with vending machines that charge to student identification cards. This design lets DIYers work with those features to learn about RFID, security, and more. And, of course, this is a chance to get hands-on experience with vending mechanisms, too.
The best part is that you can build this with some cardboard and off-the-shelf electronic components. The enclosure and almost all of the mechanical parts are cardboard. The electronics include an Arduino Mega 2560 board, a keypad, an RFID reader module, LEDs, and servo motors. The servos must be continuous-rotation models, so they can drive the vending mechanisms.
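The RFID payment feature is the most interesting part to experiment with. A minimal sketch showing how a card’s UID can be read with the widely used MFRC522 library (the pin choices for a Mega 2560 are assumptions):

```cpp
#include <SPI.h>
#include <MFRC522.h>

// Reading a student "card" UID with the common MFRC522 module.
// Pin choices are assumptions for a Mega 2560.
MFRC522 rfid(53, 49);  // SS, RST

void setup() {
  Serial.begin(115200);
  SPI.begin();
  rfid.PCD_Init();
}

void loop() {
  if (!rfid.PICC_IsNewCardPresent() || !rfid.PICC_ReadCardSerial()) return;
  // Print the UID; the vending sketch would look it up in an
  // account table before dispensing
  for (byte i = 0; i < rfid.uid.size; i++) {
    Serial.print(rfid.uid.uidByte[i], HEX);
  }
  Serial.println();
  rfid.PICC_HaltA();
}
```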
As designed, this vending machine can serve up to four different treats. But it would be possible to expand that to include many more. The Arduino has plenty of pins available to control additional servo motors, so the sky is the limit.
Most of the robots we feature only require a single Arduino board, because one Arduino can control several motors and monitor a bunch of sensors. But what if the robot is enormous and the motors are far apart? James Bruton found himself in that situation when he constructed this huge “tentacle” robot and his solution was to put an Arduino in each joint.
This is an oblique swivel joint robot arm, which means that each joint sits at an angle relative to the axes of the preceding and succeeding segments. This creates movement that is unlike any other kind of robot arm.
Bruton took this concept and scaled it up to ludicrous proportions. Each joint is a big ring made of plywood and 3D-printed parts, driven by a DC motor geared down 1600:1 and controlled through an ODrive module.
Because the robot is so large, it would have been difficult to run wires from a single Arduino to all of the motor drivers — especially because those have to go through slip rings to allow for continuous joint rotation. Instead, Bruton put an Arduino Mega 2560 board in each joint to control that joint’s motor driver. Those operate under the control of a primary Mega 2560 located in the base, with communication handled through a CAN bus system.
There is also another Mega 2560 in the remote control that Bruton built for the robot. That reads control input from switches and rotary encoders, then sends commands to the robot through a direct Wi-Fi connection (established via two ESP32 development boards).
Bruton designed this robot to exhibit at EMF Camp in the UK, where it was a popular attraction.
The Association for Computing Machinery (ACM) held its annual conference on Human Factors in Computing Systems (CHI) in Hawaii this year, focusing on the latest developments in human-computer interaction. Students from universities all across the world attended the event and showcased how their devices and control systems could revolutionize how we interact with technology in both real-world and virtual environments. These 12 projects presented at CHI 2024 feature Arduino at their core and demonstrate how versatile the hardware can be.
First on the list is MouseRing from students at Tsinghua University in Beijing that aims to give users the ability to precisely control mouse cursors with only one or two inertial measurement units (IMUs). Worn as a ring on the index finger, data collected from the MouseRing via an Arduino UNO Rev3 was used to both train a classification neural network and model the finger’s kinematics for fine-grained mouse cursor manipulation.
Because objects in virtual reality are only as heavy as the controller, simulating weight has always presented a challenge, which is why five students from the University of Regensburg in Germany devised their MobileGravity concept. With it, the user can place a tracked object onto a base station where an Arduino Micro then quickly pumps in/extracts water from the object to change its weight.
Another virtual reality device, the AirPush, is a fingertip-worn haptic actuator which gives wearers force feedback in up to eight directions and at five different levels of intensity. Through its system of an Arduino UNO, air compressor, and dual DC motors, this apparatus from students at the Southern University of Science and Technology in Shenzhen can accurately apply pressure around the finger in specific areas for use in games or training.
A Robotic Metamaterial, as described by students at Carnegie Mellon University, is a structure built from repeating cells that, on their own, cannot accomplish much, but when combined in specific configurations are able to carry out very complex tasks. Some of the Arduino Mega 2560-powered cells are able to actuate, sense angles, or enable capacitive touch interactions, thus letting a lattice of cells become a capable robot.
Instead of using pneumatics to bend materials, this team of students from Zhejiang and Tongji universities in China has designed a modular, flexible material using magnets which they call MagPixel. An Arduino UNO powers one such digital clock application leveraging MagPixel by energizing electromagnets within a ring to move the hour “hand” around the clock face.
Proprioception, or the ability to inherently sense where limbs are in 3D space, is vital to how we navigate the world, but VR spaces can limit this ability. The ArmDeformation project from a group of Southern University of Science and Technology students in Shenzhen rests on the wearer’s forearm and then moves the skin below to simulate an external force thanks to an Arduino Mega and several DC motors.
Grasping and moving objects is already quite the task in VR, but sketching a picture takes it to a whole other level of difficulty. Three students from the University of Virginia, therefore, have developed a shape-changing device that attempts to match the forms present in a 3D world for the purpose of sketching. After attaching a piece of paper to the surface, the VRScroll will bend into the correct shape using its two Arduino Uno WiFi Rev 2 boards and six motors.
As an alternative to plastic-based fibers for use in smart textile prototyping/production, four University of Colorado-Boulder students built an open-source machine that is capable of spinning gelatine-based fibers in a compact footprint. Leveraging an Arduino Mega, the machine can spin biofibers through its heated syringe with GCODE input, thus creating a strong thread which potentially integrates wearable sensors.
The art of communication relies on many forms of signals, not just speech. Harnessing the user’s breathing pattern to better communicate is the goal of ExBreath, from students at Tsinghua University in Beijing. An Arduino Nano continuously monitors the breathing patterns of a wearer via a bend sensor and translates them into signals for a micro air pump. In doing so, small, externally worn air sacs are inflated to reflect the sensed breathing pattern.
This smart material, called ConeAct by its creators at Carnegie Mellon University, is a modular system consisting of small cones joined together with four shape memory actuators (SMA) that either flex or become rigid at certain temperatures. An Arduino Nano coordinates the actions of each cone, and when one needs to bend, the onboard ATtiny1616 will activate its MOSFETs to begin heating the corresponding SMA wires.
Targeted to those with blindness or low vision, the Tangible Stats project from a group of students at Stanford University allows them to more easily visualize statistical data by interacting with physical objects. The Arduino Mega-driven platform senses the number of stackable tokens placed into a column and provides quick feedback. Additionally, it can tilt the row of tokens to represent a sloping line.
Everyone needs access to fresh, clean air, but quickly checking the indoor air quality of somewhere like an office meeting room or lobby is difficult. ActuAir, constructed by students at Newcastle University, is a wall-sized soft robotics display powered by several Arduino UNO R4 WiFi boards that can each adjust the shape and color of a wall-mounted pouch to indicate the current CO2, temperature, or humidity levels — all of which is adjustable from an external web application.
Without anyone caring for them, beaches quickly become trash-covered swaths of disappointment. That care is necessary to maintain the beautiful sandy havens that we all want to enjoy, but it requires a lot of labor. A capstone team of students from the University of Colorado Boulder’s Creative Technology & Design program recognized that fact, and they created the Seaside Sweeper beach-cleaning robot to lighten the load.
Seaside Sweeper is like a Roomba for beaches. Either autonomously or through manual control, it can patrol a beach for up to 15 hours on a battery charge and scoop up any trash it comes across. It costs less than $450 to build, which is an important consideration when most beaches are public property and have limited maintenance budgets.
There are two Arduino boards used in this project: an Arduino Mega 2560 in the Seaside Sweeper itself and an Arduino UNO Rev3 in the remote. They communicate with each other through nRF24L01+ radio transceivers. The Mega 2560 is able to track its own position using a Neo-6M GPS module and an Adafruit LIS3MDL compass module. Together, those enable the autonomous navigation functionality — though it isn’t clear how Seaside Sweeper detects trash. The Mega 2560 also controls the four drive motors and the scoop mechanism’s servo motor.
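The write-up doesn’t detail the radio protocol, but nRF24L01+ links like this usually exchange a small fixed struct. A sketch of the remote’s transmit side using the common RF24 library, with the address, pins, and payload layout as assumptions:

```cpp
#include <SPI.h>
#include <RF24.h>

// Remote side: send drive commands to the rover over nRF24L01+.
// CE/CSN pins, the address, and the payload layout are assumptions.
RF24 radio(9, 10);  // CE, CSN
const byte ADDRESS[6] = "SWEEP";

struct Command {
  int16_t leftSpeed;
  int16_t rightSpeed;
  bool scoop;
};

void setup() {
  radio.begin();
  radio.openWritingPipe(ADDRESS);
  radio.stopListening();  // this node only transmits
}

void loop() {
  Command cmd = { 200, 200, false };  // e.g. drive straight ahead
  radio.write(&cmd, sizeof(cmd));
  delay(50);
}
```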
The robot’s body and almost all of its mechanical parts were 3D-printed to keep costs down. That even includes the tracks. The electronic components can be connected via breadboards, so no custom PCBs are required.
Fans of Wallace and Gromit will all remember two things about the franchise: the sort of creepy — but mostly delightful — stop-motion animation and Wallace’s Rube Goldberg-esque inventions. YouTuber Gregulations was inspired by Wallace’s Autochef breakfast-cooking contraption and decided to build his own robot to prepare morning meals.
Gregulations wanted his Autochef-9000 to churn out traditional full British breakfasts consisting of buttered toast, eggs, beans, and sausage. That was an ambitious goal, because each of those foods requires several steps to prepare. Gregulations’ solution was to, essentially, create one large machine that contains several smaller CNC machines. Each one is distinct and tailored to suit a particular food. In total — if you add up all of the different sections — this is a 12-axis CNC machine.
The Autochef-9000’s central controller is an Arduino Mega 2560 board. But even with all of its power and available pins, it wouldn’t have been able to handle everything on its own, so it divvies out some tasks to Arduino UNO Rev3 boards.
As you would expect, it takes quite a lot of heat to cook everything. That’s why the Autochef-9000 contains several electric heating elements, which the Arduinos control via relays.
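Gregulations doesn’t share the temperature-control code, but relay-driven heating elements are typically run with simple bang-bang control plus hysteresis. A sketch of that pattern (the sensor conversion and setpoints are placeholders, not the Autochef’s actual values):

```cpp
// Bang-bang control of one heating element through a relay, with
// hysteresis so the relay doesn't chatter around the setpoint.
const int RELAY_PIN = 8;
const float TARGET_C = 160.0;
const float BAND_C = 5.0;

float readTempC() {
  // Crude linear stand-in for a real sensor conversion
  return analogRead(A0) * 0.48828;
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
}

void loop() {
  float t = readTempC();
  if (t < TARGET_C - BAND_C) digitalWrite(RELAY_PIN, HIGH);  // heat on
  if (t > TARGET_C + BAND_C) digitalWrite(RELAY_PIN, LOW);   // heat off
  delay(250);
}
```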
Users can order food using a touchscreen menu system or a smartphone interface. Autochef-9000 will then whir to life. It will open and heat a tin of beans, grab and heat a sausage, hard boil an egg, and toast and then butter bread fed from a magazine. Finally, it will deposit all of those items onto a plate.
There is a lot going on inside of this machine and Gregulations breezes past a lot of the technical details, but it is a joy to see in action. And unlike Wallace’s inventions, this one hasn’t caused any serious disasters (yet).
Do you really understand what is happening within the mysterious black packaging of a microcontroller or microprocessor? Most people don’t — we just learn how to use them. That’s because they’re wildly complex circuits combining many different subsystems that are all abstracted away from the view of the user. To help students better understand these integrated circuits (ICs), Dr. Panayotis Papazoglou designed the Hardware-Oriented Microprocessor Simulator (HOMS).
Dr. Papazoglou is an associate professor at the National and Kapodistrian University of Athens (NKUA), so he has a stake in creating an educational tool like this one. The goal of HOMS is to provide a visual and tactile demonstration of what happens inside an eight-bit microprocessor. For example, it will show a value moving from a counter to a memory register. That’s something that is difficult to visualize when using a microprocessor, even if you’re working close to “the metal” in assembly.
HOMS is a modular system, so students can experiment with blocks that represent different subsystem circuits within a microprocessor. Each module has an Arduino UNO Rev3 board to control its own functions, with all of the modules working under the coordination of a central Arduino Mega 2560 controller. One module may, for instance, represent memory and will show the data “written” to it on a display. Another module may have buttons and switches to allow user input.
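The article doesn’t say which bus connects the modules to the central Mega; I2C is one natural fit for this kind of coordinator-and-modules layout. A hypothetical sketch of the coordinator’s side, with the module address and register scheme invented purely for illustration:

```cpp
#include <Wire.h>

// Hypothetical coordinator: tell the "memory" module (address 0x10
// assumed) to display a value that was just moved from the counter.
// The bus, address, and register scheme are our assumptions.
void writeToModule(byte address, byte reg, byte value) {
  Wire.beginTransmission(address);
  Wire.write(reg);    // which register display to update
  Wire.write(value);  // the 8-bit value being "written"
  Wire.endTransmission();
}

void setup() {
  Wire.begin();  // join the bus as the controller
}

void loop() {
  writeToModule(0x10, 0x00, 0x2A);  // show 0x2A in memory register 0
  delay(1000);
}
```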
There are software simulation tools that seek to illustrate computing fundamentals in a similar way, but many people learn better through physical interaction. For those people, HOMS could be very helpful.
Robots come in all shapes and sizes, but one of the most popular styles for industrial applications is the SCARA (Selective Compliance Assembly Robot Arm). These have multiple degrees of freedom, each of which rotates around the vertical Z axis. But they’re otherwise constrained, which can have advantages for certain applications. For example, they tend to have relatively high payload capacities. If you’re on a budget but want to dip your toes in, tuenhidiy’s SCARA plotter is a great way to start.
This is a follow-up to tuenhidiy’s previous SCARA design from a couple of years ago. The new version is more robust and includes a homing feature, which is important for repeatability. This is set up as a plotter and the firmware reflects that, but it would be possible to adapt the mechanical design for other purposes.
To keep costs down, most of the structure is PVC pipe. Stepper motors provide actuation via GT2 timing belts and pulleys. An Arduino Mega 2560 board controls those steppers through a RAMPS 1.4 board with A4988 stepper drivers. An interface module with a 2004 LCD, rotary encoder, buzzer, and button lets the user start jobs.
In this case, those jobs are G-code files containing the movement commands to reproduce the drawings. That works because the Arduino runs Marlin firmware (popular in the 3D printing community). The use of Marlin made homing easy, and it accepts G-code that users can create with most of the standard software tools.