Tag: robots

  • Recreating Rosie the Robot with a MKR ZERO

    Reading Time: < 1 minute

    Arduino Team, August 3rd, 2020

    While 2020 may seem like a very futuristic year, we still don’t have robotic maids like the Jetsons’ Rosie the Robot. For his latest element14 Presents project, DJ Harrigan decided to create such a bot as a sort of animatronic character, using an ESP8266 board for the interface and overall control, and an MKR ZERO to play stored audio effects.

    The device features a moveable head, arms and eyes, and even has a very clever single-servo gear setup to open and close its mouth.

    The user interface is a Blynk app running on a smartphone, and Rosie’s antennas can light up along with a “beep beep” sound to let you know she needs your attention!

    More details can be found in Harrigan’s post here.

    [youtube https://www.youtube.com/watch?v=8CH7B6zuqAk?feature=oembed&w=500&h=281]

    Website: LINK

  • Auto-blow bubbles with a Raspberry Pi-powered froggy

    Reading Time: 2 minutes

    8 Bits and a Byte created this automatic bubble machine, which is powered and controlled by a Raspberry Pi and can be switched on via the internet by fans of robots and/or bubbles.

    [youtube https://www.youtube.com/watch?v=Mp7LrYoTGsY?feature=oembed&w=500&h=281]

    They chose a froggy-shaped bubble machine, but you can repurpose whichever type you desire; a model running on two AA batteries is just easier to adapt.

    Raspberry Pi connected to the relay module

    Before the refurb, 8 Bits and a Byte’s battery-powered bubble machine was controlled by a manual switch, which turned the motor on and off inside the frog. If you wanted to watch the motor make the frog burp out bubbles, you needed to flick this switch yourself.

    After dissecting their plastic amphibian friend, 8 Bits and a Byte hooked up its motor to Raspberry Pi using a relay module. They point to this useful walkthrough for help with connecting a relay module to Raspberry Pi’s GPIO pins.

    Now the motor inside the frog can be turned on and off with the power of code. And you can become controller of bubbles by logging in here and commanding the Raspberry Pi to switch on.

    A screenshot of the now-automated frog in situ, as seen on the remo.tv website

    To let the internet’s bubble fans see the fruits of their one-click labour, 8 Bits and a Byte set up a Raspberry Pi Camera Module and connected their build to robot streaming platform remo.tv.

    Bubble soap being poured into the plastic frog's mouth
    Don’t forget your bubble soap!

    Kit list:

    The only remaining question is: what’s the best bubble soap recipe?

    Website: LINK

  • This Arduino-powered machine folds your shirts at the push of a button

    Reading Time: < 1 minute

    Arduino Team, July 21st, 2020

    Inspired by an old FlipFold TV ad, YouTuber Ty Palowski decided to make his own automated shirt folding machine.

    Palowski’s device is made in four folding sections, which lie flat to accept the unfolded piece of laundry. When the shirt is properly placed, a capacitive touch sensor starts the process, which is controlled via an Arduino and motor drivers.

    Two motors bring in the sides sequentially, then a third motor flips the bottom up. Activation is based simply on timing, with no sensor feedback. As seen at the end of the video, the project does save folding time and it works even better once Palowski gets some practice with it!

    [youtube https://www.youtube.com/watch?v=rhWaHSUVGco?feature=oembed&w=500&h=281]

    Website: LINK

  • Painting robot with ‘twin’ control scheme

    Reading Time: 2 minutes

    Arduino Team, July 16th, 2020

    For a class project, University of Stuttgart students Ekin Sila Sahin, Lior Skoury, and Simon Treml came up with a unique painting robot named the Physical Twin.

    The Physical Twin travels on a three-wheeled chassis and mounts a four-axis arm with a brush. An operator controls the arm to dip the brush into an onboard paint container, and can then manipulate it for application.

    The controller consists of a joystick for movement as well as a mini version of the arm. Four potentiometers measure arm input angles, which are duplicated on four corresponding servos on the robot. A pair of Arduino Mega boards are used for the setup — one on the mobile robot and another in the remote unit.

    You can see the device in action in the videos below, showing off direct operation and the ability to play back prerecorded movements.

    [youtube https://www.youtube.com/watch?v=RLETS29gzqc?feature=oembed&w=500&h=281]

    [youtube https://www.youtube.com/watch?v=SBcUb7kpBWo?feature=oembed&w=500&h=281]

    Website: LINK

  • Meet MrK_Blockvader, a little mobile robot that’s lots of fun

    Reading Time: < 1 minute

    One of the simplest ways to make a mobile robot involves differential steering, where two wheels move at different speeds as needed to turn and a ball caster keeps it from tipping over. The MrK_Blockvader is an excellent take on this type of bot — demonstrated in the first clip below — featuring a nice blocky body made of 3D-printed parts, RC truck wheels driven by tiny gear motors, and an integrated roller on its back.
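
    Differential steering comes down to a single line of arithmetic per wheel: a turn is just the two wheel speeds diverging around the commanded forward speed. Here is a minimal sketch of that mixing (illustrative only; the function name and values are not from the MrK_Blockvader code):

```python
def wheel_speeds(linear, angular, wheel_base):
    """Mix a body velocity command into left/right wheel speeds.

    linear:     forward speed (m/s)
    angular:    turn rate (rad/s, positive = counter-clockwise)
    wheel_base: distance between the drive wheels (m)
    """
    left = linear - angular * wheel_base / 2
    right = linear + angular * wheel_base / 2
    return left, right

# Straight ahead: both wheels run at the same speed.
ahead = wheel_speeds(0.2, 0.0, 0.1)
# Spin in place: wheels equal and opposite, pivoting about the center.
spin = wheel_speeds(0.0, 1.0, 0.1)
```

    On a real robot these speeds would then be scaled into PWM duty cycles for the gear motor driver.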

    The MrK_Blockvader is controlled via an Arduino Nano, along with an nRF24 breakout that allows it to receive signals from a radio transmitter unit. The build includes LED lighting as well as a piezo buzzer for all the beeps and boops. It can also take advantage of various sensors if necessary.

    The eventual goal is to use the MrK_Blockvader in a network of robots, hinted at in the second video with a worker at its side.

    [youtube https://www.youtube.com/watch?v=5P2HXFupO84?feature=oembed&w=500&h=281]
    [youtube https://www.youtube.com/watch?v=ZDzrl0xTYac?feature=oembed&w=500&h=281]

    Website: LINK

  • Robotic cornhole board guarantees three points every time

    Reading Time: < 1 minute

    Arduino Team, July 8th, 2020

    You may have seen Mark Rober’s automated dartboard or Stuff Made Here’s backboard, which use advanced engineering to create apparatuses that ensure you “can’t miss.” Now that summer is in full swing, what about a robotic cornhole board?

    Michael Rechtin decided to take on this challenge using a webcam pointed at the sky for sensing and DC motors that move the board along an X/Y plane on a set of sliding drawer rails.

    When a bean bag is thrown, the camera feeds the video to a laptop running a Processing sketch, which analyzes the bag’s trajectory and passes adjustment info to an Arduino. The Arduino then drives the motors to reposition the board where the bag is predicted to land, guiding it into the hole for three points!

    [youtube https://www.youtube.com/watch?v=FkhxhMJtkHA?feature=oembed&w=500&h=281]

    Website: LINK

  • This puck-slapping robot will beat you in table hockey

    Reading Time: 2 minutes

    Arduino Team, July 3rd, 2020

    Mechanical table hockey games, where players are moved back and forth and swing their sticks with a series of knobs, can be a lot of fun; however, could one be automated? As Andrew Khorkin’s robotic build demonstrates, the answer is a definite yes — using an Arduino Mega and a dozen stepper motors to score goals on a human opponent.

    The project utilizes an overhead webcam to track the position of the players and puck on the rink, with a computer used for object detection and gameplay. Each player is moved with two steppers, one of which pushes the control rod in and out, while the other twists the player to take shots.

    Training the game took six months of work, which really shows in the impressive gameplay seen below.

    [youtube https://www.youtube.com/watch?v=ryq2LKFTg3Q?feature=oembed&w=500&h=281]

    Website: LINK

  • This robo-dog sprays poison ivy with weed killer

    Reading Time: < 1 minute

    Arduino Team, July 1st, 2020

    Poisonous plants, like poison ivy, can really ruin your day. In an effort to combat this “green menace,” YouTuber Sciencish decided to create his own quadruped robot.

    The robotic dog is equipped with two servos per leg, for a total of eight, which enable it to move its shoulders and elbows back and forth.

    An Arduino Uno controller determines leg positions via trigonometric calculation, and when in position, it dispenses weed killer via a relay and aquarium pump setup. The reservoir can also be used to hold other liquids, whether for watering duties or even to provide extra fuel to a fire.

    [youtube https://www.youtube.com/watch?v=gm-EslOemfE?feature=oembed&w=500&h=281]

    Website: LINK

  • Building an Arduino-based bipedal bot

    Reading Time: < 1 minute

    Arduino Team, June 21st, 2020

    If you’d like to build a walking biped robot, this 3D-printed design by Technovation looks like a fantastic place to start. Each leg features three servos that actuate it at the hip, knee, and ankle for a total of six degrees of freedom.

    Control is handled by an Arduino Uno board that rides on top of the legs, along with a perfboard to connect to the servos directly.

    Movements are calculated via inverse kinematics, meaning one simply has to input the x and z positions, and the Arduino calculates the proper servo angles. The bot is even able to take steps between two and 10 centimeters without falling over.
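
    Technovation’s code isn’t reproduced in the post, but two-link inverse kinematics is compact enough to sketch with the law of cosines. The link lengths, axis conventions, and function name below are assumptions for illustration, not the project’s actual implementation:

```python
import math

def leg_ik(x, z, l1, l2):
    """Planar two-link inverse kinematics for one leg.

    x, z:   foot target relative to the hip joint
    l1, l2: thigh and shin lengths (same units as x, z)
    Returns (hip, knee) joint angles in radians.
    """
    d2 = x * x + z * z
    if not abs(l1 - l2) <= math.sqrt(d2) <= l1 + l2:
        raise ValueError("target out of reach")
    # Law of cosines: knee bend (0 = leg straight, pi = fully folded)
    knee = math.acos((d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2))
    # Hip angle: direction to the foot, corrected for the bent knee
    hip = math.atan2(z, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Sanity check: forward kinematics should land back on the target.
hip, knee = leg_ik(0.05, -0.12, 0.08, 0.08)
assert abs(0.08 * math.cos(hip) + 0.08 * math.cos(hip + knee) - 0.05) < 1e-9
assert abs(0.08 * math.sin(hip) + 0.08 * math.sin(hip + knee) + 0.12) < 1e-9
```

    The resulting joint angles would then be mapped into each servo’s pulse range to pose the leg.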

    [youtube https://www.youtube.com/watch?v=CxociTjzR4Q?feature=oembed&w=500&h=281]

    Website: LINK

  • Learning with Raspberry Pi — robotics, a Master’s degree, and beyond

    Reading Time: 5 minutes

    Meet Callum Fawcett, who shares his journey from tinkering with the first Raspberry Pi while he was at school, to a Master’s degree in computer science and a real-life job in programming. We also get to see some of the awesome projects he’s made along the way.

    I first decided to get a Raspberry Pi at the age of 14. I had already started programming a little bit before and found that I really enjoyed the language Python. At the time the first Raspberry Pi came out, my History teacher told us about them and how they would be a great device to use to learn programming. I decided to ask for one to help me learn more. I didn’t really know what I would use it for or how it would even work, but after a little bit of help at the start, I quickly began making small programs in Python. I remember some of my first programs being very simple dictionary-type programs in which I would match English words to German to help with my German homework.

    Learning Linux, C++, and Python

    Most of my learning was done through two sources. I learnt Linux and how the terminal worked using online resources such as Stack Overflow. I would have a problem that I needed to solve, look up solutions online, and try out commands that I found. This was perhaps the hardest part of learning how to use a Raspberry Pi, as it was something I had never done before, but it really helped me in later years when I would use Linux more than Windows. For learning programming, I preferred to use books. I had a book for C++ and a book for Python that I would work through. These were game-based books, so many of the fun projects that I did were simple text-based games where you typed in responses to questions.

    A family robotics project

    The first robot Callum made using a Raspberry Pi

    By far the coolest project I did with the Raspberry Pi was to build a small robot (shown above). This was a joint project between myself and my dad. He sorted out the electronics and I programmed the robot. It was a great opportunity to learn about robotics and refine my programming skills. By the end, the robot was capable of moving around by itself, driving into objects, and then reversing and trying a new direction. It was almost like an unintelligent Roomba that couldn’t hoover, but I spent many hours improving small bits and pieces to make it as easy to use as possible. My one wish that I never managed to achieve with my robot was allowing it to map out its surroundings. This was a very ambitious project at the time, since I was still quite inexperienced in programming. The biggest problem with this was calibrating the robot’s turning circle, which was never consistent so it was very hard to have the robot know where in the room it was.

    Sense HAT maze game

    Another fun project that I worked on used the Sense HAT developed for the Astro Pi computers for use on the International Space Station. Using this, I was able to make a memory maze game (shown below), in which a player is shown a maze for several seconds and then has to navigate that maze from memory by shaking the device. This was my first introduction to using more interactive types of input, and this eventually led to my final-year project, which used these interesting interactions to develop another way of teaching.

    Learning programming without formal lessons

    I have now just finished my Master’s degree in computer science at the University of Bristol. Before going to university, I had no experience of being taught programming in a formal environment. It was not a taught subject at my secondary school or sixth form. I wanted to get more people at my school interested in this area of study though, which I did by running a coding club for people. I would help others debug their code and discuss interesting problems with them. The reason that I chose to study computer science is largely because of my experiences with Raspberry Pi and other programming I did in my own time during my teenage years. I likely would have studied history if it weren’t for the programming I had done by myself making robots and other games.

    Raspberry Pi has continued to play a part in my degree and extra-curricular activities; I used them in two large projects during my time at university and used a similar device in my final project. My robot experience also helped me to enter my university’s ‘Robot Wars’ competition which, though we never won, was a lot of fun.

    A tool for learning and a device for industry

    Having a Raspberry Pi is always useful during a hackathon, because it’s such a versatile component. Tech like Raspberry Pi will always be useful for beginners to learn the basics of programming and electronics, but these computers are also becoming more and more useful for people with more experience to make fun and useful projects. I could see tech like Raspberry Pi being used in the future to help quickly prototype many types of electronic devices and, as they become more powerful, even being used as an affordable way of controlling many types of robots, which will become more common in the future.

    Our guest blogger Callum

    Now I am going on to work on programming robot control systems at Ocado Technology. My experiences of robot building during my years before university played a large part in this decision. Already, robots are becoming a huge part of society, and I think they are only going to become more prominent in the future. Automation through robots and artificial intelligence will become one of the most important tools for humanity during the 21st century, and I look forward to being a part of that process. If it weren’t for learning through Raspberry Pi, I certainly wouldn’t be in this position.

    Cheers for your story, Callum! Has tinkering with our tiny computer inspired your educational or professional choices? Let us know in the comments below. 

    Website: LINK

  • Meet TELEBOT, the terrifying telepresence robot

    Reading Time: 2 minutes

    Arduino Team, June 1st, 2020

    The Internet has been perhaps more important than ever to keep us connected these days. Available technology, however, apparently wasn’t good enough for brothers Hunter and Josh Irving, who built their own telepresence robot using parts on-hand during their own two-person hackathon.

    The robot they came up with, dubbed TELEBOT, features a partially 3D-printed face along with a set of chattering teeth and eyes recycled from an antique doll. An Arduino Uno is used to take audio signals from remote “guests” via a standard 3.5mm cable, simulating their facial expressions with servos that drive TELEBOT’s mouth and LED-lit eyes. 

    The duo also made TELEBOT’s “body” out of an adjustable lamp for manual movement. And, as an added bonus, the device is capable of glowing in the dark and can be customized with a wizard, cowboy or top hat. 

    While it might not be the most comforting robot you’ve ever seen, it looks like a fun build! 

    [youtube https://www.youtube.com/watch?v=XXLaeMre5Ac?feature=oembed&w=500&h=281]

    Website: LINK

  • GoodBoy is a robot dog that runs on Arduino

    Reading Time: < 1 minute

    Arduino Team, May 27th, 2020

    Daniel Hingston wanted to build a four-legged walking robot for several years, and with current coronavirus restrictions he finally got his chance. His 3D-printed robodog, dubbed “GoodBoy,” is reminiscent of a miniature version of Boston Dynamics’ Spot, which helped inspire the project. 

    It’s extremely clean, with wiring integrated into the legs mid-print. Two micro servos per leg move it in a forward direction, controlled by an Arduino Uno.

    Obstacle avoidance is provided by a pair of ultrasonic sensor “eyes,” allowing it to stop when something is in its path. An LDR sensor is also implemented; when its human minder covers it, GoodBoy is commanded to present its paw for shaking.

    Be sure to check out a short demo of GoodBoy below! 

    [youtube https://www.youtube.com/watch?v=uE5hZhkQkwI?feature=oembed&w=500&h=281]

    Website: LINK

  • Meet your new robotic best friend: the MiRo-E dog

    Reading Time: 3 minutes

    When you’re learning a new language, it’s easier the younger you are. But how can we show very young students that learning to speak code is fun? Consequential Robotics has an answer…

    The MiRo-E is an ‘emotionally engaging’ robot platform that was originally created on a custom PCB and has since moved onto Raspberry Pi. The creators made the change because they saw that schools were more familiar with Raspberry Pi and realised the potential in being able to upgrade the robotic learning tools with new Raspberry Pi boards.

    [youtube https://www.youtube.com/watch?v=KNYB5PaR6KE]

    The MiRo-E was born from a collaboration between Sheffield Robotics, London-based SCA design studio, and Bristol Robotics Lab. The cute robo-doggo has been shipping with Raspberry Pi 3B+ (they work well with the Raspberry Pi 4 too) for over a year now.

    While the robot started as a developers’ tool (MiRo-B), the creators completely re-engineered MiRo’s mechatronics and software to turn it into an educational tool purely for the classroom environment.

    Three school children in uniforms stroke the robot dog's chin

    MiRo-E with students at a School in North London, UK

    MiRo-E can see, hear, and interact with its environment, providing endless programming possibilities. It responds to human interaction, making it a fun, engaging way for students to learn coding skills. If you stroke it, it purrs, lights up, moves its ears, and wags its tail. Making a sound or clapping makes MiRo move towards you, or away if it is alarmed. And it especially likes movement, following you around like a real, loyal canine friend. These functionalities are just the basic starting point, however: students can make MiRo do much more once they start tinkering with their programmable pet.

    These opportunities are provided on MiRoCode, a user-friendly web-based coding interface, where students can run through lesson plans and experiment with new ideas. They can test code on a virtual MiRo-E to create new skills that can be applied to a real-life MiRo-E.

    What’s inside?

    Here are the full technical specs. But basically, MiRo-E comprises a Raspberry Pi 3B+ as its core, light sensors, cliff sensors, an HD camera, and a variety of connectivity options.

    How does it interact?

    MiRo reacts to sound, touch, and movement in a variety of ways. 28 capacitive touch sensors tell it when it is being petted or stroked. Six independent RGB LEDs allow it to show emotion, and multiple degrees of freedom let it move its eyes, tail, and ears. Its ears also house four 16-bit microphones and a loudspeaker. And two differential drive wheels with opto-sensors help MiRo move around.

    What else can it do?

    The ‘E’ bit of MiRo-E means it’s emotionally engaging, and the intelligent pet’s potential in healthcare has already been explored. Interaction with animals has been proven to be positive for patients of all ages, but sometimes it’s not possible for ‘real’ animals to comfort people. MiRo-E can fill the gap for young children who would benefit from animal comfort, but where healthcare or animal welfare risks are barriers.

    The same researchers who created this emotionally engaging robo-dog for young people are also working with project partners in Japan to develop ‘telepresence robots’ for older patients to interact with their families over video calls.

    Website: LINK

  • This robot looks like a ball and transforms itself into a quadruped to move

    Reading Time: 2 minutes

    Arduino Team, May 25th, 2020

    Gregory Leveque has created an adorable 3D-printed robot that not only walks on four legs, but folds up into a ball when not in use. 

    To accomplish this, the round quadruped utilizes one servo to deploy each leg via a parallelogram linkage system and another to move it forwards and backwards. A clever single-servo assembly is also implemented on the bottom to fill gaps left by the legs.

    The device is controlled by an Arduino Nano, along with a 16-channel servo driver board. Obstacle avoidance is handled via an ultrasonic sensor, which sticks out of the top half of the sphere and rotates side to side using yet another servo. 

    It’s an impressive mechanical build, especially considering its diminutive size of 130mm (5.12in) in diameter.

    [youtube https://www.youtube.com/watch?v=zN9ubHAB0vY?feature=oembed&w=500&h=281]

    Website: LINK

  • Make it rain chocolate with a Raspberry Pi-powered dispenser

    Reading Time: 5 minutes

    This fully automated M&M’s-launching machine delivers chocolate on voice command, wherever you are in the room.

    [youtube https://www.youtube.com/watch?v=hsGhCl0y1FY]

    A quick lesson in physics

    To get our head around Harrison McIntyre‘s project, first we need to understand parabolas. Harrison explains: “If we ignore air resistance, a parabola can be defined as the arc an object describes when launching through space. The shape of a parabolic arc is determined by three variables: the object’s departure angle; initial velocity; and acceleration due to gravity.”

    Harrison uses a basketball shooter to illustrate parabolas

    Lucky for us, gravity is always the same, so you really only have to worry about angle and velocity. You could also get away with only changing one variable and still be able to determine where a launched object will land. But adjusting both the angle and the velocity grants much greater precision, which is why Harrison’s machine controls both exit angle and velocity of the M&M’s.
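
    Harrison’s own code isn’t reproduced here, but the parabola arithmetic he describes fits in a few lines. The drag-free model and the function below are a sketch under those stated assumptions, not his implementation:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(speed, angle_deg, drop=0.0):
    """Horizontal distance travelled before falling `drop` metres
    below the launch point, ignoring air resistance."""
    a = math.radians(angle_deg)
    vx = speed * math.cos(a)  # horizontal velocity component
    vz = speed * math.sin(a)  # vertical velocity component
    # Positive root of: -drop = vz*t - G*t^2/2
    t = (vz + math.sqrt(vz * vz + 2 * G * drop)) / G
    return vx * t
```

    Fixing either the angle or the speed and solving this relation backwards for the other is exactly the one-variable shortcut described above; controlling both gives the launcher its extra precision.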

    Kit list

    The M&M’s launcher comprises:

    • 2 Arduino Nanos
    • 1 Raspberry Pi 3
    • 3 servo motors
    • 2 motor drivers
    • 1 DC motor
    • 1 Hall effect limit switch
    • 2 voltage converters
    • 1 USB camera
    • “Lots” of 3D printed parts
    • 1 Amazon Echo Dot

    A cordless drill battery is the primary power source.

    The project relies on the same principles as a baseball pitching machine. A compliant wheel is attached to a shaft sitting a few millimetres above a feeder chute that can hold up to ten M&M’s. To launch an M&M’s piece, the machine spins up the shaft to around 1500 rpm, pushes an M&M’s piece into the wheel using a servo, and whoosh, your M&M’s piece takes flight.

    Controlling velocity, angle and direction

    To measure the velocity of the fly wheel in the machine, Harrison installed a Hall effect magnetic limit switch, which gets triggered every time it is near a magnet.

    Two magnets were placed on opposite sides of the shaft, and these pass by the switch. By counting the time in between each pulse from the limit switch, the launcher determines how fast the fly wheel is spinning. In response, the microcontroller adjusts the motor output until the encoder reports the desired rpm. This is how the machine controls the speed at which the M&M’s pieces are fired.
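
    In code, that pulse-timing trick is essentially a one-liner. The sketch below is illustrative (on the actual Arduino, a timer-capture interrupt would supply the interval):

```python
def rpm_from_pulse_interval(interval_s, pulses_per_rev=2):
    """Estimate shaft speed from the time between Hall-switch pulses.

    With two magnets on the shaft the switch fires twice per
    revolution, so one revolution takes pulses_per_rev * interval_s
    seconds.
    """
    return 60.0 / (interval_s * pulses_per_rev)

# 20 ms between pulses with two magnets -> about 1500 rpm,
# the launch speed quoted above.
measured = rpm_from_pulse_interval(0.02)
```

    The controller would compare `measured` against the target rpm and nudge the motor output up or down until they match.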

    Now, to control the angle at which the M&M’s pieces fly out of the machine, Harrison mounted the fly wheel assembly onto a turret with two degrees of freedom, driven by servos. The turret controls the angle at which the sweets are ‘pitched’, as well as the direction of the ‘pitch’.

    So how does it know where I am?

    With the angle, velocity, and direction at which the M&M’s pieces fly out of the machine taken care of, the last thing to determine is the expectant snack-eater’s location. For this, Harrison harnessed vision processing.


    Harrison used a USB camera and a Python script running on Raspberry Pi 3 to determine when a human face comes into view of the machine, and to calculate how far away it is. The turret then rotates towards the face, the appropriate parabola is calculated, and an M&M’s piece is fired at the right angle and velocity to reach your mouth. Harrison even added facial recognition functionality so the machine only fires M&M’s pieces at his face. No one is stealing this guy’s candy!
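
    The ranging step can be approximated with a pinhole-camera model: an object’s apparent width in pixels shrinks in inverse proportion to its distance. The numbers below (average face width, focal length in pixels) are assumptions for illustration, not values from Harrison’s script:

```python
def distance_from_face(face_px, face_width_m=0.15, focal_px=600.0):
    """Pinhole-camera range estimate.

    face_px:      detected face width in pixels
    face_width_m: assumed real-world face width (m)
    focal_px:     camera focal length in pixels, from calibration
    """
    return focal_px * face_width_m / face_px

# A face detected 90 px wide sits about a metre from the camera;
# half the pixel width means roughly twice the distance.
near = distance_from_face(90)
far = distance_from_face(45)
```

    With the distance known, the turret angle and launch speed fall out of the parabola maths described earlier.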

    So what’s Alexa for?

    This project is topped off with a voice-activation element, courtesy of an Amazon Echo Dot, and a Python library called Sinric. This allowed Harrison to disguise his Raspberry Pi as a smart TV named ‘Chocolate’ and command Alexa to “increase the volume of ‘Chocolate’ by two” in order to get his machine to fire two M&M’s pieces at him.

    Drawbacks

    In his video, Harrison explains that other snack-launching machines use a spring-loaded throwing mechanism, which doesn’t let you control the snack’s exit velocity. That means you have less control over how fast your snack goes and where it lands. The only drawback to Harrison’s model? His machine needs objects that are uniform in shape and size, which means no oddly shaped peanut M&M’s pieces for him.

    He’s created quite the monster here: at first, the machine’s maximum firing speed was 40 mph, and no one wants crispy-shelled chocolate firing at their face at that speed. To keep his teeth safe, Harrison switched out the original motor for one with a lower rpm, which reduced the maximum exit velocity to a much more sensible 23 mph. Please make sure you test your own snack-firing machine outdoors before aiming it at someone’s face.

    Go subscribe

    Check out the end of Harrison’s videos for some more testing to see what his machine was capable of: he takes out an entire toy army and a LEGO Star Wars squad by firing M&M’s pieces at them. And remember to subscribe to his channel and like the video if you enjoyed what you saw, because that’s just a nice thing to do.

    Website: LINK

  • mechDOG, a 12-servo robotic pup

    Reading Time: < 1 minute

    Arduino Team, May 18th, 2020

    Mech-Dickel Robotics has designed a beautiful quadruped robot dubbed mechDOG, which utilizes a dozen servos for motion. This gives each leg three degrees of freedom, allowing the cat-sized beast to travel a meter in 8.46 seconds. While it won’t break any speed records, creating a walking motion on this sort of unstable platform is an impressive feat in itself.

    mechDOG is controlled by an Arduino Uno, while a Lynxmotion Smart Servo Adapter Board interfaces with the servos themselves. The device is remote-controlled via an RF unit, though it does have a pair of ultrasonic sensors that presumably could be used for obstacle avoidance. 

    You can check it out in action in the videos below, looking sharp in its yellow-finished aluminum sheet metal chassis.

    [youtube https://www.youtube.com/watch?v=i-wuqwJ5QNY?feature=oembed&w=500&h=281]

    [youtube https://www.youtube.com/watch?v=_6UpOW29lDs?feature=oembed&w=500&h=281]

    Website: LINK

  • This mouth mechanism is controlled by your typing

    Reading Time: < 1 minute

    Arduino Team, April 21st, 2020

    Will Cogley, known for his awesome animatronics, has created a robotic mouth that’s already a work of art and could form the basis of something even more amazing. 

    The device features an array of servo mechanisms to actuate its jaw, lips, cheeks, and tongue. The cheek assemblies are particularly interesting, employing two servos each and a linkage system that allows them to move into a variety of positions.

    For control, the project uses a Python program to break typed sentences up into individual sounds. It then sends these to an Arduino, which poses the mouth in sequence. Cogley has also experimented with microphone input and hopes to explore motion capture with it in the future.

    [youtube https://www.youtube.com/watch?v=mEAz-72ZjKE?feature=oembed&w=500&h=281]

    Website: LINK

  • This garbage-bot trash talked TEDx Copenhagen attendees

    Reading Time: 2 minutes

    Arduino Team, April 21st, 2020

    For TEDx Copenhagen 2019, MAKESOME was contacted about building a trash can. Not just any ordinary waste bin, however, but one that would fit in with their theme of “expect the unexpected” by driving around and being rude to participants. From the video, the bot looks like it was a great success, and something that caught attendees off guard with its “in your face” attitude.

    Mechanically, the base of the device is an omniwheel robot, which moves in any direction under the power of four DC gearmotors. An Arduino Uno is the brains of the project, with the user interface provided by a PlayStation gamepad over Bluetooth. An Arduino Nano controls the motors, while an MP3 module, amplifier, and speakers allow the bot to converse and joke around while receiving attendees’ refuse.

    [youtube https://www.youtube.com/watch?v=DQmHZyi0gRA?feature=oembed&w=500&h=281]

    The robot is built around an Arduino Uno as the main controller and an Arduino Nano for motor control. Communication is via a USB Host Shield and a Bluetooth dongle. Two MDD10A 2.0 motor controllers drive the four JGB37-550 motors. The MP3 player is a Serial MP3 Player v1.0, and the amplifier and speakers come from a set of Logitech Z150 computer speakers. Power comes from a Tattu 22,000 mAh, 14.8 V four-cell LiPo battery, with a DC-DC converter delivering 5 V for the controllers.
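    As a sketch of how an omniwheel base like this can move in any direction, here is a desktop-testable C++ mixer that turns gamepad strafe, forward, and rotate inputs into four wheel speeds. The mixing equations are the standard ones for a four-wheel holonomic base; the function names and scaling are assumptions, not MAKESOME's actual firmware:

    ```cpp
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct WheelSpeeds { double fl, fr, rl, rr; };

    // Hypothetical mixer: gamepad x (strafe), y (forward) and r (rotate),
    // each in [-1, 1], become four wheel speeds. The real build would send
    // these as PWM commands to the two MDD10A dual motor controllers.
    WheelSpeeds mix(double x, double y, double r) {
        WheelSpeeds w{ y + x + r, y - x - r, y - x + r, y + x - r };
        // Normalize so no wheel command exceeds full speed.
        double m = std::max({1.0, std::abs(w.fl), std::abs(w.fr),
                                  std::abs(w.rl), std::abs(w.rr)});
        w.fl /= m; w.fr /= m; w.rl /= m; w.rr /= m;
        return w;
    }

    int main() {
        WheelSpeeds w = mix(0.0, 1.0, 0.0);  // stick full forward
        std::printf("FL=%.2f FR=%.2f RL=%.2f RR=%.2f\n",
                    w.fl, w.fr, w.rl, w.rr);
        return 0;
    }
    ```

    Setting only `x` or only `r` produces pure strafing or rotation on the spot, which is what lets the bot sidle up to unsuspecting attendees.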

    Website: LINK

  • Pingo, the motion-detecting ping pong ball launcher

    Reading Time: < 1 minute

    Arduino TeamMarch 11th, 2020

    If you want to “enhance your athletic training regimen,” or perhaps just have a bit of fun with robotically launched ping pong balls, then be sure to check out the Pingo apparatus shown in the video below. This robot moves back and forth on four DC motor-powered wheels, searching for targets with an ultrasonic rangefinder.

    When something comes into view, Pingo adjusts its ping pong launching tube’s angle to match the target distance, then loads a ball and flings it into the air with a pair of spinning disks. 

    The device is controlled by an Arduino Mega and uses a half-dozen DC motors, a pair of steppers, and even a servo to accomplish its mission.
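    Matching the tube angle to the target distance comes down to simple projectile motion. Here is a desktop-testable C++ sketch of one way the Mega might compute it, assuming a fixed exit speed from the spinning disks and launch and landing at the same height; the exit speed and the formula choice are illustrative assumptions, not details from the Pingo build:

    ```cpp
    #include <cmath>
    #include <cstdio>

    // Ideal projectile range: R = v^2 * sin(2*theta) / g
    // Solving for the (low) launch angle: theta = asin(g*R / v^2) / 2
    double launchAngleDeg(double distance_m, double exitSpeed_mps) {
        const double g = 9.81;
        double s = g * distance_m / (exitSpeed_mps * exitSpeed_mps);
        if (s > 1.0) s = 1.0;  // out of range: fall back to max-range 45 deg
        return 0.5 * std::asin(s) * 180.0 / M_PI;
    }

    int main() {
        // e.g. a target spotted 2 m away, with an assumed 6 m/s exit speed
        std::printf("angle = %.1f deg\n", launchAngleDeg(2.0, 6.0));
        return 0;
    }
    ```

    In practice a real launcher would be calibrated empirically, since disk slip and air drag make the ideal formula optimistic.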

    Website: LINK

  • The Watchman is a 3D-printed robot head that follows your face with realistic eyeballs

    Reading Time: 2 minutes

    Arduino TeamMarch 9th, 2020

    When you step out in public, you’ll often be filmed by a number of cameras and perhaps even be analyzed by tracking software of some kind. The Watchman robot head by Graham Jessup, however, makes this incredibly obvious as it detects and recognizes facial movements, then causes a pair of eyeballs to follow you around.

    The 3D-printed system — which is a modified version of Tjhazi’s Doorman — uses a Raspberry Pi Camera to capture a live video feed, along with a Raspberry Pi Zero and a Google AIY HAT for analysis.

    This setup passes info on to an Arduino Uno that actuates the eyeballs via a 16-channel servo shield and a number of servos. The device can follow Jessup up, down, left, and right, making for a very creepy robot indeed!
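    The core of this kind of eye tracking is a proportional control step: nudge the pan and tilt servos toward the detected face center each frame. Below is a desktop-testable C++ sketch of that step; the frame size, gain, and servo limits are illustrative assumptions, not values from Jessup's build:

    ```cpp
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Hypothetical proportional tracker: (cx, cy) is the detected face
    // center in a 320x240 frame; pan/tilt are servo angles in degrees.
    void track(double cx, double cy, double &pan, double &tilt) {
        const double kp = 0.05;         // gain, tuned by eye
        pan  += kp * (160.0 - cx);      // face left of center -> pan left
        tilt += kp * (120.0 - cy);      // face above center  -> tilt up
        pan  = std::clamp(pan,  0.0, 180.0);  // respect servo travel
        tilt = std::clamp(tilt, 0.0, 180.0);
    }

    int main() {
        double pan = 90.0, tilt = 90.0;
        track(260.0, 120.0, pan, tilt);  // face 100 px right of center
        std::printf("pan=%.1f tilt=%.1f\n", pan, tilt);
        return 0;
    }
    ```

    Running this at the camera's frame rate makes the eyes converge smoothly on the face rather than snapping to it.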

    [youtube https://www.youtube.com/watch?v=qRGOz6Pa32A?feature=oembed&w=500&h=281]

    [youtube https://www.youtube.com/watch?v=xo8RRXo4SKw?feature=oembed&w=500&h=281]

    Website: LINK

  • Creating an online robot fighting game using Arduino MKR1000 WiFi

    Reading Time: 7 minutes

    This is a guest post from Surrogate, a team of developers building games that people play in real-life over the internet.

    We introduced this concept last year and have launched three games so far. Our final game of 2019 was SumoBots Battle Royale — where players from anywhere in the world can fight real robots in a battle royale-style arena. The aim of the project was to have the game run semi-autonomously, meaning that the bots could reset themselves between games and the arena could run by itself with no human interaction. This was our most complex project to date, and we wanted to share some parts of the build process in more detail: specifically, how we built these robots and hooked them up online for people to control remotely.

    Robot selection

    We started our process by choosing which robots we'd want to use for the game. There were a couple of requirements for the robots when making the evaluation:

    • Are able to withstand 24/7 collision
    • Easily modifiable and fixable
    • Can rotate on the same spot
    • Must have enough space to fit the electronics

    After looking at a lot of different consumer robots, maker projects, and competitive fighting bots, we decided to use the JSUMO BB1 robots for this game. We liked that these bots have a metal casing, which makes them very durable; that all parts are easily replaceable and can be bought separately; and that each has four independent motors (motor shields included), one per wheel, which allows it to rotate on the spot.

    We were pretty skeptical about being able to fit all the electronics into the original casing, but we decided to go with this robot anyway, as it had the best overall characteristics. Since it is easily modifiable, we could always 3D print an extra casing to fit all the parts.

    What is the board?

    Now that we'd decided on the robot, it was time to define what electronics we should use in this build. As usual, it all starts with the requirements. Here's what we needed for the game to run smoothly:

    • The robot should be able to recover from any position
    • Can stay online while charging
    • Supports WiFi network connection and offers reliable connectivity
    • Easily programmable and supports OTA updates
    • Can control four motors simultaneously

    Based on these requirements we had the following electronics layout in mind:

    We had to find a board that is energy efficient, can send commands to the motors, supports parallel charging, and takes up little space inside the robot. With so many requirements, finding the perfect board can be a challenge.

    Arduino to the rescue

    Fortunately, Arduino was there to help us out. They offer a rich selection of boards to fit every possible robotics project out there and have very detailed documentation for each of the boards. 

    More importantly, Arduino is known for its high quality, something that is crucial for semi-autonomous types of applications. Coming from an embedded software background and having to work with all sorts of hardware, we often see that some features or board functionalities are not fully finished which can lead to all sorts of unpleasant situations.

    After looking at Arduino's collection of boards, we quickly found the perfect candidate for our project: the Arduino MKR1000 WiFi. This board fits all of our main requirements for motor control, is easily programmable via the Arduino IDE, and thanks to its low-power design is extremely power efficient, allowing us to use a lower-capacity battery. Additionally, it has a separate WiFi chip on board that focuses solely on providing a reliable WiFi connection, something that is very important in our use case.

    Now that we’ve decided on the “brain” of our robot, it was time to choose the rest of the components.

    Robust hardware means working software

    Something to keep in mind when working with hardware is that you should always try to minimize risk. This means exceeding your minimum hardware requirements where possible: if your hardware doesn't work as intended, your whole software stack becomes unusable too. Always choose reliable hardware components for mission-critical applications.

    Some of our electric components might look a bit overkill, but due to the nature of our projects, they are a critical requirement.

    Avoiding the battery explosions

    As the game involves a lot of robot collisions, we decided to go with a battery built to a high safety standard. After evaluating multiple options on the market, we chose the RRC2040 from RRC (Germany). Its 2950 mAh capacity lets the robots run for up to five hours on a single charge, an average draw of roughly 590 mA. It has internal circuitry for power management and protection, supports SMBus communication (similar to I2C), and is certified to all of the consumer-electronics battery standards. For charging, we used RRC's charging solution designed specifically for this battery, which can feed power to the application while the battery is being charged.

    Note: the Arduino MKR1000 has a pretty neat charging solution on the board itself. You can connect the battery directly to the board as the main power source and charge it through the MKR1000's micro USB port. We really wanted to use it to save space and get a more robust design, but the large capacity of our battery meant we couldn't use it to its full potential. In future projects with smaller robots, we definitely plan to use the board's internal charging system, as it works perfectly for 700-1800 mAh power packs.

    Bot recovery

    For the bot to recover from falling on its head, we implemented a flipping servo. We didn't want to risk having too little torque, so we went with the DS3218, which is capable of lifting up to 20 kg. Here's how it works:

    Hooking everything together

    Now that we'd decided on all of the crucial elements of the setup, it was time to connect everything together. As a first step, we figured out the best way to arrange all the pieces within the bot. We then 3D-printed a casing to protect the electronics. With the preliminary steps completed, we wired all of the components together and mounted them inside the casing. Here's how it looks:

    It was really convenient that all the pins on the board could be connected just by plugging them in. This saved a lot of time that would otherwise have been spent soldering cables for 12 robots and, more importantly, eliminated the risk of bad solder joints, which usually can't be easily identified.

    Arduino = Quick code

    The Arduino MKR1000 offered us the connectivity we needed for the project. Each sumo robot hosts its own UDP server, using the MKR1000 WiFi libraries to receive control commands from a central control PC and to broadcast its battery charge status. The user commands are translated into three PWM signals using the Arduino Servo library: one for the flipping servo and one each for the left and right motor controllers. The board's support for hardware PWM output was useful here. Overall, we managed to keep the whole Arduino sketch to a few hundred lines of code thanks to the availability of the Servo and WiFi libraries.
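    To make the command path concrete, here is a desktop-testable C++ sketch of how an incoming UDP datagram could be parsed into the three pulse widths that `Servo.writeMicroseconds()` accepts. The datagram format (three percentages) and the pulse mapping are our assumptions for illustration, not Surrogate's actual protocol:

    ```cpp
    #include <cstdio>
    #include <sstream>
    #include <string>

    // Map a [-100, 100] percentage to the standard 1000-2000 us servo
    // pulse range, with 1500 us as neutral/stop.
    int toPulseUs(int percent) {
        return 1500 + percent * 5;  // -100 -> 1000, 0 -> 1500, 100 -> 2000
    }

    // Hypothetical datagram format: "left right flip", e.g. "100 -100 0".
    // Returns false if the datagram is malformed.
    bool parseCommand(const std::string &datagram,
                      int &left, int &right, int &flip) {
        std::istringstream in(datagram);
        int l, r, f;
        if (!(in >> l >> r >> f)) return false;
        left = toPulseUs(l); right = toPulseUs(r); flip = toPulseUs(f);
        return true;
    }

    int main() {
        int l, r, f;
        if (parseCommand("100 -100 0", l, r, f))  // spin in place
            std::printf("left=%d right=%d flip=%d\n", l, r, f);
        return 0;
    }
    ```

    On the MKR1000 the parsed pulse widths would simply be written to three attached Servo objects each time a datagram arrives.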

    The out-of-the-box ArduinoOTA support for updating code over WiFi came in handy during the development phase, and also anytime we updated the firmware on multiple robots at the same time. No need to open the covers and attach a USB cable! We created a simple Bash script using the OTA update tool bundled with the Arduino IDE to send firmware updates to every robot at once.

    To summarize

    It's pretty amazing that we live in an age where a mass-market, small-form-factor board like the Arduino MKR1000 offers so much functionality. We had a great experience developing our SumoBots Battle Royale game with the board. It made the whole process smooth and streamlined, the documentation was right on point, and we never hit a bottleneck where the hardware wouldn't work as expected.

    More importantly, the boards have proven to be very robust over time. These SumoBots have been used for more than 3,000 games already, and we haven't seen a single failure from the MKR1000. For a game where you literally slam robots into each other at high speed, that's pretty impressive, to say the least.

    We look forward to working with Arduino on our future games, and we can’t wait to see what they will be announcing in 2020!

    Website: LINK

  • OmBURo is an Arduino-controlled unicycle robot with an active omnidirectional wheel

    Reading Time: 2 minutes

    Arduino TeamFebruary 4th, 2020

    Omni wheels normally contain a number of rollers arranged on their circumference, allowing them to slide left and right and perform various tricks when combined with others. The rollers on UCLA researchers Junjie Shen and Dennis Hong’s OmBURo, however, are quite different in that they are actually powered, enabling a single wheel to accomplish some impressive feats on its own.

    These powered rollers give OmBURo the ability to move in both longitudinal and lateral directions simultaneously, balancing as a dual-axis wheeled inverted pendulum. 

    Control is accomplished via an Arduino Mega, along with an IMU and encoders for its two servo motors: one tasked with driving the wheel forward and backward, the other with actuating the rollers laterally via helical gears and a flexible shaft.
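    At the heart of any wheeled inverted pendulum is a feedback loop that turns the measured tilt into a wheel command that pushes the body back upright; OmBURo runs such a loop on both axes at once. Here is a minimal, desktop-testable C++ sketch of one axis as a PD controller; the gains are purely illustrative and are not taken from Shen and Hong's paper:

    ```cpp
    #include <cstdio>

    // One PD balance step for a single axis of a wheeled inverted
    // pendulum: tilt and tilt rate come from the IMU, and the output
    // is a wheel acceleration command that drives the tilt to zero.
    double balanceStep(double tiltRad, double tiltRateRadS) {
        const double kp = 35.0;  // restoring gain  (illustrative values,
        const double kd = 4.0;   //  not from the OmBURo paper)
        return kp * tiltRad + kd * tiltRateRadS;
    }

    int main() {
        // e.g. a 0.1 rad forward lean while momentarily stationary
        std::printf("command = %.2f\n", balanceStep(0.1, 0.0));
        return 0;
    }
    ```

    Running one such loop for the wheel and another for the rollers is what lets the single wheel balance and translate in both directions simultaneously.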

    As seen in the video below, the robot can follow different paths via remote control and even balance on an inclined plane. More information on the impressive build is available in Shen and Hong's research paper here.

    [youtube https://www.youtube.com/watch?v=avtlx1-X3lo?feature=oembed&w=500&h=281]

    A mobility mechanism for robots to be used in tight spaces shared with people requires it to have a small footprint, to move omnidirectionally, as well as to be highly maneuverable. However, currently there exist few such mobility mechanisms that satisfy all these conditions well. Here we introduce Omnidirectional Balancing Unicycle Robot (OmBURo), a novel unicycle robot with active omnidirectional wheel. The effect is that the unicycle robot can drive in both longitudinal and lateral directions simultaneously. Thus, it can dynamically balance itself based on the principle of dual-axis wheeled inverted pendulum. This letter discloses the early development of this novel unicycle robot involving the overall design, modeling, and control, as well as presents some preliminary results including station keeping and path following. With its very compact structure and agile mobility, it might be the ideal locomotion mechanism for robots to be used in human environments in the future.

    Website: LINK