Tag: LiDAR

  • Teaching an Arduino UNO R4-powered robot to navigate obstacles autonomously

    Reading Time: 2 minutes

    The rapid rise of edge AI capabilities on embedded targets has proven that relatively low-resource microcontrollers are capable of some incredible things. And following the recent release of the Arduino UNO R4 with its Renesas RA4M1 processor, the ceiling has gotten even higher as YouTuber Nikodem Bartnik has demonstrated with his lidar-equipped mobile robot.

    Bartnik’s project started with a simple question: is it possible to teach a basic robot to make its way around obstacles using only lidar, instead of the more resource-intensive computer vision techniques employed by most other platforms? The chassis and hardware, including two DC motors, an UNO R4 Minima, a Bluetooth® module, and an SD card, were constructed according to Open Robotic Platform (ORP) rules so that others can easily replicate and extend its functionality. After driving through a series of courses to collect a point cloud from the spinning lidar sensor, Bartnik imported the data and performed a few transformations to keep the classification model small.

    Once trained, the model was exported with help from the micromlgen Python package and loaded onto the UNO R4. The setup enables the incoming lidar data to be classified into the direction in which the robot should travel, and according to Bartnik’s experiments, this approach worked surprisingly well. Initially, there were a few issues when navigating corners and traveling through a figure-eight track, but additional training data solved them and allowed the vehicle to navigate a completely novel course at maximum speed.
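    The exported model itself runs as generated C on the UNO R4, so as a rough stand-in, this sketch only illustrates the kind of decision the classifier makes: mapping normalized sector clearances (as in the preprocessing step) to a steering label. The three-way split and threshold are illustrative assumptions, not Bartnik’s trained model.

```python
# Illustrative stand-in for the on-board classifier: pick a steering
# direction from normalized clearances (0 = obstacle touching, 1 = clear).

def choose_direction(left_clear, front_clear, right_clear, threshold=0.3):
    """Return 'forward', 'left', or 'right' from sector clearances."""
    if front_clear > threshold:
        return "forward"
    # Front blocked: turn toward the more open side.
    return "left" if left_clear >= right_clear else "right"


decision = choose_direction(0.9, 0.1, 0.4)  # front blocked, left is clearer
```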

    Video: https://www.youtube.com/watch?v=PdSDhdciSpE

    The post Teaching an Arduino UNO R4-powered robot to navigate obstacles autonomously appeared first on Arduino Blog.

    Website: LINK

  • See how Nikodem Bartnik integrated LIDAR room mapping into his DIY robotics platform

    Reading Time: 2 minutes

    Arduino Team, January 31st, 2022

    As part of his ongoing autonomous robot project, YouTuber Nikodem Bartnik wanted to add LIDAR mapping/navigation functionality so that his device could see the world in much greater resolution and actively avoid obstacles. In short, LIDAR works by sending out short pulses of invisible light and measuring how much time it takes for the beam to reflect off an object and return to its detector. By combining this distance value with the angle of the sensor at the moment of measurement, a virtual cloud of points can be built and used to represent the entire space around the robot.
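    The round-trip timing described above reduces to a one-line formula: the pulse covers the distance twice (out and back), so the distance is half the product of the speed of light and the measured time of flight, d = c·t/2.

```python
# Time-of-flight ranging: distance is half the round-trip path length.

C = 299_792_458  # speed of light in m/s


def tof_distance_m(round_trip_seconds):
    """Distance to the target given the pulse's round-trip time."""
    return C * round_trip_seconds / 2


# A reflection arriving 20 nanoseconds after the pulse left is ~3 m away.
d = tof_distance_m(20e-9)
```

    The tiny times involved are why lidar modules need fast timing electronics: a 3 m target means resolving a delay of only 20 ns.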

    The LIDAR module Bartnik opted to use was fairly simple, as it sent measurements in frames over UART that encoded everything including the sensor’s angle, the distance, and the speed of the device. He then created a simple sketch for the MKR WiFi 1010 that takes advantage of the increased power and connectivity to read values and send them to a host machine for further processing and visualization. 
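    Decoding such frames on the host side is straightforward byte unpacking. The frame layout below is a purely hypothetical illustration (header byte, angle, distance, speed, checksum); the actual module’s protocol differs, and the field widths and checksum rule here are assumptions.

```python
import struct

# Hypothetical little-endian frame for illustration only:
# 0xA5 header (uint8), angle in hundredths of a degree (uint16),
# distance in mm (uint16), rotation speed in RPM (uint16),
# checksum (uint8) = low byte of the sum of all preceding bytes.

FRAME_FMT = "<BHHHB"
FRAME_LEN = struct.calcsize(FRAME_FMT)  # 8 bytes


def parse_frame(frame):
    """Return (angle_deg, distance_mm, rpm) from one validated frame."""
    header, angle_c, dist_mm, rpm, checksum = struct.unpack(FRAME_FMT, frame)
    if header != 0xA5:
        raise ValueError("bad header")
    if checksum != sum(frame[:-1]) & 0xFF:
        raise ValueError("bad checksum")
    return angle_c / 100.0, dist_mm, rpm


# Build and parse a sample frame: 90.00 degrees, 1234 mm, 300 RPM.
payload = struct.pack("<BHHH", 0xA5, 9000, 1234, 300)
frame = payload + bytes([sum(payload) & 0xFF])
angle, dist, rpm = parse_frame(frame)
```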

    The resulting Python script opens a websocket, which receives the aforementioned data, does some basic filtering, and then displays it as a point cloud. It also determines the direction in which the robot should move and sends that command back to the MKR board so it can tell the attached Arduino Uno how to move the motors.
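    The script’s exact filtering isn’t specified; a typical first pass, shown here as a sketch, drops zero-distance dropouts and readings outside a plausible range window before plotting. The limits are illustrative assumptions.

```python
# Basic range filtering before visualization: keep only plausible readings.

MIN_MM, MAX_MM = 120, 8000  # assumed valid measurement window


def filter_points(points):
    """points: iterable of (angle_deg, distance_mm); drop implausible ones."""
    return [(a, d) for a, d in points if MIN_MM <= d <= MAX_MM]


raw = [(0, 0), (45, 1500), (90, 25000), (135, 640)]
kept = filter_points(raw)  # dropout at 0 deg and outlier at 90 deg removed
```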

    Website: LINK

  • Making a mini 360° LiDAR for $40

    Reading Time: 2 minutes

    Arduino Team, December 21st, 2020

    LiDAR (or “light detection and ranging”) sensors are all the rage these days, from their potential uses in autonomous vehicles, to their implementation on the iPhone 12. As cool as they are, these (traditionally) spinning sensors tend to be quite expensive, well out of reach for most amateur experimenters. Daniel Hingston, however, has managed to build his own unit for under $40, using an Arduino Uno and a pair of VL53L0X time-of-flight (ToF) sensors.

    The lighthouse employs a small gearmotor to rotate the two sensors on top of its cylindrical 3D-printed housing, passing signals to the Arduino via a slip ring. Data can then be visualized using a Processing sketch running on a nearby computer.
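    To draw the scan, each (angle, distance) sample from the rotating head converts to Cartesian coordinates. The Processing sketch would do the equivalent of this minimal Python sketch, assuming the angle is measured in degrees counterclockwise from the x-axis.

```python
import math

# Convert one polar lidar sample to 2D plot coordinates.

def polar_to_xy(angle_deg, distance):
    """Map (angle in degrees, distance) to (x, y)."""
    theta = math.radians(angle_deg)
    return distance * math.cos(theta), distance * math.sin(theta)


x, y = polar_to_xy(90, 1000)  # straight "up": x ~ 0, y = 1000
```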

    As seen at around the 10:00 mark in the video, the setup has been utilized to map out different test enclosures, and could be excellent for use in small robotic applications. More details can be found in Hingston’s tutorial here.

    Video: https://www.youtube.com/watch?v=uYU534Wn4lA

    Website: LINK

  • DolphinView headset lets you see the world like Flipper!

    Reading Time: 2 minutes

    Arduino Team, July 26th, 2018

    Dolphins are not only amazing swimmers and extremely intelligent, but can also navigate their surroundings using echolocation. While extremely useful in murky water, Andrew Thaler decided to make a device that would enable him to observe his (normally dry) environment with a similar distance-indicating audio setup.

    While he first considered using an ultrasonic sensor, he eventually settled on LiDAR for its increased range, and uses an Arduino to translate distance into a series of audio clicks. Sound is transferred to Thaler through bone conduction speakers, mimicking the way dolphins hear without external ears. 
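    Thaler’s exact distance-to-click mapping isn’t given here; as an illustrative sketch, one common approach makes the interval between clicks proportional to distance, so nearby obstacles produce rapid clicking, echolocation-style. The range limits and interval bounds below are assumptions.

```python
# Hypothetical distance-to-click mapping: closer obstacle, faster clicks.

def click_interval_ms(distance_cm, min_ms=20, max_ms=500, max_range_cm=4000):
    """Gap between clicks grows linearly with distance, clamped to range."""
    d = max(0, min(distance_cm, max_range_cm))
    return min_ms + (max_ms - min_ms) * d / max_range_cm


near = click_interval_ms(100)   # close obstacle: short gap, rapid clicks
far = click_interval_ms(4000)   # at max range: long gap, slow clicks
```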

    He notes that while using the “DolphinView” headset is initially disorienting, he was eventually able to correlate his surroundings with the system’s audio feedback. Arduino code and a parts list are available on GitHub, and the mechanical frame design can be found on Thingiverse if you’d like to build your own!

    Website: LINK

  • Lunar landing conspiracy put to rest(?) with LIDAR

    Reading Time: 2 minutes

    Arduino Team, February 20th, 2018

    On July 20th, 1969, man first set foot on the Moon with the Apollo 11 mission, or so they say. If the landing was faked, the theory goes, there should be a few details that don’t quite add up. One such claim is that the hatch on the lunar module isn’t actually large enough to let a fully suited-up astronaut enter and exit the module.

    Rather than make assumptions, astrophotographer and hacker “AstronomyLive” took matters into his own hands and used a homemade LIDAR unit to measure the hatch of Lunar Module #9 at the Kennedy Space Center, as well as an Apollo spacesuit.

    The Arduino-powered device aims the laser, and transmits this information to a tablet that also provides a convenient user interface. This data was then arranged as a point cloud, proving that… You can take a guess, or watch the video below to see his conclusion!

    I used the Garmin LIDAR-Lite V3 along with a couple of metal geared servo motors to build a simple pan/tilt scanner, which pairs via Bluetooth to an Android app I built using MIT App Inventor 2 to control and receive data from the Arduino. It’s simple but effective. Every tutorial I read suggested I couldn’t safely pull the voltage off the board for the motors, but I found that the VIN pin gave me no problems, as long as I used a 5V 1.5A linear voltage regulator between the pin and the motors. I supplied 9V using AA batteries to the power jack on the Arduino. In the future I may upgrade the scanner by adding a small camera to grab RGB data for each point as it samples, and ideally I would change the whole thing to use a stepper motor for continuous spinning and scanning to generate a denser cloud.
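    Arranging those pan/tilt samples into a point cloud means converting each (pan, tilt, range) triple into a 3D point. A sketch of that conversion, assuming pan is measured in the horizontal plane from the x-axis and tilt is elevation above that plane, both in degrees:

```python
import math

# Convert one pan/tilt scanner sample to Cartesian point-cloud coordinates.

def sample_to_xyz(pan_deg, tilt_deg, r):
    """Map (pan, tilt, range) to (x, y, z) using spherical coordinates."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    x = r * math.cos(tilt) * math.cos(pan)
    y = r * math.cos(tilt) * math.sin(pan)
    z = r * math.sin(tilt)
    return x, y, z


x, y, z = sample_to_xyz(0, 90, 2.0)  # pointed straight up: z = 2.0
```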



    Website: LINK

  • LiDAR Scanning Technology Helps Archaeologists Uncover Forgotten Cities

    Reading Time: 3 minutes

    LiDAR scanning technology offers tremendous potential to help archaeologists discover lost cities across the world.

    Archaeologists are able to speed up their process of discovery and exploration using a high-tech laser technique originally developed to scan the surface of the moon.

    The technology is based on light detection and ranging (LiDAR) and was created in 1970. It uses a combination of aerial light pulses and GPS to create a 3D map of what lies beneath.

    The surveying method measures the distance to a target by illuminating it, then measuring the reflected pulse with a sensor. A wide variety of fields, including geography, geology, seismology, and geomatics, already use LiDAR technology.

    This has led to some major discoveries, including the Mexican city of Angamuco. Although archaeologists had been aware of its existence, the city had been covered in layers of lava, making it harder to spot. There was no way of knowing how dense the city was.


    LiDAR used to build 3D models. (Image: Wikipedia)

    Discovering the city of Angamuco and others with LiDAR

    According to Chris Fisher, an archaeologist at Colorado State University, uncovering Angamuco has been a true achievement. Using LiDAR, the team noticed that the city extended over 26 km².

    “That is a huge area with a lot of people and a lot of architectural foundations that are represented,” Fisher explained.

    “If you do the maths, all of a sudden you are talking about 40,000 building foundations up there, which is the same number of building foundations that are on the island of Manhattan.”

    He recently presented his findings at the 2018 AAAS Annual Meeting. Here, he discussed how LiDAR was used to uncover the ruins of yet another city in Honduras.

    Using the technology, the team has now verified over 7,000 architectural features spanning an area of 4 km².

    Although archaeologists still have to do the dirty work on the ground, LiDAR offers tremendous future potential.

    “Everywhere you point the LiDAR instrument, you find new stuff, and that is because we know so little about the archaeological universe in the Americas right now. Right now every textbook has to be rewritten, and two years from now it’s going to have to be rewritten again,” Fisher says.

    Source: The Guardian


    Website: LINK