Tag: Gesture-Controlled Drone

  • Use the Nano 33 BLE Sense’s IMU and gesture sensor to control a DJI Tello drone

    Reading Time: 2 minutes

    Arduino Team, September 8th, 2021

    Piloting a drone with something other than a set of virtual joysticks on a phone screen is exciting due to the endless possibilities. DJI’s Tello makes this possible, as it has a simple Python API that exposes basic commands such as taking off, landing, and moving within a horizontal plane. Soham Chatterjee built a system that takes advantage of two sensors within the Arduino Nano 33 BLE Sense’s onboard suite, namely the APDS-9960 and the LSM9DS1 IMU.

    He started this endeavor by creating two simple programs that ran on the BLE Sense. The first initializes the APDS-9960 to detect gestures, which then sends strings like “detected DOWN gesture” via the USB port to a host machine. The second program checks if the IMU has gone over a certain threshold in a single direction and relays a corresponding string if it has. 
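    The IMU-threshold logic described above can be sketched in Python (the actual program is an Arduino sketch in C++; the 0.5 g threshold, axis mapping, and string formats here are illustrative assumptions):

    ```python
    from typing import Optional

    THRESHOLD = 0.5  # hypothetical tilt threshold, in g

    def tilt_direction(ax: float, ay: float) -> Optional[str]:
        """Map accelerometer readings to a direction string, or None if level."""
        if ax > THRESHOLD:
            return "detected TILT FORWARD"
        if ax < -THRESHOLD:
            return "detected TILT BACKWARD"
        if ay > THRESHOLD:
            return "detected TILT RIGHT"
        if ay < -THRESHOLD:
            return "detected TILT LEFT"
        return None
    ```

    On the Arduino, an equivalent check runs in the main loop, printing the string over the USB serial port whenever a threshold is crossed.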

    A Raspberry Pi runs one of two Python scripts that read the incoming data from the Arduino and convert it into movements. For example, a gesture in the ‘DOWN’ direction lands the Tello drone, whereas tilting the board forward moves the drone 50 cm forward. As an added safety feature, the drone automatically lands after 60 seconds, although the Python script can be modified to prevent this behavior.
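    The translation step in those scripts can be sketched as a simple lookup from incoming strings to Tello API calls. The method names below follow the djitellopy library (tello.land(), tello.move_forward(cm)); the exact strings sent by the Arduino are illustrative assumptions:

    ```python
    def command_for(line: str):
        """Map a line from the Arduino to a (Tello method name, args) pair."""
        if "DOWN" in line:
            return ("land", ())
        if "UP" in line:
            return ("takeoff", ())
        if "FORWARD" in line:
            return ("move_forward", (50,))  # 50 cm, as in the write-up
        if "BACKWARD" in line:
            return ("move_back", (50,))
        return None  # ignore unrecognized lines
    ```

    With pyserial, the script would read lines from the Arduino’s port (e.g. serial.Serial("/dev/ttyACM0", 9600)) and dispatch each match via getattr(tello, method)(*args).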

    To read more about how Chatterjee constructed his drone system, you can view his first APDS-9960-based project here and the second IMU-controlled tutorial here.

    Video: https://www.youtube.com/watch?v=qaPVjRupbrg

  • Light painting with a gesture-controlled drone

    Reading Time: < 1 minute

    Arduino Team, October 9th, 2020

    Researchers at the Skolkovo Institute of Science and Technology (Skoltech) in Moscow, Russia have come up with a novel way to interface with a drone via hand movements.

    As shown in the video below, the device can be used to create long-exposure light art, though the possibilities for such an intuitive control system could extend to many other applications as well.

    In this setup, a small Crazyflie 2.0 quadcopter is operated by a glove-like wearable featuring an Arduino Uno, along with an IMU and flex sensor for user input, and an XBee module for wireless communication. The controller connects to a base station running a machine learning algorithm that matches the user’s gestures to predefined letters or patterns, and directs the drone to light paint them.
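    The post does not spell out the matching algorithm, but the idea of matching a gesture’s sensor trace to a stored letter can be illustrated with a toy nearest-centroid lookup (the feature vectors below are invented for illustration):

    ```python
    import math

    # Invented per-letter templates, standing in for feature vectors
    # (e.g. averaged IMU and flex-sensor readings) recorded during training.
    TEMPLATES = {
        "S": [0.9, 0.1, 0.4],
        "K": [0.2, 0.8, 0.5],
        "O": [0.5, 0.5, 0.9],
    }

    def match_letter(features):
        """Return the letter whose template is nearest in Euclidean distance."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(TEMPLATES, key=lambda letter: dist(TEMPLATES[letter], features))
    ```

    The matched letter would then be handed to the drone’s trajectory planner to trace out for the long-exposure photograph.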

    Video: https://www.youtube.com/watch?v=SdnIqLjtGeU

    The team’s full research paper is available here.
