Tag: Motion Capture

  • DSLR Motion Capture with Raspberry Pi and OpenCV

    Reading Time: 3 minutes

    One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.

    You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.

    Breaking the remote a bit

    When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”

    Further breaking

    Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. A very simple circuit, using an NPN transistor switched via GPIO, then gave remote control of the camera from Python.
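
    Here's a minimal sketch of the software side of that trigger circuit, assuming the transistor's base is wired (via a resistor) to GPIO 17 and using the gpiozero library; the pin number and pulse length are illustrative, not taken from David's build:

    ```python
    # Pulse a GPIO pin to "press" the remote's shutter button via the transistor.
    from time import sleep
    from gpiozero import DigitalOutputDevice

    shutter = DigitalOutputDevice(17)  # drives the NPN transistor's base (assumed pin)

    def take_picture(pulse=0.2):
        """Close the shutter contact just long enough for the remote to register."""
        shutter.on()
        sleep(pulse)
        shutter.off()

    if __name__ == "__main__":
        take_picture()
    ```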

    Raspberry Pi on the right, working together with the remote control’s innards on the left

    David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.

    He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.
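
    A rough sketch of what that hack might look like, using frame differencing in the spirit of the PyImageSearch tutorial together with the take_picture() helper above; the thresholds, camera index, pin, and delay values are all illustrative assumptions:

    ```python
    # Motion-trigger loop: compare each frame against a reference background
    # and fire the GPIO "shutter" when enough pixels change.
    import time
    import cv2
    from gpiozero import DigitalOutputDevice

    shutter = DigitalOutputDevice(17)

    def take_picture(pulse=0.2):
        shutter.on()
        time.sleep(pulse)
        shutter.off()

    time.sleep(10)  # startup delay: time to get yourself in front of the camera

    cap = cv2.VideoCapture(0)
    _, first = cap.read()
    background = cv2.GaussianBlur(cv2.cvtColor(first, cv2.COLOR_BGR2GRAY), (21, 21), 0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        delta = cv2.absdiff(background, gray)
        thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
        if cv2.countNonZero(thresh) > 5000:  # "motion" = enough changed pixels
            take_picture()
            time.sleep(2)  # settle time so one movement doesn't fire a burst
    ```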

    David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”

    The whole setup mounted on a tripod ready to play

    Here are some of the visuals captured by this Raspberry Pi-powered project…

    Take a look at some more of David’s projects over at Pi & Chips.

  • Track your cat’s activity with a homemade speedometer

    Reading Time: 3 minutes

    Firstly, hamster wheels for cats are (still) a thing. Secondly, Bengal cats run far. And Shawn Nunley on Reddit is the latest to hit on this solution for kitty exercise and bonus cat stats.

    Here is the wheel itself. That part was shop-bought. (Apparently it’s a ZiggyDoo Ferris Cat Wheel.)

    Smol kitty in big wheel

    Shawn has created a speedometer that tracks distance and speed. Every time a magnet mounted on the wheel passes a fixed sensor, a Raspberry Pi Zero writes to a log file so he can see how far and fast his felines have travelled. The wheel has six magnets, so each pulse records 2.095 ft of travel. This project revealed the cats do about 4-6 miles per night on their wheel, and they reach speeds of 14 miles an hour.

    Here’s your shopping list:

    • Raspberry Pi
    • Reed switch (Shawn got these)
    • Jumper wires
    • Ferris cat wheel

    The tiny white box sticking out at the base of the wheel is the sensor

    Shawn soldered a 40-pin header to his Raspberry Pi Zero and used jumper wires to connect to the sensor. He mounted the sensor to the cat wheel using hot glue and a pill box cut in half, which provided the perfect offset so it could accurately detect the magnets passing by. The code is written in Python.
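
    As a rough illustration, the pulse-counting and logging logic might look something like this, assuming the reed switch is read as a gpiozero Button on GPIO 4; only the 2.095 ft-per-pulse figure comes from the write-up, while the pin, log path, and names are assumptions:

    ```python
    # Log distance and speed each time a magnet closes the reed switch.
    import time
    from signal import pause
    from gpiozero import Button

    FT_PER_PULSE = 2.095           # one sixth of the wheel's circumference
    reed = Button(4, pull_up=True) # reed switch between GPIO 4 and ground (assumed pin)
    last_pulse = None

    def on_pulse():
        global last_pulse
        now = time.time()
        if last_pulse is not None:
            ft_per_s = FT_PER_PULSE / (now - last_pulse)
            mph = ft_per_s * 3600 / 5280  # at 14 mph (~20.5 ft/s), a pulse every ~0.1 s
            with open("wheel.log", "a") as log:
                log.write(f"{now:.2f}\t{FT_PER_PULSE:.3f} ft\t{mph:.1f} mph\n")
        last_pulse = now

    reed.when_pressed = on_pulse  # fires on each magnet pass

    pause()  # wait for pulses until interrupted
    ```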

    Upcoming improvements include adding RFID so the wheel can distinguish between the cats in this two-kitty household.

    Shawn also plans to calculate how much energy the Bengals are expending, and he’ll soon be connecting the Raspberry Pi to his Google Cloud Platform account so you can all keep up with the cats’ stats.

    The stats are currently available only locally

    And, get this – this was Shawn’s first ever time doing anything with Raspberry Pi or Python. OK, so as an ex-programmer he had a bit of a head start, but he assures us he hasn’t touched the stuff since the 1990s. He explains: “I was totally shocked at how easy it was once I figured out how to get the Raspberry Pi to read a sensor.” Start to finish, the project took him just one week.

  • VIVE & Motion Workshop Bring Full-Body Interaction To VR

    Reading Time: 2 minutes

    Fast, high-quality mocap animation at 400 fps. Brought to you by VIVE Tracker & Shadow® Motion Capture System.

    Erik Bakke is the co-founder of Motion Workshop, a Seattle-based business that focuses on motion sensor technology and motion capture. We invited him to sit down and discuss his signature Shadow Motion Capture System and what he was able to accomplish with VIVE.

    Video: https://www.youtube.com/watch?v=cWBX6aE4qeA

    Dancer Catriona Urquhart, equipped with the Shadow Motion Capture System and 3 VIVE Trackers, has her movement livestreamed into characters in Unreal Engine.

    Production of animation content has traditionally involved very expensive motion capture setups using dozens of cameras. This limited high-quality animation production to large studios with big budgets. In the past few years, advances in tracking technology have brought excellent quality animation production to smaller budget projects, indie game and film studios, and independent freelance animators.

    Motion Workshop offers a new hybrid tracking system using their Shadow full-body mocap system and HTC VIVE Trackers. The Shadow mocap system provides excellent, fast full-body animation at up to 400 fps with built-in VIVE Tracker support. The Shadow/VIVE setup delivers drift-free position tracking and smooth, accurate full-body joint animation. Plus, VIVE Trackers can be added for virtual camera, prop, and object tracking.

    With the included plugins you can livestream into Unreal Engine and Unity game engines. Character animation, props, and cameras are all available in one easy-to-use data stream inside the game engine.

    In a recent mocap shoot with dancer Catriona Urquhart from Cornish College of the Arts, we used the Shadow mocap system with 3 VIVE Trackers to livestream the dancer’s motions onto a AAA game character in Unreal Engine. We started with the free Paragon characters from the Unreal Marketplace and retargeted the animation in real time using Autodesk MotionBuilder. The results were composited and rendered in near-real time using the Composure tool in Unreal Engine.

    Video: https://www.youtube.com/watch?v=338xxSMqMoo

    This setup is ideal for virtual production and doesn’t require a large team or a dedicated mocap stage to create great content. For very small teams, these tools are helpful in getting animation content and rendering/compositing arranged in one place for quick and reliable delivery.

    ***

    Shadow Mocap Plugin
    Shadow Mocap plugins are available for free on the Unreal Marketplace and the Unity Asset Store. The Shadow Mocap system is available for purchase at motionshadow.com.
