Tag: conservation

  • Ultrasonically detect bats with Raspberry Pi

    Reading Time: 3 minutes

    Welcome to October, the month in which spiderwebs become decor and anything vaguely gruesome is considered ‘seasonal’. Such as bats. Bats are in fact cute, furry creatures, but as they are part of the ‘Halloweeny animal’ canon, I have a perfect excuse to sing their praises.

    baby bats in a row wrapped up like human babies
    SEE? Baby bats wrapped up cute like baby humans

    Tegwyn Twmffat was tasked with doing a bat survey on a derelict building, and they took to DesignSpark to share their Raspberry Pi–powered solution.

    UK law protects nesting birds and roosting bats, so before you go knocking buildings down, you need a professional to check that no critters will be harmed in the process.

    The acoustic signature of an echo-locating brown long-eared bat

    The problem with bats, compared to birds, is they are much harder to spot and have a tendency to hang out in tiny wall cavities. Enter this big ultrasonic microphone.

    Raspberry Pi 4 Model B provided the RAM needed for this build

    After the building was declared safely empty of bats, Tegwyn decided to keep hold of the expensive microphone (the metal tube in the image above) and have a crack at developing their own auto-classification system to detect which type of bats are about.

    How does it work?

    The ultrasonic mic picks up the audio data using an STM M0 processor and streams it to Raspberry Pi via USB. Raspberry Pi runs the ALSA driver software and uses a bash script to receive the data.
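
    As a rough illustration of the capture step, here is a minimal Python sketch that builds the kind of ALSA `arecord` command line such a setup might use. The device name, 384 kHz sample rate, and filename are assumptions for illustration, not Tegwyn’s actual settings — check `arecord -l` for the real card/device on your system.

```python
# Sketch of capturing ultrasonic audio from a USB microphone via ALSA.
# The device name and 384 kHz sample rate are illustrative assumptions;
# run `arecord -l` to find the actual card/device numbers.
import subprocess

def build_arecord_command(device="hw:1,0", rate=384000, seconds=10,
                          outfile="bats.wav"):
    """Return the arecord argv list for a raw ultrasonic capture."""
    return [
        "arecord",
        "-D", device,        # ALSA capture device (the USB mic)
        "-f", "S16_LE",      # 16-bit little-endian samples
        "-r", str(rate),     # sample rate high enough for bat calls
        "-c", "1",           # mono
        "-d", str(seconds),  # capture duration in seconds
        outfile,
    ]

if __name__ == "__main__":
    cmd = build_arecord_command()
    print(" ".join(cmd))
    # On a Pi with the mic attached you would actually run it:
    # subprocess.run(cmd, check=True)
```

    Bat calls sit well above the ~20 kHz ceiling of human hearing, which is why the sample rate has to be so much higher than for ordinary audio.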

    Tegwyn turned to the open-source GTK software to process the audio data

    It turns out there are no publicly available audio recordings of bats, so Tegwyn took to their own back garden and found 6 species to record. And with the help of a few other bat enthusiasts, they cobbled together an audio dataset of 9 of the 17 bat species found in the UK!

    Tegwyn’s original post about their project features a 12-step walkthrough, as well as all the code and commands you’ll need to build your own system. And here’s the GitHub repository, where you can check for updates.

    Website: LINK

  • Raspberry Pi listening posts ‘hear’ the Borneo rainforest

    Reading Time: 2 minutes

    These award-winning, solar-powered audio recorders, built on Raspberry Pi, have been installed in the Borneo rainforest so researchers can listen to the local ecosystem 24/7. The health of a forest ecosystem can often be gauged by how much noise it creates, as this signals how many species are around.
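
    The “more noise means more activity” idea can be sketched as a toy acoustic index: split the signal into windows and compute a loudness (RMS) level per window. This is a pure-Python illustration with made-up sample values, not the SAFE project’s actual analysis pipeline, which uses far richer acoustic indices.

```python
# Toy acoustic-activity index: root-mean-square level per fixed window.
# Illustrative only; real soundscape analysis uses recorded audio and
# more sophisticated indices than plain RMS.
import math

def rms_levels(samples, window=4):
    """Split samples into windows and return the RMS level of each."""
    levels = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        levels.append(math.sqrt(sum(x * x for x in chunk) / window))
    return levels

if __name__ == "__main__":
    quiet = [0.0, 0.1, -0.1, 0.0]   # sparse soundscape
    loud = [0.9, -0.8, 0.7, -0.9]   # busy dawn chorus
    print(rms_levels(quiet + loud, window=4))
```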

    And you can listen to the rainforest too! The SAFE Acoustics website, funded by the World Wide Fund for Nature (WWF), streams audio from recorders placed around a region of the Bornean rainforest in Southeast Asia. Visitors can listen to live audio or skip back through the day’s recording, for example to listen to the dawn chorus.

    Listen in on the Imperial College podcast

    What’s inside?

    The device records data in the field and uploads it to a central server continuously and robustly over long periods. And it was built for around $305.

    Here’s all the code for the platform, on GitHub.

    The 12V-to-5V micro USB converter connects to the power socket of the Anker USB hub, which is connected to Raspberry Pi.

    The Imperial College London team behind the project has provided really good step-by-step photo instructions for anyone interested in the fine details.

    Here’s the full set up in the field. The Raspberry Pi-powered brains of the kit are safely inside the green box

    The recorders have been installed by Imperial College London researchers as part of the SAFE Project – one of the largest ecological experiments in the world.

    Dr Sarab Sethi designed the audio recorders with Dr Lorenzo Picinali. They wanted to quantify the changes in rainforest soundscape as land use changes, for example when forests are logged. Sarab is currently working on algorithms to analyse the gathered data with Dr Nick Jones from the Department of Mathematics.

    The lovely cross-disciplinary research team based at Imperial College London

    Let the creators of the project tell you more on the Imperial College London website.

    Website: LINK

  • Clean up the planet with awesome robot arms in Trash Rage from Giant Lazer

    Reading Time: 8 minutes

    VR has the power to educate as well as entertain, but designing experiences that do both successfully is easier said than done. Luckily, the team over at Giant Lazer were more than up to the task when they created the sci-fi arcade experience Trash Rage. Tasked with cleaning up a planet ravaged with pollution and waste, you’ll use super cool robot arms to frantically sort a bevy of junk on your quest for a cleaner world and a higher score. 

    We sat down with the head of Giant Lazer to hear about what went into this addictive and enlightening experience.

    Interview by Nathan Allen Ortega, Viveport Staff

    [youtube https://www.youtube.com/watch?v=OlWQU5TxdBE?feature=oembed&wmode=opaque&w=730&h=411]

    For those unfamiliar, tell us a bit about yourself and your team – your background in VR and game development and so on.

    Jakub Korczynski: I am the CEO at Giant Lazer. I have a technical, but also musical background. I worked on more than 25 VR/AR projects using various hardware setups and special features ranging from haptic gloves to A.I.

    My team consists of people with a wide range of skills. Together we have built the first educational VR game in Poland, and we are the creators of Industry XR – a platform for easily deploying VR and AR in Industry 4.0. We like to get creative with VR; that is why we love doing different types of projects: for education, industry, or gaming.

    The first title that we decided to produce by ourselves and self-publish is Trash Rage – the environmental education VR game. The core Trash Rage team consists of nine people: Anita, the cosplaying concept artist and developer; Ozi, the programming whizz; Mahrcheen, 3D graphic artist and animator; Sebastian, our sound designer; Matt, our UX designer; Adam and Andrew, the photo/video crew; and Ania, who worked on organizing Trash Rage Game Days and marketing.

    Trash Rage is a post-apocalypse set arcade experience about the impact humans have on the environment – was making the game educational as well as fun a goal from the outset?

    Yes, it was our goal from day one. We wanted to create something that isn’t a typical educational game. Those tend to be very literal and, as a result, not much fun. We wanted to take people somewhere, to tell a story, and to make them sweat a little – the game is fast-paced and can get intense.

    What were your biggest sources of inspiration when crafting the world of Trash Rage?

    Our inspirations came from many sources. I would certainly highlight Isle of Dogs – a Wes Anderson film. Others include: Love, Death And Robots: Three Robots, Elysium, Ready Player One, WALL-E, Suisei no Gargantia, Blade Runner 2049, The 100 and Gunnm: Battle Angel Alita. All these were strong visual inspirations.

    Most of these are considered post-apocalyptic fiction, where the main culprit is a rogue A.I., genetically modified cats, or global (nuclear) war.

    In Trash Rage we did include A.I., but the main theme is environmental destruction and trashing of the planet. The humans get a second chance after the A.I. goes haywire and destroys itself. In this post-post-apocalypse world it’s a fight for resources and a fight for survival.

    I should also mention other inspirations like The Machine Stops by E. M. Forster and Brave New World by Aldous Huxley. We were also motivated by things such as the less-waste and zero-waste communities and Extinction Rebellion.

    Were there any particular design challenges that your team encountered that you didn’t expect?

    One such challenge was overcoming the limits of human perception. It quickly occurred to us that the mechanic we had envisioned was generally too hard for people. Most of them, trying VR for the first time, didn’t have the coordination and memory required to play Trash Rage. It was just too overwhelming. We had to tune things down a bit and do a lot of testing before we achieved a good balance for the game. Some people say “it’s very easy to play”. It is! But this required many hours of trial and error to achieve. We’re proud of that characteristic – anybody can learn to play in seconds. Of course, the game is hard to master. It requires a lot of concentration to stay focused till the end. We are yet to see people achieve the scores we are able to, knowing the game mechanics inside out.

    What are your thoughts on the role VR and other emerging innovative technology can play in regards to educating people – especially in regards to preserving our world?

    Stanford research has shown that VR is an excellent tool for learning. They also proved that learning about environmental issues is a great VR use case.

    I think that VR certainly has a future in education. When it comes to environmental education Virtual Reality can let people experience the consequences of our actions first-hand. This is in contrast to just reading some news or watching a video. We can learn by experience and really see our impact on the planet.

    What kind of research did you do to prepare to make this ecologically minded arcade experience?

    We first reached out to companies that work in the recycling sector. They helped us clear up common myths about recycling. We learned that unfortunately recycling isn’t the answer to all humanity’s waste problems. It works great in some cases, like aluminium and glass. Other types of trash, like plastic, can’t be recycled efficiently, so moving to less waste, or even zero waste, is the real solution to reducing the amount of plastic in our environment.

    We even made a short video about how many single-use plastic bags one might use during one visit to the grocery store. It went viral in Poland.

    [youtube https://www.youtube.com/watch?v=WUV9SydDvYo?feature=oembed&wmode=opaque&w=730&h=411]

    We later talked with environmental educators and also managed to get feedback from the Polish Ministry of Environment about the recycling scheme we used in the game. It turned out that the regulations and what is often implemented locally differ. That is why we are planning to upgrade the game with an editor for educators. They will be able to adjust the trash sorting rules to their local regulations. With this upgrade it will be super easy to use the app for educational purposes all around the world.

    Making something that people will want to play time and time again with lots of mechanical depth is no easy task. What was the design process like to craft this addictive and satisfying arcade experience?

    We had some previous experience with a “First Person Catcher” (FPC) mechanic from our production Pack Rage. It was an educational game about symbols used for logistic packaging of dangerous goods. It was our first commercial VR game for WSB University and the first educational VR game in Poland too. To make the mechanic work for Trash Rage, we had to rebuild the game from scratch, but we used Pack Rage as a prototype for testing new gameplay upgrades.

    We also managed to make some stuff a lot better. For example the Blob. The Blob is a bucket-sized piece of car oil goo that makes it harder for the player to see. In the first version this was something that blocked your sight. After tests with users it turned out that it wasn’t a good VR experience. Something sticking to your face makes you want to take your headset off. Finally it ended up being an LSD-like effect that changes the color of the world around you, so that it’s super hard to keep scoring points.

    Have there been any surprising bits of feedback from players since launching that you didn’t expect?

    A lot of very funny ones for sure. Some of this feedback is related to people trying VR for the first time. Besides that, we got a lot of love, though there were also those who didn’t like it. Because this isn’t your everyday zombie-shooter we expected some negative reactions. Fortunately the positive responses outweigh the negative ones, which keeps us motivated to keep pushing further. The best responses we got were from people that not only loved the game, but also were thrilled by our effort to educate about environmental issues using VR. That was the best feedback!

    What would you like to see from the VR ecosystem going forward in order to empower you to make even more engaging experiences?

    Easy content distribution, especially in the school setting, is something that would push things forward and would allow us to reach more people with our message. Of course the development of hardware will further allow us to create better experiences for the end user.

    Trash Rage is an Early Access title – what has that journey been like, and how is player feedback helping shape the direction of the game as you update it?

    Before releasing the game, we gathered feedback on the ground during the many events we organised with Trash Rage. We gathered several hundred questionnaires and spoke with players. After the launch we have much more feedback – now worldwide. Based on this new feedback we are modifying our roadmap to better adjust to what the players are saying. What I can say is that it’s very hard to make educational games, especially if you really want to appeal to gamers and still have real educational value.

    One example I can give of responding to player feedback is that we changed the whole slow motion experience. At first it was a SUPERHOT style slow-mo effect. But because it was hard for players to get the hang of, we changed it to more traditional slow motion.

    How long have you and your team been working on this project?

    We have been working on Trash Rage for a year now with some breaks for other work.

    What do you ultimately want players to take away from their time with Trash Rage?

    We want them to have fun first of all, but also to take some time to think about our impact on the planet. It would be great if people considered how simple things that we can do every day can really make a difference. When people open their eyes and see that they have the power to make change in their daily lives that benefits them and their loved ones, they will also start to demand change from business and government. Trash Rage is just a small drop in the sea of educational needs, but it’s a start.

    Beyond updates to Trash Rage, what’s next for Giant Lazer?

    We are currently working on some educational projects like language learning in VR, Japanese business etiquette in 360, a geometry and geography app for school children and an educational app about forest habitats. We are also planning some new cool stuff for Trash Rage beyond just basic game updates. So stay tuned!

    Sounds exciting! Thanks for sharing this with us. 

    Trash Rage is out now on Viveport and Viveport Infinity. Start your free trial today and start warming up your robo recycling arm!

    Website: LINK

  • View Stonehenge in real time via Raspberry Pi

    Reading Time: 4 minutes

    You can see how the skies above Stonehenge affect the iconic stones via a web browser thanks to a Raspberry Pi computer.

    Stonehenge

    Stonehenge is Britain’s greatest monument and it currently attracts more than 1.5 million visitors each year. It’s possible to walk around the iconic stone circle and visit the Neolithic houses outside the visitor centre. Yet, worries about potential damage have forced preservationists to limit access.

    With that in mind, Eric Winbolt, Interim Head of Digital/Innovation at English Heritage, had a brainwave. “We decided to give people an idea of what it’s like to see the sunrise and sunset within the circle, and allow them to enjoy the skies over Stonehenge in real time without actually stepping inside,” he explains.

    This could have been achieved by permanently positioning a camera within the stone circle, but this was ruled out for fear of being too intrusive. Instead, Eric and developers from The Bespoke Pixel agency snapped a single panoramic shot of the circle’s interior using a large 8K high-res, 360-degree camera when the shadows and light were quite neutral.

    “We then took the sky out of the image with the aim of capturing an approximation of the view without impacting on the actual stones themselves,” Eric says.

    Stone me

    By taking a separate hemispherical snapshot of the sky from a nearby position and merging it with the master photograph of the stones, the team discovered they could create a near real-time effect for online visitors. They used an off-the-shelf, upwards-pointing, 220-degree fish-eye lens camera connected to a Raspberry Pi 3 Model A+ computer, taking images once every four minutes.

    This Raspberry Pi was also fitted with a Pimoroni Enviro pHAT containing atmospheric, air pressure, and light sensors. Captured light values from the sky image were then used to alter the colour values of the master image of the stones so that the light on Stonehenge, as seen via the web, reflected the ambient light of the sky.
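
    The colour-matching step described above can be sketched as scaling the master image’s RGB values by the ratio of the currently measured sky brightness to the brightness when the master shot was taken. This is a hedged illustration: the reference light value, pixel format, and function names are assumptions, not the English Heritage team’s actual code.

```python
# Sketch of relighting a master image from an ambient light reading.
# The reference value of 400 and the (R, G, B) tuple format are
# illustrative assumptions.
def relight_pixel(pixel, light_now, light_reference=400.0):
    """Scale an (R, G, B) pixel by the ambient light ratio, clamped to 0-255."""
    factor = light_now / light_reference
    return tuple(min(255, max(0, round(c * factor))) for c in pixel)

def relight_image(pixels, light_now, light_reference=400.0):
    """Apply the same light factor to every pixel of the master image."""
    return [relight_pixel(p, light_now, light_reference) for p in pixels]

if __name__ == "__main__":
    stone_grey = [(120, 115, 105)]
    # At dusk the sensor reads lower light, so the stones render darker.
    print(relight_image(stone_grey, light_now=200.0))
```

    Scaling every channel by one global factor keeps the stones’ relative colours intact while tracking the sky’s overall brightness, which is why the result doesn’t look like two images simply composited together.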

    What can you see?

    “What it does is give a view of the stones as it looks right now, or at least within a few minutes,” says Eric. “It also means the effect doesn’t look like two images simply Photoshopped together.”

    Indeed, coder Mark Griffiths says the magic all runs from Node.js. “It uses a Python shell to get the sensor data and integrates with Amazon’s AWS and an IoT messaging service called DweetPro to tie all the events together,” he adds.

    There was also a lot of experimentation. “We used the HAT via the I2C connectors so that we could mount it away from the main board to get better temperature readings,” says Mark. “We also tried a number of experiments with different cameras, lenses, and connections, and it became clear that just connecting the camera via USB didn’t allow access to the full functionality and resolutions.”

    Mark reverse-engineered the camera’s WiFi connection and binary protocol to work out how to communicate with it via Raspberry Pi so that full-quality images could be taken and downloaded. “We also found the camera’s WiFi connection would time out after several days,” reveals Mark, “so we had to use a relay board connected via the GPIO pins.”

    With such issues resolved, the team then created an easy-to-use online interface that lets users click boxes and see the view over the past 24 hours. They also added a computer model to depict the night sky.

    “Visitors can go to the website day and night and allow the tool to pan around Stonehenge or pause it and pan manually, viewing the stones as they would be at the time of visiting,” Eric says. “It can look especially good on a smart television. It’s very relaxing.”

    View the stones in real time right now by visiting the English Heritage website.

    Website: LINK

  • Penguin Watch — Pi Zeros and Camera Modules in the Antarctic

    Reading Time: 2 minutes

    Long-time readers will remember Penguin Lifelines, one of our very favourite projects from back in the mists of time (which is to say 2014 — we have short memories around here).

    Penguins

    Click on penguins for fun and conservation

    Penguin Lifelines was a programme run by the Zoological Society of London, crowdsourcing the tracking of penguin colonies in Antarctica. It’s since evolved into something called Penguin Watch, now working with the World Wildlife Fund (WWF) and British Antarctic Survey (BAS). It’s citizen science on a big scale: thousands of people from all over the world come together on the internet to…click on penguins. By counting the birds in their colonies, users help penguinologists measure changes in the birds’ behaviour and habitat, and in the larger ecosystem, thus assisting in their conservation.

    The penguin people say this about Penguin Watch:

    Some of these colonies are so difficult to get to that they haven’t been visited for 50 years! The images contain unprecedented detail, giving us the opportunity to gather new data on the number of penguins in the region. This information will help us understand how they are being affected by climate change, the potential impact of local fisheries, and how we can help conserve these incredible species.

    Pis in the coldest, wildest place

    And what are those special cameras? The static ones providing time-lapse images are Raspberry Pi Camera Modules, mounted on Raspberry Pi Zeros, and we’re really proud to see just how robust they’ve been in the face of Antarctic winters.

    Alasdair Davies on Twitter

    Success! The @arribada_i timelapse @Raspberry_Pi Zero cameras built for @penguin_watch survived the Antarctic winter! They captured these fantastic photos of a Gentoo penguin rookery for https://t.co/MEzxbqSyc1 #WorldPenguinDay 🐧@helenlynn @philipcolligan https://t.co/M0TK5NLT6G

    These things are incredibly tough. They’re the same cameras that Alasdair and colleagues have been sticking on turtles, at depths of down to 500m; I can’t think of a better set of tests for robustness.

    Want to get involved? Head over to Penguin Watch, and get clicking! We warn you, though — it’s a little addictive.

    Website: LINK

  • Prepare yourself for winter with the help of squirrels

    Reading Time: 3 minutes

    This article from The MagPi issue 72 explores Carsten Dannat’s Squirrel Cafe project and his mission to predict winter weather conditions based on the eating habits of local squirrels. Get your copy of The MagPi in stores now, or download it as a free PDF here.

    The Squirrel Cafe on Twitter

    Squirrel chowed down on 5.0 nuts for 3.16 min at 12:53:18 CEST. An #IoT project to predict how cold it’ll be next winter. #ThingSpeak

    Back in 2012, Carsten Dannat was at a science summit in London, during which a lecture inspired him to come up with a way of finding correlations between nature and climate. “Some people say it’s possible to predict changes in weather by looking at the way certain animals behave,” he tells us. “Perhaps you can predict how cold it’ll be next winter by analysing the eating habits of animals? Do animals eat more to get additional fat and excess weight to be prepared for the upcoming winter?” An interesting idea, and one that Germany-based Carsten was determined to investigate further.

    “On returning home, I got the sudden inspiration to measure the nut consumption of squirrels at our squirrel feeder”, he says. Four years later, his first prototype of The Squirrel Cafe was built, incorporating a first-generation Raspberry Pi.

    A tough nut to crack

    A switch in the feeder’s lid is triggered every time a squirrel opens it. To give visual feedback on how often the lid has been opened, a seven-segment LED display shows the number of openings per meal break. A USB webcam is also used to capture images of the squirrels, which are tweeted automatically, along with stats on the nuts eaten and time taken. Unsurprisingly perhaps, Carsten says that the squirrels are “focussed on nuts and are not showing interest at all in the electronics!”

    The Squirrel Cafe on Twitter

    Squirrel chowed down on 4.5 nuts for 6.60 min at 14:23:55 CEST. An #IoT project to predict how cold it’ll be next winter. #ThingSpeak

    So, how do you know how many nuts have actually been eaten by the squirrels? Carsten explains that “the number of nuts eaten per visit is calculated by counting lid openings. This part of the source code has been reworked a couple of times to adjust to the squirrels’ behaviour while grabbing a nut out of the feeder. A nut isn’t always taken out of the feeder, even if the lid has been opened.” Carsten makes the assumption that if the lid hasn’t been opened for at least 90 seconds, the squirrel went away. “I’m planning to improve the current design by implementing a scale to weigh the nuts themselves to get a more accurate measurement of nut consumption,” he says.
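
    The counting rule described here – lid openings belong to one visit until 90 seconds pass with no activity – can be sketched in a few lines of Python. The function and variable names are illustrative, not Carsten’s actual source code.

```python
# Sketch of grouping lid-opening timestamps (in seconds) into visits:
# a gap of more than 90 seconds between openings ends the current visit.
VISIT_TIMEOUT = 90  # seconds of inactivity after which the squirrel "left"

def group_visits(opening_times, timeout=VISIT_TIMEOUT):
    """Group sorted lid-opening timestamps into per-visit opening counts."""
    visits = []
    current = 0
    last = None
    for t in opening_times:
        if last is not None and t - last > timeout:
            visits.append(current)  # gap too long: close the visit
            current = 0
        current += 1
        last = t
    if current:
        visits.append(current)
    return visits

if __name__ == "__main__":
    openings = [0, 20, 45, 200, 230, 1000]
    print(group_visits(openings))  # three visits of 3, 2, and 1 openings
```

    As the article notes, openings are only a proxy for nuts eaten, which is why a weighing scale would give a more accurate measurement.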

    Squirrel Cafe Raspberry Pi The MagPi

    Just nuts about the weather!

    The big question, of course, is what does this all tell us about the weather? Well, this is a complicated area too, as Carsten illustrates: “There are a lot of factors to consider if you want to find a correlation between eating habits and the prediction of the upcoming winter weather. One of them is that I cannot differentiate between individual squirrels currently [in order to calculate overall nut consumption per squirrel].” He suggests that one way around this might be to weigh the individual squirrels in order to know exactly who is visiting the Cafe, with what he intriguingly calls “individual squirrel recognition” — a planned improvement for a future incarnation of The Squirrel Cafe. Fine-tuning of the system aside, Carsten’s forecast for the winter of 2017/18 was spot-on when he predicted, via Twitter, a very cold winter compared to the previous year. He was proven right, as Germany experienced its coldest winter since 2012. Go squirrels!

    Follow The Squirrel Cafe

    Track the eating habits of the squirrels through some utterly adorable photos on The Squirrel Cafe Twitter account, and learn more about the project on The Squirrel Cafe website.

    Website: LINK

  • Protecting coral reefs with Nemo-Pi, the underwater monitor

    Reading Time: 3 minutes

    The German charity Save Nemo works to protect coral reefs, and they are developing Nemo-Pi, an underwater “weather station” that monitors ocean conditions. Right now, you can vote for Save Nemo in the Google.org Impact Challenge.

    Nemo-Pi — Save Nemo

    Save Nemo

    The organisation says there are two major threats to coral reefs: divers, and climate change. To make diving safer for reefs, Save Nemo installs buoy anchor points where diving tour boats can anchor without damaging corals in the process.

    In addition, they provide dos and don’ts for how to behave on a reef dive.

    The Nemo-Pi

    To monitor the effects of climate change, and to help divers decide whether conditions are right at a reef while they’re still on shore, Save Nemo is also in the process of perfecting Nemo-Pi.

    Nemo-Pi schematic — Nemo-Pi — Save Nemo

    This Raspberry Pi-powered device is made up of a buoy, a solar panel, a GPS device, a Pi, and an array of sensors. Nemo-Pi measures water conditions such as current, visibility, temperature, carbon dioxide and nitrogen oxide concentrations, and pH. It also uploads its readings live to a public webserver.
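
    To make the upload step concrete, here is a hedged sketch of how one round of readings might be bundled for a web server. The field names, units, and structure are assumptions for illustration – the article only says that readings are uploaded live to a public webserver.

```python
# Illustrative sketch of packaging buoy sensor readings for upload.
# Field names and units are assumptions, not Save Nemo's actual schema.
import json
import time

def build_payload(temperature_c, ph, visibility_m, lat, lon):
    """Bundle one round of sensor readings with a timestamp and position."""
    return {
        "timestamp": int(time.time()),
        "position": {"lat": lat, "lon": lon},
        "readings": {
            "temperature_c": temperature_c,
            "ph": ph,
            "visibility_m": visibility_m,
        },
    }

if __name__ == "__main__":
    payload = build_payload(29.4, 8.1, 12.0, 7.74, 98.78)
    print(json.dumps(payload, indent=2))
    # On the buoy, this JSON could then be POSTed to the server,
    # e.g. with urllib.request from the standard library.
```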

    The Save Nemo team is currently doing long-term tests of Nemo-Pi off the coast of Thailand and Indonesia. They are also working on improving the device’s power consumption and durability, and testing prototypes with the Raspberry Pi Zero W.

    web dashboard — Nemo-Pi — Save Nemo

    The web dashboard showing live Nemo-Pi data

    Long-term goals

    Save Nemo aims to install a network of Nemo-Pis at shallow reefs (up to 60 metres deep) in South East Asia. Then diving tour companies can check the live data online and decide day-to-day whether tours are feasible. This will lower the impact of humans on reefs and help the local flora and fauna survive.

    Coral reefs with fishes

    A healthy coral reef

    Nemo-Pi data may also be useful for groups lobbying for reef conservation, and for scientists and activists who want to shine a spotlight on the awful effects of climate change on sea life, such as coral bleaching caused by rising water temperatures.

    Bleached coral

    A bleached coral reef

    Vote now for Save Nemo

    If you want to help Save Nemo in their mission today, vote for them to win the Google.org Impact Challenge:

    1. Head to the voting web page
    2. Click “Abstimmen” in the footer of the page to vote
    3. Click “JA” in the footer to confirm

    Voting is open until 6 June. You can also follow Save Nemo on Facebook or Twitter. We think this organisation is doing valuable work, and that their projects could be expanded to reefs across the globe. It’s fantastic to see the Raspberry Pi being used to help protect ocean life.

    Website: LINK

  • Journeying with green sea turtles and the Arribada Initiative

    Reading Time: 4 minutes

    Today, a guest post: Alasdair Davies, co-founder of Naturebytes, ZSL London’s Conservation Technology Specialist and Shuttleworth Foundation Fellow, shares the work of the Arribada Initiative. The project uses the Raspberry Pi Zero and camera module to follow the journey of green sea turtles. The footage captured from the backs of these magnificent creatures is just incredible – prepare to be blown away!

    Pit Stop Camera on Green Sea Turtle 01

    Footage from the new Arribada PS-C (pit-stop camera) video tag recently trialled on the island of Principe in unison with the Principe Trust. Engineered by Institute IRNAS (http://irnas.eu/) for the Arribada Initiative (http://blog.arribada.org/).

    Access to affordable, open and customisable conservation technologies in the animal tracking world is often limited. I’ve been a conservation technologist for the past ten years, co-founding Naturebytes and working at ZSL London Zoo, and this was a problem that continued to frustrate me. It was inherently expensive to collect valuable data that was necessary to inform policy, to designate marine protected areas, or to identify threats to species.

    In March this year, I got a supercharged opportunity to break through these barriers by becoming a Shuttleworth Foundation Fellow, meaning I had the time and resources to concentrate on cracking the problem. The Arribada Initiative was founded, and ten months later, the open source Arribada PS-C green sea turtle tag was born. The video above was captured two weeks ago in the waters of Principe Island, West Africa.

    Alasdair Davies on Twitter

    On route to Principe island with 10 second gen green sea #turtle tags for testing. This version has a video & accelerometer payload for behavioural studies, plus a nice wireless charging carry case made by @institute_irnas @ShuttleworthFdn

    The tag comprises a Raspberry Pi Zero W sporting the Raspberry Pi camera module, a PiRA power management board, two lithium-ion cells, and a rather nice enclosure. It was built in unison with Institute IRNAS, and there’s a nice user-friendly wireless charging case to make it easy for the marine guards to replace the tags after their voyages at sea. When a tag is returned to one of the docking stations in the case, we use resin.io to manage it, download videos, and configure the tag remotely.

    The tags can also be configured to take video clips at timed intervals, meaning we can now observe the presence of marine litter and plastic debris, before/after changes to the ocean environment due to nearby construction, pollution, and other threats.

    Discarded fishing nets are lethal to sea turtles, so using this new tag at scale – now finally possible, as the Raspberry Pi Zero helps to drive down costs dramatically whilst retaining excellent video quality – offers real value to scientists in the field. Next year we will be releasing an optimised, affordable GPS version.

    green sea turtle Alasdair Davies Raspberry Pi Arribada Initiative

    To make this all possible we had to devise a quicker method of attaching the tag to the sea turtles too, so we came up with the “pit-stop” technique (which is what the PS in the name “Arribada PS-C” stands for). Just as a Formula 1 car would visit the pits to get its tyres changed, we literally switch out the tags on the beach when nesting females return, replacing them with freshly charged tags by using a quick-release base plate.

    Alasdair Davies on Twitter

    About 6 days left now until the first tagged nesting green sea #turtles return using our latest “pit-stop” removeable / replaceable tag method. Counting down the days @arribada_i @institute_irnas

    To implement the system we first epoxy the base plate to the turtle, which minimises any possible stress to the turtles as the method is quick. Once the epoxy has dried we attach the tag. When the turtle has completed its nesting cycle (they visit the beach to lay eggs three to four times in a single season, every 10–14 days on average), we simply remove the base plate to complete the field work.

    If you’d like to watch more wonderful videos of the green sea turtles’ adventures, there’s an entire YouTube playlist available here. And to keep up to date with the initiative, be sure to follow Arribada and Alasdair on Twitter.

    Website: LINK