Schlagwort: Raspberry Pi Cameras

  • Machine Learning made easy with Raspberry Pi, Adafruit and Microsoft

    Machine Learning made easy with Raspberry Pi, Adafruit and Microsoft

    Reading Time: 3 minutes

    Machine learning can sound daunting even for experienced Raspberry Pi hobbyists, but Microsoft and Adafruit Industries are determined to make it easier for everyone to have a go. Microsoft’s Lobe tool takes the stress out of training machine learning models, and Adafruit have developed an entire kit around their BrainCraft HAT, featuring Raspberry Pi 4 and a Raspberry Pi Camera, to get your own machine learning project off to a flying start.

    adafruit lobe kit
    Adafruit developed this kit especially for the BrainCraft HAT to be used with Microsoft Lobe on Raspberry Pi

    Adafruit’s BrainCraft HAT

    Adafruit’s BrainCraft HAT fits on top of Raspberry Pi 4 and makes it really easy to connect hardware and debug machine learning projects. The 240 x 240 colour display screen also lets you see what the camera sees. Two microphones allow for audio input, and access to the GPIO means you can connect things like relays and servos, depending on your project.

    Adafruit’s BrainCraft HAT in action detecting a coffee mug

    Microsoft Lobe

    Microsoft Lobe is a free tool for creating and training machine learning models that you can deploy almost anywhere. The hardest part of machine learning is arguably creating and training a new model, so this tool is a great way for newbies to get stuck in, as well as being a fantastic time-saver for people who have more experience.

    Get started with one of the three tutorials (easy, medium, and hard) featured in the lobe-adafruit-kit GitHub repository.

    This is just a quick snippet of Microsoft’s full Lobe tutorial video.
    Look how quickly the tool takes enough photos to train a machine learning model

    ‘Bakery’ identifies and prices different pastries

    Lady Ada demonstrated Bakery: a machine learning model that uses an Adafruit BrainCraft HAT, a Raspberry Pi camera, and Microsoft Lobe. Watch how easy it is to train a new machine learning model in Microsoft Lobe from this point in the Microsoft Build Keynote video.

    [youtube https://www.youtube.com/watch?v=O6KdTTdioTY?feature=oembed&w=500&h=281]

    A quick look at Bakery from Adafruit’s delightful YouTube channel

    Bakery identifies different baked goods from images taken by the Raspberry Pi camera and automatically prices them, in the absence of barcodes or price tags. You can’t stick a price tag on a croissant. There’d be flakes everywhere.

    Extra functionality

    Running this project on Raspberry Pi means that Lady Ada was able to hook up lots of other useful tools. In addition to the Raspberry Pi camera and the HAT, she is using:

    • Three LEDs that glow green when an object is detected
    • A speaker and some text-to-speech code that announces which object is detected
    • A receipt printer that prints out the product name and the price
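    The three extras above could be tied together like this. This is an illustrative sketch only: the labels, prices, and handler names are assumptions, not code from the actual Bakery project.

```python
# Illustrative glue for the three extras above: the labels, prices, and
# handler names are assumptions, not code from the actual Bakery project.

PRICES = {"croissant": 2.50, "muffin": 2.00, "pretzel": 1.75}

def handle_detection(label, light_led, speak, print_receipt):
    """Route one classifier result to the LEDs, speaker, and printer."""
    if label not in PRICES:
        return None
    price = PRICES[label]
    light_led()                               # green LEDs on
    speak(f"Detected one {label}")            # text-to-speech announcement
    print_receipt(f"{label}: ${price:.2f}")   # receipt printer line
    return price

# Stand-in handlers that just record what happened:
events = []
price = handle_detection(
    "croissant",
    light_led=lambda: events.append("led"),
    speak=events.append,
    print_receipt=events.append,
)
```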

    All of this running on Raspberry Pi, and made super easy with Microsoft Lobe and Adafruit’s BrainCraft HAT. Adafruit’s Microsoft Machine Learning Kit for Lobe contains everything you need to get started.

    full adafruit lobe kit
    The full Microsoft Machine Learning Kit for Lobe with Raspberry Pi 4 kit

    Watch the Microsoft Build keynote

    And finally, watch Microsoft CTO Kevin Scott introduce Limor Fried, aka Lady Ada, owner of Adafruit Industries. Lady Ada joins remotely from the Adafruit factory in Manhattan, NY, to show how the BrainCraft HAT and Lobe work to make machine learning accessible.

    [youtube https://www.youtube.com/watch?v=ML0s-T55VPA?feature=oembed&w=500&h=281]

    Website: LINK

  • SleePi sounds alarm when Raspberry Pi detects sleepiness

    SleePi sounds alarm when Raspberry Pi detects sleepiness

    Reading Time: 2 minutes

    SleePi is a real-time sleepiness detection and alert system developed especially for Raspberry Pi and our Raspberry Pi Camera Module 2 NoIR.

    Driver drowsiness detection was the original application for this project, and Raspberry Pi was chosen for it because it’s small enough to not obstruct a driver’s view and can be powered from a vehicle’s 12 V socket or a USB port.

    sleepi setup
    Teeny tiny setup

    Our Raspberry Pi NoIR Camera has no infrared filter and can therefore detect infrared light. It was chosen for this project so that, paired with infrared illumination, it can watch the driver in low light, since night time is when people are most likely to become drowsy.

    Never drive tired

    Firstly, you should absolutely never drive tired. The UK’s Driver and Vehicle Licensing Agency says that, by law, after every 5 hours 30 minutes of driving you must take a break of at least 30 minutes.

    We’re sharing this project because we like the software behind this sleepiness detector, which can tell when your eyes narrow and alert you before you nod off. A safer application of this invention could be for exam cramming season when you don’t want to fall asleep before reading that final chapter of your revision guide. Or perhaps for the sleepier among us who need extra help staying awake for the New Year’s Eve countdown. We cannot miss another one of those. But we get SO sleepy.

    How does SleePi work?

    Eye Aspect Ratio (EAR)
    How SleePi uses EAR to detect sleepiness in the eyes

    The camera tracks the position of the eyes and uses something called the Eye Aspect Ratio (EAR) to detect blinks. When squinting or blinking is observed, Raspberry Pi thinks you’re getting sleepy. When sleepiness is detected, a loud alarm sounds via the Raspberry Pi’s AUX port, connected to the car’s speaker system. The alarm carries on sounding until the camera detects that the user’s eyes are completely open again.
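    The EAR check itself can be sketched in a few lines. This follows the standard Eye Aspect Ratio formula from the blink-detection literature; the landmark coordinates and threshold below are illustrative, since in the real system they come from a facial-landmark detector running on the camera feed.

```python
# Minimal Eye Aspect Ratio (EAR) sketch. The six (x, y) landmarks per
# eye would come from a facial-landmark detector in a real system; the
# coordinates and threshold here are illustrative only.
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points p1..p6, ordered around the eye."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])   # p2-p6 + p3-p5
    horizontal = dist(eye[0], eye[3])                        # p1-p4
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.25   # below this for several frames => sound the alarm

open_eye   = [(0, 3), (2, 5), (4, 5), (6, 3), (4, 1), (2, 1)]
closed_eye = [(0, 3), (2, 3.5), (4, 3.5), (6, 3), (4, 2.5), (2, 2.5)]
```

    The ratio stays roughly constant while the eye is open and collapses towards zero as the eyelids close, which is what makes it a robust squint and blink signal.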

    How do I build it?

    Sai Sathvik is a dreamboat of a maker and left detailed instructions to help you build your own SleePi.

    Are you a New Year’s Eve napper? Or a classroom snoozer? What do you need a SleePi for? Comment below telling us why you need this doziness detector.

    Website: LINK

  • Raspberry Pi dog detector (and dopamine booster)

    Raspberry Pi dog detector (and dopamine booster)

    Reading Time: 2 minutes

    You can always rely on Ryder’s YouTube channel to be full of weird and wonderful makes. This latest offering aims to boost dopamine levels with dog spotting. Looking at dogs makes you happier, right? But you can’t spend all day looking out of the window waiting for a dog to pass, right? Well, a Raspberry Pi Camera Module and machine learning can do the dog spotting for you.

    [youtube https://www.youtube.com/watch?v=4UUEO6Xv1OM?feature=oembed&w=500&h=281]

    What’s the setup?

    Ryder’s Raspberry Pi and camera sit on a tripod pointing out of a window looking over a street. Live video of the street is taken by the camera and fed through a machine learning model. Ryder chose the YOLO v3 object detection model, which can already recognise around 80 different things — from dogs to humans, and even umbrellas.
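    The decision step after the model runs can be sketched as a small filter. The (label, confidence, box) detection format and the 0.5 threshold are assumptions for illustration; in the real project these values come from feeding camera frames through YOLO v3.

```python
# Sketch of the post-detection step only: deciding whether any "dog"
# appears in a frame's output. The detection format (label, confidence,
# box) and the threshold are assumptions, not the project's actual code.

CONFIDENCE_THRESHOLD = 0.5

def dogs_in_frame(detections):
    """Return the boxes of confident 'dog' detections in one frame."""
    return [box for label, conf, box in detections
            if label == "dog" and conf >= CONFIDENCE_THRESHOLD]

frame = [
    ("person",   0.91, (10, 20, 80, 200)),
    ("dog",      0.87, (120, 150, 60, 40)),
    ("dog",      0.30, (300, 10, 20, 20)),   # too uncertain, ignored
    ("umbrella", 0.76, (200, 5, 40, 90)),
]
```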

    A hand holding a raspberry pi high quality camera pointing out of a window
    Camera set up ready for dog spotting

    Doggo passing announcements

    But how would Ryder know that his Raspberry Pi had detected a dog? They’re so sneaky — they work in silence. A megaphone and some text-to-speech software make sure that Ryder is alerted in time to run to the window and see the passing dog. The megaphone announces: “Attention! There is a cute dog outside.”

    A machine learning image with a human and a dog circled in different colours
    The machine learning program clearly labels a ‘person’ and a ‘dog’

    “Hey! Cute dog!”

    Ryder wanted to share the love and show his appreciation to the owners of cute dogs, so he added a feature for when he is out of the house. With the megaphone poking out of a window, the Raspberry Pi does its dog-detecting as usual, but instead of alerting Ryder, it announces: “I like your dog” when a canine is walked past.

    Raspberry Pi camera pointing out of a window connected to a megaphone which will announce when a dog passes by
    When has a megaphone ever NOT made a project better?

    Also, we’d like to learn more about this ‘Heather’ who apparently once scaled a six-foot fence to pet a dog and for whom Ryder built this. Ryder, spill the story in the comments!

    Website: LINK

  • Raspberry Pi LEGO sorter

    Raspberry Pi LEGO sorter

    Reading Time: 3 minutes

    Raspberry Pi is at the heart of this AI–powered, automated sorting machine that is capable of recognising and sorting any LEGO brick.

    And its maker Daniel West believes it to be the first of its kind in the world!

    [youtube https://www.youtube.com/watch?v=04JkdHEX3Yk?feature=oembed&w=500&h=281]

    Best ever

    This mega-machine was two years in the making and is a LEGO creation itself, built from over 10,000 LEGO bricks.

    A beast of 10,000 bricks

    It can sort any LEGO brick you place in its input bucket into one of 18 output buckets, at the rate of one brick every two seconds.

    While Daniel was inspired by previous LEGO sorters, his creation is a huge step up from them: it can recognise absolutely every LEGO brick ever created, even bricks it has never seen before. Hence the ‘universal’ in the name ‘universal LEGO sorting machine’.

    Hardware

    There we are, tucked away, just doing our job

    Software

    The artificial intelligence algorithm behind the LEGO sorting is a convolutional neural network, the go-to for image classification.

    What makes Daniel’s project a ‘world first’ is that he trained his classifier using 3D model images of LEGO bricks, which is how the machine can classify absolutely any LEGO brick it’s faced with, even if it has never seen it in real life before.
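    The “convolutional” part is easy to demystify in miniature. Here is the core operation in plain Python with a single hand-written edge filter; a real classifier like Daniel’s stacks many layers of learned filters instead.

```python
# The building block of a convolutional neural network, shown here in
# miniature. A real classifier learns many such filters; this single
# hand-written edge filter is just to illustrate the operation.

def convolve2d(image, kernel):
    """Valid-mode 2D cross-correlation of two lists-of-lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge filter responding where brightness changes left-to-right:
edge_kernel = [[-1, 1]]
image = [[0, 0, 9, 9],
         [0, 0, 9, 9]]
```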

    [youtube https://www.youtube.com/watch?v=-UGl0ZOCgwQ?feature=oembed&w=500&h=281]

    We LOVE a thorough project video, and we love TWO of them even more

    Daniel has made a whole extra video (above) explaining how the AI in this project works. He shouts out all the open source software he used to run the Raspberry Pi Camera Module and access 3D training images etc. at this point in the video.

    LEGO brick separation

    The vibration plate in action, feeding single parts into the scanner

    Daniel needed the input bucket to carefully pick out a single LEGO brick from the mass he chucks in at once.

    This is achieved with a primary and secondary belt slowly pushing parts onto a vibration plate. The vibration plate uses a super fast LEGO motor to shake the bricks around so they aren’t sitting on top of each other when they reach the scanner.

    Scanning and sorting

    A side view of the LEGO sorting machine showing a large white chute built from LEGO bricks
    The underside of the beast

    A Raspberry Pi Camera Module captures video of each brick, which Raspberry Pi 3 Model B+ then processes and wirelessly sends to a more powerful computer able to run the neural network that classifies the parts.

    The classification decision is then sent back to the sorting machine so it can spit the brick, using a series of servo-controlled gates, into the right output bucket.
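    That round trip can be sketched with a plain TCP socket. Everything below (the wire protocol, the placeholder “classifier”, and the class-to-bucket mapping) is an assumption for illustration; only the overall shape mirrors Daniel’s setup, which really does run the neural network on the remote machine.

```python
# Sketch of the Pi-to-desktop round trip over TCP. The wire protocol,
# the placeholder "classifier", and the class-to-bucket mapping are all
# assumptions; only the overall shape mirrors the project.
import socket
import threading

NUM_BUCKETS = 18

def classifier_server(sock):
    """Stand-in for the desktop machine running the neural network."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(4096)             # would be a camera frame
        class_id = len(data) % 500         # placeholder for the CNN's answer
        conn.sendall(str(class_id).encode())

def bucket_for(class_id):
    """Map one of many part classes onto the 18 physical buckets."""
    return class_id % NUM_BUCKETS

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=classifier_server, args=(server,), daemon=True).start()

client = socket.socket()
client.connect(("127.0.0.1", server.getsockname()[1]))
client.sendall(b"fake-jpeg-bytes")         # 15 bytes standing in for a frame
class_id = int(client.recv(64).decode())
client.close()
bucket = bucket_for(class_id)              # which gate to open for this brick
```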

    Extra-credit homework

    A front view of the LEGO sorter with the sorting boxes visible underneath
    In all its bricky beauty, with the 18 output buckets visible at the bottom

    Daniel is such a boss maker that he wrote not one, but two further reading articles for those of you who want to deep-dive into this mega LEGO creation:

    Website: LINK

  • These Furby-‘controlled’ Raspberry Pi-powered eyes follow you

    These Furby-‘controlled’ Raspberry Pi-powered eyes follow you

    Reading Time: 3 minutes

    Sam Battle aka LOOK MUM NO COMPUTER couldn’t resist splashing out on a clear Macintosh case for a new project in his ‘Cosmo’ series of builds, which inject new life into retro hardware.

    furby facial recognition robot in a clear case in front of a dark background
    AAGGGGHHHHHHH!

    This time around, a Raspberry Pi, running facial recognition software, and one of our Camera Modules enable Furby-style eyes to track movement, detect faces, and follow you around the room.

    [youtube https://www.youtube.com/watch?v=axVrGmbjkSc?feature=oembed&w=500&h=281]

    Give LOOK MUM NO COMPUTER a follow on YouTube

    He loves a good Furby does Sam. Has a whole YouTube playlist dedicated to projects featuring them. Seriously.

    Raspberry Pi with camera module attached to small screen loading software needed to run face recognition
    Sam got all the Raspberry Pi kit needed from Pimoroni

    Our favourite bit of the video is when Sam meets Raspberry Pi for the first time, boots it up, and says:

    “Wait, I didn’t know it was a computer. It’s an actual computer computer. What?!”

    face recognition software running on small screen with raspberry pi camera behind it, looking at the maker
    Face recognition software up and running on Raspberry Pi

    The eyes are ping pong balls cut in half so you can fit a Raspberry Pi Camera Module inside them. (Don’t forget to make a hole in the ‘pupil’ so the lens can peek through).

    Maker inserting raspberry pi camera module inside a sliced ping pong ball. You can see the ribbons of the camera module sticking out of the ping pong ball half
    Raspberry Pi Camera Module tucked inside ping pong ball as it’s mounted to a 3D-printed part

    The Raspberry Pi and display screen are neatly mounted on the side of the Macintosh so they’re easily accessible should you need to make any changes.

    Raspberry Pi and display screen mounted on the side of a clear macintosh frame
    Easy access

    All the hacked, repurposed junky bits sit inside or are mounted on swish 3D-printed parts.

    Add some joke shop chatterbox teeth, and you’ve got what looks like the innards of a Furby staring at you. See below for a harrowing snapshot of Zach’s ‘Furlexa’ project, featured on our blog last year. We still see it when we sleep.

    It gets worse the more you look around

    It wasn’t enough for Furby-mad Sam to have created a Furby lookalike face-tracking robot; he needed to go further. Inside the clear Macintosh case, you can see a de-furred Furby skeleton atop a 3D-printed plinth, with redundant ribbon cables flowing from its eyes into the back of the face-tracking robot face, making it appear as though the Furby is the brains behind this creepy creation that follows your every move.

    a side view of the entire build with a furby skeleton visible inside
    Hey in there. We see you! You dark lord of robo-controlling

    Eventually, Sam’s Raspberry Pi–powered creation will be on display at the Museum of Everything Else, so you can go visit it and play with all the “obsolete and experimental technology” housed there. The museum is funded by the Look Mum No Computer Patreon page.

    Website: LINK

  • Classify your trash with Raspberry Pi

    Classify your trash with Raspberry Pi

    Reading Time: 3 minutes

    Maker Jen Fox took to hackster.io to share a Raspberry Pi–powered trash classifier that tells you whether the trash in your hand is recyclable, compostable, or just straight-up garbage.

    Jen reckons this project is beginner-friendly, as you don’t need any code to train the machine learning model, just a little to load it on Raspberry Pi. It’s also a pretty affordable build, costing less than $70 including a Raspberry Pi 4.

    “Haz waste”?!

    Hardware:

    • Raspberry Pi 4 Model B
    • Raspberry Pi Camera Module
    • Adafruit push button
    • Adafruit LEDs

    [youtube https://www.youtube.com/watch?v=jyz0ArPEsj4?feature=oembed&w=500&h=281]

    Watch Jen giving a demo of her creation

    Software

    The code-free machine learning model is created using Lobe, a desktop tool that automatically trains a custom image classifier based on what objects you’ve shown it.

    The image classifier correctly guessing it has been shown a bottle cap

    Training the image classifier

    Basically, you upload a tonne of photos and tell Lobe what object each of them shows. Jen told the empty classification model which photos were of compostable waste, which were of recyclable items, and which were of garbage or bio-hazardous waste. Of course, as Jen says, “the more photos you have, the more accurate your model is.”

    Loading up Raspberry Pi

    Birds eye view of Raspberry Pi 4 with a camera module connected
    The Raspberry Pi Camera Module attached to Raspberry Pi 4

    As promised, you only need a little bit of code to load the image classifier onto your Raspberry Pi. The Raspberry Pi Camera Module acts as the image classifier’s “eyes” so Raspberry Pi can find out what kind of trash you hold up for it.

    The push button and LEDs are wired up to the Raspberry Pi GPIO pins, and they work together with the camera and light up according to what the image classifier “sees”.
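    That glue logic could look something like the sketch below. The category names, GPIO pin numbers, and helper functions are illustrative assumptions, not Jen’s actual code; on a real Raspberry Pi each pin would drive an LED via a library such as gpiozero.

```python
# Illustrative label-to-LED glue for the trash classifier. Category
# names and GPIO pin numbers are assumptions; on the real build each
# pin would drive an LED via a library such as gpiozero.

LED_PINS = {"recycle": 17, "compost": 27, "garbage": 22, "haz_waste": 23}

def led_for(label):
    """Return the GPIO pin to light for a classifier label, if any."""
    return LED_PINS.get(label)

def on_button_press(classify, light):
    """Classify one camera frame and light the matching LED."""
    label = classify()
    pin = led_for(label)
    if pin is not None:
        light(pin)
    return pin
```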

    Here’s the Fritzing diagram showing how to wire the push button and LEDs to the Raspberry Pi GPIO pins

    You’ll want to create a snazzy case so your trash classifier looks good mounted on the wall. Jen cut holes in a cardboard box to make sure that the camera could “see” out, the user can see the LEDs, and the push button is accessible. Remember to leave room for Raspberry Pi’s power supply to plug in.

    Jen’s hand-painted case mounted to the wall, having a look at a plastic bag

    Jen has tonnes of other projects on her Hackster profile — check out the micro:bit Magic Wand.

    Website: LINK

  • Hire Raspberry Pi as a robot sous-chef in your kitchen

    Hire Raspberry Pi as a robot sous-chef in your kitchen

    Reading Time: 3 minutes

    Design Engineering student Ben Cobley has created a Raspberry Pi–powered sous-chef that automates the easier pan-cooking tasks so the head chef can focus on culinary creativity.

    [youtube https://www.youtube.com/watch?v=W4utRCyo5C4?feature=oembed&w=500&h=281]

    Ben named his invention OnionBot, as the idea came to him when looking for an automated way to perfectly soften onions in a pan while he got on with the rest of his dish. I have yet to manage to retrieve onions from the pan before they blacken so… *need*.

    OnionBot robotic sous-chef set up in a kitchen
    The full setup (you won’t need a laptop while you’re cooking, so you’ll have counter space)

    A Raspberry Pi 4 Model B is the brains of the operation, with a Raspberry Pi Touch Display showing the instructions, and a Raspberry Pi Camera Module keeping an eye on the pan.

    OnionBot robotic sous-chef hardware mounted on a board
    Close up of the board-mounted hardware and wiring

    Ben’s affordable solution is much better suited to home cooking than the big, expensive robotic arms used in industry. Using our tiny computer also allowed Ben to create something that fits on a kitchen counter.

    OnionBot robotic sous-chef hardware list

    What can OnionBot do?

    • Tells you on-screen when it is time to advance to the next stage of a recipe
    • Autonomously controls the pan temperature using PID feedback control
    • Detects when the pan is close to boiling over and automatically turns down the heat
    • Reminds you if you haven’t stirred the pan in a while

    How does it work?

    A thermal sensor array suspended above the stove detects the pan temperature, and the Raspberry Pi Camera Module helps track the cooking progress. A servo motor controls the dial on the induction stove.
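    The PID part can be sketched in a few lines. The gains, setpoint, and units below are invented for illustration; OnionBot’s real loop reads the thermal sensor array and drives the stove-dial servo with its output.

```python
# Minimal PID loop of the kind used to hold pan temperature. Gains and
# setpoint are made-up illustrations; the output would drive the servo
# that turns the induction hob's dial.

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured, dt):
        """One control step: return how hard to push towards the setpoint."""
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.05, kd=0.1, setpoint=120.0)   # target pan temp in °C
dial = pid.update(measured=100.0, dt=1.0)            # positive => more heat
```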

    Screenshot of the image classifier of OnionBot robotic sous-chef
    Labelling images to train the image classifier

    No machine learning expertise was required to train an image classifier, running on Raspberry Pi, for Ben’s robotic creation; you’ll see in the video that the classifier is a really simple drag-and-drop affair.

    Ben has only taught his sous-chef one pasta dish so far, and we admire his dedication to carbs.

    Screenshot of the image classifier of OnionBot robotic sous-chef
    Training the image classifier to know when you haven’t stirred the pot in a while

    Ben built a control panel for labelling training images in real time and added labels at key recipe milestones while he cooked under the camera’s eye. This process required 500–1000 images per milestone, so Ben made a LOT of pasta while training his robotic sous-chef’s image classifier.

    Diagram of networked drivers and devices in OnionBot robotic sous-chef

    Ben open-sourced this project so you can collaborate to suggest improvements or teach your own robot sous-chef some more dishes. Here’s OnionBot on GitHub.

    He also rates this Auto ML system used in the project as a “great tool for makers.”

    Website: LINK

  • New book: The Official Raspberry Pi Camera Guide

    New book: The Official Raspberry Pi Camera Guide

    Reading Time: 3 minutes

    To coincide with yesterday’s launch of the Raspberry Pi High Quality Camera, Raspberry Pi Press has created a new Official Camera Guide to help you get started and inspire your future projects.

    The Raspberry Pi High Quality Camera

    Connecting a High Quality Camera turns your Raspberry Pi into a powerful digital camera. This 132-page book tells you everything you need to know to set up the camera, attach a lens, and start capturing high-resolution photos and video footage.

    Make those photos snazzy

    The book tells you everything you need to know in order to use the camera by issuing commands in a terminal window or via SSH. It also demonstrates how to control the camera with Python using the excellent picamera library.
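    A representative snippet in the book’s spirit might look like the following. The capture routine follows picamera’s documented API but only runs on a Raspberry Pi with a camera attached; the filename helper is our own illustrative addition, not an example from the book.

```python
# Hedged sketch of a picamera capture. capture_photo() follows
# picamera's documented API but needs a Raspberry Pi with a camera;
# the timestamped filename helper is an illustrative addition.
import time

def timestamped_name(prefix="photo", ext="jpg", now=None):
    """Build a filename like photo-20200430-101500.jpg."""
    now = now if now is not None else time.localtime()
    return "%s-%04d%02d%02d-%02d%02d%02d.%s" % (
        prefix, now.tm_year, now.tm_mon, now.tm_mday,
        now.tm_hour, now.tm_min, now.tm_sec, ext)

def capture_photo(filename=None):
    from picamera import PiCamera   # imported here: Pi-only dependency
    filename = filename or timestamped_name()
    with PiCamera() as camera:
        camera.resolution = (1920, 1080)
        camera.start_preview()
        time.sleep(2)               # let exposure and white balance settle
        camera.capture(filename)
    return filename
```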

    You’ll discover the many image modes and effects available – our favourite is ‘posterise’.

    Build some amazing camera-based projects

    Once you’ve got the basics down, you can start using your camera for a variety of exciting Raspberry Pi projects showcased across the book’s 17 packed chapters. Want to make a camera trap to monitor the wildlife in your garden? Build a smart door with a video doorbell? Try out high-speed and time-lapse photography? Or even find out which car is parked in your driveway using automatic number-plate recognition? The book has all this covered, and a whole lot more.

    Don’t have a High Quality Camera yet? No problem. All the commands in the book are exactly the same for the standard Raspberry Pi Camera Module, so you can also use this model with the help of our Official Camera Guide.

    Snap it up!

    The Official Raspberry Pi Camera Guide is available now from the Raspberry Pi Press online store for £10. And, as always, we have also released the book as a free PDF. But the physical book feels so good to hold and looks so handsome on your bookshelf, we don’t think you’ll regret getting your hands on the print edition.

    Whichever format you choose, have fun shooting amazing photos and videos with the new High Quality Camera. And do share what you capture with us on social media using #ShotOnRaspberryPi.

    Website: LINK

  • New product: Raspberry Pi High Quality Camera on sale now at $50

    New product: Raspberry Pi High Quality Camera on sale now at $50

    Reading Time: 5 minutes

    We’re pleased to announce a new member of the Raspberry Pi camera family: the 12.3-megapixel High Quality Camera, available today for just $50, alongside a range of interchangeable lenses starting at $25.

    NEW Raspberry Pi High Quality Camera

    It’s really rather good, as you can see from this shot of Cambridge’s finest bit of perpendicular architecture.

    At 69 years, King’s College Chapel took only slightly longer to finish than the High Quality Camera.

    And this similarly pleasing bit of chip architecture.

    Ready for your closeup.

    Raspberry Pi and the camera community

    There has always been a big overlap between Raspberry Pi hackers and camera hackers. Even back in 2012, people (okay, substantially Dave Hunt) were finding interesting ways to squeeze more functionality out of DSLR cameras using their Raspberry Pi computers.

    Dave’s water droplet photography. Still, beautiful.

    The OG Raspberry Pi camera module

    In 2013, we launched our first camera board, built around the OmniVision OV5647 5‑megapixel sensor, followed rapidly by the original Pi NoIR board, with infrared sensitivity and a little magic square of blue plastic. Before long, people were attaching them to telescopes and using them to monitor plant health from drones (using the aforementioned little square of plastic).

    TJ EMSLEY Moon Photography

    We like the Moon.

    Sadly, OV5647 went end-of-life in 2015, and the 5-megapixel camera has the distinction of being one of only three products (along with the original Raspberry Pi 1 and the official WiFi dongle) that we’ve ever discontinued. Its replacement, built around the 8-megapixel Sony IMX219 sensor, launched in April 2016; it has found a home in all sorts of cool projects, from line-followers to cucumber sorters, ever since. Going through our sales figures while writing this post, we were amazed to discover we’ve sold over 1.7 million of these to date.

    The limitations of fixed-focus

    Versatile though they are, there are limitations to mobile phone-type fixed-focus modules. The sensors themselves are relatively small, which translates into a lower signal-to-noise ratio and poorer low-light performance; and of course there is no option to replace the lens assembly with a more expensive one, or one with different optical properties. These are the shortcomings that the High Quality Camera is designed to address.

    Raspberry Pi High Quality Camera

    Raspberry Pi High Quality Camera, without a lens attached

    Features include:

    • 12.3 megapixel Sony IMX477 sensor
    • 1.55μm × 1.55μm pixel size – double the pixel area of IMX219
    • Back-illuminated sensor architecture for improved sensitivity
    • Support for off-the-shelf C- and CS-mount lenses
    • Integrated back-focus adjustment ring and tripod mount

    We expect that over time people will use quite a wide variety of lenses, but for starters our Approved Resellers will be offering a couple of options: a 6 mm CS‑mount lens at $25, and a very shiny 16 mm C-mount lens priced at $50.

    Our launch-day lens selection.

    Read all about it

    Also out today is our new Official Raspberry Pi Camera Guide, covering both the familiar Raspberry Pi Camera Module and the new Raspberry Pi High Quality Camera.

    We’ll never not be in love with Jack’s amazing design work.

    Our new guide, published by Raspberry Pi Press, walks you through setting up and using your camera with your Raspberry Pi computer. You’ll also learn how to use filters and effects to enhance your photos and videos, and how to set up creative projects such as stop-motion animation stations, wildlife cameras, smart doorbells, and much more.

    Aardman ain’t got nothing on you.

    You can purchase the book in print today from the Raspberry Pi Press store for £10, or download the PDF for free from The MagPi magazine website.

    Credits

    As with every product we build, the High Quality Camera has taught us interesting new things, in this case about producing precision-machined aluminium components at scale (and to think we thought injection moulding was hard!). Getting this right has been something of a labour of love for me over the past three years, designing the hardware and getting it to production. Naush Patuck tuned the VideoCore IV ISP for this sensor; David Plowman helped with lens evaluation; Phil King produced the book; Austin Su provided manufacturing support.

    We’d like to acknowledge Phil Holden at Sony in San Jose, the manufacturing team at Sony UK Tec in Pencoed for their camera test and assembly expertise, and Shenzhen O-HN Optoelectronic for solving our precision engineering challenges.

    FAQS

    Which Raspberry Pi models support the High Quality Camera?

    The High Quality Camera is compatible with almost all Raspberry Pi models, from the original Raspberry Pi 1 Model B onward. Some very early Raspberry Pi Zero boards from the start of 2016 lack a camera connector, and other Zero users will need the same adapter FPC that is used with Camera Module v2.

    What about Camera Module v2?

    The regular and infrared versions of Camera Module v2 will still be available. The High Quality Camera does not supersede it. Instead, it provides a different tradeoff between price, performance, and size.

    What lenses can I use with the High Quality Camera?

    You can use C- and CS-mount lenses out of the box (C-mount lenses use the included C-CS adapter). Third-party adapters are available from a wide variety of lens standards to CS-mount, so it is possible to connect any lens that meets the back‑focus requirements.

    We’re looking forward to seeing the oldest and/or weirdest lenses anyone can get working, but here’s one for starters, courtesy of Fiacre.

    Do not try this at home. Or do: fine either way.

    Website: LINK