Tag: open source

  • The 2024 Arduino Open Source Report is here!

    Reading Time: 2 minutes

    Every year, we take a moment to reflect on the contributions we made to the open source movement, and the many ways our community has made a huge difference. As we publish the latest Open Source Report, we are proud to say 2024 was another year of remarkable progress and achievements.

    A year of growth and collaboration

    At Arduino, we continued pushing the boundaries of open hardware and software.

    In 2024, we released a new round of updates across our open hardware and software tools. These updates ensure a more flexible and robust ecosystem for developers, educators, and makers worldwide.

    But what truly makes open source thrive is the community behind it! Over the past year, Arduino users contributed 1,198 new libraries to the Library Manager (+18% YoY growth!), shared hundreds of open-source tutorials, and actively engaged in thousands of discussions and collaborations on GitHub and Project Hub. These collective efforts fuel innovation, making the Arduino ecosystem more dynamic, inclusive, and powerful than ever.

    How can you contribute to open source?

    We believe open-source success is built on collaboration. Every original Arduino purchase, Arduino Cloud subscription, and community contribution helps support and expand this shared ecosystem. Donations, of course, are also welcome, and play a great part in everything we do!

    Download the 2024 Open Source Report to explore the milestones we’ve achieved together. Here’s to another year of openness, creativity, and progress!

    (Want to catch up on previous editions? Here are the Open Source Reports for 2023, 2022, and 2021.)

    The post The 2024 Arduino Open Source Report is here! appeared first on Arduino Blog.

    Website: LINK

  • The 2023 Arduino Open Source Report is out

    Reading Time: 2 minutes

    New year, new Open Source Report! Lots has happened in 2023 in terms of open-source development, and we’re excited to share our yearly recap of the various contributions from the Arduino team and the community. Together, we have released new, important open-source projects – both hardware and software – as well as published and maintained a staggering number of libraries, growing our shared resources at a record-breaking pace. 

    Apparently, we have a history of surpassing our own expectations – and 2023 was no exception. We joined the Zephyr® Project, released five new open-source hardware products, five new versions of the Arduino IDE 2.x, 13 new versions of our command line tools, 12 new official libraries, and 13 versions of the official board packages. We also worked to significantly support MicroPython – releasing a new installer tool and creating a package index.

    The Arduino community has clearly shown its love for open source too. During 2023, 1,068 new libraries were added (+20% in one year!) and 101 new versions of community board packages were released. On the content side, 205 new open-source tutorials were published on our new Project Hub.

    The 2023 Report also includes a ranking of the most active library authors and maintainers, who provide an incredible service to the whole community with their hard work in the name of open source.

    Finally, the achievements we are proud to recap in the Arduino Open Source Report would not be possible without you. To all the people who buy original Arduino products, subscribe to Arduino Cloud, or make donations: THANK YOU for supporting us and our efforts in open-source development. 

    Let’s get ready for a very open source 2024!

    (And if you missed the last yearly report about 2022, go check it out now!)

    The post The 2023 Arduino Open Source Report is out appeared first on Arduino Blog.

    Website: LINK

  • Our Code Editor is open source

    Reading Time: 5 minutes

    A couple of months ago we announced that you can test the online text-based Code Editor we’re building to help young people aged 7 and older learn to write code. Now we’ve made the code for the Editor open source so people can repurpose and contribute to it.

    The interface of the beta version of the Raspberry Pi Foundation's Code Editor.

    How can you use the Code Editor?

    You and your learners can try out the Code Editor in the first two projects of our ‘Intro to Python’ path. We’ve included a feedback form for you to let us know what you think about the Editor.

    • The Editor lets you run code straight in the browser, with no setup required.
    • It makes getting started with text-based coding easier thanks to its simple and intuitive interface.
    • If you’re logged into your Raspberry Pi Foundation account, your code in the Editor is automatically saved.
    • If you’re not logged in, your code changes persist for the session, so you can refresh or close the tab without losing your work.
    • You can download your code to your computer too.

    Since the Editor lets learners save their code using their Raspberry Pi Foundation account, it’s easy for them to build on projects they’ve started in the classroom or at home, or bring a project they’ve started at home to their coding club.

    Three learners working at laptops.

    Python is the first programming language our Code Editor supports because it’s popular in schools, CoderDojos, and Code Clubs, as well as in industry. We’ll soon be adding support for web development languages (HTML/CSS).

    A text output in the beta version of the Raspberry Pi Foundation's Code Editor.

    Putting ease of use and accessibility front and centre

    We know that starting out with new programming tools can be tricky and add to the cognitive load of learning new subject matter itself. That’s why our Editor has a simple and accessible user interface and design:

    • You can easily find key functions, such as how to write and run code, how to save or download your code, and how to check your code.
    • You can switch between dark and light mode.
    • You can enlarge or reduce the text size in input and output, which is especially useful for people with visual impairments and for educators and volunteers who want to demonstrate something to a group of learners.

    We’ll expand the Editor’s functionalities as we go. For example, at the moment we’re looking at how to improve the Editor’s user interface (UI) for better mobile support.

    If there’s a feature you think would help the Editor become more accessible and more suitable for young learners, or make it better for your classroom or club, please let us know via the feedback form.

    The open-source code for the Code Editor

    Our vision is that every young person develops the knowledge, skills, and confidence to use digital technologies effectively, and to be able to critically evaluate these technologies and confidently engage with technological change. We’re part of a global community that shares that vision, so we’ve made the Editor available as an open-source project. That means other projects and organisations focussed on helping people learn about coding and digital technologies can benefit from the work.

    How did we build the Editor? An overview

    To support the widest possible range of learners, we’ve designed the Code Editor application to work well on constrained devices and low-bandwidth connections. Safeguarding, accessibility, and data privacy are also key considerations when we build digital products at the Foundation. That’s why we decided to design the front end of the Editor to work in a standalone capacity, with Python executed through Skulpt, an entirely in-browser implementation of Python, and code changes persisted in local storage by default. Learners have the option of using a Raspberry Pi Foundation account to save their work, with changes then persisted via calls to a back end application programming interface (API).

    As safeguarding is always at the core of what we do, we only make features available that comply with our safeguarding policies as well as the ICO’s age-appropriate design code. We considered supporting functionality such as image uploads and code sharing, but at the time of writing have decided to not add these features given that, without proper moderation, they present risks to safeguarding.

    There’s an amazing community developing a wealth of open-source libraries. We chose to build our text-editor interface using CodeMirror, which has out-of-the-box mobile and tablet support and includes various useful features such as syntax highlighting and keyboard shortcuts. This has enabled us to focus on building the best experience for learners, rather than reinventing the wheel.

    Diving a bit more into the technical details:

    • The UI front end is built in React and deployed using Cloudflare Pages
    • The API back end is built in Ruby on Rails
    • The text-editor panel uses CodeMirror, which has best-in-class accessibility through mobile device and screen-reader support, and includes functionality such as syntax highlighting, keyboard shortcuts, and autocompletion
    • Python functionality is built using Skulpt to enable in-browser execution of code, with custom extensions built to support our learning content
    • Project code is persisted through calls to our back end API using a mix of REST and GraphQL endpoints
    • Data is stored in PostgreSQL, which is hosted on Heroku along with our back end API
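
    As an illustration of the persistence flow the list above describes, here is a minimal Python sketch that builds a REST-style request to save a project to the back end. The endpoint, host, and payload fields here are hypothetical; the real routes and schema are defined in the back end repository, not reproduced here.

```python
import json
from urllib import request

# Hypothetical host: the real API base URL belongs to the Foundation's deployment.
API_BASE = "https://editor-api.example.org"

def save_project(name: str, files: dict[str, str], token: str) -> request.Request:
    """Build the HTTP request that would persist a project to the back end.

    The payload shape (a project with named file components) is an assumption
    made for illustration, not the API's actual schema.
    """
    body = json.dumps({"project": {"name": name, "components": [
        {"name": path, "content": content} for path, content in files.items()
    ]}}).encode()
    return request.Request(
        f"{API_BASE}/projects",
        data=body,
        method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
    )

# Inspect the request we would send (nothing is transmitted here).
req = save_project("hello", {"main.py": "print('hi')"}, token="dummy")
print(req.get_method(), req.full_url)
```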

    Accessing the open-source code

    You can find out more about our Editor’s code for both the UI front end and API back end in our GitHub readme and contributions documentation. These kick-starter docs will help you get up and running faster.

    The Editor’s front end is licensed as permissively as possible under the Apache License 2.0, and we’ve chosen to license the back end under the copyleft AGPL v3 licence. Copyleft licences mean derived works must be licensed under the same terms, including making any derived projects also available to the community.

    We’d greatly appreciate your support with developing the Editor further, which you can give by:

    • Providing feedback on our code or raising a bug as a GitHub Issue in the relevant repository.
    • Submitting contributions by raising a pull request against the relevant repository.
      • On the back end repository we’ll ask you to allow the Raspberry Pi Foundation to reserve the right to re-use your contribution.
      • You’ll retain the copyright for any contributions on either repository.
    • Sharing feedback on using the Editor itself through the feedback form.

    Our work to develop and publish the Code Editor as an open-source project has been funded by Endless. We thank them for their generous support.

    If you are interested in partnering with us to fund this key work, or you are part of an organisation that would like to make use of the Code Editor, please reach out to us via email.

    Website: LINK

  • The 2022 Arduino Open Source Report is out

    Reading Time: 2 minutes
    Arduino Open Source Report 2022

    In our last annual report we described 2021 as one of the busiest and most productive years in Arduino history in terms of open source development (if you missed that report, go read it now as it contains so many nice things). Well, we didn’t rest in 2022 either!

    The Arduino team has been busy releasing new important open source projects, both hardware and software, while the community continues to release and maintain libraries at an incredible pace.

    Just to name one big release, the IDE 2 was launched a few months ago. For Arduino, such an incredibly complex project has been a massive investment in financial terms and we are proud of the very positive reception by the users and the active participation of contributors. There’s a healthy community and this can also be seen from many indicators that are not in this report, including participation in the Arduino Day yearly celebration as well as the forum activity and much more.

    The report highlights the main achievements of our open source community. Among those, in 2022 we had three new open source hardware products, the new Lab for MicroPython, the language discussion space, 1,042 new libraries (+25% in one year), 421 new open source tutorials on Project Hub, 84 new releases of Arduino cores, and the ranking of the most active library maintainers.

    All this is made possible by people who buy original Arduino products, subscribe to the Arduino Cloud, and/or make donations: THANK YOU for supporting us and our efforts in open source development. There’s a lot to do in 2023!

    The post The 2022 Arduino Open Source Report is out appeared first on Arduino Blog.

    Website: LINK

  • The 2021 Arduino Open Source Report is out

    Reading Time: 2 minutes

    Arduino Team, January 13th, 2022

    Arduino Open Source Report for 2021

    We’re excited to announce the Arduino Open Source Report for 2021 is now available, offering many insights into the development of our open-source ecosystem during the past year.

    In this retrospective report you’ll learn about the activities Arduino carried out in the last twelve months, thanks to the hard work of the employees, contractors and volunteers on our team and to the passion of our vibrant community, fueling our mission every day.

    We’re proud of the many achievements we celebrated in 2021. It was one of the busiest and most productive years in Arduino’s history of commitment to open source.

    We launched a number of new open source hardware products, software tools and libraries. We also upgraded existing assets, heavily refactoring some core pillars of the Arduino framework (IDE, library index and more), making them robust enough to support the growing Arduino user base.

    The document also highlights key contributions from the Arduino community – libraries, cores and more – that were made during the year. We’re grateful for all the active maintainers and contributors that put Arduino in a league of its own, and strive to give everyone proper credit.

    We invite all of you to join the community and become active contributors. There’s a lot to do! For each sub-project, the report points out where you can join us and make a difference.

    So, are you ready to dive in? Download the Arduino Open Source Report 2021, and please share your comments and get in touch with us on the Arduino Forum. We want to read your feedback and understand what we can do together in 2022 to ensure Arduino keeps getting better and better.

    Website: LINK

  • Get your SteFly™ – Nice sightseeing tour of Pavullo city

    Reading Time: < 1 minute

    Nice sightseeing tour of Pavullo city during the traffic pattern. At the last competition day of the 1st e-glide contest it was very difficult to get back home to the airfield. The finish cylinder was big enough to stay out of the rain shower and still finish the task. With a lot of patience I managed to climb with thermals above the final glide path and also used the FES to gain some more meters. Unfortunately you are not allowed to use the FES in rain and there are no outlanding fields north of Pavullo.

    https://www.stefly.aero/openvario/

    WHAT IS AN OPENVARIO?

    OpenVario is a glide computer for XCSoar. Both are Open Source projects developed by glider enthusiasts and engineers in their spare time.

    SteFly offers ready-to-use 7″ and 5.7″ devices with perfectly sunlight-readable displays. OpenVario can be used with the integrated electronic variometer or with an external variometer.

    MAIN FEATURES

    • Linux operating system
    • XCSoar as navigation software
    • Latest generation pressure sensors
    • Sunlight readable displays
    • User input with rotary module or remote stick
    • File transfer via USB stick

     

    Official Source: https://www.stefly.aero/

    https://www.youtube.com/channel/UC6a5hujlVVZpxC4WfEqVAbg

  • An open source camera stack for Raspberry Pi using libcamera

    Reading Time: 6 minutes

    Since we released the first Raspberry Pi camera module back in 2013, users have been clamouring for better access to the internals of the camera system, and even to be able to attach camera sensors of their own to the Raspberry Pi board. Today we’re releasing our first version of a new open source camera stack which makes these wishes a reality.

    (Note: in what follows, you may wish to refer to the glossary at the end of this post.)

    We’ve had the building blocks for connecting other sensors and providing lower-level access to the image processing for a while, but Linux has been missing a convenient way for applications to take advantage of this. In late 2018 a group of Linux developers started a project called libcamera to address that. We’ve been working with them since then, and we’re pleased now to announce a camera stack that operates within this new framework.

    Here’s how our work fits into the libcamera project.

    We’ve supplied a Pipeline Handler that glues together our drivers and control algorithms, and presents them to libcamera with the API it expects.

    Here’s a little more on what this has entailed.

    V4L2 drivers

    V4L2 (Video for Linux 2) is the Linux kernel driver framework for devices that manipulate images and video. It provides a standardised mechanism for passing video buffers to, and/or receiving them from, different hardware devices. Whilst it has proved somewhat awkward as a means of driving entire complex camera systems, it can nonetheless provide the basis of the hardware drivers that libcamera needs to use.

    Consequently, we’ve upgraded both the version 1 (Omnivision OV5647) and version 2 (Sony IMX219) camera drivers so that they feature a variety of modes and resolutions, operating in the standard V4L2 manner. Support for the new Raspberry Pi High Quality Camera (using the Sony IMX477) will be following shortly. The Broadcom Unicam driver – also V4L2‑based – has been enhanced too, signalling the start of each camera frame to the camera stack.

    Finally, dumping raw camera frames (in Bayer format) into memory is of limited value, so the V4L2 Broadcom ISP driver provides all the controls needed to turn raw images into beautiful pictures!

    Configuration and control algorithms

    Of course, being able to configure Broadcom’s ISP doesn’t help you to know what parameters to supply. For this reason, Raspberry Pi has developed from scratch its own suite of ISP control algorithms (sometimes referred to generically as 3A Algorithms), and these are made available to our users as well. Some of the most well known control algorithms include:

    • AEC/AGC (Auto Exposure Control/Auto Gain Control): this monitors image statistics in order to drive the camera exposure to an appropriate level.
    • AWB (Auto White Balance): this corrects for the ambient light that is illuminating a scene, and makes objects that appear grey to our eyes come out actually grey in the final image.

    But there are many others too, such as ALSC (Auto Lens Shading Correction, which corrects vignetting and colour variation across an image), and control for noise, sharpness, contrast, and all other aspects of image processing. Here’s how they work together.

    The control algorithms all receive statistics information from the ISP, and cooperate in filling in metadata for each image passing through the pipeline. At the end, the metadata is used to update control parameters in both the image sensor and the ISP.
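
    The feedback loop described above can be sketched as a toy model in Python. This is not Raspberry Pi's actual algorithm; the target brightness, damping factor, and exposure limits below are illustrative values chosen for the sketch.

```python
def aec_update(stats_mean: float, exposure_us: int,
               target: float = 0.18, max_exposure_us: int = 66000) -> int:
    """One step of a toy auto-exposure (AEC) loop.

    stats_mean: average scene brightness reported by the ISP, scaled to 0..1.
    exposure_us: current sensor exposure time in microseconds.
    Returns the exposure to program into the sensor for the next frame.
    """
    if stats_mean <= 0:
        return max_exposure_us  # scene reads as black: expose as long as allowed
    # Scale exposure so the measured mean moves toward the target brightness,
    # damped to avoid oscillating between over- and under-exposure.
    ratio = target / stats_mean
    damped = 1.0 + 0.5 * (ratio - 1.0)
    new_exposure = int(exposure_us * damped)
    return max(100, min(new_exposure, max_exposure_us))

# A dark frame pushes exposure up; a bright frame pulls it down.
print(aec_update(0.05, 10000))
print(aec_update(0.50, 10000))
```

    A real implementation also coordinates analogue gain with exposure time and respects per-mode sensor limits, which this sketch omits.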

    Previously these functions were proprietary and closed source, and ran on the Broadcom GPU. Now, the GPU just shovels pixels through the ISP hardware block and notifies us when it’s done; practically all the configuration is computed and supplied from open source Raspberry Pi code on the ARM processor. A shim layer still exists on the GPU, and turns Raspberry Pi’s own image processing configuration into the proprietary functions of the Broadcom SoC.

    To help you configure Raspberry Pi’s control algorithms correctly for a new camera, we include a Camera Tuning Tool. Or if you’d rather do your own thing, it’s easy to modify the supplied algorithms, or indeed to replace them entirely with your own.

    Why libcamera?

    Whilst ISP vendors are in some cases contributing open source V4L2 drivers, the reality is that all ISPs are very different. Advertising these differences through kernel APIs is fine – but it creates an almighty headache for anyone trying to write a portable camera application. Fortunately, this is exactly the problem that libcamera solves.

    We provide all the pieces for Raspberry Pi-based libcamera systems to work simply “out of the box”. libcamera remains a work in progress, but we look forward to continuing to help this effort, and to contributing an open and accessible development platform that is available to everyone.

    Summing it all up

    So far as we know, in every comparable camera system, large parts – including at least the control (3A) algorithms and possibly the driver code – remain closed and proprietary. Indeed, for anyone wishing to customise a camera system – perhaps with their own choice of sensor – or to develop their own algorithms, there would seem to be very few options – unless perhaps you happen to be an extremely large corporation.

    In this respect, the new Raspberry Pi Open Source Camera System is providing something distinctly novel. For some users and applications, we expect its accessible and non-secretive nature may even prove quite game-changing.

    What about existing camera applications?

    The new open source camera system does not replace any existing camera functionality, and for the foreseeable future the two will continue to co-exist. In due course we expect to provide additional libcamera-based versions of raspistill, raspivid and PiCamera – so stay tuned!

    Where next?

    If you want to learn more about the libcamera project, please visit https://libcamera.org.

    To try libcamera for yourself with a Raspberry Pi, please follow the instructions in our online documentation, where you’ll also find the full Raspberry Pi Camera Algorithm and Tuning Guide.

    If you’d like to know more, and can’t find an answer in our documentation, please go to the Camera Board forum. We’ll be sure to keep our eyes open there to pick up any of your questions.

    Acknowledgements

    Thanks to Naushir Patuck and Dave Stevenson for doing all the really tricky bits (lots of V4L2-wrangling).

    Thanks also to the libcamera team (Laurent Pinchart, Kieran Bingham, Jacopo Mondi and Niklas Söderlund) for all their help in making this project possible.

    Glossary

    3A, 3A Algorithms: refers to AEC/AGC (Auto Exposure Control/Auto Gain Control), AWB (Auto White Balance) and AF (Auto Focus) algorithms, but may implicitly cover other ISP control algorithms. Note that Raspberry Pi does not implement AF (Auto Focus), as none of our supported camera modules requires it
    AEC: Auto Exposure Control
    AF: Auto Focus
    AGC: Auto Gain Control
    ALSC: Auto Lens Shading Correction, which corrects vignetting and colour variations across an image. These are normally caused by the type of lens being used and can vary in different lighting conditions
    AWB: Auto White Balance
    Bayer: an image format where each pixel has only one colour component (one of R, G or B), creating a sort of “colour mosaic”. All the missing colour values must subsequently be interpolated. This is a raw image format meaning that no noise, sharpness, gamma, or any other processing has yet been applied to the image
    CSI-2: Camera Serial Interface (version) 2. This is the interface format between a camera sensor and Raspberry Pi
    GPU: Graphics Processing Unit. But in this case it refers specifically to the multimedia coprocessor on the Broadcom SoC. This multimedia processor is proprietary and closed source, and cannot directly be programmed by Raspberry Pi users
    ISP: Image Signal Processor. A hardware block that turns raw (Bayer) camera images into full colour images (either RGB or YUV)
    Raw: see Bayer
    SoC: System on Chip. The Broadcom processor at the heart of all Raspberry Pis
    Unicam: the CSI-2 receiver on the Broadcom SoC on the Raspberry Pi. Unicam receives pixels being streamed out by the image sensor
    V4L2: Video for Linux 2. The Linux kernel driver framework for devices that process video images. This includes image sensors, CSI-2 receivers, and ISPs
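
    To make the Bayer entry above concrete, here is a minimal pure-Python sketch that collapses an RGGB mosaic into RGB pixels by averaging each 2×2 cell. Real demosaicing interpolates the missing colour values far more carefully; this is only a toy illustration of the "colour mosaic" idea.

```python
def demosaic_2x2(raw):
    """Collapse an RGGB Bayer mosaic into a half-resolution RGB image.

    raw: 2D list of sensor values laid out as
         R G R G ...
         G B G B ...
    Each 2x2 cell yields one RGB pixel; the two green samples are averaged.
    """
    out = []
    for y in range(0, len(raw) - 1, 2):
        row = []
        for x in range(0, len(raw[y]) - 1, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

# A tiny 2x4 RGGB mosaic becomes a 1x2 row of RGB pixels.
mosaic = [
    [10, 20, 12, 22],
    [30, 40, 32, 42],
]
print(demosaic_2x2(mosaic))  # [[(10, 25.0, 40), (12, 27.0, 42)]]
```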

    Website: LINK

  • Hands-on with the Arduino CLI!

    Reading Time: < 1 minute

    Arduino Team, April 23rd, 2020

    In our last post, we told you that the Arduino CLI’s primary goal is to provide a flexible yet simple command line tool with all the features and ease of use that made Arduino a successful platform, and enable users to find new ways of improving their workflows. 

    The Arduino CLI is not just a command line tool, but contains all you need to build applications around the Arduino ecosystem.

    For example, you can:

    • Parse the JSON output of the CLI and easily integrate it into your custom application.
    • Run the CLI as an always-on service that accepts commands via a gRPC interface using your language of choice.
    • Use the CLI in your Go application as a library.
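
    For example, the JSON-parsing workflow from the first bullet might look like this in Python. The sample output shape is an assumption modelled on `arduino-cli board list --format json`; field names can differ between CLI versions, so treat this as a sketch rather than the CLI's exact schema.

```python
import json

# Example JSON in the rough shape of `arduino-cli board list --format json`.
# In a real application you would capture it with subprocess, e.g.:
#   out = subprocess.run(["arduino-cli", "board", "list", "--format", "json"],
#                        capture_output=True, text=True).stdout
SAMPLE_OUTPUT = """
[
  {
    "port": {"address": "/dev/ttyACM0", "protocol": "serial"},
    "matching_boards": [
      {"name": "Arduino Uno", "fqbn": "arduino:avr:uno"}
    ]
  }
]
"""

def list_board_names(cli_json: str) -> list[str]:
    """Return the names of all detected boards from the CLI's JSON output."""
    names = []
    for entry in json.loads(cli_json):
        for board in entry.get("matching_boards", []):
            names.append(board["name"])
    return names

print(list_board_names(SAMPLE_OUTPUT))
```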

    In the video below, we’ll focus on how to start using the Arduino CLI in a terminal session. The tutorial will walk you through everything from setting up the required tools on your machine to the fastest way to compile and upload a sketch to your target board, so you can iterate quickly when developing your projects with Arduino.

    Video: https://www.youtube.com/watch?v=J-qGn1eEidA

    Website: LINK

  • Growth Monitor pi: an open monitoring system for plant science

    Reading Time: 3 minutes

    Plant scientists and agronomists use growth chambers to provide consistent growing conditions for the plants they study. This reduces confounding variables – inconsistent temperature or light levels, for example – that could render the results of their experiments less meaningful. To make sure that conditions really are consistent both within and between growth chambers, which minimises experimental bias and ensures that experiments are reproducible, it’s helpful to monitor and record environmental variables in the chambers.

    A neat grid of small leafy plants on a black plastic tray. Metal housing and tubing is visible to the sides.

    Arabidopsis thaliana in a growth chamber on the International Space Station. Many experimental plants are less well monitored than these ones.
    (“Arabidopsis thaliana plants […]” by Rawpixel Ltd (original by NASA) / CC BY 2.0)

    In a recent paper in Applications in Plant Sciences, Brandin Grindstaff and colleagues at the universities of Missouri and Arizona describe how they developed Growth Monitor pi, or GMpi: an affordable growth chamber monitor that provides wider functionality than other devices. As well as sensing growth conditions, it sends the gathered data to cloud storage, captures images, and generates alerts to inform scientists when conditions drift outside of an acceptable range.

    The authors emphasise – and we heartily agree – that you don’t need expertise with software and computing to build, use, and adapt a system like this. They’ve written a detailed protocol and made available all the necessary software for any researcher to build GMpi, and they note that commercial solutions with similar functionality range in price from $10,000 to $1,000,000 – something of an incentive to give the DIY approach a go.

    GMpi uses a Raspberry Pi Model 3B+, to which are connected temperature-humidity and light sensors from our friends at Adafruit, as well as a Raspberry Pi Camera Module.

    The team used open-source app Rclone to upload sensor data to a cloud service, choosing Google Drive since it’s available for free. To alert users when growing conditions fall outside of a set range, they use the incoming webhooks app to generate notifications in a Slack channel. Sensor operation, data gathering, and remote monitoring are supported by a combination of software that’s available for free from the open-source community and software the authors developed themselves. Their package GMPi_Pack is available on GitHub.
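
    A sketch of the alerting logic described above, in Python: the variable names, thresholds, and message format here are hypothetical, and GMPi_Pack's actual implementation may differ. The Slack payload follows the incoming-webhook convention of a JSON body with a "text" field.

```python
import json

def build_alerts(readings, limits):
    """Compare sensor readings against acceptable ranges.

    readings: e.g. {"temperature_c": 31.2, "humidity_pct": 40.0}
    limits:   e.g. {"temperature_c": (18.0, 28.0), ...}
    Returns one human-readable message per out-of-range variable.
    """
    alerts = []
    for name, value in readings.items():
        low, high = limits[name]
        if not (low <= value <= high):
            alerts.append(f"{name} = {value} outside [{low}, {high}]")
    return alerts

def slack_payload(alerts):
    """JSON body for a Slack incoming-webhook POST (send with urllib or requests)."""
    return json.dumps({"text": "\n".join(alerts)})

# A chamber running hot triggers a temperature alert; humidity stays in range.
readings = {"temperature_c": 31.2, "humidity_pct": 40.0}
limits = {"temperature_c": (18.0, 28.0), "humidity_pct": (30.0, 70.0)}
print(build_alerts(readings, limits))
```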

    With a bill of materials amounting to something in the region of $200, GMpi is another excellent example of affordable, accessible, customisable open labware that’s available to researchers and students. If you want to find out how to build GMpi for your lab, or just for your greenhouse, Affordable remote monitoring of plant growth in facilities using Raspberry Pi computers by Brandin et al. is available on PubMed Central, and it includes appendices with clear and detailed set-up instructions for the whole system.

    Website: LINK

  • You can now use Arduino to program Linux IoT devices

    Reading Time: 2 minutes

    Arduino Team, March 13th, 2018

    Today, at Embedded Linux Conference 2018, Arduino announced the expansion of the number of architectures supported by its Arduino Create platform for the development of IoT applications. With this new release, Arduino Create users can manage and program a wide range of popular Linux® single-board computers like the AAEON® UP² board, Raspberry Pi® and BeagleBone® as if they were regular Arduino boards. Multiple Arduino programs can run simultaneously on a Linux-based board and interact and communicate with each other, leveraging the capabilities provided by the new Arduino Connector. Moreover, IoT devices can be managed and updated remotely, independently from where they are located.

    To further simplify the user journey, Arduino has also developed a novel out-of-the-box experience for Raspberry Pi and BeagleBone boards, in addition to Intel® SBCs, which enables anyone to set up a new device from scratch via the cloud without any previous knowledge by following an intuitive web-based wizard. Arduino plans to continue enriching and expanding the set of features of Arduino Create in the coming months.

    “With this release, Arduino extends its reach into edge computing, enabling anybody with Arduino programming experience to manage and develop complex multi-architecture IoT applications on gateways,” said Massimo Banzi, Arduino CTO. “This is an important step forward in democratizing access to the professional Internet of Things.”

    “At Arduino we want to empower anyone to be an active player in the digital world. Being able to run Arduino code and manage connected Linux devices is an important step in this direction, especially for IoT applications that need more computing power, like AI and computer vision,” added Fabio Violante, Arduino CEO.



    Website: LINK

  • Gigabot X can 3D Print with Recycled Plastic Pellets

    Reading Time: 3 minutes

    Now live on Kickstarter is the Gigabot X, a large-scale, direct pellet extrusion 3D printer for fabricating with recycled plastic.

    Houston, Texas might seem like an unlikely location for a revolution in 3D printing, but this is where re:3D have announced the Gigabot X, an open source 3D printer that fabricates with pelletized plastic. The unit is specifically designed to accept recycled pellets, a cleaner and greener approach for fused deposition modeling.

    The official launch of the Kickstarter campaign for the Gigabot X took place at the SXSW Festival, with a campaign seeking $50,000 in funding. Pledges of $9,500 or more will secure backers an exclusive Gigabot X Beta 3D printer, plus 5 lbs of pellets to get started.

The first-generation Gigabot is an affordable large format 3D printer which was also a crowdfunding success story in 2013. But in launching the Gigabot X, the gang at re:3D reckon they’re fast approaching the realization of a goal 5 years in the making: a 3D printer that can print using plastic trash.

How so? The answer appears to lie in direct pellet extrusion. Melting small chunks of plastic instead of extruded filament as the input material makes 3D printing directly from recyclables an easier process.

    Gigabot X creates a Virtuous Cycle for 3D Printing

    There are other benefits that come from printing with pellets. It eliminates the need for extruded plastic filament, for example, which tends to be about 10x more expensive than pelletized plastic.
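To put that price gap in numbers, here is a back-of-the-envelope material cost comparison. The per-kilogram prices below are illustrative assumptions based on the rough 10x ratio mentioned above, not re:3D’s actual figures:

```python
# Illustrative feedstock prices (assumptions, not re:3D pricing).
FILAMENT_USD_PER_KG = 20.0  # typical spooled filament
PELLET_USD_PER_KG = 2.0     # pelletized plastic, roughly 10x cheaper

def material_cost(mass_kg: float, usd_per_kg: float) -> float:
    """Material cost of a print of the given mass."""
    return mass_kg * usd_per_kg

mass = 3.0  # a large Gigabot-class print, in kg
saving = (material_cost(mass, FILAMENT_USD_PER_KG)
          - material_cost(mass, PELLET_USD_PER_KG))
print(f"Material saving on a {mass:.0f} kg print: ${saving:.2f}")
```

At large Gigabot-scale print volumes, the feedstock savings add up quickly, which is the economic core of re:3D’s pitch.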

re:3D also say that direct pellet extrusion dramatically cuts printing times; in current tests, they’ve achieved print speeds up to 17x faster than a filament-fed Gigabot.

    There are other pellet printers already on the market, but they’re typically used in larger, more expensive manufacturing systems. According to the Kickstarter page:

    “Our goal, much like with the first-generation Gigabot, is to increase 3D printer accessibility and bridge the gap between cost and scale by creating an affordable, large-scale pellet printer.”

In addition to raising funds, the campaign has another important objective: to recruit a number of beta testers who will fine-tune the Gigabot X. With their feedback, these testers will collaborate with re:3D in an ongoing process of iteration and improvement.

And there will be some work ahead, to be sure. In addition to the direct pellet extruder, a small ecosystem of accessories is required for the Gigabot X. This includes a low-cost dryer, grinder, and feeder system.

It’s an ambitious plan, but if successful it could blaze the trail for 3D printing directly from ground-up plastic. Interested? Visit the official Gigabot X Kickstarter campaign page to learn more.



License: The text of “Gigabot X can 3D Print with Recycled Plastic Pellets” by All3DP is licensed under a Creative Commons Attribution 4.0 International License.


    Website: LINK

  • Open Source 3D Printed Clip-On Microscope For Smartphones

    Open Source 3D Printed Clip-On Microscope For Smartphones

    Reading Time: 3 minutes

    This is an open source design for a smartphone camera microscope which can be customized, downloaded and 3D printed.

    A team of researchers at RMIT University in Australia have developed a 3D printable clip-on microscope for smartphones. The design of the microscope has been shared on the Centre for Nanoscale BioPhotonics website.

    Led by Anthony Orth, a research officer at the centre at RMIT University, the device has been developed to allow anyone, from students to medical staff to people at home, to take a closer look at things invisible to the naked eye.

The solution also means that the microscope can be used in situations where laboratory equipment may not be available. This could be hugely beneficial in less developed countries, helping to detect malaria or other blood-borne parasites.

    Orth explains in a post for The Conversation:

    “What we’re hoping is that our design, or something like it, gets used for ultra simple, cheap and robust mobile phone based devices – be it for medical diagnostics in underserved areas such as the remote Australian outback and central Africa, or monitoring microorganism populations in local water sources.”

    The researchers are also hoping that the final design can be optimized further to suit different people’s needs.


    Bright and dark-field images taken with clip-on microscope. (Image: Nature)

    Bright and dark-field microscopy possibilities

    Orth and his team have proven that a smartphone already offers all the necessary parts to make a usable microscope. All that is missing is the magnification, which can be simply stuck on.

    Samples also require illumination, which was achieved by using the smartphone’s internal flash. The challenge was to point the flash in the right direction to be able to shine through a sample and then into a camera.

Traditionally, this requires the use of prisms or mirrors, but Orth and his team were able to diffuse the smartphone’s flash off regular plastic instead. The clip-on design includes an assortment of tunnels to confine the light and direct it into the camera.

In addition to using the camera’s flash for a bright-field microscopy experience, the researchers also demonstrated that sunlight can be used for illumination in a process called dark-field microscopy.

Finally, Orth recommends using Formlabs’ range of stereolithography (SLA) 3D printers to fabricate the design in black resin.

    Source: Nature


    Zooplankton using bright-field microscopy. (Image: Nature)


License: The text of “Open Source 3D Printed Clip-On Microscope For Smartphones” by All3DP is licensed under a Creative Commons Attribution 4.0 International License.


    Website: LINK

  • Original Prusa i3 MK2 Review: It Doesn’t Get Any Better

    Original Prusa i3 MK2 Review: It Doesn’t Get Any Better

    Reading Time: 13 minutes

    This is where 3D printing is right now, according to Thomas Sanladerer. Read his detailed and enthusiastic Original Prusa i3 MK2 review. 

    Don’t miss: Best Prusa i3 Clone – 24 Prusa i3 Kits vs Prusa i3 MK2

    Editor’s Note: This content originally appeared on Thomas Sanladerer’s YouTube Channel and is licensed as Creative Commons Attribution Share-alike thanks to his supporters on Patreon. If you are looking for the Prusa Mk3 review, please continue here

    So as it turns out, there are a number of issues with reviewing the Original Prusa i3 MK2.

    The first one being, it completely changed my frame of reference for how I’ll expect a printer to perform at a given price-point.

    And secondly… I guarantee you, there will be people calling me a sellout for this, but as always, this review was not influenced by anything other than my own experiences with this machine. I think this is the absolutely best goddamn 3D printer on the market right now.

    But let’s start out with what this i3 is. If you’re at all interested in 3D printing, you will have heard the name “Prusa i3” or just “i3” or even “i4” for various 3D printer kits before, some of which have practically nothing to do with what the i3 actually is.

    The thing is, “Prusa” is actually a person, Josef “Jo” Prusa from Prague, whose first popular design was the first Prusa Mendel, a cheaper and simpler version of the old Sells Mendel back in the day.

Skip forward to today and you’ll find an almost 60-person-strong team under the Prusa Research brand, engineering and selling what is now the Original Josef Prusa i3 “MK2” (or “Mark 2”, I guess). That and only that is what we’re looking at today.

    Original Prusa i3 MK2 Review: Overview

    1. Features & Specifications
    2. Assembly & Performance
    3. Verdict

    If you’ve seen articles like “Best Prusa i3 Clone – 24 Prusa i3 Kits vs Prusa i3 MK2”, some of those machines are based on the open source i3 design, but thinking you’ll get the exact same experience from any of the kits from Far East sellers would be like buying this Goophone i7 and expecting it to rival an actual Apple iPhone 7. You get the idea.

    Original Prusa i3 MK2 Review: Features & Specifications

    Now, of course, the Prusa i3 design is completely open-source, both the hardware and software, and the MK2 comes with a bunch of very clever features for both of them. Let’s have a look at what the Original Prusa i3 MK2 promises specs-wise.

So it’s still the familiar design: the vertical center plate carrying the Z and X axes, and the M12 threaded rod base that carries that vertical plate and the Y axis. This gives the Original Prusa i3 MK2 a slightly plus-sized build volume that’s 250 mm (about 10 inches) wide, 210 mm deep, and 200 mm tall.

It’s printing onto the MK42 heated bed, the solution to all problems, apparently. That’s a thick, custom PCB (printed circuit board) heater with no aluminum, glass, or anything else required to give it stiffness, since it’s already made from glass-fiber-reinforced resin; to get your prints to stick, there’s a thin PEI foil on top. This means it heats up and cools down fairly quickly, actually just as quickly as the hotend if you simply want to print PLA, and it also makes for a very light Y-axis setup.

    The MK42 heater PCB also has zones with different heating properties that compensate for the bed cooling down faster at its edges, so you’ll get a very even temperature distribution at any point of the bed, which is important for printing larger prints with high-temp plastics.

    And having a genuine all-metal E3D v6.1 hotend in here means that you can throw any material at the printer. Use PLA, ABS, PET, Nylon or particle-filled filaments like wood-infused materials with the stock setup and brass 0.4mm nozzle; or swap in a hardened or coated nozzle for glass or carbon-fiber-reinforced filaments; or add a Volcano heater and nozzle if you want to go, like, really fast.

    Or if you’d rather end up with even more precise prints instead, grab a finer 0.25mm nozzle. Spoiler alert: it already prints magnificently with the default setup, but of course, the v6.1 does give you a lot of flexibility there.


Next to the hotend, we find something I believe should be mandatory for any 3D printer sold today — a bed probe. And not just any probe, but a smaller-than-usual inductive one. Dubbed the P.I.N.D.A. probe (which apparently has a different meaning in Czech), it’s a custom-made sensor that of course takes up less space and also runs reliably off 5 V directly instead of requiring some sort of voltage level adaptation like the larger, standard industrial probes.

    The Original Prusa i3 MK2 uses the probe for several tasks. One, it does auto mesh bed leveling, which allows the printer to correct for a slight bow or warp in the build platform instead of just a planar misalignment. Two, if you built the Original Prusa i3 MK2 from the kit version, it also uses the embedded calibration spots in the MK2 heated bed to square up your X and Y axes, so even if you built it with the lower frame super poorly aligned to the rest of the machine, which can be tricky to get perfectly right, your prints will still come out square after you let the printer calibrate itself.
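The mesh part of that leveling can be sketched in a few lines: the firmware stores a grid of probed Z errors and, for every print move, blends the four surrounding measurements. The Python below is an illustrative sketch with made-up probe values; the real Marlin implementation differs in detail:

```python
def _cell(coords, v):
    """Index of the grid cell containing v, clamped to the probed area."""
    for k in range(len(coords) - 1):
        if v <= coords[k + 1]:
            return k
    return len(coords) - 2

def bilinear_z_offset(xs, ys, grid, x, y):
    """Z correction at (x, y), blended from the 4 surrounding probe points.

    grid[j][i] holds the probed Z error (in mm) at (xs[i], ys[j]).
    """
    i, j = _cell(xs, x), _cell(ys, y)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    z0 = grid[j][i] * (1 - tx) + grid[j][i + 1] * tx          # lower edge
    z1 = grid[j + 1][i] * (1 - tx) + grid[j + 1][i + 1] * tx  # upper edge
    return z0 * (1 - ty) + z1 * ty

# 3x3 probe mesh over a 250 x 210 mm bed (values are invented)
xs, ys = [0, 125, 250], [0, 105, 210]
grid = [[0.00, 0.05, 0.10],
        [0.02, 0.07, 0.12],
        [0.04, 0.09, 0.14]]
print(round(bilinear_z_offset(xs, ys, grid, 62.5, 52.5), 3))
```

Correcting per-move like this is what lets the printer handle a bowed bed, not just a tilted one.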

    Some reviewers actually left that part out completely. It does square itself up, no need to meticulously adjust it while building it. And both the mesh leveling and the auto-squaring were developed by the Prusa Research team and are now becoming part of the main Marlin firmware as well, so that everyone can use them. Open source for the win!

    You’ll mostly be operating the printer through this decidedly unspectacular LCD controller. But I do actually like the way the clickwheel knob looks with this flap, which makes it super easy to use with a quick flick of a finger. I know, it’s the smallest of all details, but those usually do make the biggest differences.


On the LCD, you get all the options for running the calibration routines and for loading and unloading filament. And these aren’t just dumb scripts; they will actually detect if something doesn’t look right, like the heaters not responding properly or the sensor not triggering at the height it’s expected to. So in plain words, if you mess up building the printer or something else fails, the Original Prusa i3 MK2 isn’t going to instantly destroy itself.

The entire machine is driven by a genuine Ultimachine Mini Rambo, which means reliable components for driving heaters and the like, as well as a solid fusing concept that will protect the machine should anything ever short out. On the other side, behind the frame, we find a generic power supply without a fan, which does get warm to the touch during regular use.

What’s awesome here is that it has this cover on its connector side, and this, in fact, also comes preinstalled even on the kit. You will not need to wire up mains voltage in your machine; you simply plug your power cord into the fused IEC connector and the other side into the Mini Rambo mainboard. That’s awesome! And the frame also gets grounded properly by having the power supply attached to it, and even stiffened up by having it brace the vertical frame against the subframe.


One thing about the entire wiring situation that stands out is that the most strained wire bundles, the ones going to the extruder and to the heated bed, actually include a piece of 3 mm Nylon filament to keep them from kinking and wearing out from repeatedly bending in the same spot. Short of using an actual drag chain, that’s what I think is one of the best ways of taking care of such a wire bundle.

    Original Prusa i3 MK2 Review: Assembly & Performance

    So if you’re deciding to build the Original Prusa i3 MK2 yourself, you should plan for a good five hours of assembly fun. And it was actually quite enjoyable. If you want to see my entire assembly process, check out the livestream recording here. It took me quite a bit longer, but then again, I was also trying to entertain about 500 people at the same time.

    The manual takes you through each step of the assembly, and then through the automatic calibration, and shows you how to prepare your own prints. While the pictures in the printed version aren’t particularly great, you can also pull up the additional online guide alongside it and augment the printed one with the images there.

    Now, Jo Prusa actually sent me two machines: one assembled, one as a kit. The assembled one actually came with a bit of shipping damage; it looked like the bed shipping lock came loose, broke its belt mount, and tore the LCD case off the frame. The latter only required a pair of zip ties to fix, and the belt mount, well, I used the part from the kit for that and then used the already assembled Original Prusa i3 MK2 to print a replacement part. But obviously, Prusa Research would just ship you the replacement part no problem.


    So with both machines assembled, it turns out they actually perform absolutely identically. If it weren’t for the signed frame, I’d have no way to tell them apart other than the serial number.

You get a testing protocol with each machine: for the kit, they hook up the components on a dummy setup; for the assembled one, they actually test all the components in the printer itself, recording how each part performs compared to how it should.

    And boy, do these MK2s perform well. Let me just show you the first “real” print I did on the assembled Original Prusa i3 MK2.

    This frog was printed live on stream, using the supplied sample GCode and Fillamentum Rapunzel Silver filament. And it looks absolutely perfect. There is literally nothing about this print that I could criticize, and that’s Ultimaker-level quality straight out of the box!


But what good would a single demo print be if you couldn’t print your own stuff this well? Well, it turns out you can do just that. Software-wise, Prusa Research are providing a full installer for Windows and Mac OS X, plus instructions on how to set up the tools if you’re using GNU/Linux. Their software package includes everything from drivers, a preconfigured slicer, a printer host, a Netfabb installer, and a color print tool to a firmware updater.

    Let’s go through those one by one: Drivers! The Original Prusa i3 MK2 still shows up as a serial port when you plug it into any USB port, so you can use it with any printer host, be it on a full computer or a Raspberry Pi with Octoprint or any other cloud printing solution. However, it also identifies straight-up as a 3D printer to Windows 10 and, I believe, also to Windows 8.1, so you can use the integrated 3D Builder app to print things or print directly from professional CAD tools like Solidworks without even needing to ever touch a separate slicer or 3D printer host.
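Under the hood, any host talking to that serial port speaks the standard RepRap protocol: each GCode line gets a line number and an XOR checksum, so the firmware can request a resend if serial data is corrupted. Here is a minimal host-side sketch of that framing (the generic RepRap convention, not Prusa-specific code):

```python
def frame_gcode(lines):
    """Add RepRap line numbers and checksums to a batch of GCode lines.

    The checksum is the XOR of every byte in the numbered line; the
    firmware recomputes it and asks for a resend on mismatch.
    """
    framed = []
    for n, line in enumerate(lines, start=1):
        body = f"N{n} {line}"
        checksum = 0
        for ch in body:
            checksum ^= ord(ch)
        framed.append(f"{body}*{checksum}")
    return framed

# e.g. a temperature report request followed by homing:
for cmd in frame_gcode(["M105", "G28"]):
    print(cmd)
```

This is all a host like OctoPrint or Pronterface does at the transport level, which is why the MK2 works with any of them.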

    That is pretty awesome, I think, and other than some 3D Systems and Stratasys machines, I don’t know of any other 3D printer that allows you to work that way yet. Basically, you get the Original Prusa i3 MK2 to show up as a printer device, you get a print queue for it and all applications that support the Windows 3D printer interface will be able to use it directly. Very, very nice.


    But of course, you can still use the traditional way of exporting your model as an STL file and taking that through a slicer. You get a pre-configured version of Slic3r, which is actually a newer and improved version compared to what you can officially download.

This includes a full profile for the printer, for various layer heights and after-market nozzle sizes – here’s where that smaller 0.25mm nozzle comes in – and for a bunch of different materials, covering all the basics from PLA, ABS, and PET to Taulman T-Glase and Bridge Nylon. All the ones I tried worked absolutely perfectly – unless, of course, I messed up the settings myself. You can still go in and tweak all of them; it’s just usually not necessary.

    If you prefer a different slicer, say Cura or Simplify3D, you can also download ready-to-go profiles for those from Prusa’s site.

Now, having a ready-to-rock slicer like this is, in my opinion, one of the easiest and most effective ways to add value to any 3D printer. Because I don’t want to mess with tuning in a 3D printer and have my first ten or so prints be complete failures, especially after I’ve just spent half a day assembling it. Having this sort of one-click solution to slicing available just completely removes that step from the equation, especially if you get profiles that are as well tuned as the ones the Original Prusa i3 MK2 comes with.

    Pretty much all of my prints with this machine so far were done with the exact stock profiles and I just don’t feel a need to tweak them unless I wanted to add a new material that’s not supported out of the box.

    One more cool feature I’ve been using for years on most of my custom printers is the hotend priming on the bed edge instead of having the slicer draw a skirt around the print for that. Basically, you get a more reliably primed hotend and don’t waste a whole bunch of space on your printbed.


Let’s move on with the software. You also get a firmware update tool for the Original Prusa i3 MK2, as the firmware is continuously being improved. Case in point: they’ve already had a look at the points where I screwed up in the unboxing, and the printer now tells you not to do those exact things.

There have also been some performance improvements already, but to be honest, I didn’t have any issues with the firmware running out of processing power anyway. If you’re using the supplied Slic3r install, you’ll even get a notification on the MK2’s LCD before a print if new firmware is available.

Then, color print! While the Original Prusa i3 MK2 is a single-color 3D printer, they’ve included some features to allow you to print in multiple colors by swapping filament mid-print. You can either do this through the LCD controller on any print (which you could also use simply to drop in a fresh spool of filament if your old one runs out) or by inserting color change positions into the ready-to-print GCode file before a print; at those positions, the printer will pause and ask you to swap its filament.
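The GCode side of such a color change is easy to sketch: a post-processor scans the file for Z moves and splices in an M600 (filament change) pause at each requested height. The hypothetical Python version below only illustrates the idea; Prusa’s actual color print tool works differently in detail:

```python
def insert_color_changes(gcode_lines, change_heights):
    """Insert an M600 pause before the first Z move at or above each height."""
    pending = sorted(change_heights)
    out = []
    for line in gcode_lines:
        if pending and line.startswith("G1 ") and " Z" in line:
            z = float(line.split(" Z")[1].split()[0])
            if z >= pending[0]:
                out.append("M600 ; pause for filament swap")
                pending.pop(0)
        out.append(line)
    return out

gcode = ["G1 Z0.2 F300", "G1 X10 Y10 E1", "G1 Z5.0 F300", "G1 X20 Y20 E2"]
print("\n".join(insert_color_changes(gcode, [5.0])))
```

When the firmware hits the M600, it parks the head, ejects the filament, and waits for you to load the new color before resuming.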

    Original Prusa i3 MK2 Review: A Flawless Experience?

    So overall, that’s pretty much a flawless experience with the Original Prusa i3 MK2 so far. Now of course, it’s still a regular FDM-based 3D printer, a very good one, but it still has its limits like any other machine.

I still had one print fail: this ginormous Squid Attack model, which I even scaled down, making it even harder to print. The overhangs on this one were just a bit too extreme; they ended up curling up and causing the printer to skip.


    Now there are two different run modes you can select on the Original Prusa i3 MK2, power and silent mode. I had the MK2s on silent mode for most of the time, and that really does make them comfortably quiet with the hotend fan as the loudest part.

    I guess going with power mode could have made the Squid Attack print go through successfully. Of course, the printer does also get significantly louder, so to make use of that mode, you should definitely have the machine in a room separate from your living room.

So let’s recap. The Original Josef Prusa i3 MK2 is a €739 ($845.79) kit or a €999 ($1,087.79) assembled machine that punches way, way above its weight class. While it’s not your super-streamlined mainstream-design 3D printer, it easily outperforms those with its form-follows-function approach, brings many innovative and actually useful features to the table, and prints like a champ.

Again, the Original Prusa i3 MK2 has the best and most consistent print quality – even straight out of the box with zero tuning – of any filament-based 3D printer I’ve ever seen. It’s literally got everything I’m looking for in a 3D printer right now. From now on, it will be my new benchmark, the one other printers will have to measure up against when it comes to ease of use, features, and raw print quality.


    Website: LINK

  • Building the Original Prusa i3 MK3: Review the Facts Here!

    Building the Original Prusa i3 MK3: Review the Facts Here!

    Reading Time: 2 minutes

    So what’s the big deal about the Original Prusa i3 MK3? It’s pitched as a refinement of everything Prusa Research have achieved to date; more than a reliability upgrade, but a new and improved desktop 3D printer with some awesome new features.

    Where the previous model offered astonishing print quality for the money, the Original Prusa i3 MK3 seeks to make the discipline of fused deposition modeling (FDM) more intuitive and easier than ever before, with a plethora of sensors to alert users to potential problems and prevent failed prints.

    There’s a lot to cover, but to summarize those new features:

    • Filament sensor
    • Power Panic
    • RPM sensing fans and Noctua
    • Ambient thermistor and P.I.N.D.A 2 with thermistor
    • EINSY RAMBo motherboard
    • Trinamic TMC2130 drivers with layer shift detection, faster and silent printing
    • New Y axis
    • Bondtech extruder
    • Magnetic MK52 Heatbed
    • Powder coated PEI spring steel print sheet
    • Ready for OctoPrint

The Filament Sensor uses an optical filament encoder to detect the presence and movement of filament. This provides an early warning when the filament is about to run out, so the machine can pause the print and prompt the user to insert a new spool. It can also detect stuck filament and recommend a “cold pull” to clean the nozzle and continue the print.
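The decision logic behind such a sensor can be sketched as a comparison of commanded versus measured filament movement. The pulse rate and tolerance below are made-up illustrative numbers, not values from the MK3 firmware:

```python
def check_filament(commanded_mm, encoder_pulses, pulses_per_mm=5.0,
                   tolerance=0.5):
    """Classify filament state from encoder feedback (illustrative only).

    If the extruder was told to push filament but the encoder saw far
    less movement, the filament has either run out or jammed.
    """
    if commanded_mm == 0:
        return "idle"
    measured_mm = encoder_pulses / pulses_per_mm
    if measured_mm < commanded_mm * tolerance:
        return "runout-or-jam"  # pause the print, prompt the user
    return "ok"

print(check_filament(10.0, 48))  # ~9.6 mm seen for 10 mm commanded
print(check_filament(10.0, 3))   # almost no movement detected
```

The same comparison distinguishes a clean runout (no movement at all) from a grinding jam (some movement, far less than commanded), which is why the MK3 can suggest a cold pull in the latter case.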

    Other upgrades are Power Panic, where the MK3 can recover and resume a print-job after a power loss, and a new EINSY RAMBo motherboard which is pitched as the most advanced 3D printer board currently available. How advanced is it? It can monitor power, for one, which allows for the detection of blown fuses. For another, it features Trinamic drivers which are super fast, quiet, and can detect (and correct) layer shift while printing.

Elsewhere, the Original Prusa i3 MK3 has a reworked Y axis for improved frame rigidity and an extra 10 mm of build volume on the Z height. And there’s an upgraded Bondtech drive-gear extruder, which grips the filament from both sides to increase push force, making it more reliable (especially with flexible filaments).

    But perhaps the biggest highlight of the MK3 is the new MK52 Magnetic HeatBed, which holds detachable alloy spring steel sheets powder coated with PEI. As the sheet cools down, parts can be popped off by simply flexing the sheet.

    So! Without further ado, let’s move onto the unboxing and build, shall we?

    Website: LINK

  • Ultimate 3D printer control with OctoPrint

    Ultimate 3D printer control with OctoPrint

    Reading Time: 3 minutes

    Control and monitor your 3D printer remotely with a Raspberry Pi and OctoPrint.

    Timelapse of OctoPrint Ornament

Printed on a bq Witbox. STL file can be found here: http://www.thingiverse.com/thing:191635 OctoPrint is located here: http://www.octoprint.org

    3D printing

    Whether you have a 3D printer at home or use one at your school or local makerspace, it’s fair to assume you’ve had a failed print or two in your time. Filament knotting or running out, your print peeling away from the print bed — these are common issues for all 3D printing enthusiasts, and they can be costly if they’re discovered too late.

    OctoPrint

OctoPrint is free, open-source software, created and maintained by Gina Häußge, that performs a multitude of useful 3D printing–related tasks, including remote control of your printer, live video, and data collection.

    The OctoPrint logo

    Control and monitoring

    To control the print process, use OctoPrint on a Raspberry Pi connected to your 3D printer. First, ensure a safe uninterrupted run by using the software to restrict who can access the printer. Then, before starting your print, use the web app to work on your STL file. The app also allows you to reposition the print head at any time, as well as pause or stop printing if needed.

    Live video streaming

    Since OctoPrint can stream video of your print as it happens, you can watch out for any faults that may require you to abort and restart. Proud of your print? Record the entire process from start to finish and upload the time-lapse video to your favourite social media platform.

    OctoPrint software graphic user interface screenshot

    Data capture

OctoPrint records real-time data, such as temperature, giving you another way to monitor your print and ensure a smooth, uninterrupted process. Moreover, the records will help with troubleshooting if there is a problem.
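That data is also exposed programmatically through OctoPrint’s REST API: a GET to /api/printer (authenticated with an X-Api-Key header) returns a JSON block of tool and bed temperatures. Here is a sketch of parsing such a response, using hard-coded sample data rather than a live printer:

```python
import json

# Sample shape of OctoPrint's GET /api/printer temperature block
# (hard-coded here; a real call needs the printer's URL and an
# X-Api-Key header carrying your OctoPrint API key).
sample = json.loads("""
{"temperature": {
   "tool0": {"actual": 214.8, "target": 215.0},
   "bed":   {"actual": 59.9,  "target": 60.0}}}
""")

def temps(state):
    """Flatten the temperature block into (name, actual, target) tuples."""
    return [(name, t["actual"], t["target"])
            for name, t in sorted(state["temperature"].items())]

for name, actual, target in temps(sample):
    print(f"{name}: {actual:.1f} / {target:.1f} C")
```

Polling this endpoint from a script or dashboard is an easy way to log a whole print’s thermal history for later troubleshooting.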

    OctoPrint software graphic user interface screenshot

Print the Millennium Falcon

    OK, you can print anything you like. However, this design definitely caught our eye this week.

    3D-Printed Fillenium Malcon (Timelapse)

    This is a Timelapse of my biggest print project so far on my own designed/built printer. It’s 500x170x700(mm) and weights 3 Kilograms of Filament.

    You can support the work of Gina and OctoPrint by visiting her Patreon account and following OctoPrint on Twitter, Facebook, or G+. And if you’ve set up a Raspberry Pi to run OctoPrint, or you’ve created some cool Pi-inspired 3D prints, make sure to share them with us on our own social media channels.

    Website: LINK