Tag: art

  • Technology meets creativity in two interactive art student projects

    Reading Time: 2 minutes

    Art and engineering are not separate concepts. There is a great deal of overlap between the two and many modern disciplines increasingly blur those lines. Mónica Riki is an “electronic artist and creative coder” who embodies that idea: you might remember her and her incredible Arduino UNO R4-powered installations from our blog post last year. In addition to her artistic practice, her technology-forward approach inspires her work as an educator, as she helps her master’s students develop hybrid concepts that use microcontrollers, sensors, lights and a variety of different technologies to create interactive art pieces. The level of creativity that technology is able to unleash is readily apparent in two of her students’ projects: Flora and Simbioceno.

    Video: https://www.youtube.com/watch?v=PvYn_yYEQ5w

    Flora, created by College of Arts & Design of Barcelona students Judit Castells, Paula Jaime, Daniela Guevara, and Mariana Pachón, is a board game in the form of an interactive art installation. It was inspired by nature, with gameplay occurring throughout a simulated ecosystem. An Arduino UNO R4 WiFi board handles the interactive elements, with additional hardware including NFC readers, motors and accompanying drivers, sensors, pumps, LEDs, and more. 

    Video: https://www.youtube.com/watch?v=rf9-mkIkiGY

    Simbioceno, by Ander Vallejo Larre, Andrea Galano Toro, Pierantonio Mangia, and Rocío Gomez, also uses an UNO R4 WiFi. It consists of two ecosystems: one aquatic and one aerial-terrestrial. They exist in symbiosis, communicating and sharing resources as necessary. Hardware includes LEDs, pumps, and biofeedback sensors. The students put particular thought into the construction materials, many of which are recycled or biomaterials. 

    Both projects are interactive art and expressions of creativity. While they do integrate technology, that technology isn’t the focal point. Instead, the technology helps to bring the two experiences to life.

    Feeling inspired by this creative use of the Arduino platform? We hope you’ll develop your own projects and share them with us and the entire community: contact creators@arduino.cc or upload directly to Project Hub! You could be our next Arduino Star.

    The post Technology meets creativity in two interactive art student projects appeared first on Arduino Blog.

    Website: LINK

  • Flux is a kinetic art installation brought to life with Arduino

    Reading Time: 2 minutes

    Art may be subjective, but all of our readers can appreciate the technology that goes into kinetic art. That term encompasses any piece of art that incorporates movement, which means it can be as simple as a sculpture that turns in the wind. But by integrating electronics, artists can achieve impressive effects. That was the case for Nicholas Stedman and his Devicist Design Works team, who built the Flux kinetic art installation for Shopify’s Toronto offices.

    Flux is a massive 40-foot-long kinetic art piece that hangs suspended from the ceiling in the Shopify offices. That length is divided into 20 individual planks, each of which contains two reflective prisms. The prisms rotate in different patterns, resulting in mesmerizing visuals as light reflects around the art piece and the surrounding office. It is striking in its industrial minimalism, but subtle enough that it blends into the space instead of overpowering it.

    Stedman’s team used stepper motors to rotate the prisms. Twenty Arduino Uno boards control the steppers through silent TMC2160 drivers and receive position feedback via AS5600 magnetic encoders. A Raspberry Pi single-board computer running a Node.js program coordinates the Arduino boards over USB. The team also developed 3D simulation software that lets them create animation patterns in a virtual space before deploying them in the real world.
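This Pi-plus-many-Arduinos split is a common pattern: the single-board computer handles the choreography while each microcontroller closes its own motion-control loop. As a rough illustration of how a coordinator might frame per-plank targets, here is a sketch in Python; the message format is invented for this example, since the team’s actual USB protocol isn’t described in the post:

```python
# Sketch of a coordinator that frames per-plank prism angles for 20 boards.
# The "P<plank>:<angle_a>,<angle_b>\n" format is hypothetical -- the Flux
# team's real serial protocol isn't published in the post.

NUM_PLANKS = 20
PRISMS_PER_PLANK = 2

def frame_command(plank, angles):
    """Build one newline-terminated command for a plank's two prisms."""
    if not 0 <= plank < NUM_PLANKS:
        raise ValueError("plank out of range")
    if len(angles) != PRISMS_PER_PLANK:
        raise ValueError("expected one angle per prism")
    a, b = (round(x % 360, 1) for x in angles)  # normalise to 0-360 degrees
    return f"P{plank}:{a},{b}\n"

# One animation frame = one command per plank, which the coordinating
# program would write to each board's serial port (e.g. with pyserial).
frame = [frame_command(i, (i * 9.0, i * 9.0 + 45.0)) for i in range(NUM_PLANKS)]
```

Keeping the per-prism feedback loop on each Arduino means the Pi only ever sends slow, high-level targets, which is what makes a 20-board chain over plain USB serial practical.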

    Video: https://www.youtube.com/watch?v=Z2o9WQWpmp4
    Boards: Uno
    Categories: Arduino

  • Art class stinks! Learn with smell in art class using this olfactory display

    Reading Time: 3 minutes

    By Maria Nikoli, Interaction Designer, MSc., Malmö University

    Smelling is crucial to our everyday living. But how well do we really understand the role that smells play in our day-to-day? Ask someone who temporarily lost their sense of smell because of COVID-19. They’ll probably tell you about how incredibly boring eating became all of a sudden, and how their roomies saved them from eating a foul-smelling, spoiled block of cheese that had zero mold on it. 

    The sense of smell is super important, as it makes life pleasurable, and helps us detect danger. It’s also intrinsically connected to memory and emotion. You probably know what it’s like to smell something and get an instant flashback – it almost feels like time travel. 

    Yet, olfaction (a fancy word for the sense of smell) is often overlooked in both HCI and art education. Building on that, “Art Class Stinks!” is an interactive system for learning with smell in art class while helping the students become more aware of their sense of smell.

    The prototype consists of two components. The first component is a mobile app that guides the user through processes of learning and being creative with smell, gives instructions for creative tasks and smell awareness tasks, and archives the users’ art. The second component is an olfactory display (OD). The OD consists of a scent kit and an Arduino-powered interactive board equipped with LED lights and RFID tag readers. Navigating the app, the user gets prompted to do several creative tasks using the scents for inspiration. They also get prompted to do smell identification tasks to raise their own awareness of their sense of smell. The interactive board links each scent note to the software and also indicates the ways in which the user can sniff the scent notes. 
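As a concrete (and entirely hypothetical) sketch of the kind of lookup the interactive board performs, an RFID tag ID can map to a scent note plus the LED that indicates it. All tag IDs and scent names below are invented for illustration; the project’s actual tag data isn’t published:

```python
# Hypothetical model of the olfactory display's tag-to-scent lookup:
# scanning a scent note's RFID tag selects the note and the LED that
# highlights it. IDs and names here are made up for illustration.

SCENT_TABLE = {
    "04A1B2": ("bergamot", 0),   # tag UID -> (scent note, LED index)
    "04C3D4": ("cedarwood", 1),
    "04E5F6": ("vanilla", 2),
}

def handle_tag(uid):
    """Return the (scent, led_index) pair for a scanned tag, or None."""
    return SCENT_TABLE.get(uid.upper())
```

The same table lets the app side know which note the user is sniffing, which is how the creative tasks and the smell-identification tasks can react to the physical scent kit.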

    Video: https://www.youtube.com/watch?v=-0P8fpRPwz4

    Find out more about this project on Instagram (@marianikolistudio) and Malmö University’s digital archive.

    Categories: Arduino

  • This piece of art knows when it’s being photographed thanks to tinyML

    Reading Time: 2 minutes

    Arduino Team, September 9th, 2022

    Nearly all art functions in just a single direction by allowing the viewer to admire its beauty, creativity, and construction. But Estonian artist Tauno Erik has done something a bit different thanks to embedded hardware and the power of tinyML. His work is able to actively respond to a person whenever they bring up a cell phone to take a picture of it.

    At the center are four primary circuits/components, which include a large speaker, an abstract LED sculpture, an old Soviet-style doorbell board, and a PCB housing the control electronics. The circuit contains an Arduino Nano 33 BLE Sense along with an OV7670 camera module that can capture objects directly in front. Tauno then trained a machine learning model with the help of Edge Impulse on almost 700 images that were labeled as human-containing, cell phone, or everything else/indeterminate. 

    With the model trained and deployed to the Nano 33 BLE Sense, a program was written that grabs a frame from the camera, converts its color space to 24-bit RGB, and sends it to the model for inferencing. The resulting label can then be used to activate the connected doorbell and play various animations on the LED sculpture.
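The colour-space step is worth unpacking. The OV7670 is commonly configured to output 16-bit RGB565 pixels, so converting to 24-bit RGB means expanding each channel. A minimal sketch of that conversion, assuming RGB565 input (the post only says the frame is converted to 24-bit RGB):

```python
def rgb565_to_rgb888(pixel):
    """Expand one 16-bit RGB565 pixel to a 24-bit (R, G, B) triple.

    Assumes the camera delivers RGB565, a common OV7670 mode; the post
    doesn't state the exact source format.
    """
    r5 = (pixel >> 11) & 0x1F   # top 5 bits: red
    g6 = (pixel >> 5) & 0x3F    # middle 6 bits: green
    b5 = pixel & 0x1F           # bottom 5 bits: blue
    # Replicate the high bits into the low bits so full scale maps to 255.
    r8 = (r5 << 3) | (r5 >> 2)
    g8 = (g6 << 2) | (g6 >> 4)
    b8 = (b5 << 3) | (b5 >> 2)
    return (r8, g8, b8)
```

The bit-replication trick keeps pure black at (0, 0, 0) and pure white at (255, 255, 255), which a plain left-shift alone would not.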

    Video: https://www.youtube.com/watch?v=u6I8PXLvG6I

    More details about this project can be found here on Tauno’s website.

  • Brenda is classic automata nightmare fuel

    Reading Time: 2 minutes

    Arduino Team, July 5th, 2022

    Art is a strange thing. Sometimes its purpose is purely aesthetic. Sometimes it makes a statement. And sometimes it exists to disturb. Kinetic art is no different and some robots fall into this category. Graham Asker’s art elicits pondering on the relationship between humans and robots, as well as the relationships between different robots. But as Brenda, a classical-style automaton, demonstrates, Asker’s art can also induce nightmares.

    Brenda and her companion Brian are strange, bodiless robots designed to mimic the aesthetics of automatons from myth and history. Each robot is a construction of beautiful brass, mechanical joints, linkages, and cables. Servos hidden inside the bases of the robots actuate the various joints, giving Brenda and Brian the ability to emote. Most of their “facial” movement is in their eyes. Lifelike eyeballs look around from within heavy eyelids, while pivoting eyebrows help to convey expressions.

    Arduino boards, also hidden within the robots’ bases, control the servos that actuate the joints. Asker programmed the sketches with a variety of different servo movements that correspond to facial expressions and eye movements. Brenda even received lips, so she can smile – or frown. Both robots’ bases rotate, so the robots can turn to look at their surroundings. Brenda and Brian do not have any communications hardware, so they can’t interact with each other, but Asker can sync their pre-coded movements to create the illusion that they do.

    Asker, who is a retired engineer with a Master’s degree in fine art, displayed Brenda at London’s Espacio Gallery and on the Walthamstow Art Trail.

  • Coding for kids: Art, games, and animations with our new beginners’ Python path

    Reading Time: 7 minutes

    Python is a programming language that’s popular with learners and educators in clubs and schools. It is also widely used by professional programmers, particularly in the data science field. Many educators and young people like how similar the Python syntax is to the English language.

    Two girls code together at a computer.

    That’s why Python is often the first text-based language that young people learn to program in. The familiar syntax can lower the barrier to taking the first steps away from a block-based programming environment, such as Scratch.

    In 2021, Python ranked first in an industry-standard popularity index published by a major software quality assessment company, confirming its favoured position in software engineering. Python is, for example, championed by Google and used in many of its applications.

    Coding for kids in Python

    Python’s popularity means there are many excellent resources for learning this language. These resources often focus on creating programs that produce text outputs. We wanted to do something different.

    Two young people code at laptops.

    Our new ‘Introduction to Python’ project path focuses on creating digital visuals using the Python p5 library. This library is like a set of tools that allows you to get creative by using Python code to draw shapes, edit images, and create frame-by-frame animations. That makes it the perfect choice for young learners: they can develop their knowledge and skills in Python programming while creating cool visuals that they’ll be proud of. 

    What is in the ‘Introduction to Python’ path?

    The ‘Introduction to Python’ project path is designed according to our Digital Making Framework, encouraging learners to become independent coders and digital makers by gently removing scaffolding as they progress along the projects in a path. Paths begin with three Explore projects, in which learners are guided through tasks that introduce them to new coding skills. Next, learners complete two Design projects. Here, they are encouraged to practise their skills and bring in their own interests to personalise their coding creations. Finally, learners complete one Invent project. This is where they put everything that they have learned together and create something unique that matters to them.

    Emoji, archery, rockets, art, and movement are all part of this Python path.

    The structure of our Digital Making Framework means that learners experience the structured development process of a coding project and learn how to turn their ideas into reality. The Framework also supports learners with finding errors in their code (debugging), showing them that errors are a part of computer programming and just temporary setbacks that they can overcome.

    What coding skills and knowledge will young people learn?

    The Explore projects are where the initial learning takes place. The key programming concepts covered in this path are:

    • Variables
    • Performing calculations with variables
    • Using functions
    • Using selection (if, elif and else)
    • Using repetition (for loops)
    • Using randomisation
    • Importing from libraries

    Learners also explore aspects of digital visual media concepts:

    • Coordinates
    • RGB colours
    • Screen size
    • Layers
    • Frames and animation

    Learners then develop these skills and knowledge by putting them into practice in the Design and Invent projects, where they add in their own ideas and creativity. 

    Explore project 1: Hello world emoji

    In the first Explore project of this path, learners create an interactive program that uses emoji characters as the visual element.

    This is the first step into Python and gets learners used to the syntax for printing text, using variables, and defining functions.

    Explore project 2: Target practice

    In this Explore project, learners create an archery game. They are introduced to the p5 library, which they use to draw an archery board and create the arrows.

    The new programming concept covered in this project is selection, where learners use if, elif and else to allocate points for the game.

    Explore project 3: Rocket launch

    The final Explore project gets learners to animate a rocket launching into space. They create an interactive animation where the user is asked to enter an amount of fuel for the rocket launch. The animation then shows if the fuel is enough to get the rocket into orbit.

    The new programming concept covered here is repetition. Learners use for loops to animate smoke coming from the exhaust of the rocket.

    Design project 1: Make a face

    The first Design project allows learners to unleash their creativity by drawing a face using the Python coding skills that they have built in the Explore projects. They have full control of the design for their face and can explore three examples for inspiration.

    Learners are also encouraged to share their drawings in the community library, where there are lots of fun projects to discover already. In this project, learners apply all of the coding skills and knowledge covered in the Explore projects, including selection, repetition, and variables.

    Design project 2: Don’t collide!

    In the second Design project, learners code a scrolling game called ‘Don’t collide’, where a character or vehicle moves down the screen while having to avoid obstacles.

    Learners can choose their own theme for the game, and decide what will move down the screen and what the obstacles will look like. In this project, they also get to practise everything they learned in the Explore projects.

    Invent project: Powerful patterns

    This project is the ultimate chance for learners to put all of their skills and knowledge into practice and get creative. They design their own unique patterns and create frame-by-frame animations.

    The Invent project offers ingredients, which are short reminders of all the key skills that learners have gained while completing the previous projects in the path. The ingredients encourage them to be independent whilst also supporting them with code snippets to help them along.

    Key questions answered

    Who is the Introduction to Python path for?

    We have written the projects in the path with young people aged around 9 to 13 in mind. To code in a text-based language, a young person needs to be familiar with using a keyboard, due to the typing involved. A learner may have completed one of our Scratch paths prior to this one, but this isn’t essential, and we encourage beginner coders to take this path first if that is their choice.

    A young person codes at a Raspberry Pi computer.

    What software do learners need to code these projects?

    A web browser. In every project, starter code is provided in a free web-based development environment called Trinket, where learners add their own code. The starter Trinkets include everything that learners need to use Python and access the p5 library.

    If preferred, the projects also include instructions for using a desktop-based programming environment, such as Thonny.

    How long will the path take to complete?

    We’ve designed the path to be completed in around six one-hour sessions, with one hour per project. However, the project instructions encourage learners to upgrade their projects and go further if they wish. This means that young people might want to spend a little more time getting their projects exactly as they imagine them. 

    What can young people do next after completing this path?

    Taking part in Coolest Projects Global

    At the end of the path, learners are encouraged to register a project they’re making with their new coding skills for Coolest Projects Global, our world-leading online technology showcase for young people.

    Three young tech creators show off their tech project at Coolest Projects.

    Taking part is free, all online, and beginners as well as more experienced young tech creators are welcome and invited. This is their unique opportunity to share their ingenuity in an online gallery for the world and the Coolest Projects community to celebrate.

    Coding more Python projects with us

    Coming very soon is our ‘More Python’ path. In this path, learners will move beyond the basics they learned in Introduction to Python. They will learn how to use lists, dictionaries, and files to create charts, models, and artwork. Keep your eye on our blog and social media for the release of ‘More Python’.

  • Raspberry Pi interactive wind chimes

    Reading Time: 2 minutes

    Grab yourself a Raspberry Pi, a Makey Makey, and some copper pipes: it’s interactive wind chime time!

    Normal wind chimes pale in comparison

    I don’t like wind chimes. There, I said it. I also don’t like the ticking of the second hand of analogue clocks, and I think these two dislikes might be related. There’s probably a name for this type of dislike, but I’ll leave the Googling to you.

    Sound designer Frazer Merrick’s interactive wind chimes may actually be the only wind chimes I can stand. And this is due, I believe, to the wonderful sounds they create when they touch, much more wonderful than regular wind chime sounds. And, obviously, because these wind chimes incorporate a Raspberry Pi 3.

    Perpetual Chimes is a set of augmented wind chimes that offer an escapist experience where your collaboration composes the soundscape. Since there is no wind indoors, the chimes require audience interaction to gently tap or waft them and encourage/nurture the hidden sounds within — triggering sounds as the chimes strike one another. Since the chimes make little acoustic noise, essentially they’re broken until you collaborate with them.

    Follow the Instructables tutorial to create your own!

  • These interactive drawing machines are inspired by Japanese zen gardens

    Reading Time: 2 minutes

    Arduino Team, August 27th, 2019

    Artist Jo Fairfax has created automated drawing machines inspired by carefully manicured Japanese rock gardens, AKA zen gardens. The mesmerizing artwork uses magnets and motors that move underneath a bed of iron filings, generating soothing shapes when a motion sensor detects viewers coming near.

    An Arduino Uno is utilized for the device, or rather devices, and you can see a square “magnet garden” in the first video below, automatically producing a circular pattern. A (non-square) rectangular garden sketches a sort of snake/wave pattern in the second clip. 

    The build is reminiscent of sand drawing machines that rotate a metal marble through magnetic force, but does away with a visible source of movement as the filings react directly to the magnetic field as it’s applied.

    An Arduino Uno is programmed to set off a mechanism with integrated magnets below the platform of iron filings. Each time a viewer approaches the machine, it starts to ‘draw’ and agitate the black particles, moving them around the platforms. Slowly the drawings become three dimensional and the sense of the magnets’ tracing becomes visible.

    The charged iron filings create varying geometric clusters that shape the zen gardens. The drawing machines reveal the forces acting on them, imitating grass and sand that react to the natural force of the wind. The gesture of the viewer’s movement that activates the machine coupled with the magnetic power makes the artwork become a dialogue of forces… elegant and subtle, just like a zen garden.

  • Get obsessed with art in Lucid Realities Studio’s Viveport Developer Story

    Reading Time: 2 minutes

  • Bringing a book to life with Raspberry Pi | Hello World #9

    Reading Time: 4 minutes

    Sian Wheatcroft created an interactive story display to enable children to explore her picture book This Bear, That Bear. She explains the project, and her current work in teaching, in the newest issue of Hello World magazine, available now.

    The task of promoting my first children’s picture book, This Bear, That Bear, was a daunting one. At the time, I wasn’t a teacher and the thought of standing in front of assembly halls and classrooms sounded terrifying. As well as reading the book to the children, I wanted to make my events interactive using physical computing, showing a creative side to coding and enabling a story to come to life in a different way than what the children would typically see, i.e. animated retellings.

    The plan

    Coming from a tech-loving family, I naturally gravitated towards the Raspberry Pi, and found out about Bare Conductive and their PiCap. I first envisaged using their conductive paint on the canvas, enabling users to touch the paint to interact with the piece. It would be some sort of scene from the book, bringing some of the characters to life. I soon scrapped that idea, as I discovered that simply using copper tape on the back of the canvas was conductive enough, which also allowed me to add colour to the piece.

    I enlisted the help of my two sons (two and five at the time) — they gladly supplied their voices to some of the bears and, my personal favourite on the canvas, the ghost. The final design features characters from the book — when children touch certain areas of the canvas, they hear the voices of the characters.

    The back of the canvas, covered in copper tape

    Getting the project up and running went pretty smoothly. I do regret making the piece so large, though, as it proved difficult to transport across the country, especially on the busy London Underground!

    Interactivity and props

    The project added a whole other layer to the events I was taking part in. In schools, I would read the book and have props for the children to wear, allowing them to act out the book as I read aloud. The canvas then added further interaction, and it surprised me how excited the children were about it. They were also really curious and wanted to know how it worked. I enjoyed showing them the back of the canvas with all its copper tape and crocodile clips. They were amazed by the fact it was all run on the Raspberry Pi — such a tiny computer!

    The front of the interactive canvas

    Fast-forward a few years, and I now find myself in the classroom full-time as a newly qualified teacher. The canvas has recently moved out of the classroom cupboard into my newly developed makerspace, in the hope of a future project being born.

    I teach in Year 3, so coding in Python or using the command line on Raspbian may be a little beyond my students. However, I have a keen interest in project-based learning and am hoping to incorporate a host of cross-curricular activities with my students involving the canvas.

    I hope to instil a love for digital making in my students and, in turn, show senior leaders what can be done with such equipment and projects.

    A literacy project

    This work really lends itself to a literacy project that other educators could try. Perhaps you’re reading a picture book or a more text-based piece: why not get the students to design the canvas using characters from the story? The project would also work equally well with foundation subjects like History or Science. Children could gather information onto the canvas, explaining how something works or how something happened. The age of the children would influence the level of involvement they had in the rest of the project’s creation. The back end could be pre-made — older children could help with the copper tape and wiring, while younger children could stop at the design process.

    Part of the project is getting the children to create sounds to go with their design, enabling deeper thinking about a story or topic.

    It’s about a collaborative process with the teacher and students, followed by the sharing of their creation with the broader school community.

    Get Hello World magazine issue 9 for free

    The brand-new issue of Hello World is available right now as a free PDF download from the Hello World website.

    UK-based educators can also subscribe to receive Hello World as a printed magazine for free, direct to their door. And those outside the UK, educator or not, can subscribe to receive free digital issues of Hello World in their inbox on the day of their release.

    Head to helloworld.raspberrypi.org to sign up today!

  • This machine creates images using Skittles as pixels

    Reading Time: 2 minutes

    Arduino Team, July 9th, 2019

    Skittles candies come in various vibrant colors. While they may be a tasty treat, JohnO3 had another idea: to create an amazing automated display for the little circles. 

    His device, dubbed the “Skittle Pixel8r,” uses an Arduino Mega to pull a dispensing funnel along a linear axis to any one of 46 channels, covered on one side with a piece of glass.

    On top of the shuttle mechanism, eight boxes release the correct flavor/color into an intermediate tube via individual metal gear servos. The Arduino then commands the linear axis to move the funnel to the appropriate bin. This process is repeated 2,760 times until an image, measuring up to 785 x 610mm (31 x 24 inches), is completed. 
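The numbers in the post fit together neatly: 2,760 candies across 46 channels works out to 60 Skittles per channel, i.e. a 46 × 60 grid. A quick check (assuming the 46 channels span the 785 mm width, which the post doesn’t state):

```python
# Quick check on the figures in the post: 46 channels and 2,760 candies
# imply 60 Skittles per channel, i.e. a 46 x 60 "pixel" grid.
channels = 46
total_skittles = 2760
per_channel = total_skittles // channels  # 60

# At the stated maximum size of 785 x 610 mm, each Skittle "pixel"
# occupies roughly 17 x 10 mm of the frame (assuming channels run
# across the width -- the post doesn't say which way they're oriented).
width_mm, height_mm = 785, 610
pixel_w = width_mm / channels      # ~17.1 mm per column
pixel_h = height_mm / per_channel  # ~10.2 mm per row
```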

    The Skittle Pixel8r is an incredible build, and perhaps we could see it expanded even further to not just dispense, but also sort Skittles as an all-in-one auto art installation! Code and files for the project can be found here.

  • Quick Fix — a vending machine for likes and followers

    Reading Time: 2 minutes

    Sometimes we come across a project that just scores a perfect 10 on all fronts. This is one of them: an art installation using Raspberry Pi that has something interesting to say, does it elegantly, and is implemented beautifully (nothing presses our buttons like a make that’s got a professionally glossy finish like this).

    Quick Fix is a vending machine (and art installation) that sells social media likes and followers. Drop in a coin, enter your social media account name, and an army of fake accounts will like or follow you. I’ll leave the social commentary to you. Here’s a video from the maker, Dries Depoorter:

    Quick Fix – the vending machine selling likes and followers

    Quick Fix is an interactive installation by Dries Depoorter. The artwork makes it possible to buy followers or likes in just a few seconds. For a few euros you already have 200 likes on Instagram. “Quick Fix” is easy to use. Choose your product, pay, and fill in your social media username.

    There’s a Raspberry Pi 3B+ in there, along with an Arduino, powering a coin acceptor and some I2C LCD screens. Then there’s a stainless steel heavy-duty keyboard, which we’re lusting after (a spot of Googling unearthed this, which appears to be the same thing, if you’re in the market for a panel-mounted beast of a keyboard).

    This piece was commissioned by Pixelache, a cultural association from Helsinki, whose work looks absolutely fascinating if you’ve got a few minutes to browse. Thanks to them and to Dries Depoorter — I have a feeling this won’t be the last of his projects we’re going to feature here.

  • Make art with LEDs | HackSpace magazine #16

    Reading Time: 3 minutes

    Create something beautiful with silicon, electricity, your endless imagination, and HackSpace magazine issue 16 — out today!

    HackSpace magazine 16

    LEDs are awesome

    Basically, LEDs are components that convert electrical power into light. Connect them to a power source (with some form of current limiter) in the right orientation, and they’ll glow.

    Each LED has a single colour. Fortunately, manufacturers can pack three LEDs (red, green, and blue) into a single component, and varying the power to each LED-within-an-LED produces a wide range of hues. However, by itself, this type of colourful LED is a little tricky to control: each requires three inputs, so a simple 10×10 matrix would require 300 inputs. But there’s a particular trick electronics manufacturers have that makes RGB LEDs easy to use: making the LEDs addressable!
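The 300-input figure is just the pin arithmetic:

```python
# The "300 inputs" figure: driving a 10x10 matrix of RGB LEDs directly
# needs three control lines (red, green, blue) per LED.
rows, cols, channels_per_led = 10, 10, 3
direct_drive_lines = rows * cols * channels_per_led  # 300 lines

# An addressable chain of the same 100 LEDs, by contrast, hangs off a
# single data pin (plus a clock pin for some types).
```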

    An RGB LED

    Look: you can clearly see the red, green, and blue elements of this RGB LED

    Addressable LEDs

    Addressable LEDs have microcontrollers built into them. These aren’t powerful, programmable microcontrollers; they’re just able to handle a simple communications protocol. There are quite a few different types of addressable LEDs, but two are most popular with makers: WS2812 (often called NeoPixels) and APA102 (often called DotStars). Both are widely available from maker stores and direct-from-China websites. NeoPixels use a single data line, while DotStars use a data line and a clock line. Both, however, are chainable. This means that you connect one (for NeoPixels) or two (for DotStars) pins of your microcontroller to the Data In connectors on the first LED, then the output of this LED to the input of the next, and so on.
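The chaining can be modelled simply: each LED latches the first pixel’s worth of data it sees and forwards everything else downstream, which is why a single data pin can drive a long strip. A toy model of that data flow (it deliberately ignores the real WS2812 signalling, which encodes each bit as a timed pulse):

```python
def shift_through_chain(data, num_leds):
    """Toy model of a NeoPixel-style chain: each LED latches the first
    colour triple it sees and forwards the remainder downstream. This
    models only the data flow, not the actual pulse-width signalling."""
    latched = []
    for _ in range(num_leds):
        if not data:
            break
        latched.append(data[0])
        data = data[1:]  # everything after the first pixel is forwarded
    return latched

# One data line, three LEDs. (Real WS2812s expect colours in GRB order.)
pixels = [(0, 255, 0), (255, 0, 0), (0, 0, 255)]
chain = shift_through_chain(pixels, num_leds=3)
```

Because each LED strips one pixel off the front of the stream, the controller just sends one long burst of colour data and every LED in the chain ends up with its own value.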

    Exactly how many LEDs you can chain together depends on a few different things, including the power of the microcontroller and the intended refresh rate. Often, though, the limiting factor for most hobbyists is the amount of electricity you need.

    Which type to use

    The big difference between NeoPixels and DotStars comes down to speed. LEDs are dimmed by turning them off and on very quickly: the greater the proportion of time they’re off, the dimmer they appear. This is known as pulse-width modulation (PWM). The speed of this blinking can have implications for some makes, such as when the LEDs are moving quickly.
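    To make the PWM idea concrete, here is a minimal Python sketch (our own illustration, not code from the HackSpace feature) showing how the duty cycle of the on/off blinking determines perceived brightness:

    ```python
    def pwm_wave(duty, period=10):
        """One PWM period as a list of on/off samples for the given duty cycle."""
        on_samples = round(duty * period)
        return [1] * on_samples + [0] * (period - on_samples)

    def perceived_brightness(duty, period=100):
        # Averaged over a full period, brightness tracks the fraction of on-time
        wave = pwm_wave(duty, period)
        return sum(wave) / len(wave)
    ```

    A faster PWM rate means each period is shorter, so the blinking is harder to catch, which is why DotStars fare better than NeoPixels in long-exposure or fast-motion shots.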

    NeoPixels

    • Cheap
    • Slowish refresh rate
    • Slowish PWM rate

    DotStars

    • More expensive
    • Faster refresh rate
    • Fast PWM rate
    NeoPixels moving in the dark

    As a NeoPixel is moved through a long-exposure photograph, you can see it blink on and off. DotStars – which have a faster PWM rate – avoid this.

    Safety first!

    HackSpace magazine’s LED feature is just a whistle-stop guide to the basics of powering LEDs — it’s not a comprehensive guide to all things power-related. Once you go above a few amperes, you need to think about what you’re doing with power. Once you start to approach double figures, you need to make sure you know what you’re doing and, if you find yourself shopping for an industrial power supply, then you really need to make sure you know how to use it safely.

    Read more

    Read the rest of the exclusive 14-page LED special in HackSpace magazine issue 16, out today. Buy your copy now from the Raspberry Pi Press store, major newsagents in the UK, or Barnes & Noble, Fry’s, or Micro Center in the US. Or, download your free PDF copy from the HackSpace magazine website.

    HackSpace magazine 16 Front Cover

    We’re also shipping to stores in Australia, Hong Kong, Canada, Singapore, Belgium, and Brazil, so be sure to ask your local newsagent whether they’ll be getting HackSpace magazine.

    Subscribe now

    Subscribe to HackSpace on a monthly, quarterly, or twelve-month basis to save money against newsstand prices.

    Twelve-month print subscribers get a free Adafruit Circuit Playground Express, loaded with inputs and sensors and ready for your next project. Tempted?

    Website: LINK

  • Play multiple sounds simultaneously with a Raspberry Pi

    Play multiple sounds simultaneously with a Raspberry Pi

    Reading Time: 2 minutes

    Playing sound through a Raspberry Pi is a simple enough process. But what if you want to play multiple sounds through multiple speakers at the same time? Lucky for us, Devon Bray figured out how to do it.

    Play multiple audio files simultaneously with Raspberry Pi

    Artist’s website: http://www.saradittrich.com/
    Blog post: http://www.esologic.com/multi-audio/

    Multiple audio files through multiple speakers

    While working with artist Sara Dittrich on her These Blobs installation for Provincetown Art Association and Museum, Devon was faced with the challenge of playing “8 different mono sound files on 8 different loudspeakers”. Not an easy task, and one that most online tutorials simply do not cover.

    These Blobs - Sara Dittrich

    These Blobs by Sara Dittrich

    Turning to the sounddevice Python library for help, Devon got to work designing the hardware and code for the project.

    The job was to create a box that could play eight different audio files at the same time on eight different unpowered speakers. New audio files had to be loadable from a USB thumb drive, so the user could easily switch files without any sort of UI. Everything also had to be under five inches tall and super easy to power on and off.

    Devon’s build uses a 12 V, 10 A power supply fed through a DC/DC converter. This supply powers the Raspberry Pi 3B+ and four $15 audio amplifiers, which in turn drive simple unpowered speakers designed for use in laptops. As the sound is only required in mono, each of the four amplifiers can provide two audio tracks, each track using a channel usually reserved for left or right audio output.
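    The concurrency pattern behind a build like this (one independent stream per output device, all running at once) can be sketched in plain Python. Here DummyOutput is a hypothetical stand-in for a real per-device audio stream such as a sounddevice.OutputStream; the threading structure, not the audio API, is the point:

    ```python
    import threading

    class DummyOutput:
        """Hypothetical stand-in for a real per-device audio stream
        (e.g. a sounddevice.OutputStream opened on one output device)."""
        def __init__(self, device_id):
            self.device_id = device_id
            self.samples_written = 0

        def write(self, chunk):
            self.samples_written += len(chunk)

    def stream_file(output, samples, chunk_size=256):
        # Feed this device its own file in fixed-size chunks
        for i in range(0, len(samples), chunk_size):
            output.write(samples[i:i + chunk_size])

    def play_simultaneously(files):
        """Play one file per output device at the same time, one thread each."""
        outputs = [DummyOutput(i) for i in range(len(files))]
        threads = [threading.Thread(target=stream_file, args=(out, data))
                   for out, data in zip(outputs, files)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return outputs
    ```

    With real streams, each thread would block on the device’s buffer, so the eight files stay in step without any explicit synchronisation.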

    A full breakdown of the project can be seen in the video above, with more information available on Devon’s website, including the link to the GitHub repo.

    And you can see the final project in action too! Watch a video of Sara Dittrich’s installation below, and find more of her work on her website.

    These Blobs

    Poem written and recorded by Daniel Sofaer; speakers, conduit, clay, spray paint, electrical components; 4′ × 4′ × 5′.

    Website: LINK

  • Adding the Pi to Picasso with wireless digital graffiti

    Adding the Pi to Picasso with wireless digital graffiti

    Reading Time: 3 minutes

    It looks like the Nintendo Wii Remote (Wiimote) has become a staple in many maker toolkits! Case in point: with the help of a Raspberry Pi and the cwiid Python library, David Pride turned the popular piece of tech into a giant digital graffiti spray can.

    Raspberry Pi-powered Nintendo Wiimote digital art

    Using the Wiimote with a Raspberry Pi

    While it’s no longer being updated and supported, the cwiid library is still a handy resource for creators who want to integrate the Wiimote with their Raspberry Pi.

    Raspberry Pi-powered Nintendo Wiimote digital art

    Over the years, makers have used the Wiimote to control robots, musical instruments, and skateboards; the accessibility of the library plus the low cost and availability of the remote make using this tool a piece of cake…or pie, in this instance.

    Digital graffiti

    Using a Wiimote, a Wii Sensor Bar, and a large display, David Pride hacked his way to digital artistry wonderment and enabled attendees of the Open University Knowledge Makers event to try their hand at wireless drawing. It’s kinda awesome.

    OK, it’s all kinds of awesome. We really like it.

    Digital graffiti ingredients

    To construct David’s digital graffiti setup, you’ll need:

    • A Raspberry Pi
    • A Nintendo Wii Remote and a Wii Sensor Bar
    • A power supply and DC/DC power converter
    • A large display, e.g. a TV or projector screen
    • A 30mm × 30mm mirror and this 3D-printed holder

    Putting it all together

    David provides the step-by-step instructions for setting up the Wiimote and Raspberry Pi on his website, including a link to the GitHub repository with the complete project code. The gist of the build process is as follows:

    Raspberry Pi-powered Nintendo Wiimote digital art

    After installing the cwiid library on the Raspberry Pi, David connected the Pi to the Wiimote via Bluetooth. And after some digging into the onboard libraries of the remote itself, he was able to access the infrared technology that lets the remote talk to the Sensor Bar.

    Raspberry Pi-powered Nintendo Wiimote digital art

    The 3D-printed holder with which David augmented the Wiimote lets the user hold the remote upright like a spray can, while the integrated mirror reflects the IR rays so the Sensor Bar can detect them.

    Raspberry Pi-powered Nintendo Wiimote digital art

    The Sensor Bar perceives the movement of the Wiimote, and this data is used to turn the user’s physical actions into works of art on screen. Neat!
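    Turning those IR readings into on-screen strokes boils down to a coordinate mapping. A simplified version might look like the sketch below: the Wiimote’s IR camera reports positions on a 1024×768 grid, and the X-axis flip is an assumption about this particular mirror setup, not something taken from David’s repo.

    ```python
    def ir_to_screen(ir_x, ir_y, screen_w, screen_h):
        """Map a Wiimote IR camera point to screen coordinates.

        The IR camera reports positions in a 1024x768 grid. The X axis is
        mirrored here because the camera looks back at the sensor bar
        rather than at the screen (an assumption for this setup).
        """
        sx = (1023 - ir_x) / 1023 * (screen_w - 1)
        sy = ir_y / 767 * (screen_h - 1)
        return round(sx), round(sy)
    ```

    Feeding successive mapped points into a line-drawing routine is then all it takes to leave a “spray” trail on the display.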

    If you’ve used the Nintendo Wiimote for your Raspberry Pi projects, let us know. And, speaking of the Wii, has anyone hacked their Balance Board with a Pi?

    On a completely unrelated note…

    How cool is this?!

    Website: LINK

  • Creating virtual art with MasterpieceVR

    Creating virtual art with MasterpieceVR

    Reading Time: 8 minutes

    VR is revolutionizing the 3D production process, and MasterpieceVR is on the forefront. We talked to the company’s Marketing Officer, Brendan de Montigny, about their most recent updates and how MasterpieceVR can help both traditional and 3D artists to thrive.

    MasterpieceVR is available now on Viveport.

    [youtube https://www.youtube.com/watch?v=ZKieI3TnVHY]

    Hello! Tell us a bit about MasterpieceVR (the company) and MasterpieceVR (the app).

    Brendan de Montigny: MasterpieceVR is a technology company that is developing the most intuitive and powerful social 3D content creation platform using virtual reality. MasterpieceVR is also a cross-platform and comprehensive 3D sculpting and painting tool that is an extension of traditional workflows and opens up new ways for rapid ideation, creation and collaboration in virtual space. Creative professionals and teams can quickly learn to create high-quality 3D content and collaborate with others from all over the world.

    Who do you feel MasterpieceVR is aimed at? Amateurs, professionals, both?

    Both! One of the most interesting aspects of MasterpieceVR is how intuitive it is to use. It is excellent for creating collaborative workflow solutions that help professionals. It’s also a perfect opportunity for amateurs to experience VR artwork creation.

    For professional level users – where do you feel MasterpieceVR fits in the usual workflow, and what (if anything) does it replace or augment?

    MasterpieceVR is designed primarily for 2D and 3D concept artists and animators, but is so intuitive that it’s useful for all sorts of applications. You can create fully finalized characters, objects, and asset visuals in very short order. It cuts out those early hours of project ideation. No longer do you need to use a 2D screen in a 3D program. In addition, you can simply import any MasterpieceVR file into existing programs like Marmoset and Unity, giving professionals a flexible workflow that enhances their existing toolkits.

    MasterpieceVR

    What features do you feel are unique to MasterpieceVR?

    We are fully collaborative. You can work with a team on one piece in real time. We just launched MIXER and REMIX Updates that push the boundaries of VR Creation. They include:

    Rasterization: You can convert any imported 3D model into an editable format, with the ability to access libraries of 3D models on Sketchfab and Google Poly.

    Clay Oven: You can now select and convert any part of your MasterpieceVR creation into mesh objects that can be layered and built into complex characters and scenes in seconds. Quicker, scalable, flexible.

    View Mode: Make your model polished with shadows, ambient occlusion, bloom, and 3-point lighting.

    File Browser and Project Saving Functionality: The sleek new file browser makes MasterpieceVR file management super easy.

    New Controller Models: We have created a new model design for in-game controllers, with icons and responsive animations.

    [youtube https://www.youtube.com/watch?v=IefJOQxigGg]

    What have you seen MasterpieceVR used for since launch? Has anything surprised you?

    One of the most interesting ways MasterpieceVR has been embraced is by educators and 2D animators. The fact that it is collaborative really helps instructors quickly show how to create in 3D. MasterpieceVR lends itself to 2D animators who want to create engaging elements quickly, and VR creation is proving to be exactly that: the fastest way to create results. The most surprising example of this adoption is Denmark-based 2D illustrator Martin Nebelong. His work in MasterpieceVR is outstanding (see it here on ArtStation or watch the video embedded below)!

    [youtube https://www.youtube.com/watch?v=sq3K4zLU428]

    One theme that seems to be present in MasterpieceVR is extending and honoring traditional artistic workflows.

    No matter what artistic industry you are part of, there is always an aspect where artists can feel intimidated. As artists, I think it is important to think in terms of investment of time and energy. Technology in 3D content creation has been moving fast, and learning a new program is an investment of time. By honoring traditional artistic workflows, we acknowledge that you’ve spent a long time building a creative practice and toolkit. What we do is look at those tools and improve them: make them faster or more adaptable for a wide variety of projects.

    MasterpieceVR

    You’ve said previously “MasterpieceVR values the interplay between traditional and new ways of creating.” Where do you think this is obvious in MasterpieceVR?

    Kitbashing perfectly explains this idea of using traditional approaches with new methods. Kitbashing in MasterpieceVR allows you to take pre-existing objects as stamps that have imbued meanings in traditional art making and rework them to fit your creative perspective. We are about to launch a contest that will really answer this question… so, stay tuned!

    [youtube https://www.youtube.com/watch?v=FUZqZZL2cgI]

    Explain ‘kitbashing’ for those who don’t know the term.

    The history of modelling has roots in a pastime that is at the heart of 3D creation and virtual reality. The technique was popularized in the 1960s, when artists used model airplanes, trains, automobiles, and other vehicles to create VFX for films like Stanley Kubrick’s 2001: A Space Odyssey, George Lucas’ Star Wars, and Ridley Scott’s Blade Runner. This was an early form of kitbashing, and we have taken it into VR. Artists can now ‘kitbash’ using the accessible archives of Google Poly and Sketchfab.

    What are ‘Stamps’? How are they used in MasterpieceVR?

    Stamps are files of pre-made 3D objects that you can bring into MasterpieceVR and manipulate to create new art. Using standardized 3D file types, you can build complex finished pieces in little time and seamlessly integrate your creations with traditional 3D programs.

    With our latest update, REMIX, creators can now revisit their older work and other artists’ work, and combine them into new pieces quickly using the new rasterization, the Google Poly + Sketchfab integration, and the Clay Oven.

    MasterpieceVR

    In commercial artforms there have been points in time where technology has changed workflows, often quite radically – for example the transition between 2D, hand-drawn animation and 3D animation. Do you think VR is another one of these ‘tipping points’?

    I absolutely think that VR is a tipping point. Personally, I often say that I wish VR had been around when I started art school. I studied traditional painting, drawing, and print instead of animation and gaming because I felt there was something lacking in sitting at a 2D screen. There was a haptic lack in pushing around a mouse all day. With VR, your hands are in the action. Your eyes are moving in a genuine way. The fact that there is now an opportunity to blend industries and pedagogical approaches this way is groundbreaking.

    MasterpieceVR is one of the few VR creation apps to allow collaboration in a virtual space. Was that an objective from the start of development?

    Yes. Our world is increasingly networked. Artists who work as freelancers, or with small to large companies, constantly need to share ideas quickly. One antiquated way of thinking about an artist is that they are alone in a studio. Here they are contemplative. They create, draw, paint, sculpt. With MasterpieceVR’s collaborative approach, artists can now instantly share ideas and receive feedback in an organic way through a VR studio.

    How do you think MasterpieceVR’s collaboration features change traditional workflow? What can you accomplish there that you might not be able to normally?

    The collaboration features allow creative teams to quickly review and iterate on concepts of a large project and to see the concepts in relation to each other. Some upcoming features will allow for even easier sharing of files, annotations and viewing from multiple devices.

    [youtube https://www.youtube.com/watch?v=Zja5yDgJ5kQ]

    With the new Rasterization feature you can convert imported 3D models into an editable format – and you previously added the ability to access libraries of 3D models. How does this work?

    The process of rasterization turns a 3D model into a MasterpieceVR-specific editable format. Once the model is edited, it can be converted back into a standard 3D model format. The artist can also use our powerful selection tools to subdivide the model into multiple mesh parts, which they can use to make stamps for future use, or to structure the model into sections that are easy to iterate on.

    Can you tell us how the ‘Clay Oven’ feature works? Why would someone need to convert part of a MasterpieceVR model into a mesh object – performance, workflow or something else?

    Sculpting in MasterpieceVR is extremely intuitive and fast, and one drawback to creating quickly in this manner is that the models exported to standard 3D formats are very complex meshes that are not optimized for games and animation. The Clay Oven helps increase performance without losing quality.

    Your new View Mode allows users to ‘polish up’ a model with shadows, ambient occlusion, bloom and 3-point lighting. Is this a replacement for the usual process of exporting models, or is there a unique advantage to have this available in VR?

    View Mode allows artists to add realistic lighting to their model before sharing, and by playing around with moveable lights, the artist can get an idea of where to add details that will give the model a little bit extra.

    With your ‘3G’ update you added in a number of tools to allow people to be more precise in their work. Can you explain a bit more about them, and what they add to MasterpieceVR? Does that change your positioning and who could benefit from using MasterpieceVR?

    We have had a lot of interest from architecture schools and industrial designers who are experimenting with these new tools and are excited by the ways these precision tools speed up the concept stage of architectural design. These tools have been really useful for relative sizing, spacing, and for creating unique hard surface shapes that are the foundations of buildings and vehicles, something which had not been easy or possible in VR before.

    The snapping grid in conjunction with stamps has been really powerful for VR artists, allowing them to quickly design complex hard surface models. You can see this in this video.

    [youtube https://www.youtube.com/watch?v=ZKieI3TnVHY]

    Finally, with the recent updates it seems like MasterpieceVR has a pretty wide-ranging set of features – but is there anything else you want to add? Anything you haven’t accomplished yet?

    We have some features coming in the near future that will expand the natural workflow of 3D artists so stay tuned!

    There are lots of areas to improve on and we are listening to our community very carefully to ensure we give them the right set of features that will remove creative boundaries and let them take their art to a new level.

    For someone who might be new to MasterpieceVR and/or 3D work, where would you suggest they start in trying to learn?

    We suggest they dive right in and start creating immediately. The program is so intuitive that they will be making art in no time, and they will find that they improve every day as they discover how powerful the tools are and how fun it is to create in VR.

    We will be releasing a set of basic and advanced tutorials soon that will help artists discover some new and advanced workflows with our set of features.

    Where can people see some great examples of MasterpieceVR in action, and what it’s capable of?

    A great place to see what is being made in MasterpieceVR is our VR Creatives community on Facebook – and of course we have some great videos on our YouTube channel.

    Thank you for talking with us, Brendan!


    MasterpieceVR is available now on Viveport.

    Website: LINK

  • Salvaged Arduino powers animated House Party

    Salvaged Arduino powers animated House Party

    Reading Time: 2 minutes

    Salvaged Arduino powers animated House Party

    Arduino TeamOctober 25th, 2018

    What can you do with items that are destined for the dump? As seen here, if you’re Neil Mendoza, you transform old furniture, TVs, computers, art, and even an Arduino Zero that somehow ended up in the trash into a musical installation.

    His resulting “House Party” features decorations and control components that, according to the project’s write-up, are entirely salvaged. A MIDI interface, software written in openFrameworks, and a JSON file are used to coordinate sound and movements, which include spinning picture frames and flowers, tapping shoes, and a television that loops through a rather dreary weather report snippet. 

    House Party is a musical installation that explores prized possessions in their native habitat. All the materials used to create this artwork, from the furniture to the computers, were scavenged from the discarded trash. The music is a mix of mechanical and synthesized sounds. The piece was created while an artist in residence at Recology SF.

    The actuators in the installation are controlled by an Arduino Zero (also found in the trash) and each screen is connected to a computer running custom software written in openFrameworks (OF). Composition was done in Logic where a MIDI environment was set up to send MIDI data to the Arduino and an OF control program. The control program then sent the data to the other computers over ethernet as OSC. For the final installation, the control program read the data from a JSON file, triggered the screens and Arduino and played the synthesized parts of the music.
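    For a taste of what the OSC leg of that pipeline involves, here is a minimal sketch of encoding an OSC message in pure Python. The address used is made up for illustration, and a real project would more likely lean on a library such as python-osc, but the binary layout below follows the OSC spec: null-terminated strings padded to 4-byte boundaries, a type-tag string, then big-endian arguments.

    ```python
    import struct

    def osc_pad(data):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return data + b"\x00" * (4 - len(data) % 4)

    def osc_message(address, *args):
        """Encode an OSC message carrying int32 arguments."""
        msg = osc_pad(address.encode())
        msg += osc_pad(("," + "i" * len(args)).encode())
        for value in args:
            msg += struct.pack(">i", value)  # int32, big-endian
        return msg

    # e.g. tell a screen computer to trigger clip 3 (hypothetical address)
    packet = osc_message("/screen/play", 3)
    ```

    Packets like this would then be sent over UDP to each screen computer on the local network.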

    Be sure to see all the zany action in the video below!

    [embedded content]

    Website: LINK

  • Pay for art with your mugshot

    Pay for art with your mugshot

    Reading Time: 2 minutes

    Pay for art with your mugshot

    Arduino TeamSeptember 17th, 2018

    As reported here, digital artist Matthias Dörfelt has created an art vending machine in an attempt to increase awareness around blockchain possibilities, as well as how we handle our personal information.

    Face Trade, now on display at Art Center Nabi in Seoul, takes the form of a large, vaguely face-shaped box. When it detects a human in front of it, the installation invites the participant to swap his or her face for art, confirmed using a large yellow button that connects to the system’s computer via an Arduino.

    Once confirmed, Face Trade snaps the person’s picture and uploads it to a blockchain in exchange for a computer-generated facial image. The resulting art’s conflicted expression is meant to signify the good and bad possibilities that can come of using this technology. For their trouble, participants also get a receipt showing their captured headshot, which now appears along with each transaction on itradedmyface.com.

    Face Trade consists of a camera flash, webcam, receipt printer, inkjet printer, computer, speakers, LCD screen, button and an Arduino (to control the button, LCD screen and camera flash).

    The main application that ties everything together is written in Python. It uses OpenCV to do basic face tracking and take the images. All the Ethereum-related things were done using web3.py, the official Python version of web3, to interact with the Ethereum blockchain. The receipt printer, inkjet, and Arduino are controlled via Python, too. The process consists of taking a picture, uploading it to the blockchain, passing the resulting transaction hash to the face drawing generator, which uses it to seed the random numbers (so that each face drawing is uniquely tied to the transaction it belongs to), printing the resulting drawing, and finally printing the receipt.
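    The hash-seeding trick is easy to reproduce: derive a deterministic random generator from the transaction hash, and the same transaction always yields the same drawing. Here is a sketch of that idea; the function name and the choice of SHA-256 are our own, not taken from Dörfelt’s code.

    ```python
    import hashlib
    import random

    def rng_from_tx(tx_hash):
        """Derive a deterministic random generator from a transaction hash,
        so each face drawing is uniquely tied to its transaction."""
        digest = hashlib.sha256(tx_hash.encode()).digest()
        return random.Random(int.from_bytes(digest, "big"))

    # The same hash always reproduces the same sequence of drawing choices
    a = rng_from_tx("0xabc123").random()
    b = rng_from_tx("0xabc123").random()
    ```

    Every random choice the face generator makes (line weights, feature positions, and so on) would be drawn from this generator, making each portrait reproducible from its transaction alone.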

    [embedded content]

    Website: LINK

  • Transform A Wooden Pallet Into 5 Stenciled Signs Perfect For Fall

    Transform A Wooden Pallet Into 5 Stenciled Signs Perfect For Fall

    Reading Time: < 1 minute

    [unable to retrieve full-text content]

    Website: LINK

  • 9 Simple And Delicious Potato Recipes That Your Friends Will Love [video]

    9 Simple And Delicious Potato Recipes That Your Friends Will Love [video]

    Reading Time: < 1 minute

    [unable to retrieve full-text content]

    Website: LINK

  • I Drew 30 Different Dogs In A 30-Day Challenge

    I Drew 30 Different Dogs In A 30-Day Challenge

    Reading Time: 3 minutes

    Hello, I’m Sofia Härö, a Finnish artist. I’ve always had a love for dogs as well as art. When I decided to combine the two, the result was the #30canines art challenge.

    Drawing 30 dogs in 30 days was a joy and a challenge. All of the drawings are done by hand, in ink and markers.

    American Staffordshire Terrier Mix

    The Great Dane

    Black Lab pup

    Shepherd

    Lapponian Herder

    American Staffordshire Terrier

    Kleinspitz

    Whippet

    Swedish Vallhund

    Malinois pupper

    Norwegian Elk Hound

    Dachshund

    Kleinspitz

    French Bulldog

    Malinois

    The Boxer

    The old Golden Retriever

    Dachshund

    Pit bull

    The tiniest Dachshund.

    The American Staffordshire Terrier

    Website: LINK

  • Take Your Ikea Coffee Table From Bland To Grand With An Inlay Stencil Kit

    Take Your Ikea Coffee Table From Bland To Grand With An Inlay Stencil Kit

    Reading Time: < 1 minute

    [unable to retrieve full-text content]

    Website: LINK