Schlagwort: augmented reality

  • How makers can use AR and VR

    How makers can use AR and VR

    Reading Time: 4 minutes

    Augmented reality (AR) and virtual reality (VR) are both currently experiencing a meteoric rise in popularity, with the combined market expected to reach $77 billion by 2025, from just $15.3 billion in 2020.

    For makers, AR and VR represent exciting opportunities to build new types of projects, tapping into entirely new possibilities and learning skills that will only become more valuable as time goes on. 

    We’ll explore the significance of AR and VR for makers and look at some of the ways in which makers can integrate these technologies into their projects, rounding off with some real-world examples. 

    AR and VR — what’s the difference?

    AR and VR are similar technologies, but they’re crucially different. Let’s take a quick look at what sets them apart.

    • Augmented reality involves overlaying digital elements onto the physical world, allowing us to observe and even interact with these virtual objects in the context of our actual environments.
    • Virtual reality is much more immersive — typically you put on a headset and enter a completely virtual world, totally different from your actual physical environment.

    How can makers use AR and VR in their projects?

    Let’s take a look at some of the specific ways makers can leverage AR and VR to improve their projects, along with some examples from Arduino users.

    Gaming and fun

    AR and VR are both making a massive impact in the world of gaming, allowing for far more immersive, novel, and fun experiences. This represents a great opportunity for makers to play around with an entirely new trend, playing a small role in shaping this next chapter of video gaming.

    Probably the best example of this is Pokémon GO, where players track down Pokémon in real-world locations. But this is just the beginning. Ryan Chan decided to design a way to bring AR to Minecraft, the best-selling video game of all time.

    Thanks to Chan’s work, Minecraft players can now control their in-game movements via their real-life actions. For example, taking physical steps forward will translate into in-game movement. Ryan’s project uses an Arduino MKR Zero board, an MPU-6050 IMU (inertial measurement unit), and two force-sensitive resistors.

    It’s an awesome approach to bringing a fresh set of features to an already established and popular game, and could mark a new generation of smart individual gamers making adjustments to their favorite games.

    Training, safety, and education

    Developing new skills is essential if you want to keep making progress as a maker, but it can be tricky. After all, making is a highly technical and complex activity with no real rules.

    The good news is that AR and VR can be massively helpful here. AR can help make learning more interactive, intuitive, and visual by overlaying instructions and visual augmentations onto real-world objects. VR, meanwhile, can help by constructing immersive virtual environments where makers can practice technical tasks in a risk-free setting.

    Let’s check out an example. Kids typically don’t take fire drills too seriously, which means they miss out on important information. This is where AR can come in. A team of engineers at Sejong University created an augmented reality fire drill system, based on video games, to make fire safety training more realistic and effective.

    By combining virtual reality, AR, and the real world, you can conduct fire drills that simulate smoke-filled rooms and other realistic elements, mimicking the actual experience of a fire much more than standard drills.

    On top of that, the team also made a fire extinguisher that works with the VR system but also looks and feels like the real thing. It connects to an Arduino UNO WiFi Rev2 and can give users the realistic sensation of operating a real extinguisher to put out flames.

    Data visualization and analytics

    It’s important for makers to be able to gain and analyze data related to their projects. This might be a central part of the project’s function — like with a wearable health monitor or a thermostat — or it may just be a way to learn more about your creation to make improvements.

    AR and VR can massively improve your ability to interact with and understand data. By representing data in an entirely new, much more immersive, and more visual way, these technologies can allow you to spot new insights, make connections, and learn more about your projects.

    Mars Kapadia chose to build his own set of smart glasses for a school science fair, using a transparent OLED display paired with Retro Watch software running on an Android phone and powered by an Arduino Nano Every and an HC-05 Bluetooth® module.

    Mars’ glasses also come with darkened lenses to keep the glare of the sun at bay when outdoors; these can be lifted up in darker environments.

    Get started today

    With Arduino, you can start bringing AR and VR into your own projects, expanding your horizons and opening up fascinating new possibilities to use this tech as it continues to grow.

    In our Project Hub, you can browse other people’s projects according to category, including AR and VR, and share your own work, too. 

    The post How makers can use AR and VR appeared first on Arduino Blog.

    Website: LINK

  • Augmented reality fire drills make training more effective

    Augmented reality fire drills make training more effective

    Reading Time: 2 minutes

    While we adults don’t experience them often, school kids practice fire drills on a regular basis. Those drills are important for safety, but kids don’t take them seriously. At most, they see the drills as a way to get a break from their lessons for a short time. But what if they could actually see the flames? Developed by a team of Sejong University engineers, this augmented reality fire drill system takes cues from video games to provide more effective training.

    This mixed reality system, which combines virtual reality and augmented reality elements, makes fire drill training more interactive. Instead of just evacuating a building by following a predefined route, participants perform basic firefighting tasks and experience smoke-filled rooms. Using a familiar video game-esque medium, it gives kids a more realistic and believable idea of what an emergency might look like. It is equally useful for adults, because it challenges them to take action.

    That action comes primarily in the form of virtual fires, which participants must douse using fire extinguishers. The mixed reality visuals are straightforward, as the technology is now mainstream. The VIVE VR system can, for example, recognize objects like tables and overlay flame effects. But the fire extinguisher stands out. Instead of a standard VR controller, this system uses a custom interface that looks and feels like a real fire extinguisher.

    That extinguisher has a VIVE PRO tracker, which lets the system monitor its position. The nozzle has an MPU-9265 gyroscope and the handle has a momentary switch. Both of those connect to an Arduino UNO WiFi Rev2 board, which feeds the sensor data to the augmented reality system. With this hardware, participants can manipulate the virtual fire extinguisher just like a real one. The system knows when users activate the fire extinguisher and the direction in which they’re pointing the nozzle, so it can determine if they’re dousing the virtual fires.
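    The post doesn’t include the team’s code, but the dousing check it describes can be sketched abstractly: the system counts a virtual fire as being doused when the handle switch is pressed and the fire lies within a spray cone in front of the tracked nozzle. Everything below, including the function name, range, and cone-angle values, is an illustrative assumption rather than the researchers’ implementation.

    ```python
    # Illustrative sketch (assumed logic, not the Sejong team's code): a fire
    # is doused when the trigger is held and the fire sits inside a spray cone
    # extending from the tracked nozzle.
    import math

    def is_dousing(nozzle_pos, nozzle_dir, fire_pos, trigger_pressed,
                   max_range=3.0, cone_half_angle_deg=15.0):
        if not trigger_pressed:
            return False
        # Vector from the nozzle to the virtual fire
        to_fire = [f - n for f, n in zip(fire_pos, nozzle_pos)]
        dist = math.sqrt(sum(c * c for c in to_fire))
        if dist == 0 or dist > max_range:
            return False
        # Angle between the nozzle direction and the direction to the fire
        norm_dir = math.sqrt(sum(c * c for c in nozzle_dir))
        cos_angle = sum(a * b for a, b in zip(nozzle_dir, to_fire)) / (norm_dir * dist)
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))) <= cone_half_angle_deg

    # Pointing straight at a fire 2 m ahead, trigger held:
    print(is_dousing((0, 0, 0), (0, 0, 1), (0, 0, 2), True))   # -> True
    # Same pose, trigger released:
    print(is_dousing((0, 0, 0), (0, 0, 1), (0, 0, 2), False))  # -> False
    ```

    In the real system the nozzle pose would come from the VIVE tracker and gyroscope data streamed over the UNO WiFi Rev2, rather than being passed in as tuples.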

    More details on the project can be found in the team’s paper here.

    Image credit: Kang et al.

    The post Augmented reality fire drills make training more effective appeared first on Arduino Blog.

    Website: LINK

  • This project facilitates augmented reality Minecraft gaming

    This project facilitates augmented reality Minecraft gaming

    Reading Time: 2 minutes

    Augmented reality (AR) is distinct from virtual reality (VR) in that it brings the real world into virtual gameplay. The most famous example of AR is Pokémon Go, which lets players find the pocket monsters throughout their own physical region. Minecraft is the best-selling video game of all time, but lacks any official AR gameplay. So Ryan Chan tackled the problem himself and built a system that translates real world movement into control of a player’s Minecraft avatar.

    We’ll just assume that you know how Minecraft works, because you have probably played it yourself. Chan’s project works with the standard game and doesn’t require any special mods — Chan could even use this to play on others’ Minecraft servers if he chose. The system counts footsteps and converts them into forward movement in-game. It also detects real life rotational movement and replicates that movement in the game. But other actions, like attacking or swapping items, require conventional button presses.

    The key components of this project are an Arduino MKR Zero board, an MPU-6050 IMU (inertial measurement unit), and two force-sensitive resistors. The IMU detects rotational movement, while the force-sensitive resistors detect footsteps when worn on the player’s shoes. Four mechanical key switches trigger the other actions. Chan configured the Arduino to appear as a standard USB HID keyboard and mouse when plugged into a computer, so Minecraft accepts the control commands without issue. To tidy everything up, Chan designed a custom PCB that hosts the aforementioned components.
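    Chan’s firmware isn’t reproduced in the post, but the footstep half of the system boils down to edge detection with hysteresis on the force-sensitive resistor readings: a step registers when the reading rises past a press threshold, and the sensor must drop back below a release threshold before the next step can count. The sketch below is a minimal illustration of that idea; the function name and threshold values are assumptions, not Chan’s actual code.

    ```python
    # Illustrative sketch: counting footsteps from force-sensitive resistor
    # (FSR) ADC readings with hysteresis, standing in for the MKR Zero
    # firmware logic. Threshold values are assumptions.

    PRESS_THRESHOLD = 600    # reading that counts as "foot down"
    RELEASE_THRESHOLD = 300  # must drop below this before the next step

    def count_steps(readings, press=PRESS_THRESHOLD, release=RELEASE_THRESHOLD):
        """Count discrete footsteps in a stream of FSR ADC readings."""
        steps = 0
        foot_down = False
        for r in readings:
            if not foot_down and r >= press:
                foot_down = True   # rising edge: foot pressed down
                steps += 1
            elif foot_down and r <= release:
                foot_down = False  # falling edge: foot lifted, ready again
        return steps

    # One clean step, then noise that never crosses the press threshold:
    readings = [100, 650, 700, 250, 100, 400, 500, 450]
    print(count_steps(readings))  # -> 1
    ```

    On the actual board, each counted step would be translated into a brief HID key press (e.g. holding “W”) so Minecraft sees ordinary keyboard input.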

    Using this system for Minecraft gaming is tricky, as it requires plenty of real world open space to navigate the virtual world. But with access to a large park, it lets the player enjoy an AR Minecraft experience.

    The post This project facilitates augmented reality Minecraft gaming appeared first on Arduino Blog.

    Website: LINK

  • Computer vision and projection mapping enable AR PCB debugging bliss

    Computer vision and projection mapping enable AR PCB debugging bliss

    Reading Time: 2 minutes

    Imagine if you could identify a component and its schematic label by simply touching that component on your PCB. Imagine if you selected a pin in KiCAD and it started glowing on your real, physical PCB so you can find it easily. Imagine if you could see through your PCB’s solder mask to view the traces underneath. All of those things — and much more — are possible with this Augmented Reality Debugging Workbench (ARDW) system.

    ARDW pairs tracking camera computer vision with projection mapping for fantastic augmented reality examination of PCBs. Touch a component with the special probes and ARDW will project the component’s name and label onto the table next to your board. Select a component or a component’s pin in KiCAD and ARDW will project a highlighted overlay on the physical board showing you where it is. ARDW can even guide you through automated debugging by highlighting probe points and checking your measurements as you take them.

    The team that developed ARDW demonstrated the system using Arduino Uno and Arduino Due boards, which were ideal choices because they’re open source and schematics are readily available. But ARDW can work with any PCB for which the user possesses design files.

    It works with a plugin for KiCAD, which is open source PCB design software popular in the maker community and industry. Through KiCAD, ARDW gains access to the PCB layout and the schematics. It matches those up with the physical board sitting on the workbench and then projects graphics according to the selection and the board’s location. ARDW is extremely useful for all kinds of development, debugging, and quality control tasks.
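    At its core, projecting a highlight means mapping a component’s position in the PCB layout, via the board’s tracked pose on the bench, into projector coordinates. A 2D affine transform captures the idea; the sketch below is a simplified assumption about how that mapping could work, not ARDW’s actual pipeline.

    ```python
    # Illustrative sketch (assumption, not ARDW's code): once the camera has
    # located the physical board, highlighting a component reduces to mapping
    # its PCB-layout coordinates through the board's pose into projector
    # coordinates, here modeled as a 2D affine transform.

    def project_point(point, pose):
        """Map a PCB-layout point (mm) to projector pixels via an affine pose."""
        (a, b, tx), (c, d, ty) = pose  # two rows of the affine matrix
        x, y = point
        return (a * x + b * y + tx, c * x + d * y + ty)

    # Board rotated 90 degrees and offset on the bench:
    pose = ((0, -1, 400), (1, 0, 100))
    print(project_point((10, 20), pose))  # -> (380, 110)
    ```

    The real system would fit this pose continuously from the tracking camera, so the overlay follows the board if it is nudged.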

    [youtube https://www.youtube.com/watch?v=RbENbf5WIfc?feature=oembed&w=500&h=281]

    The post Computer vision and projection mapping enable AR PCB debugging bliss appeared first on Arduino Blog.

    Website: LINK

  • Embodied Axes is an Arduino-powered controller for 3D imagery and data visualizations in AR

    Embodied Axes is an Arduino-powered controller for 3D imagery and data visualizations in AR

    Reading Time: 2 minutes

    Arduino Team, May 25th, 2020

    Researchers across several universities have developed a controller that provides tangible interaction for 3D augmented reality data spaces.

    The device consists of three orthogonal arms, embodying the X, Y, and Z axes, which extend from a central point. These form an interactive space for 3D objects, with linear potentiometers and a rotary button on each axis as a user interface.

    At the heart of it all is an Arduino Mega, which takes in data from the sliders to section a model. This enables users to peer inside a representation with an AR headset, “slicing off” anything that gets in the way by defining a maximum and minimum view plane. The sliders are each motorized to allow them to move together and to provide force feedback.
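    The slicing step is easy to picture in code: each slider pair defines a minimum and maximum view plane on one axis, and only the parts of the model inside all three ranges remain visible. The following is a minimal sketch of that idea under assumed names and point-cloud data; it is not the researchers’ implementation.

    ```python
    # Illustrative sketch (assumed logic, not the Embodied Axes code): the
    # slider positions on each arm define a min/max range per axis, and only
    # model points inside the resulting box stay visible in AR.

    def visible_points(points, x_range, y_range, z_range):
        """Keep only points inside the box defined by the three slider ranges."""
        ranges = (x_range, y_range, z_range)
        return [p for p in points
                if all(lo <= c <= hi for c, (lo, hi) in zip(p, ranges))]

    model = [(0.2, 0.5, 0.5), (0.8, 0.5, 0.5), (0.5, 0.9, 0.5)]
    # Slice away everything with x > 0.6 by lowering the X axis's max slider:
    print(visible_points(model, (0.0, 0.6), (0.0, 1.0), (0.0, 1.0)))
    # -> [(0.2, 0.5, 0.5), (0.5, 0.9, 0.5)]
    ```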

    Possible applications include medical imaging and CAD modeling, among many others. More details on the Embodied Axes project can be found in the researchers’ paper here.

    [youtube https://www.youtube.com/watch?v=p19ub_pGN5U?feature=oembed&w=500&h=281]

    Website: LINK

  • Improve human-robot collaboration with GhostAR

    Improve human-robot collaboration with GhostAR

    Reading Time: 2 minutes

    Arduino Team, November 26th, 2019

    As robotics advances, the future could well involve humans and automated systems working together as a team. The question then becomes: how do you design such an interaction? A team of researchers from Purdue University attempts to provide a solution with their GhostAR system.

    The setup records human movements for playback later in augmented reality, while a robotic partner is programmed to work around a “ghost” avatar. This enables a user to plan out how to collaborate with the robot and work out kinks before actually performing a task.

    GhostAR’s hardware includes an Oculus Rift headset and IR LED tracking, along with actual robots used in development. Simulation hardware consists of a six-axis Tinkerkit Braccio robot, as well as an Arduino-controlled omni-wheel base that can mount either a robot arm or a camera as needed.

    More information on the project can be found in the team’s research paper.

    [youtube https://www.youtube.com/watch?v=YQMQQe4y7qE?feature=oembed&w=500&h=281]

    With GhostAR, whatever plan a user makes with the ghost form of the robot while wearing an augmented reality head mount is communicated to the real robot through a cloud connection, allowing both the user and robot to know what the other is doing as they perform a task. The system also allows the user to plan a task directly in time and space, without any programming knowledge.

    First, the user acts out the human part of the task to be completed with a robot. The system then captures the human’s behavior and displays it to the user as an avatar ghost, representing the user’s presence in time and space.

    Using the human ghost as a time-space reference, the user programs the robot via its own ghost to match up with the human’s role. The user and robot then perform the task as their ghosts did.

    Website: LINK

  • Ghost hunting in schools with Raspberry Pi | Hello World #9

    Ghost hunting in schools with Raspberry Pi | Hello World #9

    Reading Time: 5 minutes

    In Hello World issue 9, out today, Elliott Hall and Tom Bowtell discuss The Digital Ghost Hunt: an immersive theatre and augmented reality experience that takes a narrative-driven approach in order to make digital education accessible.

    The Digital Ghost Hunt combines coding education, augmented reality, and live performance to create an immersive storytelling experience. It begins when a normal school assembly is disrupted by the unscheduled arrival of Deputy Undersecretary Quill of the Ministry of Real Paranormal Hygiene, there to recruit students into the Department’s Ghost Removal Section. She explains that the Ministry needs the students’ help because children have the unique ability to see and interact with ghostly spirits.

    Under the tutelage of Deputy Undersecretary Quill and Professor Bray (the Ministry’s chief scientist), the young ghost-hunters learn how to program and use their own paranormal detectors. These allow students to discover ghostly traces, translate Morse code using flickering lights, and find messages left in ultraviolet ectoplasm. Meanwhile, the ghost communicates through a mixture of traditional theatrical effects and the poltergeist potential of smart home technology. Together, students uncover the ghost’s identity, discover her reason for haunting the building, unmask a dastardly villain, find a stolen necklace, clear the ghost’s name, right an old wrong, and finally set the ghost free.

    The project conducted two successful test performances at the Battersea Arts Centre in South London in November 2018, funded by a grant from AHRC’s New Immersive Experiences Programme, led by Mary Krell of Sussex University. Its next outing will be at York Theatre Royal in August.

    Adventures in learning

    The Digital Ghost Hunt arose out of a shared interest in putting experimentation and play at the centre for learners. We felt that the creative, tinkering spirit of earlier computing — learning how to program BASIC on an Atari 800XL to create a game, for example — was being supplanted by a didactic and prescriptive approach to digital learning. KIT Theatre’s practice — creating classroom adventures that cast pupils as heroes in missions — is also driven by a less trammelled, more experiment-led approach to learning.

    We believe that the current Computer Science curriculum isn’t engaging enough for students. We wanted to shift the context of how computer science is perceived, from ‘something techy and boyish’ back to the tool of the imagination that it should be. We did this by de-emphasising the technology itself and, instead, placing it in the larger context of a ghost story. The technology becomes a tool to navigate the narrative world — a means to an end rather than an end in itself. This helps create a more welcoming space for students who are bored or intimidated by the computer lab: a space of performance, experiment, and play.

    Ghosts and machines

    The device we built for the students was the SEEK Ghost Detector, made from a Raspberry Pi and a micro:bit, which Elliott stapled together. The micro:bit was the device’s interface, which students programmed using the block-based language MakeCode. The Raspberry Pi handled the heavier technical requirements of the show, and communicated them to the micro:bit in a form students could use. The detector had no screen, only the micro:bit’s LEDs. This meant that students’ attention was focused on the environment and what the detector could tell them about it, rather than having their attention pulled to a screen to the exclusion of the ‘real’ world around them.

    In addition to the detector, we used a Raspberry Pi to make ordinary smart home technology into our poltergeist. It communicated with the students using effects such as smart bulbs that flashed in Morse code, which the students could then decode on their devices.
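    The Morse-code mechanic is easy to picture in code. Below is a minimal, hypothetical decoder for the kind of message a flashing smart bulb might carry; it is not the show’s actual software, and it assumes the flashes have already been transcribed into dots and dashes, as the students did on their detectors.

    ```python
    # Illustrative sketch (hypothetical, not the Digital Ghost Hunt's code):
    # decoding a Morse message once the bulb's flashes have been written down
    # as dots and dashes. Letters are separated by spaces, words by ' / '.

    MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
             "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
             "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
             ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
             "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
             "--..": "Z"}

    def decode(message):
        """Translate a transcribed Morse message into plain text."""
        return " ".join("".join(MORSE[sym] for sym in word.split())
                        for word in message.split(" / "))

    print(decode("... --- ..."))              # -> SOS
    print(decode(".... . .-.. .--. / -- ."))  # -> HELP ME
    ```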

    To program their detectors, students took part in a series of four lessons at school, focused on thinking like a programmer and the logic of computing. Two of the lessons featured significant time spent programming the micro:bit. The first focused on reading code on paper, and students were asked to look out for any bugs. The second had students thinking about what the detector will do, and acting out the steps together, effectively ‘performing’ the algorithm.

    We based the process on KIT Theatre’s Adventures in Learning model, and its Theory of Change:

    • Disruption: an unexpected event grabs attention, creating a new learning space
    • Mission: a character directly asks pupils for their help in completing a mission
    • Achievement: pupils receive training and are given agency to successfully complete the mission

    The Ghost Hunt

    During these lessons, Deputy Undersecretary Quill kept in touch with the students via email, and the chief scientist sent them instructional videos. Their work culminated in their first official assignment: a ghost haunting the Battersea Arts Centre — a 120-year-old former town hall. After arriving, students were split into four teams, working together. Two teams analysed evidence at headquarters, while the others went out into places in the building where we’d hidden ghostly traces that their detectors would discover. The students pooled their findings to learn the ghost’s story, and then the teams swapped roles. The detectors were therefore only one method of exploring the narrative world. But the fact that they’d learned some of the code gave students a confidence in using the detectors — a sense of ownership. During one performance, one of the students pointed to a detector and said: “I made that.”

    Future of the project

    The project is now adapting the experience into a family show, in partnership with Pilot Theatre, premiering in York in summer 2019. We aim for it to become the core of an ecosystem of lessons, ideas, and activities — to engage audiences in the imaginative possibilities of digital technology.

    You can find out more about the Digital Ghost Hunt on their website, which also includes rather lovely videos that Vimeo won’t let me embed here.

    Hello World issue 9

    The brand-new issue of Hello World is out today, and available right now as a free PDF download from the Hello World website.

    UK-based educators can also sign up here to receive Hello World as a printed magazine for free, direct to their door. And those outside the UK, educator or not, can subscribe to receive new issues of Hello World in their inbox on the day of release.

    Website: LINK

  • Play musical chairs with Marvel’s Avengers

    Play musical chairs with Marvel’s Avengers

    Reading Time: 2 minutes

    You read that title correctly.

    I played musical chairs against the Avengers in AR

    Playing with the Avengers

    Abhishek Singh recently shared his latest Unity creation on Reddit. And when Simon, Righteous Keeper of the Swag at Pi Towers, shared it with us on Slack because it uses a Raspberry Pi, we all went a little doolally.

    As Abhishek explains in the video, the game uses a Raspberry Pi to control sensors and lights, bridging the gap between augmented reality and the physical world.

    “The physical world communicates with the virtual world through these buttons. So, when I sit down on a physical chair, and press down on it, the virtual characters know that this chair is occupied,” he explains, highlighting that the chairs’ sensors are attached to a Raspberry Pi. To save the physical-world player from accidentally sitting on Thanos’s lap, LEDs, also attached to the Pi, turn on when a chair is occupied in the virtual world.
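    The bridging logic the quote describes can be sketched in a few lines: buttons report which chair the human occupies to the game, and the game reports which chairs its virtual characters occupy so the Pi can light those chairs’ LEDs. The function names and data shapes below are illustrative assumptions, not Singh’s actual code.

    ```python
    # Illustrative sketch (assumed logic, not Abhishek Singh's code): the
    # Raspberry Pi bridges the physical and virtual games. Chair buttons
    # report where the human sits; the game reports virtually occupied chairs
    # so their warning LEDs can be lit.

    def occupied_by_human(button_states):
        """Chair IDs whose pressure button is currently pressed."""
        return {i for i, pressed in enumerate(button_states) if pressed}

    def led_states(num_chairs, virtually_occupied):
        """LED on/off per chair: lit when a virtual character sits there."""
        return [i in virtually_occupied for i in range(num_chairs)]

    # Human sits on chair 0; Thanos occupies chair 2, so its LED lights up:
    print(occupied_by_human([True, False, False]))  # -> {0}
    print(led_states(3, {2}))                       # -> [False, False, True]
    ```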

    Turning the losing Avenger to dust? Priceless 👌

    Why do you recognise Abhishek Singh?

    You might be thinking, “Where do I recognise Abhishek Singh from?” I was asking myself this for a solid hour — until I remembered Peeqo, his robot that only communicates through GIF reactions. And Instagif NextStep, his instant camera that prints GIFs!

    First GIFs, and now musical chairs with the Avengers? Abhishek, it’s as if you’ve understood the very soul of the folks who work at Pi Towers, and for that, well…

    Website: LINK

  • Dragon Ball Z head-mounted Scouter computer replica

    Dragon Ball Z head-mounted Scouter computer replica

    Reading Time: 2 minutes

    Arduino Team, October 26th, 2018

    Those familiar with the Dragon Ball Z franchise will recognize the head-mounted Scouter computer often seen adorning character faces. As part of his Goku costume, Marcin Poblocki made an impressive replica, featuring a see-through lens that shows the “strength” of the person he’s looking at, based on a distance measurement taken using a VL53L0X sensor. 

    An Arduino Nano provides processing power for the headset, and light from a small OLED display is reflected on the lens for AR-style viewing.

    It’s not exactly a perfect copy, but it’s actually a working device. Inspired by Google’s virtual glasses, I made a virtual distance sensor.

    I used an Arduino Nano, an OLED screen, and a laser distance sensor. The laser sensor takes readings (not calibrated yet) and displays the number on the OLED screen. A Perspex mirror reflects the image (45 degrees) to the lens (taken from cheap Google Cardboard virtual glasses), and then it’s projected on a clear Perspex screen.

    So you will still see everything, but on the clear Perspex you will also see the distance to the object you’re looking at. On the OLED screen I typed ‘Power’ instead of distance, because that’s what this device is supposed to measure in DBZ. 😀
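    The display logic (read a distance, show it as a “Power” figure) could be sketched like this. The inverse-distance scaling is an invented mapping for illustration only; Marcin’s device shows the raw, uncalibrated reading:

```python
# Toy mapping from a VL53L0X-style distance reading (millimetres) to a
# DBZ-style "Power" figure for the display. The scaling is an assumption
# made for illustration, not the actual sketch's behaviour.

def power_level(distance_mm, max_range_mm=2000):
    """Closer targets read as 'stronger'; invalid or out-of-range readings read as 0."""
    if distance_mm <= 0 or distance_mm > max_range_mm:
        return 0
    return int(9000 * (1 - distance_mm / max_range_mm))
```

    On the Arduino side the same arithmetic would run in the sketch's loop, with the result drawn to the OLED each frame.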

    Print files, as well as the code and circuit diagram needed to hook this head-mounted device up, are available on Thingiverse. For those that don’t have a DBZ costume in their immediate future, the concept could be expanded to a wide variety of other sci-fi and real-world applications.

    Website: LINK

  • World of Tanks AR Spectate: AR tabletop technology unveiled at Gamescom 2018

    World of Tanks AR Spectate: AR tabletop technology unveiled at Gamescom 2018

    Reading Time: 2 minutes

    At Gamescom 2018, developer studio Wargaming is presenting the AR experience World of Tanks AR Spectate, which brings the tank battles of World of Tanks into the real world. For this, the developers rely on a new, experimental AR technology that allows the content of the PC version of World of Tanks to be rendered realistically in real time.

    World of Tanks AR Spectate – an AR tabletop with realistic objects

    The MMO World of Tanks counts more than 120 million players worldwide and generates several million euros in profit for the company each year. With World of Tanks AR Spectate, developer Wargaming is now demonstrating an exciting new AR tabletop experience at Gamescom 2018.

    The AR title is not yet mature, though; for now it is more of a proof of concept. The developers have been working on the project for a few months, building on experience from an earlier MR project that presented a 3D model of a tank from the game at the Tank Museum in Bovington.

    The AR experience is based on ARKit and is already said to deliver a stable 60 FPS. An iPad running ARKit is connected to a PC to exchange the necessary data: while the computer handles the rendering work, the tablet continuously sends its current position to the PC, and an in-game camera captures the WoT round at the same time. Once rendering is complete, the data is transferred back to the iPad in a video format and displayed there.
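    The round trip described here (the tablet streams its pose to the PC, the PC renders the matching view and returns video) can be sketched as plain data flow. The message shapes below are assumptions for illustration, not Wargaming’s actual protocol:

```python
# Minimal sketch of the pose-in, video-out loop: the tablet continuously
# reports its camera pose, the PC renders the matching view and returns
# an encoded frame. Field names are assumed, not taken from the project.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) of the tablet in the room
    rotation: tuple  # orientation quaternion (x, y, z, w)

def render_frame(pose):
    """Stand-in for the PC-side renderer; returns an 'encoded frame' record."""
    return {"view_of": pose, "codec": "h264"}

def streaming_loop(poses):
    """For each pose the tablet reports, produce the frame sent back to it."""
    return [render_frame(p) for p in poses]
```

    The latency problems the developers mention arise exactly in this loop: every pose update must survive a network hop, a render, a video encode, and a second hop before the tablet can display it.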


    Because of the large volume of data, the devs had to contend with several technical difficulties. First they had to get the delay under control to make a realistic real-time simulation possible. After numerous optimisations to the network, the packet handling, and the iPad’s camera, the AR simulation now works impressively well. For the moment the AR technology only works with replays, but using it for live content is entirely possible as development continues.


    Applications in other fields are also conceivable. A new, more realistic spectator mode in games would be worth considering, for example to bring eSports events even closer to fans. The AR approach would also open up new possibilities in film or social interaction. Before that can happen, however, the developers say the 5G infrastructure must first be expanded to lift the current hardware limitations.

    With World of Tanks 1.0 AR, the company already presented its first steps into the AR market at the beginning of the year.

    (Sources: Upload VR | Video: World of Tanks DE YouTube)

    Website: LINK

  • Google Maps to Improve Walking Navigation with AR Fox and Giant Arrows

    Google Maps to Improve Walking Navigation with AR Fox and Giant Arrows

    Reading Time: 3 minutes

    At Google’s I/O Developers conference this week, the company’s Vice President, Aparna Chennapragada, demonstrated how AR could be used to improve navigation when using Google Maps. The result is a cute AR fox and huge arrows to point you in the right direction. 

    How often do you find yourself relying on Google Maps yet still managing to get lost? Maybe the blue dot arrow was pointing in the wrong direction or maybe you need a virtual character to help show you the way.

    This is Google’s latest idea for Maps. They’re improving the walking navigation mode by adding augmented reality (AR): the app uses your smartphone’s camera and overlays arrows on the live view to show you which direction to go.

    “You instantly know where you are… No fussing with the phone. The street names, the directions, right there in front of you,” explained Google Vice President Aparna Chennapragada during Google’s I/O developers conference this week.

    Better yet is the idea of a guide – in Google’s demo, they showed a bouncy fox. However, it’s unclear whether this character will make it to the final update as it is still a work in progress. But, the audience was certainly in awe and Chennapragada’s demo received cheers and claps.

    It’s clear that having a visual smartphone overlay would make navigating a new city a lot easier and perhaps more enjoyable as the character would show you nearby bars and restaurants to visit.

    Chennapragada demonstrates the AR functionality in the video below – the speech starts at 01:25:00.

    Moving from GPS to VPS

    Chennapragada explains that currently, GPS isn’t good enough for the arrows and AR fox to work accurately. So, Google has been working on VPS – a “visual positioning system” – which can precisely estimate your position and orientation.

    On screen, as well as showing the AR fox, the guiding arrows, and the camera display, there will also be a small semi-circle showing just a section of the map, ensuring you have a vague idea of what street you’re on.

    Of course, this technology wouldn’t work so well for driving. But, if you’re someone who regularly uses Google Maps while walking around and don’t mind looking like a tourist taking hundreds of photos, it could work well.

    Unfortunately for those with a poor sense of direction, Google has given no estimate for when this technology will be available. This is likely due to the need to seriously fine-tune VPS so it doesn’t go wrong as often as the GPS blue dot.

    For now, it’s an interesting, real-world use case for AR which doesn’t appear to be just a gimmick.

    Source: Business Insider


    License: The text of “Google Maps to Improve Walking Navigation with AR Fox and Giant Arrows” by All3DP is licensed under a Creative Commons Attribution 4.0 International License.


    Website: LINK

  • Augmented-reality projection lamp with Raspberry Pi and Android Things

    Augmented-reality projection lamp with Raspberry Pi and Android Things

    Reading Time: 3 minutes

    If your day has been a little fraught so far, watch this video. It opens with a tableau of methodically laid-out components and then shows them soldered, screwed, and slotted neatly into place. Everything fits perfectly; nothing needs percussive adjustment. Then it shows us glimpses of an AR future just like the one promised in the less dystopian comics and TV programmes of my 1980s childhood. It is all very soothing, and exactly what I needed.

    Android Things – Lantern

    Transform any surface into mixed-reality using Raspberry Pi, a laser projector, and Android Things. Android Experiments – http://experiments.withgoogle.com/android/lantern Lantern project site – http://nordprojects.co/lantern check below to make your own ↓↓↓ Get the code – https://github.com/nordprojects/lantern Build the lamp – https://www.hackster.io/nord-projects/lantern-9f0c28

    Creating augmented reality with projection

    We’ve seen plenty of Raspberry Pi IoT builds that are smart devices for the home; they add computing power to things like lights, door locks, or toasters to make these objects interact with humans and with their environment in new ways. Nord Projects’ Lantern takes a different approach. In their words, it:

    imagines a future where projections are used to present ambient information, and relevant UI within everyday objects. Point it at a clock to show your appointments, or point it at a speaker to display the currently playing song. Unlike a screen, when Lantern’s projections are no longer needed, they simply fade away.

    Lantern is set up so that you can connect your wireless device to it using Google Nearby. This means there’s no need to create an account before you can dive into augmented reality.


    Your own open-source AR lamp

    Nord Projects collaborated on Lantern with Google’s Android Things team. They’ve made it fully open-source, so you can find the code on GitHub and also download their parts list, which includes a Pi, an IKEA lamp, an accelerometer, and a laser projector. Build instructions are at hackster.io and on GitHub.

    This is a particularly clear tutorial, very well illustrated with photos and GIFs, and once you’ve sourced and 3D-printed all of the components, you shouldn’t need a whole lot of experience to put everything together successfully. Since everything is open-source, though, if you want to adapt it — for example, if you’d like to source a less costly projector than the snazzy one used here — you can do that too.


    The instructions walk you through the mechanical build and the wiring, as well as installing Android Things and Nord Projects’ custom software on the Raspberry Pi. Once you’ve set everything up, an accelerometer connected to the Pi’s GPIO pins lets the lamp know which surface it is pointing at. A companion app on your mobile device lets you choose from the mini apps that work on that surface to select the projection you want.
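    The orientation step described above (the accelerometer telling the lamp which surface it faces) can be sketched as a simple gravity-vector classifier. The axis conventions and threshold here are assumptions for illustration, not Nord Projects’ actual code:

```python
# Sketch of surface detection from an accelerometer: whichever axis is most
# aligned with gravity reveals where the lamp head is pointing. Axis labels
# and the 0.7 g threshold are assumed, not taken from the Lantern source.

def surface_from_accel(x, y, z, threshold=0.7):
    """Classify the pointing direction from gravity components (in g)."""
    if z > threshold:
        return "table"    # lamp head facing down
    if z < -threshold:
        return "ceiling"  # lamp head facing up
    if abs(x) > threshold or abs(y) > threshold:
        return "wall"
    return "unknown"      # mid-swing or noisy reading
```

    The companion app would then offer only the mini apps registered for the detected surface.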

    The designers are making several mini apps available for Lantern, including the charmingly named Space Porthole: this uses Processing and your local longitude and latitude to project onto your ceiling the stars you’d see if you punched a hole through to the sky, if it were night time, and clear weather. Wouldn’t you rather look at that than deal with the ant problem in your kitchen or tackle your GitHub notifications?

    What would you like to project onto your living environment? Let us know in the comments!

    Website: LINK

  • Snapchat Introduces Interactive AR Games Called Snappables

    Snapchat Introduces Interactive AR Games Called Snappables

    Reading Time: 2 minutes

    As well as being able to use Snapchat for turning yourself into a dog and stalking your crush, you can now play their new AR experience, Snappables, with your friends. 

    If you’re sitting across from someone making weirder-than-normal faces at their phone, chances are they’re trying out Snapchat’s latest interactive AR lenses.

    The AR games and experiences are played in the Snapchat camera. Everything from facial expressions to touch and motion will control your character and help you beat your friends.

    Snappables are the company’s first shared AR experience, meaning you can interact with a friend on their phone. Although it’s possible to enjoy some of the games solo, Snappables are designed to be played and sent to friends, challenging them to beat your score or going head-to-head in multiplayer.

    On Wednesday, April 25th, Snapchat launched the lens-based games, along with this video showing just how ridiculous you’ll look to anyone not in the know:

    Reshuffling Snapchat to Make Space for AR

    Don’t worry, everything you love about Snapchat will remain. In fact, the company adds in a blog post: “Snappables live right where Lenses do.” Regular lenses will be to the right of the camera button while Snappables are to the left.

    “Fight aliens, start a rock band, play basketball, and more — together with your friends, no matter where you are,” Snapchat further adds in the post.

    Other games include fighting aliens and blowing kisses, and new games will be released every week to keep you hooked. To start playing with friends, get the app update on both iOS and Android.

    Although the games certainly look fun, could they also be a distraction from the fact that Snapchat may be trialling six-second unskippable ads (only in publisher-made shows) next month?

    Source: Digital Trends


    License: The text of “Snapchat Introduces Interactive AR Games Called Snappables” by All3DP is licensed under a Creative Commons Attribution 4.0 International License.


    Website: LINK

  • Watch the HTC VIVE and VIVEPORT Keynotes from Mobile World Congress 2018

    Watch the HTC VIVE and VIVEPORT Keynotes from Mobile World Congress 2018

    Reading Time: < 1 minute

    Last week at Mobile World Congress 2018, HTC Chairwoman Cher Wang gave a keynote sharing her vision for HTC Vive and the VR industry. Watch the speech below to hear her thoughts on the convergence of major technologies like VR, AR, 5G and AI.

    Also at Mobile World Congress, Rikard Steiber, President of Viveport, gave a keynote on “The Dawn of VIVE REALITY”, which you can find below.

    Stay tuned to our blog and social channels for more.


    Website: LINK

  • RoMA: Robotic Modeling Assistant Could be a Better Prototyping Machine

    RoMA: Robotic Modeling Assistant Could be a Better Prototyping Machine

    Reading Time: 3 minutes

    Cornell and MIT are working on a joint project called the Robotic Modeling Assistant (RoMA), which brings together multiple technologies to create the ultimate prototyping machine. 

    Although 3D printing is certainly improving and streamlining prototyping, researchers from MIT and Cornell want to bring more emerging technologies together to improve such machines.

    The joint project is called the Robotic Modeling Assistant (RoMA). It blends technologies such as augmented reality, 3D printing, and robotics, and takes the form of a robotic arm with a 3D printing pen attached to the end.

    Team leader Huaishu Peng explains on his website that the machine is an interactive fabrication system. It offers a fast, hands-on, precise modeling experience.

    Essentially, users can create a 3D model in situ and get hands-on with their 3D print using the open robotic arm. Peng adds:

    “With RoMA, users can integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artifacts. Users can even directly design on and around an existing object, and extending the artifact by in-situ fabrication.”

    Although this process might appear clunky and awkward, it’s an interesting mixture of emerging technologies. Check out how it works in the video below:

    Positives and Negatives of the Robotic Modeling Assistant

    The augmented reality headset lets the user refine the design as it takes shape: while the designer creates a model in the AR CAD editor, the robotic arm fabricates the object simultaneously.

    The small, basic plastic model created with the attached 3D printing pen can then serve as a tangible reference for the maker.

    Because the robotic arm doesn’t rely on a print bed, it can also print on top of other objects. Currently, the machine is faster than most FDM 3D printing methods, and designers can move the arm more easily.

    “At any time, the designer can touch the handle of the platform and rotate it to bring part of the model forward,” Peng continues.

    “The robotic arm will park away from the user automatically. If the designer steps away from the printing platform, the robotic fabricator can take the full control of the platform and finish the printing job.”

    However, it is more advanced than a 3D printing pen and offers more control. Peng explains that he hopes to see people designing their own everyday objects to suit their needs in the future. Want to find out more? Visit Peng’s website.

    Source: Tech Crunch




    License: The text of “RoMA: Robotic Modeling Assistant Could be a Better Prototyping Machine” by All3DP is licensed under a Creative Commons Attribution 4.0 International License.


    Website: LINK

  • Job posting: Magic Leap wants to move into brick-and-mortar retail

    Job posting: Magic Leap wants to move into brick-and-mortar retail

    Reading Time: 2 minutes

    Online retail is all well and good, but wearables in particular, like the still-young VR headsets and the upcoming AR glasses, need to be experienced first-hand to be judged and understood. In a new job posting, Magic Leap is now looking for a “passionate and fearless” employee to design store concepts and help implement them. According to the start-up, this person will hold a key position for the customer experience in retail.

    Magic Leap is looking for a retail store designer

    The Magic Leap One is due to launch this year and find its way to developers and customers. No exact price is known yet, but the AR glasses, together with a pocket-sized PC, are said to cost no more than a high-end smartphone. If the Magic Leap One lives up to expectations, it could well appeal to a larger audience outside the business sector. The company’s statements so far suggest that this is exactly what Magic Leap has in mind.

    The newly published job posting also points to this strategy. In it, the augmented reality start-up is looking for a designer for retail stores. The requirements include, for example, designing merchandising products, as well as working closely with the sales and marketing teams. The job description reveals that Magic Leap is planning shop-in-shop concepts. The company also requires a willingness to travel, expecting trips to take up as much as 75 percent of the employee’s time each year.

    If you fancy inspecting the other job openings, the impressively long list can be found on the AR start-up’s website.

    (Via Road to VR)

    Website: LINK

  • VR Weekly: Tower Tag in Tokyo and Magic Leap One

    VR Weekly: Tower Tag in Tokyo and Magic Leap One

    Reading Time: 2 minutes

    Two topics dominate our VR Weekly this week: Sega has brought our VR game Tower Tag to the huge Joypolis arcade in Tokyo, and there is news about the Magic Leap One AR glasses, which Chris and Patrick discuss.

    VR Weekly Plus: Tower Tag and Magic Leap in focus

    This week brought several pieces of news about the Magic Leap One AR glasses, due to launch this year as a Creator Edition. Unlike other manufacturers, Magic Leap does not want to sell a developer kit first, but to address end customers right away. Magic Leap sees AR not just as a consumer device but as an entirely new PC platform. That is where the name Creator Edition comes from: the company wants buyers of the AR glasses to become active and create content themselves.

    Entertainment is not neglected either: Magic Leap is cooperating with the NBA, the American professional basketball league, to enrich sporting events with additional information, or even bring a player into your own living room. Axel Springer is also investing in the young company, as the media group continues to push its digital expansion. Almost more important is a side note on the price of the Magic Leap One: the standalone system is said to cost no more than a high-end smartphone. At a price between $800 and $1,000, the AR glasses would be a real bargain – Microsoft’s HoloLens, albeit in its developer version, currently costs more than three times as much.

    For us VR nerds, though, the most important news of the week was the opening of Tower Tag in Japan. None other than Sega brought the arcade experience, developed in chilly northern Hamburg, to the Japanese high-tech stronghold of Tokyo. Since this week, you can play our arcade VR title in the gigantic Joypolis amusement arcade and hide behind real acrylic-glass obelisks to dodge enemy fire. A visit to Joypolis is well worth it: besides Tower Tag, a real roller coaster and a half-pipe await visitors, among other attractions. We show you clips from our game and impressions from Japan in this week’s VR Weekly.

    Website: LINK

  • Magic Leap Finally Reveal their First Augmented Reality Glasses

    Magic Leap Finally Reveal their First Augmented Reality Glasses

    Reading Time: 3 minutes

    Magic Leap’s “One Creator Edition” is finally here. “Magic Leap One”, the first product of the company’s augmented reality technology, will be made available to developers in 2018. No price has been announced yet. 

    After raising more than $1.8 billion across four investment rounds and making several bold announcements about changing the augmented reality game, Magic Leap finally revealed their first product.

    Magic Leap’s first AR headset is called “Magic Leap One”. It consists of a surprisingly small AR headset named “Lightwear”, which looks significantly smaller than competitors’ models. You also get a remote-like controller with a trackpad that offers six degrees of freedom (6DOF) tracking. Last but not least, everything is tethered to a small wearable computing device named “Lightpack”, which offers computing performance the company compares to a regular laptop PC.

    The Lightwear AR glasses are fitted with sensors; eight can be seen on the front. These sensors gather 3D readings of real-world surroundings. With the acquired information, the device uses projections to produce virtual objects. It also measures and emulates lighting conditions, which makes the AR experience much more lifelike. As the device will store the data of the room, the AR can even “interact” with objects in the real world, if programmed to do so. The device also replicates real-world sounds using 3D “soundfield” audio features.

    Inputs include eye-tracking, voice recognition, head position, gesture recognition and movement in general.



    Magic Leap One: Available to Developers in 2018, No Date Set for Consumers

    The company states on its website: “We’re adding another dimension to computing. Where digital respects the physical. And they work together to make life better. Magic Leap One is built for creators who want to change how we experience the world.”

    Magic Leap will ship “Magic Leap One” to developers in 2018. They are also launching a “Creator Portal” at the same time. It will include learning resources, tools, documentation, and support. Find out more on their website.

    The company also adds: “The result is a system that sees what you see, allowing light-field objects to not only exist in the physical world but actually interact with it… Whether it’s virtual displays sitting alongside the computer monitor on your desk or a virtual panda that climbs across your living-room couch, visual perception with machine learning unlocks the power of spatial computing.”

    Source: Upload VR



    Website: LINK

  • The time has come – INTERACTION is live!

    The time has come – INTERACTION is live!

    Reading Time: < 1 minute

    To reach our goal and make INTERACTION a reality, we now need your help. We would of course be thrilled if you supported the project financially, for example with the “Early Bird”, which gets you the complete game for just 29 EUR!


    Click here for the Early Bird: https://bit.ly/LiveAtKS

    INTERACTION | How to play

    Official Source: Rudy Games Press Release

     

  • Video Shows Incredible VIPE Holodeck That Takes VR to the Next Level

    Video Shows Incredible VIPE Holodeck That Takes VR to the Next Level

    Reading Time: < 1 minute

    We have seen the future of gaming, and it may include holodecks. Northrop Grumman’s VIPE (Virtual Immersive Portable Environment) is essentially a portable room that soldiers can use to run any number of simulated missions: diplomatic, combat, and even medical training environments. That’s not all: uniforms and compatible weapons are brought in from outside to make the experience even more realistic.



  • Video Shows Futuristic Halo-Inspired US Army Helmet, Complete With Heads Up Display

    Video Shows Futuristic Halo-Inspired US Army Helmet, Complete With Heads Up Display

    Reading Time: < 1 minute

    More than just a simple helmet, the Halo-inspired “Helmet Electronics and Display System – Upgradeable Protection” is a futuristic modular variation of current head gear that comes complete with face-protective 9mm plating and a heads-up display powered by an Android phone.

     

    Created after a four-year research program at the Natick Soldier Research, Development and Engineering Center, it will provide much better protection and more useful features than the current models.

    And it looks so cool too – although not as awesome and scary as the demon helmet.

    Official Source:  http://www.military.com/video/forces/army/ausa-helmet-of-the-future/2761973934001/