Tag: Google AR and VR

  • “The Mandalorian” in AR? This is the way. (Head of Creative)

    Reading Time: 2 minutes

    In a galaxy far, far away, the Mandalorian and the Child continue their journey, facing enemies and rallying allies in the tumultuous era after the collapse of the Galactic Empire. But you don’t need a tracking fob to explore the world of the hit STAR WARS streaming series. Google and Lucasfilm have teamed up to bring iconic moments from the first season of “The Mandalorian” to life with “The Mandalorian” AR Experience (available on the Play Store for 5G Google Pixels and other select 5G Android phones) as fans follow the show’s second season. (Check your phone to see if it meets app requirements.)

    [Animated GIF: a person’s hand holding a Pixel phone while using the Mandalorian AR app.]

    From dinosaurs to astronauts, Google has been bringing objects and creatures to life with augmented reality. Now, people using compatible Android 5G devices can interact with heroes from the Mandalorian in their own space.

    “The Mandalorian” AR Experience puts you in the shoes of a bounty hunter following the trail of Mando himself, Din Djarin and the Child. Explore the world of “The Mandalorian,” interact with characters in augmented reality and capture your very own scenes to share with friends.

    To create this original experience, Google, Disney and Lucasfilm worked together to imagine a next-generation augmented reality app optimized for 5G devices. Our teams collaborated to build hyper-detailed models and life-like animations—all while packing scenes with fun surprises.

    Using ARCore, Google’s developer platform for building augmented reality experiences, we created scenes that interact with your environment and respond to your surroundings. You can discover and unlock even more effects based on your actions. And thanks to the new ARCore Depth API, we also enabled occlusion, allowing 3D scenes to blend more naturally with our world.
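    For developers curious what enabling that occlusion looks like in practice, here is a minimal Kotlin sketch against the public ARCore API (the app’s own source isn’t public): it opts a session into depth processing when the device supports it.

    ```kotlin
    import com.google.ar.core.Config
    import com.google.ar.core.Session

    // Sketch: turn on the ARCore Depth API so a renderer can hide virtual
    // content behind real-world geometry (occlusion).
    fun enableDepthIfSupported(session: Session) {
        val config = session.config
        // Depth isn't available on every ARCore device, so check first.
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            config.depthMode = Config.DepthMode.AUTOMATIC
        }
        session.configure(config)
    }
    ```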

    [Animated GIF: the character the Mandalorian, in AR, standing in someone’s kitchen on the screen of a Pixel phone.]

    New content will keep rolling out in the app each week on Mando Mondays, so stay tuned—and Pixel owners should keep an eye out for additional exclusive content outside of the app as well.

    Lucasfilm, the Lucasfilm logo, STAR WARS and related properties are trademarks and/or copyrights, in the United States and other countries, of Lucasfilm Ltd. and/or its affiliates. © & ™ 2020 Lucasfilm Ltd. All rights reserved.

    Bring iconic moments from the first season of “The Mandalorian” to life with “The Mandalorian” AR Experience.

    Website: LINK

  • Bring abstract concepts to life with AR expeditions

    Reading Time: 2 minutes

    Over the last three years, Google Expeditions has helped students go on virtual field trips to far-off places like Machu Picchu, the International Space Station and the Galapagos Islands. The more you look around those places in virtual reality (VR), the more you notice all the amazing things that are there. And while we’ve seen firsthand how powerful a tool VR is for going places, we think augmented reality (AR) is the best way to learn more about the things you find there. Imagine walking around a life-sized African elephant in your classroom or putting a museum’s worth of ancient Greek statues on your table.

    Last year at Google I/O we announced the Google Expeditions AR Pioneer Program, and over the last school year, one million students have used AR in their classrooms. With AR expeditions, teachers can bring digital 3D objects into their classrooms to help their students learn about everything from biology to Impressionist art.

    Starting today, Expeditions AR tours are available to anyone via the Google Expeditions app on both Android and iOS. We’ve also updated the Expeditions app to help you discover new tours, find your saved tours, and more easily start a solo adventure. It’s never been easier to start a tour on your own, at home with your family or in the classroom.

    Website: LINK

  • Now students can create their own VR tours (Daydream Software Engineer)

    Reading Time: 2 minutes

    Editor’s note: For Teacher Appreciation Week, we’re highlighting a few ways Google is supporting teachers—including Tour Creator, which we launched today to help schools create their own VR tours. Follow along on Twitter throughout the week to see more on how we’re celebrating Teacher Appreciation Week.

    Since 2015, Google Expeditions has brought more than 3 million students to places like the Burj Khalifa, Antarctica, and Machu Picchu with virtual reality (VR) and augmented reality (AR). Both teachers and students have told us that they’d love a way to also share their own experiences in VR. As Jen Zurawski, an educator with Wisconsin’s West De Pere School District, put it: “With Expeditions, our students had access to a wide range of tours outside our geographical area, but we wanted to create tours here in our own community.”

    That’s why we’re introducing Tour Creator, which enables students, teachers, and anyone with a story to tell, to make a VR tour using imagery from Google Street View or their own 360 photos. The tool is designed to let you produce professional-level VR content without a steep learning curve. “The technology gets out of the way and enables students to focus on crafting fantastic visual stories,” explains Charlie Reisinger, a school Technology Director in Pennsylvania.

    Once you’ve created your tour, you can publish it to Poly, Google’s library of 3D content. From Poly, it’s easy to view. All you need to do is open the link in your browser or view it in Google Cardboard. You can also embed it on your school’s website for more people to enjoy. Plus, later this year, we’ll add the ability to import these tours into the Expeditions application.

    Website: LINK

  • Experience augmented reality together with new updates to ARCore (Director of Engineering, AR)

    Reading Time: < 1 minute

    Three months ago, we launched ARCore, Google’s platform for building augmented reality (AR) experiences. There are already hundreds of apps on the Google Play Store that are built on ARCore and help you see the world in a whole new way. For example, with Human Anatomy you can visualize and learn about the intricacies of the nervous system in 3D. Magic Plan lets you create a floor plan for your next remodel just by walking around the house. And Jenga AR lets you stack blocks on your dining room table with no cleanup needed after your tower collapses.

    Website: LINK

  • Google Lens: real-time answers to questions about the world around you (Director, Google Lens)

    Reading Time: 2 minutes

    There’s so much information available online, but many of the questions we have are about the world right in front of us. That’s why we started working on Google Lens, to put the answers right where the questions are, and let you do more with what you see.

    Last year, we introduced Lens in Google Photos and the Assistant. People are already using it to answer all kinds of questions—especially when they’re difficult to describe in a search box, like “what type of dog is that?” or “what’s that building called?”

    Today at Google I/O, we announced that Lens will now be available directly in the camera app on supported devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and of course the Google Pixel. We also announced three updates that enable Lens to answer more questions, about more things, more quickly:

    First, smart text selection connects the words you see with the answers and actions you need. You can copy and paste text from the real world—like recipes, gift card codes, or Wi-Fi passwords—to your phone. Lens helps you make sense of a page of words by showing you relevant information and photos. Say you’re at a restaurant and see the name of a dish you don’t recognize—Lens will show you a picture to give you a better idea. This requires not just recognizing shapes of letters, but also the meaning and context behind the words. This is where all our years of language understanding in Search help.
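    Lens itself isn’t exposed as a developer API, but the OCR step that smart text selection builds on is easy to prototype. As an illustrative stand-in, this Kotlin sketch uses ML Kit’s on-device text recognizer (a separate Google library, not Lens) to turn camera pixels into copyable words:

    ```kotlin
    import android.graphics.Bitmap
    import com.google.mlkit.vision.common.InputImage
    import com.google.mlkit.vision.text.TextRecognition
    import com.google.mlkit.vision.text.latin.TextRecognizerOptions

    // Recognize Latin-script text in a camera frame, on device, and hand the
    // raw string to the caller: the first step before any smart selection.
    fun recognizeText(bitmap: Bitmap, onResult: (String) -> Unit) {
        val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
        val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
        recognizer.process(image)
            .addOnSuccessListener { visionText -> onResult(visionText.text) }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
    ```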

    Website: LINK

  • Introducing the first Daydream standalone VR headset and new ways to capture memories (VP)

    Reading Time: < 1 minute

    Back in January, we announced the Lenovo Mirage Solo, the first standalone virtual reality headset that runs Daydream. Alongside it, we unveiled the Lenovo Mirage Camera, the first camera built for VR180. Designed with VR capture and playback in mind, these devices work great separately and together. And both are available for purchase today.

    More immersive

    The Mirage Solo puts everything you need for mobile VR in a single device. You don’t need a smartphone, PC, or any external sensors—just pick it up, put it on, and you’re in VR in seconds.

    The headset was designed with comfort in mind, and it has a wide field of view and an advanced display that’s optimized for VR. It also features WorldSense, a powerful new technology that enables PC-quality positional tracking on a mobile device, without the need for any additional sensors. With it, you can duck, dodge and lean, step backward, forward or side-to-side. All of this makes for a more natural and immersive experience, so you really feel like you’re there.

    Website: LINK

  • Premiering now: The first-ever VR Google Doodle starring illusionist & film director Georges Méliès

    Reading Time: < 1 minute

    An illusionist before he was a filmmaker, Méliès discovered and exploited basic camera techniques to transport viewers into magical worlds and zany stories. He saw film and cameras as more than just tools to capture images; he saw them as vehicles to transport and truly immerse people in a story. He played around with stop motion, slow motion, dissolves, fade-outs, superimpositions, and double exposures.

    “Méliès was fascinated by new technologies and was constantly on the lookout for new inventions. I imagine he would have been delighted to live in our era, which is so rich with immersive cinema, digital effects, and spectacular images on screen,” says Laurent Mannoni, Director of Heritage at the Cinémathèque Française. “I have no doubt he would have been flattered to find himself in the limelight via today’s very first virtual reality / 360° video Google Doodle, propelled around the world thanks to a new medium with boundless magical powers.”

    Website: LINK

  • Behind the scenes: Coachella in VR180 (Director, VR Video)

    Reading Time: < 1 minute

    Last weekend, fans from all around the world made the trek to Southern California to see some of music’s biggest names perform at Coachella. To make those not at the festival feel like they were there, we headed to the desert with VR180 cameras to capture all the action.

    Throughout the first weekend of Coachella, we embarked on one of the largest VR live streams to date, streaming more than 25 performances (with as many cameras to boot) across 20 hours and capturing behind-the-scenes footage of fans and the bands they love. If you missed it live, you can enjoy some of the best experiences—posted here.

    VR180 can take you places you never thought possible—the front row at a concert, a faraway travel destination, the finals of your favorite sporting event, or a memorable location. This year at Coachella, we pushed the format even further by adding augmented reality (AR) overlays on top of the performances—like digital confetti that falls when the beat drops, or virtual objects that extend into the crowd.

    Website: LINK

  • How to publish VR180 (Software Engineer)

    Reading Time: 2 minutes

    Last year we introduced VR180, a new video format that makes it possible to capture or create engaging immersive videos for your audience. Most VR180 cameras work just like point-and-shoot models. However, what you capture in VR180 is far more immersive. You’re able to create VR photos and videos in stunning 4K resolution with just the click of a button.

    Today, we’re publishing the remaining details about creating VR180 videos on GitHub and photos on the Google Developer website, so any developer or manufacturer can start engaging with VR180.

    For VR180 video, we simply extended the Spherical Video Metadata V2 standard. Spherical V2 supports the mesh-based projection needed to allow consumer cameras to output raw fisheye footage. We then created the Camera Motion Metadata Track so that you’re able to stabilize the video according to the camera motion after video capture. This results in a more comfortable VR experience for viewers. The photos that are generated by the cameras are written in the existing VR Photo Format pioneered by Cardboard Camera.
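    To make the file-level picture concrete, here is a rough Kotlin sketch of the MP4 box framing those specs build on. The box names (`st3d` for stereo mode, `sv3d` with its `mshp` mesh projection for the fisheye geometry) come from the published Spherical Video V2 spec; the flat top-level walk and 32-bit sizes are simplifications of what a real parser must handle.

    ```kotlin
    import java.io.DataInputStream
    import java.io.EOFException
    import java.io.File

    // Scan top-level MP4 boxes (4-byte big-endian size + 4-byte FourCC).
    // VR180 metadata such as `st3d` and `sv3d` lives deeper, nested under
    // moov/trak/.../stsd, so a real parser recurses into container boxes
    // and handles 64-bit sizes; this sketch only shows the framing.
    fun listTopLevelBoxes(file: File) {
        DataInputStream(file.inputStream().buffered()).use { input ->
            while (true) {
                val size = try { input.readInt() } catch (e: EOFException) { break }
                val fourCc = ByteArray(4).also { input.readFully(it) }.toString(Charsets.US_ASCII)
                println("box '$fourCc' size=$size")
                var remaining = size - 8  // box size includes the 8-byte header
                while (remaining > 0) {
                    val skipped = input.skipBytes(remaining)
                    if (skipped <= 0) return  // truncated file
                    remaining -= skipped
                }
            }
        }
    }
    ```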

    When you use a Cardboard or Daydream View to look back on photos and videos captured using VR180, you’ll feel like you’re stepping back into your memory. And you can share the footage with others using Google Photos or YouTube, on your phone or the web. We hope that this makes it simple for anyone to shoot VR content, and watch it too.

    In the coming months, we will be publishing tools that help with writing appropriately formatted VR180 photos and videos and playing them back, so stay tuned!

    Website: LINK

  • Announcing high-quality stitching for Jump (Software Engineer)

    Reading Time: 2 minutes

    We announced Jump in 2015 to simplify VR video production from capture to playback. High-quality VR cameras make capture easier, and Jump Assembler makes automated stitching quicker, more accessible and affordable for VR creators. Using sophisticated computer vision algorithms and the computing power of Google’s data centers, Jump Assembler creates clean, realistic image stitching resulting in immersive 3D 360 video.

    Stitching, then and now

    Today, we’re introducing an option in Jump Assembler to use a new, high-quality stitching algorithm based on multi-view stereo. This algorithm produces the same seamless 3D panoramas as our standard algorithm (which will continue to be available), but it leaves fewer artifacts in scenes with complex layers and repeated patterns. It also produces depth maps with much cleaner object boundaries, which is useful for VFX.

    Let’s first take a look at how our standard algorithm works. It’s based on the concept of optical flow, which matches pixels in one image to those in another. When matched, you can tell how pixels “moved” or “flowed” from one image to the next. And once every pixel is matched, you can interpolate the in-between views by shifting the pixels part of the way. This means that you can “fill in the gaps” between the cameras on the rig, so that, when stitched together, the result is a seamless, coherent 360° panorama.
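    Here is a toy Kotlin sketch of that pixel-shifting step, using plain arrays for a grayscale image and a precomputed flow field; the names and data layout are illustrative, and production stitchers additionally handle occlusions, holes, and blending between both source views.

    ```kotlin
    // Forward-warp image A by a fraction t of its optical flow toward image B,
    // approximating the view a camera would have seen t of the way between them.
    fun interpolateView(
        imageA: Array<FloatArray>,       // grayscale pixels, indexed [y][x]
        flow: Array<Array<FloatArray>>,  // per pixel: floatArrayOf(dx, dy) to its match in B
        t: Float                         // 0.0 = view A, 1.0 = view B
    ): Array<FloatArray> {
        val h = imageA.size
        val w = imageA[0].size
        val out = Array(h) { FloatArray(w) }
        for (y in 0 until h) {
            for (x in 0 until w) {
                val (dx, dy) = flow[y][x]
                val nx = (x + t * dx).toInt().coerceIn(0, w - 1)
                val ny = (y + t * dy).toInt().coerceIn(0, h - 1)
                out[ny][nx] = imageA[y][x]  // naive splat: last writer wins
            }
        }
        return out
    }
    ```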

    Website: LINK

  • Chromebook tablets for versatile learning (Group Product Manager)

    Reading Time: < 1 minute

    This past January, students in Kristine Kuwano’s third grade classroom were buzzing with excitement at De Vargas Elementary School in Cupertino, California. Tasked with writing out math equations to upload to Google Classroom, the students grabbed their new tablets from the cart, pulled out the stylus, and logged into Chrome. “They love technology and they have grown up working with touch devices, so tablets are intuitive for them,” said Kuwano.

    Since Chromebooks debuted, schools have chosen them because they are fast, easy to use and manage, shareable, secure and affordable. We’ve listened carefully to feedback from educators around the world, and one common theme is that they want all the benefits of Chromebooks in a tablet form.

    Starting today, with the new Acer Chromebook Tab 10, we’re doing just that. It’s the first education tablet made for Chrome OS, and gives schools the easy management and shareability of Chromebook laptops. With touch and stylus functionality, this lightweight device is perfect for students creating multimedia projects—and also comes with a world of immersive experiences with Google Expeditions AR.

    Website: LINK

  • Early explorations with ARCore 1.0 (Product Manager)

    Reading Time: < 1 minute

    eBay
    eBay is using AR to solve a specific challenge facing their community of sellers: what size shipping container is needed to send that product? With the “Which Box” feature in eBay’s app, sellers can visualize shipping boxes to determine which container size they need to send any product.

    Curate by Sotheby’s International Realty, Streem
    If you’re shopping for a new home or need help maintaining yours, AR can also come in handy. With ARCore, Sotheby’s International Realty is changing the way people stage furniture in the real estate world, and the Streem app connects customers with professionals to solve household maintenance requests.

    Creativity

    Over the last few months, we’ve been tinkering with experiments that show how AR can be used as a new creative medium for self-expression. We’ve worked with creators across different disciplines to explore what happens when AR is used by illustrators, choreographers, animators and more.

    Now, we’re inviting more people to experiment with this technology through an app that lets you make simple drawings in AR, and then share your creation with a short video. The caveat: it’s “Just a Line.”

    Website: LINK

  • Open sourcing Resonance Audio (Product Manager)

    Reading Time: 3 minutes

    Spatial audio adds to your sense of presence when you’re in VR or AR, making it feel and sound like you’re surrounded by a virtual or augmented world. And regardless of the display hardware you’re using, spatial audio makes it possible to hear sounds coming from all around you.

    Resonance Audio, our spatial audio SDK launched last year, enables developers to create more realistic VR and AR experiences on mobile and desktop. We’ve seen a number of exciting experiences emerge across a variety of platforms using our SDK. Recent examples include apps like Pixar’s Coco VR for Gear VR, Disney’s Star Wars™: Jedi Challenges AR app for Android and iOS, and Runaway’s Flutter VR for Daydream, which all used Resonance Audio technology.

    To accelerate adoption of immersive audio technology and strengthen the developer community around it, we’re opening Resonance Audio to a community-driven development model. By creating an open source spatial audio project optimized for mobile and desktop computing, any platform or software development tool provider can easily integrate with Resonance Audio. More cross-platform and tooling support means more distribution opportunities for content creators, without the worry of investing in costly porting projects.

    What’s included in the open source project

    As part of our open source project, we’re providing a reference implementation of YouTube’s Ambisonic-based spatial audio decoder, compatible with the same Ambisonics format (Ambix ACN/SN3D) used by others in the industry. Using our reference implementation, developers can easily render Ambisonic content in their VR media and other applications, while benefiting from Ambisonics’ open source, royalty-free model. The project also includes encoding, sound field manipulation and decoding techniques, as well as head-related transfer functions (HRTFs) that we’ve used to achieve rich spatial audio that scales across a wide spectrum of device types and platforms. Lastly, we’re making our entire library of highly optimized DSP classes and functions open to all. This includes resamplers, convolvers, filters, delay lines and other DSP capabilities. Additionally, developers can now use Resonance Audio’s brand new Spectral Reverb, an efficient, high-quality, constant-complexity reverb effect, in their own projects.
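    To make the Ambisonics side concrete, here is a minimal Kotlin sketch of first-order encoding in that same AmbiX convention (ACN channel order, SN3D normalization); the function is illustrative, not part of Resonance Audio’s actual API.

    ```kotlin
    import kotlin.math.cos
    import kotlin.math.sin

    // Pan a mono sample to a direction by weighting it with the first-order
    // spherical harmonics. Azimuth is counterclockwise from straight ahead,
    // elevation is up from the horizon; both in radians.
    fun encodeFirstOrderAmbiX(sample: Float, azimuth: Float, elevation: Float): FloatArray {
        val cosEl = cos(elevation)
        return floatArrayOf(
            sample,                         // ACN 0: W (omnidirectional)
            sample * sin(azimuth) * cosEl,  // ACN 1: Y (left/right)
            sample * sin(elevation),        // ACN 2: Z (up/down)
            sample * cos(azimuth) * cosEl   // ACN 3: X (front/back)
        )
    }
    ```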

    We’ve open sourced Resonance Audio as a standalone library and associated engine plugins, VST plugin, tutorials, and examples with the Apache 2.0 license. This means Resonance Audio is yours: you’re free to use it in your projects, no matter where you work. And if you see something you’d like to improve, submit a GitHub pull request to be reviewed by the Resonance Audio project committers. While the engine plugins for Unity, Unreal, FMOD, and Wwise will remain open source, going forward they will be maintained by project committers from our partners, Unity, Epic, Firelight Technologies, and Audiokinetic, respectively.

    If you’re interested in learning more about Resonance Audio, check out the documentation on our developer site. If you want to get more involved, visit our GitHub to access the source code, build the project, download the latest release, or even start contributing. We’re looking forward to building the future of immersive audio with all of you.

    Website: LINK

  • Experimenting with Light Fields (Senior Researcher)

    Reading Time: 2 minutes

    We’ve always believed in the power of virtual reality to take you places. That’s why we created Expeditions, to transport people around the world to hundreds of amazing, hard-to-reach or impossible-to-visit places. It’s why we launched Jump, which lets professional creators film beautiful scenes in stereoscopic 360 VR video, and it’s why we’re introducing VR180, a new format for anyone—even those unfamiliar with VR technology—to capture life’s special moments.

    But to create the most realistic sense of presence, what we show in VR needs to be as close as possible to what you’d see if you were really there. When you’re actually in a place, the world reacts to you as you move your head around: light bounces off surfaces in different ways and you see things from different perspectives. To help create this more realistic sense of presence in VR, we’ve been experimenting with light fields.

    Light fields are a set of advanced capture, stitching, and rendering algorithms. Much more work needs to be done, but they create still captures that give you an extremely high-quality sense of presence by producing motion parallax and extremely realistic textures and lighting. To demonstrate the potential of this technology, we’re releasing “Welcome to Light Fields,” a free app available on Steam VR for HTC Vive, Oculus Rift, and Windows Mixed Reality headsets. Let’s take a look at how it works.

    Capturing and processing a light field

    With light fields, nearby objects seem near to you—as you move your head, they appear to shift a lot. Far-away objects shift less and light reflects off objects differently, so you get a strong cue that you’re in a 3D space. And when viewed through a VR headset that supports positional tracking, light fields can enable some truly amazing VR experiences based on footage captured in the real world.

    This is possible because a light field records all the different rays of light coming into a volume of space. To record them, we modified a GoPro Odyssey Jump camera, bending it into a vertical arc of 16 cameras mounted on a rotating platform.
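    As a back-of-the-envelope illustration of that geometry, this Kotlin sketch generates the viewpoint positions swept out by an arc of cameras rotating about a vertical axis; the radius, arc extent, and number of rotation stops are assumed values, not the rig’s real specifications.

    ```kotlin
    import kotlin.math.cos
    import kotlin.math.sin

    data class Vec3(val x: Float, val y: Float, val z: Float)

    // Sample the sphere of viewpoints captured by 16 cameras on a vertical
    // arc as the platform rotates through a full circle.
    fun capturePositions(
        radius: Float = 0.35f,   // assumed arc radius, meters
        camerasOnArc: Int = 16,
        rotationStops: Int = 72  // assumed angular sampling of the platform
    ): List<Vec3> {
        val positions = mutableListOf<Vec3>()
        for (stop in 0 until rotationStops) {
            val azimuth = 2f * Math.PI.toFloat() * stop / rotationStops
            for (cam in 0 until camerasOnArc) {
                // Assumed arc span: roughly -60° to +60° of elevation.
                val elevation =
                    Math.toRadians(-60.0 + 120.0 * cam / (camerasOnArc - 1)).toFloat()
                positions += Vec3(
                    radius * cos(elevation) * sin(azimuth),
                    radius * sin(elevation),
                    radius * cos(elevation) * cos(azimuth)
                )
            }
        }
        return positions
    }
    ```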

    Website: LINK

  • Watch live performances at The FADER FORT from SXSW in VR180 (Director, VR Video)

    Reading Time: < 1 minute

    To bring The FADER FORT experience to more fans, we partnered with The FADER to livestream performances by Saweetie, Bloc Boy, Valee, Speedy Ortiz, YBN Nahmir and other special guests in VR180 on YouTube. No matter where you are, you can watch live on YouTube via your desktop or mobile device, or using Cardboard, Daydream View or PlayStation VR.

    With VR180, those not in attendance at The FADER FORT in Austin will be able to experience three-dimensional 4K video of the show, providing a more immersive experience than a traditional video and making you feel like you are there.

    From March 14-16, we’ll livestream the best acts of the day in VR180, along with a cutdown of each set that can be viewed at any time.

    Check out the calendar below, grab your headset and get ready to see some of the best new artists on the scene without ever setting foot in Austin. Visit Fader for the full lineup. See you at the Fort!

    Website: LINK

  • Making a video game in two days with Tilt Brush and Unity (EMEA Program Manager)

    Reading Time: < 1 minute

    Imagine you’re playing a video game, and you’re being attacked by a gang of angry space aliens. Wouldn’t it be great if you could just paint an object in 3D space and use it to defend yourself? A talented team of artists and game fanatics explored this very premise at Global Game Jam 2018, a game development hackathon. Seeing Tilt Brush as a fast, powerful and fun 3D asset creation tool, the team at Another Circus used the Tilt Brush Toolkit to create a virtual reality game in less than 48 hours.

    “Pac Tac Atac” casts you as a space adventurer who has landed on an alien planet and needs to beam a rescue message into intergalactic space. But watch out, the locals are angry and in the mood to smash your transmitter. It’s up to you to keep them away!

    What the aliens don’t know is that you’re armed with two cans of spray paint that let you magically draw any object in your imagination to defend yourself.

    Website: LINK

  • Announcing ARCore 1.0 and new updates to Google Lens (Director of Engineering, AR)

    Reading Time: < 1 minute

    ARCore, Google’s augmented reality SDK for Android, is out of preview and launching as version 1.0. Developers can now publish AR apps to the Play Store, and it’s a great time to start building. ARCore works on 100 million Android smartphones, and advanced AR capabilities are available on all of these devices. It works on 13 different models right now (Google’s Pixel, Pixel XL, Pixel 2 and Pixel 2 XL; Samsung’s Galaxy S8, S8+, Note8, S7 and S7 edge; LGE’s V30 and V30+ (Android O only); ASUS’s Zenfone AR; and OnePlus’s OnePlus 5). And beyond those available today, we’re partnering with many manufacturers to enable their upcoming devices this year, including Samsung, Huawei, LGE, Motorola, ASUS, Xiaomi, HMD/Nokia, ZTE, Sony Mobile, and Vivo.

    Making ARCore work on more devices is only part of the equation. We’re also bringing developers additional improvements and support to make their AR development process faster and easier. ARCore 1.0 features improved environmental understanding that enables users to place virtual assets on textured surfaces like posters, furniture, toy boxes, books, cans and more. Android Studio Beta now supports ARCore in the Emulator, so you can quickly test your app in a virtual environment right from your desktop.
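    For a feel of the core workflow those apps share, here is a minimal Kotlin sketch of the standard ARCore 1.0 placement pattern: on a tap, hit-test against the surfaces ARCore has detected and anchor virtual content at the first hit (filtering hits by trackable type is left out for brevity).

    ```kotlin
    import com.google.ar.core.Anchor
    import com.google.ar.core.Frame
    import com.google.ar.core.TrackingState

    // Hit-test a screen tap against detected geometry and return an Anchor
    // that keeps virtual content pinned to that spot as tracking updates.
    fun placeOnTap(frame: Frame, tapX: Float, tapY: Float): Anchor? {
        if (frame.camera.trackingState != TrackingState.TRACKING) return null
        val hit = frame.hitTest(tapX, tapY).firstOrNull() ?: return null
        return hit.createAnchor()
    }
    ```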

    Website: LINK

  • Go behind the scenes of “Isle of Dogs” with Pixel (Executive Producer)

    Reading Time: 2 minutes

    “Isle of Dogs” tells the story of Atari Kobayashi, 12-year-old ward to corrupt Mayor Kobayashi. When, by Executive Decree, all the canine pets of Megasaki City are exiled to a vast garbage-dump, Atari sets off alone in a miniature Junior-Turbo Prop and flies to Trash Island in search of his bodyguard-dog, Spots. There, with the assistance of a pack of newly-found mongrel friends, he begins an epic journey that will decide the fate and future of the entire Prefecture.

    The film isn’t out until March 23—but Pixel owners will get an exclusive sneak peek this week.

    In “Isle of Dogs Behind the Scenes (in Virtual Reality),” the audience is taken behind-the-scenes in a 360-degree VR experience featuring on-set interviews of the film’s cast (voiced by Bryan Cranston, Bill Murray, Edward Norton, Liev Schreiber, Jeff Goldblum, Scarlett Johansson, Tilda Swinton, F. Murray Abraham and Bob Balaban). Get nose-to-nose with Chief, Boss, Rex and the rest of the cast while the crew works around you, for an inside look at the unique craft of stop-motion animation.

    Pixel’s powerful front-firing stereo speakers and brilliant display make it perfect for watching immersive VR content like this. Presented in 4K video with interactive spatial audio that responds to where you’re looking, “Isle of Dogs Behind the Scenes (in Virtual Reality)” is a collaboration between FoxNext VR Studio, Fox Searchlight Pictures, Felix & Paul Studios, the Isle of Dogs production team, and Google Spotlight Stories.

    Website: LINK

  • Six ways Google can keep you up to speed in PyeongChang (Product Manager)

    Reading Time: < 1 minute

    5. Get your head in the game with the Assistant

    Your Google Assistant can help you stay up to date throughout the games. Curious about winners? Just say “Hey Google, who won women’s 1000 meter speed skating in the Olympics?” Rooting for a specific country? “Hey Google, how many medals does Iceland have in the Olympics?” You can even say “Hey Google, tell me a fun fact about the games in PyeongChang.” No matter how you’re asking—on your phone, speaker, TV or other enabled device—the Google Assistant can keep up with all the important Olympic details.

    Plus, in the U.S., NBC is bringing an exclusive game to the Google Assistant across devices. It’s already live, so test your winter sports knowledge with dozens of trivia questions. Just say “Hey Google, play NBC Sports Trivia” to start your quest for Olympic trivia gold.

    6. VR gets you closer to the action

    Stream more than 50 hours of NBCUniversal’s live coverage—from the Opening Ceremony to alpine skiing, ice hockey, figure skating, snowboarding, curling and more—in virtual reality by using your YouTube TV credentials to log in to the NBC Sports VR app, powered by Intel True VR. In Europe, multi-camera live VR coverage is available via the Eurosport VR app.

    Let the games begin.

    Website: LINK

  • Travel through time with Pepsi and WebVR (Group Creative Director)

    Reading Time: < 1 minute

    Ah, the Super Bowl—come for the action, stay for the commercials. This year, as part of its “Pepsi Generations” global campaign, Pepsi will extend its TV commercial into virtual reality.

    Pepsi’s new commercial, “This is the Pepsi,” takes viewers on a journey through some of the brand’s most iconic moments. In VR, Pepsi fans can remember those moments, and feel what it was like to be there.

    That’s why we collaborated to create “Pepsi Go Back,” a WebVR experience where fans travel through time and step into Pepsi commercials that became some of their biggest pop culture milestones.

    Website: LINK

  • Take your Blocks models to the next level (Product Manager)

    Reading Time: < 1 minute

    Since Blocks launched six months ago, it’s been amazing to see all the incredible creations built by novices and professional modelers alike. We’ve witnessed everything from a retro roller skate, to an old timey photograph, to our very own JUMP camera. We’ve also gotten tons of feedback about ways we could improve the experience. The latest release, available today on Steam and the Oculus Store, has lots of new features that make Blocks more powerful and even easier to use. Let’s take a look.

    Environment Options

    Modeling in the desert got you seeing 3D mirages? Don’t fret, you’ll now have the option to pick from four modeling environments. We’ve added a night version of the current environment for those who found the desert a bit too bright after long creation sessions. You’ll also find plain white and black options. Make sure to look up while creating in the black environment for a night sky surprise. Plus, we’ll remember which environment you used in your last session and automatically default to that selection your next time around.

    Website: LINK

  • Pioneer new lessons in your classroom with Google Expeditions (Product Manager for Google Expeditions)

    Reading Time: < 1 minute

    Through our travels with the Google Expeditions Pioneer Program, we’ve worked alongside teachers and students to improve the overall Expeditions experience. One of the top requests we’ve heard from teachers and students is the ability to create their own Expeditions. Today, we are excited to announce a beta program that allows schools and educators to do just that. Classrooms will be able to create immersive tours of the world around them — their classrooms, their schools, their communities. We’ll provide participating schools with all the tools and hardware required to capture 360 images and curate unique Expeditions. For more information about the program, sign up here.

    Website: LINK