On Monday, we welcomed TheWaveVR to Oculus Rift. Since then, we’ve seen rave reviews (pun intended), with members of the Rift community calling it “my favorite VR experience” and “some next level sh1t.” And now, this immersive music platform takes another innovative step forward with the release of See Without Eyes—a brand-new album and live show from cutting-edge electronic artists The Glitch Mob you can experience on Rift for free, starting today at 7:00 pm PT!
“We are excited to present the See Without Eyes VR Experience in partnership with TheWaveVR and Strangeloop Studios,” The Glitch Mob says. “This is a completely new way to experience music. We cannot wait for you all to take this journey with us.”
During a 20-minute custom mix of the band’s third studio album, you’ll explore a number of shifting environments while interacting with other people on both Rift and HTC Vive. Those without VR headsets can also join in the fun with Facebook Live. Once the live show is complete, fans will be able to revisit a looped version of the experience in TheWaveVR at any time.
“Imagine being able to fly through fantastical dreamscapes made from The Glitch Mob’s music alongside your friends, or meet people from around the world,” explains TheWaveVR CEO and Co-Founder Adam Arrigo. “In The Glitch Mob VR experience, players literally embody the cosmos while they create parts of the light show as the audience. Part concert, part film, part rave, part art installation, the only way to understand it is to put on the headset and take the trip.”
Clear your living room and get the dance party started when The Glitch Mob drops the bass in TheWaveVR, tonight at 7:00 pm PT on Rift!
At last, here is a revolutionary mini PC for CAD applications, with fantastic performance and versatility in a compact, elegant design. The Z2 Mini, made by design professionals for design professionals, delivers the performance and reliability for the best CAD experience.
Here is what the little wonder box delivers, in detail:
Operating system
Windows 10 Pro 64
Processor family
Intel® Xeon® E3 processor
Processor
Intel® Xeon® E3-1225 v5 with Intel HD Graphics P530 (3.3 GHz, up to 3.7 GHz with Intel Turbo Boost technology, 8 MB cache, 4 cores)
Our test focused on cutting and rendering 360-degree/VR videos.
The verdict was outstanding. We have tested various workstations before, and they come in so many sizes and noise levels that it is sometimes hard to know which one is right for you.
I have to say, three points mattered to me:
Speed: the thing has to have power!
Noise: if the fan is louder than the office air conditioning, you should consider getting a different workstation ^^
Space: how big is it? Do I need a Disney-style render room, and then still not reach my goal?
This workstation covered all three points, and I was very pleasantly surprised by how much performance the little thing can deliver.
For comparison: an average PC with an Intel i5, 16 GB of RAM, an SSD, and an Nvidia gaming card already needs several hours to render a 30-minute 360-degree VR clip. And that does not even include the spatial audio!
With the HP workstation, we rendered out a 25-minute clip, in 4K at 360 degrees, in roughly two hours! The HP Z2 Mini Workstation is genuinely worth recommending, even if at first you think it might not keep up because of its size.
So HP got almost everything right this time. The only small minus: as always, the DisplayPort cable has to be bought separately 🙂 But you see that from every manufacturer; with the Sony PlayStation it was once the charging cable for the controller, with Nintendo it is the power cable for the new little retro console. So that can be forgiven, I'd say 🙂
Over the last three years, Google Expeditions has helped students go on virtual field trips to far-off places like Machu Picchu, the International Space Station and the Galapagos Islands. The more you look around those places in virtual reality (VR), the more you notice all the amazing things that are there. And while we’ve seen first hand how powerful a tool VR is for going places, we think augmented reality (AR) is the best way to learn more about the things you find there. Imagine walking around a life-sized African elephant in your classroom or putting a museum’s worth of ancient Greek statues on your table.
Last year at Google I/O we announced the Google Expeditions AR Pioneer Program, and over the last school year, one million students have used AR in their classrooms. With AR expeditions, teachers can bring digital 3D objects into their classrooms to help their students learn about everything from biology to Impressionist art.
Starting today, Expeditions AR tours are available to anyone via the Google Expeditions app on both Android and iOS. We’ve also updated the Expeditions app to help you discover new tours, find your saved tours, and more easily start a solo adventure. It’s never been easier to start a tour on your own, at home with your family or in the classroom.
Due to the increasing number of terrorist-related attacks, researchers at Sheffield Hallam University have developed AUGGMED, a new training method that uses virtual and augmented reality to better prepare police, first responders, and aid workers.
Historically, training for counter-terrorism assignments has been neither standardized nor readily available, typically consisting of real-world scenarios and classroom exercises.
However, researchers from Sheffield Hallam University, UK have developed a virtual reality-based training method which they hope can prepare police and aid workers for stressful situations.
Their project is called AUGGMED (Automated Serious Game Scenario Generator for Mixed Reality Training). It’s an online multi-user training platform.
The platform makes use of both virtual reality and augmented reality. This means that police, first responders and aid workers can undergo training within virtual reconstructions of the real world.
Augmented reality is also used and allows trainees to see and interact with virtual civilians and terrorists within the real world. The idea is that both technologies will help improve decision making as well as give trainees experience of staying focused during such intense situations.
AUGGMED Training for Police, First Responders, Paramedics
To develop the platform, the researchers looked into the use of “serious games”. They worked with law enforcement agencies and United Nations organizations to do this.
From this research, they could successfully apply these simulations to training. Their work has also received funding from the European Union’s Horizon 2020 research and innovation programme.
AUGGMED is already in use, for example, by British police officers for critical incident response training and security officers with the Piraeus Port Authority in Greece for potential terrorist-related threats.
Interestingly, the platform also enables training with multiple agencies at the same time. This means, collaborative training between the police force, security personnel and paramedics is possible.
Finally, VR training methods may become available to police forces worldwide because they are a cost-effective and rapid training solution.
Jonathan Saunders, Research Fellow (Lead Games Developer) at Sheffield Hallam University certainly thinks so. He explains:
“In the future, the use of modern technologies to improve and augment existing practices will become commonplace… Serious games and virtual reality will one day be ubiquitous within training packages. But before then, the benefits of these technologies need to be explored and discussed further, because they hold remarkable potential.”
Virtual Reality is making it to cinemas in South Korea, with film tech labs and visual effects houses rapidly creating popular content. Cinema-goers are fully immersed with both VR and 4DX, which brings feel, touch, and smell to the experience.
From the beginning of the 1900s until now, people have been enjoying cinema in a fairly similar way. Of course, image quality, sound systems and the number of films have improved, but it’s the story which really pulls people into the world on screen.
However, this is set to change with the development of virtual reality (VR). So far, the technology is a popular medium for creating games and even enhancing theme park rides. Meanwhile, its prevalence in cinema is only just beginning.
Now, film festivals are launching competition sections purely for VR films. Interestingly, one country taking to this trend is South Korea. In fact, cinemas in the country are bringing in VR headsets so you are completely surrounded by the movie.
As the world accepts VR as a viable film technology, film tech labs and visual effects houses are rapidly producing content and the budgets for such movies will only get bigger.
One such VR film is Stay With Me which was directed by Bryan Ku. It focuses on a relationship between a girl who dreams of being an actress and a boy who wants to be a musician but is too afraid to go on stage. Ku said at a press event for the film:
“When you think about VR, most of the time it would be either adventure, action or horror films… I believe the greatest quality of VR lies in its capacity to let the audiences relate to the film emotionally, and romance drama is the genre that corresponds the most to this quality.”
4DX Cinema before VR
Get Completely Lost in a VR Story
Of course, with the rapid development of content comes the need to find ways to screen the pictures. By adding VR headsets to cinemas, South Korea is able to show many of the notable VR film projects developed in 2017.
Stay With Me also opened in “4DX” format at cinema chain, CJ-CGV. 4DX is a technology which adds elements of feel, smell, and touch. However, Yoo Young-gun of CGV adds:
“4DX effects for VR should be different from those for other movies… Visual elements are not enough to accomplish what VR is up to, which is to expand to a form of storytelling with its immersive characteristics maximized. With 4DX technology, the audiences can touch, smell and feel the films, meaning that virtual reality in its literal sense can be achieved.”
Stay With Me claims to be the world’s first film production which was shot in 360-degree VR and screened in 4DX. To do this, CGV’s 4DX effect team had to join the project at the development stage.
However, everything must have gone to plan, as the cinema chain is now aiming to introduce 4DX VR globally. It intends to bring VR tech to its 500 4DX theaters worldwide.
“We are planning a VR add-on package, which allows exhibitors to show VR films, and are offering it to the 500 4DX theaters across the globe,” says Yoo.
Support funds for such films have so far come from the Ministry of Culture, Sport and Tourism, Korea Creative Content Agency, and National IT Industry Promotion Agency. Only time will tell whether this is money well spent, and whether VR films are just a fad or will really take off.
Editor’s note: For Teacher Appreciation Week, we’re highlighting a few ways Google is supporting teachers—including Tour Creator, which we launched today to help schools create their own VR tours. Follow along on Twitter throughout the week to see more on how we’re celebrating Teacher Appreciation Week.
Since 2015, Google Expeditions has brought more than 3 million students to places like the Burj Khalifa, Antarctica, and Machu Picchu with virtual reality (VR) and augmented reality (AR). Both teachers and students have told us that they’d love to have a way to also share their own experiences in VR. As Jen Zurawski, an educator with Wisconsin’s West De Pere School District, put it: “With Expeditions, our students had access to a wide range of tours outside our geographical area, but we wanted to create tours here in our own community.”
That’s why we’re introducing Tour Creator, which enables students, teachers, and anyone with a story to tell to make a VR tour using imagery from Google Street View or their own 360 photos. The tool is designed to let you produce professional-level VR content without a steep learning curve. “The technology gets out of the way and enables students to focus on crafting fantastic visual stories,” explains Charlie Reisinger, a school Technology Director in Pennsylvania.
Once you’ve created your tour, you can publish it to Poly, Google’s library of 3D content. From Poly, it’s easy to view. All you need to do is open the link in your browser or view in Google Cardboard. You can also embed it on your school’s website for more people to enjoy. Plus, later this year, we’ll add the ability to import these tours into the Expeditions application.
Three months ago, we launched ARCore, Google’s platform for building augmented reality (AR) experiences. There are already hundreds of apps on the Google Play Store that are built on ARCore and help you see the world in a whole new way. For example, with Human Anatomy you can visualize and learn about the intricacies of the nervous system in 3D. Magic Plan lets you create a floor plan for your next remodel just by walking around the house. And Jenga AR lets you stack blocks on your dining room table with no cleanup needed after your tower collapses.
There’s so much information available online, but many of the questions we have are about the world right in front of us. That’s why we started working on Google Lens, to put the answers right where the questions are, and let you do more with what you see.
Last year, we introduced Lens in Google Photos and the Assistant. People are already using it to answer all kinds of questions—especially when they’re difficult to describe in a search box, like “what type of dog is that?” or “what’s that building called?”
Today at Google I/O, we announced that Lens will now be available directly in the camera app on supported devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and of course the Google Pixel. We also announced three updates that enable Lens to answer more questions, about more things, more quickly:
First, smart text selection connects the words you see with the answers and actions you need. You can copy and paste text from the real world—like recipes, gift card codes, or Wi-Fi passwords—to your phone. Lens helps you make sense of a page of words by showing you relevant information and photos. Say you’re at a restaurant and see the name of a dish you don’t recognize—Lens will show you a picture to give you a better idea. This requires not just recognizing shapes of letters, but also the meaning and context behind the words. This is where all our years of language understanding in Search help.
Back in January, we announced the Lenovo Mirage Solo, the first standalone virtual reality headset that runs Daydream. Alongside it, we unveiled the Lenovo Mirage Camera, the first camera built for VR180. Designed with VR capture and playback in mind, these devices work great separately and together. And both are available for purchase today.
More immersive
The Mirage Solo puts everything you need for mobile VR in a single device. You don’t need a smartphone, PC, or any external sensors—just pick it up, put it on, and you’re in VR in seconds.
The headset was designed with comfort in mind, and it has a wide field of view and an advanced display that’s optimized for VR. It also features WorldSense, a powerful new technology that enables PC-quality positional tracking on a mobile device, without the need for any additional sensors. With it, you can duck, dodge and lean, step backward, forward or side-to-side. All of this makes for a more natural and immersive experience, so you really feel like you’re there.
An illusionist before he was a filmmaker, Méliès discovered and exploited basic camera techniques to transport viewers into magical worlds and zany stories. He saw film and cameras as more than just tools to capture images; he saw them as vehicles to transport and truly immerse people in a story. He played around with stop motion, slow motion, dissolves, fade-outs, superimpositions, and double exposures.
“Méliès was fascinated by new technologies and was constantly on the lookout for new inventions. I imagine he would have been delighted to live in our era, which is so rich with immersive cinema, digital effects, and spectacular images on screen,” says Laurent Manonni, Director of Heritage at The Cinémathèque Française. “I have no doubt he would have been flattered to find himself in the limelight via today’s very first virtual reality / 360° video Google Doodle, propelled around the world thanks to a new medium with boundless magical powers.”
Lobby 2.0 features an overall larger space, a dedicated quarter-court Echo Arena practice area, new lobby music (because we all need practice jams), and single-player private matches open to everyone.
As you use this space and practice your skills, you can also activate a personal training disc in the lobby’s practice area and before games start in private matches. Use the button on your Arm Computer to recall the disc to your hand for another throw. Other players can’t see or interact with your personal disc, so you can practice to your heart’s content.
And don’t forget your hardhats—one of the most exciting parts of the update is still under construction …
Stay tuned for more on Echo Combat very soon. And if you haven’t already entered the Echo Games universe, dive into Echo Arena and Lone Echo on Rift today.
Check out Ready At Dawn’s Medium post for greater detail on Lobby 2.0. Have fun checking out the new space—we’ll see you in zero-g!
Last weekend, fans from all around the world made the trek to Southern California to see some of music’s biggest names perform at Coachella. To make those not at the festival feel like they were there, we headed to the desert with VR180 cameras to capture all the action.
Throughout the first weekend of Coachella, we embarked on one of the largest VR live streams to date, streaming more than 25 performances (with as many cameras to boot) across 20 hours and capturing behind-the-scenes footage of fans and the bands they love. If you missed it live, you can enjoy some of the best experiences—posted here.
VR180 can take you places you never thought possible—the front row at a concert, a faraway travel destination, the finals of your favorite sporting event, or a memorable location. This year at Coachella, we pushed the format even further by adding augmented reality (AR) overlays on top of the performances—like digital confetti that falls when the beat drops, or virtual objects that extend into the crowd.
360° | ISLE OF DOGS | Behind The Scenes (in Virtual Reality) | FoxNext VR Studio
Experience the world of Wes Anderson’s upcoming stop-motion animated film, face to face with the cast of dogs as they are interviewed on set, while the crew works around you to create the animation.
Produced in collaboration with the production team of Isle of Dogs, and in partnership with Felix & Paul Studios and Google Spotlight Stories.
Last year we introduced VR180, a new video format that makes it possible to capture or create engaging immersive videos for your audience. Most VR180 cameras work just like point-and-shoot models. However, what you capture in VR180 is far more immersive. You’re able to create VR photos and videos in stunning 4K resolution with just the click of a button.
Today, we’re publishing the remaining details about creating VR180 videos on GitHub and photos on the Google Developer website, so any developer or manufacturer can start engaging with VR180.
For VR180 video, we simply extended the Spherical Video Metadata V2 standard. Spherical V2 supports the mesh-based projection needed to allow consumer cameras to output raw fisheye footage. We then created the Camera Motion Metadata Track so that you’re able to stabilize the video according to the camera motion after video capture. This results in a more comfortable VR experience for viewers. The photos that are generated by the cameras are written in the existing VR Photo Format pioneered by Cardboard Camera.
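Both the Spherical V2 extension and the photo format ride inside the MP4 (ISO BMFF) box structure. As a rough illustration of how a tool might detect such metadata, here is a toy Python box scanner that looks for a Spherical Video V2 (`sv3d`) box. It is a sketch under simplifying assumptions: the container list and helper names are our own, only 32-bit box sizes are handled, and a production parser would also descend into `stsd` sample entries, where `sv3d` actually lives in real files.

```python
import struct

# Container boxes whose payload is itself a sequence of boxes
# (illustrative subset; real ISO BMFF defines many more).
CONTAINERS = {b"moov", b"trak", b"mdia", b"minf", b"stbl"}

def parse_boxes(data, offset=0, end=None):
    """Return the box types found in an ISO BMFF byte stream.

    Minimal sketch: handles 32-bit box sizes only (no 64-bit
    'largesize' boxes and no size==0 'extends to end of file').
    """
    if end is None:
        end = len(data)
    boxes = []
    while offset + 8 <= end:
        size, btype = struct.unpack_from(">I4s", data, offset)
        if size < 8 or offset + size > end:
            break  # malformed box; stop scanning
        boxes.append(btype)
        if btype in CONTAINERS:
            # Recurse into the container's payload.
            boxes.extend(parse_boxes(data, offset + 8, offset + size))
        offset += size
    return boxes

def has_spherical_v2(data):
    """True if the stream contains an 'sv3d' (Spherical Video V2) box."""
    return b"sv3d" in parse_boxes(data)
```

Because everything is plain length-prefixed boxes, extensions like the Camera Motion Metadata Track can be added without breaking players that don't understand them: unknown boxes are simply skipped by their declared size.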
When you use a Cardboard or Daydream View to look back on photos and videos captured using VR180, you’ll feel like you’re stepping back into your memory. And you can share the footage with others using Google Photos or YouTube, on your phone or the web. We hope that this makes it simple for anyone to shoot VR content, and watch it too.
In the coming months, we will be publishing tools that help with writing appropriately formatted VR180 photos and videos and playing them back, so stay tuned!
We announced Jump in 2015 to simplify VR video production from capture to playback. High-quality VR cameras make capture easier, and Jump Assembler makes automated stitching quicker, more accessible and affordable for VR creators. Using sophisticated computer vision algorithms and the computing power of Google’s data centers, Jump Assembler creates clean, realistic image stitching resulting in immersive 3D 360 video.
Stitching, then and now
Today, we’re introducing an option in Jump Assembler to use a new, high-quality stitching algorithm based on multi-view stereo. This algorithm produces the same seamless 3D panoramas as our standard algorithm (which will continue to be available), but it leaves fewer artifacts in scenes with complex layers and repeated patterns. It also produces depth maps with much cleaner object boundaries, which is useful for VFX.
Let’s first take a look at how our standard algorithm works. It’s based on the concept of optical flow, which matches pixels in one image to those in another. When matched, you can tell how pixels “moved” or “flowed” from one image to the next. And once every pixel is matched, you can interpolate the in-between views by shifting the pixels part of the way. This means that you can “fill in the gaps” between the cameras on the rig, so that, when stitched together, the result is a seamless, coherent 360° panorama.
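The pixel-shifting idea above can be sketched in a few lines. The following is a toy one-dimensional version of flow-based view interpolation, under our own simplifications, and not the actual Jump Assembler implementation: each pixel of view A is pushed a fraction `t` of the way along its flow vector toward view B.

```python
import numpy as np

def interpolate_view(img_a, flow, t=0.5):
    """Synthesize an in-between view by shifting each pixel of img_a
    a fraction t of the way along its optical-flow vector.

    img_a : 1-D grayscale scanline (a real stitcher works in 2-D)
    flow  : per-pixel horizontal displacement from view A to view B
    Forward-splats each pixel; unfilled gaps stay 0 (a real pipeline
    would fill holes using the reverse flow from the second camera).
    """
    out = np.zeros_like(img_a)
    for x in range(len(img_a)):
        target = int(round(x + t * flow[x]))
        if 0 <= target < len(out):
            out[target] = img_a[x]
    return out
```

With `t=0` this reproduces view A, with `t=1` it lands on view B's alignment, and intermediate values of `t` "fill in the gaps" between the cameras on the rig, which is what makes the stitched panorama look seamless.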
Firefox Reality is a new web browser in development from Mozilla for virtual and augmented reality headsets. However, a release date has yet to be disclosed.
There is much discourse about whether virtual reality can and will live up to its hype. However, Mozilla, the creator of many open-source tools, clearly believes the technology is worth developing for.
Soon it will launch a new web browser called Firefox Reality. What’s new is that it’s “designed from the ground up for stand-alone virtual and augmented reality headsets,” the company explains in an announcement post.
Eventually, the plan is to engineer and develop Firefox Reality for the next generation of standalone VR and AR headsets. However, for now, source code can be run in Developer Mode on Daydream and Gear VR devices.
Currently, there is no official release date. Instead, you’ll have to get an idea of how it will work from the video below, which offers an early look at the web engine and test user interface:
New Information to Come in the Following Weeks
For now, all we know is that the team took their existing Firefox web technology and enhanced it with Servo, their experimental web engine.
They explain that Firefox offers decades of web compatibility. Meanwhile, the Servo team offers the “ability to experiment with entirely new designs and technologies for interacting with the immersive web.”
However, the company explains that this is simply the first step in a long-term plan. Over time, the idea is to deliver a new experience on an “exciting” platform. So watch this space.
Over the next few weeks, Mozilla promises to release regular updates on how work is going. You’ll also learn more details of the design and see paper sketches of a headset prototype. Furthermore, the team promises sneak peeks of new capabilities for artists, designers, and developers of immersive experiences.
Visit the blog or the company’s Twitter account to find out more and stay up to date with the latest releases. If you’re a developer with insights to offer, Mozilla encourages you to reach out.
This past January, students in Kristine Kuwano’s third grade classroom were buzzing with excitement at De Vargas Elementary School in Cupertino, California. Tasked with writing out math equations to upload to Google Classroom, the students grabbed their new tablets from the cart, pulled out the stylus, and logged into Chrome. “They love technology and they have grown up working with touch devices, so tablets are intuitive for them,” said Kuwano.
Since their debut, schools have chosen Chromebooks because they are fast, easy to use and manage, shareable, secure, and affordable. We’ve listened carefully to feedback from educators around the world, and one common theme is that they want all the benefits of Chromebooks in a tablet form.
Starting today, with the new Acer Chromebook Tab 10, we’re doing just that. It’s the first education tablet made for Chrome OS, and gives schools the easy management and shareability of Chromebook laptops. With touch and stylus functionality, this lightweight device is perfect for students creating multimedia projects—and also comes with a world of immersive experiences with Google Expeditions AR.