In 1992, Neal Stephenson coined the term "metaverse" in his science fiction novel Snow Crash. In his world, avatars of humans interacted in a three-dimensional virtual space.
Three decades later, Mark Zuckerberg just showed off his version of that vision.
What do you think? Zoom killer?
Like the 3D vs. 2D feel? Don't like avatars? Don't like the bulky headgear?
A medical examiner puts on a pair of augmented-reality goggles and brings up a computer-simulated image of a body that appears to hover a few steps away. Nearby on a metal autopsy table lies the body of a person brought into the lab after a fatal shooting. Instead of cutting into the victim, the examiner slices through the 3-D image, mapping the bullet's trajectory and determining the cause of death without making a single incision.
This is one vision of the virtual future of autopsies, based on interviews with forensic and digital health-care experts: using digital reconstructions and machine-learning algorithms to diagnose the cause of death, identify a victim, and even triage battlefield or motor-vehicle injuries in live patients by analyzing images of victims who died in similar incidents. It would mark a step change for forensic science, where the standard methods of autopsy have remained nearly unchanged for a century.
In 2017, Dr. A.J. Peper started Command Sight, a small business based in Seattle, to bridge human and animal communication. Through conversations with current and former military operators, he identified a need to increase the efficacy of communication between canine and handler. As a result, Command Sight built the first prototype of augmented-reality glasses for military working dogs.
The augmented-reality goggles are custom-fitted to each dog and display a visual indicator that directs the dog to a specific spot, so the dog can react to the cue it sees in the goggles. The handler can see everything the dog sees and can issue commands through the glasses.
The reason for the false starts and the delay is that it is very hard for computers to map and represent the physical world, so the experiences which today's augmented reality applications can provide are limited. They can overlay digital images onto the physical world, but these images incorporate very little information about the physical world. Emil thinks his new company has cracked the problem. Using highly accurate 6-DOF (six degrees of freedom) tracking to capture motion along three translational axes (up/down, forward/back, left/right) and three rotational axes (roll, pitch, yaw), he and his team have developed technology that creates a near-perfect simulation of any physical space in three dimensions. A digital twin, imbued with intelligence from the physical world.
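For readers unfamiliar with the term, a 6-DOF pose boils down to six numbers per tracked object per frame: three for position, three for orientation. A minimal sketch in Python (the class, field names, and yaw convention here are illustrative only, not the company's actual data model):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """One tracked pose: three translational and three rotational
    degrees of freedom -- the six numbers a 6-DOF tracker updates
    every frame."""
    x: float = 0.0      # left/right, meters
    y: float = 0.0      # up/down
    z: float = 0.0      # forward/back
    roll: float = 0.0   # radians, rotation about the forward axis
    pitch: float = 0.0  # rotation about the left/right axis
    yaw: float = 0.0    # rotation about the vertical axis

def step(pose: Pose6DOF, dx: float, dz: float) -> Pose6DOF:
    """Translate the pose by a step (dx, dz) expressed in the pose's
    own heading frame, i.e. rotate the step through the current yaw.
    This is why orientation matters: the same 'one meter forward'
    lands in different world positions depending on where you face."""
    c, s = math.cos(pose.yaw), math.sin(pose.yaw)
    pose.x += c * dx + s * dz
    pose.z += -s * dx + c * dz
    return pose
```

For example, after a 90-degree yaw, a one-meter step "forward" moves the pose along the world's x axis rather than its z axis, which is the kind of bookkeeping a digital twin must get exactly right.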
Facebook's future plans for AR smartglasses are still in the works, but one part of the equation that could soon see the light of day is smart audio. Spatial audio and augmented audio reality are two arenas in which Facebook is actively interested, and the company's newest Facebook Reality Labs research update details how the company plans to make devices that can tune out the real world and make virtual objects sound like they're right next to you, even if they're not really there.
Typically, swim monitors reside on your wrist. An Apple Watch will track your performance, as will other wearables, like a Garmin watch. But a new gadget from Form, a company in Vancouver, Canada, does it differently: with a head-up display right in your goggles.
Slip them on, and simple yellow, pixelated text appears in front of one of your eyes, superimposed over the scene before you. Whether you're underwater or turning your head for a breath of air, the information floats there. Two buttons on the side let you take actions like selecting the pool size, or actually starting the swim. While you're swimming, the display lets you see information like how far you've gone and how many lengths you've completed. A companion app provides a place to make more granular changes and see the full results of your swim afterwards.
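The on-goggle stats reduce to simple arithmetic once the swimmer has selected a pool size. A rough sketch of that derivation (the function name, fields, and units are my own for illustration, not Form's actual API):

```python
def swim_summary(lengths_completed: int, pool_size_m: float,
                 elapsed_s: float) -> dict:
    """Derive the displayed stats from what the goggles know:
    lap count, the pool size the user selected, and elapsed time."""
    distance_m = lengths_completed * pool_size_m
    # Pace normalized to seconds per 100 m, a common swim metric.
    pace = (elapsed_s / distance_m * 100.0) if distance_m else 0.0
    return {"distance_m": distance_m, "pace_s_per_100m": pace}
```

So 20 lengths of a 25 m pool in 10 minutes would display as 500 m at a 2:00-per-100 m pace; the hard part the goggles actually solve is detecting lengths and turns, not this arithmetic.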
Immertec’s technology involves placing special cameras in an operating room. During a surgical procedure, doctors in other locations can use a Medoptic-enabled headset to watch the surgery in 180-degree 3-D VR. Users also can talk to one another remotely, ask questions and zoom in on the patient’s surgical site. Dean, now Immertec’s acting chief medical officer, regularly performs endoscopies that are livestreamed via the Medoptic platform.
Immertec’s competitive advantage is its speed: The network lag — the length of time it takes for data to travel between the sender and receiver — is less than 200 milliseconds, Maltais says.
“Our IP (intellectual property) is around compression and distribution. It’s exactly real time,” says Maltais. “You can ask questions, and there’s no lag time.” Immertec’s competitors use simulated training, he says — “you’d put on a headset and you’d practice surgery in a video game-like experience, or you’d watch a video recording.”
The point of the new feature is to help you orient yourself when you're following a walking map. It also helps to solve a common annoyance in many big cities: You get out of a subway and you have no idea which way you're facing, so you wait for the little blue dot on the Maps app to point you in the right direction.
I found myself at Gettysburg National Military Park this past weekend. Among other things I went to the Cyclorama, which displays French artist Paul Philippoteaux's painting "Battle of Gettysburg". It is a mind-boggling canvas that measures 377 feet in circumference, stands 42 feet tall, and brings to life the fury of the third day (July 3, 1863) of the Battle of Gettysburg. Philippoteaux was commissioned to paint it in 1879.
I could have spent hours with the ranger, who explained the logistics of shipping the canvas from France to Chicago, where it was first exhibited; its display around the country; its assembly on the circular wall at its permanent home in Gettysburg; and its subsequent maintenance and periodic restoration.
"Cycloramas were a very popular form of entertainment in the late 1800's, both in America and Europe. These massive, oil-on-canvas paintings were displayed in special auditoriums and enhanced with landscaped foregrounds sometimes featuring trees, grasses, fences and even life-sized figures. The result was a three-dimensional effect that surrounded viewers who stood on a central platform, literally placing them in the center of the great historic scene."
The painting has been digitized in the 360-degree video below by the American Battlefield Trust - it's worth a watch, but with today's AR/VR, we will likely see a next-generation depiction.
Of course, nothing beats a visit to the Park, and a walk around many of the sites of the battle. It is so solemn.
Even more gratifying to me was seeing the many artifacts of President Abraham Lincoln's famous "Four Score and Seven Years ago" address, delivered at the site a few months after the battle.
Mixed reality allows aerospace trainees to learn in an immersive virtual environment without the need for an actual physical aircraft or parts. This 3D environment can offer features that real-life training cannot, such as the ability to view elements in three dimensions from any angle.
HoloLens helps Airbus designers virtually test their designs to see if they are ready for manufacture. Mixed reality speeds up the process substantially, decreasing the time spent by 80 percent.
Mixed reality technology can also help workers on the production line access crucial information while keeping their hands free. Digital information, such as instructions or diagrams, can be overlaid on a real piece of machinery to aid in complex or hard-to-reach tasks. These kinds of mixed-reality solutions have allowed Airbus to cut manufacturing time by a third while improving quality.