When Cossman went to Ambrym, he and his team used a drone to map out every nook and cranny of the volcano, covering both craters. That map was in turn made into a full three-dimensional rendering. Using the Unreal game engine inside an AvayaLive Engage virtual environment, it's possible for anyone with a login to explore Ambrym from the comfort of their laptop screen. Which is what I'm doing. With a jetpack.
The jetpack is a video game conceit. The crater is full of steep edges, and jetpacks are a simple solution to avoid getting stuck. Even without the difficulty of climbing up, the crater itself is huge. The avatars stand about six feet tall in the virtual environment, and while moving in the game is less physically taxing (not to mention less life-threatening) than crawling around an actual volcano, it’s not much faster if you don't use the jetpack.
A new SUV after seven years, a new laptop after three, a new smartphone after two: especially when you see things through the eyes of a technophobe wife, you realize how everything, even the most basic, humble things, is rapidly evolving
The car clock has evolved to sync with the GPS and auto-adjust for daylight saving time and time zones
The earbuds can screenprint to your photos
The speakerbox can speed-dial for you
The mouse has lost its tail and can walk even on rough surfaces
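The car-clock item above hides a neat trick worth sketching: GPS delivers time in UTC, so the clock software only has to render that instant in a named local zone, and daylight saving comes along for free. A minimal sketch using Python's standard `zoneinfo` module (the zone name and function name are illustrative, not from any real head unit):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

def display_time(gps_utc, zone="America/New_York"):
    """GPS supplies UTC; rendering it in a named zone handles DST automatically."""
    return gps_utc.astimezone(ZoneInfo(zone)).strftime("%H:%M %Z")

# The same trick on two UTC instants that straddle a DST boundary:
winter = datetime(2024, 1, 15, 17, 0, tzinfo=timezone.utc)
summer = datetime(2024, 7, 15, 17, 0, tzinfo=timezone.utc)
print(display_time(winter))  # 12:00 EST
print(display_time(summer))  # 13:00 EDT
```

The point is that nothing here "adjusts" the clock: the stored time never changes, only its presentation.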
Occipital has developed apps that let people scan objects in 3-D by walking around them, and scan entire rooms. One app shows how the sensor can enable augmented reality, where virtual imagery is overlaid on the real world when seen through a viewfinder. In that app, a person plays fetch with a virtual cat by throwing a virtual ball that bounces realistically off real-world objects.
Suppose you want to fillet a fish. Lay it down on a chopping board and the cameras will detect its outline and orientation so the projectors can overlay a virtual knife on the fish with a line indicating where to cut. Speech bubbles even appear to sprout from the fish's mouth, guiding you through each step.
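Detecting a blob's outline and orientation, as the kitchen's cameras do with the fish, is classically done with image moments: the centroid comes from first-order moments and the principal axis from second-order ones. A minimal pure-Python sketch on a binary mask (real systems would use a vision library such as OpenCV; the function name is illustrative):

```python
import math

def blob_orientation(mask):
    """Centroid and principal-axis angle (radians) of a binary mask via image moments."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    n = len(pts)
    cx = sum(x for x, _ in pts) / n          # first-order moments -> centroid
    cy = sum(y for _, y in pts) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pts)  # central second-order moments
    mu02 = sum((y - cy) ** 2 for _, y in pts)
    mu11 = sum((x - cx) * (y - cy) for x, y in pts)
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta

# A blob elongated along the main diagonal: orientation comes out at 45 degrees.
mask = [[1 if x == y else 0 for x in range(8)] for y in range(8)]
(cx, cy), theta = blob_orientation(mask)
print(round(math.degrees(theta)))  # 45
```

With centroid and angle in hand, the projector only needs a calibrated camera-to-projector mapping to draw the cut line in the right place.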
If that is not enough, the kitchen also comes equipped with a small robot assistant named Phyno that sits on the countertop. When its cameras detect the chef has stopped touching the ingredients, Phyno asks whether that particular step in the recipe is complete. Users can answer "yes" to move on to the next step or "no" to have the robot repeat the instructions.
Sign of the times. Time has devoted an entire issue to how mobile technology is changing everything from elections to teaching to detective work.
Three nice bonus features in the issue:
a) The cover is made up of 288 of the 31,429 mobile phone photos it got from readers in over 120 countries.
b) The Time mobile app (for iOS and Android) has an Augmented Reality Scanner, and you can scan 5 of the pages in the issue for more content. Scan the map on the editorial page, for instance, and you can watch Managing Editor Richard Stengel talk about the issue.
c) The 10 Questions page is an interview with Austin Wierschke, 17, the fastest texter in the US. Seriously. He won a $50,000 prize for typing a 149-character message in 39 seconds.
“Google’s Project Glass team has just jumped out of a blimp above Google I/O here in San Francisco and is live streaming the entire thing via a Google+ Hangout session.
These augmented reality glasses skydived onto the roof of the Moscone West Center here in San Fran, then their wearers took a few BMX bikes down a building or two, and finally made it on stage for a demo. Google’s running over a few of the specs, although they aren’t saying anything specific. Project Glass has 3G/4G data (multiple radios), microphones, speakers, a camera (with video, obviously), sensors, a compass, a gyro, and more. It knows where you are, connects you to your friends, lets you take plenty of pictures, and keeps you social with Google+.
What makes Google’s Project Glass even better is the comfort. If it’s heavy and uncomfortable, who will wear them? Google has just stated they are lighter than many sunglasses, yet sturdy enough for daily activities like mountain biking, playing tennis, and more.”
"This project investigates techniques to track the 6DOF position of handheld depth sensing cameras, such as Kinect, as they move through space and perform high quality 3D surface reconstructions for interaction."
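The core step in that kind of pose tracking is estimating the rigid transform (rotation plus translation) that best aligns points seen in one frame with the same points in the next. As a hedged illustration, here is the closed-form least-squares solution for the 2D analogue of the 6DOF problem (the point data and function name are made up for the example; the full 3D case uses an SVD-based variant of the same idea, as in the Kabsch algorithm):

```python
import math

def rigid_align_2d(src, dst):
    """Least-squares rotation angle and translation mapping src points onto dst."""
    n = len(src)
    scx = sum(x for x, _ in src) / n; scy = sum(y for _, y in src) / n
    dcx = sum(x for x, _ in dst) / n; dcy = sum(y for _, y in dst) / n
    # Accumulate cross and dot products of the centered point pairs.
    s_cross = s_dot = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - scx, sy - scy
        bx, by = dx - dcx, dy - dcy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    # Translation follows from how the rotation moves the source centroid.
    tx = dcx - (scx * math.cos(theta) - scy * math.sin(theta))
    ty = dcy - (scx * math.sin(theta) + scy * math.cos(theta))
    return theta, (tx, ty)

# Rotate a triangle by 90 degrees, shift it by (5, 3), then recover that motion.
src = [(0, 0), (1, 0), (0, 2)]
dst = [(5, 3), (5, 4), (3, 3)]
theta, (tx, ty) = rigid_align_2d(src, dst)
print(round(math.degrees(theta)), round(tx), round(ty))  # 90 5 3
```

Running this step repeatedly on successive depth frames, with correspondences found by nearest-neighbor search, is essentially the ICP loop such systems build on.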