The Light L16 aggregates images from multiple phone-camera-size sensors to create pictures with a resolution as high as 52 megapixels, a little higher than that of the best digital single-lens reflex (DSLR) cameras on the market.
The Navdy is the first portable head-up display (HUD). It sits atop the dash and plugs into the OBD-II port of any car made after 1996. It projects info such as speed, engine rpm, and compass direction on a transparent screen in front of you, and uses built-in GPS and Google Maps to show the surrounding area, display speed limits and street names, and route you to your destination. It also connects to your Android or iOS smartphone via Bluetooth to display data including calls, texts, music, and all manner of social media and alerts. Access to this info is largely controlled using a thumbwheel that attaches to your steering wheel and is supplemented by gesture control that's activated by waving your hand in front of the device.
In writing my recent book, Silicon Collar, I saw several mismatches in the labor market. There have been nearly 5 million unfilled jobs for 4+ years. Yet people have racked up over a trillion dollars in student debt for education many cannot parlay into jobs. Higher education still thinks in terms of 4-6-8 years of formal school when the average job lasts 5 years or less. We need to revisit our learning methods, and fast.
So, I have been watching with interest as Nick Hortovanyi started describing on social media his experience in a new area using Udacity.
Nick had graced this blog a couple of years ago, when he described how wearables and data were reshaping his passion for cycling.
I asked him how and why he decided on Udacity:
“I'm not an academic, have never been to university, and the thought of going to university for 3+ years was off-putting. I have had a lifelong learning experience with technology, via technology itself. The Udacity Silicon Valley approach seems to fit how I learn, using the internet itself.”
However, he is not using Udacity to learn a well-trodden subject. And he is doing it across the Pacific from his home on the Gold Coast of Australia.
"I was having trouble finding a large enough market for a startup vision, I had improving performance of cyclists from the data they collected. Thus as part of my what’s next thinking I applied for the new Udacity Self Driving Car Engineer Nano Degree. I thought if I got in, that'd be great, I could get recognition for some of my more recent data science learnings as well as learn more about AI (Deep Learning & Machine Learning), Computer Vision and Robotics.”
It is cutting-edge stuff, like the Advanced Lane Detection project, where he applied “computer vision techniques to augment video output with a detected road lane, road radius curvature and road centre offset.”
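Nick's writeup doesn't include his code, but the geometry behind those two numbers is standard: fit a second-order polynomial x = Ay² + By + C to the detected lane pixels, then compute the radius of curvature and the car's offset from the lane midpoint. A minimal pure-Python sketch of just that step, with illustrative function names and units left in pixels rather than metres:

```python
def fit_parabola(points):
    """Fit x = A*y^2 + B*y + C exactly through three (y, x) points
    by solving the 3x3 linear system with Cramer's rule.
    (Real pipelines least-squares-fit hundreds of lane pixels.)"""
    (y1, x1), (y2, x2), (y3, x3) = points

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    M = [[y1 * y1, y1, 1], [y2 * y2, y2, 1], [y3 * y3, y3, 1]]
    d = det3(M)
    b = [x1, x2, x3]

    def replace_col(col):
        return [[b[i] if j == col else M[i][j] for j in range(3)]
                for i in range(3)]

    return tuple(det3(replace_col(c)) / d for c in range(3))  # (A, B, C)

def radius_of_curvature(A, B, y):
    """R = (1 + (2Ay + B)^2)^1.5 / |2A| for the curve x = Ay^2 + By + C."""
    return (1 + (2 * A * y + B) ** 2) ** 1.5 / abs(2 * A)

def centre_offset(left_x, right_x, image_width):
    """Offset of the car (assumed at image centre) from the lane midpoint."""
    return image_width / 2 - (left_x + right_x) / 2
```

In a full pipeline these fits are run on a perspective-corrected ("bird's-eye") view of the road, and pixel values are scaled to metres before being reported.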
The biggest upgrade to the FOX broadcast feed this year comes courtesy of Intel. The Silicon Valley tech company aims to provide as many as two dozen player's-eye-view clips from the game, a feature called “Be the Player.” The feature, based on Intel's 360 Replay technology, models the real world so that virtual views can be generated from any location.
Intel has installed 38 5K cameras high above the field at NRG Stadium in Houston, bolted onto the building’s metal structure. Pointed downwards, these cameras operate more like sensors, feeding visual data back to a rack of servers elsewhere inside the stadium. Working together, those servers can digitally reconstruct the 3D world of the game, representing real objects using 3D pixels, known as voxels.
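Intel hasn't published how its servers build that reconstruction, but the core idea of a voxel grid is simple to illustrate: quantise 3D positions (recovered from the multiple camera views) into discrete grid cells. A toy sketch, with a hypothetical `voxelize` helper that is not Intel's API:

```python
def voxelize(points, voxel_size):
    """Quantise 3D points into a sparse set of voxel coordinates.
    Each voxel is identified by its integer (i, j, k) grid index;
    two points closer together than voxel_size land in the same cell."""
    voxels = set()
    for x, y, z in points:
        voxels.add((int(x // voxel_size),
                    int(y // voxel_size),
                    int(z // voxel_size)))
    return voxels
```

A virtual camera then renders whichever voxels fall inside its view frustum, which is why a viewpoint can be placed anywhere on the field after the fact.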
When FOX sends a request—perhaps for a quarterback’s view from the pocket during a crucial play, or a linebacker’s view from the other side of the ball—two Intel staffers will take over. The system’s pilot will operate a virtual camera, choosing where and when to position the viewpoint in the 3D reconstruction. The navigator will package the visual feed from the pilot into a clip that can be relayed back to FOX. The whole process will take a couple of minutes, so don’t expect instant replays yet.
The universal set of emoji are “regulated,” for lack of a better term, by the very unsexy- and unfunny-sounding Unicode Consortium, which gives each approved emoji its own universally recognized, unique code. The nonprofit consortium is an alliance of big tech companies, including Apple, Google, IBM, Microsoft, Facebook and others, that pay an annual fee of $18,000 to vote on characters and other text decisions. Having one central coding depot ensures that devices created by competing companies recognize each other’s text and symbols. Emoji make up just a small percentage of the many text codes issued by the organization. The process of creating a new emoji takes about 18 months from start to finish, and anyone can submit an application to the consortium for a new icon, along with the reasoning behind it.
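That "unique code" is literally a Unicode code point (some emoji are sequences of several). You can see it from any programming language; a quick Python illustration, where `codepoints` is a hypothetical helper written for this example:

```python
def codepoints(s):
    """Return the U+XXXX code points that make up a string."""
    return ["U+%04X" % ord(ch) for ch in s]

# The grinning-face emoji is the single code point U+1F600,
# the same on every device that follows the Unicode standard.
grin = codepoints("\U0001F600")      # ['U+1F600']

# Some emoji, like flags, are sequences of two code points.
flag = codepoints("\U0001F1E6\U0001F1FA")  # ['U+1F1E6', 'U+1F1FA']
```

Because both companies map the same code point to their own artwork, a face sent from an iPhone renders as a face on Android, even though the drawings differ.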
In 2014, the population of Singapore was estimated to be 5.47 million, inhabiting a land area of 718 square kilometres. As one of the most densely populated cities in the world, Singapore faces complex urban challenges, and careful urban planning is crucial to maintain efficiency and sustainability. For the coming decades, Virtual Singapore will provide a collaborative platform and rich data environment to help make long-term decisions on areas such as infrastructure and resource management, environmental and disaster management, public services, urban planning, community services and homeland security.
This is Magi, a system that captures images in 3-D and “4K” ultrahigh resolution and displays the resulting frames at five times the usual rate. Trumbull developed the technology as a way to create movie experiences more immersive than regular 3-D or giant-screen IMAX—and restore the joy of going out to the movies.
[Photo: Trumbull inside a green-screen studio he is building on his Berkshires property.]
Trumbull, 74, has spent his entire life thinking about how people experience the illusions of cinema. He grew up in Los Angeles fascinated by the Cinerama widescreen movie format; got his first Hollywood job, doing visual effects for 2001: A Space Odyssey, in his 20s; and went on to direct two cult-classic films (Brainstorm and Silent Running) and design visual effects for Blade Runner, Close Encounters of the Third Kind, and Star Trek: The Motion Picture. Now, in an age when the movie theater is losing its allure, he’s hoping to wow people yet again—this time using Magi’s “hyper-reality,” which enables audiences to connect intensely with stories and vividly experience a character’s perspective.
Jamboard works like a digital whiteboard, letting users sketch out ideas, attach digital sticky notes, and bring in content from the web, all in a single, constantly updating workspace. People can collaborate on Jamboard either on the 55-inch mega-display of the same name or through the accompanying tablet and smartphone apps for iOS and Android.
If you use Dark Sky as your weather app of choice—or just happen to enjoy gloriously rendered maps of weather movement—you should be both gladdened and a little surprised to learn that it’s now available in a new incarnation. Meet Dark Sky, the website. It should look pretty familiar.
To find the ideal vacation-photography arsenal, I toured New York City for a week with three devices: the LG 360 CAM, a 360-degree camera; the Narrative Clip 2, a wearable cam that automatically snaps photos every 30 seconds; and the Moment smartphone lenses and case, which equip your phone to shoot like a full-fledged camera. All three items combined were lighter, smaller and less expensive than the kind of DSLR “serious” photographers lug around. Plus, I didn’t have to wear a fanny pack.