Despite Apple's claims to the contrary, not everyone believes the iPad Pro can replace a computer. When paired with an Apple Pencil, however, the tablet lets people do things they might never accomplish on a traditional PC or Mac. For example, the iPad Pro lets you create handwritten thank you cards that can then be sent through the mail, write notes on PDFs and sign on the dotted lines, and drain all the color out of photos, then add it back to particular objects.
The following 12 iOS apps look great on iPad Pro, and they all put your Apple Pencil to work.
The autopilot tracks the position of the deck, adjusting the throttle, flaps, ailerons, and stabilizers to keep the flight path and angle of attack on point. Instead of maintaining continuous pressure on the stick and making myriad inputs before landing, the pilot can relax. Any adjustments he does make are incorporated into the autopilot settings.
During a week of trials last month, test pilots flying F/A-18 Super Hornets conducted nearly 600 touch-and-go landings and many tailhook-arrested landings on the Nimitz-class USS George Washington. They made both highly accurate approaches and deliberately inaccurate approaches, with varying wind speeds and directions. According to engineers with the Navy and Boeing, the system increased the accuracy and consistency of landings under all conditions. Those landings were less stressful, too: Pilots typically perform 300 corrections to their flight path in the final 18 seconds of an approach. Magic Carpet drops that to between 10 and 20.
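The closed-loop correction described above can be caricatured in a few lines: the autopilot repeatedly measures the aircraft's error off the glidepath and commands a correction, instead of the pilot making hundreds of manual inputs. This is a minimal sketch only; the Navy's actual control laws are far more sophisticated and not public, so every name, gain, and threshold below is invented for illustration.

```python
def fly_approach(glidepath_error_ft, gain=0.4, steps=18, deadband_ft=0.1):
    """Toy proportional controller: drive glidepath error toward zero.

    Returns the final error and how many corrections were applied
    (illustrative numbers only; not the Magic Carpet control laws).
    """
    corrections = 0
    error = glidepath_error_ft
    for _ in range(steps):          # one control update per second of approach
        if abs(error) <= deadband_ft:
            continue                # on-path: no input needed
        error -= gain * error       # proportional correction toward the deck
        corrections += 1
    return error, corrections

final_error, n = fly_approach(20.0)   # start 20 ft off the glidepath
```

With these made-up gains, a 20-foot initial error settles onto the path after roughly a dozen corrections, in the same ballpark as the 10-to-20 figure the engineers report.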
Intelligent Voice CEO Nigel Cannings says the breakthrough came when he decided to see what would happen if he pointed a machine-learning system at the waveform of the voice data – its pattern of spikes and troughs – rather than at the audio recording directly. It worked brilliantly.
Training his system on this visual representation let him harness powerful existing techniques designed for image classification. “I built this dialect classification system based on pictures of the human voice,” he says.
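One common way to turn audio into a "picture" a standard image classifier can consume is a magnitude spectrogram: slice the waveform into overlapping frames and take the frequency content of each one. The article doesn't say which transform Intelligent Voice actually used, so the sketch below (pure standard library, with an illustrative frame size and hop) shows only the general idea of the waveform-to-image step.

```python
import cmath
import math

def spectrogram(signal, frame_size=64, hop=32):
    """Slice the waveform into overlapping frames and magnitude-DFT each one,
    yielding a 2-D time-by-frequency 'image' an image classifier can consume."""
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        row = []
        for k in range(frame_size // 2):   # keep non-negative frequency bins
            s = sum(x * cmath.exp(-2j * cmath.pi * k * n / frame_size)
                    for n, x in enumerate(frame))
            row.append(abs(s))
        frames.append(row)
    return frames

# A tone with exactly 8 cycles per frame should light up frequency bin 8.
tone = [math.sin(2 * math.pi * 8 * n / 64) for n in range(256)]
image = spectrogram(tone)
```

Each row of `image` is one time slice and each column one frequency bin; feeding that 2-D array to an off-the-shelf image classifier is what lets existing vision techniques do the dialect work.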
3D Touch - By weaving 3D Touch more deeply into the fabric of iOS 10—and specifically, making it an essential part of its lock screen—Apple finally arms a potentially revolutionary feature with true purpose. You may not use 3D Touch much today, but after iOS 10 arrives this fall, you may wonder how you lived without it.
Siri - Not only that, you’ll be able to use Siri on your Mac to make a lot of simple actions easier: adding things to your calendar, doing quick research and calculations, setting reminders, playing music, even searching your computer. Siri can search Finder, finding you files from last week about the offsite and then showing you the ones you tagged as drafts. Click a button and the result pins to your Notification Center for easy finding later. The voice assistant can do more on the Apple TV as well: Siri has improved topical searches for movies and TV shows (“Horror movies from the ’80s”), and you can now run voice searches for YouTube videos.
Dell's new monitor will pique your interest, with a 43-inch 4K display and the option to run it as four separate 1080p screens without bezel breaks. The Dell P4317Q can show content from four separate inputs simultaneously in full HD (four USB 3.0 ports, two HDMI, one DisplayPort, one Mini DisplayPort, and one VGA port are available), and you can zoom in to any single input to take advantage of the full 4K panel at will. If you're considering throwing your multi-monitor setup out the window and going all-in with Dell — which the company says will save you 30 percent in energy consumption — prepare to spend some serious cash. The monitor will cost you $1,349 and is expected to begin shipping on May 23rd.
Google Home project lead Mario Queiroz held the device in his palm, revealing a design that was shorter and wider than Amazon's cylindrical Echo, which is powered by Amazon's virtual assistant Alexa. Microsoft also has its own personal assistant, Cortana, but as yet no at-home device.
Google Home will use the new Google Assistant, which draws on Google search and the contextual-query understanding the company has developed through a decade of research into artificial intelligence. It will be able to play music, complete a range of tasks, and answer the kinds of questions one would ask of Google search.
The SignAloud glove captures ASL gestures with sensors that measure everything from XYZ coordinates to the way individual fingers flex or bend. That sensor data is sent via Bluetooth to a nearby computer and fed into coding algorithms that categorize the gestures, which are translated into English and then audibly spoken via speaker.
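The pipeline above ends in a classification step: a vector of sensor readings has to be mapped to the nearest known sign. SignAloud's actual algorithms and sensor layout haven't been published, so the sketch below uses a deliberately simple nearest-centroid rule, and every sign name, reading, and number in it is hypothetical.

```python
import math

# Hypothetical training centroids: average sensor readings per sign
# (five finger-flex values plus one hand-position value, all made up).
CENTROIDS = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1, 0.8],
    "thank you": [0.9, 0.2, 0.2, 0.2, 0.2, 0.5],
}

def classify(reading):
    """Return the sign whose centroid is nearest to this sensor reading."""
    return min(
        CENTROIDS,
        key=lambda sign: math.dist(reading, CENTROIDS[sign]),
    )

word = classify([0.12, 0.08, 0.11, 0.09, 0.10, 0.79])  # near the "hello" centroid
```

In the real glove, the winning label would then be handed to a text-to-speech engine to be spoken aloud; a production classifier would also need to segment continuous motion into discrete gestures, which this sketch skips.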
“Until now, that's the direction technology has taken us -- living and gorging on screens with our heads down in our phones. But 2016 will be the year this changes. Why? Because we know something is wrong. The loss of humanness is very real. We also know that technology has the profound potential to enhance our experience of the world around us, rather than distract us from it.
I call this the Invisible Interface -- a movement wherein technology still provides us with information and gives us command of our surroundings, but through discreet signals rather than screens. It is not that different from the way we orient ourselves in nature: we look at the Sun to understand how much daylight is left in the day; we feel a breeze and turn towards it to scan the horizon for the sign of a storm.
This new approach to the transmission of information is much harder to build than pixels on a screen. And yet it is so much more rewarding for the designer, because the resulting user experience is natural, fluid and non-interruptive. Information and action are then woven into our lives so discreetly that, if it weren't for the magical experiences it creates, we would forget it is there.”
“Working with Corning, Apple created pliable iPhone cover glass. Swipe it, and the phone works the way it always has. But press it, and 96 sensors embedded in the backlight of the retina display measure microscopic changes in the distance between themselves and the glass. Those measurements then get combined with signals from the touch sensor to make the motion of your finger sync with the image on screen.
Some of this technology was first revealed in the Apple Watch, which has a feature called Force Touch. But 3D Touch is to Force Touch as ocean swimming is to a foot bath. Screen size makes a difference, but the software on the iPhone 6S has a liquid ease. Apply a tiny bit of pressure anywhere you want to explore something—a restaurant link inside a text, an 11 a.m. meeting invite buried in an e-mail—and a peek at the restaurant’s Web page or a window into your calendar hovers expectantly in the middle of the screen while everything else blurs into temporary opacity. Press a little harder, and what you’ve been peeking at pops fully into frame. Release your finger, and you’re right back where you started. Presto chango, no home button required.”
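The peek-then-pop behavior described in that passage boils down to two force thresholds: light pressure shows a preview, harder pressure commits to it, and release returns you to where you were. Apple's actual threshold values and gesture logic are not public, so the numbers and names below are purely illustrative.

```python
# Illustrative thresholds on a normalized 0.0-1.0 force reading;
# Apple's real values are not public.
PEEK_THRESHOLD = 0.35   # enough force to show the hovering preview
POP_THRESHOLD = 0.75    # harder press commits to the full view

def touch_state(force):
    """Map a normalized force reading to an interaction state."""
    if force >= POP_THRESHOLD:
        return "pop"    # content pops fully into frame
    if force >= PEEK_THRESHOLD:
        return "peek"   # preview hovers while the rest of the screen blurs
    return "touch"      # ordinary tap/swipe handling

states = [touch_state(f) for f in (0.1, 0.5, 0.9, 0.0)]
```

The interesting engineering is in the sensing (the 96 backlight sensors measuring glass deflection); once that yields a continuous force value, the peek/pop decision itself is as simple as this.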
He starts simply, asking for the time in Berlin and the population of Japan. Basic search-result stuff—followed by a twist: “What is the distance between them?” The app understands the context and fires back, “About 5,536 miles.”
Then Mohajer gets rolling, smiling as he rattles off a barrage of questions that keep escalating in complexity. He asks Hound to calculate the monthly mortgage payments on a million-dollar home, and the app immediately asks him for the interest rate and the term of the loan before dishing out its answer: $4,270.84.
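The calculation Hound performs here is the standard fixed-rate amortization formula. The article doesn't report the interest rate or term Mohajer supplied, so the inputs in the usage line below are illustrative, not a reconstruction of his query.

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment: P * r(1+r)^n / ((1+r)^n - 1),
    where r is the monthly rate and n the number of payments."""
    r = annual_rate / 12
    n = years * 12
    if r == 0:
        return principal / n        # zero-interest edge case
    growth = (1 + r) ** n
    return principal * r * growth / (growth - 1)

payment = monthly_payment(1_000_000, 0.05, 30)  # e.g. $1M at 5% over 30 years
```

At those assumed terms the payment comes out around $5,368 a month; a different rate and term would reproduce the $4,270.84 figure Hound gave.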
“What is the population of the capital of the country in which the Space Needle is located?” he asks. Hound figures out that Mohajer is fishing for the population of Washington, DC, faster than I do and spits out the correct answer in its rapid-fire robotic voice.