Suddenly, everybody’s talking about ambient computing. I blame Intel, and I’ll tell you why in a minute.
Ambient computing is real. It’s the next megatrend in computing.
Ambient means “in the air,” “present on all sides” or “all around us.” To interact in an “ambient computing” context means you don’t have to care, or even necessarily know, exactly where the devices you’re interacting with are.
When IoT devices and sensors are all around us, and artificial intelligence can understand human contexts for what’s happening and act accordingly and in our interests, then ambient computing will have arrived.
In the past, computing existed inside a computer, which you sat down at and consciously used. In the future, connected computing devices will be all around us, and we’ll always be interacting with them, even when we don’t know or think about it.
As I told you last year, ambient computing isn’t a specific technology, but a general way to interact with digital devices and the internet.
As with many technology revolutions, including augmented reality and AI, the buzzword ambient will precede the actual technology by many years. In fact, the marketing buzzword is suddenly here in full force. The actual technologies? Not so much.
Instead, we’re on the brink of a revolution in what you might call “semi-ambient computing.”
Intel inside; screen and sensors outside
Intel dragged ambient computing into the spotlight a couple of weeks ago by unveiling a prototype laptop it calls an Ambient PC at its Technology Open House at Computex in Taiwan.
The Ambient PC gets its label in part from a touchscreen edge that functions while the laptop lid is closed. The edge screen shows icons, calendar information (it’s running a calendar app in the image above) and buttons for controlling audio, all while the laptop is still closed. Microphones let the laptop work like an Amazon Echo appliance, with Alexa just a wake word away. Most interestingly, the laptop has a 360-degree camera that can log you in via Windows Hello as you approach it.
It’s a stretch to call Intel’s Ambient PC an example of ambient computing. It basically does things while closed that regular laptops can do only when open. Packing these functions behind a closed lid doesn’t make it ambient, just usable in a new configuration. I still want one.
Alexa, are you ambient?
Amazon’s Alexa virtual assistant shows up in many physical appliances these days, including Amazon’s own Echo and related lines of smart speakers and smart displays. Alexa, Siri, Google Assistant and Cortana enable hands-free interaction with information, “skills,” communication and the internet, and are, as such, semi-ambient computing products.
They’re not fully ambient because they can’t yet use AI to synthesize input from multiple sensors and understand context. For now, they mostly operate through voice. When I’m having a conversation and mention Alexa, the Echo wakes up and says it doesn’t understand. And that’s right: Alexa has no ability to understand the context of my use of the A-word.
The new Google ‘wave’
Rumors are circulating that Google’s next smartphones, the Pixel 4 line, may come with Soli built in. I told you in January about Google’s Project Soli, which may be called the “Aware” sensor or feature in the Pixel 4 — again, according to unconfirmed rumors.
Soli or Aware capability means the Pixel 4 may accept in-the-air hand gestures, such as “skip” and “silence” during music playback. The new Google “wave” is a hand gesture.
The ability to wave away music with a hand gesture brings the smartphone into the semi-ambient computing era, essentially adding natural hand gestures to natural-language processing. But there’s no reason to believe the Pixel 4 will synthesize these inputs into an understanding of context.
The truth is that ambient computing for consumers is years away, and will likely first emerge in cars.
As consumer electronics appliances go, cars have an advantage when it comes to introducing sensors: drivers and passengers are a captive audience. It’s clear where to point the sensors, and it’s even possible to build them into seats and seatbelts. The context of human activity or intent is also easier to guess in a car.
Over the next five years, an increasing number of cars will gain the ability to identify the drivers and passengers (and adjust settings and preferences accordingly); monitor the driver for sleepiness, drunkenness and distraction; and safely wrench control from the human and pull over if necessary.
Grocery stores will also become consumer-facing ambient computing locations. Amazon’s Go stores are already semi-ambient computing systems, where customers essentially shoplift with Amazon’s permission, and then pay for the nabbed items automatically.
Smart glasses will one day function as consumer ambient computing devices. With advanced machine learning, glasses will monitor gaze, drip information about what we see into our ears (or through bone conduction), and provide context and extremely effective virtual-assistant functions in a seamless way.
But the first place ambient computing will appear for real is in businesses, enterprises and healthcare facilities.
Clippy gets a job
Microsoft is going all-in on ambient computing, according to reports. Under the Surface brand, Microsoft is expected to release a range of ambient computing devices, features and services that probably involve its Cortana virtual assistant.
Looking behind the scenes, it appears that Microsoft sees ambient computing as a concept for businesses and enterprises. In posted job listings, Microsoft says that its “Ambient Computing & Robotics team is creating applications for the era where computer vision, AI-based cognition, and autonomous electro-mechanicals pervade the workplace. We are using this convergence to transform physical work in construction sites, logistics yards, baggage handling areas, hospital corridors, factories, restaurants, farms and more.”
Microsoft was mocked for its “Clippy” assistant, which the company released in 1996 as a way to provide friendly help for people using Microsoft Office. In the future, Microsoft may release what will essentially be a Clippy that works, because it will understand human context through AI.
We’ll also see ambient computing showing up in medicine.
Nuance, the venerable speech-recognition company, is working on ambient computing for healthcare called Ambient Clinical Intelligence, or ACI. It works through a smart speaker mounted on the wall of a doctor’s examination room, which also has a camera. By applying deep learning to speech and visuals, ACI is able to document doctor visits.
The idea is to enable the doctor to pay full attention to the patient, without worrying about writing everything down.
ACI should start showing up in medical centers next year.
Google also briefly talked last year about a healthcare assistant called Dr. Liz, which was described by former Google CEO Eric Schmidt as an ambient computing virtual assistant for doctors. We’ll see if Google ever ships a Dr. Liz product.
Yes, ambient computing is real, and it’s the Next Big Thing, showing up first in business, enterprises and healthcare.
But for now, the term ambient computing will be misapplied. It’s a buzzword that will be stapled to every semi-ambient computing product and service that comes out over the next few years.