Google's Big Bet on the Future of UI: Gadgets That Predict Your Needs

At I/O last week, Google made surprisingly little noise about its boldest release of the year: Android Wear, its OS for smartwatches and other wearables, won't be some simplified version of a smartphone UI. Rather, it's a new kind of beast, whose core experience borrows heavily from Google Now. The Android Wear UX isn't a bunch of app icons arrayed on a screen; it's a series of cards, each triggered by whatever you might be doing in the moment.

Among the interactions Google touted: when you're preparing to leave for the airport, your watch stacks a series of cards showing the traffic you can expect, your flight information, and your hotel address; a Trulia-powered app alerts you, as you drive around the city, to nearby homes for sale; and you can check whether something you've added to a Pinterest board is available in a store nearby.

Note how the series of cards is geared to a trip you're about to embark on; on the right, the activated list shows the various things you can ask for using voice alone. Image: Google

Crucially, none of these actions would be ones you explicitly call up; rather, they'd simply appear on your watch. And if you need something in particular, you'd just speak your command out loud, whether that's to set up a calendar appointment, send a friend a text message, or figure out the name of the song you're hearing.

All of this is in service of an ambitious goal that interaction designers have pursued since the dawn of the discipline in the 1970s: contextually aware computing. Or, as Hayes Raffle, Google's head of interaction research, described it: "Computing should be ephemeral and should fade into the background. The world is the experience. The experience of a product shouldn't compete with the world. At the best it can provide timely information to be connected to the moment you're in. Our goal is to create experiences as short as possible, as fast as possible."

Shortcomings for Now; a New Platform on the Way

The only problem, for now, is that the experience seems to be awful: Android Wear still defaults to mapping the functions of a cellphone too directly. The smartwatches that depend upon it simply become more insistent proxies for your smartphone. Early users around the WIRED office report that watches running Android Wear buzz constantly with new messages. And if you want to mute the various apps, such as email, you end up muting them completely, instead of having them buzz only when the most important messages arrive.

In short, there is a long way to go before our devices are able to figure out, all by themselves, exactly what we really want in the moment. Once again, Google's instinct to launch before a user experience is fully baked is, for now, obscuring the quality of the ideas behind it.

To address these problems, you wonder why the Android Wear UI isn't adaptive from the outset: starting with few notifications, then asking users to opt into more as they become accustomed to its possibilities. This kind of adaptive UI would be ideal for teaching a user the capabilities of something new.
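
To make that idea concrete, here's a minimal sketch of how such an adaptive opt-in might work, written in Kotlin. Everything here is invented for illustration; none of these names come from the Android Wear SDK.

```kotlin
// Hypothetical sketch of an adaptive notification policy: users start
// at a minimal tier and are invited to opt into more over time.
enum class Tier { ESSENTIAL, STANDARD, EVERYTHING }

class AdaptivePolicy(private var tier: Tier = Tier.ESSENTIAL) {
    private var notificationsSeen = 0

    // Only surface a notification if it fits the user's current tier.
    fun shouldShow(priority: Int): Boolean {
        notificationsSeen++
        return when (tier) {
            Tier.ESSENTIAL -> priority >= 2   // urgent only
            Tier.STANDARD -> priority >= 1    // urgent + important
            Tier.EVERYTHING -> true
        }
    }

    // Once the user has lived with the device a while, invite them to
    // opt into the next tier rather than turning it on silently.
    fun maybePromptForMore(userAccepted: () -> Boolean) {
        if (notificationsSeen > 100 && tier != Tier.EVERYTHING && userAccepted()) {
            tier = Tier.values()[tier.ordinal + 1]
        }
    }
}
```

The key design choice is that the system never escalates on its own; the user is always asked before the device gets chattier.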

Moreover, the devices don't seem to take enough advantage of their setting: your watch definitely shouldn't ping you about emails when you're at your computer or near your phone. That's what those devices are for! Instead, you'd want some subtler logic.

What notifications on your watch would make using your phone or computer easier? What notifications would be enough to scratch the itch of what you need to know? In those cases, a notification that simply says, "You haven't gotten any VIP emails in the last two hours," would buy you time away from your phone, which, after all, is the real goal with Android Wear.
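
One way to picture that subtler logic: the watch stays quiet while your phone is in use, and summarizes quiet stretches instead of relaying every message. A rough sketch, again in Kotlin, with hypothetical context signals (isPhoneNearbyAndActive and vipEmailsSince are stand-ins, not real APIs):

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical context signals; real implementations would come from
// the phone's own state and the email provider, not from these stubs.
interface WatchContext {
    fun isPhoneNearbyAndActive(): Boolean
    fun vipEmailsSince(t: Instant): Int
}

// Decide what, if anything, the watch should say about email.
fun emailNotification(ctx: WatchContext, now: Instant): String? {
    // If the phone or computer is in use, the watch stays silent:
    // that's what those devices are for.
    if (ctx.isPhoneNearbyAndActive()) return null

    val twoHoursAgo = now.minus(Duration.ofHours(2))
    val vipCount = ctx.vipEmailsSince(twoHoursAgo)

    // "Nothing important happened" is itself useful information:
    // it buys you time away from your phone.
    return if (vipCount == 0)
        "You haven't gotten any VIP emails in the last two hours."
    else
        "$vipCount VIP emails in the last two hours."
}
```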

But Android Wear does offer at least a credible indication of what's next for UI design. For one, Google is using its natural-language algorithms to build a library of verbal "intents," covering anything from wanting directions to the grocery store to wanting to take a note, and it's asking developers to suggest other intents they'd like to map. And, perhaps more interesting, Google is also creating a way for app developers to use event-based triggers in Android Wear. Thus, an app could trigger a notification based on any combination of the following: location, time, events on your calendar, motion (walking, cycling, driving), and nearby devices.
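
To see how those triggers might combine, here's a hedged sketch of a tiny rule engine. The trigger types and registration API are invented for illustration and don't reflect Google's actual developer tools.

```kotlin
// Hypothetical event triggers of the kinds Google described:
// location, time, motion, and nearby devices.
sealed class Trigger {
    data class NearLocation(val name: String, val radiusMeters: Int) : Trigger()
    data class TimeOfDay(val hour: Int) : Trigger()
    data class Motion(val activity: String) : Trigger()  // "walking", "cycling", "driving"
    data class NearbyDevice(val deviceId: String) : Trigger()
}

// An app registers a card against any combination of triggers;
// the card fires only when all of them are satisfied at once.
class ContextEngine {
    private val rules = mutableListOf<Pair<List<Trigger>, String>>()

    fun register(triggers: List<Trigger>, cardText: String) {
        rules += triggers to cardText
    }

    fun activeCards(current: Set<Trigger>): List<String> =
        rules.filter { (triggers, _) -> triggers.all { it in current } }
             .map { (_, card) -> card }
}

fun main() {
    val engine = ContextEngine()
    // e.g. a Trulia-style card: driving near a home that's for sale
    engine.register(
        listOf(Trigger.Motion("driving"),
               Trigger.NearLocation("123 Elm St (for sale)", 500)),
        "A home you might like is for sale nearby."
    )
    val now = setOf<Trigger>(
        Trigger.Motion("driving"),
        Trigger.NearLocation("123 Elm St (for sale)", 500)
    )
    println(engine.activeCards(now))  // [A home you might like is for sale nearby.]
}
```

Note that this is exactly the kind of if/then chaining whose limits are questioned at the end of this piece: every card is a fixed conjunction of conditions, with no room for surprise.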

Google is the only company well-positioned to put all of these pieces together. As good as Apple has been at smooth, often delightful interactions, they haven't spent years building a compelling suite of products driven by user data. As I've argued before, user data is the magic ingredient to a new era of user experience.

A Brand New Field: Emergent Design

Google's VP of design, Matias Duarte, sat down with WIRED to talk about this new era of interaction design. Asked about the newfound push toward contextual smarts, he couldn't quite pin down the origins of such thinking at Google. Rather, he says, it was a natural progression that came about when trying to stitch together Google's various products while making them useful in new, wearable form factors. What interested him most was the new kind of design such thinking represents.

Matias Duarte. Photo: Ariel Zambelich/WIRED

"This is a massive new tool that people aren't thinking about," Duarte told WIRED. "And that's that we're now designing systems that can do things that you can't predict." Obviously, some of this comes from the machine-learning algorithms that our wearables, for example, might use to make sense of the patterns in our movements, desires, and day-planning.

But Duarte points out that this requires designers to think more loosely: No longer will they simply be tasked with prescribing the exact conditions under which some new product will be used. "Context as a discipline is going to make us learn to design for coincidental things. It's like learning to paint with oils then learning to paint with watercolors. The paints are going to flow and bleed. You have to paint a different way." Much like videogames and films, where AI often determines behavior rather than a programmer, designers then become orchestrators more than anything else.

Of course, in the near term, Android Wear presents some major challenges. Can you really cobble together seamless, contextually aware interactions from something as plain as a precise set of sensor readouts? Or do people want context in different ways, at different times, so much so that it's silly to map our needs to simple chains of if/then logic? We still have a long way to go before what these devices do is more than merely logical. To be useful, they'll also have to be surprising. That's a very high bar, but it's one they'll have to clear before wearables can ever steal us away from our phones.