Thursday, March 14, 2013

How Search Is Evolving -- Finally! -- Beyond Caveman Queries

Longtime Google search executive Amit Singhal has a favorite example of what he and others call “conversational search.” He pulls out his phone and says, “How old is Justin Bieber?” Then he asks a follow-up question: “How tall is he?”

Facebook/Justin Bieber: Justin Bieber is 5'7".

Singhal explained in a recent interview that Google has developed a basic capability for what is known as pronoun, or anaphora, resolution. So the robot Android woman replies to him, understanding that the "he" in the second question refers back to the proper noun in the first.

“Search has been, in the past, a one-shot deal. But for the first time, ‘he’ meant ‘Justin Bieber!’ No one else does that,” said Singhal.

(The Biebs measures 5'7", same as Tom Cruise and Al Pacino, which took me a third Google search to confirm.)
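To make that mechanic concrete, here is a minimal sketch of how a search session might carry an entity forward and swap it in for a pronoun in a follow-up query. The class names and the crude capitalization heuristic are invented for illustration; this is not Google's actual pipeline.

```python
# A toy sketch of pronoun (anaphora) resolution across a search session.
# All names and heuristics here are illustrative assumptions.

PRONOUNS = {"he", "she", "it", "they", "him", "her", "them"}

class SearchSession:
    """Tracks the most recent named entity so follow-up queries can reuse it."""

    def __init__(self):
        self.last_entity = None  # most recent proper noun seen in a query

    def resolve(self, query):
        tokens = query.rstrip("?").split()
        resolved = []
        for token in tokens:
            if token.lower() in PRONOUNS and self.last_entity:
                # Swap the pronoun for the entity carried over from the prior query.
                resolved.append(self.last_entity)
            else:
                resolved.append(token)
        # Crude entity detection: remember capitalized words that aren't question words.
        entity = [t for t in tokens
                  if t[:1].isupper() and t.lower() not in {"how", "what", "who", "where"}]
        if entity:
            self.last_entity = " ".join(entity)
        return " ".join(resolved)

session = SearchSession()
print(session.resolve("How old is Justin Bieber?"))  # -> How old is Justin Bieber
print(session.resolve("How tall is he?"))            # -> How tall is Justin Bieber
```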

“Today I showed you a two-sentence conversation,” Singhal said. “I wouldn’t be surprised if, in a year or two, we’ll see a much broader conversation happening within search” -- where users can talk to a search engine as if they’re talking to a person.

More Than an Interface Problem

For years, online search has trained us to speak its odd and stilted language. Type a demand for information. Isolate the keywords. Start from scratch with every query. Use quotation marks to specify a phrase. It’s enough of a foreign language that some people call it “Searchese.”

One thing binding together much of the work Google and other companies are doing around search these days is that they’re making it more natural and conversational.

Conversational search is search that tries to understand context, that makes educated guesses, that takes voice input, that parses homonyms and adapts to mobile environments, and that understands the same user across multiple devices.

These ideas have been around since at least the 1990s, part of research projects like AT&T Labs’s Watson and MIT’s Jupiter, a phone service that could understand a wide range of queries about the weather.

While on the surface, making search conversational sounds like an interface problem -- just figure out an easier way for people to access the same underlying information -- in reality, it cuts deep into artificial intelligence and the world outside of computers.

Plus, it’s a descriptor that’s much more accessible than other recent search coinages, like Google’s “Knowledge Graph.”

Getting back to Singhal’s Bieber example, Bing search director Stefan Weitz said it’s true that Microsoft doesn’t yet handle many multipart queries, with the exception of structured search for content titles on Xbox.

But Bing is also working on many fronts to make search more natural and conversational -- for instance, in its work to disambiguate queries, its semantic search effort Satori, and personalized apps like Local Scout.

“When you talk about conversational search, you’re really talking about machines being able to understand the last thing you said or the path you’re heading down,” Weitz said. “The real challenge is deconstructing the digital world using the Web as a very high-definition physical proxy.”

So, for instance, Bing can now answer queries like “the movie with Tom Cruise and a unicorn” or “the tallest mountain in the world,” even though they take a good bit of extrapolation, Weitz said. (The answers are “Legend” and Mount Everest.)

Unlearning Awkwardness

At Google, the first conversational search launch was probably Google Suggest, a “20 percent” project from way back in 2004 that auto-completes likely searches while a user is typing. It was the predecessor to Google Instant, which launched in 2010 and displays likely search results for those queries.
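Under the hood, suggestion of this kind starts with something as simple as prefix matching against a log of popular queries. Below is a rough sketch of that idea; the query list is made up, and real systems layer popularity ranking and personalization on top.

```python
# A minimal sketch of prefix-based autocomplete in the spirit of Google Suggest.
# The query log here is invented for illustration.

import bisect

class Autocomplete:
    """Suggests completions by prefix-matching against a sorted query log."""

    def __init__(self, popular_queries):
        self.queries = sorted(q.lower() for q in popular_queries)

    def suggest(self, prefix, limit=5):
        prefix = prefix.lower()
        start = bisect.bisect_left(self.queries, prefix)  # first query >= prefix
        results = []
        for query in self.queries[start:]:
            if not query.startswith(prefix):
                break  # sorted order means no later query can match either
            results.append(query)
            if len(results) == limit:
                break
        return results

ac = Autocomplete([
    "justin bieber height",
    "justin bieber age",
    "justin timberlake",
    "jupiter weather project",
])
print(ac.suggest("justin b"))  # -> ['justin bieber age', 'justin bieber height']
```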

John Boyd, a research manager at Google, runs a team that brings in users to observe their experiences with new products, and designs studies to evaluate what they might want in the future.

John Boyd at work in the Google user experience research lab

John Boyd at work in the Google user experience research lab

Speaking in one of those one-way-mirror rooms containing a computer fitted with an eye tracker, Boyd said that one of the stranger parts of his job is that volunteers who are brought in to campus for studies and interviews often fail to notice whatever it is that Google is testing. When his team ran user tests on Google Instant, some lab rats guessed that what was new was an older navigation bar on the left side -- not the results appearing smack-dab in front of their faces.

Without a before-and-after comparison, it’s hard to pick up on what’s new, even on a website you use every day.

But that’s not necessarily bad. One of the things about Google search that Boyd most wants to change is to subtly guide people away from learned behaviors and back to natural conversation.

“Google is magic,” said Boyd, who directed research at Yahoo for five years before joining Google five years ago. “But because we stay out of the way, it allows people to get into bad habits.”

What does he mean by that? Well, some people think search works better when you type in queries in all caps. ACTUALLY, it doesn’t.

This is an example of B.F. Skinner’s “superstitious learning,” Boyd said. For instance, some searchers tend to overuse double quotes to tell Google they really want a set of words in the exact order they entered them. What’s the harm in that? Quotes are often unnecessary, they’re extra work, and in some cases they exclude worthy results, Boyd said. You can tell that little bit of inefficiency actually gets under Boyd’s skin.
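A toy comparison makes Boyd's point about quotes concrete: a plain keyword match only needs every term to appear somewhere, while a quoted-phrase match demands the exact word sequence and can miss perfectly good results. The matching functions and sample document below are simplified stand-ins, not how Google actually ranks.

```python
# A rough sketch contrasting keyword matching with quoted-phrase matching.

def keyword_match(terms, document):
    """True if every term appears somewhere in the document, in any order."""
    words = document.lower().split()
    return all(term.lower() in words for term in terms)

def phrase_match(phrase, document):
    """True only if the exact quoted phrase appears verbatim."""
    return phrase.lower() in document.lower()

doc = "The height of Justin Bieber is five foot seven, the same as Tom Cruise."

print(keyword_match(["Bieber", "height"], doc))  # True: both words appear somewhere
print(phrase_match("Bieber height", doc))        # False: that exact phrase never occurs
```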

On a broader level, Google should have a better idea of what we want to know, Boyd said, so we don’t have to ask a question from scratch every time.

But What Do You Really Want to Know?

To that end, last year Google conducted a study of 150 participants by pinging them through a custom mobile app at multiple random points throughout the day to ask them what they wanted to know.

Boyd showed me the file of one female participant, whose information needs included “How do I get $200 in eight days?” “How long does state have to indict somebody before they get charged?” “What is string theory?” “What does a cuttlefish look like?” “How do I make my daughter leave me alone?” “How to make my tooth stop hurting?” “How do I find a bail bondsman?”

Meanwhile, another male participant said he had car problems, a dog that may have had fleas, a need for cash, a smoke detector that needed to be reset, and interest in buying a car for his granddaughter.

Monterey Bay Aquarium: This is what a cuttlefish looks like.

Besides being fascinating slices of life, the list of questions large and small is helping Google think about how it might be more helpful to people.

For instance, the woman asked a whole bunch of questions around the theme of jail; Google could do a smoother job of helping her with that topic. And the man’s list of to-dos is pretty hectic; perhaps Google could help organize and execute them.

Two of Google’s main thrusts to help people in the moment are voice search -- especially useful when you’re driving, or your hands are otherwise busy -- and the Google Now smart personal assistant for Android.

Voice is a natural format for conversation, Boyd noted. “The conversational component opens up a dimension of ‘did you mean?’ or ‘have you thought about it this way?’” That tends to be a lot more awkward and slow in text, he said.

Meanwhile, the Google Now Android app rides along with users and takes note of their habits, showing things like sports scores and weather and traffic, based on their personal search history. Recent additions to the app trigger previously purchased movie tickets and boarding passes to pop up when a user enters the theater or the airport at the designated time.
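Conceptually, those ticket and boarding-pass cards are context triggers: a stored item surfaces when the user's location and the clock line up with it. The sketch below is a made-up approximation of that logic, not Google Now's implementation.

```python
# A toy sketch of a context trigger like the boarding-pass card described above.
# The card structure, places, and time window are invented assumptions.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Card:
    title: str
    place: str            # where the card is relevant, e.g. an airport
    event_time: datetime  # when the card is relevant

def cards_to_show(cards, current_place, now, window=timedelta(hours=3)):
    """Surface only the cards whose place and time line up with the user's context."""
    return [
        card for card in cards
        if card.place == current_place and abs(card.event_time - now) <= window
    ]

boarding_pass = Card("Boarding pass: flight to JFK", "SFO", datetime(2013, 3, 14, 9, 30))
movie_ticket = Card("Movie ticket: 7 pm showing", "Downtown Theater", datetime(2013, 3, 14, 19, 0))

# At the airport two hours before departure, only the boarding pass appears.
print(cards_to_show([boarding_pass, movie_ticket], "SFO", datetime(2013, 3, 14, 7, 30)))
```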

“There are so many different situations when our users need help,” Google Now product manager Baris Gultekin recently told me. “My goal is to anticipate all your needs, and anticipate the right thing when you need it. It’s a huge undertaking. We are basically trying to focus on trying to get you information you need, when you need it, before you ask.”

For those who are okay with the privacy implications, these are significant and exciting leaps forward. But we are a long way from engaging in any old casual conversation with our computers. Google Now understands only a handful of things in the world, and each new one is being added manually through partnerships with Fandango and United Airlines and the like.

And while voice recognition is much better than it used to be, I often find myself falling into a new sort of voice “Searchese,” where I carefully enunciate words in a monotone and speak out punctuation. It’s far from the most natural thing in the world.
