THANKS TO APPLE'S voice assistant, Siri, natural language understanding has become the buzzword du jour not only in the enterprise, but in the consumer market as well.
Interest in natural language understanding (NLU) exploded even before Siri arrived, when IBM's Watson supercomputer appeared on Jeopardy! last year, competing and ultimately holding its own against human contestants until almost the very end. According to IBM, Watson applied advanced natural language processing, information retrieval, knowledge representation and reasoning, and machine learning technologies to the field of open-domain question answering.
"The trends seem to be quite clear," says Ilya Bukshteyn, senior director of marketing, sales and solutions, at Microsoft Tellme. "When you look at the kinds of technologies that consumers are snapping up and buying in record numbers, whether it's Kinect or Apple products, it's very clear that natural interaction (language) done right is very, very compelling. You don't want to be the last company not offering a natural experience in your category."
Dan Miller, senior analyst and founder, Opus Research, says that NLU should allow people to speak naturally and have a reasonable expectation that a machine on the other end is going to understand their intent.
"Accurate recognition is key to cloud-based resources that understand intent," he says. "What's happened in the last year, as [demonstrated by] Watson, [is that] computer systems that can claim to understand what people are saying and accurately render words have become more reliable. It's getting better and better. It's never going to be perfect, but now in enough cases, such as with Siri, there's a feeling that you can put language on a front end of highly popular devices, and that's never happened before."
While the gee-whiz factor is hard to overlook in the consumer market, the view of how well NLU works in the business market is divided. Many experts say that the technology is too expensive and has a long way to go, while others point to a spate of products mature enough to operate as moneymakers. Even with the deployment of commercial offerings, NLU continues to evolve.
IBM partnered with Nuance Communications to combine IBM's Deep Question Answering, Natural Language Processing, and Machine Learning capabilities with Nuance's speech recognition and Clinical Language Understanding solutions. Under the agreement, the companies are jointly investing in a five-year research project designed to advance next-generation natural speech technologies, which will be commercialized by Nuance. (Nuance has also been rumored to be Apple's partner in developing Siri, although neither company will confirm this.)
WHAT IS NLU?
NLU is the ability of users to interact with any system or device in a conversational manner without being constrained in their responses.
"What NLU does is understand a string of words or utterances," explains Daniel Hong, lead analyst at Ovum. "NLU takes into consideration statistical language and semantic language and combines the two. The engine that powers NLU has to be able to understand a sequence of words and process it to determine what the intent is behind the caller."
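The pipeline Hong describes, mapping a sequence of words to the intent behind them, can be sketched with a toy statistical classifier. Everything here (the intents, the training phrases, the test utterance) is invented for illustration; a production NLU engine would combine far richer statistical and semantic models:

```python
from collections import Counter, defaultdict
import math

# Toy training data: utterances labeled with a hypothetical caller intent.
TRAINING = [
    ("i want to pay my bill", "pay_bill"),
    ("pay the balance on my account", "pay_bill"),
    ("my internet connection is down", "tech_support"),
    ("the service stopped working", "tech_support"),
    ("cancel my subscription", "cancel_service"),
    ("i would like to cancel my account", "cancel_service"),
]

def train(examples):
    """Count word frequencies per intent (a unigram language model)."""
    word_counts = defaultdict(Counter)
    intent_counts = Counter()
    for utterance, intent in examples:
        intent_counts[intent] += 1
        word_counts[intent].update(utterance.split())
    return word_counts, intent_counts

def classify(utterance, word_counts, intent_counts):
    """Pick the intent whose smoothed log-probability is highest."""
    vocab = {w for counts in word_counts.values() for w in counts}
    best_intent, best_score = None, float("-inf")
    for intent, prior in intent_counts.items():
        total = sum(word_counts[intent].values())
        score = math.log(prior / sum(intent_counts.values()))
        for word in utterance.split():
            # Laplace smoothing so unseen words don't zero out an intent.
            score += math.log((word_counts[intent][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

word_counts, intent_counts = train(TRAINING)
print(classify("i need to pay my bill", word_counts, intent_counts))  # pay_bill
```

Even a test phrase never seen verbatim in training ("i need to pay my bill") lands on the right intent, because the classifier scores the word sequence as a whole rather than matching it against a fixed grammar, which is the essential difference from a menu-driven system.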
What NLU is not is speech recognition.
"Sometimes people don't realize that there's a big difference between speech recognition and NLU, and they confuse them," says Roberto Pieraccini, former CTO of SpeechCycle and now the director of the International Computer Science Institute.
As an example, he explains the difference between Siri, which uses natural language, and Google Voice Search, which uses speech recognition.
"In the case of words that are spoken into a text box (like Google Voice Search), it does not mean that the machine understood what you said, but can …