    Natural language understanding tough for neural networks

March 21, 2025 (Updated: July 31, 2025)


How does natural language understanding (NLU) work?

The best modality depends on your callers, your processes, and the balance you strike between caller satisfaction and cost. SpeakFreely is used in the Nuance Call Steering Portal (NCSP), a web-based portal for creating, deploying, and optimizing NLU call steering solutions. The company describes NCSP as enabling someone without a Ph.D. in speech science to bring NLU to the masses faster, without breaking the bank.

    • Some industry experts also believe that companies don’t need a full-blown NLU engine, but can incorporate some of the technology into solutions they already have in place.
    • “We wanted to reverse that and let the technology understand what the customer is saying,” he says.
    • “What NLU does is understand a string of words or utterances,” explains Daniel Hong, lead analyst at Ovum.
    • Naturally, the larger the dataset and more diverse the examples, the better those numerical parameters will be able to capture the variety of ways words can appear next to each other.
    • In comments to TechTalks, McShane, a cognitive scientist and computational linguist, said that machine learning must overcome several barriers, first among them being the absence of meaning.

“In the past, when speech was first applied, people envisioned the end of touch tone, but today touch tone is alive and well, and speech is simply one of several potential solutions,” Middleton says. “Natural language versus directed dialogue is no different. Natural language is simply another potential solution. Silver bullets are few and far between.”

“They don’t know what instructions are. We teach them that. When they do something right, we pat them on the back; when they do something wrong, we correct them. It’s the same process here. We take data and train these systems and monitor them and correct them. We call it reinforcement learning.” “Accurate recognition is key to cloud-based resources that understand intent,” he says.

“The trends seem to be quite clear,” says Ilya Bukshteyn, senior director of marketing, sales, and solutions at Microsoft Tellme. “When you look at the kinds of technologies that consumers are snapping up and buying in record numbers, whether it’s Kinect or Apple products, it’s very clear that natural interaction (language) done right is very, very compelling. You don’t want to be the last company not offering a natural experience in your category.”

“We are poised to undertake a large-scale program of work in general and application-oriented acquisition that would make a variety of applications involving language communication much more human-like,” she said. “Of course, people can build systems that look like they are behaving intelligently when they really have no idea what’s going on (e.g., GPT-3),” McShane said.

“What NLU does is understand a string of words or utterances,” explains Daniel Hong, lead analyst at Ovum. “NLU takes into consideration statistical language and semantic language and combines the two. The engine that powers NLU has to be able to understand a sequence of words and process it to determine what the intent is behind the caller.”

Natural language understanding (NLU) is a branch of AI that focuses on enabling computers to understand human language the way humans do. It allows machines to understand our intent, emotions, and meaning beyond words, leading to more natural, efficient, and engaging interactions. It is a complex research area that combines techniques from computer science, linguistics, and psychology. In the real world, humans tap into their rich sensory experience to fill the gaps in language utterances (for example, when someone tells you, “Look over there!” they assume you can see where their finger is pointing).
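As a rough illustration of the intent determination Hong describes, here is a toy keyword-evidence scorer. It is a deliberately minimal sketch, not Ovum's or any vendor's actual engine; the intent names and keyword sets are invented for the example.

```python
# Toy intent detection: score a caller utterance against each intent by
# counting how many of that intent's keywords appear in the utterance.
INTENT_KEYWORDS = {
    "billing": {"bill", "charge", "invoice", "payment"},
    "tech_support": {"broken", "error", "reset", "install"},
    "cancel": {"cancel", "terminate", "close"},
}

def detect_intent(utterance: str) -> str:
    words = set(utterance.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # If no keyword matched at all, the intent is unknown.
    return best if scores[best] > 0 else "unknown"

print(detect_intent("my last bill has a wrong charge"))  # billing
```

A production engine would replace the keyword sets with a statistical model trained on labeled utterances, which is what lets it combine statistical and semantic evidence as the quote describes.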



    LEIAs process natural language through six stages, going from determining the role of words in sentences to semantic analysis and finally situational reasoning. These stages make it possible for the LEIA to resolve conflicts between different meanings of words and phrases and to integrate the sentence into the broader context of the environment the agent is working in. For the most part, machine learning systems sidestep the problem of dealing with the meaning of words by narrowing down the task or enlarging the training dataset. But even if a large neural network manages to maintain coherence in a fairly long stretch of text, under the hood, it still doesn’t understand the meaning of the words it produces.
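The staged processing described above can be sketched as a pipeline. This is a hypothetical toy in the spirit of the six LEIA stages, not the authors' implementation; the stage functions, lexicon, and goal names are invented for illustration.

```python
def tokenize(text):             # stage 1: break the input into words
    return text.lower().rstrip(".?!").split()

def tag_roles(tokens):          # stage 2: assign crude syntactic roles
    return [(t, "VERB" if t in {"book", "cancel"} else "OTHER") for t in tokens]

def semantic_analysis(tagged):  # stage 3: map words to candidate meanings
    lexicon = {"book": ["reserve", "printed-volume"], "flight": ["air-travel"]}
    return {word: lexicon.get(word, []) for word, _ in tagged}

def disambiguate(senses, tagged):  # stages 4-5: resolve conflicting senses
    roles = dict(tagged)
    resolved = {}
    for word, candidates in senses.items():
        if word == "book" and roles.get(word) == "VERB":
            resolved[word] = "reserve"  # a verb "book" is the reservation sense
        elif candidates:
            resolved[word] = candidates[0]
    return resolved

def situational_reasoning(resolved):  # stage 6: integrate with the agent's context
    if resolved.get("book") == "reserve" and "flight" in resolved:
        return "agent-goal: make-flight-reservation"
    return "agent-goal: unknown"

tagged = tag_roles(tokenize("Book a flight to Lahore."))
senses = semantic_analysis(tagged)
print(situational_reasoning(disambiguate(senses, tagged)))
```

The point of the staged design is visible even in the toy: the ambiguous word "book" is only resolved once its syntactic role is known, and the sentence only becomes actionable once situational reasoning ties it to a goal.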

    Getting closer to meaning

“Routing the request to a specialized agent is an important action of the system, as it helps that agent address that issue, rather than sending it to just any agent.” NLU is the ability of users to interact with any system or device conversationally, without being constrained to fixed responses. According to IDC, bots like Microsoft Cortana and Apple Siri will create revenue of around $1.4 billion in 2016 alone. The developers are using a Role and Reference Grammar (RRG) model that parses everyday language to discover its true meaning. For example, you could tell Siri, “Call Bob, no, I mean Tom,” and Siri would understand that you have changed your mind.
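A minimal sketch of how such a self-correction could be handled (this is illustrative only, not Siri's actual logic): split the utterance on a correction marker and let the final segment override the earlier slot value. The marker list and the "last word is the name" heuristic are assumptions made for the example.

```python
import re

def resolve_contact(utterance: str) -> str:
    # Split on the first correction marker ("no, I mean" / "actually");
    # whatever the caller said last wins.
    parts = re.split(r"\bno,?\s+i mean\b|\bactually\b", utterance.lower())
    last_segment = parts[-1].strip(" ,.")
    # Crude slot extraction: take the final word as the contact name.
    return last_segment.split()[-1]

print(resolve_contact("Call Bob, no, I mean Tom"))  # tom
```

When no correction marker is present, the split returns the whole utterance and the original name survives, so the same code path handles both cases.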

    Does natural language understanding need a human brain replica?

Natural language understanding is the branch of AI that enables computers to grasp human language, blending computer science, linguistics, and psychology. In Linguistics for the Age of AI, McShane and Nirenburg argue that replicating the brain would not serve the explainability goal of AI. “Agents operating in human-agent teams need to understand inputs to the degree required to determine which goals, plans, and actions they should pursue as a result of NLU,” they write.

After implementing AT&T’s NLU solutions, Panasonic was able by 2010 to resolve a million more customer problems a year, with 1.6 million fewer calls than in 2005. The core technology for understanding natural responses to open questions (such as “How may I help you today?”) is called SpeakFreely. It works by taking a collection of responses to the open question, analyzing each to attribute a meaning, and then defining an appropriate application response. By using SpeakFreely for NLU, an IVR can respond to unique requests it has not previously encountered. Once the intent and information are extracted, based on AT&T’s dialogue technology and how the system is designed, a company can send callers to a specialized agent or complete the automation.
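The steer-or-automate decision described above can be sketched in a few lines. This is a toy in the spirit of open-question call steering, not Nuance's SpeakFreely or AT&T's dialogue technology; the route table and keywords are invented for the example.

```python
# Map recognized request types to an action: automate in the IVR, or
# route to a specialized agent queue.
ROUTES = {
    "balance": ("automate", "balance-lookup"),
    "outage": ("agent", "technical-support"),
    "refund": ("agent", "billing"),
}

def steer(utterance: str):
    for keyword, (action, destination) in ROUTES.items():
        if keyword in utterance.lower():
            return action, destination
    # Unrecognized requests fall back to a live general agent.
    return "agent", "general"

print(steer("There is an outage in my area"))  # ('agent', 'technical-support')
```

The fallback branch reflects the design point quoted later in the article: some requests are deliberately not automated, either by design or because their complexity makes a specialized agent the better destination.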

    Mobile and cloud are expected to continue to drive interest and lower cost, hopefully allowing more companies to board the NLU train. “The more categories you have, the more different kind of users you have, the harder it is to categorize what they’re saying,” says Deborah Dahl, principal of Conversational Technologies, and chair of the World Wide Web Consortium Multimodal Interaction Working Group. “If you have something like an airline, where most of the callers are used to the system and have a clear idea of what they want, the system’s going to work better because what the caller will say is more precise.” “Some systems may not fully automate because it could be part of the design, or it could be that the complexity of the request is hard and it’s best to send them to a specialized agent,” Gilbert says.


Marjorie McShane and Sergei Nirenburg, the authors of Linguistics for the Age of AI, argue that AI systems must go beyond manipulating words. In their book, they make the case that NLU systems should understand the world, explain their knowledge to humans, and learn as they explore it. “As opposed to following numerous menus in the IVR and getting lost, natural language–enabled customers can articulate their problem and allow the system to understand it much more quickly,” he says. The company is automating the machine learning that goes into text and voice chats with bots. Pat uses techniques like Word Sense Disambiguation, Context Tracking, Machine Translation, and Word Boundary Identification to accomplish this.
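Word sense disambiguation, the first technique named above, can be illustrated with a simplified Lesk-style approach: pick the sense whose gloss shares the most words with the surrounding context. The senses and glosses here are toy data invented for the example, not Pat's proprietary method.

```python
# Toy sense inventory: each sense of "bank" has a short gloss.
SENSES = {
    "bank": {
        "financial-institution": "money deposit loan account institution",
        "river-edge": "river water land slope edge",
    }
}

def lesk(word: str, context: str) -> str:
    """Return the sense whose gloss overlaps most with the context words."""
    ctx = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(ctx & set(gloss.split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(lesk("bank", "I need to deposit money at the bank"))
# financial-institution
```

Real systems use much richer sense inventories (e.g., dictionary glosses) and weighting, but the core idea of scoring senses by contextual overlap is the same.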



Tech available today does not parse these exchanges correctly and would get confused. An autoregressive model predicts future sequence values based on its past values using statistical… Microsoft and other larger entities like Nuance and AT&T are working to level the playing field and to offer NLU at a lower cost and broader scale. Another factor preventing widespread adoption is that NLU tends to work better in verticals where open-ended questions have constraints, such as utilities and travel.
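To make the autoregressive aside concrete: an AR(1) model predicts each value as a fixed linear function of the previous one, x_t = c + phi * x_{t-1}. The coefficients below are chosen by hand for illustration; in practice they would be fitted to data.

```python
def ar1_forecast(last_value: float, c: float = 1.0, phi: float = 0.5,
                 steps: int = 3) -> list[float]:
    """Roll the AR(1) recurrence x_t = c + phi * x_{t-1} forward `steps` times."""
    values = []
    x = last_value
    for _ in range(steps):
        x = c + phi * x
        values.append(x)
    return values

print(ar1_forecast(10.0))  # [6.0, 4.0, 3.0]
```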

There is a significant barrier to widespread adoption of NLU, and that barrier is cost: purchasing, implementing, and maintaining licensed technology is prohibitively expensive for many companies. Natural language understanding is meant to attack a basic problem of call centers, extended call times, by automatically handling calls and reaping significant savings. Dena Skrbina, senior director of solutions marketing for the Enterprise division at Nuance, says that NLU is more than just collecting information; it’s about determining intent. As an example, Skrbina explains the difference between Siri, which uses natural language, and Google Voice Search, which uses speech recognition. While the gee-whiz factor is hard to overlook in the consumer market, views of how well NLU works in the business market are divided.

