Modernizing Natural Language Processing with Deep Neural Networks

Jelani Harper

February 22, 2019

Natural language processing is moving fast. Cutting-edge developments in deep learning, neural networks, and aspects of classic machine learning are drastically improving the efficacy of this vital component of artificial intelligence and expanding its overall utility to the enterprise.

Nevertheless, the horizontal applicability of NLP remains the same. Organizations rely on this technology for aspects of speech recognition, internal and external interfaces for intelligent systems, and text analytics.

What’s evolving, however, are the use cases for NLP within these applications. When paired with deep neural networks, NLP use cases become much more specific and tailored for question answering, customer support, enterprise search, document classification, regulatory compliance, and more.

Augmented with deep neural networks, NLP becomes considerably more practical for workplace automation. This is what Razorthink Senior Product Manager of Enterprise AI Solutions Goutham Krishnamurthy characterizes as a “new era” for the technology, and possibly for the enterprise use of neural networks.

Dynamic co-attention networks and question answering

The ability to parse through documents and quickly find answers is a pivotal subset of NLP’s text analytics capabilities. Although other types of neural networks are used with varying degrees of success to this end, dynamic co-attention networks were created with this particular use case in mind. Architecturally, these networks contain an encoder and a decoder. According to Razorthink AI Engineer Rudresh V., the encoder “takes in the question and the document and analyzes the question. For each word [in the question], it will try to get the information from the document.”
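
To make that two-way attention concrete, here is a minimal sketch of a co-attention step: an affinity matrix between every question word and every document word, normalized in both directions so each side can summarize the other. The NumPy implementation, shapes, and random encodings below are illustrative assumptions, not Razorthink’s architecture.

```python
# Minimal co-attention sketch (illustrative, not Razorthink's implementation).
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def coattention(D, Q):
    """D: (m, h) document word encodings; Q: (n, h) question word encodings."""
    L = D @ Q.T                      # (m, n) affinity between every document/question word pair
    A_q = softmax(L, axis=0)         # attention over document words, one column per question word
    A_d = softmax(L, axis=1)         # attention over question words, one row per document word
    C_q = D.T @ A_q                  # (h, n) document summary attended for each question word
    C_d = np.concatenate([Q.T, C_q], axis=0) @ A_d.T   # (2h, m) question + co-attention context per document word
    return C_d.T                     # (m, 2h) co-attention encoding of the document

# Toy usage with random vectors standing in for real word encodings.
doc, question = np.random.randn(30, 64), np.random.randn(8, 64)
print(coattention(doc, question).shape)   # (30, 128)
```

In a full dynamic co-attention network these fused encodings would typically pass through a recurrent layer before reaching the decoder.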

The decoder uses a matrix to find the answers to the questions, deploying an iterative process in which the confidence of competing answers is compared to determine the best one. For customer service use cases in which representatives need to quickly analyze multiple documents to retrieve information, variants of dynamic co-attention networks are extremely effective and can “go through 10 documents to answer one question,” according to V. Moreover, this approach scales to widespread customer service deployments while keeping latency low. It’s also useful for enterprise search because of how swiftly these neural networks use attention mechanisms to answer questions.
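
That iterative comparison of candidate answers can be sketched in a similarly simplified way: repeatedly re-score start and end positions for an answer span, let each estimate condition the other, and stop once the span stops changing. The scoring rule below is a deliberately crude stand-in for the decoder network, and all names are hypothetical.

```python
# Simplified sketch of iterative answer-span selection (hypothetical scoring rule).
import numpy as np

def pick_answer_span(U, w_start, w_end, iterations=4):
    """U: (m, k) document word encodings; w_start, w_end: (k,) scoring weights."""
    m = U.shape[0]
    start, end = 0, m - 1                          # initial guess: the whole document
    for _ in range(iterations):
        # Each pass scores start positions given the current end estimate (and vice
        # versa), so the decoder can revise an earlier, lower-confidence answer.
        start_scores = U @ w_start + U @ U[end]
        end_scores = U @ w_end + U @ U[start]
        new_start = int(np.argmax(start_scores))
        new_end = new_start + int(np.argmax(end_scores[new_start:]))   # end never precedes start
        if (new_start, new_end) == (start, end):   # estimates converged: stop iterating
            break
        start, end = new_start, new_end
    confidence = float(start_scores[start] + end_scores[end])
    return start, end, confidence

# Toy usage on random encodings; in practice U would come from the co-attention encoder.
rng = np.random.default_rng(0)
U = rng.normal(size=(30, 128))
start, end, conf = pick_answer_span(U, rng.normal(size=128), rng.normal(size=128))
print(f"answer span: words {start}..{end} (confidence {conf:.2f})")
```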

Long Short-Term Memory, optical character recognition, and speech

Speech recognition systems and text analytics powered by NLP are considerably enhanced by Long Short-Term Memory (LSTM), a recurrent deep neural network architecture designed for time-series analysis. When applied to NLP, it helps the model remember the various words, and their significance, across different parts of longer sentences or paragraphs.

The memory underpinned by this time-series analysis is what contextualizes individual words for speech recognition and text analytics. LSTM is therefore crucial for implementing episodic memory in conversational interfaces for employees or customers, enabling NLP to identify the antecedents and referents of common words such as ‘that’.

LSTM brings these same benefits to text analytics and improves Optical Character Recognition (OCR) by providing context for scanned-in characters “to capture words more accurately,” mentioned Razorthink Deep Learning Engineer Shreesha N. LSTM is particularly useful with NLP for translating languages, although it’s also gaining momentum as a means of “building a speech recognition model for the user to interact with the database,” according to N. In this case, the speech of employees (or, conceivably, customers) is first converted into natural language and then into a database query that returns rapid responses.
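
For readers who want to see where that memory lives, the following is a minimal sketch of a single LSTM step, showing the gated cell state that carries earlier words’ context forward through a sentence. The weight shapes and toy sequence are illustrative assumptions, not a production speech or OCR model.

```python
# Minimal single-step LSTM in NumPy (illustrative shapes, not a production model).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """x: (d,) current word vector; h_prev, c_prev: (k,) previous hidden and cell state.
    W: (4k, d + k) stacked gate weights; b: (4k,) stacked gate biases."""
    k = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[:k])               # input gate: how much new information to write
    f = sigmoid(z[k:2 * k])          # forget gate: how much old context to keep
    o = sigmoid(z[2 * k:3 * k])      # output gate: how much of the memory to expose
    g = np.tanh(z[3 * k:])           # candidate content for the cell state
    c = f * c_prev + i * g           # the "memory" that carries context across words
    h = o * np.tanh(c)               # hidden state passed to the next word / next layer
    return h, c

# Toy usage: run seven random "word vectors" through the cell; the state accumulates context.
d, k, rng = 16, 32, np.random.default_rng(0)
W, b = 0.1 * rng.normal(size=(4 * k, d + k)), np.zeros(4 * k)
h, c = np.zeros(k), np.zeros(k)
for word_vector in rng.normal(size=(7, d)):
    h, c = lstm_step(word_vector, h, c, W, b)
print(h.shape, c.shape)   # (32,) (32,)
```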

Memory augmented networks and text analytics

Memory augmented networks are widely perceived as an improvement on LSTM, enriching NLP’s comprehension for text analytics, speech recognition, and other applications. These neural networks are renowned as much for their memory capabilities as for their learning capacity.

As opposed to the linear memory of LSTM, the memory of memory augmented networks is based on a permanent memory bank that can be read, written to, and overwritten. Data are written into these networks’ memory while the neural networks are training on the same data. Furthermore, these deep neural networks simultaneously capture context-based relationships, so they effectively “store the context, then learn how different context is related, and all the stuff you do for reading comprehension,” commented Razorthink Deep Learning Engineer Sagnik Bhattacharya.
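
A minimal sketch of that idea, assuming content-based addressing over a fixed bank of memory slots, might look like the following; the slot count, write rule, and class names are illustrative assumptions rather than Razorthink’s implementation.

```python
# Sketch of an external memory bank with content-based read/write (illustrative only).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryBank:
    def __init__(self, slots=128, width=64, seed=0):
        rng = np.random.default_rng(seed)
        self.M = 0.1 * rng.normal(size=(slots, width))    # the persistent memory matrix

    def read(self, key):
        """Blend slots by similarity to the query key (content-based addressing)."""
        weights = softmax(self.M @ key)                   # one attention weight per slot
        return weights @ self.M                           # (width,) weighted read vector

    def write(self, key, value, erase=0.5):
        """Write to, and partially overwrite, the slots most similar to the key."""
        weights = softmax(self.M @ key)
        self.M = self.M * (1.0 - erase * weights[:, None]) + np.outer(weights, value)

# Toy usage: store a "fact" vector, then recall it later with a slightly noisy query.
rng = np.random.default_rng(1)
mem, fact = MemoryBank(), rng.normal(size=64)
mem.write(key=fact, value=fact)
recalled = mem.read(key=fact + 0.05 * rng.normal(size=64))
print(np.corrcoef(fact, recalled)[0, 1])                  # should be close to 1.0
```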

Memory augmented networks’ capacity to look up different facets of data and language stored in their memory enables them to better understand the text or speech they’re analyzing, which is atypical of machine learning in general. Thus, these neural networks considerably improve the overall comprehension of NLP for strikingly accurate text analytics and speech recognition systems. Financial analysts, for example, can leverage NLP with memory augmented networks to understand both long- and short-term trends affecting the stock market. “In stock market trading, a lot of it depends on the old history from 10 years ago, and a lot of it depends on the current context of this minute,” Bhattacharya explained. Memory augmented networks have the capacity to remember both historic and contemporary developments to identify trends useful for trades and analysis.

Intelligent document classification

The memory and attention capabilities that the aforementioned neural networks (and others) bring to NLP also deliver tangible business value for document classification. Accurately classifying documents is useful for different aspects of regulatory compliance, as is the ability to rapidly parse through them to identify various regulatory concerns such as personally identifiable information. By equipping NLP solutions with some of these more progressive deep neural network techniques, organizations are able to fundamentally extend the utility of NLP as a whole.
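
As a final illustration, here is a minimal sketch of the last step of such a pipeline: a softmax classification head that maps a document encoding (for instance, the final LSTM hidden state from the earlier sketch) to compliance-relevant labels. The label set and random weights are illustrative assumptions, not a trained model.

```python
# Sketch of a classification head over a document encoding (hypothetical labels/weights).
import numpy as np

LABELS = ["contains_pii", "contract", "correspondence", "other"]

def classify(doc_encoding, W, b):
    """doc_encoding: (k,) document vector; W: (len(LABELS), k) weights; b: (len(LABELS),) biases."""
    logits = W @ doc_encoding + b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                                  # softmax over the label set
    return dict(zip(LABELS, probs))

# Toy usage with random weights standing in for a trained classification head.
k, rng = 32, np.random.default_rng(2)
scores = classify(rng.normal(size=k), rng.normal(size=(len(LABELS), k)), np.zeros(len(LABELS)))
print(max(scores, key=scores.get), scores)
```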

Jelani Harper is an editorial consultant servicing the information technology market, specializing in data-driven applications focused on semantic technologies, data governance and analytics.
