Experts in AI: Teaching reading comprehension to machines

Paul Barba from Lexalytics talks about his career in text analytics, trends in NLP, and times when everything is on fire

Today, most mainstream applications of AI deal with language. Chatbots, virtual digital assistants and automated transcription services are all powered by a collection of Natural Language Processing (NLP) technologies that enable users to mine text for meaning.

NLP includes sub-fields such as Natural Language Understanding (NLU), which powers functions like translation and sentiment analysis by essentially turning text into structured data, and Natural Language Generation (NLG), which turns data back into human-readable text.
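To make that idea concrete, here is a minimal, illustrative sketch of sentiment analysis using the open-source NLTK library's VADER model (not Lexalytics' own technology): a free-text sentence goes in, and a small dictionary of scores, structured data that downstream systems can store or aggregate, comes out.

    # Illustrative sketch: turning unstructured text into structured data
    # with NLTK's VADER sentiment model (not Lexalytics' technology).
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

    analyzer = SentimentIntensityAnalyzer()
    review = "The battery life is fantastic, but the screen scratches easily."

    # The raw sentence becomes a dictionary of sentiment scores
    # with 'neg', 'neu', 'pos' and 'compound' keys.
    scores = analyzer.polarity_scores(review)
    print(scores)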

To find out more about working with NLP, AI Business sat down with Paul Barba, chief scientist at Lexalytics, a company that specializes in text analysis.

Barba is a machine learning expert who has spent 12 years at the company, progressing from development intern to overseeing its research and development efforts. He was shortlisted for the AI Innovator of the Year award at the AI Summit New York in December.

“The AI algorithms are really great at making use of anything that’s wrong with your data,” Barba told AI Business. “Any little pattern that you didn’t intend, [data] you sampled incorrectly at some point, and you’ll get really promising results that will then burst into flames in production.”

“With technology, we always traditionally kind of put it in a box and interacted with it on its own terms. With AI, you’re pulling in data sources, maybe from a CRM system that a customer has, and then some other part of the enterprise decides to change CRM vendors, and suddenly data flowing into your algorithms is gone, and everything is on fire.

“There’s a lot of little things that can go wrong, and it’s hard to predict what they will be.”