We recently caught up with SAS’ Chief Data Scientist, Wayne Thompson, who gave us an insight into SAS’ view of AI’s impact, where its challenges lie, and where he sees this technology going.
SAS is the leader in analytics. Through innovative analytics, business intelligence and data management software and services, SAS helps customers at more than 83,000 sites make better decisions faster. Since 1976, SAS has been giving customers around the world the power to know.
Wayne Thompson is the Chief Data Scientist at SAS. He is one of the early pioneers of business predictive analytics, and he is a globally recognised presenter, teacher, practitioner, and innovator in the field of predictive analytics technology. He has worked alongside the world’s biggest and most challenging companies to help them harness analytics to build high-performing organisations.
Over the course of his 20-year career at SAS, he has been credited with bringing to market landmark SAS analytic technologies (SAS® Text Miner, SAS® Credit Scoring for SAS® Enterprise Miner™, SAS® Model Manager, SAS® Rapid Predictive Modeler, SAS® Scoring Accelerator for Teradata, SAS® Analytics Accelerator for Teradata, and SAS® Visual Statistics). His current focus includes easy-to-use, self-service data mining tools for business analysts, deep learning and cognitive computing. Wayne received his PhD and MS degrees from the University of Tennessee. During his PhD program, he was also a visiting scientist at the Institut Superieur d’Agriculture de Lille, Lille, France.
Starting off the interview we were eager to learn more about how SAS defines artificial intelligence, and their current involvement in the AI space.
“AI is a general term that implies the use of non-linear algorithms to model and/or replicate intelligent behavior”, Thompson begins. “SAS has a long history of providing algorithms, such as neural networks and support vector machines, that learn and/or perform intelligent behavior. Natural language processing is also a big component of AI that enables the machine to understand human input and also glean insights out of textual data”, Thompson explains.
Thompson explains how SAS’ text analytics tools for sentiment analysis, document categorisation, document summarisation, and natural language interaction for voice commands help the company round out its AI delivery. Its current focus is on deep learning algorithms for applications such as speech recognition and image detection.
In an interview with Huffington Post last year, Oliver Schabenberger, SAS Executive Vice President and Chief Technology Officer mentioned that data without analytics is value, not yet realised – can you elaborate on what this statement means?
“I believe Aristotle was the first data scientist because he practiced empiricism, meaning you learn through observing”, Thompson says. “Most data collected today is not designed for experimental analyses but is rather highly observational data. Learning by observation is exactly what we try to do with machine learning”.
Thompson explains how SAS collects (observes) representative data on some phenomenon, such as why customers are churning; the machine then learns iteratively, in successive passes through the data, to extract knowledge and insights.
“The knowledge learned through this historical data can then be applied to new data to make decisions – this is the process of model scoring or deployment”.
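The learn-then-score loop Thompson describes can be sketched in a few lines. This is a deliberately tiny, hand-rolled illustration (not SAS code, and not a real churn model): “training” estimates a single decision threshold from historical observations, and “scoring” (deployment) applies that threshold to new customer records.

```python
# Historical (observational) data: (months_inactive, churned?)
history = [(1, False), (2, False), (6, True), (8, True), (3, False), (7, True)]

def train(data):
    """Learn a decision threshold: the midpoint between the average
    inactivity of churners and of non-churners."""
    churn = [x for x, y in data if y]
    stay = [x for x, y in data if not y]
    return (sum(churn) / len(churn) + sum(stay) / len(stay)) / 2

def score(threshold, months_inactive):
    """Deployment step: apply the knowledge learned from historical
    data to a new, unseen record."""
    return months_inactive >= threshold

threshold = train(history)      # learn from the past (here: 4.5)
print(score(threshold, 9))      # new customer, long inactivity -> True
print(score(threshold, 1))      # new customer, recently active -> False
```

The point is the separation of the two phases: the model is learned once from historical data, then applied repeatedly to fresh data as decisions are needed.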
“Data is the fuel that feeds the analytical fire. Without analytics you never really get to light the fire; the data just sits around decaying and being of little use”.
How did the expansion for SAS with respect to deep learning and cognitive analytics come about?
“AI and cognitive computing are back in vogue through the need to derive insights automatically with continuous learning”, Thompson says. He is keen to stress that AI and cognitive computing are not just for exotic science projects: “These fields are truly part of the analytics continuum as a logical next extension”.
“As a leader in predictive and prescriptive analytics these fields are just that – logical new extensions to our portfolio which help us orchestrate what will happen and make automatic adjustments along the journey”, Thompson says.
SAS has recently released its newest platform, SAS Viya, so we asked Thompson to tell us more about how it works.
“SAS Viya is an open, cloud-ready, in-memory platform that delivers everything needed for fast, accurate analytical results all of the time”, Thompson says. “With its fluid, scalable and fault-tolerant processing environment, Viya addresses complex analytical challenges with the ability to effortlessly scale into the future”.
“SAS Viya provides: a modern, cloud-ready analytics platform; a single, open, governed analytics environment with a standardized code base that can incorporate both SAS and other programming languages; and a uniquely comprehensive and scalable platform for both public and private cloud implementations”.
Does this platform (Viya) differ from your other in-memory, scalable, high-performance architecture? If so, in what ways?
“SAS Viya is an extension of the SAS in-memory computing platform. It provides a cloud-ready environment through support for multitenancy and elastic computing with fault tolerance”, Thompson said.
“It is also a very open system for application development which is really required for AI. You want to build customised applications like conversational assistants and question-answer systems”.
Looking further ahead, how do you see the rate of AI adoption evolving in 2017?
“IDC indicates that by 2018, 75 percent of enterprise and ISV development will include cognitive/AI or machine learning functionality in at least one application, including all business analytics tools”, Thompson says. “That sounds conservative – I would say 90 percent with truly approachable analytics being one of the big advantages for the user community”.
What do you feel are the biggest challenges faced by enterprises looking to adopt AI?
Thompson argues that too many organisations are not drowning in data, as some suggest, but are actually data weak. They must have representative data, with repeatable data acquisition and preparation practices, to fuel the continuously learning AI machine.
“AI also requires gobs of data to extract deep insights. Otherwise, methods like deep learning tend to overfit, so your solution does not generalise well on the target population”, Thompson says. He emphasises that the data also needs reasonable overall quality if you are going to obtain acceptable levels of accuracy when making inferences.
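The overfitting risk Thompson raises can be illustrated with a toy sketch (hand-rolled, not deep learning, and not SAS code): a “model” that simply memorises its few training points looks flawless on the data it has seen but cannot generalise to anything new, whereas a simpler learned rule can.

```python
train_data = {1: False, 2: False, 6: True, 8: True}   # tiny labelled sample
test_data = {3: False, 7: True}                       # unseen customers

memorised = dict(train_data)  # the "model" is just a lookup table

def predict_memorised(x):
    # Perfect recall on the training data, no answer at all off it.
    return memorised.get(x)

def predict_rule(x):
    # A simpler learned rule (threshold at the midpoint, 4.5) generalises.
    return x >= 4.5

train_acc = sum(predict_memorised(x) == y
                for x, y in train_data.items()) / len(train_data)
test_hits = sum(predict_memorised(x) == y for x, y in test_data.items())
rule_hits = sum(predict_rule(x) == y for x, y in test_data.items())

print(train_acc)   # 1.0: the memoriser looks flawless on its own data
print(test_hits)   # 0: it has no answer for any unseen input
print(rule_hits)   # 2: the simpler rule gets both unseen cases right
```

With more (and more representative) data, a complex model has less room to simply memorise noise, which is the practical force behind the “gobs of data” requirement.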
“We also simply do not have the talent pool to ideate, experiment, and push the envelope of what we can do with AI”, he says. “Many AI systems are also black boxes which are difficult to explain. This diminishes acceptance in highly regulated industries and where transparency of decisions is key”.
Where do you see the biggest opportunity for the enterprise over the next 5 years?
“I would like to be bullish and say that we will see a bunch of AI systems with intellectual capabilities nearly equivalent to a human’s developed in the next five years”.
These are what are known as strong AI systems, Thompson explains. From a more pragmatic standpoint, however, he expects to see more weak, targeted AI systems being rolled out.
“For example, question and answer bots that are tailored for applications like healthcare and medical assistants. One of the top requests I am getting from SAS customers is to automate conversational assistants to help with customer online routing and also cross product promotions”.
“One client wants to use SAS Viya with Amazon Echo as a personal concierge in each and every hotel room. Automated data scientist bots that develop predictive and descriptive models in a continuous learning fashion are also in big demand”.