By Jelani Harper
According to Jack Norris, Senior Vice President of Data and Applications at MapR, the entire history of analytics has been plagued by a single, chronic problem which has heretofore proven unsolvable despite advancements in technologies, tools, and methods.
“The problem with analytics is that it was limited to primarily either historical analytics, describing what happened, or taking past events and projecting forward with predictive analytics: based on the past here’s what we think could happen,” Norris explained. “But you had this gap in the middle of how do you respond to what’s happening in the moment.”
Artificial intelligence, coupled with event streaming data on real-time operations analyzed both in the moment and over the course of time, can effectively resolve this dilemma by identifying actions organizations can take to improve business outcomes. Best of all, these benefits arrive in ever-shorter time frames, what Norris termed "day zero," immediately benefiting both business and IT users.
“I can tell you what happened in the past, and after a day I can respond, but during that day zero I’m completely blind because it takes me that long to get the data staged before I can analyze it,” Norris remarked about conventional analytics processes. “I think the interesting applications of AI are in this day zero timeframe…to add intelligence to impact the business as it’s happening.”
To apply AI's analytic capabilities to time-sensitive events, solutions require a flexible architecture that minimizes the costs of managing the massive data quantities needed to suitably train machine learning models. The drivers for such a solution are twofold. "The data side is like, we've got all this data, how do we harness it in a way that keeps costs down," Norris mentioned. "For the business, it's how do we get more intelligent about customer engagement; how do I drive better revenue?" The answer to both concerns is an all-encompassing platform for analytics and operations predicated on a single dataset reusable for each application.
This approach reduces the copies and data redundancies which, according to MapR Senior Vice President of Product Management and Marketing Anoop Dawar, "create unnecessary data movement, lineage issues, data duplication, and fragmentation" for each application of AI or specific business use case. By reusing the same data for each purpose on an inclusive data fabric, the copies and individual clusters for specific use cases (or data science tools) are pared down to essentially one dataset accessible through multiple APIs. The same fabric can span clusters in multiple locations, such as on-premises, at the cloud's edge, or in conventional, private, and hybrid clouds. Dawar observed that the greater value is a common understanding of data, so that organizations don't "come up with different, often conflicting insights"—which frequently occurs when data is replicated too many times for siloed use cases.
When such a unified architecture is deployed in tandem with streaming event data—which can pertain to virtually any business domain, for an array of business problems—the low latency capabilities of AI truly emerge. Perhaps the best example is streaming analytics on security log data, including "the data access, the operation commands, the authentication requests, everything," Norris said. This data supports both granular and aggregate analytics, which yield a wealth of insights about individual employee and collective business unit behavior. These insights include "what data's accessed the most frequently, what data's accessed infrequently, [and] what are the events that precipitated the access of this data," Norris said. "It's this type of information that can drive the intelligent machine learning driven placement or decision of what data should be tied to which tiered architectures." The magic, however, occurs when such analytics are applied to specific business goals. According to Norris, "Our ability to use machine learning to understand how the cluster is operating and being used can serve as the template for an organization to say, 'I'm going to take the same process and change the perspective to better understand my business operations'."
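The access-driven tiering Norris describes can be sketched in miniature. The example below is illustrative only, not MapR's implementation: the log records, dataset names, window, and threshold are all assumptions, and a production system would learn these policies from far richer signals.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical access-log records: (dataset name, timestamp of access).
access_log = [
    ("sales_2024", datetime(2024, 6, 1)),
    ("sales_2024", datetime(2024, 6, 2)),
    ("sales_2024", datetime(2024, 6, 3)),
    ("archive_2019", datetime(2024, 1, 15)),
]

def tier_datasets(log, now, hot_threshold=2, window_days=30):
    """Assign each dataset to a storage tier from its recent access count."""
    cutoff = now - timedelta(days=window_days)
    # Count only accesses that fall inside the trailing window.
    recent = Counter(name for name, ts in log if ts >= cutoff)
    datasets = {name for name, _ in log}
    return {
        name: "hot" if recent[name] >= hot_threshold else "cold"
        for name in datasets
    }

tiers = tier_datasets(access_log, now=datetime(2024, 6, 10))
print(tiers)  # sales_2024 -> "hot"; archive_2019 -> "cold"
```

The same counting-and-classifying pattern, applied to business events rather than storage events, is what Norris suggests organizations can reuse to understand their own operations.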
Increasing revenue with analytics
Moreover, organizations can monetize this newfound intelligence in ways they previously couldn’t—simply by applying knowledge from this fundamental understanding of business operations. “You can look and see which end users, by which department, were accessing what data, and what type of events are used to understand what drives higher revenue versus lower revenue,” Norris commented. For example, marketing or sales departments can identify which literature contributes to more sales or conversions, and strategically position it in front of prospects and customers.
Additionally, in e-commerce and certain retail applications, organizations can utilize AI and business operational data to tailor the advertising or home page images and products for specific customers. They can also identify opportunities for cross selling and up-selling based on real-time customer activity to maximize interactions while encouraging future ones. “It’s not how do I get better data to understand what I should have offered a customer last week; it’s how do I adjust on-the-fly to drive the shopping cart content and make it increase,” Norris remarked.
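One common way to adjust offers on the fly, sketched here as an assumption rather than any specific vendor's method, is item co-occurrence: score candidate items by how often they appeared in past baskets alongside what is in the cart right now. The basket data and item names below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical historical baskets used to learn which items co-occur.
past_baskets = [
    {"laptop", "mouse"},
    {"laptop", "mouse", "bag"},
    {"laptop", "mouse"},
    {"laptop", "bag"},
    {"phone", "case"},
]

def cooccurrence_counts(baskets):
    """Count, for each item, how often every other item shared a basket."""
    counts = defaultdict(lambda: defaultdict(int))
    for basket in baskets:
        for a in basket:
            for b in basket:
                if a != b:
                    counts[a][b] += 1
    return counts

def suggest(cart, counts, k=2):
    """Rank items not yet in the cart by co-occurrence with cart contents."""
    scores = defaultdict(int)
    for item in cart:
        for other, n in counts[item].items():
            if other not in cart:
                scores[other] += n
    return [item for item, _ in sorted(scores.items(), key=lambda x: -x[1])][:k]

counts = cooccurrence_counts(past_baskets)
print(suggest({"laptop"}, counts, k=1))  # ["mouse"]
```

Real systems replace the raw counts with learned models and stream the cart events through them, but the in-the-moment shape of the problem, score candidates against live activity and respond before the session ends, is the same.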
The same process can also improve facets of risk management and fraud detection. In financial services, when customers deposit checks with mobile apps by essentially transmitting images of the checks, those images are transported to object stores “to be processed and analyzed to make sure it’s not a fake check, make sure the signature is proper, the account number is correct, the numbers are legit and all of that stuff,” Dawar said.
Sophisticated image recognition systems are used to provide this level of protection for individual checks as they are deposited. Simultaneously, however, there are systems to monitor the check deposit apps in general, involving “a streaming dashboard that’s running to show how many checks get deposited every day of the week, every hour, every minute, every month, every year, and you’re tracking for anomalies,” Dawar revealed. “You can track fraud anomalies and system anomalies.”
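The aggregate monitoring Dawar describes can be approximated with a trailing z-score test on deposit counts; the hourly figures and threshold below are assumptions chosen to make the spike obvious, not real banking data or MapR's detector.

```python
import statistics

# Hypothetical hourly check-deposit counts; the final spike is the anomaly.
hourly_counts = [100, 98, 103, 97, 101, 99, 102, 100, 240]

def find_anomalies(counts, window=8, z_threshold=3.0):
    """Flag indices whose count deviates sharply from the trailing window."""
    anomalies = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        # A count more than z_threshold standard deviations from the
        # trailing mean is treated as a fraud or system anomaly.
        if stdev > 0 and abs(counts[i] - mean) / stdev > z_threshold:
            anomalies.append(i)
    return anomalies

print(find_anomalies(hourly_counts))  # [8]
```

A streaming dashboard would run the same comparison continuously over per-minute, per-hour, and per-day aggregates, distinguishing, as Dawar notes, fraud anomalies from system anomalies.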
In the moment
The fraud detection use case is compelling because it illustrates how AI can provide both conventional and aggregate analytics—on a single check or on all of them—to analyze events as they occur and over time, for both historic and day zero information on real-time operational systems. The sales, marketing, e-commerce, and retail use cases exemplify how AI can turn customer-facing operational data into a revenue-driving competitive advantage.
In all of these use cases, AI offers in-the-moment analysis of how to improve business outcomes, which should almost always be the goal of advanced analytics. According to Norris, those analytics are underpinned by "a comprehensive data platform to support what's happening, and needs to happen, with AI."
Jelani Harper is an editorial consultant serving the information technology market, specializing in data-driven applications focused on semantic technologies, data governance, and analytics.