Shaping AI and analytics services
AI and analytics teams must market themselves – and they have to have a clear service strategy
August 6, 2020
Contain your expectations
Put a team of AI experts together, let them get their software configured – and then AI will rock the company. I know businesses that follow this approach.
Once complex technologies such as Hadoop are involved, the engineers have the time of their lives. They focus on understanding and getting the new technology to work. Exciting!
Spring turns to summer, summer turns to autumn, autumn turns to winter – and there is still so much to prepare before you can offer a real service to internal customers. Any IT service or IT consulting company following this model would be bankrupt before it approached its first customer. Thus, it comes as no surprise that many internal AI initiatives fail.
When setting up an internal AI and analytics service, these are the key points:
Provide a service that meets the expectations of the internal customers. This requires:
You have customers
You understand what the customers need
You can deliver
Ensure the funding for the service. You need a sponsor who understands how the service you provide benefits him or her.
AI teams in large corporations might have the chance to focus exclusively on their technology stack, but many AI teams have to take a wider, more customer-focused approach. They have to propose answers to the challenges of their potential customers. If they do not meet the needs of an internal customer (e.g., an unsuitable technology stack, or the AI experts rejecting a task as not “challenging” enough), the potential internal customer might simply not start the project. Even worse, they might approach another team or an external service provider. Thus, AI and analytics teams have to market themselves – and they need a clear service strategy.
Analytics and AI service requests: Strategic projects vs. operations support
AI and analytics teams can help their internal customers in two ways. First, and more obvious, is strategic advisory, such as analyzing client groups and the factors influencing client buying decisions. Second, an AI and analytics team can support operational processes by providing ongoing operational insights. Whom should the sales team contact to sell which product? Such a list could be generated weekly.
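To make the second service type concrete, here is a minimal, purely illustrative sketch of such a weekly batch job. The customer attributes, product names, and scoring weights are all invented stand-ins; in practice, the scoring function would be a model trained on historical response data.

```python
# Hypothetical sketch of a weekly "whom to contact with which product" job.
# All customer attributes, products, and weights below are invented for
# illustration; a real implementation would call a trained model.

CUSTOMERS = {
    "c1": {"age": 25, "has_mortgage": 0},
    "c2": {"age": 58, "has_mortgage": 1},
}

def score(customer, product):
    """Toy linear scoring rule standing in for a trained model."""
    if product == "savings_plan":
        return 0.5 * customer["has_mortgage"] + 0.01 * customer["age"]
    if product == "credit_card":
        return 0.8 - 0.01 * customer["age"]
    return 0.0

def recommend(customer, products):
    """Pick the product this customer is most likely to respond to."""
    return max(products, key=lambda p: score(customer, p))

def weekly_call_list(customers):
    """Batch job: (customer_id, product) pairs for the sales team."""
    products = ["savings_plan", "credit_card"]
    return [(cid, recommend(attrs, products)) for cid, attrs in customers.items()]
```

The key design point is the cadence: the job runs unattended every week, so the AI and analytics team owns not just the model but also the data feeds and the schedule around it.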
These service types have different characteristics (Figure 1). Strategic advisory services are executed project-style: analyze the data, build a model using statistics and machine learning, prepare slides with the results, and present them to management. This is repeated a few times; then the project ends, after anywhere from a few days to several months. The business uses these insights to make better decisions, e.g., to push sales and increase revenues. These benefits are usually realized without (much) further involvement of the analytics and AI team.
This is good for seasoned data scientists. They move to the next topic and build new models and, potentially, even use new technologies. However, this also means that the AI and analytics service management has to acquire new projects frequently to prove its relevance to the organization.
This is different for ongoing operational insights. The data scientists build a model. The model becomes part of an overall software solution. It triggers and influences the behavior of the software solution. This can be, for example, a product recommendation component that determines which ads specific customers are targeted with. AI and analytics thereby become an integral part of the business processes. Daily operations become difficult or impossible without them.
This means there is continuous work for the AI and analytics team – or the business will not get the expected extra value from AI and analytics. Obviously, this also means that securing continuous funding for the AI and analytics service team becomes easier.
Figure 1: Comparing strategic (left) and operational analytics services (right) regarding the staffing needs for the analytics and AI service (red) and the benefit for the business (green).
Designing an AI and analytics service
In order to understand how the same service can be packaged differently, we can look at banking. All banks have highly standardized services: payments, loans, asset management, credit and debit cards, and buying and selling financial instruments. However, different customer segments get these basic services “packaged” differently.
In retail banking, there is an online banking platform and a call center. From time to time, when there is a clear need such as a mortgage, a bank customer meets a bank clerk in person in a branch office. Online banks have no physical branches or offices. Interaction and banking services are handled via online portals or call centers; there is no chance to meet anyone in person. Finally, there is the world of private banking. The client advisor always has time for each and every client. They are happy to go for lunch or dinner with you – and send you flowers and wine for your birthday. Online banking, retail banking, private banking – the services are similar, but the customer experience is completely different.
Philip and Hazlett’s PCP attribute model [SDV05] explains this and is easily applicable to AI and analytics services. A service has pivotal service elements. This is what the internal customers and users actually get, e.g., a churn model for a bank. The deliverables are closely related to the know-how and competences of the team: statistics and machine learning know-how, experience with tools, and access to all relevant data for building good models.
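To make the pivotal deliverable tangible, the sketch below trains a toy churn model using only the Python standard library. The features (months of inactivity, number of products held) and the training data are invented for illustration; a real team would use a proper ML library on the bank's actual customer data.

```python
# Illustrative churn model: logistic regression fitted with plain SGD.
# Features and data are invented; this stands in for the real, pivotal
# deliverable a team would build from the bank's customer history.
import math

# (months_inactive, num_products) -> churned? (1 = yes)
TRAIN = [
    ((0, 3), 0), ((1, 2), 0), ((2, 3), 0), ((1, 1), 0),
    ((8, 1), 1), ((10, 1), 1), ((7, 2), 1), ((12, 1), 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.05, epochs=500):
    """Fit logistic-regression weights with stochastic gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            err = sigmoid(w[0] * x1 + w[1] * x2 + b) - y
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def churn_probability(model, months_inactive, num_products):
    """Score one customer with the fitted model."""
    w, b = model
    return sigmoid(w[0] * months_inactive + w[1] * num_products + b)
```

Note that this code is only the pivotal element: the statistical core. Everything around it – data pipelines, touchpoints, support – belongs to the core and peripheral attributes discussed next.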
People, processes, and organization are the core service attributes shaping the service experience. They ensure reliability and friendliness. How well designed are the customer touchpoints? Is there phone support, and can the hotline be reached as announced? Are there ticketing tools that are convenient to use? Is the project organization able to deliver on time? Core attributes are important for a long-term working relationship.
Finally, there are peripheral service attributes. They create the “wow” effect. They should excite internal customers and users. Do team members present at conferences and are, thus, obviously in high demand and interesting to work with? Are reports and lists well designed? Are there community events to foster relationships between the AI service team and the customers – and among the customers themselves?
The important point is to clearly understand which service elements are pivotal, core, or peripheral – and to set the priorities right.
Figure 2: PCP attribute model, Philip and Hazlett, 1997 [SDV05]
Customer satisfaction with AI and analytics services depends on the perceived service quality regarding what the service delivers (pivotal elements) and how it is delivered (core and peripheral elements). Certainly, expectations depend on objective needs and requirements, e.g., the need to boost sales for a specific mobile subscription. Expectations are also influenced by the image and reputation of the AI and analytics team and by how previous interactions were handled. Based on these influencing factors, customers and users form an expectation about the service – and compare it with the service they perceive as being delivered (Figure 3). Thus, the perceived service quality has a strong subjective element.
On the service delivery side, the management and the customer-facing team members form their impression of what customers need and want. They design a service to meet these requirements and deliver this service. Again, misunderstandings can happen in each step. As a result, the delivered service differs from the envisioned one, and customers are unhappy.
This model explains why the quality of an AI and analytics service is more than some statistical metrics. Math usually does not solve service issues; communication and commitment do. Based on my service design experience, writing down the key points of the service for potential customers helps. What do you do, and what are the limitations? This does not make additional communication obsolete. It fosters discussion of needs and expectations – and prevents wrong assumptions on the customer side.
Figure 3: Service Quality Model based on Grönroos (1984), Parasuraman (1985), and Philip and Hazlett (1997) [SDV05]
Obviously, the distinction between pivotal, core, and peripheral service elements and the various aspects of service quality impact service costs and how an AI and analytics team is set up. More about that in my post next week.
Klaus Haller is a Senior IT Project Manager with in-depth business analysis, solution architecture, and consulting know-how. His experience covers Data Management, Analytics & AI, Information Security and Compliance, and Test Management. He enjoys applying his analytical skills and technical creativity to deliver solutions for complex projects with high levels of uncertainty. Typically, he manages projects consisting of 5-10 engineers.
Since 2005, Klaus has worked in IT consulting and for IT service providers, often (but not exclusively) in the financial industry in Switzerland.