COVID-19 turned our professional and personal lives upside down.
It also forced governments, scientists, and health-care providers around the world to accelerate the development of artificial intelligence (AI).
While devastating in many ways, the pandemic has accelerated both the development of and reliance on AI, completely changing how we think about analytics.
It has transformed what we believe to be the best ways to use AI to predict and navigate a range of scenarios and adapt to those that come true.
More specifically, in our ongoing work to leverage AI for navigating not just the pandemic but any complex business challenge, recent months have taught us these five valuable lessons about using data and AI models.
Even inexact data can be useful, as long as it’s consistently inexact
If, for example, you are looking at a country or a community you suspect is underreporting its COVID infections, but its reported numbers are rising over time, you can use that data to deduce the underlying trend. The same applies in other scenarios – for example, when assessing quality trends from a plant manager who consistently underreports poor production yields.
Combining data from multiple sources can also help you accurately sense the underlying trends beneath the noise of poor-quality data.
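The idea can be seen in a minimal sketch with synthetic data: if reporting captures only a fixed (unknown) fraction of true cases, the bias shifts the level of the series but leaves its growth rate intact. All numbers here are hypothetical for illustration.

```python
# Sketch: a consistently biased (under-reported) series still reveals
# the underlying trend. Assumes the reporting rate is a fixed, unknown
# fraction of the true counts -- a hypothetical illustration.
import numpy as np

rng = np.random.default_rng(0)

days = np.arange(60)
true_cases = 100 * np.exp(0.05 * days)  # true counts grow 5% per day
# Only ~30% of cases are reported, with some day-to-day noise.
reported = 0.3 * true_cases * rng.normal(1.0, 0.05, size=days.size)

# Fitting the growth rate on the log of the *reported* series recovers
# the true rate: a constant reporting fraction only shifts the intercept.
slope, _ = np.polyfit(days, np.log(reported), 1)
print(f"estimated daily growth rate: {slope:.3f}")  # close to the true 0.05
```

The key design point is that a multiplicative, constant bias becomes an additive constant after the log transform, so it cannot affect the fitted slope.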
Running your models with more varied data makes them more robust
A model might be trained to specialise in a particular geography or time period; however, this can limit it to repeating specific past experiences. By exposing the model to more varied situations – for example, by using data from other geographies and periods – it develops a better understanding of the underlying causes and effects. This allows it to perform more robustly in the situation of interest.
Data needs to have the right granularity
Determining the granularity of analysis is as much an art as it is a science, and it is all about finding the right balance. Make the data too fine-grained, for example by including low-level details of a geography or market, and the system cannot apply lessons learned from one geography to another because they all look different. Conversely, make it too coarse-grained and you risk overlooking specific features that matter. For example, this could apply when predicting disease spread in countries with different levels of compliance, or when analysing the needs of customers in different geographies or cultures.
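The trade-off can be sketched by re-aggregating the same synthetic counts at different granularities: daily counts are noisy, weekly totals make the trend visible, and a single all-time total (too coarse) erases it.

```python
# Sketch: choosing granularity by re-aggregating the same data.
# Daily counts are noisy; weekly totals expose the trend; a single
# overall total hides it entirely. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(1)
# 8 weeks of daily counts with a rising underlying rate (10 -> 40 per day).
daily = rng.poisson(lam=np.linspace(10, 40, 56))

weekly = daily.reshape(8, 7).sum(axis=1)  # coarser view: weekly totals
overall = daily.sum()                     # too coarse: one number, no trend

print("weekly totals:", weekly)   # the rising trend is clearly visible
print("overall total:", overall)  # the trend is gone
```

Nothing here tells you the "right" level in advance; as the text notes, that choice depends on which features of the problem must remain distinguishable.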
The frequency of data and model updates needs to match your needs
COVID-19 has shown us how essential daily updates are for keeping track of new data as it arrives and monitoring for changes in conditions, such as when a new outbreak occurs.
However, daily updates might not be necessary in other sectors or industries. When tracking customer preferences in clothing, for example, updates might only be needed when seasons and trends change. In consumer electronics, they may only be required when a new product is being developed or introduced.
The cadence of data and model updates is decided on a case-by-case basis. But in every case, it is crucial to ensure the quality of the data and, in some cases, transform it from legacy data formats into a form that is manageable for both modern databases and AI algorithms.
Budgeting and commitment are ongoing processes, not one-off activities
The lack of ongoing budget and commitment to data and model updates is perhaps the biggest hurdle to using artificial intelligence effectively. Model updates and data analysis are rarely one-off activities and should be seen as an ongoing process. AI is now an essential tool in guiding decisions on everything from bank loans, to which products should be shelved closer to checkouts in convenience stores to maximise sales, to when the best time is for agricultural harvesting. However, predictive accuracy is best maintained by continuously updating models so that they reflect the latest trends. Success in AI requires formalising the processes, staff, skills and budget to make updates routine.
Giving data and models the care they need
For the last century, analytics activities have relied on static mathematical formulas sifting through static data. Today, the foundation of analytics is data, analysed by ever-improving and evolving models. In this new and exciting world, organisations can make the most complex and important decisions of our day, such as how to balance reopening economies while protecting human health, in the most informed way possible. Agile analytics enabled by artificial intelligence can tell us how to reach those goals – but only if we give the underlying data and models the care they need.
Risto Miikkulainen is associate VP of Evolutionary AI at Cognizant