Why Enterprises Struggle to Operationalize AI, AI Summit NY 2023

Panelists discuss obstacles to deploying generative AI and how to overcome them

Deborah Yao, Editor

December 6, 2023

3 Min Read
AI Summit New York panel, from left: Deborah Yao of AI Business, Gaurav Dhama of Mastercard, Lucinda Linde of Ironside, Sesh Iyer of BCG and Vik Scoggins of Coinbase. Credit: AI Business

Companies deploying generative AI models often find it difficult to go beyond proofs of concept to operationalize their learnings at scale. But identifying these roadblocks is the first step to overcoming them, according to panelists at the AI Summit New York 2023.

One big hurdle is getting the right data: curating it, capturing the metadata and ensuring the data pipeline remains sustainable, said Sesh Iyer, managing director and senior partner at BCG, who is also the North America co-chair of BCG X, the firm’s tech build and design unit.

AI learns from troves of data, so companies need to organize their knowledge bases to feed into large language models, added Gaurav Dhama, director of product development – AI at Mastercard.
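As a rough illustration of that idea – not anything shown on the panel – the sketch below pulls a few relevant snippets out of a small, organized knowledge base and places them into a prompt for a large language model. The tiny in-memory knowledge base, the keyword-overlap scoring and the prompt template are hypothetical placeholders for a real curated pipeline with proper retrieval and metadata.

```python
# Minimal sketch of feeding an organized knowledge base into an LLM prompt.
# The "knowledge base", scoring and template below are illustrative only.

KNOWLEDGE_BASE = {
    "refund-policy": "Refunds are issued within 14 days of purchase.",
    "support-hours": "Support is available 9am-6pm ET on weekdays.",
}

def retrieve(question: str, k: int = 1) -> list[str]:
    # Naive keyword-overlap scoring stands in for proper semantic search.
    scored = sorted(
        KNOWLEDGE_BASE.values(),
        key=lambda doc: len(set(question.lower().split()) & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str) -> str:
    # Ground the model by restricting it to the retrieved context.
    context = "\n".join(retrieve(question))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"

if __name__ == "__main__":
    print(build_prompt("When are refunds issued?"))
```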

Another hurdle is setting up the right governance structure to manage generative AI’s risks, Iyer said. 

“There is a confidence problem” among top leaders in the use of large language models, with their attendant risks around security, copyright, hallucinations and more, added Lucinda Linde, senior data scientist at Ironside, a technology consultancy.

Other obstacles include a shortage of people who can do the work, enterprises still struggling to pin down the business value, or return on investment (ROI), and the cost of generative AI remaining in flux – adding another layer of uncertainty for businesses, according to Iyer.

To be sure, this is due to the relatively new nature of generative AI. “It is not a paved road yet,” said Vik Scoggins, who leads AI/ML product strategy and development at Coinbase.

Because of generative AI’s risks, Dhama sees it staying in the copilot phase – assisting humans rather than acting autonomously – “for a long time,” especially for companies in heavily regulated industries such as financial services. That means a human will remain in the loop for a while.

With code especially, generative AI can introduce security vulnerabilities. “We use it carefully and the skill of the programmers using it should be higher,” Dhama added.

Linde recommended using generative AI internally first, to raise employee productivity and efficiency, because it is a safer route than customer-facing applications. Practice deploying it in the back office until confidence builds within the organization.

Despite the pains of learning a new technology, generative AI is worth the trouble to deploy, with productivity gains ranging from as low as 10% to 15% all the way up to 80% to 90%, according to Iyer.

OpenAI’s drama: Another risk

Linde said a best practice is to use multiple models, even though 95% of generative AI code being written right now uses OpenAI's tech.

The recent firing and rehiring of OpenAI CEO Sam Altman shows the risk of relying on just one AI company. “It’s a material event that just happened,” Linde added.

Another reason to use multiple models: some models do certain things better than others, so it is worth experimenting to see which works best. For instance, “Mistral came out of nowhere and it is a really good model,” Linde said.

Dhama added that a company needs a cascade of systems as a rule.

When designing a generative AI stack, the factors to consider are accuracy, latency and cost, said Iyer.
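One way to picture the cascade Dhama describes, balanced against Iyer’s accuracy, latency and cost criteria, is a simple fallback chain: route each request to a cheaper, faster model first and escalate to a more capable one only when the answer fails a quality check. The sketch below is a minimal illustration under those assumptions; the model names, cost figures, quality gate and `call`-style helpers are all hypothetical placeholders, not anything demonstrated by the panelists.

```python
# Minimal sketch of a model cascade: try a cheap, fast model first and
# escalate to a more capable (and costlier) one if the answer falls short.
# Tiers, costs and the quality heuristic are illustrative placeholders.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelTier:
    name: str
    cost_per_call: float              # hypothetical relative cost
    generate: Callable[[str], str]    # stand-in for a real model API call

def good_enough(answer: str) -> bool:
    # Placeholder quality gate: real systems might validate citations,
    # run a checker model, or verify output against internal data.
    return len(answer.strip()) > 0 and "I don't know" not in answer

def cascade(prompt: str, tiers: list[ModelTier]) -> str:
    # Walk tiers from cheapest to most capable, stopping at the first
    # answer that passes the quality gate.
    for tier in tiers:
        answer = tier.generate(prompt)
        if good_enough(answer):
            return f"[{tier.name}] {answer}"
    # If every tier fails, keep a human in the loop.
    return "escalate-to-human"

if __name__ == "__main__":
    tiers = [
        ModelTier("small-fast-model", 0.1, lambda p: "I don't know"),
        ModelTier("large-capable-model", 1.0, lambda p: "A grounded answer."),
    ]
    print(cascade("Summarize our Q3 churn drivers.", tiers))
```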

Asked how companies can differentiate themselves when everyone has access to the same foundation models, the panelists said the competitive edge is in the data used to power generative AI. “It is in the data, not the model,” Dhama said.

Companies should connect business insights with operations to derive full value from generative AI. But it starts with the right data.

“If you have the data, you win,” Iyer said.


About the Author(s)

Deborah Yao

Editor

Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford grad who has worked at Amazon, Wharton School and Associated Press.
