This Week's Most Read: Asking the Right Questions in AI Deployments

An expert column from IBM’s former AI chief, Nvidia’s new AI chip and OpenAI’s backing of an AV startup

Ben Wodecki, Jr. Editor

November 16, 2023

3 Min Read

Here are this week's most popular stories on AI Business:

1 AI IQ: Asking the Right Questions in AI Deployments

Introducing AI IQ, a new column from Seth Dobrin, founder and CEO of Qantm AI and former chief AI officer of IBM.

The most common mistake organizations make when evaluating emerging technologies like generative AI is to focus on the capabilities of the technology rather than on how it can support overall business goals and strategies. It is easy to get caught up in the hype and potential of any new shiny object, such as generative AI, without considering what specific use cases are relevant given the company's current challenges and opportunities.

The allure of generative AI is understandable: the promise of interacting in natural language with an AI system and having it handle high-value tasks, such as building creative content like blogs and images; turning your disaster of a shared drive into institutional knowledge; creating software programs from natural language; and providing better interactions with customers, employees, and partners, among much else.

Read the full column

2 Nvidia Upgrades its Flagship AI Chip as Rivals Circle

Nvidia unveiled an upgrade to its flagship AI chip, the H100, to fend off budding rivals to its dominance in the GPU market.

The world’s most valuable chipmaker announced the H200, which will use high bandwidth memory (HBM3e) to handle massive datasets for generative AI and other intense AI computing workloads.

Nvidia said the H200 is the first GPU to offer HBM3e. It has 141GB of memory delivered at 4.8 terabytes per second, almost double the capacity and 2.4 times the bandwidth of the earlier A100. The current flagship H100, by comparison, offers 80GB of memory.

The chipmaker said the H200 would nearly double inference speed on the 70 billion-parameter version of Meta’s open source Llama 2 large language model compared with the H100, with further improvements expected from software updates.
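For readers who want to sanity-check those multiples: taking the A100’s published figures of 80GB of memory at roughly 2 terabytes per second as the reference point (an assumption drawn from Nvidia’s A100 spec sheet, not from this story), a quick back-of-the-envelope calculation reproduces the claimed ratios.

# Rough check of the H200-vs-A100 comparison above.
# The A100 reference figures (80 GB, ~2.0 TB/s) are assumed from Nvidia's
# published specs; only the H200 numbers come from this story.
h200_memory_gb, h200_bandwidth_tbs = 141, 4.8
a100_memory_gb, a100_bandwidth_tbs = 80, 2.0

print(f"Capacity:  {h200_memory_gb / a100_memory_gb:.2f}x the A100")          # ~1.76x, i.e. "almost double"
print(f"Bandwidth: {h200_bandwidth_tbs / a100_bandwidth_tbs:.2f}x the A100")  # 2.40x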

Read more

3 AI Startup Roundup: AV Startup Snags Coveted OpenAI Backing

San Francisco-based Ghost Autonomy is developing software to potentially solve the thorniest problem in self-driving vehicles: navigating complex situations and unusual incidents. Its approach is to use multimodal large language models (LLMs) to achieve this ‘Holy Grail’ of self-driving cars.

Read more

4 New Chip Designs to Boost AI Workload Processing

New chip designs could soon transform AI processing by handling generative AI workloads more efficiently than today’s general-purpose hardware.

“When it comes to machine learning, keeping up with the requirements of AI/ML workloads, both in terms of hardware and software, is paramount,” Siddharth Kotwal, Quantiphi's global head of Nvidia practice, said in an interview. “The potential hardware opportunities revolve around developing workload-specific AI accelerators/GPUs to cater to the specialized needs of enterprises.”

General-purpose microprocessors like those from Intel and AMD offer high performance across a broad spectrum of applications, Ben Lee, a professor at the University of Pennsylvania’s Penn Engineering, noted in an interview. However, he said that chips customized for specific application domains, such as AI, can deliver much greater performance and energy efficiency.

Read more

5 Exclusive: Schneider Electric Chief AI Officer on Using Custom ChatGPT

Schneider Electric is leveraging Microsoft’s Azure OpenAI platform to develop chatbot solutions to improve worker productivity and enhance interactions with customers.

Among the AI tools being used are Resource Advisor Client, a copilot for data analysis and decision support; Jo-Chat GPT, an internal conversational assistant; and Knowledge Bot, a chatbot assisting customer care representatives.

In an exclusive interview, Philippe Rambach, chief artificial intelligence officer at Schneider Electric, explained how the French energy solutions giant built its new AI tools, and offered advice on how to approach generative AI projects at scale.
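The interview stays at the strategy level, but for a sense of the general pattern, here is a minimal sketch of a chatbot request against an Azure OpenAI deployment using the openai Python SDK. The endpoint, deployment name and prompts are placeholders for illustration only and do not reflect Schneider Electric’s actual implementation.

import os
from openai import AzureOpenAI

# Placeholder endpoint and deployment name; Schneider Electric's real
# configuration is not described in the article.
client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

response = client.chat.completions.create(
    model="gpt-35-turbo-deployment",  # the name of your Azure OpenAI deployment
    messages=[
        {"role": "system", "content": "You are an internal assistant that helps employees find answers quickly."},
        {"role": "user", "content": "Summarize last month's site energy usage report."},
    ],
)

print(response.choices[0].message.content)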

Read more

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
