Amazon Creates Its ‘Most Ambitious’ AI Group

Amazon created a "central team" to work on its most ambitious large language models.

Deborah Yao, Editor

August 1, 2023


At a Glance

  • Amazon created a group to work on its “most ambitious” large language models, according to an AI Business source.
  • Alexa's head scientist was promoted to lead the group and will report directly to CEO Andy Jassy.
  • Amazon now joins the cadre of tech giants charging ahead on generative AI: Google, Microsoft and Meta.

Amazon has created a "central team" to work on its “most ambitious” AI project.

According to an AI Business source who has knowledge of the development, the new group will work on large language models (LLMs). It will be led by Rohit Prasad, head scientist for Amazon’s digital assistant Alexa. He will report directly to CEO Andy Jassy. The news was first reported by Insider.

Amazon now joins the cadre of other U.S. tech giants charging forward on generative AI research, applications and platforms: Google, Microsoft and Meta, which went full-bore in June.

In an email, Jassy said Prasad would lead its “most expansive” large language model efforts and his group will act as the “central team.”

"While we've built several LLMs around the company, and have several others in flight, we are going to pool some resources centrally to build our most ambitious LLMs,” Jassy wrote.

In April, AWS unveiled Bedrock, which makes generative AI foundation models available through an API. It includes models from Stability AI, AI21 Labs and Anthropic. Bedrock also will offer Amazon’s own Titan models. A month later, Jassy disclosed that Amazon is building a “much larger and much more generalized and capable” language model for Alexa in a bid to enhance customer experience across all its businesses.


In 2022, Amazon unveiled a 20 billion-parameter language model called Alexa Teacher Model 20B that supports several languages. The company’s researchers had said it outperforms OpenAI’s 175 billion-parameter GPT-3 on linguistic tasks. (OpenAI’s latest model is GPT-4, which is rumored to have 1.7 trillion parameters.)


About the Author(s)

Deborah Yao
Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford graduate who has worked at Amazon, the Wharton School and the Associated Press.

