Google Cloud Next ’23: New AI Enterprise Tools Take Center Stage

Major updates from Google Cloud Next 2023 include new TPU chips, an expansion of Duet AI and model upgrades for the Vertex platform

Ben Wodecki, Jr. Editor

August 30, 2023

5 Min Read
AI 'will be the most profound shift we’ll see in our lifetime,' said Google CEO Sundar Pichai. (Image credit: Google Cloud)

At a Glance

  • Google unveiled major AI updates including new enterprise tools and improvements to its AI models.

Google Cloud used its annual Next event to showcase its latest AI work, including new enterprise tools and an AI-powered collaborator.

Google CEO Sundar Pichai used his keynote to emphasize the company’s shift in focus towards AI.

“We believe that making AI helpful for everyone is the most important way we'll deliver on our mission in the next decade. That’s why we’ve invested in the very best tooling, foundation models and infrastructure, across both TPUs and GPUs.

“These underlying technologies are helping us transform our products and businesses —and they’ll help you transform yours,” Pichai said.

Duet AI expansion

Among Google Cloud’s announcements was an expansion of Duet AI – an AI-powered collaboration tool that can be embedded across Google Cloud interfaces.

Duet AI was first revealed at Google I/O and is designed to help users with contextual code completion, similar to Copilot from rival Microsoft.

Google Cloud announced at Next ’23 that Duet AI will be made generally available later this year, with Google Workspace users able to access a free trial now.

Duet AI is also gaining additional features, including tools to help users modernize applications by assisting with code refactoring. Duet AI can now help users refactor code and migrate applications to Google Cloud faster, a feature the company hopes will save users time and money.

The code development tool can also be customized. Google Cloud gave a glimpse of context-aware code generation, with some “select enterprises” given the ability to customize Duet AI with organization-specific knowledge from their libraries and code bases so it can generate context-aware code suggestions.

Another expansion will allow users to integrate and build APIs with Duet AI, and the code generation tool is being added to Looker, Google’s enterprise platform for business intelligence.

Vertex AI improvements

Google Cloud also announced a series of updates to Vertex AI, its machine learning development platform.

The company claimed that demand for Vertex has increased, with customer accounts growing more than 15 times in the last quarter.

Among the additions to Vertex are new models in Model Garden, which provides users access to various AI models. New additions include Meta’s Llama 2 and Code Llama, as well as the Technology Innovation Institute's Falcon LLM. Also getting support is Claude 2, the powerful new model from Anthropic, the OpenAI rival that Google has backed.

Google Cloud also announced updates to its own models available via Model Garden, including upgrades to its flagship PaLM 2 language model to improve its ability to analyze larger documents. Users can now also ground PaLM’s responses in their own enterprise data. And Codey, Google’s code generation model, now offers improved performance.
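For readers curious how these Model Garden models are consumed in practice, here is a minimal sketch using the Vertex AI Python SDK to call a PaLM 2 text model. The project ID and region are placeholders, and the snippet assumes an authenticated Google Cloud environment with the google-cloud-aiplatform package installed.

```python
# Minimal sketch: calling a PaLM 2 text model through the Vertex AI Python SDK.
# The project ID and region below are placeholders; authentication to a Google
# Cloud project with Vertex AI enabled is assumed.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-gcp-project", location="us-central1")

# "text-bison" is the PaLM 2 text model exposed through Model Garden.
model = TextGenerationModel.from_pretrained("text-bison")

response = model.predict(
    "Summarize the key announcements from Google Cloud Next '23 in two sentences.",
    max_output_tokens=256,
    temperature=0.2,
)
print(response.text)
```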

Also unveiled were Vertex AI Extensions, new tools to help enterprises get more value out of the company's models. With extensions, AI models on Vertex can retrieve real-time data, and users can connect pre-built extensions for popular enterprise APIs or build their own.

Vertex AI will offer pre-built extensions for services like BigQuery and AlloyDB, as well as database partners like DataStax, MongoDB and Redis. Developers can also integrate via LangChain.
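As a rough illustration of the LangChain path mentioned above, the sketch below wires a Vertex AI-hosted PaLM 2 model into a simple LangChain chain. It assumes an authenticated Google Cloud environment, and the import paths reflect LangChain's 2023-era releases, so they may differ in newer versions.

```python
# Sketch of using a Vertex AI-hosted model from LangChain (2023-era imports).
# Assumes langchain and google-cloud-aiplatform are installed and the
# environment is authenticated to a Google Cloud project.
from langchain.llms import VertexAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

llm = VertexAI(model_name="text-bison", temperature=0.2)

prompt = PromptTemplate(
    input_variables=["product"],
    template="Write one sentence describing what {product} is used for.",
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run(product="BigQuery"))
```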

Vertex AI also now supports training with various frameworks and libraries using Cloud TPU virtual machines. This gives users access to built-in support for AI frameworks like JAX, PyTorch and TensorFlow on the cloud iteration of Google’s new TPU v5e chips (see below).
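As a minimal sanity check of the JAX-on-TPU workflow, the sketch below simply asks JAX which accelerator backend it sees and runs a trivially parallel computation across the local devices. It works on a Cloud TPU VM but also falls back to CPU elsewhere, so nothing here is specific to the v5e.

```python
# Minimal sketch: verifying JAX sees the TPU devices on a Cloud TPU VM and
# running a trivially parallel computation across them. Runs on CPU as well;
# only the reported backend and device count change.
import jax
import jax.numpy as jnp

devices = jax.devices()
print(f"Backend: {jax.default_backend()}, device count: {len(devices)}")

n = jax.local_device_count()
# One small matrix per local device, multiplied in parallel with pmap.
x = jnp.stack([jnp.eye(4) * (i + 1) for i in range(n)])
result = jax.pmap(lambda m: m @ m)(x)
print(result.shape)  # (n, 4, 4)
```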

New AI chips

Google Cloud announced the latest version of its custom TPU chips: The TPU v5e.

Google’s new TPU delivers more performance for less money, with the TPU v5e performing AI training twice as fast per dollar as the previous version.

Developers only got their hands on the TPU v4 last year; Google itself has used the chips to train around 90% of its AI models.

The chips support eight different virtual machine configurations, ranging from a single chip up to 250 chips within a single slice, allowing users to choose the right configuration for their AI model sizes.

Users can interconnect up to 256 TPU v5es to gain an aggregate bandwidth of more than 400 Tb/s and 100 petaOps of INT8 performance.
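Dividing those pod-level figures by the chip count gives rough per-chip numbers. The sketch below is a back-of-envelope calculation derived purely from the quoted totals, not from separately published per-chip specifications.

```python
# Back-of-envelope check of the per-chip figures implied by the pod-level
# numbers quoted above (256 chips, >400 Tb/s aggregate bandwidth, 100 petaOps INT8).
chips = 256
aggregate_bandwidth_tbps = 400        # terabits per second, aggregate
aggregate_int8_petaops = 100          # peta-operations per second, aggregate

per_chip_bandwidth_tbps = aggregate_bandwidth_tbps / chips     # ~1.56 Tb/s per chip
per_chip_int8_teraops = aggregate_int8_petaops * 1000 / chips  # ~390 INT8 TOPS per chip

print(f"~{per_chip_bandwidth_tbps:.2f} Tb/s and ~{per_chip_int8_teraops:.0f} INT8 TOPS per chip")
```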

“Our speed benchmarks are demonstrating a 5X increase in the speed of AI models when training and running on Google Cloud TPU v5e,” said Wonkyum Lee, head of machine learning at Gridspace.

Among the early users of the chips was Claude 2 developer Anthropic. Co-founder Tom Brown said the hardware provides Anthropic with “price-performance benefits for our workloads as we continue to build the next wave of AI.”

Partnerships

A host of collaborations were announced at Google Cloud Next, including partnerships with Confluent, Elemental Cognition (founded by the development lead on IBM Watson) and HumanFirst.

But the most significant partnership was Google Cloud’s collaboration with Nvidia. The pair announced the general availability of Google Cloud’s A3 virtual machines.

Each A3 instance is powered by eight Nvidia H100 GPUs and is designed for developers building AI applications.

Nvidia founder and CEO Jensen Huang joined Google Cloud CEO Thomas Kurian during the Next ’23 keynote to discuss the collaboration.

The pair also announced PaxML, a new large language model framework.

PaxML is a JAX-based machine learning framework purpose-built for training large-scale models. Google has used the framework to build internal models for research projects at Google DeepMind.

PaxML is available via the Nvidia NGC container registry.
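PaxML’s own configuration API is documented with the framework itself; purely to illustrate the functional, jit-compiled JAX style that frameworks like PaxML build on, here is a minimal, self-contained training step. This is not PaxML code, just plain JAX.

```python
# Not PaxML's own API: a minimal JAX training step illustrating the functional,
# jit-compiled style that JAX-based frameworks like PaxML build on.
import jax
import jax.numpy as jnp

def init_params(key):
    # A single linear layer, purely for illustration.
    w_key, b_key = jax.random.split(key)
    return {"w": jax.random.normal(w_key, (8, 1)), "b": jax.random.normal(b_key, (1,))}

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(params, x, y, lr=0.01):
    # Compute gradients and apply a plain SGD update to every parameter leaf.
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jax.random.normal(key, (32, 8))
y = jnp.ones((32, 1))
for _ in range(5):
    params = train_step(params, x, y)
print(float(loss_fn(params, x, y)))
```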

New Colab services & image detection tools

Google Cloud also announced that Colaboratory (Colab), its browser-based developer platform, is getting an enterprise version.

Colab Enterprise will enable users to securely execute Python code and access AI models via the Vertex Model Garden.

Google Cloud also unveiled an enterprise version of its Google Kubernetes Engine container platform, GKE Enterprise edition.

GKE Enterprise is designed to make it safer for distributed enterprise teams to run workloads at scale.

Also unveiled at Google Cloud Next ’23 was SynthID, a new tool to tag and detect AI-generated images based on invisible watermarks.

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
