AI News Roundup: Anthropic Supercharges Claude Chatbot
Also - Google joins forces with SAP for a new enterprise solution
May 12, 2023
AI Business brings you the latest in deals and products from across the AI world.
This week's roundup covers Anthropic's significant upgrade to Claude, its ChatGPT rival chatbot, among other developments.
To keep up to date with coverage of all things AI, subscribe to the AI Business newsletter to get content straight to your inbox and follow the AI Business Podcast on Apple and Spotify.
Anthropic's Claude gets an upgrade
Anthropic, the Google-backed AI startup trying to rival OpenAI, has expanded the context window of its chatbot to let it parse longer documents.
Claude’s context window (the range of tokens the model considers when generating an output) now spans 100,000 tokens, corresponding to around 75,000 words.
Anthropic said Claude can now retrieve information from lengthy business documents. Users can drop multiple documents or even a book into the prompt and ask Claude questions about them.
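As a rough illustration of the figures above, the snippet below estimates whether a document fits in a 100,000-token window using the article's ~75,000-words-per-100,000-tokens ratio. This is a back-of-the-envelope heuristic for illustration only, not Anthropic's actual tokenizer.

```python
# Heuristic sketch: estimate token count from word count using the
# ~0.75 words-per-token ratio implied by the article (75,000 words
# per 100,000 tokens). Not Anthropic's real tokenizer.

CONTEXT_WINDOW_TOKENS = 100_000
WORDS_PER_TOKEN = 0.75

def estimated_tokens(text: str) -> int:
    """Approximate token count from whitespace-separated word count."""
    word_count = len(text.split())
    return int(word_count / WORDS_PER_TOKEN)

def fits_in_context(text: str) -> bool:
    """True if the estimated token count fits in the 100k window."""
    return estimated_tokens(text) <= CONTEXT_WINDOW_TOKENS

# A ~75,000-word document lands right at the estimated 100k-token limit.
sample = "word " * 75_000
print(estimated_tokens(sample))   # -> 100000
print(fits_in_context(sample))    # -> True
```

Real tokenizers split on subwords rather than whitespace, so actual counts vary with the text; this only mirrors the ratio quoted in the announcement.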
The souped-up Claude and its API are now available to business partners. Claude is also available via Amazon Bedrock, an AWS service offering multiple foundation models.
Claude is a generative AI model akin to OpenAI’s ChatGPT. It differs, however, in that it was built using Constitutional AI, a novel approach in which the underlying language model is trained to answer adversarial questions guided by a set of principles - so its outputs are less harmful.
Notably, Anthropic was invited to the recent White House meeting with VP Kamala Harris to discuss safeguards around AI models. Meta was not present.
Google joins forces with SAP
Fresh from I/O 2023, Google has announced a widening of its partnership with SAP.
The pair will offer a solution that uses Google Cloud’s data analytics, AI/ML, and data cloud technologies to SAP customers. It is designed to let users build an end-to-end data cloud to analyze data from SAP and external sources. The new solution can also be integrated with BigQuery, Google Cloud’s serverless warehouse platform.
Users can get real-time access to business-critical data and use replication tech to train models using that data. The companies also plan to partner on joint go-to-market initiatives for large data projects for enterprises.
Google opens up generative AI music model
Sticking with Google, the company has opened up access to MusicLM, its text-to-music AI model.
Unveiled back in January, MusicLM lets users generate music from natural language prompts, producing audio at a consistent 24 kHz over several minutes.
At the time, Google had published the dataset used to train the model, but users could only interact with it via a demo.
But now Google has made MusicLM available via its new AI Test Kitchen, an app allowing users to try out the company’s various AI research projects in exchange for feedback.
Nvidia wants to make more AI chips
Nvidia has reportedly ordered 10,000 more AI chips that use Taiwanese foundry TSMC's chip-on-wafer-on-substrate (CoWoS) packaging technology, amid a wave of interest in AI and model training.
Digitimes Asia reports that the additional wafer orders are for this year. TSMC currently has a monthly capacity of around 8,000 to 9,000 wafers, and Nvidia's demand means CoWoS supply will be tight. CoWoS is a multi-chip packaging technology designed by TSMC for high-performance applications.
Nvidia began stepping up shipments of its new H100 chips, which are meant to replace the A100, at the start of May. The surge of interest in AI has driven up demand for its hardware among companies building models.
Earlier this week, Google announced at its I/O event that its new A3 supercomputers would be powered by Nvidia H100 GPUs.
Adding to the demand for H100s are requests from Oracle, Microsoft and potentially Elon Musk’s secretive AI project at Twitter.