Stability AI Aims to Democratize Coding with Release of StableCode AI Model

After popularizing AI art creations with Stable Diffusion, Stability AI turns its attention to code generation

Ben Wodecki, Jr. Editor

August 16, 2023

3 Min Read
An example of code for using StableCode Instruct to generate a response to a given instruction. Credit: Stability AI

At a Glance

  • Stability AI has unveiled its new AI code generation model, StableCode, which can generate Python and Java code.
  • The team behind Stable Diffusion also showed off a new Japanese text generation model, Japanese StableLM Alpha.

After helping popularize AI image generation with Stable Diffusion, the team behind it has turned its attention to code generation.

Stability AI has unveiled StableCode, a small model just three billion parameters in size. It’s designed to be used as a tool to help developers code.

StableCode understands various programming languages, including Python, Go, Java, JavaScript, C, Markdown and C++.

There are three versions of StableCode: a standard version (StableCode-Completion-Alpha-3b), a version that boasts a long context window of up to 4k tokens (StableCode-Completion-Alpha-3b-4k) and an instruction-tuned version (StableCode-Instruct-Alpha-3b). All three are available via Stability’s Hugging Face page.
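For developers who want to try the models, a completion can be requested with a few lines of Python using the Hugging Face transformers library. The sketch below is illustrative only – the repository identifier and generation settings are assumptions, so check Stability’s Hugging Face page for the exact model names and license terms.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repository id for the base completion model;
# confirm the exact id and license on Stability's Hugging Face page.
model_id = "stabilityai/stablecode-completion-alpha-3b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 3B model within a single GPU
    device_map="auto",          # requires the accelerate package
)

# Ask the model to continue a Python function signature.
prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))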

Stability said it wants to make StableCode accessible, to help programmers with their daily work.

“StableCode will help the next billion software developers learn to code while providing fairer access to technology all over the world,” a Stability announcement reads.

Technical talk: How was StableCode made?

The base model of StableCode was trained on programming languages from the BigCode dataset. Stability then further fine-tuned the model on popular programming languages like Python, Go and JavaScript to improve its coding capability. In total, StableCode was trained on 560 billion tokens of code.

An instruction-tuned version of StableCode was then created for specific use cases, to help solve complex programming tasks. Around 120,000 code instruction/response pairs in Alpaca format were used to train the instruction version of StableCode.
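Alpaca format pairs a natural-language instruction with the expected response (plus an optional input field). As a rough illustration – not an actual record from Stability’s dataset – a single training example in that format might look like this:

# Illustrative Alpaca-format record: the field names follow the standard
# Alpaca schema, and the content is invented for the sake of the example.
example_record = {
    "instruction": "Write a Python function that returns the sum of a list of numbers.",
    "input": "",
    "output": "def sum_list(numbers):\n    return sum(numbers)",
}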

How does StableCode compare?

There’s already a glut of AI code generation models. While models like ChatGPT and Bard can generate snippets of code in response to user queries, specially designed models like StarCoder from Hugging Face or GitHub’s Copilot X are trained largely on code to aid human developers.

Stability tested StableCode against other code generation models of a similar parameter size using the pass@1 metric on the popular HumanEval benchmark.

The results saw StableCode achieve slightly better accuracy than Replit Coder and the StarCoder base model.
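For context, pass@1 estimates the share of benchmark problems a model solves with a single generated sample; in practice it is computed by generating several samples per problem, running HumanEval’s unit tests, and averaging the fraction of passing samples. The sketch below follows that standard methodology and is not Stability’s own evaluation code.

# Rough sketch of the pass@1 estimate used with HumanEval-style benchmarks:
# average, over problems, the fraction of generated samples that pass the tests.
def pass_at_1(samples_per_problem, correct_per_problem):
    fractions = [c / n for n, c in zip(samples_per_problem, correct_per_problem)]
    return sum(fractions) / len(fractions)

# Example: three problems, 10 samples each, with 2, 5 and 0 passing samples.
print(pass_at_1([10, 10, 10], [2, 5, 0]))  # ~0.233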

Stability launches Japanese language model

Sticking with Stability, the AI company also unveiled its first Japanese language model, Japanese StableLM Alpha.

Designed for Japanese speakers, the seven-billion-parameter general-purpose language model can generate text.

There are two versions – a base version, which is available for commercial use, and a research-focused, instruction-tuned version, Japanese StableLM Instruct Alpha 7B.

The base version was trained on Japanese and English text, as well as source code.

The model was also trained on specially crafted datasets created by Stability’s Japanese team in cooperation with the Japanese team of the EleutherAI Polyglot project.

“We are proud of our first big step towards contributing to the Japanese generative AI ecosystem,” said Meng Lee, project lead of Japanese StableLM. “We look forward to continuing to create models across several modalities, built specifically to reflect Japanese culture, language and aesthetics.”

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
