MLOps startup unveils zero-emissions AI cloud in partnership with atNorth

Launch comes as carbon emissions from data centers are estimated to rival those of pre-COVID air travel

Ben Wodecki, Jr. Editor

October 28, 2021

2 Min Read

An MLOps firm has launched a zero-emissions AI cloud.

The platform can be integrated with a complete stack of MLOps tooling for the entire machine learning model lifecycle.

It runs entirely on renewable geothermal and hydroelectric energy and uses free-air cooling, thanks to its data center’s near-Arctic location in Iceland.

“The AI Cloud was created for enterprise AI teams who are dedicated to both high-impact AI solutions and the responsibility we all share for ethical and sustainable AI,” said Constantine Goltsev, the company’s CEO.

“Our green cloud is a zero-emission path forward for increasingly large models and datasets.”

The startup is based in San Francisco and was founded in 2018 by Goltsev, formerly of SolidOpinion, and Maxim Prasolov, who also works at deep learning R&D firm Neuromation.

The startup has raised around $2m in its seed round, according to figures from Crunchbase.

The company’s decision to launch a zero-emissions AI cloud comes at a time when tech’s carbon footprint is growing at an unprecedented pace.

Carbon emissions from tech infrastructure and servers that enable cloud computing now exceed those of pre-COVID air travel, according to French think tank The Shift Project.

The group’s findings suggest that data centers account for 15 percent of the IT sector’s digital footprint. Separately, Greenpeace estimates that Microsoft, Google, and Amazon emitted 16 million, 1.5 million, and 44 million tons of greenhouse gases, respectively, last year.

The AI Cloud was built in partnership with atNorth, a Nordic data center services company.

The firm is “thrilled” to be working with the startup, said Sebastian Holtslag, a vice president at atNorth.

“Our high-density data centers across the Nordics are built with sustainability, scalability, and security at their forefront, and we look forward to supporting [the startup’s] future growth.”

The new cloud service makes use of Nvidia GPU architectures, including the A100-powered DGX and HGX systems, and can provide unified resource management capabilities for deep learning and inference workloads.

Users also gain access to the company’s complete MLOps interoperability platform, which includes tools for data management, training, and monitoring.

The platform integrates with several open source and proprietary AI developer tools, including Pachyderm, DVC, Seldon, and MLflow.

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
