
AI and the quest for a commercial fusion reactor

by Len Strugatsky

An alternative approach to plasma containment could make compact nuclear fusion reactors a reality – and it would rely on artificial intelligence

Government-backed projects are rumbling towards fusion energy with giant reactors, but smaller systems could one day make them look like dinosaurs.

And to meet that challenge, those smaller systems will have to rely on artificial intelligence.

Since the 1950s, fusion research has focused on tokamaks, the torus-shaped devices seen as the best hope for replicating the conditions of the sun, where hydrogen fuses into helium and releases energy.

But in 2015, a startup emerged from stealth with a different fusion topology, aiming for a different reaction path. “You could not do what we are doing now without machine learning,” Michl Binderbauer, CEO of TAE, told AI Business.

You’re gonna need a bigger boat

Fusion reactions take place in super-hot, super-dense plasmas, where electromagnetism and fluid dynamics create massively complex conditions. Researchers picked the tokamak because other topologies were too complex to control – and even the tokamak has turned out to require neural networks to tame it.

But times have changed, Binderbauer told us in an interview. AI is enabling radical designs in other fields – for instance, in fighter aircraft which now have dynamically unstable airframes, making them faster and more maneuverable, but which cannot be built or flown without sophisticated automation. So why not go back to the drawing board in fusion, and see if another topology might become possible with modern, intelligent systems to manage the complexity?

TAE (originally Tri Alpha Energy) was founded in 1998, but kept itself to itself for 17 years, while it raised some $150 million to refine a design developed by its founder, the Canadian physicist Norman Rostoker. It launched publicly in 2015 and achieved plasma in 2017. In 2018, TAE announced a deal with Google Applied Science to push its plasma towards fusion and break-even.

“Our next desire is net energy [when the plasma generates more energy than it consumes],” Binderbauer said. “We’ll have that capability, in the next three or four years.”

That’s a lot faster and cheaper than the big projects like ITER, the $22 billion tokamak being built in the south of France, which isn’t due to open until 2025. What’s TAE doing differently?

For one thing, it’s starting with a different topology: “It’s accomplished with ingenious insights from Norman Rostoker – applying the principles of accelerator physics,” Binderbauer explained. TAE uses colliding beam fusion – where two beams of particles are accelerated and collided, much as in particle accelerators such as those at CERN.

Where those two beams collide, a torus is created, and TAE uses the field-reversed configuration (FRC). This is potentially simpler than a tokamak, leaking less and taking less energy to contain. In an FRC the torus sustains itself: the eddy currents created in the plasma generate the containment field.

TAE Technologies' Norman © TAE

It’s a lot like the strange stability of a smoke ring – and the difference is in the size of the particles’ orbits. There are two other commercial firms working with FRC – Helion Energy, and General Fusion, which has a computing partnership with Microsoft.

“The tokamak is done with brute force and control by humans,” Binderbauer said. “In a tokamak, the particles have orbits with a millimeter or centimeter scale. It’s a tiny orbit in a big machine. In our machine, the particle orbit is on the order of 1 meter.”

Having bigger orbits is a bit like riding out a storm in a bigger boat, he said: “Compared to a small rowboat, a cruise liner will get just a few bumps. It’s called phase averaging.”
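The "phase averaging" Binderbauer describes can be illustrated with a toy simulation (this is an illustrative sketch of the statistical idea, not TAE's model): a particle that averages random field fluctuations over many points of a large orbit feels a much smaller net kick than one sampling only a few points on a tiny orbit.

```python
import random
import statistics

def averaged_kick(orbit_samples, trials=2000, seed=0):
    """Mean magnitude of the net fluctuation a particle feels after
    averaging independent random field kicks over orbit_samples points."""
    rng = random.Random(seed)
    nets = []
    for _ in range(trials):
        net = sum(rng.gauss(0, 1) for _ in range(orbit_samples)) / orbit_samples
        nets.append(abs(net))
    return statistics.mean(nets)

rowboat = averaged_kick(4)     # small orbit: few fluctuations averaged
liner = averaged_kick(400)     # large orbit: many fluctuations averaged
print(rowboat > liner)         # the big orbit feels far smaller bumps
```

The net kick shrinks roughly with the square root of the number of fluctuations averaged, which is why the meter-scale orbits are the cruise liner in Binderbauer's analogy.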

But it wasn’t that simple. TAE’s C-2U machine had more than 1,000 knobs and switches, and plasmas obey the physical laws of magnetohydrodynamics, which are too complex to predict. TAE was running a “shot” every eight minutes, trying to keep the plasma contained for more than a few milliseconds. What settings would be the most stable?

Trying different configurations was complex: “You are traversing a multidimensional cube – and innovation is a constant tinkering process,” Binderbauer said. “You try something, you rebuild. The faster you can cycle, the quicker you can innovate.”

Cycling hardware takes a while: “You could spend two months twiddling the knobs to get to optimal conditions. You might be close and not be able to see it.”
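A back-of-the-envelope calculation shows why the "multidimensional cube" cannot be swept exhaustively at one eight-minute shot per configuration (the 30-parameter, 3-level figures below are illustrative, not TAE's actual search space):

```python
def grid_search_years(n_params, levels, minutes_per_shot=8):
    """Wall-clock years to try every combination, one shot at a time."""
    shots = levels ** n_params          # corners of the parameter 'cube'
    return shots * minutes_per_shot / (60 * 24 * 365)

# Even a pared-down 30 parameters with just 3 trial values each is hopeless:
print(f"{grid_search_years(30, 3):.1e} years")
```

Billions of years for a brute-force sweep, which is why a smarter search over the parameter space was essential.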

Better fuels

Up till 2014, TAE built its own machine learning algorithms, and reduced the optimization time from two months down to two weeks. “That made a huge difference. We’d been sitting in front of this big asset, not making the best use of it. It was now much more in tune than if we had to wait months.”

At that point, the company decided to collaborate with bigger AI experts: “We realized we should talk to people who are much better than us at AI. We talked to Google, and struck a chord.”

Google and TAE came up with a system that worked in partnership with the fusion scientists – exploring their preferences alongside the settings of the reactor.

First the controls were boiled down to 30 main parameters, and then a Markov Chain Monte Carlo (MCMC) algorithm came up with multiple suggested settings, and let the operator choose one. It was dubbed the Optometrist Algorithm because it resembled the “better, or worse?” choices an optometrist offers when refining an eyeglass prescription.
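The core loop can be sketched in a few lines (a minimal illustration of the propose-and-choose idea, not Google's actual implementation; the parameter names and the automated "judge" standing in for the human operator are invented for the example):

```python
import random

def optometrist_step(current, judge, rng, step=0.1):
    """One 'better, or worse?' round: perturb the current settings and let
    the operator decide which of the two plasmas looked healthier."""
    candidate = {k: v + rng.gauss(0, step) for k, v in current.items()}
    return candidate if judge(candidate, current) else current

def optimize(initial, judge, rounds=300, seed=1):
    rng = random.Random(seed)
    settings = dict(initial)
    for _ in range(rounds):
        settings = optometrist_step(settings, judge, rng)
    return settings

# Stand-in 'operator' preferring settings nearer a hidden optimum.
TARGET = {"field": 1.0, "beam": -0.5}
score = lambda s: -sum((s[k] - TARGET[k]) ** 2 for k in s)
judge = lambda a, b: score(a) > score(b)

best = optimize({"field": 0.0, "beam": 0.0}, judge)
```

The point of the design is that the human never has to articulate an objective function; they only make pairwise preference judgments, and the random-walk proposals explore the high-dimensional space around whatever the operator keeps.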

“The key improvement we provided was a technique to search the high-dimensional space of machine parameters efficiently,” Ted Baltz, senior staff software engineer at Google, wrote in a 2017 blog.

Google and TAE published the algorithm in Scientific Reports, a Nature journal. More importantly, the human-AI team learnt the knack of stabilizing an FRC torus – so TAE dismantled C-2U and built the next one: C-2W, which was later renamed Norman, as Professor Rostoker had sadly died.

With the ability to stabilize FRC systems, TAE plans to move to higher temperatures – and that’s a big deal. So far, everyone’s been working on the deuterium-tritium reaction, which is easiest because it happens at lower temperatures. Higher temperatures mean TAE can look at better fuels.

“There is only 50kg of tritium around on the planet, it’s a classified substance, and it has a half-life of about 12 years,” Binderbauer said. “We can look beyond the lowest energy fuel cycle. Hydrogen and boron is our goalset, because there’s about 100,000 years of terrestrial fuel supply.” It also has the advantage of not creating extra neutrons, limiting radiation.

TAE is aiming for temperatures around a billion Celsius, which is way hotter than the center of the Sun – but there’s a plus: plasmas conduct better at higher temperatures, so containment could get easier.

AI has continued to help, and TAE has enjoyed access to the US Department of Energy’s fastest supercomputers. “The DoE is very supportive of the private fusion sector – and that’s priceless. You can’t even pay to get onto these machines. You have to earn the right.”

The Optometrist algorithm cut the time it took to optimize Norman – and more importantly, made the process predictable: “We knew how long it would take to build, and can estimate a schedule. This is very different! You’re up against Mother Nature, and can actually estimate, with good predictability, how long it would take you. With Norman and C-2U, we got the data in a year.”

Norman uses machine learning techniques, some of which grew out of the Optometrist Algorithm and have been reincarnated as a control system.

“There are about 80,000 parts on the machine that are electrified, and there are different hierarchies to the operating system,” Binderbauer noted. The flow of coolant water won’t need to vary much, but other parameters need to change on millisecond timescales.
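That hierarchy of timescales can be pictured with a simple tick-counting sketch (the loop names and periods here are invented for illustration; the article only says coolant varies slowly while other parameters change on millisecond timescales):

```python
def run_ticks(loop_periods_ms, total_ms):
    """Count how many times each control loop fires over total_ms of
    running, given each loop's update period in milliseconds."""
    fired = {name: 0 for name in loop_periods_ms}
    for t in range(total_ms):
        for name, period in loop_periods_ms.items():
            if t % period == 0:
                fired[name] += 1
    return fired

# A slow house-keeping loop next to a millisecond-scale plasma-facing loop:
counts = run_ticks({"coolant_flow": 1000, "confinement_field": 1}, 5000)
print(counts)  # {'coolant_flow': 5, 'confinement_field': 5000}
```

Over five seconds the slow loop fires a handful of times while the fast loop fires thousands, which is the scheduling problem any layered control system for such a machine has to solve.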

A few years from now, production systems will have to be controlled by machine learning: “In a reactor scenario, you don’t want 100 PhDs to operate it. It’s got to be like a gas fired or nuclear plant today.”

That’s very different from research reactors, where all the information is desirable. “We have to figure out how to get rid of 99 percent of the diagnostics. We’ll have to turn off information flows and learn how little we can get away with.”

“If we could get away with having just one sensor we would,” Binderbauer said. “We have one of the world’s best diagnosed machines, with 5,500 diagnostics – and we want to get rid of most of this. The art is figuring out what are the ones you can get rid of.”
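One plausible way to approach that art, sketched here as an assumption rather than TAE's method, is to drop any diagnostic whose readings are almost a linear function of a channel already kept (the channel names and readings are invented for the example):

```python
import statistics

def correlated(a, b, threshold=0.99):
    """True if two reading series are (nearly) linearly redundant."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return abs(cov / (va * vb)) > threshold

def prune(channels):
    """Keep only channels that add information beyond those already kept."""
    kept = []
    for name, series in channels.items():
        if not any(correlated(series, channels[k]) for k in kept):
            kept.append(name)
    return kept

readings = {
    "flux_loop_1": [1.0, 2.0, 3.0, 4.0],
    "flux_loop_2": [2.1, 4.2, 6.3, 8.4],   # redundant: a scaled copy
    "interferometer": [5.0, 1.0, 4.0, 2.0],
}
print(prune(readings))  # ['flux_loop_1', 'interferometer']
```

A real pruning effort would use far richer redundancy measures than pairwise correlation, but the shape of the problem is the same: find the smallest sensor set from which the rest can be reconstructed.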

There may be no PhDs on-site at commercial fusion reactors, but TAE wants to help remotely, turning into a software and optimization company, while the hardware is churned out by an industrial giant such as Siemens or Toshiba: “We will want access to all the operating data of those machines to train those brains. We want to be in the space where the science mixes with control.”
