Generative AI Explosion Requires New Compute Approaches

For generative AI to scale sustainably, developers and industry stakeholders must rethink how they power their applications

Wes Levitt, head of strategy at Theta Network

August 12, 2024


Artificial intelligence has done what many other emerging technologies could not. OpenAI's ChatGPT and Google's Bard are household names, while quantum computing still sounds more like a sci-fi subplot. Over one-third of UK adults, some 18 million people, have used generative AI, and 7 million now use it for work, up from 4 million just a year ago. Usage isn't the only generative AI statistic that has skyrocketed: ChatGPT's coding fixes and Stable Diffusion's interior design mockups have sharply escalated the demand for computational power.

Unlike traditional search engines and non-AI tools, these applications require immense energy to function effectively. A single ChatGPT query reportedly requires ten times as much power as a traditional Google search. The energy infrastructure powering these applications is already straining, and alternative energy sources remain limited. The climate can't afford an AI-driven emissions spike, and even the industry's giants know it: OpenAI CEO Sam Altman has called for clean energy to power generative AI rather than reverting to fossil fuels.

The Future of AI Is Not Coal-Powered

Sustainable AI progress will require alternative compute solutions, and decentralization is a strong contender. Decentralized networks can harness the collective power of edge nodes run on individuals' home computers, using the GPUs already in those PCs to execute compute tasks in the background without interrupting everyday routines.


Decentralized edge computing also reduces processing bottlenecks. Edge applications that analyze data closer to its origin reduce latency, improve efficiency, and ensure ChatGPT can help you write your research brief without delay.

Optical neural networks offer another answer to the compute-sourcing question. These networks use light instead of electricity to process information, slashing energy consumption and, with it, dependence on fossil fuels such as coal. For example, Oriole Networks, a University College London spinout, promises to use light beams and optical fibers to cut network energy consumption to just 3% of that of traditional models. More funding and attention should go to these promising avenues.

Benefits of Alternative Solutions

Overall, implementing these solutions is a win-win for users, applications, and providers. Reducing reliance on fossil fuels like coal will cut generative AI's carbon footprint and the industry's environmental impact. Moving away from centralized data centers also strengthens the industry: minimizing reliance on a handful of data center providers keeps AI free from too much centralized authority.


Individual and environmental benefits alone might not be enough to push companies toward alternative approaches, especially with an "if it's not broken, don't fix it" mindset. Fortunately, decentralization is also commercially attractive for generative AI apps: these networks enable cost-efficient, flexible compute usage without long-term contracts. They can provide compute when and where it is needed rather than pushing a one-size-fits-all contract that simply does not fit the ever-changing generative AI landscape.

A Better Future for Generative AI

For generative AI to scale sustainably, developers and industry stakeholders must rethink how they power their life-changing applications. Their responsibility is to embrace decentralized approaches, explore innovative technologies like optical neural networks, and diversify power sources. By reconsidering power infrastructure, the generative AI explosion can be managed to improve our lives while protecting the planet.

About the Author

Wes Levitt

Head of strategy at Theta Network

Wes Levitt is the Amsterdam-based head of strategy for Theta Network, a decentralized blockchain cloud for AI, media and entertainment. Before joining Theta, Wes worked in investment roles in real estate equity and securitized debt. He holds a BS in economics from the University of Oregon and an MBA from UC Berkeley’s Haas School of Business.

