The new guidelines aim to enhance national security, economic strength
The Biden-Harris Administration has released new guidelines governing the licensing of advanced chips and AI systems while strengthening AI security standards.
The Interim Final Rule on Artificial Intelligence Diffusion released Monday builds on previous chip controls by “thwarting smuggling, closing other loopholes and raising AI security standards,” according to a White House statement.
“To enhance U.S. national security and economic strength, it is essential that we do not offshore this critical technology and that the world’s AI runs on American rails,” the statement said. “It is important to work with AI companies and foreign governments to put in place critical security and trust standards as they build out their AI ecosystems.”
The rule builds on previous regulations aimed at protecting national security, including an August 2023 executive order limiting U.S. investments in Chinese companies involved in “national security sensitive technologies,” referring to certain AI systems, semiconductors and microelectronics, and quantum information technologies. It also follows 2022 restrictions imposed on exports of AI-focused semiconductors and related equipment to China.
The rule outlines six key actions to ensure U.S. technology is shared responsibly:
No restrictions apply to chip sales to 18 key allies and partners, allowing for seamless large-scale purchases.
Chip orders with collective computational power up to roughly 1,700 advanced GPUs do not require a license and do not count against national chip caps.
Entities that meet high security and trust standards and are headquartered in close allies and partners can obtain highly trusted “Universal Verified End User” (UVEU) status.
Entities that meet the same security requirements and are headquartered anywhere that is not a country of concern can apply for “National Verified End User” status, allowing them to purchase computational power equivalent to up to 320,000 advanced GPUs over the next two years.
Non-VEU entities located outside of close allies can still purchase large amounts of computational power, up to the equivalent of 50,000 advanced GPUs per country, a cap the White House said ensures that U.S. technology is available to serve foreign governments, health care providers and other local businesses.
Governments can sign government-to-government arrangements to work together to cultivate an international ecosystem of shared values regarding the development, deployment and use of AI.
The rule also takes significant steps to prevent “countries of concern” from accessing advanced AI systems and the computing power needed to train them. These steps include:
Ensuring that advanced semiconductors sold abroad are not used by countries of concern to train advanced AI systems, while still allowing access for general-purpose applications.
Restricting the transfer of model weights for advanced, closed-weight models to untrusted parties.
Setting security standards to protect the weights of advanced closed-weight AI models so they can be stored and used securely worldwide.