GitHub, Hugging Face: EU’s AI Act Will ‘Threaten’ Open Source

The open source community says the EU AI Act's rules are better suited to closed, private models such as ChatGPT


At a Glance

  • GitHub, Hugging Face and other open source advocates lobby the EU to revise its rules on open source AI models.
  • In a position paper, they said open source foundation models fall under the same rules as closed, private models.
  • They said such uniform treatment would "threaten" to create "impractical barriers" for open source contributors.

The EU AI Act, the most sweeping AI legislation in the world set to become law, will “threaten” the open source AI ecosystem, according to a position paper signed by Hugging Face, GitHub, Creative Commons and other advocates.

They said the Act’s current proposals are meant for “large-scale, commercial and closed models” familiar to most people, such as OpenAI’s ChatGPT and GPT-4, rather than for open source foundation models.

As such, the Act would “threaten to create impractical barriers to and disadvantages for contributors to this open ecosystem,” according to the signatories, which also include LAION, Open Future and EleutherAI.

Hugging Face is the AI company behind StarCoder and Hugging Chat; Microsoft owns the repository platform GitHub; EleutherAI is the non-profit research lab behind GPT-J; German nonprofit LAION built Stable Diffusion’s underlying dataset; and Open Future is a think tank.

The paper, seen by AI Business, acknowledges certain carve-outs made for open source AI models under the Act but also points out other rules the signatories deem problematic. These include the Act’s application to open source foundation models as well as to AI systems tested in real-world conditions.

The rules, they wrote, present “a number of challenges for a foundation model developer without significant financial resources and institutional backing.”



Another requirement the group decries is the obligation to create a quality management system for open source foundation models. They argue that this assumes there will be a final product and staff who know the law and relevant technological practices. However, these are financial and human resources that a “typical volunteer project” will not have.

The group also said the requirement to hire independent auditors would be “costly” and actually “not necessary” to mitigate risks in many foundation models. They argue that open source models can already be vetted by the public, who would do a better job than auditors.

Another point of contention is the Act’s requirement that documentation be kept for 10 years. The group said that with open source projects, it is “unclear” who would bear this responsibility in light of “decentralized development that lacks institutional backing.”

“Open source collaborations are often ad hoc without a formal organization,” the group said. For example, they point out that EleutherAI consisted of a group of volunteers for three years before it became a legal entity.

The group recommends the following actions:


1. Define AI components to make it clear which fall under the Act.

2. Clarify that developers collaborating on the development of open source AI components and making them available in public repositories will not be subject to the Act.

3. Support the EU AI Office’s coordination and inclusive governance with the open source ecosystem. The AI Office monitors implementation of the rules.

4. Allow limited testing of open source models in real-world conditions.

5. Create rules for open source foundation models that are appropriate and different from those for private, closed models.


About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.

Deborah Yao

Editor

Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford grad who has worked at Amazon, Wharton School and Associated Press.
