GitHub, Hugging Face: EU’s AI Act Will ‘Threaten’ Open Source
The open source community says the EU AI Act's rules are better suited to closed, private models such as ChatGPT
At a Glance
- GitHub, Hugging Face and other open source advocates lobby the EU to revise its rules on open source AI models.
- In a position paper, they said open source foundation models fall under the same rules as closed, private models.
- They said such uniform rules "threaten" to create "impractical barriers" for open source contributors.
The EU AI Act, set to become the most sweeping AI legislation in the world, will “threaten” the open source AI ecosystem, according to a position paper signed by Hugging Face, GitHub, Creative Commons and other advocates.
They said the Act’s current proposals are meant for the “large-scale, commercial and closed models” most people are familiar with, such as OpenAI’s ChatGPT and GPT-4, rather than open source foundation models.
As such, the Act would “threaten to create impractical barriers to and disadvantages for contributors to this open ecosystem,” according to the signatories, which also include LAION, Open Future and EleutherAI.
Hugging Face is the AI company behind StarCoder and HuggingChat; Microsoft owns the repository platform GitHub; EleutherAI is the non-profit research lab behind GPT-J; German nonprofit LAION built Stable Diffusion’s underlying dataset; and Open Future is a think tank.
The paper, seen by AI Business, acknowledges certain carve-outs made for open source AI models under the Act but also points out other rules the signatories deem problematic. These include the Act’s application to open source foundation models and to AI systems tested in real-world conditions.
The rules present “a number of challenges for a foundation model developer without significant financial resources and institutional backing,” the paper said.
Another requirement the group decries is the obligation to create a quality management system for open source foundation models. They argue this assumes there will be a final product, along with staff versed in the law and the relevant technological practices. These are financial and human resources that a “typical volunteer project” will not have.
The group also said the requirement to hire independent auditors would be “costly” and “not necessary” to mitigate risks in many foundation models. They argue that open source models can already be vetted by the public, who would do a better job than auditors.
Another point of contention is the Act’s requirement that documentation be kept for 10 years. With open source projects, the group said, it is “unclear” who would bear this responsibility given “decentralized development that lacks institutional backing.”
“Open source collaborations are often ad hoc without a formal organization,” the group said. They pointed out, for example, that EleutherAI operated as a group of volunteers for three years before it became a legal entity.
The group recommends the following actions:
1. Define AI components to make it clear which fall under the Act.
2. Clarify that developers who collaborate on open source AI components and make them available in public repositories will not be subject to the Act.
3. Support coordination and inclusive governance between the EU AI Office, which monitors implementation of the rules, and the open source ecosystem.
4. Allow limited testing of open source models in real-world conditions.
5. Create rules appropriate for open source foundation models, distinct from those for private, closed models.