One Line
Foundation model providers are not fully compliant with the requirements of the draft EU AI Act, particularly in areas such as copyright, energy, risk, and evaluation, highlighting the need for more transparency and accountability in the ecosystem.
Key Points
- Compliance with the draft EU AI Act is lacking among foundation model providers, particularly in areas such as copyright, energy, risk, and evaluation.
- Transparency is currently lacking in the foundation model ecosystem and industry standards should be set to improve it.
- Policymakers should prioritize transparency and make it a foundation of how the technology is governed.
- Technical expertise on foundation models is necessary for effective enforcement of the AI Act.
- Many organizations receive poor scores in areas such as copyrighted data, compute/energy, risk mitigation, and evaluation/testing.
- The EU AI Act is an important regulatory initiative globally and will set a precedent for AI regulation worldwide.
Summary
The assessment covered 10 major foundation model providers and their flagship models. Because the draft EU AI Act's high-level obligations for foundation model providers lack precise interpretation and enforcement guidelines, compliance was scored using rubrics. The rubrics focus on transparency and cover requirements related to data resources, compute resources, the model itself, and deployment practices. The final scores are presented in Figure 1.

The central finding is that foundation model providers are not fully compliant with the draft Act's requirements. Compliance varies widely: some providers score less than 25%, and only one scores at least 75%. Many organizations receive poor scores in four areas in particular: copyrighted data, compute/energy, risk mitigation, and evaluation/testing. Providers generally do not disclose the copyright status of their training data, their hardware usage and emissions, or how their models are evaluated and tested; reporting in these areas is inconsistent, evaluation standards for foundation models are lacking, and risk mitigation disclosure is inadequate. There is also a clear divide between release strategies: open releases comply better with resource disclosure requirements, while restricted/closed releases perform better on deployment-related requirements.

Transparency is currently lacking in the foundation model ecosystem. The researchers recommend that providers work together to set industry standards that improve transparency, and that policymakers prioritize transparency as a foundation of the Act. Implementation should also consider additional factors such as disclosure of usage patterns and AI risk management, and effective enforcement will require technical expertise on foundation models. Recommendations are directed at EU policymakers, global policymakers, and foundation model providers. Compliance with the Act is feasible given sufficient incentives, even minor improvements are within providers' reach, and increased disclosure would bring positive change to the ecosystem. The EU AI Act is the most important regulatory initiative on AI globally and will set a precedent for AI regulation around the world.