The European Commission’s GPAI guidelines under the AI Act are here—and they’re about to change how general-purpose AI models are developed, distributed, and regulated in the EU. If you work with large language models, generative AI systems, or provide AI tools to customers in the European Union, these guidelines define the rules you’ll need to follow from 2 August 2025.
This article walks you through the key definitions, obligations, exemptions, and enforcement timelines—so you’re not caught off guard.
📌 Why the GPAI Guidelines Under the AI Act Matter
The GPAI guidelines under the AI Act are a milestone: for the first time, the EU has clarified how it will interpret and enforce the obligations for general-purpose AI (GPAI) providers. This guidance applies to:
- Foundation model developers
- API providers
- Open-source AI distributors
- Enterprises integrating GPAI into downstream tools
- Any company modifying or fine-tuning base models
Some questions your business should consider:
- Does your current AI strategy assume that the rules only apply to high-risk systems, not general-purpose models?
- Are you sure you’re not a GPAI provider under the new definitions?
🧠 What Counts as a General-Purpose AI Model?
The AI Act defines a GPAI model as one that shows “significant generality” and can “competently perform a wide range of distinct tasks.” That sounds vague—but the GPAI guidelines provide a concrete threshold.
A model is presumed to be a GPAI if:
- It was trained using more than 10²³ FLOPs
- It can generate language (as text or audio) or generate images or video from text
This is a significant increase from the originally proposed 10²² FLOPs. According to the EU Commission, 10²³ reflects the typical compute required to train a model with at least 1 billion parameters. However, models trained for narrow purposes—even with high compute—are not considered GPAI. For example, a speech-to-text model trained with 10²⁴ FLOPs is out of scope if it only performs that task.
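The presumption above reduces to a simple test on training compute and output modality. The sketch below is purely illustrative, assuming the 10²³ FLOPs threshold and the narrow-purpose carve-out described in the guidelines; the function and field names are hypothetical, not part of any official tool.

```python
# Illustrative check of the GPAI presumption. The 1e23 FLOPs threshold and the
# narrow-purpose exclusion come from the guidelines; everything else is a sketch.

GPAI_FLOP_THRESHOLD = 1e23  # training compute above which generality is presumed

GENERATIVE_MODALITIES = {"text", "audio", "text-to-image", "text-to-video"}

def presumed_gpai(training_flops: float, output_modalities: set,
                  narrow_single_task: bool = False) -> bool:
    """Return True if the model is presumed to be a GPAI model.

    A model trained for one narrow task (e.g. speech-to-text only) falls
    out of scope even above the compute threshold.
    """
    if narrow_single_task:
        return False
    return (training_flops > GPAI_FLOP_THRESHOLD
            and bool(output_modalities & GENERATIVE_MODALITIES))

# The speech-to-text example from the guidelines: high compute, but out of scope.
print(presumed_gpai(5e23, {"text"}))                           # prints True
print(presumed_gpai(1e24, {"text"}, narrow_single_task=True))  # prints False
```

Note that the presumption is rebuttable in both directions; compute is a proxy, not the final word.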
Some questions that are not addressed by the AI Act GPAI Guidelines:
- Is using FLOPs as a proxy for generality too simplistic?
- Should the EU consider alternative benchmarks, like real-world performance across domains?
👤 Who Is a GPAI Provider?
The GPAI guidelines clarify that you’re a provider if you either:
- Develop a GPAI model yourself; or
- Have it developed and place it on the EU market under your name, whether for free or for payment
It doesn’t matter whether the model is distributed via:
- APIs
- Software libraries
- Public repositories
- Cloud services
- Mobile or web apps
Even if your company is outside the EU, these obligations apply once your model enters the EU market. You’ll need to appoint an EU representative if you’re not based in Europe.
Some questions that are not addressed by the AI Act GPAI Guidelines:
- If a U.S.-based model is used inside an EU-deployed product, who’s the liable provider?
- How will enforcement work for global models accessed by EU users?
📋 Key Obligations for GPAI Providers
Starting 2 August 2025, GPAI providers must:
✅ Maintain up-to-date technical documentation
This must cover the model’s architecture, training, testing, and evaluation procedures (Article 53(1)(a)).
✅ Provide information to downstream users
Especially those integrating the model into their AI systems (Article 53(1)(b)).
✅ Implement a copyright policy
This is to ensure compliance with EU copyright law, including opt-outs under Article 4(3) of Directive 2019/790 (Article 53(1)(c)).
✅ Publish a training data summary
This public summary must outline the content used to train the model (Article 53(1)(d)).
Some questions that still need to be answered:
- How specific must the data summary be to meet compliance?
- Will copyright compliance force developers to remove training data retroactively?
🚨 Additional Rules for GPAI Models with Systemic Risk
Some GPAI models will be subject to stricter obligations due to their potential impact on public safety, rights, or the internal market.
Your model is presumed to have systemic risk if either:
- It was trained with more than 10²⁵ FLOPs; or
- It matches the capabilities of the most advanced models on the market
If so, you must:
- Conduct adversarial testing and model evaluations
- Track and report serious incidents
- Implement robust cybersecurity protections
- Notify the AI Office before and during training
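The systemic-risk presumption can be expressed the same way. In this sketch the 10²⁵ FLOPs threshold comes from the AI Act, while the frontier-capability condition is reduced to a placeholder flag, since that assessment is left to the AI Office rather than to any mechanical test.

```python
# Sketch of the systemic-risk presumption: either the 1e25 FLOPs training
# threshold, or capabilities matching the most advanced models on the market.
# The capability flag is a placeholder; the AI Office makes that call.

SYSTEMIC_RISK_FLOP_THRESHOLD = 1e25

def presumed_systemic_risk(training_flops: float,
                           matches_frontier_capabilities: bool = False) -> bool:
    """Return True if the model is presumed to carry systemic risk."""
    return (training_flops > SYSTEMIC_RISK_FLOP_THRESHOLD
            or matches_frontier_capabilities)

print(presumed_systemic_risk(3e25))   # prints True
print(presumed_systemic_risk(1e24))   # prints False
```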
Some questions that are not addressed by the AI Act GPAI Guidelines:
- Should models with systemic risk be subject to third-party audits?
- How will the Commission keep the FLOPs threshold aligned with rapidly evolving model architectures?
🔧 Fine-Tuning or Modifying Models? You May Be the New Provider
The GPAI guidelines also address downstream modifiers—companies or individuals who adapt a base model (e.g., via fine-tuning, quantization, or distillation).
If your modification uses more than one-third of the compute that trained the original model, you become a provider of a new GPAI model.
This means:
- You are fully subject to the AI Act
- You must comply immediately—no two-year grandfathering
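The one-third-of-compute rule above is straightforward arithmetic, sketched here for illustration. The function name is hypothetical; as noted below, in practice the original training compute may not be disclosed and would have to be estimated.

```python
# Illustrative check of the downstream-modifier rule: a modification that uses
# more than one third of the original training compute makes you the provider
# of a new GPAI model.

def becomes_new_provider(modification_flops: float,
                         original_training_flops: float) -> bool:
    """True if the modifier is treated as the provider of a new GPAI model."""
    return modification_flops > original_training_flops / 3

# Fine-tuning a model originally trained with 1e24 FLOPs:
print(becomes_new_provider(4e23, 1e24))  # above one third -> prints True
print(becomes_new_provider(1e22, 1e24))  # light fine-tune -> prints False
```

Typical fine-tuning runs use a small fraction of the base model's training compute, so most downstream adapters should fall below this line.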
Some questions your business should consider:
- How can downstream actors, including your company, estimate original compute use if it’s not disclosed?
- Will this discourage valuable innovation and experimentation?
🧑‍💻 The Open-Source Model Exception: Not as Open as You Think
Not all open-source models are exempt.
To qualify, your model must:
- Be released under a free and open-source license allowing access, use, modification, and redistribution
- Be non-monetized
- Include public access to model weights, architecture, and usage information
What disqualifies you:
- Restricting usage to research or non-commercial purposes
- Paywalls, ads, or usage fees
- Requiring commercial licenses for scale
Even exempt models must still:
- Comply with copyright rules
- Publish a training data summary
Some questions that are not addressed by the AI Act GPAI Guidelines:
- Will companies be forced to relicense or withdraw open-source models?
- Can the open-source ecosystem survive without sustainable monetization options?
⏳ Grandfathering Clause: Transition Time for Existing Models
If you’ve already placed a GPAI model on the EU market before 2 August 2025, you have until 2 August 2027 to comply.
No need to retrain or “unlearn” the model, provided that either:
- You cannot retrieve the training data; or
- Retraining would impose a disproportionate burden
You must disclose and justify this in your documentation.
Some questions your business should consider:
- What changes to a GPAI model make it a “new” model that can no longer rely on the grandfathering clause?
- Will this clause open the door to selective transparency?
- Could competitors or regulators challenge the scope of these justifications?
📜 GPAI Code of Practice: A Voluntary Path to Compliance
The GPAI Code of Practice, released on 10 July 2025, provides a voluntary route to demonstrate compliance with Articles 53 and 55.
Signing the code offers benefits:
- Regulatory trust and reduced scrutiny
- Potentially lower fines
- Public perception of responsible development
However:
- You must implement the measures—not just sign
- Non-compliance with the code could hurt credibility
Some questions your business should consider:
- Will signing the code become de facto mandatory?
- Should codes of practice eventually be replaced with formal standards?
📆 Key Enforcement Deadlines
- 2 August 2025: GPAI provider obligations apply
- 2 August 2026: Fines and enforcement powers become active
- 2 August 2027: Deadline for legacy models to comply
Fines can reach €15 million or 3% of global turnover—whichever is higher.
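The fine ceiling is the greater of the two figures, which a one-line calculation makes concrete. The amounts are from the AI Act as summarised above; the function itself is purely illustrative.

```python
# Illustration of the GPAI fine ceiling: the higher of EUR 15 million or
# 3% of worldwide annual turnover.

def max_fine_eur(global_turnover_eur: float) -> float:
    """Maximum GPAI fine: the greater of EUR 15M or 3% of global turnover."""
    return max(15_000_000, 0.03 * global_turnover_eur)

print(max_fine_eur(2_000_000_000))  # 3% of EUR 2bn = EUR 60M
print(max_fine_eur(100_000_000))    # 3% is only EUR 3M, so the EUR 15M floor applies
```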
Some questions that are not addressed by the AI Act GPAI Guidelines:
- Will the AI Office prioritize enforcement by sector, scale, or risk?
- How will the EU coordinate with non-EU regulators on cross-border compliance?
🧭 Final Thoughts: GPAI Compliance as a Strategic Advantage
The GPAI guidelines under the AI Act are not just about regulation—they reflect a broader shift toward responsible AI development. Compliance is no longer a back-office legal issue. It’s a strategic lever for trust, differentiation, and investment.
Organizations that prepare now—by mapping their models, documenting training processes, and aligning with the Code of Practice—will not only avoid regulatory pitfalls but position themselves as leaders in the new AI economy.
🗣 Let’s Discuss
- Do you know the compute profile of your AI models?
- Have you mapped which ones may qualify as GPAI?
- Will your current licensing or monetization approach be compliant?
- Are you ready to sign the Code of Practice?
On the same topic, you can read our article “GPAI Code Approved: What It Really Means for AI Compliance in the EU” and the latest issues of our AI law journal Diritto Intelligente.