The new EDPB Guidelines 3/2025 on the interplay between the Digital Services Act (DSA) and the GDPR provide a comprehensive roadmap for aligning platform accountability with personal data protection. They clarify how Regulation (EU) 2022/2065 (DSA) and the General Data Protection Regulation (GDPR) must work together to ensure consistent enforcement, lawful data processing, and a unified model of digital governance across the EU.
Regulation (EU) 2022/2065 on digital services (hereinafter, DSA) and the General Data Protection Regulation (EU) 2016/679 (hereinafter, GDPR) together redefine the responsibilities of online intermediaries and the protection of personal data in the digital economy.
With the adoption of Guidelines 3/2025 on the interplay between the Digital Services Act and the GDPR (hereinafter, the Guidelines), the European Data Protection Board (hereinafter, EDPB) has undertaken a systematic interpretive effort aimed at clarifying how the two regulations should interact in a consistent and mutually integrated manner.
The stated objective is twofold:
- to ensure that the provisions of the DSA involving the processing of personal data are applied in accordance with the GDPR; and,
- at the same time, to ensure that the implementation of the GDPR takes into account the new liability dynamics introduced by the DSA.
The Guidelines also emphasize the need for structured cooperation between Digital Services Coordinators, Data Protection Authorities, and the European Commission, which is essential to prevent overlapping competences, interpretative misalignments, and risks of infringing the ne bis in idem principle in the application of the two regulatory regimes.
In this perspective, the EDPB does not limit itself to providing technical clarifications, but outlines a model of integrated digital governance, in which the protection of personal data, the transparency of decision-making processes, and the responsibility of platforms converge towards a single architecture of trust and accountability, destined to become the hallmark of European regulation of online services.
DSA and GDPR as two sides of the same coin
The DSA and the GDPR are two pillars of European digital law: the former aims to build a safe and transparent online space, while the latter ensures that all personal data processing activities comply with the principles of lawfulness, proportionality, and data minimization. In force since February 17, 2024, the DSA applies to all online intermediary service providers, imposing enhanced obligations on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).
The Guidelines clarify that the DSA is not lex specialis with respect to the GDPR: the two regulations are of equal rank and must be applied consistently. Article 2(4)(g) and recital 10 DSA confirm that the protection of personal data remains governed by the GDPR and the ePrivacy Directive. Any DSA obligation involving data processing must therefore comply with the GDPR; emblematic in this regard are Article 26(3) DSA, which prohibits advertising based on profiling using special categories of data, and Article 28(2) DSA, which refers to the definition of profiling in Article 4(4) GDPR.
Supervision is entrusted to the Digital Services Coordinators (DSCs) in the Member States and to the European Commission for VLOPs and VLOSEs, assisted by the European Board for Digital Services (EBDS). The Guidelines promote structured cooperation between the EBDS and the EDPB, based on the principle of sincere cooperation, to avoid duplication and conflicts of interpretation. In this balance, the DSA structures the responsibility and transparency of platforms, while the GDPR continues to protect fundamental rights: two complementary instruments of the same architecture of trust in the European digital ecosystem.
Voluntary detection of illegal content: balancing platform responsibility and GDPR principles
Article 7 of the DSA allows intermediary service providers – including hosting services, online platforms, and VLOPs – to voluntarily identify and remove illegal content without losing the exemptions from liability referred to in Articles 4–6, provided that they fully comply with the GDPR and its principles of lawfulness, fairness, transparency, and proportionality. Detection techniques based on machine learning or automatic recognition involve potentially invasive processing: the EDPB warns that they may amount to systematic monitoring, with risks of errors, bias, and undue restrictions on freedom of expression, and requires effective human oversight to ensure the proportionality and legitimacy of the intervention.
The Guidelines distinguish between two scenarios:
- if the provider acts on its own initiative, the legal basis is Article 6(1)(f) GDPR, i.e. the controller's legitimate interest, applicable only if the processing is necessary and proportionate and the rights of the data subjects do not prevail, subject to documentation of the balancing test and clear information pursuant to Articles 13 and 14 GDPR;
- if the processing arises from a legal obligation, the basis is Article 6(1)(c) GDPR, subject to clear and proportionate rules and verification of compliance with Articles 9 and 10 DSA. Recital 56 confirms that Article 7 does not constitute an autonomous legal basis for profiling or for detecting criminal offences.
Automated decisions, for their part, require real human control, as merely formal supervision is not sufficient to overcome the prohibition in Article 22(1) GDPR. Articles 14 and 17 DSA reinforce transparency by requiring a statement of reasons for any limitation or removal of content, specifying whether the intervention is automated or voluntary. Since these activities meet high-risk criteria (profiling, systematic monitoring, automated decisions), providers must conduct a Data Protection Impact Assessment (hereinafter, DPIA) pursuant to Article 35 GDPR and, if necessary, consult the supervisory authority under Article 36 GDPR. Article 7 DSA thus becomes a test bed for digital due diligence: it promotes security and trust only if platforms fully comply with the guarantees of the GDPR.
Reports, complaints, and abuse: data processing in “notice and action” mechanisms
Article 16 DSA requires hosting providers to set up accessible and transparent electronic mechanisms for reporting illegal content (notice and action), including through trusted flaggers. These processes involve the processing of the personal data of the notifier, the recipient, and any third parties, for which the provider acts as data controller and must comply fully with the GDPR.
- The data collected – ideally only name and email address – must be limited to what is necessary, in line with Article 5(1)(c) of the GDPR, ensuring the possibility of anonymous reports unless identity is essential, as in the case of copyright infringements.
- The identity of the notifier may only be disclosed if strictly necessary and after providing information in accordance with Article 13 of the GDPR.
- Article 16(6) DSA allows the use of automated systems for managing reports, but decisions with legal effects must include effective human intervention and comply with the safeguards in Article 22 GDPR.
- Article 17 DSA reinforces transparency by requiring a statement of reasons for any removal or restriction of content, specifying whether the intervention was automated.
- Articles 20 and 23 of the DSA complete the framework: the former recognizes the right of complainants and recipients to lodge a complaint, which cannot be decided solely by algorithm; the latter allows for temporary suspension for abuse, provided that it is based on accurate data, proportionate, and complies with the principles of minimization and fairness.
Dark patterns and the right to digital self-determination
With Article 25, the DSA introduces a key principle: digital interfaces must be designed in such a way that they do not compromise users’ ability to make autonomous and informed decisions. Ethical design thus becomes a legal parameter, aimed at preventing behavioral manipulation that affects freedom of choice and the protection of personal data.
The Guidelines define deceptive design patterns (also known as dark patterns) as interface patterns that induce unwanted choices, often to facilitate data collection or prolong online interaction. When such practices reduce user awareness or alter consent, they violate the principles of lawfulness, fairness, and transparency of the GDPR referred to in Article 5(1)(a), making the processing unlawful ab origine, regardless of formal consent.
The prohibition in Article 25(2) DSA is coordinated with the GDPR and the Unfair Commercial Practices Directive, extending protection even to cases where manipulation does not involve the processing of personal data. When, on the other hand, design influences the collection or use of data, competence lies with the data protection authorities, which must assess its compliance with the principles of lawfulness, minimization, and privacy by design. Particular attention is paid to addictive design (infinite scrolling, autoplay, gamification), which exploits psychological levers to prolong the use of platforms and, since it relies on the analysis of behavioral data, falls fully within the scope of the GDPR.
Advertising, profiling, and recommendation algorithms
The DSA makes transparency a structural principle of the new digital economy: Articles 26, 27, and 38 outline a system aimed at making the algorithmic logic behind advertising and content recommendations understandable, controllable, and compliant with the GDPR.
- Article 26 DSA requires platform providers to indicate, for each advertisement, the identity of the advertiser, the targeting parameters, and how to modify them, introducing operational transparency that complements Articles 13 and 14 of the GDPR. The requirements of lawfulness, consent, and the right to object under the GDPR and the ePrivacy Directive remain unchanged. The DSA also establishes an absolute ban on advertising based on profiling using special categories of data under Article 26(3) DSA, even with explicit consent, removing sensitive data from market logic and protecting digital dignity. Security and confidentiality measures under Article 26(4) DSA must ensure the minimization of information flows and the effective application of the principle of privacy by design.
- Article 27 DSA extends these obligations to recommendation systems, requiring the criteria determining the priority of content to be specified in the terms of service and allowing users to modify them. When such systems are based on personal data, they constitute a form of profiling within the meaning of Article 4(4) GDPR, subject to the safeguards of Articles 5, 12–14, and 22, which ensure transparency on the logic and effects of the process.
- Finally, Article 38 DSA requires VLOPs and VLOSEs to offer at least one recommender option that is not based on profiling, presented without nudging and without collecting data for predictive purposes, implementing Article 25 GDPR. Taken together, these articles outline a model of informational accountability based on algorithmic transparency as a tool for digital sovereignty, balancing freedom of choice, data protection, and trust in the online environment.
Protecting minors in the digital ecosystem
The protection of minors is one of the cornerstones of the DSA, which requires providers of platforms accessible to minors to ensure a high level of privacy, safety, and security through technical and organizational measures proportionate to the risks. Article 28 introduces a principle of proactive responsibility: platforms must prevent and mitigate the risks arising from their services – such as exposure to harmful content, undue data collection, or addictive design practices – through tools such as technical standards, codes of conduct, or parental controls, in accordance with the principles of necessity and minimization. Article 28(3) excludes any obligation to process additional data to verify age, avoiding systematic identification practices. Paragraph 2 prohibits advertising based on profiling when the platform is aware "with reasonable certainty" that the user is a minor, in line with Article 9(1) GDPR and Article 26(3) DSA, to prevent the exploitation of cognitive vulnerability and protect digital dignity without imposing new data collection.
When protective measures involve data processing, the legal basis may derive from Article 6(1)(c) of the GDPR, provided that the processing is strictly necessary and proportionate; the use of biometric data or special categories of data is excluded, unless one of the exceptions in Article 9(2) GDPR applies. The EDPB recommends privacy-preserving age assurance systems – such as self-certification, parental confirmation, or zero-knowledge proofs – that allow for non-invasive verification without data retention. Compliance with Article 28 is thus achieved through data protection by design and by default, preventing the protection of minors from turning into new profiling risks. For VLOPs, these obligations are integrated with the systemic risk assessment and mitigation obligations provided for in Articles 34 and 35 of the DSA.
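To make the data-minimization idea concrete, here is a minimal sketch in Python of a signed age attestation, a deliberately simplified stand-in for the zero-knowledge approaches the EDPB mentions. Everything in it (the provider, the key, the function names) is hypothetical: the point is only that the platform receives a single boolean claim and never sees or stores a birth date or identity document.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical shared secret between the platform and a trusted
# age-attestation provider (a real deployment would use asymmetric
# signatures or a genuine zero-knowledge protocol instead).
PROVIDER_KEY = b"demo-secret-key"

def issue_attestation(over_18: bool) -> str:
    """Provider side: sign a minimal claim with no birth date and no identity."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claim).decode() + "." + sig

def verify_attestation(token: str) -> bool:
    """Platform side: check the signature, read one boolean, retain nothing."""
    payload, sig = token.rsplit(".", 1)
    claim = base64.urlsafe_b64decode(payload)
    expected = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or corrupted attestation is rejected
    return json.loads(claim).get("over_18", False)

token = issue_attestation(True)
print(verify_attestation(token))  # True, yet the platform learned only a boolean
```

A genuine zero-knowledge scheme would go further, preventing the platform from even linking two attestations to the same user; the sketch merely illustrates the "one boolean, no retention" logic that the Guidelines attach to Article 28.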
Governing risk and building trust
Articles 34 and 35 of the DSA introduce a key principle of the new digital governance: VLOPs and VLOSEs must identify, assess, and mitigate the systemic risks generated by their services. This is a paradigm shift: from reactive to proactive responsibility, based on the prevention of the collective impacts that algorithmic architectures can have on society, democracy, and fundamental rights. The DSA identifies various categories of risk – dissemination of illegal content, negative effects on fundamental rights, threats to health and safety, gender-based violence, damage to mental and physical well-being – making “systemic risk” a structural responsibility.
Since such risks often arise from the processing of personal data, their management must comply with the GDPR principles of lawfulness, transparency, and proportionality. The link between the two regulations is clear: the DPIA under Article 35 GDPR complements the assessment of systemic risks under Article 34 DSA, becoming indispensable when the social impact arises from data processing. Article 35 DSA requires reasonable, proportionate, and effective measures, consistent with privacy by design under Article 25 GDPR and security of processing under Article 32 GDPR, including periodic testing, algorithm review, and the protection of minors under Article 35(1)(j) DSA through age assurance and parental controls. Articles 45–47 DSA complete the picture: codes of conduct, in parallel with Article 40 GDPR, promote cooperation between authorities, businesses, and civil society, translating legal obligations into verifiable standards based on periodic reporting; Article 46 strengthens transparency in the advertising sector, while Article 47 introduces a principle of universal and inclusive accessibility of digital interfaces.
Looking ahead, the interplay between the DSA and the GDPR is set to become the cornerstone of a new era of European digital governance, one in which transparency, accountability, and the protection of fundamental rights are no longer parallel objectives but mutually reinforcing pillars. As enforcement actions begin and cooperation between Digital Services Coordinators, Data Protection Authorities, and the European Commission evolves, companies operating online will face increasing pressure to embed privacy, safety, and algorithmic transparency into the very architecture of their platforms.
The EDPB’s Guidelines 3/2025 are not just a technical document—they signal the transition towards a harmonized digital ecosystem where compliance, innovation, and trust must coexist. Those who adapt early will not only reduce regulatory risk but also strengthen their position in a market that rewards ethical design and responsible data governance.
On a similar topic, you can read the article “Why Meta Can’t Give Up Fact Checking in Europe with the DSA”.

