Digital Omnibus: pseudonymisation, artificial intelligence, data breach and ePrivacy under review by the EDPB and the EDPS

On 10 February 2026, the European Data Protection Board (“EDPB”) and the European Data Protection Supervisor (“EDPS”) published Joint Opinion 2/2026 on the proposed regulation known as the “Digital Omnibus”, through which the European Commission—on the basis of proposal COM(2025) 836 final of 19 November 2025—intends to streamline the Union’s digital regulatory framework by amending a broad and heterogeneous corpus of key instruments: Regulation (EU) 2016/679 (“GDPR”), Regulation (EU) 2018/1725 (“EUDPR”), the Data Act, the Data Governance Act, the ePrivacy Directive, the NIS 2 Directive, the Single Digital Gateway Regulation, and other instruments within the European digital ecosystem. The stated objective is to reduce administrative burdens for businesses and public administrations, improve consistency across digital regulations, and foster the development of the data economy, in line with the Commission’s most recent priorities on European competitiveness.

In their opinion, the EDPB and the EDPS welcome certain simplification and harmonisation measures, recognising the positive value of initiatives aimed at facilitating organisations’ compliance with the GDPR, strengthening legal certainty, and promoting responsible innovation. The joint opinion follows the path already set by the Helsinki Statement on Enhanced Clarity, Support and Engagement adopted by the EDPB on 2 July 2025, in which the Board committed to developing practical tools—including ready-to-use templates for businesses, common templates for data breach notifications, and accessible guidance—in order to reduce the burden of compliance, particularly for micro, small and medium-sized enterprises.

At the same time, however, the two authorities express significant concerns about certain proposed amendments—in particular with regard to the definition of personal data and the processing of data in the context of artificial intelligence—which are considered likely to have a material impact on the current level of protection guaranteed by the GDPR. Four areas of particular interest emerge from Joint Opinion 2/2026 and will be analysed in this piece: the rules on pseudonymisation, the processing of data within artificial intelligence systems, data breach notifications, and amendments to the ePrivacy framework.

Pseudonymisation and definition of personal data

One of the most contentious aspects of the Digital Omnibus proposal concerns the amendment to the definition of personal data contained in Article 4(1) GDPR. The Commission proposes introducing a new paragraph stating that information does not necessarily constitute personal data for every party that processes it, but only for those entities that have means reasonably likely to be used to identify the data subject. In the Commission’s view, the amendment aims to “codify” the case law of the Court of Justice of the European Union (“CJEU”) developed on pseudonymisation, with particular reference to the judgment of 4 September 2025 in EDPS v SRB (C‑413/23 P, ECLI:EU:C:2025:645), as well as earlier rulings—including the judgment of 9 November 2023 in Gesamtverband Autoteile-Handel e.V. v Scania CV AB (C‑319/22)—which had already outlined a relative and contextual approach to the notion of personal data.

The crux of the proposed amendment lies in the last sentence of the new text, which specifies that information does not become personal for a given entity merely because a subsequent recipient may have the means to identify the data subject. According to the EDPB and the EDPS, this negatively framed wording does not faithfully reflect the CJEU’s case law, which has consistently reaffirmed that data that would otherwise be non-personal can “become” personal data for the entity that makes them available to a recipient equipped with means reasonably likely to be used for identification. Indeed, in EDPS v SRB, the CJEU confirmed that, in such circumstances, the data are personal both for the recipient and, indirectly, for the entity that transmitted them. The Commission’s proposal therefore overlooks a fundamental element of the European jurisprudential framework.

The operational risk for organisations

The definition of personal data is the linchpin of the entire European data protection system, being directly referenced by Article 8 of the Charter of Fundamental Rights of the European Union and Article 16 TFEU. Any narrowing of the GDPR’s scope of application—whether through a legislative amendment or, even more so, through implementing acts of the Commission—could prompt controllers to artificially structure their data governance architectures in order to exclude certain activities from the framework’s reach: a concrete risk strongly emphasised in Joint Opinion 2/2026, paragraphs 16 and 17.

In practical terms, many artificial intelligence, advanced data analytics and data-sharing architectures rely on pseudonymisation techniques. Legal uncertainty regarding the classification of pseudonymised data could translate into concrete difficulties for compliance officers, DPOs and legal advisers tasked with assessing whether and to what extent the GDPR applies to specific processing operations. The EDPB is, moreover, working on new guidelines on pseudonymisation (Guidelines 01/2025, under public consultation) and on anonymisation, which will also take into account the evolution of the case law.
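To make the technique concrete: pseudonymisation typically replaces a direct identifier with a key-dependent token, so that re-identification requires access to the key. The sketch below (a minimal illustration using keyed hashing, not a description of any specific architecture discussed in the Proposal) also shows why the classification debate matters: for whoever holds the key, the output generally remains personal data under the GDPR.

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a direct identifier via HMAC-SHA256.

    Whoever holds secret_key retains the means to re-link pseudonyms to
    identifiers, so for that party the output remains personal data.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key must be stored separately from the pseudonymised dataset.
key = b"keep-this-separate-from-the-dataset"
p1 = pseudonymise("alice@example.com", key)
p2 = pseudonymise("alice@example.com", key)

assert p1 == p2       # deterministic: the same person maps to the same token,
                      # which is what allows joins and longitudinal analytics
assert len(p1) == 64  # SHA-256 hex digest
```

Determinism is precisely what makes pseudonymised datasets useful for analytics and AI training pipelines, and precisely why a recipient equipped with the key, or with auxiliary data, may still identify the data subject.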

A further critical aspect concerns the proposal to introduce a new Article 41a GDPR, which would grant the Commission the power to adopt implementing acts to specify the means and criteria by which to determine whether data resulting from pseudonymisation no longer constitute personal data for certain entities. According to the joint opinion, this choice is open to a threefold criticism: first, it could—in practice—affect the material scope of application of the GDPR by means of an instrument hierarchically inferior to legislation, thereby bypassing the ordinary legislative procedure; second, entrusting the Commission itself with defining the criteria for the application of the GDPR appears to conflict with the independence of supervisory authorities guaranteed by Article 8(3) of the Charter; third, the provision could generate new interpretative uncertainty, rather than reduce it, since the practical consequences of the implementing acts—whether they establish a rebuttable presumption of non-identifiability or serve merely as one factor among others—remain entirely indeterminate.

For all these reasons, the EDPB and the EDPS strongly urge the co-legislators not to adopt the proposed amendments to the definition of personal data and to remove the proposed Article 41a GDPR from the Proposal, considering that the interpretative issues raised by the evolution of the case law can be better addressed through further EDPB guidance that takes into account the CJEU’s entire body of case law, rather than through a partial and potentially misleading legislative amendment.

Artificial intelligence and legal bases for processing

Joint Opinion 2/2026 then addresses the delicate relationship between the GDPR and the development of artificial intelligence systems, an issue that has taken on growing urgency on the European regulatory agenda following the entry into force of the AI Act (Regulation (EU) 2024/1689). The Digital Omnibus proposal introduces an explicit provision—the proposed Article 88c GDPR—under which the processing of personal data in the context of the development and operation of artificial intelligence systems or models may be based on the controller’s legitimate interests as a legal basis under Article 6(1)(f) GDPR.

The EDPB and the EDPS note, first, that reliance on legitimate interests in this context is already possible under the existing legal framework, as the EDPB has explicitly confirmed in its Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models, adopted on 17 December 2024. The introduction of a specific provision in the text of the GDPR therefore does not appear to provide meaningful legal clarification; on the contrary, it risks creating the impression that legitimate interests constitute a “privileged” legal basis for AI-related processing, thereby weakening individual safeguards.

On the substantive conditions, the opinion recalls the need for the controller in all cases to conduct a rigorous three-step test, as codified in the EDPB Guidelines 1/2024 on processing of personal data based on Article 6(1)(f) GDPR: (i) the existence of a legitimate interest on the part of the controller or a third party; (ii) the necessity of the processing for the pursuit of that interest; (iii) whether the data subject’s fundamental rights and freedoms prevail, as the outcome of a case-by-case balancing exercise. This assessment cannot be compressed or generalised merely because the processing falls within the domain of artificial intelligence.

The right to object in the AI context

The proposed Article 88c GDPR introduces, in its second paragraph, an “unconditional right to object” as a risk-mitigation measure for data subjects. The EDPB and the EDPS welcome this measure but recommend inserting this right into Article 21 GDPR—rather than as a standalone provision—and clarifying that it must be brought to data subjects’ attention in advance, before processing begins, taking into account the technical difficulties of removing data from AI systems that have already been trained. The effectiveness of the right to object ultimately depends on the practical ability to cease or limit the processing and to delete data that may already be embedded in the model’s parameters.

A second issue of great practical significance concerns the proposal to introduce a derogation—through the new Article 9(2)(k) GDPR—for the incidental and residual processing of special categories of personal data during the development and training of artificial intelligence systems or models. The need this derogation seeks to address is real: when collecting data to train large-scale models—such as general-purpose AI models—it is not always technically possible to prevent the dataset from containing information relating to data subjects’ health, sexual orientation, or political or religious beliefs, which would qualify as special categories under Article 9 GDPR.

The authorities therefore acknowledge the legitimacy of the objective pursued by the Proposal; however, they put forward a series of improvements aimed at strictly circumscribing the derogation. First, the reference to “incidental and residual” processing should be expressly set out in the operative text of Article 9(2)(k) GDPR, and not only in the recital, to avoid expansive interpretations that would end up covering deliberate processing of sensitive data. Second, the proposed Article 9(5) GDPR should explicitly provide, as a precondition for the applicability of the derogation, that erasure of the data in question is impossible or would involve disproportionate efforts, and that such assessment is properly documented. Third, the protective safeguards should extend to the entire lifecycle of the AI system, including the deployment phase, in order to prevent not only unintentional collection but also the reuse of the data for different purposes.
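In operational terms, keeping the derogation “incidental and residual” presupposes that controllers actively screen training data for special-category content and document what is held back. The fragment below is a deliberately simplistic illustration of such a screening step: the keyword patterns are hypothetical and non-exhaustive, and production pipelines would rely on trained classifiers and human review rather than regex lists.

```python
import re

# Hypothetical, non-exhaustive screening patterns for illustration only;
# real systems use trained classifiers, not keyword lists.
SENSITIVE_PATTERNS = [
    re.compile(r"\b(diagnosed with|hiv[- ]positive)\b", re.IGNORECASE),
    re.compile(r"\b(votes? for|member of) the [A-Z]\w+ party\b"),
]

def screen_record(text: str) -> bool:
    """Return True if the record should be held back for review or erasure
    because it may contain special-category data (Article 9 GDPR)."""
    return any(p.search(text) for p in SENSITIVE_PATTERNS)

dataset = [
    "Customer asked about delivery times.",
    "User mentioned being diagnosed with diabetes.",
]
# Records flagged here would feed the documented erasure/retention assessment
# that the EDPB and the EDPS want as a precondition for the derogation.
flagged = [t for t in dataset if screen_record(t)]
```

Documenting which records were flagged, and why erasure was or was not feasible, is exactly the kind of evidence the proposed Article 9(5) GDPR assessment would need to capture.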

Finally, the opinion highlights the need to coordinate the derogation proposed in Article 9(2)(k) GDPR with the new Article 4a of the AI Act—introduced by the related AI Omnibus Proposal—which provides a distinct regime for the processing of special categories of data that are deliberately collected solely for the purpose of detecting and correcting bias. The two regimes are not interchangeable and require a clear normative delineation to avoid uncertainty in application, especially for AI system providers that straddle both frameworks.

Data breach: towards greater harmonisation of notifications

Among the measures most positively received in Joint Opinion 2/2026 are the proposals to harmonise and simplify the procedures for notifying personal data breaches, as governed by Articles 33 and 34 GDPR. The EDPB and the EDPS structure their analysis along three distinct lines: the notification threshold, the standardisation of operational tools, and the introduction of a Single Entry Point (“SEP”) at the European level.

With regard to the threshold for notification to supervisory authorities, the Commission proposes raising the trigger for the obligation, limiting it to breaches that present a high risk to the rights and freedoms of data subjects, in line with the threshold already provided for notification to data subjects under the current Article 34 GDPR. The EDPB and the EDPS welcome this change: on the one hand, it is not expected to significantly undermine the protection of data subjects, given the continuing obligation to internally document any breach under Article 33(5) GDPR; on the other hand, it would allow supervisory authorities to focus their resources—which are already under increasing pressure, as evidenced by data from the Danish authority, which received 9,302 notifications in 2025, and the Irish authority, which handled 7,781 in 2024—on breaches that are genuinely high-risk.

As to notification deadlines, the Proposal extends the period within which controllers must notify the competent authority of a breach from 72 to 96 hours. The EDPB and the EDPS support this extension, considering that the current 72-hour deadline is often difficult to reconcile with weekends and public holidays, particularly for smaller organisations. The longer deadline is expected to improve the quality of notifications, enabling controllers to gather more complete information and to implement remedial measures even before submission. The opinion nevertheless notes, with concern, the need for closer alignment with the shorter deadlines provided by other European legislative instruments: NIS 2 and DORA require notification within 24 or 72 hours depending on the reporting stage, while eIDAS and the CER Directive require notification within 24 hours. This fragmentation risks complicating incident management for organisations subject to multiple regulatory regimes.
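The practical effect of the proposed extension is easy to see in the calendar: 96 hours is exactly four days, so a deadline triggered on a Friday evening spans the weekend with a working day to spare. A minimal sketch of the arithmetic (the 96-hour figure is the proposed window; the current Article 33 GDPR sets 72 hours):

```python
from datetime import datetime, timedelta, timezone

# Proposed window under the Digital Omnibus; current Article 33 GDPR: 72 hours.
NOTIFICATION_WINDOW = timedelta(hours=96)

def notification_deadline(awareness: datetime) -> datetime:
    """The clock runs from the moment the controller becomes aware of the breach."""
    return awareness + NOTIFICATION_WINDOW

# A breach discovered on Friday 6 March 2026 at 17:30 UTC would be
# notifiable by Tuesday 10 March at 17:30 UTC.
aware = datetime(2026, 3, 6, 17, 30, tzinfo=timezone.utc)
deadline = notification_deadline(aware)
assert deadline == datetime(2026, 3, 10, 17, 30, tzinfo=timezone.utc)
```

Organisations subject to NIS 2 or DORA would still need to track the shorter 24- and 72-hour clocks in parallel, which is the fragmentation the opinion flags.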

The Single Entry Point and the EDPB’s role in standardisation

The Proposal provides for the establishment of a SEP for the notification of personal data breaches, in application of Article 23a of the NIS 2 Directive (2022/2555). The SEP at European level is strongly welcomed by the EDPB and the EDPS, who view it as a tool capable of reducing the administrative burden on organisations operating in multiple Member States without diminishing the level of protection for data subjects. The authorities nevertheless emphasise the need to ensure the security of communications transmitted via the SEP, given the sensitivity of the data contained in breach notifications.

In parallel, the Proposal assigns to the EDPB the preparation of a common template for data breach notifications and a list of circumstances in which a breach is likely to result in a high risk, with the possibility for the Commission to amend these instruments through implementing acts. The EDPB and the EDPS consider this attribution of powers to the Commission inappropriate: the template and the list of circumstances should be prepared and approved exclusively by the EDPB, similarly to the approval power over certification criteria already recognised by Article 42(5) GDPR, in order to ensure independence and consistency in the interpretation of the rules.

In the same spirit of harmonisation, the Proposal introduces the creation of EEA-wide common lists of processing activities for which a data protection impact assessment (“DPIA”) is mandatory or not mandatory, as well as a common template and methodology for conducting DPIAs. The EDPB and the EDPS welcome these initiatives which—consistent with the Helsinki Statement—will help reduce fragmentation among national authorities and ease the compliance burden on businesses. They nevertheless recommend that the notion of “methodology” for the DPIA be understood broadly and practically—as a structured process and set of application principles, rather than a mere checklist—in order to preserve the necessary flexibility given the variety of processing contexts.

ePrivacy and cookies: towards overcoming consent fatigue?

Lastly, Joint Opinion 2/2026 addresses the proposed amendments to the ePrivacy Directive (2002/58/EC), with particular reference to the protection of information stored on users’ terminals—so‑called cookies and similar technologies. This is an area in which the need for reform is widely acknowledged: the proliferation of cookie banners and consent requests has not, in practice, strengthened user awareness but has instead fuelled the phenomenon of “consent fatigue,” with paradoxically regressive effects on the quality of the consent obtained.

The Digital Omnibus proposal tackles this terrain with a twofold approach: on the one hand, it introduces new provisions—the proposed Article 88a GDPR—that establish exceptions to the consent requirement for accessing and storing information on users’ terminals; on the other, it provides for automated, machine‑readable mechanisms for expressing users’ preferences—the proposed Article 88b GDPR. The EDPB and the EDPS strongly support the objective underlying both provisions: simplifying consent management, reducing the burden on both users and controllers, and promoting more genuine and informed choices.
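The proposed Article 88b GDPR does not prescribe a specific protocol for machine-readable preference signals. As a purely illustrative sketch of how such a signal might be honoured server-side, the example below checks the Global Privacy Control header (`Sec-GPC`), an existing browser-based opt-out signal; treating it as the Article 88b mechanism is this author's assumption, not something the Proposal specifies.

```python
def honour_preference_signal(headers: dict) -> bool:
    """Return True if consent-requiring storage/tracking should be skipped.

    Illustrative only: checks the Global Privacy Control header (Sec-GPC),
    one existing machine-readable preference signal. The proposed
    Article 88b GDPR does not mandate any particular protocol.
    """
    # Under the GPC proposal, the header value "1" expresses an opt-out preference.
    return headers.get("Sec-GPC", "").strip() == "1"

# A server respecting the signal would suppress consent banners and
# non-exempt storage for this request without asking the user again.
assert honour_preference_signal({"Sec-GPC": "1"}) is True
assert honour_preference_signal({}) is False
```

The attraction of such automated signals is precisely that a single browser-level choice replaces thousands of per-site banners, which is the consent-fatigue problem both provisions aim at.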

The exceptions to consent set out in the proposed Article 88a(3) GDPR concern, in particular: (i) the provision of a service explicitly requested by the user; (ii) the measurement of an online service’s audience by its operator, for exclusively internal use; (iii) the security of the service or the terminal. Compared to the current Article 5(3) of the ePrivacy Directive, the Proposal introduces a broader exception for the provision of the requested service—not limited to information society services—and adds new exceptions for audience measurement and security. The EDPB and the EDPS recommend strictly limiting each exception to what is strictly necessary: in particular, audience measurement should be limited to aggregated anonymous information, not usable for further purposes and not combined with data from other services or third parties.

A further significant aspect concerns the proposal to introduce a specific exception for contextual advertising, namely advertising based on the user’s current browsing context, without cross‑site tracking or long‑term storage. The EDPB and the EDPS consider that this form of advertising, which is less intrusive than behavioural advertising, may be included in the list of consent exceptions, provided that the exception is clearly delimited and does not lend itself to abusive uses that would in practice expand its scope. If well circumscribed, the proposal could provide an economic incentive for the spread of privacy‑respecting advertising models, reducing pressure toward intrusive tracking.

On the subject of consent renewal, the proposed Article 88a(4) GDPR introduces specific safeguards: in particular, the controller may not submit a new consent request for six months in the event of a user’s refusal. The EDPB and the EDPS welcome these precautions, while at the same time recommending the definition of a maximum validity period for the consent originally given, in order to ensure that users are periodically put in a position to review their choices. Finally, it is suggested that an explicit exception to consent be provided for the recording of the refusal itself, which is technically necessary to enforce the ban on repeated requests, provided that this exception results in the use of a generic, anonymous identifier, common to all users who have refused consent.

The joint opinion finally highlights a cross‑cutting issue of primary importance: the new ePrivacy provisions cannot be effectively implemented and enforced without a suitable strengthening of data protection authorities’ corrective powers. The proposed Articles 88a and 88b GDPR must be accompanied by explicit references to the sanctioning provisions of Articles 83(5) GDPR and 66(3) EUDPR, in order to ensure effective deterrence against non‑compliant parties.

A difficult balance between simplification and protection

Joint Opinion 2/2026 of the EDPB and the EDPS on the Digital Omnibus clearly reflects the complexity and sensitivity of the phase that European digital law is currently undergoing. The Commission’s proposal moves on two tracks: on the one hand, genuine and desirable simplification measures—such as the harmonisation of data breach notifications, the standardisation of DPIAs, and efforts to counter consent fatigue—which the EDPB and the EDPS themselves welcome; on the other, structural changes to the definition of personal data and to the regime of legal bases for AI that the authorities consider inappropriate, premature, or even counterproductive in light of the stated objectives.

On pseudonymisation, the decision to legislate on a definition that the CJEU has progressively developed through nuanced and contextual case law raises the most acute concerns. The possible adoption of the proposed Article 4(1) GDPR as presented by the Commission—and, even more so, of the proposed Article 41a GDPR—could usher in a phase of systematic evasion of the GDPR, through processing architectures deliberately designed to remove certain data flows from the scope of the legislation. The remedy indicated in the opinion—further EDPB guidance rather than legislative intervention—appears better suited to ensuring an orderly evolution of the legal framework, in respect of the institutional competences of the supervisory authorities.

On artificial intelligence, the regulatory challenge is of a different nature: it is not a matter of correcting a potentially harmful intervention, but of ensuring that the provisions introduced—legitimate interests for AI and a derogation for the incidental processing of special categories—are accompanied by sufficiently robust safeguards and clear application criteria. Coordination with the AI Act remains an ongoing project that will require careful regulatory alignment in the coming months, also in light of the phased application of the various obligations laid down by the Regulation.

Ultimately, the legislative process that will unfold in the coming months—with the involvement of the European Parliament and the Council—will be decisive in determining whether the revision of the European digital rulebook can strike the right balance between regulatory simplification, the promotion of innovation, and the effective protection of individuals’ fundamental rights. In this context, the joint opinion of the two main European data protection authorities represents a contribution of extraordinary value: not only for the depth of its legal and technical analysis, but also for the clarity with which it draws the lines beyond which simplification risks turning into regression.
