Data & Technology Innovation | May 2026 Insight

Contents

  • Tracking pixels in emails: The Privacy Authority sets new rules
  • Digital governance, AI, and cybersecurity: Legislative Decree No. 47/2026 brings technological risks into corporate structures
  • NIS 2: ACN Guidelines on the categorization model for activities and services
  • Processing of personal data for scientific research purposes: the EDPB’s new Guidelines
  • AGCOM adopts new guidelines for transparency and comparability in digital audience measurement

Tracking pixels in emails: The Privacy Authority sets new rules

Data protection

With Provision No. 284 of April 17, 2026 (the “Provision”), the Italian Data Protection Authority (Garante per la protezione dei dati personali) adopted new Guidelines on the use of tracking pixels in email communications (the “Guidelines”). The aim of the Guidelines is to specify the correct procedures for informing individuals and obtaining their consent.

The Provision arises from the need to address the issue of “hidden” tracking technologies used in email communications, particularly within newsletters, direct email marketing (DEM), transactional emails, service communications, or institutional messages.

The Provision highlights that tracking pixels are particularly invasive because (i) they are transparent images of minimal size, often a single pixel, making them imperceptible to the recipient; (ii) they are embedded in the email but hosted on remote servers; and (iii) they carry no content of their own. When the recipient opens the message, the email client automatically downloads the image, enabling the sender to detect that the email has been opened and, in some cases, to obtain additional information, such as the recipient’s IP address, the type of device used, how long the message was viewed, and how many times the same email was subsequently reopened.
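Purely for illustration, the mechanism can be sketched in a few lines of Python; every name here (`pixel_html`, `serve_pixel`, the `/open.gif` endpoint, the in-memory log) is a hypothetical example, not part of any real tracking product:

```python
import datetime

# A 1x1 transparent GIF (43 bytes): the "invisible image" embedded in the email.
PIXEL_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
    b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00\x00"
    b"\x02\x02D\x01\x00;"
)

def pixel_html(base_url: str, recipient_id: str) -> str:
    """HTML fragment a sender might embed in the message body.
    The per-recipient token in the URL is what links an open to a person."""
    return (f'<img src="{base_url}/open.gif?r={recipient_id}" '
            f'width="1" height="1" alt="">')

opens: list[dict] = []  # in-memory log of open events

def serve_pixel(recipient_id: str, ip: str, user_agent: str) -> bytes:
    """What the remote server does when the email client fetches the image:
    record that (and when) the message was opened, then return the GIF."""
    opens.append({
        "recipient": recipient_id,
        "ip": ip,                    # may reveal approximate location
        "user_agent": user_agent,    # may reveal device and client type
        "opened_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return PIXEL_GIF
```

Because the fetch happens automatically when the message is rendered, the recipient sees nothing; repeated fetches of the same URL are how a sender counts re-openings of the same email.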

The Authority first noted the applicability of Article 122 of Legislative Decree No. 196 of June 30, 2003 (the “Privacy Code”), which governs access to and storage of information on the user’s device, as a lex specialis in relation to the European privacy regulations. This provision stipulates, as a general rule, that processing is allowed only with consent or, in certain cases, when specific conditions are met, such as when the operation is strictly necessary for technical purposes or the provision of a service explicitly requested by the user.

The Guidelines therefore make a distinction between uses that may benefit from an exemption and uses that require prior user consent. According to the Authority, consent will not be required, for example, when tracking pixels are used for aggregated and anonymized statistical purposes, to improve deliverability, combat spam, ensure security needs, or verify the opening of relevant institutional or service communications. In contrast, consent will be necessary when tracking is used to measure individual user interaction with promotional messages, optimize commercial campaigns, adjust the frequency of messages, or create commercial profiles that can be used for other initiatives.

As a simplification, the Authority allows consent for tracking to be included in the more general consent for receiving promotional communications, provided the user is clearly informed and the request is made without coercion. However, it remains essential to ensure easy and granular revocation: the recipient must be able to choose whether to stop receiving emails entirely or to continue receiving them without tracking pixels.

Finally, the Guidelines clarify that the information can be provided in a simplified form, for example, with a brief notice in the email address collection form, accompanied by a link to a more detailed privacy notice. For ongoing processing, the data controller may supplement the information in the first useful message or at the first point of discontinuity in the relationship with the individual.

The Data & Technology Innovation team at LEXIA offers assistance in the operational phases of compliance with the Provision, which must be completed within six months from the publication of the Guidelines.

Digital governance, AI, and cybersecurity: Legislative Decree No. 47/2026 brings technological risks into corporate structures

Corporate Governance & Technology Regulation

With Legislative Decree No. 47 of March 27, 2026, the legislator amends the Consolidated Finance Act and the rules governing joint-stock companies to strengthen the competitiveness of the capital markets and simplify the regulatory framework. Beyond these stated objectives, however, the decree introduces a change destined to have a much broader impact: cybersecurity, artificial intelligence, and personal data protection are now firmly embedded in corporate governance.

The decree does not directly amend the GDPR or the NIS2 Directive, nor does it introduce new substantive obligations in terms of cybersecurity or data protection. The novelty is more profound and concerns how technological risks must be managed within a company.

A particularly significant change is the amendment of Article 123-bis of the Consolidated Finance Act (TUF), which requires listed companies to describe in their corporate governance report the policies, where adopted, on the use and monitoring of new technologies, with specific reference to artificial intelligence systems and cybersecurity risks. Governance of AI and cybersecurity thus moves beyond the purely technical domain and becomes a matter of disclosure to the market.

In addition, the new Article 149-ter TUF calls for the adoption of continuous monitoring systems and automatic and predictive control tools that are “adequate and proportionate” to the company’s risks. This reference echoes principles that are now central in both the GDPR and European cyber regulations: proportionality, accountability, a risk-based approach, and effective control over automated processes.

The central point of the reform is likely this: cybersecurity and data protection are no longer treated as separate specialist functions but as components of the company’s organizational structure. As a result, the board of directors can no longer simply “delegate” technological issues to IT or compliance functions but must be able to understand, supervise, and integrate them into decision-making processes and internal control systems.

From this perspective, the meaning of compliance also changes. Compliance is no longer limited to the preparation of policies or formal documentation but requires organizational structures capable of demonstrating how the company manages the risks related to AI, cybersecurity, and the processing of personal data.

Therefore, Legislative Decree No. 47/2026 marks another step towards a model of integrated digital governance, in which corporate law, technology, and risk management progressively converge into a single system of responsibility and control.

NIS 2: ACN Guidelines on the categorization model for activities and services

Cybersecurity

In April 2026, the National Cybersecurity Agency (“ACN”) published the Guidelines on the categorization model for activities and services (the “Guidelines”), aimed at supporting essential and important entities in complying with Article 30 of Legislative Decree No. 138/2024 (the “NIS Decree”). That article requires the annual submission to ACN of a list of the entity’s services and activities, categorized according to their relevance. In particular, the Guidelines offer practical guidance on applying the model that entities must follow to categorize their services and activities, adopted with ACN Determination No. 155238/2026 (the “Categorization Model”). The categorization must be submitted within the mandatory window between May 1 and June 30, 2026, through the “NIS/Categorization Service” on the ACN portal, regulated by ACN Determination No. 127437/2026.

The Guidelines define the structure of the Categorization Model, which is divided into ten macro-areas, from “Monitoring and Control” to “Other Services and Activities,” each identified by a name, a description, and a pre-assigned relevance category. There are four relevance categories, ranked in increasing order of impact: minimal, low, medium, and high. The Guidelines clarify that the purpose of the process is not merely classification: the relevance categories assigned to each activity or service will form the basis on which ACN defines future long-term security measures.

The categorization process described in the Guidelines consists of three phases:

  • In the first phase, the entity identifies all activities and services supported by network and information systems; no specific methodology is required, but the level of detail must be sufficient to distinguish activities with homogeneous relevance categories.
  • In the second phase, each activity is assigned to only one macro-area, using the residual macro-area “Other Services and Activities” only if none of the others are applicable.
  • In the third phase, the relevance category is assigned: the pre-assigned category applies by default, but the entity may deviate from it based on a simplified Business Impact Analysis—conducted according to the principles of confidentiality, integrity, and availability—while retaining the relevant documentation, which may be requested by ACN during sample compliance checks.
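The three phases above can be illustrated with a deliberately simplified sketch of the third step. The 1–4 impact scale and the worst-case rule below are assumptions made for this example only, not ACN's prescribed methodology:

```python
# Illustrative simplified Business Impact Analysis for the third phase.
# The 1-4 scale and the worst-case rule are assumptions for this sketch,
# not ACN's prescribed methodology.
CATEGORIES = {1: "minimal", 2: "low", 3: "medium", 4: "high"}

def relevance_category(confidentiality: int, integrity: int, availability: int) -> str:
    """Map per-principle impact scores (1 = negligible ... 4 = severe)
    to a relevance category, taking the worst of the three."""
    for score in (confidentiality, integrity, availability):
        if score not in CATEGORIES:
            raise ValueError(f"impact score must be 1-4, got {score}")
    return CATEGORIES[max(confidentiality, integrity, availability)]

def justify_deviation(activity: str, pre_assigned: str, assessed: str) -> dict:
    """Record whether the assessed category deviates from the pre-assigned one,
    as the supporting documentation ACN may request during sample checks."""
    return {"activity": activity, "pre_assigned": pre_assigned,
            "assessed": assessed, "deviates": pre_assigned != assessed}
```

The point of the sketch is the record produced by `justify_deviation`: whenever the entity departs from the pre-assigned category, it is that documented assessment, not the deviation itself, that must survive a compliance check.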

ACN Determination No. 155238/2026 provides for two separate versions of the Categorization Model: Annex 1, applicable to entities operating in specific sectors (energy, transportation, healthcare, digital infrastructures), and Annex 2, applicable to all other entities.

After the June 30, 2026 deadline, the categorized list is considered definitively acquired and no longer modifiable, except in cases of documented technical-operational issues not attributable to the entity.

Processing of personal data for scientific research purposes: the EDPB’s new Guidelines

Data protection

On April 15, 2026, the European Data Protection Board (EDPB) adopted Guidelines 1/2026, providing essential clarifications on the interpretation of the GDPR when personal data is processed for scientific research purposes. In a context marked by rapid technological advancements, such as artificial intelligence, the document aims to balance scientific freedom with the protection of individuals’ fundamental rights.

Definition and indicative factors of scientific research

Although the GDPR suggests a broad interpretation of the concept, the Guidelines clarify that research must be genuinely scientific and conducted in accordance with industry-specific methodological and ethical standards. The EDPB introduces six key factors to presume the existence of such a purpose:

  • Methodical and systematic approach.
  • Adherence to ethical standards.
  • Verifiability and transparency of results.
  • Autonomy and independence of researchers.
  • Aim to contribute to general knowledge and societal well-being.
  • Potential scientific contribution or innovative application of existing knowledge.

Legal bases and simplified regimes

The document delves into the main applicable legal bases (consent, public interest, legitimate interest), introducing important clarifications:

  • Presumption of compatibility: further processing for scientific purposes is presumed compatible with the original purposes of collection, exempting the data controller from the compatibility test, though a legal basis for the new processing must still be ensured.
  • “Broad” and “dynamic” consent: broad consent is allowed for general areas of research when the objectives are not fully known at the outset, provided additional safeguards are in place. Dynamic consent, on the other hand, allows for ongoing involvement of the data subject in individual projects.
  • Retention limitation: data may be retained for longer than necessary for the original purposes, as long as retention is limited to what the research requires.

Rights of data subjects and safeguards

The GDPR provides specific exceptions to the right to erasure (deletion) and limitations to the right to object if exercising these rights would make it impossible or seriously hinder the achievement of research objectives. However, the obligation to implement appropriate technical and organizational measures, such as pseudonymization or anonymization, in accordance with the data minimization principle, remains in force.
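As one example of such a measure, here is a sketch of keyed pseudonymization (one possible technique, not a method prescribed by the Guidelines): direct identifiers are replaced with an HMAC so the research dataset no longer names individuals, while the controller, who alone holds the key, keeps it stored separately. All function and field names are illustrative:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    Without the key the pseudonym cannot be recomputed or linked back;
    the key must be stored separately from the research dataset."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def pseudonymize_record(record: dict, id_fields: tuple, secret_key: bytes) -> dict:
    """Return a copy of the record with direct identifiers pseudonymized,
    leaving research-relevant attributes untouched; data minimization
    would then drop any fields the study does not actually need."""
    out = dict(record)
    for field in id_fields:
        if field in out:
            out[field] = pseudonymize(str(out[field]), secret_key)
    return out
```

Note that under the GDPR pseudonymized data remains personal data, since the controller can still re-link it via the key; only anonymization takes the data outside the Regulation's scope.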

AGCOM adopts new guidelines for transparency and comparability in digital audience measurement

Media & Communications

With Decision No. 87/26/CONS (the “Decision”), AGCOM concludes the investigation launched in 2025 regarding audience measurement in the digital ecosystem and establishes a framework of guidelines aimed at strengthening transparency and comparability of data. This measure is part of a profound transformation of the media market, characterized by the growing centrality of online platforms and digital advertising.

The Authority highlights how the fragmentation of measurement methodologies, often based on proprietary systems of platforms and not subject to independent verification, creates informational asymmetries and risks of competitive distortion. Audience data, in fact, represents the “currency” of the advertising market and directly affects the economic value of editorial content and advertising investments, with impacts on information pluralism as well.

AGCOM emphasizes the need to avoid the proliferation of non-comparable metrics and to foster the construction of a coherent cross-media measurement system, capable of fully representing content consumption across all platforms.

The Decision establishes the central role of shared measurement systems, particularly the Joint Industry Committee (the “JIC”), which the Authority considers to be the most suitable governance model to ensure transparency, independence, representativeness, and verifiability of audience data. The Decision therefore stipulates that the audience measurement of content distributed by digital platforms should be carried out by the JIC, specifically Audicom, according to homogeneous, declared, and verifiable methodological rules, defined and governed by the JIC itself.

On the technological front, the Decision does not impose a single solution, clarifying that data homogeneity does not depend on the tool used, but on adherence to shared and verifiable rules. The Decision acknowledges that the use of the single SDK standard is the most advanced method in terms of consistency, comparability, and independence of measurement. On the other hand, it recognizes the proliferation of alternative models, such as server-to-server, and permits their coexistence with the single SDK standard, provided these solutions guarantee an equivalent level of transparency and third-party oversight.

Finally, the Decision requires Audicom to adopt the minimum requirements that a server-to-server-based measurement system must meet to produce the official currency of digital audiovisual audience measurement, such as homogeneous tracking, access and control over the data, independent auditing, ping/audit systems, and security.
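Purely as an illustration of what those requirements imply in practice, a server-to-server measurement event might carry homogeneous identifiers plus a keyed signature that an independent auditor could verify. Every field name and function below is an assumption made for this sketch, not Audicom's actual schema or process:

```python
import hashlib
import hmac
import json

def build_audience_event(content_id: str, session_id: str, event: str,
                         timestamp: str, audit_key: bytes) -> dict:
    """Assemble a measurement event and sign its canonical JSON with HMAC,
    so a third party holding the key can later verify the record was not
    altered (a stand-in for the 'independent auditing' requirement)."""
    payload = {"content_id": content_id, "session_id": session_id,
               "event": event, "timestamp": timestamp}
    canonical = json.dumps(payload, sort_keys=True).encode("utf-8")
    payload["signature"] = hmac.new(audit_key, canonical, hashlib.sha256).hexdigest()
    return payload

def verify_audience_event(payload: dict, audit_key: bytes) -> bool:
    """Auditor-side check: recompute the HMAC over everything except the
    signature field and compare in constant time."""
    body = {k: v for k, v in payload.items() if k != "signature"}
    canonical = json.dumps(body, sort_keys=True).encode("utf-8")
    expected = hmac.new(audit_key, canonical, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, payload.get("signature", ""))
```

The design point the sketch tries to capture is the Decision's logic: homogeneity comes not from the transport (SDK or server-to-server) but from shared, declared rules, here the canonical serialization, that any party can re-apply and check.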

Overall, the Decision marks a significant step in the regulation of the digital ecosystem, laying the groundwork for a more transparent and reliable measurement system to prevent competitive distortions in the advertising market.
