Mandatory account creation for e-commerce: the new recommendations from the EDPB clarify the limits for data controllers
Data protection – eCommerce
On December 3, 2025, the European Data Protection Board (EDPB) published Recommendations 2/2025 on the requirement to create a user account on e-commerce websites. Currently open for public consultation, the document addresses a now-common practice: forcing users to create an account in order to complete an online purchase.
According to the EDPB, while this approach may be justified by commercial needs, it can pose significant risks to privacy, especially when associated with massive data collection, unnecessary profiling, or deceptive design mechanisms that push users to unknowingly provide more information than necessary. The guiding principle is simple: the requirement to create an account should be the exception, not the rule, and the option of “guest checkout” — allowing users to complete the purchase without registering — is the method that best aligns with the principles of necessity and data minimization, as established by the GDPR (Articles 5 and 25).
In the document, the EDPB analyzes the main legal bases invoked by controllers (contract performance, legal obligation, legitimate interest) and concludes that, in most cases, none of them can justify forcing users to create an account. If a user intends to purchase a single product, the transaction can easily be completed without any registration, and even legitimate goals such as fraud prevention or customer loyalty can be achieved through less intrusive means.
Only a few specific cases can justify an account requirement: subscription services (which require continuous, authenticated interactions) or access to genuine “closed communities” with selective admission criteria. A generic loyalty program, on the other hand, is not enough to make account creation lawful.
For controllers, the message is clear: offering users a genuine choice is now an essential part of the privacy by design principle. This means not only making it clear that users can purchase as guests but also explaining the purpose of the account and ensuring each processing activity is based on a distinct and valid legal basis (such as explicit consent, if required).
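To make the idea of a distinct legal basis for each processing activity more tangible, the following is a minimal Python sketch of how a controller might document checkout-related purposes; the purposes, names, and values are hypothetical examples, not a template drawn from the Recommendations.

```python
# Purely illustrative: each processing purpose is documented with its own legal
# basis, and an account is required only where a purpose genuinely needs one.
# All purposes and field names below are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class ProcessingPurpose:
    name: str
    legal_basis: str        # e.g. "contract", "legal obligation", "consent"
    requires_account: bool  # True only if the purpose cannot work without one
    requires_opt_in: bool   # explicit, never pre-ticked, consent

CHECKOUT_PURPOSES = [
    ProcessingPurpose("order fulfilment and delivery", "contract", False, False),
    ProcessingPurpose("invoicing and tax records", "legal obligation", False, False),
    ProcessingPurpose("subscription management", "contract", True, False),
    ProcessingPurpose("marketing communications", "consent", False, True),
]

def account_required(purposes):
    """Guest checkout stays the default unless some purpose truly needs an account."""
    return any(p.requires_account for p in purposes)

print(account_required(CHECKOUT_PURPOSES[:2]))  # a one-off guest purchase: False
```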
In conclusion, Recommendations 2/2025 do not introduce new obligations but set a clear operational standard for all e-commerce operators: rethink the user experience to ensure transparency, control, and respect for individual freedom in purchasing decisions.
AIMAG sanctioned by the Data Protection Authority: insufficient security measures and invalid consents
Data protection
By decision of November 27, 2025 (No. 10202135), the Italian Data Protection Authority (Garante) imposed a fine of €300,000 on AIMAG S.p.A., a multi-utility company operating in the energy, water, and environmental sectors, for serious violations of the GDPR regarding data security and processing for promotional purposes.
The investigation began following a report from a user who complained about vulnerabilities in the registration process for the restricted area of the company’s website. The Authority found that it was sufficient to enter the tax code of the account holder and any email address to create an account and access personal data, including residence address, consumption history, contact details, and billing information.
This vulnerability led the Authority to charge AIMAG with violating Article 32 of the GDPR for failing to adopt appropriate security measures. Specifically, the authentication system did not verify users’ identity, breaching the principle of integrity and confidentiality under Article 5, paragraph 1, letter f) of the GDPR.
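By way of illustration only, the Python sketch below shows the kind of check the decision suggests was missing: a verification code sent to a contact already on file for the customer, rather than trusting any email address supplied together with a tax code. The function and field names are hypothetical and do not describe AIMAG’s actual systems.

```python
# Illustrative sketch: online registration is confirmed through a contact
# already on record for the customer, not through an arbitrary e-mail address
# supplied with the tax code. All names below are hypothetical.
import secrets

def send_to_contact_on_file(address, code):
    # Placeholder for an e-mail/SMS gateway; printing stands in for delivery.
    print(f"Verification code sent to {address}")

def start_registration(tax_code, customers):
    customer = customers.get(tax_code)
    if customer is None:
        return False                      # unknown tax code: no account at all
    customer["pending_code"] = secrets.token_hex(4)
    send_to_contact_on_file(customer["email_on_file"], customer["pending_code"])
    return True

def confirm_registration(tax_code, submitted_code, customers):
    customer = customers.get(tax_code, {})
    expected = customer.get("pending_code", "")
    return bool(expected) and secrets.compare_digest(expected, submitted_code)
```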
Further irregularities emerged during the investigation:
- AIMAG processed user data for marketing and customer satisfaction purposes without valid consent;
- the checkboxes for the privacy notice and for the use of data for commercial purposes were pre-selected, in breach of the GDPR requirement that consent be expressed through a positive and unambiguous action;
- data processed for promotional purposes were stored without documented retention limits, violating the storage limitation principle (Article 5, paragraph 1, letter e).
The AIMAG case confirms the Authority’s approach aimed at strengthening the accountability of data controllers and ensuring the effective implementation of appropriate security measures. In particular, this case serves as a reminder to all digital operators of the need to focus on three key areas: user authentication, consent management, and retention period definition.
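Purely as an illustration of the last two points (the field names and the retention period are hypothetical, not taken from the decision), marketing consent should be recorded only upon an explicit, unticked opt-in, with a timestamp and a documented retention limit:

```python
# Illustrative sketch: consent is stored only when the user actively ticks an
# unticked box, together with a timestamp and a documented retention limit.
# The 24-month period and field names are hypothetical examples.
from datetime import datetime, timedelta, timezone

MARKETING_RETENTION = timedelta(days=730)  # documented limit, e.g. 24 months

def record_marketing_consent(user_id, box_ticked_by_user):
    if not box_ticked_by_user:             # no positive action, no consent
        return None
    granted_at = datetime.now(timezone.utc)
    return {
        "user_id": user_id,
        "purpose": "marketing",
        "granted_at": granted_at.isoformat(),
        "delete_after": (granted_at + MARKETING_RETENTION).isoformat(),
    }
```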
AI Act: first draft of the “Code of Practice on Transparency of AI-Generated Content” published
Artificial Intelligence
On December 17, 2025, the first draft of the Code of Practice on Transparency of AI-Generated Content (hereinafter the “Code”) was published. This is a tool based on voluntary adherence (for more details on the nature of such codes, we invite you to consult our article by clicking here) that provides operational guidelines for fulfilling the transparency obligations on AI-generated content set out in Article 50 of EU Regulation 2024/1689 (the “AI Act”).
After an introduction, the Code is divided into two sections with specific rules: (i) Section 1, aimed at providers of generative AI systems; (ii) Section 2, aimed at users (deployers) of AI systems. Below are the key elements to consider for each category of actors.
Section 1 – Providers
- Multi-layered approach: The Code requires the combination of the following content traceability technologies: (i) metadata (embedding): digital signatures and provenance information; (ii) watermarking: imperceptible watermarks; (iii) fingerprinting/logging (where necessary): logs or fingerprints to verify the origin of the output.
- Responsibility along the supply chain: upstream providers must facilitate the compliance of those integrating their models; for “open-weight” models (publicly available parameters), structural marking should be incorporated into the model’s “weights” during training.
- Verification tools: Providers are required to make interfaces (APIs) or public tools available free of charge to enable third parties to detect synthetic content (a minimal sketch of the metadata and verification layers follows this list).
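Purely as a sketch of the metadata-plus-verification layer (an HMAC with a shared key stands in for a real digital signature, and none of this reflects the specific standards or APIs the Code refers to), a provider could attach a signed provenance record to each output and expose a check that third parties can run:

```python
# Illustrative sketch of one traceability layer: a signed provenance record
# attached to generated content, plus a verification check for third parties.
# An HMAC with a shared key stands in for a real digital signature scheme.
import hashlib, hmac, json

SIGNING_KEY = b"demo-key-not-for-production"

def attach_provenance(content: bytes, model_id: str) -> dict:
    record = {
        "model_id": model_id,
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "ai_generated": True,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_provenance(content: bytes, record: dict) -> bool:
    claimed = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    content_ok = hashlib.sha256(content).hexdigest() == claimed.get("content_sha256")
    return content_ok and hmac.compare_digest(expected, record.get("signature", ""))
```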
Section 2 – Deployers
- Transparency taxonomy: Content must be clearly classified into: (i) fully generated by AI, with no significant human contribution; (ii) AI-assisted, where AI has altered factual, emotional, or stylistic elements.
- Deepfake: The Code mandates the adoption of a common icon, perceivable in a manner appropriate to the context (e.g., a visible disclaimer for videos and an audible disclaimer for audio content). Pending its definition, the Code suggests the provisional use of acronyms such as “AI” or “IA” (a simple labeling sketch follows this list).
- Exception: The labeling requirement for texts published to inform the public on matters of public interest does not apply if the content has undergone substantial human review, with editorial responsibility ensured through internal control procedures.
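As a deployer-side illustration of the taxonomy and context-appropriate disclosure described above (the category wording and labels are hypothetical, and the common icon is still to be defined):

```python
# Illustrative sketch of the transparency taxonomy and context-appropriate
# disclosure described above. Category names and label wording are hypothetical;
# the Code's common icon is still to be defined.
from enum import Enum

class ContentCategory(Enum):
    FULLY_AI_GENERATED = "fully generated by AI, no significant human contribution"
    AI_ASSISTED = "AI-assisted, factual/emotional/stylistic elements altered by AI"

def disclosure_label(category, modality):
    tag = "AI"                             # provisional acronym suggested by the Code
    if modality == "video":
        return f"[{tag}] visible on-screen disclaimer: {category.value}"
    if modality == "audio":
        return f"[{tag}] audible disclaimer: {category.value}"
    return f"[{tag}] {category.value}"

print(disclosure_label(ContentCategory.FULLY_AI_GENERATED, "video"))
```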
This first draft of the Code forms a foundation on which further refinements will be developed and is therefore subject to changes. Ahead of the second draft, the Commission invites interested parties to submit feedback by January 23, 2026.
EU-UK: the European Commission confirms adequacy decisions until 2031
Data protection
On December 19, 2025, the European Commission renewed the two adequacy decisions for the free flow of personal data between the European Union and the United Kingdom: adequacy decision (EU) 2021/1772 under the GDPR and adequacy decision (EU) 2021/1773 under Directive (EU) 2016/680 (which concerns the transfer of personal data in the context of law enforcement activities), confirming and amending the original assessments from 2021.
Based on these new decisions, personal data flows between the United Kingdom and the European Economic Area (EEA) can continue without obstacles, under the guarantee of a “substantially equivalent” level of protection as provided by European legislation.
The renewal decisions follow a six-month technical extension granted on June 24, 2025, which was necessary to assess, in light of the adequacy criteria under Article 45(2) of the GDPR, the impact of the new Data (Use and Access) Act 2025 on UK data protection legislation, which consists primarily of the United Kingdom General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018.
Despite recent legislative reforms in the United Kingdom, the Commission concluded that the UK legal framework continues to provide guarantees “substantially equivalent” to those under the GDPR and Directive (EU) 2016/680, also known as the Law Enforcement Directive (LED), which covers data processed by law enforcement and criminal justice authorities.
In its assessment, the Commission found that the control mechanisms and avenues for redress provided by UK legislation continue to allow violations of data subjects’ rights to be identified and appropriately sanctioned, and continue to offer individuals remedies to access their personal data and, where necessary, have such data rectified or erased. In this context, it was emphasized that the future Information Commission, which will replace the current Information Commissioner’s Office (ICO), will continue to perform its tasks as supervisory authority in full independence.
Another relevant factor in the Commission’s assessment was the United Kingdom’s continued adherence to the European Convention on Human Rights (ECHR) and its acceptance of the jurisdiction of the European Court of Human Rights (ECtHR), international guarantees of fundamental importance for assessing the level of protection.
The new adequacy decisions will be valid for six years, expiring on December 27, 2031. The Commission, in collaboration with the European Data Protection Board (EDPB), will conduct a periodic review at least every four years to ensure that the equivalent level of protection is maintained over time.
Revision of the eIDAS Regulation: Pan-European digital identity and European wallets
Digital Identity & Trust Services
With Regulation (EU) 2024/1183, known as eIDAS 2.0, the European Union is reshaping its digital identity framework, focusing on an interoperable and secure system at EU level. In force since May 20, 2024, the new regulation updates Regulation (EU) No 910/2014 to overcome fragmentation between national systems and establish a single model for digital identification and authentication.
A key element is the European Digital Identity Wallet (EUDI Wallet), which each Member State must offer free of charge to citizens, residents, and businesses by the end of 2026. This tool will allow users to prove their identity online and offline, store and share verifiable digital documents (e.g., certificates, licenses, professional qualifications), and digitally sign with full legal value.
The wallet will ensure easier and more secure access to public and private services, reducing the costs and risks of traditional authentication methods, while giving users full control over which data to share and with whom. Privacy and security are central to the system, which adopts advanced technical standards and the principle of data minimization.
In addition to the wallet, eIDAS 2.0 sets common rules for electronic identities, verifiable attributes, and qualified trust services, strengthening cooperation between Member States and the private sector. It also introduces interoperability and security requirements, to be implemented through specific implementing acts that are still in the adoption phase.
One of the most innovative aspects of the new regulation is the focus on verifiable digital attributes: certified information—such as date of birth, professional qualifications, or educational degrees—that can be integrated into the wallet and shared selectively. This approach opens the door to new digital services but also imposes strict governance based on privacy by design and compliance with GDPR rules.
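To give a concrete sense of selective sharing, below is a minimal, purely illustrative Python sketch: the wallet holds attested attributes and discloses only the subset a relying party actually requests. The attribute names are invented, and real EUDI Wallets will follow the technical specifications laid down in the implementing acts.

```python
# Illustrative sketch of selective disclosure: the wallet holds attested
# attributes and reveals only what a relying party asks for. Attribute names
# are hypothetical; real wallets follow the eIDAS 2.0 implementing acts.
from dataclasses import dataclass, field

@dataclass
class Wallet:
    attested_attributes: dict = field(default_factory=dict)

    def present(self, requested):
        """Return only the requested attributes that the wallet actually holds."""
        return {k: v for k, v in self.attested_attributes.items() if k in requested}

wallet = Wallet({
    "date_of_birth": "1990-04-12",
    "professional_qualification": "registered nurse",
    "degree": "BSc Nursing",
})
# A relying party checking a professional qualification sees nothing else.
print(wallet.present({"professional_qualification"}))
```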
The roadmap is already outlined: by 2026, Member States must make at least one national wallet available, while by 2027, certain regulated services (e.g., banks, public authorities, telecoms) must accept it as a tool for secure authentication.
In summary, eIDAS 2.0 and the European wallet system represent a strategic turning point for the European digital market, laying the foundation for a secure, transparent, and universally recognized digital identity across the EU.