
The EU’s DMA Fine Against Meta: GDPR in Disguise?
The European Commission has issued its first major fines under the Digital Markets Act (DMA), charging Apple and Meta with alleged violations of the new digital antitrust regulation. While framed as landmark DMA enforcement, the Meta case reveals something deeper: a shift toward using competition law to advance misguided data protection objectives and condemn behavior that has not been found to violate Europe’s data protection law, the General Data Protection Regulation (GDPR). This is more than regulatory expansionism; it signals a broader institutional strategy that could reshape how digital rules are applied across Europe’s economy, with serious implications for legal certainty, transatlantic alignment, and European consumers and innovation.
At the heart of the Commission’s case is Meta’s business model, which offers users a choice between a free, ad-supported service—based on user consent to data processing and the combining of data across services—and a paid, ad-free version. For users choosing the free option, consent covers the use of their personal data across:
- Meta’s core platform services (CPSs), which include Meta Ads, Facebook, Instagram, Messenger, and WhatsApp
- Other distinct services, such as Dating and Gaming Play
- Third-party data used for personalized advertising
The Commission found that this model, which critics derisively refer to as “Consent or Pay,” violates Article 5(2) of the DMA, which prohibits gatekeepers from processing or combining personal data across services or with third-party data unless users are offered a clear, specific, and freely given choice in line with GDPR standards. Specifically, the Commission objects to two aspects of Meta’s approach. First, it is concerned that the company presents users with only a binary choice: pay a subscription for an ad-free experience or use the platform for free with personalized advertising. Second, it objects to the company obtaining user consent through a single “consent flow” at the Accounts Center level—typically when users first access one of Meta’s platforms, usually Facebook or Instagram—without allowing further control over how data is used across other services such as Marketplace or Dating.
Although the DMA is a competition instrument, the Commission’s case against Meta relies heavily on a misguided view of data protection principles. Fundamentally, the Commission’s concern is about the scale and scope of personal data collection and use across Meta’s ecosystem—not about preventing exclusionary abuses of market dominance. To be sure, the Commission appears to believe that Meta’s extensive data practices give it a significant ability to profile users and offer high-performance advertising, which in turn might result in an unfair competitive advantage over rivals. But at its core, a company improving its ability to advertise is a form of competition on the merits, not a type of anticompetitive abuse. Moreover, the Commission’s apparent view that more personalized advertising is harmful was debunked by recent scientific research analyzing a long-term field experiment on Facebook, which found that users’ valuation of the platform does not significantly change whether or not they see ads.
The Commission’s decision to use the DMA as a data protection instrument risks creating a dual-track enforcement regime alongside the GDPR, one that will result in overlapping or conflicting obligations, legal uncertainty, and a lack of regulatory coherence across the EU. Notably, the Commission itself acknowledges this overlap by referencing its coordination with the Irish Data Protection Commission—the national authority responsible for enforcing the GDPR with respect to Meta. But the very data protection practices now condemned under the DMA have not been found to violate the GDPR. Indeed, the Court of Justice of the European Union (CJEU) has confirmed that a “Consent or Pay” model, in principle, satisfies GDPR standards if the consent is genuinely free and informed, and not undermined by the platform’s dominance. Users must also be offered an equivalent alternative, possibly for an appropriate fee, that avoids the same data processing. As such, the Commission’s DMA decision not only attempts to override the GDPR but also substitutes the Commission’s judgment for that of the CJEU on the lawfulness of Meta’s practices—despite the Commission having no power to authoritatively interpret European law.
The Commission’s restrictions on Meta’s advertising model also reflect harmful regulatory overreach. In particular, limiting Meta’s ability to monetize its platform through consent-based, personalized advertising effectively constrains how it can charge for its services. Rather than using regulation to prevent anticompetitive behavior—as the DMA supposedly intends to do—this sort of direct intervention attempts to determine market outcomes. That approach poses serious threats to innovation: A platform that cannot sufficiently monetize its services due to regulatory micromanagement will have a reduced incentive and ability to invest in new technologies and products, ultimately stifling innovation across the platform. Research by IAB Europe shows that 80 percent of EU consumers find online advertising at least occasionally useful, and a majority prefer fewer, more relevant ads over a flood of generic ones.
Moreover, the Commission’s decision harms both business and non-business users by forcing Meta to implement a “Less Personalized Ads” (LPA) model, which, as Meta explains, relies on nearly 90 percent less data for personalized advertising. The Commission’s new mandate against Meta deprives businesses, especially SMEs, of effective advertising tools; Meta estimates that its ad service generates €3.98 in revenue for every €1 spent by firms, and it also predicts that LPA will lead to drastically reduced advertising performance, much higher ad dismissals, and fewer on-site and off-site conversions. For users, the quality of digital services will also diminish, as LPA draws from a substantially reduced amount of user data—meaning ads will be less relevant and less helpful in product discovery.
Perhaps most troubling, by restricting Meta’s ability to combine personal data across services for advertising, the Commission is tilting the playing field in favor of Chinese ad competitors. For example, although Meta’s ad service is designated as a CPS, the Commission declined to designate TikTok Ads as a CPS—even though ByteDance met all the relevant quantitative thresholds—because it accepted ByteDance’s argument that TikTok Ads is not yet an “important gateway” under Article 3(1)(b) of the DMA. This creates a glaring asymmetry: Meta must degrade its ability to engage in effective advertising, while TikTok—a Chinese-owned platform with persistent data governance concerns—faces no such restrictions. As a result, advertisers facing weaker performance on Meta’s constrained LPA model may shift their budgets to platforms that face fewer regulatory barriers and can offer superior, more targeted services, such as TikTok. Paradoxically, in the name of improving data protection, the Commission may be pushing users away from secure American platforms toward a Chinese platform that European data protection regulators found puts user data at risk of exposure to Chinese authorities.
The Commission’s decision finding that Meta’s “Consent or Pay” model violates Article 5(2) of the DMA is troubling. It blurs the line between data protection and competition, penalizes GDPR-compliant business models that were not found to be unlawful by the CJEU, injects legal uncertainty into Europe’s digital economy, and sends a chilling message to innovators. What’s more, rather than empowering European players, this type of DMA enforcement is more likely to advantage under-regulated Chinese rivals, raising serious strategic and security concerns in the process.