Foreword
There is a paradox at the heart of the social media age. Platforms that billions of people rely upon to share information, make markets, organise civic life, and participate in democratic debate are themselves almost entirely shielded from independent public scrutiny. They know a great deal about us. We know almost nothing about them. This report, and the Social Media Data Transparency Index it presents, is a rigorous attempt to map that asymmetry—and to make the case that closing it is a matter of urgency.
Transparency is the precondition for accountability. Without the ability to observe how content flows across platforms, how advertisers target audiences, and how information operations exploit the interplay between organic and paid content, we cannot hold enormously powerful social media companies to account. Regulators cannot enforce the law. Researchers cannot produce the evidence base that democratic policymaking requires. Journalists cannot investigate. And citizens cannot make informed choices about the digital environments that shape so much of their lives.
The Minderoo Centre for Technology and Democracy (MCTD) was founded on precisely this conviction: that rethinking the power relationships between digital technologies, society, and the planet requires evidence. In the context of social media, evidence means access to data that companies increasingly hide inside online “walled gardens”. That is why MCTD has consistently placed public-interest access to social media data at the centre of its research agenda—not as an end in itself, but as the essential infrastructure for democratic oversight and public accountability.
This report is the result of a collaboration that I am genuinely proud of. The research team at NetLab UFRJ, led by Professor R. Marie Santini, has spent years building one of the most rigorous and consequential programmes of empirical research on platform transparency, disinformation, and digital governance anywhere in the world. Their particular expertise—in the architecture of data access, the mechanics of advertising opacity, and the real-world consequences of platform non-compliance—is unmatched. The partnership between NetLab and MCTD combines complementary strengths: NetLab’s deep technical and empirical grounding in the Brazilian and Latin American context, and MCTD’s public-facing translational research on the governance challenges that confront democratic societies across Europe and beyond.
What makes this report particularly important is its ambition to paint an international comparative picture. The platforms under scrutiny here are not local actors operating within tidy jurisdictional boundaries, but global infrastructure. When Meta, TikTok, or X sets a data access policy, the decision ripples across every country that relies on its services. A transparency failure in Brazil—where this report finds consistently the lowest levels of data access—is not just a Brazilian problem. The asymmetry this report documents, in which the same companies offer meaningfully greater access in markets where regulation compels them to do so, is a choice that social media platforms have made and our governments have acquiesced to. This report names it as such.
Researchers working in this field know the contours of the landscape we report here, but not the specific topography. The data transparency crisis that is the subject of this report has been building for over a decade. In the early years of social media research, platforms were relatively open: APIs were accessible, data was available, and a generation of computational social scientists built an evidence base on phenomena from disinformation to radicalisation to public health communication that simply could not have existed otherwise. That openness, always imperfect, has been systematically eroded. The Cambridge Analytica scandal accelerated the closing of the gates—yet it was not-for-profit public interest research that paid the price, not the commercial actors whose misuse had prompted the crackdown. The erosion has continued: CrowdTangle discontinued, APIs paywalled, access requests denied or unanswered, transparency tools deployed that return no results. Our own researchers’ experience in the course of producing this report—including having an access request to X’s API under the Digital Services Act definitively rejected after months of follow-up—is itself testimony to the scale of the problem.
Data transparency is the means by which societies can hold powerful institutions to account. The online advertising market, now estimated at USD 650 billion globally, runs largely on unauditable, self-reported metrics. Electoral campaigns are conducted on platforms where neither regulators nor researchers can independently verify the reach, content, or targeting of political advertising. Disinformation networks exploit the interplay between organic and paid content in ways that are structurally invisible to outside observers. Scam advertisers operate without scrutiny. These risks are the documented reality of the information environment.
The twelve recommendations set out in this report are addressed to social media companies, international governance bodies, regional policymakers, and the research community itself. They are grounded in evidence and calibrated to what is technically and institutionally feasible. They make practical and operationalisable demands for what the United Nations itself has called for: meaningful transparency, enforced by binding standards, that enables independent research and democratic oversight. This is what societies need to guarantee information integrity.
I am grateful to every member of the research team who contributed to this work, both at NetLab UFRJ and at the Minderoo Centre for Technology and Democracy. The effort involved in rigorously assessing 15 platforms across three jurisdictions, negotiating access requests, testing interfaces, and validating findings over months of collaborative work is immense, and the quality of the scholarship here reflects it. I am particularly grateful to Professor Santini and Dr Hugo Leal for their intellectual leadership of this collaboration from its outset.
The online world does not have to live in a walled garden. But building a different future will require the sustained pressure of rigorous research, coordinated advocacy, and, ultimately, the political will to insist that accountability is not optional. That work is what this report—and the partnership that produced it—is for.
Professor Gina Neff
Executive Director, Minderoo Centre for Technology and Democracy
University of Cambridge
Professor of Responsible AI, Queen Mary University of London
Cambridge, April 2026