The EU Digital Services Act: A Win for Transparency [1]
Date: 2024-12
As of February 2024, the European Union’s (EU) Digital Services Act (DSA) is fully in effect across the bloc. The DSA is a landmark law for platform accountability that could transform how we understand and address the harms online platforms exacerbate, including disinformation and harassment.
Provisions within the DSA promise to aid civil society during a crucial period, as a record number of countries hold elections, generative artificial intelligence (AI) threatens to further distort the information landscape, and tech companies downsize their content moderation, trust and safety, and human rights teams. The act’s potential lies in its transparency measures, which require more detailed reporting from tech companies and allow external researchers to access online platforms’ data.
How does the DSA work?
The DSA applies to all “intermediaries” (social media platforms, search engines, online marketplaces, and internet service providers) whose services are used by people in the EU. The law is especially concerned with regulating online platforms that are ingrained in everyday life: it places additional obligations on platforms and search engines with more than 45 million monthly users in the EU, designating them Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), respectively. In April 2023, the European Commission initially designated 17 platforms, including Facebook, Instagram, TikTok, X, and YouTube, as VLOPs.
The DSA establishes a notice-and-action regime, a legal framework that requires intermediaries to restrict content that violates their own terms of service or the laws of an EU member state. In turn, people have the right to appeal decisions to remove or alter their content. Regulators in each EU member state help implement the law and appoint “trusted flaggers” to identify content that is illegal or violates intermediaries’ terms of service.
The act also requires VLOPs and VLOSEs to identify risks inherent to the design of their services, known as systemic risks, including features that negatively affect civic discourse, electoral processes, and fundamental rights. It empowers independent auditors to assess how well these companies are mitigating those risks, which is crucial to understanding how platforms behave ahead of high-stakes events like elections.
If VLOPs and VLOSEs fail to comply with DSA requirements to moderate content or address systemic risks, they can be fined up to 6 percent of their annual global revenue. The European Commission has yet to issue any fines, but it has opened formal proceedings against a host of platforms, including TikTok and X.
Promising Transparency Measures
The DSA includes some particularly encouraging measures that require online platforms to be more transparent about their content moderation, advertising, and use of automated tools. Greater transparency can highlight critical issues, like the ways disinformation or harassment spreads across platforms, and create opportunities for more informed policymaking, legislative oversight, and civil society advocacy.
The act requires all intermediaries to publish annual transparency reports on content restrictions, government requests for user data, and the use of automated moderation tools. While many social media platforms already publish their own transparency reports, the DSA requires VLOPs to disclose additional information, including details about their algorithms and the languages their content moderators speak.
The European Commission has launched the DSA Transparency Database, which tracks when online platforms remove content. The act also requires VLOPs to maintain their own databases with detailed information on all the advertisements they display. These repositories can be a tremendous asset for researchers, and can also help regulators identify harms and propose creative strategies to combat them.
Finally, the DSA requires that VLOPs provide data to independent researchers. This creates an opportunity for experts to independently study how information spreads across platforms, the impact of platforms’ algorithms, and much more. The timing of this mechanism is particularly important: though some companies have shared data with researchers, many have pared back their data-sharing programs in recent years, even as civil society and academia have advocated for increased access. Standardizing this process and enshrining it in law will increase the likelihood that companies prioritize these programs and allow researchers to gain new insights.
Ensuring Transparency Shines Through
The DSA is imperfect. The law could lead to the excessive removal of people’s content as companies try to avoid fines, and governments within the EU could leverage the act to remove content protected by international human rights standards. Civil society and academic experts have also warned that emergency powers could be abused to block platforms. Additionally, the regulatory burden could make it difficult for small businesses with fewer financial and personnel resources to comply.
However, despite these risks, the DSA presents a welcome model for internet regulation. As the European Commission implements the act, platforms should ensure they are adopting best practices globally, not just in the EU. Similarly, other democratic governments should seek to replicate some of the act’s most promising transparency measures. The European Commission should also continue to solicit feedback from civil society, which can take advantage of the transparency measures to advocate for policy changes that better protect fundamental rights on these platforms. Because of the outsized impact that EU regulation has globally, the act’s transparency measures can help civil society, policymakers, and tech companies across the world chart a path toward a more rights-centered and democratic online experience.
---
[1] URL: https://freedomhouse.org/article/eu-digital-services-act-win-transparency
Published and (C) by Common Dreams under a Creative Commons CC BY-NC-ND 3.0 license.