The European Union reached a deal on Saturday on landmark legislation that would force Facebook, YouTube and other internet services to combat misinformation, disclose how their services amplify divisive content and stop targeting online ads based on a person’s ethnicity, religion or sexual orientation.
The law, called the Digital Services Act, is intended to address social media’s societal harms by requiring companies to more aggressively police their platforms for illicit content or risk billions of dollars in fines. Tech companies would be compelled to set up new policies and procedures to remove flagged hate speech, terrorist propaganda and other material defined as illegal by countries within the European Union.
The law aims to end an era of self-regulation in which tech companies set their own policies about what content could stay up or be taken down. It stands out from other regulatory attempts by addressing online speech, an area that is largely off limits in the United States because of First Amendment protections. Google, which owns YouTube, and Meta, the owner of Facebook and Instagram, would face yearly audits for “systemic risks” linked to their businesses, while Amazon would confront new rules to stop the sale of illegal products.
The Digital Services Act is part of a one-two punch by the European Union to address the societal and economic effects of the tech giants. Last month, the 27-nation bloc agreed to a different sweeping law, the Digital Markets Act, to counter what regulators see as anticompetitive behavior by the biggest tech firms, including their grip over app stores, online advertising and internet shopping.
Together, the new laws underscore how Europe is setting the standard for tech regulation globally. Frustrated by anticompetitive behavior, social media’s effect on elections and privacy-invading business models, officials spent more than a year negotiating policies that give them broad new powers to crack down on tech giants that are worth trillions of dollars and that are used by billions of people for communication, entertainment, payments and news.
“This will be a model,” Alexandra Geese, a Green party member of the European Parliament from Germany, said of the new law. Ms. Geese, who helped draft the Digital Services Act, said she had already spoken with legislators in Japan, India and other countries about the legislation.
European policymakers reached the deal in Brussels early Saturday after 16 hours of negotiations.
“Platforms should be transparent about their content moderation decisions, prevent dangerous disinformation from going viral and avoid unsafe products being offered on marketplaces,” said Margrethe Vestager, who has spearheaded much of the bloc’s work to regulate the tech industry as the executive vice president of the European Commission, the executive arm of the European Union.
The moves contrast with the lack of action in the United States. While U.S. regulators have filed antitrust cases against Google and Meta, no comprehensive federal laws tackling the power of the tech companies have been passed.
Yet even as the European authorities gain newfound legal powers to rein in the tech behemoths, critics wondered how effective they would be. Writing laws can be easier than enforcing them, and while the European Union has a reputation as the world’s toughest regulator of the tech industry, its actions have sometimes appeared tougher on paper than in practice.
An estimated 230 new workers will be hired to enforce the new laws, a figure that critics said was insufficient when compared with the resources available to Meta, Google and others.
The staffing figures “are totally inadequate to face gigantic firms and new gigantic tasks,” said Tommaso Valletti, a former top economist for the European Commission, who worked on antitrust cases against Google and other tech platforms.
Without robust enforcement, he said, the new laws will amount to an unfulfilled promise. Mr. Valletti said that even as Europe had levied multibillion-dollar antitrust rulings against Google in recent years, those actions had done little to restore competition because regulators did not force the company to make major structural changes.
Lack of enforcement of the European Union’s data privacy law, the General Data Protection Regulation, or G.D.P.R., has also cast a shadow over the new laws.
Like the Digital Services Act and Digital Markets Act, G.D.P.R. was hailed as landmark legislation. But since it took effect in 2018, there has been little action against Facebook, Google and others over their data-collection practices. Many companies have sidestepped the rules by bombarding users with consent windows on their websites.
“They haven’t shown themselves capable of using powerful tools that already exist to rein in Big Tech,” said Johnny Ryan, a privacy-rights campaigner and senior fellow at the Irish Council for Civil Liberties, who has pushed for tougher enforcement. “I don’t anticipate them showing themselves suddenly to be any different with a new set of tools.”
Tech companies and industry trade groups have warned that the laws could have unintended consequences, like harming smaller businesses and undercutting Europe’s digital economy.
Google said in a statement that it supported the goals of the Digital Services Act but that “details will matter” and that it planned to work with policymakers to “get the remaining technical details right.” Amazon and Twitter declined to comment. Meta and TikTok did not respond to requests for comment.
Backers of the new laws said they had learned from past mistakes. While enforcement of G.D.P.R. was left to regulators in individual countries — which many felt were overmatched by multinational corporations with seemingly bottomless legal budgets — the new laws will largely be enforced out of Brussels by the European Commission, a major shift in approach.
“Introducing new obligations on platforms and rights for users would be pointless if they are not properly enforced,” said Thierry Breton of the European Commission, a former French business executive who helped draft the law.
The final text of the Digital Services Act is not expected to be available for several weeks, and final votes must still be taken, a process that is not expected to result in any major changes to the agreement. But policymakers in the European Commission and European Parliament involved in the negotiations described details of what would be one of the world’s most far-reaching pieces of digital policy.
The law, which would take effect next year, does not order internet platforms to remove specific forms of speech, leaving that to individual countries to define. (Certain forms of hate speech and references to Nazism are illegal in Germany but not in other European countries.) The law forces companies to add ways for users to flag illicit content.
Inspired by the war in Ukraine and the pandemic, policymakers gave regulators additional power to force internet companies to respond quickly during a national security or health crisis. This could include stopping the spread of certain state propaganda on social media during a war or the online sale of bogus medical supplies and drugs during a pandemic.
Many provisions related to social media track closely with recommendations made by Frances Haugen, the former Facebook employee who became a whistle-blower. The law requires companies to offer a way for users to turn off recommendation algorithms that use their personal data to tailor content.
Meta, TikTok and others would also have to share more data about the inner workings of their platforms with outside researchers at universities and civil society groups. The companies would have to conduct an annual risk-assessment report, reviewed by an outside auditor, with a summary of the findings made public.
Policymakers said the prospect of reputational damage could be more powerful than fines. But if the European Commission determined that Meta or another company was not doing enough to address problems identified by auditors, the company could face financial penalties of up to 6 percent of global revenue and be ordered to change business practices.
New restrictions on targeted advertising could have major effects on internet-based businesses. The rules would limit the use of data based on race, religion, political views or labor union membership. The companies would also not be able to target children with ads.
Online retailers like Amazon would face new requirements to stop the sale of illicit products by resellers on their platforms, leaving the companies open to consumer lawsuits.
Europe’s position as a regulatory leader will depend on enforcement of the new laws, which are likely to face legal challenges from the biggest companies, said Agustín Reyna, director of legal and economic affairs at the European Consumer Organization, a consumer watchdog group.
“Effective enforcement is absolutely key to the success of these new rules,” he said. “Great power comes with greater responsibility to ensure the biggest companies in the world are not able to bypass their obligations.”