The real revelation from the ‘Twitter Files’: Content moderation is messy

New York (CNN) —

Before then-President Donald Trump was banned from Twitter in the wake of the January 2021 Capitol riot, there was a debate among some employees about what to do with the company’s most prominent and controversial user.

Some employees questioned whether Trump’s final tweets on the platform actually violated the company’s policies, according to internal documents. Others asked if the tweets could be considered veiled (or “coded”) efforts to dodge Twitter’s rules and requested research to better understand how users might interpret them.

The high-stakes debate, which included several top executives, was revealed earlier this week in the latest edition of the “Twitter Files,” a tranche of internal company documents provided to, and tweeted out by, journalists unaffiliated with major news organizations. The releases so far have focused on some of the social media company’s most high-profile, and controversial, content moderation decisions.

The Twitter Files reports appear aimed at calling into question the integrity of Twitter’s former leadership and riling up the right-leaning user base that new owner Elon Musk has increasingly courted. The latest release, for example, appeared to imply that Twitter executives sidestepped the platform’s rules in deciding to ban Trump, seeking instead a justification for a partisan decision they’d already made. That interpretation, while not fully supported by the documents, was echoed by Musk, who has cheered on and seemingly sanctioned their release. But outside of Musk’s core base, reaction to the Twitter Files, which provide little new insight into the company’s policies and decision-making, has been largely muted.

Strip away the spectacle and partisan discord, and what the Twitter Files show is arguably far less explosive but should nonetheless give all users pause, regardless of where they sit on the political spectrum. In the absence of meaningful coordination or government oversight, a select few powerful tech platforms are left to make enormously consequential and difficult decisions around content moderation. Even when well intentioned, the people at these companies often struggle with how messy that process can be.

In moments of crisis, platforms are generally on their own to determine how to weigh sometimes competing priorities — protecting speech versus protecting users — and often under immense public scrutiny and with pressure to act quickly. These companies have created extensive platform guidelines, set up content moderation councils, partnered with fact-checkers and invested heavily in artificial intelligence, but at the end of the day, it can still just be a group of employees trying to sort through unprecedented decisions such as whether or not to ban a sitting US president.

“There’s no decision that’s cost free,” said Matt Perault, a tech policy consultant and professor at the University of North Carolina’s School of Information and Library Science. “The challenge is that any decision [social media companies] make, including the decision not to act, will have consequences and they need to figure out which consequences they’re comfortable with … I do think it is much harder than most people seem to think it would be.”

The process doesn’t always yield the right result. Yoel Roth, Twitter’s former head of trust and safety, has acknowledged the company may not have made the right call in how it handled the 2020 New York Post story about Hunter Biden’s laptop. And Twitter founder and former CEO Jack Dorsey reiterated in an online post Tuesday that he believes the company acted wrongly in removing Trump’s account.

“We did the right thing for the public company business at the time, but the wrong thing for the internet and society,” Dorsey wrote, although he added, “I continue to believe there was no ill intent or hidden agendas, and everyone acted according to the best information we had at the time. Of course mistakes were made.”

Monday’s Twitter Files release from journalist Bari Weiss presented screenshots of Twitter employees debating how to handle Trump’s tweets in the wake of the January 6, 2021, Capitol attack as proof that the company’s leadership wanted to sidestep its rules to ban him. But the screenshots could also be read as showing a group of employees challenging one another to find the best possible way to apply the company’s rules in a critical moment that no one could have perfectly prepared for.

The process of involving multiple staffers and teams and relying on research for high-profile decisions does not appear out of line with how Twitter and other social platforms make content moderation decisions, especially in crisis situations.

“This is how the whole process went … this is not really out of the ordinary,” one former Twitter executive told CNN, noting that the various teams involved in content decisions would push each other to consider context and information they might not have thought of as they worked through how to handle difficult issues. “I think these conversations look like people were trying to be really thoughtful and careful,” the former executive said.

It’s not just Twitter that wrestles with tough decisions, including around Trump. Meta also had a monthslong back-and-forth with its internal team and its external oversight board about its own decision to suspend Trump on Facebook and Instagram.

The Files also point to several instances in which Twitter leaders changed, or considered changing, the company’s policies, presenting those episodes as evidence of ulterior motives. One screenshot, for example, showed a Slack message from an unnamed employee the day after Trump’s ban discussing a desire to address medical misinformation and “getting to a place of improved maturity in how our policies are actualized.” But examining emergent concerns and considering whether they require new or updated policies is precisely the job of social media trust and safety teams.

The “Twitter Files” threads appear to have been written “with a very clear agenda,” the former executive said. “What they seem to have missed … is just how much power and influence was sitting on the shoulders of a very small number of people.”

Even Dorsey, in his Tuesday night post, called for a radical overhaul of how social media works, one that would take away the power of big social media platforms, including the one he co-founded. “I generally think companies have become far too powerful,” Dorsey said. He added that he is pushing for the growth of decentralized social media that is not controlled by any one corporation or individual and that lets users choose their own forms of content moderation.

Still, the Twitter Files reports show just how many of the company’s employees and teams were involved in the deliberations over difficult content decisions. According to the former Twitter executive, that was by design. “Twitter’s process was designed to make sure that the decision doesn’t come down to just one person,” they said. “The alternative is that you wait until Jack Dorsey decides he doesn’t like somebody and you take it down.”

And despite the often-charged rhetoric about the people making content decisions at social media companies, “the people who do this work are thoughtful, are skilled,” Perault said. “They’re deeply connected to the technology, to the products, to the social implications of their products.”

The process under Musk now appears to be much different: the new Twitter owner has fired many of the employees who had been responsible for safety on the platform, used easily manipulated Twitter polls to justify major content rulings, done away with Twitter’s council of outside trust and safety experts and based at least one decision about whom to allow on the platform on his personal feelings.

It’s hard to argue that process isn’t messy, too.




