Meanwhile, the Russian government said on Friday it was blocking the popular social media app Instagram, taking further action against Meta – the parent company of Facebook, Instagram and WhatsApp – over reports the day before that Facebook had temporarily suspended its hate speech rules to allow posts calling for the death of Russian leader Vladimir Putin. The country previously blocked Facebook, which has a much smaller audience in Russia than WhatsApp or Instagram.
The gradual escalations over the past two weeks between Russia and the tech giants have forced companies to rethink how they police online speech, rewriting their rules as they go in response to the rapidly evolving dispute. Social platforms are essential tools for the public to communicate and share information during wartime, but Russian propaganda outlets have also used them to spread disinformation about the war. And the companies are weighing pressure from world leaders to increase Russia’s isolation against possible retaliation from the country itself.
Following Russian interference in the 2016 election and a global pandemic, companies such as Facebook, Twitter and YouTube have moved away from a historically hands-off approach to controlling content on their platforms, creating new rules to try to stop the spread of misinformation that they believe could cause harm in the real world.
But the Ukraine conflict has sparked a wave of new policy decisions and rule changes as companies have banned state media and allowed some speech previously seen as hateful.
“This is clearly a crisis information environment and technology companies are making a lot of decisions on the fly,” said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab.
Companies say there is precedent for last-minute decisions and the need to stay nimble during fast-paced global events.
“I want to be perfectly clear: our policies are focused on protecting people’s right to free speech as an expression of self-defense in response to a military invasion of their country,” said Nick Clegg, Meta’s president of global affairs. “The point is that if we applied our standard content policies without any adjustments, we would now be removing content from ordinary Ukrainians expressing their resistance and fury against invading military forces, which would rightly be viewed as unacceptable.”
Instagram head Adam Mosseri tweeted on Friday that Instagram has 80 million users in Russia who will be cut off “from each other and the rest of the world,” noting that 80 percent of people in Russia follow an Instagram account outside their country. “This is wrong,” he wrote.
Facebook also temporarily suspended its hate speech policies last year, allowing Iranians to call for the death of the country’s leader, Ali Khamenei, for two weeks during a period of government crackdown.
YouTube adopted a policy against “denying, minimizing, or trivializing well-documented violent events” in 2019, citing the Holocaust and the Sandy Hook school shooting as examples. On Friday, the company said it would block Russian state media channels worldwide “in line with this policy.”
The companies have long resisted cracking down on Russian state-backed channels – despite their known propensity for broadcasting propaganda – because they feared being shut out of the country. The companies also feared being seen as inconsistent, since news outlets like PBS in the United States and the BBC in Europe are also state-backed. Instead, the companies opted in 2018 to label media outlets that receive the bulk of their funding from governments.
As a result, state broadcasters have built large followings on social media. With over 16 million subscribers for its English, Spanish and Arabic channels combined, the outlet RT has claimed to be the most-watched news network on YouTube, garnering over 10 billion views over the years. RT’s English YouTube channel gained an additional 130,000 subscribers in the weeks leading up to the war in Ukraine.
The companies say that in places where the government controls the news media, Western social media services are often one of the few places where people can organize and express their opinions more freely. Services like Instagram and YouTube are hugely popular with Russian audiences and have been a place where some critics of the invasion have been able to find a platform, despite stiff penalties for dissent inside Russia.
But the calculus of social media companies is rapidly changing amid international condemnation of the Russian invasion and Russian retaliation against Silicon Valley services, in what some are calling a new digital iron curtain. Increasingly, companies are willing to choose a side.
State media has inaccurately portrayed Russia as liberating Ukrainians who support Russia and protecting them from “Ukrainian Nazis.” The Russian government does not call the war a “war,” but rather a “special operation” to “demilitarize and denazify” Ukraine.
(The Ukrainian government and its Jewish president were democratically elected, and a recent poll by a Kyiv-based agency showed that almost 80% of Ukrainians oppose making concessions to Russia and 67% said they were prepared to put up armed resistance against Russia.)
By closing outlets like RT and Sputnik around the world, YouTube is knowingly risking a retaliatory shutdown in Russia. Facebook was also prepared to risk a total shutdown of Instagram and WhatsApp when it hastily implemented the temporary policy allowing people to call for death to Putin and the Russian invaders earlier this week.
Former Facebook security chief Alex Stamos said that despite tech companies’ reluctance to do so, in geopolitical events where thousands of people are dying, “it’s okay to choose a side. In fact, it’s the only reasonable thing you can do. Because if you don’t choose a side, you are actually choosing the side of the powerful over their victims,” he said.
Tech companies have long championed the protection of free speech and have been reluctant to remove political content unless it is overtly violent. This hands-off stance has changed in recent years, but researchers have consistently shown that tech companies struggle to enforce their rules consistently. Some say the companies have been too permissive with state media.
The coronavirus pandemic is a prime example, said Brookie of the Atlantic Council. For years, tech companies said they wouldn’t block misinformation on their platforms because they didn’t want to be arbiters of truth, but then they began removing coronavirus content that experts said went against public health guidelines.
This extended to RT, whose initials once stood for Russia Today, which media observers have noted was pushing different narratives about the coronavirus pandemic depending on its audience. Domestically, it supported mask wearing and vaccines, while on its English, French, Spanish and German channels it aired stories about how mask mandates were an attack on people’s freedoms. (Last year, YouTube shut down two German RT channels over the spread of false covid claims, prompting the Russian government to pledge “retaliatory action.”)
Facebook has been criticized by its own independent Oversight Board for having inconsistent rules. The company created an exception to its hate speech rules for world leaders, but was never clear about which leaders got the exception or why. After Facebook suspended President Donald Trump’s account following the Jan. 6 violence a year ago, the Oversight Board said the decision was correct but that Facebook had made it without a clear rationale or plan.
The tit-for-tat between the companies and Russia began in the early days of the invasion, when Facebook began fact-checking posts from Russian state outlets and labeling misleading items. Facebook also quickly changed its hate speech policy to allow praise of a previously banned neo-Nazi group in Ukraine that was fighting against Russia.
The Russian government asked Facebook to remove these fact checks, but Facebook refused, Clegg said in tweets last week. The company argued that its services are essential for activists and ordinary Russians to be able to communicate with their families. Russia’s internet regulator then said it would start limiting access to Facebook services.
Facebook, YouTube and others responded by blocking Russian state media outlets from buying advertising. Then the European Commission announced it was banning the state media services RT and Sputnik and asked tech companies to comply with the regional ban.
The tech companies have complied and also said they will further limit the reach of Russian government-backed outlets around the world.
A week ago on Friday, Russia announced that it was completely blocking Facebook. But the Russian internet regulator did not extend the block to the more popular Instagram or WhatsApp.
On Thursday, a Facebook content moderator leaked new guidelines showing that Facebook had decided to make an exception to its own rules to allow some calls for violence against the Russian invaders. The company confirmed the leak.
Russia’s attorney general said on Friday the government was opening a criminal investigation against Meta and seeking the company’s classification as an “extremist organization and a ban on its activities” on Russian territory, alleging the platform had been used to incite “mass riots accompanied by violence”.
The invasion and the flurry of demands to remove certain content have created an emergency for tech companies, so relatively quick policy changes make sense, said Daphne Keller, who was associate general counsel at Google until 2015 and now directs the Program on Platform Regulation at Stanford University’s Cyber Policy Center.
“When it comes to a crisis situation, doing something exceptional that isn’t already covered by law or isn’t already covered by the platform’s discretionary policies makes sense,” she said.
But the resulting Russian retaliation has negative downstream effects, Brookie said.
“Creating a digital iron curtain that cuts the Russian people off from a global information environment will make it harder for the Russian government to be held accountable for its actions,” he said.
Craig Timberg, Ellen Frances, Taylor Telford and Mary Ilyushina contributed reporting.