In May 2019, Facebook invited the organizing bodies of English soccer to its London offices off Regent’s Park. On the agenda: what to do about the growing racist abuse on the social network against Black soccer players.
At the meeting, Facebook gave representatives from four of England’s main soccer organizations — the Football Association, the Premier League, the English Football League and the Professional Footballers’ Association — what they felt was a brushoff, two people with knowledge of the conversation said. Company executives told the group that they had many issues to deal with, including content about terrorism and child sex abuse.
A few months later, Facebook provided soccer representatives with an athlete safety guide, including directions on how players could shield themselves from bigotry using its tools. The message was clear: It was up to the players and the clubs to protect themselves online.
The interactions were the start of what became a more than two-year campaign by English soccer to pressure Facebook and other social media companies to rein in online hate speech against their players. Soccer officials have since met numerous times with the platforms, sent an open letter calling for change and organized social media boycotts. Facebook’s employees have joined in, demanding that it do more to stop the harassment.
The pressure intensified after the European Championship last month, when three of England’s Black players were subjected to torrents of racial epithets on social media for missing penalty kicks in the final game’s decisive shootout. Prince William condemned the hate, and the British prime minister, Boris Johnson, threatened regulation and fines for companies that continued to permit racist abuse. Inside Facebook, the incident was escalated to a “Site Event 1,” the equivalent of a companywide five-alarm fire.
Yet as the Premier League, England’s top division, opens its season on Friday, soccer officials said that the social media companies — especially Facebook, the largest — hadn’t taken the issue seriously enough and that players were again steeling themselves for online hate.
“Football is a growing global market that includes clubs, brands, sponsors and fans who are all tired of the obvious lack of desire from the tech giants to develop in-platform solutions for the issues we are dealing with daily,” said Simone Pound, head of equality, diversity and inclusion for the Professional Footballers’ Association, the players’ union.
The impasse with English soccer is another instance of Facebook’s failing to solve speech problems on its platform, even after it was made aware of the level of abuse. While Facebook has introduced some measures to mitigate the harassment, soccer officials said they were insufficient.
Social media companies aren’t doing enough “because the pain hasn’t become enough for them,” said Sanjay Bhandari, the chair of Kick It Out, an organization that supports equality in soccer.
This season, Facebook is trying again. Its Instagram photo-sharing app is expected to roll out new features on Wednesday to make racist material harder to view, according to an internal document obtained by The New York Times. Among them is a feature that lets users hide potentially harassing comments and messages from accounts that either don’t follow them or only recently followed them.
“The unfortunate reality is that tackling racism on social media, much like tackling racism in society, is complex,” Karina Newton, Instagram’s global head of public policy, said in a statement. “We’ve made important strides, many of which have been driven by our discussions with groups being targeted with abuse, like the U.K. football community.”
But Facebook executives also privately acknowledge that racist speech against English soccer players is likely to continue. “No one thing will fix this challenge overnight,” Steve Hatch, Facebook’s director for Britain and Ireland, wrote last month in an internal note that The Times reviewed.
Some players appear resigned to the abuse. Four days after the European Championship final, Bukayo Saka, 19, one of the Black players who missed penalty kicks for England, posted on Twitter and Instagram that the “powerful platforms are not doing enough to stop these messages” and called it a “sad reality.”
Around the same time, Facebook employees continued to report hateful comments to their employer on Mr. Saka’s posts in an effort to get them taken down. One that was reported — an Instagram comment that read, “Bro stay in Africa” — apparently did not violate the platform’s rules, according to the automated moderation system. It stayed up.
Much of the racist abuse in English soccer has been directed at Black superstars in the Premier League, such as Raheem Sterling and Marcus Rashford. About 30 percent of players in the Premier League are Black, Mr. Bhandari said.
Over time, these players have been harassed at soccer stadiums and on Facebook, where users are asked to provide their real names, and on Instagram and Twitter, which allow users to be anonymous. In April 2019, fed up with the behavior, some players and two former captains of the national team, David Beckham and Wayne Rooney, took part in a 24-hour social media boycott, posting red badges on Instagram, Twitter and Facebook with the hashtag #Enough.
A month later, English soccer officials held their first meeting with Facebook — and came away disappointed. Facebook said that “feedback from the meeting was taken on board and influenced further policy, product and enforcement efforts.”
Tensions ratcheted up last year after the police killing of George Floyd in Minneapolis. When the Premier League restarted in June 2020 after a 100-day coronavirus hiatus, athletes from all 20 clubs began each match by taking a knee. Players continued the symbolic act last season and said they would also kneel this season.
That has stoked more online abuse. In January, Mr. Rashford used Twitter to call out “humanity and social media at its worst” for the bigoted messages he had received. Two of his Manchester United teammates, who are also Black, were targeted on Instagram with monkey emojis — which are meant to dehumanize — after a loss.
Inside Facebook, employees took note of the surge in racist speech. In one internal forum meant for flagging negative press to the communications department, one employee started cataloging articles about English soccer players who had been abused on Facebook’s platforms. By February, the list had grown to about 20 different news clips in a single month, according to a company document seen by The Times.
English soccer organizations continued meeting with Facebook. This year, organizers also brought Twitter into the conversations, forming what became known as the Online Hate Working Group.
But soccer officials grew frustrated at the lack of progress, they said. There was no indication that Facebook’s and Twitter’s top leaders were aware of the abuse, said Edleen John, who heads international relations and corporate affairs for the Football Association, England’s governing body for the sport. She and others began discussing writing an open letter to Mark Zuckerberg and Jack Dorsey, the chief executives of Facebook and Twitter.
“Why don’t we try to communicate and get meetings with individuals right at the top of the organization and see if that will make change?” Ms. John said in an interview, explaining the thinking.
In February, the chief executives of the Premier League, the Football Association and other groups published a 580-word letter to Mr. Zuckerberg and Mr. Dorsey accusing them of “inaction” against racial abuse. They demanded that the companies block racist and discriminatory content before it was sent or posted. They also pushed for user identity verification so offenders could be rooted out.
But, Ms. John said, “we didn’t get a response” from Mr. Zuckerberg or Mr. Dorsey. In April, English soccer organizations, players and brands held a four-day boycott of social media.
Twitter, which declined to comment, said in a blog post about racism on Tuesday that it had been “appalled by those who targeted players from the England football team with racist abuse following the Euro 2020 Final.”
At Facebook, members of the policy team, which sets the rules around what content stays up or comes down, pushed back against the demands from soccer officials, three people with knowledge of the conversations said.
They argued that terms or symbols used for racist abuse — such as a monkey emoji — could have different meanings depending on the context and should not be banned completely. Identity verification could also undermine anonymity on Instagram and create new problems for users, they argued.
In April, Facebook announced a privacy setting called Hidden Words to automatically filter out messages and comments containing offensive words, phrases and emojis. Those comments cannot then be easily seen by the account user and will be hidden from those who follow the account. A month later, Instagram also began a test that allowed a slice of its users in the United States, South Africa, Brazil, Australia and Britain to flag “racist language or activity,” according to documents reviewed by The Times.
The test generated hundreds of reports. One internal spreadsheet outlining the results included a tab titled “Dehumanization_Monkey/Primate.” It had more than 30 examples of comments using bigoted terms and emojis of monkeys, gorillas and bananas in connection with Black people.
‘The Onus Is on Them’
In the hours after England lost the European Championship final to Italy on July 11, racist comments against the players who missed penalty kicks — Mr. Saka, Mr. Rashford and Jadon Sancho — escalated. That set off a “site event” at Facebook, eventually triggering the kind of emergency response usually reserved for a major system outage.
Facebook employees rushed to internal forums to say they had reported monkey emojis or other degrading stereotypes. Some workers asked if they could volunteer to help sort through content or moderate comments for high-profile accounts.
“We get this stream of utter bile every match, and it’s even worse when someone black misses,” one employee wrote on an internal forum.
But the employees’ reports of racist speech were often met with automated messages saying the posts did not violate the company’s guidelines. Executives also provided talking points to employees that said Facebook had worked “swiftly to remove comments and accounts directing abuse at England’s footballers.”
In one internal comment, Jerry Newman, Facebook’s director of sports partnerships for Europe, the Middle East and Africa, reminded workers that the company had introduced the Hidden Words feature so users could filter out offensive words or symbols. It was the players’ responsibility to use the feature, he wrote.
“Ultimately the onus is on them to go into Instagram and input which emojis/words they don’t want to feature,” Mr. Newman said.
Other Facebook executives said monkey emojis were not typically used negatively. If the company filtered certain terms out for everyone, they added, people might miss important messages.
Adam Mosseri, Instagram’s chief executive, later said the platform could have done better, tweeting in response to a BBC reporter that the app “mistakenly” marked some of the racist comments as “benign.”
But Facebook also defended itself in a blog post. The company said it had removed 25 million pieces of hate content in the first three months of the year, while Instagram took down 6.3 million pieces, 93 percent of which was removed before a user reported it.
Kelly Hogarth, who helps manage Mr. Rashford’s off-field activities, said he had no plans to leave social media, which serves as an important channel to fans. Still, she questioned how much of the burden should be on athletes to monitor abuse.
“At what point does responsibility come off the player?” she wondered. She added, “I wouldn’t be under any illusions we will be in exactly the same place, having exactly the same conversation next season.”