Last July, England footballers Marcus Rashford, Jadon Sancho and Bukayo Saka were targeted after missing penalties in the Euro 2020 final at Wembley. Twitter removed more than 1,900 tweets but acknowledged it needed to do better.

Six months on, CARR researchers looked for profiles using simple words and phrases as indicators of “systemic failure” over two days in January. They found around 300 users or profile names on Twitter derived from a racist phrase, including the N-word, dating as far back as 2009.

Either way, it would appear that there is no automatic moderation being performed by Twitter in terms of analysing existing accounts for offensive usernames containing (the N-word), and no moderation when it comes to initially setting your username or handle to contain the same term.

Dr Bethan Johnson from CARR identified dozens of offensive Facebook profiles, including 83 variants of “hate (N-word)” and 91 on the Holocaust. Others included the name Adolf Hitler and other high-profile Nazis, as well as the names of mass killers such as the Christchurch mosque attacker in New Zealand.

By changing the spelling or inserting spaces and special characters, profiles appeared to fool moderation systems, she suggested.

Dr Johnson said the findings highlighted “significant room for improvement”. She said: “In the case of Facebook, it may be that when users set up profiles with names that clearly mock and flout community standards - from ‘Jewkilla’ to ‘Nate Higgers’ - they are telling Facebook what kind of user they will be, what kind of ideas they bring to the platform, and the reality is that is far from community-orientated.”

An analysis of the digital gaming service Steam revealed more than 300,000 offensive profile names. Of those, 241,729 were anti-black, 44,368 white supremacist, more than 28,000 neo-Nazi, 8,021 anti-Semitic, 5,607 homophobic, and 168 anti-Muslim.
Some usernames – or ‘handles’ – made a flagrant and even proud mockery of Twitter’s terms of service, Prof Feldman added, saying: “What’s the point of claiming to provide moderation when this stuff is only a click away? This material is disgusting and makes it seem that platforms just don’t care enough to address this running sore.

“Is Facebook really unable to moderate celebrations of the Holocaust because of the odd apostrophe?

“It doesn’t matter if you have billions of users if the most vulnerable are subjected to this kind of abuse repeatedly, and seemingly without either protection or action.”

He said platforms had a “duty of care” to users but only Government regulation and the threat of tens of millions in fines would bring change.

“Otherwise, these platforms will stay reactive – badly – rather than proactive in taking down hateful extremism,” Prof Feldman said.