When John Travolta turns around, looking from left to right, it signals something like: something is missing here. "Confused Travolta", a short excerpt from the film Pulp Fiction, is a well-known meme. With shrugging shoulders and outstretched arms, the actor makes clear that he is looking for something that isn't there. On social networks, people are constantly putting this scene into new contexts.
A TikTok user captioned the video: "Me in heaven looking for the 6 million." The six million are the Jews murdered in the Shoah. Since Travolta is looking for something that is not there, the TikTok user implies that there were no six million murdered Jews. At first glance, the video looks like millions of other memes on TikTok; Men at Work's Down Under rumbles cheerfully in the background. But it is a clear denial of the Holocaust. Almost 25,000 people saw it.
The example shows how right-wing extremists operate on TikTok. A study published on Tuesday by the London think tank Institute for Strategic Dialogue (ISD) documents which such videos exist on the platform and what means they use.
The study is not representative; it allows no statement about the proportion of extremist content among all videos published on TikTok. The more than 1,000 videos examined were specifically selected because they might contain hate messages, and were then analyzed to determine how they convey hatred, discrimination and extremism.
“TikTok does not consistently implement its own rules”
Videos like these should not exist on TikTok at all. The network has signed an EU-wide code of conduct against hate speech and has imposed strict rules on what kind of content is allowed. Anything that contradicts these rules is supposed to be deleted.
According to its own figures, the network removed almost 90 million videos from the platform in the second half of 2020 for violating these community guidelines, which cover not only extremism, hatred and hate speech but also, for example, videos showing self-harm or too much bare skin. The ISD study shows that videos with hate messages nevertheless make it through the controls: 80 percent of the videos he examined were still online when the investigation ended, says Ciaran O'Connor, the study's author.
"The rules that TikTok sets for itself are not bad," says O'Connor. "But TikTok is not implementing them consistently enough." Hate messages are removed, but too inconsistently and imprecisely. The study argues, for example, that TikTok needs to define more clearly what it means by extremism and develop a better understanding of how individual video elements are deliberately used to spread hatred.
This is also important because young people in particular use TikTok. That political content plays a role there is nothing new. It has been reported time and again that activists of the Black Lives Matter movement made intensive use of the platform. In addition to liberal and progressive views, conservative content also finds its audience there, communication scientist Ioana Literat of Columbia University said in an interview with the New York Times.
It is therefore not particularly surprising that extremists and right-wing extremists also try to use the platform for themselves. That is dangerous in part because TikTok is known for how quickly and widely content can spread. On TikTok, users do not only see content from accounts they follow. Videos selected by an algorithm are displayed and played on the For You page, the start page shown when the app is opened: content that matches what a user has previously watched and liked, but also videos on other topics that are currently receiving a lot of attention.
The criteria by which the selection is made are a trade secret. Once the algorithm deems a video worth showing to many people, it can spread faster than on other networks, where accounts first have to laboriously build a following.
Some of the videos examined in the ISD study also appear to have spread this way; there is hardly any other explanation for their view counts. The most-viewed video in the study, with two million hits, is a clip dealing with anti-Asian racism in connection with the coronavirus. Even videos denying the Holocaust were viewed a good 650,000 and 230,000 times.
The range of hatred the study's author found on TikTok is wide. 30 percent of the videos examined are assigned to the "white supremacy" category, i.e. the idea that people with white skin are superior to others. Other frequently found themes were anti-Semitism, hatred of Black people, LGBTQ people, women and Muslims, and the glorification of extremist groups or individuals.
In particular, the research focuses on the ways in which people try to spread hatred and extremism on TikTok: building replicas of the Auschwitz extermination camp in the game Minecraft, making photo slideshows of Nazi marches, showing footage of the Christchurch attacker, and using excerpts from speeches by Adolf Hitler and Joseph Goebbels.
The creators, i.e. the authors of the extremist content, make adept use of the platform's features, such as filters and effects that give old footage a modern look, or green-screen technology that places videos against a different background. Extremely hateful content is packaged in the style of light entertainment. A popular stylistic device on TikTok is reacting to other users' videos; the app has special functions for this. Extremists use these too, in order to ridicule people.
Some of the videos O'Connor found are not difficult to decode. Others are more subtle. One video shows George Floyd, the Black man killed by a police officer, in a colorful look, set to music with the lyrics "I can't breathe". The account that posted the video has an "88" in its name, a code from the neo-Nazi scene: the eighth letter of the alphabet doubled, "HH", stands for "Heil Hitler". "It's only in combination with the profile name and the reference to Floyd's last words that it becomes clear that the video is supposed to mock him," says O'Connor.
Songs and sound effects play a special role on TikTok. MGMT's song Little Dark Age, for example, is the most widely used among the videos examined. That does not mean the electro-pop song has a racist meaning; rather, it shows that racists also participate in big trends: Little Dark Age was a very popular song on TikTok in 2020. The app offers the option of displaying all videos that use a certain sound in the background, so using a popular song can increase the visibility of one's own clip and attract new audiences. "Sounds work on TikTok in a similar way to hashtags for grouping and assigning content," says researcher O'Connor.
To avoid deletion, many accounts use more or less hidden signals, for example by deliberately misspelling hashtags. And if they are deleted, many accounts reappear immediately under slightly different names.
To be able to examine all of this even better in the future, more transparency would be necessary, according to O'Connor. For his investigation, he had to search for the videos manually, one by one, in the app, a tedious process. He calls on TikTok to give scientists and journalists better access to the data, ideally via an interface, a so-called API. "That would make it even easier to investigate these problems," he says. Scientists in the USA and Germany are making similar demands of other large platforms, such as Facebook and Instagram, for better access to data.
The fact that he had to search for the videos by hand had another side effect: O'Connor was able to observe how the algorithm adapted more and more to his interest in extremist content, even though it was only for research purposes. "When I open the TikTok account I used for the search, I see content similar to what I examined."
Originally published at https://www.tehnologijaviews.xyz.