Whatever they relate to, videos with “inappropriate” content are removed at lightning speed by powerful artificial intelligence working alongside human moderators: 94.2% are taken down before anyone sees them. Of the videos removed automatically, 17 million were flagged by systems searching for “infringing content”.
And other social media companies share a similar story – thousands of hours of content taken down every day. Now, some are asking if Big Tech, by removing so much content, is also removing footage of war crimes in Ukraine.
The graphic content and the war
TikTok was already hugely popular around the world before Russian President Vladimir Putin’s decision to invade Ukraine – but the war was a maturing moment for the platform. Videos using various Ukrainian hashtags have had billions of views.
But Ukrainians uploading videos from the ground could be generating more than “likes”. They may well be contributing a piece to a puzzle of evidence that will one day be used to prosecute war crimes.
But they could also be breaking TikTok’s and other social media companies’ strict rules about graphic content. “TikTok is a platform that celebrates creativity, but not shock or violence,” say the TikTok rules.
“We do not allow content that is gratuitously shocking, graphic, sadistic or macabre.” And some, but not all, content depicting possible atrocities may fall into this category.
The ‘big question’
Researchers aren’t sure how much Ukrainian user-generated content TikTok and other social media companies such as Meta, Twitter and YouTube are taking down. “TikTok is not as transparent as some of the other companies – and none of them are very transparent,” says WITNESS program director Sam Gregory.
“You don’t know what was removed because it was graphic but was potentially evidence. There is a big problem here.”
This is not the first time that major social media companies have had to deal with evidence of potential war crimes. The conflict in Syria has brought similar problems. Human Rights Watch has been calling for a centralized system of conflict zone uploads for years, to no avail.
“At the moment this does not exist,” says senior conflict researcher Belkis Wille. She describes the haphazard and complicated process prosecutors have to follow to obtain evidence that has been removed from social media. In Wille’s words:
“Authorities can write to social media companies, or ask for a subpoena or court order…but the way the process works right now, no one has a very good picture of where all that content is.”
And that’s a real problem for investigators. Even before Ukraine, those trying to document atrocities highlighted how the increased pace of content removal was having a detrimental effect on evidence gathering.
“This takedown rate means that human rights actors are increasingly losing the race to identify and preserve information,” said a report on digital evidence of atrocities by the Human Rights Center at Berkeley School of Law.
The report called for “digital lockers” – places where content can be stored and reviewed not just by social media companies, but by non-governmental organizations (NGOs) and legal experts.
But many social media companies don’t want to invite outsiders into their moderation processes, leaving researchers with a Rumsfeldian conundrum — they often don’t know what’s being pulled, so how can they know what to request or subpoena?
These are unknown unknowns. But not all social media platforms have the same policies when it comes to graphic content. Telegram has been extremely important in sharing videos from the ground in Ukraine. It also happens to have an extremely light policy on moderation. Videos that would be removed from Twitter or Facebook remain on Telegram.
And that’s not the only reason the platform is helping investigators. “I would say that some of the most valuable photo and video content that we as an organization receive is from Telegram,” says Wille.
And there’s another key benefit: Social media companies like Facebook and Twitter automatically strip the metadata from a photo or video – a sort of digital ID that reveals where and when the content was captured, and is crucial to investigators. “One benefit we’ve found is that metadata is not removed in Telegram,” says Wille.
The protection of metadata, from the moment a piece of evidence is captured to the moment it is filed in court, is sometimes referred to as the “chain of custody”.
Wendy Betts, director of eyeWitness, an International Bar Association project focused on collecting verifiable evidence of human rights atrocities, encourages people to film potential war crimes on its app, eyeWitness to Atrocities, to preserve the information in the best possible way for use in court. In her words:
“As the footage moves from the photographer to the investigator to the attorney… if any link in that chain is missing, that footage will be viewed as more suspect because changes may have been made during that gap.”
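The chain-of-custody idea above can be illustrated with cryptographic hashing: a fingerprint of the file is recorded at capture time, and any later alteration breaks the match. This is a minimal sketch of the general technique, not eyeWitness’s actual implementation; the byte strings stand in for real video files.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

# Digest recorded at capture time (e.g. on the photographer's phone).
original = b"raw video bytes..."
captured_digest = sha256_of(original)

# Later, before the footage is filed in court, re-hash and compare.
received = b"raw video bytes..."
print(sha256_of(received) == captured_digest)  # unchanged file: chain intact

# Any modification, however small, produces a different digest.
tampered = b"raw video bytes!.."
print(sha256_of(tampered) == captured_digest)  # altered file: chain broken
```

If every link in the chain (photographer, investigator, attorney) records and verifies the digest, a gap or mismatch immediately flags the footage as suspect.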
But all these solutions seem fragmented and unsatisfactory. Without a digital locker that all social media companies use, without a place where all this is being stored, crucial evidence could be falling through the cracks.
Different answers from companies
In some cases, it’s not clear that social media companies are storing or documenting these videos. BBC News asked TikTok, Google, Meta and Twitter about their policies in this area.
TikTok pointed to its policies for protecting its users during the war in Ukraine, but did not respond to any of the questions asked. “We don’t have more to share other than this information right now,” said a representative. Twitter and Google also did not respond. Only Meta gave a personalized answer.
In the words of a representative regarding the removal of videos that could be related to the war:
“We will only remove this type of content when it glorifies violence or celebrates the suffering of others, or when the content is extremely graphic or violent – for example, dismemberment video footage. In relation to the war in Ukraine specifically, we are exploring ways to preserve this and other types of content when we remove it.”
The very different responses from these big four tech companies tell their own story. There is no system, and no policy, that they all share. And until there is, crucial evidence may be lost and forgotten.
This article is a translation of a report by James Clayton for BBC News.