Opinion

Tech companies continue evading accountability for violent content

Governments must step in to stop platforms allowing pro-terrorist material

The symbol of the Islamic State terrorist group on a smartphone: when it comes to extremist content online, time is of the essence.   © NurPhoto/Getty Images

David Ibsen is Executive Director of the Counter Extremism Project, a not-for-profit, nonpartisan, international policy organization formed to combat the growing threat from extremist ideologies.

On March 15, 2019, a white supremacist went on a shooting rampage at two mosques in Christchurch, New Zealand, and killed 51 people. He livestreamed 17 minutes of the attack on Facebook, and 4,000 people watched.

Following that attack, governments around the world gathered to discuss how to prevent such viral displays of violence from happening again. Tech companies sat at the table and promised to do more.

The 12 months following the Christchurch shooting saw two terror attacks in Asia, yet little has changed. In early February, another shooting in Thailand claimed 29 lives. The attacker posted on Facebook throughout the attack and even uploaded a video complaining that he felt tired from all the shooting. It is unclear how long it took Facebook to shut down the attacker's account, and whether it did so only after a request from the Thai government.

The problem is not limited to livestreaming: the internet has become a repository for violent content of all kinds, which remains accessible despite tech platforms' claims that they are removing nearly all of it. Now governments need to act.

Following the stabbing of one Australian and two Chinese tourists in the Maldives on February 4, researchers at the Counter Extremism Project, or CEP, where I am executive director, found Maldivian content promoting the Islamic State terrorist group on Vimeo, SoundCloud and WordPress that had been online for years.

CEP constantly finds evidence of tech platforms' inadequate response. In 2018, we scanned YouTube for three months for IS-related content and found over 1,300 videos that had garnered 163,000 views. Of those videos, 91% had been uploaded more than once, and 60% of the accounts responsible remained online even after the videos were removed.

The tech companies' abdication of responsibility for what happens on their platforms has concrete consequences. After the rout of the IS in the Middle East, Southeast Asia has the potential to become another terrorism hot spot. While it is uncertain whether the IS would move its base to the region, the threat of radicalization is here to stay.

People embrace at the memorial site for the victims outside the mosque in Christchurch in March 2019: The tech companies' abdication of responsibility for what happens on their platforms has concrete consequences.   © Reuters

Even in a country as vigilant as Singapore, a man was charged in January with sending money to the IS and a 17-year-old was detained for supporting the terrorist group. Both were radicalized online.

Technology companies keep promising more effective removal of extremist content whenever they are threatened with fines and regulation. However, we have yet to see these promises translate into measurable, systematic and transparent action that leads to this content being permanently removed.

The solution to this problem has to be both technical and political. Advanced hashing technology, which assigns a unique, recognizable digital signature to photos and videos, already gives us the ability to remove harmful content.

CEP senior adviser Hany Farid, a computer science professor and digital forensics expert, developed eGlyph, a tool capable of detecting and blocking known extremist images, videos and audio files from upload.
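To see how hash-based matching works in principle, consider the minimal sketch below, written in Python with the open-source imagehash library. This is an illustration of the general technique, not eGlyph itself, whose implementation is not public; the hash database, threshold and function names here are hypothetical. The key idea is that a perceptual hash, unlike a cryptographic one, changes only slightly when an image is re-encoded or lightly edited, so known material can still be caught at upload.

# Illustrative sketch only. eGlyph's actual algorithm is not public;
# the database entries, threshold and function names below are hypothetical.
# Requires: pip install imagehash pillow
import imagehash
from PIL import Image

# Hypothetical database of perceptual hashes of known extremist images.
KNOWN_EXTREMIST_HASHES = [
    imagehash.hex_to_hash("d879f8f89b1bbfff"),  # placeholder entry
]

# Hamming-distance tolerance: allows re-encoded or lightly edited copies
# to match, while an exact cryptographic hash would miss them entirely.
MAX_HAMMING_DISTANCE = 6

def should_block_upload(path: str) -> bool:
    """Return True if the uploaded image matches a known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MAX_HAMMING_DISTANCE
               for known in KNOWN_EXTREMIST_HASHES)

In a real deployment, the matching would run server-side at upload time against a shared, curated hash database, which is why industry-wide cooperation on such databases matters as much as the algorithm itself.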

If a nonprofit organization such as CEP is capable of developing efficient tools to tackle extremist content online, big tech companies should be able to do the same.

At the political level, governments should stop relying on self-regulation and force the tech industry to face its responsibilities. New legislation approved in Australia last year is a welcome step forward: the law holds social media companies, websites and internet service providers liable for fines of up to 10% of their annual global turnover if they fail to promptly remove offending material.

When it comes to extremist content online, time is of the essence. Therefore, in addition to holding tech platforms liable for the content they host, we need to introduce binding rules for proactive measures.

These rules would require online platforms and companies to deploy automated filters and algorithmic tools that identify designated terrorist content and prevent its upload.

We cannot afford to wait for removal orders for every piece of content, particularly when the same material appears time after time. Companies need to be required, under legislation, to do more.
