Tracking how Russia fabricated its pretext for invading Ukraine

Analysis of video and social media posts reveals extent of disinformation campaign

Disinformation about Russia's invasion of Ukraine is spreading. What methods were used to create false information, and where did it originate? Nikkei sought out the main threads and analyzed how they spread on Telegram, a messaging app widely used in the former Soviet bloc.

Topic 1

How has Russian disinformation spread on social media?

Nikkei searched for suspected fake news related to Russia's invasion and found 15 posts from Russian government-affiliated accounts containing false information, including images of fake corpses and fake documents. In the diagram, accounts that spread such disinformation are shown as circles, and quotes or other forwards of the disinformation are shown as lines connecting the circles. Together, the circles and lines map how the disinformation spread. The size of a circle indicates the number of views of a post, showing which kinds of accounts contributed to the spread of the fakes.
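The diagram is, in effect, a directed graph: accounts are nodes sized by view counts, and each quote or forward is an edge. Below is a minimal sketch of how such a graph could be assembled, using the Python networkx library; the account names and numbers are hypothetical illustrations, not Nikkei's data.

  import networkx as nx

  # Nodes are Telegram accounts; the "views" attribute encodes post views
  # and would drive circle size. Edges point from the quoted/forwarded
  # account to the account that spread its post further.
  G = nx.DiGraph()
  G.add_node("gov_channel", kind="government-affiliated", views=200_000)   # hypothetical
  G.add_node("expert_a", kind="expert close to the government", views=80_000)
  G.add_node("sockpuppet_1", kind="created to disinform", views=15_000)

  G.add_edge("gov_channel", "expert_a")    # expert_a forwarded gov_channel's post
  G.add_edge("expert_a", "sockpuppet_1")   # sockpuppet_1 re-shared it further

  # Running total of views, like the counter shown alongside the diagram.
  print(sum(views for _, views in G.nodes(data="views")))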

How to read the diagram
  • Account type: Government-affiliated; Experts close to the government; Accounts thought to have been created to disinform; Accounts distancing themselves from the government; Others
  • Country: Russia; China; Armenia; Others

Government-affiliated accounts disseminate pretexts as invasion approaches

On Feb. 18, apparently fake posts began to appear on Telegram, spread by Russian government-affiliated accounts.

Experts close to the government spread pretext

Experts close to the Russian government posted disinformation alleging that Ukrainian militants had tried to bomb a chlorine tank and that Ukraine's army was intensifying attacks on Russian-occupied territories. Views of these posts totaled 610,000 by Feb. 22, two days before Russia invaded.

Accounts alleged to have been created to disinform accelerate the spread before and immediately after invasion

By Feb. 26, total views hit 2.7 million. Daily views, meanwhile, peaked at 1.5 million – about three times the pre-invasion level. The main spreaders were accounts thought to have been created to disinform.

Two weeks after Russia invaded Ukraine, a second and more intense wave of disinformation picked up

These posts began to spread around March 8, with Russia's military bogged down. On March 10, daily views hit 2.7 million, topping the immediate post-invasion peak by 80%. The spread slowed somewhat in the ensuing days.
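The 80% figure follows from the two daily peaks reported above:

\[
\frac{2.7\ \text{million views per day}}{1.5\ \text{million views per day}} = 1.8
\]

That is, the second wave's peak was about 80% higher than the first.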

We can see two disinformation waves: one before and after Russia's invasion; the other two weeks later, when the war became a stalemate. During the second wave, posts claiming that Ukraine's government sent out false information stood out.

Quantity over quality: Total views eclipse 10 million

By March 14, total views of disinformation surpassed 10 million. By comparison, one of the most viewed videos posted by the Russian government's disinformation campaign for the 2016 U.S. presidential election had 9 million views. The per-post efficacy of the Ukraine-related disinformation is low, but the messages may have had a certain impact through being spread collectively.

Disinformation overwhelms posts critical of Russian government by volume

Though some posts by accounts critical of the Russian government are spreading, the volume of pro-government posts is overwhelming.

Spreading inward, with most accounts from Russia

Fake posts spread inward, with 97% of the accounts based in Russia. By spreading false information domestically, Russia is cutting its citizens off from the rest of the world.

Outside Russia, the spread to pro-Russian countries such as China and Armenia can also be confirmed.

Posts spread to pro-Russian countries including China and Greece

Percentage of confirmed disinformation spread from Russia to local news sites, by region

Nikkei searched major platforms (Google, Microsoft and Yandex) for images used in fake Russian news stories and extracted the images from top-ranked results. These images had been reprinted without verification on news sites in regional languages (excluding English, Russian and Ukrainian).
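One way to automate the reprint check described above is perceptual hashing, which matches images even after resizing or recompression. Below is a minimal sketch using the Pillow and imagehash libraries; the file names are hypothetical, and this is not necessarily Nikkei's actual pipeline.

  from PIL import Image
  import imagehash

  # Perceptual hashes stay stable under resizing and recompression, so a
  # small Hamming distance suggests a local site reprinted the same image.
  original = imagehash.phash(Image.open("fake_story_image.jpg"))      # hypothetical file
  reprint = imagehash.phash(Image.open("regional_site_image.jpg"))    # hypothetical file

  distance = original - reprint  # imagehash overloads '-' as Hamming distance
  print("likely the same image" if distance <= 8 else "probably a different image")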

Disinformation originating from the Russian government is quickly debunked in Western media and has limited influence there.

But in countries politically close to Russia – such as China, Greece and Armenia – local news sites often reprint false information. China, in particular, was the largest destination of dissemination outside Russia, accounting for about 30% of confirmed disinformation.

Topic 2

Fake videos posted on social media

Russia released a video showing a Ukrainian military drone shelling a TV film crew, but it is presumed to be fake news. Fake videos, such as those showing Ukrainian President Zelenskyy calling for surrender, were also posted. Meanwhile, false information is also coming from the Ukrainian side.


Russian Defense Ministry-affiliated media outlet fabricates drone strike

Zvezda, a media outlet affiliated with the Russian Defense Ministry, on Feb. 23 broadcast a video claiming that a Ukrainian military drone shelled a film crew in the pro-Russian Donetsk region in eastern Ukraine. The video is presumed to be fake based on analysis of the bomb's speed and the audio.

Bomb cannot be seen flying

The video does not show the bomb, which would be visible at normal speeds. For the bomb to be invisible in the footage, it would have had to be traveling at about 2,700 kph (the speed of sound is about 1,225 kph). Heigo Sato, a professor at Takushoku University in Japan, said, “If the speed of the object is twice the speed of sound, it should leave an exhaust trail. The explosion is also weak.” It is possible that the explosion was caused not by drone fire but by buried explosives.
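Sato's "twice the speed of sound" follows directly from the reported figures:

\[
\frac{2{,}700\ \text{kph}}{1{,}225\ \text{kph}} \approx 2.2
\]

That is, the bomb would have had to travel at roughly Mach 2.2 to be invisible in the footage.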

Sound of bomb flying cannot be heard

Pro-Russian militia media on Feb. 23 showed a video taken at the same location and at the same time as Zvezda's. The sound of something flying, which was not heard in the Zvezda video, was heard just before the explosion.

Added sound was found by frequency analysis
The flying sound just before the explosion has frequency characteristics different from the rest of the audio, suggesting that the sound may have been artificially added. (Photo courtesy of Japan Acoustic Laboratory)

Japan Acoustic Laboratory analyzed the flying sound and said it may have been edited in. The sound is evident mainly in the frequency band shown in the center of the image, yet it was recorded in the band shown at the bottom, and its characteristics differ from those of the other sounds. It is presumed that the sound was added to conceal the fact that the bombing was fake.
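This kind of analysis can be reproduced with a spectrogram, which plots how the audio's energy is distributed across frequencies over time; an inserted sound tends to appear as a band whose characteristics differ from the surrounding audio. Below is a minimal sketch, assuming the video's audio track has been extracted to a WAV file (the file name is hypothetical).

  import numpy as np
  import matplotlib.pyplot as plt
  from scipy.io import wavfile
  from scipy.signal import spectrogram

  rate, samples = wavfile.read("zvezda_clip.wav")  # hypothetical extract
  if samples.ndim > 1:                             # down-mix stereo to mono
      samples = samples.mean(axis=1)

  # Time on the x-axis, frequency on the y-axis, power as color.
  freqs, times, power = spectrogram(samples.astype(float), fs=rate, nperseg=1024)

  plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="gouraud")
  plt.xlabel("Time (s)")
  plt.ylabel("Frequency (Hz)")
  plt.title("An added sound shows up as an anomalous frequency band")
  plt.show()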


Video purportedly shows President Zelenskyy calling for surrender, but the length of his neck and overall body structure reveal the person is not him

In mid-March, a fake video of Ukrainian President Zelenskyy calling for surrender circulated. It was apparently a deepfake – a manipulated video produced by sophisticated machine learning that yields seemingly realistic images and sounds. Zelenskyy's face could have been morphed onto another person's, as the length of the neck and the skeletal frame do not match Zelenskyy's.

Kotaro Nakayama, who has a doctorate in Information Science and Technology and is representative director of AI-focused NABLAS in Tokyo, noted, “If we use correction technology to increase resolution, it would be difficult for humans to [discern fakes].”


“Ukrainian soldiers entered Russian territory,” reports Russia's state-owned news agency, but the vehicle shown is not a Ukrainian military vehicle

Russia's Tass news agency on Feb. 21 reported that Russian troops destroyed a Ukrainian military vehicle that had entered Russia, killing five Ukrainian soldiers. Subsequent video footage allegedly from cameras attached to Ukrainian soldiers spread on social media.

The footage shows the soldiers and the vehicle moving. The military website Oryx believes the vehicle in the video is a BTR-70M armored personnel carrier. The Ukrainian military does not use the BTR-70M, and Oryx noted the possibility of a false flag operation.


“Ukrainian saboteurs attempt to blow up chlorine tanks,” pro-Russian militants say, but video and audio are old

Pro-Russian militants on Feb. 18 released a video of what they claimed was a shootout with Ukrainian saboteurs attempting to blow up chlorine tanks in pro-Russia-controlled areas. The militants claimed it was evidence of a crime committed by the Ukrainian side. However, Nikkei analyzed the video's metadata and found that it was created 10 days earlier, on Feb. 8. The British investigative news organization Bellingcat stated that the video "may be a fake," as it appears to have used the "explosion sound" from a video posted on YouTube in 2010.
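A video file's creation timestamp can be read from its container metadata, which is one way to perform the kind of check described above. Below is a minimal sketch assuming ffprobe (part of FFmpeg) is installed; the file name is hypothetical. Note that such timestamps can themselves be edited, so they are supporting evidence rather than proof.

  import json
  import subprocess

  # Dump the container-level metadata as JSON and look for a creation_time tag.
  result = subprocess.run(
      ["ffprobe", "-v", "quiet", "-print_format", "json",
       "-show_format", "chlorine_tank_clip.mp4"],  # hypothetical file
      capture_output=True, text=True, check=True,
  )
  metadata = json.loads(result.stdout)
  print(metadata["format"].get("tags", {}).get("creation_time"))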


Photo captioned as “Russian military jet captured” was seen on Twitter, but it was a fake photo posted by the Ukrainian side

This photo spread on social media as “Ukrainian farmer captured a Russian military jet.”

There have also been fake posts from the Ukrainian side. A Twitter account belonging to a Ukrainian media outlet posted a photo on March 11 captioned, “Ukrainian farmer captured a Russian military jet.”

When Nikkei checked the source of the photo, it turned out to be from a military ceremony, posted by the Croatian Defense Ministry on its website on May 26, 2011. Fighter jets carry markings identifying their nationality, and the marking on this jet's tail indicates Croatia's military, not Russia's. Furthermore, the domain of the URL written on the sign of the building in the background is “.hr,” indicating Croatia.

Topic 3

Disinformation is Russia's forte

"The Nazi regime in Ukraine is planning to commit genocide against the Russian population." This is disinformation that Russia is spreading to justify its invasion. Disinformation has been Russia's forte in its history of territorial expansion since the days of the former Soviet Union. In World War II, Russia invaded Poland on the pretext of protecting Ukrainian and Belarusian civilians. In 2014, it intervened in Crimea on the pretext that the pro-Ukrainian government was oppressing the region.

Russian disinformation dates back to the Soviet era.

World War II

  • Establishes department specializing in disinformation (dezinformatsiya, in Russian) at KGB predecessor

    (This is the origin of the English word “disinformation,” which came into use in the 1950s)

  • Red Army employs extensive deception in combat, including disinformation and false flag operations

Cold War

  • In the 1980s, spreads disinformation stating that HIV is a biological weapon developed by the U.S.

2014

Annexation of Crimea

  • Spreads disinformation that Russian citizens in Crimea are being oppressed to justify annexation

  • Sends troops secretly to annex Crimea after initially denying action

2016

U.S. presidential election

  • Internet Research Agency, a Russian government-affiliated organization, uses many accounts to post disinformation in attempt to heighten divisions between racial and other social groups

  • Spreads disinformation to attack Donald Trump opponent Hillary Clinton

2020

U.S. presidential election

  • Spreads disinformation to attack President Donald Trump opponent Joe Biden

2022

Ukraine invasion

  • Suddenly claims genocide is occurring in eastern Ukraine

  • Makes unfounded claims that Ukraine is secretly developing nuclear and biological weapons

  • Spreads false information via officials close to Putin that Ukrainian President Zelenskyy has fled the country

Disinformation has often been used to divide adversaries. During the Cold War, the Soviet Union disseminated information stating that HIV was a biological weapon developed by the United States. During the 2016 U.S. presidential election, the Internet Research Agency, a disinformation organization, spread posts that heightened divisions between racial and other social groups.

Topic 4

Expert view and analysis

Lecturer in professional communication at RMIT University, Melbourne, Australia

Jay Daniel Thompson

Jay Daniel Thompson researches ways to cultivate ethical online communication in an era of digital disinformation. He co-authored a recent article on Russia's disinformation strategy with Timothy Graham, a senior lecturer at Queensland University of Technology.

A massive challenge to democracy

When Timothy Graham and I analyzed the disinformation spread on Twitter about the Ukraine issue, we were surprised by the sheer volume of the dissemination and the speed with which it spread. In particular, some official Russian government accounts seem to be the main source of pro-Russia disinformation. This disinformation seeks to instill doubt about what is true and what is false regarding Russia's invasion of Ukraine, thereby debilitating democracy.

Pro-Russia disinformation can take the form of old images and videos presented out of their original context, and of "astroturfing": fabricated, organized activity that poses as a grassroots effort to influence public opinion, with many social media accounts sharing and resharing false information about Russia and Ukraine. Automated online "bots" are also used to spread disinformation. However, there are people in Russia who oppose the government, so it is difficult to say whether Russia's disinformation strategy has been successful so far.

For users of social media, digital literacy is more important than ever. People always need to confirm certain facts: Who is posting this information? Why might they be posting it? Can the information be factually verified? Has it already been debunked? It is appropriate to react calmly to suspicious, factually dubious information, by, for example, reporting it to the company operating the service rather than sharing it impulsively.

Russian disinformation operations are a massive challenge to democracy, and to public health and safety. Everyone has a role to play in stemming the spread of this disinformation.

Interviewed by Toru Tsunashima