On October 24, The Economist published an article profiling Lithuania’s use of software known as Demaskuok (“debunk” in Lithuanian) to combat disinformation emanating from Russian disinformation factories. As The Economist noted, Russian-sponsored disinformation “is a bane everywhere, but it is particularly rife in Estonia, Latvia and Lithuania—the three countries that, in 1990, were the first to declare independence from the Soviet Union” and later “join[ed] NATO and the European Union.” Those offenses, in the eyes of certain Russians still nostalgic for the halcyon days of Soviet rule, warrant making the Baltic states “particular targets for falsehoods intended to confuse and destabilise.”
Demaskuok has become the tip of the spear for Lithuanians to combat these relentless Russian disinformation campaigns. According to The Economist, it is software that searches for the true points of origin of particular disinformation. Developed by Lithuanian news portal Delfi in conjunction with Google, it
works by sifting through reams of online verbiage in Lithuanian, Russian and English, scoring items for the likelihood that they are disinformation. Then, by tracking back through the online history of reports that look suspicious, it attempts to pin down a disinformation campaign’s point of origin—its patient zero.
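The Economist does not describe how this back-tracking works internally, but the idea of tracing a suspicious report to its “patient zero” can be sketched as walking a reshare chain backward to its earliest known ancestor. The `Item` model, field names, and URLs below are hypothetical illustrations, not Demaskuok’s actual data structures:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Item:
    """One published item in a reshare history (hypothetical model)."""
    url: str
    timestamp: float          # Unix time of publication
    source: Optional[str]     # URL of the item this one reshared, if known

def find_patient_zero(item: Item, index: dict[str, Item]) -> Item:
    """Walk back through the reshare chain to the earliest known origin."""
    current = item
    while current.source is not None and current.source in index:
        parent = index[current.source]
        if parent.timestamp > current.timestamp:
            break  # inconsistent timestamps; stop rather than walk forward in time
        current = parent
    return current

# Toy chain: c reshared b, which reshared a
a = Item("http://example.org/a", 100.0, None)
b = Item("http://example.org/b", 200.0, a.url)
c = Item("http://example.org/c", 300.0, b.url)
index = {i.url: i for i in (a, b, c)}
print(find_patient_zero(c, index).url)  # -> http://example.org/a
```

In practice the links between items would have to be inferred from timing, shared text, and embedded images rather than read from an explicit `source` field, which is what makes the real tracing problem hard.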
Demaskuok searches for a variety of clues characteristic of disinformation, such as:
- “[W]ording redolent of themes propagandists commonly exploit,” including “poverty, rape, environmental degradation, military shortcomings, war games, societal rifts, viruses and other health scares, political blunders, poor governance, and, ironically, the uncovering of deceit”;
- “[A] text’s ability to stir the emotions,” including topics like immigrants, sex, ethnicities, injustice, gossip, and scandal, because effective disinformation reliably provokes such responses;
- “Virality,” “the number of times readers share or write about an item,” because “disinformation is crafted to be shared”;
- The reputations “of websites that host an item or provide a link to it”;
- “[T]he timing of a story’s appearance”; and
- The names of people quoted in disinformation, “as they sometimes crop up again, and images, which may be posted in other locations.”
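The clues above amount to a set of weighted signals combined into a suspicion score. A minimal sketch of that idea follows; the keyword lists, weights, and site names are invented for illustration and bear no relation to Demaskuok’s actual model:

```python
# Illustrative signal lists and weights only -- not Demaskuok's real model.
THEME_WORDS = {"poverty", "rape", "virus", "war", "scandal"}
EMOTION_WORDS = {"outrage", "shocking", "betrayal", "injustice"}
LOW_REPUTATION_SITES = {"dubious-news.example"}

def disinfo_score(text: str, shares: int, host: str) -> float:
    """Combine simple signals into a 0..1 suspicion score (toy heuristic)."""
    words = set(text.lower().split())
    score = 0.0
    score += 0.3 * bool(words & THEME_WORDS)        # themes propagandists exploit
    score += 0.2 * bool(words & EMOTION_WORDS)      # emotionally charged language
    score += 0.3 * min(shares / 1000, 1.0)          # virality, capped at 1000 shares
    score += 0.2 * (host in LOW_REPUTATION_SITES)   # reputation of the hosting site
    return round(score, 3)

print(disinfo_score("shocking virus scandal", 500, "dubious-news.example"))  # -> 0.85
```

A production system would replace the hand-set weights with a trained classifier, which is where the human feedback described below becomes valuable training signal.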
The software, however, does not do the job all on its own; human scrutiny “is an important part of the process.” Demaskuok users, who include Delfi journalists, the Lithuanian Foreign Ministry, “and a score of news outlets, think-tanks, universities and other organisations,” review items that Demaskuok flags and provide feedback on the accuracy of those flags to improve the software’s performance. In addition, more than 4,000 volunteers known as “elves” (about 50 at any one time)
scroll through Demaskuok’s feed of suspected disinformation, selecting items to be verified. These are sent to the other elves for fact checking. Reports on the findings are then written up by the software’s users and emailed to newsrooms and other organisations, including Lithuania’s defence ministry, that produce written or video “debunks” for the public.
N.B.: Although disinformation and “deepfake” technology have garnered the most publicity for their geopolitical ramifications, companies must also be attentive to what one expert commentator termed “the threat deep fakes and disinformation more generally pos[e] to corporations, brands and markets.” Companies with international operations and visibility should therefore look more closely at the successes and techniques of Demaskuok – including its marriage of technology and human judgment – in evaluating their reputation risks and their capacity for timely prevention or response to disinformation campaigns directed at them.