Introduction
Disinformation is a concept that has existed in many forms throughout history. It has become a hot topic in recent decades due to the rise of online disinformation campaigns, especially those perpetrated by Russia. Disinformation is not new to Russia: Lenin regarded it as a key component of establishing the Soviet Union. Agitation and propaganda, or agitprop, combines the spreading of propaganda with the mobilization of the masses, a form of political warfare that does not require military force (Leighton, 12). Agitation and propaganda combined with forgery, terrorism, deception, sabotage, and espionage are referred to as “active measures,” a strategy developed since the inception of the Soviet Union (Abrams, 8). Disinformation is a subset of these measures: information based on a kernel of truth that is then embellished or manipulated to suit a specific purpose (Leighton, 4). It can be a concoction of facts, arguments, rumours, half-truths, or falsehoods, disseminated either to a small group through human, mechanical, or electronic channels, or to a wider and more undifferentiated audience via mass communication (Leighton, 4).
Disinformation campaigns are low-cost, low-risk operations that act preemptively, forestalling opposition and attempting to change perspectives early. They are easily deployable, far more accessible than military power, and significantly more subtle. They serve as quiet conditioning devices, intended to disorient frames of reference and disarm the masses. The strategy is to slowly indoctrinate an audience through repeated messages, heightening a subject’s predisposition to receive and absorb subsequent messages (Leighton, 15). Today, these campaigns are deployed primarily on social media in an attempt to reach as many people as possible. Platforms such as Facebook, Twitter, Instagram, WhatsApp, Telegram, TikTok, and YouTube are all used in the process. Several methods can be employed: false videos shared across platforms, user profiles disseminating messages through articles or posts, ads purchased for target audiences, and so on. Each person who encounters disinformation becomes a potential agent, reframing and redistributing messages for the next group of interpreters (Hoch, 5).
Social media disinformation campaigns by Russian actors have been extensive and serve a variety of purposes. Election interference campaigns have been run in the United States, the Netherlands, the United Kingdom, France, Germany, and Sweden (Brattberg and Maurer, 28). Campaigns have also targeted European Parliament elections, referendums such as Brexit, protests such as those in Catalonia, and public discourse around the coronavirus pandemic (Legucka). These tactics are used especially heavily in former Soviet states. Disinformation campaigns have also been designed to stoke infighting, targeting races, classes, political interest groups, and activists indiscriminately.
One area of interest is the Karabakh region of Azerbaijan. Karabakh is a region in the South Caucasus that is internationally recognized as part of Azerbaijan but was under the de facto control of separatist Armenian groups from the First Karabakh War of the early 1990s until the Second Karabakh War (2020). The conflict in the region was fueled by Armenia’s irredentist claims to the internationally recognized territories of Azerbaijan. The 2020 war in particular had a strong disinformation dimension in both Russia and Armenia. Re-edited videos and old footage incorrectly labeled as current events were spread on social media (Giles and Bhat, 2020). Additionally, accounts claiming to be from numerous countries with ties to Armenia, Russia, and Azerbaijan were created to game platform algorithms and reach international audiences with large platforms (Thomas and Zhang, 1). Finally, Russia has aired anti-Azerbaijani media domestically in order to manipulate the narrative and position itself above both Armenia and Azerbaijan (Muradov, 2022).
This report investigates the use of disinformation surrounding the Armenia-Azerbaijan conflict over the Karabakh region, with special attention to Russia and to pro-Armenia narratives. It does not seek to make claims about the region itself but rather to examine disinformation as a tactic in the conflict, looking at various examples of disinformation perpetrated around it and at the potential roles of numerous countries in these campaigns.
Russia’s Interest
Current scholarship indicates that there are several aspects to Russia’s interest in the region. First, Russia has a long-standing history with both Armenia and Azerbaijan, as both are former Soviet states and thus considered within Russia’s intended sphere of influence. Second, Russia fears these states being turned into client systems of the West (Kuzio, 2021). There is much disinformation about the Second Karabakh War: Russia claims Azerbaijan received instructions from the United States to continue the conflict as an attempt to expel Russia from the South Caucasus (EU vs Disinfo, issue 221), and that British intelligence services are trying to push Türkiye’s involvement (EU vs Disinfo, issue 216). Third, Russia is concerned with losing its influence and power over the Eastern Partnership states, which would weaken its regional position (EU vs Disinfo, issue 229). The Eastern Partnership shifted Russia’s focus toward NATO and the EU, both of which it now views as geopolitical rivals. Conversely, Russia still maintains ties with Armenia, which considers Russia its main regional partner and a source of critical security support (Barseghyan et al, 21). Despite this, while security guarantees were made for Armenians, they curiously did not extend to Armenians in Karabakh (Broers, 2020).
A common theme in many campaigns is the narrative that Azerbaijan had enlisted foreign mercenaries, including Chechens, ISIS fighters, Ukrainians, and Turks, who contributed significantly to Azerbaijan’s military victory. On September 30th, a few days after the start of the war, a BBC Russia story claimed that several hundred mercenaries had allegedly been transported to Karabakh before the outbreak of the war from Syrian territory controlled by Türkiye (BBC Russia, 2020). The publication claimed it received this information via a message from one of the militants, but immediately noted that it could not confirm the validity of the claims. Kyiv allegedly trained “militants and nationalists” for the Second Karabakh War and supplied Ukrainian arms to the Azerbaijanis. Some go as far as to suggest that both the United States and Ukraine supplied chemical weapons to Azerbaijan, which President Zelenskyy has dismissed as “fake news” (Kuzio, 2021).
Russian weapons also became the subject of disinformation. Armenia’s arsenal includes the Russian-produced Iskander mobile short-range ballistic missile system, considered one of its most powerful weapons. The public questioned why the Iskander was not used, and in response several narratives began to circulate. Some suggested that the lack of Iskander use was evidence of a deliberate defeat; others claimed that Armenian forces had used the missiles but intentionally compromised missions in Shusha and Hadrut. The debate was further sparked by Prime Minister Pashinyan claiming that the Iskander missiles had been used but were faulty, which Russian officials refuted. The Prime Minister’s office then claimed it had received false information from the General Staff, but the public had begun to question the Armenian government’s agency during the war (Barseghyan et al, 2021).
Russia’s anti-Azerbaijani campaign ran both during and after the war. Russian media networks aired anti-Azerbaijani stories (Mirovalev, 2020), primarily connected to Russian paramilitary structures such as the Wagner Group, which projected their influence through various media outlets and related Telegram channels, drawing on critical views of Baku that have existed in Russia since the 1990s. The war was covered by military correspondents known for strong views. Russian reports emerged describing “unseen brutalities” conducted by Azerbaijani forces in Karabakh, and claimed that thousands of foreign fighters, including Syrian and Afghan jihadists, had perpetrated mass beheadings and desecrated Christian monuments. The narrative depicted Azerbaijan as nothing more than a Turkish proxy focused on Islamic expansion in Russia’s sphere of influence and on destroying relations with Russia’s ally Armenia (Muradov, 2022). Another narrative concerned Azerbaijani-Jewish oligarchs in Russia. An informal meeting between Russian foreign intelligence chief Sergey Naryshkin and the prominent oligarch God Nisanov triggered claims in sections of the Russian media that an “omnipresent Azerbaijani mafia in Russia” was manipulating Moscow into supporting Azerbaijani interests. Russian political talk shows covered the war with subtle anti-Azerbaijani statements, staying away from anything too extreme but still perpetuating a noticeable anti-Azerbaijani stance (Muradov, 2022).
Russian news media has also fed international disinformation about the conflict. A video filmed in Azerbaijan by Russian journalists during the 2016 escalation claimed that Terter residents were so afraid of shooting that they lived in cars outside the city; the background of the video, however, showed ordinary parked cars, indicating the story was false. An individual interviewed at the scene, who said he was returning from a funeral, was given a voice-over translation that had nothing to do with his original statements. International media, especially Armenian media sources, circulated and replicated the video. Life News, the channel responsible for the broadcast, was expelled from Azerbaijan in disgrace, but the damage had already been done (Rzayev et al, 2021).
Russia’s involvement in the conflict has developed a new facet since Russia’s invasion of Ukraine. Azerbaijan adopted Turkish tactics and military equipment, which aided its military victory. Russian campaigns about Ukraine have focused on Türkiye’s involvement, as Türkiye is a vital corridor for key resources such as energy and food supplies. Türkiye is also a vital member of the NATO security bloc but tends to operate independently of (and sometimes counter to) the positions other NATO members take (Chausovsky, 2022). Türkiye is still attempting to balance ties between Russia and Ukraine, but its supplying of Turkish weapons to both Ukraine and Azerbaijan can be seen as a slight. Russia sees Ukraine, Armenia, and Azerbaijan all as artificial states, and any assistance asserting their independence is a threat to the Russian sphere of influence. Russian disinformation campaigns surrounding this relationship warn that Kyiv could be “preparing a Karabakh scenario” for the Donbas (Kuzio, 2021).
Disinformation about the relations between Armenia, Azerbaijan, and Ukraine has also become much more common. The official account of Ukraine’s parliament tweeted that the Azerbaijani armed forces “went on the offensive in Karabakh, taking advantage of the redeployment of Russian soldiers [from Karabakh] to Ukraine” (Civilnet, 2022). The source of the since-deleted tweet was a Ukrainian military Telegram channel that has also been deleted. Insiders claimed that, in order to replenish its losses, Russia was redeploying part of its peacekeeping mission in Karabakh to Ukraine, citing an Azerbaijani video doctored to supposedly show Russian troops and equipment leaving Karabakh through the Lachin corridor. Karabakh’s Security Council denied that Russian peacekeepers were leaving, as this would escalate tensions and leave populations defenseless. Russia’s ambassador to Azerbaijan also refuted the claim (Civilnet, 2022).
Social Media Campaigns
Disinformation campaigns run online, especially through social media, are a new facet of active-measures warfare that has emerged with the rapid global connection of the internet. Most of these campaigns use similar tactics developed by the Internet Research Agency (IRA), a non-state cyber proxy and troll farm based in Saint Petersburg that has been in operation since mid-2013 (Sherman, 2021). The IRA is best known for the social media campaigns run during the 2016 American presidential election, which were extensive, diverse, and pervaded all major social media platforms (Dawson and Innes, 245). The IRA comprises departments that focus on different regions and demographics as well as on particular social media sites such as Twitter, Facebook, Instagram, TikTok, and YouTube. Individuals are responsible for running multiple fake accounts with different requirements and quotas that must be met. These quotas can include making a certain number of original posts, posting a certain number of articles to fake pages, making incendiary comments on articles, promoting cross-platform disinformation, and so on (Linvill and Warren, 9). Few operators are ideologically committed; most are students and young people looking for jobs and job experience. The resulting high turnover rate encourages “easing” behaviours that help operators meet demanding performance requirements (Dawson and Innes, 246). These behaviours can include content copied verbatim between accounts, sources used repeatedly and consistently between groups of people or between sites, and generic accounts that were only recently created or only recently started posting.
Russian social media distributed narratives about Azerbaijan seemingly having an “ethnic mafia” that allegedly controlled Saint Petersburg and had ties to Britain and Türkiye to further their leftist agendas (Muradov, 2022). Evidence of this group’s existence was largely restricted to archived social media chats and controversial VKontakte groups, and these records could have been forged. There was no broader evidence of an organized network, and the claims ceased after a short online campaign. These campaigns aimed to exploit nationalism mixed with resentment toward the peoples of the Caucasus and anti-Western “besieged fortress” thinking (Muradov, 2022).
Pro-Armenian Campaigns
During the first few days of the war, an influx of Twitter accounts discussing the conflict was created. Between September 27th and September 29th, one study collected 206,116 tweets containing hashtags associated with the conflict. Within that window, 70,350 unique accounts were identified in the data set, 7,764 of which were created during those two days; these recently created accounts produced 14.5% of all tweets (Thomas and Zhang, 3). Most of these accounts amplified posts by Armenian public figures and the Armenian government, as well as expressing vocal support for Armenia in the war. Armenian government accounts including @armgov, @ArmenianUnified, and @MFAofArmenia were mentioned a combined total of 3,131 times. Tweeting was most active during the evening in the UTC+4 time zone and least active around 4 am (UTC+4). While this could be automated, the pattern more likely indicates that these were not automated accounts, since it follows a typical user schedule; automated accounts tend to post at all hours, as round-the-clock posting requires no sophisticated methods (Linvill and Warren, 11).
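As a rough illustration (not part of the original study), the proportions above can be checked with simple arithmetic using the figures Thomas and Zhang report:

```python
# Figures reported by Thomas and Zhang for Sept 27-29, 2020.
total_tweets = 206_116      # tweets containing conflict-related hashtags
unique_accounts = 70_350    # distinct accounts in the data set
new_accounts = 7_764        # accounts created within the two-day window

share_of_accounts = new_accounts / unique_accounts
print(f"New accounts: {share_of_accounts:.1%} of all accounts")  # ~11.0%

# The 14.5% figure refers to their share of *tweets*, not accounts,
# meaning brand-new accounts tweeted disproportionately often.
tweets_from_new = round(0.145 * total_tweets)
print(f"Roughly {tweets_from_new:,} tweets came from new accounts")
```

The gap between the two shares (roughly 11% of accounts producing 14.5% of tweets) is what makes the newly created accounts stand out as unusually active.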
Much of the pro-Armenian content was a persistent and seemingly coordinated effort to translate identical news snippets or Armenian government press statements into multiple languages, including English, Spanish, Ukrainian, and German. It is possible that this was only partially coordinated; however, many of the accounts exhibiting this behaviour appeared suspicious on a number of factors, including account age and posting activity. Many accounts were recently created or, if older, had little to no prior activity. For example, some accounts had join dates listed as 2015 but only started posting content in late September 2020. Older Twitter accounts such as these are widely available for purchase and can be used in information wars to make accounts appear more credible (Thomas and Zhang, 5). Also suspicious are generic accounts that use stock photos of women to appear sympathetic and lend more credibility to the account (Hoch, 6).
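The account-level red flags described above (recent creation, or an old join date with no activity until the war) lend themselves to a simple heuristic. The sketch below is illustrative only; the function, field names, and thresholds are assumptions for exposition, not part of Thomas and Zhang’s methodology:

```python
from datetime import date, timedelta

def looks_suspicious(created: date, first_post: date, observed: date) -> bool:
    """Flag an account matching either pattern described above:
    created during the crisis itself, or an old (possibly purchased)
    account that sat dormant until the conflict began."""
    brand_new = (observed - created) <= timedelta(days=7)       # made for the crisis
    long_dormant = (first_post - created) >= timedelta(days=365)  # dormant "aged" account
    return brand_new or long_dormant

war_start = date(2020, 9, 27)
# An account "joined 2015" whose first tweet appeared in late September 2020:
print(looks_suspicious(date(2015, 3, 1), date(2020, 9, 28), war_start))  # True
```

Real bot-detection work combines many more signals (follower graphs, posting cadence, content similarity), but even these two checks capture the patterns the study flagged.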
Many of these accounts were considered “booster accounts” (Thomas and Zhang, 7): accounts primarily focused on amplifying authentic pro-Armenian content and ‘boosting’ the message up in Twitter’s algorithm. These tweets were typically from high-profile public figures in support of Armenia or contained pro-Armenian hashtags. Questions of authenticity arise from hashtags that were consistently misspelled, likely a result of either automation or human copy-pasting of content, both common tactics in social media campaigns. The intent was both to share pro-Armenian content more widely and to game Twitter’s algorithm by falsely creating engagement and appearing on the trending page, exposing more people to Armenian narratives (Thomas and Zhang, 7). This increased exposure can carry the narrative across international borders and, indeed, out of the context in which it originated.
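The misspelled-hashtag pattern can be surfaced automatically by comparing observed hashtags against a canonical form. This is a hypothetical sketch using Python’s standard library; the hashtags shown are invented examples for illustration, not tags documented in the report:

```python
from difflib import SequenceMatcher

def near_misspellings(hashtags, canonical, threshold=0.9):
    """Return hashtags that are close (but not exact) matches to a
    canonical tag -- a pattern consistent with sloppy copy-pasting
    or automated posting."""
    canon = canonical.lower()
    return [h for h in hashtags
            if h.lower() != canon
            and SequenceMatcher(None, h.lower(), canon).ratio() >= threshold]

# Invented example tags; the two misspellings are flagged, the
# unrelated tag is not.
tags = ["#ArtsakhStrong", "#ArtsakhStrog", "#ArtsakStrong", "#PeaceForArmenia"]
print(near_misspellings(tags, "#ArtsakhStrong"))
```

Clusters of many near-identical misspellings posted at volume are one of the cheapest signals that content was mass-produced rather than typed by individual users.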
Disinformation in an International Context
Many disinformation campaigns about the Armenia-Azerbaijan conflict that spread on social media ended up being reported in international media and, in some cases, started from international media. Actors well outside the geographical scope of the conflict became involved on Twitter, including diaspora groups as well as people allegedly from Türkiye, Pakistan, India, America, and elsewhere. The on-the-ground support of Türkiye and Pakistan for Azerbaijan was echoed in English-language skirmishes online, indicating an intent to target Western and other international audiences. Indian accounts also got involved because of Pakistan’s support, pushing back with hashtags such as #IndiaStandsWithArmenia. Other activity included engaging with the Twitter accounts of celebrities or media outlets in an attempt to enlist support for Armenia. Some portion of this “shadow battle” (Thomas and Zhang, 1) is undoubtedly authentic, as people naturally have strong opinions on this conflict, especially members of the Armenian and Azerbaijani diasporas. Even so, many of the accounts showed multiple clear signs of inauthenticity, similar to the earlier examples.
Accounts disproportionately engaged with high-profile American figures on Twitter in an attempt to gain support from those sympathetic to Armenia. For example, Kim Kardashian, who has family ties to Armenia, became the target of multiple tweets per minute after the start of the war. She did later comment publicly in support of Armenia (Kardashian, 2020), but it is unknown whether the bombardment of potentially inauthentic support, alongside authentic comments, played into her decision. Another example was Lady Gaga, whose ‘911’ music video included Armenian cultural references; she was also subject to a deluge of tweets regarding the conflict, although she never explicitly addressed it (EU vs Disinfo, issue 215).
American politicians were also targeted. During the first US presidential debate on September 29th (US time), multiple accounts tweeted identical text claiming that “Armenian American taxpayers” wanted to hear about the US’ plans to “stop Azerbaijan’s aggression,” alongside the #debate2020 hashtag. There was also an influx of activity aimed at influencing the US’ position on the conflict, in the form of widespread sharing of a White House petition calling on the US government to “condemn the aggression of Turkey and Azerbaijan against Artsakh, Armenia” (Thomas and Zhang, 10). The petition was created on September 27th and had received more than 150,000 signatures as of September 30th. The website defendarmenia.com, which had previously redirected to a separate White House petition calling for an end to USAID and military support for Azerbaijan, was changed to redirect to this new petition. The domain was shared widely on Twitter, but Twitter’s content moderators have since deleted almost all tweets sharing it, going as far back as September 6th, suggesting that Twitter may suspect the behaviour was either inauthentic or originated outside the US and thus violated foreign interference policies.
English-speaking media accounts also experienced a deluge of engagement. Pro-Armenian accounts created during the war replied en masse to English-language media articles about the conflict. The BBCWorld Twitter account received heightened engagement on tweets regarding the conflict, especially from potentially inauthentic accounts; Al Jazeera English, CNN, BBC News UK, and Time magazine were also targeted. The behaviour of these accounts appears more like operators tasked with amplifying pro-Armenian views than automated bots (Thomas and Zhang, 11).
Other European countries reported disinformation as well. In France, the broadcaster France24’s “The Observers” program published several user-generated videos allegedly showing Syrian militants preparing to leave for Azerbaijan, the only evidence being that the soldiers spoke Arabic and discussed the cities of Aleppo and Idlib (Hamad, 2020). In the same report, a Syrian sheikh allegedly made an appeal to start a “war against the infidels” and mentioned Azerbaijan. The video was then actively disseminated by Armenian Telegram channels as “evidence of the recruitment of Syrians in Afrin for a trip to Azerbaijan.” However, the Rybar Telegram channel, a Russian analytical group specializing in the Middle East and Africa, pointed out that it is not clear who the sheikh was (a Sharia judge, a town crier, or a local elder, for example), or whether the video was really filmed in Afrin. The sheikh also never mentions the war in Azerbaijan directly, instead saying “this battle is also ours, like the one in Syria”; taken out of context, it is unclear what this statement referred to (EU Reporter, 2020).
Disinformation also came from Western sources. In another report, a video shows a call for mercenaries to fight for Azerbaijan; however, not a single Azerbaijani soldier appears in the video, nor is any Azerbaijani military equipment visible, despite the footage allegedly having been filmed on the ground in Karabakh. Incidentally, two American experts on Syria were the first to post these videos online. One of them, Lindsay Snell, had tweets indicating her geo-location was in Armenia, which raises questions about her objectivity and impartiality (EU Reporter, 2020).
The conflict saw a large number of allegedly Indian accounts vocally supporting Armenia. This alignment ties back to earlier Twitter clashes, specifically in 2019 when Armenian Twitter users sided with India against Pakistan on the issue of Kashmir. India has since signed a significant defense contract with Armenia, which could be seen as a move against Türkiye, which supported Azerbaijan in the conflict; since Pakistan also supported Azerbaijan, it could equally be an attempt to oppose Pakistan (Thomas and Zhang, 13). These pro-Armenian narratives feed into broader narratives opposing both Pakistan and Türkiye, and bring in Islamophobic elements.
WhatsApp and Telegram have also become significant battlegrounds in the social media landscape, as they have less content moderation and issue fewer corrections for confirmed disinformation campaigns. Encrypted channels emerged as a way to protect privacy but have become a way to rapidly spread disinformation within communities. WhatsApp’s “forwarding” functionality, coupled with large groups, enables disinformation to spread quickly without being associated with an original source (Kraus, 2022). WhatsApp in fact had to limit the function after forwarding was linked to real-world violence in India and Brazil (Byager, 2019).
Conclusion
The conflict between Armenia and Azerbaijan intersects with many strongly held convictions. Russia is keen on maintaining ties with both Armenia and Azerbaijan, but the potential for information wars to shape perspectives on this conflict is strong. Not only have disinformation campaigns been run, they have been targeted at international audiences in a deliberate effort to draw in other nations as well as global citizens who may have good intentions but bad information. If powerful and wealthy nations become involved due to the lobbying of their citizens, the conflict could escalate into something much larger.
Disinformation is increasingly becoming part of the arsenal of warfare, especially in politically charged situations. While many countries have potentially had a role in spreading disinformation, it is difficult to truly trace the origins of online campaigns. Even with unknown origins, however, information masquerading as real or as coming from people affected by conflicts can shape what people see in international media. These effects are still not fully known and have the potential to grow as technology becomes more accessible. It is important to learn from campaigns both past and present and to recognize their signs so as not to fall prey to these narratives. Distinguishing between real people and ‘bot’ accounts can be difficult, especially during emerging crises when authentic social media users may behave in unusual ways. Nevertheless, the level of inauthentic activity warrants more comprehensive research to understand the effects these campaigns may have on communities globally and on those directly involved in the conflict.
Dr. Vasif Huseynov is Head of Department at the Center of Analysis of International Relations (AIR Center).
Nafeesa Dewji is a student at the University of Toronto and was an intern at the AIR Center from June 1, 2022 to September 1, 2022.
References
Abrams, Steve. “Beyond Propaganda: Soviet Active Measures in Putin’s Russia.” Partnership for Peace Consortium of Defense Academies and Security Studies (2016): 5-31.
Barseghyan, Arshaluys, et al. “Disinformation and Misinformation in Armenia: Confronting the Power of False Narratives.” 2021. <https://freedomhouse.org/sites/default/files/2021-06/Disinformation-in-Armenia_En-v3.pdf>.
BBC Russia. “«Я не знал, что придется воевать». Би-би-си нашла наемника из Сирии в Карабахе” [“‘I didn’t know I would have to fight’: The BBC finds a Syrian mercenary in Karabakh”]. BBC 30 September 2020. <https://www.bbc.com/russian/features-54348623>.
Brattberg, Erik and Tim Maurer. “Five European Experiences With Russian Election Interference.” Russian Election Interference: Europe’s Counter to Fake News and Cyber Attacks. 2018. <https://www.jstor.org/stable/resrep21009.6#metadata_info_tab_contents>.
Broers, Laurence. “Did Russia win the Karabakh war?” Eurasianet 17 November 2020. <https://eurasianet.org/perspectives-did-russia-win-the-karabakh-war>.
Byager, Laura. “WhatsApp imposes new limits on forwarding to fight fake news.” Mashable (2019). <https://mashable.com/article/whatsapp-sharing-restrictions-users>.
Chausovsky, Eugene. “Turkey Is the Biggest Swing Player in the Russia-Ukraine War.” Foreign Policy 11 August 2022. <https://foreignpolicy.com/2022/08/11/turkey-russia-ukraine-war-swing-player>.
Civilnet. “Fake News Spreads on Alleged Armenian Involvement in the Ukraine War.” Civilnet 31 March 2022. <https://www.civilnet.am/en/news/656115/fake-news-spreads-on-alleged-armenian-involvement-in-the-ukraine-war/>.
Dawson, Andrew and Martin Innes. “How Russia’s Internet Research Agency Built its Disinformation Campaign.” The Political Quarterly 90.2 (2019).
EU Reporter. “War in Karabakh: How fake news appears on Western media.” EU Reporter (2020). <https://www.eureporter.co/general/2020/11/10/war-in-karabakh-how-fake-news-appears-on-western-media>.
EU vs Disinfo. “Disinfo: Nine Days Before The War, Lady Gaga Warned About It.” 7 October 2020. <https://euvsdisinfo.eu/report/nine-days-before-the-war-lady-gaga-warned-about-it/>.
—. “Disinfo: The Eastern Partnership Project Was Initially Designed As A Geopolitical Trap.” 21 January 2021. <https://euvsdisinfo.eu/report/the-eastern-partnership-project-was-originally-designed-as-a-geopolitical-trap>.
—. “Disinfo: The Escalation In Nagorno Karabakh Is Led By British Intelligence Services.” 1 October 2020. <https://euvsdisinfo.eu/report/escalation-nagorno-karabakh-british-intelligence-services>.
—. “Disinfo: The US And Its Allies Are Trying To Reignite The Clashes Between Armenia And Azerbaijan.” 18 November 2020. <https://euvsdisinfo.eu/report/us-allies-trying-reignite-clashes-armenia-azerbaijan>.
Facebook. October 2020 Coordinated Inauthentic Behavior Report. 5 November 2020. <https://about.fb.com/news/2020/11/october-2020-cib-report>.
Geukjian, Ohannes. Ethnicity, nationalism and conflict in the South Caucasus Nagorno-Karabakh and the legacy of Soviet nationalities policy. Burlington: Ashgate, 2012.
Giles, Christopher and Upasana Bhat. “Nagorno-Karabakh: The Armenian-Azeri ‘information wars’.” BBC 26 October 2020. <https://www.bbc.com/news/world-europe-54614392>.
Hamad, Fatma Ben. “Videos shared on social media show Syrians sent to fight in Nagorno-Karabakh.” The Observers 26 October 2020. <https://observers.france24.com/en/20201026-videos-shared-social-media-show-syrians-sent-fight-nagorno-karabakh>.
Hoch, Indira Neill. “Russian Internet Research Agency Disinformation Activities on Tumblr: Identity, Privacy, and Ambivalence.” Social Media + Society 6.4 (2020). <https://journals.sagepub.com/doi/10.1177/2056305120961783>.
Kardashian, Kim [KimKardashian]. “Tweet Message.” Twitter, 27 September 2020. <https://twitter.com/KimKardashian/status/1310278057018753026>.
Kraus, Rachel. “In the Russia-Ukraine information war, encrypted messaging apps provide opportunity and risk.” Mashable (2022). <https://mashable.com/article/whatsapp-telegram-russia-ukraine-disinformation>.
Kuzio, Taras. “How Russia Spreads Disinformation About the Second Karabakh War.” The National Interest 13 April 2021. <https://nationalinterest.org/blog/buzz/how-russia-spreads-disinformation-about-second-karabakh-war-182635>.
Legucka, Agnieszka. “Russia’s Long-Term Campaign of Disinformation in Europe.” Carnegie Europe 19 March 2020. <https://carnegieeurope.eu/strategiceurope/81322>.
Leighton, Marian. Soviet Propaganda as a Foreign Policy Tool. New York: Freedom House, 1991.
Linvill, Darren L and Patrick L Warren. “Troll Factories: Manufacturing Specialized Disinformation on Twitter.” Political Communication 37.4 (2020): 1-21.
Martirosyan, Samvel. “Misinformation in the Karabakh conflict: analysis of Azerbaijani media from Armenia.” 2021. <https://jam-news.net/misinformation-in-the-karabakh-conflict-analysis-of-azerbaijani-media-from-armenia/>.
Mirovalev, Mansur. “Armenia, Azerbaijan battle an online war over Nagorno-Karabakh.” Al Jazeera 15 October 2020. <https://www.aljazeera.com/features/2020/10/15/karabakh-info-war>.
Muradov, Murad. “The anti-Azerbaijani campaign in Russian media.” New Eastern Europe 13 April 2022. <https://neweasterneurope.eu/2022/04/13/the-anti-azerbaijani-campaign-in-russian-media/>.
Rzayev, Shahin, Alia Hagverdy and Emil Abbasov. “Misinformation in the Karabakh conflict: analysis of Armenian media from Azerbaijan.” 2021. <https://jam-news.net/misinformation-in-the-karabakh-conflict-analysis-of-armenian-media-from-azerbaijan/>.
Sherman, Justin. “Reassessing RuNet: Russian internet isolation and implications for Russian cyber behavior.” Atlantic Council 12 July 2021. <https://www.atlanticcouncil.org/in-depth-research-reports/issue-brief/reassessing-runet-russian-internet-isolation-and-implications-for-russian-cyber-behavior>.
Thomas, Elise and Albert Zhang. “Snapshot of a shadow war: a preliminary analysis of Twitter activity linked to the Azerbaijan–Armenia conflict.” 2020. <https://ad-aspi.s3.ap-southeast-2.amazonaws.com/2020-10/Snapshot%20of%20a%20shadow%20war.pdf>.
Wong, Julia Carrie and Luke Harding. “‘Facebook isn’t interested in countries like ours’: Azerbaijan troll network returns months after ban.” The Guardian 13 April 2021. <https://www.theguardian.com/technology/2021/apr/13/facebook-azerbaijan-ilham-aliyev>.