“Useful Idiots” of Disinformation Campaigns. Mechanisms for the Formation of Virtual Communities Spreading Falsehood and Manipulation Online

Jacek Czerwiński

Abstract


This article explores the phenomenon of virtual communities that disseminate false and manipulated content. The author points to the relationship between contemporary changes in the media environment, which influence the formation of social identity, worldview and emotional experience, and the intentional creation of virtual communities around disinformation content. The article also considers the impact of disinformation campaigns, initiated by organised state and non-state actors, on the creation of virtual communities complicit in the creation, reproduction and propagation of false narratives. The objectives of such campaigns, which focus on creating alternative information environments and disinformation communities, are identified and described. Contemporary social transformations are then characterised in relation to the role played by virtual communities in the adaptation of individuals to these changes. Particular attention is paid to increasing individualisation, the formation of social identification in digital media, and the growth of a sense of anxiety and insecurity in society. On this basis, mechanisms rooted in contemporary social transformations are identified that foster the formation of, and participation in, disinformation communities, namely: (1) helping to understand complex and unclear socially important topics; (2) pointing to the “right” norms and values; (3) helping to form a worldview; (4) supporting coping with difficult-to-control emotions; (5) creating a space for the expression of rebellion and countercultural attitudes; (6) creating a space of “escape” for excluded or marginalised individuals; (7) helping to shape social identification.
The effects of disinformation campaigns based on the participation of virtual communities are also identified, namely: (1) the creation of “unconscious disinformation agents”; (2) the formation of oppositional identities; (3) the polarisation of communities around emotions; and (4) the normalisation of extreme values and views.


Keywords


disinformation; fake news; virtual communities; digital media; social identity

Full Text:

PDF (Polish)

References


Aral, S. (2020). The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health and How We Must Adapt (First edition). Currency.

Bantimaroudis, P. (2016). “Chemtrails” in the Sky: Toward a Group-mediated Delusion Theory. Studies in Media and Communication, 4(2), 23–31. https://doi.org/10.11114/smc.v4i2.1719

Bradshaw, S., Howard, P.N. (2018). Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation. Working Paper 2018.1. Oxford, UK: Project on Computational Propaganda.

Brady, W.J., Wills, J.A., Jost, J.T., Tucker, J.A., Van Bavel, J.J. (2017). Emotion Shapes the Diffusion of Moralized Content in Social Networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318. https://doi.org/10.1073/pnas.1618923114

Christensen, K., Levinson, D. (Eds.). (2003). Encyclopedia of Community: From the Village to the Virtual World. Sage Publications.

Darius, P., Urquhart, M. (2021). Disinformed Social Movements: A Large-Scale Mapping of Conspiracy Narratives as Online Harms during the COVID-19 Pandemic. Online Social Networks and Media, 26, 100174. https://doi.org/10.1016/j.osnem.2021.100174

Demczuk, A. (2018). Fenomen ruchu antyszczepionkowego w cyberprzestrzeni, czyli fake news i postprawda na usługach hipotezy Andrew Wakefielda. Annales Universitatis Paedagogicae Cracoviensis. Studia de Cultura, 4(10), 92–113. https://doi.org/10.24917/20837275.10.4.8

Douglas, K.M., Uscinski, J.E., Sutton, R.M., Cichocka, A., Nefes, T., Ang, C.S., Deravi, F. (2019). Understanding Conspiracy Theories. Political Psychology, 40(S1), 3–35. https://doi.org/10.1111/pops.12568

Elliott, A. (2015). Identity Troubles (1st ed.). Routledge. https://doi.org/10.4324/9780203402221

Freelon, D., Lokot, T. (2020). Russian Twitter Disinformation Campaigns Reach across the American Political Spectrum. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-003

Hardy, C., Lawrence, T.B., Grant, D. (2005). Discourse and Collaboration: The Role of Conversations and Collective Identity. Academy of Management Review, 30(1), 58–77. https://doi.org/10.5465/amr.2005.15281426

Hasell, A., Weeks, B.E. (2016). Partisan Provocation: The Role of Partisan News Use and Emotional Responses in Political Information Sharing in Social Media: Partisan News, Emotions, and Information Sharing. Human Communication Research, 42(4), 641–661. https://doi.org/10.1111/hcre.12092

Huneman, P., Vorms, M. (2018). Is a Unified Account of Conspiracy Theories Possible? Argumenta Oeconomica Cracoviensia, 3, 49–72. https://doi.org/10.23811/54.ARG2017.HUN.VOR

Innes, M., Dobreva, D., Innes, H. (2021). Disinformation and Digital Influencing after Terrorism: Spoofing, Truthing and Social Proofing. Contemporary Social Science, 16(2), 241–255. https://doi.org/10.1080/21582041.2019.1569714

Innes, M., Innes, H., Roberts, C., Harmston, D., Grinnell, D. (2021). The Normalisation and Domestication of Digital Disinformation: On the Alignment and Consequences of Far-Right and Russian State (dis)information Operations and Campaigns in Europe. Journal of Cyber Policy, 6(1), 31–49. https://doi.org/10.1080/23738871.2021.1937252

Jungherr, A., Schroeder, R. (2021). Disinformation and the Structural Transformations of the Public Arena: Addressing the Actual Challenges to Democracy. Social Media + Society, 7(1), 2056305121988928. https://doi.org/10.1177/2056305121988928

Kahan, D.M. (2017). Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition (SSRN Scholarly Paper Nr 2973067). https://doi.org/10.2139/ssrn.2973067

Kata, A. (2010). A Postmodern Pandora’s Box: Anti-Vaccination Misinformation on the Internet. Vaccine, 28(7), 1709–1716. https://doi.org/10.1016/j.vaccine.2009.12.022

Keller, F.B., Schoch, D., Stier, S., Yang, J. (2020). Political Astroturfing on Twitter: How to Coordinate a Disinformation Campaign. Political Communication, 37(2), 256–280. https://doi.org/10.1080/10584609.2019.1661888

Kosloff, S., Greenberg, J., Schmader, T., Dechesne, M., Weise, D. (2010). Smearing the Opposition: Implicit and Explicit Stigmatization of the 2008 U.S. Presidential Candidates and the Current U.S. President. Journal of Experimental Psychology: General, 139(3), 383–398. https://doi.org/10.1037/a0018809

Kramer, A.D.I., Guillory, J.E., Hancock, J.T. (2014). Experimental Evidence of Massive Scale Emotional Contagion Through Social Networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. https://doi.org/10.1073/pnas.1320040111

Lewandowsky, S., Ecker, U.K.H., Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008

Linvill, D.L., Warren, P.L. (2020). Engaging with Others: How the IRA Coordinated Information Operation Made Friends. Harvard Kennedy School Misinformation Review, 1(2). https://doi.org/10.37016/mr-2020-011

Manivannan, V. (2013). FCJ-158 Tits or GTFO: The Logics of Misogyny on 4chan’s Random – /b/. The Fibreculture Journal, 22, 109–132.

Marshall, J.P. (2017). Disinformation Society, Communication and Cosmopolitan Democracy. Cosmopolitan Civil Societies: An Interdisciplinary Journal, 9(2), Article 2. https://doi.org/10.5130/ccs.v9i2.5477

Marwick, A.E. (2018). Why Do People Share Fake News? A Sociotechnical Model of Media Effects. Georgetown Law Technology Review. https://georgetownlawtechreview.org/why-do-people-share-fake-news-a-sociotechnical-model-of-media-effects/GLTR-07-2018/

Marwick, A., Lewis, R. (2017). Media Manipulation and Disinformation Online. Data & Society Research Institute, 1–104.

Matei, S.A. (2005). From Counterculture to Cyberculture: Virtual Community Discourse and the Dilemma of Modernity. Journal of Computer-Mediated Communication, 10(3), JCMC1031. https://doi.org/10.1111/j.1083-6101.2005.tb00262.x

Meek, J. (2020). Red Pill, Blue Pill. London Review of Books, 42(20). https://www.lrb.co.uk/the-paper/v42/n20/james-meek/red-pill-blue-pill

Nadler, A., Crain, M., Donovan, J. (2018). The Political Perils of Online Ad Tech.

P, D., Chakraborty, T., Long, C., Santhosh Kumar, G. (2021). Data Science for Fake News: Surveys and Perspectives (Vol. 42). Springer International Publishing. https://doi.org/10.1007/978-3-030-62696-9

Pacheco, D., Flammini, A., Menczer, F. (2020). Unveiling Coordinated Groups Behind White Helmets Disinformation. Companion Proceedings of the Web Conference 2020, 611–616. https://doi.org/10.1145/3366424.3385775

Partin, W.C., Marwick, A.E. (2020). The Construction of Alternative Facts: Dark Participation and Knowledge Production in the Qanon Conspiracy. AoIR Selected Papers of Internet Research 2020. https://doi.org/10.5210/spir.v2020i0.11302

Pasquetto, I.V., Olivieri, A.F., Tacchetti, L., Riotta, G., Spada, A. (2022). Disinformation as Infrastructure: Making and Maintaining the QAnon Conspiracy on Italian Digital Media. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW1), 1–31. https://doi.org/10.1145/3512931

Paul, C., Matthews, M. (2016). The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It. RAND Corporation. https://doi.org/10.7249/PE198

Pennycook, G., Cannon, T.D., Rand, D.G. (2017). Prior Exposure Increases Perceived Accuracy of Fake News. Journal of Experimental Psychology: General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465

Pierre, J.M. (2020). Mistrust and Misinformation: A Two-Component, Socio-Epistemic Model of Belief in Conspiracy Theories. Journal of Social and Political Psychology, 8(2), 617–641. https://doi.org/10.5964/jspp.v8i2.1362

Śledź, P. (2022). Dezinformacja w stosunkach międzynarodowych w warunkach czwartej rewolucji przemysłowej. Studia Socjologiczno-Polityczne. Seria Nowa, 1. https://doi.org/10.26343/0585556X11605

Starbird, K., Arif, A., Wilson, T. (2019). Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–26. https://doi.org/10.1145/3359229

Stephan, W.S., Stephan, C.W. (2000). An Integrated Threat Theory of Prejudice. In S. Oskamp (Ed.), Reducing prejudice and discrimination (pp. 23–45). Lawrence Erlbaum Associates Publishers.

Sternisko, A., Cichocka, A., Van Bavel, J.J. (2020). The Dark Side of Social Movements: Social Identity, Non-Conformity, and the Lure of Conspiracy Theories. Current Opinion in Psychology, 35, 1–6. https://doi.org/10.1016/j.copsyc.2020.02.007

Stowarzyszenie Demagog (2022). Kremlowska propaganda zablokowana. Sankcjami objęto rosyjskie media. https://demagog.org.pl/analizy_i_raporty/kremlowska-propaganda-zablokowana-sankcjami-objeto-rosyjskie-media/

Strandberg, K., Himmelroos, S., Grönlund, K. (2019). Do Discussions in Like-Minded Groups Necessarily Lead to More Extreme Opinions? Deliberative Democracy and Group Polarization. International Political Science Review, 40(1), 41–57. https://doi.org/10.1177/0192512117692136

Szpunar, M. (2004). Społeczności wirtualne jako nowy typ społeczności – Eksplikacja socjologiczna. Studia Socjologiczne, 2(173), 95–135.

Szpunar, M. (2018). Kultura lęku (nie tylko) technologicznego. https://doi.org/10.26112/KW.2018.101.10

Toma, G.-A. (2021). Fake News as a Social Phenomenon in the Digital Age: A Sociological Research Agenda. Sociologie Românească, 19(1), Article 1. https://doi.org/10.33788/sr.19.1.7

Tucker, J., Guess, A., Barbera, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., Nyhan, B. (2018). Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3144139

Uscinski, J.E., Parent, J.M. (2014). American Conspiracy Theories. Oxford University Press.

Vosoughi, S., Roy, D., Aral, S. (2018). The Spread of True and False News Online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559

Wang, V., Tucker, J.V., Haines, K. (2014). Viewing Cybercommunities through the Lens of Modernity: The Case of Second Life. International Journal of Virtual Communities and Social Networking, 5(1), 75–90. https://doi.org/10.4018/jvcsn.2013010105

Wardle, C., Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe, 27, 1–107.

Wasiuta, O., Wasiuta, S. (2020). Kremlowska dezinformacja w internecie i reakcja społeczeństw zachodnich. Przegląd Geopolityczny, 34, 136–147.

Węglińska, A. (2018). Astroturfing internetowy a zagrożenie bezpieczeństwa – protesty w obronie sądów w Polsce, boty i dezinformacja. Rocznik Bezpieczeństwa Międzynarodowego, 12(1), 68–81. https://doi.org/10.34862/rbm.2018.1.6

Włodkowska-Bagan, A. (2018). Rosyjska ofensywa propagandowa. Casus Ukrainy. Studia Politologiczne, 49, 109–124.

Zannettou, S., Sirivianos, M., Blackburn, J., Kourtellis, N. (2019). The Web of False Information: Rumors, Fake News, Hoaxes, Clickbait, and Various Other Shenanigans. Journal of Data and Information Quality, 11(3), 1–37. https://doi.org/10.1145/3309699

Zduniak, A. (2010). Event jako ponowoczesna forma uczestnictwa w życiu społecznym. Roczniki Nauk Społecznych, 2(38), 28.




DOI: http://dx.doi.org/10.17951/ks.2022.10.2.5-30
Date of publication: 2024-03-04 01:18:55
Date of submission: 2024-03-03 21:32:20




Copyright (c) 2024 Jacek Czerwiński

This work is licensed under a Creative Commons Attribution 4.0 International License.