What makes audiences resilient to disinformation? Integrating micro, meso, and macro factors based on a systematic literature review

Abstract

Despite increased attention since 2015, there is little consensus on why audiences believe or share disinformation. In our study, we propose a shift in analytical perspective by applying the concept of resilience. Through a systematic literature review (n = 95), we identify factors that have been linked to individuals' resilience and vulnerability to disinformation thus far. Our analysis reveals twelve factors: thinking styles, political ideology, worldview and beliefs, pathologies, knowledge, emotions, (social) media use, demographics, perceived control, trust, culture, and the socio-political and informational environment.


Introduction
If disinformation had PR agents, they would be delighted. After all, the topic has been continuously sparking attention, controversy, and action in the past decade. The discourse often uses strong rhetoric: blaming social media and referring to democracies at stake, free speech endangered, an unfolding infodemic, or information wars (Alvarez-Galvez et al., 2021; Bojic et al., 2023; Miller and Vaccari, 2020). Disinformation is considered harmful, demanding prevention and mitigation, although the degree of harmfulness is the subject of an ongoing debate (Bennett and Livingston, 2018; Meese et al., 2020). Fueled by the perceived threat and its pivotal role during Brexit, two U.S. elections and the storming of the U.S. Capitol, the Covid-19 pandemic, and, most recently, the war in Ukraine, the disinformation debate remains continuously high on the agenda of the press, politics, and the public (Corbu et al., 2023; Marwick and Lewis, 2017).
The scientific community is no exception, as the continuously rising number of publications on the topic indicates (Humprecht et al., 2020). Popular themes are political and health-related disinformation, fact-checking, disinformation correction or interventions, and introspective works focused on terminology (Abu Arqoub et al., 2022; Janmohamed et al., 2021; Kapantai et al., 2021).
Hovering above it all is the question of why people believe or share disinformation. Focusing on the ever-changing political, psychological, technological, sociological, and even neurological drivers of disinformation, previous studies have found a multitude of chains of causes and effects. They point to a decline of trust in media and politics, low levels of media literacy, different cognitive styles, motivated reasoning, and the deceiving characteristics of disinformation (Bryanov and Vziatysheva, 2021; Klebba and Winter, 2023; Sindermann et al., 2020). In most cases, the focus lies on singular factors. However, the complexity of the issue at hand demands more encompassing approaches.
We argue that applying the concept of resilience to disinformation offers a pathway to a more comprehensive understanding. The concept is applied in many disciplines, yet in the context of disinformation, research on resilience is practically still in its infancy. Thus, we ask: RQ: Which factors are connected to resilience and vulnerability to disinformation in previous research?
We conducted a systematic literature review to analyze the increasingly convoluted landscape of disinformation research. Our standardized, protocol-driven methodology and rigorous search for all relevant literature allow for a reliable overview of research results on the topic. We first identify factors that have been linked to individuals' resilience and vulnerability to disinformation. Unlike other (systematic) reviews in the field, we do not focus on one type or content of disinformation and draw our insights from a multidisciplinary pool of sources, including but not limited to psychology, journalism, political science, and business studies. Our analysis goes beyond mapping the field by applying the results to the socio-ecological model (SEM). The SEM is a holistic framework used to explain behavior by considering influences from different levels, such as the interpersonal and institutional level (Ma et al., 2017). Our conceptual framework contributes to an under-theorized field of research and allows for an integrated rather than microscopic perspective, uncovering potential interdependencies of influences as well as identifying gaps in research.

Theoretical framework
The concept of resilience is often associated with psychology. However, given its broad meaning of empowering people against risks, it has been applied in many fields, such as economics, sports, and political science, to name only a few (Den Hartigh et al., 2022; Masten, 2011). As Bracke (2016, p. 57) puts it, "in precarious times, resilience is the new security." As a result, definitions and ascribed meanings differ, depending on the context and scope of use. After all, resilience may relate to anything spanning from cells to persons, organizations, nations, and nature (Southwick et al., 2014).
Despite this heterogeneity, the fundamental idea of resilience remains the same across disciplines. For one, it is always linked to a dawning challenge or threat. Without this component, there is simply no need for resilience (Bracke, 2016). Once confronted with adversity, resilience refers to the ability to withstand, adapt, and recover, mitigating potential negative effects (Masten et al., 1990; Sapienza and Masten, 2011). The term is also used to describe the process through which resources are harnessed and obstacles overcome, as well as the outcomes of coping (Liu et al., 2020). As these dynamic understandings demonstrate, resilience as a concept defies binary approaches. The fact that the state of resilience within a system or person is prone to change over contexts and time further exemplifies this (Panter-Brick, 2014; Southwick et al., 2014). All these aspects must be considered when conceptualizing resilience in a new context.
Similarly, disinformation comes in many forms, potentially manifesting as ad-revenue-driven clickbait, memes, manipulated visuals, decontextualized information, "alternative" facts, conspiracy theories, deepfakes, and professionally executed disinformation campaigns, including bots (Kapantai et al., 2021; Marwick, 2018). Previous research has defined the term using criteria such as the degree of verity of the content, the content format, and the distributor's intent (Bennett and Livingston, 2018; Wardle, 2018). The most widely used definition characterizes disinformation by an intention to mislead, to create (ad) revenue or division, or to push a (political) agenda (Nielsen and Graves, 2017; Wardle and Derakhshan, 2017). However, the intentionality of the distributor, apart from proving difficult to determine in retrospect, is of lesser relevance for this research, as the focus lies on the receiver and the factors influencing the process of discarding, internalizing, or sharing. Thus, all the above-named forms of disinformation are of relevance and included in our operationalization.

Resilience and vulnerability to disinformation
Only a few authors have applied the concept of resilience to disinformation (Roozenbeek et al., 2022). Hansen (2017), for example, differentiates between cognitive and physical resilience when describing means to counter information warfare. Cognitive resilience can be understood as the capability to process disinformation in a manner that prevents internalization, comparable to a cognitive "firewall." Physical resilience aims at obstructing the distribution of disinformation so that it does not reach the user in the first place. Shadow banning and removing content from social media platforms belong to this category. These two types of resilience go hand in hand: the less physical resilience someone has, the more cognitive resilience is needed, and vice versa (Bjola and Papadakis, 2020). Humprecht and colleagues (2020, p. 497) view resilience in the context of disinformation from a more structural perspective, defining it as "a collective characteristic that transcends the individual level." This is based on Hall and Lamont's (2013) understanding of resilience as the capability of groups to overcome adversity. Here, the focus lies on social, political, and informational structures within a country, such as its media system or level of polarization, as crucial influences on levels of resilience (Humprecht, 2018).
Models in resilience research generally differentiate between (1) risks, such as challenges and adversity; (2) competence criteria; and (3) variables of influence, such as protective or inhibiting factors (Masten, 2011). These three components build the foundation for our definition of resilience to disinformation, which focuses on the individual. The adversity component, in this case, is easily identified: disinformation. The competence criteria refer to markers for positive adaptation, in our case, indicators of resilience to disinformation. The variables of influence will be identified through our systematic review of the literature.
We define resilience to disinformation as a capability that manifests in the process of encountering disinformation and results in either questioning or recognizing disinformation and consequently dismissing it. This process is influenced by internal and external factors as well as the resources available to the individual. Dismissal can take many forms, including visible signs of objection, both on- and offline, or tacit forms, when disinformation is recognized and, as a consequence, neither believed nor shared, but not actively rejected either. As falsifying or determining the context of (dis-)information is not always possible, we view questioning (dis-)information, which results in non-internalizing and non-sharing (unless for verification purposes), as an indicator of resilience as well. Simply put, resilient persons are not deceived and neither internalize nor distribute encountered disinformation. Consequently, resilience protects the individual and their environment from the potentially harmful effects of disinformation.
Attempts to understand resilience will always lead to questions about vulnerability. After all, what sets the process of developing resilience in motion are different forms of threats, which lead to vulnerability (Bracke, 2016). Thus, if resilience exists on a continuum, vulnerability is at the opposite end. In disinformation research, the term vulnerability, just as resilience, is used frequently yet loosely, mostly without terminological or conceptual discussion. There seems to be an unspoken understanding of what vulnerability to disinformation entails, as most scholars refer to it in an almost identical manner: not being able to discern, accepting, and believing dis- or misinformation, fake news, rumors, or conspiracy theories. Another popular term is susceptibility, which is often used interchangeably with vulnerability (Nisbet and Kamenchuk, 2021; Pennycook and Rand, 2019; Traberg and van der Linden, 2022).
In our understanding, vulnerability to disinformation leads to an implicit or explicit acceptance of disinformation. Just like dismissal, acceptance can take many forms, such as sharing false information due to a lack of deliberation, being deceived by it, or internalizing it subconsciously. Thus, vulnerable or susceptible individuals are exposed to the potentially harmful consequences of disinformation.

The socio-ecological model
Resilience does not develop in isolation. It emerges through and is shaped by a combination of external and internal factors, events, and circumstances (Garcia-Dia et al., 2013). Therefore, to truly grasp resilience to disinformation, it is necessary to consider the influence of technological, political, social, cultural, economic, legislative, or educational environmental factors (Liu et al., 2017; Masten, 2011). We therefore introduce a framework that allows for delineating the different layers and connected factors of influence. Originating from health sciences, the socio-ecological model (SEM, see Figure 1) is used to explain individuals' behavior by examining a variety of influences, ranging from micro- to macro-level factors (Bronfenbrenner, 1979; Ma et al., 2017; McLeroy et al., 1988).
Most models applied in disinformation research focus on the individual and on cognition, neglecting the environments that shape both. The SEM highlights the importance of contextual understanding to explain behavior and acknowledges that effective interventions must consider and target individuals as well as their environment (Robinson, 2008). It emphasizes the interplay of factors rather than the influence of single factors when studying a subject (Upreti et al., 2021). This viewpoint could explain why, for instance, campaigns against disinformation that focus on transferring knowledge have limited effects. Previous research on health campaigns, for example, has shown that education needs a supportive environment to bear fruit (Sallis et al., 2015). The SEM's ability to depict such complexities suits our multidisciplinary and comprehensive approach.

Method
To ensure transparent and replicable research, we conducted our literature review according to the PRISMA guidelines (PRISMA, 2021), the most widely adopted guidelines for systematic reviews (Batten and Brackett, 2022). We started with a detailed review protocol and a pre-test phase, which included an evaluation of the selected search strategy, terms, and databases by three researchers and two librarians. The data was then collected from three databases: Web of Science, due to its large coverage of interdisciplinary journals, and Communication and Mass Media Complete and APA PsycInfo as the leading databases for the two most important fields of disinformation research. Google Scholar was excluded due to its non-transparent algorithm, which impedes replication studies. The search strategy was tailored to each database regarding aspects such as truncations, wildcards, and Boolean operators. We searched for the following terms: (disinformation OR misinformation OR "fake news" OR "conspiracy theor*") AND (vulnerab* OR resilien* OR susceptib* OR belie* OR *trust* OR shar* OR decepti* OR deceiv* OR endors*). The search terms are based on our definitions of disinformation and resilience, as outlined previously, and include synonyms of key terms. We queried articles published from 2011 onwards, as the topic mainly gained traction within the past decade (Kapantai et al., 2021). Figure 2 shows the increase in publications mentioning disinformation during this period, using Web of Science as an example.
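For readers who wish to adapt the search string, its logical structure can be sketched programmatically. The snippet below is purely illustrative: it reproduces the two OR-joined term groups combined with AND as reported above, but not the database-specific truncation or wildcard syntax, which differs between Web of Science, Communication and Mass Media Complete, and APA PsycInfo.

```python
# Illustrative sketch of the review's Boolean search string.
# Each group is OR-joined; the two groups are then AND-combined.
# Exact truncation/wildcard handling varies per database and is not modeled here.
topic_terms = ['disinformation', 'misinformation', '"fake news"', '"conspiracy theor*"']
outcome_terms = ['vulnerab*', 'resilien*', 'susceptib*', 'belie*',
                 '*trust*', 'shar*', 'decepti*', 'deceiv*', 'endors*']

def build_query(group_a, group_b):
    """Join each term group with OR, then combine both groups with AND."""
    return f"({' OR '.join(group_a)}) AND ({' OR '.join(group_b)})"

query = build_query(topic_terms, outcome_terms)
print(query)
```

Keeping the term groups as lists makes it straightforward to adapt the string per database or to extend the synonym sets in replication studies.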
To be included, articles needed to be written in English, published in peer-reviewed journals from 2011 to 2021, and present qualitative or quantitative empirical studies. Articles dedicated to research instrument development were left out due to their lack of explanatory value for the research question. Research related to just one phenomenon, such as misinformation in general, or to related topics, such as misinformation correction, was excluded. To ground our framework in empirical studies, we excluded non-empirical work. Studies relying on student or convenience samples were not included due to their lack of representativeness, nor were works not disclosing their sampling method. The data collection was conducted in mid-December 2021 and yielded a total of n = 1586 results. After five deduplication rounds, a title and abstract screening (n = 1451 articles) was performed independently by two researchers for enhanced reliability using the software Rayyan. After the screening, full texts for n = 208 articles were retrieved and assessed for eligibility, out of which n = 94 met the criteria and were included. An updated search in September 2022 added one further article, leading to a total sample size of n = 95. Most of the excluded works either did not meet the sample criteria or did not address the topic. Throughout the whole selection process, each exclusion decision was documented, and discrepancies were jointly resolved.
Data were extracted using a standardized coding protocol, which was pilot-tested and peer-reviewed by three researchers. The data extraction sheet (see supplementary material) comprised 13 categories encompassing bibliometric information and empirical data. It was developed by synthesizing standard coding categories for systematic reviews, such as author or study outcomes, with tailored categories for more fine-grained analysis, such as research instruments and key measures. To draw insights from the rich dataset, we conducted a qualitative thematic analysis and relied on data visualization tools, such as word clouds, to map regions or keywords. After grouping the data from within our categories, we used an inductive approach, relying first on open and then on thematic coding to identify themes and systematically document research results. We decided against a traditional quantitative meta-analysis of results for two main reasons. First, the 95 analyzed studies exhibited large differences in their operationalization of concepts, sample origins, sizes, and levels of representativeness. Second, many studies relied on country-specific and thematically diverse disinformation stimuli, ranging from political false news to AIDS conspiracy theories, further impeding comparison.

Results
To fully comprehend and connect study results, the following section gives an overview of the sample by examining the research fields and regions that investigate the topic, popular themes, and applied theories and methods.
Our geospatial analysis confirms the findings of previous studies, identifying the United States as the main source of research, followed by Europe and Australia (Abu Arqoub et al., 2022). Contributing to the U.S.-centeredness of the debate are a considerable number of studies located outside of North America that nevertheless rely on American samples. We found publications from most EU member countries and increased interest from Eastern European countries. Clearly underrepresented regions, on the other hand, are South America, Africa, and large parts of Asia, with Singapore and Malaysia as exceptions.
The topic is often approached from a psychological perspective, with scholars and journals from the field accounting for more than half of all publications. The other half consists of communication, political science, business, and journalism studies. Three main areas of interest emerge from the reviewed literature. A total of 51 % of the articles in our sample investigate belief in conspiracy theories, around 27 % focus on political disinformation, and 19 % revolve around Covid-19 disinformation, including related conspiracy theories. Across all disciplines, the topic is commonly approached by investigating predictors of disinformation belief and sharing.
Only one third of all studies within our sample use established pre-existing theoretical constructs as the basis for their empirical research. Within this group, theories from cognitive psychology prevail. These mainly rely upon dual process theory, fluency, motivated reasoning, and cognitive dissonance and share one basic assumption: that disinformation is believed, shared, or rejected as a result of cognitive processes. We did not count the mere definition of predictors of disinformation belief, accompanied by literature-based arguments for their influence, as theory-based research.
A closer look at study designs reveals a homogenous picture, as all but one study in our sample are quantitative, one third of them using experimental designs. In 75 % of cases, the samples are recruited from crowdsourcing platforms such as Amazon Mechanical Turk. Sample sizes differ, ranging from 100 to 18,000 participants.
Having gained an understanding of the research field, we move on to our main research question, investigating factors that are connected to individuals' vulnerability or resilience to disinformation. As outlined in the methodology section, direct comparisons of study results are impeded by large differences within our sample. Instead, this section lists and critically examines the factors that have been researched in connection with resilience and vulnerability to disinformation. A detailed account of the corresponding studies and study designs per factor of influence can be found in the supplementary material.
Micro level

Deliberation and cognitive styles. Research on deliberation and cognitive styles traces disinformation beliefs back to thinking processes, for example, by applying dual process theory. The assumption here is that individuals with a more intuitive thinking style are more prone to believe and share disinformation than more analytical or reflective thinkers. Indeed, almost all results confirm this, consistently linking lower scores on the Cognitive Reflection Test (CRT), which are interpreted as reliance on intuition instead of deliberation, to lower resilience to disinformation (e. g., Marques et al., 2022; Nurse et al., 2022). However, regarding disinformation sharing, several studies report no correlation with CRT scores, limiting the explanatory potential of cognitive styles (e. g., Buchanan and Kempley, 2021; Nurse et al., 2022). But the field of cognition has more to offer for understanding resilience to disinformation, such as cognitive biases. One example is the tendency to accept seemingly meaningful claims uncritically, also referred to as "bullshit receptivity," which has been repeatedly linked to disinformation beliefs (e. g., Hart and Graether, 2018; Pennycook and Rand, 2020). Similarly, illusory pattern perception, that is, seeing patterns in random formations, has been linked to conspiracy theory belief (e. g., van Prooijen et al., 2018).
Pathology. In conspiracy theory studies, pathological traits, such as schizophrenia, are commonly hypothesized as predictors of conspiracy beliefs. Fifty-seven percent of the research results from the 12 studies investigating pathologies within our sample confirm these assumptions, linking schizotypy and its lower-order facets, such as odd beliefs and magical thinking, delusion proneness, and paranoia, to conspiracy theory belief (e. g., Barron et al., 2018; Georgiou et al., 2019). Mediating factors such as different information processing or levels of self-certainty hint at possible explanations for these effects. As is to be expected in the general population, individuals with these psychopathological traits account for small minorities within the samples, significantly limiting the explanatory value of the results for most parts of society.
Political ideology. The fact that political ideology is examined in most studies, independent of the overall topic, points to the shared assumption among scholars that it plays an important role in disinformation beliefs. Studies control for the alignment of disinformation beliefs along ideological party lines and differentiate between liberal and conservative participants. Most of them indeed find a correlation between participants' political ideology and belief in politically congruent disinformation, indicating motivated reasoning (e. g., Anthony and Moulding, 2019; Lawson and Kakkar, 2021). There is, however, another possible explanation for these findings, since several studies point to source credibility as a decisive factor in truth assessments. More specifically, information from politically congruent sources, such as the New York Times for liberals, was rated as more accurate regardless of its actual veracity (Traberg and van der Linden, 2022). This points to partisan-motivated processing of information in general, including disinformation, which in turn is related to the individual's degree of emotional investment in and identification with the respective party. Most simply put, political ideology matters if it causes motivated reasoning or influences deliberation.
Worldview, beliefs, and personality. The included studies survey various factors relating to attitudes, worldviews, and pre-existing beliefs, ranging from traditionalism, anti-intellectualism, and general suspicions to epistemic beliefs and religiosity (e. g., Garrett and Weeks, 2017). Comparisons and generalizations in this domain are futile, as the influence of individual factors is highly context- and content-dependent. For example, attitudes towards vaccines might prove to be a reliable predictor of Covid-19 disinformation beliefs in some cases but be completely unrelated to the endorsement of conspiracy theories about 9/11. The only factor that has been consistently linked to conspiracy beliefs is a general conspiracist worldview, an insight that bears little surprise (e. g., Šrol et al., 2021). Religious beliefs, on the other hand, are generally found to have no correlation with disinformation endorsement, as studies find no significant difference between believers and non-believers (e. g., Jasinskaja-Lahti and Jetten, 2019). Personality traits such as the desire to cause chaos, overconfidence, and avoidance coping are found to be linked to higher disinformation belief and sharing, whereas high conscientiousness and a preference for discussion heterogeneity are linked to lower vulnerability (e. g., Marchlewska et al., 2019; Su, 2021). However, there are no replication or comparable studies to corroborate these findings. It is safe to assume that there is not one personality profile of a person at risk, but rather individual personality traits that have the potential to reinforce tendencies in interaction with external factors, such as the socio-political environment (as seen during Covid-19).
Knowledge. Fourteen studies within our sample examine the connection between pre-existing knowledge and disinformation belief. Of the many forms of knowledge, their focus lies on political, scientific, and health or Covid-19 knowledge, as well as digital and media literacy (e. g., Vegetti and Mancosu, 2020; Zimmermann and Kohring, 2020). Results on the latter differ greatly, in line with previous research (Jones-Jang, 2021; Marwick, 2018). Regarding political disinformation, individuals with higher political knowledge performed better at discerning between real and false news (Rossini et al., 2021). Within the remaining studies, knowledge is assessed with only a few variables, partly relying on self-assessment and thus limiting our capacity to draw valid conclusions.
Emotions. Of the studies in our sample, 15 % investigate the influence of emotions on disinformation beliefs or sharing. Most scholars connect conventionally labeled "negative" emotions to vulnerability, investigating anger, anxiety, stress, or feelings of exclusion, with varying results. The findings around anxiety are a good example of these discrepancies. In the context of political disinformation, anxiety has been shown to decrease partisan processing of (dis-)information (Weeks, 2015). At the same time, health anxiety has been found to increase message importance and thus sharing intentions for health information in general, including disinformation (Oh and Lee, 2019). Thus, context matters, as do interacting factors, such as emotional coping mechanisms, which in turn are shaped by previous personal experiences. The only exception to this ambiguity are studies investigating the effects of (societal) threat and exclusion on conspiracy theory belief. Here, all results link higher perceived threat and feelings of exclusion to higher conspiracy theory belief (e. g., Jolley et al., 2018). Only a few studies investigate the influence of positive emotions, such as entertainment seeking, on susceptibility to disinformation (e. g., van Prooijen et al., 2022).
Media use and exposure to disinformation. Social media use is quickly blamed for its presumed negative influence on disinformation belief and sharing, even though research results on this are highly mixed. As disinformation is commonly distributed via social media, presence on these platforms naturally increases the risk of encountering it. By the same logic, highly active social media users who frequently share information on their channels run a higher risk of (accidentally) posting false information (Buchanan and Kempley, 2021). Adding to the potential danger are algorithms that present the user with similar information, activating cognitive processes of fluency, whereby the repetition of information, factual or false, increases believability. Of the 12 studies that investigate social media use, 50 % find a positive correlation between social media use and disinformation belief or sharing (e. g., Bae, 2020). However, the effects are often explained through mediating factors such as worry or conspiratorial thinking, pointing to the interplay with emotions and beliefs (e. g., Su, 2021). We interpret these findings as evidence of social media's potential to weaken resilience to disinformation due to its amplifying nature. The use of so-called "alternative media," on the other hand, has been found to decrease resilience to disinformation significantly (Humprecht et al., 2021).
Demographics. When identifying vulnerable audiences, relying on factors such as education, income, or age is tempting and probably pointless. According to our analysis, demographic variables have little to no explanatory value for understanding resilience to disinformation. Only three studies within our sample report correlations with age or level of education. However, in all cases, these connections are explained by more influential mediating factors such as knowledge or epistemic beliefs (e. g., Douglas et al., 2016).
Perceived control. A common attempt within media and public discourse to make sense of conspiracy theory beliefs is by referring to the concept of personal control. In this context, conspiracy theories are viewed as a tool to regain a sense of control in an increasingly uncertain and uncontrollable world. Indeed, all five studies investigating the role of control find a correlation between a perceived lack of control, feelings of powerlessness, and conspiracy theory belief (e. g., Hart and Graether, 2018). We did not find any results on the role of control for other forms of disinformation.

Meso level
Trust and social environment. The lack of trust in Western societies makes for a popular argument to explain the increased proliferation of disinformation. To avoid inaccurate generalizations, it is essential to differentiate between different forms of trust. The studies within our sample investigate institutional trust, political trust, trust in news sources, interpersonal trust, trust in mainstream media, trust in science, and trust in food safety, with most attention being paid to the first two. Most of the evidence confirms initial assumptions that decreased trust is linked to disinformation belief or sharing, although effect sizes differ (e. g., Hollander, 2018). The mechanisms behind interpersonal and (news) source trust prove more intricate and produce less homogeneous results. High interpersonal trust, for example, such as trust towards a specific person, can increase susceptibility to a specific conspiracy theory (Green and Douglas, 2018). At the opposite end of the scale, low social trust toward strangers has also been found to increase susceptibility (Hopp et al., 2020). The influence of the social environment on resilience to disinformation receives little attention but bears useful insights, as social media experiments found a significant influence of user comments on readers' evaluation of (dis-)information (Anspach and Carlson, 2020; Colliander, 2019). These findings illustrate how meso-level influences interact with, or can overrule, intrapersonal factors such as partisan beliefs.

Macro level
Culture and collective narcissism. Only five studies explore the potential influence of culture on disinformation belief, and even fewer produce valid results. The exception is collective narcissism, which is presumed to increase vulnerability to outgroup conspiracy theories. As a form of ingroup identity, it is characterized by a perceived greatness of one's own group and increased hostility towards others. The results regarding collective narcissism are consistent, linking it to belief in ideologically aligned conspiracy theories (e. g., Cichocka et al., 2016; Marchlewska et al., 2019).
Socio-political and informational environment. In general, macro-level factors receive much less scholarly attention within our sample. Two cross-national studies focus on the influence of structural factors and find high levels of polarization and populism to be connected to higher vulnerability to disinformation. Trust in and use of mainstream and public service media, on the other hand, were not connected to levels of resilience (Humprecht et al., 2020; Humprecht et al., 2021).

Conclusion
Our study presents a conceptual framework on resilience to disinformation, provides an overview of the research field, and identifies key factors associated with resilience and vulnerability to disinformation through a systematic review. Our findings enable us to identify gaps and provide direction for future research.
The studies within our sample mainly originate from North America or Western Europe, thereby inherently reflecting the political context, priorities, and implicit assumptions prevailing within these geographical regions. This could be an outcome of our methodological choices, as, for example, the journals included in our databases of choice might be less accessible or common in other regions. Future research explicitly focused on South American, African, Asian, and Middle Eastern perspectives is needed to complement this.
Another indicator of homogeneity within the research field is the choice of methodology and sampling methods. The overwhelming majority adopt a quantitative approach and mainly sample from crowdsourcing platforms, which raises questions regarding representativeness. It also points to a lack of insight into the underlying motivations and circumstances that lead to disinformation endorsement. Mixed methods or qualitative approaches could provide meaning to thus far discovered correlates and uncover more latent factors and processes. Qualitative interviews, for instance, would allow researchers to explore processes of resilience inductively and provide more in-depth insight into how individuals interpret and navigate (dis)information.
In our framework and analysis, we differentiate between micro-, meso-, and macro-level factors of influence on resilience to disinformation. At the micro level, political ideology, cognitive processes, and pathologies are considered the prime drivers of vulnerability to disinformation. As a substantial part of disinformation is of a political nature, it is logical to find related correlates embedded in empirical research. The prevalence of cognitive and pathological measures can partly be attributed to the availability of existing research instruments. However, if cognitive abilities and mental illnesses are the first to be explored in relation to vulnerability to disinformation, this also points to implicit biases. We are not the first to notice this, and our evidence emphasizes the importance of being conscious of implicit value judgments and of understanding rather than labeling subjects (Harambam, 2020).
Findings that reflective thinking has a positive influence on resilience to disinformation are promising but come with limitations. We find inconsistencies in the understanding of what the Cognitive Reflection Test (CRT), the most widely used instrument for measuring cognitive styles, intends to measure, ranging from "analytical thinking" to "cognitive reflection" or "cognitive ability" (Buchanan and Kempley, 2021; Marques et al., 2022; Tandoc et al., 2021). Some authors acknowledge its constraints, pointing out that it is unclear whether the CRT assesses analytic thinking or simply reflects numeracy or general cognitive ability (Nurse et al., 2022). These inconsistencies limit the overall validity and generalizability of results. In addition, we should not equate improved truth discernment, resulting from deliberation, with resilience to disinformation. After all, research shows that higher CRT scores do not necessarily lead to better discernment when sharing disinformation (Nurse et al., 2022). For example, individual abilities to discern disinformation could be overruled when content is shared solely for entertainment purposes, for connecting with peers, or for expressing belonging to a certain group. Based on our framework, we conclude that meso-level factors, for example those relating to the social environment, might help explain the gap between abilities and behavior and should be explored in future research.
Overall, the studies within our sample pay little attention to meso- and macro-level influences on resilience to disinformation, leaving questions regarding the role of social, educational, or political environments unanswered. This points to a significant gap, as experiences and environments undeniably shape the development of resilience and thus need to be considered. Our results further emphasize the importance of context. Believing Covid-19 disinformation during the pandemic might be motivated by different factors than believing 9/11 conspiracy theories. Moreover, while different forms of disinformation can be believed for the same reasons, our evidence shows that the type of content, in addition to the context, matters. Lastly, our research shows that concepts related to disinformation are undertheorized, exemplified by the fact that only one-third of studies use established theories and concepts. Our framework aims to contribute to the conceptual groundwork in the field.
Based on our sample and analysis, we cannot quantitatively compare study results or effect sizes within our sample. This limits our ability to make statements on the exact influence of different factors on resilience to disinformation. Additionally, since research based on convenience and student samples was not included in the systematic review, our list of factors connected to resilience to disinformation may not be exhaustive.
As a multi-layered issue, disinformation research benefits from approaches that allow for complexity. By integrating different levels, our conceptual framework of resilience to disinformation provides a meta-level perspective, which can aid researchers in identifying mediating factors, explaining unexpected effects, and contextualizing their results. It also provides a possible roadmap for more extensive, multi-factor research, which would greatly benefit a field that currently focuses mainly on the influence of single, micro-level factors. Research results show that influencing factors act additively, further emphasizing the importance of more comprehensive approaches. Some scholars already acknowledge this and attempt to provide a more holistic view of disinformation (Manuvie and van Dorssen, 2021; Chadwick and Stanyer, 2022). In the end, grasping resilience to disinformation is like crafting a complex mosaic in which both the individual parts and the bigger picture must be monitored simultaneously.

Figure 4: Factors connected to resilience and vulnerability to disinformation.