Why do individuals accept mis- or disinformation? And why, in particular, do they accept mis- and disinformation related to anthropogenic climate change? Welcome to “Project Mizaru,” where we seek to answer these questions in the most rigorously informed way possible.

Project Mizaru was born out of the realization that, among the many angles of climate change research, more work is needed on the actual acceptance of disinformation about the climate crisis. Named after the “See No Evil” member of The Three Wise Monkeys, Mizaru examines multiple dimensions of the acceptance of false information – who receives it, how the message is conveyed, who the messenger is, and where the message is encountered (the platform). Mizaru is a “meta-study” that aims to create the definitive compendium of scholarly explanations of why large numbers of people accept climate disinformation despite overwhelming scientific evidence to the contrary. The team is examining scholarship from a diverse range of disciplines, including psychology, where a confirmation bias approach might explain disinformation acceptance; sociology, where scholars examine social pressures or group identity dynamics as drivers of disinformation acceptance; and political science, where questions of partisanship and ideological framing are examined. These explanations, and a meta-level summary of them, will then be applied specifically to climate change mis- and disinformation.

Fall 2023 Progress

Project Mizaru started in the summer of 2023 with the recruitment of our task team lead (TTL), Jacquie Moss, a PhD student at the LBJ School of Public Affairs. After student recruitment at the first all-hands introduction to GDIL, Mizaru was fully formed and ready to work. The process flowchart for the fall semester consisted of the following steps: create keyword search parameters, write structured notes on a subset of the library, develop a coding methodology, “triage” the full library, and begin to code the articles that pass the triage stage. 

Keyword search. To find articles, our primary database has been Google Scholar, with the search phrase “misinformation acceptance” conjoined with “economics,” “healthcare,” “sociology,” and other fields. Other researchers used “Why do people accept disinformation?” and similar phrases. Students were later asked to find additional studies in the references of the articles they reviewed. Our search was limited to articles published in English, with no date limitations. We have one empirical article from 1956 and another from 1995; otherwise, the empirical studies span the mid-2010s to October 2023.
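As a minimal sketch, the field-by-field query construction described above can be expressed in a few lines of Python. Only the base phrase and the three named fields come from this post; the function name and structure are illustrative, not the team's actual tooling.

```python
# Discipline keywords named in the post; "other fields" were also used but
# are not enumerated here.
FIELDS = ["economics", "healthcare", "sociology"]

def build_queries(base_phrase: str = "misinformation acceptance") -> list[str]:
    """Pair the base search phrase with each discipline keyword,
    producing one quoted Google Scholar query per field."""
    return [f'"{base_phrase}" "{field}"' for field in FIELDS]
```

Each resulting string (e.g., `"misinformation acceptance" "economics"`) can be pasted directly into Google Scholar's search box.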

Structured notes. To acquaint undergraduate students with efficiently reviewing academic papers, we trained them using a note-taking template. Students were asked to identify component parts, such as the research question(s), hypotheses, dependent and independent variables, and main findings. We used Zotero to manage all materials, collaborate on highlighting articles, and share notes.
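The template's component parts lend themselves to a simple structured-note record. The field names below mirror the components listed above; the exact template is not reproduced in this post, so the dictionary layout and helper function are hypothetical.

```python
# Illustrative structured-note skeleton mirroring the template components;
# the team's real template lives in Zotero, not code.
NOTE_TEMPLATE = {
    "research_questions": [],
    "hypotheses": [],
    "dependent_variables": [],
    "independent_variables": [],
    "main_findings": [],
}

def new_note(citation_key: str) -> dict:
    """Start a fresh structured note for one article,
    copying the template so notes never share list objects."""
    return {"citation_key": citation_key,
            **{field: list(default) for field, default in NOTE_TEMPLATE.items()}}
```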

Develop a coding framework. Using notes for 40 articles, including empirical, theoretical, and literature review articles, Jacquie inductively developed an initial coding framework that evolved, with iterative recoding, into the current set of concepts (Table 1). 

Triage. We determined that coding would be most efficient if we first assessed each article for fit, relevance, and overall quality. To guide students through this process, we developed a Qualtrics survey. One of the triage steps is to confirm that each article meets the study criteria. To be included, an article must meet all of the following: 1) peer-reviewed; 2) original empirical research (i.e., not a literature review or theoretical framework without an empirical component); and 3) explores the study questions – why people accept falsehoods, discern truth, share false information, or revise their beliefs (or not). Currently, 87% of the articles in the Zotero library have passed, indicating that much of the screening happened during the article selection process.
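The three inclusion criteria amount to a simple conjunction, which can be sketched as a screening function. The actual check runs as a Qualtrics survey, not code, so this is purely illustrative.

```python
def passes_screening(peer_reviewed: bool,
                     original_empirical: bool,
                     addresses_study_questions: bool) -> bool:
    """An article is included only if it meets all three criteria:
    peer-reviewed, original empirical research, and relevance to
    the study questions."""
    return peer_reviewed and original_empirical and addresses_study_questions
```

A literature review, for example, fails the second criterion and is excluded regardless of the other two.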

Also during the triage process, student researchers follow guidelines for highlighting each article. There are additional questions related to the article’s academic discipline(s), the primary conception of falsehoods (i.e., fake news, conspiracy theories, etc.), and the geographic context of the study’s population sample. As of the writing of this blog post, students have completed the “triage” step for 91 of the 100 articles in the “empirical” folder of our shared Zotero library (Table 1).

Table 1. During the triage process, student researchers identify the disciplines of the journal in which each article was published and of the article’s authors, select the primary term or conception for false information, and discern the location of the study’s sampling frame.

Summary Statistics for the Mizaru Empirical Library (as of Jan. 31, 2024)

| Measure | Categories |
| --- | --- |
| Triage status | Triage process complete; Met screening criteria |
| Discipline of Journal & Authors (multi-select) | Communications, Media Studies; Political Science, Government, Politics; Sociology, Social Psychology; Cognition, Neuroscience, Medicine, other hard science; Business Management, Economics; Computer Science, Technology |
| Conception of Falsehoods (single selection) | Fake news; Conspiracy theories or beliefs; Biased information |
| Location of Sample Population or Cases (multi-select) | European country/ies; General “online”; Other (Saudi Arabia, Pakistan, Russia, Mexico, etc.) |

(Per-category article counts omitted.)
To categorize each article’s discipline, we looked at both the journal’s field of study and those of the article’s authors. As such, an article might be assigned to several fields or to only one. For instance, “One day of eating: Tracing misinformation in ‘What I Eat In A Day’ videos” is coded as Sociology, Education, and Political Science/Law: it is published in the Journal of Sociology, with authors representing Education (Topham) and Law (Smith). As shown in Table 1, psychology is represented more than any other discipline. Studies in this field consider the role of personality traits, such as narcissistic tendencies or self-esteem; attitudes and beliefs, such as conspiracy thinking or belief in the supernatural; and cognitive factors, such as deliberative or critical thinking and repeated exposure to a message (i.e., the “illusory truth effect”). The articles drawn from the sociological literature are less numerous but just as interesting; they consider social costs, group norms, and the desire to discredit or berate “the other side.” A rich vein of studies comes from communications and media studies, where investigators focus on platform features (ease of sharing) and media literacy. A theme that cuts across disciplines is interventions to correct falsehoods, such as comparing the effectiveness of fact checks before, during, or after exposure to a falsehood and warning respondents to double-check accuracy by searching for the information on their own.

Code each article. We use the triage step to prioritize the order in which articles are coded based on how students rated relevance to our study questions. Using Google Sheets, we follow a structured coding process. We capture each article’s unique findings as an individual observation (i.e., a row in our spreadsheet). We copy verbatim from the article into our database to preserve the original meaning. For each finding, we select the following from predefined choices:

  • Angle. Is the finding related to falsehoods (i.e., why people believe or share false information) or accuracy (i.e., why people discern falsehoods or respond to corrections)? This is coded as “falsehoods,” “accuracy,” or “other” (with text clarification).
  • Significance. Did the finding pass a test of statistical significance? Statistically significant findings are coded as “yes”; otherwise, “no.” There is also an option for “mixed or nuanced” to capture findings requiring more careful interpretation.
  • Relationship. For statistically significant findings, we note the direction of the association or causal effect as “positive,” “negative,” or “mixed/nuanced.”

Explanation. Lastly, we classify the explanation using the framework that evolved over the past several months. Table 2 provides a snapshot. (Stay tuned for a more complete list of explanations.)

Table 2. A partial list of explanations for why people believe or share false information, discern falsehoods, or respond to corrections, drawn from the concepts we have discerned in the academic literature.

| Explanation category | Explanation type | Terms encountered |
| --- | --- | --- |
| Receiver’s abilities, traits | Critical thinking | Classical reasoning, analytical reasoning, “engage in effortful cognitive activity” |
| | Perceived self-efficacy | Perceived ability, Dunning-Kruger effect, overconfidence |
| | Prior knowledge | Scientific knowledge, knowledge about [topic] |
| | Repeated exposure | Familiarity, “illusory truth effect,” “availability heuristic” |
| | Attitudes, interest | Perceived usefulness of the information, opinions, preexisting beliefs |
| | | Political ideology (conservative identity vs. liberal identity), berate or discredit “the other [political] side” |
| | Personality traits | Narcissism, self-esteem |
| | | College-educated, high school only, academic achievement |
| | Wealth or class | Income, profession |
| Message attributes (WHAT was said, WHO said it, WHERE it was said) | Media type | Text, imagery, video, etc.; manipulated media |
| | | Negative charge, neutral tone |
| | Elite messenger | Media personalities, celebrities |
| | Influencers or super-spreaders | Active on social media, number of followers |
| | Perceived credibility | Scientific experts, reputation of news source, official sources |
| | Platform, features | Website(s), social media, TV, print |
| | Correct information provided after exposure | |
| | Correction provided during exposure | |
| | Correction or warning provided before exposure | |
| | Reminder to evaluate truthfulness, accuracy reminder | |
| Group-level factors | Social pressure | Avoid social costs, earn social credibility, FOMO |
| | Group or cultural norms | Traditions, shared beliefs |

Next steps

With these initial findings in place, Project Mizaru researchers have laid the groundwork for our next phase: applying our methodology to climate-change-specific mis- and disinformation campaigns. In Spring 2024, we will analyze news and other media stories for evidence of the factors outlined above.


Thank you to the Fall 2023 Mizaru team – Ty Gribble, Moo Sun “Sunny” Kim, Isabella Sherwood, Matthew Scheberle, Elliot Mast, David Tran, and Jonathan Bardin.


1. Asch, S. E. (1956). Studies of independence and conformity: I. A minority of one against a unanimous majority. Psychological Monographs: General and Applied, 70(9), 1–70.

2. Hernan, P. (1995). Disinformation and Misinformation through the Internet. Government Information Quarterly, 12(2), 133–139.

3. Topham, J., & Smith, N. (2023). One day of eating: Tracing misinformation in ‘What I Eat In A Day’ videos. Journal of Sociology, 59(3), 682–698. https://doi.org/10.1177/14407833231161369

4. Brashier, N. M., Pennycook, G., Berinsky, A. J., & Rand, D. G. (2021). Timing matters when correcting fake news. Proceedings of the National Academy of Sciences, 118(5), e2020043118. https://doi.org/10.1073/pnas.2020043118

5. Donovan, A. M., & Rapp, D. N. (2020). Look it up: Online search reduces the problematic effects of exposures to inaccuracies. Memory & Cognition, 48(7), 1128–1145. https://doi.org/10.3758/s13421-020-01047-z