Misinformation


Misinformation is incorrect or misleading information.[1] It is differentiated from disinformation, which is


deliberately deceptive.[2][3][4] Rumors are information not attributed to any particular source,[5] and so are
unreliable and often unverified, but can turn out to be either true or false. Even if later retracted,
misinformation can continue to influence actions and memory.[6] People may be more prone to believe
misinformation when they are emotionally connected to what they are hearing or reading. Social media
has made information readily available at any time, connecting vast groups of people, along with their
information, all at once.[7] Advances in technology have changed the way we communicate information
and the way misinformation spreads.[8] Misinformation affects a society's ability to receive accurate
information, which in turn influences its communities, politics, and medicine.[7]

Contents
History
Identification and correction
Cognitive factors
Countering misinformation
Causes
Online misinformation
Countermeasures
The role of social media
Lack of peer review
Censorship accusations
Mass media, trust, and transparency
Competition in news and media
Inaccurate information from media sources
Distrust
Impact
See also
References
Further reading
External links

History
Early examples include the insults and smears spread among political rivals in Imperial and Renaissance
Italy in the form of "pasquinades."[9] These are anonymous and witty verses named for the Pasquino piazza
and "talking statue" in Rome. In pre-revolutionary France, "canards", or printed broadsides, sometimes
included an engraving to convince readers to take them seriously.
According to writers Renée DiResta and Tobias Rose-Stockwell, in 1588, false news of the victory of the
Spanish Armada over the English (which had been expected) spread throughout Europe, and news of the
actual English victory came many days later.[10]

The first recorded large-scale disinformation campaign was the
"Great Moon Hoax," published in 1835 in the New York Sun, in
which a series of articles claimed to describe life on the Moon,
"complete with illustrations of humanoid bat-creatures and bearded
blue unicorns".[11] The challenges of mass-producing news on a
short deadline can lead to factual errors and mistakes. An example
of such is the Chicago Tribune's infamous 1948 headline "Dewey
Defeats Truman".

[Figure: A lithograph from "The Great Moon Hoax", the first large-scale spread of disinformation in America]

Identification and correction
According to Anne Mintz, editor of Web of Deception:
Misinformation on the Internet, one of the best ways to determine
whether the information is factual is to use common sense.[12] Mintz advises that the reader check whether
the information makes sense, and to check whether the founders or reporters who are spreading the
information are biased or have an agenda. Journalists and researchers look at other sites (particularly
verified sources like news channels)[13] for information, as the information is more likely to be reviewed by
multiple people or have been heavily researched, providing more reliable details.

Martin Libicki, author of Conquest In Cyberspace: National Security and Information Warfare,[14] noted
that readers must balance what is correct or incorrect. Readers cannot be gullible, but also should not be
paranoid that all information is incorrect. There is always the chance that even readers who strike this
balance will believe an error to be true, or a truth to be an error.

A person's formal education level and media literacy correlate with their ability to recognize
misinformation.[15][16] This means that people who are more familiar with how information is researched
and presented, or who are better at critically evaluating information from any source, are more likely to
correctly identify misinformation. However, increasing literacy may not improve the ability to detect
misinformation, as a certain level of literacy can instead be used to "justify belief in misinformation."[17]
Further research reveals that content descriptors can have varying effects on people's ability to detect
misinformation.[18]

Based on the work of Scheufele and Krause, misinformation operates at different social layers: the
individual, group, and sociostructural levels. At the individual root level, efforts have focused on citizens'
ability to recognize disinformation or misinformation and to correct their views accordingly. Proposed
solutions range from altering the algorithms that surface fake news to fact-checking the sites that spread it.
One concern is that framing the problem as an "inability to recognize misinformation" assumes that all
citizens are misinformed and unable to discern and logically evaluate information that emerges from social
media. The larger threat is the evaluation skill that individuals lack: the ability to identify sources that are
biased, dated, or exploitative. Pew Research reports that approximately one in four American adults admit
to sharing misinformation on their social media platforms. The quality of media literacy also contributes to
misinformation at the individual level; hence the call to improve media literacy and educate individual
citizens about fake news. Other individual-level factors include the motivations and emotions that drive
motivated reasoning processes.[19]
The second root is at the group level. People's social networks have changed as the social media
environment has evolved, allowing a different web of social networks to persist in which individuals
"selectively disclose" information, often in a biased form. As in the telephone game played with a large
group of people, the beliefs that are most widespread become the most repeated. Debunking
misinformation can backfire because people rely on the familiar information to which they have just been
exposed. Homogeneous social groups also nurture a misinformation mindset, allowing falsehoods to be
accepted as an apparent social norm because contradictory information is scarce. These social networks
create a "clustering" effect that can produce "specific rumor variations". Such rumor variations lead
beliefs to be perceived as more popular than they actually are, causing rumor cascades across these
networks.[19]

The third level is the societal level, which is influenced by both the individual and group levels. Common
figures associated with misinformation include politicians and other political actors who attempt to shape
public opinion in their favor. The role of the mass media is to be a corrective agent against misinformation,
but American media has often lacked objectivity, which has contributed to the problem. As print media
evolved into radio, television, and now the internet, paid commercial actors began generating tailored
content to attract viewers. The ways of reaching target audiences have shifted dramatically: Facebook, for
example, uses data collection and "profiling" tools that track each user's preferences and serve
hypertargeted ads. These ads also compete for younger audiences' attention on social media, which limits
the number of news sources they view daily. Axios cofounder Jim VandeHei captured the state of the
industry when he said that "Survival...depends on giving readers what they really want, how they want it,
when they want it, and on not spending too much money producing what they don't want." This,
unfortunately, is the climate of news quality today. These changed news realities are attributed to "social
mega trends" that have contributed heavily to the misinformation problem in the United States, along with
declining social capital, political polarization, widening economic inequality, declining trust in science,
and the parties' own susceptibility to misinformation.[19]

Cognitive factors

Prior research suggests it can be difficult to undo the effects of misinformation once individuals believe it to
be true, and that fact-checking can backfire.[20] Individuals may desire to reach a certain conclusion,
causing them to accept information that supports that conclusion. Individuals are more likely to hang onto
information and share information if it emotionally resonates with them.[21]

Individuals create mental models and schemas to understand their physical and social environments.[22]
Misinformation that becomes incorporated into a mental model, especially for long periods of time, will be
more difficult to address as individuals prefer to have a complete mental model.[23] In this instance, it is
necessary to correct the misinformation by both refuting it and providing accurate information that can
function in the mental model.[20] When attempting to correct misinformation, it is important to consider
previous research which has identified effective and ineffective strategies. Simply providing the corrected
information is insufficient to correct the effects of misinformation, and it may even have a negative effect.
Due to the familiarity heuristic, information that is familiar is more likely to be believed to be true—
corrective messages which contain a repetition of the original misinformation may result in an increase in
familiarity and cause a backfire effect.[24]
Factors that contribute to the effectiveness of a corrective message include an individual's mental model or
worldview, repeated exposure to the misinformation, time between misinformation and correction,
credibility of the sources, and relative coherency of the misinformation and corrective message. Corrective
messages will be more effective when they are coherent and/or consistent with the audience's worldview.
They will be less effective when misinformation is believed to come from a credible source, is repeated
prior to correction (even if the repetition occurs in the process of debunking), and/or when there is a time
lag between the misinformation exposure and corrective message. Additionally, corrective messages
delivered by the original source of the misinformation tend to be more effective.[25]

Countering misinformation

One suggested solution for prevention of misinformation is a distributed consensus mechanism to validate
the accuracy of claims, with appropriate flagging or removal of content that is determined to be false or
misleading.[23] Another approach is to "inoculate" against it by delivering weakened misinformation that
warns of the dangers of the misinformation. This includes counterarguments and showing the techniques
used to mislead. One way to apply this is to use parallel argumentation, in which the flawed logic is
transferred to a parallel situation (e.g., shared extremity or absurdity). This approach exposes bad logic
without the need for complicated explanations.[26]

Flagging or eliminating false statements in media using algorithmic fact checkers is becoming an
increasingly common tactic to fight misinformation. Computer programs that automatically detect
misinformation are just emerging, but similar algorithms are already in place on Facebook and Google.
Google provides supplemental information pointing to fact-checking websites in response to its users
searching controversial search terms. Likewise, algorithms detect and alert Facebook users that what they
are about to share is likely false.[27]

A commonly raised related issue is over-censorship on platforms like Facebook and Twitter.[28]
Many free speech activists argue that their voices are not being heard and their rights are being taken away.[29]
To combat the spread of misinformation, social media platforms are often tasked with finding common
ground between allowing free speech, while also not allowing misinformation to be spread throughout their
respective platforms.[28]

Websites have been created to help people to discern fact from fiction. For example, the site FactCheck.org
aims to fact check the media, especially viral political stories. The site also includes a forum where people
can openly ask questions about the information.[30] Similar sites allow individuals to copy and paste
misinformation into a search engine and the site will investigate it.[31] Facebook and Google added
automatic fact-checking programs to their sites, and created the option for users to flag information that they
think is false.[31] A way that fact-checking programs find misinformation involves analyzing the language
and syntax of news stories. Another way is fact-checkers can search for existing information on the subject
and compare it to the news broadcasts being put online.[32] Other sites such as Wikipedia and Snopes are
also widely used resources for verifying information.
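The language-analysis approach mentioned above can be illustrated with a toy bag-of-words classifier. This is a minimal sketch under stated assumptions, not how production fact-checking systems work: the training phrases and labels here are invented for illustration, and real systems use far richer features and models.

```python
import math
from collections import Counter

def train(docs):
    """Count word frequencies per label for a toy Naive Bayes model."""
    counts = {"fake": Counter(), "real": Counter()}
    labels = Counter()
    for text, label in docs:
        labels[label] += 1
        counts[label].update(text.lower().split())
    return counts, labels

def classify(text, counts, labels):
    """Return the label with the higher log-posterior (Laplace-smoothed)."""
    vocab = set(counts["fake"]) | set(counts["real"])
    best, best_score = None, -math.inf
    for label in labels:
        total = sum(counts[label].values())
        score = math.log(labels[label] / sum(labels.values()))
        for word in text.lower().split():
            # Smoothing keeps unseen words from zeroing the probability.
            score += math.log((counts[label][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Invented training examples, purely for illustration.
docs = [
    ("shocking miracle cure doctors hate", "fake"),
    ("you won't believe this secret trick", "fake"),
    ("city council approves budget for road repairs", "real"),
    ("researchers publish peer reviewed study on vaccines", "real"),
]
counts, labels = train(docs)
print(classify("miracle secret cure", counts, labels))  # → fake
```

The point of the sketch is only that word choice carries a detectable signal; systems that compare claims against existing reporting, as described above, layer retrieval and human review on top of this kind of signal.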

Causes
Historically, people have relied on journalists and other information professionals to relay facts and truths
about certain topics.[33] Many different things cause miscommunication, but the underlying factor is
information literacy. Because information is distributed by various means, it is often hard for users to ask
questions of credibility. Many online sources of misinformation use techniques to fool users into thinking
their sites are legitimate and the information they generate is factual. Often misinformation can be politically
motivated.[34] For example, websites such as USConservativeToday.com have posted false information for
political and monetary gain.[35] Another role misinformation serves is to distract the public eye from
negative information about a given person and/or issues of policy.[27] Aside from political and financial
gain, misinformation can also be spread unintentionally.

Misinformation cited with hyperlinks has been found to increase readers' trust. Trust is shown to be even
higher when these hyperlinks are to scientific journals, and higher still when readers do not click on the
sources to investigate for themselves.[36] Trusting a source could lead to spreading misinformation
unintentionally.

Misinformation is sometimes an unintended side effect of bias. Misguided opinions can lead to the
unintentional spread of misinformation, where individuals do not intend on spreading false propaganda, yet
the false information they share is not checked and referenced.[37] While that may be the case, there are
plenty of instances where information is intentionally skewed, or leaves out major defining details and facts.
Misinformation could be misleading rather than outright false.

Research documents "the role political elites play in shaping both news coverage and public opinion
around science issues".[38]

Another reason for the recent spread of misinformation may be the lack of consequences. With little to no
repercussions, there is nothing to stop people from posting misleading information. The gain they get from
the power of influencing other people's minds is greater than the impact of a removed post or temporary
ban on Twitter. This forces individual companies to be the ones to mandate rules and policies regarding
when people's "free speech" impedes other users' quality of life.[39]

Online misinformation
Digital and social media can contribute to the spread of
misinformation – for instance, when users share information
without first checking the legitimacy of the information they have
found. People are more likely to encounter online information
based on personalized algorithms. Google, Facebook and Yahoo
News all generate newsfeeds based on the information they know
about our devices, our location, and our online interests. Although
two people can search for the same thing at the same time, they are
very likely to get different results based on what that platform
deems relevant to their interests, whether fact or false.[40]

[Figure: The differences between disinformation, misinformation, and malinformation]

An emerging trend in the online information environment is "a shift away from public discourse to private,
more ephemeral, messaging", which poses a challenge to countering misinformation.[41]

Countermeasures

A report by the Royal Society lists potential or proposed countermeasures:[41]

Automated detection systems (e.g. to flag or add context and resources to content)
Emerging anti-misinformation sector (e.g. organizations combating scientific misinformation)
Provenance enhancing technology (i.e. better enabling people to determine the veracity of a
claim, image, or video)
APIs for research (i.e. for usage to detect, understand, and counter misinformation)
Active bystanders
Community moderation (usually of unpaid and untrained, often independent, volunteers)
Anti-virals (e.g. limiting the number of times a message can be forwarded in privacy-
respecting encrypted chats)
Collective intelligence (examples being Wikipedia where multiple editors refine
encyclopedic articles, and question-and-answer sites where outputs are also evaluated by
others similar to peer-review)
Trustworthy institutions and data
Media literacy (increasing citizens' ability to use ICTs to find, evaluate, create, and
communicate information, an essential skill for citizens of all ages)
Media literacy has been taught in Estonian public schools – from kindergarten through high
school – since 2010 and is "accepted 'as important as maths or writing or reading'"[42]
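The "anti-virals" item in the list above (capping how many times a message can be forwarded, as some encrypted messengers do) can be sketched in a few lines. This is a generic illustration, not any platform's actual implementation; the `MAX_FORWARDS` value and the `Message` shape are assumptions.

```python
MAX_FORWARDS = 5  # assumed cap, in the spirit of messenger-style forward limits

class Message:
    """A message carrying a counter of how many times it has been forwarded."""
    def __init__(self, text, forwards=0):
        self.text = text
        self.forwards = forwards

def forward(msg):
    """Return a forwarded copy, or None once the cap is reached."""
    if msg.forwards >= MAX_FORWARDS:
        return None  # the platform refuses further forwarding
    return Message(msg.text, msg.forwards + 1)

# Forward a message until the cap stops it.
m = Message("breaking news!")
hops = 0
while (m := forward(m)) is not None:
    hops += 1
print(hops)  # → 5
```

Because the counter travels with the message rather than the user, the cap limits how deep a single chain of forwards can go, slowing viral spread without inspecting message content.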

Broadly described, the report recommends building resilience to scientific misinformation and a healthy
online information environment, rather than having offending content removed. It cautions that censorship
could, for example, drive misinformation and associated communities "to harder-to-address corners of the
internet".[43]

The role of social media

In the Information Age, social networking sites have become a notable agent for the spread of
misinformation, fake news, and propaganda.[44][16][45][46][47] Misinformation on social media spreads
quickly in comparison to traditional media because of the lack of regulation and examination required
before posting.[48][49] These sites provide users with the capability to spread information quickly to other
users without requiring the permission of a gatekeeper such as an editor who might otherwise require
confirmation of the truth before allowing publication. Journalists today are criticized for helping to spread
false information on these social platforms, but research shows they also play a role in curbing it through
debunking and denying false rumors.[50][51]

Social media platforms allow for easy spread of misinformation.[52] The specific reasons why
misinformation spreads through social media so easily remain unknown.[48] A 2018 study of Twitter
determined that, compared to accurate information, false information spread significantly faster, further,
deeper, and more broadly.[53] Similarly, a research study of Facebook found that misinformation was more
likely to be clicked on than factual information.[54] Combating its spread is difficult for two reasons: the
profusion of information sources, and the generation of "echo chambers". The profusion of information
sources makes the reader's task of weighing the reliability of information more challenging, heightened by
the untrustworthy social signals that go with such information.[55] Echo chambers and filter bubbles come
from the inclination of people to follow or support like-minded individuals. With no differing information to
counter the untruths or the general agreement within isolated social clusters, some argue the outcome is an
absence of a collective reality.[56] Although social media sites have changed their algorithms to prevent the
spread of fake news, the problem still exists.[57] Furthermore, research has shown that while people may
know what the scientific community has proved as a fact, they may still refuse to accept it as such.[58]

Scholars such as Ghosh and Scott have argued that misinformation on social media is "becoming
unstoppable."[59] It has also been observed that misinformation and disinformation return to social media
sites multiple times. One research study tracked thirteen rumors appearing on Twitter and found that
eleven of those same stories resurfaced multiple times after much time had passed.[60]
The social media app Parler has also caused much chaos. Right-wing users who were banned from
Twitter moved to Parler after the Capitol Hill riots, and the app was used to plan and facilitate further
illegal and dangerous activities. Google and Apple later pulled the app off of their app stores. The app
enabled a great deal of misinformation and bias in the media, allowing for more political mishaps.[61]

Another reason that misinformation spreads on social media is the users themselves. One study showed
that the most common reason Facebook users shared misinformation was socially motivated, rather than
because they took the information seriously.[62] Although users may not be spreading false information
for malicious reasons, the misinformation is still being spread. A research study shows that
misinformation introduced through a social format influences individuals drastically more than
misinformation delivered non-socially.[63] Facebook's handling of misinformation has become a hot topic
with the spread of COVID-19, as some reports indicated Facebook recommended pages containing health
misinformation.[28] For example, when a user likes an anti-vax Facebook page, more and more anti-vax
pages are automatically recommended to the user.[28] Additionally, some point to Facebook's inconsistent
censorship of misinformation as leading to deaths from COVID-19.[28] Larry Cook, the creator of the
"Stop Mandatory Vaccination" organization, made money posting anti-vax false news on social media. He
posted more than 150 posts aimed at women, which received over 1.6 million views, and he earned money
on every click and share.[64]

Twitter is one of the most concentrated platforms for engagement with political fake news. 80% of fake
news sources are shared by 0.1% of users, who are "super-sharers". Older, more conservative social users
are also more likely to interact with fake news.[62] On Facebook, adults older than 65 were seven times
more likely to share fake news than adults ages 18–29.[53] Another source of misinformation on Twitter are
bot accounts, especially surrounding climate change.[65] Misinformation spread by bots has been difficult
for social media platforms to address.[66] Facebook estimated the existence of up to 60 million troll bots
actively spreading misinformation on their platform,[67] and has taken measures to stop the spread of
misinformation, resulting in a decrease, though misinformation continues to exist on the platform.[68]

Spontaneous spread of misinformation on social media usually occurs when users share posts from friends
or mutually followed pages, often from someone the sharer believes they can trust. Other misinformation
is created and spread with malicious intent, sometimes to cause anxiety and other times to deceive
audiences.[69] There are also times when rumors are created with malicious intent but shared by
unknowing users.

With the large audiences that can be reached and the experts on various subjects on social media, some
believe social media could also be the key to correcting misinformation.[70]

Agent-based models and other computational models have been used by researchers to explain how false
beliefs spread through networks. Epistemic network analysis is one example of a computational method
for evaluating connections in data shared in a social media network or similar network.[71] In The
Misinformation Age: How False Beliefs Spread, a trade book by philosopher Cailin O'Connor and
physicist James Owen Weatherall, the authors used a combination of case studies and agent-based models
to show how false beliefs spread on social media and scientific networks.[72][73] This book analyses the
social nature of scientific research; the nature of information flow between scientists, propagandists, and
politicians; and the spread of false beliefs among the general population.[72]
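As a flavor of what such agent-based models look like, here is a minimal contagion-style sketch: agents on a random contact network adopt a false belief from believing neighbors with some probability each round. This is a generic illustration under assumed parameters, not the specific models used by O'Connor and Weatherall.

```python
import random

def simulate_spread(n_agents=200, n_neighbors=4, p_adopt=0.3,
                    n_seeds=5, n_rounds=20, seed=42):
    """Simulate spread of a false belief over a random contact network.

    Each round, every believer tries to convince each neighbor, who
    adopts the belief with probability p_adopt. Returns the number of
    believers after each round.
    """
    rng = random.Random(seed)
    # Build a random undirected contact network.
    neighbors = {i: set() for i in range(n_agents)}
    for i in range(n_agents):
        while len(neighbors[i]) < n_neighbors:
            j = rng.randrange(n_agents)
            if j != i:
                neighbors[i].add(j)
                neighbors[j].add(i)
    believers = set(rng.sample(range(n_agents), n_seeds))
    history = [len(believers)]
    for _ in range(n_rounds):
        newly = set()
        for agent in believers:
            for other in neighbors[agent]:
                if other not in believers and rng.random() < p_adopt:
                    newly.add(other)
        believers |= newly
        history.append(len(believers))
    return history

history = simulate_spread()
print(history[0], history[-1])
```

Even this crude model reproduces a qualitative point from the text: once seeded, a belief spreads through local ties without any agent checking its truth, and denser or more clustered networks spread it faster.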

Lack of peer review


Due to the decentralized nature and structure of the Internet,
content creators can easily publish content without being required
to undergo peer review, prove their qualifications, or provide
backup documentation. While library books have generally been
reviewed and edited by an editor, publishing company, etc., Internet
sources cannot be assumed to be vetted by anyone other than their
authors. Misinformation may be produced, reproduced, and posted
immediately on most online platforms.[74]

[Figure: Promoting more peer review to benefit the accuracy of information]

Censorship accusations

Social media sites such as Facebook and Twitter have found themselves defending accusations of
censorship for removing posts they have deemed to be misinformation. Social media censorship policies
relying on government agency-issued guidance to determine information validity have garnered criticism
that such policies have the unintended effect of stifling dissent and criticism of government positions and
policies.[75] Most recently, social media companies have faced criticism over allegedly prematurely
censoring discussion of the SARS-CoV-2 lab leak hypothesis.[75][76]

Other accusations of censorship appear to stem from attempts to prevent social media consumers from self-
harm through the use of unproven COVID-19 treatments. For example, in July 2020, a video went viral
showing Dr. Stella Immanuel claiming hydroxychloroquine was an effective cure for COVID-19. In the
video, Immanuel suggested that there was no need for masks, school closures, or any kind of economic
shutdown, attesting that her alleged cure was highly effective in treating those infected with the virus. The
video was shared 600,000 times and received nearly 20 million views on Facebook before it was taken
down for violating community guidelines on spreading misinformation.[77] The video was also taken down
on Twitter overnight, but not before former president Donald Trump shared it to his page, which was
followed by over 85 million Twitter users. NIAID director Dr. Anthony Fauci and members of the World
Health Organization (WHO) quickly discredited the video, citing larger-scale studies of
hydroxychloroquine showing it is not an effective treatment of COVID-19, and the FDA cautioned against
using it to treat COVID-19 patients following evidence of serious heart problems arising in patients who
have taken the drug.[78]

Another prominent example of misinformation removal criticized by some as an example of censorship was
the New York Post's report on the Hunter Biden laptops, which was used to promote the Biden–Ukraine
conspiracy theory. Social media companies quickly removed this report, and the Post's Twitter account was
temporarily suspended. Over 50 intelligence officials found the disclosure of emails allegedly belonging to
Joe Biden’s son had all the "classic earmarks of a Russian information operation".[79] Later evidence
emerged that at least some of the laptop's contents were authentic.[80]

Mass media, trust, and transparency

Competition in news and media

Because news organizations and websites compete for viewers, there is a need for efficiency in releasing
stories to the public. The news media landscape in the 1970s offered American consumers access to a
limited, but often consistent selection of news offerings, whereas today consumers are confronted with an
abundance of voices online. This growth of consumer choice when it comes to news media allows the
consumer to choose a news source that may align with their biases, which consequently increases the
likelihood that they are misinformed.[27] 47% of Americans reported social media as their main news
source in 2017 as opposed to traditional news sources.[81] News media companies often broadcast stories
24 hours a day, and break the latest news in hopes of taking audience share from their competitors. News
can also be produced at a pace that does not always allow for fact-checking, or for all of the facts to be
collected or released to the media at one time, letting readers or viewers insert their own opinions, and
possibly leading to the spread of misinformation.[82]

Inaccurate information from media sources

A Gallup poll made public in 2016 found that only 32% of Americans trust the mass media "to report the
news fully, accurately and fairly", the lowest number in the history of that poll.[83] An example of bad
information from media sources that led to the spread of misinformation occurred in November 2005, when
Chris Hansen on Dateline NBC claimed that law enforcement officials estimate 50,000 predators are online
at any moment. Afterward, the U.S. attorney general at the time, Alberto Gonzales, repeated the claim.
However, the number that Hansen used in his reporting had no backing. Hansen said he received the
information from Dateline expert Ken Lanning, but Lanning admitted that he made up the number 50,000
because there was no solid data on the number. According to Lanning, he used 50,000 because it sounds
like a real number, not too big and not too small, and referred to it as a "Goldilocks number". Reporter Carl
Bialik says that the number 50,000 is used often in the media to estimate numbers when reporters are
unsure of the exact data.[84]

The novelty hypothesis was developed by Soroush Vosoughi, Deb Roy, and Sinan Aral when they set out
to learn what attracts people to false news. In their study, they compared false tweets on Twitter against
the total content tweeted, looking specifically at the users and both the false and true information they
shared. They found that people are connected through emotion: false rumors elicited more surprise and
disgust, which hooked people, while true rumors attracted more sadness, joy, and trust. The study showed
which emotions are more likely to drive the spread of false news.[64]

Distrust

Misinformation has often been associated with the concept of fake news, which some scholars define as
"fabricated information that mimics news media content in form but not in organizational process or
intent."[16] Intentional misinformation, called disinformation, has become normalized in politics and topics
of great importance to the public, such as climate change and the COVID-19 pandemic. Intentional
misinformation has caused irreversible damage to public understanding and trust.[85] Egelhofer et al.
(2021) argued that the media's wide adoption of the term "fake news" has served to normalize this concept
and to stabilize the use of this buzzword in everyday language.[86] Goldstein (2021) discussed the need
for government agencies and organizations to increase transparency of their practices or services by using
social media. Companies can then utilize the platforms offered by social media and bring forth full
transparency to the public. If used in strategic ways, social media can offer an agency or agenda (ex:
political campaigns or vaccines) a way to connect with the public and offer a place for people to track news
and developments.

Although many popular examples come from the US, misinformation is prevalent worldwide. In the United
Kingdom, many people followed and believed a conspiracy theory linking the coronavirus to the 5G
network, an idea that arose from a series of hashtags on Twitter.[87]
Misinformation can also be used to deflect accountability. For example, Syria's repeated use of chemical
weapons was the subject of a disinformation campaign intended to prevent accountability.[87] In his paper
Defending Weapons Inspections from the Effects of Disinformation, Stewart (2021) shows
how disinformation was used to conceal and purposely misinform the public about Syria's violations of
international law. The intention was to create plausible deniability of the violations, so that discussion of
possible violations would be dismissed as untruthful rumors. Because these disinformation campaigns have been so
effective and so normalized, the opposing side has also begun relying on disinformation, pushing a
counter-narrative to deflect repercussions for unfavorable behavior.

According to Melanie Freeze (Freeze et al., 2020), in most cases the damage done by misinformation can be
irreparable.[87] Freeze explored whether people can recollect an event accurately when presented with
misinformation after the event occurred. Her findings showed that an individual's recollection of political events
could be altered by misinformation about the event. The study also found that even when people can
identify warning signs of misinformation, they still have a hard time distinguishing which pieces of
information are accurate and which are not. Furthermore, people may discard entirely credible
information if they incorrectly deem its source to be "fake news" or untrustworthy.
Alyt Damstra (Damstra et al., 2021) states that
misinformation has existed since the establishment of the press, leaving little room to wonder how it
has become normalized today.[88]

Alexander Lanoszka (2019)[89] argued that fake news need not be regarded as an unwinnable war.
Misinformation can create a sense of societal chaos and anarchy; with deep mistrust, no single idea can
successfully move forward. In the face of malicious efforts to misinform, desired progress may rely
on trust in people and their processes.

Misinformation was a major talking point during the 2016 American presidential election, with claims that
social media sites allowed "fake news" to spread.[29] Exposure to misinformation has been found to be
associated with an overall rise in political trust among those siding with the government in power or those
who self-identify as politically moderate.[90] Social media became polarized and political during the 2020
United States presidential election, with some arguing that misinformation about COVID-19 had been
circulating, creating skepticism of topics such as vaccines and figures such as Dr. Fauci. Others argued that
platforms such as Facebook had been unconstitutionally censoring conservative voices and spreading
misinformation to persuade voters.[29]

Polarization on social media platforms has caused people to question the source of their information.
Skepticism of news platforms has created widespread distrust of the news media. Misinformation is often
blended with truthful content to make it seem credible,[39] and it does not simply imply false information: social
media platforms can be an easy place to skew and manipulate facts to show a different view of a topic, often
in order to paint a negative picture of events.[91][37]

Impact
Misinformation can affect all aspects of life. Allcott, Gentzkow, and Yu concur that the diffusion of
misinformation through social media is a potential threat to democracy and broader society. Misinformation
can degrade the accuracy of information as well as the details of events.[92] When
eavesdropping on conversations, one can gather facts that may not always be true, or the listener may hear
the message incorrectly and spread the information to others. On the Internet, one can read content that is
presented as factual but that may be unchecked or erroneous. In the news, companies may
emphasize the speed at which they receive and relay information at the expense of factual accuracy.
These developments contribute to the way misinformation continues to complicate the public's
understanding of issues and to serve as a source for belief and attitude formation.[93]
With regard to politics, some view being a misinformed citizen as worse than being an uninformed citizen.
Misinformed citizens can state their beliefs and opinions with confidence and thus affect elections and
policies. This type of misinformation occurs when a speaker appears "authoritative and legitimate" while
spreading misinformation.[44] When information is presented as vague, ambiguous, sarcastic, or partial,
receivers are forced to piece the information together and make assumptions about what is correct.[94]
Misinformation has the power to sway public elections and referendums if it gains enough momentum.
Leading up to the 2016 United Kingdom European Union membership referendum, for example, a figure
widely circulated by the Vote Leave campaign claimed that the UK would save £350 million a week by leaving
the EU and that the money would be redistributed to the British National Health Service. The UK Statistics
Authority later deemed this a "clear misuse of official statistics". The advert infamously displayed
on the side of London's double-decker buses did not account for the UK's budget rebate, and the
idea that 100% of the money saved would go to the NHS was unrealistic. A poll published in 2016 by
Ipsos MORI found that nearly half of the British public believed this misinformation to be true.[95] Even
when information is proven to be misinformation, it may continue to shape attitudes towards a given
topic,[83] meaning it has the power to swing political decisions if it gains enough traction. A study by
Soroush Vosoughi, Deb Roy and Sinan Aral examined Twitter data covering 126,000 stories
shared by 3 million people more than 4.5 million times. They found that political news traveled faster than any
other type of information, and that false political news reached more than 20,000 people three
times faster than all other types of false news.[64]

Aside from political propaganda, misinformation can also be employed in industrial propaganda. Using
tools such as advertising, a company can undermine reliable evidence or influence belief through a
concerted misinformation campaign. For instance, tobacco companies employed misinformation in the
second half of the twentieth century to diminish the reliability of studies that demonstrated the link between
smoking and lung cancer.[96]

In the medical field, misinformation can immediately endanger lives, as seen in the public's negative
perception of vaccines or the use of herbs instead of medicines to treat diseases.[44][97] During the
COVID-19 pandemic, the spread of misinformation has proven to cause confusion as well as negative
emotions such as anxiety and fear.[98][99] Misinformation about proper safety measures for preventing the
virus that contradicts information from legitimate institutions such as the World Health Organization can also
lead to inadequate protection and may place individuals at risk of exposure.[98][100]

Some scholars and activists are heading movements to eliminate mis- and disinformation and information
pollution in the digital world. One such framework, "information environmentalism," has become a curriculum in
some universities and colleges.[101][102]

See also
List of common misconceptions
List of fact-checking websites
List of fake news websites
List of satirical news websites
Alarmism
Big lie
Character assassination
Defamation (also known as "slander")
Counter Misinformation Team
Euromyth
Factoid
Fallacy
List of fallacies
Flat earth
Gossip
Junk science
Persuasion
Pseudoscience
Quotation
Rumor
Sensationalism
Social engineering (in political science and cybercrime)
Truth sandwich

References
1. Merriam-Webster Dictionary (19 August 2020). "Misinformation" (https://www.merriam-webst
er.com/dictionary/misinformation). Retrieved 19 August 2020.
2. Merriam-Webster Dictionary (19 August 2020). "Disinformation" (https://www.merriam-webst
er.com/dictionary/disinformation). Merriam-Webster. Retrieved 19 August 2020.
3. Woolley, Samuel C.; Howard, Philip N. (2016). "Political Communication, Computational
Propaganda, and Autonomous Agents" (https://ijoc.org/index.php/ijoc/article/view/6298).
International Journal of Communication. 10: 4882–4890. Archived (https://web.archive.org/w
eb/20191022194727/https://ijoc.org/index.php/ijoc/article/view/6298) from the original on
2019-10-22. Retrieved 2019-10-22.
4. Caramancion, Kevin Matthe (March 2020). "An Exploration of Disinformation as a
Cybersecurity Threat" (https://dx.doi.org/10.1109/icict50521.2020.00076). 2020 3rd
International Conference on Information and Computer Technologies (ICICT). IEEE: 440–
444. doi:10.1109/icict50521.2020.00076
(https://doi.org/10.1109%2Ficict50521.2020.00076). ISBN 978-1-7281-7283-5.
S2CID 218651389 (https://api.semanticscholar.org/CorpusID:218651389).
5. Merriam-Webster Dictionary - rumor (https://www.merriam-webster.com/dictionary/rumor)
6. Ecker, Ullrich K.H.; Lewandowsky, Stephan; Cheung, Candy S.C.; Maybery, Murray T.
(November 2015). "He did it! She did it! No, she did not! Multiple causal explanations and
the continued influence of misinformation" (https://api.research-repository.uwa.edu.au/files/1
1787265/Ecker_et_al_He_did_it_She_did_it_2015_.pdf) (PDF). Journal of Memory and
Language. 85: 101–115. doi:10.1016/j.jml.2015.09.002 (https://doi.org/10.1016%2Fj.jml.201
5.09.002).
7. Aral, Sinan (2020). The hype machine : how social media disrupts our elections, our
economy, and our health--and how we must adapt (https://www.worldcat.org/oclc/115548605
6) (First ed.). New York. ISBN 978-0-525-57451-4. OCLC 1155486056 (https://www.worldca
t.org/oclc/1155486056).
8. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook,
John (2012). "Misinformation and Its Correction: Continued Influence and Successful
Debiasing" (https://www.jstor.org/stable/23484653). Psychological Science in the Public
Interest. 13 (3): 106–131. doi:10.1177/1529100612451018 (https://doi.org/10.1177%2F1529
100612451018). ISSN 1529-1006 (https://www.worldcat.org/issn/1529-1006).
JSTOR 23484653 (https://www.jstor.org/stable/23484653). PMID 26173286 (https://pubmed.
ncbi.nlm.nih.gov/26173286). S2CID 42633
(https://api.semanticscholar.org/CorpusID:42633).
9. "The True History of Fake News" (https://www.nybooks.com/daily/2017/02/13/the-true-histor
y-of-fake-news/). The New York Review of Books. 2017-02-13. Archived (https://web.archiv
e.org/web/20190205222751/https://www.nybooks.com/daily/2017/02/13/the-true-history-of-fa
ke-news/) from the original on 2019-02-05. Retrieved 2019-02-24.
10. Renee DiResta; Tobias Rose-Stockwell. "How to Stop Misinformation Before It Gets
Shared" (https://www.wired.com/preview/story/6058ae3f217bbd8932a2227b). Wired.
11. "A short guide to the history of 'fake news' and disinformation" (https://www.icfj.org/news/sho
rt-guide-history-fake-news-and-disinformation-new-icfj-learning-module). International
Center for Journalists. Archived (https://web.archive.org/web/20190225103100/https://www.i
cfj.org/news/short-guide-history-fake-news-and-disinformation-new-icfj-learning-module)
from the original on 2019-02-25. Retrieved 2019-02-24.
12. Mintz, Anne. "The Misinformation Superhighway?" (https://www.pbs.org/now/shows/401/inte
rnet-politics.html). PBS. Archived (https://web.archive.org/web/20130402040621/http://www.
pbs.org/now/shows/401/internet-politics.html) from the original on 2 April 2013. Retrieved
26 February 2013.
13. Jain, Suchita; Sharma, Vanya; Kaushal, Rishabh (September 2016). "Towards automated
real-time detection of misinformation on Twitter". 2016 International Conference on
Advances in Computing, Communications and Informatics (ICACCI). IEEE Conference
Publication. pp. 2015–2020. doi:10.1109/ICACCI.2016.7732347 (https://doi.org/10.1109%2
FICACCI.2016.7732347). ISBN 978-1-5090-2029-4. S2CID 17767475 (https://api.semantics
cholar.org/CorpusID:17767475).
14. Libicki, Martin (2007). Conquest in Cyberspace: National Security and Information Warfare
(https://archive.org/details/conquestcyberspa00libi_962). New York: Cambridge University
Press. pp. 51 (https://archive.org/details/conquestcyberspa00libi_962/page/n63)–55.
ISBN 978-0521871600.
15. Khan, M. Laeeq; Idris, Ika Karlina (2019-02-11). "Recognise misinformation and verify before
sharing: a reasoned action and information literacy perspective". Behaviour & Information
Technology. 38 (12): 1194–1212. doi:10.1080/0144929x.2019.1578828 (https://doi.org/10.1
080%2F0144929x.2019.1578828). ISSN 0144-929X (https://www.worldcat.org/issn/0144-92
9X). S2CID 86681742 (https://api.semanticscholar.org/CorpusID:86681742).
16. Lazer, David M. J.; Baum, Matthew A.; Benkler, Yochai; Berinsky, Adam J.; Greenhill, Kelly
M.; Menczer, Filippo; Metzger, Miriam J.; Nyhan, Brendan; Pennycook, Gordon; Rothschild,
David; Schudson, Michael; Sloman, Steven A.; Sunstein, Cass R.; Thorson, Emily A.; Watts,
Duncan J.; Zittrain, Jonathan L. (2018). "The science of fake news". Science. 359 (6380):
1094–1096. Bibcode:2018Sci...359.1094L (https://ui.adsabs.harvard.edu/abs/2018Sci...359.
1094L). doi:10.1126/science.aao2998 (https://doi.org/10.1126%2Fscience.aao2998).
PMID 29590025 (https://pubmed.ncbi.nlm.nih.gov/29590025). S2CID 4410672 (https://api.se
manticscholar.org/CorpusID:4410672).
17. Vraga, Emily K.; Bode, Leticia (December 2017). "Leveraging Institutions, Educators, and
Networks to Correct Misinformation: A Commentary on Lewandosky, Ecker, and Cook" (http
s://dx.doi.org/10.1016/j.jarmac.2017.09.008). Journal of Applied Research in Memory and
Cognition. 6 (4): 382–388. doi:10.1016/j.jarmac.2017.09.008 (https://doi.org/10.1016%2Fj.jar
mac.2017.09.008). ISSN 2211-3681 (https://www.worldcat.org/issn/2211-3681).
18. Caramancion, Kevin Matthe (September 2020). "Understanding the Impact of Contextual
Clues in Misinformation Detection" (https://ieeexplore.ieee.org/document/9216394). 2020
IEEE International IOT, Electronics and Mechatronics Conference (IEMTRONICS): 1–6.
doi:10.1109/IEMTRONICS51293.2020.9216394 (https://doi.org/10.1109%2FIEMTRONICS5
1293.2020.9216394). ISBN 978-1-7281-9615-2. S2CID 222297695 (https://api.semanticsch
olar.org/CorpusID:222297695).
19. Scheufele, Dietram; Krause, Nicole (April 16, 2019). "Science audiences, misinformation,
and fake news" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6475373). Proceedings of
the National Academy of Sciences. 116 (16): 7662–7669. doi:10.1073/pnas.1805871115 (htt
ps://doi.org/10.1073%2Fpnas.1805871115). PMC 6475373 (https://www.ncbi.nlm.nih.gov/p
mc/articles/PMC6475373). PMID 30642953 (https://pubmed.ncbi.nlm.nih.gov/30642953).
20. Ecker, Ullrich K. H.; Lewandowsky, Stephan; Chadwick, Matthew (2020-04-22). "Can
Corrections Spread Misinformation to New Audiences? Testing for the Elusive Familiarity
Backfire Effect" (http://psyarxiv.com/qrm69/). Cognitive Research: Principles and
Implications. 5 (1): 41. doi:10.31219/osf.io/et4p3 (https://doi.org/10.31219%2Fosf.io%2Fet4p
3). hdl:1983/0d5feec2-5878-4af6-b5c7-fbbd398dd4c4 (https://hdl.handle.net/1983%2F0d5fe
ec2-5878-4af6-b5c7-fbbd398dd4c4). PMC 7447737 (https://www.ncbi.nlm.nih.gov/pmc/articl
es/PMC7447737). PMID 32844338 (https://pubmed.ncbi.nlm.nih.gov/32844338).
21. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook,
John (2012). "Misinformation and Its Correction: Continued Influence and Successful
Debiasing". Psychological Science in the Public Interest. 13 (3): 106–131.
doi:10.1177/1529100612451018 (https://doi.org/10.1177%2F1529100612451018).
JSTOR 23484653 (https://www.jstor.org/stable/23484653). PMID 26173286 (https://pubmed.
ncbi.nlm.nih.gov/26173286). S2CID 42633
(https://api.semanticscholar.org/CorpusID:42633).
22. Busselle, Rick (2017), "Schema Theory and Mental Models" (https://onlinelibrary.wiley.com/
doi/abs/10.1002/9781118783764.wbieme0079), The International Encyclopedia of Media
Effects, American Cancer Society, pp. 1–8, doi:10.1002/9781118783764.wbieme0079 (http
s://doi.org/10.1002%2F9781118783764.wbieme0079), ISBN 978-1-118-78376-4, retrieved
2021-03-28
23. Plaza, Mateusz; Paladino, Lorenzo (2019). "The use of distributed consensus algorithms to
curtail the spread of medical misinformation". International Journal of Academic Medicine. 5
(2): 93–96. doi:10.4103/IJAM.IJAM_47_19 (https://doi.org/10.4103%2FIJAM.IJAM_47_19).
S2CID 201803407 (https://api.semanticscholar.org/CorpusID:201803407).
24. "Supplemental Material for The Role of Familiarity in Correcting Inaccurate Information" (http
s://dx.doi.org/10.1037/xlm0000422.supp). Journal of Experimental Psychology: Learning,
Memory, and Cognition. 2017. doi:10.1037/xlm0000422.supp (https://doi.org/10.1037%2Fxl
m0000422.supp). ISSN 0278-7393 (https://www.worldcat.org/issn/0278-7393).
25. Walter, Nathan; Tukachinsky, Riva (2019-06-22). "A Meta-Analytic Examination of the
Continued Influence of Misinformation in the Face of Correction: How Powerful Is It, Why
Does It Happen, and How to Stop It?" (https://dx.doi.org/10.1177/0093650219854600).
Communication Research. 47 (2): 155–177. doi:10.1177/0093650219854600 (https://doi.or
g/10.1177%2F0093650219854600). ISSN 0093-6502 (https://www.worldcat.org/issn/0093-6
502). S2CID 197731687 (https://api.semanticscholar.org/CorpusID:197731687).
26. Cook, John (May–June 2020). "Using Humor And Games To Counter Science
Misinformation" (https://web.archive.org/web/20201231192717/https://skepticalinquirer.org/2
020/05/using-humor-and-games-to-counter-science-misinformation/). Skeptical Inquirer.
Vol. 44, no. 3. Amherst, New York: Center for Inquiry. pp. 38–41. Archived from the original
(https://skepticalinquirer.org/2020/05/using-humor-and-games-to-counter-science-misinform
ation/) on 31 December 2020. Retrieved 31 December 2020.
27. Lewandowsky, Stephan; Ecker, Ullrich K.H.; Cook, John (December 2017). "Beyond
Misinformation: Understanding and Coping with the "Post-Truth" Era" (https://dx.doi.org/10.1
016/j.jarmac.2017.07.008). Journal of Applied Research in Memory and Cognition. 6 (4):
353–369. doi:10.1016/j.jarmac.2017.07.008 (https://doi.org/10.1016%2Fj.jarmac.2017.07.00
8). hdl:1983/1b4da4f3-009d-4287-8e45-a0a1d7b688f7 (https://hdl.handle.net/1983%2F1b4d
a4f3-009d-4287-8e45-a0a1d7b688f7). ISSN 2211-3681 (https://www.worldcat.org/issn/2211
-3681). S2CID 149003083 (https://api.semanticscholar.org/CorpusID:149003083).
28. "Facebook exposed over its handling of COVID - ProQuest" (https://www.proquest.com/docv
iew/2553642687). www.proquest.com. ProQuest 2553642687 (https://search.proquest.com/
docview/2553642687). Retrieved 2021-10-07.
29. "When Misinformation is Misinformation - ProQuest" (https://www.proquest.com/docview/247
7885938). www.proquest.com. ProQuest 2477885938 (https://search.proquest.com/docview/
2477885938). Retrieved 2021-10-10.
30. "Ask FactCheck" (http://www.factcheck.org/askfactcheck/). www.factcheck.org. Archived (htt
ps://web.archive.org/web/20160331063044/http://www.factcheck.org/askfactcheck) from the
original on 2016-03-31. Retrieved 2016-03-31.
31. Fernandez, Miriam; Alani, Harith (2018). "Online Misinformation" (http://oro.open.ac.uk/5373
4/1/sample-sigconf.pdf) (PDF). Companion of the Web Conference 2018 on the Web
Conference 2018 – WWW '18. New York: ACM Press: 595–602.
doi:10.1145/3184558.3188730 (https://doi.org/10.1145%2F3184558.3188730). ISBN 978-1-
4503-5640-4. S2CID 13799324 (https://api.semanticscholar.org/CorpusID:13799324).
Archived (https://web.archive.org/web/20190411195506/http://oro.open.ac.uk/53734/1/sampl
e-sigconf.pdf) (PDF) from the original on 2019-04-11. Retrieved 2020-02-13.
32. Zhang, Chaowei; Gupta, Ashish; Kauten, Christian; Deokar, Amit V.; Qin, Xiao (December
2019). "Detecting fake news for reducing misinformation risks using analytics approaches".
European Journal of Operational Research. 279 (3): 1036–1052.
doi:10.1016/j.ejor.2019.06.022 (https://doi.org/10.1016%2Fj.ejor.2019.06.022). ISSN 0377-
2217 (https://www.worldcat.org/issn/0377-2217). S2CID 197492100 (https://api.semanticsch
olar.org/CorpusID:197492100).
33. Calvert, Philip (December 2002). "Web of Deception: Misinformation on the Internet". The
Electronic Library. 20 (6): 521. doi:10.1108/el.2002.20.6.521.7 (https://doi.org/10.1108%2Fe
l.2002.20.6.521.7). ISSN 0264-0473 (https://www.worldcat.org/issn/0264-0473).
34. Conspiracy theories have long lurked in the background of American history, said Dustin
Carnahan, a Michigan State University professor who studies political misinformation:
Conspiracy theories paint fraudulent reality of Jan. 6 riot, By DAVID KLEPPER, AP news, 1°
Jan. 2022 (https://apnews.com/article/television-donald-trump-washington-conspiracy-theori
es-congress-0ddc173391135ac2cdaa335e3c9b4881).
35. Marwick, Alice E. (2013-01-31), "Online Identity", in John Hartley; Jean Burgess; Axel Bruns
(eds.), A Companion to New Media Dynamics, Wiley-Blackwell, pp. 355–364,
doi:10.1002/9781118321607.ch23 (https://doi.org/10.1002%2F9781118321607.ch23),
ISBN 978-1-118-32160-7
36. Verma, Nitin; Fleischmann, Kenneth R.; Koltai, Kolina S. (2017). "Human values and trust in
scientific journals, the mainstream media and fake news" (https://onlinelibrary.wiley.com/doi/
abs/10.1002/pra2.2017.14505401046). Proceedings of the Association for Information
Science and Technology. 54 (1): 426–435. doi:10.1002/pra2.2017.14505401046 (https://doi.
org/10.1002%2Fpra2.2017.14505401046). ISSN 2373-9231 (https://www.worldcat.org/issn/
2373-9231). S2CID 51958978 (https://api.semanticscholar.org/CorpusID:51958978).
37. Chen, Xinran; Sin, Sei-Ching Joanna (2013). " 'Misinformation? What of it?' Motivations and
individual differences in misinformation sharing on social media" (https://onlinelibrary.wiley.c
om/doi/abs/10.1002/meet.14505001102). Proceedings of the American Society for
Information Science and Technology. 50 (1): 1–4. doi:10.1002/meet.14505001102 (https://do
i.org/10.1002%2Fmeet.14505001102). ISSN 1550-8390 (https://www.worldcat.org/issn/1550
-8390).
38. "Literature Review: Echo chambers, filter bubbles and polarization" (https://royalsociety.org/
-/media/policy/projects/online-information-environment/oie-echo-chambers.pdf) (PDF).
Retrieved 21 February 2022.
39. "Preview unavailable - ProQuest" (https://www.proquest.com/docview/1355300828).
www.proquest.com. ProQuest 1355300828 (https://search.proquest.com/docview/13553008
28). Retrieved 2021-10-07.
40. Beware online "filter bubbles" | Eli Pariser (https://www.youtube.com/watch?v=B8ofWFx525
s), retrieved 2022-02-09
41. "The online information environment" (https://royalsociety.org/-/media/policy/projects/online-i
nformation-environment/the-online-information-environment.pdf) (PDF). Retrieved
21 February 2022.
42. Yee, Amy. "The country inoculating against disinformation" (https://www.bbc.com/future/articl
e/20220128-the-country-inoculating-against-disinformation). BBC. Retrieved 21 February
2022.
43. "Royal Society cautions against censorship of scientific misinformation online" (https://royals
ociety.org/news/2022/01/scientific-misinformation-report/). The Royal Society. Retrieved
12 February 2022.
44. Stawicki, Stanislaw; Firstenberg, Michael; Papadimos, Thomas. "The Growing Role of
Social Media in International Health Security: The Good, the Bad, and the Ugly". Global
Health Security. 1 (1): 341–357.
45. Vosoughi, Soroush; Roy, Deb; Aral, Sinan (2018-03-09). "The spread of true and false news
online" (https://web.archive.org/web/20190429073158/http://vermontcomplexsystems.org/sh
are/papershredder/vosoughi2018a.pdf) (PDF). Science. 359 (6380): 1146–1151.
Bibcode:2018Sci...359.1146V (https://ui.adsabs.harvard.edu/abs/2018Sci...359.1146V).
doi:10.1126/science.aap9559 (https://doi.org/10.1126%2Fscience.aap9559).
PMID 29590045 (https://pubmed.ncbi.nlm.nih.gov/29590045). S2CID 4549072 (https://api.se
manticscholar.org/CorpusID:4549072). Archived from the original (http://vermontcomplexsyst
ems.org/share/papershredder/vosoughi2018a.pdf) (PDF) on 2019-04-29. Retrieved
2019-08-21.
46. Tucker, Joshua A.; Guess, Andrew; Barbera, Pablo; Vaccari, Cristian; Siegel, Alexandra;
Sanovich, Sergey; Stukal, Denis; Nyhan, Brendan. "Social Media, Political Polarization, and
Political Disinformation: A Review of the Scientific Literature" (https://hewlett.org/library/soci
al-media-political-polarization-political-disinformation-review-scientific-literature/). Hewlett
Foundation White Paper. Archived (https://web.archive.org/web/20190306111605/https://he
wlett.org/library/social-media-political-polarization-political-disinformation-review-scientific-li
terature/) from the original on 2019-03-06. Retrieved 2019-03-05.
47. Machado, Caio; Kira, Beatriz; Narayanan, Vidya; Kollanyi, Bence; Howard, Philip (2019). "A
Study of Misinformation in WhatsApp groups with a focus on the Brazilian Presidential
Elections". Companion Proceedings of the 2019 World Wide Web Conference on – WWW
'19. New York: ACM Press: 1013–1019. doi:10.1145/3308560.3316738 (https://doi.org/10.11
45%2F3308560.3316738). ISBN 978-1450366755. S2CID 153314118 (https://api.semantic
scholar.org/CorpusID:153314118).
48. Chen, Xinran; Sin, Sei-Ching Joanna; Theng, Yin-Leng; Lee, Chei Sian (September 2015).
"Why Students Share Misinformation on Social Media: Motivation, Gender, and Study-level
Differences". The Journal of Academic Librarianship. 41 (5): 583–592.
doi:10.1016/j.acalib.2015.07.003 (https://doi.org/10.1016%2Fj.acalib.2015.07.003).
49. Caramancion, Kevin Matthe (2021), "The Role of Information Organization and Knowledge
Structuring in Combatting Misinformation: A Literary Analysis" (https://dx.doi.org/10.1007/97
8-3-030-91434-9_28), Computational Data and Social Networks, Lecture Notes in Computer
Science, Cham: Springer International Publishing, vol. 13116, pp. 319–329,
doi:10.1007/978-3-030-91434-9_28 (https://doi.org/10.1007%2F978-3-030-91434-9_28),
ISBN 978-3-030-91433-2, S2CID 244890285 (https://api.semanticscholar.org/CorpusID:244
890285), retrieved 2021-12-19
50. Starbird, Kate; Dailey, Dharma; Mohamed, Owla; Lee, Gina; Spiro, Emma (2018). "Engage
Early, Correct More: How Journalists Participate in False Rumors Online during Crisis
Events" (https://www.researchgate.net/publication/322665656). Proceedings of the 2018
CHI Conference on Human Factors in Computing Systems (CHI '18).
doi:10.1145/3173574.3173679 (https://doi.org/10.1145%2F3173574.3173679).
S2CID 5046314 (https://api.semanticscholar.org/CorpusID:5046314). Retrieved 2019-02-24.
51. Arif, Ahmer; Robinson, John; Stanck, Stephanie; Fichet, Elodie; Townsend, Paul; Worku,
Zena; Starbird, Kate (2017). "A Closer Look at the Self-Correcting Crowd: Examining
Corrections in Online Rumors" (http://ahmerarif.com/papers/CorrectiveBehavior.pdf) (PDF).
Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and
Social Computing (CSCW '17): 155–169. doi:10.1145/2998181.2998294 (https://doi.org/10.
1145%2F2998181.2998294). ISBN 978-1450343350. S2CID 15167363 (https://api.semanti
cscholar.org/CorpusID:15167363). Archived (https://web.archive.org/web/20190226172742/
http://ahmerarif.com/papers/CorrectiveBehavior.pdf) (PDF) from the original on 26 February
2019. Retrieved 25 February 2019.
52. Allcott, Hunt; Gentzkow, Matthew; Yu, Chuan (April 2019). "Trends in the diffusion of
misinformation on social media" (https://journals.sagepub.com/doi/full/10.1177/2053168019
848554). Research & Politics. 6 (2): 205316801984855. doi:10.1177/2053168019848554 (h
ttps://doi.org/10.1177%2F2053168019848554). ISSN 2053-1680 (https://www.worldcat.org/i
ssn/2053-1680). S2CID 52291737 (https://api.semanticscholar.org/CorpusID:52291737).
53. Swire-Thompson, Briony; Lazer, David (2020). "Public Health and Online Misinformation:
Challenges and Recommendations" (https://doi.org/10.1146%2Fannurev-publhealth-04011
9-094127). Annual Review of Public Health. 41: 433–451. doi:10.1146/annurev-publhealth-
040119-094127 (https://doi.org/10.1146%2Fannurev-publhealth-040119-094127).
PMID 31874069 (https://pubmed.ncbi.nlm.nih.gov/31874069).
54. Dwoskin, Elizabeth. "Misinformation on Facebook got six times more clicks than factual
news during the 2020 election, study says" (https://www.washingtonpost.com/technology/20
21/09/03/facebook-misinformation-nyu-study/). The Washington Post.
55. Messerole, Chris (2018-05-09). "How misinformation spreads on social media – And what to
do about it" (https://www.brookings.edu/blog/order-from-chaos/2018/05/09/how-misinformati
on-spreads-on-social-media-and-what-to-do-about-it/). Brookings Institution. Archived (http
s://web.archive.org/web/20190225044815/https://www.brookings.edu/blog/order-from-chaos/
2018/05/09/how-misinformation-spreads-on-social-media-and-what-to-do-about-it/) from the
original on 25 February 2019. Retrieved 24 February 2019.
56. Benkler, Y. (2017). "Study: Breitbart-led rightwing media ecosystem altered broader media
agenda" (https://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php). Archived (htt
ps://web.archive.org/web/20180604140114/https://www.cjr.org/analysis/breitbart-media-trum
p-harvard-study.php) from the original on 4 June 2018. Retrieved 8 June 2018.
57. Allcott, Hunt (October 2018). "Trends in the Diffusion of Misinformation on Social Media" (htt
ps://web.stanford.edu/~gentzkow/research/fake-news-trends.pdf) (PDF). Stanford Education.
arXiv:1809.05901 (https://arxiv.org/abs/1809.05901). Bibcode:2018arXiv180905901A (http
s://ui.adsabs.harvard.edu/abs/2018arXiv180905901A). Archived (https://web.archive.org/we
b/20190728160530/https://web.stanford.edu/~gentzkow/research/fake-news-trends.pdf)
(PDF) from the original on 2019-07-28. Retrieved 2019-05-10.
58. Krause, Nicole M.; Scheufele, Dietram A. (2019-04-16). "Science audiences, misinformation,
and fake news" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6475373). Proceedings of
the National Academy of Sciences. 116 (16): 7662–7669. doi:10.1073/pnas.1805871115 (htt
ps://doi.org/10.1073%2Fpnas.1805871115). ISSN 0027-8424 (https://www.worldcat.org/iss
n/0027-8424). PMC 6475373 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6475373).
PMID 30642953 (https://pubmed.ncbi.nlm.nih.gov/30642953).
59. Allcott, Hunt; Gentzkow, Matthew; Yu, Chuan (2019-04-01). "Trends in the diffusion of
misinformation on social media" (https://doi.org/10.1177%2F2053168019848554). Research
& Politics. 6 (2): 2053168019848554. doi:10.1177/2053168019848554 (https://doi.org/10.11
77%2F2053168019848554). ISSN 2053-1680 (https://www.worldcat.org/issn/2053-1680).
S2CID 52291737 (https://api.semanticscholar.org/CorpusID:52291737).
60. Shin, Jieun; Jian, Lian; Driscoll, Kevin; Bar, François (June 2018). "The diffusion of
misinformation on social media: Temporal pattern, message, and source". Computers in
Human Behavior. 83: 278–287. doi:10.1016/j.chb.2018.02.008 (https://doi.org/10.1016%2Fj.
chb.2018.02.008). ISSN 0747-5632 (https://www.worldcat.org/issn/0747-5632).
S2CID 41956979 (https://api.semanticscholar.org/CorpusID:41956979).
61. "Amazon to suspend Parler after deadly Capitol Hill riot" (https://www.aljazeera.com/news/2
021/1/10/amazon-to-suspend-parler-after-deadly-capitol-hill-riot). www.aljazeera.com.
Retrieved 2022-03-07.
62. Chen, Xinran; Sin, Sei-Ching Joanna; Theng, Yin-Leng; Lee, Chei Sian (2015). "Why Do
Social Media Users Share Misinformation?". Proceedings of the 15th ACM/IEEE-CE on
Joint Conference on Digital Libraries – JCDL '15. New York: ACM Press: 111–114.
doi:10.1145/2756406.2756941 (https://doi.org/10.1145%2F2756406.2756941). ISBN 978-1-
4503-3594-2. S2CID 15983217 (https://api.semanticscholar.org/CorpusID:15983217).
63. Gabbert, Fiona; Memon, Amina; Allan, Kevin; Wright, Daniel B. (September 2004). "Say it to
my face: Examining the effects of socially encountered misinformation" (https://rke.abertay.a
c.uk/ws/files/8509545/Gabbertetal2004Legal.pdf) (PDF). Legal and Criminological
Psychology. 9 (2): 215–227. doi:10.1348/1355325041719428 (https://doi.org/10.1348%2F13
55325041719428). ISSN 1355-3259 (https://www.worldcat.org/issn/1355-3259).
S2CID 144823646 (https://api.semanticscholar.org/CorpusID:144823646).
64. Aral, Sinan (2020). The Hype Machine: How Social Media Disrupts Our Elections, Our
Economy, and Our Health--and How We Must Adapt (https://www.worldcat.org/oclc/115548605
6) (First ed.). New York. ISBN 978-0-525-57451-4. OCLC 1155486056 (https://www.worldca
t.org/oclc/1155486056).
65. "Revealed: a quarter of all tweets about climate crisis produced by bots" (https://www.thegua
rdian.com/technology/2020/feb/21/climate-tweets-twitter-bots-analysis). The Guardian.
2020-02-21. Retrieved 2021-04-20.
66. Milman, Oliver (2020-02-21). "Revealed: quarter of all tweets about climate crisis produced
by bots" (https://www.theguardian.com/technology/2020/feb/21/climate-tweets-twitter-bots-an
alysis). The Guardian. ISSN 0261-3077 (https://www.worldcat.org/issn/0261-3077). Archived
(https://web.archive.org/web/20200222193221/https://www.theguardian.com/technology/202
0/feb/21/climate-tweets-twitter-bots-analysis) from the original on 2020-02-22. Retrieved
2020-02-23.
67. Massey, Douglas S.; Iyengar, Shanto (2019-04-16). "Scientific communication in a post-truth
society" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6475392). Proceedings of the
National Academy of Sciences. 116 (16): 7656–7661. doi:10.1073/pnas.1805868115 (http
s://doi.org/10.1073%2Fpnas.1805868115). ISSN 0027-8424 (https://www.worldcat.org/issn/
0027-8424). PMC 6475392 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6475392).
PMID 30478050 (https://pubmed.ncbi.nlm.nih.gov/30478050).
68. Allcott, Hunt; Gentzkow, Matthew; Yu, Chuan (2019-04-01). "Trends in the diffusion of
misinformation on social media" (https://doi.org/10.1177%2F2053168019848554). Research
& Politics. 6 (2): 2053168019848554. doi:10.1177/2053168019848554 (https://doi.org/10.11
77%2F2053168019848554). ISSN 2053-1680 (https://www.worldcat.org/issn/2053-1680).
69. Thai, My T.; Wu, Weili; Xiong, Hui (2016-12-01). Big Data in Complex and Social Networks
(https://books.google.com/books?id=CA4NDgAAQBAJ&q=causes+of+misinformation&pg=P
A125). CRC Press. ISBN 978-1-315-39669-9.
70. Bode, Leticia; Vraga, Emily K. (2018-09-02). "See Something, Say Something: Correction of
Global Health Misinformation on Social Media" (https://doi.org/10.1080/10410236.2017.133
1312). Health Communication. 33 (9): 1131–1140. doi:10.1080/10410236.2017.1331312 (htt
ps://doi.org/10.1080%2F10410236.2017.1331312). ISSN 1041-0236 (https://www.worldcat.o
rg/issn/1041-0236). PMID 28622038 (https://pubmed.ncbi.nlm.nih.gov/28622038).
S2CID 205698884 (https://api.semanticscholar.org/CorpusID:205698884).
71. Shaffer, David Williamson; Collier, Wesley; Ruis, A. R. (2016). "A tutorial on epistemic
network analysis: Analysing the structural connections in cognitive, social and interaction
data" (https://files.eric.ed.gov/fulltext/EJ1126800.pdf) (PDF). Journal of Learning Analytics. 3
(3): 9–45. doi:10.18608/jla.2016.33.3 (https://doi.org/10.18608%2Fjla.2016.33.3). Retrieved
31 January 2022.
72. O'Connor, Cailin; Weatherall, James Owen (2019). The Misinformation Age: How False
Beliefs Spread (https://yalebooks.yale.edu/book/9780300234015/misinformation-age). New
Haven, CT, USA: Yale University Press. ISBN 9780300234015. Retrieved 31 January 2022.
73. Valković, Martina (November 2020). "Review of "The Misinformation Age: How False Beliefs
Spread." " (https://philpapers.org/rec/VALCOA). Philosophy in Review. 40 (4).
doi:10.7202/1074030ar (https://doi.org/10.7202%2F1074030ar). S2CID 229478320 (https://a
pi.semanticscholar.org/CorpusID:229478320). Retrieved 31 January 2022.
74. Stapleton, Paul (2003). "Assessing the quality and bias of web-based sources: implications
for academic writing". Journal of English for Academic Purposes. 2 (3): 229–245.
doi:10.1016/S1475-1585(03)00026-2 (https://doi.org/10.1016%2FS1475-1585%2803%2900
026-2).
75. "Facebook's Lab-Leak About-Face" (https://www.wsj.com/articles/facebooks-lab-leak-about-
face-11622154198). WSJ.
76. "Covid origin: Why the Wuhan lab-leak theory is being taken seriously" (https://www.bbc.co
m/news/world-asia-china-57268111). BBC News. 27 May 2021.
77. "Hydroxychloroquine: Why a video promoted by Trump was pulled on social media" (https://
www.bbc.com/news/53559938). BBC News. 2020-07-28. Retrieved 2021-11-24.
78. "Stella Immanuel - the doctor behind unproven coronavirus cure claim" (https://www.bbc.co
m/news/world-africa-53579773). BBC News. 2020-07-29. Retrieved 2020-11-23.
79. Bertrand, Natasha (October 19, 2020). "Hunter Biden story is Russian disinfo, dozens of
former intel officials say" (https://www.politico.com/news/2020/10/19/hunter-biden-story-russi
an-disinfo-430276). Politico. Archived (https://web.archive.org/web/20201020034222/https://
www.politico.com/news/2020/10/19/hunter-biden-story-russian-disinfo-430276) from the
original on October 20, 2020. Retrieved October 20, 2020.
80. Lizza, Ryan (September 21, 2021). "POLITICO Playbook: Double Trouble for Biden" (https://
www.politico.com/newsletters/playbook/2021/09/21/double-trouble-for-biden-494411).
Politico.
81. Shearer, Elisa; Gottfried, Jeffrey (2017-09-07). "News Use Across Social Media Platforms
2017" (https://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-201
7/). Pew Research Center's Journalism Project. Retrieved 2021-03-28.
82. Croteau, David; Hoynes, William; Milan, Stefania. "Media Technology" (http://www.sagepub.
com/upm-data/40857_9.pdf) (PDF). Media Society: Industries, Images, and Audiences.
pp. 285–321. Archived (https://web.archive.org/web/20130102172415/http://www.sagepub.c
om/upm-data/40857_9.pdf) (PDF) from the original on January 2, 2013. Retrieved March 21,
2013.
83. Marwick, Alice; Lewis, Rebecca (2017). Media Manipulation and Disinformation Online.
New York: Data & Society Research Institute. pp. 40–45.
84. Gladstone, Brooke (2012). The Influencing Machine. New York: W. W. Norton & Company.
pp. 49–51. ISBN 978-0393342468.
85. "Misinformation - ProQuest" (https://www.proquest.com/docview/2486203133).
www.proquest.com. ProQuest 2486203133 (https://search.proquest.com/docview/24862031
33). Retrieved 2021-12-16.
86. Egelhofer, Jana Laura; Aaldering, Loes; Eberl, Jakob-Moritz; Galyga, Sebastian; Lecheler,
Sophie (2020-03-30). "From Novelty to Normalization? How Journalists Use the Term "Fake
News" in their Reporting" (https://dx.doi.org/10.1080/1461670x.2020.1745667). Journalism
Studies. 21 (10): 1323–1343. doi:10.1080/1461670x.2020.1745667 (https://doi.org/10.108
0%2F1461670x.2020.1745667). ISSN 1461-670X (https://www.worldcat.org/issn/1461-670
X). S2CID 216189313 (https://api.semanticscholar.org/CorpusID:216189313).
87. Stewart, Mallory (2021). "Defending Weapons Inspections from the Effects of
Disinformation" (https://dx.doi.org/10.1017/aju.2021.4). AJIL Unbound. 115: 106–110.
doi:10.1017/aju.2021.4 (https://doi.org/10.1017%2Faju.2021.4). ISSN 2398-7723 (https://ww
w.worldcat.org/issn/2398-7723). S2CID 232070073 (https://api.semanticscholar.org/CorpusI
D:232070073).
88. Damstra, Alyt; Boomgaarden, Hajo G.; Broda, Elena; Lindgren, Elina; Strömbäck, Jesper;
Tsfati, Yariv; Vliegenthart, Rens (2021-09-29). "What Does Fake Look Like? A Review of the
Literature on Intentional Deception in the News and on Social Media" (https://dx.doi.org/10.1
080/1461670x.2021.1979423). Journalism Studies. 22 (14): 1947–1963.
doi:10.1080/1461670x.2021.1979423
(https://doi.org/10.1080%2F1461670x.2021.1979423). ISSN 1461-670X (https://www.worldc
at.org/issn/1461-670X). S2CID 244253422 (https://api.semanticscholar.org/CorpusID:24425
3422).
89. Lanoszka, Alexander (June 2019). "Disinformation in international politics" (https://www.cam
bridge.org/core/product/identifier/S2057563719000063/type/journal_article). European
Journal of International Security. 4 (2): 227–248. doi:10.1017/eis.2019.6 (https://doi.org/10.1
017%2Feis.2019.6). ISSN 2057-5637 (https://www.worldcat.org/issn/2057-5637).
S2CID 211312944 (https://api.semanticscholar.org/CorpusID:211312944).
90. Ognyanova, Katherine; Lazer, David; Robertson, Ronald E.; Wilson, Christo (2020-06-02).
"Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust
in government when your side is in power" (https://misinforeview.hks.harvard.edu/article/misi
nformation-in-action-fake-news-exposure-is-linked-to-lower-trust-in-media-higher-trust-in-go
vernment-when-your-side-is-in-power/). Harvard Kennedy School Misinformation Review.
doi:10.37016/mr-2020-024 (https://doi.org/10.37016%2Fmr-2020-024). S2CID 219904597 (h
ttps://api.semanticscholar.org/CorpusID:219904597).
91. "Clarifying misinformation - ProQuest" (https://www.proquest.com/docview/1771695334).
www.proquest.com. ProQuest 1771695334 (https://search.proquest.com/docview/1771695334).
Retrieved 2021-10-10.
92. Bodner, Glen E.; Musch, Elisabeth; Azad, Tanjeem (2009). "Reevaluating the potency of the
memory conformity effect" (https://doi.org/10.3758%2Fmc.37.8.1069). Memory & Cognition.
37 (8): 1069–1076. doi:10.3758/mc.37.8.1069 (https://doi.org/10.3758%2Fmc.37.8.1069).
ISSN 0090-502X (https://www.worldcat.org/issn/0090-502X). PMID 19933452 (https://pubme
d.ncbi.nlm.nih.gov/19933452).
93. Southwell, Brian G.; Thorson, Emily A.; Sheble, Laura (2018). Misinformation and Mass
Audiences (https://books.google.com/books?id=1Jo7DwAAQBAJ&q=misinformation+cause
s&pg=PT246). University of Texas Press. ISBN 978-1477314586.
94. Barker, David (2002). Rushed to Judgment: Talk Radio, Persuasion, and American
Political Behavior. New York: Columbia University Press. pp. 106–109.
95. "The misinformation that was told about Brexit during and after the referendum" (https://www.
independent.co.uk/news/uk/politics/final-say-brexit-referendum-lies-boris-johnson-leave-ca
mpaign-remain-a8466751.html). The Independent. 2018-07-27. Archived (https://ghostarchiv
e.org/archive/20220515/https://www.independent.co.uk/news/uk/politics/final-say-brexit-refer
endum-lies-boris-johnson-leave-campaign-remain-a8466751.html) from the original on
2022-05-15. Retrieved 2020-11-23.
96. O'Connor, Cailin; Weatherall, James Owen (2019). The Misinformation Age: How False
Beliefs Spread (https://archive.org/details/misinformationag0000ocon). New Haven: Yale
University Press. p. 10 (https://archive.org/details/misinformationag0000ocon/page/10).
ISBN 978-0300234015.
97. Sinha, P.; Shaikh, S.; Sidharth, A. (2019). India Misinformed: The True Story (https://books.g
oogle.com/books?id=9ZuMDwAAQBAJ&q=misinformation+causes&pg=PT22). Harper
Collins. ISBN 978-9353028381.
98. Bratu, Sofia (May 24, 2020). "The Fake News Sociology of COVID-19 Pandemic Fear:
Dangerously Inaccurate Beliefs, Emotional Contagion, and Conspiracy Ideation" (https://doi.
org/10.22381%2FLPI19202010). Linguistic and Philosophical Investigations. 19: 128–134.
doi:10.22381/LPI19202010 (https://doi.org/10.22381%2FLPI19202010).
99. Gayathri Vaidyanathan (22 July 2020). "News Feature: Finding a vaccine for misinformation"
(https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7431032). Proceedings of the National
Academy of Sciences of the United States of America. 117 (32): 18902–18905.
Bibcode:2020PNAS..11718902V (https://ui.adsabs.harvard.edu/abs/2020PNAS..11718902
V). doi:10.1073/PNAS.2013249117 (https://doi.org/10.1073%2FPNAS.2013249117).
ISSN 0027-8424 (https://www.worldcat.org/issn/0027-8424). PMC 7431032 (https://www.ncb
i.nlm.nih.gov/pmc/articles/PMC7431032). PMID 32699146 (https://pubmed.ncbi.nlm.nih.gov/
32699146). Wikidata Q97652640.
100. "Misinformation on coronavirus is proving highly contagious" (https://apnews.com/article/ap-t
op-news-understanding-the-outbreak-health-media-social-media-86f61f3ffb6173c29bc7db2
01c10f141). AP NEWS. 2020-07-29. Retrieved 2020-11-23.
101. "Info-Environmentalism: An Introduction" (https://er.educause.edu/articles/2017/10/info-envir
onmentalism-an-introduction). Archived (https://web.archive.org/web/20180703130624/http
s://er.educause.edu/articles/2017/10/info-environmentalism-an-introduction) from the original
on 2018-07-03. Retrieved 2018-09-28.
102. "Information Environmentalism" (https://dlinq.middcreate.net/informationenvironmentalism/).
Digital Learning and Inquiry (DLINQ). 2017-12-21. Archived (https://web.archive.org/web/20
180928044347/https://dlinq.middcreate.net/informationenvironmentalism/) from the original
on 2018-09-28. Retrieved 2018-09-28.
Further reading
Machado, Caio; Kira, Beatriz; Narayanan, Vidya; Kollanyi, Bence; Howard, Philip (2019). "A
Study of Misinformation in WhatsApp groups with a focus on the Brazilian Presidential
Elections". Companion Proceedings of the 2019 World Wide Web Conference on – WWW
'19. New York: ACM Press: 1013–1019. doi:10.1145/3308560.3316738 (https://doi.org/10.11
45%2F3308560.3316738). ISBN 978-1450366755. S2CID 153314118 (https://api.semantic
scholar.org/CorpusID:153314118).
Allcott, H.; Gentzkow, M. (2017). "Social Media and Fake News in the 2016 Election" (https://
doi.org/10.1257%2Fjep.31.2.211). Journal of Economic Perspectives. 31 (2): 211–236.
doi:10.1257/jep.31.2.211 (https://doi.org/10.1257%2Fjep.31.2.211). S2CID 32730475 (http
s://api.semanticscholar.org/CorpusID:32730475).
Baillargeon, Normand (4 January 2008). A short course in intellectual self-defense. Seven
Stories Press. ISBN 978-1-58322-765-7. Retrieved 22 June 2011.
Bakir, V.; McStay, A. (2017). "Fake News and The Economy of Emotions: Problems, causes,
solutions" (https://research.bangor.ac.uk/portal/en/researchoutputs/fake-news-and-the-econo
my-of-emotions(6f96b5ed-884a-43c1-921f-74ed6f1384f8).html). Digital Journalism. 6: 154–
175. doi:10.1080/21670811.2017.1345645 (https://doi.org/10.1080%2F21670811.2017.134
5645). S2CID 157153522 (https://api.semanticscholar.org/CorpusID:157153522).
Cerf, Christopher, and Victor Navasky, The Experts Speak: The Definitive Compendium of
Authoritative Misinformation, Pantheon Books, 1984.
Cook, John; Stephan Lewandowsky; Ullrich K. H. Ecker (2017-05-05). "Neutralizing
misinformation through inoculation: Exposing misleading argumentation techniques reduces
their influence" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5419564). PLOS ONE. 12 (5):
e0175799. Bibcode:2017PLoSO..1275799C (https://ui.adsabs.harvard.edu/abs/2017PLoS
O..1275799C). doi:10.1371/journal.pone.0175799 (https://doi.org/10.1371%2Fjournal.pone.
0175799). PMC 5419564 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5419564).
PMID 28475576 (https://pubmed.ncbi.nlm.nih.gov/28475576).
Helfand, David J., A Survival Guide to the Misinformation Age: Scientific Habits of Mind.
Columbia University Press, 2016. ISBN 978-0231541022
Murphy, Christopher (2005). Competitive Intelligence: Gathering, Analysing and Putting It to
Work. Gower Publishing, Ltd. pp. 186–189. ISBN 0-566-08537-2. A case study of
misinformation arising from simple error.
O'Connor, Cailin, and James Owen Weatherall, "Why We Trust Lies: The most effective
misinformation starts with seeds of truth", Scientific American, vol. 321, no. 3 (September
2019), pp. 54–61.
O'Connor, Cailin, and James Owen Weatherall, The Misinformation Age: How False Beliefs
Spread. Yale University Press, 2019. ISBN 978-0300241006
Offit, Paul (2019). Bad Advice: Or Why Celebrities, Politicians, and Activists Aren't Your Best
Source of Health Information. Columbia University Press. ISBN 978-0231186995.
Persily, Nathaniel, and Joshua A. Tucker, eds. Social Media and Democracy: The State of
the Field and Prospects for Reform. Cambridge University Press, 2020. ISBN 978-
1108858779
Strässler, Jürg (1982). Idioms in English: A Pragmatic Analysis. Gunter Narr Verlag. pp. 43–
44. ISBN 3-87808-971-6.
External links
Comic: Fake News Can Be Deadly. Here's How To Spot It (https://www.npr.org/2020/04/17/8
37202898/comic-fake-news-can-be-deadly-heres-how-to-spot-it) (audio tutorial, graphic
tutorial)
Management and Strategy Institute, Free Misinformation and Disinformation Training online
(https://www.msicertified.com/free-training/misinformation-and-disinformation-training/)
Retrieved from "https://en.wikipedia.org/w/index.php?title=Misinformation&oldid=1106447423"

This page was last edited on 24 August 2022, at 17:08 (UTC).

Text is available under the Creative Commons Attribution-ShareAlike License 3.0; additional
terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy.
Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a non-profit
organization.