Methods of mind manipulation, often based on logical fallacies
Propaganda techniques are methods used in propaganda to convince an audience to believe what the propagandist wants them to believe. Many propaganda techniques are based on socio-psychological research. Many of these same techniques can be classified as logical fallacies or abusive power and control tactics.
Definition
In their book Propaganda and Persuasion, authors Garth S. Jowett and Victoria O'Donnell define propaganda as the "deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist".[1] Harold D. Lasswell's definition focuses even more precisely on the technical aspect:
"Propaganda in the broadest sense is the technique of influencing human action by the manipulation of representations. These representations may take spoken, written, pictorial or musical form."[2]
Manipulation can be organized or unorganized, conscious or unconscious, politically or socially motivated. The concept ranges from systematic state propaganda intended to manipulate public opinion (Edward Bernays) to "sociological propaganda" (propaganda of integration),[3] in which the unconscious desire to be manipulated and self-manipulation lead the individual to adapt to socially expected thoughts and behaviours (Jacques Ellul).[4]
The transition from non-propaganda to propaganda is fluid. Effective manipulation presupposes a non-manipulative setting in which to unfold its effect, which is why pointing to such contexts does not in itself refute the manipulative character of an act of communication.[3]
Classification
Propaganda is understood as a form of manipulation of public opinion. The semiotic manipulation of signs is its essential characteristic ("Propaganda is a major form of manipulation by symbols").[5]
Propaganda is thus a special form of communication, studied in communication research and especially in media-effects research, with a focus on media manipulation.[6] It is a particular type of communication characterized by a distorted representation of reality and by manipulation.[4]
Manipulation and media
Common media for transmitting propaganda messages include news reports, government reports, historical revision, junk science, books, leaflets, movies, social media, radio, television, and posters. Less common nowadays are propaganda envelopes (postal covers), examples of which have survived from the time of the American Civil War (Connecticut Historical Society; Civil War Collections; Covers). In the case of radio and television, propaganda can exist on news, current-affairs or talk-show segments, as advertising or public-service announcement "spots", or as long-running advertorials.

Propaganda campaigns often follow a strategic transmission pattern to indoctrinate the target group. This may begin with a simple transmission, such as a leaflet dropped from a plane or an advertisement. Generally these messages will contain directions on how to obtain more information, via a web site, hotline, radio program, etc. The strategy aims to move the individual from information recipient to information seeker through reinforcement, and then from information seeker to opinion leader through indoctrination.[7]
Information dissemination strategies become propaganda strategies only when coupled with propagandistic messages.[citation needed] Identifying these messages is a necessary prerequisite for studying the methods by which they are spread.
Some techniques are classified as logical fallacies because propaganda uses arguments which may have psychological effects but which are logically invalid.[10][11][12][13][14] Specific techniques include the following.
Ad nauseam propaganda uses tireless repetition of an idea. An idea, especially a simple slogan, that is repeated enough times may begin to be taken as the truth. This approach is more effective when the propagandist also limits or controls the media.
Agenda setting means the "ability [of the news media] to influence the importance placed on the topics of the public agenda".[16] If a news item is covered frequently and prominently, the audience will regard the issue as more important.
Appeals to fear seek to build support by instilling anxieties and panic in the general population; for example, Joseph Goebbels exploited Theodore Kaufman's Germany Must Perish! to claim that the Allies sought the extermination of the German people.
Bandwagon and "inevitable-victory" appeals attempt to persuade the target audience to join in and take the course of action that "everyone else is taking."
Inevitable victory: invites those not already on the bandwagon to join those already on the road to certain victory. Those already or at least partially on the bandwagon are reassured that staying aboard is their best course of action. (e.g., "The debate is over. Nearly everyone who matters agrees with me.")
Join the crowd: This technique reinforces people's natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their best interest to join.
This type of propaganda deals with famous people or depicts attractive, happy people, suggesting that if people buy a product or follow a certain ideology, they too will be happy or successful. It is used more in advertising for products than for political purposes. Sexual arousal may also be used: for example, a message promoting a brand of motorcycles to a male target audience may include sexually attractive bikini-clad women within the advertisement, to make the product more appealing to the audience by targeting sexual desires. However, some evidence suggests that using sexual appeal to sell a product may not succeed, as the target audience may focus too much on the sexually appealing people in the advertisement rather than the product itself.[17]
A big lie is the repeated articulation of a complex of events that justifies subsequent action. The descriptions of these events have elements of truth, and the "big lie" generalizations merge with and eventually supplant the public's accurate perception of the underlying events. After World War I, the German "stab in the back" explanation of the cause of their defeat became a justification for Nazi re-militarization and revanchism.
Richard Crossman, the British Deputy Director of Psychological Warfare Division (PWD) for the Supreme Headquarters Allied Expeditionary Force (SHAEF) during the Second World War said "In propaganda truth pays... It is a complete delusion to think of the brilliant propagandist as being a professional liar. The brilliant propagandist is the man who tells the truth, or that selection of the truth which is requisite for his purpose, and tells it in such a way that the recipient does not think he is receiving any propaganda... [...] The art of propaganda is not telling lies, but rather selecting the truth you require and giving it mixed up with some truths the audience wants to hear."[18]
All vertebrates, including humans, respond to classical conditioning. That is, if A is always present when B is present and B causes a physical reaction (e.g. disgust, pleasure), then when presented with object A in the absence of B, that same reaction will be experienced.
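As a rough illustration of the pairing rule just described, the following is a minimal toy sketch in Python using a Rescorla-Wagner-style associative update; the function name, learning rate, and stimulus labels are illustrative assumptions, not drawn from the article.

```python
# Toy illustration of the classical-conditioning pairing described above.
# All names, parameters, and numbers are illustrative assumptions.

def condition(pairings: int, learning_rate: float = 0.3) -> float:
    """Associative strength of stimulus A after repeated pairings with
    stimulus B, where B alone reliably triggers a reaction."""
    strength = 0.0            # A initially evokes no reaction
    reaction_magnitude = 1.0  # the reaction that B produces
    for _ in range(pairings):
        # A is presented together with B; the association grows toward the
        # magnitude of B's reaction (a prediction-error update).
        strength += learning_rate * (reaction_magnitude - strength)
    return strength

if __name__ == "__main__":
    for n in (0, 1, 5, 20):
        # After enough pairings, presenting A alone is assumed to evoke
        # most of the same reaction that B produces.
        print(f"after {n:2d} pairings, A alone evokes {condition(n):.2f} of B's reaction")
```

In this sketch the response transferred to A approaches the response produced by B as the number of pairings grows, mirroring the description above.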
People desire to be consistent. Suppose a pollster finds that a certain group of people hates a candidate for senator but loves actor A. The campaign then uses actor A's endorsement of the candidate to change people's minds, because people cannot tolerate inconsistency: they are forced either to dislike the actor or to like the candidate.
The "plain folks" or "common man" approach attempts to convince the audience that the propagandist's positions reflect the common sense of the people. It is designed to win the confidence of the audience by communicating in the common manner and style of the target audience. Propagandists use ordinary language and mannerisms (and clothe their message in face-to-face and audiovisual communications) in attempting to identify their point of view with that of the average person. A common example of this type of propaganda is a political figure, usually running for a placement, in a backyard or shop doing daily routine things. This image appeals to the common person. With the plain folks device, the propagandist can win the confidence of persons who resent or distrust foreign sounding, intellectual speech, words, or mannerisms."[19] For example, a politician speaking to a Southern United States crowd might incorporate words such as "Y'all" and other colloquialisms to create a perception of belonging.
A cult of personality arises when an individual uses mass media to create an idealized and heroic public image, often through unquestioning flattery and praise. The hero personality then advocates the positions that the propagandist desires to promote. For example, modern propagandists hire popular personalities to promote their ideas and/or products.
Making individuals from the opposing nation, from a different ethnic group, or those who support the opposing viewpoint appear to be subhuman (e.g., the Vietnam War-era term "gooks" for soldiers of the National Front for the Liberation of South Vietnam, also known as the Viet Cong or "VC"), worthless, or immoral, through suggestion or false accusations. Dehumanizing is also used synonymously with demonizing; the latter usually serves as an aspect of the former.
This technique hopes to simplify the decision-making process by using images and words, including interjections, to tell the audience exactly what actions to take, eliminating any other possible choices. Authority figures can be used to give the order, overlapping with the appeal-to-authority technique, but not necessarily. The Uncle Sam "I want you" image is an example of this technique.
Disinformation involves the creation or deletion of information from public records for the purpose of making a false record of an event or of the actions of a person or organization, including outright forgery of photographs, motion pictures, broadcasts, and sound recordings as well as printed documents.
Divide and rule in politics and sociology is gaining and maintaining power by breaking up larger concentrations of power into pieces that individually have less power than the one implementing the strategy.
The door-in-the-face technique is used to increase a person's latitude of acceptance. For example, if a salesperson wants to sell an item for $100 but the public is only willing to pay $50, the salesperson first offers the item at a higher price (e.g., $200) and subsequently reduces the price to $100 to make it seem like a good deal.
The use of an event that generates euphoria or happiness, or of an appealing event to boost morale. Euphoria can be created by declaring a holiday, making luxury items available, or mounting a military parade with marching bands and patriotic messages.
An exaggeration (or hyperbole) occurs when the most fundamental aspects of a statement are true, but only to a certain degree. It is also seen as "stretching the truth" or making something appear more powerful, meaningful, or real than it actually is. Saying that a person ate 20 spring rolls at a party when they actually ate 7 or 8 would be considered an exaggeration.
A false accusation is a claim or allegation of wrongdoing that is untrue and/or otherwise unsupported by facts.[20] False accusations can be made in any of the following contexts: informally in everyday life, quasi-judicially, or judicially.
Fear, uncertainty, and doubt, sometimes abbreviated as FUD, is an attempt to influence public perception by disseminating negative and dubious or false information designed to undermine the credibility of the target's beliefs.
The firehose of falsehood is a propaganda technique in which a large number of messages are broadcast rapidly, repetitively, and continuously over multiple channels (such as news and social media) without regard for truth or consistency.
Flag-waving is an attempt to justify an action on the grounds that doing so will make one more patriotic, or will in some way benefit a group, country, or idea. The feeling of patriotism this technique attempts to inspire may not necessarily diminish or entirely preclude rational examination of the matter in question.
This technique is often used by recruiters and salespeople. For example, the perpetrator walks up to the victim and pins a flower on them or gives them a small gift. The victim says thanks and has now incurred a psychological debt to the perpetrator. The perpetrator eventually asks for a larger favor (e.g., a donation, or to buy something far more expensive). The unwritten social contract between the victim and perpetrator causes the victim to feel obligated to reciprocate by agreeing to do the larger favor or buy the more expensive gift.
Framing is the social construction of a social phenomenon, often by mass media sources, political or social movements, political leaders, or other actors and organizations. It is an inevitable process of selective influence over the individual's perception of the meanings attributed to words or phrases.
Gaslighting uses persistent denial, misdirection, contradiction, and lying to sow seeds of doubt in a target individual or group, hoping to make them question their own memory, perception, sanity, and norms.
The Gish gallop involves bombarding a political opponent with obnoxiously complex questions in rapid fire during a debate, to make the opponent appear not to know what they are talking about.
Glittering generalities are emotionally appealing words that are applied to a product or idea, but present no concrete argument or analysis. This technique has also been referred to as the PT Barnum effect. (e.g., the advertising campaign slogan "Ford has a better idea!")
This technique is used to persuade a target audience to disapprove of an action or idea by suggesting that the idea is popular with groups hated, feared, or held in contempt by the target audience. Thus if a group that supports a certain policy is led to believe that undesirable, subversive, or contemptible people support the same policy, then the members of the group may decide to change their original position. This is a form of bad logic, where A is said to include X, and B is said to include X, therefore, A = B.
A half-truth is a deceptive statement that includes some element of truth. It comes in several forms: the statement might be partly true, the statement may be totally true but only part of the whole truth, or it may utilize some deceptive element, such as improper punctuation, or double meaning, especially if the intent is to deceive, evade, blame, or misrepresent the truth.
"Information overload can have the same effect as secrecy and certainly in the short term and for democracies today it might be considered more effective."[21] "When information overload occurs, it is likely that a reduction in decision quality will occur."[22] "The glut of information generated by modern technology [...] threatens to make its receivers passive. Overload prompts disengagement."[23]
Generalities are deliberately vague so that the audience may supply its own interpretations. The intention is to move the audience by use of undefined phrases, without analyzing their validity or attempting to determine their reasonableness or application. The intent is to cause people to draw their own interpretations rather than simply being presented with an explicit idea. In trying to "figure out" the propaganda, the audience forgoes judgment of the ideas presented. Their validity, reasonableness and application may still be considered.
A euphemism is used when the propagandist attempts to increase the perceived quality, credibility, or credence of a particular ideal. A dysphemism is used when the propagandist's intent is to discredit, diminish the perceived quality of, or hurt the perceived righteousness of the individual. By creating a "label", "category", or "faction" of a population, it is much easier to make an example of these larger bodies, because the propagandist can uplift or defame the individual without actually incurring legal defamation. Labeling can be thought of as a subset of guilt by association, another logical fallacy.[24][unreliable source?]
If a person's message is outside the bounds of acceptance for an individual and group, most techniques will engender psychological reactance (simply hearing the argument will make the message even less acceptable). There are two techniques for increasing the bounds of acceptance. First, one can take an even more extreme position that will make more moderate positions seem more acceptable. This is similar to the door-in-the-face technique. Alternatively, one can moderate one's own position to the edge of the latitude of acceptance and then over time slowly move to the position that was previously held.[25]
A technique used by clandestine professionals: When their veil of secrecy is shredded and they can no longer rely on a phony cover story to misinform the public, they resort to admitting—sometimes even volunteering—some of the truth while still managing to withhold the key and damaging facts in the case.
Loaded language uses specific words and phrases with strong emotional implications to influence the audience, for example using the word "reforms" rather than a more neutral word like "changes".
Love bombing is used to recruit members to a cult or ideology: a group of individuals cuts a person off from their existing social support and replaces it entirely with members of the group, who deliberately bombard the person with affection in an attempt to isolate the person from their prior beliefs and value system.
Lying and deception can be the basis of many propaganda techniques, including ad hominem arguments, the big lie, defamation, door-in-the-face, half-truths, name-calling, or any other technique that is based on dishonesty or deception. For example, many politicians have been found to frequently stretch or break the truth.
According to Adolf Hitler, "The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly – it must confine itself to a few points and repeat them over and over."[26][27] This idea is consistent with the principle of classical conditioning as well as the idea of "Staying on Message."
Minimisation is the opposite of exaggeration. It is a type of deception[28] involving denial coupled with rationalization in situations where complete denial is implausible.
Propagandists use the name-calling technique to incite fears and arouse prejudices in their hearers, with the intent that the bad names will cause hearers to construct a negative opinion about a group or set of beliefs or ideas that the propagandist wants hearers to denounce. The method is intended to provoke conclusions about a matter apart from impartial examination of the facts. Name-calling is thus a substitute for rational, fact-based arguments against an idea or belief on its own merits.[29]
A non sequitur is a type of logical fallacy in which a conclusion is drawn from an argument that does not justify it. All invalid arguments can be considered special cases of non sequitur.
Operant conditioning involves learning through reinforcement and imitation. For example, watching an appealing person buy products or endorse positions teaches a person to buy the product or endorse the position. Operant conditioning is the underlying principle behind ad nauseam, slogan, and other repetition-based public relations campaigns.
Selective editing of quotes can change their meaning. Political documentaries designed to discredit an opponent or an opposing political viewpoint often use this technique.
Individuals or groups may use favorable generalities to rationalize questionable acts or beliefs. Vague and pleasant phrases are often used to justify such actions or beliefs.
Presenting data or issues that, while compelling, are irrelevant to the argument at hand, and then claiming that they validate the argument.[citation needed]
This is the repeating of a certain symbol or slogan so that the audience remembers it. This could be in the form of a jingle or an image placed on nearly everything in the picture/scene. This also includes using subliminal phrases, images or other content in a piece of propaganda.[24]
Assigning blame to an individual or group, thus alleviating feelings of guilt from responsible parties and/or distracting attention from the need to fix the problem for which blame is being assigned.
This technique can be used to lessen the impact of a damaging headline or sound bite. For example, if an upcoming story about a "bribe" will be damaging, repeatedly using the word "bribe" for trivial accusations may normalize the word so that it is more readily dismissed when encountered.
A slogan is a brief, striking phrase that may include labeling and stereotyping. Although slogans may be enlisted to support reasoned ideas, in practice they tend to act only as emotional appeals. Opponents of the US invasion and occupation of Iraq use the slogan "blood for oil" to suggest that the invasion and its human losses were motivated by access to Iraq's oil riches. On the other hand, supporters who argue that the US should continue to fight in Iraq use the slogan "cut and run" to suggest that withdrawal is cowardly or weak. Similarly, the names of military campaigns, such as "Enduring Freedom" or "Just Cause", can also be considered slogans, devised to influence people.
A smear is an effort to damage or call into question someone's reputation, by propounding negative propaganda. It can be applied to individuals or groups.
This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable. For instance, reporting on a foreign country or social group may focus on the stereotypical traits that the reader expects, even though they are far from being representative of the whole country or group; such reporting often focuses on the anecdotal. In graphic propaganda, including war posters, this might include portraying enemies with stereotyped racial features.
A straw man argument is an informal fallacy based on misrepresentation of an opponent's position. To "attack a straw man" is to create the illusion of having refuted a proposition by substituting a superficially similar proposition (the "straw man"), and refuting it, without ever having actually refuted the original position.
Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited. The testimonial places the official sanction of a respected person or authority on a propaganda message. This is done in an effort to cause the target audience to identify itself with the authority or to accept the authority's opinions and beliefs as its own.
The third-party technique works on the principle that people are more willing to accept an argument from a seemingly independent source of information than from someone with a stake in the outcome. It is a marketing strategy commonly employed by public relations (PR) firms that involves placing a premeditated message in the "mouth of the media." The technique can take many forms, ranging from hiring journalists to report on the organization in a favorable light, to using scientists within the organization to present their perhaps prejudicial findings to the public. Frequently, astroturf groups or front groups are used to deliver the message.
Transfer, also known as association, is a technique of projecting positive or negative qualities (praise or blame) of a person, entity, object, or value onto another to make the second more acceptable or to discredit it. It evokes an emotional response, which stimulates the target to identify with recognized authorities. Often highly visual, this technique frequently utilizes symbols (for example, the swastika used in Nazi Germany, originally a symbol for health and prosperity) superimposed over other visual images.
This technique is used when the propaganda concept would seem less credible if explicitly stated. The concept is instead repeatedly assumed or implied.
Virtue words are words in the value system of the target audience that produce a positive image when attached to a person or issue. Peace, hope, happiness, security, wise leadership, freedom, "The Truth", etc. are virtue words. Many people see religiosity as a virtue, making associations with this quality effectively beneficial.
Whataboutism is a variant of the tu quoque logical fallacy that attempts to discredit an opponent's position by charging them with hypocrisy without directly refuting or disproving their argument. It is particularly associated with Soviet and Russian propaganda: when criticisms were leveled at the Soviet Union, the Soviet response would be "What about..." followed by an event in the Western world.
Jowett, Garth S.; O'Donnell, Victoria (2006). Propaganda and Persuasion. SAGE. ISBN 978-1-4129-0898-6. Retrieved 2019-06-30. "Propaganda is the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist."
Cole, Robert, ed. (1998). Encyclopedia of Propaganda. Armonk, NY: Sharpe Reference. ISBN 9780765680099. OCLC 37238860.