What Propaganda Actually Is and What It Is Not
Propaganda has been defined so variously (and so tendentiously) that the word has become almost purely rhetorical: a label applied to the other side's persuasion, never to one's own. This is itself a propaganda effect. Understanding propaganda requires a definition precise enough to identify the mechanism, broad enough to capture its full range of application, and neutral enough to apply to all actors including those we agree with.
The most analytically useful definition comes from Jacques Ellul, whose 1962 treatise Propagandes remains the most rigorous theoretical account of the phenomenon. For Ellul, propaganda is not simply the communication of false information, nor is it limited to the activities of states or political movements. Propaganda is a sociological phenomenon: a systematic technique of psychological manipulation that uses mass communication to produce in large populations the reactions, beliefs, and behaviours that serve the interests of the propagandist, whether or not those interests are disclosed, and whether or not the content is true or false.
1. Agitation vs. Integration. Agitation propaganda incites immediate action: it mobilises, inflames, and drives toward a specific short-term behaviour. Integration propaganda is slower and more pervasive: it works to produce conformity with an established social and political order, making existing arrangements feel natural, necessary, and morally sanctioned. Most contemporary Western propaganda is integration propaganda: it does not call for action so much as it calls for acquiescence.
2. White, Grey, and Black. White propaganda acknowledges its source. Grey propaganda conceals it. Black propaganda falsely attributes itself to an enemy source. The most effective propaganda is often white (acknowledged, official, transparent about its origin) because the concealment of the source is unnecessary when the audience has already been conditioned to accept the source's authority. State media that openly declares its affiliation can be more effective than covert operations precisely because the audience has learned to trust the state.
3. Vertical vs. Horizontal. Vertical propaganda flows from an authority downward to a mass audience. Horizontal propaganda spreads laterally through peer networks, person to person, community to community. Horizontal propaganda is substantially more difficult to resist because it exploits the trust mechanisms of personal relationship rather than the authority mechanisms of institutional communication. The friend who shares an ideologically loaded video is a more effective propagandist than the broadcaster who produces it.
4. Political vs. Sociological. Political propaganda targets specific political beliefs, attitudes, and behaviours. Sociological propaganda (Ellul's most important and most neglected distinction) works at the level of values, worldview, and the categories through which reality is perceived. It is not produced by a propaganda ministry; it is produced by the totality of cultural output: advertising, entertainment, journalism, education, and social media, all of which collectively socialise individuals into a set of assumptions about what is normal, what is possible, and what is desirable. Sociological propaganda is the water in which political propaganda swims.
This framework immediately expands the domain of propaganda beyond the Nazi poster and the Soviet newsreel, which is where popular understanding tends to locate it. The tobacco industry's decades-long campaign to manufacture scientific doubt about the relationship between smoking and cancer is propaganda. The advertising industry's systematic exploitation of insecurity, desire, and social comparison to generate consumption is propaganda. The news media's structuring of attention through the selection, framing, and repetition of particular events (while others go unreported) is propaganda. None of these producers of ideologically loaded mass communication would accept the label. This is consistent with how propaganda works: it is most effective when its subjects do not recognise it as such.
The morally important question is not "is this propaganda?" but "who benefits from this belief being widely held, and does the mechanism by which it is induced operate with the informed consent of its targets?" Propaganda, on this analysis, is persuasion that exploits psychological mechanisms rather than engaging rational evaluation: persuasion that would not work if its target fully understood how it was operating.
The Engineering of Consent: How Modern Propaganda Was Born
Edward Bernays was the nephew of Sigmund Freud, and he used his uncle's insights about the unconscious mechanisms of human motivation to build the modern public relations industry. His 1928 book Propaganda (which opens with the sentence "The conscious and intelligent manipulation of the organised habits and opinions of the masses is an important element in democratic society") is among the most honest accounts of its own project in the history of persuasion literature. Bernays did not regard mass psychological manipulation as a regrettable necessity. He regarded it as the proper function of a professional class of "invisible governors" whose social role was to manage the irrational masses on behalf of the interests of those who hired them.
Bernays' most significant innovation was to shift the target of persuasion from the rational faculty to the emotional and associative faculty. Earlier political communication had addressed explicit arguments to rational agents who were expected to evaluate them. Bernays, drawing on Freud and on Gustave Le Bon's crowd psychology, recognised that mass behaviour was driven primarily by emotion, identification, and social conformity rather than by deliberate individual evaluation. The practical conclusion was that persuasion should address emotional needs and social identities rather than rational arguments.
His technique was what he called "crystallising public opinion" through the deliberate creation of pseudo-events, manufactured occasions, endorsements, and spectacles designed to generate news coverage that would appear to reflect spontaneous public sentiment rather than planned communication. His most famous campaign involved persuading American women to smoke in public in 1929: he hired women to march in the New York Easter Parade smoking cigarettes, which he had labelled "Torches of Freedom" in a pitch that framed cigarettes as symbols of women's liberation. The coverage was enormous; smoking among women expanded dramatically; and no one who saw the coverage or read about it recognised it as a planned corporate communication operation. It was designed to look like a social movement. It was an advertisement.
Bernays was explicit about the intellectual framework: "If we understand the mechanism and motives of the group mind, it is now possible to control and regiment the masses according to our will without their knowing it." The phrase "without their knowing it" is the key. The engineering of consent does not depend on the consent of those whose consent is being engineered. It depends on their not recognising that engineering is occurring.
Walter Lippmann, Bernays' contemporary and the most important American journalist of the early 20th century, developed a parallel analysis from a different direction. In Public Opinion (1922) and The Phantom Public (1925), Lippmann argued that ordinary citizens were structurally incapable of forming accurate opinions about the complex modern world: it was too vast, too complicated, and too inaccessible to direct experience for individual citizens to develop reliable political judgment. The "pictures in our heads" that Lippmann called "pseudo-environments" (our mental representations of the world beyond our direct experience) were necessarily simplified, stereotype-laden, and media-mediated. Democratic theory's assumption that citizens had opinions worth consulting was, on this analysis, naive.
Lippmann and Bernays drew opposite conclusions from the same premise. Lippmann, ambivalently, called for a class of expert analysts who would mediate between the complex world and the simplified public mind. Bernays, unambivalently, built an industry for managing the simplified public mind on behalf of corporate and political clients. Together, their work established the intellectual foundation for the 20th century's industrial propaganda apparatus: the recognition that mass opinion was a manufactured product, that the manufacturing could be systematised, and that those who controlled the manufacturing controlled the political landscape.
The conscious and intelligent manipulation of the organised habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.
Edward Bernays, Propaganda (1928)
Propaganda as Total Environment: Beyond the Poster
Jacques Ellul was a French philosopher, sociologist, and theologian whose 1962 analysis of propaganda remains the most important single work on the subject, and the least read by those who discuss it. Ellul's contribution is not a description of propaganda techniques but a structural analysis of what propaganda is and what conditions make it possible. His conclusions are more disturbing than any account of specific techniques, because they locate the problem not in the malice of propagandists but in the structure of modern technological society itself.
Ellul's central argument is that propaganda is not primarily a product of totalitarian states. It is a product of mass society, a necessary feature of any society characterised by mass communication technology, urbanisation, and the disruption of traditional community structures. Modern individuals, stripped of the traditional networks of face-to-face community that once provided their understanding of the world, are structurally dependent on mass media for their picture of reality beyond their immediate experience. This structural dependency is what makes propaganda possible and, in Ellul's analysis, inevitable.
Ellul's Most Uncomfortable Finding
Propaganda is most effective on the best-educated, most-informed segments of the population, not on the ignorant. The educated person consumes more media, has more structured opinions on more topics, and is therefore more thoroughly embedded in the mediated pseudo-environment that propaganda constructs. Their confident, information-rich worldview is a product of information consumption, and that information was selected, framed, and presented by media organisations that are themselves embedded in ideological systems. The confident, well-informed citizen is often more thoroughly propagandised than the ill-informed one, because they have more material in their heads that was put there by someone else.
The Pre-Propaganda Problem
Ellul argues that effective propaganda requires pre-propaganda: the gradual construction of the cognitive and emotional framework into which specific propaganda messages will later be inserted. Before you can tell someone who the enemy is, you must have established that there is an enemy. Before you can mobilise someone in defence of a sacred value, you must have established that the value is sacred. The pre-propaganda work of building these frameworks is done, in modern societies, not by propaganda ministries but by entertainment, advertising, journalism, and education, which collectively socialise individuals into a set of emotional and cognitive defaults that specific propaganda can then activate. By the time the specific message arrives, the ground has been entirely prepared.
Ellul identifies several structural conditions that must hold for propaganda to be effective:
Individual psychological isolation: The individual must be simultaneously part of a mass and cut off from organic community. Traditional communities provided alternative sources of meaning, information, and social identity that could buffer propaganda's effects. The atomised modern individual has no such buffer. They receive mass communications alone, and their response is unmediated by the conversations and relationships that traditional communities would have provided.
Technical means of mass communication: Propaganda requires the technological infrastructure to reach large populations simultaneously with the same message. Radio, cinema, television, and now social media each created new propaganda environments by providing new mechanisms for simultaneous mass reach. Each new technology initially appeared to democratise communication; each was rapidly colonised by those with the resources and interests to use it most effectively for mass persuasion.
Monopoly or near-monopoly on information: Effective propaganda does not require that all alternative information be suppressed, though this helps. It requires that the propagandistic framing of events be ubiquitous enough that alternative frameworks are marginalised and implausible. In a media environment where all major outlets frame events through similar assumptions and priorities, the propaganda effect is achieved without censorship.
Duration and total sociological implication: Effective propaganda is not a campaign; it is an environment. It must be sustained long enough to alter the individual's basic cognitive and emotional framework, not merely their position on a specific issue. This is why Ellul emphasises sociological propaganda (the continuous background construction of the ideological air that political propaganda breathes) as more important than any specific communication operation.
Psychological efficiency: Effective propaganda must be calibrated to the actual psychological needs and preoccupations of its target population. It cannot create emotions or needs from nothing; it can only amplify, redirect, and exploit needs that already exist. The propagandist must know the audience (their fears, their aspirations, their resentments, their social identities) with precision, and must design messages that fit into these existing structures rather than challenging them.
Ellul's most challenging conclusion is about the complicity of the audience. Modern individuals, he argues, are not passive victims of propaganda imposed on them from outside. They are active participants who seek out propaganda because it performs psychological functions they need: it reduces the intolerable complexity of the modern world to manageable narratives, it provides the emotional satisfaction of moral certainty, and it integrates the individual into a community of shared belief. People do not merely tolerate propaganda; they demand it. Any political communication that offered genuine complexity, genuine uncertainty, and genuine engagement with evidence would be rejected in favour of clearer, emotionally satisfying alternatives. The audience's own psychological needs make them collaborators in their manipulation.
The Toolkit: How Propaganda Actually Works
The specific techniques of propaganda are the applied implementation of everything described in the previous four artifacts. Each technique works by exploiting one or more of the psychological mechanisms documented in the neuroscience, moral psychology, social identity theory, and cognitive science of belief. Understanding the techniques is not merely historically interesting. It is the beginning of immunity, or at least of conscious recognition.
Robert Cialdini's analysis of influence, developed in his 1984 book of the same name, provides the psychological substrate for many of these techniques. His six principles of influence (reciprocity, commitment and consistency, social proof, authority, liking, and scarcity) are not propaganda techniques in themselves, but they are the psychological levers that propaganda techniques are designed to activate. Understanding them together provides a map of the human psychological vulnerabilities that the propagandist exploits: not unique weaknesses of particularly gullible people, but universal features of human social cognition that evolved for different purposes and that systematic manipulation is designed to redirect.
Hasher, Goldstein, and Toppino's 1977 study, and its many subsequent replications, demonstrated that repeated exposure to a statement increases the probability that subjects will rate it as true, even when the statement was initially rated as false, and even when subjects are warned that repeated exposure might affect their judgments. The effect operates below the level of conscious deliberation: subjects do not consciously conclude "I've heard this before, therefore it must be true." They simply find the statement easier to process on subsequent exposure, and this processing fluency is misattributed to evidential support.
The political implications are direct. In a high-volume media environment, the statements that are most frequently repeated (because they serve the interests of those who control media channels) will tend to acquire truth-status independent of their accuracy. This explains the standard propaganda practitioner's maxim that the big lie, repeated often enough, will be believed: not because people are foolish, but because the illusory truth effect operates on everyone, regardless of intelligence or education, at the level of automatic processing that precedes deliberate evaluation.
A 2018 study by Pennycook, Cannon, and Rand documented the illusory truth effect even for statements that subjects initially identified as implausible. Repetition increased perceived accuracy even when subjects should have known better. The implication for political communication in a high-repetition media environment is severe: the volume of exposure matters more than the initial credibility of the claim.
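The finding is simple enough to caricature in a few lines. The sketch below is a toy model, not one drawn from the literature: perceived truth is a function of a statement's prior plausibility plus a fluency term that accumulates with exposure, and every parameter is invented for illustration.

```python
import math

def perceived_truth(prior: float, exposures: int, fluency_gain: float = 0.35) -> float:
    """Toy model: fluency accumulated through repetition is misattributed
    to evidential support. Parameters are invented, not fitted to data."""
    fluency = fluency_gain * math.log1p(exposures)   # diminishing returns
    return 1 / (1 + math.exp(-(prior + fluency)))    # squash to a 0..1 rating

# A statement that starts out implausible (negative prior):
for n in (0, 1, 3, 10):
    print(f"exposures={n:2d}  rated truth={perceived_truth(-1.0, n):.2f}")
# The rating climbs with repetition even though no evidence was ever added.
```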
The Emotional Channel: Why Propaganda Bypasses Reason
The central mystery of propaganda (why it works on educated, intelligent people who know, at some level, that they are being manipulated) is resolved by Artifact 1's neuroscience. Emotion is not opposed to cognition. It is prior to it. The amygdala's threat and reward evaluation precedes the cortex's content processing. By the time deliberate reasoning is engaged with propagandistic content, the emotional framing has already done its work, establishing an affective baseline that colours all subsequent evaluation.
Effective propaganda does not attempt to convince through argument. It attempts to produce an emotional state in which a desired conclusion feels obvious. The argument, if provided at all, is post-hoc rationalisation material, evidence that the System 2 rider can use to justify the emotional position that System 1 has already reached. This is not manipulation in the crude sense of lying. It is manipulation in the precise sense of exploiting the architecture of cognition, specifically, the fact that emotional response precedes and conditions deliberate evaluation.
Hitler's formulation in Mein Kampf (1925) makes the principle explicit: "The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly: it must confine itself to a few points and repeat them over and over." This is not a cynical observation from outside the psychological research; it is an empirical finding from practice that predates the formal study of the illusory truth effect by five decades. The Nazi propagandists understood, from experience, what cognitive psychology later formalised: that repetition produces truth-status independent of content, and that the simplification of complex reality into a few emotionally resonant points is not a compromise of effective communication but its precondition.
The Nazi propaganda apparatus was arguably the most sophisticated single application of mass persuasion principles in the 20th century, and its sophistication lay not in its novelty but in the thoroughness of its application. Every element of Ellul's framework was present: total sociological implication (the Nazi aesthetic pervaded architecture, education, sport, art, and music as well as explicit political communication), emotional activation rather than rational argument, simple repetitive messaging, the construction of a sacred in-group and a demonic out-group, the exploitation of genuine pre-existing grievances (the humiliation of Versailles, the economic devastation of the Weimar years), and the continuous manufacture of spectacle (the rallies, the uniforms, the symbols) designed to produce the intoxicating experience of collective identity that Ellul identifies as propaganda's deepest appeal.
The crucial and uncomfortable point is that the content of Nazi propaganda was, in important respects, irrelevant to its effectiveness. What made it work was not the specific ideology but the emotional and social architecture: the manufacturing of belonging, the identification of an enemy, the promise of restored greatness, the spectacle of collective purpose. These structural elements would have worked for any ideology that could be plausibly mapped onto the psychological needs of the target population. The ideology was the payload; the emotional architecture was the delivery system. And the delivery system is not historically specific.
Propaganda must not investigate the truth objectively, and if it is favourable to the other side, present it according to the theoretical rules of justice; yet it must present only that aspect of the truth which is favourable to its own side. The moment you make even a single exception to this rule, you have begun to undermine the confidence of readers.
Adolf Hitler, Mein Kampf (1925)
The relationship between fear and political manipulation has been systematically studied by the social psychologist Jeff Greenberg and his colleagues in their Terror Management Theory research. As noted in Artifact 1, reminding subjects of their own mortality (temporarily elevating death salience) produces measurable increases in commitment to cultural worldviews, hostility toward those who challenge those worldviews, and preference for strong, certain leadership. Political actors who have discovered this experimentally or empirically (and many have) can use threats, crises, and reminders of vulnerability as tools for generating the psychological conditions under which their particular worldview, their particular leadership style, and their particular proposed solutions are most compellingly received. The manipulation does not require falsifying the threat. It requires emphasising it at the right moment, with the right emotional loading, in a context where the desired political response has already been associated with safety and in-group protection.
This explains one of the most consistently observed patterns in political history: that crises (economic, military, social) reliably produce shifts toward authoritarian politics, not because authoritarianism is objectively more effective in crises, but because crisis activates the threat-response architecture that makes authoritarian solutions emotionally compelling. The propaganda of crisis does not need to invent the crisis. It needs to frame the crisis in terms that make the propagandist's preferred solution feel like the obvious response to an emergency.
The Medium and the Message: How Delivery Shapes Belief
Marshall McLuhan's claim that "the medium is the message", stated in Understanding Media (1964), is often treated as a piece of media-theory esoterica. In the context of propaganda, it is a fundamental insight: the format, structure, and sensory modality of a communication channel shape the cognitive and emotional processing of content independently of what that content says. Propaganda that appears in film activates different emotional processing than the same content in print. Television news structures political reality differently than newspapers. Social media produces a different relationship to political content than either.
Each medium has a characteristic "grammar" that shapes what kinds of content it handles effectively and what kinds it distorts. Television, McLuhan observed, is a "cool" medium (low-definition, requiring audience participation to complete the image) that demands personally compelling, emotionally immediate, visually attractive performers. The "hot" medium of radio is different: it produces intimate, disembodied voice that activates the imagination differently. The political implications of medium grammar became visible in the 1960 Kennedy-Nixon debate: radio listeners judged Nixon the winner; television viewers judged Kennedy (the cooler, more visually composed presence) the winner. The same content, different medium, different conclusion.
Herman and Chomsky's "propaganda model" of media operation provides the most systematic structural account of how mass media in liberal democracies produces ideologically consistent output without requiring central direction or explicit political instruction. Their model identifies five "filters" through which news must pass before reaching the public: ownership concentration in large corporations with economic interests; advertising dependence, which structurally biases toward content that delivers valuable audiences to advertisers; sourcing dependence on official and institutional sources with the access and resources to generate continuous copy; "flak", organised attacks on media that deviate from acceptable narratives; and anti-communist (or analogous) ideology as a systemic framing device.
The propaganda model is not a conspiracy theory. It does not require individual journalists to be consciously propagandistic. It requires only that the structural incentives and constraints of commercial media produce systematic selection and framing of news, emphasising stories and perspectives consistent with the interests of owners, advertisers, and official sources, and marginalising those that are not. Individual journalists who internalise professional norms shaped by these structural constraints will produce ideologically consistent output without any explicit instruction, and will experience themselves as doing objective journalism.
The model has been extensively debated and partially supported empirically. Its most defensible version is not that all media content serves a unified elite interest (the evidence does not support this) but that the structural constraints of commercial media systematically narrow the range of perspectives represented and systematically amplify the perspectives of those with the resources to operate effectively within those constraints. This is propaganda in Ellul's sociological sense: not a ministry directing specific messages, but a structural environment that produces ideological consistency through selection rather than censorship.
Neil Postman's analysis of television in Amusing Ourselves to Death (1985) identifies a specific medium-level propaganda effect that operates without any intentional propaganda at all. Television, Postman argues, structures all content as entertainment: its visual grammar, its temporal rhythm, its commercial structure, and its emphasis on affect over information make it systematically unsuitable for the kind of sustained, nuanced, qualified, evidence-based communication that serious political discourse requires. When political reality is experienced primarily through television, politics itself adapts to the medium: it becomes more performative, more emotional, more focused on personality than policy, more comfortable with simplification than complexity. This is not a result of bad journalism. It is a result of the medium's grammar applied to political content.
We were keeping our eye on 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves. But we had forgotten that alongside Orwell's dark vision, there was another: Huxley's. What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.
Neil Postman, Amusing Ourselves to Death (1985)
Digital Propaganda: The Algorithm as Propagandist
The emergence of social media and algorithmic content recommendation has not created new propaganda techniques. It has provided unprecedented scale, precision, and efficiency for the application of existing techniques, while adding several structural features that make digital propaganda qualitatively different from its 20th-century predecessors.
The first structural difference is personalisation. 20th-century mass propaganda was addressed to a mass: the same message to all members of an audience. Digital propaganda is addressed to individuals: tailored to their specific psychological profile, their demonstrated interests, their established emotional triggers, and their social network structure. The Cambridge Analytica operation around the 2016 American election and the British EU referendum (regardless of its actual effectiveness, which is disputed) demonstrated that the data infrastructure for psychographic micro-targeting of political messages existed and was being deployed. The personalisation of propaganda is not a hypothetical future capability. It is an existing industrial practice.
The algorithmic recommendation systems that govern content distribution on major social media platforms are optimised for engagement, for the maximisation of time-on-platform, clicks, shares, and reactions. The problem is that the emotional states that maximise engagement are not the emotional states that maximise accurate belief formation. Content that provokes outrage, fear, disgust, and tribal loyalty is substantially more engaging than content that informs, qualifies, and complicates. The algorithms that are optimised for engagement are therefore, as a structural consequence rather than a deliberate policy, optimised for the distribution of emotionally activating, tribally polarising, and accuracy-indifferent content.
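The structural claim can be made concrete with a deliberately minimal sketch. Nothing here is any platform's actual system; the fields and weights are invented. The point is that when the ranking objective contains only predicted engagement, accuracy cannot influence the ordering, because it appears nowhere in the objective.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    arousal: float   # emotional intensity, 0..1 (illustrative feature)
    accuracy: float  # reliability, 0..1 (invisible to the ranker)

def predicted_engagement(post: Post) -> float:
    # Arousal drives clicks, shares, and reactions, so it dominates the
    # objective; accuracy is simply absent from it.
    return 0.9 * post.arousal + 0.1

feed = [
    Post("Measured policy analysis", arousal=0.20, accuracy=0.90),
    Post("Outrage: THEY are destroying everything", arousal=0.95, accuracy=0.20),
    Post("Carefully qualified correction", arousal=0.15, accuracy=0.95),
    Post("Fear-laden rumour about the out-group", arousal=0.90, accuracy=0.10),
]

for post in sorted(feed, key=predicted_engagement, reverse=True):
    print(f"{predicted_engagement(post):.2f}  {post.title}")
# The top of the feed is high-arousal and low-accuracy: a structural
# consequence of the objective, not an editorial decision.
```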
Facebook's own internal research, revealed through the 2021 whistleblower disclosures, documented that the platform's engineers had identified a correlation between engagement-optimised content and harmful, divisive, and false content, and that this information had not produced policy changes proportional to the documented harm. The business model and the epistemic health of the population it served were in structural conflict, and the business model won. This is not a story about individual malice. It is a story about institutional incentives producing outcomes that no individual within the institution intended or endorsed.
Eli Pariser's concept of the "filter bubble" (the algorithmically constructed personalised information environment that reflects and amplifies existing beliefs rather than challenging them) describes the digital realisation of Ellul's pre-propaganda environment. Each user inhabits an information ecosystem calibrated to their existing psychological profile, which means calibrated to confirm their existing beliefs, activate their existing emotional responses, and reinforce their existing tribal identities. The filter bubble is the perfect pre-propaganda environment: it maximises the effectiveness of targeted political messages by ensuring that those messages arrive in a context that has been systematically prepared for their reception.
The second structural difference is the dissolution of the distinction between interpersonal and mass communication. Ellul's most important finding about horizontal propaganda (peer-to-peer transmission is more effective than vertical institutional communication because it exploits personal trust) has been industrialised by social media. The share, the retweet, the forward: these are mechanisms for converting mass communication into apparent interpersonal recommendation, wrapping institutional propaganda in the affective packaging of peer endorsement. The friend who shares a political video is, for the receiver, a trusted source rather than a propaganda channel, even if the video was produced by a political operation and algorithmically distributed to reach exactly the social network nodes most likely to amplify it.
The third structural difference is the collapse of shared factual reality. 20th-century propaganda operated in a shared informational environment: even in totalitarian states, there was a reality that the propaganda was distorting, and the distortion could in principle be exposed by reference to that reality. The digital media environment has fragmented the shared informational environment to the point where different communities inhabit genuinely different factual realities (different sets of established facts, different hierarchies of credible sources, different histories of political events) and the cross-community communication required to contest these realities no longer reliably occurs. Hannah Arendt's description of the totalitarian technique of reality replacement (discussed in Artifact 4) has been partially achieved in democratic societies not through state censorship but through voluntary epistemic fragmentation enabled by personalised media.
Data Collection at Scale
Behavioural data from billions of users is aggregated into psychographic profiles of extraordinary granularity, capturing not just stated preferences but inferred emotional states, personality traits, social identities, and cognitive vulnerabilities.
Algorithmic Personalisation
Content recommendation algorithms optimised for engagement serve each user the content most likely to produce high emotional activation, which systematically favours tribally polarising, emotionally intense, and accuracy-indifferent material.
Filter Bubble Construction
The personalisation of information environments produces epistemic isolation: each user's information ecosystem reinforces their existing beliefs, amplifies their existing emotional responses, and excludes the disconfirming information that could challenge them.
Horizontal Amplification
Targeted content reaches users through peer networks (shares, recommendations, viral spread), wrapping institutional or political propaganda in the affective packaging of personal endorsement and exploiting interpersonal trust.
Reality Fragmentation
At scale, the personalisation of information environments produces populations inhabiting genuinely different factual realities (different established facts, different credible sources, different histories), making cross-reality dialogue structurally difficult and cross-reality persuasion structurally improbable.
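A minimal simulation of the personalisation and filter-bubble stages of this pipeline (toy data and invented parameters, not any platform's algorithm) shows how the feedback loop narrows exposure by itself: the recommender serves what matches the profile, consumption sharpens the profile, and diversity collapses without anyone deciding that it should.

```python
import random

random.seed(0)
TOPICS = 8
pool = [{"topic": t} for t in range(TOPICS) for _ in range(20)]  # toy content
profile = [1.0 / TOPICS] * TOPICS  # inferred interests, initially uniform

def recommend(k: int = 5) -> list[dict]:
    # Personalisation: sample content in proportion to inferred interest.
    weights = [profile[item["topic"]] for item in pool]
    return random.choices(pool, weights=weights, k=k)

def update_profile(consumed: list[dict], learning_rate: float = 0.1) -> None:
    # Data collection: every consumption event sharpens the profile.
    for item in consumed:
        profile[item["topic"]] += learning_rate
    total = sum(profile)
    for t in range(TOPICS):
        profile[t] /= total

for day in range(30):
    update_profile(recommend())

print(f"dominant interest: {max(profile):.2f}; "
      f"topics above 5% of exposure: {sum(w > 0.05 for w in profile)}")
# The bubble is an equilibrium of the feedback loop: exposure concentrates
# on a few topics without any editorial intent.
```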
Can Propaganda Be Resisted? The Evidence
Ellul's most despairing conclusion is that propaganda cannot be effectively resisted by individuals operating within propagandistic environments, because resistance requires the kind of sustained, effortful, critical engagement with information that the propaganda environment is specifically designed to prevent. The same media saturation that delivers the propaganda also delivers the fatigue, the distraction, and the emotional exhaustion that undermine critical processing. To resist propaganda is to swim against the current of the entire modern information environment.
This is not entirely wrong. But the research literature on propaganda resistance identifies several approaches that genuinely reduce susceptibility, not to the point of immunity, but to the point of meaningful mitigation.
Prebunking (Inoculation Theory). Sander van der Linden and colleagues have developed an approach to propaganda resistance based on the medical inoculation model: pre-exposing people to weakened forms of propaganda techniques (describing the technique, providing a clear example, and explaining why it is manipulative) before full exposure increases resistance to subsequent full-strength applications. The "inoculation" effect has been replicated across multiple studies and multiple propaganda types. The mechanism is that pre-exposure to the technique prompts the activation of critical processing before the emotional activation that the technique is designed to produce. Once you know that "plain folks" rhetoric is a propaganda technique, the politician's blue-collar performance activates a different kind of processing than it would have without the pre-exposure.
Accuracy Nudges. Research by Gordon Pennycook and David Rand has documented that simply prompting people to consider the accuracy of news headlines (asking "is this accurate?" before they share) significantly reduces the sharing of misinformation. The effect is not about careful evaluation of each headline; it is about activating an accuracy-oriented mindset that then applies to subsequent sharing decisions. The fact that digital platforms are optimised for engagement rather than accuracy means that they never prompt this mindset, but the research suggests that doing so would be effective.
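The logic of the finding can be rendered as a toy decision rule (an illustration of the mechanism, not Pennycook and Rand's model, with invented weights): the accuracy prompt is modelled as raising the weight that perceived accuracy carries in the decision to share.

```python
def share_score(engagement_pull: float, perceived_accuracy: float,
                accuracy_weight: float) -> float:
    # Toy rule: sharing trades emotional pull against perceived accuracy.
    # The nudge's entire effect here is a higher accuracy_weight.
    return (1 - accuracy_weight) * engagement_pull + accuracy_weight * perceived_accuracy

# An engaging but dubious headline (illustrative numbers throughout):
pull, accuracy = 0.9, 0.2
for label, weight in (("no prompt", 0.2), ("after accuracy prompt", 0.6)):
    print(f"{label}: share score = {share_score(pull, accuracy, weight):.2f}")
# 0.76 vs 0.48: same headline, same beliefs about it, different mindset.
```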
Media Literacy and Lateral Reading. Research on professional fact-checking that emerged in the 2010s demonstrated that the standard approach to evaluating online information (reading a source's "about" page and evaluating the content itself) was easily gamed by sophisticated misinformation operations. Professional fact-checkers, researchers found, used a different approach, "lateral reading": immediately opening multiple tabs to check what other sources said about the source in question, rather than evaluating the source's self-presentation. This approach (treating the source, rather than the content, as the primary object of evaluation) proved substantially more effective and is teachable.
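The difference between the two strategies is easy to sketch. Below is a minimal, runnable illustration using a toy in-memory "web"; all names and pages are hypothetical, and the point is only which object gets evaluated: the source's self-presentation, or independent reports about the source.

```python
# Toy "web": a source's own about page plus two independent pages.
TOY_WEB = {
    "plausible-news.example/about": "We are an independent, award-winning newsroom.",
    "factcheck-a.example": "plausible-news.example is run by a lobbying group.",
    "factcheck-b.example": "plausible-news.example has repeatedly published fabrications.",
}

def evaluate_vertically(source: str) -> str:
    # Vertical reading: judge the source by its own self-presentation.
    # This is the strategy that sophisticated operations can game.
    return TOY_WEB[f"{source}/about"]

def evaluate_laterally(source: str) -> list[str]:
    # Lateral reading: leave the site and collect what independent pages
    # say ABOUT the source before weighing anything it publishes.
    return [text for url, text in TOY_WEB.items()
            if source in text and not url.startswith(source)]

print("vertical:", evaluate_vertically("plausible-news.example"))
print("lateral: ", evaluate_laterally("plausible-news.example"))
```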
The most honest conclusion from the propaganda resistance literature is that individual-level interventions (however genuinely effective in controlled studies) are insufficient to the scale of the problem. The structural conditions that make modern populations vulnerable to propaganda at scale (algorithmic information environments optimised for engagement, the collapse of shared factual reality, the atomisation of individuals from traditional community structures, the concentration of media ownership) are not addressed by teaching individuals to read laterally or by activating an accuracy mindset. These structural conditions require structural solutions: regulation of algorithmic recommendation systems, platform liability for demonstrably false content, investment in community journalism and local information infrastructure, and the cultivation of institutional trust through demonstrably accountable journalism.
The individual who learns to resist propaganda is better equipped than the individual who does not. But the individual operating in a structurally propagandistic information environment is fighting a battle of attrition against forces that have orders-of-magnitude more resources, data, and sophistication. Individual epistemic hygiene is necessary but not sufficient. This is the honest conclusion, and it is one that most media literacy education declines to reach, because reaching it implies institutional responsibilities that are politically uncomfortable.
The Examined Propagandee
The title of this section names the most important epistemic position that the study of propaganda makes available: the recognition that one is, oneself, a propagandee. Not a potential target, not a hypothetical victim, not someone who would be susceptible under other conditions, but someone who is, right now, embedded in propagandistic information environments that are actively shaping the beliefs they hold, the emotions they feel about political subjects, and the tribal identities they defend as their own.
This recognition is uncomfortable because it does not exempt the beliefs we are most confident about or most attached to. The beliefs that feel most obviously, self-evidently true are often the most thoroughly propagandised, because the most effective propaganda is invisible to its targets. The certainty that one's political positions are simply rational responses to obvious facts, while opposing positions are the product of manipulation and irrationality, is itself a propaganda effect. It is the integration propaganda of one's own information ecosystem, producing exactly the epistemological self-confidence that makes further manipulation most effective.
The propagandist knows very well that the best propaganda is not that which is openly declared as such, but the propaganda that can present itself as objective reporting, as science, as common sense. The target's conviction that they are beyond propaganda is the propagandist's greatest asset.
After Jacques Ellul, Propagandes (1962)
What does it mean to live as an examined propagandee? It does not mean the suspension of all political belief, the adoption of a permanent scepticism that refuses to take any position, or the false equivalence that treats all perspectives as equally manipulated and therefore equally valid. Some beliefs are better calibrated to reality than others. Some information environments are more reliable than others. Some sources are more accountable than others. The recognition that one's beliefs have been shaped by propagandistic processes does not collapse into nihilism about truth.
What it means, practically, is a specific set of epistemic habits applied to one's own political cognition: asking what the source of a belief is and what interests that source serves; checking whether the evidence for a strongly held political position is the kind of evidence that could in principle disconfirm it; noticing the emotional texture of one's engagement with political content and asking whether that emotional texture is the product of accurate information or of effective emotional manipulation; deliberately seeking out the most sophisticated versions of opposing positions rather than the most easily dismissed; and maintaining, as a standing commitment, the willingness to have been wrong, not in the passive sense of acknowledging fallibility in the abstract, but in the active sense of looking for the specific ways in which one's own information environment has shaped one's beliefs without one's knowledge or consent.
This is the red thread that runs through all seven artifacts in this series. The brain forms beliefs through processes invisible to introspection. Religion institutionalises those beliefs into structures that perform psychological functions their adherents may not recognise. Mythology encodes them in narrative structures that bypass rational evaluation. Ideology weaponises them into tribal identity. Propaganda exploits all of the above at industrial scale. Cults take the same mechanisms to their totalising extreme. And the final artifact (How to Think Clearly) is the question of whether and how it is possible, knowing all of this, to do better. The answer is not simple. But the question is unavoidable once the machinery has been seen.
Ellul, J. (1962/1973). Propaganda: The Formation of Men's Attitudes. Knopf. · Bernays, E. (1928). Propaganda. Horace Liveright. · Lippmann, W. (1922). Public Opinion. Harcourt. · Herman, E. & Chomsky, N. (1988). Manufacturing Consent. Pantheon. · Postman, N. (1985). Amusing Ourselves to Death. Viking. · McLuhan, M. (1964). Understanding Media. McGraw-Hill. · Cialdini, R. (1984). Influence: The Psychology of Persuasion. HarperCollins. · Hasher, L., Goldstein, D. & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1). · Pennycook, G., Cannon, T. & Rand, D. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12). · van der Linden, S. et al. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2). · Pennycook, G. & Rand, D. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5). · Wineburg, S. & McGrew, S. (2017). Lateral reading: reading less and learning more. Stanford History Education Group Working Paper. · Pariser, E. (2011). The Filter Bubble. Penguin Press. · Greenberg, J., Solomon, S. & Pyszczynski, T. (1986). The causes and consequences of a need for self-esteem: A terror management theory. In Baumeister (ed.), Public Self and Private Self. · Pratkanis, A. & Aronson, E. (1991). Age of Propaganda. Freeman.
The Series Continues
Five artifacts in. We have traced the machinery from the individual neuron through religion, mythology, and ideology to the industrial-scale exploitation of all of it. What comes next is the most extreme case: belief systems that do not merely influence their adherents but consume them entirely.
Cults are not aberrations. They are the logical endpoint of mechanisms that are operating, in attenuated form, in every belief system this series has examined. Understanding them is not optional for anyone who wants to understand belief.