Please take a look at Articles on self-defense/conflict/violence for introductions to the references found in the bibliography page.

Please take a look at my bibliography if you do not see a proper reference to a post.

Please take a look at my Notable Quotes

Hey, Attention on Deck!

Hey, NOTHING here is PERSONAL, get over it - Teach Me and I will Learn!


When you begin to feel like you are a tough guy, a warrior, a master of the martial arts or that you have lived a tough life, just take a moment and get some perspective with the following:


I've stopped knives that were coming to disembowel me

I've clawed for my gun while bullets ripped past me

I've dodged as someone tried to put an ax in my skull

I've fought screaming steel and left rubber on the road to avoid death

I've clawed broken glass out of my body after their opening attack failed

I've spit blood and body parts and broke strangle holds before gouging eyes

I've charged into fires, fought through blizzards and run from tornados

I've survived being hunted by gangs, killers and contract killers

The streets were my home, I hunted in the night and was hunted in turn


Please don't brag to me that you're a survivor because someone hit you. And don't tell me how 'tough' you are because of your training. As much as I've been through I know people who have survived much, much worse. - Marc MacYoung

WARNING, CAVEAT AND NOTE

The postings on this blog are my interpretation of readings, studies and experiences; any errors and omissions are mine and mine alone. The content surrounding the extracts of books (see the bibliography on this blog site) is likewise mine alone, which is why I highly recommend that you read, study, research and fact-check the material for clarity. My effort here is self-clarity toward a fuller understanding of the subject matter. See the bibliography for information on the books. Please note that this article/post is my personal analysis of the subject, and the information used was chosen by me. It is not an analysis piece, because it lacks complete and comprehensive research, was not adequately investigated and is not balanced, i.e., it is my personal view without the views of others, including subject experts. Look at it as “infotainment rather than expert research.” This is an opinion/editorial article/post meant to persuade the reader to think, decide and accept or reject my premise. It is an attempt to cause change or reinforce attitudes, beliefs and values as they apply to martial arts and/or self-defense. It is merely a commentary on the subject presented.


Note: I will endeavor to provide a bibliography and italicize any direct quotes from the materials I use for this blog. If there are mistakes, errors, and/or omissions, I take full responsibility for them as they are mine and mine alone. If you find any mistakes, errors, and/or omissions, please comment and let me know, along with the correct information and/or sources.



“What you are reading right now is a blog. It’s written and posted by me, because I want to. I get no financial remuneration for writing it. I don’t have to meet anyone’s criteria in order to post it. Not only do I not have an employer or publisher, I’m not even constrained by having to please an audience. If people don’t like it, they won’t read it, but I won’t lose anything by it. Provided I don’t break any laws (libel, incitement to violence, etc.), I can post whatever I want. This means that I can write openly and honestly, however controversial my opinions may be. It also means that I could write total bullshit; there is no quality control. I could be biased. I could be insane. I could be trolling. … not all sources are equivalent, and all sources have their pros and cons. These need to be taken into account when evaluating information, and all information should be evaluated.” - God’s Bastard, Sourcing Sources (this applies to this and other blogs by me as well; if you follow the ideas, advice or information here, you are on your own. Don't come crying to me; it is all on you to do the work to make sure it works for you!)



“You should prepare yourself to dedicate at least five or six years to your training and practice to understand the philosophy and physiokinetics of martial arts and karate so that you can understand the true spirit of everything and dedicate your mind, body and spirit to the discipline of the art.” - cejames (note: you are on your own, make sure you get expert hands-on guidance in all things martial and self-defense)



“All I say is by way of discourse, and nothing by way of advice. I should not speak so boldly if it were my due to be believed.” - Montaigne


I am not a leading authority on any one discipline that I write about and teach. It is my hope that the breadth of subjects I have studied provides me a vantage point, and that I introduce the material in my writings as clearly and cohesively as possible. I hope to serve as one who inspires direction in practitioners so they can go on to discover greater teachers and professionals who will build on this fundamental foundation. Find the authorities and synthesize a wholehearted and holistic concept, perception and belief that will not drive your practice but rather inspire it to evolve, grow and prosper. My efforts are born of those who are more experienced and knowledgeable than I. I hope you find that path! See the bibliography I provide for an initial list of experts, professionals and masters of the subjects.

Accepting Truth Against the Tribe

The Courage to Believe What Is Real


by CEJames (researcher/author) & Akira Ichinose (editor/research assistant)

 

DISCLAIMER

The content presented here is for educational and entertainment purposes only and does not constitute legal advice or a certified self-defense methodology. Laws governing the use of force vary by jurisdiction. Readers should consult a qualified attorney and seek instruction from a certified self-defense professional before making any decisions regarding personal protection.

 

 

Introduction: The Problem With the Truth

Let's get one thing out of the way right up front: accepting truth — real, unvarnished, sometimes unwelcome truth — is one of the hardest things a human being can do. Not because the truth is complicated (though it often is), and not because the evidence is hard to find (though it sometimes is). The deepest difficulty is that the truth frequently conflicts with what the people around us believe, and what the people around us believe forms the invisible scaffolding of our identity, our safety, and our sense of belonging.


That's not a small thing. The human brain is not a neutral information processor. It is a survival organ, and the tribe — whatever tribe you belong to — represents survival. For most of our evolutionary history, being cast out of the group meant death. The threat of social rejection activates the same neural pathways as physical pain (Eisenberger, 2012). When the truth threatens group belonging, the brain treats it like a physical threat. No wonder we resist it.


And yet — accepting reality is precisely what equips us to navigate it successfully. Soldiers who misread a battlefield don't win engagements. Doctors who ignore symptoms don't heal patients. Leaders who cling to comfortable myths don't lead their organizations through genuine challenges. 


The cost of untruth is always paid eventually, usually with interest.


So what does it actually take to face reality even when your tribe insists on a different story?


The Tribal Brain: Why We Believe Together

Human beings are, at the most fundamental level, tribal animals. Anthropologists and evolutionary psychologists broadly agree that Homo sapiens evolved in small, interdependent bands of roughly 50 to 150 individuals (Dunbar, 1992). In that environment, the group wasn't merely a social preference — it was the only viable unit for survival. You hunted together, defended territory together, raised children together, and endured adversity together. The individual who broke too far from group norms and beliefs risked exclusion, and exclusion was a death sentence.


This evolutionary heritage shows up powerfully in modern cognition. Social psychologist Henri Tajfel's research on social identity theory demonstrated that people derive a significant portion of their self-concept from group membership, and they are strongly motivated to view their groups favorably (Tajfel & Turner, 1979). In practical terms, this means that when a belief is associated with our group — our political party, our religion, our profession, our culture, our family — challenging that belief feels like an attack on us personally.


Jonathan Haidt, in his work on moral psychology, describes this as the difference between being a 'truth seeker' and being a 'team player' (Haidt, 2012). Most of us, most of the time, are doing something closer to team playing. We are forming beliefs that maintain group cohesion and then constructing rationalizations for those beliefs after the fact. This isn't dishonesty in the ordinary sense — it's the default operating mode of the tribal brain. Recognizing that about yourself is the first genuinely hard step toward doing something different.


Cognitive Dissonance: The Psychological Cost of Honest Updating

When new information conflicts with an existing belief, the result is cognitive dissonance — a state of uncomfortable psychological tension first described by Leon Festinger in 1957 (Festinger, 1957). Dissonance is genuinely unpleasant. The brain registers it as a problem to be solved. The critical insight is that the brain will solve it using whatever method is cheapest, not whatever method is most accurate.


Usually, cheapest means discrediting the new information rather than updating the old belief. 


We dismiss the contradictory evidence as coming from a biased source, or we nitpick its methodology, or we simply change the subject internally. Sometimes we double down — Festinger's famous study of a doomsday cult found that when the prophecy failed to materialize, many members became more committed to the group and its beliefs, not less (Festinger, Riecken, & Schachter, 1956). The shared investment in the belief, combined with the social cost of abandoning it publicly, made doubling down psychologically cheaper than honest revision.


What does it take to override this? One crucial factor is developing what some researchers call 'dissonance tolerance' — a cultivated capacity to sit with the discomfort of uncertainty and contradiction without rushing to resolve it through distortion (Kruglanski & Webster, 1996). This is genuinely a skill, not merely a personality trait, and like any skill it can be developed with practice. Mindfulness-based approaches help, as does the deliberate habit of treating your own beliefs as hypotheses rather than settled facts.


Confirmation Bias: The Evidence Filter

Even when cognitive dissonance isn't directly triggered, our information intake is systematically skewed by confirmation bias — the well-documented tendency to seek out, favor, interpret, and remember information that confirms what we already believe (Nickerson, 1998). This bias operates at every stage of cognition. We read news sources that share our worldview. We follow social media accounts that reinforce our perspectives. We spend time with people who think like we do. When we encounter contrary evidence, we apply higher standards of scrutiny to it than we apply to confirming evidence.


The tribal dimension amplifies this dramatically. When our beliefs are group beliefs — shared by the people we eat, work, worship, and recreate with — the social reinforcement of those beliefs is constant and powerful. Every conversation, every shared meme, every group ritual reaffirms what we already think. Dissenting information, meanwhile, arrives from outsiders who carry the subtle social taint of 'the other side.' We're not just individually biased; we're embedded in social systems that continuously sort and curate our information environment to minimize disconfirmation.


Breaking through this requires what philosopher Quassim Cassam calls 'epistemic self-examination' — an honest accounting of where your beliefs come from, who benefits from them, and what evidence would theoretically change them (Cassam, 2019). If you cannot articulate what would change your mind, you don't actually hold a falsifiable belief. You hold an identity commitment. That distinction is worth taking seriously.


Epistemic Courage: What It Actually Takes

Accepting truth against tribal consensus requires something that doesn't show up in most pop-psychology accounts of critical thinking: courage. Not the abstract approval of courage as a value, but the actual willingness to experience social discomfort, risk real relationships, and absorb the psychological cost of standing apart from the group on a matter that the group considers settled.


The philosopher Plato gave us the Allegory of the Cave — prisoners chained underground mistake shadows on a wall for reality, and the one who escapes and sees the sun returns to the cave only to be mocked and threatened by those who never left (Plato, ca. 380 BCE). It's one of the oldest accounts of what happens to a person who accepts a truth their community hasn't. It doesn't always go well for them in the short run. The history of science and reform is populated with people — Galileo, Semmelweis, Darwin — whose commitment to what the evidence actually showed put them at odds with the dominant community of their time and cost them dearly for it.


This is not to romanticize contrarianism. Most people who disagree with consensus are simply wrong, and the tribal consensus is often right. The goal isn't to value disagreement for its own sake but to be willing to follow evidence and argument wherever they lead, even when the destination is uncomfortable. That requires genuine intellectual humility — the recognition that you are as susceptible to bias, error, and motivated reasoning as anyone else — combined with genuine intellectual courage: the willingness to say what you actually think when the stakes are real (Spiegel, 2012).


The Role of Identity: When Belief Is Who You Are

One of the most underappreciated obstacles to accepting contrary truth is that many of our beliefs aren't really propositions we hold — they're identities we inhabit. When a person says 'I am a Democrat' or 'I am a Christian' or 'I am a Marine' or 'I am an Okinawan martial artist,' they are describing themselves, not merely listing opinions. 


Beliefs that are identity-constitutive are extraordinarily resistant to revision precisely because revising them feels like self-annihilation.


Dan Kahan's research on cultural cognition has documented extensively how intelligent, educated people reason most skillfully not when the stakes are low but when they are motivated to defend an identity (Kahan et al., 2012). That is, greater intelligence and more information sometimes correlate with greater polarization rather than greater convergence. 


Smart people are better at rationalization, and when your tribe's position is under threat, rationalization is precisely what gets deployed.


The healthiest path forward here involves what Douglas Stone and Sheila Heen describe as 'identity updating' — learning to see your beliefs as arising from your values and experience rather than constituting your essence (Stone & Heen, 2014). A person who believes 'I value honesty and good evidence' can update specific factual beliefs without existential crisis, because the core identity is about a commitment to honest inquiry, not to any particular conclusion. That's a much more stable foundation for genuine truth-seeking than any specific set of tribal positions.


Practical Approaches: Getting Honest With Yourself and Others

None of this is merely theoretical. There are practical habits of mind and behavior that meaningfully support the kind of honest epistemic engagement we're describing.


Seek disconfirmation deliberately. 

Rather than asking 'what confirms my view?', train yourself to ask 'what would disprove my view, and does that evidence exist?' This is a form of intellectual hygiene that scientists practice professionally but that anyone can cultivate.


Engage with the strongest opposing arguments. 

The internet has made it easy to strawman opposing views. The actual discipline is to find the best, most thoughtful articulation of a position you disagree with and engage with it seriously. If you can't steelman the other side, you probably don't understand the dispute well enough to hold a confident view.


Separate your tribe from your truth. 

You can belong to a community — value it, love it, serve it — without adopting every belief the community holds. The most respected members of any community are often those who can offer honest internal critique precisely because their commitment to the group is beyond question.


Build relationships across tribal lines. 

Research consistently shows that contact with out-group members reduces prejudice and bias (Pettigrew & Tropp, 2006). When you actually know people who think differently from you, their views become harder to dismiss as obviously wrong or malicious.


Tolerate uncertainty. 

Many tribal beliefs are held with false certainty because uncertainty feels threatening and belonging requires commitment. Developing comfort with 'I don't know' and 'the evidence here is mixed' is genuinely protective against the worst distortions of tribalism.


Watch your emotional temperature. 

Strong negative emotions — contempt, disgust, fear — are reliable indicators that motivated reasoning is operating. When the mere mention of a topic triggers an emotional spike, that's exactly the moment to slow down and examine your reasoning rather than accelerate through it.


Belief Without Proof: Faith, Trust, and Reasonable Commitment

It would be a mistake to suggest that all belief requires ironclad empirical proof. Humans must make decisions under conditions of radical uncertainty, and belief — including belief that outruns the available evidence — is sometimes both necessary and rational. The question is not whether to believe without complete proof but how to do so without simply defaulting to tribal comfort.


The philosopher William James, in his classic essay 'The Will to Believe,' argued that in genuinely live, forced, and momentous decisions — the kind where waiting for complete evidence is not an option — a person is entitled to choose based on their broader view of the world, their experience, and their values (James, 1896). What he was not endorsing was the casual refusal to examine evidence or the motivated dismissal of inconvenient facts. The distinction is between reasonable commitment under uncertainty and willful ignorance in service of group membership.


In martial arts traditions, there is a concept worth noting here: beginners are often told to trust the system before they understand it. You train the kata before you understand why. You defer to the sensei before you've tested the principles yourself. That kind of provisional trust — open to revision as experience accumulates — is entirely different from the tribal belief that is closed to revision because revision would threaten belonging. The former is learning. The latter is ideology.


Conclusion: The Long Work of Seeing Clearly

Accepting truth against the grain of tribal belief is not a one-time achievement. It is a lifelong practice, a discipline of intellectual honesty that must be exercised repeatedly and that will fail sometimes. The tribal pull is powerful precisely because it meets genuine human needs — for belonging, for identity, for community, for the cognitive ease of shared reality. You cannot simply reason your way past those needs. You can only learn to meet them in ways that don't require sacrificing your relationship with what is actually real.


The people who do this best are not necessarily the most intelligent or the most educated. They are, in the consistent findings of the research literature, the most intellectually humble — most aware of their own limitations, most genuinely curious about evidence and argument, most willing to say 'I was wrong, and here's what changed my mind.' That combination of openness and willingness to update is rarer than it should be and more valuable than almost any other cognitive virtue.


It costs something to see clearly. It costs something more to say what you see when the tribe says otherwise. But the alternative — living inside a collectively maintained fiction because the truth is socially expensive — is ultimately a more profound loss. Reality does not negotiate with our preferences. It simply continues to be what it is, and the gap between our models and the actual state of things is the space where errors, failures, and sometimes disasters live.


Closing that gap, even partially, even imperfectly, is among the most important things a thinking person can do.

 

 

Bibliography

Cassam, Q. (2019). Vices of the mind: From the intellectual to the political. Oxford University Press.

Dunbar, R. I. M. (1992). Neocortex size as a constraint on group size in primates. Journal of Human Evolution, 22(6), 469–493.

Eisenberger, N. I. (2012). The pain of social disconnection: Examining the shared neural underpinnings of physical and social pain. Nature Reviews Neuroscience, 13(6), 421–434.

Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.

Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails. University of Minnesota Press.

Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.

James, W. (1896). The will to believe and other essays in popular philosophy. Longmans, Green, and Co.

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732–735.

Kruglanski, A. W., & Webster, D. M. (1996). Motivated closing of the mind: 'Seizing' and 'freezing.' Psychological Review, 103(2), 263–283.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.

Pettigrew, T. F., & Tropp, L. R. (2006). A meta-analytic test of intergroup contact theory. Journal of Personality and Social Psychology, 90(5), 751–783.

Plato. (ca. 380 BCE/1894). The Republic (B. Jowett, Trans.). Oxford University Press.

Spiegel, J. S. (2012). Open-mindedness and intellectual humility. Theory and Research in Education, 10(1), 27–38.

Stone, D., & Heen, S. (2014). Thanks for the feedback: The science and art of receiving feedback well. Viking.

Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 33–47). Brooks/Cole.

 

 

© CEJames & Akira Ichinose. All rights reserved. Educational use only.

The Psychology of Unwavering Loyalty

Understanding Why Facts Don't Always Matter to the MAGA Movement


by CEJames (researcher/author) & Akira Ichinose (editor/research assistant)


DISCLAIMER

The content presented here is for educational and entertainment purposes only and does not constitute legal advice or a certified self-defense methodology. Laws governing the use of force vary by jurisdiction. Readers should consult a qualified attorney and seek instruction from a certified self-defense professional before making any decisions regarding personal protection.

 

Introduction: Why Logic Sometimes Doesn't Land

If you've ever tried to change the mind of a committed MAGA supporter with a mountain of facts and walked away wondering what just happened, you're not alone — and you're not crazy. The honest answer to why evidence so rarely moves the needle isn't that these people are stupid or broken. It's that human loyalty to a political tribe operates on psychological machinery that runs much deeper than logic. Understanding that machinery doesn't require you to agree with anything. It just requires a willingness to look honestly at how minds actually work under social, emotional, and identity pressure. So let's do exactly that.


1. Tribal Identity and the In-Group / Out-Group Dynamic

Human beings are social animals first and rational beings second — and that order matters enormously. For most of our evolutionary history, belonging to a group was the difference between survival and death. The brain evolved to treat group membership as sacred, which means that anything threatening the group triggers something closer to a physical danger response than an intellectual disagreement.


Social Identity Theory, developed by Henri Tajfel and John Turner in the 1970s and 1980s, explains how people derive a significant portion of their self-esteem from the groups they belong to. Once group membership becomes part of who you are — not just how you vote — challenges to the group become challenges to the self. A MAGA identity, for many followers, isn't primarily a political position; it's a complete worldview, a social community, and a source of personal meaning all wrapped into one. Criticize the movement, and you're effectively criticizing the person at the most personal level possible.


Out-group dehumanization is the dark complement to this. Once the in-group is idealized, the out-group — Democrats, the mainstream media, academics, 'the deep state' — gets systematically framed as corrupt, dangerous, and contemptible. This isn't incidental; it's psychologically functional. The more threatening the out-group appears, the more tightly the in-group bonds together. Political messaging that keeps followers in a constant state of siege is not accidental. It is extraordinarily effective tribal psychology.


2. Cognitive Dissonance and the Backfire Effect

Leon Festinger's classic theory of cognitive dissonance tells us that when people hold two conflicting beliefs — or when a deeply held belief is contradicted by evidence — they experience genuine psychological discomfort. The instinct isn't to update the belief. The instinct is to relieve the discomfort as quickly as possible, and the easiest path is to dismiss, distort, or reframe the offending information.


Brendan Nyhan and Jason Reifler documented something even more unsettling in their research: the 'backfire effect,' where presenting people with corrective facts about their political beliefs sometimes causes them to double down on those beliefs rather than revise them. More recent research has complicated this finding — the effect isn't universal — but it does appear reliably in situations where the belief is deeply tied to identity. When facts feel like an attack on who you are, they tend to strengthen the bunker rather than open the door.


For the MAGA movement, this plays out in a recognizable pattern. Every piece of contradictory evidence — court rulings, scientific findings, economic data — gets routed through an interpretive framework that neutralizes it. The court is corrupt. The scientists are paid by globalists. The numbers are fake. The dissonance never fully resolves; it just gets perpetually redirected outward.


3. Confirmation Bias and the Information Ecosystem

Confirmation bias — the well-documented tendency to seek out, favor, and remember information that confirms what we already believe — is not a MAGA exclusive. Every human being does it. But the modern right-wing media ecosystem has been architecturally designed to turbocharge this universal tendency to an extraordinary degree.


When a follower gets their news primarily from Fox News, Newsmax, OAN, Truth Social, certain podcasts, and algorithmically curated social media feeds, they are living inside an information environment that has been specifically constructed to confirm a particular worldview at every turn. The result is not stupidity — it's rational updating on a deeply skewed dataset. The problem isn't how they're thinking; it's what they're thinking with.
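To see how sound reasoning on curated evidence goes wrong, here is a minimal sketch in Python (my own toy illustration, not drawn from any study cited here; the likelihood numbers are arbitrary assumptions). The updater applies Bayes' rule correctly at every step, but because the feed only ever delivers items favoring the hypothesis, correct reasoning still converges on near-certainty.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    # Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1.0 - prior)
    return numerator / denominator

belief = 0.5  # start neutral on hypothesis H
# A curated feed: every item is twice as likely under H as under not-H
# (0.6 vs. 0.3 are arbitrary toy values), and contrary items never arrive.
for _ in range(20):
    belief = bayes_update(belief, likelihood_if_true=0.6,
                          likelihood_if_false=0.3)

print(f"Belief in H after 20 one-sided updates: {belief:.6f}")  # ~0.999999

The arithmetic here is sound; the sampling is not. That is what 'rational updating on a deeply skewed dataset' means in practice.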


Eli Pariser called this the 'filter bubble' phenomenon, and Cass Sunstein has written extensively about how ideologically homogeneous information environments produce 'echo chambers' where beliefs grow more extreme over time precisely because they are never subjected to serious counterevidence or challenge. The echo chamber doesn't just preserve beliefs — it radicalizes them.


4. Motivated Reasoning and Identity-Protective Cognition

Psychologists distinguish between 'accuracy motivation' — genuinely trying to figure out what's true — and 'directional motivation,' which means reasoning toward a conclusion you've already decided is correct. Motivated reasoning is what happens when directional motivation is in the driver's seat and accuracy is riding in the back.


Dan Kahan at Yale Law School has developed the concept of 'identity-protective cognition' to describe how people use their intelligence not to find truth but to defend the beliefs of their cultural group. And here's the counterintuitive part: the smarter and more analytically capable a person is, the better they are at this. High-intelligence individuals can construct more sophisticated rationalizations, more elaborate counterarguments, and more convincing dismissals of inconvenient evidence. Raw cognitive horsepower doesn't automatically lead to better beliefs — it leads to better defended ones.


This is why pointing out the absurdity of a belief so rarely works. The person you're arguing with may be perfectly capable of understanding your argument. They've almost certainly already encountered a version of it. Their reasoning machinery isn't broken; it's just running on a different set of priorities than yours.


5. Authoritarian Personality and the Cult of the Strong Leader

The psychological literature on authoritarianism is extensive and goes back to Theodor Adorno and colleagues' landmark 1950 study 'The Authoritarian Personality.' Refined significantly by Bob Altemeyer in his decades of research on Right-Wing Authoritarianism (RWA), this body of work identifies a cluster of traits that predispose certain individuals toward deference to strong authority figures, hostility toward perceived deviants or out-groups, and a strong preference for social order and conformity over autonomy and dissent.


High-RWA individuals don't just tolerate a strong, domineering leader — they find one deeply psychologically comforting. The projection of absolute certainty and strength, even when that certainty is factually baseless, signals competence and protection to the authoritarian follower. Weakness, nuance, and uncertainty — the hallmarks of honest intellectual engagement — register as vulnerability rather than integrity.


This explains in part why scandals, legal troubles, and demonstrable falsehoods that would destroy any conventional politician's career have had so little effect on Trump's core support. For followers with strong authoritarian orientations, the leader's defiance of conventional norms is a feature, not a bug. It proves he's fighting the establishment on their behalf.


6. Fear, Threat Perception, and the Amygdala's Veto Power

Research by John Jost and colleagues on the psychology of political conservatism, and separately by John Hibbing and colleagues on the biological roots of political ideology, has found consistent evidence that individuals higher in threat sensitivity — measured physiologically, not just by survey — tend toward more conservative and authoritarian political orientations. The amygdala, the brain's threat-detection center, appears to be more reactive in many high-RWA individuals.


MAGA political messaging is saturated with threat framing. Immigrants are 'invaders.' Democrats are 'destroying the country.' The election was 'stolen.' Traditional America is dying. This isn't random noise — it's a deliberate and effective activation of threat psychology. 


When someone's amygdala is firing, the prefrontal cortex — the seat of rational analysis and long-term thinking — loses influence. Fear-based processing is fast, categorical, and resistant to nuance. It is also, from an evolutionary standpoint, exactly how the brain is supposed to work under genuine threat.


The problem is that the threats are often manufactured or wildly exaggerated, but the psychological response they generate is entirely real. You can't fact-check someone out of a fear response. The emotional processing happens faster than the rational processing, and in many cases it sets the terms for how the rational processing proceeds afterward.


7. Repetition, the Illusory Truth Effect, and the Power of the Big Lie

One of the most robustly documented findings in cognitive psychology is the 'illusory truth effect': repeated exposure to a statement increases its perceived truthfulness, regardless of whether the statement is actually true. This isn't a fringe finding — it has been replicated across dozens of studies. Familiarity breeds credibility.


The MAGA movement's most powerful claims — the stolen election, the deep state, the radical left's destruction of America — have been repeated thousands upon thousands of times across every available media channel. Even people who know intellectually that a claim is contested start processing it as familiar, and familiarity starts functioning as a proxy for truth. This is why debunking is so much harder than bunking: the original false claim has had far more airtime than the correction, and the brain keeps score.


This is also the mechanism behind the propaganda principle popularly attributed to Joseph Goebbels early in the twentieth century, one that modern political strategists have rediscovered: 


the 'big lie' is paradoxically more believable than a small one precisely because its sheer audacity makes people assume no one would fabricate something so enormous. The scale of the claim provides its own perverse credibility.


8. Sunk Cost Fallacy and Commitment Escalation

The sunk cost fallacy describes the deeply human tendency to continue investing in a position, a project, or a relationship not because continued investment is rational, but because of the resources already committed. Walking away means admitting that the previous investment was a mistake, and that admission is psychologically painful.


For someone who has been publicly, vocally, and repeatedly supportive of the MAGA movement — who has argued for it with family members, posted about it on social media, organized their social life around it, and built friendships through it — abandoning that position is an extraordinarily high psychological cost. It's not just changing your mind; it's repudiating years of public identity, losing community, and admitting to everyone who disagreed with you that they were right. 


The social and psychological cost of exit is so high that for many people, continued loyalty is genuinely the path of least resistance even when serious doubts arise.


Social psychologist Robert Cialdini's work on commitment and consistency shows how powerfully people feel compelled to remain consistent with their past statements and positions. Once you've committed publicly, internal psychological pressure to remain consistent kicks in and does much of the work that external authority figures would otherwise need to do.


9. Moral Disengagement and the Leader Exception

Albert Bandura's work on moral disengagement identifies the psychological mechanisms through which ordinary people suspend their normal ethical standards when operating within a group or under the influence of an authority structure. These mechanisms include moral justification (the ends justify the means), euphemistic labeling (calling brutal policies 'tough love'), displacement of responsibility (the leader decides, I just follow), and dehumanization of victims.


For MAGA followers confronting behaviors in Trump that they would condemn in anyone else — documented dishonesty, alleged sexual misconduct, financial fraud, contempt for democratic norms — moral disengagement provides the psychological escape hatch. 'He fights dirty because the other side fights dirty.' 'The media lies about everything he does.' 'What about what Clinton did?' These aren't logical defenses; they're moral disengagement mechanisms operating on cue.


The leader-exception phenomenon is particularly powerful. Once a figure has been sufficiently idealized by a movement, that idealization creates an almost bulletproof psychological shield. Evidence against the leader is reprocessed as evidence of the forces arrayed against him, which in turn becomes evidence of how important and threatening he must be to those forces. It's a closed loop that converts attacks into proof of virtue.


10. Collective Narcissism and Grievance Identity

Agnieszka Golec de Zavala and colleagues have developed the concept of 'collective narcissism' — an exaggerated belief in the greatness of one's group combined with a hypersensitive conviction that the group is not receiving the recognition and respect it deserves. Collective narcissism has been empirically linked to intergroup hostility, conspiracy belief, and the perception of persecution even in the absence of actual persecution.


'Make America Great Again' is, at the psychological level, a collective narcissism slogan. It simultaneously asserts the exceptional greatness of the in-group ('America' as understood by the movement's supporters) and implies that this greatness has been stolen, suppressed, or disrespected by out-group forces. The grievance is built into the brand.


Grievance identity is extraordinarily sticky because it is both emotionally satisfying and perpetually self-renewing. Every policy defeat, every critical news story, every condescending comment from an elite becomes fresh evidence of the persecution narrative. The movement cannot be defeated in the conventional sense because every defeat is incorporated into the myth as further proof of how the game is rigged against them.


11. Epistemic Closure and the Rejection of Expertise

Journalist Julian Sanchez popularized the term 'epistemic closure' to describe a political ecosystem in which participants have become so committed to a particular worldview that they've effectively immunized themselves against contrary information. The movement hasn't just rejected specific facts; it has rejected the institutions and methodologies that produce facts — universities, peer-reviewed science, mainstream journalism, federal agencies, the judiciary.


This is psychologically brilliant, if deeply corrosive. Once you've established that the very institutions tasked with verifying facts are corrupt and untrustworthy, you have created a situation in which no external evidence can ever be definitive. The fact-checkers are biased. The polls are fake. The courts are captured. The scientists are bought. It's a perfect epistemological fortress — impregnable precisely because the walls are made of distrust.


Tom Nichols has documented this trend in 'The Death of Expertise,' arguing that American culture has developed a profound and destructive hostility to specialized knowledge. The MAGA movement has turned that cultural tendency into a core political identity. Anti-elitism isn't just a rhetorical posture — for many followers it's a genuine epistemological commitment to the wisdom of the crowd over the claims of credentialed expertise.


Closing Thoughts: What This Means and What It Doesn't

None of what's described here means that MAGA followers are uniquely irrational or uniquely broken as human beings. Every one of these psychological mechanisms — 


  • tribal loyalty, 
  • cognitive dissonance, 
  • confirmation bias, 
  • motivated reasoning, 
  • sunk cost commitment — 


is universal human machinery. Every political movement, left or right, runs on some version of this infrastructure. What distinguishes the MAGA phenomenon is the degree to which these mechanisms have been deliberately and systematically exploited, and the degree to which the movement's identity has been built specifically on epistemic closure — on the institutional rejection of external reality-testing.


If you're trying to have a productive conversation with someone in the movement, social psychology research consistently suggests that direct factual confrontation is among the least effective approaches. What tends to work better is 


  • building genuine interpersonal relationship first, 
  • asking questions rather than making assertions, 
  • finding shared values rather than contested facts, and 
  • being patient with a process that unfolds over months and years rather than a single conversation. 


People rarely change their minds in the moment; they change them slowly, in private, when the social cost of doing so has become manageable.


Understanding these mechanisms is not a counsel of despair. People do leave movements. People do update their worldviews. The mechanisms described here are powerful, but they are not destiny. What they are is a map — and knowing the terrain is the essential first step toward navigating it honestly.


Bibliography

Adorno, T. W., Frenkel-Brunswik, E., Levinson, D. J., & Sanford, R. N. (1950). The Authoritarian Personality. Harper & Row.

Altemeyer, B. (1981). Right-Wing Authoritarianism. University of Manitoba Press.

Altemeyer, B. (2006). The Authoritarians. Available at: www.theauthoritarians.org.

Bandura, A. (1999). Moral disengagement in the perpetration of inhumanities. Personality and Social Psychology Review, 3(3), 193–209.

Cialdini, R. B. (2001). Influence: The Psychology of Persuasion. HarperCollins.

Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.

Golec de Zavala, A., Cichocka, A., Eidelson, R., & Jayawickreme, N. (2009). Collective narcissism and its social consequences. Journal of Personality and Social Psychology, 97(6), 1074–1096.

Hibbing, J. R., Smith, K. B., & Alford, J. R. (2014). Predisposed: Liberals, Conservatives, and the Biology of Political Differences. Routledge.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129(3), 339–375.

Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection: An experimental study. Judgment and Decision Making, 8(4), 407–424.

Kahan, D. M., Peters, E., Dawson, E., & Slovic, P. (2017). Motivated numeracy and enlightened self-government. Behavioural Public Policy, 1(1), 54–86.

Nichols, T. (2017). The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters. Oxford University Press.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.

Sanchez, J. (2010, March 26). Frum, Cocktail Parties, and the Threat of Epistemic Closure. Julian Sanchez [Blog].

Sunstein, C. R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.

Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The Social Psychology of Intergroup Relations (pp. 33–47). Brooks/Cole.

Unkelbach, C., Koch, A., Silva, R. R., & Garcia-Marques, T. (2019). Truth by repetition: Explanations and implications. Current Directions in Psychological Science, 28(3), 247–253.

Westen, D. (2007). The Political Brain: The Role of Emotion in Deciding the Fate of the Nation. PublicAffairs.

Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes' steadfast factual adherence. Political Behavior, 41(1), 135–163.

CEJames & Akira Ichinose