Please take a look at the Articles on self-defense/conflict/violence page for introductions to the references found on the bibliography page.

Please take a look at my bibliography if you do not see a proper reference in a post.

Please take a look at my Notable Quotes.

Hey, Attention on Deck!

Hey, NOTHING here is PERSONAL, get over it - Teach Me and I will Learn!


When you begin to feel like you are a tough guy, a warrior, a master of the martial arts or that you have lived a tough life, just take a moment and get some perspective with the following:


I've stopped knives that were coming to disembowel me

I've clawed for my gun while bullets ripped past me

I've dodged as someone tried to put an ax in my skull

I've fought screaming steel and left rubber on the road to avoid death

I've clawed broken glass out of my body after their opening attack failed

I've spit blood and body parts and broke strangle holds before gouging eyes

I've charged into fires, fought through blizzards and run from tornados

I've survived being hunted by gangs, killers and contract killers

The streets were my home, I hunted in the night and was hunted in turn


Please don't brag to me that you're a survivor because someone hit you. And don't tell me how 'tough' you are because of your training. As much as I've been through I know people who have survived much, much worse. - Marc MacYoung

WARNING, CAVEAT AND NOTE

The postings on this blog are my interpretation of readings, studies and experiences; therefore, errors and omissions are mine and mine alone. The content surrounding the extracts of books (see the bibliography on this blog site) is also mine and mine alone, so any errors and omissions there are likewise mine, which is why I highly recommend that you read, study, research and fact-check the material for clarity yourself. My effort here is self-clarity toward a fuller understanding of the subject matter. See the bibliography for information on the books. Please note that this article/post is my personal analysis of the subject and that the information used was chosen by me. It is not an analysis piece, because it lacks complete and comprehensive research: it was not adequately and completely investigated, and it is not balanced, i.e., it presents my personal view without the views of others, including subject experts. Look at this as “infotainment rather than expert research.” This is an opinion/editorial article/post meant to persuade the reader to think, decide and accept or reject my premise. It is an attempt to cause change or to reinforce attitudes, beliefs and values as they apply to martial arts and/or self-defense. It is merely a commentary on the subject of the particular article presented.


Note: I will endeavor to provide a bibliography and italicize any direct quotes from the materials I use for this blog. If there are mistakes, errors, and/or omissions, I take full responsibility for them as they are mine and mine alone. If you find any mistakes, errors, and/or omissions, please comment and let me know, along with the correct information and/or sources.



“What you are reading right now is a blog. It’s written and posted by me, because I want to. I get no financial remuneration for writing it. I don’t have to meet anyone’s criteria in order to post it. Not only do I not have an employer or publisher, but I’m not even constrained by having to please an audience. If people don’t like it, they won’t read it, but I won’t lose anything by it. Provided I don’t break any laws (libel, incitement to violence, etc.), I can post whatever I want. This means that I can write openly and honestly, however controversial my opinions may be. It also means that I could write total bullshit; there is no quality control. I could be biased. I could be insane. I could be trolling. … not all sources are equivalent, and all sources have their pros and cons. These need to be taken into account when evaluating information, and all information should be evaluated.” - God’s Bastard, Sourcing Sources (this applies to this and other blogs by me as well; if you follow the ideas, advice or information here, you are on your own. Don't come crying to me; it is all on you to do the work to make sure it works for you!)



“You should prepare yourself to dedicate at least five or six years to your training and practice to understand the philosophy and physiokinetics of martial arts and karate, so that you can understand the true spirit of everything and dedicate your mind, body and spirit to the discipline of the art.” - cejames (note: you are on your own; make sure you get expert hands-on guidance in all things martial and self-defense)



“All I say is by way of discourse, and nothing by way of advice. I should not speak so boldly if it were my due to be believed.” - Montaigne


I am not a leading authority on any one discipline that I write about and teach. It is my hope that the breadth of subjects I have studied provides me a vantage point from which to introduce these matters in writing that is as clear and cohesive as possible. I hope to serve as one who inspires direction in practitioners so they can go on to discover greater teachers and professionals who will build on this fundamental foundation. Find the authorities and synthesize a wholehearted and holistic concept, perception and belief that will not drive your practices but rather inspire them to evolve, grow and prosper. My efforts are born of those who are more experienced and knowledgeable than I am. I hope you find that path! See the bibliography I provide for an initial list of experts, professionals and masters of the subjects.

Accepting Truth Against the Tribe

The Courage to Believe What Is Real


by CEJames (researcher/author) & Akira Ichinose (editor/research assistant)

 

DISCLAIMER

The content presented here is for educational and entertainment purposes only and does not constitute legal advice or a certified self-defense methodology. Laws governing the use of force vary by jurisdiction. Readers should consult a qualified attorney and seek instruction from a certified self-defense professional before making any decisions regarding personal protection.

 

 

Introduction: The Problem With the Truth

Let's get one thing out of the way right up front: accepting truth — real, unvarnished, sometimes unwelcome truth — is one of the hardest things a human being can do. Not because the truth is complicated (though it often is), and not because the evidence is hard to find (though it sometimes is). The deepest difficulty is that the truth frequently conflicts with what the people around us believe, and what the people around us believe forms the invisible scaffolding of our identity, our safety, and our sense of belonging.


That's not a small thing. The human brain is not a neutral information processor. It is a survival organ, and the tribe — whatever tribe you belong to — represents survival. For most of our evolutionary history, being cast out of the group meant death. The threat of social rejection activates the same neural pathways as physical pain (Eisenberger, 2012). When the truth threatens group belonging, the brain treats it like a physical threat. No wonder we resist it.


And yet — accepting reality is precisely what equips us to navigate it successfully. Soldiers who misread a battlefield don't win engagements. Doctors who ignore symptoms don't heal patients. Leaders who cling to comfortable myths don't lead their organizations through genuine challenges. 


The cost of untruth is always paid eventually, usually with interest.


So what does it actually take to face reality even when your tribe insists on a different story?


The Tribal Brain: Why We Believe Together

Human beings are, at the most fundamental level, tribal animals. Anthropologists and evolutionary psychologists broadly agree that Homo sapiens evolved in small, interdependent bands of roughly 50 to 150 individuals (Dunbar, 1992). In that environment, the group wasn't merely a social preference — it was the only viable unit for survival. You hunted together, defended territory together, raised children together, and endured adversity together. The individual who broke too far from group norms and beliefs risked exclusion, and exclusion was a death sentence.


This evolutionary heritage shows up powerfully in modern cognition. Social psychologist Henri Tajfel's research on social identity theory demonstrated that people derive a significant portion of their self-concept from group membership, and they are strongly motivated to view their groups favorably (Tajfel & Turner, 1979). In practical terms, this means that when a belief is associated with our group — our political party, our religion, our profession, our culture, our family — challenging that belief feels like an attack on us personally.


Jonathan Haidt, in his work on moral psychology, describes this as the difference between being a 'truth seeker' and being a 'team player' (Haidt, 2012). Most of us, most of the time, are doing something closer to team playing. We are forming beliefs that maintain group cohesion and then constructing rationalizations for those beliefs after the fact. This isn't dishonesty in the ordinary sense — it's the default operating mode of the tribal brain. Recognizing that about yourself is the first genuinely hard step toward doing something different.


Cognitive Dissonance: The Psychological Cost of Honest Updating

When new information conflicts with an existing belief, the result is cognitive dissonance — a state of uncomfortable psychological tension first described by Leon Festinger in 1957 (Festinger, 1957). Dissonance is genuinely unpleasant. The brain registers it as a problem to be solved. The critical insight is that the brain will solve it using whatever method is cheapest, not whatever method is most accurate.


Usually, cheapest means discrediting the new information rather than updating the old belief. 


We dismiss the contradictory evidence as coming from a biased source, or we nitpick its methodology, or we simply change the subject internally. Sometimes we double down — Festinger's famous study of a doomsday cult found that when the prophecy failed to materialize, many members became more committed to the group and its beliefs, not less (Festinger, Riecken, & Schachter, 1956). The shared investment in the belief, combined with the social cost of abandoning it publicly, made doubling down psychologically cheaper than honest revision.


What does it take to override this? One crucial factor is developing what some researchers call 'dissonance tolerance' — a cultivated capacity to sit with the discomfort of uncertainty and contradiction without rushing to resolve it through distortion (Kruglanski & Webster, 1996). This is genuinely a skill, not merely a personality trait, and like any skill it can be developed with practice. Mindfulness-based approaches help, as does the deliberate habit of treating your own beliefs as hypotheses rather than settled facts.


Confirmation Bias: The Evidence Filter

Even when cognitive dissonance isn't directly triggered, our information intake is systematically skewed by confirmation bias — the well-documented tendency to seek out, favor, interpret, and remember information that confirms what we already believe (Nickerson, 1998). This bias operates at every stage of cognition. We read news sources that share our worldview. We follow social media accounts that reinforce our perspectives. We spend time with people who think like we do. When we encounter contrary evidence, we apply higher standards of scrutiny to it than we apply to confirming evidence.


The tribal dimension amplifies this dramatically. When our beliefs are group beliefs — shared by the people we eat, work, worship, and recreate with — the social reinforcement of those beliefs is constant and powerful. Every conversation, every shared meme, every group ritual reaffirms what we already think. Dissenting information, meanwhile, arrives from outsiders who carry the subtle social taint of 'the other side.' We're not just individually biased; we're embedded in social systems that continuously sort and curate our information environment to minimize disconfirmation.


Breaking through this requires what philosopher Quassim Cassam calls 'epistemic self-examination' — an honest accounting of where your beliefs come from, who benefits from them, and what evidence would theoretically change them (Cassam, 2019). If you cannot articulate what would change your mind, you don't actually hold a falsifiable belief. You hold an identity commitment. That distinction is worth taking seriously.


Epistemic Courage: What It Actually Takes

Accepting truth against tribal consensus requires something that doesn't show up in most pop-psychology accounts of critical thinking: courage. Not the abstract approval of courage as a value, but the actual willingness to experience social discomfort, risk real relationships, and absorb the psychological cost of standing apart from the group on a matter that the group considers settled.


The philosopher Plato gave us the Allegory of the Cave: prisoners chained underground mistake shadows on a wall for reality, and the one who escapes and sees the sun returns to the cave only to be mocked and threatened by those who never left (Plato, ca. 380 BCE). It's one of the oldest accounts of what happens to a person who accepts a truth their community hasn't. It doesn't always go well for them in the short run. The history of science and reform is populated with people — Galileo, Semmelweis, Darwin — whose commitment to what the evidence actually showed put them at odds with the dominant community of their time and cost them dearly for it.


This is not to romanticize contrarianism. Most people who disagree with consensus are simply wrong, and the tribal consensus is often right. The goal isn't to value disagreement for its own sake but to be willing to follow evidence and argument wherever they lead, even when the destination is uncomfortable. That requires genuine intellectual humility — the recognition that you are as susceptible to bias, error, and motivated reasoning as anyone else — combined with genuine intellectual courage: the willingness to say what you actually think when the stakes are real (Spiegel, 2012).


The Role of Identity: When Belief Is Who You Are

One of the most underappreciated obstacles to accepting contrary truth is that many of our beliefs aren't really propositions we hold — they're identities we inhabit. When a person says 'I am a Democrat' or 'I am a Christian' or 'I am a Marine' or 'I am an Okinawan martial artist,' they are describing themselves, not merely listing opinions. 


Beliefs that are identity-constitutive are extraordinarily resistant to revision precisely because revising them feels like self-annihilation.


Dan Kahan's research on cultural cognition has extensively documented that intelligent, educated people reason most sophisticatedly not when the stakes are low but when they are motivated to defend an identity (Kahan et al., 2012). That is, greater intelligence and more information sometimes correlate with greater polarization rather than greater convergence.


Smart people are better at rationalization, and when your tribe's position is under threat, rationalization is precisely what gets deployed.


The healthiest path forward here involves what Douglas Stone and Sheila Heen call 'identity updating' — learning to see your beliefs as arising from your values and experience rather than constituting your essence (Stone & Heen, 2014). A person who believes 'I value honesty and good evidence' can update specific factual beliefs without existential crisis, because the core identity is a commitment to honest inquiry, not to any particular conclusion. That's a much more stable foundation for genuine truth-seeking than any specific set of tribal positions.


Practical Approaches: Getting Honest With Yourself and Others

None of this is merely theoretical. There are practical habits of mind and behavior that meaningfully support the kind of honest epistemic engagement we're describing.


Seek disconfirmation deliberately. 

Rather than asking 'what confirms my view?', train yourself to ask 'what would disprove my view, and does that evidence exist?' This is a form of intellectual hygiene that scientists practice professionally but that anyone can cultivate.


Engage with the strongest opposing arguments. 

The internet has made it easy to strawman opposing views. The actual discipline is to find the best, most thoughtful articulation of a position you disagree with and engage with it seriously. If you can't steelman the other side, you probably don't understand the dispute well enough to hold a confident view.


Separate your tribe from your truth. 

You can belong to a community — value it, love it, serve it — without adopting every belief the community holds. The most respected members of any community are often those who can offer honest internal critique precisely because their commitment to the group is beyond question.


Build relationships across tribal lines. 

Research consistently shows that contact with out-group members reduces prejudice and bias (Pettigrew & Tropp, 2006). When you actually know people who think differently from you, their views become harder to dismiss as obviously wrong or malicious.


Tolerate uncertainty. 

Many tribal beliefs are held with false certainty because uncertainty feels threatening and belonging requires commitment. Developing comfort with 'I don't know' and 'the evidence here is mixed' is genuinely protective against the worst distortions of tribalism.


Watch your emotional temperature. 

Strong negative emotions — contempt, disgust, fear — are reliable indicators that motivated reasoning is operating. When the mere mention of a topic triggers an emotional spike, that's exactly the moment to slow down and examine your reasoning rather than accelerate through it.


Belief Without Proof: Faith, Trust, and Reasonable Commitment

It would be a mistake to suggest that all belief requires ironclad empirical proof. Humans must make decisions under conditions of radical uncertainty, and belief — including belief that outruns the available evidence — is sometimes both necessary and rational. The question is not whether to believe without complete proof but how to do so without simply defaulting to tribal comfort.


The philosopher William James, in his classic essay 'The Will to Believe,' argued that in genuinely live, forced, and momentous decisions — the kind where waiting for complete evidence is not an option — a person is entitled to choose based on their broader view of the world, their experience, and their values (James, 1896). What he was not endorsing was the casual refusal to examine evidence or the motivated dismissal of inconvenient facts. The distinction is between reasonable commitment under uncertainty and willful ignorance in service of group membership.


In martial arts traditions, there is a concept worth noting here: beginners are often told to trust the system before they understand it. You train the kata before you understand why. You defer to the sensei before you've tested the principles yourself. That kind of provisional trust — open to revision as experience accumulates — is entirely different from the tribal belief that is closed to revision because revision would threaten belonging. The former is learning. The latter is ideology.


Conclusion: The Long Work of Seeing Clearly

Accepting truth against the grain of tribal belief is not a one-time achievement. It is a lifelong practice, a discipline of intellectual honesty that must be exercised repeatedly and that will fail sometimes. The tribal pull is powerful precisely because it meets genuine human needs — for belonging, for identity, for community, for the cognitive ease of shared reality. You cannot simply reason your way past those needs. You can only learn to meet them in ways that don't require sacrificing your relationship with what is actually real.


The people who do this best are not necessarily the most intelligent or the most educated. They are, in the consistent findings of the research literature, the most intellectually humble — most aware of their own limitations, most genuinely curious about evidence and argument, most willing to say 'I was wrong, and here's what changed my mind.' That combination of openness and willingness to update is rarer than it should be and more valuable than almost any other cognitive virtue.


It costs something to see clearly. It costs something more to say what you see when the tribe says otherwise. But the alternative — living inside a collectively maintained fiction because the truth is socially expensive — is ultimately a more profound loss. Reality does not negotiate with our preferences. It simply continues to be what it is, and the gap between our models and the actual state of things is the space where errors, failures, and sometimes disasters live.


Closing that gap, even partially, even imperfectly, is among the most important things a thinking person can do.

 

 

Bibliography

Cassam, Q. (2019). Vices of the mind: From the intellectual to the political. Oxford University Press.

Dunbar, R. I. M. (1992). Neocortex size as a constraint on group size in primates. Journal of Human Evolution, 22(6), 469–493.

Eisenberger, N. I. (2012). The pain of social disconnection: Examining the shared neural underpinnings of physical and social pain. Nature Reviews Neuroscience, 13(6), 421–434.

Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.

Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails. University of Minnesota Press.

Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.

James, W. (1896). The will to believe and other essays in popular philosophy. Longmans, Green, and Co.

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732–735.

Kruglanski, A. W., & Webster, D. M. (1996). Motivated closing of the mind: 'Seizing' and 'freezing.' Psychological Review, 103(2), 263–283.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.

Pettigrew, T. F., & Tropp, L. R. (2006). A meta-analytic test of intergroup contact theory. Journal of Personality and Social Psychology, 90(5), 751–783.

Plato. (ca. 380 BCE/1894). The Republic (B. Jowett, Trans.). Oxford University Press.

Spiegel, J. S. (2012). Open-mindedness and intellectual humility. Theory and Research in Education, 10(1), 27–38.

Stone, D., & Heen, S. (2014). Thanks for the feedback: The science and art of receiving feedback well. Viking.

Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 33–47). Brooks/Cole.

 

 

© CEJames & Akira Ichinose. All rights reserved. Educational use only.
