- An issue has the community divided into two sides. (Example: One side holds dogmatic beliefs about their system or style, while the other side, practitioners of the same system or style, holds a different view of those same beliefs. Neither side can reach any middle ground, because each clings to a firm, dogmatic belief in its own position.)
- You read, hear, or perceive an issue/concept/idea that agrees with your side and provides solid evidence to support it. What you hear, see, read, or perceive mentions the argument on the other side in summary form but dismisses it as unworthy of consideration. (Example: You do deep, authentic historical research, analyze that research, perform adequate fact-finding to validate the information, and then synthesize a presentation that refutes the other side's information. But at the first point of disagreement with their beliefs, the other side stops, assumes you are wrong, and begins refuting your side, even though they don't know the rest of the story.)
- You remember (falsely) having heard, seen, read, or perceived both sides of the argument. What you really perceived was one side of the argument plus a misleading summary of the other side. (Example: see comments in no. 2 above.)
- When someone presents you with links to better arguments on the other side, you skip them because you think you already know what they will say, and you assume it must be nonsense. For all practical purposes, you are blind to the other argument. It isn't so much that you disagree with the strong form of the argument on the other side as that you don't know it exists, no matter how many times it is put right in front of you. (Comment: This begins the moment the first piece of information associated with a particular person becomes obscured through cognitive blindness. Any further information, however factual or well fact-checked, gets tossed into the circular file, i.e., electronically dumped into /dev/null, just because it is associated with that person's name.)
p.s. “When a person who already believes strongly about something is presented with evidence that is contrary to their belief, they can tend to react in one of, or a combination of three ways; ignore the information (citing that it is irrelevant or blatantly wrong), rationalize the information or belief (citing that their view allows for this information or that the view is correct still by citing other unrelated information, or that they’ve always believed what they believe so it must be true) or react strongly against the information (stating that their already held beliefs and information like it are the only truth, anything to the contrary is false).” - Martin S. Pribble on July 7, 2010 in Thoughts http://martinspribble.com/2010/07/denial-cognitive-dissonance-and-confirmation-bias/