People are usually unwilling to consider another viewpoint’s validity, or to change their opinion, unless they believe (consciously or not) that it will benefit them. This usually involves subconsciously prioritizing fallacious shortcuts: reducing cognitive dissonance, defending the ego, and assimilating to the groupthink of one’s social circle to maintain short-term harmony, both internal and external. If, instead, the pursuit of truth is held as their highest principle, they will be cognitively limber: able to reassess their intellectual deficiencies, their biased tendencies, and how their own personal experiences shape their willingness to readily integrate certain ideas and instantly reject others (and how this correlates in no way with objective truth, but instead with ego-fixation).

“…even after the initial evidential basis for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs. That subjects’ theories survived virtually intact is particularly impressive when one contrasts the minimal nature of the evidential base from which subjects’ initial beliefs were derived (i.e., two ‘data points’), with the decisiveness of the discrediting to which that evidence was subjected. In everyday experience our intuitive theories and beliefs are sometimes based on just such inconclusive data, but challenges to such beliefs and the formative data for those beliefs are rarely as decisive as the discrediting procedures employed in this study…” (Anderson et al., 1980)

Intellectual humility does not mean worshiping consensus or following the dogmatic breadcrumbs of “experts”; it means scrutinizing them as you would anything else. The more you learn about yourself and how humans work in general, the better you can discern dogmatic, fallacious evasion and manipulation from reality. What is needed is an intimate trust of one’s own experiences, paired with the willingness to reframe and reinterpret those experiences as new knowledge is continually integrated. Empirical perception is fallible, as is “the data” (Amrhein et al., 2019), but continual reassessment of both creates an interdependent relationship between the two, in which individual experience is not dismissed in favor of textbook mechanics, but is respected and used to enrich them. When the reality sitting right in front of you is ignored and dishonestly reframed to fit a textbook explanation, when the textbook explanation is itself only a prior approximate attempt at explaining reality, we have a cognitive-behavioral problem that borders on the psychotic, but derives from cowardice, laziness, and stupidity.

Ideologies and protocols are “valid” only insofar as they are blueprints, or launchpads, for entirely fluid cascades of behavior tailored to the individual and the shifting contexts of their circumstances and environment. Rigid subscription to protocols and ideology immediately enslaves, cognitively and spiritually: opportunities close up, and the resonant ability of the mind becomes single-minded and myopic. Extracting useful truths and insights from a belief system, then continuing one’s independent growth without becoming indefinitely sucked in, is how one develops nuanced understandings, and in doing so becomes a nuanced individual; a flavored character.


Works Cited

Amrhein, Valentin, et al. “Scientists Rise up against Statistical Significance.” Nature, vol. 567, no. 7748, 2019, pp. 305–307, doi:10.1038/d41586-019-00857-9.

Anderson, Craig A., et al. “Perseverance of Social Theories: The Role of Explanation in the Persistence of Discredited Information.” Journal of Personality and Social Psychology, vol. 39, no. 6, 1980, pp. 1037–1049, doi:10.1037/h0077720.
