Reality is that which, when you stop believing in it, doesn’t go away.
– Philip K. Dick
[epistemic status: personal impression on a topic in which I have no particular competence. Maybe somebody has already said this somewhere, or maybe the connection I am proposing does not stand up to closer examination]
In 1942, the physiologist Walter Cannon proposed that Voodoo death is real. The idea (roughly speaking) was that a Voodoo believer could be so scared of a curse as to literally die of fright. Until recently, this idea was widely accepted [1].
At that time, leading anthropologists (like Lévi-Strauss and Mead) advocated a radical form of moral relativism: at the risk of strawmanning them, my understanding is that they believed that every social custom is justified in its context, and morally correct inside the culture which generated it. Starting in the 1970s, some anthropologists believed they had discovered new, culture-specific states of consciousness, proving the “reality” of magic rituals [2].
The Sapir-Whorf hypothesis, which grew popular in the forties, asserts that the world in which a person lives is shaped by the language she speaks. An expert in Uto-Aztecan languages, Whorf published several influential research papers, arguing among other things that the Hopi language lacked a concept of linear time comparable to that of European languages [3]. Citing the Hopi became a fashionable topos among social scientists [4].
In 1975, Paul Feyerabend published his famous pamphlet Against Method, advocating the equivalence of all ways of knowing [5] (science is more powerful only because it is the wisdom of a more powerful civilization, not because of any methodological superiority).
I think that all these separate cultural trends can be summed up, without too much loss of information, by the following statement: between 1940 and 1980, many educated people found arguments of the form “If you really believe in something, that thing is real for you” very compelling. The bounds 1940-1980 are very arbitrary, and surely in some academic circles this way of thinking has never fallen from grace. Postmodernist philosophy would probably not have been possible without this tenet.
As postmodernists like to point out, every idea is the fruit of its time. Here, I would like to suggest that the success of postmodernism in that period may have been helped by a particular historical contingency: the rise of mass communication.
Then–motion pictures in the early twentieth century. Radio. Television. Things began to have mass.
– R. Bradbury, Fahrenheit 451
With the experience of totalitarian regimes, the world had learned the power of propaganda. The most famous dystopian novels of the time, Brave New World and 1984, dealt with the fear that humans could be made to believe anything with the right education [6].
Meanwhile, in capitalist countries, social traditions and customs were overhauled by corporate advertising campaigns. The citizens of the free world started to buy and eat cereal for breakfast, to get engaged with diamond rings, to go skiing and to sunbathe – buying the necessary equipment.
To many observers it appeared that mass media could reshape, with frightening ease, the moral and factual beliefs of nations. To grasp the extent of this concern, consider that the Second Vatican Council published only one decree explicitly concerned with the dangers posed by new technologies (Inter Mirifica) – and it is not about nuclear weapons, but about mass media.
And if it is so easy to brainwash people, what assures us that the things we believe are real? If there are so many false beliefs, then the probability that our worldview is the only correct one must be very tiny.
Radical skepticism has always existed; but I imagine that, in the epoch we are discussing, it was tempting to think that one’s own truths were as relative as everyone else’s. It would be a grand and beautiful symmetry, and I can see how one could feel extremely wise in believing in that symmetry (which, incidentally, would automatically protect you from believing false things, and make you intellectually superior to anyone who still did).
The age of mass media has been an age in which logic has been held in disdain by vast segments of the intellectual community. What I wrote above makes me wonder if we can posit a causal connection.
Logic was first formalized in Greek cities, and taught in the schools of rhetoric as a tool to help win public debates. Nowadays people spend less time passively consuming content, and more time interacting (and clashing) with each other in public debates. If the connection I am proposing here makes sense, could we expect this change to eventually result in a renewed appreciation of objectivity and logical reasoning?
[1] Scott Alexander, in Devoodooifying Psychology, links this review of the (very scarce) evidence supporting this claim.
[2] https://en.wikipedia.org/wiki/Transpersonal_anthropology
[3] This NativLang video provides a summary of the question of Hopi time.
[4] See for example P. Dell, The Hopi Family Therapist and the Aristotelian Parents, 1980: The Hopi, instructed from birth in an optimistic and process‐view of the world, provide a worthy model for family therapists who too often succumb to the pessimistic and thing‐view of their Western World. […] To the extent that a therapist remains unaware of his own Aristotelian epistemological heritage, his or her ability to “think systems” will be impeded, as may his or her therapeutic effectiveness.
[5] Excerpt from p. 188: An anthropologist trying to discover the cosmology of his chosen tribe and the way in which it is mirrored in language, in the arts, in daily life, first learns the language and the basic social habits; he inquires how they are related to other activities, including such prima facie unimportant activities as milking cows and cooking meals; he tries to identify key ideas. His attention to minutiae is not the result of a misguided urge for completeness but of the realization that what looks insignificant to one way of thinking (and perceiving) may play a most important role in another. (The differences between the paper-and-pencil operations of a Lorentzian and those of an Einsteinian are often minute, if discernible at all; yet they reflect a major clash of ideologies.) Having found the key ideas the anthropologist tries to understand them. This he does in the same way in which he originally gained an understanding of his own language, including the language of the special profession that provides him with an income. He internalizes the ideas so that their connections are firmly engraved in his memory and his reactions, and can be produced at will. ‘The native society has to be in the anthropologist himself and not merely in his notebooks if he is to understand it.’ This process must be kept free from external interference. For example, the researcher must not try to get a better hold on the ideas of the tribe by likening them to ideas he already knows, or finds more comprehensible or more precise. On no account must he attempt a ‘logical reconstruction’. Such a procedure would tie him to the known, or to what is preferred by certain groups, and would forever prevent him from grasping the unknown worldview he is examining. Having completed his study, the anthropologist carries within himself both the native society and his own background, and he may now start comparing the two. The comparison decides whether the native way of thinking can be reproduced in European terms (provided there is a unique set of ‘European terms’), or whether it has a ‘logic’ of its own, not found in any Western language. In the course of the comparison the anthropologist may rephrase certain native ideas in English. This does not mean that English as spoken independently of the comparison already contains native ideas.