The struggle against scientism
Science is great at many things, but it is not the only path to enlightenment.
Written by Bo Winegard.
About scientism, I was wrong. I once praised the term, arguing that it should be embraced rather than derided. Science, I believed, was the only reliable path to knowledge, while all other approaches led only to fleeting insights, idle speculations, and futile debates. Consequently, I maintained that any discipline aiming at empirical knowledge—even history, sociology, and philosophy—should adopt the scientific method.
I did not dismiss art or literature, both of which I have always deeply admired. Yet I argued that expecting a poem or a painting to provide dependable empirical knowledge is as misguided as expecting a wall of limestone to engage in a dialogue about Socrates. It is a category mistake. Art is about experience, not knowledge.
I went even further. I agreed with Sam Harris that science not only can but should guide our morality. Believing that psychological well-being was the sole intrinsic value in the universe and the ultimate measure of all ethical decisions, I embraced and championed utilitarianism. Once utilitarianism was universally accepted, I thought, we could finally address intractable moral debates through science, using its methods to reveal the psychological effects of competing positions. Naturally, the position that maximized well-being would be the correct one. Woe to Kant and Aristotle.
My being wrong is not terribly interesting or edifying since it is depressingly common. But my reconsideration and ultimate rejection of scientism might be.
I should note that, although I once embraced scientism, I was always skeptical of the softer sciences’ appropriation of the language of physics and chemistry. And I did not endorse a kind of methodological imperialism in which all branches of science are expected to adopt (or mimic) the strategies and jargon of physics. The notion that the tools used to explain planetary motion or electricity could also explain something as complex as social behavior is implausible, and the resulting efforts are often unintentionally comical.
Here, for instance, is a figure from the social psychologist Kurt Lewin’s 1943 article, Defining the ‘Field at a Given Time’.
So far as I can tell, the entire article is full of banal observations cloaked in needlessly complicated jargon. Similar examples proliferate in the social sciences, an academic domain notorious for its lack of productivity and marred by scandals, including a replication crisis and numerous documented cases of fraud. The hope seems to be that by borrowing enough terminology from physics, social scientists might elevate their field into a rigorous, progressive discipline worthy of awe and esteem.
The abject failure of social psychology highlights the sterility of such illusory rigor—that is, of scientism. For over eighty years, social psychologists have grappled with the mysteries of human personality and social behavior through the experimental method, yet even an optimist must confess that the results have been disappointing. Many once-venerated studies fail to replicate, and even those that do often provide little more than all-too-clever confirmations of insights that astute observers have known since the time of the Ancient Greeks. In truth, a novel by Tolstoy or Dostoevsky likely offers deeper revelations about human nature—and it is certainly more entertaining.
That social scientists and others in the humanities often adopt scientific argot to create an illusion of precision and rigor may be lamentable, but it is not fatal to scientism. After all, most proponents of scientism do not advocate using the language of quantum theory to explicate The Scarlet Letter. Rather, they endorse applying the methods and principles of science—skepticism, empiricism, and experimentalism—to explore the many mysteries of the world, whether physical, mental, or spiritual.
Indeed, a hallmark of scientism is its capacious view of science as encompassing any rational, skeptical, and empirical attempt to engage with the world. As Sam Harris observed, “When you are adhering to the highest standards of logic and evidence, you are thinking scientifically.” For this very reason, proponents of scientism believe that embracing the methods of science can liberate many unproductive fields—such as moral philosophy—from the stifling fetters of antiquated norms and ideas.
But this is a vain hope, and the project of scientism is doomed. Its central mission—to extend science into fields as distant from physics and chemistry as literary studies and art criticism—can succeed only by disfiguring reality on a Procrustean bed of scientific hubris. For reality is replete with entities, agents, minds, morals, and meanings that cannot be subdued by science. There is more in heaven and earth than is dreamt of in our (natural) philosophy.
Pluralism must prevail. Poetry, literature, art, and music bring us into contact with the profound mystery of existence. They do not merely provide fleeting experiences; they offer transformative encounters with reality, yielding wisdom and insights inaccessible through science alone.
None of this, of course, is meant to diminish science. If one wishes to predict the path of a bowling ball or the trajectory of a storm, to understand the progression of a disease or the causes of a death, to send a rocket to the moon or to estimate the distance of a star, science is indispensable. The Scientific Revolution—and the West’s continued commitment to rigorous, intrepid exploration of the universe—has eradicated many stifling superstitions and lifted humanity from a mire of ignorance to the heights of enlightenment.
These are great achievements. But like man himself, science has got to know its limitations. Scientism transgresses these, promoting an imperial ideology that subjugates other disciplines while denigrating distinct ways of knowing and interacting with the world. In the place of a tolerant pluralism, it offers an oppressive monism.
Consider the claim that science can help us (or allow us) to determine values—as the subtitle of Sam Harris’s popular and provocative book, The Moral Landscape, proclaims. Some defenders of scientism, such as Steven Pinker, forward a judicious version of this proposition, contending that:
…in combination with a few unexceptionable convictions— that all of us value our own welfare and that we are social beings who impinge on each other and can negotiate codes of conduct—the scientific facts militate toward a defensible morality, namely adhering to principles that maximize the flourishing of humans and other sentient beings.
Others, such as the aforementioned Sam Harris, forward a more sweeping version, arguing that:
Morality and values depend on the existence of conscious minds—and specifically on the fact that such minds can experience various forms of well-being and suffering in this universe. Conscious minds and their states are natural phenomena, of course, fully constrained by the laws of Nature (whatever those turn out to be). Therefore, there must be right and wrong answers to questions of morality and values that potentially fall within the purview of science. On this view, some people and cultures will be right (to a greater or lesser degree), and some will be wrong, with respect to what they deem important in life.
Softened by the adjective “defensible,” Pinker’s argument is largely unobjectionable. However, the nature of the “scientific facts” that “militate toward” utilitarianism remains unclear. For that matter, the distinction between a “scientific fact” and what one might call an “ordinary fact” is equally unclear. Science here seems to be an honorific term, perhaps like “true love” or “authentic barbecue.” (Using science—and its cognates—as titles of esteem is a common sign of scientism.)
Harris’s argument, by contrast, represents a bolder form of scientism—and it is fallacious.
In the passage cited above, Harris moves from the premises that “morality and values depend on the existence of conscious minds” and that “conscious minds and their states are natural phenomena” to the conclusion that “there must be right and wrong answers to questions of morality.” This is a clear non sequitur. To see this, consider a similar argument: “Meaning depends on the existence of conscious minds. Conscious minds are natural phenomena. Therefore, there must be right and wrong answers to the meaning of life.”
The fallacy becomes even more apparent when applied to preferences in food, films, or even fabrics, further illustrating the implausibility of Harris’s conclusion, which, in any case, is not warranted by his premises.[1]
A charitable restatement of Harris’s argument is useful, as it makes his position more plausible while still illustrating its flaws. Here’s one version: It is objectively true that pain is unpleasant, and that pleasure is, well, pleasant. As rational animals, humans generally desire and seek pleasure while deploring and avoiding pain. Therefore, any moral system that increases pain (without increasing pleasure) or decreases pleasure (without decreasing pain), relative to other moral systems, is objectively wrong. This is simply a fact about humans; it requires no metaphysical commitments, no religious faith, no bizarre logic, and no inscrutable epiphanies. And since this is true—and since science can discover what most reliably causes pain and pleasure—science can and should guide our morality.
The astute reader will notice that there is still a gap in the logic, a missing step, which was first made famous by David Hume, and is now known as the is-ought problem. For Harris’s argument to work, one must move from an empirical description of the world to a statement about how it ought to be. According to Hume, however, this move is not logically justified, for nothing of moral consequence follows logically from an empirical assertion or description. Moral conclusions can follow only from moral premises. “A and B; therefore moral claim C” only works logically if A or B has moral content, e.g., “Humans do not like pain, and it is bad to force humans to endure things they do not like; therefore, it is bad to cause humans pain.”
Of course, Harris is not unaware of this problem. He just considers it overstated and largely irrelevant. Perhaps this is why his counterarguments are unpersuasive and often beside the point. For example, he contends:
Following David Hume, many philosophers think that ‘should’ and ‘ought’ can only be derived from our existing desires and goals—otherwise, there simply isn’t any moral sense to be made of what “is.” But this skirts the essential point: Some people don’t know what they’re missing. Thus, their existing desires and goals are not necessarily a guide to the moral landscape. In fact, it is perfectly coherent to say that all of us live, to one or another degree, in ignorance of our deepest possible interests.
This statement evinces a misunderstanding of the is-ought problem, failing to address the deeper issue of extrapolating from what is to what ought to be. A philosopher who accepts Hume’s argument is not committed to the absurd position that people always know what is in their own deepest interests. Rather, such a philosopher holds that moral claims are always grounded in human desires and cannot be derived solely from descriptions of what is true about the world.
For instance, the assertion, “Torturing innocent people causes gratuitous suffering,” is undoubtedly true. However, moving from that fact to the moral claim, “and therefore it is wrong,” requires accepting some prior moral premise—explicitly or implicitly—such as, “Causing gratuitous suffering is wrong.”
Nevertheless, in my view, the is-ought problem is not the primary obstacle to a scientifically grounded morality. After all, if there were universal agreement on the “ought,” the “is” would lead inexorably to that “ought,” and empirical truths would indeed entail moral truths. If everybody agreed that decreasing happiness by 0.2% is worse than increasing freedom by 1%, then everybody would agree that a policy that did this was immoral and ought to be rejected.
The real obstacle is value pluralism. Humans possess diverse values and desires, which often clash—not only within large, heterogeneous societies but also within the soul of a single individual. These persistent, perhaps intractable, conflicts have inspired countless plays, novels, and films, in which characters wrestle with the world and themselves to resolve moral discord. Such conflicts resist easy answers and certainly do not submit to the tools of science.
From Antigone to Hamlet to Anna Karenina, elevated examples of this theme abound. But the excellent genre film Gone Baby Gone offers a compelling case to consider. In the film, two police officers collude with a mistreated child’s uncle to take a child, Amanda, from her selfish, drug-addicted mother, Helene, and place her into the loving home of one of the officers by faking her death. Private detectives Patrick Kenzie and his girlfriend, Angie Gennaro, eventually discover the deception. Patrick wants to call the police and return Amanda to her mother, but Angie argues that the child is better off remaining in a loving home. The two are forced to grapple with an impossible moral choice.
Angie: She’s happy. She’s happy here. I saw her.
Patrick: Angie, don’t do this.
Angie: If you call the police, they’ll send her back.
Patrick: We’re not sending her anywhere. Helene is her mother.
Angie: She’s better off here.
Patrick: Why? Because he has money and he makes her sandwiches?
Angie: Because he loves her.
Patrick: Helene loves her too.
Angie: Helene doesn’t treat her that way.
This conflict illustrates the enduring tension between deontological ethics and utilitarianism, with Patrick appealing to rights (deontological ethics) and Angie appealing to subjective well-being (utilitarian ethics). While I cannot say whether Patrick’s decision was the right one, I am confident that science cannot provide an answer to this troubling moral question.
It is true, of course, that morally interested observers would weigh the ramifications of each decision, carefully considering the psychological and legal costs and benefits before formulating an ethical argument. It is also true that if stealing children from mothers caused no distress or unhappiness, we might view Angie’s pleas to leave Amanda with the police officer in a different light. Yet, even after thoroughly examining the facts and contemplating the relevant psychological effects, we would still face a painful moral dilemma—one whose depth and complexity are likely better explored by philosophers and artists than by sociologists or psychologists.
These arguments against scientism apply even more forcefully to art and literary criticism, fields of profound human interest that undoubtedly offer knowledge and wisdom but largely resist scientific methods and analysis.
A neuroscientist can, of course, offer fascinating insights into the physiology of visual and auditory perception—explaining, for instance, how the brain processes color, form, or melody. These insights may enhance our understanding of how we perceive paintings, sculptures, and symphonies. However, such explanations contribute little to our appreciation of the art itself, which transcends the mechanics of perception, provoking emotions, ideas, and meanings that science cannot capture or effectively analyze.
Consider Paul Cézanne’s Chateau Noir. The building and its windows are painted in complementary colors. Therefore, understanding the physiology of complementary colors might offer some insight into the aesthetic effects of the painting. However, such insight almost certainly pales in comparison to Sister Wendy’s non-scientific yet compelling analysis, which draws attention to nuances of meaning beyond the purview of neuroscience:
The slender, Gothic-arched windows of the chateau reveal nothing but the intense blue of the sky. The complementary relationship of the yellow building and the blue windows emphatically affirms the color harmony of the work, and its ambiguity between “solid” sky and “ephemeral” stone. The building seems impressively permanent, yet also a shallow façade through which the blue hills and sky are visible. It is an intensely blue painting, made even bluer by the intervals of yellow ochre, and united by the more neutral greens.
Similarly, an evolutionary psychologist might explain why certain stories—such as the hero’s journey or a king’s tragic downfall—are nearly universally appealing, but such explanations are unlikely to enhance our aesthetic appreciation of The Odyssey or Macbeth. For that, we turn to literary critics. Admittedly, some critics can be frustratingly pretentious or abstruse, their writings filled with arcane jargon and endless allusions to obscure texts. Yet many other literary critics are lucid and engaging writers who deepen our understanding and enjoyment of literature. And, after all, no intellectual discipline is entirely free from those who seek to impress rather than to edify.
Thus far, an advocate of scientism might agree with everything I have written, arguing that my critique targets not scientism itself but a caricature of it, which should be rejected. Morality and literature lie outside the purview of science and are not disciplines from which one can derive reliable, cumulative knowledge. To be sure, some proselytes of scientism have suggested that even moral philosophy and literary criticism might one day conform to the norms and methods of science, but such exuberance has never been widely shared.

A more plausible version of scientism does not claim that science will solve or eliminate the perennial questions of philosophy. Instead, it holds that science is the best, and perhaps the only, useful method for understanding the empirical world—encompassing everything from rocks and rivers to butterflies and humans. This form of scientism is often, though not always, associated with a kind of deflationary (or “nothing but”) reductionism, which asserts that qualities such as beauty or goodness, if they do not contribute to a productive scientific research program, are not (really) real and should be excluded from a scientifically informed understanding of the world.
At times, such deflationary reductionism seems to provoke a perverse delight, as a scientist or philosopher eagerly informs us that something we deeply cherish is, in fact, nothing more than crude, dumb matter. Consider, for instance, Francis Crick, the co-discoverer of the structure of DNA, writing about consciousness:
The Astonishing Hypothesis is that “You,” your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. As Lewis Carroll’s Alice might have phrased it, “You’re nothing but a pack of neurons.”
My view is that even this more modest version of scientism diminishes—and inevitably distorts—reality and should be rejected. The world is not austere, abstract, colorless; it is bright, loud, beautiful, ugly, tragic, magical, pungent, and often overwhelming. These qualities are not merely subjective judgments; they are as real as solidity and motion. And while they may only be meaningful within certain conceptual frameworks—and may have little utility in disciplines like physics or chemistry—they remain essential to our understanding of the world.
Scientism fails because it is committed to methodological monism, insisting that only the theoretical perspective provides reliable knowledge of the world. Consequently, it contends that anything that cannot be explained through this lens should be dismissed as unreal—a mere illusion, a play of light, a will-o’-the-wisp. But this is an unnecessarily narrow view of reality.
Pluralism offers a broader, more accommodating perspective. We interact with the world in diverse ways, and these interactions can only be fully understood through multiple conceptual frameworks—frameworks that are irreducible but not inconsistent with each other. Literature is not reducible to psychology; psychology is not reducible to biology; biology is not reducible to chemistry; and chemistry is not reducible to physics. All contribute to explaining the world and our place in it, allowing us not only to predict the movement of matter but also to grasp the significance of it. Describing mathematically the motion of a national flag blowing in the wind, after all, does not allow us to understand its importance to those who fly it.
Suppose that while hiking, we come across a field of daffodils. We might ask ourselves, “What kind of flowers are these, really?” A good botany guide informs us that “daffodil” refers to various species within the genus Narcissus. Intrigued, we might then ask, “When did daffodils evolve? And why do they have such beautiful petals?” A book on plant evolution explains that the history of daffodils is long and complex, and that pungent or colorful flowers evolved to attract pollinators, promoting cross-fertilization and producing more robust offspring by masking harmful recessive alleles.
This certainly expands our understanding of daffodils. And we could spend months, even years, studying the structure and evolution of plants, further augmenting our knowledge. With world enough and time, we might even become daffodil experts.
But does this theoretical approach exhaust our experience with daffodils or the knowledge that we could obtain about them?
I think not. Consider William Wordsworth’s wonderful poem, I Wandered Lonely as a Cloud:
I wandered lonely as a cloud
That floats on high o'er vales and hills,
When all at once I saw a crowd,
A host, of golden daffodils;
Beside the lake, beneath the trees,
Fluttering and dancing in the breeze.

Continuous as the stars that shine
And twinkle on the milky way,
They stretched in never-ending line
Along the margin of a bay:
Ten thousand saw I at a glance,
Tossing their heads in sprightly dance.

The waves beside them danced; but they
Out-did the sparkling waves in glee:
A poet could not but be gay,
In such a jocund company:
I gazed—and gazed—but little thought
What wealth the show to me had brought:

For oft, when on my couch I lie
In vacant or in pensive mood,
They flash upon that inward eye
Which is the bliss of solitude;
And then my heart with pleasure fills,
And dances with the daffodils.
An advocate of scientism might argue that while this is an amiable poem, it tells us little about daffodils themselves and instead reflects William Wordsworth’s idiosyncratic experience of them. A reader unfamiliar with daffodils would learn nothing substantive about the flowers from the poem, though she might glean that Wordsworth found them jocund, capable of engendering gaiety and filling the heart with pleasure.
A pluralist, however, might counter that the poem is far more than a record of Wordsworth’s personal experience. If it were merely the description of one unique personality’s encounter with flowers, it would hold little interest for the average reader. Instead, the poem resonates because it captures something universal—an aesthetic and emotional response to nature that transcends Wordsworth’s individuality. Moreover, it offers a way of seeing, inviting us to engage with a field of daffodils (or, more broadly, with nature itself) in a particular manner, one that fosters a deeper understanding of the world and our active relationship to it.
After reading and reflecting on Wordsworth’s poem, a reader would almost certainly experience flowers—or trees, birds, clouds, and the rest of nature—more deeply and richly than before. This is because reality is not merely an inert entity “out there” awaiting perception by a passive observer. Rather, as Wordsworth suggests in Tintern Abbey, reality is an interaction: the eye and ear (and other senses) do not merely observe the world—they “half create” it.
Humans live always in a world brimming with meaning and symbolic resonances. The theoretical perspective of science, however, abstracts from this rich, lived experience, stripping away the colors, sounds, pains, and joys that are integral to it. This abstraction serves a perfectly legitimate purpose. For science aims to understand causality and to predict the motion of matter. Through trial and error, philosophers and scientists have discovered that certain qualities that can be easily quantified—such as extension and motion—are more fundamental to this task than others that cannot—such as color and beauty.
However, what is effective for predicting the motion of planets or the outcomes of chemical reactions is poorly suited for grasping the sublimity of a snow-capped mountain or the majesty of a soaring eagle. Pluralism asserts that these experiences are equally essential to understanding the universe. Newton and Dryden, Darwin and Dickens, Einstein and Eliot. All contribute to a comprehensive view of reality. We do not need to choose between poetry and physics. If we are trying to understand why a baseball travels further in hot weather, we will want a physicist. But if we are trying to understand the magic of watching a colossal home run by Mickey Mantle, we will want a poet.
Before concluding, I want to revisit the perverse pleasure that some proponents of scientism seem to derive from gainsaying the cherished beliefs of ordinary people. These prophets of gloom take it as self-evident that science has humbled the human ego, revealing not only that we are insignificant beings on an insignificant planet, but also that we are soulless survival machines, stumbling through life in an unconscious effort to pass on our (potentially immortal) genes to the next generation.
Here, for instance, is Robert Sapolsky’s claim in his book Determined:
Maybe you’re deflated by the realization that part of your success in life is due to the fact that your face has appealing features. Or that your praiseworthy self-discipline has much to do with how your cortex was constructed when you were a fetus. That someone loves you because of, say, how their oxytocin receptors work. That you and the other machines don’t have meaning.
And here is another from Francis Crick’s The Astonishing Hypothesis:
We need, therefore, to state the idea in stronger terms. The scientific belief is that our minds—the behavior of our brains—can be explained by the interactions of nerve cells (and other cells) and the molecules associated with them. This is to most people a really surprising concept. It does not come easily to believe that I am the detailed behavior of a set of nerve cells.
Both passages exemplify deflationary reductionism coupled with what can only be described as bad poetry—relying on analogies and metonymies that oversimplify and distort the complexity of human existence. In their eagerness to reduce us to our biological or mechanical components, they disfigure reality.
Sapolsky asserts that we are "machines" that "don’t have meaning." The machine analogy, of course, is nothing new. It dates back at least to the mechanical philosophy espoused by figures such as Descartes and Hobbes (though Descartes also believed in an immaterial soul). Yet, the analogy is misleading. Its purpose today seems less about illuminating human nature and more about deflating human pretensions and uniqueness. For when we hear the phrase "You are a machine," our minds conjure an image of a clumsy metal automaton, making the comparison feel not just reductive but degrading.
Similarly, Crick claims that we are “the detailed behavior of a set of nerve cells.” But this is true only in the way that The Great Gatsby is “the detailed behavior of a collection of words and punctuation marks.” Like a sentence, a paragraph, or a novel, the self is an emergent phenomenon with unique properties. It is no more reducible to brain cells than a work of literature is reducible to words and symbols. (Meaning is created by an emergent combination of words. The meaning of “dog” in “The dog ran” is quite different from its meaning in “She is a dog.”) The self is a natural phenomenon, and there is nothing spooky or magical about it—but it is far more than “nothing but” nerve cells.
Just as an initial lie often leads to many secondary lies, so too a pivotal philosophical mistake often gives rise to many ancillary errors. Those who reduce the mind to nerve cells and describe humans as machines frequently argue, for instance, that free will is an illusion, that life is devoid of meaning, and that religion (and similar superstitions) must be eradicated. By promoting this imperial brand of science, they drain the blood from the universe, leaving behind a conceptual corpse. Unsurprisingly, such claims provoke a furious response: "If these are the conclusions of modern science," many think, "then so much the worse for science."
But the conflict between our ordinary experience of a meaningful, beautiful, sublime, horrifying, and terrifying universe and modern science is unnecessary. Science has and will continue to challenge some of our most sacred beliefs. We can no longer easily believe that we are uniquely created beings living on a planet at the center of the universe. Yet it is equally true that we inhabit a rich, vibrant world, full of good and evil, triumph and tragedy, joy and strife, and that religion, philosophy, literature, art, and cinema will continue to inform and enhance our understanding of it.
Science is among humanity’s greatest achievements, and scholars in the humanities dismiss it at their own peril. But science is not all-encompassing. Those who cherish and wish to promote it must also acknowledge its limitations. Even very great things diminish in excess.
Bo Winegard is Editor at Aporia.
[1] To be fair to Harris, he attempts to address this obvious objection, writing:
Many people worry that any aspect of human subjectivity or culture could fit in the space provided: after all, a preference for chocolate over vanilla ice cream is a natural phenomenon, as is a preference for the comic Bill Burr over Bob Hope. Are we to imagine that there are universal truths about ice cream and comedy that admit of scientific analysis? Well, in a certain sense, yes. Science could, in principle, account for why some of us prefer chocolate to vanilla, and why no one’s favorite flavor of ice cream is aluminum. Comedy must also be susceptible to this kind of study. There will be a fair amount of cultural and generational variation in what counts as funny, but there are basic principles of comedy—like the violation of expectations, the breaking of taboos, etc.—that could be universal. Amusement to the point of laughter is a specific state of the human nervous system that can be scientifically studied. Why do some people laugh more readily than others? What exactly happens when we “get” a joke? These are ultimately questions about the human mind and brain. There will be scientific facts to be known here, and any differences in taste among human beings must be attributable to other facts that fall within the purview of science. If we were ever to arrive at a complete understanding of the human mind, we would understand human preferences of all kinds. And we might even be able to change them.
But this is a different assertion from the claim that there are objectively right and wrong moral principles that can be assessed (determined?) scientifically. I do not doubt that we will discover various individual differences in the brain and body that predict differences in taste. These may allow us to understand (to a limited degree) why Jim likes strawberries more than lemons, while Thomas likes lemons more than strawberries. The controversial and provocative claim that Harris is making, though, is that there are right answers to questions of taste and value that are open to science. I do not think science will tell us if lemons are in fact better than strawberries because the question is not scientific.
To be clear, I do not think that aesthetic judgments are entirely subjective. When a person, Jim, contends that Vertigo is a greater film than Jaws, he is making a claim that demands evidence and argument. If his interlocutor, Thomas, retorted, “That’s preposterous,” Jim would expect Thomas to forward reasons for his dissent. But I do not think such debates are greatly enhanced by appealing to neuroscience or cognitive psychology—and they certainly are not resolved by deferring to the “scientific” evidence.