Full disclosure: I set out this week with a vague and confused sense of what I might write about. This confusion wasn't because I didn't have anything to say, but because what I wanted to say seemed convoluted.
Last week's newsletter tried to highlight how our current crisis of trust is driven by the growing weakness of our institutions. Here, I mean institutions both in the formal sense (like departments and ministries of government) and in the informal sense of the codes and rules that shape our lives and make them predictable.
This week, the House of Representatives voted to remove Republican Marjorie Taylor Greene from its education and budget committees for her support of a range of noxious conspiracy theories, not to mention her open calls for violence against Democrats. Ms. Greene claimed only recently to have abandoned her belief in QAnon, a conspiracy theory asserting that former President Donald Trump was waging a clandestine war against a Satan-worshipping cabal of child abusers and cannibals. Stories like these capture the dangerous decline of the collective American mind. And while much of this decay has worsened during our current civil war over information, the problem of fantasy is anything but new.
Which got me thinking about Kurt Andersen's book Fantasyland, where he notes that two-thirds of Americans believe that "angels and demons are active in the world"; half of Americans are "absolutely certain Heaven exists"; a third believe that global warming is a "hoax perpetrated by a conspiracy of scientists, government, and journalists"; and another third believe "extraterrestrials have recently visited (or now reside) on earth." The picture he paints of overlapping fantasylands, he argues, shows the frailty of reality-based thinking in our modern world, and in a developed democracy, no less. What makes these details so unsettling is that these errors in thinking remain rigid and resilient in the presence of facts and corrections, even against the urgings of experts. "Once people commit to that approach," he writes, "the world turns inside out, and no cause-and-effect connection is fixed."
However, the heart of this problem is that these tendencies reveal how we, as humans, explain the world to ourselves. If there is a pattern to the madness, it’s that these dangerous imaginings are tied to uncertainty — fantasy haunts those spaces where our modern and scientific world has failed to provide a convincing (or at least full) account.
Lines in the Sand
As a species, we abhor uncertainty. Our "always-on" fight-or-flight reflexes are primed to identify threats and foes. For those reasons, we tend — at least in the broadest sense — to prefer hard edges and clear boundaries: a kind of "black and white" or "good and evil" form of engagement with the world. These reflexes are connected to our own psychological protection. As Adam Grant, the world-renowned social psychologist, writes in his new book, Think Again, humans tend to calcify their opinions and perceptions not because they are correct, but because the solidity of those beliefs affirms their identity: mental stubbornness becomes a means of personal stability.
Our wider cultural environment, suffused with a firehose of information, supposition, conspiracy, allegation, and noise, has done little to control — or curate — what ought to inform those opinions and perceptions. Andersen's book suggests the groundswell around fantasy began in the 1960s, when people were sold the idea that you could, and perhaps must, "Do your own thing, find your own reality" and that, when it comes to that reality, "it's all relative."
He rightly points out that the information-communication revolution has only amplified those marching orders. In other words, what Andersen is trying to show is that how we saw the world (what counted as *real* in the world) appeared to change around the 1960s, and that surrounding ourselves with like-minded others — or with "facts" to refute any challenge to our worldview — has become infinitely easier in the decades since.
If Andersen is correct about the timing, the precipitous rise of American fantasy was not merely a problem of culture but may have actually been a problem of scholarship. The 1960s — among other things — also marked a change in thinking within the philosophy of science. In other words, how we thought about science — what counted as evidence and what we could know about the world — shifted, too.
Let There Be Light
The debate had reared its head for centuries. Before the 15th century, knowledge (and its pursuit) tended to be limited to a chosen few. You might be the studious reader of religious texts whose truths were revealed by the divine. Or, more practically, your aptitude might grant you entry to associations of practical knowledge, such as guilds. If you were neither blessed by revelation nor accepted as a fellow craftsperson, you were inevitably reliant on someone else — namely, your church and clergy — for knowledge and ideas about how to live in the world. That changed with the Renaissance.
Among its many consequences, the Renaissance signaled the democratization of knowledge — a recognition that anyone with a reasonably well-functioning set of sense organs and the capacity for reason could discover truths about the world. Rationality, as a capability, offered a process of inquiry and testing. This unbinding of human potential meant the Renaissance ushered in a seismic shift in how we, as individuals, came to see, study, and understand the world.
But scholarship without standards couldn’t be trusted. The pursuit of knowledge could not be a collective sprint into the darkened woods with eyes open and notebooks in hand. It needed a systematic mode of analysis. And, by the early 20th century, debates over what counted as real insight — or “real science” — had led to the rise of what we may now (at least crudely) call positivism.
According to the philosopher Auguste Comte, who was central to the founding of positivism, this moment was the logical result of our natural intellectual development. We started at the theological stage, where "miraculous powers or wills are believed to produce the observed events"; moved to the metaphysical stage, where the observable processes of nature were attributed to impersonal powers, occult qualities, or vital forces; and finally arrived at the scientific stage, our intellectual maturity, where we studied "the facts and regularities of nature and society to formulate these observations as descriptive laws, achievable only in adherence to the scientific method."
Positivism was based on a set of basic affirmations, the most important being that all knowledge regarding matters of fact must be based on the data of experience. This meant that scientists (or those in pursuit of knowledge) could only be certain about those features of the world they could see, touch, taste, and measure. Science was empirical, not metaphysical.
Such a commitment to scientific practice was central to our impressive achievements as a species: from the inventions and innovations of our modern world to the logics of improvement and efficiency that power the machines of knowledge today. And while those "practical" academics lauded their knowledge and weaponized their "empirical" evidence to shape the future, others — also concerned with knowledge and human development — grew dissatisfied with this status quo.
After all, wasn't there a difference between what a thing is and how we ought to use it? Sure, life may be explained by a series of physical and material realities — calories, water, and rest are necessary for survival, for instance. But how life should be lived is a different question. Didn't that question require us to study flimsier concepts — like ideas and values? What if those elements proved impossible to count or measure?
Concerns with the empirical approach only grew stronger as science offered its modern, technological, and economic bounty while remaining mute on how to understand or address the social and moral inefficiencies it produced. The challenge with so-called Big Tech, for instance, stems from the firms' extensive and universal collection of our data — a logic predicated on the assumption (which I believe is incorrect) that even the most complex systems (social, human systems) can be fully understood, modeled, and made predictive if we collect and know more. This is positivism at work. In other words, science was good at answering whether we *can* build a global communication network that stitches all of us together. It never answered whether we *should*.
Where does this leave us?
Perhaps with little more than stubborn pragmatism. And it’s here I’m reminded of the American philosopher William James. As the inimitable John Kaag writes, James was a father of American pragmatism and understood that “facts may be out there waiting for us to find them, but the truth is our story about the facts.” In other words, we have to do our best with what we can know.
James believed the truth "attaches to ideas in proportion as they prove useful…" but — and this is important — he also knew that truth could not merely be whatever was expedient. Instead, he recognized that we shouldn't expect certainty in our lifetimes and should "negotiate life by way of little-t truths, which guide us more or less successfully in our daily affairs."
Perhaps even more telling, James urged us “to live today by the truth we can get today and be ready tomorrow to call it falsehood.” James offers a plea for humility — a willingness, as Adam Grant rephrases today, “to admit that the facts may have changed, that what was once right may now be wrong.”
This isn't simply about learning more about the philosophy of science, of course, nor is it about defeating the magical thinking that sprouts up through time. But it does mean committing to strategies and tactics for living with uncertainty, particularly where no certainty should be expected. Being stumped by the unknown is no reason to fill that vacuum with just any alternative. And where we do, we need to recognize that the band-aid we adopt will inevitably need changing.
Shaping a cultural and social language that is more flexible and responsive — not only to evidence and information but also to the perspectives of others — is only the first step. It is, perhaps most importantly, our most overdue renaissance.
What I’m listening to
Reset: Reclaiming the Internet for Civil Society by Ron Deibert (The 2020 Massey Lectures)
Deibert runs the University of Toronto's Citizen Lab, which pioneered (and argued early for) greater attention to and appreciation of research and activism at the intersection of information and communication technologies, human rights, and global security. Selected to give the prestigious Massey Lectures in 2020, his talks offer a clean, clear, and critical overview of our current era. For the full effect, I recommend listening to them all.
(Note: Part 5: Burning Data offers some incredible details on the environmental toll of our digital age. Did you know sending 65 emails produces as much CO2 as driving a vehicle one kilometer? Or that our email traffic each year is the equivalent of adding 7 million more cars to the road? I didn’t either.)
What I’m reading
What Tech Calls Thinking by Adrian Daub
I'm in the middle of — well — many things all at once. Thinking about the future, however, I've been trying to read up on the world of "Big Tech" — not merely because it's fascinating, but also because it tends to be covered superficially (not by all, of course, but there is a kind of sensational product-ism that takes over when advances in knowledge produce something shiny to have and hold). This book, however, tries to isolate and slice through the intellectual genealogy of Silicon Valley. "This book is about concepts and ideas that pretend to be novel but … are actually old motifs playing dress-up in a hoodie," Daub writes. More importantly, though, Daub's concern is the damage this kind of repurposing can do. As he notes: "The tech industry ideas… are not wrong, but they allow the rich and powerful to make distinctions without a difference and elide differences that are politically important to recognize… The danger lies in the fact that they will probably lead to bad thinking."
—
And → This.
I'll put my cards on the table. Amidst the 2020 crisis, stumped by my own research and eager to write something of interest, I started a proposal for a biography of the late British-American journalist Christopher Hitchens. As a longtime reader of his work, fascinated with how his mind sifted through reported facts and official statements, I felt the "Hitch," as friends called him, epitomized the end of an era of public intellectuals while serving as a model — controversially, as always — for the kind of thinker our current moment so dearly missed. I was disabused of the notion (and disappointed, too) when I learned another writer — ostensibly a 'seasoned literary biographer' — was already on the case. So, when I stumbled on the article in The New York Times, I was left … well, underwhelmed. Setting aside the potential controversy involving the Hitchens estate's attempts to close ranks before the biographer got to his subject's inner circle, I was dejected to read his response to the tactics:
“I’m not a particularly smart person, I’m not a particularly ingenious person, all I can do is outwork people,” [the writer, Stephen Phillips] said. “All I can do is send out more emails and try to speak with more people. I’m absolutely committed to this book coming out, and to that end, I just keep on keeping on.”
Look. I don't think you need to be a genius to write about Christopher Hitchens. As with most works of non-fiction, what matters is commitment and devotion to the task. But I'd pay good money to hear what Hitchens would say if he were here to read this.
Until next time,
A