Curiosity, Superintelligence, and the Benzene Molecule: Some Notes on a Personal Philosophy

As many readers of this blog may know, I have a forthcoming book called The End: What Science and Religion Tell Us About the Apocalypse (Pitchstone Publishing 2016). It brings New Atheism into conversation with Existential Risk Studies, arguing, on the one hand, that advanced dual-use technologies will make religious extremism unprecedentedly dangerous in the future and, on the other, that of all the risks within the categories of error and terror, religion is the most serious. In other words, it's crucial that secularists pay attention to the field of existential riskology, and that existential riskologists pay attention to secular critics of religion. For reasons I explicate in the book, drawing from a wide variety of scholarship, I genuinely don't think it's hyperbole to say that our survival through the current century may depend on it.

I’m looking forward to discussing and debating these topics in the future. (In fact, I’ll be doing precisely this with a leading atheist figure in an upcoming issue of Skeptic.) The fact is that Existential Risk Studies is in something like a Kuhnian “pre-paradigmatic” state, and consequently the intersection of Existential Risk Studies and New Atheism is even more amorphous and immature. As I mention in the final chapter of The End, in which I discuss some very general but serious strategies for mitigating existential risks (including transhumanism, superintelligence, and space colonization!), there are hardly any shoulders upon which to stand and glimpse the topography ahead. The point is that the immaturity of the topic makes it especially open to reasonable disagreement among individuals with conflicting interpretations of the trajectory of civilization. While I present (what I take to be) robust evidence-based and philosophical arguments for the conclusions at which I arrive, astute readers with a talent for identifying problems may yet spot some that have eluded my attention.

This being said, I feel like there's an opportunity here to share some thoughts about my personal approach to scholarship, which is heavily influenced by my understanding of epistemology. I should point out immediately that while some of the bullet points below are couched in descriptive language (“This is what I do”), they're all essentially prescriptive (“This is what I think that I should do”). In other words, these are “regulative ideals” that I want to guide, constrain, direct, and determine my intellectual behavior, and I do not in any way take myself to be an exemplary practitioner of the good cognitive hygiene of critical thinking and fallibilism. I only strive to be a practitioner of this sort.

So, here goes:

— My entire normative worldview can be more or less reduced to two simple desiderata: curiosity and kindness. Curiosity is the lifeblood that animates critical thinking. It's what inspires one to constantly critique the beliefs of others — and oneself. Curiosity is the active form of passive doubt, and doubt ought to be the starting point of every intellectual journey we take to situate ourselves in this strange universe full of swirling galaxies, quivering atoms, and evolving organisms. But without kindness, curiosity can be vicious and destructive. Indeed, kindness is what ensures that the critical interrogation of belief systems remains constructive. It's at the root of the "principle of charity," according to which the views of others should be interpreted sympathetically, with the aim of understanding first and criticizing second. In my view, the two best things that anyone can be in life are curious and kind.

— From an epistemological perspective, beliefs should always and without exception be the destinations, never the points of departure, of the great journey toward truth upon which many (though not all) of us are embarked. And this journey should always and without exception be guided by the best available evidence considered as a whole. The relevant sense of "evidence" here is information that can be checked by multiple individuals in multiple locations at multiple points in time. Checkability is crucial because it's what guards against the threats of self-deception, error, delusion, and trickery. How can we be sure that Muhammad communicated with the angel Gabriel? We can't, because this claim isn't checkable. How can Muhammad himself be sure that he wasn't being tricked or suffering from temporal lobe epilepsy? He can't, because his experiences aren't checkable. How do we know that Homo sapiens evolved from Homo habilis? Because individuals with the requisite knowledge and instruments can examine the bones, the fossil record, genetic lineages, and so on, to intersubjectively verify this claim. Science is such a brilliant method of acquiring truth precisely because of its epistemological basis in checkable evidence.

A similar idea goes like this: one can distinguish between the "why," "what," "how," and "where" aspects of any given belief (where a belief simply describes a special kind of relationship of cognitive assent between propositions and minds). The "why" aspect concerns the reasons for accepting a belief. It pertains to the belief's status as rational, reasonable, justified, or warranted (all of which are synonyms in this context). In philosophy, this is where the theories of evidentialism and reliabilism come into play. While science requires that all beliefs be founded on evidence, religion is more permissive in that it allows claims based entirely on faith. For example: why accept that God is three completely distinct persons who are also completely identical in essence? Ultimately, because of faith.

The "what" aspect concerns the content of the belief. It pertains to the belief's information about or representation of reality, such as that matter is 99.9% empty space or that Jesus will someday return to Earth over the Umayyad Mosque in eastern Damascus (as Muslims believe). The "how" aspect concerns the methodology by which the belief was acquired. In philosophy, this is where the theories of empiricism and rationalism enter the picture. Whereas modern science relies on observation and reason as the means for acquiring its belief-justifying evidence, the world's religions rely on private revelations had by ancient prophets claiming to have special access to the supernatural. From a purely statistical point of view, observation is an extremely reliable evidence-gathering methodology, whereas revelation is either virtually always or always wrong. (Again, statistically speaking.) Finally, the "where" aspect concerns the ultimate origin of the belief. For example, Einstein devised his special theory of relativity in armchair fashion by logical deduction, and the ring structure of the benzene molecule came to August Kekulé after a daydream involving the ouroboros.

What matters to science isn't the "what," "how," or "where." As far as science is concerned, Darwin could have received his theory of evolution by natural selection from the supernatural during an acid trip. When it comes to deciding whether or not a belief should be accepted, the one and only thing that matters is the "why." If there’s evidence for a claim, no matter how outrageous, then it will be accepted. Indeed, much of science consists of utterly outrageous claims that, if not for the outrageous evidence supporting them, would be utterly outrageous to accept.

Tying these ideas together, when one puts the "why" of belief before the "what," it forces beliefs to become the destinations rather than the points of departure. Most ideologies, whether religious or secular (e.g., Marxist, nationalist, etc.), care more about the "what" of belief than the "why." For example, when discussing abortion, the environment, US foreign policy, gun violence, race relations in the US, and GMOs, ideologues will begin and end with the question "What is the claim?" rather than dwelling on the epistemologically paramount question of "Why accept the claim?" The intellectually honest truth-lover is willing to accept a belief if good reasons present themselves, even if the content doesn't align with one's prior system of beliefs. This statement by itself is hackneyed to the point of cliché, yet it's harder to honestly put into practice than most are willing to admit.

— There's a critical difference between what I would call "debates" and "arguments." Debates are something like social events whereby each interlocutor attempts, in competitive fashion, to convince the audience that the other is wrong. Rhetoric is the key to victory here, and fallacies are fair game. In contrast, arguments are intellectual events whereby both participants attempt, in cooperative fashion, to acquire a better overall approximation of the truth. Many people confuse debates with arguments, which is easy to do because they look superficially similar. But the skill sets required for each are completely different. For example, Donald Trump is a master debater. (My apologies.) He has an extraordinary, charismatic way of getting people to applaud for him and boo his opponents, even when his opponents quite clearly champion far better truth-approximations. Whereas good debaters are stubbornly confident and eager to monopolize the conversation, good arguers are perpetually tentative and always eager to understand opposing views before criticizing them. In psychological terms, arguments involve the "central route to persuasion" while debates involve the "peripheral route."

I love arguments. I love to criticize others and to be criticized, just as long as all parties involved are willing to play by the conversational rules of basic epistemology (evidence, reason, and logic are the name of the game here). Being shown wrong often only means that I've learned something, and learning something often only means that I've acquired a slightly better approximation of the truth. Similarly, showing that someone else is wrong typically means that the confidence I have in my own view increases. The point is that I couldn't care less if a belief, contention, or thesis of mine is revealed to be wrong, either because it's incomplete or simply incorrect. This being said, I feel confident that the central claims that I put forward and defend in my forthcoming book and forthcoming articles (like the one in Skeptic) are robust and cogent. But who knows? Many people far more intelligent than myself have been wrong about problems far more complicated than those I attempt to tackle. (Indeed, most of the monumental figures throughout history have turned out to be wrong about nearly all of their beliefs, although given the evidence available to them at the time, their beliefs would have been perfectly reasonable to accept.)

— Finally, I want to emphasize that there's nothing embarrassing whatsoever about not knowing something. The fact is that, as I've written elsewhere, everyone today knows almost nothing about most things. This is our universal contemporary epistemic situation. It results from the fact that collective human knowledge has grown exponentially since the Scientific Revolution while the capacity of individual brains has remained more or less fixed in its finitude. (This gestures at a good reason for developing cognitive enhancement technologies!) The result is that relative ignorance, defined as the difference between what the collective knows and what the individual knows, has undergone a mighty explosion. Humans have never been so (relatively) ignorant in all of history. Even the most polymathic intellectuals (Noam Chomsky immediately comes to mind here: a genius among geniuses) are utterly oblivious to entire continents of thoroughly mapped domains of human understanding.

As I wrote in a recent article for Salon, anti-intellectualism is at the very heart of American culture. But the flip side of the anti-intellectualist posture of "don't tell me the facts, I don't want to know them" is that people feel insecure and uncomfortable around others who know more than they do. One only needs to recognize the sententious truism I mentioned above to see that this is silly. Everyone is profoundly ignorant of almost everything there is to know. In fact, the best thing one can do is surround oneself with people who either have a deeper knowledge of one's own field or are educated in fields of which one knows little or nothing. I'm absolutely delighted when I discover that someone I'm around is either far smarter than myself or far more knowledgeable about a field of inquiry other than religious eschatology, existential risks, emerging technologies, and terrorism (my areas of expertise). Along these lines exactly, there's nothing wrong with being wrong; there's only something wrong with remaining wrong once facts to the contrary have been presented. Intellectual life is a process of constantly revising one's system of beliefs, and this is why curiosity, critical thinking, and fallibilism are so important.

These are mostly half-formed thoughts. I hope readers can find a scintilla of insight hidden in them. I suppose one thing I've been frustrated by in the atheist community is that (a) the heart and soul of New Atheism is supposed to be its emphasis on the epistemological importance of evidence-based reasoning, yet (b) some New Atheist leaders present themselves in a manner that often appears more consistent with dogmatism than with its epistemic nemesis: fallibilism. One can be quite confident in the truth of a proposition heavily supported by the facts and still avoid appearing closed-minded toward views that are obviously wrong. It's also important to recognize that when the evidence isn't decisive, as is sometimes (indeed often) the case, disagreement between informed individuals can be reasonable. Again, this is where curiosity and kindness enter the picture as regulative ideals. Perhaps the best way to turn people into atheists isn't to emphasize that religion is wrong, but to underline the importance of one's epistemic orientation. After all, insofar as people are rational, their beliefs will tend to converge over time. This is why science agrees about so much and religion agrees about so little.

For those interested, you can get my book here on Amazon. I'll be writing more about apocalyptic terrorism and existential risks in future blog posts!
