There are virtually endless scenarios in which an employee, a small business owner, or a sole proprietor might decline to participate in a business activity or deny service as a matter of personal convictions or corporate values. In the absence of compelling reasons to punish such conscientious objection, they should be free to do so. The First Amendment guarantee of freedom of religion (i.e. freedom of conscience) is most necessary when it protects a minority with whom others strongly disagree. In the “land of the free”, the burden of proof should be on those who would use the power of government to coerce another to do their bidding. Here are a couple dozen of the endless instances in which an individual or business should be able to make decisions in keeping with their ethical commitments. Such decisions inescapably discriminate (i.e. make a distinction or judgment), not against vendors, customers, or clients, but against particular products or services that, for the provider, have an ethical dimension.
Recently a number of philosophically arresting moments have managed to insert themselves into the television landscape. True to form, Ronald D. Moore and company continue to address contemporary political, philosophical, and religious questions in the alternate world of Caprica, territory he brilliantly charted in his groundbreaking Battlestar Galactica. If the pilot is any indication, Caprica promises to explore even more pointedly themes of religious and ethnic tolerance, terrorism, technology, and the nature of the soul. ABC’s FlashForward, clearly aimed at continuing the legacy of Lost and retaining its audience, has somewhat disappointed so far, but has nonetheless woven several provocative existential questions into its narrative, including one powerful Sartrean moment in particular. On the comedic front, NBC’s Community had the temerity to devote an episode to whether humanity is intrinsically good or evil, and did so superbly. I’ll admit to being prone to vegging in front of the tube even when the viewing is less cerebral, but a couple of these moments had me off the couch cheering for the writers.
Into the ever-expanding catalog1 of films predicated on our anxiety about the extent of our free will, enter The Adjustment Bureau, perhaps the most cerebral and ambivalent of the lot. The film envisions a world in which human action is directed, though not quite determined, by a confluence of chance, free will, and the nearly ubiquitous superintendency of “The Chairman”, a quasi-religious, mysterious power that influences human actions through the intervention of a minion of “clerks” who alter circumstances (and occasionally thought patterns) in order to keep the course of human events in line with “The Plan”. This is not, as some have supposed, a film about human pawns and a grandmaster who determines their fate. Rather, The Adjustment Bureau explores how the course of human events might be guided or “nudged” by such a master when the chess pieces themselves are free agents pursuing their own ends. As it turns out, this decidedly more difficult endeavor requires constant “caretaking” or “meddling”. The film itself remains surprisingly ambivalent toward this state of affairs and offers a provocative and nuanced picture of human agency, of our wills as simultaneously malleable and free. Indeed, the various kinds of interventions in The Adjustment Bureau provide a backdrop for considering just what should and should not be considered a violation of the will. Finally, though it wisely avoids any explicit religious references, the film portrays a world that bears a striking resemblance to a particular theological proposal regarding the relationship between God’s sovereignty and human free will, namely open theism.
With all the hand-wringing about whether Stephen Hawking’s The Grand Design changes anything — whether “philosophy is dead” and whether M-theory promises to explain the appearance of our universe in strictly physical terms — Sir Roger Penrose speaks. Because of his stature and relationship to Hawking, he is one of the most interesting commentators, and he is none too impressed. On the September 25th broadcast of Unbelievable?, Alister McGrath is carrying on in his exceedingly unctuous way when, with wonderful British politeness, Penrose interrupts: “I think it’s actually stronger than that. What is referred to as M-theory isn’t even a theory. It’s a collection of ideas, hopes, aspirations. … I think the book is a bit misleading in that respect. It gives you the impression that here is this new theory which is going to explain everything. It’s nothing of the sort. … I think the book suffers rather more strongly than many. It’s not an uncommon thing in popular descriptions of science to latch on to some idea, particularly things to do with string theory, which have absolutely no support from observation. They’re just nice ideas that people have tried to explore.”
The great variety of contradictory religious views is, for many, reason enough to conclude that there is no truth to be had in such matters. On this view, no one religion is at all likely to be closest to the truth. In his debate with Dinesh D’Souza, John Loftus makes these inter-religious and intra-religious disagreements the gravamen of his case against Christianity, arguing that in effect they cancel each other out in virtue of the mutually exclusive nature of their claims.1 He does not see, apparently, that by such reasoning, the ageless debate between naturalists and theists is also cancelled, each position nullified. Indeed, every point of view falls prey to such a criterion. When we look within naturalism, we also find denominations and sects, a cacophony of diverse and contradictory positions on fundamental questions. It turns out, the problem of pluralism is an equal opportunity employer. Worldviews are like personalities. Each one is unique. Though there are types of personalities, just as there are broad worldview categories, none is identical. Whatever our worldview, that view must countenance the fact that many others think it mistaken. This is the problem of pluralism. The implication of this reality, however, need not be the defeat of any particular set of beliefs. Rather, the proper response is virtue. First, pluralism demands modesty, a profound intellectual humility about our take on reality. And second, it should serve as a call to personal responsibility for our beliefs, and therefore to the epistemic virtues, for there is no consensus on ultimate questions that we can simply adopt by proxy.
In adolescence, when I was for the first time really struck by the pervasiveness of irreconcilable differences between peoples, my confidence in my own beliefs was shattered irreparably. What had seemed obvious seemed less so. What I believed based upon what I thought was good reasoning was undercut by the realization that my reasoning was unpersuasive to others. And so began my journey as a truth seeker haunted by the fear that truth could not be found. Like Sisyphus, who was condemned to roll a rock up a hill only to see it roll back down, ad infinitum, I found again and again that the briefly confident conclusions of my inquiries crumbled each time with the realization that others who had traversed those same paths had concluded otherwise. This is to say, the problem of pluralism is a real and ever-present foil in my own thinking. Nonetheless, the fact of disagreement about reality is often overstated and misappropriated to prove what it does not. Here I propose what we should, and should not, take from pluralism, by which I mean the evident fact of irreconcilable differences between individuals and communities on both the details and broad strokes of reality.
But why, exactly, is pluralism so problematic? The problem is that, to the extent that we hold mutually exclusive beliefs, it follows necessarily that very nearly all of us are wrong about many of the things we believe. This is not to minimize that which we hold in common. Graciously, substantial agreement is possible about a great deal that is required for the necessities of life. Nonetheless, our political, ethical, philosophical, historical, and religious beliefs exemplify virtually every conceivable point of view, and insofar as they reference an external world that does not indulge contradictions, many of those beliefs must be erroneous. Unfortunately, the realization that many of our beliefs are mistaken does not thereby reveal those which are true and which are false. Rather, pluralism casts suspicion on all of our controversial beliefs. The problem is exacerbated in that we must make decisions of great consequence not only for ourselves but also as families, communities, and nations. The stakes are high, and our great need is to ground our beliefs on secure foundations. But the pervasive error entailed by our pluralism persistently undermines our efforts. Our human quest for knowledge and understanding, especially in the Modern era, has largely been the effort to find solid ground amidst the quicksand, but to no avail. It seems our pluralism is inescapable. Or is it?
Consensus by Circling
Years ago, in conversation with some Mormon missionaries, I was presented with an argument that was part of Joseph Smith’s own departure from the received Christianity of his day. Smith was frustrated by the profusion of Christian denominations that disagreed with each other on points of doctrine large and small. He perceived these disagreements as an indication that none of them had the truth, and was at a loss until, as the story goes, the truth was restored to him by the angel Moroni. These missionaries appealed to my own frustration with the endless disagreements amongst Christians, suggesting that in Mormonism I could finally escape the squabbling and find a set of beliefs agreed upon by all. As the Church of Jesus Christ of Latter-day Saints has grown and evolved, that promised consensus is harder to find even from within, even with a “living prophet”. The main problem with their argument, however, was that these earnest missionaries did not see their own church as yet one more party to the debate about the way of things. Of course I could find more consensus by joining their party and renouncing the claims of others, just as I could by joining the Moonies or the Marxists and forswearing the rest. It is always possible to find some level of consensus by simply drawing the circle smaller. But drawing circles only underscores the persistent factiousness. And if complete consensus is demanded, that circle will have to be drawn so small as to include only oneself.
In other words, as his creed was like no man’s else, and being well pleased that Providence had intrusted him alone, of mortals, with the treasure of a true faith, Richard Digby determined to seclude himself to the sole and constant enjoyment of his happy fortune. ~ Nathaniel Hawthorne, “The Man of Adamant” (1837).
Arguments from pluralism against religious truth proceed in the same vein. The disagreement at every level of religious affiliation is regarded as a pox on them all, without seeing that the critic is himself one more dissenting party to the discussion. If it is merely disagreement that invalidates all sides, the naturalist’s own views on God, religion, and ethics are swept away by that same tide. The irreconcilable differences between the varieties of religious expression are no more ageless or intractable than the disagreement between naturalists and theists. For at least several thousand years, humans have disagreed about whether atoms or gods are at the bottom of the universe9. Sure, the religious enterprise has failed to come to unanimous agreement about the nature of God. But the philosophical enterprise has failed no less in achieving any real consensus about fundamental reality. It is no answer to say, “but we basically agree amongst ourselves”. The problem is not Christian pluralism or religious pluralism. Pluralism challenges all. Disagreement is a defining feature of the human condition, and one cannot escape the problem of pluralism simply by choosing another circle.
Problem Solved? Positivism.
In the early part of the twentieth century a solution was proposed. Keying off the more general agreement achievable when talking about things like rocks and trees and red apples, logical positivists sought agreement by banishing more ethereal subjects from the land of meaningful propositions. Whatever could not be touched, smelt, seen, heard, or deduced therefrom would not be considered a sensible subject or object of a sentence. On this proposal, the proposition “God exists” is neither true nor false. It is meaningless. “God” is not a thing we can point to or show to others in order to speak meaningful sentences about it. No doubt, if universally accepted, positivism promised to drastically diminish the range of human disagreement by constraining what was up for discussion. But in the end, positivism fell on its own sword, for its own criterion of meaning was philosophical, unfit to be weighed and measured.2 Furthermore, by so strictly limiting the explanatory options, it led to positions that were obviously wrong. For example, since conscious states are not sensible objects, feelings like pain were of necessity redefined in terms of something observable. So, behaviorists proposed that pain was not that felt sensation in the mind as we had thought, but rather the act of saying “ouch!”, or some such. Michael Egnor suggests that the final blow to the viability of behaviorism was a joke. After a night of passion, one behaviorist rolls over in bed and says to the other: “that was good for you; how was it for me?” However discomfiting the problem of pluralism, positivism presumed an artificial constraint that could not be sustained and led us down dead-end trails. It was no escape.
Problem Solved? Naturalism.
Though shedding the hard and fast rules of positivism, naturalists continue in that tradition by constraining what can exist to that which can be a subject of the sciences, especially of physics. And who can blame them? Science rocks! By positing hypotheses, winnowing out the unsuccessful ones through methodical, experimental testing, only to start the process over again,3 scientists have achieved remarkable feats and bested all other means of winning agreement about how the world works. Thomas Nagel sympathizes with the impulse to universalize science:
“This reductionist dream is nourished by the extraordinary success of the physical sciences, not least in their recent application to the understanding of life through molecular biology. It is natural to try to take any successful intellectual method as far as it will go.”4
Thanks to science, we’ve sent men to the moon, and no educated person doubts the reality of elliptical planetary orbits or the double helix structure of DNA. Science is superlative at mastering matter and energy and has significantly extended the range of facts that are agreeable to us all. But here we arrive at the point of contention. Should we, because of that tremendous success, foreclose on questions science cannot answer and on hypothetical entities beyond scientific verification? The question is the answer. It is precisely the kind of question that science cannot answer about itself. To adjudicate the question, we will have to defer to reason, including the unquantifiable canons of logic, and to the history of science and ideas. We will have to appraise other supposed sources of knowledge, such as introspective awareness, moral intuition, and wisdom based on life experience. Nagel continues:
“Yet the impulse to find an explanation of everything in physics has over the last fifty years got out of control. The concepts of physical science provide a very special, and partial, description of the world that experience reveals to us. It is the world with all subjective consciousness, sensory appearances, thought, value, purpose, and will left out; what remains is the mathematically describable order of things and events in space and time.”
Is science sufficiently expansive to capture the full breadth of reality? The answer is not self-evident either way. Once this inevitable question is on the table, the problem of pluralism returns in full force, a multitude of positions vying for acceptance.
In any case, the problem of pluralism rears its head even if we accept science as the sole or preeminent source of knowledge. Even within naturalism, each of the conceivable positions allowed by the data is well represented. We find strong physicalists and emergent property dualists, compatibilists and incompatibilists, determinists and libertarians, moral realists and nonrealists, realists and nominalists, conservatives and liberals. Human experience simply raises questions that are not answered decisively by the scientific data, and some that cannot be in virtue of its inherent limitations. Furthermore, it is impossible not to ask what the data means, to venture beyond data into synthesis and interpretation. The debate about the meaning of the surprising and strange quantum world is illustrative. No one disputes the experimental data, that photon and electron trajectories can only be determined probabilistically, and quantum mechanics is employed every day in real-life applications. Nonetheless, though the Copenhagen interpretation of this phenomenon is the orthodox one, notable naysayers persist, as well as at least half a dozen rival interpretations that are also consonant with the data. Scientific data is in one sense not unlike religious texts. It is a core set of givens that serves as a jumping-off point for a multiplicity of interpretations. It is no surprise, then, that even having given science pride of place, naturalism eludes precise definition. It lacks a universally accepted set of truths and can only be roughly characterized: epistemologically, it’s science aided by reason; ontologically, it’s elementary particles at bottom; etiologically, the story is neo-Darwinian; theologically, no God or gods exist. Beyond this central creed, disagreement runs amok.
Finally, naturalism as a worldview is not entitled by right to appropriate the special esteem we grant science. The scientific enterprise emerged out of a Christian culture, was forged by an eclectic mix of orthodox and heterodox “natural philosophers”, and continues to be practiced by the religious and non-religious alike. Scientific methodology is a heritage we share in common and is largely embraced by all. But while the success of science within its domain is indisputable, it is arguable whether naturalism as an all-encompassing worldview is likewise superior to its competitors in mitigating or eliminating our irreconcilable differences. Naturalists disagree amongst themselves and with others. Whatever else it may be, naturalism is not an escape from the problem of pluralism.
The Upside of Pluralism
As the proverb goes, iron sharpens iron. Disagreement, dissension, and debate are the refining fire par excellence. The desire and need to control nature for our own ends and our innate desire for knowledge are powerful generators of discovery, but there is no greater engine for the refinement and discrediting of ideas than the ceaseless argument about how the world works and what it all means. I have argued that there is no escape from pluralism. We are condemned to live at ideological odds with others. But this is not to say that our arguments are stagnant or without purpose. On the contrary, in many of our most interminable disagreements, there has been real movement, even progress.
There is no more contentious arena than the political. It’s to be expected. Political systems affect our lives intimately for better or worse. And, as James Madison opined: “What is government itself but the greatest of all reflections on human nature?” The debate over proper governance is epic. Great thinkers have pondered and disputed it endlessly. Wars and revolutions have been fought. Contemporary political debate is a morass of intemperate wrangling. And yet, with a historical perspective, we can see a remarkable shift in the terms of debate. As Fareed Zakaria points out: “For the vast majority of the world, democracy is the sole surviving source of political legitimacy. Dictators such as Egypt’s Hosni Mubarak and Zimbabwe’s Robert Mugabe go to great effort and expense to organize national elections — which, of course, they win handily. When the enemies of democracy mouth its rhetoric and ape its ritual, you know it has won the war.”5 Moreover, some measure of both free markets and government regulation is largely taken for granted. The raging debate resides in the center and is largely one of degree, of the appropriate measure of each. Many old arguments that seemed irreconcilable at the time were, in fact, settled. New arguments have taken their place. The moral legitimacy of American chattel slavery was so intractable that its resolution cost over 600,000 lives. A hundred years later, fully equal treatment for all was no less divisive. Graciously, the second time around it was resolved politically, though not without great personal sacrifice by civil rights activists. Today the legitimacy of slavery and legal discrimination isn’t given a second thought, and we debate instead the merits of affirmative action and reparations. The argument continues, but that is progress nonetheless.
So be it for politics, but one might think that religion is categorically different, that with its dogma, “leaps of faith”, and eternal stakes it is immune to the refiner’s fire. Such a view requires a strange anthropology, a belief that religious people are some alien creature, somehow divested of their natural rationality and sensitivity to recalcitrant facts. The history does not bear this out. Most ancient religions are just that: relics of the past. Their followers were persuaded or otherwise motivated to discard their beliefs. Conversions to and from religions as well as the loss of religious faith altogether are commonplace. And within religious traditions, believers individually exhibit a diversity and varying confidence in their beliefs, each believer uniquely persuaded by their experiences and the evidence available to them. Religions as communities evolve as well. To name but one example, there was a time when for many Christians it was plausible to think it appropriate to persecute dissenters and wage wars over doctrinal disputes. But by exegetical debate and the weight of decisive events, such as the Thirty Years War, the consensus interpretation of scripture was reformed to such an extent that coercive indoctrination is unthinkable now. It’s no different in the philosophy of religion. To everyone’s surprise, the logical argument from evil against God was basically put to rest, and the terms of debate relocated to an inductive form of the argument. Big Bang cosmology and our increasing awareness of the necessary fine-tuning of the universe weigh heavily in the debate about God’s existence, prompting the formulation of new or revived atheistic explanations like quantum tunneling and bubble universes. Demonstrations of mind-brain correlation in neuroscience have given succor to physicalist monists and forced refinement, or at least clarification, in the substance dualist’s view. 
In biblical studies, the development of new methods of textual criticism provided a vast body of widely accepted facts that inform questions of authorship and dating. Indeed, even the most conservative articulations of belief in biblical inspiration have been shaped by these developments. Though we are far from the end of many such debates, religious inquiry is by no means stagnant or immune to the refining fire.
Far from inhibiting the expansion of human understanding, in every field our inescapable pluralism is its catalyst. The quest for knowledge and understanding is a community project, a human project. Public debate and discourse is the principal means of moving it forward, kicking and screaming. And as Robert Frost would have it, “the only way around is through”. We cannot skip ahead to the resolution of the debates that so exercise us today. In any case, we cannot assume that these debates will be settled on behalf of the good and the true. Our only recourse is to participate in the debate in the hope that our best efforts to understand the world may lead to our own enlightenment and also contribute to the betterment of human understanding. Our communal quest for knowledge cannot proceed without individuals who are willing to slog through the difficult and unseemly debates that litter the path.
The Imperatives of Pluralism
If there is no escape from pluralism, as I think is the case, what follows? As communities, the reality of pluralism warrants tolerance, freedom of speech and of conscience, and the preservation of mechanisms that facilitate the dialectic, such as journals, editorials, peer review, round tables, public debates, et cetera. These are vital. But furthermore, there is a personal imperative. Pluralism presses upon each of us an obligation to earn our beliefs by earnest inquiry, whether we welcome this onus or not. On consequential issues where there is significant disagreement, we neglect the relevant questions at our own peril. Of course, we may throw in with the majority or our own circle of friends, but to do so is a gamble. Majorities have been wrong. Authorities have been wrong. There is simply no reliable way to defer our personal responsibility to others. We can’t outsource our thinking. Again, history is instructive, and in this case fearfully so. I shudder to think that I may have opposed Galileo, Locke, Wilberforce, MLK or sided with Calhoun, with Torquemada, with Hitler. Many did, and it is naive to think we are immune from aligning ourselves against the good and the true. The Nobel Laureate Percy Bridgman described the ultimately personal nature of truth-seeking in the context of science.
The process I want to call scientific is a process that involves the continual apprehension of meaning, the constant appraisal of significance, accompanied by the running act of checking … and of judging correctness or incorrectness. This checking and judging and accepting, that together constitute understanding, are done by me and can be done for me by no one else… They are as private as my toothache, and without them science is dead. ~ Quoted in The Age of Science, by Gerard Piel (Basic Books: 2001), p. 21.
The contentious scientific, political, religious, and ethical issues of our own day demand our care. If we have done our due diligence and end up on the wrong side of history, we may be forgiven. But if we sit it out, we may be the unwitting enablers of ignorance and injustice in our own day, without excuse. It is imperative that we take the pursuit of truth as a serious and personal calling.
Secondly, it is imperative that we believe knowledge is possible. As much as the tradition of skepticism, the postmodern rejection of the possibility of knowledge is a resignation to our inescapable pluralism and just as demoralizing to our quest for truth. Postmodern analysis is deservedly renowned for its deconstruction of the self-interests that incline us to believe one way or the other. Ironically, there is much Truth in this analysis. But when postmoderns prescribe relativism, they take a right when they should turn left. To suggest that because of our apparently irreconcilable differences we are all right — that it is “true” for you — is to paper over our differences and end the dialogue that promises the possibility of convergence on the truth. It would be better to infer that we are all wrong, or more accurately, partially wrong. None of us has the complete and final account of reality. This turn, by contrast, serves as an impetus for the ongoing quest. We must likewise reject the notion that our beliefs are captive to our cultural context. Culture is powerful, but not all-powerful. There have always been dissenters and revolutionaries who have been able to see through the assumptions taken for granted by their countrymen. The pronounced pluralism of our own time only makes this easier because it is so obvious that our assumptions can and should be questioned.7
It follows from our incomplete knowledge that intellectual humility is in order. Remember that pluralism entails by necessity that we are very likely wrong about some of our beliefs. We are not omniscient. Not by a long shot. “For now we see through a glass, darkly… For now we know in part.” Intellectual humility is to seriously entertain the possibility that we may be wrong, and on the flip side, to be open to the possibility that others may be right. This principle of fallibility is well put by James William McClendon: “that even one’s most cherished and tenaciously held convictions might be false and are in principle always subject to rejection, reformulation, improvement, or reformation.”8 On either side of every debate there are those who seem utterly incapable of second-guessing themselves. Such certain minds, who are not troubled in the least by the fact that others see things differently, escape my comprehension. But because of their intransigence, we should not follow their lead nor despair at the apparent impasses in the contemporary conversation. They too can serve as foils in our own deliberations about the merits of one view or another. And only if we ourselves are open will we be able to be corrected if we are in error. Basil Mitchell gets it exactly right with his recommendation that a spirit of self-reflection and self-criticism is apt no matter the subject.
The main thrust of my argument has been to the effect that the charge that to accept the possibility of criticism is to rule out commitment is palpably untrue to the way our thinking really works in matters of any importance, whether religious or not. Even in the realm of the natural sciences, where the advancement of knowledge is the central concern and where the subject matter is strictly delimited, a considerable degree of tenacity is required if new theories are to be adequately tested and properly developed. Hence, established scientific systems are not abandoned in the face of problems and puzzles that are not immediately soluble. Science advances precisely by the sustained attempt to iron out these anomalies. ~ “Faith and Criticism as Interdependent” in Faith and Criticism (Oxford University Press: 1994), p. 46.
The rejection of the possibility of religious truth with which we began, merely in virtue of its contentiousness, is a case of special pleading and dismissiveness. I am sympathetic with that impulse, divisive as the history of religious differences has been. And yet, it is all too easy to dismiss religious claims in this way, with one fell swoop. It relieves one of the trouble of having to examine and weigh them. To do so, however, is to throw stones from a glass house. It is a failure to see that one’s own house is not in order. Pluralism is a challenge to us all, and these imperatives are just the tip of the iceberg. The epistemic virtues are many and plot the course well. Pluralism itself settles nothing. We are left right back where we started with the need to appraise the evidence as best we can. But we arrive there, I would hope, with a profound sense of modesty about our ability to do so definitively. Thank God, the continuance of a stable and inhabitable natural world does not depend on us. And just as Camus thought Sisyphus could find joy and significance in his redundant task, we too can make the most of our inescapable pluralism.
In the face of our disagreement, let us not abandon truth, but rather add love.
1 “Does the Christian God Exist?” A Debate between Dinesh D’Souza and John W. Loftus (February 9, 2010). Loftus states: “When they [the world religions and sects] criticize each other, they’re all right. What’s left, I think, is the demise of Christianity and religion as a whole.” Later, Dinesh responds to a restatement of this argument: “The presence of disagreement does not invalidate the possibility of truth.”
2 C. A. Campbell summed up the status of Positivism nicely as it waned in influence: “In the days when the Verifiability Principle was accepted by its devotees as a secure philosophical truth, one could understand, though one might not agree with, the sweeping claim that many of the traditional problems of philosophy had been shown to be mere ‘pseudo-problems’. It was easy to see how, given the Principle’s validity, most of the leading questions which agitated our forefathers in metaphysics, in ethics, and in theology, automatically become nonsensical questions. What is perplexing, however, is that despite the pretty generally acknowledged deterioration in the Principle’s status to that of a convenient methodological postulate, the attitude to these same questions seems to have changed but little. To admit that the Verifiability Principle is not an assured truth entails the admission that a problem can no longer be dismissed as meaningless simply on the ground that it cannot be stated in a way which satisfies the Principle. Whether or not a problem is meaningless is now something that can only be decided after critical examination of the particular case on its own individual merits. But the old antipathies seem in large measure to have survived the disappearance of their logical basis. One gets the impression that for at least many thinkers with Positivist sympathies the ‘liquidation’ of a large, if unspecified, group of traditional philosophic problems is still established fact. If that impression is mistaken, well and good. One may then hope for an early recrudescence of interest in certain problems that have too long suffered the consequences of an unhappy tabu. If the impression is correct, a real service would be done to philosophy if it were plainly stated which of the traditional problems are still regarded as pseudo-problems, and what are the reasons, old or new, for passing this sentence on them. 
The smoke of old battles, perhaps understandably, darkens the philosophic air, to the considerable inconvenience of all concerned.” “Is ‘Free Will’ a Pseudo-Problem?”, In Defence of Free Will (Routledge: 2004, orig. 1967), p. 17.
3 This is, of course, a caricature of scientific method. Philosophers of science will be quick to point out that there is no strict demarcation of what is and is not appropriately scientific methodology, and here too a debate continues.
4 Thomas Nagel, Secular Philosophy and the Religious Temperament (Oxford University Press: 2009), p. 25.
5 Fareed Zakaria, The Future of Freedom: Illiberal Democracy at Home and Abroad (W.W. Norton: 2003), p. 13.
6 Apropos of my defense of the salubrious effect of the competition of ideas, Piel goes on to describe what happens to beliefs earned in private when they enter the marketplace of ideas. “Upon publication, the work enters the public, social process of science. Members of the community who are interested will address it in their individual responsibility. They are a democracy of warring sovereigns. If science is not dead, they will root out frailty in the design of the experiment and error in the data. They will challenge the premises on which the work was undertaken and the meaning the author has found in it and, perhaps, argue for their own. Debate will be unsparing in the common cause of consensus.” In, Gerard Piel, The Age of Science (Basic Books: 2001), pp. 21-2.
7 D’Souza makes this very point: “If you happen to be born in Afghanistan, you’d be a Muslim. If you happen to be born in Tibet, you’d be a Buddhist. That’s true, but what on earth does that prove? I happen to have been born in Bombay, India, which happens to be a Hindu country. The second largest group is Muslim. Even so, by choice, I am a Christian. Just because the majority religion is one thing doesn’t make it right or wrong. By the way, what he says about Christianity or Islam is equally true about beliefs in history or science. If you are born in Oxford, England you are more likely to believe the Theory of Evolution than if you are born in Oxford, Mississippi. If you are born in New Guinea you are less likely to accept Einstein’s Theory of Relativity than if you are born in New York City. What does this say about whether Einstein’s Theory of Relativity is true? Absolutely nothing.”
8 McClendon, Understanding Religious Conviction (University of Notre Dame Press: 1975), p. 118.
9 The Epicureans and Platonists anticipated so many of the debates we continue today.
“But you seem pretty sure that your point of view is correct. Good luck. So are the Islamists. So are the Hindus. So are the Jains. So are the Zoroastrians.” Deepak Chopra on “The Future of Faith”, Faith Under Fire (April 30, 2005), Episode 10, Season 2.
Radio talk show host Hugh Hewitt concluded 2009 by broadcasting a debate about God between polemicists Michael Shermer and Gregory Koukl, thereby bidding adieu to what he called “The Decade of the New Atheists”. It was indeed a remarkable cultural phenomenon how four atheologians in particular rose to prominence by selling scads of books: Sam Harris with The End of Faith (2004), Christopher Hitchens with god is not Great (2007), Daniel Dennett with Breaking the Spell (2006), and, of course, Richard Dawkins with The God Delusion (2006). But just as noteworthy, perhaps, is the cavalcade of able critics who rose to these challenges to Christian theism. As with the cottage industry of criticism that accompanied Dan Brown’s and then Ron Howard’s The Da Vinci Code, these broadsides served as provocation for countless apologists. Of course, none of them were remotely as successful as their atheistic rivals in terms of sales. One wonders whether they will slip into oblivion just as Hume survives in philosophy readers, while most of his contemporaneous critics do not. Whatever happens, the swift and mostly scholarly response to this one decade’s worth of the perennial barrage on Christian theism leaves it an open question whether, in the final analysis, it was the atheists or their counterparts who owned the aughts. Consider the following list an opportunity to judge this contest of ideas for yourself.
Central to the plot of Clint Eastwood’s Invictus is William Ernest Henley’s short poem of the same name. Though the role of the poem suffers some historical revisionism in the film, its role in the life of Nelson Mandela is worth consideration. The film recounts the remarkable story of Mandela’s efforts at national reconciliation through his embrace of the South African rugby team, which at the time remained a symbol of Apartheid’s ethnic segregation. In 1996, when I returned for the first time to South Africa, my childhood home, some old friends shared with me how meaningful it was when Mandela appeared at Ellis Park donning the Springbok green and gold. I’m gratified that this remarkable story of reconciliation has made it to the screen, especially while Morgan Freeman is still with us. He was born to play Mandela. During Mandela’s long imprisonment on Robben Island, Henley’s poem adorned a wall of his cell, a constant reminder that though his freedom had been taken from him, he remained “the captain of his soul”. The words of this poem, and their significance to Mandela, underscore a central point of contention in the debate about human free will. It seems to me that one problem with some arguments for compatibilism, the idea that determinism and human responsibility are compatible, is the conflating of freedom and free will. Mandela’s story is a powerful reminder that there is freedom beyond freedom. That is, it matters whether we are captains or merely observers of our souls.
Calling upon Henley’s poem as a powerful expression of our sense of having free will, here I consider one particular line of argument: that to be free in the sense relevant to moral responsibility is just to be free from external constraint. This view, classical compatibilism, continues to assert itself in spite of so obviously missing the target.1
Christopher Hitchens’ god is not Great is an expression of the profoundest moral outrage at the transgressions of religious people. As such, Hitchens follows in a long and honorable tradition. Indeed, in his life and teaching, Jesus also was a consummate critic of corrupted religion. In particular, it was the religious authorities of his time and place — the Pharisees — that he most roundly denounced. His criticisms were many, but included charges of hypocrisy, pride, legalism, and unkindness. Like Hitchens, a consistent theme in Jesus’ criticism is how inhumane their religious strictures had become. For example, in one of a number of confrontations over Sabbath observance, Jesus reminds the Pharisees that the Sabbath was instituted for the sake of humankind, not vice versa. Furthermore, the letters of early church leaders follow Jesus’ precedent in confronting the failings of his earliest followers. And they all stood in a long line of prophetic voices that, according to the biblical record, were called by God to correct the recurring degeneration of Hebrew, and then Christian, religion. Finally, today you can browse the bookshelves of any Christian bookstore to find volume after volume lamenting this or that shortcoming of the Church. Clearly religion can be corrupt, even poisonous, and it is hardly exempt from criticism. But though Hitchens is in good company in his indictment of religious transgressions, god is not Great is something of a missed opportunity. Because his rhetoric evinces such a profound contempt for people of faith, Hitchens fails to speak persuasively to the very people he thinks need saving. If intended merely as a call to arms for his compatriots, god is not Great is a tour de force. But if he hopes to deconvert the converted, to liberate those captive to religion, another course is needed. If that is the aim, here’s how to criticize religion.
An increasingly popular rhetorical meme in debates about God, it seems, is the idea that the theist is really on the same trajectory as the atheist. After all, the theist has also rejected every god, save one. It was perhaps Stephen Henry Roberts who revived this charge: “I contend that we are both atheists. I just believe in one fewer god than you do. When you understand why you dismiss all the other possible gods, you will understand why I dismiss yours.” Richard Dawkins echoes: “We are all atheists about most of the gods that humanity has believed in. Some of us just go one god further.” Or, in Christopher Hitchens’ words: “Everyone in this room is an atheist. Everyone can name a god in which they do not believe.” Interestingly, the charge dates back to at least AD 155, when devotees of the Roman pantheon of gods leveled a similar accusation. At the trial of Polycarp, the Martyrdom of Polycarp records that the crowd yelled: “This is the teacher of atheism, the father of the Christians, the enemy of our gods, who teaches so many to turn from the worship of the gods and not to sacrifice.”1