The Intellectual and Moral Courage of Atheism

Among the many topics the ‘four horsemen’ discussed in 2007 was how religion and science compared in respect of humility and hubris. Religion, for its part, stands accused of conspicuous overconfidence and sensational lack of humility. The expanding universe, the laws of physics, the fine-tuned physical constants, the laws of chemistry, the slow grind of evolution’s mills – all were set in motion so that, in the 14-billion-year fullness of time, we should come into existence. Even the constantly reiterated insistence that we are miserable offenders, born in sin, is a kind of inverted arrogance: such vanity, to presume that our moral conduct has some sort of cosmic significance, as though the Creator of the Universe wouldn’t have better things to do than tot up our black marks and our brownie points. The universe is all about me. Is that not the arrogance that passeth all understanding?

Carl Sagan, in Pale Blue Dot, makes the exculpatory point that our distant ancestors could scarcely escape such cosmic narcissism. With no roof over their heads and no artificial light, they nightly watched the stars wheeling overhead. And what was at the centre of the wheel? The exact location of the observer, of course. No wonder they thought the universe was ‘all about me’. In the other sense of ‘about’, it did indeed revolve ‘about me’. ‘I’ was the epicentre of the cosmos. But that excuse, if it is one, evaporated with Copernicus and Galileo.

Turning, then, to theologians’ overconfidence, admittedly few quite reach the heights scaled by the seventeenth-century Archbishop James Ussher, who was so sure of his chronology that he gave the origin of the universe a precise date: 22 October, 4004 BC. Not 21 or 23 October but precisely on the evening of 22 October. Not September or November but definitely, with the immense authority of the Church, October. Not 4003 or 4005, not ‘somewhere around the fourth or fifth millennium BC’ but, no doubt about it, 4004 BC. Others, as I said, are not quite so precise about it, but it is characteristic of theologians that they just make stuff up. Make it up with liberal abandon and force it, with a presumed limitless authority, upon others, sometimes – at least in former times and still today in Islamic theocracies – on pain of torture and death.

Such arbitrary precision shows itself, too, in the bossy rules for living that religious leaders impose on their followers. And when it comes to control-freakery, Islam is way out ahead, in a class of its own. Here are some choice examples from the Concise Commandments of Islam handed down by Ayatollah Ozma Sayyed Mohammad Reda Musavi Golpaygani, a respected Iranian ‘scholar’. Concerning the wet-nursing of babies, alone, there are no fewer than twenty-three minutely specified rules, translated as ‘Issues’. Here’s the first of them, Issue 547. The rest are equally precise, equally bossy, and equally devoid of apparent rationale:

If a woman wet-nurses a child, in accordance to the conditions to be stated in Issue 560, the father of that child cannot marry the woman’s daughters, nor can he marry the daughters of the husband whom the milk belongs to, even his wet-nurse daughters, but it is permissible for him to marry the wet-nurse daughters of the woman . . . [and it goes on].

Here’s another example from the wet-nursing department, Issue 553:

If the wife of a man’s father wet-nurses a girl with his father’s milk, then the man cannot marry that girl.

‘Father’s milk’? What? I suppose in a culture where a woman is the property of her husband, ‘father’s milk’ is not as weird as it sounds to us.

Issue 555 is similarly puzzling, this time about ‘brother’s milk’:

A man cannot marry a girl who has been wet-nursed by his sister or his brother’s wife with his brother’s milk.

I don’t know the origin of this creepy obsession with wet-nursing, but it is not without its scriptural basis:

When the Qur’aan was first revealed, the number of breast-feedings that would make a child a relative (mahram) was ten, then this was abrogated and replaced with the number of five which is well-known.[1]

That was part of the reply from another ‘scholar’ to the following recent cri de coeur from a (pardonably) confused woman on social media:

I breastfed my brother-in-law’s son for a month, and my son was breastfed by my brother-in-law’s wife. I have a daughter and a son who are older than the child who was breastfed by my brother-in-law’s wife, and she also had two children before the child of hers whom I breastfed. I hope that you can describe the kind of breastfeeding that makes the child a mahram and the rulings that apply to the rest of the siblings? Thank you very much.

The precision of ‘five’ breast feedings is typical of this kind of religious control-freakery. It surfaced bizarrely in a 2007 fatwa issued by Dr Izzat Atiyya, a lecturer at Al-Azhar University in Cairo, who was concerned about the prohibition against male and female colleagues being alone together and came up with an ingenious solution. The female colleague should feed her male colleague ‘directly from her breast’ at least five times. This would make them ‘relatives’ and thereby enable them to be alone together at work. Note that four times would not suffice. He apparently wasn’t joking at the time, although he did retract his fatwa after the outcry it provoked. How can people bear to live their lives bound by such insanely specific yet manifestly pointless rules?

With some relief, perhaps, we turn to science. Science is often accused of arrogantly claiming to know everything, but the barb is capaciously wide of the mark. Scientists love not knowing the answer, because it gives us something to do, something to think about. We loudly assert ignorance, in a gleeful proclamation of what needs to be done.

How did life begin? I don’t know, nobody knows, we wish we did, and we eagerly exchange hypotheses, together with suggestions for how to investigate them. What caused the apocalyptic mass extinction at the end of the Permian period, a quarter of a billion years ago? We don’t know, but we have some interesting hypotheses to think about. What did the common ancestor of humans and chimpanzees look like? We don’t know, but we do know a bit about it. We know the continent on which it lived (Africa, as Darwin guessed), and molecular evidence tells us roughly when (between 6 million and 8 million years ago). What is dark matter? We don’t know, and a substantial fraction of the physics community would dearly like to.

Ignorance, to a scientist, is an itch that begs to be pleasurably scratched. Ignorance, if you are a theologian, is something to be washed away by shamelessly making something up. If you are an authority figure like the Pope, you might do it by thinking privately to yourself and waiting for an answer to pop into your head – which you then proclaim as a ‘revelation’. Or you might do it by ‘interpreting’ a Bronze Age text whose author was even more ignorant than you are.

Popes can promulgate their private opinions as ‘dogma’, but only if those opinions have the backing of a substantial number of Catholics through history: long tradition of belief in a proposition is, somewhat mysteriously to a scientific mind, regarded as evidence for the truth of that proposition. In 1950, Pope Pius XII (unkindly known as ‘Hitler’s Pope’) promulgated the dogma that Jesus’ mother Mary, on her death, was bodily – i.e. not merely spiritually – lifted up into heaven. ‘Bodily’ means that if you’d looked in her grave, you’d have found it empty. The Pope’s reasoning had absolutely nothing to do with evidence. He cited 1 Corinthians 15:54: ‘then shall be brought to pass the saying that is written, Death is swallowed up in victory’. The saying makes no mention of Mary. There is not the smallest reason to suppose the author of the epistle had Mary in mind. We see again the typical theological trick of taking a text and ‘interpreting’ it in a way that just might have some vague, symbolic, hand-waving connection with something else. Presumably, too, like so many religious beliefs, Pius XII’s dogma was at least partly based on a feeling of what would be fitting for one so holy as Mary. But the Pope’s main motivation, according to Dr Kenneth Howell, director of the John Henry Cardinal Newman Institute of Catholic Thought, University of Illinois, came from a different meaning of what was fitting. The world of 1950 was recovering from the devastation of the Second World War and desperately needed the balm of a healing message. Howell quotes the Pope’s words, then gives his own interpretation:

Pius XII clearly expresses his hope that meditation on Mary’s assumption will lead the faithful to a greater awareness of our common dignity as the human family. . . . What would impel human beings to keep their eyes fixed on their supernatural end and to desire the salvation of their fellow human beings? Mary’s assumption was a reminder of, and impetus toward, greater respect for humanity because the Assumption cannot be separated from the rest of Mary’s earthly life.

It’s fascinating to see how the theological mind works: in particular, the lack of interest in – indeed, the contempt for – factual evidence. Never mind whether there’s any evidence that Mary was assumed bodily into heaven; it would be good for people to believe she was. It isn’t that theologians deliberately tell untruths. It’s as though they just don’t care about truth; aren’t interested in truth; don’t know what truth even means; demote truth to negligible status compared with other considerations, such as symbolic or mythic significance. And yet at the same time, Catholics are compelled to believe these made-up ‘truths’ – compelled in no uncertain terms. Even before Pius XII promulgated the Assumption as a dogma, the eighteenth-century Pope Benedict XIV declared the Assumption of Mary to be ‘a probable opinion which to deny were impious and blasphemous’. If to deny a ‘probable opinion’ is ‘impious and blasphemous’, you can imagine the penalty for denying an infallible dogma! Once again, note the brazen confidence with which religious leaders assert ‘facts’ which even they admit are supported by no historical evidence at all.

The Catholic Encyclopedia is a treasury of overconfident sophistry. Purgatory is a sort of celestial waiting room in which the dead are punished for their sins (‘purged’) before eventually being admitted to heaven. The Encyclopedia’s entry on purgatory has a long section on ‘Errors’, listing the mistaken views of heretics such as the Albigenses, Waldenses, Hussites and Apostolici, unsurprisingly joined by Martin Luther and John Calvin.[2]

The biblical evidence for the existence of purgatory is, shall we say, ‘creative’, again employing the common theological trick of vague, hand-waving analogy. For example, the Encyclopedia notes that ‘God forgave the incredulity of Moses and Aaron, but as punishment kept them from the “land of promise”’. That banishment is viewed as a kind of metaphor for purgatory. More gruesomely, when David had Uriah the Hittite killed so that he could marry Uriah’s beautiful wife, the Lord forgave him – but didn’t let him off scot-free: God killed the child of the marriage (2 Samuel 12:13–14). Hard on the innocent child, you might think. But apparently a useful metaphor for the partial punishment that is purgatory, and one not overlooked by the Encyclopedia’s authors.

The section of the purgatory entry called ‘Proofs’ is interesting because it purports to use a form of logic. Here’s how the argument goes. If the dead went straight to heaven, there’d be no point in our praying for their souls. And we do pray for their souls, don’t we? Therefore it must follow that they don’t go straight to heaven. Therefore there must be purgatory. QED. Are professors of theology really paid to do this kind of thing?

Enough; let’s turn again to science. Scientists know when they don’t know the answer. But they also know when they do, and they shouldn’t be coy about proclaiming it. It’s not hubristic to state known facts when the evidence is secure. Yes, yes, philosophers of science tell us a fact is no more than a hypothesis which may one day be falsified but which has so far withstood strenuous attempts to do so. Let us by all means pay lip service to that incantation, while muttering, in homage to Galileo’s muttered eppur si muove, the sensible words of Stephen Jay Gould:

In science, ‘fact’ can only mean ‘confirmed to such a degree that it would be perverse to withhold provisional assent.’ I suppose that apples might start to rise tomorrow, but the possibility does not merit equal time in physics classrooms.[3]

Facts in this sense include the following, and not one of them owes anything whatsoever to the many millions of hours devoted to theological ratiocination. The universe began between 13 billion and 14 billion years ago. The sun, and the planets orbiting it, including ours, condensed out of a rotating disk of gas, dust and debris about 4.5 billion years ago. The map of the world changes as the tens of millions of years go by. We know the approximate shape of the continents and where they were at any named time in geological history. And we can project ahead and draw the map of the world as it will change in the future. We know how different the constellations in the sky would have appeared to our ancestors and how they will appear to our descendants.

Matter in the universe is non-randomly distributed in discrete bodies, many of them rotating, each on its own axis, and many of them in elliptical orbit around other such bodies according to mathematical laws which enable us to predict, to the exact second, when notable events such as eclipses and transits will occur. These bodies – stars, planets, planetesimals, knobbly chunks of rock, etc. – are themselves clustered in galaxies, many billions of them, separated by distances orders of magnitude larger than the (already very large) spacing of (again, many billions of) stars within galaxies.

Matter is composed of atoms, and there is a finite number of types of atoms – the hundred or so elements. We know the mass of each of these elemental atoms, and we know why any one element can have more than one isotope with slightly different mass. Chemists have a huge body of knowledge about how and why the elements combine in molecules. In living cells, molecules can be extremely large, constructed of thousands of atoms in precise, and exactly known, spatial relation to one another. The methods by which the exact structures of these macromolecules are discovered are wonderfully ingenious, involving meticulous measurements on the scattering of X-rays beamed through crystals. Among the macromolecules fathomed by this method is DNA, the universal genetic molecule. The strictly digital code by which DNA influences the shape and nature of proteins – another family of macromolecules which are the elegantly honed machine-tools of life – is exactly known in every detail. The ways in which those proteins influence the behaviour of cells in developing embryos, and hence influence the form and functioning of all living things, is work in progress: a great deal is known; much challengingly remains to be learned.

For any particular gene in any individual animal, we can write down the exact sequence of DNA code letters in the gene. This means we can count, with total precision, the number of single-letter discrepancies between two individuals. This is a serviceable measure of how long ago their common ancestor lived. This works for comparisons within a species – between you and Barack Obama, for instance. And it works for comparisons of different species – between you and an aardvark, say. Again, you can count the discrepancies exactly. There are just more discrepancies the further back in time the shared ancestor lived. Such precision lifts the spirit and justifies pride in our species, Homo sapiens. For once, and without hubris, Linnaeus’s specific name seems warranted.
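The counting procedure described above is, in computing terms, simply the Hamming distance between two aligned sequences. A minimal sketch (the sequences below are invented for illustration, not real gene data):

```python
def discrepancies(seq_a: str, seq_b: str) -> int:
    """Count single-letter differences between two aligned DNA sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    # Compare the sequences position by position and count mismatches.
    return sum(a != b for a, b in zip(seq_a, seq_b))

# Two hypothetical aligned gene fragments, differing at two positions:
print(discrepancies("ACGTACGTAC", "ACGTTCGAAC"))  # → 2
```

In real molecular phylogenetics the sequences must first be aligned (insertions and deletions complicate a naive position-by-position count), but the principle is exactly this: more mismatches, more time since the common ancestor.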

Hubris is unjustified pride. Pride can be justified, and science does so in spades. So does Beethoven, so do Shakespeare, Michelangelo, Christopher Wren. So do the engineers who built the giant telescopes in Hawaii and in the Canary Islands, the giant radio telescopes and very large arrays that stare sightless into the southern sky; or the Hubble orbiting telescope and the spacecraft that launched it. The engineering feats deep underground at CERN, combining monumental size with minutely accurate tolerances of measurement, literally moved me to tears when I was shown around. The engineering, the mathematics, the physics, in the Rosetta mission that successfully soft-landed a robot vehicle on the tiny target of a comet also made me proud to be human. Modified versions of the same technology may one day save our planet by enabling us to divert a dangerous comet like the one that killed the dinosaurs.

Who does not feel a swelling of human pride when they hear about the LIGO instruments which, synchronously in Louisiana and Washington State, detected gravitational waves whose amplitude would be dwarfed by a single proton? This feat of measurement, with its profound significance for cosmology, is equivalent to measuring the distance from Earth to the star Proxima Centauri to an accuracy of one human hair’s breadth.
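The hair’s-breadth comparison can be checked on the back of an envelope. The figures below are standard round numbers (not taken from the essay): LIGO’s strain sensitivity is of order 10⁻²¹, and a strain is a fractional change in length, so applied to the Earth–Proxima Centauri distance it corresponds to a displacement of a few tens of micrometres – about the thickness of a human hair.

```python
# Rough sanity check of the hair's-breadth analogy.
# All figures are approximate textbook values.
LIGHT_YEAR_M = 9.461e15                      # metres in one light year
proxima_distance_m = 4.246 * LIGHT_YEAR_M    # Earth to Proxima Centauri
strain = 1e-21                               # order of LIGO's strain sensitivity

# A strain is a fractional length change, so over the Earth-Proxima
# distance it corresponds to this absolute displacement:
displacement_m = strain * proxima_distance_m
print(f"{displacement_m * 1e6:.0f} micrometres")  # → 40 micrometres
```

A human hair is roughly 20–180 micrometres thick, so the analogy holds to within the precision such comparisons deserve.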

Comparable accuracy is achieved in experimental tests of quantum theory. And here there is a revealing mismatch between our human capacity to demonstrate, with invincible conviction, the predictions of a theory experimentally and our capacity to visualize the theory itself. Our brains evolved to understand the movement of buffalo-sized objects at lion speeds in the moderately scaled spaces afforded by the African savannah. Evolution didn’t equip us to deal intuitively with what happens to objects when they move at Einsteinian speeds through Einsteinian spaces, or with the sheer weirdness of objects too small to deserve the name ‘object’ at all. Yet somehow the emergent power of our evolved brains has enabled us to develop the crystalline edifice of mathematics by which we accurately predict the behaviour of entities that lie under the radar of our intuitive comprehension. This, too, makes me proud to be human, although to my regret I am not among the mathematically gifted of my species.

Less rarefied but still proud-making is the advanced, and continually advancing, technology that surrounds us in our everyday lives. Your smartphone, your laptop computer, the satnav in your car and the satellites that feed it, your car itself, the giant airliner that can loft not just its own weight plus passengers and cargo but also the 120 tons of fuel it ekes out over a thirteen-hour journey of seven thousand miles.

Less familiar, but destined to become more so, is 3D printing. A computer ‘prints’ a solid object, say a chess bishop, by depositing a sequence of layers, a process radically and interestingly different from the biological version of ‘3D printing’ which is embryology. A 3D printer can make an exact copy of an existing object. One technique is to feed the computer a series of photographs of the object to be copied, taken from all different angles. The computer does the formidably complicated mathematics to synthesize the specification of the solid shape by integrating the angular views. There may be life forms in the universe that make their children in this body-scanning kind of way, but our own reproduction is instructively different. This, incidentally, is why almost all biology textbooks are seriously wrong when they describe DNA as a ‘blueprint’ for life. DNA may be a blueprint for protein, but it is not a blueprint for a baby. It’s more like a recipe or a computer program.

We are not arrogant, not hubristic, to celebrate the sheer bulk and detail of what we know through science. We are simply telling the honest and irrefutable truth. Also honest is the frank admission of how much we don’t yet know – how much more work remains to be done. That is the very antithesis of hubristic arrogance. Science combines a massive contribution, in volume and detail, of what we do know with humility in proclaiming what we don’t. Religion, by embarrassing contrast, has contributed literally zero to what we know, combined with huge hubristic confidence in the alleged facts it has simply made up.

But I want to suggest a further and less obvious point about the contrast of religion with atheism. I want to argue that the atheistic worldview has an unsung virtue of intellectual courage. Why is there something rather than nothing? Our physicist colleague Lawrence Krauss, in his book A Universe from Nothing,[4] controversially suggests that, for quantum-theoretic reasons, Nothing (the capital letter is deliberate) is unstable. Just as matter and antimatter annihilate each other to make Nothing, so the reverse can happen. A random quantum fluctuation causes matter and antimatter to spring spontaneously out of Nothing. Krauss’s critics largely focus on the definition of Nothing. His version may not be what everybody understands by nothing, but at least it is supremely simple – as simple as it must be, if it is to satisfy us as the base of a ‘crane’ explanation (Dan Dennett’s phrase), such as cosmic inflation or evolution. It is simple compared to the world that followed from it by largely understood processes: the big bang, inflation, galaxy formation, star formation, element formation in the interior of stars, supernova explosions blasting the elements into space, condensation of element-rich dust clouds into rocky planets such as Earth, the laws of chemistry by which, on this planet at least, the first self-replicating molecule arose, then evolution by natural selection and the whole of biology which is now, at least in principle, understood.

Why did I speak of intellectual courage? Because the human mind, including my own, rebels emotionally against the idea that something as complex as life, and the rest of the expanding universe, could have ‘just happened’. It takes intellectual courage to kick yourself out of your emotional incredulity and persuade yourself that there is no other rational choice. Emotion screams: ‘No, it’s too much to believe! You are trying to tell me the entire universe, including me and the trees and the Great Barrier Reef and the Andromeda Galaxy and a tardigrade’s finger, all came about by mindless atomic collisions, no supervisor, no architect? You cannot be serious. All this complexity and glory stemmed from Nothing and a random quantum fluctuation? Give me a break.’ Reason quietly and soberly replies: ‘Yes. Most of the steps in the chain are well understood, although until recently they weren’t. In the case of the biological steps, they’ve been understood since 1859. But more important, even if we never understand all the steps, nothing can change the principle that, however improbable the entity you are trying to explain, postulating a creator god doesn’t help you, because the god would itself need exactly the same kind of explanation.’ However difficult it may be to explain the origin of simplicity, the spontaneous arising of complexity is, by definition, more improbable. And a creative intelligence capable of designing a universe would have to be supremely improbable and supremely in need of explanation in its own right. However improbable the naturalistic answer to the riddle of existence, the theistic alternative is even more so. But it needs a courageous leap of reason to accept the conclusion.

This is what I meant when I said the atheistic worldview requires intellectual courage. It requires moral courage, too. As an atheist, you abandon your imaginary friend, you forgo the comforting props of a celestial father figure to bail you out of trouble. You are going to die, and you’ll never see your dead loved ones again. There’s no holy book to tell you what to do, tell you what’s right or wrong. You are an intellectual adult. You must face up to life, to moral decisions. But there is dignity in that grown-up courage. You stand tall and face into the keen wind of reality. You have company: warm, human arms around you, and a legacy of culture which has built up not only scientific knowledge and the material comforts that applied science brings but also art, music, the rule of law, and civilized discourse on morals. Morality and standards for life can be built up by intelligent design – design by real, intelligent humans who actually exist. Atheists have the intellectual courage to accept reality for what it is: wonderfully and shockingly explicable. As an atheist, you have the moral courage to live to the full the only life you’re ever going to get: to fully inhabit reality, rejoice in it, and do your best finally to leave it better than you found it.

[1] https://islamqa.info/en/27280.

[2] http://www.catholic.org/encyclopedia/view.php?id=9745.

[3] ‘Evolution as fact and theory’.

[4] For which I wrote an afterword.
