If one were to ask the average American scientist whether or not there is a god, one would most likely receive a negative response. According to the most recent polls, barely a third of scientists in the United States believe in a god, a sharp contrast to the roughly eighty percent of Americans who do. It is a glaring gap, and it raises the question: what is it about science that makes belief in a god such anathema?
In the mid-nineteenth century, Thomas Huxley, a biologist and ardent supporter of Charles Darwin’s theory of evolution (so much so that he eventually garnered the moniker “Darwin’s Bulldog”), had this to say about why scientists are so averse to faith:
“The improver of natural knowledge absolutely refuses to acknowledge authority, as such. For him, scepticism is the highest of duties; blind faith the one unpardonable sin… The man of science has learned to believe in justification, not by faith, but by verification.”
“Blind faith” or a “God of the gaps” explanation, wherein one simply shrugs and says, “I don’t know. God did it,” is antithetical to the scientific way of life, and understandably so.
After all, science is based on evidence–testable, verifiable evidence. It demands that there be proof in the pudding rather than appeals to some pie-in-the-sky deity who supposedly directs the world from on high.
“Skepticism is the highest of duties,” and it is the scientist’s duty to ask how and why.
So if the vast majority of scientists are correct that God does not exist, they should have a fair bit of evidence shored up to prove it, or, at the very least, they should be able to offer an alternative explanation as to how and why everything we know and see has come to be apart from the machinations of a deity.
And they do.
That is, they claim to.
Their explanation for the complexity of the natural world begins with a presupposed primordial soup.
In 1859, Charles Darwin’s On the Origin of Species made its debut, and in the century and a half since publication, the theories contained therein have continued to dominate the scientific landscape.
Presently, evolution and natural selection are akin to sacred creeds. Indeed, evolution in particular has ceased to be merely a theory and is now considered an incontestable “fact,” comparable by some to the law of gravity.
To challenge Darwin is, as Yale Professor Dr. David Gelernter has said, “to take your life into your hands intellectually.”
And yet, some, including Dr. Gelernter, are doing just that, and why shouldn’t they? Skepticism is, after all, the highest duty of any good scientist, and if Darwinian theory is correct, it should have nothing to fear from a healthy dose of inquiry.
Put up or shut up, as they say, and if Darwin’s theories cannot hold up to criticism or scrutiny, the scientific community ought to clear them away.
So with respect to Darwin’s theories, can they?
And perhaps, more importantly, with respect to the regnant scientific community, do they?
The answer, at present, is no on both counts.
In On the Origin of Species, Darwin explained monumental change by assuming all life-forms descend from a common ancestor and evolve through random, heritable variation and natural selection.
No divine design or direction needed.
However, even at the time of his writing, Darwin was aware of a particular and nagging problem with his theory: an anomaly in the fossil record which, rather than showing gradual, continuous change between organisms, showed sudden, discontinuous change.
The kind that his theory could not explain.
This rather cataclysmic hiccup is the so-called “Cambrian Explosion,” a period of roughly 70 million years (a “blink of an eye” in biological and geological terms) wherein a plethora of new life forms entered the fossil record seemingly without any ancestral antecedents. Darwin acknowledged the problem for what it was: a mortal blow to his theory, writing,
“If numerous species, belonging to the same genera or families, have really started into life all at once, the fact would be fatal to the theory of descent with slow modification through natural selection.”
He further stated that
“To the question why we do not find such rich fossiliferous deposits belonging to these assumed earlier periods prior to the Cambrian system, I can give no satisfactory answer… The case at present must remain inexplicable; and may be truly urged as a valid argument against the views here entertained.”
This was a remarkably humble admission given the boundless confidence Darwin’s present-day defenders seem to possess. However, Darwin did remain optimistic that, given time, new evidence of Precambrian organisms would emerge, vindicating his theory.
That has not happened.
Instead, new discoveries in the fossil record have only confirmed what Darwin and his then-critics, among them the head of Harvard’s Lawrence Scientific School, Louis Agassiz, observed. There are no discernible transitional predecessors for the Cambrian Explosion, only more challenges to Darwin’s essential gradualism and a whole lot of dirt.
The late and celebrated evolutionary biologist Stephen Jay Gould acknowledged this in the late 80s, saying, “The Precambrian record is now sufficiently good that the old rationale about undiscovered sequences of smoothly transitional forms will no longer wash.”
However, others, such as Former Director of the National Center for Science Education Eugenie Scott, have said that the discrepancies in the fossil record can be explained as merely “…a sampling problem.”
But the fact remains that the Precambrian creatures Darwin so hoped we would find have vanished without a trace, as if they’d never been there in the first place. Or, perhaps, one could say, as if they’d been divinely erased.
However, setting the problematic fossil record aside, a new challenge to Darwinism has emerged in more recent times, aided substantively by twentieth- and twenty-first-century advancements in molecular biology as well as by the kind of math done in the average pre-algebra class.
In 1953, Francis Crick and James Watson discovered the structure of DNA, elucidating the double helix and leading to a revolution in the field of molecular biology. However, alongside this tremendous discovery came new questions about the complexity of the cell, something which had previously been considered little more than an “undifferentiated block of jello.”
In Darwin’s time, it was reasonable enough to assume this was true, and the notion that an undifferentiated block of jello could arise given time, plus matter, plus chance seemed plausible, even probable.
However, the cell is not a block of jello.
It is, in the words of Dr. David Berlinski (Princeton PhD),
“…an unbelievably complex bit of machinery. Unfathomably complex. And we haven’t understood its complexity at all. Every time we look, there seems to be an additional layer of rebarbative complexity.”
This presents a problem for the Darwinian model, a problem only compounded by the fact that Crick and Watson demonstrated not only that the cell was incredibly complex structurally but also that it was coded with information–not unlike a computer program. Anyone who has written code knows how intricate a process it is (i.e., not the sort of thing that arises from time, plus matter, plus chance).
Common sense tells us that today, and it told others the same back then.
Within a decade, Watson and Crick’s discoveries about the cell’s complexity and the coding of DNA had percolated beyond the fields of molecular and evolutionary biology and found their way to the desks of mathematicians, physicists, and engineers, who took a few looks at the new information set before them, set it alongside Darwin’s theory of evolution, and said,
“Watson, we have a problem.”
Thus, in April of 1966, a group of some of the world’s most renowned mathematicians, physicists, and biologists gathered at the Wistar Institute in Philadelphia to hash out what they called “Mathematical Challenges to the Neo-Darwinian Interpretation of Evolution.”
For the mathematicians present, the issue was a straightforward one: now that something was known of the complexity of the cell and the coding of DNA, the odds of generating new life (a protein, specifically) through random mutation were simply too minute to entertain.
Dr. Gelernter details this in his 2019 essay “Giving Up Darwin,” but in brief, the building blocks of life are strands of protein composed of amino acids arranged in a particular sequence like a beaded necklace. A “modest-sized” protein chain is 150 amino acids long (the average is 250), and for each of those 150 spots, there are 20 amino acids to pick from. In total, the number of possible arrangements across those 150 spots is 10… to the 195th power (that’s a 1 followed by 195 zeros; for comparison, a trillion is a 1 followed by just 12 zeros).
Thus, the odds of randomly generating one particular strand of protein are 1 over 10 to the 195th power.
Just to put that into perspective, there are only 10 to the 80th power atoms in the entire universe.
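(For anyone who wants to check the arithmetic, a few lines of Python will do it. This is only a back-of-the-envelope sketch using the assumptions cited above: a 150-site chain, 20 amino acids per site, and the standard rough estimate of 10 to the 80th power atoms in the universe.)

import math

sites = 150                     # length of a "modest-sized" protein chain
amino_acids = 20                # choices available at each of the 150 spots

possible_chains = amino_acids ** sites          # every possible sequence
print(round(math.log10(possible_chains)))       # ~195, i.e. roughly 10^195 arrangements

atoms_in_universe_log10 = 80                    # standard rough estimate: ~10^80 atoms
print(round(math.log10(possible_chains) - atoms_in_universe_log10))   # ~115: the sequences outnumber the atoms by over 10^115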
However, the biologists at the symposium argued that the mathematicians were being too stingy with their estimation of functional protein sequences. Yes, the odds seemed prohibitive, but who knew how many sequences would work in the end? At the time, none of them did.
But MIT Professor Dr. Murray Eden observed,
“No currently existing formal language can tolerate random changes in the symbol sequences which express its sentences. Meaning is almost invariably destroyed. Any changes must be syntactically lawful ones.”
Put another way, he predicted that random mutations were more likely to hurt than help the chain experiencing them.
He was right.
In 2007, some forty years after the Wistar Symposium convened, Dr. Eden and his Darwin-critical compatriots were vindicated by the work of Dr. Douglas Axe (Cambridge PhD), who showed that the odds of getting a functional protein, the most basic, rudimentary building block of life, through random variation were 1 over 10… to the 77th power.
While those odds are better than 1 over 10 to the 195th power, they remain so astronomical as to be functionally impossible.
However, evolutionary biologists and their sympathizers are quick to remind critics that evolution has been running since time immemorial, and given billions of years, it may well be the case that the astronomical odds of generating a functional protein through random mutation narrow to something approaching reasonable.
Give anything enough time and the impossible supposedly becomes possible.
This was one of the arguments put forth by Richard Dawkins in his 1986 book The Blind Watchmaker, wherein he writes, “…given enough time, a monkey bashing away at random on a typewriter could produce all the works of Shakespeare.”
Let’s see.
In 2003, students and lecturers at the University of Plymouth received a grant to test Dawkins’ popularized hypothesis. They placed a computer and keyboard in an enclosure with six macaque monkeys and let them get to work. Over the course of a month, aside from using the equipment as a toilet, the monkeys managed to produce five typed pages, but not a single word.
However, who’s to say that if they’d had 4.55 billion years (the approximate age of the earth) to whack away at the keyboard, a Shakespearean sonnet wouldn’t have popped out?
MIT graduate Dr. Gerald Schroeder, for one. In 2004, during a public debate in New York with “The World’s Most Notorious Atheist,” Professor Antony Flew, Schroeder systematically refuted the “infinite monkey theorem” on the simple basis that there are not enough protons, neutrons, and electrons in the entire universe to generate just the written trials (the trials!) necessary for monkeys, typing at random, to produce “Shall I Compare Thee to a Summer’s Day?”
The universe would need to be 10 to the 600th power times larger than it is to accommodate that feat. Again, that’s a 1 followed by 600 zeros.
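(Schroeder’s exact inputs aren’t reproduced here, so the following is only a rough, hedged reconstruction of the argument rather than his own calculation: assume the sonnet runs to roughly 490 letters and that the monkeys strike only the 26 letters of the alphabet, ignoring spaces and punctuation. The conclusion lands in the same neighborhood.)

import math

letters_in_sonnet = 490          # assumed letter count of the sonnet, ignoring spaces and punctuation
alphabet = 26                    # assumed: letters only, no spaces or punctuation
particles_log10 = 80             # rough standard estimate: ~10^80 particles in the universe

trials_needed_log10 = letters_in_sonnet * math.log10(alphabet)   # ~693, i.e. ~10^693 possible typings
print(round(trials_needed_log10 - particles_log10))              # ~613: short by hundreds of orders of magnitude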
It is worth noting that after having been presented with arguments about the complexity of cells and DNA, as well as other evidence, Professor Flew declared himself a deist, saying,
“[It] was a major change of course for me, but it was nevertheless consistent with the principle I have embraced since the beginning of my philosophical life–of following the argument, no matter where it leads.”
Such intellectual integrity is a wonder and a rarity, but many of Flew’s atheist peers attributed his change to “senility.”
Returning to the more straightforward issue of whether random mutations can, given the entire history of the earth, create a functional protein, the answer is also no.
At present, scientists estimate that in the whole of history, there have been 10 to the 40th power bacteria in existence. Assuming that every single one of those bacteria experienced a mutation (an event evolutionary biologists admit is a rarity), that would be 10 to the 40th power mutational trials over the entire history of life. Stacked against the odds of randomly generating a functional protein (which, again, are 1 over 10 to the 77th power), those trials leave odds of roughly 1 over 10… to the 37th power that even one of them ever produced one.
It is mathematically impossible.
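(The arithmetic here really is pre-algebra. A minimal sketch in Python, using the 10 to the 40th power trials and Axe’s 1-over-10-to-the-77th-power figure cited above, looks like this.)

trials = 10 ** 40            # generous assumption: one mutation for every bacterium that has ever lived
p_functional = 10.0 ** -77   # Axe's estimated odds that a random sequence yields a functional protein

expected_hits = trials * p_functional    # expected number of functional proteins ever produced this way
print(expected_hits)                     # ~1e-37, i.e. odds of roughly 1 in 10^37 that even one trial succeeded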
In brief, there simply has not been anywhere near enough time for Darwin’s theory to work. The universe has not had enough tries to come close to generating the most rudimentary forms of life. In the words of Dr. Gelernter,
“From whatever angle you come at it, the answer is ‘no.’ There has not been enough time. The number of ‘throws’ we’ve had is too puny to be worth talking about. It doesn’t even approach puniness!”
Thus, Gelernter asserts, the math is out, the odds impossible, and Darwinian theory, for all its beauty, must be given up. The question he poses to biologists now is:
“How cleanly and quickly can the field get over Darwin, and move on?”
Helpfully, that question has already been answered by another arm of the scientific community, because in the early and mid-twentieth century, cosmologists lived through the definitive death of the “steady state” theory–Darwinism’s ideological equivalent within cosmology.
As we shall see, it took them over fifty years to finally admit defeat.
In the late 1920s, the enterprising young astronomer Edwin Hubble (of Hubble telescope fame) looked up at the stars and saw something that the reigning static model of the universe, the model Einstein himself endorsed, could not explain. It looked like the universe was expanding. This discovery sent shockwaves through the scientific community, but Hubble persisted, showing his findings to Einstein, who, after a “torturous” thought process, finally accepted the truth, admitting,
“New observations by Hubble and Humason concerning the redshift of light in distant nebulae make it appear likely that the general structure of the universe is not static.”
However, a great number of cosmologists resisted this new theory. The “Big Bang,” as renowned astronomer and cosmologist Sir Fred Hoyle disparagingly coined it, was unpalatable.
British astrophysicist Arthur Eddington put it more colorfully, proclaiming, “the notion of a beginning is repugnant to me…. The expanding universe is preposterous … incredible … it leaves me cold.”
Yet in 1964, physicists Arno Penzias and Robert Wilson detected a faint background of microwave radiation pervading the sky, observable evidence that the Big Bang had occurred: there was a beginning to the universe.
By the late 1970s, the “steady state” theory had been declared “dead and buried” by almost every reputable cosmologist in the field. The evidence was overwhelmingly in favor of Big Bang Cosmology, and yet, despite the scientific consensus, a visible degree of ideological reluctance remained.
MIT Professor Philip Morrison remarked in the mid-70s, “I find it hard to accept the big‐bang theory; I would like to reject it.”
Why would this be? How could this be? How could scientists, those men of verification and evidence, as Thomas Huxley put it so succinctly a century prior, be reluctant, even unwilling, to go where the evidence leads?
The answer was given by none other than Professor Stephen Hawking, who, in the late twentieth century, admitted, “So long as the universe had a beginning, we could suppose it had a creator.”
Put simply, if there was a Big Bang, there must have been a Big Banger!
To scientists intent on rejecting all authority save that found within the scientific community, this was a nightmare scenario.
The worst of all possible worlds!
It certainly did not help matters that Nobel laureate Arno Penzias remarked,
“The best data we have concerning the Big Bang are exactly what I would have predicted, had I nothing to go on but the five books of Moses, the Psalms, the Bible as a whole.”
It was like pouring salt in the wound.
Alternative explanations for the Big Banger quickly proliferated–they had to.
Because for scientists to say the Big Bang had no cause (i.e. no Big Banger) would be to say it was, in short, a virgin birth.
Since most scientists hold the Biblical Virgin Birth in contempt, affirming the universe’s virgin birth would be logically incoherent.
Especially since the Big Bang lays claim to the birth of all creation while the Virgin Birth, canonically, only claims to have birthed the baby born to save it.
Thus, scientists must either bend their knees to the Big Banger or offer alternatives.
Most have chosen the latter.
But what of these alternatives?
If the Big Banger is indeed existent albeit not the God chronicled in the Bible, then who or what is it?
One proposed hypothesis tries to skirt the question entirely.
This is the multiverse theory, which was generated to explain away the discombobulating reality that, Big Bang notwithstanding, the universe seems to be “fine-tuned” for the existence of human life.
So striking is its apparent design that Sir Fred Hoyle (of disparaging Big Bang naming fame) grumbled that it looked like “a super-intellect has monkeyed with physics, as well as with chemistry and biology,” further stating, “The universe looks like a put-up job.”
The multiverse theory endeavors to solve this problem by inflating the imagined number of universes in the hope that, if the number is large enough, fine-tuning could be just a matter of time, plus matter, plus chance (like Darwin all over again).
No Big Banger needed.
But does the multiverse theory actually accomplish this?
The hypothesis itself has been described by Professor Richard Dawkins like this:
“We live in a kind of bubbling foam of universes, and each bubble in the foam has a different set of laws of physics. The vast majority of those laws of physics are not conducive to giving rise to us. A tiny minority are… We could only be in one of those bubbles that have the necessary laws of physics to bring us into existence, and therefore, obviously, since we do exist… we must be in such a universe.”
In other words, we live in a kind of cosmic bubble bath.
However, the question remains:
…Who drew the bath?
That, scientists cannot explain, and with no little horror, the realization sets in: far from diminishing or skirting the problem of the Big Banger, multiverse proponents have only inflated their antagonist.
The Big Banger has become an even Bigger Bubbler!
Thus, the origin problem when it comes to the universe cannot be attenuated by the multiverse.
But what about life on earth?
Can scientists explain the origin of the “warm little pond” Darwin presupposed over a hundred and fifty years ago?
They’ve tried.
Presently, the source of life on earth that cosmologists, molecular biologists, and evolutionary biologists alike have settled upon is none other than extraterrestrials.
The theory that alien lifeforms “seeded” our planet (known as “panspermia”) was one embraced and popularized by the aforementioned astronomer and cosmologist Sir Fred Hoyle in the mid-twentieth century, when he was still fighting the ascendance of Big Bang Cosmology.
He was bolstered in his belief by Nobel Prize Winner and aforementioned co-discoverer of the double helix, Francis Crick, who wrote in 1981 that it was “not implausible… that life here was seeded by microorganisms sent on some form of spaceship by an advanced civilization.”
Even more recently, the famed evolutionary biologist and aforementioned Professor Richard Dawkins concurred: “It could be that at some earlier time in the universe, some civilization evolved by probably some kind of Darwinian means to a very, very high level of technology and designed a form of life that they seeded onto this planet.”
In summary, the options on the table to explain the origins of the universe and life on earth are 1) a super-intellect who has, by scientists’ own admission, “monkeyed with physics” in order to permit our existence, 2) a virgin birth, 3) a cosmic bubble bath known as the multiverse, and 4) agrarian-minded aliens.
Each option seems incredible, but each one is still on the table, including, under option number 1, the God of the Bible. Indeed, to a number of scientists, that answer is increasingly credible.
In fact, all the way back in 1978, the agnostic founding director of NASA’s Goddard Institute for Space Studies, Dr. Robert Jastrow, having observed the contortions of his fellow scientists to avoid Big Bang Cosmology and, in turn, the God of the Bible, said this:
“For the scientist who has lived by his faith in the power of reason, the story ends like a bad dream. He has scaled the mountains of ignorance; he is about to conquer the highest peak; as he pulls himself over the final rock, he is greeted by a band of theologians who have been sitting there for centuries.”
Were Dr. Jastrow still alive today, he might well risk being tarred and feathered like an oleaginous wild turkey for saying such a thing, because when it comes to religion within the body scientific, the community is far from welcoming.
Despite this, there are a number of top-tier scientists today who loudly and proudly declare their faith.
Christian convert, Fellow of the Royal Society of Chemistry, and nanoscientist Dr. James Tour has said,
“I’m certainly an anomaly in the academy. I love Jesus more than anything. I just love Jesus… and I just want to spend my life talking about Him and how good He is. And what I’ve found in the academy is that if you speak a little bit about Jesus, they’ll give you trouble, but if you speak a lot, they leave you alone. They just don’t even want to get you started–I just will not quit. I just love Jesus more than anything. I’ll take out full page ads in the school newspaper and just talk about Jesus and how wonderful He is.”
Dr. Francis Collins, another convert to Christianity, headed the Human Genome Project and received both the Presidential Medal of Freedom and the National Medal of Science, in 2007 and 2009, respectively. He currently directs the National Institutes of Health, an appointment he received from President Barack Obama in 2009.
Despite his accomplishments, Collins has received a fair amount of criticism for his beliefs, with some arguing his Christian faith ought to disqualify him from places of prominence within the scientific community.
Collins has offered a simple rejoinder:
“I have found there is a wonderful harmony in the complementary truths of science and faith. The God of the Bible is also the God of the genome. God can be found in the cathedral or in the laboratory. By investigating God’s majestic and awesome creation, science can actually be a means of worship.”
In sharp contrast, scientists who cleave to atheism absolutely, positively do not want to find God anywhere, least of all in the laboratory!
One could imagine that if God dared stick His head out of a petri dish, the mission-minded scientific atheist would quickly whip out a materialist mallet and smash dish and deity to smithereens.
“Gotcha!” he’d say, sweeping the bits of glass and providence away. “There’ll be no blasted deity today!”
In all seriousness, the fervor with which many scientists pledge fealty to atheism does–by their own admission–strain the bounds of credulity.
Dr. Richard Lewontin, former endowed chair of biology and zoology at Harvard University and an ardent atheist, has said as much, writing,
“Our willingness to accept scientific claims that are against common sense is the key to an understanding of the real struggle between science and the supernatural. We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism… we cannot allow a Divine Foot in the door.”
Interestingly, even Richard Dawkins has not been that extreme. In fact, during a debate on the topic “Has Science Buried God?” with John Lennox, an Oxford professor, a Christian, and the holder of doctorates in mathematics, science, and philosophy, Dawkins said he would be willing to consider some kind of higher power:
“You could possibly persuade me that there was some kind of creative force in the universe. Some kind of physical, mathematical genius who created everything… You could possibly persuade me of that, but that is radically and fundamentally incompatible with the sort of god who cares about sin, a sort of god who cares about what you do with your genitals, or the sort of god who has the slightest interest in your private thoughts and wickedness.”
Dawkins’s central problem, then, is not the existence of a higher power but rather the nature of that higher power, and he would rather have a universe of “blind, pitiless indifference,” as he puts it in his book River Out of Eden, than a God who enjoins His people to practice abstinence.
That is his preference.
But it is just that. A preference.
To believe or not to believe, that is the question, and well-respected scientists of all stripes disagree.
Some, like Tour, Collins, and Lennox, even hold hard and fast to Christianity.
And yet, thanks to the way many scientific atheists talk today and the widespread propagation of the things they say, much of society has been led to believe that people of faith, particularly the Christian faith, are off their rocker, crazy.
A bunch of science-denying loony-toons who believe aliens came and–
Wait a second…
In all seriousness, though. The kind of anti-religious vitriol favored by the Dawkinses of the world is literally a century old.
In the run-up to the historic Scopes Monkey Trial of 1925, Baltimore Sun journalist H.L. Mencken published a piece entitled “Homo Neanderthalensis” wherein he ripped into religious rejectors of evolution as societal ills and impediments, saying,
“The so-called religious organizations which now lead the war against teaching evolution are nothing more, at bottom, than conspiracies of the inferior man against his betters… no man of any education or other human dignity belongs to them… They have fought every new truth ever heard of, and they have killed every truth-seeker who got into their hands.”
It was a timely critique, but times have changed.
The landscape of the science versus faith debate is not the same as it was in the early twentieth century.
The definitive death of Darwin’s theories is underway.
The Big Banger remains existent and unchanged.
And no longer is it the good and faithful who are running away, clutching crucifixes and bottles of holy water to ward against the incursion of challengers to the faith.
Many are actually eager to debate.
The fact of the matter is there is a revival afoot in the sciences born not of Biblical literalism or fundamentalist fatuousness but of real, genuine scientific inquiry and a desire to appreciate the earth and universe in all their splendor and majesty.
As such, Christian men and women within the sciences are stepping up and stepping out in faith, saying, as Galileo said,
“I do not feel obliged to believe that the same God who has endowed us with sense, reason, and intellect has intended us to forgo their use.”
Christians pursue scientific truth too.
So, yes.
The “God of the gaps” that Huxley and many nineteenth- and twentieth-century scientists held in contempt is dead.
He remains dead.
Science has killed him and shot Darwin and the “steady state” alongside.
However, the God of the Bible has survived the triple homicide and come roaring back in the sciences.
In truth, as time marches on, more and more science seems to be pointing to God, and should this continue, there is a very real chance that soon, perhaps very soon, the skeptics will simply be overwhelmed and the cynics will be forced to declare themselves.
All that to say…
When it comes to the science and faith debate, consider carefully those who no longer want to debate.
Consider when it was that their scientific skepticism went away.
The issue, they say, is closed.
But methinks the emperor has no clothes.
This is a revised version of my research paper for Christian Theologies in America Spring 2020.