In the feature article, I raised the issue of time, only to say that it does not need to be a problem. Whether differences built up between populations gradually, or rapidly at the beginning, or in occasional brief spurts of intense change, seems to be an empirical matter. The pattern of change should not be assumed ahead of time.
However, we do have to work within certain constraints. If James Ussher’s dates are anything to go by, the Flood occurred in 2348 B.C., and the dispersion from the Tower of Babel occurred in 2234 B.C. Even conservative writers do not agree on the exact dating (e.g., Morris, 1974, pp. 247-250) but, for the sake of argument, let us say that human variation began around the time of Ussher’s date for the Flood. This sets a time limit of approximately 4,350 years.
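The "approximately 4,350 years" figure can be checked with a line of arithmetic. A minimal sketch, assuming the article was written around A.D. 2000 (an assumption; the date of writing is not given in the text):

```python
# Years elapsed from Ussher's Flood date to an assumed writing date of A.D. 2000.
flood_bc = 2348      # Ussher's date for the Flood (from the text)
writing_ad = 2000    # assumed date of writing, not stated in the article
# There is no year zero, so one year is subtracted from the simple sum.
elapsed = flood_bc + writing_ad - 1
print(elapsed)       # 4347, i.e., approximately 4,350 years
```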
Next, we need to know the extent of variation. The commonly cited figure is 0.2%. In other words, if you were to compare your DNA with the DNA of a stranger picked randomly from anywhere in the world, you would find that two base pairs (the “rungs” of the twisted, ladder-like DNA molecule) in every thousand base pairs are different, on average. There are around 3 billion base pairs in human DNA, so 0.2% of this figure would equal 6 million base pairs.
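The 6-million figure follows directly from the two numbers just cited. A quick sketch, using only the article's own values (Python is used purely for illustration):

```python
# Back-of-the-envelope check of the variation figures in the text.
genome_size = 3_000_000_000   # approximate base pairs in human DNA (from the text)
variation = 0.002             # 0.2% average pairwise difference (from the text)

differing_base_pairs = int(genome_size * variation)
print(differing_base_pairs)   # 6000000 -- i.e., 6 million base pairs, on average
```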
Actually, the situation is a little worse than this. If ancient art is anything to go by, skin coloration was a significant feature at an early stage (again, for the sake of argument, I will not worry about the discrepancies between archaeological and biblical chronologies). We could assume that obvious physical variations were fairly well established by the time of Abraham (c. 2000 B.C.). Is there enough time to accumulate these changes in the first few hundred years after the Flood?
The situation is helped a little by the estimate that only 6% of the 0.2% variation represents differences across major groupings (Gutin, 1994, p. 72). Between, say, a European and an Asian chosen at random, we would expect to find a difference of only 360,000 base pairs. Of course, all we need are sufficient mutations in the genes most responsible for making us appear different from people in other places. In the case of skin color (see feature article), this could mean a few mutations among a handful of genes.
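The 360,000 figure is simply 6% of the total pairwise difference. A minimal check, again using only the numbers already cited in the text:

```python
# Portion of the 0.2% variation that falls between major groupings.
total_difference = int(3_000_000_000 * 0.002)  # 6 million base pairs (from the text)
between_groups = 0.06                          # 6% across major groupings (Gutin, 1994)

print(int(total_difference * between_groups))  # 360000 base pairs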
So far, this is just a sketch of where we need to go in terms of a biblical model. No one, including the evolutionist, has explained all the empirical data. Still, accumulating 6 million mutations in so short a time requires some explanation.
One solution may lie in much higher mutation rates. Most estimates have rested on molecular clocks which, in turn, have rested on evolutionary assumptions. Until recently, we have not had good empirical measures of the mutation rates in humans. The situation improved when geneticists were able to analyze DNA from individuals with well-established family trees going back several generations. One study found that mutation rates in mitochondrial DNA are 18 times higher than previous estimates (Parsons et al., 1997). If this new rate were applied to the “mitochondrial Eve” research, it would turn out that this hypothetical woman lived 6,000 years ago. “No one thinks that’s the case,” science writer Ann Gibbons is quick to point out (1998, 279:29). Still, if these new estimates hold, evolutionary anthropologists will have to do some fancy footwork around their dates for key events in the development of modern humans. Most important, the new data may bring an empirical biblical model within closer reach.
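The logic behind the "6,000 years" figure is that, for a fixed number of observed differences, the inferred divergence time is inversely proportional to the assumed mutation rate: multiply the rate by 18, and the date shrinks by a factor of 18. A rough sketch, where the ~100,000-year conventional starting figure is an assumption for illustration only and does not appear in the article:

```python
# If time = differences / rate, then raising the rate 18-fold
# divides the inferred time by 18.
conventional_age = 100_000   # years; a common evolutionary estimate (assumed here)
rate_multiplier = 18         # higher observed rate (Parsons et al., 1997)

revised_age = conventional_age / rate_multiplier
print(round(revised_age))    # 5556 -- on the order of the "6,000 years" figure
```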
Gibbons, Ann (1998), “Calibrating the Mitochondrial Clock,” Science, 279:28-29, January 2.
Gutin, Joann C. (1994), “End of the Rainbow,” Discover, 15:70-75, November.
Morris, Henry M. (1974), Scientific Creationism (San Diego, CA: Creation-Life Publishers).
Parsons, Thomas J., et al. (1997), “A High Observed Substitution Rate in the Human Mitochondrial DNA Control Region,” Nature Genetics, 15:363.