On April 25, 1953, James Watson and Francis Crick published a scientific paper describing for the first time the double-helical structure of the DNA molecule. For that achievement, they received the Nobel Prize—and initiated a biological revolution. The elucidation of the molecular biology of the gene clearly ranks among the greatest scientific feats of all time. Because of this discovery, a new age has dawned—the Genetic Age.
In the opinion of many scientists, the last great revolution in science was the coming of the Nuclear Age. Nuclear technology tends to be viewed as either the most powerful industry for human benefit—or the most dangerous tool for human destruction—ever available for mankind’s use. With the development of genetic engineering, the potential for controversy is even greater because in their experiments scientists no longer are dealing with merely inanimate nature, but instead with human subjects—and the consequences are far-reaching indeed. Some have made comparisons between current advances and those that led, little more than a generation ago, to the dropping of the atomic bombs over Hiroshima and Nagasaki. Science fiction writers have created, in the true tradition of Dr. Frankenstein, modern-day monsters ranging from potentially killer microorganisms to duplicates of Adolf Hitler. Some among us see the immediate demise of the human race; others see, and tremble before, the prospect of a Huxleyan Brave New World that promises the complete and utter dehumanization of mankind. What, then, is the truth of the matter?
Today the citizens of most civilized countries are better fed, better clothed, and healthier than they have ever been. Transportation, educational, medical, industrial, and even recreational facilities are vastly improved, compared to those of previous generations. Prospects for the future, then, should be brighter than ever. But are they?
While no one knows exactly what the future will hold, there are growing indications that much of it may not be for the good. The fact is that mankind has become more smug as scientific knowledge has increased. Humanity has drifted farther and farther from God, and progressively attempts to cut itself loose from the moral and ethical standards found within God’s Word. It certainly is safe to say that the average person of our day knows far less about the Bible than the common man of a half-century ago. What will happen, then, as science accelerates, while man’s relationship with, and knowledge of, his Creator degenerates? The possibilities are staggering. And the frightening thing is that now we are confronting situations we thought only future generations would have to face.
GENETIC ENGINEERING—AN OVERVIEW
In the past, genetic engineering generally was looked upon as an area of science dealing with the substitution of new (“improved”) genes for old (damaged) ones. But to the man on the street today, it means far more than that—something like conjuring up DNA monsters, or cloning world-renowned figures such as Hitler, Churchill, or Stalin. In this article, the term will be used in its broadest sense to include any form of artificial reproduction or genetic manipulation. Among some of the questions to be considered are these: (a) how extensive is our current reproductive technology; (b) how is it being employed presently; (c) what are the scientific and biblical ramifications; and (d) what should be the Christian’s response to the use of this technology?
The motivation behind much genetic engineering research is commendable. Scientists are anxious to alleviate human suffering by correcting genetic or behavioral defects, by therapeutically controlling and rehabilitating those who are dangerous to society, and by improving the general functioning and future potential of the human race. Few would argue with the goal of helping people function better. Even opponents of human genetic engineering would concede that most scientists are not attempting to be malicious or oligarchical elitists.
We must remember, however, that even scientists are not completely free of the desire for power. Further, some scientists work on underlying assumptions that suggest: (a) we can do better than nature (or as the Christian would say, better than God); (b) we are responsible to no higher Being than ourselves; (c) economic value is the final test in considering what should or should not be done; and (d) the end justifies the means. Clearly, the potential for a very real and very serious problem exists. Should this attitude become dominant, there may be no effective barrier against irresponsible uses of genetic engineering.
As we examine the concepts behind genetic engineering, we must distinguish between various types of genetic research. The first has to do with modification, which involves making minor changes in an existing structure by splicing in new genetic material, or by altering the material already present. Generally, this type of procedure has as its goal the improvement of an organism, or the prevention or cure of disease. Few would oppose such uses of genetic engineering, as long as scientists follow proper guidelines.
A second type of genetic research relating to both animals and humans centers on procreation. Currently, for example, technology that once was available only for use in animals now can be employed in humans, allowing people to reproduce when previously they were unable to do so.
A third, more controversial, type of genetic engineering centers on the creation of new life forms. Some scientists see the day approaching when we shall go beyond small-scale genetic modification to produce more novel living beings. This is a drastic departure from conferring a specific trait on an existing organism, or genetically modifying an organism so as to give it a healthier or longer life. One writer has referred to this as “engineering the engineer,” as opposed to “engineering his engine” (Kass, 1971, p. 779).
A BRIEF HISTORY OF GENETIC ENGINEERING
Historically, experiments intended to alter human life began in 1970 when Stanfield Rogers, a physician and biochemist, attempted to introduce into his patients a gene for production of the enzyme arginase. The patients’ systems were incapable of manufacturing the enzyme—a deficiency that eventually would cause their deaths. Dr. Rogers injected his subjects with a virus believed to carry the gene for the enzyme, in the hope that the virus would insert that gene into his patients’ cells. The host’s immune system subsequently would destroy the virus, yet leave behind the gene for arginase production. The experiment failed, resulting in a swift outcry of criticism from the scientific community.
In July of 1980, a more extensive experiment was attempted by Martin Cline, then head of hematology and oncology at the University of California at Los Angeles. Working with him was a team of Israeli medical doctors, headed by Eliezer Rachmilewitz of the Hadassah Hospital in Jerusalem. Patients under the care of Dr. Rachmilewitz had a rare but fatal blood disease known as beta-zero thalassemia. Dr. Cline injected their bone marrow with a new gene, in the hope that it would be able to correct the patients’ defective hemoglobin production.
Such was not to be, however. This experiment failed as well, and cost Dr. Cline his job and research grants. Few in the scientific community, at this early stage in the history of genetic experiments, were willing to put their professional careers on the line. With human lives at stake, the risk was too great. Fewer scientists still were willing to forgive those who tried—and failed.
It appeared, then, that whatever benefits might accrue to humanity from biotechnology would come only indirectly. Indeed, early successes in the field of genetic engineering seemed to confirm that fact. By the early 1980s, business ventures had been formed for the specific purpose of advancing and investing in various kinds of genetic research, the offshoots of which certainly would benefit mankind. Compounds such as interferon, and even insulin intended for use in humans, soon were being produced by genetically altered bacteria. Eventually, human growth hormone was added to that list. People were benefiting, indirectly, from genetic engineering.
By the late 1980s and early 1990s, however, the benefits derived from genetic engineering no longer were indirect. Advances in the field were arriving at breakneck speed. Hardly a day passed that scientists from one corner of the globe or another did not announce yet another breakthrough that conferred additional genetic blessings on humanity. For example, an article on “Conquering Inherited Enemies” in Time magazine announced:
Genetic engineers at a handful of U.S. laboratories are getting ready to embark on the first trials of human gene therapy, a revolutionary approach to conquering inherited ailments. Employing the subtlest available techniques of recombinant DNA, the scientists will attempt to inject healthy copies of the affected gene into the bone marrow cells of a victim of a genetic disorder. If all goes well, the good genes will begin producing enough of the missing enzyme to cure the disease. That will be cheering news for the hundreds of thousands of patients who suffer from the 3,000 known genetic disorders (Angier, 1985, p. 59).
Just five years later, another article in Time reported the epochal events surrounding the treatment of a 4-year-old girl.
Last week, on the 10th floor of the massive Clinical Center of the National Institutes of Health (NIH) in Bethesda, MD, the still unidentified child assumed a historic role. In the first federally approved use of gene therapy, a team of doctors introduced into her bloodstream some 1 billion cells, each containing a copy of a foreign gene. If all goes well, these cells will begin producing ADA, the essential enzyme she requires, and her devastated immune system will slowly begin to recover (Jaroff, 1990, p. 74).
No longer, then, are the potential benefits to humanity from genetic engineering merely indirect in nature. We have moved past the point where people enjoy longer, healthier lives simply because they can take insulin or interferon produced by genetically altered bacteria. Now people themselves are part and parcel of intricate laboratory experiments—experiments that, we are told, will bode well for humanity in both the near and distant future.
THE CURRENT CONTROVERSY OVER CLONING
Genetic engineering (in animals or in humans) potentially can take place: (a) before conception; (b) at conception; (c) prenatally; and (d) postnatally. I have dealt with each of these in an in-depth fashion elsewhere (see Thompson, 1995). For the purposes of this article, however, I would like to restrict the discussion to genetic engineering that occurs at conception.
Discussions of reproductive technologies occurring at conception usually include: (1) cloning; (2) artificial insemination; and (3) in vitro fertilization (IVF). Of these, I will limit my comments here only to cloning.
The English word “clone” derives from the Greek klon, meaning a sprout or twig, and in science refers to an asexual process of reproduction resulting in an exact genetic duplicate of the original. Cloning is quite natural for many of Earth’s life forms. For example, when the amoeba reproduces by splitting into two parts, it is cloning itself. In essence, then, cloning is a way to grow many identical cells or organisms from a single ancestor. However, most plants and animals reproduce sexually—a process that requires a contribution of genes from both the male and female of the species. Therefore, any attempt to clone such organisms, including humans, must involve sophisticated technology. In the science fiction version of cloning, a body cell (also known as a somatic cell) is used to make a copy of an individual. But cloning of relatively complex creatures, such as mammals, for example, usually must begin with an egg, or perhaps even a fertilized egg. Only then can scientists make copies of one unique set of genes.
In one technique known as nuclear transfer, an unfertilized egg is taken from the female, and its nucleus is either destroyed (e.g., by radiation) or removed. The nucleus from a body cell then is placed in the egg, which, when implanted in the uterus, behaves as if it has been fertilized, except that all of its genetic information has been derived from a single individual rather than a pair of parents.
This type of cloning possesses potential benefits. Its greatest value, however, is not as an alternative means of reproduction, but as a powerful laboratory research tool, especially in developmental biology. Cloning can aid in the study of nuclear differentiation by helping scientists to better understand how an embryonic cell becomes a nerve cell, a blood cell, etc. It also can be very helpful in the study of immunology and organ rejection. Additionally, cloning can be used with great benefit in medical research. For example, it can be used in the study of cancer, and also can be used in the study of the aging process.
During the 1950s, F.C. Steward of Cornell University demonstrated how to clone certain plants, and produced carrots by the thousands through such a procedure (see Steward, 1970). In 1952, Robert Briggs and Thomas King of the Institute for Cancer Research in Philadelphia cloned a leopard frog (see Briggs and King, 1952). Since then, carrots, tomatoes, fruit flies, and even frogs have been cloned. The successes (and there were many) were the result of painstaking research carried out using embryonic or neonatal somatic cells (viz., non-adult cells). By the late 1970s, scientists lamented that, in spite of numerous attempts in laboratories around the world, “...no one has yet shown that it is possible to clone a mammal by using a body cell nucleus from an adult” (Lygre, 1979, p. 41). Something—no one quite knew what—seemed to make the somatic cell of the adult an unlikely candidate for cloning procedures. However, investigators did not abandon their efforts, and attempts to clone organisms using adult somatic cells continued at an unprecedented pace.
Clement Markert of Yale University perfected a method that allowed researchers to remove one set of chromosomes—either those from the sperm or those from the egg—just after fertilization. Through biochemical means, the remaining set could be made to double, producing an egg with a double set of the sperm’s (or egg’s) chromosomes. Since the egg then contained the same number of chromosomes as a fertilized egg, embryonic development could begin. Peter Hoppe and Karl Illmensee at the Jackson Laboratory in Bar Harbor, Maine employed this technique in mice, and produced seven offspring, all females. While it is true that none of the seven was a clone of the genetic parents, if the same procedure were repeated on those seven mice (retaining the chromosomes of their eggs), their offspring would be clones.
The first clones of large animals were produced by S.M. Willadsen (1986), who transferred a single cell from an 8-cell sheep embryo to an unfertilized egg whose nucleus had been destroyed. Three of the four reconstituted embryos transferred to ewes’ oviducts developed into lambs that were genetically identical.
But what about attempts at human cloning? Landrum Shettles reported in the American Journal of Obstetrics and Gynecology that he personally had cloned human embryos to the blastocyst stage (the point in early development where the whole embryo has the appearance of a hollow sphere; see Clark, 1979, p. 99). As one writer summarized the experiment:
According to the report, he had removed the genetic material from a human egg cell and replaced it with the nucleus of a human spermatogonium, the precursor of the sperm cell. Because the spermatogonium contains a double set of chromosomes, it is a complete blueprint for the individual. The egg was fertilized, cell division began, and three days later the embryo was at the morula stage, its cluster of cells ready for implantation. If the paper was true, then it meant that the first glimmering of a human being had already been cloned (Kahn, 1988, p. 164, emp. added).
The operative phrase here, of course, is “if the paper was true.” Most scientists working in this field did not believe that it was, and remained skeptical of Dr. Shettles’ experiment. Why? “Shettles never presented evidence that the egg was enucleated, ...nor did he use genetic markers that would have proved that the sole parent of the embryo was indeed the transplanted spermatogonium” (Kahn, 1988, p. 164).
In 1978, science writer David Rorvik authored, and the J.B. Lippincott Company of Philadelphia published, In His Image: The Cloning of a Man. The book reportedly told the story of a 67-year-old eccentric millionaire who had himself cloned successfully and, because it was published as nonfiction, it spawned a serious scientific controversy. Most scientists dispute claims such as those made by Rorvik and others in regard to the cloning of humans. In its publication, ASM News, the American Society for Microbiology stated:
Four eminent cell biologists have testified before Congress that adult cloning of humans has not been and may never be achieved because of biological barriers. They also called David Rorvik’s book, In His Image: The Cloning of a Man, a fictional work replete with scientific errors (1978, p. 334).
One scientist suggested concerning Rorvik’s work: “His book sets new standards for the label ‘nonfiction’ ” (Lygre, 1979, p. 41). In 1981, U.S. District Court judge John Fullam ruled the book to be fiction (Fullam, 1981, p. F-2) and, several years after its publication, Lippincott publicly acknowledged the book as a hoax.
To some, however, the idea of human clones is not beyond the realm of possibility. Several years ago, Kimball Atwood, professor of microbiology at the University of Illinois, stated that humans could be cloned “within a few years” (as quoted in Rorvik, 1969, p. 9). Nobel laureate James Watson later predicted:
...if the matter proceeds in its current nondirected fashion, a human being born of clonal reproduction most likely will appear on the earth within the next twenty to fifty years, and even sooner, if some nation should actively promote the venture (1971).
To date, there is no credible evidence that humans have been cloned, in the traditional sense of the word.
But who can know what the future may hold in this regard? For example, in October 1993, at a meeting of the American Fertility Society, two Americans, Jerry Hall and Robert Stillman, touched off an unexpected controversy when they presented a research paper on IVF procedures. At the time, Dr. Hall was the director of the in vitro laboratory at George Washington University; Dr. Stillman headed the University’s IVF program. Starting with 17 human embryos ranging from the two-cell to the eight-cell stage, Hall and Stillman used new technology to multiply the embryos from 17 to a total of 48. News magazines and major city newspapers heralded the landmark event with feature articles. The New York Times published a front-page article under a headline that screamed, “Scientist Clones Human Embryos, and Creates an Ethical Challenge.” Newsweek and Time both prepared cover stories on the Hall/Stillman experiments (see Adler, 1993; Elmer-Dewitt, 1993).
The controversy caused by the Hall/Stillman experiment was due, in large part, to the fact that human embryos were involved. However, it is important to note what the experiment did, and did not involve. First, the experiment did not involve the type of cloning of science fiction fame—in which genetic material from a mature individual is nurtured and grown into a replica of the original. Second, the experiment did not involve the cutting and splicing procedures by which DNA strands from cells are mixed and matched. In some instances, to mention just one example, molecular biologists have inserted human genes into the DNA of bacteria to produce insulin in large quantities. But the Hall/Stillman experiment did not involve this kind of genetic engineering.
Hall and Stillman were searching for a way to make IVF more successful. A woman in whom only a single embryo is implanted has somewhere between a 10 and 20% chance of becoming pregnant, if all goes well. But if that single embryo could be cloned into three or four, the chances of a pregnancy would increase dramatically. These two researchers were not trying to produce cloned embryos that would be implanted into a potential mother. Instead, they were working with abnormal embryos resulting from fertilization of an egg by multiple sperm cells, and which therefore would not live more than a few days at best.
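The arithmetic behind that hope can be illustrated with a simple calculation. If, purely for the sake of illustration, each implanted embryo is assumed to have an independent 10-20% chance of establishing a pregnancy (real IVF outcomes are not strictly independent, so this is only a sketch), then the chance that at least one of several embryos succeeds follows from elementary probability:

```python
# Illustrative sketch only: assumes each embryo has an independent,
# identical chance of implanting, which real IVF outcomes do not
# strictly satisfy. The 10-20% range comes from the article's figures.

def chance_of_pregnancy(per_embryo_p, n_embryos):
    """Probability that at least one of n independent embryos implants:
    1 minus the probability that every one of them fails."""
    return 1 - (1 - per_embryo_p) ** n_embryos

for n in (1, 3, 4):
    low = chance_of_pregnancy(0.10, n)   # pessimistic per-embryo odds
    high = chance_of_pregnancy(0.20, n)  # optimistic per-embryo odds
    print(f"{n} embryo(s): {low:.0%}-{high:.0%} chance of pregnancy")
```

Under these assumptions, three embryos raise the odds to roughly 27-49%, and four to roughly 34-59%, which conveys why multiplying a single embryo into several looked so attractive to IVF researchers.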
[Figure: Method by which Stillman and Hall produced twin embryos from a single embryo (after Kolberg, 1993)]
The experiment involved allowing the embryos to divide and then separating the individual cells. To do this, the outer coating around the cells (known as the zona pellucida), which is essential to the embryo’s proper development, had to be removed. Once the cells had been separated, an artificial zona pellucida had to be created to take the place of the original, which had been destroyed. Hall and Stillman developed an artificial zona pellucida from a gel derived from seaweed. Once the artificial coating was in place, the cells began to grow.
The experiment, so far as Hall and Stillman were concerned, had been a success, and was repeated numerous times, producing 48 clones in all. But none of the clones lived more than six days. A detailed description of the process used by Hall and Stillman was published in Science News (see Fackelmann, 1994a). While many praised the novel experiment, criticism from some in the academic and scientific communities was quite strong (see Fackelmann, 1994b). Unfortunately, the headlines in newspapers and magazines were not always representative of the actual facts. Humans had not been cloned. While we cannot condone the manner in which the Hall/Stillman research was carried out (i.e., accepting the inevitable death of living human embryos as the by-product of a scientific experiment), at the same time it is important that we understand exactly what the new technology allowed scientists to do, and that we not overstate the case regarding what was accomplished.
[to be continued]
Adler, Jerry (1993), “Clone Hype,” Newsweek, pp. 60-62, November 8.
Angier, Natalie (1985), “Conquering Inherited Enemies,” Time, pp. 59-60, October 21.
ASM News (1978), (Washington, DC: American Society for Microbiology), p. 334, July.
Briggs, Robert and Thomas J. King (1952), “Transplantation of Living Nuclei from Blastula Cells into Enucleated Frog Eggs,” Proceedings of the National Academy of Sciences, 38:455-463.
Clark, Matt (1979), “Clones Again,” Newsweek, p. 99, February 12.
Elmer-Dewitt, Philip (1993), “Cloning: Where Do We Draw the Line?,” Time, pp. 65-70, November 8.
Fackelmann, Kathy A. (1994a), “Cloning Human Embryos,” Science News, 145:92-93,95, February 5.
Fackelmann, Kathy A. (1994b), “University Probe Faults ‘Cloning’ Research,” Science News, 146:406, December 17.
Fullam, John (1981), as quoted in “Clone Deemed a Hoax,” Dallas Times Herald, p. F-2, March 22.
Jaroff, Leon (1990), “Giant Step for Gene Therapy,” Time, pp. 74-76, September 24.
Kahn, Carol (1988), “Double Takes,” Omni, 11:58-65,164, October.
Kass, Leon (1971), “The New Biology: What Price Relieving Man’s Estate?,” Science, 174:779-788.
Kolberg, Rebecca (1993), “Human Embryo Cloning Reported,” Science, 262:652-653, October 29.
Lygre, David (1979), Life Manipulation (New York: Walker).
Rorvik, David (1969), “Cloning: Asexual Human Reproduction,” Science Digest, pp. 7-9, November.
Rorvik, David (1978), In His Image: The Cloning of a Man (Philadelphia, PA: Lippincott).
Steward, F.C. (1970), “From Cultured Cells to Whole Plants: The Introduction and Control of Their Growth and Differentiation,” Proceedings of the Royal Society [B], 175:1-30.
Thompson, Bert (1995), The Christian and Medical Ethics (Montgomery, AL: Apologetics Press).
Watson, James (1971), “Moving Toward the Clonal Man: Is This What We Want?,” Atlantic, 227:50-53.
Willadsen, S.M. (1986), “Nuclear Transplantation in Sheep Embryos,” Nature, 320:63-65.