Window Into the Body: Blood: Its Mystique, Uses Grow

Times Staff Writer

From antiquity, it has been synonymous with life itself. It was seen as the seat of the soul. It was thought to have profound, magical powers that could banish disease and restore youth.

Some advised drinking it as a cure for epilepsy or rabies. The Egyptians applied it to the head as a cure for baldness and graying. Others said that sipping fresh quantities of it from slain gladiators would bring strength and vigor, even to an ordinary man. Jean Baptiste Denis, a 17th-Century French physician, argued that a transfusion of it would cure insanity.

Through the ages, it was thought to be so powerful that there were strict taboos on its use. Some believed it should never touch the ground, lest it render the soil barren. Pre-Islamic Arabs and some Indian tribes in North America abstained from eating red meat for fear of becoming possessed by demons or taking on the spirits of beasts.

Testament Views

The Old Testament warned that it must never be consumed, but reserved for sacrifice--”to make atonement for your souls.” The New Testament told of how it would be transformed from wine so that all who drank it could become one with Christ.

In more recent times, the mystique of blood has, if anything, increased--even as scientists have unraveled many of its mysterious inner workings. Today blood has become as potent as an antibiotic in curing some illnesses; it has become a near-miraculous tool for diagnosing others. Indeed, it is now the window through which doctors look into the body and uncover hidden secrets such as the presence of cancer cells or the levels of cholesterol.

Yet blood’s potential uses and limitations remain somewhat misunderstood and highly controversial.

A Test for AIDS

On Tuesday and Wednesday, at a conference in Atlanta sponsored by the federal Centers for Disease Control, public health officials, private physicians and civil liberties advocates will debate one of the newest potential uses of blood: as a way to identify those who have been infected by the deadly AIDS virus.

In doing so, the participants are sure to run into not only some of man’s most deep-seated fears and expectations about blood but also some of the age-old problems of studying this liquid tissue--what it can reveal and what it cannot, how it can sometimes cure but at other times exacerbate or even cause illness.

The issue is especially urgent because the acquired immune deficiency syndrome has killed more than half its 30,000 victims in this country and is estimated to have infected as many as 1.5 million Americans in the six years since the disease was first identified. There is no cure in sight and painfully little is known, except that the virus is transmitted through body fluids, especially blood.

The scientific study of blood is a relatively new subject in modern medicine.

“The blood has fascinated mankind as long as human thought has been recorded and human memory recalls,” says Maxwell M. Wintrobe, head of the department of internal medicine at the University of Utah Medical Center in Salt Lake City.

“To the quality of the blood has been attributed the fate of nations as well as the outcome of individual relationships. The poets spoke of thick blood and thin blood, of pale blood, of noble blood, of pure and eloquent blood. To be of the same or different blood mattered much in human affairs.

“Yet, with so much attributed to blood and so long a history of attention given to it, it is strange that medical science was slow in directing its inquiries to this important fluid,” concludes Wintrobe, author of a recent book, “Hematology: The Blossoming of a Science.”

Dramatic Changes

A number of dramatic changes--from the invention of the microscope in the late 16th Century to William Harvey’s explanation of how blood circulates through the body in the early 17th--had to occur before scientific work on blood could commence. Nonetheless, crude experiments on blood began early in the history of Western civilization.

The idea of transfusing blood dates back at least to the beginning of the Roman Empire and seems to come from literature, according to Harold A. Oberman, writing in a 1981 book “Clinical Practice of Blood Transfusion.”

One of the first references to blood transfusion, he said, occurs in Ovid’s “Metamorphoses,” in which Jason asks Medea to restore his father’s youth. Medea did this by draining the old man’s blood and filling “his ancient veins with a rich elixir” of roots and herbs, owl’s wings, werewolf’s entrails and “the head of a crow which had been alive for nine centuries.”

According to medical historians, there is evidence that doctors in the 15th Century planned to transfuse the ailing Pope Innocent VIII with the blood of young boys, although the scheme apparently never came to pass because of the Pope’s concern over the fate of the children. Documents from the 17th Century reveal numerous attempts to perform experimental blood transfusions on dogs and birds.

Animal-Blood Transfusion

The first recorded blood transfusion from an animal to a human took place in 1667, after Harvey’s revolutionary theories about the functioning of the heart and the circulation of blood became widely known.

Although there have since been debates as to who deserves credit for being the first to transfuse a human subject--a Frenchman or an Englishman--most historians now credit the French.

Jean Baptiste Denis’ work, however, was not greeted with much pride or enthusiasm by the French medical establishment of his day. Denis had successfully transfused four patients, but when his fifth patient died, the medical establishment of Paris, apparently threatened by what the new procedure might do to their more traditional practices, quickly descended upon him.

Denis’ unfortunate patient, a 34-year-old servant, was being treated with a transfusion of less “heated” blood in an attempt to cure him of periodic bouts of insanity. In his account of the procedure, Denis explained that he had used animal rather than human blood in all of his experiments because animal blood was less likely “to be rendered impure by passion or vice.”

Outlawed Procedure

Denis’ medical colleagues not only denounced him, they prodded and may even have bribed the dead patient’s wife into filing a lawsuit against him. While the courts eventually exonerated Denis, the medical establishment of Paris never forgave him and eventually persuaded much of Europe to outlaw his experimental procedure.

For 150 years, the procedure lay dormant. Although bloodletting continued to be a popular cure for plagues and fevers and a means of releasing “evil spirits” from the body, for the most part the curative power of blood returned to the province of the imaginations of poets and philosophers.

When the practice of blood transfusion was taken up again, in London in the early 19th Century, a major theoretical advance had been made.

Reflecting the work of several of his colleagues, James Blundell, an outstanding obstetrician of his day, had begun to suspect that “the blood of one animal could not be substituted for that of another with impunity,” Oberman said in his history of blood transfusion.

Most scholars now credit Blundell with performing the first human-to-human transfusion, and he correctly saw the procedure as an appropriate treatment for women who hemorrhaged during childbirth.

Unorthodox Views

Nonetheless, Blundell, too, was often at odds with the medical establishment of his day and, largely because of his unorthodox views about blood, was eventually forced to resign his position at one of London’s leading hospitals.

In the early 20th Century, Karl Landsteiner discovered the existence of human blood groups, which largely assured the safety of transfusions because, until then, many patients became ill or died as a result of getting incompatible blood. Yet, according to Reuben Ottenberg, a New York physician who first used typing and compatibility tests before performing transfusions, it “took about five years of campaigning, experimenting and a few accidents to convince the medical public that blood tests before transfusion were essential.”

If doctors were slow to accept blood transfusions and blood analyses as a routine part of their medical practice, there were certainly understandable reasons for their delay, medical historians now agree.

Until the advances of modern chemistry and the discovery of anti-coagulants in the early 20th Century, blood was likely to clot during even the simplest attempt at a transfusion, often with deadly results for the patient.

Early Transfusions

Early transfusions had relied on quills and crude syringes, so modern laboratory and surgical equipment had to be developed before doctors could even remove blood from one body and put it into another with any degree of safety or precision.

And sterilization and refrigeration had to be invented before blood could be used without a high risk of infection or other forms of deadly contamination.

Now professional blood banks exist all over the world, collecting and storing blood for use in emergencies, planned surgeries and a variety of medical therapies, including some forms of cancer treatment.

In the United States alone, well over a million gallons of blood are drawn each year, and the demand grows ever faster as more complex medical procedures are developed. Among the most dramatic developments in recent years has been the explosive rise in heart surgery and organ transplants.

“Blood is a very interesting material,” said George S. Smith, professor of pathology at UCLA and director of the university’s clinical laboratories. “There probably never will be enough. The use expands to meet the supply.”

Formidable Obstacles

Indeed, even with enormous scientific advances, there are still formidable obstacles in finding enough blood for all of its medical uses.

“Five percent of the population gives all the blood,” Smith said. “No matter what we do, we don’t seem to be able to change that figure much.”

One solution to the difficulties and impracticabilities of getting blood for transfusions was tried in Moscow in the 1930s. There, doctors turned to fresh cadavers. It is an approach that some doctors in the United States still think should be explored further. After all, they argue, the use of cadaver blood differs little, philosophically and scientifically, from the use of donor organs. But the idea has met with general abhorrence in Western countries.

Why? Perhaps because blood continues to carry with it a powerful mystique, even in sophisticated societies. The association between blood and death brings forth images of graveyards and vampires; indeed, some religious groups, such as the Jehovah’s Witnesses, oppose blood transfusions altogether.

“There is something about blood,” Smith said. “Many people are terrified of it. It’s an uncontrollable feeling. I don’t know any other substance that will cause a significant portion of the population to faint when they see it.”

Microscope’s Impact

The success not only of blood transfusions but of modern medicine in general has depended to a large extent on scientists’ understanding of what blood is and how it functions in the body.

In that regard, perhaps the single most important development, after Harvey’s discovery of circulation, occurred in the 19th Century, when the microscope came into widespread use--thanks largely to a French physician named Alfred Donné. He established a course on the use of microscopes and promoted the idea of using them not just as a toy or curiosity but as an instrument with which to examine body fluids.

This “toy,” which has allowed doctors and scientists to peer into the microscopic world of the body, has in turn opened the way for the 20th Century to establish hematology--the study of blood, the blood-forming tissues and blood diseases--as a full-fledged branch of medical science.

When advances in the field of hematology finally began to emerge, they came at breakneck speed. In many cases, they have come so fast and in such great quantity that the medical community has barely been able to keep pace.

It is now known, for example, that there are not just a handful of blood types (A, B, AB, O, Rh negative and Rh positive) but more than 200 blood groups. So far, however, only a fraction of these groups have proven particularly useful to physicians or researchers.

Artificial Blood

In recent years, scientists have begun to explore the possibility of using artificial blood. But such synthetic agents, though promising, have not yet lived up to expectations.

Just as the blood was once thought to reveal much about the spirit and the psyche, now it is known that the blood serves virtually every part of the body, moving oxygen from the lungs to the organs, transporting hormones from the glands to the tissues, and carrying cells and other substances to aid the body in its defense against infection.

As a result of this knowledge of what blood does, medical researchers have been able to devise numerous tests to show what blood reveals.

Hematologic tests of red blood cells, for example, can identify disorders such as anemia; white cell counts can expose infections and confirm leukemia. In bacteriological tests, blood samples are drawn and then grown in various culture media to isolate and identify the microorganisms that cause such illnesses as blood poisoning. Chemical tests uncover changes in the sugar, fat, cholesterol and other chemical constituents of the blood, thus revealing and monitoring such disorders as diabetes and heart disease.

So helpful have these tests been in diagnosing disease that even modern scientists have, on occasion, gotten carried away with their expectations. Less than two decades ago, for example, medical researchers throughout the world were convinced that an agent in blood called carcinoembryonic antigen, or CEA, would provide an accurate early warning sign of cancer. The theory, once thought to be worthy of the Nobel Prize, proved to be unreliable.

Test for AIDS Virus

One of the most important, and surely the most controversial, forms of blood testing involves immunological or serologic tests, which can reveal the presence of infectious agents such as the viruses that cause hepatitis and AIDS and the bacterium that causes syphilis.

Such tests have been controversial because, like most medical procedures, they are not 100% accurate. Moreover, there has been increasing concern in recent years about the ethics of mandating any kind of screening tests, even for contagious diseases.

The Centers for Disease Control, which is sponsoring this week’s conference on widespread blood testing for the AIDS virus, has proposed that states consider such screening for all marriage license applicants, for everyone who is admitted to a hospital and for those seeking medical care for pregnancy or for sexually transmitted diseases.

Advocates of such mandatory screening say these measures may be the only hope of containing the spread of the epidemic. But opponents say widespread blood testing for the AIDS virus may result in gross civil liberties violations while doing little to abate the spread of the infection. They note, for instance, that premarital blood tests for syphilis, which were mandated by almost every state between the mid-1940s and the early 1960s, now are generally viewed as a costly and ineffective means of stopping venereal disease and have been abandoned by many states.

History also shows that medical screening tests can be misused, out of either ignorance or prejudice, said Joel Howell, a physician and medical historian at the University of Michigan.

Syphilis Prejudice

For example, in the not-too-distant past in the United States, syphilis victims were denied marriage licenses, excluded from the military and often denied jobs.

Epileptics also were refused marriage licenses and were excluded from schools and colleges. Similarly, tuberculosis victims were excluded from jobs and schools, and if they were unlucky enough to have been immigrants, they were turned out of the country altogether.

Two experiments, one that ended in the 1960s, the other in the 1970s, turned the tide of public opinion against human testing and led eventually to the codifying of many of the current federal regulations that severely restrict the use of human subjects in medical experiments, said James H. Jones, a professor of history at the University of Houston whose 1981 book, “Bad Blood,” chronicles the history and aftermath of one of those experiments.

One was known as the Tuskegee Syphilis Experiment, which involved the U.S. Public Health Service and various state and local agencies in Alabama. Over a period of 40 years, researchers deliberately withheld treatment from about 400 black men who were suffering from syphilis. It ended in 1972 after being uncovered by an Associated Press reporter.

The other experiment involved the intentional inoculation of mentally retarded children at Willowbrook State School in Staten Island with the hepatitis virus in an effort to study the highly contagious disease.

To some it is inconceivable that anything so egregious would ever be attempted again in this country. But to others, the idea is not at all farfetched, especially in view of the unpopular groups that are most at risk of contracting AIDS: intravenous drug users and homosexual men.

Regardless of the historical and scientific problems associated with blood testing, various pressures for it continue to mount, not the least of which is the public’s expectation that, somehow, blood tests by themselves will stop the spread of AIDS.

The power of blood is a notion that “runs very deep through our culture . . . and may be very hard to overcome,” said Allan M. Brandt, a Harvard Medical School historian.
