In the past two months I have talked with many people who have a keen
interest in whether the Senate will decide to ban therapeutic cloning.
At a conference at a Philadelphia hospital, a large number of people,
their bodies racked with tremors from Parkinson's disease, gathered to
hear me speak about the ethics of stem cell research. A few weeks
earlier I had spoken to another group, many of whom were breathing with
the assistance of oxygen tanks because they have a genetic disease,
Alpha-1 antitrypsin deficiency, that destroys their lungs and livers.
Earlier still I met with a group of parents whose children are paralyzed
as a result of spinal cord injuries.
At each meeting I told the audience there was a good chance that the
government would criminalize research that might find answers to their
ailments if it required using cloned human embryos, on the grounds that
research using such embryos is unethical. The audience members were
incredulous. And well they should have been. A bizarre alliance of
antiabortion religious zealots and technophobic neoconservatives along
with a smattering of scientifically befuddled antibiotech progressives
is pushing hard to insure that the Senate accords more moral concern to
cloned embryos in dishes than it does to kids who can't walk and
grandmothers who can't hold a fork or breathe.
Perhaps it should come as no surprise that George W. Bush and the House
of Representatives have already taken the position that any research
requiring the destruction of an embryo, cloned or otherwise, is wrong.
This view derives from the belief, held by many in the Republican camp,
that personhood begins at conception, that embryos are people and that
killing them to help other people is simply wrong. Although this view
about the moral status of embryos does not square with what is known
about them--science has shown that embryos require more than genes in
order to develop, that not all embryos have the capacity to become a
person and that not all conception begins a life--it at least has the
virtue of moral clarity.
But aside from those who see embryos as tiny people, such clarity of
moral vision is absent among cloning opponents. Consider the views of
Leon Kass, William Kristol, Charles Krauthammer and Francis Fukuyama.
Each says he opposes research involving the cloning of human embryos.
Each has been pushing furiously in the media and in policy circles to
make the case that nothing could be more morally heinous than harvesting
stem cells from such embryos. And each says that his repugnance at the
idea of cloning research has nothing to do with a religiously based view
of what an embryo is.
The core of the case against cloning for cures is that it involves the
creation, to quote the latest in a landslide of moral fulminations from
Krauthammer, "of a human embryo for the sole purpose of using it for its
parts...it will sanction the creation of an entire industry of embryo
manufacture whose explicit purpose is...dismemberment for research."
Sounds like a very grim business indeed--and some progressives, notably
Jeremy Rifkin and Norman Mailer, have sounded a similar alarm as they
have joined the anticloning crusade.
From the secular viewpoint, which Krauthammer and like-minded cloning
opponents claim to hold, there is no evidence for the position that
embryonic clones are persons or even potential persons. As a simple fact
of science, embryos that reside in dishes are going nowhere. The
potential to become anything requires a suitable environment. Talk of
"dismemberment," which implicitly confers moral status on embryos,
betrays the sort of faith-based thinking that Krauthammer says he wants
to eschew. Equally ill-informed is the notion that equivalent medical
benefits can be derived from research on adult stem cells; cloned
embryonic stem cells have unique properties that cannot be duplicated.
The idea that women could be transformed into commercial egg farms also
troubles Krauthammer, as well as some feminists and the Christian
Medical Association. The CMA estimates that to make embryonic stem-cell
cloning work, more than a billion eggs would have to be harvested. But
fortunately for those hoping for cures, the CMA is wrong: Needed now for
cloned embryonic stem-cell research are thousands of eggs, not billions.
While cloning people is a long shot, cloning embryos is not, and it
should be possible to get the research done either by paying women for their eggs or by asking those who suffer from a disease, or who care about someone who does, to donate them. Women are
already selling and donating eggs to others who are trying to have
babies. Women and men are also donating their kidneys, their bone marrow
and portions of their livers to help others, at far greater risk to
themselves than egg donation entails. And there is no reason that embryo
splitting, the technique used today in animals, could not provide the
requisite embryo and cloned stem-cell lines to treat all in need without
a big increase in voluntary egg donation from women.
In addition to conjuring up the frightening but unrealistic image of
women toiling in Dickensian embryo-cloning factories, those like
Krauthammer, who would leave so many senior citizens unable to move
their own bodies, offer two other moral thoughts. If we don't draw the
line at cloning for cures, there will soon enough be a clone moving into
your neighborhood; and besides, it is selfish and arrogant to seek to
alter our own genetic makeup to live longer.
The reality is that cloning has a terrible track record in making
embryos that can become fetuses, much less anything born alive. The most
recent review of cloning research shows an 85 percent failure rate in
getting cow embryos to develop into animals. And of those clones born
alive, a significant percentage, more than a third, have serious
life-threatening health problems. Cloned embryos have far less potential than embryos created the old-fashioned way, or even frozen embryos, to become anything except a ball of cells that can be tricked into becoming other cells that can cure diseases. Where Krauthammer sees
cloned embryos as persons drawn and quartered for their organs, in
reality there exists merely a construct of a cell that has no potential
to become anything if it is kept safely in a dish and almost no
potential to develop even if it is put into a womb. Indeed, current work on primate cloning has been so unproductive--not one has been made to date--that there is a growing sentiment in scientific circles that human cloning for reproduction is impossible. The chance of anyone
cloning a full-fledged human is almost nil, but in any case there is no
reason that it cannot be stopped simply by banning the transfer of these
embryos into wombs.
But should we really be manipulating our genes to try to cure diseases
and live longer? Kass and Fukuyama, in various magazine pieces and
books, say no--that it is selfish and arrogant indulgence at its worst.
Humanity, they argue, is not meant to play with its genes simply to live longer.
Now, it can be dangerous to try to change genes. One young man is dead
because of an experiment in gene therapy at my medical school. But the
idea that genes are the defining essence of who we are and therefore
cannot be touched or manipulated recalls the rantings of Gen. Jack D.
Ripper in Dr. Strangelove, who wanted to preserve the
integrity of his precious bodily fluids. There's nothing inherently
morally wrong with trying to engineer cells, genes and even cloned
embryos to repair diseases and terminal illnesses. Coming from those who
type on computers, wear glasses, inject themselves with insulin, have
had an organ transplant, who walk with crutches or artificial joints or
who have used in vitro fertilization or neonatal intensive care to
create their children, talk of the inviolate essence of human nature and
repugnance at the "manufactured" posthuman is at best disingenuous.
The debate over human cloning and stem cell research has not been one of
this nation's finest moral hours. Pseudoscience, ideology and plain
fearmongering have been much in evidence. If the discussions were merely
academic, this would be merely unfortunate. They are not. The flimsy
case against cloning for cures is being brought to the White House, the
Senate and the American people as if the opponents hold the moral high
ground. They don't. The sick and the dying do. The Senate must keep its
moral priorities firmly in mind as the vote on banning therapeutic
cloning draws close.
A recent front-page story in the Boston Globe proclaimed that New
England leads the nation in Ritalin prescription levels. Somewhat to my
surprise, the prevalence of Ritalin ingestion was generally hailed as a
good thing--as indeed it may be in cases of children with ADHD. But to
me the most startling aspect of the Globe's analysis was the
seeming embrace in many places of Ritalin as a "performance enhancer."
Prescription rates are highest in wealthy suburbs.
While the reasons for such a statistical skewing need more exploration
than this article revealed, what I found particularly interesting was
the speculation that New Englanders have a greater investment in
academic achievement: "'Our income is higher than in other states, and
we value education,' said Gene E. Harkless, director of the family
nurse-practitioner program at the University of New Hampshire. 'We have
families that are seeking above-average children.'"
Aren't we all. (And by "all," I mean all--wouldn't it be nice if
everyone understood that those decades of lawsuits over affirmative
action and school integration meant that poor and inner-city families
also "value education" and are "seeking above-average children"?) But
Ritalin, after all, works on the body as the pharmacological equivalent
of cocaine or amphetamines. It does seem a little ironic that poor
inner-city African-Americans, who from time to time do tend to get a
little down about the mouth despite the joys of welfare reform, are so
much more likely than richer suburban whites to be incarcerated for
self-medicating with home-brewed, nonprescription cocaine derivatives.
If in white neighborhoods Ritalin is being prescribed as a psychological
"fix" no different from reading glasses or hearing aids, it's no wonder
the property values are higher. Clearly the way up for ghettos is to
sweep those drugs off the street and into the hands of drug companies
that can scientifically ladle the stuff into underprivileged young black
children. I'll bet that within a single generation, the number of
African-Americans taking Ritalin--to say nothing of Prozac and
Viagra--will equal rates among whites. Income and property values will
rise accordingly. Dopamine for the masses!
Another potential reason for the disparity is, of course, the matter of
access to medical care. Prescriptions for just about anything are likely
to be higher where people can afford to see doctors on a regular
basis--or where access to doctors is relatively greater: New England has
one of the highest concentrations of doctors in the country. But access
isn't everything. Dr. Sally Satel, a fellow at the American Enterprise
Institute, says that when she prescribes Prozac to her lucky
African-American patients, "I start at a lower dose, 5 or 10 milligrams
instead of the usual 10-to-20-milligram dose" because "blacks metabolize
antidepressants more slowly than Caucasians and Asians." Her bottom line
is that the practice of medicine should not be "colorblind" and that
race is a rough guide to "the reality" of biological differences.
Indeed, her book, PC, M.D.: How Political Correctness Is Corrupting
Medicine, is filled with broad assertions like "Asians tend to have
a greater sensitivity to narcotics" and "Caucasians are far more likely
to carry the gene mutations that cause multiple sclerosis and cystic
fibrosis." Unfortunately for her patients, Dr. Satel confuses a shifting
political designation with a biological one. Take, for example, her
statement that "many human genetic variations tend to cluster by racial
groups--that is, by people whose ancestors came from a particular
geographic region." But what we call race does not reflect geographic
ancestry with any kind of medical accuracy. While "black" or "white" may
have sociological, economic and political consequence as reflected in
how someone "looks" in the job market or "appears" while driving or
"seems" when trying to rent an apartment, race is not a biological
category. Color may have very real social significance, in other words,
but it is not the same as demographic epidemiology.
It is one thing to acknowledge that people from certain regions of
Central Europe may have a predisposition to Tay-Sachs, particularly
Ashkenazi (but not Sephardic or Middle Eastern) Jews. This is a reality
that reflects extended kinship resulting from geographic or social
isolation, not racial difference. It reflects a difference at the
mitochondrial level, yes, but certainly not a difference that can be
detected by looking at someone when they come into the examining room.
For that matter, the very term "Caucasian"--at least as Americans use
it, i.e., to mean "white"--is ridiculously unscientific. Any given one
of Dr. Satel's "Asian" patients could probably more reliably claim
affinity with the peoples of the Caucasus mountains than the English-,
Irish- and Scandinavian-descended population of which the gene pool of
"white" Americans is largely composed. In any event, a group's
predisposition to a given disease or lack of it can mislead in making
individual diagnoses--as a black friend of mine found out to his
detriment when his doctor put off doing a biopsy on a mole because
"blacks aren't prone to skin cancer."
To be fair, Dr. Satel admits that "a black American may have dark
skin--but her genes may well be a complex mix of ancestors from West
Africa, Europe and Asia." Still, she insists that racial profiling is of
use because "an imprecise clue is better than no clue at all." But let
us consider a parallel truth: A white American may have light skin, but
her genes may well be a complex mix of ancestors from West Africa,
Europe and Asia. Given the complexly libidinous history of the United
States of America, I worry that unless doctors take the time to talk to
their patients, to ask, to develop nuanced family histories or, if
circumstances warrant, to perform detailed genomic analyses, it would be
safer if they assumed that, as a matter of fact, they haven't a clue.
We live in a world where race is so buried in our language and habits of
thought that unconscious prejudgments too easily channel us into
empirical inconsistency; it is time we ceased allowing anyone, even
scientists, to rationalize that consistent inconsistency as "science."
More than the much-reviled products of Big Tobacco, Big Food and its big helpings constitute the number-one threat to America's children, especially when the fare is heavy in fats, sugars and salt. Yet the nation so concerned about protecting kids from nefarious images on library computers also
allows its schools to bombard them with food and snack ads on Channel One and to sign exclusive deals with purveyors of habit-forming, tooth-rotting, waist-swelling soft drinks.
Foreigners who arrive in the United States often remark on the national obsessions about food and money. It is perhaps not surprising that a gluttonous mammon would rule the federal regulators of our food chain, but Marion Nestle, professor of nutrition at New York University, confesses that she has heard few of her nutritionist colleagues discuss the cardinal point: "Food companies will make and market any product that sells, regardless of its nutritional value or its effect on health."
Nestle goes on to demonstrate that not only do food companies use traditional corporate checkbook clout with Congress to insure their unfettered right to make money; they also co-opt much of the scientific and nutritional establishment to aid in their efforts. For example, the omnipresent "milk mustache" advertisements often show blacks and Asians--precisely those who are most likely to be lactose-intolerant. But then "science" rides to the rescue: There are a lot more research dollars shunted to those arguing that lactose intolerance is not a problem than there are for those who think otherwise. In fact, the Physicians' Committee for Responsible Medicine sued to annul the federal dietary guidelines, which recommended two to three servings of milk products daily; six of the eleven people on the voting committee had received research grants, lectureships or other support from the food industry.
Here Nestle wobbles a little in her argument, however. She waves the standard of science on behalf of the Food and Drug Administration when it comes to food supplements and herbal medicines, but devalues the "science" as well by revealing the conflicts of interest among researchers and regulators. Science is often up for sale. Researchers go to the food corporations for the same reason that bandits rob banks: That's where the money is, not least since the FDA's own research funding is controlled by Congressional committees in charge of agriculture, whose primary aim is hardly to promote dieting--it is the force-feeding of agribusiness with federal funds. Indeed, Nestle concedes, "USDA officials believe that really encouraging people to follow dietary guidelines would be so expensive and disruptive to the agricultural economy as to create impossible political barriers."
The dietary guidelines Nestle is referring to were monumentalized in the famous "food pyramid" familiar to every primary school student. But the pharaohs finished theirs in less time than it took the USDA to pilot its version past the army of lobbyists who resented the hierarchical implication that some foods were healthier than others. As a whistleblower on the advisory committee that was drawing up the guidelines, Nestle is well qualified to recount the obstacles it faced.
In fact, many people did become more health-conscious as a result of such guidelines, but as Nation readers know, practice does not always match theory. Much of Food Politics reveals how the food industry has seized upon the marketing possibilities of consumers' safety concerns and perverted them by adding supplements to junk foods and then making health claims for the products.
Food is an elemental subject, on a par with sex and religion for the strength of people's beliefs about it. Otherwise rational people have no difficulty believing the impossible during breakfast, where their stomachs are concerned. Big Food relies on that snake-oil factor, the scientific illiteracy of most consumers. For example, marketers are happy with the advice to eat less saturated fat, since most buyers won't recognize it when it's drizzled across their salad. But advice to eat less of anything recognizable stirs up serious political opposition.
Federal dietary guidelines recommending that we "eat less" were thinned down to suggesting that we "avoid too much," which metabolized into "choose a diet low in..." And so on. For example, Nestle relates how in 1977 the National Cattlemen's Association jumped on Bob Dole's compromise wording on reducing red meat in the diet and increasing lean meat consumption: "Decrease is a bad word, Senator," the cattlemen warned him. The cowboys effectively corralled the McGovern committee on dietary guidelines: "Decrease consumption of meat" was fattened into "choose meats, poultry and fish which will reduce saturated fat intake."
Sometimes the more potential for harm, the more likely it seems that a product's positive--or putative--health benefits will be touted. We get vitamin-supplemented Gummi Bears and, what provokes Nestle's justifiable ire most, Froot Loops. This Marshmallow Blasted "cereal...contains no fruit and no fiber" and "53% of the calories come from added sugar," she inveighs. The perfect breakfast complement to a twenty-ounce bottle of cola that will be downed in school? Such pseudo-foods occupy the very top of the food pyramid, which characterizes them as to be used sparingly--or rather, only to be used if you have good dental insurance.
As Nestle points out, health warnings on alcohol and tobacco have done little to stop consumers. But picture a tobacco company allowed to sell cigarettes as "healthier" or "with added vitamins." (Indeed, she details a campaign by the alcohol companies to get Congress to allow them to market their products as healthy elixirs until Strom Thurmond's religious principles outweighed his conservatism enough for him to help shoot down the proposal.)
I was mildly surprised that Nestle does not comment on the imprecise use of "serving" information on food packaging. As a longtime student of labels, I find that the unhealthiest foods seem to have incredibly small "servings" compared with what consumers actually eat or drink. For the USDA, one slice of white bread or one ounce of breakfast cereal is a "serving" of grain, and nutritional data such as caloric content are rendered "per serving." A cinema-size actual serving of soda may contain 800 calories in sugar, before you get down to the buttered popcorn, not to mention the Big Mac before or after.
Food marketers are hardly breaking people's arms to persuade them to eat this stuff, of course. It is, after all, a great American principle that you can have your cake, eat it and slim down at the same time. What Nestle calls "techno-foods"--those labeled "healthier," "less fat," "lite," "more fiber"--pander to the health consciousness of a generation that will do anything to lose weight and live longer, except eat less.
The ultimate example of food marketing has to be Olestra, the cooking fat that passes through the gut undigested. Its maker, Procter & Gamble, has spent up to $500 million on it and twenty-seven years of the FDA's time getting various approvals, while it kept trying to remove the mandated health warning that the product could cause cramping and loose stools and block the absorption of fat-soluble vitamins. P&G should count its blessings. A Center for Science in the Public Interest report says: "Olestra sometimes causes underwear staining. That phenomenon may be caused most commonly by greasy, hard-to-wipe-off fecal matter, but occasionally also from anal leakage (leakage of liquid Olestra through the anal sphincter)."
By 1998 Procter & Gamble disingenuously claimed that 250 tons, or four railcarfuls, of fat had not been added to American waistlines. No one claimed it had--what the company meant was that was how much Olestra had been used to fry chips. Public expectations were quite high, though; Nestle says that "people also were disappointed that the chips did not help them lose weight." Indeed, she reports that some ended up with more calories from eating Olestra-fried chips than they would have from other kinds, because they consumed a higher volume, convinced the chips were calorie-free, though of course they were not.
But given the amount of money involved and the way food-industry/scientific-community connections are structured, "it is virtually impossible for any nutritionist interested in the benefits and risks of Olestra to avoid some sort of financial relationship with P&G unless one systematically refuses all speaking invitations, travel reimbursements, honoraria and meals from outside parties," Nestle observes.
In yet another case of Big Food getting its way, Nestle chronicles how the State Department came to declare that signing the World Health Organization/UNICEF international code on marketing of baby formula would flout the Constitution. "Inasmuch as this explanation strains credulity," Nestle suggests, the real reason was lobbying by US formula companies. The formula makers are fighting a war of attrition against mother's milk, in other words, not just here but internationally.
A more recent case involves the coalition that forced the FDA to allow claims of benefits from untested herbal supplements. I wish Nestle had gone into more detail about the sociology of this mélange of New Age alternative-medicine users, libertarian types and those who mistrust the medical establishment. Groups like Citizens for Health and the Corporate Alliance for Integrative Medicine rallied behind the rapidly growing corporations to ram a suppository up the FDA and its power to control sales of what on occasion have proven to be fatally flawed "alternative" remedies for everything from impotence to Alzheimer's. As she quotes an FDA commissioner, "[We] are literally back at the turn of the century, when snake oil salesmen made claims for their products that could not be substantiated." She reports claims that 12 percent of users of herbal medicines, or about 12 million people, suffer from some kind of adverse effect.
People may feel better when they take supplements, but should health officials use "feelings" as a basis for regulatory measures? she asks. Or should the FDA instead "take the lead in reenergizing a crucial phase of its basic mission to promote honest, rational scientific medicine by vigorously combating its opposite"?
Many people may want to know what "science" is. Is it corporate-sponsored research, or the AMA defending its professional turf with the same vigor with which it has traditionally fought "socialized medicine"? Nestle shows how the American Academy of Pediatrics tried to insure that highly profitable baby formula flowed through its hands and rallied against direct sales to mothers. Was that concern for the "client" or concern for professional prerogatives?
Perhaps Nestle should have been more polemical. The food supplement row raised the question of "whether irreparable damage has been done to the ability of our federal regulatory system to ensure the safety of foods and supplements and to balance public health interests against the economic interests of corporations," she writes. But her own reporting suggests that the barbarians are already inside the gates and forcing their wares on the gullible.
Nestle sees no magic bullet to retrieve the situation. She wants "some federal system to guarantee that all those products on the shelves are safe and effective," and she asks, "Shouldn't there be some regulatory framework to control patently absurd or misleading claims?" To answer that in the affirmative is not necessarily the same as agreeing that the FDA is the best agency, certainly in its present form, nor that the AMA and similar organizations are in the corner of good science. The FDA's record does not inspire confidence, which is one of the reasons the herbalists' revolt was so successful in Congress. Its arrogance often matches its ignorance. While reading this book I went to a small British-owned cholesterol shop in Manhattan (pork pies, etc.). Its owner can't import kippers because the FDA does not recognize them as food. His first shipment of a brand of British Band-Aids was held on suspicion of being a soup, and when that confusion was finally cleared up, the FDA demanded of him a medical-goods import license.
I would like to hear more about how the FDA could be made more responsive and more efficient. It seems that in their present form, the regulatory bodies need some means of democratic oversight to check bureaucracy and to weigh problems of undue influence from the producing industries. Nestle details problems we've come to see elsewhere: the revolving door between civil servants, Congressional staff and industry. She also suggests rules--"a higher and stronger 'firewall'" between regulatory agencies and industry to inhibit the easy career glide from poaching to gamekeeping and back again--and she is entirely correct that the last bodies that should be overseeing FDA funding are the Congressional agriculture committees, which are dedicated to the prosperity of agribusiness.
Otherwise, Nestle's wish list ranges from sensible to Mission Impossible: tighter labeling rules so people can see exactly what they are consuming. A ban on advertising of junk foods in schools, especially candies and soft drinks with high sugar content. Sumptuary taxes on soft drinks as well--sure to be opposed bitterly by the lobbyists. If alcohol and tobacco advertisements cannot be allowed on children's TV, why allow advertising of foods that promote obesity and future health ills on a par with them?
At first glance, Nestle's call for an ethical standard for food choices for nutritionists and the industry seems highly idealistic; but ten years ago, who would have foreseen Philip Morris's berating of state governments for not spending their tobacco settlement money on the pledged anti-child-smoking campaigns? Already, more and more scientific journals are demanding disclosure of conflicts of interest for papers submitted.
Nestle does not touch the subject directly, but who knows, maybe campaign finance reform really will cut indirectly the pork in the political diet and the crap in the school lunches. However, it will be a hard push. Educating the public is a start, and Food Politics is an excellent introduction to how decisions are made in Washington--and their effects on consumers. Let's hope people take more notice of it than they do of the dietary guidelines.
When my momma called, "Why are they letting them gouge us like this?" she wanted to know. "They" are our so-called political leaders in Washington, and "them" are the drugmakers now costing her $500 a month. Nearing 87, Lillie Mabel Hightower has to take two medicines regularly, including a heart pill to keep the old ticker ticking. She tells me her pill bill goes up just about every time she refills her two prescriptions, having soared 40 percent in only two years. For someone on Social Security, the difference between $3,600 a year and $6,000 a year is a serious piece of change. "Of course I know why," she quickly added in answer to her own question: "It's the big money they give the politicians. But can't we do something? Who do I write?"
Like my mom's, the blood pressure of millions of seniors and others has reached the political boiling point because of price-gouging by big drug companies. Americans pay the highest prices in the world for prescriptions--an average of 30 percent more, for example, than Canadians pay for the exact same drugs. The companies jacked up our prices by another 17.1 percent last year while they went laughing to the bank with the highest profit margins of any industry, more than triple the average of all Fortune 500 corporations.
Political consultants in Washington recognize the explosiveness of this issue, so there has been a flurry of bills, press conferences and photo-ops by both parties, with each claiming that it cares more than the other about the problem. But, as Hemingway once advised, never mistake motion for action. No lobbying group is as well financed and well connected as the drug industry is in our capital city. It has 625 registered lobbyists on its payroll--ninety more lobbyists than there are members of Congress! The industry also liberally greases the skids of the legislative process with huge campaign donations, topping $26 million in the last election cycle. The result is that Washington postures, drug prices keep going up and seniors continue to seethe.
Still, my mother asks, "Can't we do something?" Yes.
Look to the states where citizens' groups have teamed up with legislative leaders who not only are in motion but have taken action. While Washington fiddles and faddles, twenty-six states now have some sort of program to cut drug costs, at least for the low-income elderly, and several are leading the way toward programs to take the gouge out of prescription prices for everyone.
Chellie Pingree led the charge in Maine. A small businesswoman, she was elected to the State Senate, where she took up the cause of seniors being pounded by drug prices so high that some were forced to choose between paying for essential medications and paying the heating bill. With the leadership of grassroots groups like the Maine People's Alliance, Consumers for Affordable Health Care and the Maine State Council of Senior Citizens, she led busloads of seniors on well-publicized trips across the Canadian border to buy their medicines; on just one trip, twenty-five seniors got prescriptions filled for $16,000 less than in the United States. Why, she asked, should people have to take a six-hour bus ride to get fair prices? She answered by sponsoring the Fairer Prescription Drug Prices Act, which empowered a state pricing board to set retail prices in Maine.
Pingree's bill allowed seniors to go into any pharmacy in Maine and get the prescriptions they need at the same discounted price that Canada's government negotiates with drugmakers for its citizens. Her bill was simple, comprehensive, nonbureaucratic, effective...and it drove the big drugmakers bonkers. They dispatched their own buses to Augusta, loaded with lobbyists and money, in a frantic effort to kill the bill. In a blitz of TV and newspaper ads, the industry labeled the bill "a crazy idea" that would force the drug industry to abandon Maine. But the grassroots groups went to work, and Pingree, by now the Senate majority leader, took her bill directly to the people, holding public meetings from Madawaska to Biddeford. On April 12, 2000, her bill passed in both houses of the legislature by veto-proof margins, and the governor signed it.
Of course, the industry immediately hitched up a twenty-mule team of lawyers and rushed to federal court, but Maine's price-control law has been upheld all the way through the appeals court level and now awaits judgment at the US Supreme Court. Meanwhile, twenty-three other states, from Arizona to Wisconsin, are considering Maine's fair-pricing law, and public pressure from people like my momma is turning up the heat for national legislation.
Pingree, who is now running for the US Senate, has put the issue at the center of her campaign, vowing to bring the populist coalition behind the Maine Solution into play nationally. "There's such a disconnect between Washington and people's reality," she says. "This is more than an issue to people, it's personal outrage. Walk into any room, and it doesn't matter if the people are in overalls or suits; they've all got a story."
USAction, a network of state and local citizens' coalitions, has been a leader in developing the state proposals, and it's now working with other groups to move this public grievance from the low, slow backburner of Congress to the forefront of the progressive agenda (202-624-1730 or www.usaction.org). "This is a case where the people are miles ahead of the politicians," points out USAction's executive director, Jeff Blum, urging that progressives in Congress put forth a "fair pricing" plan that would give every senior the lowest price available on every drug. The key is not merely to provide universal coverage but to connect this to effective controls over the industry's ripoff prices. US citizens should pay no more than the average price that the drugmakers charge foreign customers in Canada, Japan, Italy and elsewhere.
If Congressional Democrats have a strategic bone left in their bodies, they'll grab this proposal and run with it, for the Lillie Mabel Hightowers are desperately looking for someone who'll stand with them against the drug profiteers. As pollster Celinda Lake reports, "This is the most powerful and intense issue of the 2002 elections, and Democrats should take the lead." If the party won't even stand up for our mommas, who'll stand up for the party?
Odds are good that on a plane or boat or bus somewhere in the world sits a refugee headed for the United States carrying the seeds of a weapon of mass destruction. The agent he unwittingly carries is insidious and lethal but slow acting, so the deaths it causes can come months or even years after it is disseminated in the population. It has the potential to overwhelm, to kill thousands, and there may be no vaccine, antidote or cure.
What is this ominous threat? An ingenious new biological weapon? No, it is a very old nemesis of humankind--tuberculosis, an infectious disease that kills more than 2 million people every year. Because the majority of them are poor and outside our borders, we don't hear much about them, but that may soon change. Tuberculosis is making a comeback and is overcoming the treatments that have kept this killer at bay in the developed world. Multidrug-resistant tuberculosis--a death sentence in most developing countries--is becoming more common and is incurable in about half the cases, even in the United States.
Two numbers in the President's budget proposal stand in stark contrast to each other: $6 billion to fight bioterrorism versus $200 million to the Global Fund to Fight AIDS, Tuberculosis and Malaria. That works out to a little more than $1 billion per US anthrax death last year, as compared with $33 per global victim of the more common infectious scourges in 2001.
Re-emerging infectious diseases like malaria, HIV and tuberculosis continue their inexorable march, devastating poor countries in Africa, Asia and South America and ultimately threatening the richest countries. Either we are all protected or we are all at risk. It would be better to recognize that the developed world's inaction and callousness have allowed epidemics to flourish in the fertile soil of poverty, malnutrition and poor living conditions made worse by wars, internal displacements, repressive regimes, refugee crises, economic sanctions and huge debt payments that require poor countries to cut public services.
If we can possibly be unmoved by the staggering numbers affected by the AIDS pandemic alone (3 million dead in 2001, more than 40 million people living with HIV, 28 million of them in Africa), then perhaps we will be moved by fear. Of the world's 6 billion inhabitants, 2 billion are infected with latent tuberculosis. With adequate treatment, TB is 90 percent curable, yet only a fraction of those with the disease have access to this simple technology: a course of medications costing only $10 to $20 per patient. As a result, tuberculosis is now the world's second leading infectious killer after AIDS. Resistant tuberculosis, the result of inadequate treatment, is spreading at alarming rates in poor countries and in urban centers of rich countries. According to the World Health Organization, an eight-hour plane flight with an infected person is enough to risk getting TB.
Of all the rich countries, the United States has its head buried most deeply in the sand. It is the stingiest, spending only one cent on foreign aid for every $10 of GNP. Only $1 in $20 of the aid budget goes to health. A recent WHO report estimated that spending by industrialized countries of just 0.1 percent more of their GNPs on health aid would save 8 million lives, realize up to $500 billion a year via the economic benefits of improved health and help those in poor countries escape illness and poverty.
George W. Bush's recent pledge to increase foreign aid comes too late and with too many strings attached. The proposal doesn't start until 2004, making it largely hypothetical. The current budget keeps spending flat at about $11.6 billion. Even with the promised increase, US spending on foreign aid as a proportion of GNP will still pale in comparison with that of other developed nations. And along with this carrot comes a big stick: Only countries that continue to let corporations raid their economies through detrimental free-trade policies will be eligible.
Bioterrorism is a danger that should be taken seriously, but the current counterterrorism frenzy threatens to militarize the public health system, draining resources away from the research, surveillance systems and treatments needed for existing health problems. Already the political war profiteers have criticized the CDC's funding priorities, using the terrorist threat as cover in an attempt to advance their reactionary agenda. In a letter this past November to Health and Human Services Secretary Tommy Thompson, Republican Representatives Joseph Pitts, John Shadegg and Christopher Smith criticized the CDC for "inappropriate" actions. The Congressmen wrote that "we have grown increasingly concerned about some of the activities that the CDC is funding and promoting--activities that are highly controversial in nature, and funding that could be better used for our War on Terrorism." They specifically objected to AIDS prevention programs targeted at gay men and to a CDC website link to a conference sponsored by organizations promoting reproductive health, including abortions, for women. Bush was happy to oblige by cutting $340 million from the CDC's nonbioterrorism budget.
Just as a missile shield will not protect us from crazed men with box cutters, so mass vaccination campaigns and huge stockpiles of antibiotics will not keep us healthy in an increasingly unhealthy world. September 11 should make us more aware than ever of our shared vulnerability. Making the world safer and healthier means prevention and early treatment of disease, inside and outside our borders. It means building a healthcare system designed to keep people healthy instead of spending billions on bogymen while the real killers are on our doorstep.
When Bill Clinton signed the welfare overhaul in 1996, he and his supporters promised that its problems could be fixed later. One problem at the top of the list was the bill's savaging of the food stamp program, including sharp financial cuts and the removal of legal immigrants from its rolls. It wasn't fixed.
Six years and lots of empty plates later, there's a chance to make a considerable improvement--if the senators who see the need to fix it hold out. The 2002 farm bill, with a price tag of $75 billion, has passed both houses and is now in conference committee. The Senate version adds $8.9 billion to the nutrition budget, mostly for food stamps, and requalifies most legal immigrants, including all children and the disabled--which would add an estimated 400,000 people. In determining general eligibility, it also takes a more realistic view of what poor people have to spend for shelter and to keep a car running. The House adds only a bit more than a third of that and, in the spirit of both the House leadership and the 1996 welfare bill, doesn't fix much.
The Senate pays for its nutrition increases by limiting the farm subsidies that can be paid to individual and corporate farmers. The House--and some senators--would rather keep the money flowing in the same old streams. "The problem is that the House doesn't like the payment limits," says Andy Fisher, spokesman for Senator Richard Lugar of Indiana. "You would think that members would be ashamed to take that position, but they're not."
It's one of the quirks of Washington that the major federal nutrition programs are part of the farm bill, written by legislators generally more interested in peanut price supports than peanut butter sandwiches. This year the bill was complicated by the large number of vulnerable Democratic senators from farm states--including Tim Johnson of South Dakota, Jean Carnahan of Missouri, Tom Harkin of Iowa and Paul Wellstone of Minnesota--and Democratic nervousness about the effect of subsidy limits on their chances this fall. But, insists Wellstone, "There's no reason why we can't get it right on both family farms and nutrition."
The issue comes up as the need for food help is surging, spiked by the unemployment jump of the past year--especially in areas like Florida and Las Vegas, where the drop in airline traffic belted low-wage (and heavily immigrant) tourism workers. America's Second Harvest, the national alliance of food banks, issued a call to action in February to raise 365 million pounds of food. "When legal immigrants lost eligibility," says Doug O'Brien of Second Harvest, "it just shifted responsibility from the federal government to food banks." With the small difference that it's a lot harder for food banks to pay for it.
In January the Bush Administration--driven by the realities of hunger in Texas and by GOP interest in the Hispanic vote--came out for sharply relaxing the legal immigrant exclusion, giving the idea momentum. The National Governors Association, seeing hunger from closer up than Congress does, backs the Senate bill.
But in early March the Senate plan was hurt by the discovery that its bill would cost $6 billion more than expected and more than the budget allowed. Still, the senators on the conference committee include longtime nutrition advocates like Lugar and Patrick Leahy, and a high-ranking Senate staff member insists that the Senate side has so far refused--even after the revelation of its faulty accounting--to put nutrition cuts on the table. "I'm very hopeful," says Representative Eva Clayton, the first black woman to represent North Carolina and described by O'Brien as a "heroine" on hunger issues. "The House has not been very strong or aggressive on the issue of nutrition. We really need a little more pressure on us."
Senators--and all those who think it's a good idea to feed more hungry Americans--should turn up the heat on the nutrition conference committee.
It's the largest profession in healthcare. It's the largest female profession in America. But despite its tremendous importance and impact, most people know very little about contemporary nursing. Public ignorance of the present-day profession, however, pales in comparison with ignorance of nursing's history. How many of us know that the development of nursing as the first secular profession for respectable women was a major feminist achievement? Or that Florence Nightingale was not, in fact, the "founder" of modern nursing? Or that nurses played a key role in developing the American hospital system, as nursing historian Sioban Nelson has documented in her recent book Say Little, Do Much? How many of us know about the role of nursing in the development of public health and care of the chronically ill and poor? Most important, how many of us recognize that society's persistent devaluation of nursing--reflected today in the prejudices of many newly liberated female physicians, health policy experts and journalists--is a legacy of longstanding, socially enforced subordination to medicine?
Katrin Schultheiss, an assistant professor of history and women's studies at the University of Illinois, Chicago, is one of a handful of non-nurses who understand what the profession has to teach us about the complex process of female emancipation, as well as about the development of modern healthcare systems. She recounts the tortuous history of how the "professionalization" of nursing in France coincided with anticlericalism and the secularization of the field. Although her story focuses on the forty-year period from 1880 to 1922 and takes place in one country, the gender dilemmas Schultheiss explores have hampered nurses' ability to care for patients in healthcare systems around the globe, including in the United States.
Her tale begins with the advent of France's Third Republic and follows political reformers who attacked clerical authority as they tried to modernize the healthcare system. Until that time, nursing outside the home was typically provided by convent-trained nuns. Modern hospital reformers recognized that nursing required more nurses with more systematic education, but therein lay the problem. Since knowledge is power, the acquisition of knowledge was inevitably a challenge to authority.
Physicians, as men, did not welcome women on their terrain. As members of a developing profession--one that did not then command the prestige it enjoys today--doctors were also adamant about defending their field "from irregular or illegal practitioners."
Even doctors who recognized the need for a more educated nursing work force and who wanted to laicize the care of the sick would not countenance the education of nurses if, in the process, nurses attained the kind of knowledge and stature that would allow them to demand greater authority and autonomy in both the workplace and society. So even lay nursing had to be constructed in altruistic terms that stressed not nurses' knowledge but their virtue. As Schultheiss writes, "As long as nursing was clearly understood to be a custodial, maternal, or charitable occupation, and as long as nurses were regarded as the social, economic, and educational peers of the patients, rather than the doctors, there would be no ambiguity about who held medical authority within the hospital."
In Paris, nursing nuns, while obedient and devoted, presented a problem to medical reformers. "The very existence of an autonomous community of women called into question the hierarchy of power within municipal institutions," Schultheiss notes. Happily, secular authorities found lay nurses, as one reformer commented, to be "infinitely more subordinate than the religious nurses and more scrupulous in the strict execution of doctors' orders."
While anticlerical reformers touted the benefit of lay nurses, the French public was attached to the nuns who had provided what out-of-home nursing care had existed since the seventeenth century, and even before. Of course, Schultheiss points out, even support for religious nurses was cast in gendered terms. Proponents of the nuns insisted that nursing should be left to a special group of religious women because it would corrupt lay women for their real work--which was mothering. "A woman is either a bad mother or a bad nurse," was their motto. To convince the public to support secularization, reformers had to "feminize nursing--to turn nursing into a general feminine virtue that all women could possess."
Schultheiss's story also introduces us to a peculiar hybrid form of religious nurse--the "hospitalières" of the Hospices Civils of Lyons. These women were secular nuns, congregationist sisters "who undertook a lifelong commitment to serve the sick and poor under harsh physical conditions and with virtually no monetary compensation, but who remained under the direct authority of the secular administration." According to Schultheiss, laicizers supported them because they were easily controllable and because their sense of devotion was easily manipulated by civil administrators who didn't want to pay the real cost of nursing care.
In this section of the book, class also enters the story: If civil administrators were to get nursing care for little or nothing, women's educational standards--and thus their salaries--had to be low. Whether they were secularizers or not, reformers recognized that more highly educated women of a better class would eventually demand more pay, and more say.
Finally, Schultheiss takes us to Bordeaux, where we meet Anna Hamilton, a reformer and devotee of Florence Nightingale. With connections to the international nursing reform movement, Hamilton wanted to open a nursing school that would produce a "new nurse," recruited from the so-called better classes. This new nurse, she insisted, would deliver better patient care than nursing nuns. Hamilton's critique of the nuns, Schultheiss explains, was not based on anticlericalism. Rather, Hamilton argued that the nuns had "distanced themselves from direct patient care" while creating obstacles to the creation of "a single medical hierarchy grounded on universal principles of hygiene and scientific health care."
Hamilton was able to gain support for her project from Paul-Louis Lande, a physician who became mayor of Bordeaux, because she firmly linked the "professionalization and feminization of nursing." Doctors in Bordeaux, Schultheiss writes, recognized "the need for improving the training of hospital nurses, but rejected all aspects of reform that expanded the nurses' autonomy or authority beyond the narrowest limits."
Hamilton accepted these limits, asserting that "it is extremely ridiculous for a nurse who possesses neither the knowledge nor the rights nor the sex of the doctor to try to imitate his way of interacting with the patient and to try to use his language." Thus, in France, as in England and the United States, the nurse-doctor game began with the acceptance of the notion that nurses could not--or should not--possess medical knowledge and that they therefore could not--and should not--use medical language.
Schultheiss ends her story after the First World War. The war produced such a huge need for nurses that the debate over the virtues of lay versus religious nurses effectively ended. When more than 100,000 nurses culled from every social class enlisted to serve "la Patrie," this "demonstrated that women's special aptitudes could be attached fruitfully to the state." However, even during this period and afterward, nursing was valued not for its knowledge but for its virtue. It had become, the author concludes, "a twentieth-century version of republican motherhood."
French nursing carries that legacy to this day. Last year, when I was strolling down the Boulevard Saint-Germain in Paris, a book displayed in the window of a children's bookstore caught my eye. It was called Je Sais Qui Me Soigne ("I Know Who Takes Care of Me") and is part of a series on citizenship and the professions. Nurses make a brief appearance in the book--as doctors' servants who have, as the text reads, "just enough schooling to follow doctors' orders."
For nurses struggling to put their education to use for patients, rather than for physicians, the ability to escape, at least temporarily, medical domination has always made home care attractive. Which brings us to Karen Buhler-Wilkerson's part of the story. In No Place Like Home, Buhler-Wilkerson, a professor of community health and director of the Center for the Study of the History of Nursing at the University of Pennsylvania School of Nursing, traces the development of home care from the opening of the first US home-care agency--the Ladies Benevolent Society, founded in 1813 in Charleston, South Carolina--through the present.
In Charleston, as elsewhere, respectable society ladies started home-care agencies because they felt "obligated to improve the conditions of and provide for the comfort of the poor," who were, in turn, "expected to manifest their gratitude to the rich," who established these agencies. But they did not deliver the care. Nurses did.
No Place Like Home does a great service to these ordinary nurses who are often dismissed as know-nothings by some nursing elites today. Buhler-Wilkerson details the complexity of caring for victims of tuberculosis or managing patients during typhoid epidemics. She also documents the persistence of the issues with which home-care agencies still struggle today: how to navigate doctor-nurse relationships; how to choose appropriate patients for home-care services; how to deal with gender, race and class prejudice; and how to secure long-term services for the chronically ill.
From the early days of home care, doctors were concerned about nurses invading their territory. In Boston, for example, doctors "confided to lady managers that 'the constant danger with trained nurses is that they shall usurp the doctors' position and prescribe for patients.'"
At the turn of the twentieth century, with the founding of the Henry Street Settlement on the Lower East Side of Manhattan, Lillian Wald and her colleagues developed public health nursing--"to improve standards of living" of the poor. One of the great innovations of the Henry Street Settlement was the establishment of a "First Aid Room." This was a kind of community clinic where immigrants could gain easy access to nursing care for routine health problems. Doctors, however, soon complained that "nurses were carrying ointments and even giving pills outside the strict control of physicians." Even outspoken nurses like Wald's colleague, socialist Lavinia Dock, feared a confrontation with powerful physicians. By 1911 questionable cases were no longer treated in the First Aid Room. "Later publications," Buhler-Wilkerson writes, assured the public that "the real Henry Street Settlement nurse will make the doctor feel that she is exerting every effort to have his treatment, not hers, intelligently followed."
An equally fascinating subject tackled by Buhler-Wilkerson is the impact of racial prejudice on nurse-patient and nurse-doctor relationships. In both the North and the South, lady managers as well as nurses fretted about whether it was appropriate for white nurses to care for black patients or black nurses for white patients. When insurers, notably Metropolitan Life, entered the field at the turn of the last century, managers considered the same imponderables. Race invariably trumped the needs of care and even of doctor domination of the nurse-physician relationship. For example, Buhler-Wilkerson tells us that the respectable ladies of Richmond, Virginia, who ran home care in that city, decided it was "'eminently' satisfactory for white nurses to care for black patients on the 'same footing' as white patients--but drew the line at white nurses 'taking orders from colored physicians.'"
The advent of health insurance also had a critical impact on the home-care agencies. Wald convinced Metropolitan Life to cover home-care services in 1909. Met Life wanted to reduce the high mortality rate of black life insurance subscribers--thus delaying payments on their life insurance policies. Home-care nursing's preventive approach initially seemed to make good business sense. By the 1950s, public health nursing and medical advances had paid off: Fewer people were dying of infectious diseases, and more acute illnesses were treated in the hospital. This meant that the bulk of home-care patients were chronically ill. To reward public health nursing for its success, Met Life curtailed its home-care program. "Providing care for those who failed to recover quickly was, from an insurance perspective, a poor investment," Buhler-Wilkerson states bluntly.
Since the fate of nursing is tied to the fate of the patients nurses serve, the situation has not improved much, as first Medicare and Medicaid and now managed care have "rediscovered" home care. Indeed, today the promise of the home as a place where nurses and their patients can escape the negative consequences of medical paternalism and give or receive higher-quality care has remained largely unfulfilled.
In Devices & Desires, Margarete Sandelowski uses a different lens--the world of medical technology--to explore the issue of gender and nursing. This brilliant book shows just how much the "charitable, devotional and altruistic" image of the nurse conceals. From the discovery of the thermometer to the development of intensive heart and fetal monitoring, Sandelowski documents doctors' dependence on nurses for their reputation for scientific and technical mastery. As Sandelowski shows, nurses have been critical in administering medical technology, monitoring the information it provides and interpreting that information to physicians, not to mention "educating patients about new devices, getting patients to accept and comply with their use, and alleviating patients' fears about them."
An eye-opening segment describes the use of the first thermometer, rather than the hand, as a diagnostic tool in the mid-nineteenth century. In it, we learn that the thermometer we take for granted today was originally an unwieldy, dangerous instrument that had to be carefully manipulated so as not to injure the patient. Because diagnosis and treatment involved taking the patient's temperature numerous times a day, busy physicians assigned the task to nurses. This involved, however, far more than simply recording data. The nurse, Sandelowski writes, "had to know what caused various temperatures to occur and the nursing measures that would lower or raise temperature to normal levels."
While physicians were the ones to insert the first unwieldy and equally dangerous intravenous devices, nurses were the ones to make sure the patient's arm remained immobile and that the patient could tolerate the discomfort of IV therapy. Nurses are the ones who developed the intensive-care unit--to provide intensive nursing care--and who track and interpret data from fetal monitors. As the primary users of much medical machinery, nurses are often more knowledgeable about equipment than doctors. Indeed, "the benefits of machine monitoring could not be fully harnessed without nurses who understood and could act immediately on the information monitors generated." While the public does not recognize this fact, the author tells us that medical equipment manufacturers certainly do. This is why nurses continually work with physicians and manufacturers to create design improvements and to insure that "expensive machinery [is] fully utilized."
What is amazing about this story is how little nurses have benefited from their technological mastery. Sandelowski shrewdly diagnoses a classic Catch-22. While it is true that nurses' status is somewhat enhanced by their technical proficiency, the recognition they receive does not match their actual accomplishments. That's because physicians quickly label the technical activities nurses engage in as "simple enough" for a nurse to perform.
No matter how much nurses participate in the diagnostic process, of course, physicians have maintained a legal and linguistic stranglehold on "medical" diagnosis. Even as "physicians were increasingly expecting them to perform de facto acts of diagnosis," Sandelowski writes, "nurses were in the bizarre position of having to be mindful of symptoms without speaking their mind about them."
Nurses were supposed to be able to distinguish between normal and abnormal conditions and to look for reasons for any abnormal findings. But nurses were never to use the words "normal" or "abnormal" in reporting or recording patient conditions, and they were to refrain from offering their opinions on etiology or diagnosis.... Nurses were to say (report and record) only what they saw, unlike physicians, who maintained the right to say what they knew.
This has produced the peculiar phenomenon--even today--of the nurse who recognizes that a cancer patient has diarrhea or a mentally ill patient is hallucinating, but who is not allowed to use the actual medical word because that would suggest that she, or he, is making a "medical diagnosis."
As she describes these phenomena, Sandelowski never paints nurses as innocent victims of nasty, overbearing physicians. In their perennial attempt to find "a socially valued place and distinctive identity," Sandelowski argues, many members of the profession have, albeit unwittingly, adopted common gender stereotypes that perpetuate the oppression of nurses.
One segment of the profession, Sandelowski contends, has bought into the notion that the complex practical, technical work that ordinary nurses perform is indeed simple and know-nothing.
Typically conceived of as nothing more than the physician's hand, and persistently caught in the Western cultural dichotomy between merely manual and highly prized mental, or intellectual, work, nurses have struggled to show that nursing is largely brain work. In the process, however, they have inadvertently complied with the prevailing cultural practice of denigrating the very "body-knowledge" that is the forte of the nurse.
This is particularly evident in the nurse-practitioner movement, which so many elite nurses now promote. "The key factor differentiating nurse practitioners from other nurses," she writes, "is both the use of medical instruments and the use of instruments in ways previously denied nurses." But, she points out, in our bottom-line-driven healthcare system "the role emerges as largely economically and 'medically-driven'.... The traditional image is maintained of nurses as the extra hands and eyes of physicians willingly and cheaply filling voids and bridging gaps in health care."
Other segments of the profession, Sandelowski argues, have opposed nurses' emotional and social work to their technological activities, arguing that technology is somehow an inauthentic nursing activity, while "caring" is both authentic and an essential "antidote to technology." Sandelowski shrewdly insists that in opposing "nursing/touch and technology," the profession has been "identified both with and against technology and thus, in an ironic way, with and against itself."
While it is not the purpose of these books to offer advice about dealing with the many problems nursing confronts, they implicitly point to one incontrovertible solution: We can appreciate what nurses do in the present only if we understand how their work has been constructed in the past and what they have contributed--and can contribute--to our healthcare system.
Understanding and analyzing nursing's decades-long struggle for "a socially valued place and distinctive identity" is not an academic exercise. It is central to reversing the chronic underfunding of the nursing services most of us will eventually depend on in hospitals and other healthcare institutions, and also the undereducation of the nursing work force at almost all levels of practice. And it is critical to any solution to the severe nursing shortage, which, if not quickly and effectively addressed, will have disastrous consequences as the population grows older and sicker.
In the next few weeks the Senate will hold hearings and vote on legislation that would outlaw the cloning of human embryos, whether for medical experimentation or for the birth of a human being. The House already passed a similar bill in July. Until now the cloning debate has been viewed in Washington and the media as a classic struggle pitting social conservatives, antiabortion activists and the Catholic Church against the scientific community and progressive forces, with Republicans lined up on one side and Democrats on the other. Below the surface, however, another reality is beginning to take shape. Although reluctant to acknowledge it, some social conservatives and some left activists find common ground on the cloning issue [see Ralph Brave, "Governing the Genome," December 10, 2001]. An example of this convergence is a statement issued by sixty-seven prominent left progressives on January 23 supporting legislation to outlaw the cloning of human embryos.
The progressives backing this legislation worry that the market for human eggs that would be created by such research will provide unethical incentives for women to undergo health-threatening hormone treatment and surgery. They are also concerned about the increasing bioindustrialization of life by the scientific community and life science companies, and are shocked that clonal human embryos have been patented and declared to be human inventions. On the other hand, few, if any, on the left oppose research on adult stem cells, which can be taken from people after birth and which have proved promising in animal studies and clinical trials. This approach poses none of the ethical, social or economic risks of strategies using embryonic stem cells.
What about cloning a human being? Most members of Congress on both sides of the aisle would oppose a clonal birth. But for many in Congress, and in the scientific community and the biotech industry as well, opposition is solely based on the fact that the cloning technique is still unsafe and could pose a risk of producing a malformed baby. Far fewer members of either party would be against cloning a human baby were the procedure to become safe and reliable. After all, argue proponents, if an infertile couple desires to pass on their genetic inheritance by producing clones of one or both partners, shouldn't they be able to exercise their right of choice in the newly emerging biotech marketplace? Moreover, we are told not to be overly concerned, because even though the clone will have the exact same genetic makeup as the original, it will develop differently because its social and environmental context will be different from that of the donor.
What unites social conservatives and progressives on cloning issues is their commitment to the intrinsic value of life and their opposition to what they perceive to be a purely utilitarian perspective on biotech issues. To be sure, the social conservatives and left activists differ in the "life issues" they embrace and champion. The former crusade for what they regard as the rights of the unborn and family values and rail against infanticide, euthanasia and pornography. The latter speak out on behalf of the poor, women, abused children, fellow animals and the global environment. Both groups come together in opposition to cloning--but for different reasons.
Many on the left argue that with cloning the new progeny become the ultimate shopping experience--designed in advance, produced to specification and purchased in the biological marketplace. Cloning is, first and foremost, an act of production, not creation. Using the new biotechnologies, a living being is produced with the same degree of engineering as we have come to expect on an assembly line. For the first time in the history of our species, we can dictate, in advance, the final genetic constitution of the offspring. The child is no longer a unique creation--one of a kind--but rather an engineered reproduction.
The left also warns that cloning opens the way to a commercial eugenics civilization. Already life science companies have leaped ahead of the political game being played out in Congress and the media by patenting human embryos and stem cells, giving them upfront ownership and control of a new form of reproductive commerce, with frightening implications for the future of society. Many on the left worry that human cloning, embryonic stem cell research and, soon, designer babies, lay the groundwork for a new form of biocolonialism, in which global life science companies become the ultimate arbiters of the evolutionary process itself.
Neither the social conservatives nor the left activists are entirely comfortable with the new alliance, and they will continue to disagree in many areas. But on biotech issues both of these groups will increasingly break ranks with their traditional political affiliations--the social conservatives with market libertarians and the left activists with social democratic parties.
The biotech era will bring with it a very different constellation of political visions and social forces just as the industrial era did. The current debate over cloning human embryos and stem cell research is already loosening the old alliances and categories. It is just the beginning of the new biopolitics.