Is human cloning a feminist issue? Two
cloning bans are currently winding their way through Congress: In the
Senate, the Human Cloning Prohibition Act seeks to ban all cloning of
human cells, while a House version leaves a window open for cloning
stem cells but bans attempts to create a cloned human being. Since
both bills are the brainchildren of antichoice Republican yahoos, who
have done nothing for women's health or rights in their entire lives,
I was surprised to get an e-mail inviting me to sign a petition
supporting the total ban, organized by feminist heroine Judy
Norsigian of the Boston Women's Health Book Collective (the producers
of Our Bodies, Ourselves) and signed by Ruth Hubbard, Barbara
Seaman, Naomi Klein and many others (you can find it at
www.ourbodiesourselves.org/clone3.htm). Are feminists so worried about "creating a
duplicate human" that they would ban potentially useful medical
research? Isn't that the mirror image of antichoice attempts to block
research using stem cells from embryos created during in vitro
fertilization?
My antennae go up when people start talking about
threats to "human individuality and dignity"--that's a harrumph, not
an argument. The petition raises one real ethical issue, however,
that hasn't gotten much attention but by itself justifies a ban on
trying to clone a person: The necessary experimentation--implanting
clonal embryos in surrogate mothers until one survives till
birth--would involve serious medical risks for the women and lots of
severely defective babies. Dolly, the cloned Scottish sheep, was the
outcome of a process that included hundreds of monstrous discards,
and Dolly herself has encountered developmental problems. That's good
reason to go slow on human research--especially when you consider
that the people pushing it most aggressively are the Raelians, the
UFO-worshiping cult of technogeeks who have enlisted the services of
Panayiotis Zavos, a self-described "cowboy" of assisted reproduction
who has been fired from two academic jobs for financial and other
improprieties.
Experimental ethics aside, though, I have a hard
time taking cloning seriously as a threat to women or anyone
else--the scenarios are so nutty. Jean Bethke Elshtain, who took a
break from bashing gay marriage to testify last month before Congress
against cloning, wrote a piece in The New Republic in 1997 in
which she seemed to think cloning an adult cell would produce another
adult--a carbon of yourself that could be kept for spare parts, or
maybe a small army of Mozart xeroxes, all wearing knee breeches and
playing the Marriage of Figaro. Actually, Mozart's clone would
be less like him than identical twins are like each other: He would
have different mitochondrial DNA and a different prenatal
environment, not to mention a childhood in twenty-first-century
America with the Smith family rather than in eighteenth-century
Austria under the thumb of the redoubtable Leopold Mozart. The clone
might be musical, or he might be a billiard-playing lounge lizard,
but he couldn't compose Figaro. Someone already did.
People thinking about cloning tend to imagine Brave New
World dystopias in which genetic engineering reinforces
inequality. But why, for example, would a corporation go to the
trouble of cloning cheap labor? We have Mexico and Central America
right next door! As for cloning geniuses to create superbabies, good
luck. The last thing most Americans want is kids smarter than they
are, rolling their eyeballs every time Dad starts in on the gays and
slouching off to their rooms to I-M other genius kids in Sanskrit.
Over nine years, only 229 babies were born to women using the sperm
bank stocked with Nobel Prize winners' semen--a tiny fraction, I'll
bet, of those conceived in motel rooms with reproductive assistance
from Dr. Jack Daniel's.
Similarly, cloning raises fears of
do-it-yourself eugenics--designer babies "enhanced" through gene
manipulation. It's hard to see that catching on, either. Half of all
pregnancies are unintended in this country. People could be planning
for "perfect" babies today--preparing for conception by giving up
cigarettes and alcohol and unhealthy foods, reading Stendhal to their
fetuses in French. Only a handful of yuppie control freaks actually
do this, the same ones who obsess about getting their child into a
nursery school that leads straight to Harvard. Those people are
already the "genetic elite"--white, with lots of family money. What
do they need genetic enhancement for? They think they're perfect.
Advocates of genetic tinkering make a lot of assumptions
that opponents tacitly accept: for instance, that intelligence,
talent and other qualities are genetic, and in a simple way. Gays,
for example, worry that discovery of a "gay gene" will permit
selective abortion of homosexual fetuses, but it's obvious that
same-sex desire is more complicated than a single gene. Think of
Ancient Greece, or Smith College. Even if genetic enhancement isn't
the pipe dream I suspect it is, feminists should be the first to
understand how socially mediated supposedly inborn qualities
are--after all, women are always being told anatomy is their destiny.
There's a strain of feminism that comes out of the
women's health movement of the seventies that is deeply suspicious of
reproductive technology. In this view, prenatal testing, in vitro
fertilization and other innovations commodify women's bodies, are
subtly coercive and increase women's anxieties, while moving us
steadily away from experiencing pregnancy and childbirth as normal,
natural processes. There's some truth to that, but what about the side
of feminism that wants to open up new possibilities for women?
Reproductive technology lets women have children, and healthy
children, later; have kids with lesbian partners; have kids despite
disabilities and illness. Cloning sounds a little weird, but so did
in vitro in 1978, when Louise Brown became the first "test tube
baby." Of course, these technologies have evolved in the context of
for-profit medicine; of course they represent skewed priorities,
given that 43 million Americans lack health insurance and millions
worldwide die of curable diseases like malaria. Who could argue that
the money and brain power devoted to cloning stem cells could not be
better used on something else? But the same can be said of every
aspect of American life. The enemy isn't the research, it's capitalism.
Memo to editors of campus papers: When the next right-wing ideologue shows up with an ad full of nonsense, just take the money and print it. That way, they will not be able to pose as the victim of "political correctness," they will not get millions of dollars' worth of free publicity and their ideas will not acquire the glamour of the forbidden. By the same token, you will not look afraid of debate and controversy, nor will you have to explain why you rejected their ad while printing something equally false, offensive or stupid on some previous occasion.
Never mind that the people accusing you of censorship practice it themselves: In an amusing riposte to David Horowitz's flamethrower ad opposing reparations for slavery, Salon's David Mazel proved unable to place an enthusiastically pro-abortion ad in papers on conservative campuses; and as Fairness and Accuracy in Reporting points out, the Boston Globe, which editorialized against students who rejected the Horowitz ad, itself rejected an ad criticizing Staples, a major advertiser, for using old-growth forest pulp in its typing paper. So there, and so there! But you're in a better place to make such arguments stick if you can stand--however cynically and self-servingly--on the high ground of free speech yourself.
Just as Horowitz faded, having shot himself in the foot by refusing to pay the Daily Princetonian after it printed his ad but editorialized against it, up comes the soi-disant Independent Women's Forum--you know, that intrepid band of far-right free spirits funded by the ultraconservative Sarah Scaife Foundation--with an ad in the UCLA Daily Bruin and Yale Daily News urging students to "Take Back the Campus!" and "Combat the radical feminist assault on Truth." The IWF charges "campus feminism" with being "a kind of cult" in which "students are inculcated with bizarre conspiracy theories about the 'capitalist patriarchal hegemony,'" a fount of "Ms./Information," "male-bashing and victimology." Brainwashing isn't exactly what comes to mind when I think of the revolution in scholarship that has produced such celebrated historians as Linda Gordon, Ellen DuBois, Joan Scott, Rickie Solinger, Leslie Reagan and Kathy Peiss. The sweeping, paranoiac language gives it away--this is IWF member Christina Hoff Sommers speaking from her perch at that noted institution of higher learning, the American Enterprise Institute.
The bulk of the ad consists of a list of "the ten most common feminist myths" and the "facts" that supposedly prove them false. Much of this is lifted from Sommers's Who Stole Feminism?, a book that attempted to deploy a few gotchas against hyperbolic statistics and questionable studies to deny the significance of violence, sexism and discrimination in women's lives. I mean, how important is it that "rule of thumb" may not derive, as some feminist activists believe and some newspapers have printed, from an old legal rule permitting husbands to beat their wives with a stick no thicker than their thumb (Myth #4)? Feminists did not make this folk etymology up out of nothing--actually, according to Sharon Fenick of the University of Chicago, writing on the Urban Legends website, it probably goes back to the eighteenth century, when the respected English judge Francis Buller earned the nickname "Judge Thumb" for declaring such "correction" permissible. That it was legal for premodern English husbands to beat their wives within limits is not in dispute (in her book, Sommers obscures this fact by omitting the Latin phrases from a passage in Blackstone's Commentaries); nor is the fact that wife-beating, regardless of the law, was, and sometimes still is, treated lightly by the legal system under the rubric of marital privacy. Thus, in 1910 the Supreme Court, in Thompson v. Thompson, barred wives from suing husbands for "injuries to person or property as though they were strangers." (I learned this, and much else relating to the history of American marriage, from Yale feminist historian Nancy Cott's fascinating Public Vows: A History of Marriage and the Nation.)
And what about Myth #2, "Women earn 75 cents for every dollar a man earns." That doesn't come from some man-bashing fabulator squirreled away in a women's studies department. It comes from the US government! The IWF argues that the disparity disappears when you take education, training, occupation, continuity of employment, motherhood and other factors into account--but even if that were true, which it isn't, to overlook all those things is itself advocacy, a politicized way of defining sex discrimination in order to minimize it.
And then there's #1, the mother of all myths: "One in four women in college has been the victim of rape or attempted rape." The IWF debunks this number, which comes from the research of Mary Koss, by citing the low numbers of reported rapes on college campuses, but the one-in-four figure includes off-campus and pre-college rapes and rape attempts. Are Koss's numbers the last word? Of course not. In 1998 the Centers for Disease Control and Prevention found that among all women, one in five had experienced a rape or attempted rape at some point in her life. In January the Justice Department released a report claiming that 3 percent of college women experience rape or attempted rape per school year, which does add up over four years.
Does irresponsible, lax or even slanted use of facts and figures exist in "campus feminism"? Sure--and out of it, too. (Try economics.) But what does that have to do with women's studies, a very large, very lively interdisciplinary field of intellectual inquiry, in which many of the supposed verities of contemporary feminism are hotly contested? The real debate isn't over the merits of this study or that--in social science "results" are always provisional. Now that the IWF has thrown down the gauntlet, feminist scholars should call for that real debate--Resolved: Women's lives were more seriously studied and accurately understood when almost no tenured professors were female. Or, Resolved: Violence against women is not a major social problem. Or, Resolved: If women aren't equal, it's their own darn fault.
They were kidnapped on the street, or summoned to the village square, or lured from home with false promises of work, to be forced into the Japanese military's far-flung, highly organized system of sexual slavery throughout its occupied territories in the 1930s and 1940s. Of some 200,000 so-called comfort women, only a quarter survived their ordeal; many of those died soon after of injuries, disease, madness, suicide. For years the ones who remained were silent, living marginal lives. But beginning in 1991, first one, then a trickle, then hundreds of middle-aged and elderly women from Korea, China, Taiwan, the Philippines and other former Japanese possessions came forward demanding that the Japanese government acknowledge its responsibility, apologize and make reparations. Despite a vigorous campaign of international protest, with mass demonstrations in Korea and Taiwan, Japan has hung tough: In 1995 Prime Minister Tomiichi Murayama offered "profound"--but unofficial--apologies and set up a small fund to help the women, to be financed on a voluntary basis by business; this past March, the Hiroshima high court overturned a modest award to three Korean women. As if official foot-dragging weren't demeaning enough, a popular comic-book history of Japan alleges that the comfort women were volunteers, and ultraright-wing nationalists have produced middle-school textbooks, approved for use in classrooms, that omit any mention of the government's role in the comfort-woman program.
Frustrated in Japan, the comfort women have now turned to the US Court of Appeals for the Washington, DC, Circuit. Under the 212-year-old Alien Tort Claims Act, foreigners may sue one another in US courts for human rights violations; the women are also relying on a law against sexual trafficking passed last year by Congress. In mid-May, however, the State Department asked the Justice Department to file a brief expressing its sympathies with the women's sufferings but urging that the case be dismissed as lacking jurisdiction: Japan has sovereign immunity, under which nations agree to overlook each other's wrongdoings, and moreover, treaties between it and the United States put finis to claims arising from the war.
In other words, it's all right to seize girls and women and put them in rape camps--aka "comfort stations"--for the amusement of soldiers far from home, as long as it's part of official military policy. War is hell, as the trustees of the New School noted in their letter absolving their president, Bob Kerrey, of the killing of as many as twenty-one Vietnamese women and children. If it's OK to murder civilians, how wrong can it be to rape and enslave them?
"The Administration's position is particularly terrible and irresponsible when you consider the evolution of attitudes toward wartime rape over the last ten years," says Elizabeth Cronise, who with Michael Hausfeld is arguing the comfort women's case. Indeed, sexual violence in war has typically been regarded as the inevitable concomitant of battle, part of the spoils of war, maybe even, for the evolutionary-psychology minded, the point of it: Think of the rape of the Sabine women or the plot of the Iliad, which is precipitated by a fight between Achilles and Agamemnon over possession of the captured Trojan girls Chryseis and Briseis, although my wonderful old Greek professor Howard Porter didn't quite put it like that. It was only this past February that an international tribunal brought charges solely for war crimes of sexual violence, when three Bosnian Serbs were convicted in The Hague of organizing and participating in the rape, torture and sexual enslavement of Muslim women.
But even by these ghastly standards, the case of the comfort women stands out for the degree of planning and organization the Japanese military employed. Noting, for example, that subject populations tended to resent the rape of local women, authorities typically shipped the women far from home; although the women saw little or no money, "comfort stations" were set up as brothels with ethnically graduated fees, from Japanese women at the top to Chinese women at the bottom. The system was not, strictly speaking, a wartime phenomenon: It began in 1932, with the Japanese occupation of Manchuria, and continued after the war's end. In fact, according to Yoshimi Yoshiaki, whose Comfort Women: Sexual Slavery in the Japanese Military During World War II (Columbia) is crucial reading, the Japanese military authorities set up comfort stations for the conquering American troops. As Cronise points out, even if the United States has closed the books on Japan's wartime atrocities, it could still side with the comfort women on the grounds that many of them were enslaved during peacetime.
"The government's position is technically defensible," says Widney Brown, advocacy director for women's rights at Human Rights Watch. "What's not defensible is the Department of Justice's giving as a reason that it doesn't want to jeopardize relations with Japan." Incredibly, the Justice Department is arguing just that, along with the further self-interested point that a ruling in favor of the comfort women would open the United States to human rights lawsuits in other countries. (Remember that the United States has sabotaged the International Criminal Court.) Says Brown, "It shows a failure to understand the significance of the comfort women case as a major step in the development of human rights for women. After all, their case could have been brought up in the Far East tribunal right after World War II, but it wasn't. This is a major chance to move beyond that. You could even argue that the view of women as property--if not of one man, then another--was what prevented sexual slavery from being seen as a war crime until now."
The US lawsuit may well be the comfort women's last chance. Now in their 70s and 80s, most will soon be dead, and since few married or had children, there won't be many descendants to continue the fight for reparations. By stonewalling, the Japanese government will have won. And the Bush Administration will have helped it. All that's missing is the call for healing and mutual forgiveness.
Once in a while you come across a book that is so original, so persuasive, so meticulously researched and documented that it overrides some of your most taken-for-granted assumptions and beliefs. Devices and Desires is such a work. The author, Andrea Tone, associate professor of history at Georgia Tech, belongs to a small band of new historians who are reassessing the lives of nineteenth-century women through attention to their personal (and I do mean very personal) health aids. An earlier example would be Rachel Maines's The Technology of Orgasm, published by Johns Hopkins in 1999, which describes and illustrates the 1880s-style vibrators that doctors freely used in their offices--and women in their homes--for relief of pelvic congestion and the female "hysteria" associated with it.
Devices and Desires opens in 1873 when, through the machinations of Anthony Comstock--star agent for the New York Society for the Suppression of Vice (NYSSV)--Congress unexpectedly voted to make contraceptives illegal. Many Americans disapproved, and when the news reached Ireland, George Bernard Shaw coined the word "Comstockery," which, he predicted, would become "the world's standing joke at the expense of the United States."
It may be that talk of the new law made contraception known to some folks who had never heard of it before. It may be that, as with Eve's forbidden fruit, the ban made pregnancy prevention seem more alluring or naughty--or more fun. Then too, the "bootleg" business environment that ensued was relatively welcoming to entrepreneurial immigrants, smart single mothers with families to support and other ambitious "outsiders." "As with condoms," Tone observes, "creating diaphragms was easy and inexpensive, an ideal venture for those with little money and a penchant for risk." In any case the business of contraception flourished in the Comstock era, embracing scores of diverse devices and spermicides for women and men. Hundreds if not thousands of small entrepreneurs and distributors profited, as did an impressive handful of industrial giants, including the arch-hypocrite Samuel Colgate, millionaire heir to the New Jersey-based soap firm, who served as president of Comstock's NYSSV while openly promoting Vaseline to "destroy spermatozoa, without injury to the uterus or vagina."
Other well-established companies that made, distributed and freely advertised contraceptives--ranging from intrauterine devices (IUDs) to vaginal pessaries (appliances intended to support the uterus that could also prevent the passage of sperm), and from douching syringes, suppositories and foaming tablets to sponges and male caps--included some still familiar names: B.F. Goodrich, Sears, Roebuck & Co. and Goodyear. "The B.F. Goodrich Company," notes Tone, "manufactured three soft-rubber IUDs--one pear- and two donut-shaped, each available in five sizes--and twelve hard-rubber models. Two of the latter models were one-size-fits-all rings." Physicians were leading players in the commercialization of mass-produced IUDs--constructed from rubber, metal, ivory and even wood--although some models were promoted for do-it-yourself insertion.
Tone's exhaustive research led her--like an ace detective or shoe-leather crime reporter--through an eight-year coast-to-coast investigation of Post Office Department records, Federal Trade Commission transcripts (some with decaying diaphragms and condoms glued to the pages), American Medical Association (AMA) Health Fraud Archives, records of the NYSSV, credit reports from nineteenth-century Dun and Co. collections, patents, love letters, arrest records, trial records, advertisements and trade catalogues--as well as "entrapment" letters, some drafted by Comstock himself. (He or another agent would pose as being in desperate need of birth control, get the goods and make the arrest.) Established companies, Tone discovered, run by "honest, brave men" who supported Comstock and NYSSV, were never targets for such treatment, which was reserved for smaller entrepreneurs--especially immigrants ("sly" Jews) and women ("old she villains").
Even so, many of those arrested were let off or punished lightly, while the entrapping agents and prosecutors ran the risk of being scolded and humiliated by judges and juries who doubted the advisability, and constitutionality, of such far-reaching Congressional interference into personal matters. Over time, what Tone calls a "zone of tolerance" was created to buffer the flourishing contraceptives trade and its practitioners. In fact, Shaw's prediction that Comstockery would become a "standing joke" was soon realized here in the United States, even, it would seem, in Congress itself.
The hugely ambitious Comstock Act, however, was hardly about contraceptives alone. It stated:
No obscene, lewd or lascivious book, pamphlet, picture, paper, print or other publication of an indecent character, or any article or thing designed to be intended for the prevention of conception or the inducing of abortion, nor any article or thing intended or adapted for any indecent or immoral use or nature, nor any written or printed card, circular, book, pamphlet, advertisement or notice of any kind giving information, directly or indirectly, where, or how, or of whom, or by what means either of the things mentioned may be obtained or made...shall be carried in the mail.
You do the math. A small army of Post Office inspectors (known as special agents) was required to enforce such an effort. But Congress refused to drum up a serious budget for the measure when the Comstock Act went into effect and made light of the ambitious national program by raising the number of inspectors--nationwide--from fifty-nine to only sixty-three. Looking at the postal arrest figures from May 1875 through April 1876, Tone counted a total of 410 apprehensions, of which only twenty-seven were for violations of the Comstock law.
Nonetheless, in later generations we took it for granted that from passage of the Comstock Act until the post-World War I rise of Margaret Sanger, the average American had little or no access to what we now call "family planning" (a term, Tone informs us, suggested in the 1960s by Malcolm X--"because Negroes [are] more willing to plan than to be controlled"). And while it's true that some of the contraceptives available at the time were ineffective, dangerous or both, others, including condoms, cervical caps (apt to be euphemistically advertised as pessaries in the Comstock era), diaphragms, sponges and some spermicides were often pretty good and relatively inexpensive. Tone notes that in this sea of alternatives many determined wives and husbands doubled, tripled or quadrupled on protection. Given this environment, it's not surprising that after 1880 the national fertility rates for both white and black women declined rapidly, reaching an all-time low in 1940--twenty years before Enovid, the first birth-control pill, came to market and only three years after the AMA's 1937 resolution to "endorse" contraception and recommend it for inclusion in the standard medical school curriculum.
Between 1880 and 1940 the average fertility rate of whites dropped from 4.4 children per woman to 2.1. For blacks it dropped from 7.5 children to 3. Given these incontrovertible facts--a flourishing contraceptives industry paired with a steady decline in births--how could we have come to believe otherwise, that our great and great-great grandmothers were, so to speak, up the fertility creek without a paddle?
Some of the historical distortion must be attributed to the work of Margaret Sanger, who originally dreamed of female empowerment through woman-oriented contraceptive technologies. She viewed birth control as a woman's right and responsibility, and wrote in 1922 that "the question of bearing and rearing children is the concern of the mother and potential mother.... No woman can call herself free who does not own and control her own body." Condoms "compromised this objective by placing women's procreative destiny in men's hands." Until her death in 1966 Sanger promoted the manufacture first of diaphragms and later the pill, never quite answering objections from other feminists--in the 1920s and again in the 1960s--that this transferred power over women's bodies to doctors who were overwhelmingly (in the case of gynecologists, 97 percent) male. Sanger came to believe so strongly in medically controlled contraception that in a 1952 letter she stated that her greatest achievement had been "to keep the movement strictly and sanely under medical auspices."
This was an about-face from her earlier position. In her extraordinary 1915 pamphlet, Family Limitation, a home guide to contraception, Sanger, as Tone explains, "envisioned a world of grassroots birth control where women from all walks of life could use contraceptives without reliance on doctors, a populist approach she would soon abandon." Family Limitation discussed douches, condoms and cervical caps. (The essential difference between a pessary or cap and a diaphragm is that the generally thimble-shaped caps fit over the cervix by suction and are less likely to be displaced. The diaphragm, however, more or less divides the vagina vertically into two compartments, protecting the cervix from the arena where sperm is deposited. Both methods can benefit from outside help with fitting and correct technique, but the cap has a better record of over-the-counter success, and was long distributed by this means in France, England and the United States.) Sanger ultimately recommended caps, which she felt could be most easily and discreetly used and controlled by women. She distributed 100,000 copies of her pamphlet, imploring women to learn how to insert caps into their own bodies and then to "teach each other" how to as well.
When Sanger and her sister Ethel opened their first clinic in 1916 they instructed women, eight at a time, on how to use over-the-counter (OTC) contraceptives, including condoms, suppositories and rubber pessaries. When police, inevitably, raided the clinic, they found boxes of Mizpah pessaries. An effective OTC contraceptive, this flexible rubber cap was sold by druggists and mail-order vendors for the alleged purpose of treating such medical conditions as a displaced (or prolapsed) uterus. But as Tone writes: "Family Limitation got Sanger into more trouble. In 1915, she found herself back in Europe dodging American law while continuing her contraceptive education.... The trip across the Atlantic was risky. War had broken out." Back home, her husband, William Sanger, had his own problems. And as fate would have it, so did Anthony Comstock. William Sanger had been arrested by Comstock for distributing Family Limitation. And Comstock, who caught a cold in the courtroom during the trial, died soon after of pneumonia.
After her divorce from William, Margaret admitted she was looking for "a widower with money." James Noah Henry Slee, twenty years her senior, was "a well-heeled member of Manhattan's business elite...part of the same establishment Sanger had vilified in her younger, more radical years." They married in 1922 and with his backing, Tone explains,
Sanger embarked on a new chapter of her career, one that distanced the birth control movement from its radical origins and placed it on a more conservative path.... She recognized...that medical science enjoyed increasing prestige and political clout...she sought birth control allies through an ideology that trumpeted women's health over their civil liberties and cast doctors, not patients, as agents of contraceptive choice.
Sanger switched her preference to the diaphragm, particularly the Holland-Rantos brand, which sold exclusively to doctors. (This company, established in 1925, was funded by none other than Mr. Margaret Sanger, James Noah Henry Slee.) Sanger next prevailed on her besotted bridegroom to hire a distinguished, high-salaried doctor to promote their new company:
1925 is to be the big year for the break in birth control...the medical profession will take up the work...I shall feel that I have made my contribution to the cause and...I can withdraw from full-time activity.... If I am able to accomplish this victory...I shall bless my adorable husband, JNH Slee, and retire with him to the garden of love.
Sanger did not retire. In the following years she worked ceaselessly toward her goal of getting the AMA to endorse birth control. Her "signature" story, often bringing audiences to tears, concerned Sadie Sachs, a young immigrant mother of three, married to Jake, a truck driver. When Sadie begged a doctor to give her birth control he cruelly retorted, "Tell Jake to sleep on the roof." Sadie later died of septicemia following a self-induced abortion. Sanger was now in the business of helping the public forget that some of the widely available OTC methods worked very well for many people. As Tone points out, if Sadie could afford a physician visit, she could surely afford the far lower price of a contraceptive.
In addition to the move toward medicalization, our collective memory may have been dealt a brainwashing by panic-driven "eugenicists." As Sanger moved up socially she supported birth control for some elitist reasons, such as "the facilitation of the process of weeding out the unfit [and] of preventing the birth of defectives." But this was mild compared with the phobic reasoning of some of our greatest national leaders, who also feared the newcomers from Europe. Falling birthrates among our native born and the widespread immigration of foreigners from southern and eastern Europe (over 23 million people arrived on America's shores between 1880 and 1920) led Teddy Roosevelt to warn in 1912 that if middle-class American women used fertility control it "means racial death." In 1927 Supreme Court Justice Oliver Wendell Holmes, our great champion of civil liberties, stunned many of his admirers when, in Buck v. Bell, he agreed to uphold a Virginia eugenics statute legalizing the coerced sterilization of "socially inadequate persons." Carrie Buck, the plaintiff, was young, single and white, the "daughter of an imbecile," the mother of an "illegitimate feeble minded child." Holmes agreed to the cutting of Buck's fallopian tubes, proclaiming, "Three generations of imbeciles are enough." Tone adds that during the Nuremberg trials following World War II, accused Nazi war criminals cited Buck v. Bell to justify the forced sterilization of some 2 million Germans.
Here in the United States, the eugenics and population control movements promoted--and continue to promote--the need to develop contraceptives that take prescription (and often removal) out of the woman's hands. For example, in interviews with 686 low-income users of Norplant--a hormonal contraceptive, intended to last for five years, that consists of six matchstick-sized capsules implanted in a woman's arm--researchers at Columbia University's Center for Population and Family Health learned that 40 percent anticipated or experienced "cost barriers" that could impede the removal of Norplant. They urge that family-planning clinics "follow a policy of Norplant removal on demand, regardless of the patient's ability to pay." Some feminists charge that the effectiveness of OTC methods (carefully used) is still downplayed in quasi-official figures, a dangerous public health mistake in this age of galloping STDs.
Meanwhile, the effectiveness of doctor-controlled methods has been exaggerated, as the FDA has acknowledged. Previously, it gave out "ideal" figures for oral-contraceptive effectiveness, in contrast to discouraging clinic "use" figures for barrier methods. In the new round of product labeling this has been partially corrected; actual-use figures for the Pill are placed in a truthful range of 92-95 percent, not at the falsely optimistic 99 percent-plus.
Devices and Desires is replete with riveting histories of women and men who labored--legally and illegally--in the ever-challenging arena of conception control, from the Comstock era through today, and includes portraits of the men who developed Enovid, the first pill, as well as those behind the notorious IUD, the Dalkon shield. Those who read this fascinating book will have a far keener and more credible sense of what has happened and where we are now. Most women are still unsatisfied with their contraceptive choices, and as Tone concludes,
It is ironic that in a post-Roe v. Wade world that celebrates reproductive choice, the most frequently used contraceptive in the country--by a wide margin--is female sterilization. In a very real sense Americans are still waiting for the heralded "second contraceptive revolution" to arrive.... In the absence of universal health care or prescription drug coverage, one way out of the contraceptive conundrum may be the development of more affordable over-the-counter methods, which would increase men's and women's options without tethering contraceptives to the medical marketplace from which millions are excluded.... Today to meet the needs of women and men who lack sufficient resources, we must supplement reliable medical methods with inexpensive over-the-counter options.
In the 1960s, the introduction of the Pill, revival of the ever-treacherous IUD and "stealth sterilization" of welfare moms--whose tubes were tied, without permission, after giving birth--placed contraception still more firmly in doctors' hands. In the parlance of that decade the "greasy kid stuff," including condoms, was left in the dust. Because of overpopulation fears, the new technologies enjoyed a diplomatic immunity--at women's expense. At an annual meeting of medical school deans, Nobel laureate Dr. Frederick Robbins declared, "The dangers of overpopulation are so great that we may have to use certain techniques of conception control that may entail considerable risk to the individual woman." Original Pills contained 150 micrograms of estrogen; today we know that 20 suffices. Millions of women served as guinea pigs for high-dose pills and IUDs, and thousands died. In 1970, at Senate hearings, Dr. Louis Hellman, chairman of the FDA advisory committee that twice declared the Pill safe, admitted that in his equation of benefit versus risk he "put population first, before benefits to the individual woman's health."
As women demanded to "take our bodies back" from deceitful doctors, the spirit of Comstock rose up again. In 1973 Our Bodies, Ourselves, Ellen Frankfort's Vaginal Politics and my book Free and Female were banned in Cleveland and Washington, DC. In 1979 shipments of cervical caps that independent women's self-help clinics imported from England were seized by FDA agents. Senator Ted Kennedy helped to get the caps released, but the FDA restricted their use to "investigational device" (ludicrous--the same caps had been in continuous use in England for a century), thus subverting the grassroots revival of the cap. Who can say who put the FDA up to this, but perhaps some future Andrea Tone-type women's history sleuth will get to the bottom of it.
Meanwhile, one of my hopes for Tone's extraordinary book is that it might encourage many people--men as well as women--to reconsider the barrier methods, respect them more and possibly learn to enjoy them, as some say they do. In contrast to the steady decline of teenage pregnancy, the epidemic of sexually transmitted diseases in young adults is increasing at a truly alarming rate. For example, estimates are that 46 percent of female college students are now infected with human papilloma virus (HPV), which can cause both genital warts and cervical cancer. Are student health services reliably advising their clients of this? My informants say no.
Here we go, starting on what promises to be a pleasantly engrossing tour of the landmarks of three centuries of Anglo-American intellectual feminism, guided by a seriously impressive scholar, Elaine Showalter of Princeton University.
Enslave your girls and women, harbor anti-US terrorists, destroy
every vestige of civilization in your homeland, and the Bush
Administration will embrace you. All that matters is that you line up as
an ally in the drug war, the only international cause that this nation
still takes seriously.
That's the message sent with the recent gift of $43 million to the
Taliban rulers of Afghanistan, the most virulent anti-American violators
of human rights in the world today. The gift, announced last Thursday by
Secretary of State Colin Powell, in addition to other recent aid, makes
the United States the main sponsor of the Taliban and rewards that "rogue regime"
for declaring that opium growing is against the will of God. So, too, by
the Taliban's estimation, are most human activities, but it's the ban on
drugs that catches this administration's attention.
Never mind that Osama bin Laden still operates the leading
anti-American terror operation from his base in Afghanistan, from which,
among other crimes, he launched two bloody attacks on American embassies
in Africa in 1998.
Sadly, the Bush Administration is cozying up to the Taliban regime at
a time when the United Nations, at US insistence, imposes sanctions on
Afghanistan because the Kabul government will not turn over Bin Laden.
The war on drugs has become our own fanatics' obsession and easily
trumps all other concerns. How else could we come to reward the Taliban,
who have subjected the female half of the Afghan population to a continual
reign of terror in a country once considered enlightened in its treatment
of women?
At no point in modern history have women and girls been more
systematically abused than in Afghanistan where, in the name of madness
masquerading as Islam, the government in Kabul obliterates their
fundamental human rights. Women may not appear in public without being
covered from head to toe with the oppressive shroud called the
burkha, and they may not leave the house without being accompanied by
a male family member. They are not permitted to attend school, and they
may not be treated by male doctors, yet women are banned from practicing
medicine, or any profession for that matter.
The lot of males is better if they blindly accept the laws of an
extreme religious theocracy that prescribes strict rules governing all
behavior, from a ban on shaving to what crops may be grown. It is this
last power that has captured the enthusiasm of the Bush White House.
The Taliban fanatics, economically and diplomatically isolated, are at
the breaking point, and so, in return for a pittance of legitimacy and
cash from the Bush Administration, they have been willing to appear to
reverse themselves on the growing of opium. That a totalitarian country
can effectively crack down on its farmers is not surprising. But it is
grotesque for a US official, James P. Callahan, director of the State
Department's Asian anti-drug program, to describe the Taliban's special
methods in the language of representative democracy: "The Taliban used a
system of consensus-building," Callahan said after a visit with the
Taliban, adding that the Taliban justified the ban on drugs "in very religious terms."
Of course, Callahan also reported, those who didn't obey the
theocratic edict would be sent to prison.
In a country where those who break minor rules are simply beaten on
the spot by religious police and others are stoned to death, it's
understandable that the government's "religious" argument might be
compelling. Even if it means, as Callahan concedes, that most of the
farmers who grew the poppies will now confront starvation. That's because
the Afghan economy has been ruined by the religious extremism of the
Taliban, making the attraction of opium as a previously tolerated quick
cash crop overwhelming.
For that reason, the opium ban will not last unless the United States is
willing to pour far larger amounts of money into underwriting the Afghan economy.
As the Drug Enforcement Administration's Steven Casteel admitted, "The
bad side of the ban is that it's bringing their country--or certain
regions of their country--to economic ruin." Nor did he hold out much
hope for Afghan farmers growing other crops such as wheat, which require
a vast infrastructure to supply water and fertilizer that no longer
exists in that devastated country. There's little doubt that the Taliban
will turn once again to the easily taxed cash crop of opium in order to
stay in power.
The Taliban may suddenly be the dream regime of our own drug war
zealots, but in the end this alliance will prove a costly failure. Our
long sad history of signing up dictators in the war on drugs demonstrates
the futility of building a foreign policy on a domestic obsession.
When we left that old journalistic evergreen, the evils of daycare, two weeks ago, the media hysteria over the NICHD study had just about peaked. The researchers had begun to turn on each other in public, never a good sign--Jay Belsky, a champion soundbiter who had seized the media initiative by strongly suggesting that the study showed that more than thirty hours a week with anyone but Mom would risk turning little Dick and Jane into obnoxious brats, was sharply challenged by numerous co-researchers, who claimed the study's results were tentative, ambiguous and negligible, hardly results at all, really. After a few rounds of this, the media suddenly remembered that no one had actually seen the study, which won't be published for another year and which does seem on the face of it rather counterintuitive: Daddy care is bad? Granny care is bad? Quality of care makes no difference? What really did the trick, though, I suspect, was that every fed-up woman journalist in America sat down and bashed out a piece telling the doomsayers to lay off, already. With 13 million kids in daycare, and two-thirds of women with children under 6 in the work force, working moms are a critical mass, and they are really, really tired of being made to feel guilty when they are, in fact, still the ones doing double duty at work and home.
Compare the kerfuffle over the quantity of hours spent in daycare with the ho-hum response to studies of its quality. On May 1, Worthy Wage Day for childcare workers, came a study from Berkeley and Washington, DC, that looked at staffing in seventy-five better-than-average California daycare centers serving kids aged 2 1/2 through 5. According to Then and Now: Changes in Child Care Staffing 1994-2000, staffers and directors are leaving the field in droves. At the centers in the study, 75 percent of teachers and 40 percent of directors on the job in 1996 had quit four years later. Some centers had turnover rates of 100 percent or more (!) from one year to the next. Half the leavers abandoned the field entirely--raising their incomes by a whopping $8,000 a year compared with the other half, who remained in childcare. Nor were those who left easily replaced: Most of the centers that lost staffers could not fill all their job slots by the next year.
The demoralization and turmoil caused by constant turnover stress both the workers who stay and the children. Making matters worse, the new workers are "significantly less well-educated" than those they replace--only a third have bachelor's degrees, as opposed to almost half of the leavers. Pay, say the researchers, is the main issue: Not only have salaries not risen with the rising tide supposedly lifting all boats; when adjusted for inflation, they have actually fallen. A daycare teacher works twelve months a year to earn $24,606--just over half the average salary of public-school teachers, who work for ten months (not that schoolteachers are well-paid, either). Center directors, at the top of the field, earn on average a mere $37,571; the recommended starting salary for elementary-school teachers in California is $38,000. (In France, which has a first-rate public daycare system, daycare teachers and elementary-school teachers are paid the same.) Daycare teachers love their work--two-thirds say they would recommend it as a career--but simply do not earn enough to make a life in the field.
It's a paradox: Even as more and more families, of every social class, rely on daycare, and even as we learn more and more about the importance of early childhood education for intellectual and social development, and even as we talk endlessly about the importance of "quality" and "stability" and "qualified" staff, the amount of money we are willing--or able--to pay the people we ask to do this demanding and important job goes down. Instead of addressing this reality, we endlessly distract ourselves with Mommy Wars. (You let your child have milk from the store? My child drinks nothing but organic goat milk from flocks tended by Apollo himself!) And because as Americans we don't really believe the rest of the world exists, when a study comes along suggesting that other-than-mother-care produces some nasty and difficult kids, we don't think to ask if this is a problem in Denmark or France, and if not, why not.
Two new books of great interest, Ann Crittenden's The Price of Motherhood and Nancy Folbre's The Invisible Heart: Economics and Family Values, point out that there is a crisis of care in America. Women are incredibly disadvantaged when they perform traditionally female work--childcare, housework, eldercare--unpaid within families. (According to Crittenden, motherhood is the single biggest cause of poverty for women.) The free market cannot replace this unpaid labor at decent rates, Folbre argues, because it would be too expensive: Even now, most families cannot afford tuition at a "quality" daycare center, any more than they can afford private school. And men are hardly falling over themselves to do their share--nobody's talking about the Daddy Track, you'll notice. Both writers call for recognizing the work of care as essential to the economy: Top-quality daycare should be funded by the government, like school, because it is a "public good."
Unfortunately, funding public goods is not exactly a high priority of government, which is busily cutting programs for children in favor of a huge tax cut for the rich. These days our main public goods seem to be prisons ($4.5 billion), the drug war ($19 billion, including $1 billion in military aid to Colombia), abstinence education ($250 million) and executing Timothy McVeigh ($50 million, not counting plane tix for celebrity death witness Gore Vidal). You can always find money for the things you really want.
Once again the Bosnian Initiative Frankfurt, a German human rights group, is asking Nation readers to help fund summer camp for Bosnian refugee children. Many readers have become an integral part of this wonderful effort, sometimes going beyond donations to correspond with particular children. $150 makes you a "godparent" and pays for two weeks of camp for a child, but gifts of any size are welcome. Send checks made out to the Bosnian Initiative Frankfurt to me at The Nation, and I will forward them.
The morning after I returned to Chicago from the recent Margaret Mead Legacy conference at Barnard College, honoring the centenary of the anthropologist's birth, a newspaper columnist rang me at dawn with the demand that I explain "from a feminist perspective" why Tony Soprano is "this millennium's first
sex symbol." "Does this mean that we're all going backwards?" she asked with relish. Dumbfounded, I countered, rather crankily, with the request that she give me evidence that any women anywhere were claiming sexual attraction to a dumb, sexist and racist, unfaithful, badly out of shape, psychologically damaged organized crime capo. (Not that I don't love the show; who doesn't, even if I have to watch my own people get minstrelized to a fare-thee-well.) Nothing daunted, and with not a shred of shame that she in fact had no evidence, the columnist cleverly countered, "Well, if it were true, what would your feminist perspective be?" As Rayna Rapp of the New School had declared to the amused Barnard audience, I had that feminist anthropologist's "WWMMS moment": What Would Margaret Mead Say?
What would she say? We could easily remark about Mead what Walt Whitman, another New York-based celebrity, claimed of himself: She is large, she contains multitudes. Mead was professionally active for fully half of the last century and, by choice and her own never-ending efforts, very much a public voice for most of that time. She said a lot of different things in different decades, and she was received variously by her own professional colleagues and within a shifting American public sphere. The Barnard conferees, including the college President, Judith Shapiro, herself a feminist anthropologist, and Mead's daughter Mary Catherine Bateson, spoke on key aspects of Mead's work, on the ways in which she had inspired their own research, and on what her legacy might be in this millennium. They agreed that she and her cohort made a series of novel connections: envisioning the malleability of gender relations, seeing human corporeality, ritual and psychology as one, emphasizing the deeply enculturated nature of child-rearing and of adolescent coming of age. Elaine Charnov, director of the Margaret Mead Film Festival, and Faye Ginsburg of New York University both spoke as well of Mead's prescience in the use of ethnographic film and her general status as technological pioneer.
"But there is no Margaret Mead now," the panelists lamented with the partisan, largely female audience crowding the auditorium, and variously attributed that fact to her heroic uniqueness, to her intellectual coming of age being the "right time and the right place" for public sphere presence, to the rise of the New Right in the West and the triumph of global neoliberalism, to the renaissance of biological reductionism in the overwhelming American popular-cultural presence of sociobiology, to our collective retreat from public voice.
Mead became a popular icon over the course of the 1960s, and attained the status of Holy Woman in her last years, as the late Roy Rappaport commented (she died in 1978). And it is difficult to evaluate Holy Women, particularly in the long wake of a posthumous attack--by the Australian anthropologist Derek Freeman, in 1983--that quickly became a mass media firestorm. Margaret Mead and Samoa is a badly written and unconvincing claim that Mead, influenced in a "culturally determinist" direction by her nefarious adviser Franz Boas, falsely interpreted the Hobbesian world in which Samoan youth came of age as a gentle idyll. Freeman claimed that the true Samoa is characterized by a "primeval rank system" that dictates a "regime of physical punishment" of children and violent "rivalrous aggression" among men, "highly emotional and impulsive behavior that is animal-like in its ferocity" among chiefs, and a rape rate "among the highest to be found anywhere in the world." Scholars criticized Freeman's theoretical vacuity and empirical flaws, his ahistorical claim of an Eternal Samoa, his failure to realize that his key informants--older, high-status males--were no more a "true and accurate lens" of Samoan culture than were Mead's young female companions. Most especially, feminists noted the rank sexism of Freeman's focus on Mead's youth and size: The "liberated young American...only twenty-three years of age...[was] smaller in stature than some of the girls she was studying."
But public reception of, as opposed to in-house reaction to, Freeman's book was most importantly about the extraordinary fit between his line of attack and newly dominant New Rightist politics. Margaret Mead and Samoa provided a Heaven-sent opportunity for the press to rant against the "liberal feminist culture" and "lifestyle experiments" with which it newly identified Mead, conveniently forgetting its fervent eulogies of her of only half a decade earlier. As the late David Schneider noted at the time, Freeman's book was "a work that celebrate[d] a particular political climate by denigrating another." Freeman's vision, at one fell swoop, allowed commentators to deny female intellectual capacities across all societies; to naturalize male dominance, male sexual violence and aggression; and at the same time, to slur Samoans--and with them all so-called primitives, that is, all people of color--as violent, nasty savages. Logically self-contradictory and empirically bankrupt though it was, Freeman's narrative was wonderfully composed to fit American Reagan-era contentions of the foolishness of the movements of the 1960s and 1970s, as you "can't change human nature," and Western, upper-class, male, heterosexual, and white dominance are natural after all.
Freeman's frisson in popular culture is now long past, victim of the increasingly rapid biodegradation of American popular consciousness. I routinely ask gender studies and anthropology classes if they have heard of Freeman or his book, and very few respond affirmatively, even when they have a niggling sense that there is some sort of blot on Mead's reputation. Only American New Rightists remember and believe in Freeman's attack on Mead: A Lexis/Nexis search for all articles referring to the two since 1990 revealed only a handful of sneering articles in rightist outlets, whereas a general search using Mead's name alone garnered thousands of "hits."
But while specific media scandals always fade, popular-cultural troping of anthropology and anthropologists is unceasing, and distorts whatever information members of the discipline may have to offer concerning power and culture in human societies. The issue is more crucial than is often realized, because popular apprehensions of anthropology and anthropologists are importantly interwoven with changing American constructions of Others--those deemed somehow apart from the norm by virtue of race, gender, nationality, class, religion, sexual identity. "Culture" and "biology" are the two key domains through which Americans historically have laundered politics from public sphere inspection. And anthropology, sententiously self-described as the most humanistic of the sciences, the most scientific of the humanities, has thus evolved into a cynosure of political approbation and attack, both refuge and refuse in contemporary contestations over power.
I have recently identified a series of anthropological "Halloween costumes" into which, since the 1960s, American popular writing, film, television, cartoons and advertising have tended to squeeze all anthropological knowledge. Each costume--Technicians of the Sacred, Last Macho Raiders, Evil Imperialist Anthropologists, Barbarians at the Gates and Human Nature Experts--reflects minor strands in some past or present anthropological writing. More important, each enacts a retrogressive politics with reference to culture and power on the US and global stage. The costumes, in other words, act as Procrustean beds, amputating those pesky limbs of anthropological knowledge that flop outside their predetermined grids.
Technicians of the Sacred, for example, posits anthropologists as time travelers who bring back to us visions of Noble Savages living nonviolently and cooperatively, practicing sexual equality, respecting the environment and engaging in religious worship somehow more "spiritual" than ours. Such a seemingly benign vision, however, yanks contemporaneous populations out of our shared stream of world history and prevents us from understanding the ways in which they lack political power on local and global stages. Last Macho Raiders imagines anthropologists as a guild populated by cool Harrison Ford lookalikes, virile, positive imperialists. This particular Halloween costume has long had little appeal to most members of the guild but remains widely used in popular culture. The Evil Imperialist Anthropologist, on the other hand, who is simply a Last Macho Raider seen from the viewpoint of the Raided, has roots in some Third World and Native American writing of the 1960s, became a stock postmodern character in much academic writing in the 1980s and spilled over into popular culture. Barbarians at the Gates envisions anthropologists as foolish multiculturalists, misguided salespeople hawking inferior--non-Western--cultural materials to a gullible American public. In other words, it is a rightist, racist framing of the Technicians of the Sacred trope, and has been heavily purveyed in New Rightist writing through our recent culture wars. And Human Nature Experts paints anthropologists as pure scientists--gatherers of facts alone. Of course we need to stand for empirical reality, but this particular trope functions in the public sphere today almost solely as a rationale for sociobiological arguments, as if all fact and logic were the sacred possession of that contemporary version of biological reductionism alone, and the active armies of distinguished anti-sociobiology scientists were a mere rumor in the wind.
We might say, then, that Freeman, in an act of simultaneous attack and self-aggrandizement, dressed Mead as a Barbarian at the Gates and himself as a Human Nature Expert. Ironically, though, Mead herself, over the decades, had no small hand in the fashioning of the Human Nature Expert costume. At the same time, the Halloween costumes are most definitely not unisex, and Mead, like the rest of us, never escaped her gender. Looping back to the years of Mead's intellectual coming of age and first writing helps us to trace the changing lineaments of "culture" in American politics, particularly that shifting overlap of gender and race that is so crucial to the framing of rationales for and protests against contemporary lines of stratification.
In the 1920s, counter to the assertions and interpretations of more recent commentators, Mead's Coming of Age in Samoa was written, and was read, not as a paean to free love or women's rights or even the romantic lives of "noble savages," but rather as a scientific account of certain differing cultural features in a "more simple" society that "we," meaning middle-class white Americans, might wish to adopt in order to raise "our youth" in a less stressful manner. Mead defined herself early on, as I have written, as an objective scientist, a professional social engineer. Despite her sometimes somewhat overblown lyricism--"A group of youths may dance for the pleasure of some visiting maiden," "lovers slip home from trysts beneath the palm trees"--Mead ultimately is no Technician of the Sacred, no romantic antimodernist. Her 1920s Samoa is a "shallow society" where "no one plays for very high stakes," of use to "us" as an object of scientific study: it is a "human experiment" under "controlled conditions."
While Mead worked almost solely with girls and women, and certainly wrote, as a contemporary reviewer put it, with the "clean, clear frankness of the scientist" about their sexual experiences, she did not draw explicitly feminist lessons from her fieldwork. Like many young women of her postsuffrage era, to whom feminism seemed both passé and potentially professionally damaging, Mead took advantage of the doors opened by the women's rights activism of her mother's and grandmother's generation (and indeed, by her own mother and grandmother) but did not herself join that ongoing movement. Nancy Lutkehaus of USC pointed out at the conference that Mead became a public intellectual in the 1920s because of her fortuitously advantageous position in time and space: based in New York, writing about the South Seas at a point when that region had caught the American public imagination, and indeed at just the point that newspapers and magazines were proliferating across the American landscape. And, I would add, because she wrote then not as a radical but as a modernist, advocating changes for the white middle classes--coeducation, less authoritarian childrearing, greater frankness about the facts of birth, reproduction, and death--already in the works in the Roaring Twenties.
But scientist or not, Mead was at times portrayed in the press as many anthropologists, especially female anthropologists, have been since: with a certain condescending, inappropriately sexualizing humor, identified with her inevitably stigmatized subjects. Lutkehaus has unearthed 1920s newspaper stories claiming that Mead went to Samoa to study the "origins of the flapper." And in the 1930s, a popular magazine coyly described Mead as "a slender, comely girl who danced her way into the understanding of the Melanesian people and became an adopted daughter and a sort of princess of the Samoans. When they anointed her with palm oil and indicated that a dance was in order, she did a nice hula and they declared her in--indicating the adaptability of the modern young woman if she just has a chance to step out."
It was one of the first steps on the long road toward my Sopranos interlocutor--herself, as we shall see, merely part of a vast contemporary phenomenon.
Mead's early anthropological work reflected both developing British and American concerns: She did careful kinship and social organizational research in Samoa and New Guinea, and studied what she construed to be varying cultural temperaments--characteristic psychological states--and their connections to sex roles and life cycles. But by 1935, when she published the widely read Sex and Temperament in Three Primitive Societies, her analytic twig was permanently bent in the Americanist psychological, "culture and personality" direction. Sex and Temperament, rather than Coming of Age, is the work in which Mead makes her clearest arguments concerning the plasticity of human sex role arrangements: "Many, if not all, of the personality traits that we have called masculine or feminine are as lightly linked to sex as are the clothing, the manners, and the form of head-dress that a society at a given period assigns to either sex.... We are forced to conclude that human nature is almost unbelievably malleable."
This is the modal Mead, the ur-popular culture anthropologist who was rediscovered by Second Wave feminists, assigned all over the academy and read aloud in consciousness-raising groups in the 1970s. It is important to remember, though, that every generation reads selectively. Mead's "gender malleability" statements are, in fact, lodged inside a larger argument against women's equal rights as represented by the contemporary Soviet Union. Mead saw in the opening of all occupations to women there a "sacrifice in complexity" of culture: "The removal of all legal and economic barriers against women's participating in the world on an equal footing with men may be in itself a standardizing move towards the wholesale stamping-out of the diversity of attitudes that is such a dearly bought product of civilization."
Ironically, the very popular troping of anthropology for political purposes has contributed to the discipline's unpolitical reputation. Explicitly political statements in anthropological texts vanish in the course of reading, the discipline's pioneering anti-racism and near-implosion over Vietnam have succumbed to the culture of forgetting, and the long-vital left tradition in anthropology worldwide is popularly and often even professionally invisible. When I gave an early talk on Mead for a group of women's studies professors, a senior political scientist exclaimed afterward that she was surprised to learn that Mead "had any politics." All God's anthropologists got politics, most especially the very public Margaret Mead.
Those politics varied considerably over the long decades of Depression, war, decolonization and cold war, and the 1960s and 1970s conjuncture of Vietnam, civil rights, the Second Wave of feminism and associated gay rights organizing. The one common thread across the decades, though, was Mead's adherence to Progressive social engineering, and thus her profound commitment to the notion of disinterested science and the rule of experts. In terms of gender, from the 1930s until the early 1970s, when she did take on a liberal feminist stance, Mead embraced the conservative Freudian notion of the universal "constructive receptivity of the female and the vigorous outgoing constructive activity of the male." While Mead continued to argue against isolated, narrow nuclear families, and for careers for better-off women as long as they were "womanly" both at home and at work, it is no wonder that Betty Friedan in 1963 spent almost an entire chapter of her celebrated book attacking Mead's pernicious "super saleswomanship" of the Feminine Mystique. It is equally unsurprising that Second Wave feminists, in rediscovering both Friedan and Mead for their own purposes, read Friedan as selectively as they (we) did Mead.
Similarly, Mead's war work for the American government extended into both her very successful postwar advocacy of federal funding for anthropological research and her cold warrior stance against, among other actions, antinuclear and anti-Vietnam War protests. (The latter issue led to a huge fight at the 1971 American Anthropological Association meetings, during which Mead was hissed by an antiwar audience of 700.) These political actions were overwhelmed, in popular culture, by Mead's highly public approbation of "questing youth" from the mid-1960s forward, and the liberal feminist alliances of her last years. She even had her own character in the first stage version of Hair, who celebrated male "long hair and other flamboyant affectations," and whose song ended in the recitative, "Kids, be whatever you are, do whatever you do, just so long as you don't hurt anybody."
Hair's sendup of Mead followed her thinly disguised appearance as a famous older female anthropologist in Irving Wallace's sleazy 1963 potboiler, The Three Sirens: "She thought of the place: the temperate trade winds, the tall, sinewy, bronze people, the oral legends, the orgiastic rites." The sexual imputations in both texts return us to consideration of my morning journalist and Tony Soprano. Like all occupational groups, anthropologists have traditions of internal self-reference, but ours have intersected in particularly damaging ways with the changing Zeitgeist. From Clyde Kluckhohn's 1940s boast that we were all "eccentrics" "interested in bizarre things," to Clifford Geertz's 1984 reference to anthropologists as "merchants of astonishment," many of us have enjoyed exoticizing ourselves, playing, as I have written, the court jesters of academe. While some of this self-exoticization has always arisen from identification with oppressed populations, the overall effect of the court jester construction is dire. Anthropologists have become the American public's "exotics at home," identified with our demonized, trivialized subjects, or rather, those who are presumed to be our sole subjects--non-Westerners around the globe, the poor, the nonwhite, and sexual minorities in every country. And, of course, in double irony, the numbers of nonwhite, non-Western and/or gay anthropologists, never insignificant, grow larger every year. By a process of infinite recursion, the stigmatized figurings of subjects and researchers repeatedly rub off on one another, denying dignity, history and human rights to domestic and foreign "exotics," and stripping anthropologists of the intellectual authority with which to contribute to progressive politics.
Ironically, the "exotics at home" complex also reifies the discipline's long historical game of peekaboo with Americanist research. Mead's own master's thesis concerned the link between exposure to English and IQ test results among Italian-American schoolchildren in New Jersey (Tony and Carmela's grandparents!), and significant numbers of anthropologists have done United States fieldwork in every decade since. All of the Barnard conference senior panelists, for example, are engaged in Americanist research. But in the grand adaptive radiation of disciplinary institutionalization as American universities grew over the twentieth century, anthropology was defined in contradistinction to sociology as the study of non-Westerners (and based on ethnographic rather than quantitative methodology), perpetrating falsehoods about actual work being done in both fields. Since at least the 1940s, anthropologists and middlebrow media have repeatedly "discovered" that anthropology is "just now coming home." In 1974 Time declared that
the gimmick is that anthropologists, after decades of following Margaret Mead to Samoa and Bronislaw Malinowski to the Trobriand Islands, have staked out new territory.... U.S. anthropology, it seems, must recognize that the primary tribe to study is the Americans.
But then, fifteen years later, the New York Times Book Review declared:
Pity the poor anthropologist. She has trekked the highlands, machetied the jungles, sifted the sands for new tribes to study. But the Ik have been exposed, the Tasaday tallied. What's left? Increasingly, today's would-be Meads and Benedicts are turning in their bush jackets for tweeds, for some easy poking around in their own backyards--where, lo and behold, they unearth practices as alien to Western norms as any found in the heart of New Guinea.
And so it goes. While such consistent failure to engage with empirical reality, and the condescending notion that US fieldwork must of necessity be merely "easy poking around," are extremely annoying to those of us who have done arduous research in American settings, I want to make a different point. That is, the culture of forgetting involved here enacts what I have labeled the anthropological gambit, or the pseudo-profound claim that "we" are like "primitives." The attribution of "our" characteristics to "them," and vice versa, is always good for a laugh in popular culture. "We" are only the Ik, the Tasaday, the Trobrianders. An X is only a Y.
As if to prove the point, the day before I left for the Barnard conference, the New York Times happily declared that "some distant day, anthropologists may discover what was surely the tribal art of 20th century American suburbia: paint-by-number paintings."
There is nothing innately wrong with cuteness--it is after all a matter of taste. The point, though, is that the gambit, which is ubiquitous in the public sphere, is inherently political and engages in hidden rhetorical work. It certainly represents Edward Sapir's "destructive analysis of the familiar," one element of a liberatory cultural critique in use in the West at least since Michel de Montaigne's ironized cannibals. But it does so in a profoundly ahistorical, noncontextual way, and so places Others at a temporal distance from ourselves and effaces the questions of history and power on both poles of the contrast.
Finally, anthropologists like Margaret Mead and Ruth Benedict pioneered an open-minded consideration of the varieties of cross-cultural human sexuality. Esther Newton, who recently published a memoir magnificently titled Margaret Mead Made Me Gay, spoke compellingly at the conference on the impact of her adolescent reading of Mead's "defense of cultural and temperamental difference." But that reputation for sexual frankness, combined with femaleness, the anthropological gambit and exoticism at home, has also long made women anthropologists, as I have noted for Margaret Mead, vulnerable to sexually insinuating popular-cultural costuming. In the rollicking postwar musical and film On the Town, recently revived on Broadway, a sex-crazed debutante "anthropology student" character throws herself at one of the sailors on leave because he "exactly resembles Pithecanthropus erectus, a man extinct since six million BC." The television series 90210 had a sluttish "feminist anthropologist" character, and Sarah Jessica Parker, in Sex and the City, calls herself a "sexual anthropologist." The way things seem to be going, we might want to specify another media Halloween costume for the feminist anthropologist: bustier and fishnets.
Feminist anthropologists tend to be a pretty tough bunch and certainly can take care of ourselves. (One of the conference participants pointed out to me with great glee that she was wearing fishnet hose.) The real issue is the one with which I began: the deadly intersection of the distorting Halloween costumes, the anthropological gambit and garden-variety sexism in the public sphere, which prevents the popular dissemination of actual knowledge on gender, culture and power. Consider my own experience. I write about gender, sexuality, race, ethnicity, and class formation in the American past and present, particularly in American cities, and I have done popular writing on racial injustice (and its connections to gender) in the present. But when does the Fourth Estate contact me? Let's just review instances from the past few years.
A 20/20 reporter called, wanting a professor to explain on-camera why "some men are sexually attracted to very obese women." (He kept trying to assure me that the show "would not be sleazy." Right.) A public radio show host invited me to do a Valentine's Day show with her on love and courtship ritual. A local television newswoman wanted my "anthropological" analysis of why women were buying Wonderbras. A Newsweek reporter asked for my thoughts on why, despite so many decades of feminism, American women still enlisted the aid of hair dye, makeup, plastic surgery and diets. Didn't that prove that we were genetically encoded to try to attract men to impregnate us and protect our offspring? A young stringer for Glamour called, looking for "expert" analysis of why women are attracted to certain types of men. An Essence reporter wanted my thoughts on why Afro-American women, according to her, repeatedly and irrationally fell for "thugs." And last fall, a Good Morning America producer begged me to appear on a show with the theme "Is Infidelity Genetic?"
So I wasn't entirely surprised by the Tony Soprano call, with its silly implications of a homogeneous feminism obsessed with mass media portrayals of sex roles. And I don't think even the Whitmanesque Margaret Mead, today, would have an easier time of it. In her memorable last years, Mead could and did play the progressive Sibyl, the wise elder using her vast cross-cultural knowledge to comment favorably, to a largely adulatory press, on the increasing social liberalization that we saw around us. But that was before Jimmy Carter, with whom Mead was closely politically allied, lost disastrously to Ronald Reagan in 1980. It was before the New Right, before the race and gender backlashes of "the underclass" and "feminazis" and the internal problematics of identity politics, before the second, more successful rise of sociobiology, before the global rightward tilt and recent neoliberal triumph, before the corporate consolidation and increasing tabloidization of the media. And, of course, it was before the "turn to language" complexified the ways in which we all envision science, culture and knowledge.
We no longer unproblematically invoke "science," but consider it a culturally contingent, powerful process. Emily Martin of Princeton and Rayna Rapp spoke at the conference about their ongoing and highly regarded research on gender in the production and consumption of scientific knowledge. But both scholars also work actively with bench scientists and even teach alongside them. Similarly, many of us work to reposition our analyses, no matter what their regional focus, both to reduce the United States and to enlarge it, always with the goal of accurately apprehending gender, culture, history and power. That is, we no longer simply observe the world from the perspective of the West, but instead consider the long regional and global histories of contact, trade, power politics, racial and ethnic formation and shifting political boundaries of which the United States and other Northern and Southern nations are part. At the same time, we no longer attempt to see the United States just as one among many nations--as in "an X is only a Y"--as it is still and has been the locus of global imperial power since the end of the Second World War.
These less Olympian, more nuanced understandings are a harder sell in the public sphere, but along with other progressives, we continue our Sisyphean engagement. And Mead's image, although not entirely according to her intentions, continues to inspire radical visions of American and global potentials: of international understanding, of gender, race, class and sexual equalities, of a different, more egalitarian world. Marcyliena Morgan of UCLA, for example, reported that the hip-hop-involved adolescent Afro-American girls with whom she works see Mead as "a liberating force." As Rayna Rapp ringingly concluded, "Collectively, we are surely on the case."
Depending on who's counting, one in four, five or six American children lives in poverty, the highest rate in the industrialized West. Nearly 11 million have no health insurance. Hundreds of thousands are in foster care. Five hundred thousand are homeless. The infant mortality rate in the inner cities of Washington, New Haven, East St. Louis and Chicago rivals that of Malaysia. There is one thing America has, though, that you won't find in France or Denmark or Sweden or Italy, and that is the persistent conviction that children would be just fine if only their mothers would give up working and stay home.
Consider the media feeding frenzy around the latest research released by the National Institute of Child Health and Human Development. Just about every paper has given major play to its finding that 17 percent of children who regularly spend thirty hours or more a week in childcare between the ages of three months and four and a half years are aggressive, disobedient and defiant in kindergarten, versus only 6 percent of children who have spent less than ten hours a week in childcare. (Childcare, by the way, is everybody but Mom, including nannies, Dad and Grandma--so forget equal parenting, and forget, too, the nanas and bubbes and aunts and older sisters who have taken care of small children for centuries while mothers toiled in the fields or behind counters or over laundry vats long before "working mothers" existed.) Buried in the coverage is the study's other finding: that high-quality childcare is associated with better cognitive and linguistic skills. Unmentioned is the fact that only a few years ago welfare moms were lambasted as lazy and useless for staying home with their children by some of the same right-wing ideologues now crowing on TV about the NICHD study. The truth is, the daycare debate has always been about college-educated working moms--women with good jobs some think they shouldn't have, and children every quirk of whose development is of interest to the opinion classes.
As it happens, Jay Belsky, who has gotten the lion's share of the press attention and is often cited, incorrectly, as the study's lead or even sole author, has been warning against the dangers of early childcare since 1986, when he claimed it caused babies and toddlers to fail to bond with their mothers. That didn't pan out, but Belsky is all over the press now, boasting of his lack of political correctness in bringing people the unpleasant truth. "I won't lie down and play dead," he told the New York Times. Elsewhere, he has recommended not only parental leave but that mothers reconsider full-time work. Sarah Friedman, Kathleen McCartney and other researchers on the study don't agree at all. "This study was conducted by a team of some thirty researchers," Friedman told me. "His view is not the majority view." And she adds, "The type of analysis does not allow us to infer causality." In other words, childcare may not cause aggression but may be associated with something else that does--family stress, exhausted parents. Says Deborah Vandell of the University of Wisconsin, "Mothers should stay home? Childcare is bad for kids? The data don't support that." And indeed, the study isn't so dire: Most kids in childcare are fine; the problematic behavior falls within the normal range; moreover, kids kept out of childcare double their rate of aggression when they finally get to school, suggesting that Vandell may be right when she theorizes that the results mostly reflect the opportunity for aggressive behavior, and that kids would benefit from better conflict-resolution skills. (In an all-caps e-mail to me, Belsky professed himself "appalled" that McCartney put this idea forward on Face the Nation--he claims the study refutes it--and accused his colleagues of focusing on childcare quality rather than quantity because they don't want to be "unpopular.")
It's easy to take potshots at social science, so I'll just note in passing that one of the criteria for "cooperation" is "keeps room neat and clean without being reminded." It does seem like yesterday, though, that Bruno Bettelheim was blaming the group care typical of an Israeli kibbutz for making kids too sociable, too compliant, not ruggedly individualist enough. I know, it sounds crazy now--have you ever met a laid-back Israeli? But then, as Caryl Rivers pointed out on Women's E-News, back in the 1950s stay-at-home moms were blamed for producing a generation of mollycoddled wimps unable to stand up to the communists. If middle-class working moms really did trade the briefcase for the stroller, not only would lots of them be poor and frustrated, but within five years we'd be reading about spoiled, feminized sons and angry, condescending daughters already plotting their escape to Lesbian Island.
My French friends find the American debate over childcare utterly mystifying--all French 3-year-olds go to the écoles maternelles, and many are in crèches long before that. In European countries with long-established childcare systems, the American suspicion of daycare does not exist. (Vandell noted that the European papers haven't even reported the NICHD study.) But then, why would it? European parents have government-paid parental leave and government-funded childcare systems staffed with well-trained and decently paid professionals. In this country, paid leave is a rarity, and daycare is like babysitting: Any warm body will do. Pay is abysmal, training rare, formal standards low. And, of course, the very conservatives who champion the NICHD study oppose every attempt to raise those standards, because that would cost money, encourage "bureaucracy" and go against the know-nothing faux libertarianism that is their political stock in trade.
There's another difference, though: Although everywhere childcare is connected to women's employment, in Europe childcare was developed as something that would be beneficial for children, like nursery school; in this country, it's seen as something for women--women, who if middle-class shouldn't have jobs and if low-income shouldn't have kids. Daycare in America is about feminism. That's why no matter how many studies appear touting the benefits of high-quality childcare, the ones that hit the headlines are always full of gloom.