I am no doubt not the only one who writes in order to have no face. Do not ask who I am and do not ask me to remain the same.
      --Michel Foucault

Ever since I was assigned to read A Room of One's Own in college eight years ago, I have kept it close for support. Besides the fact that, like Virginia Woolf, I had also read Leo Tolstoy's journals and was similarly enraged by the clarity he exhibited at such a young age, I felt she was on to something. "A room of one's own," however, wasn't quite it.

I grew up in the 1980s inside a charming, if small, Spanish-style California home with a mom, a sister, a piano, many dogs and two brothers who were relentless in their efforts to jimmy my locks with butter knives. Although I liked to think I had my own room, the space's original and (now that I'm an adult I can say it) idiotic plans called for its doubling as a shortcut to one of the house's more popular bathrooms. I went to great and occasionally violent lengths to discourage use of this particular feature. The results? Ha, as DrRogue would say.

Time and time again, my brothers broke triumphantly in, capturing me on my bed writing in my journal. I smile now, but back then, those intrusions enraged me. You see, unlike those jerks, I was becoming a woman, and it was making me miserable. I liked being a girl. And it wasn't necessarily that I was surprised to discover that girls became women--in a rational way, I knew it happened all the time. It was just that I found the obvious sexuality of it offensive. It was clear to me that once breasts made their appearance against a shirt, a person could not be taken seriously.

I looked around at school and saw happy, pretty girls who went to the beach. They seemed content being female, and I liked boys too, so I did what they did. Then I went home depressed, slammed my doors, locked them and wrote to my journal about how much I loathed myself, until my brothers broke in after school. Here is a genuine entry from November 4, 1988: "I've gained five lbs. in a week if that's possible. I loathe myself. I'm sick of being regarded as 'muscular'--I want to be petite like Kate. Sports have ruined me."

Yes, you could say I was self-involved, melodramatic, petty. Worse, I was repetitive: November 4, 1988, was no date with an epiphany. Nonetheless, I was in pain. All I knew was that I'd gone from being an outspoken girl interested in everything to someone withdrawn and incapable of participating in class. I was depressed, but more than that I was hating myself for being a woman. I'd slipped onto a path that is as vicious and uncreative as it is a cliché of young womanhood. In a rare moment of teen lightness, I named it the Dark Horrible Sucking Trail of the Lost Voice. The Trail shouldn't be underestimated. Every day, another girl gets stuck in its mud.

Virginia Woolf would have liked DrRogue, a bright, confident writer with a lashing wit. As to whether DrRogue is a genius as defined by Woolf, only time will tell. In the meantime, she manages beyond the confines of a tiny school (there are just five girls in her class), her parents' limited financial means and, oh yes, childhood, to find experience and "a room of her own" for the cultivation of her talents. You see, DrRogue is known to those in her small New England town as 13-year-old Susan.

We met on July 14, 2000. I'm a part-time producer of commercial websites for teen girls and was spending some time perusing the vast number of homepages linked to one another on the Internet. There are thousands out there, colorful and animated, with names like Glitter Girl, Fairy Dawn, Overcooked, Pixie Kitten, Foily Tin, Quasi Grrrl, Peppermint and Intelligent Life. They can be constructed at established homepage-building areas of sites such as gURL, ChickClick, Lycos and Bolt. Or they can be created independently, then hooked into freer-floating webrings like "Shut Up You're Only 16!" and "Music Girl." Some are smart; some are sappy. Most are filled with poetry. If you visit, you may find writings, drawings, photos, interactive games and the creator's deepest, darkest secret. But you won't ever find her. Instead, your experience will be that of discovering an anonymous diary on a crowded city street. You can read it and learn everything about its owner, but look up and she's long gone.

DrRogue, however, is right here. I have no idea who she is.

DrRogue: Hello!

She flashes onto my screen, an insistent clump of text in a small, square dialog box aboard America Online's ubiquitous Instant Messenger (IM).

DrRogue: Hey, Bronwyn!

Aha, exclamation points. Dead giveaway. DrRogue, I now know, is one of the dozen or so teen girls whom I have e-mailed about their homepages. By now, I've visited enough to identify the prints.

I flip through my list of teen e-mails. DrRogue is Susan, the one with the website called Intelligent Life. Evidently, she has made a note of my AOL e-mail address and posted it to her "buddy list" to see when I'm online. She's flashing again.

DrRogue: Hello, it's Susan.

(Slowness, you see, is terminal here.)

BGAR2: Yes, Susan, of Intelligent Life. Hello.
DrRogue: Yeah, yeah, just thought I'd reiterate.

In my web travels, I've discovered that most Internet-savvy, homepage-creating girls provide only first names on their sites and to people they meet online. Discussing this find via e-mail with many different young women, I learn that the "first name only" policy is pretty strictly followed in these parts. Not every teen, of course, approaches her online development in the same way. A website like Goosehead, for example, the work of a 15-year-old student, her parents and a growing staff, serves up a huge number of provocative photos of the pretty tenth-grade founder. On self-run sites, however, those who forgo anonymity are just throwing bait to bad people. "Avoid the psychopaths," as DrRogue says.

Anonymity is one of the hallmarks of safety online. And both anonymity and online safety, I learn, are crucial to privacy. If you'd asked me in high school what privacy meant in my life, I might have said, locked physical space that cannot be invaded with a butter knife. I mention this to DrRogue. It becomes clear that things have changed.

DrRogue: Privacy is [about] having your personal space in a more intellectually abstract, metaphysical sense. It is making sure that no one can find out too much about your real life through your online follies, keeping relationships strictly anon and not putting yourself at risk of psychopaths.

Evidently, DrRogue's issues go beyond keeping her brother out of her room (although this is cited as an ancillary goal). No matter the "horror stories blown up by newscasters," Susan asserts that she doesn't "feel threatened that [her] anonymity is slipping away." On the contrary, she blames the media for exacerbating the situation with their stupidity:

DrRogue: Newscasters report the invasion of my privacy very sternly, as though they're the ones decoding the human genome. From the way they speak, as though every technical word is new and unknown, it's clear that they're still asking their kids how to log on to the Internet.

Notwithstanding adult ignorance, DrRogue's understanding of privacy, "metaphysical" as it may be, is still predominantly about hiding the cord that could lead "psychopaths" to the "real" her. But isn't this paranoia for good reason? I ask. Haven't we all heard stories of "psychopaths" locating teenagers from clues carelessly dropped in chatrooms? DrRogue bristles at this one.

DrRogue: And then we have, "Cyber stalkers! Are your kids safe on the net?" If they have an IQ higher than a rock. Never give your full, real name, specific location, or phone number to a stranger. EVERY kid should know that!

"Keeping relationships strictly anon," in fact, is just the beginning of what a user, child or adult, must learn about the web, according to DrRogue. Sensing my ignorance, she outlines society's paranoia, the real issues as she sees them and the solutions--albeit in a somewhat mocking tone:

DrRogue: "Companies are putting something called 'COOKIES' onto your hard drive to TRACK WHAT SITES YOU VISIT!" screams the television. Anyone with a brain and a mouse should know that you can clear all the cookies off your hard drive anytime, or even set your computer to not accept them at all. "CREDIT CARD FRAUD!" is another big one. If you have a grain of sense, you won't give your credit card number to sites with names like Bob's Discount Warehouse. If you're not sure it's credible, DON'T SHOP THERE. "A new virus is spreading like wildfire around the world and through major corporations!" Major corporations where the employees are so out of touch, apparently, that they'll download anything.

Although DrRogue and most of the teens I encounter online are very careful about their privacy, the Annenberg Public Policy Center's The Internet and the Family 2000 reports that 75 percent of teens consider it acceptable to reveal online what the study has defined as "private family information" in exchange for gifts. These "older kids," aged 13 to 17, are even more likely than younger ones, according to the study, to divulge personal details, including the names of their own and their parents' favorite stores, the type of car their family drives, whether their parents talk about politics and what their parents do on the weekends.

In the adult world, a person's concern for her own physical safety goes without saying, while the current discourse surrounds public image more than it does personal safety. In a New York Times Magazine article on privacy and technology, Jeffrey Rosen states that through the interception of e-mails, the tracking of browsing habits and purchases online and statements made in chatrooms, one's "public identity may be distorted by fragments of information that have little to do with how you define yourself."

Indeed, we've all heard stories about e-mail fragments being taken out of context by employers and others. The Washington Post described the case of James Rutt, the CEO of Network Solutions Inc., who feared his years of candid postings about sex, politics and his weight problem might be taken out of context, thus damaging his reputation and his ability to run his company effectively. Consequently, the new CEO employed a program called Scribble to help erase his online past.

Rutt's story is not unusual. This fear of being taken for a "fragment" of information is enough for most of us to employ private e-mail accounts we switch to at work when we have something personal to communicate. Even though we take these precautions, everyone I know goes to great lengths to erase all of her business e-mails when leaving a company. Why? Because the medium is seductive: Web travel is about exploration, and e-mails begun as impersonal memos often become more intimate exchanges. Intimacy does not translate well to third parties.

Take the 1997 example Rosen cites of Harvard professor Lawrence Lessig's e-mailed comment to a friend that he had "sold [his] soul" in downloading Microsoft Internet Explorer. Lessig, who had downloaded Explorer simply to enter a contest for a PowerBook, was not stating a biased opinion about the company; he was flippantly quoting a song. However, Judge Thomas Penfield Jackson, who had chosen Lessig as an adviser in the Microsoft antitrust suit, was forced to take such a bias seriously, removing Lessig as his adviser.

Rosen makes the point that we all wear different social "masks" in different settings and with different people. This may be true, even as it is socially taboo. From our estimations of Joan of Arc down to President Clinton, we celebrate and trust those with consistent (and consistently good) characters and criticize those who "waffle," or behave as if they are "spineless." Because we do not know how to define someone who changes with each new setting, we call her things like "mercurial," a "chameleon" or "two-faced." Consequently, the web induces fear in an adult not just for the things online crooks might do with Social Security numbers and other personal information but for what a user herself may do to complicate her own "character."

However, while the adult world may suffer from a preoccupation with consistency, teens have the peculiar advantage of being dismissed as inconsistent before they begin. Adolescence is the societally condoned window in which we may shuffle through identities with abandon. My older sister, for example, horrified my grade-school self by morphing from punk to hippie to Hare Krishna to surfer girl all in the next room. Yet by the time I graduated from high school, I'd been through a few of those identities myself.

Besides their assumed role as explorers, teens may actually face fewer risks of this kind than their adult counterparts. Statistics indicate that around 45 percent of large companies monitor their employees' e-mail accounts, while few junior highs and high schools in America offer all their students computers or e-mail accounts. Even if they did, chances are that monitoring kids' private mail in public schools would raise countless legal issues. DrRogue's school is just now hooking up six new computers for the whole school of fifty-six students and has no intention of providing them with e-mail accounts. Ironically, it is this deprivation that has made monitoring less of an issue for students than it is for employees at big companies.

For girls like DrRogue who create and maintain personal homepages and explore the web's massive serving of chatrooms and boards, safety is not the only motivation for anonymity. "Bodilessness," it seems, can be the means to a more intellectual objective: credibility. Says Peppermint, a 17-year-old writer,

I definitely feel "bodiless" when writing for my page. The Internet has provided me with an audience that will forever be my biggest fan and my worst enemy. Just as I can say what I want on my site without fear of rejection, others can e-mail me their honest thoughts without the face-to-face consequences of criticism. My audience, as well as my privacy, is crucial to my development as a writer.

In a world with, as DrRogue sees it, "a seemingly endless supply of people to talk to and sites to visit, from all around the world," immediate, physical privacy gives some young women confidence they don't have when they're attached to developing physiques, school cliques or societal and familial expectations.

Only bodiless, confess some girls, can they relate certain ideas and thoughts at all. Indeed, screen names have become so popular that AOL is currently offering the option of seven aliases to the users of Instant Messenger. DrRogue herself has five that she can remember. In addition, girls can, if only for the experience, switch genders online. (Studies suggest 40 percent have done it.) "The ONLY reason I go into a chatroom is to pretend to be someone else!" insists DrRogue. Other young women are altogether wary of chatrooms. "I don't believe in chatrooms," explains Peppermint. "They've become a forum for online popularity contests and cyber sex." Does the monitoring of chatrooms add to their problems?

DrRogue: I've never felt like I was being monitored, because a monitor would do a better job of kicking out the scum.

Bodiless, many girls use their homepages as a sounding board before taking ideas out into "the real world." Others put the space and the "audience" to work on facets of themselves or ideas they plan to confine permanently there.

Peppermint, whose "real life" friends know her as Caitlin, addresses the consequences that she believes her physicality can have upon the reception of her ideas. "I do not post pictures of myself," she states in an e-mail, "simply because I want to be perceived as 'more than just a pretty face.'" Peppermint is so adamant that her physical self be distinct from her online self that she does not give her website address to "real life" friends.

Peppermint: I don't allow friends that I have known in person to have my Website address. The Web is a sacred place for me to speak to a receptive and critical audience, while at the same time, I do not need to worry about making a first impression, [about] coming off how I'd like to, or [about] what will happen the next time I see them. Because I don't need to make impressions, I am who I am, and I am being honest.

Peppermint believes her femininity and "real life" identity can negatively affect the reception of her ideas. She observes: "As a woman, I think we will always be viewed as sexual objects. [I want a person] to be able to look past the outside and realize that there is an opinionated, intelligent, creative young woman behind the pretty face." DrRogue, who at 13 is really just entering adolescence, simply may not have experienced her femininity as a handicap yet. If all goes well, she never will.

To many, appearance is the essential problem of female development. With the world standing by to notice her changing body, a young woman begins to perceive her somewhat limited access to what psychologist Lyn Mikel Brown (and many others, of course) points to as the patriarchal framework of our culture. From here, a struggle to retain her childhood identity and value system ensues, followed typically by a loss of voice, the narrowing of desires and expectations and the capitulation to conventional notions of womanhood. Yes, the Dark Horrible Sucking Trail of the Lost Voice is so trodden it is cliché. We've all seen countless articles on the phenomenon, but that doesn't make it any less painful for the young women going through it.

Less publicized, though more interesting than the pervasiveness of the Sucking Trail, is what Lyn Brown and Carol Gilligan (author of groundbreaking studies of female adolescence) have identified as a period before adolescence when girls' "voices" are at their most powerful. Young women, most of whom I imagine to be variations on DrRogue, actively resist dominant cultural notions of femininity at the edge of puberty. Finding a means to connect, harness and preserve the loud, defiant voices may empower girls to defy cultural norms and, in the process, eclipse the resentment that Virginia Woolf so protested.

On the one hand, Peppermint's sensitivity to a societal bias we'd like to think has passed is tragic. On the other hand, unlike those of us who were teenagers even ten years ago, Peppermint and DrRogue can literally construct their own worlds, with their own standards, where the only thing that matters is their ideas. They can't live in those worlds forever, but maybe a few hours a day is long enough to change their lives.

BGAR2: how many hours do you spend online each day?
DrRogue: 1, usually.
BGAR2: really? That's nothing.
DrRogue: 2, really.
BGAR2: hmmm.
DrRogue: 3, if I'm bored.
BGAR2: still, I imagined more.
DrRogue: well, I'm prolly scaling down a lot.
BGAR2: why's that?
DrRogue: let's just say I've had to LIMIT my online time in the past.
BGAR2: ahh. los padres?
DrRogue: si.
BGAR2: how much time did you spend yesterday?
DrRogue: lemme check my log.
DrRogue: ok, I lied. I spent 4 hours online.

Intelligent Life, DrRogue's latest homepage, which makes vague note of a Susan somewhere in the meat of its smart, sometimes acerbic, steadfastly spelling-error-free content, is exactly what it sounds like: an SOS for brain activity in a spectrum overwrought with misspelled emotion. Intelligent Life, when just a month old, had already received nearly 400 visitors--and that was during the summer. Whether they're up to DrRogue's standards is another matter.

Intelligent Life: I'm not being snobbish or narrow-minded, the time has simply come to draw the line. I want to meet people (of any age) who are bright and exciting, funny, kind, and intelligent. I want to meet people who are clearly individuals, not stereotypical, bumbling, senseless teenagers with limited vocabularies who take extreme liberties with spelling.

In short, do not visit Intelligent Life if you are, and there's no easy way to say this, a "ditz." Ding.

DrRogue: Have you been to Narly Carly yet?

Narly Carly's Super Awesome Page, to be specific, is DrRogue's spoof site--she recommended I look at it for research. Pulling the purple page with the rotating star up onto the screen, I see another reactionary move by DrRogue: a parody of the many sunny, earnest, "overwrought" teen sites splattered across the web. In the usual autobiographical style of these things, the fictional Narly Carly describes herself and her life, albeit without any of the eloquence DrRogue saves for, well, DrRogue. "I am a junior at Willingford High!" screams Narly at her visitors, "Go Wolfs!!!"

Although to the naked eye Narly Carly's Super Awesome Page looks quite a lot like any other teen site, its status as a farce lies in its suspicious abuse of exclamation points, the word "like" and an overload of personal information, among other things. Narly gives away reels of intimate details--for the visitors who "get it," this is a reproach of lax security. There is, after all, no one currently policing the Internet to keep people from divulging too much about themselves. In this era, something like Narly Carly serves as a gentle warning--as gossip does in a small town--to keep people in line. The irony is, of course, that a real Narly Carly may not understand irony.

Narly Carly bears the treadmarks of an adolescent critical of hypocrisy in older girls, making Susan appear to be someone Gilligan might identify as a "resister." Before they give up any measure of voice and shift into idealized femininity, girls are louder than ever, embodying what Gilligan believes may be the political potential of an active adolescent underground. Whatever Susan's reasons for building an older "teenybopper's" site, they are her own. However, the underground political potential, along with Susan's strength and clarity of character, is palpable on Narly Carly. Indeed, a handful of the guestbook's visitors, whether male or female, passed the first test--they "got it." Said one visitor: "This page is so evil! I know whoever made it did this intentionally. No 'real' person acts this pathetic. And 'like' was WAY back in the 1980s. I know this is a joke and the person who made it is laughing their head off reading the guestbook."

When I argue that web diarists like her must be a bit self-conscious, DrRogue seethes: "People really put themselves out on these things!" Unlike my diaries, though, they also get visitors who comment and provide discourse and insight, making the creator feel less alienated and making the pages actually useful methods of growth. Not to knock the diary--I certainly got somewhere venting in my own. Anaïs Nin kept a journal to "free" herself of "personae." The web is, in some ways, a more evolved journal, even as it is so many other things. Studies have shown that students write better papers and learn foreign languages more fluently when they actually have something to communicate to another person.

BGAR2: so these sites are like journals.
DrRogue: exactly.
BGAR2: couldn't you print them out and store them in a closet or something and then delete them?
DrRogue: AH no!
DrRogue: that would defeat the purpose of the web!
BGAR2: what's the purpose?
DrRogue: interactivity, for one.
DrRogue: longevity of information, two. i can visit the Susan of a year ago. she's there in the same place, just as alive.
BGAR2: but how do you know when you're done?
DrRogue: I stop visiting it. I'm sick of it.

D.W. Winnicott defined a process of imaginative "saturation" in children's play in which the child plays with a certain toy or enacts an imaginative experience until all of the emotional ambivalence, fear, anxiety, etc., are diffused from that action or thing. DrRogues may be "playing" out their emotions to make offline "reality" less emotional. Selfishly, while she contributes to the textual wasteland of so many sites created and then abandoned, DrRogue hates to stumble upon such a "haunted" site herself.

DrRogue: It makes me feel really bad, manipulated almost, when I'm browsing a site, and then there's a date, and that date is like, March 13, 1996.
BGAR2: why, because it's old?
DrRogue: "does this person still exist?"
BGAR2: hmm
DrRogue: because I spent time getting to know the person
BGAR2: is it a waste if it's old?
DrRogue: it depends.
DrRogue: I like retail sites because they're constantly busy.
BGAR2: yeah, the idea of an updated site is good.
DrRogue: like someone's alive.

A good character's job is to "manipulate" her audience. Perhaps, then, a date is some sort of narrative flaw that pulls DrRogue out of the story. What should a date matter to her anyway, I wonder: She'll never meet the site's creator. Why does she care whether he or she is still "alive"? The presence of a date on a site is like an actor's mustache falling off--it brings reality back into focus.

DrRogue's sense that a retail site, for example, might have human qualities, or something resembling a heartbeat, implies her ability to suspend disbelief so that the mechanism--words on the web--dissolves away. Further, it suggests a narrative view of the Internet, a desire to read and live through other people's stories. Susan seeks to learn about the world, about people and about herself, insight she can glean from any good story, regardless of its medium. I express some weariness of the homepages, but DrRogue says reading about their creators' everyday doings is fascinating, "sort of like having someone's life for a minute."

Stories, like "playing," can be powerful agents of personal transformation. "The right stories can open our hearts and change who we are," says Janet Murray, a professor of a digital fiction course at MIT. Indeed, ultimately the best stories render their technologies transparent so that we experience only the power of the characters and the story itself.

DrRogue created Intelligent Life to find other people, to hear their stories, to "have someone's life for a minute." In exchange, she shares her own experiences and, in doing so, develops a bit more as a human being. Logging on has enabled DrRogue to get beyond her small town, her age and her financial situation and has allowed her through narrative to experience the world.

Somehow, maybe because she pummels me with Instant Messages whenever I log on, I have come to associate the web with DrRogue. It is her "room," you might say.

DrRogue: yo
BGAR2: yo
BGAR2: shouldn't you be at camp or something
DrRogue: I said ALL my friends were at camp. I'm not so fortunate.
BGAR2: oh. sucks. well, you have the web.
DrRogue: I have the web.

Writing has always allowed people to step outside their skin, to try on different identities, to see through other perspectives. Lyn Lifshin, who edited a collection of women's journals by professional writers and others, recalled that contributors' friends were often shocked at the people represented in the diaries. For Foucault, writing was about growth and escaping the confines of identity. No one understands this better than the growth-hungry DrRogue, who, thanks to technology, can go even further in her explorations. She can gain experience of the world from a tiny room in Vermont.

The term "cyberspace" was coined by William Gibson, the prolific science fiction novelist, to define the virtual landscape of a human being's consciousness. It is voyeurism, entertainment, education, communication, interaction and self-expression all at once. Above all, it is a human environment, an extension of, rather than an escape from, the "real world." As such, it poses "real world" risks as well as opportunities. For young women like DrRogue and Peppermint, it is the real stories, the sense of community and communication, that keep them coming back.

Peppermint: Receiving responsive email to something I've written is the most rewarding part of the experience. I've received in excess of fifty letters, especially from girls a few years younger than myself, saying that I've taught them that there is nothing wrong with being yourself. This is a lesson that I wish I had learned at their age, and to know that I have taught it to someone younger than me is an incredible feeling.
DrRogue: IGG. [I gotta go.] Time to do something productive today.
BGAR2: Go write your novel.

I forgot to mention that, since she's not going to camp, DrRogue is writing a novel. It's tentatively titled: Teen Girls: Not as Stupid as You Think.

Paris

Will Paris become the first large city in France (indeed, the first major city or national capital anywhere in the world) to elect an openly gay candidate as mayor--the Socialist Bertrand Delanoë? Beyond that question, highly symbolic for same-sexers everywhere, the two-stage Paris elections, which take place on March 11 and 18, will have far-reaching consequences. If the left succeeds in winning the City of Light for the first time ever, that could presage a national victory in France's presidential and legislative elections, to be held in May 2002.

The right has maintained hegemony in Paris since the first municipal elections in 1977, when Jacques Chirac--then president of the neo-Gaullist RPR--led the right to victory. He remained mayor of Paris until his election as President in 1995, when he dictated the selection of his successor--the current mayor, Jean Tiberi, Chirac's first deputy mayor for nearly two decades.

But the unbroken succession in this city, dominated for the past twenty-four years by la droite chiraquienne (the Chirac conservative majority), has suddenly and dramatically deteriorated following an avalanche of financial and electoral scandals. Mayor Tiberi is at the center of accusations concerning, among other things, the secret financing of political parties through a highly organized system of corrupt rakeoffs and kickbacks on contracts, filling the electoral rolls with phantom voters, illegally allocating low-rent apartments in city-owned housing (reserved for the economically disadvantaged) to political cronies and the families of elected officials and giving hundreds of no-show municipal jobs to full-time workers for the RPR and its campaigns. These scandals now threaten to undo Chirac himself, a danger underscored by the recent arrest of Chirac's former municipal chief of staff.

Aside from the scandals, the right's management of the city has also been sharply criticized. Paris is in danger of becoming a city-museum: Rents have skyrocketed, and the lack of subsidized housing for low- and middle-income families has forced many of them to move to less costly suburban developments with onerous commutes. While tourism continues to increase, Paris has serious air pollution and has not modernized or improved its public transportation or its noncommercial recreation facilities (especially for the young, hardest hit by unemployment).

All this means the left has an excellent chance of winning the municipal elections. Its leading candidate, Dominique Strauss-Kahn (number two in Prime Minister Lionel Jospin's government and the man in charge of the French economy), was forced to withdraw and lost his ministry after being tainted by yet another money scandal. After much internecine maneuvering, Jospin's Socialist Party finally settled on Delanoë. A little-known senator at the time of his designation, with a reputation as an unimaginative apparatchik and Jospin loyalist, Delanoë chose to make his homosexuality public during a 1998 television appearance. His matter-of-fact manner in coming out made few waves at the time, but since then he has enjoyed the media's favor, and--thanks to the Ubuesque situation in which Paris's right finds itself--his popularity has steadily increased in the opinion polls. And he has secured the endorsement of all the smaller parties in Jospin's "plural left" coalition, including the Communists (except for the Greens, who are nonetheless expected to support Delanoë in the second round of municipal voting). Within the city's sizable gay population, Delanoë has broad support, all the more so because, in contrast to his right-wing opponents, he has been a supporter of the pact of civil solidarity (PACS) for unmarried couples. Passed by the left-controlled National Assembly in 1999, the PACS recognizes and gives a large number of social and fiscal rights to domestic partnerships, whether homo- or heterosexual.

The conservative RPR, recognizing that Tiberi was politically bankrupt, dropped him as its mayoral candidate in favor of Philippe Séguin, a former party president and minister under Chirac. But even though Séguin is an RPR heavyweight with a reputation for intellectual honesty, his rigidity and independence (in the past he's feuded with Chirac) render him unsuited for the kinds of concessions necessary to unify the fractious Paris right. Meanwhile Mayor Tiberi, despite being disavowed by the RPR, has maintained his candidacy for re-election as an independent and is running a slate of candidates for City Council. In addition, the two extreme-right parties are running their own slates.

Delanoë's ambitious campaign proposals include making Paris "a model of democracy" (through the creation of neighborhood and youth councils and an official forum for civic organizations, the institution of referendums by petition and, above all, a guarantee of "real financial transparency" in government); building 5,000 new low-income and student housing units a year; ending the policy of "autos first" by building a new tramway to encircle Paris, creating 300 kilometers of new bus lanes, more car-free zones for pedestrians, cyclists and roller skaters, and improved river transport on the Seine; providing more daycare centers and facilities for the aged and handicapped; and doubling the city's cultural budget. He also promises halfway houses for gay kids rejected by their parents, giving gay organizations equal access to the subsidies the city already provides to civic groups, creating a new gay archive/research center and waging aggressive city-sponsored campaigns against antigay discrimination and AIDS.

However, even if Delanoë wins, as now seems probable, giving Paris back its former demographic mixture and improving its quality of life--especially for those of modest means--won't be easy, given a city budget that has been fiscally unsound for many years. Moreover, Delanoë's personal shortcomings continue to raise doubts, including on the left: Can a party workhorse with no management experience, nominated almost by default for his mayoral post, innovatively lead the first city of France?

Still, changing the political control of Paris--and ending what Le Nouvel Observateur has dubbed "the corrupt Chirac-Tiberi system"--is a top priority, one that could lead to the right's (and Chirac's) defeat next year. And if this permits one of the world's greatest cities to elect a mayor who is openly gay, why not gamble on his success? Frédéric Martel

If a critic's clout can be measured by the ability to make an artist's name, the most important art critic in America today is clearly Rudolph Giuliani. Just over a year ago he excoriated the Brooklyn Museum of Art for including in its "Sensation" show Chris Ofili's Holy Virgin Mary--the elephant-dung-decorated painting of an African BVM, which the mayor found "anti-Catholic," blasphemous and disgusting--and turned Ofili himself into a sensation overnight: One collector, I heard, complained that the media attention had driven Ofili's prices so high he couldn't afford him anymore. If I were Jake & Dinos Chapman, represented by a perverse sculpture of deformed and weirdly sexualized children, I would have been seriously peeved, and if I had been Richard Patterson, whose Blue Minotaur, a profound meditation on postmodernity and the heroic tradition, got no attention at all, I would have wept.

You'd think the mayor would have learned to stay his theocritical thunderbolts, but once again he has gone after the Brooklyn Museum for including an "anti-Catholic" work--Renee Cox's Yo Mama's Last Supper--in the new show of contemporary black photographers, "Committed to the Image." He's even suggested that what New York needs is a "decency commission," which got big laughs all around, since the mayor, a married man, is openly carrying on with his mistress, upon whom he has bestowed police protection worth some $200,000 annually at taxpayers' expense. As the whole world now knows, Yo Mama is a five-panel picture in which Cox appears naked, as Jesus, surrounded by male disciples--ten black, one white--at the Last Supper. As an artwork it's negligible, glossily produced but awkwardly composed and, to my eye, rather silly. Cox is thin and beautiful; the men, in robes and caftans, are handsome and buff--apparently the first Christians spent a lot of time in the gym and at the hair salon, getting elaborate dreadlocked coiffures. Unlike the figures in Leonardo's Last Supper, which are highly individualized and dramatically connected, the figures here are generic and stiff. My eye kept going to the limited food on offer: bowls of wax-looking fruit (did they have bananas in Old Jerusalem?), rolls, pita bread. Was the Last Supper a diet Seder?

If you want to see visually haunting work at "Committed to the Image," there's Gordon Parks, Albert Chong, Imari, Nathaniel Burkins and many others. LeRoy W. Henderson's black ballet student, dressed in white and standing in front of a damaged classical frieze, interrogates the Western tradition much more deeply than Yo Mama does. Mfon's self-portraits of her mastectomized torso, a meditation on beauty, heroism and tragedy expressed through the female body, lay bare the high-fashion hokiness of Cox's costume drama. For fan and foe alike, the interest of Yo Mama appears to be political. Cox describes her art in ideological terms ("my images demand enlightenment through an equitable realignment of our race and gender politics"), and she has been quite pungent in defending it. As with The Holy Virgin Mary, the mayor hasn't actually seen it, nor had the numerous people who sent me frothing e-mails after I defended government support for the arts on The O'Reilly Factor.

Even the New York Observer's famously conservative art critic, Hilton Kramer, who usually delights in withering descriptions of pictures he hates, apparently felt that depicting Christ as a naked black woman was so obviously, outrageously anti-Catholic he need say no more about the photo before embarking on his usual rampage. It would be interesting to know where the offense lies: Is it that Cox as Christ is naked, black or female? All three? Two out of three? If one thinks of Catholics, the people, there's nothing bigoted about any of this. (Like Ofili, Cox is Catholic--as are most perpetrators of "anti-Catholic" works.) There is no ethnic stereotyping of the sort on view, for instance, on St. Patrick's Day, when the proverbial drunkenness of the Irish is the butt of endless rude humor, especially from the Irish themselves. While we're on the subject of ethnic stereotyping, it's worth noting that in a great deal of Christian art, Jesus and the disciples are portrayed as Northern Europeans, while Judas is given the hooked nose and scraggly features of a cartoon Jew.

But if what is meant by anti-Catholic is anti-Catholic Church, why can't an artist protest its doctrines and policies? The Church is not a monastery in a wilderness, it's a powerful earthly institution that uses all the tools of modern politics to make social policy conform to its theology--and not just for Catholics, for everyone. It has to expect to take its knocks in the public arena. A church that has a 2,000-year tradition of disdain for women's bodies--documented most recently by Garry Wills (a Catholic) in his splendid polemic Papal Sin--and that still bars women from the priesthood because Jesus was a man can't really be surprised if a twenty-first-century woman wonders what would be different if Jesus had been female, and flaunts that female body. And a church with a long history of racism--no worse than other mainstream American religions but certainly no better--can't expect the topic to be banned from discussion forever.

At the Brooklyn Museum, Yo Mama's Last Supper is in a separate room with its own security guard. On Sunday afternoon, it attracted blacks, whites, Asians, parents with small children, older women in groups, dating couples, students taking notes--le tout Brooklyn, which is turning out in large numbers for the show. I asked one black woman, who described herself as a Christian, what the picture meant to her. "It shows Life as a woman," she said. "It's beautiful." Her friend, who said he was a Muslim, liked the picture too.

If only I could get the Mayor to review my book!


* * *

Show George W. Bush you support RU-486. Make a donation in W.'s name to the Concord Feminist Health Center (38 South Main Street, Concord, NH 03301) and help it buy the ultrasound machine this method requires. The center will send the President a card to let him know you were thinking of him when you wrote your check.

The network honchos called by Louisiana Representative Billy Tauzin and the House Energy and Commerce Committee to testify on the election night debacle were a decidedly ungrateful bunch. True, they were forced to sit through a video of their billion-dollar babies making idiots of themselves. (Watching Dan Rather offering "a big tip and a hip, hip, hurrah and a great big Texas howdy to the new President of the United States," and instructing viewers to "Sip it. Savor it. Cup it. Photostat it. Underline it in red. Press it in a book. Put it in an album. Hang it on the wall," more than once ought to be considered cruel and unusual by anyone's standards.) And how rare it must be that anyone, much less mere members of Congress, would dare keep these boys cooling their heels for a full five hours before finally bringing them forward to demand that they swear to tell the truth, the whole truth and nothing but the truth--in public, no less. But really, all the "concern" and "uneasiness" voiced by the execs about government meddling in the news was a bit much. There was never any danger to the networks' independence in Tauzin's hearings; at least none that originated from Congress, rather than their own parent companies.

Tauzin, a Democrat turned Republican, originally professed to possess an "analysis" that indicated "in almost every case, [the networks] favored early calls for Al Gore over George Bush." Absent any evidence, however, he withdrew the charge of intentional bias and retreated behind a mysterious theory of "flawed data models" and "biased statistical results" that happened to favor Democrats. He offered no evidence this time either, but almost all reporters felt duty-bound to repeat his nonsensical accusations. Hence precious little attention was focused on more concrete election-coverage questions, most notably Fox's decision to rely on the analysis of John "I can't be honest about [my cousin George W. Bush's] campaign.... He's family, and I'm for him" Ellis. And needless to say, there was no time left for an examination of the corrupting effect of the networks' interlocking structure of corporate ownership.

Had Tauzin and company really tried to censor or intimidate the networks, that would have been interesting, but it is damn near impossible to imagine. As a comprehensive report on media lobbying by the Center for Public Integrity demonstrates, when it comes to mutual backscratching, the primates in the National Zoo have nothing over the networks and Congress.

Take Tauzin, for instance. According to the CPI report--which might as well have been classified "top secret" for all the attention lavished on it by the media it exposes--the cagey Cajun received more PAC money from media companies than anyone else in the House, including more than $150,000 from entertainment and telecommunications companies for his 2000 campaign, in which he had no credible opponent. Moreover, no member of Congress has traveled more frequently on the media industry's dime. Between 1997 and 2000, Tauzin and his staff took a total of forty-two trips--one out of eight industry-sponsored junkets taken by members of Congress during that period. In December 1999 Tauzin and his wife enjoyed a six-day, $18,910 trip to industry "meetings" in Paris. Representative John Sweeney managed to make the same trip for a mere $7,445. How can Tauzin act as an honest broker for the networks filling his pockets? Easy: He simply does not believe in the concept of conflict of interest. "I have no choice but to do effective oversight," he says by way of explanation. Tauzin's view is hardly unique. His successor as chairman of the House Telecommunications Subcommittee, Fred Upton, has a portfolio worth millions in those very same companies.

Again, we are seeing nothing unusual here, except perhaps gumption. In 1999 alone, according to the CPI, the fifty largest media companies and four of their trade associations coughed up more than $30 million to lobby Congress, an increase of 26.4 percent in three years. Since 1993, they have given more than $75 million in direct campaign contributions, according to the Center for Responsive Politics. And the numbers tell just a small part of the story. These fellas are not just selling toasters, after all. As former FCC chairman Reed Hundt has explained, more important than the industry's money is the perception of its "near-ubiquitous, pervasive power to completely alter the beliefs of every American." Politicians fear that if they displease these companies, they will simply "disappear" from view.

And what do the media want in exchange for this largesse? They want to be left alone so they can make themselves and their stockholders rich, regardless of their impact on American democracy. To take just one example, according to data collected by Competitive Media Reporting, politicians and special interests spent an estimated $600 million for paid political ads in the last election cycle, which makes the $11 million or so the National Association of Broadcasters and five media outlets cumulatively spent between 1996 and 1998 to defeat campaign finance reform look like a prudent investment. Note, by the way, that John McCain, the heroic white knight of campaign finance reform, who raises more money from the media companies than even Tauzin, was crucial to the media companies' successful effort to kill the FCC's plan to force a lowering of the cost of political commercials, the primary culprit driving the vicious election/money cycle.

With Michael Powell as George Bush's new appointee to head the FCC, the networks might not even have to bother lobbying Congress anymore. Powell signaled his own expansive definition of conflict of interest when he refused to recuse himself from the vote approving the merger of AOL and Time Warner, despite the fact that his father, Colin Powell, stood to make millions from the stock he received as a company director. (I don't suppose he opposes the repeal of the estate tax, either.)

"We don't look to the government to correct the press. We look to the people," explained ABC News president David Westin to Tauzin's committee. "If we fail, the audience will judge us and move somewhere else." I'm thinking France.

The first moments of a recent documentary about Students for a Democratic Society (SDS), Rebels With a Cause, recall one of the signal images of the 1960s civil rights struggle: police training torrents of water from fire hoses on demonstrators in Birmingham, Alabama. According to its makers, the student New Left and the antiwar movement derived their principal inspiration from the struggle for black freedom.

In this phase of the movement blacks and their allies sought three rights: integrated public schools; desegregation of public accommodations such as trains and buses, restrooms and water fountains, restaurants and, in much of the South, the right to walk down a street unmolested; and perhaps most important, voting rights. Nearly forty years later the Birmingham confrontation reminds us not only of the violence of Southern resistance to these elementary components of black freedom but also how clear-cut the issues were. The simple justice embodied in these demands had lurked on the margins of political life since the 1870s betrayal of black Reconstruction by politicians and their masters, Northern industrialists. Not that proponents of black freedom were quiescent in the interim. But despite some victories and defeats, mainly in the fight against lynching and legal frame-ups, these demands remained controversial and were largely sidelined. In 1940 Jim Crow was alive and well in the South and many other regions of America.

World War II and its aftermath changed all that. Under threat of a March on Washington by A. Philip Randolph of the Sleeping Car Porters union and other black leaders, in 1941 President Roosevelt issued an executive order banning discrimination in military-industry hiring. But when millions of black veterans returned from the war and found that America had returned to business as usual--even as the United States was embroiled in a cold war, claiming to be at the forefront of freedom and democracy--pressure mounted for a massive assault on discrimination and segregation. Shrewdly looking forward to an uphill re-election battle, President Truman ordered the desegregation of the armed forces, and the turbulent 1948 Democratic convention passed the strongest civil rights plank of any party since the Radical Reconstruction laws of 1866. Consequently, the party suffered the first of what became a long line of defecting Southern political leaders when Strom Thurmond bolted and ran as the Dixiecrat Party candidate for President. In the early 1950s the NAACP mounted a series of legal cases against school segregation that culminated in the 1954 Supreme Court decision Brown v. Board of Education, sustaining the plaintiffs' claim that the historic Court doctrine of "separate but equal" was untenable. The Court agreed that school integration was the only way to guarantee equal education to black children. The next year an NAACP activist, Rosa Parks, refused to move to the back of a Montgomery, Alabama, bus--a disobedience that sparked a monumental boycott that finally ended in victory for the black community. In 1962, flanked by federal troops, James Meredith entered the University of Mississippi, and, under similar circumstances, a black woman named Autherine Lucy broke the color bar at the University of Alabama.

As important as these breakthroughs were, they were regarded by some as only the first stage of what was considered to be the most important phase in the struggle-- achieving black voting rights as a prelude to overturning white-supremacist economic and political domination of the South. Many on both sides of the civil rights divide were convinced that the transformation of the South by ending black exclusion was the key to changing US politics. The struggle for voting rights proved as bloody as it was controversial, for it threatened to reduce the Democratic Party to a permanent minority. In summer 1964, in the midst of a major effort by civil rights organizations to enroll thousands of new black voters, three field-workers were murdered and many others were beaten, jailed and in some cases forced to run for their lives. Fearing further losses in the less-than-solid South, the Kennedy Administration had to be dragged kicking, if not screaming, to protect field-workers in various civil rights groups. Indeed, it can be argued that beginning with the 1964 Republican presidential candidate Barry Goldwater, and perfected to an art form by Richard Nixon, the right's infamous Southern Strategy was the key to Republican/conservative domination of national politics in subsequent decades. But driven by the exigencies of the Vietnam War as well as the wager that the Democrats could successfully cut their losses by winning the solid backing of millions of black and Latino voters, Lyndon Johnson's Administration rapidly pushed the Civil Rights and Voting Rights acts through Congress. By 1965 a century of legally sanctioned black subordination apparently came to an end. The Democrats won black loyalty, which to this day remains the irreducible condition of their ability to contend for national political power.

II.

Today the predominant commentary on the state of race relations in the United States no longer focuses on discrimination but on the consequences of the economic and cultural chasm that separates black and white: white flight from the cities; the formation of a black middle class, which in pursuit of a better life has tended to produce its own segregated suburban communities; and the persistence of black poverty. For after more than three decades of "rights," the economic and cultural reality is that the separation between black and white has resurged with a vengeance. Housing segregation is perhaps more profound than in the pre-civil rights era, and the tiering of the occupational structure still condemns a large percentage of blacks and Latinos to the bottom layer. What's more, the combination of the two has widened the educational gap that pro-integration advocates had hoped would by now be consigned to historical memory. In the wake of achieving voting rights and passing antidiscrimination laws, some now worry more about whether blacks will maintain access to the elite circles of American society, especially in education and the economy. Others have discovered that, despite their confidence that legal rights are secure, the stigma of race remains the unmeltable condition of the black social and economic situation.

But even voting rights are in jeopardy. What are we to make of the spectacle of mob intimidation of election officials and the brazen theft of thousands of black Florida votes during the 2000 presidential election--so egregious it led the NAACP to file suit? Or the ideological and political mobilization of a conservative Supreme Court majority to halt the ballot recount of George W. Bush's razor-thin lead? Or the refusal of a single member of the Senate, forty-eight of whom are white Democrats, to join mainly black House members in requesting an investigation of the Florida voting events? Vincent Bugliosi has persuasively shown that Gore's lawyer did not use the best arguments at the Court [see "None Dare Call It Treason," February 5]. And then there's the Congressional battle over George W. Bush's nomination of John Ashcroft as Attorney General, during which Ted Kennedy concluded that he could not muster the votes to sustain a filibuster and had to be content with joining forty-one of his colleagues in a largely symbolic protest vote against confirmation. Ashcroft's nomination was perhaps the Bush presidency's most blatant reminder that the selection of Condoleezza Rice and Colin Powell for leading foreign policy positions should not be understood as having any relevance to the Administration's stance on domestic racial politics. Contrary to progressives' expectation that the Democratic Party would constitute a genuine opposition to what has become a fairly strident right-wing government, as the events since November 7 have amply demonstrated, the Democrats seem to know their place.

These events--calumnies, really--cast suspicion on the easy assumption that the civil rights struggle, even in its legal aspect, largely ended in 1965. The deliberate denial of black suffrage in Florida was a statement by the Republican minority that it would not countenance the Gore campaign's bold maneuver, in tandem with the AFL-CIO and the NAACP, to register and turn out tens of thousands of new black and non-Cuban Latino voters. As if to acknowledge his indiscretion, after it became clear that he "wuz robbed," Gore ordered his supporters not to take to the streets; instead, he contented himself with a weak and ultimately unsuccessful series of legal moves to undo the theft. Thus even if their hopes for victory always depended on blacks showing up at the polls in the battleground states, including Florida, Gore and the Democratic Party proved unwilling to fight fire with fire. In the end the right believes it has a royal right to political power because it is grounded in the power of Big Capital and the legacy of racism; the Democrats believe, in their hearts, that they are somehow illegitimate because of their dependence on blacks and organized labor.

III.

Yet inspired by the notion that the United States is a nation of laws, some writers have interpreted the 1960s judicial and legislative gains to mean that the basis has been laid for full equality of opportunity for blacks, Latinos and other oppressed groups. (I speak here of those like Ward Connerly, Glenn Loury and Shelby Steele.) They cite affirmative action as evidence that blacks may be able to attain places in the commanding heights of corporate office, if not power. But they forget that Nixon utilized affirmative action to derail efforts, inspired by the 1960s civil rights victories, to dramatically expand funding for education and other public goods. In the context of Nixon's abrogation of the Bretton Woods agreement, which destabilized world currencies, and the restructuring of the economy through the flight of manufacturing to Third World countries and capital flight from those same countries, affirmative action was the fig leaf covering the downsizing of the welfare state. As the political winds blew rightward, welfare "reform" became a mantra for Nixon's successors, from Jimmy Carter and Ronald Reagan through Bill Clinton. Needless to say, providing spaces for black students in a handful of elite colleges and universities was not, in itself, a bad idea. That affirmative action accompanied the Reagan tax cuts, ballooning military spending, the gradual end of an income floor for the unemployed and severe spending cuts in housing, public health and education demonstrated its purpose: to expand the black professional and managerial class in the wake of deindustrialization--which left millions of industrial workers, many of them black and Latino, stranded--and the hollowing out of the welfare state, which in contrast to the 1960s gains widened the gap between rich and poor.

While only the most naïve believed that racism would in time disappear, many have misread William Julius Wilson's 1978 book The Declining Significance of Race to argue that black inequality is not a sign of the persistence of institutional or structural racism but merely a remnant of the past. On the contrary, Wilson claims that the persistence of black poverty may not be a sign of discrimination in the old sense but an indication of the vulnerability of blacks to structural changes in the economy. Race still frames the fate of many blacks, but there is also a crucial class dimension to their social situation: Although blacks have acquired comprehensive legal rights, the deindustrialization of many large cities, together with black as well as white middle-class flight, has left working-class blacks in the lurch. Once, many had well-paid union jobs in steel mills, auto plants and other production industries. But since the mid-1970s, most urban and rural areas where they live have become bereft of good jobs and local services. There are not enough jobs to go around, and those that can be found are McJobs--service employment at or near the minimum wage. Cities also lack the tax base to provide citizens with decent education, recreation, housing and health facilities, and basic environmental standards like clean air and water. Lacking skills, many blacks are stuck in segregated ghettos without decent jobs, viable schools, hope or options.

To make things worse, the neoliberal policies of both Democratic and Republican governments have drastically slowed public-sector job growth and have eliminated one of the main sources of economic stability for blacks and Latinos. In fact, during his campaign Al Gore boasted that he was an architect of the Clinton Administration's program of streamlining the federal government through layoffs and attrition, which was undertaken to facilitate "paying down the debt." The private sector was expected to pick up the slack, and it did: In many cases laid-off public employees joined former industrial workers in the retail trades and in low-paid construction and nonunion factory jobs.

But the problems associated with race/class fail to detain some writers--e.g., John McWhorter and Stephen Carter. Having stipulated black economic progress without examining the claim in any depth, these writers are mainly concerned with how to ensure that blacks obtain places in the elite intellectual and managerial professions. As if to stress the urgency of this problem, they adduce evidence to show that despite the broad application of affirmative action, blacks have fallen back in educational achievement. For example, in Losing the Race: Self-Sabotage in Black America, his book about why black students still lag behind whites in educational attainment, the linguistics professor McWhorter celebrates the civil rights movement's accomplishments while discounting the salience of racism. McWhorter allows that although some police brutality and discriminatory practices such as racial profiling still exist, these regrettable throwbacks are being eradicated. Accepting the federal government's definition of poverty (now $17,650 a year or less for a household of four), McWhorter minimizes these problems by citing statistics showing that under 25 percent of America's black population lives in impoverished conditions, a decline from 55 percent in 1960. Of course, these federal standards have been severely criticized by many economists and social activists for substantially understating the amount that people actually need to live on in some of America's major cities.

While $17,650 may constitute a realistic minimum standard in some rural areas and small towns--and even then there is considerable dispute about this figure--anyone living in New York, Chicago, Los Angeles, Philadelphia, Detroit, Oakland, Washington or Atlanta--cities with large black, Latino and Asian populations--knows that this income is far below what people need to live on. The United Way's regional survey of living standards found that households of three in New York City require at least $44,208 to meet minimum decent standards. While the amounts are less for most other cities, they do not run below $25,000, and most of the time they are higher. In many of these cities two-bedroom apartments without utilities cannot be rented for less than $800 a month (plus at least $100 for telephone, gas and electricity)--more than half the take-home pay of a household earning $25,000. In New York City the rent is closer to $1,200, even in the far reaches of the Bronx and Brooklyn. But perhaps half of black households earn less than the rock-bottom comfort level in most areas today.
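The rent arithmetic is easy to verify; here is a quick sketch, assuming for illustration only that take-home pay runs to roughly 80 percent of gross income (the 80 percent figure is mine, not the article's):

```python
# Hedged check of the claim that $800 rent plus $100 utilities exceeds
# half the take-home pay of a household grossing $25,000 a year.
gross_income = 25_000
take_home = gross_income * 0.80          # assumption: ~80% of gross after taxes
monthly_housing = 800 + 100              # rent plus utilities, per the text
annual_housing = monthly_housing * 12    # $10,800 a year
print(annual_housing > take_home / 2)    # True: more than half of take-home pay
```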

None of this bothers McWhorter and other black centrists and conservatives. They know that when measured by grades and standardized tests, educational attainment among children of poor black families may fall short because of poverty, insecurity or broken homes. Why, they ask, do black students in the middle class perform consistently worse by every measure than comparable white students, even when, owing mainly to affirmative action, they gain admission to elite colleges and universities? Relying heavily on anecdotes drawn from his own teaching experience at the University of California, Berkeley, McWhorter advances the thesis that, across the class system, blacks are afflicted by three cultural barriers to higher educational achievement that are of their own making: victimology, according to which all blacks suffer from racism that stands between them and success; separatism, a congenital suspicion of whites and of white norms, including high educational achievement; and, perhaps most salient, endemic anti-intellectualism among blacks, which regards those who study hard and care about learning as "nerds" to be held in contempt. To be "cool" is to slide by rather than do well. The point of higher education is to obtain a credential that qualifies one for a good job, not to take academic learning seriously.

McWhorter is deeply embarrassed by the studied indifference of many of his black students (almost none of whom are economically disadvantaged) to intellectual pursuits and is equally disturbed by their low grades and inferior test scores. In the end, he attributes much of the problem to affirmative action--a "necessary step" but one that he believes ultimately degrades blacks in the eyes of white society. McWhorter admits that he is troubled that he got tenure in four and a half years instead of the usual seven, and that he was privileged in part because he is black. Like another "affirmative action baby," Stephen Carter, he is grateful for the boost but believes the door should be closed behind him because the policy has morally objectionable consequences, especially cynicism, examples of which litter the pages of his book. Affirmative action "inherently divests blacks and Latinos of the unalloyed sense of personal, individual responsibility for their accomplishments.... The fact that they tend not to be aware of this follows naturally from the fact that affirmative action bars it from their lives: You don't miss what you never had." Instead, he wants blacks "to spread their wings and compete" with others on an equal basis. He condemns the thesis of Richard Herrnstein and Charles Murray (in The Bell Curve) that affirmative action should be eliminated because blacks "are too dumb," yet at the same time he urges his readers to "combat anti-intellectualism" and victimology as cultural norms.

Before discussing what I regard as the serious flaws in McWhorter's thesis, it is important to note that it is by no means unique; a growing number of black as well as white pro-civil rights intellectuals embrace it. Having asserted that the legislative and court victories effectively buried the "external" barriers to black advancement, these critics seek answers to the persistence of racial educational disparity within the black community itself--particularly the "culture" of victimization, a culture that, however anachronistic from an objective standpoint, remains a powerful force. Unlike many black conservatives, McWhorter professes admiration for the icons of the civil rights struggle, especially Martin Luther King Jr. and Adam Clayton Powell Jr. But he is not alone in his uncritical assessment of the past half-century as an era of black "progress." As a result, he discounts the claim that racism remains a structural barrier to blacks' further advances. Nor, in his view, is there any justification for the politics and culture of separatism; for McWhorter, blacks are shooting themselves in the foot because these values prevent them from overcoming the psychological and cultural legacy of slavery, the root cause of the anti-intellectualism that prevents black students from performing well in an academic environment. This culture, he argues, is reproduced as well in the black family when parents refuse to take their children's low or failing grades as an occasion for scolding or punishment and, in the extreme, in peer pressure against the few kids who, as early as elementary school, are good scholars. Academically bright black kids are ridiculed and charged with not being loyal to "the race."

We have seen this pattern played out in a somewhat different mode among working-class white high school students. In Learning to Labor, his study of male working-class students in an English comprehensive school--the British term that corresponds to our public schools--Paul Willis describes how the "lads" rebel against the curriculum and other forms of school authority and thereby prepare themselves for "working-class jobs" in a nearby car factory. We can see in this and many other school experiences the same phenomena that McWhorter ascribes to racial culture. To study and achieve high enough grades to obtain a place in the university is to betray the class and its traditions. One risks isolation from one's comrades for daring to be good. But it is not merely anti-intellectualism that produces such behavior; it is a specific form of solidarity that violates the prevailing norm that the key to economic success for the individual working-class kid is educational attainment.

But McWhorter, a certified good student who, by his own account, suffered many of these indignities at the hands of classmates, cannot grasp this. Since he sees no external barriers, anyone who refuses to succeed when he or she has been afforded the opportunity without really trying must be the prisoner of a retrograde culture. And since he defines anti-intellectualism in terms of comparative school performances in relation to whites--and there is no exploration of the concept of intellectual endeavor as such--he must avoid entering the territory of rampant American anti-intellectualism. It may come as a shock to some, but it can be plausibly claimed that we--whites, blacks and everyone else--live in an anti-intellectual culture. Those who entertain ideas that have no apparent practical utility, enjoy reading books that no course assignment requires, play or listen to classical music or frequent art museums are sometimes labeled "Mr. or Miss deadbrains." Moreover, as many have observed, the public intellectual--a person of ideas whose task is to hold a mirror to society and criticize it--is an endangered species in this country.

McWhorter focuses absolutely no attention on the loss of intellectual life in this sense. His idea of intellectualism is entirely framed by conventional technical intelligence and by a sense of shame that blacks are coded as intellectually inferior. There is little justification of the life of the mind in Losing the Race except as a valuable career tool. So McWhorter cares about educational attainment as a matter of race pride and climbing the ladder. He cannot imagine that working-class and middle-class whites are routinely discouraged from spending their time in "useless" intellectual pursuits or that the disparity between black, Latino, Asian and white school performances can be attributed to anything other than internal cultural deficits.

IV.

Scott Malcomson and Thomas Holt have cast different lights on race disparity in the post-civil rights era. Departing, but only implicitly, from Tocqueville's delineation of the three races of American society--Indian, black and white--Malcomson, a freelance writer and journalist, offers a long, sometimes rambling history, fascinating autobiography and social analysis of the career of race in America. The main historical theme is that racial separatism is interwoven with virtually every aspect of American life, from the sixteenth-century European explorations and conquests that led to the brutal massacres of Native Americans, on through slavery and postemancipation race relations. Malcomson's story is nothing less than a chronicle of the defeat of the ideal of one race and one nation. Black, Native American and white remain, throughout the centuries, races apart. Neither civil war, social movements nor legislative change has succeeded in overcoming the fundamental pattern of white domination and the racializing of others. Nor has territorial and economic expansion allayed separatism and differential, racially burdened access to economic and political power.

Malcomson, the son of a white Protestant minister and civil rights activist, caps One Drop of Blood with a hundred-page memoir of his childhood and youth in 1960s and '70s Oakland. In these pages, the most original part of the book, Malcomson offers an affecting account of his own experience with race but also renders the history of three major twentieth-century Oakland celebrities--Jack London, William Knowland and Bobby Seale. London was, of course, one of America's leading writers of the first two decades of the past century. He was famous not only for his adventure stories, which captured the imagination of millions of young adults and their parents as well, but also for his radical politics. He was a founder of the Intercollegiate Socialist Society, the student branch of a once-vibrant Socialist Party, which regularly won local offices in California until World War I and spawned the explosive political career of another writer, Upton Sinclair. (In 1934 Sinclair even became the Democratic candidate for governor.) Less well known was London's flaming racism, a sentiment shared by the archconservative Oakland Tribune publisher and US Senator William Knowland.

Deeply influenced by his parents' anti-racist politics, Malcomson was enthralled by the movement and especially by the Black Panthers, whose ubiquitous presence in Oakland as a political and social force was unusual--they lacked the same level of visibility elsewhere, even in their period of fame. But the pathos of his involvement is that his main reference group was neither black nor white: Their separation was simply too painful for a young idealist to bear. Instead, he hung out with a group of Asian students who nevertheless emulated black cultural mores. They did well in school but, by this account, nothing much except social life went on there. To satisfy his intellectual appetites, Malcomson sought refuge in extracurricular reading, an experience I shared in my own high school days in the largely Jewish and Italian East Bronx of the 1940s and 1950s.

The point of the memoir is to provide a contemporary illustration of the red thread that runs through his book: The promise of a century and a half of struggle for black freedom, and especially of the end of racial separation and of racism, remains unfulfilled. Malcomson takes Oakland as a paradigmatic instance of a majority black population gaining political office but little power because the city's white economic elites simply refused to play, until civic disruption forced them (especially Knowland) to acknowledge that their power was threatened by the militancy of new political forces within the black community. With the waning of black power, Oakland, like many other cities, reverted to its ghettos, punctuated in the 1980s and 1990s by a wave of gentrification that threatened the fragile black neighborhoods and, in the wake of high rents, forced many to leave for cheaper quarters. When Malcomson returns home in the late 1990s after a few decades in New York he finds that the experiment of his father and a local black preacher in interracial unity in behalf of community renewal has collapsed, leaving the clergyman with little hope but a firm conviction that since whites were unreliable allies, blacks had no alternative but to address the devastation of black neighborhoods on their own.

Thomas Holt is a black University of Chicago social and cultural historian whose major work, The Problem of Freedom, is a brilliant, multifaceted account of Jamaican race, labor and politics in the nineteenth and early twentieth centuries. The question Holt asks in his latest book-length essay, The Problem of Race in the Twenty-First Century, is whether W.E.B. Du Bois's comment that the color line is the problem of the twentieth century remains the main question for our century. Holt addresses the issue from a global perspective along two axes: he places race in the context of both the national and global economies, and he adopts a "global" theoretical framework of analysis that situates race historically in terms of the transformation of production regimes from early to late capitalism. In bold but sharp strokes the book outlines three stages: pre-Fordist, Fordist and post-Fordist.

Fordism was more than a production system based on a continuously moving assembly line where workers performed repetitive tasks. The "ism" in the term signifies that mass production and mass consumption are locked in an ineluctable embrace: If the line makes possible mass production, ways must be found to provide for mass consumption. Fordism therefore entails the formation of a new consumer, by raising wages and, above all, by vastly expanding the credit system. Now workers could buy cars, houses and appliances, even send their kids to college on the principle of "buy now, pay later." This practical consideration led to a new conception of modernity.

The advent of consumer society has changed the face of America and the rest of the capitalist world. In the era before Fordism, which Holt terms pre-Fordist, the US economy was crucially dependent on slavery. Whatever the egregious moral and ethical features of these regimes, blacks were at the core of both pre-Fordist and Fordist production. In the slave and post-Reconstruction eras they planted and picked cotton and tobacco; during and after World War II they were recruited into the vast network of industrial plants as unskilled and semiskilled labor. Indeed, they shared in the cornucopia of consumer society, owning late-model cars and single-family homes; in some instances, they were able to send their kids to college. As Holt points out, exploitative as these relationships were, blacks occupied central places in American economic life.

The 1970s witnessed the restructuring of world capitalism: The victories of the labor movement prompted capital to seek cheaper labor abroad; cities like Detroit, Cleveland, New York, Oakland and Chicago, where black workers constituted a significant proportion of the manufacturing labor force, were rapidly deindustrialized. In their place arose not only retail establishments but "new economy" computer-mediated industries like hardware and software production, dot-coms and financial services.

But the reindustrialization of the 1980s and 1990s occurred without the participation of blacks. In industries marked by the old technologies, hundreds of thousands of immigrants made garments, toys and other consumer goods. Lacking capital, African-Americans could not own the retail businesses in their own communities; these are largely owned by immigrants--Koreans, Indians, Caribbean and African merchants. Holt argues that American blacks are now largely excluded from the global economy; they occupy economic niches that are no longer at the center of production, positions that augur badly for the future of race in America. Even the growing black middle class is located in the public sector, which has been under severe attack since the 1980s. Moreover, Holt plainly rejects the judgment of "black progress" that leads to culturalist explanations for the growing economic and social disparity between blacks and whites. He finds:

overwhelming contemporary evidence that racism permeates every institution, every pore of everyday life. Justice in our courts, earnings on our jobs, whether we have a job at all, the quality of our life, the means and timing of our death--all form the stacked deck every child born black must take up to play the game of life.

V.

Holt's essay ends with the faint hope that racial stigma and segregation will one day be overcome. For now, he insists, race remains, for all African-Americans--not only those suffering the deprivations of ghetto life--the problem of the twenty-first century. But neither Malcomson nor Holt can find sources of resistance. In fact, Holt explicitly gives up on the labor movement--a prime mover within the Fordist regime. It has been severely weakened by post-Fordist globalization. Nor does he identify forces within the African-American movements capable of leading a fight. As other sharp critics of racism such as Paul Gilroy point out, despair overwhelms hope.

The concept that judicial and legislative prohibitions against discrimination are sufficient to erase the legacy of four centuries of social and economic oppression is deeply embedded in the American imagination, always alert to the quick fix. From this view follows the inevitable conclusion that if income and social disparities stubbornly refuse to go away, something must be wrong with the victim. Thus the tricky and misleading term "culture" as explanation, which segues into proposals that blacks "pull up their socks" and reach for the main chance. But those who are passionate in their insistence that the elimination of the structurally induced racial divide will require a monumental struggle have hesitated before the gateway of class politics.

For those like Holt and Malcomson, who are less concerned with whether blacks enter corporate boardrooms than with raising the bottom, there is little alternative to calling on the power of the labor movement to join the fray. It might be argued that organized labor, still dominated by a relatively conservative white leadership, has shown little inclination to mount a fierce defense of black interests. But as unions lose their traditional white, blue-collar base, the labor movement is becoming more black and Latino, and certainly more female. And as its constituency is transformed, there are signs that the top leadership is learning a few lessons. The AFL-CIO's 2000 call for amnesty for undocumented immigrants was a move that set a historic precedent. And its attempt to organize low-wage workers is a reversal of past practice. If race remains a central problem of the new century, the way forward is probably to re-establish the race/class alliance that fell on hard times in the 1960s and 1970s. Without it, black freedom is confined to a cry in the darkness.

Almost every week, it seems, we get to read about some state execution, performed or imminent, wreathed in the usual toxic fog of race or sex prejudice, or incompetency of counsel, or prosecutorial misconduct.

Take the recent execution in Ashcroft country, February 7, of Stanley Lingar, done in the Potosi Correctional Center in Missouri, for killing 16-year-old Thomas Allen back in 1985. In the penalty phase of Lingar's trial, prosecutor Richard Callahan, who may now be headed for the seat on the Missouri State Supreme Court recently vacated by his mother-in-law, argued for death, citing Lingar's homosexuality to the jury as the crucial factor that should tilt poison into the guilty man's veins. Governor Bob Holden turned down a clemency appeal and told the press he'd "lost no sleep" over signing off on Lingar's fate.

Is there any hope that the ample list of innocent people either lost to the executioners or saved at the eleventh hour will prompt a national moratorium such as is being sought by Senator Russell Feingold of Wisconsin?

A year ago it seemed possible. On January 31, 2000, Illinois Governor George Ryan suspended imposition of the death penalty in his state on the grounds that he could not support a system "which, in its administration, has proven so fraught with error."

By June a Field Poll reported the sensational finding that in the state with the most crowded death row in the nation, Californians by nearly 4 to 1 favored stopping state executions to study how the death penalty was being applied. The Field Poll respondents were told about wrongful convictions, as well as about appeals to Governor Gray Davis by religious leaders for a moratorium. A poll at the end of last year, in which California respondents were not offered this framework, put support for a moratorium at 42 percent, just behind those opposed to any such move. A national poll last fall found 53 percent for a moratorium.

The discrepancy in the California polls actually affords comfort to abolitionists, since it shows that when respondents are told about innocent people saved from lethal injection, often at the last moment, support for a moratorium soars. It's a matter of public education.

But where are the educators? Many eligible political leaders have fled the field of battle, convinced that opposition to the death penalty is a sure-fire vote loser. In the second presidential debate last fall Al Gore wagged his head in agreement when George W. Bush declared his faith in executions as a deterrent.

A few years ago Hillary Clinton spoke of her private colloquies with the shade of Eleanor Roosevelt. Their conversations left La Clinton unpersuaded, since she stands square for death, as does New York's senior senator, Charles Schumer.

Indeed, the death penalty is no longer a gut issue, or even a necessary stand, for those, like Schumer, who are associated with the Democratic Party's liberal wing. On February 12 the New York Post quoted Kerry Kennedy Cuomo, long known as a leading death-penalty opponent, as saying that "it would be futile" to try to repeal capital punishment in New York.

Mrs. Cuomo, daughter of Robert F. Kennedy, told the Post that she believes her husband, Andrew, a contender for the Democratic nomination for governor, shares her views. "To tell you the truth, on the death penalty, it's not as big an issue in the state as it was a few years ago." Mrs. Cuomo's father-in-law, Mario, repeatedly vetoed death-penalty measures during his years as governor.

In line with Kerry Kennedy Cuomo's spineless stance, many liberal or what are now cautiously called "human rights" groups have also found it politic to sideline capital punishment as an issue. No better illustration is available than the recent tussle over John Ashcroft's nomination as Attorney General. Scores of groups flailed at him on choice, racism and hate crimes, but not on the most racist application of hate in the arsenal of state power: the death penalty.

Return for a moment to the fight to save Lingar's life. Privacy Rights Education Project, the statewide Missouri gay lobby group, endorsed Holden in his gubernatorial race. PREP, however, was quite muted on Lingar's fate, taking little action except to send a letter to the governor the day before the execution. Another gay organization, the Gay and Lesbian Alliance Against Defamation, the folks who want to shut down Dr. Laura, is a national group but happens to have an office in Kansas City, Missouri. Surely what prosecutor Callahan did to Stanley Lingar is well beyond defamation. Where was the Gay and Lesbian Alliance on this case? Not a peep from them. Noisy on hate crimes but silent on the death penalty is the Human Rights Campaign, the nation's largest gay-advocacy group.

The issue of capital punishment is drawing much more attention these days. Just when help could really make a difference, where are all these (ostensibly) liberal and progressive groups? The Anti-Defamation League (all right, strike the word "ostensibly"), whose national director, Abraham Foxman, pulled down $389,000 in 1999, was busy writing letters for Marc Rich. The death penalty? The ADL endorsed Bill Clinton's appalling Antiterrorism and Effective Death Penalty Act of 1996.

The impetus given by Ryan last year could fall apart. Governor Ryan himself faces difficult re-election prospects in 2002, and a successor could rescind the moratorium. Liberals should abandon their absurd and dangerous obsession with hate crimes and muster against this most hateful excrescence on the justice system. Let them take encouragement from the district attorney of San Francisco, Terrence Hallinan, who told a San Francisco court on February 6 that he would not participate in the capital sentencing of one Robert Massey since "the death penalty does not constitute any more of a deterrent than life without parole" and, among other evils, "discriminates racially and financially, being visited mainly on racial minorities and the poor.... It forfeits the stature and respect to which our state is entitled by reducing us to a primitive code of retribution."

The recording industry has been celebrating the supposed defeat of Napster. The Court of Appeals for the Ninth Circuit has affirmed the grant of a preliminary injunction that may well have the effect of closing the service down completely and ending the commercial existence of Napster's parent (that is, unless the record companies agree to an implausible deal Napster has proposed). But despite appearances, what has happened, far from being a victory, is the beginning of the industry's end. Even for those who have no particular stake in the sharing of music on the web, there's value in understanding why the "victory" over Napster is actually a profound and irreversible calamity for the record companies. What is now happening to music will soon be happening to many other forms of "content" in the information society. The Napster case has much to teach us about the collapse of publishers generally, and about the liberative possibilities of the decay of the cultural oligopolies that dominated the second half of the twentieth century.

The shuttering of Napster will not achieve the music industry's goals because the technology of music-sharing no longer requires the centralized registry of music offered for sharing among the network's listeners that Napster provided. Freely available software called OpenNap allows any computer in the world to perform the task of facilitating sharing; it is already widely used. Napster itself--as it kept pointing out to increasingly unsympathetic courts--maintained no inventory of music: It simply allowed listeners to find out what other listeners were offering to share. Almost all the various sharing programs in existence can switch from official Napster to other sharing facilitators with a single click. And when they move, the music moves with them. Now, in the publicity barrage surrounding the decision, 60 million Napster users will find out about OpenNap, which cannot be sued or prohibited because, as free software, no one controls its distribution and any lawsuits would have to be brought against all its users worldwide. Suddenly, instead of a problem posed by one commercial entity that can be closed down or acquired, the industry will be facing the same technical threat, with no one to sue but its own customers. No business can survive by suing or harassing its own market.
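To make the architectural point concrete, here is a minimal, hypothetical sketch of the kind of index such a facilitator keeps: pointers to who is offering what, never the music itself. The names and structure are illustrative only, not OpenNap's actual protocol; the point is simply that any machine can play this role, which is why removing one operator removes nothing from the network.

```python
# Hypothetical sketch of a sharing index: it stores only pointers
# (which peers offer which titles), never the files themselves.
from collections import defaultdict

class ShareIndex:
    def __init__(self):
        # title (lowercased) -> set of peer addresses currently offering it
        self._offers = defaultdict(set)

    def register(self, peer: str, titles: list[str]) -> None:
        """A peer announces the titles it is willing to share."""
        for title in titles:
            self._offers[title.lower()].add(peer)

    def unregister(self, peer: str) -> None:
        """A peer disconnects; its offers disappear with it."""
        for peers in self._offers.values():
            peers.discard(peer)

    def search(self, title: str) -> set[str]:
        """Return the peers offering a title; the transfer itself
        happens directly between peers, not through the index."""
        return set(self._offers.get(title.lower(), set()))

# Any computer can run such an index, so shutting one down does not
# remove the capability from the network.
index = ShareIndex()
index.register("peer-a.example:6699", ["Song One", "Song Two"])
index.register("peer-b.example:6699", ["Song Two"])
print(index.search("song two"))  # both peers are listed as sources
```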

The music industry (by which we mean the five companies that supply about 90 percent of the world's popular music) is dying not because of Napster but because of an underlying economic truth. In the world of digital products that can be copied and moved at no cost, traditional distribution structures, which depend on the ownership of the content or of the right to distribute, are fatally inefficient. As John Guare's famous play has drummed into all our minds, everyone in society is divided from everyone else by six degrees of separation. The most efficient distribution system in the world is to let everyone give music to whoever they know would like it. When music has passed through six hands under the current distribution system, it hasn't even reached the store. When it has passed through six hands in a system that doesn't require the distributor to buy the right to pass it along, it has already reached several million listeners.
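The arithmetic behind that last claim is simple geometric growth; a back-of-the-envelope sketch, assuming for illustration that each recipient passes a copy along to ten acquaintances (the fan-out figure is an assumption, not the article's):

```python
# Rough check of the "six hands" claim: pass-along sharing compounds
# with each hop. The fan-out of 10 is an illustrative assumption.
fan_out = 10
hops = 6
reach = sum(fan_out ** n for n in range(1, hops + 1))
print(f"Listeners reached after {hops} hops: {reach:,}")
# Prints: Listeners reached after 6 hops: 1,111,110
# A fan-out of 12 would put the figure over 3 million.
```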

This increase in efficiency means that composers, songwriters and performers have everything to gain from making use of the system of unowned or anarchistic distribution, provided that each listener at the end of the chain still knows how to pay the artist and feels under some obligation to do so, or will buy something else--a concert ticket, a T-shirt, a poster--as a result of having received the music for free. Hundreds of potential "business models" remain to be explored once the proprietary distributor has disappeared, no one of which will be perfect for all artistic producers but all of which will be the subject of experiment in decades to come, once the dinosaurs are gone.

No doubt there will be some immediate pain that will be felt by artists rather than the shareholders of music conglomerates. The greatest of celebrity musicians will do fine under any system, while those who are currently waiting on tables or driving a cab to support themselves have nothing to lose. For the signed recording artists just barely making it, on the other hand, the changes are of legitimate concern. But musicians as a whole stand to gain far more than they lose. Their wholesale defection from the existing distribution system is about to begin, leaving the music industry--like manuscript illuminators, piano-roll manufacturers and letterpress printers--a quaint and diminutive relic of a passé economy.

The industry's giants won't disappear overnight, or perhaps at all. But because their role as owner-distributors makes no economic sense, they will have to become suppliers of services in the production and promotion of music. Advertising agencies, production services consultants, packagers--they will be anything but owners of the music they market to the world.

What is most important about this phenomenon is that it applies to everything that can be distributed as a stream of digital bits by the simple human mechanism of passing it along. The result will be more music, poetry, photography and journalism available to a far wider audience. Artists will see a whole new world of readers, listeners and viewers; though each audience member will be paying less, the artist won't have to take the small end of a split determined by the distribution oligarchs who have cheated and swindled them ever since Edison. For those who worry about the cultural, economic and political power of the global media companies, the dreamed-of revolution is at hand. The industry may right now be making a joyful noise unto the Lord, but it is we, not they, who are about to enter the promised land.
