News and Features
In an end-of-the-year column devoted to "Politics and Prose," Peter Beinart, editor of The New Republic, asserted that there had been a "new gravity" and "sobriety" to American journalism since September 11. Literary responses had failed, he argued, to process the event, notably in a commemorative issue of The New Yorker in which the writing had been "excessive, even grotesque when applied to mass carnage in downtown New York."
Beinart declared it was now the era of the essay--"non-reported, non-narrative, political or historical analysis"--and "the sombre profile of a person in power"--stripped of excessive description, wanton psychoanalysis and "edge" but not of dutiful and accurate quotation. "American journalism, after a long while on the sidelines," he rallied, was "back in the game."
It was a shaky argument, one that some editor of The New Republic (a magazine that, as a matter of policy, confuses an antiliterary style of journalism with an anti-indulgent outlook) was bound to try to make sometime.
Let's face it, the new Hunter S. Thompson won't ever be found in its Puritan liberal pages, though the journalism of a New Yorker writer like Jonathan Franzen just might be, albeit a soberer, straighter version. Franzen himself exhibits too minute a panic in his work, too much of an "edge" (see his novel of last year, The Corrections), and is simply too much like a literary forefather such as Joseph Heller (Catch-22 and, more important for Franzen, Something Happened) to make any editor at The New Republic feel he had a grip on the world. And what is The New Republic--or any news and culture magazine--about if it isn't grip, skeptical firmness, analytical rectitude?
Ever since the 1960s and the advent of New Journalism--subjective and, yes, "literary" in its aspirations, distinguished by figures like Truman Capote, Norman Mailer, Gay Talese, Tom Wolfe, Gail Sheehy, Joan Didion--there has been an ongoing and necessary argument in favor of old-school values like objectivity, plain writing and reporting craft. Beinart's analysis of the American print media today is just the latest salvo, objectively put of course, saying out with "the New" and in with the old. It's part of a larger debate about consciousness and language, and how best to represent the state of the nation in both journalism and fiction in ways that reassure Americans their world can be secured, defined, reinforced.
Ironically, the tag New Journalism has been a misnomer from the beginning, implying--all the more alongside the revolutionary context of the 1960s that birthed it--a rejection of past values and a blind dive into the postpsychedelic waters of contemporary reality. It also denies the historical significance of figures like George Orwell, Martha Gellhorn, Joseph Mitchell and Damon Runyon, who created openings in journalistic convention, idiosyncrasies that demonstrate that "New Journalism" had been around for the best part of the century--if a writer had the gift and the license to explore the possibilities. For that matter, is it so far from Walt Whitman's 1882 diary of the Civil War in Specimen Days, to Michael Herr's scattershot report on Vietnam, Dispatches?
Many writers disliked the term New Journalism for these very reasons, preferring less-catchy descriptions like "Immersion Journalism" to describe the intense amounts of research and closeness to one's subject matter required to make such subjective reporting great and accurate storytelling; or "Literary Journalism" because of the undisguised desire to apply the techniques of fiction to a retelling of factual events and conversations.
One of the most notorious indicators of the style was the use of interior monologue, even pure streams of consciousness, in groundbreaking pieces like Gay Talese's "The Loser," a brilliant profile of boxer Floyd Patterson (Esquire, 1964), and Tom Wolfe's "The First Tycoon of Teen" (New York, 1965), a feature story on the recording mogul Phil Spector. How absurd, these voices from inside their heads! Wolfe's rhetorical answer to the critics was a slap in the face: "How could a journalist, writing nonfiction, accurately penetrate the thoughts of another person? The answer proved to be marvelously simple: interview him about his thoughts and emotions."
A radical and disciplined art, New Journalism presented a cinematic and psychological rupture with the prevailing journalistic approaches, using dialogue, scenes, thoughts in a dramatic reconstruction of events and interview material. But it still depended on the old verities: solid research, thorough interviewing, good writing (albeit more jazzy in tone and form) and diligent fact-checking. It was an extension of the possibilities, not a denial or negation of what had happened before.
Not content to disturb the print media, New Journalists started shaking up the literary world by producing "narrative non-fiction" bestsellers that caught the times better than any novelist seemed capable of: Capote's masterful and groundbreaking insight into murder and America's pathological underbelly, In Cold Blood (1965); Didion's neurotic essays on her pale sense of selfhood amid West Coast cultural decadence, Slouching Towards Bethlehem (1968); Mailer's rambunctious, egomaniacal coverage of an anti-Vietnam War protest march on the Pentagon, The Armies of the Night (1968).
Books like Thompson's Fear and Loathing on the Campaign Trail '72 (1973), Herr's Dispatches (1977), Mailer's The Executioner's Song (1979) and Wolfe's The Right Stuff (1979) were among a slather of later releases that proved the phenomenon was not going away--from magazine and newspaper journalism or the bestseller lists. In a twist of fate, Mikal Gilmore, the brother of convicted killer Gary Gilmore, Mailer's subject in the capital punishment "thriller" The Executioner's Song, would go on to become one of the few decent writers of the 1990s operating within what could be called the New Journalist tradition, producing a superb book on his brother as well as some excellent writing for Rolling Stone.
Something sick, though, has been happening since the 1960s and '70s heyday of such writers and books. News as non-stop entertainment, the journalist as B-grade personality, a long, slow, moronic nose dive into excess on a scale difficult to imagine back then.
Beinart is right to attack a media consumed today by "lifestyle writing," the bastard child of New Journalism, and a puffed-up aesthetic attitude lacking the flair and depth of earlier, greater writers. Rather than simply attack an excess of style, though, and perhaps a poverty of generational talent, I'd locate the current malaise in the format-driven glibness that is smothering the oxygen of intelligence--not to mention true journalistic creativity--out of magazines and newspapers today.
As serious print media have attempted to go "lighter" and chase readers in the past decade, circulation figures have dropped, even plummeted. This is a worldwide crisis for up-market magazines and newspapers, dimly explained with arguments (not entirely believed, even by those proposing them) that people are getting more information from the Internet or that the educated reader is disappearing. The truth, more awfully, is that readers of all stripes are disillusioned with what's available. Editors and publishers don't seem to know what to do about that except to go further down-market to anything dumber, faster and glitzier, pursuing that fragmenting audience, that shrinking attention span.
Unfortunately, the old formulas aren't functioning anymore in this fractured, increasingly unstable--some might even say dystopic--market. Thus the argument for "sections" and targeted bites of information neatly accompanied by highly supportive advertising. Even if it's meaningless and no one reads it, at least it turns a profit.
If the New Journalist was merely an "impresario" of stories, as the critic Michael Arlen caustically observed in 1972, today's news feature is altogether more miserable, niche-marketed directly to you without need of any bigger and possibly destabilizing voice. The impresarios are mostly gone; now only the product exists, its sheen undisturbed.
Market conditions of the industry aside, there is something deeply conservative beneath Beinart's analysis, a view that spells trouble for the future of modern journalism and where it might go in the United States today--and therefore the world at large. Certainly Beinart's reactionary spirit is in tune with the nation's siege mentality and a chauvinism that encourages the closing not just of borders but of the state of the American mind. There is a feeling that the unexpected, the elusive, manifest in the form of volatile individuals and their creativity, are not legitimate concerns and activities for American voices in an era of uncertainty and instability.
This affects both the media and literature as the struggle for "representation" in American life takes on a deeply political dimension in terms of the language that should be used. It is not just a matter of what is debated, interpreted, depicted--but how that debate should be carried out, the implication being that the wrong words themselves betray the state. There has been an across-the-board conservative intellectual push in the United States for some time now, making an argument for a return to literary order in fiction. B.R. Myers's controversial essay, "A Reader's Manifesto," in The Atlantic Monthly last summer, struck similar notes to Beinart's more recent aria, attacking the wordy pretensions and metaphoric excesses of contemporary American fiction writers like Don DeLillo, Cormac McCarthy and E. Annie Proulx. Subtitled "An attack on the growing pretentiousness of American literary prose," the essay denounced evil postmodernists, showoffs and "pansified intellectuals" who had undermined good language and sound thinking across the nation. What Myers demanded was "a reorientation towards tradition."
In one of many trainspotting examples he berated Proulx for some "characteristic prose" where she thanked her children at the end of Close Range (1999) for putting up with her "strangled, work-driven ways." According to Myers this phrase made "no sense on any level." When a reader wrote in to complain that it was "an implied metaphor and hardly difficult to understand," Myers stuck to his guns, returning to the dictionary and rules of grammar to justify himself. Fortunately for us, language moves--and is received--poetically and intuitively, even if Myers doesn't want to admit it.
However, he did finger a crucial distraction from the building of the modern American novel and how it is reviewed, even sanctified today. He called this "the sentence cult," those who adore wonderful phrases and patches of writing that finally do not add up to a fully felt, organically "alive" book worth reading, let alone relating to deeply. On this point of literary fragmentation, a collapse away from storytelling and character, a collapse of identification and therefore identity, he may well be right. As to whether such a collapse makes the literature inherently bad--well, that's another thing altogether.
This debate about the state of the American novel has been going on for years. Indeed, American literature regularly convulses to such landmark essays--and the need to write them--a battle for intellectual territory that should not be underestimated. The reverberations of these opinion pieces among cultural elites carry through as manifestoes for the times and exert enormous influence in publishing houses and the media. They should also be understood as beachheads for the highbrow magazines presenting them: in this case the long-running desire of the Atlantic to overtake Harper's as the defining literary and intellectual periodical of the day, a desire underlined by its drift toward political and aesthetic conservatism. The Atlantic is ready for the Bush era, righteous, satisfied and a little smug, just as Harper's might be seen as aristocratically Clintonian, progressive to the point of dilettantism and somehow out of step with the narrowing contemporary mood.
Myers's essay is an attempt to supersede an argument put forth by Franzen in Harper's in 1996, in a piece titled "Perchance to Dream: In the Age of Images a Reason to Write Novels." In that essay, Franzen wrote of his own "despair about the American novel." His conclusions, and hopes, however, were somewhat different from those of Myers.
Of the social novel Franzen wrote: "I didn't know that Philip Roth, twenty years earlier, had already performed the autopsy, describing 'American reality' as a thing that stupefies...sickens...infuriates, and finally...is even a kind of embarrassment to one's own meager imagination. The actuality is continually outdoing our talents." His despair for the state of the American novel was born out of the 1991 Gulf War and "a winter when every house in the nation was haunted by the ghostly telepresences of Peter Arnett in Baghdad and Tom Brokaw in Saudi Arabia--a winter when the inhabitants of those houses seemed less like individuals than a collective algorithm for the conversion of media jingoism into an 89 percent approval rating."
Questioning the difficulties of social realism in the age of electronic media, and arguing that TV could represent reality better than any novel could, Franzen pined for the days when a book like Catch-22 had a huge social impact, raising questions about society to such a level that its title became part of the common vocabulary, entering the dictionary itself (a thought that must give B.R. Myers a sleepless night or two).
What Franzen saw in Heller's black and absurdist work was less of a need to find legitimacy in a realistically detailed social novel of the present à la Dickens, but instead to write a novel that socially engaged, quite a different--if not unrelated--thing. It was this thinking that guided him in writing The Corrections, with its vaguely hallucinogenic, forensically detailed portrait of American family life, and the struggle of its characters to remain human amid the blizzard of consumer alienation. Despite the rave reviews and bestseller status, it is perhaps a little early yet to know if Franzen has succeeded in his project of engagement; but there is no doubt he has struck a nerve.
None too surprisingly, The New Republic took Franzen to task for his epic yet atomized scope. Observing the influence of the novelist Don DeLillo on the younger Franzen, the critic James Wood made a piercing summation of the senior writer's impact on The Corrections: "The DeLillo notion of the novelist as a kind of Frankfurt School entertainer, fighting the culture with dialectical devilry, has been woefully influential, and will take some time to die." Noting that Franzen imagined "a correction of DeLillo in favor of the human," Wood went on to say that this was "more than welcome, it is an urgent task of contemporary American fiction, whose characteristic products are books of great self-consciousness with no selves in them; curiously arrested books that know a thousand different things--the recipe for the best Indonesian fish curry! the sonics of the trombone! the drug market in Detroit! the history of strip cartoons!--but do not know a single human being."
It's clear that Wood--one of America's finest literary critics--finally favors something of Franzen's humanity but resents the occult unease beneath DeLillo's crowded linguistic responses to consumer capitalism and how he applies that language to create a surreptitious and infecting despair, a deep, flamboyant coldness. There is also a vague feeling from Wood that DeLillo is somehow evil, a monster of hidden tones, corrupting America from within. He is certainly appalled by a DeLillo essay that appeared in the New York Times, "The Power of History," wherein the novelist declared, "At its root level, fiction is a kind of religious fanaticism, with elements of obsession, superstition, and awe. Such qualities will sooner or later state their adversarial relationship with history."
How strange those words from 1997 read now, post-September 11. In the buildup to this statement DeLillo had defined the modern novelist as a radical and an outsider to all systems: political, social, linguistic. "Fiction will always examine the small anonymous corners of human experience," he wrote.
But there is also the magnetic force of public events and the people behind them. There is something in the novel itself, its size, its openness to strong social themes that suggests a matching of odd-couple appetites--the solitary writer and the public figure at the teeming center of events. The writer wants to see inside the human works, down to dreams and routine rambling thoughts, in order to locate the neural strands that link him to men and women who shape history. Genius, ruthlessness, military mastery, eloquent self-sacrifice--the coin of actual seething lives.
Against the force of history, so powerful, visible and real, the novelist poses the idiosyncratic self. Here it is, sly, mazed, mercurial, scared half-crazy. It is also free and undivided, the only thing that can match the enormous dimensions of social reality.
This is a nihilistic view, divorcing itself from history's involving tug or becoming perhaps a perversion of it. DeLillo might argue that such perversions are simply the logical result of a "social individual" in the Information Age. A kind of endgame--alienated, yes, but lit with negative protest nonetheless.
Franzen identified this problem similarly in his "Perchance to Dream" essay as the way "privacy is exactly what the American Century has tended toward. First there was mass suburbanization, then the perfection of at-home entertainment, and finally the creation of virtual communities whose most striking feature is that interaction within them is entirely optional--terminable the instant the experience ceases to gratify the user."
The collapse of the myth of the Internet as a democratizing force in news and information, as a glue for a new public consciousness, is part of this great feeling of disaffection and disconnection. While it remains a counterculture organizing ground for assorted global protest groups, it is not quite the democratic free-for-all it was once hoped to be. Meanwhile, clichés like "the New New Journalism" and "the Way New Journalism," which try to give countercultural weight to new forms of Internet journalism, have fallen fast to the reality that major news corporations are maintaining their centrality and indeed expanding it, seeking international print and electronic monopolies over freelance writers in a manner that all but strangles them out of the mainstream system. Add to this a babble of impotent, even crazed voices, and you have confusion, not liberty, shouting to be heard outside the corporate gates.
Where New Journalism once challenged a homogeneity of opinion, even one of its most extreme practitioners, Hunter S. Thompson, the godfather of "Gonzo," finds a heterogeneity on the Net so repulsive he can't bear it. As he put it, "There is a line somewhere between democratizing journalism and every man a journalist. You can't really believe what you read in the papers anyway, but at least there is some spectrum of reliability. Maybe it's becoming like the TV talk shows or the tabloids where anything's acceptable as long as it's interesting."
The language of the Net itself is affecting new books and the audiences who might be reading them. Figures like Dave Eggers in A Heartbreaking Work of Staggering Genius are also partly a literary byproduct of chat rooms and websites, and use an elliptical language at once heady and conversational, and therefore "young." Overall, though, one senses an impatience at root in Net culture, a desperation for sensation and the moment that does not feed itself into writing or reading books. In that regard the seething quality of the public consciousness, its near-madness, is really what the Net comes to represent--and with it a deep loneliness, a frenzy masked as social activity. Novelists like DeLillo, Franzen, Eggers, David Foster Wallace and Rick Moody order that sea of thought, but also manifest its rabidity and pointless depths, indexing it to the furies and absurdities of consumer culture. To steal a line from Marshall McLuhan, "Some like it cold."
A critic like Wood finds this sprawling ambition depressing. You might recall his lament that "It is now customary to read 700-page novels, to spend hours and hours within a fictional world, without experiencing anything really affecting, sublime, or beautiful.... This is partly because some of the more impressive novelistic minds of our age do not think that language and the representation of consciousness are the novelist's quarries anymore." It could be argued that just when New Journalism was pushing its way into literature's representational culture, the more talented novelists were moving out to the fringes of consciousness, to places "nonfiction narrative" could not reach. So much so that Tom Wolfe himself eventually berated modern American novelists for their abstractions and lack of research in his own essay manifesto, "Stalking the Billion-Footed Beast." Wolfe espoused a return to the qualities of naturalism, citing Emile Zola and, of course, the importance of the "novelist as reporter." (Since Wolfe had recently written The Bonfire of the Vanities, his screed was seen in many quarters as bald self-promotion.)
Wolfe's views are not so far away from those of B.R. Myers. Wolfe's zippy writing style may have sung with pop culture verve, but he has always been a conservative at heart, as his rigid championing of realism, or "documentation," as he prefers to call it, shows. Aside from a stoush with Mailer over A Man in Full, Wolfe has also argued with John Updike and John Irving, the latter describing all of Wolfe's novels as nothing but "yak" and "journalistic hyperbole described as fiction." Wolfe responded to them all in an essay called "My Three Stooges" (it can be found in his latest collection, Hooking Up), accusing Mailer, Updike and Irving of having "wasted their careers by not engaging in the life around them...turning their backs on the rich material of an amazing country at a fabulous moment in history."
In Wolfe's final opinion, the American social novel is suffering not from "obsolescence" but from "anorexia." For all its force of actuality, though, Franzen's sickened density in The Corrections is quite a different creature from Wolfe's idea of what a social novel should be. It doesn't just observe or document; it palpitates, realistically, with the surreal excess Wolfe once identified as an indulgence. And in a strange way, perhaps, it softens the blows of DeLillo, tries to put us back together again without hiding the cracks.
Now, however, a new era of unvarnished reporting and the dogmatism of style that underlies it appears to be dawning. September 11 is fuel to this conservative fire. The world has become so unsteady, the argument runs, that we have to get back to our roots, find the lines that moor us safely to what and where we are. Plasticity of language, tangential and subjective reporting, work that emphasizes a fractured or restless view of the world--these must be stopped. Examples abound.
There can be little argument that September 11 has sent everyone into the spin of re-evaluation. But writers have always had to wrestle with such extreme moments, monstrous acts that threaten to annihilate us, spiritually if not actually. Where is the sense in it? How do we become human again, rather than vengeful, blind with loss or hate? One danger for literary journalism, of course, is that it threatens to aestheticize the experience of an event like September 11. The same may well be true of writing about Hiroshima, the Holocaust, even something as basic as a brutal, anonymous murder. Straight journalism must negotiate the obverse dangers, the tendency to reduce everything to the details, an impartiality that becomes desensitizing and objective to the point of emotional irrelevance.
The proof of value must finally lie within the words themselves. And for all of Beinart's criticisms of unnecessary poetics and dubious metaphors, the literary fraternity and journalists of literary inclination still gave us much to be grateful for. What that may mean in terms of novels and a broader state of mind to come is still too early to tell; but his and Myers's demand for a retreat from the frontiers of ambiguity, from wordplay and a tensile language that the likes of Don DeLillo tease into something conscious and unsettling within us is, well, a backward step. It may be awful to say it, but the obsession with information that underlies the work of DeLillo, Franzen and others could still be capturing the real and enduring trauma for society, way beyond the immediate horror of September 11 and its psychic impact.
I have to note that the English novelist Ian McEwan's dark and cool eloquence in The Guardian--his interrogation of the images and our action-replay absorption in them, our nauseating lust for news--was of the first order, as both literary essay and as a moral inquiry between self and the society of spectacle. Factual journalism alone can't easily create that kind of recognition. In Vanity Fair, the novelist Toni Morrison's address to the dead was the finest elegy I read from anyone, anywhere, with her bruising admission of "nothing to give...except this gesture, this thread between your humanity and mine." Yes, facts can make us grieve, too, but there are times when we also need the obscure magic of poetry to heal us.
Even the issue of The New Yorker so maligned by Beinart was filled with great literary journalism. The one exception to the form was an essay--nonreported, nonnarrative, political, historical, analytical--by Susan Sontag, a piece strangely overlooked by Beinart in his comments, given the new aesthetic world order he perceives. Sontag questioned the proposition of national innocence, and how that outlook refuses some of the baggage of responsibility America has to bear for its foreign policy. It was easier to misunderstand, simplify and demonize her arguments than to take on board the questions she was asking, even the sober ones.
September 11 did do something to the imagination, did go beyond words. It was a profound blow to the spirit. In all the realms of journalism and analysis since then, some have spoken well, some haven't. Some, most interesting of all, have evoked confusion and mixed feelings, and longed for the light of understanding. The clamor to speak has itself become a problem, a moral dilemma that reflects the media's sickening habit of overproduction, its sheer commerciality.
After any death, any tragedy, there is an inevitable level of sobriety and reserve, yes. If that leads to better journalism, better writing, better books, how wonderful for us all. But the argument that literary responses and literary journalism are somehow not up to the task, that reflection and rebuilding the public consciousness should be left to the practitioners of conventional bricks-and-mortar journalism and old-fashioned storytellers who know their rules of grammar, is far from convincing. To do the job fully we need a little soul and poetry, a little shaking up too. Literary journalism and great, radically written novels are more than able to fill that gap. And perhaps raise a few questions as well in that world between the imagined and the real where nightmares--and dreams--are born.
A friend and I were sitting around commiserating about the things that get to us: unloading small indignities, comparing thorns. "So there I was," she said, "sitting on the bus and this man across the aisle starts waving a copy of law professor Randall Kennedy's new book Nigger. He's got this mean-looking face with little raisiny eyes, and a pointy head, and he's taking this book in and out of his backpack. He's not reading it, mind you. He's just flashing it at black people."
"Don't be so touchy," I responded. "Professor Kennedy says that the N-word is just another word for 'pal' these days. So your guy was probably one of those muted souls you hear about on Fox cable, one of the ones who's been totally silenced by too much political correctness. I'd assume he was just trying to sign 'Have a nice day.'"
"Maybe so," she said, digging through her purse and pulling out a copy of Michael Moore's bestselling Stupid White Men. "But if I see him again, I'm armed with a 'nice day' of my own."
"That's not nice," I tell her. "Besides, I've decided to get in on the publishing boom myself. My next book will be called Penis. I had been going to title it Civil Claims That Shaped the Evidentiary History of Primogeniture: Paternity and Inheritance Rights in Anglo-American Jurisprudence, 1883-1956, but somehow Penis seems so much more concise. We lawyers love concision."
She raised one eyebrow. "And the mere fact that hordes of sweaty-palmed adolescents might line up to sneak home a copy, or that Howard Stern would pant over it all the way to the top of the bestseller list, or that college kids would make it the one book they take on spring break--"
"...is the last thing on my mind," I assured her. "Really, I'm just trying to engage in a scholarly debate about some of the more nuanced aspects of statutory interpretation under Rule 861, subsection (c), paragraph 2... And besides, now that South Park has made the word so much a part of popular culture, I fail to see what all the fuss is about. When I hear young people singing lyrics that use the P-word, I just hum along. After all, there are no bad words, just ungood hermeneutics."
"No wonder Oprah canceled her book club," she muttered.
Seriously. We do seem to have entered a weird season in which the exercise of First Amendment rights has become a kind of XXX-treme Sport, with people taking the concept of free speech for an Olympic workout, as though to build up that constitutional muscle. People speak not just freely but wantonly, thoughtlessly, mainlined from their hormones. We live in a minefield of scorched-earth, who-me-a-diplomat?, let's-see-if-this-hurts words. As my young son twirls the radio dial in search of whatever pop music his friends are listening to, it is less the lyrics that alarm me than the disc jockeys, all of whom speak as though they were crashing cars. It makes me very grateful to have been part of the "love generation," because for today's youth, the spoken word seems governed by people from whom sticks and stones had to be wrested when they were children--truly unpleasant people who've spent years perfecting their remaining weapon: the words that can supposedly never hurt you.
The flight from the imagined horrors of political correctness seems to have overtaken common sense. Or is it possible that we have come perilously close to a state where hate speech is the common sense? In a bar in Dorchester, Massachusetts, recently, a black man was surrounded by a group of white patrons and taunted with a series of escalatingly hostile racial epithets. The bartender refused to intervene despite being begged to do something by a white friend of the man. The taunting continued until the black man tried to leave, whereupon the crowd followed him outside and beat him severely. In Los Angeles, the head of the police commission publicly called Congresswoman Maxine Waters a "bitch"--to the glee of Log Cabin Republicans, who published an editorial gloating about how good it felt to hear him say that. And in San Jose, California, a judge allowed a white high school student to escape punishment after the student, angry at an African-American teacher who had suspended his best friend, scrawled "Thanks, Nigga" on a school wall. The judge was swayed by an argument that "nigga" is not the same as "nigger" but rather an inoffensive rap music term of endearment common among soul brothers.
Frankly, if Harvard president Lawrence Summers is going to be calling professors to account for generating controversy not befitting that venerable institution, the disingenuous Professor Kennedy would be my first choice. Kennedy's argument that the word "nigger" has lost its sting because black entertainers like Eddie Murphy have popularized it either dehistoricizes the word to a boneheaded extent or ignores the basic capaciousness of all language. The dictionary is filled with words that have multiple meanings, depending on context. "Obsession" is "the perfume," but it can also be the basis for a harassment suit. Nigger, The Book, is an appeal to pure sensation. It's fine to recognize that ironic reversals of meaning are invaluable survival tools. But what's selling this book is not the hail-fellow-well-met banality of "nigger" but rather the ongoing liveliness of its negativity: It hits in the gut, catches the eye, knots the stomach, jerks the knee, grabs the arm. Kennedy milks this phenomenon only to ask with an entirely straight face: "So what's the big deal?"
The New Yorker recently featured a cartoon by Art Spiegelman that captures my concern: A young skinhead furtively spray-paints a swastika on a wall. In the last panel, someone has put the wall up in a museum and the skinhead is shown sipping champagne with glittery fashionistas and art critics. I do not doubt that hateful or shocking speech can be "mainstreamed" through overuse; I am alarmed that we want to. But my greater concern is whether this gratuitous nonsense should be the most visible test of political speech in an era when government officials tell us to watch our words--even words spoken in confidence to one's lawyer--and leave us to sort out precisely what that means.
"Do Not Employ Arabs," "Enemies Should Not Be Offered a Livelihood" and "We Will Assist Those Who Do Not Provide Work For Arabs" are just a few of the slogans covering billboards throughout Jerusalem. These placards refer to Palestinian citizens of Israel. One poster even provides a detailed list of taxi companies that employ Arab citizens and companies that don't. Jewish history, it seems, has been forgotten.
This kind of blatant racism is now common in Israel; it feeds off the widespread fear of suicide bombings, which have also managed to change the Jerusalem landscape. Downtown streets are almost empty, and most businesses have been seriously hurt because of the dramatic decline in clientele. A recent poll suggests that 67 percent of Israelis have reduced the number of times they leave their homes. The only companies that have been thriving in the past months are security firms. Every supermarket, bank, theater and cafe now employs private guards whose duty is to search customers as they enter the building.
One of the effects of this new practice is that profiling has become ubiquitous. Arab-looking residents refrain from using public transportation and from going to all-Jewish neighborhoods and shopping centers. It is not unusual in the city to see groups of Arab men searched at gunpoint by Israeli police, their faces against the wall and their hands in the air.
On the national level, politicians have been exploiting the pervasive fear, using it to foment a form of fervent nationalism tinged with racism. Effi Eitam, the new leader of the National Religious Party, recently approved to become a minister in Sharon's government, has characterized all Palestinian citizens of Israel as "a cancer." "Arabs," he claims, "will never have political rule in the land of Israel," which in Eitam's opinion includes the West Bank and Gaza. Support for Sharon has also risen from 45 to 62 percent following the latest Israeli offensive. Since Palestinian citizens, who make up almost 20 percent of the population, adamantly oppose Israel's military assault, that figure suggests that only about one in five Jewish citizens is against Sharon's war. Most Jews consider themselves victims in this conflict, not aggressors.
The deeply rooted victim syndrome has been manipulated over the past year by the mainstream media in order to rally the public around the flag. For television viewers, Palestinian suffering is virtually nonexistent, while attacks on Jews are graphically portrayed, replayed time and again, thus rendering victimhood the existential condition of Israeli Jews. Radio and television have practically turned into government organs, allowing almost no criticism of Israel's policies to be aired.
It is within this stifling atmosphere that one must understand the slow resurgence of the Israeli peace camp. There are now about 400 new combat reservists who refuse to serve in the occupied territories, joining a similar number of refuseniks from Yesh Gvul ("There Is a Limit"). "We will not go on fighting beyond the 'green line' for the purposes of domination, expulsion, starvation and humiliation of an entire people," the soldiers wrote in an open letter. Since the eruption of the second intifada, eighty-seven conscientious objectors have been incarcerated; thirty-five are currently sitting in jail, more than in any other period in Israel's history.
On April 3, 4,000 Jewish and Arab protesters marched together from Jerusalem toward Kalandia checkpoint, located on the outskirts of Ramallah. The procession was led by women and included four truckloads of humanitarian aid. The demonstrators were stopped by a police blockade only minutes after they set out. As a member of the negotiation team, I was on the police side of the blockade when scores of tear gas canisters and stun grenades were thrown into the crowd. Policemen immediately pursued the protesters, trampling and violently beating them with their clubs. Among the injured were three Arab Knesset members. Later, while waiting for the trucks to return from Ramallah, a police officer explained that a woman precipitated the outburst: "She spat on one of the officers."
The next day, protesters gathered in front of the American Embassy in Tel Aviv to call on the US government to stop Israel's military incursion. The group was mostly composed of Palestinian citizens of Israel, although there were quite a few Jews. Again, the police assaulted the demonstrators, this time because one of them was carrying a Palestinian flag.
Two days later, on April 6, 15,000 people marched from Rabin Square to the Defense Ministry in Tel Aviv, calling on Sharon to immediately withdraw all military forces from the occupied territories and to restart negotiations with the Palestinian Authority. "The occupation is killing us all!" the demonstrators shouted. Channel 2 spent twenty seconds covering the event; Channel 1, Israel's public station, ignored it.
Not everyone disregarded the protest. Likud Knesset Member Gideon Ezra called upon the secret services to begin monitoring more carefully the activities of leftist organizations and blamed the only two journalists who continue to document what is happening on the Palestinian side--Amira Hass and Gideon Levy--for encouraging the campaign against Israel. Given the increasingly repressive atmosphere inside Israel, it appears that without massive pressure from abroad--not unlike the sanctions imposed on South Africa--Israel will not withdraw from the occupied territories, nor will it cease to oppress and subjugate the Palestinian people.
So, my momma called. "Why are they letting them gouge us like this?" she wanted to know. "They" are our so-called political leaders in Washington, and "them" are the drugmakers now costing her $500 a month. Nearing 87, Lillie Mabel Hightower has to take two medicines regularly, including a heart pill to keep the old ticker ticking. She tells me her pill bill goes up just about every time she refills her two prescriptions, having soared 40 percent in only two years. For someone on Social Security, the difference between $3,600 a year and $6,000 a year is a serious piece of change. "Of course I know why," she quickly added in answer to her own question: "It's the big money they give the politicians. But can't we do something? Who do I write?"
Like my mom's, the blood pressure of millions of seniors and others has reached the political boiling point because of price-gouging by big drug companies. Americans pay the highest prices in the world for prescriptions--an average of 30 percent more, for example, than Canadians pay for the exact same drugs. The companies jacked up our prices by another 17.1 percent last year while they went laughing to the bank with the highest profit margins of any industry, more than triple the average of all Fortune 500 corporations.
Political consultants in Washington recognize the explosiveness of this issue, so there has been a flurry of bills, press conferences and photo-ops by both parties, with each claiming that it cares more than the other about the problem. But, as Hemingway once advised, never mistake motion for action. No lobbying group is as well financed and well connected as the drug industry is in our capital city. It has 625 registered lobbyists on its payroll--ninety more lobbyists than there are members of Congress! The industry also liberally greases the skids of the legislative process with huge campaign donations, topping $26 million in the last election cycle. The result is that Washington postures, drug prices keep going up and seniors continue to seethe.
Still, my mother asks, "Can't we do something?" Yes.
As Maine Goes...
Look to the states where citizens' groups have teamed up with legislative leaders who not only are in motion but have taken action. While Washington fiddles and faddles, twenty-six states now have some sort of program to cut drug costs, at least for the low-income elderly, and several are leading the way toward programs to take the gouge out of prescription prices for everyone.
Chellie Pingree led the charge in Maine. A small businesswoman, she was elected to the State Senate, where she took up the cause of seniors being pounded by drug prices so high that some were forced to choose between paying for essential medications or the heating bill. With the leadership of grassroots groups like the Maine People's Alliance, Consumers for Affordable Health Care and the Maine State Council of Senior Citizens, she led busloads of seniors on well-publicized trips across the Canadian border to buy their medicines; on just one trip, twenty-five seniors got prescriptions filled for $16,000 less than in the United States. Why should people, she asked, have to take a six-hour bus ride to get fair prices? She answered by sponsoring the Fairer Prescription Drug Prices Act, which empowered a state pricing board to set retail prices in Maine.
Pingree's bill allowed seniors to go into any pharmacy in Maine and get the prescriptions they need at the same discounted price that Canada's government negotiates with drugmakers for its citizens. Her bill was simple, comprehensive, nonbureaucratic, effective...and it drove the big drugmakers bonkers. They dispatched their own buses to Augusta, loaded with lobbyists and money, in a frantic effort to kill the bill. In a blitz of TV and newspaper ads, the industry labeled the bill "a crazy idea" that would force the drug industry to abandon Maine. But the grassroots groups went to work, and Pingree, by now the Senate majority leader, took her bill directly to the people, holding public meetings from Madawaska to Biddeford. On April 12, 2000, her bill passed in both houses of the legislature by veto-proof margins, and the governor signed it.
Of course, the industry immediately hitched up a twenty-mule team of lawyers and rushed to federal court, but Maine's price-control law has been upheld all the way through the appeals court level and now awaits judgment at the US Supreme Court. Meanwhile, twenty-three other states, from Arizona to Wisconsin, are considering Maine's fair-pricing law, and public pressure from people like my momma is turning up the heat for national legislation.
Pingree, who is now running for the US Senate, has put the issue at the center of her campaign, vowing to bring the populist coalition behind the Maine Solution into play nationally. "There's such a disconnect between Washington and people's reality," she says. "This is more than an issue to people, it's personal outrage. Walk into any room, and it doesn't matter if the people are in overalls or suits; they've all got a story."
USAction, a network of state and local citizens' coalitions, has been a leader in developing the state proposals, and it's now working with other groups to move this public grievance from the low, slow backburner of Congress to the forefront of the progressive agenda (202-624-1730 or www.usaction.org). "This is a case where the people are miles ahead of the politicians," points out USAction's executive director, Jeff Blum, urging that progressives in Congress put forth a "fair pricing" plan that would give every senior the lowest price available on every drug. The key is not merely to provide universal coverage but to connect this to effective controls over the industry's ripoff prices. US citizens should pay no more than the average price that the drugmakers charge foreign customers in Canada, Japan, Italy and elsewhere.
If Congressional Democrats have a strategic bone left in their bodies, they'll grab this proposal and run with it, for the Lillie Mabel Hightowers are desperately looking for someone who'll stand with them against the drug profiteers. As pollster Celinda Lake reports, "This is the most powerful and intense issue of the 2002 elections, and Democrats should take the lead." If the party won't even stand up for our mommas, who'll stand up for the party?
"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."
Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.
Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.
One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next to last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.
The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he flatly asserts that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind, two chapters earlier. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he indicates the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.
In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.
Moreover, Noble undercuts his own case with hyperbole and by failing to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.
Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.
Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."
Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."
One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.
Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test-preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.
As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and by diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."
Noble does provide a strong cautionary tale--the involvement of UCLA's extension division with a commercial company called Onlinelearning.net--in the most informative chapter of the book. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years, a figure that quickly plummeted below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.
Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.
In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of them available online?
Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?
Noble's answer to most of these questions would undoubtedly be yes, insisting, as he does, that anything less than the "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)
The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.
We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."