Nation Topics - Society

On November 7, voters in Alabama erased from that state's Constitution a provision dating from 1901 that declared that "the legislature shall never pass any law to authorize or legalize any marriage between any white person and a Negro, or descendant of a Negro." This declaration represented in part a desire by white supremacists to express as fully as possible their intention to expunge the racially egalitarian symbols, hopes and reforms of Reconstruction. Although Alabama had never enacted a law expressly authorizing interracial marriage, in 1872 the state's Supreme Court did invalidate the law that prohibited such unions. But it promptly reversed itself in 1877 when white supremacists regained power. The Alabama Constitution's disapproval of interracial marriage, however, had still deeper roots. It stemmed from the presumption that white men had the authority to dictate whom, in racial terms, a person could and could not marry. It was also rooted in the belief that certain segments of the population were simply too degraded to be eligible as partners in marriage with whites. At one point or another, forty states prohibited marriage across racial lines. In all of them blacks were stigmatized as matrimonial untouchables. In several, "Mongolians" (people of Japanese or Chinese ancestry), "Malays" (Filipinos) and Native Americans were also placed beyond the pale of acceptability.

Rationales for barring interracial marriage are useful to consider, especially since some of them echo so resonantly justifications voiced today by defenders of prohibitions against same-sex marriage. One rationale for barring interracial marriages was that the progeny of such matches would be incapable of procreating. Another was that God did not intend for the races to mix. Another was that colored people, especially blacks, were irredeemably inferior to whites and posed a terrible risk of contamination. The Negrophobic Thomas Dixon spoke for many white supremacists when he warned in his novel The Leopard's Spots that "this Republic can have no future if racial lines are broken and its proud citizenry sinks to the level of a mongrel breed." A single drop of Negro blood, he maintained apocalyptically, "kinks the hair, flattens the nose, then the lip, puts out the light of intellect, and lights the fires of brutal passions."

Although opponents of prohibitions on interracial marriage have waged struggles in many forums (e.g., academia, the churches, journalism), two in particular have been decisive. One is the courtroom. In 1967 in the most aptly titled case in American history--Loving v. The Commonwealth of Virginia--the United States Supreme Court ruled that prohibitions against interracial marriage violated the equal protection and due process clauses of the Fourteenth Amendment. (Although much credit is lavished on the Court's decision, it bears noting that nineteen years earlier, in 1948, the Supreme Court of California had reached the same conclusion in an extraordinary, albeit neglected, opinion by Justice Roger Traynor.) When the federal Supreme Court struck down Jim Crow laws at the marriage altar, it relied on the massive change in public attitudes reflected and nourished by Brown v. Board of Education (1954), Martin Luther King Jr.'s "I Have A Dream" address (1963), the Civil Rights Act (1964) and the Voting Rights Act (1965). The Court also relied on the fact that by 1967, only sixteen states, in one region of the country, continued to retain laws prohibiting interracial marriage. This highlights the importance of the second major forum in which opponents of racial bars pressed their struggle: state legislatures. Between World War II and the Civil Rights Revolution, scores of state legislatures repealed bans against interracial marriage, thereby laying the moral, social and political groundwork for the Loving decision. Rarely will any court truly be a pioneer. Much more typically judges act in support of a development that is already well under way.

Unlike opponents of Brown v. Board of Education, antagonists of Loving were unable to mount anything like "massive resistance." They neither rioted, nor promulgated Congressional manifestoes condemning the Court, nor closed down marriage bureaus to prevent the desegregation of matrimony. There was, however, some opposition. In 1970, for example, a judge near Fort McClellan, Alabama, denied on racial grounds a marriage license to a white soldier and his black fiancée. This prompted a lawsuit initiated by the US Justice Department that led to the invalidation of Alabama's statute prohibiting interracial marriage. Yet the Alabama constitutional provision prohibiting the enactment of any law expressly authorizing black-white interracial marriage remained intact until the recent referendum.

That an expression of official opposition to interracial marriage remained a part of the Alabama Constitution for so long reflects the fear and loathing of black-white intimacy that remains a potent force in American culture. Sobering, too, was the closeness of the vote; 40 percent of the Alabama electorate voted against removing the obnoxious prohibition. Still, given the rootedness of segregation at the marriage altar, the ultimate outcome of the referendum should be applauded. The complete erasure of state-sponsored stigmatization of interracial marriage is an important achievement in our struggle for racial justice and harmony.

Amid all the partisan sniping, talking-head screeching and judicial decisions, there are two indisputable facts that go far toward explaining the true tragedy of the Florida recount.

Fact one: In this election, punch-card voting machines recorded five times as many ballots with no presidential vote as did the more modern optical-scanning systems. A New York Times analysis of forty-eight of the state's sixty-seven counties found that 1.5 percent of the ballots tallied under the punch-card method showed no vote at the top of the ticket, while only 0.3 percent of the ballots counted by the newer machines registered no vote for the President. A Sun-Sentinel examination concluded that counties using the best optical-scanning method recorded presidential votes on more than 99 percent of the ballots, and counties using the old punch-card devices counted presidential votes on only 96.1 percent of the ballots.

Fact two: Punch-card machines were more widely used in areas where low-income and African-American citizens vote. Two-thirds of the state's black voters reside in counties using punch cards, while 56 percent of white voters do.

Put these two undeniable facts together and the conclusion is inescapable: A statistically significant slice of the Florida electorate was disfranchised by voting technology. That is, a disproportionate number of voters done in by the error-prone punch-card machines were low-income and black Floridians, who generally favored Al Gore over George W. Bush. Presumably, some no-vote ballots actually did not include a vote for President. But given the closeness of the election--decided by .008 percent--it is likely that presidential votes missed by the punch-card machines would have decisively affected the contest. Bush "won"--among other reasons--because of voting-machine discrimination.

This crucial part of the tale has been overwhelmed by dimple-mania and the usual campaign back-and-forth. But ten days after the election, the Sun-Sentinel reported that "Florida's different vote-counting machines resulted in more GOP votes." For example, Brevard County, the home of space-shuttle launches, spent $1 million on more advanced machines in 1999, moving from punch-card tabulators to optical scanning machines that read pen-marked ballots (and that immediately return to the voter a ballot with a problem). Under the new system, the voting machines in this Bush-leaning county found presidential votes on 99.7 percent of the ballots. In 1996 the county's punch-card machines read presidential votes on 97.2 percent. Which means Bush, thanks to the upgrade, likely banked an additional 453 votes for his statewide total--practically his post-recount victory margin. The paper noted that the twenty-five counties that used the punch-card machines went for Gore over Bush 51.8 percent to 46 percent and produced 144,985 ballots with unrecorded presidential votes. Had the people who cast these ballots entered voting booths equipped with the more efficient machines, Gore no doubt would have collected hundreds--if not thousands--more votes than Bush.
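How big a difference could the machine gap plausibly have made? A rough, purely illustrative calculation can be run with the figures cited above; the sketch below is not the Sun-Sentinel's own method, and it assumes both that the unrecorded punch-card ballots would have split the way the recorded ones did and that optical scanners would have read the share of them implied by the difference in no-vote rates.

```python
# Back-of-envelope estimate using the figures quoted above; the splitting and
# recovery assumptions are illustrative, not drawn from the article.

unrecorded_punch_card_ballots = 144_985         # punch-card counties, 2000
gore_share, bush_share = 0.518, 0.46            # recorded vote in those counties

# Assume optical scanners would have read the fraction of punch-card no-votes
# implied by the 1.5% vs. 0.3% no-vote rates in the Times analysis.
recoverable_fraction = (0.015 - 0.003) / 0.015  # = 0.8

recovered_ballots = unrecorded_punch_card_ballots * recoverable_fraction
net_gore_gain = recovered_ballots * (gore_share - bush_share)

print(f"Ballots plausibly recovered: {recovered_ballots:,.0f}")        # ~116,000
print(f"Net Gore gain under these assumptions: {net_gore_gain:,.0f}")  # ~6,700 votes
```

Even under far more conservative assumptions, an estimate of this form lands well above a statewide margin measured in hundreds of votes, which is the point of the paragraph above.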

There have been allegations that black Floridians encountered racial intimidation at voting sites. (The Justice Department has initiated an informal assessment, not an investigation.) And Bush benefited from the all-too-routine bias by which minority areas receive poorer government services. Unfortunately not just for Gore but for the victims of this quiet bias in Florida, this inequity was unaddressed by the Florida circuit court and the US Supreme Court, partly because the Gore campaign didn't raise it.

The Gore legal challenge focused on 14,000 or so supposedly no-vote punch-card ballots in Miami-Dade and Palm Beach counties, not the statewide problem, and called for a manual review only of those ballots. The Veep's lawyer did not argue that the county-by-county patchwork voting system operated less effectively for blacks, a constituency that Democrats rely on to win elections. In his ruling against Gore, Circuit Judge N. Sanders Sauls noted that the record "shows error and/or less than total accuracy in regard to the punch-card voting devices utilized in Dade and Palm Beach Counties." But Sauls declared that Gore's legal team had not established "a reasonable probability" that the statewide results would turn out differently if those ballots were counted in a better fashion. Either Gore's attorneys screwed up big by not making this point more obvious--which they might have done had they filed contests based on the wider issue--or Sauls misread the math. As for the US Supreme Court, it displayed no eagerness to adjudicate such a touchy and fundamental voting-rights matter as systematic disfranchisement through technology. Its decision--in which it told the state Supreme Court to try again--indicates that the Court wanted to approach the Florida case narrowly, at least in the first go-round.

If a system is decisively skewed to one group's advantage, does that amount to theft? Or is that just the way it is? Clearly, a more equitable vote-counting system in the state--punch-cards for all or optical-scanners for all--would have yielded a different final count. This is an injustice that no court has confronted, on which Bush may well ride into the White House, and that should not be forgotten.

Could an American citizen be sentenced to jail simply for making a speech? If the speech is in defense of Pennsylvania death-row inmate Mumia Abu-Jamal and the speaker is an activist in the struggle to save Abu-Jamal from the executioner's needle, the answer may be yes. C. Clark Kissinger, head of Mumia-support activities of the New York City-based Refuse & Resist!, an organization that's leading the international campaign to gain a new trial for the former Black Panther, will find out on December 6, when he faces a probation-revocation hearing in federal court in Philadelphia.

Technically, the issue is probation violation, but the charge for which Kissinger was convicted this past April--failure to obey a lawful order (to move) during a sitdown protest at the Liberty Bell--led to his being fined $250 and placed on one year's probation with what his lawyer, Ron Kuby, calls the stiffest terms he's seen for such a minor violation. Those terms, which were also imposed on eight other protesters, include surrendering his passport, having to file income and expense reports for himself and his wife, providing a list of anyone he contacts who has committed a crime and having to get permission from his probation officer whenever he wants to leave New York City or Long Island. "You have to remember that Philadelphia is ground zero for the Mumia case," says Kuby. "This is clearly an attack on Mumia and on Mumia's supporters. It is aimed at preventing Clark and others from doing any support activities at all."

"I turned in my passport, and I report to my parole officer," says Kissinger, "but when it comes to First Amendment stuff, I have refused to cooperate." He says he has not complied with an order to avoid any contacts with felons, saying, "In my line of work, most of the people I see have been arrested for something!" Nor has he filed any financial information about his family, a requirement Kuby's office says is simply an effort to gain information on the operations and funding of Refuse & Resist! As for his travel restrictions, Kissinger says, "Whenever it's been a request for something personal, like visiting my sick mother in Massachusetts, it's granted by my parole officer, but whenever it's something political, he has referred it to the sentencing judge, Federal Magistrate Arnold Rapoport, and he's always refused me permission." That's what happened in August when Kissinger asked for permission to go to Philadelphia during the Republican National Convention to make a speech at an officially sanctioned Mumia demonstration. When no permission was forthcoming, Kissinger simply went and gave the speech. Shortly after that, his probation officer notified the magistrate, claiming Kissinger had violated the terms of his parole--thus setting in motion the hearing to have it revoked.

The move comes as other Mumia supporters have also been facing what they say is harassment. Several weeks ago, following a regular weekly protest on Philadelphia's Broad Street, Ernst Ford, one of the organizers, says he found himself being followed home by a police car. As he began unloading signs from his truck, he claims, one policeman approached him saying, "Mumia's gonna die, and so are you." Ford and the International Concerned Family and Friends of Mumia, sponsor of the protests, filed a complaint with the police, but so far they haven't heard back. The police department declined to comment, saying that "the investigation of the complaint has not been completed."

The international Mumia group is itself having problems, including facing a review of its charity registration by the state, which includes a request for ten years' worth of financial records--a review made more difficult thanks to a suspicious burglary of the organization's headquarters earlier this year. Computers and expensive stereo equipment were left untouched, but a drawer of financial papers was rifled.

A quarter-million people thronged Abraham Lincoln's Memorial that day. In the sweltering August humidity, executive secretary Roy Wilkins gravely announced that Dr. William Edward Burghardt Du Bois--NAACP founding father and "senior intellectual militant of his people"--had died in exile the day before.

It's easy to forget. What we now think of, monolithically, as the civil rights movement was at the time a splintering half-dozen special-interest groups in ill-coordinated pitched camps. Thurgood Marshall, never known for tact or political correctitude, called the Nation of Islam "a buncha thugs organized from prisons and financed, I'm sure, by some Arab Group." The NOI viewed the Urban League as a black front for a white agenda. A fringe figure gaining notoriety for his recent Playboy interview with an obscure journalist named Alex Haley, Malcolm X irreverently dismissed both "the farce on Washington" and the young minister just moments away from oratorical immortality, the Rev. Dr. Martin Luther King Jr., as "Bishop Chickenwings."

If the legacy of Du Bois's long life was unclear then, what can it all mean now? What possessed him to renounce the widely coveted citizenship for which those gathered there that day--inspired in part by his example--were marching? What can a scholarly biography of the patron saint of African-American intellectuals--written by a tenured professor for a prestigious publishing house, impatiently awaited by specialists and educated generalists alike--what can all this mean to 101 million eligible nonvoters "entirely ignorant of my work and quite indifferent to it," as Du Bois said in his time, much less to 30 million African-Americans beyond the Talented Tenth and those few old-timers in Harlem who remember Du Bois as being, mostly, a remarkably crotchety old man?

With these mixed feelings of pleasure, gratitude, frustration and momentous occasion, I read the monumentally ambitious sequel, seven years in the making, itself a National Book Award finalist, to David Levering Lewis's Pulitzer Prize-winning Biography of a Race, 1868-1919.

"I remember well," Du Bois wrote, famously, "when the shadow swept across me." He was born "a tangle of New England lineages"--Dutch, Bantu, French Huguenot--within living memory of the Fourteenth Amendment and The Communist Manifesto, one generation removed from slavery. And though he laid claim to both his African and European heritage, still it was a peculiar sensation. "One ever feels his two-ness--an American, a Negro; two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body, whose dogged strength alone keeps it from being torn asunder." Yet Du Bois knew full well that had he not felt, very early on, this double-consciousness, he might easily have become just another "unquestioning worshiper at the shrine of the established social order."

Willie D. charted his course as early as his teens, inaugurating his writing and public-speaking careers with articles in the Springfield Republican and a high school valedictory address on abolitionist Wendell Phillips. He arrived at the Harvard of Santayana and William James, who thought him easily among the most gifted of his students, already notorious for the "arrogant rectitude" others would resent all his life. He graduated cum laude, honing his prose with a rigorously liberal education in Latin, Greek, modern languages, literature, history and philosophy. But for a graduate student in sociology during the 1890s, Max Weber's Berlin, not Cambridge, was the place to be. And it was there, chain-smoking fluent German, celebrating both his 25th birthday and "his own genius," that W.E.B. Du Bois spelled out his life's ambition: "to make a name in science, to make a name in literature, to raise my race." Only because his scholarship ran out did Du Bois return to America for the consolation prize: Harvard's first African-American PhD.

Atlanta, after Europe and the North, came as a shock. Not that the recent lynching was in itself any great surprise. Du Bois simply wasn't prepared, passing by the local grocer, to see the souvenirs of severed fingers on display out front. Headquartered at Atlanta University, for the next twelve years he taught history and economics. By the time Frederick Douglass died in 1895, the Tuskegee model of black higher education was dominant, and Booker T. Washington its leading lobbyist. That same year Washington, whose power had been growing since 1885, had delivered his famous Atlanta Exposition speech: "In all things purely social," he said, holding up both hands, digits spread wide, "we can be as separate as the [five] fingers"--he paused dramatically, clenching each hand into a fist--"yet as the hand in all things essential to mutual progress." Convinced that Washington's appeasement had paved the way for Plessy v. Ferguson in 1896, Du Bois and other black intellectuals felt sold down the river. Du Bois's scathing review of Washington's Up From Slavery (1901), declaring war on merely vocational training of a "contented and industrious peasantry," was collected in The Souls of Black Folk (1903). Du Bois and Washington came, notoriously, to ideological blows. It was the beginning of the end for Booker T. Washington.

Yet there was no personal animus between them. Shrewdly, Washington tried to hire Du Bois away to Tuskegee, even taking him along on one of his fundraising junkets. But once at Andrew Carnegie's office, Washington--who knew where his bread was buttered and that Du Bois could be counted on not to keep his mouth shut--left him waiting downstairs. "Have you," Washington asked, "read Mr. Carnegie's book?" W.E.B. allowed he had not. "You ought to," said Booker T. "Mr. Carnegie likes it."

Around 1909, certain Niagara Movement radicals and Jewish abolitionist holdovers formed a coalition that became the NAACP. Du Bois moved to New York, where, as editor of The Crisis for the next twenty-five years, his word was gospel.

Meanwhile, Marcus Garvey addressed a Harlem crowd of 2,000 in 1917, preaching black economic independence and resettlement. He even offered, to the resurgent Klan's delight, to transport them back to Africa. Now, the masses might be fooled by the plumed and gold-braided pretensions and Napoleonic pageantry of the Emperor Marcus Mosiah Garvey--self-proclaimed High Potentate and Provisional President-General of all Africa, Commander in Chief of the Black Star Line, an entire fleet of three dubiously seaworthy vessels--with his back-to-the-motherland schemes, his dukes and duchesses of Uganda and Niger, his knight commanders of the Distinguished Order of Ethiopia and the Nile. But Du Bois, who had just returned from Firestone's Liberia as diplomatic envoy, knew better. (Besides, everybody who was anybody knew that what Garvey's Universal Negro Improvement Association really stood for was "Ugliest Negroes in America.") As far as Du Bois was concerned, Garvey was either a lunatic or a traitor. Whereas, it seemed to Garvey--who saw Du Bois's NAACP as the National Association for the Advancement of Certain People--that the lunacy was for blacks to expect equality in America. In the end, his daring, energy and charisma were surpassed only by his ignorance of finance. Du Bois sounded the rallying cry: "Garvey Must Go." The FBI agreed. And if deportation on the grounds of being an undesirable alien wouldn't hold up in court, mail fraud would do nicely. Arrested in 1922, tried and convicted in 1923, Garvey took up residence at Atlanta Federal two years before Malcolm X was born.

Remember, back before they were Jim Crowed into academic ghettos, when history was literature and vice versa? When nonspecialists read Macaulay, Michelet? Poet, short-story writer, essayist and novelist as well as historian, Du Bois was by no means master of all the genres he assayed. But he electrified African-American literature as writer during the twentieth century's first decade. Then, as editor, he paved the way for younger writers during subsequent decades. Biography, however, is a late development in the tradition. What advances have eminent African-Americans like David Levering Lewis made in that "most delicate and humane of all the branches of the art of writing"? And do his tomes amount to a "masterpiece of the biographer's craft"?

With their cast of legendary characters, colorful set locations, gripping storylines and virtuoso draftsmanship, they certainly aspire to it. For analytical rigor, judicious gossip and subtle insight into the social, political and economic "roots and ramifications" of "racial, religious, and ethnic confrontation, and assimilation in America" between Reconstruction and the civil rights movement, Lewis is fully equal to the task of his formidable subject. And his lucid, downright old-fashioned good writing, so full of fine flourishes and phrases, is mostly innocent of academic jargon. So much so that for years--visiting the same archives, examining the same documents and cross-examining the same witnesses while working my way carefully through these volumes, underlining passages in mechanical pencil, leaving yellow flags on every other page--I kept trying to figure out my misgivings.

And then it hit me. The problem here is not one of length--Boswell's massive Life of Samuel Johnson still startles, 200 years later--but one of scale, of Turgenev's "right relation" among a dozen or so vivifying narrative elements beyond character and what used to be called "plot." All of these together in a constant juggle of transitions--abstract to concrete, poetic to prosaic, description to dialogue, sentence length and rhythm--can create compelling momentum. Any one of these, overrelied upon in a fact-filled narrative of 1,500 pages, can be lethal. "With the 20th century," said Virginia Woolf,

a change came over biography, as it came over fiction and poetry.... the author's relation to his subject is different. He is no longer the serious and sympathetic companion, toiling slavishly in the footsteps of his hero.... Moreover, he does not think himself constrained to follow every step of the way.... he sees his subject spread about him. He chooses; he synthesizes; in short, he has ceased to be the chronicler; he has become an artist.

Cautious of overstepping the bounds of the historically permissible, the distinguished professor has crafted a straightforward chronicle. Far too often, characters are molded not organically from suggestive situation but by accretion of meticulous archival detail--endless lists of academic pedigree heaped, all at once, in static inventories of naturalistic description--then left to atrophy in the reader's mind. A compelling narrative begins where the dossier leaves off. And a good biographer is a historian, but a good historian isn't necessarily a biographer. The progression from one to the other is no more formally inevitable than that from short-story writer to novelist. But don't get me wrong. The aesthetic quibble is really by way of illustrating how close this life might have come to greatness, to the artistry of all that Lytton Strachey left out in tending toward that "becoming brevity...which excludes everything that is redundant and nothing that is significant," and which, "surely, is the first duty of the biographer."

Du Bois's influence on African-American literature, as both writer and editor, is hard to exaggerate. Between Phillis Wheatley, the publication of Souls, the silence of Charles Chesnutt and the death of Paul Laurence Dunbar from drunken disillusionment in 1906, dozens of poets, authors and pamphleteers emerged, boycotting the happy-blacky-nappy, banjo-strumming, watermelon-eating, darky dialect of previous eras. Of this work, says James Weldon Johnson in the classic history Black Manhattan, "Some was good, most was mediocre, much was bad, and practically all of it unknown to the general public." As late as 1914, with the exception of Johnson's Autobiography of an Ex-Colored Man, there wasn't much in the way of African-American literature, and Du Bois thought things looked bleak. By 1920, New York was America's greatest city, and Harlem--a two-square-mile city within the city where a quarter-million African-Americans boasted more poets, journalists, musicians, composers, actors, dramatists and nightclubs than any other spot on earth--became the world-famous capital of black America. It seemed to Du Bois that a renaissance of American Negro literature was now due.

His lover and literary editor, Jessie Fauset, urged an editorial shift in the pages of The Crisis to put the arts on equal footing with social policy. In short order, she published Langston Hughes's "The Negro Speaks of Rivers" in 1921 and prose poetry by Jean Toomer, later collected in Cane (1923). For the first time in history--just when Du Bois feared he'd have no worthy successors--a literature of African-Americans, by African-Americans and for African-Americans and anyone else who cared to listen was not only a possibility but a reality. The Harlem Renaissance was under way.

One prodigy Du Bois particularly delighted in was pinky-ringed young poet Countee Cullen. Companionable, uncombative, anxious for the kind of credibility a tidy résumé and Harvard degree could confer, Cullen idolized Du Bois to a degree perhaps predictable in a cautious orphan risen from impoverished obscurity to international fame by the age of 22 yet lacking, in the final analysis, the kind of intellectual and artistic daring that could sustain it. Du Bois, for his part, perhaps projected onto Cullen some of the paternal pride and ambition long buried with the infant son he'd loved and lost. And so Du Bois married off his only daughter, Yolande, to him. Langston Hughes rented a tuxedo, an organist played Tannhäuser and sixteen bridesmaids wore white. The only problem--aside from the fact that Countee Cullen was gay--was that the girl admired but didn't love him. It was a match made in Hell, a dramatic example of how "spectacularly wrongheaded" Du Bois could be.

For a decade or more, the Harlem Renaissance promised 10 million African-Americans "taken for granted by one political party and despised by the other, poor and overwhelmingly rural, frightened and disunited," the illusion of an era of freedom, justice and equality undreamed of since Reconstruction. To his immense credit, Du Bois was not lulled into submission, mistrusting the impulse toward "salon exotica" and a smattering of prizes for prodigies. Then as now, the means of production--the Hollywood studios, the recording studios, the theaters--were for the most part white-owned. As early as 1926, he warned about "the politics of patronage," insisting that African-Americans would get the art that they deserved--or were willing to pay for: "If a colored man wants to publish a book, he has to get a white publisher and a white newspaper to say it's great; and then [black people] say so." (Ain't a damn thang changed.) By 1934 it had become embarrassingly clear that civil rights would not follow logically from "forceful prose" and a demonstration of artistic excellence on the part of a few Ivy League Negroes. The movement was dead, "scuttled," as chief publicist Alain Locke put it, as much from within as from without, by faddish market swings and stock speculations of Zora Neale Hurston's Niggerati, on the one hand, and the liberal Negrotarians on the other.

For Du Bois, as for most African-Americans, the Depression hit harder and faster and lasted longer than for the country at large. The royal wedding had wiped out his savings, and his Crisis salary hadn't been paid for months. He was broke.

Du Bois became increasingly radicalized during the 1930s and '40s. As he saw it, the NAACP, by focusing almost exclusively on legal strategy, was beginning to work "for the black masses but not with them." In 1934, out of sync with the mainstream leadership, he left in disgust. He returned to Atlanta University, reading Das Kapital and writing Black Reconstruction in America (1935). Du Bois, who first visited the Soviet Union in 1926, returned in 1936. Home from History's frontlines a self-professed "Bolshevik," even though, as a Socialist, he combined "cultural nationalism, Scandinavian cooperativism, Booker Washington and Marx in about equal parts," Du Bois remained unconvinced that the Communist Party, which never attracted more than a few hundred black members, was their last best hope. In any case, African-Americans did not "propose to be the shock troops of the Communist Revolution."

During the McCarthy era, the black leadership, bending in the prevailing ideological winds, began to distance itself from the left. Back in New York, involved in nuclear disarmament activity declared subversive by the US government, Du Bois was arrested and tried as an unregistered agent of a foreign power. He was acquitted in 1951, but the State Department confiscated his passport, prohibiting travel abroad. It was the last straw.

The prophet was without honor only in his own country. So when the government embargo was lifted in 1958, Du Bois went on lecture tours of Eastern Europe and the Soviet Union, becoming a kind of poster boy in the Communist effort to discredit the States. He was awarded the Lenin Peace Prize in 1959, and in Red China, his birthday was declared a national holiday by Chou En-lai. Did the party use Du Bois? Or did Du Bois use the party to further his own agenda? Both, most likely.

In 1960 seventeen African states gained independence, following the lead of Kwame Nkrumah's Ghana in 1957. At Nkrumah's invitation, Du Bois exiled himself, renouncing his American citizenship. He officially joined the Communist Party in 1961. Shrunken now and a bit stooped, his memory not quite as sharp as it once was, the scholar-citizen spent his last days in a spacious house with a view of flowering shrubs in Accra's best neighborhood, an honored guest of state, surrounded by busts of Lenin and Chairman Mao and an impressive library of Marxist thought, editing the Negro encyclopedia and receiving visitors the world over. At last, on August 27, 1963, the visionary whose long life--spanning Reconstruction, Plessy v. Ferguson, two World Wars, Brown v. Board of Education and now the civil rights movement--had been the literal embodiment of the nineteenth century's collision with the twentieth, died in Accra, where he was accorded an elaborate state funeral.

The bioepic ends, as it began 1,500 pages ago in Volume I, with the death of W.E.B. Du Bois. A living institution, he was "productive, multiple, controversial, and emblematic." His influence--as cultural ambassador, as writer and editor, as activist whose spectrum of social, political and economic thought seems refracted in phenomena as varied as Ho Chi Minh, the Negritude of poet-statesmen Aimé Césaire and Léopold Senghor as well as the Black Power movement that peaked after his death--is ubiquitous.

A difficult man as capable of coldness to old friends as he was reluctant to admit mistakes, a prickly Brahmin who walked with kings but failed to acquire the common touch, Dr. Du Bois emerges a kind of tragic hero as flawed as he was gifted. At times you wonder whether he wasn't his own most formidable enemy. But whatever his blind spots, he was only too well aware, looking backward, that battling racism real and imagined at every turn had twisted him into a far less "human" being than he might otherwise have been.

Fifteen years and two computer crashes in the research and writing, these volumes were a lifetime, literally, in the making. As a boy born in Little Rock two decades before the civil rights movement began, Lewis had a portentous encounter with the great man. Fisk man and author of books on South Africa and the Dreyfus Affair, he's now a professor of history at Rutgers. And just as Renaissance scholarship would be incomplete without When Harlem Was in Vogue, the twenty books and 100 articles of W.E.B. Du Bois's eighty-year publishing career, so handsomely anthologized in Nathan Irvin Huggins's Library of America Writings, are indispensably complemented by what is, if not a masterpiece of biography, then almost certainly the standard social, political and intellectual history of his life and times.

To judge from magazine covers, the American divorce rate is either a disaster for children or no problem at all. First came the famous "Dan Quayle Was Right" article in The Atlantic in 1993, with a cover line that said divorce "dramatically weakens and undermines our society." Then, in 1998, Newsweek heralded The Nurture Assumption, whose author, Judith Rich Harris, argued that whether parents divorce makes little difference in children's lives because genetics and peer groups determine their problems. This September, Time featured The Unexpected Legacy of Divorce, a book whose authors, psychologists Judith Wallerstein and Julia Lewis and journalist Sandra Blakeslee, brought the gloomy news that a majority of children of divorce still suffer twenty-five years later. What's a reader to think?

The facts are not in dispute: The American divorce rate doubled in the 1960s and 1970s and has held steady or possibly declined a bit since then. At current rates, about half of all marriages will end in divorce. One million children experience a parental divorce every year. Most of them are upset in the immediate aftermath of the breakup. Some act out, others become withdrawn. Too often, fathers fail to provide adequate financial support and mothers and children see their standards of living drop. Without doubt, going through a divorce is a traumatic experience for parents and children alike.

But are most children harmed in the long term? On this question, recent media coverage has lurched between two extremes. At one end is the doomsday view that divorce sentences children to life at emotional hard labor. In this view, a parental divorce starts a chain of events that leaves most adult children anxious, unhappy and often unable to make a commitment to a partner. At the other end is the evolutionary psychologists' view that children's behavior is genetically programmed, so that whether parents divorce doesn't matter very much. According to this line of reasoning, divorce is just a flag that identifies genetically challenged families whose troubles would have occurred even if the parents had stayed together.

Stated this starkly, neither extreme seems convincing. Yet it is a sad fact of public debates about social problems that the extremes tend to capture everyone's attention. Magazines are sold and talk shows are fueled by the announcement that a particular problem is devastating American society and then by the news that--wait a minute--it's really not a big problem after all. There's little patience for discussions of problems that are serious but not calamitous. And yet the gravity of many social problems lies in the demilitarized zone between the extremes.

For example, consider teenage childbearing. It was initially declared a scourge. A leading researcher wrote famously in 1968, "The girl who has an illegitimate child at the age of 16 suddenly has 90 percent of her life's script written for her." More recently, however, some researchers and commentators have argued that most teenage mothers would not be better off had they delayed having children. Teenage childbearing, it is alleged, merely reflects growing up in disadvantaged circumstances. Poor teen mothers would still be poor even if they hadn't had their babies. While there is some merit to this argument, research suggests that having a baby as a teenager does add to the difficulties girls from disadvantaged backgrounds face.

Research on divorce also suggests that extreme views are inaccurate. But you wouldn't know it to read the latest report by Wallerstein and her colleagues on her long-term study of children of divorce. In 1971, she selected sixty families that had been referred by their attorneys and others to her marriage and divorce clinic in Marin County, California, shortly after the parents separated. Wallerstein kept in touch with the 131 children from these families. Her book on the first five years, Surviving the Breakup: How Children and Parents Cope with Divorce, written with Joan Berlin Kelly, contained insightful portraits of the difficulties the children faced as their parents struggled with the separation and its aftermath. Her book about how they were doing at the ten- and fifteen-year mark, Second Chances: Men, Women, and Children a Decade After Divorce, written with Sandra Blakeslee, became a bestseller. It chronicled the continuing problems that most of the children were having.

For her new book, she was able to talk to ninety-three of the children at the twenty-five-year mark. Her striking conclusion is that most of these individuals, now 33 years old on average, have suffered greatly in adulthood. A minority have managed to construct successful personal lives, but only with great effort. The legacy of divorce, it turns out, doesn't fade away:

Contrary to what we have long thought, the major impact of divorce does not occur during childhood or adolescence. Rather, it rises in adulthood as serious romantic relationships move center stage. When it comes time to choose a life mate and build a new family, the effects of divorce crescendo.

Young adults from divorced families, Wallerstein writes, lack the image of an intact marriage. Because they haven't had the chance to watch parents in successful marriages, they don't know how to have one. When it comes time to choose a partner or a spouse, their anxiety rises; they fear repeating the mistakes of their parents. Lacking a good model, they tend to make bad choices. (In the realm of work, in contrast, Wallerstein's subjects had no particular problems.)

A woman who took the role of caregiver to a distraught parent or to younger siblings while growing up, for instance, may choose a man who needs lots of caring in order to function. But she soon finds his neediness and dependency intolerable, and the relationship ends. Wallerstein writes of one such woman in her study:

She described how she would come home after work and find her partner lying on the couch, waiting for her to take charge. It was just like taking care of her mom. At that point, she realized she had to get out.

Young men, Wallerstein tells us, were wary of commitment because they were afraid their marriages would end as badly as their parents' had. Many avoided casual dating and led solitary lives. She tells the story of Larry, who after courting and living with Grace for seven years still could not bring himself to marry her. Not until she packed up and left in frustration did he agree. He told Wallerstein:

I realized I loved her and that she was important to me but I was unable to make a decision. I was afraid because of the divorce. I was afraid of being left and I think that is why I was afraid of making a commitment to her.

Other children in the study turned to alcohol, drugs and, particularly among girls, early sexual activity. Wallerstein writes that sexual promiscuity was a result of girls' feelings of abandonment by their fathers. Their low self-esteem, their craving for love and their wish to be noticed led them to seek sexual liaisons and sometimes to start ill-conceived partnerships and marriages.

Overall, we are told, close to half the women and over one-third of the men were able to establish successful personal lives by the twenty-five-year mark--but only after considerable pain and suffering, much anxiety about repeating the mistakes of their parents, many failed relationships and, for one-third, psychotherapy. The rest were still floundering. Only 60 percent had ever married, compared with about 80 percent among all adults at their ages. Moreover, only one-third had children, as if they were afraid of doing to children what had been done to them.

Without doubt, a disturbing picture. And what makes it even more disturbing is Wallerstein's claim that her subjects are more or less representative of the typical American middle-class family that undergoes a divorce. Her families were carefully screened, she assures us, so that the children were doing "reasonably well" at school and had been developmentally "on target" before the divorce. Nor were the families especially troubled before the breakup, she says. "Naturally," Wallerstein writes, "I wanted to be sure that any problems we saw did not predate the divorce. Neither they nor their parents were ever my patients."

This claim to have a sample of typical, not unduly troubled families is, however, contradicted by the extensive psychological problems that the parents displayed when they were assessed at the initial interview. But you won't find that information in this book or the previous one. Only in the appendix to her first book, Surviving the Breakup, in 1980, does Wallerstein discuss the parents' mental states. There we learn the startling information that 50 percent of the fathers and close to half the mothers were "moderately disturbed or frequently incapacitated by disabling neuroses or addictions" when the study started:

Here were the chronically depressed, sometimes suicidal individuals, the men and women with severe neurotic difficulties or with handicaps in relating to another person, or those with long-standing problems in controlling their rage or sexual impulses.

And that's not all: An additional 15 percent of the fathers and 20 percent of the mothers were found to be "severely troubled during their marriages." These people "had histories of mental illness including paranoid thinking, bizarre behavior, manic-depressive illnesses, and generally fragile or unsuccessful attempts to cope with the demands of life, marriage, and family."

Typical American middle-class families? Hardly. These were by and large troubled families of the kind one might expect to come to a divorce clinic for therapy. Why this information was excluded from the nine-page appendix on the research sample in the new book--why an interested reader can only find it buried in the appendix of a book written twenty years ago--is puzzling. Does Wallerstein now consider this information to be in error? Irrelevant? Or just embarrassing?

The problem for Wallerstein is that troubled families often produce troubled children, whether or not the parents divorce. So it may be a considerable overstatement to blame the divorce and its aftermath for nearly all the problems she saw among her children over the twenty-five years. In a study of the records of several thousand British children who were followed from birth to age 33, Lindsay Chase-Lansdale, Christine McRae Battle and I found that children whose parents would later divorce already showed more emotional problems at age 7 than children from families that would remain together. The gap widened as the divorces occurred and the children reached adulthood, suggesting that divorce did have a detrimental long-term effect on some of them. But a large share of the gap preceded the divorces and might have appeared even had the parents stayed together.

Sensitive to the particularities of her sample, Wallerstein recruited a "comparison sample" of adults from nondivorced families. The comparison sample, we are told, was selected to match the socioeconomic level of the families in the study. In many respects, the individuals in the comparison group were doing better than the study's children, which Wallerstein presents as evidence that divorce really is the cause of the difficulties in the latter group. But since the comparison sample presumably was not matched on the parents' chronic depression, suicidal tendencies, problems in controlling rage, bizarre behavior and manic-depressive illness, their inclusion does not prove Wallerstein's case.

What, then, can we take from Wallerstein's study? It is an insightful, long-term investigation of the lives of children from troubled divorced families. It gives us valuable information on what happens to children when things go wrong before and after a divorce. And things sometimes do go wrong: Many divorcing parents face the kinds of difficulties that Wallerstein saw in her families. Her basic point that divorce can have effects that last into adulthood, or even peak in adulthood, is valid. She was one of the first people to write about children who seemed fine in the short term but experienced emotional difficulties in adolescence or young adulthood--in her previous book she called this the "sleeper effect"--and now she is the first to describe it in detail among adults who have reached their 30s. Psychotherapists, social workers, teachers and other professionals who see troubled children of divorce and their parents will find her analyses instructive. Parents and children who are struggling with divorce-related problems will find her analyses helpful.

But no one should believe that the negative effects of divorce are as widespread as Wallerstein claims. Some portion of what she labels as the effects of divorce on children probably wasn't connected to the divorce. And the typical family that experiences divorce won't have as tough a time as Wallerstein's families did. Parents with better mental health than this heavily impaired sample can more easily avoid the worst of the anger, anxiety and depression that come with divorce. They are better able to maintain the daily routines of their children's home and school lives. Their children can more easily avoid the extremes of anxiety and self-doubt that plague Wallerstein's children when they reach adulthood.

What divorce does to children is to raise the risk of serious long-term problems, such as severe anxiety or depression, having a child as a teenager or failing to graduate from high school. But the risk is still low enough that most children in divorced families don't have these problems. In the British study, we found that although divorce raised the risk of emotional problems in young adulthood by 31 percent, the vast majority of children from divorced families did not show evidence of serious emotional problems as young adults.
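It is worth pausing over what a 31 percent increase in risk means in absolute terms, since the argument turns on that distinction. The arithmetic below is a hypothetical illustration only; the 10 percent baseline is an assumed figure, not one reported from the British study.

```python
# Hypothetical relative-vs-absolute risk arithmetic; the baseline rate is assumed.
baseline_risk = 0.10        # assumed rate of serious emotional problems among
                            # young adults whose parents stayed together
relative_increase = 0.31    # the 31 percent increase associated with divorce

risk_after_divorce = baseline_risk * (1 + relative_increase)   # 13.1%
share_without_problems = 1 - risk_after_divorce                # 86.9%

print(f"Risk for children of divorce: {risk_after_divorce:.1%}")
print(f"Share showing no serious problems: {share_without_problems:.1%}")
```

Seen this way, even a sizable relative increase leaves the great majority of children of divorce without serious long-term problems, which is exactly the pattern described above.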

Except for Wallerstein, many of the writers most concerned about divorce now appear to recognize this distinction. Barbara Dafoe Whitehead, who wrote the "Dan Quayle Was Right" piece in The Atlantic (drawing heavily on Wallerstein's earlier work), acknowledged in a more recent, book-length treatment, The Divorce Culture, that a majority of children probably aren't seriously harmed in the long term. But she argued that even if only a minority of children are harmed, divorce is so common that a "minority" is still a lot. And she is correct. Divorce is not a problem that "dramatically weakens and undermines our society," but it nevertheless deserves our attention.

For that reason, some of the remedies Wallerstein suggests would be useful: creating more support groups in schools for children whose parents are divorcing, insuring that divorced fathers contribute to the cost of their children's college education and educating newly separated parents about how to shield their children from conflict. Measures such as these would help some children without imposing undue strain on parents, schools or the courts.

Less clearly useful is Wallerstein's recommendation that parents in unhappy, loveless, but low-conflict marriages consider staying together for the sake of their children. I think she is probably right that children can develop adequately in "good enough" marriages that limp along without an inner life of love and companionship. There were millions of these marriages during the baby-boom years of the 1950s, when wives weren't supposed to work and women were forced to choose between having a career and being a mother. The result was often frustration and depression. Few people (not even Wallerstein) want to constrain women's choices again. Certainly, unhappy parents have an obligation to try hard to change an unsuccessful marriage before scuttling it. Without doubt some parents resort to divorce too hastily. But no one as yet has a formula that can tell parents how much pain they must bear, how much conflict to endure, before ending a marriage becomes the better alternative for themselves and their children.

Least defensible is the attempt by Wallerstein to inform readers whose parents have divorced that their problems with intimacy stem from the breakup. In high self-help style, Wallerstein tells her readers:

You were a little child when your parents broke up, and it frightened you badly, more than you have ever acknowledged.... When one parent left, you felt like there was nothing you could ever rely on. And you said to yourself that you would never open yourself to the same kinds of risks. You would stay away from loving. Or you only get involved with people you don't care about so you won't get hurt. Either way, you don't love and you don't commit.

And so forth. Wallerstein plants the seed of the pernicious effect of exposure to divorce as a young child--and then waters it. Yes, the reader thinks, that must be why I'm so anxious about getting married. Never mind that making a commitment to marry someone is anxiety-producing for young adults from any background. Or that we live in an era when the average person waits four to five years longer to marry than was the case a half-century ago. Wallerstein encourages readers to believe that most of their commitment problems stem from their parents' divorces. But parental divorce isn't that powerful, and its effects aren't that pervasive. To be sure, it raises the chances that children will run into problems in adulthood, but most of them don't. Unfortunately, that's a cover line that doesn't sell many magazines.
