The Confidence Game

How Silicon Valley broke the economy.

One of Apple cofounder Steve Jobs’s most audacious marketing triumphs is rarely mentioned in the paeans to his genius that remain a staple of business content farms. In 1982, Jobs offered to donate a computer to every K–12 school in America, provided Congress passed a bill giving Apple substantial tax write-offs for the donations. When he arrived in Washington, DC, to lobby for what became known as the Apple Bill, the 28-year-old CEO looked “more like a summer intern than the head of a $600-million-a-year corporation,” according to The Washington Post, but he already showed signs of his famous arrogance. He barraged the legislators with white papers and proclaimed that they “would be crazy not to take us up on this.” Jobs knew the strength of his hand: A mania for computer literacy was sweeping the nation as an answer to the competitive threats of globalization and the reescalation of the Cold War’s technology and space races. Yet even as preparing students for the Information Age became a national priority, the Reagan era’s budget cuts meant that few schools could afford a brand-new $2,400 Apple II computer.

The Apple Bill passed the House overwhelmingly but then died in the Senate after a bureaucratic snafu for which Jobs forever blamed Republican Senator Bob Dole of Kansas, then chair of the Finance Committee. Yet all was not lost: A similar bill passed in California, and Apple flooded its home state with almost 10,000 computers. Apple’s success in California gave it a leg up in the lucrative education market as states around the country began to computerize their classrooms. But education was not radically transformed, unless you count a spike in The Oregon Trail–related deaths from dysentery. If anything, those who have studied the rapid introduction of computers into classrooms in the 1980s and ’90s tend to conclude that it exacerbated inequities. Elite students and schools zoomed smoothly into cyberspace, while poorer schools fell further behind, bogged down by a lack of training and resources.

A young, charismatic geek hawks his wares using bold promises of social progress but actually makes things worse and gets extremely rich in the process—today it is easy to see the story of the Apple Bill as a stand-in for the history of the digital revolution as a whole. The growing concern about the role that technology plays in our lives and society is fueled in no small part by a dawning realization that we have been duped. We were told that computerizing everything would lead to greater prosperity, personal empowerment, collective understanding, even the ability to transcend the limits of the physical realm and create a big, beautiful global brain made out of electrons. Instead, our extreme dependence on technology seems to have mainly enriched and empowered a handful of tech companies at the expense of everyone else. The panic over Facebook’s impact on democracy sparked by Donald Trump’s election in a haze of fake news and Russian bots felt like the national version of the personal anxiety that seizes many of us when we find ourselves tearing our eyes away from our phone for what seems like the 1,000th time in an hour and contemplating how our lives are being stolen by a screen. We are stuck in a really bad system.

This realization has led to a justifiable anger and derision aimed at the architects of this system. Silicon Valley executives and engineers are taken to task every week in the op-ed pages of our largest newspapers. We are told that their irresponsibility and greed have undermined our freedom and degraded our democratic institutions. While it is gratifying to see tech billionaires get a (very small) portion of their comeuppance, we often forget that until very recently, Silicon Valley was hailed by almost everyone as creating the path toward a brilliant future. Perhaps we should pause and contemplate how this situation came to be, lest we make the same mistakes again. The story of how Silicon Valley ended up at the center of the American dream in the late 20th and early 21st centuries, as well as the ambiguous reality behind its own techno-utopian dreams, is the subject of Margaret O’Mara’s sweeping new history, The Code: Silicon Valley and the Remaking of America. In it, she puts Silicon Valley into the context of a larger story about postwar America’s economic and social transformations, highlighting its connections with the mainstream rather than the cultural quirks and business practices that set it apart. The Code urges us to consider Silicon Valley’s shortcomings as America’s shortcomings, even if it fails to interrogate them as deeply as our current crisis—and the role that technology played in bringing it about—seems to warrant.

Silicon Valley entered the public consciousness in the 1970s as something of a charmed place. The first recorded mention of Silicon Valley came in 1971, when the journalist Don Hoefler used the phrase in the trade paper Electronic News to describe the region’s semiconductor industry, which was booming despite the economic doldrums that had descended on most of the country. As the Rust Belt foundered and Detroit crumbled, Silicon Valley soared to heights barely conveyed by the metrics that O’Mara rattles off in the opening pages of The Code: “Three billion smartphones. Two billion social media users. Two trillion-dollar companies” and “the richest people in the history of humanity.” Many people have attempted to divine the secret of Silicon Valley’s success. The consensus became that the Valley had pioneered a form of quicksilver entrepreneurialism perfectly suited to the Information Age. It was fast, flexible, meritocratic, and open to new ways of doing things. It allowed brilliant young people to turn crazy ideas into world-changing companies practically overnight. Silicon Valley came to represent the innovative power of capitalism freed from the clutches of uptight men in midcentury business suits, bestowed upon the masses by a new, appealing folk hero: the cherub-faced start-up founder hacking away in his dorm room.

The Code both bolsters and revises this story. On the one hand, O’Mara, a historian at the University of Washington, is clearly enamored with tales of entrepreneurial derring-do. From the “traitorous eight” who broke dramatically from the Shockley Semiconductor Laboratory in 1957 to start Fairchild Semiconductor and create the modern silicon transistor to the well-documented story of Facebook’s founding, the major milestones of Silicon Valley history are told in heroic terms that can seem gratingly out of touch, given what we know about how it all turned out. In her portrayal of Silicon Valley’s tech titans, O’Mara emphasizes virtuous qualities like determination, ingenuity, and humanistic concern, while hints of darker motives are studiously ignored. We learn that a “visionary and relentless” Jeff Bezos continued to drive a beat-up Honda Accord even as he became a billionaire, but his reported remark to an Amazon sales team that they ought to treat small publishers the way a lion treats a sickly gazelle is apparently not deemed worthy of the historical record.

But at the same time, O’Mara helps us understand why Silicon Valley’s economic dominance can’t be chalked up solely to the grit and smarts of entrepreneurs battling it out in the free market. She shows how, at every stage of its development, the booming tech industry was aided and abetted by a wide swath of American society both inside and outside the Valley: Marketing gurus shaped the tech companies’ images, educators evangelized for technology in schools, and best-selling futurists preached personalized tech as a means toward personal liberation. What emerges in The Code is less the story of a tribe of misfits working against the grain than the simultaneous alignment of the country’s political, cultural, and technical elites around the view that Silicon Valley held the key to the future.

Above all, O’Mara highlights the profound role that the US government played in Silicon Valley’s rise. At the end of World War II, the region was still the sleepy, sun-drenched Santa Clara Valley, home to farms and orchards, an upstart Stanford University, and a scattering of small electronics and aerospace firms. Then came the space and arms races, given new urgency in 1957 with the launch of Sputnik, which suggested a serious Soviet advantage. Millions of dollars in government funding flooded technology companies and universities around the country. An outsize portion went to Northern California’s burgeoning tech industry, thanks in large part to Stanford’s far-sighted provost Frederick Terman, who reshaped the university into a hub for engineering and the applied sciences.

Stanford and the surrounding area became a hive of government R&D during these years, as IBM and Lockheed opened local outposts and the first native start-ups got off the ground. While these early companies relied on what O’Mara calls the Valley’s “ecosystem” of fresh-faced engineers seeking freedom and sunshine in California, venture capitalists sniffing out a profitable new industry, and lawyers, construction companies, and real estate agents eager to serve their somewhat quirky needs, she makes it clear that the lifeblood pumping through it all was government money. Fairchild Semiconductor’s biggest clients for its new silicon chips were NASA, which put them in the Apollo rockets, and the Defense Department, which stuck them in Minuteman nuclear missiles. The brains of all of today’s devices have their origin in the United States’ drive to defeat the Soviet Union in the Cold War.

But the role of public funding in the creation of Silicon Valley is not the big-government success story that a good liberal might be tempted to see in it. As O’Mara points out, during the Cold War American leaders deliberately pushed public funds to private industry rather than government programs because they thought the market was the best way to spur technological progress while avoiding the specter of centralized planning, which had come to smack of communist tyranny. In the years that followed, this belief in the market as the means to achieve the goals of liberal democracy spread to nearly every aspect of life and society, from public education and health care to social justice, solidifying into the creed we now call neoliberalism. As the role of the state was eclipsed by the market, Silicon Valley—full of brilliant entrepreneurs devising technologies that promised to revolutionize everything they touched—was well positioned to step into the void.

The earliest start-up founders hardly seemed eager to assume the mantle of social visionary that their successors, today’s flashy celebrity technologists, happily take up. They were buttoned-down engineers who reflected the cool practicality of their major government and corporate clients. As the 1960s wore on, they were increasingly out of touch. Amid the tumult of the civil rights movement and the protests against the Vietnam War, the major concern in Silicon Valley’s manicured technology parks was a Johnson-era drop in military spending. The relatively few techies who were political at the time were conservative.

Things started to change in the 1970s. The ’60s made a belated arrival in the Valley as a younger generation of geeks steeped in countercultural values began to apply them to the development of computer technology. The weight of Silicon Valley’s culture shifted from the conservative suits to long-haired techno-utopians with dreams of radically reorganizing society through technology. This shift was perhaps best embodied by Lee Felsenstein, a former self-described “child radical” who cut his teeth running communications operations for anti-war and civil rights protests before going on to develop the Tom Swift Terminal, one of the earliest personal computers. Felsenstein believed that giving everyday people access to computers could liberate them from the crushing hierarchy of modern industrial society by breaking the monopoly on information held by corporations and government bureaucracies. “To change the rules, change the tools,” he liked to say. Whereas Silicon Valley had traditionally developed tools for the Man, these techies wanted to make tools to undermine him. They created a loose-knit network of hobbyist groups, drop-in computer centers, and DIY publications to share knowledge and work toward the ideal of personal liberation through technology. Their dreams seemed increasingly achievable as computers shrank from massive, room-filling mainframes to the smaller-room-filling minicomputers to, finally, in 1975, the first commercially viable personal computer, the Altair.

Yet as O’Mara shows, the techno-utopians did not ultimately constitute such a radical break from the past. While their calls to democratize computing may have echoed Marxist cries to seize the means of production, most were capitalists at heart. To advance the personal computer “revolution,” they founded start-ups, trade magazines, and business forums, relying on funding from venture capital firms that often had roots in the old-money elite. Jobs became the most celebrated entrepreneur of the era by embodying the discordant figures of both the cowboy capitalist and the touchy-feely hippie, an image crafted in large part by the marketing guru Regis McKenna. Silicon Valley soon became an industry that looked a lot like those that had come before. It was nearly as white and male as they were. Its engineers worked soul-crushing hours and blew off steam with boozy pool parties. And its most successful company, Microsoft, clawed its way to the top through ruthless monopolistic tactics.

Perhaps the strongest case against the supposed subversiveness of the personal computer pioneers is how quickly they were embraced by those in power. As profits rose and spectacular IPOs seized headlines throughout the 1980s, Silicon Valley was championed by the rising stars of supply-side economics, who hitched their drive for tax cuts and deregulation to tech’s venture-capital-fueled rocket ship. The groundwork was laid in 1978, when the Valley’s venture capitalists formed an alliance with the Republicans to kill then-President Jimmy Carter’s proposed increase in the capital gains tax. They beta-tested Reaganomics by advancing the dubious argument that millionaires’ making slightly less money on their investments might stifle technological innovation by limiting the supply of capital available to start-ups. And they carried the day.

As president, Ronald Reagan doubled down with tax cuts and wild technophilia. In a truly trippy speech to students at Moscow State University in 1988, he hailed the transcendent possibilities of the new economy epitomized by Silicon Valley, predicting a future in which “human innovation increasingly makes physical resources obsolete.” Meanwhile, the market-friendly New Democrats embraced the tech industry so enthusiastically that they became known, to their chagrin, as Atari Democrats. The media turned Silicon Valley entrepreneurs into international celebrities with flattering profiles and cover stories—living proof that the mix of technological innovation, risk taking, corporate social responsibility, and lack of regulation that defined Silicon Valley in the popular imagination was the template for unending growth and prosperity, even in an era of deindustrialization and globalization.

The near-universal celebration of Silicon Valley as an avatar of free-market capitalism in the 1980s helped ensure that the market would guide the Internet’s development in the 1990s, as it became the cutting-edge technology that promised to change everything. The Internet began as an academic resource, first as ARPANET, funded and overseen by the Department of Defense, and later as the National Science Foundation’s NSFNET. And while Al Gore didn’t invent the Internet, he did spearhead the push to privatize it: As the Clinton administration’s “technology czar,” he helped develop its landmark National Information Infrastructure (NII) plan, which emphasized the role of private industry and the importance of telecommunications deregulation in constructing America’s “information superhighway.” Not surprisingly, Gore would later do a little-known turn as a venture capitalist with the prestigious Valley firm Kleiner Perkins, becoming very wealthy in the process. In response to his NII plan, the advocacy group Computer Professionals for Social Responsibility warned of a possible corporate takeover of the Internet. “An imaginative view of the risks of an NII designed without sufficient attention to public-interest needs can be found in the modern genre of dystopian fiction known as ‘cyberpunk,’” they wrote. “Cyberpunk novelists depict a world in which a handful of multinational corporations have seized control, not only of the physical world, but of the virtual world of cyberspace.” Who can deny that today’s commercial Internet has largely fulfilled this cyberpunk nightmare? Someone should ask Gore what he thinks.

Despite offering evidence to the contrary, O’Mara narrates her tale of Silicon Valley’s rise as, ultimately, a success story. At the end of the book, we see it as the envy of other states around the country and other countries around the world, an “exuberantly capitalist, slightly anarchic tech ecosystem that had evolved over several generations.” Throughout the book, she highlights the many issues that have sparked increasing public consternation with Big Tech of late, from its lack of diversity to its stupendous concentration of wealth, but these are framed in the end as unfortunate side effects of the headlong rush to create a new and brilliant future. She hardly mentions the revelations by the National Security Agency whistle-blower Edward Snowden of the US government’s chilling capacity to siphon users’ most intimate information from Silicon Valley’s platforms and the voraciousness with which it has done so. Nor does she grapple with Uber, which built its multibillion-dollar leviathan on the backs of meagerly paid drivers. The fact that in order to carry out almost anything online we must subject ourselves to a hypercommodified hellscape of targeted advertising and algorithmic sorting does not appear to be a huge cause for concern. But these and many other aspects of our digital landscape have made me wonder whether a technological complex born out of Cold War militarism and mainstreamed in a free-market frenzy might not be fundamentally at odds with human flourishing. O’Mara suggests at the end of her book that Silicon Valley’s flaws might be redeemed by a new, more enlightened, and more diverse generation of techies. But haven’t we heard this story before?

If there is a larger lesson to learn from The Code, it is that technology cannot be separated from the social and political contexts in which it is created. The major currents in society shape and guide the creation of a system that appears to spring from the minds of its inventors alone. Militarism and unbridled capitalism remain among the most powerful forces in the United States, and to my mind, there is no reason to believe that a new generation of techies might resist them any more effectively than the previous ones. The question of fixing Silicon Valley is inseparable from the question of fixing the system of postwar American capitalism, of which it is perhaps the purest expression. Some believe that the problems we see are bugs that might be fixed with a patch. Others think the code is so bad at its core that a radical rewrite is the only answer. Although The Code was written for people in the first group, it offers an important lesson for those of us in the second: Silicon Valley is as much a symptom as it is a cause of our current crisis. Resisting its bad influence on society will ultimately prove meaningless if we cannot also formulate a vision of a better world—one with a more humane relationship to technology—to counteract it. And, alas, there is no app for that.
