Twenty-eighteen was the year of living more dangerously than we knew. Climate change made its way from the op-ed pages of the American press to the news section. The poles heated up so fast that the ice sheets that cool the planet by reflecting the sun’s rays began to melt, shifting the rate of warming into overdrive. Polar bears became a threatened species. The temperature in Ouargla, Algeria, reached 124.3 degrees Fahrenheit, the highest reliable reading ever recorded in Africa. Japan sweltered, while Europe froze, and then caught fire. A 165 mph mega-typhoon struck the Philippines. In America, much of Northern California burned, and hurricanes of unprecedented strength barreled into both coasts. Meanwhile, the year’s greenhouse-gas emissions reached record heights, accelerating like a “speeding freight train,” according to the Global Carbon Project. The UN’s Intergovernmental Panel on Climate Change concluded that an effort of a magnitude that has “no documented historic precedent” will be necessary to avoid food shortages, wildfires, and worse. Will humans go the way of the dodo?
In the face of overwhelming evidence that our planet is rapidly getting hotter, our brilliant president famously dismissed climate change as a Chinese hoax, announced plans to withdraw from the Paris climate accord, and, late in 2018, rejected a report issued by 13 of his own federal agencies concluding that the effects of climate change will likely shrink the gross domestic product of the United States by 10 percent before the end of this century, and that in half that time, by 2050, crop yields will have dropped to levels not seen since the 1980s. He simply said, “I don’t believe it.” Neither, apparently, do millions of Americans. As recently as the middle of last year, Gallup polls reported that no more than 3 percent of respondents elevated the environment to the level of “the most important problem facing the country.” The good news is that two later surveys conducted after the soon-to-be not-so-freakish weather that characterized 2018 reveal a dramatic change: Seven in 10 Americans now believe that our planet is warming. Still, 48 percent of Republicans agree with Trump, and the question remains: Why did it take so long for the rest of the country to come around?
Lots of factors, of course, affect what we choose to believe. One is the power of power. The old saw tells us that “History belongs to the victors.” Now we see that the victors lay claim to “reality” as well as history. An unnamed official in the George W. Bush administration, widely believed to be Karl Rove, disparaged what he called the “reality-based community” for its naïveté. He said, “We’re an empire now, and when we act, we create our own reality.” Trump merely crossed out “empire” and substituted his own name. When he acts, in other words, he creates his own reality.
Another factor, one that is regularly ignored, is popular culture—movies, TV shows, games—many of which deride science, facts, and reason. There has always been a wealth of anecdotal evidence attesting to the power of stories to mold our opinions. Many people, for example, grow up getting their history lessons from movies, believing that, say, the X-Men resolved the Cuban missile crisis (X-Men: First Class), or that Pocahontas fell in love with John Smith (she was a child when they met), or that Britain’s King George VI stuttered so badly that he needed an Oscar-level effort to overcome it (The King’s Speech).
Now there are cognitive studies that confirm what we already knew: Stories have an outsize impact on belief, often more than logic does. As former Hollywood Reporter editor in chief Janice Min told The New York Times, “Algorithmic behavior existed long before there was an algorithm.” Explaining why the tabloids persist in reporting that Brad Pitt and Jennifer Aniston had babies—two dozen in all, according to media reporter Jim Rutenberg—she continued, “Whether politics or celebrity, narratives are what matter to people—the details often don’t.” Cut to The Apprentice, which constructed a flattering, if fictional, persona for Trump, turning the bankruptcy king into a savvy developer. Because it was, like Keeping Up With the Kardashians, a “reality” show, it conveyed an air of “truthiness,” Stephen Colbert’s neologism signifying an appearance, as opposed to the reality, of truth. This became the template for Trump’s political career.
As has often been remarked, disdaining facts throws the door open to all manner of alternative realities. Although astrophysicists as prominent as Stephen Hawking and Neil deGrasse Tyson have argued that multiverses are possible, so far none have shown up at our door, and there is little to no evidence that they exist. That hasn’t stopped our fictions from featuring them. Most recently, Spider-Man: Into the Spider-Verse is packed with Spider-Man variants from multiple dimensions, and in the last year or two, surplus worlds have crowded ours in 2016’s Westworld, Stranger Things, Falling Water, and The OA, as well as 2017’s Legion and 2018’s Counterpart.
In Maniac (2018), on Netflix, Owen, an institutionalized schizophrenic, complains that he sees things that aren’t there, but to his putative girlfriend, Annie, multiple realities are just fine. She says something like, “So what? People see aliens, ghosts all the time.” Annie dresses him in street clothes so she can break him out, saying, “I saw it in a movie and it works.” In other words, any reality, any universe will do, including those in movies.
Many of these shows are overrun with supernaturals like zombies, vampires, and witches. To one degree or another, they are entertaining enough, but in the political universe, analogous fantasies can become toxic, the stuff of conspiracy theories like the fake International Space Station, ancient aliens, Pizzagate, and “crisis actors,” all of which can be justified as “alternative facts,” in the immortal words of presidential counselor Kellyanne Conway.
It wasn’t always like this. Prior to World War II, we had the mad scientists transgressing the boundaries of permissible research, but their dangerous tinkering was confined to B-movies at the bottom of double bills. After the war, these Dr. Frankensteins gradually disappeared, replaced by calm, confident authority figures in white lab coats. Radiation from nuclear testing may have given rise to a variety of giant bugs—flies, ants, spiders, wasps—in 1950s sci-fi, but if scientists created them, scientists defeated them as well. Indeed, from the end of World War II through the Obama administration, mainstream culture worshipped at the altar of science. And why not? After all, science had given us the A-bomb that ended World War II, followed by the H-bomb that seemed to guarantee American hegemony for the foreseeable future. The Salk vaccine erased the terrifying scourge of polio, DDT killed the little buggers that were helping themselves to our crops, while Watson and Crick unlocked the secrets of heredity by revealing the structure of the double helix—all in quick succession.
Hand in hand, science and technology were the twin agents of progress, marching in lockstep toward the brave new world of the future that we were assured would be better than the present, just as the present was better than the past. The postwar faith in progress was epitomized by the 1964 New York World’s Fair, a futuristic extravaganza that beguiled visitors with sugarplums of endless improvement, like Eisenhower’s “Atoms for Peace” program.
The fair pops up with surprising regularity in today’s shows, a totem of sorts for the technological utopia that never happened. George R.R. Martin, who wrote A Song of Ice and Fire, the series on which Game of Thrones is based, remembers, “When I was a kid you’d visit the Carousel of Progress at the World’s Fair, and you could see all the amazing things the future had in store for us: robots and flying cars, and so on. Everyone thought life was getting better and better.” Iron Man 2 (2010) contains a scene in which Tony Stark’s father, launching a Stark Industries expo modeled on the fair, extols progress: “Technology holds infinite possibilities for mankind and will one day rid society of all its ills.” In Brad Bird’s Tomorrowland (2015), we actually visit the fair and are dazzled by its wonders.
Today, mainstream culture still worships science and technology. Scientists are good guys in Steven Spielberg’s wildly popular Jurassic Park franchise, as is Spock in the Star Trek series. Ditto federal agencies like the National Aeronautics and Space Administration (NASA) and the Centers for Disease Control and Prevention (CDC). In Steven Soderbergh’s expertly executed 2011 film Contagion, the CDC is more than up to the task of quelling a plague that threatens humanity. Two years later, in World War Z, it was the UN’s World Health Organization mobilizing virologists all over the globe to stem a zombie plague. And in 2016’s Hidden Figures, set at the height of the space race with the Soviets, it’s NASA that puts an American into orbit by facing up to its racism and sexism.
America’s infatuation with test tubes and beakers was, however, marbled by a vein of distinct unease. The Bomb was the very definition of a double-edged sword. It could easily have—and still might—destroy us all. Few Americans appreciated living on the edge of nuclear annihilation. Science and the technologies to which it gives rise are hard to untangle, but it was primarily technology that took the hit. It was one thing to split the atom—an exciting scientific adventure—but the Bomb was something else.
The anxiety occasioned by the arms race just wouldn’t go away. It was evident in the crop of pre- and post-apocalyptic anti-nuke movies released in the late 1950s and early ’60s. Far and away the best of the bunch was Stanley Kubrick’s Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, released in 1964, the same year as the World’s Fair. It features a Soviet “doomsday device” set to respond automatically to an American nuclear strike, so that humans are powerless to stop it, even in the event of a false alarm. In other words, we ceded responsibility for the survival of the species to machines. As Kubrick remarked at the time, “There is an almost total preoccupation with a technical solution to the problem of the bomb. Our theme is that there is no technical solution.”
Nineteen sixty-four was also the eve of the American intervention in Vietnam. “When the Vietnam War happened,” Martin continues, “we discovered that some of these technological things had nasty snaps at the end, like pollution and global warming and the hole in the ozone layer. People lost their faith in progress.”
The backlash against technology informed an array of Luddite, anti-tech shows in which machines were the bad guys, including Kubrick’s own 2001: A Space Odyssey (1968), in which the homicidal computer HAL 9000, convinced that its IQ is higher than that of the humans aboard the Jupiter-bound Discovery One, concludes that they are dispensable. And let’s not forget The Graduate (1967), with its disdain for “plastics,” which became shorthand for an artificial, shallow society.
Demonstrating against the high-tech juggernaut devised by technocrats like Secretary of Defense Robert McNamara, who ran the Vietnam War for Presidents Kennedy and Johnson and gave us such wonders as the defoliant Agent Orange and the high-tech “electronic battlefield,” peace protesters took Kubrick’s technophobia out of the theaters and into the streets. Gen. Curtis LeMay may have threatened to bomb Vietnam into the Stone Age, but for the flower children, the Stone Age was the bucolic state of nature rediscovered in a back-to-the-land commune in Vermont or Northern California. Stewart Brand’s Whole Earth Catalog was to hippies what Mao’s Little Red Book was to anti-war radicals, and the Mother Earth News became the New York Times of the ecology movement. The first Earth Day was held in 1970.
Kubrick was the canary in the coal mine, as it were, an early warning that mainstream shows would soon be overwhelmed by an extreme culture that turned them on their heads. What worked for the mainstream did not work for the extremes, where Big Science agencies like NASA and the CDC most often fail. In The Walking Dead (2010–), for example, the CDC is blown to bits.
Extreme culture, whether far left or far right, is animated by an insistent drumbeat of skepticism, if not downright hostility, toward technology. On the left, James Cameron, picking up the gauntlet thrown down by the Luddites of the late ’60s and early ’70s, pits nature against technology in his anti-capitalist blockbuster, Avatar (2009). NASA, which flexed its muscles in Hidden Figures, and again in First Man (2018), is a no-show. The great adventure of space travel has been privatized, that is, outsourced to a corporation driven solely by profit, like imperial England’s infamous East India Company. When the natives of Pandora fight back against the company’s fleet of heavily armored gunships, they are assisted by the beasts of the jungle, who turn the tide of battle. As Cameron put it, “Nature gets to fight back,” and not only does the company lose, but the hero defects to the other side.
In Homecoming (2018–), on Amazon Prime, the company is Big Pharma, contracted by the military to develop a sinister drug that erases PTSD in demobilized vets so that they are willing to re-up for another tour of duty in Iraq. Julia Roberts’s character is employed to feed them the drug, but, like the hero of Avatar, she defects to the side of the victims.
Compromised though technology may be, the science behind it managed to survive relatively unscathed, protected as it was by a cocoon of respect, but lately it has been tarnished as well, especially in shows from the far right, where we begin again to find the mad scientists of yesteryear, like the real-life researcher responsible for the CRISPR babies in China. Of course, fundamentalism has no use for science whatsoever, and neither do the tobacco and energy giants unless it favors their products, hence their efforts to discredit or suppress findings (often by their own researchers) injurious to their interests. And science hasn’t done itself any favors by concealing its ties to the corporations that sponsor much of its research.
Christopher Nolan’s Interstellar (2014) sheds crocodile tears over the demise of Big Science, but what it really mourns is the death of American exceptionalism, defined by our legendary audacity, optimism, and can-do spirit, here throttled by democracy, that is, majority rule, which inevitably devolves into mediocrity. NASA has been defunded by shortsighted voters and forced underground. Americans have foolishly turned their eyes away from the sky in favor of feeding themselves, beset as they are by a blight that has turned the world into a vast, sizzling frying pan. Strewing nuggets of Ayn Randian wisdom throughout his film, Nolan has the hero say, “We used to look up at the sky and wonder at our place in the stars, now we just look down and worry about our place in the dirt.” Rather than treating the blight as a man-made calamity that might be ameliorated by shrinking our carbon footprint, he treats it as a natural disaster beyond our control.
Likewise, in the films of Brad Bird, quantity crushes quality. In his Tomorrowland, NASA has been closed down, reflecting the will of the benighted masses, while in The Incredibles (2004), the government bans superheroes precisely because they’re super, forcing them to assume workaday secret identities and allowing evildoers to run wild.
Once science loses its glow, reason and facts can’t be far behind. Damon Lindelof was one half of the duo who ran Lost, the ABC hit that riveted the nation from 2004 to 2010. Lost pits empiricism against faith, magic, and the supernatural, but it’s no contest: The supernatural wins.
Like a dog with a bone, Lindelof gnawed at science-versus-faith in show after show. Post-Lost, he co-wrote Prometheus (2012) for director Ridley Scott, which again slams the door on science, trading evolution for creationism. And in his much-lauded HBO hit, The Leftovers (2014–17), he also puts faith through its paces. A large swath of humanity abruptly vanishes from the face of the earth without so much as a wave goodbye. “Whatever happened to them, it’s a miraculous event and implies a higher power,” he explained, concluding, “You can’t be an atheist anymore.”
Fortunately, we no longer live in an age wherein puritanical sects demonize fantasy and deny us the pleasures of the imagination. The fact remains, however, that when shows like The Leftovers elevate the supernatural over science, and Trump touts his contempt for facts, dismissing inconvenient ones as “fake,” the post-truth house of lies he has built appears to be an inviting home to millions of Americans. If recent polls are any indication, as the effects of climate change grow more and more severe, they will become impossible to ignore, and the Trump base may yet awaken from its slumber. Let’s hope it won’t be too late.