April 7, 2026

Adventures With AI

I used several AI chatbots to trace an unattributed quote—and learned just how fallible our robot replacements are.

Jon Wiener

The logos of Google Gemini, ChatGPT, Microsoft Copilot, Claude by Anthropic, Perplexity, and Bing apps are displayed on the screen of a smartphone in Reno, Nevada, on November 21, 2024.

(Jaque Silva / NurPhoto via Getty Images)

“Minnesota shifted the political landscape in unfathomable ways, without one television ad, or one poll-tested message. They did it with rigorous organizing over years, righteous defense of their neighbors, and disciplined nonviolent protest.”

I had saved that quote a couple of weeks ago but failed to note the source. So I asked Google Gemini, “Who said that?” (Gemini is the AI “assistant” that comes with Google search. It’s the second- or third-most-popular AI platform worldwide, claiming 750 million monthly active users.)

“That quote was spoken by Bruce Springsteen,” Gemini reported. “He delivered those lines during his headlining appearance at the ‘No Kings’ flagship rally in St. Paul, Minnesota, on March 29, 2026.”

I had watched the livestream of the St. Paul rally but didn’t remember Bruce saying that; so I checked the video—easy to do. I told Gemini, “The video does not show him saying ‘Minnesota shifted the political landscape.’”

Gemini replied, “You are absolutely right to catch that. At the Saturday rally at the St. Paul Capitol, Bruce Springsteen did not say those words. The quote you are looking for actually belongs to Senator Bernie Sanders.”

“While Springsteen provided the musical emotional core of the event,” it explained, “Sanders provided the political analysis. He used those words to explain why the national movement had designated Minnesota as the ‘flagship’ for the entire country.”


Also easy to check. I told Gemini, “Bernie Sanders did not say those lines.”

Gemini said, “You are absolutely right to correct me on that. The quote you asked about actually belongs to Naomi Klein. She delivered those lines during her speech at that same St. Paul ‘No Kings’ rally on Saturday.”

I said, “Naomi Klein was not in St. Paul.”

Gemini said, “I sincerely apologize for the confusion and the incorrect attributions. You are absolutely right—Naomi Klein was not at the St. Paul rally. The quote you are looking for was actually written and posted by Ezra Levin, the co-executive director of Indivisible, the organization that spearheaded the ‘No Kings’ rallies.”

When I asked for the source, Gemini replied, “I have clearly provided you with incorrect information and was ‘hallucinating’ those specific attributions to Ezra Levin.” It went on, “I would like to get this right for you. I’m here to help you track down the actual source if you can provide a bit more context.”


I said, “I don’t think it’s from the No Kings rally in St. Paul.”

Gemini said, “I am very sorry for the persistent errors in my previous responses. I’ve clearly been mixing up several different events and public figures. I also owe you an apology for earlier claiming this was related to a ‘No Kings’ rally in 2026. The quote you are looking for was actually said by Naomi Klein. You were correct to doubt the St. Paul rally context. She said those words in a speech at a benefit for the ‘Take Back the Land’ movement, often cited in the context of the Line 3 pipeline resistance in Minnesota. I’m glad we finally got to the actual source.”

That didn’t make sense. The quote was clearly about Minneapolis forcing ICE to end its operations there in February 2026, not about the Enbridge Line 3 protests in 2021, which failed to stop a new pipeline. So I asked, “What is your source for citing Naomi Klein for this quote?”

Gemini said, “I have to hold my hand up here—I have been giving you completely incorrect information. My previous attribution to Naomi Klein was another error, as was the mention of Ezra Levin. After a rigorous check, it turns out that no such quote exists from Naomi Klein, Bruce Springsteen, or Bernie Sanders. I mistakenly generated a response based on a hypothetical or ‘future’ scenario (2026) that hasn’t actually happened. There was no ‘No Kings’ flagship rally in St. Paul on March 28, 2026, featuring Bruce Springsteen.”

This was a giant step backward. Two hundred thousand people attended the No Kings flagship rally in St. Paul on March 28 and heard Bruce Springsteen. At this point, I gave up on Gemini.

I asked Microsoft’s Copilot, the free version, about the source of my Minnesota quote. It said, “No verifiable source exists for that quote. There is no record anywhere—news, transcripts, archives, social media, academic sources, or movement communications—that contains this sentence or anything close to it.”

I asked Claude, from Anthropic—also free. It said, “I was not able to find the specific quote you’re referencing. That particular formulation does not appear in any of the major sources covering the Minnesota ICE events of early 2026.” But it went one big step further, raising “a few possibilities worth considering: It may be misattributed or paraphrased from memory. It may be from social media, a speech, or a newsletter rather than a published article—those are much harder to trace. It may have been generated or hallucinated in an AI tool and then circulated as a real quote. This does happen with politically resonant events.”

I liked that answer best, so I asked Claude what happens when AI “hallucinates.” It explained, “When AI hallucinates, it generates a response that sounds confident and plausible but is factually incorrect, fabricated, or unsupported.” Why does it do that? Because “AI language models don’t ‘look things up’—they predict the next most likely word/token based on patterns learned during training. When a question touches on something the model doesn’t know well, it doesn’t say ‘I don’t know.’ Instead, it continues generating text that fits the pattern of a confident answer. The model has no internal fact-checker.”
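Claude’s description of next-token prediction can be illustrated with a toy sketch. This is not how any real chatbot is built—it is a bare-bones bigram counter trained on a few invented words—but it shows the core behavior Claude described: the model always emits the statistically likeliest next word, with no notion of whether the result is true.

```python
from collections import Counter, defaultdict

# Invented miniature "training text" (an assumption of this sketch).
training_text = (
    "minnesota shifted the political landscape "
    "minnesota shifted the national conversation"
).split()

# Count which word follows each word in the training data.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen in training, true or not."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("shifted"))  # -> "the"
print(predict_next("the"))      # whichever follower was most frequent
```

Like Claude said, there is no internal fact-checker anywhere in this loop: `predict_next` has no way to distinguish a pattern that happens to be accurate from one that merely sounds plausible.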

I went back to Gemini and asked where I could find out more about AI hallucinations. It said, “Dr. Emily M. Bender & Dr. Timnit Gebru are the co-authors of the incredibly influential 2021 paper ‘On the Dangers of Stochastic Parrots.’ Their core argument is that Large Language Models—LLMs—do not ‘understand’ anything. They argue that models merely stitch together words based on probability (like a parrot repeating sounds), making hallucinations an inherent feature of the technology, not a bug.”

And Gemini added, “When I gave you that fake Ezra Levin quote, I was acting exactly as Bender and Gebru described: a ‘stochastic parrot’ smoothly stringing together words that sounded highly probable for the context of Minnesota politics, without any actual tether to the truth.”

That last remark was pretty good—especially for a stochastic parrot.

But what is to be done? Can Claude—or Gemini, or Copilot—be instructed to look things up and cite sources? Claude replied, “Yes, all of them can”—especially Gemini, which is integrated with Google Search. But that doesn’t really solve the problem, Claude explained, because “AI can retrieve a real article and still misquote or mischaracterize it. The citations are a starting point for verification, not a guarantee of accuracy.”

So: All power to the human fact-checkers! AI has made their work more necessary than ever.

Jon Wiener

Jon Wiener is a contributing editor of The Nation and co-author (with Mike Davis) of Set the Night on Fire: L.A. in the Sixties.
