Podcast / Tech Won’t Save Us / Mar 14, 2024

Silicon Valley Deserves Your Anger, With Ed Zitron

On this episode of Tech Won’t Save Us, a discussion about the public’s discontent with big tech.

by The Nation Magazine

On this episode of Tech Won't Save Us, Paris Marx is joined by Ed Zitron to discuss the public’s growing disillusionment with the tech industry as it pivoted to chasing profits instead of providing us with anything useful.

Ed Zitron is the host of Better Offline and writes the Where’s Your Ed At newsletter. He’s also the CEO of EZPR.


[Image: GAFA apps and icons on an iPhone. Google, Amazon, Facebook, and Apple are the four US multinational IT and online service companies that dominated cyberspace during the 2010s. (Koshiro K / Shutterstock)]



Paris Marx: Ed, welcome back to Tech Won’t Save Us.

Ed Zitron: Thank you for having me.

Paris Marx: When you were last on the show, I think we discussed Elon Musk quite a bit. 

Ed Zitron: And he hasn’t done anything since we last talked. 

Paris Marx: He’s disappeared. You never hear about him in the news. But you have this new podcast that you started up just recently called Better Offline. So I figured that was a great opportunity to have you back on to discuss where the tech industry is right now, how you’re feeling about things, and just all the great, fantastic things that our tech billionaires are doing for us these days.

Ed Zitron: Oh, wonderful futures they’re making for us.

Paris Marx: We just love living in the world they’ve made. I guess to start, just as a general question, and it’s probably obvious based on what we’ve just been saying: how are you feeling about the state of the tech industry right now? What’s your assessment of how things are going?

Ed Zitron: So I think we’re in the middle of a reckoning, and the next few years are going to be very telling about what happens to startups going forward. A few weeks ago, I wrote a piece about why everyone has kind of turned on tech. The long and short of it is that tech made a ton of promises in the 2010s, and they delivered on a lot of them. They were like: Your files will be wherever you need them. You won’t have to use physical media; you’ll be able to stream stuff. And they delivered, genuinely. I know that streaming is extremely questionable as an industry at times, and cloud computing is not good in many ways, but for the most part, tech actually delivered stuff. There were cool things that people could use. The jumps between iPhones were significant. There were new laptops that were just that bit faster. It felt exciting. Then around the mid-2010s, around when Waymo announced one of its first robot car projects, when Elon Musk started talking a lot about Autopilot, when the Apple Watch came out, it’s almost like the tech industry got frozen in amber.

There were new things, but things stopped being exciting. All of the things we were told were coming, the robots, the AI, the automated companions, never showed up. And I think, as we speak right now, we’re seeing what happens when you fail to deliver but still get rich, because tech has existed with these massive multipliers and made all of these people obscenely rich. People had kind of accepted it, because they said: Well, it’s how we got iPhones, it’s how we got cloud computing, it’s how we got all these things. But tech hasn’t done anything like that for a while, and their big magic trick is AI, which is doing what, exactly? Is AI doing, for a consumer today, much more than Siri was doing five, six, seven years ago? It isn’t. On top of that, it’s so expensive that everyone’s putting all this money into this very expensive, not super useful tech. It just feels precarious, especially when we’re less than a year out from the SVB thing. I’m just a bit worried. It could be okay; as a business operator in this space, things feel better. But I don’t know. I’m very worried about this push of AI, not just for the fake jobs thing, but also because it costs too much and it does nothing.

Paris Marx: I think that’s really well put, and there’s a lot in there for us to dig into over the course of this conversation; we’ll return to the AI stuff a little later. And when Ed says the SVB thing, that’s of course Silicon Valley Bank and the implosion that happened there, for listeners who might not remember. But we’re in this AI push, this moment when all of this is supposedly going to change everything. It’s interesting to me that you point back to the mid-2010s as the moment when a lot of stuff significantly changed: they started making these huge promises, they weren’t able to fulfill those promises, and the stuff we actually got from these companies wasn’t big steps forward but incremental ones, as things also seemed to be getting worse and worse. And we can talk about that as well. But the mid-2010s was also the last AI boom moment, when AI was going to replace all the jobs and all this kind of stuff.

Ed Zitron: I had a client in 2016 telling people: Oh, well, it will ingest all your knowledge and be able to answer questions in a chatbot form. This was 2016. That’s different.

Paris Marx: It’s funny too, because there was an article I read the other day about Apple and how it was behind on the generative AI thing. It was like: This is such a big threat to Apple’s business model, because everyone’s going to be using these generative AI models, it’s going to make voice assistants so much better, and Apple has had so much trouble with voice assistants. And it’s like: Nobody liked the voice assistants the first time around. They never caught on. Amazon has largely disinvested from them. Sure, Alexa is still around, but it’s not a focus like it was a few years ago, because it was just another one of these things they pushed out that didn’t really catch on with people. The generative AI moment isn’t going to change that.

Ed Zitron: And actually, there was one thing I left out that’s very important as well. 2021 was a watershed moment for tech in the worst possible way. It was the first time, I think, that tech really just tried to lie. The metaverse and crypto — what did they do? Not much. They did, however, make a bunch of guys really rich. But it was the first time I saw an outright con on consumers: Hey, the metaverse is the future. You’ve got to be part of this. What is it? It’s a thing you’ve already seen, but we would like you to treat it as the future.

Cryptocurrency — what does it do? Nothing! It might make you rich, but it probably won’t. So you have this two-year period where, in 2021, it seemed like everything was going to be okay. Everyone was going to get rich, the Biden administration was in, everyone was doing well, money was frothing around. And a lot of people didn’t see it coming when interest rates screwed everyone — because I don’t think most people know macroeconomics, myself included. And everyone at once realized that the tech ecosystem was built on a form of financial con, that venture capital wasn’t held accountable. So now they’re trying to push through AI, and it’s like: What? No. We saw the metaverse; it didn’t do anything. We saw crypto; it didn’t do anything. Tell us what this will do. And it’s the same McKinsey freaks telling us that this is the future.

Paris Marx: The crypto and metaverse moment was like: Okay, there are these visions, there are these ideas, but there’s nothing tangible here that’s really making anything better. It’s just: how can we extract more money from people by forcing them into these futures, or making them believe in them for a little while? Whereas in the past, sure, they lied about a ton of things and overpromised about a ton of things, but there was often something tangible there that they believed they could actually follow through on. In part, it makes me think back to the dot-com boom, when you also had a lot of companies that really had no foundation to them but were riding this wave up as all of this money flooded into the space. There was a moment of that in those pandemic years too: there was a bunch of money flowing, it had to go somewhere, so here were all these scams and cons to absorb it.

Ed Zitron: Another part of it, and I’ve written about this a great deal, is that generally, before this — and I know it’s not perfect, and there are examples where they didn’t — these companies found ways to grow that somewhat benefited the customer. They didn’t always do it in the nicest way, but it kind of benefited them. There was a way of looking at it where you’d go: Okay, they’re making stuff for people, and people will use it. Right now, people are waking up to the fact that tech companies are willing to make their things worse to make more money, and I think they are more aware of it than they’ve ever been. I know I’m being somewhat self-serving, in that this is my rot economy thesis. However, I do think there is coming a time when people are going to ask: Why are these tech companies worth $20 billion when they make things worse? Why is Microsoft worth — what? — $3 trillion, when they’re just flooding money into this system, into ChatGPT and Copilot and all these things, and they can’t even explain why you’d use them?

The Super Bowl commercial for Copilot was so weird. It was like: Oh yeah, write the code for my 3D open-world game. Give me a logo for Mike’s Trucks. Which just doesn’t work in practice. You put it in and it’s like: [gibberish] Mike! It’s really strange. If you’re listening to this, go type “classic logo for Mike’s trucks” into Microsoft Copilot. It is not what’s in the commercial. It’s so weird: they spent several million dollars on this commercial, and you’re watching it and you’re like: Not even they can come up with a reason why you need generative AI. That’s crazy, man. I don’t remember another time in this industry when no one could tell you why they were selling something. They’re just like: Oh, it’s the future, and you should buy it today. Please use it, please use it now. We need you to use this so the markets think we’re growing at 10 to 20% every quarter, so that Satya Nadella can get paid; he must make at least $30 to $50 million. Sundar Pichai at Google got something like $220 million in 2022. These guys are insanely rich, but they’re creating nothing. I’m all over the place, because this stuff is making me a little crazy, I’m not gonna lie.

Paris Marx: One of the pieces you wrote recently, which I read in preparation for this interview, really went into that. With a lot of these companies, we’ve been seeing it build slowly over time. Think about Facebook making its product worse so it could extract more money from people and pull more data off of them, feeding this broader project of making more money off of everything everyone does on its platform. People are talking about it a lot now, but you’ve been able to see it for years: the Google search engine getting worse as it becomes more oriented toward the needs of advertisers than toward the actual users. Again and again, whether it’s social media platforms or other things that happen online, there has been this slow degradation in the quality of these things, because there’s this need to extract more and more profit from them, and the options for where you’re going to make that profit have become fewer as the real growth and the real innovation in this industry have tapered off. So now it just becomes extracting as much as possible from what is already there, even if that means the experience and the actual quality of the product have to decline in the process.

Ed Zitron: And at this point, what is innovation if the products are getting worse? The core issue here is that the whole reason these companies got a fairly open remit to do what they wanted was that they were providing something to society — their products changed how we operate our lives. The cloud computing boom, especially the consumer cloud computing boom, really can’t be overstated. That genuinely changed people’s lives. That really helped. Google Docs is an incredible human achievement. I know I sound like a tech fantasist there, but to be honest, it is. Word used to be insanely expensive; now Google Docs does that for billions of people. That is a crazy thing! It’s crazy they did that.

What does Google do now? In 2019, the former head of search (I forget his name) did something called a Code Yellow. He was saying: There’s a problem with search; we need to get together and talk about it. And his big problem was that he felt the ad side of Google was getting too involved with search. In 2020, they replaced him with Prabhakar Raghavan (I’m probably saying that wrong, and I’m so sorry if I am), the guy who used to run Google Ads. That is the guy who runs search now. They gave search to the worst possible person, unless of course you think maybe he’s the best one, if the goal is to grow search and grow ads. And it’s just frustrating. It’s frustrating, and one of the reasons I think people like my show is that I’m pissed off about this stuff. And I think everyone should be. Everyone should be. You actually have been, Paris; you’ve been ahead of everyone on this, with how pissed off you’ve been.

Paris Marx: Thank you.

Ed Zitron: I think as people realize why these things are happening, and see, clear as day, what tech’s elite is actually doing, they’re going to be angry too. And they should be; they should be as angry about this as they are about politicians, possibly angrier. I’m not saying anything going on with Trump is great, but they gave the Library of Alexandria to an ads guy to make money on it. You can’t Google stuff anymore. You can, but it’s half-assed. These are the reasons people are turning on tech, and they’re becoming easier to understand. A few years ago, when everything started falling apart, even I had trouble explaining the whole zero interest rate phenomenon.

I’m sure someone will do a great newsletter about why that happened, or already has. But that’s a hard thing to explain: Oh, it was easy for them to borrow money, thus they invested it haphazardly, and things grew because they could do that. That’s kind of a messy story. But “they gave Google Search to an ads guy so that your search is worse, to make Google money” is something people can understand. People are willing to consider, for a second, that maybe Google shouldn’t be worth trillions, that maybe Google shouldn’t be able to monetize its products the way it does. Except maybe we’re going to get a Republican president, and we’re not going to be able to stop this; maybe it’s going to get worse. It’s deeply worrying. Unfortunately, those of us outside the United States just have to deal with it, because we have so much less power to hold these companies to account and to force them to do anything. And these companies are, of course, American, until it comes to tax purposes; then they’re in another country.

Paris Marx: Exactly! [laughs].

Ed Zitron: It’s so good. 

Paris Marx: Even when they’re trying to achieve this multinational tax treaty, to force them to pay some taxes in other countries, the US is constantly stepping in, pushing back, delaying it, and getting mad at countries when they try to bring it into force. But it’s interesting what you say about Google there, because I recently reviewed Kara Swisher’s new memoir. There are many things I could say about that, but one of the stories that stood out was her getting a call from Larry Page in 2004, ahead of Google going public. At that time, they were trying to reassure people that the investors, the shareholders, weren’t going to push Google to embrace profit and abandon its morals, “don’t be evil” and all that kind of stuff. But by then, Page and Brin had already given in to putting advertising on the platform, despite the warning in their initial paper.

Ed Zitron: I was just about to say that — it’s literally antithetical to good search.

Paris Marx: We could go back to that paper. But even just that moment, when the company is going public and they’re saying: “Don’t be evil” is our guiding mantra; we’re not going to give in to all this stuff. And then seeing what Google has become 20 years later just shows you exactly what you’re talking about: this trajectory they’ve been on over time, where they need to keep compromising the product in order to show greater returns to the shareholders.

Ed Zitron: And right now, the reckoning I was talking about earlier, the one I’m kind of scared of, is the dead internet theory. I have a newsletter coming out on it this week. At some point, these models — OpenAI’s, Anthropic’s, what have you — which are all trained off of publicly available internet content, run into a problem: what happens when that content starts being made by AI? What happens when vast amounts of the internet — which, just to be clear, are already built so that Google can read them, not human beings — are AI output written based off of that SEO content and user-generated content? Meanwhile, users are becoming more aware of the fact that they’re getting screwed. Look at the Reddit IPO: Oh, wow, I can buy in at the same price as the other idiots buying on the [unintelligible]. Reddit users are still very pissed off that Google made a $60 million deal to train Gemini on Reddit’s data.

So all of that content was made for free, for Reddit, to make Alexis Ohanian and Sam Altman rich. That’s why that happened. Steve Huffman as well. They all get rich off of this. But you know what? I believe the internet is going to be overwhelmed by a slop of AI-generated and SEO-generated crap, and at that point, the models will be trained on that crap. When there is much more stuff generated by AI, or generated for algorithms to read, what happens to the internet? What happens to the internet when it’s just full of nothing? And the models will be creating the information you find on the internet: Google is trying to put AI into search, and Quora has turned into this monstrosity where a ChatGPT answer sits at the top. That’s why, when someone Googled “Can you melt eggs?”, it said yes. At some point, people’s source of information is going to be AI trained on stuff from roughly 2023 back, plus an increasingly empty internet. And at some point, users aren’t even going to want to use these sites, because all you’re doing is making someone else rich.

It’s one of the rare cases where blockchain might have worked, if it worked the way they claimed it does. Well, they lied about that. If there were a way of actually rewarding users for their content, that would be wonderful. That’s never gonna happen, because it doesn’t scale. That is actually the problem of the tech industry: it’s built off of the free use of labor that they would not pay for, because if they had to pay for it, they wouldn’t be worth $3 trillion. They would have to pay all this money to people for things they’ve done. And I’m just worried about what the internet looks like in five years. What is Google in five years? What is Bing in five years? Is Bing or Google going to provide me with search results, or is it going to be regurgitated nonsense, generated by nonsense? I mean, that’s what Arc’s new browser is doing: oh, it’s an agent that reads things and tells you what the web says. This is a bad time, and I don’t think people realize how bad.

Paris Marx: But as you say, even before the generative AI moment, so many times when you tried to find something online, it was content made for search engine optimization. You tried to get some answer off of Google, and it was just all these stupid listicles made to appeal to the search engine, or to get you to click some affiliate links. And we had already seen a lot of media organizations head in that direction as well: sure, they had their articles telling you about stuff, but also all these other articles designed to boost their search ranking and get people to click affiliate links. There was already so much of that that Google already felt compromised, in the sense that only certain things worked on it. A lot of people, if they were trying to get an answer to something, or a review of how some product actually worked, would add “reddit” to their search so they’d only get results from Reddit, because they felt then at least they’d get users’ opinions instead of the garbage that was on Google.

So now we head into this stage where content generated by AI tools — or bots, or whatever you want to call them — is so much easier to make. It’s not like that never existed before; these tools were already being used in a more rudimentary way. But now it’s so much easier to just fill the internet with all this garbage. I’ve been reading about dead internet theory recently too, because it does feel like, if we haven’t reached it already, these tools that Sam Altman and all these other AI ghouls tell us are the future of everything, and going to make the world so much better, are sending us in that direction. I don’t see how we don’t arrive at a place where the internet is basically useless.

Ed Zitron: I have this other theory: that they, to their core, just do not respect actual people. Things like Reddit or Wikipedia are built off the backs of a very old-school internet ethos: we need to build something; we need to make sure people have this; this is a cool thing to do. I think they believe that was a stopgap solution for what AI can do: AI can read everything, and thus AI can give these answers. To them, Reddit is just a symptom of a larger problem where human beings have to fill in the gaps, when we could just have an AI eat everything and shit it out, and that would be the answer. Because think about it: why didn’t Reddit — I don’t know, maybe sometime around 2018 or 2019 — do an internal round for moderators? Why didn’t they do something for them? Because they don’t care. They don’t give a shit about any of them. They don’t give a rat fuck. Alexis Ohanian must have another car. He must buy another copy of Nintendo Power for no reason.

God damn that guy! He’s a nice bloke in person, but Jesus Christ, man. He invested in Axie Infinity, which was, well, built on actual sweatshop labor. But the way they look at these platforms, you can see it in the way the generative AI freaks on Twitter talk: Wow, writers are gonna go the way of the dodo. Oh, Sora is going to make directors and cinematographers homeless! They love that idea, because it means that they won. The thing is, they look at it and say: Well, this is good enough; people just want good enough. And to an extent they’re right, except their good enough is very different from actual good enough. Good enough is BinkyBob52 on Reddit giving you a poorly written but still useful explanation of why your PC is acting that way. What it isn’t is a hallucinated solution that doesn’t make sense or help in any way.

They don’t see that the internet was built on the back of user-generated content, and that user-generated content cannot be mimicked by a machine. The problem is, no one’s going to convince them otherwise; it’s just going to happen. Reddit is going to start falling apart when it goes public. I think it’s already happening now. People are so angry, so tired of how Steve Huffman treats them. And Twitter is probably one of the greatest examples of how user-generated content gives a company its value, and it’s falling apart at the seams. It’s worth like 32 bucks, 33 maybe. Elon Musk is giving people who pay him $16, usually more than $16, a cut of ad sharing. It’s worrying, because I think we are in the midst of the corporate overlords turning on the internet and saying: I have everything now. I have it all; you humans can go. I’ve got all I need to create Encyclopedia Muskandia. You people were nice to us along the way, but I’ve got mine. Now I will create an internet where I generate what the customer wants.

I think they’re conflating customer and user. They think people come to the internet to go: I would like this, and now I am done. People want to browse. People want to absorb themselves in stuff. People don’t go online so they can have something generated; they go online so they can talk to someone, or see someone, or do something. The greatest things that have happened on the internet, and some of the worst, are a result of people doing things with, or to, other people. When you start removing those connections, when you start taking the focus away from them, you will kill the internet. It could regrow; I think it’s more than likely we just get a big, fat, sickly-dog version of Google for a few years, and then they realize: this isn’t good, this isn’t working! But I actually think something bad happens to one of the big AI companies. I don’t think Google dies, but one of Anthropic or OpenAI, something bad happens to one of them. It’s just unsustainable. The whole thing is unsustainable, as I keep saying.

Paris Marx: Totally. And when the hype and the boom around generative AI start to go down, which they inevitably will, I think these companies are in trouble, because they don’t have a real business model. There’s no there there. Sure, they’re churning out all this stuff, but it requires so much money — so much energy, so much water, so much cloud computing — that once the hype goes away, there’s nothing left to keep driving it.

Ed Zitron: But it costs too much to do nothing. It’s not doing anything right now. And they’re like: Oh, it’ll be good in the future. And when you ask how, they get very upset.

Paris Marx: Exactly. But I want to come back to what you were saying about the human side of it, because I feel like this is something that’s always left out of tech’s visions, or that they can’t seem to account for. In a sense, generative AI is picking up on something they’ve been doing for a very long time: trying to eliminate, replace, or put themselves in the middle of human interaction. As if humans don’t want to engage with one another and would rather have some tool in between, so you don’t need to talk to another person or see another person or anything like that. I think that has always fundamentally misunderstood, or disregarded, our need for human connection, and how the vast majority of us are wired to want that, and to need that, to thrive.

Sure, we might not want people in our faces all the time, but we do want to engage with other people, and the tech industry seems determined to minimize the actual amount of human interaction we have. I think that is, in many ways, directly related to what some people call a crisis of loneliness: how people increasingly seem to be more lonely, to have fewer personal connections, fewer friends they see regularly, all this kind of stuff. It’s impossible not to recognize that that is, in part, a result not just of the digital revolution, but of how it has specifically been designed and rolled out by companies with particular incentives to ensure you’re using their technologies much more than you’re talking to an actual person.

Ed Zitron: I sort of agree there. I was a very shy child; I was not a big talker. And the internet was a big part of socializing me. I was able to make friends. I’ve made some of my closest friends on Twitter. I found my job, I found this podcast, I found all these things through Twitter, for example. Without that, I would not have met as many people. But I agree that there is a sense that they are trying to intercept human interactions, and manufacture them, to squeeze out a little bit more profit. That’s what’s so sickly about it. Before, everyone knew that a free product has ads; we get it, absolutely. But Facebook and Instagram don’t show you things you actually want to see. They don’t show you the people you’re following. They don’t show you photos; they show you a bunch of videos. It’s also insane that most Instagram ads I see are for games that don’t exist. There’s this trend where they’ll show someone playing a game with a little man, and they hit a plus-one and then there’s two men, and they’re shooting something, and it’s an ad for Evony, which is a totally different game.

Putting that aside, I think the biggest thing is that they just find customers annoying. They find this whole thing where they have to serve someone, and that person requires something, just a bit much — it’s getting in the way of the money. Google Search was a wonderful product for so long, but I think at some point they were like: Well, we need to grow, and these people just aren’t doing what we want them to. They need to click more ads, so we’ll make the ads harder to avoid, harder to tell apart from results. And that worked pretty well. But why don’t we also, instead of having the top bar always show Video and News, randomize what it shows there? So sometimes it doesn’t show you News. And when they do go on News, we’ll change it so that, for some reason, it doesn’t let you sort chronologically, so you have to go back to Google. They make these things more convoluted to keep people on them longer. And they do that because — and it’s very kind of mid-to-late-90s cynicism here — we are fucking rats in a maze to them. They are just torturing us as much as possible at this point.

I think they have just reached a level where they're trying to build products that get used just enough to keep it discreet, so we won't think they're complete con artists. But these aren't products for people. Generative AI isn't for people; it's so that they can say Google has grown the AI part of the company 10%. We're investing this much; we are doing future stuff. It's for people who don't use the products, investors, so that they can make things look bigger and eternally grow and grow forever: grow, grow, grow. We own all these data centers now; that's real estate, physical things we have. Wow. Isn't the future amazing? Who gives a shit if the product's good or if it does anything? And it's insanely unprofitable. You've written about data centers a lot, and I don't think people realize how unprofitable this whole con is. But I think people are aware that this is a con now. I think people are waking up to these things and saying: I don't need generative AI. What does it do? Why do I need ChatGPT? It allowed me to write a seven-point business plan. If you need ChatGPT to write your business plan, your business is fucked, mate.

Paris Marx: Unless you’re one of those people who just used generative AI to be the whole business. Did you see those videos in the early days?

Ed Zitron: Whoa, what’s that? What do you mean by that?

Paris Marx: They were just like: We’re going to make a website with generative AI and all the stuff on the website is going to be generative AI, and people are going to buy stuff off this website.

Ed Zitron: That is the future, though. That's what Google is going to be full of. It's already full of outright scams. About a year or so ago, I was trying to get a Ninja Creami, I think, to make ice cream. You went on Google and put it in, and the thing was sold out everywhere. But there were three or four outright scam websites that just don't ship anything. This is what the internet has become already. What do you think they're going to do when they're like: Well, we could just get rid of all this user-generated stuff? What's amazing for them is their theory is not only will we fill the internet full of generative stuff, but people will have to pay us to generate their part of the internet. Wow. It's just like Linktree but bigger. And it's very scary, very scary indeed.

Paris Marx: Well, it comes back to the lack of innovation that we were talking about. Part of the reason Google needs to keep compromising the search engine is that so much of its money comes from it, even though they've done the self-driving cars, and they bought Android and so many other things. The search engine is still the cash cow; it's still where the money comes from. So they need to find ways to keep sucking more money out of it, because all these other projects they've tried haven't been working. I wanted to go to something you were talking about in a piece you wrote recently, since you brought up the data centers. So many of these AI investments — whether it's Google doing its own AI thing after buying DeepMind a number of years back, or Microsoft's investments in OpenAI, or the investments Amazon has been making in AI companies — funnel so much money back into their cloud computing businesses, to make sure there's a new wave of customers flooding into them. Can you talk about that piece of it and what they're up to?

Ed Zitron: This is the snake eating its own tail, big time. So Microsoft invested $10 billion in OpenAI. Around the time of the Sam Altman situation, where he was fired and then came back — and we still don't know why that happened; just to be clear, it's insane, we don't know — Semafor reported that, in fact, most of that money was in cloud computing credits. Anthropic got invested in too: I think Google put $3 billion into Anthropic, and they made it so that they would be the exclusive cloud partner of Anthropic. So effectively, these companies went: We're going to invest billions of dollars, and then that money is immediately going to come back to us in the form of the cloud computing credits you need to run this business we just invested in.

So I have no idea how that is legal. It feels like some sort of con where they're just like: We're just going to send ourselves money through a slightly scenic route. It's put these big tech entities, specifically Microsoft and Google, in a position where their cloud businesses have just bought revenue. Every dollar that goes through the AI ecosystem — through Claude, which is Anthropic's model, or OpenAI's ChatGPT — goes back to one of the two companies. It's insane. Microsoft didn't give $10 billion to OpenAI. They gave — well, we don't know how much it is, but most of it is in cloud credits — they just kind of gave them $8 billion, and then OpenAI will give that back to them very slowly.

Paris Marx: You can purchase stuff on our Azure servers.

Ed Zitron: I would love to know if that is actually going to be anywhere on their earnings. I wonder if it’s just going to come up as AI revenue? And it probably will. They just gave themselves money. They just grew their own company by handing themselves money. I’m pausing, because it’s just, every time I describe this stuff, I’m like: How is that legal?

Paris Marx: I feel like it comes back to something we were talking about earlier, when you mentioned the zero interest rates of the 2010s and how they fueled so many of these businesses. We talk about tech, or have talked about tech, so often as innovation and things moving forward. But underlying that has always been this financialization — this habit of finding financial schemes to blow up these businesses, or avoiding taxes through weird tax schemes around the world, and all that kind of stuff. It feels like another way that they're taking advantage of accounting rules or whatnot to move some numbers around on a spreadsheet, and it looks really weird. But it also lets them control whatever's going on here and funnel all this hype into their cloud businesses at the same time.

Ed Zitron: It also effectively creates monopolies with micro-monopolies attached, because how can you compete with OpenAI and Anthropic now? If they ran into money troubles, do you think Microsoft and Google are going to say: No, you have to compete like everyone else? No, they're going to give them a sweet deal. They're probably giving them one now. There's nothing to stop them; it's a preferred partner. This is how this works. It's not like it was 20 years ago. No one in their garage is going to compete with OpenAI anymore. OpenAI does not pay its own way. It's like living in a house where the rent is paid for and the utilities are paid for, versus someone who has to get a job while also paying those things. It's just deeply uncompetitive. But also, what is it doing? What is any of this stuff doing? What does OpenAI do? Where are the things? Every time I tweet about this, someone will say: Oh, well, their AI agents are coming; AI agents are going to change everything. And I'm like: How? And they say: Well, you're already using them in things today. And I'll say: Okay, well, what? And they're like: Well, it's ChatGPT. Okay, so we're just back to ChatGPT now? Well, AI agents could browse the web for you. Okay, why? Why would they do that? Well, they could run apps for you, like the Rabbit R1. I can say: Order me dinner and take me home using Uber. [mocking voice] Oooh! Or I could use my fucking thumbs. Why do I need this?

On top of that, it's all unprofitable already. It's not like we're in the super early days of AI; it's 20 years of progress that's got us to here today. Someone made a post about this on Twitter the other day, a very good point, which is that you can't look at this as starting with ChatGPT. That's not what this is. ChatGPT is part of a 20-year push on artificial intelligence. We're at the top of the S-curve; we're getting very close. We're not in a position where we're just going to refine some things and this will get cheaper. The models might get cheaper, but how much cheaper? Look, we're going to run up against a hardware problem soon. And also, none of this stuff makes money! None of it makes money. What am I paying for? What are you paying for? What are businesses paying for? There was a Wall Street Journal article the other day saying Anthropic and OpenAI are moving toward pitching to enterprise, even though there are still problems with hallucinations. You can't fix those! They're not going anywhere. If a hallucination happens in a regulated industry and loses people money, the SEC will have their guts regardless. No one's going to rely on this stuff.

Knight Capital happened, what, 12 years ago? It was a trading firm that had an algorithm issue. They ended up losing over $400 million, and they ended up basically shutting down. Since then, people have been extremely worried about automation in the financial services industry. And where else are you going to sell this stuff? I think what it is is that you can see who is getting rich, really gratuitously. You can see the people who are making the money; you can see how much money they're making as they lay off thousands of people. Except usually when that stuff was happening, you could point to something and go: But at least we have this. What do we have? Just on a really basic level, what have we got here? What is the thing? I am a tech guy; I like the Vision Pro when it doesn't give me a goddamn headache. And I can look at the Vision Pro, use it, and say: This feels like the future. I can watch Dune on a big screen, and that feels like something. It's still expensive, and they shouldn't have released it this early, but even then you can point and say: That did a thing. I can see why someone might want this. I can see the function.

I am literally a PR guy. I talk to AI companies all the time. I've seen a few things, like bill-fighting tools, that are pretty cool — features that are cool thanks to AI. But I just don't know: if AI disappeared tomorrow, what would we lose? What would the death of generative AI do? What would we lose other than billions upon billions of wasted dollars? What has it done? What will it do? It's like the Metaverse; it's the same fucking thing. It's: Oh, you have immersive worlds now. Oh, it's going to automate everything. What is it going to automate, and where? How is that better than what we have today? I've heard so many cases where people will say: Oh, it will automate investment strategies. Huh? That happened a decade ago; I was talking to robo-advisor clients in 2014. What's new? What can we do now? What is the thing? What is the thing I should be excited about? I want to be excited; make me excited. I'm open to being excited here, or even slightly happy. And I'm getting neither. I don't know what's missing. I don't know what it is. But what I can tell you is it's making a bunch of guys really, really, really rich. And that sucks. It sucks. You can't even do the very gross thing of saying: But at least they created something. Because they haven't.

Paris Marx: I've been feeling it for a long time, I guess, but especially seeing what's been happening over the past few years, probably starting with the crypto and Metaverse stuff. I feel like the Vision Pro is a very uninspiring product that shows just how far Apple has fallen from where it used to be. And of course, it was never perfect, but you could usually rely on it for more innovation or thoughtfulness or usefulness than what you're getting out of that.

Ed Zitron: I could have a whole argument with you about that, but at the same time, you are the customer here. They should be impressing you.

Paris Marx: I’m never sticking a headset on my face. I’m just not doing it.

Ed Zitron: But that's the thing. They should tell you why you should; they should have a compelling reason for you to do so. And they don't. They don't have jack diddly squat. They never do. It's so weird. They're just like: Yeah, you should just do it! It's good! Go on. Even with the Apple Watch, Apple made this big push to show people how it fit into their lives. With the Vision Pro, it just felt like: Yeah, you just walk around an office with that on. I don't know what the fuck to tell you; give me four grand. I'm Apple, buy the thing. What the hell? You never made me do this before. And what's silly about it is, I was using it the other day, watched the first half of "Dune" (I'm late with movies sometimes), and I was watching and thinking: Why didn't Apple do a whole push with directors? This is an amazing way to watch cinema. And then I realized they didn't do it because they didn't feel like they bloody well had to. And they didn't. They made $700 million. I try not to be too cynical with this stuff, but there are so many times when it's just like: You guys just don't care, do you? You think we're little pigs.

Paris Marx: But we are in a place where that cynicism is warranted. Especially now, as we've been in this AI boom for so long, we've been seeing over the past year or so all of the layoffs happening at major tech companies, at video game firms, at so many of these major companies. You can see time and again that these layoffs are not because the companies are really struggling, or because they can't afford to pay these workers. So much of it is just so they can keep investors happy, keep the share price up, all that kind of stuff. And it links back to so much of what has been going on in the tech industry for a long time, where again, it's not about innovation. It's not about making people's lives better. It's about: Okay, what can we do that is going to make us some money in the short term, or get investors excited enough to drive up the share prices of these companies, and hopefully we can cash out at the right level? It completely justifies why people are getting so pissed off with this industry at this moment, something that has been building for a long time now.

Ed Zitron: It sucks, because they have so much money. Google's the one — I was about to say I'll never understand, but we totally understand why they do it — they laid off, what, 10,000 people this year already? They make $10 billion in profit in a quarter. It's just sickening. I have to wonder at some point whether this doesn't start hitting tech labor too. My only slight hope here is that I believe the dejected software engineers — all of these tech workers, actually — are eventually just going to go: I'm going to build something new, and I'm going to build something that doesn't require venture capital. I think that might be a move. I'm seeing a bit of this already with some clients, with people I know. They're just so disillusioned with the venture capital big tech model. They used to look up to these companies and say: These people were our gods; these were the people that inspired us. And that was much easier when you had Steve Jobs, who wasn't really a tech guy, but they had created the mythos around him. He was a deadbeat dad, a real piece of shit, a stinky goblin. We're doing an episode of Behind the Bastards — apparently, he never washed.

Paris Marx: I do think I read that one time.

Ed Zitron: He's a stinky goblin. But nevertheless, Steve Jobs was part of — Steve Wozniak was the real reason — creating things that changed the world. iTunes changed everything. The iPhone did change everything. These things were changing the world, but they were changing it by providing a service that was invaluable. That's what the old ecosystem used to want to do. Then "Why Software Is Eating the World" happened, that horrible essay where Marc Andreessen, aka the Kingpin, popped up and said: Well, actually, tech valuations should not be connected to how good the business is; they should be connected to how cool the business is. And people just bought it. That was the beginning of the blitzscaling thing. That was the beginning of growth-at-all-costs startups, where startups were fattened up and then sold off. It was weird; Silicon Valley for a while was almost the skunkworks for big tech.

You'd have these companies — still fundamentally real companies — build features, sell the APIs, and then big tech would eat them up. It almost worked for a bit, but it started falling apart when, I think, venture just got too powerful. Venture completely removed itself from value creation of any kind. So you got some value creation, but it wasn't scrappy anymore. It wasn't about a bunch of software engineers coming together and doing something cool. It was: How do we make something to flog to someone else? How do we make the thing to raise money, rather than raising money for our thing? Instead of coming up with something and then saying to a venture capitalist: This is why it's worth money, it became: How do we create venture SEO? How do we make our company resemble what venture wants? And on some level, maybe that is the greater problem.

That the value creation in the tech ecosystem turned into different versions of trying to please someone who wasn't a customer. Startups building themselves to sell to venture capitalists, so that the company could sell to a big tech company, so that the big tech company could build something that pleases investors, but not customers. It sucks; it sucks a great deal. I started in tech in 2008. I'm not saying it's always been good, but it wasn't like this. Maybe I was just stupid; maybe I was just ignorant of the forces. But at the same time, it's just really sad to watch. I think it could go the other way if big tech keeps pissing off engineers, because at some point there will be a brain drain. There will be less reason for people to work at Facebook or Google. People will be doing boring things. I'm just trying to think of how it could change quickly, and it's got to be that one of these AI companies collapses. That's the thing that will change something.

Paris Marx: There needs to be a real shakeout and a real readjustment, I guess, of what's going on there. Do you think that the crypto moment was a real mask-off for VC? Or do you think that was coming well before?

Ed Zitron: I think crypto was more of a mask-off for consumers and tech companies. I think crypto and the Metaverse were the two times when real people went: What the fuck are you talking about? Wait, what? Huh? I don't understand this, and you're not making the effort to help me understand. When the iPhone lost its headphone jack, Apple's pitch was: Do you really like the wire? I'm not saying they were good for doing it, but they actually had a value proposition, which is: Do you really like having wires? And the answer is no, no one really likes that. There are reasons why you might want them, but there was a reason to do it. It sucked, but you could see what the better future might be — even though it was one that heavily benefited Apple.

With the Metaverse, it's like: Oh, we're all gonna work and live in the Metaverse. Nobody wants to do that. What are you talking about? Oh, we're going to have decentralized currency. Who the fuck asked for that? No one asked for that. Oh, I'm gonna start buying NFTs. I'm going to have a picture of an ape, one of 10,000 pictures of an ape. That's the future of art. Fuck you! What the hell does that mean? What are you talking about? The mask-off moment was for consumers rather than venture. But also, venture made a bunch of money off just nothing — nothing, just nothing. No one called anyone out for this. In a just society, people would be ripping Andreessen to shreds for the crypto stuff. They would do something about Chris Dixon. Chris Dixon, great guy, used to be a decent software guy, a New York software guy, used to invest in software companies. He has become just an actual con artist. His book is so funny. I have read sections — oh, what is it? He did a Web3 book.

Paris Marx: “Read Write Own.” Is that the one?

Ed Zitron: Yes, and it just feels like it was generated with ChatGPT. Molly White did an amazing review of it and was just like: Yeah, there are no actual examples of anything working in here. It's just: Well, if things were decentralized, that would be good. The tech ecosystem has just lost any soul. It's lost anyone to really look up to. Who do you look up to now — Sam Altman, Elon Musk? Who do you look up to in the tech ecosystem now? Who is the hero? Who is the actual person who developed something, who you can point to and go: Damn, that's cool? I mentioned Alexis Ohanian — he drew the Hipmunk logo. He was very heavily involved in Reddit. He was very much part of the software ecosystem. Sam Altman — what the fuck has Sam Altman done? Sam Altman has found ways to make himself and his friends rich. That's his biggest technological achievement. He had one company, which he sold for $40 million, that was a complete flop. That's our hero? Loopt?

Paris Marx: And then he was parachuted into the top job at Y Combinator, because Paul Graham liked him.

Ed Zitron: It's so funny. All these right-wing techno-fascists, all these conservative freaks within the tech industry, are like: You just need to pull yourself up by your bootstraps; it's a meritocracy. No, it's goddamn not. Sam Altman hasn't done anything. He has made people rich by accident. He was able to get in on early deals with companies. He's going to make $400 million on Reddit; it's insane. The people being rewarded are not doing anything. There is no longer a way of looking at this stuff and saying: This is a system that benefits hard workers, or innovation of any kind.

Paris Marx: All the things we've been talking about throughout this interview make me increasingly feel like this digital revolution that we were sold — this big transformation that was going to come from the internet rolling out and connecting us around the globe, and all that kind of stuff — is looking like it's been a failure, in large part because of how it was captured by corporations and how they've remade it. For a long time, we could see that, sure, there were downsides to this commercial model, and there were things these companies were doing that we weren't very happy with, but we could see a lot of tangible benefits that had come of what they had delivered.

But more and more, like we've been talking about, as they take the good things they've done and make them worse and worse to churn out that little bit of extra profit, the benefits that we did have are increasingly being eroded, while the drawbacks of the model they've created continue to grow and grow. And I don't know, as you were saying, how this gets reversed without a much bigger readjustment, or shakeout, to really just reset everything at this point, because so much just seems irrevocably broken.

Ed Zitron: My one hope is that I think it is changing in Silicon Valley, because the sheen has come off AI companies; they're not just automatically getting funding anymore. There are other companies coming out that do this crazy thing where they make more money than they spend.

Paris Marx: What! They’re allowed to do that in tech?

Ed Zitron: They're allowed, and they're doing it today. It's a very, very scary idea, I know. But I think there is going to be a generational shift, because none of these people have anyone to look up to in tech. The only good tech guy I can really think of is Woz [Steve Wozniak], and he hasn't really done stuff in quite some time.

Paris Marx: He just likes to pontificate every now and then and call people out.

Ed Zitron: He's genuinely like a lovely tech dork. It's great. But who do they look up to now? What great companies are there in tech where you can look and go: Goddamn, I wish I was working at Google on the thing Google's doing? Oh, maybe I could work at Meta on the Oculus, I guess? Apple, I still think — and I know we have problems with them — at least seems to realize that there are customers they have an obligation to fulfill, otherwise they won't get paid.

Paris Marx: I think part of that comes from hardware being a much bigger part of their business than it is for many of those other tech companies.

Ed Zitron: Also, they build physical things, and I think that's the thing. It's not obvious who Google's customer really is anymore. Salesforce, same deal. Meta, same deal. Facebook and Instagram have been hostile to users for a long time. There are two things that could happen. I think the AI arms race will come back around and really fuck over a bunch of companies. There's been massive overinvestment in real estate for data centers, a massive buildup to get these things running. If demand for AI starts to shrink, those things are going to sit there. If you start seeing those projects get cancelled, that's how you know things are turning.

Paris Marx: I also don’t see how a company like Nvidia doesn’t crash once the hype comes out of the AI bubble.

Ed Zitron: Jensen Huang is a smart guy, though. He's a hardware guy; he's not going to fall apart that easily. But you never know. I actually really want to dig into Nvidia's finances. I want to see how much they've scaled up to capture demand here. But if AI begins shrinking, and all of these big tech companies who invested so much money and time and marketing energy into it get burned, I think they might start thinking: Oh, maybe we need to build something for people again. I don't think it will be quick, because that worm will take a while to turn. But I also think that at some point we're going to see the current crop of CEOs cycle out, and when that happens, something good could happen, or something really bad; it really depends. The younger founders I meet these days don't seem as idealistic, but they seem much more business-oriented. You see something in founders these days where they want to tell you they're profitable very quickly. They want to let you know what they're doing and why they're doing it. There's a lot less fluff from younger people in the tech industry these days.

My vague hope is just that that generation is going to wash out the David Sackses of the world, the Jason Calacanises of the world. You've got a lot more young investors who are generally decent people. And yeah, you've still got a bunch of horror shows, but at some point Andreessen will retire. At some point, there will be a generational shift here, the same one that brought in someone like Sam Altman. But I just don't think you're going to see as many petty kings created like Sam Altman in the future. There is not a world now where your loser startup that did nothing is gonna get sold for $40 million; that's just not going to happen anymore. I'm hoping this happens, because if it doesn't, tech is eventually going to eat itself. Tech is going to make nothing but things that look good for investors rather than consumers, and they will keep doing that as much as they can until something just snaps, until they realize: There is no more revenue here, because there's no person involved. And it sucks; it sucks that something bad has to happen to fix things, but I don't see how the hell we get out of it.

Paris Marx: My view is there's no escaping that, basically; something has to really break down and fall apart for anything to get reinvented. I don't have a lot of hope that the companies as they're currently assembled, or the people in charge of them, are going to be the ones to lead us to some better future. Ultimately, it's going to take challenging the power that they have and the business models that made it this way in the first place. If the way of growing a tech company remains the way it's been for the past 20 years, then there's going to be no real incentive to change what has been going on. Part of me feels like the AI boom and the AI hype, and all the lies that surround them, are really just a bet that they can last long enough for interest rates to start coming down again, and then we go back to how everything was before the pandemic. I really think that's completely unsustainable, and I hope that's not the case.

Ed Zitron: My one other thing I'll say is that I think at some point the tech ecosystem will get tired of people hating them. They had a nice time when all the attention they were getting was very positive, and that's gone away. There are still plenty of ways to get positive press, but there are far fewer of them, and they're much harder to get if you're just a fucking liar. So my conspiracy theory is that something destabilizes OpenAI. I think that's where the whole thing begins to fall apart, because that company burns money. They make a billion dollars in revenue; no one has ever talked about their profits. And there is no path to profit for that company as it stands, unless they make chips cheaper, and it's just not going to happen, Sam. No, you're not getting $7 trillion for a chip venture; no one's gonna do that. I don't think tech people realize how hard chipmaking is, how slow it is, and how many machines you need to do it. It's kind of insane, and I don't think he realizes that, at some point, even investors are gonna say: So the only way your business works is if we give you $7 trillion to build a new chip? No.

Paris Marx: And don’t forget finding an energy breakthrough that is nowhere on the cards right now to power all this.

Ed Zitron: If $7 trillion could make that chip, why wouldn't you give it to Nvidia? Why wouldn't you give it to Jensen Huang? He is very clearly capable of doing this. He very clearly knows what he's doing. Why hasn't that happened? And the answer is because I don't think what he's asking for is actually possible. I don't think Sam Altman even really understands anything. If you hear him in speeches, he actually sounds kind of like a dullard. For years people have been telling me these people are geniuses, but you hear them and they're the least articulate people in the world. They sound so dumb! Mark Zuckerberg sounds like Mark Zuckerberg. He's just kind of like: [robot noises]. He is a robot, fine. But Elon Musk is one of the dumbest-sounding people I've ever heard. Sam Altman is dumber than that, and just boring as well, soulless. Satya Nadella can be quite an interesting speaker; Sundar Pichai less so. Tim Cook, meh. His accent's fun, a little folksy, and I'll take it over Steve Jobs. But I just think the sheen will come off at some point, if it isn't already off entirely. And let's be honest, even if we get a Republican government, they're no friends to big tech; they fucking hate them. I think tech needs to realize that the good times are here for now, but if the world turns on AI, if the world turns on these big tech companies, everyone's going to suffer again. We've already suffered several times in the last few years.

Paris Marx: I think that is basically inevitable at this point. Ed, it's always fantastic to speak with you and get your thoughts on what's going on in this tech industry. If people like this show, as they obviously do, and the critical perspectives that we have on here, Ed's show Better Offline is definitely one to check out, to hear what he has to say about the tech industry. So thanks again for taking the time to come on the show, Ed. Always great to chat.

Ed Zitron: No problem. Thank you for having me. I love coming on here.


Paris Marx

Paris Marx is a tech critic and host of the Tech Won’t Save Us podcast. He writes the Disconnect newsletter and is the author of Road to Nowhere: What Silicon Valley Gets Wrong about the Future of Transportation.
