The city is at the forefront of the fight against Big Tech's surveillance of residents. But AI poses new threats.
Illustration by Melinda Beck.
Last May, Local Progress, an organization whose membership includes almost 2,000 locally elected progressive officials from around the country, issued a report warning of the rapidly growing dangers faced by communities due to the spread of AI-based surveillance systems. It cited as an example networks of automated license-plate readers (ALPRs) and smart streetlights, which enable police departments to identify specific suspects by combining crime witnesses’ descriptions of individuals and vehicles with the massive amounts of data on residents’ movements collected by these systems. Because AI excels in pattern recognition, it potentially allows police to find needles in an urban haystack, locating people and then tracking their movements in real time with pinpoint accuracy. Companies such as Flock Safety, founded in 2017 and based in Atlanta, and Ubicquia, based in Fort Lauderdale, have made fortunes providing such systems to federal, state, local, and private entities. Flock Safety alone operates 80,000 AI-powered cameras in 6,000 communities, and it is currently launching a new “surveillance drone” product to hoover up still more data.
While city governments and police departments portray the technology as inherently benign—simply a souped-up, “smart” crime-fighting tool that increases the efficiency of law-enforcement activity by orders of magnitude and comes with no downsides for law-abiding residents—critics argue that the growing use of these products represents a serious threat to civil liberties. Privacy advocates worry that in an era of mass surveillance, such systems could be exploited by bad actors and used to monitor political protesters, women seeking reproductive care, immigrants just trying to go about their business, and others.
“There has thus far not been an automated license-plate-reader system able to adequately protect the technology from being co-opted by authoritarianism,” says LiJia Gong, the legal and policy director at Local Progress. “These tech corporations oftentimes use cities and local governments as testing grounds to normalize surveillance and automation tools.” Many of the contracts for ALPR systems and smart-camera facial- and vehicle-recognition networks allow companies to update the underlying software without seeking approval from local authorities—meaning that when these systems are installed, it’s almost impossible to know how their tracking capabilities will develop as advancements in AI lead to improvements in its ability to recognize patterns and make predictions.
As the Trump administration carries out an unprecedented crackdown on immigrants, the ability of Customs and Border Protection to access such data has raised alarms, and there have been a number of reports alleging that federal agents may have found backdoor entry points into ALPR and smart-streetlight systems, even in states that prohibit their police departments from cooperating with these agencies. There have also been allegations that a Texas woman was tracked after self-administering an abortion. In October, the University of Washington’s Center for Human Rights reported that the US Border Patrol had backdoor access to the surveillance networks of at least 10 police departments in the state, none of which had authorized such use of their data—though police analysts argue that since Flock Safety has disabled its software’s ability to share information with federal agencies in states that prohibit such data-sharing, the “back door” was likely rogue cops illegally passing along information to the feds.
Flock denies that there is a back door to its technology or that it shares locally or state-owned data with ICE. Josh Thomas, the company’s chief communications officer, tells me, “We don’t work with ICE. We’ve never worked with ICE. We have no contract with ICE. There’s no back door into the Flock Safety system. All of our customers 100 percent own and control their own data. Flock doesn’t share it at all or sell it to any third parties.”
Few cities in the country have been as consumed by debates over the use of AI-powered surveillance systems as San Diego, which has a small police force for a city its size—fewer than 1,900 officers for roughly 1.4 million people—and has long relied on high-tech crime-fighting tools to fill the gaps in its personnel and funding. Over a decade ago, San Diego’s district attorney used surveillance data to link nearly three dozen young African American men to local gang violence and to charge them with offenses connected to several shootings, despite the fact that they weren’t near the scenes of the crimes at the time. Many took plea bargains, and the charges against those who didn’t were eventually dismissed; if their cases had proceeded, each of those young men would have faced many decades behind bars.
“Far before Trump, we’ve been concerned with federal overreach and data-sharing,” says Homayra Yusufi, a senior policy strategist at the Partnership for the Advancement of New Americans (PANA) in San Diego. Districts that are mainly populated by people of color and immigrants are “over-policed and over-surveilled,” she says, and cameras and ALPRs—with their telltale bug-like antennae—speckle virtually every intersection in these neighborhoods.
The San Diego Police Department has a contract with Ubicquia to provide ALPR and smart-streetlight systems; Ubicquia, in turn, has subcontracted out the implementation of the project to Flock. The two companies and the SDPD say the technology is designed to leave a digital trail of who has asked for information and how the information has been shared, “so if there is abuse or if somebody lies about it, they can be held accountable by the appropriate governing bodies,” Thomas says. Capt. Charles Lara, who oversees the SDPD’s Research, Analysis, and Planning Unit, says that the department regularly conducts audits of how its surveillance systems are being used and who is accessing the information, and claims that the systems are less invasive than the phones everyone carries around with them. “No one wants to live in a police state,” Lara says. “But at the end of the day, people are misunderstanding the Fourth Amendment in public places.” (While the Fourth Amendment guarantees a right to privacy in private spaces, courts have found that it generally doesn’t guarantee such a right in public areas.) Moreover, all officers are given guidance specifically stating that per California law, they cannot share surveillance data with the feds or other out-of-state agencies, and if they do, they will be reported to Internal Affairs and disciplined, Lara says.
Despite such reassurances, immigrants’ rights organizations and privacy advocates are increasingly alarmed by the federal government’s use of every available tool to clamp down on perceived enemies and fear that, despite laws that limit information-sharing, the SDPD’s surveillance tools could at some point be put to use by the feds.
Notwithstanding the California Values Act passed in October 2017, which limits local and state law-enforcement cooperation with ICE and other immigration-enforcement agencies, some more conservative cities in the San Diego metropolitan region have reportedly shared surveillance data with the feds. One of them, El Cajon, is being sued by the state’s attorney general, Rob Bonta. “Despite clear guidance and multiple warnings, the City of El Cajon Police Department continues to share this data with numerous out-of-state law enforcement agencies throughout the country,” Bonta’s office claimed in the October 2025 press release announcing the lawsuit, which was filed after a local PBS station reported that El Cajon’s surveillance data was used in immigration-related searches at least 550 times in the first nine months of 2025.
In 2016, San Diego signed contracts to install more than 3,000 smart streetlights and ALPRs, although, perhaps fearing a backlash, the city neglected to tell the public about the new devices until three years later. By then, Lilly Irani, a professor of communication and science studies at the University of California San Diego and a former Google engineer, was warning of “data creep” and—like activists in the San Francisco Bay Area; New York City; Portland, Oregon; and elsewhere—was decrying Big Tech’s use of massive datasets on residents’ movements to create the building blocks of a total-surveillance society.
In the local press, Irani warned that her neighbors were being used as “lab rats” for Big Tech social-engineering experiments. Khalid Alexander, an activist with Pillars of the Community, an organization that works with young Black men who have long been targets of police surveillance and whose names appear in gang-membership databases, teamed up with other privacy advocates to organize a series of community meetings to discuss the dangers of a surveillance state.
Those meetings gave rise to the Transparent and Responsible Use of Surveillance Technology San Diego (TRUST SD) Coalition. Made up of dozens of racial- and community-justice groups, civil-liberties advocates, and labor organizations, it pushed not for the elimination of surveillance technology but for the creation of a citywide ordinance that would place restrictions on the use of the technology and ensure that the public had a right to know when it was being deployed.
Over the coming years, TRUST SD’s activists were joined in their opposition to runaway, unregulated AI by Sean Elo-Rivera, a member of the San Diego City Council (as well as of Local Progress) with a background in community organizing. A tall, lean man with short salt-and-pepper hair and a trim goatee, Elo-Rivera dresses casually—black jeans, scuffed black Dr. Martens, a green jacket emblazoned with his name over a white vest, and a button shirt open at the neck. His City Hall office is adorned with civil-rights and organizing posters, and his bookshelf displays a hefty edition of the works of Che Guevara. “There’s no disputing” that AI-driven technology helps fight crime, Elo-Rivera acknowledges. But it comes “at a cost to people’s rights and civil liberties.”
Ever since he was elected to represent an immigrant-heavy district in the heart of the city, Elo-Rivera and his onetime City Council colleague Monica Montgomery Steppe (now a county supervisor) have led the fight against renewing San Diego’s contracts for the smart technology. In 2022, with municipalities around the country rethinking their approach to law enforcement in the wake of George Floyd’s murder and the resulting racial-justice protests, they scored a major victory when the contract with Ubicquia for smart streetlights wasn’t renewed—only to butt up against what they claim was the company’s initial refusal to actually turn off the cameras and kill the data streams. “San Diego’s not the only place where that’s happened—where a municipality has said ‘Shut ’em down’ and they’ve continued to let them run,” Elo-Rivera says. Similar stories have been reported in Eugene, Oregon; in Verona, Wisconsin; in Evanston, Illinois; and in Cambridge, Massachusetts. In Evanston, the city attorney sent Flock a cease-and-desist letter accusing the company of “an intentional and unauthorized disclosure of protected data” after it allegedly reinstalled cameras without the city’s permission. Flock Safety denies the claim, saying that when a city cancels its contract, the cameras no longer upload to the cloud even if they are still capturing images.
During that same period, as San Diego was rethinking its embrace of surveillance technology, Elo-Rivera helped push the landmark Transparent and Responsible Use of Surveillance Technology (TRUST) Ordinance through the City Council, which was passed in 2022. TRUST mandated that the city’s decisions about installing AI-driven surveillance technology be carried out in public and go through a formal approval process; that the police specify how the technology would be used; and that an “impact report” be submitted for each new technology. Locals also gained the right to sue the city if it turned out that the technologies were being used in ways not authorized by the council and in a manner likely to result in harm to San Diegans. In addition, the ordinance created a Privacy Advisory Board made up of technology experts, civil-rights advocates, community-group leaders, and others who could articulate concerns about the new AI surveillance technology and recommend safeguards. Khalid Alexander, of TRUST SD, was among those appointed to the board.
For a while, it looked as though one of America’s most livable, laid-back cities was bucking the trend toward total surveillance. Lara, the SDPD captain, bemoans the fact that the ordinance—which he says is the most restrictive big-city privacy legislation in the US—doesn’t allow his department to share its data with other jurisdictions and creates a lag time of many months between when the police request new technology and when it can be approved and deployed. Lara calls it “probably the most challenging thing we’ve had to address as we move toward a modern police department.”
The privacy advocates’ victory was, however, short-lived. The ordinance does continue to force the SDPD to be more public about how it is using data, but over the past few years, the Privacy Advisory Board’s recommendations on whether to deploy the technology have been increasingly overruled or simply ignored by local officials.
A couple of years after the board’s creation, the SDPD requested that the Ubicquia contract (and the Flock subcontract) for 500 smart streetlights be reactivated. With backing from newly elected Mayor Todd Gloria, and with the public souring on the reforms put in place in the aftermath of George Floyd’s murder, a majority of Elo-Rivera’s council colleagues voted for a five-year contract, the funding for which would have to be reapproved annually. When, last year, the board again recommended that the city not continue to fund the Ubicquia-Flock contract, citing numerous allegations nationwide regarding the misuse of such data, the SDPD lobbied heavily for renewal. The City Council agreed to its request.
That outcome was a major disappointment for the activists who had pushed for changes in the city’s relationship to AI surveillance technology. They had expected the ordinance and the Privacy Advisory Board to rein in the sprawling surveillance systems. Instead, in the years since, the technologies have grown in power, and the police have come to rely more and more on the use of AI-generated data.
“The city has undermined the process since day one,” Alexander says in frustration. “The board is made up of volunteers, and the city has refused to fund any research staffers for the board. The disappointment is beginning to outweigh some of the smaller [victories] we’ve won along the way.”
If you look at maps of smart-camera and ALPR deployments in San Diego, you see clusters in the poorer, non-white areas: in Barrio Logan, in the African American communities of South San Diego, in the urban core around Downtown. By contrast, in the more affluent northern and eastern suburbs, there are far fewer cameras and ALPRs. The clear takeaway is that, as with so much else in modern American life, the burdens of surveillance disproportionately fall on the poor and the non-white.
Even before the Ubicquia-Flock contract was renewed, Alexander had quit the privacy board, which he had come to see as a paper tiger. In his resignation letter, he bemoaned the fact that “the Mayor and majority of council members have shown that they are either unwilling or unable to even consider rejecting requests coming from SDPD.”
Elo-Rivera isn’t quite as pessimistic. Over the past few years, even as the campaigns to moderate San Diego’s use of AI-driven surveillance systems have foundered, his efforts to rein in AI usage have expanded beyond surveillance technology.
Having pushed back against price-fixing AI software in the rental market, Elo-Rivera has now shifted his attention to Waymos, the driverless cabs that have been operating in San Francisco for the past few years and are increasingly prevalent in San Diego. California law prohibits localities from implementing outright bans on Waymos on public streets, but private entities can still impose restrictions. Elo-Rivera has been in talks with the recently expanded San Diego International Airport to see if he can persuade its executives to bar Waymos from picking up or dropping off customers on its property. “It is imperative to me that we ask ourselves who benefits from these technologies and in what way,” Elo-Rivera says. “There are a lot of people here who drive either for primary income or secondary income. Our commitment is to make sure everyday people are centered in the decisions of how these technologies are used.”
A few years into his tenure on the City Council, he became interested in how landlords were using AI to skirt antitrust laws and collectively jack up rents by feeding their own data into software that identifies patterns across the city’s rental markets. “People bust their ass here, because it’s really fucking expensive,” Elo-Rivera says. “They’re running a marathon on a track tilted 90 degrees against them. The playing field is completely tilted in the wrong direction.”
Mia Loseff, the housing-program manager at Local Progress, also recognized that this was a problem. Her organization began studying the impact of such software four years ago and found that algorithmic price-fixing raises rents by an average of $70 per month—with some cities, such as Atlanta, seeing much higher increases. In part because of this research, one county and 12 cities in the United States, including San Diego, have banned the use of such software over the past 18 months. In October, legislators in California took the process a step further, outlawing its use statewide.
Elo-Rivera is also trying to persuade his City Council colleagues to support a ban on the dynamic-pricing strategies that grocery stores are rolling out, which use AI to hike the price of staple foods during high-demand shopping hours. Customers don’t know in advance what they’re going to pay for food and may even find that prices change while they’re inside the store if there’s a surge in shoppers. “I think it is wrong for consumers to be subject to price volatility based on the time of day they’re going to shop for basic items,” he says.
San Diegans’ efforts to regulate AI have had mixed success. Despite the accountability coalitions and the creation of the Privacy Advisory Board, AI surveillance systems have proved remarkably difficult to control. Advocates feel as though they’re playing an endless game of Whac-A-Mole.
“We built the biggest coalition against Flock. We had 70 organizations sign on, the labor council, unions. But the police seem to be more powerful than any of them,” Irani, the UCSD professor, says.
For Elo-Rivera, AI represents one of the defining political challenges of our time. Before he was elected to the City Council, he was employed by a nonprofit organization working with at-risk young people. “My work was with communities that had to fight incredibly hard for things they should not have to fight for,” he says. “Access to transportation, access to healthy food in schools.” That experience primed him to bring a critical eye to the society-upending technologies that the Big Tech proponents of an AI revolution are unleashing. “The scale of these changes is unrivaled,” Elo-Rivera says. “The recipe for people giving up, for cynicism, for real social turmoil—it’s all there. Tech companies are attempting to squeeze consumers from every angle they can, and we have to be able to push back and fight on every single front.”
Sasha Abramsky is the author of several books, including The American Way of Poverty, The House of Twenty Thousand Books, Little Wonder: The Fabulous Story of Lottie Dod, the World's First Female Sports Superstar, and Chaos Comes Calling: The Battle Against the Far-Right Takeover of Small-Town America. His latest book is American Carnage: How Trump, Musk, and DOGE Butchered the US Government.