Denver Must Reject the Expansion of Mass Surveillance

Denver Mayor Mike Johnston is pushing to replace Flock Automatic License Plate Reader (ALPR) cameras with a new system from Axon. While this may sound like a simple vendor change, the reality is far more concerning: this proposal would expand mass surveillance in our city, with serious consequences—especially for communities of color.

The Denver City Council is now preparing to vote on this contract Monday, March 30th. This is a critical moment for our communities.

What Are ALPR Cameras—and Why Do They Matter?

ALPR cameras don’t just scan license plates. They track people.

Every time a vehicle passes one of these cameras, the system captures an image, logs the location, and records the exact time. Over time, this creates a detailed map of a person’s life—where they live, where they work, where they spend time, and even who they may associate with.

The Denver Police Department has also confirmed that these cameras can livestream video, raising even deeper concerns about real-time surveillance.

Importantly, you don’t have to be suspected of a crime to be tracked. These systems collect data on everyone.

We’ve Seen This Before

Denver has already experimented with ALPR surveillance through its partnership with Flock. That rollout sparked widespread concern about data sharing, privacy violations, and the lack of transparency and oversight.

Now, instead of addressing those concerns with meaningful safeguards, the city is moving to replace Flock with Axon—another surveillance vendor—without putting proper protections in place.

The Problem with “Predictive” Policing

These technologies rely on artificial intelligence systems trained on historical policing data. But that data reflects decades of racially biased policing, over-surveillance, and the criminalization of Black and Brown communities.

AI doesn’t predict crime—it predicts where police have historically focused their attention.

This creates a dangerous feedback loop:

  • Biased policing produces biased data
  • Biased data trains AI systems
  • AI systems justify even more surveillance in the same communities

The result is not safety—it’s the reinforcement of existing harm.

Even Axon’s own AI Ethics Board warned about this trajectory, cautioning that these tools could lead to “a race to the bottom of more pervasive and more powerful surveillance.” The board ultimately resigned after the company ignored these concerns.

A Slippery Slope Toward an AI-Driven Surveillance State

Axon is not new to Denver. The company already provides Tasers, body cameras, and drones to the Denver Police Department. Adding ALPR cameras would further consolidate a network of surveillance tools under one company—moving us closer to what Axon itself describes as “real-time, AI-powered crime centers.”

This is not a future our communities have asked for.

Holding law enforcement accountable is already difficult. Holding opaque AI systems accountable for bias, misuse, or error will be even harder.

What Real Safety Looks Like

Mass surveillance does not make our communities safer.

Real safety comes from investing in:

  • Housing
  • Healthcare
  • Education
  • Community-based care

Instead of expanding surveillance infrastructure that feeds systems of over-policing, incarceration, and deportation, Denver should invest in the resources our communities actually need to thrive.

Take Action Now

The City Council vote was moved to Monday, March 30th. Here’s how you can make your voice heard:

1. Sign the petition
Tell Denver City Council to vote NO on the Axon ALPR contract:
https://secure.ngpvan.com/XMyoiCFJ3E2acNZUX04Hpw2

2. Call your City Council member
Demand a NO vote and speak out against mass surveillance and predictive policing.

3. Show up in person
Join us at Denver City Council on Monday, March 30th, at 5:30 PM.