A 30-second ad from Ring stood out among Sunday’s glut of Super Bowl commercials shilling myriad forms of artificial intelligence, serving as a frightening reminder of how ubiquitous surveillance cameras have become amid our diminishing privacy. After all, if the “Search Party” function built into the doorbell camera’s app can be used to find a lost dog, as Sunday night’s ad emphasized, there’s little to stop it from being used to track people. The doorbell camera’s feature is especially worrisome now as the government encroaches on our civil liberties.
Pointing out the potential for the Amazon-owned Ring to be used to quash dissent may sound alarmist considering the hopeful and upbeat tone of Ring’s ad, which pitches the tool as an update to the missing dog posters a family might hang around their neighborhood. As Ring founder Jamie Siminoff tells viewers in the ad, “One post of a dog’s photo in the Ring app starts outdoor cameras looking for a match,” using artificial intelligence “to help families find lost dogs.” Walking down a bucolic street with what looks like a Belgian Malinois, Siminoff implores potential users to “be a hero in your neighborhood.”
The ad’s imagery of a web of cameras almost fully covering a suburban neighborhood with its automated scans may fill some people with a sense of security. But unlike the camera’s Fire Watch system, which can pick up signs of smoke and flames during wildfires, the use of AI to track a living creature’s movement is a worrying escalation. Even the sight of paid-actor dogs being reunited with their paid-actor families can’t distract from that disturbing subtext.
No matter how Ring and other surveillance tech companies may downplay it, there’s no world in which finding lost dogs is the final end-use for this technology.
It’s also the absolute wrong time in the United States for an ad promoting a warmer, fuzzier version of the panopticon. Just last month, activists opposed to Immigration and Customs Enforcement accused that agency’s officers of using Ring cameras to facilitate their deportation raids and arrests. At issue is Ring’s announcement last October that it had signed a deal with Flock Safety, a company that scans license plates and counts local law enforcement among its clients.
Here’s how CNBC covered the newly formed partnership at the time:
The Ring Community Requests feature will be available for use with the FlockOS and Flock Nova platforms that are contracted by local public safety agencies. That will enable law enforcement officers to directly request video evidence from Ring cameras, but citizens will make the decision whether to share video. Police requests will go into what is called the Ring Neighbors feed, which pings camera users within an area identified as relevant to the crime, and camera owners can then share video, which is kept in a secure environment and can only be used for the single crime investigation.
Both Ring and Flock have issued statements denying that their footage is being funneled directly to ICE or any other subagency of the Department of Homeland Security. A blog post from Flock last month stressed that “every piece of data collected by Flock license plate readers is owned and controlled by the customer, whether that customer is a city, county, school district, or private organization.” (The major exception comes in the form of legal orders to turn over that footage, like warrants and subpoenas.) But some of those cities or law enforcement agencies serving as customers may have information-sharing relationships with ICE or DHS already in place, letting them potentially serve as a middleman for that data.
That’s not to say that DHS doesn’t already have an alarming number of AI tools at its disposal. As my colleague Ja’han Jones has previously noted, the sheer pace and number of ways the department is incorporating AI is troubling. That includes an almost $30 million system that Palantir has developed to “help find and track individuals for deportation.” More distressing still is an app called Mobile Fortify that federal agents in Minneapolis have used to scan protesters’ faces for rapid identification.