AI and Unintended Consequences

Ángel Díaz considers the role of technology in perpetuating inequality and racism.

By Andrew Faught


Sixty-eight million American homes are equipped with doorbell cameras. Professor Ángel Díaz researches how the seemingly neutral devices can reinforce inequality, particularly for persons of color.

"Ring cameras do the invisible work of policing who does and doesn't belong in a community,” he says. “Marginalized communities can be left feeling out of place in their own homes."

“One of the big issues that we have in law is the difference between the rights that you have in your private home versus the rights that you have in public spaces,” says Díaz of private surveillance. “And so what that means oftentimes is that the Fourth Amendment provides fewer protections when you are moving about the public sphere.”

Díaz explores these challenges in “The Public Harms of Private Surveillance,” his UCLA Law Review article, which outlines a novel framework for thinking about the racial harms that chill equal access to public roads and sidewalks.

“Footage captured by your neighbor might actually be handed over to a police department without your knowledge or consent,” Díaz says. The cameras are known to fuel neighborhood watch apps with biased descriptions. That, in turn, has led to increased surveillance, racial profiling, and erroneous reports of crime. “If you appear suspicious running in your neighborhood, there might be a lively discussion on Nextdoor coordinating how to call the cops to make you leave,” he adds.

Traditionally, legal discussions around surveillance focus on privacy, particularly the rights of the person who owns the device, as stated in a manufacturer’s terms and conditions. Unlike constitutional rights, which are publicly debated and legally enforced, these private agreements can be changed on a corporation’s whim.

“One of the ideas that I try to advance through my scholarship is that we should really embrace our common-law traditions to think about novel harms,” Díaz says. “Maybe we should think about using these frameworks and ask, ‘To what extent are new technologies transforming old methods to use surveillance to exclude people from the public sphere?’”

Díaz’s approach reframes surveillance as a collective issue, opening the door for new forms of legal theorizing around harm and accountability. The shift toward corporate malleability of privacy rights represents a significant transfer of power. He suggests that a handful of technology companies now play a central role in shaping civil liberties, often without the transparency or accountability expected of public institutions.

Doorbell cameras are just one of the seemingly race-neutral technologies that Díaz studies. His research includes “Online Racialization and the Myth of Colorblind Content Policy,” which considers how social media speech policies “obscure and legitimize systems of white supremacy” via moderation restrictions that set an incredibly high bar for enforcement. “By requiring explicit racial animus or undeniable calls to violence before company intervention, content policy largely shields the vast arsenal of attacks available to white voices who trade in the language of coded messages and dog whistles,” he writes.

Tech and Law: An Origin Story

Díaz, who joined LLS at the beginning of the 2025-26 academic year and is a former editor of the California Law Review and the Berkeley Technology Law Journal, brings his perspective on the intersection of technology and the law to his range of classes: “Contracts,” “Information Privacy Law,” and “Race, Technology, and the Law.” His scholarship examines how private law interacts with public regulation to shape access to public space and privatize inequality.

The genesis of his research is personal.

“My childhood helped me understand how much communities of color are treated as inherently suspicious, but also how much these same communities are central to the vibrancy of public life in American cities,” Díaz says.

Born and raised in Los Angeles, Díaz is the child of immigrants from El Salvador. Early on, his mother’s work put him squarely at the intersection of technology and privacy.

“Like many children of immigrants, I began advocating for my parents as early as elementary school,” Díaz says. “One of my first memories is asking a family to remove a nanny cam from a home my mother was cleaning.”

Those early experiences helped shape his desire to study the law and to understand the scope and possibilities of constitutional rights and freedoms.

Díaz’s research dovetails with an interest in critical race theory (CRT), the academic framework that states racism transcends individual biases and is embedded in legal systems, policies, and institutions.

“Law plays an important role in how Americans think about race and racial discrimination,” he says. “CRT has always engaged with the politics of the day, criticizing both conservative and liberal policy solutions. My students study interventions across the political spectrum and how to leverage the power of law to dismantle racial hierarchy.”

"We should think about using these frameworks and ask, ‘To what extent are new technologies transforming old methods to use surveillance to exclude people from the public sphere?’”

The rapid expansion of AI into everyday life has introduced not only convenience and efficiency, but also a complex web of legal, ethical, and social questions that are still unfolding. “One of the things that I want our students to really think through is how in more and more industries, we have corporate terms governing public and private rights more than our Constitution,” Díaz says.

Many of the cameras rely on artificial intelligence for features such as facial recognition, behavior analysis to assess whether a visitor is a security threat, and smart object recognition to minimize false notifications. Biases in these systems can have unintended effects on communities of color.

Lessons Learned: Scholarship Takes New Approach

For Díaz, if communities can articulate the harms they experience, then legal systems can recognize and respond to those harms. Policymakers, in turn, can help to develop frameworks that prioritize fairness and accountability.

“What I'm really the most interested in is for impacted communities to be able to describe the harms that they're experiencing, and to have the law hear them and adapt and respond to them,” Díaz says. “Part of this is looking at the practices of marginalized groups, how they are interacting with these new technologies, and trying to describe the harms that they're experiencing.”

As technology becomes more deeply integrated into daily life, Díaz supports efforts toward thoughtful regulation and accountability. In the U.S., protections against surveillance are inconsistent and fragmented. Federal laws, such as Fourth Amendment protections against unreasonable searches and seizures, limit certain government actions. State laws, such as California’s constitutional right to privacy, offer additional safeguards. Local governments may impose their own restrictions, such as bans on facial recognition.

But these protections vary widely by jurisdiction and often fail to address the full scope of modern surveillance technologies. Díaz proposes looking to older areas of law, particularly the concept of public nuisance. Traditionally, public nuisance law deals with actions that interfere with the rights of the general public, such as blocking a roadway or polluting shared resources.

He suggests that this framework could be adapted to address modern surveillance practices. For example, if a network of private cameras effectively deters certain individuals from using public sidewalks due to fear of harassment or monitoring, this could be seen as an interference with their right to access public space. In other words, it would amount to a new definition of public nuisance.

This approach represents a shift in how legal scholars think about harm. Rather than focusing solely on individual privacy violations, it considers the broader social impact of technology on collective rights. It also opens the door to new forms of legal action, including lawsuits brought by individuals who can demonstrate that they have been uniquely harmed by these practices.

While much of Díaz’s focus is on marginalized communities, he says the issues affect everyone. Surveillance systems and the data they collect can be used in ways that extend beyond their original purpose. Individuals who participate in protests or express dissenting views may find themselves subject to increased scrutiny. Even those who might otherwise feel insulated from these concerns can become targets under a growing dragnet, Díaz says.

“The main focus of my work is to first describe the problem and then use that to push for legal reforms that move us toward a more equitable society.”

Andrew Faught is a journalist and author whose work has appeared in The Pennsylvania Gazette, the magazines of UC Berkeley Haas School of Business, Villanova University, Hopkins Bloomberg Public Health, Harvard Kennedy School, Loyola University Maryland, Smith College, and more.