Something is Rotten in the State of Denmark—ALPRs

The expanding role of automated license plate readers

Law enforcement “function creep,” the gradual expansion or alteration of the powers, authority, or technological capabilities of a police force or other law enforcement agency, can have serious negative implications for civil liberties and privacy. As agencies acquire new tools or reinterpret existing ones, they may begin to use those powers or technologies in ways that were never intended or anticipated. The result can be an erosion of civil liberties and an increase in surveillance and control over the general population.

Highlighting this concern, a 2022 Surveillance and Society journal article titled “From Banal Surveillance to Function Creep,” by Gabriel Pereira of the London School of Economics and Christoph Raetzsch of Aarhus University, discusses the deployment and use of Automatic License Plate Reader (ALPR) systems in Denmark since 2016 for various purposes, including parking, environmental zoning, and policing. The authors argue that the configurations of these systems and their intended uses can have unintended consequences: function creep, in which the systems come to be used for purposes beyond those for which they were originally designed.

ALPRs are “high-speed cameras that can rapidly scan numerous computer-readable images of license plates, eliminating the need for law enforcement personnel to do manual checks.” ALPRs are the focus of my 2022 report titled The Pitfalls of Law Enforcement License Plate Readers in California and Safeguards to Protect the Public. Despite their potential to solve crimes, ALPRs have troubling error rates, may misappropriate police resources, and risk the privacy and safety of drivers. 

The authors provide three case studies to illustrate the function creep concept: the use of ALPRs in privately owned parking lots to simplify payment processes, the enforcement of environmental zoning regulations (a ban on highly polluting vehicles, such as diesel trucks, from entering certain parts of the city), and the surveillance of roads by Danish police to combat “serious and organized crime.” In each case, the authors highlight the potential for function creep and its consequences, including threats to privacy and the potential for abuse of the systems.

Of the three uses investigated, one stands out in particular: the Danish use of ALPRs for general policing. In Denmark, ALPR systems were “initially deployed as a border technology” with the stated goal of assisting law enforcement on matters of organized crime. ALPRs would be set up at border crossings and were meant to help track down purveyors of explosives and terrorism. However, their use quickly ballooned beyond its original purpose.

Pereira and Raetzsch wrote:

[ALPR’s] use has broadened beyond border control, with coverage increasing from 48 to 171 mobile cameras on top of police cars and 24 to 160 stationary cameras, most of which are placed at border crossings and large highway junctions. In 2022, the police expect a further expansion of the system to a total of 276 mobile and 272 stationary cameras, which will enable an even more granular analysis of vehicle movements across the country. 
[T]he ALPR system serves a much wider function than just “serious and organized crime.” In reality, it operates within the wider goal of turning the police into a data-driven organization, part of a push for “Intelligence-Led Policing” (ILP).

For a critique of “intelligence-led policing” and “predictive policing,” see Appendix A of my ALPR report as well as my 2019 commentary.

Danish police also increased the data retention limit, the period during which police are allowed to hold on to stored records, from twenty-four hours to thirty days, all while designating nearly the entire country a “targeted operation.” Another point of focus in my report is the danger posed by lax retention limits.

As the Danish experience shows, one of the dangers of function creep is the potential for surveillance technologies to expand their scope beyond their original intention. For example, if a police force is given the authority to use ALPR technology to identify criminal suspects, it may eventually begin to use that technology for other purposes, such as monitoring and tracking the movements of innocent people. This can lead to a loss of privacy and a feeling of being constantly monitored, which can have a chilling effect on personal freedom of expression and other fundamental rights. It is difficult to put the genie back into the bottle. 

It is important for law enforcement agencies to be transparent about how they are using their powers and technologies and to ensure that they are only used in ways that are justified and necessary. This requires robust oversight mechanisms to ensure that the powers and technologies of law enforcement agencies are not being abused or misused. It also requires a commitment to protecting civil liberties and the rule of law, and a willingness to hold law enforcement agencies accountable when they fail to uphold those values.

The author would like to thank Lawrence McQuillan for his comments on an earlier draft. 

Jonathan Hofer is a Research Associate at the Independent Institute. He has written extensively on both California and national public policy issues. He holds a BA in political science from the University of California, Berkeley. His research interests include privacy law, student privacy, local surveillance, and the impact of emerging technologies on civil liberties.