Salzburg Global Fellow Nessa Lynch analyzes the negative impacts of emerging technologies in justice systems on individual and collective human rights
This article was written by Salzburg Global Fellow Nessa Lynch, who participated in the Global Innovations on Youth Violence, Safety and Justice initiative.
Emerging technologies such as artificial intelligence, predictive technologies, and remote biometric surveillance are in increasing use globally, including in criminal justice systems. However, the rapid development and expanding usage of emerging technologies has been led predominantly by commercial interests rather than deliberative policy or legislative processes.
These emerging technologies negatively impact individual and collective human rights, such as privacy and freedom of expression. They are increasingly a feature of youth justice systems, where they both raise new issues and embed existing systemic inequalities and human rights harms.
As used here, the term “youth justice system” encompasses law enforcement, diversion, and court and sentencing systems. More broadly, similar trends can be observed in adjacent and interlocking systems such as immigration, border control, and national security.
Potential uses of emerging technologies in youth justice systems range from public-facing applications, such as surveillance, data analytics, open-source intelligence analysis, and individual and geographic predictive analytics, to internal processes, such as the use of generative AI in police report preparation. Worldwide, regulators such as the European Union are beginning to impose regulations and constraints on such technologies.
Uses of Technology in Youth Justice Systems
A first example, remote biometric surveillance, is an increasingly prevalent and potentially highly intrusive use of technology. This refers to systems which can track or identify a person through automated processing and algorithmic matching of their biometrics. Facial recognition technology is the best known of this suite of technologies and can be used in real time or retrospectively as an investigative tool. In a previous co-authored study, we analyzed the effect of facial recognition technology on youth justice systems. We found that this technology can impact children’s procedural rights in the justice system, as well as adjacent contexts such as education and protest movements like youth climate justice. We also found that groups such as minority and Indigenous children may be disproportionately impacted, especially where there is bias either in the operation of the system or in the locations where it is deployed.
A second example is predictive analytics in policing: applications of artificial intelligence and machine learning that use data and algorithms to anticipate and prevent crime and harm. It can be used in criminal network analysis, “hotspot” policing, and the analysis of social media and other data for extremist or hateful content. While the use of statistics and probability to guide decision-making and the deployment of law enforcement resources is not novel, the addition of powerful AI systems able to analyze and make predictions from individual and collective data makes such systems far more capable, and the potential human rights impacts more acute. In youth justice systems, these impacts are particularly pronounced.
The use of this technology raises several distinct concerns relating to children and youth. Children and youth in conflict with the law almost inevitably have histories of exclusion from education, contact with welfare systems and mental health systems, generating significant amounts of data. Collection of children’s data often occurs as a byproduct of law enforcement interactions with their family members. The addition of predictive analytics can super-charge existing biases and discrimination in the criminal justice system. This can result in increased policing and surveillance of already marginalized communities, such as racial and ethnic minority children, causing a detrimental impact on children's sense of safety and well-being.
There are two regulatory lenses which can frame principled decisions on whether and how to use technologies in justice systems. The European Union is at the forefront of global moves to prohibit or constrain technology use-cases that are at odds with human dignity and fundamental rights. Under the European Union’s Artificial Intelligence Act, there are significant restrictions on remote biometric surveillance for law enforcement purposes in public spaces, which took effect on February 2, 2025. Use of this technology is now confined to a few specific situations, including imminent terrorist threats and finding missing people and vulnerable victims, and safeguards include prior independent authorization. Predictive policing based on individual risk factors of natural persons is also banned. In this aspect of regulation, the European Union is signaling that there are certain “redlines” in the use of technology in the justice system.
An approach that centers children’s human rights could effectively minimize the risks of emerging technology for children. Such an approach would also provide space for innovation where there are appropriate safeguards, and the use is in the child’s best interests. This requires centering children’s participation in regulatory design, and taking account of their evolving capacities, rather than adopting a protectionist and paternalistic approach.
Since 2021, Fellows participating in the Global Innovations on Youth Violence, Safety and Justice initiative have contributed to the initiative’s report. Each section highlights the key challenges and opportunities identified by the initiative’s international participants, with illustrative case studies, recommendations for consideration and action, and suggestions of where the research agenda should focus in future. The report is continuously updated to reflect new findings, case studies, and resources.
Explore the full digital report here.
Learn more about the Global Innovations on Youth Violence, Safety and Justice initiative.
Salzburg Global is grateful to the MacArthur Foundation and the Harry Frank Guggenheim Foundation for their generous support and partnership that made this program possible.