Faculty of law blogs / UNIVERSITY OF OXFORD

Political accountability at the digital border: Who’s responsible when technology takes over?


Samuel Singler is a Lecturer in Criminology at the Department of Sociology, University of Essex. His research focuses on the relationship between digital border control technologies and criminal justice practices, particularly in the Global South. This is the third post in the Border Criminologies 'Southern Perspectives on Border Criminology' themed series curated by Rimple Mehta and Ana Aliverti. You can find Samuel's corresponding article in the International Journal for Crime, Justice and Social Democracy here: Performativity, Pragmatism and Border Control Technologies: Democratising the Ontologies of Border Criminology


New technologies have repeatedly transformed attempts to control cross-border mobility. Now, automated digital technologies and algorithms are part and parcel of border control globally, and many new tools have quickly receded into the mundane background of everyday life. Identity verification, alert list checks and risk assessments are carried out behind the scenes, out of sight and out of mind of most border crossers. The invisibility of these tools is presented as one of their key benefits by public authorities.

Digital border control technologies present both risks and opportunities. Their developers promise “greater transparency” and facilitation, while critics argue that data-driven borders place vulnerable individuals at greater risk. Who should we believe? When digital tools have negative effects, who can we hold accountable? In an age of increasingly autonomous technologies, is our traditional understanding of human accountability becoming outdated?

The global diffusion of digital border control tools—often developed in the Global North and deployed by Southern state authorities—suggests that postcolonial power operates through them. Rethinking technology can contribute to decolonizing and Southernizing border criminology by destabilizing dominant Northern forms of knowledge production about migration and border control. In my article, I illustrate my theoretical arguments with reference to the Migration Information and Data Analysis System (MIDAS), developed by the (primarily Northern-funded) International Organization for Migration and deployed in Nigeria.

Theorizing technology can help us answer pressing political questions about accountability and social justice at digital borders, but there are many ways of understanding what exactly technologies are and how much agency they have. Our preferences regarding these theoretical alternatives determine our answers to the questions raised above.

My own answer to the question of whether autonomous digital tools have made human accountability an outdated notion is: “No.”

To deconstruct the political effects of border control technologies, I distinguish between their material and performative effects, intended and unintended effects, and their development and use.

The first distinction, between material and performative effects, is based on the framework of performativity pioneered by Judith Butler and already used by criminologists to analyze global criminal justice practices. Technologies have material effects when they directly influence bordering practices. Automated border checks are quicker than those relying on pen and paper; digital risk assessment algorithms can blacklist passengers based on connections that humans would not have made; surveillance and detection technologies can push migration routes into more dangerous areas.

Even when digital tools have only limited material impacts, they can still have significant performative effects. Technological performances—demonstrating and promoting new technical systems publicly—allow public and private actors to pursue political goals. For instance, in 2023 Nigeria’s Minister of Interior stated that “the NIS [Nigerian Immigration Service] is now better equipped with advanced technology to curtail any breach in Nigeria’s borders.” Such statements have been crucial to ensuring Nigeria’s involvement in regional and global political partnerships, despite researchers arguing that the country’s borders remain highly permeable.

The second distinction, between intended and unintended effects, is where assessing political accountability becomes a more hotly contested issue. Some effects of digital borders are intended, such as speeding up border crossing for “trusted” travelers. Other impacts, such as wrongfully denying entry based on false positives, are (we may hope) unintended by their developers.

According to ‘posthumanist’ theories of technology, the complexity and autonomy of digital tools have resulted in a dizzying array of unintended effects, which undermine traditional assessments of human accountability. In one prominent study, Jane Bennett argued that humans are now “incapable of bearing full responsibility” for the effects of new technologies. Such perspectives can demonstrate how digital systems can themselves transform border control practices. Yet, as Thomas Lemke has pointed out, focusing on technical agency often “inadvertently translates into a systematic blindness concerning the inequalities, asymmetries and hierarchies” built into technical objects.

The philosophical tradition of pragmatism can help us temper claims about technological agency. Philosophical pragmatists conceptualize technology as “the invention, development, and cognitive deployment of tools and other artifacts, brought to bear on raw materials and intermediate stock parts, with a view to the resolution of perceived problems.”

This definition foregrounds the final analytical distinction, between developers and operators of border control technologies. In contrast to popular rhetoric regarding the unstoppable march of technological innovation, the pragmatist view highlights how humans create new technologies when confronted with perceived social problems that previous tools have failed to resolve. This developmental process is characterized by uncertainty and contestation over how these problems—such as migration control—should be conceptualized, and which technical alternatives are best suited for the task.

For instance, as I explain in the article, officials from the International Organization for Migration debated whether their biometric border control system is a suitable solution to the problem of border control in Nigeria. Some officials argued that the risks outweighed the benefits, given a lack of robust legal frameworks to protect against rights infringements and discrimination. Ultimately, those proposing to expand the system won the debate, based on a securitized understanding of border control and a teleological view of technological innovation.

Distinguishing between the development and use of border control technologies demonstrates how human actors normatively deliberate about and choose between technological alternatives for addressing migration control. The discriminatory effects of new tools may be unintended, but they are rarely unforeseen. Such harms are also often entirely preventable if humans choose to prevent them. Even if these tools shape bordering practices in unintended ways, normative deliberation regarding their desirability remains a uniquely human capacity.

When the harms of digital borders are blamed on technical faults—and when their limitations are reconceptualized as “information gaps” that must be “closed” by increasingly intrusive systems—we should resist such technicist arguments, hold their developers accountable, and ask what alternatives were foregone in favor of the securitized ‘data-driven’ border.

Any comments about this post? Get in touch with us! Send us an email, or post a comment here or on Facebook. You can also tweet us.

How to cite this blog post (Harvard style):

S. Singler (2023) Political accountability at the digital border: Who’s responsible when technology takes over? Available at: https://blogs.law.ox.ac.uk/border-criminologies-blog/blog-post/2023/10/political-accountability-digital-border-whos. Accessed on: 10/05/2024.
