After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie detection tools tested at the border to programs that verify documents and transcribe screening interviews, a wide range of solutions is being applied to asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It reveals how asylum seekers are transformed into compelled yet hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with capricious minor changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their right to protection.
It also illustrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. It further argues that analyses of securitization and victimization should be combined with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and territorial knowledge, the article argues that these technologies have an inherent obstructiveness. They have a double impact: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to flawed decisions made by non-governmental actors, and to ill-informed and unreliable narratives about their cases. Moreover, these tools pose new risks of ‘machine mistakes’ that may result in inaccurate or discriminatory outcomes.