Wouter Van Rossem

Finding Blind Spots: Data quality & the European Information Systems
for security, border and migration management

Department of Science, Technology, and Policy Studies, University of Twente, w.r.vanrossem@utwente.nl

Supervisors: Prof. Stefan Kuhlmann, Dr. Annalisa Pelizza, and Dr. Maurice van Keulen


Wouter Van Rossem holds a Bachelor and Master of Science in Computer Science, and a Master of Science in Management, all from the Vrije Universiteit Brussel (VUB) in Belgium. He also has five years of professional experience: after his university studies he worked as an IT consultant at a multinational consultancy firm, and later as a software engineer on a streaming television platform in Brussels. Before starting his PhD at the University of Twente he completed a traineeship at the European Commission's Joint Research Centre (JRC) in Ispra, Italy, at its Text and Data Mining unit.

Personal website: https://www.woutervanrossem.eu/


Interoperability of information systems is a widespread goal of governments seeking to make better use of the data they hold by making it usable across systems and organisations. Semantic interoperability is a high priority, as it aims to make data understandable by other systems. We view interoperability as part of wider data quality strategies, such as standardisation or the linking of records between systems. This research aims to understand the methods used in interoperability projects by looking back at how data quality is constituted within techno-social assemblages, which methods are used to harmonise data quality rules in interoperability-oriented projects, and what effects this has on the application of those rules.

We will look at the European Information Systems for security, border and migration management and the organisation that manages these systems. These information systems store and process similar data, but each was created for a different original purpose. All perform identity management, but one, for example, is concerned with identifying asylum seekers, while another handles travellers' visa information. Interoperability between these systems is a stated goal, and they therefore provide an interesting case for analysing the different purposes and practices of the various social groups involved with data quality.

We will combine several methods in our analysis. We will introduce a novel perspective on how data standards and quality are inscribed within these techno-social assemblages, and we will analyse the changes that arise in them from the pursuit of interoperability. By analysing technical, design, and legislative documents, we expect to reveal the practical politics of data quality inscriptions and uncover otherwise hidden choices. Ethnographic fieldwork at the organisation that manages these information systems aims to understand the different cultures and historical changes surrounding data quality.