Full report: https://www.tweedekamer.nl/kamerstukken/detail?id=2024D17777&did=2024D17777

Bias report Short Stay Visa – SigmaRed for the Dutch Ministry of Foreign Affairs

This bias report was requested by the Dutch Ministry of Foreign Affairs and concerns the application process for short-stay visas (known as Kort Verblijf Visum, or KVV), which makes use of a rule-based classification model to categorise applicants into a fast, regular or intensive track. The goal of the study is to “detect and assess potential inter-group bias by examining the relationship between risk profile percentages and rejection rates across different demographic groups”.

Based on a comparative analysis of disparate impact (DI) ratios between 2022 and 2023, the report concludes that “no disproportionate discrimination based on age, marital status or gender” is found*. The report provides a rationale for excluding many bias metrics, but this explanation is absent for conditional demographic parity (CDP). This is striking, since CDP is commonly suggested as an alternative to DI precisely because it mitigates Simpson’s paradox, yet the authors do not clarify why DI is favored over CDP. Using CDP as a bias metric may yield different quantitative outcomes that fail to support the report’s current conclusion.
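To illustrate why the choice of metric matters, the minimal sketch below computes both metrics on a small, entirely hypothetical dataset (the columns `gender`, `location` and `rejected` are illustrative, not taken from the report). Pooled over all applications the DI ratio equals 1.0, yet within each application location the rejection rates diverge: a textbook instance of Simpson’s paradox that DI alone would miss.

```python
import pandas as pd

# Hypothetical applicant-level data; values and column names are
# illustrative only, not drawn from the SigmaRed report.
df = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M"],
    "location": ["A", "A", "B", "B", "A", "A", "B", "B"],
    "rejected": [1, 0, 1, 1, 1, 1, 1, 0],
})

# Disparate impact (DI): ratio of rejection rates between groups,
# pooled over all applications.
rates = df.groupby("gender")["rejected"].mean()
print(f"Overall DI (F vs M): {rates['F'] / rates['M']:.2f}")  # -> 1.00

# Conditional demographic parity (CDP): the same comparison computed
# within each stratum of a legitimate conditioning variable (here:
# application location), which guards against Simpson's paradox.
for loc, sub in df.groupby("location"):
    strat = sub.groupby("gender")["rejected"].mean()
    print(f"Location {loc}: DI = {strat['F'] / strat['M']:.2f}")  # -> 0.50 and 2.00
```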

Moreover, the bias assessment does not evaluate the eligibility of the selection criteria used in the assessed risk profile. The following seven criteria are used in the model (a hypothetical sketch of how such a profile routes applicants follows the list):

  1. Purpose of stay
  2. Location of application
  3. Nationality
  4. Gender
  5. Age class
  6. Marital status
  7. Profession
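For readers unfamiliar with rule-based risk profiles, the sketch below shows in purely hypothetical terms how a handful of such criteria could be combined into a score that routes an applicant into one of the three tracks. All rules, weights and thresholds are invented and do not reflect the actual KVV model.

```python
# Purely hypothetical illustration of a rule-based risk profile over
# criteria such as the seven above. None of these rules or thresholds
# come from the actual KVV model.

def assign_track(applicant: dict) -> str:
    score = 0
    # Each rule adds to a risk score; placeholder values only.
    if applicant["nationality"] in {"X", "Y"}:
        score += 2
    if applicant["age_class"] == "18-30":
        score += 1
    if applicant["marital_status"] == "single":
        score += 1
    # Thresholds mapping the score to the three tracks named above.
    if score >= 3:
        return "intensive"
    if score >= 1:
        return "regular"
    return "fast"

print(assign_track({"nationality": "X", "age_class": "18-30",
                    "marital_status": "single"}))  # -> "intensive"
```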

Before such selection criteria can be included in a risk profile, it is imperative to justify why differentiation on the basis of these criteria is proportional, suitable and necessary. Guidelines from the Netherlands Institute for Human Rights outline this obligation. For instance, quantitative evidence supporting the inclusion of a selection criterion in a risk profile could be obtained through hypothesis testing on random samples of visa applicants (see the sketch below). It is unclear why this obvious first step in assessing bias in risk profiling is absent from the report.
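As a minimal sketch of what such a justification step could look like, consider a chi-squared test of independence between age class and application outcome on a random sample. The counts below are purely illustrative; the point is the procedure, not the numbers.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table from a random sample of applicants:
# rows = age classes, columns = (rejected, approved). Counts are
# illustrative only.
table = np.array([
    [30, 170],   # age < 30
    [25, 175],   # age 30-50
    [20, 180],   # age > 50
])

# H0: rejection is independent of age class. Only if H0 is rejected
# (and the effect is substantial) could age plausibly meet the
# "clear connection to the aim pursued" requirement quoted below.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```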

In the context of differentiation on the basis of age, the Netherlands Institute for Human Rights explains:

“It is not necessarily prohibited for an algorithm to consider someone’s age. However, there must be a clear connection between age and the aim pursued. Until it is shown that someone’s age increases the likelihood [of a rejected visa application], age is ineligible as a selection criterion in algorithm-driven selection procedures.”

It is therefore remarkable that the assessment focuses solely on the quantitative aspects of bias testing and concludes that no age discrimination occurs.

In general, the organisational and qualitative dimensions of deploying algorithm-driven decision-making processes are not covered in this bias assessment. This is noteworthy, as experts argue that both the quantitative and the qualitative reasoning paradigm are needed to assess bias in algorithm-driven decision-making. No quantitative silver bullet exists to mitigate algorithmic bias. Algorithms are designed by people, and hence organisational checks and balances, including algorithm risk management frameworks, need to be reviewed to assess bias in algorithms. Given the absence of a qualitative review of the above-mentioned profiling criteria 1-7, this is a weak spot of the report.

Lastly, instead of using causal inference techniques such as inverse probability weighting (IPW) or instrumental variable (IV) analysis to assess whether the rule-based classification model had a direct effect on the decisions made by officers, the report settles for the simpler F-test. A sketch of an IPW analysis is given below.
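For reference, a minimal IPW sketch on synthetic data (all variables, names and effect sizes below are invented for illustration; this is not the report’s data or model):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic data: X = applicant covariates, track = 1 if routed to the
# intensive track, rejected = the officer's final decision. The track
# assignment and the outcome share a confounder (X[:, 0]).
n = 1000
X = rng.normal(size=(n, 3))
track = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
rejected = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * track + X[:, 0]))))

# Step 1: model the probability of intensive-track assignment
# (the propensity score) from the covariates.
ps = LogisticRegression().fit(X, track).predict_proba(X)[:, 1]

# Step 2: weight each applicant by the inverse probability of the
# track they actually received, then compare weighted rejection rates
# to estimate the track's effect on the decision.
w = np.where(track == 1, 1 / ps, 1 / (1 - ps))
ate = (np.average(rejected[track == 1], weights=w[track == 1])
       - np.average(rejected[track == 0], weights=w[track == 0]))
print(f"IPW estimate of the track effect on rejection: {ate:.3f}")
```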

* for visa applicants with Yemeni nationality, a certain degree of unequal treatment is reported
