Addendum: Preventing prejudice

Technical audit identification code

TA:AA:2024:02

Summary

In its inspection of the legitimate use of student finance for students living away from home, the Education Executive Agency of the Netherlands (DUO) selected students with a non-European migration background for control significantly more often. This demonstrates an unconscious bias in DUO’s control process: students with a non-European migration background were assigned a higher risk score by the risk profile and were more often manually selected for a home visit. This is evident from follow-up research that the NGO Algorithm Audit carried out on behalf of DUO, which the minister sent to the House of Representatives on May 22, 2024. The results reinforce the outcomes of previous research, on the basis of which the minister apologized on behalf of the cabinet on March 1, 2024 for indirect discrimination in the control process.
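How such a disparity can be detected is illustrated by the minimal Python sketch below: it compares the control-selection rates of two groups and applies a chi-square test of independence. The counts are hypothetical and the sketch only conveys the general idea of a supervised bias test; the actual analysis by Statistics Netherlands and Algorithm Audit is documented in the GitHub repository linked below.

    # Illustrative supervised bias test on selection rates (hypothetical counts).
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: group; columns: [selected for control, not selected]
    contingency = np.array([
        [120, 880],    # students with a non-European migration background (assumed)
        [60, 1940],    # other students (assumed)
    ])

    # Selection rate per group
    rates = contingency[:, 0] / contingency.sum(axis=1)

    # Chi-square test of independence between group membership and selection
    chi2, p_value, dof, expected = chi2_contingency(contingency)

    print(f"selection rate, non-European migration background: {rates[0]:.1%}")
    print(f"selection rate, other students:                     {rates[1]:.1%}")
    print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")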

The press release can be found here.

Source of case

Education Executive Agency of The Netherlands (DUO)

Github repository

https://github.com/NGO-Algorithm-Audit/DUO-CUB

Algoprudence

The full report (TA:AA:2024:02) can be found here.


Financed by

Dienst Uitvoering Onderwijs (DUO)

Additional research by Statistics Netherlands and Algorithm Audit into the college grant control process

22-05-2024 Migration background supervised bias test, Statistics Netherlands

Reaction of the Dutch Minister of Education, Culture and Science

Official reaction

Reaction of the Netherlands Institute for Human Rights on age discrimination

12-04-2024 reaction

Age discrimination

Policies, such as those implemented by public sector agencies investigating (un)duly granted social welfare or by employers seeking new employees, can intentionally or unintentionally lead to differentiation between certain groups of people. If an organization makes such a distinction on grounds that are legally protected, such as gender, origin, sexual orientation, or a disability or chronic illness, and there is no valid justification for doing so, the organization is making a prohibited distinction. We refer to this as discrimination.

But what about age? Both the Rotterdam algorithm and the DUO algorithm, as studied by Algorithm Audit, differentiated on the basis of age. In these cases, however, age discrimination did not occur.

EU non-discrimination law also prohibits discrimination on the basis of age. For instance, arbitrarily rejecting a job applicant because they are too old is unlawful. However, legislation on age differentiation leaves more room for justification than for the aforementioned personal characteristics, especially when the algorithm is not applied in the context of labor.

Therefore, in the case of detecting unduly granted social welfare or misuse of college grants, it is not necessarily prohibited for an algorithm to take someone’s age into account. However, there must be a clear connection between age and the aim pursued. Unless it is shown that someone’s age increases the likelihood of misuse or fraud, age is ineligible as a selection criterion in algorithm-driven selection procedures. For disability allowances for young people (Wajong), for example, such a connection does exist, and an algorithm can lawfully differentiate by age.

React to this technical audit

Your reaction will be sent to the auditing team. The team will review your response and, if it complies with Algorithm Audit’s guidelines, the reaction will be placed in the Discussion & debate section above.



