Preventing prejudice

Algoprudence identification code

TA:AA:2024:01

Summary

In the period 2012-2022, students who lived close to their parent(s) were selected significantly more often for control by Dutch public sector organisation DUO than other students. The algorithm used to support the selection procedure performed as expected. The combination of algorithm-driven risk scoring and manual selection for the control process resulted in a significant overrepresentation of certain groups. Selected students were visited at home to verify whether they were misusing college allowances. This is the main conclusion of the audit conducted by the Algorithm Audit Foundation on behalf of DUO. DUO’s control process came under scrutiny in 2023 following news items from Investico and NOS, which stated that students with a migration background were accused of abuse more often than other students.
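To make the phrase “selected significantly more often” concrete, the sketch below shows one common way such overrepresentation can be quantified: comparing the selection rates of two groups with a chi-square test on a 2x2 contingency table. The counts used here are purely hypothetical and are not taken from the audit report.

```python
# Minimal sketch with hypothetical counts (not figures from the audit report):
# test whether one group is selected for a home visit significantly more often
# than another, using a chi-square test on a 2x2 contingency table.
from scipy.stats import chi2_contingency

# Rows: group A (e.g. students living close to their parents) and group B (other students).
# Columns: selected for a home visit, not selected.
table = [
    [320, 4_680],   # group A: 320 of 5,000 selected  (6.4%)
    [150, 9_850],   # group B: 150 of 10,000 selected (1.5%)
]

chi2, p_value, dof, expected = chi2_contingency(table)

rate_a = table[0][0] / sum(table[0])
rate_b = table[1][0] / sum(table[1])
print(f"Selection rate group A: {rate_a:.1%}")
print(f"Selection rate group B: {rate_b:.1%}")
print(f"Chi-square = {chi2:.1f}, p-value = {p_value:.2e}")
# A very small p-value indicates the difference in selection rates is unlikely
# to be due to chance, i.e. group A is significantly overrepresented.
```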

A press release can be found here.

Source of the case

Education Executive Agency of The Netherlands (DUO)

Algoprudence

The technical audit report (TA:AA:2024:01) can be downloaded here.


Funded by


Dutch public sector organisation DUO

Dutch cabinet's response to investigations into DUO's control process

01-03-2024 political action
Description

The report Preventing prejudice has been sent to the Dutch Parliament as part of the internal research documents

DUO apologizes for indirect discrimination in college allowances control process

01-03-2024 press release
Description

Press release DUO

Reaction of the Netherlands Human Rights Institute on age discrimination

12-04-2024 reaction

Age Discrimination

Policies, such as those implemented by public sector agencies investigating (un)duly granted social welfare or by employers seeking new employees, can intentionally or unintentionally lead to differentiation between certain groups of people. If an organization makes this distinction based on grounds that are legally protected, such as gender, origin, sexual orientation, or a disability or chronic illness, and there is no valid justifying reason for doing so, then the organization is making a prohibited distinction. We refer to this as discrimination.

But what about age? Both the Rotterdam algorithm and the DUO algorithm, as studied by Algorithm Audit, differentiated based on age. However, in these cases, age discrimination did not occur.

EU non-discrimination law also prohibits discrimination on the basis of age. For instance, arbitrarily rejecting a job applicant for being too old is unlawful. However, legislation regarding age differentiation allows more room for a justifying argument than for the aforementioned personal characteristics. This is especially true when the algorithm is not applied in the context of employment.

Therefore, in the case of detecting unduly granted social welfare or misuse of college allowances, it is not necessarily prohibited for an algorithm to consider someone’s age. However, there must be a clear connection between age and the aim pursued. Unless it is shown that someone’s age increases the likelihood of misuse or fraud, age is ineligible as a selection criterion in algorithm-driven selection procedures. For example, for disability allowances for young people (Wajong), such a clear connection exists and an algorithm can lawfully differentiate based on age.

React to this technical audit

Your reaction will be sent to the auditing team. The team will review your response and, if it complies with Algorithm Audit’s guidelines, the reaction will be placed in the Discussion & debate section above.


Newsletter

Stay up to date about our work by signing up for our newsletter


Building public knowledge for ethical algorithms