Bias prevented


In the period 2012-2022, students who lived close to their parent(s) were selected significantly more often by Dutch public sector organisation DUO than other students. The algorithm used to support the selection procedure performed as expected. However, the combination of algorithm-driven risk scoring and manual selection for the control process resulted in a significant overrepresentation of certain groups. Selected students were visited at home to verify that they were not misusing college allowances. This is the main conclusion of the audit conducted by the Algorithm Audit Foundation on behalf of DUO. DUO’s control process came under scrutiny in 2023 following news items from Investico and NOS, which stated that students with a migration background were accused of abuse more often than other students.

A press release can be found here.

Source of the case

Dutch public sector organisation DUO


The technical audit report (AA:2024:01:TA) can be downloaded here.

Funded by

Dutch public sector organisation DUO

Dutch cabinet's response to investigations into DUO's control process

01-03-2024 political action

The report Bias prevented was sent to Dutch Parliament as part of the internal research documents

DUO apologizes for indirect discrimination in college allowances control process

01-03-2024 press release

Press release DUO

React to this technical audit

Your reaction will be sent to the auditing team. The team will review your response and, if it complies with Algorithm Audit’s guidelines, it will be published in the Discussion & debate section above.

* required


Stay up to date about our work by signing up for our newsletter
