Meta’s algorithm indirectly discriminates against women: what’s next?

The Netherlands Institute of Human Rights has ruled that Meta’s machine-learning-driven, auction-based advertising system indirectly discriminates against women. Here’s a breakdown of the ruling and its implications:

Key takeaways from the ruling

  • Human Rights Institutes in one EU member state (e.g., the Netherlands) can take legal action against a social media platform (Meta Platforms Ireland Ltd.) registered in another member state (Ireland).
  • Non-governmental organizations (NGOs) established to advocate for specific demographic groups (e.g., women) can represent these groups in legal actions.
  • Personalizing job ads on social media is classified as a service, not as employment mediation or recruitment.
  • Evidence showed that ads for stereotypically female-associated jobs (e.g., receptionist, preschool teacher) were shown to women over 85% of the time, while ads for male-associated jobs (e.g., mechanic, pilot) were shown mainly to men.
  • This triggered the reversal of the burden of proof, requiring Meta to demonstrate impartial treatment, which it failed to do.
  • The machine learning algorithm used profiling in dynamic ways, sometimes factoring in gender. While this didn’t qualify as direct discrimination, it led to indirect discriminatory outcomes.
  • The Netherlands Institute of Human Rights assessed the algorithm against three criteria:
    • Legitimacy: Differentiating job vacancies is a legitimate aim for Meta to enhance its services.
    • Suitability: ML-driven differentiation is suitable for achieving this aim.
    • Necessity: The method was deemed unnecessary, as alternatives like algorithmic differentiation combined with monitoring disparate impacts were available.
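The “monitoring disparate impacts” alternative the Institute points to can be sketched as a simple check over an ad’s delivery log. Everything below is an illustrative assumption, not Meta’s or the Institute’s actual tooling: the log format, the function names, and the four-fifths threshold (borrowed from US disparate-impact practice) are all hypothetical.

```python
from collections import Counter

def delivery_rates(impressions):
    """Share of an ad's impressions delivered to each group."""
    counts = Counter(impressions)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flags_disparate_impact(impressions, threshold=0.8):
    """Flag an ad whose delivery skews past a four-fifths-style rule:
    the lower-served group's share is less than `threshold` times
    the higher-served group's share."""
    rates = delivery_rates(impressions)
    if len(rates) < 2:
        return True  # only one group was reached at all
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi < threshold

# A delivery pattern like the ruling describes: a receptionist ad
# shown overwhelmingly to women (illustrative numbers, not Meta's data).
log = ["F"] * 87 + ["M"] * 13
print(flags_disparate_impact(log))  # True: 0.13 / 0.87 ≈ 0.15 < 0.8
```

A platform could run a check like this per ad campaign and rebalance delivery when it fires, which is roughly the kind of algorithmic differentiation plus monitoring the Institute deemed a viable, less discriminatory alternative.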

This isn’t a new problem for Meta. In the US, under pressure from the Department of Justice, Meta implemented monitoring systems, including the Variance Reduction System (VRS), to improve fairness in personalized ads. While the VRS could potentially address the issues raised by this ruling, it’s disappointing that Meta hasn’t proactively introduced this solution in European markets.

