Pulitzer Center Update March 17, 2023

Questioning the Algorithm


Every year, millions of welfare benefit recipients across Europe are profiled for fraud by opaque...

The cupboard as a metaphor for a classification system/algorithm with the drawers representing different categorization classes. Black cupboard: The classification system itself can be a black box, meaning that the mechanisms by which a data item is classified into a specific class are often hidden, even to the developers of these systems. Image by Anton Grabolle / Better Images of AI / Classification Cupboard / CC-BY 4.0.

How AI interrogates welfare recipients

The promise was bright and shiny. An “ethical solution” offering “unbiased citizen outcomes” and “a fair distribution of welfare” sounds like the perfect system for a country where welfare fraud tops the political agenda. And so in 2017, the city of Rotterdam in the Netherlands deployed a machine learning algorithm to catch lawbreakers and flag individuals with a high risk of welfare fraud.

Did the system live up to the promise? Was the tool ethical? Were the outcomes unbiased and fair? Fair to whom? 

These are some of the questions AI Accountability Fellow Gabriel Geiger and his colleagues at Lighthouse Reports tried to answer. In their investigation with WIRED, Geiger and his team reconstructed Rotterdam’s welfare fraud risk scoring system from the algorithm and training data they obtained from the city. They found a system that discriminates based on ethnicity and gender, along with evidence of fundamental flaws that made it both inaccurate and unfair. Women, people who do not speak Dutch, and people with children were scored much higher, flagging them to authorities and subjecting them to potential investigations.

What are the real-life consequences of such systems? Geiger shows this by asking Rotterdam resident Pepita Ceelie to enter her personal information in the reconstructed system. Ceelie can barely hide her disbelief and anger. Her gender, her divorce, and even the existence of her adult son significantly increase her welfare fraud risk score. “What does he have to do with this?” she asks. Ceelie was twice investigated by Rotterdam’s welfare fraud team. Both times, investigators found no wrongdoing. “They don’t know me, I’m not a number,” Ceelie says. “I’m a human being.”

Rotterdam leads a growing trend of government reliance on machine learning to profile millions of people in welfare systems across Europe. Without such reporting, people like Ceelie may not understand why they are scored by the algorithm as high risk or why they are investigated over and over. This reporting is not possible without journalists asking the right questions when governments and corporations make big, shiny promises.

Best,


IMPACT

On March 13, 2023, Pictures of the Year International (POY) awarded the Pulitzer Center-supported films Qatar's World Cup Building Boom: Too Hot To Work and Bring Them Home the POY Documentary Journalist Award of Excellence. A 20-minute documentary, Qatar’s World Cup Building Boom, was produced in part by grantees Aryn Baker, Ed Kashi, and Tom Laffay for the Pulitzer Center project Too Hot for Work: How Qatar Offers Lessons for the Economy of a Heating Planet. The film was released by TIME and Context.

The film "Bring Them Home," produced by grantees Ray Whitehouse, Katrina De Vera, and Kate Woodsome, is part of the Pulitzer Center project A Quiet Crisis: The Tragedy of State Hostage-Taking, published by The Washington Post.


JOIN THE 1619 EDUCATION NETWORK

Applications for The 1619 Project Education Network are due on March 20 at 11:59pm EDT. Watch the highlights from our 1619 Education Conference to learn more about the network, and visit this link to apply!


This message first appeared in the March 17, 2023, edition of the Pulitzer Center's weekly newsletter. Subscribe today.

