Data journalist for hire. Writing about algorithms, privacy, and human rights.

A pixelated image of three students raising their hands. The raised hand of the student in the center of the picture is highlighted in red.

Inside America’s school internet censorship machine

A WIRED investigation into internet censorship in US schools found widespread use of filters to censor health, identity, and other crucial information. Students say it makes the web entirely unusable.

An illustration of stick-figure students in graduation robes.

False alarm: How Wisconsin uses race and income to label students ‘high risk’

A Markup investigation found that Wisconsin’s dropout prediction algorithm, after a decade of use and millions of predictions, was frequently wrong and negatively influenced how educators perceived students, particularly students of color.

Major universities are using race as a ‘high impact predictor’ of student success

Documents and data obtained by The Markup show that the predictive algorithms large public universities use to assign students risk scores incorporate race as a ‘high impact predictor’ and that Black students are labeled ‘high risk’ at as much as quadruple the rate of their white peers.

‘They track every move’: How U.S. parole apps created digital prisoners

State and federal law enforcement agencies are increasingly forcing low-risk probationers and parolees to install tracking apps on their cell phones. Users say the loss of freedom feels like being locked up all over again. Meanwhile, the companies behind these apps are looking to roll out predictive analytics features that critics say will lead to unnecessary re-arrests.

Flawed algorithms are grading millions of students’ essays

Fooled by gibberish and highly susceptible to human bias, automated essay-scoring systems are increasingly being adopted, a Motherboard investigation has found.

Google purged almost 1,000 abusive apps. Now some of them are coming back.

Some of the programs banned by Google have now rebranded or added disclaimers and returned to the Play Store. Meanwhile, new programs with overtly abusive purposes have slipped through the company’s automated monitoring systems.

This company is using racially biased algorithms to select jurors

Momus Analytics’ predictive scoring system uses race to grade potential jurors on vague qualities like ‘leadership’ and ‘personal responsibility.’

Schools spy on kids to prevent shootings, but there’s no evidence it works

Companies that make this software say their machine-learning detection systems keep students safe from themselves and away from harmful online content. But their numbers aren’t always trustworthy, and no independent research backs up their claims.