The Chilean government, like many others, has deployed predictive modelling software to assess children's risk of facing harm or abuse. The Childhood Alert System, an "early warning system" based on algorithmic predictions, assigns risk scores to children and adolescents. But the system consistently and disproportionately targets low-income families, and critics have described it as little more than an exercise in "poverty profiling." This conversation will explore the implications of the system for children's rights in Chile. We will examine how algorithmic risk prediction, far from being a neutral exercise, can stigmatize and criminalize families living in poverty, drive harmful interventions in children's lives, and render other risks invisible. We will ask: what does it mean to introduce predictive analytics into child welfare decisions, and what stories are these risk scores really telling?