Spying on student devices through advanced monitoring technology has become a key strategy for schools trying to identify self-harm behaviors early. AI-powered tools deployed in schools across the United States analyze students’ online activity on school-issued devices, flagging keywords related to suicide or self-harm. These alerts often lead to timely interventions, but not without controversy or unintended consequences.
The Role Of AI-Powered Monitoring Tools
Artificial intelligence tools like GoGuardian, Gaggle, Lightspeed, Bark, and Securly were initially introduced to ensure safe internet use on school-issued devices. However, during the COVID-19 pandemic, these tools were adapted to address rising rates of suicidal ideation and self-harm among students. They now scan for keywords and phrases that suggest potential harm, sending alerts to school counselors or law enforcement.
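The scanning step these tools perform can be pictured, in greatly simplified form, as matching typed text against a watch list and escalating on any hit. The terms, function names, and matching logic below are hypothetical illustrations only; the commercial products named above use proprietary and far more sophisticated classifiers.

```python
# Illustrative sketch of keyword-based flagging. The watch list and the
# literal substring matching are hypothetical simplifications, not the
# behavior of any actual vendor's product.
WATCH_TERMS = ["hurt myself", "overdose", "end my life"]

def flag_activity(text: str) -> list[str]:
    """Return the watch terms found in a snippet of typed text."""
    lowered = text.lower()
    return [term for term in WATCH_TERMS if term in lowered]

def should_alert(text: str) -> bool:
    """Escalate to a counselor or administrator when any term matches."""
    return bool(flag_activity(text))
```

Even this toy version shows why false alarms are inevitable: literal matching cannot distinguish a genuine crisis from an old poem or a history essay that happens to contain a flagged phrase.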
Millions of students, nearly half of the U.S. school population, are subject to this kind of surveillance. The system operates round-the-clock, pulling students out of class during school hours or triggering police visits to homes when alerts occur after hours.
Real-Life Cases: A Double-Edged Sword
In Neosho, Missouri, 16-year-old Madi Cholka’s distressing story highlights both the potential and pitfalls of these systems. After typing about her intent to overdose on her school Chromebook, an alert reached the school counselor, who contacted the police. The intervention saved her life that night, yet Madi tragically died two years later despite repeated hospitalizations and her mother’s vigilance.
Similarly, in Connecticut, another student’s parents were awakened by police after a false alarm. The monitoring software flagged the student’s old poem as concerning. Although the situation was resolved quickly, the experience left the student shaken and raised questions about privacy and trauma caused by these interventions.
The Privacy Debate Surrounding Spying On Student Devices
Monitoring students’ online activity sparks heated debates about privacy and equity. Civil rights groups argue that this technology disproportionately targets marginalized students, including those in the LGBTQ+ community. By surveilling personal conversations, these tools often breach students’ trust and autonomy.
Moreover, data on the accuracy and outcomes of such alerts remains unavailable, as private companies and school districts hold this information tightly. False positives, a byproduct of systems deliberately tuned to err on the side of caution, can waste resources and alienate students.
Positive Impact: A Life-Saving Tool
Despite its flaws, many educators and counselors believe in the efficacy of monitoring systems. In Neosho, the introduction of these alerts, alongside on-site therapy, reportedly contributed to several years in which the district saw a significant drop in student suicides. Former superintendent Jim Cummins noted, “Even if we can’t quantify it, the statistics don’t lie. We did everything we could to try not to lose one.”
Balancing Intervention And Privacy
Some schools have scaled back after-hours interventions due to controversies surrounding police involvement. Instead, they focus on addressing alerts during the school day. While this limits the invasiveness of the system, it also raises concerns about missing critical moments when students are most vulnerable.
Critics argue that these tools place undue reliance on technology instead of fostering in-person mental health support. However, counselors like Talmage Clubbs in Neosho stress the moral dilemma of turning off the alerts, fearing the consequences of missing an opportunity to save a life.
Breaking The Silence Around Mental Health
The broader impact of these tools extends beyond immediate interventions. By normalizing conversations around mental health and providing support at critical junctures, schools can break the stigma surrounding suicide. Parents like Angel Cholka, who lost her daughter Madi, remain steadfast in their belief that the technology offers hope, even if it delays the inevitable.
Cholka reflects, “I know for a fact — that alone kept my daughter here a little longer.”
While the debate continues, the technology underscores a pressing reality: the need for comprehensive mental health resources in schools. It also raises critical questions about the balance between privacy, intervention, and the role of technology in safeguarding students’ well-being.