Saturday, July 27, 2024

AI monitoring software used in schools to identify students at risk of suicide can come at a great cost, according to an article in TIME. The software, developed by companies such as Bark, Gaggle, GoGuardian, and Securly, tracks students' computer use and flags activity that may indicate a risk of self-harm. While the intentions behind the software may be good, it raises concerns about privacy, bias, and unintended consequences.

The software collects large amounts of data about students' lives, and the algorithms used to identify at-risk students may be biased against certain groups. Researchers have also found that the software often leads to increased encounters between students and law enforcement, which can exacerbate the very problems it is meant to address. It is also unclear whether the software can accurately detect suicide risk in the first place; without sufficient evidence, it is difficult to determine whether the benefits of using it outweigh the risks.

The article recommends providing families with comprehensive information about how the software is being used, allowing them to opt out without penalty, and implementing stronger regulation to protect students.