
Starting next week, Instagram will notify parents when their teen searches for terms related to self-harm or suicide. Meta says a similar alert system for its AI chatbots is coming later this year.
The new Instagram feature sends parents an alert when their child “repeatedly tries to search for terms clearly associated with suicide or self-harm within a short period of time.” It’s rolling out in the US, UK, Australia, and Canada starting next week, but only for parents and teens who have opted in to supervision. It’s expected to expand to other regions later this year.
“The vast majority of teens do not try to search for suicide and …