03/07/2017 / By Jayson Veley
Back in January, it was reported that 14-year-old Naika Venant hanged herself in the bathroom of her Miami home and broadcast it on the Internet via Facebook Live. After some confusion over the girl’s address, Naika was picked up and rushed to Jackson North Hospital, where she was sadly pronounced dead.
Just a few weeks prior, in December, a 12-year-old girl in Georgia had done the same thing. Katelyn Nicole Davis live-streamed her 40-minute suicide through an app called Live.me, and the video was quickly reposted on various social media sites, including Facebook. Now, with the help of advanced technology, Facebook claims to have developed a way to help social media users who are contemplating suicide.
Using artificial intelligence, Facebook has developed a system that analyzes a user’s status updates, along with the comments posted by friends, to detect warning signs of suicide. For example, if a user posts a status expressing sadness or some level of depression, the AI system can flag it. Similarly, if a post receives comments such as “Are you okay?” or “I’m worried about you,” the system detects those as well. (RELATED: Read about how antidepressants in America are actually causing suicide rates to soar.)
Once the algorithm recognizes signs of a potential suicide, the information is immediately sent to Facebook’s community operations team for review. After a swift review, the team contacts the user and offers advice and assistance.
Of course, like most other Internet-based programs, Facebook’s AI system is not perfect. The algorithm is trained to analyze only the strings of words that form sentences, not the actual emotions and feelings of the users behind them. This could lead to Facebook’s community operations team reaching out to users who are actually just fine. For example, if a Facebook user posts something silly and a friend jokingly comments, “I’m worried about you,” the user’s information could still be sent to the operations team. That would inundate the team with tens of thousands of cases to review, taking time and attention away from the real cases that need to be addressed as quickly as possible.
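Facebook has not published the inner workings of its system, but the behavior described above can be illustrated with a simple pattern-matching sketch. The Python below is a hypothetical illustration only: the phrase lists, the score_post and flag_for_review functions, and the threshold are assumptions rather than Facebook’s actual code, and a real system would rely on trained machine-learning models instead of fixed phrase lists.

```python
import re

# Hypothetical phrase lists -- illustrative only. Facebook has not published its
# classifier, and a production system would use trained models, not fixed rules.
POST_WARNING_PHRASES = [
    r"\bi want to die\b",
    r"\bi can'?t go on\b",
    r"\bno reason to live\b",
    r"\bso (sad|hopeless|alone)\b",
]
COMMENT_WARNING_PHRASES = [
    r"\bare you ok(ay)?\b",
    r"\bi'?m worried about you\b",
    r"\bplease talk to someone\b",
]

def score_post(post_text, comments):
    """Count warning-sign matches in a post and in the comments under it."""
    score = sum(1 for p in POST_WARNING_PHRASES if re.search(p, post_text.lower()))
    for comment in comments:
        score += sum(1 for p in COMMENT_WARNING_PHRASES if re.search(p, comment.lower()))
    return score

def flag_for_review(post_text, comments, threshold=2):
    """Send the post to a human review queue if enough warning signs appear."""
    return score_post(post_text, comments) >= threshold

# A genuinely worrying post plus a concerned comment is flagged:
print(flag_for_review("I feel so hopeless and I can't go on",
                      ["I'm worried about you"]))        # True

# But a sarcastic post and a joking comment match the same patterns,
# producing the kind of false positive described above:
print(flag_for_review("I'm so sad my team lost again",
                      ["I'm worried about you :P"]))     # True (false positive)
```

In this sketch, widening the phrase lists catches more at-risk users but also multiplies the false positives that human reviewers must sift through, which is exactly the trade-off described above.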
Still, Facebook’s new technology is receiving praise where it counts. Dr. Dan Reidenberg, the executive director of the suicide-prevention group SAVE (Save.org), which is also involved in Facebook’s initiative, said, “Their ongoing and future efforts give me hope for saving more lives globally from the tragedy of suicide.” He added, “The opportunity for prevention, even with Facebook Live, is better now than ever before.” (RELATED: This one easy trick will help treat your depression.)
Though they do not use artificial intelligence as Facebook does, other social media sites also have systems in place to help prevent suicide and self-injury. On Twitter, when a tweet is flagged, it is sent to a team that is “devoted to handling threats of self-harm or suicide.” The team then reaches out to the Twitter user with advice and assistance, and it also encourages the user who flagged the tweet to offer support.
When an image depicting self-harm is reported on Instagram, it is quickly removed from the app. Instagram states on its website that it then takes steps to help the user in need.
On Snapchat, although there is no way to report another user over self-harm or safety concerns directly in the app, users can still go to the company’s website and file a safety concern. If the Snapchat team determines that a suicide threat is legitimate and warrants immediate action, the user receives contact information for a suicide hotline and is encouraged to call the police if necessary.
Given the very real threat of cyberbullying and depression among young people, it is encouraging to see not only Facebook but also other social media sites stepping up and offering solutions to the problem.