Twitch’s unwillingness to engage in suicide prevention continues to put even more people at risk

CW: Suicide attempt / Self-harm
Disclaimer: I am a Twitch Partner myself

“How does Twitch deal with SELF HARM and SUICIDE? Shamefully…” says a recent Reddit thread. It is the account of a user named badxan who, as a moderator of a Twitch channel, noticed a viewer in visible distress in the chat room.

The viewer said in chat that he was having a rough day and that he “might as well do an IRL stream and shoot himself in the head” or “hang himself”.

Holding a mental health first aid certificate, badxan reached out to the viewer privately and determined that he was in severe need of help. Badxan thought Twitch could assist, and filed a report under the Self-Harm option, adding in the description that the person needed urgent medical attention.

Within five minutes, there was a response from Twitch, but not the one badxan expected…

Twitch banned the viewer’s account.

Twitch’s guidelines page indicates that the report function is meant for breaches of the Terms of Service (ToS), yet the Self-Harm option mentions neither the sanctions taken nor any guidance for helping people in need, making its intended use difficult to parse.

Self-Harm: The reported user has threatened, attempted, or is at risk of self-inflicted harm or death. Such activity may include suicide threats, intentional physical harm, use of illegal drugs, or drinking in excess.

One could easily assume the option serves as a lifeline that triggers suicide prevention help, even automated help, but the company’s answer is purely and simply to silence every mention of distress made by viewers and streamers alike.

Twitch, now owned by Amazon, has 100 million visitors per month and 15 million daily active users, nearly half of whom spend 20 hours a week on the website. With chat rooms, a friend system, and private messages, the website has very much become a social network occupying a huge part of the lives of many people around the world. Yet Twitch has made absolutely no effort to protect the most vulnerable, instead choosing to worsen their situation by banning their accounts and putting an additional burden on their shoulders.

Other social networks offer at least basic options for people to receive help. Twitter has a Trust &amp; Safety Council made up of grassroots organizations designed to help people in need. Facebook has a help page, as well as a report feature that lets the site directly provide useful information, such as a helpline, to everyone involved. None of these are perfect answers, but they offer the basic foundations that could help someone in need and make the difference between life and death.

Alexandre Taillefer in “Tout le monde en parle” (Journal de Montreal)

Badxan’s experience is only one of many examples of Twitch’s blatant unwillingness to make the platform a better, safer place. Québécois businessman Alexandre Taillefer lost his 14-year-old son in late 2015 and had this to say on a TV show:

My son gave no signal. The only signal he sent was through a website he spent a lot of time on called Twitch, now owned by Amazon. […] My son sent warning signs during May, very clear signs, with the word “suicide” in it. And Amazon, who is able to detect that you want red shoes, doesn’t do a thing about it.

If I had been alerted at that moment, in any way possible (my son lost his life on December 6, six months later), I think it would have made a difference.

Speaking on CBC Montréal in the wake of the news, psychologist Carl-Maria Mörch said that companies need to be more proactive about suicide prevention. “It doesn’t mean these companies are responsible for what the people have said on their website, but they should absolutely create opportunities for these people to be helped,” he said.

“There needs to be better coordination with health care professionals, who could even help train moderators who watch online conversations.”

It has been nearly two years since then, and Twitch has done nothing to provide better tools for helping people in need. Even worse, Taillefer mentioned that the person his son reached out to never filed a report; yet in this very situation, badxan did flag the user to Twitch, and the viewer was banned instead of receiving any kind of help. Badxan has received no answer from Twitch through the help page or Twitter since then.

Twitch’s inaction also recalls the case of dmbrandon, a popular Twitch streamer who berated a viewer who had donated to him, saying: “I tried killing myself last August, discovered your videos once I was released, and Smite has become a positive outlet for me. Thanks.”

He then said: “There are a lot of streamers out there who would appreciate that message, I’m just going to call you an asshole. It’s a selfish, stupid thing to do.” He went on berating the viewer for five minutes.

While dmbrandon’s comments made the rounds of the internet, he has since received absolutely no sanction from Twitch for his behaviour, nor even the slightest statement from the company discouraging people from acting this way.

As Twitch keeps doing nothing, more and more people who could receive a helping hand are left with nothing, or even abused, in the only communities where they feel safe or willing to speak out. Yet a few options could change lives. A comprehensive report feature that strives to help rather than punish people could go a long way, as could pushing its audience to treat every suicidal behaviour seriously.

Twitch, as a popular social network, has a moral imperative to treat this very seriously and start thinking about ways to help the most vulnerable. They should be made to answer for their inaction.
