Report Option for Talk of Self-Harm

Just had a player talk about how he wants to unalive, and I don’t necessarily think that’s abusive chat or that the person should be punished. Maybe a report option specifically for this is needed?

That’s a good thought. But I see a lot of people saying that stuff to meme, so I wonder how that function would work without being abused.

I think the cases should be reviewed. I think talk about self-harm is legitimate enough to warrant the attention of moderation.

Absolutely. People can’t say “ggez” anymore, but they can get away with talking about self-harm or telling others to hurt themselves?? Ridiculous.

Yeah. Whether it’s genuine talk or people memeing about it, both cases require attention from moderation.

One of the top things I wanted from OW2 was an improved report system. And sadly we didn’t get it.

It’d also just make me feel better… this person is talking about self-harm and I can do nothing because of anonymity. Well, my only option could get them actioned, I guess. It’s distressing.

pretty easy to fix, barely an inconvenience.

Put a list of words that will get automatically turned into a suicide hotline message for the one that types it.

“suicide”, “kill urself”, “kill yourself”, etc.
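For what it’s worth, here is a minimal sketch of what that phrase-list idea could look like, purely as illustration; the phrase list, hotline text, and function name below are all made up, not anything the game actually exposes:

```python
# Sketch of the phrase-list idea above. Nothing here is the game's real
# chat API: the phrase list, hotline text, and function name are placeholders.

SELF_HARM_PHRASES = [
    "suicide",
    "kill urself",
    "kill yourself",
]

HOTLINE_MESSAGE = (
    "If you're struggling, help is available: in the US you can call or "
    "text 988 (Suicide & Crisis Lifeline)."
)

def filter_chat_message(text: str) -> tuple[str, bool]:
    """Return (message_to_show_the_sender, was_flagged).

    If the typed message contains a listed phrase, the sender sees the
    hotline message instead of having the original text delivered.
    """
    lowered = text.lower()
    if any(phrase in lowered for phrase in SELF_HARM_PHRASES):
        return HOTLINE_MESSAGE, True
    return text, False
```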

Don’t tell Blizz, they WILL send police to their house; just let them be. Telling someone to kill themselves is an insta-report and ban, but if someone is talking about killing themselves, even as a joke, Blizz will make everything worse. They aren’t the best company about that type of stuff.

But the problem is there’s a huge difference between someone saying “kill yourself” and “I’m going to kill myself”. One should be reported and the other should be reached out to.

I mean that every interaction should be met with an appropriate message:

Self-harm → a positive message with suicide hotline numbers.

Telling others to self-harm → a message prompting that user to stop, since they never know what the other person is going through.

I feel this sort of reinforcement has better results, as it gives you a pause to reflect.
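In code terms, that mapping from interaction type to response might look roughly like the sketch below, assuming a deliberately naive keyword classifier (real detection would need far more than substring matching; the phrases and messages are placeholders):

```python
# Sketch of the "appropriate message per interaction" idea above.
# The classifier is intentionally naive and only illustrates the mapping
# from message category to automatic response.

from enum import Enum, auto

class ChatCategory(Enum):
    SELF_HARM = auto()        # player talking about hurting themselves
    TELLING_OTHERS = auto()   # player telling someone else to self-harm
    NEUTRAL = auto()

RESPONSES = {
    ChatCategory.SELF_HARM: (
        "You matter. If you're thinking about hurting yourself, please "
        "reach out: call or text 988 (US Suicide & Crisis Lifeline)."
    ),
    ChatCategory.TELLING_OTHERS: (
        "Please stop. You never know what the other person is going through."
    ),
}

def classify(text: str) -> ChatCategory:
    lowered = text.lower()
    if "kill yourself" in lowered or "kill urself" in lowered:
        return ChatCategory.TELLING_OTHERS
    if "kill myself" in lowered or "want to die" in lowered:
        return ChatCategory.SELF_HARM
    return ChatCategory.NEUTRAL

def respond(text: str) -> str | None:
    """Return the automatic message to show the sender, if any."""
    return RESPONSES.get(classify(text))
```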

I know it’s a report option on TikTok and Twitter, so they wouldn’t be the first to add it.

Maybe I’m misunderstanding what you’re saying. But you’re suggesting an automated system, right?

Because if it’s automated, those systems probably catch a bare fraction of what they’re supposed to. Which is better than nothing, but it’s a poor solution.

To be fair, telling someone else to self-harm does very much fall under abusive chat.

This thread is specifically about people talking about hurting themselves in the game’s chat. It’s a problem for multiple reasons:

1: that player might follow through, especially if the responses aren’t kind
2: some people have suicidal ideation or past attempts, and such talk is triggering, even if it isn’t necessarily “abusive”
3: players are left with a sense of powerlessness, because the only way to flag the player in question is to categorize their communication as abusive
4: memeing about self-harm is not acceptable, because of 2 and 3 and probably other reasons I can’t think of

What I propose is that players who talk about self-harm can be reported using a “self-harm” option, and the case is reviewed by moderation. If the player is determined to be at risk, they are met with a pop-up, at some frequency on the game’s menu screen and for some period of time, that provides them with relevant information such as suicide hotlines and helpful text to try to defuse the situation (similar to what Google does). If the user expresses a desire to self-harm, has a plan, has a timeline, etc., authorities are contacted, just as they would be if the person went around in public saying such things.
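As a sketch, that flow (a dedicated report reason, a moderation review, then a periodic resource pop-up for at-risk players) could be modeled something like the code below; every name, duration, and frequency here is invented for illustration:

```python
# Sketch of the proposed report-and-review flow. The report reasons,
# the 14-day window, and the 24-hour pop-up interval are illustrative only.

from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum, auto

class ReportReason(Enum):
    ABUSIVE_CHAT = auto()
    CHEATING = auto()
    SELF_HARM = auto()   # the new option proposed in this thread

@dataclass
class ResourcePopup:
    """Shown on the menu screen at some frequency, for a limited period."""
    expires: datetime
    min_gap: timedelta              # how often, at most, the pop-up appears
    last_shown: datetime | None = None

    def should_show(self, now: datetime) -> bool:
        if now >= self.expires:
            return False
        if self.last_shown is None:
            return True
        return now - self.last_shown >= self.min_gap

def handle_reviewed_report(reason: ReportReason, at_risk: bool) -> ResourcePopup | None:
    """After moderation review, schedule the pop-up only for at-risk players."""
    if reason is ReportReason.SELF_HARM and at_risk:
        return ResourcePopup(
            expires=datetime.now() + timedelta(days=14),  # "some period of time"
            min_gap=timedelta(hours=24),                  # "some frequency"
        )
    return None
```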

Even if this helps or saves just one life, something should be done about it. And if they’re joking, something should really be done about it. We shouldn’t have to be the ones to figure out which it is. There are plenty of six-figure salaries at Blizzard that should be working on this.