Just had a player talk about how he wants to unalive, and I don’t think that counts as abusive chat or that the person should be punished. Maybe a report option specifically for this is needed?
Added from a later post:
This thread is specifically about people using the game’s chat to talk about hurting themselves. It’s a problem for multiple reasons:
1: that player might follow through, especially if the responses aren’t kind
2: some people have suicidal ideations or past attempts, and such talk can be triggering for them even if it isn’t strictly “abusive”
3: players are left feeling powerless, because the only way to report the player in question is to categorize their messages as abusive
4: memeing about self-harm is not acceptable because of 2 and 3, and probably other reasons I can’t think of right now
What I propose is that players who talk about self-harm can be reported using a dedicated “self-harm” option, with the case reviewed by moderation. If the player is determined to be at risk, they are shown a pop-up at some frequency on the game’s menu landing screen for some period of time, providing relevant information such as suicide hotlines and supportive text to try to defuse the situation (similar to what Google does). If the user expresses a desire to self-harm, has a plan, has a timeline, etc., authorities are contacted, just as they would be if the person said such things in public.
From blitzcloud:
pretty easy to fix, barely an inconvenience.
Maintain a list of words and phrases that automatically get replaced with a suicide hotline message for whoever types them.
“suicide”, “kill urself”, “kill yourself”, etc.
I mean that every interaction should be met with an appropriate message:
self-harm-> positive message with suicide hotline numbers.
Telling others to self-harm -> a message prompting the user to stop, since they never know what the other person is going through, etc.
I feel this sort of reinforcement gets better results, as it gives you a pause to reflect.
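A minimal sketch of blitzcloud’s phrase-matching idea, written in Python. The phrase lists, message text, and function name here are all illustrative assumptions, not a real moderation configuration; a production filter would need far broader matching (misspellings, slang like “unalive”, leetspeak) and human review for edge cases.

```python
# Sketch: match incoming chat against phrase lists and swap in a
# supportive canned response instead of delivering the message as-is.
# Phrase lists and messages are hypothetical examples only.

SELF_HARM_PHRASES = ["i want to die", "kill myself", "suicide"]
TELLING_OTHERS_PHRASES = ["kill urself", "kill yourself", "kys"]

HOTLINE_MESSAGE = (
    "You matter. If you're struggling, please reach out: "
    "in the US, call or text 988, or contact a local crisis line."
)
STOP_MESSAGE = (
    "Please stop. You never know what the other person is going through."
)

def filter_chat(message: str) -> str:
    """Return the text to display in chat, replacing flagged messages."""
    text = message.lower()
    # Check "telling others" first, since those phrases can overlap
    # with self-directed ones.
    if any(p in text for p in TELLING_OTHERS_PHRASES):
        return STOP_MESSAGE
    if any(p in text for p in SELF_HARM_PHRASES):
        return HOTLINE_MESSAGE
    return message
```

This covers the two interaction types above (self-harm talk and telling others to self-harm) with a different response for each, which is the distinction the post argues for.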