Machine Learning is Racist. A Fact

All I'm saying is that machine learning has been shown to reproduce racist stereotypes in the past… what's different this time?

1 Like

Some machine learning models have reproduced racist stereotypes when they were trained on data which contained racist stereotypes.

Data analysis and AI… it's my job.

They are likely to feed it only reports which were vetted as training data.

4 Likes
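To make the "vetted reports only" point concrete, here's a minimal sketch (all field names and data are made up, this isn't how Blizzard's actual pipeline works): the training set is built only from reports a human has already reviewed, so raw, unreviewed chat never reaches the model.

```python
# Hypothetical sketch: only human-reviewed ("vetted") reports become
# training data, so unreviewed chat can't poison the model directly.
reports = [
    {"text": "gg ez noob", "label": "toxic", "vetted": True},
    {"text": "nice shot!", "label": "ok", "vetted": True},
    {"text": "report this thrower", "label": "toxic", "vetted": False},  # not reviewed yet
]

# Build the training set from vetted reports only.
training_data = [(r["text"], r["label"]) for r in reports if r["vetted"]]

print(training_data)  # the unvetted report is excluded
```

The vetting step is the whole defense here: mass false reports would still land in the review queue, but they only become training signal after a human confirms them.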

So the most toxic community in gaming isn't going to feed it incorrect data?? loooolol

2 Likes

[citation needed]

This is about to be fed plenty of trash data.

1 Like

How are League players going to sabotage the Blizzard bot?

1 Like

Vetted data. They are going for toxic chat first.

I’m not sure if you are trolling with this thread, but I think it’s an interesting statement regardless. I am VERY wary of automated censorship and policing of things like communication.
It seems very Orwellian to me.

Oh boy, this won’t spiral out of control.

What is this in response to?

What’s this about? A Blizzard bot? Did I miss some news? Is this their new report system or chat moderation or something?

It will be searching for racism single-mindedly and banning players who spew it. What is the problem here? Maybe someone wanted to drop the ‘n’ word in a friendly way and gets falsely flagged? Doubtful.

Question, just one: given that usually no one knows the race of someone playing, how will the bot find out?

Yep, but in the context of a game’s chat channel? Which already has a reporting system which people review. It won’t be worse than what we have.

Hey robot wizard! Looks like I can still find ya!

1 Like

News… They hired a data science team to work on stuff. They are now starting to put it into practice in a small way.

More to come.

1 Like

Hell yeah!!! I’m never far away on the forums

The person is being edgy. He may be referring to hero selection.

I expect they will train it to look for griefers / throwers soon enough.

I doubt they will use reports to train it for that.

Not exactly; there have been instances where models excluded black people via a combination of zip code and income. Those are more systemic inequalities than stereotypes, but yeah, proper vetting of the data should help solve the problem.

1 Like
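A toy illustration of that proxy effect, with entirely synthetic data (none of these zip codes or numbers come from a real case): the "model" below never sees race, yet its decisions still split along racial lines because zip code correlates with race in the made-up data.

```python
# Synthetic proxy-discrimination sketch: the decision rule only looks at
# zip code and income, never race, but zip code acts as a proxy for race.
applicants = [
    {"race": "black", "zip": "60628", "income": 40_000},
    {"race": "black", "zip": "60628", "income": 55_000},
    {"race": "white", "zip": "60611", "income": 42_000},
    {"race": "white", "zip": "60611", "income": 58_000},
]

FAVORED_ZIPS = {"60611"}  # historically favored area in this toy data

def approve(applicant):
    # Race is never consulted; zip code does the discriminating.
    return applicant["zip"] in FAVORED_ZIPS or applicant["income"] > 100_000

by_race = {}
for a in applicants:
    by_race.setdefault(a["race"], []).append(approve(a))

rates = {race: sum(v) / len(v) for race, v in by_race.items()}
print(rates)  # {'black': 0.0, 'white': 1.0} despite near-identical incomes
```

That's why "just don't feed it race" isn't enough on its own: vetting has to catch features that stand in for race, not only explicit mentions of it.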

Oh yeah, I worked on a system to deal with similar problems. Wrote a research paper on it.

Building a system which works the same as the last system is easy. Building one based off the last system’s choices which works better is hard.

Vetting the chat logs is a very good start.