Content moderation would be fairly labor-intensive.
Content Moderation at this kind of scale is:
Impossible to do without both humans and algorithms
Always going to produce absurd, hypocritical, and controversial interpretations of and changes to the TOS
Going to cause a horrific workload for those programming the algorithms, who must devise ways of screening for TOS no-nos that are not actually well defined and are constantly changing
Going to literally traumatize and cause massive mental damage to the human moderators
Facebook has these problems on an even worse scale, and still operates what are basically computer-equipped sweatshops of hundreds or thousands of people in less economically developed parts of the world, most of whom report massive mental trauma from having to constantly review absolutely horrific content, day in, day out, for years.
100% agree, but the alternative is giving the average user moderation power and hoping they do a good job of it, or not moderating at all.
That's /an/ alternative.
Another alternative is /social networks this large should not exist/.
There are many, many other alternatives.
It's just that social networks this large have basically destroyed the brains of the people who use them, so now they can hardly imagine alternatives.
And that is /another/ argument for why they shouldn't exist: they normalize themselves the way social and cultural institutions do, but with none of the accountability that local and state governments at least theoretically have.
This is also an explanation of why such things are not likely to go away. In addition to being addictive at an individual level, the network effect creates peer pressure to engage more, and otherizes those who do not, making them social outcasts, at least within the relevant age ranges for a given platform. It has also already become pervasive in matters of direct economic importance, with many companies refusing to hire, and landlords refusing to rent, if they cannot first verify your social media presence on these large platforms.
To slightly inaccurately quote Morpheus from Deus Ex:
"The human being desires judgement; without this, group cohesion is impossible, and thus, civilization. At first you (humans) worshipped Gods, then, the fame and fortune of others. Next, it will be autonomous systems of surveillance, pervasive everywhere."
Welp, turns out that real-life, mass-scale social media networks literally are a hybrid or synthesis of the latter two mechanisms of social reverence/judgement.
Fair, but surely a lot of that is automated, no? You’d want a human to review it, but it’s not like you’d need people watching the streams constantly.
I’m just saying that eliminating 500 people means they have a lot more than 500 people working there, probably well over 2k. That’s way bigger than I expected.
I think you greatly underestimate how large of a platform Twitch truly is. They have over thirty million daily active users.
Probably. I only watch one streamer, and only occasionally.
That said, headcount shouldn’t need to scale much with more users. Look at Valve, which has ~360 employees and hit 33.5M active users, ~11M playing a game. Here’s some of what Valve does:
hardware products, like Steam Deck and Valve Index
Windows compat - Proton; granted, most of the people working on this aren’t Valve employees, but contractors Valve pays
make games - not often, but there’s still maintenance work
manage a CDN - not quite as much data as Twitch, but still substantial, and it’s certainly in the realm of not being a huge difference in terms of manpower to maintain
Steam Link app - available on many of the platforms you listed
Steam mobile app
Steam app - Linux, Windows, macOS
So Valve has a similar-ish level of complexity with well under 500 employees. Maybe Twitch needs another 100 or so employees to manage the CDN, but surely not another 1500 or more.
You know, people have tried to automate it in its totality, and they’ve tried to make it 50/50, but it turns out there’s not much to automate. Sure, you can automate copyright claims on media sources, but that’s about it. As soon as there’s any complexity to it, human review is necessary. You have to appreciate that content moderation mistakes can ripple into platform integrity and company image as well as user experience. The risks are easy to underestimate.
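For a sense of what that hybrid split looks like, here’s a minimal sketch of the "auto-action the easy cases, queue the rest for humans" pattern. Everything in it (the fingerprint set, the risk score, the thresholds, names like `lookup`-style helpers) is made up for illustration; it is not any real platform’s pipeline.

```python
# Hypothetical hybrid moderation triage: automation handles exact matches
# and very-low-risk items; everything ambiguous goes to a human queue.
from dataclasses import dataclass
from queue import Queue

@dataclass
class Clip:
    clip_id: str
    audio_fingerprint: str
    report_count: int

KNOWN_COPYRIGHT_FPS = {"fp-abc123"}   # stand-in for a real fingerprint DB
human_review_queue: Queue = Queue()   # ambiguous items land here

def classify_risk(clip: Clip) -> float:
    """Placeholder risk score; a real system would run an ML model here."""
    return min(1.0, clip.report_count / 10)

def triage(clip: Clip) -> str:
    # Easy case: an exact copyright fingerprint match can be auto-actioned.
    if clip.audio_fingerprint in KNOWN_COPYRIGHT_FPS:
        return "auto-muted (copyright match)"
    # Only auto-clear when the score is confidently low; mistakes are costly.
    if classify_risk(clip) < 0.2:
        return "auto-cleared"
    human_review_queue.put(clip)      # anything in between needs a person
    return "queued for human review"

print(triage(Clip("c1", "fp-abc123", 0)))  # auto-muted (copyright match)
print(triage(Clip("c2", "fp-zzz999", 7)))  # queued for human review
```

Note the asymmetry: the automated paths only fire on the unambiguous ends, which is exactly why the human queue never empties at scale.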
Oh sure. I’m just saying that a big chunk of it can be automated, so you’re left with manual review of clips that either users or bots generate. That’s a big workload, but how many people are we talking? 50? 500? I’m guessing it’s closer to 50 than 500, but I don’t really know.
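On the 50-vs-500 question, a quick back-of-envelope shows how much the answer swings with the assumptions. Every number below is a guess for illustration, not a Twitch figure:

```python
# Rough reviewer-headcount estimate; all inputs are assumptions.
reports_per_day = 500_000               # guess: flagged items per day
auto_resolved = 0.90                    # guess: share automation handles
seconds_per_review = 30                 # guess: avg human decision time
work_seconds_per_day = 8 * 3600 * 0.7   # 8h shift at ~70% utilization

manual_reviews = reports_per_day * (1 - auto_resolved)
reviewers = manual_reviews * seconds_per_review / work_seconds_per_day
print(round(reviewers))  # ~74 with these numbers
# Drop auto_resolved to 0.7 and it roughly triples to ~220; so the same
# platform lands near 50 or near 500 depending mostly on how much the
# automation actually resolves.
```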
So stop doing it then