Re: Why we really can't use Facebook for technical discussion.

> On Jun 7, 2021, at 9:30 AM, Keith Moore <moore@xxxxxxxxxxxxxxxxxxxx> wrote:
> 
> On 6/7/21 8:16 AM, Phillip Hallam-Baker wrote:
> 
>> What we have here is the predictable result of a company that failed to take moderation seriously and is now desperately throwing technology at the problem rather than fixing the core issue: they designed their environment to maximize conflict, because conflict was most profitable for them.
> As much as I hate FB (I left the platform in 2016 and have never looked back) I think "failed to take moderation seriously" glosses over a number of inherent problems with social media, particularly when done on a large scale.   
> 
> One is of course that human moderation is time-consuming and therefore expensive if the moderators are paid.  It's also hard for a large number of human moderators (required to deal with large volumes of users and traffic) to act uniformly.   On another widely used platform the moderators are almost completely arbitrary, despite supposedly enforcing a common set of clear standards.   So it's not surprising that social media platforms resort to algorithms.   And of course the algorithms are flawed, because AI is a long way from understanding the many subtleties of human interaction.
> 
> Unpaid human moderators can be even more capricious than paid humans, because the desire to impose one's own prejudices on others is a strong motivator of volunteers.
> 
> Even under the best of conditions moderation (whether done by humans or machines) is dangerous both because it can easily squelch valuable input, and because it's often easily gamed for that purpose by people for whom such input is inconvenient.     No matter how noble the intent, the effect of moderation is often to favor established prejudices, or worse, to enable bullying.
> 
> I don't claim to know the solution, but I don't think it's a simple matter of "taking moderation seriously."

Not to mention - I assume many of us have read what life is like for many of those moderators. Imagine sitting at a screen for 8 or 9 hours a day, staring at an endless stream of photos: child abuse, child porn, BDSM, violence, blood, gore - and that barely, barely scratches the surface. Having to make flash decisions about the worst of the worst of the worst of humanity. All day. Every day. For not much pay. And minimal psychological/mental health support.

https://www.theguardian.com/news/2017/may/25/facebook-moderator-underpaid-overburdened-extreme-content

https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

https://gizmodo.com/the-life-of-a-facebook-moderator-sounds-even-worse-than-1835656998

----
Andy Ringsmuth
5609 Harding Drive
Lincoln, NE 68521-5831
(402) 304-0083
andy@xxxxxxxxxxxx

“Better even die free, than to live slaves.” - Frederick Douglass, 1863
