Facebook moderators break NDAs to expose working conditions (theverge.com)

1014 points by notinversed 124 days ago

590 comments

cldellow 124 days ago

This was a valuable article to read.

Facebook is enormously valuable. They made something like $15B in net income in the last four quarters.

Content moderators are a necessary condition for that profit. If kiddie porn, gore and animal cruelty flooded the network, it would cease to be a destination visited by people that advertisers will pay to reach.

And yet, there are two sets of entry-level knowledge workers at Facebook: engineers ($150k/year, benefits, upward career trajectory) and content moderators ($30k/year, no benefits, likely going to acquire mental illnesses).

I understand the arguments about supply and demand of labour, but I'd have more respect for Facebook if they demonstrated awareness of this issue. The article talks about moderators re-evaluating the same piece of distressing content that they've already flagged. Why? I suspect because the moderator is cheap, and so Facebook isn't putting in the effort to ensure that each judgment is made only the minimum number of times.

More so than salary, I suspect Facebook considers the moderator cheap in terms of reputation risk. By outsourcing to contractors located offsite from main campus, engineers aren't thinking daily about the absolutely horrible stuff moderators are seeing, and so the one group doesn't impact Facebook's ability to hire engineers. This is a guess - can anyone at Facebook speak to whether engineers are aware of the working conditions of moderators, and agitate to improve their lot?

    ummwhat 124 days ago

    I've talked about this somewhere else. What you're seeing is just a manifestation of what I call "the fundamental problem of user-created content." Said problem is that warehousing and distributing content scales insanely well but curation does not. Until we have strong AI, curation is a manual process. Moderators just aren't efficient enough at processing content for their output to pay for a full-time salary. You can cut costs by making end users into moderators (the Reddit model) but results may vary.

    This problem applies to other forms of sites and content as well. The App Store gets hundreds of submissions in the game category per day. Hosting and distributing that content is easy. Only a tiny fraction of those games are going to be played and rated enough times to show up in a recommendation engine. The bulk of the incoming content stream isn't being matched to interested people at all (sorting by "interesting" is the same curation problem as filtering by "offensive").

    Literally every content platform is going to have some problem similar to Facebook. We can blame Facebook, but the reality is no one has a good solution. Not even me.

      harry8 124 days ago

      The solution is to legislate that it be done properly, which is a business expense. Where it is too expensive to be done properly, there is no business case to have a user-created content platform. Being a platform is enormously lucrative; shirking the responsibilities to cut costs and make it even more so is evil. Really.

        gradys 124 days ago

        Could you describe some potential legislation you'd be happy with? How would you define "properly"? I'm especially interested in how this would work in places like the United States where there are strong free speech laws.

          harry8 124 days ago

          The detail of how best to regulate is a subject matter for experts, not me. In principle, precisely the same way we have legislated against pumping sulphur dioxide out of a chimney into the atmosphere. Same rules for everyone. Now go and compete. You get experts on pollution, mitigation and economics to advise on what will work best for the outcomes you want to achieve. Similar domain experts are required here.

          Now I have no doubt that facebrick and the evil-goog will try for regulatory capture and employ their vast army of Washington lobbyists in rent-seeking activities, but that is a separate problem applicable in every industry where market failure means you have to regulate, not just this one. (And yes, it does need solving, absolutely urgently.) If you don't have to regulate because the market is working, great! Don't regulate! That's not what we have here; this is market failure, but we haven't yet regulated. It will come, because there are clearly externalities in this market. Well, it will come unless we give up on democracy first. Something with which facebrick and evil-goog seem really comfortable, as long as they supplant it.

      rhizome 124 days ago

      >Said problem is that warehousing and distributing content scales insanely well but curation does not

      I don't think it's a problem any more than weather is a "problem" that required the invention of the roof. As far as user-generated content goes, it's easy to invent the hose, but the faucet is much more complex.

      It's irresponsible to provide only acceleration ("reach," "virality," etc.) for information without also implementing brakes. Moderation has been a known fix for decades, but people like Zuck -- in disposition and inexperience -- won't launch anything human-powered, and moderation requires a human eye. Fuck us, right?

        ummwhat 124 days ago

        I think your view will change when you start putting numbers on it. I'm going to talk about the games-in-the-app-store example because it's the instance of the problem that I first thought about, and it is easier to put a monetary value on an app than on a comment.

        Let's suppose Apple wants to make sure every game in their store gets at least 5 ratings. To do this they are going to pay people to play each game for 5 minutes and then assign a rating. In an hour a curator can get through 12 games. Over an 8-hour day that's 96 games. Apple would need to hire about 7 (iirc) full-time employees just to give every game one rating. We wanted every game to have 5 ratings. Assuming that users on average provide one rating (but sometimes 0), we still need curators to provide the other 4. That means Apple would be employing 28 full-time employees just to get their content minimally curated.

        Let's use the salary from the article and say each curator costs 30k a year. That means this whole department of curation is going to cost about 900k a year.

        Most of what they are sorting is going to be shovelware anyway. To justify the cost of this department, the curators need to find gems: content that's really good but would otherwise have been overlooked. They need to find 900k of gems per year. Considering the average earnings of an app, this is just barely plausible. If you want to do anything more advanced with your curation, like have the curators generate keywords or figure out which demographic might like it, your budget is blown for sure.
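
        Spelled out as a quick sketch (the 672 games/day figure is my assumption; "hundreds of submissions per day" together with about 7 curators per rating implies roughly that):

            GAMES_PER_DAY = 672       # assumed: "hundreds of submissions per day"
            MINUTES_PER_GAME = 5
            GAMES_PER_CURATOR_DAY = (8 * 60) // MINUTES_PER_GAME  # 96 games/day
            RATINGS_NEEDED = 5
            USER_RATINGS = 1          # assume users contribute one rating on average
            SALARY = 30_000           # per curator, from the article

            # Curators must supply the remaining 4 ratings for every game.
            curators = GAMES_PER_DAY * (RATINGS_NEEDED - USER_RATINGS) / GAMES_PER_CURATOR_DAY
            print(curators)           # 28.0 full-time curators
            print(curators * SALARY)  # 840000.0 -- the ~900k/year ballpark above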

        IDK how the economics of comment curation works out, but I bet it's even worse. After all, no one is even paying for comments, and making a comment is way less effort than making an app.

          rhizome 123 days ago

          Why would they keep all the other numbers the same? They could raise the cost to list an app. They could charge extra for a "verified by Apple" designation. They could charge monthly fees to be listed in the App Store at all. Any of these would raise the bar on mere submissions to the App Store and, beyond that, supply a positive signal to the end user. There are plenty more options along these lines, too, but conventional business practice is to preserve the illusion that the internet is $free.

          I don't care how much it costs, though, and I don't think Zuck cares either. If FB remains unmoderated it will invariably turn into a shithole, and people will go elsewhere rather than wading through a bunch of racist anti-vax cartoons just to find out where their friends are meeting for lunch next week. FB has plenty of money to pay as many human moderators as it would take to stop complaints, in all languages, but they won't. Even where FB might go bankrupt without brakes on reach, they'd sooner go bankrupt than hire humans to do the work.

          BRAlNlAC 123 days ago

          I don’t understand your argument at all. Is 900k/year to curate a store that’s on almost all of their products (or even one) supposed to be a lot for Apple? Seems like a rounding error to me. Not to mention that the real benefit isn’t from finding gems but from not allowing rotten content to spoil your platform. If they (FAANGs) can poach neuroscientists with 7-figure salaries like we saw yesterday, they can afford to have humans do moderation and not treat those moderators like garbage. Also, they aren’t doing comment moderation, but content moderation - mostly videos, it seemed from the article. The fact of the matter is that causing PTSD in your workers is an enormous (not to mention cruel) externality that Facebook absolutely needs to be held accountable for.

            ummwhat 123 days ago

            The confusion arises because I'm not at all talking about filtering rotten content (aka Facebook's instance of the problem). I'm talking about sorting for good content, possibly identifying a niche hit in a stack of shovelware. On the surface these are very different tasks, but they are both manifestations of what I'm calling the fundamental problem. The task of curating, that is, identifying the content of incoming data to either sort or filter it, must still be done by hand and thus rarely if ever pays for itself. Sure, 900k is a rounding error to Apple, but how much do they make off that 900k? And keep in mind the 900k figure was a bare minimum just to accomplish 5 ratings.

            The issue is that if you want to identify content as "fun", "offensive", "dramatic", "tense", "crass" or any other of the myriad of labels that haven't been machine learned, very quickly the labor to generate these labels will cost more than the value generated by these labels.

              BRAlNlAC 122 days ago

              Thanks, this makes much more sense. Typically we’ve used the market as a judge/pre-judge, but low inspiration me-too content in an opaque marketplace seems to disrupt this model significantly. It seems to me the value, especially for Apple, would be from having a better store UX. Tags that find users good stuff may not lead to Gems that you can pump to get more sales, but it might mean on-boarding a % more users per quarter to your service/ecosystem—especially over time as reputation builds. Conversely, if the App Store becomes too cluttered with garbage users may lose interest in the device or even ecosystem entirely if the quality of new apps they discover drops off dramatically due to a lack of tools/curation. An efficient curated store certainly improves the feeling of an experience versus one loaded with junk. Tagging isn’t trivial, however it can lead to a much better experience.

      ngold 124 days ago

      15 billion dollars in pure profit. You could pay a living wage with health benefits, or you could hire twice as many people and have everyone work less while paying more.

      This is just Facebook being cheap.

        nojvek 124 days ago

        Something something. Corporations are hive mind AI optimizing for growth and profit. They don’t give a shit. Amazon doesn’t give a shit unless it impacts the bottom line. Google barely has support and YouTube kids is full of questionable content.

        solstice 123 days ago

        The image of the Chernobyl liquidators just popped into my mind: each worker only allowed to shovel radioactive shit for 90 seconds before having a mandatory time out. Treating content moderators in a similar way seems like a step in the right direction from a mental health perspective.

          Meskupie 123 days ago

          The highest-impact events will likely be the small percentage of posts that really resonate with the moderator. What would be best is the flexibility to take a break after reviewing posts of that severity, rather than simply periodically.

          The most important thing for moderators would be group and individual therapy sessions run at work with hired therapists. Group therapy is great because you can relate to other moderators' struggles, which helps with processing, and individual therapy helps process the most traumatic events. This is what Kik has implemented, and it seems to be effective.

      Spooky23 124 days ago

      It’s fundamental that some members of the public are problems. By prohibiting the content but keeping the people around, they are curating a culture where predators are allowed to roam on the platform as long as they don’t cross a specific line.

      They do this because acknowledging and dealing with bad users would also mean impacting fake users that make them money with ad fraud, growth metrics and ginning up engagement.

      End of the day, if you operate a social platform and you have people uploading child pornography and animal torture videos, you should do everything in your power to make the user go away, for good.

        nojvek 124 days ago

        I don’t get the point of animal torture videos. The amount of meat consumed in the world is ludicrous. Eating meat is supporting animal torture, in a way.

        Just saying.

          xodice 123 days ago

          Eating meat is part of life; no one is torturing them. Torture is for a reason or for fun. There is a reason some animals are raised and used for food, and it's not to have fun torturing them.

          I let you live your way, don't try to force others to live your way by using "scare tactics".

            c0vfefe 122 days ago

            Well, certain practices common within the meat industry are torturous, regardless of whether they're intended for fun. "Food, Inc." & "Earthlings" are good resources.

      hashkb 124 days ago

      > We can blame Facebook, but the reality is no one has a good solution.

      We can take a step backwards and eliminate Facebook, Insta, etc from our societies and enact legislation that will govern the second generation of social networks.

        Jedi72 124 days ago

        What legislation? Specifically? What do you want to make law, that they have to hire moderators (which they already do?) or do you want to make it illegal to post "bad things" on the internet?

          Spooky23 124 days ago

          I would start with a few things:

          - Make social networks mandated reporters for suspected child abuse. Like teachers or little league coaches, reporting would absolve them of liability.

          - Prohibit generation of profit from illegal activity.

          - Require transparency in political advertising, with an accountable entity and individual listed and publicly available.

          - Require reporting of funding source and material individuals for paid political advertising.

          - Make the platform liable for false product claims in cases where the platform facilitates direct or implied endorsements for a product. (“Jedi72 likes this homeopathic cancer treatment!”)

          - Make the platform responsible for reasonable efforts to block and report on actions against fictitious users posing as people. Share liability for any claims, including libel, resulting from the actions of fictitious people.

          A regulatory regime like this would either improve the signal/noise ratio or cause the company to abandon or prohibit certain actions or functions.

          hashkb 124 days ago

          Let me start by saying that I am not running for elected office because it's a really hard job; even though I might believe I would do it with more integrity and rationality than the next person, I'm also smart enough to realize that everyone thinks that way and I'd probably end up within a standard deviation of average corruption.

          What I'm saying, seriously, is that we need a society-wide blameless post-mortem on social networks. It needs to be a slow, careful discussion, where all the stakeholders have their voices heard, and we decide what is good for us all. I don't know that we need them at all; I'm not sure they provide ANY value to ANYONE, but as a non-user I'd of course be open to persuasion by current victims. Erm, users. In the meantime, it should be illegal to operate a social network. Nobody needs to go to prison yet, even though I think it's horrifyingly clear at this point that the ethics are out the window.

          One obvious one: advertising and public conversation must always be separated; the same way no public schoolteacher may read scripture in class, it should be illegal for an internet service that hosts public conversation (e.g. Twitter) to allow sponsored content.

          We should also establish guidelines for addiction. After cigarettes, drugs, sugar, etc; we should as a society be prepared to understand the various forms that addictive products can take and regulate them aggressively before they become serious problems. UX patterns like infinite scroll, pull-to-refresh, and push notifications are particularly suspect.

          I also think we should close the apparent loophole in COPPA that allows parents to post photos of their children on social networks.

          > that they have to hire moderators (which they already do?) or do you want to make it illegal to post "bad things" on the internet?

          I tend to come down on the "free speech" side of these issues as much as possible; I would prefer unregulated public fora that (perhaps by requiring identity verification) encouraged good-faith participation in substantial discussions. I think if you are just fooling around, maybe you should head down to the bar and get drunk with your friends and do your shit talking there.

            dragonwriter 124 days ago

            > What I'm saying, seriously, is that we need a society-wide blameless post-mortem on social networks

            Social networks aren't dead, so we can't have a post-mortem.

            Also, you can't have a society-wide blameless analytical conversation about anything, especially if there are significant conflicting interests involved; that's not how people work. (It's perhaps noteworthy that even while suggesting this you offer no objective descriptions of impacts with reasoned analysis of contributing factors, instead jumping straight to blame and policy responses, with some labelled as “obvious”.)

            Macross8299 123 days ago

            >In the meantime, it should be illegal to operate a social network

            So what about the countless number of people whose livelihoods depend on social media in some capacity? They're supposed to just be fine with being irrevocably fucked over until the moral panic about social media subsides? (or, in your words, "we decide what is good for us all")

            Where does the small business owner who depends on a social network factor in as a "stakeholder"?

            Drugs, sugar, cigarettes are measurably and objectively harmful to your health and that's why they are regulated (or should be).

            There isn't comparable scientific evidence that social media is anywhere near as harmful, aside from questionable non-reproducible psychology studies, so I don't think the comparison holds at all.

              hashkb 123 days ago

              I hear you saying you don't believe social media is addictive. Will you please kindly open the "digital wellbeing" app (or whatever it's called on iOS) on your phone and share the number of hours you've used social networks over the last week?

                reciprocity 123 days ago

                I couldn't help but notice that your comment doesn't address the other parts of his argument, nor did you choose to respond to the earlier reply you received (by dragonwriter) outlining the legitimate flaws in the reasoning you provided with your initial opinion.

                Social media has a lot of problems - even this article on just Facebook outlines a number of them [0] - but I agree with the above poster in saying that you can't have a societal post-mortem analysis of the effects of social media given its very much _not dead_ state. Advocating that society just presses a 'shut down' button until "we collectively decide how we should proceed" is an entirely unrealistic scenario.

                [0] https://en.wikipedia.org/wiki/Criticism_of_Facebook

                  hashkb 111 days ago

                  > nor did you choose to respond to the earlier reply you received

                  HN's posting frequency limits made that choice for me. The system is unfortunately biased towards drive-by comments and against engaging with feedback on your own comments.

                  Edit:

                  > Advocating that society just presses a 'shut down' button until "we collectively decide how we should proceed" is an entirely unrealistic scenario

                  But that's what I'm advocating. I don't think I'm obligated to respond to people who only came by to reject the premise of my argument. More interesting discussions are available to anyone who shows up with an open mind.

            jki275 124 days ago

            You claim to be for free speech, yet most of what you propose directly infringes on the concept. You don’t get to determine who gets to say what.

            closeparen 124 days ago

            The point of free speech is that it’s not regulated according to an opinion about “what is good for us all” or what is “needed.”

        1stcity3rdcoast 124 days ago

        Ah, yes, Congress — the most competent subject matter experts regarding the internet and technology.

        Chazprime 124 days ago

        Well, to do that you’d need a congress that has the first clue what social media even is, and I’m guessing the US is easily 5-10 years away from that.

        ummwhat 124 days ago

        How does that solve the problem that the incoming content stream flows faster than any curation mechanism we can economically build today?

          joeblubaugh 123 days ago

          Well, first you slow the content stream by imposing a rate limit per-user, and users who post things that violate the policy are stopped from posting for a longer time.

          Second, you do better at automatically detecting duplicate content that’s already been banned.
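
          A minimal sketch of the first idea, assuming a token-bucket model where policy violations slow a user's refill rate (names and numbers are illustrative, not any real platform's mechanism):

              import time
              from collections import defaultdict

              class PostRateLimiter:
                  # Token bucket per user: `burst` posts up front, refilled at
                  # `rate` tokens/sec. Violations multiply a penalty factor
                  # that slows future refills, i.e. a longer posting cool-down.
                  def __init__(self, rate=1 / 60.0, burst=5):
                      self.rate, self.burst = rate, burst
                      self.tokens = defaultdict(lambda: burst)
                      self.last = defaultdict(time.monotonic)
                      self.penalty = defaultdict(lambda: 1.0)

                  def allow(self, user_id):
                      now = time.monotonic()
                      elapsed = now - self.last[user_id]
                      self.last[user_id] = now
                      refill = elapsed * self.rate / self.penalty[user_id]
                      self.tokens[user_id] = min(self.burst, self.tokens[user_id] + refill)
                      if self.tokens[user_id] >= 1:
                          self.tokens[user_id] -= 1
                          return True
                      return False

                  def punish(self, user_id, factor=4.0):
                      # Call when a user's post is found to violate policy.
                      self.penalty[user_id] *= factor

          The second step is essentially the fingerprinting idea sketched elsewhere in this thread.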

      nibbula 124 days ago

      "Strong" AI is a person, whom you also shouldn't mistreat. Let's hope they think the same about us.

      lone_haxx0r 124 days ago

      Do you mean a business solution or a solution to the "social problem" of people posting disgusting content?

      I have no business solution, but the solution to the latter is to simply ignore/block people who post stuff you don't like.

    Miner49er 124 days ago

    Facebook won't do anything to help these moderators unless the cost of the negative PR exceeds the cost of doing something. Engineers at FB could help achieve this by causing agitation, for sure.

    Maybe the best solution is for the moderators to unionize.

      plankers 124 days ago

      If you can think of a practical way to unionize workers on a 1-2 year contract with a high turnover rate, I'd love to hear it.

      The deck is stacked against employee organization. If I were conspiracy-minded, I might even say this was by design.

        Barrin92 124 days ago

        >If you can think of a practical way to unionize workers on a 1-2 year contract with a high turnover rate, I'd love to hear it.

        Adopt the Danish model: make it mandatory for companies with workforces over a certain size to at least vote on unionisation. This will shift the default from not being unionised to being unionised, except in cases where the workforce is actively convinced it's detrimental.

          plankers 123 days ago

          I love that idea! But I have a hard time seeing a law like that being passed in the U.S., even in liberal California.

          Side thought, things like this help me understand why the Danes are so darn happy.

      hashkb 124 days ago

      They don't actually meet the legal requirements for contractors. In CA, anyway, they'd have to be full-time employees, since they do the same thing every day, full time; they are required to be on-prem during dictated hours; and the business relies on their work (e.g. they can never finish their project).

        otterley 124 days ago

        The moderators are employees of the contracted organization (Cognizant, in this case). They’re not employees of FB.

          hashkb 124 days ago

          Cognizant pays them hourly; they are referred to as "contractors" multiple times in the article, while the word "salary" does not appear.

            dragonwriter 124 days ago

            Hourly employees are still employees. Salary vs. hourly is a completely different distinction from employee vs. contractor (you can have an hourly or fixed monthly/annual contract rate, and you can get either model as an employee, though salary is more common if your pay and working conditions make you an FLSA-exempt employee, since otherwise you might need to be treated as hourly for overtime and certain other things anyway).

      ihm 124 days ago

      The union would be stronger if it included the engineers too, who have more leverage.

        consumer451 124 days ago

        In my idealized universe FB engineers would have to rotate through moderation for 1 week per year. In my impossibly idealized universe the C suite would do the same.

        esoterica 124 days ago

        Every single union on the planet was formed because workers wanted to improve their own wages and working conditions. Nobody organizes to protect other people’s interests. Why should engineers be the only rubes on the planet using their bargaining power to help other people instead of themselves?

          mistersquid 124 days ago

          > Every single union in the planet was formed because workers wanted to improve their own wages and working conditions. Nobody organizes to protect other people’s interests.

          This overlooks the possibility that workers with different statuses can recognize commonality and, so, unionize as a coalition.

          This is precisely what happened when the graduate students at the University of Virginia unionized with the University classified staff under UAW for health care and a living wage in 1996ish (? IIRC).

          Workers can unionize across pay grades and professional classification if and when it suits such workers to do so.

        zamber 124 days ago

        I'm convinced that unionizing at this scale does not work. They'll always find cheaper people to do the grunt work which presumably is there only to teach the AI. Maybe when the AI is stable enough this AI "caretaker" position will take over and the burden of traumatic content will be alleviated. Maybe.

        Up until then people should unionize. Do whatever it takes to create a workplace which does not drain sanity from people.

        Personally I understand browsing toxic content to foster "immunity" or at least familiarity. Doing it for money sounds like a lose-lose scenario. Even call centers sound better, given the conditions.

      shujito 124 days ago

      ...which is unlikely to happen because of the NDAs they have to sign.

      If there's something wrong with this reasoning, please elaborate.

        vageli 124 days ago

        > ...which is unlikely to happen because of the NDAs they have to sign.

        > If there's something wrong with this reasoning, please elaborate.

        I would be utterly surprised if an NDA could be used to prevent an employee from exercising their right to organize (in the US at least). Just because you sign something does not make it legally enforceable.

          greghatch 124 days ago

          Yeah - but how would they know what is legally enforceable or not?

          It's really complicated, and I don't think it's reasonable to assume people have access to a clear answer on enforceability - certainly not an answer they could rely upon enough to risk their job over.

            vageli 124 days ago

            I agree the whole thing is complicated and especially hard to navigate without guidance. I was merely responding to parent asking to validate their reasoning.

    ralusek 124 days ago

    I don't understand how this isn't a problem perfectly solved by the market. If it's a shit job that isn't worth the pay, people won't do it. If it's an okay job that isn't really worth the pay, they will have high turnover. High turnover has a cost, maybe it's worth it to them. If the turnover is so high or the job is so undesirable, maybe they'll axe the role. If the role is so important to the business that it can't be axed, but people aren't taking the jobs, they'll have to pay more for it. If they can't pay more for it because it cuts too much into profitability, but it's necessary for the company to operate, then they are no longer a viable company in the market. Or they'll have to innovate or adapt. All possibilities are fully encapsulated in very simple dynamics.

      sandworm101 124 days ago

      >> If it's a shit job that isn't worth the pay, people won't do it.

      That assumes perfect knowledge. As mentioned in the article, these moderators often do not understand what they are getting into until long after they have committed to the job. Once you have quit a previous job, moved, and set up your new life, only then do you realize your new "brand consultant" job means watching kittens die all day.

      Jobs like this also expose people to realities that we normally keep tucked away deep in our brains. We all know that cops don't investigate most reported crimes, but that doesn't come up very often for us. A Facebook moderator sees the crimes, and sees the lack of response by the police or anyone else, every minute. Repeatedly, every minute. Nobody can appreciate what this does to one's mental state until you have been there.

        debacle 124 days ago

        Nobody moves for $30k a year.

          skyyler 124 days ago

          Are you serious? I know people that have moved for less. Making absolute statements like that doesn't do much for anyone.

          sandworm101 124 days ago

          Lol. Ever seen a migrant camp? Ever talked to a seasonal farm laborer? People will literally walk across continents for $30k/year.

          https://www.payscale.com/research/US/Job=Soldier/Salary

            jumbopapa 124 days ago

            Sounds like $30k is an improvement in their lives then, so is it really "too little"?

              lifeformed 124 days ago

              The word is "exploitive". Desperate people will do things because they have to. If I pay a homeless person $20 to humiliate himself, is that a fair transaction, or an exploitation of the asymmetry between our conditions?

                Spooky23 124 days ago

                To the chubby engineer type in 2019, yes. Make the drunk dwarf dance, it’s a free market.

                When recession strikes and the gravy train ends, the same folks will be wearing Che Guevara T-shirts crying about unfairness.

                ekianjo 124 days ago

                There is no obligation for anyone to do anything. Do you assume that there is no free will and we are all like robots driven by short term profits?

                  pizza234 123 days ago

                  That doesn't answer the parent's question.

                  If you call $20, for a person who owns nothing and is probably starving, a "short term profit", you've very likely never experienced what it is like to live in poor conditions.

              JauntyHatAngle 124 days ago

              It is very possible to "improve" somebody's life and still be abusing your power over them and causing them damage in other ways.

              vageli 124 days ago

              > Sounds like $30k is an improvement in their lives then, so is it really "too little"?

              Being put in prison is an improvement for many individuals in so far as they are guaranteed three square meals and a warm place to lay their head. Would you seriously argue that that's an improvement that is worth the cost to the individual?

          vageli 124 days ago

          > Nobody moves for $30k a year.

          I personally know people who have moved for half that amount.

      xnyan 124 days ago

      "And the Union workhouses." demanded Scrooge. "Are they still in operation?" "Both very busy, sir..." "Those who are badly off must go there." "Many can't go there; and many would rather die." "If they would rather die," said Scrooge, "they had better do it, and decrease the surplus population."

      C. Dickens: A Christmas Carol. In Prose. Being a Ghost Story of Christmas, 1843

        ralusek 124 days ago

        That literally describes the problem though. There are too many people that can do that job, so the surplus supply keeps it from being valuable to the market. It isn't the responsibility of the Scrooges and the Facebooks to account for the fact that there isn't enough demand for that position for it to pay well. If people are unable to find work, or are otherwise unable to have enough stability so as to avoid exploitative work, that becomes a societal responsibility. Don't tell Facebook what that job should be worth, or what it should entail; that doesn't make any sense. Just like minimum wage laws, they don't make any sense.

        If we're legitimately talking about situations where the balance of power is so unfavorable to the unemployed individual, the solution is to start talking about UBI, not micromanaging very specific roles at very specific companies.

          xnyan 124 days ago

          https://twitter.com/lil_yenta/status/1140788295175233536

          edit: if it was not clear, facebook is in fact part of society (by definition) and their responsibility is proportional to their role in society, no more no less.

            ralusek 124 days ago

            Yes, and taxes + UBI would represent that proportion. This meme is also only ridiculous because of the $100 figure. What if someone paid you $20k to fuck off? $100k to fuck off?

              xnyan 124 days ago

              Sorry for the discourse; my polisci past is flaring up, but I do in fact explain why I support this (silly) tweet.

              There are some kinds of nonviolent behavior we don't allow because we (I guess "we" is the majority of society?) think it's wrong, often around issues of financial exploitation. For example, even if you are mentally sound you can't sell yourself into indentured servitude, because that tends to only happen when someone with power exploits someone desperate, often (but not always) for reasons out of their control.

              UBI is basically an artificial wage. Like a lot of things in life, a wage is a negotiation between two entities of vastly different levels of power. As every public company proudly proclaims, the value of what employees do for them is far, far more than the value the employees receive.

              Why do you think those workers accept this fact? They like getting less value than they generated? It's clearly the result of an exploitative power dynamic - one side has to work or not eat, the other does not.

              If you think someone should not have medicine and food dangled over their head in wage negotiations, the answer is that workers need to be paid the real value of the capital that they create, and that needs to be law.

              I hope one day our wages are looked back on like the absence of a minimum wage, the 6-day week of 14+ hour workdays, child labor, indentured servitude, slavery, etc., which were often viewed as normal in their time but are now seen as disgusting.

              This GIF is mocking the idea of UBI and while for sure a silly meme, has an element of truth IMO.

                closeparen 124 days ago

                Why would it need to be law? Since, in your view, capital / management / entrepreneurship contribute nothing of value, workers should have no problem creating successful businesses without them. Particularly if UBI takes care of basic needs during the early stages.

                  xnyan 124 days ago

                  Why do you feel the need to misrepresent my argument? When did I say that management contributes nothing of value? Of all the arguments I was expecting, this one surprises me as it makes no sense.

                  Capitalism works on leveraging advantage to pay workers less than the value that they create. Do you have anything to say about that, or are you going to continue to suggest that magic discount coupons (UBI), worth less than the value a worker creates, are something besides slavery with extra steps?

                  Feel free to scoff/mock me for questioning sacred holy capitalism - when immoral ideals are questioned and no logical answer is available, history tells us emotion and feelings are the standard responses.

                    closeparen 120 days ago

                    A thing is worth its market-clearing price. Facebook has $15B in revenue and pays $12B in expenses (for the sake of argument, it's all payroll). I claim the workers are getting exactly what they're worth, and the remaining $3B is the value contributed by capital/management. Your claim seems to be that workers created the full $15B in value, and the $3B is being stolen from them. Apologies if you meant something else.

                    If that's true, Facebook's workforce could quit tomorrow, go build a competitor, and claim the full $15B for themselves. Why don't they do that?

                    The traditional Communist argument has been that they can't because capital owns the means of production. Factory workers would happily go start their own factory, but they can't afford the machines to fill it. That argument doesn't work too well here: the means of production needed to build Facebook are accessible even to college students. Basically an editor, a scripting language, a web server, and some linux boxes.

                    Another reasonable argument is that they can't because they need to eat. The time it would take between quitting their jobs at Facebook, Inc. and seeing revenue from Facebook Co-Op would exhaust their savings. That is a problem UBI can solve. It's not that UBI pays what you are worth, but that it removes the urgency that sometimes prevents people from getting their full value in the market.

                    This idea - that workers create all the value in a company, not just the subset they are paid for - is testable. If you're sure it's true and you want everyone else to believe it, then I think you'd want to run the experiment. UBI could be a way of doing that. Could also do grants that pay living expenses during entrepreneurial activity, or just straight-up house and feed co-op founders without asking for an equity stake.


          Spooky23 124 days ago

          Do you know what a Victorian-era workhouse is?

            xnyan 124 days ago

            If someone suffered more than you, your suffering is invalid - is that your argument? Trying to understand what you are arguing.

              Spooky23 123 days ago

              Workhouses were the UBI of the era. Sort of a combo welfare and prison.

        pfortuny 124 days ago

        Thanks for posting this. There are people who do not get it.

      Lewton 124 days ago

      > If it's a shit job that isn't worth the pay, people won't do it.

      That is not how the labor market works... At all.

      "Worth the pay" is overridden by "absolutely need this money for me or my kids to survive"

        ralusek 124 days ago

        I responded to another poster. Desperation (debatable regarding whether or not this is possible in the United States) breaks markets, sure, but that doesn't mean that they suddenly need to be micromanaged. I have long been in favor of UBI in order to remove desperation from the table. I'm not, however, in favor of micromanaging roles at companies and arbitrarily determining what they're worth.

          dragonsky67 124 days ago

          I've seen desperation in Australia, and our social welfare is head and shoulders above the US's. US$30,000 would revolutionise many people's lives. They would put up with a lot of crap to get that type of money.

          p1necone 124 days ago

          > debatable regarding whether or not this is possible in the United States

          Wha...? Are you really that far removed from reality?

      8bitsrule 124 days ago

      >> If it's a shit job that isn't worth the pay, people won't do it.

      Tell that to the people who pick your fruits and veggies.

      social_quotient 124 days ago

      I think the same about Uber and the drama with drivers. If the work is so bad, the pay so terrible, and riders don’t want to pay more, then why hasn’t the market killed the model off?

      Now we have government entities trying to legislate something that shouldn’t have existed?

      I’m not saying any of this is right or wrong; I’m just at a loss how people get mad at Uber when the model shouldn’t have made it. Where did the market dynamics go wrong here?

        Spooky23 124 days ago

        It will, but it will take a few years.

        The issue with Uber is they pay $0.70/mi for a task where the cost of fulfilling the task is $1.00. The drivers do it because they can get quick cash with no friction.

        You can already see the system breaking down. Uber XLs are getting older and shittier. The 2012 Dodge Caravan that picked me up at the airport will be dead in a year. As the situation goes on, someone will die in a fiery wreck caused by an Uber with no brakes, and it will gather national attention and kill the company.

          manfredo 123 days ago

          I always thought that the car age requirement was dumb. I drove a 2007 Subaru and it still worked perfectly. Another person I knew drove a 1997 BMW that's better than most Ubers I take.

            Spooky23 123 days ago

            I drive a 2003 Honda. It’s a great car in great shape. If I used it as an urban livery vehicle, it wouldn’t be in great shape.

      michaelt 124 days ago

      > I don't understand how this isn't a problem perfectly solved by the market.
      
      If I offer a homeless person $30 to let me spit on their face and they accept, the market has worked perfectly - but some people will still think I'm an asshole for spitting on that homeless person.

      sonnyblarney 124 days ago

      Below a certain level, people don't have any leverage at all.

      Most people depend on their meagre income quite greatly; it's a bold thing to say someone could just 'quit' their job and spend 9 months looking for another one, especially if they don't have some pretty hard skills.

      So in the realm of talent, yes, it would solve itself, which is why devs et al. often get 'free lunch' among other things.

      For most working people, it's much harder.

      "All possibilities are fully encapsulated in very simple dynamics."

      Yes, of course, but the incredible power asymmetry between FB and not hugely skilled workers is the dynamic that drives all of this.

    Shivetya 124 days ago

    Anyone who has done moderation for any active forum can tell you how nightmarish it can be. The more popular the platform, the more problems you can have.

    Anecdotally, a friend who streams on Twitch also mods for a few streamers, and they have issues even when a stream is in subscriber-only mode, simply because anonymity and distance from those being affected empower people to do bad things.

    Edit: I am really surprised there aren't companies springing up to provide these services, seeing how most of these activities are required by law.

    However, before bemoaning what they are paid, just go look at your local 911 operators, who are government employees. Just because we think the job should be paid more doesn't mean others do.

      kevingadd 124 days ago

      Anonymity is definitely empowerment (especially because Twitch has no effective ban system, so bad actors will create dozens of accounts and keep coming back), but even people acting under their real names will do horrible things. We've seen this with real names on FB/Google+.

      It seems like there are no easy solutions for this and it's really frustrating: Societal norms are just completely shredded right now.

        nvr219 124 days ago

        > but even people acting under their real names will do horrible things. We've seen this with real names on FB/Google+.

        And on twitter. Their most famous user uses his real name.

    jfindley 124 days ago

    Given the description of the videos in the article, I'm not sure that merely paying the moderators more would solve it.

    I think for most people that I'd want to be moderating content (e.g. not sociopaths), making them richer isn't going to do anything to deal with the real reason this is worse than a call centre - the content itself. I'd rather see Facebook put more effort into protecting moderators from the content they're viewing. I realise this is a non-trivial problem, but here are a few ideas of things they could do which may help:

    * Once a video is confirmed bad, fingerprint it and block future uploads (see the sketch after this list)

    * Provide colour-inversion buttons to help reduce visual impact

    * Rotate people between areas, so they aren't looking at the same class of bad content constantly

    * Use ML to detect which content is likely to be the most severe and, for this content, reduce the minimum viewing interval to ~5 seconds, and ensure that a single individual doesn't get too many of these.

    * Flag accounts who post bad content too often, and shadow-ban them so their posts are not visible by anyone else (this could go in stages, starting off by blocking re-shares of their posts or something)
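
    On the first bullet, here's a minimal sketch of perceptual fingerprinting using a difference hash (dHash), which tolerates re-encoding and small edits better than an exact checksum. This is purely illustrative - production systems use far more robust schemes (PhotoDNA-style hashing, video-level matching). Assumes Pillow:

        from PIL import Image

        def dhash(image, hash_size=8):
            # Shrink to (hash_size+1) x hash_size grayscale, then compare
            # horizontally adjacent pixels to build a 64-bit fingerprint.
            img = image.convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
            pixels = list(img.getdata())
            bits = 0
            for row in range(hash_size):
                for col in range(hash_size):
                    left = pixels[row * (hash_size + 1) + col]
                    right = pixels[row * (hash_size + 1) + col + 1]
                    bits = (bits << 1) | (left > right)
            return bits

        def is_known_bad(image, banned_hashes, max_distance=5):
            # A small Hamming distance to any banned hash counts as a match.
            h = dhash(image)
            return any(bin(h ^ b).count("1") <= max_distance for b in banned_hashes)

    For video, the same idea applies per keyframe. And since context matters (see the death-camp-image discussion below), a fingerprint match should probably queue a single re-review rather than auto-ban.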

      jnbiche 124 days ago

      > Flag accounts who post bad content too often

      Too often? If we're talking about content that is traumatic to view (extreme violence, child porn, etc) then I think one time is enough for a flag, no?

      That said, I do agree that shadow banning is the best option. Let them spin their wheels for a few weeks posting content before they realize no one is seeing it.

      zanmat0 124 days ago

      >* Provide colour-inversion buttons to help reduce visual impact

      To add to this, some degree of pixelation would help without impacting the accuracy of assessments.

      quadcore 124 days ago

      > Provide colour-inversion buttons to help reduce visual impact

      I was thinking also about blurring the picture/video.

        afasadaf 124 days ago

        They could have levels, and use moderator consensus to remove the pictures

        1. Blurred + mosaic
        2. Contour lines only
        3. Windowed/striped (only showing like 10-25% of the photo)

        If 2/3 of the moderators above agree that it's a bad photo, it moves onto stage 4:

        4. Color filters - break the photo down into k-means color clusters; moderators choose which colors to see. They can see 100% of the colors, or one, or two, or however many they need.

        If #4 agrees, the photo goes into the round folder.

        No need for anyone to ever see the whole photo. It'll be pretty obvious that something was violating the TOS without anyone being able to identify anything specific about the photo.

        We're not talking about 'beyond the shadow of a doubt' here, we're talking about 'yeah, that's probably not ok.'
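
        A rough sketch of what stages 1-3 could look like with Pillow (function names are mine and purely illustrative; stage 4 could run k-means over the pixel values, e.g. with scikit-learn, and black out all but the chosen clusters):

            from PIL import Image, ImageFilter

            def stage1_blur_mosaic(img, block=16):
                # Heavy blur plus mosaic: blur, downsample, upsample nearest-neighbor.
                small = img.filter(ImageFilter.GaussianBlur(8)).resize(
                    (max(1, img.width // block), max(1, img.height // block)))
                return small.resize(img.size, Image.NEAREST)

            def stage2_contours(img):
                # Edge contours only.
                return img.convert("L").filter(ImageFilter.CONTOUR)

            def stage3_striped(img, stripe=40, show_every=4):
                # Reveal roughly 25% of the image as horizontal stripes.
                out = Image.new(img.mode, img.size)
                for y in range(0, img.height, stripe):
                    if (y // stripe) % show_every == 0:
                        box = (0, y, img.width, min(y + stripe, img.height))
                        out.paste(img.crop(box), (0, y))
                return out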

    m463 124 days ago

    Stuff like this should be brought in-house. You can't outsource responsibility. Engineering and AI/pattern matching should be in the loop as a key ingredient so that a terrible image/video won't have to be viewed by human eyes at all, or worst case just once.

      laminar_flow 124 days ago

      I would agree. Having to watch some of this stuff is a great incentive to make the AI better.

    RegBarclay 124 days ago

    I thought the same thing -- that once content is moderated, it should be trivial to block exactly the same content from being posted and moderated again. What a waste.

      Nasrudith 124 days ago

      The problem is actual use and context sensitivity. If they let historically significant images get flat-out banned, they look like big bad idiots trying to memory-hole history by blocking, say, death camp images - while still having to ban asshats who post the same image and say "Not enough."

      Neither the text nor the image alone would be ban-worthy, but in combination it is boot-worthy. Extrapolating from examples runs into Scunthorpe-problem issues.

      LinuxBender 124 days ago

      The content may get modified quite a bit if censored, though. A good example was the Christchurch video. 4chan members modified it so many times, there are probably 4k-5k different iterations of it. NZ made a big deal about blocking it and arresting people for viewing it, making the demand for it significantly higher. There would have to be some really efficient and highly optimized ML to group all the variations together with a common label. AFAIK this does not exist for videos yet.

      The only partial mitigation that comes to mind would be if a small circle of your friends had to "approve" your video before people outside your circle could see it. That would put them on the hook for sharing bad things too. That might even reduce the psychological load on the moderators.

    baud147258 124 days ago

    > cheap in terms of reputation risk

    Also in terms of the reputation of Facebook: with any moderation failure, Facebook can easily say that it's the contractor's fault.

    Cthulhu_ 124 days ago

    Re: salaries, don't forget that there are also salaried people working on the content moderation problem - taking the huge datasets that the content moderators are reviewing and applying machine learning, content ID, and I have no clue what else, to automatically filter out a lot of the content, ideally with such a high certainty rate that the moderators don't see it in the first place.

    I want to believe the content moderators only end up seeing the 1% of content that is dubious, or brand new content (not published and hashed earlier). I don't understand in that regard why the one video mentioned in the article (about the iguana) wasn't marked as "ok" in the systems, or marked as "on hold for law enforcement" so the moderators didn't have to see it again.

    anon029102 124 days ago

    > Can anyone at Facebook speak to whether engineers are aware of the working conditions of moderators, and agitate to improve their lot?

    They are passively aware but mostly don't care. The engineers who express frustration are in the vast minority. Most engineers are just super happy about their own situations, very focused on PSCs, and try to keep to their own stuff. Any internal frustrations are channelled towards appropriate feedback channels where the anger is watered down.

      plankers 124 days ago

      Nobody's making waves when everybody is focused on their own struggle. Gamifying for a docile workforce.

      johnkpaul 124 days ago

      PSCs?

        tristanperry 124 days ago

        From a quick Google, "Performance Summary Cycle" - presumably regular reviews and bonuses.


    hashkb 124 days ago

    > Facebook is enormously valuable. They made something like $15B in net income in the last four quarters.

    Their profit is not proof they are valuable, because they are willfully abdicating responsibility for the damage they cause, and, further, involving themselves in government debates over their own regulation.

    Advertising businesses are inherently suspect, and when one is addictive and potentially enabling election fraud etc, we need to be extra careful in declaring its value.

    Anyone participating in the debate should have to disclose if they are a daily user or if their company advertises on any social network. (I'm clean)

    Edit: my opinion of course is that FB, Insta, etc (any advertising product with push notifications and/or infinite scroll) ought to be illegal. In 50 or 100 years we'll all recognize that social networks are like internet cigarettes.

      mehrdadn 124 days ago

      > my opinion of course is that FB, Insta, etc (any advertising product with push notifications and/or infinite scroll) ought to be illegal. In 50 or 100 years we'll all recognize that social networks are like internet cigarettes.

      Well cigarettes are still not illegal, so we have some ways to go...

        allthecybers 124 days ago

        This was an earlier comment that disappeared. Frankly I ask myself consistently "what net value does Facebook add to the world?" other than being a 100% optional vice.

    scarface74 124 days ago

    > $30k/year, no benefits, likely going to acquire mental illnesses

    Why do you think they don’t get benefits? Facebook contracts out to another company. The company they contract with is probably still providing some benefits.

      cldellow 124 days ago

      That's true, the article says Cognizant offers "a 24/7 hotline, full healthcare benefits, and other wellness programs". But at the same time, it quotes the on-site counsellor as having said "I don’t really know how to help you guys" when an employee went to him for help.

      This leads me to believe that there is a qualitative difference in the calibre of benefits Cognizant provides versus those that Facebook provides.

    Further, since the job has high turnover and can be expected to take a high toll on an employee's wellbeing, it's also important to me to think about what happens to their benefits if/when they quit. In the US model of employer-linked healthcare, this tends to result in losing your benefits when you need them most. By contrast, other organizations that do similar moderation jobs seem to offer psychological health care treatment even after the employee leaves the company. See https://www.theguardian.com/news/2017/may/25/facebook-modera... for more info.

    krick 124 days ago

    > I understand the arguments about supply and demand of labour, but I'd have more respect for Facebook if they demonstrated awareness of this issue.

    It is a very politically correct statement which, as usually happens with politically correct statements, really says nothing. I mean, what exactly do you propose they should do about it?

    I am the last person to defend Facebook, but, seriously, they are running a business. That is, essentially, finding a way to do/own/produce some stuff while spending less on it than you receive for it. And cost management is all about supply and demand.

    After all, it's not like Facebook wants to pay engineers $150k/year and for them to be happy. It needs good engineers and the only way to get them is to keep them happy, or else they stop coming. Engineers are not really special, they are not even exactly more necessary than cleaners, and loaders, and content moderators. The only difference is, demand for them is higher than supply.

    And that would be tragic in a sense if everyone were randomly assigned a role: if you draw an "engineer" you are lucky, but if you draw a "content moderator" you are not. Or if we were talking about some factory in some giant village in Vietnam, where no matter how smart you are and no matter what you do, you'll be working a much harder job, in conditions far, far worse than described, for $500/year, and you really don't have any choice.

    But we are not. No, this is Florida, $30k/year, and these are people choosing the job of content moderator over a job as an engineer, either because they don't want to or cannot do otherwise. And while many of them may have stories that can be seen as tragic (because who hasn't, really?), they are no more victims than anyone else born into living a life on the Earth. And thus, hardly entitled to anything.

    Let me be clear: I really hate Facebook and I find companies with a modus operandi like Congizant's distasteful to say the least. And I can see how Cognizant can be held accountable for giving somebody a false hope, luring into doing the job while promising something else. But what exactly do you want for Facebook to do? They want a service as cheap as possible, they get asked a given price, they agree. That's it. It won't save their reputation after an article like this (because, well, public relations...), but it seems just fair.

    Bakary 124 days ago

    >This is a guess - can anyone at Facebook speak to whether engineers are aware of the working conditions of moderators, and agitate to improve their lot?

    If you go work at Facebook as an engineer, chances are you aren't too bothered by ethical questions.

    wrongdonf 124 days ago

    It says right in the article that the content moderators have full medical benefits

      cldellow 124 days ago

      Sorry, yes, I could have expanded on that. You're right, they provide health benefits, in theory.

      My experience with the US health care system is that there is a wide gulf in health benefits. When I worked for Microsoft in the late 2000s (as an employee, not as "dash trash", as contractors were called), I paid some very small premium, like $10-20/month, for a plan with zero deductible, zero copay, and pretty much every doctor in network and covered at 100%. When I dislocated my shoulder, an orthopedic surgeon who used to consult for an NBA team looked at it for me.

      I understand that that was considered quite good insurance and that most people in the USA don't get that level of care, much less at that price.

      If you read Glassdoor reviews of Cognizant, people complain about the insurance offerings available to employees. https://www.homebaseiowa.gov/sites/homebaseiowa.gov/files/do... looks like it might be what they offered 2 years ago. It doesn't look very good to me. If I were earning $15/hour, I think I'd pretty much buy the cheapest plan and only use it if I were actively bleeding to death.

      I'll also link to https://www.theguardian.com/news/2017/may/25/facebook-modera... again - it describes how other companies that moderate distressing content approach this. It includes the company paying for psychological treatment, including after the employee leaves the company.

        pravda 124 days ago

        >I understand that that was considered quite good insurance and that most people in the USA don't get that level of care, much less at that price.

        AFAIK, Microsoft 'self-insures', in the sense that they simply pay the medical bills for their employees.

    pawelmurias 124 days ago

    It's knowledge workers vs unskilled labour.

      McWobbleston 124 days ago

      It's also people not being paid nearly the value they produce for an organization. Everything is always a race to the bottom for us

      eswat 124 days ago

      I was thinking about this more but you’re right. I doubt there’s any acquired knowledge needed to know what’s repugnant to the human mind.

        cldellow 124 days ago

        I don't have first-hand experience with the job, but I've read a few articles about these sorts of teams, as they're widely deployed by all the large Internet companies.

        In my view, it's knowledge work. Sure, it's knowledge work that almost anyone can be trained to do, but there's a skill set required.

        For example, moderators need to make judgment calls that distinguish between offensive uses and artistic/journalistic/parody uses. There are many cultures and in-groups on Earth, which makes this harder than it might seem at first blush. Not all nudity is obscenity, not all hate speech is blatant.

          qudat 124 days ago

          > For example, moderators need to make judgment calls that distinguish between offensive uses and artistic/journalistic/parody uses. There are many cultures and in-groups on Earth, which makes this harder than it might seem at first blush. Not all nudity is obscenity, not all hate speech is blatant.

          In these cases I'm not sure it really matters. Over-moderation is going to win overall and I don't think moderators are going to get into that much trouble for being slightly over-zealous. I could be wrong but I feel like after a week of being on the job you'd get the hang of it.

            baud147258 124 days ago

            From the article (and the others on the subject linked at its end): there are many rules, many exceptions and edge cases, and even mandates for current events (for example, there's been a shooting 15 minutes ago; do you ban or keep the videos?). In addition to judging whether content infringes the rules, the moderator has to select the right rule being broken ("Do I delete it for terrorism? Do I delete it for graphic violence? Do I delete it for supporting a terrorist organisation?" from https://www.irishtimes.com/culture/tv-radio-web/facebook-s-d...).

            Some of the decisions taken by the moderators are then reviewed by a "super"-moderator, and the match between the two decisions (including the rule chosen) is used as a performance indicator, which has to stay above 90% to avoid trouble.
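
            In other words, the audit metric sounds something like this (a toy reconstruction in Python; the data shapes and names are my assumptions, not from the article):

                # Hypothetical audit score: a decision only counts as a
                # match if the action AND the cited rule both agree.
                def agreement_rate(audited) -> float:
                    """audited: list of ((action, rule), (audit_action, audit_rule))."""
                    hits = sum(1 for mod, auditor in audited if mod == auditor)
                    return hits / len(audited)

                # e.g. 9 exact matches out of 10 audited decisions -> 0.9,
                # right at the reported 90% floor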

      skybrian 124 days ago

      It's not "unskilled" so much as (a) it can be learned on the job and (b) people don't see why they should pay more for top talent.

      The second one is pretty important. Consider that sales, sports, and entertainment work similarly: in those fields you can make a case that paying more gets you the best, and workers (or their agents) can use this to good effect.

        svachalek 124 days ago

        I think "unskilled" and "can be learned on the job" are usually considered equivalent. Nearly anything you need to hire a human to do requires some kind of learning, at least to do "well" or "our way".

      isolli 124 days ago

      No, it's market forces. If the world were overflowing with employment opportunities, then unskilled labor would leave unpleasant jobs, and employers would have to pay as much as it takes for someone to put up with the unpleasantness.

        sneak 124 days ago

        US unemployment rates are lower than they have been in more than a decade. There are more jobs than applicants.

        It could be people prefer working at a desk and a screen in an air conditioned office to, say, schlepping boxes in a warehouse.

          eswat 124 days ago

          > It could be people prefer working at a desk and a screen in an air conditioned office to, say, schlepping boxes in a warehouse.

          And judging from the article, agencies hiring content moderators will do even more to paint the work as colourful and career-building, up to and including outright lying about what contractors will actually do day-to-day.

          Of course that’s going to look more attractive at face-value than being pitched a warehouse job by the foreman.

            sneak 124 days ago

            There is no penalty to quitting if the workers are unhappy with the work or feel it was misrepresented to them, and the warehouse jobs are still there waiting.

              jschwartzi 124 days ago

              The penalty is the amount of time you have to spend looking for a new job, and doing it while you're still working your soul-sucking current job.

    nswest23 123 days ago

    For the record, the article states that Cognizant, the contractor FB deals with for content moderation, does provide health benefits:

    > Cognizant also offers a 24/7 hotline, full healthcare benefits, and other wellness programs.

    deegles 124 days ago

    I think all companies with user generated content should require every employee to moderate content for some amount of time. 1 hour per quarter would be plenty.

    nonwifehaver3 124 days ago

    How can kiddie porn, gore and animal cruelty "flood the network" for people using Facebook? Don't you normally just see things from your friends, ads, and groups that you are a part of? If they post that, either call the cops, block them on Facebook, don't hang out with them irl, or leave the group as appropriate. Problem solved? I haven't used Facebook in 5+ years so maybe this is no longer accurate.

      lazyasciiart 124 days ago

      As you may have guessed from the downvotes, this is not even close to accurate. It should have been possible for you to realize this was inaccurate simply from how trivial the problem looked to you in comparison to how much effort is being spent on it.

        nonwifehaver3 124 days ago

        I genuinely don't understand how an average Facebook user would encounter videos like those described in the article. Is it some recommender system gone awry?

          lazyasciiart 124 days ago

          As far as I know, mostly because people post them as comments on public or widely shared items, along with "get free sunglasses" and every other kind of spam. There are tons of these public posts from pages that do news, memes, politics, trolling, regular advertising, whatever.

    p1esk 124 days ago

    How about training a classifier to detect and remove potentially bad stuff, then letting the users who uploaded it argue that the classifier made a mistake, and only in those cases having human moderators look at it?
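
    Something like this, maybe (a toy sketch in Python; the threshold, score source, and queue are all invented for illustration):

        # Hypothetical pipeline: auto-remove high-confidence hits,
        # and only route appealed removals to human moderators.
        from collections import deque

        AUTO_REMOVE = 0.9             # invented confidence cutoff
        human_review_queue = deque()  # appeals land here

        def on_upload(post_id: str, violation_score: float) -> str:
            """violation_score: a classifier's estimate of P(violates policy)."""
            if violation_score >= AUTO_REMOVE:
                return "removed"      # uploader is offered an appeal
            return "published"

        def on_appeal(post_id: str) -> None:
            # a human only ever reviews removals the uploader disputes
            human_review_queue.append(post_id)

    The catch would be volume: at Facebook's scale, even a tiny false-positive rate could produce an enormous appeal queue.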

      Sharlin 124 days ago

      They already have classifiers, of course. It's not like the moderators go through everything posted to FB–you'd need millions of people to do that. What the mods actually do, presumably, is review content reported by users.

        p1esk 124 days ago

        Then perhaps they need better classifiers? I don't remember seeing any papers from FB about that, so it does not seem like a priority for them. Or just make the existing bad classifiers more aggressive and shift the moderators' effort to reviewing good content.

    GoodJokes 124 days ago

    Are you telling me companies exploit labor? Are you telling me meritocracy is a joke? Are you telling me a ton of software devs get paid WAY too much and aren't that "special"? I think we have hit the nail on the headddddd

    meerita 124 days ago

    "If kiddie porn, gore and animal cruelty flooded the network, it would cease to be a destination visited by people that advertisers will pay to reach."

    I bet this can be done with ML and AI instead of human power. Yet FB uses their human cattle to censor conservative voices.

      kartan 124 days ago

      > I bet this can be done with ML and AI instead of human power.

      AI and ML work in bulk, but they misclassify things that a human can easily discern.

      Think Tesla vs. Google. Google wants a fully self-driving car, and that is not working yet. Tesla wants a car that helps you drive, and that is good enough.

      It's orders of magnitude easier to be almost always correct than to be correct all the time. And you only need one kiddie porn video in your Facebook feed to start to worry about Facebook.

        meerita 124 days ago

        It is a matter of time before this happens, even with self-driving cars. Using people to do this is just killing people; think of all the mental effort that has to be spent on censoring. Humans aren't that perfect and cannot oversee all of FB's daily activity.

      mythrowaway1124 124 days ago

      Good luck! There's at least a billion dollars in it for you if you figure out how to do this.

    geggam 124 days ago

    $150k a year in Silicon Valley is similar to $30k a year in the Midwest...

    Where are the moderators living?

    sandworm101 124 days ago

    >> If kiddie porn, gore and animal cruelty flooded the network, it would cease to be a destination visited by people that advertisers will pay to reach.

    Or the opposite. People flock to the internet to see this stuff, the material that isn't on TV or even on dedicated commercial websites. They get to view it in the privacy of their homes, a secret between them and their trusted "friends" on Facebook. Facebook knows that if it really cracks down, if cops bust down doors over this stuff, people will go elsewhere to share horrible material. So Facebook does a poor job of moderation: one just good enough to avoid new regulation, but not so good as to actually make a difference.

    Advertisers care about eyeballs. They may claim to not want to be associated with X or Y content, but in reality they would rather be in front of an audience than not.

anon029102 124 days ago

Guy Rosen and other execs within the Integrity team continually skirt their responsibilities here. They claim they're doing better, but the second-order effects of crappy work conditions and demands keep cropping up. Zuck says one day we will hopefully be able to AI-away this integrity work (especially the most traumatizing), but he does not say a whisper about improving working conditions or pay while the work still needs to be done by humans. And I bet Zuck wouldn't be able to handle the content that these people have to view. Sheryl does not care. She keeps referencing the same standard spiel about how contracting companies have to abide by a strict set of standards, and how they're ahead of the market in terms of pay and wellbeing. But it's still awful. The divide between contractors and full-time workers at Facebook is truly disgusting.

People who work at Facebook should be pushing for change. But they're numb to the spiel. They're cushy and looked after and don't want to create a fuss.

Rosen doesn't care. Zuck doesn't care. Sheryl doesn't care. What DO they care about? Perception. Sit in any high-up integrity meeting and you'll see the only thing they seem to talk about is how "x" would be received by users at scale. There's no comment as to the ethics or corporate responsibility. You can be talking about something pretty out there, like how human rights intersect with takedown decisions, and all you've got is a bunch of people umming-and-ahhing about lossy metrics and how Zuck wants this or that so we'd better hurry up. Or how awesome it'll look on our PSCs if we ship this thing.

Broken company.

    deusofnull 124 days ago

    You're right, Facebook is a broken company. And on that point: we should break it up.

      allthecybers 124 days ago

      I agree and in a more enlightened future I hope we can assess companies like this on their net benefit to society and apply penalties when they act in a way that negatively affects society.

SolaceQuantum 124 days ago

"Conditions at the Phoenix site have not improved significantly since I visited. Last week, some employees were sent home after an infestation of bed bugs was discovered in the office — the second time bed bugs have been found there this year. Employees who contacted me worried that the infestation would spread to their own homes, and said managers told them Cognizant would not pay to clean their homes."

This is utterly nightmarish, given how costly bedbugs are to one's life (clearing out a home, including replacing all mattresses/couches, and bagging or hot-cleaning all clothing, sheets, towels, rugs...).

"A manager saw that she was not feeling well, and brought a trash can to her desk so she could vomit in it. So she did."

This particular manager put their employees in danger of catching illness, especially given what appears to be an open office floorplan where airborne sickness can travel the entire room. I'm shocked and appalled, and that's just from the material I'm comfortable quoting here. The rest of the article has convinced me to help my friends and family get off Facebook, rather than just quietly removing myself from it.

    awakeasleep 124 days ago

    This is industry standard in businesses with limited sick day or leave policies. Especially in call center type environments.

    Once an employee has used their allotment, they receive a write-up if they take time off again, no matter how ill they are. Too many write-ups and you're fired. Managers have no discretion in this process, so they can only mitigate the impact by doing something like bringing the trash can over or buying hospital masks.

    Not saying it’s acceptable! But this isn’t a facebook problem specifically.

      organsnyder 124 days ago

      A local healthcare system has a similar policy: PTO (paid time off) is not differentiated between sick days and other days off. They don't offer any maternity/paternity leave aside from what the US Family and Medical Leave Act (FMLA) requires, which is 12 weeks of unpaid time off. Even worse, they require employees to burn up all of their accrued PTO days before taking any unpaid FMLA time. So you have new parents returning to work sleep-deprived, picking up new viruses from the petri dish that is their kid's new daycare, with no time off. And most of these workers are HEALTHCARE PROVIDERS.

      How stupid are we as a nation that we don't mandate more humane/non-stupid policies, even for people in environments where coming to work unhealthy can often result in death?

        komali2 124 days ago

        > How stupid are we as a nation that we don't mandate more humane/non-stupid policies, even for people in environments where coming to work unhealthy can often result in death?

        This is one of the rare times I believe malice trumps stupidity. Our healthcare industry generates billions and some of that money is being spent lobbying to keep things just as they are.

          b_tterc_p 124 days ago

          I don’t think I believe that healthcare companies lobby to prevent office condition reform from happening

        leetcrew 124 days ago

        > Even worse, they require employees to burn up all of their accrued PTO days before taking any unpaid FMLA time.

        Everything else you mentioned seems pretty bad, but what is so terrible about this? Unless you are trying to keep your PTO buffer high so you can cash out at a higher salary when you leave, it doesn't seem to make much difference if you were going to need the unpaid leave anyway.

          dragonwriter 124 days ago

          > everything else you mentioned seems pretty bad, but what is so terrible about this?

          Because PTO (or even pure sick leave) has less restrictive usage conditions than FMLA, it means that anyone who needs FMLA for a longer-term issue, like a new child, will also exhaust all the leave they have available for short-term personal or family illness or stress relief, and will be unable to use any until they have re-accrued it.

          This is a legal but extremely employee-hostile policy that contributes to bad working conditions for everyone (not just the employees that need FMLA.)

            organsnyder 124 days ago

            Yep. FMLA can't be used for a routine illness—only a "serious health condition"[1]. But it's better for everyone—especially vulnerable populations like hospital patients—for people with less-severe illnesses to stay home. A routine cold can be fatal for a person undergoing chemotherapy.

            [1] https://www.dol.gov/whd/fmla/employeeguide.pdf

brundolf 124 days ago

I think the most pertinent question is: why don't they quit?

It says a great deal about how broken the United States' job market and social safety net are. If minimum wage were $15, they could find another job that paid their basic living expenses. If health care weren't left up to your employer, they wouldn't be out of luck while looking for a different job. If there were any alternative, they wouldn't stay in this hellscape.

They stay because this is the best deal they could find. Think about the kind of society that makes that the case.

    baron_harkonnen 124 days ago

    In general, folks working in in-demand areas of tech are pretty out of touch with how hard it is to "just quit" a job. If you're an engineer or data scientist making $200k+, you very likely have some moderate savings to cushion you between jobs, and will have no trouble finding a new job and getting a slight raise. If you make over $200k, you are getting that much because your skills are in demand and employers need to pay that much.

    If you're making $30k, you certainly don't have any kind of buffer to hold you over between jobs, and you don't have skills that are attractive to employers. Even basic things like interviewing are much harder. An engineer or data scientist can pretty much disappear from their desk for an entire day and no one will ask questions; you don't need time off for an interview, and if you do, you likely have tons of leave. Someone making $30k most likely has to give notice of vacation many weeks in advance, if they have leave benefits at all. And because your skills are less in demand, your interview success rate is going to be much lower, so you'll need even more time to interview.

    It's likely very hard for these people to find new jobs even if they hate their current one; they are probably happy just to have income.

      brundolf 124 days ago

      You clearly only read the first line.

        SomeOldThrow 124 days ago

        ...of what? It seems to integrate well with both the parent comment and the article.

        Given that you are the parent comment, is it so hard to read conversation like this as building on each other rather than an assumption of being relentlessly contrarian? This is needlessly hostile to someone who clearly agrees with you.

          brundolf 124 days ago

          I read it as an attempted counter-argument, but I could have misinterpreted.

    spamizbad 124 days ago

    It also sounds like they are sold on the empty promise that these jobs will get their foot in the door of the lucrative technology industry.

      brundolf 124 days ago

      It seems pretty clear that they stayed after any illusion of that fell away.

        darkpuma 124 days ago

        Maybe it's an illusion, or maybe it isn't. That depends on the individual in question. If somebody's previous work experience is laying tar down on roofs, they may consider the change to an office job to be well worth the added emotional toll.

        Due to the low physicality of the work, there are likely people who consider it easy work, despite the psychological impact that seems obvious to you or me.

          chris11 124 days ago

          I had a really short contract at a major tech company. It was a completely different situation; I felt that I was respected and treated well. But it was a very low-paid/low-skilled job, and I didn't see much room for growth. The biggest benefit to me was that it got me an interview for an internship with that company when I went to school for CS.

          I don't expect these contractors to get much long term benefit from the position. And it sounds like most quit or get fired.

    enibundo 124 days ago

    Maybe they think that flagging shit online is easy and pays well, only to later find themselves at home with anxiety. This kind of thing is never easy to pinpoint.

      brundolf 124 days ago

      Maybe when they apply. But from the sound of things that idea would vanish in the first week.

        Barrin92 124 days ago

        Many do leave, as the turnover rate is extremely high, but with an increasingly underemployed white-collar workforce and rising rent, healthcare, and education costs, there's a pretty huge pool of reserve labour to churn through.

        This is really just a giant meatgrinder for the population outside the "cognitive elite" class, and probably a good sign of things to come for the workplace of the future for half of the population.

    bena 124 days ago

    That is true, but a common theme among many of the stories is how much more they pay compared to other places.

    These people are literally trading dollars for their mental health. That is the choice they are making.

      brundolf 124 days ago

      Except those dollars amount to $28,800/year. Hardly enough to live on in the cities where these offices are, yet more than twice minimum wage ($15 full-time, versus $7.25 at what are often inconsistent hours [Edit: in Florida it's $8.46, but I think the point still stands]).

      They aren't hustling because they want some extra cash. They're trading their mental health for the basic ability to be a living, eating human being.

      jedimastert 124 days ago

      Yeah, $28,000 was barely a livable wage in the incredibly low-cost city I used to live in. I can't even imagine trying to live on it in the cost-of-living hellscape that is Silicon Valley.

        brundolf 124 days ago

        Technically the offices are in Phoenix, Austin, and Tampa. Still, those are major U.S. cities.

    jm4 124 days ago

    I'm not following this logic. They can't quit a job that pays double minimum wage unless minimum wage is increased to the amount they are currently getting paid? If minimum wage were $15, there would be some other crummy employer paying more than minimum wage to incentivize people to take a shitty job. Would they be stuck in that one too because the alternatives pay less?

      baroffoos 124 days ago

        If it were possible to live a normal life on the minimum wage, then many would choose to, instead of destroying their mental health to be able to pay rent.

        jm4 124 days ago

        Choose to do the bare minimum because we change the system to make it comfortable enough? That’s a good idea.

        I’m not necessarily against raising minimum wage. It should probably be reviewed and adjusted more often than it is. But wouldn’t we be better off with a system where people are incentivized to do more than the bare minimum? I don’t think minimum wage should be intended to be the baseline for a comfortable life. It should be high enough that people want to work in the first place rather than take handouts but not so high that it’s an opportunity to stagnate. Frankly, minimum wage is for first time workers and people who have zero marketable job skills or some other issue that prevents them from doing more productive work. Most people can achieve much more with a little work experience.

      brundolf 124 days ago

      They aren't necessarily trying to maximize; they're trying to keep their heads above water. If they had an alternative option for doing that, they would take it.

    zodiac 124 days ago

    > If minimum wage were $15, they could find another job that paid their basic living expenses.

    Why is this necessarily true? Couldn't it be the case that minimum wage were $15 but they still couldn't find any other job?

      brundolf 124 days ago

      Currently we have a surplus of actual jobs; it's just that a large number of them pay sub-living wages. If those were forced to pay more - even if it meant some of them disappeared - many more people would have the opportunity to climb out of poverty.

      Edit: I may have misspoken when I said "surplus". What I was referring to is the very low unemployment rate we currently have, despite large numbers of people - with "jobs" - continuing to live in near-poverty.

        dragonwriter 124 days ago

        > Currently we have a surplus of actual jobs

        No, we don't. An actual job is what happens when the willingness to pay for labor connects with the willingness to provide it for pay.

        What you seem to be referring to (additional willingness to purchase labor at prices below those that people are willing to accept) is not a surplus; it is the normal condition of demand (and a parallel thing exists normally with supply, where there are people willing to provide something beyond what is currently being transacted in the market, but at higher-than-market prices).

        will4274 124 days ago

        Or we'd see a bump in automation and a reduction in the number of jobs available. When you raise the price of something (labor), corporations will reduce their consumption of it.

        smileysteve 124 days ago

        If there is a surplus of actual jobs, then they are being forced to pay more already; but they're not, hence the surplus.

      erikpukinskis 124 days ago

      > > If minimum wage were $15, they could find another job that paid their basic living expenses.

      > Why is this necessarily true?

      It costs money to take a surveying class, or to buy tools to practice your skills. It also takes having money saved, so that if there is a crisis, car repairs, etc., you don't have to constantly work overtime to keep your bills paid.

      Please reach out to a friend who is making less than a “living wage” in their area and ask them about career development and what they need. I appreciate your question.

    dennisgorelik 124 days ago

    > If minimum wage were $15

    ... then low-skilled candidates would NOT be able to find any job at all. If a low-skilled candidate's expertise does not support a $15/hour salary, and employers who could potentially pay less (e.g. $10/hour) are legally prohibited from employing people at that lower rate, then there are no jobs available to that candidate at all. Which means forced and hopeless unemployment.

      chillwaves 124 days ago

      Yes, and if we got rid of the minimum wage just think of all the wonderful job opportunities that would abound! Why, if you only pay people $1/hr, you can employ 7x as many people vs the current min. wage! It's simple economics.

aboru 124 days ago

I am surprised after reading a lot of comments here (not all), that I have not seen any discussion of Cognizant and their role. I am no fan of Facebook and I believe that they have significant responsibility here, but the contractor is, imo, the party directly responsible.

These people do not work for Facebook, and we don't know the nature of the contract in play. Are they paying per person, or a lump sum for some capacity at some accuracy rate? If Cognizant automated all of this, would it be accepted under the contract?

Anyways, I don't want to shift focus away from Facebook so much as to recognize the contracted companies like Cognizant (which is what the whole article is about, by the way, with only some comments referring to Facebook). Accenture and Cognizant really shouldn't escape scrutiny just for being overshadowed by a bigger name.

    johnrbent 124 days ago

    Facebook is the entity creating this work, and is the root of the problem (by contracting a 3rd party to perform work that they very well understand the consequences of). The article is suggesting that Facebook should be held accountable for the detriment (trauma) that their work is causing. I think the article even argues that the contractors aren't equipped to deal with a problem of this magnitude/seriousness. Facebook is in the best position to rectify their moral accounts, but we all know that's not going to happen because they are evil and so forth.

    danso 124 days ago

    It's true that Cognizant has direct power to change things, but ultimately, the buck stops with Facebook, since they seem to be the vast majority of Cognizant's work and thus essentially have direct control of the purse strings. FB has the ability to change how Cognizant treats its workforce, and it's Facebook's choice to take a stand or to wash its hands of it. FB also has indirect say in demanding a certain standard ("98% accuracy target") for a given amount of money ($200M) -- though obviously if FB were to simply pay Cognizant more for the contract, there's no guarantee Cognizant would use that money for better worker pay/benefits (as opposed to giving bigger bonuses to executives, for example).

    In the article, one of the contractors says that Cognizant puts up a "dog-and-pony show" whenever FB executives visit. Again, it's ultimately up to FB to decide how much they want to push past the facade.

      MockObject 124 days ago

      Why wouldn't the buck actually stop with Cognizant management? FB isn't demanding these horrible labor practices, Cognizant is.

        empath75 124 days ago

        They’re paying rates that more or less require it.

          Panini_Jones 124 days ago

          Facebook is? What rate does Facebook pay and how much are these Cognizant employees getting paid?

            chris11 124 days ago

            The assumption is that Facebook is selecting the contracting agencies based on performance or cost. If Cognizant's performance doesn't meet Facebook's standards, they will get dropped; the same thing will happen if Cognizant isn't competitive on price.

            This downward pressure ends up directly impacting moderators. Cognizant needs to keep payroll costs low so they don't lose the contract, and the contracted accuracy target of 98% seems unrealistic. So moderators end up fearing for their jobs when they don't meet accuracy targets.

    hunter23 124 days ago

    Cognizant should be banned from the H-1B program for these types of employee abuses. If they were threatened with something like that, they would actually listen, since that's how they make money. There should be some principle like: "if we believe you are a scummy employer, you can be banned from being granted H-1B visas".

    colpabar 124 days ago

    Cognizant isn't in the headline and unfortunately most people don't actually read things.

    kshacker 124 days ago

    If the same article were about Apple and a subcontractor (not Foxconn), where do you think the limelight would be?

arethuza 124 days ago

Just a warning - I found even a short description of some of the videos they had to watch fairly disturbing.

I don't think I could do that job for very long - let alone in a badly run, high pressure environment with low wages.

    Pigo 124 days ago

    I don't see how an average person could be expected to witness some of the things mentioned in the article. I didn't have time to read the entire article, but do they have counselors on staff or something?

    That one paragraph about organs was enough to ruin my day, and it was just text. I'm surprised such a "rash of videos" wasn't in the news somewhere.

      arethuza 124 days ago

      From the article:

      He sought out the on-site counselor for support, but found him unhelpful.

      “He just flat-out told me: ‘I don’t really know how to help you guys,’”

        colpabar 124 days ago

        So the counselors were probably an afterthought. I'd even bet the decision was made not as a way to keep the moderators from going insane, but to protect the company's image.

          enraged_camel 124 days ago

          I mean, a counselor can only do so much in this type of situation, since the patient is not in a position to remove the negative stimuli from their lives.

      stronglikedan 124 days ago

      Update June 19th, 10:37AM ET: This article has been updated to reflect the fact that a video that purportedly depicted organ harvesting was determined to be false and misleading.

      thom 124 days ago

      I know a very well qualified woman who has previously worked with psychological trauma in the army, amongst other organisations, and who has recently been hired at Facebook in Ireland as part of what sounds like a sizeable team. So it seems Facebook aren't entirely ignorant of their responsibility here, although they do seem massively behind the curve.

      ceejayoz 124 days ago

      > That one paragraph about organs was enough to ruin my day, and it was just text. I'm surprised such a "rash of videos" wasn't in the news somewhere.

      I'm left wondering if it was a clip from a horror movie or something. Shitty to see, but not necessarily newsworthy like "live children chopped up for organs" would be.

        antisemiotic 124 days ago

        No need to be left wondering. While I'm not sure these are the videos mentioned, it's not hard to find some just by googling for it. Or maybe don't, I wish I didn't.

          ceejayoz 124 days ago

          I don't doubt you can find all sorts of horrifying video.

          I feel like I'd have seen media reports of widely-shared videos featuring live, conscious kids being vivisected in order to harvest their organs.

            dudul 124 days ago

            Isn't the point of moderation/censorship to avoid having these videos widely shared? These videos exist; they are fairly easy to find on specialized websites. Some are fake, but a lot are genuine: videos showing executions during the wars in Chechnya or in the Middle East, and such.

            Pigo 124 days ago

            It really seems important to know whether the video is real or a horror movie.

              AlexandrB 124 days ago

              Not from a PTSD point of view. If these workers think it’s real, it’ll have the same psychological effect as the real thing.

          wingerlang 123 days ago

          Looks like the article itself linked the video they mentioned - turns out they were not harvesting any organs in that particular video

      invalidOrTaken 124 days ago

      >I don't see how an average person could be expected to witness some of the things mentioned in the article.

      I'm no fan of Facebook, but in their defense, this is sort of the point.

        Barrin92 124 days ago

        Well, it still is their responsibility. It's like Dr Frankenstein letting the monster loose on the village and then going "hey guys, I sure don't know how to figure this stuff out".

        If the government, the military, academia, the police or anyone else let their inventions loose on the world in the manner the private sector does, and left it to us to figure out how to clean up the mess, we'd call them insane and demand they be shut down within a week.

    corey_moncure 124 days ago

    It makes me aware of a curious dichotomy in content moderation on Facebook. On one hand, they have publicly committed to stemming the flow of "Fake News", whatever that means, on their platform. On the other hand, we have this article about the suffering of contractors whose task is to block content that is too real. I guess this places Facebook's platform into a category with theme parks like Disneyland, where the aim is to maintain a precarious balance somewhere between the fake and the real.

      gerbilly 124 days ago

      Very few people want to see real videos of animals being tortured.

      In fact it's their very realness that makes them especially horrifying.

        corey_moncure 123 days ago

        Of course no one wants to see horrifying things. But current events force us to admit that Facebook is a platform for vast quantities of political speech. And if we're going to entertain political speech, we should be grown-ups and face reality; if we don't allow the horror to inform our discussions, if we turn away, then the horror will only grow.

    doctorRetro 124 days ago

    Thanks for that warning. It's appreciated. I had to take a break in the middle of the article before continuing.

    cynicalreason 123 days ago

    I found it very disturbing... I closed the video within a few seconds of the descriptions starting. Got a knot in my stomach, and I feel very angry... I could not do that job for a day.

    dsfyu404ed 124 days ago

    There's a lot of variance between individuals. Given the choice this sounds like an overall better job than working a busy McDonald's drive through (people are assholes when they're hungry) but I'm not the kind of person that's particularly bothered by graphic violence. My opinion of humanity is dark enough to accommodate it. I'm sure many people would prefer McDonald's though.

    Some of the people interviewed were complaining primarily about the bad working conditions (and their complaints are valid as far as I care). I would wager these people are not as bothered by the content (though I'm sure they don't like it) as the ones whose primary complaints are about the content. They could probably do the job with less burnout if the rest of the job was made to suck less (i.e. it wasn't a shitty call-center-style job with all the accompanying baggage).

    Edit: Why am I getting down-voted? Can people legitimately not fathom that some people would not be seriously bothered by seeing this content? People post violent content to Facebook. Shock and gore sites exist because some people actively seek out (!!) the kind of content that these moderators are being exposed to. It stands to reason that the subset of the population that at least finds that content not mentally damaging is substantially larger than the group that seeks it out.

      bloopernova 124 days ago

      I think perhaps people feel that you are minimizing the horror and repulsiveness of the content by saying that these moderators have a better job than working at a fast food restaurant.

      My wife, who has worked in a busy McDonald's for 5 years, says this sort of moderation is far, far worse than anything she had to deal with. And she's had to deal with human waste, violence, and direct verbal abuse from customers.

      Your point about making the working environment better is a valid one, but I think it is overshadowed by your assertion that the horrific content is acceptable to "many people".

      Your views do resonate with some people on Reddit though. I remember saying that I didn't like the pained yelps, limping, and whining of Dogmeat in Fallout 4, and I was attacked and ridiculed for that. I know I wouldn't last more than a minute at this Facebook moderation job, it would scar me for life. I don't think that means I'm "wrong", any more than your views are "right".

        dsfyu404ed 124 days ago

        >I think perhaps people feel that you are minimizing the horror and repulsiveness of the content by saying that these moderators have a better job than working at a fast food restaurant.

        I think that perhaps those people need to realize that not everyone is as easily rattled as them. Some people go to the Holocaust museum and are mortified by the details of it and are sad for a week. Some people go and are like "yeah, people do terrible things sometimes, this is just the worst systemic instance to date" and then go out for beers afterward. Whenever there's an armed conflict there's some people who are F'd up by what they see and there's some people who say "yeah that sucked and I'm glad it's over" and there's people in between. I think it's pretty evident that the ability to cope with violence varies a lot between individuals.

        >My wife, who has worked in a busy McDonald's for 5 years, says this sort of moderation is far, far worse than anything she had to deal with. And she's had to deal with human waste, violence, and direct verbal abuse from customers.

        I've done that too. I'll take hospital janitor over anything in food service. The general public sucks. The floor doesn't complain when you didn't mop it up just the way it wanted. I'd probably try my hand at a moderating job before I went back to food service. Blood, violence, obscene pornography, etc, etc don't bother me. It's nasty but whatever, some people are terrible so what do you expect. There's other things that bother me but those aren't it.

        > your assertion that the horrific content is acceptable to "many people".

        There's levels of acceptable, and there's a reason these employees are being paid more than those of the call center across the office park. I'm not suggesting that some employees find it acceptable in the abstract. I'm saying they are not so easily mentally harmed by it as to consider it not worth the pay.

        I see it no different than a physically fit 20yo who finds a ditch digging job to be worth his while because he can handle it whereas the 50yo almost certainly cannot handle it without much greater negative health consequences. If you can hack it for the pay then why not.

        >I don't think that means I'm "wrong", any more than your views are "right".

        You're not asserting that nobody can do this job, and I'm asserting that there exist people who can do this job (or at least people for whom the "shitty call center job" conditions are the primary irritant preventing them from doing it). That said, the down-vote-to-reply ratio makes me suspect that many people simply do not believe that some other people do not think like them.

      gerbilly 124 days ago

      The fact that these kinds of shock videos are even a thing kind of disproves your point.¹

      There are also people who are habituated to cutting into living bodies (surgeons), but the difference is that they are highly paid.

      It's a bit unfair to equivocate and call this kind of work a good option for some people, when the vast majority of people in these jobs will probably be psychologically harmed by it² and are doing it because they have few other options.

      It is relegated to an underclass, like all dangerous and undesirable work.

      1: Statistically at least. 2: See also: https://www.researchgate.net/publication/228141419_A_Slaught...

arethuza 124 days ago

Maybe Facebook should make all of their own employees do 15 minutes of moderation per day - just to share the pain out a bit....

    asark 124 days ago

    Make the users do a little every so often. Like Slashdot's "hey, go moderate some posts" thing but forced to before they can use the platform anymore. That'd be fun.

    Of course the real answer is that this sort of site is a bad idea and shouldn't exist. Wide-open signup and public visibility of content, or ease of sharing it with strangers. Bad combo, don't care how much money it's making them (and other, similar sites).

      commandlinefan 124 days ago

      > Make the users do a little every so often.

      Oh, dear god, no - have you ever used Reddit? This is what happens when you outsource moderation to the sorts of users who enjoy moderating other people.

        nvrspyx 124 days ago

        Have you ever used Facebook? It's much worse, and this is specifically in reference to media that are against Facebook's guidelines (e.g. nudity, gore, etc.), not all content as a whole. Users doing some moderating in this context would simply mean presenting flagged media randomly to users and asking if it breaks the guidelines.

        Very different things. Subreddits would be the equivalent of Facebook Pages, and those already have moderation tools for whoever runs the page.

        saagarjha 124 days ago

        Reddit is a bit different than what was suggested here, since moderators are not a rotating position there and this encourages power trips.

        baroffoos 124 days ago

        Stack Overflow has the users do the moderation, and it works well; maybe not perfectly, but it's not bad.

      acdha 124 days ago

      There's not really a shortcut around having employees do it. Facebook has a significant amount of private content which users will not tolerate being shared; standards vary widely (imagine a vegan flagging every picture with visible leather, or every post about eating meat, as obscene — if you're requiring moderation duty, do you just disable their account once you notice this?); and, especially, the platform is the target of coordinated group activity, so errors won't be uncorrelated: get a cycle of outrage politics going and you'll have thousands of people around the world, with no easily observed connection to each other, flagging things as either objectionable or safe.

        Fjolsvith 124 days ago

        And yet Facebook lawyers recently argued to a judge that there is no expectation of privacy on Facebook.

          acdha 124 days ago

          I’m not defending them but there is a distinction between Facebook using your private data for mining and displaying it to random strangers. Any sort of user-driven moderation policy would need to have a way to handle that or people would stop using the service.

            Fjolsvith 124 days ago

            Isn't everything you post on FB available for anyone to browse?

        saagarjha 124 days ago

        You could have a system where multiple people flagging certain content would be a signal to give it manual review.
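
        For instance (a toy sketch in Python; the threshold and the dedup-by-user detail are invented):

            # Invented illustration: escalate a post to human review only
            # once enough distinct users have flagged it.
            from collections import deque

            FLAG_THRESHOLD = 5                # made-up number
            flags: dict[str, set[str]] = {}   # post_id -> users who flagged it
            review_queue: deque = deque()

            def on_flag(post_id: str, user_id: str) -> None:
                flags.setdefault(post_id, set()).add(user_id)
                if len(flags[post_id]) == FLAG_THRESHOLD:
                    review_queue.append(post_id)  # exactly one escalation per post

        Deduplicating by user at least blunts a single account spam-flagging an item, though the kind of coordinated flagging mentioned elsewhere in the thread would still clear the threshold.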

          otakucode 124 days ago

          I've always expected that a system which could work, albeit with flaws, would be to ratchet up censorship for any individual user who flags content. The more they flag, the less they see. The only 'problem' is that this would leave users fearing that there are others who have not flagged the same content objectionable that they have, and are viewing that content without being actively prevented. They'd be protected themselves, but it would be easy to convince them (if any convincing is even necessary) that 'almost everyone else' is consuming the raw obscene feed and being turned into monsters by it.

          jandrese 124 days ago

          This already exists on Facebook.

      jandrese 124 days ago

      Just what I want my grandmother to see when she logs into Facebook once a week to check on the grandkids.

      ------------------------------------------------------------

      Hello user, you have been randomly selected to help moderate.

      Does this depict pedophilia? Y/N

      [ Image here ]

      This image was flagged by other users as depicting horrible gore and death, do you agree Y/N?

      [ Image here ]

      This post may contain hate speech, do you agree? Y/N

      [ Embedded Nazi Screed post ]

      Thank you for making Facebook a safer place for everybody!

      ------------------------------------------------------------

      calgoo 124 days ago

      Make it a captcha test: select all the pictures of animal cruelty. /s

        megous 124 days ago

        Wow, captcha from hell.

      ceejayoz 124 days ago

      > Make the users do a little every so often.

      "Facebook made me look at child porn" is probably a headline Facebook would prefer not to have going around.

      arethuza 124 days ago

      Personally, I wouldn't go that far - just that I would think anyone doing this kind of job would need a lot of training, active support and monitoring and probably periodic breaks to do something else and decompress. Probably long term health care coverage as well.

      Would that be expensive at the scale required by Facebook? Very.

      jklinger410 124 days ago

      Only if they dole out libra-bux

        worble 124 days ago

        >libra-bux

        Personally I'm fond of zucc-bux

          asark 124 days ago

          Seen on here the other day (can't remember the poster to credit them): "Douche Mark". A play on Deutsche Mark, for those who are either not European or not old enough to remember that currency, as seemed to be the case with some respondents to the original post.

      fencepost 124 days ago

      > Slashdot's "hey, go moderate some posts" thing but forced to before they can use the platform anymore. That'd be fun.

      Overall, not possible. On Slashdot everything is basically public; on Facebook it's largely private/restricted, and I'm sure from past discussions that the worst stuff is in private groups or clusters of friends. They could do much more aggressive banning of users (and detecting of new accounts designed to circumvent bans) for sharing such content or being in groups focused on such content, but that might hurt their most important metrics.

      TallGuyShort 124 days ago

      I agree with your point about the site being a bad idea, so maybe your first point was tongue-in-cheek. But displaying reported child pornography to people outside of your controlled environment? Not a great idea...

        asark 124 days ago

        Just pricing in an externality. If that's a disgusting and shocking way to deal with this, then it doesn't follow to me that paying a little money to folks to do it makes it anywhere near OK.

        So yeah, it was tongue in cheek, with a side of reductio.

          TallGuyShort 124 days ago

          No, the concern with distributing likely child pornography and snuff films to users is that the whole point of moderation is to STOP distributing that. You want it moderated in as controlled an environment as possible. I'm all for not subjecting more people to it than necessary, but it should at least be people who explicitly consent and are trained and provided with the necessary resources.

            asark 124 days ago

            Don't have to subject anyone to it. Don't have Facebook.

              TallGuyShort 124 days ago

              I have never distributed child pornography or snuff films. I'm already not subjecting anyone to it. Communications channels exist. Unless you want to undo that, or enable this behavior, you have to deal with it somehow.

                asark 124 days ago

                We don't have to have huge, centralized communication sites where anyone can sign up and post stuff and the operator's shielded from liability for what's posted so long as they subject a bunch of humans to it in order to filter it out, even if there's tons and tons of it posted constantly. We only have to have that in order to keep supporting the current ad-tech business models, which isn't a given.

                I'm not saying if Sally starts a forum and it gets hacked and someone posts illegal stuff she should be fined or charged with anything. I'm not saying she should even be in trouble if she gives one person who turns out to be a jerkass posting rights and that person posts illegal stuff. But if she opens up public signups, gets a billion or two users, then says "well I guess I'll just have to pay a bunch of people money to look at all this gore porn that's getting posted" the rest of us should go "uh, no, you should instead just stop".

                  TallGuyShort 123 days ago

                  That's a very arbitrary line that's really just based on you making a judgment call for things you like or against things you don't like. You'd probably do well in American politics.

                    asark 123 days ago

                    I don't like subjecting a bunch of people to horrible images on an industrial scale, no. And I don't think "it's making us money and we can't think of a better way" is an excuse to keep it up.

                    [EDIT] and given that second sentence, no, I'd get nowhere in American politics.

                      TallGuyShort 123 days ago

                      So your solution here is that instead of all these people voluntarily seeking other jobs, their department shouldn't have a reason to exist; ergo, shut down Facebook, and now all these people and more are involuntarily unemployed. Riight.

    rorykoehler 124 days ago

    All the devs would leave

      arethuza 124 days ago

      That's probably true - I wonder whether it would be because of the affront of having to do such menial tasks or the horror of realising that operating a global social network requires somebody to trawl through the filth and risk their mental health?

        sillyquiet 124 days ago

          No, it's more like the devs have the necessary marketplace leverage and fallback to either quash that notion or move on to another, less ridiculous job. Don't think for a minute devs wouldn't be treated the same (and in some places ARE) if the corporate C-levels could get away with it without bleeding talent that is expensive to re-acquire. I am not, by any means, a Workers-of-the-World-Unite kind of person, but group negotiation and *gasp* unionization is becoming more and more necessary - if ONLY to get the beancounters to consider labor in their soulless cost calculations when making decisions that lead to the conditions in the article.

          malaxii 124 days ago

          Historically, unions are a tried and true way for improving working conditions and average compensation. Of course, that's also the reason why they are so vehemently opposed...

            sillyquiet 124 days ago

            Very true, but in honesty, they can cause almost as much or more trouble than they solve when they grow so big and bureaucratic as to no longer represent, or even actively harm, the workers they purport to support. It's these cases that a lot of union detractors latch onto in their arguments against unionization, with some degree of fairness. I speak as the son, grandson, and great-grandson of people who were VERY active in their respective unions. In particular, my great-grandfather helped to unionize electrical workers in refineries in Texas, and my father was a union steward. They are full of stories of the headaches the union bureaucracy caused.

            That being said - those cases are really rare and in terms of harm, the naked calculating exploitation that corporations flirt with is WAY worse imo than the harm a too-big-for-its-britches union causes.

        Nasrudith 124 days ago

        It is because it is a waste of their time and a bad approach, and everyone knows it on some fundamental level - however nice the egalitarian ideal would be in other ways.

        It would be like expecting doctors to clean bedpans. It is a necessary task, and those who do it should receive appropriate respect, even if that is just "Thanks, glad I don't have to do it." But anyone off the street willing to do the dirty job could do it without a doctor's training, which takes over half a decade just to be /entry level/. Plus it is incredibly inefficient - effectively paying, say, $100/hr for janitorial work when they could be saving lives instead.

        Now, asking them to do it in a necessary and justifiable situation (say, a posting in an aid camp cut off by weather, or in space, where sending any extra bodies is expensive) is one thing, and they would be in the wrong to refuse out of pride.

        Absent that necessity, it shows a poor sense of their actual value - at least until medical degrees and skills become just as common as the skills to clean bedpans.

          cldellow 124 days ago

          Yes, I should perhaps clarify - I mentioned developers to point out that Facebook has the resources to provide excellent working conditions.

          If I can adapt (and perhaps torture) your analogy, I'd say it's like Facebook currently has doctors who save lives and command high salaries, and janitors who change bedpans but don't have access to hot water and latex gloves. So inevitably, the janitors catch a disease (hospitals are filthy, after all! This is foreseeable!) and are no longer able to work, at which point they are replaced.

          Given that the hospital can afford to pay the doctors, we might ask if they could splash out for latex gloves and soap for the janitors, too.

            matz1 124 days ago

            The difference is that there are plenty of people who are not negatively affected by that "horrible" content. Me, for example - if not for the low salary, I would do it.

            A content moderation job is not for everyone.

            geodel 124 days ago

            Your example seems useless. There would be minimum working-condition requirements enforced by the labor department, so it would be illegal for janitors to be forced to work in filthy conditions without the necessary gear. Similarly, FB would be fulfilling the working conditions the labor department sets for desk workers.

            People are essentially asking if FB can increase salary/perks by x amount because internet commenters are somehow not feeling good about the current scenario.

              cldellow 124 days ago

              > There would be minimum working condition requirement enforced by labor department.

              Yeah, this is basically what I'm saying. :) Labour codes are developed over time. In the beginning, it's the wild west. Want to be a janitor and not use any personal protective equipment? Go right ahead! After workers keep getting sick, the government begins to legislate requirements around health and safety.

              Awareness of mental illness is a relatively new thing. We'll probably see some developments in this area if the big tech companies continue to outsource moderation at scale. https://www.theguardian.com/news/2017/may/25/facebook-modera... is a nice article that describes accommodations that other companies provide to moderators. They include, as an example, monthly psychologist visits which continue after the person stops working for the organization. They also include training for the person's family and social support groups.

        emiliobumachar 124 days ago

        Executives would also not put up with it - obviously, to the point that I'm not sure whether you meant to include them or not when you said "employees".

        Not wanting drudge work is common to nearly all people who have options. Why should devs get all the hate?

      mannykannot 124 days ago

      ...but there must be a downside, no?

      (stolen from Scott Adams.)

      eswat 124 days ago

      Probably not the ones—if any—that lurk 4chan and other extreme imageboards, having been desensitized to such content or probably even get something off of it.

      I actually wonder now how much of their workforce would fit in that demographic.

        hi5eyes 124 days ago

        anyone that's been lurking/posting on chans for a long time also knows neurotypicals couldn't deal with the content

        just because some people become desensitized doesn't mean the obscene content won't damage the mental health of the entire set of people

        jerkstate 124 days ago

        I was just thinking that. 4chan janitors are volunteers. Of course, the payoff is helping 4chan exist, not helping Facebook make a few billion more, so the incentive is still vastly different. Still, that might be the right recruitment pool.

          egypturnash 124 days ago

          I don’t think so. They’d just beat off to the orphan vivisection videos all day long.

            skyyler 124 days ago

            There is a lot more to that website than /b/. The janitors on /out/ do a very good job of keeping the board clean.

      wu_tang_chris 124 days ago

      that would be great, then they could just shut the whole thing down and the world would instantly improve by 10%

    H8crilA 124 days ago

    Have you ever worked with mechanical turk or anything of this sort? You pretty much have to do at least 15 minutes of manual moderation per day. Training "human workers", setting up rating templates, finding human mental "bugs", i.e. things that raters/moderators get consistently wrong - it takes quite a lot of time.

    Granted, not everyone does that; someone nibbling at the frontend or sitting deep in some distributed DB generally does not care much about data quality.
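
    To make the human "bugs" part concrete: a common approach is to seed the rating queue with gold-labeled items and look for confusions that recur per rater. A minimal sketch, with all names hypothetical:

        from collections import Counter, defaultdict

        def rater_error_patterns(gold, judgments):
            """gold: item_id -> correct label.
            judgments: iterable of (rater, item_id, label) triples.
            Returns rater -> Counter of (correct, given) confusions."""
            confusions = defaultdict(Counter)
            for rater, item_id, label in judgments:
                if item_id in gold and label != gold[item_id]:
                    confusions[rater][(gold[item_id], label)] += 1
            return confusions

        # A rater who keeps marking "graphic_violence" as "ok" surfaces as
        # {("graphic_violence", "ok"): n} -- a consistent "bug" to retrain on.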

    mikeash 124 days ago

    Make the executives do this.

      rock_hard 124 days ago

      It’s common practice for them to do so. Not just at Facebook, but YouTube and other platforms.

      It’s management 101

        H8crilA 124 days ago

        Oversight over key quality metrics (including understanding what exactly raters/moderators do) is a VP or even SVP level job. It's critical if your entire org's goal is to deliver ever improving data quality at reasonable latency and reasonable reliability (example: Google Search).

        Not to mention that, in data driven orgs, the launch/no launch debates are focused on those metrics. So if you want to call the shots you kind of need an in-depth understanding.

          Apocryphon 124 days ago

          You and the GP claim this, yet if these execs have truly been exposed to the worst that gets posted to their platforms, why has no appreciable change been made? Unless they're so hardened to it that they can't empathize with the horrors they're asking people to face.

            H8crilA 124 days ago

            There's no alternative. The only other option (for Facebook and content moderation) is to shut down the shop & return equity to investors, or do nothing and wait for the platform to blow up. And even then someone will make Facebook v2.

            Just the economic incentive to cut moderators loose and use algorithms would be enough to do so if it were possible. After all, you can buy a decent amount of compute power for $30k+/yr, and that will certainly deliver more than the 100 or so QPD (queries per day) you get from a content moderator.

            PS. I'm sure that most of the work is being done by algorithms anyhow.

              Apocryphon 124 days ago

              That’s all or nothing thinking. The immediate alternative is to improve working conditions for moderators. Or impose more stringent bans and punishments on those who post that sort of material. Those are just for starters.

                H8crilA 124 days ago

                Look I wish everyone had a nice life, I really do, but markets set salaries while governments sometimes intervene (e.g. by setting a minimum wage).

                Don't know about the banning situation, that'd require insider Facebook knowledge. I'm sure they ban some people.

                  mikeash 124 days ago

                  The Market Police don’t haul you off if you pay something different than the “market wage.” Markets set a floor on wages by making it so you won’t get enough workers if you pay less. But companies can easily pay more than the floor if they want to. Obviously they don’t want to, because they love money, but it’s ridiculous to frame this as some iron law of markets.

    zhte415 124 days ago

    This makes complete sense. Why pigeonhole employees into, well, pigeonholes? Technology, Operations, Marketing, and more, each understanding more about the others' roles, challenges, and solutions. A knowledge organisation is based on knowledge. I'd favour letting each person choose a proportion - 5-50% - of their time spent outside their core role.

    [Clarity: Am not a Facebook employee.]

      TallGuyShort 124 days ago

      I suspect it would come with a "manager's discretion" clause that would ultimately effectively eliminate it, like 20% time at Google.

r3vrse 124 days ago

Dipping into whimsical analogies: this is a digital abattoir where the meat = content.

Now, as before, no-one wants to see how the sausage gets made. Especially those selling it.

Can't kill demand or bear the visceral truth. So instead we'll pretend the seedy underbelly doesn't exist. Paper over dissonance with ethical codes and platitudes.

Not new. Just a context shift in production of sustenance for the collective, insatiable gaping maw.

    Cthulhu_ 124 days ago

    Yup; bear in mind that pretty much ALL major user-generated content websites have to deal with this - think Google (and I now wonder if G+, Picasa, etc were shut down because they couldn't handle the inappropriate content anymore?), Dropbox (who may do a passive one where they only investigate if reported by law enforcement), Youtube (also Google), Discord, Slack, etc. It's a problem everywhere.

    I also believe this is one of the places where Facebook's real-name policy comes in - it discourages people from posting the worst of it. Animal brutality could just be kids fooling around on their phone, accidental, but produced child pornography is not, and the people making that shit know very well that they shouldn't put it on sites that require them to use their real names.

    blablabla123 124 days ago

    Every web community has to deal with this; some types of communities are more prone to it than others. On HN this obviously works very well, and other 90s-style web forums don't suffer these problems either, but that's probably because they usually have a very narrow scope and attract only a certain kind of people.

    Facebook is really broad, and so are online news magazines, which are sometimes full of horrible comments (only text, of course). Usenet more or less ceased to exist because of this. So in reality this is not a Facebook problem but an online-community problem - it just looks like a FB problem because FB is the major online community.

    I wonder if this could be solved with a 3rd-party Facebook integration/startup :-). Anyway, Facebook has more problems in this vein, another being undesired, unsolicited contacts, which fall into the same category of not-so-nice stuff about Facebook. People don't see the meat production, but the bad smell is really close, which is why many leave.

dalore 124 days ago

If they made engineers take turns working in content moderation, you would soon see all sorts of improvements (like the aforementioned ability to recognize duplicate content, for starters).

They would hate it so much that they would make better tooling. But now they don't have to know about it - they just send images to the moderation team over and over, like the moderators are robots.

    joeblubaugh 124 days ago

    I knew an engineer who worked on news moderation tools - the stress was so high that he had a breakdown and had to quit.

    I think if FB engineers had to actually interact with the bottom-bucket of the content on a regular basis, Facebook would have stricter rules, and perhaps even prior-restraint filters for image and video uploads.

    Nasrudith 124 days ago

    Couldn't it be done in a more pleasant and efficient way by simply tasking them with building moderation-assistance tools, instead of this misplaced retribution?

    While putting themselves in the moderators' shoes may prove helpful, the two groups aren't comparable in skillsets. Give most mods a command-line interface, for instance, and they would be puzzled. "Produced for self" and "produced for content moderators" are differing needs.

      oarsinsync 124 days ago

      Dogfooding isn't really retribution, so much as a reasonably effective way to better understand and capture requirements.

      Using the things you produce generally helps motivate ironing out bugs and/or improving the product overall.

        michaelt 124 days ago

        People in this discussion are misunderstanding one another because no-one's making a clear distinction between "Engineers/managers responsible for content moderation and with the power to improve the tools should have to do some content moderation themselves" and "All engineers/managers should do content moderation"

        Andy the content moderation tools developer could certainly benefit from experience using the tools he's working on.

        On the other hand Bob the Unix admin who keeps the CI cluster online can't iron out bugs in another team's product no matter how hard you motivate him.

        Needless to say, if someone writes about Andy but a reader thinks about Bob, the gap in whether the explanation makes sense might leave the reader sceptical about the writer's claims, or even their true motivation.

    tomp 124 days ago

    The engineers would just make the filtering AI better.

    End result: engineers paid $250k, moderators unemployed making $0k.

james_pm 124 days ago

There were many reasons that led to me deleting my Facebook account in May, 2018. The fact that the platform is a magnet for the absolute worst of humanity and then employs and exploits people in such an inhumane and cavalier way to filter the garbage out was high on the list. Get Zuckerberg to do the job for a few hours and see what he thinks.

    lallysingh 124 days ago

    I find that I haven't missed anything outside of people I haven't actually talked to in 10-20 years. Which, frankly, is easier. It's nice to let my social past stay in the past.

Verdex 124 days ago

So ... we all need to start flagging beautiful nature scenes, humorous comics, lists of health tips, and job postings for less stressful jobs that you could do if you're already qualified for being a facebook moderator?

At least that way they get a bit of a break from all the other horrible stuff.

    panic 124 days ago

    This is a great idea if you could get people to do it at scale -- it raises awareness, improves the lives of the moderators, and forces Facebook to deal with the increased volume of reports all at the same time.

ljm 124 days ago

I think this is also a symptom of the US’ shocking attitude to workers’ rights.

The article says staff were constantly reminded of how easily replaced they were, which is a euphemism for “you’re lucky we gave you a job.”

Facebook and Cognizant are majorly dropping the ball, but US government and legislation strongly enable that. As do individual states.

From a European perspective this is a sad article to read about working conditions in “the greatest nation in the world.”

neuro 124 days ago

Social Media Content Moderation Team Lead

Cognizant Technology Solutions

Tampa, Florida

This is an exempt position, requiring day, evening, weekend, and holiday shifts, as this delivery center is operational 24/7, 365 days a year.

Cognizant is seeking a team of strong Team Leads to manage a team of social media content moderators for a global social media organization.

The Team Lead will be responsible not only for managing day to day operations of the team, people management, performance management, but also help the client determine gaps in processes, identifying innovative ways to solve problems upstream and scale our operations.

Ideal candidates will be comfortable understanding social media, have an appetite for research and gathering data insights, a high level of comfort working with cross-functional partners, and a strong analytical mindset. Successful team members have a passion for business success, strong attention to detail, analytical problem-solving abilities keeping a high level of team motivation and keen eyes for operational inefficiencies.

Responsibilities:

• Provide mentorship, guidance and career development to members of your team

• Lead a high-performing team through an exciting transition to build problem solving, critical thinking, analytical and technical capabilities which will enable the department to develop deeper, more scalable solutions

• Team management responsibilities for a market team, whilst also serving as a cross-functional and a global liaison in developed areas of expertise

• Establish team goals and work with direct reports on strategies for executing, measuring progress and sharing results

• Deliver projects involving quantitative analysis, industry research, and strategy development, working directly with global cross-functional teams to problem solve analytical approaches and develop solutions

• Identify actionable insights, suggest recommendations, and influence team strategy through effective communication

• Advocate for users within their market, partnering with global and cross-functional teams to develop global solutions


bpyne 124 days ago

From the article, we don't know how much a moderator "can take" daily. Perhaps studies haven't been done. Perhaps it's too complicated a subject for good tests. But they could check with organizations that already have experience with employees who must watch horrific crimes.

The chief information security officer in my organization came from our state police, where he headed the internet crimes against children division. People on his team had to watch videos like the ones described in the article. Even though the team was made up entirely of hardened officers, team members regularly cried at their desks while watching videos. His team members had to be taken off duty for weeks every six months to recover. They had mandatory counseling even when off duty.

I would think FB and other large companies could look at the measures taken by police forces with similarly disturbing jobs as a guideline for the contractors.

arzeth 124 days ago

I have a hotkey in i3wm:

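# i3 config: toggle full-screen color inversion (xcalib alters the current gamma ramp; run again to restore)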
bindsym $mod+backslash exec "xcalib -i -a"

which inverts all colors (btw, I have to `killall redshift` first because of a bug). When colors are inverted, I feel almost undisturbed by any gruesome content (to me it's like seeing screenshots of Quake 2 with buggy GPU drivers), yet I can still easily recognise whether the content is disturbing. And when it's a video, I play it at ≥2x speed so that it doesn't feel realistic to my brain.

I wonder whether those moderators use these two lifehacks.

    akerro 124 days ago

    They most likely can't even change the wallpaper on their Windows XP...

    theNJR 124 days ago

    What are you doing that requires this so frequently?

    rock_hard 124 days ago

    There was an article a couple of months ago that explained that this sort of thing is already built into the moderation tools.

    bamboozled 124 days ago

    It seems like moderating audio is also part of the job and equally disturbing as the video.

    Not sure how you get around that.

      knd775 124 days ago

      Chipmunk voices? Might make it feel less real

deogeo 124 days ago

Yet another case where NDAs are abused to cover up corporate misdeeds. I think their enforceability should be severely restricted, with penalties if a lawyer includes them in a contract despite knowing they are invalid, so that invalid terms can't be used to scare workers who aren't familiar with the law.

spunker540 124 days ago

“But had his managers asked, they would have learned that Speagle had a history of anxiety and depression“

Should employers really be asking about mental health history during the hiring process?

    mpclark 124 days ago

    Of course, if the job involves risks to mental health!

      cloakandswagger 124 days ago

        Cue up the next Verge article in 6 months: "Facebook discriminates on the basis of mental health"

        SomeOldThrow 124 days ago

        I mean they really should be in this case. This is not a normal job.

          bena 124 days ago

          The real solution is that they present the reality of the job and let the candidate make the decision to proceed or not.

          That way the company presented the job as is. And if the candidate takes the job, they own some of the responsibility for the consequences.

    mhh__ 124 days ago

    > Should employers really be asking about mental health history during the hiring process?

    There should be some disclosure of potential mental-health risks in the job description, no? I.e., screening for this at a later stage of the hiring process would be reasonable. (No idea on the legality, however.)

    noisy_boy 124 days ago

    Maybe they should have checked his Facebook history.

    My decision to delete my account from this toxic spider web of a platform is affirmed regularly by the frequent revelations of their immoral behavior.

comboy 124 days ago

> But as the weeks went on, the video continued to reappear in his queue (..) They kept reposting it again and again and again.

Seems like it should be pretty "easy" to spot similar videos and audio (even after some modifications), given what a great dataset all those moderators are providing.

I'm pretty sure there are some pretty smart folks working at FB, so this would mean that, given their accuracy standards, it's still cheaper for them to hire humans to do the job.

    Buttons840 124 days ago

    That part of the article says they allowed the video of animal abuse to remain visible so that law enforcement could do something about it. Of course, that's bullshit for many reasons.

    Ban the user and the video; if the same video is posted again, autoban it. If the user is law enforcement, allow them to see the details of all related content that was banned.

    There are neat things called if-statements that could do that.
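
    A minimal sketch of that logic, assuming the video is identified by an exact content hash (every name here is hypothetical, and a re-encoded repost would evade an exact hash - that's the fingerprinting problem discussed elsewhere in this thread):

        import hashlib

        banned_hashes = set()  # digests of videos a human moderator banned
        banned_users = set()
        ban_records = {}       # digest -> upload metadata kept for investigators

        def ban_video(user_id, video_bytes):
            """First, human-decided ban of a video."""
            digest = hashlib.sha256(video_bytes).hexdigest()
            banned_hashes.add(digest)
            banned_users.add(user_id)
            ban_records[digest] = [{"uploader": user_id}]

        def moderate_upload(user_id, video_bytes):
            """The 'if-statement': reposts of a banned video never go live."""
            digest = hashlib.sha256(video_bytes).hexdigest()
            if digest in banned_hashes:
                banned_users.add(user_id)                         # autoban reposter
                ban_records[digest].append({"uploader": user_id}) # keep evidence trail
                return "rejected"
            return "queued_for_human_review"

        def law_enforcement_view(digest):
            """Banned content stays queryable for investigators only."""
            return ban_records.get(digest, [])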

      Cthulhu_ 124 days ago

      Yeah, I mean Facebook can open up a portal for all local and national police precincts and law enforcement that goes "This is the content that our systems and content moderators marked as illegal, this is the exact location, date, time it was recorded, our facial recognition software identified these and these individuals", etc.

      However, I just also described how Facebook could easily be used in a dystopian police state fashion. I am fairly sure they aren't allowed to proactively report crimes. And that law enforcement would be overwhelmed by the sheer amount of reports coming in all at once.

        Buttons840 124 days ago

        Allegedly they are already altering their behavior to assist law enforcement. I'm not suggesting they put a lot of effort into helping, but being able to remove a video from the general public without stopping law enforcement investigations seems reasonable.

    dillonmckay 124 days ago

    No, that is only to protect IP rights.

      comboy 124 days ago

      I don't understand, can you elaborate? What difference does it make if a duplicate is spotted by a human vs by a machine?

        cldellow 124 days ago

        If Facebook repeatedly permits copyright infringement on the same video, they might get sued by the rightsholder, which might be a very wealthy company with experience enforcing its rights in courts.

        If they repeatedly make a $30,000/year employee view 15 seconds of the same child being beaten to a pulp (...the minimum time they are obligated to watch each clip before rendering a judgment), the employee will just quit after their health declines past a certain point.

        So while the technology could be applied to both contexts, there's probably a lot more economic incentive to apply it to the first context than the second one.

          thecatspaw 124 days ago

          Wouldn't it be more economically viable to use it for moderation too? What is the point of making an employee watch the same video again when they could instead moderate another video?

            cldellow 124 days ago

            I don't know! It's conjecture in the first place. I assume _some_ ML is used - presumably (hopefully!) not all stuff that goes to moderators is because of a human flag, some of it is an ML model.

            I can imagine, though, that there may be odd asymmetries that would encourage dumping the problem on to a contractor work force.

            - The contractor work force is a fixed cost. Even if it seems high (the article quotes $100,000,000/year), this is (a) a known number you can budget around and (b) perhaps cheaper than spinning up a multi-disciplinary team to adapt ML models to solve the problem.

            - The detection-for-copyright case may have more people co-operating. Rightsholders have a strong economic incentive to directly work with you to provide you with fingerprints of their copyrighted material that you can use to train a model. Trolls and bad actors aren't going to give you such a heads up, so you'll still need a contractor work force for the foreseeable future.

            - This stuff is toxic. You can't underline that enough. Kiddie porn, gore, animal cruelty, conspiracy theories. Outsourcing it serves at least two purposes: you can blame the contractors for any problems and you keep it far from your core employees who are expensive to hire, train and retain, who may start to ask tough questions about the unwitting-if-not-unpredictable role Facebook plays in distributing this content to people.

        akerro 124 days ago

        Machine learning for recognizing duplicates is only for "protection of IP rights", not for protection of low-wage workers. Apparently somebody at Facebook decided that ML is more expensive than a low-wage human worker rejecting the same material over and over. Lawyers are more expensive and bring more bad media coverage than someone's suicide in a 3rd-world country.

          asdff 124 days ago

          >3rd world country

          These workers are in the U.S. FYI

            akerro 124 days ago

            Ah right, last news I remember from a few years ago was that facebook had a child-company in Thailand that was hiring 1000s of moderators locally.

the_duke 124 days ago

Couldn't violence (both towards animals and humans) be detected in an automated fashion?

Especially screams in the audio should be fairly easy to find.

Then block those videos by default, with a manual appeals process that sends them to a moderator, combined with a big warning that submitting videos against the TOS will get you suspended. Of course this could lead to people submitting videos without audio, but this will always be a cat-and-mouse game.

Or is FB under legal duty to review potentially criminal content?
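
To be concrete, here is a minimal sketch of the block-by-default flow proposed above; the classifier score, threshold, and queue are hypothetical stand-ins, not anything FB is known to run:

    review_queue = []   # appeals that a human moderator will eventually review

    def submit_video(video, violence_score):
        """violence_score: output of some hypothetical audio/video detector."""
        if violence_score >= 0.5:                       # threshold illustrative
            video["status"] = "blocked_pending_appeal"  # blocked by default
            video["warning"] = ("Submitting videos against the TOS "
                                "may get your account suspended.")
        else:
            video["status"] = "published"
        return video

    def appeal(video):
        """Only appealed videos ever reach a human reviewer."""
        if video["status"] == "blocked_pending_appeal":
            review_queue.append(video)

    v = submit_video({"id": 1}, violence_score=0.9)
    appeal(v)  # uploader disputes the block; a moderator decides later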

    dexen 124 days ago

    >Couldn't violence (both towards animals and humans) be detected in an automated fashion?

    Even if it could (and parallel posters argue it cannot), there is a meta-problem: "Who is allowed to post violent content?" Violence is contextual, and so is the legality of posting it. The same goes for other objectionable material (nudity, extreme ideologies, etc.).

    Case in point: if all users were subject to the same criteria, the big names in media would quickly get banned. And/or they would go on the warpath against Facebook, accusing it of heavy-handed censorship of the freedom of the press.

    Same for reportage of popular events with any degree of nudity. A news piece about a Femen protest, a Pride event, a World Naked Bike Ride, maybe even the No Pants Subway Ride would cause a ban.

    Likewise, anybody documenting & discussing historical events, and anybody documenting & discussing present-day uprising, revolutions, civil wars, persecutions, etc., would quickly get banned, essentially sweeping a lot of abuse under the rug. Probably even discussion & documentation of domestic violence would have this problem.

    As a lighter aside, cartoonish violence (video games & movies) could also easily fall prey to automated violence take-downs.

    All in all, Facebook really really wants to give certain users (mostly press & historians) broad, sweeping exceptions to the general rules.

      raxxorrax 124 days ago

      Exactly the phenomenon we are seeing today. Random people are accused of forming a hate mob while the mainstream media happily paints a target on individuals. Most of them are even innocent and just have the wrong opinion.

      I thought the US had finally made its peace with nudity, but that was extremely wrong, as the latest developments on every side of the political spectrum clearly show.

      lallysingh 124 days ago

      Train with signals about the sender?

        dexen 124 days ago

        That's my point: it would enforce an automated two-tier system. If you are "ingroup", you would be allowed to post virtually anything. If you are a regular citizen, good luck.

        the_duke 124 days ago

        Also my first thought, but thinking it through, this can easily lead to (potentially valid) criticism of intrusive profiling and censorship.

          raxxorrax 124 days ago

          And that would be bad because...?

        vetinari 124 days ago

        So Facebook would have their own Animal Farm?

          lallysingh 124 days ago

          It already is! Like Twitter with checkmark people.

    lemcoe9 124 days ago

    "should be fa[i]rly easy to find"

    Nope. It's just not that easy. Why do you think the biggest companies on the planet struggle so mightily to reliably and uncontroversially figure out what content to block?

      mulmen 124 days ago

      Because they optimize for profit. Moderation is a cost center.

        cloakandswagger 124 days ago

        Right, so wouldn't they be highly motivated to create models for detecting it automatically? Even if it cost millions of dollars to develop, they would both save money on human labor and boost PR by avoiding articles like this haranguing them about "torturous" working conditions.

        The fact is that the state of the art for ML is simply not to the point where you could build a model for reliably identifying a category as broad and ambiguous as "violence".

        I would hope HN users, even if they aren't directly familiar with ML, would appreciate that it isn't trivial.

          mulmen 124 days ago

          I'm not suggesting that moderation is easy to automate. I'm saying Facebook is an advertising company that cares about content moderation exactly as far as it impacts their advertising business and no more.

          They have no concern for the moderators or the content, they just want to solve this problem (as it relates to profit) as cheaply as possible. At this point in time they believe the best way to do this is by utilizing low-cost labor.

          Facebook could easily devote more resources to caring for these moderators but they don't because doing so has no further positive impact on the bottom line.

            cloakandswagger 124 days ago

            The cheapest route to moderation is automation, and I would be stunned if there isn't a thick layer of it applied before this content reaches human moderators.

            My point is that ML is not a magic wand and we're still in its early days. Facebook would love to have a model that accurately identifies offensive content, but that level of AI does not exist today, and no amount of money thrown at it will instantly advance the state of the art to that point.

    jdietrich 124 days ago

    It's a whole bunch more complicated than that - I'd highly recommend the Radiolab episode Post No Evil and the documentary The Cleaners for the full story.

    Facebook doesn't have a blanket policy of removing violent content, for wholly legitimate reasons. Many users in Mexico very strongly want videos of cartel murders to stay on the site, because they feel that they are important reportage of something that the mainstream media is unwilling or unable to report. Many users in Syria very strongly want videos of IS murders to remain on the site, because they want the world to see the suffering that IS has wrought on their country.

    If Facebook just delete all violent content, they're robbing the victims of violence of their voice - it doesn't feel like Facebook are protecting users of their platform, it feels like Facebook are complicit in covering up the crimes of drug cartels and terrorists.

    It's not enough to identify that a video is violent - they need to know context. Is this particular video in this particular context a cry for help or a celebration of evil? Will removing this video silence the propaganda of a murderer, or will it help to conceal their crimes? Does this video show the death of an unidentifiable person in an ungoverned warzone, or is it vital evidence that urgently needs to be forwarded to the relevant authorities? For now, only well-trained human beings are capable of making that call.

    This sort of dilemma points to the fundamental impossibility of moderating Facebook to the satisfaction of everyone. Some users complain about pornography, so Facebook make a policy to take down the porn. Other users complain that these policies are sexist and protest outside Facebook's offices to free the nipple. Facebook end up with several pages of rules that allow their army of moderators to consistently distinguish between pornographic and empowering nipples, but nobody's really happy with the outcome.

    Facebook are doing a really crappy job of looking after their moderators, but that issue is fixable if we apply enough pressure. There are many other problems with moderating Facebook that are far less tractable and have far wider consequences.

    jsty 124 days ago

    > Especially screams should be farely easy to detect

    I guess it depends how reliably it could distinguish between ordinary screaming (especially from children, who scream at just about anything that makes them happy or unhappy) and 'terrible things are happening' screaming. Even most humans would probably struggle to tell the difference from a short audio clip of a scream, which is all your algo would have to work with.

      SketchySeaBeast 124 days ago

      I think we've all sat in our house trying to figure out if the next door neighbour's kids are being murdered or not.

      Rollercoaster videos would be banned, as well as haunted houses, neither of which are terribly offensive.

        logfromblammo 124 days ago

        I can barely tell the difference between the parrot call that means "Ohshithawkhawkhawkhawk! Take cover!" and the one that means "Hawk! Just kidding; pay attention to me now."

        I'm not certain a machine learning system could classify the emotional context of screams across the entire range of animal voices. And what are you going to train it on, exactly? That's a nightmarish corpus to be tagged, right there.

    mschuster91 124 days ago

    > Shouldn't violence (both towards animals and humans) be farely easy to detect in an automated fashion?

    No. Already filters fuck up way too much and restrict perfectly legal content.

      onion2k 124 days ago

      Already filters fuck up way too much and restrict perfectly legal content.

      Would users actually have a problem with that though? People have the freedom to share whatever they want, but that freedom doesn't necessarily have to include sharing it on Facebook. Facebook could decide to tell their users "We're filtering out the bad stuff much more aggressively now and unfortunately that might block your content by mistake." I think a lot of users would be fine with that.

        marvin 124 days ago

        Such moderation has caused a significant degree of political grumbling in the recent past - e.g. where Facebook has blocked women's nipples but not men's, and where it has blocked photos containing nudity that are politically important, such as the famous photo of Vietnamese children fleeing a napalm attack, as well as works of art that are politically important or relevant to an ongoing discussion, and so on.

        The difference between bona fide political censorship and decency/legality/prevent-minors-from-seeing-supposedly-horrible-things/stuff-we-just-don't-like-and-assign-a-political-label censorship is very much a matter of judgement.

      the_duke 124 days ago

      That's why I mentioned an appeals process for requesting human review.

      As long as this exists and actually leads to human review, I'd much rather wait a while for my videos to be posted if it means easing the burden on 15,000 people having to watch highly damaging content.

        Konnstann 124 days ago

        Why wouldn't the people who upload violent/pornographic content always appeal their videos? It's not like they're accidentally posting banned content. This would lead to nothing more than a slight delay before it gets banned by a moderator, solving nothing.

          mschuster91 123 days ago

          > It's not like they're accidentally posting banned content.

          They simply hope it ain't going to be flagged by another Facebook user or their AI.

hirundo 124 days ago

> Nobody’s prepared to see a little girl have her organs taken out while she’s still alive and screaming.

On the one hand, of course you don't want to see this and of course you want it removed from your social media feed.

On the other hand, if it's hidden, it horrifies fewer people, and instead of rising social pressure to take action against it, it can fester in the dark.

So if we manage to replace a significant chunk of centralized, moderated social media with decentralized, unmoderated alternatives, many of us will be exposed to more of this kind of evil. But as a result more of us will be aware of it and motivated to fight it.

I'd rather participate in unmoderated media, even at a greater risk of being assaulted by this kind of crap. But at this extreme I can sympathize with those that want straight up censorship, even if sunlight is a better long term disinfectant.

    pjc50 124 days ago

    Context matters: it's this kind of thing that was used to promote the massacres of the Rohingya. Or consider the various things that are posted about Syria. Pictures of atrocities without clear attribution of those responsible just make the situation worse and provoke reprisals against the wrong people.

    behringer 124 days ago

    Not in a million years would most people put up with it. If I ever saw something like that on FB I'd immediately delete my account.

solidsnack9000 124 days ago

I asked him what he thought needed to change.

“I think Facebook needs to shut down,” he said.

neuro 124 days ago

This seems to be the place

Cognizant Technology Solutions Woodlands2 7725 Woodland Center Blvd, Tampa, FL 33614

There's another big story here, and it may get more horrific: from browsing their phone directory, most of their "employees" appear to be of South Asian nationality. Given the circumstances and their predatory behavior, I imagine they are also taking advantage of H1Bs.

    dRaBoQ 124 days ago

    Where do you see their phone directory ?

martin1b 124 days ago

The issue is not so much Cognizant or FB as it is the state of the public. The horrible acts posted online for entertainment by the public are the reason companies like Cognizant exist. I hope these posts can be forwarded to local law enforcement and used as evidence against the posters.

    britch 124 days ago

    Come on... don't let FB off the hook. Don't just stand up and blame "society."

    Posting violent, hateful, evil stuff is as old as the internet itself. There will always need to be moderators. The question is how these people are treated.

    Do you really think it is _impossible_ for a company as big and valuable as FB to treat these moderators well? Do they not have the money to pay them what they deserve, to give them sick leave, and to offer even basic mental health services?

    Imagine if half or even a quarter of the money and effort that has gone into their crypto debut had gone into helping build better tools and services for their moderation team.

    cldellow 124 days ago

    The article implies strongly that Facebook prefers to focus on the short term task of blocking offensive content and only pays lip service to the long term task of pursuing criminal charges.

    I'd definitely believe that that is the case, as there's little economic incentive to play the long term game with its uncertain payout if you can just churn through low-wage content moderators instead.

    Users of Facebook, but more realistically, employees of Facebook are the ones best situated to pressure Facebook to change that behavior.

      britch 124 days ago

      There's also the possibility of passing a law around this issue. It doesn't have to be entirely market forces.

    cloakandswagger 124 days ago

    As evidence of what?

      martin1b 124 days ago

      Crime. Several of the examples are clearly cruelty to animals. If a cop watched them do it, they would be arrested on the spot.

ycombonator 124 days ago

Cognizant is a ‘win at all costs’ major outsourcer. Their entire operations and staff are based in India, and they are registered in the US to soften the regulatory hurdles. It’s no surprise they didn’t have a defibrillator in the building.

jokoon 124 days ago

I would have thought they would have the best filtering AI and dataset the world has to offer, to avoid having those moderators work like this.

    britch 124 days ago

    Building an AI is expensive. It also means FB is on the hook for false classifications.

    This moderator labor is (relatively) cheap. It also allows FB to point to a third party if something goes wrong.

    Look at their statements in the article. It's easier for them to distance themselves and point to "bad actor" contractors than it is if these were direct FB employees.

    daxterspeed 124 days ago

    From the descriptions of the work in this article it sounds like Facebook is actively choosing to have the same moderator re-moderate the same content over and over. At that point it almost seems like intentional malice from Facebook rather than a "developer oversight". Surely the first time a video has been flagged it should be trivial to identify further uploads of that video?

    The only reasonable explanation I can imagine is that Facebook is doing everything it can to avoid having to implement a "Content ID" system like YouTube's. Why exactly they don't want to do that is anyone's guess.

Balgair 124 days ago

I feel that we're going to view these places and these practices the same way we view children in coal mines or young women working with radium paint. I've got a feeling that though we know very little about this now, our children and grandchildren are going to know a lot about this.

Dear Lord, what a horrible thing.

luckylion 124 days ago

There seems to be a very distinct difference between employee classes. Management and high value tech employees are treated very well while moderators, gig workers etc are treated as "human resources", quite literally.

It seems to me that they put companies like Cognizant between them and the exploited workers to deflect criticism as "we didn't know, we've been told by our partners that everything is great".

And on a technical level: if true, how can it be that FB needs the same videos and pictures to be moderated over and over? Are they not using any form of content id? Is developing such a system more expensive than running content moderation sites and then swiftly firing the burnt out moderators?

At what point do you become complicit if you're working for Facebook?

    jsgo 124 days ago

    The only problems I see are that someone could potentially upload a lower-resolution version of the video a moderator has seen and perhaps bypass the content-ID system that way (I don't know how tolerant such systems tend to be of situations like that).

    There's also the aspect that if we have it search for distinct frames (like "scene change" type things), some safe things could get auto-moderated, which would be another controversy. Perhaps they could locate start/stop times for the inappropriate parts of the source video and then check later uploads for any of those segments - but then, instead of asking human moderators to judge "is this video good or bad", we're forcing them to record timestamps of when the bad parts happen, which would force them to really watch the video.

    Dunno, I'm of the belief that it's just not an easily solved problem. I hope I'm wrong and they can toss money at the problem to fix it, but I genuinely don't know how they could.

      luckylion 124 days ago

      Even with lower resolutions, fingerprinting (on a frame level) should still work relatively well, shouldn't it? If you get a certain level of confidence on each frame - if you have 80% of the frames with 80% confidence to be the same as $someVideo, that ought to be good enough overall to not get false positives but catch repeat views - how many 30s videos will randomly share 80% of their frames with a 90s video of an animal being abused?

      I assume that most of these videos aren't manipulated by professionals to escape a ban by looking completely different (which actually would make it a different video), but that they are mirrored, cut, have extra material appended or, as you suggested, have their resolution changed. I'd guess that videos are easier to ID than, say, image data, because you have plenty of material to work with.
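
      To make the 80%-of-frames idea concrete, a minimal sketch, assuming some hypothetical per-frame 64-bit perceptual hash (e.g. an average hash of an 8x8 grayscale thumbnail), which survives re-encoding and resolution changes far better than an exact file hash:

          def hamming(a, b):
              """Number of differing bits between two 64-bit frame hashes."""
              return bin(a ^ b).count("1")

          def looks_like_banned(upload_hashes, banned_hashes,
                                max_bits=13, min_frac=0.8):
              """True if >= min_frac of the uploaded frames are 'close'
              (<= max_bits of 64 bits differ) to some frame of the banned
              video -- the 80%-of-frames test described above."""
              uploads, banned = list(upload_hashes), list(banned_hashes)
              if not uploads:
                  return False
              close = sum(1 for h in uploads
                          if any(hamming(h, b) <= max_bits for b in banned))
              return close / len(uploads) >= min_frac

      (In practice you would index the banned hashes, e.g. in a BK-tree, rather than scanning them linearly.)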

      Are there well-organized troll armies attacking Facebook? Because trying to sneak a video in that shows an animal being abused and doing sophisticated changes to evade detection sounds weird on a platform where there isn't even (as far as I know) monetary gain connected to posting a video (as it might be on YT or for Email spam).

        jsgo 124 days ago

        I started playing FFXIV a bit over a month ago. During that time, someone in the Novice Network (the new-player advice channel in game) posted a "does this look like an okay PLD build?" message with a URL. I clicked the URL and it was a video of something being held down and attacked with a hatchet or similar, and it sounded like a child. (As soon as I realized what was going on, after like 1-2 seconds, I closed the video, then reported the user and blocked them. It was a bit much and not the type of content I'd seek out, so that's probably why the details are hazy.)

        The reason I give that anecdote is for the last part: I don't think it's about monetizing this type of content; they're only after the shock factor of it.

        Again, maybe they could automate safeguarding against it, but my guess is they'd find a way to get through. Like take a 5 minute YouTuber video and inject a slightly modified version of the shock video in it with hopes it bypasses the filter. Don't know, but wouldn't put it past anyone that would do this sort of thing.

          luckylion 124 days ago

          That sounds tough; it sounds like you were lucky to have reacted quickly.

          > Reason I give that anecdote is for the last part: I don't think it is about monetizing this type of content and that they're only after the shock factor of it.

          Possibly, I assume so as well, but I believe there's a difference in commitment. If I'm making money off of something, I'm going to put a lot of effort (at least until I approach $0) into evading filters, and can easily pay for advanced technical solutions (possibly hand-crafted for my purpose). If it's just to mess with people, my effort and monetary investment will be much smaller. If I need to put significant time into each video upload only for it to be blocked after a handful of views, it doesn't scale. Trolls are an issue, of course, but video seems to be hard to create/manipulate at scale and easier (I think) to check than, say, text - at least until there's something like GPT-2 for video.

        xamuel 124 days ago

        >how many 30s videos will randomly share 80% of their frames with a 90s video of an animal being abused?

        If a powerful actor wants the 30s video removed, and knows it would be removed if it shared 80% of its frames with a 90s video of an animal being abused, then it would not be technically difficult to engineer such a 90s video.

          luckylion 124 days ago

          Yeah, but that works once or twice, and then the defense catches on. Sure, it's not a perfect solution, but rather like spam filters, it gets you pretty close.

          Also, I don't assume that powerful actors are the issue here - they'll likely just call their contacts at a high price law firm and they in turn call FB to remove some video.

            xamuel 124 days ago

            >And then the defense catches on

            And has to redo everything. It's whack-a-mole.

            Doesn't have to be powerful actors really, either. There are many less powerful actors who would love to get selected things removed from YouTube.

            This isn't the same as spam mail filters, because you generally don't know what sort of emails other people are sending each other. If you did, you could deliberately try to manipulate filters to flag your competitors' emails as spam.

              luckylion 124 days ago

              > If you did, you could deliberately try to manipulate filters to flag your competitors' emails as spam.

              You could, it's just hard and generally not worth it. The same, I believe, goes for video manipulation. Sounds like hash collisions. Possible, sure, but not something you run on your cell in 30s.

                xamuel 123 days ago

                It's eye opening when you realize how cut-throat competition can be when it comes to blackhat SEO etc.

                If you've discovered some longtail goldmine and you're the #2 result when people search for it and it's making you money, you have a big incentive to do anything it takes to nuke the #1 result.

                And before you say "You'll get caught and banned", obviously you do the nuking with a sockpuppet account. That's blackhat 101.

dsfyu404ed 124 days ago

I've worked some shit jobs over the years and I can't say I wouldn't be very tempted to try working there for $15/hr if I were still in the market for that kind of job and everything else around paid closer to $9. I know HN likes to grandstand about how important workplace conditions are, but when you're doing unskilled jobs you don't have that luxury. If you get the opportunity to do a worse job for 66% more money, you take it and try to do it well enough to keep cashing that check.

I know a lot of people here are probably bothered by the fact that the workplace sounds like an impersonal zoo, but most minimum-wage workplaces are somewhere along that spectrum. Dirty bathrooms, inflexible policies for everything, etc. are normal. It's just how workplaces like this are. If you want a shitty job that's more personal and flexible, you need to work for a small employer, but that can have its downsides too.

Sure, the content itself particularly sucks, but that's why they pay bigger bucks than everyone else. If you can cope, good for you. If you can't, you leave. It's just like a call center, but more extreme.

Don't mistake me as defending Facebook or Cognizant here; I'm not. I'm just saying that, all things considered, this doesn't seem like a particularly bad deal as far as shitty unskilled jobs go. They all have their pros and cons, and you've got to find something that works for your personal preferences. I'd take digging ditches over anything with rigid corporate policy about every facet of my job, but that's just my preference.

Personally, something I think would help a lot would be if they over-staffed the place, scheduled people for few enough hours that they could easily hold another job, and made efforts to accommodate people's other commitments. Doing nothing but content moderation every day is probably where a lot of the mental and physical health issues come from. In my experience, the kinds of jobs that really grind you down grind you down a lot less when they're your side gig to some other, slightly less shit job.

mengmeng 119 days ago

Dear Facebook Employees and Zuckerberg: More than the other megas, you claim the moral high ground. You have a passion culture where people `actually believe` they are doing something great. To this day, even with the retarded masses and corrupt congress thinking something is wrong, you `still believe` this. So where's the fruit of your labor? Where have the blessings of your existence improved the lives of the unfortunate? Where? Where? I'm asking you right now, reading this, looking into your eyes: where?

Well, we've heard the line that the tech isn't there yet, or how you have learned from past attempts to help and are taking more targeted approaches.

OK, that works for the retarded masses and corrupt congress.

But I call bullshit. It's actually more than that; it's malicious, intentional lying.

You know how you can help? Here's a really easy way: take some of your money and give it to these slave/content-moderators.

100,000 to you is not 100,000 to them.

How much would you need to pay me to watch that shit? Zuck, man, you couldn't give me your salary to watch that. That shit damages you for life. And you think it's worth whatever?

Y'all walk around (figuratively, on the internet) like you have something to be proud of. Donald Abraham is confident as shit.

But you know, y'all are just like any other mega. Sure, each mega has its own demographic, but the entire reason for a mega's existence ensures 100% that you will engage in unethical behavior, and you will recreate the extremely oppressive society we live in every day. Furthermore, you will recreate the belief in the opposite: that things aren't so bad and we're `good peoples`.


b3lvedere 124 days ago

"He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook “as long as the skin is translucent.”

Wow. Just Wow.

    lazugod 124 days ago

    This sounds like a fantastic example of basing moral guidelines around what is easy for a computer to detect rather than what is actually moral.

    We know that FB has nudity detection systems, due to their past problems with banning breastfeeding discussion groups.

      LucasLarson 122 days ago

      Unfortunately, it’s human work. The paragraph reads “For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat’s face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook ‘as long as the skin is translucent.’ He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.”

doctorRetro 124 days ago

“I think Facebook needs to shut down."

As I've spent the last few years pissing away absurd amounts of time on the platform, gotten in countless fruitless arguments, and seen the truly vile and toxic elements of my communities exposed and worn like a badge, this is an idea I've been thinking an awful lot about. After reading this article, I've never been more certain of that statement.

    hvs 124 days ago

    "We have met the enemy and he is us." https://upload.wikimedia.org/wikipedia/en/4/49/Pogo_-_Earth_...

    Shutting down FB will just create this problem in the next popular social media platform. This is a problem that needs to be solved, and shutting companies down won't do it.

      50656E6973 124 days ago

      That's like saying there's no point in putting out house fires, because there's always going to be somewhere else that catches fire.

      When a social gathering (whether it's a party at a club or a riot in the streets) exceeds capacity and becomes dangerous, destructive, and out of control, the police shut it down for the good of public safety.

        hvs 123 days ago

        That's the opposite of what I'm saying. I'm saying you need to put out the fire not just let it burn down and build a new house. Houses burn, find a way to put them out, don't just plan on creating new ones.

          50656E6973 122 days ago

          There comes a point when a fire gets so massive that it's foolish or impossible to try to put it out, as doing so would just endanger more lives. Human lives are more important than buildings.

      darkpuma 124 days ago

      Some problems can only be solved one generation at a time. Maybe Facebook dying and the new generation coming of age is exactly what society needs.

firefocus 124 days ago

Basically, you get paid $15 an hour to poison your subconscious mind.

Absolutely unethical. Facebook should not be allowed to hide behind subcontractors.

prirun 123 days ago

The article mentioned that "In May, Facebook announced that it will raise contractor wages by $3 an hour". If Facebook is dictating the wages of content moderators, then it is an employer, and all of these "contractors" should be reclassified as such, with Facebook paying all back wages, benefits, etc.

Here are some other ideas:

Instead of having content moderators watch 200 of these videos per day, distribute the load over the entire Facebook workforce. Or, when a content moderator flags a video for removal, verify that by making a regular Facebook employee watch the video to confirm the removal. This should increase the accuracy rate above the 98% threshold Facebook has set. That should put a finer point on the problem.
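
To put rough numbers on the confirmation idea (a back-of-the-envelope Python sketch; the 95% per-reviewer accuracy is an assumption, and treating the two reviewers' errors as independent is optimistic):

    # A removal only counts when a second, independent reviewer confirms it,
    # so a wrong removal requires both people to err on the same post.
    p_single = 0.95                      # assumed per-reviewer accuracy

    p_both_wrong = (1 - p_single) ** 2   # 0.0025 under independence
    p_confirmed = 1 - p_both_wrong       # 0.9975

    print(f"single reviewer:   {p_single:.2%}")
    print(f"confirmed removal: {p_confirmed:.2%}")
    # 99.75% clears the 98% target, at the cost of a second person
    # having to watch the same video.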

I could barely read descriptions of these videos. There's no way I could watch one, let alone 200 a day. This work is like being in a war, and they should be paid like a soldier who is risking their life / mental health.

apanloco 124 days ago

800 workers, 1 bathroom. Really?

    dillonmckay 124 days ago

    Sounds like my current HQ.

    I have worked for too many startups with inadequate bathroom setups.

    It is a big red-flag for me now.

      joezydeco 124 days ago

      I've used this as a hidden criterion for judging a company during an interview cycle. I always ask to use the bathroom and then see how clean it is, and whether it looks like it's been maintained.

      I visited one private company where I talked to the owner - his office was plastered with pictures of his racing Porsche but the bathroom had rusted metal and peeling paint. Took me about 5 seconds to see where his priorities lay and I bugged out of there.

        dillonmckay 124 days ago

        The other red-flag was seeing all the framed photos on the CEO’s desk, of his family.

        All the pictures were facing away from him, so it seemed like he was advertising his family to anybody sitting in front of his desk.

        Maybe he didn’t like looking at them?

        Very bizarre.

          joezydeco 124 days ago

          Wow, that's messed up.

          All I can figure is that he's trying to soften or deflect anger at him when people come into the room. How can you be mad at him when he's just a simple family man?

thtthings 124 days ago

All this content reprograms our brains. Even a sane person doing content moderation for FB will be screwed up pretty soon.

I am so glad I haven't been on Facebook since 2008. If you think it's not messing with you, THINK AGAIN. Facebook is the worst thing that happened to us, except for Mark and its employees.

negamax 124 days ago

This is done for YouTube as well, btw. Not an FB-only problem.

cronix 124 days ago

Don't fret, FB mods. You'll soon be replaced by algorithms, so you won't have to do this much longer. You know, by those people making a minimum of 4x what you do who look down upon you. They'll tell ya to learn to code as they chuckle under their breath.

close04 124 days ago

> Florida law does not require employers to offer sick leave, and so Cognizant workers who feel ill must instead use personal leave time.

It seems that Facebook and Cognizant are not the only ones completely failing at protecting their people (employees, contractors, citizens).

    save_ferris 124 days ago

    Lobbying plays a huge role in how legislative decisions like this are made.

    The city of Austin recently passed a mandatory sick leave policy, only to be struck down at the state level after lobbying by employers.

    It’s not that companies like Facebook simply fail to protect their people, they’re financially incentivized to undermine their rights. And since money == free speech in the US, it’s perfectly legal for them to do so.

      close04 124 days ago

      > they’re financially incentivized to undermine their rights

      In the end most failings can be traced back to financial interest (getting cheap, staying cheap, aiming for cheaper). But the whole article was about how Cognizant and Facebook fail those poor souls who have to go through a "peaceful war", with all the gruesome tragedies but none of the preparation.

      The state doesn't seem to try harder either. This particular detail about sick leave was surprising to me; I hadn't expected it.

aerovistae 124 days ago

Where are the organ harvesting videos coming from, I wonder? Anyone have any thoughts on that?

gerbilly 124 days ago

This might be an opportunity for Facebook to do a lot of real good for the world.

They should have an investigative unit that tracks down the source of this material and partners with law enforcement so we can catch the bastards doing and posting this horrible stuff.

skizm 124 days ago

They should pay users small amounts of Libra for accurately moderating content. It completely solves the problem and they get to pay their new, more willing, contractors in what basically amounts to monopoly money (in the near term at least).

ralphstodomingo 124 days ago

I'm sure such scale requires heavy moderation. It pains me to realize the cost of enjoying the benefits of social media (if there are any at all); it makes me wonder whether we should have it in the first place.

I can imagine a world without Facebook.

    overthemoon 124 days ago

    This is what I keep coming back to. I don't think Facebook is worth it. I would personally extend this to most, if not all, social media.

    aeorgnoieang 124 days ago

    It's sad to realize that the world is worse than we previously thought, but it's not obviously worse overall. Not having social media, or anything similar, and thus not having to moderate awful content, wouldn't (necessarily) cause all of the terrible behavior depicted in that content to not happen.

    People can be awful. So can other organisms (independent of whether they're evil or just immoral). Not facing that fact, at all, isn't obviously best.

s_dev 124 days ago

I live in East Wall, Dublin -- beside the Facebook moderator (The Beckett) building.

I wonder if this is true for them as well or if it's just N. America -- we have better employment protections and a higher minimum wage in Ireland compared to the US.

    anon029102 124 days ago

    AFAIK content reviewers in the Dublin office work for Facebook as permanent employees (not contractors), and mostly deal with escalated cases from offsite contractors.

      disgruntledphd2 124 days ago

      Nope, that's not true (unfortunately). Most of the moderators in Dublin (approx 75%) are contractors, but there are a lot more full-time moderators in Dublin, as global moderation (ex-US) is run from that office.

ryanmarsh 124 days ago

I read the bit about the children having their organs harvested while alive and didn’t believe it so I did some googling and now I’m done for the day. I’m gonna go hug my kids.

    kilroy123 124 days ago

    I'll take your word and skip on the googling myself. Seriously disturbing stuff.

starpilot 124 days ago

How much of this originates with FB/Cognizant? This seems mostly characteristic of low-paying work environments in general. You're not going to get the cultural elite paying $15/hour. You get desperate people with patterns of maladaptive behavior and dysfunction. You get horseplay, various illnesses that come from bad home environments, and overall balls of stress who are just looking for a job, any job.

gopher2 124 days ago

Would be interested in seeing some kind of legislation where content moderation jobs that deal with XYZ categories of content must be compensated with at least some % of downtime/recovery time. So e.g. in an 8-hour day you can only spend 4 hours doing moderation, plus 4 hours of 'paid mental preparation' for doing the moderation work.

firefocus 124 days ago

Basically you get paid $15 an hour to poison your subconscious mind, which is 95% of your life.

Absolutely unethical.

atishay811 124 days ago

I imagine Facebook could use its users to moderate content from others: say you sign up for moderation, then have to moderate 20 posts from unrelated users to get 20 days of ad-free Facebook. Facebook could employ a system like the one reCAPTCHA uses to identify users who are faking it.

Win-win?
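
A sketch of how the reCAPTCHA-style check might work: the standard gold-question trick from crowdsourcing, seeding each volunteer's queue with posts whose correct decision is already known (the post IDs and decisions below are hypothetical):

    import random

    GOLD = {"post_17": "remove", "post_42": "keep"}  # known-answer posts

    def build_queue(fresh_posts, n_gold=2):
        """Mix known-answer posts into a volunteer's review queue."""
        queue = list(fresh_posts) + random.sample(sorted(GOLD), k=n_gold)
        random.shuffle(queue)
        return queue

    def reliability(answers):
        """Fraction of the seeded gold posts the volunteer got right."""
        graded = [p for p in answers if p in GOLD]
        if not graded:
            return None  # no gold items seen; can't judge this volunteer
        return sum(answers[p] == GOLD[p] for p in graded) / len(graded)

    # The volunteer can't tell gold posts from fresh ones:
    print(build_queue(["post_99"]))  # e.g. ['post_42', 'post_99', 'post_17']
    answers = {"post_17": "remove", "post_42": "keep", "post_99": "keep"}
    if reliability(answers) == 1.0:
        print("volunteer looks honest: credit 20 ad-free days")

One obvious catch: unlike reCAPTCHA's squiggly text, the gold items here would expose volunteers to exactly the kind of disturbing content the scheme is trying to crowdsource away.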

    dRaBoQ 124 days ago

    Facebook won't be able to install the necessary measures to prevent the moderators from saving the videos onto their own machines.

    A lot of the bad content is highly desired by criminals and can even be profitable to sell on the dark web (e.g. child abuse).

toss1 124 days ago

>> "Speagle vividly recalls the first video he saw in his new assignment. Two teenagers spot an iguana on the ground, and one picks it up by the tail. A third teenager films what happens next: the teen holding the iguana begins smashing it onto the street. “They beat the living shit out of this thing,” Speagle told me, as tears welled up in his eyes. “The iguana was screaming and crying. And they didn’t stop until the thing was a bloody pulp.”

>>"Under the policy, the video was allowed to remain on Facebook. A manager told him that by leaving the video online, authorities would be able to catch the perpetrators."

Utterly disgusting and inhumane policy on facebook's part.

Continuing to display abject animal cruelty only lowers the bar for would-be imitators.

There have also been shown to be strong links between animal cruelty and human cruelty, including murder.

To be clear, the ONLY proper way to handle this is to immediately take it down and file a report with the relevant police agencies. Expecting the local police agencies to maintain the same staff of 30,000 people to monitor FB posts for crime is stupidly absurd.

This is at best depraved indifference on FB's part, and more likely a deliberate dishonest rationalization to keep posted something extreme that will get lots of 'views' and 'engagement'.

This is only one of millions of examples, including cooperating with the Russian / Cambridge Analytica / etc groups to corrupt elections in the US, England, and elsewhere.

Simply put, Facebook is deliberately poisoning society for profit, and far worse than any tobacco company ever did.

They need to be shut down. Now. There are far better ways to do everything FB claims to do.

(edit: add police report paragraph)

    toss1 124 days ago

    At the very least, FB must be held to account as editors.

    FB vigorously opposes this.

    However, with 1) a staff of 30K++ screeners, 2) who knows how many moderators, etc., 3) many layers of management decisions and policy on what is and is not allowed on the platform and the reasons why/why not, and 4) extensive human & algorithmic promotion and demotion of content, they are absolutely actively editing their site. This is arguably more extensive editing than at any print or media organization.

    The only reason FB wants to avoid the "editing" label is so they can edit as they please. I.e., they want to edit for 'engagement' and profit, not for social responsibility.

    This cannot remain out of control without consequences, which we are already seeing in society.

    iBasher 124 days ago

    After trying to find the video, I found that iguanas are an invasive species in Florida and that blunt force trauma to the head is the legal "humane" way to kill them. Not sure if that's what's going on in the video, and if it took multiple swings it certainly wasn't done correctly, but maybe look into it before assuming things.

      toss1 124 days ago

      There are plenty of invasive species that I'm 100% in support of eradicating from their invaded territories. That is no justification for cruelty of any kind, nor for promoting cruelty, which this video and Facebook do by leaving it posted. No assumptions needed.

      Assume that you are 100% correct: the species is an undesirable invasive and that is the proper procedure.

      Repeatedly bashing their head while the creature is obviously screaming in pain is blatant cruelty. Not only do all the previously stated reasons to pull the video & report it to the police still stand, you provide yet another reason to pull it -- it shows a horribly bad example of how to do it, providing negative instruction and promoting bad & cruel technique.

      Moreover, based on this and numerous other articles and events (e.g., leaving posted a doctored video defaming the person 3rd in line for the US Presidency), this sort of behavior is fully consistent with Facebook's modus operandi.

      Simply, FB are happy to poison society for profit, and disingenuously justify it in the name of 'free discourse' and by claiming to have no editorial control.

      We must at the very least hold them responsible for their content.

        igravious 124 days ago

        User's account is new and the name is "iBasher" (iguana basher?). Proceed with caution.

          toss1 123 days ago

          Oops, yup, I don't even usually check user names, not accustomed to trolls on here.

          Some of the responses and voting patterns do indeed indicate that an infestation of trolls is under way...

Havoc 124 days ago

Even second hand in article format that’s disturbing.

A classic sweatshop operation, except it also destroys the slaves' psyches on top of it.

Two parties weren't really mentioned: the executives who set up the operation and the people posting the content. Both must be pretty twisted.

nonwifehaver3 124 days ago

I haven't used Facebook since 2012 so maybe I'm missing something. When someone posts a video of them torturing a dog under their own name, or some other sick thing, does that not cause some sort of problem with their friends and local community? Many would permanently block and ostracize an acquaintance for such a thing. Why does nobody call the police on someone like that, especially when they know their work/address/etc in real life?

I guess I don't understand the key change in the medium or the culture that requires Facebook's ponderous rule tome and 10000 anonymous content moderators in Manila or Phoenix. Is this just about the risk of some corporate ad being next to some undesired content for a few page views?

saagarjha 124 days ago

> Work stopped while we were there to ensure we did not see any Facebook user’s personal information.

Do users know that their personal information is being looked through by employees who don't work for Facebook?

save_ferris 124 days ago

It’s strange how quiet the pro-Facebookers become on threads like these. Yesterday, we saw a lot of energy around Libra on both sides of the Facebook spectrum.

I’d love to hear an argument from someone defending this company that isn’t “everybody does it.”

    SpicyLemonZest 124 days ago

    Sometimes “everybody does it” is the appropriate response. I’m absolutely in favor of Facebook giving everyone more money and better working conditions and generally making their lives a pleasure. But when they sometimes fail to do that, in predictable ways that other companies also fail at, “Facebook specifically is evil and we’ve got to destroy it” is just not a reasonable reaction.

      save_ferris 124 days ago

      This is an example of default bias. By your logic, one could make the argument that we shouldn't hold the president accountable for lies he tells because it's generally accepted that all politicians lie. Regardless of your political leanings, should that ever, ever be true?

      That argument simply doesn't hold up. Calling out specific, concrete bad behavior should never be defended with "everybody does it." Sure, there's argument to be made that seeking to destroy Facebook is an overreaction, but that's different than arguing that everybody does it, which completely excuses that bad behavior.

    kodz4 124 days ago

    Every good story needs good villains. And the story of the 21st century is just getting started. We are in that phase of the story where the villains are pupating, given resources, and time to develop their own hyper efficient Amtssprache and narratives of self importance. Sooner or later destiny cannot be escaped and it takes everyone to a point of no return. And then the great war begins and the heroes arrive. The entire cycle would breakdown without good villains.

    debacle 124 days ago

    The Libra discussion was strange, because it was mostly pro-crypto people who didn't grasp how not crypto Libra is, and anti-Facebook people, with a slice of "hey this is probably illegal" people thrown in.

    holidaygoose 124 days ago

    It seems like content moderation is just a crazy wicked problem. And the company is doing the best it can with a lot of constraints. Better communication with the workers sounds like it would help, but what else should they be doing realistically?

      save_ferris 124 days ago

      Paying them better and taking their working conditions more seriously.

      There are serious scalability issues with Facebook's current business model, which inevitably put workers in a position where they have few rights, low pay, and poor working conditions.

      We have to be able to question the viability of a business that treats workers this way. Contracting firms essentially allow companies like FB to completely absolve responsibility for the conditions of these low-paid workers.

      anon029102 124 days ago

      > what else should they be doing realistically?

      - Pay them better

      - Give them proper healthcare

      - Don't isolate/silo them

        These things would enable them to better deal with the inevitable stress and secondary PTSD that come with their work. And it would help FB perms observe difficulties and quickly effect change.

        bilbo0s 124 days ago

        Well, they should be elevated above the other departments to handle the obvious security and conflict of interest issues that naturally exist in any company like this. So their salaries should be much higher. And actually, more important than healthcare, which the employee takes advantage of at the employee's own discretion, there should be mandatory mental health counseling and screening. I don't know what the frequency should be, I'm not a clinician. But my layman's guess would be a minimum of 3 times a year for each employee.

        But I disagree about not siloing them. I mean, I'm sure there are some pretty good security reasons for siloing content moderation off from other parts of the company. Not saying that anyone from FB would necessarily do the following, but imagine ad sales bonuses start going away because clicks are down. You just can't have ad sales cooperating with content moderation in any way, shape, or form to get more clickable content through. There should just be a content policy, and content moderation zaps whatever they please. End of story. That's how it should work. If ad sales wants input, they should have to convince legal to change the content policy.

        In other words, if this thing were structured correctly, content moderation would be above most everything else. (Everything other than legal.) And completely untouchable via any mechanism other than an official change of the acceptable use and content policies.

      Konnstann 124 days ago

      When I started working with Class 3B lasers at my lab, I had to undergo an initial medical exam, and then periodic ones to make sure my eyesight wasn't being affected. Guess what the company should be doing?

        SpicyLemonZest 124 days ago

        I mean, come on. If the article had said there are mandatory mental health screenings, and mandatory transfers if you don’t pass, you know everyone would be pointing to it as another example of how the content moderators are being mistreated.

          Konnstann 124 days ago

          The idea is that employees would get mandatory mental health care with licensed professionals, and not the 1 counselor that didn't know how to help.

          Yeah, I would support transfers to non-moderation tasks if the moderator were psychologically unfit to perform the job, rather than keeping them on and piling on more pressure.

      Lewton 124 days ago

      They're contracting it out so they can wash their hands of it; how does that count as "doing the best they can"?

    golergka 124 days ago

    Personally, I skimmed the article and failed to see anything particularly bad about it. Typical unskilled office job, with handpicked horror stories.

    As usual, the author manages to make the employer responsible for the employee's life situation. But aside from an impulse to assign blame to the closest, most powerful actor related to the situation, I don't see any reason to see Facebook in any kind of negative light here. They need a service performed, they found people willing to do that service for the money offered, and they gave them that money. They didn't create the job situation, nor the lack of education, nor the whole life of choices that led these people to this point: they aren't even in any position to control the labor market.

    And speaking of the labor market, I really don't think they would be doing anyone a favor if they paid significantly more than the going market rate. In my experience, such an experiment creates very unhealthy office politics: people realize that they won't get this kind of compensation anywhere else, and their concern about not getting fired becomes more influential than personal ethics or professionalism.

      Nav_Panel 124 days ago

      The distinction between content moderation and other sorts of "unskilled office work" is the distressing nature of the content. My gut feeling is that these aren't "handpicked horror stories" but are instead the core of the job: facing video evidence of the worst aspects of human cruelty for 8 hours a day. Thus, the question raised isn't a simple "shouldn't these employees be treated better?" but something more like "how can we accept the existence of a job like this?"

      In my personal opinion, as someone who's moderated online communities before, Facebook needs to accept that any "system of [ethical] rules" is prone to being "lawyered", and ultimately serves to waste the time of content moderators like these. More importantly, few accept such a system of rules as legitimate: any rule set, even if "fair", is going to be judged by its edge cases, e.g. https://www.propublica.org/article/facebook-hate-speech-cens...

      The answer is to deploy an aggressive moderation AI and tell users "too damn bad" if their post gets deleted (users will protest about "free speech", but that philosophical principle was created in a very different communication environment; if I had to create a new right more suited to the modern day, it'd be a right to "freedom from speech" instead), or else to dramatically limit the sphere of content users are exposed to (although that would go against their whole "connecting the world" ethos, or whatever).

      daxterspeed 124 days ago

      Facebook is the one demanding that these people re-watch the same gore videos over and over. I'd wager that the vast majority of gore videos uploaded to Facebook can be identified from a few keyframes. They shouldn't have to listen to the audio of the video unless absolutely necessary, either.

      Facebook has actively designed a system that is extremely hostile and damaging towards their moderators, and they should absolutely take all the blame for the faults in that system.
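
      One plausible shape for that keyframe matching, sketched in Python with Pillow (a standard difference hash, not whatever Facebook actually runs; it assumes keyframes are already extracted, and the file names are hypothetical):

          from PIL import Image

          def dhash(path, hash_size=8):
              """64-bit perceptual hash of one keyframe."""
              # Grayscale, then resize to (hash_size+1) x hash_size so each
              # row yields hash_size left/right brightness comparisons.
              img = Image.open(path).convert("L").resize(
                  (hash_size + 1, hash_size), Image.LANCZOS)
              px = list(img.getdata())
              bits = 0
              for row in range(hash_size):
                  for col in range(hash_size):
                      i = row * (hash_size + 1) + col
                      bits = (bits << 1) | (px[i] > px[i + 1])
              return bits

          def hamming(a, b):
              return bin(a ^ b).count("1")

          # Hypothetical store of hashes from already-removed videos:
          blocked = {dhash("flagged_keyframe.png")}
          new = dhash("new_upload_keyframe.png")
          if any(hamming(new, h) <= 4 for h in blocked):
              print("matches removed content; reapply the earlier decision")

      A near-duplicate re-upload lands within a few bits of the original hash, so the earlier human decision could be applied automatically instead of putting the video back in someone's queue.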

      pfortuny 124 days ago

      Clearly you just skimmed.

      Or perhaps in your office you deal constantly with anger, perversion, violence and gore?

      Where is that office?

      ssully 124 days ago

      Personally, I find your response emblematic of what is wrong with our industry: completely ignoring the quantity and power of the filth that social media is able to generate, being OK with shoveling this shit to "unskilled" workers and letting them handle the consequences of dealing with that garbage on a daily basis, and brushing it all off as a byproduct of capitalism/the labor market ("They need a service performed, they found people willing to do that service for the money offered").

      You ignore the human cost of all of this and brush off people's experiences as "handpicked horror stories"; you are completely detached from the real world. Despite your claims, this is 100% a problem of Facebook's making.

      save_ferris 124 days ago

      > Typical unskilled office job, with handpicked horror stories.

      Are you arguing that these stories aren't important because of the relative infrequency with which they occur? An employee died on the job in a brutally high-stress environment, and Cognizant is denying that his death had anything to do with the work he did. You're right in that cases like his are all too typical nowadays. But the fact that you don't "see anything particularly bad about it" says a lot.

      > They didn't create neither the job situation, neither the lack of education

      They created the jobs, didn't they? Those jobs exist because Facebook exists. How do they not bear responsibility for the conditions of those jobs? And nobody is criticizing FB for the lack of education or life choices of these workers. What do those have to do with this story at all?

      > they aren't even in a any condition to control the labor market.

      Disagree. They're in a position to constantly threaten these workers with termination; that's about as close to total control as one can get in the free world. I'd wager that's far more control than your employer exercises over you. Can you imagine what would happen if Facebook tried that tactic with their SWEs? It's unbelievably disgusting that they treat their low-skilled workers this way.

      > But aside from an impulse of assigning blame on the closest most powerful actor related to situation, I don't see any reason to see Facebook in any kind of negative light for this.

      If the company paying the employees isn't responsible for the working conditions of its facilities, then who is? Do companies not have a responsibility to provide a safe and healthy work environment? On top of that, these employees who make very little chose to break their NDAs, subjecting them to legal exposure, in order to talk about this. They're risking far more than just their jobs by going to the press.

      If a company isn't comfortable with employees talking, the onus is on them to create an environment in which workers aren't motivated to go to the press. That very clearly isn't happening here.

      > In my experience, such an experiment creates a very unhealthy office politics: people realize that they won't get this kind of compensation anywhere else, and their concern for not getting fired becomes more influential than personal ethics or professionalism.

      Logically, this makes no sense to me. In a healthy work environment, strong personal ethics and professionalism should inherently insulate a worker from termination. If a worker feels that they need to behave unethically to stay employed, that says a lot about the culture of the company. Good people in good work environments don't think this way. But I'd love to see some data on this if you have any.

phosphophyllite 124 days ago

Maybe censoring is not effective?

Why not embrace all content and just have the police raid whoever uploads questionable content?

Does censorship solve anything?

    bilbo0s 124 days ago

    Um...

    Police busting down your door for uploading a video kind of is censorship.

    But yeah, I'm now starting to see why more and more people just want to go to the censor-and-police-raid system. A lot of the stuff people are uploading just doesn't belong in a civilized society. What started off as maybe just content making non-violent jokes about blacks or gays has morphed into showing children being disemboweled and videos of little old black or Jewish ladies being gunned down in their places of worship. It's just gone too far.

    Probably just have to file it under:

    "This is why we can't have nice things"

      pjc50 124 days ago

      > maybe just content on making non violent jokes about blacks or gays

      The nonviolent "jokes" have a habit of escalating into real violence. If you let enough people post N-word rants without consequence and with validation from their hateful peers often enough they will egg each other on until eventually someone burns down or shoots up a church.

        bilbo0s 124 days ago

        Yeah, I guess that's what I'm starting to see.

        Like I said, "This is why we can't have nice things."

          saalweachter 124 days ago

          Ultimately you can only have one absolute principle or right; if you have two, eventually they will come into conflict and you have to decide which is really absolute and which is just mostly absolute.

    pjc50 124 days ago

    Can't police raid someone in another country. Not that the police are particularly interested:

    > Under the policy, the [animal cruelty] video was allowed to remain on Facebook. A manager told him that by leaving the video online, authorities would be able to catch the perpetrators. But as the weeks went on, the video continued to reappear in his queue, and Speagle realized that police were unlikely to look into the case.

    lazugod 124 days ago

    How does that change the problems in the article? You would still need people looking at questionable content to decide when to call the police.

neilv 124 days ago

Where is the Facebook walkout, to demand humane working conditions for everyone at Facebook and its contractors?

mythrwy 124 days ago

Why are the same videos coming up again and again?

Of everything, that seems like an easy problem to solve.

pfortuny 124 days ago

I cannot believe they require signing an NDA for this job. You cannot JUST VENT?

wrongdonf 124 days ago

PSA: I watched a video and I got PTSD.

I used to casually browse a subreddit called "watch people die". In 2016 I watched a video on that subreddit that gave me PTSD. At that point I had watched probably hundreds of intensely graphic videos, and the total pieces of intensely violent media I had consumed probably numbered in the thousands. I had been into it since 2008. I did it out of morbid curiosity.

At first I scrolled through the comments and noticed something very unusual: very emphatic comments warning people that videos can give you PTSD. Most videos have comments where people talk about how "I couldn't even finish it" or whatever, so I brushed it off. After watching the video I immediately knew something was wrong. My body felt strange. My mind was in a state of hyper-tension or vigilance. It's very difficult to describe. I also noticed that my libido would come and go in waves. I would go from not feeling any sexual feelings to being more horny than I've ever been in my life. I knew that something deep inside of me had been deeply affected. I went to sleep without much trouble. When I woke up I went into the bathroom to brush my teeth. I felt something coming over me. A sensation of panic. It surprised me because it came out of the blue and I'd never felt anything like it before. I then entered a full-blown panic attack, which rocked me so hard that I fled the bathroom and threw myself on the couch. It passed, but I was drowning in anxiety and a sensation of doom. At this point I knew that I might have permanently fucked myself. I was scared, but I still had to go to work. I spent the next few weeks forcing myself through each workday while being suffocated by an overwhelming sensation of doom, anxiety, and panic. It was the toughest thing I've ever done. But I got through it.

I noticed many things that I later learned are indicative of PTSD: having your eyes lock up, feeling ready to fly off the handle at the slightest provocation, an intense desire to subdue the anxiety with alcohol and drugs. I would walk down the street like an insane person, ready to rage at anyone who even looked at me wrong, and I had no history of anything like this.

I never had bad nightmares or trouble getting to sleep or staying asleep, so I think I had some kind of light-beer PTSD. But it was hell on earth. Everything I had heard about veterans losing their jobs and killing themselves suddenly made so much sense. Take it from me: PTSD is one of the worst things that can happen to you. And I didn't even have full-blown PTSD.

I got help from a few therapists, and they informed me that if my symptoms persisted for more than a month or so, I would technically have PTSD. Other than that, the therapists were basically of no help whatsoever. The symptoms lasted well beyond three months.

As time went on, the symptoms got better. They seem to have stabilized now. If I'm distracted, I feel normal. But if my mind is idle then my thoughts always go back to it, and with those thoughts comes the anxiety. Long drives can be uncomfortable. I'm at a point now where I'm in the clear: the symptoms are weak enough that they don't threaten my ability to work and bathe and so on, and my ability to recognize and cope with the symptoms has increased a lot too. But it still bothers me sometimes, and I am keeping my eye out for breakthrough treatments. SGB and MDMA look promising.

A thought can either be in your mind or not. When I’m feeling symptoms, it’s almost like the memories are somewhere in my mind, lurking. But other times they aren’t around. It’s like they have a life of their own. It’s something you don’t have control over.

The best coping mechanism I've found so far is a sort of meditation. I think that part of PTSD is that your mind is fighting to block the memories and their emotional consequences. So when I feel symptoms, I let my mind be open to any and all thoughts or memories. I totally relinquish control of my own thoughts, and whatever comes into my mind, I allow it to come and then watch it pass on. Opening the mind and simply observing the thoughts. This dramatically reduces the severity of my symptoms and often leads my mind to organically become preoccupied with something else.

It's strange to think that a video can be so dangerous. But they can be. I was a grizzled veteran of gore videos, and I thought surely that if they damaged the mind, I would have noticed a long time ago. Some videos, especially high-definition ones, can for sure fuck you up. If you have children, don't allow them unfiltered access to the internet. I saw this video on Reddit, for Christ's sake.

unicornherder 124 days ago

I didn’t even realize that they had human mods. Interesting!

AnaniasAnanas 124 days ago

NDAs and non-compete agreements should not ever be considered as valid contracts by the government.

    skybrian 124 days ago

    Some NDAs can be too broad, but this is a bad take. It needs to be possible to hire people whom you trust not to disclose all your secrets, and your customers' secrets. This is what privacy regulations are all about. (At Facebook in particular, disclosing stuff about users is pretty bad; see lots of news stories over the last few years.)

    The balance between protecting privacy and making abuses public is pretty nuanced and doesn't lend itself to one-bit thinking.

      dustingetz 124 days ago

      > needs to be

      Nothing needs to be anything, though the world order would certainly look different and reflect the interests of different classes of people than today

        ghaff 124 days ago

        No. But in the absence of NDAs and other agreements for both employees and external partners, you'd see information sharing, both within and outside companies, limited far more strictly to a need-to-know basis. Certainly those limits exist today to a degree, because NDAs basically just allow for consequences. But if you can't keep someone from turning around and sharing anything you tell them, other than through some sort of mutual trust, you'll be less inclined to share it.

        skybrian 124 days ago

        Seems like looking at this from a class perspective only complicates things further? Poor people often have secrets and can be pretty vulnerable to attack if they're disclosed.

      AnaniasAnanas 124 days ago

      > It needs to be possible to hire people that you trust not to disclose all your secrets, and your customer's secrets

      I disagree; it needs to be possible for whistle-blowers to operate freely. It should also be possible to disclose to the whole world new and superior techniques and technologies that a company tries to hide.

      > This is what privacy regulations are all about

      I am pretty sure that this is a separate thing from NDAs. Nevertheless, I believe the solution should be technical rather than legal, with things like end-to-end encryption and public-key cryptography.

        skybrian 124 days ago

        That's just wishing the problem away with techno-optimism. When you call someone at a company on the phone to get help, they often need to access your account. If they don't have access to anything, they're mostly useless and you get no help.

        We're a long way away from making everything self-service and companies not needing to hire anyone to do support. Until all the support people get laid off, they need to be trusted at least to some extent. (Internal controls can be helpful.)

        ghaff 124 days ago

        >It should also be possible to disclose to the whole world new and superior techniques and technologies that a company tries to hide.

        Whether or not a company really benefits from this in a particular case, the consequence of prohibiting any legal protections against the broad sharing of company information would be a lot more secrecy and compartmentalization of information.

      maxerickson 124 days ago

      Which privacy regulations protect corporations more than they protect actual meat people?

    dsfyu404ed 124 days ago

    I think you're painting with a very broad brush there.

    NDAs and non-competes have their uses. It's when they become part of the default boilerplate that everyone signs to get a job that the problems start.

      AnaniasAnanas 124 days ago

      > NDAs and non-competes have their uses

      I have yet to see a valid use that does not hinder whistle-blowing or the advancement of technology, or abuse employees. I am sure that you will find a few valid use cases if you try hard enough; however, in the vast majority of cases they are used to repress the rights of others.

        navigatesol 124 days ago

        >I have yet to see a valid use that does not hinder whistle-blowing

        The legal system isn't static. If your company is breaking the law and you report it to authorities, your NDA will be unenforceable.

    bloak 124 days ago

    Do any companies use something more like a blackmailer's NDA, which works without a legal system?

    I'm not sure exactly how it would work, and perhaps it wouldn't work in practice, but I imagine it might involve paying the (former) employee a certain sum every month for their continued cooperation, and the employer would reserve the right to unilaterally cancel the arrangement: it would be "discretionary" or whatever. So the employee has a motive to cooperate (unless they're terminally ill ...) but there's nothing to "enforce".

    dec0dedab0de 124 days ago

    If there were no NDAs, companies would exploit patents, trademarks, and copyright even more than they already do. If they kept NDAs limited to trade secrets, there wouldn't be a problem.

    I think non-competes should be limited to while you're actually working there.

      AnaniasAnanas 124 days ago

      Companies seem to try to abuse patents, trademarks, and copyrights as much as they can anyway. The best outcome, of course, would be if NDAs, patents, and copyrights all disappeared overnight. Trademarks are generally fine, but they can be abused.

    pjc50 124 days ago

    NDAs need to be heavily restricted, but it's a difficult distinction to draw between "trade secrets of the job" (which arguably should be protected) and "abusive working conditions" (which should not)

      Nasrudith 124 days ago

      Honestly, why the hell are trade secrets protected, except for a misguided sense of intellectual property that forgets the concept was a contract and not a natural right?

      They don't even have the benefit of disclosure, which patents were meant to provide (to prevent said knowledge being lost). Trade secrets are why we had to investigate Damascus steel reproduction thoroughly and still speculate.

      They give the useful arts and sciences nothing, and yet they get free resources for enforcement.

      I believe the proper legal response from the state to a breach of trade secrets should be "Wow, sucks to be you!" We really shouldn't be promoting that artificial scarcity and restriction of knowledge.

anentropic 123 days ago

sounds like the workers need a union

DannyB2 124 days ago

> there are two sets of entry-level knowledge workers at Facebook:

> engineers ($150k/year, benefits, upward career trajectory) and

> content moderators ($30k/year, no benefits, likely going to acquire mental illnesses).

If Facebook could get away with paying engineers $30K/year, no benefits, believe me, they would.

    H8crilA 124 days ago

    I'm sorry but who wouldn't? I would.

      bumby 124 days ago

      I imagine many people wouldn't. It's potentially shortsighted to have that sort of myopic utility-maximizing mindset when it causes you to debase other values, like social stability.

      But in a society that advocates the hustle, I gotta get mine, right? /s

        threatofrain 124 days ago

        Isn't that something that should be resolved at the social or government level? As opposed to Walmart? The market is good at some things, and that which the market does not naturally provide... we morally expect/beg for?

        Doesn't a situation like this ask for social or government intervention?

          fuzz4lyfe 124 days ago

          It can't be resolved at that level as far as I can see. Even in planned economies, individuals found ways to enrich themselves. My view is that any program seen to imply that an employer is responsible for its employees is harmful. It should be made clear to everyone that the employer/employee relationship isn't too different from the used-car dealer/customer relationship, in that everyone is looking to extract maximum value from the exchange. These people aren't your family and will take food out of your children's mouths without hesitation; as such, make the moves that are best for you and don't work too hard if hard work goes unrewarded.

          This was the critical realization that took me from poverty to the middle class. Smile, pretend to be a bit naïve, extract what you need from the company (résumé fodder, etc.) and move on.

          bumby 124 days ago

          It potentially could be solved at the governmental level, but it could also be addressed at the individual level. Each probably has its own tradeoffs. For example, governments tend to have sweeping solutions that ignore ground-truth information (e.g., the disparity between local cost of living and a federally mandated minimum wage). On the other hand, individuals may not have the bandwidth to absorb all the information (e.g., I have no idea about any immoral conditions behind many of the consumer items I buy). I assume there's a balance between the two. My personality tends to advocate personal responsibility over looking for someone else to solve the problems, when possible.

          specialist 124 days ago

          Government imposed open market sanity saved the domestic oil industry.

          I think about this every time someone advocates laissez faire (aka Freedom Markets™, corporatism).

          Every game needs referees, a ruling body.

          xvector 124 days ago

          Social responsibility is a thing.

        csallen 124 days ago

        Sure, but this seems arbitrary. Why is $150K great but $30K isn't? At what level of income for software engineers would you consider employers to be "debasing" the values of social stability? What's so unstable about a world in which software engineers don't make a ton of money?

          bumby 124 days ago

          I guess it's expected in HN that someone would focus on the software engineer pay, but that largely misses the point. I don't think anyone was arguing that the SWE in this case was overpaid. Perhaps there's a case that, in the context of the moderators who make so much less, it can create a wealth inequality that leads to social instability, but I don't thinks there's enough information for that claim.

          I would argue that there is a non-arbitrary point below which someone cannot afford the things that produce a stable society: being able to raise a family, save for retirement, pay for healthcare, etc. When those things are absent, people become more reliant on government/society to fill in the gaps. When people live paycheck-to-paycheck, depending on government subsidies, it certainly can affect social stability.

            emn13 124 days ago

            People at 30k are not "dependent" on government subsidies in any socially meaningful sense. They're likely paying more in taxes than they receive in subsidies - and even if they aren't, there's arbitrariness in where that "zero" line sits in this accounting. E.g. you might call public transit subsidies in cities a subsidy, but... it's a social good, so who exactly is being subsidized? These kinds of workers have next to no leverage, so they're paid the minimum they (as a group) need. If you "subsidize" fixed costs, pay is likely to drop. Many of those subsidies might equally be called investments, or plain old payment. This isn't, in that sense, equivalent to charity, where people really can't take care of themselves.

            I don't think you're wrong or anything - as individuals they're obviously relying on those subsidies; it's just that the perspective is weird. These people are clearly adding value to society (of the monetary kind; I don't want to get sidetracked by the moral aspect here) - they're just not in a position to claim it for themselves. Being unable to make long-term plans is destabilizing: yep! The fact that the government happens to "pay" you, largely from your own tax bill: why is that destabilizing? That must be due to stigma or something, because there doesn't seem to be any underlying mechanic that makes sense.

            Am I missing something?

              bumby 124 days ago

              I know the $30k mark is germane to the original article but I deliberately refrained from putting a quantifiable value on where this "dependency" tipping point may be. For one, I don't know and two, it probably changes with other variables like location, number of dependents etc. In some areas, I imagine $30k is enough to need no assistance while in others life becomes unmanageable without it.

              I would say that most/all subsidies are in place because they are viewed as a social good. I would also say that regardless of where that subsidy qualification point is, it can be destabilizing in that it moves people away from self-sufficiency. When people feel financially insecure, they tend to avoid the kinds of choices that may lead to a more stable society: buying a house, starting a family, starting a business, etc.

              Maybe I'm 180 degrees off here. There's certainly been an argument that government subsidies make a society more stable and not less. I've heard some even refer to it as "revolution insurance" for that same reason.

              Here's a different take on it: subsidies tend to help both the upper and lower ends of the economic strata. The lower benefits directly and, in cases of business elite, they may benefit from effectively being able to maintain lower wages. So who gets pinched? The middle class, which I think most would feel are necessary for a stable society. Some of our biggest gains as a country came from when the middle class made the largest strides. Not sure I know enough to make that claim, but it's one that's bandied about.

                emn13 123 days ago

                > Here's a different take on it: subsidies tend to help both the upper and lower ends of the economic strata. The lower benefits directly and, in cases of business elite, they may benefit from effectively being able to maintain lower wages. So who gets pinched? The middle class, which I think most would feel are necessary for a stable society. Some of our biggest gains as a country came from when the middle class made the largest strides. Not sure I know enough to make that claim, but it's one that's bandied about.

                Makes a lot of sense - but like you, not sure whether sense implies truth.

            csallen 124 days ago

            > I guess it's expected in HN that someone would focus on the software engineer pay…

            This branch of the comments is literally about software engineer pay. See four levels up: "If Facebook could get away with paying engineers $30K/year, no benefits, believe me, they would."

            To which you replied, "It's potentially shortsighted to have that sort of myopic utility maximizing mindset…"

            > I would argue that there is a non-arbitrary point below which someone cannot meet the items that produce a stable society. Things like being able to raise a family, save for retirement, pay for healthcare etc. When those things are absent people become more reliant on a government/society has to fill in the gaps.

            I agree with this. But shouldn't we then be advocating for a government/society capable of filling in the gaps? Expecting individual citizens or corporations to do this out of benevolence seems like a poor strategy.

              bumby 124 days ago

              > But shouldn't we then be advocating for a government/society capable of filling in the gaps?

              See my other comment above. There's reasons to believe both the state and the individual have a responsibility in fixing it for both ethical and practical reasons.

            crumpets 124 days ago

            So where is your outrage over the price of nearly all manual labor, fast food jobs, etc? $15/hr is pretty standard for unskilled work.

              bumby 124 days ago

              Who said I don't have similar outrage over those?

              bumby 124 days ago

              As an aside, $15/hr may be common but it's not necessarily "standard". If you look up the Davis-Bacon Act for non-skilled manual labor in the SV area, most rates are around $30/hr + $20/hr fringe (give or take, depending on the position).

      anarchy8 124 days ago

      I wouldn't, because that would be horrible...

        H8crilA 124 days ago

        How much would you pay your house cleaner? $100k a year? Do you tip $100 in restaurants so that waiters can enjoy a similar effective salary?

          memmcgee 124 days ago

          Neither of those includes massive mental trauma.

            geodel 124 days ago

            More likely, none of the trauma they suffer is being pursued and reported by the media.

          reactorofr 124 days ago

          I would if I could, absolutely.

            csallen 124 days ago

            What's stopping you from giving high percentages of your income/wealth away to those less fortunate than you right now?

              amanaplanacanal 124 days ago

              Wait, how do you know they don't?

                csallen 124 days ago

                To my eye, "I would if I could" implies "I don't because I can't."

          lagg 124 days ago

          If I had the fuck-you levels of money Facebook does, I'd switch your and your house cleaner's salaries, hire a PI to follow you and figure out which waiter or clerk hates you the most (you know there's at least one), and give them $3.5 million in front of you. Simply because the most powerful and hilarious use of fuck-you money is punching up in such a manner.

          And making someone's life better at the expense of someone who thinks like you do is the cherry on the most flavorful sorbet.

          What I'm saying is, you're a diiiiiiiick.

          Edit: I was so taken aback by the dickery that I forgot my closing point, and I guess this shit isn't obvious to some people: there's little reason whatsoever, beyond personal insecurity, to not want someone to make as much as possible, but many reasons to not want them to make as little as possible. If you can get away with either - like Facebook can at this point, I really feel - why the hell would you do the latter? I'm aware of the practical issues with incentives here, but you already derailed the thread into dumb anyway, so what's it matter.

          flatline 124 days ago

          Neither of those requires years of education and training.

            wernercd 124 days ago

            Content Moderators requires years of education and training?

            The original poster made two categories: Engineers and Moderators.

            Engineers? Definitely years of training and education...

            Moderators? What education is required to say "Does this post break community standards? (Check yes/no)"

              fesoliveira 124 days ago

              >Content Moderators requires years of education and training?

              It doesn't, but the potential trauma and psychological damage they might suffer from this kind of work should put it in the "high risk, high reward" category. I would never do a job like that, knowing it could scar me for life, unless the rewards were fantastic. And I believe that most people, knowing what this job entails, would say the same thing.

              The way the article puts it, these moderation companies are looking for whoever they can get to fill the chairs in order to meet their quotas, offering a seemingly larger paycheck for what they advertise as an easy job. They literally mislead people into accepting the job, and by the time a new hire finds out what it is really about, they are likely dependent on that income and can't leave. And as time goes on, they get more and more damaged, to the point where the company eventually fires them, and they are left with no money, no psychological help, and trauma that lasts who knows how long.

            saagarjha 124 days ago

            Does being a Facebook moderator?

              flatline 123 days ago

              The comment was in regard to paying engineers $30k/yr.

        rayvd 124 days ago

        If they were willing to work for that wage, why wouldn't you allow them to? Why would you favor an opportunity for someone pricing themselves higher?

      lexapro 124 days ago

      Yeah, that's kinda how free markets work.

        ameister14 124 days ago

        Except it's not, not really - Ford discovered that paying workers a better wage resulted in greater profitability and increased efficiency.

          sbov 124 days ago

          To be more specific, paying them a better wage resulted in less turnover which meant less time searching for employees, less time training, and more experienced workers - resulting in an overall more efficient production line.

        bumby 124 days ago

        Except we don't have those Ayn Rand-ian free markets, for a reason. We have a whole host of workers' rights in the U.S. because society realized that letting free markets take whatever advantage they can may lead to bad outcomes.

          cabaalis 124 days ago

          Don't we still enjoy the results of those "bad outcomes", though? Huge cities built, railways across the country, massive industries and technology advancements from the period?

            bumby 124 days ago

            Sure, but this falls under "the ends justify the means", which I disagree with.

            Take climate change: just because we advanced under our previous attitudes toward carbon doesn't mean it's a good idea to keep going in the same vein.

            specialist 124 days ago

            All of those outcomes were initially paid for by government, meaning us taxpayers.

            For instance, railroads were given huge land grants.

          luckylion 124 days ago

          ^ for engineers. It obviously still works on moderators, and this isn't the first time it has been reported on, so it appears no outcry from society is to be expected any time soon - they know, and they don't care.

          I'm convinced they would care if it affected those with higher education from wealthy families (i.e. those who become engineers and managers), but it doesn't; they are well protected.

            bumby 124 days ago

            Not saying your class argument is wrong, but in the U.S. at least, engineers and moderators are protected by the same workers' rights laws. Maybe the laws need to be updated to reflect modern service jobs, but all workers fall under them.

          maximente 124 days ago

          no, that's not why we have a whole host of workers' rights in the US.

          we have a whole host of workers' rights in the US because people were fighting for workers' rights in the US, some of whom were injured or killed.

          (ed: clarity)

            bumby 124 days ago

            They fought for them because they didn't want the outcomes produced from unfettered industry. I don't see why the tactic (fighting) and the outcome (more equitable society) should be treated as mutually exclusive.

          okr 124 days ago

          Do you have an example of a free market where this has happened?

          I consider the IT world that I was part of the freest experience of my life. No state was involved, contracts were enforced, people just got along. A level playing field. No one even thought of unionizing.

          Just now it's peeking around the corner again. Rules, bureaucracy, everything that slows you down when you try to make your own thing. Can I do this, can I do that? Other people deciding what I do with my life.

            autotune 124 days ago

            >Rules, bureaucracy, everything that slows you down when you try to make your own thing.

            Would you say the same thing about OSHA? Sometimes having additional safeguards in place is a good thing.

            bumby 124 days ago

            Are you asking for examples where regulations came in without government intervention?

            There are some examples from the U.S. labor movement that fit this sort of self-organizing regulation, although I don't know much about the subject.

            Eventually, many of the privileges became federally codified rights under the Occupational Safety & Health Act signed by Nixon.

      dahart 124 days ago

      The company I work for now seems to be serious about a strategy of golden handcuffs, giving more than the going rate, and keeping employees from having to fret about basic needs so they can spend more time focused. It definitely appears to work to retain talent. I could say similar things about the two companies I worked for prior to this one as well. I feel there is something serious and economically competitive to be said for understanding the importance of hiring and keeping good people by paying more than the minimum. Maybe this only works if individual employees aren't easily replaceable. Or, maybe some companies don't try to track the long term financial costs of high turnover.

      erikpukinskis 124 days ago

      I wouldn’t. I pay everyone at least a living wage, full stop. It’s a matter of ethics for me. If the business can’t support it, I change the business.

    not_a_cop75 124 days ago

    A company that made $15B can't afford to raise salaries? What a ludicrous statement to make.

      DrJaws 124 days ago

      He never said they can't afford to raise salaries, just that they can get away with not doing so.

    meruru 124 days ago

    >likely going to acquire mental illnesses

    I find it hard to believe humans are so sensitive that some visual stimulation is going to give them mental illnesses. We evolved in an environment where violence was part of real life and a real threat to our own well-being; seeing it on a screen will elicit emotions we may not be used to in modern life, but causing mental illness is a stretch.

      jstarfish 124 days ago

      The job is to view an endless parade of the most creatively horrible things people have ever said to each other or captured on film, all day every day.

      The job itself requires you to actually scrutinize the details-- averting your eyes is not an option. The job is to look at and mentally process each artifact.

      Violence in our origins at least had a point. Kill for survival. Kill for food. Killing meant something.

      Raping children never served any biological purpose, nor does watching actual people explode. There are many reasons we try to make the world such that neither is a part of anybody's life.

      (ed) Put it this way-- if it's so harmless for people to be exposed to, why moderate anything at all?

        meruru 124 days ago

        Yeah, to be honest I wrote that before reading the article, and I was painting a much rosier picture than what was described.

        I do wonder what the distribution of content they have to moderate looks like. Is it a continuous stream of violence, or something that appears once or twice a day?

        >if it's so harmless for people to be exposed to, why moderate anything at all?

        Obviously if the public dislikes something enough, you're going to moderate it away. Some places don't even allow cleavage, so something doesn't have to be harmful to be moderated.

      Fnoord 124 days ago

      > I find it hard to believe humans are so sensitive [...]

      Perhaps that says more about you than about humanity.

      There are two very good reasons why cops match checksums of known material instead of watching child pornography (rough sketch of the idea below):

      1) It goes faster (in the long term, once the scripts/programs are written).

      2) It's mentally exhausting to check such footage. Why? One word: empathy.

      Also, why is it that cops become so tense, or racist? Because of what they've experienced. Shooting someone for the first time in your life, and killing them, is going to have a severe impact on you. If it's kill #81, it's just a statistic.

      marak830 124 days ago

      I'd like to preface this by saying I'm not trying to diminish what these people are going through.

      But it's relative.

      I grew up on farms, I've slaughtered animals, and I've seen seemingly stable 30-year-olds completely break down when they see a video of an animal being humanely slaughtered.

      I've also heard stories from mates of mine in the military that I'd prefer not to see on video.

      People are desensitised to things they are raised with. Some shit (kiddie porn, extremely graphic BDSM, rape) isn't something you really want people to be able to accept.

      Knowing that it's something that really happened vs a movie is a big difference.

        meruru 124 days ago

        >People are desensitised to things they are raised with

        From my own experience, that's not required. I grew up pretty sheltered myself and I have no problem seeing that kind of content occasionally. Maybe it would be a different story if I were being exposed to this sort of content constantly. I don't know.

        That said, I hadn't actually read the article when I posted and found it pretty surprising. It seems there's a lot going on there.

        >seemingly stable 30-year-olds completely break down when they see a video of an animal being humanely slaughtered.

        My reaction was more about things like this. I tend to think that's an overreaction and at least partially feigned and self-deceptive.

          marak830 124 days ago

          So take that, and then change it to a human child. I was deliberately using fairly mild subject matter.

          At that point, if you don't have a reaction (even after seeing it multiple times), it's probably time to seek professional help.

            meruru 124 days ago

            >At that point, if you don't have a reaction (even after seeing it multiple times), it's probably time to seek professional help.

            You're already assuming there must be something wrong with me because I'm not scandalized by the idea of seeing violence. That's exactly the kind of reaction that makes me skeptical.

            Just so it's clear, I do dislike seeing violence, I just don't think it's that big a deal in general.

              marak830 124 days ago

              No, I was using something mildly difficult to watch rather than something horrifying.

              If you think that watching child porn to rate videos as safe or unsafe for a platform such as Facebook or YouTube is something you could handle repeatedly without harm, then I urge you to seek some help.

              In all honesty, being able to watch multiple videos like that - which you have every reason to believe are real - without being affected may mean you have some compassion issues, and in your own best interest, seeking help cannot hurt.

                megous 124 days ago

                Having compassion doesn't mean breaking down when seeing [e.g.] a gutted child on a screen.

                You can still know it's wrong, feel sympathy and want to help, etc.

                Where I'd draw the line is seeing that, and wanting/asking to kill more, or enjoying that, or feeling the need to justify the act, or not stopping to consider the act.

      otakucode 124 days ago

      Human brains are very adaptive. Consider the following scenario: a person is standing on a stage, in front of many of their peers, giving a speech or receiving an award or some other public recognition. Someone comes up behind them and swiftly pantses them, dropping their pants and underwear instantly, exposing them nude from the waist down. Would you expect this event to cause psychological trauma? The answer depends entirely upon that person's prior experience. If they were raised as a nudist and have spent a great deal of their life nude in the presence of strangers, it would be a non-event on the level of sneezing. If they were raised the way most individuals in our society are, with tremendous social scorn and shame heaped upon every topic related to the body and zero experience of being naked or witnessing nudity in public, it could be monumentally traumatic and cause intense psychological harm that lingers for decades.

      When society was familiar with violence, violence had less of an effect for exactly that reason. Now that violence is very rare, and we repeatedly insist to one another that things on a screen are basically the same as the things themselves (we describe fictional depictions with the words that would only really be correct for real, in-person situations), we sensitize ourselves and invent the possibility of these images destroying us psychologically. It is a self-imposed weakness, but one that intellectual recognition alone can do almost nothing to remove.

      Const-me 124 days ago

      > We evolved in an environment where violence was part of real life and a real threat to our own well-being

      It's beneficial for survival to have some strong reaction to observing violence. It could be "charge", "run for your life", or something else, but it was never "click a mouse button and move on to the next violent scene". Similarly for gore: the evolutionarily correct reaction is to run and/or vomit.

      People who failed to react strongly did not pass on their genes: they joined the violent scene they were observing, or caught the same disease and joined the gore scene they were observing.

      ljcn 124 days ago

      It's well known that viewing graphic material can trigger PTSD, e.g. in the police force. These moderators will be exposed to similar material.

      hollerith 124 days ago

      >We evolved in an environment where violence was part . . .

      We also evolved in an environment without screens, that is, a world without any selection pressure to distinguish between actually dangerous experiences and experiences generated by screens.

      geggam 124 days ago

      When I worked at Y! Groups, legal would occasionally ask us to remove images. I am about as insensitive as it gets, but a few images of babies being molested will trigger some deep emotions.

      I can't imagine doing that day in and day out.

bksenior 124 days ago

You ever read Upton Sinclair's The Jungle? It's a tale as old as time. Just publicly shame the company and ultimately they change or Congress passes laws. This is a major reason the media works this way; it's part of American democracy.

    president 124 days ago

    > Just publicly shame the company and ultimately they change or Congress passes laws.

    This just doesn't work in today's world. If it did, Equifax wouldn't be thriving the way it still does today.

      nkrisc 124 days ago

      People will have a much more visceral reaction to the truth behind the products they eat, as opposed to the arcane world of credit reporting that most people are only vaguely aware of when they go to rent an apartment and the landlord runs a credit check.

      Apocryphon 124 days ago

      How does the average consumer directly interact with Equifax? Unlike a meat producer, an amorphous credit agency isn't something you can just boycott.

      lostlogin 124 days ago

      A good example. Another is FB and their practices with privacy and user data.

      bksenior 124 days ago

      The effectiveness of different types of persuasion changes with time.

      Eventually there will be something that moves people if it becomes important enough.

        emn13 124 days ago

        Relying on Congress to respond to public issues is fundamentally weaker now, and weakening by the year, due to the duopoly in politics: any issue one side takes up is immediately - where possible - criticized and turned into a point of differentiation; the media are growing into that too (online media are already highly polarized).

        Whatever small forces of reasonable consensus remain aren't enough to address issues like this, because, let's be honest, it's not really all that clear exactly what the problem is (there are lots of aspects) or how serious it is, let alone how to address it (if that's even possible). You're not going to get proper consensus overnight on an issue like that in the best of times, and it's hopeless now. At best - and I don't think even that is likely - you'll see some hyper-targeted no-worker-abuse kind of law, but nothing addressing the underlying dynamic that creates the situation, and something a creative business will likely be able to route around. Essentially a base-pleasing legislative patch that punts the problem until it reemerges. Given the topic, that's only going to happen if the Democrats win both houses.

          khazhou 124 days ago

          Actually, since Facebook is now considered "anti-GOP", I would not be surprised if Republican senators begin investigations into this.

          crooked-v 124 days ago

          > any issue one side takes up is immediately - where possible - criticized and turned into a point of differentiation

          This kind of "bothsame" response willfully overlooks that one party has gone to substantial lengths to offer compromises on bills, appointees, and policy goals, while the other very much hasn't.

            emn13 124 days ago

            You're totally right to point out that Republicans are much more shameless in this regard. I didn't intend to imply the two parties are the same here, but that's how it reads.

            But even if the Republican party and electorate are at "fault" here, I don't think that fact is going to help anyone, nor do I think the Democrats are acting like little angels anymore (if they ever did). They don't have the power to stop this dynamic, but they sure are participating in it pretty wholeheartedly now.

            More fundamentally, the idea of open markets and, in general, an open society (i.e. the anti-Trump) was in some sense fallacious, and Democrats didn't admit this quickly enough. A rising tide does not lift all boats - not automatically, at least. There was precious little care for ensuring people had the negotiating position to actually profit from the changing times. And what rebalancing there was didn't actually serve to reduce inequality, merely to make it sting a little less - but people don't like being treated as inferiors needing a paternalistic pat on the head.

            I think of it a bit like a prisoner's dilemma: sure, it's utterly self-destructive to do what Trump (and by extension his voters) are doing. But it's partly a reaction born of a lack of other options: if they're going to get screwed, then screw everyone; time for a reboot. Only that reboot isn't in sight, and all kinds of other nasty social habits are coming along for the ride.

            I'm exaggerating a little for the sake of debate, but you might say that the liberal elite of, say, 15-30 years ago was being a little deceptive, and doing so in a somewhat condescending, abrasive way - and that was plain dumb, because it helped cause something like Trump. In the playground analogy: maybe they started it, but this fight was avoidable.

              rapind 124 days ago

              The ideological polarization of the parties was really kicked into full gear by Gingrich. Trump is very clever at manipulating the media to take this polarization to new heights, but the playbook is Newt's playbook.

              It certainly doesn't help the Democrats (and reasonable republicans?) that they appear weak when they try to be reasonable or compromise.

      quest88 124 days ago

      If the media kept covering Equifax as much as they cover Facebook, things would have changed, I think.

      It has been shown that opinions change depending on what the news is covering.

    bagacrap 124 days ago

    There's a key difference there: the practices described in The Jungle directly affected the population at large, by way of the food they ate. The kind of exposé in TFA requires the masses to have collective empathy.

      officialchicken 124 days ago

      While "Mary had a little lamb" may be the part everyone remembers - and is THE REASON why the FDA was founded - the book spends more time discussing exploiting labor (unions and busting them); the role of banks in predatory housing; the role of transportation networks in segregation; and all kinds of issues related to immigration, education, and public health.

      memmcgee 124 days ago

      The population at large is heavily affected by Facebook as well.

    sharkmerry 124 days ago

    I believe Sinclair's goal with The Jungle was to expose working conditions for the benefit of the workers, but the takeaway was how disgusting our food was.

      mortenjorck 124 days ago

      In the author's own words: "I aimed at the public's heart, and by accident I hit it in the stomach."

    pmoriarty 124 days ago

    More apropos to this age than the political climate in which Sinclair's "The Jungle" led to pro-consumer reforms is Jack London's "The Iron Heel"[1], wherein, when challenged, one of the captains of industry speaks these prophetic words:

    "This, then, is our answer. We have no words to waste on you. When you reach out your vaunted strong hands for our palaces and purpled ease, we will show you what strength is. In roar of shell and shrapnel and in whine of machine-guns will our answer be couched. We will grind you revolutionists down under our heel, and we shall walk upon your faces. The world is ours, we are its lords, and ours it shall remain. As for the host of labor, it has been in the dirt since history began, and I read history aright. And in the dirt it shall remain so long as I and mine and those that come after us have the power. There is the word. It is the king of words--Power. Not God, not Mammon, but Power. Pour it over your tongue till it tingles with it. Power."

    [1] - https://en.wikipedia.org/wiki/The_Iron_Heel

    revscat 124 days ago

    > Just publicly shame the company and ultimately they change or Congress passes laws

    This will not happen so long as libertarians and Confederates have political power.

averros 124 days ago

Oh, and the simple thought of refraining from censorship and giving users the tools to self-sanitize their own feeds never actually crossed Facebook management's minds.

Because, apparently, Facebook is not only about stalking users and manipulating them into buying more stuff they don't need, but also about manipulating them into believing the "right" stories and voting for the "right" people.

All of which makes censorship - both algorithmic and human-based - a must.

Another takeaway from the article (for Facebook users): your perception of reality is being shaped by the kind of people who can't (or can't be bothered to) find a better job than one paying $15/hr.

sonnyblarney 124 days ago

Funny how Zuck and even Bezos have this worker problem coming up all the time.

They're making a lot of money; can they not at least make working conditions decent? Is that too much to ask?

mahgnous 124 days ago

Disgusting that companies use NDAs to abuse their employees like this.

Proven 124 days ago

> The 800 or so workers there face relentless pressure from their bosses

They're not slaves. Like those Foxconn workers, they can leave for a Tier 2 employer or find something else to do.

> to better enforce the social network’s community standards, which receive near-daily updates that leave its contractor workforce in a perpetual state of uncertainty.

tl;dr - all of this is completely unnecessary. If American control freaks (in this case, mostly socialists) stopped attacking FB with constantly new types of complaints and censorship requests, both workers and their managers would live happier lives, and their accuracy wouldn't be below target (that is, if they used common standards and common sense in their work, there would be fewer false positives and false negatives).

dx7tnt 124 days ago

I can't wait until my bank has to pay teams of people to keep child porn and beheadings off its website!

creaghpatr 124 days ago

For the wage mentioned in the article, I'm surprised they don't have trouble finding people to do the job.

>Speagle helps to take care of his parents, who have health problems, and was afraid to quit Cognizant. “It was tough to find a job down here in this market,” he said.

Tougher than watching graphic, PTSD-inducing videos every day? It sounds like the job absolutely sucks, but these contractors need to take a little personal responsibility for their own mental health. I suspect they do, but the article gives the impression they are all trapped there with no way out.

Tampa is a major call center hub, so I imagine contractor positions are opening up all the time, given general churn rates.

    danharaj 124 days ago

    > It sounds like the job absolutely sucks, but these contractors need to take a little personal responsibility for their own mental health. I suspect they do, but the article gives the impression they are all trapped there with no way out.

    > Tampa is a major call center hub, so I imagine contractor positions are opening up all the time, given general churn rates.

    What matters is that you've come up with a narrative that convinces you it's the workers' fault.

      creaghpatr 124 days ago

      Are you really trying to gaslight me here? Why not address my argument instead?

        chefandy 124 days ago

        You're telling workers that it's their fault they're being abused by a crappy employer, when there are many reasons someone might not be able to leave a job, and yet you accuse the person pointing that out of gaslighting you. That, itself, might actually be closer to gaslighting.

        mcbits 124 days ago

        Your argument being ... that they should simply get a new job if it's so bad, or that the article must be misrepresenting the job if they don't?

          creaghpatr 124 days ago

          Both! It's right there at the beginning:

          >But had his managers asked, they would have learned that Speagle had a history of anxiety and depression, and that he might not be suited well for the role. No one did.

          Perhaps when he realized he was going to be viewing graphic content for the entire workday, he might have reconsidered whether the job was right for him? The framing clearly tries to pin the blame on the company and/or Facebook, and while they definitely misrepresented the job (and the company sounds like it committed a lot of workplace violations in general), no one forced him to be there.

          And then he got laid off; he didn't even quit on his own.

            fzeroracer 124 days ago

            By your own argument, workers are never allowed to criticize their working conditions. You're technically not being forced to work there (even if you have a family to feed, poverty prevents access to better jobs, etc.), therefore you just have to chin up and deal with it.

            That argument is entirely unrelated to the article at hand, which specifically calls out how terrible these working conditions are and how Facebook perpetuates them.

          workingpatrick 124 days ago

          I agree, we can't blame people with limited economic opportunities for taking a relatively "good paying" job. We can absolutely blame one of the largest corporations in the country for offering jobs with such demanding and damaging work while providing few support services and allowing these sub-contracting companies to treat employees in such a manner.

    lallysingh 124 days ago

    It's not easy to self-diagnose for this stuff. "You just read Facebook posts and look at pictures? No customers yelling at you? Sounds easy."

HPSUCKME 124 days ago

Any leadership by ex-HP employees truly enriches a company. I am very impressed. Thanks for removing my previous post.

HPSUCKME 124 days ago

HP is why no one can have nice things. Let me fix this for you, OP.

"The person responsible for managing YOURCOMPANY's growing contractor workforce is Arun Chandra, whose title is vice president of WHATEVER. Chandra arrived at YOURCOMPANY last November after a long career at HP, where he helped to oversee the company’s global supply chain. In his new role, he told me, he hopes to gradually INCREASE OUTSOURCING TO 100%."

redm 124 days ago

This story reads like a journalist trying to find some dirt on Facebook because it's "hot" right now. It's the inverse of writing fluff stories when everyone's excited about a startup. This story isn't even about Facebook.

I worked tech support for Acer in the 90s and it was a similar setup. If they took the time to have "relaxation" rooms, I doubt it's as bad as the story makes it out to be.

    rawrmaan 124 days ago

    You obviously didn't read the article.

      redm 124 days ago

      Actually, I did, obviously. I feel I have a much better picture of contracted centers like the one described in the article than a sensationalized story from The Verge could give me, though.

        rootlocus 124 days ago

        You said:

        > If they took the time to have “relaxation” rooms, I doubt it's as bad as the story makes it out to be.

        The article said:

        > Facebook contractors are required to use a browser extension to report every time they use the restroom, but during a recent illness, Lola quickly took all her allotted breaks. She had previously been written up for going to the bathroom too many times, she said, and so she felt afraid to get up from her desk. A manager saw that she was not feeling well, and brought a trash can to her desk so she could vomit in it. So she did.