Exploiting emotional labor

Casey Newton’s article on The Verge about the lives of Facebook moderators likely only adds to the growing rage against social networks. It’s worth a read. Even if stories like this tend to make the work seem worse than it usually is, it’s not a pretty picture.

Other journalists and bloggers have recently been writing about work and about how online communities operate. On work, see Derek Thompson’s recent Atlantic essay. Thompson observes the way work is now expected to function as one’s entire life, making it more like a religion than a job. Scott Alexander’s post on his attempts to moderate comments in his own little community is also worth considering.

These articles offer a chance to synthesize some varied thoughts about how our high-tech, information-rich, ultra-connected world is affecting us. Here is just one idea that these essays have made me think about.

As computers can do more and more, jobs will increasingly be about what only humans can do. Firms will look for ways to extract value from distinctively human abilities. This is what a lot of “information” jobs actually look like. They are not traditional “white collar” jobs; they’re not in management or administrative support. Instead, they are ways of leveraging the parts of the human mind that computers can’t duplicate yet.

For a few months I worked at a company where the task was to correct errors that computers made in reading documents. The computer did pretty well with the initial read, but any characters it was not confident in got passed to a human reader. The software we used was built to make us work as fast as possible. We didn’t need to read the entire document, only the few parts the computer couldn’t read. We were carefully tracked for speed and accuracy. Nowadays machine-learning technology has likely surpassed even human abilities in this domain, but the basic function of the human in that system is much like the Facebook moderators’ function: filling the gap between what the machine can do and what the product requires.
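
To make the division of labor concrete, here is a minimal sketch of that kind of human-in-the-loop pipeline. It is not the software we actually used; the names (CharRead, ask_human, the 0.95 cutoff) are hypothetical, and a real system would show the reviewer an image snippet rather than a text prompt.

```python
# Hypothetical human-in-the-loop OCR pipeline: the machine keeps every
# character it reads with high confidence and routes the rest to a person.
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.95  # assumed cutoff; real systems tune this empirically

@dataclass
class CharRead:
    char: str          # the machine's best guess
    confidence: float  # recognizer's confidence score, 0.0-1.0

def ask_human(read: CharRead) -> str:
    """Stand-in for the reviewer's screen: show the uncertain character
    and accept a correction (or confirm the machine's guess)."""
    answer = input(f"Machine guessed {read.char!r} ({read.confidence:.0%}). Correct character: ")
    return answer or read.char

def transcribe(reads: list[CharRead]) -> str:
    # Only low-confidence characters ever reach the human; everything else
    # passes straight through, which is what makes the job pure gap-filling.
    return "".join(
        r.char if r.confidence >= CONFIDENCE_THRESHOLD else ask_human(r)
        for r in reads
    )
```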

This gap-filling is what Newton’s article describes in the Facebook moderating company. Employees are asked to leverage their judgment in figuring out whether something is appropriate or not. Because judgments of this sort are hard to reduce to rules (note all the problems Facebook has in specifying the rules clearly), the task needs a tool that is good at interpreting and assessing an enormous amount of information. And human minds are just the thing.

Computers have gotten good at certain kinds of pattern recognition, but they are still not good at extracting meaning from context. Human beings do this all the time; indeed, we’re really, really good at it. So good, in fact, that people who aren’t much better at it than a computer strike us as odd or different.

The problem is that this task of judging content requires the human machines to deploy something they have and computers don’t. In Facebook’s case, that something is human emotions. Most of our evaluative assessments involve some kind of emotional component. The computer doesn’t have emotions, so Facebook needs to leverage the emotional assessments of actual people in order to keep its site clean.

These kinds of jobs are not particularly demanding on the human mind. Sometimes we call this kind of work “knowledge work,” but that’s a mistake. The amount of knowledge needed in these cases is little more than what a competent member of society would have. It would be better to call these jobs human work, or more precisely emotional work, because what is distinctive about them is the way they use human emotional responses to assess information. Moderators need to be able to understand the actions of other humans. But we do this all the time, so it’s not cognitively difficult. In fact, this is why Facebook can hire lots of relatively young, inexperienced workers. The human skills involved are not unusual.

The problem is that as those parts of us that are distinctively human become more valuable, there is also a temptation to separate them off from the actual person who has them, then track them and maximize their efficiency. In ordinary manual labor, it’s not so hard to exchange some effort and expertise for a paycheck. Faster and more skilled workers are more productive, and so can earn more. Marx notwithstanding, my labor and expertise are not really part of who I am, and expending them on material goods does not necessarily diminish or dis-integrate me. In contrast, my emotions and capacity for evaluative judgments are much closer to who I am, and so constantly leveraging those parts of me does prompt me to split myself into my “job” part and my “not-job” part. We might call this “emotional alienation,” and it is a common feature of service economies. We’re paying someone to feel for us, so that we don’t have to do it.

All this doesn’t mean we should give up content moderation, or even that moderator jobs are necessarily bad jobs. I have little doubt that there is tons of stuff put online every day that ought to be taken down. I am an Augustinian and a Calvinist, and harbor no illusions about the wisdom of the crowd. But we should be more aware of what it actually costs to find and remove the bad stuff. We enjoy social networks that are largely free of seriously objectionable and disturbing content. But someone has to clean all that off for us, and we are essentially paying for that person to expend emotional labor on our behalf. Social media seems “free,” but as we’re being constantly reminded, it really isn’t—not to us, and not to those who curate it for us.

So suppose Facebook, or Twitter, or YouTube actually paid their moderators whatever was necessary for their emotional and spiritual health, and gave them the working conditions under which they could cultivate these online experiences for us without sacrificing their own souls. How much would that be worth? I doubt our tech overlords care enough to ask that question. Maybe the rest of us should. Though we cannot pay them directly, we can, perhaps, reduce their load, exercise patience with them, and apply whatever pressure we can to their employers. This is, after all, the future of work. It’s in all of our interests to set the norms for distinctively human labor right now, while we still can.