AI, Drones, Empathy, Alienation, and the Gig Economy
The Verge has an important post on content moderation in the corporate hellhole that is Facebook which has implications both for the future of work (or, as we call it, “labor”) and for services that we, as consumers (reproducing our labor power) file mentally as algorithmic or robotic, but are in fact labor, performed remotely. However, I’m going to approach the Verge article indirectly, by looking at drones and drone operators — an earlier, prototype gig economy, if you think of enlistment as a gig — and the stresses that result from whacking faraway brown people remotely. Then I’ll look at Facebook, and then at other forms of remote labor (or, as it seems to be called, “telepresence”).
From the New York Times, “As Stress Drives Off Drone Operators, Air Force Must Cut Flights”
What had seemed to be a benefit of the job, the novel way that the crews could fly Predator and Reaper drones via satellite links while living safely in the United States with their families, has created new types of stresses as they constantly shift back and forth between war and family activities and become, in effect, perpetually deployed. “Having our folks make that mental shift every day, driving into the gate and thinking, ‘All right, I’ve got my war face on, and I’m going to the fight,’ and then driving out of the gate and stopping at Walmart to pick up a carton of milk or going to the soccer game on the way home — and the fact that you can’t talk about most of what you do at home — all those stressors together are what is putting pressure on the family, putting pressure on the airman,” Colonel Cluff said.
That’s the quote from the Colonel. The Independent has a different version, in “Secret US drone whistleblowers say operators ‘stressed and often abuse drugs and alcohol’ in rare insight into programme”
From as far as 8,000 miles away in their base in the Nevada desert, the men operated unmanned drones carrying Hellfire missiles, in places such as Afghanistan, Pakistan, Iraq and Yemen. …. [The operators] said they were encouraged to dehumanise their targets and even referred to the children they monitored with their drones as “tits”, or “terrorists in training”, or “fun-sized terrorists”. The four said they had struggled with depression and even suicidal thoughts since quitting. The operators said they were supposed to combine signals intelligence, imagery and human intelligence. Often they lacked one or more of these and yet they still proceeded with the kill missions. “The programme hemorrhages people. We don’t like it.”
And in a follow-up story in the Times, “The Wounds of the Drone Warrior”:
So, being a drone operator is “a bad job”[1] because:
Obviously, being a military drone operator is a limit case for “remote control” gigs, but are these characteristics really true for other gigs, like content moderation? I think they are.
Now let’s look at the Verge article (which I recommend you read in full). We’ll go through the working conditions for content moderators at Facebook as described by the whistleblowers, and see which of the above characteristics, as they emerged from describing the work of military drone operators, apply:
1. The nature of the gig mixes work time and private time.
2. The gig makes demands while not providing the tools to meet them.
“The stress they put on him — it’s unworldly,” one of [Keith] Utley’s managers told me. “I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired.” On the night of March 9th, 2018, Utley slumped over at his desk. Co-workers noticed that he was in distress when he began sliding out of his chair. Two of them began to perform CPR, but no defibrillator was available in the building. A manager called for an ambulance. Utley was pronounced dead a short while later at the hospital, the victim of a heart attack… [T]he moderators who work in these offices are not children, and they know when they are being condescended to. They see the company roll an oversized Connect 4 game into the office, as it did in Tampa this spring, and they wonder: When is this place going to get a defibrillator?
4. The gig makes enormous demands on empathy while not allowing the operator to offer help.
We are told that remote labor (“telepresence”) is the future for many jobs. Kara Swisher and Rani Molla of Recode/Decode have a very interesting interview with Louis Hyman, author of Temp[2]. From the transcript:
[LOUIS HYMAN] And I think part of this acceleration I wrote about in the book is this idea of digital migrants. So sometime in the next few years, we will see robots that are tele-operated by somebody else, and I think people aren’t as attentive to this as they need to be. …. I went to a lab a couple of years ago at Berkeley, and you could put on virtual goggles. Like we all now have these — well, I guess six people have the Oculus Rift or whatever. And you can run a robot body through that. And people there were very excited about this towel-folding robot that could see a towel and fold it. And I sat there for an hour waiting for this towel to be folded and it never could. I hate folding so I was super excited to see this. And I put the goggles on and I could fold the towel almost instantaneously… I could reach the robot’s arms and fold the towel. And I realized when I did this it was like, oh wow, I could do this anywhere. And so I can easily imagine the next couple years, some entrepreneur offering very cheap house-space robots the same way that Tesla used its own drivers to train its Autopilot, to use just hundreds of thousands of people around the world through some kind of online labor program in putting on virtual reality goggles somewhere in Bangladesh or Mexico. And then operating these robots. And then because of machine learning, the robots would learn how to do all kinds of manual tasks…. RM: Right, so everything that can be digitized, will be digitized. A lot of things will be automated. And it will be digitized by cheap people. This is the important part.
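The pipeline Hyman describes, where humans teleoperate robots and the logged demonstrations are then used to train the machine, is what robotics calls imitation learning (or behavioral cloning). A toy sketch of the idea, using invented state/action pairs in place of real robot logs and a nearest-neighbor lookup standing in for the neural network a real system would fit:

```python
# Toy behavioral cloning: learn a "policy" from teleoperation logs.
# The states and actions below are invented 2-D examples, not real
# robot data; they stand in for whatever a teleoperated robot records.
import math

# (state, action) pairs logged while a human teleoperated the robot
demonstrations = [
    ((0.0, 0.0), "reach"),
    ((0.5, 0.1), "grasp"),
    ((0.5, 0.9), "fold"),
    ((1.0, 1.0), "release"),
]

def policy(state):
    """Nearest-neighbor policy: copy the action taken in the closest
    demonstrated state. Real systems fit a neural network instead, but
    the principle is the same: the human's labor becomes training data."""
    nearest_state, action = min(
        demonstrations, key=lambda pair: math.dist(state, pair[0])
    )
    return action

print(policy((0.55, 0.85)))  # closest demonstration is (0.5, 0.9)
```

The point of the sketch is the one Hyman makes: every hour the remote worker spends in the goggles produces more (state, action) pairs, and those pairs are precisely what trains the replacement.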
Which, of course, Facebook being Facebook, it intends to do. Verge once more:
If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not. Instead, you would do what Facebook, Google, YouTube, and Twitter have done, and hire companies like Accenture, Genpact, and Cognizant to do the work for you. Leave to them the messy work of finding and training human beings, and of laying them all off when the contract ends. Ask the vendors to hit some just-out-of-reach metric, and let them figure out how to get there.
Now let’s look at a few of these futuristic remote labor gigs. Of the five characteristics listed above:
I would say that #1 and #2 are “normal” in the sense that most gigs head toward this baseline anyhow, kaching. #3 is, I think, inherent in remote labor; either you’re working alone or in a warehouse, and in any case you’re under a headset or staring into a screen. I think #5 will most often be a function of #4. So let’s look at potential demands on empathy. (It’s worth noting that in the literature on telepresence I’ve read, the developers focus on latency — that is, the response time necessarily created by remote operation. They don’t give any thought to the operators at all.)
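It’s easy to see why latency dominates the developers’ attention. A back-of-the-envelope calculation (my own illustration, with assumed speeds and round-trip times, not figures from any of the sources) shows how far a remotely operated vehicle keeps moving while the operator’s command is still in transit:

```python
# Back-of-the-envelope: distance a remotely operated vehicle covers
# while a control command makes one network round trip.
# All speeds and latencies below are illustrative assumptions.

def blind_distance_m(speed_kmh: float, round_trip_ms: float) -> float:
    """Meters traveled during one control round trip."""
    speed_ms = speed_kmh / 3.6          # km/h -> m/s
    return speed_ms * (round_trip_ms / 1000.0)

for vehicle, speed_kmh, latency_ms in [
    ("city car", 50, 100),              # assumed cellular round trip
    ("highway car", 120, 100),
    ("airliner at cruise", 900, 250),   # assumed satellite link
]:
    meters = blind_distance_m(speed_kmh, latency_ms)
    print(f"{vehicle}: {meters:.1f} m per round trip")
```

An airliner on an assumed quarter-second satellite link covers over sixty meters before any correction arrives — which is why latency fills the engineering literature, and why it’s striking that the operator’s inner life doesn’t.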
The first example: Remote pilots of commercial aircraft. CNBC:
Great. Would the remote pilot, for example, have had to follow Ethiopian Airlines Flight 302 all the way down to the ground? I’m guessing yes; one of the stressors for military drone operators is not being able to look away. Has consideration been given to the demands for empathy placed on the remote pilot?
The second example: Emergency medical drones. My Drone Authority:
Great. Will the remote paramedics be required to view a heart attack where the treatment is going wrong?
The third example: Remote drivers for robot cars. From Wired:
Total creepiness aside, will Livingston be prepared for what happens in case of a car crash, and will he have to monitor the screens — heightening things a little, here — while the bodies are pulled from the flaming vehicle?
In any case, the behavior of Facebook toward its moderators — as well as the general incentives to treat those who will be replaced by AIs as disposable — suggests that in all three cases, remote operators will be seeing events they will not be able to look away from, and which they will remember for the rest of their lives. In a bad way. If gig workers training the artificial intelligences that will replace them at their horrible jobs would lead to Fully Automated Luxury Communism, I might consider the sacrifice of their empathetic faculties worth it (especially if they were told, truthfully, that this was the goal, instead of being treated as disposable and fungible…. lumps of labor). Somehow, however, I don’t think that’s going to be the case.
Source: Nakedcapitalism.com
Powered by NewsAPI.org