By Professor Brenda K. Wiederhold
Anecdotal reports indicate that pilots who remotely bomb targets in Operation Enduring Freedom using unmanned aerial vehicles (UAVs) may be at greater risk for posttraumatic stress disorder (PTSD) than pilots who fly manned combat missions.
Associated Press reporter Scott Lindlaw suggests that there may be several reasons for this.
1. UAV pilots work longer shifts and tours than pilots deployed to a war zone.
2. Unlike manned aircraft, UAVs are often required to linger and assess the bomb damage, showing the pilot the resulting carnage in high-resolution detail.
3. Very little decompression time elapses between a pilot’s bombing run and being at home with spouse and family, resulting in a jarring transition between a virtual reality and a physical reality.
What helicopters were to Vietnam, UAVs are to Afghanistan – essential to the engagement and a symbol of technological superiority. The number of UAVs, or drones, has grown from fewer than 200 eight years ago to more than 7,000 today. Each Predator and Reaper aircraft has a two-person crew: an Air Force pilot, who flies the drone, and a sensor operator, who runs the camera and targeting laser. Drone pilots can complete training in just months, versus the years it takes to train an F-15 fighter pilot. Videogame expertise may be among the most valuable skills for armed services that depend increasingly on robots.
In a battlefield situation, soldiers and pilots grow conditioned to violence, which helps to “inoculate” them against the effects of stress. In contrast, UAV pilots talk about the unreality of working with partners they never meet against an enemy that exists (for them) only on a video screen. Perhaps this emotional disengagement is another reason for reports of PTSD: Subconsciously, these pilots may find it hard to justify killing when their own lives are not in danger.
But what if you’re in a “boots on the ground” position and a robot saves your life? What happens to your emotions then?
“The EOD [Explosive Ordnance Disposal] soldier carried a box into the robot repair facility at Camp Victory, Iraq. ‘Can you fix it?’ he asked, with tears welling in his eyes. Inside the box was a pile of broken parts. It was the remains of ‘Scooby-Doo,’ the team’s PackBot, which had been blown up by an IED [improvised explosive device]. On the side of Scooby’s ‘head’ was a series of handwritten hash marks, showing the number of missions that the little robot had gone on. All told, Scooby had hunted down and defused 18 IEDs and one car bomb, dangerous missions that had saved multiple human lives. ‘This has been a really great robot,’ the soldier told Master Sergeant Ted Bogosh, the Marine in charge of the repair yard.”
In addition to the UAVs, which range from the 48-foot Predator to the hand-thrown Raven, there are more than 12,000 unmanned ground vehicles, such as the lawnmower-sized PackBot, deployed to the Middle East and South Asia. Soldiers are projecting their emotions onto these machines, with undreamed-of consequences.
As they begin to bond with a robot as part of the team, soldiers may, for example, promote it to private first class and award it an EOD badge. But this same anthropomorphizing may lead an EOD soldier to run into enemy fire to “rescue” his robot – the exact opposite of what the robot was created to prevent. Perhaps the soldier, realizing he might not be alive without this machine, chooses not to view it as a machine at all. There may also be a physiological basis for this: when people look at robots, their mirror neurons fire, suggesting that at some level we regard robots as alive and deserving of empathy.
In the future, it may be desirable to create a PackBot with a slightly annoying personality, so that soldiers don’t feel so bad when it’s destroyed. And perhaps a “drone with a conscience” can someday replace the two-person UAV teams – though the field of roboethics, established in 2004, needs to grow quickly to address increasingly complex issues.
One issue in particular, beyond the obvious ethical concerns, will need due attention in the future – the popularized science fiction notion that robots will develop on their own, begin to think for themselves, and end up attacking their own troops. Such incidents have been reported, although on a small scale. In 2007, a robot killed nine soldiers and injured 14 others; the South African National Defence Force investigated the incident and blamed it on a “mechanical problem” – poor design or modification.
For the time being, the Department of Defense has a unique opportunity to conduct longitudinal studies of drone pilots and other human-robot teams to determine the extent and causes of their PTSD – and to help those who are suffering from this disabling condition. And it’s possible that Virtual Reality-assisted exposure therapy could be the treatment of choice for this and future groups of videogame-based warriors.
Create your own reality! Brenda Wiederhold
President of Virtual Reality Medical Institute (VRMI) in Brussels, Belgium.
Executive VP of Virtual Reality Medical Center (VRMC), based in San Diego and Los Angeles, California.
CEO of Interactive Media Institute, a 501(c)(3) non-profit.
Clinical Instructor in the Department of Psychiatry at UCSD.
Founder of the CyberPsychology, CyberTherapy, & Social Networking Conference.
Visiting Professor at Catholic University Milan.