Wednesday, January 16, 2013

Autonomous Robots: Disastrous or Beneficial


I want to start off by saying I really enjoyed reading about autonomous robots. It is exciting to see what military robots are capable of. I say, ‘Yes!’ to autonomous lethal targeting. Simply put, I believe robots like this can save more American lives, so it seems brainless to me not to say ‘yes’.
In the chapter “Robotic Gods: Our Machine Creators,” Singer mentions an autonomous robot that does “counter sniper” work. Whenever a sniper shoots at the Marines, the technology automatically points a machine gun at where the bullet came from. The robot does not automatically start firing; ultimately, the decision to engage a target is made by a human. I believe this “human-supervised autonomous weapon” is a great tool and can help save lives. There are two kinds of autonomous weapons: “human-supervised autonomous weapons,” which a human can shut down in the event of a weapon system failure, and non-human-supervised “autonomous weapons,” which lack such safety features. I don’t know why non-human-supervised autonomous weapons exist; they sound like a recipe for disaster. Mark Gubrud says, “This policy is basically, ‘Let the machines target other machines; Let men target men.’ Since the most compelling arms-race pressures will arise from machine-vs.-machine confrontation, this solution is a thin blanket, but it suggests some level of sensitivity to the issue of robots targeting humans without being able to exercise ‘human judgment’ — a phrase that appears repeatedly in the DoD Directive. This approach seems calculated to preempt the main thrust of HRW’s report, that robots cannot satisfy the principles of distinction and proportionality as required by international humanitarian law, therefore AWS should never be allowed.”
The Dragon Runner is another robot; it looks like a model car and can be used to “see around the corner.” Troops can toss it through a window, up some stairs, or into a cave, and the robot will land on its feet and send back video of whatever it sees.
According to the article “As Drone Use Surges, Pilots Report High Stress Levels,” nearly half of the military’s drone operators report high stress levels. Some operators had seen close-up video of women, children, and other civilians killed as collateral damage of drone strikes. Maybe non-human-supervised autonomous weapon systems are the answer to so many drone operators experiencing high stress. Although the pilots aren’t deep in enemy territory, they are the ones who “pull the trigger” on the drone strikes, and I think that guilt is what causes so much stress. However, if non-human-supervised autonomous weapons could be trusted to make the correct call when using lethal force, they could help with this problem. The robot would recognize the threat, address it, and decide whether or not to engage with lethal force. There would be a lot of room for error, and researchers and scientists would have to come close to perfecting such a weapon system, but if it worked, it would take a lot of the guilt off of the soldiers who now pull the “drones’ triggers.”

6 comments:

  1. Does the sniper robot save lives or kill snipers? I guess it depends on the job the sniper is supposed to carry out.

  2. I am not sure what to think of drones being able to operate without human control. Maybe it could be beneficial to lower the level of stress for drone operators. However, I think it is dangerous to let a robot make killing decisions. There is always room for error, no matter how "perfect" the robot may seem.

  3. Does it kill enemy snipers or save lives? I think they go hand in hand. If the enemy sniper is dead, he/she can't kill any Americans. Bwalsh12, I agree that it definitely is scary to think about robots making killing decisions. I agree that there is always room for error. The way drones and robots are advancing, I can see them heading in that direction in the future.

  4. If there is an enemy sniper firing shots at our troops, I believe we have the right to kill him/her. The main issue I have with autonomous weapons is the potential for them to take innocent lives. However, a man firing a sniper rifle at our troops can be considered an enemy combatant.

  5. It all comes down to what you consider to be a successful war venture. Is killing a sniper who is firing at our troops a right? What happens if the autonomous weapon, in the process of killing that sniper, kills more innocent people?

  6. So are American soldiers not to fire back at a sniper because of the chance it may also harm someone else? They need to react. They don't have time to think about it and debate behind their computer. They're getting shot at. Someone is trying to kill them. So whether it is a soldier or an autonomous weapon, I think he/she/it should fire back.
