



Author: (not listed)  Source: (not listed)  Date: 2016/9/26 9:11:26

Life, death and the limits of crowdsourcing morality





Today I have been both murderous and merciful. I have deliberately mown down pensioners and a pack of dogs. I have ploughed into the homeless, slain a couple of athletes and run over the obese. But I have always tried to save the children.



As I finish my session on the Moral Machine — a public experiment being run by the Massachusetts Institute of Technology — I learn that my moral outlook is not universally shared. Some argue that aggregating public opinions on ethical dilemmas is an effective way to endow intelligent machines, such as driverless cars, with limited moral reasoning capacity. Yet after my experience, I am not convinced that crowdsourcing is the best way to develop what is essentially the ethics of killing people. The question is not purely academic: Tesla is being sued in China over the death of a driver of a car equipped with its “semi-autonomous” autopilot. Tesla denies the technology was at fault.



Anyone with a computer and a coffee break can contribute to MIT’s mass experiment, which imagines the brakes failing on a fully autonomous vehicle. The vehicle is packed with passengers, and heading towards pedestrians. The experiment depicts 13 variations of the “trolley problem” — a classic dilemma in ethics that involves deciding who will die under the wheels of a runaway tram.



In MIT’s reformulation, the runaway is a self-driving car that can keep to its path or swerve; both options mean death and destruction. The choice can be between passengers and pedestrians, or between two sets of pedestrians. Calculating who should perish involves pitting more lives against fewer, young against old, professionals against the homeless, pregnant women against athletes, humans against pets.



At heart, the trolley problem is about deciding who lives and who dies — the kind of judgment that truly autonomous vehicles may eventually make. My “preferences” are revealed afterwards: I mostly save children and sacrifice pets. Pedestrians who are not jaywalking are spared, and passengers expended. The logic seems obvious: by choosing to climb into a driverless car, passengers should shoulder the burden of risk. As for my aversion to swerving, should caution not dictate that driverless cars be generally programmed to follow the road?



It is illuminating — until you see how your preferences stack up against everyone else’s. In the business of life-saving, I fall short — especially when it comes to protecting car occupants. Upholding the law and not swerving seem more important to me than to others; the social status of my intended victims, much less so.



We could argue over the technical aspects of dishing out death judiciously. For example, if we are to condemn car occupants, would we do so regardless of whether the passengers are children or criminals?



But to fret over such details would be pointless. If anything, this experiment demonstrates the extreme difficulty of reaching a consensus on the ethics of driverless cars. Similar surveys show that the utilitarian ideal of saving the greatest number of lives works pretty well for most people as long as they are not the roadkill.



I am pessimistic that we can simply pool our morality and subscribe to a norm — because, at least for me, the norm is not normal. This is the hurdle faced by makers of self-driving cars, which promise safer roads overall by reducing human error: who will buy a vehicle run on murderous algorithms they do not agree with, let alone a car programmed to sacrifice its occupants?



It is the idea of premeditated killing that is most troubling. That sensibility renders the death penalty widely unpalatable, and ensures abortion and euthanasia remain contentious areas of regulation. Most of us, though, grudgingly accept that accidents happen. Even with autonomous cars, there may be room for leaving some things to chance.