Robotics and Law

The robotics researcher Klaus Schilling was recently interviewed by Germany's Technology Review magazine on the ethics of machines.

Schilling teaches robotics and telematics at the University of Würzburg and is co-founder of the DFG-funded project "Robotics and Law".

Technology Review: When a robot causes damage, who is responsible – the person or the robot?

Klaus Schilling: The human. But responsibility is not always easy to assign. In the past, humans made the decisions and machines merely carried them out. Today, however, we are seeing rapid development toward autonomously acting robots.

TR: When can responsibility not be clearly attributed to the manufacturer or operator?

Schilling: For example, when unforeseen conditions occur, such as extreme weather. The robot should recognize on its own when it exceeds the limits of safe operation – but so should the operator. And it becomes really interesting when a robot cannot simply be switched off, because doing so could cause even more damage.

TR: When can that be the case?

Schilling: The impetus for our project "Robotics and Law" was a vehicle that autonomously accompanies elderly people and transports them to destinations such as the pharmacy, post office, or doctor. When it encounters obstacles, it automatically stops or takes evasive action.

TR: When people react too slowly, hard decisions are already being delegated to machines – for example, shooting down a missile approaching a warship. But what if it is a harmless amateur pilot instead?

Schilling: Modern sensor systems can often detect situations better and faster than humans and make sound decisions. But like humans, robots will make errors.

TR: Should we then not go a step further and make robots legal persons?

Schilling: We engineers are working with lawyers and experts from other fields to find sound answers to such questions. It is conceivable, for example, to proceed by analogy with damage caused by children or pets. For insurance companies, an interesting field will open up here for offering innovative products.

TR: Could a computer plead diminished responsibility because it was infected by a virus?

Schilling: Certainly, in that case the programmer of the virus must be held to account. But manufacturers must also take precautions against viruses and hackers. Machines can likewise detect sensor failures and other malfunctions and react accordingly. The question, though, is how far such self-monitoring should go: a robot that deals only with itself can perform no duties.

TR: What is your project's goal?

Schilling: We want to create a legal and ethical foundation for assigning responsibility in view of the increasingly greater decision-making capabilities of machines. Such legal certainty will give developers guidance.
