Posts: 1,172
Joined: Mar 2020
Reputation:
44
Location: Austria
November 10th, 2020 at 6:45 AM
If people are injured or killed in traffic, whoever caused the accident must be held responsible. But who is at fault if an automatically driving car caused the accident? The German Traffic Court Conference is looking for answers.
For automakers, designers and software programmers, things could become risky in the future: they may be held criminally responsible if automatically driving vehicles cause an accident with injuries. Since there is currently no concrete legal framework, the 57th German Traffic Court Conference (VGT) in Goslar is dealing with the issue.
For the experts, one thing is clear: drivers must not be held accountable if there is a crash during autonomous driving. According to the German Road Safety Council, they should only bear criminal responsibility if they are able to control the automated system.
Automobile clubs demand clarity for vehicle drivers
The ADAC sees it similarly: vehicle drivers should only be prosecuted if they were driving the vehicle themselves or did not follow the system's request to take over control in time, a spokesman says.
[x] <= Drive in nail here for new display!
Posts: 2,920
Joined: Jul 2014
Reputation:
104
Location: VA
November 10th, 2020 at 12:26 PM
I think it's still the drivers. They're the ones taking the risk by letting the car take over. We're not at a point yet where cars can legitimately drive themselves.
Posts: 5,287
Joined: May 2013
Reputation:
181
Location: Where's North?
November 10th, 2020 at 12:42 PM
To some extent the company is still liable. The question becomes how to handle situations where, for better or for worse, the automatic cars are FAR safer than their human-driven counterparts. If more lives are saved by self-driving cars, should we sue the crap out of automakers when an honest mistake is made?
Unfortunately, that "honest mistake" still costs a life. It may save ten lives for the one that was lost, but a lost life is just that, a life that isn't coming back. It's still a sad reality, the automaker will still get sued, and if someone's family member isn't coming back, every legal avenue is up for grabs and will be pursued.
Ethical questions like these will probably never get answered in a way that humanity accepts across the board. I think that automakers should demonstrate a very good safety record and submit to independent audits. Once they have a trial run of several years behind them, they should be immune from lawsuits over routine incidents *unless* it can be shown that there was clear negligence. My reasoning is that a very well designed self-driving car will probably save far more lives than it costs, and ultimately it will benefit society and reduce fatalities on the road. We need to not miss the forest for the trees. It has the potential to become a very life-saving technology, if done right.
I'm not quite sure we've developed it to that level yet, however. We're probably still several years off, although we're getting closer.
Posts: 1,172
Joined: Mar 2020
Reputation:
44
Location: Austria
November 10th, 2020 at 1:20 PM
Here is the current legal situation:
In Germany and some other countries, the legal situation is clear, because there is a three-pillar model consisting of driver, owner and manufacturer liability. The driver is responsible for the driving task and must, for example, constantly monitor the vehicle when using partially automated driving functions and intervene in an emergency. If he does not fulfill this duty of care and thereby causes an accident, he, as well as the owner, is liable for the damage incurred. In addition, within the framework of product and producer liability, the manufacturer can be held liable for damage caused by a product defect.
##############
But then who would be so stupid as to take responsibility, as the driver, for a self-driving car that isn't fully developed? Who is so stupid as to buy one and play the test subject?
[x] <= Drive in nail here for new display!
Posts: 2,682
Joined: Jun 2013
Reputation:
105
Location: Tycho City, Tycho Crater, Luna.
November 10th, 2020 at 11:22 PM
I think it's simple really.
There shouldn't be autopilot in situations like neighborhoods and such.
It should be reserved for long, straight stretches of road like highways and rural areas where it makes sense,
or for cases like commercial transport, where the autopilot can take over when the driver is fatigued, at least long enough for the co-driver to take over, eliminating the need for rest stops.
We just aren't at a place where automation can replace humans in situations beyond that.
Instead, I propose a blanket ban on autopilot, outside of special permits (self-driving taxis, for example), in restricted areas like neighborhoods and off major roads, for public safety.
"I reject your reality and subsitute my own." - Adam Savage, Mythbusters