The Paceline Forum > General Discussion
  #16  
Old 03-19-2018, 06:22 PM
Kontact
Senior Member
 
Join Date: Apr 2011
Location: Sunny Seattle
Posts: 2,824
Unless the car swerved to hit the pedestrian, I have a hard time seeing how this was the fault of the computer system. The "driver" had full access to the brakes. I think the victim stepped out in front of a car that would have hit her no matter what was driving.
  #17  
Old 03-19-2018, 07:17 PM
marciero
Senior Member
 
Join Date: Jun 2014
Location: Portland Maine
Posts: 3,108
Quote:
Originally Posted by FlashUNC View Post
Some pretty high grade whataboutism going on in this one.

Whether this was avoidable or not in the software is just part of the larger issue that these cars will need to be programmed to make life or death decisions, including prioritizing in some cases who lives and who doesn't. Like a pedestrian outside a crosswalk.

But hey, let's use traffic deaths for cars driven by people to excuse one killed by a beta autonomous system. Because the streets can be places where we all serve as guinea pigs, right?
Only if the software performs as intended and still kills someone. But what the software (and the hardware) is intended to do and whether it performs as intended are two very distinct and separate issues. One is engineering and one is ethics. I don't understand why some people are freaked out about the prospect of codifying (literally) ethical decision-making. The alternative, what we have now, is no decision at all, leaving it to chance, since crashes happen too fast for humans to make or act on any decision.

On the other hand I can understand people being uncomfortable with the reliability of the software, even though all the evidence indicates that humans are far more error-prone than computers.
  #18  
Old 03-19-2018, 07:33 PM
unterhausen
Randomhead
 
Join Date: Dec 2013
Location: Happy Valley, Pennsylvania
Posts: 6,950
I don't know about Uber's cars, but Elon Musk insists his don't need lasers. No, they do need lasers.

This woman was either riding a bicycle or pushing one. I'm going with riding for now.
  #19  
Old 03-19-2018, 07:53 PM
goonster
Cranky!
 
Join Date: May 2006
Location: Cary, NC
Posts: 3,768
Quote:
Originally Posted by saab2000 View Post
If the occupant has override capability, I would assume the operator is responsible for accidents, since the operator presumably has the ability to avoid them.
It's either a self-driving ("driverless") car, or it isn't.

Watching the car like a hawk, poised to perform some lifesaving intervention in a split second, is more exhausting than just driving.

The greatest danger comes when we're not sure what the machine will do, or how much the machine is in control. Is it in automatic or manual? What mode is it in, and what does that mode do? (cf. AF447) Is the anti-lock/stability control halfway off, or all the way off? Is there a driver in that car, or is it fully autonomous?

Quote:
Originally Posted by FlashUNC View Post
these cars will need to be programmed to make life or death decisions, including prioritizing in some cases who lives and who doesn't.
Strongly disagree. The car doesn't make a life or death decision, it is programmed to do A or B. If programmed well, the car will respond consistently given the same set of inputs; it doesn't have a choice, per se.

Indeed, there are powerful ethical issues, with life and death consequences in play, but they are on us, not the machines.
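goonster's determinism point can be sketched in code: a well-programmed car executes a fixed policy, so identical inputs always produce the identical action; there is no "choice" at runtime. A minimal, purely hypothetical sketch (the class and function names and the 7 m/s² braking figure are illustrative assumptions, not any vendor's actual logic):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Inputs:
    obstacle_ahead: bool
    distance_m: float   # gap to the obstacle
    speed_ms: float     # current speed, m/s

def policy(s: Inputs) -> str:
    """A fixed, deterministic rule: programmed responses, not decisions."""
    if not s.obstacle_ahead:
        return "cruise"
    # brake hard if the braking distance (v^2 / 2a at an assumed
    # 7 m/s^2 deceleration) meets or exceeds the available gap
    if s.speed_ms ** 2 / 14.0 >= s.distance_m:
        return "emergency_brake"
    return "slow"

# The same inputs always yield the same action:
s = Inputs(obstacle_ahead=True, distance_m=20.0, speed_ms=18.0)
print(policy(s))                 # a fixed output for this input
print(policy(s) == policy(s))    # True: deterministic, no "choice"
```

The ethics live in whoever writes the thresholds, not in the function that evaluates them.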

IMHO, everybody involved in this Uber project who shrugs their shoulders and says "the first fatality was unavoidable, just a question of time" should have their engineering degrees rescinded.
__________________
Jeder geschlossene Raum ist ein Sarg. (Every enclosed room is a coffin.)
  #20  
Old 03-19-2018, 08:40 PM
FlashUNC
Senior Member
 
Join Date: Apr 2008
Location: Berkeley, CA
Posts: 14,452
Quote:
Originally Posted by marciero View Post
Only if the software performs as intended and still kills someone. But what the software (and the hardware) is intended to do and whether it performs as intended are two very distinct and separate issues. One is engineering and one is ethics. I don't understand why some people are freaked out about the prospect of codifying (literally) ethical decision-making. The alternative, what we have now, is no decision at all, leaving it to chance, since crashes happen too fast for humans to make or act on any decision.

On the other hand I can understand people being uncomfortable with the reliability of the software, even though all the evidence indicates that humans are far more error-prone than computers.
The braking system fails and the car has to determine whether to kill a pedestrian to save the driver's life, or vice versa. Who programs that? Who's responsible? If the pedestrian, is the car company prioritizing car owners' lives over the lives of the public? If the public, who's going to buy a car that doesn't want to keep them safe?

This kind of programming is Pandora's box, man.
  #21  
Old 03-19-2018, 09:21 PM
rwsaunders
Everything is connected
 
Join Date: Nov 2005
Location: Seaburgh
Posts: 11,201
MIT's Technology Review published these articles a few years back regarding the ethics of programming a driverless car...making split-second decisions of valuing the occupant more than the non-occupant, adult vs. child, squirrels vs. oncoming traffic, etc. Interesting questions indeed.

Here is an excerpt which is food for thought...Others believe the situation is a little more complicated. For example, Bryant Walker-Smith, an assistant professor at the University of South Carolina who studies the legal and social implications of self-driving vehicles, says plenty of ethical decisions are already made in automotive engineering. “Ethics, philosophy, law: all of these assumptions underpin so many decisions,” he says. “If you look at airbags, for example, inherent in that technology is the assumption that you’re going to save a lot of lives, and only kill a few.”

https://www.technologyreview.com/s/5...ammed-to-kill/

https://www.technologyreview.com/new...cal-decisions/
  #22  
Old 03-19-2018, 10:19 PM
jimcav
Senior Member
 
Join Date: Mar 2005
Posts: 4,690
Interesting dilemmas posed in those articles.

You'd think that three years later they would have actually had to do some programming for certain scenarios, even if just in "the lab," because they need to have the Three Laws-strong difference engine ready.


Quote:
Originally Posted by rwsaunders View Post
MIT's Technology Review published these articles a few years back regarding the ethics of programming a driverless car...making split-second decisions of valuing the occupant more than the non-occupant, adult vs. child, squirrels vs. oncoming traffic, etc. Interesting questions indeed.

Here is an excerpt which is food for thought...Others believe the situation is a little more complicated. For example, Bryant Walker-Smith, an assistant professor at the University of South Carolina who studies the legal and social implications of self-driving vehicles, says plenty of ethical decisions are already made in automotive engineering. “Ethics, philosophy, law: all of these assumptions underpin so many decisions,” he says. “If you look at airbags, for example, inherent in that technology is the assumption that you’re going to save a lot of lives, and only kill a few.”

https://www.technologyreview.com/s/5...ammed-to-kill/

https://www.technologyreview.com/new...cal-decisions/
  #23  
Old 03-19-2018, 10:34 PM
binxnyrwarrsoul
Senior Member
 
Join Date: Jun 2010
Location: SW CT, Queens/Brooklyn NY, Bizarro World
Posts: 6,179
Quote:
Originally Posted by jimcav View Post
I'm sure they will discuss it on CNN when I get home, if someone else hasn't been poisoned, or fired, or resigned to completely monopolize the news
Or there's even a dusting of snow.
__________________
Make mine lugged.
  #24  
Old 03-19-2018, 11:39 PM
Kontact
Senior Member
 
Join Date: Apr 2011
Location: Sunny Seattle
Posts: 2,824
Quote:
Originally Posted by goonster View Post
IMHO, everybody involved in this Uber project who shrugs their shoulders and says "the first fatality was unavoidable, just a question of time" should have their engineering degrees rescinded.
What should they do or say instead?


Unless a robot driver is able to break the laws of physics, there are always going to be accidents. The reaction can be nearly instantaneous, but brakes don't stop cars instantaneously, so some version of this accident was preordained.
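Kontact's physics point checks out with basic kinematics: even with a zero-delay reaction, the braking distance is v²/2a. A rough sketch, assuming a dry-pavement deceleration of about 7 m/s² (an illustrative figure; real values vary with tires, surface, and load):

```python
def stopping_distance_m(speed_mph: float, reaction_s: float = 0.0,
                        decel_ms2: float = 7.0) -> float:
    """Total stopping distance: reaction distance plus braking distance v^2/(2a)."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v * v / (2.0 * decel_ms2)

# Even a computer with zero reaction time needs roughly 23 m of
# pavement to stop from 40 mph under these assumptions.
print(round(stopping_distance_m(40.0), 1))  # 22.8
```

If the pedestrian appears inside that envelope, no controller, human or silicon, can avoid the impact by braking alone.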
  #25  
Old 03-20-2018, 02:25 AM
marciero
Senior Member
 
Join Date: Jun 2014
Location: Portland Maine
Posts: 3,108
Quote:
Originally Posted by FlashUNC View Post
The braking system fails and the car has to determine whether to kill a pedestrian to save the driver's life, or vice versa. Who programs that? Who's responsible? If the pedestrian, is the car company prioritizing car owners' lives over the lives of the public? If the public, who's going to buy a car that doesn't want to keep them safe?

This kind of programming is Pandora's box, man.
I'm pretty sure Mercedes-Benz, for example, has decided to prioritize passenger safety over public safety, with the rationale that the public would in fact end up safer in the end. They reason that with more people buying cars that prioritize their own safety, more of these safer cars would be on the road, making the roads safer for the public.
But yes, a Pandora's box in the sense that, as rwsaunders points out, it takes all of these interesting and sometimes disturbing Trolley Problem-type ethical dilemmas out of the purely academic realm.

Last edited by marciero; 03-20-2018 at 02:27 AM.
  #26  
Old 03-20-2018, 03:28 AM
soulspinner
Senior Member
 
Join Date: Dec 2003
Location: rochester, ny
Posts: 9,500
Quote:
Originally Posted by Kontact View Post
Unless the car swerved to hit the pedestrian, I have a hard time seeing how this was the fault of the computer system. The "driver" had full access to the brakes. I think the victim stepped out in front of a car that would have hit her no matter what was driving.
This. I'm in these cars every day. Unless the user disabled the system, something isn't right.
__________________
chasing waddy
  #27  
Old 03-20-2018, 06:16 AM
oldpotatoe
Proud Grandpa
 
Join Date: Oct 2009
Location: Republic of Boulder, USA
Posts: 47,036
Quote:
Originally Posted by jimcav View Post
So I saw a scrolling news alert that a self-driving Uber hit a woman pushing a bicycle (hence the sort-of OT) in AZ last night, apparently as she crossed an intersection. At work I can't access news sites (but can access this forum--duh). A quick search on my phone said the Uber had a "control driver" as backup. So I guess conditions must have been bad for both the self-driving electronics and the human pilot to miss seeing her. Just wondered if anyone had more details.
She wasn't in the intersection, is what I read, but walking across the road between intersections. Gotta wonder why she didn't see the car.
__________________
Chisholm's Custom Wheels
Qui Si Parla Campagnolo
  #28  
Old 03-20-2018, 07:37 AM
jimcav
Senior Member
 
Join Date: Mar 2005
Posts: 4,690
Well, I don't have "cable"; my Roku TV has limited channels.

Quote:
Originally Posted by binxnyrwarrsoul View Post
Or there's even a dusting of snow.
I haven't bought an HD antenna, and the only thing I've figured out how to add besides Amazon Prime and Netflix was CNNgo, so yes, that is what has been on.
And of course I was wrong; it was the Cambridge Analytica whistleblower last night, so I switched over to Prime and watched "Swiss Army Man," which I enjoyed.
  #29  
Old 03-20-2018, 08:49 AM
Mark McM
Senior Member
 
Join Date: Jun 2006
Posts: 11,987
Quote:
Originally Posted by goonster View Post
IMHO, everybody involved in this Uber project who shrugs their shoulders and says "the first fatality was unavoidable, just a question of time" should have their engineering degrees rescinded.
Pretty much what Kontact said. The only vehicle with zero possibility of hitting something (or someone) is one that never moves. We as a society have to decide what cost/risk vs. benefit we are willing to accept for autonomous vehicles, but no matter how hard engineers try, the risk of autonomous vehicles can never be zero - although it may be made much lower than the risk from human drivers.
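Mark McM's cost/risk framing can be made concrete with a back-of-the-envelope comparison of expected fatalities per mile driven. The rates below are made-up placeholders for illustration, not real statistics:

```python
def expected_fatalities(miles: float, rate_per_100m_miles: float) -> float:
    """Expected fatalities over a given mileage at a given per-100M-mile rate."""
    return miles * rate_per_100m_miles / 100_000_000

# Hypothetical rates: human drivers at 1.2 fatalities per 100M miles
# vs. an autonomous fleet at 0.4 (invented numbers, illustration only).
human = expected_fatalities(1_000_000_000, 1.2)
auto = expected_fatalities(1_000_000_000, 0.4)
print(human, auto)  # both are nonzero: any positive rate over enough miles kills
```

The point of the arithmetic is Mark McM's: you can push the rate down, but over billions of miles no positive rate yields zero deaths.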
  #30  
Old 03-20-2018, 09:11 AM
Tony T
Senior Member
 
Join Date: Feb 2012
Posts: 6,158
Quote:
Originally Posted by jimcav View Post
I'm sure they will discuss it on CNN when I get home, if someone else hasn't been poisoned, or fired, or resigned to completely monopolize the news
CNN?
The 24/7 "Stormy Daniels" channel?
They'll spend 10 minutes on the news and then get back to "Stormy"