
OT: First Tesla autopilot death


Tickdoc
06-30-2016, 07:23 PM
http://www.sfgate.com/business/technology/article/Self-driving-car-driver-died-after-crash-in-8334982.php

chiasticon
06-30-2016, 07:52 PM
ugh. that's sad. crazy technology man...

local dude to me. I grew up in Canton, and my parents still live near there. he's just a few years older than me too, but I didn't know him.

rest in peace.

seric
06-30-2016, 07:53 PM
Interesting: it appears 1 death in 130 million Autopilot miles driven does beat the national average of 1 death per 100 million miles driven, though not by the 50% margin Elon Musk has claimed. Really too small a sample size to come to any conclusions.
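
For what it's worth, that sample-size caveat can be made concrete: with a single observed fatality, the exact Poisson confidence interval on the underlying rate is enormous. A minimal sketch in Python (needs SciPy; the mileage figures are the ones quoted in this thread):

```python
# 1 fatality in 130M Autopilot miles vs. ~1 per 100M miles nationally.
# Exact 95% Poisson bounds on the expected count, via the chi-squared relation.
from scipy.stats import chi2

deaths, miles = 1, 130e6
lo = chi2.ppf(0.025, 2 * deaths) / 2          # lower bound on expected count
hi = chi2.ppf(0.975, 2 * (deaths + 1)) / 2    # upper bound on expected count

# Scale to deaths per 100 million miles: roughly 0.02 to 4.3.
# The national average (~1.0) sits comfortably inside, so this one data
# point can't show Autopilot is safer or more dangerous than average.
print(lo / miles * 100e6, hi / miles * 100e6)
```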

unterhausen
06-30-2016, 08:33 PM
from the videos I've seen of it in action, the Tesla "autopilot" works almost exactly as badly as the "lane keeper assist" in my 2010 Toyota, although I think Toyota was a little smarter, in that it really forces you to keep driving.

enr1co
06-30-2016, 08:58 PM
Have you driven a Tesla with Autopilot? It's a lot more advanced than 2010 lane-keeper-assist technology.

Louis
06-30-2016, 09:04 PM
I don't care how advanced an "autopilot" system is, I wouldn't use it unless 100% of the cars, trucks and other vehicles on the road with me were also using it.

(Except bicycles - I wouldn't be afraid of them because I could just mow them down like plenty of other drivers have already done.)

Louis
06-30-2016, 09:09 PM
I just thought of something else - it's time to get a Tesla and PARTY!

I can now drive while blithering drunk (or otherwise distracted) and don't have to worry about anything! Even better than a party bus.

http://los-angeles-limos.com/wp-content/uploads/2014/10/Girls-Cups-Hollywood-Party-Bus.jpg

chazmtb
06-30-2016, 09:14 PM
Williston has good riding roads in the Gainesville/North Central Florida area. Just bringing this closer to a cycling topic.

Tickdoc
06-30-2016, 09:17 PM
Williston has good riding roads in the Gainesville/North Central Florida area. Just bringing this closer to a cycling topic.

It's just a matter of time before it is a cycling topic, unfortunately.:crap:

Louis
06-30-2016, 09:22 PM
It's just a matter of time before it is a cycling topic, unfortunately.:crap:

So the article said that the Tesla couldn't distinguish between the color of the truck and the color of the sky. That means that the car is only doing image processing of info from a camera. I think eventually they'll have to go one step further and use radar for that sort of thing, but that's pretty far in the future. And even then, someone on a bicycle doesn't have a huge radar cross-section. But you could increase your RCS with one of these (but a smaller one, obviously):

http://www.duckworksmagazine.com/09/images/letters/commercialreflector.jpg
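
For a sense of scale on that reflector idea: the peak RCS of a trihedral corner reflector grows with the fourth power of its size. A rough sketch, assuming a triangular trihedral and a 77 GHz automotive radar; the 10 cm size is an arbitrary illustration, not a recommendation:

```python
import math

# Peak radar cross-section of a triangular trihedral corner reflector:
#   sigma = 4 * pi * a^4 / (3 * lambda^2), with a = edge length.
f = 77e9                    # automotive radar frequency (Hz), an assumption
lam = 3e8 / f               # wavelength, ~3.9 mm
a = 0.10                    # 10 cm reflector edge, hypothetical size

sigma = 4 * math.pi * a**4 / (3 * lam**2)
print(f"{sigma:.1f} m^2")   # ~27 m^2, vs. very roughly 1 m^2 for a cyclist
```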

seric
06-30-2016, 10:00 PM
The Autopilot system actually uses a camera in combination with a front-facing radar and 12 ultrasonic sensors for short-range, 360-degree vision. I wouldn't recommend relying on any "government provided records" for the truth in just about any matter, especially when technical details are involved.

I suspect the accident had more to do with the height of the vehicle confusing the radar.

Louis
06-30-2016, 10:06 PM
Lucky for Tesla the fatality wasn't a 10 year old kid who'd suddenly run out into the street. That would be major bad news (and will eventually happen).

MattTuck
06-30-2016, 10:14 PM
Real-life consequences of "What is Toronto?".

Elon Musk is the king of self-promotion and vaporware. Most car companies have this technology in various stages of development, and indeed Volvo has come out publicly saying how bad the hybrid model that Tesla uses really is. Namely, it drives in good conditions and then gives control back to the driver when it is overwhelmed or confused. Look at aviation, where they have millions of hours of experience, and guess where a significant number of accidents originate: the hand-off between autopilot and the human pilot.

Sigh, Musk (Tesla) is apparently OK using their customers as real-life crash test dummies. Hey, it's car industry 2.0: lean start-up, rapid prototyping, etc., etc.

Sad, for sure. But not surprising.

Louis
06-30-2016, 10:23 PM
Sigh, Musk (Tesla) is apparently OK using their customers as real-life crash test dummies.

The drivers' lawyers would probably deny this in court when folks try to sue Tesla, but I think most of their customers realize, deep down, that when you want to be at the very bleeding edge of fashion and technology, you have to be willing to pay the price in more than just dollars.

Macadamia
06-30-2016, 10:42 PM
The drivers' lawyers would probably deny this in court when folks try to sue Tesla, but I think most of their customers realize, deep down, that when you want to be at the very bleeding edge of fashion and technology, you have to be willing to pay the price in more than just dollars.

The owners of Teslas are willing to die for their car? Am I reading this right?

Louis
06-30-2016, 10:56 PM
The owners of Teslas are willing to die for their car? Am I reading this right?

That's not what I said, but even if it was, they wouldn't be the only drivers (or motorcyclists) out there who are.

Edit: or bicyclists.

seric
06-30-2016, 11:13 PM
Real-life consequences of "What is Toronto?".

Elon Musk is the king of self-promotion and vaporware. Most car companies have this technology in various stages of development, and indeed Volvo has come out publicly saying how bad the hybrid model that Tesla uses really is. Namely, it drives in good conditions and then gives control back to the driver when it is overwhelmed or confused. Look at aviation, where they have millions of hours of experience, and guess where a significant number of accidents originate: the hand-off between autopilot and the human pilot.

Sigh, Musk (Tesla) is apparently OK using their customers as real-life crash test dummies. Hey, it's car industry 2.0: lean start-up, rapid prototyping, etc., etc.

Sad, for sure. But not surprising.

It appears you have not ridden in a Tesla being used in autopilot mode to witness the warnings that need to be agreed to in order to initiate autopilot mode. These warnings include keeping your hands on the wheel and being ready to take over at all times. The system will even slow the vehicle down if it does not sense the driver's hands on the wheel.

I also doubt the 10-year-old-kid scenario is likely to happen even at this stage in their development. Even in this case, as I've previously pointed out, the autopilot system still has a better statistical track record than the U.S. average.

1happygirl
06-30-2016, 11:19 PM
From the Tesla Blog:
https://www.teslamotors.com/blog/tragic-loss

A Tragic Loss
The Tesla Team | June 30, 2016
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.

It is refreshing to me that they acknowledged this person and his family.
A lot of auto companies now and in the past, to my understanding, have not done this.
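
The hands-on check the blog describes (visual alert, audible alert, then a gradual slowdown) amounts to a simple escalation loop. A minimal sketch of that kind of logic; all the timing and speed numbers here are invented for illustration, since Tesla doesn't publish them:

```python
# Illustrative escalation logic for a hands-on-wheel check, per the blog's
# description: visual alert, then audible alert, then gradual slowdown.
# All thresholds are made up for the example.
def watchdog_step(seconds_hands_off: float, speed: float) -> tuple[str, float]:
    if seconds_hands_off < 5:
        return "ok", speed
    if seconds_hands_off < 10:
        return "visual_alert", speed
    if seconds_hands_off < 15:
        return "audible_alert", speed
    # Gradually reduce speed until hands-on is detected again.
    return "slowing", max(0.0, speed - 2.0)  # shed 2 mph per check

# Example: driver ignores the alerts; the car bleeds off speed step by step.
speed = 65.0
for t in range(12, 22):
    state, speed = watchdog_step(t, speed)
    print(t, state, round(speed, 1))
```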

cat6
06-30-2016, 11:42 PM
Tragic. I wonder how many miles are ridden on MTBs between grizzly bear deaths, compared to the 130 million Autopilot miles logged by Tesla before this fatality.

We're all in a Petri dish daily; there is nothing unique about Tesla.

How many people died today as a result of someone texting?

Louis
06-30-2016, 11:50 PM
We're all in a Petri dish daily; there is nothing unique about Tesla.

How many people died today as a result of someone texting?

I agree - in the grand scheme of things they're probably safer than many of the alternatives. But as Tesla acknowledged in the memo shown above, they're still in the learning phase. I'm sure when horseless carriages first showed up on the "roads" there were plenty of incidents that would never have happened if folks were still on horseback or in carriages.

rileystylee
07-01-2016, 03:50 AM
Sad, but at least he didn't kill anyone else.
Having that amount of trust in technology is just daft.

tuscanyswe
07-01-2016, 04:18 AM
It appears you have not ridden in a Tesla being used in autopilot mode to witness the warnings that need to be agreed to in order to initiate autopilot mode. These warnings include keeping your hands on the wheel and being ready to take over at all times. The system will even slow the vehicle down if it does not sense the driver's hands on the wheel.

I also doubt the 10-year-old-kid scenario is likely to happen even at this stage in their development. Even in this case, as I've previously pointed out, the autopilot system still has a better statistical track record than the U.S. average.

I don't know the first thing about Tesla's autopilot, but I wouldn't be surprised if the autopilot was only used on the freeway and in other scenarios where accidents are less frequent (per mile) even when cars are driven by humans.

Which in reality would mean that the accident ratio is actually worse than the U.S. average. This seems to be the norm when companies talk about statistics these days: always painted in the best light.
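
That selection-bias point is easy to put in numbers. A sketch with a hypothetical freeway rate (the halving is an assumption for illustration, not a measured figure):

```python
# If Autopilot miles are mostly freeway miles, the right baseline is the
# freeway fatality rate, not the all-roads rate. Hypothetical numbers:
overall_rate = 1.0 / 94e6      # all-roads US fatality rate (from Tesla's blog)
freeway_rate = 0.5 / 94e6      # assume freeways are ~2x safer per mile

autopilot_rate = 1.0 / 130e6   # 1 death in 130M Autopilot miles

print(autopilot_rate < overall_rate)   # True: beats the all-roads average...
print(autopilot_rate < freeway_rate)   # False: ...but not the freeway baseline
```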

verticaldoug
07-01-2016, 05:18 AM
Real-life consequences of "What is Toronto?".

Elon Musk is the king of self-promotion and vaporware. Most car companies have this technology in various stages of development, and indeed Volvo has come out publicly saying how bad the hybrid model that Tesla uses really is. Namely, it drives in good conditions and then gives control back to the driver when it is overwhelmed or confused. Look at aviation, where they have millions of hours of experience, and guess where a significant number of accidents originate: the hand-off between autopilot and the human pilot.

Sigh, Musk (Tesla) is apparently OK using their customers as real-life crash test dummies. Hey, it's car industry 2.0: lean start-up, rapid prototyping, etc., etc.

Sad, for sure. But not surprising.

It reminds me of an old quote about Bill Gates. Gates was discussing how much computers had improved over 20 years compared to cars. I think GM had the best comeback: except that in a car, when you need your brakes, you can't Ctrl-Alt-Delete and reboot because the system is frozen.

verticaldoug
07-01-2016, 05:20 AM
GM replies to Bill Gates


At a recent computer expo (COMDEX), Bill Gates reportedly compared the computer industry with the auto industry and stated "if GM had kept up with the technology like the computer industry has, we would all be driving $25.00 cars that got 1,000 miles to the gallon."

In response to Bill's comments, General Motors issued the following press release -

If GM had developed technology like Microsoft, we would all be driving cars with the following characteristics -

1. For no reason whatsoever, your car would crash twice a day.

2. Every time they repainted the lines in the road, you would have to buy a new car.

3. Occasionally your car would die on the freeway for no reason. You would have to pull over to the side of the road, close all of the windows, shut off the car, restart it, and reopen the windows before you could continue. For some reason you would simply accept this.

4. Occasionally, executing a maneuver such as a left turn would cause your car to shut down and refuse to restart, in which case you would have to reinstall the engine.

5. Only one person at a time could use the car unless you bought "car NT", but then you would have to buy more seats.

6. Macintosh would make a car that was powered by the sun, was reliable, five times as fast and twice as easy to drive - but would only run on five percent of the roads.

7. The oil, water temperature, and alternator warning lights would all be replaced by a single "General Protection Fault" warning light.

8. The airbag system would ask "are you sure?" before deploying.

9. Occasionally, for no reason whatsoever, your car would lock you out and refuse to let you in until you simultaneously lifted the door handle, turned the key and grabbed hold of the radio antenna.

10. GM would require all car buyers to also purchase a deluxe set of Rand McNally road maps (now a GM subsidiary), even though they neither need nor want them. Attempting to delete this option would immediately cause the car's performance to diminish by 50% or more. Moreover, GM would become a target for investigation by the Justice Dept.

11. Every time GM introduced a new car, car buyers would have to learn to drive all over again because none of the controls would operate in the same manner as the old car.

12. You'd have to press the "Start" button to turn the engine off.

JStonebarger
07-01-2016, 05:41 AM
I also doubt the 10-year-old-kid scenario is likely to happen even at this stage in their development. Even in this case, as I've previously pointed out, the autopilot system still has a better statistical track record than the U.S. average.

What, there's some kind of technological "no kid zone" around each Tesla?

For the record, what does the autopilot do in that scenario, endanger the driver/passengers or run the kid down?

rnhood
07-01-2016, 06:43 AM
At least the guy never knew what happened. High speed, then decapitated while watching a movie. Autopilot in a car... what a stupid idea.

AngryScientist
07-01-2016, 06:52 AM
i think we are wading into some pretty convoluted waters with these early autopilot systems.

i'm conflicted, because i do think that the human element is responsible for a ton of wrecks, and if very sophisticated technology can remove some of that human element from the equation and keep us safer on the road, it's probably a good thing.

i do think that the technology will be abused though. it will be taken as an excuse to get behind the "wheel" drunk or impaired, or completely distracted. i don't know what happened here, but the initial paragraphs of that news story claim this guy was watching a movie while bombing down the road? if that's true it's inexcusable, especially given this early, relatively unproven tech.

this will be interesting to watch develop and see where it goes.

FlashUNC
07-01-2016, 07:16 AM
At some point, they're going to have to build the ability to kill into a fully autonomous car. It's the trolley dilemma in real life.

Tickdoc
07-01-2016, 08:44 AM
i think we are wading into some pretty convoluted waters with these early autopilot systems.

i'm conflicted, because i do think that the human element is responsible for a ton of wrecks, and if very sophisticated technology can remove some of that human element from the equation and keep us safer on the road, it's probably a good thing.

i do think that the technology will be abused though. it will be taken as an excuse to get behind the "wheel" drunk or impaired, or completely distracted. i don't know what happened here, but the initial paragraphs of that news story claim this guy was watching a movie while bombing down the road? if that's true it's inexcusable, especially given this early, relatively unproven tech.

this will be interesting to watch develop and see where it goes.

Agreed. My wife's new Volvo is semi-autonomous. Its lane assist and adaptive cruise work very well, but not so well that I want to take my hands off the wheel or my eyes off the road. It is more effective as a way to take your feet off the pedals on road trips, or for crawling through sprawl. I have had three or four encounters with it so far where it puts on the brakes for you (e.g. if someone pulls out in front of you), and it is very effective there. It's also super effective in parking lots where there are kids/people or other cars backing up.

I have also had a few near misses where, say on a narrow road, a large truck or trailer with wide wheels is riding the line in oncoming traffic, and the lane assist will try to steer you back into the lane when you need to nudge the other way to avoid getting clipped. It is easily overridden, but only if you are paying attention. The feeling of fighting your car to keep from getting clipped is not good.

It still gives me an uneasy feeling using it though. I'm just not there yet.

Sierra
07-01-2016, 08:55 AM
At this stage the technology is in a limbo state. The danger comes when the driver becomes complacent, turns control completely over to the technology, and fails to be properly vigilant. So if you have to be that attentive, you may as well drive the damned thing and be done with it.

MattTuck
07-01-2016, 08:58 AM
It appears you have not ridden in a Tesla being used in autopilot mode to witness the warnings that need to be agreed to in order to initiate autopilot mode. These warnings include keeping your hands on the wheel and being ready to take over at all times. The system will even slow the vehicle down if it does not sense the driver's hands on the wheel.

I also doubt the 10-year-old-kid scenario is likely to happen even at this stage in their development. Even in this case, as I've previously pointed out, the autopilot system still has a better statistical track record than the U.S. average.

So, what is your explanation of the accident, if the warnings were all sufficient?

Here's the article I was mentioning.
http://www.theverge.com/2016/4/27/11518826/volvo-tesla-autopilot-autonomous-self-driving-car

Victor says that Volvo believes that Level 3 autonomy, where the driver needs to be ready to take over at a moment's notice, is an unsafe solution. Because the driver is theoretically freed up to work on email or watch a video while the car drives itself, the company believes it is unrealistic to expect the driver to be ready to take over at a moment's notice and still have the car operate itself safely. "It's important for us as a company, our position on autonomous driving, is to keep it quite different so you know when you're in semi-autonomous and know when you're in unsupervised autonomous," he says.

Volvo's Drive Me autonomous car, which will launch in a public pilot next year, is a Level 4 autonomous car — this means not only will it drive itself down the road, but it is capable of handling any situation that it comes across without any human intervention. As a result, the human doesn't need to be involved in the driving at all. If something goes wrong, the car can safely stop itself at the side of the road.

"In our concept, if you don't take over, if you have fallen asleep or are watching a film, then we will take responsibility still," says Victor. "We won't just turn [autonomous mode] off. We take responsibility and we'll be stopping the vehicle if you don't take over." Unsaid here is that in its current "beta" incarnation (which customers have to pay thousands of dollars to enable) Tesla's Autopilot can suddenly turn itself off if it gets into trouble, and the driver must take over immediately or bad things can happen.

"That's a really important step in terms of safety, to make people understand that it's only an option for them take over," says Victor. Volvo is "taking responsibility both for crash events, and we're also programming it for extreme events like people walking in the road even where they're not supposed to be. There's a massive amount of work put into making it handle a crash or conflict situations."

steelbikerider
07-01-2016, 08:59 AM
An ethics question for the not-too-distant future:
When most of us have smart cars, a non-smart car interferes, and a crash is inevitable, who gets saved?
Will the cars and their occupants be ranked as to who is most important?

verticaldoug
07-01-2016, 09:04 AM
https://openai.com/blog/concrete-ai-safety-problems/

Concrete Problems in AI Safety

OpenAI (the consortium started by Musk) contributed to the paper published last week dealing with 'accidents' in machine learning systems.

There was also the article in the NYTimes last week dealing with moral dilemmas. 'Should a driverless car hit a pedestrian to save your life?'

http://www.nytimes.com/2016/06/24/technology/should-your-driverless-car-hit-a-pedestrian-to-save-your-life.html

This accident is still just the easy stuff.

54ny77
07-01-2016, 09:07 AM
Helluva question. What if a smart car is riding alongside a group of cyclists and a dumb car is about to T-bone or smash into it? The smart car's crash-avoidance move would be to swerve to save itself, but at the expense of plowing into the cyclists. Who gets priority?

For example, there was an Air Force pilot recently (a month or two ago) who, on a training exercise when something went very wrong with the plane, opted to put it down at the cost of his own life rather than have the jet crash into a residential area, at great cost of life. Dunno if that kind of logic could be programmed into a smart car.

In the example I posted above, what if it's 4 people in the smart car and only 1 cyclist on the side of the road?

A conundrum indeed.

An ethics question for the not-too-distant future:
When most of us have smart cars, a non-smart car interferes, and a crash is inevitable, who gets saved?
Will the cars and their occupants be ranked as to who is most important?

Sierra
07-01-2016, 09:10 AM
These moral/ethical implications are fascinating!

We've turned over driving responsibilities to the MACHINE; and GOD has turned over his rolling of the existential dice to the MACHINE as well, it seems! What a universe we live in!

This would be a helluva programming project!

Tickdoc
07-01-2016, 09:17 AM
Helluva question. What if a smart car is riding alongside a group of cyclists and a dumb car is about to T-bone or smash into it? The smart car's crash-avoidance move would be to swerve to save itself, but at the expense of plowing into the cyclists. Who gets priority?

For example, there was an Air Force pilot recently (a month or two ago) who, on a training exercise when something went very wrong with the plane, opted to put it down at the cost of his own life rather than have the jet crash into a residential area, at great cost of life. Dunno if that kind of logic could be programmed into a smart car.

In the example I posted above, what if it's 4 people in the smart car and only 1 cyclist on the side of the road?

A conundrum indeed.

Very basic problems worry me even more... like wind shear, potholes, heavy rain, and ice. There are so many variables that we are adept at handling (most of the time) that I cannot fathom how a car could handle them all with a decent outcome.

I'm afraid a controlled road surface would be required for them to be truly effective, and that is a very expensive proposition in a country as vast as this.

saab2000
07-01-2016, 09:27 AM
Airplane autopilots are incredibly reliable, but they're mostly dumb devices and will only do what they are told to do. They have essentially no autonomy. The pilots (2) must be ready to assume control at any time. Even collision and terrain avoidance are not automatic: the system will warn the pilots, but the pilots must hand-fly the maneuver.

It will be interesting to see where this goes.

A buddy of mine with adaptive cruise control says it's amazing technology and he's a very good driver, always at the ready.

verticaldoug
07-01-2016, 09:55 AM
These moral/ethical implications are fascinating!

We've turned over driving responsibilities to the MACHINE; and GOD has turned over his rolling of the existential dice to the MACHINE as well, it seems! What a universe we live in!

This would be a helluva programming project!

It should be easy given modern societal values. We all have an RFID chip implanted, and the system then selects the poorest person/group. If you don't have an RFID chip, it defaults to you not being part of the system, and you're selected.

I say it in jest, but somehow this seems like a good solution for the guy in the Bentley.

Sierra
07-01-2016, 10:04 AM
It should be easy given modern societal values. We all have an RFID chip implanted, and the system then selects the poorest person/group. If you don't have an RFID chip, it defaults to you not being part of the system, and you're selected.

I say it in jest, but somehow this seems like a good solution for the guy in the Bentley.


Of course! Leave it to me to make things more complicated than they really are! :D

FlashUNC
07-01-2016, 10:09 AM
The first fully autonomous car is also the first car programmed to kill. It's inevitable.

saab2000
07-01-2016, 10:15 AM
The first fully autonomous car is also the first car programmed to kill. It's inevitable.

Being programmed to prioritize is not the same as being programmed to kill. The implication you make in this statement is that the device will have nefarious intent, which will not be the case. Sometimes choices have to be made. Choosing the lesser of two bad options is not the same as being programmed to kill, IMHO.

There is no such thing as absolute safety or absolute security in anything in our lives.

I'm surprised this story of an owner who likely misused his vehicle's technology beyond its intended limitations is such a big story.

"Autopilot" in the case of Tesla is nothing more than a marketing term. It doesn't mean, "Push the A/P button on - go to sleep". The driver is always the last line of defense and it is his/her responsibility to understand the limitations of the technology.

Sierra
07-01-2016, 10:15 AM
The first fully autonomous car is also the first car programmed to kill. It's inevitable.


At first I didn't pay any attention to this thread because I don't have a dog in the fight, as it were.

But, yeah--the thought that now cars must inevitably be programmed to kill is just mind-boggling.

I guess the program would have to key off of something truly random (atmospheric noise, etc.; in more archaic terms, this would play the part of the old firing squad). It is an interesting technical problem. And terrible in its implications . . . .

saab2000
07-01-2016, 10:18 AM
But, yeah--the thought that now cars must inevitably be programmed to kill is just mind-boggling.



How are cars being programmed to kill?

MattTuck
07-01-2016, 10:23 AM
How are cars being programmed to kill?

Like this.

Deda-duh-duh-da-DA!

http://starsmedia.ign.com/stars/image/article/901/901831/ocd-the-terminator-skull-20080821034033022-000.jpg

Sierra
07-01-2016, 10:24 AM
How are cars being programmed to kill?

We're not there yet. But, eventually it will need to be part of the calculus.

seanile
07-01-2016, 10:28 AM
How are cars being programmed to kill?

if a crash presents an inevitable death by way of two options, 1) the death of the passenger, or 2) the death of someone outside of the car... the car will have to decide who will die and who it will protect.
"kill" is a poor term due to intent, but the car is the object causing the death, and "let die" doesn't have enough weight imo.

as was already mentioned: https://en.wikipedia.org/wiki/Trolley_problem
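
In code, the "decide who it will protect" step reduces to minimizing a harm function over candidate maneuvers. A toy sketch only, not any manufacturer's logic; the weights are precisely the ethical unknowns being debated here:

```python
# Toy trolley-style chooser: pick the maneuver with the lowest expected harm.
# The harm weights encode whose safety counts for how much; setting them is
# the ethical decision, and "equal weights" is itself a choice.
def choose_maneuver(options, w_occupant=1.0, w_bystander=1.0):
    # options: list of (name, p_harm_occupants, p_harm_bystanders)
    return min(
        options,
        key=lambda o: w_occupant * o[1] + w_bystander * o[2],
    )[0]

options = [
    ("brake_straight", 0.9, 0.0),   # likely harms occupants only
    ("swerve_right",   0.1, 0.7),   # likely harms bystanders instead
]
print(choose_maneuver(options))                    # 'swerve_right' (0.8 < 0.9)
print(choose_maneuver(options, w_bystander=2.0))   # 'brake_straight' (0.9 < 1.5)
```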

saab2000
07-01-2016, 10:30 AM
We're not there yet. But, eventually it will need to be part of the calculus.

I don't agree. We all make choices in life as part of instinctual risk mitigation. Making a choice that has higher potential risk isn't the same as being programmed to kill.

AngryScientist
07-01-2016, 10:32 AM
I don't agree. We all make choices in life as part of instinctual risk mitigation. Making a choice that has higher potential risk isn't the same as being programmed to kill.

agree absolutely. "programmed to kill" is needlessly sensationalizing reality.

Sierra
07-01-2016, 10:35 AM
I don't agree. We all make choices in life as part of instinctual risk mitigation. Making a choice that has higher potential risk isn't the same as being programmed to kill.


What is instinct if not a computer program of sorts?

54ny77
07-01-2016, 10:44 AM
At that point, it's semantics, no?

In a binary scenario (protect vehicle occupants vs. not), what else would it be called, risk mitigation?

Someone has to write instructions (code), and the AI aspect will determine what to do.

All that said, humans make plenty of the same stupid decisions so the end result will probably be the same--death.


I don't agree. We all make choices in life as part of instinctual risk mitigation. Making a choice that has higher potential risk isn't the same as being programmed to kill.

Sierra
07-01-2016, 10:54 AM
Now I'm depressed.

I'm going to have to buy some Rapha.

Perhaps then I'll feel better.

verticaldoug
07-01-2016, 10:56 AM
The problem is not really how the program selects, as long as the program works as intended. The potential 'oops' moments are numerous.

A big topic in both AI and Big Data right now is the biases that unintentionally get inserted into algorithms. These introduce all sorts of unintended effects. You can look up some of the biases found in image-recognition software.

Or you can look at the recent episode of Microsoft's Tay. The program worked in Chinese without a problem, but the researchers did not factor in that Chinese state censorship leads to people not trolling and self-editing. Deploy Tay into a troll-filled, global, English-speaking internet, add some very clever trolls, and it quickly morphed into something really ugly.

unterhausen
07-01-2016, 11:08 AM
the example where you have to choose between killing the occupants and people outside the car is ridiculous in my mind. First, it means the car was driving faster than its sight lines should allow. Hopefully they don't do that. Second, at some point you just have to slam on the brakes and hope for the best. Writing software to kill the occupants in this situation is going to end up driving the car into a wall or off a cliff for no reason; that's just the nature of the interactions of software and sensors, in my experience.

The problem I have with the Tesla system is the hubris of calling it a self-driving car. They need to make it work a little worse. The Toyota system does that by making the driver put in corrections on a fairly regular basis. It fails pretty often, but the real purpose is to keep you from falling asleep and driving off the road. It's really good at that. My wife put the E-ZPass in front of the camera that it uses to detect the lane lines, so I haven't used it recently. But when I'm afraid I might fall asleep, I use it.

It's kinda funny that Volvo would criticize Tesla, though; there are some pretty horrifying videos of people being run over by Volvo's self-driving car.

enr1co
07-01-2016, 11:26 AM
"Autopilot" in the case of Tesla is nothing more than a marketing term. It doesn't mean, "Push the A/P button on - go to sleep". The driver is always the last line of defense and it is his/her responsibility to understand the limitations of the technology.
Excellent point. For marketing purposes, and in response to this tragic event, it would not surprise me for Tesla to revise/update the "Autopilot" term to something more aligned with the actual capability of the technology at this time, e.g. Nissan's or Audi's "driver assist" function.

verticaldoug
07-01-2016, 11:29 AM
the example where you have to choose between killing the occupants and people outside the car is ridiculous in my mind. First, it means the car was driving faster than its sight lines should allow. Hopefully they don't do that. Second, at some point you just have to slam on the brakes and hope for the best. Writing software to kill the occupants in this situation is going to end up driving the car into a wall or off a cliff for no reason; that's just the nature of the interactions of software and sensors, in my experience.

The problem I have with the Tesla system is the hubris of calling it a self-driving car. They need to make it work a little worse. The Toyota system does that by making the driver put in corrections on a fairly regular basis. It fails pretty often, but the real purpose is to keep you from falling asleep and driving off the road. It's really good at that. My wife put the E-ZPass in front of the camera that it uses to detect the lane lines, so I haven't used it recently. But when I'm afraid I might fall asleep, I use it.

It's kinda funny that Volvo would criticize Tesla, though; there are some pretty horrifying videos of people being run over by Volvo's self-driving car.

But you may not actually be writing code in the conventional sense; parts of the system definitely rely on deep learning, so the program is being fed data and building a statistical model of how to handle things. It probably has defaults built in, like slamming on the brakes, but given the nature of the system, you may not know how it will respond until it is hit with the variable.

In most cases, this probably means being rear-ended by the human driver tailgating behind you.
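
That split between learned behavior and hard-coded defaults can be sketched directly: the statistical model proposes, and a rule-based envelope can always override it with the built-in default. A minimal illustration with hypothetical names; no resemblance to any vendor's actual stack is implied:

```python
# Sketch: a learned driving policy wrapped in a rule-based safety envelope.
# How the model behaves on inputs unlike its training data is unknown;
# what the fallback does is known exactly.
def learned_policy(observation):
    # Stand-in for a trained network (hypothetical); a real system would
    # run inference here and return an action plus a confidence score.
    return "steer_within_lane", 0.9

def act(observation, obstacle_close: bool) -> str:
    if obstacle_close:
        return "brake_hard"        # hard-coded default, always available
    action, confidence = learned_policy(observation)
    if confidence < 0.8:           # arbitrary threshold for illustration
        return "brake_hard"        # fall back when the model is unsure
    return action

print(act(observation=None, obstacle_close=True))   # 'brake_hard'
print(act(observation=None, obstacle_close=False))  # 'steer_within_lane'
```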

FlashUNC
07-01-2016, 11:32 AM
Being programmed to prioritize is not the same as being programmed to kill. The implication you make in this statement is that the device will have nefarious intent, which will not be the case. Sometimes choices have to be made. Choosing the lesser of two bad options is not the same as being programmed to kill, IMHO.

There is no such thing as absolute safety or absolute security in anything in our lives.

I'm surprised this story of an owner who likely misused his vehicle's technology beyond its intended limitations is such a big story.

"Autopilot" in the case of Tesla is nothing more than a marketing term. It doesn't mean, "Push the A/P button on - go to sleep". The driver is always the last line of defense and it is his/her responsibility to understand the limitations of the technology.

It's not nefarious intent, just a statement of the facts. If the brakes fail on an autonomous car and it has to choose between sliding into a crosswalk full of kids or the old guy on the sidewalk, someone has to have programmed it to make that choice. And if the push towards fully autonomous cars happens, that kind of thing needs to be programmed in.

Volvo, as has been noted in this forum, is working towards a fully automated car, which does not require the driver to intervene in an emergency with the expectation that it is "on them" (the car company) to develop a system robust enough to ensure the car makes the right decision.

So what's the priority? The driver? Pedestrians? Cyclists? There'll be some situation -- to your point -- where the outcome of least bad still means people die. And you're letting an autonomous machine make that choice in a split second. I hope the engineers designing these systems are fully thinking through the moral and ethical issues those situations bring up.

If you're reading the fact that we'll be programming a car capable of choosing who lives and who dies as nefarious, I think that's more a read of the rather sticky moral situation this creates than my statement.

FlashUNC
07-01-2016, 11:39 AM
Put it another way. Would you buy a fully autonomous car that, in an emergency:

1) Did not place the utmost importance on your safety as the owner? (Who wants to be the peripheral safety objective in a vehicle you own, after all?)

Or

2) Preserved your life at the utmost even if it meant the deaths of others outside the vehicle? (Who wants a car that they know will kill others to keep them alive? That's awful to think about.)

Fully autonomous presents enormous and, in some ways, impossible moral dilemmas.

Either way, you're programming a car that has to be capable of killing at the expense of saving somewhere else.

54ny77
07-01-2016, 11:43 AM
Yup.

Although, de facto, that's what humans do every day.

Some intentionally, some not.

Put it another way. Would you buy a fully autonomous car that, in an emergency:

1) Did not place the utmost importance on your safety as the owner? (Who wants to be the peripheral safety objective in a vehicle you own, after all?)

Or

2) Preserved your life at the utmost even if it meant the deaths of others outside the vehicle? (Who wants a car that they know will kill others to keep them alive? That's awful to think about.)

Fully autonomous presents enormous and, in some ways, impossible moral dilemmas.

Either way, you're programming a car that has to be capable of killing at the expense of saving somewhere else.

saab2000
07-01-2016, 11:47 AM
If you're reading the fact that we'll be programming a car capable of choosing who lives and who dies as nefarious, I think that's more a read of the rather sticky moral situation this creates than my statement.

It is probably semantics. But I don't see the decision of who lives and who dies as being programmed to kill. But as mentioned above somewhere, if the choice becomes inevitable, it's possible the autonomous technology wasn't doing its job right. It ought to be smart enough to consider more eventualities than we as humans can consider and react ahead of time.

Of course, there are people far smarter than myself working on this.

Anyway, it's just the term 'programmed to kill' that gets under my skin because it is a sensational phrase that implies more than making a decision about the lesser of two bad outcomes.

Yes, this will be challenging stuff at an ethical level because the person programming the computer will be making a choice in advance rather than the person making an instantaneous choice about these things in the heat of the moment.

FlashUNC
07-01-2016, 11:48 AM
Yup.

Although, de facto, that's what humans do every day.

Some intentionally, some not.

True, but we at least own that choice, and the consequences that emerge from it. If something happens and you hit someone and they die as a result of your bad choice, we have a whole structure around that. You could be arrested and tried for the crime. The victim's family could sue you for financial restitution.

What happens when the same thing occurs where no human is involved in making that choice? Do you sue the car company? The idle passenger who was literally just along for the ride and had nothing to do with it? How do you arrest a car?

Autonomy solves a lot of problems on the road, but it opens up a whole host of new ones too.

cinema
07-01-2016, 12:05 PM
it seems like the accident in question would still have been the fault of the truck driver who made the left turn. anytime you turn left in front of oncoming traffic and cause an accident like that, you are at fault.

now i'm not sure exactly how the Tesla AI works, but i'm guessing that if the driver had been paying attention and used the brake, it would have overridden the autopilot; i have literally no idea, though. regardless, it is totally the fault of the driver who turned, though there was obviously a flaw in the AI of the oncoming vehicle that needs addressing as well.

i know current prototypes of cars with AI use lidar and infrared as well as camera technology.

seric
07-01-2016, 12:50 PM
What, there's some kind of technological "no kid zone" around each Tesla?

For the record, what does the autopilot do in that scenario, endanger the driver/passengers or run the kid down?

It applies the brakes while moving into an open lane, if one is available according to the 360-degree short-range ultrasonic sensors within the stopping zone. About the same as a human.

I guess I should say: I doubt a child will be killed by Tesla's autopilot in a scenario where a human would have been more likely to avoid the accident.
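
As described, that behavior is essentially "always brake, and move over only if the surround sensors report an open lane." A minimal sketch of that decision rule; this illustrates the description above, not Tesla's actual code:

```python
# Sketch of the avoidance logic described above: always brake; additionally
# steer into an adjacent lane only if the 360-degree ultrasonics say it's open.
def avoid(obstacle_ahead: bool, left_clear: bool, right_clear: bool) -> list[str]:
    if not obstacle_ahead:
        return ["continue"]
    actions = ["brake"]
    if left_clear:
        actions.append("steer_left")
    elif right_clear:
        actions.append("steer_right")
    return actions

print(avoid(True, left_clear=False, right_clear=True))  # ['brake', 'steer_right']
```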

goonster
07-01-2016, 12:50 PM
And you're letting an autonomous machine make that choice in a split second. I hope the engineers designing these systems are fully thinking through the moral and ethical issues those situations bring up.

The machine is not making a choice. It cannot make a value judgment between "old guy" and "pregnant lady", or "probably serious injury" vs. "certain death."

Given the same inputs, the machine will always take the same course of action, and that's not a choice. Machine programming has the same ethical considerations as fire escape design or spec'ing lifeboats for ships. The programmers cannot anticipate every possible scenario. There is always a limit to design, beyond which even the most complex machine has the ethical agency of a derailed train.

seric
07-01-2016, 01:12 PM
So, what is your explanation of the accident, if the warnings were all sufficient?

Here's the article I was mentioning.
http://www.theverge.com/2016/4/27/11518826/volvo-tesla-autopilot-autonomous-self-driving-car

Simply put, driver arrogance. Tesla cannot force drivers to act in any particular way. Despite its name, the system is very thoroughly communicated to owners as an assist feature, not an autopilot. Any driver can agree to the terms and then not follow them. The linked article is simply a Volvo engineer trying to capitalize on an otherwise tragic event. If you are asking for a more technical reason the software did not recognize the trailer, the answer, as it turns out, is that it did: the system tuned it out as an overhead road sign due to its combination of height (the system wasn't written to detect hovering vehicles) and lack of differential interferometry.


I figure I'll throw out some of my credentials for pontificating on this matter:

I'm a two-time Autonomous Class Robot Wars Champion, and I've also been provided access to Root in Tesla's OS for a short amount of time.

Disclaimers:
I do own Tesla stock, and have friends at both Tesla and SpaceX.
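
The "tuned it out as an overhead road sign" explanation corresponds to a height gate in radar target filtering: stationary returns that appear to sit above the travel lane get discarded so the car doesn't brake for every overpass and gantry. A speculative sketch of such a gate; the threshold and field names are invented, and this is a generic radar-tracking heuristic rather than Tesla's disclosed design:

```python
# Speculative sketch of a height gate in radar target filtering. Stationary
# returns whose bottom edge appears high above the road read as overhead
# structure (signs, bridges) and are ignored; a high, flat trailer side can
# land in the same bucket. Numbers and names are invented for illustration.
OVERHEAD_CUTOFF_M = 3.5   # hypothetical clearance threshold

def keep_target(apparent_bottom_height_m: float, closing_speed_mps: float) -> bool:
    overhead_like = apparent_bottom_height_m >= OVERHEAD_CUTOFF_M
    stationary = abs(closing_speed_mps) < 0.5   # relative to the road
    if overhead_like and stationary:
        return False   # treated as an overhead sign: no braking
    return True

print(keep_target(0.5, 0.0))   # True: looks like a vehicle ahead
print(keep_target(4.0, 0.0))   # False: filtered as overhead structure
```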

goonster
07-01-2016, 01:26 PM
the autopilot system still has a better statistical track record than the U.S. average.

That is of limited relevance, and a low bar to boot. "Average" includes the drunk, high and demented.

54ny77
07-01-2016, 01:33 PM
Root was bad arse.

http://4.bp.blogspot.com/-C3o0wH1XimI/UzIjuCzblJI/AAAAAAAAUSY/UqllH5nz4UE/s1600/root-shoot.gif

:D

I've also been provided access to Root...

MattTuck
07-01-2016, 01:45 PM
Despite its name, the system is very thoroughly communicated to owners as an assist feature, not an autopilot.

...
The linked article is simply a Volvo engineer trying to capitalize on an otherwise tragic event.

I'd say it was arrogance on the part of Tesla. They were the ones that chose to market the feature as 'auto-pilot'... they could have easily chosen a name that communicated what the system really did.

And, that article was written in April, before the accident. Nothing to do with the accident in question.

The issue isn't so much whether the technology is good; I believe it is good. The question is, is it ready for widespread rollout... and I guess that is a question that can't be answered in a vacuum. It is based on the costs and benefits of putting it out in the wild. Every other car company has decided that the benefits don't outweigh the costs at this point. Tesla thought otherwise, so they either figured that the technology was much better than it actually is, or that accidents like this and potential damage to their reputation were worth the benefits (incremental sales due to this feature, more data to train their system on, etc.).

seric
07-01-2016, 01:56 PM
I'd say it was arrogance on the part of Tesla. They were the ones that chose to market the feature as 'auto-pilot'... they could have easily chosen a name that communicated what the system really did.

And, that article was written in April, before the accident. Nothing to do with the accident in question.

The issue isn't so much whether the technology is good; I believe it is good. The question is, is it ready for widespread rollout... and I guess that is a question that can't be answered in a vacuum. It is based on the costs and benefits of putting it out in the wild. Every other car company has decided that the benefits don't outweigh the costs at this point. Tesla thought otherwise, so they either figured that the technology was much better than it actually is, or that accidents like this and potential damage to their reputation were worth the benefits (incremental sales due to this feature, more data to train their system on, etc.).


I'll continue to disagree. One part of my thinking on the matter is that both seatbelts and airbags increased the probability of an accident being fatal due to the human factor (if we feel more secure, we become more reckless), while Tesla so far seems to have avoided this increase, with the limited data set given. It appears to me that their system has been doing a pretty good job of keeping human stupidity, and attempts to push the system to its extremes, from having a net negative impact.

Also, as was commented previously, the statistics I'm looking at do include the "drunk, high, and demented". That same statement could be used to point out that those same "drunk, high, and demented" drivers are also the ones the system seems to be succeeding in avoiding fatal accidents with.

goonster
07-01-2016, 02:11 PM
both seatbelts and airbags increased the probability of an accident being fatal

Say what now?

How old are you?

(edit: No offense, just wondering if you remember the pre-airbag era . . .)

Louis
07-01-2016, 02:14 PM
Yup.

Although, de facto, that's what humans do every day.

Some intentionally, some not.

Absolutely.

When I'm riding the bike and some idiot passes me on a blind turn, you know d@mn well that if a car or truck suddenly appears going in the other direction, the driver of the car passing me will immediately swerve to the right, with a very high likelihood of hitting me and causing serious injury if not death.

All because they couldn't wait 30 seconds. What does that tell you about some drivers' priorities?

Sierra
07-01-2016, 02:14 PM
Say what now?

How old are you?

(edit: No offense, just wondering if you remember the pre-airbag era . . .)

He has obviously never driven a Chevrolet Corvair.

:D

Louis
07-01-2016, 02:17 PM
Say what now?

How old are you?

(edit: No offense, just wondering if you remember the pre-airbag era . . .)

I assume what he means is that people would be much more careful drivers if every car or truck had a 9" dagger installed in the center of the steering wheel pointing right at the driver's chest.

I don't know if there truly would be fewer fatalities overall, but it's an interesting concept.

seanile
07-01-2016, 02:24 PM
I assume what he means is that people would be much more careful drivers if every car or truck had a 9" dagger installed in the center of the steering wheel pointing right at the driver's chest.

I don't know if there truly would be fewer fatalities overall, but it's an interesting concept.

i know it to be true about my friends who ride brakeless fixed gears vs. friends who ride with brakes. the brakeless kids are often notably slower and far more deliberate with their speed and movements than those with brakes because those without can't afford not to be.

Sierra
07-01-2016, 02:25 PM
I'll continue to disagree. One part of my thinking on the matter is that both seatbelts and airbags increased the probability of an accident being fatal due to the human factor (if we feel more secure, we become more reckless), while Tesla so far seems to have avoided this increase, with the limited data set given. It appears to me that their system has been doing a pretty good job of keeping human stupidity, and attempts to push the system to its extremes, from having a net negative impact.

Also, as was commented previously, the statistics I'm looking at do include the "drunk, high, and demented". That same statement could be used to point out that those same "drunk, high, and demented" drivers are also the ones the system seems to be succeeding in avoiding fatal accidents with.

Not necessarily. I ride a sportbike and I wear full leathers when I do that. Leathers do allow me to feel more secure in riding than I otherwise would, but not to the point that I ride with reckless abandon. You can also make the argument that to the extent a person seeks out these safety features they also would exhibit more careful behaviors (e.g. your "typical" Volvo driver). It is not uncommon to see squids on the road riding liter bikes wearing nothing more than shorts, t-shirts, and flip-flops! And, more often than not, these are the riders who ride recklessly. Interesting.

MattTuck
07-01-2016, 02:25 PM
I'll continue to disagree. One part of my thinking on the matter is that both seatbelts and airbags increased the probability of an accident being fatal due to the human factor (if we feel more secure, we become more reckless), while Tesla so far seems to have avoided this increase, with the limited data set given. It appears to me that their system has been doing a pretty good job of keeping human stupidity, and attempts to push the system to its extremes, from having a net negative impact.


I assume what he means is that people would be much more careful drivers if every car or truck had a 9" dagger installed in the center of the steering wheel pointing right at the driver's chest.

I don't know if there truly would be fewer fatalities overall, but it's an interesting concept.

I'm familiar with risk homeostasis, but I'd like to see the data that says airbags and seat belts have increased the rate (per whatever metric you like) of fatal accidents in ways that can't be explained by other factors.

And, pretty good compared to what? I haven't seen any kind of comparison of performance between different autonomous (or semi-autonomous, or driver assist) systems. I think that federal standards are expected this summer, and maybe we'll see some standard testing that can be compared across systems/brands. Otherwise, in the parlance of our times, "that's just, like, your opinion, man." :)

1happygirl
07-01-2016, 02:29 PM
Fascinating and sad (the Tesla victim) thread from people who have more experience with new cars and know far more than I do about it.

IIRC, from 2017 onward aren't all cars required to have backup cameras?

Drove a 2017 and it's not foolproof.

I still looked behind every time. (prob cuz the car wasn't mine either) :D

seric
07-01-2016, 02:31 PM
Say what now?

How old are you?

(edit: No offense, just wondering if you remember the pre-airbag era . . .)

I do remember the pre-airbag era; one of my first cars was a '66 El Camino. There is quite a bit of literature available on the introduction of airbags in the '70s and the problems that ensued in the infancy of the new technology.

Here is a more recent 2005 study from the University of Georgia:
http://www.stat.colostate.edu/~meyer/airbags.htm

I don't know that I really agree with their 2005 findings; it seems to me the net benefit of airbags was probably positive well before 2005.

1happygirl
07-01-2016, 02:33 PM
per my radio listening this am

Takata airbags

http://www.nhtsa.gov/About+NHTSA/Press+Releases/nhtsa-takata-high-risk-inflators-06302016

http://blog.caranddriver.com/massive-takata-airbag-recall-everything-you-need-to-know-including-full-list-of-affected-vehicles/

seric
07-01-2016, 02:37 PM
Not necessarily. I ride a sportbike and I wear full leathers when I do that. Leathers do allow me to feel more secure in riding than I otherwise would, but not to the point that I ride with reckless abandon. You can also make the argument that to the extent a person seeks out these safety features they also would exhibit more careful behaviors (e.g. your "typical" Volvo driver). It is not uncommon to see squids on the road riding liter bikes wearing nothing more than shorts, t-shirts, and flip-flops! And, more often than not, these are the riders who ride recklessly. Interesting.

I see where you're coming from, but try flipping your own argument. I only ride in full gear as well; however, I have ditched my jacket a couple of times on Iron Butt rides out in the desert. On those occasions I felt very vulnerable and was much more cautious than I would be otherwise. I expect you would be as well. So in the end, with everything else being equal (same rider/driver), less safety in general equals more cautiousness.

Another example: I'm willing to bomb down mountain roads on a supermoto at speeds I would not even consider in bibs and a helmet on a bicycle.

Sierra
07-01-2016, 02:49 PM
I see where you're coming from, but try flipping your own argument. I only ride in full gear as well; however, I have ditched my jacket a couple of times on Iron Butt rides out in the desert. On those occasions I felt very vulnerable and was much more cautious than I would be otherwise. I expect you would be as well. So in the end, with everything else being equal (same rider/driver), less safety in general equals more cautiousness.

Another example: I'm willing to bomb down mountain roads on a supermoto at speeds I would not even consider in bibs and a helmet on a bicycle.

Good points. I guess it depends on where you come from in this whole thing. For example, today I would never even consider riding my road bike without a helmet on despite the fact that I used to commute to work without one for years in busy southern California!

I think, though, that if you took away seatbelts and airbags these days you would be asking for real carnage on the roads due simply to the amount of traffic that now exists. For example, I remember riding my bike when I was a kid on boulevards that were considered "busy" back then. Nothing ever happened to me because, not only were drivers more careful in those days, but there simply was not the volume of driving taking place. Drivers are also much more distracted today, and this is a factor.

Another factor that plays into this is the sheer size of the average car on the road these days. Talk about rendering a sense of security!

Louis
07-01-2016, 02:49 PM
Good points.

I think, though, that if you took away seatbelts and airbags these days you would be asking for real carnage on the roads due simply to the amount of traffic that now exists. Another factor that plays into this is the sheer size of the average car on the road these days.

https://en.wikipedia.org/wiki/Risk_compensation#Seat_belts

"However, a 2007 study based on data from the Fatality Analysis Reporting System (FARS) of the National Highway Traffic Safety Administration concluded that between 1985 and 2002 there were "significant reductions in fatality rates for occupants and motorcyclists after the implementation of belt use laws", and that "seatbelt use rate is significantly related to lower fatality rates for the total, pedestrian, and all non-occupant models even when controlling for the presence of other state traffic safety policies and a variety of demographic factors". "

Sierra
07-01-2016, 02:56 PM
https://en.wikipedia.org/wiki/Risk_compensation#Seat_belts

"However, a 2007 study based on data from the Fatality Analysis Reporting System (FARS) of the National Highway Traffic Safety Administration concluded that between 1985 and 2002 there were "significant reductions in fatality rates for occupants and motorcyclists after the implementation of belt use laws", and that "seatbelt use rate is significantly related to lower fatality rates for the total, pedestrian, and all non-occupant models even when controlling for the presence of other state traffic safety policies and a variety of demographic factors". "

Louis, I don't think I need to point out that Nova Scotia is not southern California! :)

MattTuck
07-01-2016, 02:57 PM
I do remember the pre-airbag era; one of my first cars was a '66 El Camino. There is quite a bit of literature available on the introduction of airbags in the '70s and the problems that ensued in the infancy of the new technology.

Here is a more recent 2005 study from the University of Georgia:
http://www.stat.colostate.edu/~meyer/airbags.htm

I don't know that I really agree with their 2005 findings; it seems the net benefit of airbags was probably positive well before 2005.

Thanks for that link. It seems to be just the abstract, but it is an interesting theory. The author's own findings, though, seem to conflict with your earlier assertion: that link suggests that low-speed crashes are where airbags cause more fatalities, but your earlier post suggested that recklessness (which I read as higher speeds) rose as a result of airbags and people thinking they were safer.

It is an interesting thing to think about, for sure, and is kind of the point of why I feel Tesla got a little ahead of itself. Rather than really engage in this kind of discussion, and in ways to ensure safety, they rushed a (potentially) flawed feature to market and traded on all the equity in the term 'auto-pilot' that had been built up by others (mostly in aviation) to sell more cars. We could sit down for an hour and come up with 100 other terms that better describe what the system really is, but Tesla intentionally called it auto-pilot and sent that message to so many people (both their customers and others)...

FlashUNC
07-01-2016, 03:09 PM
Jalopnik has a pretty good rundown on what we know about this autopilot failure and the other known example of one.

In both cases, it appears the car's observation system models the vehicle as something shaped like a mattress moving down the road: wide and low, with a significant blind spot above the hood line where objects can strike the greenhouse and intrude into the cabin. This is why the poor gentleman in Florida basically ended up like Jayne Mansfield.

If that's really the case, Tesla has a rather shoddily designed system: it accounts for the widest and lowest parts of the vehicle, but not for obstacles that are too high to register yet will still intrude into the cabin and do all sorts of horrific damage.
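
To make the failure mode concrete, here's a minimal sketch of how a perception filter that rejects overhead returns as bridge/sign clutter could also reject a high trailer. This is purely illustrative logic under assumed thresholds, not Tesla's actual code:

# Purely illustrative: assumed thresholds, not Tesla's actual code.
# Forward radar is often tuned to ignore returns whose lowest edge sits
# well above the road, so overhead signs and bridges don't trigger braking.

OVERHEAD_CLUTTER_M = 1.4  # assumed cutoff: anything whose lowest edge is higher gets ignored

def classify_return(lowest_edge_height_m: float) -> str:
    """Classify a radar return by the height of its lowest edge above the road."""
    if lowest_edge_height_m >= OVERHEAD_CLUTTER_M:
        return "ignore"    # treated as a sign/bridge: no braking
    return "obstacle"      # inside the low, wide "mattress" envelope: brake for it

print(classify_return(0.5))  # "obstacle": a car bumper, well inside the envelope
print(classify_return(1.4))  # "ignore": a semi-trailer floor at roughly windshield height

If the filter only models the low, wide envelope, a trailer broadside across the road looks like overhead clutter right up until the greenhouse hits it.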

choke
07-01-2016, 03:38 PM
"However, a 2007 study based on data from the Fatality Analysis Reporting System (FARS) of the National Highway Traffic Safety Administration concluded that between 1985 and 2002 there were "significant reductions in fatality rates for occupants and motorcyclists after the implementation of belt use laws", and that "seatbelt use rate is significantly related to lower fatality rates for the total, pedestrian, and all non-occupant models even when controlling for the presence of other state traffic safety policies and a variety of demographic factors". "Those are some interesting conclusions. I find it hard to believe that the only factor involved is seat belt use since those people aren't wearing one at all. Perhaps that's the NHTSA seeing what it wants to see....

oldpotatoe
07-02-2016, 06:17 AM
Not necessarily. I ride a sportbike and I wear full leathers when I do that. Leathers do allow me to feel more secure in riding than I otherwise would, but not to the point that I ride with reckless abandon. You can also make the argument that to the extent a person seeks out these safety features they also would exhibit more careful behaviors (e.g. your "typical" Volvo driver). It is not uncommon to see squids on the road riding liter bikes wearing nothing more than shorts, t-shirts, and flip-flops! And, more often than not, these are the riders who ride recklessly. Interesting.

Hmmmm, ya gotta mean local sailors, yer in CA (USN, altho only the Jarheads called sailors Squids)... It was a BIG problem in the USN squadrons I was in: these young guys, right outta boot camp, with a few inches of $ in their pocket, go buy a 'DonorCycle'... I never had one get killed (altho plenty did), but one ended up with long titanium rods in his back and a medical discharge.

Climb01742
07-02-2016, 06:54 AM
Question is, what's our yardstick? What are we comparing autonomous cars to?

Distracted drivers? Drivers who value saving 5 seconds by passing a cyclist too closely? Drunk drivers? Drivers who choose to save themselves from a worse crash by hitting someone or something else? Drivers who think the road is their exclusive domain, so all you pesky riders and runners get off my f-ing road?

The ethical choices we're raking autonomous cars over the coals for are made _every day_ by drivers right now. It would be nice if perfection were our standard (odds are technology could get us closer than humans ever will), but isn't the real standard we should hold autonomous cars to this: can they be safer and more predictable/reliable than people? Will technology ever feel road rage? Be driving while fighting with the boss or spouse on the phone? Value a faster commute over a human being on a bike?

I'm not arguing for lowering the bar for autonomous driving technology, or for putting too much faith in it. But isn't it fair to ask yourself: how much faith do you really have in some percentage of drivers? Has there ever been a part of driver education or driving tests that addresses the ethical responsibilities of driving? Why has no one implemented a technological answer to distracted or drunk driving? The world we ride in today is pretty dangerous. Why not make _that_ the bar we ask autonomous cars to surpass: make us safer than we are today, not some hypothetical perfect world?

unterhausen
07-02-2016, 11:07 AM
You are comparing admittedly flawed drivers against the perfect autonomous car of your dreams. Someone above said that the people working on autonomous cars are smarter than them. That's a nice fiction. I have done enough work in robotics to know that it really isn't that easy. About 10 years ago, I had a discussion with a very smart grad student about this. His position was that you could make some autonomous vehicle or another perfectly safe. Then one day I came into the lab and there was a robot-sized hole in the wall. Good reminder of what can happen.

What I see is that a lot of people want this to happen, and there are researchers exploiting this to make money. I think it's a pipe dream, and we would be a lot better off with good public transit. If Google says autonomy isn't ready, it's not ready. Tesla is over-selling a system that needs the driver's constant attention. The fact that they have a good record means nothing if the system is fatally flawed, and the flaw is obvious. Now we know it has two flaws: it doesn't look up, and it doesn't require the driver to pay attention.

Truckers will violate your right of way in unpredictable fashion. In fact, it looks like autonomous long-haul trucks may be the only application of autonomy that gets safely implemented in my lifetime.

Mr. Pink
07-02-2016, 02:27 PM
You are comparing admittedly flawed drivers against the perfect autonomous car of your dreams. Someone above said that the people working on autonomous cars are smarter than them. That's a nice fiction. I have done enough work in robotics to know that it really isn't that easy. About 10 years ago, I had a discussion with a very smart grad student about this. His position was that you could make some autonomous vehicle or another perfectly safe. Then one day I came into the lab and there was a robot-sized hole in the wall. Good reminder of what can happen.

What I see is that a lot of people want this to happen, and there are researchers exploiting this to make money. I think it's a pipe dream, and we would be a lot better off with good public transit. If Google says autonomy isn't ready, it's not ready. Tesla is over-selling a system that needs the driver's constant attention. The fact that they have a good record means nothing if the system is fatally flawed, and the flaw is obvious. Now we know it has two flaws: it doesn't look up, and it doesn't require the driver to pay attention.

Truckers will violate your right of way in unpredictable fashion. In fact, it looks like autonomous long-haul trucks may be the only application of autonomy that gets safely implemented in my lifetime.


Yes, "people" want this to happen. Maybe not the average schmoe, but the trucking and taxi industry certainly wants this to happen, because it will save them billions in wages, therefore, profits will soar. So what if a few people get exterminated in the process? It's much "safer" than having stupid, drunk humans operating vehicles, right? That's the meme I've been hearing for a few years from the industry owners and their hired guns in PR. I'm afraid many here in this forum have swallowed that one hook and line and sinker.

Don't believe the hype. When autos were invented, a skeptical public was sold on the idea by being told that they were non polluters. Yup, the cities at the time were filled (I'm not exaggerating) with horse s**t. I've seen pictures of ten foot banks in NYC, like snow banks in the winter. Well, haha, how did that work out, a hundred years later? Ever been to SLC in the winter?

seric
07-02-2016, 02:40 PM
The advantage I'm waiting for is "car clubs". More room in the garage for n+1.

When fully self-driving cars become a reality, I don't see them as vehicles one would purchase to own. I expect I'll keep my van for hauling kayaks and dirtbikes, along with a single household vehicle. Otherwise, instead of owning multiple cars as I do now, I will simply be an "Uber/Lyft" member or similar and schedule a self-driving car as needed. I expect these services will have fleets geographically positioned based on demand and time of day. There will probably be various membership levels.

Jaq
07-02-2016, 03:13 PM
Hmmmm, ya gotta mean local sailors, yer in CA (USN, altho only the Jarheads called sailors Squids)... It was a BIG problem in the USN squadrons I was in: these young guys, right outta boot camp, with a few inches of $ in their pocket, go buy a 'DonorCycle'... I never had one get killed (altho plenty did), but one ended up with long titanium rods in his back and a medical discharge.

So my daughter's best friend lives with us in the summer when school's out; she's Korean & her family's all overseas. Anyhow, she's dating a young Marine stationed at Pendleton. Amazing young man, btw. Anyway, he got caught speeding on the base (according to him, a fraction over the limit).

30 days confined to base. They do not mess around. We'll give him a second 4th of July sometime in August.

Matthew
07-02-2016, 03:19 PM
I understand the need for advancing technology, but I will never own or drive a car that can drive by itself. I can see lots of accidents down the road due to the car failing in some way. I know humans cause accidents every minute on this planet, but if I am "driving" a car I want to actually drive the damn thing.

ultraman6970
07-02-2016, 03:39 PM
I do see why autopilot could work in certain cases, like, for example, a woman at home by herself going into labor and needing to get to the hospital fast. But besides those really one-of-a-kind extreme cases, I don't see how autopilot for a car could be a good idea, ever.

What sucks is that this thing will be a case of what the investigation says vs. what Tesla says, and, as happened with Tucker, the other manufacturers are surely just waiting to see Tesla fall.

Man, it would be sick to see DeLorean come back as an electric car.

rnhood
07-02-2016, 03:51 PM
Autopilot is a stupid idea for a car, and if a couple more drivers get decapitated, people might start leaving the autopilot box unchecked when they order one of those rather overpriced yuppie sedans.

choke
07-02-2016, 04:12 PM
Hmmmm, ya gotta mean local sailors, yer in CA (USN, altho only the Jarheads called sailors Squids)...

Heh... not that kind of squid. :)

From the Urban Dict (http://www.urbandictionary.com/define.php?term=Squid): A young motorcyclist who overestimates his abilities, boasts of his riding skills when in reality he has none. Squid bikes are usually decorated with chrome and various anodized bits. Rear tyres are too wide for their own good, swingarm extended. Really slow in the corners, and sudden bursts of acceleration when a straight appears. Squids wear no protection, deeming themselves invincible. This fact compounds itself with the fact that they engage in 'extreme riding'--performing wheelies and stoppies in public areas. Squids wreck a lot. Derived from 'squirly kid'

enr1co
07-02-2016, 04:50 PM
one of those rather overpriced yuppie sedans.

Which one? :p

To dispel the misconception about Tesla pricing, do the research on price, total cost of ownership, and operating cost, as most would with any other car purchase in the same class. The price and TCO of a Model S are competitive with, or lower than, many other manufacturers' base prices in the same premium sedan class.
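
As a back-of-envelope way to frame that comparison (my own sketch; every figure below is a placeholder for illustration, not a real quote):

# Back-of-envelope TCO sketch. My own framing; every number below is a
# placeholder for illustration, not a real price or quote.

def five_year_tco(price, annual_energy_or_fuel, annual_maintenance,
                  annual_insurance, resale_value):
    """Five-year cost of ownership: purchase plus running costs minus resale."""
    running = 5 * (annual_energy_or_fuel + annual_maintenance + annual_insurance)
    return price + running - resale_value

# Hypothetical EV vs. premium gas sedan (placeholder figures):
print(five_year_tco(75000, 700, 500, 1800, 40000))    # 50000: low energy/maintenance costs
print(five_year_tco(70000, 2500, 1500, 1600, 30000))  # 68000: higher running costs

The point is just that sticker price alone isn't the comparison; running costs and resale can swing the total either way.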

"Semi-Autonomous Cars Compared! Tesla Model S vs. BMW 750i, Infiniti Q50S, and Mercedes-Benz S65 AMG
Four cars itching to prove they're better at driving than you. Or your pet."
http://www.caranddriver.com/features/semi-autonomous-cars-compared-tesla-vs-bmw-mercedes-and-infiniti-feature


http://media.caranddriver.com/images/16q1/665057/semi-autonomous-cars-compared-tesla-vs-bmw-mercedes-and-infiniti-feature-car-and-driver-photo-665162-s-original.jpg

dustyrider
07-02-2016, 08:49 PM
My dog is blind so...guessing pretty much anything is better at driving than her.

GuyGadois
07-03-2016, 01:57 AM
Late to the party on this thread, but I wanted to give my feedback. I am not sure how many people here have actually driven a Model S with autopilot, but it is very good. I have owned one for a little over 6 months and have found the autopilot very accurate. The software continually updates and increases in accuracy, because data is continually streamed to Tesla to improve it. Is it perfect? No. Is it very, very accurate? Yes.

I personally believe humans are terrible drivers. We are distracted, emotional, and prone to be sleepy. The future is autonomous driving, and it will be safer. During freeway trips I usually drive about 75% on autopilot, but I leave my Harry Potter DVD viewing for home. I keep my hands on the wheel in case I need to take over. The autopilot is amazingly accurate with braking, accelerating, and changing lanes, and it keeps getting better.

I realize that there was a death with someone using it, but it was not being used as intended. The driver was breaking the law while driving (a law that stands with or without autopilot). You just can't legislate for stupid. The questions remain: would there have been a death without autopilot? Has autopilot actually helped avoid accidents? (Tesla owners talk about how it saved them from accidents.)

Before jumping on the autopilot hate-wagon, I invite you to try the technology out. When used correctly it is amazing. Like it or not, the future is going this way, and we'll all be thinking in the future about how bad humans were at driving.

GG

verticaldoug
07-03-2016, 03:31 AM
Initially, the self-driving car is going to be a horrible ride in towns. Simple reasoning here: as soon as people around it realize the car is self-driving, they will start to game it. I am driving my car, so let me barge in and merge; the self-driving car will slow down and let me in. I am a pedestrian and decide to cross in the middle of the street, without worrying or even looking, because the self-driving car will slam on the brakes and not hit me.

I think Google has discussed interaction with people in some papers.

Tickdoc
07-03-2016, 07:09 AM
Nice to hear from an actual user, Guy. I'm not a self-driving hater, just not a believer that it is ready for prime time.

One thing I notice is the level of engagement necessary to use the system on our Volvo. It lets you know if your hands are not on the wheel.

It also lets you choose the distance, in car lengths, between you and the car in front of you.

The cruise works ok on a highway, but in traffic crawl mode in town it offers up enough of a gap that other drivers can easily squeeze in front of you, even in shortest distance mode.
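
For what it's worth, here's a rough sketch of why even the shortest setting leaves squeeze-in room in a crawl. This is my own simplification, not Volvo's actual algorithm: adaptive cruise systems typically hold a time gap, so the distance shrinks with speed but is floored at a standstill distance:

# My own simplification of a time-gap follower; not Volvo's actual algorithm.

MIN_STANDSTILL_GAP_M = 3.0  # assumed gap the system holds when stopped

def following_distance_m(speed_mps: float, time_gap_s: float) -> float:
    """Distance the cruise tries to hold: speed times the chosen time gap,
    floored at the standstill gap."""
    return max(speed_mps * time_gap_s, MIN_STANDSTILL_GAP_M)

print(following_distance_m(30.0, 1.0))  # ~30 m at highway speed (108 km/h)
print(following_distance_m(2.0, 1.0))   # floored at 3 m in a crawl, still room to squeeze into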

Back to the cruise, though: I find myself constantly watching the lane changes and the cars around me, because you will either find yourself stuck behind the slower car in front of you or have to stay in the left lane constantly to keep the system from slowing for it. It is a different set of stresses, I found; it requires a different level of attention.

Yesterday I was crawling through our little downtown with my wife in the passenger seat. Two lanes, cars parked diagonally on both sides, 25 mph speed limit.

There were three teenage kids walking and goofing around. I saw them, but just barely, and they were getting ready to jaywalk across the street. It looked like they were going to wait until I passed and then cross; then, right as we got close, one kid pushed another into the lane! I was watching the thing unfold and had slowed just a bit, so I was able to slam on my brakes and missed him by inches. They were all laughing, but his friend just about shoved him to his death.

We were in my Mercedes, and it has no electronic nannies, but good brakes. Thankfully I was paying attention and alert to the dumbass potential of the situation.

My point is, if we were in her Volvo, I am confident it would have also slammed on its brakes and avoided hitting the kid in this exact scenario. Its accident avoidance system is very advanced. Three or four times now, say when someone is slowing to turn in front of me or turns into my lane, it has sounded a loud audible alarm and slammed on the brakes.

Would it have avoided the "is that the sky or is that a truck?" scenario that killed this poor Tesla driver? I would assume only the engineers at Tesla will ever know.

1happygirl
07-03-2016, 08:38 AM
I grasp the safety issue, but I always trust and depend on my own choices over a machine's or anyone else's. I don't think this will even be a viable option in my lifetime.
What about the flying cars?
Autopilot is a stupid idea for a car, and if a couple more drivers get decapitated, people might start leaving the autopilot box unchecked when they order one of those rather overpriced yuppie sedans.
Agree.
After commuting for a while to an out-of-town job with no joy in the commute, I definitely see this as a convenience issue. I would so never drive again (I know I'm the bane of car salesmen and dealers, just like all young folks).
Since the US doesn't have the mass transit infrastructure to accomplish no driving, I can imagine arriving refreshed at my destination, having relaxed, read, or gotten ready for whatever job I have ahead.

If I win the lottery, I'm buying a 150K-mile, 25-year-old junky car but hiring a chauffeur to drive it for me.

Mr. Pink
07-03-2016, 08:56 AM
Late to the party on this thread, but I wanted to give my feedback. I am not sure how many people here have actually driven a Model S with autopilot, but it is very good. I have owned one for a little over 6 months and have found the autopilot very accurate. The software continually updates and increases in accuracy, because data is continually streamed to Tesla to improve it. Is it perfect? No. Is it very, very accurate? Yes.

I personally believe humans are terrible drivers. We are distracted, emotional, and prone to be sleepy. The future is autonomous driving, and it will be safer. During freeway trips I usually drive about 75% on autopilot, but I leave my Harry Potter DVD viewing for home. I keep my hands on the wheel in case I need to take over. The autopilot is amazingly accurate with braking, accelerating, and changing lanes, and it keeps getting better.

I realize that there was a death with someone using it, but it was not being used as intended. The driver was breaking the law while driving (a law that stands with or without autopilot). You just can't legislate for stupid. The questions remain: would there have been a death without autopilot? Has autopilot actually helped avoid accidents? (Tesla owners talk about how it saved them from accidents.)

Before jumping on the autopilot hate-wagon, I invite you to try the technology out. When used correctly it is amazing. Like it or not, the future is going this way, and we'll all be thinking in the future about how bad humans were at driving.


GG

Well, that's all fine, but a question: how many deaths would you tolerate before this system that "isn't perfect" is, er, perfected? As though that will ever happen. After all, the man who runs the company that built your car believes it's possible we're all living in a computer simulation, and that many of us will happily be living on Mars in the future. I'm not too keen on buying into his vision for the new automobile, especially since he has to do a lot of financial engineering to keep that company solvent.
Besides, isn't it kinda weird to be a driver(?) or passenger(?) in a car where you have to be prepared at any moment, in an instant, to correct the robot's decisions? Sounds like an unhealthy state of anxiety added to our daily lives, as though we need more. I would rather relax and drive.

One thing that bothers me after reading all this, following this nasty incident, is that I have learned there are a lot of people (probably more in my hood, since I live in a wealthy county) who are essentially beta testing this new technology on the roads right now. The same roads I bike. Hey, if that Harry Potter car missed a big old truck, I doubt it will see poor old Mr. Pink around the corner.

As others have mentioned, what we need is more public transit, not millions of robots driving around making a small number of geeky tech barons fabulously wealthy.

witcombusa
07-03-2016, 10:46 AM
Personally, I don't want more mass transportation or robot vehicles. Besides, I don't see a robotic motorcycle coming along for a while yet. Yes, distracted (read: IDIOT) drivers are a big problem. Let's start by making their phones inoperable in their cars! Sure, there are other distractions, but this is a great place to start. How did these people ever live BC (before cellphones)?

Mass transit doesn't work here and never will, except in urban centers, which I call hell. Besides, they should be on bikes!

seric
07-03-2016, 01:21 PM
Here is Bloomberg's take. They touch upon the human trust factor. I'll also mention that it's amazing that almost 100% of drivers seem to feel they are better-than-average drivers. I'll continue in my opinion that I'd rather have even today's Tesla autopilot next to me on a bicycle than an average Bay Area driver.

http://www.bloomberg.com/view/articles/2016-07-01/tesla-s-autopilot-will-make-mistakes-humans-will-overreact

RowanB
07-03-2016, 01:56 PM
Besides, isn't it kinda weird to be a driver(?) or passenger(?) in a car where you have to be prepared at any moment, in an instant, to correct the robot's decisions? Sounds like an unhealthy state of anxiety added to our daily lives, as though we need more. I would rather relax and drive.

That's my major issue with this system: the underlying assumption that the driver is alert to intervene whenever required, while the autopilot system massively reduces the level of interaction and engagement required. If 95% of your driving with autopilot requires no interaction, of course your mind will wander. I think there's a fundamental contradiction there, and making the driver tick a box saying they still have responsibility for driving doesn't get around it.

enr1co
07-03-2016, 03:54 PM
Here is Bloomberg's take. They touch upon the human trust factor. I'll also mention that it's amazing that almost 100% of drivers seem to feel they are better-than-average drivers. I'll continue in my opinion that I'd rather have even today's Tesla autopilot next to me on a bicycle than an average Bay Area driver.

http://www.bloomberg.com/view/articles/2016-07-01/tesla-s-autopilot-will-make-mistakes-humans-will-overreact

:beer:

1happygirl
07-03-2016, 04:13 PM
As has been routinely mentioned, Airbus airplanes are the universal example here: the system takes over and decreases human intervention.

You have no control.

saab2000
07-03-2016, 05:06 PM
As has been routinely mentioned, Airbus airplanes are the universal example here: the system takes over and decreases human intervention.

You have no control.

This is mostly misleading. I personally do not fly an Airbus, but my long-time mentor, someone I've known in aviation for 20 years, flies the Airbus 320 and 330. He's a big proponent of knowing how this stuff works, and he tells me it is a myth that you can't turn off the automation. That said, yes, it's a highly automated airplane, but so is every other commercial airplane today. Automation allows for greater situational awareness, and that's a good thing. It's not designed to allow the pilots to watch movies and read magazines. It's designed to let pilots focus on the skies outside and the overall situation, and not have to worry about altitude and course corrections all the time.

It's really impossible to compare automation in cars to an automated airplane, for many reasons. Starting at 18,000 feet, all air traffic is under constant supervision and control by ground-based air traffic controllers, and commercial aircraft carry collision avoidance technology whose units communicate with each other automatically. It's called TCAS. The open road is the wild, wild west compared to the commercial skies, which are highly controlled.
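
To give a flavor of how that coordination works, here's a toy version of the core TCAS idea: alerts are driven largely by "tau," the range to the other aircraft divided by the closure rate, i.e. the time until closest approach. The thresholds below are rough ballpark figures (the real ones vary with altitude), and this is a simplification of the published concept, not the certified logic:

# Toy sketch of the TCAS "tau" concept. A simplification of the published
# idea, not the certified collision-avoidance logic; thresholds are ballpark.

TA_THRESHOLD_S = 40.0  # rough Traffic Advisory threshold (varies with altitude)
RA_THRESHOLD_S = 25.0  # rough Resolution Advisory threshold (varies with altitude)

def tau_seconds(range_m: float, closure_rate_mps: float) -> float:
    """Time until closest approach: range divided by closure rate."""
    if closure_rate_mps <= 0:
        return float("inf")  # not converging, so no threat
    return range_m / closure_rate_mps

t = tau_seconds(9000.0, 300.0)  # two aircraft 9 km apart, closing at 300 m/s
if t < RA_THRESHOLD_S:
    print("RA: climb or descend")            # coordinated escape maneuver
elif t < TA_THRESHOLD_S:
    print(f"TA: traffic, tau = {t:.0f} s")   # alert the crew to look

Nothing like that transponder-to-transponder coordination exists between a Tesla and the pickup next to it, which is kind of the point.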

Anyway, it's really an apples-and-oranges comparison, if for no other reason than that all commercial airplanes are in communication with controllers at pretty much all times.