
I understand where you're coming from, but there is a counter-argument to what you're saying: although there may be some dangers to things like the Comma One and Tesla Autopilot, their release may save more lives than they endanger, yielding a net positive for humans. Humans are pretty bad at driving.


That's even MORE of a reason for the NHTSA to ask these questions. If the first-to-market autopilot system is too dangerous, that will destroy the public's trust in any future autopilot system, no matter how much of a net benefit it provides.

It's like how IUDs (birth control device) are disproportionately unpopular in the United States despite many benefits - all because of one particular brand that caused infections with horrible repercussions:

http://www.motherjones.com/blue-marble/2012/09/why-are-iuds-...


This is an ends-justify-the-means argument. How many lives have been saved TO DATE with self-driving cars? How many lives will be saved IN THE FUTURE with self-driving cars?

What is the acceptable casualty rate to get from where we are now to where you think we will be in the future? Is your family an acceptable casualty? If not, then why is mine?


"Is your family an acceptable casualty?" is a pointless question. It can be applied equally well to the opposite argument: what is the acceptable casualty rate for not advancing safety in this way? Is your family an acceptable casualty?

Continuing on with the status quo is just as much of an action and a choice as pushing forward with automation. Both choices are going to get people killed. We can either evaluate the choices and choose the one that seems better, or we can ignore the question and possibly have more people die than is necessary, but there's no option where your family isn't at risk.


I don't necessarily disagree with you, but one thing I think a lot of people are seeing is that you are relinquishing control of your car to a computer. When I think of computers, I often think of things that I CAN MAKE WORK but that aren't inherently trustworthy. Sure, they're great for things like math, but when I try to run a new piece of software I expect crashes, and they generally happen. NOT ALL SOFTWARE. But a good bit of first-gen software.

So do I really want untested first-gen software released without extreme safety standards and heavy testing? No. Do I think the net result of that happening would be a safer driving experience? No. Give me the iOS 9 or so of self-driving car software, or the Unix. Not the Amazon Fire phone.


I totally agree on this. Putting my life in the hands of software makes me nervous. I've seen how software is made. But I do it anyway....

I'm not sure if it's entirely relevant here, since the comment above presumes that these systems are safer than humans, and the reply doesn't seem to be questioning that.

I think it's very likely that these systems will rapidly become far better than human drivers, and probably already are in the domains where they're meant to be used. But we definitely can't just take that on faith and hope it's being done properly.


This argument is a red herring. We know the current casualty rate and have accepted it. Nobody gets into a car with the illusion that it's 100% safe.


And if technology reduces that casualty rate, why does the fact that it's new suddenly make it unacceptable?


I didn't say that. My question was: what is the casualty rate currently and what is the casualty rate in the self-driving car future? I am asking what you think this reduction actually is and how high of a price you're willing to pay to see that reduction. I am specifically asking what you think the reduction is so that we can have a discussion on 1) whether that is a realistic reduction and 2) whether that is, in turn, worth the price. It has nothing to do with whether or not it's new.

Is that reduction a result of self driving or advances in other safety (airbags, seat-belts, crumple zones)?

It is clear that your mind is already made up and that you are a self driving car evangelist. Please endeavor to approach these debates with an open mind and in good faith.


My mind is made up only because I'm pretty sure the casualty rate will be sharply lower. Humans are horrible at driving.

As for your question, you basically destroyed the conversation when you asked "Is your family an acceptable casualty?" If you just wanted numbers, you shouldn't have made it personal like that. That's all my point is here.


> As for your question, you basically destroyed the conversation when you asked "Is your family an acceptable casualty?" If you just wanted numbers, you shouldn't have made it personal like that. That's all my point is here.

This is a really fair point, but I also don't think saying it is a bad thing. I think people distance themselves from possible problems by calling things a numbers game, but they don't factor in that they could be part of those numbers.

If I say that 27,000 people per year die from listening to Mambo Number 5, and then I say we can lower that to 15 if we remix it to take out the deadly frequency, but we will have to test it on 20 people to find the deadly frequency for sure, that sounds great. A couple of people might die, but what's that to save 27,000/year?

Well, take that and say we will have to test it on 20 people and 5 of those will be your family. That makes the decision a lot harder. Logically it's exactly the same, but emotionally, instead of 20 abstract "people" it's 15 "people" and 5 of your family.

So it's important to mention that it could be your family, because if someone is alright with it just from looking at the pure numbers, they should be brought back a little to the reality we all actually live in.


I agree about the effects, but I disagree with the desirability! We should be making these choices rationally. Taking an action which causes some people to die but saves more overall is typically a good thing. Asking "what if it's your family?" just makes it harder to make the correct choice, and that makes it more likely for your family to be a casualty.

I guess this all boils down to what you think of the Trolley Problem. I'm one of those who say, yes, of course you pull the lever if you have no way to save everybody, it's obviously better.


> Continuing on with the status quo is just as much of an action and a choice as pushing forward with automation

Of course, but public perception is, you know... not largely logic-oriented. Whether or not you personally find @abduhl's argument convincing, there's still that emotional hurdle you'd have to clear to convince significant numbers of people to sign up.


Of course. But acknowledging and working with that illogical public understanding of things doesn't mean I can't argue against it when I encounter it online.


To estimate answers to your questions: probably no lives have been saved so far through self-driving, and possibly one extra person was killed in the Tesla crash, which you could say was maybe half the system's fault and half the driver not following instructions.

In the future: about 1.2M people die on the roads each year, so say self-driving reduces that to 0.2M/year; that's a million lives saved per year.

Your family is thousands of times more likely to die in a conventional motor accident than due to a self-driving vehicle. If unnecessary red tape slows things down and causes 100k extra deaths, is that OK? Even if someone you know is among the 100k?
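That back-of-envelope arithmetic can be sketched explicitly (the 1.2M and 0.2M figures are assumed hypotheticals, not measured data):

```python
# Back-of-envelope estimate using the figures assumed above.
deaths_now = 1_200_000       # assumed current road deaths worldwide per year
deaths_with_sdc = 200_000    # assumed deaths once self-driving is widespread

saved_per_year = deaths_now - deaths_with_sdc
print(saved_per_year)        # 1,000,000 lives saved per year under these assumptions

# How long a regulatory delay would have to last to cost 100k extra lives
# at that rate:
delay_years = 100_000 / saved_per_year
print(delay_years * 365)     # 36.5 days
```

Under these assumptions, a delay of barely over a month corresponds to the 100k figure; the point is how sensitive the outcome is to the assumed reduction, which is exactly what the earlier comment was asking about.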

That said, there's an argument that systems like Hotz's should be tested first. That's probably not going to slow things down that much.


> If unnecessary red tape slows things down and causes 100k extra deaths is that ok?

That's an imaginary scenario; you have no idea if that's the way it plays out. Let me give you another scenario: the first self-driving cars kill a bunch of people, and no one wants to touch them for years after they finally become safe, delaying adoption and causing millions of additional people to die.


Very unlikely. Self-driving cars are potentially enormously profitable; with such profitable products I think you'll find that the opinion of an outraged public matters somewhat less than it customarily does.


As a great man once said, "Not taking any action is also an action."


I hate this bullshit argument that "humans are pretty bad at driving." Yes, there's a wide distribution of driving skill levels, and it's dependent on the cars people drive and their maintenance records. But please don't lump me in the same bag as obviously shitty drivers, of which I've seen scores.



