Former Navy SEAL Becomes the First Person to Die in Self-Driving Car

From TheBlaze
Ships have had autopilots for years, yet they are always manned. I just hope SpaceX is not affected. If it had hit a school bus, the lawsuit would be huge.

WASHINGTON — A driver so enamored of his Tesla Model S sedan that he nicknamed the car “Tessy” and praised the safety benefits of its sophisticated “Autopilot” system has become the first U.S. fatality in a wreck involving a car in self-driving mode.

The National Highway Traffic Safety Administration announced the driver’s death Thursday, and said it is investigating the design and performance of the Autopilot system.

Joshua D. Brown of Canton, Ohio, the 40-year-old owner of a technology company, was killed May 7 in Williston, Florida, when his car’s cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn’t automatically activate its brakes, according to statements by the government and the automaker. Just one month earlier, Brown had credited the Autopilot system for preventing a collision on an interstate.

No lawsuit should really happen. If you actually think Tesla doesn't have a disclaimer saying that Autopilot is not 100% reliable, they'd have been out of business a long time ago. People are sadly dumb enough to trust a feature that is still in its infancy to make the right decision and never fail on them. It's sad that this happened, but this is user error, not the car's.

3 Likes

I'm not going to fault Tesla at all, not even a little bit.

No technology is perfect in every aspect. Even if it does function flawlessly at first, it will break eventually. Putting your life in the hands of a machine and thinking that you'll never need to intervene is just ignorant.

The main thing I don't like about the idea of self-driving cars is that it will give people an excuse to pay even less attention while they're driving. When I first saw this on Facebook, there were tons of people claiming it was Tesla's fault because their system failed; those comments disgusted me that some people are actually that careless. I see people every single day still texting or talking on their phones while driving. If their cars drove themselves, I'd imagine those same people would be watching movies or sleeping, get in a crash, and then try to say it's Tesla's fault.

1 Like

It's an autopilot system. Teslas are not self-driving cars; they were never intended to be self-driving cars, and no amount of shenanigans and software updates will give current Teslas the ability to drive themselves completely autonomously. Anyone who is ignorant enough to trust one to keep them safe from a car crash is an idiot. It's a horrible thing that happened, but it does not excuse the man who entrusted the car with a task it was not built to handle. He should get a Darwin Award.

This is not a self-driving car any more than a 747 is a self-piloted plane. The flight crew HAS to keep an eye on the plane at all times, they can't both just sit back and watch Harry Potter like this guy did. The autopilot is only engaged when no major adjustments to the plane's speed and direction are needed for a while, but it does not absolve the captain and first officer of responsibility to keep their passengers out of harm's way.

The media will blow this out of proportion. Lawsuits will be filed, grievances will be aired, the man's family will likely want some kind of retribution or recompense for their loss, and everyone will sheepishly remain oblivious, and possibly defiant, of the fact that Mr. Brown died because of his own negligence and ignorance. He has nobody but himself to blame for this. Tesla is not to blame, the engineers who designed the car are not to blame, the programmers who wrote the autopilot software are not to blame, not even the person who decided to call it Autopilot is to blame, because that's precisely what it is - an autopilot.

This still qualifies as technological error as much as user error, and it reinforces my fears about using a self-driving car; it needs a manual override at least. Technology can and has made mistakes, you know; it's the very nature of nothing being perfect.

True, and it does have manual override built in. The cameras did make a mistake, but again this is still considered a beta program.

1 Like

Yeah, they are too ignorant to realize that technology isn't perfect and that people need to compensate for it. Still, I can see this being a problem for someone who already drives properly and doesn't need autopilot, unless there is a way to give the user top priority when needed and otherwise let the car take control. People need to not depend on technology to do everything while they sit on their asses doing nothing, or stuff like this happens.

Yeah, I'm not mad at Tesla for this; I can't believe the number of people commenting without understanding the technology. If there is one thing I've learned from my programming and engineering classes (so far just Java/C programming; C++ and digital electronics are next), it's that technology can make mistakes, and it is up to the user to compensate for any mistake they can, unless the effort is too great for the user to handle.

That's why I prefer to drive myself. ☺

2 Likes

More fun too

^-- THIS.

They're driver assistance features :D They're nice to have, but they don't guarantee safety. Users sit down at a computer and expect it to think.

After 130 million combined miles with Autopilot.

https://www.teslamotors.com/blog/tragic-loss?utm_campaign=Blog_063016&utm_source=Twitter&utm_medium=social

Look forward to government regulations instead of a lawsuit.

I hope this doesn't become the Hindenburg disaster, and we all know how susceptible people are to fear mongering.

2 Likes

Yeah, I read about that this morning.
It happened on the 7th?