Go Tesla

The next mechanic class.....

 
The question of liability is intriguing. Eventually, with wide deployment and acceptance of autonomous cars, insurance will probably end up being a form of product liability insurance. In the meantime there are a lot of scenarios to be figured out, and soon: autonomous features will be showing up on luxury vehicles shortly.
https://www.hg.org/article.asp?id=34960
 
Likely true. Seems like adoption will depend on the country. England and Germany appear to be moving more aggressively in this direction, but that could change.
 
Once in 130 million miles - which is great unless it happens to be you. Not sure where they get the 130 million mile number though. Does the car "phone home" every time it is on autopilot?

https://www.teslamotors.com/blog/tragic-loss
 
Not sure of the answer to your question (whether that number is estimated or actually logged from on-board systems), but I wonder how this compares to human drivers? I've never been in an accident at the wheel, myself. However, I suspect the national average is probably 100 to 1,000 times worse than Tesla's reported number, maybe one major accident per several hundred thousand miles driven.
 
I've read that Teslas are constantly reporting back data like position, speed, inside and outside temperature, battery state, etc., and, when in autopilot mode, whenever the driver corrects the steering. Owners sign an agreement to send back data when they buy the car.
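Purely as an illustration of what such a record might contain, here is a minimal sketch in Python; the field names are my own guesses, not Tesla's actual schema:

```python
# Hypothetical telemetry record of the kind described above.
# Field names and structure are guesses for illustration, not Tesla's schema.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TelemetryRecord:
    timestamp: float                   # Unix time the sample was taken
    latitude: float                    # GPS position
    longitude: float
    speed_mph: float
    cabin_temp_c: float
    outside_temp_c: float
    battery_soc_pct: float             # battery state of charge
    autopilot_engaged: bool
    driver_steering_correction: bool   # driver overrode steering on autopilot

record = TelemetryRecord(
    timestamp=time.time(),
    latitude=42.33, longitude=-83.05,
    speed_mph=62.0,
    cabin_temp_c=21.5, outside_temp_c=28.0,
    battery_soc_pct=73.0,
    autopilot_engaged=True,
    driver_steering_correction=False,
)

# "Phoning home" would then just be serializing and uploading such records.
print(json.dumps(asdict(record)))
```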

The accident death rate has declined dramatically due to the advent of many safety features, better tires, and perhaps stricter DUI laws. Any death is unacceptable, but 1 in 130 million miles is not out of line with the average.

https://en.wikipedia.org/wiki/Trans...dia/File:USA_annual_VMT_vs_deaths_per_VMT.png
For driving, one can use the U.S. average automobile fatality rate of 1.5 deaths per 100 million vehicle-miles for 2000.[1]
https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States
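To put rough numbers on the comparison, here is a back-of-the-envelope calculation using just the two figures quoted above:

```python
# Back-of-the-envelope comparison of the two rates quoted above.
tesla_fatalities = 1
tesla_miles = 130e6     # 1 fatality in 130 million Autopilot miles (Tesla's claim)

us_rate_per_100m = 1.5  # U.S. average deaths per 100 million vehicle-miles (2000)

tesla_rate_per_100m = tesla_fatalities / (tesla_miles / 100e6)
print(f"Tesla (claimed): {tesla_rate_per_100m:.2f} deaths per 100M miles")  # ~0.77
print(f"U.S. average:    {us_rate_per_100m:.2f} deaths per 100M miles")
print(f"The average is {us_rate_per_100m / tesla_rate_per_100m:.1f}x higher")  # roughly 2x
```

By that arithmetic Tesla's claimed rate is roughly half the 2000 U.S. average, though a single fatality is far too small a sample to draw firm conclusions from.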
 
The investigation will be interesting ... will they release details? The truck driver said he heard a Harry Potter movie running ...
http://www.cnbc.com/2016/07/01/truck-driver-involved-with-fatal-tesla-accident-says-driver-was-watching-a-movie.html (link now broken)
 
The fact that this accident resulted in a death brings more attention to what happened, but let's not let that shape the statistics we consider. After all, whether a given accident results in a death can be largely random.

I hope people are able to focus on the more meaningful statistic: accident rates, not death rates. On that metric, how does the Tesla system compare to human drivers?
 
Hard to answer with a nascent technology like this, particularly when there are folks doing crazy chit far beyond reasonable at this juncture. For sure there are a whole lot more accidents caused by cell phone users getting distracted while driving, and for that we have some metrics: it's estimated that one out of four accidents is caused by texting while driving. :eek: Given those odds, autonomous driving should be mandatory for texting-addicted drivers.
 
The first one where it kills somebody in the other vehicle is gonna be when all hell breaks loose. Insurance companies will hit the exits.
 
Tesla is learning some hard lessons from this accident. It seems they are promising too much without teaching drivers all that can go wrong when the right set of conditions trips up the sensors. Also, they have been using non-disclosure agreements that have troubled the NHTSA because they could curtail reporting of defects to the agency.

I guess the only safety statistics that will matter will be the comparisons between the different carmakers, to see whose smart cars get into the fewest accidents.
 
Still need more info ... how fast was he driving? If he had been engaged in driving, could he have avoided the accident? Accident reconstruction should be able to answer that.

Still comes down to human driver error: relying too heavily on the system and ignoring the cautions provided by Tesla...

Trucker has not been charged yet...
 
Another autopilot/driver fail with Tesla...
http://www.teslarati.com/tesla-model-x-crash-montana-blamed-autopilot/

While it was purportedly prompting the driver to put hands on the wheel, it should have braked and stopped immediately after the first hit. I would suspect a sensor loss, given the amount of damage, plus program shortfalls. Did it prompt for driver control before or after it hit the first guide post?
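To spell out the fail-safe being suggested, here is a toy sketch; this is speculation about what the car arguably should do, not a claim about how Tesla's software actually works:

```python
# Toy sketch of the suggested fail-safe: if the car detects an impact or loses
# a sensor while on autopilot, it stops instead of driving on. Speculative
# desired behavior only, not Tesla's actual logic.

def autopilot_step(impact_detected: bool,
                   sensors_healthy: bool,
                   hands_on_wheel: bool) -> str:
    if impact_detected or not sensors_healthy:
        return "EMERGENCY_BRAKE"   # stop immediately after the first hit
    if not hands_on_wheel:
        return "WARN_DRIVER"       # prompt for hands on the wheel
    return "CONTINUE"

# After the first guide-post strike, sensors may already be damaged:
print(autopilot_step(impact_detected=True,
                     sensors_healthy=False,
                     hands_on_wheel=False))   # -> EMERGENCY_BRAKE
```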
 
The driver claims he didn't understand the alert; he says he speaks Mandarin, not English. Also, the system instructions appear to specifically warn that the feature is for divided, marked highway driving, not rural country roads. I suppose the directions were in English too, instead of Chinese.
 
I'm interested to see how these systems handle stupid people. We all see them; they seem to be breeding at an exponential rate. People will use autopilot in bad weather: snow, fog, ice, etc. How does an intelligent system engineer around the ridiculous mistakes truly stupid people make?

Locally, here in CNY, a driver drove his car into a tree while playing a video game on his phone.

These systems are only as good as their sensor technology. The reactions are only as good as the inputs from those sensors and the outputs of the PLC/computer acting on them.
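To make the garbage-in, garbage-out point concrete, here is a toy sketch of input validation between the sensors and the controller; the thresholds are made up:

```python
# Toy illustration of "only as good as the sensors": the controller can act
# only on readings that survive validation. All thresholds here are invented.
from typing import List, Optional

def validate_reading(distance_m: Optional[float]) -> Optional[float]:
    """Reject readings that are missing or physically implausible."""
    if distance_m is None or not (0.0 < distance_m < 300.0):
        return None   # bad sensor data: the controller is blind here
    return distance_m

def controller_output(readings: List[Optional[float]]) -> str:
    valid = [v for v in (validate_reading(r) for r in readings) if v is not None]
    if not valid:
        return "FAULT: no usable sensor data, hand control back to the driver"
    return "BRAKE" if min(valid) < 30.0 else "CRUISE"

print(controller_output([None, 25.0, 180.0]))   # one bad sensor -> still BRAKE
print(controller_output([None, None]))          # all sensors bad -> FAULT
```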
 
Tesla and the media have hyped up the promise of self-driving cars, and this has led to exaggerated expectations. Truly autonomous, fully automated cars are a long way off. The June issue of Scientific American has a good article on the differences between the progressions from driver assistance to partial automation, and from conditional automation to full automation. This is truly a software nightmare. The problem of full automation is large enough that the author doesn't see it solved until about 2075. Until then, my preference would be for Tesla not to allow hands-off driving, because there are too many variables that are not accounted for and that will eventually cause a failure and a potential fatality or injury.
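For reference, the progression the article describes lines up with the standard SAE J3016 automation levels; here is that ladder as a quick sketch:

```python
# The article's terms mapped onto the SAE J3016 driving-automation levels.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # e.g., adaptive cruise OR lane keeping
    PARTIAL_AUTOMATION = 2      # steering and speed together; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a limited domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

# Autopilot as shipped in 2016 sits at Level 2; the hard jump the article
# describes is from supervised Level 2/3 to independent Level 4/5.
level = SAELevel.PARTIAL_AUTOMATION
print(f"{level.name} = {level.value}")
```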
 
This article brings up the question that has been nagging me. Musk claimed that it might have been the bright sky plus the white truck that confused the collision avoidance system. What bugs me is that the car has both visual and radar sensors. AFAIK radar doesn't care about bright skies, so what really failed here? (One speculative possibility is sketched after the list below.)
http://www.forbes.com/sites/dougnew...-will-make-changes-to-autopilot/#269f727d5d54
The article goes on to point out that Consumer Reports is calling for some changes to make the beta-test software in the Tesla safer:
  • Disable the Autosteer feature of Autopilot, “until it can be reprogrammed to require drivers to keep their hands on the steering wheel.”
  • Refrain from referring to the system as Autopilot because it “is misleading and potentially dangerous.”
  • Provide clearer guidance to owners on how the system should be used “and its limitations.”
  • Issue “no more beta releases” and test all safety-critical systems before public deployment.
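On the radar question above, one speculative possibility is that radar returns resembling overhead structures (bridges, signs) get filtered out to avoid false braking, and that the fusion logic requires both sensors to agree before hard braking. Here is a toy sketch of that policy, emphatically not Tesla's actual code:

```python
# Speculative toy model of camera+radar fusion missing a crossing truck.
# Assumption: radar returns that look like overhead structures are discarded
# to avoid false braking, and hard braking requires both sensors to agree.

def should_brake(camera_sees_obstacle: bool,
                 radar_return: bool,
                 radar_looks_like_overhead_sign: bool) -> bool:
    # Discard radar hits that resemble bridges or overhead signs.
    radar_confirms = radar_return and not radar_looks_like_overhead_sign
    # Conservative fusion: brake hard only when both sensors agree, to keep
    # false-positive braking rare.
    return camera_sees_obstacle and radar_confirms

# Bright sky washes out the camera, and the high, flat trailer side reads
# like an overhead sign to the radar filter, so neither sensor "confirms":
print(should_brake(camera_sees_obstacle=False,
                   radar_return=True,
                   radar_looks_like_overhead_sign=True))   # -> False (no braking)
```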
 
Status
Not open for further replies.