Tesla Vehicle Safety Report (tesla.com)
32 points by redox99 on Jan 10, 2023 | 90 comments


> For example, more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle.

Well, if you want to know why some people are rear ending a Tesla, it's because it's far more unpredictable than a human driver.

https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

Also, it seems to contradict itself on data collection. First:

> In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated

But then they've decided to ignore crashes that don't deploy an airbag or similar...

> recent analysis led us to identify and implement upgrades to our data reporting. Specifically, we discovered reports of certain events where no airbag or other active restraint deployed [...]

So now we're only counting major accidents?


Autopilot does emergency braking for nonexistent obstacles. I'm sure it's contributing to rear end collision risk. I stopped using it for this reason.


Every autonomous braking system has that problem, including the one in my 2020 Macan. They are dangerously flaky hacks. As of 2025 they are mandated by law.


I own another VAG car with automatic emergency braking. If it is actually applying the brakes for you more than in cases where you are clearly about to rear-end someone, you should consider not tailgating as much. The sensor even gives you a warning if you are following too close at highway speeds.

The radar system they use in modern VW and Audi cars is extremely conservative, claiming to not engage the brakes at all above 35 mph, and waiting until the last possible moment to apply the brakes below that window. It's been absurdly accurate for me. Once every few months or so it "warns" outside of a dangerous situation, but those are almost always situations I could have prevented with a gentler approach to driving. For example, closely following a truck with a trailer that is slowly turning off the street will erroneously trigger the warning, because I'm continuing as if it will be out of my way, which is not the safest habit anyway.

The time it activated in a real danger situation it didn't even apply the brakes, it "primed" them so that my action of pressing the brakes would happen faster.


> The radar system they use in modern VW and Audi cars is extremely conservative, claiming to not engage the brakes at all above 35 mph, and waiting until the last possible moment to apply the brakes below that window.

I love my 2021 RS 5, but I have to disagree with this. Even in Traffic Jam adaptive mode, many times it will accelerate towards a vehicle _that it detects_, and only brake very late, and hard. On several occasions I've intervened because I wasn't sure if it would. When I've been alone in the car, with no passengers to give whiplash, I've held off on the brake longer and seen it brake hard of its own accord.

It will also absolutely emergency brake above 35mph.


It's possible the features introduced in the upmarket trims are to blame. I have all the hardware for things like distance based cruise control and parking assistance and lane keeping, but the previous owner did not buy them so I ONLY have automatic emergency braking. In 20k miles the worst problem I had was it failing self-calibration and just turning itself off for a few months.


Definitely possible - my girlfriend has a 2017 A4 without anything but AEB, basically, so I have no real comparison source.


> If it is actually applying the brakes for you more than in cases where you are clearly about to rear-end someone, you should consider not tailgating as much

Thanks, Dad. Clearly I have no idea what I'm doing, or what I'm talking about, so my real-world experience is invalid.

For the benefit of everyone else, phantom braking is a thing that by definition occurs when you're not following anyone. Such as when the car slams on the brakes upon approaching a car that's clearly about to turn right out of a nearby parking lot, but whose driver is equally clearly waiting for you to pass before entering the road.


I think I meant to add more weasel words to that, like "consider". Let me be more direct. In 20k miles in this vehicle, I have never once seen the radar-based AEB warning activate in a situation where it wasn't obviously sensing a car that would be in the path of my vehicle. It has not activated for trash, or overhead bridges, or cars in the next lane over, or any of the number of things everyone else says radar can't handle. It has only ever warned me when there is a car in front of me, sometimes a car that is taking extra time to turn off the road, once because of a car in a suicide lane. In those instances it does NOTHING other than play a generic warning tone and put a red warning on the dash; it never activates the brakes, and does not even do the brake "priming" that it claims to do if it thinks an accident is likely. It is EXTREMELY conservative, to the point that I do not consider it a backup.

I shouldn't have been patronizing; I'm just so confused how the same hardware and (possibly) software can produce such different results. Why would the Audi version be so much more sensitive?

>Such as when the car slams on the brakes upon approaching a car that's clearly about to turn right from a nearby parking lot, but who is equally clearly waiting for you to pass before entering the road.

Like I've driven past this exact situation hundreds of times with not even a peep from my car.


I did 30,000 miles in a 2019 Outback with a significant amount of phantom braking, more than I've had in my Tesla or e-tron. Anecdotes don't mean much.


Someone else on HN commented this the other day. RSymons RSEV did a comparison[0] of Tesla's self-park feature against three other cars. The other cars nailed the spots they tested flawlessly, while the Tesla just straight up didn't see most of them. When they moved the cars further and further away so that the spot became more obvious, it eventually did see the spot, but parked over a foot from the curb.

Tesla seems to have a lot of software issues when it comes to "smart" features like these. I wouldn't be surprised if they are indeed worse than the competition, but I'd love to see a similar comparison

[0] https://www.youtube.com/watch?v=nsb2XBAIWyA


Apparently not as common as on a Tesla. You should read about "phantom braking" issues on Teslas, especially after they removed the radar.


Yeah, I can't even imagine what it would be like to implement a system like that without radar. Tesla's approach would make a good Most Interesting Man In the World meme. "I don't always tackle the hardest engineering problems in human history, such as self-driving cars. But when I do, I tie one hand behind my back, close one eye, and consume some alcohol or drugs first."

One episode per 3000 miles is too many, and that's about what I was seeing before I disabled the feature.


Are you referring to the "gentle slow down" behavior that Autopilot often does or the "slam on the brakes as hard as you can" behavior of Automatic Emergency Braking? I definitely see the former more than once per 3000 miles but it doesn't bother me. I've never seen the latter.


In my vehicle, the 'gentle slow down' part that's associated with the radar-driven auto cruise control usually works OK. I'm referring to false triggering of AEB in completely inappropriate circumstances, when no obstacle is in the car's path.

If someone has built an AEB system with the degree of immunity from false triggers that's needed to make it a net positive for road safety, I haven't seen any evidence yet. Similar complaints across many different car models are widespread.


On my car, all of these occurred frequently.

It definitely is erratic with the speed, speeding up and slowing down slightly, which is annoying, but not so dangerous.

The dangerous part was driving home from work (6 miles): at least two very hard braking events each trip with no cars ahead of me (I only made a few trips like this before disabling it).

Automatic emergency braking (with the loud beeps and red message on the screen) on two-lane highways, over crests, and in a myriad of other circumstances.


Musk's obsession with not relying on radar and using only cameras as sensors is nuts, and likely a big reason why competitors' systems perform so much better. But it seems Tesla has finally backtracked on this:

https://electrek.co/2022/06/08/tesla-files-use-new-radar-con...


The rate for me was approximately 1 emergency brake maneuver per 6 miles of driving. I doubt any of them are as bad as the Tesla system.


Better a false positive (phantom braking) than a false negative (not braking when it should). I imagine they're all tuned with this in mind.


"Here's a drug that is guaranteed to prevent pancreatic cancer if you take it every day, but is very likely to give you one or more heart attacks at random intervals. Even if that happens, though, you'll probably be OK, because most of the heart attacks won't be all that severe. Want some?"

"Oh, by the way, starting in 2025, your breakfast cereal will be legally required to contain this drug."


Are you kidding me? One of the several times autopilot tried to kill me was when it randomly tried to flat stop on I-80 with a semi driver following me. No way he could have stopped in time if I hadn't disengaged and slammed on the accelerator.

(Absolutely no reason to stop and no weather or afternoon sun for confusion. My best guess is that flaky vision systems thought a semi truck was entering my lane, but I really don't know.)


But it didn't kill you. If you had a false negative, that means it failed to stop when it should have, and that almost definitely would have caused an accident. False negatives are more dangerous than false positives. Both can be dangerous, but false negatives are worse.

All AEB systems from all manufacturers have a certain amount of phantom braking, but non-Tesla AEB systems are well studied and definitely save lives.


GP gave an obvious example of a false positive. I don't see why their thinking fast enough to save their own life negates the point...


I think the current state of Tesla's self driving is more dangerous in this respect than I am.


Keep in mind, you can reasonably assume many minor accidents don't get reported to the government database. But Tesla is afflicted by perfect (or near perfect) knowledge of every vehicle in the fleet. A line has to be drawn somewhere. 0.1 mph? 2 mph? 25 mph? Half that? Where would you draw the line to be comparable to the federal data?


Sure. But we're very much in the data gathering phase of autonomous/semi-autonomous/enhanced driver assist vehicles, so I have a hard time arguing that "we don't really need to report this incident (not accident) to a database".


>Well, if you want to know why some people are rear ending a Tesla, it's because it's far more unpredictable than a human driver.

How unpredictable is a solid 5 seconds of the Tesla signaling its lane change and then slowing down? I blame the asshole that was in the far left lane going way too fast and not slowing down (just letting off the accelerator when the Tesla signalled would have prevented the accident) when someone is trying to enter. Then they brake way too late. Everyone behind that first vehicle is also to blame for following too closely at too fast of a speed.


Does the Tesla always activate its brake lights when slowing down? (I have an EV6, and I don't even know if the brake lights come on when I'm using regen as opposed to the brake pedal.) When traffic is stopped ahead, I've always braked not because I had to, but as an indication to the car behind me.


https://www.tesla.com/ownersmanual/modely/is_is/GUID-3DFFB07....

Yes, but it depends on the speed you are going

> If regenerative braking is aggressively slowing Model Y (such as when your foot is completely off the accelerator pedal at highway speeds), the brake lights turn on to alert others that you are slowing down.


Yes, brake lights are illuminated under any deceleration above a certain threshold, or application of the brake pedal.
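For anyone trying to picture the rule: here's a minimal sketch of that decision logic, assuming a single deceleration threshold. The threshold value and function name are made-up placeholders, not anything from Tesla.

    # Minimal sketch of the rule described above, NOT an actual implementation;
    # the threshold value is a made-up placeholder for illustration.
    DECEL_THRESHOLD_MS2 = 1.3  # hypothetical "noticeable slowdown" threshold, in m/s^2

    def brake_lights_on(decel_ms2: float, brake_pedal_pressed: bool) -> bool:
        # Lights come on for pedal application OR any deceleration above the threshold,
        # regardless of whether the slowdown came from friction brakes or regen.
        return brake_pedal_pressed or decel_ms2 >= DECEL_THRESHOLD_MS2

    print(brake_lights_on(decel_ms2=2.0, brake_pedal_pressed=False))  # True: strong regen alone
    print(brake_lights_on(decel_ms2=0.4, brake_pedal_pressed=False))  # False: gentle coasting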


Yes, as soon as you hit regen your brake lights come on. I tested this with a Honda Hybrid a long time ago to make sure. If your car has a diagram somewhere that you can display while driving you will actually see it on the display.


This isn't a universal truth. It varies by vehicle, and apparently even by what level of regen is selected.

https://www.greencarfuture.com/electric/does-regen-braking-t...


"Conclusion: Do Brake Lights Activate During Regenerative Braking? Yes, but Stay Alert"

Seems pretty obvious: the idea is that engine braking would also cause your car to slow down without lighting up the brake lights, so any slowdown at that level would not normally engage the brake lights. But as soon as you slow down more than that, the lights will come on, making the behavior pretty close to what you'd expect from a regular ICE car without regenerative braking.

Or would you suggest that a normal car should engage its brake lights when you let go of the accelerator and engine brake? (I can see a case for that.)


Yes, the little CGI model of your car in the UI even shows your brake lights illuminate. AFAIK the CGI representation of the car accurately shows tire position, blinkers, headlights, doors, etc.

In comparison, my 2015 Leaf doesn't illuminate the brake lights during regen, but its regen is significantly weaker than my Model 3's.


Thanks for asking this. I've never thought about the implications of regen braking when slowing down!


Searching around, behavior seems inconsistent across different brands/models/model years (e.g., Bolt behavior changed recently). As one-pedal driving becomes more prevalent, this might be something in need of a standard.

For Tesla Model Y:

> If regenerative braking is aggressively slowing Model Y (such as when your foot is completely off the accelerator pedal at highway speeds), the brake lights turn on to alert others that you are slowing down.

Source: Model Y manual: https://www.tesla.com/ownersmanual/modely/is_is/GUID-3DFFB07...


Might be a nice addition for other cars, for example, when you aggressively downshift and decelerate that way.


According to this article [1], the brake lights do come on. But the article is from 2021, so who knows if that is still true (or if it was ever true for all Teslas, since they only tested one).

1: https://www.greencarfuture.com/electric/does-regen-braking-t...


It is. You can see on the car's little icon that the brake lights turn on. Any reasonably quick slowdown shows them.


Given that, per the NHTSA, 29% of all accidents are rear-end collisions, I don't think the 35% number is meaningful with such a small sample size.
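To put the sample-size point in rough numbers, here's a quick back-of-the-envelope check; the quarterly crash counts are hypothetical placeholders, since Tesla doesn't publish them:

    # Rough significance check: is 35% rear-end crashes distinguishable from the
    # 29% national baseline? The crash counts n are hypothetical placeholders.
    import math

    def z_stat(p_hat: float, p0: float, n: int) -> float:
        # One-sample z statistic for an observed proportion vs. a fixed baseline.
        return (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)

    for n in (50, 200, 1000):
        print(f"n={n:5d}  z={z_stat(0.35, 0.29, n):.2f}")

    # With only ~50 crashes the z statistic is under 1 (pure noise); the 35% vs 29%
    # gap only clears the usual 1.96 cutoff once there are a couple hundred crashes.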

Also: Tesla has always been clear about their definition of "crash", it hasn't changed since they started reporting years ago.


> Also: Tesla has always been clear about their definition of "crash", it hasn't changed since they started reporting years ago.

But the report specifically says the definition changed as of Jan 2023, and they've retroactively adjusted their previous reporting...



Almost as upsetting as it is to see a Gish gallop link list rise to the top of that topic. Needless to say not one of those corroborates "history of lying about their cars' safety", and you surely know that. But you'll challenge us all to figure it out on our own anyway?

It's absolutely amazing that we have to go through this every time someone gets video of a Tesla event. There are millions of these cars on the road now, surely the fact that this happens only monthly sits as evidence against your position, no?

Edit: FWIW: my money is on this story being quietly retracted at some point anyway. FSD doesn't behave like that. It just doesn't. Almost certainly the driver got confused, panicked, or otherwise overrode the automation and then blamed it when talking to police. The police report is almost silent on the subject: https://www.documentcloud.org/documents/23569059-9335-2022-0...

This entire kerfuffle is based on this one sentence: "P-1 stated V-1 was in Full Self Driving mode at the time of the crash, I am unable to verify if V-1’s Full Self-Driving Capability was active at the time of the crash."


> Needless to say not one of those corroborates "history of lying about their cars' safety", and you surely know that.

From the first link:

> Federal safety regulators accused Elon Musk of issuing “misleading statements” on his company’s Tesla Model 3 last year, sending a cease-and-desist letter

Federal regulators don't exactly send cease and desists casually. Misleading and misrepresenting is dishonest, i.e. lying.


The National Highway Traffic Safety Administration, the New York Times, and even published research[0] have all been calling Elon Musk out for years now, yet his fans still believe him despite Tesla withholding data.

[0] https://engrxiv.org/preprint/view/1973/3986


This is the gallop part. I called you out on your first set of links (one of which legitimately hoodwinked a commenter here!) and instead of defending those arguments you're pivoting to another. I won't engage here either, except to point out that this isn't "published research" (it literally tells you that on the first page).

The clear and inarguable truth is that the Tesla fleet is pushing three million cars now. They're safe. Are they the safest? Safer? Less safe? There's space for argument. But the kind of absolutist story you're spinning just doesn't hold. If these cars were actually causing accidents at even a moderately higher rate, we'd quite clearly know about it. And they aren't.


This study is from one of the original links I shared...

You accused me of gish gallopping. I've since slowed down and focused on specific events. See my other comments about the battery swapping scam for example.

Please provide some actual data proving these cars are safer than average. Tesla certainly hasn't been able to. And research, like the paper I just linked (which you refused to acknowledge), definitely does not back up Elon's claims.


> Please provide some actual data proving these cars are safer than average. Tesla certainly hasn't been able to.

But... they have. That's exactly the link we're discussing. You don't want that to be true, so you're digging around trying to pretend via a flood of bad evidence that they're somehow "lying" in that data.

But... what if they aren't? Again, three million cars are an awfully big signal to try to spin. Isn't the Occam's Razor interpretation (which also jibes with my personal experience with the product, FWIW) that... they're safe?


I can't tell if you forgot a </s> there but just in case you're sincere...

I just paid $5k for a used extended van. One of the most dangerous cars you can buy. Obviously I don't expect this to be any safer than a $100k Tesla.

Similarly, I find any comparison between the "average car" and a Tesla's crash ratings to be completely useless. How do we know that other luxury cars around that price point don't actually do much, much better than Tesla does? The "data" provided is insincere at best. It's also not actual data; they're just telling us some numbers. They've consistently refused to provide any actual data.

The OP also again makes the claim of the "lowest probability of injury of any car by NHTSA", which the NHTSA has repeatedly asked them to stop saying.[0] The NHTSA doesn't rank cars by probability of injury at all.

Remember when they said the Tesla Model S earned 5.4 stars and the NHTSA had to make a public statement that it does not award more than 5 stars? lol. Yet another lie.

[0] https://krcrtv.com/features/auto-matters/nhtsa-to-tesla-stop...


I read through all these links. They’re not great aside from the worker safety one.

The first one states that Tesla is being misleading for claiming that its cars score better for safety while not mentioning that weight classes affect the scores. The original claim is still true, though.

Most of the others deal with the claim that the cars' accidents-per-mile ratings are lower because they're highway miles or because they're driven by rich people. But… they're still low.

You can argue at best that there is no rigorous proof that Tesla cars are safer than other cars and that these are cherry-picked stats for marketing. But there's no compelling evidence that they're more dangerous. Calling them lies is disingenuous, regardless of one's opinion of Musk.


[Deleted for my poor reading comprehension].


Good grief. Those are workplace accidents. In the factory.

Edit: but you get the meta point, right? Upstream comment threw a ton of links at you along with an intended (and deliberately incorrect) interpretation knowing that you wouldn't read them carefully. That's why that kind of argumentation is so toxic.


Those are work related injuries, not car accidents. I already agreed with that point.


Musk has repeatedly been called out by regulators for his misinformation. Some claims, like the safety of Autopilot, have been directly debunked by research. This article has a good literature review if you're interested:

[0] https://engrxiv.org/preprint/view/1973/3986

> In independent research, Templeton (2020) compared Tesla’s stated crash rates with Autopilot enabled and not enabled by attempting to control for increased use of Autopilot on relatively safer freeways. To compare human-driven crash rates of freeways and non-freeways, Templeton used fatality rates, which may overestimate crash rates on freeways as higher speeds increase crash severity according to a fourth power law (Evans, 1994). When controlling for road type, the crash rate benefits of Autopilot narrowed significantly. Templeton was unable to fully assess their comparison of Autopilot crash rates with national estimates due to their different definitions of crashes.

> Goodall (2021) investigated struck-from-behind crashes of automated vehicles using age-weighted crash rates from the SHRP 2 NDS database as a baseline. Automated vehicles were struck from behind at five times the rate of human-driven vehicles, although much of the difference could be attributed to higher rates of urban driving experienced in automated vehicle testing.

I'm also not sure what percentage of Level 2 ADAS vehicles are Teslas vs. other brands, but they're by far the most common vehicles in driver-assist-involved crash reports:

[1] https://www.theverge.com/2022/6/15/23168088/nhtsa-adas-self-...

Google's Waymo was light-years ahead of Tesla. The reason they didn't go to market was that they knew it wasn't really ready. Elon Musk has lied (as in, he knew the truth and lied) about their driver-assist technology's safety and capabilities and has been sued for it[2]. There's no other way to sugarcoat it.

[2] https://www.theverge.com/2022/9/14/23353787/elon-musk-tesla-...


Your first quote implies that the Tesla autopilot had better than average performance but not by much, when normalized for roads. Like… sure. Who cares? That’s a nuanced statistic that doesn’t even change the direction of the difference.

Your second argument… again, sure, whatever? Not a contradiction or evidence of a lie.


Ok, lemme slow down and do it one at a time. Remember the battery swap scandal? There were many articles written about it.[0] Turns out it was a conscious scam by Tesla to increase its vehicle credits.[1]

I've got more, but I think I'd have better luck taking it one at a time so I'll wait...

[0] https://dailykanban.com/2015/05/27/tesla-battery-swap-unused...

[1] https://dailykanban.com/2015/05/27/tesla-battery-swap-unused...


I don’t care about Tesla much so… no, I don’t know about a battery swap scandal from 7 years ago. But reading this first article (which is the same as the second article) it would appear that Tesla invited people to use a battery swap facility and a team of investigators staked it out for 48 hours and nobody came.

???

What am I supposed to take from this? I think maybe you intended to provide a different article with an actual claim but the first one seems unlikely to matter.

I’d like to see something specific to claims that teslas aren’t actually safer than average if possible


True, although it is indicative of some level of community sentiment and will certainly have a comment section full of rebuttals such as your own. A good advantage of a mostly free-speech platform.


I do believe that Teslas are quite safe despite the high-profile failures, but I'm very unconvinced by a stat that compares accidents per mile using Autopilot to accidents per mile not using Autopilot. Those are not comparable miles - people don't uniformly use Autopilot across all driving conditions. Imagine you had a subsystem that did absolutely nothing, but people turned it on when driving on empty highways. Its accidents-per-mile number would look great!
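A toy simulation of that thought experiment (all crash rates and mile mixes below are made up purely for illustration):

    # Toy illustration of the selection-bias point above. All rates are made up.
    # A "feature" that does nothing, but is only engaged on easy highway miles,
    # still posts a better crashes-per-mile number than the overall fleet.
    CRASHES_PER_MILE = {"highway": 1 / 1_500_000, "city": 1 / 400_000}  # hypothetical

    overall_mix = {"highway": 0.55, "city": 0.45}   # miles the whole fleet drives
    feature_mix = {"highway": 0.95, "city": 0.05}   # miles where drivers choose to engage it

    def crash_rate(mix):
        return sum(share * CRASHES_PER_MILE[road] for road, share in mix.items())

    print(f"fleet average:      1 crash per {1 / crash_rate(overall_mix):,.0f} miles")
    print(f"do-nothing feature: 1 crash per {1 / crash_rate(feature_mix):,.0f} miles")
    # The feature "wins" purely because its miles are easier, not because it helps.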


What's also egregious about this is that Tesla has now "decided" that if active restraints or airbags are not deployed, that incident "does not count going forward".

And I cannot believe in good conscience that Tesla, as an automobile manufacturer, is unaware that today's active restraint and airbag systems are far more nuanced and weigh multiple criteria when deciding to deploy, versus the "speed exceeds X mph, deploy" of old. You can have a very significant incident (two that I've witnessed recently involved vehicles hitting stationary objects at ~30 mph) without airbag deployment. But if that were a Tesla in FSD that hit something at 30 mph and didn't deploy airbags, well, that's not an accident according to Tesla.

That also doesn't account for "incident was so catastrophic that restraint systems could not deploy", also "not counted" by Tesla. Or just as egregious, "systems failed to deploy for any reason up to and including poor assembly line quality control", also not an accident and "not counted".


I don't disagree that it's not the best of comparisons (and I wonder if a better one could be imagined + implemented...). But still, it's not like we turn on Autopilot/FSD only "on empty highways", far from it! Certainly it's a tool where the user needs to learn its strengths and weaknesses and use it accordingly, but it is useful in so many more situations than not, that it's also not a terrible or meaningless comparison to make!

Anecdata: Almost all (95%?) of my highway driving (Europe) is on Autopilot. I don't even enjoy doing the driving myself any more in those situations where I know that Autopilot is doing a pretty good job. In particular, Autopilot does a better job than I can in conditions of heavy snow / rain / otherwise poor visibility conditions. I feel a lot safer being the operator than the driver in those instances! (The alternative would often be to slow down by a significant amount, and/or use up more of my focus/attention, leading to either less safe driving or forced breaks.)


> and I wonder if a better one could be imagined + implemented...

There is a ton of data using industry-defined conditions, and a ton of research has gone into determining the types of conditions that can affect accident rates. The only reason we can't compare is that Tesla only releases what we see in the link above. [1]

Best-guess estimates of normalizing Autopilot data against the average mix of highway/city driving find that AP's safety advantage effectively disappears (a rough sketch of what that normalization looks like follows the links below). Of course this is rough, and Tesla likely has much better data that they are choosing not to share. [2]

[1] https://www.iihs.org/topics/fatality-statistics/detail/urban...

[2] https://twitter.com/NoahGoodall/status/1489291552845357058
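For anyone who wants to see the shape of that normalization, here's a rough sketch; every input below is a placeholder guess, not actual Tesla or NHTSA data:

    # Sketch of the road-type normalization described above. All inputs are
    # illustrative placeholders, not actual Tesla or NHTSA figures.
    baseline = {"highway": 1 / 1_500_000, "non_highway": 1 / 450_000}  # crashes per mile, hypothetical
    autopilot_reported = 1 / 4_850_000                  # a headline "one crash per X miles" style figure
    autopilot_mile_mix = {"highway": 0.90, "non_highway": 0.10}  # guess at where AP is engaged

    # Naive comparison: AP vs. the all-roads average (what the safety report implies)
    all_roads_mix = {"highway": 0.55, "non_highway": 0.45}
    naive_baseline = sum(all_roads_mix[k] * baseline[k] for k in baseline)

    # Fairer comparison: AP vs. the same mix of roads it is actually driven on
    matched_baseline = sum(autopilot_mile_mix[k] * baseline[k] for k in baseline)

    print(f"naive:        AP looks {naive_baseline / autopilot_reported:.1f}x safer")
    print(f"road-matched: AP looks {matched_baseline / autopilot_reported:.1f}x safer")
    # With these placeholders the advantage only narrows; with other plausible
    # inputs it can shrink to ~1x, which is the "disappears" result cited above.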


It's not at all difficult to imagine many other measurements: minutes driven, type of road driven on, speed at time of crash, whether another vehicle was involved, etc. As a data analyst what I really want though is more granular data so we can figure out whether the published metrics are being cherry-picked.


> The alternative would often be to slow down by a significant amount

You should probably slow down.


This needs to be emphasized so much more.

Short of "not driving", human drivers don't have the option of "well, this is less than optimal, I can't do well, so I won't try".

FSD absolutely does. Disengages, or is not even engaged in the first place.

I wonder what Tesla's FSD safety / disengagement stats look like for December to February in Pittsburgh, for example.


> Short of "not driving", human drivers don't have the option of "well, this is less than optimal, I can't do well, so I won't try".

They sure do, and will stop when things are too complex for them (e.g. bad weather).


Yes, how about a stat for "accidents/incidents which were preceded by Autopilot disengagement within the previous ~5 seconds."


That's what Tesla is counting. From the methodology section of the safety report:

> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.
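Put as a minimal sketch (my paraphrase of the quoted disengagement-window rule, not Tesla's actual code):

    # Paraphrase of the 5-second disengagement rule quoted above; not Tesla's code.
    from typing import Optional

    AP_WINDOW_S = 5.0  # crashes within 5 s of Autopilot deactivation still count against Autopilot

    def attributed_to_autopilot(ap_active_at_impact: bool,
                                seconds_since_deactivation: Optional[float]) -> bool:
        # Counted in the Autopilot bucket if AP was on at impact, or had been
        # switched off less than 5 seconds before the impact.
        if ap_active_at_impact:
            return True
        return (seconds_since_deactivation is not None
                and seconds_since_deactivation <= AP_WINDOW_S)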


Yeah, the "no active restraint deployed" thing is horseshit, and Tesla knows it, too. I go to manufacturer training sessions on these systems maybe once a year as a firefighter-paramedic, and there are multiple criteria for deployments these days - you can have a significant collision and if the system determines its better for the occupants not to deploy, then it doesn't.

According to Tesla, that's not an accident.

For FSD evaluation purposes? It sure as hell is.

But it's no longer counted.


I was responding to the point about counting crashes after autopilot disengagement, not the criteria for what counts as a crash.


That's what I get for not reading the source material! Thanks for correcting me.


Right, like when they tried to blame Walter Huang for FSD veering directly into a barrier at the US-101/CA-85 split. "Autopilot was disengaged when it crashed." Yes, because the driver grabbed the wheel when Autopilot veered directly into a gore point.


That wasn't the first time they'd done that, either. Prior to that, after another fatal collision, they released a statement: "The vehicle had warned driver prior to collision to pay attention to driving".

They, strangely, didn't mention that this warning was issued for a single steering wheel inactivity failure, and that that happened _eighteen_ _minutes_ prior to the collision.


And I assume that autopilot is used primarily on highways, in either stop and go or steady flow situations.

Really they should know this, because the autopilot system should be identifying it based on speed limits and detected type of road.

Of course that assumes some ability to look into a black box, but also location tracking should provide it too.


This is true, but there’s a counterpoint here too. If Autopilot takes all of the “easy miles”, then you should expect to see a spike of accidents concentrated in the non-Autopilot “hard miles”.

Since we don’t see that spike, I’m inclined to believe that the effect size of the highway driving preference is relatively small.


Comparing Tesla to the US incident average is a neat way of ignoring that high-end electric vehicles are not driven by the US average driver, in average situations. The driver (or should we call them executives, in self-driving cars?) is likely to have much more to do with the statistics than the car.


I wrote an article [1] along these lines a few years ago. There are lots of non-comparables between Teslas and the US fleet average, although this has been getting less true over time (as Tesla released less expensive cars, and as its vehicles aged).

1: https://www.thedailybeast.com/how-tesla-and-elon-musk-exagge...


It's not even comparing Tesla to US incident average, that would be an improvement. It's comparing Tesla in situations where people are comfortable activating Autopilot to US incident average.

The absolute easiest miles for a luxury vehicle vs all miles for all cars.


Public policy in the town I grew up in was that intersections only got lights after a certain number of reported accidents had happened there.

There was a particularly nasty intersection near the most popular bike shop that the bike club kept going to the city about but since nobody had died yet, they weren't going to put one in.

People know which intersections are accident prone, well before the city does something about it. There are routes I don't take because there are others that are safer or easier to follow at all hours of the day instead of just outside of rush hour, or also when I'm tired/distracted. But there are also magnets that pull certain demographics to them. Like that bike shop, or the shitshow outside of CostCo where I live now.


Calling Tesla drivers executives seems a bit... out of touch? Maybe the most premium model is an executive car. The vast majority are $50-$65k, with a household income of $133k. This means they are top 25% earners, not top 5%.

More important is the fact that they are driven by older people (median age 48).

Model 3 Demographics: https://hedgescompany.com/blog/2019/03/tesla-model-3-demogra...


I think GP's "Executives" remark was less directly about the employment state of the subject than an attempt to distinguish them ("people in the driving seat of self-driving cars") from "drivers" (people who actually drive cars). IMO "operators" is probably a term with more fitting analogies.


Nothing is stopping you from looking at the Q3 2022 Autopilot vs. Q3 2021 numbers; the others are nice for reference.


The reason for that difference should be evaluated then. Is it an income thing? Car cost thing? (compare stats to non-Teslas in the same group) Does Tesla's marketing create a false sense of confidence in the car's abilities? Does the UI distract? Numbers can lie, but in general, I think the data is out there to support or reject the "fake news" narrative.


This is absurdly, and at this point clearly intentionally, disingenuous.

You would absolutely expect Autopilot/FSD to have fewer crashes than human-in-control driving, because AP/FSD isn't available in the conditions that result in more crashes: bad/variable roads, poor conditions, etc. Anything else would imply AP/FSD is absurdly dangerous, as it would be getting higher crash rates in good conditions than human drivers get in bad conditions.

If AP/FSD were actually better and/or safer than human drivers you would expect Tesla to aggressively demonstrate that. Given that they are choosing not to compare crash rates by road condition it is reasonable to assume that the statistics do not support the claim.

This is before we get into whether AP/FSD disconnecting moments before crashing counts as crashing under AP/FSD. I recall articles claiming that Tesla didn't include such crashes as being caused by AP/FSD, but I have no idea if that was ever confirmed. Certainly I know Tesla has gone out of its way to blame drivers for AP/FSD crashes, so my faith in any kind of honesty from Tesla in these matters is minimal.


I was curious about what stats were used for the following:

> By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.

The figures they are using for this are 4,954,323 vehicles in crashes (i.e., a single crash with 3 vehicles adds 3 to this total; this includes large trucks, motorcycles, buses, other, and unknown vehicle types) from the NHTSA data, and 3,228.8 billion miles from the FHWA data.
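The arithmetic checks out against those inputs:

    # Reproducing the ~652,000-mile figure from the NHTSA/FHWA numbers cited above.
    vehicles_in_crashes = 4_954_323        # 2021 NHTSA: vehicles involved in crashes
    vehicle_miles_traveled = 3_228.8e9     # 2021 FHWA: total vehicle miles traveled

    print(f"{vehicle_miles_traveled / vehicles_in_crashes:,.0f} miles per crash-involved vehicle")
    # ~651,715 miles, which Tesla rounds to "approximately every 652,000 miles"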


Tesla is discovering it's created a lightning rod for blame.

Previously, accident culpability was diffused across individual drivers, so motivating any group of individuals to action was inherently a _social_ problem, and one that is often dismissed as such.

The development and deployment of a singular identifiable causal agent casts off that shield of diffuse social responsibility in a way that leaves Tesla vulnerable to intense scrutiny. It doesn't help that, with Musk as Tesla's figurehead, the collective schadenfreude we gain from his failures is converted into a cathartic group joy.

Nevertheless, the dynamics of blame have changed because the terrain has changed. The closest analogy would be a single human driver who drove all the miles Autopilot drove in 2022 and was responsible for the same number of crashes. Presented outside the frame of this debate, I can't help but imagine we would want them off the road, regardless of whether their crash rate per mile was lower than the population average.


My main question is whether the miles equated are in fact equal.

Autopilot is used far more often on the highway, I believe, than in residential areas – where I also believe crashes are more frequent.

It'd be interesting to see the miles broken down by road type, speed, etc.


It's interesting that there is so much variability in "Tesla vehicles not using Autopilot technology". I wonder if this is just the small sample size compared to the national data, or some other effect?


Is there similar data for other companies like BMW or Ford?

Also, just seeing improvement from Q3 2022 vs. Q3 2021 is good, although by a small margin.


I hope my son's first car is a Tesla. I hope I'm well off enough to support that purchase. I want him to be as safe as possible and have a cool ride :D



