Self-Driving Cars


A thread to discuss self-driving cars and other vehicles. No politics, please.

Technology, commercial aspects, legal considerations and marketing are all on-topic.


  • Companies (e.g. Tesla Inc.) are only on-topic when discussing their self-driving products and research, not their wider activities. The exception is when those wider activities directly impact (or are impacted by) the self-driving side of the business - e.g. if self-driving car development is cut back due to losses in another part of the business.

  • Technology that's not directly related to self-driving vehicles is off-topic unless you're discussing how it might be used for them in future.

  • If we're talking about individuals here, that should only be because they've said or done something directly relevant to the topic. Specifically, posts about Tesla do not automatically need to mention Elon Musk. And Musk's views, politics and personal life are firmly off-topic unless you can somehow show that they're relevant to self-driving vehicles.

    Original post 
Google is developing self-driving cars, and has already tested one that has driven over 140,000 miles on the road in Nevada, where it is street-legal. They even let a blind man try a self-driving car. The car detects where other cars are in relation to it, as well as the curb and so on, follows speed limits and traffic laws to the letter, and knows how to avoid people. It also uses a built-in GPS to find its way to places.

Cadillac plans to release a scaled-back, simpler version of similar technology by 2015 - what they call "Super Cruise", which isn't total self-driving, but does let you relax on highways. It positions your car in the exact center of a lane, slows down or speeds up as necessary, and is said to be meant for ideal driving conditions (I'm guessing that means ideal weather, no rain or snow, etc.).
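
For anyone curious, here's a rough idea of what "keep the car centered in its lane and adjust speed" might boil down to as control logic. This is a toy sketch only - it's not how Super Cruise is actually implemented, and every gain, threshold and function name below is invented:

    # Toy sketch of lane centering plus speed matching as control logic.
    # Not how Super Cruise actually works; all gains and numbers are invented.
    def steering_correction(lane_offset_m, gain=5.0, max_angle_deg=5.0):
        """Proportional steer back toward the lane center; offset is + when drifting right."""
        angle = -gain * lane_offset_m                       # degrees of steering per meter of drift
        return max(-max_angle_deg, min(max_angle_deg, angle))

    def target_speed(set_speed_kph, lead_gap_m, min_gap_m=40.0):
        """Hold the driver's set speed, but ease off when the car ahead gets too close."""
        if lead_gap_m >= min_gap_m:
            return set_speed_kph
        return set_speed_kph * (lead_gap_m / min_gap_m)

    print(steering_correction(0.3))   # drifting 0.3 m right -> steer left 1.5 degrees
    print(target_speed(110, 25))      # car ahead only 25 m away -> slow to about 69 km/h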

I am looking forward to such tech. If enough people prefer to drive this way, and the technology works reliably, it could result in safer roads with fewer accidents. Another possibility is that, using GPS and maybe the ability to know ahead of time which roads are most clogged, these cars could find the quickest route from place to place.
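
To make the routing idea a bit more concrete, here's a minimal sketch of congestion-aware route finding: a plain shortest-path search over travel times scaled by a congestion factor. The road names and numbers are made up, and real navigation systems are obviously far more sophisticated:

    # Minimal congestion-aware routing sketch: Dijkstra over travel times scaled
    # by a congestion factor. All road names and numbers are invented.
    import heapq

    def quickest_route(graph, start, goal):
        """graph: {node: [(neighbor, free_flow_minutes, congestion_factor), ...]}"""
        queue = [(0.0, start, [start])]            # (minutes so far, node, path)
        best = {start: 0.0}
        while queue:
            minutes, node, path = heapq.heappop(queue)
            if node == goal:
                return minutes, path
            for neighbor, free_flow, congestion in graph.get(node, []):
                cost = minutes + free_flow * congestion
                if cost < best.get(neighbor, float("inf")):
                    best[neighbor] = cost
                    heapq.heappush(queue, (cost, neighbor, path + [neighbor]))
        return None

    roads = {
        "home":       [("highway_on", 5, 2.5), ("back_road", 12, 1.0)],
        "highway_on": [("office", 10, 2.5)],
        "back_road":  [("office", 15, 1.0)],
    }
    print(quickest_route(roads, "home", "office"))
    # Heavy congestion (2.5x) makes the 27-minute back road beat the highway route.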

On the other hand, hacking could be a real concern, and I hope it doesn't become a serious threat. It's looking more and more like we're living in one of those sci-fi Everything Is Online worlds that fiction has depicted for a long time.

(Mod edited to replace original post)

Edited by Mrph1 on Mar 29th 2024 at 4:19:56 PM

RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#1776: Mar 29th 2024 at 7:48:11 AM

So now, when something goes wrong for whatever reason, the complete lack of any proper media team really won't help with their attempts at playing catch-up.

To reference what I said earlier, when something does go wrong and it should be easy to prove that it's misuse and actually the technology is fine if used correctly...

It kind of needs to be spread by something other than:

  1. Elon Musk alone
  2. Random social media fans
  3. People arguing on small media forums.

Avatar Source
Imca (Veteran)
#1777: Mar 29th 2024 at 7:48:11 AM

[up][up] And the National Transportation Safety Board aren't the transportation professionals?

Because they're the ones that published the data on accidents, not Tesla.

[up] That part you would think, but a lie spreads on social media faster than the truth, especially when it targets acceptable targets.

See the number of people who think electric cars catch fire more readily than Zippo lighters.

Edited by Imca on Mar 29th 2024 at 11:51:37 PM

Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1778: Mar 29th 2024 at 7:52:37 AM

[up] To clarify, it's a little of both. The NTSB and NHTSA collect accident data, but most of that reporting is voluntary; they have no way to compel it from automakers. Tesla is somewhat unique in that its active vehicle telemetry allows it to provide substantially more comprehensive data than anyone else.

In other words, the stats are probably distorted a bit by the accuracy of Tesla's reporting, not the other way around.

Tesla also publishes its own crash and fire data on its safety page, although in aggregate.

Edited by Fighteer on Mar 29th 2024 at 10:57:54 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
M84 Oh, bother. from Our little blue planet Since: Jun, 2010 Relationship Status: Chocolate!
#1779: Mar 29th 2024 at 8:05:36 AM

The irony is that I'm pretty sure I'm the only person here who has had some first-hand experience with Tesla's FSD. And I will admit it didn't cause me any grief during the times I was driven in the car.

Disgusted, but not surprised
Mrph1 MOD he/him from Mercia (4 Score & 7 Years Ago) Relationship Status: Tell me lies, tell me sweet little lies
#1780: Mar 29th 2024 at 8:44:27 AM

A reminder - this is a thread about self-driving cars.

Tesla is only on-topic if we're discussing the self-driving features of their vehicles, not their electric car business as a whole, and definitely not their solar energy business.

Musk is potentially relevant to the thread in his role within Tesla Inc and as a champion of self-driving technology. But that is all.

If we're talking about Musk in a post here, that should only ever be because he's directly relevant due to something he's said or done - probably something related to Tesla's self-driving technology. Posts about Tesla do not automatically need to mention Musk.

His views, politics and personal life are firmly off-topic unless you can somehow show that they're relevant to self-driving technology.

Mrph1 MOD he/him from Mercia (4 Score & 7 Years Ago) Relationship Status: Tell me lies, tell me sweet little lies
#1781: Mar 29th 2024 at 9:20:40 AM

Also, just to confirm - the pinned post has been updated to clarify the points above.

NativeJovian Jupiterian Local from Orlando, FL Since: Mar, 2014 Relationship Status: Maxing my social links
#1782: Mar 29th 2024 at 11:11:09 AM

Rather than fearmongering, let's see the actual data.

This is more difficult to do than it seems. The numbers that most people like to cite are the ones provided by Tesla, but they seem to count "an accident" as the airbag firing, while the industry in general typically goes off of police reports or insurance claims. Indeed, if you look at insurance numbers, Tesla actually has the highest accident rate of any major car brand.

Another problem with comparing "Autopilot active" vs "all non-Tesla driving" is that Autopilot is something like 90%+ highway driving — which has lower accident rates (per mile) even in completely non-automated driving. So you're comparing Tesla's accident rate under the easiest driving conditions to everyone else's accident rate under all driving conditions, which is not apples to apples. There are additional statistical correlations that should be controlled for to make a proper analysis, as well, but generally are not — for example, Teslas are more expensive vehicles, so they're more likely to be owned by people who are older and richer than average. Older and richer drivers are also among the demographic groups with the lowest accident rates. So if we're trying to determine the role of the vehicle in accident rates, we need to control for that, but this is basically never done. Here's a source that goes into the details of some of these discrepancies.
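
To illustrate that highway-mix confounder with deliberately made-up numbers: even if everyone crashed at exactly the same per-mile rate on each road type, a fleet that drives mostly on highways would still post a much better blended number. A quick sketch:

    # Made-up numbers, purely to show the confounder: everyone here crashes at the
    # SAME per-mile rate on each road type, yet the mostly-highway fleet looks
    # roughly twice as safe in the blended comparison.
    HIGHWAY_RATE = 0.5   # crashes per million highway miles
    CITY_RATE    = 2.0   # crashes per million city miles

    def blended_rate(miles_by_road):
        crashes = (miles_by_road["highway"] / 1e6 * HIGHWAY_RATE
                   + miles_by_road["city"] / 1e6 * CITY_RATE)
        return crashes / sum(miles_by_road.values()) * 1e6

    autopilot_style = {"highway": 9_000_000, "city": 1_000_000}   # ~90% highway miles
    typical_driving = {"highway": 5_000_000, "city": 5_000_000}   # 50/50 split

    print(blended_rate(autopilot_style))   # 0.65 crashes per million miles
    print(blended_rate(typical_driving))   # 1.25 crashes per million miles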

The tl;dr is that the best guess we can make based on the available data is that Tesla accident rates are roughly equivalent to or slightly higher than the rest of the industry's. The most disappointing thing is that Tesla very definitely has the data to make an apples-to-apples comparison (they just need to sort it the same way the rest of the industry does), but they do not make it public. The second source I posted goes into this as well.

In any case, all of this is actually beside the point I was making earlier: that by doing the Full Self Driving beta with regular Tesla owners on public roads, Tesla is privatizing benefit (the data they gather) and socializing risk (the chance of an accident). This behavior affects not just Tesla owners (who can choose for themselves whether to use the beta service after considering the risks and benefits), but everyone on the road. Even if it is statistically safer (which, as mentioned above, is a dubious claim), Tesla does not have the right to make that decision for the public at large. But that's exactly what they've done.

As an aside, Fighteer, you continue to be dismissive (calling my arguments a "performative emotional response" and "fearmongering") and I just want to be on the record as saying that I do not appreciate it. I'm trying to engage the topic in good faith and from a neutral perspective (in the sense that I'm willing to be convinced, not that I don't have an existing opinion). Do me the favor of at least keeping the contempt from showing in your replies.

Edited by NativeJovian on Mar 29th 2024 at 2:23:54 PM

Really from Jupiter, but not an alien.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1783: Mar 29th 2024 at 11:23:38 AM

There's a lot of criticism of Electrek's analysis of that LendingTree report in the comments, which I don't have time to dig into. It is unclear if Tesla drivers are self-reporting accidents at a higher rate, for one thing.

Anyway, since there's no data showing whether these insurance claims are related to Autopilot or FSD use, there's no way to usefully determine the value of the system. You're broadly correct that Tesla only reports crashes in which the airbags deployed, and this could skew the data.

Further, there's no breakdown of the fault in said accidents. If someone hits a Tesla, that's still an accident involving the Tesla.

The lack of accurate crash reporting is a serious issue that goes well beyond the scope of this thread.

Edited by Fighteer on Mar 29th 2024 at 2:24:33 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
NativeJovian Jupiterian Local from Orlando, FL Since: Mar, 2014 Relationship Status: Maxing my social links
#1784: Mar 29th 2024 at 11:42:53 AM

There's a lot of criticism of Electrek's analysis of that Lending Tree report in the comments, which I don't have time to dig into. It is unclear if Tesla drivers are self-reporting accidents at a higher rate, for one thing.

I find it frustrating that you repeat "the data is the only thing that matters here" several times, and then when I post data you say "well I don't have time for that".

Anyway, since there's no data showing whether these insurance claims are related to Autopilot or FSD use, there's no way to usefully determine the value of the system.

Well, Tesla drivers are presumably using Autopilot and FSD, so at minimum we can say that the real-world use of Autopilot and FSD does not make Teslas safer overall. We can't say whether that's because Autopilot and FSD are extremely safe but manually-driven Teslas are extremely unsafe, manually-driven Teslas are extremely safe but Autopilot and FSD are extremely unsafe, or some point in between those two extremes. But we can definitely say that, for whatever reason, having Autopilot and FSD available objectively does not make Teslas less accident-prone.
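
A toy decomposition (with invented numbers) shows why the blended figure can't settle it: very different Autopilot-vs-manual splits can produce exactly the same overall rate.

    # Invented numbers: two very different Autopilot-vs-manual splits that blend
    # to exactly the same overall rate, so the combined figure alone can't tell
    # us which mode is the safe one.
    def overall_rate(share_autopilot, rate_autopilot, rate_manual):
        return share_autopilot * rate_autopilot + (1 - share_autopilot) * rate_manual

    print(overall_rate(0.5, 0.4, 1.6))   # Autopilot very safe, manual driving risky -> 1.0
    print(overall_rate(0.5, 1.8, 0.2))   # Autopilot risky, manual driving safe      -> 1.0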

Further, there's no breakdown of the fault in said accidents. If someone hits a Tesla, that's still an accident involving the Tesla.

I see no reason why this wouldn't be a random distribution. Why would people be hitting Teslas at higher rates than other brands?

The lack of accurate crash reporting is a serious issue that goes well beyond the scope of this thread.

The data I posted is just as accurate as anything else in the industry, so where is this coming from?

Really from Jupiter, but not an alien.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1785: Mar 29th 2024 at 3:42:02 PM

The problem is that the statistics presented by LendingTree don't analyze the causes of crashes or incidents, and therefore there's no way to tell if Autopilot or FSD were involved in any of them. I agree that Tesla's safety statistics may be somewhat misleading when it comes to Autopilot, since it's mainly used on highways and highways have lower crash rates in general, but there's a lack of normalization.

Subaru, which makes a lot of vehicles that have built-in ADAS (EyeSight), is third on the list of accident rates. It seems counterintuitive that the use of ADAS would lead to more crashes, but again there is no way to tell how frequently Subaru drivers use EyeSight, nor whether its use was involved in any of the reported accidents.

LendingTree's data covers insurance claims, but the NHTSA data cited by Tesla includes only crashes where a police report was filed and a vehicle was towed. LT would naturally report higher raw collision numbers since not all accidents result in a tow.

The data are simply too messy to support any particular conclusion.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1786: Apr 3rd 2024 at 10:55:45 AM

X thread from Ashok Elluswamy, AI director at Tesla. It is in response to a video clip from an FSD 12.3 user whose vehicle perfectly navigated offroad through a marked construction zone. (The original poster of the video noted that FSD made that maneuver accurately without a lead vehicle earlier, off-camera.)

Whether some surface is drivable or not is all contextual. We typically don’t want the car to drive on dirt, but since the paved road is closed, the car needs to drive on dirt. The same logic also applies to, for example, mounting small curbs to avoid a larger obstacle. So “collision avoidance” cannot be an absolute objective, but a relative one. The actual objective is minimizing risk of injury / property damage while still getting to the destination. It requires a lot of intelligence to assess this risk accurately. This comes naturally to humans and is now also obvious to the car’s AI.

Elluswamy is claiming that FSD is capable of contextual intelligence: in other words, real-world problem solving rather than following rote instructions. It seems to be getting really good. Still looking for counterexamples or cases of it screwing up, but not many of those are making it to my feed.
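
Taken literally, the objective he's describing is a weighted trade-off rather than a hard rule. A rough sketch of what that might look like, with every option, weight and number invented purely for illustration:

    # Rough sketch of a "relative, not absolute" objective: score each candidate
    # maneuver by expected harm plus a penalty for failing to make progress, then
    # pick the cheapest. All options, weights and numbers here are invented.
    def maneuver_cost(expected_damage, progress_penalty, damage_weight=10.0):
        return damage_weight * expected_damage + progress_penalty

    candidates = {
        "stay on the closed paved road": maneuver_cost(0.8, 0.0),    # likely hits the barrier
        "detour onto the dirt":          maneuver_cost(0.05, 1.0),   # a bit rough, but keeps moving
        "stop and wait":                 maneuver_cost(0.0, 5.0),    # harmless but makes no progress
    }
    print(min(candidates, key=candidates.get))   # -> detour onto the dirt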

Incidentally, it is officially no longer in beta. FSD is now called "Supervised" in the marketing, a label that is repeated multiple times throughout the instructions and user acknowledgments.

Edited by Fighteer on Apr 3rd 2024 at 3:37:27 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
#1787: Apr 3rd 2024 at 12:17:49 PM

Full Self-Driving (Supervised) is very much a better name. I’d rather the “Supervised” went first, but it provides much clearer context as to the use case.

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
Galadriel Since: Feb, 2015
#1788: Apr 3rd 2024 at 4:22:20 PM

In any case, all of this is actually beside the point I was making earlier: that by doing the Full Self Driving beta with regular Tesla owners on public roads, Tesla is privatizing benefit (the data they gather) and socializing risk (the chance of an accident).

It doesn’t seem beside the point to me at all. If the question is “does Tesla’s FSD beta make people less safe”, then whether FSD cars are more likely to get in accidents than other cars is precisely the point. And even from your analysis of the stats, which adjusts for FSD generally being used in lower-risk situations than the average driver faces, it doesn’t sound like there’s a clear, demonstrable difference:

The tl;dr is that the best guess we can make based on the available data is that Tesla accident rates are roughly equivalent to or slightly higher than the rest of the industry's.

So if that’s the case - if they’re basically the same as the rest of cars on the road - how are they making you less safe?

You could make a much stronger statistical case that drivers in their teens and early 20s are “endangering you without your consent” by being on the roads at all, given their higher rates of accidents, and I think most people would easily recognize that as an unreasonable characterization. You could likewise say that the very existence of trucks and SUVs is “endangering you without your consent”. It’s…not how consent works. A person’s choice is to drive or not, taking into account the existence and behaviour of other drivers. If a thing presents a clear and demonstrable risk (like drunk driving), then people push for laws to ban it; but even from what you’ve cited, the evidence on FSD doesn’t seem to come close to that threshold.

(All of this is separate from the question of whether Tesla should be calling something that isn’t full self-driving “Full Self-Driving”. They shouldn’t. I feel that’s easily answered but not germane to the policy question of what forms of driver assistance and self-driving should be allowed on the roads.)

Edited by Galadriel on Apr 3rd 2024 at 4:29:23 AM

NativeJovian Jupiterian Local from Orlando, FL Since: Mar, 2014 Relationship Status: Maxing my social links
#1789: Apr 3rd 2024 at 4:48:52 PM

It doesn’t seem beside the point to me at all. If the question is “does Tesla’s FSD beta make people less safe”, then whether FSD cars are more likely to get in accidents than other cars is precisely the point.

"does Tesla's FSD beta make people less safe" is not the point I was making. You literally quoted me saying that it wasn't the point so that you could say "it was the point, actually". I don't know what else to say except to repeat myself that no, that is not the point I was making, even if it's the point you want to make.

You could make a much stronger statistical case that drivers in their teens and early 20s are “endangering you without your consent” by being on the roads at all, given their higher rates of accidents, and I think most people would easily recognize that as an unreasonable characterization.

You're conflating two different ideas. Teenagers may well be more statistically likely to cause an accident, but that's not the nature of my complaint. People should be allowed to use public roads because "public roads that no one is allowed to use" is a self-defeating concept. Tesla using public roads as a beta testing facility for their self-driving software is a categorically different thing. The benefit of "other people get to use roads" is "I also get to use the roads", which seems like a fair trade to me. The benefit of "Tesla gets to test their software on public roads" is "Tesla gets to develop a product that they sell for a profit".

Why should Tesla be allowed to put the public in danger (however slight) for their own benefit? No one else developing self-driving software feels the need to have untrained consumers test it on public roads. That's the point I'm making about privatizing benefit and socializing risk.

Really from Jupiter, but not an alien.
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#1790: Apr 4th 2024 at 1:22:00 AM

[up][up][up] Really, it should be "Supervised Self-Driving", as that's descriptive, accurate, and doesn't sound nearly so silly. But at least they found the parentheses to try and downplay it with.

Avatar Source
Smeagol17 (4 Score & 7 Years Ago)
#1791: Apr 4th 2024 at 4:56:20 AM

Yeah, it does raise the question of what “full” refers to.

Edited by Smeagol17 on Apr 4th 2024 at 2:56:37 PM

Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1792: Apr 4th 2024 at 6:00:57 AM

The name is aspirational. It describes what the product is intended to do when it is mature. The people complaining about the name will look fairly silly if/when it gets certified for L4/5 autonomy.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Chortleous Since: Sep, 2010
#1793: Apr 4th 2024 at 6:23:53 AM

If.

'Aspirationally' naming widely-available consumer products, especially if the name doesn't represent what that product actually does in the here and now, is at best dumb and at worst purposely dishonest. Even dangerous, in this case—we all know full well people don't read EULAs or fine print for shit.

Edited by Chortleous on Apr 4th 2024 at 7:29:05 AM

Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
#1794: Apr 4th 2024 at 6:40:57 AM

Especially since there's literally nothing stopping Tesla from just changing the name when it's fully operational.

Like they just did.

There is no compelling reason to name a thing misleadingly, and "it's aspirational" as a defense is moronic.

Not Three Laws compliant.
Mrph1 he/him from Mercia (4 Score & 7 Years Ago) Relationship Status: Tell me lies, tell me sweet little lies
#1795: Apr 4th 2024 at 7:04:43 AM

It's the sort of thing that trading standards (at least in the UK and EU) may have Strong Views on, but it's also a thing many businesses do.

Is it shoddy, misleading marketing? Maybe. But branding based on the day one 'minimum viable product' doesn't age well, and companies always prefer to sell you the dream.

Integrity in marketing and branding is rarer than it should be. And when it is there, it's often because a regulator intervened.

RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#1796: Apr 4th 2024 at 7:10:05 AM

Why would anyone look silly for pointing out that they had a misleading name for years, if and when it gets certification? It just confirms that they were being overoptimistic for all those years of providing the service. <_>

Avatar Source
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1797: Apr 4th 2024 at 8:01:29 AM

It's not misleading if it labels itself as developmental. No reasonable person would read "Full Self-Driving Beta" and believe it's a mature, finished product. That is literally what "beta" means.

There are more reasonable grounds to go after "Autopilot" as a brand name, an argument we've had to death already and that several courts have agreed with. I still don't agree, but it's not worth fighting over.

What I care about is whether it works, not what it's called, and I refuse to apologize for that position. Maybe if it was called "Miracle Perfect Driver 9000 That Totally Works Always" there would be some valid concerns.

Edited by Fighteer on Apr 4th 2024 at 11:03:29 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#1798: Apr 4th 2024 at 8:14:53 AM

I'd argue that calling it a beta this entire time is also wrong, because I don't think anyone could call "Self-driving system that needs to hand over to a human sometimes" a feature complete release just looking for bugs.

Avatar Source
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
#1799: Apr 4th 2024 at 8:16:53 AM

Are you seriously going to nitpick over the meaning of "beta" now? You can't win that one.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
#1800: Apr 4th 2024 at 8:23:18 AM

Well, it was either not Full Self Driving or it wasn't feature complete despite its name and was therefore an alpha release. So yes, I'm going to be nitpicky, because one way or the other it was misnamed. It is misleading to give something a name that should mean 'finished except for bug testing' when it still needs work to even do its primary role. If I use beta software, and occasionally I've had to (i.e. the whole remote development frontend/client that I use sometimes is beta software), then I expect that it does everything that it's for. Not perfectly, and more buggy than normal software, but I wouldn't expect it to suddenly give up and go "Nope, not doing anything now, back to you".

Avatar Source
