A thread to discuss self-driving cars and other vehicles. No politics, please.
Technology, commercial aspects, legal considerations and marketing are all on-topic.
- Companies (e.g. Tesla Inc.) are only on-topic when discussing their self-driving products and research, not their wider activities. The exception is when those wider activities directly impact (or are impacted by) their other business areas - e.g. if self-driving car development is cut back due to losses in another part of the business.
- Technology that's not directly related to self-driving vehicles is off-topic unless you're discussing how it might be used for them in future.
- If we're talking about individuals here, that should only be because they've said or done something directly relevant to the topic. Specifically, posts about Tesla do not automatically need to mention Elon Musk. And Musk's views, politics and personal life are firmly off-topic unless you can somehow show that they're relevant to self-driving vehicles.
Cadillac plans to release a scaled-back, simpler version of similar technology by 2015: what they call "Super Cruise", which isn't total self-driving but does let you relax on highways. It positions your car in the exact center of a lane, slows down or speeds up as necessary, and is said to be meant for ideal driving conditions (I'm guessing that means ideal weather, no rain or snow, etc.).
I am looking forward to such tech. If enough people prefer to drive this way, and the technology works reliably, it could result in safer roads with fewer accidents. Another possibility is that, using GPS and maybe the ability to know ahead of time which roads are most clogged, they can find the quickest route from place to place.
On the other hand, hacking could be a real concern, and I hope it doesn't become a serious threat. It's looking more and more like we're living in one of those sci-fi Everything Is Online worlds that fiction has depicted for a long time.
(Mod edited to replace original post)
Edited by Mrph1 on Mar 29th 2024 at 4:19:56 PM
"Autopilot does not literally drive your car for you" is not something I would classify as minutiae.
"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
It's not what you would.
But is Tesla putting it front and centre in advertising? Are the other companies working on it doing that? They all bluntly state that fully autonomous driving is their goal, but get really vague about where they actually are whenever they talk about it.
It is minutiae because it's in the fine print, three pages in, metaphorically speaking. If you don't make a habit of scrutinizing everything, it's extremely easy to miss, and one disclaimer when you turn the feature on can't override that. If you have to go out of your way to find out the truth, it's not obvious, and most people won't be aware of it.
Edited by Zendervai on Jun 12th 2022 at 6:03:50 AM
Tesla's published reports state that approximately ten times fewer Autopilot crashes occur per mile driven than the NHTSA aggregate average. Note that this is not necessarily a like-for-like comparison, since Autopilot is intended for use on highways while the NHTSA data covers all driving conditions.
It seems clear that when Autopilot is used as intended — in suitable environments with the driver paying attention — it increases vehicle safety. What is not as well studied, at least in terms of publicly available data, is how it compares to other ADAS. Presumably this is because the companies making those systems do not publish the same data or do not publish it using the same metrics.
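To make the per-mile comparison concrete, here's a minimal sketch with entirely made-up numbers (not Tesla's or NHTSA's actual figures); only the shape of the calculation matters:

```python
# Made-up illustrative figures; only the normalization logic is the point.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by exposure (vehicle-miles driven)."""
    return crashes / (miles / 1_000_000)

# Hypothetical: one crash per 4.31M miles on Autopilot vs. one crash per
# 0.484M miles across all US driving.
autopilot_rate = crashes_per_million_miles(1, 4_310_000)
aggregate_rate = crashes_per_million_miles(1, 484_000)
ratio = aggregate_rate / autopilot_rate  # roughly 9x in this toy example
```

The like-for-like caveat still applies: highway miles are inherently safer per mile than the all-conditions aggregate, so a raw ratio like this overstates the difference.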
Note that Basic Autopilot, which is available on all Teslas as a standard feature, is not part of the FSD Beta program and it is not updated with nearly the same frequency. Because it is a standard feature and not a closed beta, it is subject to much stricter regulatory scrutiny.
Edited by Fighteer on Jun 12th 2022 at 8:12:02 AM
You know, yesterday I was thinking that it would be good for Tesla if they framed their autopilot-but-not-quite system as "here you can teach a machine how to drive!" or something like that. It's not quite as misleading, and I'd imagine there are people out there who would jump at the prospect.
"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled." - Richard Feynman
That is definitely a motivation for many people in the FSD Beta program, but I think it's a little too abstract for most Tesla buyers.
Tesla is more transparent than most automakers regarding the safety of their ADAS software. But realistically this is something that NHTSA should be doing: gathering and releasing the data itself or setting forth standards for automakers to report on their own safety.
Certainly, detractors will accuse Tesla of cherry-picking its data to make itself look as good as possible, but if that were the case and Autopilot were killing people on a regular basis — a child- and grandmother-seeking automotive missile, as it were — there would be more actual incidents and not just this small number of events that keep getting harped on.
It doesn't help that every time there's a serious crash involving a Tesla, news media put the make of the car front and center in the headline, a treatment given to no other automaker.
That latter bit, Tesla did to themselves. They brand themselves in a really particular way, and when a Tesla catches fire, the news going "it was a Tesla!" is actually following Tesla's own lead on branding. Tesla cars are also still rare enough that most things involving them are more newsworthy than a Ford pickup catching fire, if only because the number of Ford pickups is at least an order of magnitude higher than the number of Teslas overall.
The other element is that recalls from companies like Toyota and Honda and Ford go unremarked because those companies are generally pretty quiet about them and they're usually either unremarkable or hard to explain. Like, my Toyota had a recall because something was wrong with the rear right window and it wouldn't roll down. That's not newsworthy; it's a vaguely annoying thing that affected something like 5% of their vehicles. Most recalls are along those lines.
Tesla, on the other hand, routinely issues recalls for software updates. That's not an issue in and of itself, but a software recall usually comes with a patch list, and there tends to be at least one really alarming-sounding thing in the mix. That's standard for patches, not a knock against Tesla, but it's a lot easier to spin a story about Tesla having to patch a really nasty but extremely rare glitch out of all of their cars than about Toyota needing to replace a single part in the rear right passenger door that causes a problem that is mildly inconvenient at worst.
One other element is that this isn't actually new. Ford got a ridiculous reputation in the 70s because of the Pinto, and the news at the time loved rubbing it in that Honda and Toyota were eating Ford and GM's lunches while Ford couldn't even make a passenger car that wouldn't catch fire. Tesla positions themselves as the wave of the future, while the other companies mostly just say "you buy cars from us". Of course Tesla's gonna be the focus.
Edited by Zendervai on Jun 13th 2022 at 9:00:22 AM
Note that the software "recall" thing was forced on Tesla by NHTSA. Normally it doesn't label those updates as such because, really, why would you? Customers don't have to bring their cars to a dealer or service center; they get "fixed" while sitting in a garage or driveway overnight. It'd be like Apple sending you a recall letter every time iOS gets updated.
Edited by Fighteer on Jun 13th 2022 at 9:01:17 AM
Okay. That doesn't change that a weird alarming thing in a patch list is automatically going to draw more attention than a random minor part replacement that barely does anything.
When someone shoves themselves into the spotlight, they will get more attention. That's what most of this boils down to. There's not actually a media bias against Tesla, at least not generally. It's just more newsworthy that the company that constantly yells about how advanced and well engineered its cars are has these strange issues than that a particular year of the Honda Civic needs a replacement windshield wiper mechanism.
And none of this actually addresses one of the core problems. Tesla chose a term with a really specific connotation to name their self-driving system, with the result that most people who don't have the time or inclination to research it get the wrong impression of what it actually is.
Call it "Driver Assistance" and say that "Autopilot" isn't available yet and is still under development. Boom, problem solved.
Edited by Zendervai on Jun 13th 2022 at 9:06:50 AM
A lot of headlines are grabbing attention today with the news that NHTSA (the US National Highway Traffic Safety Administration) has published a report with data on crashes involving "advanced vehicle technologies" — meaning in this case SAE Level 2 (dubbed "advanced driver assistance systems" or ADAS) and SAE Levels 3-5 (dubbed "automated driving systems" or ADS).
The reports cover the period from July 20, 2021 to May 15, 2022 and include mandatory reporting by manufacturers and "operators" — fleet operators? — of all crashes in which one of these systems was involved.
Crashes involving a Level 2 ADAS-equipped vehicle are reportable if the Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user or resulted in a fatality, a vehicle tow-away, an air bag deployment, or any individual being transported to a hospital for medical treatment.
It is important to note that the data are not screened for duplicate reports and are not normalized to the quantity of vehicles in service or the frequency of ADAS use. Crashes also may not be reported if there is insufficient vehicle telemetry, and not all makers of ADAS have access to said telemetry.
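The reporting rule described above can be sketched as a simple predicate; the parameter names are my own shorthand, not NHTSA's actual schema:

```python
def is_reportable_level2_crash(adas_active_within_30s: bool,
                               vulnerable_road_user: bool,
                               fatality: bool,
                               tow_away: bool,
                               airbag_deployed: bool,
                               hospital_transport: bool) -> bool:
    """Reportable if the Level 2 ADAS was in use within 30 s of the crash
    AND at least one of the severity criteria is met."""
    severity = (vulnerable_road_user or fatality or tow_away
                or airbag_deployed or hospital_transport)
    return adas_active_within_30s and severity
```

Note the conjunction: a fender-bender with ADAS active but no severity criterion met is not reportable, and neither is a severe crash where the ADAS was last active more than 30 seconds out.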
The Level 2 report covers 392 crashes over the period listed above.
- Most reports come from telematics or direct complaints.
- The overwhelming majority of crashes are from Tesla (273), with Honda in second (90) and Subaru in third (10). Other companies are minimally represented.
- The plurality of incidents (125) come from California.
The Level 3-5 report isn't as big because there aren't very many total vehicles operating in this category. It covers 130 crashes.
- The main report sources are, in order, telematics, field reports, and testing.
- The field is much larger here. Waymo is the highest (82), with Transdev (34) and Cruise (23) in second and third place.
- The majority of incidents (90) come from California.
- The data are more accurate in terms of crash type, severity, and damage, which is reasonable since these cars have much more pervasive telemetry and are monitored closely.
- There is one ADS report about Tesla, which is strange because Tesla does not classify Autopilot or FSD Beta as Level 3. It is probably an error.
The media, of course, are reporting that Tesla has a lot of ADAS crashes. Gotta get them clicks. But it must be stressed that the lack of normalization is crucial. If Tesla Autopilot is operating over ten times as many vehicle-miles as other systems, it's naturally going to report more crashes. Unless every automaker is reporting on this, the data won't sync up. Also, as noted, Tesla has access to vehicle telemetry in the majority of crashes, while other automakers may not.
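A toy example of why un-normalized counts mislead (the numbers are entirely made up): two fleets with the same underlying crash rate but different exposure produce very different raw counts.

```python
# Identical hypothetical crash rate for both fleets, in crashes per
# million vehicle-miles; only the exposure differs.
RATE = 0.5

fleet_miles = {"big_fleet": 200_000_000, "small_fleet": 20_000_000}
expected_crashes = {name: RATE * miles / 1_000_000
                    for name, miles in fleet_miles.items()}
# big_fleet -> 100 crashes, small_fleet -> 10, despite identical safety.
```

Raw counts alone would make the bigger fleet look ten times more dangerous, which is exactly the comparison the headlines are inviting.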
California's overrepresentation in the data for ADAS is likely directly proportional to the number of Tesla vehicles in that state. Its overrepresentation for ADS is undoubtedly because the companies that are developing Level 3-5 systems are doing the majority of testing there.
All in all, the data here are too inconsistent to generate any sort of valid statistical universe, especially on Level 2 systems.
Edited by Fighteer on Jun 15th 2022 at 10:27:40 AM
What that information ultimately means is that there is ample reason to do a proper long-term study, including a really close look at how the systems react to imminent collisions.
One common safety feature is that whenever the automated system can't figure out what to do, it kicks control back to the driver. If it happens to do that 33 seconds before a collision, the crash isn't counted as a self-driving collision, and that could be a genuine issue even if it's not a deliberately intended outcome.
I don't believe that anyone set it up to do that on purpose... but I would believe that Tesla put fixing that particular issue at low priority: the system shutting off that close to an accident makes no practical difference to the driver, but it makes Tesla's metrics look better, so other things get deemed more important.
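The attribution concern above boils down to a threshold rule. A sketch (the 30-second window comes from the NHTSA criteria; the 33-second scenario is invented):

```python
def attributed_to_system(seconds_before_crash: float, window_s: float) -> bool:
    """Count a crash against the automated system only if the system was
    active within `window_s` seconds of impact."""
    return seconds_before_crash <= window_s

# A system that hands control back 33 s before impact escapes a 30 s
# window, even though the handoff may have contributed to the outcome.
counted = attributed_to_system(33, window_s=30)  # False
```

Any hard cutoff creates this edge: crashes just outside the window vanish from the statistics entirely, which is why the choice of window matters so much.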
There may or may not be anything here, but there is every reason to closely examine and study it and, if it turns out that Tesla is aware of these issues and is dragging their feet on fixing it or are not adequately explaining it, then they can get pinned to the wall over it.
One important thing to keep in mind here is that Tesla is a corporation run by a man who is...pretty unscrupulous. It doesn't mean that something sketchy is happening...but it also means that assuming it isn't happening is pretty naive. (I'm emphasizing the corporation part here. Never trust a company to do anything but attempt to enrich themselves, they are not your friends, they will happily screw you over the instant they think it would benefit them)
Edited by Zendervai on Jun 16th 2022 at 8:40:41 AM
This does reveal an interesting discrepancy between Tesla's metrics and the NHTSA's, too.
As you said on the other page ("When investigating and reporting Autopilot crashes, Tesla counts any event that occurs within five seconds of a disengagement as happening under Autopilot"), Tesla only counts a crash as an Autopilot crash if it happened within 5 seconds of disengagement, while the NHTSA counts any usage within 30 seconds. You would hope that people would realise they're in control and react at some point in that window, but the point where responsibility should cross over from "Autopilot" to "the driver" is probably more than 5 s, in my opinion, just given braking times and distances. 5 s is, what, approximately the braking time from 70 mph? So if Autopilot is used at highway speeds, even an instant reaction 6 s before impact isn't necessarily going to have you stopped in time.
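A rough sanity check of the "5 s from 70 mph" figure, assuming constant hard braking of about 6.3 m/s² (roughly 0.64 g; the exact deceleration is an assumption, and real braking performance varies with tires, surface, and weather):

```python
MPH_TO_MS = 0.44704  # miles per hour to metres per second

def stopping_time_s(speed_mph: float, decel_ms2: float = 6.3) -> float:
    """Time to brake from speed_mph to a stop at constant deceleration (v / a)."""
    return speed_mph * MPH_TO_MS / decel_ms2

def stopping_distance_m(speed_mph: float, decel_ms2: float = 6.3) -> float:
    """Distance covered while braking to a stop (v^2 / 2a)."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel_ms2)

t = stopping_time_s(70)      # about 5 s
d = stopping_distance_m(70)  # about 78 m
```

Under these assumptions the 5-second figure holds up, and that's before adding any human reaction time on top.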
My takeaway from this is not "Tesla is run by an unscrupulous man", it's that there are no uniform reporting standards for ADAS/ADS crashes, so the data are too noisy to be of any use. Tesla is the only company that publicly reports these statistics.
You, uh, seriously misunderstood what I said. I'm actually really confused by your reading of what I said, you left out all of the actual substance in your reading of it.
I didn't say that this is necessarily deliberate. I said that there's enough information here to justify a proper study and investigation, and that while we can't assume anything is definitely wrong... we also can't assume that Tesla is innocent, because it's a corporation run by someone who doesn't seem to care much about professional ethics unless forced to. My point was that since we can't assume Tesla is automatically innocent of any wrongdoing, an investigation is all the more justified.
It could be either outcome. But an independent investigation is a good idea regardless.
Next time, try taking what people say at face value before assuming bad faith on their part. I'm getting very tired of you cherrypicking random stuff from other people's posts and arriving at readings that barely resemble what was actually said at all. Generally, when someone says something, they usually intend for their whole statement to be read and aren't going for double or triple layers of meaning because that's fucking exhausting and most people are really bad at doing that anyway.
Edited by Zendervai on Jun 16th 2022 at 9:11:54 AM
The Information: Inside Apple’s Eight-Year Struggle to Build a Self-Driving Car
This article is login-walled, so salient points may be found in this Twitter thread. I'll post a few takeaways, but this is the sort of thing that we need to understand in depth.
- Apple's self-driving project, code-named Titan, is struggling under absent leadership, unclear expectations, and an emphasis on demos instead of real-world performance.
- The project is infected with a "demoware" mentality, which occurs when a product is carefully fitted to a specific scenario in order to show it off, but none of that effort helps it solve the general-purpose problem.
- Apple is trying to ditch the high-definition maps that companies like Waymo and Cruise use in their lidar-equipped vehicles, but their vehicles are consequently unable to precisely navigate their environment, bumping into curbs, misjudging lanes, and so on.
- Ian Goodfellow, a manager and scientist with extensive machine-learning experience, recently left the team. Apple has struggled to build deep-learning models for its AI.
- Tesla is generating billions a year in revenue from selling self-driving to customers, while Apple is burning a billion per year with no product in sight. (Caveat: it can easily afford that expenditure.)
- Leadership turnover and constant changes in strategy — do they pursue ADAS first, or go immediately to a fully autonomous (driverless) vehicle? — cause frustration among team members.
- In some cases, leaders are more focused on the passenger experience than on getting the thing to work in the first place. As an example, at one point the car was going to be built entirely out of glass until someone pointed out that even if it drove perfectly, other drivers might still hit it.
- The focus is still on hand-coded driving rules rather than deep machine learning (Tesla's approach). For deep learning, vast amounts of driving data are required, but the company has few vehicles on the road and thus no way to gather that data.
- The test vehicles they've built are apparently super ugly as they are festooned with sensors (up to 14 lidar units?). Apple appears to be considering building up that aesthetic rather than trying to hide it.
- In real-world testing, a car almost struck a jogger on an unmarked crosswalk, requiring intervention from a safety driver.
Edited by Fighteer on Jul 11th 2022 at 3:55:01 PM
I don't know why, but these seem like the kinds of problems that occur when someone tries to board a hype train without the right experience, ideas, and company-wide vision to pull it off...
Didn't Paris have a tourist bus that was made with lots of glass, that made tourists feel like they're in a greenhouse?
Tesla FSD Beta version 10.13 is releasing today to qualified owners. So far the release notes have only been screencapped, as in this tweet, and we can't read all of them. I can paraphrase the most important improvements, though: unprotected left turns, lane-keeping on unmarked roads, and anticipation of pedestrian behavior.
The release notes specifically call out "Chuck Cook" style intersections, and that's because Tesla has been working very closely with the submissions of a particular owner (named Chuck Cook, obviously) who is faced with an unprotected left turn across six lanes of traffic. As an aside, whoever built that intersection should be fired and then dope slapped.
While testing FSD Beta at this intersection, Cook noted that the car would frequently "panic" and refuse to advance through the intersection or even change its mind partway through and turn right instead. Elon Musk has personally stated that the behavior should be substantially improved in this release. Among other things, the car should make better use of the median area between the lanes.
Lane-keeping on unmarked roads refers to the car's tendency to take the middle of a wide road that has no lane markings, such as what you would find in many suburban residential areas. In this release, the car should prefer the right side of the road, splitting it into virtual lanes as a human driver would.
Finally, 10.12 has some notable issues with being overly cautious around pedestrian crossings. It's very paranoid about the possibility that someone might decide to cross and so it tends to slow down and wait in ways that a human driver would not. This release promises to improve the algorithms that try to predict when someone will cross ("interact with the ego car") vs. stay on the sidewalk.
One thing we can take away from these notes is that Tesla feels that it has solved most of the mechanical aspects of self-driving — getting the car to understand its environment — and is now into the scenarios that test even some humans: predicting how other people will behave. This is where self-driving cars in general will face their greatest challenges.
Having a mix of self-driving and human-driven cars on the roads is the global worst-case for autonomy. The more cars are self-driving, the more predictably they will behave and thus the fewer conflict situations will arise.
Edited by Fighteer on Jul 18th 2022 at 11:58:31 AM
Yeah. By the way, I misinterpreted the original tweet. FSD Beta 10.13 has been released internally to employees, hence the leaked notes, but Musk said it needs a little more time before the customer release.
Electrek: Tesla self-driving smear campaign releases ‘test’ that fails to realize FSD never engaged
Remember Dan O'Dowd, the "billionaire" founder of Green Hills Software and California Senate candidate whose sole public platform appears to be to get the federal government to ban Tesla Full Self-Driving? Well, he came out with another smear ad, this time claiming that in a closed-course test, FSD Beta failed to recognize children in the road.
Unsurprisingly, a review of this alleged test shows that the driver never engaged FSD, despite claims to the contrary. It's not completely clear why it failed to engage; the guess is that, since the car wasn't on a recognized road, it couldn't get GPS navigation data. I'm also not sure why Automatic Emergency Braking wouldn't have engaged in this situation, but you can see that the driver is not interacting with the car and it just drifts to a stop.
Anyway, the point is not to trust anything you see from this person or group.
Edited by Fighteer on Aug 10th 2022 at 4:55:00 AM
If Tesla had just called it "Driver Assistance", they'd have entirely avoided this problem.
It's why a fruit juice company can't brand itself as "Medicine Co." or something like that. They'd be taking an existing word and co-opting it to give a false impression of what their product actually is.
Tesla decided to avoid calling it what it literally is and used a term that is seriously misleading, because it feeds into a really common misunderstanding of what real-world autopilots actually are. And on top of that, Tesla isn't doing much to correct that misconception. Think about it: have you ever seen them advertise Autopilot and make a point in the ad of saying that it's not actually an autopilot, but a driver assist? Most of the material where they talk about it is like "Tesla Autopilot can do automatic parallel parking!" and things like that. It's really not hard to see why someone would make assumptions from that.
You can't assume everyone knows the minutiae of this stuff. If all anyone's seeing is Tesla going "we have autopilot and our cars can be self-driving!", why would anyone think to question what that means? Tesla makes a point of advertising themselves as the wave of the future; why would someone feel the need to question that?
Edited by Zendervai on Jun 12th 2022 at 2:25:02 PM