Self-Driving Cars
A thread to discuss self-driving cars and other vehicles. No politics, please.

Technology, commercial aspects, legal considerations and marketing are all on-topic.


  • Companies (e.g. Tesla Inc.) are only on-topic when discussing their self-driving products and research, not their wider activities. The exception is when those wider activities directly impact (or are impacted by) the self-driving work - e.g. if self-driving car development is cut back due to losses in another part of the business.

  • Technology that's not directly related to self-driving vehicles is off-topic unless you're discussing how it might be used for them in future.

  • If we're talking about individuals here, that should only be because they've said or done something directly relevant to the topic. Specifically, posts about Tesla do not automatically need to mention Elon Musk. And Musk's views, politics and personal life are firmly off-topic unless you can somehow show that they're relevant to self-driving vehicles.

    Original post 
Google is developing self-driving cars, and has already tested one that has logged over 140,000 miles on the road in Nevada, where it is street-legal. They even let a blind man try a self-driving car. The car detects where other cars are in relation to it, as well as the curb and so on, follows speed limits and traffic laws to the letter, and knows how to avoid people. It also uses a built-in GPS to find its way to places.

Cadillac plans to release a scaled-back, simpler version of similar technology by 2015 - what they call "Super Cruise", which isn't total self-driving, but does let you relax on highways. It keeps your car in the exact center of a lane, slows down or speeds up as necessary, and is said to be meant for ideal driving conditions (I'm guessing that means ideal weather - no rain or snow, etc.).

I am looking forward to such tech. If enough people prefer to drive this way, and the technology works reliably, it could result in safer roads with fewer accidents. Another possibility is that, using GPS and maybe the ability to know ahead of time which roads are most clogged, they can find the quickest route from place to place.

On the other hand, hacking could be a real concern, and I hope it doesn't become a serious threat. It's looking more and more like we're living in one of those sci-fi Everything Is Online worlds that fiction has depicted for a long time.

(Mod edited to replace original post)

Edited by Mrph1 on Mar 29th 2024 at 4:19:56 PM

SeptimusHeap from Switzerland (Edited uphill both ways) Relationship Status: Mu
#1701: Feb 16th 2024 at 9:47:40 AM

Now to be fair, we don't know for certain that the deceptive naming was the reason for the crash discussed here, even though this concern has been raised before (including in this thread). Mine's a plausible theory, but there are many other ways it could have happened.

"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled." - Richard Feynman
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1702: Feb 16th 2024 at 9:48:18 AM

From a legal perspective, the software being engaged doesn’t inherently matter, because the software is not meant to drive the car; it’s meant to assist a fully capable person in driving the car.

If someone got drunk and jumped off a building with an umbrella expecting it to slow their fall Mary Poppins style the umbrella company would not have legal liability even if the umbrella was open.

The fact that could change things is that the owner clearly thought his car could drive itself. Was that belief reasonable? Normally we’d say no, and Tesla will argue as such, but since we’ve got a motoring equivalent of an umbrella company selling “Fully Mary Poppins magical slow fall umbrellas” there’s a liability argument there. That argument will end up being considered against the evidence of not just the Tesla small print, but also Musk’s public hype statements.

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1703: Feb 16th 2024 at 9:48:38 AM

Yeah, of course.

I was more going for the “this is a misleading name and it wasn’t a good move because it oversells what’s actually happening in general”.

[up] It’s like, if someone attempts to use the Cybertruck like a boat and drowns, normally, people would be like “well, that guy’s an idiot.” But Musk literally told people that the Cybertruck can be used as a boat in some circumstances. That goes from “some people on Reddit were hyping each other up” to “the guy in charge of the company literally directly said it and people took him seriously.” That really does create liability.

Because the name of the software directly states that it is full self driving, people assuming it’s full self driving matters. The other question is what exactly that guy was doing for Tesla. Yeah, I know Tesla has that asinine “everyone does everything” policy, but if he’s a hardware engineer or is mostly figuring out how to physically put the cars together, there’s a pretty good chance he actually doesn’t know how the self-driving works and what the limitations are.

The “everyone does everything” thing sounds nice if you’re an idiot, but like, how much is a hardware engineer or an HR person or machinist really going to know about the actual internal details of the way the software system works? And on top of that, we need to know how Tesla is talking about these systems internally to their employees. It’s not at all weird that a Tesla employee would get sucked into the hype about a system they aren’t directly involved with.

Edited by Zendervai on Feb 16th 2024 at 12:58:39 PM

Not Three Laws compliant.
Shaoken Since: Jan, 2001 Relationship Status: Dating Catwoman
#1704: Feb 16th 2024 at 1:20:36 PM

It’s worth noting that FSD isn’t allowed to be used in Europe for several reasons, one of them being that the name is deceptive. It’s only being used in America because of lax standards.

@IMCA from the last page

I also find it sus that they would blame the lithium ion, when uhhh petrol also very much burns....

Lithium burns hotter than petrol; that's an objective fact, and it was the point of the comparison. You can put aside all other statistics on how lithium-ion batteries are less likely than ICE engines to catch fire, as that's irrelevant in this particular case. The claim is that if it had been an ICE on fire, the thermal injuries wouldn't have been as severe and the deceased might have survived. Of course, it's kind of a moot topic since, as far as I'm aware, no ICE vehicle has an FSD system, so this exact crash wouldn't have happened, but it's not fear-mongering to state that a lithium-ion battery burns hotter than an ICE.

Edited by Shaoken on Feb 16th 2024 at 10:08:26 PM

Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1705: Mar 26th 2024 at 6:24:37 AM

Since I'm sure this is going to catch attention, I'll jump out ahead of it. Tesla as a company is showing increasing confidence in Full Self-Driving version 12.3.x, which uses an end-to-end neural-network stack, and has begun offering a free one-month trial for all vehicles (that are capable of running it; see below).

Elon Musk has reportedly instructed all sales locations in North America to offer a FSD 12.3.1 test ride to new customers to show off the technology.

There are a few caveats: FSD is not yet available on the Cybertruck, as it's still being "tuned" for that model. I have also heard that refreshed Model 3s may be on the wrong software build to enable FSD 12.3, but hopefully that'll get sorted out soon enough. Also, this implies that Canada is on the list, when it hasn't had access to FSD Beta before now (IIRC).

Some observers have noted that newly delivered vehicles use an "engineering build" of the operating system that calibrates FSD and Autopilot over the first 50-100 miles of driving, and that wouldn't allow use of the software. Tesla must feel that they have solved that problem as well.

All in all, this is a very ambitious moment in the program and one that is undoubtedly going to turn over a lot of rocks, given a combination of inattentive/inexpert drivers and autonomy software that isn't yet Level 4 certified. At the same time, I'm a little irritated that this happened several months before I'll be prepared for my own purchase, at long last.

Edited by Fighteer on Mar 26th 2024 at 9:30:48 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1706: Mar 26th 2024 at 10:33:50 AM

At the same time, I'm a little irritated that this happened several months before I'll be prepared for my own purchase, at long last.

Eh, you're sitting out something that you should absolutely not want to be beta testing anyway.

Avatar Source
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1707: Mar 26th 2024 at 11:59:06 AM

Speak for yourself. From my point of view, autonomy can't come soon enough. It's a matter of balancing the risk of a system that isn't perfect against the benefit of improving the safety of driving overall. I've seen too many bad drivers in my life to be under any illusions about the value of self-driving, if it works.

There will be an intersection point of [people using self-driving irresponsibly] and [lives saved by self-driving] that will be an acid test for human civilization. Are we able to tolerate increased specific risks over that period of time in trade for reduced general risk in the long run?

Edited by Fighteer on Mar 26th 2024 at 3:00:09 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1708: Mar 26th 2024 at 12:20:33 PM

[up] That's not what Raineh was talking about.

That free trial is pretty obviously Tesla doing free beta testing with any users who sign up for it. This isn't a "boo, Tesla" thing, it's just what every tech company does if they get the chance.

The point you want to enter the ecosystem at is after the bugs have been worked out, not when they're still trying to identify them using, apparently, live driver data.

You specifically as an individual will have a negligible impact on the results, so bemoaning that you don't get to do beta testing for Tesla for free isn't worth it.

Edited by Zendervai on Mar 26th 2024 at 3:21:31 PM

Not Three Laws compliant.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1709: Mar 26th 2024 at 12:24:07 PM

That's the thing. Tesla is apparently confident enough in FSD 12.3 that they are willing to release it to the general public for trials. It's been in beta for years. If this is the track that leads to the promised final thing, then I'd at least like to try it out and see for myself.

This obviously isn't a choice for everyone. It's something you should do only if you're willing to be accountable for the results. I worry that Tesla is opening itself up for a realm of lawsuits, and this more than anything is a show of confidence.

In response to your edit, it is grossly inaccurate to say that I, as an individual, would have no impact on the outcome. The fundamental concept of the FSD program is to use real feedback from real drivers to improve the system. Sure, it may work perfectly in a million places, but does it work on my streets in my neighborhoods? The answer is valuable whether it's yes or no.

Edited by Fighteer on Mar 26th 2024 at 3:26:29 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1710: Mar 26th 2024 at 12:26:49 PM

I mean, if you're the only person driving a Tesla in your vicinity, then sure. But if anyone else is around, then not really. Also, like, if a Tesla has serious problems in what I'm guessing is a random suburb, you don't want to be the one discovering that.

Edited by Zendervai on Mar 26th 2024 at 3:28:46 PM

Not Three Laws compliant.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1711: Mar 26th 2024 at 12:30:48 PM

Actually, I'd be only too happy to be "discovering that", as long as I feel sure that I can take over if needed to prevent damage or injury. You can speak for yourself, but you shouldn't presume to speak for everyone. "Leave it to other people", taken to an extreme, means that nothing ever gets challenged or fixed or tested.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1712: Mar 26th 2024 at 12:39:19 PM

Dude, I'm saying it's not that big a deal to miss the free trial period and that "never beta test for free" is a pretty standard mindset around tech stuff.

Not Three Laws compliant.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1713: Mar 26th 2024 at 12:42:04 PM

There are millions of people willing to do exactly that. You are taking a very restrictive viewpoint on risk. That's fine for you, but you should not prescribe how others feel. This isn't like a video game released as an Obvious Beta.

In fact, the cost of FSD has acted as a deterrent to getting more people involved in said efforts. The pool of people willing to drop $12K to be part of a beta test is vastly smaller than the pool willing to take advantage of a free trial of the same software.

As a wise man once said, "Moar data, moar better".

Edited by Fighteer on Mar 26th 2024 at 3:45:01 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1714: Mar 26th 2024 at 12:44:35 PM

Tesla is apparently confident enough in FSD 12.3 that they are willing to release it to the general public for trials.

Sure, but they are not yet confident enough in FSD to take on the legal liability risk of people using it as a self-driving program (as opposed to a driver-support program). That distinction is a pretty key one for a lot of people, because the only risk Tesla is taking with FSD 12.3 being tested by the general public is reputational; financial/legal risk is where companies generally show their true confidence.

"Leave it to other people", taken to an extreme, means that nothing ever gets challenged or fixed or tested.

The point being made is that product testing should generally be left to either paid product testers or people who have a personal financial stake in the product.

I used to test half-finished PC games for Lego, I got a free copy of the game at the end of it.

There are millions of people willing to do exactly that.

That’s an appeal to popularity and you know it.

This isn't like a video game released as an Obvious Beta.

Only because there’s no claim that this is the full product. This is moving from paid Early Access to a public beta.

The pool of people willing to drop $12K to be part of a beta test is vastly smaller than the pool willing to take advantage of a free trial of the same software.

And it’s a positive step that Tesla has moved from expecting people to pay to do product testing for them to asking them to do it for free, I personally hope they eventually move to paying people to do product testing for them.

Edited by Silasw on Mar 26th 2024 at 7:49:17 PM

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1715: Mar 26th 2024 at 12:47:03 PM

To be clear, that legal risk is not up to Tesla. To switch liability from the driver to the company, it has to be officially classified as SAE Level 4 or equivalent. (Level 3 also carries some shared liability, I think.) To get to that point, it has to demonstrate safe operation under a vast array of situations, meaning it needs to get on the road and be tested.

Hiring people to do that work is simply not enough to generate the necessary data. Some might say that's where Cruise and Waymo ran into their major roadblocks. The possibility space of a video game is finite, as is the risk. The possibility and risk spaces of a self-driving car are practically infinite. The goal is to reduce the risk space by collecting as much data as possible.

Offering FSD Beta for free trials is indeed a risk for Tesla. I won't argue that point. While envious of those who get to test it right away, I am at least somewhat glad I'm not among the very first group.

Edited by Fighteer on Mar 26th 2024 at 3:49:37 PM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1716: Mar 26th 2024 at 12:54:43 PM

To be clear, that legal risk is not up to Tesla. To switch liability from the driver to the company, it has to be officially classified as SAE Level 4 or equivalent.

And to the best of my knowledge Tesla has not applied for that status, which says that they don’t think it can meet the regulatory standards required.

Hiring people to do that work is simply not enough to generate the necessary data.

Why not? Can Tesla not afford to pay enough people even if payment was provided in kind (something like do X hours of FSD testing for us and you’ll get a FSD licence on the house once full-release happens)? Or are the standards set out such that it’s simply not realistic for them to be met without a public beta?

Do we actually know what the standards are? How many scenarios would a self-driving vehicle have to demonstrate itself under to get certified?

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1717: Mar 26th 2024 at 1:01:11 PM

I think I should point out that Tesla does have a ton of engineering drivers whose job it is to "alpha test" builds of FSD. Chuck Cook, who has become something of an online celebrity for his proximity to an immensely complicated unprotected left turn across multiple highway lanes, has observed multiple rounds of Tesla employees testing the software near his neighborhood.

But there is just too much data for a finite set of employees to gather. We're talking every road, under all weather and lighting conditions, with all types of traffic, with every possible emergency situation. You can drive the same road a thousand times, but it's on the thousand and first that a deer will run across the road being chased by a mountain lion.

The more people you get involved, the more rapidly you can find those edge cases and train the system to handle them. Tesla also uses simulations, where it'll train the system using synthetic visual data in order to be able to present extremely rare scenarios. It's throwing everything at this software that it can.

I don't know exactly what the NHTSA's standards are for self-driving certification at Level 4 or higher. I'm not even sure that there are concrete standards, given how few vehicles have attempted it. Tesla could probably apply for Level 3 (which allows unsupervised driving under limited circumstances) right now and get it, but it seems like the company wants to jump straight past that.

From what I do know, the idea is to present regulators with literally hundreds of millions of miles of hard data showing that the system works as advertised.

ETA: Elon Musk recently posted that Tesla's AI team is no longer "compute-limited" for training FSD. To my inexpert eye, this means that data gathering is now the maximum priority.

Edited by Fighteer on Mar 26th 2024 at 4:12:59 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1718: Mar 26th 2024 at 1:21:14 PM

If Tesla have set the standards themselves, rather than them being from the NHTSA, then I’d argue that they’re trying to massively overshoot the mark.

There are already clear driving capability standards in law around the world; they’re the standards that humans are tested to, and self-driving should be aiming to be better than those standards. You don’t need to manage “every road, under all weather and lighting conditions, with all types of traffic, with every possible emergency situation”, you need to be better than the human alternative, whose standards are publicly listed.

Subject FSD to the relevant human driving tests in different locations and then publicise how well it does. If it can pass with better marks than a human needs, then I’d say that puts the ball in the NHTSA’s court to explain why it’s not safe to certify.

Bluntly, if the NHTSA haven’t said they want “hundreds of millions of miles of hard data showing that the system works as advertised.” then it’s silly to be trying to produce that, even if the NHTSA had said it wants it I’d honestly argue that Tesla should then be publicly calling that a bullshit double-standard compared to what humans have to prove. But I’d at least understand a company less willing to get in a public slap-fight with regulators just quietly playing the game.

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1719: Mar 26th 2024 at 1:27:43 PM

I do still think that having the self-driving actually do driving tests in a bunch of different places with different standards and stuff would be a really good way to advertise it.

Not Three Laws compliant.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1720: Mar 26th 2024 at 1:31:11 PM

[up][up]It's not just that. Tesla understands the psychological barriers it has to overcome here. FSD can't just be 25 percent safer than the average human driver to be accepted. It has to be at least an order of magnitude safer. We can call this a double-standard if we want, and it is, but it's the simple reality that we face.

The process is synergistic: get as many cars as possible in the hands of drivers that are equipped for self-driving, gather massive amounts of data from them to train the neural networks, feed that back into the cars to get more driving data, etc.

In particular, we don't want regressions from human behavior. One thing noted in FSD 12.3 is that it still has a tendency to misread certain signs. For example, it'll see a "40 MPH Minimum" sign and interpret it as the limit being 40 mph. That's easy enough to fix in a training set, but there is so much signage out there that has to be accounted for. One of the goals of FSD v12 is to teach the software to think semantically, not just logically: "What should I be doing here, given what I know and what I see?"

A problem that can't be fixed in training is compliance with road laws. Two specific cases illustrate the point: speeding and rolling stops. Some would say that a self-driving car should always come to a complete stop at a stop sign. This includes regulators. But actual humans in real-world driving rarely do this. Indeed, when Tesla began training its end-to-end NN on real data, it found that humans come to a complete stop less than 5 percent of the time when there is no opposing traffic.

Similarly, humans almost always speed. If the limit is 35, they will go 40 or 45 mph. If you want the car to drive like a human, it should speed too. If you want it to obey the laws, it should strictly follow the speed limit. Yet, driving below ambient speed can often create an active hazard as other drivers feel that you are obstructing them.

These are incredibly challenging problems to solve because they are psychological, not technological.

Edited by Fighteer on Mar 26th 2024 at 4:37:35 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1721: Mar 26th 2024 at 1:57:45 PM

Tesla understands the psychological barriers it has to overcome here. FSD can't just be 25 percent safer than the average human driver to be accepted. It has to be at least an order of magnitude safer.

Does it? Because we already have plenty of cases of people wanting to trust a technology that Tesla won’t even say is safer than a human, so I’d say that the risk appetite of the general public for self-driving technology is actually pretty high. I’m curious whether there’s actually any public polling on this.

One of the goals of FSD v12 is to teach the software to think semantically, not just logically: "What should I be doing here, given what I know and what I see?"

Which is something that any quality driving test would evaluate anyway. On the U.K. driving theory test you will get a picture of a road without signage and then be told to work out what the speed limit is based purely on the context clues.

These are incredibly challenging problems to solve because they are psychological, not technological.

I’d argue they’re not even psychological, they’re legal. The regulators need to be asked if they’d rather cars following the law even when it poses a danger because other cars are all breaking it, or if they want a car that will mirror other vehicles even when they’re doing illegal actions without valid cause.

I do still think that having the self-driving actually do driving tests in a bunch of different places with different standards and stuff would be a really good way to advertise it.

It’s just such a good gimmick that I’m left to wonder if some companies have tried it, found the software did poorly, and not told anyone. Now, for Tesla there is the issue that the US has no universal driving test standards (with some states just handing out a full licence if you go enough years as a learner without being caught breaking the law), but for European companies this should be easy.

Edited by Silasw on Mar 26th 2024 at 8:58:06 AM

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1722: Mar 26th 2024 at 2:28:26 PM

Passing a driving test in the US is indeed a gimmick. They could do it easily, but it wouldn't prove anything to regulators. European regulations are much tougher, which is why FSD is being developed here instead of there.

The reason I say those challenges are psychological is because a crucial factor in FSD adoption is the satisfaction drivers get from using it. No matter how hard we regulate or legislate, if people don't like how the car drives itself, they won't use it, and that includes things like matching prevalent traffic speeds and not stopping "unnecessarily" at signs when there is no opposing traffic.

I believe that there will come a time when enough cars are autonomous that you'll need a special license in order to drive manually, but that's a long way out. As long as driver choice is a factor, how it drives is a factor.

Edited by Fighteer on Mar 26th 2024 at 5:31:07 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from a handcart heading to Hell Since: Mar, 2011 Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1723: Mar 26th 2024 at 4:23:35 PM

I think you’re missing that a big chunk of the market will only care whether the car drives in a safe and time-efficient manner. You’re taking the perspective of someone who is in the front seat watching every minor action of the car to see if it meets their standards; I think the actual customer base for a Level 4 system is people who will sit in the back seat on their phone.

That’s before we consider the big chunk of people who would use the car to do things without the owner in it. Place a grocery (or even takeaway) order for collection and then send the car to collect it on its own, have the car take the kids to sports practice, or send the car home on its own after I drive it to work so that my wife can summon it when she finishes work and drive herself home.

All of those use cases need the car to be safe and time efficient, but unless it impacts journey times none of them care if the car is being a bit cautious at stop signs or sticks to the slow lane on the highway because everybody else is speeding.

"And the Bunny nails it!" ~ Gabrael "If the UN can get through a day without everyone strangling everyone else so can we." ~ Cyran
Galadriel Since: Feb, 2015
#1724: Mar 26th 2024 at 6:50:14 PM

Most parents would not send a self-driving car to pick up their kids from school. That’s one of the very last things that would happen after we already had ideal self-driving cars and everyone was convinced of their safety.

Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1725: Mar 27th 2024 at 7:29:46 AM

[up][up] Maybe it's the social media bias, but I've seen a lot of people swear that they will not use self-driving cars unless they speed and roll stop signs. It's weird. Near where I live, there are areas of highway where you're practically a road hazard if you aren't going at least 10 mph over the limit. I'd like nothing better than to see that mentality change.

While I have never used a fully autonomous vehicle, I do often turn on the Subaru EyeSight ADAS while driving my wife's car on the highway, and I find that setting a speed and letting the car do the rest is immensely relaxing. I stop worrying about how fast I'm going or need to go.

Maybe the barrier is just getting this tech in the hands of people and letting them see for themselves, in which case Tesla's "everyone gets a free trial" initiative might be incredibly helpful.

[up] To be clear, I also would not send my self-driving car to pick up the kids from school on its own until they're vastly more mature than they are today. Meaning the cars, not the kids. That's not likely to happen for some time anyway, since it would require Level 4 certification at the very least, along with a "robo-taxi" mode that can follow commands without a driver/owner present.

ETA: It would also undoubtedly require changes in school policy to make sure the kids don't accidentally get in the wrong car.


Update from X: FSD Beta 12.3.1 is now going out to Tesla vehicles in Canada.

Edited by Fighteer on Mar 27th 2024 at 10:54:05 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"

Total posts: 1,881