Self-Driving Cars

A thread to discuss self-driving cars and other vehicles. No politics, please.

Technology, commercial aspects, legal considerations and marketing are all on-topic.


  • Companies (e.g. Tesla Inc.) are only on-topic when discussing their self-driving products and research, not their wider activities. The exception is when those wider activities directly impact (or are impacted by) their self-driving work - e.g. if self-driving car development is cut back due to losses in another part of the business.

  • Technology that's not directly related to self-driving vehicles is off-topic unless you're discussing how it might be used for them in future.

  • If we're talking about individuals here, that should only be because they've said or done something directly relevant to the topic. Specifically, posts about Tesla do not automatically need to mention Elon Musk. And Musk's views, politics and personal life are firmly off-topic unless you can somehow show that they're relevant to self-driving vehicles.

    Original post 
Google is developing self-driving cars, and has already tested one that has logged over 140,000 miles on the road in Nevada, where it is street-legal. They even let a blind man try a self-driving car. The car detects where other cars are in relation to it, as well as the curb and so on, follows speed limits and traffic laws to the letter, and knows how to avoid people. It also uses a built-in GPS to find its way to places.

Cadillac plans to release a scaled-back, simpler version of similar technology by 2015 - what they call "Super Cruise", which isn't total self-driving, but does let you relax on highways. It positions your car in the exact center of a lane, slows down or speeds up as necessary, and is said to be meant for ideal driving conditions (I'm guessing that means ideal weather, no rain or snow, etc.).

I am looking forward to such tech. If enough people prefer to drive this way, and the technology works reliably, it could result in safer roads with fewer accidents. Another possibility is that, using GPS and maybe the ability to know ahead of time which roads are most clogged, they can find the quickest route from place to place.

On the other hand, hacking could be a real concern, and I hope it doesn't become a serious threat. It's looking more and more like we're living in one of those sci-fi Everything Is Online worlds that fiction has depicted for so long.

(Mod edited to replace original post)

Edited by Mrph1 on Mar 29th 2024 at 4:19:56 PM

Falrinn Since: Dec, 2014
#1651: Nov 12th 2023 at 1:25:42 PM

I assume that if it can be proven that the end user made unauthorized modifications to their car's self-driving functions, then they also assume the role of manufacturer for liability purposes, without any of the protections a manufacturer gains by certifying its self-driving systems with the relevant government agencies.

Possibly, using uncertified self-driving systems on a public roadway could itself be a crime even if it doesn't cause an accident. And making unsupported modifications would count as implementing an uncertified self-driving system (unless you go through the certification process yourself, of course).

Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1652: Nov 12th 2023 at 1:27:57 PM

We're talking behavior as simple as installing a counterweight on a steering wheel to defeat "nags" to pay attention. And no, it's not just Tesla that has (or had) that problem. My Subaru's EyeSight technology is pretty good at lane-keeping and traffic-aware cruise control under decent conditions, but if I hang a weight on the wheel and go to sleep, I'm 100% liable if it veers off the road because a curve was too sharp.

I don't even know why this is under debate. I remember stories from when cruise control was first deployed as a technology and people would treat it like self-driving. You can't fix stupid. The only way to keep humans from driving badly is to keep them from driving at all, ergo Level 5 autonomy.

ETA: Regulations are important to ensure that manufacturers don't take unsafe shortcuts, but this is a journey that requires some degree of accommodation for innovation. If we stifle it by demanding perfection up front, it'll never happen.

Edited by Fighteer on Nov 12th 2023 at 4:35:53 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1653: Nov 12th 2023 at 1:41:12 PM

Seems like we're just moving the barrier for manufacturer liability down a step. Presumably so that nobody can ever play the "the self driving feature turned off after doing something insane without warning, you're responsible as you had less than a second to realise and correct" card, regardless of whether or not anyone has done something like that.

Avatar Source
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1654: Nov 12th 2023 at 1:42:59 PM

Several companies have released Level 3 solutions akin to Drive Pilot, but they are extremely limited: good weather, geofenced roads, low speeds, and so on. They are specialized products whose architecture cannot be expanded to Level 4 without almost complete redesign.

Can you expand on this? At level 3 the vehicle is controlling all systems, is it not? Looking at how the levels are described online, the difference to me looks like whether, when the car can’t solve a road issue, it hands over to the driver or safely parks itself and radios for help.

Causing manufacturers to be liable for crashes of Level 3 autonomous vehicles is a very thorny problem. If they are operated within their established parameters, maybe, but the whole point of these systems is that they require the driver to take over at a few seconds' notice if they can't handle a situation.

Sure and liability transfers back to the driver when the car hands control back to them. What’s being legislated is that if the self-driving system remains active then the liability is with the manufacturer.

Waymo and Cruise are already deploying prototype Level 5 solutions (robo-taxis), albeit under very strict limitations. So it's not even technically accurate to say that Mercedes is in the lead.

Are they commercially available for the general public? Mercedes are not simply testing a product, they’re selling it.

I remember stories from when cruise control was first deployed as a technology and people would treat it like self-driving. You can't fix stupid.

It’s correct that you can’t fix stupid, but you can change the threshold of stupid that causes harm. That’s why the U.K. is going to regulate the marketing.

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1655: Nov 12th 2023 at 2:00:04 PM

Can you expand on this? At level 3 the vehicle is controlling all systems, is it not? Looking at how the levels are described online, the difference to me looks like whether, when the car can’t solve a road issue, it hands over to the driver or safely parks itself and radios for help.

The line between Level 2 and Level 3 is fuzzy. Both can have limitations in their operating environments, but the main difference is that in a Level 2 system the driver is expected to be alert and ready to intervene at any time, whereas a Level 3 system has certain conditions under which a driver is allowed to take their eyes off the road and do other tasks, as long as they are able to take back over with some warning.

My car could be Level 3 on highways as long as they are properly mapped, I'm going under 40 mph, and driving conditions are good; but Level 2 in other situations and have no autonomy at all in cities or suburbs.

Meanwhile, I can (or could, before the CA DMV shut it down) hail a Cruise and have it drive me around San Francisco while I nap or play video games on my phone. The latter is only possible with high levels of remote supervision, but yes, these systems are deployed and operating, albeit on a very small scale.

The distinction is not in the specific tasks performed but in the level of supervision. A Level 4 system is capable of handling all ordinary driving tasks and can request input from a driver if it doesn't know what to do, and/or safely stop if a driver is not available. A Level 5 system can operate without driver controls.

Anyway, the scaling factor is found in architecture. If you need to have programmers writing C++ code for every possible condition and outcome, plus a fleet of labelers training your software how to recognize things and a fleet of vehicles using lidar to map every street, you're going to hit a wall. That wall is that the driving world-space is effectively infinite and you have finite capacity to generate maps and code.

The other approach is to grow a machine-learning system to understand how to drive as a human would, so that it is capable of recognizing and adapting to situations. It's not following rote instructions but "understanding" at a high level what its task is and figuring out how to solve that task. This is a much more difficult challenge but one with greater potential to deliver true autonomy... many believe.
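To make the scaling argument concrete, here's a toy sketch (all names invented for illustration, not any real autonomy stack): the "code every case" approach amounts to a rule table that humans must keep extending, while the learned approach aims for one policy function that generalizes.

```python
# Toy contrast of the two approaches described above. The rule-based
# planner needs an explicit, hand-written entry per situation; anything
# unanticipated falls through to a handover.

RULES = {
    "stopped_school_bus": "stop_and_wait",
    "four_way_stop": "yield_by_arrival_order",
    "construction_cones": "follow_shifted_lane",
    # ...the driving world-space keeps generating new keys forever...
}

def rule_based_plan(situation: str) -> str:
    # Every unanticipated situation needs a human to write a new rule first.
    return RULES.get(situation, "handover_to_driver")

def learned_plan(situation: str) -> str:
    # Stand-in for a trained policy: a single function meant to generalize
    # to inputs nobody explicitly coded for. (Here just a stub.)
    return "proceed_cautiously"
```

The point of the sketch is only that the first function's coverage grows one rule at a time, while the second's coverage is a property of its training.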

What’s being legislated is that if the self-driving system remains active then the liability is with the manufacturer.

Yes, I would say this is broadly reasonable, but my concern is that it requires a wide degree of discretion in the law for the different situations. If DrivePilot, for example, is operating under its advertised conditions for Level 3 autonomy and causes a crash or injury, I'd agree that Mercedes should have some liability.

Edited by Fighteer on Nov 12th 2023 at 5:18:05 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1656: Nov 12th 2023 at 2:22:38 PM

Meanwhile, I can (or could, before the CA DMV shut it down) hail a Cruise and have it drive me around San Francisco while I nap or play video games on my phone. The latter is only possible with high levels of remote supervision, but yes, these systems are deployed and operating, albeit on a very small scale.

Being able to hail one isn’t the same as it being commercially available. Also, if the system is being subjected to supervision, how is it level 5? Isn’t it just a remotely operated level 2?

If you need to have programmers writing C++ code for every possible condition and outcome, plus a fleet of labelers training your software how to recognize things and a fleet of vehicles using lidar to map every street, you're going to hit a wall. That wall is that the driving world-space is effectively infinite and you have finite capacity to generate maps and code.

Why is the driving world-space effectively infinite? Also, why does it need to be trained for every situation? You talk a lot about how self-driving cars don’t need to be perfect, just better than humans. Can’t you just train it for the same scenarios that we train humans for, and then add “when you see an unknown situation, safely remove yourself from the situation” training? Sure, it wouldn’t be quick, but I don’t see how it’s impossible, especially not just because one self-driving company has decided to go another way.

If you have a level 3 trained for highways at medium speed and good conditions I don’t see why it can’t be scaled to include high speeds, then slightly poor conditions, then bad conditions, then more street types. That’s kinda how a lot of humans learn, you start in a car-park in good weather and eventually you’re on a major A road at night in bad conditions.

This has got me thinking, has anyone tried putting a self-driving car through a driving test? Literally use the same test humans do and see how it performs.

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1657: Nov 12th 2023 at 2:35:39 PM

Also, if the system is being subjected to supervision, how is it level 5? Isn’t it just a remotely operated level 2?

Yes, well. You've hit one of the nails on the head. According to some (contested) reports, Cruise vehicles require a remote intervention for every 2-5 miles of driving and have around 1.5 personnel supervising each vehicle if you take everything into account. That doesn't sound very autonomous. But Cruise disputes this analysis and it's not clear if the "1.5 personnel" counts just the operators or every engineer and programmer.

From the perspective of the passenger, they get in and the car takes them to their destination. But there may be some degree of smoke-and-mirrors involved.
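As a back-of-envelope check on those contested figures (the city speed here is my own assumption, not from the report):

```python
# If the disputed "remote intervention every 2-5 miles" report holds,
# the implied intervention rate per vehicle-hour at city speeds is:
avg_city_speed_mph = 15              # assumed average SF city driving speed
miles_per_intervention = (5, 2)      # best and worst case from the report

interventions_per_hour = [avg_city_speed_mph / m for m in miles_per_intervention]
# i.e. roughly 3 to 7.5 remote interventions per vehicle-hour,
# which is why "autonomous" is doing a lot of work in the marketing.
```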

As for the rest of your questions, I must plead lack of specific knowledge about any particular manufacturer's solution. Yes, one possible approach is to start with limited autonomy and gradually expand the envelope until it encompasses all reasonably predictable situations, but as yet nobody has achieved this nor shows signs of achieving it.

Tesla tried that approach and rapidly gave up on it as they began "chasing the nines", instead moving to a fully ML solution. That also has yet to pan out, although reports from drivers using FSD Beta suggest that it's capable of driving very well, with most of the problems occurring in edge cases. FSD is far more capable than any Level 2 or Level 3 system currently deployed, but its error rate is still too high.
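For anyone unfamiliar with "chasing the nines": each extra nine of per-mile reliability multiplies the expected miles between failures by ten, which is why the last stretch is so hard. A quick illustration (figures invented, not Tesla data):

```python
# "n nines" of per-mile reliability means a failure rate of 10**-n per
# mile, i.e. one expected failure every 10**n miles.
def miles_per_failure(nines: int) -> int:
    return 10 ** nines

# Each added nine is a 10x jump in required reliability:
ladder = {n: miles_per_failure(n) for n in range(3, 8)}
# 3 nines -> one failure per 1,000 miles;
# 7 nines -> one failure per 10,000,000 miles.
```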

Tesla apparently intends to build a case for regulators that it has solved Level 4 in a very broad environment and simply jump to that level, once the software reaches a sufficient level of maturity.

This has got me thinking, has anyone tried putting a self-driving car through a driving test? Literally use the same test humans do and see how it performs.

I'm sure people have tried "driver's test" equivalent scenarios, but you can't literally instruct a Tesla to "turn right at this intersection, change lanes, parallel park," etc. You tell it where to go and it goes there.

Edited by Fighteer on Nov 12th 2023 at 5:37:14 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1658: Nov 12th 2023 at 2:47:10 PM

I'm sure people have tried "driver's test" equivalent scenarios, but you can't literally instruct a Tesla to "turn right at this intersection, change lanes, parallel park," etc. You tell it where to go and it goes there.

If their developers couldn't engineer a way to do that, then I'm really concerned.

Avatar Source
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1659: Nov 12th 2023 at 2:50:14 PM

Could and want to are different things. Why would you give a Tesla a driver's test, other than to show off? Proving autonomy to regulators isn't about that, but about demonstrating millions or billions of miles driven without interventions and with statistically fewer crashes than human drivers.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1660: Nov 12th 2023 at 2:54:10 PM

Because it's a good metric to pass?

Sure, statistics are nice and all, but "our car can pass driving tests in all these different locations without intervention" is something that people will actually understand.

Avatar Source
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1661: Nov 12th 2023 at 2:56:08 PM

I'm sure someone could stage it if they wanted to, but if I were a regulator seeking to approve FSD I'd be interested in thousands or millions of repetitions of similar activities under different circumstances. A human driver only has to show that they can park once to get a license. A self-driving car has to be able to do it millions of times perfectly.

Marketing and regulation are fundamentally different things.
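The "once versus millions of times" asymmetry is easy to put numbers on (these figures are illustrative, not regulatory):

```python
# A maneuver that succeeds 99% of the time passes a single driving test
# almost every attempt, but a fleet repeating it at scale fails constantly.
per_maneuver_success = 0.99
fleet_repetitions_per_day = 1_000_000   # assumed fleet-wide daily volume

expected_daily_failures = fleet_repetitions_per_day * (1 - per_maneuver_success)
# -> about 10,000 failures per day. Keeping that near one failure per day
# at the same volume requires roughly 99.9999% per-maneuver success.
```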

Edited by Fighteer on Nov 12th 2023 at 5:59:11 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1662: Nov 12th 2023 at 3:01:21 PM

And it's such blatantly obvious good marketing that I don't know why you wouldn't want to make sure you can do it before someone else does. <_>

Avatar Source
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1663: Nov 12th 2023 at 3:03:18 PM

From the perspective of the passenger, they get in and the car takes them to their destination. But there may be some degree of smoke-and-mirrors involved.

And from the perspective of an unobservant Uber customer their car is also self-driving. Cruise just sounds like a level 2 taxi fleet with remote drivers, which is cool and all, but I can’t pretend my computer is running itself just because I’ve got somebody remoted in from next-door.

as yet nobody has achieved this nor shows signs of achieving it.

Not achieving it any time soon, sure. But if Mercedes have moderate-speed highway driving in good conditions down at present, then we can watch that space and see if it expands over the next few years.

Tesla tried that approach and rapidly gave up on it as they began "chasing the nines", instead moving to a fully ML solution.

Sure, but do we have any evidence that the decision was made because of the non-viability of the technology, as opposed to it being a business decision? Tesla is a company that prides itself on moving fast; it was never going to take the methodical approach.

FSD is far more capable than any Level 2 or Level 3 system currently deployed, but its error rate is still too high.

Then why is Tesla not willing to accept legal liability when it’s in operation? Mercedes are willing to put their money where their mouth is and say that their tech is good enough that when it’s running, they’re the ones liable for anything going wrong; that confidence shows either arrogance or capability. I also don’t see how FSD can be more capable than Drive Pilot if FSD requires constant supervision and Drive Pilot does not.

Tesla apparently intends to build a case for regulators that it has solved Level 4 in a very broad environment and simply jump to that level, once the software reaches a sufficient level of maturity.

And I wish them the best of luck with that. But if the software is not mature enough for that yet, then I don’t see how it’s more capable than an actually operational level 3 system.

you can't literally instruct a Tesla to "turn right at this intersection, change lanes, parallel park," etc.

Why not? Are you telling me you can’t strap a voice command module onto a Tesla? If the tech is meant to develop into robo-taxis then it’s going to need a way to take voice commands on things like “park just here” eventually.

A human driver only has to show that they can park once to get a license. A self-driving car has to be able to do it millions of times perfectly.

Which is silly. You’ve argued time and time again that regulations on self-driving cars shouldn’t demand perfection, just improvement over the average human. The average human fails their driving test multiple times. If someone can make a self-driving car that will get the test right every time, then that’s solid proof that can be used to argue for a change in regulations.

It’s also bloody good marketing.

Edited by Silasw on Nov 12th 2023 at 11:06:55 AM

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
NativeJovian Jupiterian Local from Orlando, FL Since: Mar, 2014 Relationship Status: Maxing my social links
Jupiterian Local
#1664: Nov 12th 2023 at 3:07:04 PM

Can you expand on this? At level 3 the vehicle is controlling all systems, is it not?

The levels of autonomous vehicle capability as defined by the Society of Automotive Engineers (SAE) are:

  • Level 0: system may issue warnings, but has no ability to control the vehicle (eg blind spot indicators, backup camera warnings)
  • Level 1: system has limited control of the vehicle, so driver must handle everything the system is not (eg cruise control, emergency braking)
  • Level 2: system has full control of the vehicle, but driver must be alert for situations the system can't handle and take over immediately when necessary (eg adaptive cruise control + lanekeeping)
  • Level 3: system has full control of the vehicle, and can be trusted to give drivers sufficient warning to retake control (this is "you can watch a movie while you're in the driver's seat" level)
  • Level 4: system has full control of the vehicle, and no driver attention is required, but may revert to lower levels in certain circumstances (this is "it safely drives itself, unless it can't, in which case it will pull over and make you take control")
  • Level 5: system has full control of the vehicle, and can operate in any situation a human driver could (this is "your car doesn't even need to have a steering wheel anymore")

This is sometimes summarized as level 1 "hands on", level 2 "hands off", level 3 "eyes off", and level 4 "mind off", though this is misleading because most level 2 systems do literally require you to keep your hands on the wheel. (Ones that don't, like GM's Supercruise, are sometimes called level 2.5 systems.) It does get the gist across, though.

The main difference between a level 4 and a level 5 is that level 4 may require additional data to be pre-loaded, such as high-resolution LIDAR maps of the route. The cars themselves would not be able to provide that data in real time, so in areas where it's not available, they could not operate at level 4. The hypothetical here is something like the car drives itself on a mapped highway without any driver input required, but when it exits (leaving the LIDAR-mapped route), it needs the driver to take the wheel again. If the driver is unable to take over (they're asleep, for example) then a level 4 system must be able to safely pull over and park itself. Meanwhile, a level 5 system can operate without driver input anywhere, anytime, regardless of circumstances.
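The taxonomy above could be encoded as a small lookup table. This is just an illustrative sketch of the summary in this post (the names are invented), not anything from the SAE standard itself:

```python
from dataclasses import dataclass

# Illustrative encoding of the level summary above. "driver_is_fallback"
# captures who must be ready to take over; it simplifies the SAE J3016
# definitions and is not a legal statement.
@dataclass(frozen=True)
class AutonomyLevel:
    level: int
    summary: str
    driver_is_fallback: bool

SAE_LEVELS = [
    AutonomyLevel(0, "warnings only (blind spot, backup camera)", True),
    AutonomyLevel(1, "limited control (cruise control, emergency braking)", True),
    AutonomyLevel(2, "full control, driver alert at all times", True),
    AutonomyLevel(3, "full control, driver retakes with sufficient warning", True),
    AutonomyLevel(4, "no driver attention needed within its operating domain", False),
    AutonomyLevel(5, "operates anywhere a human driver could", False),
]

def driver_must_be_available(level: int) -> bool:
    """Whether a human fallback is part of the system's design."""
    return SAE_LEVELS[level].driver_is_fallback
```

The one-bit field is of course a simplification; the real boundary disputes in this thread (level 2 vs 3, level 4 vs 5) live in the conditions attached to each level, not the level numbers themselves.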

If you have a level 3 trained for highways at medium speed and good conditions I don’t see why it can’t be scaled to include high speeds, then slightly poor conditions, then bad conditions, then more street types.

This depends entirely on the nuts and bolts of how the system functions under the hood, which differs in approach between different parts of the industry. The specifics are above my pay grade, but the tldr is that some approaches simply are not practical to scale up beyond a certain point. As a rough analogy, if you want to travel at 50 MPH, you can design a car that reaches that speed. If you want to travel at 100 MPH, you can design a faster car, but you start running into problems (you need to be on a highway and in relatively light traffic, etc). If you want to travel at 150 MPH, a car just isn't going to work — even if you design a car that can go that fast, it's going to look like a Formula 1 car and won't be practical for everyday use. Meanwhile, if you're designing planes instead of cars, you can start with one that goes 50 MPH, then improve it so it can go 150 MPH, then improve it further to go 500 MPH, all on the same basic framework.

The idea is that some approaches are going to be dead ends, because they're easier to get started but don't scale up as well. eg, it's easier to design a working car than a working plane, but if your ultimate goal is to travel at 500 MPH, at some point you're going to have to abandon cars and start over with planes.

Of course, when it comes to autonomous vehicles, which approaches are the dead ends and which aren't is still an open question.

Edited by NativeJovian on Nov 12th 2023 at 6:09:49 AM

Really from Jupiter, but not an alien.
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1665: Nov 12th 2023 at 4:12:06 PM

I’m going to echo the idea that having the car consistently be able to pass driving tests is a good thing. It’s a huge PR coup, it clearly explains the capability of the technology in a way the average person will easily understand and it’s a very easy way to make sure the technology can work in similar ways under different circumstances.

Not Three Laws compliant.
Smeagol17 (4 Score & 7 Years Ago)
#1666: Nov 13th 2023 at 3:03:10 AM

That last part is not exactly a given, especially if you train the system specifically to pass the test. Driving tests are generally much more similar to each other than the situations they are testing for.

Edited by Smeagol17 on Nov 13th 2023 at 4:38:39 PM

Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1667: Nov 13th 2023 at 5:36:18 AM

PR PR PR PR PR.

It’s mostly about the fantastic PR that would come from it.

And, I’m going to point out, the idea is not that you preprogram it in. The idea would be “can this vehicle follow specific instructions given while driving?” As far as I know, it can’t yet. And it’s hard to believe that a feature like that wouldn’t be appreciated.

And to be clear, driving test instructions aren’t usually “go from point a to point b”, it’s like “go in this specific lane. Now switch to this other lane. Now switch back. There’s a highway here, get into the on-ramp” and so on.

Edited by Zendervai on Nov 13th 2023 at 8:37:20 AM

Not Three Laws compliant.
RainehDaze Figure of Hourai from Scotland (Ten years in the joint) Relationship Status: Serial head-patter
Figure of Hourai
#1668: Nov 13th 2023 at 5:54:53 AM

Plus, if you forced it to do a whole bunch of driving tests (at least the British ones), you're advertising that it can handle urban areas, just about every form of junction, street signage and changing conditions, rural roads, and everything shy of a motorway. Plus random stuff like 'park well on the side of the road'.

And highway driving is what self-driving excels at.

Avatar Source
Zendervai Visiting from the Hoag Galaxy from St. Catharines Since: Oct, 2009 Relationship Status: Wishing you were here
Visiting from the Hoag Galaxy
#1669: Nov 13th 2023 at 6:26:19 AM

It might not be the single most efficient option, but if travel and transit were really that fixated on efficiency, we’d have amazing public transit everywhere.

Ooh, yeah, if you can get a self driving car to reliably navigate British back roads, that’s a really good self-driving system.

Edited by Zendervai on Nov 13th 2023 at 9:27:16 AM

Not Three Laws compliant.
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1670: Nov 13th 2023 at 9:44:30 AM

There's no reason they couldn't get a Tesla to pass a standard driver's license test if they wanted, although I continue to view that as a gimmick rather than a proof of capability. If we regulated self-driving the way we do humans, we'd accept that it's just going to crash and kill people every so often — what can you do, it's got to get to work.

It has to be held to a much higher standard to be fit for purpose. If the idea is to sell it to the public, sure, put on all the stunts you want. Go nuts.

Teslas can absolutely follow voice commands, including for navigation. Indeed, if you don't give FSD a destination and just tell it to drive, it's capable of navigating around effectively at random. I don't know if it accepts "turn left at the next intersection", although I'd be curious to try that out. I do know that there's a lot of synergy between natural language processing and route planning in Tesla's AI architecture.


And from the perspective of an unobservant Uber customer their car is also self-driving. Cruise just sounds like a level 2 taxi fleet with remote drivers, which is cool and all, but I can’t pretend my computer is running itself just because I’ve got somebody remoted in from next-door.

Well, that's sort of the question with Waymo and Cruise. The companies would say that the cars are driving themselves, but the fact that humans have to actively supervise, even if it's remotely, would suggest otherwise. I have no horse in this race so I can't say much more than what's been publicly reported.

Sure but do we have any evidence that the decision was made for reasons of non-visibility tied to technology as opposed to being a business decision? Tesla is a company that prides itself on moving fast, it was never going to take the methodical approach.

Tesla has shown a greater than average willingness to discard solutions when they hit local maxima. I suppose it's possible that some of those solutions could be wrangled into a semblance of viability with enough effort, but I don't see why we should regard that as better.

I was fascinated by the demonstration of occupancy networks, in which the cars build a 3D model of the space around them, classify each variable-size voxel as containing nothing or an object to be avoided, and then pass that data to the network that classifies the objects by kind, but apparently that approach has already been binned and replaced.
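For the curious, the core occupancy idea can be sketched in a few lines (grid coordinates and threshold invented purely for illustration; the real systems are vastly more complex):

```python
# Minimal sketch of the occupancy-grid idea: score each voxel of the
# space around the car, keep only the confident cells, and pass those
# on to a downstream object classifier.
occupancy = {}                 # sparse grid: (x, y, z) voxel -> evidence in [0, 1]
occupancy[(1, 2, 0)] = 0.9     # strong sensor evidence of an object
occupancy[(3, 0, 1)] = 0.2     # weak evidence, treated as free space

OCCUPIED_THRESHOLD = 0.5
occupied_voxels = sorted(v for v, p in occupancy.items() if p > OCCUPIED_THRESHOLD)
# Only the confident voxels go to the network that classifies objects
# by kind; everything else is treated as drivable space.
```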

Then why is Tesla not willing to accept legal liability when it’s in operation?

Aside from the fact that no company wants to take on liability for things if it doesn't have to, it all comes down to regulatory approval. Some agency has to certify FSD (or any other autonomy system) for operation as Level 3 or equivalent, and Tesla evidently doesn't feel that FSD Beta is ready for that yet.

In a nutshell, FSD is already capable of end-to-end navigation (front door to front door) on any roadway in the US, but it's not yet able to do so with a sufficient level of reliability, meaning lack of interventions. This comes down to a different approach. Mercedes may be satisfied with making DrivePilot exceptionally safe under a limited set of conditions, but it's not useful outside of those conditions. Tesla wants to make FSD capable of driving anywhere at any time and incrementally mature its reliability.

I'm not saying either approach is better, just that claiming "leadership" at this point in time is a bit of a red herring. Good for PR, but put me down as not caring about that as much as others seem to.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
SeptimusHeap from Switzerland (Edited uphill both ways) Relationship Status: Mu
#1671: Dec 20th 2023 at 6:40:57 AM

Been wondering for a while: If a self-driving car violates traffic laws (speeding, accidents, whatever), who gets to pay the fines/go to jail/have their licence suspended?

In Switzerland at least the owner of the car is liable for the damages caused by it, but that doesn't automatically extend to fines or jail sentences. Actually, I don't think there is much consensus on criminal responsibility for self-driving cars. We don't allow self-driving cars yet, though, so I dunno how the responsibility would be partitioned. I don't get the impression that we'd allow the terms of a sale to make the difference, though.

"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled." - Richard Feynman
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1672: Dec 20th 2023 at 7:00:31 AM

Obviously the laws can be whatever they are and don't necessarily reflect deep technical knowledge of the systems at issue. IANAL, AFAIK, all the other disclaimers.

Many modern cars are equipped with automatic lane-keeping and traffic-aware cruise control systems. These are considered Level 2 self-driving features in that they can steer the car and adjust its speed automatically under certain conditions. Tesla's Autopilot is considered Level 2, as are Cadillac's Super Cruise, Subaru's EyeSight, and many others. These are generally accepted and widely used throughout society. I'm not aware of any countries that prohibit them.

Every jurisdiction considers a vehicle running on Level 2 autonomy to be the responsibility of its driver. You are required to be paying attention, with your eyes on the road and your hands on the steering wheel at all times, even if the car is doing the basic driving tasks itself.

Where things get challenging is moving up the scale. There are a few consumer vehicles (Tesla is not among them) rated for Level 3 autonomy, which allows the driver to pay attention to other things while the system is engaged, provided they can take over on very short notice. These are usually geofenced and come with strict operating limits: on highways, at low speeds, in daylight and good weather.

There are no consumer vehicles rated for Level 4 or 5 in general-purpose situations. Cruise, Waymo, and a few other companies are (or were) working with such vehicles in city driving for use as robo-taxis, but under strict limitations. There are airports that run autonomous shuttles, factories and warehouses that run autonomous robots, and other specialty applications.

I don't know of any active lawsuits or complaints about liability for Level 3 vehicles, but the SAE rating system says that the driver remains responsible at all times. Level 4 and 5 vehicles assign liability to the manufacturer or fleet operator because they are able to drive under all or nearly all conditions and deal safely with unanticipated situations.

Complaints about self-driving systems like Tesla Autopilot mainly focus on whether the system contains defects that make it unfit for purpose, such that the driver could not reasonably have anticipated or prevented a crash. They may also, particularly in Tesla's case, focus on misleading marketing that may convince a driver that the car is more capable than it actually is.

As far as I know, Tesla does not permit Full Self-Driving Beta (its would-be L4/5 autonomy solution) to be used in Europe, although it is working with regulators around the world to gain access to new markets as the technology develops.

Edited by Fighteer on Dec 21st 2023 at 8:06:20 AM

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
Silasw A procrastination in of itself from A handcart to hell (4 Score & 7 Years Ago) Relationship Status: And they all lived happily ever after <3
A procrastination in of itself
#1673: Dec 20th 2023 at 10:09:10 AM

Yeah, as far as the law is concerned, the only actual self-driving vehicles are operated by organisations (airports, tram network operators, etc.), not private individuals.

Everything else is a person-driven vehicle with driver-assist technology, so the driver continues to carry all legal responsibility.

BMW have a new car scheduled for release in 2024 whose technology they say is at the point where they're willing to accept legal liability while it's in use, though only under certain circumstances (up to 37 mph and on a motorway).

Edited by Silasw on Dec 20th 2023 at 6:09:44 PM

“And the Bunny nails it!” ~ Gabrael “If the UN can get through a day without everyone strangling everyone else so can we.” ~ Cyran
Fighteer Lost in Space from The Time Vortex (Time Abyss) Relationship Status: TV Tropes ruined my love life
Lost in Space
#1674: Dec 21st 2023 at 10:34:47 AM

Ford (BlueCruise) and Mercedes (DrivePilot) have both deployed Level 3-equivalent systems on highways. I don't know if they have officially declared that they will take liability for crashes while the system is in use.

"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
tclittle Professional Forum Ninja from Somewhere Down in Texas Since: Apr, 2010
Professional Forum Ninja
#1675: Feb 15th 2024 at 4:52:46 PM

This one is a couple of months old, but I figured it was still newsworthy: the Texas Department of Transportation is looking to transform part of State Highway 130 between Georgetown and Del Valle into a "smart freight" highway, including cameras, radar, other communication hardware, and machine learning to help improve the AI of driverless freight vehicles.

Just a note that TX SH 130 is a toll road and its main claim to fame is having the fastest posted speed limit in the US at 85 mph.

"We're all paper, we're all scissors, we're all fightin' with our mirrors, scared we'll never find somebody to love."
