Self-Driving Car Insurance

2026-01-30 15:50 · www.lemonade.com

Lemonade Autonomous Car insurance offers Tesla owners 50% off FSD miles in select states. The first insurance designed for self-driving cars.

Lemonade Autonomous Car insurance is the first car insurance designed specifically for self-driving cars. It offers Tesla owners 50% off every mile driven using Full Self-Driving technology by automatically tracking FSD miles versus manual miles through direct Tesla integration.

TL;DR
  • Lemonade offers the first autonomous car insurance with 50% discounts for Tesla FSD miles
  • No special insurance endorsements needed; our autonomous coverage integrates seamlessly with regular car insurance requirements
  • With your permission, we track your driving automatically through Tesla’s Fleet API. No devices or self-reporting required
  • FSD discounts will become more significant as FSD safety improves

Lemonade Autonomous Car insurance is built on a simple principle: Tesla’s data shows that Full Self-Driving miles are twice as safe as manual driving, so they should cost 50% less on your insurance premiums.

When you drive using Tesla’s FSD technology, our system automatically tracks those miles and applies the discount. Manual miles are priced normally, while your FSD miles get the full 50% reduction, creating a pricing model that actually reflects the safety benefits of autonomous driving.
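
The split pricing described above can be sketched in a few lines. The per-mile rate, discount constant, and function name here are illustrative assumptions, not Lemonade's actual rates:

```python
# Illustrative sketch of the described pricing model. The 50% figure comes
# from the article; the $0.10/mile base rate is a made-up example value.

FSD_DISCOUNT = 0.50  # FSD miles are billed at 50% of the manual rate


def monthly_premium(manual_miles: float, fsd_miles: float,
                    rate_per_mile: float) -> float:
    """Price manual miles at the full per-mile rate and FSD miles at half."""
    return (manual_miles * rate_per_mile
            + fsd_miles * rate_per_mile * (1 - FSD_DISCOUNT))


# A driver who does 800 of 1,000 monthly miles in FSD at a $0.10/mile base rate
# pays 200 * 0.10 + 800 * 0.05 instead of 1,000 * 0.10:
print(round(monthly_premium(manual_miles=200, fsd_miles=800,
                            rate_per_mile=0.10), 2))
```

The same 1,000 miles driven entirely manually would cost $100 under this sketch, so the savings scale directly with the share of miles driven in FSD.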

Unlike traditional car insurance that relies on estimates, Lemonade connects directly to your Tesla using Tesla’s Fleet API, which gives Lemonade access to vehicle data with a customer’s permission. This means:

  • No additional devices to install
  • No manual mileage reporting
  • Automatic tracking of FSD versus manual miles
  • Real-time data integration between Tesla and Lemonade apps

The technology handles everything behind the scenes, so you just drive and save.
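
The data flow above could look roughly like the sketch below. The host, endpoint path, and response fields are hypothetical placeholders standing in for Tesla's Fleet API; only the idea of polling vehicle data and splitting the odometer into FSD versus manual miles comes from the article:

```python
# Sketch of the mileage-splitting side of the integration described above.
# BASE_URL, the endpoint path, and the "fsd_miles"/"odometer_miles" fields
# are hypothetical, not Tesla's actual Fleet API schema.
import json
import urllib.request

BASE_URL = "https://fleet-api.example.com"  # placeholder host


def split_miles(vehicle_data: dict) -> dict:
    """Split total odometer miles into FSD vs manual miles."""
    total = vehicle_data["odometer_miles"]
    fsd = vehicle_data["fsd_miles"]  # hypothetical field
    return {"fsd_miles": fsd, "manual_miles": total - fsd}


def fetch_mileage(vehicle_id: str, token: str) -> dict:
    """Poll the (hypothetical) vehicle-data endpoint and split the result."""
    req = urllib.request.Request(
        f"{BASE_URL}/vehicles/{vehicle_id}/mileage",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return split_miles(json.load(resp))
```

Because the insurer reads the split directly from vehicle data, the customer never installs a dongle or files a mileage report.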

Lemonade Autonomous Car insurance is currently available for Tesla drivers in Arizona and launching in Oregon on February 26, 2026, with more states coming soon. To qualify, your Tesla needs:

  • Hardware 4.0 or higher
  • Firmware version 2025.44.25.5 or newer (easily updated for free through your Tesla app)
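
A minimal sketch of those two eligibility checks, assuming firmware strings compare component-by-component as integers (the thresholds are the article's; the parsing approach is an assumption):

```python
# Eligibility rules from the article: Hardware 4.0+ and firmware
# 2025.44.25.5 or newer. The comparison logic here is an assumption.

MIN_HARDWARE = 4.0
MIN_FIRMWARE = (2025, 44, 25, 5)


def firmware_tuple(version: str) -> tuple:
    """Parse a dotted firmware string like '2025.44.25.5' into integers."""
    return tuple(int(part) for part in version.split("."))


def is_eligible(hardware: float, firmware: str) -> bool:
    return hardware >= MIN_HARDWARE and firmware_tuple(firmware) >= MIN_FIRMWARE


print(is_eligible(4.0, "2025.44.25.5"))  # True: exactly at both minimums
print(is_eligible(4.0, "2025.40.3.1"))   # False: firmware too old
```

Comparing tuples of integers avoids the classic string-comparison bug where "2025.9" would sort after "2025.44".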

Traditional insurers might offer small discounts for safety features like adaptive cruise control or lane-keeping assistance, but they don’t distinguish between these basic driver assistance systems and sophisticated autonomous driving technology.

Lemonade recognizes that Tesla’s Full Self-Driving represents a fundamentally different level of automation, one that deserves a fundamentally different insurance approach.

Tesla reports that FSD technology leads to:

  • 52% overall crash reduction
  • 14% reduction in highway incidents
  • 63% reduction in non-highway accidents
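
Those reduction figures translate directly into the relative risk per FSD mile, which is the quantity a per-mile discount prices. The numbers are the article's; the arithmetic is the only thing added here:

```python
# Convert the reported crash reductions into relative risk per FSD mile
# (expected claim cost vs a manual mile). Figures are from the article.

reductions = {"overall": 0.52, "highway": 0.14, "non_highway": 0.63}

for segment, reduction in reductions.items():
    relative_risk = 1 - reduction
    print(f"{segment}: {relative_risk:.2f}x the risk of a manual mile")
```

The overall figure works out to roughly 0.48x, which is what makes a ~50% per-mile discount approximately actuarially neutral under these assumptions.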

These aren’t theoretical benefits; they’re real safety improvements that Lemonade’s pricing reflects. When autonomous driving technology demonstrably reduces risk, insurance costs should follow.

Instead of estimating your annual mileage and hoping for the best, Lemonade’s system knows exactly how many miles you drive in FSD mode versus manual mode. This usage-based insurance model means you pay precisely for the risk you represent.

As car manufacturers like Ford, GM, and others advance their autonomous driving technology, Lemonade is building the infrastructure to support them. Our approach with Tesla creates the foundation for insuring all types of self-driving vehicles as they become available.

If you own a Tesla with Hardware 4.0 or higher in an eligible state, you can start saving immediately with Lemonade Autonomous Car insurance. The 50% discount applies to every mile you drive using Full Self-Driving, potentially reducing your overall insurance costs significantly depending on how often you use the technology.

Lemonade Autonomous Car insurance works alongside our other products, so Tesla owners can bundle their autonomous coverage with homeowners, renters, pet, or term life insurance for additional discounts.

Getting started with Lemonade Autonomous Car insurance is straightforward:

  1. Verify your Tesla meets hardware and firmware requirements
  2. Connect your Tesla account to Lemonade
  3. Start driving with FSD and saving automatically

Everything is managed through the Lemonade app, with transparent tracking of your FSD miles and savings.

Whether you’re a current Tesla owner interested in Lemonade Autonomous Car insurance or considering buying a Tesla to take advantage of this innovative coverage, Lemonade makes the process simple.

Our autonomous car insurance represents the future of auto insurance: pricing that actually reflects the safety benefits of new technology, seamless integration with cutting-edge vehicles, and transparent, fair coverage that evolves with your driving.

Ready to see how much you could save with the first insurance designed specifically for autonomous vehicles? Get a quote today and discover why safer miles should cost less.

You’ll save 50% on every mile driven using Tesla’s Full Self-Driving technology. Your total savings depend on how often you use FSD versus manual driving, but many drivers see significant reductions in their overall insurance premiums.

No special policies are required: Lemonade Autonomous Car insurance integrates with regular car insurance requirements. We simply price your FSD miles at 50% less than manual miles because they’re demonstrably safer.

Your Tesla needs Hardware 4.0 or higher and firmware version 2025.44.25.5 or newer. The firmware update is free and easy to install through your Tesla app or directly in your car.

Existing customers can add autonomous coverage at their next renewal. We don’t recommend canceling your current policy early, as you’d pay unnecessary fees for a new policy setup.

We’re working to expand to more states as quickly as possible while ensuring compliance with local regulations. Follow our updates for announcements about new state availability.



Comments

  • By microtherion 2026-01-30 23:01 (7 replies)

    I'm quite skeptical of Tesla's reliability claims. But for exactly that reason, I welcome a company like Lemonade betting actual money on those claims. Either way, this is bound to generate some visibility into the actual accident rates.

    • By sfblah 2026-01-31 0:59 (1 reply)

      One thing that was unclear to me from the stats cited on the website is whether the quoted 52% reduction in crashes is when FSD is in use, or overall. This matters because people are much more likely to use FSD in situations where driving is easier. So, if the reduction is just during those times, I'm not even sure that would be better than a human driver.

      As an example, let's say most people use FSD on straight US Interstate driving, which is very easy. That could artificially make FSD seem safer than it really is.

      My prior on this is supervised FSD ought to be safer, so the 52% number kind of surprised me, however it's computed. I would have expected more like a 90-95% reduction in accidents.

      • By wrsh07 2026-01-31 2:38 (1 reply)

        I think this might be right, but it does two interesting things:

        1) it lets Lemonade reward you for taking safer driving routes (or living in a safer area to drive, whatever that means)

        2) it (for better or worse) encourages drivers to use it more. This will improve Tesla's training data but also might negatively impact the fsd safety record (an interesting experiment!)

        • By paulryanrogers 2026-01-31 4:11 (3 replies)

          > ...but also might negatively impact the fsd safety record (an interesting experiment!)

          As a father of kids in a neighborhood with a lot of Teslas, how do I opt out of this experiment?

          • By pfannkuchen 2026-01-31 7:33 (4 replies)

            Do your kids randomly run into the road? I was worried about that but then mine just don’t run into the road for some reason, they are quite careful about it seemingly by default after having “getting bumped into by a car” explained to them. I’m not sure if this is something people are just paranoid about because the consequences are so bad or if some kids really do just run out into the road randomly.

            • By dragonwriter 2026-01-31 10:39 (2 replies)

              Some kids really do just run into the road seemingly randomly. Other kids run in with a clear purpose, not at all randomly, and sometimes (perhaps very rarely, but it only takes once and bad luck) forget to look both ways. Kids are not cookie cutter copies that all behave the same way in the same circumstances (even with the same training).

              • By dctoedt 2026-02-04 12:05

                > Some kids really do just run into the road seemingly randomly. ... sometimes (perhaps very rarely, but it only takes once and bad luck) forget to look both ways.

                Just this week I was telling my law school contract-drafting class that part of our job as lawyers and drafters is to try to "child-proof" our contracts, because sometimes clients' staff understandably don't fully appreciate the possible consequences of 'running into the street,' no matter how good an idea it might seem at the time.

            • By paulryanrogers 2026-01-31 14:30

              I'm more worried about the Teslas hitting my kids when they're on bicycles or Teslas swerving off the road into the yards. Regardless, it sure would be nice if technology controlling multi-ton vehicles on public roads were subject to regulations, or at least had clearly defined liability.

            • By wcrossbow 2026-01-31 10:28 (1 reply)

              Kids will randomly run into the road. They might run after a ball or a dog so that it doesn’t end up on the other side or get run over, or they're simply too excited to remember your stern road safety talk.

              The first thing I was taught when I picked up a car was: if you see a ball on the road you stop immediately. This valuable lesson has saved one kid (and my sanity) with me at the wheel.

            • By tbossanova 2026-01-31 10:52

              Yes it does happen. Otherwise smart kids will do dumb stuff sometimes. Like see their friend across the road, but at that moment someone on a motorcycle is accelerating out of their driveway, kid runs across, dead

          • By fragmede 2026-01-31 18:51

            Same way you opt out of having drunk drivers drive home along your street and pass out while driving, or drivers getting a stroke or other blood clot while driving and crashing into parked cars.

    • By DaedalusII 2026-01-31 4:49 (2 replies)

      The insurance industry is a commercial prediction market.

      It is often an indicator of true honesty, providing there is no government intervention. Governments intervene in insurance/risk markets when they do not like the truth.

      I tried to arrange insurance for an obese western expatriate several years ago in an Asian country, and the (western) insurance company wrote a letter back saying the client was morbidly obese and statistically likely to die within 10 years, and they should lose x weight before they could consider having insurance.

      • By croddin 2026-01-31 6:52

        I could see prediction markets handling insurance in the future. It could probably get fairer prices, but it would have to be done right to avoid bad incentives; interesting to think about how that might work.

      • By cucumber3732842 2026-01-31 18:37 (1 reply)

        > providing there is no government intervention.

        You mean like forcing people to buy it and then shaping what products can and can't be offered with a spiderweb of complex rules?

        • By DaedalusII 2026-02-02 2:20

          The clearest example is the state of California preventing insurance companies from increasing annual premiums when risks increase. Please understand I have no political opinion about this. As a result, a lot of insurers have completely withdrawn and now it's not possible to insure houses properly for many people.

          https://www.theguardian.com/us-news/2023/may/27/state-farm-h...

          With no government intervention, the price of all fire insurance in California would increase materially to reflect the genuine risk of wildfire damage.

    • By JumpCrisscross 2026-01-30 23:13 (4 replies)

      > quite skeptical of Tesla's reliability claims

      I'm sceptical of Robotaxi/Cybercab. I'm less sceptical that FSD, supervised, is safer than fully-manual control.

      • By panopticon 2026-01-31 0:45 (4 replies)

        Where I live isn't particularly challenging to drive (rural Washington), but I'm constantly disengaging FSD for doing silly and dangerous things.

        Most notably my driveway meets the road at a blind y intersection, and my Model 3 just blasts out into the road even though you cannot see cross traffic.

        FSD stresses me out. It's like I'm monitoring a teenager with their learner's permit. I can probably count the number of trips where I haven't had to take over on one hand.

        • By parpfish 2026-01-31 3:26 (1 reply)

          > I'm constantly disengaging FSD for doing silly and dangerous things.

          You meant “I disable FSD because it does silly things”

          I read “I disable FSD so I can do silly things”

          • By horns4lyfe 2026-01-31 17:00 (1 reply)

            Exactly. Every bad situation I’ve been in with FSD was when I misread the situation and disengaged it during a maneuver that it was handling safely

            • By protimewaster 2026-02-01 14:20

              It feels unlikely that blindly entering cross traffic, as described in the previous post, is going to be a safe maneuver, though.

        • By horns4lyfe 2026-01-31 16:59

          I use it for 90% of my driving in Austin and it’s incredible

        • By apearson 2026-01-31 0:59 (2 replies)

          Do you have HW3 or HW4?

          • By lotsofpulp 2026-01-31 1:45 (1 reply)

            The newest FSD on HW4 was very good in my opinion. Multiple 45min+ drives where I don’t need to touch the controls.

            Still not paying $8k for it. Or $100 per month. Maybe $50 per month.

            • By fragmede 2026-01-31 18:53

              It's your sanity (and money) ¯\_(ツ)_/¯

          • By panopticon 2026-01-31 1:53

            HW3, unfortunately. Missed the HW4 refresh by a couple of months.

        • By elif 2026-01-31 2:03 (2 replies)

          it's edging into the intersection to get a better view on the camera. it's further than you would normally pull out, but it will NOT pull into traffic.

          • By panopticon 2026-01-31 2:39 (1 reply)

            It's not edging; it enters the street going a consistent speed (usually >10mph) from my driveway. The area is heavily wooded, and I don't think it "sees" the cross direction until it's already in the road. Or perhaps the lack of signage or curb make it think it has the right of way.

            My neighbor joked that I should install a stop sign at the end of my driveway to make it safer.

            • By cucumber3732842 2026-01-31 18:40

              Or just manually drive in your own driveway.

              The fact that it doesn't handle some specific person's driveway well is far from a condemnation of the system. I'm far more concerned about it mishandling things on "proper" roads at speed.

          • By seanmcdirmid 2026-01-31 2:05

            The software probably has a better idea of their car’s dimensions than a human driver, so will be able to get a better view of traffic by pulling out at just the right distance.

      • By madsmith 2026-01-30 23:32 (2 replies)

        Having handed over control of my vehicles to FSD many times, I’ve yet to come away from the experience feeling that my vehicle was operating in a safer regime for the general public than within my own control.

        • By smileysteve 2026-01-31 2:58 (1 reply)

          Keeping a 1-2 car's length stopping distance is likely over a 50% reduction in at fault damages.

          • By protimewaster 2026-02-01 14:22

            You can get this with just a fairly dumb radar cruise control system, though.

        • By Rover222 2026-01-30 23:37 (4 replies)

          I think you greatly overestimate humans

          • By ihaveajob 2026-01-31 0:15 (1 reply)

            The problem IMO is the transition period. A mostly safe system will make the driver feel at ease, but when an emergency occurs and the driver must take over, it's likely that they won't be paying full attention.

          • By Retric 2026-01-31 0:41 (1 reply)

            We aren’t talking about the average human here.

            On average you include sleep deprived people, driving way over the speed limit, at night, in bad weather, while drunk, and talking to someone. FSD is very likely situationally useful.

            But you can know most of those adverse conditions don’t apply when you engage FSD on a given trip. As such the standard needs to be extremely high to avoid increased risks when you’re sober, wide awake, the conditions are good, and you have no need to speed.

            • By izacus 2026-01-31 8:47 (2 replies)

              > On average you include sleep deprived people, driving way over the speed limit, at night, in bad weather, while drunk, and talking to someone. FSD is very likely situationally useful.

              Are those people also able to supervise FSD like the law and Tesla expect them to? That's also a question.

              • By fragmede 2026-01-31 18:54

                FSD will pull over and stop if it detects the driver has passed out. Can the law do that automatically?

          • By JumpCrisscross 2026-01-31 0:42 (1 reply)

            > you greatly overestimate humans

            Tesla's FSD still goes full-throttle dumbfuck from time to time. Like, randomly deciding it wants to speed into an intersection despite the red light having done absolutely nothing. Or swerving because of glare that you can't see, and a Toyota Corolla could discern with its radars, but which hits the cameras and so fires up the orange cat it's simulating on its CPU.

            • By fragmede 2026-01-31 18:55

              Yeah even corollas have better sensors than a Tesla for driving in fog. It's embarrassing.

      • By bayarearefugee 2026-01-31 1:40 (1 reply)

        > I'm less sceptical that FSD, supervised, is safer than fully-manual control.

        I'm very skeptical that the average human driver properly supervises FSD or any other "full" self driving system.

        • By microtherion 2026-01-31 9:45

          Supervised FSD — automating 99.9% of driving and expecting drivers to be fully alert for the other .1% — appears to go against everything we know about human attention.

      • By misiti3780 2026-01-30 23:20

        this ^^

    • By benatkin 2026-01-31 3:42

      > betting actual money on those claims

      Insurance companies can let marketing influence rates to some degree, with programs that tend to be tacked on after the initial rate is set. This self-driving car program sounds an awful lot like safe driver programs like GEICO Clean Driving Record, State Farm Good Driver Discount, Progressive Safe Driver, Progressive Snapshot, and Allstate Drivewise. The risk assessment seems to be less thorough than the general underwriting process, and to fall within some sort of risk margin, so to me it seems gimmicky and not a true innovation at this point.

    • By rubyfan 2026-01-30 23:51

      Lemonade will have some actual claim data to support this already, not relying on the word of Tesla.

    • By sMarsIntruder 2026-01-31 8:11 (1 reply)

      They don’t bet money on just “I’m quite skeptical because I hate the man”, but on actual data provided by the company.

      That’s the difference.

      • By microtherion 2026-01-31 9:52 (1 reply)

        The skepticism and hate is based on observing decades of shameless dishonesty, which is itself a form of data provided by the company: https://motherfrunker.ca/fsd/

        • By sMarsIntruder 2026-01-31 14:07

          Still doesn’t change my point: as of today, being skeptical based on outdated data or historical series is just nonsense. I mean, insurance quotes work in a totally different way.

    • By thegreatpeter 2026-01-31 13:57

      Do you drive a HW4? I’m 90% FSD on my total car miles

  • By jasoncartwright 2026-01-30 17:31 (17 replies)

    If it's autonomous or self-driving, then why is the person in the car paying for the insurance? Surely if it's Tesla making the decisions, they need the insurance?

    • By gizmo686 2026-01-30 20:21 (3 replies)

      Generally speaking, liability for a thing falls on the owner/operator. That person can sue the manufacturer to recover the damages if they want. At some point, I expect it to become somewhat routine for insurers to pay out, then sue the manufacturer to recover.

      • By amelius 2026-01-30 20:40 (1 reply)

        Or at some point subscribing to a service may be easier than owning the damn thing.

        • By DaSHacka 2026-01-30 21:18 (2 replies)

          All according to plan

          • By koakuma-chan 2026-01-30 22:23 (3 replies)

            It already doesn't make sense to own a car for me. It's cheaper to just call an Uber.

            • By ghaff 2026-01-31 1:24 (1 reply)

              I'm guessing that's a fairly city-centric viewpoint. My car is set up with a roof rack and carries a lot of other gear I want. I'm regularly in places without reliable cell etc. Visiting friends can easily be an hour drive.

              • By koakuma-chan 2026-01-31 2:48

                Yes, a city viewpoint. I usually just walk, but when I don't I most often take the subway, not even Uber. Though I feel like in Toronto the subway or some part thereof is closed or under maintenance or whatever way too often. It's not very reliable.

            • By paulddraper 2026-01-31 21:15

              Depends how often.

              Multiple Ubers per day are expensive. ($55 x 365 ≈ $20,000)

              All in, a budget car costs less than half of that per year.

              But if you replace some of that with public transportation, or a car is otherwise impractical, the math changes.

            • By gffrd 2026-01-30 23:13 (1 reply)

              For some this is the case. For others, this is not the case.

              • By amelius 2026-01-31 0:14 (1 reply)

                some -> most ?

                • By DaSHacka 2026-02-01 19:37

                  In west/east coast cities maybe.

                  Talk to anyone from the midwest about not owning a car and they'll laugh you out of the room.

                  Well, unless you're proposing they switch to ATVs and snowmobiles, in which case some people can technically get by without a traditional automobile.

          • By amelius 2026-01-31 14:43 (1 reply)

            If you take off the conspiracy hat, you will see that there are many advantages to not owning a product. Such as that the vendor's incentives are better aligned with yours. For example, if the thing breaks, it is in __their__ best interest to fix it (or to not let it break in the first place). This also has positive implications for sustainability.

            • By physicles 2026-01-31 16:02 (1 reply)

              It’s also in their best interest to set the price so as to maximize their own profits. If switching costs or monopoly power allow them to set a higher price, they will do so.

              Have we learned nothing from a decade of subscription services?

              • By amelius 2026-01-31 16:56 (1 reply)

                Nobody said we should allow monopolies?

                • By fragmede 2026-01-31 19:01

                  Especially Adam Smith. The claims are scattered throughout The Wealth of Nations, but he hated them with specificity. He said they raise prices and lower quality, misallocate capital, and corrupt politics, among other things.

      • By PunchyHamster 2026-01-31 8:16 (1 reply)

        But Tesla is the operator

        • By fragmede 2026-01-31 19:03 (1 reply)

          Aside from the human in the vehicle holding the steering wheel with a foot on the pedal, that is.

          • By aurareturn 2026-02-01 3:42 (2 replies)

            That’s today. If Tesla ever becomes fully autonomous, you won’t need that.

            • By bdangubic 2026-02-01 21:56

              If I ever marry Oprah I’ll be a rich man :)

            • By PunchyHamster 2026-02-01 21:37

              google what F stands for in FSD

      • By einpoklum 2026-01-30 23:39 (2 replies)

        Ah, but could one not argue that the owner of the self-driving car is _not_ the operator, and it is the car, or perhaps Tesla, which operates it?

        • By kube-system 2026-01-31 2:50 (1 reply)

          All Tesla vehicles require the person behind the steering wheel to supervise the operations of the vehicle and avoid accidents at all times.

          Also, even if a system is fully automated, that doesn’t necessarily legally isolate the person who owns it or set it into motion from liability. Vehicle law would generally need to be updated to change this.

          • By einpoklum 2026-01-31 13:03 (1 reply)

            But that might be considered a legal trick. Suppose that, when you pay for a taxi, the standard conditions of carriage would make it your responsibility to supervise the vehicle operation and alert the driver so as to avoid accidents. Would the taxi driver and taxi company be able to escape liability through that formalism? Probably not. The fact that Tesla makes you sign something does not automatically make the signed document valid and enforceable.

            It may be that it is; but then, if you are required to be watchful at all times, and be able to take over from the autonomous vehicle at all times, then the autonomy doesn't really help you all that much, does it?

            • By kube-system 2026-01-31 17:03 (1 reply)

              No, Tesla doesn’t assign you liability by making you sign something. The law makes the driver of a vehicle liable for the operation, as it always has.

              My first sentence was to say that even if the law treats autonomous vehicles differently, Tesla doesn’t sell one.

              • By einpoklum 2026-01-31 21:25 (1 reply)

                > The law makes the driver of a vehicle liable for the operation, as it always has.

                So, either those Teslas don't really self-drive (which may be the case, I don't know, but then the whole discussion is moot), or they do, in which case, the human wasn't the one driving and may thus avoid liability.

                Then of course there is the possibility that the court might be convinced the car was being driven collaboratively by the human and the car/the computer, in which case Tesla and the human might share the liability. IANA(US)L though.

                • By kube-system 2026-02-02 16:02

                  > either those Teslas don't really self-drive

                  All Teslas are level 2 ADAS and require the human behind the wheel to monitor the vehicle and intervene when necessary.

                  > or they do, in which case, the human wasn't the one driving and may thus avoid liability.

                  That is not legally true. Automation does not absolve someone from liability. Owners of a piece of machinery have liability just by being the owner and placing it into operation.

                  Forget about cars for a second -- we already have many products that are entirely automated already, for example: an elevator. If you own a building with an elevator, and it hurts someone, the building owner is absolutely going to be sued over it, and "oh, it's automated" isn't a get-out-of-court free card.

                  There are still responsibilities that the owner has: did they properly maintain it? were they aware of an issue but decided to operate it anyway? were they in a position to intervene and avoid the accident, but failed to do so?

        • By sroussey 2026-01-30 23:59 (2 replies)

          Mercedes agrees. They take on liability when their system is operated appropriately.

          • By kube-system 2026-01-31 2:43

            They say they will, but until relevant laws are updated, this is mostly contractual and not a change to legal liability. It is similar to how an insurance company takes responsibility for the way you operate your car.

            If your local legal system does not absolve you from liability when operating an autonomous vehicle, you can still be sued, and Mercedes has no say in this… even though they could reimburse you.

          • By iknowstuff 2026-01-31 9:09

            No. They don’t. It was vaporware made to fool people including you. You could never actually order it and it’s canceled now in favor of an L2 system.

    • By kjksf 2026-01-30 18:02 (8 replies)

      Because that's the law of the land currently.

      The product you buy is called "FSD Supervised". It clearly states you're liable and must supervise the system.

      I don't think there's a law that would allow Tesla (or anyone else) to sell a passenger car with an unsupervised system.

      If you take Waymo or Tesla Robotaxi in Austin, you are not liable for accidents, Google or Tesla is.

      That's because they operate on limited state laws that allow them to provide such service but the law doesn't allow selling such cars to people.

      That's changing. Quite likely this year we will have federal law that will allow selling cars with fully unsupervised self-driving, in which case the insurance/liability will obviously land on the maker of the system, not person present in the car.

      • By throwaway2037 2026-01-31 10:04

            > Quite likely this year we will have federal law that will allow selling cars with fully unsupervised self-driving, in which case the insurance/liability will obviously land on the maker of the system, not person present in the car.
        
        You raise an important point here. Is it economically feasible for system makers to bear the responsibility of self-driving car accidents? It seems impossible, unless the cars are much more expensive to cover the potential future costs. I'm very curious how Waymo insures their cars today. I assume they have a bespoke insurance contract negotiated with a major insurer. Also, do we know the initial cost of each Waymo car (to say nothing of ongoing costs from compute/mapping/etc.)? It must be very high (2x?) given all of the special navigation equipment that is added to each car.

      • By paulryanrogers 2026-01-31 4:25 (1 reply)

        Tacking "Supervised" on the end of "Full Self Driving" is just contradictory. Perhaps if it was "Partial Self Driving" then it wouldn't be so confusing.

        • By pests 2026-01-31 5:00 (1 reply)

          It's only to differentiate it from their "Unsupervised FSD," which is what they call it now.

          • By paulryanrogers 2026-01-31 14:28 (1 reply)

            That is redundant and doesn't make the other any less contradictory

            • By pests 2026-02-01 1:48

              I agree, but I think context is important here. It was called FSD, but they got into trouble; now it's "Supervised" so people know it's not, well, unsupervised FSD. Yes, I know it doesn't make sense.

      • By kolbe 2026-01-30 20:19

        > Quite likely this year we will have federal law that will allow selling cars with fully unsupervised self-driving, in which case the insurance/liability will obviously land on the maker of the system, not person present in the car.

        This is news to me. This context seems important to understanding Tesla's decision to stop selling FSD. If they're on the hook for insurance, then they will need to dynamically adjust what they charge to reflect insurance costs.

      • By arijun 2026-01-30 21:06 (1 reply)

        I imagine insurance would be split in two in that case. Carmakers would not want to be liable for e.g. someone striking you in a hit-and-run.

        • By smallnix 2026-01-3022:051 reply

          If the car that did a hit-and-run was operated autonomously the insurance of the maker of that car should pay. Otherwise it's a human and the situation falls into the bucket of what we already have today.

          So yes, carmakers would pay in a hit-and-run.

          • By JumpCrisscross 2026-01-3023:181 reply

            > If the car that did a hit-and-run was operated autonomously the insurance of the maker of that car should pay

            Why? That's not their fault. If a car hits and runs my uninsured bicycle, the manufacturer isn't liable. (My personal umbrella or other insurance, on the other hand, may cover it.)

            • By graeme 2026-01-310:55

              They're describing a situation of liability, not mere damage. If your bicycle is hit, you didn't do anything wrong.

              If you run into someone on your bike and are at fault then you generally would be liable.

              They're talking about the hypothetical where you're on your bike, which was sold as an autonomous bike whose manufacturer's software fully drives it, and it runs into someone and is at fault.

      • By AlotOfReading 2026-01-3020:06

        You can sell autonomous vehicles to consumers all day long. There's no US federal law prohibiting that, as long as they're compliant with FMVSS as all consumer vehicles are required to be.

      • By rubyfan 2026-01-3019:20

        Waymo is also a livery service, which you normally aren't liable for as a passenger of a taxi or limousine unless you have deep pockets. /IANAL

      • By jasoncartwright 2026-01-3018:431 reply

        I see. So the Tesla product they are selling insurance around isn't "Full Self-Driving" or "Autonomous" like the page says.

        • By FeloniousHam 2026-01-3018:543 reply

          My current FSD usage is 90% over ~2000 miles (since v14.x). Besides driving everywhere, everyday with FSD, I have driven 4 hours garage to hotel valet without intervention. It is absolutely "Full Self-Driving" and "Autonomous".

          FSD isn't perfect, but it is everyday amazing and useful.

          • By JumpCrisscross 2026-01-3023:252 reply

            > My current FSD usage is 90% over ~2000 miles

            I'd guess my Subaru's lane-keeping utilisation is in the same ballpark. (By miles, not minutes. And yes, I'm safer when it and I are watching the road than when I'm watching the road alone.)

            • By olyjohn 2026-01-312:03

              My favorite feature of Subaru's system is when you change lanes, and it stays locked onto the car in the slower lane and slams on the brakes. People behind you love that.

            • By FeloniousHam 2026-02-0214:06

              I don't want to minimize the efforts of other manufacturers (I'm sure they'll all have Tesla's features in the next generation), but: my wife has a Subaru Outback, and the two systems are as close in functionality as humans are to chimpanzees. The differences are many, stark and subtle (that Subaru screen). I'd just say take a test drive with FSD.

          • By wat10000 2026-01-3021:193 reply

            If it was full self driving, wouldn't your usage be 100%?

            • By FeloniousHam 2026-02-0214:08

              > It's not perfect,

              Probably about 90% perfect! Obviously we don't agree on the definition.

            • By pests 2026-01-315:011 reply

              Sometimes a car is fun to drive.

              • By fragmede 2026-01-3119:23

                It refuses to engage above, like, 80.

          • By jasoncartwright 2026-01-3018:582 reply

            Yet still relying on you to cover it with your insurance. Again, clearly not autonomous.

            • By AlotOfReading 2026-01-3020:261 reply

              Liability is a separate matter from autonomy. I assume you'd consider yourself autonomous, yet it's your employer's insurance that will be liable if you have an accident while driving a company vehicle.

              If the company required a representative to sit in the car with you and participate in the driving (e.g. by monitoring and taking over before an accident), then there's a case to be made that you're not fully autonomous.

              • By buran77 2026-01-3021:58

                > it's your employer's insurance that will be liable if you have an accident while driving a company vehicle

                I think you're mixing some concepts.

                There's car insurance paid by the owner of the car, for the car. There's workplace accident insurance, paid by the employer for the employee. The liability isn't assigned by default, but by determining who's responsible.

                The driver is always legally responsible for accidents caused by their negligence. If you play with your phone behind the wheel and kill someone, even while working and driving a company car, the company's insurance might pay for the damage but you go to prison. The company will recover the money from you. Their work accident insurance will pay nothing.

                The test you can run in your head: will you get arrested if you fall asleep at the wheel and crash? If yes, then it's not autonomous or self driving. It just has driver assistance. It's not that the car can't drive itself at all, just that it doesn't meet the bar for the entire legal concept of "driver/driving".

                "Almost" self driving is like jumping over a canyon and almost making it to the other side. Good effort, bad outcome.

            • By dzhiurgis 2026-01-3020:081 reply

              [flagged]

              • By zen928 2026-01-3020:46

                Disagree. I appreciate their viewpoint tethering corporate claims to reality by illustrating that Tesla is obfuscating the classification of its machines as autonomous when they actually aren't. Their comments in other thread chains proved fruitful where they lacked agitators looking to dismiss critique by citing website rules, like the post adding additional detail on how Tesla muddles legal claims by cooking up cherry-picked evidence that works against the driver despite being the insurer.

      • By 2III7 2026-01-3020:102 reply

        Without LIDAR and/or additional sensors, Tesla will never be able to provide "real" FSD, no matter how wonderful their software controlling the car is.

        Also, self driving is a feature of a vehicle someone owns; I don't understand how that should exempt anyone from insuring their property.

        Waymo and others are providing a taxi service where the driver is not a human. You don't pay insurance when you ride Uber or Bolt or any other regular taxi service.

        • By Marsymars 2026-01-3020:26

          > Also, self driving is a feature of a vehicle someone owns, I don't understand how that should exempt anyone from insuring their property.

          Well practically speaking, there’s nothing stopping anyone from voluntarily assuming liability for arbitrary things. If Tesla assumes the liability for my car, then even if I still require my “own” insurance for legal purposes, the marginal cost of covering the remaining risk is going to be close to zero.

        • By arijun 2026-01-3021:041 reply

          Never say never—it’s not physically impossible. But yes, as it stands, it seems that Tesla will not be self driving any time soon (if ever).

          • By kjksf 2026-01-3021:422 reply

            They literally just (in the last few days) started unsupervised robotaxis in Austin.

            They are as self-driving as a car can be.

            This is different than the one where they had a human supervisor in passenger seat (which they still do elsewhere).

            And different than the one where they didn't have human supervisor but did have a follow car.

            Now they have a few robotaxis that are self driving.

    • By zugi 2026-01-312:021 reply

      If your minor child breaks something, or your pet bites someone, you are liable.

      This analogy may be more apt than Tesla would like to admit, but from a liability perspective it makes sense.

      You could in turn try to sue Tesla for defective FSD, but the now-clearly-advertised "(supervised)" caveat, plus the lengthy agreement you clicked through, plus lots of lawyers, makes you unlikely to win.

      • By paulryanrogers 2026-01-314:23

        Can a third party reprogram my dog or child at any moment? Or even take over and control them?

    • By davidhunter 2026-01-3017:521 reply

      Seems like the role of the human operator in the age of AI is to be the entity they can throw in jail if the machine fails (e.g. driver, pilot)

      • By taneq 2026-01-3022:552 reply

        I’ve said for years that pragmatically, our definition of a “person” is an entity that can accept liability and take blame.

        • By wcfrobert 2026-01-310:071 reply

          LLCs can't go to jail though

          • By amitav1 2026-01-310:36

            Because LLCs aren't people

        • By sroussey 2026-01-310:08

          Not to be confused with “human” thanks to SCOTUS.

    • By JumpCrisscross 2026-01-3023:161 reply

      > Surely if it's Tesla making the decisions, they need the insurance?

      Why surely? Turning on cruise control doesn't absolve motorists of their insurance requirement.

      And the premise is false. While Tesla does "not maintain as much insurance coverage as many other companies do," there are "policies that [they] do have" [1]. (What it insures is a separate question.)

      [1] https://www.sec.gov/ix?doc=/Archives/edgar/data/0001318605/0...

      • By forgetfreeman 2026-01-3023:421 reply

        Cruise control is hardly relevant to a discussion of liability for autonomous vehicle operation.

        • By fragmede 2026-01-3023:471 reply

          In the context of ultramodern cruise control (eg comma.ai), which has a radar to track the distance to the car (if any) in front of you, and cameras so the car can wind left or right and track the freeway, I think it does.

          • By sroussey 2026-01-310:071 reply

            Not unless they are marketing it as “autopilot” or some such that a random consumer would reasonably assume meant autopilot.

            And I’d include “AI driver” as an example.

            • By olyjohn 2026-01-311:41

              A random consumer doesn't actually understand what Autopilot means. Most people don't have pilot's licenses. And cars don't fly. Did you not see all the debacles around it when it first came out?

    • By ck2 2026-01-3022:15

      The coder and sensor manufacturers need the insurance for wrongful death lawsuits

      and Musk for removing lidar so it keeps jumping across high speed traffic at shadows because the visual cameras can't see true depth

      99% of the people on this website are coders and know how even one small typo can cause random fails, yet you trust them to make you an alpha/beta tester at high speed?

    • By stubish 2026-01-311:06

      Risk gets passed along until someone accepts it, usually an insurance company or the operator. If the risk was accepted and paid for by Tesla, then the cost would simply be passed down to consumers. All consumers, including those that want to accept the risk themselves. In particular, if you have a fleet of cars it can be cheaper to accept the risk and only pay for mandatory insurance, because not all of your cars are going to crash at the same time, and even if they did, not all in the worst way possible. This is how insurance works, by amortizing lots of risk to make it highly improbable to make a loss in the long run.
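      A toy Monte Carlo makes the amortization point concrete. The crash probability and claim size below are made-up illustration values, not real actuarial figures:

```python
import random

random.seed(0)

CRASH_PROB = 0.05   # assumed annual crash probability per car (hypothetical)
AVG_CLAIM = 20_000  # assumed average claim cost in dollars (hypothetical)

def avg_cost_per_car(fleet_size, years=500):
    """Simulate annual per-car payout over many years; return (mean, worst year)."""
    costs = []
    for _ in range(years):
        # Count independent crashes across the fleet this year.
        crashes = sum(random.random() < CRASH_PROB for _ in range(fleet_size))
        costs.append(crashes * AVG_CLAIM / fleet_size)
    return sum(costs) / len(costs), max(costs)

for n in (10, 100, 2_000):
    mean, worst = avg_cost_per_car(n)
    print(f"fleet={n:>5}: mean ~${mean:,.0f}/car, worst year ${worst:,.0f}/car")
```

      The mean cost per car is about the same at every fleet size, but the worst single year shrinks sharply as the pool grows: that predictability, not lower risk, is what pooling buys.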

    • By jimt1234 2026-01-3018:001 reply

      Not an expert here, but I recall reading that certain European countries (Spain???) allow liability to be put on the autonomous driving system, not the person in the car. Does anyone know more about this?

      • By bluGill 2026-01-3020:40

        That is the case everywhere. It is common when buying a product for the contract to include who has liability for various things. The price often changes by a lot depending on who has liability.

        Cars are traditionally sold with the customer bearing liability. Nothing stops a car maker (or even an individual dealer) from selling cars today while taking on all the insurance liability, in any country I know of. They don't, for what I hope are obvious reasons (bad drivers will be sure to buy those cars since it is a better deal for them, and in turn a worse deal for good drivers), but they could.

        Self driving is currently sold with the customer holding liability because that is how it has always been done. I doubt it will change, but only because I doubt there will ever be enough advantage to make it worth it for someone else to take on the liability. I could be wrong, though.

    • By throw20251220 2026-01-3017:402 reply

      It’s because you bought it. Don’t buy it if you don’t want to insure.

      • By SoftTalker 2026-01-3017:431 reply

        Yep, you bought it, you own it, you choose to operate it on the public roads. Therefore your liability.

        • By 9rx 2026-01-3017:482 reply

          If you bought and owned it, you could sell it to another auto manufacturer for some pretty serious amounts of money.

          In reality, you acquired a license to use it. Your liability should only go as far as you have agreed to indemnify the licensor.

          • By recursive 2026-01-3021:412 reply

            You can actually do that. Except that they could just buy one themselves.

            Companies exist that buy cars just to tear them down and publish reports on what they find.

            • By 9rx 2026-01-311:181 reply

              > Companies exist that buy cars just to tear them down and publish reports on what they find.

              What does it mean to tear down software, exactly? Are you thinking of something like decompilation?

              You can do that, but you're probably not going to learn all that much, and you still can't use it in any meaningful sense as you never bought it in the first place. You only licensed use of it as a consumer (and now that it is subscription-only, maybe not even that). If you have to rebuild the whole thing yourself anyway, what have you really gained? It's not exactly a secret how the technology works, only costly to build.

              > Except that they could just buy one themselves.

              That is unlikely, unless you mean buying Tesla outright? Getting a license to use it as a manufacturer is much more realistic, but still a license.

              • By recursive 2026-01-313:311 reply

                Check out Munro and Associates. I'm not talking about software. The whole car.

                • By 9rx 2026-01-313:501 reply

                  For what reason?

                  In case you have forgotten, the discussion is about self-driving technology, and specifically Tesla's at that. The original questioner asked why he is liable when it is Tesla's property that is making the decisions. Of course, the most direct answer is because Tesla disclaims any liability in the license agreement you must agree to in order to use said property.

                  Which has nothing to do with an independent consulting firm or "the whole car" as far as I can see. The connection you are trying to establish is unclear. Perhaps you pressed the wrong 'reply' button by mistake?

                  • By recursive 2026-01-319:53

                    I started responding to this. I interpreted it to be referring to the whole car.

                    > Yep, you bought it, you own it, you choose to operate it on the public roads. Therefore your liability.

      • By Rebelgecko 2026-01-3017:592 reply

        I don't think Tesla lets you buy FSD

        • By throw20251220 2026-02-0115:19

          If they don’t let you buy, you don’t own. If you don’t own, how is that insurance even available to you?

        • By scottyah 2026-01-3020:101 reply

          They do, until Feb 14th.

          • By Rebelgecko 2026-01-3022:57

            Even now I think it's a revocable license

    • By seanmcdirmid 2026-01-3023:522 reply

      I think there is an even bigger insurance problem to worry about: if autonomous vehicles become common and are a lot safer than manual driven vehicles, insurance rates for human driven cars could wind up exploding as the risk pool becomes much smaller and statistically riskier. We could go from paying $200/month to $2000/month if robo taxis start dominating cities.

      • By AnthonyMouse 2026-01-310:401 reply

        > if autonomous vehicles become common and are a lot safer than manual driven vehicles, insurance rates for human driven cars could wind up exploding as the risk pool becomes much smaller and statistically riskier.

        The assumption there is that the remaining human drivers would be the higher risk ones, but why would that be the case?

        One of the primary movers of high risk driving is that someone goes to the bar, has too many drinks, then needs both themselves and their car to get home. Autonomous vehicles can obviously improve this by getting them home in their car without them driving it, but if they do, the risk profile of the remaining human drivers improves. At worst they're less likely to be hit by a drunk driver, at best the drunk drivers are the early adopters of autonomous vehicles and opt themselves out of the human drivers pool.

        • By seanmcdirmid 2026-01-310:452 reply

          Drunk driving isn't the primary mover of high risk driving. Rather you have:

          1. People who can't afford self driving cars (now the insurance industry has a good proxy for income that they couldn't tap into before)

          2. Enthusiasts who like driving their cars (cruisers, racers, Hellcat revving, people who like doing donuts, etc...)

          3. Older people who don't trust technology.

          None of those are good risk pools to be in. Also, if self driving cars go mainstream, they are bound to include the safest drivers overnight, so whatever accidents/crashes happen afterwards are covered by a much smaller and "active" risk pool. Oh, and those self driving cars are expensive:

          * If you hit one and are at fault, you might pay out $100k-$200k; most states only require $25k-$50k of coverage... so you need more coverage, or expect to pay more per incident.

          * Self driving cars have a lot of sensors/recorders. While this could work to your advantage (proving that you aren't at fault), it often isn't (they have evidence that you were at fault). Whereas before fault might have been much more hazy (both at fault, or both no fault).

          The biggest factor comes if self driving cars really are much safer than human drivers. They will basically disappear from the insurance market, or somehow be covered by product liability instead of insurance...and the remaining drivers will be in a pool of the remaining accidents that they will have to cover on their own.

          • By cucumber3732842 2026-01-3118:55

            Classic car insurance is dirt cheap, even for daily driven stuff. Removing people who don't want to drive and don't care to not suck at it hugely improves the risk pool.

            If there's only a small minority of human drivers (people like you will have bigger fish to screech about), there will be substantially less political will to perpetuate the system. It'll probably go away in favor of a far simpler and cheaper "post up a bond" type thing, and the expensive mechanisms for grading drivers will largely be dismantled.

          • By AnthonyMouse 2026-01-311:161 reply

            > Drunk driving isn't the primary mover of high risk driving.

            It kind of is. They're responsible for something like 30% of traffic fatalities despite being a far smaller percentage of drivers.

            > People who can't afford self driving cars (now the insurance industry has a good proxy for income that they couldn't tap into before)

            https://pubmed.ncbi.nlm.nih.gov/30172108/

            But also, wouldn't they already have this by using the vehicle model and year?

            > Enthusiasts who like driving their cars (cruisers, racers, Helcat revving, people who like doing donuts, etc...)

            Again something that seems like it would already be accounted for by vehicle model.

            > Older people who don't trust technology.

            How sure are we that the people who don't trust technology are older? And again, the insurance company already knows your age.

            > Also, if self driving cars go mainstream, they are bound to include the safest drivers overnight

            Are they? They're more likely to include the people who spend the most time in cars, which is another higher risk pool, because it allows those people to spend the time on a phone/laptop instead of driving the car, which is worth more to people the more time they spend doing it and so justifies the cost of a newer vehicle more easily.

            > Oh, and those self driving cars are expensive

            Isn't that more of a problem for the self-driving pool? Also, isn't most of the cost that the sensors aren't as common and they'd end up costing less as a result of volume production anyway?

            > Self driving cars have a lot of sensors/recorders. While this could work to your advantage (proving that you aren't at fault), it often isn't (they have evidence that you were at fault). Whereas before fault might have been much more hazy (both at fault, or both no fault).

            Which is only a problem for the worse drivers who are actually at fault, which makes them more likely to move into the self-driving car pool.

            > The biggest factor comes if self driving cars really are much safer than human drivers.

            The biggest factor is which drivers switch to self-driving cars. If half of human drivers switched to self-driving cars but they were chosen completely at random then the insurance rates for the remaining drivers would be essentially unaffected. How safe they are is only relevant insofar as it affects your chances of getting into a collision with another vehicle, and if they're safer then it would make that chance go down to have more of them on the road.
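            The random-switch argument can be sketched with toy numbers (the 80/20 risk mix and claim size here are hypothetical):

```python
import random

random.seed(1)

AVG_CLAIM = 15_000
# Hypothetical pool: 80 low-risk drivers (2% annual claim chance)
# and 20 high-risk drivers (10% annual claim chance).
risks = [0.02] * 80 + [0.10] * 20

def fair_premium(pool):
    """Expected annual claim cost per driver in the pool."""
    return sum(p * AVG_CLAIM for p in pool) / len(pool)

everyone = fair_premium(risks)                        # baseline pool
random_half = fair_premium(random.sample(risks, 50))  # an unbiased random half stays
adverse = fair_premium([0.10] * 20 + [0.02] * 30)     # high-risk drivers all stay

print(everyone, random_half, adverse)
```

            A random half keeps roughly the baseline premium; rates only move when the drivers who stay are systematically riskier than the ones who left.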

            • By seanmcdirmid 2026-01-312:011 reply

              Only .61% of car crashes involve fatalities, so that’s like .2% of car crashes you are referring to. Probably more due to alcohol, but we don’t know the ratio of accidents that involve alcohol, which would be more telling.

              > How sure are we that the people who don't trust technology are older? And again, the insurance company already knows your age

              Boomers are already the primary anti-EV demographic, with the complaint that real cars have engines. It doesn't matter if they know your age if state laws keep them from acting on it.

              > that more of a problem for the self-driving pool? Also, isn't most of the cost that the sensors aren't as common and they'd end up costing less as a result of volume production anyway?

              I think you misunderstood me: if you get into an accident and are found at fault, you are responsible for damage to the other car. If it's a clunker Toyota, that will be a few thousand dollars; if it's a Rolls-Royce, a few hundred thousand. The reason insurance rates are increasing lately is that the average car on the road is more expensive than it was ten years ago, so insurance companies are paying out more. If most cars are $250k Waymo cars, and you hit one…and you are at fault, ouch. And we will know if it is your fault or not, since the Waymo is constantly recording.

              > If half of human drivers switched to self-driving cars but they were chosen completely at random then the insurance rates for the remaining drivers would be essentially unaffected.

              That’s not how the math works out (smaller risk pools are more expensive per person period). And it won’t be people switching at random to self driving cars (the ones not switching will be the ones that are more likely to have accidents).

              • By AnthonyMouse 2026-01-317:33

                > Only .61% of car crashes involve fatalities, so that’s like .2% of car crashes you are referring to. Probably more due to alcohol, but we don’t know the ratio of accidents that involve alcohol, which would be more telling.

                Fatalities get more thoroughly investigated so we have better numbers on them, but if you had to guess whether the people who get behind the wheel drunk were similarly disproportionately likely to bang up their cars in a non-fatal way, what would your guess be?

                > Boomers are already the primary anti-EV demographic, with the complaint that real cars have engines.

                EVs and self-driving are two different things. Fox News tells boomers that EVs are bad because Republicans have the oil companies as a constituency.

                > It doesn’t matter if they know your age of state laws keep them from acting on it.

                The only states that do that are Hawaii and Massachusetts.[1]

                [1] https://www.cnbc.com/select/best-car-insurance-seniors/

                > If most cars are $250k Waymo cars, and you hit one…and you are at fault, ouch. And we will know if it is your fault or not since the Waymo is constantly recording.

                If X% of cars are Waymos and you hit another car in your normally priced car and you're at fault, there is an X% chance it will be expensive. If the Waymo hits another car and it's at fault, there is a 100% chance it will be expensive because it will damage itself, and an additional X% chance that it will be very expensive because both cars are.

                And again, that's assuming the price stays as high as it is when the production volume increases. A $250,000 car can't become the majority of cars because that percentage of people can't afford that.

                > That’s not how the math works out (smaller risk pools are more expensive per person period).

                Smaller risk pools don't have higher risk, they have higher volatility, and then if they're too small insurers have to charge a volatility premium. But the auto insurance market is very large and for it to get to the size that it would have volatility issues it would have to be a consequence rather than a cause of the large majority of people switching to self-driving cars.
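                The volatility point is just the 1/sqrt(n) scaling of a binomial average; a quick sketch with made-up numbers:

```python
import math

p = 0.05        # assumed per-driver annual claim probability (hypothetical)
claim = 20_000  # assumed claim size in dollars (hypothetical)

for n in (100, 10_000, 1_000_000):
    mean_loss = p * claim                         # identical at every pool size
    sd_loss = claim * math.sqrt(p * (1 - p) / n)  # shrinks like 1/sqrt(n)
    print(f"pool={n:>9}: expected ${mean_loss:,.0f}/driver, "
          f"std dev ${sd_loss:,.2f}")
```

                The expected loss per driver never changes; only the spread around it does, which is why shrinking the pool raises the volatility premium rather than the underlying risk.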

                > And it won’t be people switching at random to self driving cars (the ones not switching will be the ones that are more likely to have accidents).

                You keep saying that but it's still not obvious that it's what would happen, and in any event the ones more likely to have accidents are already the ones paying higher insurance premiums -- which is precisely a reason they would have the incentive to be the first to switch to self-driving cars.

      • By mavhc 2026-01-310:083 reply

        The fact you think $200 per month is sane is amusing to people in other countries

        • By theodric 2026-01-310:11

          Hell, I was paying €180/yr for my New Beetle a decade ago...

        • By seanmcdirmid 2026-01-310:37

          Haha, yes, today already sucks badly in many US markets. Imagine what will happen when the only people driving cars manually are "enthusiasts".

        • By wavesquid 2026-01-310:331 reply

          Is that low or high?

          • By raisedbyninjas 2026-01-3119:16

            I'm guessing that other developed countries don't need 6-7 figure injury coverage.

    • By djoldman 2026-01-3022:311 reply

      That's probably the future; Mercedes currently does do this in limited form:

      https://www.roadandtrack.com/news/a39481699/what-happens-if-...

    • By ponector 2026-01-311:15

      Why is the ship owner paying for the insurance when it's the captain making all the decisions?

    • By jgbuddy 2026-01-3017:373 reply

      Because the operator is liable? Tesla as a company isn't driving the car; it's an ML model running on something like HW4 on bare metal in the car itself. Would that make the silicon die legally liable?

      • By jasoncartwright 2026-01-3017:404 reply

        Sounds like it's neither self-driving, nor autonomous, if I'm on the hook if it goes wrong.

        • By scottbez1 2026-01-3018:161 reply

          Yeah, Tesla gets to blame the “driver”, and has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible.

          And the system is designed to set up drivers for failure.

          An HCI challenge with mostly autonomous systems is that operators lose their awareness of the system, and when things go wrong you can easily get worse outcomes than if the system was fully manual with an engaged operator.

          This is a well known challenge in the nuclear energy sector and airline industry (Air France 447) - how do you keep operators fully engaged even though they almost never need to intervene, because otherwise they’re likely to be missing critical context and make wrong decisions. These days you could probably argue the same is true of software engineers reviewing LLM code that’s often - but not always - correct.

          • By redanddead 2026-01-3021:00

            > has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible

            Really? That's crazy.

        • By iwontberude 2026-01-3023:14

          Especially since they can push regressions over the air. You could be lulled into a sense of safety and robustness that isn't there, and bam, you pay the costs of the regressions, not Tesla.

        • By thelastgallon 2026-01-3018:182 reply

          It's neither self-driving, nor autonomous, and eventually not even a car! (as Tesla slowly exits the car business). It will be 'insurance' on Speculation as a service, as Tesla skyrockets to $20T market cap. Tesla will successfully transition from a small-revenue to a pre-revenue company: https://www.youtube.com/watch?v=SYJdKW-UnFQ

          The last few years of Tesla 'growth' show how this transition is unfolding. S and X production is shut down; just a few more models to shut down.

          • By rubyfan 2026-01-3019:23

            I wonder if they will try to sell off the car business once they can hype up something else. It seems odd to just let the car business die.

          • By redanddead 2026-01-3021:03

            Wild prediction, would love to hear the rest of it

      • By throw20251220 2026-01-3017:401 reply

        Who’s the “operator” of an “autonomous” car? If I sit in it and it drives me around, how am I an “operator”?

        • By renewiltord 2026-01-3017:561 reply

          If you get on a horse and let go of the reins you are also considered the operator of the horse. Such are the definitions in our society.

          • By kyleee 2026-01-3022:56

            Great analogy, lol

      • By close04 2026-01-3020:31

        The point is if the liability is always exclusively with the human driver then any system in that car is at best a "driver assist". Claims that "it drives itself" or "it's autonomous" are just varying degrees of lying. I call it a partial lie rather than a partial truth because the result more often than not is that the customer is tricked into thinking the system is more capable than it is, and because that outcome is more dangerous than the opposite.

        Any car has varying degrees of autonomy, even the ones with no assists (it will safely self-drive you all the way to the accident site, as they say). But the car is either driven by the human with the system's help, or is driven by the system with or without the human's help.

        A car can't have 2 drivers. The only real one is the one the law holds responsible.

    • By charcircuit 2026-01-30 23:08

      Not all insurance claims are based off of the choices of the driver.

    • By AnthonyMouse 2026-01-31 0:33 | 1 reply

      > If it autonomous or self-driving then why is the person in the car paying for the insurance? Surely if it's Tesla making the decisions, they need the insurance?

      Suppose ACME Corporation produces millions of self-driving cars and then goes out of business because the CEO was embezzling. They no longer exist. But the cars do. They work fine. Who insures them? The person who wants to keep operating them.

      Which is the same as it is now. It's your car so you pay to insure it.

      I mean, think about it. If you buy an autonomous car, would the manufacturer have to keep paying to insure it for as long as you can keep it on the road? The only real options for making the manufacturer carry the insurance are: the answer is no, and they turn off your car after e.g. 10 years, which is quite objectionable; or the answer is "yes", but then you have to pay a "subscription fee" to the manufacturer which is really the insurance premium, which is also quite objectionable, because you're then locked into the OEM instead of having a competitive insurance market.

      • By childintime 2026-02-01 6:33 | 1 reply

        I like your thesis, but what about this: all this self-driving debate is nonsense if you require Tesla to pay all damages plus additional damages, "because you were hit by a robot!". That should make sure Tesla improves the system, and that it operates above human safety levels. Then one can forget about legislation and Tesla can do its job.

        So to circle back to your thesis: when the car is operating autonomously, the manufacturer is responsible. If it goes broke then what? Then the owner will need to insure the car privately. So Tesla insurance might have to continue to operate (and be profitable).

        The question this raises is if Tesla should sell any self-driving cars at all, or instead it should just drive them itself.

        • By AnthonyMouse 2026-02-01 10:59

          > That should make sure Tesla improves the system, and that it operates above human safety levels.

          There are two problems with this.

          The first is that insurance covers things that weren't really anyone's fault, or that it's not clear whose fault it was. For example, the most direct and preventable cause of many car crashes is poorly designed intersections, but then the city exempts itself from liability and people still expect someone to pay so it falls to insurance. There isn't really much the OEM can do about the poorly designed intersection or the improperly banked curve or snowy roads etc.

          The second is that you would then need to front-load a vehicle-lifetime's worth of car insurance into the purchase price of the car, which significantly raises the cost to the consumer over paying as you go because of the time value of money. It also compounds the cost of insurance, because if the price of the car includes the cost of insurance and then the car gets totaled, the insurance would have to pay out the now-higher cost of the car.
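          A rough back-of-the-envelope sketch of that time-value-of-money point (all numbers here are hypothetical, not from the thread): prepaying a vehicle-lifetime of premiums at purchase costs the full nominal total up front, while the same stream paid year by year is worth noticeably less in today's dollars.

```python
# Hypothetical illustration of the time-value-of-money argument above.
# Made-up numbers: $1,500/year premium, 15-year vehicle life, 5% discount rate.

def present_value(payment: float, rate: float, years: int) -> float:
    """Present value of an annuity-due: one payment at the start of each year."""
    return sum(payment / (1 + rate) ** t for t in range(years))

premium, rate, years = 1500.0, 0.05, 15

prepaid = premium * years                            # bundled into the purchase price
pay_as_you_go = present_value(premium, rate, years)  # same stream, paid over time

print(f"prepaid at purchase: ${prepaid:,.0f}")        # $22,500
print(f"pay-as-you-go (PV):  ${pay_as_you_go:,.0f}")  # about $16,348
```

          The exact figures don't matter; the point is that the same nominal total costs less in present-value terms when paid over time, so bundling lifetime insurance into the sticker price shifts that difference onto the buyer.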

          > The question this raises is if Tesla should sell any self-driving cars at all, or instead it should just drive them itself.

          This is precisely the argument for not doing it that way. Why should we want the destruction of ownership in favor of pushing everyone to a subscription service? What happens to poor people who could have had a used car, but now all the older cars go to the crusher because it allows the OEMs to sustain artificial scarcity for the service?

    • By abtinf 2026-01-30 21:38 | 1 reply

      You insure the property, not the person.

      • By redanddead 2026-01-30 22:45

        Well, it's the risk: the combination of the two.

        That's why young drivers pay more for insurance.

    • By loeg 2026-01-30 21:17 | 1 reply

      It isn't fully autonomous yet. For any future system sold as level 5 (or level 4?), I agree with your contention -- the manufacturer of the level 5 autonomous system is the one who bears primary liability and therefore should insure. "FSD" isn't even level 3.

      (Though, there is still an element of owner/operator maintenance for level 4/5 vehicles -- e.g., if the owner fails to replace tires below 4/32", continues to operate the vehicle, and it causes an injury, that is partially the owner/operator's fault.)

      • By Night_Thastus 2026-01-30 21:41 | 2 replies

        Wouldn't that requirement completely kill any chance of an L5 system being profitable? If company X is making tons of self-driving cars, and now has to pay insurance for every single one, that's a mountain of cash. They'd go broke immediately.

        I realize it would suck to be blamed for something the car did when you weren't driving it, but I'm not sure how else it could be financially feasible.

        • By loeg 2026-01-30 22:00

          No? Insurance costs would be passed through to consumers in the form of up-front purchase price. And probably the cost to insure L5 systems for liability will be very low. If it isn't low, the autonomous system isn't very safe.

        • By AlotOfReading 2026-01-30 21:44

          The way it works in states like California currently is that the permit holder has to post an insurance bond that accidents and judgements are taken out against. It's a fixed overhead.

  • By cebert 2026-01-30 17:33 | 2 replies

    I own a Model Y with hardware version 4. FSD prevented me from getting in an accident with a drunk driver. It reacted much faster to the situation than I could have. Ever since, I'm sold that in a lot of circumstances, machines can drive better than humans.

    • By throw20251220 2026-01-30 17:39 | 1 reply

      So does AEB in any modern car.

      • By fred_is_fred 2026-01-30 17:45 | 5 replies

        Tesla fans have not realized that every car made since 2021ish can do this.

        • By 1970-01-01 2026-01-30 18:18 | 1 reply

          It does more than AEB. It also knows to swerve out of the way during E: https://www.youtube.com/watch?v=c1MWml-81e0

          • By throw20251220 2026-01-30 21:24 | 1 reply

            Generally known as AES, for example from BMW available with Active Driving Assistant or Driving Assistant Plus packages.

            • By tevon 2026-01-30 22:11 | 2 replies

              I have a late-model Audi, and a Tesla Model 3. The Audi has all the bells and whistles.

              It doesn't come close to the safety I feel in the Tesla. Not even close. I know it's anecdotal.

              • By servo_sausage 2026-01-31 3:32

                Design of these safety features for euro cars generally aims to be invisible unless active. You don't "feel" the car in control.

              • By PunchyHamster 2026-01-31 8:18

                then the PR worked

        • By dzhiurgis 2026-01-30 20:05

          AEB has been around for ages. Even my 2010 Mazda had it. It's nowhere near Tesla's capabilities though. Not sure what you're trying to achieve with such dunks?

        • By cucumber3732842 2026-01-31 19:05

          About once a month my car makes me look like a piece of shit, because the AEB gets confused by lane changes. When you maintain speed coming up to slow traffic in order to wait for a good spot to move over, it'll flip out and brake as you slide left, and no amount of gas pedal will override it, so you wind up moving over a lane only to brake-check that lane. Thankfully it doesn't do a full stop, just brakes long enough to realize there's nothing there.

          0/10. Someone is gonna cause a multi-car pile up with this.

          I'm sure it would work great to prevent me from texting my way into the back of stopped traffic though.

        • By throw20251220 2026-01-30 17:49

          Obviously.

        • By mullingitover 2026-01-30 18:15

          My 2016 Honda Civic has automatic braking (and it has lanekeep assist, so it's technologically superior to a 2026 Tesla).

    • By dangus 2026-01-30 17:54 | 4 replies

      [flagged]

      • By direwolf20 2026-01-30 18:09 | 2 replies

        Money is apolitical. Politics is not allowed on HN.

        • By JumpCrisscross 2026-01-30 23:31

          > Politics is not allowed on HN

          Nothing in the guidelines says this. What it does require is "thoughtful and substantive" comments, particularly "as a topic gets more divisive."

        • By kelseyfrog 2026-01-30 20:15

          This is ridiculously wrong and demonstrates a profound lack of insight into both the history of economics[1] and the current political calculus.

          Please don't use rules as a cudgel, or at least have more tact when doing so.

          1. https://en.wikipedia.org/wiki/Political_economy

      • By kolbe 2026-01-30 20:26

        Hacker News likes to keep conversations focused on the topic at hand. I doubt anyone here thinks politics are irrelevant. We just understand basic courtesy. If your goal is indeed to influence change, you do a massive disservice to the cause by acting immature and injecting your politics into other conversations.

      • By renewiltord 2026-01-30 17:58

        Well, as everyone points out: Musk uses Tesla’s stock to fund things and Tesla’s stock is decoupled from fundamentals like revenue so that means that buying his car is decoupled from funding things. Practically a syllogism.

      • By parineum 2026-01-30 18:02 | 1 reply

        > mass human displacement campaign (a.k.a. Genocide)

        genocide /jĕn′ə-sīd″/ noun

            The systematic and widespread extermination or attempted extermination of a national, racial, religious, or ethnic group. The systematic killing of a racial or cultural group.

        • By dangus 2026-01-30 18:03 | 2 replies

          Great, I’m glad your dictionary is happy about deporting 5 year olds.

          “Uhm aktually it’s not a genocide it’s just a fascist police state”

          Multiple humanitarian organizations define mass displacement as genocide and/or ethnic cleansing.

          The Holocaust literally started with mass deportations/detentions. Then the Nazis figured out that it was easier to kill detainees.

          • By mhb 2026-01-30 18:45

            If you have some point to make about deporting 5-year-olds or whatever, don't you think it would be more persuasive without provoking a tangential discussion about your idiosyncratic definition of genocide, regardless of whatever organizations agree with you?

          • By parineum 2026-01-30 21:49 | 1 reply

            > Multiple humanitarian organizations define mass displacement as genocide and/or ethnic cleansing.

            You're mixing two things here to your advantage. Genocide is (or can be) ethnic cleansing, but ethnic cleansing is not necessarily genocide. So your "and/or" does some work for you there and makes you correct. However, you said genocide, not "genocide and/or ethnic cleansing". You've moved the goalposts.

            It'd be odd to redefine any word that ends in '-cide' away from actual killing.

            > The holocaust literally started with mass deportations/detentions.

            Which was ethnic cleansing.

            > Then the nazis figured out that it was easier to kill detainees.

            Which was the point which it became a genocide.

Hacker News