Tesla Autopilot / FSD
Autonomous · by Tesla · www.tesla.com/autopilot
Grade: F · Verdict: Run · High Risk · Published March 15, 2026

"A $15,000 promise of self-driving that's killed people and still calls itself 'Full Self-Driving.'"

Overview

Tesla Autopilot is a driver-assistance system. Tesla Full Self-Driving (FSD) is a more advanced driver-assistance system. Neither one drives itself. The names are lies, and people have died because of those lies.

Elon Musk first promised fully autonomous Tesla vehicles by 2017. Then 2018. Then 2020. Then “next year” every year after that. Customers paid up to $15,000 for FSD capability, and what they received was a beta feature that requires constant driver supervision, can’t handle construction zones reliably, and has been linked to hundreds of crashes and dozens of fatalities.

NHTSA has investigated Tesla’s driver-assistance systems in connection with over 900 crashes, including incidents where Teslas drove into parked emergency vehicles with flashing lights. In December 2023, Tesla recalled over 2 million vehicles to update Autopilot’s driver monitoring system, the largest recall in the company’s history. In early 2024, another recall of 2.2 million vehicles addressed instrument-panel warning text displayed in a font too small to meet federal safety standards.

The fundamental issue isn’t that the technology is imperfect. All technology is imperfect. The issue is that Tesla markets an imperfect system with a name that implies perfection, to customers who lack the training to understand the difference.

What It Knows About You

Every Tesla is a rolling surveillance platform. The car has eight cameras recording continuously. Tesla collects video footage, GPS data, speed data, braking data, battery data, and cabin camera footage that monitors whether you’re paying attention to the road.
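
To make the scope concrete, here is a minimal sketch of what a single telemetry record covering those categories might look like. The field names and structure are our assumptions for illustration, not Tesla’s actual schema.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical telemetry record; field names are illustrative,
    # not Tesla's actual schema. It covers the data categories the
    # privacy policy and reporting describe: location, driving
    # behavior, battery state, and camera footage.
    @dataclass
    class TelemetryRecord:
        timestamp: datetime
        gps_lat: float                # where you are
        gps_lon: float
        speed_mph: float              # how fast you drive
        brake_pressure: float         # how aggressively you brake
        battery_pct: float            # battery state
        exterior_clip_ids: list[str]  # footage from the eight cameras
        cabin_clip_id: str            # cabin camera: who's in the car
        attention_score: float        # driver-monitoring estimate

    # One record per second adds up fast: a 30-minute commute alone
    # is ~1,800 of these, before counting any video.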

Tesla’s privacy policy allows the company to share data with “service providers and business partners.” In 2023, Reuters reported that Tesla employees privately shared sensitive images captured by car cameras — including images from inside customers’ garages and private property. The images were shared in internal group chats for entertainment.

If you own a Tesla, the company knows everywhere you drive, how fast you drive, how aggressively you brake, where you park at night, and — via cabin cameras — who’s in the car with you and what they’re doing. That’s not a car. That’s a surveillance vehicle you make payments on.

The Real Risks

Safety is the existential risk here. People have died. The NHTSA database contains hundreds of crashes involving Autopilot, many of which occurred because drivers trusted the system’s name and marketing more than the system’s actual capability. A system called “Full Self-Driving” that requires full driver attention at all times is not just misleading — it’s dangerous by design.

The autonomy score is 9/10 for a reason. When you engage FSD, the car makes steering, acceleration, and braking decisions. If it makes a wrong decision, you have seconds or less to intervene. You’re not driving — you’re supervising a system that might kill you if you supervise poorly. And because the system works well 99% of the time, your attention naturally drifts. This is the automation paradox: the better the system works on average, the worse humans perform in the rare moments it fails.
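
To put “seconds or less” into distance, here is a back-of-the-envelope sketch. The speed, reaction times, and failure rate are illustrative assumptions, not measured Tesla figures.

    # How far the car travels while a supervising driver notices a
    # failure and reacts. All numbers are illustrative assumptions.
    MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

    def blind_distance_m(speed_mph: float, reaction_s: float) -> float:
        """Meters traveled before the driver provides any input."""
        return speed_mph * MPH_TO_MPS * reaction_s

    for reaction_s in (1.0, 2.5):  # alert driver vs. drifted attention
        d = blind_distance_m(70.0, reaction_s)
        print(f"70 mph, {reaction_s:.1f} s reaction: {d:.0f} m with no input")
    # Prints ~31 m alert, ~78 m drifted: most of a football field
    # passes before braking even begins.

    # The automation paradox in numbers: even a rare failure rate
    # (assumed here purely for illustration) adds up over a year.
    assumed_failures_per_mile = 1 / 10_000
    miles_per_year = 13_500  # rough US average annual mileage
    print(f"Expected takeovers per year: "
          f"{assumed_failures_per_mile * miles_per_year:.1f}")  # ~1.4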

Lock-in is brutal. FSD is tied to the vehicle, not the owner. If you sell your Tesla, you lose the $15,000 feature. If Tesla decides to change the terms (which they have), you have no recourse. The car’s software is controlled entirely by Tesla via over-the-air updates. They can add features, remove features, or change how existing features work without your consent. In 2019, Tesla remotely reduced the battery range of some older vehicles via a software update, and later paid settlements over it.
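
The mechanics of that control are worth spelling out. Below is a minimal sketch of how a server-side entitlement check makes remote feature removal trivial; the URL, fields, and flow are hypothetical illustrations of the general OTA pattern, not Tesla’s actual implementation.

    import json
    import urllib.request

    # Illustrative OTA entitlement check: on each boot, the vehicle
    # asks the vendor's server which paid features it may enable.
    # Everything here is hypothetical; the point is that the set of
    # features you "own" is a server-side decision, revocable at will.
    ENTITLEMENT_URL = "https://vendor.example/api/v1/entitlements"

    def enabled_features(vin: str) -> set[str]:
        req = urllib.request.Request(f"{ENTITLEMENT_URL}?vin={vin}")
        with urllib.request.urlopen(req) as resp:
            return set(json.load(resp)["features"])

    # Nothing in the car itself records that you paid $15,000: if the
    # server stops listing "fsd" for this VIN (say, after a resale),
    # the feature simply disappears on the next check.
    if "fsd" not in enabled_features("5YJ3E1EA7KF000000"):  # example VIN
        print("FSD disabled: entitlement not present for this VIN")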

Bias matters in autonomous vehicles. Studies, including a 2019 Georgia Tech analysis of pedestrian-detection models, have shown that computer vision systems perform worse on darker skin tones, meaning pedestrian detection may be less reliable for people of color. Tesla uses a vision-only approach (no lidar, and no radar in newer models), so there is no second sensor to cross-check the cameras, leaving it more exposed to exactly these failure modes.

Alternatives

  • Don’t use FSD. Use basic Autopilot (adaptive cruise control + lane keeping) on highways with full attention. It’s far less risky.
  • Comma.ai / openpilot: Open-source driver assistance. You can audit the code. It’s honest about its limitations.
  • Public transit: Zero autonomy risk. Zero privacy risk. Gets you there without anyone dying because an algorithm confused a white truck for the sky.
  • Other automakers’ ADAS: GM’s Super Cruise, Mercedes-Benz’s Drive Pilot, and BMW’s Driving Assistant Professional are arguably more conservative and more honest about their capabilities. Mercedes actually accepts legal liability when Drive Pilot, a certified Level 3 system, is engaged; Tesla does not.

Our Verdict

Tesla Autopilot / FSD gets an F. This is the only product in our index where the risks include death. Not theoretical death, not potential death — documented, investigated, confirmed deaths. Tesla charges up to $15,000 for a feature it calls “Full Self-Driving” that does not fully self-drive, monitors its customers through cabin cameras, has been caught sharing private footage internally, locks the feature to the vehicle rather than the buyer, and can alter the product you paid for via remote update at any time. The technology is impressive. The marketing is criminal. The combination has killed people.