Driver-assist technology doesn't replace the driver. But the question of who's responsible when Autopilot was engaged at the time of a crash is reshaping Tesla cases.
Tesla's Autopilot has been involved in dozens of high-profile crashes. NHTSA opened a formal investigation in 2021 covering over 800,000 vehicles. As of 2026, the agency has identified hundreds of crashes with Autopilot engaged. The standard legal answer ('the driver is responsible') is still mostly true, but the second-layer answer has changed: now the manufacturer can also be on the hook.
Autopilot vs. Full Self-Driving
Two different systems, both made by Tesla:
- Autopilot. Standard on Tesla vehicles. Lane-keeping, adaptive cruise control, automatic emergency braking. Designed for highway use.
- Full Self-Driving (FSD) Beta. Optional, paid feature. Attempts to handle city streets, traffic lights, turns. Still classified as 'driver assistance,' not autonomous.
Both systems require the driver to remain attentive and ready to take control at any moment. That's the legal foundation Tesla relies on. Driver-as-supervisor, system-as-tool.
Who's liable?
Default answer: the driver. Operating a vehicle on a public road in Missouri or Illinois requires you to maintain control. Engaging Autopilot does not delegate that legal duty.
More complete answer: the driver AND possibly Tesla. Product-liability law allows pursuit of the manufacturer when:
- The product was defective in design, manufacture, or warning.
- The defect caused the injury.
- The product was being used in a foreseeable manner.
Plaintiffs in recent Tesla cases have argued, sometimes successfully, that the marketing of Autopilot creates false confidence in its capabilities (failure to warn), that the system's known limitations near emergency vehicles or stationary objects amount to a design defect, and that the driver-monitoring camera is insufficient to prevent foreseeable misuse.
What Tesla's own data shows
Tesla vehicles record extensive event data: speed, steering input, brake application, Autopilot engagement status, camera and radar input at the time of the crash. This data is discoverable in litigation, but Tesla controls access. Preserving the vehicle and serving the right discovery requests early is critical.
The data has been used to support both sides. Sometimes it confirms the driver was inattentive. Sometimes it shows the system disengaged a fraction of a second before impact, which Tesla has argued means 'the driver was in control,' and which plaintiffs have argued means 'the system failed and dumped responsibility onto an unprepared human.'
Practical issues in pursuing Tesla
- Vehicle preservation. Once the car is repaired or scrapped, the data is gone. Get a litigation hold in place immediately.
- Discovery costs. Expect to need software experts, mechanical engineers, and accident reconstructionists.
- Tesla's legal resources. The company fights these cases aggressively and often demands arbitration under the purchase agreement.
- Combined claims. Pursuing both the driver's auto policy AND product liability against Tesla at the same time is the typical posture.
When is it worth pursuing both?
Generally: serious injury, clear Autopilot engagement, and Tesla data that supports a design or warning theory. Smaller claims usually resolve through the driver's auto carrier alone, because the cost of product-liability litigation isn't justified. The threshold has come down as Tesla cases have proliferated, but it's still meaningful.
Tesla cases are not standard auto accidents. The vehicle is evidence. The data is evidence. Preserving both, and knowing what discovery to demand, has to start within days of the crash, not after the dust settles.
Product-liability claims involve complex legal and factual issues. This is general information about an emerging area of law. Outcomes depend on specific facts, applicable law, and evidence available in each case. Past results do not guarantee future outcomes.
