Every time you drive, your car generates data. That data is collected, sold, and fed into AI systems that make inferences about you — your financial stress, your health, your habits, your psychology. You didn't consent to the inference. You consented to the terms and conditions. Those are not the same thing. This guide (six modules, a quiz, and an optional action plan) addresses the gap between what you agreed to and what that agreement actually means.
The cars, apps, and services covered here collect data most people assume stays private. Some of it is sold within seconds of leaving your vehicle. Some of it ends up with companies — and AI systems — you will never interact with directly. The terms and conditions said it was possible. What follows explains what that actually looks like.
Car manufacturers have promoted their vehicles as "computers on wheels" — emphasizing navigation, emergency assistance, and remote diagnostics. Those are real benefits. There are also costs. Ones that weren't in the brochure.
Over 80% of used vehicles resold today still contain the personal data of previous owners — contacts, navigation history, garage door codes — accessible to the next owner. Simply unpairing your phone is not sufficient. (Privacy4Cars)
56% of car brands will share your data with law enforcement on an informal request — not a warrant, not a court order. Just a request. (Mozilla Foundation, 2023)
Note on the Mozilla data: Mozilla's 2023 car privacy research remains the most comprehensive automotive privacy audit published. The Mozilla Foundation reorganized in late 2024, reducing its advocacy capacity, but the underlying 2023 findings — confirmed by FTC enforcement, state AG actions, and KPMG's 2024 industry survey — remain accurate.
In-vehicle AI systems increasingly attempt to infer emotional states from observable signals — facial movements, vocal tone, driving behavior. Those inferences can be wrong, biased, and culturally variable. And the data they generate doesn't stay in the car.
The critical distinction: Emotion AI systems interpret surface-level cues and label them as evidence of internal states. As researchers at IFOW note, a furrowed brow does not necessarily signal sadness. A raised voice does not necessarily indicate anger. Context, culture, and individual variation all affect expression. Misclassification has real consequences when data is used to profile or price people. (MIT Sloan; IFOW; ABA Business Law Today)
Nissan's privacy policy explicitly states it can infer and record "preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes" — and sell those inferences to data brokers and law enforcement. This is in the terms you agreed to. (Mozilla Foundation, 2023)
Think of a self-driving car as a global learning machine on wheels. No simulator can replicate every real condition it will encounter. The only way to teach the system is to put cameras in real cars on real roads and collect real data. That is genuinely necessary. It is also where the privacy question begins.
The distinction that matters: two fundamentally different kinds of data exist in any AV or semi-autonomous vehicle. Environmental data — roads, obstacles, pedestrians, weather — is what the system needs to drive. Personal data — who you are, where you go, what you look like — is not. The question is whether companies separate these, or bundle them together.
Waymo is the only company generating substantial US robotaxi revenue — over 500,000 paid rides per week across 10 cities: Los Angeles, San Francisco, Phoenix, Austin, Atlanta, Dallas, Houston, San Antonio, Orlando, and Miami. No human driver is present. Passengers ride alone with cameras running.
Tesla is both a consumer vehicle company and an emerging robotaxi operator. Its privately owned cars run Level 2 driver assistance — a human must always supervise. In June 2025, Tesla launched a commercial Robotaxi service in Austin using unmodified Model Y vehicles. As of early 2026, roughly 135 Tesla robotaxis are in service — compared to Waymo's 3,000+. Whether you own a Tesla or ride in one, the privacy question is the same.
Your data doesn't stay with your manufacturer. The Markup identified 37 companies in the connected vehicle data ecosystem — brokers, hubs, insurers, and analytics firms operating entirely outside your awareness.
"Your driving data goes to a half a dozen companies you've never even heard of for reasons you'd perhaps never agree to if asked directly." — EFF
In 2024, data broker Verisk stopped accepting driving data from carmakers and shut down its driving-behavior product for insurers, a sign that enforcement pressure works. LexisNexis and other brokers remain active. (Privacy4Cars; The Record)
KPMG "Driving Trust" survey of 50 global automotive privacy leaders, 2024.
Ten questions based on documented facts from the lesson. An honest measure of what you now know.
Now that you're informed, here are steps you can take using the rights you already have. The tools below are free — and some may surprise you with how much your vehicle already knows. You may not be able to opt out of everything, but you have more choices than the terms and conditions suggested.
Every fact in this guide is drawn from primary sources, investigative journalism, regulatory filings, and peer-reviewed research. The terms and conditions were always there — this is what they actually say.