Could Your Phone’s Location Be Sold Without Your Consent? What the FTC’s Move Against Kochava Means for You

May 12, 2026
by
Pulkit Gupta

Most people think “location tracking” means a dot on a map inside an app. The FTC is arguing it can be much more than that: a market where precise location data gets bought and sold. The agency wants that to happen only when you have given affirmative express consent, and only when the data is needed to deliver a service you actually asked for. The FTC’s move against data broker Kochava (and its subsidiary CDS) is a loud warning shot. If the FTC gets what it wants, it changes what data brokers can sell, how they prove consent, and what rights you should expect as a consumer.

What the FTC says Kochava was selling (and why it’s not “just ads data”)

The FTC’s case against Kochava isn’t framed as “some harmless ads analytics.” It’s about precise geolocation data—the kind of data that can follow a person’s real movement in the real world.

Here’s the plain-English difference:

  • General location is “this device is somewhere in Chicago.”
  • Precise location data is “this device was at these GPS coordinates at 8:12am, then these at 8:19am, then these at 8:41am.”

Kochava’s feed was described as delivering raw latitude/longitude data at massive scale. The company marketed it as “rich geo data spanning billions of devices worldwide,” with volumes it claimed averaged 94B+ geo transactions per month and “more than 90 daily transactions per device.” Raw coordinates plus timestamps aren’t “just ads data,” because repetition creates a trail. And trails create identity.

Why repeated “pings” can point to a real person

You don’t need someone’s name when you have their routine.

A location trail can show:

  • where a device sleeps most nights (home)
  • where it spends weekdays (work)
  • where it stops for childcare, groceries, gym, or a weekly appointment

Once home/work patterns appear, “anonymous device data” starts looking a lot like a specific human being. That’s the core privacy problem regulators are calling out: you can re-identify people with patterns, even if the dataset doesn’t come with a name attached.
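To see why repetition matters, here’s a minimal sketch in Python. The device ID, coordinates, and timestamps below are invented, and real re-identification work uses far more data and better clustering, but the idea is the same: keep only the overnight pings and see which spot keeps showing up.

  from collections import Counter
  from datetime import datetime

  # Hypothetical pings for one device: (device_id, lat, lon, timestamp).
  # Every value here is made up for illustration.
  pings = [
      ("abc-123", 41.8921, -87.6341, "2026-03-02T23:41:00"),
      ("abc-123", 41.8922, -87.6342, "2026-03-03T02:10:00"),
      ("abc-123", 41.8782, -87.6298, "2026-03-03T09:05:00"),
      ("abc-123", 41.8921, -87.6340, "2026-03-03T23:55:00"),
      ("abc-123", 41.8783, -87.6297, "2026-03-04T10:15:00"),
  ]

  def likely_home(rows):
      """Guess 'home' as the grid cell a device pings from most during overnight hours."""
      cells = Counter()
      for _, lat, lon, ts in rows:
          hour = datetime.fromisoformat(ts).hour
          if hour >= 22 or hour < 6:                       # overnight pings only
              cells[(round(lat, 3), round(lon, 3))] += 1   # ~100 m grid cell
      return cells.most_common(1)[0][0] if cells else None

  print(likely_home(pings))  # -> (41.892, -87.634): the same cell, night after night

Do the same thing for weekday daytime hours and you get a probable workplace. No name required.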

What makes location data “sensitive”

The FTC’s allegations focus hard on sensitive locations—places where a visit can expose something deeply personal, or put someone in danger.

In its 2022 suit, the FTC alleged Kochava’s data could let clients trace people’s movements to and from:

  • mental health and addiction recovery facilities
  • reproductive health clinics
  • places of worship
  • shelters (including homeless shelters and domestic violence survivor shelters)

That list matters because it’s not about “I like coffee shops.” It can reveal health decisions, faith, crisis situations, or vulnerability. The FTC said consumers were unaware of and had not consented to this kind of sharing, and that it could lead to real-world harms like stalking, discrimination, and physical violence.

When you hear “data broker” and “location data for advertising,” it’s easy to shrug. The FTC is arguing you shouldn’t—because precise geolocation data can function like a map of someone’s life.

How this kind of location data gets distributed (the part most people never see)

Once precise location data exists, the scarier part is how easily it can move. Most people never “meet” a data broker. They never download a “Kochava app.” The pipeline sits behind the apps you do use.

The alleged model: paid access to a location data feed

The FTC’s complaint describes Kochava offering clients paid access to a user-friendly data feed via the Amazon Web Services (AWS) Marketplace, marketed as “rich geo data spanning billions of devices worldwide.”

It wasn’t framed like a one-off report. It was framed like a product you subscribe to. The same reporting on the complaint says clients paid a $25,000 subscription fee for access.

That detail matters because it changes the mental model from “a company shares data with one partner” to “location data is packaged, cataloged, and distributed like a standard commercial feed.”

A mini-scenario most people can relate to

You download a random flashlight app, a coupon app, or a free game. At some point it asks:

  • “Allow location access?”
  • Options: Allow Once / While Using / Always / Don’t Allow

You tap “Always” because you want it to stop nagging you, or because you assume it helps the app “work better.”

From there, data can travel through an ecosystem you never interact with directly:

  1. App collects location (sometimes through its own code, sometimes through an SDK).
  2. The app shares data with partners (analytics, attribution, ad tech).
  3. A data broker aggregates it with other sources and normalizes it into a feed.
  4. Buyers get access—often not as a “story about you,” but as rows in a database.

The key point: you might think you made a choice about one app. What you really did was open a gate into a supply chain.

And when the product is a subscription feed sold through a mainstream cloud marketplace, it starts to look less like “advertising” and more like location data distribution at scale.
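For illustration only, here is a rough Python sketch of the kind of thing an embedded analytics or attribution SDK can do once an app has location permission. The endpoint, field names, and app name are all invented; no specific vendor’s API is being described.

  import json
  from urllib import request

  # Invented endpoint (.invalid, so this sketch never sends real traffic).
  PARTNER_ENDPOINT = "https://collect.example-analytics.invalid/v1/locations"

  def report_location(ad_id: str, lat: float, lon: float, ts: str) -> None:
      """Forward one location ping, keyed to the device's advertising identifier."""
      payload = {
          "advertising_id": ad_id,     # a stable-ish ID that lets rows be joined across apps
          "lat": lat,
          "lon": lon,
          "timestamp": ts,
          "app": "coupon-app/1.4.2",   # which app the SDK is embedded in
      }
      req = request.Request(
          PARTNER_ENDPOINT,
          data=json.dumps(payload).encode(),
          headers={"Content-Type": "application/json"},
      )
      request.urlopen(req)  # real SDKs batch and retry this in the background

You granted permission to the app; the network call is made by code the app embedded. A downstream partner can merge pings like this from many apps into one feed keyed by the advertising identifier, which is how “rows in a database” end up describing one person’s movements.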

The FTC’s line in the sand: ‘affirmative express consent’ + ‘only for a service you requested’

The FTC isn’t just saying “be clearer.” It’s trying to set a hard rule about when precise location data can be sold, shared, licensed, transferred, or disclosed.

Under the proposed order filed in federal court, Kochava and its subsidiary would be prohibited from selling, licensing, transferring, or disclosing precise location data unless two conditions are met:

  • the company has affirmative express consent, and
  • the data is used to provide a service the consumer directly requested

What “affirmative express consent” means in real life

This isn’t “we mentioned it somewhere.”

Think of it like this:

  • Not buried inside a privacy policy you’ll never read.
  • Not implied because you installed an app.
  • Not a vague “partners may share data” checkbox.

It’s clear opt-in permission, specifically tied to selling or sharing precise location data. If you didn’t actively say “yes,” the FTC’s position is you didn’t consent.

“Only for a service you requested” closes the biggest loophole

This second condition is what makes this move a big deal. Even if a company can argue you said “okay” to location collection, the FTC’s position is that the data shouldn’t be sold unless it’s necessary to deliver something you actually asked for.

A reality check most of us recognize:

  • You open your phone and think: “I just wanted restaurant directions.”
  • The market hears: “This person’s movements can be packaged and resold.”

That gap—between what you thought you were agreeing to and what happened to the data afterward—is exactly what regulators are targeting here. The proposed order draws a bright boundary: permission has to be explicit, and use has to match the requested service.
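To make the two-part test concrete, here is a minimal Python sketch of a gate a broker’s pipeline could run before releasing a record. The data model and field names are assumptions for illustration, not language from the complaint or the proposed order.

  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class LocationRecord:
      device_id: str
      lat: float
      lon: float
      has_affirmative_express_consent: bool  # explicit opt-in to sale/sharing, not a buried policy line
      requested_service: Optional[str]       # the service the consumer actually asked for, if any
      purpose_of_disclosure: str             # what this particular sale/transfer would be used for

  def may_disclose(record: LocationRecord) -> bool:
      """Both conditions must hold: affirmative express consent AND use tied to the requested service."""
      if not record.has_affirmative_express_consent:
          return False
      if record.requested_service is None:
          return False
      return record.purpose_of_disclosure == record.requested_service

  # Consent was given, but this disclosure is for audience resale, not the requested service.
  record = LocationRecord("abc-123", 41.89, -87.63, True, "restaurant directions", "audience resale")
  print(may_disclose(record))  # False

The point of the second check is exactly the restaurant-directions scenario above: a “yes” to one thing is not a “yes” to everything.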

What the proposed order would force data brokers to build (and what rights you gain)

If the FTC’s standard here sticks, the biggest shift isn’t a headline ban. It’s the plumbing behind the scenes.

The proposed order doesn’t just tell a data broker “stop.” It pushes them to build internal systems that make it harder to play dumb when data gets misused.

What brokers would be forced to operationalize

The reporting on the proposed order spells out several concrete requirements, including:

  • A sensitive location data program
    Plain meaning: a formal process for handling location data tied to sensitive places, with rules that aren’t optional.
  • A supplier assessment program to verify consumer consent
    Translation: “We got it from a partner” stops being a get-out-of-jail-free card. Brokers would need controls to check whether upstream suppliers actually obtained valid permission.
  • Incident reports to the FTC when third parties misuse location data
    That’s accountability. If a downstream buyer does something shady, the broker can’t quietly move on.
  • A data retention and deletion schedule
    This is the “you don’t get to keep it forever” rule. It forces a decision about how long location data sits around, and when it must be purged.
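Here is a minimal Python sketch of what that last item could look like in practice. The 30-day window and the in-memory store are assumptions made up for illustration; the proposed order requires a schedule, not a specific number.

  from datetime import datetime, timedelta, timezone
  from typing import Optional

  RETENTION_WINDOW = timedelta(days=30)  # made-up window for the sake of the example

  def purge_expired(records: list, now: Optional[datetime] = None) -> list:
      """Keep only location records newer than the retention cutoff; everything older is deleted."""
      now = now or datetime.now(timezone.utc)
      cutoff = now - RETENTION_WINDOW
      return [r for r in records if datetime.fromisoformat(r["timestamp"]) >= cutoff]

  store = [
      {"device_id": "abc-123", "lat": 41.89, "lon": -87.63, "timestamp": "2026-01-05T08:12:00+00:00"},
      {"device_id": "abc-123", "lat": 41.88, "lon": -87.63, "timestamp": "2026-05-10T08:19:00+00:00"},
  ]
  store = purge_expired(store)  # only pings inside the window survive the scheduled purge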

The consumer-facing rights you’d finally feel

The order also describes rights that are rare in the location-data market today, but easy to understand:

  • You can request disclosure of who received your data
    In practice, this could look like: you submit a request and get a list of recipients (or categories of recipients) who got your precise location data.
  • You can withdraw consent
    The point isn’t “I’ll never share anything again.” It’s having a real off-switch that doesn’t require deleting your entire digital life.

If this becomes a broader expectation, it changes the power balance. Location data stops being something that quietly travels forever, and starts being something that has to answer basic questions: Who has it? Why? For how long? And can I stop it?
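To make those two rights concrete, here is a minimal Python sketch of the bookkeeping a broker would need: a per-device log of who received data, and a consent flag that any disclosure check has to respect. The structures and recipient names are invented for illustration.

  from typing import Dict, List

  # Invented bookkeeping: which recipients got each device's data, and whether consent still stands.
  disclosure_log: Dict[str, List[str]] = {
      "abc-123": ["ad-network-A", "retail-analytics-B"],
  }
  consent: Dict[str, bool] = {"abc-123": True}

  def who_received_my_data(device_id: str) -> List[str]:
      """Right one: disclose the recipients (or categories of recipients) of this device's data."""
      return disclosure_log.get(device_id, [])

  def withdraw_consent(device_id: str) -> None:
      """Right two: a real off-switch; future disclosure checks have to honor this flag."""
      consent[device_id] = False

  print(who_received_my_data("abc-123"))  # ['ad-network-A', 'retail-analytics-B']
  withdraw_consent("abc-123")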

What changes for you, advertisers, and the app ecosystem (plus what you can do today)

If you zoom out, the Kochava move fits a pattern: the FTC has been signaling a crackdown on commercial surveillance—business models built on collecting, analyzing, and monetizing people’s data at scale. It also warned it would enforce the law against illegal sharing or use of location, health, and other sensitive information.

And it’s not just talk. The same reporting notes the FTC has already banned other data brokers—including InMarket Media, Outlogic (formerly X-Mode Social), Gravy Analytics, and Mobilewalla—from harvesting and selling Americans’ location tracking data.

What likely changes in the ecosystem

This is where things get real for advertisers, app developers, and anyone who buys “audiences.”

Expect higher friction for location-based targeting:

  • Less “easy mode” access to precise location datasets.
  • More proof required around who consented, and what they consented to.

Expect more pressure on app companies and SDK vendors:

  • If a broker has to verify consent, it pushes accountability upstream. Apps and partners get asked tougher questions.

Expect more consumer pressure:

  • Once people learn location data can be sold like a commodity, “why do you need this?” becomes a normal question.

What you can do today (quick, tactical)

You don’t need to wait for regulation to catch up.

1) Fix your location permissions hygiene

  • Change apps from “Always” to “While Using” unless it’s a true navigation/safety need.
  • Remove location access from apps that don’t pass the sniff test (games, coupon apps, random utilities).

2) Audit who has location access (monthly)

  • On iPhone and Android, review app permissions and cut anything you don’t recognize or use.

3) Reduce ad tracking where you can

  • Limit the device’s ad identifier (exact labels vary by OS version).
  • Treat “personalized ads” toggles as real privacy settings, not decoration.

4) Reduce linkability when you sign up for apps

Even if you clamp down on GPS, a lot of tracking is identity-based: phone number, email, and logins.

For sign-ups where you don’t want your real number tied to an account that might collect behavioral data, a tool like Cloaked can help by giving you an alternative phone number (and identity details) for registrations. It doesn’t block GPS, but it can reduce how easily your accounts get connected back to you across apps and services.

Small changes like these don’t make you invisible. They just stop you from handing over extra data you never meant to give.
