Just Like TARS from Interstellar, Google’s Ad Preferences Page is not 100% Honest with You

Jack Bandy
3 min read · Oct 11, 2019
TARS, from https://www.artstation.com/artwork/1na8dG

Questions around big tech and data dignity continue to make the news. This week, Atlanta officials asked Google whether it had intentionally approached black homeless people to collect facial scans for Android, platform moderation got entangled with the Hong Kong protests, and at least a couple of pieces wondered how tracking-based advertising platforms are involved in the election.

Big tech companies often respond to these concerns with claims that they will “increase transparency.” As Google said in one blog post (bold is mine):

“we need to ensure that people all around the world can continue to access ad supported content on the web while also feeling confident that their privacy is protected…we believe the path to making this happen is also clear: increase transparency into how digital advertising works, offer users additional controls, and ensure that people’s choices about the use of their data are respected”

In another blog post, Google says “transparency, choice and control form the foundation of Google’s commitment to users.” (Facebook is also a big fan of transparency).

The transparency rhetoric sounds nice, but I wonder if Christopher Nolan may have something to teach us about its limits. TARS, from Interstellar, is one of my favorite characters of all time. Here's a dialogue excerpt (full script is here) that helps show why:

COOPER: “What’s your honesty parameter?”

TARS: “Ninety percent. Absolute honesty isn’t always the most diplomatic nor the safest form of communication with emotional beings.”

One of the ways Google tries to be honest with you is its Ad Settings page, which shows you what Google knows about you. Here’s what mine looks like at the moment:

Check it out here to see your own. Most users will see an age, an income, a relationship status, and some other topics that Google thinks you are interested in. It seems like a helpful tool for increasing transparency.

But just like with TARS, we should ask: what’s the honesty parameter?

A few years ago, researchers at Carnegie Mellon and UC Berkeley published a paper showing that Google’s honesty parameter is less than 100%. Specifically, they found that Google showed different ads to users who had searched for or browsed pages related to substance abuse, yet this inferred interest never appeared on the settings page alongside attributes like “Male” and “18–24 years old.”

In other words, Google does not show you everything it knows about you on the Ad Settings page. The study demonstrated this for substance abuse, but it may also be true for other sensitive characteristics such as religious affiliation, disability, citizenship status, etc.

When I talk to people about tracking-based advertising, they often respond with something like, “well that’s creepy, but I kind of like personalized advertisements.” I plan to write future blog posts addressing this attitude head-on, but in the meantime, I hope the substance abuse example gives us pause: what are the implications of big tech companies knowing our sensitive characteristics? And what does it mean that they know these characteristics, but don’t tell us that they know them?


Jack Bandy

PhD student studying AI, ethics, and media. Trying to share things I learn in plain English. 🐦 @jackbandy