“There are no secrets anymore!” I hear that sentiment often as a neuroscientist and data scientist. Entrepreneurs hungrily anticipate that brain-machine interfaces will read thoughts directly from your brain. Medical researchers wrestle with data that are increasingly difficult to scrub clean of information that can identify you. Citizens voluntarily give away information about themselves that previous generations fought to keep private, creating ethical tensions for innovative scientists who use social media for their research.
Conversations in all of these realms of my work usually lead at least one person to exclaim something about the death of secrets.
I understand why people suspect that secrets are extinct. It is true that with enough data, the right analytic strategy, and moderate Internet scraping skills, most aspects of ourselves can be unveiled in a data-driven society. However, I do not agree that technology has abolished secrets. To the contrary, technology has generated new kinds of secrets. The information these new secrets protect is what companies do with your data.
As a consumer, you often are not aware that your private information, including financial history and health records, is regularly sold or traded in a “de-identified form” (without names, Social Security numbers, or addresses) to data brokers. Data brokers get your data from entities you might expect, like search websites, but they also get data from entities you trust to protect your private data, like hospitals. Data brokers in the United States are not required to disclose where they obtained their data or to whom they sold it. In other words, it is legal for data brokers to keep secret what has happened to your data.
What do data brokers do with your data? They integrate all the information they can assemble to create models of what kind of consumer you are. They then resell or trade their aggregated and analyzed data to other parties. Some of this data-trading may seem harmless, and even beneficial. In particular, one of the main reasons companies buy aggregated data is to target their advertising more precisely, which is a benefit for fans of personalized advertising.
Other applications, however, are more disconcerting. For example, companies combine data sets not only to recover your personal information, but also to infer aspects of your health and preferences that you never agreed to share, like your IQ, your sexual orientation, or even whether you have HIV. At present, American companies are allowed to use these predictions to set your insurance premiums, influence your chances of getting accepted to a college, or serve a growing collection of other purposes difficult to track and regulate.
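The mechanism behind recovering personal information from “de-identified” records is worth seeing concretely. The sketch below uses entirely invented records and names to illustrate a classic linkage attack: a dataset stripped of names can often be re-identified by joining it with a public dataset, such as a voter roll, on shared quasi-identifiers like ZIP code, birth date, and sex.

```python
# Illustrative sketch with hypothetical, invented data: re-identifying a
# "de-identified" dataset by linking it to a public one on quasi-identifiers.

# "De-identified" medical records: names and SSNs removed, but
# ZIP code, birth date, and sex retained.
medical_records = [
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "60611", "birth_date": "1982-03-14", "sex": "M", "diagnosis": "asthma"},
]

# A public voter roll listing names alongside the same quasi-identifiers.
voter_roll = [
    {"name": "Jane Doe", "zip": "02138", "birth_date": "1945-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "60611", "birth_date": "1982-03-14", "sex": "M"},
]

def reidentify(records, public_list):
    """Join the two datasets on the (zip, birth_date, sex) quasi-identifier."""
    index = {(p["zip"], p["birth_date"], p["sex"]): p["name"] for p in public_list}
    matches = []
    for r in records:
        key = (r["zip"], r["birth_date"], r["sex"])
        if key in index:
            matches.append({"name": index[key], "diagnosis": r["diagnosis"]})
    return matches

print(reidentify(medical_records, voter_roll))
```

With only these three quasi-identifiers, each “anonymous” diagnosis maps back to a named person; real data brokers do the same thing at scale, with far more columns to join on.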
Data brokers aren’t the only ones who disguise what happens to your data. Many well-known companies do not impose the same privacy rules on apps that use their platform as they apply to themselves. As a result, when you take a Facebook poll, for example, you may unintentionally agree to share your otherwise private Facebook information—and sometimes the Facebook information of your friends—with the app owner. The app owner can then sell these data in various forms to third parties, including governments, even if Facebook itself says that it does not sell such information.
Until some recent investigative reporting, most voters had no idea that this data-transferring mechanism was at work during the U.S. presidential election to target voters using personality profiles gleaned from a Facebook app. A similar data pipeline is shaping votes in Britain: a Tinder app was built to advocate for Labour Party votes by taking advantage of answers provided through personal message exchanges with bots posing as real Tinder matches.
Many people who use platforms like Facebook or Tinder do not understand the extent to which their social posts and dating preferences are being harvested, and often influenced, for the benefit of specific political campaigns. Nobody knows how many people would share their personality traits and social preferences voluntarily with political campaigns if they understood the consequences of choosing to do so.
Regulations around data-sharing are evolving constantly, but no matter what happens in the future, it is very likely that some of your data have already been used in a way that you did not knowingly approve of. Contrary to popular belief, there are secrets in today’s data-driven world. It’s just that the secret-keepers are no longer individual citizens. Today’s secret-keepers are companies and governments. And they are keeping secrets from you.
Borg is assistant research professor in the Social Science Research Institute and affiliate faculty in the Center for Cognitive Neuroscience.