Alice Liogier wants to slap a price on her data.
The 23-year-old graduate student from Paris is researching the commercial use of personal information in the age of big data, and she’s reached a controversial conclusion: If people really do own their data, then they should be allowed to sell it.
Regulators from Brussels to Beijing are trying to curb the use of personal information, and many Facebook users have been reviewing their privacy settings in recent weeks in response to the Cambridge Analytica scandal. But Liogier argues that entrepreneurs, officials and executives who want to get to grips with the next phase of the big data era need to look further.
It’s not about privacy, she says, it’s about ownership and control.
“The debate right now is focused on data protection and privacy — that’s where fears have crystallized,” Liogier says. “But selling data and data ownership is the next big topic, and probably the most important topic.”
Consumers around the world are waking up to the fact that Facebook and Google’s online empires are built on data they signed away without any monetary compensation. The next step will be thinking about the alternatives, argues Liogier, who defended her master’s thesis at Sciences Po in Paris last month and will start a management consulting job after the summer.
Real data ownership will mean having all your information, from political views to skin-care preferences to medical records, in one place so you can decide who gets to access it and on what terms. That could mean selling it, granting limited use in exchange for a service (as with Facebook), or simply keeping it private. The point is to have control.
As part of this trend, Facebook is considering offering an ad-free version of its service to clients who are willing to pay.
This is not just about reining in creepy ads. The ability to process vast amounts of personal data promises to change our relationships, our governments and even our bodies — not to mention, of course, our shopping habits.
Netflix is already using client data to shape TV shows, and soon intelligent cars could alert highway operators to holes in the road, or trigger different billboard ads for drivers listening to country music or hip hop. A Cambridge University study famously found that after 300 likes, Facebook knows more about your personality than your spouse does.
How we deal with that new power is a cultural as much as a regulatory challenge. A younger generation of consumers and an older cohort of officials are wrestling with it already. Regulators in Europe may shape the approach of U.S. tech giants, just as European entrepreneurs may pick up on U.S. trends.
Looming over both is the Chinese market of 1.4 billion increasingly internet-savvy people. They are still fenced in by government restrictions for now, but they constitute the ultimate source of big data for businesses.
At the moment, fewer than one in six people say they’d be likely to sell their data, according to a global survey of consumer attitudes published by ForgeRock in March. But the more knowledgeable people were about their data privacy rights, the likelier they were to consider it, the survey showed.
The next generation of tech companies is already developing the models that will let users do just that.
London-based startup People.io is paying consumers for data in order to send them more targeted advertising. In April, former Cambridge Analytica executive Brittany Kaiser joined IOVO in New York, which uses blockchain technology to store consumers’ data and let them sell it to advertisers.
Parisian think tank GenerationLibre as well as U.S. teams at Stanford and Columbia universities are working to develop a valuation model that would allow people to price their information.
Regulators, though, are still trying to get to grips with the online world as it functions today, after being caught out by the potential of Facebook in particular to influence the political process. The Cambridge Analytica scandal has prompted calls in the U.S. Congress for tighter regulation of tech giants.
The European Union’s first move to shape the age of big data, the General Data Protection Regulation, came into force on May 25 and focuses on protecting personal data. Companies will face stricter rules on consent and beefed-up fines for any data breaches.
There are no provisions, though, to help people control how their data is used; officials are nervous about anything that looks like encouraging consumers to hand over more information.
“Selling yourself” is not something the French government is ready to endorse, says Cedric Villani, the mathematician Macron appointed to spearhead his push into new technology.
But GDPR does give individuals the right to aggregate their own information, or force a company to delete it.
“You are back at the center of your data universe,” says Molly Schwartz, a 28-year-old New York librarian.
As a Fulbright scholar in Helsinki in 2015, Schwartz was a founding member of advocacy group MyData, which pushes for stronger data privacy. Schwartz set up a New York hub on her return to the U.S. and is working to educate people about GDPR. While she hopes the law might have some knock-on effects protecting the data of Americans, she isn’t yet ready for her own government to enact a similar regulation.
Americans — early adopters of technology products — tend to relinquish their data willingly in exchange for new services. But like her compatriots, Schwartz became increasingly enraged by the Facebook data scandal. The share of U.S. users of the social media giant who described themselves as “very concerned” climbed to 43 percent after the Cambridge Analytica revelations, from 30 percent in 2011, according to Gallup public polling. Another 31 percent said they were “somewhat concerned.”
Like Liogier, Schwartz expects that people will ultimately take more active ownership of their data. But she is less willing to embrace that prospect.
“I don’t know whether to be happy or scared about this,” she says.
U.S. legislators have pushed to limit government access to personal information while allowing private companies more leeway to self-regulate; attitudes in China are almost the opposite.
Behind the Great Firewall
After decades of authoritarian rule, the Chinese broadly accept that the state security apparatus can access personal information on their phones, on WeChat, or from internet providers, but they also expect that their information will not be sold or leaked by private companies. Unauthorized and illegal uses of personal data have become a major issue in recent years.
Once information leaks into illegal databases, people are pestered by sales calls, precision-targeted ads and even fraudsters. Celebrities are mobbed at airports when fans swap flight details on social networks, and in 2016 an 18-year-old girl died of a heart attack after a telephone crook cheated her family out of half a year’s income saved for her college education.
Until the authorities get a grip on the abuse of data, people won’t be willing to use their information in a more active fashion, says Beijing-based data protection campaigner Nadiya Ni.
“The idea of trading personal data is not feasible in China at this moment,” she said.
But Liogier, the Parisian student, is betting that technological advances will ultimately trump cultural reservations. Even among her Parisian friends, Liogier meets resistance to her ideas on data ownership. She tells them they are wrong.
“Mindsets might not be ready, but this is a reality,” she says. “Our data is ours.”