Privacy Is Dead: Now What?


One mobile app getting a lot of buzz is Snapchat, a social media messaging service that transfers some 60 million messages and videos per day. Part of the platform’s draw is that the user determines how long friends can view messages before they disappear.

The app’s popularity underscores people’s desire to have more control over their personal data. Shielding compromising pictures or drunken texts from the prying eyes of a future employer, parent or stranger is just one concern. With more than six billion people connected via mobile devices that can capture and track location patterns, an increasing variety of data can now be linked to individuals. Trouble is, most people don’t have a clue who is looking at their data and what is being done with it.

Alarm bells are ringing, and with good reason. WhatsApp, a popular cross-platform mobile messaging app, came under fire in January from the Canadian and Dutch governments for forcing those who download the app to upload their entire address books. In a statement, the office of the Canadian privacy commissioner said it had “reasonable grounds” to believe the California-based developer was “collecting, using, disclosing and retaining personal information” of people who had never used the app but may have given their phone numbers to a friend or contact who does.

And in early February the U.S. Federal Trade Commission announced that it had slapped Path, a social network that allows users to keep journals of special life moments, with an $800,000 fine for automatically, and improperly, collecting personal information from its customers’ mobile address books, including friends’ first and last names, addresses, phone numbers, e-mail addresses, Facebook and Twitter user names, and dates of birth. The data collection occurred automatically when version 2.0 of the app was first launched and each time a user signed back into his or her account.

Like the Facebook, Google and MySpace settlements, the Path-FTC settlement also requires the company to establish a comprehensive privacy program and to obtain independent privacy assessments every other year for the next 20 years.

To head off such abuses in the future, the FTC has just released new guidance on implementing security for mobile applications. The GSMA, the mobile industry trade association, in February introduced an Accountability Framework designed to add teeth to privacy guidelines it introduced in 2012. A number of European mobile operators, including Vodafone, Telefonica, France Telecom Orange and Deutsche Telekom, have already started implementing the guidelines.

But these steps may not be enough to restore user confidence: in a recent survey of Internet populations across 11 countries, 68% of respondents said they would select a “do-not-track” feature if it were easily available, according to consultancy Ovum’s new report, “Personal Data Futures: The Disrupted Ecosystems.” This hardening of consumer attitudes, coupled with tightening regulation, could diminish the supply of personal data, undermining not just the Internet economy but also the big data analytics that could help solve some of the world’s most pressing problems.

Anonymized data collected by mobile operators could be used to identify pockets of poverty in urban areas or to pinpoint where people are moving en masse during disease outbreaks in order to better target relief efforts, says Anoush Rima Tatevossian, head of Strategic Communications & Partnerships for Global Pulse, a program run out of the Executive Office of the United Nations Secretary-General that is exploring how new digital data sources and real-time analytics technologies can help policymakers better understand issues like hunger and poverty and manage disasters in real time.

Making sure big data can be used for the greater good, as well as by business, is the focus of work being done by the World Economic Forum (WEF). A year-long global dialogue among public-sector, private-sector and civil-society experts has produced a growing consensus that the notions of deleting data and limiting its collection are problematic in the era of big data. If the likes of Google could not collect personal information for targeted advertising, they would have to start charging for services now offered for free, such as Google Search and Gmail. But the stakes go well beyond keeping the Internet economy afloat: limits on the use of personal data could curtail progress in solving some of the world’s most challenging problems, because big data doesn’t just hold the answers to what products and services people consume; it can also tell us what people create, and how they cope with global stresses like unemployment and natural disasters.

Additional points of consensus from the WEF’s global dialogue, which are expected to be outlined in a report scheduled for release in the first quarter of 2013, include the need for data to flow and combine with other data, but in a manner that accounts for the potential risks and privacy intrusions this flow could create. There was also broad support for shifting policy frameworks from controlling data collection to focusing on appropriate and trusted data use, since data itself does not create value or cause problems; its use does.

The topic is on the agenda at this year’s Mobile World Congress, which is taking place February 25-28. Robert Kirkpatrick, director of U.N. Global Pulse, is scheduled to speak twice on February 25th: at a ministerial track session entitled “Balancing privacy: Providing social good and economic opportunities – the mobile perspective” and at a GSMA Disaster Response Program seminar entitled “Mobile: A Lifeline in Disasters.”

Kirkpatrick will also participate in a private workshop on data-driven development, organized by the WEF, during MWC.

“The challenge is to move beyond the privacy issue so that we can leverage the data to address socioeconomic problems such as financial inclusion, food security and disaster response and some of the other big global challenges and build it out in a way that is commercially sustainable,” says William Hoffman, head of the World Economic Forum’s ICT Agenda.

Moving From Data Fracking To Data Friending

The WEF has been working on a “rethinking personal data” initiative for the past two years. Steering board members include Augie Fabela, chairman and co-founder of VimpelCom; Robert Quinn, AT&T senior vice-president, Federal Regulatory and Chief Privacy Officer; Craig Mundie, Chief Research and Strategy Officer, Microsoft; Ellen Richey, Visa’s Chief Enterprise Risk Officer; and George Halvorson, Chairman and Chief Executive Officer, Kaiser Permanente.

It is a difficult task. Laws governing data privacy were written some 30 years ago and are no longer relevant or effective.

Reliance on “notice and consent,” in which companies write obtuse policies and get consumers to click through without reading or understanding them, is not working: studies show that privacy policies are hard to read, are read infrequently and do not support rational decision making.

For example, a 2008 study by Aleecia M. McDonald and Lorrie Faith Cranor, published in I/S: A Journal of Law and Policy for the Information Society, found that privacy policies range in length from just 144 words to 7,669 words, with a median of around 2,500 words. At a standard reading pace of 250 words per minute, a typical policy takes eight to 10 minutes to read. The study concluded that it would take a person about 244 hours per year to read every new privacy policy she encountered, and 154 hours just to skim them. The authors estimated that if all American Internet users read online privacy policies word for word each time they visited a new site, the opportunity cost of that time would be about $781 billion a year.
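As a rough sanity check, the arithmetic behind those figures can be reproduced in a few lines of Python. The median word count and reading speed come from the study cited above; the sites-per-year, population and hourly-value figures below are illustrative assumptions chosen only to be roughly consistent with the study’s reported totals, not its exact inputs.

```python
# Back-of-the-envelope reproduction of the reading-time arithmetic above.
# Word count and reading speed: from the McDonald/Cranor study.
# The remaining figures are illustrative assumptions, not the study's inputs.

MEDIAN_POLICY_WORDS = 2_500      # median privacy policy length (words)
READING_SPEED_WPM = 250          # standard reading pace (words per minute)

NEW_POLICIES_PER_YEAR = 1_462    # assumed unique sites visited per person per year
US_INTERNET_USERS = 221e6        # assumed U.S. online population (late 2000s)
VALUE_OF_TIME_PER_HOUR = 14.50   # assumed dollar value of an hour of online time

minutes_per_policy = MEDIAN_POLICY_WORDS / READING_SPEED_WPM        # ~10 minutes
hours_per_person = NEW_POLICIES_PER_YEAR * minutes_per_policy / 60  # ~244 hours/year
national_cost = hours_per_person * US_INTERNET_USERS * VALUE_OF_TIME_PER_HOUR

print(f"{minutes_per_policy:.0f} minutes to read one median-length policy")
print(f"{hours_per_person:.0f} hours per person per year")
print(f"~${national_cost / 1e9:.0f} billion a year in opportunity cost")
```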

Some companies are working on ways to speed up and standardize legal language. Privacyscore, for example, analyzes the privacy policies of companies along four clear criteria and gives each website a color-coded rating and score. Mozilla has proposed a symbols-based approach to the presentation of legal terms; icons signal, for example, how long data is retained, whether data is used by third parties, if and how data is shared with advertisers, and whether law enforcement can access the data.
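To make the idea concrete, here is a hypothetical Python sketch of how such a symbol-and-score summary might be represented. The four fields mirror the kinds of signals described above, but the weights and color thresholds are invented for illustration and do not reflect Privacyscore’s or Mozilla’s actual methodology.

```python
# Hypothetical model of a color-coded privacy summary (illustration only).
from dataclasses import dataclass

@dataclass
class PrivacySummary:
    retention_days: int           # how long data is retained
    used_by_third_parties: bool   # whether data is used by third parties
    shared_with_advertisers: bool # whether data is shared with advertisers
    law_enforcement_access: bool  # whether law enforcement can access the data

    def score(self) -> int:
        """Return a 0-100 score; higher means more privacy-friendly."""
        s = 100
        s -= min(self.retention_days // 30, 12) * 5   # up to -60 for long retention
        s -= 15 if self.used_by_third_parties else 0
        s -= 15 if self.shared_with_advertisers else 0
        s -= 10 if self.law_enforcement_access else 0
        return max(s, 0)

    def color(self) -> str:
        s = self.score()
        return "green" if s >= 70 else "yellow" if s >= 40 else "red"

# Example: a site that keeps data for a year and shares it widely.
site = PrivacySummary(365, used_by_third_parties=True,
                      shared_with_advertisers=True, law_enforcement_access=False)
print(site.score(), site.color())   # -> 10 red
```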

Start-ups are cropping up to help alert consumers and shield them from unwanted data snooping. (See the box to find out more about what start-ups are doing in this area.)

But some argue that the real remedy lies in economics rather than in simplified legalese or new technology. Personal data is worth money, so privacy marketplaces are likely to evolve in which website visitors choose to accept or reject offers of payments or rewards in exchange for a loss of privacy. Respect Network, a California-based start-up, is offering to broker such exchanges.

“A new ecosystem is emerging,” says Mark Little, an Ovum analyst who worked on the Personal Data Futures report. “Instead of the Internet company writing a one-sided, standard-form legal contract, the consumer will write his own. You’ll have a vault, a company knocks on your door, you open your door and present them with your policy.”
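Conceptually, the vault model Little describes might work something like the following Python sketch, in which the consumer publishes her own terms and a company’s data request is accepted, rejected or countered against them. All names, fields and prices here are hypothetical; no real marketplace API is implied.

```python
# Hypothetical "personal data vault" check: the consumer's policy, not the
# company's terms of service, decides whether data is released.
from dataclasses import dataclass, field

@dataclass
class ConsumerPolicy:
    allowed_purposes: set = field(default_factory=lambda: {"service_delivery"})
    max_retention_days: int = 90
    price_per_extra_purpose: float = 0.50   # payment demanded for out-of-policy uses

@dataclass
class DataRequest:
    company: str
    purpose: str
    retention_days: int
    offered_payment: float = 0.0

def evaluate(policy: ConsumerPolicy, request: DataRequest) -> str:
    """Decide whether the vault should release data for this request."""
    if request.retention_days > policy.max_retention_days:
        return "reject: retention too long"
    if request.purpose in policy.allowed_purposes:
        return "accept"
    if request.offered_payment >= policy.price_per_extra_purpose:
        return "accept: compensated use"
    return f"counter-offer: pay at least {policy.price_per_extra_purpose:.2f}"

policy = ConsumerPolicy()
print(evaluate(policy, DataRequest("AdCo", "advertising", 30, offered_payment=0.10)))
# -> counter-offer: pay at least 0.50
```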

Little describes the shift as a move from “data fracking” to “data friending.” If data friending works, and the majority of consumers are confident that the trade is fair and secure and that both they and society as a whole will benefit from sharing their information, everybody wins, he says. The privacy industry can develop lucrative businesses, the Internet economy will continue to flourish, and agencies like the U.N. will obtain the tools to better tackle socioeconomic issues.

There is a big incentive for companies and governments to get this right. Analytics have become the new engine of economic and social value creation. And the insights derived from linking previously disparate bits of data have become essential for innovation.
