Balancing Privacy And Security

Lack of access to information and technology has a major impact on people’s ability to prepare for, survive and recover from disasters, says the 2013 World Disasters Report. The report, which was recently released by the International Federation of Red Cross and Red Crescent Societies, focuses on technology and the future of humanitarian action. Written by more than 40 humanitarians and academics, the report emphasizes that during the first critical hours after an emergency, most lives are saved by local people. Yet many of these first responders don’t have access to basic life-saving information and tools, such as early warning systems and basic connectivity and network infrastructure.

The report urges the private sector, humanitarian organizations, governments and local communities to partner together to ensure access to technology for these populations and responders. It also looks at the risks and unintended consequences of this influx of technology: chief among them, the risk that greater information sharing and more data collection will lead to information misuse and compromised data security and privacy.

“The challenge is to move beyond the privacy issue so that we can leverage the data to address socioeconomic problems such as financial inclusion, food security and disaster response and some of the other big global challenges and build it out in a way that is commercially sustainable,” says William Hoffman, head of Data-Driven Development at the World Economic Forum.

The World Economic Forum is organizing an invitation-only workshop during Web Summit that will seek to tap the brains of deep thinkers in government, the private sector and humanitarian agencies about how to enhance the Internet with better metadata and accountability. The workshop will explore the global rules and tools that will allow big data to be leveraged for the greater good and for business.

“The goal is to ensure a user-centered, dynamic and accountable ecosystem,” says Hoffman. “The autonomy of the individual and the policies that support it are fundamental to a resilient and sustainable ecosystem. Additionally, we need technical innovation in the areas of metadata so that permissions and provenance can flow with the data. There also needs to be equitable value distribution among all stakeholders, which addresses the current asymmetries that exist.”


Hacking For Humanity

Data collected by mobile operators could be used to pinpoint where people are moving en masse during war or disease outbreaks in order to better target relief efforts. But technology is a double-edged sword: The tools that can allow for effective tagging and identification of people can be used to subvert personal freedoms.

Take the case of people who attended a rally against the Syrian regime in 2011. The next day, images of protesters downloaded from YouTube were cross-checked against the faces of people walking through checkpoints, says human rights advocate Sam Gregory, Program Director at WITNESS, an organization that encourages people to use video to document abuse. When IDs were made, arrests followed.

Pictures taken with mobile phones often contain embedded information, known as metadata, which can record details such as the type of camera that took the picture and the date and time. The metadata can also contain location information, such as GPS coordinates, which can be used to determine where an image or video was captured.

That is why WITNESS, working with the Guardian Project, has developed ObscuraCam, a mobile phone app that strips out metadata and automatically detects and blurs out faces in videos and images, for activists who are worried about their safety. In July 2012, YouTube introduced a similar blurring feature on its website.

The same technology can be used by the general public to protect the privacy of children or others.

WITNESS has also developed technology called InformaCam to do the opposite: add more metadata to verify and authenticate footage and share it securely with someone you trust. Such technology is needed because there are times when citizen journalists and human rights defenders might need to ensure their digital files can be accessed later for legal evidence or for archival purposes.

Or they may want to keep the geolocation metadata, add detailed information about what they are capturing, and be able to encrypt it, to help media outlets trust and verify the content. Digital manipulation of images and video is now commonplace, so news agencies have to contend with the possibility that digitally altered media is being passed off as truth. Half a million videos have been shot since the conflict began in Syria, says Gregory, so it is important to establish basic criteria for authentication and trust when sorting through such a huge volume of footage.

WITNESS wants to encourage technology vendors, such as handset makers, to consider embedding a “verification layer” or “witnessing functionality” in their platforms, just as YouTube agreed to add the blurring function.

WITNESS and other humanitarian organizations, such as the Global Fund, will be taking part in the World Economic Forum workshop during Web Summit. Humanitarian groups participating in the workshop want to collaborate with the tech community to build tools that are useful not just to NGOs but to society as a whole. For example, databases built by NGOs need technology safeguards so that it is possible to audit who is accessing the data.

“We don’t need a new silo for these technologies — we want to leverage investments that have already been made and figure out how we can use what is out there,” says Gregory. And, he says, NGOs want tech companies and policy makers to start thinking about issues that touch not just privacy of regular citizens but the safety of activists and humanitarian workers. “There are all kinds of issues with things like Google Glass. How do you build an understanding of consent? How do you flag the people that are being filmed?” asks Gregory.

The idea is to get the tech community to start hacking for humanity: thinking about human rights values from the start when developing new products, with privacy by design, security and the best opportunity to speak out safely.

The Need For Scrutineers

Controlling what governments can do with data is key. Totalitarian regimes can use videos of demonstrations or data collected by NGOs to track down and jail or kill rebels and human rights activists.

While such abuses can be prevented with the right technology tools, sometimes companies don’t have a choice about handing over data to governments. Soon after the 9/11 terrorist attacks in the U.S., the Treasury Department began to subpoena data from the Society for Worldwide Interbank Financial Telecommunication (SWIFT), a global organization that each day handles close to 20 million financial transactions, such as wire transfers, for more than 10,000 banks, so that government analysts could track the movement of terrorist funds.

The SWIFT system doesn’t contain private bank account information. But if a terrorist financier in one country were sending funds to a terrorist in another, the transfer would appear in the subpoenaed SWIFT messages; the sender’s and receiver’s names and bank account details would also be in the message.

Given the importance and confidentiality of its data, SWIFT demanded that the government’s access be targeted and limited, preventing broad mining but allowing focused searches and analysis to prevent terrorist attacks. Searches for other purposes were forbidden.

“You do the best you can up front sitting down with the government, trying to limit what they want, but very often you don’t know,” Leonard Schrank, the chief executive of SWIFT from 1992 to 2007, said in an interview with Informilo. “Over time you can keep track of what is really useful to them and then reduce and delete the stuff they don't need.”

SWIFT successfully argued that its data could be used only for one purpose — counterterrorism. “Data is seductive and could have been used for other purposes so we hired scrutineers — end-to-end auditors — to make sure that it was tied to a counter-terrorism investigation and there were no back doors,” says Schrank. “We were able to get 99.9% assurance the data was not being misused.”

Schrank argued in a recent New York Times op-ed article co-authored by Juan C. Zarate, a former assistant Treasury secretary, that SWIFT's approach could be a model for how to limit the government’s use of mass amounts of data.

Private companies that are affected should be briefed on how their data is being used and given some say in how the programs are structured, says Schrank. If companies like Google and Facebook could understand why their data is needed they might be able to help the U.S. government with some creative problem solving, he says.

While big companies like Google have the means to hire their own “scrutineers,” start-ups are in a different position, says Michael Jackson, a partner at Mangrove Capital Partners and a scheduled participant in the WEF workshop at Web Summit.

Jackson worked at Skype in the early days, where he was responsible for all contact with security services and compliance with regulation. Skype went to great lengths to ensure that it would not be classified as a communication service so that it would not have to comply with cumbersome wiretap requests, says Jackson. But the company knew that if it was asked for information backed up by the right paperwork it would have a legal obligation to deliver it. The obligations can be quite burdensome for small companies and they are not always clear. “We haven't clarified the legal framework for small companies, for start-ups to comply. It is all a big mystery,” says Jackson.

Jackson says he is less concerned about abuses by government agencies like the NSA than about what untrained people are doing with data gleaned from social networking sites. Wrong assumptions are being made and lives are being ruined.

The question “How safe is your data?” is not limited to what governments do with it. With more than six billion people connected to mobile devices that can capture and track location patterns, private corporations can increasingly link all sorts of data to individuals.

“We need a framework for data at the personal level,” says Jackson.

Until recently, most people didn’t have a clue about who was looking at their data and what was being done with it. The danger is that news headlines about alleged abuses by the U.S. National Security Agency and companies such as WhatsApp and Path will create a backlash.

In a 2013 survey of Internet populations across 11 countries, 68% of respondents said they would select a “do-not-track” feature if it were easily available, according to consultancy Ovum’s report “Personal Data Future: The Disrupted Ecosystems.”

This hardening of consumer attitudes, coupled with tightening regulation, could diminish the supply of personal data, undermining not just the Internet economy but big data analytics that could be used to solve some of the world’s most pressing problems.

The Way Forward

Reliance on “notice and consent,” in which companies write obtuse policies and get consumers to click without reading or understanding them, is not working: studies show that privacy policies are hard to read, are read infrequently and do not support rational decision making.

Some companies are working on ways to simplify and standardize legal language. Privacyscore, for example, analyzes companies’ privacy policies along four clear criteria and gives each website a color-coded rating and score. And Mozilla has proposed a symbols-based approach to presenting legal terms, featuring icons that signal, for example, how long data is retained, whether data is used by third parties, if and how data is shared with advertisers, and whether law enforcement can access the data.

Start-ups are also cropping up to help alert and shield consumers from unwanted data snooping.

But some argue that the real remedy lies in economics, not in simplified legalese or technology. Personal data is worth money, so marketplaces for privacy are likely to evolve, in which website visitors choose to accept or reject offers of payments or rewards in exchange for a loss of privacy. Respect Network, a California-based start-up, is offering to broker such exchanges.

From Data Fracking to Data Friending

“A new ecosystem is emerging,” says Mark Little, an Ovum analyst who worked on the Data Futures report. “Instead of the Internet company writing a one-sided, standard-form legal contract, the consumer will write his own. You’ll have a vault; a company knocks on your door, you open your door and present them with your policy,” says Little.

Little describes the shift as a move from “data fracking” to “data friending.” If “data friending” works, and the majority of consumers are confident that the trade is fair and secure and that both they and society as a whole will benefit from sharing their information, everybody will win, says Little. The privacy industry can develop lucrative businesses, the Internet economy will continue to flourish, and humanitarian agencies will obtain the tools needed to better tackle socioeconomic issues.

