Big Data risk assessment: Everyone for themselves

There is no one-size-fits-all risk assessment for Big Data. However, organisations should be cautious, as traditional rules still apply, privacy experts say. Dugie Standeford reports.
Big Data encompasses a bundle of characteristics which differ from traditional data, but that doesn’t relieve organisations from complying with the rules for processing personal information, UK Information Commissioner’s Office (ICO) Senior Policy Officer Carl Wiper said in an interview. The office, which is preparing to re-issue its July 2014 paper on Big Data [1] with “minor tweaks”, does not plan to change its view that data protection (DP) principles apply and that no new paradigm is needed, he said. Nor should DP risk assessments be handled differently from traditional ones, privacy lawyers said.
“Big Data is not a game played by different rules,” Morrison & Foerster attorneys Sue McLean, Ann Bevitt [2], Karin Retzer and D. Reed Freeman said in a recent client alert [3]. It “is just data. It’s simply that we have more of it and we can do more with it.”
Risk assessments involving Big Data are the same as the ones organisations should already be doing when they roll out any new technology or perform activities in a new way, Bevitt, who heads Morrison & Foerster’s privacy and data security group, said in an interview. However, she noted, having so much more data available makes data processing potentially riskier. Companies must “focus very carefully” on the benefits and risks of what they are doing because Big Data “ups the ante,” Wiper said.
Report, review, assess, remediate
Every organisation is different, but the standard framework for DP risk assessments involves reporting, reviewing, assessing and remediating, said Richard Kemp, founder of Kemp IT Law.
“In general, around 20% of data coming into an organisation is ‘structured’, meaning it is based on contracts, licences and other documents,” Kemp said. Big Data, however, often involves a 4:1 ratio of unstructured (Internet, social media) versus structured information. Companies that pull in data from social media can experience a mismatch between what they believe they are entitled to do and what they can actually do under their licences and contracts. So a risk assessment must first determine under what contracts or licences data is flowing into the organisation, how it is being used and whether there is a gap between that and how it should be used.
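To make that first stage concrete, here is a minimal sketch of what a data-source inventory with a licence gap check might look like in code. Everything in it (the DataSource structure, source names, permitted uses) is invented for illustration and is not drawn from Kemp’s framework.

```python
# Hypothetical data-source inventory for the first stage of a risk assessment:
# record where data flows in, the terms it arrives under, and flag any gap
# between what the licence permits and how the data is actually used.
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str                  # e.g. "CRM export", "social media feed"
    structured: bool           # contract/licence-based vs. scraped or social
    permitted_uses: set = field(default_factory=set)  # uses the licence allows
    actual_uses: set = field(default_factory=set)     # uses observed in practice

    def gap(self) -> set:
        """Uses carried out that the contract or licence does not cover."""
        return self.actual_uses - self.permitted_uses

sources = [
    DataSource("CRM export", True,
               {"billing", "support"}, {"billing", "marketing"}),
    DataSource("social media feed", False,
               {"sentiment analysis"}, {"sentiment analysis", "profiling"}),
]
for source in sources:
    if source.gap():
        print(f"{source.name}: uses outside licence terms -> {sorted(source.gap())}")
```

Run against a real inventory, any non-empty gap is exactly the mismatch Kemp describes between what a company believes it may do and what its contracts allow.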
“Part of this evaluation involves deciding what a particular company wants to do with the Big Data it has,” Kemp told PL&B. The problem is that different departments want to take advantage of the information for varying purposes, because Big Data is more bottom-up than top-down. Organisations under closer regulatory scrutiny, such as banks and health-related organisations, may need a more formal evaluation of the gap because of tighter reporting requirements.
The risk assessment tends to go hand-in-hand with a more agile approach to information governance, Kemp said. “Big Data governance does not arise in a vacuum,” he wrote in an October 2014 white paper, ‘Legal Aspects of Managing Big Data’ [4].
Large organisations typically already have in place governance activities for all or part of their data activities, ranging from data protection and privacy governance frameworks to more detailed governance and management structures focused on information architecture, data accuracy, security and regulatory compliance. But the rise of Big Data is democratising the benefits of using the information, causing data governance and management to rise up the corporate agenda alongside Big Data itself, he wrote.
Kemp’s white paper proposed a structured approach for managing Big Data projects that involves four steps:
1) Risk assessment; 2) strategy statement; 3) policy statement; and 4) processes and procedures. The review should focus particularly on where data comes from, the terms under which it is supplied and how it is being used. The next stage should gauge whether use is consistent with contractual and licence terms, and whether all necessary consents have been obtained for the uses carried out. The review and assessment should be part of a report to senior management, and will normally also include recommendations for correcting any areas of non-compliance and looking forward to the strategy and policy aspects of data governance, the paper said.
Fixing problems involving structured data could mean obtaining re-permissions under the contract or licence to use data, Kemp said. Unstructured information from social media and the Internet could include open source data, which also has some rights attached to it, and that must be addressed, he said.
Any personal data must be processed fairly, with notice and consent, under the UK Data Protection Act and its EU equivalents, Kemp noted. But the whole point of Big Data is to come up with “unexpected correlations” between data sets, which is difficult to reconcile with the requirement that data be processed in line with individuals’ reasonable expectations, he said. The privacy part of a Big Data risk assessment will be “reasonably sizeable.”
While obtaining express consent to the use of Big Data might seem to be the final answer, people can revoke consent, requiring consideration of anonymisation, pseudonymisation, privacy impact assessments, privacy by design and so on, Kemp said. Those mechanisms are fine, but the consequence of so many sets of Big Data being analysed together is that there will be much more personal data involved. There’s an increasingly powerful view that in the long run, Big Data may make more traditional data protection “a bit of a nonsense,” he added.
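As a purely illustrative aside on one of those mechanisms, a common pseudonymisation technique is keyed hashing of direct identifiers. The sketch below is a generic example, not a method Kemp or the ICO endorses; because the key allows records to be re-linked, the output is pseudonymous rather than anonymous and remains personal data under DP rules.

```python
# Minimal pseudonymisation sketch (illustrative only): replace a direct
# identifier with a keyed hash so data sets can still be joined without
# exposing the raw identifier. The key must be held separately; whoever
# holds it can re-link records, so this is pseudonymisation, not
# anonymisation.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-key-kept-separate-from-the-data"  # illustrative

def pseudonymise(identifier: str) -> str:
    """Return a deterministic keyed hash of an identifier, e.g. an email."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "purchases": 12}
record["email"] = pseudonymise(record["email"])
print(record)  # the same input always yields the same token, enabling joins
```

The determinism is both the point and the risk: it preserves the “unexpected correlations” Big Data analysis is after, while leaving individuals re-identifiable to anyone who obtains the key.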
In-house or outside consultant?
One key issue for data protection officers contemplating Big Data risk assessments or privacy impact assessments is whether to handle them in-house or externally, said DP consultant Martin Hoskins, formerly head of data protection for mobile operator Everything Everywhere Ltd. If the assessments are carried out externally, DPOs must think carefully about several issues.
One question is how long to give the consultant to do the work, Hoskins said. A second is who in the organisation the consultant should be allowed to speak with. Then there is the issue of what information the consultant should be given to explain what the assessment is about; and, finally, what line the consultant should take, bearing in mind that the final report may have to be made public and, hence, subject to disclosure under the Freedom of Information Act.
These considerations are true of data risk assessments in general but become more important with Big Data because the “stakes are much higher,” Hoskins said. “It’s more difficult to convince organisations that data they’re collecting may be personal data. In addition, the third data protection principle requires that personal data processed be adequate, relevant and not excessive in relation to the purpose for which it’s processed. Many companies aren’t clear about what they want to do with Big Data other than mash lots of it together, making it hard to reconcile with the principle of data minimisation.”
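To illustrate the minimisation principle Hoskins cites, the sketch below filters a record down to the fields a declared purpose actually needs before any analysis or combination. The purposes and field names are invented for the example.

```python
# Illustrative data-minimisation filter: before records are combined or
# analysed, drop every field the declared purpose does not require.
# Purposes and field names are invented for the example.
PURPOSE_FIELDS = {
    "billing":   {"customer_id", "amount", "invoice_date"},
    "marketing": {"customer_id", "email", "preferences"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Keep only fields that are adequate, relevant and not excessive
    for the stated purpose (the third data protection principle)."""
    allowed = PURPOSE_FIELDS[purpose]
    return {key: value for key, value in record.items() if key in allowed}

raw = {
    "customer_id": 42,
    "email": "alice@example.com",
    "amount": 9.99,
    "invoice_date": "2015-03-01",
    "location_history": ["51.5074,-0.1278"],  # not needed for billing
}
print(minimise(raw, "billing"))  # email and location_history are dropped
```

A company that cannot populate PURPOSE_FIELDS has, in effect, the problem Hoskins describes: it does not yet know what it wants the data for.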
Where problems arise
Big Data risk assessments can go awry when organisations fail to think broadly enough from the start about what they want the information for, Bevitt said.
A company might decide to do ‘X and Y’ with the data, and carry out a risk assessment for those activities, Bevitt said. Later, it might want to do ‘Z’ as well, so it will have to do another assessment. Neglecting to think comprehensively about purpose limitation can lead to activities that are incompatible with the original risk assessment and to having to obtain consent from data subjects, she said. Big Data is an amorphous concept and, “It’s all still fairly new.” That makes it difficult for organisations to consider all the possible uses they could make of the information.
There is no one-size-fits-all risk assessment path, Bevitt said. Organisations handle such evaluations differently, and that variety flows into risk assessments for Big Data. That there has been no significant convergence of best practices for Big Data risk assessments reflects the fact that each enterprise performs them according to its own needs, she said.
Organisations do risk assessments using very different methodologies, said Hoskins. Most do them in-house and in such a way that they would find public disclosure of their findings embarrassing. The reports could show how much personal data a company is mashing together, or that it had deliberately decided to take a risk on a novel, innovative activity not covered by DP law, or that personal data is being used in a perfectly legal manner that some people might nevertheless find objectionable.
Sharing risk assessments with regulators or the public leaves organisations open to unfair criticism if their discussions about privacy risks and remediations are transparent, said Hoskins. Sections of the reports can be taken out of context when companies are “frank and open and fearless,” he said.
Not a “present evil”
Asked whether the ICO is aware of organisations mishandling Big Data risk assessments, Wiper said the issue does not feature much in complaints to the office, and even less in enforcement actions. The paper wasn’t directed at a “present evil” but was intended simply to take a closer look at a hot topic. It was about getting “slightly ahead of the game” and preparing businesses for a future that involves more data analytics.
The lack of complaints about misuse of Big Data could be because its use seems “much less embedded” in the UK than in the US, Wiper said. The ICO found that organisations were sometimes reluctant to characterise what they were doing as data analytics, either because it in fact wasn’t, or because the data processing didn’t involve a major paradigm shift from what they were already doing.
The ICO believes that there is something about Big Data that is different from what has gone on before, Wiper said. But the office’s key message is that if a company processes personal data, DP rules still apply. It is more difficult to apply data protection principles to Big Data, but that doesn’t mean organisations don’t have to do it. Wiper told PL&B that the ICO will publish its response to the comments on its July 2014 paper “soon”. Its view that DP rules apply to Big Data has won support from other data protection authorities, and the ICO believes that it is on the right track.
References
1. ICO, ‘Big data and data protection’ (July 2014): https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf
2. Ann Bevitt has since moved and is now a Partner and Employment/Data Protection lawyer at Cooley LLP.
3. The Morrison & Foerster client alert: mofo.com/~/media/Files/ClientAlert/2014/11/141113BigDataSeries.pdf
4. Kemp IT Law, ‘Legal Aspects of Managing Big Data’ white paper (Oct. 2014): www.kempitlaw.com/legal-aspects-of-managing-big-data-white-paper/
Read the original article published in Privacy Laws & Business