Dr. Waël Hassan: Response to OPC Consent Consultation

Summary

Many believe that privacy as we know it is at a crossroads. Can data protection flourish in this brave new world of technological change, or will it decay? Economic, legal, technical, and corporate innovation will all be crucial in helping to direct the future of data protection in Canada. The OPC’s consultation paper is timely and much needed, as privacy laws have become dated. This proposal addresses the four questions put to stakeholders:

  1. What roles, responsibilities and authorities should the parties responsible for promoting the development and adoption of solutions have to produce the most effective system?

We will begin by proposing new relationships between government, technology entrepreneurs, and corporate and business leaders to strengthen and enhance privacy in Canada. Privacy-focused strategic alliances between government, major corporations, and innovation agencies can offer significant benefits to their various stakeholders, resulting in economic growth, improved legal compliance, and stronger privacy protections for individuals.

  2. What, if any, legislative changes are required?

The EU’s pending data protection legislation contains many elements that Canada should consider adopting, including a horizontal legal approach, mutual responsibility for data, national regulation of multinational corporations, strong compliance validation mechanisms, breach notification requirements, financial penalties, and individual and collective options for recourse.

  3. Of the solutions identified in the discussion paper, which ones have the most merit and why?

Emerging technologies have great potential to support privacy and individual control over personal data. Risk-based de-identification can be used effectively to protect privacy in big data contexts. Data “tagging” can support the management of privacy preferences across services, and in future could allow individuals to maintain control over personal content shared online.

Additional enforcement powers for the OPC are another key solution. The European Union offers an example of a strong governance and enforcement model that can effectively motivate corporate compliance with privacy laws.

  4. What solutions have we not identified that would be helpful in addressing consent challenges and why?

Apart from the question of individual consent, a public conversation is needed about the ethical use of big data, even in anonymized form. The OPC can act to create more dynamic and accessible forums for individuals to express their concerns and complaints about how their data is used by corporations and other entities.

 

Why Privacy Innovation is Needed

 

Privacy under Attack

Government and healthcare agencies, financial institutions, and corporations store massive amounts of personal data provided by Canadians. Yet according to the Office of the Privacy Commissioner of Canada’s own data (2014), 90% of Canadians are concerned about their privacy: 73% feel they have less protection of their personal information than 10 years ago, and 56% are not confident that they have enough information to know how new technologies affect their privacy. Rapid changes in technology and communications are altering the ways we interact, and much of our private information is slipping out of our hands. In social media forums, we can instantly share personal details with a public of our own choosing. Such sharing is part of a developing cultural norm. What is less easy to control is what happens to the data tracked from our Internet use. As more and more of our interactions and transactions take place online, more and more of our personal information is finding its way onto the Internet.

 

Major Internet corporations such as Google and Facebook track consumers’ activities online, creating identity profiles of consumer preferences in every area of life, by analyzing browsing history, consumption patterns, status updates, and email content. As this data is shared amongst corporations, and with government security agencies, personal privacy faces an unprecedented challenge. There is little real transparency: consumers routinely agree to terms and conditions so lengthy that it isn’t practical to read them. Few citizens are aware of the level of data sharing that takes place between major corporations, and with federal agencies. As larger and larger volumes of data are collected and aggregated by big data initiatives, it is becoming more difficult to define precisely what is considered personal information.

 

The advent of the Internet of Things adds another dimension of complexity to the sharing of personal information. In this new paradigm, informed consent is more important than ever. Yet legislation lags behind technological innovation, and organizational culture is still reorienting itself to respond to these new privacy challenges.

 

Although big data and the Internet of Things pose new challenges to privacy, they are not incompatible with it. A wide variety of sophisticated privacy-bolstering tools already exist, and many more are within reach. Privacy is being eroded not because of technological innovation in itself, but because industry currently has little economic incentive to invest in privacy innovation.

 

In this rapidly-changing technological environment, it is crucial for privacy leaders to find innovative new ways to bolster privacy, and to communicate these recommendations clearly to government, corporations, and the general public.

 

A Model of Shared Investment in Privacy Protection

Privacy is a relatively untapped resource in Canada. Working with government and the private sector, privacy leaders can help raise the profile of privacy protection by spelling out its economic potential. Privacy-focused strategic alliances between government, major corporations, and innovation agencies can offer significant benefits to their various stakeholders. Privacy commissioners and data protection authorities could play a pivotal role in inviting these stakeholders to the table.

 

Within corporate culture, data protection is often seen negatively, as another unfortunate overhead. Protection of private information is understood as a threat to profit, draining resources to avoid the risk of a security breach and the attendant liabilities. Yet privacy-bolstering technology is also a business opportunity. Far from being a liability, privacy can be a powerful opportunity for companies to differentiate themselves as leaders in corporate responsibility and service to the public.

 

While recent technological advances have often undermined personal privacy, emerging technologies can strengthen and protect an individual’s activity online. Investing in privacy-bolstering technologies is a smart business move. The erosion of online privacy is of significant concern to the public: for example, 90% of polled US citizens say that having control over what information is collected about them is important (Pew Research Center, 2015). The further development of privacy-bolstering technologies would thus be responding to the concerns of a significant majority of Internet users, who desire greater control over their personal data.

 

Who Benefits?

When government agencies (federal, provincial, and municipal) and corporations invest in startups working on privacy innovation:

  • They are contributing to economic growth: New companies, new jobs, new Canadian innovation.
  • Corporations are investing in their own economic future: As well as ensuring they are fully compliant with privacy legislation, the innovations that corporations adopt will give them a competitive edge in a challenging market.
  • Individuals’ personal information is better protected: All Canadians benefit from a climate in which issues of privacy and informed consent are given priority.

 

A Model of Shared Investment in Privacy Protection

For this innovative model of shared investment in privacy to help expand Canada’s privacy paradigm, it will need to be supported by developments in law, technology, and corporate culture.

This model focuses on how:

  • Data protection law can expand to meet these new privacy challenges.
  • Technological entrepreneurs can sharpen their focus on supporting personal privacy.
  • Corporate culture can re-orient its understanding of privacy protection, seeing it not as a threat but as an opportunity.

Data Protection Legislation

 

Prioritizing Privacy: The EU Approach

Current data protection laws in Canada, like those in the US, are vertical (sector-specific). By contrast, the European Union and many of its constituent states follow a horizontal model. This allows for a more mature, integrated approach to the protection of personal information. With more data sharing across organizational boundaries, sector-specific laws are becoming increasingly difficult to apply, and many initiatives now require extensive consultation to establish relevant privacy obligations. Data sharing across jurisdictions raises further complications; in Canada, some provinces have similar privacy laws, in the realms of both commerce and healthcare, but others have very divergent legislation. The EU has irreversibly committed itself to data protection reform, and this pending legislation offers much that Canada could consider emulating.

 

Some ideas and practices Canada should consider adopting from the EU:

 

  • A horizontal legal approach: allowing for streamlined provision and enforcement of data protection.
  • Mutual responsibility for privacy of shared data: in which both the primary service provider who first collected the data and third parties with whom that data is shared are held responsible for enforcing privacy provisions. A shared responsibility model reflects greater privacy maturity by shifting from an exclusive focus on adequate policy and agreements to ensuring effective implementation through monitoring and governance of all data holders.
  • National regulation of multinational corporate activity: The EU approach to data sharing across jurisdictions is based on territories, which means that foreign companies must comply with the laws of the countries in which their customers reside. The pending legislation will give national regulators the power to assess the legal compliance of multinational companies’ codes of conduct. Codes of conduct must contain satisfactory privacy principles and effective implementation tools, and demonstrate that they are binding. By contrast, Canadian citizens have little recourse to protect the privacy of their personal information held by American multinational companies (which include most cloud computing service providers), since under the US Patriot Act all information collected by American companies is subject to US government surveillance.
  • Validating compliance: While current Canadian law requires privacy impact assessments for all initiatives handling personal information, the content of these assessments is defined only in terms of compliance with general principles. The pending EU legislation, on the other hand, defines very specific criteria for privacy impact assessments. Similarly, while North American laws require only that organizations create risk mitigation plans, the EU Regulation makes corporate rules and policies binding, and through auditing and monitoring holds organizations accountable for their publicly and internally published policies.
  • Data breaches: In line with a greater focus on privacy risk management and enforcement, the new EU Regulation will require companies (inside or outside Europe) holding information pertaining to EU citizens to notify regulators of a data breach without undue delay and, where feasible, within 72 hours, and to notify affected individuals without undue delay where the breach poses a high risk to them, for instance of identity theft, humiliation, or damage to reputation. North American laws only mandate notifying local regulators of breaches at the company’s earliest convenience, which in practice means within two or three months, and notifying individuals within a similar time frame if there is a risk of harm to individuals as a result of the breach.

Under the new EU legislation, fines for large data breaches will be a proportion of the company’s worldwide annual revenue: up to 4% for the most serious violations, and up to 2% for lesser ones. Most North American laws define a set amount for fines, averaging a few hundred thousand dollars, which is insignificant for large companies. For companies to take privacy seriously, fines for violations must be set as a proportion of revenue.

  • Individual and collective right of action: In Canada, as in the US, citizens have limited direct recourse: complaints can generally only be launched through the provincial or federal privacy commissioner. This makes it much more difficult to launch class action suits and otherwise advocate for privacy as a citizen collective. The new EU Regulation will allow individual citizens to exercise their right to protect their personal data, including the right to be removed from databases and the right to transfer their data elsewhere. Citizens can appeal individually or through any agency, organization, or association that works to protect their rights and interests. While North American laws do not offer any specific recourse, the pending EU Regulation guarantees the right to compensation for damages in the case of a privacy breach involving a single or multiple data custodians.
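The case made above for proportional fines can be illustrated with quick arithmetic. The revenue and fine figures below are hypothetical, using the 2% proportional rate as an example:

```python
revenue = 10_000_000_000            # hypothetical multinational: $10B annual revenue
fixed_fine = 500_000                # illustrative fixed statutory fine, North American style
proportional_fine = 0.02 * revenue  # 2% of revenue, in line with the EU model

# A fixed fine is a rounding error for a large company...
print(f"fixed fine as share of revenue: {fixed_fine / revenue:.3%}")  # 0.005%
# ...while a proportional fine scales with the business itself.
print(f"proportional fine: ${proportional_fine:,.0f}")                # $200,000,000
```

For the hypothetical firm, the fixed fine is five thousandths of one percent of revenue, while the proportional fine is four hundred times larger, which is the kind of difference that changes boardroom incentives.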

 

Since the EU Court of Justice struck down the former EU/US “Safe Harbor” agreement, Canadian companies with transnational business interests wishing to avoid legal complexities would be well advised to bring their privacy policies in line with EU standards.

 

Personal Content Privacy

Personality rights are an evolving field in Canadian jurisprudence. The provinces of British Columbia, Manitoba, Newfoundland and Labrador, and Saskatchewan have enacted privacy legislation dealing with personality rights, and Canadian common law also recognizes a limited right to personality. Such rights can also be found in the Civil Code of Quebec. Recent technologies create new possibilities for recording audio and video; strengthening and expanding such legislation will help keep privacy protection in step with these technological advances.

Solutions

The OPC discussion paper on consent and privacy suggests a wide variety of potential solutions to enhance or replace consent, and strengthen governance and enforcement. Implementing many of these solutions would require close collaboration between regulators and technology developers, illustrating the value of partnerships between government, industry, and technological innovators. We will review and expand upon a number of the proposed solutions, with particular attention to how emerging technologies could support privacy.

 

1.      Enhancing Consent

 

Strategies for enhancing consent focus on improving individuals’ ability to exercise meaningful consent, for example, by improving the delivery of information about privacy and simplifying the process of setting privacy preferences.

 

Greater transparency in privacy policies and notices

The OPC paper cites the “transparency paradox” of privacy notification: comprehensive privacy policies are too long and complex for most users to read and understand, while simple and concise notices do not allow users fully to understand data flows. As the paper suggests, layered privacy policies, data maps and infographics, and notices at key points in the user experience are helpful. “Privacy icons,” however, are unlikely to be understood by most users.

 

Managing privacy preferences across services

The concept of managing privacy preferences across services by “tagging” data is an extremely valuable solution. The “eXtensible Access Control Markup Language” (XACML) is an especially helpful tool for this. Applications can be built around this technology to allow an individual to manage, maintain, track, and destroy documents, images, audio, or content in general even after it is released through email or posted on the Internet. A public register overseen by the OPC, modeled after “do not call” lists, could be created to manage user preferences and audit actual data usage to ensure it is consistent with preferences.
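To illustrate the tagging concept, the sketch below shows how a preference tag attached to a piece of content could drive a Permit/Deny decision in the XACML style, including withdrawal after release. This is plain Python rather than actual XACML syntax, and the class, purpose names, and fields are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TaggedItem:
    """A piece of personal content with an attached privacy-preference tag."""
    content: str
    owner: str
    allowed_purposes: set = field(default_factory=set)  # e.g. {"personal-sharing"}
    revoked: bool = False  # the owner may withdraw consent even after release

def evaluate(item: TaggedItem, requester_purpose: str) -> str:
    """XACML-style decision: Permit only if the tag allows the stated purpose."""
    if item.revoked:
        return "Deny"
    return "Permit" if requester_purpose in item.allowed_purposes else "Deny"

photo = TaggedItem("vacation.jpg", "alice", {"personal-sharing"})
print(evaluate(photo, "personal-sharing"))  # Permit
print(evaluate(photo, "ad-targeting"))      # Deny
photo.revoked = True                        # the owner withdraws the item
print(evaluate(photo, "personal-sharing"))  # Deny
```

In a real XACML deployment the same logic would be expressed as declarative policies evaluated by a policy decision point, so that the preference travels with the data rather than living in any one application.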

 

Technology-specific safeguards

The OPC paper’s suggestions regarding potential solutions to strengthen consent for “Internet of Things” devices and services are valuable.

 

In addition to such built-in mechanisms, we advocate the broader distribution of defensive online security tools. Programs that block tracking software tend to be used by the technologically savvy rather than the average computer user. Such technologies need to become more visible and easier to use, perhaps bundled with other defensive tools such as anti-virus programs.

 

Privacy as a default setting (Privacy by Design)

While many of the strategies identified in discussions of “Privacy by Design” are valuable, this umbrella term has been abused by many companies seeking to market their products. There are not yet any detailed criteria by which to evaluate whether an initiative is or is not aligned with the “Privacy by Design” concept. We advise against the use of this term as it has been overextended and is beginning to be discredited by colleagues in the privacy space.

 

Alternatives to Consent

Several strategies are proposed by the OPC to address situations where informed consent may not be practicable. De-identification aims to protect individual privacy by concealing identities. The concepts of “no-go zones” and “legitimate business interests” propose new guidelines for the ethical use of personal data, particularly using big data strategies, without consent.

 

De-identification

A variety of sophisticated new data liberation technologies allow users access to data while masking or erasing the identity of the data source, utilizing de-identification techniques such as tokenization or anonymization. Optimally used with automated risk analysis tools, de-identification allows both ongoing utilization of data and protection of individual privacy.
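As one concrete example of such a technique, keyed tokenization replaces a direct identifier with a stable pseudonym, so records remain linkable for analysis without exposing the underlying identity. The sketch below is a minimal illustration; the key, field names, and record are assumptions, not a production design, and a real deployment would pair this with risk analysis of the remaining quasi-identifiers:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would come from managed key storage.
SECRET_KEY = b"replace-with-a-managed-secret"

def tokenize(identifier: str) -> str:
    """Replace a direct identifier with a stable, one-way token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# Illustrative record; the field names are assumptions for this sketch.
record = {"name": "Jane Doe", "health_card": "1234-567-890", "diagnosis": "asthma"}
deidentified = {
    "patient_token": tokenize(record["health_card"]),  # same input -> same token, so records stay linkable
    "diagnosis": record["diagnosis"],                  # analytic value is retained
}
```

A keyed HMAC is used rather than a plain hash because direct identifiers are drawn from small, guessable spaces: an unkeyed hash of a health card number could be reversed by brute force, while the keyed token cannot be reproduced without the secret.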

 

De-identification does have limits at present. Most current technologies focus on the protection of text records. Given the proliferation of recording technologies (such as smartphone cameras, Google Glass, or drones), future privacy-bolstering technologies will need to adapt to different kinds of content, and to an individual’s rights therein. For example:

  • Video privacy: Does an individual consent to be photographed or filmed? If not, privacy-bolstering technology could allow the image to be masked or erased.
  • Audio privacy: Does an individual consent to be recorded? If not, privacy-bolstering technology could allow the relevant part of the recording to be masked or erased.

 

Beyond technological limitations, however, there are certain ethical discussions about the use of de-identified data that need to take place. Effective de-identification may completely conceal individual identities, but the ways in which data is used have broad impacts on society. Conclusions drawn on the basis of big data analytics affect marketing decisions, media coverage, and corporate and government policy. Apart from the question of individual privacy, we suggest that big data should be seen as a common good. Just as corporations need a “social license” to exploit publicly owned natural resources, they should also be required to engage in meaningful public consultation about the uses of big data. We will discuss this concept further in relation to governance solutions.

 

The other solutions proposed by the OPC paper as alternatives to consent, “no-go zones” and “legitimate business interests,” offer some helpful guidelines for the appropriate and ethical use of big data. However, as technological innovation continues, new ethical issues not addressed by these criteria will arise. Ultimately, decisions about the appropriate uses of big data can best be made through dynamic and ongoing public consultation.

 

Governance

Several of the governance solutions proposed in the OPC paper have merit. Sectoral codes of practice are a valuable means by which the OPC could guide the implementation of privacy laws in specific contexts. However, we would caution against reliance on privacy trustmarks developed and operated by industry bodies, which may provide limited accountability. To name one concerning example, TRUSTe, whose trustmark is cited in the OPC paper, was charged by the US Federal Trade Commission in 2014 with failing to conduct annual re-certifications of companies holding TRUSTe seals.[1] Such failures undermine confidence in industry-run certification.

 

The OPC paper’s suggestion of ethical assessments as a means to govern the use of big data is an intriguing one. Big data offers many opportunities for market research and social analysis, but as previously mentioned, these can raise ethical concerns. For example, if a statistic shows a particular demographic to be more susceptible to a given disease, or to have a higher crime rate, that information could be used by insurance companies to penalize consumers. In academic contexts, research studies must be cleared by a Research Ethics Board, which evaluates each study’s potential impacts on individuals or social groups; as yet, however, there are no similar bodies to govern the use of data in the private sector.

 

As suggested earlier, big data can be considered as a common good which should be exploited only with a “social license” obtained through public consultation. One means to support this would be an official, dynamic, and easily accessible online forum for individuals to complain about how their data is being used. Rather than focusing exclusively on individual formal complaints, the OPC could use such a forum as a source of information about entities that may be misusing personal data and should be investigated. Another means to support public input into the uses of big data would be to enact mechanisms that allow individuals to “vote with their feet”: as discussed earlier, “tagging” could in future make it possible for individuals not only to attach their privacy preferences to their personal data, but also to withdraw their data from projects or services to which they object. This concept is similar to the “right to erasure” or “right to be forgotten” provided by EU data protection legislation.

 

Enforcement models

In comparison to both the US and EU models, privacy enforcement in Canada is severely limited. In the EU, data protection authorities audit organizations proactively and require consultation about high-risk initiatives. Data protection authorities have the power to conduct investigations and will soon be able to impose penalties of up to 20 million Euros or 4% of an organization’s worldwide annual turnover. In the US, despite relatively weak privacy laws, the Federal Trade Commission regularly uses its enforcement powers to conduct audits and investigations, lay charges, and impose fines. In contrast, the OPC’s power to investigate complaints, publish findings, and make recommendations seems to be insufficient to motivate industry compliance with privacy laws.

 

To make privacy compliance a corporate priority, the OPC should have the power to engage in proactive enforcement, to hold organizations accountable for their publicly and internally published policies through auditing and monitoring, and to impose substantial penalties for noncompliance.

Conclusion

Privacy and informed consent are highly valued by Canadians, and yet recent technological innovations have done more to erode privacy than to protect it. Corporations and technological entrepreneurs tend to see privacy as a liability rather than an opportunity. Privacy regulators have a key role to play in creating strong incentives for investment in privacy. On the one hand, regulators can help industry leaders to understand the economic benefits of investing in privacy in a highly competitive market. On the other, they can work towards a governance and enforcement model with teeth, to ensure that corporations face real consequences for failing to protect privacy.

 

Partnerships between regulators, corporations and technological innovators have the potential to foster creative solutions to dilemmas around privacy and consent. Sophisticated new tools for de-identification and anonymization are making it possible to enjoy the benefits of big data analytics with virtually no risk to individual privacy. Emerging technologies could soon give individuals greater control over their personal data online than ever before. Big data and privacy need not stand in opposition to each other.

 

The OPC has a unique opportunity at this point in time to shift conversations about privacy. First of all, there are certain public conversations that need to occur about the ethical uses of big data. The OPC is the logical entity to empower individuals to voice their opinions about how their personal data is being used and enforce their personal privacy preferences. Secondly, an expansion of privacy law, towards a strong governance and enforcement model aligned with the EU approach, could immediately make privacy a top priority for corporations. The OPC would then have a powerful opportunity to promote corporate investment in privacy-protective technological innovation and comprehensive privacy implementation. Innovations of the past couple of decades have eroded meaningful consent, and it is time to reaffirm corporations’ responsibility to protect Canadians’ personal data and manage it in accordance with their wishes.

[1] https://www.ftc.gov/news-events/press-releases/2014/11/truste-settles-ftc-charges-it-deceived-consumers-through-its

Dr. Waël Hassan: Response to OPC Consent Consultation was first posted on November 24, 2016 at 1:52 pm on KI Design Magazine.
