
Maturing the Privacy Impact Assessment

January 28, 2022 by Constantine Karbaliotis Leave a Comment

Privacy Impact Assessments (PIAs) have not changed dramatically over the past 20 years or so, or at least the approach to them hasn’t.

Whether the starting point is a Word or Excel template or [one hopes] actual technology that supports the process, a PIA involves a group of people in the organization sitting around a [virtual] table to assess the risks and identify mitigations, before ultimately presenting the assessment for sign-off by [one hopes] someone at the right level to accept the residual risk.

What’s wrong with this, you might ask? It is certainly preferable to doing nothing, and in fact is a requirement for any privacy program worth its salt. It is also, increasingly, a legal requirement in many jurisdictions.

The problem is that this approach to PIAs creates a privacy echo chamber. Whether through meetings, or information gathering via technology that supports doing PIAs, it involves a group of internal employees assessing privacy risk, and ultimately accepting mitigations for a proposed use of technology or data. Inevitably, those involved will have similar viewpoints and see projects in a similar light.

So now for a revolutionary idea: why don’t we ask the people whose data we are using what they think about what we are doing with it?

Stay with me. Regulators have long signalled that surveys, focus groups, and other ways of gathering stakeholder input go a long way toward demonstrating an organization’s commitment to and accountability for privacy. So there is value in keeping your regulator happy, and this can mitigate the fallout should something unexpected or untoward occur.

What is the value to the organization? Well, these are your customers, your employees, your patients, your citizens. If you only look at PIAs in the echo chamber, you may miss how an average stakeholder reacts, and potentially miss the “that’s creepy” reaction. Increasingly, we are aware of how culture affects how we view privacy. For example, if the people assessing the risk of a geo-location app are all white males, they may fail to recognize how some data collection reinforces discriminatory pricing related to the area one lives in, or creates a potential risk to other groups, like the app being misused to stalk women.

Moreover, by explaining the good thing you are trying to accomplish, such as a new service, or greater convenience, you might well get suggestions from stakeholders as to how to reach that goal better, and without that potential “ick” factor. This might be through better notice or transparency, or data minimization.

And from the point of view of defending your choices later, it would seem a great insurance policy. So long as you have not failed to disclose something essential, and have approached stakeholder involvement in good faith, documented stakeholder engagement can help demonstrate that your practices were within individuals’ reasonable expectations, should you ever be called upon to defend them.

What are the downsides? Well, for an organization, it could be scary, because it might mean hearing something you don’t want to hear – that the project is too invasive, too creepy. If the input is negative, and the remediation is too difficult or expensive, it might well kill the project. However, it is better to hear this early on than to find out later through complaints, a regulator, or negative press. More likely, you will find out that the way you have presented or explained the project leaves people suspicious or concerned.

What kind of project would benefit from this approach?

This could vary widely. Let’s use a topical example: COVID screening of employees. You might want to engage with some employees, or union representatives, with the goal of finding the least intrusive way to ensure a safe workplace. Should screening take place behind a privacy screen, so we don’t embarrass anyone who has to be sent home? Who should know if someone is sent home? You might discover better, more privacy-minded ways to accomplish this from people on the front line.

Another example is smart city initiatives, where you might survey how your constituents would respond to a rollout of publicly accessible Wi-Fi. Would you want it funded by advertising, and if so, would you collect any user profile data to offset costs? What about law enforcement access to individuals’ connections – can you commit to requiring a court order to respond to law enforcement’s requests? What kind of notice or information will work best to inform users in either case?

A survey or focus group can provide real insight into the views of the individuals whose data you collect, use, and disclose. Ironically, the tools for obtaining this kind of input have always been in the hands of marketing, who are best equipped to gather such feedback. Despite any initial concerns stakeholders might have, marketing would be the best ally for effectively soliciting input and might well be the ones most impacted (astounded, even?) by consumers’ responses. That, after all, is their business.

As we continue to innovate in how data is collected, used, and disclosed, it becomes ever more important that we innovate in how we conduct PIAs. Unquestionably, there will be complex areas where surveys and focus groups with the public won’t work well. But the same techniques could be used with experts who understand complex topics; AI and data analytics are obvious examples. Specialists in ethics, medicine, data analysis, algorithms, or other fields could be engaged to better understand and identify risks, evaluate approaches, and approve [or not] mitigations to risk. Publishing the results – transparency – would again help in demonstrating your commitment to privacy. The use of expert panels also helps deal with the thorny question of demonstrating reasonableness or legitimate interests where the processing activity is complex and informed consent is difficult to obtain.

Along with innovative technologies, innovative approaches to understanding and mitigating privacy risks can turn the PIA from a form-filling [and often dangerous rubber-stamping] exercise into something that inspires confidence with your stakeholders, your organization, and ultimately, your regulators.

Filed Under: Privacy, Privacy Impact Assessment

Tips for simplifying privacy communications

June 24, 2021 by Anne-Marie Hayden Leave a Comment

Guidance on consent often emphasizes that notices need to be in plain, easy-to-understand language for the consent to be meaningful. The thing is, the guidance doesn’t often tell you exactly how to do that.

In a recent speech to records and information management professionals, I offered a few concrete tips on how to improve your privacy communications.

Here are my top 10 takeaways:

  1. You’re almost always writing for the web, so apply web-writing best practices when drafting privacy notices and policies.
  2. Keep your sentences short, with one idea each.
  3. Use action words and avoid the passive voice.
  4. Eliminate jargon, acronyms and abbreviations.
  5. Use sub-heads – they make your text scannable.
  6. Use bullets and numbered lists instead of paragraphs.
  7. Lead with the “top tasks” – the main reason people go to that page.
  8. Use layers to point to more in-depth information.
  9. Ask someone who’s less familiar with your subject matter to review.
  10. Run your content through the readability and accessibility tests available in most word processors (see the sketch after this list for a scripted alternative).
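
A note on point 10: the readability statistics built into word processors are the easiest route, but if you would rather script the check (for example, to test every notice on a site in one pass), the short Python sketch below computes one common metric, the Flesch Reading Ease score. It is only a rough gauge, the syllable counter is deliberately naive, and the sample notice is invented for illustration; still, scores in the 60–70 range generally correspond to the kind of plain language these tips are aiming for.

import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count runs of consecutive vowels (good enough for a quick check).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Higher scores mean easier reading; roughly 60-70 is plain English.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

# Illustrative notice only, not taken from any real policy.
notice = ("We collect your email address to send you order updates. "
          "You can opt out at any time using the link in any message.")
print(f"Reading ease: {flesch_reading_ease(notice):.1f}")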

Applying these best practices can help your organization be clearer and more transparent about its privacy practices. Even easier… you can reach out to us, at nNovation LLP, for help with it.

Filed Under: Communications, Privacy

The risks and rewards of CPOs playing a role in communications

April 22, 2021 by Anne-Marie Hayden Leave a Comment

I participated on a panel recently, at an event organized by Wirewheel, alongside some other distinguished folks, to explore the idea of the chief privacy officer as spokesperson. The event was held under the Chatham House Rule, so I can’t provide a play-by-play of the conversation, but I did want to take a moment to share some thoughts I had on this topic, leading up to and during the event.

Let’s start by considering a few things we know: that people care about their privacy – otherwise, there wouldn’t be laws and we wouldn’t be in business; that more and more, people want to do business with organizations they feel are ethical; that doing privacy well builds customer trust; and that regardless of the role consent may play down the road, the need for openness and transparency isn’t going away anytime soon. 

I believe that with the right foundation, making the CPO – the subject-matter expert on and champion for privacy – more proactively visible in an organization’s communications can be a way to demonstrate more transparency and accountability, and that it has the potential to boost an organization’s credibility and help it stand out as a privacy leader in the marketplace.

What does it take for a CPO to delve into the wonderful world of communications? Granted, every situation is different. It certainly depends on the maturity of their privacy framework and whether they’re confident the organization’s privacy program is in good shape. If not, we suggest you consider getting your privacy “house” in order first – nNovation can certainly help. It also depends on whether collecting and protecting personal information is a relevant aspect of the business model, on the company’s communications policy and whether it’s open to a more decentralized approach, and on the CPO’s level of knowledge on the subject. And it certainly takes collaboration with communications experts – within or outside the organization – as well as proper planning, training and practice.

What form can this take? I’m not suggesting that the CPO needs to get out there tomorrow to pitch and grant interviews to the Globe and Mail on their company’s privacy practices, or that the CPO should take over the comms function – we need to recognize that everyone has their role. What I am suggesting, however, is that with the right planning, the CPO may be the best person, as the subject-matter expert, to respond to certain media requests, pen a blog post, submit an article to a trade publication, participate in a podcast, give a speech on the topic and even play a role in privacy-focused marketing. Exploring the potential of these activities with the company’s comms team can help ensure the CPO is well prepared, with clear key messages, strategies for responding to challenging questions, and an understanding of best practices and things to avoid. In this world, let’s face it: almost everyone is a communicator to a degree. These are very transferable skills, and in my experience it’s best to get your feet wet rather than wait until something goes wrong to try to develop these muscles.

What about when things do go wrong? There can also be a role for the CPO in public communications, in the unfortunate event of a privacy breach. If the organization has done a risk assessment, collects personal information and has identified cybersecurity as a risk, the company likely has mitigation strategies in place. It’s important to remember that these should go beyond the technological ones. Breach preparedness is key to successfully weathering a breach, reputation intact. Building a communications strategy directly into that breach response plan is essential. Part of the equation is ensuring subject matter experts, like the CPO, are ready, willing and, most of all, able. Sure, there are certain occasions when the CEO will have to be the spokesperson and there are others when it might make sense for corporate communications to handle things. I would argue that in an era of openness and transparency, when privacy is paramount, considering a more proactive role for the CPO in communications and a public role in the event of an incident is a way to help establish trust and credibility – in either scenario. Are there risks associated with the CPO being out there? Sure, but I’d encourage organizations to weigh those against the advantages, as well as the risks of not doing so.

I have spent more than 25 years as a communicator and 18 years in privacy. After a brief time in the culture and heritage sector, I came back to privacy not only because of the great importance of the issue, but because I think deeper links can be made between these two often separate ideas – communications and privacy – and I want to play a role in that. Privacy is sometimes seen as standing in the way of certain communications and marketing efforts. I’d like to concentrate on updating that narrative. If you look closely at the privacy principles, you’ll see that about half of them can be significantly advanced through effective communications. We can help organizations understand and implement communications strategies that will enable them to better comply with privacy requirements on the front end, better respond publicly to privacy incidents when they do occur and ultimately contribute to better positioning them as the privacy-forward organizations they are… and can be.

Filed Under: Communications, Privacy

The Digital Charter Implementation Act: A Clear Plan for Change

November 19, 2020 by Shaun Brown Leave a Comment

The Canadian government tabled draft legislation on November 17 that would make significant changes to the federal private sector privacy landscape. Bill C-11, the Digital Charter Implementation Act (DCIA), would replace Part 1 of the Personal Information Protection and Electronic Documents Act with the Consumer Privacy Protection Act (CPPA), create the Personal Information and Data Protection Tribunal Act (PIDPTA), and make minor amendments to several other laws.

The CPPA encapsulates the most fundamental aspects of Part 1 of PIPEDA, as it remains focused on providing individuals with control over how their personal information is collected, used and disclosed by organizations in the course of commercial activity. However, there are several important changes in both form and substance.

First, federal privacy law would exist in a standalone act, no longer bound to other, unrelated parts dealing with electronic documents. And, although the CPPA remains rooted in the ten privacy principles, unlike PIPEDA, it does not incorporate wholesale and build on the Canadian Standards Association Model Code for the Protection of Personal Information (which was an unusual way to draft a law).

In terms of substance, here are some of the most important changes:

  • Privacy management program. Organizations would be required to maintain a privacy management program setting out policies and procedures the organization takes to protect personal information, deal with privacy complaints, train personnel, and develop materials to explain an organization’s policies, practices and procedures. The Office of the Privacy Commissioner (OPC) would be authorized to demand access to these policies at any time.
  • Appropriateness. The CPPA incorporates and builds on the “reasonable purposes” clause of PIPEDA with a more comprehensive standard for when it is appropriate to process personal information.
  • Exceptions for business activities. The CPPA defines a list of “business activities” for which an organization can process personal information without consent.
  • Transfers to service providers. The CPPA would firmly establish that knowledge and consent are not required to transfer personal information to a service provider. It also helpfully clarifies when an organization is considered to have control over personal information.
  • De-identified information. The CPPA defines circumstances in which de-identified information can be processed.
  • Automated decision-making. If an organization uses an “automated decision system” to make a prediction, recommendation or decision about a person, the organization would be required to, on request, explain the prediction, recommendation or decision, and how the personal information used to make the prediction, recommendation or decision was obtained.
  • Data mobility. Individuals would have the right to transfer their data between organizations if those organizations are subject to a “data mobility framework” defined in regulation.
  • Disposal of data. The CPPA would provide individuals with an explicit right to request the deletion of their personal information.
  • Revised OPC powers. The OPC would have the authority to make orders requiring compliance with the Act and to recommend penalties.
  • Tribunal. The new Personal Information and Data Protection Tribunal would hear appeals from OPC orders. It would also have the ability to impose penalties, if recommended by the OPC.
  • Penalties. The CPPA provides for penalties of up to 3% of global revenue or C$10 million for most contraventions, and up to 5% of global revenue or C$25 million for certain offences.
  • Codes of practice and certification. The CPPA would allow for the creation of codes of practice and certification programs to facilitate compliance with the Act, which would be subject to approval by the OPC.
  • Private right of action. Individuals affected by contraventions of the law would have a right to sue for actual damages suffered. This right would only be available following an OPC finding that a contravention had occurred, and only if that finding is not successfully appealed to the Tribunal.

The DCIA would create the most significant change in Canadian privacy legislation in 20 years, aligning federal private sector privacy law – which applies throughout the country except in Alberta, British Columbia and Quebec – more closely with the EU General Data Protection Regulation. However, Bill C-11 still has a long road to travel before it becomes law, which is far from certain. The federal legislative process tends to move very slowly, and with a minority government in power, a vote of non-confidence in Parliament could trigger an election and a new government, which might prefer a different route.

Filed Under: Legislation, PIPEDA, Privacy, Privacy Commissioner of Canada

Ontario Launches Consultation Process on Privacy in the Private Sector

August 19, 2020 by Constantine Karbaliotis Leave a Comment

It seems that the winds of change have come to the privacy landscape in Canada. Ontario’s provincial government announced on August 13, 2020 its intention to seek public input on ‘creating a legislative framework for privacy in the province’s private sector.’

Citing growing privacy concerns that have been amplified during the pandemic by increased reliance on data gathering and digital platforms, the consultation will focus on increasing transparency, enhancing consent and enshrining opt-ins for secondary uses of data, privacy protections for de-identified or derived data, a right to deletion or erasure, data portability, requirements for de-identification, and increasing the enforcement powers of Ontario’s privacy commissioners.


There are two notable areas for those who have been following Canadian privacy legislative reform. The first is the proposed expansion to non-profit and non-commercial organizations, which would notably catch charities, trade unions, and political parties (significant in light of the concerns arising out of the Cambridge Analytica case, in which only British Columbia could assert any authority over political parties).

The second intriguing area is the notion of enabling data trusts for data sharing. This concept became important during the abortive Sidewalk Labs project in Toronto, where data trusts emerged as a way to address the risks associated with the large-scale collection of data in the smart city project. The data trust became an important vehicle to address concerns over data sovereignty and the policy objective of deriving public benefit from private data.

The significance of Canada’s largest province and economy undertaking privacy legislation should not be underestimated. Federal privacy law currently applies to commercial activities in Ontario. The only Ontario law recognized as substantially similar by the federal government is PHIPA, the Personal Health Information Protection Act, which applies only to the protection of health data in the health sector. The federal law, PIPEDA, does not govern employee data unless the employer is directly under federal jurisdiction (such as banks and airlines), and that gap has become noticeable during the pandemic. And there is no legislation addressing the significant non-profit sector.

In addition to these points, Canada’s federal law is itself under revision, both to address the significant changes that have taken place since it was enacted over twenty years ago and to meet the challenge our legislative regime will undoubtedly face in retaining its adequacy status with the European Union under the GDPR. One critical limitation of that adequacy finding has always been that it extends only to data governed by PIPEDA, and the ‘elephant in the room’ has always been the significant amount of data and activity under provincial jurisdiction.

Another key factor is Quebec’s tabling of legislation in June, introducing an explicitly GDPR-like framework. In my commentary on that bill, I wondered whether it would affect or alter the course of the federal government’s proposed changes by ‘raising the game.’ Now, with Ontario entering the discussion on the future of our privacy regime, certain elements I raised previously have become more urgent to address:

  • Canadians, and the Canadian economy, are not well served by a patchwork of different laws. We have been fortunate that because of our principles-based laws, we have largely ended up at the same place in terms of privacy values and results. This is true even between Quebec, which is a civil law jurisdiction, and the ‘rest of Canada’, which is common law, and between provincial and federal levels. Canadian businesses should not face the challenges that our friends in the US do in trying to comply with inconsistent laws.
  • It is in the interests of consistency and business predictability that we maintain a common market focus in our data protection laws. The GDPR itself has as its goal the free flow of data between EU member states. It is also worth noting that the IMF has estimated that 4% of our GDP is ‘inhibited’ by internal trade barriers, an issue Canada’s Agreement on Internal Trade aims to address. We want to avoid creating new barriers to trade within Canada.
  • Again, we cannot neglect our adequacy discussions with the EU; and as I have pointed out before, data goes with the trade. The original reason for PIPEDA was to facilitate and maintain trade relationships with the EU, and now more than ever, with a devastating economic contraction, our trade relationships must be maintained and strengthened externally and internally. We want to ensure that the EU is confident in exchanging data with Canada, all of Canada, and the Schrems II decision (which Abigail and I have discussed here) undoubtedly signals that we have to rise to the challenge.
  • While we need to address the business elements in privacy reform, it is also worth noting that our legal and constitutional framework has increasingly recognized privacy as a human right, through decisions of the Supreme Court of Canada and other courts. The Canadian genius has always been to find a balance that supports business without sacrificing those intrinsic values. This consultation is an opportunity to ensure that we promote business interests in data-driven innovation without creating an economy of digital have-nots, and that the goals of supporting the economy remain consistent with personal control over the uses of data.

What an exciting time to be in privacy in Canada! There is an opportunity now to influence the future, and to build a framework that provides an integrated and consistent approach from sea to sea to sea; one that supports both our desire to remain in control of our data and our data-driven economy. Canada, Ontario and Quebec now have the opportunity to lead in re-establishing this country as a global privacy leader, and to make privacy Canada’s competitive differentiator. The consultation closes on October 1, 2020.

Filed Under: Legislation, Ontario, Privacy

Facebook’s $9M Settlement with Canada’s Competition Bureau makes history

May 21, 2020 by Constantine Karbaliotis Leave a Comment

Following Facebook’s challenge to the Office of the Privacy Commissioner (OPC) report arising from the Cambridge Analytica scandal, the fall-out for Facebook in Canada continues. While not at the scale of fines faced in the US and elsewhere, the involvement of a non-traditional regulator for privacy sends several signals to Canadian and multinational organizations.

In a news release issued today, Competition Bureau Canada announced that Facebook has agreed to pay a penalty of $9 million to settle the Bureau’s investigation into Facebook’s privacy practices (according to the press release; at this writing, the settlement itself is not yet available from the Competition Tribunal).

The Bureau’s investigation concluded that Facebook had given the impression that users could control who could see and access their data, while failing to limit the sharing of users’ personal information with third-party developers. Further, third-party developers could also access users’ friends’ personal information after users installed third-party applications.

The Bureau’s jurisdiction is based on the prohibition against false or misleading claims about products or services under the Competition Act. This is quite similar to section 5 of the US Federal Trade Commission Act, which prohibits “unfair or deceptive acts or practices in or affecting commerce“, and is commonly applied by the US Federal Trade Commission to enforce so-called “privacy promises”.

This is new territory for Canada, however. Given the OPC’s challenges enforcing the Personal Information Protection and Electronic Documents Act (PIPEDA), this settlement is a watershed in that privacy-related matters could increasingly be subject to oversight by a regulator who has significant enforcement powers.

The interaction with the OPC’s own enforcement, which includes its first application to the Federal Court to seek remedies against Facebook, deserves comment. The remedies being sought by the OPC mirror some of the remedies obtained by the Competition Bureau, in requiring Facebook to cease making these representations. If the Competition Bureau’s enforcement reflects synchronization with the views of the OPC, as an expert tribunal, and particularly with the Guidelines for obtaining meaningful consent, jointly developed by the OPC and both British Columbia’s and Alberta’s Commissioners, then this enforcement action has suddenly put teeth into those guidelines and the privacy commissioners’ views. It also raises the question of what Facebook’s Federal Court application to set aside the OPC’s report is intended to accomplish, given what it has now agreed to.

Another observation arises from the involvement of a ‘non-traditional’ regulator: increasingly in Canada, other regulators are stepping into the privacy fray. Some examples: the Office of the Superintendent of Financial Institutions (OSFI), which supervises Canada’s federally regulated financial institutions, issued a notice requirement in January 2019 for cyber and privacy breaches. IIROC, the Investment Industry Regulatory Organization of Canada, also issued a rule in November 2019 to require mandatory reporting of cybersecurity incidents. And the Ontario Energy Board, which governs utilities in Ontario, has for several years made cybersecurity and privacy obligations a requirement of licensing.

While it is natural for specialized regulators to have a vested interest in security and privacy, given their roles in regulating markets and ensuring stability, this trend risks creating a patchwork approach to privacy unless the principles upon which regulation is based are consistent.

The Government’s Digital Charter, announced in May last year, is intended not only to chart a course for PIPEDA reform, but also to provide a framework and direction for consistency with the provinces and regulatory agencies. With the COVID-19 crisis still underway, it is uncertain when reform will come, but the Competition Bureau settlement suggests that the enforcement of privacy obligations in Canada is still evolving.

Filed Under: Competition Act, Privacy, Privacy Commissioner of Canada
