
nNovation LLP

Small Canadian regulatory law firm with a big presence


PIPEDA

Federal Court Rules in Favour of OPC in Google Reference

July 19, 2021 by Dustin Moores

On July 8th, the Federal Court ruled in the Office of the Privacy Commissioner’s (OPC) favour in a decision that touched on PIPEDA’s application to search engines and on what has become known in privacy law as “the right to be forgotten.” The decision brings clarity on whether search engines like Google, and potentially businesses with similar business models, will be subject to PIPEDA when they handle personal information.

The Backstory

The decision results from a reference filed by the OPC with the Federal Court involving an OPC investigation of Google. The OPC’s investigation began in 2017 after an individual complained to it that news articles Google displayed in its search results contained outdated and inaccurate information and disclosed sensitive information about him. He also complained he had endured direct harm, including physical assault, lost employment opportunities, and severe social stigma, because Google links these articles to his name in search results.

But even before the complaint was launched, the OPC began consultations on whether a right to be forgotten existed in Canada. The result was the OPC’s 2018 draft Position on Online Reputation. In that paper, the OPC stated that Canadians need better tools to help them protect their online reputation, including tools like de-indexing and source takedown. De-indexing is the process by which a webpage, image or other online resource is removed from search results when an individual’s name is entered as the search term. Source takedown means the removal of this content from the internet.

In its draft position, the OPC argues that PIPEDA applies to a search engine’s indexing of web content and displaying of search results, so search engines need to comply with PIPEDA by allowing individuals to challenge the accuracy, completeness, and currency of search results attached to their name. When an individual is successful in their challenge, the OPC argues, the search engine should de-index the inaccurate, incomplete, or outdated results.

PIPEDA also gives individuals a right to withdraw consent and requires that personal information that is no longer needed be destroyed, erased, or made anonymous. The OPC argues this gives individuals the right to remove information they have posted online. If the information was posted by someone else, the individual does not have an unqualified right to remove it, but they should be able to challenge the accuracy, completeness, and currency of the information.

Returning to the complaint, Google responded by saying that PIPEDA did not apply to its search engine because operating it was not a commercial activity within the meaning of PIPEDA (due to constitutional constraints, PIPEDA only regulates the collection, use, and disclosure of personal information in the course of commercial activities). Google also argued that even if its search engine were a commercial activity, it fell under PIPEDA’s exception for organizations that collect, use, or disclose personal information only for a “journalistic” purpose. Lastly, Google submitted that an interpretation of PIPEDA requiring it to delist lawful public content would infringe its freedom of expression as protected by the Charter of Rights and Freedoms. Without addressing Google’s constitutional argument, the OPC referred Google’s jurisdictional arguments to the Federal Court through the reference process, which allows federal offices to refer certain legal questions to the Court.

The Decision

In brief, the issues considered by the Court were:

  1. Does Google’s search engine service collect, use, or disclose personal information in the course of commercial activities when it indexes webpages and provides search results in response to a search for an individual’s name?
  2. Does Google’s search engine service involve the collection, use, or disclosure of personal information for a journalistic, and no other, purpose?

Regarding the first issue — whether Google’s search engine collects, uses, or discloses personal information — this was never really in question. Google collects personal information when its web crawlers access text on public webpages and transmit it to Google’s servers for indexing. It uses personal information to make its search engine as comprehensive and valuable as possible for users and advertisers. And Google discloses personal information through the “snippets” that appear in its search results.

Google attempted to argue, however, that because there was no evidence advertisements appeared alongside search results when searching the complainant’s name, the activity was not commercial in the usual and traditional sense. The Court disagreed. Underlining the popularity and profitability of Google’s search engine and advertising business, the Court noted that even if Google provides free services to content providers and search engine users, it has “a flagrant commercial interest in connecting these two players.” Google users provide Google with personal information when using its search service, and Google uses that information for profit. The Court went further, stating that “every component of [Google’s search engine] business model is a commercial activity as contemplated by PIPEDA.”

On the second issue, the Court found that Google’s purposes for handling personal information for its search engine are not journalistic, and “certainly not exclusively so.” The Court reached this conclusion after, among other considerations, applying the test introduced in another Federal Court decision for determining whether an activity qualifies as journalism. According to this test, an activity should qualify as journalism only where:

  • its purpose is to inform the community on issues the community values,
  • it involves an element of original production, and
  • it involves a self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation.

The Court found that none of these factors applied to Google. Google makes its information universally accessible (broader than informing a community); it does not “produce” content but only displays search results; and it makes no effort to determine the fairness or accuracy of its search results. Even if there were some journalistic purpose to Google’s activities, the Court found that its primary purpose — to index and present search results — was not journalistic.

What’s Next?

Thanks to this decision, we have more clarity on whether PIPEDA applies to search engines and similar services and the factors courts will look to when making that assessment. We also now have a concrete example of how courts will apply the journalistic activity test, as it was not fully considered in the original Federal Court decision in which it was introduced.

As for Google, unless it appeals the decision, the OPC will continue its investigation and issue a Report of Findings with recommendations likely aligned with its draft Position on Online Reputation. If Google does not implement those recommendations, the OPC could take Google to the Federal Court again in a “de novo” application. In any case, this decision will have important implications for whether Canadians will someday enjoy a legal “right to be forgotten,” and it may well be seen as the first step should that right come to be.

Filed Under: PIPEDA, Privacy Commissioner of Canada, Right to be forgotten Tagged With: google

The problem with de-identification in the Consumer Privacy Protection Act

December 15, 2020 by Shaun Brown

The recently tabled Consumer Privacy Protection Act (CPPA) would allow organizations to use and disclose de-identified information for certain purposes without consent. This makes sense, but there is a flaw: information that is de-identified according to the law is not even personal information. So privacy legislation shouldn’t apply. Yet, according to the proposed CPPA, de-identified information is personal information, excluded from only some of the CPPA’s requirements. This seems to defeat the purpose of referencing de-identification in the first place, while potentially redefining the concept of personal information.

What is de-identification?

To de-identify personal information in the CPPA means the following:

to modify personal information — or create information from personal information — by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual.

De-identified information appears to be a new category of personal information that would remain within the scope of the CPPA, although certain uses and disclosures can be made without consent. De-identified information can be used by an organization internally for research and development purposes. It can be disclosed to government institutions, health care institutions, post-secondary institutions, or other entities prescribed in regulation, for “socially beneficial purposes”.1

The CPPA does not explicitly state that de-identified information is personal information. However, this is implied, as the CPPA applies only to activities involving personal information according to the sections of the law describing its purpose and application.2 There is nothing to suggest that the law is intended to apply to de-identified information in addition to personal information.

What is personal information?

To understand the problem, it’s necessary to consider the meaning of “personal information”, defined as “information about an identifiable individual”. There are two related and overlapping lines of inquiry under this definition. The first is whether the information is “about” an individual (as opposed to, for example, an object). The second is whether an individual is “identifiable”.

In the absence of statutory guidance, courts have used different language to interpret this definition. In 2007, the Federal Court of Appeal stated that an individual is identifiable if it is “reasonable to expect” that an individual could be identified from the information alone or combined with “sources otherwise available”.3 A year later, the Federal Court of Canada adopted the standard put forward by the Privacy Commissioner of Canada: there must be a “serious possibility” of identifying an individual through the information alone or combined with “other available information”.4

More recently, the Federal Court found that “serious possibility” and “reasonable to expect” are effectively the same thing: more than mere speculation or possibility, but not probable on a balance of probabilities.5

The need for a different threshold

De-identification in the CPPA uses effectively the same threshold as personal information, but in reverse. We’ll call this the “serious possibility/reasonably foreseeable” threshold. The courts have said that information is personal if there is a serious possibility that an individual could be identified, which is equivalent to “reasonable to expect.” Under the CPPA, personal information becomes de-identified if there are no “reasonably foreseeable circumstances” in which an individual could be identified. So personal information that is de-identified under the CPPA should not be personal information according to our current understanding of personal information as interpreted by the courts. Except, in the CPPA, it is.

Here’s another way of looking at it. In our current world, information becomes personal when it rises above the threshold of serious possibility/reasonably foreseeable, as seen in figure 1 below. Yet, under the CPPA, information that is personal information becomes de-identified personal information when it crosses below the threshold of serious possibility/reasonably foreseeable, as seen in figure 2.

[Figures 1 and 2, illustrating where the serious possibility/reasonably foreseeable threshold sits, are not reproduced here.]

An obvious question is when, if ever, does personal information become non-personal? In other words, once information becomes personal and within the scope of the CPPA, is it possible to transform it so that it is outside the scope of the CPPA? Currently, information that is sufficiently de-identified to no longer qualify as personal information is not regulated under PIPEDA (even if it is not truly anonymized). The effect of the CPPA seems arbitrary. If the information had been collected in a manner that never met the threshold for what constitutes personal information, it would never be subject to the law. However, because the information was, at some point, within the scope of the law, it is permanently trapped.

Even more confusing, does this alter the definition of personal information? If so, where is the new threshold? It seems that this would have to be lower under the CPPA than it already is.

It might be argued that there is a meaningful difference between “serious possibility/reasonable to expect” and “reasonably foreseeable circumstances”. But this isn’t tenable. When comparing “serious possibility” with “reasonable to expect”, the Federal Court said that it may be “impossible” to discern a meaningful difference. There’s no way the rest of us could be expected to differentiate between “reasonable to expect” and “reasonably foreseeable”.

Even less probable is an intentional effort to expand the definition of personal information, and in turn, the scope of the law. The government would have to be more explicit about such a significant change.

Most likely, this is just a well-intentioned idea with flawed execution, one that would make the law unnecessarily confusing.

One potential solution is to modify the definition of “de-identify” by removing the reference to reasonably foreseeable circumstances, as follows:

de-identify means to modify personal information — or create information from personal information — by using technical processes to ensure that the information does not identify an individual. [struck out: or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual]

This would create a threshold for de-identified information that is clearly distinct from the definition of personal information, which would seem to accomplish the objective of including de-identified information in the CPPA.

Another option is simply to remove all references to de-identification from the law. Though perhaps not ideal, if the threshold for de-identification is not modified to distinguish it from the definition of personal information, the law would be better off without it.

Filed Under: Legislation, PIPEDA, Privacy Reform

The Digital Charter Implementation Act: A Clear Plan for Change

November 19, 2020 by Shaun Brown

The Canadian government tabled draft legislation on November 17 that would make significant changes to the federal private sector privacy landscape. Bill C-11, the Digital Charter Implementation Act (DCIA), would replace Part 1 of the Personal Information Protection and Electronic Documents Act with the Consumer Privacy Protection Act (CPPA), create the Personal Information and Data Protection Tribunal Act (PIDPTA), and make minor amendments to several other laws.

The CPPA encapsulates the most fundamental aspects of Part 1 of PIPEDA, as it remains focused on providing individuals with control over how their personal information is collected, used and disclosed by organizations in the course of commercial activity. However, there are several important changes in both form and substance.

First, federal privacy law would exist in a standalone act, no longer bound to other, unrelated parts dealing with electronic documents. And, although the CPPA remains rooted in the ten privacy principles, unlike PIPEDA, it does not incorporate wholesale and build on the Canadian Standards Association Model Code for the Protection of Personal Information (which was an unusual way to draft a law).

In terms of substance, here are some of the most important changes:

  • Privacy management program. Organizations would be required to maintain a privacy management program setting out the policies and procedures the organization has put in place to protect personal information, deal with privacy complaints, train personnel, and develop materials explaining its policies, practices and procedures. The Office of the Privacy Commissioner (OPC) would be authorized to demand access to these policies at any time.
  • Appropriateness. The CPPA incorporates and builds on the “reasonable purposes” clause of PIPEDA with a more comprehensive standard for when it is appropriate to process personal information.
  • Exceptions for business activities. The CPPA defines a list of “business activities” for which an organization can process personal information without consent.
  • Transfers to service providers. The CPPA would firmly establish that knowledge and consent are not required to transfer personal information to a service provider. It also helpfully clarifies when an organization is considered to have control over personal information.
  • De-identified information. The CPPA defines circumstances in which de-identified information can be processed.
  • Automated decision-making. If an organization uses an “automated decision system” to make a prediction, recommendation or decision about a person, the organization would be required, on request, to explain the prediction, recommendation or decision and how the personal information used to make it was obtained.
  • Data mobility. Individuals would have the right to transfer their data between organizations if those organizations are subject to a “data mobility framework” defined in regulation.
  • Disposal of data: The CPPA would provide individuals with an explicit right to request the deletion of their personal information.
  • Revised OPC powers. The OPC would have the authority to make orders requiring compliance with the Act and to recommend penalties.
  • Tribunal. The new Personal Information and Data Protection Tribunal would hear appeals from OPC orders. It would also have the ability to impose penalties, if recommended by the OPC.
  • Penalties. The CPPA provides for maximum penalties of up to 3% of global revenue or C$10 million, whichever is greater, for most contraventions, and up to 5% of global revenue or C$25 million, whichever is greater, for certain offences.
  • Codes of practice and certification. The CPPA would allow for the creation of codes of practice and certification programs to facilitate compliance with the Act, which would be subject to approval by the OPC.
  • Private right of action. Individuals affected by contraventions of the law would have a right to sue for actual damages suffered. This right would be available only after an OPC finding that a contravention occurred, and only where that finding is not successfully appealed to the Tribunal.

The DCIA would create the most significant change in Canadian privacy legislation in 20 years, aligning federal private sector privacy law – which applies throughout the country except in Alberta, British Columbia and Quebec – more closely with the EU General Data Protection Regulation. However, Bill C-11 still has a long road to travel before it becomes law, which is far from certain. The federal legislative process tends to move very slowly, and with a minority government in power, a vote of non-confidence in Parliament could trigger the election of a new government, which may prefer a different route.

Filed Under: Legislation, PIPEDA, Privacy, Privacy Commissioner of Canada

Comparing Facebook’s Settlement with Canada’s Competition Bureau with the Privacy Commissioner’s Recommendations

May 22, 2020 by Constantine Karbaliotis

Now that Facebook’s settlement with the Competition Bureau Canada (the “Settlement”) has been published, it is interesting to consider how this could impact other regulatory actions Facebook is dealing with in Canada with the federal Office of the Privacy Commissioner (OPC).

The Settlement is quite short but has some interesting implications. First, it expressly states that Facebook’s agreement does not constitute an admission of guilt under the Competition Act or any other law, so this settlement doesn’t preclude Facebook’s ability to challenge the OPC’s report, as it is currently doing, through a judicial review application or at the hearing of the OPC application to enforce its report. However, Facebook is not permitted to make any public statements that contradict the terms of the settlement agreement. The recitals state the Competition Bureau Commissioner’s conclusions, which are not admitted to, but the fact of those conclusions, and the commitments by Facebook, cannot be denied. The recitals also note Facebook’s July 2019 Consent Decree with the FTC, which brings Facebook’s compliance program into the Settlement.

The financial penalty is substantial for Canada: $9 million, plus $500K to cover the Bureau’s costs of investigation.

More interesting are the ongoing commitments. First of all, Facebook is not permitted to make any materially false or misleading statements in the future concerning the extent to which users can control access to their personal information, as explained here:

The Respondent shall not make, in connection with a Facebook product or service, any representation to the public that, taking into account its general impression as well as its literal meaning, is materially false or misleading regarding the disclosure of Personal Information, including how and the extent to which Users can control who can access the Personal Information.

Secondly, Facebook must, within 180 days, ensure its compliance program supports this commitment. Facebook is obliged to “review” the Bureau’s Corporate Compliance Program Bulletin (“Bulletin”) with the aim of aligning Facebook’s compliance program with the Bulletin. To reinforce these obligations, senior management is required to sign and acknowledge this commitment to “fully support and enforce” the compliance program. This creates the risk of personal liability, both civil and criminal, for future transgressions.

Third, there is ongoing monitoring: senior management must be provided with a copy of the settlement agreement, with a view to ensuring that Facebook responds to the Bureau on matters covered by the sections dealing with statements about user control, and senior management must acknowledge the Settlement and its terms. Any response to the Bureau must be provided within 45 days. The Settlement is binding on Facebook for 10 years.

What do “review” of and “aligning” to the Bulletin involve? The Bureau obviously has a wider remit than privacy – competition law, of course, and misleading advertising, which is how, as with the FTC, privacy statements can bring companies under its authority. The Bulletin speaks to compliance more broadly, and would include privacy programs:

  1. Management Commitment and Support
  2. Risk‑based Corporate Compliance Assessment
  3. Corporate Compliance Policies and Procedures
  4. Compliance Training and Communication
  5. Monitoring, Verification and Reporting Mechanisms
  6. Consistent Disciplinary Procedures and Incentives for Compliance
  7. Compliance Program Evaluation

Privacy Commissioners’ Recommendations

  • Implementation of measures to obtain meaningful consent that clearly informs users of consequences in a timely manner
  • Because of Facebook’s failure to accept accountability, the OPC and the BC Commissioner recommended the ability to conduct audits of Facebook’s privacy policies and practices

Competition Bureau Settlement

  • While expressed in the negative, the Settlement effectively requires meaningful consent
  • The Bureau’s ability to monitor for 10 years how Facebook complies with the commitment noted above gives it considerable insight into how Facebook obtains data from users and allows it to monitor Facebook’s practices.

It will be interesting to see how the Facebook challenge to the OPC’s report continues, and whether in fact it will be meaningful in light of this settlement.

For businesses operating in Canada, the settlement signals a new and material enforcement player in the area of privacy: the Competition Bureau Canada. It has traditionally been hard to get management attention given the limitations on our Commissioners’ enforcement powers, limitations the Competition Bureau does not suffer from. The Settlement also gives privacy officers and privacy program designers an additional resource/checklist against which to measure the effectiveness of their programs, and a common framework with which to integrate privacy into general compliance programs.

Filed Under: Competition Act, PIPEDA, Privacy Commissioner of Canada, Uncategorized

Court agrees class actions necessary to enforce PIPEDA

October 28, 2019 by Timothy M. Banks

The Ontario Superior Court of Justice recently approved a class action settlement in a case that arose out of an insurer’s practice of conducting credit checks on claimants for accident benefits. The procedural history of the case is interesting and suggests a possible roadmap for other class proceedings. Also interesting is the court’s statement that the limited powers of the Office of the Privacy Commissioner of Canada (OPC) are a reason why class actions are an important tool for behavioural modification to encourage compliance with the Personal Information Protection and Electronic Documents Act (PIPEDA). Justice Glustein stated:

[88] Behavioural modification is a key objective of the Settlement Agreement. If systemic PIPEDA privacy breaches are not rectified by a class procedure, it is not clear what incentive large insurers and others will have to avoid overcollection of information. While the Privacy Commissioner may encourage or require changes to future practices, it has very limited powers to enforce compliance through strong regulatory penalties (see s. 28 of PIPEDA).

Haikola v. The Personal Insurance Company, 2019 ONSC 5982 at para. 88

Procedural History

After a car accident, the plaintiff, Kalevi Haikola, made a claim for accident benefits from his insurer, the Personal Insurance Company. An adjuster from the Personal Insurance Company contacted Haikola and asked for his consent to conduct a credit check. Haikola agreed, allegedly fearing that refusing to permit the credit check might affect his claim. According to the court’s findings, Haikola repeatedly sought answers about why a credit check was necessary but did not obtain a satisfactory answer. So, Haikola complained to the OPC, which found that the Personal Insurance Company was using these credit checks as part of a fraud detection model. In PIPEDA Report of Findings #2017-003, the OPC determined that Haikola’s complaint was well-founded. The OPC concluded that the Personal Insurance Company was unable to justify the use of the credit score and had failed to obtain meaningful consent.

Although the Personal Insurance Company undertook to cease the practice of conducting credit checks, Haikola was not satisfied. Section 14 of PIPEDA permits individuals to bring an application to the Federal Court for a remedy once the OPC renders a decision with respect to an investigation. However, instead of bringing an individual claim, Haikola sought to use section 14 to bring a class action against the Personal Insurance Company. The Personal Insurance Company argued that section 14 could not be used in this way and took the position that the Federal Court could not certify a class action.

Settlement

Haikola and the Personal Insurance Company entered into settlement discussions and, after a mediation, agreed to a settlement in which the Personal Insurance Company would pay $2,250,000. After taking into account proposed class action counsel fees, the court estimated that the value of the settlement for each affected individual would range from $150 to $180, depending on the take-up by class members.

As part of the settlement, Haikola agreed to discontinue the proceeding in Federal Court and commence a class proceeding before the Ontario Superior Court of Justice. This avoided the necessity of determining the jurisdictional question of whether the Federal Court could certify a class proceeding. Because the Ontario Superior Court of Justice does not have jurisdiction to hear an application under section 14 of PIPEDA, the answer was to plead the breach of an implied contractual term that the Personal Insurance Company would comply with PIPEDA. Having allegedly failed to do so, Haikola and the class would be entitled to at least a nominal damages award. But a nominal damages award can certainly add up when there are 8,525 affected individuals.

Takeaways

The Ontario court was very receptive to the use of class proceedings to enforce PIPEDA. It was debatable whether the claim of an implied term would have been successful had there been a trial. However, Justice Glustein noted that there were reasons why a class proceeding might be preferable to individual complaints to the OPC followed by individual Federal Court actions for a remedy.

Glustein J. found that the likely small damages award hardly justified class members clearing the hurdles created by PIPEDA. As Haikola’s case itself showed, an individual had to launch a complaint, wait for the OPC to investigate, obtain a report of findings from the OPC and then go to the Federal Court. Indeed, it took the OPC a few months shy of 3 years to issue a final Report of Findings. (Yes, you read that correctly – nearly 3 years! – even though s. 13(1) of PIPEDA says that the Commissioner must issue a Report of Findings within 1 year.)

Even with a successful report of findings, the individual would then have to start all over again in Federal Court. As Glustein J. noted, the Federal Court would then conduct a hearing de novo, meaning that the complainant would have to convince the court that the OPC was correct and there was a violation of PIPEDA – all for a small damages award.

The whole scheme of PIPEDA was not, in the court’s judgment, designed to provide individual remedies for systemic breaches.

Look for more cases in which plaintiffs claim breaches of an implied term to comply with PIPEDA in order to avoid PIPEDA’s OPC complaint and section 14 process.

Filed Under: Class Actions, PIPEDA Tagged With: Class Actions, PIPEDA

Loblaw’s errors are overblown

October 21, 2019 by Kris Klein

The Office of the Privacy Commissioner of Canada (OPC) released its Report of Findings into the Loblaw gift card matter this week. This case was first reported in the news several months ago when people complained that they had to provide a fair amount of personal information to authenticate themselves if they wanted to receive a compensatory gift card as part of the bread price-fixing fiasco. So, suffice it to say, they were an already-irritated bunch.

It turns out that, after all the hoopla, Loblaw didn’t really do too much wrong in this case. I cannot say I am surprised. In a few instances, they did ask people to provide their driver’s license as part of the authentication process and failed to adequately inform them that they could redact all the information on the license except for the name and address. As they got better with their communications (isn’t it almost always about better communications?), people were informed of other ways they could prove that they lived where they claimed to be living.

So, the news amounts to an over-collection of information – namely, the driver’s license number – but for me, there are other nuggets in the OPC’s Report that are worth focusing on.

First, the OPC quotes from Loblaw’s privacy statements and endorses the language used to explain how the personal information was being processed in other countries. It’s one of the first instances that I can think of where the OPC has provided an example of language for these messages that it considers adequate. I’m particularly glad about this because, if the OPC reports more often in this manner, we’ll be able to learn what language meets requirements and what language fails to meet the test.

Similarly, the OPC examined the contracts that were in place between Loblaw and its processors. While the specific contractual language is not repeated, the OPC does provide a shopping list of clauses that were contained in the contracts. Paragraph 41 of the Report says:

The contract also provided guarantees of confidentiality and security of personal information, and included a list of specific safeguard requirements, such as: (i) implementing measures to protect against compromise of its systems, networks and data files; (ii) encryption of personal information in transit and at rest; (iii) maintaining technical safeguards through patches, etc.; (iv) logging and alerts to monitor systems access; (v) limiting access to those who need it; (vi) training and supervision of employees to ensure compliance with security requirements; (vii) detailed incident response and notification requirements; (viii) Loblaw’s pre-approval of any third parties to whom JND wishes to share personal information, as well as a requirement for JND to ensure contractual protections that are at a minimum equivalent to those provided for by its contract with Loblaw; and (ix) to submit to oversight, monitoring, and audit by Loblaw of the security measures in place.

Moreover, the OPC endorses these clauses as having met the accountability requirements in PIPEDA. The European DPAs have long provided input on what specifically needs to be in a contract, and it’s good to see the OPC providing an example in this case.

I guess, in a perfect world, they might even go a step further and provide a precedent contract for us privacy pros to use when negotiating with our processors. But, regardless, this is definitely a step in the right direction and I hope for more of this type of guidance in future Reports of Findings. On that note, I can’t help but notice that the Loblaw case summary is numbered 2019-003. If that means we have only had 3 reported cases this entire year, I’m disappointed because, in my mind, they can be a really excellent way of getting meaningful guidance out there.

Filed Under: PIPEDA, Privacy Commissioner of Canada

