Privacy Impact Assessments (PIAs) have not changed dramatically over the past 20 years or so, or at least the approach to them hasn’t.
Whether the starting point is a Word or Excel template or [one hopes] actual technology that supports the process, a PIA involves a group of people in the organization sitting around a [virtual] table to assess the risks and identify mitigations, before ultimately presenting the assessment for sign-off by [one hopes] someone at the right level to accept the residual risk.
What’s wrong with this, you might ask? It is certainly preferable to doing nothing, and in fact is a requirement for any privacy program worth its salt. It is increasingly also a legal requirement in many jurisdictions.
The problem is that this approach to PIAs creates a privacy echo chamber. Whether through meetings or through information gathering via PIA-support technology, the process involves a group of internal employees assessing privacy risk and ultimately accepting mitigations for a proposed use of technology or data. Inevitably, those involved will have similar viewpoints and see projects in a similar light.
So now for a revolutionary idea: why don’t we ask the people whose data we are using what they think about what we are doing with their data?
Stay with me. Regulators have long indicated that surveys, focus groups, and other ways of gathering stakeholder input carry weight in demonstrating an organization’s commitment to and accountability for privacy. So there is value in keeping your regulator happy, and this can mitigate the fallout should something unexpected or untoward occur.
What is the value to the organization? Well, these are your customers, your employees, your patients, your citizens. If you conduct PIAs only inside the echo chamber, you may never learn how an average stakeholder reacts, and you may miss the “that’s creepy” response entirely. Increasingly, we are aware of how culture affects how we view privacy. For example, if the people assessing the risk of a geo-location app are all white males, they may fail to recognize how some data collection reinforces discriminatory pricing tied to the area one lives in, or creates risks for other groups, such as the app being misused to stalk women.
Moreover, by explaining the good thing you are trying to accomplish, such as a new service or greater convenience, you might well get suggestions from stakeholders on how to reach that goal more effectively, and without that potential “ick” factor. This might be through better notice or transparency, or through data minimization.
And from the point of view of defending your choices later, it would seem a great insurance policy. So long as you have disclosed everything essential and have sought stakeholder involvement in good faith, documented stakeholder engagement can help demonstrate that your practices were within individuals’ reasonable expectations, should you ever be called upon to defend them.
What are the downsides? Well, for an organization, it could be scary, because it might mean hearing something you don’t want to hear: that the project is too invasive, too creepy. If the input is negative and the remediation is too difficult or expensive, it might well kill the project. However, it is better to hear this early on than to find out later through complaints, a regulator, or negative press. More likely, you will simply find out that the way you have presented or explained the project leaves people suspicious or concerned.
What kind of project would benefit from this approach?
This could vary widely. Let’s use a topical example: COVID screening of employees. You might engage with some employees, or with union representatives, with the goal of finding the least intrusive way to ensure a safe workplace. Should screening happen behind a screen, so we don’t embarrass anyone who is sent home? Who should know if someone is sent home? You might discover better, more privacy-minded ways to accomplish this from people on the front line.
Another example is smart city initiatives, where you might survey your constituents on how they would respond to a rollout of publicly accessible Wi-Fi. Would you fund it through advertising, and if so, would you collect any user profile data to offset costs? What about law enforcement access to individuals’ connections: can you commit to requiring a court order before responding to law enforcement requests? What kind of notice or information will work best to inform users in either case?
A survey or focus group can provide real insight into the views of the individuals whose data you collect, use, and disclose. Ironically, the tools for obtaining this kind of input have always sat with marketing, the team best equipped to gather such feedback. Despite any initial misgivings stakeholders might have, marketing would be your best ally for effectively soliciting input, and might well be the ones most affected (astounded, even?) by consumers’ responses. That, after all, is their business.
As we continue to innovate in how data is collected, used, and disclosed, it becomes ever more important that we innovate in how we conduct PIAs. Unquestionably, there will be complex areas where surveys and focus groups with the public won’t work well. But the same techniques can be used with experts who understand complex topics; AI and data analytics are obvious examples. Specialists in ethics, medicine, data analysis, algorithms, or other specialty areas could be engaged to better understand and identify risks, evaluate approaches, and approve [or not] proposed mitigations. Publishing the results, in the name of transparency, would again help demonstrate your commitment to privacy. Expert panels also help with the thorny question of demonstrating reasonableness or legitimate interests where the processing activity is complex and informed consent is difficult to obtain.
Along with innovative technologies, innovative approaches to understanding and mitigating privacy risks can turn the PIA from a form-filling [and often dangerous rubber-stamping] exercise into something that inspires confidence among your stakeholders, your organization, and ultimately, your regulators.