Data Protection Impact Assessment: First Guidelines
One of the most important cultural changes that companies and organisations are beginning to face is the need to systematically include privacy and data protection in technical and organisational frameworks.
A crucial aspect of these changes is the need to conduct Privacy Impact Assessments (PIA) and Data Protection Impact Assessments (DPIA). These tools are designed to measure the levels of privacy and security provided by a system and to suggest possible improvements.
Companies will struggle to include PIA/DPIA in the standard business process for products, applications and systems, and when in need they will ask for external help. Mandatory deployment of the process must be finalised by May 2018. In practice much sooner: companies will need to adapt during 2017.
Privacy Impact Assessments are an already established paradigm. Until now, however, the Data Protection Impact Assessment has been a less known, yet equally important, tool. The DPIA is a measure devised by the European Union in the General Data Protection Regulation. There are currently no official guidelines from Working Party 29 (WP29), the European Data Protection Supervisor or other authorities. This is starting to change, recently thanks to the Privacy Commission of Belgium. The Belgian Data Protection Authority (DPA) is a very active body, keen to seek help from experienced industry professionals and researchers. You may have heard about the case the Belgian DPA has brought against Facebook. It is a serious and respected privacy authority.
I analyse the document and annotate it with my interpretation. Where appropriate, I review and comment on the proposal, suggesting good DPIA practices. A word of caution: I do not copy-paste or translate the document literally. Rather, I analyse it based on my knowledge, skills and professional experience.
In many respects, the DPIA guidelines laid out in the proposal are very general, not overly technical, and not really specific. They are, however, very interesting. Many privacy researchers and professionals suspected similar (though not identical) interpretations. It's worth looking at how the process might operate.
Important note: These guidelines are still a proposal. They aren’t yet widely accepted in Europe. We’ll need to wait for that.
But it's the first public document on DPIA delivered by a respected privacy authority, and it will definitely be very influential. The report itself can be found here. It's in French. And in case you're wondering, the official French name for Data Protection Impact Assessment (DPIA) is Analyse d'Impact Relative à la Protection des Données (AIPD).
At a high level, the document specifies:
- the required elements of a DPIA
- when it’s obligatory to conduct a DPIA
- who is involved in a DPIA process
Conducting a DPIA is understood as directly resulting from the GDPR principle of accountability. Organisations need to be able to demonstrate that privacy and data protection principles are practically considered and taken seriously. The DPIA itself is a risk-based tool that helps measure and review the privacy level and, when necessary, propose design changes. The DPIA process is broadly applied to large systems and products as a whole. To be effective, it must be conducted by people with profound knowledge, skills and expertise in security and privacy. A DPIA is not a check-list. I doubt that any DPIA conducted following the "check-list" approach, without deep understanding and consideration of the system, would ever be taken seriously during a possible later verification. And that's also an important aspect of a DPIA: its results must be verifiable.
In my view, that’s the first principle of a good DPIA: it must be specific.
Keep in mind that the DPIA concerns data processing. To simplify, data processing is a catch-all phrase describing the collection, storage, use and erasure of data. Consequently, a DPIA deals with aspects such as transmission, storage and operations on data.
Analysis of Privacy Commission of Belgium DPIA Guidelines
Let's start with the General Data Protection Regulation. GDPR Article 35(7) lists the minimum requirements a DPIA must provide and contain:
- a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller
- an assessment of the necessity and proportionality of the processing operations in relation to the purposes
- an assessment of the risks to the rights and freedoms of data subjects
- the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.
A DPIA report must also contain:
- purposes of processing
- the stakeholders
- categories and types of private data processed in the system
- characterization of types of data flows (e.g. is the data transferred?)
- etc.
These descriptions should be clear; clarity is valued in a DPIA. In my opinion, a DPIA cannot contain ambiguous terms. A good DPIA is straight and to the point.
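To make the expected structure concrete, here is a minimal sketch of how such a report skeleton could be captured for internal tooling. This is my own illustration: the field names paraphrase the elements listed above and are not an official template prescribed by the GDPR or the Belgian guidelines.

```python
# A minimal, hypothetical sketch of a DPIA report skeleton; the field names
# paraphrase the elements listed above and are not an official template.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DPIAReport:
    purposes: List[str]                       # purposes of processing
    stakeholders: List[str]                   # controller, processors, DPO, users...
    data_categories: List[str]                # categories and types of personal data
    data_flows: List[str]                     # e.g. "transferred to a processor in the EU"
    risks: List[str] = field(default_factory=list)
    mitigations: List[str] = field(default_factory=list)

report = DPIAReport(
    purposes=["payroll administration"],
    stakeholders=["HR department", "payroll processor"],
    data_categories=["name", "bank account number", "salary"],
    data_flows=["transmitted monthly to the payroll processor"],
)
print(report.purposes)
```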
Proportionality
A DPIA must provide a proportionality analysis. In other words: are the data used really needed to fulfil the intended objectives?
It must be established:
- what the objective of processing the data is
- what the reasons are for processing data in a particular way in order to meet the objective
- if there is more than one way to achieve a task, why the chosen one is followed
The last point is very interesting and will force data controllers (companies, organizations) to analyze the way they process data at a very broad level. They will need to ask (and answer) the question: are there simpler (usually this means less risky) ways to achieve a particular goal?
Risk Analysis
In general, the risks relate to the "rights and freedoms of natural persons". This refers to a GDPR recital, which I quote verbatim below:
... which could lead to physical, material or non-material damage, in particular: where the processing may give rise to discrimination, identity theft or fraud, financial loss, damage to the reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage; where data subjects might be deprived of their rights and freedoms or prevented from exercising control over their personal data; where personal data are processed which reveal racial or ethnic origin, political opinions, religion or philosophical beliefs, trade union membership, and the processing of genetic data, data concerning health or data concerning sex life or criminal convictions and offences or related security measures; where personal aspects are evaluated, in particular analysing or predicting aspects concerning performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, in order to create or use personal profiles; where personal data of vulnerable natural persons, in particular of children, are processed; or where processing involves a large amount of personal data and affects a large number of data subjects.
In practice there is more, a fact recognized by Working Party 29 which also highlights: freedom of speech, freedom of thought, freedom of movement, prohibition of discrimination, right to liberty, conscience and religion.
There are even more types of risk we can list: identity theft, financial loss, reputational damage, loss of confidentiality of privileged data, unauthorised reversal of pseudonymisation, etc.
All of these terms have a broad meaning and can often be translated into matters of technology. Thinking about "freedom of expression" may thus directly relate, on very technical levels, to matters of "encryption", "access control" or "authorisation", or to the very purpose of data storage (if not designed respectfully). In a recent ruling, the European Court of Justice confirmed how data retention practices can interfere with "freedom of expression":
“the retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression”
Most of the risks above indirectly reference complex security and privacy controls (technological and organizational), issues and designs: communication systems, storage systems, and so on. Be prepared for a deep security and privacy analysis.
DPIA Methodology
A DPIA is a risk-based method. Risks must be identified and evaluated. Risk analysis is performed on the basis of likelihood (of a risk) and impact (what the consequences are). A standard ISO-like risk-assessment methodology can be used. Risk can be inherent (if not addressed, it reads as: we don't do anything and hope it's fine; we'll explain it to our CEO and the Data Protection Authority later on) or residual (the risk level that remains despite precautions).
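As a minimal sketch of the likelihood-and-impact idea (my own example, not a prescribed methodology; the scales, the example risk and the mitigation factor are hypothetical):

```python
# A minimal sketch, not an official methodology: score a risk as likelihood x impact,
# then re-score after mitigations to estimate the residual risk.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "almost_certain": 4}
IMPACT = {"limited": 1, "significant": 2, "severe": 3, "maximum": 4}

def inherent_risk(likelihood: str, impact: str) -> int:
    """Inherent risk: likelihood x impact on a simple ordinal scale."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def residual_risk(inherent: int, mitigation_factor: float) -> float:
    """Residual risk left after controls; mitigation_factor in (0, 1] is an estimate."""
    return inherent * mitigation_factor

# Example: unauthorised access to stored location data
score = inherent_risk("likely", "severe")     # 3 * 3 = 9
left = residual_risk(score, 0.5)              # e.g. after encryption + access control
print(f"inherent={score}, residual={left}")   # inherent=9, residual=4.5
```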
The analysis must take into account the nature, scope and context of processing. Every possible data processing use case scenario (related to collection, storage, use and erasure) needs to be taken into account.
An important component of a DPIA is the requirement to list the risk-reducing measures. In practice, these can be technologies or approaches used to mitigate a risk. For example, in order to protect communication data, a particular cryptographic protocol can be used, in a certain setting and configuration; a sketch of such a configuration follows below.
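As a hypothetical illustration (my example, not part of the guidelines), a mitigation for the risk of communication data being intercepted could be documented down to the configuration level, for instance enforcing TLS 1.2 or newer with certificate verification using Python's standard ssl module:

```python
# A hypothetical illustration of a documented risk-reducing measure:
# enforce TLS 1.2+ with certificate and hostname verification for data in transit.
import ssl

context = ssl.create_default_context()            # verifies certificates and hostnames
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions

# The hardened context would then be passed to e.g. urllib.request.urlopen(..., context=context)
# wherever the application transmits personal data.
print(context.minimum_version)
```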
Privacy risks are identified first, the strategy of privacy controls (or the lack of them) is described, and only afterwards can any remaining risk be assessed.
No methodology requirements
There is freedom of choice when it comes to the DPIA methodology; the Belgian DPA does not insist on any particular one. There is just a recommendation to conform to accepted good practice, e.g. to use ISO standards or specialised guidelines and codes of conduct.
That's fortunate: ISO is currently nearing completion of its Privacy Impact Assessment standardisation. Just in time.
DPIA methodology should be adapted
The responsibility for choosing a methodology is in the hands of the data controller (organization, company). But the DPIA methodology used in practice can and should be adapted to particular circumstances, needs and requirements. I would highly advise using specialized methodologies adapted to the assessed systems. For example, there are differences when assessing a mobile ecosystem, a web ecosystem, an Internet of Things ecosystem and, say, industrial systems (like, for example, a sensor network of weather stations). Only a well-adapted DPIA will result in a good quality assessment.
There are obviously some common characteristics of a DPIA:
- Context definition (internal and external factors)
- Specification of risk assessment criteria for the rights and freedoms of natural persons
- Identification and analysis of risks
- Definition of acceptable risk values
- Identification of appropriate risk mitigation measures (understood on both technical and organizational levels).
Belgian DPA lists the characteristics of a good DPIA:
- Tailored and specific. There is no one-size-fits-all solution in DPIA. It's a custom-made process (but some common procedures can be reused)
- Accessible. The DPIA result should be easy to understand and unambiguous. It should be written in clear language. The intended audience of a DPIA report is not just experts, but also management and other personnel. A good DPIA contains an executive summary. Visual descriptions can be used to enhance understanding.
- Nuanced, with comprehensive risk scales. The assessment itself should be comprehensive.
- It should be conducted with prior consultation of the appropriate stakeholders
Who is Involved in Consulting for a DPIA?
Very often, conducting a DPIA requires engaging a broad range of important actors in the system. Among the stakeholders may be persons such as project managers, the CIO, the CISO, the Data Protection Officer, application developers, users, etc.: people with knowledge of technical and organizational constraints. It's the Data Protection Officer's (DPO) duty to help identify the key persons and to assist in the process of conducting the DPIA (however, the DPO does not necessarily need to be the one who actually conducts it).
That's because a good DPIA requires input. Not only technical input such as design and requirements documents, but also input from the actually involved people who hold responsibility in the project.
DPIA Needs Reviewing
Important note: DPIAs are subject to periodic review and may need to be updated or redone. The need for this arises especially if there has been a substantial change in the way data is processed. Such a change can be driven by several factors, for example:
- System update, upgrade, integration
- Technology change
- Ecosystem change: new risks arise (for example, identified by new research), or old mitigation strategies are no longer sufficient (the encryption or anonymization scheme used is found to have serious flaws)
A DPIA is a process. It should be built into the technical and organizational culture. Privacy in this way becomes a strategic factor. A DPIA reports how parts of the strategy are deployed and executed; in this view, a DPIA report is more than just a project report.
When is DPIA Obligatory?
Now the interesting part. The Belgian DPA lists examples of cases where conducting a DPIA is mandatory. That's very good; we're seeing such a list for the first time. A DPIA is required where the risk is substantial. What does that mean in the eyes of the Belgian DPA?
An important thing first: the list is not complete. It doesn't mean that other types of systems and data processing operations do not require conducting a DPIA. According to the General Data Protection Regulation (GDPR), a DPIA must always be conducted for systems fulfilling the requirements laid out in the GDPR. This means that the list should be treated as an example; it is a starting point and is subject to change. Additionally, a DPIA can be conducted both for single projects and for larger projects with several components (integration often gives rise to greater risk: integrating two systems with good privacy standards does not necessarily mean that the final outcome also has those desired traits).
DPIA is always required for the following cases:
- If genetic data are used. Biotechnology/bioinformatics startups, take note. Similarly for health data banks.
- When private data is collected from third parties and used to decide whether to allow or deny access to a service (e.g. automated decision making)
- If data are used to assess the financial standing of the user or to prepare any "user profiles" to assess "risk" (for example to achieve the previous point). This applies to profiling and discrimination.
- If the data processing might carry a risk to physical health of the user/person
- If personal financial (or otherwise sensitive) data are used for purposes other than those for which they were initially collected
- When communicating, disclosing and making publicly available data related to a large number of people
- If there is a need to assess and process private personal aspects, for example to produce analyses based on: economic status, health, personal preferences, interests, reliability, behavior (behavioral analysis requires a DPIA!), location data (is your app using GPS or motion sensors?), travel patterns
- If profiling is used on a large scale
- In the case of large-scale processing of children's data, if done for purposes other than those for which the data was originally collected. Note: not consented to, but collected!
- If common applications or entire environments are planned for entire large sectors or occupational segments, or for cross-functional activities where sensitive data is used. Note: this should concern e.g. products used for employee tracking.
- If the knowledge, benefits, abilities or mental health of children is recorded, especially in order to monitor their progress, for example to establish the educational level of children (are they in primary or secondary school, etc.). Note: it's a technique of indirect profiling.
A DPIA will be broadly required in most cases where new technologies and ways to process data are utilised. The direct inclusion of genetic data is a great choice; genetic privacy is a tough topic to address.
Similarly for behavioral analysis, which often carries a number of risks.
DPIA Not Necessary
Although the GDPR requires all systems to handle risk and to have appropriate privacy controls, the Belgian Privacy Commission lists example systems where a formal DPIA process does not necessarily need to be conducted.
The list and its construction are very interesting. The descriptions of data processing where a DPIA is not required are very long, much longer and much more specific than in the case of the list where a DPIA is mandatory.
This highlights that defining situations where DPIA may not be needed is a difficult task, even for Data Protection Authorities.
It makes me wonder whether organizations will decide to take the risk of choosing on their own.
The following types of processing, according to Belgian Privacy Commission, may not need a DPIA:
- Processing of data related only to the administration of salaries (payroll), if the data is only used for this particular task and is not stored longer than needed to fulfill the task
- Processing of data related exclusively to the administration of personnel (human resources) of an organization, if the data is unrelated to the health of the data subject. Note: in practice this is difficult to achieve; imagine a situation where even one employee has e.g. a disabled status. According to this description, that scenario may then fall into the "must do a DPIA" case.
- Company accounting, if used only for that purpose: only if personal data is exclusively used for accounting and the data are not stored longer than necessary. The data cannot be shared with any third party (unless the organisation is legally bound to do so).
- Processing data for the administration of shareholders, if the data is only used for that administration and only concerns people whose data are necessary for it
- Processing data by a foundation, association or other non-profit organization, as needed by its usual activities, and only if the processing relates to the data of the organization's members (persons with whom the organization maintains regular contacts)
- If data is processed only and exclusively to register visitors, as part of access control, and the data used are only: name, business address of the visitor, identification of the employer, identification of the visitor's vehicle, name, section and function of the person visited, and the time of the visit. The data must not be kept longer than necessary.
- If the data is used by educational institutions in order to communicate with students in relation to teaching activities, etc. (this also applies to prospective students), provided that no data is obtained from a third party and that the data is communicated to third parties only as foreseen by regulations. The relations can be maintained only for a specific period of time.
The descriptions of operations where a DPIA may not be obligatory are very specific and often contain detailed provisions and "ifs" such as "not longer than necessary", etc. It sounds like even in the cases where a DPIA might not be necessary, some basic evaluation is still needed: an evaluation of whether a DPIA is needed. Not conducting a DPIA is a formal decision, signed by the management of the data controller (company, organisation). All possible subcontractors typically should also be involved in conducting a DPIA.
We can call this process a Data Protection Threshold Assessment, by analogy with a Privacy Impact Threshold Assessment. It's the process of establishing whether a DPIA (respectively, a PIA) is necessary.
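To illustrate what such a threshold assessment could look like in internal tooling, here is a minimal sketch. The triggers below paraphrase only a few of the mandatory cases listed earlier; they are illustrative, not an official or complete rule set.

```python
# A hypothetical threshold-assessment helper, not an official tool. The triggers
# paraphrase a few of the mandatory-DPIA cases above; a real decision must consider
# the full GDPR and DPA criteria and be signed off by management.
from dataclasses import dataclass

@dataclass
class ProcessingDescription:
    uses_genetic_data: bool = False
    large_scale_profiling: bool = False
    automated_access_decisions: bool = False
    childrens_data_new_purpose: bool = False

def dpia_required(p: ProcessingDescription) -> bool:
    """Return True if any illustrative trigger applies."""
    return any([
        p.uses_genetic_data,
        p.large_scale_profiling,
        p.automated_access_decisions,
        p.childrens_data_new_purpose,
    ])

example = ProcessingDescription(large_scale_profiling=True)
print(dpia_required(example))  # True: large-scale profiling triggers a DPIA
```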
Summary
The Data Protection Impact Assessment process helps organizations design their systems with good privacy and data protection levels. A DPIA allows the level of privacy to be measured.
Thanks to the Belgian Privacy Commission, we are able to see the first official guidelines on the process of conducting a DPIA and how it fits into organizations. It will undoubtedly require a cultural shift in the way of thinking. Privacy is becoming an important factor.
A DPIA can be conducted with different methodologies and approaches tailored to specific needs. Well-conducted DPIAs will become market differentiators.