Mitigation strategies and solutions


This page contains information about the solution-oriented steps of the LINDDUN methodology.
In the fifth step of the methodology (3B in the nutshell summary), mitigation strategies are elicited to resolve the threats uncovered in the previous steps. In the sixth and final step (3C in the nutshell summary), these high-level strategies are used to select the most appropriate privacy enhancing technologies (PETs). As in the previous steps of the methodology, LINDDUN provides the required privacy knowledge to select suitable privacy strategies and solutions that can mitigate the elicited privacy threats.

A Taxonomy of Mitigation Strategies

The goal of threat modeling is to identify flaws in the system in order to resolve them. Addressing these flaws requires a number of fundamental design decisions, often referred to as strategies (or tactics). A strategy describes the means to achieve a certain objective, or, in the case of LINDDUN, the means to resolve privacy threats. We therefore refer to them as "mitigation strategies". They capture a high-level view of common techniques used in practice to prevent privacy threats. As these strategies can be used to classify privacy solutions, they form a suitable intermediate step in the conversion of privacy threats into appropriate privacy enhancing solutions.

In essence, we leverage a taxonomy of solution strategies that provides a structured classification of common mitigation decisions. In addition, we provide a means to select the appropriate strategies by mapping them to the LIND(D)UN threat trees they are intended to mitigate.

Obtaining privacy means controlling the consequences of exposing the (possibly indirect) association of individuals to some information/transaction in a given context.

Accordingly, in order to obtain privacy, it is important to focus on two major strategies. First, the associations between users and their transactions and personal information need to be controlled, to ensure that the user shares no more information than necessary with the system. This is the proactive approach. Second, damage must be limited by controlling the associations after disclosure; to achieve this, the exposure of these associations needs to be restricted to a minimum. This is the reactive approach.

Taxonomy of privacy mitigation strategies

Concealing the association

Concealing the association can be divided into two sub-strategies: (1) protect the identity of the user during authentication and (2) protect the data that will be communicated to (or throughout) the system.

Protect ID

First, as shown on the left-hand side of the taxonomy, the identity of the user can be protected by means of pseudonyms (i.e. an alias instead of the real identity), or an attribute (such as a certificate) can be used. Even stronger anonymity can be achieved by using properties (such as anonymous credentials or other zero-knowledge proofs) as the authentication mechanism.
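The simplest of these techniques, pseudonymization, can be sketched as follows. This is a minimal illustration, not a LINDDUN-prescribed implementation; the function name and key are hypothetical, and a keyed hash is only one of several ways to derive an alias.

```python
import hashlib
import hmac

def pseudonym(user_id: str, secret_key: bytes) -> str:
    """Derive a stable alias from a real identity.

    The same user always maps to the same alias, so the system can
    still link a user's own transactions, but the real identity
    cannot be recovered without the secret key.
    """
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()[:16]

# The alias is stable for a given user/key pair but reveals nothing
# about the underlying identifier.
alias = pseudonym("alice@example.com", b"service-secret")
```

Attribute- and property-based authentication (certificates, anonymous credentials, zero-knowledge proofs) go further by proving only a claim about the user rather than any stable identifier at all.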

Protect data

Second, the data that are being communicated have to be protected. We make a distinction between strategies that concern core data protection and those that aim at raising awareness regarding the sharing of information.

Data protection is further subdivided into strategies related to transactional data (i.e. the actual data that are being transmitted) and strategies to protect contextual data (i.e. the metadata related to the communication), as each requires different kinds of solutions. Encrypting the information that is being transmitted, for example, only protects the transactional data, while the contextual data can still reveal the sender through information required for the communication (e.g. IP address, browser settings, etc.). Strategies to protect data include removing the (sensitive) information, hiding the data, replacing (part of) the information, or generalizing it. Note that the strategies for both types of data are the same, as they only represent high-level approaches to mitigate the associated threats; the corresponding solutions, however, will differ strongly (e.g. removing sensitive information before sending the transactional data vs. removing contextual data, such as sender information, by applying onion routing techniques).

A final strategy to conceal an association is to make users more aware of the consequences of sharing. Feedback and awareness tools have been emerging to assist the user, and user-friendly privacy support can also help users manage their privacy settings.
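Applied to transactional data, the remove/replace/generalize strategies can be illustrated on a single record. This is a sketch with made-up field names; hiding would typically be achieved with encryption rather than field-level edits, so it is omitted here.

```python
def minimize(record: dict) -> dict:
    """Apply remove, replace, and generalize strategies to one record."""
    out = dict(record)
    out.pop("ssn", None)                   # remove: drop sensitive information outright
    out["name"] = "user-042"               # replace: substitute an alias for the name
    out["age"] = (out["age"] // 10) * 10   # generalize: age to a 10-year bracket
    out["zip"] = out["zip"][:3] + "**"     # generalize: coarsen the location
    return out

# {"ssn": ..., "name": "Alice", "age": 34, "zip": "94110"}
# becomes {"name": "user-042", "age": 30, "zip": "941**"}
safe = minimize({"ssn": "123-45-6789", "name": "Alice", "age": 34, "zip": "94110"})
```

Note that the same four strategies applied to contextual data would operate on metadata (sender address, timestamps, browser fingerprint) and call for very different solutions, such as onion routing.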

Guarding the association

Guarding the association after the data have been shared can be divided into two sub-strategies: (1) guard exposure and (2) maximize accuracy.

Guarding the exposure

Guarding the exposure is the most obvious strategy. All sub-strategies correspond to ways to restrict access to, and reduce disclosure of, the collected data. The strategies are divided into three main categories: means to obtain data protection compliance, which relate to consents and policies, and to notice and transparency; means to ensure confidentiality, which correspond to security measures such as access control and data encryption; and means to minimize the collected data. The minimization strategies are in fact the same as those related to the protection of transactional data. Some of the solutions may even overlap (e.g. hiding data by encrypting it), but each category has its own dedicated solutions.
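As a sketch of the confidentiality category, the access-control idea can be reduced to a policy that whitelists which fields each role may read. The roles and field names below are hypothetical examples, not part of LINDDUN itself.

```python
# Illustrative role-to-field policy: unknown roles see nothing.
ALLOWED_FIELDS = {
    "physician": {"diagnosis", "medication"},
    "billing": {"insurance_id"},
}

def read_record(role: str, record: dict) -> dict:
    """Return only the fields the given role is permitted to see."""
    permitted = ALLOWED_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in permitted}

record = {"diagnosis": "flu", "medication": "oseltamivir", "insurance_id": "INS-7"}
# A billing clerk sees only {"insurance_id": "INS-7"}.
billing_view = read_record("billing", record)
```

In a real deployment this gate would sit in front of the data store, combined with encryption of the stored data so that bypassing the gate still yields nothing readable.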

Maximizing the accuracy

Maximizing the accuracy is the final strategy to guard the association. It empowers the data subject by allowing inspection and correction of their information. We make a distinction between two more detailed strategies. The first gives the subject easy access to the data collected about them in order to review it; this strategy relates to user awareness. The second extends this access right by allowing the subject to request updates or even deletion of their information; this strategy can be applied to obtain plausible deniability. An example of such deletion is the ‘right to be forgotten’ implemented by Google, which allows subjects to request removal of personal information from Google’s search index if the links are inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed.
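The two accuracy strategies, review on the one hand and update/delete on the other, can be summarized as the interface a system would expose to its data subjects. This is a schematic sketch, with an in-memory store standing in for a real database.

```python
class SubjectAccess:
    """Sketch of review / update / delete operations for a data subject."""

    def __init__(self):
        self._store = {}  # subject_id -> record of collected personal data

    def review(self, subject_id: str) -> dict:
        # First strategy: let the subject inspect the data collected about them.
        return dict(self._store.get(subject_id, {}))

    def update(self, subject_id: str, changes: dict) -> None:
        # Second strategy: let the subject correct inaccurate data.
        self._store.setdefault(subject_id, {}).update(changes)

    def delete(self, subject_id: str) -> None:
        # Second strategy, extended: erase on request ('right to be forgotten').
        self._store.pop(subject_id, None)
```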

Selection of Mitigation Strategies

As shown in the mapping table, the taxonomy can be used to determine the appropriate strategy to mitigate a particular threat. For each strategy, the corresponding threat trees are provided.

Mapping of Mitigation Strategies to LINDDUN threat trees

First, a distinction can be made based on the type of DFD element that corresponds to a specific threat. Threats related to entities and data flows correspond to the concealing-the-association branch of the taxonomy (the left branch of the taxonomy tree). In particular, strategies that aim at protecting the identity mitigate entity-related threats (linkability and identifiability of an entity), while mitigation strategies for data protection before and during communication aim at resolving data flow threats. The linkability and identifiability trees each have a branch that corresponds to transactional data and one related to contextual data protection. Detectability and non-repudiation threats of the data flow are entirely focused on contextual data. The final strategy related to the protection of communicated data is awareness, which evidently corresponds to the unawareness threat tree. Note that only the left branch of the unawareness tree maps to this strategy, as this tree is in fact divided into threats that require proactive mitigation and those that require reactive solutions.

Second, mitigation strategies that guard the exposure of associations correspond to threats related to data that have already been collected and stored. These clearly map to data store related threats, which can be divided into two categories: threats that require confidentiality and threats that require minimization. Most data store related privacy threats are mitigated by strategies that have data minimization as their common denominator. This is in fact a broad category of mitigation strategies that limit the collected data by, for example, removing redundant data or generalizing personal information. Data store threats related to non-repudiation and information disclosure require confidentiality strategies such as access control to the data store and encryption of the stored data. (Note that threats mitigated by minimization strategies are indirectly also linked to confidentiality strategies, as hiding data will often be achieved by encryption and access control techniques; minimization is however much broader than confidentiality.) Threats to a process also require strategies to guard exposure; they are all mitigated by confidentiality, as information disclosure is their main threat.

Finally, strategies to maximize the accuracy are very specific and correspond to two subtrees. Reviewing data corresponds to the right branch of the unawareness tree, as this requires awareness of the data that have been collected about the subject. As mentioned before, unawareness threats appear in both branches of the taxonomy of mitigation strategies: a user should be aware of the consequences before sharing information, as well as of what data are in fact collected about them in order to verify their accuracy. The strategy that maximizes the accuracy by empowering the subject to update and delete collected data (either directly, or indirectly by requesting deletion) is linked to the subtree of non-repudiation that requires deniability by editing the database.
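The element- and threat-based distinctions above amount to a lookup from (DFD element, threat) pairs to taxonomy branches. The sketch below distills a partial version of that mapping from the text; the string labels are informal shorthand for the threat trees and taxonomy branches, not official LINDDUN identifiers.

```python
# Partial mapping distilled from the prose above:
# (DFD element, LINDDUN threat) -> mitigation strategy branch.
THREAT_TO_STRATEGY = {
    ("entity", "linkability"): "conceal association / protect ID",
    ("entity", "identifiability"): "conceal association / protect ID",
    ("data flow", "linkability"): "conceal association / protect data",
    ("data flow", "identifiability"): "conceal association / protect data",
    ("data flow", "detectability"): "conceal association / protect contextual data",
    ("data flow", "non-repudiation"): "conceal association / protect contextual data",
    ("data store", "information disclosure"): "guard exposure / confidentiality",
    ("data store", "non-repudiation"): "guard exposure / confidentiality",
    ("process", "information disclosure"): "guard exposure / confidentiality",
}

def strategy_for(element: str, threat: str) -> str:
    """Look up the taxonomy branch mitigating a given DFD-element threat."""
    return THREAT_TO_STRATEGY.get((element, threat), "no mapping recorded")
```

The full mapping table in the methodology also covers the minimization and accuracy branches omitted here for brevity.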

Select Corresponding Privacy Enhancing Technologies (PETs)

A threat can be mitigated by several types of solutions, each with their own benefits and drawbacks. The selection of a fitting solution is thus not straightforward. We therefore leverage mitigation strategies to enable an easier and more focused selection of appropriate privacy enhancing technologies. By first determining a proper strategy to mitigate the threat, we narrow the scope of relevant solutions.

LINDDUN provides a table of solutions that indicates which privacy properties each solution contributes to. The table represents the hierarchical taxonomy of mitigation strategies and links each leaf node of the taxonomy tree to the relevant solutions. Note that, although some solutions correspond to multiple strategies, we only assigned them to their key strategy (e.g. onion routing not only removes contextual data, but also hides transactional data). In addition, some strategies refer to solutions of related strategies to avoid duplication (e.g. generalization of transactional data refers to the encryption solutions in the guard exposure branch).

Once the mitigation strategy has been decided, a limited set of designated privacy enhancing techniques can be extracted from the solution table. The more detailed the proposed mitigation strategy, the more focused the selection of appropriate solutions will be. The table thus facilitates easy selection of proper privacy enhancing technologies to mitigate a specific threat.
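In code terms, the final step is a second lookup, from the chosen strategy to its candidate PETs. The excerpt below is a hypothetical fragment of such a solution table, populated only with solutions already named on this page; it is not the official LINDDUN table.

```python
# Hypothetical excerpt of a strategy-to-PET solution table.
SOLUTION_TABLE = {
    "protect ID / pseudonyms": ["pseudonymous identifiers"],
    "protect ID / properties": ["anonymous credentials", "zero-knowledge proofs"],
    "protect contextual data / remove": ["onion routing"],
    "guard exposure / confidentiality": ["access control", "data encryption"],
}

def candidate_pets(strategy: str) -> list:
    """Narrow the solution space once a mitigation strategy is chosen."""
    return SOLUTION_TABLE.get(strategy, [])
```

The more specific the strategy key (a leaf of the taxonomy rather than a whole branch), the shorter and more actionable the returned candidate list.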

This table will be updated with state-of-the-art privacy enhancing technologies on a regular basis. Please check back regularly to make sure you are working with the latest version.

LINDDUN privacy enhancing solutions table

Mapping of mitigation strategies to privacy enhancing solutions