Startpage interview about privacy and threat modeling
This post first appeared on Startpage's blog: https://www.startpage.com/privacy-please/privacy-advocate-articles/privacy-in-action-kim-wuyts-academic-privacy-researcher
Startpage interviewed Kim Wuyts, one of LINDDUN's principal developers. They talked about privacy and threat modeling as part of Startpage's 'Privacy in Action' blog series.
Read the full interview below or visit the Startpage blog, which contains many interesting interviews with privacy professionals.
Kim Wuyts is an academic privacy researcher at imec-DistriNet, at KU Leuven in Belgium.
She researches privacy and security engineering and data protection, with a focus on privacy threat modeling. She works on the LINDDUN privacy threat modeling methodology project. Check her out on Twitter if you’d love to learn more about the academic side of digital privacy!
Startpage: What does privacy mean to you?
Kim Wuyts: Privacy is a basic right, although it doesn’t often feel like that in today’s online world.
Privacy means that you can control who knows what about you. You should be in charge of your personal data. It therefore also means that companies who process your personal data do that in your best interest, with privacy by default and privacy by design at the core.
Startpage: We know confidentiality is one of the components of the CIA Triad of cybersecurity. Is there a difference between confidentiality and privacy?
Kim Wuyts: Yes! It is probably one of the most common privacy misconceptions: considering confidentiality and privacy as synonyms. In fact, similar to the CIA security triad, there is a privacy triad as well: unlinkability, intervenability and transparency. Or the NIST counterpart: disassociability, manageability and predictability.
Confidentiality is indispensable to obtain privacy, but it is not at the core. Minimize personal data as much as possible. Allow data subjects to control access and correct data (intervenability). Openly communicate about what personal information is being processed for what specific purposes (transparency). Those are the key privacy properties to embed in a system’s design.
Startpage: Explain threat modelling to a newbie. What is it?
Kim Wuyts: In short, threat modelling is thinking about what can go wrong so you can fix it before it actually happens.
You basically ask yourself four questions when threat modeling: what are we working on? What can go wrong? What are we going to do about it? Did we do a good job? First you need to understand what you are analyzing. So, you need a model of the system. For threat modelling, that is often a data flow diagram, but a simple whiteboard sketch or other type of diagram already gets you started.
Next, you need to think about what issues can occur in your system. You systematically go over the different components of your model and think about security and privacy problems. Existing threat knowledge can guide you in this process. For security, there are for instance STRIDE threat trees and the Elevation of Privilege (EoP) card game as supporting threat knowledge. For privacy, you can use the LINDDUN threat trees or LINDDUN GO threat cards to help you think about potential privacy harms.
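The elicitation step described here can be sketched in a few lines of code: pair every element of your system model with every LINDDUN threat category and treat each pair as a question to answer. This is only an illustrative sketch, not part of the LINDDUN tooling; the data flow diagram elements are hypothetical, and the seven category names are taken from the classic LINDDUN acronym.

```python
from itertools import product

# The seven LINDDUN privacy threat categories (the acronym itself).
LINDDUN_CATEGORIES = [
    "Linkability", "Identifiability", "Non-repudiation", "Detectability",
    "Disclosure of information", "Unawareness", "Non-compliance",
]

# A toy data flow diagram: element name -> element type (illustrative only).
dfd_elements = {
    "patient (external entity)": "entity",
    "health app (process)": "process",
    "measurement DB (data store)": "data store",
    "app -> DB (data flow)": "data flow",
}

def elicit_candidate_threats(elements):
    """Pair every DFD element with every LINDDUN category.

    Each pair is a question to ask ('could this element be subject to
    this kind of threat?'), not a confirmed threat.
    """
    return [(elem, cat) for elem, cat in product(elements, LINDDUN_CATEGORIES)]

candidates = elicit_candidate_threats(dfd_elements)
print(len(candidates))  # 4 elements x 7 categories = 28 questions to consider
```

In practice the threat trees or LINDDUN GO cards supply the knowledge needed to answer each of these questions; the point of the sketch is only that elicitation is systematic, not ad hoc.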
Once you have identified what can go wrong, you likely want to resolve this. You prioritize the identified threats based on their risk and you tackle them using tactics, strategies, patterns, privacy enhancing solutions and so on.
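The prioritization step can be sketched with a common lightweight heuristic: score each threat's likelihood and impact and rank by their product. The threats and scores below are hypothetical examples, not findings from any real analysis, and real risk assessment methods are usually richer than this.

```python
# Hypothetical threats with illustrative likelihood/impact scores (1-5 scale).
threats = [
    {"name": "Linkability of measurement records", "likelihood": 4, "impact": 3},
    {"name": "Identifiability via login data",     "likelihood": 2, "impact": 5},
    {"name": "Non-compliance: missing consent",    "likelihood": 3, "impact": 4},
]

def risk(threat):
    # A common lightweight heuristic: risk = likelihood x impact.
    return threat["likelihood"] * threat["impact"]

# Tackle the highest-risk threats first.
for t in sorted(threats, key=risk, reverse=True):
    print(f"{risk(t):2d}  {t['name']}")
```

The ranked list then drives mitigation: the highest-scoring threats get tactics, patterns, or privacy-enhancing solutions applied first.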
As a final step, you reflect on the threat model you have created. Did you do a good job? And you re-iterate when required. Methods and knowledge to apply threat modeling may vary based on team preferences, system or organization requirements, and so on. This four-question structure, however, applies to all security and privacy threat modelling approaches.
Startpage: How is privacy threat modelling similar or different to threat modelling in general?
Kim Wuyts: The main difference between security and privacy threat modelling (or between security and privacy engineering in general for that matter) is the mindset with which you approach the analysis. Security focuses on the system’s assets to protect. Privacy takes the perspective of the data subject and examines if and how the privacy rights of the data subjects involved in the system might be violated.
Privacy analysis therefore considers not only external 'attackers' as threat actors; internal misbehavior also has a big impact on the data subjects' privacy. The system itself might process lawfully obtained data in a privacy-violating way.
Startpage: Have academics made exciting new discoveries lately when it comes to protecting data?
Kim Wuyts: First of all, there is no silver bullet that will easily resolve all privacy issues. Protecting data has many facets. Security and privacy solutions should be improved to keep up with constantly evolving technological developments (including the new threats they might introduce) and societal requirements. Privacy design approaches should be enhanced to match evolving development practices. Legal, organizational and technical measures should be aligned.
At DistriNet, we have several research tracks that tackle security and privacy problems and thereby contribute to the overall goal of protecting data. To name just a few: privacy and security engineering; security- and privacy-focused evaluation of existing technologies and techniques; and attacks, vulnerabilities, exploits and the detection and evasion thereof (for instance, WiFi network vulnerability attacks, CPU vulnerability attacks, automated website fingerprinting, malware detection, and so on).
See the full list of our publications on the DistriNet website.
But of course, the academic world is much broader than our research group. One recent, very timely academic outcome was actually triggered by the COVID-19 pandemic, which raised the urgent need for contact tracing. A team of academic experts developed DP3T, a secure, privacy-preserving proximity tracing system that forms the foundation of many of the COVID-19 contact tracing apps rolled out throughout the EU.
The machine learning community also has a big impact on security and privacy. Machine learning and deep learning techniques can greatly strengthen security and privacy practices. Of course, this might bring new privacy concerns to the surface. Definitely an interesting domain to keep an eye on.
Startpage: What are some misconceptions about digital privacy that laypeople often have?
Kim Wuyts: “I have nothing to hide.” You might not lie awake at night because you have huge secrets you don’t want anyone to know, but that still doesn’t mean you really have nothing to hide. You might be an open book to your close friends and family, but do you also want your neighbors, colleagues, acquaintances, or even strangers to know your deepest thoughts?
Imagine every online action you do is projected on a big screen in front of your house. Everyone who drives by gets to see the pictures of your not-so-sober night out that you synchronize to your cloud storage, the slightly inappropriate messages you send to your lover, the big order of toilet paper and pasta you just bought. Feeling uncomfortable yet? Good! All of this information is already available online at your storage host, service provider, online store, messaging service, and the like. Imagine what they can do with all of this information.
“I am not a celebrity. Nobody is interested in me or my data.” While it is true that you might not be the victim of an attack that targets you personally, your personal data is a useful addition to the pile of ‘big data’ that companies collect and process. This can be used to profile you, infer and predict your behavior, and even nudge you into certain actions. Large-scale blackmail campaigns also abuse personal information, convincing people to pay in exchange for keeping sensitive information secret.
Startpage: What are some things ordinary people can do to better protect their privacy?
Kim Wuyts: Use two-factor authentication when possible. Don’t reuse passwords. Use a password manager to keep track of them. Be conscious of ‘dark patterns’ that trick you into accepting the less privacy-friendly option. For instance, when you are asked about your cookie preferences, the ‘only necessary cookies’ option is typically greyed out and much smaller than the highlighted ‘full functionality’ option.
Be aware that if a service is free, it is very likely that you (or at least your data) are the product. Look for more privacy-preserving alternatives when possible.