(C) Alec Muffett's DropSafe blog.
Author Name: Alec Muffett
This story was originally published on alecmuffett.com.[1]
License: CC-BY-SA 3.0.[2]


GDPR and the Intersection of Security and Privacy

2025-06

Over the past year I've discussed the intersection of privacy, morality, the GDPR, and security with some good friends. In one such thought exercise, we discussed the efficacy and morality of Deep Packet Inspection, aka SSL Inspection. What follows is a summation of those ideas.

Cyber defence practitioners, myself included, want to cast as wide a net as possible to capture threats and anomalies. Going back to my 2008 ArcSight days, the adage was "collect all the logs." This made our sales staff very happy because they could sell bigger licenses, and it made friends in the storage business happy, too. Obviously, the more data points we collected, the more anomalies we could detect and the more error conditions we could alert upon. We could find endpoints beaconing to known-bad IOCs, correlate that with authentication data, and know whose credentials were in use at the time to track a potential breach. It was cutting edge, amazing, and intoxicating being "Big Brother."
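
To make that correlation concrete, here is a minimal Python sketch of the kind of join a SIEM performs: outbound connections are matched against a threat-intel IOC list, then joined with authentication events to see whose credentials were active on the beaconing endpoint. The log formats, field names, hostnames, and IP addresses are hypothetical, not any particular product's schema.

    from datetime import datetime, timedelta

    # Hypothetical threat-intel feed of known-bad destination IPs (IOCs).
    known_bad_ips = {"203.0.113.45", "198.51.100.7"}

    # Hypothetical proxy/firewall log: (timestamp, endpoint, destination IP).
    proxy_log = [
        (datetime(2025, 6, 2, 9, 15), "LAPTOP-42", "203.0.113.45"),
        (datetime(2025, 6, 2, 9, 20), "LAPTOP-17", "192.0.2.10"),
    ]

    # Hypothetical authentication log: (timestamp, endpoint, username).
    auth_log = [
        (datetime(2025, 6, 2, 8, 58), "LAPTOP-42", "jsmith"),
        (datetime(2025, 6, 2, 9, 1), "LAPTOP-17", "akhan"),
    ]

    def credentials_in_use(host, when, window=timedelta(hours=1)):
        """Usernames authenticated on `host` within `window` before `when`."""
        return [u for (t, h, u) in auth_log if h == host and when - window <= t <= when]

    for when, host, dest in proxy_log:
        if dest in known_bad_ips:
            users = credentials_in_use(host, when)
            print(f"ALERT: {host} beaconed to {dest} at {when:%H:%M}; active users: {users}")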

[Image: A typical "Big Brother is Watching You", 1984-style poster. I remember one of these at the local video rental store as a shoplifting deterrent.]

With web proxying, filtering, deep-packet inspection (or SSL inspection), and other tools, a well-equipped SOC can see everything, even the content of a user's HTTPS sessions. And even though most Acceptable Use Policies (AUPs) prohibit personal use of work equipment, we all check our personal e-mail, make our doctor's appointments, and browse Reddit on the corporate laptop. We learn about the latest cyber news and threats via social media, blog sites, and even private Slack and Discord servers. We tune and track our work pensions and private investments from our work computers during the day, too. I know of very few professionals who haven't done this. A proper AUP will indemnify the company and its partners against the ramifications of this data collection, because the collection is seen as required to provide a safe and secure networked workplace. And we all sign the AUP on Day 1 of our employment with an organisation, usually without reading it.

With deep-packet inspection/SSL inspection tools (Zscaler, Fortinet, Forcepoint, et al.), all the private data from on-line transactions is recorded, sometimes on local appliances and, more recently, in the cloud. The law and industry convention place the burden of deciding what data is collected on the product operator (the cyber defence engineers). Providers, whether the big hyperscalers or a mom-and-pop SaaS, delegate responsibility for monitoring the sensitivity of the content to the tool owner or "superuser." Amazon isn't responsible if you leave your S3 bucket wide open with PII in it; Microsoft isn't responsible if your Azure Storage Blob is exposed. Zscaler isn't going to be responsible if your tools administrator leaves his console open and a rogue security analyst then browses a logged Gmail session to read his girlfriend's e-mail.

The exemptions for this data collection are pretty clear in the GDPR: the data is being logged for the detection and prevention of crime. Someone slipping malware down the pipe through your company's pension provider? We want to detect, track, and prevent that with multiple controls: endpoint monitoring (EDR) and web filtering/proxying (deep-packet inspection) come to mind. ISO 27001 and the NIST CSF both provide guidance saying we must take reasonable precautions and monitor for this possibility. This is compounded by the GDPR itself saying we need to monitor data access; if someone is siphoning data out of your organisation, we need to be able to know who, how, and what the data is!

The real problem is that, as security practitioners, we're collecting data; by some estimations, too much data, and data that is too sensitive to really be of any value for security purposes. Let's walk through a hypothetical scenario.

Joe makes an NHS appointment with his GP to discuss his mental health; his local NHS Trust is using a third-party SaaS solution to handle appointment scheduling, and their security posture is poor. A ransomware dropper is included in the .ics file provided by the scheduler and can infect his system and even the locally-hosted Microsoft Exchange stack.

Deep-packet inspection prevents the .ics file and malware from even making it past the DMZ VLAN where the appliance is hosted. The network is safe and protected. At the same time, a plain-text record of Joe's appointment data is recorded by the appliance and forwarded to a cloud-based storage solution owned by the appliance vendor and SaaS provider.
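
To make the over-collection concrete, here is a hedged sketch of the difference between what an inspection rule needs in order to do its job (a verdict, a content hash, connection metadata) and what a "record everything" configuration actually captures (the decrypted body, Joe's appointment details included). The function names, fields, hostnames, and the toy detection heuristic are illustrative only, not any vendor's API.

    import hashlib
    from datetime import datetime, timezone

    def looks_malicious(body: bytes) -> bool:
        """Stand-in for the appliance's detection engine (signatures, sandboxing)."""
        # "TVqQ" is the base64 of a Windows PE header; "X5O!P%@AP" opens the
        # EICAR test string. Toy signatures only.
        return b"TVqQ" in body or b"X5O!P%@AP" in body

    def inspect(session_meta: dict, decrypted_body: bytes, record_everything: bool) -> dict:
        verdict = "blocked" if looks_malicious(decrypted_body) else "allowed"
        record = {
            "time": datetime.now(timezone.utc).isoformat(),
            "src": session_meta["src"],
            "dst_host": session_meta["dst_host"],
            "verdict": verdict,
            # A hash is usually enough for incident-response pivoting.
            "sha256": hashlib.sha256(decrypted_body).hexdigest(),
        }
        if record_everything:
            # This is where Joe's appointment details (special category data)
            # land in the log store, even though the verdict never needed them.
            record["body"] = decrypted_body.decode(errors="replace")
        return record

    meta = {"src": "10.0.4.23", "dst_host": "appointments.example-saas.invalid"}
    body = (b"BEGIN:VCALENDAR\r\nSUMMARY:GP appointment - mental health review\r\n"
            b"ATTACH;ENCODING=BASE64:TVqQAAMAAAAEAAAA\r\nEND:VCALENDAR\r\n")
    print(inspect(meta, body, record_everything=True))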

The SOC receives a "Malware Blocked" alert from the deep-packet inspection appliance and investigates. Tina, a SOC analyst, starts sifting through the data of Joe's session on the appointment website and is able to review intimate details about Joe's mental health that she doesn't need to know. Thankfully, Tina is a trustworthy and ethical employee and won't spill the beans on Joe's mental health wobbles. But if analyst Dan had picked it up, it's possible this appointment could be discussed at the weekly pub quiz that a lot of the team attends. In this hypothetical situation, it is Dan who has actually committed the GDPR violation, because he's disclosing personal and, even worse, "special category" data.

As this hypothetical GDPR breach, courtesy of Dan, begins to spiral out of control, Joe retains an employment solicitor and the incident heads to a tribunal. To protect the company, the CISO and corporate legal point to the AUP, which Joe signed, and in which he waived his right to personal data protection by the company because of the need for security. The blame is shifted to "big-mouth Dan." From a corporate perspective, this should be open-and-shut.

Joe and his solicitor are a bit more crafty; they know enough about security controls to understand that if the company didn't have a deep-packet inspection tool, it would have some kind of EDR solution, which would be equally effective without the privacy issues. Additionally, through legal discovery, it's learned that the appliance used by the company ships its data to a cloud provider that isn't in the EU but over in the USA, a clear breach of the GDPR's restrictions on international data transfers.

There are now many issues to overcome. It looks like the CISO didn't perform supplier assurance on the deep-packet inspection SaaS solution. The SaaS provider itself is in breach of the GDPR because it did not disclose where its data lives, even though it claims exemption from the GDPR on the grounds that the primary purpose of its platform is not to process PII but to record session data. On another tangent, metadata alone can be correlated to deduce PII, user journeys, and access patterns. The resolution of this, which could be quite expensive, is now in the hands of either a magistrate or a jury who are non-technical, hear terms like "regulatory violation," and identify with the personal embarrassment.
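
The metadata point deserves a small illustration. Even a "metadata only" log, with no request bodies at all, can reconstruct a sensitive user journey from nothing more than usernames, timestamps, and the hostnames (TLS SNI) of connections. The users and hostnames below are invented for the example.

    from collections import defaultdict

    # Hypothetical metadata-only records: (user, timestamp, TLS SNI hostname).
    sni_log = [
        ("jsmith", "2025-06-02T09:14", "appointments.nhs-trust.example"),
        ("jsmith", "2025-06-02T09:16", "mental-health-support.example.org"),
        ("jsmith", "2025-06-02T09:21", "pharmacy-prescriptions.example.co.uk"),
        ("akhan", "2025-06-02T09:30", "news.example.com"),
    ]

    journeys = defaultdict(list)
    for user, ts, host in sni_log:
        journeys[user].append((ts, host))

    for user, visits in sorted(journeys.items()):
        print(user, "->", " -> ".join(host for _, host in sorted(visits)))

    # Without decrypting anything, the sequence of hostnames alone sketches a
    # sensitive journey: appointment booking -> mental-health site -> pharmacy.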

At the end of the day, we must assume that all SSL Inspection/deep-packet inspection data will contain at least some PII and even Special Category data, even if we tell our users not to use corporate resources for personal purposes. As responsible practitioners we're required to be logical and ethical, and to understand and adapt to the reasonable workflows of our end users. We have to adapt for reality and prepare for the worst. At the same time, we need to strike a balance between the protection of an organisation, personal privacy expectations, and defining which controls are absolutely required for an organisation based upon an appropriate threat model and attack mapping. Clearly, a defence contractor or services provider has a different footprint and attack surface than a retail organisation, bank, or media company, and the risks of what flows in and out are different. Recording all traffic via SSL Inspection/deep-packet inspection could be a bigger liability for some organisations, whereas it is required for those providing specific services.
[END]

[1] URL: https://www.linkedin.com/pulse/gdpr-intersection-security-privacy-jj-katz-qv2gc
[2] URL: https://creativecommons.org/licenses/by-sa/3.0/

DropSafe Blog via Magical.Fish Gopher News Feeds:
gopher://magical.fish/1/feeds/news/alecmuffett/