Source: Bing Image Creator: "Create an image illustrating the concept of privacy theater, depicting a stage with symbolic representations of privacy, such as a privacy policy, a cookie, a gavel, a mask, and a lock icon."​

Beyond Privacy Theatre, Or Why Our Laws Mean We Can’t Have Nice Things

I’ve been experiencing a lot of frustration (exasperation? nihilism?) when it comes to the practice of privacy and data protection as of late. 

Laws are changing rapidly, making it nearly impossible to keep up. Each new law tries to be just a bit more clever than the ones before it – and all of these laws are increasingly focused on symbolic gestures rather than meaningful action. That leaves practitioners and companies swimming upstream against a vast river of “privacy compliance” busywork that does little to actually protect data, individual rights, or anything else. And of course, everyone is collectively freaking out about AI and its privacy implications. Myself included.

To borrow a phrase from Bruce Schneier, it’s not really about privacy or data protection anymore – we’re now all performing “privacy theater”. 

The Thing and the Symbolic Representation of the Thing

I recently stumbled across a thought-provoking article, penned in 2015 by Zvi Mowshowitz on his excellent blog ‘Don’t Worry About the Vase’. The article is titled ‘The Thing and the Symbolic Representation of The Thing,’ and if you haven’t read it, go read it now. I’ll wait.  

In the article Mowshowitz posits the following:   

Let’s assume there is a thing that all would agree is, in context, a Good Thing(tm) that someone in your situation would want.

Do you want the thing, or do you want the symbolic representation of the thing?

While most of us might intuitively think “I want the thing!” – it turns out in practice that we don’t. We (or the market, regulators, or even users) want the representation of the thing. As Mowshowitz notes, “the system is more concerned with making sure everyone is able to signal, to others and to themselves, that they care deeply and that they are doing everything they can.” 

The actual effects of the decisions involved are, at best, secondary concerns. It’s the risk of not engaging in the theater that is the true mind-killer: 

[The fact that e]veryone involved is afraid of being sued (far beyond the statistical danger of actually being sued) every time they deviate from the standard of care makes this that much worse.  

Now, Mowshowitz wasn’t talking about data protection – he was discussing symbolic representations in the context of a medical startup and delivering health outcomes to patients. For example, he found that what customers really wanted from his startup wasn’t the information his company could provide to materially improve their health (the thing), but a fancy, expensive report, ideally delivered by someone well-credentialed at a good price point (the representation of the thing).  

Why Privacy Notices Are Symbolic & Usually Pointless Representations of the Thing

Let me just say that, while reading that article, I couldn’t help but see all the parallels between his struggle and my own.

The world of data protection has largely fallen into this symbolic representation trap. This is made manifest in all the ways that organizations drown us in information to demonstrate compliance and signal that they care about our privacy: the endless sea of privacy policies and notices that nobody reads, the non-negotiable e-book-length data processing addendums, the cookie notice popups that never remember your preferences, the ‘risk assessments’ that don’t meaningfully evaluate risk, or the security white papers that might as well be marshmallow fluff. It’s all so many words, with so little substance. 

These symbolic representations exist in our world for a few reasons. Firstly, it’s what legislators and regulators demand and, importantly, what they enforce against. This gets compounded further when you consider that with each new law, a slightly different incantation must be uttered to really demonstrate that processing is being done on the up-and-up. And woe betide any organization that forgets to include a jurisdiction’s specific magic words or new spin on the thing the other guy said… 

Not that we’d notice. We don’t have enough days to read the damn things in the first place. 

But, as Woodrow Hartzog notes in his excellent book Privacy’s Blueprint, much of this is also due to our collective and slavish devotion to the Fair Information Practice Principles (FIPPs) and fetishization of ‘user control’ and choice. Want to do something dodgy with data? That’s fine! All you need to do is bury everything in a dry, technical, overly legalistic 36-page privacy notice. As long as you’re not lying and you get our ‘consent’ (read: our apathetic non-response), you’ll be grand!

But don’t you dare forget to mention how you won’t be engaging in the ‘sale or sharing’ of data. 

I’ll note that very little of what’s in the FIPPs touches on practical mechanisms to protect data or ensure that people’s fundamental rights to privacy are upheld. That’s not entirely surprising, given that the FIPPs were written in the ‘70s by the US Government to address what was, at the time, still a very niche problem faced by a small subset of paper-pushing institutions. But applying the FIPPs to data processing in the 21st century no longer makes sense. As Hartzog notes, their sole purpose at this point is to overwhelm individuals with “information and choices instead of substantively protecting” their privacy.  

This is why more and more of my time as a consultant and external DPO is spent not in improving a company’s data protection practices (fixing the thing), but in writing my own ‘fancy reports’, and reading and making sense of everybody else’s. It’s a vicious, miserable, and mostly pointless exercise, and it’s getting worse. We’re moving ever more into a world where privacy and data protection programs and efforts exist so that, as Mowshowitz notes:

people could tell themselves they were doing a Responsible Business Thing that businesses needed to do, rather than working to get the benefits that thing would provide if it was actually done.

Thanks, I hate it. And you should too. 

How Do We Get Back to the Thing? (AKA, Doing Privacy & Data Protection For Real) 

Listen. I’m not saying that privacy notices, risk assessments, or training aren’t important. They are. But we need to ask ourselves: how much effort should we be investing in writing words that symbolically represent data protection, instead of actually protecting people’s privacy and data rights? Fortunately, I’ve got some thoughts on that.  

First, we all need to come to grips with the fact that our privacy laws are meaningless if the companies using and exploiting our personal data can simply hire fancy lawyers and consultants to write more words slightly differently and still do whatever they like. Regulators need to focus less on whether a company is complying with symbolic representations (Was your cookie policy compliant enough? Was your ROPA complete enough? Did you do a legitimate interest or transfer risk assessment for X thing?), and more on the abuses of our data that we can all agree are pretty lousy. Looking at you, data brokers, facial recognition sites, and creepy spycam vendors. 

Taking a page from Hartzog’s Privacy’s Blueprint, instead of privacy representations we need to set actual concrete standards and build tools and products that work for everyone – not just the companies who stand to profit from our data. 

Just like you can’t (legally) sell patent medicines, build a house without adhering to local building codes, or sell food without meeting health & safety standards, companies shouldn’t be able to operate and earn a profit doing obviously bad things with people’s data. Lawmakers and regulators should help better define the standards and practices that respect privacy, and crack down on abusive, deceptive, and privacy-invading behaviors and the organizations that make their livelihoods exploiting those behaviors. Stop treating all technology as neutral, particularly when it’s blindingly obvious that it’s not. 

Second, if the EDPB and the European Commission are concerned with the effect of Big Tech (and government surveillance) on the personal data of EU data subjects, then they need to start acting on those concerns. Not through ineffective, time-wasting exercises like transfer risk assessments and endless pandering to Noyb, but by actually forcing their hand. Ban Clearview AI, TikTok and Meta. Stop doing business with the US, China and other countries with de facto or de jure weak privacy laws, unless said countries agree to meaningful privacy reforms and enforce them. Fund efforts at improving actual privacy at scale. Stop giving your own governments a pass when it comes to data collection and abuse.   

Third, we need more initiatives that encourage, fund, and implement privacy-enhancing technology developments, like end-to-end encryption, differential privacy, zero-knowledge proofs, and secure multi-party computation. We need to work on making these tools easy, cost-effective, and seamless to implement, and regulators need to be at the vanguard, working with researchers, institutions and companies focused on development and adoption of these tools. 
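To make at least one of these techniques concrete, here is a minimal sketch of differential privacy’s classic Laplace mechanism applied to a counting query. This is purely illustrative – the function names and toy dataset are my own, not from any particular library – but it shows the core idea: add noise calibrated to the query’s sensitivity, so that no single person’s record meaningfully changes the published answer.

```python
import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, b) noise, sampled as the difference of two
    # independent Exponential(1/b) draws.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1 (adding or removing one
    # person changes the count by at most 1), so Laplace noise with
    # scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy example: how many people in this (hypothetical) dataset are 65 or older?
ages = [23, 71, 34, 68, 90, 12, 45]
noisy = private_count(ages, lambda a: a >= 65, epsilon=1.0)
```

Smaller values of epsilon mean more noise and stronger privacy: the analyst still gets a useful aggregate, but no individual’s presence in the data can be confidently inferred from the answer.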

I will happily throw time, money & energy at this if it leads to meaningful outcomes that will protect everyone. Part of the reason I put so much uncharacteristically optimistic (for me) hope in technology is that I know that once a given technology becomes easy to use, affordable, and widely available, behaviors get nudged and things change. 

Ease, affordability, and ubiquity will always carry things farther than any mandatory policy or procedure. Think of airbags in cars or residential solar panels. Each started out as an expensive, limited-use thing that few of us had access to or could afford. Now, most cars have airbags, and an increasing number of homes around the world have panels on their rooftops. Another good example is HTTPS. In 2017, only about 50% of web requests were encrypted. Now roughly 95% of all requests are encrypted. This growth didn’t happen because of regulatory diktat – it occurred because Google began rewarding HTTPS sites in search rankings and flagging insecure pages in Chrome, and Let’s Encrypt began offering free, easy-to-set-up TLS certificates in 2016. 

Final Thoughts

It’s clear that the current state of privacy laws and data protection practices is more focused on symbolic gestures than meaningful action. That’s bad, not just for beleaguered DPOs like me, but for everyone. 

We are caught in a whirlpool of privacy theater, performing endless rituals of compliance that do little to protect the very data and individual rights we claim to value. It’s time for a shift in focus. We must prioritize real action – nurturing the growth and expansion of privacy-enhancing technologies, setting meaningful, practical & actionable design standards, and holding entities accountable for their actual data (ab)uses. That is what will build trust, change behaviors, and protect privacy and data – the actual things we care about. By emphasizing the practical and tangible aspects of data protection over their symbolic representations, we can finally move away from the superficial compliance hell we’ve found ourselves in, toward a future where privacy is not just a buzzword, but a reality.
