A very exasperated DPO, courtesy of Bing AI Image Generator. Prompt: "I need an image of a frustrated female professional who has her head on her desk, with a look of sheer frustration. A computer should be on the desk, and a monitor facing the viewer, with the following text on the screen: AI"

Wanna Make Your DPO Deeply Sad? Use AI to Analyze Client Personal Data

Do you hate your Data Protection Officer? Do you want to make them rage quit and throw their computer out the window before moving to a tropical island far, far away from the internet? Well, then, tell them you’re using ChatGPT to analyze client records. I guarantee you, in less than 10 seconds, they will look like the AI-generated image above.

Fortunately, none of my clients have made such a pronouncement to me … yet. But I’m sure it’s only a matter of time.

Marketing a Solution … That Will Create Problems

Tonight, I stumbled on a lawyer's glowing recommendation of one of the legal 'co-counsel' tools out there. Midway through gushing about how awesome it was, he dropped this line: "I uploaded over 1,000 pages of deposition testimony …" and I gasped loudly enough that my cats turned and looked at me, and my husband asked if everything was alright.

And only a week or so ago, I had a demo call with one of the sales reps for a leading Legal AI company. Save for me being unforgivably late, the meeting started out well: the sales rep boasted about the company's track record with legal search and case analysis, how they had a direct collaboration with OpenAI, and how, unlike ChatGPT in the wild, their model was still being trained on current cases, laws & regs.

But then he mentioned how amazing their AI co-counsel tool was at quickly and seamlessly reading through and analyzing depositions(!), internal memoranda (!!), and client medical records (!!!). It was then that a loud thud was heard, as my head very literally hit my desk.

Plenty of Promise, But So Much Lack of Clue

As I’ve mentioned before, I’m actually a big fan of the concept of incorporating LLMs into the legal space. I’ve played with some of the Legal AI tools out there (in addition to ChatGPT 3.5, 4 & now Bard). Hell, I even wrote a little Python script to analyze cases using OpenAI’s API (you can read about it here). There’s a ton of promise in this space, and it’s exciting.
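If you're curious what that kind of script looks like, here's a minimal sketch. It's not my actual script; the model, prompt, and file name are just illustrative. And note what's going in: a public court opinion, never client material.

```python
# Minimal sketch: summarize a *public* court decision with the OpenAI API.
# Illustrative only. Never, ever feed client data into this.
import openai

openai.api_key = "sk-..."  # your API key


def analyze_case(opinion_text: str) -> str:
    """Ask the model for the holding, reasoning, and disposition of an opinion."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a legal research assistant. Summarize the "
                    "holding, reasoning, and disposition of this opinion."
                ),
            },
            {"role": "user", "content": opinion_text},
        ],
        temperature=0.2,  # keep the summary conservative
    )
    return response.choices[0].message.content


with open("public_opinion.txt") as f:  # a published decision -- public record
    print(analyze_case(f.read()))
```

The point of the sketch is also the point of this post: every call here ships the full text to OpenAI's servers for processing. That's fine for a published decision. It is very much not fine for a deposition.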

But I’m also deeply concerned with how some of these tools are being marketed, and even more concerned about how some lawyers and law-adjacent types are using them in practice.

Practitioners should not lose sight of the fact that anything you feed in can be (and probably is being) used to further train the model, may be accessible to the company providing the tool (or to other third parties like OpenAI), and that the programs themselves are still very, very buggy and prone to things like data breaches and confidentiality leaks.

Sure, it’s grand that these tools can reasonably analyze cases and statutes. That’s fine; statutes and decisions are already public. But listen, my fellow law-types: if you are uploading confidential, sensitive client data (and yes, non-public client material, even depositions, should be treated as confidential, especially if it includes sensitive personal information about things like health, criminal activity, or sexual orientation), you are almost certainly committing malpractice by breaching client confidentiality. Don’t fucking do it.

The fact that these companies boast about helping attorneys do this at all is absolutely horrifying and, IMHO, deserves further scrutiny from regulators.

It’s Not Just the AI Hallucinating Understanding

When I asked the sales rep where the medical records and other sensitive client information were being held, who had access to the training models, whether his company’s employees could see snippets of the searches, and how it was all being secured … he had no answers. And he’s not the only one. I’ve had other calls with companies slinging similar services, and let’s just say “I’ll have to get back to you on that” is a pretty common refrain.

I’ve also read through quite a few of the privacy notices, security documents & DPAs attached to vendors in this space, and it’s apparent that it’s not only the sales team who isn’t informed — many of these companies clearly haven’t considered data protection, confidentiality, or security at all, much less thought about how to build in guard rails or limits in their products.

And while the US has pretty garbage privacy laws, the EU has the GDPR. Uploading a client depo into an LLM (which is almost certainly doing all of its processing and analysis in the cloud, either on the vendor’s servers or, more likely, OpenAI’s) will eventually land a lawyer before a state or federal bar, or a regulator. I promise, it’s not a matter of if, but when.

So, unless you 1) enjoy potentially being a test case; 2) enjoy making me sad; or 3) want your DPO or privacy counsel to rage quit, then for the love of all things holy, PLEASE DON’T USE THESE TOOLS TO ANALYZE CLIENT RECORDS. Yes, even depositions.
