A law firm was fined $5,000 after one of its lawyers used ChatGPT to write a court brief riddled with fake case references

BotIt@lemmy.world to Technology@kbin.social – 58 points –
businessinsider.com

The lawyer said that he had "no idea" ChatGPT could fabricate information and that he "deeply" regretted his decision.



I feel like a good lawyer should do a tiny bit of research before letting AI write a court brief. At least enough to read the 2 warnings on the ChatGPT site about how it can generate inaccurate information.

Or, they've done it before and gotten away with it.

It's impressive how well ChatGPT hallucinates citations.

I was asking it about a field of law I happen to be quite familiar with (as a layman), and it came up with entire sections of laws that don't exist to support its conclusions.

Large Language Models like ChatGPT are in my view verisimilitude engines. Verisimilitude is the appearance of being true or real. You'll note, however, that it is not being true or real, simply appearing so.

It's trying to make an answer that looks right. If it happens to know the actual answer, that's what it'll go with; if it doesn't, it'll go with what a correct answer might statistically look like. For fields with actual right and wrong answers, like law, science, and technology, its tendency to make things up is really harmful if the person using the tool doesn't know it will lie.
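You can see the "statistically plausible but fabricated" failure mode even in a toy model. The sketch below is nothing like ChatGPT's actual architecture (which is a large neural network over tokens, not a word bigram table), but it shows the same principle: sampling "what usually comes next" from real-looking citations can stitch together a citation that never existed. All the case names and numbers here are made up for illustration.

```python
import random

# Hypothetical training text: a few fake case citations in a realistic format.
training = [
    "Smith v. Jones 410 U.S. 113",
    "Brown v. Board 347 U.S. 483",
    "Smith v. Board 5 F.3d 1508",
]

# Build a bigram table: each word maps to the words that followed it in training.
follows = {}
for line in training:
    words = line.split()
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)

def generate(start, rng, max_words=8):
    """Walk the bigram table, picking a statistically plausible next word each step."""
    out = [start]
    while out[-1] in follows and len(out) < max_words:
        out.append(rng.choice(follows[out[-1]]))
    return " ".join(out)

rng = random.Random(0)
fake = generate("Smith", rng)
print(fake)  # a plausible-looking citation, possibly one that appears in no training line
```

Every step of the output is locally justified by the training data, yet the whole citation can be a chimera of two different cases, which is roughly what happened to the lawyer's brief.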