Video version

Podcast version if you hate pictures

[-] gerikson@awful.systems 12 points 1 month ago

Stupid sexy robot judge...

One thing about an adversarial judicial system like the American one is that if one party sloppily uses GenAI to write their documents, they can lose, because the other party can point that out. A lot of the excuses for using LLMs in software development amount to "modern software development is terrible anyway", so if you can get your slop to market faster than some other schlub, you probably won't lose customers. When there's a balanced incentive to point out hallucinations, they (hopefully) won't get that far.

[-] BlueMonday1984@awful.systems 10 points 1 month ago

When there’s a balanced incentive to point out hallucinations, they (hopefully) won’t get that far.

That I can see. Unlike software "engineering", law is a field which has high and exacting standards - and faltering even slightly can lead to immediate and serious consequences.

[-] zogwarg@awful.systems 7 points 1 month ago

I guess the type of lawyer that does this would be the same type that offloads research to paralegals without properly valuing that as real work, and somehow believes it can be substituted by AI. Maybe they never engage their braincells and just view lawyering as a performative dance to appease the legal gods?

[-] dgerard@awful.systems 9 points 1 month ago

word i hear is that too many lawyers fucking love this shit, they see plausible words and think that's sufficient to ~~replace~~ supplement the office peons

then this happens to them and lol

[-] uranibaba@lemmy.world 2 points 1 month ago

I get the problem with using made-up citations in your filing, but the idea for Harvey mentioned in the article is not all bad. If they can combine their LLM with a database of cases and build the software so it never cites any case not in that database, they would have a great start.
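As a rough illustration (not how Harvey actually works, and the citation pattern and case set here are made up for the example), the constraint could be as simple as: pull out whatever looks like a citation and flag anything that isn't in the verified database for human review.

```python
import re

# Hypothetical set of citations known to exist in a verified case database.
VERIFIED_CITATIONS = {
    "410 U.S. 113",   # Roe v. Wade
    "347 U.S. 483",   # Brown v. Board of Education
}

# Very rough pattern for US reporter citations like "410 U.S. 113" or "999 F.3d 123".
CITATION_RE = re.compile(r"\b\d{1,4}\s+[A-Z][\w.]*\s+\d{1,4}\b")

def flag_unverified_citations(draft: str) -> list[str]:
    """Return every citation-looking string in the draft that is not in the verified database."""
    found = CITATION_RE.findall(draft)
    return [c for c in found if c not in VERIFIED_CITATIONS]

draft = "As held in 410 U.S. 113 and reaffirmed in 999 F.3d 123, ..."
print(flag_unverified_citations(draft))  # ['999 F.3d 123'] -- needs a human to check it
```

Of course this only tells you a citation exists somewhere, not that the model described the case accurately, which is the objection raised below.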

[-] rook@awful.systems 25 points 1 month ago* (last edited 1 month ago)

When confronted with a problem like “your search engine imagined a case and cited it”, the next step is to wonder what else it might be making up, not to just quickly slap a bit of tape over the obvious immediate problem and declare everything to be great.

The other thing to be concerned about is how lazy and credulous your legal team are that they cannot be bothered to verify anything. That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

[-] diz@awful.systems 12 points 1 month ago* (last edited 1 month ago)

When confronted with a problem like “your search engine imagined a case and cited it”, the next step is to wonder what else it might be making up, not to just quickly slap a bit of tape over the obvious immediate problem and declare everything to be great.

Exactly. Even if you ensure the cited cases or articles are real, it will misrepresent what said articles say.

Fundamentally it is just blah-blah-blah-ing until the point comes when a citation would be likely to appear, then it blah-blah-blahs the citation based on the preceding text that it just made up. It plain should not be producing real citations. That it can produce real citations is deeply at odds with it being able to pretend at reasoning, for example.

Ensuring the citation is real, RAG-ing the articles in there, having AI rewrite drafts, none of these hacks do anything to address any of the underlying problems.

[-] kbotc@lemmy.world 6 points 1 month ago

Yea, and if you’re going to let the AI write the structure and have a lawyer go and rewrite the whole thing after validating it, why not remove the step and just have said lawyer actually write the brief and put their accreditation on the line?

[-] BlueMonday1984@awful.systems 8 points 1 month ago

That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

That goes some way to explaining why programmers don't have a moral compass.

[-] o7___o7@awful.systems 7 points 1 month ago

We have got to bring back the PE exam for software engineering.

[-] uranibaba@lemmy.world 1 points 2 weeks ago

When confronted with a problem like “your search engine imagined a case and cited it”, the next step is to wonder what else it might be making up, not to just quickly slap a bit of tape over the obvious immediate problem and declare everything to be great.

That is why I called it a great start and not a finished product. I imagine there are a lot of legal cases to sift through, and it is a lawyer's job to at least keep track of the important ones (those which set precedent), but knowing that there are multiple "lesser" rulings in your favour could be useful. And having a search engine that can find those based on a description of your current case? Not a bad idea to me.
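To be fair, the retrieval half of that idea doesn't even need an LLM. A bare-bones sketch of description-based case search, assuming scikit-learn is installed and using placeholder case names and summaries in place of a real database:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical case summaries standing in for a real, verified case database.
cases = {
    "Smith v. Jones": "Dispute over breach of a software licensing contract.",
    "Doe v. Acme Corp": "Negligence claim arising from a defective consumer product.",
    "State v. Roe": "Criminal appeal concerning admissibility of digital evidence.",
}

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(list(cases.values()))  # one row per case summary

def find_similar(description: str, top_n: int = 2):
    """Rank stored cases by textual similarity to a description of the current case."""
    query = vectorizer.transform([description])
    scores = cosine_similarity(query, matrix)[0]
    ranked = sorted(zip(cases.keys(), scores), key=lambda x: x[1], reverse=True)
    return ranked[:top_n]

print(find_similar("our client is being sued for breaking a software contract"))
```

Every hit is a real, retrievable case the lawyer still has to read, which is also roughly what the existing legal research databases mentioned below already do.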

The other thing to be concerned about is how lazy and credulous your legal team are that they cannot be bothered to verify anything. That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

I can only agree here.

[-] swlabr@awful.systems 3 points 2 weeks ago* (last edited 2 weeks ago)

That is why I called it a great start and not a finished product. I imagine there are a lot of legal cases to sift through, and it is a lawyer's job to at least keep track of the important ones (those which set precedent), but knowing that there are multiple "lesser" rulings in your favour could be useful. And having a search engine that can find those based on a description of your current case? Not a bad idea to me.

Such databases have existed since basically the conception of common law, like a thousand fucking years ago. Good solutions exist and have existed without AI til today. It’s not a great start, it’s a running leap backwards off of a cliff into a trough of slop.

[-] uranibaba@lemmy.world 1 points 2 weeks ago

What's even the point in engaging in a discussion if you are going to dismiss anything related to AI out of hand?

I'm not talking about generating cases. I am talking about creating software that can help you find cases matching your current one. You will get a list, look at it, and keep any case that is of use. If the output is bad, the software is bad. Just like any search engine.

Did you even bother to read anything I wrote or did you just see the word "AI"? I'm done.

[-] self@awful.systems 2 points 2 weeks ago

I agree, you are fucking done. good job showing up 12 days late to the thread expecting strangers to humor your weird fucking obsession with using LLMs for something existing software does better
