submitted 1 day ago* (last edited 1 day ago) by dgerard@awful.systems to c/techtakes@awful.systems

video version

I do believe this may be the first youtube video on the issue too

[-] o7___o7@awful.systems 12 points 1 day ago

Please let this be a bit of performance art.

[-] scruiser@awful.systems 10 points 1 day ago

Using just the author's name as input feels deliberately bad. Like, the promptfondlers generally emphasize how important prompting it right is; it's hard to imagine them deliberately going minimalist with the prompt.

[-] zogwarg@awful.systems 8 points 19 hours ago* (last edited 19 hours ago)

It's also such a bad description since, from their own post, the Bot+LLM they were using was almost certainly feeding itself data found by a search engine.

That's like saying, no I didn't give the amoral PI any private information, I merely gave them a name to investigate!

EDIT: Also lol at this part of the original disclaimer:

An expert in LLMs who has been working in the field since the 1990s reviewed our process.

[-] scruiser@awful.systems 4 points 14 hours ago

That disclaimer feels like parody given that LLMs have existed for under a decade and have only been popular for a few years. Like it's mocking all the job ads that ask for 10+ years of experience with a programming language or library that has literally only existed for 7 years.

