this post was submitted on 21 Aug 2025
1238 points (100.0% liked)
Technology
I asked ChatGPT about this article and to leave any bias behind. It got ugly.
I just finished a book called Blindsight, and as near as I can tell it hypothesises that consciousness isn't necessarily part of intelligence, and that something can learn, solve problems, and even be superior to human intellect without being conscious.
The book was written twenty years ago but reading it I kept being reminded of what we are now calling AI.
Great book btw, highly recommended.
The Children of Time series by Adrian Tchaikovsky also explores this. Particularly the third book, Children of Memory.
Think it’s one of my favourite books. It was really good. The things I’d do to be able to experience it for the first time again.
I only read Children of Time. I need to get off my ass
Highly recommended. Children of Ruin was hella spooky, and Children of Memory had me crying a lot. Good stories!
I'm a simple man: I see a Peter Watts reference, I upvote.
On a serious note, I didn't expect to see a comparison with current-gen AIs (because I read it a decade ago), but in retrospect Rorschach in the book shared traits with an LLM.
In before someone mentions P-zombies.
I know I go dark behind the headlights sometimes, and I suspect some of my fellows are operating with very little conscious self-examination.
Blindsight by Peter Watts, right? Incredible story. Can recommend.
Yep that's it. Really enjoyed it, just starting Echopraxia.
It's "hypotheses" btw.
Hypothesiseses
You actually did it? That's really ChatGPT's response? It's a great answer.
Yeah, this is ChatGPT 4. It's scary how good it is at generative responses, but like it said, it's not to be trusted.
This feels like such a double head fake. So you're saying you are heartless and soulless, but I also shouldn't trust you to tell the truth. 😵💫
Everything I say is true. The last statement I said is false.
I think it was just summarising the article, not giving an "opinion".
The reply was a much more biased take than the article itself. I asked ChatGPT myself and got a far more analytical review of the article.
It's got a lot of stolen data to source and sell back to us.
Yeah maybe don't use LLMs
Why the British accent, and which one?!
Like David Attenborough, not a Tesco cashier. Sounds smart and sophisticated.
Go learn simple regression analysis (not necessarily the commenter, but anyone). Then you'll understand why it's simply a prediction machine. It's guessing probabilities for what the next character or word is. It's guessing the average line, the likely follow-up. It's extrapolating from data.
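For anyone who wants to see that intuition in code, here's a toy sketch: a bigram counter in Python that "predicts" the next word from observed probabilities. It's nothing like a real transformer, and the corpus and function names are made up purely for illustration, but the core idea of next-token prediction is the same.

```python
# Toy next-word predictor: count bigrams in a tiny corpus, then pick the
# most likely follow-up word. A real LLM is vastly more elaborate, but the
# underlying task is still "predict the next token from probabilities".
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most probable next word, or None if the word was never seen."""
    followers = bigrams.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

print(predict_next("the"))   # -> 'cat' (follows 'the' twice, vs once each for 'mat'/'fish')
print(predict_next("cat"))   # -> 'sat' ('sat' and 'ate' tie; Counter keeps first-seen order)
```

Scale that counting idea up to billions of parameters and trillions of tokens and you get the "average line, likely follow-up" behaviour described above.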
This is why there will never be "sentient" machines. There is and always will be inherent programming and fancy ass business rules behind it all.
We simply set it to max churn on all data.
Also, just training these models has already done the energy damage.
Not if you run your own open-source LLM locally!
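If anyone wants to try that, here's a minimal sketch using the Hugging Face transformers pipeline. The model name gpt2 is just a stand-in because it's small enough to run on almost any machine; swap in whichever open model your hardware can handle.

```python
# Minimal local text generation with an open-source model.
# Requires: pip install transformers torch
# "gpt2" is only an example; any open text-generation model on the Hub works the same way.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Consciousness isn't necessarily part of intelligence because",
    max_new_tokens=40,   # how much text to generate
    do_sample=True,      # sample instead of greedy decoding
    temperature=0.8,     # a bit of randomness
)
print(result[0]["generated_text"])
```

The first run downloads the weights to a local cache; after that it works offline, so nothing leaves your machine.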
It’s automated incompetence. It gives executives something to hide behind, because they didn’t make the bad decision, an LLM did.
Can you share the prompt you used for making this happen? I think I could use it for a bunch of different things.
This was 3 weeks ago. I don't remember it, sorry.