[-] zogwarg@awful.systems 1 points 47 minutes ago

The standout monuments of stupidity (and/or monstrosity) in McCarthy's response, for me, are:

  • Calling JW a failed computer scientist for failing to see that computers and clockwork are different, when really there is no computation a computer can make that Turing-complete clockwork couldn't replicate.
  • Essentially saying, by analogy, that just as religion should not stand in the way of science, neither should morals?!?!?! (I mean really? WTF)
[-] nightsky@awful.systems 24 points 1 day ago

From McCarthy's reply:

My current answer to the question of when machines will reach human-level intelligence is that a precise calculation shows that we are between 1.7 and 3.1 Einsteins and .3 Manhattan Projects away from the goal.

omg this statement sounds 100% like something that could be posted today by Sam Altman on X. It's hitting exactly the sweet spot between sounding precise and being super vague, like Altman's "a few thousand days".

[-] bitofhope@awful.systems 19 points 1 day ago

That sentence is somewhere between exactly 420.69 and 1,337.00 millialtmans of cringe.

[-] blakestacey@awful.systems 8 points 1 day ago

That paragraph begins,

Like his predecessor critics of artificial intelligence, Taube, Dreyfus and Lighthill, Weizenbaum is impatient, implying that if the problem hasn't been solved in twenty years, it is time to give up.

Weizenbaum replies,

I do not say and I do not believe that "if the problem hasn't been solved in twenty years, we should give up". I say (p. 198) " . . . it would be wrong . . . to make impossibility arguments about what computers can do entirely on the grounds of our present ignorance". That is quite the opposite of what McCarthy charges me with saying.

It's a snidely jokey response to an argument that Weizenbaum didn't make!

[-] bitofhope@awful.systems 6 points 1 day ago

And even if Joseph Weizenbaum did actually say, verbatim: “if the problem hasn’t been solved in twenty years, it is time to give up”, that's not the same as asking for the precise time when “machines will reach human-level intelligence”.

[-] gedhrel@lemmy.world 3 points 1 day ago

It's sarcasm. The question asks for unwarranted precision and the response is a joke.

[-] bitofhope@awful.systems 10 points 1 day ago

Imagining a guy who asks me a dumb question so I can let everyone know how I'd mock them with a joke answer.

[-] froztbyte@awful.systems 7 points 1 day ago

“Pray tell, Mr Babbage…”

[-] gedhrel@lemmy.world 4 points 1 day ago

Spot on, yeah. Although, as pointed out just above, this wasn't actually Weizenbaum's position. But in an era of letters to the editor, perhaps using a little rhetorical trickery to preempt a two-month-long back and forth might be excusable. It's a strawman nonetheless, but this letter is a screed.

[-] gedhrel@lemmy.world 2 points 1 day ago

I suspect he got asked it a lot. There was a lot of interesting work going on back then but people basically didn't have any notion that there was a path from there to any kind of AGI. (In that respect they might've been somewhat more clued up than Altman.)

I think it's a natural thing to preemptively defend against the obvious counterpoint when you're railing against the thesis that current AI work isn't going to deliver on the "I".

[-] gedhrel@lemmy.world 5 points 1 day ago

Having said that, that this is the kind of thing Altman might say unironically speaks volumes. He really does have a trillion-dollar monorail to sell.

[-] nightsky@awful.systems 1 points 1 day ago

Ah, thanks, well my sarcasm detector isn't that good.

[-] Amoeba_Girl@awful.systems 13 points 2 days ago

John McCarthy's really sounding like a typical libertarian prat.

He concludes that since a computer cannot have the experience of a man, it cannot understand a man. There are three points to be made in reply. First, humans share each other's experiences and those of machines or animals only to a limited extent. In particular, men and women have different experiences.

l.m.f.a.o., we're going there are we now

[-] nightsky@awful.systems 10 points 1 day ago

Also great, right after that:

Nevertheless, it is common in literature for a good writer to show greater understanding of the experience of the opposite sex than a poorer writer of that sex.

Yeeeaah, sure. And to write that in the 1970s even.

If anything, this McCarthy reply makes me want to read the Weizenbaum book.

[-] blakestacey@awful.systems 14 points 1 day ago

From page 202:

Few "scientific" concepts have so thoroughly muddled the thinking of both scientists and the general public as that of the "intelligence quotient" or "I.Q." The idea that intelligence can be quantitatively measured along a simple linear scale has caused untold harm to our society in general, and to education in particular.

[-] nightsky@awful.systems 3 points 1 day ago

That's it, I'm ordering a copy.

[-] o7___o7@awful.systems 10 points 2 days ago

It's kind of encouraging that this dumb shit isn't new innit?

[-] bitofhope@awful.systems 11 points 1 day ago

It also doesn't bode well for the hope of it dying down anytime soon.

[-] o7___o7@awful.systems 7 points 1 day ago

Yeah, it kind of grew a religion on it since then, didn't it?
