Well, if this guy's quite confident, then I'm sure it'll all pan out in the end. How hard could symbolic reasoning be, really? Incidentally, I've been in a coma since 1970
Text in AI-generated images will never not be funny to me. N the most n'tural hnertis indeed.
Making me learn how to do things the right way is premature optimization
"This technology is coming whether we like it or not, so we're going to make sure that we get it right," Adams said in a statement.
??? Who is "we" here? Is the technology going to be developed by aliens who beam it down to earth? Is a rogue AI developing self-driving cars for the purposes of annoying humanity into submission? Are they springing forth from the head of Zeus?
Seriously, can we go back to the days when tech boosters at least pretended technology was being developed by people to improve other people's lives? Now it seems like they just go "sucks to suck, idiots! this is the future now, get with it, grandpa!" and skateboard away into the sunset leaving everyone else to clean up their mess...
Exchange presented without comment:
My prediction: the advance of tech by AI will far surpasse what it consume in energy.
To look at the energy consumption of current model is extremely short sighted. If AI create a new material, a new solar cell, advance fusion reactor is all of humanity that jump forward.
Furthermore new generation of AI accelerators and new algorithms will improve efficiency by order of magnitute, it's still early days.
For every good thing, come up with a bad.
The material created will be a better poison/virus. The algorithm to keep the fusion tokamak from going boom will be at best 99% correct. The new solar cell? More exotic materials required than the current.
Blind optimism is a vice we cannot afford.
The post you're responding to doesn't argue from blind optimism, it argued a reasonably-expected gain in net beneficial effects.
Everything about Zack is sad.
I have to say, if you look past the, well, you know, stuff, he's actually pretty decent at injecting pathos into the posts about his personal life. His writing does a good job bringing you into his extremely depressing/self-loathing inner world -- you really feel for the guy, or at least I do. That said, it's this exact effect which makes me think he is probably not perceiving things as lucidly as he thinks he is. Depression can feel like clarity, but that's no way to live.
When I was a kid [Net Nanny](https://en.wikipedia.org/wiki/Net_Nanny) was totally and completely lame, but the whole millennial generation grew up to adore content moderation. A strange authoritarian impulse.
Me when the mods unfairly ban me from my favorite video game forum circa 2009
(source: first HN thread)
"how would women use our protocol?"
"oh, right, women, shit. uh, I guess they could use it for dating men?"
"yeah, that's a pretty good one, any other ideas?"
"do women even do anything else?"
"hmmm... I guess not that I'm aware of, no"
"all right then let's go with that one"
What I don't get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes... humans don't entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can't do when it doesn't have access to the physical world, only things humans have written about it?
Even if it is using its godly intelligence to predict the next word, wouldn't it only be able to predict the next word as it relates to things that have already been discovered through experiment? What's his proposed mechanism for it to suddenly start deriving all of biology from first principles?
I guess maybe he thinks all of biology is "in" the DNA and it's just a matter of simulating the 'compilation' process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that's such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material
Well all I know is I definitely trust the research and knowledge and informed-ness about biological sex of the person who uses the word "hermaphroditism" with regards to humans. Now that's a person who knows what they're talking about, I think to myself
If you think of LLMs as being akin to lossy text compression of a set of text, where the compression artifacts happen to also result in grammatical-looking sentences, the question you eventually end up asking is "why is the compression lossy? What if we had the same thing but it returned text from its database without chewing it up first?" and then you realize that you've come full circle and reinvented search engines
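The analogy can be sketched in a few lines of toy code (purely illustrative — the corpus, function names, and the "drop every third word" degradation are all made up for the bit, and bear no resemblance to how real LLMs or search engines actually work):

```python
# Toy illustration of the "lossy compression vs. search" framing.
# A corpus of stored text stands in for the training data.
CORPUS = [
    "the mitochondria is the powerhouse of the cell",
    "a hash function maps arbitrary data to a fixed-size value",
]

def search(query: str) -> str:
    """'Lossless' retrieval: return the stored text whose words
    overlap the query the most -- i.e., a (very crude) search engine."""
    def overlap(doc: str) -> int:
        return len(set(query.split()) & set(doc.split()))
    return max(CORPUS, key=overlap)

def lossy_recall(query: str) -> str:
    """Stand-in for the lossy version: same retrieval, but the stored
    text comes back 'chewed up' (here, every third word is dropped)."""
    words = search(query).split()
    return " ".join(w for i, w in enumerate(words) if i % 3 != 2)

print(search("what is a hash function"))
# -> a hash function maps arbitrary data to a fixed-size value
print(lossy_recall("what is a hash function"))
# -> a hash maps arbitrary to a value
```

The punchline in code form: `lossy_recall` is just `search` plus a step that mangles the answer, so asking "what if we removed the loss?" leaves you holding a search engine.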