A lesswronger attempts to explain physics using Information Theory! This irritates me.
No, you can't, because you're still presuming that gases do expand, i.e., that merely connecting two containers is enough to mix their contents. Otherwise, you're saying that if you fill one bottle with orange juice and another with vodka, and then forget which is which, you've made a screwdriver.
Then it gets weird and confused, talking about a box divided in two parts, with green particles on one side and pink ones on the other.
Forgetting where things are doesn't give you psychoflexitive powers!
And from the comments:
No. If you don't incorporate quantum mechanics (or at the very least take some results of quantum mechanics as valid), you will get statistical mechanics very wrong rather quickly. Your results for the thermal properties of gases will get worse the more you calculate. You'll convince yourself that magnets are impossible. Etc.
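The standard example behind that claim (my gloss, not the commenter's) is the Sackur–Tetrode entropy of a classical ideal gas, which only comes out dimensionally sensible and extensive once Planck's constant and the N! indistinguishability factor are borrowed from quantum mechanics:

$$
S = N k_B \left[ \ln\!\left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right],
\qquad
\lambda = \frac{h}{\sqrt{2 \pi m k_B T}} .
$$

Drop the h and the argument of the logarithm isn't even dimensionless; drop the N! that puts the N under the V and the entropy stops being extensive (the Gibbs paradox). And the "magnets are impossible" part is presumably the Bohr–van Leeuwen theorem: purely classical statistical mechanics predicts exactly zero equilibrium magnetization.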
For all that Yud has been praising the Feynman books ever since HPMOR at least, he doesn't seem to have inspired his fans to actually read the Lectures on Physics.
What the heck did I just read? Because it appeared to be a proof that hourglasses can't possibly work if you look away from them for a moment.
Hourglasses work by inverse Weeping Angels rules, doncha know?
I should also have mentioned the part where they say that the entropy of the "uniform distribution over (0,x)" is the base-2 logarithm of x. This is, of course, a negative number for any x they care about (0 < x < 1), and more strongly negative the smaller x becomes.
Argh. These people just don't know any math and never call each other out for not knowing any math, and now I have to read MIT OpenCourseWare to scrub the feeling out of my brain.
I think there is in fact a notion of continuous entropy where that is actually true, and it does appear to be used in statistical mechanics (but I am not a physicist). But there are clearly a lot of technical details which have been scrubbed away by the LW treatment.
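Concretely, the quantity being invoked seems to be the differential entropy (a textbook definition, not anything spelled out in the post). For a uniform distribution on (0, x) it is

$$
h(X) = -\int_0^x \frac{1}{x} \log_2\!\frac{1}{x}\, dt = \log_2 x ,
$$

which really is negative for 0 < x < 1 (for example, −1 bit at x = 1/2). Unlike the discrete Shannon entropy, differential entropy has no floor at zero, and it isn't even invariant under a change of variables, which is part of what gets scrubbed away.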
The fact that the naive continuous version of the Shannon entropy (just replacing the sum with an integral) can go negative is one reason why statistical physicists will tell you not to do that. Or, more precisely: That's a trick which only works when patched up by an idea imported from quantum mechanics.
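To spell out that patch (standard textbook statistical mechanics, paraphrased by me rather than the commenter): phase-space volume gets measured in units of Planck's constant, so the logarithm acts on a dimensionless count of states instead of a raw volume,

$$
S = k_B \ln W ,
\qquad
W = \frac{1}{N!\, h^{3N}} \int_{\text{accessible}} d^{3N}q \, d^{3N}p .
$$

Since the uncertainty principle won't let a state occupy less than roughly h per degree of freedom, W stays of order one or larger, which is what keeps the entropy from going negative the way the naive continuous integral can.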
yea i did try to read the lecture notes and got reminded very fast why i don't try to read physics writing lol