[-] Collectivist@awful.systems 8 points 2 months ago* (last edited 2 months ago)

This could be a sex thing or maybe they want young blood for their blood transfusions. Maybe they saw Marx's criticism that capitalists were akin to vampires, sucking the metaphorical blood out of the poor, and thought to themselves: he's right, we should take their literal blood too.

[-] Collectivist@awful.systems 6 points 3 months ago

Yeah, I didn't say he only makes those videos, just that he makes a lot of them.

[-] Collectivist@awful.systems 7 points 5 months ago

Wasn't phrenology about skull shape and its influence on mental traits in general? Otherwise it's not really a field of study; it's just one claim: larger skull = more intelligence (which is just a less precise version of more childhood nutrition = taller = larger skull = more intelligence). But phrenologists also claimed they could explain all sorts of traits, like criminality and personality, with things like bumps in the skull.

[-] Collectivist@awful.systems 7 points 5 months ago

Wytham Abbey is being listed on the open market for £15 million [...] Adjusted for inflation, the purchase price of the house two years ago now equals £16.2 million.

Remember when one of their justifications was that it's also an investment?

Reaction on the EA forum:

It's not necessarily a loss of a million pounds if many of the events that happened there would have spent money to organise events elsewhere (renting event spaces and accommodation for event attendees can get quite pricey) and would have spent additional time on organising the events, finding venues, setting them up etc (compared to having them at Wytham). For comparison, EA Global events cost in the ballpark of a million pounds per event.

[-] Collectivist@awful.systems 7 points 5 months ago

I actually don't find this a bad post, but I do want to point out that it got way more karma than any of titotal's more critical posts, even though I find many of those better. This once again points to how the EA Forum's voting-power-by-popularity karma system creates groupthink: being critical nets you less voting power than being laudatory, and it disincentivizes calling out bullshit in general.

When Ives Parr, of "Effective Altruism is when you want to spend money on genetic engineering for race-and-IQ theories" fame, made a separate post complaining that that post got downvoted despite nobody giving a good counterargument, I wanted to comment and call him out on his bullshit. But why bother, with a karma system that allows him and his buddies to downvote it off the frontpage while leaving you with less voting power? A lot of EA's missteps are one-off blunders, but what makes the EA Forum's """epistocratic""" voting system so much worse is that it's systematic: every post and comment is now affected by this calculus of how much you can criticize the people with a lot of power on the forum without losing power of your own, making groupthink almost inevitable. Given that people who have been on the forum longer have, on average, more voting power than newer voices, I can't help but wonder if this is by design.

[-] Collectivist@awful.systems 8 points 6 months ago

embrace the narrative that “SBF died for our sins”

Huh? This is so absurdly self-aggrandizing that I struggle to comprehend what he's even saying. What did he imagine "our sins" were, and how did getting imprisoned absolve them?

[-] Collectivist@awful.systems 7 points 7 months ago

It made me think about epistemic luck in the rat-sphere in general. His inventing and then immediately fumbling "Gettier attack" is just such a perfect example, but there are other examples in there, such as Yud saying:

Personally, I’m used to operating without the cognitive support of a civilization in controversial domains, and have some confidence in my own ability to independently invent everything important that would be on the other side of the filter and check it myself before speaking. So you know, from having read this, that I checked all the speakable and unspeakable arguments I had thought of, and concluded that this speakable argument would be good on net to publish[…]

Which @200fifty points out:

Zack is actually correct that this is a pretty wild thing to say… “Rest assured that I considered all possible counterarguments against my position which I was able to generate with my mega super brain. No, I haven’t actually looked at the arguments against my position, but I’m confident in my ability to think of everything that people who disagree with me would say.” It so happens that Yudkowsky is on the ‘right side’ politically in this particular case, but man, this is real sloppy for someone who claims to be on the side of capital-T truth.

[-] Collectivist@awful.systems 8 points 7 months ago

While the writer is wrong, the post itself is actually quite interesting and made me think more about epistemic luck. I think Zack does correctly point out cases where I would say rationalists got epistemically lucky, although his views on the matter seem entirely different. I think this quote is a good microcosm of this post:

The Times's insinuation that Scott Alexander is a racist like Charles Murray seems like a "Gettier attack": the charge is essentially correct, even though the evidence used to prosecute the charge before a jury of distracted New York Times readers is completely bogus.

A "Gettier attack" is a very interesting concept I will keep in my back pocket, but he clearly doesn't know what a Gettier problem is. With a Gettier case a belief is both true and justified, but still not knowledge because the usually solid justification fails unexpectedly. The classic example is looking at your watch and seeing it's 7:00, believing it's 7:00, and it actually is 7:00, but it isn't knowledge because the usually solid justification of "my watch tells the time" failed unexpectedly when your watch broke when it reached 7:00 the last time and has been stuck on 7:00 ever since. You got epistemically lucky.

So while this isn't a "Gettier attack," Zack did get at least a partial dose of epistemic luck: he believes the charge is unjustified and therefore a Gettier attack, but a Gettier case requires justification, and the charge is in fact justified. So he got some epistemic luck while writing about epistemic luck. That's what a good chunk of this post feels like.

[-] Collectivist@awful.systems 5 points 7 months ago

I spend a lot of time campaigning for animal rights. These criticisms also apply to it, but I don't consider them a strong argument there. EAs spend an estimated 1.8 million dollars per year (less than 1%, so nowhere near a majority) on "other longterm," which presumably includes simulated humans, but an estimated 55 million dollars per year (or 13%) on farmed animal welfare. (For those who are curious, the largest recipient is global health at 44%, but it's important to note that the more people are into EA, the less they seem to give to that compared to more longtermist causes.) Farmed animals "don't resent your condescension or complain that you are not politically correct, they don't need money, they don't bring cultural baggage..." yet that doesn't mean they aren't a worthy cause. This quote might serve as something members should keep in mind, but I don't think it works as an argument on its own.

[-] Collectivist@awful.systems 7 points 9 months ago* (last edited 9 months ago)

Well of course, everything is determined by genetics, including, as the EA forum taught me today, things like whether someone is vegetarian. So to solve that problem (as well as any other problem) we need (and I quote) "human gene editing". ~/s~

[-] Collectivist@awful.systems 5 points 9 months ago* (last edited 9 months ago)

From the top comment:

Yeah, I really wouldn't trust how that book [by Richard Lynn] picks its data. As stated in "A systematic literature review of the average IQ of sub-Saharan Africans":

For instance, Lynn and Vanhanen (2006) accorded a national IQ of 69 to Nigeria on the basis of three samples (Fahrmeier, 1975; Ferron, 1965; Wober, 1969), but they did not consider other relevant published studies that indicated that average IQ in Nigeria is considerably higher than 70 (Maqsud, 1980a, b; Nenty & Dinero, 1981; Okunrotifa, 1976). As Lynn rightly remarked during the 2006 conference of the International Society for Intelligence Research (ISIR), performing a literature review involves making a lot of choices. Nonetheless, an important drawback of Lynn (and Vanhanen)'s reviews of the literature is that they are unsystematic.

They're not the only ones who find Lynn's data selection suspect. Wikipedia describes him as:

Richard Lynn (20 February 1930 – July 2023) was a controversial English psychologist and self-described "scientific racist" [...] He was the editor-in-chief of Mankind Quarterly, which is commonly described as a white supremacist journal.

[From earlier in the comment] I can view an astonishing amount of publications for free through my university, but they haven't opted to include this one, weird... So should I pay money to see this "Mankind Quarterly" publication?

When I googled it I found that Mankind Quarterly includes among its founders Henry Garrett, an American psychologist who testified in favor of segregated schools during Brown v. Board of Education; Corrado Gini, who was president of the Italian Genetics and Eugenics Society in fascist Italy; and Otmar Freiherr von Verschuer, who was director of the Kaiser Wilhelm Institute of Anthropology, Human Heredity and Eugenics in Nazi Germany. He was a member of the Nazi Party and the mentor of Josef Mengele, the physician at the Auschwitz concentration camp infamous for performing human experimentation on the prisoners during World War II. Mengele provided Verschuer with human remains from Auschwitz to use in his research into eugenics. [...] Something tells me it wouldn't be very EA to give money to these people.

[-] Collectivist@awful.systems 6 points 9 months ago

I wonder what the deleted Roko comment was about

Are you talking about his -18 karma comment? It says:

Long post on eugenics, -1 points right now and lots of comments disagreeing. Looks like this is a political battle; I'll skip actually reading it and note that these kinds of issues are not decided rationally but politically, EA is a left-wing movement so eugenics is axiomatically bad. From a right-wing point of view one can even see it as a good thing that the left is irrational about this kind of thing, it means that they will be late adopters of the technology and fall behind.
