[-] LexiMax 8 points 1 year ago* (last edited 1 year ago)

Elon never intended to buy Twitter; he always intended to back out of the deal from the beginning, either as a headline-grabbing move or as some form of stock manipulation. His miscalculation was not realizing that the existing Twitter management was eager to sell, since the site was treading water in terms of profitability, and was willing to take him to court to make the deal go through.

The boat anchor around his ankle is that he had to borrow a lot of money to honor the purchase. Advertising revenue was always going to be decimated by his enabling of the right, but the enormous debt hanging over his head is why the site is visibly cracking at the seams so soon after the purchase. Of course he did favors for his political allies, but that was a move made out of opportunism, not conspiracy.

If I'm making it sound like he's playing 5D chess, he's not. He made a dumb decision that blew up in his face. He's screwed. He knows he's totally screwed. So he's doing what any CEO would do in his position - cutting costs, lying about why the site is having issues, and coming up with harebrained ideas to try to generate revenue. It's 100% performative CYA, because when the site is finally repossessed by his creditors, at least he can claim he tried something.

[-] LexiMax 1 points 1 year ago* (last edited 1 year ago)

So it’s a matter of being relatively easy to scam people with it?

That is merely a side-effect of one of the original sins of programmable blockchains: responsibility for managing the data is delegated from a database administrator to whoever happens to control that data, and that controller can only manage the data according to the underlying code of the NFT.

This only sounds appealing if you've never actually touched a database in your life. There are too many points of failure, and mistakes are much harder to correct.

[-] LexiMax 2 points 1 year ago* (last edited 1 year ago)

Essentially, the NFT is "proof of ownership" only insofar as what is put into the blockchain is correct and accurately reflects reality.

If you don't vet what goes on the blockchain for accuracy, the blockchain is useless, and if you do validate, you don't need a blockchain because the validator can just use a traditional database.

It is a solution in search of a problem.

[-] LexiMax 1 points 1 year ago

If I buy a lot of baseball cards for 1 cent, at least they don't suffer from the oracle problem.

NFTs were nakedly a solution in search of a problem, and they weren't even a very good solution to the problem they purported to solve. That's what pisses people off.

[-] LexiMax 2 points 1 year ago* (last edited 1 year ago)

Haven't dealt with it personally, but I know people who have. The one constant is that the places I've heard of with an RTO mandate are short-staffed and brain-drained.

[-] LexiMax 7 points 1 year ago

Is it not "serious" to work towards a better future because that's more difficult to obtain? There is a future out there where more industries are dominated by software that respects user freedom.

I do not believe that distros ignoring the problem of binary software distribution is actually accomplishing anything productive on that front. All it does is put up a gigantic KEEP OUT sign for most outside developers who might have briefly considered porting their software. Package maintainers are also incredibly overburdened and are often slow to update their packages, even on rolling-release distros.

Worse, it also inconveniences their userbase, pushing them to solutions that bypass the distro completely, such as third-party repos, Steam, Wine, Flatpak, Docker, or even running Linux in WSL. All of them function as non-free escape hatches, but all of them are inferior to distros getting their act together and deciding that binary software distribution is a problem worth collaborating on and solving together.

[-] LexiMax 11 points 1 year ago* (last edited 1 year ago)

More importantly, the reason all of those apps don't have Linux versions is not some anti-Linux conspiracy, but the fact that Linux userspace has, for most of its existence, prioritized distro-packaged-and-provided software, at the expense of, and sometimes even to the exclusion of, binary software distribution.

This is not just a technical limitation, but I'd also argue a cultural one, driven by folks who consider proprietary/nonfree software irrelevant and not worth supporting in a first-class way. Unfortunately, the companies who make both the software that entire industries are built around and the games that you play when you get off work disagree. Valve was probably the company in the best position to make native Linux games a trend, and the fact that they're more focused on Proton these days is pretty telling.

The only developers in the Linux ecosystem who I feel are taking the problem seriously are the Flatpak developers. They do amazing work, with great tooling that builds against a chrooted runtime by default. But it needs more widespread usage and acceptance, as well as better outreach to developers from other ecosystems who might've had horrendous experiences making Linux builds in the past.

There is a future out there with native Linux builds of industry-standard tooling and even games. But it's a future the Linux community has to be willing to actually work towards.

[-] LexiMax 1 points 1 year ago

Very exciting. I'm all for improved indexing and search times, and integrating C++ Build Insights directly into the IDE is a smart move.

However, someone pointed out to me a new feature in the works on the preview branch, Size and Alignment hints, where hovering over a struct or class will reveal its size and alignment. I can't be the only one thankful that I will soon no longer have to write constexpr auto x = sizeof(Foo); just to see the number 😀
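For anyone unfamiliar with that workaround: you declare constexpr variables initialized from sizeof/alignof and hover over them so the IDE shows the evaluated values, or deliberately trip a compile error to read the number out of the diagnostic. A minimal sketch, with Foo as a placeholder type:

```cpp
#include <cstddef>

// Placeholder type; the layout numbers below assume a typical 64-bit ABI.
struct Foo {
    char   tag;    // 1 byte, then 7 bytes of padding
    double value;  // 8 bytes, forces 8-byte alignment
    int    count;  // 4 bytes, then 4 bytes of tail padding
};

// The hover trick: evaluate size/alignment as constexpr values, then hover
// over the variables in the IDE to read the numbers off.
constexpr std::size_t foo_size  = sizeof(Foo);   // 24 under the ABI above
constexpr std::size_t foo_align = alignof(Foo);  // 8

// Heavier-handed variant: force a failed instantiation so the compiler
// prints the concrete number in its error message.
// template <std::size_t N> struct show_size;    // intentionally undefined
// show_size<sizeof(Foo)> probe;                 // error mentions show_size<24>
```

Having the IDE surface this directly on hover obviously beats both.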

[-] LexiMax 1 points 1 year ago

The advice I've always read about std::deque is that unless you can measure a performance difference, it's better to simply default to std::vector, as the cache advantages of laying everything out in a contiguous block of memory cover for many of std::vector's on-paper "Big O" sins.

Something interesting I've come across is a so-called veque, which tries to claw back some of the on-paper advantages of a std::deque while keeping the performance of a std::vector by maintaining scratch space both before and after the used data. Can't speak to its usefulness personally, but it does look neat.
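For the curious, the gist as I understand it (this is a toy illustration of the idea, not the actual veque implementation) is a single contiguous buffer where the live elements sit in the middle, so both ends have slack to grow into:

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Toy "slack at both ends" container: elements occupy a contiguous middle
// slice of one buffer, so push_front and push_back are both amortized O(1)
// while iteration stays cache-friendly. Kept simple: T must be
// default-constructible and copy-assignable.
template <typename T>
class double_ended_buffer {
    std::vector<T> buf_;     // front slack | live elements | back slack
    std::size_t front_ = 0;  // index of the first live element
    std::size_t back_  = 0;  // one past the last live element

    // Reallocate and re-center the live elements so both ends regain slack.
    void regrow() {
        const std::size_t n = back_ - front_;
        std::vector<T> bigger(n * 2 + 8);
        const std::size_t new_front = (bigger.size() - n) / 2;
        for (std::size_t i = 0; i < n; ++i)
            bigger[new_front + i] = buf_[front_ + i];
        buf_   = std::move(bigger);
        front_ = new_front;
        back_  = new_front + n;
    }

public:
    void push_back(const T& v) {
        if (back_ == buf_.size()) regrow();
        buf_[back_++] = v;
    }
    void push_front(const T& v) {
        if (front_ == 0) regrow();
        buf_[--front_] = v;
    }
    T*          begin()       { return buf_.data() + front_; }
    T*          end()         { return buf_.data() + back_; }
    std::size_t size()  const { return back_ - front_; }
};

int main() {
    double_ended_buffer<int> q;
    for (int i = 1; i <= 4; ++i) q.push_back(i);    // 1 2 3 4
    for (int i = 1; i <= 4; ++i) q.push_front(-i);  // -4 -3 -2 -1 1 2 3 4
    for (int v : q) std::cout << v << ' ';          // contiguous iteration
    std::cout << '\n';
}
```

The real veque is obviously far more careful about exception safety, non-trivial types, and allocators, but the core trade-off is the same: you pay some memory overhead at both ends to keep everything in one contiguous block.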

[-] LexiMax 2 points 1 year ago

Apparently, Google has directly advised against doing this in a recent post on Twitter:

https://nitter.moomoo.me/searchliaison/status/1689018769782476800#m
