[-] aaronbieber@beehaw.org 63 points 9 months ago

Chant it with me, friends!

Stop 👏 using 👏 Chrome 👏!!

[-] aaronbieber@beehaw.org 8 points 1 year ago

No, you really can't.

[-] aaronbieber@beehaw.org 22 points 1 year ago

I mean, in fairness, "vegetable" isn't a scientific term at all, so whether potatoes are vegetables (or tubers, or roots, or something else) is totally up for debate.

But they're a hell of a lot more of a vegetable than pizza is!

[-] aaronbieber@beehaw.org 9 points 1 year ago

Bram was notoriously possessive of the Vim project and consistently avoided bringing in other lead maintainers or adding widely demanded features (like async processing). Maybe that changed while I wasn't paying attention, but that had a lot to do with the very successful Neovim fork. Bram eventually added async support, but not before Neovim exploded in popularity.

It's tragic to hear of Bram's passing, and at such a young age. I will be interested to see what happens to the Vim project now, in his absence.

[-] aaronbieber@beehaw.org 20 points 1 year ago

It feels like realizing that WhatsApp is a terrible Meta privacy nightmare, but you can't wake up because you can't convince your whole family to use Signal.

[-] aaronbieber@beehaw.org 25 points 1 year ago

I wonder if these battles will shake loose the circuit split on de minimis exceptions to music samples (see https://lawreview.richmond.edu/2022/06/10/a-music-industry-circuit-split-the-de-minimis-exception-in-digital-sampling/).

Currently, it is absolutely not "cut and dried" whether the use of any given sample should be permitted. Most musicians are erring on the side of "clear everything," but does an AI-generated "simulacrum" qualify as "sampling"?

What's on trial here is basically "what characteristic(s) of an artist's work do they own?" If you write a song, you can "own" whatever is written down (melody, lyrics, etc.) If you perform a song, you can own the performance (recordings thereof, etc.) Things start to get pretty vague when we start talking about "I own the sound of my voice."

I think it's accepted that it's legal for an impersonator to make a living doing TikToks pretending to be Tom Cruise. Tom Cruise can't really sue them saying "he sounds like me." But is it different if a computer does it? It may very well be.

It's going to be a pretty rough few years in copyright litigation. Buckle up.

[-] aaronbieber@beehaw.org 45 points 1 year ago* (last edited 1 year ago)

Louis Rossmann is a bit of a provocateur, but what he's saying in this video is the bare and unvarnished truth. If Reddit cared about its users and its moderators, the CEO's internal messaging would be less like "this will blow over" and more like "what should we do to meet these people in the middle?"

There is no meeting in the middle when you're up against institutional investors who have put literally hundreds of millions of dollars on the line to fund your operation. I almost feel bad for Steve; he really has no choice. It's just a shame to see him falling into line and reciting exactly what the board wants him to say.

And by the way, this is why Beehaw has so much promise. The incentives of the operators and the users are aligned. There is no third party with outsized power waiting for the chance to pull the rip cord and enshittify the whole thing.

[-] aaronbieber@beehaw.org 9 points 1 year ago

Throughout history, people have always been driven to create, and others have always sought out creative works. For that reason, I don't think we'll necessarily "stagnate culturally" in a broad sense.

However, at least in the US, we're already standing at the precipice of making creative work practically impossible. Our labor protection laws and social support systems, extremely weak by peer-nation standards, tend to strip life of everything but the obligation to work.

Our last bastion of hope for structural protection for creativity is the possibility that anyone could both create and profit from what they create. Copyright law was originally intended to amplify that potential.

I usually point to stock photography as an area where people used to be able to make at least modest money, but nowadays you'd be lucky to make poverty wages. The market was flooded by cheap, high-quality cameras, and thus cheap, high-quality images. AI will do the same thing for many other mediums.

What has me really concerned is that the majority of really cool makers and creators I watch on YouTube are Canadian. I've convinced myself that this is because someone living in Canada can take the very real risk of sinking their life's energy into starting a YouTube channel because at least they know that if they get cancer, they have somewhere to go.

Not so here in America. If you aren't working for an established employer, or sitting on quite a bit of cash for independent health insurance, you're taking a substantial risk by being unemployed for any length of time (assuming you have the choice). Even if you do "make it," the cost of self-insurance for sole proprietors is no joke!

So the only people taking their lives into their own hands to create works of real cultural value are 1) the few percent who manage to get paid for it, 2) the independently wealthy and/or retired, and 3) the poor and desperate who would be just as precarious in either case.

It's not our finest hour here, if I do say so. I hope the rise of AI helps amplify this conversation. I am truly concerned about it.

[-] aaronbieber@beehaw.org 14 points 1 year ago

Reddit was dead from the day Conde Nast bought it. Every day since then was a roll of the dice as to whether they'd attempt to seize more profits and ruin it, or not. This happens to essentially every public or aspiring public company eventually. The need for perpetual growth warps decisions and guts the original mission in the end.

We call it "autosarcophagy" or "self-cannibalism."

As I understand it, Reddit also took on a lot of external capital investment, which only makes the pressure to perform financially even greater. I can't fault them for making the decisions they have to make to keep their jobs, keep their executive salaries, and so on.

Long live the sustainable, community-driven, community-funded future! Nobody can screw this up for us if we are the ones footing the bill.

[-] aaronbieber@beehaw.org 9 points 1 year ago

If you're interested, I wrote about some of the things I moved to self-host on my blog: https://blog.aaronbieber.com/2022/11/20/the-rise-of-the-indie-web.html

I recommend checking out https://indieweb.org/ as well!

[-] aaronbieber@beehaw.org 33 points 1 year ago

I am under the impression that the term was popularized, if not coined outright, by Cory Doctorow. See his many writings on his ad- and tracker-free website: https://pluralistic.net/tag/enshittification/

[-] aaronbieber@beehaw.org 28 points 1 year ago

All of this is true, but I wanted to relate a similar phenomenon that I observed some 30 years ago that might be of interest, or at least entertaining, to everyone here.

In my formative years, I spent a lot of time reading Usenet, which, briefly, is a text-only forum not dissimilar to bulletin boards or subreddits or Lemmy communities.

I frequented one group in particular called alt.sysadmin.recovery. Usenet group names are hierarchical by convention; this group lived in the alt.* hierarchy, and the rest of the name is simply descriptive or categorical. There were groups like alt.hobbies.baking and so on. Again, not dissimilar from (and likely inspiration for) these modern web-based communities.

This group was for system administrators (or "sysadmins") to generally gripe with one another about the difficulties of their jobs, dealing with users on their systems or networks, and similar. One of the rules of the group was that no advice was ever to be requested, nor given. It was strictly for sysadmins to vent. The key point here is that everyone in the group was in some technical role.

What was unique about alt.sysadmin.recovery was that you couldn't post to it. At least, it seemed that you couldn't, because the group was set to be moderated, but had no moderators. If you posted a message to a moderated group, it would be emailed to the moderators on record, who would either delete or ignore it, or apply their stamp of approval so that it would be posted to the group. alt.sysadmin.recovery had no moderator emails configured.

The trick is a little bit technical. Usenet posts are quite similar to emails: they have some "header" fields (like the title of the post, its author, and so forth) and a body. Most of the headers, such as which Usenet software sent the message, are not displayed directly (which is also true of email).

When a moderator approved a message in a Usenet group, their client would append an Approved: header line with some value, typically the moderator's email address. As long as the Approved: header was there and had any value at all, the message would be distributed to the group.

So the trick was to simply append that header when you posted the message. Since there were no moderators anyway, nobody could ever accuse you of bypassing the system. Bypassing the moderation system was, in fact, the entire point. You had to know enough about how moderation worked in Usenet to post a message to the group.
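To make the mechanics concrete, here's a minimal sketch in Python of what that looks like on the wire. The addresses, subject, and body are made up for illustration; the point is just that a Usenet article is header lines, a blank line, and a body, and that adding the Approved: header yourself is all the "moderation" a moderator-less group ever sees.

```python
# A minimal sketch (hypothetical addresses) of the trick described above.
# For a moderated group, servers only check that an Approved: header exists.
article_lines = [
    "From: someone@example.com",
    "Newsgroups: alt.sysadmin.recovery",
    "Subject: Another day in the trenches",
    "Approved: someone@example.com",  # self-applied; any value satisfies the check
    "",                               # blank line separates headers from body
    "Venting only. No advice requested, none given.",
]
article = "\r\n".join(article_lines) + "\r\n"

print(article)
# In practice this text would be submitted to a news server over NNTP;
# constructing the Approved: header by hand is the entire trick.
```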

One of the lasting results was that alt.sysadmin.recovery was never overrun by bots and spam, even as the rest of Usenet became an absolute cesspool through the '90s.

Which brings me back to my point. A few hoops to jump through and a few initial challenges to adoption can go a long way as a filter for who can show up and interact. Of course we would want Lemmy to be welcoming to anyone who will make the community better, brighter, more fun, and more useful... But we can take our time cracking open the floodgates. Maybe that's for the best.
