God, that would be the dream, huh? Absolutely crossing my fingers it all shakes out this way.
Yeah but this presumes "the best way to beat 'em is to join 'em," right? Like, when all the operating systems or databases are proprietary, that's bad because those things are really useful and help you do things better and faster than you would otherwise.
But this argument applied here is like, oh no, what if large entertainment companies start making all their movies out of AI garbage, and everyone else can't do that because they can't get the content licensed? Well... what if they do? Does that mean they're going to be making stuff that's better? Wouldn't the best way to compete with that be *not* to use the technology, because you'll get a higher-quality product? Or are we just giving up on the idea of producing good art at all and conceding that, yes, we actually only value cheapness and quantity?
Also, just on a personal level, for me as a J. Random Person who uploads creative work to the internet (some of which is in common crawl), but who doesn't work for a major entertainment corporation that has rights to my work, I would really prefer to have a way to say "sorry no, you can't use my stuff for this." I don't really find "well you see, we need to be able to compete with large entertainment companies in spam content generation, so we need to be able to use your uncompensated labor for our benefit without your permission and without crediting you" particularly compelling.
we simply don't know how the world will look if there are a trillion or a quadrillion superhumanly smart AIs demanding rights
I feel like this scenario depends on a lot of assumptions about the processing speed and energy/resource usage of AIs. A trillion is a big number. Notably, the current human population (about 8 billion) is only around 0.8% of that number, and humans are much more energy efficient than AIs.
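Just to sanity-check that ratio, here's the back-of-the-envelope arithmetic (the ~8 billion human population figure is an approximation, and "one trillion AIs" is the hypothetical from the quoted comment):

```python
# Scale comparison: current human population vs. the hypothetical
# "trillion superhumanly smart AIs" scenario.
humans = 8e9   # approx. current world population
ais = 1e12     # the hypothetical number of AIs

ratio = humans / ais
print(f"{ratio:.1%}")  # -> 0.8%
```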
The problem is I guess you'd need a significant corpus of human-written stuff in that language to make the LLM work in the first place, right?
Actually this is something I've been thinking about more generally: the "AI makes programmers obsolete" take sort of implies everyone continues to use javascript and python for everything forever and ever (and also that those languages never add any new idioms or features in the future, I guess).
Like, I guess now that we have AI, all computer language progress is just supposed to be frozen at September 2021? Where are you gonna get the training data to keep the AI up to date with the latest language developments or libraries?
I've thought about a similar idea before in the more minor context of stuff like note-taking apps -- when you're taking notes in a paper notebook, you can take notes in whatever format you want, you can add little pictures or diagrams or whatever, arranged however you want. Heck, you can write sheet music notation. When you're taking notes in an app, you can basically just write paragraphs of text, or bullet points, and maybe add pictures in some limited predefined locations if you're lucky.
Obviously you get some advantages in exchange for the restrictive format (you can sync/back up things to the internet! you can search through your notes! etc) but it's by no means a strict upgrade, it's more of a tradeoff with advantages and disadvantages. I think we tend to frame technological solutions like this as though they were strict upgrades, and often we aren't so willing to look at what is being lost in the tradeoff.