this post was submitted on 13 Aug 2023
598 points (100.0% liked)
Technology
They're gonna be in even bigger trouble when it's determined that AI training, especially for content generation, is not fair use and they have to pay each and every person whose data they've used.
A) Most companies jumping on the AI bandwagon are training their own models.
B) The music industry has been legally using samples to create new songs since the '90s.
AI is here to stay.
A) Not true. Many have been training models on online data that doesn't belong to them, hasn't been licensed, and has been used without the informed consent of the rights holders.
B) Terrible comparison. Music sampling is a grey area that is much more complex and dubious than you're suggesting. There are instances in which sampling has been considered fair use, but outside of that there are strict laws around sampling. Finally, human music creation and sampling have very little in common with generative AI.
AI is here to stay. But the free ride of scraping every piece of information in human history without even a basic regard towards intellectual property or personality rights is unsustainable, unethical, and nowhere near the threshold for what can be considered fair use.
Once people start needing to own or license their training data sets, the technology will be just fine, but costs will rise dramatically and the VC investment bubble is going to pop big time.
What's legal changes over time. There will absolutely be new AI-focused laws enacted, just like there were internet-focused laws once the internet became very impactful. We simply have no idea how this will play out. Whatever new laws are passed definitely won't kill AI, though, since it's big business and US lawmakers will want AI companies to thrive so those services can be exported. People acting like AI will die for legal reasons are completely off base.
And payment sharing would most likely be a percentage of revenue. Right now their biggest hurdle is just scaling, and it's incredibly rare that a startup with huge demand completely fails because of scaling challenges. Once they scale, their profit margins will be huge; they'd be able to do payouts and still profit. But don't get excited about payouts: it'll probably amount to pennies, like it does on Spotify.
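To see why a revenue share would amount to pennies, here's a quick back-of-envelope sketch. Every number in it (revenue, pool percentage, contributor count) is a made-up assumption for illustration, not a real figure for any AI company:

```python
# Back-of-envelope payout estimate.
# ALL numbers below are hypothetical assumptions, not real figures.
annual_revenue = 1_000_000_000   # assumed: $1B/year in AI product revenue
payout_pool_share = 0.10         # assumed: 10% of revenue set aside for rights holders
contributors = 100_000_000       # assumed: distinct people whose data was used

pool = annual_revenue * payout_pool_share
per_person = pool / contributors
print(f"${per_person:.2f} per person per year")  # prints "$1.00 per person per year"
```

Even with a generous 10% pool, a training set drawn from hundreds of millions of people dilutes the payout to Spotify-tier amounts; double the revenue or halve the contributors and it's still just a couple of dollars a year.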
Even ignoring the fact that training an AI is insanely transformative and definitely fair use, people would not get any kind of pay. The data is owned by websites and corporations.
If AI training were to be highly restricted, Microsoft and Google would just pay each other for the data and pay the few websites they don't own (Stack, GitHub, Reddit, Shutterstock, etc.). A bit of money would go to publishing houses and record companies, not enough for the actual artists to get anything over a few dollars.
And they would happily do it, since they would be the only players in the game and could easily overcharge for a product that is eventually going to replace 30% of our workforce.
Your emotional, short-sighted response kills all open source and literally hands our economy to Google and Microsoft. They become the sole owners of AI tech. Don't be stupid, please. They want you to be mad; it literally only helps them.