submitted on 05 Jul 2024 by AEMarling@slrpnk.net to c/solarpunk@slrpnk.net

In a post-scarcity solarpunk future, I could imagine some reasonable uses, but that’s not the world we’re living in yet.


AI art has already poisoned the creative environment. I commissioned an artist for my latest solarpunk novel, and they used AI without telling me. I had to scrap that illustration. Then the next person I tried to hire claimed they could do the work without AI, but in fact they could not.

All that is to say, fuck generative AI and fuck capitalism!

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz @SleezyDizasta my opinion is: if I, as an artist, can look at publicly posted content and use that to inform my own unique work, then why shouldn't an AI be able to? If I try to sell a drawing of Bugs Bunny, then WB can sue me, but I can sell as many Bugs Bunny-inspired rabbit drawings as I want. That should be the rule for an algorithm too.

[-] atrielienz@lemmy.world 2 points 4 months ago* (last edited 3 months ago)

Because you as the artist are going to change that to make a unique work within certain legal guidelines. The fact is, the laws have not caught up to regulate this and protect artists.

Additionally, though, you're not thinking about this the right way. Your work as an artist is copyrighted, meaning you own it and the right to license it to other entities. You as the artist did not license the use of your work to the company that used it as training data to produce results similar to your work when queried.

There are LLMs that only use work they have licensed or own the rights to. Getty Images is a really good example. But ChatGPT did not license anything, so everything that comes out the other end of a query is tainted by the stolen data or art that went into it.

Look up why the actors' guild went on strike and protested to protect their art and likenesses, and then tell me you don't feel the same way. There are multiple lawsuits going on right now against several of these LLM companies that have stolen data to use as training material.

A college can't just take your work off the internet and use it in its curriculum. Neither should an LLM be allowed to do that.

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz what I'm saying is that if the artwork is viewable in public, I have given the public license to hold that information in their brain and use it to influence their own output.

If a member of the public makes too close a replica, then I can sue. We do not regulate the intake of public information into human storage/retrieval systems (brains), so why should we do that for synthetic ones?

We should only regulate the output so it doesn't reproduce art or an actor's likeness, etc.

[-] atrielienz@lemmy.world 1 points 4 months ago

If you go to college for art, you are actively required to use specific licensed learning materials. They don't just grab random training material off the web and say, "draw like this but make it your own." The same principles apply. The AI has no filters. It has no way of determining what is copyright infringement and what isn't, and it can't decide what is fair use and what isn't.

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz the reason they have to use specific licensed material is that they are charging the art student and therefore must pay for the materials they provide to the student.

But as a student, you can look at any public art you want and allow it to inform your work as long as you don't copy. So that's another example of the same principle: you must pay to reproduce/distribute someone else's art for money. So we come to the same point: no reproduction, but intake is allowed.

[-] atrielienz@lemmy.world 1 points 4 months ago* (last edited 4 months ago)

Two things. One: you agree that they are charging the student, therefore providing a service, and thereby need to use licensed material because they are charging for that material or its use. Why is that different from a generative AI firm providing a paid service using unlicensed training data? We're not talking about generative AI firms as individuals. They're businesses, making money off a training set that was acquired by taking the IP of other individuals and businesses without their knowledge and consent, and using it to create something they sell as a service.

Two: there are a myriad of reasons why companies license materials, and a lot of them don't include the direct use, redistribution, or copying of any of that material. There are also a number of reasons schools license materials, up to and including uniformity, consistency, and putting their own spin on things, so to speak. That's why you might find the same art course on offer at just about any higher learning institution, but the one at Juilliard is not going to be the same as the one at the community college of Kenosha, Wisconsin. The community college can't just get a copy of the training materials used by Juilliard and reproduce those exactly. What you're saying is a gross oversimplification of the real reasons, and I feel like it might be on purpose at this point.

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz let's look at writing computer code. LLMs used public copyrighted code to get really good at writing code blocks. That's like 85% of my job, but I don't care that they are making me obsolete, because it means I can now spend more time figuring out how to do better science.

Artists should do the same. Anything that can be adequately created from a good text prompt should take ten seconds, leaving the rest of the time for the hard creative stuff 🤷‍♀️

[-] atrielienz@lemmy.world 1 points 4 months ago

This is a terrible one-to-one comparison. I can't even begin to tear this apart, it's so bad. LLMs aren't even good at writing code, which is a big part of why people have to go back and fix the code they generate.

Artists don't do art because it's work. They do art because they like to create things. You code because it's work, which is why you don't care.

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz ok, this is just insulting. I write code ONLY because I want to create things. I have dozens of open-source projects that I've built over the years. But I don't care about writing the code itself, even though it's fun sometimes.

I write the code to create the thing. And if artists cared about creation they'd use whatever tool they could. The only reason to not want an AI alternative tool is to create a moat to keep getting paid for work that could be made cheaper.

[-] atrielienz@lemmy.world 1 points 4 months ago* (last edited 4 months ago)

You started with the insults when you basically claimed that artists should feel the same way about making art that you do about writing code. You know that's not how that works. If AI makes things quicker for you, that's great, but making a piece "quicker" for an artist isn't the same thing, and it was disingenuous of you to claim otherwise. It's especially egregious considering that what's actually happening is non-artists are making "art" using LLMs and companies are buying it because it's cheap, thereby pushing real artists who are actually doing the work out of the market entirely.

A cabinet maker might use a band saw to make his life easier, but just having the band saw make the whole cabinet because it's faster? That's not how that works.

[-] bonkerfield@sigmoid.social 0 points 4 months ago

@atrielienz so you said it right here: "...can't just get a copy of the training materials used by Juilliard and reproduce those exactly."

They can't reproduce them, but if Juilliard posted its materials online for free, then a professor at the community college could look at those materials and use them to inform their own material selection.

You are muddling up a bunch of random side issues rather than addressing the principal issue: anyone at any company can view public information.

[-] atrielienz@lemmy.world 1 points 4 months ago* (last edited 4 months ago)

You seem to think "for free" means you can just take it and use it to generate revenue. That's not what it means for something to be posted on the Internet. An artist's online portfolio isn't free for the taking. That's not how that's supposed to work, and you know it.

If it were, these LLMs wouldn't shy away from using music on the Internet to train on. At least one of these firms has literally said they don't do this specifically because they don't want to get into trouble with any record labels. But sure, LLMs can steal from Getty Images and the NYT and be fine. That totally makes sense.
