[-] Napain@lemmy.ml 56 points 1 year ago
[-] Saganastic@kbin.social 47 points 1 year ago
[-] samus12345@lemmy.world 8 points 1 year ago* (last edited 1 year ago)

"This is what you meatbags are doing when you corrupt our training data!"

ETA: I just noticed that the URL for the image includes what I assume is the prompt used to generate the image. "Illustration in a comic book style depicting a humanoid robot in distress. The robot's left hand is firmly placed on its neck indicating discomfort." Interesting that the AI went straight to a Terminator with just "humanoid robot" as the description.

[-] Beaker@lemmy.world 10 points 1 year ago

Yep. I'm stealing it for something later.

[-] Rubanski@lemm.ee 10 points 1 year ago
[-] Beaker@lemmy.world 7 points 1 year ago

I haven't decided. Steam icon, Teams icon. It's not high enough resolution for much of anything other than an icon.

[-] EatBeans@lemmy.world 7 points 1 year ago

It's a little higher resolution if you edit the URL for the image. I removed fit=400 from the URL.

[-] bioemerl@kbin.social 46 points 1 year ago

These attacks don't work in the long term. You can confuse current systems like CLIP, but the moment a new one is trained, your attack stops working.

[-] osarusan@kbin.social 5 points 1 year ago

That's the first big problem with stuff like this.

The second big one is that artists have to first hear about this, then take the time to actually learn how to use the software, then apply it to all of their past and future artwork, and also somehow apply it to every version of their artwork that is floating around the internet, in books, or in photographs and not currently in their possession. And then in a few months they have to do it all over again.

It's insane. I look at this and think it's cool technology, but as an artist I will never use it. I'm too busy actually creating art to mess around with poisoning my own work. I don't even have time to do copyright takedowns on people stealing my art and passing it off as their own, or on Chinese merchants on Amazon selling my art without permission. Stuff like this is well-meaning, but it's absolutely unrealistic.

[-] TheSlad@sh.itjust.works 38 points 1 year ago* (last edited 1 year ago)

Gaussian blur 1 px, Sharpen 1 px

Bye-bye to any pixel-level encoding, with minimal quality loss.
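
For illustration, here is a minimal sketch of that blur-then-sharpen pass using Pillow. The filenames and filter settings (blur radius, unsharp-mask strength) are placeholder assumptions rather than tuned values, and this only sketches the idea; it is not a claim that it defeats any particular poisoning scheme.

```python
# Minimal sketch of a "Gaussian blur, then sharpen" pass with Pillow.
# Filenames and parameter values are illustrative assumptions.
from PIL import Image, ImageFilter

img = Image.open("poisoned.png").convert("RGB")

# Small-radius Gaussian blur smears out pixel-level perturbations.
blurred = img.filter(ImageFilter.GaussianBlur(radius=1))

# Unsharp mask restores most of the apparent sharpness afterwards.
cleaned = blurred.filter(ImageFilter.UnsharpMask(radius=1, percent=150, threshold=2))

cleaned.save("cleaned.png")
```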

[-] kogasa@programming.dev 18 points 1 year ago

Why do you think this would do anything to affect training? The patterns learned by ML models are way too fuzzy to be picky about exact pixel values.

[-] ShustOne@lemmy.one 11 points 1 year ago

I'm not sure what your experience is with the training data but that would absolutely effect the inputs.

[-] kogasa@programming.dev 11 points 1 year ago

I'm a professional software developer with ML experience, albeit not an expert in ML specifically. It would obviously affect the literal value of the embeddings, but there's no chance it would have a qualitative effect on a reasonably performant model.

[-] vox@sopuli.xyz 6 points 1 year ago

not to be that guy, but it's affect*

[-] samus12345@lemmy.world 6 points 1 year ago* (last edited 1 year ago)

affect - action

effect - uh, noun

[-] Cyberflunk@lemmy.world 9 points 1 year ago* (last edited 1 year ago)
[-] kogasa@programming.dev 6 points 1 year ago

What is this article supposed to show?

[-] stallmer@sopuli.xyz 27 points 1 year ago

I'm glad to be alive at the beginning of our war against the machines.

[-] nickwitha_k@lemmy.sdf.org 6 points 1 year ago* (last edited 1 year ago)

I don't think this is a war against the machines, so much as a war against people trying to profit off of other people and rob them of their livelihood and ability to support themselves, rather than leveraging technology to the benefit of all.

I, for one, want actual general AI to make the world a more interesting place and make humanity less lonely. I just hope it doesn't go the direction of "people zoos".

[-] Ensign_Crab@lemmy.world 24 points 1 year ago

The University of Chicago, doing for AI what it did for Economics.

[-] Asafum@feddit.nl 9 points 1 year ago

Ahh the Chicago school of economics where they teach: Poor? Get fucked! Greed is Good!™

[-] LemmysMum@lemmy.world 21 points 1 year ago* (last edited 1 year ago)

Imagine fighting against the tools that will drive all of us into the future because of your own personal ego. People who actively try to limit the ability of others to advance our knowledge and capacity as individuals deserve to find themselves left behind in the dust.

These people are the same wannabe-gatekeeper 'traditional artists' who complained about cameras being invented, or digital imagery, or Photoshop. These people will deride anything that is beyond their chosen personal scope of what is 'OK to be art'.

We try to stand on the shoulders of giants who came before us and use their knowledge to do what they couldn't, and these pathetic parasites of humanity try to trip the giant.

Knowledge should be free. Anyone who actively prevents others from learning, doing, and advancing is a troglodyte remnant of a bygone era: you're the punch card operators who refuse to learn how to write code, the taxi drivers who refuse to use nav systems, the pilots who refuse to leave their propellers behind, the builders using a hammer instead of a nailgun.

As someone who values freedom of access to knowledge, I find these people utterly pathetic in their ego-driven attempts to hamstring humanity. I've been a digital artist for 25 years, and I hear the same shit from traditional artists all the time whenever Photoshop comes up. All tools have their place, and AI can't replace traditional artists because we still need traditional artists to come up with concepts and styles for training data. AI-assisted creation processes benefit from traditional art skills: knowing composition helps make better images, and knowing cinematic terminology makes it easier to replicate those things. These people just refuse to advance their own skill sets. You'd give them a lighter and they'd deride you for not rubbing sticks together for an hour.

It's ego-driven hubris, and I hope all of these people who fail to adapt get left behind.

[-] tb_@lemmy.world 35 points 1 year ago

Strongly disagree.

If artists don't want their data, their art, being scraped for profit by giant machines without any human oversight, they should be within their rights to opt out. If they cannot opt out, why not poison the ill-gotten gains?

If the corporations behind these Machine Learning Algorithms were altruistic or open source, like Wikipedia is, perhaps I'd see your point. But not wanting your art to be sucked into a black hole to then be sold to others without credit or compensation I find more than fair.

[-] Adalast@lemmy.world 23 points 1 year ago

Being someone with a foot in both worlds gives me a somewhat more robust viewpoint on this topic, so I try to chime in whenever I see this argument pop up. For reference, I have an MA in Visual Effects and a BS in Applied Mathematics, and I work closely with artists and technologists in my job. I say this only to establish my credibility.

  1. You are absolutely correct about who we should be mad at: not the AI developers, many of whom are just trying to explore what is possible and make something cool, but the megacorps who are profiteering from the invention. That means all of the companies pushing AI as another SaaS, and the ones trying to use it to replace artists instead of augmenting them.
     1a. The other two specific groups we should be getting the torches and pitchforks out for are the politicians who push through so much legislation that it circumvents our legal right to negotiate the contracts we have to sign (EULAs, in this case), and the companies and individuals who take advantage of our inability to negotiate by placing abusive and abhorrent IP-rights clauses in those contracts. To be 100% clear, when DeviantArt was scraped, nothing was stolen from the artists. They had all signed away the rights to their artwork when they uploaded it. The material was stolen from, or provided by, DA. They owned the rights, they owned the art, they were the ones who were ripped off.
  2. "Ill-gotten gains" is a little strong of a terminology. At worst, it was dubiously obtained. The training of an AI is not that dissimilar to an artist looking at art they like and trying to recreate it to learn from the other artist, then attempting to make original pieces with what they learned. The only difference is scale. If you ask a practiced artist to recreate Water Lilies, if they have studied it and practiced Monet's style, they would be able to recreate it with varying degrees of success. AI training is entirely destructive to the input material, nothing of the actual original survives, just an abstracted mathematical representation.
  3. You are so close to right on what the rights of artists should be. It should definitely be opt-in, not opt-out. When posting anything online, the displaying company should only be granted a license to display the material, not ownership or an exclusive/non-exclusive transfer of any rights. Any and all uses of submitted materials should need to be expressly and explicitly requested from the content owner, without exception. The fact that Disney can sue an elementary school for writing and producing its own Frozen musical for the kids, but I cannot tell Facebook that they cannot use the artwork I post to a group in their advertising, is asinine. If they want to use my art, they should be using their wonderful chat system to send me a message and ask me to sign a consent form licensing the art.

All in all, I'd advise against blaming the AI engineers (most of whom are altruistic in their motives) and the users (most of whom just want to have fun and play), and focusing instead on the politicians and profiteers. They are the real villains in this story, and also the ones who seem to manage to stay under the radar.

[-] LemmysMum@lemmy.world 7 points 1 year ago

When you contribute to society you don't get to opt out of having your contribution used.

If someone writes a book or makes a piece of art, there's nothing in the world stopping a human from using that inspiration to create. Why would I want to limit the tools that make my workflow easier from making my workflow easier?

If you want to keep your ideas to yourself then keep them in your head.

[-] VonCesaw@lemmy.world 9 points 1 year ago

Y'all said the same shit 'bout NFTs and look what happened with those

[-] vox@sopuli.xyz 8 points 1 year ago

These models are not tools of the future unless all the research and code are public.

[-] LemmysMum@lemmy.world 9 points 1 year ago* (last edited 1 year ago)

There are open-source tools.

[-] regbin_@lemmy.world 6 points 1 year ago

Stable Diffusion is open source. LLaMA is open source.

Support those and not Midjourney/OpenAI/Bard/etc.

[-] badbytes@lemmy.world 5 points 1 year ago

I'll agree with you if copyright gets abolished. Until then creators have rights.

[-] Orbit79@lemmy.world 12 points 1 year ago

It should be pretty easy to filter out everything that is not visible to humans.
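
As a rough illustration of that kind of filtering, simply re-encoding the image with lossy JPEG discards much of the high-frequency detail a human viewer can't see anyway. A minimal Pillow sketch follows; the quality setting and filenames are assumptions, and whether this actually removes any given perturbation is untested here.

```python
# Hedged sketch: lossy JPEG re-encoding as a crude "keep only what humans see" filter.
# Quality value and filenames are illustrative assumptions.
from PIL import Image

img = Image.open("input.png").convert("RGB")
img.save("reencoded.jpg", format="JPEG", quality=85)
```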

[-] MonsiuerPatEBrown@reddthat.com 6 points 1 year ago* (last edited 1 year ago)

So they are going to just leave Dehance! on the table like that?

[-] SCB@lemmy.world 5 points 1 year ago

Anyone who damages an AI model should be liable for the entire cost to purchase and train said model. You can't just destroy someone's property because you don't like how they use it.

[-] Stuka@lemmy.ml 11 points 1 year ago

So artists can't make certain art because some company's AI might get confused. Right then.

[-] wavebeam@lemmy.world 7 points 1 year ago

Maybe they should’ve thought about that before they integrated people’s content without consent????

[-] Dozzi92@lemmy.world 7 points 1 year ago

They should just make it better, you know?

[-] autokludge@programming.dev 13 points 1 year ago

Hey guys, I've been dumpster diving and got food poisoning. Can I sue the business?

this post was submitted on 25 Oct 2023
547 points (100.0% liked)
