this post was submitted on 03 Jan 2024
664 points (100.0% liked)
Technology
deleted
deleted
deleted
Um, no, we're defending actually open AI models; I couldn't give two shits about OpenAI. OpenAI has the funding to license things. But that open-source model trying to compete against big corporations like Microsoft and Google? It doesn't.
You're actually advocating for the big corporations. If things go the way you want, the truly open models will die off and the big corporations will control AI completely from then on. Is that what you really want?
deleted
I fail to see what he, or your comment, has to do with generative AI models, which is what we're talking about.
I don't think you fully understand how generative AIs work. The input data is used to learn in a way similar to, but far more rudimentary than, how humans learn. The model itself contains no recognizable original data: just a bunch of numbers, math, and weights that attempt to simulate the neurons and synaptic pathways our brains form when we learn things.
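A toy sketch can make the "just numbers and weights" point concrete. This is purely illustrative (the layer sizes and weights below are made up, not taken from any real model): a trained network is stored as arrays of parameters, and producing an output is nothing but arithmetic on those arrays; no training example is stored verbatim anywhere.

```python
import numpy as np

# Illustrative only: a tiny two-layer network. In a real trained model the
# weight arrays would come from training, but either way the model is just
# these numbers plus arithmetic -- none of the input data itself is kept.
rng = np.random.default_rng(0)

W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)  # "learned" parameters
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    """Forward pass: matrix multiplies and a nonlinearity, nothing else."""
    h = np.maximum(0, x @ W1 + b1)  # ReLU activation
    return h @ W2 + b2

out = forward(np.ones(4))
print(out.shape)
```

The whole "model" here is four arrays of floats; scaling those arrays up to billions of entries is, conceptually, all a generative model is.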
Yes, a carefully crafted prompt can get it to spit out a near-identical copy of something it was trained on (assuming it was trained on enough data from the target artist to begin with), but so can humans. Humans have gotten in trouble when attempting to profit off such copies, and in that case justice must be served regardless of whether it was an AI or a human that reproduced the work.
But using something that was publicly available on the Internet as input is fair game, just as any human might study a sampling of images to nail down a certain style. Humans are simply far more efficient at it, needing far, far less data.
deleted
Not all AIs do; the more "traditional" ones you're probably thinking of don't. The ones generating text, images, and video, however, are based on deep-learning neural network architectures such as Generative Adversarial Networks, and those do learn, albeit in a fashion rudimentary compared to humans, but learning nonetheless.
deleted
Here is an alternative Piped link(s):
computers do not learn