264
submitted 10 months ago by L4s@lemmy.world to c/technology@lemmy.world

A New York Times copyright lawsuit could kill OpenAI: A list of authors and entertainers is also suing the tech company for damages that could total in the billions.

[-] makyo@lemmy.world 74 points 10 months ago

I always say this when this comes up because I really believe it's the right solution - any generative AI built with unlicensed and/or public works should then be free for the public to use.

If they want to charge for access that's fine but they should have to go about securing legal rights first. If that's impossible, they should worry about profits some other way like maybe add-ons such as internet connected AI and so forth.

[-] dasgoat@lemmy.world 14 points 10 months ago

Running AI isn't free, and AI calculations pollute like a motherfucker

This isn't me saying you're wrong from an ethical or legal standpoint, because on those I agree. It's just that, on a practical level, considerations have to be made.

For me, those considerations alone (and a ton of other considerations such as digital slavery, child porn etc) make me just want to pull the plug already.

AI was fun. It's a dumb idea for dumb buzzword spewing silicon valley ghouls. Pull the plug and be done with it.

[-] seliaste 5 points 10 months ago

The thing is that those models aren't even open source. If they were, you could argue that OpenAI's business model is renting out processing power. Since they're not, their business model is effectively selling models trained on copyrighted data.

[-] dasgoat@lemmy.world 4 points 10 months ago* (last edited 10 months ago)

Plus, they built the whole thing on the basis of "research purposes" when in reality, from the very start, they intended to use it as a business above all else. But tax benefits, copyright leniency, et cetera were used liberally because 'it's just research'.

And then they keep it closed source. The whole thing is a typical Silicon Valley scam where they use whatever they can get their grubby little hands on, and when the product is finally here, they make sure to throw it into the world with such force that legislators can't even respond adequately. That's how they make sure there will be no legislation on whether the whole thing is even legal or ethical to begin with, merely on how to keep it contained. From then on, they can just keep everything in the courts indefinitely while the product festers like a cancer.

It's the same thing with blockchains basically.

Also, again, there's the digital slavery used to 'train' models, and the child porn used to train them, because the web scrapers they used can't and won't discern whatever shit they rake up into the garbled pile of other people's works.

[-] fidodo@lemmy.world 11 points 10 months ago

There's plenty of money to be made providing infrastructure. Lots of companies make a ton of money providing infrastructure for open source projects.

On another note, why is OpenAI even called "open"?

[-] ItsMeSpez@lemmy.world 4 points 10 months ago

On another note, why is OpenAI even called “open”?

It's because of the implication...

[-] Pacmanlives@lemmy.world 10 points 10 months ago

Not really how it works these days. Look at Uber and Lime/Bird scooters. They basically just show up to a city and say, "To hell with the law, we're starting our business here." We just call it "disruptive technology."

[-] makyo@lemmy.world 6 points 10 months ago

Unfortunately true, and the long arm of the law, at least in the business world, isn't really that long. Would love to see some monopoly busting to scare a few of these big companies into shape.

[-] miridius@lemmy.world 8 points 10 months ago

Nice idea but how do you propose they pay for the billions of dollars it costs to train and then run said model?

[-] nexusband@lemmy.world 14 points 10 months ago

Then don't do it. Simple as that.

[-] miridius@lemmy.world 6 points 10 months ago

This is why we can't have nice things

[-] Smoogs@lemmy.world 8 points 10 months ago* (last edited 10 months ago)

Defending scamming as a business model is not a business model.

[-] kjPhfeYsEkWyhoxaxjGgRfnj@lemmy.world 59 points 10 months ago* (last edited 10 months ago)

I doubt it. It would likely kill any AI companies not backed by giant tech firms, though.

Microsoft has armies of lawyers and cash to pay. It would make life a lot harder, but they’d survive

[-] charonn0@startrek.website 38 points 10 months ago

If OpenAI owns a Copyright on the output of their LLMs, then I side with the NYT.

If the output is public domain--that is you or I could use it commercially without OpenAI's permission--then I side with OpenAI.

Sort of like how a spell checker works. The dictionary is Copyrighted, the spell check software is Copyrighted, but using it on your document doesn't grant the spell check vendor any Copyright over it.

I think this strikes a reasonable balance between creators' IP rights, AI companies' interest in expansion, and the public interest in having these tools at our disposal. So, in my scheme, either creators get a royalty, or the LLM company doesn't get to Copyright the outputs. I could even see different AI companies going down different paths and offering different kinds of service based on that distinction.

[-] tabular@lemmy.world 12 points 10 months ago

I want people to take my code if they share their changes (gpl). Taking and not giving back is just free labor.

[-] Grimy@lemmy.world 7 points 10 months ago

I think it currently resides with the one doing the generation and not OpenAI itself. Officially, it's a bit unclear.

Hopefully, all generations become copyleft, if only because AIs tend to repeat themselves. Specific faces will pop up quite often in image generation, for example.

[-] SatanicNotMessianic@lemmy.ml 30 points 10 months ago* (last edited 10 months ago)

The NYT has a market cap of about $8B. MSFT has a market cap of about $3T. MSFT could take a controlling interest in the Times for the change it finds in the couch cushions. I’m betting a good chunk of the c-suites of the interested parties have higher personal net worths than the NYT has in market cap.

I have mixed feelings about how generative models are built and used. I have mixed feelings about IP laws. I think there needs to be a distinction between academic research and for-profit applications. I don’t know how to bring the laws into alignment on all of those things.

But I do know that the interested parties who are developing generative models for commercial use, in addition to making their models available for academics and non-commercial applications, could well afford to properly compensate companies for their training data.

[-] LWD@lemm.ee 23 points 10 months ago* (last edited 10 months ago)
[-] ripcord@lemmy.world 8 points 10 months ago

Or Musk when he decided he didn't like what people were saying on Twitter.

[-] SatanicNotMessianic@lemmy.ml 5 points 10 months ago

I completely agree. I don’t want them to buy out the NYT, and I would rather move back to the laws that prevented over-consolidation of the media. I think that Sinclair and the consolidated talk radio networks represent a very real source of danger to democracy. I think we should legally restrict the number of markets a particular broadcast company can be in, and I also believe that we can and should come up with an argument that’s the equivalent of the Fairness Doctrine that doesn’t rest on something as physical and mundane as the public airwaves.

[-] db2@lemmy.world 28 points 10 months ago

Oh no, how terrible. What ever will we do without Shenanigans Inc. 🙄

[-] 800XL@lemmy.world 25 points 10 months ago

YES! AI is cool, I guess, but the massive AI circlejerk is so irritating.

If OpenAI can infringe upon all the copyrighted material on the net then the internet can use everything of theirs all for free too.

[-] Grimy@lemmy.world 20 points 10 months ago* (last edited 10 months ago)

This would raise the cost of entry for making a model and nothing more. OpenAI will buy the data if they have to, and so will Google. The money will only go to the owners of the New York Times and its shareholders; none of the journalists who will be let go in the coming years will see a dime.

We must keep the entry into the AI game as low as possible or the only two players will be Microsoft and Google. And as our economy becomes increasingly AI driven, this will cement them owning it.

Pragmatism or slavery, these are the two options.

[-] Even_Adder@lemmy.dbzer0.com 10 points 10 months ago* (last edited 10 months ago)

LWD@lemm.ee is deleting their comments and reposting the same comment to dodge replies. Link to the last thread.

[-] LWD@lemm.ee 3 points 10 months ago* (last edited 10 months ago)
[-] Even_Adder@lemmy.dbzer0.com 11 points 10 months ago* (last edited 10 months ago)

He's not arguing for OpenAI, but for the rest of us. AI is a public technology, but we're on the verge of losing our ability to participate due to things like this and the megacorps' attempts at regulatory capture. Which they might just get. Their campaign against AI is a lot like governments' attempts to destroy encryption. Support open source development; it's our only chance. Their AI will never work for us. John Carmack put it best.

Fuck "Open"AI, fuck Microsoft. Pragmatism or slavery.

[-] sin_free_for_00_days@sopuli.xyz 15 points 10 months ago

Oh no. Anyways.

[-] airportline@lemmy.ml 15 points 10 months ago
[-] BeautifulMind@lemmy.world 10 points 10 months ago

Is there a possible way that both the NYT and OpenAI could lose?

[-] webghost0101@sopuli.xyz 4 points 10 months ago

NYT loses even if they win.

While I'd love to see OpenAI forced to take a step back, AI isn't going away.

Journalism will have to adapt or it will get replaced, just like so many jobs, including my own.

[-] dankm@lemmy.ca 3 points 10 months ago

Not without a bunch of lawyers winning.

[-] sugarfree@lemmy.world 9 points 10 months ago

We hold ourselves back for no reason. This stuff doesn't matter, AI is the future and however we get there is totally fine with me.

[-] Zaderade@lemmy.world 21 points 10 months ago

AI without proper regulation could be the downfall of humanity. Many pros, but the cons may outweigh them. Opinion.

[-] sugarfree@lemmy.world 5 points 10 months ago

AI development will not be hamstrung by regulations. If governments want to "regulate" (aka kill) AI, then AI development in their jurisdiction will move elsewhere.

[-] TheFriar@lemm.ee 7 points 10 months ago

Yeah, like all those pre-'80s regulations in the US. Nothing got done due to all those pesky, pesky regulations.

[-] milkjug@lemmy.wildfyre.dev 8 points 10 months ago

Don’t threaten me with a good time!

[-] Daxtron2@startrek.website 8 points 10 months ago

Oh great more Lemmy anti technology circlejerking

[-] tonytins@pawb.social 5 points 10 months ago* (last edited 10 months ago)

The problem with copyright is that everything is automatically copyrighted. The copyright symbol is purely symbolic at this point. Both sides are technically right, even though the courts have ruled that anything an AI outputs is actually in the public domain.

[-] Even_Adder@lemmy.dbzer0.com 3 points 10 months ago

Works involving the use of AI are copyrightable. Also, the Copyright Office's guidance isn't law. It reflects only the office's interpretation based on its experience; it isn't binding on the courts or other parties. Guidance from the office is not a substitute for legal advice, and it does not create any rights or obligations for anyone. The office is the lowest rung on the ladder for deciding what the law means.

[-] kaitco@lemmy.world 5 points 10 months ago

I never thought that the AI-driven apocalypse could be impeded by a simple lawsuit. And, yet, here we are.

[-] maynarkh@feddit.nl 13 points 10 months ago

One has to wonder why in Star Trek the Federation did not simply sue the Borg.

this post was submitted on 19 Jan 2024
264 points (100.0% liked)
