submitted 3 weeks ago by furycd001@lemmy.ml to c/memes@lemmy.ml
[-] saigot@lemmy.ca 4 points 3 weeks ago

If it was done with enough regularity to be a problem, one could just put an LLM like this in between to preprocess the data.

[-] Azzu@lemm.ee 4 points 3 weeks ago

That doesn't work: you can't train models on another model's output without degrading the quality. At least not currently.

[-] Vashtea@sh.itjust.works 1 point 3 weeks ago* (last edited 3 weeks ago)

I don't think he was suggesting training on another model's output, just using AI to filter the training data before it's used.
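The filtering idea could look something like this. A minimal sketch, assuming a model-based quality scorer sits in front of the training pipeline; `quality_score` here is a toy heuristic standing in for an actual LLM or classifier call, and all names are illustrative:

```python
# Hypothetical preprocessing step: score candidate training texts and
# drop low-quality ones before they ever reach training.

def quality_score(text: str) -> float:
    """Toy stand-in for a model judging text quality (0.0 to 1.0).

    A real pipeline would call an LLM or trained classifier here.
    """
    words = text.split()
    if not words:
        return 0.0
    # Penalize highly repetitive and very short samples.
    unique_ratio = len(set(words)) / len(words)
    length_bonus = min(len(words) / 20, 1.0)
    return unique_ratio * length_bonus

def filter_training_data(samples: list[str], threshold: float = 0.3) -> list[str]:
    """Keep only samples the scorer deems good enough to train on."""
    return [s for s in samples if quality_score(s) >= threshold]

samples = [
    "spam spam spam spam spam",
    "A well-formed sentence with varied vocabulary and reasonable length overall.",
    "",
]
kept = filter_training_data(samples)
# The repetitive and empty samples are dropped; the well-formed one survives.
```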

[-] FooBarrington@lemmy.world 1 point 3 weeks ago

No, that's not true. All current models use output from previous models as part of their training data. You can't solely rely on it, but that's not strictly necessary.
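That "part of, but not solely" point could be sketched as capping the synthetic fraction of a training mix. This is an illustrative sketch, not any lab's actual recipe; the function name, the 25% default, and the data lists are all made up for the example:

```python
import random

def mix_training_data(human: list[str], synthetic: list[str],
                      synthetic_frac: float = 0.25, seed: int = 0) -> list[str]:
    """Build a training set where model-generated text is capped at a
    fixed fraction of the total, with human data making up the rest."""
    rng = random.Random(seed)
    # Number of synthetic samples needed so they form `synthetic_frac`
    # of the combined set: n_syn / (n_human + n_syn) == synthetic_frac.
    n_synthetic = min(len(synthetic),
                      int(len(human) * synthetic_frac / (1 - synthetic_frac)))
    mixed = human + rng.sample(synthetic, n_synthetic)
    rng.shuffle(mixed)
    return mixed

human = [f"human text {i}" for i in range(30)]
synthetic = [f"model output {i}" for i in range(20)]
mixed = mix_training_data(human, synthetic)  # 30 human + 10 synthetic = 40 total
```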

this post was submitted on 22 Apr 2025
1516 points (100.0% liked)
