hendrik@palaver.p3x.de 2 points 3 days ago (last edited 3 days ago)

Thanks for the comprehensive write-up. I'm not sure I completely agree, and I'd like to see an example of such an app. Some time ago, when this was big in the news because some school kids had made deepfake nudes of their classmates, I followed the reporting, but that was a commercial app. And they specifically catered to that niche: they had no imprint, obfuscated their business address and hosting provider, and put no restrictions in place on generating (or face-swapping) pictures of minors. Likely all of that on purpose. (And nobody took the service down; when I checked almost a year later, it was still up.)

I'm also not sure what they use behind the scenes. I'd say the app itself has to be open source to qualify as an "open-source app", or it needs to be phrased differently. And it can't just be that it uses open-source technology somewhere, since Amazon, Google, and all the major companies build their services on open source, yet Google Search and AWS are not commonly referred to as "open-source" despite being powered by it to some varying degree. So we need to add some nuance, or the label has no meaning.

So yeah, I'd really like to know what was used here and whether the reference to open source was made up and doesn't really apply to the topic at hand. I think that's important for the argument. If the potential criminals use easy, ready-made (commercial) web services instead of buying an RTX 5090, learning ComfyUI, dealing with the steep learning curve, etc., we'd know we have to primarily fight those apps and services, not necessarily the generative AI tools. Or at least do the things with a big impact first, before jumping to very complicated and likely impossible options that come with severe downsides.

hedgehog@ttrpg.network 3 points 2 days ago

To be clear, I agree that the line you quoted is almost assuredly incorrect. If they changed it to "thousands of deepfake apps powered by open source technology" I'd still be dubious, simply because it seems weird that there would be thousands of unique apps that all do the same thing, but that would at least be plausible. Most likely they misread something like https://techxplore.com/news/2025-05-downloadable-deepfake-image-generators.html, took "model variant" (which in this context generally means a LoRA) to mean "app", and jumped too hard on the "everything is an open source app" bandwagon.

I did some research: browsing https://github.com/topics/deepfakes (which lists 153 repos in total, many of them focused on deepfake detection), searching DDG, clicking through to related apps from GitHub repos, etc.

In terms of actual open source deepfake apps, let's assume "app" means, at minimum, a piece of software you can run locally on arbitrary consumer-targeted hardware (generally at least an Nvidia desktop GPU), regardless of whether you use it by writing custom code (so long as that code is included), via a CLI, an API, a GUI app, a web browser, or a phone app. Counting only apps whose primary use case includes creating deepfakes by face-swapping videos, there are nonetheless several:

  • Roop
  • Roop Unleashed
  • Rope
  • Rope Live
  • VisoMaster
  • DeepFaceLab
  • DeepFaceLive
  • Reactor UI
  • inswapper
  • REFace
  • Refacer
  • Faceswap
  • deepfakes_faceswap
  • SimSwap

If you included forks of all those repos, then you'd definitely get into the thousands.
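(That claim is checkable rather than just eyeballable: GitHub's public REST API exposes a forks_count field per repository. A minimal sketch, assuming unauthenticated access - so the 60-requests-per-hour rate limit applies - and with the repo list to be filled in as "owner/name" strings:)

```python
import json
import urllib.request


def fetch_forks_count(full_name):
    """Fetch forks_count for one repo ("owner/name") from GitHub's public API.

    Unauthenticated requests are rate-limited (60/hour), so a token would be
    needed to sweep a long list in one go.
    """
    url = f"https://api.github.com/repos/{full_name}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["forks_count"]


def total_forks(repos, fetch=fetch_forks_count):
    """Sum fork counts across a list of repos; `fetch` is injectable for testing."""
    return sum(fetch(name) for name in repos)
```

(This only counts direct forks; forks of forks would need a walk of each repo's forks list, so the real total is higher still.)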

If you count video generation applications that can imitate people using, at minimum, Img2Img plus one LoRA, or two LoRAs, then these would be included as well:

  • Wan2GP
  • HunyuanVideoGP
  • FramePack Studio
  • FramePack eichi

And if you count the tools that integrate those, then these probably all count:

  • ComfyUI
  • Invoke AI
  • SwarmUI
  • SDNext
  • Automatic1111 SD WebUI
  • Fooocus
  • SD WebUI Forge
  • MetaStable
  • EasyDiffusion
  • StabilityMatrix
  • MochiDiffusion

> If the potential criminals use easier ready-made (commercial) web-services instead of buying a RTX 5090, learning ComfyUI, dealing with the steep learning curve etc, we’d know we have to primarily fight those apps and services, not necessarily the generative AI tools.

To answer that, someone would need to actually test the deepfake apps and compare their outputs. I know they get used for deepfakes because I've seen the outputs, but as far as I know, every major platform - e.g., Kling, Veo, Runway, Sora - has safeguards in place to prevent nudity and sexual content. I'd be very surprised if those were being used en masse for this.

In terms of the SaaS apps used by people seeking to create nonconsensual, sexually explicit deepfakes... my guess is those are actually not really part of the figure that's being referenced in this article. It really seems like they're talking about doing video gen with LoRAs rather than doing face swaps.

this post was submitted on 04 Jun 2025
39 points (100.0% liked)

Fuck AI
