But... they can already do all of that? And nothing is stopping you from having a secret offshore account in a foreign currency instead of cryptocoins.
The source for that number is the International Atomic Energy Agency, aka the nuclear control agency. As for the rest of your ideas, it's sadly not that easy. The waste has to be stored somewhere it can't contaminate the environment, where water can't get to it, where the tectonics are stable, etc. No permanent storage location for the waste has been found to date.
And to burn the unburned fuel you would have to breed the material, a process that requires the most dangerous kinds of reactors and is extremely costly.
I wonder what that means for Google operating in Russia. Since the demanded sum is so large, do they get their assets seized, for example authorities confiscating Google phones? Or does this have absolutely no consequences?
Wow, that sounds like a great way to stop the machine revolution!!!!! Let's do that!!!!1
cattaria
Yes, there is a JAR file on the main website that works just fine. Also, i2pd is in the Tumbleweed repo.
No, it's not like that. You only seed videos you are currently watching.
most of them are from left to right :-)
live player reaction:
Just chiming in here to say that this is very much like security through obscurity. In this context the "secure" part is being sure that the images you host are ok.
Bad actors using social engineering to get the banlist is much easier than using open-source AI and collectively fixing the bugs when the trolls manage to evade it. It's not that easy to get around image filters like this, and having to do weird things to the pictures to pass the filter could be enough work to keep most trolls from posting.

Using a central instance that filters all images is also not good, because then the person operating that service is responsible for a large chunk of your images, creating a single point of failure in the fediverse (and something that could be monetised to our detriment). Closed source can't be the answer either, because if someone breaks the filter, the community can't fix it, only the developer can. So either the dev team is online 24/7 or it's paid, making hosting a fediverse server dependent on someone's closed-source product.
I do think, however, that disabling image federation should be an option. Turning image federation off for certain servers for a limited time could be a very effective tool against these kinds of attacks!
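To make the "not that easy to get around" point concrete, here is a minimal sketch of what a banlist-style image filter could look like, assuming Python with the Pillow and imagehash libraries. The hash value, the distance threshold and the file name are made-up placeholders, not anything an actual fediverse server uses; it just shows why trivially re-encoding or slightly editing an image usually isn't enough to slip past a perceptual-hash banlist.

```python
# Minimal sketch of banlist-style image filtering with perceptual hashes.
# The banlist entry, threshold and file name below are hypothetical.
from PIL import Image
import imagehash

# Hypothetical banlist of perceptual hashes of known-bad images.
BANLIST = {
    imagehash.hex_to_hash("d879f8f89b939c3c"),
}

# Maximum Hamming distance at which two hashes count as "the same image".
MAX_DISTANCE = 8

def is_banned(path: str) -> bool:
    """Return True if the image at `path` is perceptually close to a banned image."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects gives their Hamming distance, so
    # small crops, re-encodes or colour tweaks usually still match.
    return any(candidate - banned <= MAX_DISTANCE for banned in BANLIST)

if __name__ == "__main__":
    print(is_banned("upload.jpg"))
```

Because the matching is fuzzy, a troll has to distort the picture quite heavily before it stops matching, which is exactly the kind of friction that keeps most of them from bothering.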
Alright, here's an idea I had: perhaps if some of the court cases find ChatGPT and others guilty of copyright infringement, then free software licenses like the GPL would forbid nonfree AI automatically, as that would then be a license violation?