You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.
(You need to sign both the models and the programs to make sure there's no img2img.)
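The check described above could be sketched as a digest allowlist; this is a minimal illustration under my own assumptions (the function names and the allowlist shape are invented here, not part of any existing scheme), where both the model weights and the generating program must hash to an approved value:

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    """Hex digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()


def is_approved(model_bytes: bytes, program_bytes: bytes,
                approved_models: set[str],
                approved_programs: set[str]) -> bool:
    # Both checks are required: an approved model driven by an
    # unapproved program (e.g. one that enables img2img) must fail.
    return (sha256_hex(model_bytes) in approved_models
            and sha256_hex(program_bytes) in approved_programs)
```

A real deployment would presumably use asymmetric signatures (e.g. Ed25519) over these digests rather than a bare allowlist, so that verifiers only need the signing authority's public key, not a synchronized list.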
You don't need scientific arguments for everything you know. What's your argument against consenting 10-year-old siblings having sex together if they use protection? I don't have one, but I know it's morally wrong and won't bring anything good.
You don’t even need to give them a model, just generate some images and publish them.
If you find those images, it’s fine; if you find anything else, arrest them.
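The publish-then-check idea could be sketched as a membership test against the published set (names here are illustrative, and this assumes byte-identical copies):

```python
import hashlib


def published_image_digests(images: list[bytes]) -> set[str]:
    # The authority generates and publishes these images; anyone can
    # recompute the digests from the published files.
    return {hashlib.sha256(img).hexdigest() for img in images}


def is_published(image: bytes, published: set[str]) -> bool:
    # Exact-match check: a plain cryptographic hash only recognizes
    # byte-identical copies; any re-encode, resize, or crop would
    # fall outside the published set.
    return hashlib.sha256(image).hexdigest() in published
```

The comment glosses over that brittleness: real content-matching systems typically rely on perceptual hashing rather than exact digests, precisely because trivial transformations defeat an exact-match check.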
This reminds me of a voting method I've seen some anarchists advocate for: the rules passed by votes should only be enforced on those who voted for it.
I mean, this started with Stable Diffusion 1.x→XL, which were only loosely open, and it has just gotten worse, with progressively farther-from-open-licensed image-gen models being described as “open weights”. But yes, Flux.1 Krea (like the weights-available versions of Flux.1 from BFL itself) is not open even to the degree of the older versions of Stable Diffusion: weights-available and “free-as-in-beer licensed for certain uses”, sure, but not open.
I don’t think censorship of nearly any kind has any place on the internet, but neither do kids.
It’s a parent’s responsibility to keep their children away from that type of content, not to hand them access to it so they can develop malign, destructive ideas about sex, intimacy, and women.
Unfortunately that part never became practicable, and the only use cases that gained traction are speculation, fraud, extortion, and dark web marketplaces.