Hacker News | CaptainFever's comments

You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.

(You need to sign both the models and the programs to make sure there's no img2img.)
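A minimal sketch of the verification idea, using Python's stdlib `hmac` as a stand-in for a real public-key signature scheme (a real deployment would use something asymmetric like Ed25519, so verifiers never hold the signing key; every name here is hypothetical):

```python
import hmac
import hashlib

# Hypothetical: the signing authority holds a key and publishes signatures
# over each approved model/program binary. HMAC stands in for a real
# asymmetric signature scheme purely to keep the sketch stdlib-only.
SIGNING_KEY = b"stand-in-for-an-authority-held-key"

def sign_artifact(artifact: bytes) -> str:
    """Produce a signature over an approved model or program binary."""
    return hmac.new(SIGNING_KEY, artifact, hashlib.sha256).hexdigest()

def verify_artifact(artifact: bytes, signature: str) -> bool:
    """Check the signature was issued over exactly these bytes."""
    expected = hmac.new(SIGNING_KEY, artifact, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

model_bytes = b"...approved model weights..."
sig = sign_artifact(model_bytes)

print(verify_artifact(model_bytes, sig))          # True: untouched artifact
print(verify_artifact(b"tampered weights", sig))  # False: any edit breaks it
```

Signing both the model and the program that runs it matters because, as noted above, a signed model driven by an unsigned img2img pipeline would defeat the whole scheme.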


The level of tech-solutionist brain rot you need to reach to propose a state-sponsored child porn generator... This forum is a parody of itself.

So no real arguments against it, only insults. That's great news, thank you! :)

You don't need scientific arguments for everything, you know. What's your argument against consenting 10-year-old siblings having sex together if they use protection? I don't have one, but I know it's morally wrong and won't bring anything good.

You don’t even need to give them a model, just generate some images and publish them. If you find those images, it’s fine, if you find anything else, arrest them.

That works too, though it'll of course result in a smaller selection and therefore smaller impact on the real market.

We can't agree on weed or safe injection sites, you think we'll have government approved CP generation?

I totally agree, we should aim for all three harm reduction measures.

Yeah? How? What strategy do you have for government funded and provided CP that is even remotely within the realm of political possibility?

This reminds me of a voting method I've seen some anarchists advocate for: rules passed by vote should only be enforced on those who voted for them.

Rude.


We really need to add "please don't write comments witch-hunting articles for AI usage" into the guidelines at this rate


It is useful for those of us always checking the comments first, to decide if the article is worth reading.


"Information wants to be free, as long as it's not my information."


"...and as long it's created by a human."

(Because I feel proponents of generative AI appear to play the "info wants to be free" card as well.)


You do realize that open weights exist? Proprietary models suck, yes, but they can be distilled.


You're attacking me personally when I was agreeing with the contents of the comment in an objective manner.


Nitpick: this is not open weights, this is weights-available. The license restricts many things, like commercial and NSFW use.


I mean, this started with Stable Diffusion 1.x->XL, which were only loosely open, and it has only gotten worse, with progressively less-open-licensed image-gen models being described as “open weights”. But yes, Flux.1 Krea (like the weights-available versions of Flux.1 from BFL itself) is not open even to the degree of the older versions of Stable Diffusion; weights-available and “free-as-in-beer licensed for certain uses”, sure, but not open.


Alright we've made the title not say open, in the hope of routing around this objection.


Then you deserve all the censorship that comes your way. Treat others like you wish to be treated.


I don’t think censorship of nearly any kind has any place on the internet, but neither do kids.

It’s a parent’s responsibility to keep their children away from that type of content, not to hand them access to it so they can develop malign, destructive ideas about sex, intimacy, and women.


Buying NSFW things.


I think the further down the supply chain of "NSFW things" you go the more closely it resembles "human misery trafficking".


I don't think they were referring to live-action "NSFW things" which is traditionally called "pornography."


The same applies to drug manufacturing, so far as I'm aware.


Exactly


This is literally the original use case for cryptocurrency.


Unfortunately that part never became practicable, and the only use cases that gained traction are speculation, fraud, extortion, and dark web marketplaces.

