Resolves yes if the US federal government makes it broadly illegal to share sexually explicit images generated by AI without the subject's consent, before the end of 2025.
I'm not a lawyer and don't have a rigid definition of what kind of ban would count, but this will resolve based on the common sense spirit of the question.
I will not trade in this market.
I hope this doesn't happen. Nobody is actually being harmed, and given that the government hasn't even charged anyone involved with the Genesis scam, they aren't going to charge some random guy in Kansas for sharing a picture.
Plus, by the time this actually matters, a real nude image will have no value, because the person pictured in it will just be able to claim it is AI-generated. AI will cause a reversion to the pre-Internet era, when it was rare for people to face consequences for images like this. But if someone criminalizes sharing fake images, that works against the plausible deniability argument.
"I hope this doesn't happen. Nobody is actually being harmed"
How would you feel if someone you found repulsive shared, with your friends, family, co-workers, and current/prospective romantic partner(s), convincing deepfake images/videos of you doing an act you found disgusting?
You're missing the point here. It's not about whether I would be bothered by this particular thing, though I would not be.
We cannot just pass more laws and then expect problems to go away. If you pass a regulation without hiring additional enforcement officers to match, you have to stop policing other crimes.
And so, regardless of how bad anyone feels about video bytes streaming across the Internet, enforcing this is not worth arresting fewer financial criminals, who have actual permanent impacts on the lives of the people they steal from. Financial crimes are out of control - the number of these Indian cryptocurrency scams is unfathomable, and we see how companies like SMCI destroy billions in investor value with accounting fraud.
And I already pay 52% of my earnings in taxes. What more can the government possibly take? I don't even have a 9-5 job now because, among other reasons, I don't see the point in working one at this tax rate.
So, no, I am not bothered by pornographic videos, whether of me or anyone else, and definitely not enough to agree that we should raise taxes or reassign cops away from stopping people who steal money and toward stopping people who create data.
@SteveSokolowski If there were enough cops and less bureaucratic inefficiency, do you agree that making this illegal would be better than not?
@ShakedKoplewitz I think most scenarios in which this resolves yes would involve a new law, but I could imagine other things, like the Supreme Court making a ruling about this or the president issuing some sort of executive action, so I don't want to say it 100% has to be a new law. But it probably does; I don't think those other scenarios are very likely.
Do you mean a law that covers AI-generated images specifically?
Because it can already be illegal in some contexts: publishing them while implying they are real might be considered libel, and using them commercially would likely infringe on personality rights.
Do you mean making it a federal crime enforced by FBI or other federal agencies?
If a federal judge finds it illegal under existing, more general laws, does this resolve yes?
Or do you need Congress to pass a specific law for that?
Note: AFAIK there is already at least one case being investigated by the FBI under existing, more general federal laws, but no judgment or sentence yet (?).