To answer your question seriously, no. Having pictures or videos of a 15-year-old's breasts is not by default porn; it's only porn once it's in a sexual context.
It’s porn if the state wants to harass you for something.
Not just the state. Due process does not exist here; the accusation of sexual misconduct is often enough to destroy someone's life. So it's anyone who chooses to accuse someone of something.
So… if I look at it: "Oh nice, I know which store she walks by" = pic is safe. "Oh nice, I'm going to jerk off to it" = pic is porn?
Then one pic can be porn for one person and safe for another. I must be missing something.
Intent
Was the picture taken with sexual intent?
A security camera taking a video of a topless 15-year-old: the intent was not to produce a pornographic image.
You, as the security guard, saving the picture for later… your intent is to make it sexual.
Now, if you saved the picture because you needed to figure out which store the topless 15-year-old was by, your intent is not sexual, so that would be fine.
You make sense, and I agree, but until kids are no longer jailed or put on sex offender lists for innocent sexting, I'm not going to assume US police officers and the US justice system are as reasoned as you are.
Does “in a sexual context” have a falsifiable definition? I mean, here’s an article from the American Bar Association journal about a man who was arrested and deported for “child sexual abuse” for having photos printed of himself kissing his infant daughter after a bath.
The accuser was the photo lab tech. After the man was arrested and deported, his wife was arrested and their child was removed from them. Then the investigation took the whole roll of film into consideration and found there was no child abuse; this was photographic evidence of loving parents caring for their child. "Sexual context" indeed.
I envision a future where a business owner puts up security cameras around his shop, and those security cameras send their video to "The Cloud." An underage girl walks by topless in view of those cameras, as is her legal right to do. A closed-source, unauditable CSAM detection algorithm running on "The Cloud" flags the video as CSAM, and the business owner gets arrested, his business and/or home destroyed, or he's even killed, before an investigation determines no wrongdoing. Because we put the time for reasonableness after the bodies have cooled. We check whether it's a false positive after the "arrest" has been made. THAT's the ultimate problem I have here.
I mean, that's a lot of effort to go through when they could just as easily plant CSAM on someone if they wanted to. I understand the paranoia people have, but there are easier ways to fuck someone's life up.