AI can compose novel-looking things from components it has been trained on - it can't imagine new concepts. If CSAM is being generated, it's because it was included in its training set, which is strongly suspected since we know the commonly used corpus had CSAM in it: https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse
If it has images of construction equipment and houses, it can make images of houses that look like construction equipment. Swap out vocabulary as needed.
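To make the recombination claim concrete, here's a minimal sketch of how you could test it yourself with an off-the-shelf text-to-image model (the checkpoint and prompt below are just illustrative assumptions, not anyone's actual setup):

```python
# Minimal sketch: ask a text-to-image model for a hybrid of two concepts
# it has almost certainly seen separately but not together.
# Assumes the `diffusers` and `torch` packages and a public Stable Diffusion
# checkpoint; swap in whatever model you actually have access to.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint name
    torch_dtype=torch.float16,
).to("cuda")

# Both components are common in training data; the combination is not.
prompt = "a suburban house shaped like a yellow excavator, photorealistic"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("house_excavator_hybrid.png")
```

Whether the result reads as a convincing hybrid is exactly the kind of thing you have to eyeball; the point is only that the prompt combines concepts the model was trained on separately.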
Cool, how would it know what a naked young person looks like? Naked adults look significantly different.
It understands young and old.
Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how they perceive the world and word relations.
It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.
We know it understands these sorts of things because of the very things this whole kerfuffle is about - it’s able to generate images of things that weren’t explicitly in its training set.
But it doesn’t fully understand young, and “naked young person” isn’t just a scaled-down “naked adult”. There are physiological changes that people go through during puberty, which is why “It understands young vs. old” is a vapid, low-effort comment. Yours has more meaning behind it, so I’d clarify that just having a vague understanding of young and old doesn’t mean it can generate CSAM.
Do you actually know that, or are you just assuming it?
Personally, I’m basing my assertions on experience with related situations, where I’ve asked image AIs to generate images of things that I’m quite sure weren’t in their training sets and that require conceptual understanding to create “hybrids.” They’ve done a decent job of those, so I’m assuming they can figure out this specific situation as well, since most of these models have plenty of examples of naked people and of young people in their training sets. But I haven’t actually asked any AIs to generate images of naked young people to test this one specific case.
My opinion here is that “naked young person” isn’t as simple as other compound concepts, because there are physiological changes we go through during puberty that an AI can’t reverse-engineer. Something like “Italian samurai” involves concepts that compose at a surface level it can easily understand, while “naked young person” involves components that can’t be derived simply by applying “young” to “naked person” or “naked” to “young person”.
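For what “surface level” composition roughly means here: the text encoders these models rely on place a compound prompt close to its component concepts in embedding space. A minimal sketch with CLIP (the model name and prompts are assumptions for illustration, and this only probes the text side, not image generation):

```python
# Sketch: check how a compound prompt relates to its components in CLIP's
# text-embedding space. Assumes the `transformers` and `torch` packages
# and the public "openai/clip-vit-base-patch32" checkpoint.
import torch
from transformers import CLIPModel, CLIPTokenizer

name = "openai/clip-vit-base-patch32"
model = CLIPModel.from_pretrained(name)
tokenizer = CLIPTokenizer.from_pretrained(name)

prompts = ["an Italian samurai", "an Italian man", "a samurai"]
inputs = tokenizer(prompts, padding=True, return_tensors="pt")
with torch.no_grad():
    emb = model.get_text_features(**inputs)
emb = emb / emb.norm(dim=-1, keepdim=True)  # normalise for cosine similarity

# Cosine similarity of the compound prompt to each of its components.
print("'Italian samurai' vs 'an Italian man':", float(emb[0] @ emb[1]))
print("'Italian samurai' vs 'a samurai':", float(emb[0] @ emb[2]))
```

High similarity to both components is consistent with the compound being a surface-level blend; it says nothing about whether the image model can fill in details it never saw in training.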
Someone did have a valid counterargument in this subthread, though: https://sh.itjust.works/comment/11713795