User_159114
I dislike AI art because it is too derivative. The style is uniform and there is no soul to it.
Look, I get this is a free-speech-forward type of forum. And I get that a cartoon of My Little Pony doing the nasty with Solid Snake is probably ethically safer than real porn. So what I'm going to say isn't gonna be super popular.
But AI art is bad.
a) It is produced using images of non-consenting people.
b) It is known that AI training sets include images of real-life dark content (not fake stuff; actual dark shit collected from seedy-ass corners of the web by the auto-crawler).
c) It is known that AI will occasionally get lazy and reproduce images from its dataset basically exactly as they are.
Now I get that the people here are masturbation addicts whose brains have been ruined by porn, probably with extra-dark fetishes that will never see the light of day. But I like to think that most of us don't want to hurt real people, and are more or less respectable human beings, aside from our brains being soaked in hentai for too long.
So respectfully, don't look at AI porn. And if it were up to me, it wouldn't be a thing, especially for known celebrities and darker content.
The data set used to train the original Stable Diffusion models (up to 1.5) used a list of URLs that were not filtered for content. The URL list was generated by scraping the internet and published as a "here's a bunch of images" thing, and nobody had verified that the contents were safe. The SD team just picked it up and used it for training. When the issue was discovered, everyone building image-generation AIs freaked out and switched to filtered data sets. SD 2 and all later image-generation models were not trained on the problematic content.
Thanks for that. I'm only starting to follow all of this stuff now, so I'm kinda late to the show.
a) So non-consent, including pirated games, songs, and movies, is a no-go for you?
b) Is it though? Please provide references and I will read them. Not arguing here; it's just not known to me.
c) Agreed. It's not magic, it kinda sucks, and it's kinda the pinnacle of human creation, minus actual babies, which are magic...
The "questionable" content was no worse than things you'd find in an average year of browsing the internet; it was really blown out of proportion by media and politicians to garner attention. They claimed that if it can generate images of things like rape it MUST contain them in its data set (which is untrue). The whole point of advanced AI is that it can extrapolate from the data set; earlier versions would simply mimic and slightly modify by joining elements of images (but we have come a long way, baby).
The day AI stops using artists' work for training, I will be okay with it. For now, I can't.
Think of the datasets as knowledge acquired by observing; it's not just AI that learns this way, humans do as well. Humans literally go to school, read books, even watch movies to learn from others and expand our own dataset. Unless somehow everyone agrees to stop (that didn't work for nukes either), it will be as Thanos said.
I don't agree with your opening statement, and the rest of it is nothing more than personal opinion that leads nowhere but to argument.
It's more than that.
I don't like the IRL-looking AI, but I have been surprised by My New Girlfriend and Shadows of Ambition (I think they're by the same creator). Notice I say creator and not artist; that's intentional, as I don't believe they are artists (if anyone is an artist in this, it is the AI itself).
Instead of the abstract, I would suggest reading this: [link visible to registered users]
Yes, but for humans, laws exist against plagiarism, and humans have a real capacity for imagination. AI lacks regulation, and there is no punishment for non-ethical people who work with AI.
Violating copyright with AI is just as illegal as violating it via any other means. Copyright law makes no distinction based on the method of copying.