While the motives may be noble (regulating surveillance), it might happen that models like Stable Diffusion get caught in the regulatory crossfire to the point where using the original models becomes illegal and new models get castrated until they are useless. Further, this might make it impossible for individuals or smaller startups to train open-source models (maybe even LoRAs). Adobe and the large corporations would rule the market.
Yes, but if they make it contraband, then the open sharing will stop. If they tell the general public it is used to create illegal porn, then it will be seen as nothing else. It still feels like sorcery to me sometimes; these mundies will certainly call it witchcraft and sharpen their pitchforks. If possession is made illegal, then very few people will dare to share it. Being labeled a pirate at least carries some honor; being labeled a pervert is something else.
My point is: I think it is not enough that we have the models and code; it has to become part of many more people’s jobs now, so that they actively want to protect it and lobby for it.
SD is an amazing tool for basic photo editing, restoration, indie games, and simply art. But most non-techie artists I talk to either cry about stolen copyright or cling to some artisan fantasy, claiming real art must be created through diligence. (And then they tell me a week later how they just looove Adobe’s Firefly and how magical it is.)
One solution is to train it on public-domain images; then a lot of volunteers could help it improve by rating the outputs, and those ratings could feed back into the model via reinforcement learning.
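A minimal sketch of what that volunteer-feedback loop could look like, assuming PyTorch and entirely hypothetical names (VolunteerRatings, RewardModel): a small reward model is trained on rated images, and its score could then be used to filter, re-rank, or fine-tune the generator. This is just an illustration of the idea, not any project's actual pipeline.

```python
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

class VolunteerRatings(Dataset):
    """Hypothetical dataset of (image tensor, rating in [0, 1]) pairs collected from volunteers."""
    def __init__(self, images, ratings):
        self.images = images    # e.g. list of 3x256x256 float tensors
        self.ratings = ratings  # matching list of scalar scores in [0, 1]

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx], torch.tensor(self.ratings[idx], dtype=torch.float32)

class RewardModel(nn.Module):
    """Tiny CNN that predicts a volunteer rating for an image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):
        # Squash to [0, 1] so the output is comparable to the ratings.
        return torch.sigmoid(self.head(self.features(x))).squeeze(-1)

def train_reward_model(dataset, epochs=5, lr=1e-4):
    """Fit the reward model to volunteer ratings with a simple regression loss."""
    model = RewardModel()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, ratings in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), ratings)
            loss.backward()
            opt.step()
    return model
```

The resulting reward model's score can serve as the reinforcement signal (or simply as a filter for training data) when improving the generator; the actual fine-tuning step would depend on the method chosen and is not shown here.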