See this article, and yours:
Should we really be surprised?
www.creativebloq.com
These two articles are a great example of why the courts will never rule that AI companies must use only copyright-cleared content in training and stop the great replacement.
In the article, their definition of "ethical" has nothing to do with asking creators for permission, or even paying them. Shutterstock and Getty, both mentioned in these articles, built their own AI models from their contributors' content. These are the same companies that artists think might win the legal battle for them, but they're also the ones you'll have to fight if you want to stop copyrighted material from being used for training.
Look at what Adobe is quoted as saying in the article above:
Hayward also noted that Adobe Stock contributors who submitted AI-generated imagery would qualify for Adobe's 'Firefly bonus', which it paid to contributors whose content was used to train the first public version of the AI model.
So when companies like Midjourney and OpenAI build massive training datasets by scraping the internet for everything they can get their hands on, it's "theft" and "unethical". But when Shutterstock, Adobe, and Getty Images do the same thing to their own content creators, that's fine?
But don't worry, if Adobe used your images to train their AI, they'll throw a few dollars at you and call it a "bonus". That will make up for losing your entire industry. We know it can't possibly be more than a few dollars because otherwise they'd have bankrupted themselves.
It's sad that artists (and in-denial composer-producers) think they'll be fought for in court, when really these companies have zero interest in fighting for them and are already actively betraying them. They care about THEMSELVES being replaced by AI, not you! They only cared that someone ELSE "stole" your work to train their AI. Like Adobe, Universal Music will do the same thing: talk about how much this harms artists on the one hand, then casually train their own AI on all your work, call themselves the ethical ones, and expect you to be grateful. Here, have $20.
It should be even more insulting because, unlike the AI companies, these companies are the ones claiming it's theft and bad for artists, yet they'll go ahead and do the same thing to their own creators anyway and act like it's somehow different.
Let's say you have music published with Universal Music. How would you feel if Universal Music made an AI like Udio (only better, because it's the future), trained it on every track in their catalogue, and you suddenly found you'd been paid a hundred bucks or so for all your work being used in the training?
The best outcome for these companies is that players like OpenAI end up licensing content from them. That's why they're so casual about building their own models: they know the current legal cases aren't going to make THEIR models unlawful.
The sooner creators give up on the fantasy that they can stop this, the better off they'll be. If you worked really hard and were really successful, you might marginally slow down a company or two for maybe six months. By the time these cases even set a legal precedent, open source will have already made it impossible to go back, even if you wanted to.