Even if you could enforce this somehow, other countries will not. Unlike copyright and patent law for consumer products and content, getting the upper hand in the AI race could have huge implications down the line. So the only government that would enforce this is one that has no chance of competing in this space in the first place (the EU).
Let’s be honest - this is an argument that “the ends justify the means.” But that kind of reasoning should make all of us uneasy. Where do we draw the line? If we eliminated a third of the world’s population to stop global warming, would the noble goal make it acceptable? Clearly not.
We can’t ignore the ethical cost of how AI is being developed - especially when it relies on taking other people’s work without permission. Many of today’s most powerful AI systems were trained on vast datasets filled with human-made content: art, writing, music, code, and more. Much of it was used without consent, credit, or compensation. This isn’t conjecture - it’s been thoroughly documented.
That approach isn’t just legally murky - it’s ethically indefensible. We cannot build the future on a foundation of stolen labor and creativity. Artists, writers, musicians, and other creators deserve both recognition and fair compensation. No matter how impactful the tools become, we cannot accept theft as a business model.
> So the only government that would enforce this is the one that has no chance of competing in this space in the first place (EU)
Mistral waves hello. They're alive and well, and competing successfully.
Also, while the AI Act and copyright are handled at the EU level, I always get the impression that anyone talking about an "EU government" simply doesn't understand the EU. If you think Germans or Slovaks are rooting for Mistral just because they're European, you'd be wrong - they'd be more accepting of it, maybe, due to higher trust in them respecting privacy and related rights, but that's it.
For starters, we only really care about the companies developing big commercial AI products, not the people running said models on their home PCs or anything along those lines.
If a company starts offering a new AI model commercially, you simply send someone to audit it and make sure they can provide proof of consent, have their input data, etc.
In most cases, this should be enough. If there's reason to believe an AI company is actually straight up lying to the authorities, you simply have them re-train their model in a controlled environment.
Oh and no, you don't need cryptographically secure random numbers for AI training and/or operation, so you can easily just save your random seeds along with the input data for perfectly reproducible results.
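The seed-saving idea above can be sketched in a few lines. This is a toy illustration, not any real framework's API: a stand-in "training run" whose only nondeterminism is a PRNG, so replaying it with the saved seed and the same input data yields a bit-identical result. (Real GPU training adds nondeterminism from parallelism, which frameworks address with dedicated deterministic modes.)

```python
import hashlib
import random

def toy_training_run(data, seed):
    """Stand-in for a training job: all randomness flows from one saved seed."""
    rng = random.Random(seed)                           # seeded local PRNG
    weights = [rng.uniform(-1, 1) for _ in range(4)]    # "random init"
    order = data[:]
    rng.shuffle(order)                                  # "data shuffling"
    # Pretend the final "model" is just a digest of init + data order.
    return hashlib.sha256(repr((weights, order)).encode()).hexdigest()

data = ["sample-a", "sample-b", "sample-c"]
seed = 1234                       # archived alongside the input data

run1 = toy_training_run(data, seed)
run2 = toy_training_run(data, seed)   # auditor replays with the saved seed
assert run1 == run2                   # bit-identical reproduction
```

The point is only that nothing in training requires *unpredictable* randomness; as long as every random draw is derived from a recorded seed, an auditor can re-run the job and verify the output matches.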
This isn't an enforcement problem, it's a lobbying problem. Lawmakers are convinced that AI will solve their problems for them, when the reality is that it's still mostly speculation that someone will, at some point, find a way to make it profitable.
In reality, training and even running AI is still way too expensive for the companies selling it, even without considering the long-term economic impact of the harmful ways models are trained (artists contribute to GDP directly, open source projects do so indirectly, and free services like Wikipedia are an important part of modern society; AI is imposing massive costs on all of these).
>If a company starts offering a new AI model commercially, you simply send someone to audit it and make sure they can provide proof of consent, have their input data, etc.
Good luck getting China to agree to this. You'd just be handicapping your own AI development in comparison to China's.
I wouldn't count on the US anymore, considering today's political climate. But in theory, the EU and US could probably make a very compelling argument to China: if all three agree to play nice, nobody gains an advantage, while everyone benefits from slower technological development leaving more time to figure out the societal problems.
Ultimately, we random people on the internet can't say whether China would want that or could be convinced with other concessions unrelated to AI, but what we can say for sure is that if China has the will to chill, the West has the negotiating power to match it.
AI poisoning might be the answer, but it needs a business case: some sort of SaaS that artists can pay to process their content in a way that floods and poisons the crawlers.