caption
a screenshot of the text:
Tech companies argued in comments on the website that the way their models ingested creative content was innovative and legal. The venture capital firm Andreessen Horowitz, which has several investments in A.I. start-ups, warned in its comments that any slowdown for A.I. companies in consuming content “would upset at least a decade’s worth of investment-backed expectations that were premised on the current understanding of the scope of copyright protection in this country.”
underneath the screenshot is the “Oh no! Anyway” meme, featuring two pictures of Jeremy Clarkson saying “Oh no!” and “Anyway”
The screenshot (copied from this Mastodon post) is of a paragraph from the NYT article “The Sleepy Copyright Office in the Middle of a High-Stakes Clash Over A.I.”
We need copyright reform. Life of author plus 70 for everything is just nuts.
This is not an AI problem. This is a companies literally owning our culture problem.
We do need copyright reform, but also fuck “AI.” I couldn’t care less about them infringing on proprietary works, but they’re also infringing on copyleft works and for that they deserve to be shut the fuck down.
Either that, or all the output of their “AI” needs to be copyleft.
Not just the output. One could construct the argument that training your model on GPL content, which would have it create GPL content, means that the model itself is now also GPL.
It’s why my company calls the GPL parasitic: use it once and it’s everywhere.
This is something I consider to be one of the main benefits of this license.
So if I read a copyleft text or some copyleft code once, then because I understood and learned from it, any text I write in the future also has to be copyleft?
HOLY SHIT!
Doctor here, I’m sorry to inform you that you have a case of parasitic copyleftiosis. Your brain is copyleft, your body is copyleft, and even your future children will be copyleft.
GPL. Not even once!
Yes, now gimme that brain of yours. My comment was GPL too.
It already is. God you uninformed people are insufferable.
It already is.
If you mean that the output of AI is already copyleft, then sure, I completely agree! What I meant when I wrote that we “need” it is legal acknowledgement of that factual reality.
The companies running these services certainly don’t seem to think so, however, so they need to be disabused of their misconception.
I apologize if that was unclear. (Not sure the vitriol was necessary, but whatever.)
There have already been cases decided… That’s enough
Going one step deeper: at the source, it’s oligarchy and companies owning the law and, in consequence, also its enforcement.
If this is what it takes to get copyright reform, just granting tech companies unlimited power to hoover up whatever they want and put it in their models, it’s not going to be the egalitarian sort of copyright reform that we need. Instead, we will just get a carve-out for this one case, which is ridiculous.
There are small creators who do need at least some sort of copyright control, because ultimately people should be paid for the work they do. Artists who work on commission are the people in the direct firing line of generative AI, both in commissions and in their day jobs. This will harm them more than any particular company. I don’t think models will suffer if they can only include works in the public domain, if the public domain starts in 2003, but that’s not the kind of copyright protection that Amazon, Google, Facebook, etc. want, and that’s not what they’re going to ask for.
Piracy / stealing content: OK for big corps.
Piracy / stealing content: punishable by life in prison for us proletarians.
This is simply not stealing. Viewing content has never ever ever been stealing.
There is no view right.
They are downloading the data so their LLM can “view” it. How is that different from downloading movies to view them?
They’re not downloading anything tho. That’s the point. At no point are they possessing the content that the AI is viewing.
This is LESS intrusive than a Google web scraper. No one is trying to sue Google for copyright infringement over Google searches.
What? Of course they are downloading it; the content still has to reach their networks and computers.
Go look up how AI works. There is no download lol. It’s the exact same principle as web scrapers, which have been around for literally decades.
Tech-illiterate guy here. All these ML models require training data, right? So all these AI companies that develop new ML-based chat/video/image apps require data. So where exactly do they get it? It can’t be that their entire dataset is licensed, can it?
If so, are there any firms that are suing these orgs for data theft? How do you know if a model has been trained on your data? Sorry if this is not the right place to ask.
You know how you look at a pic on the internet and don’t pay? The AI is basically doing the same thing, only it’s collecting the effect of the data points (like pixels in a picture) more accurately. The input, no matter what it is, only moves a set of weights. That’s all. It does not copy anything it is trained on.
Yes, it can reproduce any work with some level of accuracy, just like a painter or musician could replay a piece they see or hear.
Again, this is no more theft than you hearing a song or viewing a selfie.
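To make the “only moves a set of weights” claim concrete, here’s a minimal sketch in plain Python (a made-up two-parameter model with toy numbers, not anyone’s actual training code): each example only nudges the weights a little via gradient descent, and once training is done, only the weights are left.

```python
# Toy illustration of "the input only moves a set of weights".
# The "model" is just two numbers (w, b); each example nudges them
# via gradient descent and is not stored anywhere afterwards.

examples = [(0.1, 0.3), (0.5, 1.1), (0.9, 1.9)]  # (input value, target value)

w, b = 0.0, 0.0
learning_rate = 0.1

for _ in range(1000):                    # repeated passes over the data
    for x, y in examples:
        error = (w * x + b) - y          # how wrong the model is on this example
        w -= learning_rate * error * x   # the example shifts the weights a little...
        b -= learning_rate * error       # ...and that is all that remains of it

print(w, b)  # roughly w ≈ 2.0, b ≈ 0.1 — only the weights are kept, not the examples
```

Whether a billion-parameter model can nonetheless memorize and regurgitate its training data is a separate question, and that is what the rest of the thread argues about.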
I make this exact argument all the time and it gets ignored. I think people fundamentally don’t understand the tech and can’t conceptualize that AI models train the same way we get ideas and schemas from our own observations.
People even deny that AI can “learn”, insisting that it just copies and manipulates data…
“only it’s collecting the effect of the data points (like pixels in a picture) more accurately”
Isn’t that the entire point of creativity, though? What separates an artist from a bad painter is the positioning of pixels on a 2-dimensional plane? If the model collects the positions of pixels together with the pixel RGB (color? don’t know the technical term for it), then the model is effectively stealing the “pixel configuration and makeup” of that artist, which could be reproduced by said model anywhere if similar prompts were passed to it?
Focus. We are talking about copyright. Copyright doesn’t cover this at all.
You could say piracy is just running a program that “views” the content and then regurgitates its own interpretation of it into local data stores.
It’s just not very creative, so it’s usually very close.
You could say that, but you’d be wrong. Downloading is a bitwise copy. Training isn’t even close to the same thing.
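For what it’s worth, the distinction being drawn here can be sketched like so (illustrative Python only, with a toy stand-in for a weight update, not how any real pipeline works): a download ends up byte-for-byte identical to the source, while training leaves behind a handful of parameters that the original bytes can’t simply be read back out of.

```python
import hashlib

original = b"Some copyrighted text from a web page."

# "Downloading": the result is a byte-for-byte copy of the source.
downloaded = bytes(original)
assert hashlib.sha256(downloaded).digest() == hashlib.sha256(original).digest()

# "Training" (toy stand-in for a weight update): each byte only nudges a few numbers.
weights = [0.0] * 4
for i, byte in enumerate(original):
    weights[i % len(weights)] += (byte / 255.0 - 0.5) * 0.01

print(weights)  # four floats derived from the text, not a copy of it
```

(Whether the data still gets copied onto the trainer’s servers along the way, as the commenter above argues, is exactly the point of contention.)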
That’s one thing, but I think regurgitating it and claiming it as your own is a completely different thing.
Also, I’m pretty sure the argument is more about the unequal enforcement of the law. Copyright should be either enforced fairly or not at all. If AI is allowed to scrape content and regurgitate it, piracy should also be legal.
Again, that’s not what’s happening here.
I’m gonna play them a song on the world’s smallest violin.
And I’m gonna put this down as my Lucky 10,000.
That’s what would be called “a swing and a miss”
It’s almost like speculating has risks
Bu-bu-but you didn’t think of my investors!
“You can’t just decide we were wrong about IP, that would make us broke!”
Either this kills large AI models (at least commercial ones), or it kills some copyright BS in some way. Whatever happens, society wins.
The second option could also hurt small creators, though.
I fear this is a giant power grab. What this will lead to is that IP holders, those that own the content AI needs to train on, will dictate prices. So all the social media content you kindly gave Reddit, Facebook, Twitter, the pictures, all that stuff, means you won’t be able to have any free AI software.
No free / open-source AI software means there is a massive power imbalance, because now only those who can afford to buy this training data can do it, and they are forced to maximize profits (and are naturally inclined to anyway).
Basically they will own the “means of generation” while we won’t.
Current large models would all be sued to death; there are no licenses with IP owners yet, so this would kill all existing commercial large models. Unless every IP owner is named and licenses are granted retroactively, but that sounds unlikely.
Hundreds of IP-owning companies and billions of individual IP owners setting prices will probably behave like streaming: price increases and endless fragmentation. You need a license from every IP owner, so the paperwork will be extremely massive. Licenses might change or expire, the same problem as streaming, but every time a license expires you need to retrain the entire model (or you infringe, because the model keeps using the data).
And in the EU you have the right to be forgotten, so people can ask to be excluded from models (because in this case it’s not transformative enough; IANAL, but it sounds like it counts as storing), so every time someone wants to be excluded, you retrain the entire model.
I don’t see how it’s possible to create a large model like this with any amount of money, time, or electricity. Maybe some smaller models. Maybe just more specific ones for one task.
Also, piracy exists and doesn’t care about copyright; people will just train anyway and maybe even open source the result (torrent it). They might get caught, might not; it might become a dark market, idk. It will exist though, like deepfakes.
Yeah, those are the myriad complications this will cause. People are worried about AI, and I am too, but we need smart regulation, not IP laws that only increase the power of the ultra-rich. Because if AI continues to exist, that will severely distort and limit the market to a few very specific, powerful entities. And that is almost certainly going to be worse than leaving it completely unregulated.
I know plenty of small creators who urge me to pirate their content.
Because all they want is people to enjoy their content, and piracy helps spread their art.
So even small creators are against copyright.
I mean, I won’t deny the small bit of skill it took to construct a plausible-sounding explanation for why the public should support your investment, because it’s “not illegal (yet)”.
“technically this thievery isn’t covered by law”
"technically this what?”
"OBJECTION!”
You stole my post by looking at it. Pay me.
That’s the point about money: if you have enough, you can simply sue or bribe in order not to lose money.
“In other news, the new Dacia Sandero looks fabulous.”
“warned”
Or what? I want to see that bluster called.
Or their victims will realize they got scammed.
Can someone rephrase it for me? I’ve read it twice and I really don’t get it.
As Robert Evans put it: “If we can’t steal every book ever written, we’ll go broke!”
A scammer made unreasonable promises to investors and is now warning everyone that their victims/investors are going to lose money when the process of making fair laws takes the typical amount of time that it always takes.
Let them fight!
A’ight. Time to self-host the entirety of the internet on a server and do machine learning on the content I stored. :)