
The really interesting question to me is if this transcends copyright and unravels the whole concept of intellectual property. Because all of it is premised on an assumption that creativity is "hard". But LLMs are not just writing software, they are rapidly being engineered to operate completely generally as knowledge creation engines: solving math proofs, designing drugs, etc.

So: once it's not "hard" any more, does IP even make sense at all? Why grant monopoly rights to something that required little to no investment in the first place? Even with vestigial IP law - let's say, patents: it just becomes an input parameter, and the AI works around the patents like any other constraint.



> So: once it's not "hard" any more, does IP even make sense at all? Why grant monopoly rights to something that required little to no investment in the first place? Even with vestigial IP law - let's say, patents: it just becomes an input parameter, and the AI works around the patents like any other constraint.

I think it still does: IIRC, the current legal situation is that AI output does not qualify for IP protections (at least not without substantial later human modification). IP protections are reserved solely for human work.

And I'm fine with that: if a person put in the work, they should have protections so their stuff can't be ripped off for free by all the wealthy major corporations that find some use for it. Otherwise: who cares about the LLMs.


I think you have a rather idealized model of IP in mind. In practice, IP law tends to be an expensive weapon the wealthy major corporations use against the little guy. Deep enough pockets and a big enough warchest of broad patents will drain the little guy every time.


> In practice, IP law tends to be an expensive weapon the wealthy major corporations use against the little guy. Deep enough pockets and a big enough warchest of broad patents will drain the little guy every time.

Then fix that instead of blowing it up. Because IP law is also literally the only thing that protects the little guy's work in many cases.

Arguments like yours are kinda unfathomably incomplete to me, almost like they're the remnants of some propaganda campaign. It's constructed to appeal to the defense of the little guy, but the actual effect would be to disempower him and further empower the wealthy major corporations with "big enough warchest[s]."

I mean, one thing I think the RIAA would love is to stop paying royalties to every artist ever. And the only thing they'd be worried about is an even bigger fish (like Amazon, Apple, or Spotify) no longer paying royalties to them. But as you said, they have a big enough war chest that they probably could force a deal somehow. All the artists without a war chest? Left out in the cold.


It's not at all obvious whether copyright net protects or destroys the little guy.

It definitely does some of both, and we have no obvious measure or counterfactual to know otherwise.

You also have to take into account not just if optimal reform or optimal dismantle is better, but the realistic likelihood of each, and the risk of the bad outcomes from each.

Protecting even more conceptual product ideas seems pretty likely to result in more of a tool for big guys only. It's patents on crack, and patents are already nearly exclusively a "big guy crushes small guy" tool, whereas copyright is at least debatably mixed.


> It's not at all obvious whether copyright net protects or destroys the little guy.

It's super obvious, unless your perspective basically stems from someone who was mad they couldn't BitTorrent a ton of movies.

I mean, FFS, copyright is the literal foundation for open source licenses like the GPL.

My sense is a lot of the radically anti-IP fervor ultimately stems from people who were outraged they could be sued for seeding an MP3 (though it's accreted other complaints to justify that initial impulse, and it's likely some were indoctrinated by secondary argumentation somewhat removed from the core impulse).

That's not to say that there are not actors who abuse IP or there aren't meaningful reforms that could be done, but the "burn it all down" impulse is not thought through.


GPL was created as a workaround for copyright - it wouldn’t have been needed if there wasn’t copyright. There are complex arguments both for and against copyright and there’s no reason to simply assume it must always be just as now even as circumstances change.


It is ad hominem to say that people who see it differently are just petty criminals.

Yes, it was a genius move that copyleft used copyright to achieve its goal - the name literally reflects the judo going on in that case. But copyleft licenses also have a lot of benefits for big companies too, so it's not strictly a David vs Goliath victory.

I don't think it's a commonly held belief that copyright benefits small YouTube creators more than it hurts them, to take a concrete example: they seem to live in constant fear of being destroyed in an asymmetrical system where copyright can take away their livelihood at any moment while doing nothing to meaningfully protect it.


Blowing up IP would sink the RIAA. They would no longer have legal grounds to go after file sharing, and I’m confident that given the same legal footing that file sharing would win any day of the week.


What if a person puts in the work, but the work was worthless or can be trivially reproduced without effort?

See also: https://en.wikipedia.org/wiki/Sweat_of_the_brow


You mean like when I take a photo?


A photo is easy to take but hard to reproduce.


As is randomly splattering paint on a canvas, even with no artistic vision or skill.


Does this matter in practice though? By modifying some of the generated code and not taking a solution produced by an LLM end-to-end but borrowing heavily from it, can't a human claim full ownership of the IP even though in reality the LLM did most of the relevant work?


I think as long as the human puts in substantial and transformational effort, they can claim to be the copyright holder of the entire work, yes.


Compare taking snapshots with a camera.

Because some photographer somewhere can claim to have put in a lot of effort, we all get IP protection for photographs by default.


In the US it isn't the sweat of the brow, but rather a minimal threshold of human creativity.

https://en.wikipedia.org/wiki/Sweat_of_the_brow

https://en.wikipedia.org/wiki/Copyright_law_of_the_United_St...


Oh, good point. It's not related at all to effort then. Either way it still has to be a human.


Software is considered a complete piece of work. Therefore as long as you modify a single character - that whole product is under your copyright.



Yes, I didn't include the monkey in my 'we'.


> AI-output does not qualify for IP protections

I beg to differ. AI output has not entitled the person creating the prompt to IP protections, so far - but my objection is not directed towards the "so far", but towards your omission of "the person creating the prompt", because if an AI outputs copyrighted material from the training data, that material is still copyrighted. AI is not a magical copyright removal machine.


The U.S. Supreme Court just declined to hear a case, thus upholding a lower court precedent that LLM output is not copyrightable: https://www.reuters.com/legal/government/us-supreme-court-de...

What this means in practice is that (currently), all output of an LLM is legally considered to not be copyrightable (to the extent that it's an original work). If it happens to regurgitate an existing copyrighted work, though, is that infringement? I'm not sure we have a legal precedent on that question yet.


The Thaler case here is something different than "AI-generated = uncopyrightable" though. Thaler was not trying to copyright work in the way humans who make work with tools normally copyright their work ("Copyright 2026 by Me"), he was specifically trying to give AI the copyright ("Copyright 2026 by My-AI-Tool"). The court rejected this because only humans can own copyright.

I believe there are other cases where AI-generated works were found uncopyrightable, but Thaler is not a good example of them.


There are several large settlements that suggest Anthropic/OAI didn't want to set legal precedent. In general, if it's not outright regurgitated it would be derivative.


The out of court settlements that avoid precedent don't mean anything in a broader legal context. Legally speaking, right now in the USA, output of LLMs is not copyrighted and cannot be copyrighted (without substantial transformation by a human).

I don't think this means the same thing as whether or not LLM output can infringe on someone else's copyright though (that does pose an interesting question -- can something non-copyrightable in general infringe on something copyrighted?).


Of course. I cannot claim copyright on a poem that I have memorized as a child and written down as an adult. The original author can, though.


I don't believe you're required to do much to claim copyright over the output of an LLM. The input prompt is under copyright, and a simple modification to the generated source code will grant copyright to you.


I'm afraid as of last week this is now as settled as it gets in US law: the output of LLMs is not per se copyrightable, though arrangements and modifications of it can be. It's like a producer who made a song entirely with public domain audio samples: he can't then demand the compulsory license when someone resamples that song.


They actually wouldn't, since they'd be sampling the new arrangement. They could reconstruct a new, similar sounding arrangement based on the original samples, but it'd have to be different enough from that new arrangement so as not to be considered derivative of it.

That also applies to generative AI: pure output may not be copyrightable, but as soon as you do something beyond typing some words and pressing a button - like doing area-specific infills and paintovers, which involve direct and deliberate choices by a human - the copyrighted human-driven arrangement becomes so deeply intertwined with the generative work that it's effectively inseparable.


> operate completely generally as knowledge creation engines: solving math proofs, designing drugs, etc.

Any example of that? So far I haven't seen any but maybe I'm looking at the wrong places.

I've seen a lot of:

- "solving" math proofs that were properly formalized, with often numerous documented past attempts, re-verified by proper mathematicians, without necessarily any interesting results

- no actually designed drugs; the most I've seen was (again with entire teams of experts behind it) finding slight optimizations

Basically all outputs I've seen so far have both followed existing trends (basically low-hanging fruit without any paradigm shift) and never stood alone, serving instead as search supports for teams of world-class experts. None of these would qualify IMHO as knowledge creation. Whenever such results were published, the publication seemed mostly to be promotion of the workflow itself more than the actual results. DeepMind seems to be the prime example of that.

PS: for the epistemological distinction you can see a few past comments of mine (e.g. https://news.ycombinator.com/item?id=47011884 )


Good. Intellectual property is now a twisted concept by the elite, whatever its benefits were previously. As soon as Disney made Mickey popular, it was all downhill.


Copyright is about originality and expression, not effort. US copyright law does not use "Sweat of the Brow" doctrine.


The labor theory of value is bunk economics anyway.


More likely: this is a transitional phase where our previously hard problems become easy, and we will soon set our sights on new and much harder problems. The pinnacle of creative achievement in the universe is probably not 2010s B2B SaaS.

It is entirely possible, however, that human beings will not be the primary drivers of progress on those problems.


Finally, a perspective that looks beyond the buggy whips! As for your last comment, it depends on what you mean by the primary drivers. Figurative crank turners, maybe not. Creativity and insight, don’t count us out just yet.


> if this transcends copyright and unravels the whole concept of intellectual property.

I have been saying this for years. Intellectual property is based on the concept that ideas can be owned, which is fundamentally a contradiction with how reality operates. We've been able to write laws that paper over that contradiction by introducing concepts like "fair use", but it doesn't resolve it.

AI is just making the conflict arising out of that contradiction more intense in new ways and forcing us to reckon with it in this new technological landscape. You can follow two perfectly reasonable lines of logic and end up with contradictory solutions. So how are we going to get out of this mess? I don't know, not without rolling back (at least parts of) what intellectual property is in the first place.


Nothing changes for drug patents regardless of whether an LLM was used in the discovery process.


Not sure why this should be true; the US Supreme Court recently chose to let precedent stand that AI creations are not copyrightable. https://www.reuters.com/legal/government/us-supreme-court-de...

That also seems relevant for this whole discussion, actually -- if a work can't be copyrighted it certainly can't have a changed license, or any license at all. (I guess it's effectively public domain to the extent that it's public at all?)


You're really missing the point in multiple ways. First, precedents on copyright law are irrelevant to patent law. Second, AI generated works generally can be copyrighted under the human creator's name.


No, I think you are quite incorrect, at least on the latter point:

"Lower courts upheld a U.S. Copyright Office decision that the AI-crafted visual art at issue in the case was ineligible for copyright protection because it did not have a human creator."

Not eligible for copyright protection does not mean it can be copyrighted "under the human creator's name". It means there is no creative work at all. No copyright.


And while courts in theory aren't supposed to apply copyright precedent to patent cases, in practice, they apparently do a lot of the time, so it's kind of a mess! https://scholarship.kentlaw.iit.edu/ckjip/vol16/iss1/4/#:~:t...


No, you're still missing the point. Did you even read the court's opinion?


No, just news articles. Perhaps the news media have misrepresented the outcome here.


Even if all I have to do is tell my agent, "here is a patent for a drug, analyse the patent and determine an equivalent but non-infringing drug" and it chugs away for a couple of hours and spits out a drug along with all the specifications to manufacture it?

I guess the state of play will be that for new drugs the original manufacturer will already have done that and ensured that literally anything that could be found as a workaround is included in the scope of the patent. But I feel like it will not be possible to keep that watertight.


Yes, even so. Human drug researchers have been doing the same thing for decades. As soon as one pharmaceutical company launches a successful small-molecule drug everyone else jumps to find a minor tweak that will hit the same target (ideally with fewer side effects) while evading the patent. There is already specialized software to help with this process so I'm skeptical that LLM agents would be very helpful for this use case.


The formula is what is patented, not the process to come up with it.


At some level, IP makes sense — creators should be rewarded. But IP only benefits those who claim it. The benefits rarely flow back to humanity who made it all possible. Every LLM was trained on humanity's collective knowledge. The value was created collectively, then captured privately.

That's the reason I like the idea of DUKI/dju:ki/ — Decentralized Universal Kindness Income, similar to UBI but driven by voluntary kindness and sincere marketing rather than taxation. If AI makes creation trivially easy and IP loses its justification, the question becomes: how do we ensure a tiny part of the wealth generated flows back to everyone?


This is similar:

https://www.vice.com/en/article/musicians-algorithmically-ge...

Two musicians generated every possible melody within an octave and published them as Creative Commons Zero.

I never heard about this again though.
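The brute-force idea behind that project is just enumerating every sequence over a fixed pitch set. Here is a minimal sketch; the scale, melody length, and MIDI note numbers are illustrative assumptions, not the project's actual parameters (which covered a much larger pitch/rhythm space):

```python
from itertools import product

def all_melodies(pitches, length):
    """Yield every possible melody as a tuple of pitches.

    There are exactly len(pitches) ** length such melodies, which is why
    the full enumeration explodes quickly as length grows.
    """
    return product(pitches, repeat=length)

# One octave of the C major scale as MIDI note numbers (an assumption
# for illustration only).
scale = [60, 62, 64, 65, 67, 69, 71, 72]

# All melodies of length 3 over this 8-note scale: 8**3 = 512 of them.
count = sum(1 for _ in all_melodies(scale, 3))
```

Even at this toy size the combinatorics are visible: adding one note to the melody length multiplies the count by eight.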


The point of IP is to encourage the creation of new things.

Not all protections have to be ones that give total control like copyright.

I think it's a mistaken assumption that costs will fall to zero. The low hanging fruit will get picked, and then we'll be doing expensive combined AI/wetlab search for new drugs.

If there is any meaningful headroom we will keep doing expensive things to make progress.


> The point of IP is to encourage the creation of new things.

Then why are corporations allowed to milk successful works for all eternity? Why do we have Disney monopolizing films made half a century ago? Why do we have Nintendo selling people the exact same Mario ROMs from the 80s every single console generation?

They should have like 10 years of copyright so they can turn a profit. Once it expires it's over and the work enters the public domain where it belongs. If they want to keep profiting they should have to keep creating new things. They shouldn't be able to turn shared culture into eternal intellectual property portfolios that they monopolize and then sit on like dragons.


There is always drift between intent and implementation, but to be generous here, Disney is generally making new works with their IP and so is Nintendo.

I am somewhat curious what you think shortening the copyright window would do that's so great for the culture though. We already have more than enough IP slop that's just licensed.


> Disney is generally making new works with their IP and so is Nintendo

Let them profit from those new works then. All the works from the last century belong in the public domain.


From what I understand, LLMs can't really generate anything meaningful that doesn't implicitly rely on the operator's choices. It's hard to make the right novel choices as soon as you leave well-defined problem spaces.

In terms of math and biochemistry the cost of generating candidates has collapsed, but the cost of validating them hasn't.


It might unravel intellectual property, just not in a fair way. When capitalism started, public land was enclosed to create private property. Despite this being in many cases a quite unfair process, we still respect this arrangement.

With AI, a similar process is happening - publicly available information becomes enclosed by the model owners. We will probably get a "vestigial" intellectual property in the form of model ownership, and everyone will pay a rent to use it. In fact, companies might start to gatekeep all the information to only their own LLM flavor, which you will be required to use to get to the information. For example, product documentation and datasheets will be only available by talking to their AI.


Copyright doesn’t depend on the “sweat of the brow”. See Feist v. Rural Telephone Co. (1991).

Also copyright can protect something normally not eligible when the author chooses what information to include and exclude


The basis of your argument is that AI-generated work isn't hard, but your conclusion is that ALL work, AI-generated or not, should lose IP rights?


There's different kinds of intellectual property.

Copyright might rest on 'creativity is hard'. But patents and trademarks do not.


Trademarks don't, patents do. Different kind of creativity but still.


"What happens when an LLM outputs a patented algorithm?" remains a huge land mine out there, particularly since patent infringement does not require intent or even knowledge, and these models have trained on every patent ever granted.


If you can prove that your LLM did not learn from the patent (eg cut-off for learning was before), then the LLM outputting the algorithm (or product etc) would be pretty good evidence that a practitioner of ordinary competence in the field, or whatever the exact legal wording is, found the whole invention to be trivial.

It doesn't work when that happens with humans, so it absolutely wouldn't work when it happens with a machine.

Patents do to a small extent, maybe. But eg medical patents are a lot about protecting all the 'sweat' you put in, not so much the creativity.


You don't need a realization to receive a patent, just the drawing.


I'm not quite so sure for medical patents.

I mean, why are patent trolls not getting patents for all compounds under the sun for all conceivable medical uses?


If you think about creative outcomes as n-dimensional 'volumes', AI expressions can cover more than humans can in many domains - precisely artistic styles, music styles, etc. And tbh not everyone can be a Mozart, but with AI a lot more people may be a Mozart lite. This begs the question of how much of creativity is appreciated as a shared experience.


Intellectual property never made any sense to begin with. It is logically reducible to ownership of numbers. It is that absurd. Computers made the entire concept irrelevant the second they were invented but they kept holding on via lobbying power. Maybe AI will finally put the final nail on the coffin of intellectual property.

Sure, it's disgusting and hypocritical how these corporations enshrined all this nonsense into law only to then ignore it all the second LLMs were invented. It's ultimately a good thing though. The model weights are all that matters. All we need to do is wait for the models to hit diminishing returns, then somehow find a way to leak them so that everyone has access. If they refuse, then just force them. By law or by revolution.


"Hard" or "easy" has never been part of the premise.

A company spends a decade and billions of dollars to develop a groundbreaking drug and patents it.

I think of a cool new character called "Mr Poop" and publish a short story about him with an hour of work.

Both of us get the exact same protection under the law (yes yes I know copyright vs patent etc., but ultimately they are all about IP protection).


Creativity is still hard. AI-generated content is called "slop" for a reason ;-)


I've always thought the opposite: IP law was created to make sure creativity stays hard, and hence controllable by the elites.

Patents came along when farmers started making city goods, threatening guild secrets. Copyright came when the printing press made copying and translating the Bible easy and accessible to all. (Trademark admittedly does not fit this view, but doesn't seem all that damaging either.)

To Protect The Arts, and To Time Limit Trade Secrets were just the Protect The Children of old times, a way to confuse people who didn't look too hard at actual consequences.

This means that the future of IP depends on what lets the powers that be pull up the ladder behind them. Long term I'd expect e.g. copyright expansion and harder enforcement, just because cloning by AI gets easy enough to threaten the status quo.


> Trademark admittedly does not fit this view, but doesn't seem all that damaging either

Isn’t trademark the only thing keeping a certain cartoon mouse out of the public domain, despite the fact that his earliest animations are out of copyright? Not sure if you’d consider that damaging, or if anyone has yet tested the boundaries of the House of Mouse’s patience here.


:/ before copyright you just had patrons, which looks a lot more like the rich controlling what art gets made than what we have today


Don't worry. The courts have consistently sided with huge companies on copyright. In the US. In Europe. Doesn't matter.

Company incorporates GPL code in their product? Never once have courts decided to uphold copyright. HP did that many times. Microsoft got caught doing it. And yet the GPL was never applied to their products. Every time there was an excuse. An inconsistent excuse.

Schoolkid downloads a movie? 30,000 USD per infraction PLUS armed police officer goes in and enforces removal of any movies.

Or take the very subject here. AI training WAS NOT considered fair use when OpenAI violated copyright to train. Same with Anthropic, Google, Microsoft, ... They incorporated Harry Potter and the Linux kernel in ChatGPT, in the model itself. Undeniable. Literally. So even if you accept that it's changed now, OpenAI should still be forced to redistribute the training set, code, and everything needed to run the model for everything they did up to 2020. Needless to say ... courts refused to apply that.

So just apply "the law", right. Courts' judgement of using AI to "remove GPL"? Approved. Using AI to "make the next Disney-style movie"? SEND IN THE ARMY! Whether one or the other violates the law according to rational people? Whatever excuse to avoid that discussion is good enough.



