• 7112@lemmy.world

    I agree that AI work should not have copyright protection. Even with human intervention, it still collects data, without express permission, from numerous sources.

    This will actually protect smaller artists. It will prevent giant companies from profiting from their work without credit or payment.

    • thehatfox@lemmy.worldOP

      > I agree that AI work should not have copyright protection. Even with human intervention, it still collects data, without express permission, from numerous sources.

      Generative AI models could be trained only on public-domain and royalty-free images. Should the output of those models be eligible for copyright, while the output of models that also used unlicensed training data is not?

      It seems there are two separate arguments being conflated in this debate. One is whether using copyrighted works as AI training data is fair use. The other is whether creative workers should be protected from displacement by AI.

      • FaceDeer@kbin.social

        “Royalty free” is not the same as public domain; most “royalty free” images still need to be licensed for particular uses and come with other restrictions. The only thing “royalty free” means is that the copyright owner doesn’t demand a cut of each sale of whatever you used the image in.

    • Peanut@sopuli.xyz

      So we kill open-source models, while proprietary models like Adobe’s are fine, so they can be the only resource and continue rent-seeking while independent artists eat dirt.

      Whether or not a model learned from my art is probably not going to affect me in any way, shape, or form, unless I’m worried about being used as a prompt so people can use me as a compass while directing their new image aesthetic. Disney/Warner could already hire someone to do that 100% legally, so it’s only the other peasants I’m worried about. I don’t think the peasants are the problem when it comes to the well-being and support of artists.

      • 7112@lemmy.world

        I believe a person can still sell or market art that is AI-created. I just believe they shouldn’t have total ownership of the work.

        Already, most creators don’t fret over fan art or fan fiction, so there is wiggle room for fair use. It’s a lot like the game modding scene, where modders usually use pre-existing assets or code to create something new.

        Let people play but not own AI work for now.

        • FaceDeer@kbin.social

          If I take a copy of the Mona Lisa and draw a luxurious moustache on it, I now own the copyright to that moustache-bedecked Italian’s image. Sure, the original image is still public domain, and if someone were to crop the moustache out of my version, the bit they’d be left with would be free and clear of my copyright. But if I use an AI to generate an image and then do the same thing to it, how would you even know which bit to crop? And what value would there be in the “leftovers”? Might as well just use your own AI to generate what you need.

          I think a lot of AI-hating artists feel that if AI-generated art is declared uncopyrightable they’d “win” somehow. I don’t think they’ll see the results they’re expecting, if that comes to pass.

          • 7112@lemmy.world

            It seems we need to let this all run longer and see what happens. Currently we have no real way to detect AI in media besides disclosures and silly mistakes like 20 fingers, and both rely on the creator (it’s not hard to edit a photo to clean up those hands, etc.).

            I think a lot of creatives are struggling so they just feel shut out of the conversation. Copyright is probably the one thing most people can understand as a talking point.

            I think we still have some time before we see which way will work. Ideally we could always augment the laws… but yeah, America and stuff.

    • HubertManne@kbin.social

      I agree. With their example, I’m not sure photos should. The exact photo, okay, but someone making another thing based on it getting sued? That’s BS. The photo was an event that happened. I’m OK with them having rights to the photo, but not to what the photo shows.

    • Fubarberry@sopuli.xyz
      9 months ago

      Big companies like Adobe and Google can get the rights to use material to train their models. If stricter laws get passed it will only slightly inconvenience the larger companies, but might completely destroy any of the smaller companies or open-source versions available.

      The anti-AI lawsuits aren’t going to stop AI art and the like, just determine whether it’s completely controlled by the current tech giants or not.

      • 7112@lemmy.world
        9 months ago

        Sadly, no matter what, the big media companies are going to have a huge advantage in everything because of decades of lobbying, etc.

        I think people should still be able to profit from selling the image itself; however, I don’t think we have enough knowledge of how AI will truly impact things. If it becomes a minor fad and is just a tool to help speed up a process, I think the law doesn’t need to change much.

        If AI becomes the majority creator on projects then we have to have this conversation about who owns what.

        Closed models will probably be the future, much like stock photos, and people will have to pay to access the models.

        In the end big business will always fuck us over, copyright or not.