• 7112@lemmy.world · 9 months ago

    I agree that AI work should not have copyright protection. Even with human intervention, it still collects data from numerous sources without express permission.

    This will actually protect smaller artists. It will prevent giant companies from profiting from their work without credit or payment.

    • thehatfox@lemmy.world (OP) · 9 months ago

      > I agree that AI work should not have copyright protection. Even with human intervention, it still collects data from numerous sources without express permission.

      Generative AI models could be trained using only public domain and royalty-free images. Should the output of those be eligible for copyright, but not the output of models that also had unlicensed training data?

      It seems there are two separate arguments being conflated in this debate. One is whether using copyrighted works as AI training data is fair use. The other is whether creative workers should be protected from displacement by AI.

      • FaceDeer@kbin.social · 9 months ago

        “Royalty free” is not the same as public domain; most royalty-free images still need to be licensed for particular uses and come with other restrictions. All “royalty free” means is that the copyright owner doesn’t demand a cut of each sale of whatever you used the image in.

    • SkySyrup@sh.itjust.works · 9 months ago

      But that won’t happen. Companies have money and, by extension, lobbyists. It doesn’t matter what the general consensus is; they will get their way.

    • Peanut@sopuli.xyz · 9 months ago

      So we kill open-source models while proprietary ones like Adobe’s are fine, leaving them the only resource, free to keep rent-seeking while independent artists eat dirt.

      Whether or not the model learned from my art is probably not going to affect me in any way, shape, or form, unless I’m worried about being used as a prompt so people can use my name as a compass while directing their new image’s aesthetic. Disney/Warner could already hire someone to do that 100% legally, so it’s just the other peasants I’m worried about. I don’t think the peasants are the problem when it comes to the well-being and support of artists.

      • 7112@lemmy.world · 9 months ago

        I believe a person can still sell or market art that is AI-created; I just believe they shouldn’t have total ownership of the work.

        Already, most creators don’t fret over fan art or fan fiction, so there is wiggle room for fair use. It’s a lot like the game-modding scene, where modders usually use pre-existing assets or code to create something new.

        Let people play with, but not own, AI work for now.

        • FaceDeer@kbin.social · 9 months ago

          If I take a copy of the Mona Lisa and draw a luxurious moustache on it, I now own the copyright to that moustache-bedecked Italian’s image. Sure, the original image is still public domain, and if someone were to crop the moustache out of my version, the bit they’d be left with would be free and clear of my copyright. But if I use an AI to generate an image and then do the same thing to it, how would you even know which bit to crop? And what value would there be in the “leftovers”? Might as well just use your own AI to generate what you need.

          I think a lot of AI-hating artists feel that if AI-generated art is declared uncopyrightable they’d “win” somehow. I don’t think they’ll see the results they’re expecting, if that comes to pass.

          • 7112@lemmy.world · 9 months ago

            It seems we just need to let this all run longer and see what happens. Currently we have no real way to detect AI in media besides disclosures and silly mistakes like 20 fingers, and all of that relies on the creator (it’s not hard to edit a photo to clean up those hands, etc.).

            I think a lot of creatives are struggling, so they feel shut out of the conversation. Copyright is probably the one talking point most people can understand.

            I think we still have some time before we see which way this will go. Ideally we could always amend the laws… but yeah, America and stuff.

      • sorrybookbroke@sh.itjust.works · 9 months ago

        OK, that’s entirely disingenuous. You can make an open-source AI model where you get real permission from artists, instead of taking their work without permission and using it to replace them.

        It’s entirely possible. Will the large orgs have more resources to collect art? No shit, yeah, and they’ll have better PCs to train on too. No matter what, allowed to take from hard-working small artists while attempting to make them irrelevant or not, the big corps will always have a head start here.

        Unless, like so many other projects, an extremely complicated system benefits from collaborative work in an open environment. Shocking, I know: working on merit over money.

        You don’t want a conversation, though; you just want “epic dunks”.

        • lolcatnip@reddthat.com · 9 months ago

          The amount of training data needed for a model is so huge that you’d have to use only artwork that was preemptively licensed for that purpose. Individually asking artists for permission to use their work would be far too expensive even if they all agreed to let you use their work for free.

          • BURN@lemmy.world · 9 months ago

            And why is that a problem?

            Artists should have control over their work. It’s not a question of a big company versus a small company stealing my work; I don’t want either of them to.

            I no longer post anything I create online, because I’d rather nobody see it than have it stolen for AI training.

          • sorrybookbroke@sh.itjust.works · 9 months ago

            That is correct, though there could be campaigns to collect art another way. There are plenty of artists in the open-source world who could contribute, and asking individuals to signal-boost these calls to action can get more push. Once more, no matter what, big corps will always have more monetary resources; the power of open source is volunteer manpower and passion. Even if that weren’t the case, the moral argument against using a person’s work to replace them without permission still stands.

            Regardless of that, what this will do, if art is not protected, is cause stagnation in the field. Nobody is going to share their art, their methods, or their ideas freely in a world where doing so lets a massive corp take them without permission and use them to replace the artist. This kills open distribution of art and art knowledge; it will become hidden, and new ideas and new art will no longer be available to view.

            Allowing people to take without permission will only ever hurt the small artists. Disney will always be able to just “take” any art they make.

            Also, you’re not entirely correct on that: models made for specific purposes don’t actually need the absurd amounts of data generalist models need. In the context of current expectations, though, yeah, you’re right on quantity.

            • FaceDeer@kbin.social · 9 months ago

              Massive corps don’t need to use the output of “little artists”; they have their own massive repositories of works they own or license that they can train AIs on.

              The small artists won’t be able to use those AIs, though. Those AIs will belong to Disney or Getty Images, and if they deign to allow others to use them it’ll be through paywalls and filters and onerous licensing terms. The small artists would only be able to use open models freely.

              This insistence on AIs being prohibited from learning from otherwise-public images is going to be a Pyrrhic victory if it ever comes to pass.

              • sorrybookbroke@sh.itjust.works · 9 months ago

                Why do they do it now, then? They do need this: they need absurd amounts of tagged images of varying quality and style. No, their own repositories are nowhere near enough for general models; they require the small artists. Many artists, small or large, will simply refuse to license to Disney, too.

                Allowing them to take from the smaller artists does not help the situation either. They simply get more data, which they can run through their better-equipped systems faster than anyone else. This helps the big corps while doing little for us small devs.

                On the matter of these being “otherwise public images”: can you not see this destroying that large public repository of information? No new work by people with unique ideas will be made public. Why would it be? If they share it, Disney and Getty Images can out-compete them with it. The currently massive resource of images, information, and art in general will become hidden, no longer public. That stagnates art where it is now. Only work that people are OK with AI taking will be shared, because it will be taken either way. We get the same outcome regardless, save for what has already been shared; the only difference is that nobody gets to enjoy the art whose creators don’t want it training AI.

                • FaceDeer@kbin.social · 9 months ago

                  It’s convenient to be able to use whatever publicly available images you want for training, but it’s not necessary. Adobe proved this with their Firefly AI.

                  • sorrybookbroke@sh.itjust.works · edited · 9 months ago

                    Their text-to-image is nowhere near the abilities of other tools, and the rest are specialized tools.

                    It’s convenient, yes, but without it these models are much more limited.

                    Even if it were, my other points, which you’ve ignored, still stand.

              • BURN@lemmy.world · 9 months ago

                It’ll be a massive victory for artists and a failure for all the sham AI prompt generators.

                There’s not a single downside to requiring all material used in training to be licensed.

                • FaceDeer@kbin.social · 9 months ago

                  > There’s not a single downside to requiring all material used in training to be licensed.

                  It destroys the open source/hobbyist sector. The only AIs that would be available for artists to use would be corporate-controlled, paywalled, and filtered. That’s a pretty huge downside.

                    • BURN@lemmy.world · 9 months ago

                    That’s not my problem.

                    Art is not generated by machines. Nothing of value is lost.

    • HubertManne@kbin.social · edited · 9 months ago

      I agree. With their example, I’m not sure photos should either. The exact photo, OK, but someone got sued for making another thing based on it? That’s BS; the photo was an event that happened. I’m OK with them having rights to the photo, but not to what the photo shows.

    • Fubarberry@sopuli.xyz · 9 months ago

      Big companies like Adobe and Google can get the rights to use material to train their models. If stricter laws get passed, they will only slightly inconvenience the larger companies, but they might completely destroy the smaller companies or the open-source versions available.

      The anti-AI lawsuits aren’t going to stop AI art, etc.; they’ll just determine whether it’s completely controlled by the current tech giants or not.

      • 7112@lemmy.world · 9 months ago

        Sadly, no matter what, the big media companies are going to have a huge advantage in everything, because of decades of lobbying, etc.

        I think people should still be able to profit from selling the images themselves. However, I don’t think we have enough knowledge yet of how AI will truly impact things. If it becomes a minor fad, just a tool to help speed up a process, the law doesn’t need to change much.

        If AI becomes the majority creator on projects then we have to have this conversation about who owns what.

        Closed models will probably be the future, much like stock photos, and people will have to pay to access them.

        In the end big business will always fuck us over, copyright or not.