Greg Rutkowski, a digital artist known for his epic fantasy style, opposes AI art, but his name and style have frequently been used by AI art generators without his consent. In response, the makers of Stable Diffusion removed his work from the training dataset for version 2.0. However, the community has now created a tool to emulate Rutkowski’s style against his wishes using a LoRA model. While some argue this is unethical, others justify it since Rutkowski’s art has already been widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.

  • doeknius_gloek@feddit.de · 109 points · 1 year ago

    While some argue this is unethical, others justify it since Rutkowski’s art has already been widely used in Stable Diffusion 1.5.

    What kind of argument is that supposed to be? We’ve stolen his art before so it’s fine? Dickheads. This whole AI thing is already sketchy enough, at least respect the artists that explicitly want their art to be excluded.

    • FaceDeer@kbin.social · 26 points · 1 year ago

      His art was not “stolen.” That’s not an accurate word to describe this process with.

      It’s not so much that “it was done before so it’s fine now” as “it’s a well-understood part of many people’s workflows” that can be used to justify it. As well as the view that there was nothing wrong with doing it the first time, so what’s wrong with doing it a second time?

      • Pulse@dormi.zone · 34 points · 1 year ago

        Yes, it was.

        One human artist can, over a lifetime, learn from a few artists to inform their style.

        These AI setups are taking ALL the art from ALL the artists and using it as part of a for-profit business.

        There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to then try and remix it for profit.

        • FaceDeer@kbin.social · 23 points · 1 year ago

          No, it wasn’t. Theft is a well-defined word. When you steal something you take it away from them so that they don’t have it any more.

          It wasn’t even a case of copyright violation, because no copies of any of Rutkowski’s art were made. The model does not contain a copy of any of the training data (with an asterisk for the case of overfitting, which is very rare and which trainers do their best to avoid). The art it produces in Rutkowski’s style is also not a copyright violation because you can’t copyright a style.

          There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to then try and remix it for profit.

          So how about the open-source models? Or in this specific instance, the guy who made a LoRA for mimicking Rutkowski’s style, since he did it free of charge and released it for anyone to use?

          • Pulse@dormi.zone · 28 points · 1 year ago

            Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

            If I go into a Ford plant, take pictures of their equipment, then use those to make my own machines, it’s still IP theft, even if I didn’t walk out with the machine.

            Make all the excuses you want, you’re supporting the theft of other people’s life’s work then trying to claim it’s ethical.

            • FaceDeer@kbin.social · 17 points · 1 year ago

              Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

              They were put on the Internet for that very purpose. When you visit a website and view an image there a copy of it is made in your computer’s memory. If that’s a copyright violation then everyone’s equally boned. When you click this link you’re doing exactly the same thing.

              • Pulse@dormi.zone · 11 points · 1 year ago

                By that logic I can sell anything I download from the web while also claiming credit for it, right?

                Downloading to view != downloading to fuel my business.

                • FaceDeer@kbin.social · 15 points · 1 year ago

                  No, and that’s such a ridiculous leap of logic that I can’t come up with anything else to say except no. Just no. What gave you that idea?

                • Amju Wolf@pawb.social · 9 points · 1 year ago

                  No, but you can download Rutkowski’s art, learn from it how to paint in his exact style and create art in that style.

                  Which is exactly what the image generation AIs do. They’re perhaps just a bit too good at it, certainly way better than an average human.

                  Which makes it complicated and morally questionable depending on how exactly you arrive at the model and what you do with it, but you can’t definitively say it’s copyright infringement.

              • TwilightVulpine@kbin.social · 9 points · 1 year ago

                Here is where a rhetorical sleight of hand is used by AI proponents.

                It’s displayed for people’s appreciation. AI is not people, it is a tool. It’s not entitled to the same rights as people, and the model it creates based on artists’ works is itself a derivative work.

                Even among AI proponents, few believe that the AI itself is an autonomous being who ought to have rights over their own artworks, least of all the AI creators.

                • FaceDeer@kbin.social · 6 points · 1 year ago

                  I use tools such as web browsers to view art. AI is a tool too. There’s no sleight of hand, AI doesn’t have to be an “autonomous being.” Training is just a mechanism for analyzing art. If I wrote a program that analyzed pictures to determine their predominant colour, that’d be much the same; there’d be no problem with me running it on every image I came across in a public gallery.
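
                  To be concrete, that kind of analysis is a few lines of throwaway code. A rough sketch in Python (illustrative only; the predominant_colour helper is made up for the example):

                  ```python
                  from collections import Counter
                  from PIL import Image  # pip install pillow

                  def predominant_colour(path: str) -> tuple:
                      """Return the most common pixel colour in an image."""
                      img = Image.open(path).convert("RGB").resize((64, 64))  # shrink for speed
                      counts = Counter(img.getdata())                         # tally every pixel's colour
                      return counts.most_common(1)[0][0]
                  ```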

              • M0RNlNGW00D@kbin.social · 8 points · 1 year ago

                For disclosure I am a former member of the American Photographic Artists/Advertising Photographers of America, and I have works registered at the United States Copyright Office.

                When we put works in our online portfolio, or send mailers or physical copies of our portfolios, we’re doing it as promotional works. There is no usage license attached to it. If it’s loaded into memory for personal viewing, that’s fine since it’s not a commercial application, nor does it violate the intent of that specific release: viewing for promotion.

                Let’s break down your example to help you understand what is actually going on. When we upload our works to third party galleries there is often a clause in the terms of service which states the artist uploading to the site grants a usage license for distribution and displaying of the image. Let’s look at Section 17 of ArtStation’s Terms of Service:

                17. License regarding Your Content

                Your Content may be shared with third parties, for example, on social media sites to promote Your Content on the Site, and may be available for purchase through the Marketplace. You hereby grant royalty-free, perpetual, world-wide, licenses (the “Licenses”) to Epic and our service providers to use, copy, modify, reformat and distribute Your Content, and to use the name that you provide in association with Your Content, in connection with providing the Services; and to Epic and our service providers, members, users and licensees to use, communicate, share, and display Your Content (in whole or in part) subject to our policies, as those policies are amended from time-to-time

                This is in conjunction with Section 16’s opening line:

                16. Ownership

                As between you and Epic, you will retain ownership of all original text, images, videos, messages, comments, ratings, reviews and other original content you provide on or through the Site, including Digital Products and descriptions of your Digital Products and Hard Products (collectively, “Your Content”), and all intellectual property rights in Your Content.

                So when I click your link, I’m not engaging in a copyright violation. I’m making use of ArtStation’s/Epic’s license to distribute the original artist’s works. When I save images from ArtStation that license does not transfer to me. Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to. Established law states that I hold onto the rights of my work and any usage depends on what I explicitly state and agree to; emphasis on explicitly because the law will respect my terms and compensation first, and your intentions second. For example, if a magazine uses my images for several months without a license, I can document the usage time frame, send them an invoice, and begin negotiating because their legal team will realize that without a license they have no footing.

                • Yes, this also applies to journalism as well. If you’ve agreed to let a news outlet use your works on a breaking story for credit/exposure, then you provided a license for fair compensation in the form of credit/exposure.

                I know this seems strange given how the internet freely transformed works for decades without repercussions. But as you know from sites like YouTube, copyright holders are not fans of people repurposing their works without mutually agreed-upon terms in the form of a license. If you remember the old show Mystery Science Theater 3000, they operated in the proper form: get license, transform work, commercialize. In the case of ArtStation, the site agrees to provide free hosting in compensation for the artist providing a license to distribute the work, without terms for monetization unless agreed upon through ArtStation’s marketplace. At every step, the artist’s rights to their work are respected and compensated when the law is applied.

                If all this makes sense and we look back at AI art, well…

                • FaceDeer@kbin.social · 7 points · 1 year ago

                  Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to.

                  Training an AI doesn’t “repurpose” that work, though. The AI learns concepts from it and then the work is discarded. No copyrighted part of the work remains in the AI’s model. All that verbiage doesn’t really apply to what’s being done with the images when an AI trains on them, they are no longer being “used” for anything at all after training is done. Just like when a human artist looks at some reference images and then creates his own original work based on what he’s learned from them.

            • ricecake@beehaw.org · 11 points · 1 year ago

              Copies that were freely shared for the purpose of letting anyone look at them.

              Do you think it’s copyright infringement to go to a website?

              Typically, ephemeral copies that aren’t kept for a substantial period of time aren’t considered copyright violations, otherwise viewing a website would be a copyright violation for every image appearing on that site.

              Downloading a freely published image to run an algorithm on it and then deleting it without distribution is basically the canonical example of ephemeral.

        • jarfil@beehaw.org · 19 points · 1 year ago

          One human artist can, over a lifetime, learn from a few artists to inform their style.

          These AI setups […] ALL the art from ALL the artists

          So humans are slow and inefficient, what’s new?

          First the machines replaced hand weavers, then ice sellers went bust, all the calculators got sacked, now it’s time for the artists.

          There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to then try and remix it for profit.

          We stand on the shoulders of generations of unethical stances.

      • Kara@kbin.social · 20 points · 1 year ago

        I don’t like when people say “AI just traces/photobashes art.” Because that simply isn’t what happens.

        But I do very much wish there was some sort of opt-out process, but ultimately any attempt at that just wouldn’t work

        • chemical_cutthroat@kbin.social · 5 points · 1 year ago

          People that say that have never used AI art generation apps and are only regurgitating what they hear from other people who are doing the same. The number of armchair AI denialists is astronomical.

        • ricecake@beehaw.org · 4 points · 1 year ago

          There’s nothing stopping someone from licensing their art in a fashion that prohibits its use for AI training.
          No one has created that license that I know of, but there are software licenses that do similar things, so it’s hardly an unprecedented notion.

          The fact of the matter is that, before, people didn’t think it was necessary to have specific usage licenses attached to art because no one got funny feelings from people creating derivative works from them.

          • TwilightVulpine@kbin.social · 5 points · 1 year ago

            Not at the point of generation, but at the point of training it was. One of the sticking points of AI for artists is that their developers didn’t even bother to seek permission. They simply said it was too much work and crawled artists’ galleries.

            Even publicly displayed art can only be used for certain previously-established purposes. By default you can’t use them for derivative works.

            • FaceDeer@kbin.social · 4 points · 1 year ago

              At the point of training it was viewing images that the artists had published in a public gallery. Nothing pirated at that point either. They don’t need “permission” to do that, the images are on display.

              Learning from art is one of the previously-established purposes you speak of. No “derivative work” is made when an AI trains a model, the model does not contain any copyrightable part of the imagery it is trained on.

              • TwilightVulpine@kbin.social · 4 points · 1 year ago

                Of course they need permission to process images. No computer system can merely “view” an image without at least creating a copy for temporary use, and the purposes for which that can be done are strictly defined. Doing whatever you want just because you have access to the image is often copyright infringement.

                People have the right to learn from images available publicly for personal viewing. AI is not yet people. Your whole argument relies on anthropomorphizing a tool, but it wouldn’t even be able to select images to train its model without human intervention, which is done with the intent to replicate the artist’s work.

                I’m not one to usually bat for copyright but the disregard AI proponents have for artists’ rights and their livelihood has gone long past what’s acceptable, like the article shows.

                • FaceDeer@kbin.social · 4 points · 1 year ago

                  If I run an image from the web through a program that generates a histogram of how bright its pixels are, am I suddenly a dirty pirate?

              • Kichae@kbin.social · 2 points · 1 year ago

                Being publicly viewable doesn’t make them public domain. Being able to see something doesn’t give you the right to use it for literally any other reason.

                Full stop.

                My gods, you’re such an insufferable bootlicking fanboy of bullshit code jockeys. Make a good faith effort to actually understand why people dislike these exploitative assholes who are looking to make a buck off of other people’s work for once, instead of just reflexively calling them all philistines who “just don’t understand”.

                Some of us work on machine learning systems for a living. We know what they are and how they work, and they’re fucking regurgitation machines. And people deserve to have control over whether we use their works in our regurgitation machines.

            • FaceDeer@kbin.social · 3 points · 1 year ago

              They were not used for derivative works. The AI’s model produced by the training does not contain any copyrighted material.

              If you click this link and view the images there then you are just as much a “pirate” as the AI trainers.

              • TwilightVulpine@kbin.social · 3 points · 1 year ago

                The models themselves are the derivative works. Those artists’ works were copied and processed to create that model. There is a difference between a person viewing a piece of work and feeding that work into a system to be processed. The way copyright is defined, being allowed to view a work is not the same as being allowed to use it in any way you see fit. It’s also inaccurate to speak of AIs as if they have the same abilities and rights as people.

          • zeus ⁧ ⁧ ∽↯∼@lemm.ee · 5 points · 1 year ago

            i’m not making a moral comment on anything, including piracy. i’m saying “but it’s part of my established workflow” is not an excuse for something morally wrong.

            only click here if you understand analogy and hyperbole

            if i say “i can’t write without kicking a few babies first”, it’s not an excuse to keep kicking babies. i just have to stop writing, or maybe find another workflow

            • FaceDeer@kbin.social · 4 points · 1 year ago

              The difference is that kicking babies is illegal whereas training and running an AI is not. Kind of a big difference.

                • FaceDeer@kbin.social · 4 points · 1 year ago

                  You’re using an analogy as the basis for an argument. That’s not what analogies are for. Analogies are useful explanatory tools, but only within a limited domain. Kicking a baby is not the same as creating an artwork, so there are areas in which they don’t map to each other.

                  You can’t dodge flaws in your argument by adding a “don’t respond unless you agree with me” clause on your comment.

          • Kichae@kbin.social · 1 point · 1 year ago

            His work was used in a publicly available product without license or compensation. Including his work in the training dataset was, to the online vernacular use of the word, piracy.

            They violated his copyright when they used his work to make their shit.

      • grue@lemmy.ml · 22 points · 1 year ago

        That’s true, but only in the sense that theft and copyright infringement are fundamentally different things.

        Generating stuff from ML training datasets that included works without permissive licenses is copyright infringement though, just as much as simply copying and pasting parts of those works in would be. The legal definition of a derivative work doesn’t care about the technological details.

        (For me, the most important consequence of this sort of argument is that everything produced by Github Copilot must be GPL.)

        • Rikudou_Sage@lemmings.world · 19 points · 1 year ago

          That’s incorrect in my opinion. AI learns patterns from its training data. So do humans, by the way. It’s not copy-pasting parts of images or code.

          • MJBrune@beehaw.org · 13 points · 1 year ago

            At the heart of copyright law is the intent. If an artist makes something, someone can’t just come along and copy it and resell it. The intent is so that artists can make a living from their innovation.

            AI training on copyrighted images and then reproducing works derived from those images, in order to compete with those images in the same style, breaks the intent of copyright law. Equally, it does not matter if the new picture is original. If you take an artist’s picture and recreate it with pixel art, there have already been cases where copyright infringement settlements have been made in favor of the original artist, despite the original picture not being used at all, just studied. See the Kind of Bloop cover art, a pixel-art recreation of the Miles Davis Kind of Blue cover photo.

            • grue@lemmy.ml · 5 points · 1 year ago

              You’re correct in your description of what a derivative work is, but this part is mistaken:

              The intent is so that artists can make a living from their innovation.

              The intent is “to promote the progress of science and the useful arts” so that, in the long run, the Public Domain is enriched with more works than would otherwise exist if no incentive were given. Allowing artists to make a living is nothing more than a means to that end.

              • MJBrune@beehaw.org · 6 points · 1 year ago

                It promotes progress by giving people the ability to make the works. If they can’t make a living off of making the works then they aren’t going to do it as a job. Thus yes, the intent is so that artists can make a living off of their work so that more artists have the ability to make the art. It’s really that simple. The intent is so that more people can do it. It’s not a means to the end, it’s the entire point of it. Otherwise, you’d just have hobbyists contributing.

                • whelmer@beehaw.org · 5 points · 1 year ago

                  I like what you’re saying so I’m not trying to be argumentative, but to be clear copyright protections don’t simply protect those who make a living from their productions. You are protected by them regardless of whether you intend to make any money off your work and that protection is automatic. Just to expand upon what @grue was saying.

          • grue@lemmy.ml · 8 points · 1 year ago

            By the same token, a human can easily be deemed to have infringed copyright even without cutting and pasting, if the result is excessively inspired by some other existing work.

          • Samus Crankpork@beehaw.org · 6 points · 1 year ago

            AI doesn’t “learn” anything, it’s not even intelligent. If you show a human artwork of a person they’ll be able to recognize that they’re looking at a human, how their limbs and expression work, what they’re wearing, the materials, how gravity should affect it all, etc. AI doesn’t and can’t know any of that, it just predicts how things should look based on images that have been put in its database. It’s a fancy Xerox.

            • Rikudou_Sage@lemmings.world · 3 points · 1 year ago

              Why do people who have no idea how some thing works feel the urge to comment on its working? It’s not just AI, it’s pretty much everything.

              AI does learn, that’s the whole shtick and that’s why it’s so good at stuff computers used to suck at. AI is pretty much just a buzzword, the correct abbreviation is ML which stands for Machine Learning - it’s even in the name.

              AI also recognizes it looks at a human! It can also recognize what they’re wearing, the material. AI is also better in many, many things than humans are. It also sucks compared to humans in many other things.

              No images are in its database, you fancy Xerox.

              • Samus Crankpork@beehaw.org · 4 points · 1 year ago

                And I wish that people who didn’t understand the need for the human element in creative endeavours would focus their energy on automating things that should be automated, like busywork, and dangerous jobs.

                If the prediction model actually “learned” anything, they wouldn’t have needed to add the artist’s work back after removing it. They had to, because it doesn’t learn anything, it copies the data it’s been fed.

                • Rikudou_Sage@lemmings.world · 2 points · 1 year ago

                  Just because you repeat the same thing over and over it doesn’t become truth. You should be the one to learn, before you talk. This conversation is over for me, I’m not paid to convince people who behave like children of how things they’re scared of work.

        • Otome-chan@kbin.social · 8 points · 1 year ago

          It’s actually not copyright infringement at all.

          Edit: and even if it was, copyright infringement is a moral right, it’s a good thing. copyright is theft.

          • MJBrune@beehaw.org · 13 points · 1 year ago

            It’s likely copyright infringement but that’s for the courts to decide, not you or me. Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep and I can agree with that, but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society, yes, I deserve to get paid for the work I did. If we lived in a better society that met the basic needs (or even complex needs) of every human then I can see copyright laws being useless.

            At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

            • grue@lemmy.ml · 9 points · 1 year ago

              Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep and I can agree with that, but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society, yes, I deserve to get paid for the work I did.

              No. And it’s not just me saying that; the folks who wrote the Copyright Clause (James Madison and Thomas Jefferson) would disagree with you, too.

              The natural state of a creative work is for it to be part of a Public Domain. Ideas are fundamentally different from property in the sense that property’s value comes from its exclusive use by its owner, whereas an idea’s value comes from spreading it, i.e., giving it away to others.

              Here’s how Jefferson described it:

              stable ownership is the gift of social law, and is given late in the progress of society. it would be curious then if an idea, the fugitive fermentation of an individual brain, could, of natural right, be claimed in exclusive and stable property. if nature has made any one thing less susceptible, than all others, of exclusive property, it is the action of the thinking power called an Idea; which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the reciever cannot dispossess himself of it. it’s peculiar character too is that no one possesses the less, because every other possesses the whole of it. he who recieves an idea from me, recieves instruction himself, without lessening mine; as he who lights his taper at mine, recieves light without darkening me. that ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benvolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point; and like the air in which we breathe, move, and have our physical being, incapable of confinement, or exclusive appropriation. inventions then cannot in nature be a subject of property. society may give an exclusive right to the profits arising from them as an encouragement to men to pursue ideas which may produce utility. but this may, or may not be done, according to the will and convenience of the society, without claim or complaint from any body.

              Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

              The true basis for copyright law in the United States is as a utilitarian incentive to encourage the creation of more works - a bounty for creating. Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress. Essentially, it’s a lease from the Public Domain, for the benefit of the Public. It is not an entitlement; what the creator of the work “deserves” doesn’t enter into it.

              And if the copyright holder abuses his privilege such that the Public no longer benefits enough to be worth it, it’s perfectly just and reasonable for the privilege to be revoked.

              At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

              This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antithesis of the “free market” upon which capitalism is supposedly based. Second, granting of monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

              • MJBrune@beehaw.org · 8 points · 1 year ago

                Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

                … You realize the reason it promotes progress is because it allows the creators to get paid for it, right? It’s not “they deserve it” it’s “they need to eat and thus they aren’t going to do it unless they make money.” Which is exactly my argument.

                Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress

                It’s a silly way to put that, since the power to grant that “privilege” is given to Congress in the Constitution.

                Overall though, you are referencing a 300-year-old document like it means something. The point comes down to people needing to eat in a capitalistic society.

                This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antithesis of the “free market” upon which capitalism is supposedly based.

                Capitalism isn’t really based on a free market and never has been in practice.

                Second, granting of monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

                Sure but first enact those changes then try to change or break copyright. Don’t take away the only current way for artists to make money then say “Well, the system should be different.” You are causing people to starve at that point.

          • grue@lemmy.ml · 2 points · 1 year ago

            Edit: …copyright infringement is a moral right, it’s a good thing. copyright is theft.

            Except when it’s being used to enforce copyleft.

      • Samus Crankpork@beehaw.org · 9 points · 1 year ago

        Aside from all the artists whose work was fed into the AI learning models without their permission. That art has been stolen, and is still being stolen. In this case very explicitly, because they outright removed his work, and then put it back when nobody was looking.

        • I_Has_A_Hat@lemmy.ml · 4 points · 1 year ago

          Let me give you a hypothetical that’s close to reality. Say an artist gets very popular, but doesn’t want their art used to teach AI. Let’s even say there’s legislation that prevents all of this artist’s work from being used in AI.

          Now what if someone else hires a bunch of cheap human artists to produce works in a style similar to the original artist, and then uses those works to feed the AI model? Would that still be stolen art? And if so, why? And if not, what is this extra degree of separation changing? The original artist is still not getting paid and the AI is still producing works based on their style.

          • Samus Crankpork@beehaw.org · 4 points · 1 year ago

            Comic book artists get in shit for tracing other people’s work all the time. Look up Greg Land. It’s shitty regardless of whether it’s a person doing it directly, or if someone built software to do it for them.

          • wizardbeard@lemmy.dbzer0.com · 2 points · 1 year ago

            Fine, you win the semantic argument about the use of the term “stealing”. Despite arguments about word choice, this is still a massively disrespectful and malicious action against the artist.

          • CallumWells@lemmy.ml · 2 points · 1 year ago

            Strictly speaking it wouldn’t exactly be stealing, but I would still consider it about equal to it, especially with regards to economic benefits. It may not be producing exact copies (which strictly speaking isn’t stealing, but is violating copyright) or actually stealing, but it’s exploiting a style that most people would assume means that specific artist made it, and thus depriving that artist of the benefit of people wanting art from that artist/in that style.

            Now, I’m not conflicted about people who have made millions off their art having people make imitations or copies, those people live more than comfortably enough. But in your example there are still other human artists benefiting, which is not the case for computationally generated works. It’s great for me to be able to have computers create art for a DnD campaign or something, but I still recognize that it’s making it harder for artists to earn a living from their skills. And to a certain degree it makes it so people who never would have had any such art now can. It’s in many ways like piracy with the same ethical framing. And as with piracy it may be that people that use AI to make them art become greater “consumers” of art made by humans as well, paying it forward. But it may also not work exactly that way.

            • Otome-chan@kbin.social · 3 points · 1 year ago

              People aren’t allowed to produce similar styles to other humans? So do you support disney preventing anyone from making cartoons?

              • CallumWells@lemmy.ml · 1 point · 1 year ago

                Now you’re making a strawman. Other humans that are actually making art generally don’t fully copy a specific style, they draw inspiration from different sources and that amalgamation is their style.

                Your comment reads as bad-faith to me. If it wasn’t meant as such you’re free to explain your stance properly instead of making strawman arguments.

          • Samus Crankpork@beehaw.org · 1 point · 1 year ago

            So you hire people to trace the original art, that’s still copying it, and nobody is learning anything. It’s copying.

      • FaceDeer@kbin.social · 4 points · 1 year ago

        Yeah, all these people yelling about how people who use AI art generators are “thieves” who are “stealing” art and that the things they generate are “not really art” and so forth. Very disrespectful.

    • ParsnipWitch@feddit.de · 9 points · 1 year ago

      We will probably all have to get used to this soon because I can see the same happening to authors, journalists and designers. Perhaps soon programmers, lawyers and all kinds of other people as well.

      It’s interesting how people on Lemmy pretend to be all against big corporations and capitalism and then happily indulge in the process of making artists jobless because “Muh technology cool!”. I don’t know the English word to describe this situation. In German I would say “Tja…”

    • FaceDeer@kbin.social · 7 points · 1 year ago

      Just as quickly as people disregard the human art enjoyer, who now has access to a powerful tool to create art undreamed of a year ago.

      I have found over the years that forums that claim to be about various forms of art are almost always really about the artists that make that art, and have little to no regard for the people who are there just for the art itself. The AI art thing is just the latest and most prominent way of revealing this.

    • teichflamme@lemm.ee · 22 points · 1 year ago

      Nothing was stolen.

      Drawing inspiration from someone else by looking at their work has been around for centuries.

      Imagine if the Renaissance couldn’t happen because artists didn’t want their style stolen.

    • KoboldCoterie@pawb.social · 22 points · 1 year ago

      I don’t fully understand how this works, but if they’ve created a way to replicate his style that doesn’t involve using his art in the model, how is it problematic? I understand not wanting models to be trained using his art, but he doesn’t have exclusive rights to the art style, and if someone else can replicate it, what’s the problem?

      This is an honest question, I don’t know enough about this topic to make a case for either side.

      • jamesravey@lemmy.nopro.be · 32 points · 1 year ago

        TL;DR The new method still requires his art.

        LoRA is a way to add additional layers to a neural network that effectively allow you to fine-tune its behaviour. Think of it like a “plugin” or a “mod”.

        LoRAs require examples of the thing you are targeting. Lots of people in the SD community build them for particular celebrities or art styles by collecting examples of that celebrity or style from online.

        So in this case Greg has asked Stability AI to remove his artwork, which they have done, but some third party has created an unofficial LoRA that does use his artwork to mod the functionality back in.

        In the traditional world the rights holder would presumably DMCA the plugin but the lines are much blurrier with LoRA models.
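
        For anyone curious what “additional layers” means concretely, here is a minimal, illustrative PyTorch sketch of the LoRA idea (not the actual Stable Diffusion implementation; the LoRALinear name and hyperparameters are made up for the example). The base weights stay frozen and only the two small low-rank matrices are trained, which is why a LoRA can be shipped as a small separate file and attached to an existing model:

        ```python
        import torch.nn as nn

        class LoRALinear(nn.Module):
            """Wrap a frozen Linear layer and add a trainable low-rank update: W x + (B A x) * scale."""
            def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
                super().__init__()
                self.base = base
                for p in self.base.parameters():
                    p.requires_grad = False  # the original weights are never touched
                self.lora_a = nn.Linear(base.in_features, rank, bias=False)   # down-projection "A"
                self.lora_b = nn.Linear(rank, base.out_features, bias=False)  # up-projection "B"
                nn.init.zeros_(self.lora_b.weight)  # start as a no-op so behaviour is initially unchanged
                self.scale = alpha / rank

            def forward(self, x):
                return self.base(x) + self.lora_b(self.lora_a(x)) * self.scale
        ```

        In Stable Diffusion these adapters are typically attached to the attention layers, so the resulting LoRA file is a small fraction of the size of the base model.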

      • delollipop@beehaw.org · 11 points · 1 year ago

        Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

        But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

        Plus, I find the opt-out mentality really creepy and disrespectful.

        “If he contacts me asking for removal, I’ll remove this.” Lykon said. “At the moment I believe that having an accurate immortal depiction of his style is in everyone’s best interest.”

        • fsniper@kbin.social · 20 points · 1 year ago

          I still have trouble understanding the distinction between “a human consuming different artists, and replicating the style” vs “software consuming different artists, and replicating the style”.

        • averyminya@beehaw.org · 12 points · 1 year ago

          But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

          LoRAs are created from image datasets, but these images are just available anywhere. It’s really not much different from you taking every still of The Simpsons and using it. What I don’t understand is how these are seen as problematic, because a majority of end users utilizing AI are doing it under fair use.

          No one charges for LoRAs or models AFAIK. If they do, it hasn’t come across the Stable Diffusion Discords I moderate.

          People actually selling AI-generated art is also a different story, and that’s where it falls outside of fair use if the models being used contain copyrighted work. It seems pretty cut and dried: artists complained about being emulated by other artists before AI, so it’s only reasonable that it happens again. If people are profiting off it, it should at least give compensation to the original artist (if it could be adjusted so that per-token payments are given as royalties to the artist). However, on the other hand think about The Simpsons, or Pokemon, or anything that has ever been sold as a sticker/poster/display item.

          I’m gonna guess that a majority of people have no problem with that IP theft cause it’s a big company. Okay… so what if I love Greg but he doesn’t respond to my letters and e-mails begging to commission him for a Pokemon Rutkowski piece? Under fair use there’s no reason I can’t create that on my own, and if that means creating a dataset of all of his paintings that I paid for to utilize it, then it’s technically legal.

          The only thing here that would be unethical or illegal is if his works were copyrighted and being redistributed. They aren’t being redistributed, and currently copyrighted materials aren’t protected from being used in AI models, since the work produced by AI can’t be copyrighted. In other words, while it may be disrespectful to go against the artist’s wishes to not be used in AI, there are no current grounds for it other than an artist not wanting to be copied… which is a tale as old as time.

          TL;DR model and LoRA makers aren’t charging, users can’t sell or copyright AI works, and copyrighted works aren’t protected from being used in AI models (currently). An artist not wanting to be used currently has no grounds other than making strikes against anything that is redistributing copies of their work. If someone is using this LoRA to recreate Greg Rutkowski paintings and then proceeds to give or sell them, then the artist is able to claim that there’s theft and damages… but the likelihood of an AI model being able to do this is low. The likelihood of someone selling these is higher, but from my understanding artistic styles are pretty much fair game any way you swing it.

          I understand wanting to protect artists. Artists also get overly defensive at times. I’m not saying that this guy is; I’m actually more on his side than my comment makes it seem, especially after how he was treated in the Discord I moderate. I’m more just pointing out that there’s a slippery slope both ways, and the current state of U.S. law on it.

        • SweetAIBelle@kbin.social · 8 points · 1 year ago

          Generally speaking, the way training works is this:
          You put together a folder of pictures, all the same size. It would’ve been 1024x1024 in this case. Other models have used 768x768 or 512x512. For every picture, you also have a text file with a description.

          The training software takes a picture, slices it into squares, generates a square of random noise the same size, then trains on how to change that noise into that square. It associates that training with tokens from the description that went with that picture. And it keeps doing this.

          Then later, when someone types a prompt into the software, it tokenizes it, generates more random noise, and uses the denoising methods associated with the tokens you typed in. The pictures in the folder aren’t actually kept by it anywhere.

          From the side of the person doing the training, it’s just put together the pictures and descriptions, set some settings, and let the training software do its work, though.

          (No money involved in this one. One person trained it and plopped it on a website where people can download loras for free…)
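
          A very rough sketch of what one such training step looks like in code, for the curious. This is illustrative PyTorch only (real trainers work on compressed latents, use proper noise schedulers and a text encoder for the captions, and every name here is made up for the example):

          ```python
          import torch
          import torch.nn.functional as F

          def add_noise(image, noise, t, num_timesteps=1000):
              # Simple linear schedule: the higher t is, the more noise ends up in the image.
              alpha = 1.0 - t.float() / num_timesteps
              return alpha.sqrt() * image + (1 - alpha).sqrt() * noise

          def training_step(model, image, caption_tokens):
              """One simplified step: the model learns to predict the noise mixed into an image,
              conditioned on the tokens from that image's description."""
              t = torch.randint(0, 1000, (1,))                         # pick a random noise level
              noise = torch.randn_like(image)                          # random noise, same shape as the image
              noisy_image = add_noise(image, noise, t)                 # corrupt the image with that noise
              predicted_noise = model(noisy_image, t, caption_tokens)  # model guesses what the noise was
              return F.mse_loss(predicted_noise, noise)                # learn by comparing guess to actual noise
          ```

          Nothing in this loop stores the training picture anywhere; what the optimiser updates are the model’s weights, which is why the pictures in the folder aren’t kept.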

        • KoboldCoterie@pawb.social · 6 points · 1 year ago

          Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

          I don’t, but another poster noted that it involves using his art to create the LoRA.

          Plus, I find the opt-out mentality really creepy and disrespectful.

          I don’t know about creepy and disrespectful, but it does feel like they’re saying “I know the artist doesn’t want me to do this, but if he doesn’t specifically ask me personally to stop, I’m going to do it anyway.”

        • Rhaedas@kbin.social · 6 points · 1 year ago

          they charge money to generate his style of art without compensating him.

          That’s really the big thing, not just here but with any material that’s been used to train on without permission or compensation. The difference is that most of it is so subtle it can’t be picked out, but an artist’s style is obviously a huge parameter, since his name was being used to call out those particular training aspects during generations. It’s a bit hypocritical to say you aren’t stealing someone’s work when you stick his actual name in the prompt. It doesn’t really matter how many levels the art style has been laundered through, it still originated from him.

            • Rhaedas@kbin.social · 6 points · 1 year ago

              And yet the artist’s name is used to push the weights towards pictures in their style. I don’t know what the correct semantics are for it, nor the legalities. That’s part of the problem, the tech is ahead of our laws, as is usually the case.

              • conciselyverbose@kbin.social · 8 points · 1 year ago

                And yet the artist’s name is used to push the weights towards pictures in their style.

                That’s not even vaguely new in the world of art.

                Imitating style is the core of what art is. It’s absolutely, unconditionally permitted under copyright law. It’s not even a .01 out of 10 on the scale of unethical. It’s what’s supposed to happen.

                The law might not cover this yet, but any law that restricts the fundamental right to build off of the ideas of others that are the core of the entirety of human civilization is unadulterated evil. There is no part of that that could possibly be acceptable to own.

                • Rhaedas@kbin.social · 1 point · 1 year ago

                  I totally agree with you on protecting the basics of creativity and growth. I think the core issue is using “imitate” here. Is that what the model is doing, or is that an anthropomorphism of some sense that there’s intelligence guiding the process? I know it seems like I’m nitpicking things to further my point, but the fact that this is an issue to many even outside artwork says there is a question here of what is and isn’t okay.

              • Altima NEO@lemmy.zip · 7 points · 1 year ago

                It’s only using his name because the person who created the LoRA trained it with his name. They could have chosen any other word.

                • Rhaedas@kbin.social · 1 point · 1 year ago

                  True, and then because it’s a black box there wouldn’t be a known issue at all. Or maybe it would be much less of an issue because the words might have blended others into the mix, and his style wouldn’t be as obvious in the outputs, and/or it would be easier to dismiss. Did the training involve actual input of his name, or was that pulled from the source trained on? How much control was in the training?

            • Peanut@sopuli.xyz · 6 points · 1 year ago

              Just wait until you can copyright a style. Guess who will end up owning all the styles.

              Spoiler: it’s wealthy companies like Disney and Warner. Oh, you used cross-hatching? Disney owns the style now, you thief.

              Copyright is fucked. Has been since before the Mickey Mouse Protection Act. Our economic system is fucked. People would rather fight each other and new tools instead of rallying against the actual problem, and it’s getting to me.

              • Pseu@beehaw.org · 5 points · 1 year ago

                You’re right, copyright won’t fix it; copyright will just enable large companies to leverage more of their work and extract more from the creative space.

                But who will benefit the most from AI? The artists seem to be getting screwed right now, and I’m pretty sure that Hasbro and Disney will love to cut costs and lay off artists as soon as this blows over.

                Technology is capital, and in a capitalist system, that goes to benefit the holders of that capital. No matter how you cut it, laborers including artists are the ones who will get screwed.

                • TheBurlapBandit@beehaw.org · 4 points · 1 year ago

                  Me, I’ll benefit the most. I’ve been using a locally running instance of the free and open source AI software Stable Diffusion to generate artwork for my D&D campaigns and they’ve never looked more beautiful!

              • ricecake@beehaw.org · 5 points · 1 year ago

                You said it yourself. You’re drawing Mickey Mouse in a new pose, so you’re copying Mickey Mouse.

                Drawing a cartoon in the style of Mickey Mouse isn’t the same thing.

                You can’t have a copyright on “big oversized smile, exaggerated posture, large facial features, oversized feet and hands, rounded contours and a smooth style of motion”.

                • tqgibtngo@kbin.social · 1 point · 1 year ago

                  There is nothing at all being copied but an aesthetic.

                  Although to me it is interesting that, even without literal copying, a generator might be capable of potentially emulating some key features of a specified source. Can this sometimes arguably extend beyond just “an aesthetic”? We’ve all seen examples similar to this one (from the SD online demo, default setting, with a familiar public-domain source) — https://i.imgur.com/PUJs3RL.png

      • Hubi@feddit.de
        link
        fedilink
        arrow-up
        7
        ·
        edit-2
        1 year ago

        You’re pretty spot on. It’s not much different from a human artist trying to copy his style by hand but without reproducing the actual drawings.

    • falsem@kbin.social
      link
      fedilink
      arrow-up
      19
      ·
      1 year ago

      If I look at someone’s paintings, then paint something in a similar style, did I steal their work? Or did I take inspiration from it?

      • Pulse@dormi.zone
        link
        fedilink
        arrow-up
        15
        ·
        1 year ago

        No, you used it to inform your style.

        You didn’t drop his art on to a screenprinter, smash someone else’s art on top, then try to sell t-shirts.

        Trying to compare any of this to how one individual human learns is such a wildly inaccurate way to justify stealing someone else’s work product.

        • falsem@kbin.social
          link
          fedilink
          arrow-up
          14
          ·
          1 year ago

          If it works correctly it’s not a screenprinter, it’s something unique as the output.

          • Pulse@dormi.zone
            link
            fedilink
            arrow-up
            18
            ·
            1 year ago

            The fact that folks can identify the source of various parts of the output, and that intact watermarks have shown up, shows that it doesn’t work like you think it does.

            • FaceDeer@kbin.social
              link
              fedilink
              arrow-up
              11
              ·
              1 year ago

              They can’t, and “intact” watermarks don’t show up. You’re the one who is misunderstanding how this works.

              When a pattern is present very frequently the AI can learn to imitate it, resulting in things that closely resemble known watermarks. This is called “overfitting” and is avoided as much as possible. But even in those cases, if you examine the watermark-like pattern closely you’ll see that it’s usually quite badly distorted and only vaguely watermark-like.
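
              As a rough toy illustration of that (numpy only, not an actual diffusion model, and the numbers are made up): if a watermark-like patch appears in a large share of the training images, even the crudest statistics over the data pick it up, while everything unique to individual images averages away.

              ```python
              # Toy illustration, not how diffusion models actually work: a pattern that
              # appears in many training images dominates simple statistics over the data,
              # which is loosely why over-represented watermarks can leave ghostly traces.
              import numpy as np

              rng = np.random.default_rng(0)
              images = rng.random((1000, 32, 32))        # 1000 fake 32x32 "images" of noise

              watermark = np.ones((8, 8))                # a bright 8x8 "watermark" patch
              images[:400, 22:30, 22:30] = watermark     # stamped onto 40% of the images

              mean_image = images.mean(axis=0)           # crudest possible "model" of the data
              print(mean_image[22:30, 22:30].mean())     # ~0.7: the watermark region stands out
              print(mean_image[:8, :8].mean())           # ~0.5: everything else averages to noise
              ```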

              • Pulse@dormi.zone
                link
                fedilink
                arrow-up
                11
                ·
                1 year ago

                Yes, because “imitate” and “copy” are different things when stealing from someone.

                I do understand how it works; the “overfitting” just lays bare what it does. It copies, but tries to sample things in a way that won’t look like clear copies. It has no creativity, it is just trying to find new ways of making copies.

                If any of this was ethical, the companies doing it would have just asked for permission. That they didn’t says everything you need to know.

                I don’t usually have these kinds of discussions anymore. I got tired of conversations like this back in 2016, when it became clear that people will go to the ends of the earth to justify unethical behavior as long as the people being hurt by it are people they don’t care about.

                • FaceDeer@kbin.social
                  link
                  fedilink
                  arrow-up
                  5
                  ·
                  1 year ago

                  And we’re back to you calling it “stealing”, which it certainly is not. Even if it was copyright violation, copyright violation is not stealing.

                  You should try to get the basic terminology right, at the very least.

            • jarfil@beehaw.org
              link
              fedilink
              arrow-up
              4
              ·
              1 year ago

              Does that mean the AI is not smart enough to remove watermarks, or that it’s so smart it can reproduce them?

              • nickwitha_k (he/him)@lemmy.sdf.org
                link
                fedilink
                arrow-up
                4
                ·
                1 year ago

                LLMs and directly related technologies are not AI and possess no intelligence or capability to comprehend, despite the hype. So, they are absolutely the former, though it’s rather like a bandwagon sort of thing (x number of reference images had a watermark, so that’s what the generated image should have).

                • jarfil@beehaw.org
                  link
                  fedilink
                  arrow-up
                  3
                  ·
                  1 year ago

                  LLMs […] no intelligence or capability to comprehend

                  That’s debatable. LLMs have shown emergent behaviors aside from what was trained, and they seem to be capable of comprehending relationships between all sorts of tokens, including multi-modal ones.

                  Anyway, Stable Diffusion is not an LLM; it’s more of a “neural network hallucination machine” with some cool hallucinations that sometimes happen to be really close to some or parts of the input data. It still needs to be “smart” enough to decompose the original data into enough of the right patterns that it can reconstruct parts of the original from the patterns alone.

              • Swedneck@discuss.tchncs.de
                link
                fedilink
                arrow-up
                1
                ·
                1 year ago

                It’s like staring yourself blind at artworks with watermarks until you start seeing artworks with blurry watermarks in your dreams

  • CapedStanker@beehaw.org
    link
    fedilink
    arrow-up
    44
    ·
    1 year ago

    Here’s my argument: tough titties. Everything Greg Rutkowski has ever drawn or made has been inspired by other things he has seen and the experiences of his life, and this applies to all of us. Indeed, one cannot usually have experiences without the participation of others. Everyone wants to think they are special, and of course we are to someone, but to everyone no one is special. Since all of our work is based upon the work of everyone who came before us, all of our work belongs to everyone. So tough fucking titties, welcome to the world of computer science, where control-C and control-V are heavily encouraged.

    In that Beatles documentary, Paul McCartney said he thought that once you uttered the words into the microphone, it belonged to everyone. Little did he know how right he actually was.

    You think there is a line between innovation and infringement? Wrong, They are the same thing.

    And for the record, I’m fine with anyone stealing my art. They can even sell it as their own. Attribution is for the vain.

    • hglman@lemmy.ml
      link
      fedilink
      English
      arrow-up
      32
      ·
      1 year ago

      Greg wants to get paid, remove the threat of poverty from the loss of control and its a nonissue.

        • CallumWells@lemmy.ml
          link
          fedilink
          English
          arrow-up
          7
          ·
          1 year ago

          But every human activity desirable to others deserves compensation. If you want someone to do something for you, or make something for you, or entertain you, then it deserves compensation. The way ads on the internet have trained a lot of people to think that much of the entertainment on the internet is free has been a negative here. But at the same time, that ad-supported model does make it more available to people who otherwise couldn’t afford the price of admission. It’s partly democratizing, but it’s also a scourge.

          • Virulent@reddthat.com
            link
            fedilink
            English
            arrow-up
            3
            ·
            1 year ago

            Even if that were true it wouldn’t apply to this situation. The man wants monopoly rights to his art style. That’s insane.

            • CallumWells@lemmy.ml
              link
              fedilink
              English
              arrow-up
              1
              ·
              1 year ago

              Has he said that no other humans could be inspired by his art style? If not, then he hasn’t expressed a desire for monopoly rights to his art style. But he has expressed that he doesn’t want computers to generate art explicitly to mimic his art style.

              Also, don’t make claims that are totally disconnected from the argument being discussed. It’s dishonest discourse and serves as a way to brush aside the other argument. You didn’t make any counterargument to my point, or to the point of this chain, which came from you saying “Not every human activity deserves compensation” as a reply to someone saying “Greg wants to get paid, remove the threat of poverty from the loss of control and its [sic] a nonissue.”

              Your reply to me was inane.

    • ParsnipWitch@feddit.de
      link
      fedilink
      arrow-up
      11
      ·
      edit-2
      1 year ago

      I think people forget the reality when they take their supposedly brave and oh so altruistic stance of “there should be no copyright”.

      When people already know they won’t even have a small chance of getting paid for the art they create, we will run out of artists.

      Because most can not afford to learn and practice that craft without getting any form of payment. It will become a very rare hobby of a few decadent rich people who can afford to learn something like illustration in their free time.

    • smart_boy@beehaw.org
      link
      fedilink
      arrow-up
      6
      ·
      1 year ago

      If a company stole your art and copyrighted it such that it no longer belonged to everyone, in the same way that a Beatles record cannot be freely and openly shared, would you be fine with that?

    • ultratiem@lemmy.ca
      link
      fedilink
      arrow-up
      5
      ·
      1 year ago

      A sad fact but undeniable truth. I work in the industry. It’s standard for us to do mood boards. I have a love-hate relationship with them, because they can be helpful for honing the design to a client’s liking and getting your bearings. But the fact is, it’s essentially what AI is doing by “borrowing” existing art as a reference. It’s the exact same thing. And that’s why I hate doing it. Because I don’t want to take someone’s button or background pattern.

      Regardless of how I feel, I still can’t see AI as “stealing” when industry-accepted practices that do the exact same thing aren’t.

    • Mossy Feathers (She/They)@pawb.social
      link
      fedilink
      arrow-up
      22
      ·
      edit-2
      1 year ago

      Pretty much. There are ways of using it that most artists would be okay with. Most of the people using it flat out refuse to use it like that though.

      Edit: To expand on this:

      Most artists would be okay with AI art being used as reference material, inspiration, assisting with fleshing out concepts (though you should use concept artists for that in a big production), rapid prototyping and whatnot. Most only care that the final product is at least mostly human-made.

      Artists generally want you to actually put effort into what you’re making because, at the end of the day, typing a prompt into stable diffusion has more in common with receiving a free commission from an artist than it has with actually being an artist. If you’re going to claim something AI had a hand in as your own art, then you need to have done the majority of the work on it yourself.

      The most frustrating thing to me, however, is that there are places in art that AI could participate in which would send artists over the moon, but it’s not flashy so no one seems to be working on making AI in those areas.

      Most of what I’m personally familiar with has to do with 3d modeling, and in that discipline, people would go nuts if you released an AI tool that could do the UV work for you. Messing with UVs can be very tedious and annoying, to the point where most artists will just use a tool using conventional algorithms to auto-unwrap and pack UVs, and then call it a day, even if they’re not great.
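
      For reference, the conventional auto-unwrap most of us settle for today is basically a couple of calls in Blender’s bundled Python API. A minimal sketch, assuming a mesh object is active and selected (exact parameter names and units vary between Blender versions):

      ```python
      # Sketch of the conventional (non-AI) auto-unwrap artists settle for today,
      # using Blender's bundled Python API (bpy). Assumes the mesh to unwrap is the
      # active object; parameter names/units vary between Blender versions.
      import bpy

      bpy.ops.object.mode_set(mode='EDIT')       # UV operators run in Edit Mode
      bpy.ops.mesh.select_all(action='SELECT')   # unwrap every face

      # Smart UV Project: a purely geometric heuristic with no understanding of the model
      bpy.ops.uv.smart_project(angle_limit=1.15, island_margin=0.02)

      bpy.ops.object.mode_set(mode='OBJECT')
      ```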

      Another area is in rigging and weight painting. In order to animate a model, you have to rig it to a skeleton (unless you’re a masochist or trying to make a game accurate to late 90s-early 00s animation), paint the bone weights (which bones affect which polygons, and by how much), add constraints, etc. Most 3d modelers would leap at the prospect of having high-quality rigging and UVs done for them at the touch of a button. However, again, because it’s not flashy to the general public, no one’s put any effort into making an AI that can do that (afaik at least).

      Finally, even if you do use an AI in ways that most artists would accept as valid, you’ll still have to prove it, because there are so many people who put a prompt into stable diffusion, do some minor edits to fix hands (in older versions), and then try to pass it off as their own work.

      • DekkerNSFW@kbin.social
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        Sadly, AI isn’t as good with sparse data like vertices and bones, so most attempts to use AI on 3D stuff are via NeRFs, which is closer to a “photo” you can walk around in than to an actual 3D scene.

    • kboy101222@lemm.ee
      link
      fedilink
      arrow-up
      10
      ·
      1 year ago

      Welcome to the wonderful world of the silicon valley tech era! Everything must be profitable at all costs! Everything must steal every tiny fact about you! Everything must include ! Everything must go through enshittification!

  • AzureDusk10@kbin.social
    link
    fedilink
    arrow-up
    32
    ·
    edit-2
    1 year ago

    The real issue here is the transfer of power away from the artist. This artist has presumably spent years and years perfecting his craft. Those efforts are now being used to line someone else’s pockets, in return for no compensation and a diminishment in the financial value of his work, and, by the sounds of it, little say in the matter either. That to me seems very unethical.

    • millie@beehaw.org
      link
      fedilink
      arrow-up
      22
      ·
      1 year ago

      Personally, as an artist who spends the vast majority of their time on private projects that aren’t paid, I feel like it’s put power in my hands. It’s best at sprucing up existing work and saving huge amounts of time detailing. Because of stable diffusion I’ll be able to add those nice little touches and flashy bits to my work that a large corporation with no real vision has at their disposal.

      To me it makes it much easier for smaller artists to compete, leveling the playing field a bit between those with massive resources and those with modest resources. That can only be a good thing in the long run.

      But I also feel like copyright more often than not rewards the greedy and stifles the creative.

    • moon_matter@kbin.social
      link
      fedilink
      arrow-up
      9
      ·
      edit-2
      1 year ago

      But that’s sort of the nature of the beast when you put your content up for free on a public website. Does Kbin or Beehaw owe us money for our comments on this thread? What about everyone currently reading? At least KBin and Beehaw are making profit off of this.

      The argument is not as clear cut as people are making it sound and it has potential to up-end some fundamental expectations around free websites and user-generated content. It’s going to affect far more than just AI.

  • trashhalo@beehaw.orgOP
    link
    fedilink
    arrow-up
    26
    ·
    edit-2
    1 year ago

    Re: the “stolen”/“not stolen” comments: copyright law as interpreted by judges is still being worked out for AI. Stay tuned to see whether it gets defined as stolen or not. But even if the courts decide that existing copyright law defines training on artists’ work as legitimate use, the law can change, and it could still swing the artists’ way if Congress got involved.


    My personal opinion, which may not reflect what happens legally, is that I hope we all get more control over our data and how it’s used and sold. Whether that’s my personal data, like my comments or location, or my artistic data, like my paintings. I think that would be a better world.

    • FaceDeer@kbin.social
      link
      fedilink
      arrow-up
      18
      ·
      1 year ago

      Copyright law as interpreted by judges is still being worked out for AI. Stay tuned to see whether it gets defined as stolen or not.

      You just contradicted yourself in two sentences. Copyright and theft are not the same thing. They are unrelated to each other. When you violate copyright you are not “stealing” anything. This art is not “stolen”, full stop.

      • MJBrune@beehaw.org
        link
        fedilink
        arrow-up
        11
        ·
        1 year ago

        The “nothing of value was lost when you pirate” argument. I’m a game developer who fully encourages people to pirate my games (or email me if they can’t afford my games and want a free Steam key), but I can tell you value is lost when people pirate content. Even if that’s simply a positive Steam review, which in turn bumps you higher in Steam’s algorithmic placements, which gains you more sales. Something of value is lost when you pirate. It’s on the artist to determine if that value is acceptable to be lost. If they made their art for the sake of humanity or if they made art for the sake of survival in our shitty capitalistic society.

        So sorry, yes, something is lost, and it’s because of capitalism. I’d argue otherwise if it didn’t mean someone didn’t get to eat or pay rent. I pirated a lot of media back in my day when I couldn’t afford that media. I used to tell myself I wouldn’t have likely bought those things anyways. That I wasn’t taking from someone. In reality, I would have waited for a sale and gotten that media for 5 dollars. 5 dollars is still a lot of money when selling something though. If I just gave you 5 dollars you could do something small but nice for yourself. You could go buy a lot of things with that sale money. Just because you aren’t spending 60 dollars on it doesn’t mean you would never buy it. The fact that you want to play it says you’d probably buy it. Maybe you’d refund it. Maybe you wouldn’t. Your time is worth something to you though. Thus, when you pirate something, you are committing something of value of your own to searching for, downloading and ingesting that media.

        So how does this deal with copyright theft? Stealing something and using it devalues the original product. You’ve seen it a dozen times for better or worse. Minecraft is a great example of how it got devalued for a while there when everyone made Minecraft clones. My kid told me the other day that he got Minecraft on his tablet for free. It was some terrible knockoff he had been playing. I explained this and asked if he wanted the real thing. He said yes and I went and bought Minecraft. That in itself is proof that value is being lost by even legally taking an idea and copying it. A kid’s parent who didn’t know better would have just been like “Hmm, that’s great, have fun.” The best point I can make is that if there was one video game ever, to play a video game you would have to buy that one game. That one game would have more sales than any single game out there today. Clearly, something of value is being created by the exclusivity of copyright.

        There is, of course, a balance. What is copyrightable? What stifles creativity and innovation? I would say if these AI artists were able to recreate the style from prompts and only train the AI on images that it has the authority to distribute (public domain images, CC0, etc.) then it’s fair game. Training AI on copyrighted materials and then distributing derived works is copyright theft and should be deemed as such.

      • Storksforlegs@beehaw.org
        link
        fedilink
        arrow-up
        10
        ·
        1 year ago

        Copying art for personal, non-commercial use is not theft, but copying someone’s art and then profiting (using their image without permission to enrich yourself) is theft.

        • FaceDeer@kbin.social
          link
          fedilink
          arrow-up
          33
          ·
          1 year ago

          No.

          • Copying someone’s art without permission is copyright violation, not theft.
          • These AIs aren’t copying anyone’s art, so it’s not even copyright violation.
          • whelmer@beehaw.org
            link
            fedilink
            arrow-up
            6
            ·
            1 year ago

            That’s your opinion. The contrary opinion would be that copyright infringement is the theft of intellectual property, which many people view as of equal substantiality to physical property.

            You can disagree with the concept of intellectual property but clearly there’s an alternative to your point of view that you can’t just dismiss by declaration.

            • FaceDeer@kbin.social
              link
              fedilink
              arrow-up
              10
              ·
              1 year ago

              Take your opinion to a court of law and see how far it gets. They actually pay close attention to what words mean there. If copyright violation was theft why do they have two different sets of laws to deal with them?

              • whelmer@beehaw.org
                link
                fedilink
                arrow-up
                2
                ·
                1 year ago

                I’m sure you’re aware that the manner in which legal bureaucracies define terms is a form of jargon that differentiates legal language from actual language.

                They have separate categories of laws to deal with them because physical property is different than intellectual property. The same reason they use a different category of law to deal with identity theft.

        • meteokr@community.adiquaints.moe
          link
          fedilink
          arrow-up
          8
          ·
          1 year ago

          Breaking copyright is a contract/license violation, not theft. Breach of contract is handled very differently from theft in most jurisdictions.

      • trashhalo@beehaw.orgOP
        link
        fedilink
        arrow-up
        5
        ·
        1 year ago

        “Is copyright infringement theft” is something that has been debated for as long as mp3s have been a thing. This is an old argument with lots of material on both sides scattered across the web. I clearly fall on the side that copyright infringement is theft and theft is stealing.

        • Amju Wolf@pawb.social
          link
          fedilink
          arrow-up
          9
          ·
          edit-2
          1 year ago

          There’s absolutely no debate, legal or otherwise.

          Theft, by definition, requires you to deprive someone of something. That simply cannot happen when you copy stuff. That’s why it’s called copyright infringement and not theft.

          You can only steal art by physically stealing an art piece - then and only then it’s theft.

          • whelmer@beehaw.org
            link
            fedilink
            arrow-up
            4
            ·
            edit-2
            1 year ago

            What do you mean there is no debate? You’re debating it right now.

            Plenty of artists view it as theft when people take their work and use it for their own ends without their permission. Not everyone, sure. But it’s a bit odd to state so emphatically that there is no debate.

  • Melody Fwygon@beehaw.org
    link
    fedilink
    arrow-up
    25
    ·
    edit-2
    1 year ago

    AI art is factually not art theft. It is creation of art in the same rough and inexact way that we humans do it; except computers and AIs do not run on meat-based hardware that has an extraordinary number of features and demands that are hardwired to ensure survival of the meat-based hardware. It doesn’t have our limitations; so it can create similar works in various styles very quickly.

    Copyright, on the other hand, is an entirely different and very sticky subject. By default, “All Rights Are Reserved” is something that usually is protected by these laws. These laws, however, are not grounded in modern times. They are grounded in the past; before the information age truly began its upswing.

    Fair use generally encompasses all usage of information that is one or more of the following:

    • Educational; so long as it is taught as a part of a recognized class and within curriculum.
    • Informational; so long as it is being distributed to inform the public about valid, reasonable public interests. This is far broader than some would like; but it is legal.
    • Transformative; so long as the content is being modified in a substantial enough manner that it is an entirely new work that is not easily confused for the original. This too, is far broader than some would like; but it still is legal.
    • Narrative or Commentary purposes; so long as you’re not copying a significant amount of the whole content and passing it off as your own. Short clips with narration and lots of commentary interwoven between them is typically protected. Copyright is not intended to be used to silence free speech. This also tends to include satire; as long as it doesn’t tread into defamation territory.
    • Reasonable, ‘Non-Profit Seeking or Motivated’ Personal Use; People are generally allowed to share things amongst themselves and their friends and other acquaintances. Reasonable backup copies, loaning of copies, and even reproduction and presentation of things are generally considered fair use.

    In most cases AI art is at least somewhat Transformative. It may be too complex for us to explain it simply; but the AI is basically a virtual brain that can, without error or certain human faults, ingest image information and make decisions based on input given to it in order to give a desired output.

    Arguably; if I have license or right to view artwork; or this right is no longer reserved, but is granted to the public through the use of the World Wide Web…then the AI also has those rights. Yes. The AI has license to view, and learn from your artwork. It just so happens to be a little more efficient at learning and remembering than humans can be at times.

    This does not stop you from banning AIs from viewing all of your future works. Communicating that fact to all who interact with your works is probably going to make you a pretty unpopular person. However; rightsholders do not hold or reserve the right to revoke rights that they have previously given. Once that genie is out of the bottle; it’s out…unless you’ve got firm enough contract proof to show that someone agreed to otherwise handle the management of rights.

    In some cases; that proof exists. Good luck in court. In most cases however; that proof does not exist in a manner that is solid enough to please the court. A lot of the time; we tend to exchange, transfer and reserve rights ephemerally…that is in a manner that is not strictly always 100% recognized by the law.

    Gee; Perhaps we should change that; and encourage the reasonable adaptation and growth of Copyright to fairly address the challenges of the information age.

          • Deniz Opal@syzito.xyz
            link
            fedilink
            arrow-up
            10
            ·
            1 year ago

            @raccoona_nongrata

            Actually. It is necessary. The process of creativity is much much more a synergy of past consumption than we think.

            It took 100,000 years to get from cave drawings to Leonardo da Vinci.

            Yes we always find ways to draw, but the pinnacle of art comes from a shared culture of centuries.

              • Deniz Opal@syzito.xyz
                link
                fedilink
                arrow-up
                2
                ·
                1 year ago

                @raccoona_nongrata

                A machine will not unilaterally develop an art form, and develop it for 100,000 years.

                Yes I agree with this.

                However, they are not developing an art form now.

                Nor did Monet, Shakespeare, or Beethoven develop an art form. Or develop it for 100,000 years.

                So machines cannot emulate that.

                But they can create the end product based on past creations, much as Monet, Shakespeare, and Beethoven did.

                • ParsnipWitch@feddit.de
                  link
                  fedilink
                  arrow-up
                  3
                  ·
                  1 year ago

                  No, humans create and develop styles in art from “mistakes” that AI would not continue pursuing, because they personally like them or have a strange addiction to their own creative process. The current hand mistakes, for example, were perhaps one of the few interesting things AI has done…

                  Current AI models recreate what is most liked by the majority of people.

        • Ben from CDS@dice.camp
          link
          fedilink
          arrow-up
          13
          ·
          1 year ago

          @selzero @raccoona_nongrata @fwygon But human creativity is not ONLY a combination of past creativity. It is filtered through a lifetime of subjective experience and combined knowledge. Two human artists schooled on the same art history can still produce radically different art. Humans are capable of going beyond what has been done before.

          Before going too deep on AI creation spend some time learning about being human. After that, if you still find statistical averages interesting, go back to AI.

          • Deniz Opal@syzito.xyz
            link
            fedilink
            arrow-up
            5
            ·
            edit-2
            1 year ago

            @glenatron @raccoona_nongrata @fwygon

            I mean, yes, you are right, but essentially, it is all external factors. They can be lived through external factors, or data fed external factors.

            I don’t think there is a disagreement here other than you are placing a lot of value on “the human experience” being an in real life thing rather than a read thing. Which is not even fully true of the great masters. It’s a form of puritan fetishisation I guess.

            • Ben from CDS@dice.camp
              link
              fedilink
              arrow-up
              8
              ·
              1 year ago

              @selzero @raccoona_nongrata @fwygon I don’t think it’s even controversial. Will sentient machines ever have an equivalent experience? Very probably. Will they be capable of creating art? Absolutely.

              Can our current statistical bulk reincorporation tools make any creative leap? Absolutely not. They are only capable of plagiarism. Will they become legitimate artistic tools? Perhaps, when the people around them start taking artists seriously instead of treating them with disdain.

              • Deniz Opal@syzito.xyz
                link
                fedilink
                arrow-up
                6
                ·
                1 year ago

                @glenatron @raccoona_nongrata @fwygon

                This angle is very similar to a debate going on in the cinema world, with Scorsese famously ranting that Marvel movies are “not movies”

                The point being that without a director’s message being portrayed, these cookie-cutter cinema experiences, with algorithmically developed storylines, should not be classified as proper movies.

                But the fact remains, we consume them as movies.

                We consume AI art as art.

    • Thevenin@beehaw.org
      link
      fedilink
      arrow-up
      19
      ·
      1 year ago

      It doesn’t change anything you said about copyright law, but current-gen AI is absolutely not “a virtual brain” that creates “art in the same rough and inexact way that we humans do it.” What you are describing is called Artificial General Intelligence, and it simply does not exist yet.

      Today’s large language models (like ChatGPT) and diffusion models (like Stable Diffusion) are statistics machines. They copy down a huge amount of example material, process it, and use it to calculate the most statistically probable next word (or pixel), with a little noise thrown in so they don’t make the same thing twice. This is why ChatGPT is so bad at math and Stable Diffusion is so bad at counting fingers – they are not making any rational decisions about what they spit out. They’re not striving to make the correct answer. They’re just producing the most statistically average output given the input.
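
      A toy sketch of that “most statistically probable next word, plus a little noise” idea (the vocabulary and scores below are invented purely for illustration; real models derive them from billions of learned weights):

      ```python
      # Toy sketch of temperature sampling: pick a likely next word, with noise.
      # Vocabulary and scores are made up for illustration only.
      import numpy as np

      rng = np.random.default_rng()
      vocab  = ["cat", "dog", "painting", "dragon"]
      scores = np.array([2.0, 1.5, 0.5, 0.1])      # the model's raw preference for each word

      def sample_next(temperature=1.0):
          probs = np.exp(scores / temperature)
          probs /= probs.sum()                      # softmax -> probability distribution
          return rng.choice(vocab, p=probs)         # noisy draw, so outputs vary run to run

      print([sample_next() for _ in range(5)])      # mostly "cat"/"dog", occasionally others
      ```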

      Current-gen AI isn’t just viewing art, it’s storing a digital copy of it on a hard drive. It doesn’t create, it interpolates. In order to imitate a person’s style, it must make a copy of that person’s work; describing the style in words is insufficient. If human artists (and by extension, art teachers) lose their jobs, AI training sets stagnate, and everything they produce becomes repetitive and derivative.

      None of this matters to copyright law, but it matters to how we as a society respond. We do not want art itself to become a lost art.

      • Fauxreigner@beehaw.org
        link
        fedilink
        arrow-up
        8
        ·
        1 year ago

        Current-gen AI isn’t just viewing art, it’s storing a digital copy of it on a hard drive.

        This is factually untrue. For example, Stable Diffusion models are in the range of 2GB to 8GB, trained on a set of 5.85 billion images. If it was storing the images, that would allow approximately 1 byte for each image, and there are only 256 possibilities for a single byte. Images are downloaded as part of training the model, but they’re eventually “destroyed”; the model doesn’t contain them at all, and it doesn’t need to refer back to them to generate new images.
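
        Spelling out that back-of-the-envelope arithmetic:

        ```python
        # Rough check of the figures above: even the largest quoted model size
        # works out to roughly one byte per training image.
        images = 5.85e9                      # images in the quoted training set
        for model_gb in (2, 8):
            bytes_per_image = model_gb * 1e9 / images
            print(f"{model_gb} GB model -> {bytes_per_image:.2f} bytes per image")
        # 2 GB -> 0.34 bytes per image, 8 GB -> 1.37 bytes per image
        ```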

        It’s absolutely true that the training process requires downloading and storing images, but the product of training is a model that doesn’t contain any of the original images.

        None of that is to say that there is absolutely no valid copyright claim, but it seems like either option is pretty bad, long term. AI generated content is going to put a lot of people out of work and result in a lot of money for a few rich people, based off of the work of others who aren’t getting a cut. That’s bad.

        But the converse, where we say that copyright is maintained even if a work is only stored as weights in a neural network, is also pretty bad; you’re going to have a very hard time defining that in such a way that it doesn’t cover the way humans store information and integrate it to create new art. That’s also bad. I’m pretty sure that nobody who creates art wants to have to pay Disney a cut because they once looked at some images Disney owns.

        The best you’re likely to do in that situation is say it’s ok if a human does it, but not a computer. But that still hits a lot of stumbling blocks around definitions, especially where computers are used to create art constantly. And if we ever hit the point where digital consciousness is possible, that adds a whole host of civil rights issues.

        • Thevenin@beehaw.org
          link
          fedilink
          arrow-up
          4
          ·
          1 year ago

          It’s absolutely true that the training process requires downloading and storing images

          This is the process I was referring to when I said it makes copies. We’re on the same page there.

          I don’t know what the solution to the problem is, and I doubt I’m the right person to propose one. I don’t think copyright law applies here, but I’m certainly not arguing that copyright should be expanded to include the statistical matrices used in LLMs and DPMs. I suppose plagiarism law might apply for copying a specific style, but that’s not the argument I’m trying to make, either.

          The argument I’m trying to make is that while it might be true that artificial minds should have the same rights as human minds, the LLMs and DPMs of today absolutely aren’t artificial minds. Allowing them to run amok as if they were is not just unfair to living artists… it could deal irreparable damage to our culture, because those LLMs and DPMs of today cannot take up the mantle of the artists they edge out or pass down their knowledge to the next generation.

          • Fauxreigner@beehaw.org
            link
            fedilink
            arrow-up
            2
            ·
            1 year ago

            Thanks for clarifying. There are a lot of misconceptions about how this technology works, and I think it’s worth making sure that everyone in these thorny conversations has the right information.

            I completely agree with your larger point about culture; to the best of my knowledge we haven’t seen any real ability to innovate, because the current models are built to replicate the form and structure of what they’ve seen before. They’re getting extremely good at combining those elements, but they can’t really create anything new without a person involved. There’s a risk of significant stagnation if we leave art to the machines, especially since we’re already seeing issues with new models including the output of existing models in their training data. I don’t know how likely that is; I think it’s much more likely that we see these tools used to replace humans for more mundane, “boring” tasks, not really creative work.

            And you’re absolutely right that these are not artificial minds; the language models remind me of a quote from David Langford in his short story Answering Machine: “It’s so very hard to realize something that talks is not intelligent.” But we are getting to the point where the question of “how will we know” isn’t purely theoretical anymore.

      • Zyansheep@lemmy.ml
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago
        1. How do you know human brains don’t work in roughly the same way chatbots and image generators work?

        2. What is art? And what does it mean for it to become “lost”?

          • Zyansheep@lemmy.ml
            link
            fedilink
            arrow-up
            2
            ·
            1 year ago

            No, he just said AI isn’t like human brains because it’s a “statistical machine”. What I’m asking is how he knows that human brains aren’t statistical machines?

            Human brains aren’t that good at direct math calculation either!

            Also he definitely didn’t explain what “lost art” is.

    • ParsnipWitch@feddit.de
      link
      fedilink
      arrow-up
      13
      ·
      1 year ago

      Current AI models do not learn the way human brains do. And the way current models learn how to “make art” is very different from how human artists do it. To repeatedly try and recreate the work of other artists is something beginners do. And posting these works online was always shunned in artist communities. You also don’t learn to draw a hand by remembering where a thousand different artists put the lines so it looks like a hand.

    • Shiri Bailem@foggyminds.com
      link
      fedilink
      arrow-up
      5
      ·
      edit-2
      1 year ago

      @fwygon all questions of how AI learns aside, it’s not legally theft but philosophically the topic is debatable and very hot button.

      I can however comment pretty well on your copyright comments which are halfway there, but have a lot of popular inaccuracies.

      Fair use is a very vague topic, and they explicitly chose not to make explicit terms on what is allowed, but rather to describe the intents of what is to be allowed. We’ve got some firm ones not because of specific laws but from an abundance of case evidence.

      * Educational; so long as it is taught as a part of a recognized class and within curriculum.
      * Informational; so long as it is being distributed to inform the public about valid, reasonable public interests. This is far broader than some would like; but it is legal.
      * Narrative or Commentary purposes; so long as you’re not copying a significant amount of the whole content and passing it off as your own. Short clips with narration and lots of commentary interwoven between them is typically protected. Copyright is not intended to be used to silence free speech. This also tends to include satire; as long as it doesn’t tread into defamation territory.

      These are basically all the same category and include some misinformation about what it does and does not cover. It’s permitted to make copies for purely informational, public interest (ie. journalistic) purposes. This would include things like showing a clip of a movie or a trailer to make commentary on it.

      Education doesn’t get any special treatment here, but research might (ie. making copies that are kept to a restricted environment and only used for research purposes; this is largely the protection that AI models currently fall under, because the training data uses copyrighted data but the resulting model does not).

      * Transformative; so long as the content is being modified in a substantial enough manner that it is an entirely new work that is not easily confused for the original. This too, is far broader than some would like; but it still is legal.

      “Easily confused” is a rule from Trademark Law, not copyright. Copyright doesn’t care about consumer confusion, but does care about substitution. That is, if the content could be a substitute for the original (ie. copying someone else’s specific painting is going to be a violation up until the point where it can only be described as “inspired by” the painting)

      * Reasonable, ‘Non-Profit Seeking or Motivated’ Personal Use; People are generally allowed to share things amongst themselves and their friends and other acquaintances. Reasonable backup copies, loaning of copies, and even reproduction and presentation of things are generally considered fair use.

      This is a very very common myth that gets a lot of people in trouble. Copyright doesn’t care about whether you profit from it, more about potential lost profits.

      Loaning is completely disconnected from copyright because no copies are being made (“digital loaning” is a nonsense attempt to claiming loaning, but is just “temporary” copying which is a violation).

      Personal copies are permitted so long as you keep the original copy (or the original copy is explicitly irrecoverably lost or destroyed) as you already acquired it and multiple copies largely are just backups or conversions to different formats. The basic gist is that you are free to make copies so long as you don’t give any of them to anyone else (if you copy a DVD and give either the original or copy to a friend, even as a loan, it’s illegal).

      It’s not good to rely on it being “non-profit” as a copyright excuse, as that’s more just an area of leniency than a hard line. People far too often think that it allows them to get away with copying things; it’s really just for topics like making backups of your movies or copying your CDs to mp3s.

      … All that said, fun fact: AI works are not covered by copyright law.

      To be copyrighted a human being must actively create the work. You can copyright things made with AI art, but not the AI art itself (ie. a comic book made with AI art is copyrighted, but the AI art in the panels is not, functioning much like if you made a comic book out of public domain images). Prompts and set up are not considered enough to allow for copyright (example case was a monkey picking up a camera and taking pictures, those pictures were deemed unable to be copyrighted because despite the photographer placing the camera… it was the monkey taking the photos).

    • joe_vinegar@slrpnk.net
      link
      fedilink
      arrow-up
      3
      ·
      1 year ago

      This is a very nice and thorough comment! Can you provide a reputable source for these points? (no criticism intended: as you seem knowledgeable, I’d trust you could have such reputable sources already selected and at hand, that’s why I’m asking).

      • throwsbooks@lemmy.ca
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

        Not the poster you’re replying to, but I’m assuming you’re looking for some sort of source that neural networks generate stuff, rather than plagiarize?

        Google scholar is a good place to start. You’d need a general understanding of how NNs work, but it ends up leading to papers like this one, which I picked out because it has neat pictures as examples. https://arxiv.org/abs/1611.02200

        What this one is doing is taking an input in the form of a face, and turning it into a cartoon. They call it an emoji, cause it’s based on that style, but it’s the same principle as how AI art is generated. Learn a style, then take a prompt (image or text) and do something with the prompt in the style.
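
        For a concrete picture of that last step, here’s roughly what “take a text prompt, generate an image in a learned style” looks like with the open-source diffusers library. This is only a sketch: the model ID is just an example, and you’d need the weights downloaded and ideally a GPU.

        ```python
        # Rough sketch of text-to-image generation with Hugging Face's diffusers library.
        # The model ID is only an example; requires downloaded weights and ideally a GPU.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",        # example base model
            torch_dtype=torch.float16,
        ).to("cuda")

        prompt = "a castle on a cliff at sunset, digital painting, dramatic lighting"
        image = pipe(prompt).images[0]               # the learned statistics do the rest
        image.save("castle.png")
        ```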

  • Dizzy Devil Ducky@lemm.ee
    link
    fedilink
    English
    arrow-up
    20
    ·
    edit-2
    1 year ago

    All this proves to me, based on the context from this post, is that people are willing to commit copyright infringement in order to make a machine produce art in a specific style.

    • Hawk@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      8
      ·
      1 year ago

      It doesn’t say anywhere they used copyrighted art though?

      Seems the new model might use art inspired by him, not his art itself.

      It’s a moral gray zone. If you add enough freely available works inspired by someone, the model can produce a similar style without using any original works.

      Is it still copyright infringement at that point?

      • UnknownCircle@kbin.social
        link
        fedilink
        arrow-up
        10
        ·
        edit-2
        1 year ago

        It’s unlikely that this did not use his work; these models require input data. Even if they took similar art, that would only resolve the issue of Greg himself, but would shift it to those other artists. Unless there is some sort of unspoken artistic genealogical purity that prevents artists with similar or inspired styles from having equal claim on their own creations when inspired by another.

        It also could be outputs generated from another AI model. But I don’t think people who see ethical problems in this care about the number of steps removed and processing that occurs when the origin is his artwork and it ultimately outputs the same or similar style. The result is what bothers people, no matter how disparate or disconnected the source’s influence is. If the models had simply found the Greg Rutkowski latent space through random chance people would still take issue with it.

        The ability and willingness to generate images in a style associated with a person, without consent, is a threat to that person’s job security and shows a lack of value for them as a human. As if their creative expression is worth nothing but as a commodity to be consumed.

        The people supporting this don’t care though. They want to consume this person’s style in far greater quantities and variations than a human is capable of or willing to fulfill. That’s why these debates are so fierce: the two sides have incentives that are in direct conflict with one another.

        We currently lack the economic ingenuity or willingness to create a system that will satisfy both parties. The barrier of entry to AI is low, someone at home has every incentive to maintain the status quo or even actively rail against artists. Artists will need a heavy handed approach from the government or as a collective to combat this effectively.

        • Harrison [He/Him]@ttrpg.network
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

          The ability and willingness to generate images in a style associated with a person, without consent, is a threat to that person’s job security and shows a lack of value for them as a human. As if their creative expression is worth nothing but as a commodity to be consumed.

          You can’t own an art style. Copyright only extends to discrete works and characters. If I pay a street artist to draw a portrait of me in the style of Picasso, I’m not devaluing Picasso as a person.

          • UnknownCircle@kbin.social
            link
            fedilink
            arrow-up
            1
            ·
            1 year ago

            I agree that you can’t own an art style in the US and I don’t know if there’s any other legal basis for artist’s claims.

            Legality doesn’t automatically deal with problems that are not based on whether something is legal or not. Losing money is losing money, regardless of whether it’s the result of something legal. And people can feel devalued by something that is legal. It just means that the government will not use force to intervene in what you’re doing and may in fact use force to support you.

            Picasso is dead, so he has no ability to feel devalued. Artists who are alive do have that ability and other living people who value his works do as well.

            I myself support and love this technology. But it is clear that it causes problems for some people. I would prefer for it to exist in a form where artists could get value from and be happy with it too, but that is just not the case at present.

        • KoboldCoterie@pawb.social
          link
          fedilink
          arrow-up
          1
          ·
          1 year ago

          It also could be outputs generated from another AI model.

          This is an interesting point, and you get into some real Ship of Theseus territory. At what point is it no longer based on his work? How many iterations before he no longer has any claim to it at all?

          • UnknownCircle@kbin.social
            link
            fedilink
            arrow-up
            1
            ·
            edit-2
            1 year ago

            It’s certainly interesting, but it’s ultimately going to be wherever we collectively decide.

            One thing modern ML advancements have made painfully clear is that something being the “same” is variable based on what definition you use to determine sameness. Is it the same crew, same look, same feel, same atoms, same purpose, same name, etc… In the absence of such definition, everything ceases to be the same the moment after it has been described. As every single thing is constantly changing.

            Living things naturally generalize similarities, relationships, and associations into patterns that are re-used and abstracted. So we very much take these things for granted.

            If you like that type of thing you may enjoy Funes the Memorious by Jorge Luis Borges.

      • Dizzy Devil Ducky@lemm.ee
        link
        fedilink
        English
        arrow-up
        4
        ·
        1 year ago

        If it’s inspired then at that point I guess it might not be copyright infringing unless it’s an accurate enough recreation of a copyrighted piece… And it looks like my mind filled in the gaps to assume it was copyrighted work being used.

  • arvere@lemmy.ml
    link
    fedilink
    arrow-up
    16
    ·
    1 year ago

    my take on the subject, as someone who worked both in design and arts, and tech, is that the difficulty in discussing this is more rooted on what is art as opposed to what is theft

    we mistakenly call illustrator/design work art work. art is hard to define, but most would agree it requires some level of expressiveness that emanates from the artist (from the condition of the human existence, to social criticism, to beauty by itself) and that’s what makes it valuable. with SD and other AIs, the control of this aspect is actually in the hands of the AI illustrator (or artist?)

    whereas design and illustration are associated with product development and market. while they can contain art in a way, they have to adhere to a specific pipeline that is generally (if not always) for profit. to deliver the best-looking imagery for a given purpose in the shortest time possible

    designers and illustrators were always bound to be replaced one way or another, as the system is always aiming to maximize profit (much like the now old discussions between taxis and uber). they have every right to whine about it, but my guess is that this won’t save their jobs. they will have to adopt it as a very powerful tool in their workflow or change careers

    on the other hand, artists that are worried, if they think the worth of their art lies solely in a specific style they’ve developed, they are in for an epiphany. they might soon realise they aren’t really artists, but freelance illustrators. that’s also not to mention other posts stating that we always climb on the shoulders of past masters - in all areas

    both artists and illustrators that embrace this tool will benefit from it, either to express themselves quicker and skipping fine arts school or to deliver in a pace compatible with the market

    all that being said I would love to live in a society where people cared more about progress instead of money. imagine artists and designers actively contributing to this tech instead of wasting time fighting over IP and copyright…

    • Harrison [He/Him]@ttrpg.network
      link
      fedilink
      arrow-up
      5
      ·
      1 year ago

      Artists don’t own their styles, so it’s interesting to see them fight to protect them.

      The only thing that makes anything valuable is that someone wants it, or at least wants it to exist. Nothing has intrinsic value because value itself is a human construction. This necessarily includes art.

      • itsgallus@beehaw.org
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        Artists should own their styles, but only in combination with their name. Forgery has always been a problem, but it’s obviously a lot more accessible thanks to AI. As a hobbyist artist myself, I don’t see monetary value as the main problem, but rather misrepresentation. Feel free to copy my style, but don’t attribute your art to me — AI generated or otherwise.

        That being said, I’m super excited about this evolution of technology.

  • SmoochyPit@beehaw.org
    link
    fedilink
    arrow-up
    10
    ·
    1 year ago

    If an image is represented as a network of weighted values describing subtle patterns in the image rather than a traditional grid of pixel color values, is that copy of the image still subject to copyright law?

    How much would you have to change before it isn’t? Or if you merged it with another representation, would that change your rights to that image?

    • whelmer@beehaw.org
      link
      fedilink
      arrow-up
      16
      ·
      1 year ago

      It doesn’t matter how you recreate an image, if you recreate someone else’s work that is a violation of copyright.

      Stealing someone’s style is a different matter.

      • KoboldCoterie@pawb.social
        link
        fedilink
        arrow-up
        1
        ·
        1 year ago

        if you recreate someone else’s work that is a violation of copyright.

        Only if the work is copyrighted, and your copy does not constitute fair use…

        I could create a faithful reproduction of the Mona Lisa (or… I mean, someone could, I sure couldn’t), and it’s not violating copyright, because the Mona Lisa is not copyrighted.

        • Samus Crankpork@beehaw.org
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

          I could create a faithful reproduction of the Mona Lisa

          You could, but Stable Diffusion couldn’t. All it can do is output what it’s been fed. It doesn’t know composition, or colour theory. It doesn’t understand that something is a human, or a fabric, or how materials work, it just reproduces variations of what it’s been fed. Calling it “intelligence” is disingenuous: it doesn’t “know” anything, it just reproduces what’s built into its database, usually without the artist’s permission.

  • Storksforlegs@beehaw.org
    link
    fedilink
    arrow-up
    9
    ·
    edit-2
    1 year ago

    There’s a lot of disagreement here on what is theft, what is art, what is copyright… etc

    The main issue people have with AI is, fundamentally, how is it going to be used? I know there isn’t much we can do about it now, and it’s a shame, because it has so much potential for good. Everyone defending AI is making a lot of valid points.

    But at the end of the day it is a tool that is going to be misused by the rich and powerful to eliminate hundreds of millions of well-paying careers, permanently. MOST well-paying jobs in fact, not just artists. What the hell are people supposed to do? How is any of this a good thing?

    • sapient [they/them]@infosec.pub
      link
      fedilink
      arrow-up
      8
      ·
      1 year ago

      What the hell are people supposed to do?

      Eat the rich :)

      More concretely, there are a number of smaller and larger sociopolitical changes that can be fought for. On the smaller side, there’s rethinking the way our society values people and pushing for some kind of UBI; on the larger side, there’s shifting to postcapitalist economics and organisation to various degrees. :)

      • boff@lemmy.one
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

        But the rich are the ones buying a lot of the art! Who will pay the artists if you eat the people with the money?

    • Harrison [He/Him]@ttrpg.network
      link
      fedilink
      arrow-up
      6
      ·
      1 year ago

      The rich and powerful must go away, or everyone else will suffer.

      Soon enough they will succeed in eliminating most jobs, and the moment will come where action must be taken. Them or us.

    • Steeve@lemmy.ca
      link
      fedilink
      arrow-up
      13
      ·
      edit-2
      1 year ago

      This person has no idea what machine learning actually is. And they hate such a generic concept on a “gut feeling” and come up with the reasons later?

      If you want good reasons to hate AI generated art you won’t find them in this shitty blogpost.

      • liminalDeluge@beehaw.org
        link
        fedilink
        English
        arrow-up
        6
        ·
        1 year ago

        Apparently your comment really got to them, because the blogpost now contains a direct quote of you and a response.

        • Steeve@lemmy.ca
          link
          fedilink
          arrow-up
          6
          ·
          edit-2
          1 year ago

          Someone I don’t get along with very well wrote:

          Hahaha yikes. Pretty cowardly to post their unhinged response on their blog where nobody can actually respond.

          Also, why the hell would this person who hates the very general concept of machine learning (because of their gut lol) get a degree in a field that significantly utilizes machine learning? Computational linguistics is essentially driven by machine learning, so that’s uh… probably bullshit.

    • Thrashy@beehaw.org
      link
      fedilink
      arrow-up
      11
      ·
      1 year ago

      as a counterpoint, when the use-case for the tool is specifically “I want a picture that looks like it was painted by Greg Rutkowski, but I don’t want to pay Greg Rutkowski to paint it for me”, that sounds like the sort of scenario that copyright was specifically envisioned to protect against – and if it doesn’t protect against that, it’s arguably an oversight in need of correction. It’s in AI makers’ and users’ interest to proactively self-regulate on this front, because if they don’t, somebody like Disney is going to wade into this at some point with expensive lobbyists and dictate the law to their own benefit.

      That said, it’s working artists like Rutkowski, or friends of mine who scrape together a living off commissioned pieces, that I am most concerned for. Fantasy art like Greg makes, or personal character portraits of the sort you find on character sheets of long-running DnD games or as avatar images on forums like this one, make up the bread and butter of many small-time artists’ work, and those commissions are the ones most endangered by the current state of the art in generative AI. It’s great for would-be patrons that the cost of commissioning a mood piece for a campaign setting or a portrait of their fursona has suddenly dropped to basically zero, but it sucks for artists that their lunch is being eaten by an AI algorithm that was trained by slurping up all their work without compensation or even credit. For as long as artists need to get paid for their work in order to live, that’s inherently anti-worker.

      • Sandra@idiomdrottning.org
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

        I’m also an artist, for whatever that’s worth, 🤷🏻‍♀️

        Copyright is artificial scarcity which is ultimately designed for publishers, not workers.

        One of the many, many bugs in market capitalism is that it can’t handle things that are difficult to create initially but cheap to copy. Like a song. It’s tricky to write, but once you have it you can copy it endlessly. Markets based on supply and demand can’t handle that, so they cooked up copyright as kind of a brutal patch, originally for book publishers in an era where normal readers couldn’t easily copy books anyway; only other publishers could.

        It’s a patch that doesn’t work very well, since many artists still work super hard and still have to get by on scraps. Ultimately we need to re-think a lot of economics. Not only because digital threw everything on its ear and what could’ve been a cornucopia is now a tug of war for pennies, but also because of climate change (which is caused by fossil fuel transaction externalities being under-accounted for—if I sell you a can of gas, the full environmental impact of that is not going to be factored in properly. Sort of like how a memory leak works in a computer program).

        I definitely sympathize with your artist friends, and I’ve been speaking out against AI art, at least some aspects of it (including, but not limited to, the environmental impact of new models and the increasing wealth and power concentration for big data capital).

      • Harrison [He/Him]@ttrpg.network
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        It sucked for candle makers when electric lights were adopted. It sucked for farriers and stable hands and saddle makers when cars became affordable for the average person. Such is the cost of progress.