• diffuselight@lemmy.world · 1 year ago

    It’s a bullshit article by a bullshit website. The law in question is a decade old. Japan hasn’t decided anything - they are slow to decide new things. It’s just this page clickbaiting.

    • dwks@lemmy.world · 1 year ago

      That explains their birth rate issue; it’s a new issue for them too 😂

    • shotgun_crab@lemmy.world · 1 year ago

      Thanks for the summary. When I read a title written like this it always smells like bullshit.

  • honey_im_meat_grinding@lemmy.blahaj.zone · 1 year ago

    I sympathize with artists who might lose their income if AI becomes big; as an artist, it’s something that worries me too. But I don’t think applying copyright to data sets is a good thing in the long term. Think about it - if copyright applies to AI data sets, all that does is one thing: kill open source AI image generation. It’ll be a small thorn in the side of corporations that want to use AI, before eventually handing them monopolies over the largest, most useful AI data sets in the world, which no one else can afford to replicate. They’ll just pay us artists peanuts, if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others that can change their terms of service to say any artist who uploads art allows it to be used for AI training - with an opt-out hidden deep in the preferences if we’re lucky. And if you want access to those data sources and licenses, you’ll have to pay the platform something average people can’t afford.

    • Phanatik@kbin.social · 1 year ago

      I completely disagree. The vast majority of people won’t be using the open source tools unless the more popular ones become open source (which I don’t think is likely). Also, a tool being open source doesn’t mean it’s allowed to trample over an artist’s rights to their work.

      They’ll just pay us artists peanuts, if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others that can change their terms of service to say any artist who uploads art allows it to be used for AI training - with an opt-out hidden deep in the preferences if we’re lucky.

      This is going to happen anyway. Copyright law has to catch up and protect against this; just because they put it in their terms of service doesn’t mean it can’t be legislated against.

      This was the whole problem with OpenAI anyway. They decided to use the internet as their own personal dataset and are now charging for it.

      • honey_im_meat_grinding@lemmy.blahaj.zone · 1 year ago

        I get where you’re coming from, but I don’t think even more private property is the answer here. This is ultimately a question of economics - we don’t like that a) we’re being put out of jobs, and b) it’s being done without our consent or anything in return. These are problems we can address without throwing even more monopolisation power into the equation, which is what IP is all about: giving artists a monopoly over their own content, which mostly benefits large media corporations, not independent artists.

        I’d much rather we tackled the problem of automation taking our jobs head-on, via something like UBI or negative income taxes, than with a one-off solution like even more copyright that only really serves to slow this inevitability down. You can regulate AI in as many ways as you want, but that adds a ton of meaningless friction to getting stuff done (e.g. you’d have to prove your art wasn’t made by AI somehow), when the much easier and more effective solution is something like UBI.

        The consent question needs a bit more of a radical solution - like democratising work, something Finland has done with its grocery stores: the biggest chains are democratically owned and run by their members (consumer co-ops). We’ll probably get to something like that on a large scale… eventually - but I think it’s probably a bigger hurdle than UBI. Then you’d be able to vote on how an organisation operates, including whether or how it builds AI data sets.

        • archomrade [he/him]@midwest.social · 1 year ago

          I appreciate this take, especially when applying copyright in the manner being proposed extends the already ambiguous grey area of “fair use”, which is most often used against artists.

      • Pulp@lemmy.dbzer0.com · 1 year ago

        Who gives a shit about artists rights? We need to move on with the progress like we always have.

        • Phanatik@kbin.social · 1 year ago

          We should give a shit about everyone’s rights to put food on the table. Compassion can be exhausting but it’s important to recognise that someone else’s problem might be yours one day and you’d wish someone was there to help you.

    • krnl386@lemmy.ca · 1 year ago

      I sympathize with artists too, but to a point. I predict that:

      1. AI art will eventually overtake human art; that is, human art jobs will mostly be replaced. Day-to-day art (e.g. ads, illustrations, decorations, billboards, etc.) will likely be AI-generated.
      2. Human art will become something akin to a home cooked meal in a sea of fast food art. This might actually make some artists famous and rich.
      3. Humans will continue to learn art, but more as a pastime/hobby/mental exercise.

      • ParsnipWitch@feddit.de · 1 year ago

        Regarding points 2 and 3: art is too expensive and time-consuming to learn. I feel a lot of people vastly underestimate the time and cost it takes to become a decent artist.

    • CloverSi@lemmy.comfysnug.space · 1 year ago

      This was my thinking too. In principle I support restrictions on the data AI can be trained on, no question - but practically speaking, the only difference restricting it makes is giving whatever companies gobble up the most IP the sole ability to make legal AI art. If a decision like that were made, there would be no more Stable Diffusion, available to anyone and everyone for free; the only legal options would be things like Adobe Firefly.

  • mtchristo@lemm.ee · 1 year ago

    So Japan is telling us that intellectual property is holding back its progress in AI. So are they recognizing that IP is a hindrance to progress and innovation? Should we expect this to nullify other IP legislation? Is this heading to court?

    • Boinketh@lemm.ee · 1 year ago

      Could we get around IP in general by just training an AI on the data and then getting the gist of it out of the AI? It’s just gotta be kinda remixed, yeah?

  • jayandp@sh.itjust.works · 1 year ago

    This is a strange move from a country that is usually the most overprotective when it comes to copyright. Though I guess if you view it from a “pro-business” view then it might make sense. Sucks a ton for artists though.

  • ox0r@jlai.lu · 1 year ago

    My AI trained torrent client will be very happy to hear this

  • Jerkface (any/all)@lemmy.ca · 1 year ago

    I’m not thrilled that copyright exists and that it is used as a weapon against innovation and artistic expression. But if it’s going to exist, I want it to actually fucking protect my works.

      • Gutless2615@ttrpg.network · 1 year ago

        It isn’t, and that wasn’t what the monkey selfie lawsuit was about. The monkey selfie lawsuit in fact supports the idea that generative art can be protected, if it demonstrates a manifestation of an artist’s specific intent. The monkey selfie wasn’t copyrightable not because a monkey isn’t a human, but because the monkey didn’t know wtf it was doing when it took the selfie.

  • !ozoned@lemmy.world@beehaw.org · 1 year ago

    So if the work they used to train it isn’t a copyright violation, can the things it creates be copyrighted? I hate copyright. It doesn’t protect the people it should. Make everything these AIs create public domain; companies will stay away, and we support creators directly.

  • krnl386@lemmy.ca · 1 year ago

    Well, they should prepare for a crapton of new datacenters to be built there. 😂

  • Gamey@lemmy.world · 1 year ago

    Well, on one hand the exception for AI is annoying, but it’s also kind of the direction I’ve been trying to convince people of for ages: FCK copyright!

  • Sir_Kevin@discuss.online · 1 year ago

    Smart move. They also clearly understand that AI is here to stay and it’s better to embrace it rather than fight it. This will give Japan an unhindered advantage while the rest of the world cries over who allowed a computer to look at their artwork.

  • Gutless2615@ttrpg.network · 1 year ago

    The absolute right decision. Generative art is a fair use machine, not a plagiarism one. We need more fair use, not less.

    • donuts@kbin.social · 1 year ago

      Not at all… In fact, it’s totally batshit insane to determine that the biggest tech companies in the world can freely use anybody’s copyrighted data or intellectual property to train an AI and then claim to have ownership over the output.

      The only way that it makes sense to have AI training be “fair use” is if the output of AI is not able to be copyrighted or commercially used, and that’s not the case here. This decision will only enable a mass, industrialized exploitation of workers, artists and creators.

      • Gutless2615@ttrpg.network · 1 year ago

        Expanding the already expansive terms of copyright is not the appropriate way to deal with the externalities of AI. This copyright-maximalist approach will hurt small artists and remix culture, drive up business costs for artists who will be dragged into court to prove their workflows didn’t involve any generative steps, and, as with every expansion of copyright, primarily help the large, already centralized corporate IP holders further cement their position.

      • RGB3x3@lemmy.world · 1 year ago

        But would copyright law not cover the creation of a piece of art that is derivative or a copy of another piece without proper credit?

        A human artist does not violate copyright merely for studying a piece of art. Only by replicating it do they violate the law.

        Why should these AI models not be covered in the same way?

      • Gutless2615@ttrpg.network · 1 year ago

        Expanding the term of copyright to 70 years after the life of the author didn’t actually help artists make art. Expanding copyright to cover “training” will result in more costly litigation, make things harder for small artists and creators, and further entrench the corporate IP hoarders who can afford to shoulder the increased costs of doing business. There are innumerable content creators who could and will make use of generative art to make content, and they should be allowed to prosper. We need more fair use, not less.

      • wolfshadowheart@kbin.social · 1 year ago

        That’s not true? There’s nothing stopping content creators from using their own content to create models. In fact, that’s my exact project for some of my visual art.

        Moreover, (edit: visual) models can’t effectively replicate the copyrighted works they were trained on, so I don’t really see how they would infringe.

      • Gutless2615@ttrpg.network · 1 year ago

        You do realize individuals can train neural networks on their own hardware, right? Generative art and generative text are not something owned by corporations - and in fact what is optimistically becoming apparent is that it is specifically difficult to build moats around a generative model, meaning it’s especially hard for corporations to own this technology outright - but those corporations are the only ones that benefit from expanding copyright. And I just disagree with you: a trained model is a transformative work, as are the works you can generate with those models. Applying the four-factor fair use test comes out heavily on the side of fair use.

        • RyanHeffronPhoto@kbin.social · 1 year ago

          @Gutless2615 Of course individuals can train models on their own work, but if they train them on other artists’ work, that too is an unauthorized use.

          Honestly, whether AI outputs can be copyrighted is really a separate issue from what I’m concerned about… what matters in these cases is where and how they obtained the inputs they trained the models on. If a corporation or individual is using other artists’ works without authorization, they are also committing theft, irrespective of any copyright infringement.

          • Gutless2615@ttrpg.network · 1 year ago

            And while we’re at it, let’s throw out mashup artists, collages, remixes and fair use altogether, huh? You’re just incorrect here. Fair use exists for a reason, and applying the four-factor fair use test to generative art comes out on the side of fair use nine times out of ten. What’s more, what you’re arguing for will only make things harder for small artists who get spurious accusations lobbed their way, or automated takedowns from bad “AI detector” software, and then have to drag out in-progress files and lawyer money to argue they didn’t use generative tools in their workflow. There are better ways to make sure artists can still get paid - and, spoiler alert: it’s not just the artists that are going to get hit. We need to embrace more creative solutions to the problems of AI than “copyright harder”.

    • brimnac@lemmy.world · 1 year ago

      To me, when the AI learns from those examples, it’s essentially the same as someone reading a book or watching a movie.

      • hh93@lemm.ee · 1 year ago

        The problem is that the AI can print the book word for word if you ask the right questions, and at that point it’s breaking copyright again. But that’s not a problem with the learning part; it’s a problem with how the AI has no concept of context at all.

          • hh93@lemm.ee · 1 year ago

            You can’t easily tell it to replicate any given painting for you - with current AI you can do that with almost any book it was trained on.

        • zkfcfbzr@lemmy.world · 1 year ago

          I was skeptical of this, but it checks out: I easily got ChatGPT to print out the full text of The Tell-Tale Heart, without any errors at all in the various spots I accuracy-checked.

          Granted I chose it because it’s a very short public domain work - I was more skeptical of its technical ability to recall the exact text without errors than of the ability to trick it into violating copyright law.

          I still suspect it’s much easier to (accidentally) trick it into writing a fanfiction of a copyrighted work that it claims is the original than it is to get it to produce the true original, though.
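
          For what it’s worth, if you wanted to script that spot-check instead of eyeballing it, something roughly like the sketch below would do it. It’s just an illustration - it assumes the official openai Python client, a placeholder model name, and a local plain-text copy of the story to compare against.

          import difflib
          from openai import OpenAI  # assumes the official openai Python client (v1+) is installed

          # Ask the model to recite the opening of a public-domain story.
          client = OpenAI()  # reads OPENAI_API_KEY from the environment
          resp = client.chat.completions.create(
              model="gpt-4o-mini",  # placeholder model name
              messages=[{
                  "role": "user",
                  "content": "Recite, word for word, the first two paragraphs of "
                             "'The Tell-Tale Heart' by Edgar Allan Poe.",
              }],
          )
          generated = resp.choices[0].message.content

          # Compare against a local reference copy of the text.
          with open("tell_tale_heart.txt", encoding="utf-8") as f:
              reference = f.read()

          # Rough similarity between the model's output and a same-length prefix of the reference.
          ratio = difflib.SequenceMatcher(None, generated, reference[:len(generated)]).ratio()
          print(f"similarity to reference prefix: {ratio:.2%}")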

          • Gutless2615@ttrpg.network · 1 year ago

            Your argument that it is useful as a copyright infringing machine is that it can reproduce a public domain work? That’s… not the argument you think it is.

            • zkfcfbzr@lemmy.world · 1 year ago

              My message was pretty clear about which part of their claim I was skeptical about and what I was testing for. It’s not what you described here.

        • Pulp@lemmy.dbzer0.com · 1 year ago

          Just like a person with a really good memory can. So what? Nobody is actually printing 300-page books that way when we can use libgen or any other source instead.

      • donuts@kbin.social · 1 year ago

        AI has no personal agency, lived experiences, or independent creative input.
        Humans don’t have the ability to synthesize thousands of pages of text in a matter of minutes.

        Any analogy to human learning or behavior is shallow and flawed.

        • ReCursing@kbin.social · 1 year ago

          This is why humans are involved in the process. Your counterargument is shallow and flawed

          • Gutless2615@ttrpg.network · 1 year ago

            Yes, exactly. When someone is creating art using Stable Diffusion, it is clearly a manifestation of that artist’s intent. That is what copyright is designed to protect and should protect.

            • donuts@kbin.social · 1 year ago

              Wrong. Copyright protects works, not ideas.

              The part that you AI bots always forget is that the machine doesn’t do shit without a dataset. No data input, no output. And if you don’t own the inputs, what the hell makes you think you can claim ownership over the outputs?

              If you ask an AI art program to paint you a “pretty kitty cat”, it can only do so because it has been fed enough pictures and paintings (plus metadata) to synthesize an acceptable output. Your human intent is an insignificant filter over their data, and if they haven’t trained on any pictures of cats, you will never achieve anything even close to your intent. Your prompt has the value of a Google search.

              Finally, there is a key thing called the “artistic process”, in which a human artist’s imagined vision of their finished work takes shape as they work. This is nothing like what happens inside a neural network, and it is why you are never going to be an artist simply by filling in a web form. You have no vision, and even if you did, the AI will never achieve it on your behalf.

              Sorry, but if AI art sounds too good to be true, it’s because it is simply exploiting and distorting other people’s copyrighted artwork. It gives you the illusion of having created something, like the kid mashing buttons at the arcade machine without putting any money in. But the good news is that it’s not too late to learn how to draw.

              • ReCursing@kbin.social · 1 year ago

                You’re fundamentally wrong and presenting a bad-faith argument in an insulting manner. Please shut up

                • donuts@kbin.social · 1 year ago

                  I was wrong to use the dismissive term “AI bots”. I’m genuinely sorry about that and I let my feelings as an artist get the best of me, but other than that my point still stands. To be fair, “you’re wrong” and “shut up” aren’t exactly the strongest counter arguments either. No hard feelings.

                  The objective truth is that “AI” neural networks synthesize an output based on an input dataset. There is no creativity, personality, artistry, or other x-factor there, and until there is real “general artificial intelligence” there never will be. Human beings feed inputs into the machine, and it generates an output based on some subset of those inputs. If those inputs are “fair use” or otherwise licensed, then that’s perfectly fine. But if those inputs are unlicensed copyrighted works, then you would be insane to believe that you own the output the algorithm produces - that’s like thinking you own the music that comes out of your speakers because you hit the play button. Just because you’re in control of the playback does not mean that you created the music, and nobody would seriously think that.

                  I’ve worked as an artist and a programmer, and a simple analogy is the concept of a software license. Just because you can see or download some source code on GitLab does not mean that you own it or can use it freely for any purpose; most code repositories are open-sourced under some kind of license, which legitimate users of that code must comply with. We’ve already seen Microsoft make this mistake and then instantly backtrack with GitHub Copilot, because they understand that they simply do not have the IP rights to use GPL code (for one example) to train their AI. Similarly, if a musician samples a portion of a song to use in their own song, depending on various factors they may have to share credit with the original creator, and sometimes that makes sense, in my opinion.

                  No matter how you or I feel about it, copyright law has always been there with the basic intent to protect people who create unique works. There are some circumstances which are currently considered “fair use” of unlicensed copyrighted works (for example, for educational purposes), and I think that’s great. But I think there is zero argument that unlimited automated content generation via AI ought to be considered genuine fair use. No matter how much AI fans want to try to personify the technology, it is not engaging in a creative or artistic process, it is merely synthesizing an output based on mixed inputs, just like how an AI chat bot is not truly thinking but merely stringing words together.

          • donuts@kbin.social · 1 year ago

            You do realize individuals can train neural networks on their own hardware, right?

            Good luck training something that rivals big tech, especially now that they’re all putting “moats” around their data…

            We, the little people, don’t have the data, the storage, the processing power, the RAM, and, last but not least, the cash to compete with them.

            At any rate, if you train your NN using appropriately licensed or public domain data, more power to you. But if you feed a machine a bunch of other people’s writing, artwork, music, etc., please understand that you will never truly own the output.

            You seem to be imagining a future in which AI is the great equalizer that ushers us in to some kind of utopia, but right now I’m only seeing even more money, power and control being clawed away from the people in favor of the biggest, richest tech conglomerates. It’s fucking dystopian, and I hope people like you will recognize that before it’s really too late.

              • donuts@kbin.social · 1 year ago

                At any rate, if you train your NN using appropriately licensed or public domain data, more power to you. But if you feed a machine a bunch of other people’s writing, artwork, music, etc., please understand that you will never truly own the output.

                I am.

                It is only the profit maximizing hyper capitalists who intend to use AI to exploit workers and rip off artists. I have no problem with the technology behind AI, I just don’t think people should be using it as a tool for continual, industrialized mass exploitation of the little people (like you and me) who actually own the data that they put online.

      • AnonTwo@kbin.social · 1 year ago

        Can it tell you what it learned, or is it just copying from billions of online conversations about what other people learned?

        If it can’t interpret, it’s not learning.

        All you get is the most basic form of data retention, even if it retained millions of examples.

      • RyanHeffronPhoto@kbin.social · 1 year ago

        @brimnac it’s not a ‘someone’ though. The AI isn’t an actual consciousness. It’s a software company illegally using other artists’ work to develop their own commercial product. BIG DIFFERENCE.

      • Surreal@programming.dev · 1 year ago

        Does AI also have eyeballs and brain tissue? Does it have a conscience, sentience, shame?