
View Full Version : AI being used for artwork in upcoming books?



animewatcha
2023-08-05, 08:55 PM
https://www.youtube.com/watch?v=fK7GwhRS2uY Nerd Immersion.

Isn't WOTC in enough doo-doo as it is, or is this a non-story?

Dork_Forge
2023-08-05, 09:45 PM
Sounds like they were entirely unaware that AI was used for the pieces until recently and have reacted accordingly:

- Updated artist guidelines to prevent it again
- Spoken to the artist involved and had him agree to not use AI again

It's definitely bad for their image and crappy for people paying good money for the books. If anything, I think this highlights a lack of real QA when it comes to art: they should have an in-house artist, probably multiple, who review art. That would have caught this, and if they do already have that, well, then they clearly failed spectacularly.

Unoriginal
2023-08-06, 08:50 AM
Sounds like they were entirely unaware that AI was used for the pieces until recently and have reacted accordingly:

- Updated artist guidelines to prevent it again
- Spoken to the artist involved and had him agree to not use AI again

It's definitely bad for their image and crappy for people paying good money for the books. If anything, I think this highlights a lack of real QA when it comes to art: they should have an in-house artist, probably multiple, who review art. That would have caught this, and if they do already have that, well, then they clearly failed spectacularly.

From what I've been told, the artist in question is very outspoken about their AI-generated pictures (as in, it's what you find directly if you check their work twitter page).

Probably a case of "someone approved the AI-pictures and didn't think people would notice".

That the corporate suits at Wizards of Hasbro don't understand that anything they do in public is constantly scrutinized is part of why they keep landing themselves in swamp water.

Zhorn
2023-08-06, 10:45 AM
AI art being one of those things where you can never be 100% sure it wasn't made using stolen art, I'm very much against the practice of it being used in these products.
Sure, you could say they are feeding it only art that they have paid for to have the AI learn from, but if that's the case, just use the original human-made art.

As a reference point for artists, I'm a little warmer on the topic: having an artist generate a sample to use as a rough draft of what they are aiming for on scale, colour, texture, posing, composition, etc. is fine. BUT do not incorporate anything AI-generated into the final product.
No touch-ups, no copying, no inclusions whatsoever.

Witty Username
2023-08-06, 11:51 AM
From what I've been told, the artist in question is very outspoken about their AI-generated pictures (as in, it's what you find directly if you check their work twitter page).

That doesn't actually mean much, depending on how corporate policy is updated. Say this was like situations that have occurred on the MTG art end, where things come up for a current artist in new spaces (the only examples I can think of may be beyond the scope of what the forum allows, since it is politics). Also, AI art is still new, and most organizations are slow to react to such things; even high-end tech companies are pretty incompetent when it comes to how to use AI and the implications of using it. Microsoft and Google are slow on this uptake of figuring out the how and what. For a card game/RPG company with a tech department renowned for software that hasn't been updated significantly since the 90s (Magic Online), mistakes and missteps are guaranteed to happen. And if policy already changed recently, then changes happen at the speed of communication, compliance, and reprisal.

The fact they even noticed this at all is encouraging. Not great, it's still a problem after all, but given other companies haven't noticed yet or just don't care in the slightest, it is on the better end.

Edit: I'm taking this conversation at face value from all parties. This is the first I have heard of it, so everything here could be based on false assumptions.

ZRN
2023-08-06, 11:59 AM
AI art being one of those things where you can never be 100% sure it wasn't made using stolen art, I'm very much against the practice of it being used in these products.
Sure, you could say they are feeding it only art that they have paid for to have the AI learn from, but if that's the case, just use the original human-made art.

As a reference point for artists, I'm a little warmer on the topic: having an artist generate a sample to use as a rough draft of what they are aiming for on scale, colour, texture, posing, composition, etc. is fine. BUT do not incorporate anything AI-generated into the final product.
No touch-ups, no copying, no inclusions whatsoever.

1. Kind of a stretch to call AI art "stolen" when you basically mean "the AI looked at it to try to figure out what art looks like." Especially when you get into the whole "fruit of the poisonous tree" argument that an artist can't even use AI elements in their final product for some reason? Like, an artist can straight up take a photo of the Mona Lisa and mess with it to create a new piece of art; that's protected by copyright law. (That's how e.g. rap and techno music is allowed to sample old songs.)

2. If they're feeding the AI stuff they already own there's DEFINITELY no problem. (In this case they probably weren't since WOTC apparently didn't know the artist was using AI.) "Just use the original human art" doesn't make sense here, because you're using the AI to generate a specific image you don't already have.

PhoenixPhyre
2023-08-06, 12:11 PM
1. Kind of a stretch to call AI art "stolen" when you basically mean "the AI looked at it to try to figure out what art looks like." Especially when you get into the whole "fruit of the poisonous tree" argument that an artist can't even use AI elements in their final product for some reason? Like, an artist can straight up take a photo of the Mona Lisa and mess with it to create a new piece of art; that's protected by copyright law. (That's how e.g. rap and techno music is allowed to sample old songs.)

2. If they're feeding the AI stuff they already own there's DEFINITELY no problem. (In this case they probably weren't since WOTC apparently didn't know the artist was using AI.) "Just use the original human art" doesn't make sense here, because you're using the AI to generate a specific image you don't already have.

If remixing inspiration and elements of style from other people's work without explicit authorization and credit is immoral, then D&D is in big trouble from the get-go.

Everyone "steals" from others. There hasn't been anything truly original in…years? Centuries?

At most, if you ask an AI (or Google, or a living artist) to copy someone else's work (not style, that's not protected) and then try to pass it off as yours or otherwise evade giving credit and payment, that's YOUR fault, not the AI's. IMO anyway. Speaking morally, not legally, because the law is screwy enough that it rarely aligns with morality.

False God
2023-08-06, 12:18 PM
Y'all are late to the discussion, the art community had this discussion a year ago.

AI "art" is almost explicitly founded off stealing the art of others by feeding it the work of human artists without their consent. It's not as simple as some program "looking" at google search results. It has to be explicitly taken from a source and fed into the AI's system. That's theft. It basically destroyed Deviantart because they started with an "opt out" mentality, automatically feeding every artist's work into their AI "art" system without consent, hoping that artists would be too slow to opt out and they'd have trained their AI before anyone could stop them.

These BS arguments of "everyone steals" are just pathetic self-placations to make yourself feel better about stealing.

AI "art" is theft. Period.

Snowbluff
2023-08-06, 12:20 PM
I mentioned WotC's position on AI art in the other thread, but someone detailed this on reddit and gave a good break down. (https://old.reddit.com/r/dndnext/comments/15jkm1a/ilya_shkipin_april_prime_and_ai/)

PhoenixPhyre
2023-08-06, 12:25 PM
Y'all are late to the discussion, the art community had this discussion a year ago.

AI "art" is almost explicitly founded off stealing the art of others by feeding it the work of human artists without their consent. It's not as simple as some program "looking" at google search results. It has to be explicitly taken from a source and fed into the AI's system. That's theft. It basically destroyed Deviantart because they started with an "opt out" mentality, automatically feeding every artist's work into their AI "art" system without consent, hoping that artists would be too slow to opt out and they'd have trained their AI before anyone could stop them.

These BS arguments of "everyone steals" are just pathetic self-placations to make yourself feel better about stealing.

AI "art" is theft. Period.

Then so is all art. Including writing. Every artist is trained on a subset of what came before and then goes beyond that to one degree or another.

Legally, “theft” is questionable. There are colorable arguments on either side. But the law is famously a donkey’s hindquarters.

Morally, I see no argument that AI is any different than a human doing the same thing. Learning elements of styles (which are not at all protected morally or legally) and mixing them together to get something similar but different. That’s how all creative work happens. And always has.

False God
2023-08-06, 12:29 PM
Then so is all art. Including writing. Every artist is trained on a subset of what came before and then goes beyond that to one degree or another.

Legally, “theft” is questionable. There are colorable arguments on either side. But the law is famously a donkey’s hindquarters.

Morally, I see no argument that AI is any different than a human doing the same thing. Learning elements of styles (which are not at all protected morally or legally) and mixing them together to get something similar but different. That’s how all creative work happens. And always has.

{Scrubbed}

Imbalance
2023-08-06, 12:47 PM
Between the robots stealing art and the aliens having faster vehicles, no wonder the humans are getting crankier. What next? The zombies get better TV ratings?

Zhorn
2023-08-06, 01:41 PM
Kind of a stretch to call AI art "stolen" when you basically mean "the AI looked at it to try to figure out what art looks like."
No, I mean stolen. I've seen too many examples of people feeding the work of other artists into these AI programs and having it spit out works with blatant replications of parts of their work.


Especially when you get into the whole "fruit of the poisonous tree" argument that an artist can't even use AI elements in their final product for some reason?
If the work wasn't drawn by you, you have no claim to call it your own work.
Just the same as if a text wasn't written by you, you cannot claim authorship.
If the music was produced by someone else, do not claim to be the musician of the piece.

We need to have hard lines in the sand, as AI tools WILL be abused.
People's work is being stolen and plagiarized, and the original creators are not receiving credit or compensation for their efforts.

... that's protected by copyright law. (That's how e.g. rap and techno music is allowed to sample old songs.)
Poor example; sampling (using parts, bits, or portions of prior recordings to be incorporated into a new song) without permission is a violation of copyright law.

If they're feeding the AI stuff they already own there's DEFINITELY no problem.
There are no controls on that aspect, and that's a huge part of the issue.

"Just use the original human art" doesn't make sense here, because you're using the AI to generate a specific image you don't already have.
Think of it like performing a cover version of a song. You know what the original sounds like. You listen to it a lot to learn how to replicate it. But when YOU are performing it, you don't have the original playing; you perform the piece yourself. Using it to learn and experiment is 100% ethically fine, just don't use it to produce the final work.

PhoenixPhyre
2023-08-06, 02:09 PM
My personal hard line is "don't reproduce someone else's work without credit and permission". And that's the law's hard line, for the most part. Whether manually or via AI. AI is a tool, it has no moral quality either way.

But that's a far cry from what's being claimed. You can't justly claim ownership of a style, or of an idea (not by copyright, and patents cover a completely disjoint, non-applicable set of ideas). Remixing bits and pieces of ideas and styles has a long and storied history in art. And, for reference, the AIs are not literally storing and reproducing the training data. They only have a big vector of weights that represents a gestalt of the entire training set.

For reference, the fashion industry has zero copyright protection except in explicit logos. "Knockoffs" are often made on the same machines from the same patterns with the same materials, and as long as they sew in a different label, there's no "theft" involved. And fashion does just fine.

Atranen
2023-08-06, 03:25 PM
To summarize a few points:

1) While the art community has had this discussion it is not settled and certain artists are comfortable using AI in their work.

2) There are two separate questions regarding the AI training on existing data; 2a) whether or not AI using training data at all is ethical and 2b) whether the data that was used for current AI training sets was obtained ethically.

3) These are both distinct from the question of 'should WotC be using AI art in their books'. It could be that AI training is ethical but it is still wrong for WotC to use it, because doing so would be harmful to the community of artists.

For 2a I think PhoenixPhyre has it right, that there's no problem writ large. For 2b I think False God has it right that big ethical questions exist about the content of the current sets. I think this is enough to make me suspicious of WotC using it. But I wouldn't begrudge an individual using it for their own tokens or whatever. I hope more ethically sourced AI art will become available, especially for smaller companies.

Witty Username
2023-08-06, 08:45 PM
The important take away, I think, is all of this is new. I can see applications that are positive. But the implications and effects are still being worked on. Caution is warranted.
Especially since taking and repackaging others' work is definitely a potential use, and what organizations seem to be using it for.

But I won't call moralism in either direction; how people approach the unknown is going to vary, and that is how the unknown becomes known.

Envyus
2023-08-06, 10:24 PM
Wizards does not intend to use AI art anyway, as they were unaware AI-enhanced art was used until a couple of days ago.

Dr.Samurai
2023-08-07, 08:14 AM
My issue with this is:

1. The sketches look better than the AI art, so I don't understand why you would use the AI to "enhance" the product. In the sketches, I can understand that the tail of the dinosaur is slamming on the ground and creating a cloud of dust and debris. With the AI "enhancement" it looks awful and like AI gobbly ****.

2. If Wizards is paying for this... why aren't they asking for the artist to fix this stuff? (Not involved in this industry in any way shape or form so I don't know how it goes when you receive the commission you paid for, but the weapon going through the arm of the person in the image seems like something you could ask them to fix, no?)

False God
2023-08-07, 08:39 AM
My issue with this is:

1. The sketches look better than the AI art, so I don't understand why you would use the AI to "enhance" the product. In the sketches, I can understand that the tail of the dinosaur is slamming on the ground and creating a cloud of dust and debris. With the AI "enhancement" it looks awful and like AI gobbly ****.

2. If Wizards is paying for this... why aren't they asking for the artist to fix this stuff? (Not involved in this industry in any way shape or form so I don't know how it goes when you receive the commission you paid for, but the weapon going through the arm of the person in the image seems like something you could ask them to fix, no?)

What I REALLY don't understand is their choice to use this artist to begin with. Googling them, NONE of their work even remotely resembles what you might see in a D&D book or something similar, and then suddenly they're producing AI-looking art? Because they "sketched" it beforehand? I understand the dinosaurs were drawn by someone else, but in the particular case of the giants, those "before" and "after" pieces don't look anything like what I've seen from human artists. The "before" sketches look suspiciously like first-pass AI work, and my experience over on DeviantArt is that every pop-up AI artist claims to be a totally normal professional artist who only uses AI to "enhance" their sketches, and this almost universally turns out to be a complete lie.

Yeah, clearly Ilya does art, but I don't see ANY examples of anything remotely resembling D&D/MTG/TTRPG-fantasy/sci-fi art
https://ilyashkipin.com/

stoutstien
2023-08-07, 08:42 AM
What I REALLY don't understand is their choice to use this artist to begin with. Googling them, NONE of their work even remotely resembles what you might see in a D&D book or something similar, and then suddenly they're producing AI-looking art? Because they "sketched" it beforehand? I understand the dinosaurs were drawn by someone else, but in the particular case of the giants, those "before" and "after" pieces don't look anything like what I've seen from human artists. The "before" sketches look suspiciously like first-pass AI work, and my experience over on DeviantArt is that every pop-up AI artist claims to be a totally normal professional artist who only uses AI to "enhance" their sketches, and this almost universally turns out to be a complete lie.

Yeah, clearly Ilya does art, but I don't see ANY examples of anything remotely resembling D&D/MTG/TTRPG-fantasy/sci-fi art
https://ilyashkipin.com/

They actually do have some decent examples in the MM. One of the few that didn't do 100% weird action poses.

False God
2023-08-07, 08:51 AM
They actually do have some decent examples in the MM. One of the few that didn't do 100% weird action poses.

Googling them, they did the Thri-kreen, it seems? I didn't see any others. Alright, so they've worked with this artist before. I guess I would say, then, that their current portfolio doesn't seem to say "Hire me to do fantasy creature and character design."

stoutstien
2023-08-07, 09:00 AM
Googling them, they did the Thri-kreen, it seems? I didn't see any others. Alright, so they've worked with this artist before. I guess I would say, then, that their current portfolio doesn't seem to say "Hire me to do fantasy creature and character design."

Little bit of irony in that they did the Rakshasa. *This one sticks out to me because it has a vibe that shows the arrogance of the NPC extremely well*

Working from memory I think they did the basilisk, and one of the plant based NPCs.

Wish I knew who did the dragon turtle..

False God
2023-08-07, 09:06 AM
Little bit of irony in that they did the Rakshasa. *This one sticks out to me because it has a vibe that shows the arrogance of the NPC extremely well*

Working from memory I think they did the basilisk, and one of the plant based NPCs.

Hmmm, I'll be honest I don't pay super much attention to who the D&D artists are (I care more about MTG art and artists) so I'll concede that WotC has clearly worked with this artist before and they seem perfectly capable of producing fantasy art.

I would then probably suspect that, given they were handed sketches from another artist to "refine", they probably dramatically underbid. IME, when poor workmanship is so obviously apparent, the source is usually underbidding.


Wish I knew who did the dragon turtle..
To the best of my searching, it is also by Vance Kovacs.
This is the only website I can cite accrediting it to him via the signature. (https://citystompers1.tumblr.com/post/166165284664/illustration-of-a-dragon-turtle-by-vance-kovacs)

Ionathus
2023-08-07, 09:13 AM
Then so is all art. Including writing. Every artist is trained on a subset of what came before and then goes beyond that to one degree or another.

Legally, “theft” is questionable. There are colorable arguments on either side. But the law is famously a donkey’s hindquarters.

Morally, I see no argument that AI is any different than a human doing the same thing. Learning elements of styles (which are not at all protected morally or legally) and mixing them together to get something similar but different. That’s how all creative work happens. And always has.

Yes, adaptation is crucial to how art grows and changes over the years. Yes, all artists are inspired by other artists. Yes, if you wanted to be terribly uncharitable you could compare this to "stealing."

IMO, the difference is in the..."metabolism" of the art, for lack of a better word. Are you taking these sources and internalizing them, processing them, and then letting them inform how you create a new work of your own? Or are you just regurgitating the originals undigested, with a thin veneer of new paint to disguise their true source?

People have made glib remarks about the difference between "inspiration" and "plagiarism" for generations but there is unquestionably a difference. I think a lot of people are nervous about / hostile to AI art because the process of "creation" is so different. Copy/paste is a core function of technology, but humans are notoriously terrible at perfectly mimicking visual art -- even if I tried to plagiarize the Mona Lisa, the work of art I would create would be vastly (and laughably) different, and thus would stand on its own much more clearly. But AI blurs that line beyond comfort for a lot of people, including me.

I'm not a visual artist but I feel the same concept applies to my area (writing).

stoutstien
2023-08-07, 09:21 AM
Hmmm, I'll be honest I don't pay super much attention to who the D&D artists are (I care more about MTG art and artists) so I'll concede that WotC has clearly worked with this artist before and they seem perfectly capable of producing fantasy art.

I would then probably suspect that, given they were handed sketches from another artist to "refine", they probably dramatically underbid. IME, when poor workmanship is so obviously apparent, the source is usually underbidding.


To the best of my searching, it is also by Vance Kovacs.

I pay attention b/c honestly it's the only reason I buy npc splat books. The lore is "ok" but not in a kitchen sink format like the MM and the rest is math. I like art and it's one of the few ways I can purchase it in large quantities on my budget.

Personally I'd love if more artists started Co-oping art books. I buy them when I find them but they are rare.

Dr.Samurai
2023-08-07, 09:23 AM
Ok... so a couple of more questions.

1. I've never checked for who the artists are for individual pieces in D&D books, but shouldn't this be an easy thing to learn?

2. This is coming off of Ionathus' comment above, but if, hypothetically speaking, you could create a database of all original artwork available online, and set the AIs to never duplicate that work, no matter the prompt, would that allay some of the concern? (I doubt this is even feasible because I think we'd get into questions of "what is original art?". But for the sake of discussion.)
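
For the sake of that discussion, here's roughly what such a "never duplicate" check could look like. Everything here is hypothetical: real systems would need a far more robust perceptual fingerprint, and this can only ever be a similarity filter, not a guarantee. The sketch uses an "average hash" over stand-in 8x8 grayscale grids:

```python
# Hypothetical sketch of a "never duplicate the registry" filter.
# Images are stand-in 8x8 grayscale grids (lists of lists of 0-255 ints).

def average_hash(pixels):
    """64-bit perceptual hash: bit i is set when pixel i is brighter
    than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def too_similar(candidate, registry, threshold=10):
    """Reject a generated image whose hash lands within `threshold`
    bits of any registered original work."""
    h = average_hash(candidate)
    return any(hamming(h, r) <= threshold for r in registry)

# A registry holding one "original" work:
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
registry = [average_hash(original)]
print(too_similar(original, registry))  # an exact copy is caught: True
```

The catch, of course, is that this only works if every original is in the registry and every generator actually runs the check, which is where the honor-system problem comes in.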

False God
2023-08-07, 09:27 AM
I pay attention b/c honestly it's the only reason I buy npc splat books. The lore is "ok" but not in a kitchen sink format like the MM and the rest is math. I like art and it's one of the few ways I can purchase it in large quantities on my budget.
That's basically my stance on MTG these days. I homebrew a lot of D&D (mostly 3X) & WoD statblocks based on MTG creatures.


Personally I'd love if more artists started Co-oping art books. I buy them when I find them but they are rare.
WotC likely has restrictions on printings outside of the books, unfortunately, but frankly I'd be down for an entire D&D book with a BIG FOCUS on the art of the creatures, maybe even with a couple different images each, and a very minor description (if one at all) and no stats. Sorta like how WoW releases artbooks with the Special Editions of each expansion.

stoutstien
2023-08-07, 09:27 AM
Ok... so a couple of more questions.

1. I've never checked for who the artists are for individual pieces in D&D books, but shouldn't this be an easy thing to learn?

2. This is coming off of Ionathus' comment above, but if, hypothetically speaking, you could create a database of all original artwork available online, and set the AIs to never duplicate that work, no matter the prompt, would that allay some of the concern? (I doubt this is even feasible because I think we'd get into questions of "what is original art?". But for the sake of discussion.)

You'd think, but sometimes they don't include the credits, so you have to go digging, and because they have a lot of concept artists as well, it can be difficult to find the person who did the part you like.

False God
2023-08-07, 09:33 AM
Ok... so a couple of more questions.

1. I've never checked for who the artists are for individual pieces in D&D books, but shouldn't this be an easy thing to learn?

2. This is coming off of Ionathus' comment above, but if, hypothetically speaking, you could create a database of all original artwork available online, and set the AIs to never duplicate that work, no matter the prompt, would that allay some of the concern? (I doubt this is even feasible because I think we'd get into questions of "what is original art?". But for the sake of discussion.)

1: The long and short of it is that although the artists are all credited at the end, not every artist is specifically credited to each piece.

2: Short answer: it's not feasible. Long answer: because feeding the AI runs entirely on the honor system, and many of the AI models operate outside of the USA/Western world/etc., where copyright either can't be enforced or it's well known everyone ignores it anyway. In the case of DA, they started up their own AI tool and, because they're apparently run by Web3 cryptobros, forced everyone to check a box on each work they had uploaded to "opt out," with the default setting being that everyone's art was "opted in" to DA's new AI tool. They backpedaled quickly (in about a week), but functionally the damage was done: the AI tool had already had time to analyze a great deal (I can't speculate how much, but AIs are pretty quick) of the work already uploaded.

Beyond that, there is functionally nothing stopping someone else from downloading and re-uploading someone's work and feeding it into the AI, or an entirely different AI from scraping the whole of DA without anyone knowing. And AIs can't really "forget" a piece. It's not like it's stored internally within the AI; the AI has already "learned" from that piece and can now replicate its style.

So, the short long answer is: that's impossible.

Sigreid
2023-08-07, 09:34 AM
Y'all are late to the discussion, the art community had this discussion a year ago.

AI "art" is almost explicitly founded off stealing the art of others by feeding it the work of human artists without their consent. It's not as simple as some program "looking" at google search results. It has to be explicitly taken from a source and fed into the AI's system. That's theft. It basically destroyed Deviantart because they started with an "opt out" mentality, automatically feeding every artist's work into their AI "art" system without consent, hoping that artists would be too slow to opt out and they'd have trained their AI before anyone could stop them.

These BS arguments of "everyone steals" are just pathetic self-placations to make yourself feel better about stealing.

AI "art" is theft. Period.
As a non-artist, I'm not sure this is any different from artists searching the internet and going to museums to get ideas, study different styles and techniques, and learn what, say, a fantasy tavern would look like in practical terms.

Dr.Samurai
2023-08-07, 09:38 AM
WotC likely has restrictions on printings outside of the books, unfortunately, but frankly I'd be down for an entire D&D book with a BIG FOCUS on the art of the creatures, maybe even with a couple different images each, and a very minor description (if one at all) and no stats. Sorta like how WoW releases artbooks with the Special Editions of each expansion.
This seems almost like untapped potential lol.

I'd buy these types of artbooks. A book of each of the core races, depicted as various player classes in different environments fighting different creatures would be awesome. Same with monsters, young version, prime, dire, old, all in different places or poses.

Maybe not everyone would be interested, so maybe not a good place to invest their resources, but I'd love books of D&D vibes.

Rynjin
2023-08-07, 09:41 AM
As a non-artist, I'm not sure this is any different from artists searching the internet and going to museums to get ideas, study different styles and techniques, and learn what, say, a fantasy tavern would look like in practical terms.

The difference is a person is doing it and forming their own interpretation of what a "fantasy tavern" means to them, whereas a machine learning tool is literally just taking those bits, chopping them up, and remixing them into "new" art.

Not only is there zero of the intentionality, there is zero of the baseline artistic talent required.

It's a lot like tracing. Even if you're tracing different parts of something and stapling them together later, you're not really much of an artist.

False God
2023-08-07, 09:46 AM
As a non-artist, I'm not sure this is any different from artists searching the internet and going to museums to get ideas, study different styles and techniques, and learn what, say, a fantasy tavern would look like in practical terms.

It is different because AI doesn't "improvise" like that. AI doesn't look at a selection of pieces and say "yeah, this is nice, but I could do a little differently here, or make adjustments for my own style there". It doesn't think creatively. There's a comment Seven of Nine makes back in her early days on Voyager about how the Borg don't innovate; they only present an illusion of innovation by having combined so many different species' technologies.

This is equally true of AI. AI replicates. Importantly, it doesn't duplicate (though it can). It looks at what it's fed and replicates it. That's why early AI art all had a particular painterly "Sakimichan" (google it) style. That's why so much of AI art, no matter who puts in the keywords, outputs very similar results when asked to make, say, a "furry adoptable" or "cute girl in a bikini" or "busty anime chick".

GAN models don't innovate or think creatively. They take what they are given and output something reasonably similar. Sure, it can be a totally different work, but fundamentally the "mind" behind AI isn't "thinking" like a human artist is. It's replicating known patterns.

Sigreid
2023-08-07, 09:46 AM
The difference is a person is doing it and forming their own interpretation of what a "fantasy tavern" means to them, whereas a machine learning tool is literally just taking those bits, chopping them up, and remixing them into "new" art.

Not only is there zero of the intentionality, there is zero of the baseline artistic talent required.

It's a lot like tracing. Even if you're tracing different parts of something and stapling them together later, you're not really much of an artist.
You'd have to know a lot more about machine learning than I do to know if this is a true statement. I do know that my niece is in college studying art, and she's been able to produce drawings and paintings that would blow Michelangelo away since she was a kid, largely because she's had another 1000 years of new techniques to draw on, not just because she's more talented than he was (my proud uncle goggles are firmly in place 😜).

PhoenixPhyre
2023-08-07, 09:48 AM
Yes, adaptation is crucial to how art grows and changes over the years. Yes, all artists are inspired by other artists. Yes, if you wanted to be terribly uncharitable you could compare this to "stealing."

IMO, the difference is in the..."metabolism" of the art, for lack of a better word. Are you taking these sources and internalizing them, processing them, and then letting them inform how you create a new work of your own? Or are you just regurgitating the originals undigested, with a thin veneer of new paint to disguise their true source?

People have made glib remarks about the difference between "inspiration" and "plagiarism" for generations but there is unquestionably a difference. I think a lot of people are nervous about / hostile to AI art because the process of "creation" is so different. Copy/paste is a core function of technology, but humans are notoriously terrible at perfectly mimicking visual art -- even if I tried to plagiarize the Mona Lisa, the work of art I would create would be vastly (and laughably) different, and thus would stand on its own much more clearly. But AI blurs that line beyond comfort for a lot of people, including me.

I'm not a visual artist but I feel the same concept applies to my area (writing).

Great, that's how AI already works. AI does not simply "regurgitate the originals undigested, with a thin veneer of new paint to disguise their true source". It can't--the originals are no longer part of it once training is complete. That's why you can download a trained model of a few GB that was trained on multiple tens of TB of data--all that remains is a matrix with a few billion entries, called "weights". The images themselves, along with the text describing those images, are gone.
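That compression claim is easy to sanity-check with back-of-the-envelope arithmetic. The parameter count and storage precision below are illustrative assumptions, not the specs of any particular model:

```python
# Back-of-the-envelope check on the size claim above. The figures are
# illustrative assumptions, not the specs of any real model.
params = 2_000_000_000            # "a few billion" weights
bytes_per_weight = 2              # e.g. fp16 storage
model_size_gb = params * bytes_per_weight / 1e9
print(model_size_gb)              # → 4.0 (a "few GB" download)

training_data_bytes = 50e12       # "multiple tens of TB" of images + captions
ratio = training_data_bytes / (params * bytes_per_weight)
print(round(ratio))               # → 12500 (the originals can't possibly all fit)
```

So even a lossless copy of the training set would need roughly four orders of magnitude more space than the model file actually occupies.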

Curiously, this is exactly the same sort of reduction (via different mechanisms) that happens when a human learns--the sensory data floods in and is reduced to a much smaller change in the neuronal structure of the brain (and other parts). The actual data is not encoded anywhere.

An AI (whether an LLM or one of the picture transformers we're talking about) is basically a gigantic matrix solver in a loop. It takes an input matrix (either a source image or a random field of pixels) and some text data (the prompt) and repeatedly applies its weight matrix using one of a number of solver algorithms, transforming the input until the output array is in a "potential minimum" (region of high probability) with regard to the transformed text. It doesn't know anything about the sources that went into its training; it doesn't know anything about the output, not really. All it does is fuzz pixels until its math says "ok, that's good" based on its training weights--which were informed by the input, but not dictated by it in any deterministic way.
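That loop can be sketched in a few lines. This is a toy stand-in only: here the "learned score" is faked by pulling pixels toward a fixed target pattern, whereas a real diffusion model computes that direction from billions of trained weights and a prompt embedding, never from a stored target image.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for "what the prompt describes". A real model has no such
# stored image; the pull direction comes entirely from learned weights.
TARGET = np.linspace(0.0, 1.0, 16).reshape(4, 4)

def refine(x):
    """One solver step: nudge the array toward the high-probability region."""
    predicted_direction = TARGET - x      # fake "score"; normally a network output
    return x + 0.1 * predicted_direction  # small step, applied repeatedly

x = rng.normal(size=(4, 4))               # start from a random field of pixels
for _ in range(200):                      # loop until the math says "that's good"
    x = refine(x)

print(np.abs(x - TARGET).max() < 1e-3)    # → True: output settled near the minimum
```

The point of the sketch: nothing in the loop looks anything up in a database of source images; it just repeatedly transforms an array until a score is satisfied.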

To get replicas of existing works requires active human interaction pushing it that way. You have to start with a source image that's fairly close and then prompt it (often refining the prompts several times) to get closer and closer. Doing that, I'd say, is unethical. But doing what AI-enabled artists do, which is start with an original sketch, feed it in with instructions on transforming it (maybe adding color, maybe doing shading, etc), pulling out pieces, fixing them, then sending it back through the AI, etc? That's art. Just with a different tool.

AI is a tool. Nothing more. Training it is no different than a human sitting there watching a stream of pictures go by[1] with someone telling them what each picture was and then producing a picture that matches the description given and being judged on it[2]. Just...a lot faster per image. And since many many images fit each and every description, no single description uniquely identifies a single protected work.

[1] Reproducing art for education is explicitly protected under international copyright law. There is absolutely no reasonable chance that you'd have any issue doing this if it were a human being trained this way. And the analogy here is fairly close. Which is why I'd say that the legal issues here are...murky and uncertain. Certainly not clear-cut on either side.

[2] This requires massive amounts of human involvement. Labeling the images in a consistent way, where slight deviations can massively bias the model and ruin the entire training run (which is the expensive part), is decidedly non-trivial. And then the judging at the end, at this point, requires a trained artist to look at the output and say "yup, that matches the prompt" or "no, that doesn't match the prompt" so the training process can re-adjust.

False God
2023-08-07, 09:48 AM
This seems almost like untapped potential lol.

I'd buy these types of artbooks. A book of each of the core races, depicted as various player classes in different environments fighting different creatures would be awesome. Same with monsters, young version, prime, dire, old, all in different places or poses.

Maybe not everyone would be interested, so maybe not a good place to invest their resources, but I'd love books of D&D vibes.

I think it would also be great for reference images for a group, without worrying about disclosing stats. You could pass the book around, people could look up and down the creature and still have no idea if its weak or strong or has any special abilities they need to worry about.

But I think it is clear that WotC, at least their D&D side, does not fundamentally value art if their creative directors are willing to give shoddy AI art (and there is good AI art, don't get me wrong) a pass. As I recall, there have been earlier statements, particularly when it comes to the high prices of D&D books, about "how expensive the art is", which leads me to the conclusion that D&D generally sees "art" as a cost and not a benefit.

Chronos
2023-08-07, 09:56 AM
Quoth Zhorn:

No, I mean stolen. Seen too many examples of people feeding the work of other artists into these AI programs and it spitting out works that have blatant replications of parts of their work.
...
If the work wasn't drawn by you, you have no claim to call it your own work
Think carefully about what you're saying, here. Let's say that Bob the Human makes a painting X, and then Hal the AI uses that painting as part of its training data, and uses the training data to make an image Y that's similar in some way to X. By your own standard, since Bob did not draw Y, he has no claim to call it his own work, and therefore no basis on which to complain about theft.

You can't say that what humans are doing is fine, but what AI are doing is theft, because the AI and the humans are doing the same thing.

Sigreid
2023-08-07, 09:58 AM
I think it would also be great for reference images for a group, without worrying about disclosing stats. You could pass the book around, people could look up and down the creature and still have no idea if its weak or strong or has any special abilities they need to worry about.

But I think it is clear that WotC, at least their D&D side, does not fundamentally value art if their creative directors are willing to give shoddy AI art (and there is good AI art, don't get me wrong) a pass. As I recall, there have been earlier statements, particularly when it comes to the high prices of D&D books, about "how expensive the art is", which leads me to the conclusion that D&D generally sees "art" as a cost and not a benefit.
I thought that the story was that WoTC hired an artist and that artist used an AI tool to make their job easier and that went wrong?

False God
2023-08-07, 10:04 AM
I thought that the story was that WoTC hired an artist and that artist used an AI tool to make their job easier and that went wrong?

Sure, that's the "story". Every piece of art submitted to D&D/WotC goes past the Art Director to make sure it's "right". They don't just hire a guy, have them make some work and then stick it in the book. But strangely, that is exactly what happened here. And apparently, given some of the absolutely terrible art in some other recent books, whoever's job it is to check the work of the artists they hire hasn't been doing it.

WotC is too big of a company, with too clearly defined jobs to simply go "whoops, we let one slip!". Sure, if this was an isolated case, a single image, that's one thing. Clearly the art was reviewed, but a couple instances were missed. But we're talking about the majority of the work throughout an entire book, if not the whole of it. And we have prior examples of WotC clearly not caring about the quality. And we have prior statements talking about "boo hoo, the cost of art!" And we have the failed OGL revamp that quite clearly stated WotC felt entitled to your work.

In a vacuum could this be a simple mistake? Sure.

In context? I would highly doubt it.

Rynjin
2023-08-07, 10:19 AM
You'd have to know a lot more about machine learning than I do to know if this is a true statement. I do know that my niece is in college studying art, and since she was a kid she's been able to produce drawings and paintings that would blow Michelangelo away; largely because she's had another 1000 years of new techniques to draw on, not just because she's more talented than he was (my proud uncle goggles are firmly in place 😜).

I use a machine learning tool for work, which is part of where my contempt for it comes from. =p

ML is ultimately dumb. It's really good at performing repetitive tasks: recognizing incoming sources of information and sorting them into little boxes.

That is ultimately what AI "art" is. It gets a prompt like "draw me an anime girl with red hair in the style of Akira Toriyama" and looks for keywords in its dataset. "anime red hair"; maybe it pulls characters like Ranma or even Super Saiyan God Goku for the exact shade of the hair color. It has millions of data points to pull from for "anime girl", most with near-identical body types, and takes random elements from any or all of those, etc.

And when it's done assembling the elements it tries to blend them all together so they look like they were all made by the same person.

That's it. It's complex, it's a real technical marvel, but great art it is not, and will never be. It's, at best, "vague competence".

Which is not exactly worth paying someone for.

PhoenixPhyre
2023-08-07, 10:25 AM
That's it. It's complex, it's a real technical marvel, but great art it is not, and will never be. It's, at best, "vague competence".

Which is not exactly worth paying someone for.

With a few quibbles about the details I snipped, I basically agree with you.

Real great art >>>>> real not-so-great art >> the best AI art ~ real "bad" art >~ no art >>>> bad AI art.

NichG
2023-08-07, 10:39 AM
An ML approach that can actually 'pull' data from its dataset would be much harder to make than how a diffusion model or other autoregressive generative approaches work. That information is no longer accessible in a form that lets you say 'here's a fragment, I'm taking it from A and putting it on B'. There isn't a Ranma image somewhere in the model that the left sleeve is being extracted from and carefully put into place and fused. Statistical information about the aggregate of Ranma sleeves (and anime sleeves, and sleeves in general, and stick-like things in general, and cloth in general, and ...) is present, but it's more 'given this and the general structure of sleeves, how likely is it that this blurry smudge in the noise should be an edge or not?'. When the statistics are very sharp - there's only one way for images with those labels to be, and they are always that way - then you get things more like the training image. When the statistics allow for lots of completions in different ways, you get something new but consistent.

That's fundamental to why things like GPT hallucinate when you ask them about concrete things in their training data, and why you have to go very far out of your way to make sure something like that only responds with things that are actually in the data rather than just putting together something plausible. On the one hand, you can extract someone's password that was leaked by accident and made part of the training set, because that's a specific thing which is always that way: if the password is p#&b*N3QxAkB, then the probability that x follows Q follows 3 follows N is basically 1 in that context - nothing close but different to that sequence is going to occur anywhere in the training data. On the other hand, if the password is 'sMarMyCaT' then it might instead come up with 'sMoReCaLl' or 'SMarTeDge' or whatever, because now those are words, and lots of stuff follows 'sm' even in that sort of context.
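That asymmetry between unique gibberish and common prefixes shows up even in a toy character-level model. The corpus and "passwords" below are made up purely for illustration:

```python
from collections import Counter, defaultdict

corpus = [
    "smart", "smoke", "small", "smile",  # common 'sm' prefix: many continuations
    "p#&b*N3Q",                          # unique gibberish: one continuation each
]

# For each two-character context, count which character follows it.
follows = defaultdict(Counter)
for word in corpus:
    for i in range(len(word) - 2):
        follows[word[i:i+2]][word[i+2]] += 1

# The gibberish context is deterministic: probability ~1, i.e. memorized.
print(dict(follows["#&"]))  # → {'b': 1}
# The common prefix admits several continuations, so generation wanders.
print(dict(follows["sm"]))  # → {'a': 2, 'o': 1, 'i': 1}
```

A sequence gets "extracted" verbatim exactly when the statistics around it are that sharp; everywhere else the model can only produce something plausible.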

People have tried that kind of patch-based synthesis approach before, and it was a contender for a while, back around the time of BigGAN. Where it was really effective was in generating larger textures without obvious periodically repeating subunits from a single reference image - the fact that it could pull data directly from a source meant that you wouldn't have to worry about training; you just use the image as input. But in the end, for the kind of full composition synthesis that this thread is about, guided and non-guided diffusion models won out. Those are more like what PhoenixPhyre described.

(Though I would nitpick some of the technical details, as that description sounds a little bit more like a Deep Boltzmann Machine, where the data distribution is the equilibrium distribution of a thermodynamic model with a learned energy function. Diffusion, by contrast, is a non-equilibrium process that can also be thought of as having a score function or energy, but it's one that explicitly depends on time - so you can't 'run it until it looks good'; you actually have to tell it what the current time is, and while you can take smaller time-steps, the transform from noise distribution to image distribution occurs over a fixed interval. But that's not all that relevant to the point here...)
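The time-dependence being nitpicked here is visible even in a one-dimensional toy: each reverse step takes the current time t as an input, and the whole pass runs over a fixed interval rather than to convergence. The linear schedule below is generic and purely illustrative:

```python
import math
import random

T = 1000
# Linear noise schedule: step behavior is dictated by beta(t), not by
# any "does it look good yet?" convergence test.
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]

def reverse_step(x, t):
    """One reverse-diffusion step (toy): explicitly conditioned on t."""
    beta = betas[t]
    drift = -0.5 * beta * x                       # stand-in for the learned score term
    noise = math.sqrt(beta) * random.gauss(0, 1) if t > 0 else 0.0
    return x + drift + noise

random.seed(0)
x = random.gauss(0, 1)                            # start from the noise distribution
for t in reversed(range(T)):                      # exactly T steps: a fixed interval
    x = reverse_step(x, t)

print(f"sample after fixed-length reverse pass: {x:.3f}")
```

Contrast with the equilibrium picture: there is no "run until it settles" option here, because the correct update at step 400 is different from the correct update at step 4.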

Sigreid
2023-08-07, 10:48 AM
I use a machine learning tool for work, which is part of where my contempt for it comes from. =p

ML is ultimately dumb. It's really good at performing repetitive tasks: recognizing incoming sources of information and sorting them into little boxes.

That is ultimately what AI "art" is. It gets a prompt like "draw me an anime girl with red hair in the style of Akira Toriyama" and looks for keywords in its dataset. "anime red hair"; maybe it pulls characters like Ranma or even Super Saiyan God Goku for the exact shade of the hair color. It has millions of data points to pull from for "anime girl", most with near-identical body types, and takes random elements from any or all of those, etc.

And when it's done assembling the elements it tries to blend them all together so they look like they were all made by the same person.

That's it. It's complex, it's a real technical marvel, but great art it is not, and will never be. It's, at best, "vague competence".

Which is not exactly worth paying someone for.

Been watching a fair bit of anime lately, and I don't know that I'd use "anime girl" as an example, as there seem to be only a dozen or so characters who get new clothes, a script, and occasionally a voice actor. Body type, hair style, etc. don't seem to vary much at all. Lol

Hael
2023-08-07, 11:21 AM
This is rather silly to be frank. AI art used properly is simply an evolution over photoshop. I know many artists who use AI in their workflows everyday.

Used in the right way, you won’t ever be able to spot the difference.

Five years from now, finding a digital artist that doesn’t use this technology will be as rare as finding one who doesn’t use photoshop/aftereffects etc

Dr.Samurai
2023-08-07, 11:35 AM
This is rather silly to be frank. AI art used properly is simply an evolution over photoshop. I know many artists who use AI in their workflows everyday.

Used in the right way, you won’t ever be able to spot the difference.

Five years from now, finding a digital artist that doesn’t use this technology will be as rare as finding one who doesn’t use photoshop/aftereffects etc
I'm not sure that this is a fair representation of the concerns.

An artist using software to augment art (their own art?) is different from software generating art on its own.

I'd be cautious to call concerns silly just yet. There's some disagreement in this thread about how the AI "learns" or "knows" how to create art, so I believe people are coming at this earnestly from their own understandings of how it works.

All that said... if we hold WotC to this standard of not using AI art, they may wind up competing with companies that do use it and don't have that cost. So we as consumers may pay for it in other ways, unless we all refuse to purchase products with AI art in it. Not sure that can be expected to happen.

False God
2023-08-07, 11:42 AM
All that said... if we hold WotC to this standard of not using AI art, they may wind up competing with companies that do use it and don't have that cost. So we as consumers may pay for it in other ways, unless we all refuse to purchase products with AI art in it. Not sure that can be expected to happen.

Considering Hasbro doesn't really "compete" with anyone for market share save Paizo, and Paizo has already committed to not using AI art (https://kotaku.com/paizo-pathfinder-artificial-intelligence-art-ban-rule-1850186576), and Hasbro is one of if not THE biggest toy company in the world, I'm not exactly inclined to worry.

And we're already paying for it, see past WotC comments about the cost of art. The Giants book certainly isn't any cheaper because of using AI.

Dr.Samurai
2023-08-07, 11:58 AM
I'm imagining a world where Critical Role and MCDM and others become competitors as WotC flails its way out of Top Dog status.

It's unlikely to happen, and also unlikely that content creators born out of the community would not also adopt an anti-AI art stance. So... I'll see myself out.

Sigreid
2023-08-07, 12:13 PM
I'm not sure that this is a fair representation of the concerns.

An artist using software to augment art (their own art?) is different from software generating art on its own.

I'd be cautious to call concerns silly just yet. There's some disagreement in this thread about how the AI "learns" or "knows" how to create art, so I believe people are coming at this earnestly from their own understandings of how it works.

All that said... if we hold WotC to this standard of not using AI art, they may wind up competing with companies that do use it and don't have that cost. So we as consumers may pay for it in other ways, unless we all refuse to purchase products with AI art in it. Not sure that can be expected to happen.

I'm back to: with the quality of art I've seen for a few editions now, I'd be OK if they dropped it altogether. I know I'm an outlier on that.

Witty Username
2023-08-07, 01:40 PM
2. If Wizards is paying for this... why aren't they asking for the artist to fix this stuff? (Not involved in this industry in any way shape or form so I don't know how it goes when you receive the commission you paid for, but the weapon going through the arm of the person in the image seems like something you could ask them to fix, no?)

If D&D is like MtG on this, their process is likely already past the point of no return for that. WotC usually finishes its product design well before the actual release date, including art. In D&D it has to be less lead time than MtG (they are still doing design for the new PHB, which I think is a next-year release). MtG design is finalized, I think, 1-2 years in advance so they have zero deadline risk.

ZRN
2023-08-07, 03:17 PM
Sure, that's the "story". Every piece of art submitted to D&D/WotC goes past the Art Director to make sure it's "right". They don't just hire a guy, have them make some work and then stick it in the book. But strangely, that is exactly what happened here. And apparently, given some of the absolutely terrible art in some other recent books, whoever's job it is to check the work of the artists they hire hasn't been doing it.

WotC is too big of a company, with too clearly defined jobs to simply go "whoops, we let one slip!".

I mean, clearly the art director screwed up bad here. Even if there wasn't an anti-AI policy somebody should have flagged art with obvious screwy elements.

But I read the context differently than you: I think they just have a small budget for art (relative to the amount of content they want) and tight deadlines, so they make do with crappy art that's not carefully reviewed. People on the internet like to say "Hasbro is huge and rich!" but to quote Bill Gates on the Simpsons, they didn't get rich by writing checks. They "create shareholder value" or whatever by pinching as many pennies as they can.

That said, I don't see evidence they want to replace all their artists with AI; they don't even need to take positive steps in that direction. They can just wait while generative AI annihilates the marketplace for independent artists and in a year or two they'll be paying so little that artists have no choice but to use AI to get done quicker if they want to pay rent.

Hael
2023-08-07, 03:46 PM
I'm not sure that this is a fair representation of the concerns.

An artist using software to augment art (their own art?) is different from software generating art on its own.

I'd be cautious to call concerns silly just yet. There's some disagreement in this thread about how the AI "learns" or "knows" how to create art, so I believe people are coming at this earnestly from their own understandings of how it works.
.

The artist in question retouched part of the image with AI. In other words, it was mostly his own; he just tweaked a portion of the scene. The modern tools do exactly that: you control almost everything in a scene, to the point that it's hard to figure out where the AI starts and the artist stops - where the lights are, the poses, the styles and objects, and how the composition is laid out. It's all controlled by the artist.

The only reason this is a thing at all is that the artist was lazy and careless, and the existing methods weren't quite good enough yet (you used to be able to spot AI art because of a variety of errors, like bad hands or weapons, or weird issues when you would zoom in to see small details).

Well, it's safe to say at this point all those old errors have mostly gone away. It's harder than ever to spot. Five years from now, only other AIs will have a prayer at differentiating a human from a non-human.

As far as how AIs learn, unfortunately it suits people's biases (and paychecks) to remain willfully ignorant. I've sort of given up even trying, as it's not something people really want to hear.

False God
2023-08-07, 04:00 PM
That said, I don't see evidence they want to replace all their artists with AI; they don't even need to take positive steps in that direction. They can just wait while generative AI annihilates the marketplace for independent artists and in a year or two they'll be paying so little that artists have no choice but to use AI to get done quicker if they want to pay rent.

Possibly, but there's certainly historical precedent for negative economic repercussions from tanking an entire industry in order to rely on mass-produced cheap goods, largely from poor nations. And it's not like this doesn't apply beyond TTRPG/MTG card art: there are multi-million-dollar businesses for logo design, AI can do that perfectly fine, and people making that kind of money aren't going to take kindly to some $5 knock-off work.

Grod_The_Giant
2023-08-07, 07:38 PM
To summarize a few points:

1) While the art community has had this discussion, it is not settled, and certain artists are comfortable using AI in their work.

2) There are two separate questions regarding the AI training on existing data; 2a) whether or not AI using training data at all is ethical and 2b) whether the data that was used for current AI training sets was obtained ethically.

3) These are both distinct from the question of 'should WotC be using AI art in their books'. It could be that AI training is ethical but it is still wrong for WotC to use it, because doing so would be harmful to the community of artists.

For 2a I think PhoenixPhyre has it right, that there's no problem writ large. For 2b I think False God has it right that big ethical questions exist about the content of the current sets. I think this is enough to make me suspicious of WotC using it. But I wouldn't begrudge an individual using it for their own tokens or whatever. I hope more ethically sourced AI art will become available, especially for smaller companies.
Even if you set aside the capitalism-related issues (which you absolutely should not) I think there are much deeper problems with the idea of using AI to generate more than bits and fragments. Art isn't a hobby or the province of a small subculture--it's one of the three fundamental functions of humanity. Once you put aside basic survival activity, our activities boil down to some combination of creating art, finding truths, and making connections. Art is a goal. The purpose of technology is to give us more freedom to pursue those three functions. When you look at it in that light, AI art represents more than just taking away jobs-- it's demoting an entire pillar of humanity to something we should rush through on our way to more "productive" uses of our time.



(and to be clear, I'm using "art" in the broadest possible sense)

Sigreid
2023-08-08, 07:56 AM
Even if you set aside the capitalism-related issues (which you absolutely should not) I think there are much deeper problems with the idea of using AI to generate more than bits and fragments. Art isn't a hobby or the province of a small subculture--it's one of the three fundamental functions of humanity. Once you put aside basic survival activity, our activities boil down to some combination of creating art, finding truths, and making connections. Art is a goal. The purpose of technology is to give us more freedom to pursue those three functions. When you look at it in that light, AI art represents more than just taking away jobs-- it's demoting an entire pillar of humanity to something we should rush through on our way to more "productive" uses of our time.



(and to be clear, I'm using "art" in the broadest possible sense)
I have a friend who has gotten into AI generated art. He feels like he's now got a creative outlet he's never had before and is delighted.

Imbalance
2023-08-08, 08:11 AM
I have a friend who has gotten into AI generated art. He feels like he's now got a creative outlet he's never had before and is delighted.

There's the distinction: artificial intelligence is no threat whatsoever to creative humans making art. It's the people who make art for profit who are feeling threatened by AI.

Sigreid
2023-08-08, 08:52 AM
There's the distinction: artificial intelligence is no threat whatsoever to creative humans making art. It's the people who make art for profit who are feeling threatened by AI.
Yep, and it's a tale as old as mankind that technology is developed and some portion of the population finds their niche threatened.

Keltest
2023-08-08, 09:07 AM
Yep, and it's a tale as old as mankind that technology is developed and some portion of the population finds their niche threatened.

I mean, if somebody went up to you at your job, told you to train your replacement (without being paid for it!) and that they weren't going to pay them as much as you get either, how would you react?

I'm not hugely impressed with the idea that everyone's hobby or joy should be doable as an actual job, but for people who actually have successfully monetized their art, this is basically devastating to their livelihood, to say nothing of insulting on a personal level.

Sigreid
2023-08-08, 09:23 AM
I mean, if somebody went up to you at your job, told you to train your replacement (without being paid for it!) and that they weren't going to pay them as much as you get either, how would you react?

I'm not hugely impressed with the idea that everyone's hobby or joy should be doable as an actual job, but for people who actually have successfully monetized their art, this is basically devastating to their livelihood, to say nothing of insulting on a personal level.
That's actually happened at a job I had in the past. My suspicion on this case is that there will still be a market for human made art, but the barrier to entry will be higher.

Dr.Samurai
2023-08-08, 09:29 AM
I think whether any given thing is inevitable or a foregone conclusion or "the future" or whatever, it's perfectly reasonable for the people that stand to be negatively impacted by it to have concern.

Imbalance
2023-08-08, 09:33 AM
I mean, if somebody went up to you at your job, told you to train your replacement (without being paid for it!) and that they weren't going to pay them as much as you get either, how would you react?

I've honestly been living in anticipation of that day for more than a decade, so my reaction would be little to no surprise. There are a handful of plan-B options that I've courted, as anyone should, because there has never been any guarantee about being able to make a living doing the same thing forever. The statistic I was told when I graduated was that the average person changes careers five times over a lifetime. At that point, I had already supported myself via four distinct vocations. I don't know what they teach in art schools that may make expectations different for those grads, but I believe that life experiences should teach one fairly early on that the world does not owe one a living.

I make art, too. For myself. I've kept most of it off the internet because long before AI was a thing, ideas of mine that I had no intention to sell were used by another for their gain. I've taught my children how to draw, and they surpassed me in skill and talent, as it should be. We give art away. My oldest seeks to make art into a vocation, to also teach it. The future of this topic hits very close to home, but there is so far no new concern that has not already been a concern.

NichG
2023-08-08, 10:30 AM
I mean, if somebody went up to you at your job, told you to train your replacement (without being paid for it!) and that they weren't going to pay them as much as you get either, how would you react?

I'm not hugely impressed with the idea that everyone's hobby or joy should be doable as an actual job, but for people who actually have successfully monetized their art, this is basically devastating to their livelihood, to say nothing of insulting on a personal level.

I'd refuse. And if I was really essential to the process, that would be a strong negotiating position I could leverage. But if it turns out that they can do it without my help because e.g. someone else is willing to volunteer to do this for free, or for other considerations that aren't relevant to me, or whatever, then so be it. And if it turns out that they could replace me just because e.g. they've been keeping records of my work process and that's enough to train a model without my assistance, again, so be it. In that case, I'm actually not needed, so I shouldn't expect to be paid just because at some previous point in time I did happen to be needed. I might still refuse other work from them, or being shifted around within the organization to some other job, on the basis of how it was done, because, as you say, it can certainly still be insulting on a personal level if and when there's deception or manipulation involved, and I wouldn't want to work in such an environment. But I wouldn't take offense at someone else choosing to take the offer, or at the people creating the thing that enables them to replace me, or even at their desire to not have to need me.

I don't feel I should get to gatekeep my particular skills, whatever they may be. I get to choose if and how I use them (or sell their use), but I don't get to tell someone else not to sell theirs more cheaply. And that may mean that the value of my skills ends up being lowered, but again, so be it - the value or not of my skills is a reflection of how much other people need them, not a reflection of how much effort or money I spent on acquiring them.

Anymage
2023-08-08, 10:43 AM
I get to choose if and how I use them (or sell their use), but I don't get to tell someone else not to sell theirs more cheaply. And that may mean that the value of my skills ends up being lowered, but again, so be it - the value or not of my skills is a reflection of how much other people need them, not a reflection of how much effort or money I spent on acquiring them.

Look into the history of the labor movement. Because people need money for things like food and shelter they prefer to have a job over not having one, which creates strong incentives for employers to see who will do the work for the lowest pay. This has a long history of being really nasty for most of the people involved, and there's a reason that both laws and labor organizations have sprung up to help counterbalance it.

We're not going to stuff the AI genie back into the bottle. Trying to would just mean that other people would advance the technology on their own. But if we're going to be making a lot of jobs compete with "employees" who don't need time off or remotely the same paycheck, we're going to want to have some discussions as a society about how to handle things going forward. Artists are just the tip of the spear in terms of people who will be affected. And if our collective response is to just shrug and leave it to the people affected to work things out for themselves, you shouldn't be too surprised if there's a lot of pre-emptive resistance as well as broader instabilities as lives are disrupted.

ZRN
2023-08-08, 10:44 AM
That's actually happened at a job I had in the past. My suspicion on this case is that there will still be a market for human made art, but the barrier to entry will be higher.

Yeah, I think AI art is basically to artists what LegalZoom was to lawyers/paralegals: it automates a lot of the dumb-but-moderately-profitable grunt work, which translates to a bunch of entry- and mid-level jobs disappearing from the field. Which sucks for the job prospects of people in those fields but is generally good for everyone else (e.g. now if I want a will made out it costs $20 instead of hundreds for an actual human lawyer).

As someone else here alluded, the "problems" generated here are all problems with capitalism, not problems with technological progress.

Brookshw
2023-08-08, 11:29 AM
My suspicion on this case is that there will still be a market for human made art, but the barrier to entry will be higher.

Considering the Copyright Office's current 'no protection' stance for AI generated images (see: Zarya of the Dawn), there are dwindling incentives for big companies to use AI generated art, though how things continue to play out is a big question mark.

False God
2023-08-08, 11:37 AM
Considering the Copyright Office's current 'no protection' stance for AI generated images (see: Zarya of the Dawn), there are dwindling incentives for big companies to use AI generated art, though how things continue to play out is a big question mark.

Bee-Tee-Dubs, here's the letter (https://www.copyright.gov/docs/zarya-of-the-dawn.pdf).

I think it's important that people read it, since it can provide clarification on the risk involved in using AI art, buying AI art from someone, or buying a product with AI art. *looks in WotC's direction*

Zhorn
2023-08-08, 12:05 PM
Been out of this thread for a bit, so there's a bit too much to respond to directly. But some good points and counterpoints are being raised.
Will just focus on the post directed at me


Quoth Zhorn
No, I mean stolen. Seen too many examples of people feeding the work of other artists into these AI programs and it spitting out works that have blatant replications of parts of their work.
...
If the work wasn't drawn by you, you have no claim to call it your own work

Think carefully about what you're saying here. Let's say that Bob the Human makes a painting X, and then Hal the AI uses that painting as part of its training data, and uses the training data to make an image Y that's similar in some way to X. By your own standard, since Bob did not draw Y, he has no claim to call it his own work, and therefore no basis on which to complain about theft.

You can't say that what humans are doing is fine, but what AI are doing is theft, because the AI and the humans are doing the same thing.
So first up, that's squishing two lines together by removing the response components and the surrounding statements that contextualize them as different points. It also simplifies my position into something I definitely do not agree with, presenting the opposite of my actual stance.

The first is to do with AI and how it generates images. It doesn't create new things; it slaps together composites of the work it has been 'trained' on, and uses the data on how different images blend from one component to the other to smooth out the differences. It is a literal copy-paste at the start of the process.
When you have access to the original images the AI used as template information, you'll be able to find the different parts taken from the different works. The further away any single component is from something else the AI is trying to blend it into, the more blatant the plagiarism is.

The second line is to do with human artists taking the work of other human artists and claiming ownership, along with other human-to-human copying examples. Repeating with more emphasis:
If you are writing a paper, and you copy a chunk of text from someone (no rewording, just select, copy paste) and submit that without citing that section as someone else's work, that is plagiarism.
If someone else writes a song and you lift a part of their recording and put that into your audio track without permission, and then attempt to present the track as your own performance, that is also plagiarism.

These are different to holding up the original work to learn from in attempt to recreate from the ground up.
You can write out someone else's ideas in your own words and get away with simply posting the source in the bibliography for where the information came from, but you CAN claim the writing as your own.
You can listen to a song played by someone else, or read off their sheet music, but playing the instrument yourself for your recording has you instead producing a cover or rendition. You have performed the piece. The song was still someone else's, but the performance was yours.
Same goes for paintings. You can look at someone else's work, but as long as you've built up your image 100% yourself (zero copy/pasting from the source image), then it's a rendition of someone else's work, but YOU are the creator of the new image.
AI does not do this. It copies the source directly and then mutates from there. It cannot make anything new, and so has to start from an existing template.

Where it becomes blatant theft for AI is where the work being used is not being sourced from the user of the AI.
If the original artist did not submit their work with permission for it to be used by that AI, then any use of the original artist's work by that AI is theft.
And this repeats the second point from what you quoted me on; if the user of the AI then takes what the AI produced using the source work of another artist, then that final product cannot be claimed as that user's work.

It can be very tiresome repeating this point over and over. Many folks here have demonstrated they understand the difference, but there are others where they cannot see the distinction.

One last tabletop analogy
Alice rolls a d20 on her turn for an attack roll and gets a 17 on the die.
Bob's turn comes up, when called for a d20 roll, he points to the dice Alice rolled and says that was his dice roll and he's using that 17 result as his attack roll.
Carol is up next, then borrows Alice's d20 and also manages to roll a 17 using Alice's die. The die is still Alice's, but the roll was Carol's.

Alice is the original
Bob is doing the AI copy
Carol is doing a human replication

NichG
2023-08-08, 12:15 PM
Look into the history of the labor movement. Because people need money for things like food and shelter they prefer to have a job over not having one, which creates strong incentives for employers to see who will do the work for the lowest pay. This has a long history of being really nasty for most of the people involved, and there's a reason that both laws and labor organizations have sprung up to help counterbalance it.

We're not going to stuff the AI genie back into the bottle. Tying to would just mean that other people would advance the technology on their own. But if we're going to be making a lot of jobs compete with "employees" who don't need time off or remotely the same paycheck, we're going to want to have some discussions as a society about how to handle things going forward. Artists are just the tip of the spear in terms of people who will be affected. And if our collective response is to just shrug and leave it to the people affected to work things out for themselves, you shouldn't be too surprised if there's a lot of pre-emptive resistance as well as broader instabilities as lives are disrupted.

I'm never surprised at pre-emptive resistance, but that doesn't mean that I sympathize with it. The issue is that while in the short term you can make a positional argument for a protectionist stance - I need this, this will help me, etc - in the long term it cuts both ways. You create a law to protect artists by assigning them a monopoly to their creative work; well, now the scam is for corporate collectives to make it impossible to compete as an individual artist, and to use that to force artists to sign over those monopolies to the corporate collective. Medieval guilds protected the interests of skilled labor against a ruling class that would otherwise use their leverage to push artisans around, but at the same time also created a stagnant environment that slowed the improvements that would eventually be responsible for how much better modern quality of life is compared to those times.

Or take the same argument, but switch the parties involved. I would have no sympathy for, say, a group of software developers and engineers who wanted to force me to stay entirely within their software ecosystem and did so by having their OS hobble or refuse to run cheaper competing products. Yet the same basic structure of argument is there for that group - they need to protect the value of their work, because otherwise it's a race to the bottom and they'll be out-competed by developers living somewhere with a lower cost of living who can afford to charge less, or even worse those open-source zealots who are just giving stuff away for free. So in the same way, I have no sympathy for an artist who says 'I want to force you (indirectly) to pay me, by using collective pressure to kill competitors to my business model'. I would support their right to bargain collectively (as opposed to a situation in which there would be laws compelling them to work even when it's not in their interests), but that doesn't mean I necessarily want them to win. Nor does it mean that it makes sense for me to support their cause.

For me, it's more important that there are boundaries of reasonable actions that people can take, even if the actions within those boundaries harm the interests of other people. So it's fine, good even, that artists can say "If I continue to work this way, I will be harming myself, so I will refuse." But it's also, for me, good that in turn companies can say "Well, if we continue to negotiate with those artists we will be harming ourselves, so we will refuse". The ability to make that choice is, again to me, more important than a specific outcome of e.g. 'people I happen to like prospering'.

Damon_Tor
2023-08-08, 12:23 PM
I think whether any given thing is inevitable or a foregone conclusion or "the future" or whatever, it's perfectly reasonable for the people that stand to be negatively impacted by it to have concern.

This is how painters felt when photography was invented. And again when digital art tools were developed. None of this is new. Every time there's a leap forward and somebody's livelihood is threatened "it's not really art" and "this will never catch on" and "this should be illegal" is the predictable response. In 10 years nobody will care anymore. In 20 years nobody will remember that anybody ever cared.

Imbalance
2023-08-08, 12:25 PM
The situation also makes me wonder if there's a smug sense of vindication among former sculptors and modelers whose artistic endeavors were outsourced to digital sculpting and CAD prototyping some years ago.

verbatim
2023-08-08, 12:27 PM
Considering the Copyright Office's current 'no protection' stance for AI generated images (see: Zarya of the Dawn), there are dwindling incentives for big companies to use AI generated art, though how things continue to play out is a big question mark.


If the work wasn't drawn by you, you have no claim to call it your own work
Just the same as if a text wasn't written by you, you cannot claim authorship.
If the music was produced by someone else, do not claim to be the musician of the piece.

When photography was gaining prominence the US Copyright Office said that they would not give protection to photographs under basically the exact same argument. What ended up happening was Congress passed a law overruling them and then the Supreme Court upheld the law on appeal. The argument given was that the gear and the settings the photographer chooses represents an artistic medium.

I feel pretty confident that sooner or later it will be held that AI content qualifies for copyright when big business does it. At this point it's just a matter of whether or not ordinary people will be able to do so as well, which is presumably also why a lot of big companies are on board with the "unlicensed training data is a copyright violation" theory of tightening the fair use noose (they would certainly rather people be focusing on this than stuff like the Writer's Guild demands on AI restrictions in the ongoing Hollywood union row, which are actually designed to protect writers). They can afford to license this new technology, so licensing fees are a great means of stopping other people from doing it.




I have a friend who has gotten into AI generated art. He feels like he's now got a creative outlet he's never had before and is delighted.

I've stopped drawing and 3d modeling recreationally for disability related reasons. Messing around with Stable Diffusion in a not for profit manner has been a nice opportunity, and a big driver in my interest in the broader discourse around the subject.

Dr.Samurai
2023-08-08, 12:29 PM
This is how painters felt when photography was invented. And again when digital art tools were developed. None of this is new. Every time there's a leap forward and somebody's livelihood is threatened "it's not really art" and "this will never catch on" and "this should be illegal" is the predictable response. In 10 years nobody will care anymore. In 20 years nobody will remember that anybody ever cared.
The reality of that is not a reason to act like people are crazy or delusional or not informed. Not saying you're doing that, but I think some understanding is always appropriate.

Brookshw
2023-08-08, 12:40 PM
When photography was gaining prominence the US Copyright Office said that they would not give protection to photographs under basically the exact same argument. What ended up happening was Congress passed a law overruling them and then the Supreme Court upheld the law on appeal. The argument given was that the gear and the settings the photographer chooses represents an artistic medium.

Sure, Congress could certainly pass new laws, or things could continue to progress in a different direction. For the moment, though, there is a disincentive for corporations to use AI art. I completely agree it's an unstable and new issue (sorta?) that's going to see various changes. Then again, the piano roll didn't kill musicians.

(photographs have a lot more nuance to them than I want to go into).


I feel pretty confident that sooner or later it will be held that AI content qualifies for copyright when big business does it. At this point it's just a matter of whether or not ordinary people will be able to do so as well, which is presumably also why a lot of big companies are on board with the "unlicensed training data is a copyright violation" theory of tightening the fair use noose (they would certainly rather people be focusing on this than stuff like the Writer's Guild demands on AI restrictions in the ongoing Hollywood union row, which are actually designed to protect writers). They can afford to license this new technology, so licensing fees are a great means of stopping other people from doing it.

Ordinary people already can; Segev demonstrated the process in the other thread. I suspect that certain corners are (or at least should be) celebrating this, as it's generally in line with the copyleft, Creative Commons, etc., mindsets. I'm going to hold comment on the unlicensed training thing.

Damon_Tor
2023-08-08, 02:14 PM
The reality of that is not a reason to act like people are crazy or delusional or not informed. Not saying you're doing that, but I think some understanding is always appropriate.

I do have compassion. But I think the way this needs to be looked at is as grief, and the sooner they get to acceptance the better off they'll be.

Zhorn
2023-08-08, 10:04 PM
When photography was gaining prominence the US Copyright Office said that they would not give protection to photographs under basically the exact same argument. What ended up happening was Congress passed a law overruling them and then the Supreme Court upheld the law on appeal. The argument given was that the gear and the settings the photographer chooses represents an artistic medium.
A big difference with photographs, though, is that it is still recognised that the image being captured is OF something else.
A photograph of the Mona Lisa isn't trying to pass off that the photographer created the Mona Lisa.

A more apt comparison to the AI art discussion would be movie posters and the use of stock photography.
Images that are taken and sold for the dedicated purpose of being made available for constructing advertisements.
A graphics designer compiles and edits these images via a manual process that could be considered similar (in a primitive sense) to AI art:
take existing images, and mutate them to fit together into the desired end result.
When the source images are obtained through the correct channels then all is good.
If the source images are taken without consent, then legal issues may ensue if that work is profited off.


That is where I see future AI art laws heading:
If the AI user is hired to produce an image as the final product, then as long as all the assets being used by the AI algorithm are sourced from the correct channels (a collection of images that have been legally obtained for the express purpose of use with the project), then all is fine.
This shark was used in the poster for the 2018 Aquaman film: https://www.geekfeed.com/wp-content/uploads/2018/07/16-Aquaman-Shark-Comparison.jpg
It was legally obtained by purchasing the right to commercial use through a stock photo service, and the graphics designer then edited it to fit the scene they were putting together.

If an AI user were to create sample images to use as references; but not include in their final product; That I could see as no different to using a live model when making a painting. The likeness of the individual is being used, but the painting is built from the ground up without the need of pasting photos of the subject onto the canvas. The end result isn't AI art even though AI imaging was generated as a reference. There is still a lot of legal weeds to sort through here though, as a painting BASED on a photograph likeness can still be disputed as a copyright infringement, and I can see that complexity translating to AI art laws, where the levels of recognisable likeness becomes the determining factor.

The end point is I'm not against AI being used as a tool. The issue is about the legality of where it is obtaining its source data, and the recognition of the incorporation of original works.
And at the moment, the majority of AI art programs out there are chewing up vast amounts of other people's work without consent or giving credit; that's where the theft is.
That's not a thing that's solved by artists "just getting over it and accepting the future", and some people expressing that view clearly still are not understanding the issue.

NichG
2023-08-08, 10:14 PM
The end point is I'm not against AI being used as a tool. The issue is about the legality of where it is obtaining it's source data and the recognition of the incorporation of original works.
And at the moment; the majority of AI art programs out there are chewing up vast amount of other people work without consent or giving credit; that's where the theft is.
That's not a thing that's solved by artists "just getting over it and accepting the future", and some people expressing that view clearly still are not understanding the issue.

Well the tricky thing is, given just the output of the AI generator, can you identify the works which you claim are being incorporated? E.g. without specifically knowing 'this piece was in the training data', which are the cases in which human observers can correctly identify which pieces were or were not in said data given a particular output?

Because without that specific identification, to me it isn't such a stretch to say 'well, I know that the artist you hired watched Aquaman, so the artistic brain that produced this image is fruit of the poisoned tree of an act of unlicensed referencing of the images that were part of that movie, even if the image they produced happens to be of a cat on a unicycle'. And that degree of contagion being considered illegal would be massively destructive to basically all human creative activities, much less any algorithmic art.

verbatim
2023-08-08, 10:20 PM
A more apt comparison to the AI art discussion would be movie posters and the use of stock photography.

When this is done recreationally it is referred to as photobashing, and the general rule of thumb is that if the final output is sufficiently unrecognizable it does not matter that you started with an unlicensed copyrighted work.



And at the moment; the majority of AI art programs out there are chewing up vast amount of other people work without consent or giving credit; that's where the theft is.
That's not a thing that's solved by artists "just getting over it and accepting the future", and some people expressing that view clearly still are not understanding the issue.

Under current US copyright law,
"copyrighted material goes into process --> sufficiently different looking output comes out of process" is legal, and a cornerstone of several preexisting arts, like remixing/using samples in music and the aforementioned photobashing. Other stuff where there actually clearly is infringing content in the final output like machinima is (sometimes) covered under fair use but the standard of determining if infringement is happening by comparing output to output clearly protects a lot of stuff, including a number of very clearly problematic things corporations are heavily investing in.

I sympathize greatly with anyone and everyone in the crosshairs here, but tightening the noose around creative expression isn't going to save anyone's job. Disney will license the images necessary for a model and benefit from the fact that the artists they fired can't do the same.

That doesn't mean nothing can be done, the ongoing union strike in Hollywood has setting sensible regulations limiting the ability of companies to replace people with large language models as one of their big sticking points. I guess what I'm really trying to get at is that a lot of this copyright maximalist positioning is self defeating and doesn't serve to actually advance the interests of artists.

Zhorn
2023-08-08, 11:24 PM
Because without that specific identification, to me it isn't such a stretch to say 'well, I know that the artist you hired watched Aquaman, so the artistic brain that produced this image is fruit of the poisoned tree of an act of unlicensed referencing of the images that were part of that movie, even if the image they produced happens to be of a cat on a unicycle'. And that degree of contagion being considered illegal would be massively destructive to basically all human creative activities, much less any algorithmic art.
Conflating two distinctly different processes here.
Recreating something you have seen in the past by building it up from a blank canvas is not the same as mutating an existing image.

I generally avoid the 'fruit of the poisoned tree' arguments because that's just a philosophical mess with no end point.
I'm keeping the scope of what I'm talking about to very distinct categories of
is the work of someone else incorporated directly (think copy paste) into the new work, or
is the work of someone else used as a reference but not physically present in the new work
See playing an instrument yourself vs playing an audio recording of someone else performing.
Even when the subject is about the same thing, those are two very distinct aspects.


When this is done recreationally it is referred to as photobashing, and the general rule of thumb is that if the final output is sufficiently unrecognizable it does not matter that you started with an unlicensed copyrighted work.
...
Under current US copyright law,
"copyrighted material goes into process --> sufficiently different looking output comes out of process" is legal, and a cornerstone of several preexisting arts, like remixing/using samples in music and the aforementioned photobashing. Other stuff where there actually clearly is infringing content in the final output like machinima is (sometimes) covered under fair use but the standard of determining if infringement is happening by comparing output to output clearly protects a lot of stuff, including a number of very clearly problematic things corporations are heavily investing in.
Yes, all valid points. But in those cases, the photobashing is still recognised as incorporating the work of another.
I do not agree that all of those laws are just. If the work was unlicensed there, then that is also theft of intellectual property. Photography has a similar thing regarding taking photos without the consent of the subject. Lines are generally kept blurry in the hobbyist sense or when the image has low commercial value, but the subject the image is of has the general right to request legal takedowns of work that bears a likeness they legally own. At the higher ends of commercial value, lawsuits follow for damages and compensation. This is all delving into far more complex elements than the fundamental argument, though; I again try to keep my scope down to those two dot points I repeated above.


I sympathize greatly with anyone and everyone in the crosshairs here, but tightening the noose around creative expression isn't going to save anyone's job. Disney will license the images necessary for a model and benefit from the fact that the artists they fired can't do the same.

That doesn't mean nothing can be done, the ongoing union strike in Hollywood has setting sensible regulations limiting the ability of companies to replace people with large language models as one of their big sticking points. I guess what I'm really trying to get at is that a lot of this copyright maximalist positioning is self defeating and doesn't serve to actually advance the interests of artists.
Creative expression and commercial use are not exactly the same thing.
Art made for personal use doesn't have the same legal ramifications that selling art for financial gain does.
Likewise for an artist creating works under contract with an employer and then using those designs outside of that employment relation.
These are all very messy webs of legal precedence and establishing cases, to the point where you will probably find rulings in all sorts of directions.

The general public having access to AI tools for creating art: not an issue as long as it's just for personal use.
People using AI art for financial gain and/or large-scale dissemination of the works: there needs to be a level of control in place to respect the intellectual property of those whose work the AI is sourcing from; particularly recognition, consent and compensation for its use, and clear labelling of the final product as an AI production. Will it be easy? No. But missing any one of those will cause problems.

verbatim
2023-08-08, 11:50 PM
Photography has a similar thing regarding taking photos without the consent of the subject. Lines are generally kept blurry in the hobbyist sense or when the image has low commercial value, but the subject the image is of has the general right to request legal takedowns of work that bears a likeness they legally own.

In these cases infringement is being decided based on judging the output's likeness to their IP.



Creative expression and commercial use are not exactly the same thing.
Art made for personal use doesn't have the same legal ramifications that selling art for financial gain does.
Likewise for an artist creating works under contract with an employer an then using those those designs outside of that employment relation.
These are all very messy webs of legal precedence and establishing cases, to the point where you will probably find rulings in all sorts of directions.

The general public having access to AI tools for creating art; not an issue as long as it's just for personal use.

Huge agree!


People using AI art for financial gain and/or large-scale dissemination of the works: there needs to be a level of control in place to respect the intellectual property of those whose work the AI is sourcing from; particularly recognition, consent and compensation for its use, and clear labelling of the final product as an AI production. Will it be easy? No. But missing any one of those will cause problems.

Requiring that the output be watermarked as AI generated was one of the main recommendations of the then prescient Stochastic Parrots (https://dl.acm.org/doi/10.1145/3442188.3445922) paper on language models, and would be a substantial step forward.

If the Writer's Union ends up winning concessions that involve compensation in situations where it is not legally required, specifically in Hollywood, I would consider that a big win under the maxim of never taking the side with all the money, but any attempts to enshrine such notions into copyright law seems extremely untenable and full of nasty drawbacks... unless you're Disney.

NichG
2023-08-09, 05:55 AM
Conflating two distinctly different processes here.
Recreating something you have seen in the past by building it up from a blank canvas is not the same as mutating an existing image.

I generally avoid the 'fruit of the poisoned tree' arguments because that's just a philosophical mess with no end point.
I'm keeping the scope of what I'm talking about to very distinct categories of
is the work of someone else incorporated directly (think copy paste) into the new work, or
is the work of someone else used as a reference but not physically present in the new work
See playing an instrument yourself vs playing an audio recording of someone else performing.
Even when the subject is about the same thing, those are two very distinct aspects.


The generative methods are in fact building up the new images from a blank canvas, starting with pure noise and step by step trying to find patterns in that noise to turn into an image. That process is informed by the statistics of the 5 billion images that the network was trained on, which create the sense of e.g. 'what does an apple look like' or 'what do photographs look like' that is used to basically hallucinate the new image by finding patterns in the noise.
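That "find patterns in noise" loop can be sketched with a toy score-following sampler. To be clear, this is not any real diffusion model's code - it's a minimal numpy illustration I've put together where the "learned statistics" are a known two-mode distribution and Langevin dynamics pulls pure noise toward it:

```python
import numpy as np

rng = np.random.default_rng(0)
modes = np.array([[-2.0, 0.0], [2.0, 0.0]])  # the "learned" distribution: two modes
sigma = 0.5

def score(x):
    # Gradient of the log-density of an equal-weight 2-component Gaussian mixture.
    d = modes[None, :, :] - x[:, None, :]             # shape (n, 2 modes, 2 dims)
    logw = -np.sum(d ** 2, axis=-1) / (2 * sigma ** 2)
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                 # posterior mode weights
    return np.sum(w[:, :, None] * d, axis=1) / sigma ** 2

x = rng.normal(scale=3.0, size=(500, 2))              # start from pure noise
step = 0.01
for _ in range(2000):
    # Langevin dynamics: drift toward high-density regions, plus fresh noise
    x += step * score(x) + np.sqrt(2 * step) * rng.normal(size=x.shape)

dist = np.min(np.linalg.norm(x[:, None, :] - modes[None, :, :], axis=-1), axis=1)
print(f"mean distance to nearest mode: {dist.mean():.2f}")
```

The samples end up clustered around the two modes even though nothing was copy-pasted from anywhere; a real diffusion model does something analogous with a learned score over image statistics instead of this closed-form one.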

It's far more transformative than photobashing, really. That's why I'd challenge you to find even say a 32x32 patch of pixels that matches precisely from any given AI generated image to any particular source image, if you want to make the claim that the AI generated image is infringing.

It's as if someone for example went and looked at every picture on DeviantArt and accumulated by computer the histogram of how often different pixel colors are used together, then used that information to decide what an aesthetic palette might be for a new piece, likely landing on the orange/grey/teal color scheme. Yes that is derived from those works, but no individual work or individual artist's work if removed from that process would significantly change the result. Similarly for the AI generated stuff - if you prepared a dataset that was exactly the same, except with all the works of a single artist removed, then retrained the model, it'd be really hard to tell the difference between the models unless someone is directly referencing that artist in the prompt.

So the only thing left is that the problem is that the AI was exposed to the work during training, not that the AI's output copies the work. If you want to legislate that, you're also effectively legislating how people are allowed to look at a given artist's work when it's posted publicly - 'you can view this but not think about it to make something like it', etc. Because the process of a human deriving artistic understanding from looking at a piece of work is already hard to distinguish from AI training, and it's going to get even more blurry with time as new machine learning methods are developed and the basic processes don't stand still in the way that something like 'photobashing' is more or less a fixed thing you can rule on.
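The denoising process described above can be sketched in a few lines of toy code. This is purely illustrative: a real diffusion model uses a learned neural denoiser trained on billions of images, whereas here the mean of a tiny random dataset stands in for the "learned prior", and the names (`toy_denoise_step`, `prior_mean`) are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "learned prior": the mean of a tiny toy dataset. A real diffusion
# model learns a neural denoiser from billions of images; the mean image here
# plays that role purely for illustration.
dataset = rng.random((100, 8, 8))   # 100 tiny 8x8 "images"
prior_mean = dataset.mean(axis=0)

def toy_denoise_step(x, step, total_steps):
    """Nudge the noisy canvas toward what the 'model' expects."""
    weight = (step + 1) / total_steps          # trust the prior more each step
    return (1 - weight) * x + weight * prior_mean

# Generation starts from pure noise, not from any training image.
x = rng.standard_normal((8, 8))
steps = 50
for t in range(steps):
    x = toy_denoise_step(x, t, steps)

# The output is shaped by dataset *statistics*; it is not a copy of any
# single training sample.
print(np.allclose(x, prior_mean))                       # True
print(any(np.array_equal(x, img) for img in dataset))   # False
```

The point of the sketch is the shape of the process: the canvas begins as noise, and each step pulls it toward what the training statistics make plausible, without any training image being pasted in.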

Zhorn
2023-08-09, 10:43 AM
It's far more transformative than photobashing, really. That's why I'd challenge you to find even say a 32x32 patch of pixels that matches precisely from any given AI generated image to any particular source image, if you want to make the claim that the AI generated image is infringing.
You are now trying to narrow the goalposts to such a thin margin as to protect your point.
The AI isn't only doing a copy paste, it is blending them from there, and the more distant it is from the differing works around it, the more the likeness stays intact. I've said as much before in an earlier post.
Doing a google search yielded a lot of examples, many of which I could not post here due to explicit content; but here's a SFW example.
When the number of source images is kept low, it is far easier to spot what it is actually doing:
https://alexanderwales.com/wp-content/uploads/2022/08/image.png

Yes there are some pretty impressive ones that are using Gaussian noise in their later iterations which, when combined with a very large data set, give the illusion of originality, but without the original image data to map to it cannot produce anything of worth. And that source data of other artists' work is a problem.
That noise is what mangles up the signatures from the stolen work as well:
https://cl.buscafs.com/www.levelup.com/public/uploads/images/776137/776137_832x467.jpg

You clearly demonstrate you have a lot of tech knowledge on the subject, and I applaud you for that. But wilfully turning a blind eye to the blatant art theft isn't a good look.

this thread has run its course for me. I just wanna go back to playing d&d.
Laters all

Dr.Samurai
2023-08-09, 11:17 AM
this thread has run its course for me. I just wanna go back to playing d&d.
Laters all
Roll your Wisdom saving throw to leave please...

ZRN
2023-08-09, 01:17 PM
You are now trying to narrow the goalposts to such a thin margin as to protect your point.
The AI isn't only doing a copy paste, it is blending them from there, and the more distant it is from the differing works around it, the more the likeness stays intact. I've said as much before in an earlier post.
Doing a google search yielded a lot of examples, many of which I could not post here due to explicit content; but here's a SFW example.
When the number of source images is kept low, it is far easier to spot what it is actually doing:
https://alexanderwales.com/wp-content/uploads/2022/08/image.png


The specific image you're showing here was generated with the prompt "Girl with a Pearl Earring by Johannes Vermeer" and it's only possible because that particular artwork is SO prominent/ubiquitous the AI weights basically remember the whole thing. Like, yes, if you specifically ask the AI to copy one of the most famous paintings in the world (which happens to obviously be out of copyright) it can sort of do so. AI isn't IMMUNE to criticisms of copying if you literally ask it to do that, and even then it can only "copy" if you either ask it about an incredibly famous image or if you give it the original to copy from.

verbatim
2023-08-09, 01:40 PM
Without getting too into the weeds (this is a good thread (https://twitter.com/ai_curio/status/1564878372185989120) that does, for the curious), overfitting can produce output that infringes US copyright, by virtue of comparing output to output agnostic of the creation process.

The problematic part (in a legal, not moral, sense) isn't that the painting was in the training data, it's that it was preserved in the output.

Taking a step back away from copyright for a second though, when we're talking about morality the nuts and bolts don't really matter. If you built a robot brain that learned the way people do (whatever that means?), the focal tension point would still be that it was being used to outsource human jobs. Before they left, Zhorn hit the nail on the head by specifying how big of a deal the intent behind use of a model is when evaluating it from a labor/artist's rights standpoint.

ZRN
2023-08-09, 02:04 PM
Without getting too into the weeds (this is a good thread (https://twitter.com/ai_curio/status/1564878372185989120) that does, for the curious), overfitting can produce output that infringes US copyright, by virtue of comparing output to output agnostic of the creation process.

The problematic part (in a legal, not moral, sense) isn't that the painting was in the training data, it's that it was preserved in the output.

Taking a step back away from copyright for a second though, when we're talking about morality the nuts and bolts don't really matter. If you built a robot brain that learned the way people do (whatever that means?), the focal tension point would still be that it was being used to outsource human jobs. Before they left, Zhorn hit the nail on the head by specifying how big of a deal the intent behind use of a model is when evaluating it from a labor/artist's rights standpoint.

I think when we're talking about a moral point of view (especially regarding labor issues), we should recognize that there's a whole literal multibillion-dollar industry ("effective altruism") dedicated to loudly pondering these "moral" questions by specifically sidestepping the real core of the issue, which is the economic system (capitalism) that underpins everything.

In short, it's a GOOD thing that grunt-work can get outsourced to AI. "People losing jobs" is just "less work is needed" fed into a capitalist framework. Our complaint should be with the system that makes some people draw Goku porn to pay for their healthcare (D&D-related art too, but let's be real, probably a lot more money for independent artists in the Goku porn), not the AI that can draw Goku porn for cheaper.

Smashing the looms/railroad tracks/LLM source material in Luddite style is a Maginot line that can't hold against technological progress. All the mini-arguments about HOW and WHY an AI can spit out an image are side-issues. (Except for one big technical issue I'll concede: as the incentive to create art and text online is killed off because robots do it cheaper, the training data for AIs will start to erode. Better find a way to keep those human artists in the loop, at a minimum as inputs!)

False God
2023-08-09, 02:17 PM
The specific image you're showing here was generated with the prompt "Girl with a Pearl Earring by Johannes Vermeer" and it's only possible because that particular artwork is SO prominent/ubiquitous the AI weights basically remember the whole thing. Like, yes, if you specifically ask the AI to copy one of the most famous paintings in the world (which happens to obviously be out of copyright) it can sort of do so. AI isn't IMMUNE to criticisms of copying if you literally ask it to do that, and even then it can only "copy" if you either ask it about an incredibly famous image or if you give it the original to copy from.

Even I'll chime in to say that I understand AI art well enough that yes, if you keep the input narrow enough and the prompt specific enough, then the output will be very similar to, if not a mild photomanip of, the original material. That's no more copyright infringement than another artist being shown a specific work, asked to reproduce it, and having the ability to do so.

I'll even argue that the outputs of AI are generally unique and original images, they are not photomanips or duplicates of the input material. They can be very similar to the input material if constrained enough.

My problem still resides in the part that, fundamentally, we can't do anything about now: how the AI were trained. If you're downloading people's work, there's copyright violation. If you're integrating that work into a new program, there's copyright violation. If you're using that new program that has "analyzed" the work you didn't have permission to use for these purposes to then output "new" material, there's copyright violation.

If I can't build a house without using someone else's labor, then me not paying them is a law violation. If I create a house-building machine that can only do what it does thanks to the labor of others that I didn't get consent to use, that's a violation.

And the answer in my mind is either
A: All the AI trained thanks to other people's labor need to compensate them properly
or
B: All the AI trained thanks to other people's labor need to "forget" that training.

Now normally with humans we couldn't do the latter (at best we'd have a contract saying you can't practice with your training for X amount of time in Y region), but we CAN delete that training from an AI. It would destroy the AI, but hey, you build something on the backs of other people's labor and don't compensate them for it, sucks to be you.

NichG
2023-08-09, 03:10 PM
Even I'll chime in to say that I understand AI art well enough that yes, if you keep the input narrow enough and the prompt specific enough, then the output will be very similar to, if not a mild photomanip of, the original material. That's no more copyright infringement than another artist being shown a specific work, asked to reproduce it, and having the ability to do so.

I'll even argue that the outputs of AI are generally unique and original images, they are not photomanips or duplicates of the input material. They can be very similar to the input material if constrained enough.

My problem still resides in the part that, fundamentally, we can't do anything about now: how the AI were trained. If you're downloading people's work, there's copyright violation. If you're integrating that work into a new program, there's copyright violation. If you're using that new program that has "analyzed" the work you didn't have permission to use for these purposes to then output "new" material, there's copyright violation.


That's not how copyright law works. Getting into it gets close to legal advice so I'm deleting the paragraph I wrote.

The one thing I will say is, you had better hope that no court decides that this is how copyright law works if you actually care about the economic stability of individual creators. This sort of strict standard would make fan art, fan fiction, cultural references, reviews, streaming, and other such things also into infringing acts. If you think 'companies aren't hiring artists' is bad, 'companies have a mandate to extract 10s of thousands of dollars in penalties from someone who looks at their media and then draws something even loosely inspired' would be a nightmare.


If I create a house-building machine using that can only do what it does thanks to the labor of others that I didn't get consent to use, that's a violation.

There is no such law. Someone, say, installs a door in my house. I need to install a new door, so I look at how they attached the hinges and copy that. That's completely legal.

Maybe you want this to be a moral point, in which case, sure, you're free to believe that. I would absolutely not agree with this point in general, but it wouldn't be a matter of something being factually incorrect. I wouldn't even disagree that it's sleazy to use people's generosity specifically to undermine their interests. But I don't think it should be illegal, and I think the sort of legal theories which try to make it illegal end up being disastrously bad in other ways, because there's a fundamental problem with letting someone create a legal obligation for someone else without their consent - and laws where you can put things in a public space and then attack people for what they do having encountered that thing in a public space inherently have this problem. This kind of logic would mean e.g. that if someone sneaks onto my property and digs a cellar, they can now invoice me for it if I happen to use it (or use the dirt they dug up in my garden, or somehow otherwise benefit from their act). It's obviously not the same thing as the artists posting work, but that's the problem with trying to make rules based on wide sweeping generalizations like implicitly giving someone who does labor the right to control how that labor is later used.

Given that I did say I think it's sleazy, what do I think should be done? It'd be great if generative AI companies donated some of their profits - say 10% - to grants for artists. But from my accounting of it at least, they don't have to. And similarly, by my personal values, if someone (a person, a company, etc.) wants to use those generative models instead of hiring an artist, that's within the sphere of choices that I believe people should be allowed to make at their own recognizance, rather than having that choice forced by rule or by social pressure. It's within the 'you can dislike them for their choice, but you shouldn't try to force their choice' sphere of reasonable coexistence for me. To me it's about the same level of choice as not giving a tip at a restaurant - it's not nice, it'd be reasonable for the waiter to be curt with them next time, but it's within the sphere of choices that I think a person should be allowed to make.

Frogreaver
2023-08-09, 03:12 PM
Large Language Model AI that generates Text and Large Language Model AI that generates images operate on the same underlying technology.

So with that in mind, I used Perplexity AI. I think this makes the case about how much generative AI tends to copy clear.

Prompt
Write a paragraph about elves in the style of the D&D 5e Players Handbook

Output
Elves are a magical people of otherworldly grace, living in the world but not entirely part of it. They live in places of ethereal beauty, in the midst of ancient forests or in silvery spires glittering with faerie light, where soft music drifts through the air and gentle fragrances waft on the breeze. Elves love nature and magic, art and artistry, music and poetry, and the good things of the world. They are slender and graceful, with pointed ears and a natural elegance that sets them apart from other races. Elves can live well over 700 years, giving them a broad perspective on events that might trouble the shorter-lived races more deeply. They are more often amused than excited, and more likely to be curious than greedy. Elves tend to remain aloof and unfazed by petty happenstance, but when pursuing a goal, whether adventuring on a mission or learning a new skill or art, they can be focused and relentless. Elves are slow to make friends and enemies, and even slower to forget them. As a high elf, they are known for their keen senses, deep intuition, and remarkable resilience. They are also recognized for their magical abilities and their connection to the arcane. As a wood elf, they are characterized by their earthy appearance and forest-bound lifestyle. They are recognized for their stealth, archery skills, and their natural ability to blend seamlessly into the wilderness. Elves are a fascinating and diverse race, with many subraces and unique characteristics that make them a popular choice for players in the world of Dungeons and Dragons.

5e PHB (though the rest sounds like it might be from 3e or 4e)
Elves are a magical people of otherworldly grace, living in the world but not entirely part of it. They live in places of ethereal beauty, in the midst of ancient forests or in silvery spires glittering with faerie light, where soft music drifts through the air and gentle fragrances waft on the breeze. Elves love nature and magic, art and artistry, music and poetry, and the good things of the world.

With their unearthly grace and fine features, elves appear hauntingly beautiful to humans and members of many other races. They are slightly shorter than humans on average, ranging from well under 5 feet tall to just over 6 feet. They are more slender than humans, weighing only 100 to 145 pounds. Males and females are about the same height, and males are only marginally heavier than females.

Elves' coloration encompasses the normal human range and also includes skin in shades of copper, bronze, and almost bluish-white, hair of green or blue, and eyes like pools of liquid gold or silver. Elves have no facial and little body hair. They favor elegant clothing in bright colors, and they enjoy simple yet lovely jewelry.

Elves can live well over 700 years, giving them a broad perspective on events that might trouble the shorter-lived races more deeply. They are more often amused than excited, and more likely to be curious than greedy. They tend to remain aloof and unfazed by petty happenstance. When pursuing a goal, however, whether adventuring on a mission or learning a new skill or art, elves can be focused and relentless. They are slow to make friends and enemies, and even slower to forget them. They reply to petty insults with disdain and to serious insults with vengeance.

NichG
2023-08-09, 03:17 PM
Large Language Model AI that generates Text and Large Language Model AI that generates images operate on the same underlying technology.

Not quite. LLMs predict the probability of the next token given the previous context. Diffusion models iteratively remove noise from an image (which begins as pure noise). One place where this makes a difference is that the image generators have to work with what the noise provides, so you can actually drive them with large-scale composition, lighting cues, etc. With LLMs, there isn't that kind of natural extra source of variation, so they're more likely to copy verbatim.
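The 'predict the next token given the previous context' interface can be illustrated with a toy bigram model. This is a stand-in, not how LLMs are built (they use neural networks over vast corpora), but the interface is the same idea: context in, probability distribution over the next token out. The corpus and function name here are invented for the sketch.

```python
from collections import Counter, defaultdict

# Toy next-token predictor: bigram counts over a tiny corpus. Given the
# previous token, it returns a probability distribution over possible
# next tokens, which is the same interface an LLM exposes.
corpus = "elves love nature and magic and art and music".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token_distribution(prev):
    c = counts[prev]
    total = sum(c.values())
    return {tok: n / total for tok, n in c.items()}

# "and" was followed by "magic", "art", and "music" once each, so each
# continuation gets probability 1/3; the model samples rather than quoting.
print(next_token_distribution("and"))
```

Sampling from such a distribution step by step is what produces text; verbatim copying happens only when the training data makes one continuation overwhelmingly probable.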

That said, in your example, you explicitly asked for something from the D&D 5e Players Handbook. That's on you, not on it. It's like if I said 'Human artists are bad, because they're all plagiarists and forgers. I went to this guy and asked him to make a copy of a Monet for me and he did it!'

Compare with, for example:


Write a paragraph describing a fantasy race called the Alves



The Alves are a captivating and ethereal fantasy race, embodying an aura of enchantment and mystique that sets them apart from other beings. Standing at the crossroads between nature and magic, the Alves are slender and graceful, with luminous, opalescent skin that shimmers like moonlight through the leaves of an ancient forest. Their eyes, reflecting the hues of both the skies and the seas, hold a wisdom that spans millennia. With a deep connection to the natural world, Alves possess a unique ability to commune with animals and plants, fostering a harmonious coexistence that grants them insights into the secrets of the earth. Their society is governed by a profound reverence for the cycles of life and the balance of forces, and they are often sought out as gifted healers and wise advisors. Adorned in intricate, nature-inspired attire and wielding elegant, elaborately carved staffs channeling their innate magic, the Alves evoke an alluring blend of enchantment and reverence wherever their presence graces the realms of fantasy.

I'd be curious if you can find any of those sentences or sentence fragments online. Individual adjectives, sure - 'ethereal' got used in both the 5e text and in this.

False God
2023-08-09, 03:25 PM
That's not how copyright law works. Getting into it gets close to legal advice so I'm deleting the paragraph I wrote.

The one thing I will say is, you had better hope that no court decides that this is how copyright law works if you actually care about the economic stability of individual creators. This sort of strict standard would make fan art, fan fiction, cultural references, reviews, streaming, and other such things also into infringing acts. If you think 'companies aren't hiring artists' is bad, 'companies have a mandate to extract 10s of thousands of dollars in penalties from someone who looks at their media and then draws something even loosely inspired' would be a nightmare.
This already exists. Fanart is a copyright violation, and many companies have aggressively pursued fan-artists creating and more notably, selling fan-art. Most companies don't because they look like massive jerks and people stop buying their stuff. But control of your property is integral to retaining copyright, if you fail to exercise use and control of your copyright, you'll almost certainly lose it.


There is no such law. Someone, say, installs a door in my house. I need to install a new door, so I look at how they attached the hinges and copy that. That's completely legal.
That is not what I suggested. I suggested, using your door example, that I had someone install a door, and then I didn't pay them, and then used their craftsmanship as a basis for a magic button that installs doors.

The key is the unpaid labor and the lack of consent.

AI "learning models" can't just gobble up whatever art they can find and then start outputting similar works without either obtaining appropriate consent or providing appropriate compensation.

NichG
2023-08-09, 03:41 PM
This already exists. Fanart is a copyright violation, and many companies have aggressively pursued fan-artists creating and more notably, selling fan-art. Most companies don't because they look like massive jerks and people stop buying their stuff. But control of your property is integral to retaining copyright, if you fail to exercise use and control of your copyright, you'll almost certainly lose it.

There are shelters in fair use for legal fan art, basically in whether it is sufficiently transformative. That of course doesn't mean that a company couldn't make you pay more money to a lawyer than you can afford to risk in order to back it up.

As far as 'protect it or lose it', that's trademarks, not copyright. In some countries you can't even get rid of your copyright if you want to - in Germany, for example, it's not even possible to transfer your copyright to someone else, which is part of the reason people in the open-source community do really elaborate legal theorycraft in making licenses that basically amount to 'I want to put this into the public domain' - because for creators in some countries, you literally can't do that.



That is not what I suggested. I suggested, using your door example, that I had someone install a door, and then I didn't pay them, and then used their craftsmanship as a basis for a magic button that installs doors.

The key is the unpaid labor and the lack of consent.


Okay, you watch your friend install a door in their own home. You don't pay them, they give you consent to watch, but not any kind of explicit consent to e.g. go and use what you learned. Then you take the steps, analyze them, and make a door-installing robot.

The labor was unpaid. There was no consent for you to benefit from it. It's perfectly legal. If you want to call that immoral thats up to you. For me, someone making a fuss about 'you stole my door installation secrets!' would be on the wrong side of reasonable in that particular conflict.

Or, I fly above some farmland on a plane, and I see how they laid out their irrigation and rows in a clever way. I go home and lay out a farm in that way, and (indirectly) even compete with them since I'm also selling food. Again, perfectly legal, and by my ethics perfectly reasonable.

PhoenixPhyre
2023-08-09, 04:01 PM
As a note, there's an explicit copyright exception under the Berne Convention for "quotations and use of works by way of illustration for teaching purposes". Does it cover training an AI? I'm not a lawyer. But there's (IMO) a not-ludicrous argument to be made for that. I could certainly, as a teacher, quote from or use copyrighted works to train humans to create new works, even if in the process they produce things that are heavily influenced by or based on those works without fear of infringing copyright...as that's rather the point of using specific works for teaching purposes.

And this isn't even getting to Fair Use, which is a defense (rather than an exception) to copyright infringement. Fair Use is a muddy ball of muddy mud, so I won't even speculate there.

Brookshw
2023-08-09, 04:57 PM
As a note, there's an explicit copyright exception under the Berne Convention for "quotations and use of works by way of illustration for teaching purposes". Does it cover training an AI? I'm not a lawyer. But there's (IMO) a not-ludicrous argument to be made for that. I could certainly, as a teacher, quote from or use copyrighted works to train humans to create new works, even if in the process they produce things that are heavily influenced by or based on those works without fear of infringing copyright...as that's rather the point of using specific works for teaching purposes.

And this isn't even getting to Fair Use, which is a defense (rather than an exception) to copyright infringement. Fair Use is a muddy ball of muddy mud, so I won't even speculate there.

Within the US the exception is granted to in classroom activities with the expectation there isn't time for teachers to license content for use. Universities have gotten in trouble for trying to exceed the education exception to use content in other ways (such as internal resources not tied to a class). Relevant cases which went up on appeal required specific analysis of each work/piece for Fair Use, and rejected blanket determinations for overall use. That approach is utterly impractical for AI training, this is a very new issue which is going to require a new approach (I won't be surprised to see it make its way up to the SJC within the next decade).

I'm going back to biting my tongue on this topic. Carry on.

verbatim
2023-08-10, 01:04 AM
As a note, there's an explicit copyright exception under the Berne Convention for "quotations and use of works by way of illustration for teaching purposes". Does it cover training an AI? I'm not a lawyer. But there's (IMO) a not-ludicrous argument to be made for that. I could certainly, as a teacher, quote from or use copyrighted works to train humans to create new works, even if in the process they produce things that are heavily influenced by or based on those works without fear of infringing copyright...as that's rather the point of using specific works for teaching purposes.

And this isn't even getting to Fair Use, which is a defense (rather than an exception) to copyright infringement. Fair Use is a muddy ball of muddy mud, so I won't even speculate there.

As I understand it, that exception covers situations wherein people pay for a class that involves them being shown material that wasn't licensed. I would imagine thinking about machine learning models would not get that far in because no unlicensed material is being stored or shown, only devoured in order to create what is essentially a stochastic plinko machine.

Hael
2023-08-10, 06:26 AM
A *properly* trained AI model does not copy and paste. That's not how they operate at all, and when they do, it's almost always an illusion. From a signal processing point of view, this is basically a theorem.

Now, when you overfit a model, it does produce memorization to some extent, and when you have very specific data sets with a very limited amount of data about a topic (e.g. something that might appear only once or twice), you will find that a model sometimes reproduces it (often with some errors).

However, this is how humans work as well. If I tell you to think about what the sound of a black hole merger is, I'm willing to bet everyone here has the same 'sound' and image in their mind from the day that went viral (probably from a talk show, am I right?).

The reason it's an illusion for the most part (and a lot of work has been done on this) is that the number of ways to say something, or even to draw something, is much, much more constrained than people realize. There are only so many ways to draw a face, for instance, and it's not surprising that images are reproduced when you get to cherry-pick millions of images to find the proverbial needle in a haystack.

In any event, this is something that gets minimized as models get better and better.
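The overfitting-as-memorization point above has a classic minimal illustration: give a model exactly enough free parameters to pass through every training point and it "memorizes" them; give it fewer and it can only capture a trend. This is not how diffusion models are built; it's just the standard toy picture of capacity versus data, with all values here invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Five training points from a noisy sine wave.
x = np.linspace(0.0, 1.0, 5)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(5)

# A degree-4 polynomial has 5 coefficients: exactly enough capacity to pass
# through all 5 points, i.e. to "memorize" the training set.
overfit = np.polyfit(x, y, deg=4)
# A straight line cannot memorize; it can only capture a broad trend.
underfit = np.polyfit(x, y, deg=1)

train_err_overfit = np.abs(np.polyval(overfit, x) - y).max()
train_err_underfit = np.abs(np.polyval(underfit, x) - y).max()

print(train_err_overfit < 1e-6)   # True: training data reproduced verbatim
print(train_err_underfit > 0.01)  # True: only the trend survives
```

Large generative models sit far to the "underfit" side of this picture relative to their billions of training samples, which is why verbatim reproduction is the exception (rare, famous, heavily duplicated items) rather than the rule.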

truemane
2023-08-10, 07:52 AM
Metamagic Mod: reminder to everyone that dispensing legal advice (even in the theoretical) is not permitted on the forum.