This isn’t an issue with Stable Diffusion itself; it’s an issue with the wrong data set being used. A diffusion model is unlike a lot of AI in that it doesn’t actually think (in as much as any AI thinks). It’s just finding patterns, so it never looks at the situation and goes “hmm, that’s not right”.
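To illustrate the point about pattern-matching with no sanity check: a diffusion sampler just runs a learned noise predictor in a loop and subtracts its output. This is a toy sketch, not Stable Diffusion's actual code; `fake_noise_predictor` is a hypothetical stand-in for the trained network, and the schedule is made up for illustration.

```python
import numpy as np

def fake_noise_predictor(x, t):
    # Stand-in for the trained network: it only "predicts" noise
    # from the patterns it has seen. It has no notion of whether
    # the emerging image makes sense.
    return 0.1 * x

def sample(steps=50, dim=4, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)          # start from pure noise
    for t in range(steps, 0, -1):
        eps = fake_noise_predictor(x, t)  # pattern-matched noise estimate
        x = x - eps                       # subtract it; no plausibility check anywhere
    return x

result = sample()
```

Note there is no step in the loop where the model inspects the result and rejects it. Whatever patterns the training data contained, the sampler reproduces.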
Your earnest explanation of an AI’s inability to understand context, in response to an obvious joke, is kinda funny.
Taking bets that the parent comment was actually posted by two LLMs in a trench coat wearing a Groucho Marx costume mask?