Why do you care if content is partly created by AI?
What's the context? Is the content intended to represent truth? Is the content intended to be creative? Can it cite its sources when there is a need to do so? Can it actually do what it's being advertised to do? If it delivers bad content and there is a need for accountability, how would that work?
My gut says that it has the potential to create a lot of junk that's going to pass as legitimate, which isn't exactly a good direction.
There's an implied "relationship" between the viewer of any creative content and its creator: another human sharing an emotion, a human experience, a human vision...
I heard someone say: imagine you could replace a child's parents with machines.
Would you? Should you?
The question is far too vague. I may or may not care, depending on the context and how it's done.
I do care about disclosure, though. If the output of an LLM is being used verbatim, that should be disclosed. Knowing that helps the reader put the content in proper context.
The current agitation over AI reminds me of the fuss over music sampling back in the '80s. I'm sure all of the big reactive feelings people are expressing right now will sound just as quaint to the next generation.
Generally, I don’t, any more than I care if the creator used any other tools in the creative process.