I was wondering last night how Google (not to mention the other engines) will adapt its ranking algorithms to distinguish human-written content from AI-generated content.
Because if everyone starts generating articles with AIs, the web will become a pile of content created by AIs on top of content written by humans for humans. And we know that these models are trained on text corpora drawn from many sources, including the web.
Where is the border? The more content AIs generate, the more new AIs will train on content generated by other AIs in order to produce yet more content.
We will end up reading content that was never written by a human, and genuinely human-written content could become almost rare.
Content is only one of many signals the algorithm takes into account when ranking pages, but it will be interesting to see how search engines tackle this issue.
Tell me what you think:
Search engines rank on how useful 'knowledge' is to the seeker. One of the parameters is how many others have backlinked to the same source, among various other signals.
If AI content is highly cited and many others found it useful, then I would want it at the top of the search results.
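The citation signal the comment describes is the intuition behind PageRank. Here is a toy sketch, not Google's actual algorithm, where a page's score depends on how much score its linkers pass along; all page names and link data are made up for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # A page splits its rank evenly among the pages it cites.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # A page with no outlinks spreads its rank over everything.
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
        rank = new_rank
    return rank

# Hypothetical link graph: two pages cite "human-article", one cites "ai-article".
links = {
    "ai-article": ["human-article"],
    "human-article": [],
    "blog": ["ai-article", "human-article"],
}
scores = pagerank(links)
# "human-article" ends up highest because it is cited by both other pages.
```

Note that the scoring never looks at who (or what) wrote the page; it only measures citations, which is exactly why well-cited AI content would rank the same as well-cited human content.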
Check out this discussion https://news.ycombinator.com/item?id=30347719
I was thinking of this issue this morning. I wonder if anyone has tried to build a search engine that gives a large weight to content approved by experts, and smaller weights to engagement and other factors. I think Pocket is an example of what I have in mind.
It will definitely require massive resources to build, but it could become the new trusted Google.
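That weighting idea can be sketched in a few lines. The weights and field names below are assumptions for illustration, not a real system; the point is that expert approval dominates raw engagement:

```python
# Hypothetical scoring: a large weight for expert approvals,
# a smaller one for engagement, as the comment proposes.
def score(doc, expert_weight=20.0, engagement_weight=1.0):
    return (expert_weight * doc["expert_approvals"]
            + engagement_weight * doc["engagement"])

docs = [
    {"title": "peer-reviewed guide", "expert_approvals": 3, "engagement": 5},
    {"title": "viral listicle", "expert_approvals": 0, "engagement": 40},
]
ranked = sorted(docs, key=score, reverse=True)
# The expert-approved guide (score 65) outranks the viral
# listicle (score 40) despite having far less engagement.
```

The hard part, of course, is not the arithmetic but deciding who counts as an expert and collecting those approvals at web scale.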
Looks like this: https://i.imgur.com/BTt1DTh.png
I can say to ChatGPT: "Here are my ideas in point form, write them out as full paragraphs: ..." And ChatGPT will do that for me. So, is that content written by an AI, or content written by a human?