Awesome! Now half the sites on the web contain a tiny drop of information, buried under paragraphs and paragraphs of GPT-generated text, and my browser uses the same LLM to reverse-engineer the original information. What could possibly go wrong when LLMs talk to other LLMs: summarize, bloat, summarize, bloat, and finally summarize every last bit of information for us?
Do we actually still make websites for humans, or is it all just AIs and SEO?