Phrases like “it symbolizes the increasing role” are something I generally see ChatGPT say. People don’t typically talk like that, even pretentious lunatics on LinkedIn.
For me it was the fact that the first sentence literally just spells out what the line above already says. It feels like every other sentence from ChatGPT is just a summary of the previous one. (Unless it's trying to relativize, in which case it hits you with the "it's important to remember".)
This will keep happening as long as humans keep ranking wordy AIs higher than succinct ones. Unfortunately we have this gut instinct to judge long responses as more true than short ones, so we keep making the problem worse.