I'd find it very understandable if true. I also think there will be some junior devs it will supercharge, and they will eventually build some of the things we only dreamed about. If you're starting out as a coder but don't actually enjoy coding, it's probably not going to help. If you are thirsty to understand and do things, it is an incredible time to start out.
Good old-fashioned human trolling is the most likely explanation. People seem to think that LLM training just involves absorbing content from the internet and other sources, but it also involves a lot of human interaction, which gives the model much more well-adjusted communication than it would otherwise have. I think it would need to be specifically instructed to respond this way.
Here's how I'd break down the two types of users: people who are using AI to teach themselves how to work in the domain they are interested in, and people who are relying on AI to do all or most of the heavy lifting.
I'd argue that the people using AI most effectively are in the mostly-chatters group the author defines, and specifically that they are using the AI to understand the domain on a deeper level. The "power users" are heading for a dead end, and they will arrive as soon as AI is capable of figuring out what is actually valuable to people in a given domain, which is not generally a difficult problem to solve. At that point they will be outclassed by AIs that can self-navigate. But I would argue that a human with a rich understanding of the domain will still beat a self-navigating AI for a long time to come.