I mean, I would still expect them to currently lose money? Their tiers aren't as generous, but they're still completely free (i.e. no revenue generation whatsoever; Google search is free, but Google still generates revenue per user via ads and such).
I think the author's point isn't that inference is so cheap that they can be profitable without changing anything, but that inference is now cheap enough for, say, ads (however that might be implemented for an LLM provider) to be a viable business model. It's an important distinction, because a lot of people still think LLMs are so expensive that subscriptions are the only way to turn a profit.
>The entire comparison hinges on people only making simple factual searches
You have a point, but no, it doesn't. The article already kind of addresses this: OpenAI had a pretty low loss in 2024 for the volume of usage they get. $5B seems like a lot until you realize that chatgpt.com alone, even in 2024, was one of the most visited sites on the planet each month, with the vast majority of those visits coming from entirely free users (no ads, nothing). OpenAI said last December that ChatGPT was handling over a billion messages per day.
So even if you look at what people do with the service as a whole, inference really doesn't seem that costly.
LLMs also have a 'g factor' https://www.sciencedirect.com/science/article/pii/S016028962...