>Sure, but we are not talking about evaluating your contributions daily. Over a lifetime, people find new ways to provide more value. Life is long, and that is how adapting works.
I can't really take that sentiment to my bank when I default on my mortgage while I retrain, though. So although you're correct that across a lifetime this isn't much of an issue, you're minimising people's very real near-term anxieties here.
I don’t think anyone needs to compete with the LLM SOTA to get the benefits of these technologies on-device.
Consumers don’t need a 100k-context-window oracle that knows everything about both T-cells and the ancient Welsh royal lineage. We need small, specialised models, and then a good query router to dispatch between them.
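To make the router idea concrete, here's a minimal sketch of one approach: keyword-based dispatch to a handful of specialised models. The model names, keyword lists, and scoring rule are all invented for illustration; a real router might use embeddings or a small classifier instead.

```javascript
// Hypothetical routing table: each small, specialised model claims some keywords.
const routes = [
  { model: "sudoku-tutor-3b", keywords: ["sudoku", "grid", "candidate", "naked pair"] },
  { model: "bio-qa-7b",       keywords: ["t-cell", "antibody", "immune"] },
  { model: "history-qa-7b",   keywords: ["royal", "lineage", "dynasty", "welsh"] },
];

// Score each route by keyword hits; fall back to a general model on no match.
function routeQuery(query, fallback = "general-4b") {
  const q = query.toLowerCase();
  let best = { model: fallback, score: 0 };
  for (const { model, keywords } of routes) {
    const score = keywords.filter((k) => q.includes(k)).length;
    if (score > best.score) best = { model, score };
  }
  return best.model;
}

console.log(routeQuery("How do T-cells recognise antigens?")); // → "bio-qa-7b"
```

The point isn't the scoring heuristic; it's that the dispatch layer is tiny compared to any of the models behind it.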
I disagree. I run a sudoku site. It’s completely static, and it gets a few tens of thousands of hits per day, as users only download the js bundle & a tiny html page. It costs me a rounding error on my monthly hosting to keep it running. To add an api or hosted mcp server to this app would massively overcomplicate it, double the hosting costs (at least), and create a needless attack surface.
But I’d happily add a little mcp server to it in js, if that means someone else can point their LLM at it and be taught how to play sudoku.