It's curious to me that we have no theory of intelligence. By which I mean an actual hard, verified theory, as physics has for gravity, electromagnetism, and quantum mechanics.
Intelligence is simply not well understood at a mathematical level. Like medieval engineers, we in AI rely heavily on experimentation. We have no idea how far from the human level we actually are, how far above it we can get, or what, if anything, the limits of intelligence are.
This is a good counter in my view to the singularity argument:
https://timdettmers.com/2025/12/10/why-agi-will-not-happen/
I think if we obtain relevant-scale quantum computers, and/or other compute paradigms, we might get a limited intelligence explosion -- for a while. Because computation is physical, with all the limits thereof. The physics of pushing electrons through wires no longer delivers the nonlinear gains it once did. Getting this across to people who think only in terms of the abstract digital world, and not the non-digital world of actual physics, is always challenging, however.
Wait till the Chinese land on the Moon first in this new space race. There will be a Sputnik moment, massive additional investment, and this will inevitably impact sci-fi. Just as in the previous space race, we have to fall quite a bit behind before we wake up -- and then we go all-out.
I also don't agree with the dystopian or cynical view quite prevalent here on HN these days, frankly. It's always been around, but it seems to have gotten darker, to the point that I think a lot of old-timers like me pretty much avoid HN now. It's not all bleak, especially when you get away from these screens and out into the real world. Looking outward, rather than inward, can lead to the kind of desire for discovery and progress that underpinned the Apollo era. The world out there is in extreme disarray too -- but to an optimist, it presents opportunity to do good.