This depends on your goals. If your goal is to drive efficiency into your processes, drive down tech debt, or fix pain points for customers of your existing products, sure. Most people at your company will have thoughts, and lots of them will have good ideas.
If your goal is to pivot the company into new verticals, or to develop an entirely new product, then "asking staff for ideas" isn't a likely way to succeed.
I don't think it's odd. Sacrificing deep understanding and delegating that responsibility to others is risky. In more concrete terms, if your livelihood depends on application development, you have concrete dependencies on platforms, frameworks, compilers, operating systems, and other abstractions without which you might not be able to perform your job.
Fewer abstractions, deeper understanding, fewer dependencies on others. These concepts show up over and over and not just in software. It's about safety.
It's not just fork. The operating system overcommits memory all over the place. For example, when you map memory, that can (and will) succeed without actually mapping physical pages. Even "available" memory is put to some use and freed asynchronously behind the scenes, a process that is not always successful.
Honestly, I think overcommit is a good thing. If you want to give a process an isolated address space, then you have to allow that process to lay out memory as it sees fit, without having to worry too much about what else happens to be on the system. If you immediately "charge" the process for every reservation, you end up nit-picking every process on the system over memory it may never touch, even though with overcommit everything would have been fine.