Has anyone else considered that producing code faster isn't necessarily a good thing? There's a lot that goes into getting a solution right that has nothing to do with programming. Just because you can scale code production doesn't mean you can scale things like understanding what users want and expect. At some point you become a net cost to yourself or your organization, because unless you get everything right the first time, you're creating more work than you're resolving.
Giving an audience something they never asked for is very easy when you don't have much experience interacting with them. That's where a lot of would-be entrepreneurs and creators alike stumble. They forget to (or don't know to) quantify the size of their obtainable market before taking action and building something.
Seeing as LLMs just tell you what you want to hear and not what you need to hear, I don't see this problem changing anytime soon. It might take a different type of model entirely to reason that way.
The problem with bouncing ideas off of AI is that you still need to know enough to recognize when something is likely a hallucination. Unless you're double-checking the output at some regular cadence, you're probably accepting fiction as fact. It's really easy to just trust everything these chatbots produce because of their confident style of communication. I'll be the first to admit I fall into this trap all the time.