The legal system is fundamentally broken. It was never designed to handle the throughput required to enforce justice in countries with many millions of inhabitants.
The legal system is mostly a fantasy. It doesn't exist for most people. Currently it only serves large corporate and political interests since only they can afford access.
Yes, at current costs, most people literally cannot afford legal representation, especially on the plaintiff's side.
For example, I've been cheated out of at least $100k of net worth by the founder of a crypto project because he decided to abandon tech that was working and switch to a competitor's platform for no reason. I was already worried about repercussions outside the legal system... this is the crypto sector, after all. But legally, there's also no way I can afford to sue a company that controls almost $100 million in liquid assets and probably has government regulators on its payroll. Even though it's a simple case, it would be difficult to win even if I'm right, and the risk of losing is that they could seek reimbursement of their lawyers' fees, which they would deliberately maximize just to make things difficult for me.
As I've said before, AI is a force multiplier. A 10x developer is now a 100x developer, and a -10x developer (a complexity creator and value destroyer) is now a -100x developer.
I can understand why a lot of companies are cutting junior roles. AI automates most of what juniors are good at (producing code quickly) but little of what seniors are good at.
That said, I've worked with some juniors who managed to navigate this shift; they did it by focusing on higher-order thinking and developing a sense of what's important through interaction with senior engineers. Unfortunately, this raises the talent bar for juniors: they have to become more intelligent, not in a puzzle-solving way, but in an architectural, big-picture way; almost like entrepreneurial thinking, but more detailed and complex.
LLMs don't have a worldview, which means they miss a lot of inconsistencies and logical contradictions. Most critically, LLMs don't know what's important (at least not accurately enough), so they can't prioritize effectively and they make a lot of bad decisions.
It's kind of interesting to me that in many of the areas where I held a contrarian opinion about software development, I now see LLMs falling into exactly those traps and producing bad results. It's as if all my contrarian opinions suddenly became much more valuable.