> What you seem to be missing is that LLMs are better at/for some things than others.
I guarantee it is using the same system to write code and teach you about electronics that it uses to teach people about chemistry. If you can't see why that makes the resulting information suspect at best, then I don't know what else to say.
Don't worry, LLMs are perfectly OK for getting information. Just ask drugs.com about penisomab: https://bsky.app/profile/harrisonk.bsky.social/post/3mfs6adw...