> Yes we are now dealing with an automated Photoshop. And somehow the people in charge have decided to do something about it, probably more for political or maybe darker reasons.
I don't get what's difficult to understand or believe here. Grok is causing a big issue in practice right now, a larger issue than Photoshop, and it would be easy for X to moderate it themselves, as their competitors do. They don't, so the state intervenes.
> maybe France or the EU should ban its citizens from investing in the upcoming SpaceX/xAI IPO, and also Microsoft, NVIDIA, OpenAI, Google, Meta, Adobe, etc. ?
You're basically asking, "Why do a surgical strike when you could carpet bomb?" A surgical strike targets the actual problem; carpet bombing mostly causes collateral damage.
> Anyone skilled at photoshop
So let's say there are two ways to do something illegal. The first requires skill from the perpetrator, is tricky to regulate, and is, generally speaking, not a widespread issue in practice. The second is simple enough for young children to use, is easy to regulate, and is becoming a huge issue in practice. Then it makes sense to regulate only the second.
> People can now 3D print guns at home, or at least parts that when assembled can make a functioning firearm. Are now 3D printer makers to blame if someone gets killed with a 3D printed gun?
Tricky question, but a more accurate comparison would be with a company that runs a service that 3D prints guns (= generating the image) and fires them in the street (= publishing on X) automatically for you, and that keeps accepting illegal requests while its competitors have no issue blocking them.
> Where do we draw the line at tools in terms of effort required, between when the tool bears the responsibility and not just the human using the tool to do illegal things?
That's also a tricky question, but generally you don't need to know precisely where to draw the line. It suffices to know that something is definitely on the wrong side of it, as X is here.