I think you're right: it's possible to make something that exists for no other purpose than to cause harm, and it's not moral to make that thing. I also don't think it's fruitful to hunt for the specific circumstances in which it's moral to eat babies (to go down philosophical rabbit holes until you find the one time that doing something despicably immoral is actually the moral thing to do). But I would say the technology is the least important part of the problem. A moral person uses dangerous tools sparingly and intentionally harmful tools never. If Palantir did not exist, would they still perform the raids? I think so.
Okay, that's where you draw the line. But someone provides power to their data centers and their offices. Someone provides hand-held devices. Someone provides network connectivity. Someone has a contract to house and feed these agents. Someone provides the logistics and fleet services for their vehicles. Someone is likely the landlord of their buildings. Someone has a contract to clean those buildings. Someone decides to buy a block of Palantir stock rather than some other software company's. Someone runs the private prison into which people are herded. An attorney has a choice to file a charge or not file a charge. A judge has the choice to bend over backward to give ICE/CBP the benefit of the doubt, or to be skeptical.
Baking a loaf of bread is not immoral. Baking bread under a contract to feed the Gestapo is.
My point was, if you do invent something like Zyklon B, you need to consider its uses. While the gas itself is just a molecule, devoid of morality, not everyone who employs it will be a moral person.
In the case of Palantir, should we allow the federal government to combine databases (which may have been hoovered up by DOGE and held in a private-sector company that isn't subject to FOIA)? Should there be judicial review, like FISA warrants, before such an application can be fielded? Should we allow the government to buy that kind of app in the first place? I don't give Palantir a free pass.
But it's not the engineer at Palantir who decides to send poorly vetted and poorly trained people into a home, fully stoked, full of anabolic steroids, believing they have complete immunity, and praying one of the occupants shows an iota of resistance. 79 million voters chose this. This is the morality of the people employing the tool.
A thing clearly has no intention, and it's impossible for us to know every possible use for a product. But at some level we need to feel responsible for what we create, we need to feel responsible for our choices, and we need to see the responsibility others bear because of their choices.
Yes, I agree, it's not clear-cut. There's no bright line that says whether you are or are not morally culpable for what you do. But all of us should think about our roles in that light. If Palantir uses Git, does that mean new Git contributions are part of what is arguably an ethnic cleansing? I wouldn't be able to sleep at night and keep working on this project. (I do not work at Palantir.)
But the point is also that maybe we should take one step back and think about the morality of the people we put in decision making roles. The technology is morally neutral, but the intention is not. And helping to realize that intention is not. And sometimes the things we build can be used in horrible ways unless we also think about safeguarding their use.
This is just the tip of the iceberg. It is my very real fear that a lot of information has been aggregated into Palantir and other applications and is usable with no restraint. And that even if you just run the build system across hundreds of apps, you might be culpable as well.
In another comment, I referenced Eichmann. A train is not a good thing or a bad thing. A rail car is not a good thing or a bad thing. An app that aggregates multiple different data sources and puts them together is not a good thing or a bad thing. It's the morality behind the hands into which we put those tools that matters. The more capable the tool, the more good or evil you can do with it. Maybe we should ask ourselves whether this kind of tool should exist at all, or whether there should be some level of process before it can be used. But the engineer at Palantir is just as guilty, or not guilty, in your eyes as the engineer fixing the trains or laying new track.