You know how in certain animal groups, when a member of the herd becomes a danger to the rest of the group, say through disease, injury, increased aggression, or some mental issue, the herd will eliminate the threat.
Take anything Palantir says about democracy as either a dog whistle or a threat. Palantir's product is mass surveillance and criminal behavior prediction (location, whereabouts, movement patterns). That's authoritarian, but not necessarily antidemocratic. You can still vote and be a democracy with mass surveillance (edit: the typo was "ass surveillance"… might still work), so don't conflate the two.
Where the antidemocratic part comes in is using that surveillance to prevent people from exercising their right to vote, and manipulating their information so they vote how you want. That's what Palantir is enabling.
Aside: It's nuts how accurate their name is in spirit. It's almost commendable that they carried through, embracing the name and the power behind it.
I like the idea that they think educated jobs only belong to women. That's an interesting thought.
The sad truth is that this shit isn't going to replace "highly educated" jobs, and that the AI gravy train will end once people start enforcing basic intellectual property law. Time is ticking, and the market taking a hit now is making them scramble.
Given that copy desks were being gutted more than a decade ago just with little things like Grammarly, it absolutely will replace knowledge jobs. It won’t be better, but it will mean more share buybacks.
It doesn’t need to be good to replace jobs, as long as there are no consequences for the people making those decisions.
I've lost count of how many "oops, it was AI's fault, not my fault!" stories I've heard, even within highly regulated fields. Like, lawyers submitting documents with completely fake citations, and then… no real consequences. Seems to me that should be cause for immediate disbarment, but no, apparently not.
The lack of consequences has been a problem for quite a while now, from before LLMs. In my opinion it's been caused by a widespread increase in professional incompetence, together with a mutually protective network of incompetent people: "I won't point out that you're incompetent and won't blame you for your mistakes, if you do me the same favour."
They call it “imposter syndrome”, but it isn’t a syndrome: it’s a symptom.
Indeed: Everything was already AI
This has been a very long project: separating conduct from consequences in order to maximize profit. AI is just a breakthrough tool for doing it.