Published: 25/08/2025
“ICE is just around the corner,” my friend said, looking up from his phone. We were writing at a coffee shop in one of the oldest neighborhoods of New York City, where schools and churches support thriving migrant communities as they have since long before the United States existed. Now the agents of this rogue federal agency – known for civil rights abuses including racial profiling, wrongful detention, medical neglect, and inhumane detention conditions – were just footsteps away, shaking down our neighbors in their homes and at the park across the street.
A day earlier, I had met with foreign correspondents at the United Nations to explain the AI surveillance architecture that ICE is using across the United States. The law enforcement agency uses targeting technologies that one of my past employers, Palantir Technologies, has both pioneered and proliferated – tools I was once charged with illustrating as a graphic designer and writer, and whose consequences I am only now coming to understand. Although largely invisible, technology like Palantir’s plays a major role in world events, from wars in Iran, Gaza, and Ukraine to the detainment of immigrants and dissident students in the United States. But despite its ubiquity, lawmakers, technologists, and the media are failing to protect people from the threat of this particular kind of weaponized AI and its consequences, partly because they haven’t recognized it by name.
Known as intelligence, surveillance, target acquisition, and reconnaissance (ISTAR) systems, these tools, built by several companies, allow users to track, detain, and, in the context of war, kill people at scale with the help of AI. They deliver targets to operators by combining immense amounts of publicly and privately sourced data to detect patterns, and are particularly helpful in projects of mass surveillance, forced migration, and urban warfare. Also known as “AI kill chains,” they pull us all into a web of invisible tracking mechanisms that we are just beginning to comprehend, yet are starting to experience viscerally in the US as ICE wields these systems near our homes, churches, parks, and schools.
The invisible nature of these surveillance structures – and how they influence our lives – is part of the reason the public understanding of what these tools do is so murky. It is also, however, what drew me to work for Palantir as an architecture writer. It was a chance to get to know the digital spaces where many people spend most of their lives today. Working with cloud software in offices, driving new cars on our commutes, doom-scrolling on social media at home – we all feed vast amounts of data to surveillance and targeting programs created by big tech, which we often don’t recognize until it’s too late. This is why I continue trying to convey and illustrate how these ISTAR applications violate our civil rights and autonomy in increasingly perverse and violent ways.
The dragnets powered by ISTAR technology trap more than migrants and combatants; their families and connections are swept up in the wake as well. These systems appear to violate first and fourth amendment rights: first, by establishing vast and invisible surveillance networks that limit the things people feel comfortable sharing in public, including whom they meet or where they travel; and second, by enabling warrantless searches and seizures of people’s data without their knowledge or consent. They are rapidly depriving some of the most vulnerable populations in the world – political dissidents, migrants, or residents of Gaza – of their human rights.
There was a time when I wrote about the row homes in my neighborhood, with ornate windows and star-shaped iron studs, and how they welcomed migrants who sought hard work and opportunity in the US. With shared walls and affordable rents, they created tolerant and prosperous communities and accelerated the rise of the largest middle class in history. Now a new kind of architecture greets migrants and visitors to America and decides their future – one that is not made of bricks, mortar, and lumber, but comprising these invisible and invasive digital surveillance systems.
With names like Investigative Case Management (ICM) and ImmigrationOS, the big data platforms Palantir provides for the Department of Homeland Security, like those it offers the IDF, are fundamentally composed of three shared elements: the underlying data integrated into the system; the interpretation and modeling of that data through analytics; and the execution of automated actions – with or without human involvement. At every layer of this architecture, there are significant ethical questions regarding civil rights, data collection, data quality, bias, discrimination, accuracy, automation, and, most importantly, accountability.
Ultimately, however, these platforms generate and track targets by exploiting a mind-boggling range of datasets. This can include deeply personal information such as biometric and medical data, social media data involving friends and family, precise location data derived from license plate readers, SIM card data, and surveillance drone data. They can also process data purchased from a thriving ecosystem of private data brokers, or subpoenaed from companies such as Waymo and Meta. The lack of transparency regarding datasets exploited in these applications, and how they are shared across systems, further distorts the picture. That’s why it’s important to focus on the victims.
Soon, Trump’s mass deportation agenda – from targeting and tracking to managing the arrest and removal of migrants from the country – could be seamlessly coordinated using ISTAR tools. ICE recently paid Palantir tens of millions to enable “complete target analysis of known populations,” bolstering the Trump administration’s deportation efforts. In Gaza, Palantir provides the IDF with critical data infrastructure for war-related missions. The Israeli armed forces, meanwhile, have developed ISTAR tools of their own, like “Where’s Daddy,” which follow targets to their family homes for execution via cheap, unguided “dumb bombs.”
Palantir has contested reports that it conducts widespread surveillance of Americans and says it is “committed to defending human rights.” For all the reasons above, I reject those claims. It is time to embrace the cause of privacy again, or we will witness the unbridled proliferation of these targeting tools in our commercial and public lives. As AI targeting technologies become more normalized in the United States, they are also increasingly incorporated into the private sector as companies build their own dragnets of data with platforms like Palantir to target their customers and employees – not to kill or deport them, but to shape their behavior and maximize revenue, further entrenching systems of control.
Unfortunately, the fight for civil rights in the face of AI is faltering at the federal and state levels. In Colorado, the nation’s first consumer protection law on AI – which aims to protect state residents from algorithmic discrimination – is now under threat. That is why, last month, I took to the streets of Denver along with around 40 other activists to march from the state capitol to Palantir’s headquarters. We were joined by protesters from coast to coast – in Washington DC, New York, Palo Alto, and Seattle – who, driven by loose connections but a shared cause, also picketed Palantir’s offices in their cities. Four people were arrested in New York, and in Denver our small group was met with an impressive and coordinated show of force: roughly as many police officers as protesters along our two-mile route, with streets shut down and drones following our convoy.
Riding in my truck bed, I yelled out for the release of my neighbors from ICE custody – such as Eric Sanchez Goitia, Jeanette Vizguerra, and Nixon and Dixon Perez – people who, like me, have built their entire lives in Colorado.