Recently updated on October 5th, 2024 at 01:00 pm
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
This news story was very difficult to read. IDF Intelligence Officers involved in the use of Lavender in Gaza provided the information to these outlets because they are sickened by what is taking place. It was an eye-opening experience for me in more ways than just the obvious.
I had been condemning the IDF for the atrocities carried out in Gaza as if all members of the Israeli military were the same. I forgot that young men in Israel are forced to serve in the military. In fact, two stories came out about young men who refused to take part in this war and were sentenced to military prison instead.
Tal Mitnick, 18 years old, stated that he would be released from jail only to be called upon to join the IDF again. He soon found himself in a never-ending cycle of military prison stays meant to break him into compliance.
From The Guardian article on Tal Mitnick
On Tal Mitnick’s first morning inside an Israeli military prison last month, he was ordered into a small classroom. Pinned to its walls were various famous quotes. One caught his attention: “Education is the most powerful weapon you can use to change the world.” The name beneath it: Nelson Mandela.
“I nearly laughed to myself,” says the 18-year-old, speaking over Zoom from the bedroom of his family’s Tel Aviv home. “A military upholding apartheid putting that on their wall,” he says, “while South Africa was preparing its case against Israel for the international court of justice? I pointed out how ridiculous this quote being there was. No other prisoners engaged or agreed. I realised how alone I was.”
The Guardian
Israeli magazine +972 and Local Call broke the story on “Lavender,” the AI software used to create the Palestinian “hit list.”
From the article by Yuval Abraham:
The Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
+972 Magazine
Each militant classification level had a collateral damage allowance in place: the number of innocent civilians the army was permitted to kill in order to hit one target. For lower-level militants it was 15-20, while for senior officers, up to 300 civilians could be killed to take out a single target.
The Where’s Daddy system was created to track the men on the list of targets and alert IDF members when a target was entering his home so it could be bombed. IDF sources admitted there were sometimes children and other family members inside, but that didn’t matter. Since suspected low-level militants carried a collateral allowance of around 15-20, it was considered easiest to assassinate them at home, where only their family members were collateral damage. Tracking and attacking them in public wasn’t a practical option, so Where’s Daddy was created.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.
+972 magazine
Pause. Can we just take a moment to wrap our heads around the fact that a program designed to alert the IDF when a man, a husband, a father, was at home so that as his young children slept, they could drop a bomb on the home and murder innocent women, children and family members (along with a man who was likely innocent as well!), is called Where’s Daddy? Sick!
Men designated by Lavender as higher-level militants had a collateral damage ratio in the hundreds. Can you imagine ever thinking that way? I don’t care if someone who killed my family members was hiding in a group of 100 innocent people, or even behind just ONE child; I could never kill them just to get my revenge on that person. It’s just further proof (aside from calling them animals) that the Israeli government doesn’t see Palestinians as human beings.
ADD Moment: I will never forget the moment when Mehdi Hasan had an Israeli government spokesperson on his show who had just repeated the tired talking point, “Hamas is using Palestinians as human shields,” regarding a hospital bombing. Mehdi asked, “If a member of Hamas was hiding in the basement of a hospital in Israel, would you bomb it?” The man refused to answer the question directly, saying only, “That would never happen. We would never allow Hamas to get anywhere near one of our hospitals.”
We all know the real answer is: “No way! Innocent Israelis would be killed!”
Excuse me, sir, your bigotry and hate is showing.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
Tweet
An intelligence officer in the IDF told +972 and Local Call that they would drop 2,000-pound “dumb bombs” on the homes of the lower-ranking targets, explaining that they didn’t want to “waste” the more expensive precision missiles on unimportant targets.
They saved money and equipment by killing innocent family members.
That explains all of the destruction in Northern Gaza and why a majority of the buildings destroyed were residential.
This classification system and the collateral damage allowance are new to this war in Gaza. During Israel’s previous wars, since this was an “especially brutal” way to kill someone — often by killing an entire family alongside the target — such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.
However, October 7, 2023 changed that and all of the rules went out the window.
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy “incrimination” process: cross-check evidence that the person was indeed a senior member of Hamas’ military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.
+972 Magazine
The Lavender system was an auxiliary program prior to October 7th. The decision to begin using it in Gaza came just two weeks into the war. Was it tested? Did they know for sure that the 37,000 names on the list it generated were, indeed, members of Hamas? No. The decision to put it into use came after a human being investigated a sample of 100 targets and found the list to be 90% accurate. That means roughly 10% of the people marked for death had nothing to do with Hamas.
According to one Israeli intelligence officer, B.:
“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” B. said. “We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”
“It was very surprising for me that we were asked to bomb a house to kill a ground soldier, whose importance in the fighting was so low,” said one source about the use of AI to mark alleged low-ranking militants. “I nicknamed those targets ‘garbage targets.’ Still, I found them more ethical than the targets that we bombed just for ‘deterrence’ — highrises that are evacuated and toppled just to cause destruction.”
Another interesting tidbit we’re just now learning: some of the high-rise apartment buildings destroyed by the IDF were nothing but a “deterrent.” Those “deterrents” are part of the reason more than 80% of Palestinians (1.6 million) are internally displaced, with no home to return to someday.
How does Lavender work?
Like other AI software, it needs data to “learn” from. By inputting information about known Hamas militants, the system can identify hundreds of features that are common among the group. Using that information, it assigns a score indicating the likelihood that someone is a member of Hamas.
Lavender was then fed raw data collected through Israel’s surveillance of Palestinians. Applying the learned algorithm to that data, it looked for those hundreds of features and used the number present for each individual to produce a likelihood score, from 1 (unlikely) to 100 (extremely likely).
The features of a Hamas militant
According to the IDF sources +972 and Local Call spoke to, there are a number of common features among Hamas members. Surveillance data Israel treated as indicating Hamas involvement included: getting a new cell phone every few months (guess all iPhone users could be terrorists), belonging to a WhatsApp group that a known Hamas militant is a member of, and frequent address changes, just to name a few.
The man who envisioned a machine for creating target lists during war was recorded giving a lecture, and the following slides from his presentation provide a visual for my explanation of how it works.
Several people the reporters spoke to admitted that when a human being did check names on a list of targets before approving bombings, it amounted to about 20 seconds per target: just long enough to confirm the target was male. If a female name showed up on the list, they knew it was an error, because there are no women in the military wing of Hamas.
According to B., there were times when a Hamas operative would be killed and someone else would pick up his phone, or he would give it to an innocent civilian. B. admitted that most of the men, and the families, killed by mistake were killed because cell phones had changed hands, which happens a lot during a war.
Where’s Daddy?
From the piece by +972 magazine and Local Call:
The sources told +972 and Local Call that since everyone in Gaza had a private house with which they could be associated, the army’s surveillance systems could easily and automatically “link” individuals to family houses. In order to identify the moment operatives enter their houses in real time, various additional automatic softwares have been developed. These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several of these tracking softwares, revealed here for the first time, is called “Where’s Daddy?”
“You put hundreds [of targets] into the system and wait to see who you can kill,” said one source with knowledge of the system. “It’s called broad hunting: you copy-paste from the lists that the target system produces.”
Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities — 6,120 people — belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel’s deadliest war on the Strip), further suggesting the prominence of this policy.
Another source said that each time the pace of assassinations waned, more targets were added to systems like Where’s Daddy? to locate individuals that entered their homes and could therefore be bombed. He said that the decision of who to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy.
Sources also admitted that typically the youngest Hamas militant would be 17, but with Lavender, age wasn’t necessarily a factor.
“Let’s say you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house],” A. said. “Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”
Tweet
This summary doesn’t come close to the details and the photography from inside Gaza that you’ll find on the +972 magazine website. Be sure to read the full article for even more information on Lavender.