Phenomenology Boom!

Recent discussion reminded me of the classic sci-fi movie 'Dark Star'. If you've seen it, you'll know what I mean by 'Alien Beachball' - if you haven't, here's a short clip to show what you're missing...

The bomb, having learned Cartesian doubt, trusts only itself. It is convinced that only it exists, and that its sole purpose in life is to explode...

 
After taking some time to mull over this new concept, Bomb decides...

 
"[these] people should know when they're conquered."

The Gospel
(according to Benny)
The Gospel is an AI system that uses drones to mark buildings and structures that the army claims militants operate from; Lavender marks people — and puts them on a kill list... the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”

In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
 

STEP 1: GENERATING TARGETS

"Once you go automatic, target generation goes crazy"

In the Israeli army, the term “human target” referred in the past to a senior military operative who, according to the rules of the military’s International Law Department, can be killed in their private home even if there are civilians around.
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy “incrimination” process: cross-check evidence that the person was indeed a senior member of Hamas’ military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.

However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. Lavender has marked some 37,000 Palestinians as suspected “Hamas militants,” most of them junior, for assassination.

“It was very surprising for me that we were asked to bomb a house to kill a ground soldier, whose importance in the fighting was so low,” said one source about the use of AI to mark alleged low-ranking militants. “I nicknamed those targets ‘garbage targets.’ Still, I found them more ethical than the targets that we bombed just for ‘deterrence’ — highrises that are evacuated and toppled just to cause destruction.”

According to the sources, the army knew that the minimal human supervision in place would not discover these faults. “There was no ‘zero-error’ policy. Mistakes were treated statistically,” said a source who used Lavender. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it.”

To conduct the check, B. claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If [the operative] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage.”

In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake. According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender,” B. said.
 

STEP 2: LINKING TARGETS TO FAMILY HOMES

‘Most of the people you killed were women and children’


... the IDF Spokesperson claimed that “Hamas places its operatives and military assets in the heart of the civilian population, systematically uses the civilian population as human shields, and conducts fighting from within civilian structures, including sensitive sites such as hospitals, mosques, schools and UN facilities. The IDF is bound by and acts according to international law, directing its attacks only at military targets and military operatives.”

However, in contrast to the Israeli army’s official statements, the sources explained that a major reason for the unprecedented death toll from Israel’s current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families — in part because it was easier from an intelligence standpoint to mark family houses using automated systems.

Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel’s system of mass surveillance in Gaza is designed. “You put hundreds [of targets] into the system and wait to see who you can kill,” said one source with knowledge of the system. “It’s called broad hunting: you copy-paste from the lists that the target system produces.”

Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities — 6,120 people — belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel’s deadliest war on the Strip), further suggesting the prominence of this policy.

“One day, totally of my own accord, I added something like 1,200 new targets to the [tracking] system, because the number of attacks [we were conducting] decreased,” the source said. “That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels.” According to this source, even some minors were marked by Lavender as targets for bombing. “Normally, operatives are over the age of 17, but that was not a condition.”

“Let’s say you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house],” A. said. “Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”

Decimation.
 

STEP 3: CHOOSING A WEAPON

In December 2023, CNN reported that according to U.S. intelligence estimates, about 45 percent of the munitions used by the Israeli air force in Gaza were “dumb” bombs, which are known to cause more collateral damage than guided bombs. In response to the CNN report, an army spokesperson quoted in the article said: “As a military committed to international law and a moral code of conduct, we are devoting vast resources to minimizing harm to the civilians that Hamas has forced into the role of human shields. Our war is against Hamas, not against the people of Gaza.”

Three intelligence sources, however, said that junior operatives marked by Lavender were assassinated only with dumb bombs, in the interest of saving more expensive armaments. The implication, one source explained, was that the army would not strike a junior target if they lived in a high-rise building, because the army did not want to spend a more precise and expensive “floor bomb” (with more limited collateral effect) to kill him. But if a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
 

STEP 4: AUTHORIZING CIVILIAN CASUALTIES

‘We attacked almost without considering collateral damage’


One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These “collateral damage degrees,” as the military calls them, were applied broadly to all suspected junior militants, the sources said, regardless of their rank, military importance, and age, and with no specific case-by-case examination to weigh the military advantage of assassinating them against the expected harm to civilians.

According to A., this was the policy for most of the time that he served. Only later did the military lower the collateral damage degree. “In this calculation, it could also be 20 children for a junior operative … It really wasn’t like that in the past,” A. explained. Asked about the security rationale behind this policy, A. replied: “Lethality.”

“In the bombing of the commander of the Shuja’iya Battalion, we knew that we would kill over 100 civilians,” B. recalled of a Dec. 2 bombing that the IDF Spokesperson said was aimed at assassinating Wisam Farhat. “For me, psychologically, it was unusual. Over 100 civilians — it crosses some red line.”

Intelligence sources [say] they took part in even deadlier strikes. In order to assassinate Ayman Nofal, the commander of Hamas’ Central Gaza Brigade, a source said the army authorized the killing of approximately 300 civilians, destroying several buildings in airstrikes on Al-Bureij refugee camp on Oct. 17, based on an imprecise pinpointing of Nofal. Satellite footage and videos from the scene show the destruction of several large multi-storey apartment buildings. “Between 16 to 18 houses were wiped out in the attack,” Amro Al-Khatib, a resident of the camp, told +972 and Local Call. “We couldn’t tell one apartment from the other — they all got mixed up in the rubble, and we found human body parts everywhere.”

Nael Al-Bahisi, a paramedic, was one of the first on the scene. He counted between 50 and 70 casualties on that first day. “At a certain moment, we understood the target of the strike was Hamas commander Ayman Nofal,” he said. “They killed him, and also many people who didn’t know he was there. Entire families with children were killed.”

Another intelligence source told +972 and Local Call that the army destroyed a high-rise building in Rafah in mid-December, killing “dozens of civilians,” in order to try to kill Mohammed Shabaneh, the commander of Hamas’ Rafah Brigade (it is not clear whether or not he was killed in the attack). Often, the source said, the senior commanders hide in tunnels that pass under civilian buildings, and therefore the choice to assassinate them with an airstrike necessarily kills civilians.

“There was a completely permissive policy regarding the casualties of [bombing] operations — so permissive that in my opinion it had an element of revenge,” D., an intelligence source, claimed. “The core of this was the assassinations of senior [Hamas and PIJ commanders] for whom they were willing to kill hundreds of civilians. We had a calculation: how many for a brigade commander, how many for a battalion commander, and so on.”

B., the senior intelligence source, said that in retrospect, he believes this “disproportionate” policy of killing Palestinians in Gaza also endangers Israelis, and that this was one of the reasons he decided to be interviewed. “In the short term, we are safer, because we hurt Hamas. But I think we’re less secure in the long run. I see how all the bereaved families in Gaza — which is nearly everyone — will raise the motivation for [people to join] Hamas 10 years down the line. And it will be much easier for [Hamas] to recruit them.”
 

STEP 5: CALCULATING COLLATERAL DAMAGE

‘The model was not connected to reality’


In October, The New York Times reported on a system operated from a special base in southern Israel, which collects information from mobile phones in the Gaza Strip and provides the military with a live estimate of the number of Palestinians who fled the northern Gaza Strip southward. Brig. General Udi Ben Muha told the Times that “It’s not a 100 percent perfect system — but it gives you the information you need to make a decision.” The system operates according to colors: red marks areas where there are many people, and green and yellow mark areas that have been relatively cleared of residents.

To illustrate, if the army estimated that half of a neighborhood’s residents had left, the program would count a house that usually had 10 residents as a house containing five people. To save time, the sources said, the army did not surveil the homes to check how many people were actually living there, as it did in previous operations, to find out if the program’s estimate was indeed accurate.

“This model was not connected to reality,” claimed one source. “There was no connection between those who were in the home now, during the war, and those who were listed as living there prior to the war. [On one occasion] we bombed a house without knowing that there were several families inside, hiding together.” The source said that although the army knew that such errors could occur, this imprecise model was adopted nonetheless, because it was faster. As such, the source said, “the collateral damage calculation was completely automatic and statistical” — even producing figures that were not whole numbers.
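The scaling the sources describe is simple proportional arithmetic. As a purely illustrative sketch of that calculation (hypothetical code written for this post, not the military's actual software; the function name and parameters are invented):

```python
# Hypothetical illustration of the evacuation-weighted occupancy
# estimate described by the sources -- NOT the actual system.

def estimated_occupants(registered_residents: int,
                        evacuated_share: float) -> float:
    """Scale a house's pre-war resident count by the share of the
    neighbourhood estimated to still be present (1 - evacuated_share)."""
    return registered_residents * (1.0 - evacuated_share)

# A house registered with 10 residents, in a neighbourhood judged
# half-evacuated, is counted as holding 5 people:
print(estimated_occupants(10, 0.5))  # 5.0

# The same statistical scaling readily yields non-whole figures,
# matching the sources' account of fractional casualty estimates:
print(estimated_occupants(7, 0.5))   # 3.5
```

The fractional outputs show why a "completely automatic and statistical" calculation could produce figures that were not whole numbers: it never counts actual people, only scales a pre-war registry.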
 

STEP 6: BOMBING A FAMILY HOME

‘You killed a family for no reason’


Three intelligence sources said that they had witnessed an incident in which the Israeli army bombed a family’s private home, and it later turned out that the intended target of the assassination was not even inside the house, since no further verification was conducted in real time. “Sometimes [the target] was at home earlier, and then at night he went to sleep somewhere else, say underground, and you didn’t know about it,” one of the sources said. “There are times when you double-check the location, and there are times when you just say, ‘Okay, he was in the house in the last few hours, so you can just bomb.’”

Another source described a similar incident that affected him and made him want to be interviewed for this investigation. “We understood that the target was home at 8 p.m. In the end, the air force bombed the house at 3 a.m. Then we found out [in that span of time] he had managed to move himself to another house with his family. There were two other families with children in the building we bombed.”

“You don’t know exactly how many you killed, and who you killed,” an intelligence source told Local Call for a previous investigation published in January. “Only when it’s senior Hamas operatives do you follow the BDA procedure. In the rest of the cases, you don’t care. You get a report from the air force about whether the building was blown up, and that’s it. You have no idea how much collateral damage there was; you immediately move on to the next target. The emphasis was to create as many targets as possible, as quickly as possible.”

But while the Israeli military may move on from each strike without dwelling on the number of casualties, Amjad Al-Sheikh, the Shuja’iya resident who lost 11 of his family members in the Dec. 2 bombardment, said that he and his neighbors are still searching for corpses. “Until now, there are bodies under the rubble,” he said. “Fourteen residential buildings were bombed with their residents inside. Some of my relatives and friends are still buried.”*

*Graphic.

Yuval Abraham is a journalist and filmmaker based in Jerusalem. @972.mag - an independent, online, nonprofit magazine run by a group of Palestinian and Israeli journalists.
 
At great risk from the 'Wrath of Bod', the meat of this article was posted to illustrate how Israel is conducting this punitive expedition through the Gaza Strip, contradicting the party line from IDF bulletins and demonstrating that the policy goes to the very top of government: Netanyahu is responsible for the conduct of this war and for the IDF forces sent in to 'destroy Hamas'. It may or may not be Genocide, depending on your perspective, but the war crimes committed during the past six months are an undeniable part of this retribution for the Oct. 7th attacks.

The use of AI drones in identifying targets relates to the OP, where the human decision-making process is relegated to a bit-part player, relying on an algorithm to direct a bombing campaign that is indiscriminate and cruel. How would our government react to disclosure of such a policy in the public domain? No such criticism of the Zionist government responsible is allowed in Israel, and since its media is under a near-total blackout, no such revelations reach its people, who see the actions of their troops as entirely justified.

No such justification would be seen as credible in the West, which continues not only to support the government of Israel but to maintain and increase the supply of these weapons for use on civilian targets. The OP is science fiction: no bomb yet exists that can decide for itself whether to detonate on a direct order from a commanding officer, but the technology being developed will evolve into something that may have unexpected consequences in real time.

"Let there be light."
 
At great risk from the 'Wrath of Bod', the meat of this article was posted to illustrate how Israel is conducting this punitive expedition through the Gaza Strip
Nothing has really changed in this type of warfare.
The Nazis would pick civilians at random and shoot them into pits.

The Israelis use computers to pick their targets for death.
 
War is about death. Nothing will change that. The unfortunate thing with war is that innocent people, including children, get killed.

The best way to prevent unnecessary deaths is to send in ground troops. But Israel lost 1,200 people on October 7. They would rather not lose any more.
 
The world is facing the collapse of the 1948 international order established in the wake of World War II, amid the brutal wars in Gaza and Ukraine, while authoritarian policies continue to spread, Amnesty International has warned. Amnesty said while Israel continued to disregard international human rights law, the US, its foremost ally, and other countries including the United Kingdom and Germany were guilty of “grotesque double standards” given their willingness to back Israeli and US authorities over Gaza while condemning war crimes by Russia in Ukraine.

“Israel’s flagrant disregard for international law is compounded by the failures of its allies to stop the indescribable civilian bloodshed meted out in Gaza. Many of those allies were the very architects of that post-World War Two system of law,” Callamard said. “Alongside Russia’s ongoing aggression against Ukraine, the growing number of armed conflicts, and massive human rights violations witnessed, for example, in Sudan, Ethiopia and Myanmar – the global rule-based order is at risk of decimation.”

This year’s report also stressed how authoritarian ideas had continued to spread across the world, with narratives “based in hatred and rooted in fear”. Space for freedom of expression and association had been squeezed with ethnic minority groups, refugees and migrants bearing the brunt of the backlash, it said. Amnesty also warned of the risks to the rule of law posed by artificial intelligence (AI) and the dominance of Big Tech. “In an increasingly precarious world, unregulated proliferation and deployment of technologies such as generative AI, facial recognition and spyware are poised to be a pernicious foe – scaling up and supercharging violations of international law and human rights to exceptional levels,” Callamard warned.

Al Jazeera
 
Britain is finally leading the world ... on AI-powered surveillance.

While politicians in Europe tie themselves in knots trying to set limits on how facial recognition tools should be used by law enforcement, ministers in the post-EU U.K. have shown no such qualms...with plans to give more forces the ability to scan crowds in real-time as part of a £230 million police “productivity and technology” plan. Some British lawmakers even fear that the country could be hurtling towards a future where the movements of entire populations can be tracked in real-time by authorities with minimal oversight.

Under the Data Protection and Digital Information Bill, which returns to parliament this week, the government is proposing to abolish the position of surveillance camera commissioner, a role responsible for encouraging compliance with one of the few statutory codes that does mention LFR.
Fraser Sampson, who served in the role for four years before resigning last year, told POLITICO he thought this would lead to a gap in the oversight of public surveillance, and dismissed the government’s claim that other regulators, such as those covering data protection, will pick up the slack. “Our current arrangements in England and Wales are incomplete, inconsistent, and sometimes incoherent. That’s just not a great thing to have if you’re encouraging police to use it and citizens to accept it.”

The Home Office this month rejected recommendations from a House of Lords committee that it put police use of LFR on a firmer legal footing. It came after lawmakers found that there is currently no U.K. statute explicitly enabling officers’ use of facial recognition. “I think the resilience towards surveillance, which tends to be a negative, loaded word, is much higher in the U.K. society,” Jensen said. “But if you go that extensive that you know that the authorities are keeping a database of biometric identifiers of each and every citizen over a certain age, then it very much becomes a society that loses the trust.” “That’s not the democratic values that we support, nor that of the democracies we want to be a part of.”

Politico.uk
 