As part of the Gaza war, the Israel Defense Forces (IDF) has used artificial intelligence to rapidly and automatically perform much of the process of determining what to bomb. Israel has greatly expanded its bombing of the Gaza Strip, which in previous wars had been limited by the Israeli Air Force running out of targets.
These tools include the Gospel, an AI that automatically sifts through surveillance data looking for buildings, equipment and people thought to belong to the enemy, and, upon finding them, recommends bombing targets to a human analyst, who may then decide what action, if any, to take against the target.
Critics have argued that the use of these AI tools puts civilians at risk, blurs accountability, and results in militarily disproportionate violence in violation of international humanitarian law.
The Habsora system
Israel uses an AI system named "Habsora" ("the Gospel" in English) to determine which targets the Israeli Air Force will bomb.[1] It automatically provides a targeting recommendation to a human analyst,[2][3] who decides whether to pass it along to soldiers in the field.[3]
AI can process intelligence assessments far faster than humans.[4][5] Retired general Aviv Kochavi, head of the IDF general staff until 2023, stated that the system could produce 100 bombing targets in Gaza a day.[6] A speaker interviewed by NPR estimated the comparable figures as 50 to 100 targets in 300 days for 20 intelligence officers, versus 200 targets within 10 to 12 days for Habsora.[7]
The Guardian published interviews with several Israeli military intelligence officers who said that Palestinian men linked to Hamas's military wing were treated as potential targets regardless of rank or importance,[8] and that low-ranking Hamas members were preferentially targeted.[9] Two of the sources said that attacks on low-ranking militants were typically carried out with unguided bombs, destroying entire homes and killing everyone in them. One of the sources said that Israel does not spend expensive bombs on unimportant people.[10]
Technological background
Despite its name, artificial intelligence is not capable of thought or consciousness.[11] Rather, AI systems are designed to perform tasks that humans would normally accomplish using their cognitive abilities. The Gospel uses machine learning,[12] in which an AI is tasked with identifying patterns in large data sets (such as images of cancerous tissue, photos showing facial expressions, or surveillance footage of Hamas members identified by human analysts) and subsequently searching for those patterns in new data.[13]
What intelligence sources the Gospel draws on is unclear, but it is believed to combine surveillance data from a wide range of sources in enormous quantities.[14] Its recommendations emerge from pattern recognition: a person who shows enough similarities to other people classified as enemy combatants may also be designated a combatant.[12] Heidy Khlaaf, engineering director of AI Assurance at the technology security firm Trail of Bits, was quoted by NPR on AI's suitability for the task. She stated that "AI algorithms are notoriously flawed with high error rates observed across applications that require precision, accuracy, and safety". Bianca Baggiarini, a lecturer at the Australian National University's Strategic and Defence Studies Centre, said AIs are "more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent". She highlighted the challenge of distinguishing between combatants and non-combatants, a task that even humans frequently struggle to accomplish.[15] Khlaaf emphasized that the decisions such a system makes rest entirely on the data it was trained on; they do not derive from reasoning, factual evidence, or causation, but exclusively from statistical probability.[16]
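The similarity-based labelling described above can be illustrated with a deliberately simplified sketch. The toy Python code below is not drawn from any real system; every feature vector, label, and threshold in it is invented for illustration. It shows Khlaaf's point in miniature: a label is assigned purely because a new example statistically resembles previously labelled examples, with no reasoning about the individual case.

```python
# Toy sketch of similarity-based classification. All data here is invented.
# The output depends entirely on the labelled examples and the threshold,
# so any bias or gap in the "training" data propagates directly into labels.

def similarity(a, b):
    """Fraction of features on which two examples agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def classify(example, labeled_examples, threshold=0.75):
    """Label an example 'positive' if it resembles any example already
    labelled positive closely enough; otherwise label it 'negative'."""
    for known, label in labeled_examples:
        if label == "positive" and similarity(example, known) >= threshold:
            return "positive"
    return "negative"

# Invented feature vectors; each position stands for some observed attribute.
training = [
    ((1, 0, 1, 1), "positive"),
    ((0, 0, 0, 1), "negative"),
]

print(classify((1, 0, 1, 0), training))  # 3/4 features match a positive example -> "positive"
print(classify((0, 1, 0, 0), training))  # no close positive match -> "negative"
```

Note that nothing in the sketch checks whether the "positive" label is actually warranted for the new example; the decision is a statistical correlation with past labels, which is the failure mode the critics quoted above describe.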
The Gospel is used by the military's target administration division (also known as the Directorate of Targets or Targeting Directorate[17]), formed in 2019 within the IDF's intelligence directorate.[18] The division was created to address the air force's problem of running out of targets to strike. Kochavi described the division as "powered by AI capabilities" and comprising hundreds of officers and soldiers.[19] Beyond its wartime duties, The Guardian reported that in recent years the division had helped the IDF build a database of between roughly 30,000 and 40,000 suspected militants, and that systems such as the Gospel had played a critical role in compiling lists of individuals authorized to be assassinated.[18]
Use in 2021
Kochavi compared the division using the Gospel to a machine, stating that once it was activated during the May 2021 conflict it generated 100 targets a day, 50 of which were attacked. That was a considerable increase over the previous rate of 50 targets in Gaza per year.[20] According to the military, approximately 200 of the targets Israel struck in that conflict came from the Gospel,[21] out of a total of 1,500 targets engaged, a figure that includes both static and moving targets.[22] In its after-action report, the Jewish Institute for National Security of America identified a problem: the system had data on what constituted a target, but lacked data on what did not.[23] Because the system relies exclusively on its training data,[16] and intelligence that human analysts had reviewed and determined not to be targets had been discarded, the system's training risked being biased. The institute's vice president expressed hope that this had since been rectified.[22]
Reactions
- United Nations Secretary-General António Guterres said he was "deeply troubled" by reports that Israel was using artificial intelligence in its military operations in Gaza, saying the practice puts civilians at risk and blurs accountability.[24]
- Marc Owen Jones, a professor at Hamad Bin Khalifa University, said of the AI system: "Let's be clear: this is an AI-assisted genocide, and going forward, there needs to be a halt to the use of AI in war".[25]
- Ben Saul, a United Nations special rapporteur, stated that if reports about Israel's use of AI were accurate, "many Israeli strikes in Gaza would constitute war crimes".[26]
References
- ↑ Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
- ↑ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
The Gospel is actually one of several AI programs being used by Israeli intelligence, according to Tal Mimran, a lecturer at Hebrew University in Jerusalem who has worked for the Israeli government on targeting during previous military operations. Other AI systems aggregate vast quantities of intelligence data and classify it. The final system is the Gospel, which makes a targeting recommendation to a human analyst. Those targets could be anything from individual fighters, to equipment like rocket launchers, or facilities such as Hamas command posts.
- ↑ 3.0 3.1 Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
A brief blog post by the Israeli military on November 2 lays out how the Gospel is being used in the current conflict. According to the post, the military's Directorate of Targets is using the Gospel to rapidly produce targets based on the latest intelligence. The system provides a targeting recommendation for a human analyst who then decides whether to pass it along to soldiers in the field.
"This isn't just an automatic system," Misztal emphasizes. "If it thinks it finds something that could be a potential target, that's flagged then for an intelligence analyst to review."
The post states that the targeting division is able to send these targets to the IAF and navy, and directly to ground forces via an app known as "Pillar of Fire," which commanders carry on military-issued smartphones and other devices.
- ↑ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
Algorithms can sift through mounds of intelligence data far faster than human analysts, says Robert Ashley, a former head of the U.S. Defense Intelligence Agency. Using AI to assist with targeting has the potential to give commanders an enormous edge.
"You're going to make decisions faster than your opponent, that's really what it's about," he says.
- ↑ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
Militaries and soldiers frame their decision-making through what is called the “OODA loop” (for observe, orient, decide, act). A faster OODA loop can help you outmanoeuvre your enemy. The goal is to avoid slowing down decisions through excessive deliberation, and instead to match the accelerating tempo of war. So the use of AI is potentially justified on the basis it can interpret and synthesise huge amounts of data, processing it and delivering outputs at rates that far surpass human cognition.
- ↑ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
- ↑ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
- ↑ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024.
In the weeks after the Hamas-led 7 October assault on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men linked to Hamas’s military wing as potential targets, regardless of their rank or importance.
- ↑ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024.
The testimonies published by +972 and Local Call may explain how such a western military with such advanced capabilities, with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.
When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
- ↑ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024.
Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.
“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs],” one intelligence officer said. Another said the principal question they were faced with was whether the “collateral damage” to civilians allowed for an attack.
“Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”
- ↑ Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 13:
Machine learning relies on algorithms to analyze huge datasets. Currently, machine learning can’t provide the sort of AI that the movies present. Even the best algorithms can’t think, feel, present any form of self-awareness, or exercise free will.
- ↑ 12.0 12.1 Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
How does the system produce these targets? It does so through probabilistic reasoning offered by machine learning algorithms.
Machine learning algorithms learn through data. They learn by seeking patterns in huge piles of data, and their success is contingent on the data’s quality and quantity. They make recommendations based on probabilities.
The probabilities are based on pattern-matching. If a person has enough similarities to other people labelled as an enemy combatant, they too may be labelled a combatant themselves.
- ↑ Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 33:
The secret to machine learning is generalization. The goal is to generalize the output function so that it works on data beyond the training set. For example, consider a spam filter. Your dictionary contains 100,000 words (actually a small dictionary). A limited training dataset of 4,000 or 5,000 word combinations must create a generalized function that can then find spam in the 2^100,000 combinations that the function will see when working with actual data.
- ↑ Inskeep, Steve. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
The system is called the Gospel. And basically, it takes an enormous quantity of surveillance data, crunches it all together and makes recommendations about where the military should strike.
- ↑ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
Some claim machine learning enables greater precision in targeting, which makes it easier to avoid harming innocent people and using a proportional amount of force. However, the idea of more precise targeting of airstrikes has not been successful in the past, as the high toll of declared and undeclared civilian casualties from the global war on terror shows.
Moreover, the difference between a combatant and a civilian is rarely self-evident. Even humans frequently cannot tell who is and is not a combatant.
Technology does not change this fundamental truth. Often social categories and concepts are not objective, but are contested or specific to time and place. But computer vision together with algorithms are more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent.
- ↑ 16.0 16.1 Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
The Israeli military did not respond directly to NPR's inquiries about the Gospel. In the November 2 post, it said the system allows the military to "produce targets for precise attacks on infrastructures associated with Hamas, while causing great damage to the enemy and minimal harm to those not involved," according to an unnamed spokesperson.
But critics question whether the Gospel and other associated AI systems are in fact performing as the military claims. Khlaaf notes that artificial intelligence depends entirely on training data to make its decisions.
"The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or 'causation,'" she says.
- ↑ Leshem, Ron (30 June 2023). "IDF possesses Matrix-like capabilities, ex-Israeli army chief says". Ynetnews. Retrieved 26 March 2024.
- ↑ 18.0 18.1 Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024.
In early November, the IDF said “more than 12,000” targets in Gaza had been identified by its target administration division.
The activities of the division, formed in 2019 in the IDF’s intelligence directorate, are classified.
However a short statement on the IDF website claimed it was using an AI-based system called Habsora (the Gospel, in English) in the war against Hamas to “produce targets at a fast pace”.
[...] In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspected militants. Systems such as the Gospel, they said, had played a critical role in building lists of individuals authorised to be assassinated.
- ↑ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024.
The target division was created to address a chronic problem for the IDF: in earlier operations in Gaza, the air force repeatedly ran out of targets to strike. Since senior Hamas officials disappeared into tunnels at the start of any new offensive, sources said, systems such as the Gospel allowed the IDF to locate and attack a much larger pool of more junior operatives.
One official, who worked on targeting decisions in previous Gaza operations, said the IDF had not previously targeted the homes of junior Hamas members for bombings. They said they believed that had changed for the present conflict, with the houses of suspected Hamas operatives now targeted regardless of rank.
- ↑ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024.
Aviv Kochavi, who served as the head of the IDF until January, has said the target division is “powered by AI capabilities” and includes hundreds of officers and soldiers.
In an interview published before the war, he said it was “a machine that produces vast amounts of data more effectively than any human, and translates it into targets for attack”.
According to Kochavi, “once this machine was activated” in Israel’s 11-day war with Hamas in May 2021 it generated 100 targets a day. “To put that into perspective, in the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets a single day, with 50% of them being attacked.”
- ↑ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
A report by the Israeli publication +972 Magazine and the Hebrew-language outlet Local Call asserts that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.
NPR has not independently verified those claims, and it's unclear how many targets are currently being generated by AI alone. But there has been a substantial increase in targeting, according to the Israeli military's own numbers. In the 2021 conflict, Israel said it struck 1,500 targets in Gaza, approximately 200 of which came from the Gospel. Since October 7, the military says it has struck more than 22,000 targets inside Gaza — a daily rate more than double that of the 2021 conflict.
The toll on Palestinian civilians has been enormous.
- ↑ 22.0 22.1 Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
Misztal's group documented one of the first trials of the Gospel, during a 2021 conflict in Gaza between Israel and the militant groups Hamas and Islamic Jihad. According to press reports and statements from the military itself, Israel used the Gospel and other AI programs to identify likely targets such as rocket launchers. The system was used to identify static targets as well as moving targets as they appeared on the battlefield. According to press reports, it identified around 200 targets in the conflict.
But it was not without its problems. The after-action report by Misztal's group noted that, while the AI had plenty of training data for what constituted a target, it lacked data on things that human analysts had decided were not targets. The Israeli military hadn't collected the target data its analysts had discarded, and as a result the system's training had been biased.
"It's been two years since then, so it's something that, hopefully, they've been able to rectify," Misztal says. - ↑ [۱]
- ↑ "UN chief 'deeply troubled' by reports Israel using AI to identify Gaza targets". France 24 (in English). 2024-04-05. Retrieved 2024-04-06.
- ↑ "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 12 April 2024.
- ↑ "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 16 April 2024.
