By KYLE MIZOKAMI - 01. June 2021
- Libyan forces reportedly used Kargu-2 drones to autonomously seek out and attack human targets.
- This is the first recorded case of using a self-hunting drone against people.
- Drone experts say this development could endanger people far beyond the traditional battlefield.
The world’s first recorded case of an autonomous drone attacking humans took place in March 2020, according to a United Nations (UN) security report detailing the ongoing Second Libyan Civil War. Libyan forces used the Turkish-made drones to “hunt down” and jam retreating enemy forces, preventing them from using their own drones.
The field report (via New Scientist) describes how the Haftar Affiliated Forces (HAF), loyal to Libyan Field Marshal Khalifa Haftar, came under attack by drones from the rival Government of National Accord (GNA) forces.
After a successful drive against HAF forces, the GNA launched drone attacks to press its advantage. From the report:
Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (above) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true “fire, forget and find” capability.
While the full details of the incident have not been released, and it is unclear whether there were any casualties, the event suggests that international efforts to ban lethal autonomous weapons before they are used may already be too late.
The robot in question is a Kargu-2 quadcopter produced by STM.
The report says Turkey supplied the drones to Libyan forces, in violation of a UN arms embargo on combatants in the conflict.
The Kargu-2 (“Hawk”), from Turkish defense contractor STM, is a quadcopter drone designed to carry a weapons payload. STM’s marketing video below explicitly describes Kargu-2 as being capable of autonomous attack.
How does it work? First, the operator loads coordinates into the Kargu-2 drone's software and launches the drone. The drone travels to those coordinates, identifies likely "targets," and executes a dive maneuver, swooping down on the target and detonating a shotgun-like explosive charge.
In military parlance, this process is known as “fire and forget,” which means once the shooter launches the drone, they can do something else, like relocate, prepare another attack, or even go eat lunch.
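The process described above amounts to a simple loop: fly to preset coordinates, search for a target, and commit if one is found. The sketch below is purely illustrative; STM has not published the Kargu-2's software, and every name here (`Waypoint`, `fire_and_forget`, the `detect_target` stub) is a hypothetical stand-in showing why "fire and forget" removes the operator from the decision.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float

def fire_and_forget(waypoints, detect_target):
    """Illustrative 'fire, forget and find' loop: the operator supplies
    coordinates up front and plays no further role. detect_target is a
    stand-in for whatever onboard classifier the drone carries."""
    for wp in waypoints:                     # 1. travel to each preset coordinate
        target = detect_target(wp)           # 2. onboard sensors look for a target
        if target is not None:
            return ("dive", wp, target)      # 3. commit to the terminal attack
    return ("loiter", waypoints[-1], None)   # no target found: keep loitering
```

The key point the sketch makes is structural: once `fire_and_forget` is called, no human input appears anywhere in the loop, which is exactly the regulatory problem experts describe.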
Drone experts have been dreading this moment while advocating for a ban on autonomous attack drones.
“The UN report implying first use of autonomous weapons against soldiers paints an uncertain picture—however, that’s the point,” Zachary Kallenborn, an official U.S. Army “Mad Scientist” and national security consultant, tells Pop Mech. He continues:
“The first use of autonomous weapons in war won't be heralded with a giant fireball in the sky and dark words on how humanity has become Death, Destroyer of Worlds. First use of autonomous weapons may just look like an ordinary drone. The event illustrates a key challenge in any attempt to regulate or ban autonomous weapons: how can we be sure they were even used?”
Kargu-2 drones under production at STM, Ankara, Turkey, June 2020. (Anadolu Agency/Getty Images)
The Kargu-2 indeed looks like any other quadcopter drone. The major difference is the software, which might be difficult to recover from scattered bits of plastic for forensic analysis. This raises the question: Could military forces modify civilian drones into human-hunting weapons to attack civilians?
There are some events in the history of mankind, like the atomic bomb test in Alamogordo, New Mexico in 1945, that are so profound, they serve as a divider between one social, economic, or military era and another.
The events in Libya may similarly divide the time when humans had full control of weapons, and a time when machines made their own decisions to kill.
BIG SHAME: Often under the guise of conservation organizations or "peacekeeping" missions, militaries have for over 40 years developed and tested the most heinous surveillance and autonomous killing systems in the remotest corners of African countries, many of which to this day lack any regulation or legislation on such technologies, just as African children still serve as guinea pigs for Big Pharma corporations and their medical mafia. AFRICA WAKE UP!!!
We need legislation against ‘killer robots,’ Human Rights Watch says
By Hannah Sparks - 10. August 2021
It’s a Hollywood blockbuster premise rooted in our not-so-distant future.
For decades, robot thrillers such as "The Terminator," "Blade Runner" and "Westworld" have warned viewers that our reliance on artificial intelligence is a real threat to civilization. Now, real-life researchers with Human Rights Watch are sounding the alarm on potentially world-ending "killer robots," according to a new report.
The message comes as part of their Campaign to Stop Killer Robots, which calls for a global ban on “fully autonomous weapons.”
In the words of Arnold Schwarzenegger’s Terminator: Hasta la vista, baby.
"Removing human control from the use of force is now widely regarded as a grave threat to humanity that, like climate change, deserves urgent multilateral action," said Mary Wareham, advocacy director of the arms division at Human Rights Watch, in a press release on HRW.org. Her campaign is pushing for "an international ban treaty" on AI-operated weapons.
Their review, which analyzed the defense policies of 97 countries that have outlined stances on lethal robotics, found that a majority hold that human control is fundamental to ethical weapons systems.
United Nations Secretary-General António Guterres has called the advanced programs “morally repugnant and politically unacceptable” and urged countries to take action.
The authors explain that autonomous weapons “would decide who lives and dies, without … inherently human characteristics such as compassion that are necessary to make complex ethical choices.” Aside from many other potential pitfalls of programming death machines, HRW suggests their use would also make unclear who would be held responsible for unlawful acts of war committed by an autonomous weapon: the computer programmers or the military commanders?
So far, at least 30 countries, including Austria, Brazil and Chile, are seeking to put a global ban on the use of these weapons, according to the report.
However, a minority of influential countries, namely Russia and the United States, have hampered talks — while notably “investing heavily in the development of various autonomous weapons systems,” the report claims.
Campaign to Stop Killer Robots
Help us achieve our goal of banning fully autonomous weapons.
Contact your government
Ask if they support the calls to ban fully autonomous weapons that would select and attack targets without human control.
1. Non-governmental organizations can apply to become a member of the campaign. There are no fees. Learn more and apply.
2. Government representatives should endorse and work for a ban on fully autonomous weapons.
3. Technology companies should publicly commit not to engage in the development of fully autonomous weapons and call for a ban.
4. Individual experts working in artificial intelligence, robotics and related fields are encouraged to join the International Committee for Robot Arms Control, a co-founder of our campaign.
June 28 - July 5
Hamburg - Many Hanseatic citizens were amazed!
A robot resembling a dog or a creature from Star Wars caused astonishment in Hamburg's city centre on Thursday.
Berlin-based security service provider Ciborius unveiled a new technology for security guards that will be deployed across Germany.
The four-legged robot "Spot" from Boston Dynamics can secure buildings without tiring, the company explained.
It has an all-round camera and is programmed to make its rounds agilely on its own. If it detects anything out of the ordinary, it reports it.