Weaponized Drones and AI: Another Unappreciated Danger to Humanity


Date: 2025-06-21

In an era obsessed with innovation and convenience, few technological trends have accelerated with as little public oversight—and as much peril—as autonomous drones and artificial intelligence (AI) on the battlefield. Once relegated to science fiction, unmanned aerial vehicles (UAVs), robotic warships, and killer quadcopters are now deployed or in development across the globe. These machines, increasingly untethered from direct human control, signal a seismic shift in modern warfare that most of humanity is either unaware of or dangerously underestimating.

We live in a time when a 200-foot-long robotic naval vessel can pilot itself across thousands of miles of ocean with almost no human intervention. When swarms of quadcopters armed with machine guns can coordinate their attacks using AI. When tanks and aircraft can track, aim, and kill without a single command from a human hand. And yet there is no Geneva Convention, no NATO treaty, no binding international framework limiting this rapidly expanding field. We are building Terminators in our own image and naming our surveillance systems after the machines that destroyed humanity in fiction—China’s “Skynet” being a particularly on-the-nose example.

From Remote-Controlled Tools to Autonomous Killers

The term “drone” once evoked a glorified model airplane—something piloted remotely, dependent on a ground crew, and subject to human judgment. But in the span of two decades, the technology has transformed beyond recognition. Twenty years ago, armed military drones were rare and operated almost exclusively by the United States. Today, more than eighty nations field military UAVs, supported by a booming private sector of manufacturers. Even that picture is already out of date.

The real danger is not the drone per se, but what powers it: artificial intelligence. Unlike traditional UAVs, which still require a person to fly, monitor, and decide on lethal action, the new generation of military hardware is designed to operate autonomously. Turkey’s Kargu loitering munition has reportedly identified and engaged human targets on its own, and its Songar gun drone is built for the same class of missions. Russia’s T-14 Armata tank pairs an unmanned turret with automated target tracking, and Moscow claims it will eventually operate without a crew. The U.S. Navy’s Overlord unmanned surface vessels have sailed thousands of miles autonomously, and its X-47B demonstrator performed autonomous carrier launches, landings, and aerial refueling. These are not tools guided by people—they are machines entrusted with life-and-death decisions.

Lessons from Ukraine: A Glimpse into the Drone-Driven Battlefield

The ongoing war between Russia and Ukraine offers the world a sobering glimpse into the near future of drone warfare. Both nations have employed UAVs extensively, not just for reconnaissance and artillery spotting, but also for direct attacks. The conflict has become a proving ground for modern drone warfare, validating both the strategic value and the existential risks of these systems.

The proliferation of drones in the Ukraine conflict demonstrates how accessible and adaptive this technology has become. From commercial quadcopters retrofitted with explosives to military-grade UAVs, the tools of drone warfare are now decentralized, inexpensive, and remarkably effective. Ukrainian forces have used relatively low-cost drones to disable tanks worth millions of dollars. The battlefield has seen not just large drones but swarms of small quadcopters used to harass enemy lines, drop grenades, or deliver tactical intelligence in real time.

This new mode of war has introduced a stark asymmetry. Nations or groups with limited resources can now deploy drones to level the playing field against technologically superior adversaries. That affordability, however, also means these weapons can fall into the hands of non-state actors—terrorist groups, warlords, and rogue hackers—without the oversight or ethical constraints of formal militaries.

The Ukraine war has also demonstrated the crucial role of electronic warfare in the drone age. Both sides have developed sophisticated systems to jam, misdirect, or neutralize enemy drones. This arms race has led to creative countermeasures, such as drones guided through a spooled fiber-optic cable, which are immune to radio jamming because they need no wireless link. It reveals a deeply unstable environment where control of the skies depends less on jet fighters than on invisible battles in the electromagnetic spectrum.

Perhaps most unsettling is the recruitment of civilian operators, often people with gaming experience, who are trained to fly drones and deliver lethal payloads by joystick from remote locations. This gamification of war distances the human operator even further from the battlefield—and from the moral weight of taking lives.

The Military Logic—and Moral Abyss—of Automation

Why are governments racing toward automation? The logic is cold but compelling: With falling birthrates and increasing political resistance to military casualties, democratic and authoritarian governments alike prefer sending robots to war rather than risking citizens. Public sentiment reinforces this shift. Ask any population whether they'd prefer to fight a war with machines instead of their children, and the answer is a resounding yes.

But flip the question: Would they want to be attacked by machines? The answer, just as consistently, is no. Therein lies the paradox. What’s desirable for defense is terrifying for offense. A world that deploys autonomous killing machines becomes one where human life is no longer protected by empathy, hesitation, or conscience—but by algorithms and code.

This problem is compounded by the speed and scalability of autonomous warfare. Unlike nuclear weapons, which are highly centralized and difficult to steal or replicate, autonomous drones and AI-enabled killing systems can be mass-produced, smuggled, and easily hacked. The Ukraine conflict has already seen reports of drones reprogrammed in the field, hijacked, or used by irregular forces, underscoring just how precarious this new era truly is.

A Third Revolution in Warfare?

In August 2017, a coalition of AI and robotics experts—including Elon Musk and DeepMind co-founder Mustafa Suleyman—signed an open letter to the United Nations calling for a global ban on autonomous weapons. The letter described such machines as the potential catalysts of a “third revolution in warfare,” following gunpowder and nuclear arms. It warned of a near-future arms race in which despots and terrorists could use autonomous weapons to carry out assassinations, ethnic cleansing, or acts of terror with no human oversight and little accountability. Musk and his co-signers wrote that these weapons “have the potential to be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.” (The Guardian, 2017).

Despite these warnings, progress remains halting. Since 2014, discussions under the United Nations Convention on Certain Conventional Weapons (CCW) have aimed to explore controls or bans on autonomous systems. However, key powers—such as the U.S., Russia, China, and India—have resisted binding regulations, citing strategic concerns. As of May 2025, UN meetings continue to address the issue, but binding international standards remain conspicuously absent (Reuters, 2025).

The Coming Swarm: A New Kind of WMD

What sets autonomous drones apart from traditional weapons is their capacity to swarm. Dozens, hundreds, or thousands of small, AI-enabled machines can communicate, coordinate, and attack in unison, overwhelming defenses never designed for such threats. Unlike nuclear bombs, which are meant to deter, drone swarms can be used both surgically and indiscriminately.

The Ukraine conflict has made clear how quickly drone swarms can change the tempo of a battle. Ukrainian forces have used swarms of FPV drones in synchronized attacks on convoys and ammunition depots. The possibility of these swarms becoming autonomous—capable of navigating, identifying, and engaging targets without human input—adds a terrifying new dimension to their use. The ethical, tactical, and humanitarian implications of this development cannot be overstated.

Conclusion: The Cost of Indifference

In CyberWar, author Matthew Mather warns that a true conflict involving autonomous machines would make fiction look tame. He is not exaggerating. The world is sleepwalking toward a future where wars are fought by robots, decisions are made by neural nets, and no human ever sees the moment of death. The Ukraine war has moved these warnings from the realm of speculation into stark, visible reality.

As with climate change or genetic engineering, the dangers of autonomous weapons lie not only in what they can do, but in how little thought we’ve given to whether they should. The question is no longer “Can we build these machines?” It’s “Can we stop them before they build a future none of us want to live in?”

The clock is ticking. And this time, it’s not counting down to zero—it’s counting up, as we let the machines take over one battlefield at a time.

Citations & Further Reading:

⁃ Mather, Matthew. CyberWar (World War C, Book 3).

⁃ The Guardian. “Elon Musk and AI Experts Call for Ban on Killer Robots.” August 20, 2017. https://www.theguardian.com/technology/2017/aug/20/elon-musk-killer-robots-experts-outright-ban-lethal-autonomous-weapons-war

⁃ Reuters. “Nations Meet at UN for Killer Robot Talks as Regulation Lags.” May 12, 2025. https://www.reuters.com/sustainability/society-equity/nations-meet-un-killer-robot-talks-regulation-lags-2025-05-12/

⁃ Council on Foreign Relations. “How the Drone War in Ukraine Is Transforming Conflict.”

⁃ Campaign to Stop Killer Robots. https://www.stopkillerrobots.org/

⁃ United Nations Office for Disarmament Affairs. https://www.un.org/disarmament/
