Violations of national airspace by drones are on the rise in Europe. When European leaders discussed these events at a meeting in Copenhagen, Denmark, in October 2025, they responded by announcing plans for a defensive “drone wall”.
So what is a drone wall? Put simply, it is a network of sensors, electronic warfare equipment and weapons. This “multi-layered” defensive wall is intended to detect, track and neutralise incursions by uncrewed aircraft – drones.
If a drone wall were implemented in Europe, it would fulfil two main tasks: monitoring the situation along Nato’s eastern borders, where Russia is seen as a potential threat, and providing air defence against drones. It could potentially protect against other airborne threats too, should hostilities break out.
It would not be a single, EU-owned system, but instead a network of national systems that can operate independently. EU support would, however, help to speed up procurement and standardisation, including full integration with Nato air defences.
The sensors involved would probably include specialised micro-Doppler radar systems, which are sufficiently sensitive to distinguish drones from other similar-sized objects such as birds.
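To illustrate the principle (and not any fielded radar), here is a minimal Python sketch: a propeller’s fast-spinning blades modulate the radar return far more widely than a bird’s slow wing-beats, so the width of the “micro-Doppler” spectrum alone can separate the two. All signal parameters and the classification threshold below are illustrative assumptions.

```python
# Illustrative sketch of micro-Doppler classification (not a real radar
# pipeline). A propeller-driven drone modulates the radar return at blade
# rates of hundreds of Hz; a bird's wing-beat is far slower (a few Hz),
# so the drone's micro-Doppler sidebands are much wider.
import numpy as np

FS = 10_000                      # sample rate of the simulated signal, Hz
T = np.arange(0, 1.0, 1 / FS)

def radar_return(mod_hz: float, mod_depth: float) -> np.ndarray:
    """Body return phase-modulated by rotating/flapping parts, plus noise."""
    body = np.exp(1j * 2 * np.pi * 100 * T)            # bulk Doppler, 100 Hz
    micro = np.exp(1j * mod_depth * np.sin(2 * np.pi * mod_hz * T))
    noise = 0.05 * (np.random.randn(T.size) + 1j * np.random.randn(T.size))
    return body * micro + noise

def doppler_bandwidth(sig: np.ndarray) -> float:
    """Width (Hz) containing 99% of the spectral energy."""
    spec = np.abs(np.fft.fftshift(np.fft.fft(sig))) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(sig.size, 1 / FS))
    cdf = np.cumsum(spec) / spec.sum()
    return freqs[np.searchsorted(cdf, 0.995)] - freqs[np.searchsorted(cdf, 0.005)]

drone = radar_return(mod_hz=200.0, mod_depth=5.0)  # fast blades: wide sidebands
bird = radar_return(mod_hz=4.0, mod_depth=2.0)     # slow wing-beat: narrow

for name, sig in [("drone", drone), ("bird", bird)]:
    bw = doppler_bandwidth(sig)
    label = "drone" if bw > 300.0 else "bird/other"  # threshold is illustrative
    print(f"{name}: micro-Doppler bandwidth = {bw:.0f} Hz -> classified as {label}")
```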
Jamming technology is also a key element of any effective drone defence system. Jammers send out radio-frequency signals that interfere with the operation of an enemy drone – for instance, by disrupting the connection between the drone and its operator.
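A simple link-budget calculation shows why this works: a jammer close to the protected area can drown out a control signal sent from much further away. The sketch below uses free-space path loss and assumed powers and distances, so the numbers are purely illustrative.

```python
# Back-of-envelope sketch of why a ground jammer can break a drone's
# control link. All powers and distances are illustrative assumptions,
# and free-space path loss is a simplification of real propagation.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

FREQ = 2.4e9                 # a common drone control-link band
P_OPERATOR_DBM = 30.0        # 1 W controller (assumed)
P_JAMMER_DBM = 50.0          # 100 W jammer (assumed)

d_operator = 8_000.0         # drone is 8 km from its operator...
d_jammer = 1_000.0           # ...but only 1 km from the jammer

s = P_OPERATOR_DBM - fspl_db(d_operator, FREQ)   # control signal at drone, dBm
j = P_JAMMER_DBM - fspl_db(d_jammer, FREQ)       # jamming power at drone, dBm

print(f"signal: {s:.1f} dBm, jammer: {j:.1f} dBm, J/S = {j - s:.1f} dB")
# A jamming-to-signal ratio well above 0 dB generally means the receiver
# can no longer decode its commands, severing the operator link.
```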
Finally, if the technology can be developed, a drone wall will eventually require drones to counter other drones. These small drones would need some means, probably munitions, of intercepting and destroying incoming uncrewed aircraft. The EU is keen to develop effective versions of these air-to-air interceptor “defensive” drones, which have so far proved very difficult to create.
The Ukraine war has shown that drones launched to attack foreign targets can often be deployed in large numbers, or swarms.
Drone swarms currently consist of individual aircraft each controlled by an operator. Russia has also launched hundreds of its “fire and forget” Shahed-based drones at a time in single wave attacks on Ukraine.
But fully autonomous drones, made possible with the help of AI, are on the horizon. These self-organised collectives of intelligent robots would operate in a coordinated manner, as a coherent entity – so similarly coordinated defences will be needed.
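The classic demonstration of such self-organisation is Craig Reynolds’ “boids” model, in which each agent follows only three local rules – cohesion, separation and alignment – yet the group moves as one. The toy sketch below is based on that model, not on any military system.

```python
# Toy sketch of self-organised swarm behaviour using the classic "boids"
# rules (Reynolds, 1987): cohesion, separation and velocity alignment.
# Purely illustrative -- real swarm control stacks are far more complex.
import numpy as np

rng = np.random.default_rng(0)
N = 20
pos = rng.uniform(0, 100, size=(N, 2))   # positions, metres
vel = rng.uniform(-1, 1, size=(N, 2))    # velocities, m/s

def step(pos, vel, dt=0.1):
    new_vel = vel.copy()
    for i in range(N):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbours = (dists > 0) & (dists < 20.0)        # 20 m sensing radius
        if not neighbours.any():
            continue
        cohesion = offsets[neighbours].mean(axis=0) * 0.01    # move toward group
        alignment = (vel[neighbours].mean(axis=0) - vel[i]) * 0.05
        too_close = neighbours & (dists < 5.0)
        separation = -offsets[too_close].sum(axis=0) * 0.05 if too_close.any() else 0.0
        new_vel[i] += cohesion + alignment + separation
        speed = np.linalg.norm(new_vel[i])
        if speed > 5.0:                                   # cap speed at 5 m/s
            new_vel[i] *= 5.0 / speed
    return pos + new_vel * dt, new_vel

for _ in range(500):
    pos, vel = step(pos, vel)

spread = np.linalg.norm(pos - pos.mean(axis=0), axis=1).mean()
print(f"mean distance from swarm centre after 500 steps: {spread:.1f} m")
```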
Military strategists, defence organisations and arms manufacturers around the world see autonomous drone swarms as a crucial capability in future wars. These swarms would be able to attack multiple targets simultaneously, thereby overwhelming an adversary’s defences. That could include single, tactical-level attacks against individual soldiers, or widespread attacks against cities and infrastructure.
Autonomous drone swarms will still be vulnerable to signal jamming if they need to communicate with each other or with a human operator. But a swarm in which each drone is individually programmed for its mission would be far more resistant to attempts to jam its signals.
Effectively defending Nato territory against drone swarms will require militaries to match enemy drones in both swarm size and level of autonomy.
Legal dimension
The widespread use of drones in the Ukraine war has led to rapid technological and tactical innovation. An example can be seen in responses to attempts by both sides to jam drone signals.
One way the Ukrainian and Russian militaries have responded is to have operators fly small drones controlled via lightweight fibre-optic cable. Up to 20km of cable provides a direct connection to the operator and requires no radio-frequency communication.
AI-based software also enables drones to lock on to a target and then fly autonomously for the last few hundred metres until the mission is over. Jamming is then impossible, and shooting down such a small flying object remains difficult.
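In guidance terms, the simplest version of that terminal phase is “pure pursuit”: once a target position is locked, the drone steers its velocity vector at the target on every control tick, needing no link to the operator. The sketch below is deliberately simplified; real systems use on-board visual tracking and more sophisticated guidance laws.

```python
# Minimal sketch of autonomous terminal guidance by pure pursuit: the
# drone repeatedly steers straight at a locked target position. No
# operator link is needed from lock-on onwards. Purely illustrative.
import numpy as np

drone = np.array([0.0, 0.0])        # drone position, metres
target = np.array([300.0, 120.0])   # locked target position (assumed static)
SPEED = 30.0                        # drone speed, m/s
DT = 0.05                           # control tick, seconds

t = 0.0
while True:
    offset = target - drone
    dist = float(np.linalg.norm(offset))
    if dist < 1.0:                  # within 1 m: mission over
        break
    # steer directly at the target, never overshooting in one tick
    drone += offset / dist * min(SPEED * DT, dist)
    t += DT

print(f"intercept after {t:.1f} s at position {drone.round(1)}")
```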
As autonomous capabilities evolve, however, there are legal ramifications to consider. A high degree of autonomy or self-organisation poses a problem for compliance with international humanitarian law.
Central concepts in this area include distinguishing combatants from civilians, and proportionality – weighing civilian harm against military requirements. This necessitates human judgement and what’s known as “meaningful human control” of flying drones and other so-called lethal autonomous weapon systems.
The principle of meaningful human control means that key decisions before, during and after the use of force should be made by people, not AI software. It also ensures that humans remain accountable and responsible in the use of force.
To ensure this is possible, machines must remain predictable and their actions explainable. The latter requirement is not straightforward with AI, which can often work in ways that even experts do not understand – the so-called “black box problem”. The expansion of autonomy in warfare means the need for binding rules and regulations is as urgent as ever.
The European Union stresses that humans should be responsible for making life-and-death decisions. The difficult task, however, is to develop a drone wall that has a high degree of autonomy while still enabling meaningful human control.
This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Peter Lee, University of Portsmouth; Ishmael Bhila, University of Paderborn, and Jens Hälterlein, University of Paderborn
Read more:
- A visual guide to 14 of the drones wreaking havoc in Ukraine, Russia and beyond
- Africa’s drone wars are growing – but they rarely deliver victory
- Chicken wire, AI and mobile phones on sticks: how the drone war in Ukraine is driving a fierce battle of innovation
Ishmael Bhila received funding from the German Federal Ministry of Research, Technology and Space under grant number 01UG22064.
Jens Hälterlein receives funding from the German Federal Ministry of Research, Technology and Space under grant number 01UG22064.
Peter Lee does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

