Codenamed Lumberjack, Northrop Grumman’s drone hunted targets with AI during a live exercise with the division that jumped into Normandy
During Operation Lethal Eagle, one of the U.S. Army's largest readiness exercises, a combat drone named Lumberjack flew autonomously and simulated precision strikes against ground targets.
According to DefenseScoop, the test was conducted in April 2026 for the 101st Airborne Division — the same unit that jumped into Normandy on D-Day.
The drone was integrated into the Maven Smart System, built by Palantir, which uses artificial intelligence to identify targets, analyze battlefield data, and suggest actions to operators.
Additionally, the system features the Agentic Effects Agent — an AI agent that automates part of the decision-making process in combat.
Take off, identify, attack: how Lumberjack operates without direct human intervention
The Lumberjack is classified as a Group 3 UAS — an unmanned aircraft weighing between 25 and 600 kg, according to Department of Defense standards.
As detailed by Northrop Grumman, it is a low-cost, single-use strike system designed for modern combat.
As a single-use system, the drone does not return to base: it is launched, executes its mission, and is expended in the process.
The key differentiator is its modular core, which allows for rapid swapping between kinetic (explosives) and non-kinetic (electronic interference) payloads.
The same airframe can thus either destroy a target or silence enemy communications, depending on the module installed before launch.

The first time a combat drone connected to the Maven Smart System in a live exercise
According to Northrop Grumman, Operation Lethal Eagle was the first customer demonstration where Lumberjack operated integrated with the Maven Smart System.
Palantir's system functioned as the "brain" of the operation, receiving sensor data, processing imagery in real time, and generating engagement recommendations.
The drone, in turn, executed orders autonomously — from route planning to the moment of the simulated attack.
Human operators, however, maintained supervision throughout: the final engagement decision still passed through a manned command station. The exercise thus served to test the limits of autonomy under conditions close to real combat, without the risk of casualties.
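The division of labor the article describes, with an AI proposing engagements and a human retaining the veto, can be sketched as a simple pipeline. Everything below is an illustrative assumption: the function and field names are invented and do not represent the Maven Smart System's real interfaces.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Recommendation:
    """A hypothetical AI-generated engagement suggestion."""
    target_id: str
    confidence: float


def plan(recommendations, authorize):
    """Autonomy pipeline with a human veto gate.

    `authorize` stands in for the manned command station: nothing is
    executed unless a human has explicitly approved it.
    """
    approved = []
    for rec in recommendations:
        if authorize(rec):  # human decision, not the model's
            approved.append(rec.target_id)
    return approved


recs = [Recommendation("T-01", 0.94), Recommendation("T-02", 0.41)]
# Example policy: the operator approves only high-confidence suggestions.
engaged = plan(recs, authorize=lambda r: r.confidence >= 0.9)
print(engaged)  # ['T-01']
```

Note that in this structure the model never acts directly: the `authorize` callback is the choke point, which is exactly the "human in the loop" the article says current doctrine requires.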

The 101st Airborne tests the future of war: drones that replace the first jump
The choice of the 101st Airborne Division to test Lumberjack is no coincidence.
Also known as the "Screaming Eagles," the division is the U.S. Army's most famous rapid-response unit.
Traditionally, paratroopers jump first into hostile territory to clear the way. With autonomous drones, the first “jump” can be made by machines.
Integrating autonomous attack drones with airborne units points toward a new combat doctrine, one in which technology reduces human risk in the most dangerous phases of an operation.

The ethical debate: should a machine be able to decide who lives and who dies?
At the same time, the advance of autonomous drones with strike capability raises profound ethical questions.
International organizations, including the Red Cross, are calling for regulation on lethal autonomous weapons — systems that can select and attack targets without human intervention.
The speed of combat decision-making favors automation, but every millisecond shaved from human deliberation raises the risk of irreversible error.
Still, militaries argue that keeping a human in the loop remains a requirement. Lumberjack suggests targets, but an operator authorizes engagement.
Even so, the distance between "automated suggestion" and "autonomous decision" is measured in lines of code, and the pressure of real combat could quickly render that theoretical distinction obsolete.
