Summary
On 7 December 1941, at 7:02 AM, two U.S. Army privates operating the SCR-270B radar set at Opana Point on the northern tip of Oahu detected an incoming formation of aircraft 132 miles north of the island, close to the oscilloscope's display limit. Private Joseph Lockard recognised the signature as larger than any friendly flight scheduled that morning. Private George Elliott tracked the formation for 38 minutes as it closed to within 20 miles of the coast, where the return was lost in island ground clutter.
The sensor worked. The signature was captured. The range was correct, the bearing was correct, the magnitude was unambiguous. At 7:06 AM, Elliott telephoned the Information Centre at Fort Shafter. Private Joseph McDonald took the call and passed it to Lieutenant Kermit Tyler, on his second day of training as an observer. Tyler assumed the return was a flight of B-17s inbound from California. He told Opana Point not to worry about it.
At 7:48 AM, 183 Imperial Japanese Navy aircraft struck Pearl Harbor in the first wave of the attack.
The SCR-270B was a working sensor, correctly operated. The institutional framework to convert its output into response was not yet built.
Analysis
On 3 April 2026, Rob Lee and Dmytro Putiata published the Russian 2nd Combined Arms Army's Drone Line allocation for a 32-kilometre sector of the eastern Ukrainian front: 560 UAS per day, of which 360 were copter FPVs, 111 fibre optic FPVs, and 89 Molniya fixed-wing kamikazes. Seventeen and a half UAS per kilometre of frontage, per day. Scaled to the full Centre Group of Forces, 1,700 UAS crews operate under one command with a daily cap of 4,000 FPV. Named Rubicon detachments and spetsnaz brigades hold specific sectors beyond 10 kilometres past the forward line of enemy troops.
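The allocation arithmetic above can be checked directly. The sketch below uses only the figures quoted from Lee and Putiata's reporting; it is illustrative, not an independent verification of the underlying data.

```python
# Drone Line allocation figures as quoted in the text (not independently
# verified): a 32 km sector receiving 560 UAS per day, broken out by type.

SECTOR_KM = 32       # sector frontage, kilometres
COPTER_FPV = 360     # copter FPVs per day
FIBRE_FPV = 111      # fibre optic FPVs per day
MOLNIYA = 89         # Molniya fixed-wing kamikazes per day

daily_total = COPTER_FPV + FIBRE_FPV + MOLNIYA
density = daily_total / SECTOR_KM  # UAS per km of frontage, per day

print(daily_total)  # 560
print(density)      # 17.5
```

The three published type counts sum exactly to the published daily total, and the per-kilometre density works out to 17.5 UAS per day.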
The airframes are named: Molniya, Lancet, Shahed and Geran, Orlan-10, Zala-16, Supercam, Kub, R-18, Vobla, Vandal. The fibre optic FPVs emit no RF signal. The fixed-wing kamikazes fly profiles unlike any commercial quadcopter.
Open any public drone detection dataset. DroneRF captures RF signatures from three COTS quadcopters in a controlled lab. RFUAV extends to additional commercial drones. VisioDECT covers 20,924 annotated images from six COTS airframes across sunny, cloudy, and evening conditions. LRDDv2 adds long-range weather and lighting variance for COTS. The Anti-UAV benchmark, DUT Anti-UAV, SynDroneVision, Drone-type-Set, the Swedish Armed Forces multi-sensor collaboration, the College of Charleston multiclass acoustic set, CageDroneRF from Rowan University. Each is built for peacetime quadcopter threats to airports, stadiums, prisons, and critical infrastructure.
A combat corpus contains Molniya, Lancet, Shahed and Geran, Orlan-10 at operational range, Zala-16, Supercam, Kub. It captures fibre optic FPV through the modalities that still see it: acoustic, optical, radar. It is collected under active EW, at artillery adjacency, across frontline clutter. Academic papers solve weather, distance, and lighting. A combat corpus solves the rest.
The Opana Point Pattern
Current Western drone detection research faces the inverse of the 1941 condition. The framework for academic benchmarking is mature. The data it benchmarks against is structurally incomplete for the threat it addresses. A model that achieves 99% accuracy on DroneRF correctly classifies three COTS quadcopters in a controlled lab. It does not recognise a Molniya inbound through EW clutter at frontline density.
In 1941, the sensor worked and the framework was missing. In 2026, the framework is elaborate and the training data is peacetime. In both cases, the deliverable performs correctly on the wrong problem.
Combat Data
The combat data exists. Ukrainian acoustic networks — Zvook and Sky Fortress — have deployed more than 14,000 sensors across the front at unit costs between $400 and $1,000, with NATO funding another 15,000 (United24). General James Hecker, Commander of U.S. Air Forces Europe, told Aviation Week the network feeds target tracks to fire teams via tablet. Operational reporting credits the networks with 95% interception rates in documented engagements against Shahed-class one-way attack drones (drone-warfare.com).
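For scale, the sensor counts and unit-cost band quoted above imply a rough total outlay for the combined acoustic network. This is back-of-envelope arithmetic from the cited figures only, not a procurement estimate.

```python
# Figures as quoted in the text: ~14,000 sensors deployed, another 15,000
# NATO-funded, at $400-$1,000 per unit (United24 / operational reporting).

SENSORS_DEPLOYED = 14_000
NATO_FUNDED = 15_000
UNIT_COST_LOW, UNIT_COST_HIGH = 400, 1_000  # USD per sensor

total_sensors = SENSORS_DEPLOYED + NATO_FUNDED
cost_low = total_sensors * UNIT_COST_LOW
cost_high = total_sensors * UNIT_COST_HIGH

print(total_sensors)             # 29000
print(cost_low, cost_high)       # 11600000 29000000
```

Roughly $12-29 million for 29,000 sensors: a national-scale sensing layer at a fraction of the cost of a single conventional air-defence battery.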
The data these systems produce is combat-captured, combat-validated, and institutionally held. Zvook owns its training corpus. Sky Fortress operations are integrated with Ukrainian military command. The Swedish Armed Forces collaboration set used for 2025 defence-tech hackathons was recorded at 200-metre range in peacetime with camera-mount noise present.
The vendor landscape reflects the data landscape. Systems deployed in combat — Zvook and Sky Fortress at scale, Fenek in Ukraine, Bionic's Komodo, Welles Acoustics integrated onto Quantum Systems' ISR platforms — train on proprietary corpora drawn from the war. Systems at pilot or demonstration stage — Microflown AVISA, Squarehead, Askalon Industries, Monava, and others — train on peacetime collections.
The combat-deployed tier and the training-data tier correlate. This is not coincidental.
Operational Relevance
The question for any Western counter-drone programme is whether it has, or can obtain, training data collected under the conditions its systems will operate in. Academic dataset papers will continue to publish against airport-threat quadcopter benchmarks. Those datasets will keep doing their job for the problem they were built for.
A combat-relevant corpus is built from different data, under a different access model. It specifies its airframes — Molniya, Lancet, Shahed and Geran, Orlan-10 at range, the full operational load-out Lee and Putiata named. It specifies its conditions — active EW, artillery adjacency, frontline clutter. It is produced by partnership with the military organisations that operate under those conditions.
Ukrainian MoD frameworks exist for this class of data exchange. The airframes are documented. The corpus is buildable.
The operational partnership that builds it first gains a clear warfighting advantage.