Mykhailo Fedorov gave an interview to The New York Times, and the most interesting part of it is not the quote about nuclear weapons but a detail the headlines overlooked: the minister's strategy has already provoked an internal conflict within the military command.
Mathematics Instead of Infantry
Fedorov's assistant Valeriya Ionan told the NYT that the minister "believes in the mathematics of war". In his vision, the "kill zone", the strip along the front line now dominated by drones, will eventually be emptied of people. Robots on land and in the air will fight each other.
"Autonomous weapons are the new nuclear weapons. Countries that possess them will be protected."
Mykhailo Fedorov, Ukraine's Minister of Digital Transformation, to the NYT
The strategy has an official name, "Air, Land, Economy", and has been approved by Zelenskyy. It has three objectives: intercept at least 95% of Russian drones and missiles, strike Russian oil export terminals, and eliminate Russian personnel faster than Moscow can replace them through recruitment.
Where It's No Longer Theory
Alongside the interview, Fedorov announced the combat deployment of an autonomous AI turret developed within the Brave1 cluster in cooperation with Palantir. The system operates in semi-autonomous mode, detecting and shooting down drones, including those resistant to electronic warfare. The turrets are currently fielded with more than 10 units on the hottest sections of the front. The next step, according to the minister, is mass production along the entire line of contact.
Through the Brave1 Dataroom platform, created together with Palantir, over 100 companies are training more than 80 AI models on anonymized real combat data. This is no longer a pilot — this is infrastructure.
The Conflict That the NYT Did Not Bury in Subtext
The paper states it plainly: Fedorov's futuristic rhetoric has set off a struggle for influence within the military. Commander-in-Chief of the Armed Forces of Ukraine Oleksandr Syrskyi and some generals are skeptical about relying on autonomous systems, especially when human soldiers have been holding the front for three years now. This is not a public dispute, but the NYT identified it as a real rift in the army's strategic thinking.
Here lies the true nerve of the story: not technology, but the question of trust. Autonomous weapons that themselves make combat decisions represent a different level of delegation for which an army built on a command hierarchy may simply not be institutionally ready.
- A drone already replaces a 155mm howitzer — delivering shells to target without an artillery crew.
- An AI turret engages targets in semi-autonomous mode, with the operator reduced to supervision rather than direct control.
- The next step — full autonomy: the machine makes the targeting decision itself.
It is precisely on this third step that the conflict is focused. Not between Fedorov and generals as individuals — but between two models of warfare: industrial, where humans decide, and algorithmic, where humans only set the parameters.
If Ukraine is the first to scale fully autonomous combat systems, it gains an asymmetric advantage. But if the first high-profile malfunction or mistaken strike happens on the front line, it will be more than a tactical loss: it will be ammunition for those in the West already pushing for an international ban on autonomous weapons. Will Fedorov have time to prove the "mathematics" at scale before the first public incident turns the discussion on its head?