Tech-No-Moral Warfare: Fully Autonomous Weapons And The Law As It Should Be

Millie Hornby
vLex News and Updates
Apr 8, 2022

--

This is a notable entry from one of the runners-up in vLex’s International Law and Technology Writing Competition 2022. Brian Collins Ocen of Makerere University, Uganda, won the Future category of the competition with the article below: Tech-No-Moral Warfare: Fully Autonomous Weapons And The Law As It Should Be.

It seems increasingly likely that AI-powered Fully Autonomous Lethal (FAL) weapons will become an essential and possibly irreversible feature of the military apparatus of modern states. But as warfare is re-shaped by technology, legal experts and ethicists are searching for answers as to whether there is an antinomy between the use of such weapons and the laws of war.

While some take the view that apprehension about FAL weapons is mere technophobia, others maintain that these weapons present the disconcerting possibility of a perilous future of wars fought with technology, but minus morality. Nonetheless, these contrasting claims point to a common imperative: to explore how the rules of war relating to weapons can be improved to adequately cover the risks that such weapons present.

From the outset, the main challenge that FAL weapons present to the current set of rules is that they do away with the traditional dichotomy on which those rules are built: the distinction between the inherent characteristics of a weapon and the manner in which the weapon is used. In other words, the rules are not designed to regulate lethal weapons that are fully autonomous.

This distinction between the inherent characteristics of a weapon and the manner in which it is used means that there is a first limb under which a weapon is characterized as lawful per se if it satisfies the rules that govern its inherent characteristics. Those rules require that a weapon must be neither indiscriminate nor one that causes unnecessary suffering.

There is, however, a second limb which requires that a weapon that is lawful per se must also be used in a lawful manner. This second limb consists of rules which do not govern the inherent characteristics of a weapon but rather govern the decision-making processes of the persons using it. The rules under this limb assess whether those decision-making processes accord with the principles of international humanitarian law (IHL): proportionality, precaution, humanity and military necessity.

A good illustration of this distinction at play is the use of certain free-fall bombs. Although some free-fall bombs are lawful per se, since they are neither indiscriminate in nature nor weapons that cause unnecessary suffering, the use of such bombs in an area densely populated with civilians would be considered the use of a lawful weapon in an unlawful manner, since it would lead to collateral loss of civilian life contrary to the principles of proportionality and precaution in attack.

Given that FAL weapons do away with the distinction between a weapon’s inherent characteristics and the manner in which it is used, we must consider requiring the test of legal scrutiny for fully autonomous lethal weapons to result in a convergence between the rules that govern the weapon itself and those that govern the manner in which the weapon is used.

Practically, this would mean that an assessment of whether these weapons are inherently lawful would include not only the prohibitions on indiscriminate weapons and weapons that cause unnecessary suffering, but also an assessment of whether such weapons inherently have the ability to navigate the other rules of IHL: proportionality, precaution, humanity and military necessity.

This convergence would ensure that these weapons have the inherent ability to carry out functions previously required of humans in decision-making processes. Undoubtedly this would create a higher threshold for a weapon to be deemed lawful per se, but such a threshold is necessary, given the unique status of full autonomy that these weapons possess and the need to make FAL weapon systems as safe as possible.

But that’s not all. Vitally, this test of legal scrutiny would also demand a normative ethical framework beyond that required of fully autonomous lethal weapons under the current law. It would require machine learning capabilities that enable FAL weapons to make decisions beyond just the deontological and consequentialist ethical approaches to which they are impliedly restricted now.

Those normative ethical approaches may be sufficient when a weapon’s artificial intelligence is only required to distinguish between civilians and combatants, but they are insufficient under the unpredictable vicissitudes of warfare, which often present moral dilemmas that will require FAL weapons to make judgment calls applying the principles of proportionality, precaution, humanity and military necessity.

The application of those principles of proportionality, precaution, humanity and military necessity to resolve the moral dilemmas that arise in different circumstances of warfare impliedly requires human combatants to be able to make decisions based on the full spectrum of normative ethics, which includes virtue ethics.

Thus we must conclude that technological advances that enable FAL weapons to do more than they can right now, and to operate on the full spectrum of normative ethics, must be a condition precedent to the lawful development of such weapons. Because after all, to whom much is given, much must be required.
