Introduction
Countering Unmanned Aircraft Systems (UAS) in international armed conflict is a difficult task, since it can be hard to determine the scope of a threat. There is no single threat scenario: the threats come in various shapes, formats, and applications. A major complication in countering UAS for defence purposes is caused by emerging technologies such as autonomous applications implemented in UAS.
An example of a UAS for military purposes that makes use of advanced autonomous applications is the ‘Harpy’, developed by Israel Aerospace Industries (IAI). The ‘Harpy’ is designed to attack targets by self-destructing into them or, if no target can be engaged during the mission, to return home. An undisclosed number of ‘Harpy’ systems were bought by China in 1994. China created its own system, the ASN-301, which appears to be a near copy of the ‘Harpy’; it was unveiled at a military parade in 2017. Examples of autonomous applications can be found in consumer products as well, such as active detection, tracking and following of persons and objects, or waypoint navigation with autonomous trajectory calculation and active obstacle avoidance based on the device’s sensor inputs.1 ‘Both categories, commercially available drones as well as military UAS, should be considered ‘autonomous’ in the way that they probably no longer require a permanent command and control link to fulfil their mission. This eliminates many of the current countermeasures which rely on jamming their radio transmissions.’2
Against the background of these already existing examples of advanced autonomous applications in UAS, we will focus on countering UAS that make use of autonomous applications in general, without delving into specific (advanced) autonomous capabilities. Our point of departure is Counter-UAS (C-UAS) systems with autonomous applications within the scope of military operations for defence purposes in armed conflicts. The developments in the autonomy of UAS imply that current countermeasures, such as jamming, will not be sufficient. Moreover, time can be a factor that limits the possibility to counter UAS adequately. It is very difficult to detect and determine at an early stage whether a UAS is being used as a weapon, and countering a UAS is a time-critical operation. It is possible that the time window to respond is too limited to leave the countering task solely to a human decision-maker, and it might therefore be necessary to implement autonomous applications in C-UAS operations as well.
In this chapter we investigate how the operational use of C-UAS systems relates to the Rule of Law and International Humanitarian Law (IHL) in general. Additionally, we will elaborate on the review process for new weapons, means or methods of warfare as described in article 36 of Additional Protocol I to the Geneva Conventions (AP I), with a particular focus on emerging technologies. We also offer some considerations with regard to the novelty of emerging technologies and how, and under which rules, regulations and procedures, new countermeasures can be shaped. We will especially reflect on how the review process for the study, development, acquisition or adoption of new means and methods of warfare under article 36 AP I might have to evolve with regard to C-UAS systems that make use of emerging technologies such as data- and code-driven applications. Our chapter is divided into three sections.
In the first section, we will give a short introduction to the relevant legal framework and a short explanation of the Rule of Law. We will provide a general outline of the legal basis and the legal regime of using countermeasures for defence in armed conflicts.
In the second section, we will discuss article 36 AP I and point out some difficulties raised by C-UAS systems that incorporate data- and/or code-driven applications, such as machine learning and artificial intelligence. We will also reflect on the question of whether data- and/or code-driven applications in C-UAS systems can be seen as new.
Our considerations culminate in the third section with reflections on the acquisition procedures, which lead to suggestions for how to shape emerging technologies such as data- and/or code-driven applications and how to provide safeguards in the acquisition procedures of C-UAS systems. We emphasise that it might be desirable to mandate preregistration of the research design as a requirement in these acquisition procedures, without disclosing classified information, in order to provide transparency. More specifically, we will argue that transparency is needed with regard to future claims on the safety, security and reliability of such applications, while transparency is also key in enabling the contestability of decisions based on such applications in terms of potential violations of fundamental rights.
Legal Framework and the Rule of Law
The purpose of a legal system is to have rules that bind all people living in a community.3 These rules are there to protect general safety and to ensure that the rights of citizens are protected against abuses by other people, organizations or governments. Two basic principles of any legal system are, first, to have a code of conduct that enables all relevant actors to know what their rights and obligations are, thereby enabling more predictable and efficient interaction in any particular sphere of activity; and second, to set out norms that are considered essential to protect basic shared values. To find out which countermeasures can be employed in situations of armed conflict, it is necessary to establish a legal basis and to determine the relevant legal regime. The legal basis answers the question of whether the use of force is legal; the legal regime regulates where, how and against whom force may be employed.4 The legality of C-UAS systems will not be discussed before a court if there is no legal basis for their operational use to begin with.
Legal Basis
Jus ad bellum refers to the conditions under which a state can legally resort to the use of force.5 The UN Charter states ‘All Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations’.6 This provision is considered binding upon all states in the world.7 However, there are three important exceptions to this provision: (1) individual or collective self-defence, as described in article 51 of the UN Charter,8 (2) a mandate of the UN Security Council, as described in articles 39 to 42 of the UN Charter,9 and (3) aid to the authorities of a state, in other words the use of force on the territory of a third state at the request of its government.10 When it has been determined that there is a legal basis for the use of force, the next step is to determine which rules must be obeyed during the deployment of C-UAS systems.
Legal Regime
In addition to jus ad bellum, international law also seeks to regulate the conduct of hostilities, also called jus in bello. This legal regime is primarily based on IHL.11 IHL limits the right of parties to an armed conflict to freely choose any means and methods of warfare.12 Certain principles apply in an armed conflict at all times and have as their purpose to strike a balance between humanity on the one hand and military necessity on the other.13 The International Court of Justice has described the principles embodying this balance as ‘cardinal’ principles of IHL; they include, inter alia, the basic norms of targeting.14 These basic norms of targeting set rules on the use of certain types of weapons. The main targeting principles in IHL are (1) distinction, (2) precautions and (3) proportionality.
Firstly, distinction: in order to protect the civilian population, the means and methods used should be capable of making a distinction, meaning that no civilians or civilian objects may be attacked, but only combatants and military objectives.15 As a consequence, a state is never allowed to use means and methods that are incapable of distinguishing between civilian and military targets.16
Secondly, precautions in attack: all feasible measures should be taken to avoid, or at least minimize, incidental damage or injury to civilians resulting from an attack on a military objective. Related to this is the prohibition of means and methods of warfare that are of a nature to cause superfluous injury or unnecessary suffering.17
Lastly, proportionality: when attacking a military objective in situations where civilians and civilian objects are likely to be affected, the attack may not be carried out if the estimated collateral effects would be excessive in relation to the anticipated military advantage resulting from the attack. In 1996 the International Court of Justice addressed the issue of the IHL principles in the context of the use of weapons and confirmed that these principles of IHL apply ‘[…] to all forms of warfare and to all kinds of weapons, those of the past, those of the present and those of the future’.18
Although the relevant legal framework for C-UAS systems, the legal basis and the legal regime, is quite clear, we question which problems might arise in acquisition procedures with regard to data- and/or code-driven applications. As stated in the introduction, our point of departure is the assumption that C-UAS systems are likely to need to incorporate these emerging technologies, since current countermeasures such as jamming will not be sufficient in some situations. ‘The fact that a weapon system did not exist at the time a particular treaty rule of IHL came into force or customary international law norm crystallized into binding law does not preclude application of the rules.’19 To make sure that future means or methods of warfare adhere to the rules of IHL, it is necessary to have a meaningful review of these new means and methods.
Article 36 Additional Protocol (I) to the Geneva Conventions
The question of whether data- and/or code-driven applications can be seen as ‘new’ depends on the role these applications play in C-UAS systems. This involves not just an understanding of the technology itself, but also of the military use of that technology.20 If the application solely collects data, without altering the nature or content of the data, and does not further use that data, then it would not be considered as falling within the scope of means or methods of warfare. However, if the application forms an integral part of the targeting decision process, it becomes part of the means or methods of warfare.21
An existing legal instrument to review the legality of new weapons and weapon systems used in armed conflict can be found in article 36 of the first Additional Protocol to the Geneva Conventions (AP I).22 The article states that ‘In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.’23 This article is an example of a provision of IHL that is also relevant and applicable in peacetime, because the process from the development to the employment of new weapons does not usually take place during the armed conflict itself. New weapons, means and methods of warfare should be reviewed against the IHL targeting rules before being employed, to ensure that they are capable of meeting those rules when used during armed conflict. IHL is the only legal regime that has an instrument to review new weapons; there is, for example, no comparable obligation under human rights law, the legal regime that would be applicable in a peacetime situation.
Article 36 binds all parties to AP I, who are under an unequivocal duty to conduct reviews of weapons that are to be employed in an armed conflict. Moreover, article 36 reflects customary law, as it is argued that the obligation to legally review new weapons, described in article 36 AP I, is only a codification of a pre-existing customary obligation.24 This entails that it is a binding obligation on (nearly) all NATO members. Furthermore, there is no legal distinction in this respect between weapons used in international armed conflict, a conflict between two or more opposing states, and those used in non-international armed conflict, a conflict between governmental forces and non-governmental armed groups, or between such groups only.25 In this regard, the International Criminal Tribunal for the former Yugoslavia rightly determined that ‘What is inhumane, and consequently proscribed, in international wars, cannot but be inhumane and inadmissible in civil strife’.26
Unmanned combat aerial systems, for example the Predator and the Reaper, are already a reality and can be qualified as systems that are operated remotely.27 Given the current state of technology and the nature of the threat posed by UAS, it is to be expected that some form of autonomy in C-UAS systems is necessary to effectively eliminate that threat for defence purposes. The rules of IHL apply fully to such a C-UAS system, although the means of complying with the applicable rules may differ as a result of the fact that the operator is not co-located with the system. This implies that the operator must be able to verify to a reasonable level of certainty that the target is a military objective, especially when relying upon information provided by onboard sensors.28 Moreover, the fact that a weapon is unmanned does not relieve those who plan, decide on and execute attacks from the obligation to fully consider collateral damage in assessing the proportionality of an attack.29
‘The transformative potential of autonomy derives, first and foremost, from the fact that autonomy can help the military to overcome a number of operational and economic challenges associated with manned operations.’30 For instance, it may help overcome the problem of the short time window within which a UAS threat must be eliminated. Moreover, autonomous applications change the means and methods of warfare in terms of greater speed, agility, accuracy, persistence, reach, coordination and mass.31 It is self-evident that the same targeting principles apply to C-UAS systems that make use of autonomous applications. Such a system must be capable of distinguishing between civilians and combatants, and between civilian objects and military objectives.32 It might be argued that new technologies would make it easier to make this distinction, since they can be programmed to identify military targets. However, at the moment this is only possible in a limited number of circumstances; for instance, when the intended target is unmistakably military and the operational environment is predictable.33 An autonomous C-UAS system should also be able to make some kind of qualitative assessment to determine whether an attack is proportionate or excessive in relation to the military advantage anticipated.34
It is questionable whether an autonomous system will be capable of assessing military advantage, since this often requires an assessment of the broader context, rather than only of the physical engagement of a target. Military advantage can extend beyond the technical level to the tactical or even the strategic level. A well-known example of an existing counter-weapons system that uses autonomous applications in order to provide military advantage is the ‘Goalkeeper’ of the Royal Netherlands Navy. Collateral damage on the open sea is negligible, since a ship equipped with a Goalkeeper operates in an environment with few, if any, civilian objects. Moreover, defending against such an attack is a matter of life and death, since the Goalkeeper is a weapon of last resort.
‘Goalkeeper is an autonomous and completely automatic weapon system for short-range defence of ships against highly manoeuvrable missiles, aircraft and fast manoeuvring surface vessels. The system automatically performs the entire process from surveillance and detection to destruction, including selection of the next priority target.’35
However, in the (onshore) world with many civilians and civilian objects, several problems arise: (1) the C-UAS system has to deal with a highly unpredictable and complex environment, in which many factors have to be taken into account; (2) the data- and/or code-driven applications in the C-UAS system must therefore be able to collect and interpret sufficient information in order to classify threats and make a distinction between civilian and military personnel and objects; and (3) even if a reliable distinction can be made, the algorithms must also be able to assess (the probability of) collateral damage resulting from engaging the threat, and whether this collateral damage would be excessive in relation to the defensive objective.
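To make these three problems more concrete, the sketch below illustrates, purely schematically and under our own assumptions, how such a layered decision logic might be structured in software, with a conservative default of referring uncertain cases to a human decision-maker. All names (TrackAssessment, counter_uas_decision), the thresholds and the idea of reducing collateral effects to a single numeric score are hypothetical simplifications of our own; the proportionality assessment required by IHL is a qualitative, context-dependent judgement that no such score can replace.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    ENGAGE = auto()
    DO_NOT_ENGAGE = auto()
    REFER_TO_HUMAN = auto()   # conservative default when the system is uncertain


@dataclass
class TrackAssessment:
    """Hypothetical sensor-fusion output for a single detected aerial track."""
    is_hostile_uas: bool              # (1) threat classification from sensor data
    classification_confidence: float  # 0.0 .. 1.0
    civilian_presence_nearby: bool    # (2) distinction-relevant context
    estimated_collateral_risk: float  # (3) simplified stand-in for a collateral damage estimate


def counter_uas_decision(track: TrackAssessment,
                         min_confidence: float = 0.95,
                         max_collateral_risk: float = 0.1) -> Decision:
    """Schematic gating logic mirroring the three problems named in the text.
    Thresholds are illustrative placeholders, not legal standards."""
    # (1) Threat classification: if the track cannot be classified as a hostile
    #     UAS with sufficient confidence, the system must not decide on its own.
    if not track.is_hostile_uas:
        return Decision.DO_NOT_ENGAGE
    if track.classification_confidence < min_confidence:
        return Decision.REFER_TO_HUMAN

    # (2) Distinction: civilian presence changes the nature of the assessment and
    #     is treated here as a reason to escalate rather than to engage automatically.
    if track.civilian_presence_nearby:
        return Decision.REFER_TO_HUMAN

    # (3) Collateral effects: even for a confirmed threat, expected incidental harm
    #     must not be excessive in relation to the defensive objective.
    if track.estimated_collateral_risk > max_collateral_risk:
        return Decision.DO_NOT_ENGAGE

    return Decision.ENGAGE


if __name__ == "__main__":
    track = TrackAssessment(is_hostile_uas=True,
                            classification_confidence=0.97,
                            civilian_presence_nearby=True,
                            estimated_collateral_risk=0.02)
    print(counter_uas_decision(track))  # Decision.REFER_TO_HUMAN
```

The point of the sketch is not the specific thresholds but the ordering of the checks and the fact that uncertainty about classification, or the presence of civilians, leads away from autonomous engagement.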
How to Provide Safeguards in the Acquisition Procedures of C-UAS Systems
‘Before examining the manner in which legal reviews can be conducted it is first important to understand the process by which weapons are acquired. If the legal review of a new weapon is to have any impact on the acquisition process of that weapon, then it must not only be cognizant of the process of acquiring it, but also be a part of that process. The acquisition process is complex but can be broken down generically into several distinct phases: […] a. concept […], b. assessment […], c. demonstration […], d. manufacture […] and e. in-service […].’36
In this section we will focus on the first two phases of the acquisition process, namely the ‘concept’ phase and the ‘assessment’ phase. This process is not officially guided or determined by a legal process. Nevertheless, the system to be acquired needs to comply with IHL, and we would therefore like to emphasise that in the case of emerging technologies such as data- and/or code-driven applications, the legal normative impact might be severe. With this in mind, it is desirable to open a dialogue with legal experts as early as possible in the acquisition process. As soon as functional design decisions are made without such a dialogue, reviewing the legal normative impact afterwards can become difficult, since the design choices can result in a ‘black box’. As a consequence, the operational use of these applications within C-UAS systems can become problematic, since their normative impact within the juridical landscape can no longer be fully assessed.
‘a. Concept: The military will first have to assess what the ‘capability gap’ is that they wish to fill, i.e. what it is that the military wants the new system to do that its current equipment does not allow it to do. Thereafter a concept for the weapon, weapons system, platform or equipment will be developed. The acquisition process will deal with the whole spectrum of equipment to be acquired for military use, from beds to sophisticated weaponry.’37
We would like to point out that such a capability gap exists: current C-UAS equipment, such as jamming, is not sufficient to counter the threat of autonomous UAS, and there is a pressing need for better defence to adequately counter these systems. The problem that needs to be solved is closing this capability gap, and the related problems of distinction, proportionality and transparency need to be addressed in the acquisition process.
‘b. Assessment: After the concept has been developed, it is further refined and its characteristics delineated. If the equipment being acquired is being purchased ‘off the shelf’, it may be possible to seek data on its performance from the manufacturer.’38
We would like to point out that if data- and/or code-driven applications are procurable as ‘off the shelf’ acquisitions, it may be desirable to include a specific review process in the pre-conceptual phase for the (research) design of data- and/or code-driven applications that become part of a C-UAS system, in order to be able to justify the countermeasures used for defence in military operations.
Transparency of the Assessment
C-UAS systems equipped with data- and/or code-driven applications might introduce new risks regarding decision support, due to decreased transparency on how the data- and/or code-driven applications for decision support actually function. It is desirable that the assessment takes place early in the development of the concept and not at the stage when the concept is fully developed. Monitoring the research design during the conceptual development of data- and/or code-driven applications is key to eventually being able to adopt these applications in C-UAS systems in a responsible way. Without knowing how the analyses actually function, these applications can become a black box and cannot be used responsibly during military operations. Hence, the transparency of the data- and/or code-driven applications must be guaranteed.
The research design contains the framework of research methods and techniques chosen by a researcher. Several requirements can be imposed on the research design process in order to provide transparency and, in the case of data- and/or code-driven applications, to determine the performance of a system.39 The goal of these requirements is to ensure that such studies remain transparent and to guarantee their quality. The imposed requirements make it easier to identify any research results that were found by chance. In addition, these requirements enhance the replicability, and falsifiability, of the research design.40 Whoever procures or acquires a data- and/or code-driven application for a C-UAS system should require in the acquisition procedure that the research design is preregistered, preferably with a highly trusted party. This preregistration includes the subsequent updates that were used to develop the application. This will contribute to the contestability of claims regarding the safety, security and reliability of such applications, while also enabling the contestability of decisions based on such applications in terms of potential violations of fundamental rights.41
In terms of rules and regulations, there is currently no explicit legal requirement for the preregistration of research designs in the acquisition procedures of systems with data- and/or code-driven applications for military purposes. We would argue that such a requirement can also serve as a good example of how the review process under article 36 AP I could evolve once data- and code-driven applications are taken into account, and of how these new means or methods should be designed. In the acquisition of these applications, officials should be able to review the research design, preferably via a highly trusted party, from the very first moment in the development stage to the final implementation of the application.
The preregistration of research designs would create openness regarding the processes involved in developing a data- and/or code-driven application’s capabilities. This approach makes the details of an algorithm’s development history and performance clear for review during the acquisition procedure. Preregistration of the research design is required to prevent opaque data- and code-driven applications. It is important that the choice of data, the types of error it may contain and the way it has been curated are explained in advance. The points that need to be clarified during preregistration of the research design include:42
- ‘The type of datasets used;
- The relationship between training data and validation data;
- How frequently testing took place, and what kind of sample data were used for this purpose;
- How the hypothesis (about how the machine can learn most effectively) was developed;
- All pre-processing choices – in addition to the choice of data, this concerns the way data are cleaned up, how they are labelled, and the range of potential labels (or alternative labels);
- The types of algorithms used, or the use of which is planned.’43
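To illustrate how such a preregistration might be captured in a reviewable, machine-readable form, the sketch below defines a hypothetical record whose fields mirror the points listed above. The field names, the example values and the idea of hashing the record so that a highly trusted party can timestamp it are our own assumptions, not an existing standard or prescribed procedure.

```python
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import date
from typing import List


@dataclass
class PreregistrationRecord:
    """Hypothetical preregistration entry mirroring the points listed above."""
    dataset_types: List[str]          # the type of datasets used
    train_validation_split: str       # relationship between training and validation data
    testing_schedule: str             # how frequently testing took place, and on what samples
    hypothesis_rationale: str         # how the learning hypothesis was developed
    preprocessing_choices: List[str]  # cleaning, labelling, range of potential labels
    algorithm_types: List[str]        # algorithms used, or whose use is planned
    registered_on: date = field(default_factory=date.today)

    def fingerprint(self) -> str:
        """Content hash that a trusted party could timestamp, so later updates to the
        research design remain traceable against the registered version."""
        payload = json.dumps(asdict(self), default=str, sort_keys=True)
        return hashlib.sha256(payload.encode("utf-8")).hexdigest()


# Illustrative placeholder values only; they do not describe any real system.
record = PreregistrationRecord(
    dataset_types=["simulated radar tracks", "labelled EO/IR imagery"],
    train_validation_split="80/20 split, stratified by scenario; no overlap in source sorties",
    testing_schedule="weekly evaluation on a held-out operational sample",
    hypothesis_rationale="documented in design memo v1.2",
    preprocessing_choices=["sensor noise filtering", "label set: {UAS, bird, aircraft, unknown}"],
    algorithm_types=["convolutional neural network (planned)", "gradient-boosted trees"],
)
print(record.fingerprint())
```

A content fingerprint of this kind is one possible way to keep subsequent updates to the research design traceable against the originally registered version, in line with the requirement that preregistration also cover those updates.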
Who Should Review the Research Design?
When autonomous applications are incorporated in C-UAS systems and used in military operations, these systems become part of the juridical landscape. The developments in autonomy can be traced back to applications that are data- and/or code-driven, such as machine learning and artificial intelligence. These forms of emerging technologies are becoming more and more a part of our daily lives. These developments therefore require us to take auxiliary precautions in order to protect what is at stake: to prevent implementing opaque data- and/or code-driven applications in C-UAS systems. It is desirable to include the standard of a research design in the review process of data- and/or code-driven applications, and it is recommended to appoint reviewers who monitor the development of the research design from the very first moment of the acquisition procedure. Since transparency and classification of information are at odds with each other, these reviewers need to be highly trusted parties who are allowed to work at the appropriate level of classification.
‘For security reasons, such a thorough process necessarily precludes its being completely transparent. Nonetheless, for those countries conducting it, its impact is keenly felt and this must be considered a measure of its success. The answer to the need to widen implementation of the legal review process is not the creation of an international agency to conduct or monitor such reviews, but the strict adherence of States to the obligation imposed under Article 36.’44
It is desirable that legal experts are involved in the different phases of acquisition, even though this is not a legal process in itself. Since decisions will be taken throughout the acquisition process on the basis of military requirements and commercial prudence,45 it is highly recommended that the scientists who are involved in shaping these emerging technologies and the officials who lead the acquisition procedures maintain a sustained dialogue, guided by legal experts, in order to provide insights and theoretical reflection on the legal normative impact of these emerging technological applications.
The review of the research design can be carried out by a highly trusted party, and a committee should be appointed with experts from different fields. Legal experts in particular need to be involved in this committee, to ensure that the legal issues are addressed properly.
Conclusion
For operational efficacy, it might be necessary to deploy technologically innovative C-UAS systems that tend to rely on data- and code-driven applications. These applications come with the presumption, or promise, of enhancing capabilities, efficiency and accuracy in countering UAS attacks.46 In this chapter we examined technological developments to counter UAS within the scope of the rule of law, and C-UAS systems in particular, via a dual analysis. This dual analysis involved a delicate balancing act: on the one hand, we need, for defensive purposes, protection against the threat of UAS in armed conflicts; on the other hand, we need to counter UAS in such a way that the chosen means or methods comply with the existing legal framework and safeguard the values and rules of IHL. This delicate balancing act entails that it is necessary to take auxiliary precautions regarding emerging technologies in the acquisition process.
Legal aspects such as transparency and contestability within the juridical landscape are important to take into account when these applications are used in C-UAS systems. Several requirements need to play a role during the design phase and in the interrelated design choices, and these should be connected to the acquisition procedures of C-UAS systems.
Requirements on the design of applications are needed in order to provide safeguards that endorse legal norms and values, such as the contestability of data- and/or code-driven applications in a court of law. Requirements can include, for example, the obligation that the scientists who are involved in shaping these emerging technologies and the officials who lead the acquisition procedures maintain a sustained dialogue, in order to provide insights and theoretical reflection on the legal normative impact of these emerging technological applications. It is also desirable that legal experts are consulted at an early stage of the acquisition, and that a highly trusted party reviews the applications independently, since in-depth reflection on these applications and their presumed accuracy is needed.