Copeland, Damian (2023-11-08)
http://hdl.handle.net/1885/305643

Autonomous weapon systems ('AWS') are no longer limited to science fiction. In March 2021, the final report of the United Nations Panel of Experts on Libya included a reference to the use of a Turkish-made STM Kargu-2 autonomous drone by the Libyan Government of National Accord Affiliated Forces to attack the logistics convoys of the Haftar Affiliated Forces. The report states that the AWS 'were programmed to attack targets without requiring data connectivity between the operator and the munition'. This was the first recorded use of an AWS in armed conflict, and it occurred against the background of a burgeoning AI arms race. AWS, and the prospect of algorithms making life and death decisions in armed conflict, raise complex legal, ethical, technical, moral and operational challenges for the states and private industry developing them for use in armed conflict.

Since 2014, states party to the Convention on Certain Conventional Weapons ('CCW') and civil society have debated in the United Nations the need to regulate AWS (referred to in that forum as lethal autonomous weapon systems or 'LAWS'). Elements of civil society seek a prospective ban on LAWS through a new Protocol under the CCW. However, despite extensive debate, the forum has yet to achieve consensus on the need for new international law either prohibiting or restricting the use of AWS in armed conflict.

In the absence of new international law regulating AWS, an important question is how states can ensure that their development and use of AWS is consistent with their existing international law obligations. For states party to the First Additional Protocol to the Geneva Conventions of 1949 ('AP 1'), art 36 requires:

    In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.

The central question addressed by this thesis is: how can states party to AP 1 fulfil their art 36 weapons review obligation to determine the legality of AWS? This is a challenging question. While there is academic discourse and evidence of state practice on the weapons review of conventional, human-operated weapons, there is little in relation to AWS. Many states have called for the voluntary exchange of best practice for the weapons review of AWS; however, no state has publicly disclosed its national approach to the weapons review of AWS. This thesis seeks to contribute to the knowledge of the art 36 weapons review obligation (referred to as 'weapons reviews') as it applies to AWS.

This thesis concludes that the traditional weapons review process is insufficient to determine the legality of AWS, because the traditional method does not consider in detail the international humanitarian law ('IHL') rules regulating the use of force in armed conflict. To determine the legality of AWS, this thesis proposes that states expand their domestic weapons review processes in two ways. First, a new 'functional review' step is proposed in addition to the existing traditional weapons review steps, to permit analysis of the IHL targeting law that regulates the AWS' functions. Second, an expanded weapons review process is proposed that applies across the AWS lifecycle, from its inception through acquisition and during its in-service life.
This expanded process has three stages: an informative stage, designed to inform those responsible for the initial design and development of the AWS of the state's international law obligations; a determinative stage (including the traditional weapons review), to determine the legality of the AWS prior to its use in armed conflict; and, finally, a governance stage, to monitor the ongoing legality of the AWS during its use in armed conflict.

Title: The Article 36 legal review of weapons enhanced by AI (autonomous weapons)
Year: 2023
Language: en-AU
DOI: 10.25911/1Y6F-7W97