New Report: Don’t Be Evil – A Survey of the Tech Sector’s Stance on Lethal Autonomous Weapons

As we move towards a more automated world, tech companies are increasingly faced with decisions about how they want — and don’t want — their products to be used. Perhaps most critically, the sector is negotiating its relationship to the military, and to the development of lethal autonomous weapons in particular. Some companies, including industry leaders like Google, have committed to abstaining from building weapons technologies; others have wholeheartedly embraced military collaboration.

In a new report titled “Don’t Be Evil,” Dutch advocacy group Pax evaluated the involvement of 50 leading tech companies in the development of military technology. The group sent out a survey asking companies about their current activities and their policies on autonomous weapons, and used each company’s responses to categorize it as “best practice,” “medium concern,” or “high concern.” Categorizations were based on three criteria:

  • Is the company developing technology that could be relevant in the context of lethal autonomous weapons?
  • Does the company work on relevant military projects?
  • Has the company committed to not contribute to the development of lethal autonomous weapons? 

“Best practice” companies are those with explicit policies ensuring their technology will not be used for lethal autonomous weapons. Companies categorized as “medium concern” are those currently working on military applications of relevant technology but that responded that they are not working on autonomous weapons, or those not known to be working on military applications of technology but that did not respond to the survey. “High concern” companies are those working on military applications of relevant technology that did not respond to the survey.
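The categorization above reduces to three yes/no questions. Here is a minimal sketch of those decision rules in code; the function and parameter names are ours for illustration, not from the report:

```python
def concern_level(explicit_policy: bool,
                  military_projects: bool,
                  responded_to_survey: bool) -> str:
    """Approximate Pax's categorization rules (illustrative only).

    explicit_policy     -- has a policy ensuring its technology will not
                           be used for lethal autonomous weapons
    military_projects   -- works on military applications of relevant tech
    responded_to_survey -- answered Pax's survey
    """
    if explicit_policy:
        return "best practice"
    if military_projects and not responded_to_survey:
        return "high concern"
    # Military work plus a denial of autonomous-weapons work, or no known
    # military work plus no survey response, both land in the middle tier.
    return "medium concern"

# e.g. a military contractor that ignored the survey:
print(concern_level(False, True, False))  # high concern
```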

The report makes several recommendations for how companies can prevent their products from contributing to the development of lethal autonomous weapons. It suggests that companies make a public commitment not to contribute; that they establish clear company policies reiterating such a commitment and providing concrete implementation measures; and that they inform employees about the work they are doing and allow open discussion around any concerns. 

Pax identifies six sectors considered relevant to autonomous weapons: big tech, AI software and system integration, autonomous (swarming) aerial systems, hardware, pattern recognition, and ground robots. The report is organized into these categories, and then subdivided further by country and product. We’ve instead listed the companies in alphabetical order. Find basic information about all 50 companies in the chart, and read more about a select group below.

Company | HQ | Relevant Technology | Relevant Military/Security Projects | Concern Level
AerialX | Canada | Counter-drone systems | DroneBullet | High
Airobotics | Israel | Autonomous drones | Border security patrol bots | Medium
Airspace Systems | US | Counter-drone systems | Airspace Interceptor | High
Alibaba | China | AI chips; facial recognition | | Medium
Amazon | US | Cloud; drones; facial and speech recognition | JEDI; Rekognition | High
Anduril Industries | US | AI platforms | Project Maven; Lattice | High
Animal Dynamics | UK | Autonomous drones | Skeeter | Best practice
Apple | US | Computers; facial and speech recognition | | Medium
Arbe Robotics | Israel | Autonomous vehicles | | Best practice
ATOS | France | AI architecture; cyber security; data management | | Medium
Baidu | China | Deep learning; pattern recognition | | Medium
Blue Bear Systems | UK | Unmanned maritime and aerial systems | Project Mosquito/LANCA | High
Cambricon | China | AI chips | | Medium
Citadel Defense | US | Counter-drone systems | Titan | High
Clarifai | US | Facial recognition | Project Maven | High
Cloudwalk Technology | China | Facial recognition | | Medium
Corenova Technologies | US | Autonomous swarming systems | HiveDefense; OFFSET | High
DeepGlint | China | Facial recognition | | Medium
Dibotics | France | Autonomous navigation; drones | ‘Generate’ | Medium
EarthCube | France | Machine learning | ‘Algorithmic warfare tools of the future’ | High
Facebook | US | Social media; pattern recognition; virtual reality | | Medium
General Robotics | Israel | Ground robots | Dogo | Best practice
Google | US | AI architecture; social media; facial recognition | | Best practice
Heron Systems | US | AI software; machine learning; drone applications | ‘Solutions to support tomorrow’s military aircraft’ | High
Hivemapper | US | Pattern recognition; mapping | Hivemapper app | Best practice
IBM | US | AI chips; cloud; supercomputers; facial recognition | Nuclear testing supercomputers; ex-JEDI | Medium
Innoviz | Israel | Autonomous vehicles | | Medium
Intel | US | AI chips; UAS | DARPA HIVE | High
Megvii | China | Facial recognition | | Medium
Microsoft | US | Cloud; facial recognition | HoloLens; JEDI | High
Montvieux | UK | Data analysis; deep learning | ‘Revolutionize human information relationship for defence’ | High
Naver | S. Korea | ‘Ambient intelligence’; autonomous robots; machine vision systems | | Medium
Neurala | US | Deep learning neural network software | Target identification software for military drones | Medium
Oracle | US | Cloud; AI infrastructure; big data | Ex-JEDI | High
Orbital Insight | US | Geospatial analytics | | Medium
Palantir | US | Data analytics | DCGS-A | High
Percepto | Israel | Autonomous drones | | Medium
Roboteam | Israel | Unmanned systems; AI software | Semi-autonomous military UGVs | High
Samsung | S. Korea | Computers and AI platforms | | Medium
SenseTime | China | Computer vision; deep learning | SenseFace; SenseTotem for police use | High
Shield AI | US | Autonomous (swarming) drones | Nova | High
Siemens | Germany | AI; automation | KRNS; TRADES | Medium
SoftBank | Japan | Telecom; robotics | | Best practice
SparkCognition | US | AI systems; swarm technology | ‘Works across defense and national security space in the U.S.’ | High
Synesis | Belarus | AI- and cloud-based applications; pattern recognition | Kipod | High
Taiwan Semiconductor | Taiwan | AI chips | | Medium
Tencent | China | AI applications; cloud; ML; pattern recognition | | Medium
Tharsus | UK | Robotics | | Medium
VisionLabs | Russia | Visual recognition | | Best practice
Yitu | China | Facial recognition | Police use | High


AerialX

  • Developing the DroneBullet, a kamikaze drone that can autonomously identify, track, and attack a target drone
  • Working to modify DroneBullet “for a warhead-equipped loitering munition system”

Airobotics

  • In response to survey, stated that its “drone system has nothing to do with weapons and related industries”
  • Has clear links to military and security business; in 2017, announced a Homeland Security and Defense division and an emergency services initiative
  • Involved in border security, particularly along the US-Mexico border, where it provides patrol bots
  • Co-founder has stated that the company will not add weapons to its drones

Airspace Systems

  • Utilizes AI and advanced robotics for airspace security solutions, including “long-range detection, instant identification, and autonomous mitigation—capture and safe removal of unauthorized or malicious drones”
  • Developed Airspace Interceptor, a fully autonomous system that can capture target drones, in collaboration with US Department of Defense

Alibaba

  • China’s largest online shopping company
  • Recently invested in seven research labs that will focus on areas including AI, machine learning, network security, and natural language processing
  • Established a semiconductor subsidiary, Pingtouge, in September 2018
  • Major investor in tech sector, including in Megvii and SenseTime 

Amazon

  • Likely winner of JEDI contract, a US military project that will serve as universal data infrastructure linking Pentagon and soldiers in the field
  • Developed Rekognition program, used by police; testing by ACLU revealed that nearly 40 percent of false matches involved people of color
  • CEO has stated, “If big tech companies are going to turn their back on the U.S. Department of Defense, this country is going to be in trouble.” 
  • Has faced backlash since its partnerships with government agencies, including ICE, were exposed
  • Has since proposed guidelines for responsible use of tech

Anduril Industries

  • Co-founded by a former intelligence official
  • Has vocally supported stronger ties between tech sector and Pentagon: “AI has paradigm-shifting potential to be a force-multiplier […] it will provide better outcomes faster, a recipe for success in combat.”
  • Involved in Project Maven
  • Has offered support for the Pentagon’s newly formed Joint Artificial Intelligence Center
  • Developed the Lattice, an autonomous system that provides soldiers with a view of the front line and can be used to identify targets and direct unmanned vehicles into combat; has been used to catch border crossers
  • Co-founder has stated that Anduril is “deployed at several military bases. We’re deployed in multiple spots along the U.S. border […] We’re deployed around some other infrastructure I can’t talk about.”

Animal Dynamics

  • Spin-off company originating in Oxford University’s Zoology Department
  • Develops unmanned aerial vehicles
  • Stork, a paraglider with autonomous guidance and navigation, has received interest from both military and humanitarian aid/disaster relief organizations
  • Skeeter, “disruptive drone technology,” was developed with funding from UK government’s Defense Science and Technology Laboratory
  • In March 2019, took over software developer Accelerated Dynamics, which has developed ADx autonomous flight-control software
  • Use of ADx with Skeeter allows it to be operated in a swarm configuration, which has military applications
  • In response to survey, CEO stated that “we will not weaponize or provide ‘kinetic’ functionality to the products we make,” and that “legislating against harmful uses for autonomy is an urgent and necessary matter for government and the legislative framework to come to terms with.”

Arbe Robotics

  • Began in military and homeland security sectors but has moved to cars
  • In response to survey, stated that it “will sign agreements with customers that would confirm that they are not using our technology for military use.”

Baidu

  • Largest provider of Chinese-language Internet search services
  • Heavily committed to artificial intelligence and machine learning; exploring applications for facial recognition technology
  • Opened a Silicon Valley AI research lab in 2013, where it has been investing heavily in AI applications; has scaled back this research since the start of the US-China trade war
  • In charge of China’s Engineering Laboratory for Deep Learning Technologies, established March 2017
  • Will contribute to National Engineering Laboratory for Brain-Inspired Intelligence Technology and Applications

Blue Bear Systems

  • Research company involved in all aspects of unmanned systems and autonomy, including big data, AI, electronic warfare, and swarming systems
  • In March 2019, consortium it headed was awarded UK Ministry of Defense contract worth GBP 2.5 million to develop drone swarm technology

Citadel Defense

  • “Protects soldiers from drone attacks and surveillance in enemy combat” and “creates a force multiplier for Warfighters that enables them to get more done with the same or fewer resources”
  • Contracted by US Air Force to provide systems that can defeat weaponized drones and swarms
  • Developed autonomous counter-drone system called Titan

Corenova Technologies

  •  Offers “military-grade solutions to secure autonomous operations,” according to website
  • Developed HiveDefense, “an evolving swarm of self-learning bots”
  • Works with DARPA on OFFSET, facilitating unmanned missions without human control

Dibotics

  • Works on autonomous navigation
  • Supported by Generate, a program for French defense start-ups
  • Founder/CEO signed the Future of Life Institute’s (FLI) 2017 open letter to the UN

EarthCube

  • “Developing monitoring solutions based on an automated analysis of geospatial information”
  • Has been described as “conceiving of the algorithmic warfare tools of the future.”
  • CEO has stated, “With the emergence of new sensors—whether they are satellite, UAV or plane—we have seen here a great opportunity to close the gap between AI in the lab and Activity Based Intelligence (ABI) in the field.”

General Robotics

  • Robotics company focused on defense and security
  • Founder previously worked in Israeli defense ministry’s R & D authority
  • Supplies “advanced robotics systems to counter-terrorist units worldwide,” many of which are designed for “urban warfare”
  • Developed Dogo, said to be “the world’s first inherently armed tactical combat robot,” but controlled remotely rather than autonomously
  • In response to survey, CEO stated that “our position is not to allow lethal autonomous weapons without human supervision and human final active decision […] In general, our systems are designed to provide real-time high quality information and to present it to a trained human operator in an intuitive manner; this insures better decision making by the human and thereby better results with less casualties.” 

Heron Systems

  • Provides “leading-edge solutions for national security customers”
  • States that its mission is “to strengthen America’s defense by providing innovative laboratory testing and simulation solutions”

Hivemapper

  • Software provides mapping, visualization, and analytic tools; uses video footage to generate instant detailed 3-D maps and detect changes; could potentially be used by Air Force to model bombing
  • Founder/CEO has stated that he “believes Silicon Valley and the US government have to work together to maintain America’s technological edge—lest authoritarian regimes that don’t share the US values catch up.”
  • Founder/CEO signed FLI’s 2015 open letter; In his survey response, he stated that “we absolutely want to see a world where humans are in control and responsible for all lethal decisions.”  

IBM

  • Bid for the JEDI contract but failed to qualify
  • Actively working to produce “next-generation artificial intelligence chips,” for which it is building a new AI research center; expects to improve AI computing performance 1,000-fold over the next ten years
  • Long history of military contracting, including building supercomputers for nuclear weapons research and simulations
  • Currently involved in augmented military intelligence research for US Marine Corps
  • Three dozen staff members, including the Watson design lead and the VP of Cognitive Computing at IBM Research, signed a 2015 open letter calling for a ban on lethal autonomous weapons
  • Developed Diversity in Faces dataset using information from Flickr images; claims the project will reduce bias in facial recognition; the dataset is available to companies and universities linked to military and law enforcement around the world
  • In response to survey, confirmed it is not currently developing lethal autonomous weapons systems

Innoviz

  • Produces laser-based radar (lidar) for cars
  • Founded by former members of the IDF’s elite technological unit, but does not currently appear to be developing military applications

Intel

  • Develops various AI technologies, including specific solutions, software, and hardware, which it provides to governments
  • Selected by DARPA in 2017 to collaborate on DARPA HIVE, a data-handling and computing platform utilizing AI and ML
  • Announced in 2018 that it will work with DARPA on developing “the design tools and integration standards required to develop modular electronic systems”
  • Has invested significantly in unmanned aerial vehicles and flight control technology

Megvii

  • AI provider known for facial recognition software Face++
  • Reportedly uses facial scans from a Ministry of Public Security photo database that contains files on nearly every Chinese citizen
  • Has stated, “We want to build the eyes and brain of the city, to help police analyze vehicles and people to an extent beyond what is humanly possible.”

Microsoft

  • Competing with Amazon for JEDI contract
  • Published “The Future Computed” in 2018, which defines core principles necessary for the development of beneficial AI 
  • According to employees, “With JEDI, Microsoft executives are on track to betray these principles in exchange for short-term profits.” 
  • Company position on lethal autonomous weapons systems unclear
  • First tech giant to call for regulations to limit use of facial recognition technology

Montvieux

  • Developing a military decision-making tool that uses deep learning-based neural networks to assess complex data
  • Receives funding from the UK government

Neurala

  • Sells AI technology that can run on light devices and helps drones, robots, cars, and consumer electronics analyze their environments and make decisions
  • Military applications are a key focus
  • Works with a broad range of clients including the US Air Force, Motorola, and Parrot
  • Co-founder/CEO signed FLI’s 2017 open letter to the UN

Oracle

  • Provides database software and technology, cloud-engineered systems, and enterprise software products
  • Website states, “Oracle helps modern defense prepare for dynamic mission objectives”
  • Bid for the JEDI contract but failed to qualify; filed several complaints, in part over the Pentagon’s decision to use a single vendor

Palantir

  • Data-analysis company founded in 2004 by Trump advisor; has roots in CIA-backed In-q-Tel venture capital organization
  • Producer of “Palantir Intelligence,” a tool for analyzing data that is used throughout the intelligence community
  • Has developed predictive policing technology used by law enforcement around the US
  • In 2016, won a USD 222 million Special Operations Command contract
  • In March 2019, won a US Army contract worth over USD 800 million to build the Distributed Common Ground System, an analytical system for use by soldiers in combat zones

Percepto

  • Developed the Sparrow, an autonomous patrol drone with security applications
  • Focuses explicitly on industrial, rather than military or border security, applications
  • In response to survey, stated “Since we develop solutions to the industrial markets, addressing security, safety, and operational needs, the topic of lethal weapon[s] is completely out of the scope of our work”

Roboteam

  • Founded by two former Israeli military commanders with “access to the Israel Defense Forces as our backyard for testing”
  • Specifically serves military markets, including the Pentagon
  • Developed the Artificial Intelligence Control Unit (AI-CU), which brings autonomous navigation, facial recognition, and other AI-enabled capabilities to the control and operation of unmanned systems and payloads
  • Exposure of links to Chinese investment firm FengHe Fund Management appears to have cost them a series of US Army robotics contracts last year

Samsung

  • One of world’s largest tech companies
  • Developing AI technologies to be applied across its products and services in order to retain its hold on the telephone/computer market
  • Samsung Techwin, Samsung’s military arm known for the SGR-A1 sentry robot, was sold in 2014

SenseTime

  • Major competitor of Megvii
  • Sells software that recognizes objects and people
  • Various Chinese police departments use its SenseTotem and SenseFace systems to analyze video and make arrests
  • Valued at USD 4.5 billion, it is “the world’s most valuable AI start-up” and receives about two-fifths of its revenue from government contracts
  • In November 2017, sold its 51 percent stake in Tangli Technology, a “smart-policing” company it helped found

Shield AI

  • States that its “mission is to protect service members and innocent civilians with artificially intelligent systems”
  • Makes systems based on Hivemind, AI that enables robots to “learn from their experiences”
  • Developed Nova, a “combat proven” robot that autonomously searches buildings while streaming video and generating maps
  • Works with the Pentagon and Department of Homeland Security “to enable fully autonomous unmanned systems that dramatically reduce risk and enhance situational awareness in the most dangerous situations.”

Siemens

  • Europe’s largest industrial manufacturing conglomerate
  • Known for medical diagnostics equipment (CT scanners), energy equipment (turbines, generators), and trains
  • Produces MindSphere, a cloud-based system that helps enable the use of AI in industry
  • In 2013, won a USD 2.2 million military research contract with Carnegie Mellon University and HRL Laboratories to develop improved intelligence tools
  • Collaborating with DARPA on the TRAnsformative DESign (TRADES) program
  • In response to survey, stated: “Siemens is not active in this business area. Where we see a potential risk that components or technology or financing may be allocated for a military purpose, Siemens performs a heightened due diligence. […] All our activities are guided by our Business Conduct Guidelines that make sure that we follow high ethical standards and implement them in our everyday business. We also work on responsible AI principles which we aim to publish later this year.”

SoftBank

  • Invests in AI technology through its USD 100 billion Vision Fund, including BrainCorp, NVIDIA, and Slack Technologies; owns some 30 percent of Alibaba
  • Works in partnership with Saudi Arabia’s sovereign wealth fund and is part of Saudi strategy for diversifying away from oil
  • In 2017, took over Boston Dynamics and Schaft, both connected with DARPA
  • Developed the humanoid Pepper robot
  • In response to survey, stated, “We do not have a weapons business and have no intention to develop technologies that could be used for military purposes”

SparkCognition

  • Collaborates “with the world’s largest organizations that power, finance, and defend our society to uncover their highest potential through the application of AI technologies.”
  • Has attracted interest from former and current Pentagon officials, several of whom serve on the board or as advisors
  • Works “across the national security space—including defense, homeland security, intelligence, and energy—to streamline every step of their operations”; has worked with the British Army on military AI applications
  • Founder/CEO has stated that he believes restrictions on autonomous weapons would stifle progress and innovation

Synesis

  • Developed Kipod, a video analytics platform used by law enforcement agencies, governments, and private security organizations to find faces, license plates, object features, and behavioral events
  • In use by law enforcement in Belarus, Russia, Kazakhstan, and Azerbaijan

Tencent

  • China’s biggest social media company
  • Created Miying platform to assist doctors with disease screening and more
  •  Focused on research in machine learning, speech recognition, natural language processing, and computer vision
  • Developing practical AI applications in online games, social media, and cloud services
  • Investing in autonomous vehicle AI technologies
  • Has described its relationship to public in terms of a social contract: “Billions of users have entrusted us with their personal sensitive information; this is the reason we must uphold our integrity above the requirements of the law.”

VisionLabs

  • Developed Luna, software package that helps businesses verify and identify customers based on photos or videos
  • Partners with more than 10 banks in Russia and the Commonwealth of Independent States (CIS)
  • In response to survey, stated that they “explicitly prohibit the use of VisionLabs technology for military applications. This is a part of our contracts. We also monitor the results/final solution developed by our partners.”

Yitu

  • Developed “Intelligent Service Platform,” an algorithm that covers facial recognition, vehicle identification, text recognition, target tracking, and feature-based image retrieval
  • Its DragonFly Eye System can reportedly identify a person from a nearly two-billion-photo database within seconds
  • Technology utilized by numerous public security bureaus
  • In February 2018, supplied Malaysia’s police with facial recognition technologies; partners with local governments and other organizations in Britain

State of AI: Artificial Intelligence, the Military and Increasingly Autonomous Weapons

As artificial intelligence works its way into industries like healthcare and finance, governments around the world are increasingly investing in another of its applications: autonomous weapons systems. Many are already developing programs and technologies that they hope will give them an edge over their adversaries, creating mounting pressure for others to follow suit.

These investments appear to mark the early stages of an AI arms race. Much like the nuclear arms race of the 20th century, this type of military escalation poses a threat to all humanity and is ultimately unwinnable. It incentivizes speed over safety and ethics in the development of new technologies, and as these technologies proliferate it offers no long-term advantage to any one player.

Nevertheless, the development of military AI is accelerating. Below are the current AI arms programs, policies, and positions of seven key players: the United States, China, Russia, the United Kingdom, France, Israel, and South Korea. All information is from State of AI: Artificial intelligence, the military, and increasingly autonomous weapons, a report by Pax.

“PAX calls on states to develop a legally binding instrument that ensures meaningful human control over weapons systems, as soon as possible,” says Daan Kayser, the report’s lead author. “Scientists and tech companies also have a responsibility to prevent these weapons from becoming reality. We all have a role to play in stopping the development of Killer Robots.”

The United States

UN Position

In April 2018, the US underlined the need to develop “a shared understanding of the risk and benefits of this technology before deciding on a specific policy response. We remain convinced that it is premature to embark on negotiating any particular legal or political instrument in 2019.”

AI in the Military

  • In 2014, the Department of Defense released its ‘Third Offset Strategy,’ the aim of which, as described in 2016 by the then-Deputy Secretary of Defense, “is to exploit all advances in artificial intelligence and autonomy and insert them into DoD’s battle networks (…).”
  • The 2016 report ‘Preparing for the Future of AI’ also refers to the weaponization of AI and notably states: “Given advances in military technology and AI more broadly, scientists, strategists, and military experts all agree that the future of LAWS is difficult to predict and the pace of change is rapid.”
  • In September 2018, the Pentagon committed to spend USD 2 billion over the next five years through the Defense Advanced Research Projects Agency (DARPA) to “develop [the] next wave of AI technologies.”
  • The Advanced Targeting and Lethality Automated System (ATLAS) program, a US Army project, “will use artificial intelligence and machine learning to give ground-combat vehicles autonomous target capabilities.”

Cooperation with the Private Sector

  • Establishing collaboration with private companies can be challenging, as the widely publicized case of Google and Project Maven has shown: Following protests from Google employees, Google stated that it would not renew its contract. Nevertheless, other tech companies such as Clarifai, Amazon and Microsoft still collaborate with the Pentagon on this project.
  • The Project Maven controversy deepened the gap between the AI community and the Pentagon. The government has developed two new initiatives to help bridge this gap.
  • DARPA’s OFFSET program, which has the aim of “using swarms comprising upwards of 250 unmanned aircraft systems (UASs) and/or unmanned ground systems (UGSs) to accomplish diverse missions in complex urban environments,” is being developed in collaboration with a number of universities and start-ups.
  • DARPA’s Squad X Experimentation Program, which aims for human fighters to “have a greater sense of confidence in their autonomous partners, as well as a better understanding of how the autonomous systems would likely act on the battlefield,” is being developed in collaboration with Lockheed Martin Missiles.

China

UN Position

China demonstrated the “desire to negotiate and conclude” a new protocol “to ban the use of fully autonomous lethal weapons systems.” However, China does not want to ban the development of these weapons, which has raised questions about its exact position.

AI in the Military

  • There have been calls from within the Chinese government to avoid an AI arms race. The sentiment is echoed in the private sector, where the chairman of Alibaba has said that new technology, including machine learning and artificial intelligence, could lead to World War III.
  • Despite these concerns, China’s leadership is continuing to pursue the use of AI for military purposes.

Cooperation with the Private Sector

  • To advance military innovation, President Xi Jinping has called for China to follow “the road of military-civil fusion-style innovation,” such that military innovation is integrated into China’s national innovation system. This fusion has been elevated to the level of a national strategy.
  • The People’s Liberation Army (PLA) relies heavily on tech firms and innovative start-ups. The larger AI research organizations in China can be found within the private sector.
  • There are a growing number of collaborations between defense and academic institutions in China. For instance, Tsinghua University launched the Military-Civil Fusion National Defense Peak Technologies Laboratory to create “a platform for the pursuit of dual-use applications of emerging technologies, particularly artificial intelligence.”
  • Regarding the application of artificial intelligence to weapons, China is currently developing “next generation stealth drones,” including, for instance, Ziyan’s Blowfish A2 model. According to the company, this model “autonomously performs more complex combat missions, including fixed-point timing detection, fixed-range reconnaissance, and targeted precision strikes.”

Russia

UN Position

Russia has stated that the debate around lethal autonomous weapons should not ignore their potential benefits, adding that “the concerns regarding LAWS can be addressed through faithful implementation of the existing international legal norms.” Russia has actively tried to limit the number of days allotted for such discussions at the UN.

AI in the Military

  • While Russia does not have a military-only AI strategy yet, it is clearly working towards integrating AI more comprehensively.
  • The Foundation for Advanced Research Projects (the Foundation), which can be seen as the Russian equivalent of DARPA, opened the National Center for the Development of Technology and Basic Elements of Robotics in 2015.
  • At a conference on AI in March 2018, Defense Minister Shoigu pushed for increasing cooperation between military and civilian scientists in developing AI technology, which he stated was crucial for countering “possible threats to the technological and economic security of Russia.”
  • In January 2019, reports emerged that Russia was developing an autonomous drone, which “will be able to take off, accomplish its mission, and land without human interference,” though “weapons use will require human approval.”

Cooperation with the Private Sector

  • A new city named Era, devoted entirely to military innovation, is currently under construction. According to the Kremlin, the “main goal of the research and development planned for the technopolis is the creation of military artificial intelligence systems and supporting technologies.”
  • In 2017, Kalashnikov — Russia’s largest gun manufacturer — announced that it had developed a fully automated combat module based on neural-network technologies that enable it to identify targets and make decisions.

The United Kingdom

UN Position

The UK believes that an “autonomous system is capable of understanding higher level intent and direction.” It suggested that autonomy “confers significant advantages and has existed in weapons systems for decades” and that “evolving human/machine interfaces will allow us to carry out military functions with greater precision and efficiency,” though it added that “the application of lethal force must be directed by a human, and that a human will always be accountable for the decision.” The UK stated that “the current lack of consensus on key themes counts against any legal prohibition,” and that it “would not have any practical effect.”

AI in the Military

  • A 2018 Ministry of Defense report underlines that the MoD is pursuing modernization “in areas like artificial intelligence, machine-learning, man-machine teaming, and automation to deliver the disruptive effects we need in this regard.”
  • The MoD has various programs related to AI and autonomy, including the Autonomy program. Activities in this program include algorithm development, artificial intelligence, machine learning, “developing underpinning technologies to enable next generation autonomous military-systems,” and optimization of human autonomy teaming.
  • The Defense Science and Technology Laboratory (Dstl), the MoD’s research arm, launched the AI Lab in 2018.
  • In terms of weaponry, the best-known example of autonomous technology currently under development is the top-secret Taranis armed drone, the “most technically advanced demonstration aircraft ever built in the UK,” according to the MoD.

Cooperation with the Private Sector

  • The MoD has a cross-government organization called the Defense and Security Accelerator (DASA), launched in December 2016. DASA “finds and funds exploitable innovation to support UK defense and security quickly and effectively, and support UK prosperity.”
  • In March 2019, DASA awarded a GBP 2.5 million contract to Blue Bear Systems, as part of the Many Drones Make Light Work project. On this, the director of Blue Bear Systems said, “The ability to deploy a swarm of low cost autonomous systems delivers a new paradigm for battlefield operations.”

France

UN Position

France understands the autonomy of LAWS as total, with no form of human supervision from the moment of activation and no subordination to a chain of command. France stated that a legally binding instrument on the issue would not be appropriate, describing it as neither realistic nor desirable. France did propose a political declaration that would reaffirm fundamental principles and “would underline the need to maintain human control over the ultimate decision of the use of lethal force.”

AI in the Military

  • France’s national AI strategy is detailed in the 2018 Villani Report, which states that “the increasing use of AI in some sensitive areas such as […] in Defense (with the question of autonomous weapons) raises a real society-wide debate and implies an analysis of the issue of human responsibility.”
  • This has been echoed by French Minister for the Armed Forces, Florence Parly, who said that “giving a machine the choice to fire or the decision over life and death is out of the question.”
  • On defense and security, the Villani Report states that the use of AI will be a necessity in the future to ensure security missions, to maintain power over potential opponents, and to maintain France’s position relative to its allies.
  • The Villani Report refers to DARPA as a model, though not with the aim of replicating it. However, the report states that some of DARPA’s methods “should inspire us nonetheless. In particular as regards the President’s wish to set up a European Agency for Disruptive Innovation, enabling funding of emerging technologies and sciences, including AI.”
  • The Villani Report emphasizes the creation of a “civil-military complex of technological innovation, focused on digital technology and more specifically on artificial intelligence.”

Cooperation with the Private Sector

  • In September 2018, the Defense Innovation Agency (DIA) was created as part of the Direction Générale de l’Armement (DGA), France’s arms procurement and technology agency. According to Parly, the new agency “will bring together all the actors of the ministry and all the programs that contribute to defense innovation.”
  • One of the most advanced projects currently underway is the nEUROn unmanned combat air system, developed by French arms producer Dassault on behalf of the DGA, which can fly autonomously for over three hours.
  • Patrice Caine, CEO of Thales, one of France’s largest arms producers, stated in January 2019 that Thales will never pursue “autonomous killing machines,” and is working on a charter of ethics related to AI.

Israel

UN Position

In 2018, Israel stated that the “development of rigid standards or imposing prohibitions to something that is so speculative at this early stage, would be imprudent and may yield an uninformed, misguided result.” Israel underlined that “[w]e should also be aware of the military and humanitarian advantages.”

AI in the Military

  • It is expected that Israeli use of AI tools in the military will increase rapidly in the near future.
  • The main technical unit of the Israeli Defense Forces (IDF) and the engine behind most of its AI developments is called C4i. Within C4i, there is the Sigma branch, whose “purpose is to develop, research, and implement the latest in artificial intelligence and advanced software research in order to keep the IDF up to date.”
  • The Israeli military deploys weapons with a considerable degree of autonomy. One of the most relevant examples is the Harpy loitering munition, also known as a kamikaze drone: an unmanned aerial vehicle that can fly around for a significant length of time to engage ground targets with an explosive warhead.
  • Israel was one of the first countries to “reveal that it has deployed fully automated robots: self-driving military vehicles to patrol the border with the Palestinian-governed Gaza Strip.”

Cooperation with the Private Sector

  • Public-private partnerships are common in the development of Israel’s military technology. There is a “close connection between the Israeli military and the digital sector,” which is said to be one of the reasons for the country’s AI leadership.
  • Israel Aerospace Industries, one of Israel’s largest arms companies, has long been developing increasingly autonomous weapons, including the above-mentioned Harpy.

South Korea

UN Position

In 2015, South Korea stated that “the discussions on LAWS should not be carried out in a way that can hamper research and development of robotic technology for civilian use,” but that it is “wary of fully autonomous weapons systems that remove meaningful human control from the operation loop, due to the risk of malfunctioning, potential accountability gap and ethical concerns.” In 2018, it raised concerns about limiting civilian applications as well as the positive defense uses of autonomous weapons.

AI in the Military

  • In December 2018, the South Korean Army announced the launch of a research institute focusing on artificial intelligence, called the AI Research and Development Center. The aim is to capitalize on cutting-edge technologies for future combat operations and turn the center into “the military’s next-generation combat control tower.”
  • South Korea is developing new military units, including the Dronebot Jeontudan (“Warrior”) unit, with the aim of developing and deploying unmanned platforms that incorporate advanced autonomy and other cutting-edge capabilities.
  • South Korea is known to have used the armed SGR-A1 sentry robot, which has operated in the demilitarized zone separating North and South Korea. The robot has both a supervised mode and an unsupervised mode. In the unsupervised mode “the SGR-A1 identifies and tracks intruders […], eventually firing at them without any further intervention by human operators.”

Cooperation with the Private Sector

  • Public-private cooperation is an integral part of the military strategy: the plan for the AI Research and Development Center is “to build a network of collaboration with local universities and research entities such as the KAIST [Korea Advanced Institute for Science and Technology] and the Agency for Defense Development.”
  • In September 2018, South Korea’s Defense Acquisition Program Administration (DAPA) launched a new strategy to develop its national military-industrial base, with an emphasis on boosting ‘Industry 4.0 technologies’, such as artificial intelligence, big data analytics and robotics.

To learn more about what’s happening at the UN, check out this article from the Bulletin of the Atomic Scientists.