ARTIFICIAL INTELLIGENCE (AI)
Artificial Intelligence (AI)[1] is a
complex field of high potential that promises value, whether for automated
decision making within the military planning community or, ultimately, for
rivalling human intelligence itself. As Marvin Minsky put it, "Within a generation
[...] the problem of creating 'artificial intelligence' will substantially be
solved".[2] Since then, the field has come a long way, much to the surprise of the human race.
ROLE OF AI IN MILITARY WARFARE
Artificial Intelligence (AI) holds prominent significance for
facilitating military decision making and enhancing the lethality of the
forces while minimising casualties to one's own troops[3]. This write-up focuses on the various AI options and developments being
pursued by Indian military planners, their possible ramifications for core human
rights principles, and the legal and ethical issues involved.
AI promises various benefits, including but not limited to
minimisation of human casualties, strengthening of warfare tactics that
sharpens the military's combat edge, and acting as a force
multiplier.
In the current scenario, with growing issues
and tensions on both the eastern and western fronts, military planners will be
compelled to employ large numbers of autonomous vehicles[4]
in the air, on land and at sea, capable of executing their
missions with minimal human involvement. However, in the
absence of consensus on how Meaningful Human Control (MHC) is to be specified,
there remains a lack of clarity on the very definition of Lethal Autonomous Weapon Systems (LAWS)[5].
A close perusal of these automation technologies brings to the fore
the question of whether AI-enabled systems should be allowed to execute such dangerous
military operations, especially in scenarios where human lives and human
privacy are at stake; it is on this question that the world is divided into two camps.
A HUMAN RIGHTS PERSPECTIVE ON USING AUTONOMOUS VEHICLES
(a) Meeting the ISTAR[6] requirements of military
planners through so-called autonomous systems for zeroing in on High Value
Targets may require the continuous collection of large swathes of data on ordinary,
benign individuals, thereby interfering with individual privacy. Presently,
there is no data protection law in India which can act as a shield
protecting the privacy of an individual. Edward Snowden's revelations
have likewise divulged some of the best-kept surveillance secrets of states.[7]
(b) It has often been observed that during pre-programmed missions, a
shift in the location of a High Value Target creates the risk
of collateral damage, resulting in loss of property and innocent human lives.
(c) There is a growing fear that machines with inbuilt
AI will become so intelligent that they take over and end the human race
and civilisation, as fictionally depicted in the Terminator series[8].
Even if we presume that is not the case, we certainly cannot afford to underestimate the
potential for misuse or abuse of AI in the future.
(d) Lack of Distinction – Parties to an armed
conflict[9]
are required to distinguish the civilian population and its assets from military
personnel and assets, and to target only the latter; AI-powered weapons
may not be capable of making that distinction.
(e) Lack of Proportionality[10]
– Parties to an armed conflict are required to weigh the civilian cost of
achieving a particular military objective; an attack is prohibited if the civilian
harm exceeds the military advantage. AI-powered weapons may be incapable of
making that assessment.
(f) A human combatant also remains under the protection of the
principles of humanity and the dictates of public conscience, but such
protection is not assured with AI-powered weapons, since adherence to these
principles requires a subjective judgement that machines cannot make.
Against the backdrop of International Humanitarian Law (IHL) under the
Geneva Conventions[11],
people and communities are already engaged in long-running debates over
banning further development of AI-powered weapon systems.
COUNTER PERSPECTIVE
There is an equally strong body of opinion
which holds that the development and deployment of AI-powered vehicles is not illegal,
and would in fact save human lives[12].
Some of these counter-views are as follows:-
1. AI-powered weapons do not need to have
self-preservation as a foremost drive and hence can be used in a
self-sacrificing manner, saving human lives in the process, e.g. the use of
kamikaze drones[13] such as the
Harop[14]
and Harpy[15] instead of
sending armed forces personnel.
2. Such weapons can be designed without the emotions that
normally cloud human judgement in battle and lead to unnecessary loss of
life.
3. The development of AI sensors superior to
human capabilities would enable systems to pierce the fog of war, leading to
better informed "kill" decisions.
4. Autonomous weapons would have a wide range of
uses in scenarios where civilian loss would be minimal or non-existent, such as
naval warfare[16].
5. Finally, the question of legality depends
on how these weapons are used, not on their development or existence. It is also
premature to argue over the legal issues surrounding autonomous weapons:
the technology itself is not fully developed, and India in particular is still at a
nascent stage of development, so it is too early to speak of LAWS now[17].
As the saying goes, 'if you want peace,
prepare for war'. Even though most world leaders are supporters
of peace, it remains the responsibility of governments to keep defence systems
technically advanced. Robots equipped with artificial intelligence can do
the groundwork of soldiers, which would benefit the army and
save hundreds of lives. However, preoccupied as it is with the
huge operational and logistic challenges facing the Indian military,
including issues of modernisation, the AI paradigm is yet to become a
key driving force in the doctrinal thinking and perspective planning of Indian
military planners.
AI MILITARY WEAPONS IN INDIAN MILITARY ARMOURY
The Indian military landscape comprises a
wide variety of scenarios in which AI-powered vehicles can be deployed to
advantage. With the progressive development of AI technologies, the following scenarios
can be visualised:
1. Autonomous systems such as the DRDO Daksh[18],
designed to disarm IEDs[19],
are already in use by the Indian Army (IA) and the National Security Guard (NSG),
although there is scope for further improvement. Such autonomous systems are
non-lethal and defensive in nature.
2. We are currently in the process of procuring
manually piloted armed UAVs. Future armed Unmanned Aerial Vehicles (UAVs) and
Unmanned Undersea Vehicles (UUVs) with increasing degrees of autonomy in ISR[20]
and kill functions can also be visualised. Such systems would be classified as
lethal and offensive.
3. There is scope for deployment of robot
sentries, duly tailored to our requirements, along the Line of Control (LoC),
Line of Actual Control (LAC) and International Border (IB), on the lines of the SGR-A1[21].
Such a deployment would be categorised as lethal and defensive in character.
4. DRDO’s main facility working in this area is the
Centre for Artificial Intelligence and Robotics (CAIR), whose vision, mission
and objectives all refer to the development of intelligent systems, AI and
robotics technologies.
5. CAIR has made some headway with
prototype systems, such as the "Muntra" UGV[22],
remotely operated vehicles, wall-climbing and flapping-wing robots, etc.
It is now in the process of developing a Multi
Agent Robotics Framework (MARF) to cater to a myriad of military
applications. However, these efforts alone may not suffice to keep pace
with progress in the international arena.
CONCLUSION
Given the extended borders with our adversaries
on two fronts and the volatile Counter Insurgency (CI) scenarios in J&K and
in the North-East, it is well appreciated that sufficient boots on the
ground are an absolute must.
At the same time, it is imperative that the Indian
Military keeps pace with the changing nature of warfare in the 21st Century,
driven by rapid advances in technology on many fronts. AI technologies, after
decades of false starts, today appear to be at an inflection point and are
rapidly being incorporated into a range of products and services in the
commercial environment. It is only a matter of time before they manifest
themselves in defence systems, in ways significant enough to usher in a new era.
Notwithstanding the worldwide concern over the
development of AI weapons from human rights, legal and ethical points of view,
it is increasingly clear that, no matter what conventions are adopted at
international platforms, R&D in this area is likely to proceed unhindered.
Given our own security landscape, the adoption of AI-based
systems with increasing degrees of autonomy in various operational
scenarios is expected to yield tremendous benefits in the coming years.
Judicious use of this technology can save many human
lives, for "Wars are poor chisels for carving out
peaceful tomorrows".
By: Mohit Kansal and Shobhit Aggarwal
[1] The field of AI was
formally founded in 1956 at a conference at Dartmouth College in Hanover,
New Hampshire, where the term "artificial intelligence" was
coined.
[2] Daniel Crevier, "AI: The Tumultuous History of the
Search for Artificial Intelligence" (Basic Books, 1993)
[3] Excerpts from
Villani Report, URL: https://www.iris-france.org/110108-villanis-report-defence-at-the-age-of-ai/
[4] As per Human
Rights Watch (HRW), “fully autonomous weapons are those that once initiated, will
be able to operate without Meaningful Human Control (MHC). They will be able to
select and engage targets on their own, rather than requiring a human to make
targeting and kill decisions for each individual attack.”
[5] Presentation to
PIR Center Conference on Emerging Technologies, Moscow, 29 September 2016,
Delivered by Mary Wareham, Human Rights Watch, on behalf of the Campaign to
Stop Killer Robots. URL: https://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_Moscow_29Sept2016.pdf
[6] ISTAR stands
for intelligence, surveillance, target acquisition, and reconnaissance. In its
macroscopic sense, ISTAR is a practice that links several battlefield functions
together to assist a combat force in employing its sensors and managing the
information they gather. From: Report to the Subcommittee on Air and Land
Forces, Committee on Armed Services, House of Representatives — General
Accounting Office, 2008-03-15
[7] Burrough,
Bryan; Ellison, Sarah; Andrews, Suzanna (April 23, 2014). "The Snowden
Saga: A Shadowland of Secrets and Light". Vanity Fair. Retrieved April 29,
2016
[8] Hogan, Michael;
Whitmore, Greg (2015-01-08). "The top 20 artificial intelligence films -
in pictures". The Guardian. Retrieved 2015-08-29.
[9] How is the Term
"Armed Conflict" Defined in International Humanitarian Law? Excerpt
from International Committee of the Red Cross (ICRC) Opinion Paper, March 2008
[10] As stated in
Prosecutor v. Prlić, the principle of proportionality is defined in Article
51(5)(b) of Additional Protocol I (1977) to the Geneva Conventions of 1949 and was later
drawn upon by the drafters of the Rome Statute of the International Criminal Court for Article
8(2)(b)(iv), the war crime of causing excessive incidental damage to civilian
objects and loss of civilian life.
[11] Definition of International Humanitarian Law - International humanitarian law is a
set of rules which seek, for humanitarian reasons, to limit the effects of
armed conflict. Source URL: https://www.icrc.org/en/doc/assets/files/other/what_is_ihl.pdf
[12] Vikram
Suryawanshi, ‘How Artificial Intelligence Can Save Human Lives’,
Source URL: https://www.insightssuccess.in/how-artificial-intelligence-can-save-human-lives/#:~:text=As%20stated%20above%2C%20by%20using,able%20to%20tackle%20the%20accidents.
[13] A loitering
munition (also known as a suicide drone or kamikaze drone) is a weapon system
category in which the munition loiters around the target area for some time,
searches for targets and attacks once a target is located. Under army
regulations such munitions are categorised as “missiles”, which is why the
term “loitering munition” is preferred for these drones.
[14] Harop is a
loitering munition (LM) system developed in Israel by the MBT missiles division
of Israel Aerospace Industries (IAI). The Harop unmanned combat aerial vehicle
(UCAV) has been developed from the Harpy unmanned aerial vehicle (UAV), also
developed by the IAI. This combat drone was unveiled in India at the Aero-India
show in February 2009.
[15] The IAI Harpy
is a loitering munition produced by Israel Aerospace Industries. The Harpy is
designed to attack radar systems and is optimised for the suppression of enemy
air defence (SEAD) role. It carries a high explosive warhead. The Harpy has
been sold to several foreign nations, including South Korea, Turkey, India, and
China.
[16] R. Shashank
Reddy, ‘India And The Challenge Of Autonomous Weapons’, Paper published in
Carnegie India, Source URL: https://carnegieendowment.org/files/CEIP_CP275_Reddy_final.pdf
[17] LAWS, often
dubbed “killer robots”, can be challenged ethically in view of their lethality and of
machines replacing humans in critical tactical decisions. Owning
responsibility for a misfire is the main hurdle in the deployment and commissioning
of LAWS. LAWS raise a host of philosophical, psychological and legal issues,
since they are killer machines with embedded AI and no human control. In fact,
these weapons have the potential to disrupt the present conventional way of
fighting war, and many human rights activists are up in arms to ban LAWS, as they
would violate International Humanitarian Law (IHL) under the Geneva Conventions
and International Human Rights Law (IHRL) under the Universal Declaration of Human
Rights (UDHR).
[18] Daksh is a
battery-operated remote-controlled robot on wheels that was created with a
primary function of bomb recovery. Developed by Defence Research and
Development Organisation, it is fully automated. It can navigate staircases,
negotiate steep slopes, navigate narrow corridors and tow vehicles to reach
hazardous materials. Source URL: https://drdo.gov.in/robotics
[19] An improvised
explosive device (IED) attack is the use of a “homemade” bomb and/or
destructive device to destroy, incapacitate, harass, or distract. IEDs are used
by criminals, vandals, terrorists, suicide bombers, and insurgents. Because
they are improvised, IEDs can come in many forms, ranging from a small pipe
bomb to a sophisticated device capable of causing massive damage and loss of
life. IEDs can be carried or delivered in a vehicle; carried, placed, or thrown
by a person; delivered in a package; or concealed on the roadside. The term IED
came into common usage during the Iraq War that began in 2003. Factsheet from
the National Academies and the Department of Homeland Security titled, ‘News
& Terrorism – Communicating in a Crisis’, Source URL: https://www.dhs.gov/xlibrary/assets/prep_ied_fact_sheet.pdf
[20] See Supra Note 6.
[21] The SGR-A1 is a
type of sentry gun that was jointly developed by Samsung Techwin (now Hanwha
Aerospace) and Korea University to assist South Korean troops in the Korean
Demilitarized Zone. It is widely considered as the first unit of its kind to
have an integrated system that includes surveillance, tracking, firing, and
voice recognition. Pike, John, "The Samsung Techwin SGR-A1 Sentry Guard
Robot". (November 7, 2011), Global Security.
[22] The “MUNTRA”
UGV is a BMP-class vehicle converted into tele-operated and autonomous
variants. The UGV has a diverse range of technologies and systems incorporated in
it, including electro-optics, sensor fusion, electro-mechanical actuators and
communication systems. Source URL: https://www.drdo.gov.in/muntra-ugv