In discussing the social issues associated with autonomous weapons, I turn to a treatment of them in terms of strategic, or military, culture. Strategic culture is defined as “shared beliefs, assumptions, and modes of behavior, derived from common experiences and accepted narratives (both oral and written) that shape collective identity and relationships to other groups and which determine appropriate ends and means for achieving security objectives” (Kartchner, 2009). In other words, strategic culture concerns the identity, values, norms, and perceptions that shape the functioning of a given security apparatus.
American innovation has been responsible for breakthroughs in abstract communication media such as the internet and allied social networking technology. These communication technologies are part of the status quo American identity. They ‘close the gap’ of communication by enabling virtual correspondence with many people, at any time. Ironically, reliance on these remote communication technologies can produce a disconnection in personal communication: more time calling, texting, Facebooking, or e-mailing, and often less actual face time. This component of American identity is reflected in military technology. For instance, today at least one-third of all aerial, sea-based, and land-based military vehicles are unmanned, albeit still controlled from a remote location by a human operator. So while technology enables the military to ‘close the gap’ – to be anywhere in the world in a moment – this same technology can also cultivate a sense of disconnectedness from the reality on the ground. The cultural identity of America, with its emphasis on remote command, control, and communications, feeds the development of autonomous weapons technologies. Likewise, the utilization of such technologies reinforces this component of sociocultural identity.
America values a specific type of instrumental rationality, one based upon cost-benefit analysis. Not everybody formulates rationality in this way, nor is it strictly American; in the United States, however, war is commonly evaluated on this scale of cost-benefit analysis. There is also a strong sense of ‘duty’. Insofar as the US is committed to compliance with the Geneva Conventions and the principles of Just War Doctrine, military culture is concerned with proportionality (i.e., the rewards must outweigh the risks) and discrimination (i.e., between combatants and non-combatants). The verdict on whether an event or mode of action fits within this framework of valuation often determines the extent of socioeconomic resources committed to the cause. Turning to US development and utilization of autonomous weapons, the extent of both presumably depends upon whether autonomous technology can demonstrate a level of discriminatory intelligence reliable enough to satisfice concerns over proportionality and discrimination. This is the materialistic level of evaluation, supplemented by the question of whether it is ‘worth it’ to advance autonomous systems for warfighting purposes. Such worthiness is a function of risk versus reward.
Risk perception also affects the disposition toward use or non-use of autonomous weapons. One who sees a threat around every corner is more likely to justify an aggressive posture than one who feels relatively safe. In the United States, the gravity of the ‘terrorist threat’ determines the magnitude of compromise that people are willing to make for the sake of security; in a time when war is less readily perceivable, support for risky ventures is marginalized. This dynamic can cut two ways for autonomous weapons. On one hand, autonomous weapons may be seen as too risky because of concerns about discrimination, proportionality, and the threat of such weapons systems proliferating into the hands of adversaries. On the other hand, autonomous weapons may be seen as limiting the risk to human life by virtue of their characteristic ‘remote control’ function. Opponents will use the former argument, and advocates the latter. To flip the script, the actual use of autonomous weapons affects risk perception in a paradoxical way: friendly human life is valuable to the extent that it is desirable to minimize risk to it through the employment of autonomous systems, while enemy life is expendable to the extent that robotic technology can execute the kill.
Whether it is acceptable and ‘normal’ to develop autonomous weapons is a judgment formed out of a combination of identity, values, and risk perception. The ultimate establishment of such norms is reflected in public policy, such as arms control. Resonance or dissonance with established norms is predictable through an analysis of the strategic culture of the entity in question. For instance, whether China slows its development of autonomous weapons or pushes full speed ahead depends upon its sociocultural environment. Likewise, once recommended rules and regulations are formalized in public policy, compliance or non-compliance will affect, and be affected by, culture.