My face when ethicists won't let me live out my dreams.
For those not in the know, nuclear deterrence is a theory in international security stating that nuclear-armed states will refrain from using their nuclear arsenals if they believe that retaliation for such use will be unacceptable. Simply put, I won't nuke you if I think you have the ability to hurt me badly. So it's in your best interest to keep me scared, and in my best interest to keep you scared. But only a little scared. Here's where the idea gets complicated and somewhat circular: I don't wanna scare you too much, because then you become more likely to nuke me, and vice versa. On the flip side, if I try to protect myself from nuke attacks too much, that might make you nervous; why am I so worried about being nuked? Am I planning to do something that would make you attack me? You see how this logic can easily devolve into something resembling that old Abbott and Costello bit "Who's on First?" Likewise, you can see that when more than one country has nukes, everyone else at the table gets nervous.
Walzer points out, quite correctly, that deterrence can never be fully proven as effective. There are no case studies for philosophers or strategists, so they make up imaginary scenarios and simulations. Whether these have any relationship to what would happen in reality is entirely unknown, and herein lies an important point: We never ever want to find out. Once a nuke goes off in aggression, deterrence theory has failed, and we really can't be sure what would happen next. Maybe nothing. Maybe the aggressing nation is immediately invaded and its leaders brought up on war crimes charges. Maybe everyone launches all their nukes on everyone. Maybe everyone launches their nukes on the original aggressor. Paging Dr. Strangelove.
Here's where I think Walzer might be missing some of the nuance of the argument. Certainly, the existence of such weapons has inherent moral implications. Certainly, threatening (implicitly or explicitly) to kill millions of innocent people if you are attacked is morally reprehensible. But it might be better than the alternative. Whatever we might wish, and whatever disarmament we might work towards, these weapons exist, and a lot of countries have them. Until such time as they no longer exist, they have to be accounted for in strategy.
Now we'll see who gets chewing gum put in their hair on the playground.
Let's say I'm the leader of a large Western nation. In that role, one of my top priorities (perhaps my highest priority) is the security of my constituents. I have a legal and moral obligation to protect these people as best I can. Certainly, threatening to kill millions in an instant is inherently immoral. But my first moral obligation is to my people. Nuclear weapons are one of the chief threats to them, even if I wish they weren't. I wish I lived in a world without them, but I don't. I live and lead in this world, and for me, it would be a greater moral violation to allow my people to be dominated or exploited by a nuclear power. To keep that from happening, I engage in another moral violation, but it is the lesser of the evils I am faced with.
Maybe in the future, humanity will move past nuclear weapons. But until then, we have to make compromises to protect ourselves. It's not so much about compromising morality away as it is about compromising for as much morality as is realistic. Change comes in small steps. Real-world leaders don't get the luxury of clean hands. The goal has to be not to let them get too filthy. That's a blurry line at best, and it's one that world leaders will likely never get down perfectly. But we're all we've got, and we can only do what's possible. The good news is, we have a lot of smart people with strong moral compasses. I like our odds.