Guest post: The future of automated warfare by Nelson Lowhim

There’s been a lot of talk lately about the trolley problem as it will be faced by the artificial intelligence (AI) in driverless cars, forced to choose between one person and five in the event of an accident. The trolley problem, if you aren’t aware, presents a trolley bearing down on five people and a choice: you can pull a lever to send the trolley down another track, killing one person, or you can let the five people die. Do you do it? On average, the majority of people agree that the lever should be pulled. There are myriad variations of the trolley problem and even more interpretations, as people’s answers point to a morality that differs from the standard utilitarian viewpoint.

Such an AI in a car will of course cause problems, but my main issue isn’t the trolley problem in these individual situations. It’s the more insidious versions already playing out in our remote-controlled wars (the drone wars, more specifically): not only how the in-the-air decisions will be made, but how the decision to have the drone up to begin with represents a kind of trolley problem. (Indeed, there are already just such variations of the problem, like “air bomber vs. terrorist,” that only reinforce our cultural norms.)

Let me back up and examine the current drone war, a remote-controlled one, and why it’s a preferred way to wage undeclared wars. I will also show how this automated route is, and will continue to be, preferred by politicians and military leaders alike.

First, let’s take a look at history. Back in the days of the wars in Indochina, the air campaign in Laos was considered by many in the military to be the best way to fight a war. Unlike Vietnam, Laos was “won”: by bombing it to pieces. The Plain of Jars is an example of this method; Voices from the Plain of Jars is a book of testimonials from the people on the receiving end of these bombs as they speak to their confusion at being dealt so much horror.

Of course, those on the receiving end of mass murder were not the concern of our leaders. What was learned from this first real automated war was that, with few troops in harm’s way, the public simply did not care. And with even fewer American soldiers to bear witness to the horrors of such a war (to say nothing of those troublesome draftees, whom we have since bypassed entirely), the war resulted in a win-win for military and political leaders.

With the current drone war, as with the Tomahawk missiles of near-past engagements, we see a similar attitude playing out. A war can be fought for longer, at lower cost, and with no physical risk to our soldiers, with the added benefit of a docile media touting every strike as a success. Imagine that: every day a victory. And no witnesses to speak of the flesh and blood and bone marrow leaked in such wins. Every piece of information that does manage to make it past our national security filters is treated not as a revelation, not as light shed upon secret compartments that make money for a few, but as treason (against what, exactly, our intellectual class never asks). This state of affairs is unfortunate, but for those at the top and those with money to gain, what could be better?

Well, there are a few issues: the pilots of the drones must still deal with PTSD. Unfortunate, this. Even more unfortunate, for our leaders, are those who have come forth to speak of the drone wars as immoral. Pesky troops, even professional ones, tend to buck the yoke. What can a leader do? One solution is to remove the person pulling the trigger by at least one decision-making level. This separation strips away some of the moral responsibility, and the military is looking to move toward such a system sooner rather than later. In fact, that’s a known side effect of the trolley problem: people are more likely to do the “utilitarian thing” when they are once removed from the act itself. Again, handing off moral responsibility.

And what about the long-term issues? Those in charge will want wars at no political cost, something we’re already approaching with an unquestioning media. Only obviously wrong choices or outright losses seem to carry a price. Our metadata strikes already allow politicians to assume little responsibility for the choice beyond agreeing with an algorithm that decides what and when to strike. Here is where it becomes important: once facial recognition, online data (being swept up as we speak), and other strike signatures are fully developed and automated, and given only a minimal moral examination by our leaders, what’s to stop this system?
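To see how little “choice” remains for the politician, here is a minimal sketch in Python of what agreeing with such an algorithm amounts to. Every signature, weight, and threshold below is invented for illustration; no real targeting system is being described.

```python
# A toy "signature strike" selector. Everything here is invented for
# illustration: no real system, feature set, or policy is described.
# The point is that once metadata signatures are scored and thresholded,
# "what and when to strike" is decided by a lookup, not a person.

SIGNATURE_WEIGHTS = {                 # hypothetical features and weights
    "visited_flagged_location": 0.4,
    "called_flagged_number": 0.3,
    "matched_face_watchlist": 0.3,
}
STRIKE_THRESHOLD = 0.6                # an arbitrary policy knob

def strike_score(metadata: dict) -> float:
    """Sum the weights of whichever signatures the metadata matches."""
    return sum(w for sig, w in SIGNATURE_WEIGHTS.items() if metadata.get(sig))

def flag_for_strike(metadata: dict) -> bool:
    """The whole 'decision': a threshold chosen once, far from any strike."""
    return strike_score(metadata) >= STRIKE_THRESHOLD

# Two of three signatures match, the threshold is cleared, and no human
# has weighed this particular case at all.
print(flag_for_strike({"visited_flagged_location": True,
                       "called_flagged_number": True}))   # True
```

Responsibility collapses into picking a few weights and a threshold once, in advance, far away from any particular strike.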

And if each strike is treated as some variation of the trolley problem by algorithm, as our dear leaders already treat them (one dead terrorist is worth n → ∞ saved civilians, goes the constant claim; to say nothing of the standard metric of 1 of us > 1 of them), then the decision simply to deploy these weapons will be treated the same way: deploying them to an area indefinitely will kill n people, but oh, it will save n^n more, so that it’s not even worth discussing.
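To make that arithmetic concrete, here is the same ledger rendered as a toy calculation, again with invented numbers and weights that merely mirror the rhetorical claims above, not any real doctrine:

```python
# A toy rendering of the trolley-problem-by-algorithm ledger described
# above. All weights and numbers are rhetorical stand-ins, not policy.

def authorize(claimed_civilians_saved: float,
              expected_civilian_deaths: float,
              our_casualty_risk: float,
              them_weight: float = 0.1,    # "1 of us > 1 of them"
              us_weight: float = 1.0) -> bool:
    """Approve the strike whenever the claimed benefit outweighs the cost."""
    benefit = claimed_civilians_saved * them_weight
    cost = (expected_civilian_deaths * them_weight
            + our_casualty_risk * us_weight)
    return benefit > cost

# With a drone, our_casualty_risk is engineered to ~0, and the "saved"
# figure can be asserted freely, so the ledger approves by construction.
print(authorize(claimed_civilians_saved=10,
                expected_civilian_deaths=3,
                our_casualty_risk=0.0))    # True
```

The point is not the numbers but the structure: once our side’s risk is engineered to zero and the “saved” figure can be asserted freely, the ledger approves almost anything.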

In fact, I’m certain this logic will be so easily assumed as truth, a national security commandment, that such weapons systems will be deployed whenever people need “saving”, or ex-colonial countries need interests served, or droughts and other climate change effects turn a population problematic. For these weapons systems simply save people; the exact number saved is limited only by the feeble, fearful (yet somehow always capitalizing on that fear) minds of our do-you-want-a-mushroom-cloud-over-this-city politicians.

How can it be otherwise, if there is no political cost to these actions and to the systems carrying out murder? I want to reiterate that I’m not speaking of Skynet; I’m speaking of how the only political sphere that matters, public discourse, is currently too stunted for this debate and seems unable to unmask these kinds of forever wars. And since we have all been taught to worship the high priest of national security and never doubt it, we need a massive rise in consciousness to battle it. Again, these automated wars will only provide less transparency and less short-term economic and political cost, and will thus spread in size and number. After all, they save lives, don’t they?

Now, this alone should be enough to scare even the most authoritarian amongst you, but let’s extrapolate current trends further and add Snowden’s revelations to the equation: data, text, video, and access to all manner of cameras, under the control of the NSA. Not enough has been done to confront this, and once it’s tied into a fully networked drone system, we’ll see all sorts of data turned into the trolley problem. Again, look at Minerva [LINK] and how predictive analysis will spot these “problems” everywhere. And again, what will stop this from spreading? Certainly not the political sphere, for what oversight will we have over these decisions?

Because the technology will spread; let’s not fool ourselves. Other, more authoritarian actors will get this technology too. And even if that doesn’t happen any time soon, those looking to avenge innocent friends and family killed in these “wars” will look to exact a price. When these chickens finally come home to roost, you can bet money that the high priests of national security, cloaked in their flags, will be gone, hiding behind whatever fortress they’ve saved enough money for, and we’ll be left as the only targets of those whom our leaders have terrorized.

I can hear the naysayers out there claiming that there will be no blowback as long as we don’t let up. Let’s imagine that this is so, that no one decides to be human and react. Well then, what’s the end game? Intelligence shows that every strike creates more terrorists, a progression that can only end in genocide or in a mass exodus of refugees. And since masses of refugees aren’t politically desirable either, I imagine a similar solution will be dealt to them as well.

How do we confront this situation? Like I said, the political costs are too low as it is. Calling or writing your congressman and protesting are certainly a couple of ways; so is helping groups fighting against this, and so is raising awareness among the people you know. Finally, our leaders need to work toward a more feasible international framework, one that doesn’t allow the great powers to run amok. This is becoming more and more imperative as climate change looks set to diminish the world’s resources, water tables included.

One thing we can’t do is allow our leaders to frame drones as a trolley problem. The situation is far too complicated and far too horrific for that. It’s more like being forced to choose between the two tracks by a madman who built them through two houses, one with a single resident and the other with five. The madman then pays for a trolley to come through. We are not allowed to ask why there are no brakes, or why he built the tracks there, or why we are given only two choices. This framing manifests itself in claims that one shouldn’t think about history or the past, only about “what can we do now?” We can work toward more humane solutions. We can keep working to make sure this technology is used to help people, not kill them.

 

About Nelson Lowhim
He was born in Tanzania, where he lived for the first decade of his life. He then lived in India for a year before finally settling in the U.S., in the state of Michigan. From there he joined the Army and served for seven years as an Infantryman in 1st AD, then as an Engineer in Fifth Group. After his time in the Army, he came to New York and earned an undergraduate degree from Columbia University. He wrote in the Bronx for a few years and now lives in Spokane, WA, with his better half. Connect with him on Facebook, Twitter, or his Blog.

One Comment

  1. Interesting take, but remember: weapons serve one purpose – to kill people. The fact that drones are good at it isn’t a problem. In fact, drones might be the most ethical way to kill people once you have determined they need to be killed: http://www.nytimes.com/2012/07/15/sunday-review/the-moral-case-for-drones.html

    You make a good point and I do agree that we can’t let drones reduce the need for a thoughtful review before ordering a strike. However, once the need is confirmed, there’s nothing morally superior about putting American lives at risk to kill someone.

