‘Nothing Kept Me Up At Night the Way the Gorgon Stare Did.’
Arthur Holland Michel | Longreads | June 2019 | 15 minutes (3,946 words)
Drones have come to define the United States’ forever war, the so-called war on terror. The expansion of drone systems developed by the military into new territories — including the continental United States — embodies this era’s hyper-paranoid ethos: new threats are ever imminent, conflict is always without resolution. At the same time, non-militarized drones have entered civilian life in a number of ways, from breathtaking cinematography to the drone sightings that halted flights at Heathrow airport. There are many avid documenters of this new technology, but no one seems to understand its many facets quite like Arthur Holland Michel, founder and co-director of the Bard Center for the Study of the Drone, which catalogs the growing use of drones around the world. Now, Holland Michel has written Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All, a book of startling revelations about drone surveillance in the United States.
Holland Michel has lived and breathed drone technology for the last six years, but nothing quite shocked him like the technology of Wide-Area Motion Imagery (WAMI). WAMI greatly expands the power of a camera attached to a drone; it can watch and record a much greater area while also tracking multiple specific targets within that area. In his book, Michel lays out how scientists and engineers created this surveillance technology through a Manhattan Project-like mission. The name — a little too on the nose — that the scientists decided to give their new invention was “Gorgon Stare,” after the terrifying mythological creature whose mere glance could turn you to stone. Even from the very beginning, Gorgon Stare’s creators knew that its power would extend beyond its original stated purpose: to help prevent IED attacks and track insurgents across conflict zones. Now, proponents of WAMI are finding uses for it in civilian life, and Holland Michel argues that the public must be involved in any decision before it is deployed over us. I met up with Arthur on a beautiful spring day (perfect for flying drones) to discuss this profoundly troubling technology, how to prevent its worst potential from being realized, and maybe — just maybe — how drones can be used for good.
*
Sam Jaffe Goldstein: You co-founded the Bard Center for the Study of the Drone. Why was the center created? What is its mission?
Arthur Holland Michel: I came up with the idea between my junior and senior year of college. I was fascinated by the idea of drones; they were flying robots that were going to deliver burritos, they were being used for military operations overseas, and they were going to start crowding the airspace. I felt it would be interesting to study this new and mysterious technology.
As it turned out, my first-year roommate, Dan Gettinger, was writing his senior thesis about drones, and so he joined me and became a crucial part of the project. I came back to campus after the summer and we set up a speaker series. By the time we graduated there was enough interest in what we were doing that we decided to take it further. That was in May of 2013, and here we are now.
How did you learn about Gorgon Stare and Wide-Area Motion Imagery (or WAMI)?
I believe I first found out about Gorgon Stare by reading about it in 2013. At that point the first iteration of Gorgon Stare had been deployed and there was a little bit of press around this totally unprecedented technology. From that point onward, I couldn’t stop thinking about it. When you study drones, you spend a lot of time looking at really impressive, futuristic, and at times troubling technologies. But nothing kept me up at night the way Gorgon Stare did. There was just something so formidable about it.
I spent about two years thinking about the technology and all the things it would mean for society. Then it struck me: This is an urgent topic and I should probably write about it. That was in November of 2015 and I started on the book a few months after that.
So what is Gorgon Stare and WAMI technology, and how does it work?
Think of a traditional camera on a drone as a high-powered telescope. What it’s really good for is zooming in on things on the ground very closely. The downside is that you can really only watch one person or vehicle at a time. Maybe something important is happening a few blocks away or on the other side of the city. If you focus on just one target, you are going to miss all the other important things that happened around it; you are going to lose all the context. I’m talking about cameras aboard military drones that fly at 25,000 feet, by the way.
What Wide-Area Motion Imagery does is expand the aperture. You can watch an entire city at once and zoom in on any one part of the imagery with a decent amount of detail, while still recording everything else. To do that is a tremendous technological leap, because you need an incredibly powerful camera. And that’s the other thing that sets them apart. They are tremendously powerful.
What does it look like?
Generally they are larger than the standard cameras on drones. Some of the original iterations were essentially a number of cameras bolted together, and with a bit of software you could stitch those images together to create one single view. As the technology matured and became more refined, you could have one very large digital camera with multiple lenses that captures a very wide-angle view of the ground at high resolution.
To put this in context, an iPhone camera has maybe twelve million pixels. Gorgon Stare has 1.8 billion pixels — that is, 1.8 gigapixels, 150 times more than an iPhone. And as it happens, the fundamental technology that enabled this was the camera chips in cell phones. Your cell phone has a little chip that sits behind the camera lens, and if you stitch a bunch of those together, then you’ve made a camera that can generate very high resolution images. The technology in your pocket right now is what enabled this all-seeing view of the ground.
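To make those numbers concrete, here is a minimal back-of-the-envelope sketch in Python. The pixel counts come from the figures above; the ground sample distance is an assumed value for illustration, not an actual Gorgon Stare specification.

```python
# Back-of-the-envelope numbers for a stitched wide-area camera.
# Figures are illustrative, drawn from the interview; the ground
# sample distance below is an assumption, not a Gorgon Stare spec.

PHONE_SENSOR_PIXELS = 12_000_000   # ~12 megapixels, a typical phone camera chip
WAMI_TOTAL_PIXELS = 1_800_000_000  # 1.8 gigapixels, the figure cited above

# How many phone-class chips would need to be stitched together?
chips_needed = WAMI_TOTAL_PIXELS / PHONE_SENSOR_PIXELS
print(f"Phone-class chips to stitch: {chips_needed:.0f}")  # -> 150

# If each composite pixel covered ~0.5 m of ground (an assumed ground
# sample distance from 25,000 feet), the total area watched would be:
GSD_M = 0.5
area_km2 = WAMI_TOTAL_PIXELS * GSD_M**2 / 1_000_000
print(f"Area covered at {GSD_M} m/pixel: {area_km2:.0f} km^2")  # -> 450
```

At an assumed half-meter per pixel, 1.8 gigapixels works out to roughly 450 square kilometers, which is why a single sensor can plausibly keep an entire city in frame at once.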
In Eyes in the Sky, you tell the story of how the 1998 blockbuster Enemy of the State was the initial inspiration for this technology.
In Enemy of the State there’s an imagined technology, a satellite that can watch people on the ground across vast areas. This was of course pure fantasy at the time. An engineer who works for the government saw the film in a theater and thought it would be quite incredible if the government could actually do that. That seed of inspiration precipitated a whole series of events and development projects that culminated with Gorgon Stare about ten years later. It took a while, but during that period the rate at which camera technology got more powerful outpaced Moore’s law, which predicts the rate at which computer chips become more powerful. It was this phenomenal jump in capability in a very short period of time, all driven from an initial seed of inspiration, a 1998 Will Smith blockbuster.
Setting aside the pure fantasy, why was it built? What is its practical purpose?
It was envisioned as a counterterrorism tool. It’s a way to find people, a way to find insurgents. It was seen as a way to potentially turn the tide at a point in the Iraq and Afghanistan wars when it became clear the counterinsurgency fight was going to be a lot more challenging than originally thought.
Do proponents feel like it did turn the tide?
The events on the ground speak for themselves. The wars haven’t ended. The counterinsurgency fight and the counterterrorism fight continue to this day. No single tool is going to win a war for you. However, everybody I spoke to described it as being transformative. The fact that it is still being used is a strong sign that it has some utility.
So it is currently deployed?
Yes, it is flying right now as we speak.
Do we know where it’s flying right now?
No. That information is classified. All operational information about Gorgon Stare is considered classified. Same goes for other WAMI systems.
How did we learn that it has been flown in previous theaters of war?
Because there have been selective disclosures over the years, in congressional reports, for example, and in Air Force budget documents. You can piece together these little bits of information that have been released. But if you were to ask the Air Force how Gorgon Stare is being used on this very day, you would not get a specific answer.
Do we at least know who is using it?
The Air Force owns Gorgon Stare.
But not NSA or the CIA?
We don’t know. All these groups work together very closely on intelligence matters. There is a lot of intelligence sharing, and a lot of intelligence comes from a whole range of different sources. All we know is that it is being used for counterterrorism missions.
Throughout the book, when people in the military talk about using a drone to find a suspect in a terror attack, they use a euphemism: they say they’d “go knock on his door.” It’s a remarkably revealing allusion to a far simpler way of doing things. It seems like, as with so much about the war on terror, Gorgon Stare does not fit into a larger goal.
There’s a difference between tactics and strategy. A tool can be very effective in finding who you are looking for — that’s tactics. But whether finding a particular person, apprehending them, killing them, or turning them is good strategy in the long run is a separate question for a separate group of people.
Yet this truly powerful tool doesn’t seem to have changed anything in terms of turning the tide. We are about to sign a peace treaty with the Taliban in Afghanistan.
There are a whole range of incredible tools at the DOD’s disposal and that just speaks to the complexity of the situation on the ground.
What difficulties does that bring up for you since you are writing about the tool without writing about the larger picture?
I did not want to wade into the debate about strategy, because I was interested in a story about a technology that will have ramifications for all of us. This is a story that began squarely in the military space but does not end in the military space. It is a story that ends here, over our heads. That’s what mattered to me about this story.
Should we even be using this technology on people even if they are insurgents, let alone bringing it stateside?
Well as a proponent of the technology might say, if you have the capability to prevent a terrorist attack it is incumbent upon you to use that capability to do so. Gorgon Stare falls under that logic. When you are so focused on the mission at hand, it can be difficult to see the larger picture.
That being said, everybody I interviewed who had been involved in the development of the technology was very realistic about the fact that they had created a very dangerous tool, that it could be misused, and that there were privacy concerns. I was very surprised that many of them brought up the privacy concerns before I had a chance to ask. It’s front and center in their minds.
[But when it was being developed] they had a very specific and singular focus: save lives on the ground. Prevent U.S. soldiers from getting blown up by IEDs. For them, that trumped everything. Their perspective is that many of the other concerns can be dealt with — that you will weigh the benefits of the technology against its real and perceived costs, and that you can make that balance work. Some of that thinking is potentially very misguided, and that’s why some of the dangers of the technology are truly frightening and urgent.
What are the dangers?
It’s a way of seeing everybody all the time. Fundamental to liberal democracy is the ability to have sacrosanct private spaces. That is where the life of civil society exists. It is where our own personal lives exist, where we are able to pursue our dreams and passions. And it is often where we hold power to account. When you uncover those spaces, you fundamentally put all of those things at risk.
Gorgon Stare also creates a real and tangible fear. The thought that you are being observed from the sky will have a direct impact on your behavior. It will directly impinge upon your desires and your decisions when you enter into those spaces. One has to ask whether we want to live in a society where people are scared to organize around causes, or where they are scared to associate with their peers in a religious or political context. There is a real threat that needs to be taken seriously.
When was it first used stateside in secret government experiments, and when was it used stateside with public knowledge?
The first experimental use of the technology began at the very start of this story. As soon as the first iterations of the technology were created, they were tested in the U.S., in California specifically. In one of the earliest tests they flew over a gas station in the San Fernando Valley. Then there were tests in San Diego, and a whole range of other cities. A lot of the companies that develop the technology are based in the United States, so it was very natural that they would conduct their testing here.
The story of its domestic use for active operations is almost just as old. In 2008 there were already live operational tests over football games, NASCAR races, and rallies. Shortly after that there were pilot programs in a number of U.S. cities. A few companies emerged offering surveillance services. One of the most extensive programs that we have seen to date was in Baltimore in 2016. I visited Baltimore during that operation and was astounded seeing first-hand how the technology was being used on the people of Baltimore without their knowledge.
How was it being used on the people of Baltimore?
It was being used for investigative law enforcement. Trying to solve “unsolvable crimes” is how its mandate put it.
When you watched it, did you feel uncomfortable?
Yes, I felt deeply uncomfortable.
Ross McNutt and his company, Persistent Surveillance Systems, were in charge of the program. How did he defend it?
He said that it’s not illegal. He said all of the perceived dangers are accounted for by the company’s privacy policies. For example, the policy prevents a rogue agent from tracking their spouse across the city. It was a thorough policy and, to his mind, that was enough.
This program was funded by philanthropist John D. Arnold. Can you talk about his motives?
He funds a whole range of different issues. One of his key areas of interest is in technologies that can be used for law enforcement and for keeping cities safe. He focuses on high risk technologies, things that are untested and on the frontier.
If you were to lay it out, how would you want it to look, versus how John D. Arnold or Ross McNutt want it to look?
I would be hesitant to pit myself against their views. My philosophy is that we often have much more in common than we don’t. We all want to have safe cities. No one wants people’s personal privacy to be intruded upon in egregious ways. But there are significant differences of opinion as to the best ways to achieve those goals. One of the core principles that I believe is important is transparency. If a city is surveilling its people, it needs to be honest about it. WAMI watches everybody, so it’s everybody’s business.
The rules for the technology need to be a result of a discussion involving multiple stakeholders: the people who will be watched, the people whose job it is to protect those residents, the people who make the technology, government oversight groups, and civil-society organizations. Everyone should come together and have a discussion and the details should come out of that process. If the process is managed properly, then we should have every reason to believe that the technology’s many perils can be held in check while its promise can benefit all of us. It shouldn’t be one person who just comes up with the smartest solution.
A peril that comes to mind is the use of this technology against the vulnerable: that even the best processes will not be able to protect them.
Managing surveillance technology is going to be at the heart of the struggle for the future of democracy. We cannot have a democracy where everybody is watched all the time, in high resolution, through multiple formats. [But] you also cannot stop technological progress. You cannot fully block the logic that says if there’s a way to stop something — a terrorist attack or a crime that we agree is bad for society — that the effort should be made to stop it. The challenge is in drawing those lines. Who should be protected? Who shouldn’t be protected? When is use fair? When is it not fair?
Part of the question will be whether the structures in place to protect the vulnerable will continue to hold up in the face of technological change. There will be those who say that the protections in the Bill of Rights and Constitution do enough to hold all of these technologies in check. There will also be those who say no, these tools raise completely unfamiliar questions and as a result new rules will need to be written.
One of the dangers of the technology is that the question of who counts as a legitimate target of surveillance is purely subjective. A legitimate target to you might not be a legitimate target to the next person. The structures and rules that protect people from unwarranted surveillance are necessary and that’s why these rules are used every single day. You cannot just go and tap anybody’s phone just because you think they are a legitimate target of surveillance. There are abuses, and those abuses are the reason why we need processes for keeping surveillance in check. The process is not a static thing; it is a living discussion, which has to change as the technology evolves.
Do you have any hope for that happening considering how agencies like the NSA have been empowered, or America’s conservative judiciary where a lot of these questions are going to be dealt with?
Democracy has existed for two thousand years and it has adapted itself to all kinds of technologies that posed a fundamental threat. It wasn’t all that long after the introduction of wiretapping that we collectively decided there’s a line that must not be crossed. If it has happened with previous tectonic technological shifts, then we have some reason to believe it can happen again.
We are certainly living in a remarkable time, though. It’s not just WAMI that’s watching us in new ways. It is also social media monitoring systems, big data analytics, license plate readers, ground cameras, facial recognition. It is an extensive list. Artificial intelligence is a theme running through all of those technologies, and it is at the base of many of the challenges that we will have to confront.
Even with a conservative government like ours, it’s easy to foresee mass use of WAMI in the U.S. coming not from a government spy agency, but from a private corporation. For instance, Amazon sets up a blimp with a WAMI camera over a major American city. Or, a customer buys a toothbrush on their phone, and with the use of WAMI a drone is able to deliver that toothbrush within the hour. While there might be protests, people are probably going to be more willing to go along with it. How do we warn them about the risks of data collection?
Get this: Amazon has a patent for a system to analyze the video footage of private properties collected by its delivery drones and then feed that analysis into its product recommendation algorithm. You order an iPad case, a drone comes to your home and delivers it. While delivering this package the drone’s computer vision system picks up that the trees in your backyard look unhealthy, which is fed into the system, and then you get a recommendation for tree fertilizer. There is tremendous value in the data that can be collected from the sky and people will seek to take advantage of that data.
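As a purely hypothetical illustration of the data flow that patent describes (delivery footage analyzed, observations mapped to product suggestions), here is a minimal Python sketch. Every name and threshold in it is invented for illustration; none of it corresponds to any real Amazon system or API.

```python
# Hypothetical sketch of the patent's described data flow. All names
# and thresholds here are invented placeholders, not a real system.

def classify_frame(frame: dict) -> list[str]:
    """Stand-in for a computer-vision pass over one frame of
    delivery footage; returns observation tags."""
    tags = []
    if frame.get("tree_health_score", 1.0) < 0.5:  # assumed classifier output
        tags.append("unhealthy_trees")
    return tags

# Observation tag -> product suggestion (illustrative mapping).
OBSERVATION_TO_PRODUCT = {
    "unhealthy_trees": "tree fertilizer",
}

def recommendations_from_delivery(frames: list[dict]) -> set[str]:
    """Feed footage observations into a toy recommendation step."""
    products = set()
    for frame in frames:
        for tag in classify_frame(frame):
            if tag in OBSERVATION_TO_PRODUCT:
                products.add(OBSERVATION_TO_PRODUCT[tag])
    return products

# Example: footage from one drop-off flags the backyard trees.
footage = [{"tree_health_score": 0.3}, {"tree_health_score": 0.9}]
print(recommendations_from_delivery(footage))  # -> {'tree fertilizer'}
```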
However, nobody likes being watched from above. There is a profound human resistance to it. A resistance as old as time: if you go back to Greek mythology, there was already a very clear fear of the sky and things that inhabit the sky. We may just say, “No, we don’t want it.” This is a very real possibility. Let’s just hope that it is a reasonable discussion, and that the functions used to protect privacy in the past kick in again. I think it’s needed now more than ever before.
Did the creators of WAMI understand the vast power of what they were building and how it could change everything?
The men who built this technology — and let’s not mince words: they were predominantly men — have generally never been the victims of unwarranted surveillance. They have not been subject to egregious intrusions upon their privacy, so naturally they lack that perspective. But they are very conscious that they have created something dangerous. They are not unaware, and it troubles them. They will not go so far as to apologize for it, but they are not going to tell you there’s no need to do anything. They are realistic about it, but they maybe don’t respond to those dangers in a visceral way that is grounded in personal experience.
Technology is usually described as neutral, but it is hard not to see some of our worst tendencies as humans baked into WAMI. The inspiration for it was a paranoid thriller about surveillance! Can you really even describe this technology as ‘neutral’?
There is a very good reason that when anybody hears about WAMI for the first time, they feel a universal emotion: fear. There’s a reason we fear the technology now just as we have always feared observation from above. That fact calls into question this notion that technology is neutral.
Maybe there is something baked into it. Would you call the atomic bomb neutral? Would you call the smallpox vaccine neutral? It’s only neutral if it exists in a total vacuum, but it doesn’t.
What I want, and my reason for writing the book, is for there to be a discussion. This technology could be dangerous, but it could also be beneficial. However, in all likelihood, if we don’t talk about it, WAMI will be more dangerous than beneficial. If we talk about it, given that we generally share a common goal of balancing privacy against safety, and of maintaining and protecting the core structures of democracy, we’ll land on the right side.
* * *
Sam Jaffe Goldstein is a bookseller in Brooklyn, New York.
Editor: Dana Snitzky