7/5/19 Arthur Holland Michel on the Government Surveillance Program Watching Us from the Sky


Scott interviews Arthur Holland Michel about his new book on the emerging drone surveillance technology capable of watching entire cities at once. The technology was first developed to help prevent IED attacks on American soldiers in Iraq, but researchers and intelligence officers soon realized they could apply the same technology to domestic surveillance. Today several American cities already deploy this terrifyingly powerful system against their own citizens.


Arthur Holland Michel is a journalist, researcher, and the co-director of the Center for the Study of Drones at Bard College. He is the author of Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All. Follow him on Twitter @WriteArthur.

This episode of the Scott Horton Show is sponsored by: Kesslyn Runs, by Charles Featherstone; NoDev NoOps NoIT, by Hussein Badakhchani; The War State, by Mike Swanson; WallStreetWindow.com; Tom Woods’ Liberty Classroom; ExpandDesigns.com/Scott; and LibertyStickers.com.

Donate to the show through Patreon, PayPal, or Bitcoin: 1KGye7S3pk7XXJT6TzrbFephGDbdhYznTa.


Sorry, I'm late.
I had to stop by the Wax Museum again and give the finger to FDR.
We know Al-Qaeda, Zawahiri, is supporting the opposition in Syria.
Are we supporting Al-Qaeda in Syria?
It's a proud day for America.
And by God, we've kicked Vietnam syndrome once and for all.
Thank you very, very much.
I say it, I say it again, you've been had.
You've been took.
You've been hoodwinked.
These witnesses are trying to simply deny things that just about everybody else accepts as fact.
We came, we saw, he died.
We ain't killing they army, but we killing them.
We be on CNN like Say Our Name been saying, say it three times.
The meeting of the largest armies in the history of the world.
Then there's going to be an invasion.
All right, you guys, introducing Arthur Holland Michel.
He is a journalist, researcher, and the co-director of the Center for the Study of Drones at Bard College.
He's written for Wired, Al Jazeera America, Vice, U.S. News, etc., etc.
And he's got this new book out, which you guys have got to read for your dystopian nightmares and everything.
Plenty of ammo.
I'm sure you already have some.
Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All.
Welcome to the show.
How's it going?
Hi, Scott.
Good to be here.
Good to have you here.
You know, it's funny.
When I first heard of the Gorgon Stare, I think it probably would have been when they deployed it in Afghanistan, back 10 years ago or something like that.
I made a bumper sticker that said, here in a few years, your local sheriff will have a Reaper drone, and you still won't make the connection between your wars and lost liberty.
But, I don't know.
I guess you're a little bit ahead of your time in that regard, then.
Well, I'll tell you, I mean, that's the story of all this.
There's a book by Trevor Thrall about what they call the boomerang effect.
In fact, James Madison, the principal author of the U.S. Constitution, said it's the instruments of war that are most often used as the instruments of tyranny at home.
And, of course, that's absolutely the case.
So, the Gorgon Stare, the most advanced aerial surveillance system that anyone could have ever dreamed of, and it works, essentially, is what this book is about, correct?
Yes, absolutely.
I mean, as far as we know, in terms of its use in foreign wars, we have very little information because it's mostly classified.
It's used generally in sort of highly secret counterterrorism operations.
But it continues to be used more than 10 years after the first deployment of this type of system, and there is continuing investment, all of which suggests, reading between the lines, that it continues to be valuable and that it's not going away anytime soon.
Yeah.
Well, and you know what?
Even back then, the idea is so obvious, right?
A missing child?
What are you going to do?
Not send drones up?
If there's a missing child, of course you are.
Absolutely.
If it's a tool and it's available, worst case scenario, plug in variable X, bad thing happens, well, we have to suspend one more right of yours, one more ability of yours to just go about your day without every bit of your action being on your permanent record, sacrificed in the name of whatever emergency.
I mean, even the original concept for this technology, which just to explain a little bit, it's a giant camera capable of watching an entire city all at once in a way that is pretty much unfathomable just a few years earlier.
The initial concept with that was to actually try and prevent attacks with improvised explosive devices, suicide bombers and car bombs, things of that nature, which were really wreaking havoc on US service members and coalition service members and also civilians in Iraq.
And that was, you know, it was a pretty noble goal.
The people who are developing the technology really did believe that the sooner they got this technology into the battlefield, the sooner they'd be able to help save people's lives.
And even though the technology is very classified, everybody has indicated that it did make a difference in that respect.
I mean, there are US service members who returned from the wars alive because the technology was able to save those lives.
It was able to prevent IED attacks and help capture people who had been involved in those attacks.
I mean, in a way, that feels like quite a pure and narrow vision for doing good in the world with this particular technology.
But as you mentioned earlier, things aren't always so black and white, right?
Well, and even then, I mean, that's nonsense, right?
That's all war through a soda straw from the point of view of the guys selling drones and flying drones.
And I'm sure they have anecdotes.
But if you look at Kill Chain by Andrew Cockburn or Gareth Porter's reporting about the drone surveillance and link analysis in Afghanistan, a lot of this is in my book about Afghanistan.
It was all a wash.
They're killing innocent people all night long in these night raids.
When they were bragging and pretending it was a success, they were saying that they think they were going to the right house half the time.
And as you talk about numerous times in the book, as they're showing you the footage, you start making up stories about, wow, I think maybe this guy looks like he might have been planning something with that other guy.
That's exactly what it is.
It's data, but it's not really information.
It's true.
You're referring to this moment when I was operating one of these cameras alongside an engineer who helped develop the technology over Albuquerque, New Mexico.
We were flying over the city, surveilling the people of Albuquerque.
Just for test purposes, we weren't doing anything nefarious with that information.
And I started following random cars around from 10,000 feet in the sky.
And it's very easy to project stories onto what is unfolding below.
In a funny way, in that context, as seen from above, everything looks suspicious.
But by the same token, nothing looks suspicious because when I was in Baltimore where the police were using this technology over the city, I was shown footage from the parts of the city where a higher number of crimes were generally observed.
And then shown the footage of parts of the city where very few crimes were committed.
And from above, those two neighborhoods looked exactly the same.
There's something kind of equalizing about it in that sense.
From the point of view of a cop or a soldier, that just means that the nice part of town is just as guilty as the poor part of town to them.
Not the other way around.
It's very much in the context of who's operating it.
That's why you have to be very careful about it.
In this case, we're talking about government employees.
Law enforcement and soldiers are the ones looking.
It's their tool, it's their hammer, so it's their perspective that counts.
They have this very singular focus on catching the bad guy, to use their terminology.
And they can be relentless in that regard.
And when I speak to these guys, those issues that you alluded to are very real.
But it was very interesting speaking to these guys.
They have this very firm belief that this is the right thing to do.
Because if you have the capacity through technological means to prevent, as you put it, an Amber Alert, a child who gets abducted or a murder from happening or to round up someone who has committed such a crime, it's sort of incumbent upon you to do that.
But if only it was so simple, because it is this truly double-edged sword where one, even in trying to serve that, in theory, noble goal, people get carried away, people get misled, people see things in the data that may not actually be there.
Or the technology itself can just lend itself to true abuse that is sort of unapologetically abusive.
You could imagine the use of the technology for tracking peaceful protesters, for example.
And that's where it gets particularly dangerous.
So let's get back to the wars and the invention of this thing and how it all works.
You do a great write-up in the first section of the book.
We don't have to get into all the details of the different companies and labs and everything.
But this is a massive effort on the part of the government to try to come up with a way to protect infantry in Iraq War II from IEDs, is how it really started.
And just one more on that last point, I guess, on this.
In Kill Chain by Andrew Cockburn, he really shows how, yeah, sure, tactically this works.
We follow this guy home, and then our troops went and got him, or whatever.
You talk about that in the book, as well.
But all that meant, just like with taking out gangsters in, say, for example, the drug markets in America: you take out the top guy, you just get a more violent lieutenant who comes and takes his place, and casualties actually go up.
Things like this.
So, tactically, you can say it works.
But then, as you say in the book, they refuse to even answer you about whether it was useful at all, strategically speaking.
Did this actually help us put a dent in the Taliban insurgency?
No, of course not.
They rule 60% of the country right now.
Same thing for Iraq War II.
Yes, they saved some GIs' lives in some cases, but at the cost of others the next day.
So, it's at least a wash, maybe worse.
It comes down to the difference between tactics and strategy.
Individual technologies enable tactics, but a war is won by strategy.
So, it's hard to point to technologies as being sort of culprits in failures of strategy, when perhaps the failure of strategy comes from a failure of imagination, or it comes from a failure of grander view thinking, or looking for short-term gains at the cost of long-term results.
Like, hey, let's round up all the Sunni fighting-age males in this neighborhood.
That way, we won't have any more Sunni fighting-age males in this neighborhood for a minute, and that'll be nice and peaceful.
Except you just created 100 new insurgents, or more.
Good tactic for today, bad strategy for this week and this year.
That kind of thing.
That doesn't necessarily have to be tied up with technology at all, just military thinking.
Because if there was another technology that could do that, or if one could do that with really old-fashioned means, that would also be done.
I recognize the point.
There is a bit of a blurred line as to where the technology ends and strategy begins, and how to assign responsibility within those pieces.
It's all very complicated.
It's a great gray zone.
But the technology itself, I'm glad you asked.
It's a pretty fascinating story.
I'm sure many of your listeners are familiar with the movie Enemy of the State, from 1998, with Will Smith and Gene Hackman.
And this movie is about a rogue cell of officials within the National Security Agency who pursue Will Smith's character because he has some evidence that they want.
And they use all these different surveillance technologies, some of which were real back in 1998, and some of which were fictional.
But no technology they use is more formidable and frightening than this fictional surveillance satellite, which seems to be able to watch the entire eastern seaboard all at once.
And it can track Will Smith and Gene Hackman wherever they go.
Well, one night in 1998, a government engineer from the Lawrence Livermore National Lab saw that movie on a date night with his wife.
And whereas the other audience members were no doubt somewhat shocked by what they were seeing on screen, he saw this as a truly great idea.
He thought that the government would be able to do all sorts of wonderful things with a satellite that was able to stare down upon the earth and record wide area video imagery.
And so he rushed back to his house and left a message with his supervisor saying, I have a great idea.
And that kicked off this development program to basically build these very, very large cameras capable of doing exactly what that satellite does in Enemy of the State.
Now, ultimately, they realized it was too expensive to put the technology on a satellite.
So instead, they put these cameras on airplanes and helicopters.
But the effect is very, very similar.
And then because of this technology's potential in the sort of context of the war in Iraq and Afghanistan, a number of government agencies became very interested.
The CIA started throwing its weight behind the technology.
It was first deployed in 2006.
And then there were numerous iterations that became more and more powerful, culminating with the one that graces the cover of the book, Gorgon Stare, which is a huge system, sort of almost unfathomably large.
It can watch whole city-size areas at once.
And it continues to be used today.
And the kind of incredible thing is that if you put the imagery from that camera side by side with the fictional imagery from the movie, they look so similar.
I mean, it's almost a challenge to tell one from the other.
So it truly is a case of life imitating art.
And then, as far as, again, strategically, this, that, whatever.
But technologically speaking, it really does work.
I think the key here, as you explain in the book, is you can have a shot of the entire city at once.
It's flying high enough.
You can get the whole city at once.
But also they can zoom in on and focus and track all different targets all at the same time because of the number of cameras being used altogether.
But also because of all of the different techniques in the software; I think at one point you say one of the very early versions of this, was it Angel Fire or one of the others, was able to track 8,000 targets at once.
And so the number is much higher than that now.
Right.
Well, you know, there are tradeoffs between resolution and coverage area and, you know, the length of time that you're able to track a particular target.
Keeping in mind that while these cameras cover a very, very large area, the moment that one target exits one of the edges, you know, you've lost it.
But, yeah, you know, it is within the realm of reason that if you have an area with, you know, 8,000 vehicles, you could track all of them at once.
Now, that was the Constant Hawk; I said Angel Fire, but it's Constant Hawk.
Because you can cover such a large area, the tradeoff is that each vehicle or person in the footage really only appears as a couple of pixels, a dot on the screen.
But that's enough because all you want to do is track where they go.
And then if you want to get a closer look, try and potentially identify the type of vehicle or the person, then you are going to need another camera.
And so maybe you send those coordinates over to, you know, a drone with a more sort of telescopic style camera.
And that will give you a closer look.
But the other thing to be aware of is that while the cameras may indeed be capable of tracking thousands of vehicles simultaneously, that doesn't mean that, you know, you can actively watch all of those thousands of vehicles simultaneously.
As one of the engineers put it, it takes a million people to watch a million people.
And so if you want to track thousands of vehicles, I mean, you're going to need, you know, thousands of hours of human analysis watching these vehicles move through the footage.
Well, there is a solution to that, which is finally in a way coming to fruition as we speak now, which is the advent of automation and ultimately artificial intelligence to analyze this footage automatically.
And if you accomplish that, then you can do a number of things.
At the sort of most basic level, you could click on a vehicle of interest in the footage and you could say to the software, hey, show me everywhere this vehicle has been in the last few hours and everywhere it's going.
And how far back can you rewind?
I guess indefinitely, right?
Well, I mean, you can rewind as far back as the aircraft was flying.
So in other words, as long as they have the hard drive space, they could save up all year long or whatever they want.
Well, yeah, I mean, you know, in theory, in an area where there is a particularly intense interest in what's happening on the ground, you can have a rotation of drones, where when one drone starts running low on fuel after flying for, you know, 12 or 24 hours, then the next drone will come in and replace it, so that you get this single uninterrupted view of events on the ground.
So in theory, you know, you can track a vehicle over pretty extended periods of time.
That being said, if you're trying to track a vehicle, you know, over days or weeks, that's probably because it's a very high value target.
And so you'd probably dedicate some other form of intelligence collection device to watch it at all times.
I think this technology is more useful for seeing, say, in a given day, where a vehicle went, and that will give you leads for other intelligence activities. So the software can do that.
It can tell you everywhere a vehicle has been and where it's going.
You add another layer of complexity to that: you say, okay, computer, once you've tracked this vehicle to a particular location, then you can flag that location and show me all the other vehicles that come to that location as well, because if the initial vehicle was, you know, associated with a group, then other vehicles that come to that same location may likely also have some involvement in that group.
And so now you start building this sort of ever-expanding picture of what the network looks like, in a way that requires far less human analysis time.
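To make that concrete, here is a minimal, hypothetical Python sketch of the kind of analysis being described: pull up one track's history, flag a location it visited, and list every other track that passes near that flagged spot. The data model (per-frame detections with track IDs and coordinates), the function names, and the numbers are all illustrative assumptions, not anything drawn from the actual Gorgon Stare software.

```python
# Hypothetical sketch of "show me everywhere this vehicle has been" plus
# expanding a network outward from a flagged location. The data model is an
# assumption: each detection is (track_id, time, x, y) from an upstream tracker.
from collections import defaultdict

def track_history(detections, track_id):
    """Return the time-ordered path of one tracked vehicle."""
    return sorted((t, x, y) for tid, t, x, y in detections if tid == track_id)

def visitors_near(detections, location, radius, exclude=None):
    """Find every other track that passes within `radius` of a flagged location."""
    lx, ly = location
    visitors = defaultdict(list)
    for tid, t, x, y in detections:
        if tid == exclude:
            continue
        if (x - lx) ** 2 + (y - ly) ** 2 <= radius ** 2:
            visitors[tid].append(t)
    return visitors  # candidate associates; each still needs human review

# Toy usage: follow vehicle 42, flag its last stop, then expand the network.
detections = [
    (42, 0, 10.0, 10.0), (42, 5, 50.0, 80.0),  # the vehicle of interest
    (7, 6, 51.0, 79.0),                         # another vehicle at the same spot
    (9, 2, 400.0, 400.0),                       # unrelated traffic
]
path = track_history(detections, 42)
flagged_stop = path[-1][1:]  # (x, y) of its last observed position
associates = visitors_near(detections, flagged_stop, radius=5.0, exclude=42)
print(path, dict(associates))
```

The point of the sketch is only that, once the tracking itself is automated, the network-building step becomes a simple query over stored tracks rather than hours of a human scrubbing through footage.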
But then it gets even more complicated, Scott, and the capabilities get even more impressive.
Because, say, you don't want to have to start your investigation already knowing that a particular vehicle is suspicious. The way they originally would do it is that they would only identify suspicious vehicles, because they could see that that vehicle had been involved in an ambush or a bombing attack that had already happened.
Right.
So they rewind the footage, see the bomb go off in the footage, and then they see the people who planted it, and then they start tracking those people.
But, you know, in theory, it's better to prevent the bomb from going off before it happens than to sort of start your investigation after the fact.
Well, what they found is that a lot of the vehicles associated with these attacks exhibited the same behavior in the lead up to these activities.
Right.
They would drive around aimlessly.
It's what they call the signature in the signature strike, which sort of sounds like, oh, it requires Obama's signature to say no.
It means these guys did jumping jacks at seven thirty in the morning, which is the same thing that a guy that we killed before did.
And we had concluded he was a terrorist.
So these guys must be terrorists, too.
We kill people who do jumping jacks.
And that's a real story, too.
I didn't make that one up.
The jumping jacks is one of their things or driving a pickup truck, having a rifle, you know, guilty stuff like that.
Stuff that'll get you killed in Afghanistan and get you arrested around here, I guess.
In theory, if you can identify these behaviors prior to an attack, then you can sort of prevent the attack.
That's the theory.
Right.
And so the sort of cutting edge work now is to develop algorithms that will flag certain vehicles or individuals that are exhibiting behavior that could indicate participation in one of these activities.
And so the things they look for are, you know, the U-turn, or driving and meeting up with another vehicle in a remote area, or any number of things.
Now, because lots of vehicles do U-turns for completely innocent reasons, they need to see a number of these behaviors happening.
And it does make sense, right?
With a large enough data set, you can say, you know what?
Hit men really do follow these few patterns that seem to, you know, work, not conclusively, but as, as you say, a cause for suspicion, something like that.
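As a toy illustration of that "several behaviors before a flag" idea, here is a short, hypothetical Python sketch. The behavior names, weights, and threshold are invented for illustration; a real system would have to detect these patterns from the track geometry itself, and nothing here is taken from any deployed software.

```python
# Toy version of the rule described above: a single innocent U-turn never trips
# the flag, but several suspicious behaviors together do. All values invented.
SUSPICIOUS_WEIGHTS = {
    "u_turn": 1,
    "aimless_driving": 1,
    "remote_meetup": 2,  # meeting another vehicle in a remote area
}
FLAG_THRESHOLD = 3

def should_flag(observed_behaviors):
    """Flag a track only when enough independent behaviors accumulate."""
    score = sum(SUSPICIOUS_WEIGHTS.get(b, 0) for b in observed_behaviors)
    return score >= FLAG_THRESHOLD

print(should_flag(["u_turn"]))                                       # False
print(should_flag(["u_turn", "aimless_driving", "remote_meetup"]))   # True
```

The obvious catch, which the lost-driving story a little later in the conversation makes vivid, is that an innocent driver can rack up exactly the same behaviors.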
And in theory, I mean, you know, you mentioned the signature strikes; those are a very controversial tactic where, you know, you don't know a person's identity, but you sort of conclude that they participate in a group because of their actions.
And again, we know from The Assassination Complex by Jeremy Scahill and his associates at the Intercept, in that book, and because of a heroic leaker, apparently it's the guy who's now being prosecuted, by the way, that this didn't work.
They're killing innocent people all day long.
And following pixels around from the air is no substitute for having any idea what you're really doing.
Not that these guys can't convince themselves otherwise as they pull the trigger all day long.
So in theory, you know, you have these suspicious behaviors, at least in this footage, because, again, this isn't drone footage where you can see everything; you can't tell if a person is doing jumping jacks in this footage.
I mean, the level of detail in the wide area imagery is much lower.
In theory, once you have flagged an individual, then you do a level of follow-up that, you know, in theory, again, will give you a higher confidence sort of reading as to this person's intentions.
And that is sort of this Holy Grail in intelligence.
I mean, if you can truly prevent attacks before they happen, with a high enough level of confidence that you're not just killing everybody who does a U-turn, because obviously that is unsustainable, then you're really getting towards this sort of Holy Grail, as I put it, of intelligence collection.
I mean, obviously, there are huge technological hurdles involved, but then things start to get a little troubling when you consider that in the domestic context, because, as you mentioned earlier, you know, military technologies often have a tendency to find their way into the civilian domain.
And so you could imagine that in a city where we know that hitmen are active, you could potentially deploy one of these systems and identify the hitmen before they execute their job.
And in a way, again, speaking in theory, there is a certain appeal to that, you know, if you can identify, you know, people who are planning to abduct a child before it happens, you'd want to do that.
But at the same time, you know, I'll give you a story.
And on my way to these offices, I got lost.
My phone, which I was using for directions, Google Maps, ran out of battery.
And I had to do these U-turns.
And I drove around aimlessly for a little while.
And I sort of changed lanes on the highway pretty erratically to get across the highway to get into the right exit.
And while these engineers were describing the system to me, it occurred to me that the way I had been driving matched a lot of these suspicious patterns that they were trying to detect automatically with these algorithms.
And so I said to them, you know, I told them what had happened.
And they said, oh, yeah, that is exactly the kind of thing that would get you flagged, which raises this question.
You know, do we want to live in a world where we can be flagged for activities that, while often potentially associated with dangerous acts and criminal acts, may also just be innocent, or may just be, you know, the exercise of rights, you know, your freedom?
When you also have a computer deciding what's a reasonable suspicion or not based on some algorithm instead of any kind of reality, too.
Well, in theory, the computer then passes that information on to a human.
Right.
Who says, hey, the computer says it's suspicious.
So pull him over.
Right.
Right.
Well, you know, exactly.
So that's where the human element comes in.
I mean, everybody I've spoken to both proponents of the technology and people who are against it talk about this idea of the human element.
I mean, that is you know, that is so crucial.
You have to rely on a human.
And not only do you have to rely on a human, you have to rely on a human who is subject to very, very strict rules and an inflexible ethical code, because otherwise these things do begin to fall apart very, very, very, very quickly.
Of course, I mean, anybody could figure that, hey, if they could figure out how hit men drive on the way to do hits or whatever, in a way that's effective at all, then they can do the same thing for a teenager going to buy a bag of weed who, you know, two people got in the car, but only one person got out and then he got back in 10 minutes later.
So it must have been an illegal drug transaction.
Go and get him.
And you know, for kids listening, you might not realize, but when I was a kid in, say, the 1980s, even in the early 90s, you could go all the way to 7-Eleven and back and it wasn't on your permanent record.
You might spend the whole day at the lake.
Hell, you might drive all the way down to South Padre Island and go swim in the ocean or, well, the Gulf, and then come all the way back to the center of Texas again.
And it was nobody's business but yours.
And nobody wrote it down.
Nobody photographed you.
And not that it hurts necessarily to be filmed at 7-Eleven, but we're just living in an entirely different world than that now.
Sure.
And so that goes to the final part of what I talk about in the book, which is that we have this tremendous technological capability that's able to do very, very positive things, but also very, very negative things.
And the only way that we can balance the two is through really substantial regulation.
You know, you have to create rules that will prevent the kind of abuses that you're talking about.
Because if you create, for example, a very strict warrant requirement for investigations with this technology, then that will, in theory, protect the teenager who's just heading to the 7-Eleven for completely innocent reasons.
Because that will not pass muster with a judge to grant a special investigations warrant.
The reality right now is that these regulations do not exist.
A lot of what we have described in this conversation in terms of just the simple act of monitoring a city from above with a very large video surveillance system is legal.
There are no rules that prevent even a private citizen from doing that.
That is because the technology has gotten very much ahead of the rules that are supposed to protect our privacy.
Because there are privacy rules, but they're becoming quickly outdated.
And essentially, as you're saying here, they never had to pass a law saying that they could do this.
It's just that there's nothing preventing them.
It's sort of like photographing you on a public government-owned sidewalk or street, essentially.
So, because law enforcement is so relentless in its work, it will sometimes operate just as much according to what the law doesn't say, what is omitted from the law, as according to what the law does say.
So in this case, it's operating according to an omission in the law.
There is nothing in the law that says you cannot use a wide-area aerial surveillance system to track vehicles for extended periods of time.
That being said, there is a recent Supreme Court case, the Carpenter case, that does have some bearing on this technology.
It basically said that law enforcement could not access historical location records from a person's cell phone without a warrant.
Now, this technology also creates historical location records.
So you could imagine the use of this technology being challenged in a court of law in a similar way and the court reaching a similar verdict based on that precedent set by the Supreme Court.
But the fact of the matter is we really need to act because in the absence of action, the technology will continue to proliferate.
It will be used in all sorts of ways.
While it will achieve some positive results, it will also have some very dangerous sort of side effects, if you will, that everybody, including the people who are the most fierce proponents of the technology, agree are things that should be prevented.
I mean, even the most sort of vocal people that are pushing the use of this technology for domestic law enforcement really want there to be regulations that will prevent, I don't know, police departments from tracking peaceful protesters, or prevent rogue law enforcement officers from tracking their wives to make sure they're not doing anything.
Because if things like that happen, then that sort of hurts everybody.
I mean, it hurts society.
It hurts our rights.
And it also hurts these companies because the technology gets a well-deserved, at that point, bad reputation.
Well, and those kind of rogue abuses are actually the least of it, right?
Because it's the institutionalization of it all.
Like, for example, you talk about somebody takes a right turn into the far lane instead of the close one, this kind of thing.
I mean, you're talking about unlimited amounts of revenue, persecuting so-called citizens for offenses against the rules and regulations of the state, victimless crimes, but that are good for their coffers.
They have huge incentives.
Professionally, no personal corruption necessary whatsoever to do these things against us for themselves.
We're seeing a bit of a sea change in that regard, because society has shown that while it's willing to accept a certain level of government surveillance, there is a sort of threshold at which point society says, no, even though we understand the benefits, we consider this to be a step too far.
And so a good example of this is a number of local municipalities throughout the country in recent years have opted to take down red light cameras.
They actually just banned them in the entire state of Texas, which is unbelievable to me that that happened, but it did.
And that goes to exactly what you're saying, because on a certain level, we can all understand why we would want red light cameras.
We don't want cars running red lights.
It can be very dangerous, but there is a level of intrusion there that outweighs the potential benefits of the technology.
And it really highlights the perverse incentives of the governments, too, because all over the country we see where they shorten the time of the yellow lights so they can make more money.
They will drive your kid right out into traffic to get killed, make the intersection more dangerous deliberately for the dollars.
Over and over and over and over again, that's been documented.
So I think it's heartening to see that local citizens, I mean, we're not talking about, you know, sort of civil society in D.C., all these think tanks that, you know, come up with very, you know, sort of complex advocacy messaging against this technology.
We're seeing local citizens coming to terms with that balance and showing governments where those lines should be.
I think there is a much greater sensitivity to these issues now than we saw, for example, 10 years ago.
And that gives me great hope, because while everything that you've said about the potential abuses of this technology, and also the things that I point out, are very much true, we have a growing level of comprehension in this country for how to manage that tension, how to ensure that, you know, we're only using the one side of that double-edged sword.
And I think the optimism part is really crucial, because the people who are truly pessimistic about our prospects in our overwatched future are not the people who are going to their town council meetings to demand change, because they see that there is no way out.
They see that we are on this completely unstoppable path towards a truly panoptic society, whereas the optimists are saying, no, I can speak to my local representatives.
I can say that this red light camera or this aerial surveillance system is an undue intrusion upon my privacy and that they will listen.
And an incredible thing about this.
And you got me.
I mean, I just admit to you that I would have never believed that the state of Texas would have repealed that.
I certainly would not have participated in trying to bother them about it, because what would be the point?
And look how wrong I was about that.
Yeah.
And there have been in recent years a number of cities that have abandoned pilot programs or full programs to use this aerial surveillance technology because of public pushback.
So that happened in Miami, where it was revealed that the city was embarking on an extensive wide area aerial surveillance program.
And as soon as it was revealed, there were local citizens coming out against it.
There were local mayors coming out against it.
And the city responded to that in kind and said, you know, we're not going to pursue this.
You know, as one official put it, the people have spoken.
That also happened in Dayton.
And it also happened in Baltimore.
And I can imagine that there might be a city that when the local police department announces that this technology will be used, the residents say, OK, bring it on.
You know, and and that, too, is part of the democratic process.
If there has been a true, robust participation, which is a big if, because we know that a lot of people are not as politically engaged as perhaps they could be, at least very much so on a local level, then that is, again, the will of the people.
But the people don't have the right to vote away my rights.
Well, I mean, if they're all agreed.
Well, yes.
So there are certain inalienable rights.
I mean, you're not following me around with your police state all day long.
Sure.
So that's why, in addition to the town hall, we also have the courts, because, say, the town hall decides wholeheartedly that they want to pursue people who do U-turns in the middle of the night.
You then have the recourse to challenge that in the courts.
And if it is determined, as it very likely would be, that that violates some set of your rights as defined by the Constitution, the Fourth Amendment, the First Amendment most likely, then that will be challenged and struck down.
You know, I think that in theory we have this pretty robust structure in place, but it depends on very inclusive participation, because if you only have one group pushing things in one direction or the other, then you get this sort of lopsided result.
And that's why I call for local governments to participate, local citizens to participate, the companies that make this technology to participate, law enforcement, civil society, because if everyone's participating, then you get the closest to something that feels like a sort of consensus-driven approach.
And I think we should have some faith in that, because at the end of the day, everybody likes privacy.
Nobody wants their own personal privacy intruded upon.
That goes to the core of what it means to be human.
I mean, we all understand that at a very intuitive level.
And so that gives me some amount of hope, even though there are some truly frightening further technological evolutions that I talk about in the book that are coming down the pipeline as well.
So we'll have to see.
Yeah, I'm sorry, but I only got through chapter seven this morning.
I wasn't able to finish it.
Well, well, yeah, I mean, I was going to ask you, what's the worst part of the part I didn't get to yet?
Well, I'll have to keep that a secret because there's some pretty terrifying stuff in there.
But I'll tell you, this is a really important book.
And I'm really glad that you wrote it.
And it's an important part of what you're just describing there.
People absolutely need to be aware of this.
This is not just over Kandahar.
This is over Austin, Texas, over Lubbock, Texas, over Baltimore, Maryland.
As you said, it's already been deployed in Florida and California and all over the place.
And so the fight is on right now.
And before I let you go, I'm sorry, I have to ask you this.
Did I read you right?
And can you elaborate a little bit about how you know about this?
That they followed, successfully followed Abu Musab al-Zarqawi around Baghdad for 600 and something hours before they killed him in 2006.
Is that right?
Because that's weeks and weeks and weeks, right?
Yes.
That's numerous weeks.
So that was, you know, Zarqawi was a very senior al-Qaeda in Iraq leader.
And once he was identified through other sources, they had a rotation of Predator drones, not equipped with the kind of surveillance systems that I talk about in the book.
These are telescopic, sort of, you know, one frame for one person systems.
And they followed him for a great deal of time for a few reasons, ostensibly.
One was to figure out who he was, you know, who he was associated with, to, again, build out that network, you know, to understand how the group worked.
But also to understand his routine, you know, where he went, how he behaved at certain points of the day, when would be the most prudent time to either capture him or take him out with a strike.
Because he was such a high, high priority individual, they wanted to make absolutely sure that they had all the information they could have and that they sort of exercised the most surefire response.
And so they ultimately decided that they would conduct an airstrike, which they did with a couple of fighter jets.
And he was killed in that strike.
That is like the sort of peak instance of predator drones engaging in this persistent stare style operation.
And it's a truly remarkable story.
There have been some really good books about that.
There's a book called Hunter Killer, which talks a great deal about the Predator's role in the Zarqawi mission.
There's a big part of Stanley McChrystal's book about the hunt for Zarqawi.
And it's a truly remarkable story because it is that intersection of tactics, strategy and technology that you were speaking about earlier.
And Tony Camerino, too, a.k.a. Matthew Alexander, who got the location of Zarqawi in the first place by not torturing a guy, and instead choosing, you know, the FBI "be your best friend" method of interrogation.
So, yeah, I appreciate you helping me with my puzzle pieces here, you know, filling in the gaps in that story.
Yeah, absolutely.
And there's probably more that will be revealed as the years go by.
All right.
Well, listen, I really appreciate your time on the show.
And even though the interview is over, I'm going to finish this book.
It's really great.
Oh, yeah.
Well, I'm glad.
I'm really glad you're enjoying it.
Yeah.
Eyes in the Sky.
The Secret Rise of Gorgon Stare and How It Will Watch Us All, by Arthur Holland Michel.
Thank you very much again for your time.
Oh, thanks.
Thanks so much.

Listen to The Scott Horton Show