🎙️ Protecting Liberty in the Age of Surveillance – Transcript Sentinel Podcast E21

How a former Chief of the Office of Civil Liberties, Privacy and Transparency at the Office of the Director of National Intelligence thinks about privacy, protest, and the power of government data.

This speaker-specific transcript has been formatted for clarity and readability.

**Host:** Peter Mina
**Guest:** Alex Joel, Senior Project Director and Resident Adjunct Professor at American University Washington College of Law
**Topic:** National Security, Privacy, and the Privacy Act of 1974

**Peter Mina:** Good evening. I’m Peter Mina, a civil rights and federal employment law attorney, as well as a former Department of Homeland Security official who worked to integrate civil rights and civil liberties protections in the department’s national security programs. And you are listening to the Steady State Sentinel from the Steady State. We are facing an existential threat: growing autocracy in the United States. The Steady State Sentinel is a place where we and our distinguished guests use our national security expertise to discuss and analyze the decisions and acts of this administration that feed that autocratic slide and threaten to supplant the pillars of our constitutional democracy.

**Peter Mina:** Today, we’re talking with Alex Joel, the Senior Project Director and Resident Adjunct Professor at the American University Washington College of Law. Professor Joel leads the Privacy Across Borders Research Initiative, which is part of the law school’s tech law and security program. He is conducting research, developing programming, and teaching courses focused on the intersection between the law, national security, technology, and privacy. Prior to coming to American University, Professor Joel spent nearly two decades in the intelligence community. In 2002, he joined the Central Intelligence Agency’s Office of General Counsel. Subsequently, in 2005, he moved to the Office of the Director of National Intelligence, or ODNI, where he served as the civil liberties protection officer for 14 years, reporting directly to five different directors of national intelligence. And starting in 2015, he simultaneously served as the ODNI’s chief transparency officer. And with that, welcome to the show, Alex.

**Alex Joel:** Thank you, Peter. Thank you for having me.

**Peter Mina:** Thanks so much for coming. So why don’t we start off by having you talk to our audience a little bit about what your current work is at American University, and in particular, what the Privacy Across Borders initiative is.

**Alex Joel:** Yes, thanks. I’ve been teaching at American University at the law school for the last five years or so. I teach national security surveillance and secrecy, as well as issues relating to technology and privacy around the world. And with the Privacy Across Borders Research Initiative, we’re looking at data flows and the technology our societies and economies around the world depend on, focusing on cross-border data flows and issues of trust relating to how governments access personal data held by companies for national security or law enforcement purposes, as well as how AI is governed and used in ways that implicate national security and privacy at the same time.

**Peter Mina:** Well, those issues couldn’t be more salient than right now. And so I want to kind of start a little bit at your beginnings. What led you to the, you know, ultimately very distinguished career you had in intelligence and national security? And then, you know, within that, what led you to a particular focus on privacy, and then certainly civil liberties, particularly in your role at ODNI?

**Alex Joel:** Okay, well, we’re going to go back in time a little bit since you’ve asked for my beginnings. So I graduated from law school and I served four years as an Army JAG. And after I got out of the army, I joined a law firm and I did technology outsourcing work—sort of technology transactional work. Then I joined Marriott International in-house and I was their privacy and e-commerce and new business ventures lawyer for seven years. And I was super happy in that job; that was a great place to work.

And then 9/11 happened. That changed everything for me. With 9/11, I decided I needed to re-enter public service, and I was asking around, trying to figure out how I could best contribute. At the time, I was already very interested in the intersection of privacy and technology, and now in what we could do with national security. I was also concerned about civil liberties, and about making sure that our Muslim American friends weren’t being stigmatized, and those kinds of issues.

**Alex Joel:** A friend of mine who had been at the FBI for a few years strongly recommended that I apply to the CIA. So I did in October 2001, and I started at the CIA in 2002. I spent three years there, and a lot of the issues that I was dealing with at the agency were: How do we apply the rules and constraints that grew out of the Church Committee hearings in the 1970s that are embodied in Executive Order 12333? How do we apply those rules as well as deal with conflicting laws around the world that might impact the ability to obtain information to prevent another terrorist attack?

When they stood up the Office of the Director of National Intelligence—as your audience may know, that was in the Intelligence Reform and Terrorism Prevention Act of 2004—Director Negroponte was the first DNI. He was sworn in in April 2005. I was detailed over from the CIA to the ODNI in that first wave of lawyers to help stand it up. I was there in June of 2005, and I was named the civil liberties protection officer almost immediately on an interim basis, to fulfill that role while we were looking for a permanent person. Director Negroponte decided to make me the full-time civil liberties protection officer by the end of 2005.

I did it for 14 years. I had occasion to think about doing something different in the intelligence community, but I didn’t have an interest in that. I’ve always been both intellectually fascinated and sort of emotionally motivated and committed to this idea that we can do both. We must be able to do both. We must be able to protect the nation’s security and at the same time protect people’s privacy and civil liberties.

**Alex Joel:** I still believe that to be true. Yes, there are tensions between the two goals, obviously, but there are ways to enable our agencies to carry out their national security mission and at the same time constrain them from going too far to make sure that they don’t become themselves a threat to our freedoms and our democracy.

**Peter Mina:** Well, I actually want to ask you a little bit more about that, because I think that’s at the core of trying to make sense of what’s happening right now. How do you—and you use the word tension, and I’ve certainly heard that from others over the years—do you see it as sort of this push and pull? Or do you think, as I pick up from your answer, that these really important foundational concepts can actually coexist?

**Alex Joel:** I think they have to coexist. I mean, there are clearly potential trade-offs. Obviously, if I get an order to read somebody’s emails, that person has lost privacy to the government. Sure. But that’s where you have the internal and external checks and balances. The way I think about it—and I’ve done a lot of work with other countries as well and spoken to different oversight entities from around the world on how they think about this balance—a legal framework in a democracy has to do two things simultaneously and equally well.

The legal framework has to authorize the intelligence agencies to do things to protect national security. What kinds of things? Typically, intrusive things. And they are also intrusive measures that are taken in secret, because if it’s fully transparent, then the targets—the bad guys—will know what you’re doing and will try to avoid it. So, the way I put it is that a fully transparent intelligence service would be fully ineffective.

**Alex Joel:** So you’re doing secret intrusive things. And at the same time, that legal framework has to constrain the agencies from going too far. It has to do both. Early on in my tenure, there were a lot of arguments about the balance metaphor. Is this a balance where one outweighs the other? I was always a proponent of saying that the goal is to keep the scale balanced. You want to do things on the national security side of the scale, but you have to counterbalance it with things on the civil liberties protection side of the scale—including cutting back on some national security measures that aren’t really necessary.

**Alex Joel:** That’s been the metaphor that I’ve always felt comfortable with. But now, with so many changes in the way technology moves so quickly and world events change, I really come to feel like it’s always going to be dynamic. It’s not static. You never achieve a balance and say, “we’re done.” As things change, you have to adjust and it’s always going to be shifting in a constant dynamic motion. There will be periods where things seem to be out of balance, but as long as the system is trying to correct it, that’s the key.

**Peter Mina:** Makes complete sense. As we continue this conversation, I want to talk a little more with you about the civil liberties component. We’ll talk a lot about privacy, but again, given the moment that we’re in, there has been lots of discussion and litigation over the intersection of national security and civil liberties. For folks that don’t really understand sort of the legal framework around these issues, one of the foundational statutes in this area is the Privacy Act. I was wondering if you could give our listeners just a basic description of what is the Privacy Act, and what and who is it designed to protect?

**Alex Joel:** Yeah, the Privacy Act is an amazing statute. A lot of people feel like it’s outdated and needs to be amended, and I would agree it could use a refresher. But it was enacted in 1974, in the middle of press disclosures about Watergate and some of the intelligence abuses. In response to that, there was a landmark report by the U.S. Department of Health, Education and Welfare (HEW) that was very concerned about the centralization of data on Americans in the big mainframe computers that governments were operating.

Out of that report came what we call the Fair Information Practice Principles. This idea that people should know what the government has on them and should be able to access that information and correct it if necessary. Information should be used for the purpose the agency had when it collected the information. If I fill out a form for social security benefits, my expectation is that that’s exactly what the agency will do with it.

**Alex Joel:** Those concepts were embodied in the Privacy Act of 1974. It applies government-wide, though there are exemptions for classified information, national security, and law enforcement. For example, you can’t have a terrorist submit a request to the CIA and say, “Give me access to my file.” But at its core, it does a few things:
1. It requires agencies to publish “Systems of Records Notices” (SORNs)—the databases they have about Americans.
2. It requires them to process requests for access from the individuals covered by those records.
3. It requires them to explain what they’re using the information for and who has access to it (called “routine uses”).
4. It limits the ability of agency employees to access data; they must have a “need to know.”

We often criticize the United States for not having a comprehensive privacy law for the private sector, but we were way ahead of the rest of the world in having one that applies to the federal government.

**Peter Mina:** It’s interesting because there are a lot of parallels between 1974 and today. Here we are more than 50 years later talking about the aggregation of data—dating back to the beginning of this administration with DOGE and collecting social security data and sharing it with other agencies. Then questions about whether that data was also going to private entities like SpaceX or Palantir. What’s the problem with agencies sharing information? Why, in a post-9/11 world where siloing was a major criticism, should we be so concerned with that now?

**Alex Joel:** I’m glad you brought up 9/11. A huge issue then was the need to share information. The 9/11 Commission said the federal government failed to “connect the dots.” That idea—that disparate pieces of data in different databases could show a terrorist plot if combined—is very powerful and still a concern. You want relevant information accessible by people charged with protecting national security.

**Alex Joel:** But at the same time, the concern in the U.S. has always been the creation of a huge centralized database of Americans. That would give people with access enormous power for potential abuse and misuse. People expect that when they deal with an agency for benefits, that agency won’t then provide their information to all kinds of other agencies with different approaches. That’s why there’s a requirement that if you share data, it must be for a “compatible purpose” with the original collection.

Post-9/11, we went through enormous trouble to make sure we complied with the Privacy Act. There was no effort by the President or any agency I’m aware of to simply ignore or override those protections. It was a matter of how do we work *within* the constraints of the Privacy Act and other rules to share information better so we can prevent another attack.

**Peter Mina:** With that as a backdrop, and comparing it to the present moment, it seems almost as though the aggregation of data is being used as a threat to American citizens—almost like a cudgel to say, “Be careful, don’t make statements contrary to the mission of an agency like DHS.” There’s been talk of a database of First Amendment protesters, like those protesting against ICE. How do you unpack those issues in a way that makes sense to someone who doesn’t live this every day?

**Alex Joel:** Right. There are two sets of issues here. One is the right of people to peacefully protest. The First Amendment is one of the most fiercely protected rights in our democracy. In the 1970s, a main concern of the Church Committee was that intelligence agencies were turning their focus inward toward protesters in ways that would chill their rights. President Ford, and then Carter and Reagan, issued executive orders (culminating in EO 12333) designed to constrain the government from conducting domestic surveillance on its own people.

The Privacy Act specifically says you cannot maintain a record of an American based solely on their exercise of First Amendment rights. The idea that the government would be looking at everybody who goes to a protest is a huge issue given that history. Obviously, if there is violence or criminality, that is a legitimate reason for investigation. But if they’re looking up protesters in a centralized database just to find something on them to get them in trouble, that is an inappropriate “fishing expedition.”

**Peter Mina:** It seems that if the public believes the government is using information inconsistently with civil liberties, it diminishes confidence in the national security enterprise. How do you see the presence or lack of transparency impacting public trust—particularly with agencies like the NSA that used to be called “No Such Agency”?

**Alex Joel:** One of my titles was Chief Transparency Officer for the DNI. Transparency was a big push following the Snowden disclosures. I had always wanted more transparency because it’s difficult to gain public trust if you can’t explain what you’re doing and how you’re doing it. Following Snowden, we realized we needed to lean in. The intelligence community committed to the “Principles of Intelligence Transparency,” which are still on the DNI website.

We would meet with advocates, including organizations that were suing us. If you don’t have transparency, people are naturally suspicious. You should be proud of what you’re doing. What you keep secret are things that could hamper operations—sources and methods—but you shouldn’t be tempted to keep something secret just because you’re embarrassed about it. If you’re embarrassed, fix it.

**Peter Mina:** Or even if you’re thinking about it in advance—if you can’t explain it, maybe don’t do it.

**Alex Joel:** Right.

**Peter Mina:** Well, Alex, you’ve been so kind and generous with your time. It’s been incredibly educational. Is there anything you’d like to tell people you’re working on right now, and where can they find you?

**Alex Joel:** Look for me on LinkedIn. In a couple of weeks, I’ll be speaking at the Privacy Symposium in Venice on the national security and privacy implications of commercially available information in the age of AI. If you want to email me, my bio on the American University website has my address.

**Peter Mina:** Excellent. Thank you so much for joining us. If you like what you heard, please subscribe to the Steady State Sentinel wherever you get your podcasts and give us a five-star review on Google. Join us for our next episode so you can stay engaged and informed. Remember, protecting our democracy isn’t a spectator sport. This is Peter Mina for the Steady State Sentinel. Still standing watch.

**Announcer 1:** Thank you for listening to the Steady State Sentinel podcast. Don’t miss out on more insights and exposés from America’s premier global security experts.

**Announcer 2:** Also, subscribe to our Substack at [substack.com/@steadystate1](https://substack.com/@steadystate1) and follow our social media. Join us right here next week for another exciting edition.

**Peter Mina:** The Steady State is a nonprofit organization working to sustain our democracy and national security.

**Announcer 3:** Join us and support our mission by visiting www.thesteadystate.org.
