SAN FRANCISCO — It's been two years since the very public showdown between Apple and the FBI over encryption, in a legal battle that ultimately ended without much of a resolution. Now, a standoff in the “crypto wars” appears to again be looming.

In this excerpt from an interview on Law.com's “Unprecedented” podcast, Stanford Law cryptography fellow Riana Pfefferkorn talks about the Department of Justice's new push for “responsible encryption” and whether it could lead to new legislation. Listen to the full interview here, or subscribe to “Unprecedented” on your Apple or Android device.

If you're in the Bay Area, you can catch Pfefferkorn speak about the crypto wars on Wednesday night at an event hosted in Oakland by Ars Technica.

This transcript has been edited for length and clarity.

Ben Hancock: You wrote in a blog post several weeks ago about “responsible encryption” and how the phrase was used by Deputy Attorney General Rod Rosenstein. Tell me about that speech and what struck you about that.

Riana Pfefferkorn: This was a speech that the deputy attorney general made to the Naval Academy a few weeks ago, and there were a few things that I found kind of noteworthy about that. One was this was really the first place I think we've seen the use of this term “responsible encryption.” Now, we've heard it before, going all the way back to then-FBI Director Louis Freeh back in the mid-90s, when he referred to “socially responsible encryption” during the previous round of the crypto wars, when at that time, the DOJ and FBI were already sounding the alarm about communications “going dark” due to new technologies, including encryption. But the U.S. Naval Academy speech was the first place where we've seen a term that the deputy attorney general has now been using quite frequently, “responsible encryption.”

Perhaps most noteworthy about that speech, though, were these sort of dark intimations from Rosenstein that there had been conversations between law enforcement and technology companies to try and, I guess, persuade companies to make their encryption designs law enforcement-friendlier, but that the companies hadn't been responsive to this. And so, what he seemed to be saying in his speech is, “The time for talking is over, we're going to need to legislate, because [tech companies] are not going to make any changes unless we force their hand.”

At a time when the winds have shifted to really be not as friendly to Silicon Valley, and when large tech companies are no longer really seen as necessarily being on the side of their users, I think this is a point where Rosenstein and his colleagues really smell blood in the water.

There's another point in Rosenstein's speech that you flagged, which is that tech companies have been willing to play ball with governments such as Russia and China. Do you feel like companies have put themselves in a difficult position to argue against “responsible encryption” because of what they've done to do business overseas?

I think it's a fair point. And out of the many points of disagreement that I have with Mr. Rosenstein, when he points out that companies have submitted to security reviews by China or allowed Russia to do audits of their source code — or when they've helped out with censorship, or helped regimes persecute journalists in the past — this does open those companies up to the criticism that, when they go out and say, “We're pro-users, we're on the side of our users' privacy and our users' security,” that that can ring a little bit hollow.

So he does sort of have a point to say that companies have opened themselves up to this kind of criticism. But at the same time, it's sort of salacious in that it's saying, “Well, if you'll comply with laws that are terrible ideas in other countries — like submitting to source code reviews by governments that have been tampering with our own elections in recent years, for example — then why don't we just pass our own terrible-idea law that mandates backdoors, and then you can go and comply with that because you've shown that you'll comply with any old stupid law.” I don't think that necessarily follows.

You said that if legislation is introduced, it won't necessarily look like the Compliance with Court Orders Act. What would you expect it to look like?

It's difficult to say what it would actually look like. Every time that the deputy attorney general has gone and spoken in public, he's really sort of demurred on specifying what a regulation would look like, just [saying] that it should serve the ultimate purpose of giving law enforcement access to plain text data. And he's also sort of muddied the waters a little bit in terms of whether he's asking for a regulation that would only deal with communications in transit, or for encryption on smartphones or other devices encrypted at rest.

Most of the bills that we saw floated at the congressional level and in a few states a couple of years ago mostly dealt with smartphones, in a not-so-subtle way aimed at Apple and Apple's phones. But I think if there were some sort of law in place, it would sort of not make sense for it to only target data at rest. I think they would try to target both communications encryption and encryption for data at rest, which opens up a Pandora's box of problems. Are you trying to get at the Signals of the world, or are you saying we can't encrypt web traffic anymore? And does that undermine all of e-commerce?

You were previously an attorney at Wilson Sonsini Goodrich & Rosati, advising tech companies. Putting that hat on again, what would you advise tech companies to do amid this debate now?

It would depend on which company we're talking about. A big company and a small company are going to be in different positions in terms of what's possible for them to implement and administer. A company that comes at the issue from a bit more ideological backing, like the makers of Signal, might have a different tolerance for spoiling for a fight than a larger, more established, blue-chip publicly traded company. Whether they're public or not would also come into the picture a lot, because public companies don't just have to be answerable to their users; when you're public, you also have to be answerable to your stockholders.

But one thing I would just remind companies is that they are at liberty to design their encryption however they want to. The law permits them to do that. They do not have to make encryption that is accessible by law enforcement, they do not have to make surveillance-friendly encryption.
