Ray Ozzie, former chief software architect at Microsoft Corp., speaks during the Microsoft Professional Developers Conference in Los Angeles, California, on Tuesday, Oct. 28, 2008. Jonathan Alcorn/Bloomberg via Getty Images

The Department of Justice is pushing for a new industry proposal that would grant law enforcement access to encrypted digital devices with a warrant, according to a new report by The New York Times.

For years, top federal law enforcement officials have advocated for a way to overcome what they call the "going dark" problem—the occasional inability to access data kept on an encrypted smartphone or tablet even when a judge has granted that authority. In recent months, the FBI director, among others, has emphasized the problem's severity.

"I recognize this entails varying degrees of innovation by the industry to ensure lawful access is available," FBI Director Christopher Wray said in a speech earlier this month. "But I just dont buy the claim that its impossible."

The DOJ's position runs counter to the consensus of information security experts, who say it is extremely difficult—if not downright impossible—to build a strong encryption system that would protect data but also allow the government access under certain conditions.

"Of course, criminals and terrorists have used, are using, and will use encryption to hide their planning from the authorities, just as they will use many aspects of society's capabilities and infrastructure: cars, restaurants, telecommunications," Bruce Schneier, a well-known cryptographer, wrote last year.

However, whereas federal agencies have previously simply decried the problem, a handful of tech experts is now working to develop a workable solution that could eventually find its way into actual legislation.

"Not considered ready for deployment"

In its reporting last Saturday, the Times noted that a team that includes former Microsoft Chief Software Architect and CTO Ray Ozzie is helping outline a system that would provide police with access to an encrypted device under certain circumstances. The scheme would not attempt to access messages scrambled in transit (like those produced by Signal or WhatsApp) or encrypted cloud-based storage (like iCloud or SpiderOak).

According to the Times, when a device is encrypted, this proposed system would "generate a special access key that could unlock their data without the owner's passcode. This electronic key would be stored on the device itself, inside part of its hard drive that would be separately encrypted—so that only the manufacturer, in response to a court order, could open it."
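In cryptographic terms, what the Times describes resembles key wrapping: the device's data-encryption key is itself encrypted under a key that only the manufacturer holds, and the wrapped copy is stored on the device. The following is a minimal sketch of that idea, not the actual proposal; the report does not specify algorithms, key sizes, or storage details, so everything below (including the use of RSA-OAEP and Python's cryptography package) is an assumption for illustration.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Hypothetical manufacturer escrow key pair; where this key would live and
# how it would be protected is exactly what critics question.
manufacturer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
manufacturer_pub = manufacturer_key.public_key()

# The device's ordinary data-encryption key (normally derived from the
# owner's passcode plus hardware secrets).
device_data_key = os.urandom(32)

# "Special access key": the data key wrapped under the manufacturer's
# public key and stored in a separately protected region of the device.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_access_key = manufacturer_pub.encrypt(device_data_key, oaep)

# Under a court order, only the holder of the manufacturer's private key
# could unwrap it and recover the data key without the owner's passcode.
recovered_key = manufacturer_key.decrypt(wrapped_access_key, oaep)
assert recovered_key == device_data_key
```

The objection raised by the security experts quoted elsewhere in this story is that, in any scheme of this shape, the manufacturer's private key becomes a single, extraordinarily valuable target.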

Ozzie and two other computer scientists—Ernie Brickell, formerly of Intel, and Stefan Savage of the University of California, San Diego—have taken part in workshops on the matter held at the Massachusetts Institute of Technology by computer science professor Daniel Weitzner. This research has also reportedly been shared with high-level FBI officials. Neither the Department of Justice nor the FBI responded to Ars' request for comment on Sunday.

When Ars reached Ozzie by email late Sunday evening, he wrote that "there is no secrecy here," adding that he had spent "two years trying to use a concrete proposal so as to catalyze constructive discussion among people who were not talking."

"To be clear, there is no 'group,'" he continued. "There have been an immense number of private bilateral and multilateral discussions among diverse sets of passionate and deeply intelligent and well-intentioned individuals—industry, academics, US government, other countries' governments. In most all, people would never have participated had they not felt comfortable that they weren't doing so privately and under Chatham House rules. There have been very concrete discussions about specific technological approaches—NOT because anyone is pushing a specific approach or outcome. Rather, I (and Stefan, and Erie) have simply been doing what we can to get the discussion to the point of 'should we' not 'can we.' That is the question, I believe, we should be asking."

This plan was first described in a February 2018 National Academy of Sciences committee report that took 18 months to research.

"[The] proposed encryption schemes are not considered ready for deployment until they have undergone careful scrutiny by experts regarding their effectiveness, scalability, and security risks and been subject to real-world testing at realistic scale in the relevant contexts," the report concluded.

To date, the most famous example of encryption throwing a wrench into a real-world investigation came in 2016, when Apple faced off with the Department of Justice in the wake of the San Bernardino terrorist shooting. There, federal authorities tried to force Apple to rewrite its firmware so that the FBI could access the seized iPhone that was used by Syed Rizwan Farook, one of the dead attackers. In the end, the FBI said that it was able to access the phone by other means—and reportedly paid more than $1.3 million for the privilege—while the legal issue was ultimately left unresolved.

Given the shape of the new proposal, Apple suggests that it would resist any further efforts to change its software in similar ways.

"Were continuously strengthening the security protections in our products because the threats to data security intensify every day," said Apple Senior Vice President for Software Engineering Craig Federighi in a statement sent to Ars and to the Times.

"Proposals that involve giving the keys to customers device data to anyone but the customer inject new and dangerous weaknesses into product security. Weakening security makes no sense when you consider that customers rely on our products to keep their personal information safe, run their businesses, or even manage vital infrastructure like power grids and transportation systems. Ultimately protecting someone elses data protects all of us so we need to move away from the false premise that privacy comes at the cost of security when in truth, its a question of security versus security."

"Frighteningly insecure"

Several lawyers and computer scientists reiterated to Ars that creating such a system and compelling companies to implement it could be fraught with numerous problems, both legal and technical.

"A hardware-based backdoor would shift the burden onto smartphone users to go through extra inconvenience in order to secure their information," Stanford University legal fellow Riana Pfefferkorn told Ars by email. Pfefferkorn recently wrote a paper on the subject.

"The result would be that careful criminals would be more scrupulous about using app-level encryption, but unsophisticated criminals—and innocent everyday smartphone owners—probably wouldn't do a perfect job of it," elaborated Pfefferkorn. "That seems to be good enough for law enforcement, though."

A seemingly similar scheme, known as "key escrow," was first proposed under the Clinton administration in the 1990s, when mobile devices and encryption itself were far less sophisticated. The basic premise was that computers containing a special "Clipper Chip" would give the government access to devices when needed. Federal authorities would incentivize inclusion of the chips in commercial devices by requiring them in order for companies to do business with the government. However, the underlying tech failed under scrutiny, and the plan was essentially dead on arrival.
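One piece of the escrow idea was that no single party would hold the recovery key outright: it was split between escrow agents, who would combine their shares only under legal process. The sketch below shows just that key-splitting concept in Python; it is a loose illustration, not the actual Clipper/Skipjack protocol, and the names and sizes used are assumptions.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Combine or split two equal-length byte strings with XOR."""
    return bytes(x ^ y for x, y in zip(a, b))

# A device "unit key", split into two shares so that neither escrow agent
# can recover it alone (roughly the arrangement key escrow envisioned).
unit_key = os.urandom(16)
share_1 = os.urandom(16)                 # held by escrow agent #1
share_2 = xor_bytes(unit_key, share_1)   # held by escrow agent #2

# In the Clipper design, a per-session key was encrypted under the unit key
# and sent alongside the traffic, so that authorities holding both shares
# could recover the session key under a warrant.
recombined = xor_bytes(share_1, share_2)
assert recombined == unit_key
```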

Amie Stepanovich, a lawyer and policy manager at privacy advocacy group Access Now, told Ars that it's still not clear to what extent strong encryption has meaningfully stymied criminal investigations in more recent years.

"If we've learned anything from the past, I expect the proposed solution to look a lot less secure than the current environment, which, given the ever-increasing number of data breaches, is already frighteningly insecure," she e-mailed. "Given that this proposal has been developed by a privileged few, I also expect its impacts to be felt disproportionately by marginalized communities."

Meanwhile, Seny Kamara, a cryptographer and professor at Brown University, pointed out that even if Ozzie and his colleagues are successful in making their system work, it's not a given that companies would even agree to implement it.

"Even if the government compelled companies to comply, it doesnt mean that law enforcement would be able to get the information it wants, because third-party app developers could still provide encrypted apps that would protect information even given access to the device," he emailed Ars.

"Is the government going to compel all app developers as well?" he continued. "What if the apps are developed in another country? What if the apps are developed and published anonymously?"
