Good morning Chairman Grassley, Ranking Member Leahy, and Members of the Senate Judiciary Committee. Thank you for your leadership on these issues, and for the opportunity to testify today.
Encryption and government access to data stored on electronic devices have largely been discussed as a federal and national security issue. In truth, the implications of this discussion are of equal, or even greater, concern to state and local law enforcement. That is because more than 90 percent of all criminal cases each year are filed and adjudicated in state courts around the country.
To use the New York County District Attorney’s Office as a point of reference, my Office handles more than 100,000 criminal cases each year, which is more than all of the cases handled by the Department of Justice nationwide. And the range of those cases is broad – from murder, rape, and robbery, to identity theft, financial fraud, and terrorism.
People live their lives today on their smartphones, which they use for, among other things, emailing, texting, taking pictures, posting pictures, shopping, conducting business, and searching the web. To investigate these 100,000 cases without smartphone data is to fight crime with one hand tied behind our backs.
Therefore, I want to focus my testimony on an issue deeply troubling for law enforcement at all levels: the encryption of smartphone data by Apple and Google. I believe that by having an open discussion among lawmakers, law enforcement, and the private sector, we can reach a solution that strikes the right balance between privacy and public safety.
I. The Use of Smartphone Evidence Pursuant to Judicial Warrants
As you know, the Fourth Amendment of the United States Constitution permits reasonable searches and seizures, providing law enforcement agencies access to places where criminals hide evidence of their crimes – from car trunks, to storage facilities, to computers, mobile devices, and digital networks. To safeguard Fourth Amendment rights, these searches are conducted pursuant to judicial warrants, issued upon a neutral judge’s finding of probable cause.
The probable cause standard represents a balance between privacy and public safety carefully calibrated by centuries of jurisprudence, and it guides individuals and companies in developing their expectations of privacy.
Through this judicial process, my Office obtains smartphone evidence to support all types of cases – homicides, sex crimes, child abuse, fraud, assaults, robberies, cybercrime, and identity theft. Many perpetrators, particularly those who commit sexual offenses, take photos and videos of their acts, and store them on computers and smartphones.
Between October 2014 and June 2015, 35 percent of the data extracted from all phones by my Office was collected from Apple devices; 36 percent was collected from Android devices. That means that when smartphone encryption is fully deployed by Apple and Google, 71 percent of all mobile devices examined—at least by my Office’s lab—may be outside the reach of a search warrant.
I want to emphasize that I am testifying from a state and local perspective. I am not advocating bulk data collection or unauthorized surveillance. Instead, I am concerned with protecting local law enforcement’s ability to make targeted requests for information, each scrutinized by an impartial judge who evaluates whether probable cause has been established. Importantly, and by Apple’s own admission, governmental requests for information have affected only .00571 percent of Apple’s customers.
II. Apple and Google’s New Encryption Policies
Last fall, Apple and Google, whose operating systems run 96 percent of smartphones worldwide, announced with some fanfare, but without notice to my Office or other law enforcement offices I have spoken to, that they had engineered their new mobile operating systems such that they can no longer assist law enforcement with search warrants written for passcode-protected smartphones. According to Apple’s website:
On devices running iOS 8.0 and later versions, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode… Apple will not perform iOS data extractions in response to government search warrants because the files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess. [Emphasis added.]
Apple’s announcement led to an immediate response from law enforcement officials, who pointed out that allowing a phone or tablet to be locked beyond the reach of lawful searches and seizures was unprecedented and posed a threat to law enforcement efforts – in effect, a boon to criminals. Unless law enforcement officials can obtain the passcode from the user, which will be difficult or impossible in many cases, or can obtain it by “brute force” (again, difficult or impossible, and attempts to do so would likely lead to the destruction of evidence on the iPhone), the search warrant would be of no consequence, because no one will be able to unlock the phone, notwithstanding the court order.
Law enforcement’s warnings are hardly idle. Recently, a father of six was murdered in Evanston, Illinois. City of Evanston Police believe that prior to his murder, the victim was robbed of a large sum of money. There were no eyewitnesses to or surveillance footage of the killing. Found alongside the body of the deceased were an iPhone 6 and a Samsung Galaxy S6 Edge running Google Android. Cook County prosecutors served Apple and Google with judicial warrants to unlock the phones, believing that relevant evidence might be stored on them. Apple and Google replied, in substance, that they could not, because they did not know the user’s passcode. Information that might be crucial to solving the murder, therefore, had effectively died with the victim. His homicide remains unsolved. His killer remains at large.
It is not hyperbole to say that beginning in September 2014, Americans conceded a measure of their protection against everyday crimes to Apple and Google’s new encryption policies. Yet, I would note that, before the changes, neither company, to our knowledge, ever suggested that their encryption keys, held by the companies, were vulnerable to hacking or theft.
Fully one-quarter of our felony cases now involve cybercrime or identity theft, so I am keenly aware of the dangers and impact of these crimes on our community, which is situated in a global financial center and is the number one target for terrorism in the world. Because of this, my Office has invested heavily in becoming highly proficient and active in the prosecution of these crimes, and in the promotion of best cybersecurity practices for New York consumers and companies. From my vantage point, and in my opinion, for reasons set forth later in my testimony, Apple and Google’s new encryption policies seem to increase protection for consumers from hackers only minimally, if at all. But those policies create serious new risks for my constituents and the millions of visitors and workers passing through Manhattan every day.
III. The Problem Created by Apple and Google’s New Encryption Policy
Some commentators have argued that the effective unsearchability of smartphones will not be significant because law enforcement has other avenues for investigation. These arguments are flawed, for the reasons set forth below. While Apple is not the only company to have announced a similar program, for ease of presentation, and because Apple has been the most vocal company on the relevant issues, I address here only Apple’s program, devices, and statements. I believe that the arguments and reasoning here would apply as well to Google’s program and any similar program.
A. The Search of an iCloud Account is Not a Substitute for the Search of an iPhone
Apple’s new encryption policy currently does not affect law enforcement officials’ ability to obtain user data from an iCloud account. Law enforcement officials who obtain a search warrant for a person’s iCloud account can serve that warrant on Apple, and thus obtain the contents of the account, regardless of whether the person’s iPhone uses iOS 8.
But searching a person’s iCloud account is not the same as searching the person’s iDevice. The ability of law enforcement officials to obtain a search warrant for an iCloud account does not mean that those officials will obtain the same content as they would if they could search the user’s device. In many cases, law enforcement officials cannot obtain the entirety of an iPhone’s data by obtaining the contents of the associated iCloud account.
First, and most fundamentally, users of iDevices are not required to set up iCloud accounts or to back up to iCloud. Therefore, not all users of iDevices will have data stored in iCloud. It is clear that even minimally sophisticated wrongdoers who use their iDevices to perpetrate crimes will take the relatively simple steps necessary to avoid backing up those devices to iCloud.
Second, even when a user chooses to back up his or her data to iCloud routinely, some data may not be backed up and would, therefore, be unattainable through a search warrant for an iCloud account. Data that is saved on an iPhone will not be backed up to the cloud until the iPhone is connected to Wi-Fi. So, if evidence is stored on an iPhone when the phone is disconnected from Wi-Fi, and the iPhone is recovered by law enforcement officials before it is reconnected to Wi-Fi, then the evidence would exist only on the iPhone itself.
iPhone users are given only five gigabytes of free storage space on iCloud, whereas the iPhone 6 comes with 16, 64, or 128 gigabytes of storage space on the device itself. Thus, unless a user pays for additional iCloud storage space, the vast majority of his or her storage space will be on the iPhone itself.
Third, it may be possible to recover at least some deleted data from an iDevice. Once data has been deleted from an iCloud account, however, it is not recoverable. Thus, the iPhone is the only route to evidence that has been deleted – which may, of course, be among the most probative evidence.
Fourth, it will often be more difficult for a prosecutor to prove who uses the data on a particular iCloud account than it would be to prove who owns a particular iDevice. iDevices are often recovered from a person, which supports the inference that the person controls the device. To establish the person’s ownership of the device, a prosecutor may simply have to call one witness – the officer who recovered the device. By contrast, the identity of the user of an iCloud account may be more difficult to establish. A prosecutor may need to present testimony or records from Apple relating to the subscriber information, IP login history, and/or content of the account, testimony or records from internet service providers regarding the subscriber information of certain IP addresses, and/or testimony of forensic analysts, among other witnesses.
B. The Difficulty of Getting Passcodes from Defendants
In many cases, the iPhone that may contain evidence about a criminal case belongs to the defendant, and thus it is her or his passcode that prevents the government from gathering the evidence. In most cases, it will be almost impossible to compel a defendant to provide her or his passcode to the government.
Case law holds almost universally that a defendant cannot be compelled (by, for example, a grand jury subpoena or order of the court) to provide the government with her or his passcode, because such compulsion would violate the defendant’s Fifth Amendment right against self-incrimination. There are two potential exceptions to this rule.
First, it is an open question whether, instead of being compelled to provide the government with a passcode, the defendant might be compelled to unlock her or his iPhone using the passcode. There have been no cases of which we are aware that consider this precise question, and although a court might conclude that it is no different from the situation in which a defendant is compelled to provide the government with the passcode, it might also determine that the situations are somewhat different.
Second, if the existence of particular evidence on the iPhone is a foregone conclusion, then the defendant may have no Fifth Amendment privilege with respect to those contents of the iPhone, and thus may be compelled to provide the government with the passcode. It would be very difficult in most circumstances, however, for the government to establish with the requisite degree of certainty the existence of evidence in the contents of an iPhone that would clear the “foregone conclusion” hurdle.
In any event, even if the government could lawfully compel a defendant to disclose her or his passcode – or to open her or his phone using the passcode – there is a substantial likelihood that any defendant who faces potentially serious criminal charges would simply refuse to comply with the subpoena or order and accept being held in contempt.
The consequence of the foregoing is that in almost all cases, it will be legally impossible to compel a defendant to provide his or her passcode or to use the passcode to open her or his iPhone, and that, in those few cases in which it might be legally possible to compel the defendant to provide the information, it would be impossible as a practical matter to compel a recalcitrant defendant facing serious charges to do so. And, of course, whatever powers law enforcement might have to compel a defendant to cooperate in opening her or his phone, those powers are irrelevant to the situation in which the phone that law enforcement needs to open belongs to an unavailable victim (for example, the deceased, in a murder case) or to a witness or potential defendant who fled the scene of the crime.
IV. Apple’s Stated Reasons for Its New Encryption Policy
Apple, to our knowledge, has given four principal justifications for its new policy.
First, it has suggested that the new policy is a response to public concerns expressed in light of the revelations by Edward Snowden about data collection by the National Security Agency.
Second, Apple has suggested that unless it makes its iDevices impregnable to lawful governmental access, Apple will lose customers, who will seek to purchase substitute, impregnable devices.
Third, Apple has suggested that if it were to build its devices such that it could respond to lawful governmental requests for information in them, then the iDevices would be less secure than if Apple built them (as it has) to be impregnable to such requests.
Fourth, Apple has also suggested that if it were to build its devices such that it could respond to lawful domestic governmental requests for information in them, then foreign governments would also request access to the information contained in the devices, or hack into the devices to gain access; if those governments were repressive, commentators have suggested, then Apple would, in effect, be helping repressive governments to limit their citizens’ liberty.
Each of these proposed justifications will be addressed in turn.
A. American Customer Privacy Concerns Based on NSA’s Actions
Apple’s encryption efforts – and, more particularly, its announcement of those efforts – appear to be partially in response to concerns expressed by the public in the wake of revelations about incursions of privacy by the NSA, many of which were brought to light by Edward Snowden and others working with him. The encryption efforts are not a reasonable response to the Snowden-NSA issue for at least two reasons.
First, the effect of Apple’s encryption is that it (i) relieves Apple of the need to respond to lawful government requests and (ii) prevents the government from examining the contents of iDevices, even when an independent judge has authorized such disclosure by issuing a search warrant. Of course, a search warrant cannot be issued absent a showing of probable cause to believe that a crime has been committed and that evidence or proceeds of the crime might be found on the iDevice to be searched.
The warrant requirement has been described by the Supreme Court as “[t]he bulwark of Fourth Amendment protection,” and there is no reason to believe that it cannot continue to serve in that role, whether the object to be searched is an iPhone or a home. In fact, what makes Apple’s proposal remarkable is that it would provide greater protection to one’s iPhone than one has in one’s home, which, of course, has always been afforded the highest level of privacy protection. Every home can be entered with a search warrant. I cannot think of another device that has been knowingly designed in a way to prevent lawful government inspection. Thus, even if Apple’s new encryption policy would have prevented the NSA’s actions, or will be able to prevent similar conduct, it comes at a very high and unjustified cost.
Second, Apple’s new encryption policy would not have prevented the NSA’s gathering of data, which, at least according to press coverage, was not sanctioned by a court or covered by a search warrant.
That press coverage indicates that the NSA collected phone call data (including numbers called and the duration of phone calls), and the content of at least some phone conversations, often involving people for whom there was no reasonable suspicion of criminal activity, without judicial approval. Of course, that is nothing like what we are suggesting, and, in any event, the encryption of iDevices would not have prevented the mass collection of phone call data, which is obtained from phone service providers rather than from phones themselves. And default full-disk encryption would not have prevented the NSA from intercepting communications in transit.
B. Security Concerns if Apple Can Decrypt
Apple has asserted that full-disk encryption maximizes the security of its users’ devices and data. One should question that assertion for several reasons.
First, even before Apple enacted its new policy, a person who somehow obtained Apple’s encryption key would still need the iPhone itself to obtain information on the device. It is highly unlikely that someone who snatches an iPhone from a subway commuter would also be a master hacker capable of breaching Apple’s own security systems to obtain the encryption key. As best I can tell, Apple and Google’s new encryption policies prevent lawful access by state and local law enforcement, and do nothing to address serious challenges like institutional data breaches or invasion by malware.
Second, if a user’s phone were to be stolen, the user could use the Find My iPhone app in order to wipe the phone’s data and prevent the thief from accessing that data. Users who have an iCloud account can use the Find My iPhone app remotely to lock their phones or erase their data if their phones are lost or stolen. This app can effectively prevent thief-hackers from obtaining a phone’s data.
Third, according to Apple, most data stored on the iCloud is encrypted. As noted above, law enforcement officials are able to obtain some iCloud data through the service of a search warrant on Apple. The fair inference is that Apple retains the ability to decrypt this iCloud data. But if Apple’s ability to decrypt data on the iCloud does not render that data insecure – and Apple touts the security of its iCloud data – then presumably neither would Apple’s retaining the ability to decrypt data on an iPhone.
C. Concern About Foreign Governments’ Access to Customer Information
Apple has suggested that if it maintains an encryption key to iOS 8, then repressive governments would seek access to information contained on devices, including those belonging to dissidents. If Apple were to comply, this would impede those dissidents’ civil liberties. This contradicts commentators’ arguments, referenced above, that iOS 8 will not be a significant burden on law enforcement, which can still request data from a user’s iCloud account. Apple cannot have it both ways: Either iOS 8 does deter investigations or it does not.
But even more fundamentally, Apple’s desire not to cooperate with requests from repressive governments does not justify its refusal to cooperate with bona fide requests from local law enforcement, approved by independent state or federal courts, in the United States. As I said before, the warrant requirement of the United States Constitution has been the fundamental protection for people’s privacy and liberty in this country. It can and should apply to iOS 8, as it does to any other home or device.
V. The Cost of Evidence Made Inaccessible Through Apple’s Encryption
Although encryption has been often discussed in the context of international terrorism, the NSA, and the CIA, the greatest cost of these new encryption policies may well be borne by local law enforcement. Smartphones are ubiquitous, and there is almost no kind of case in which prosecutors have not used evidence from smartphones. My Office (and, I expect, every other local prosecutor’s office) has used evidence from cellphones in homicides, rape cases, human trafficking, assaults, domestic violence cases, narcotics cases, kidnappings, larcenies, frauds, identity theft, cybercrime, and robberies. Indeed, it is the rare case in which information from a smartphone is not useful. The following list of recent cases is representative:
There are many other cases—almost too many to count—that I might have selected, but the point is clear: We would risk losing crucial evidence in all of these cases if the contents of passcode-protected smartphones were unavailable to us, even with a warrant.
The magnitude of the loss is fully appreciated by wrongdoers who use smartphones. Recently, a defendant in a serious felony case told another individual on a recorded jailhouse call that “Apple and Google came out with these softwares that can no longer be encrypted [sic: decrypted] by the police. . . . If our phones is running on the iO[S] 8 software, they can’t open my phone. That might be another gift from God.”
This defendant’s appreciation of the safety that the iOS 8 operating system afforded him is surely shared by criminal defendants in every jurisdiction in America charged with all manner of crimes, including rape, kidnapping, robbery, promotion of child pornography, and larceny, and presumably by those interested in committing acts of terrorism. Criminal defendants across the nation are the principal beneficiaries of iOS 8, and the safety of all American communities is imperiled by it.
VI. Proposed Solution: Return to a Balanced Approach – Requiring Phones To Be Manufactured So That They Would Be Accessible To Law Enforcement When Law Enforcement Has Obtained a Search Warrant
Apple has done something truly extraordinary. I am aware of no products other than those running iOS 8 and Google’s analogous Android devices that have been designed specifically to be impervious to lawful governmental processes. While Apple pursues an extraordinary path, I am proposing here a perfectly conventional one – a return to the balanced approach in place prior to the introduction of iOS 8. Apple’s products should be configured such that data on its iDevices can be accessed by law enforcement when it has judicial authorization to do so.
I believe that the only way Apple could be compelled to configure its products along the lines that I have suggested is through legislation. It is important to proceed with great caution, and any legislation must be carefully drafted to avoid stifling innovation or causing material harm to the United States technology sector, which is so vital to our national economy. But, in the fast-changing field of technology, legislation is the right tool – far better than litigation – to effect change and balance social objectives.
Legislation can be changed, revised, amended, and tweaked, to accommodate valid social goals in the face of a shifting technological landscape. As Justice Alito has observed, “[i]n circumstances involving dramatic technological change, the best solution to privacy concerns may be legislative.” Furthermore, legislation allows the public, rather than Apple or any other company, to set the balance between privacy and security. The trade-off is borne by the public, and it should be decided by the public as well.
Informed legislation, however, requires debate and open discussion. I have attempted to discuss with Apple many of the items that I have discussed here. In March of this year, I travelled to Apple’s headquarters and expressed my concerns directly to members of Apple’s management team as to how Apple’s encryption policy adversely affected law enforcement’s ability to protect the public safety. I followed up my visit with a letter, summarizing my questions about iOS 8. To date, I have not received a response. (A copy of my letter to Apple, and a copy of a similar letter that I wrote to Google, are attached to this written testimony.)
I would encourage this Committee to seek answers from Apple. The time has come for someone to determine the proper balance between the marginal benefits of full-disk encryption, and the need for state and local law enforcement to prosecute and prevent crime, and for victims to obtain justice. That someone should not be Apple or Google. It should be you, the Congress.
Thank you for the opportunity to testify today.