Inadvertently Disclosed Warrant Application Against Apple in a Criminal Investigation Against Retired Marine General Reveals Latest DOJ Search Procedures, the Dangers of PACER and Too Much Court Record Transparency, and Much More – Part Two
This article is Part Two of the blog Examining a Leaked Criminal Warrant for Apple iCloud Data in a High Profile Case. See here for Part One.
Items To Be Seized – Search Procedures
In Attachment B to the Application, entitled Items To Be Seized, the government describes in Section I the Search Procedures it wants Apple to follow. That’s where it gets really interesting for anyone in ediscovery. The fun continues in Section II, Information to be Disclosed by Provider; Section III, Information to be Seized by the Government; and Section IV, Provider Procedures.
Section I starts off by directing Apple to make a forensic copy, i.e., bit by bit. The language used for this is informative. Note how this intrusive request is characterized as a kind of courtesy to all of us other Apple iCloud users.
2. To minimize any disruption of service to third parties, the PROVIDER’s employees and/or law enforcement personnel trained in the operation of computers will create an exact duplicate of the information described in Section II below.
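The warrant does not say how an “exact duplicate” is verified, but the standard forensic practice is to hash the source and the copy and compare digests. A minimal sketch, assuming SHA-256 as the verification hash (the helper names `sha256_of` and `verify_duplicate` are my own, not from the Application):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large forensic images fit in constant memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_duplicate(original: str, duplicate: str) -> bool:
    """A copy is 'exact' only if source and copy hash to the same digest."""
    return sha256_of(original) == sha256_of(duplicate)
```

If even a single bit of the copy differs, the digests diverge and verification fails, which is precisely why hashing underpins the “authenticity and chain of custody” language that appears later in the Application.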
Skipping to paragraph four of the Search Procedures section, the government names the search tools it may use. One would hope it is not an exhaustive list; there are so many other good tools out there. Just browse EDRM.net and you will see many of the best.
The search shall extract and seize only the specific items to be seized under this warrant (see Section III below). The search team may use forensic examination and searching tools, such as “Encase” and “FTK” (Forensic Tool Kit), which tools may use hashing and other sophisticated techniques. The review of the electronic data may be conducted by any government personnel assisting in the investigation, who may include, in addition to law enforcement officers and agents, attorneys for the government, attorney support staff, and technical experts.
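The “hashing” mentioned in the warrant typically serves deduplication and known-file elimination: files whose digests match a reference set of known system files (NSRL-style hash lists) can be set aside without review. A hedged sketch of that filtering step, with a made-up function name of my own:

```python
import hashlib
from pathlib import Path

def filter_known_files(paths, known_hashes):
    """Split a collection into files whose SHA-256 digest appears in a
    known-file hash set (e.g., stock OS files) and files needing human review."""
    known, to_review = [], []
    for p in paths:
        digest = hashlib.sha256(Path(p).read_bytes()).hexdigest()
        (known if digest in known_hashes else to_review).append(p)
    return known, to_review
```

Tools like EnCase and FTK do this at scale with precomputed hash databases; the sketch just shows the core set-membership idea.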
In the next paragraph, five, you see a “this crime only” relevance limitation put on the search. That should keep it from becoming a general fishing expedition of “oh, gee, look what I found, yet another new crime.”
The search team will not seize contraband or evidence relating to other crimes outside the scope of the items to be seized without first obtaining a further warrant to search for and seize such contraband or evidence.
In the next paragraph, six, a time limit for the search is self-imposed by the government, but of course a back door is provided to ask the court for more time, which, I hear, is the rule, not the exception. In other words, this time limit is about as flexible as one of Dalí’s clocks.
The search team will complete its search of the content records as soon as is practicable but not to exceed 120 days from the date of receipt from the PROVIDER of the response to this warrant. The government will not search the content records beyond this 120-day period without first obtaining an extension of time order from the Court.
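The 120-day window is simple date arithmetic from the date of receipt. A minimal sketch (the receipt date below is hypothetical, not taken from the Application):

```python
from datetime import date, timedelta

def search_deadline(receipt: date, days: int = 120) -> date:
    """Last day the search team may review content without a court extension."""
    return receipt + timedelta(days=days)

# Hypothetical example: a production received on May 2, 2022
print(search_deadline(date(2022, 5, 2)))  # 2022-08-30
```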
In paragraph seven, it is explained that after the search team completes its review of the data, the original production by Apple, the provider here, will then be “sealed and preserved” by the government, not returned and destroyed. The reason given for this procedure is what you would expect: “authenticity and chain of custody purposes.”
In paragraph nine of the Search Procedures, the Application asserts that “Pursuant to 18 U.S.C. 2703(g) the presence of an agent is not required for service or execution of this warrant.” I am sure the search team of ediscovery experts who will actually do the work here are relieved to know that they won’t have to have an FBI agent looking over their shoulders the whole time. But it does raise the question of who watches the watchers, or in this case, the seekers. I assume they will do a better job with cybersecurity than the NSA did with Snowden, or the Clerk here did with the sealed Application. Thumb drive cuff links anyone? Only $39.95 on Amazon.
Information to be Disclosed by Provider
Attachment B to the Application is entitled Items To Be Seized. Section II of Attachment B describes the Information to be Disclosed by Provider, in this case Apple. This is paragraph ten of the Application. First of all, the Application makes clear that Apple must disclose the information no matter where in the world Apple may have the ESI stored. So much for international privacy laws. This is a criminal warrant by the DOJ, so you do what the U.S. government says, or else. This is a real problem for countries with strong ESI privacy rights, such as those in the EU. For good background on this, see The Ultimate Guide to GDPR and Ediscovery by Zapproved (EDRM 5/19/22).

Also, under the Application, it does not matter if Apple has already deleted the data. Apple must still restore and fetch it, if it is “still available.” Use your forensic experts and cough it up. See: EDRM Collection Standards, 1. Forensic Image (Physical or Logical Target) and 6.4. International Protocols. There is no proportionality or cost-burden analysis in the Application of the kind you would ordinarily see in a civil case. Apple is required to turn over all “wire and electronic communications” to the DOJ search team from October 1, 2016, to the date of the Application, April 15, 2022. See, e.g., my 5/28/18 blog, Proportionality Analysis Defeats Motion for Forensic Examination, discussing Motorola Sols., Inc. v. Hytera Communications Corp., No. 17 C 1973 (N.D. Ill.) (order in a civil case forbidding the forensic examination of computers in China as “out of proportion with the needs of this case,” citing Rule 26(b)(1), Federal Rules of Civil Procedure).
Now comes the typical including without limitation laundry list in paragraph 10.a.i-iv. It is quite an extensive list, including “buddy lists.” (I can’t believe anyone still uses that feature. I don’t even see it on my Apple devices.) I quote paragraph 10.a in full, except for subparagraph iii, which is provider specific, in case you want to use something obnoxiously long and complete like this yourself someday when subpoenaing a private party.
i. All e-mails, communications, or messages of any kind associated with the SUBJECT ACCOUNT, including stored or preserved copies of messages sent to and from the account, deleted messages, and messages maintained in trash or any other folders or tags or labels, as well as all header information associated with each e-mail or message, and any related documents or attachments.
ii. All records or other information stored by the subscriber of the SUBJECT ACCOUNT, including address books, contact and buddy lists, calendar data, pictures, videos, notes, texts, links, user profiles, account settings, access logs, and files. . . .
iv. All stored files and other records stored on iCloud for the SUBJECT ACCOUNT, including all device backups, all Apple and third-party app data (such as third-party provider emails and WhatsApp application chats backed up via iCloud), all files and other records related to iCloud Mail, iCloud Photo Sharing, My Photo Stream, iCloud Photo Library, iCloud Drive, iWork (including Pages, Numbers, and Keynote), iCloud Tabs, and iCloud Keychain, and all address books, contact and buddy lists, notes, reminders, calendar entries, images, videos, voicemails, device settings, and bookmarks;
Just in case that list is not exhaustive enough for you, the government goes on to make it even longer by adding a part b, specifically 10.b.i-iii, found at pages 7-9 of 77 of the Application. Most of this is information that a provider might have about the subscriber, the target of the investigation. I quote below subsection iii on encryption and keybags, which is pretty interesting and could have other uses for practitioners.
b. iii. All files, keys, or other information necessary to decrypt any data produced in an encrypted form, when available to Apple (including, but not limited to, the keybag.txt and fileinfolist.txt files);
Here is Apple’s explanation of what a keybag.txt file should contain, basically the encryption keys, and how it is used. It gets very complicated. The fileinfolist.txt file is not explained by Apple, but appears to be a directory listing of the files on the device.
For background on the related issues of encryption in criminal wiretaps, and the problems this has been causing criminal investigations lately, see the excellent article by Zuckerman Spaeder LLP, in JD Supra, 6/10/22, entitled Warranted wiretapping? What to look for in this year’s Wiretap Report. Zuckerman cites the government’s Wiretap Report: in 2020, encryption was encountered in 398 wiretaps, and the plain text of the messages could not be decrypted in 383 of those. Yikes, that’s a 96% failure rate! Moreover, the expenses per wiretap reached an all-time high of $119,418 in 2020, up 183% from $42,216 in 2015. United States Courts, 2020 Wiretap Report. Also see the interesting article on a criminal ESI discovery case with bizarre facts to match the title, Despite Estimate of 37 Years to Crack iPhone, Government Doesn’t Have to Return it – Yet: eDiscovery Case Law, (EDRM, Cloud Nine, 3/27/20). Wonder if people will still even use phones in 37 years? I kind of doubt it.
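For the skeptical reader, the two percentages quoted from the Wiretap Report check out as plain arithmetic:

```python
# Decryption failure rate: 383 of 398 encrypted wiretaps could not be read
failed, total = 383, 398
print(f"{failed / total:.0%}")  # 96%

# Cost increase per wiretap, 2015 to 2020
cost_2020, cost_2015 = 119_418, 42_216
print(f"{(cost_2020 - cost_2015) / cost_2015:.0%}")  # 183%
```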
To be continued . . . Part Three of this blog will examine Section III of the Application, namely Information To Be Seized by the Government, and Section IV, Provider Procedures. The last part of the blog will focus on the dangers of too much information, the dangers of PACER, suggestions for its reform, the complex transparency of online court records, privacy rights, and speculation on how the leak to the AP in this case could have happened. In the meantime, please leave some comments below.