eDiscovery Case of the Week with Kelly Twigger

How Apple’s Failure to Suspend its Retention Policy Resulted in Siri-ous Sanctions

Kelly Twigger Season 1 Episode 147

In this episode of Case of the Week with Kelly Twigger, we dive into a significant decision from Lopez v. Apple, Inc., issued by U.S. Magistrate Judge Sallie Kim on June 17, 2024. This case explores critical eDiscovery issues, including proportionality, spoliation, and sanctions, stemming from Apple’s data retention policies for Siri interactions and alleged privacy violations. Plaintiffs claim Apple failed to preserve critical Siri data after the duty to preserve was triggered, leading to allegations of bad faith and spoliation.

Kelly unpacks the Court’s analysis under Rule 37 of the Federal Rules of Civil Procedure, highlighting Apple’s shifting data policies, the implications of their auto-deletion practices, and the Court’s decision to leave the question of intent for the jury. The episode covers the harsh sanctions imposed on Apple and provides practical takeaways for litigators on proactive preservation strategies, navigating asymmetrical data, and presenting solutions to courts in preservation disputes.

Don’t miss this compelling breakdown of a case with far-reaching implications for privacy, preservation, and the ever-evolving standards in eDiscovery.

 Lopez v. Apple, Inc. (June 17, 2024)
Read the blog about this case:
eDiscovery Assistant Blog
eDiscovery Assistant Website
Sign up for Kelly's Case of the Week Newsletter here
eDiscovery Assistant Free 7 day Trial (no credit card required)

#eDiscovery #CaseLaw #DataPreservation #Spoliation #Sanctions #LegalTech #Proportionality #Rule37 #PrivacyLaw #LitigationStrategy #ElectronicDiscovery #DataRetention #CourtDecisions #LegalInsights 

Thank you for tuning in to Case of the Week with Kelly Twigger. If you found today’s discussion helpful, don’t forget to subscribe, rate, and leave a review wherever you get your podcasts. For more insights and resources on eDiscovery, visit eDiscovery Assistant and explore our practical tools, case law library, and on-demand education from the eDiscovery Academy. Join us next episode as we break down another important case shaping the future of eDiscovery.

Background


This week’s decision comes to us from the Lopez v. Apple, Inc. case, and this decision is from United States Magistrate Judge Sallie Kim, dated June 17, 2024. Judge Kim has 38 decisions in our eDiscovery Assistant database and is well-versed in ediscovery issues. As always, we apply our proprietary issue tagging structure to each of the decisions in our database, and this week’s issues include sampling, proportionality, bad faith, scope of preservation, spoliation, sanctions, audio, privacy, protective order, and failure to preserve. Before we dive in, it’s important to note that this case has very specific facts that I won’t touch on just for brevity’s sake. I encourage you to read the entire decision when considering it as a tool for advising your clients.

This is a sanctions motion that comes about as a result of Apple allowing data to be deleted on a retention schedule. And it’s a schedule that changed after the complaint was filed. There are a number of nuances in the way that data was retained here, and as always in these decisions, it’s difficult to know how much detail was presented to the Court versus what was adopted in its decision. We’ll discuss more about how this decision should be considered in our takeaways.


Facts


We are before the Court on plaintiffs’ motion for sanctions. The underlying case involves claims that Apple violated plaintiffs’ privacy and misused their data when Siri was activated by a false accept or false trigger that then led to a recording. If you have an iPhone, you’ve likely encountered this issue, where Siri just starts recording when you didn’t activate it. A false accept or a false trigger happens when an Apple device records a user’s speech without the user having said “Hey, Siri.”

Plaintiffs also allege that Apple disclosed the recordings to third-party contractors to “improve the functionality” — again, without users’ consent. These actions are all contrary to Apple’s policy that a device would only listen to, record, and share conversations with the user’s consent, which can be given in three different ways:

  • by uttering an activation command, like “Hey Siri”;
  • by manually pressing the button on the device; and
  • on the Apple Watch, by raising the watch to one’s mouth and beginning to talk.

I personally never got my Apple Watch to work with Siri. Hopefully, you’ve had a better experience.

The complaint alleges violations of the Federal Wiretap Act and the California Penal Code. The timeline of the matter relative to Apple’s retention policy for Siri data is really important, and as always, it’s one of the themes that we discuss here on the Case of the Week. Prior to the lawsuit being filed, Apple retained data for Siri as follows (I’ve sketched this tiered schedule in code just after the list):

  • Siri requests were associated with a random, device-generated identifier known as the Assistant ID, and the recordings were saved for six months;
  • After six months, the Assistant ID was disassociated from the Siri recordings and the recordings were saved for an additional 18 months; and
  • A much smaller subset of data was selected for grading by humans to determine whether Siri heard the request correctly, and that smaller subset was retained for five years.
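
For those of us who think better in code, here is a minimal, purely hypothetical sketch of that tiered schedule in Python. Nothing in it reflects Apple’s actual systems; the durations simply mirror the three tiers described above, and every name and function is my own invention for illustration.

from datetime import datetime, timedelta

# Hypothetical illustration only; constants mirror the tiers described in the decision.
ASSISTANT_ID_WINDOW = timedelta(days=6 * 30)      # ~6 months with the Assistant ID attached
TOTAL_RECORDING_WINDOW = timedelta(days=24 * 30)  # ~24 months total (6 + an additional 18)
GRADING_WINDOW = timedelta(days=5 * 365)          # ~5 years for the small graded subset

def retention_action(recorded_at, now, selected_for_grading):
    """Return what the tiered schedule would do with a recording at a given point in time."""
    age = now - recorded_at
    if selected_for_grading:
        return "retain for grading" if age < GRADING_WINDOW else "delete"
    if age < ASSISTANT_ID_WINDOW:
        return "retain with Assistant ID"
    if age < TOTAL_RECORDING_WINDOW:
        return "disassociate Assistant ID, retain recording"
    return "delete"

# A recording from around the filing of the complaint, checked two and a half years later:
print(retention_action(datetime(2019, 8, 7), datetime(2022, 2, 7), selected_for_grading=False))  # delete
print(retention_action(datetime(2019, 8, 7), datetime(2022, 2, 7), selected_for_grading=True))   # retain for grading

The point of the sketch: by early 2022, anything not pulled into the graded subset had already aged out of the first two tiers, which is exactly the small data set we will see left by the time the sanctions motion is litigated.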

On July 26, 2019, while that three-step process was in effect, an article in The Guardian reported that Apple had been recording its customers’ confidential and private conversations without their consent and had been storing those recordings and sending them to humans for review.

A week after that article came out, on August 2, 2019, Apple announced that it would only store audio recordings of Siri interactions for users who opted in to contribute audio samples of their requests to improve Siri. Five days later, on August 7, 2019, plaintiffs filed their complaint, triggering Apple’s duty to preserve.

Three weeks later, on August 28, 2019, Apple announced details for the opt-in program for Siri that would implement the following changes:

  • It would no longer retain audio recordings of Siri interactions. Instead, Apple would use computer-generated transcripts to help improve Siri;
  • Users could opt in to help improve Siri by letting it learn from audio samples of their requests; and
  • Only Apple employees would be allowed to listen to the audio samples of the Siri interactions for users who opted in.

So instead of contractors, we’re now talking about Apple employees. Apple implemented this new policy, announced in August, in October 2019 — two months after the complaint was filed.

Now, ostensibly due to motion practice, plaintiffs did not serve discovery until February and September of 2022, roughly two and a half to three years after the complaint was filed. In those discovery requests, they sought data related to audio recordings and transcripts from false accepts that Apple stored on its servers. To come up with a plan for producing this data, the parties met and conferred more than 15 times to try to arrive at a sampling proposal given the alleged volume of the data.

More than a year after those discovery requests were served, in November 2023, Apple “alluded to the fact” that it had not retained the Siri recordings. This is the first red flag of potential spoliation. Keep in mind that this entire time, Apple had been negotiating with the plaintiffs over a sampling protocol for Siri data that, it turns out, did not exist. A couple of months later, on January 12, 2024, Apple moved for a protective order “seeking relief from the burden of retaining all Siri recordings (and associated data) for all Siri recordings made worldwide.”

Timing-wise, that is well over a year after discovery was served before Apple finally moved for a protective order on the scope of all the Siri data, which it claimed was so voluminous. In its motion papers on the protective order, Apple made assurances to the Court that “[p]laintiffs would not be prejudiced by Apple deleting data in accordance with its retention policy because Apple offered to provide Plaintiffs with a sample of Siri requests, including over several hundred thousands of Siri requests and over one hundred hours of Siri audio recordings.”

Two weeks later, on January 26, 2024, the plaintiffs informed the Court that Apple had destroyed millions of class members’ data by not suspending its auto-deletion policies when this litigation commenced, and noted that Apple’s deletion of this data would be the subject of a later joint discovery letter brief.

A month later, before the motion for sanctions was filed, the Court granted Apple’s request for a protective order, relying on Apple’s representation “that Plaintiffs’ proposal would require the production of hundreds of millions or billions of Siri requests.”

All of these facts lead to the conclusion that two and a half years after the complaint was filed in this case, Apple maintained only a small subset of data from Siri users — the data that was disclosed to reviewers in its grading process. The decision specifically redacts the numerical value placed on the volume of data sent for grading, so there is no way to know what percentage or volume of requests was actually sent for grading. That limited data set precluded plaintiffs from being able to leverage the data that previously existed to prove the allegations in the complaint.

With that factual basis, we’re now before the Court on whether or not Apple’s deletion of its Siri data is sanctionable.


Analysis


The analysis for this week’s case dovetails perfectly with our discussion from last week, which looked carefully at the difference between subsections (e)(1) and (e)(2) of Federal Rule of Civil Procedure 37, the rule that deals with sanctions for failure to preserve. The Court begins here with the prerequisites for sanctions under Rule 37(e): whether the data should have been preserved, whether Apple took reasonable steps to preserve it, and whether it could be replaced.

Applying that analysis to the facts, the Court deals rather swiftly with those issues, finding that the data was relevant, that Apple had a duty to preserve it, and that it could not be replaced. Importantly, though, the Court notes that Apple should have asked the Court at the outset of the case whether it was required to retain the Siri data instead of making the determination to delete data on its own. That’s an important takeaway for you and your clients.

The Court also cited to previous decisions sanctioning Apple for failure to preserve data and stated that “Apple is well aware from previous orders in cases in this District that it had an obligation to suspend its auto-deletion policy under its retention policy for relevant evidence, which is a broad standard.”

The Court then turned to the prejudice required under Rule 37(e)(1) for sanctions. The Court found that Apple had prejudiced plaintiffs by “deleting the overwhelming majority of the data, and by saving limited fields with respect to the recordings it did save, Apple has precluded Plaintiffs from any opportunity to sort through the data on their own or to draw valuable conclusions from the data that was saved.”

The Court then turned to the intent standard under Rule 37(e)(2) and found that whether Apple intentionally deleted the data was a close call, and that the question should be submitted to a jury to decide whether Apple acted with intent. In making that call, the Court acknowledged that “a passive failure to halt an automatic deletion process, without more, often does not rise to a reasonable inference of intent.”

On the other hand, with regard to the intent issue, the Court noted 1) that Apple changed its retention policy to substantially narrow the preservation of ESI and did not begin implementing those changes until after it was served with the complaint; and 2) that Apple had previously been sanctioned for failing to put a litigation hold in place in those Northern District of California decisions I mentioned earlier.

The Court also pointed to Apple’s misleading of the Court regarding the volume of Siri data it had retained and the sufficiency of the sampling it had proposed, basically calling Apple out by saying, hey, here’s what you told the Court, and it turns out it’s not true.

Having left the intent question for the jury, the Court then turned to the sanctions to be levied under Rule 37(e)(1), and they are pretty harsh. So even without a finding of intent, we’re dealing with some very harsh sanctions. I’m going to give you the quote from the Court here on how it sanctioned Apple:

Apple should be precluded from affirmatively arguing or otherwise using Plaintiffs’ failure to make certain showings that they could have made if they had access to the deleted Siri data. For example, Apple cannot argue that named Plaintiffs lack standing to sue because they have no proof that they were subject to false triggers, cannot argue against class-wide damages based on the number of false recordings, and cannot argue against Apple’s intent based on the number of, and instances of repeated, false recordings. Apple shall not be allowed to introduce evidence about the data it destroyed or to rely on the absence of the data it destroyed in challenging class certification, Plaintiffs’ damages expert, in moving for summary judgment, or at trial. Moreover, as noted above, this list is not exhaustive. The presiding judge may determine during litigation, including at trial, that Apple’s failure to preserve ESI has prejudiced Plaintiffs in ways that cannot be anticipated now.

Additionally, because the issue of Apple’s intent in the spoliation should be sent to the jury, the jury should be instructed that, if it finds that Apple acted with the intent to deprive Plaintiffs of the information’s use in the litigation, the jury may infer from the loss of the information that it was unfavorable to Apple. However, the jury may make this finding if, and only if, it first makes a finding that Apple’s deletion of the ESI was intentional.

Takeaways


With all of that, what are our takeaways here?

If you have a case like this one that is completely dependent on specific data from the other side, you need to take proactive steps to ensure that it is preserved, and you need to go to the court if you have to. The Siri data here was highly relevant, and Apple even published publicly that it was changing its retention policy after the complaint was filed. In these types of cases, you have to be 100% aware of what’s happening and be prepared to go to the court.

While crippling sanctions were awarded, Apple still got a pass from the Court here on the intent issue. That means the plaintiffs are going to have to go all the way to trial to find out whether dispositive or terminating sanctions would even be awarded here. I’m not even sure that they can be once the question of intent goes to the jury.

Next, we’re dealing with asymmetrical litigation here and potentially huge volumes of data. We often see courts in these situations evaluating whether the scope of the data sought by plaintiffs is really necessary, and that’s what spurred the Court to grant a protective order here. Remember that the protective order was based solely on Apple’s representations to the Court, and the Court relied very heavily on those representations, which ultimately turned out to be false.

While the Court did not find the intent needed to order terminating sanctions under Rule 37(e)(2), the sanctions it did order are crippling. The Court’s order here may remove barriers to standing and class certification that could have been a huge hurdle for plaintiffs, and it appears that Apple cannot rely on any information about the Siri data that did exist to mount its defense. The Court also left open the ability of the presiding judge to levy further sanctions during trial and laid out the instruction to the jury regarding the finding of intent. Those are crippling sanctions for Apple in this particular situation. They really reduce Apple’s ability to get rid of this case on early motions — whether a motion challenging standing, a motion to dismiss, or a motion for summary judgment — because Apple does not have this data.

I agree with the Court here that intent is a very close call, and I’m basing that agreement on the vast number of decisions on Rule 37 sanctions for failure to preserve that I read every year. Loss of data through the routine, good-faith operation of an electronic information system was the original safe harbor in Rule 37 before the Rule’s 2015 amendment. I think the question here is whether this was, in fact, a good-faith operation of Apple’s systems.

What is not articulated in this decision, and what I have a question about, is how the jury will come by enough facts to make this determination. So we’ll have to see how that plays out, if the parties get that far in the litigation. But by pushing the question to the jury instead of holding an extensive evidentiary hearing, the Magistrate Judge effectively grants Apple a reprieve until trial, allowing it to string out the case and force the plaintiffs to incur the costs of taking the case to trial to get an answer on the intent question. It will be really interesting to see whether this case becomes another example of what we call “discovery on discovery,” as plaintiffs try to ferret out whether Apple intentionally crafted its revised retention policy, and the resulting data losses, to avoid producing the data here.

This is an important decision to review with your clients when talking about putting retention policies in place and how to handle that retention when the duty to preserve arises. You need a relationship with your client that allows you to go directly to the source of the data, find out what the options and potential costs are to preserve it, and present constructive options to the court for how to proceed. It has been 18 years since the Federal Rules of Civil Procedure were amended to provide for the duty to preserve, and we are still dealing with the most basic preservation questions in ediscovery. This is one you can work with your clients on to prevent — if you want to.
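
To make that concrete, here is a minimal, purely hypothetical sketch of the mechanism a litigation hold is supposed to trigger on the systems side: the routine deletion job checks for active holds before it purges anything. This is an illustration built on my own assumptions and invented names, not a description of Apple’s systems or any particular vendor’s tool.

from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List

@dataclass
class Record:
    record_id: str
    created: datetime
    source: str

@dataclass
class LegalHold:
    matter: str
    issued: datetime
    applies_to: Callable[[Record], bool]  # predicate defining the hold's scope

def safe_to_delete(record: Record, retention_expired: bool, holds: List[LegalHold]) -> bool:
    """Delete only when retention has expired AND no active legal hold covers the record."""
    if not retention_expired:
        return False
    return not any(hold.applies_to(record) for hold in holds)

# Once a complaint triggers the duty to preserve, a hold scoped to the relevant data is
# registered, and the routine deletion job skips those records instead of purging them.
hold = LegalHold(
    matter="Hypothetical Siri litigation hold",
    issued=datetime(2019, 8, 7),
    applies_to=lambda r: r.source == "siri_recordings",
)
old_recording = Record(record_id="abc123", created=datetime(2019, 6, 1), source="siri_recordings")
print(safe_to_delete(old_recording, retention_expired=True, holds=[hold]))  # False

The design point is the order of operations: the hold check sits in front of the delete, so suspending auto-deletion is a decision made when the duty to preserve arises, not a cleanup exercise after the data is gone.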

