1:21-cv-21940
S.D. Fla.
Jun 26, 2025
OMNIBUS ORDER ON MOTION FOR SUMMARY JUDGMENT AND DAUBERT MOTIONS
I. BACKGROUND
A. Material Facts
i. The Subject Collision
ii. The 2019 Tesla Model S
iii. McGee's Knowledge of Autopilot
iv. Causation Regarding McGee
v. Investigation into Tesla and Recall
vi. Tesla's Representations Before and After McGee Purchased the Vehicle
B. The Parties' Motions
II. LEGAL STANDARD
A. Daubert Standard
B. Rule 56(a)—Summary Judgment Standard
III. DISCUSSION
A. Tesla's Daubert Motion
i. Admissibility of Alan Moore's Expert Testimony
a. Moore's Qualifications
b. Reliability of Moore's Methods
i. Moore's Opinion Regarding the DMS Defect
ii. Moore's Opinion Regarding the ODD Defect
iii. Moore's Opinion Regarding the TACC Defect
iv. Moore's Opinion Regarding Insufficient Training
v. Moore's Opinion Regarding the Beta Software Defect
c. Helpfulness of Moore's Testimony
ii. Admissibility of Dr. Mary Cummings' Expert Testimony
a. Cummings' Qualifications
b. Reliability of Cummings' Opinions
i. Cummings' Opinion Regarding ODD Defect
ii. Cummings' Opinion Regarding DMS Defect
iii. Cummings' Opinion Regarding Defective Triggering of FCW and AEB
iv. Cummings' Opinion Regarding Tesla's Failure to Warn and Train
c. Helpfulness of Cummings' Opinions
iii. Whether Plaintiffs' Expert Testimony is Duplicative
B. Plaintiffs' Daubert Motions
i. Plaintiffs' Motion to Exclude Expert Testing and Testimony of Ryan Harrington
C. Tesla's Motion for Summary Judgment
i. Design Defect
ii. Failure to Warn
iii. Manufacturing Defect
iv. Negligent Misrepresentation
v. Punitive Damages
IV. CONCLUSION

NEIMA BENAVIDES, as Personal Representative of the Estate of Naibel Benavides Leon, deceased, v. TESLA, INC., a/k/a Tesla Florida, Inc.

Case No. 21-cv-21940-BLOOM/Torres

UNITED STATES DISTRICT COURT SOUTHERN DISTRICT OF FLORIDA

June 26, 2025

OMNIBUS ORDER ON MOTION FOR SUMMARY JUDGMENT AND DAUBERT MOTIONS

THIS CAUSE is before the Court upon Defendant Tesla, Inc.’s (“Tesla”) Motion for Summary Judgment, ECF No. [326], and Tesla’s Motion to Exclude Plaintiffs’ Experts Moore, Cummings, and Pettingill, ECF No. [318]. Plaintiffs Neima Benavides, as personal representative of the Estate of Naibel Benavides Leon, and Dillon Angulo (“Plaintiffs”), filed Responses in Opposition to both of Tesla’s Motions, ECF No. [352] (“Response to Motion for Summary Judgment”); ECF No. [347] (“Response to Tesla’s Motion to Exclude Plaintiffs’ Experts”), to which Tesla filed Replies, ECF No. [378] (“Reply to Motion for Summary Judgment”); ECF No. [377] (“Reply to Exclude Plaintiffs’ Experts”). Plaintiffs also filed a Sur-Reply to Tesla’s Motion for Summary Judgment (“Sur-Reply”). ECF No. [407]. Also before the Court for consideration is Plaintiffs’ Motion to Exclude Expert Testing and Testimony of Tesla Expert Ryan Harrington, ECF No. [322]. Tesla filed a Response in Opposition, ECF No. [349], to which Plaintiffs filed a Reply. ECF No. [375].

The Court has reviewed the Motions, the supporting and opposing submissions, and the record, and is otherwise fully advised. For the reasons that follow, Tesla’s Motion for Summary Judgment is granted in part and denied in part, Tesla’s Motion to Exclude is granted in part and denied in part, and Plaintiffs’ Motion to Exclude is granted in part and denied in part.

I. BACKGROUND

This matter arises from a collision that occurred in Key Largo, Florida. George McGee owned a 2019 Tesla Model S (“Vehicle”) “equipped with automatic driving features, one of which Tesla called ‘Autopilot,’ that could navigate without driver input.” ECF No. [205] at 3. On April 25, 2019, McGee was driving the Vehicle when it hit a parked Chevrolet Tahoe, which then struck Decedent Naibel Benavides Leon and Plaintiff Dillon Angulo, killing Benavides Leon and causing significant injuries to Angulo. See id. at 3-4.

On April 22, 2021, Plaintiff Neima Benavides, as Personal Representative, brought this action against Tesla on behalf of the Estate of Decedent Naibel Benavides Leon in the Circuit Court for Miami-Dade County, Florida, alleging automotive product liability claims against Tesla. ECF No. [1-1]. Tesla removed the action to this Court on May 25, 2021. ECF No. [1]. On August 16, 2022, Plaintiff Angulo initiated a similar automotive products liability action against Tesla in this district, Case No. 22-cv-22607-KMM. See 22-cv-22607, ECF No. [1]. The Court accepted the transfer of that case and consolidated both actions due to the overlapping issues presented. ECF No. [50].

Plaintiffs filed a consolidated Amended Complaint on March 11, 2024, asserting the following claims against Tesla: Strict Products Liability Defective Design (Count I), Failure to Warn (Count II), Defective Manufacture (Count III), and Negligent Misrepresentation (Count IV). See ECF No. [205] at 6, ¶¶ 39-46.

A. Material Facts

Based on the Parties’ briefings and the evidence in the record, the following facts are not materially in dispute unless otherwise noted.

i. The Subject Collision

George McGee purchased his 2019 Model S Tesla (“Vehicle”) in early 2019. See ECF No. [318-9] at 33. “The [s]ubject [collision] occurred on April 25, 2019[,] at a ‘T’ intersection on Card Sound Road in Key Largo[,] Florida,” just a few miles from McGee’s home. ECF No. [325] at ¶¶ 1-2. Card Sound Road is a “two-lane undivided rural road, with unpaved shoulders and drop-offs in places, several curves, and no roadside lighting for most of the road.” ECF No. [351] at ¶ 112. Pedestrians often use the road, and vehicles are often “stopped on the roadside.” Id.

Before he reached the intersection of Card Sound Road, McGee had activated the Vehicle’s Autopilot, including the Traffic Aware Cruise Control (“TACC”), which, among other things, restricts the Vehicle’s speed to 45 miles per hour when the Vehicle is not operating on a highway or limited access roadway. ECF No. [325] at ¶ 3; ECF No. [351] at ¶ 117. However, McGee subsequently manually engaged the Vehicle’s accelerator, increasing the Vehicle’s speed to 62 miles per hour and temporarily disengaging the TACC speed restrictions while leaving certain Autopilot features operational. ECF No. [351] at ¶¶ 3, 157. The Parties dispute which specific Autopilot features remained active once McGee pressed the accelerator. Most notably, the Parties disagree whether Autopilot’s longitudinal control function and the automatic emergency brake function were deactivated in the moments leading up to the collision. See ECF No. [379] at ¶ 126; see also ECF No. [351] at ¶ 3.

As McGee continued driving toward the Card Sound Road intersection, he dropped his cell phone and immediately reached down to pick the phone off the floorboard. While McGee was reaching for his phone, the Vehicle detected a stop sign, a stop bar, the road’s edge, a pedestrian, and a parked Chevrolet Tahoe, but the Vehicle did not provide McGee with any audio alert or other warning of the obstacles and never engaged its emergency brakes. Id. at ¶¶ 8, 125. Because McGee also failed to observe the traffic signs and “obstacles,” McGee drove through the intersection, failing to brake before striking the side of the parked Tahoe, “which in turn was pushed into two pedestrians”—killing Decedent Naibel Benavides Leon and seriously injuring Angulo. ECF No. [325] at ¶¶ 5-6, 8; ECF No. [351] at ¶¶ 29, 118.

Shortly after the crash, McGee called 911, telling the operator: “Oh my God, I wasn’t looking,” “I don’t know what happened. I ended up missing the turn. I was looking down,” and “I dropped my phone. Oh my God.” Id. at ¶ 8. Officers eventually arrived on the scene, and McGee told them, “I was driving. I dropped my phone and looked down and I ran the stop sign and hit the guy’s car.” Id. at ¶ 9. McGee stated to officers, “[i]t was actually because I was driving [ ]. I looked down and I’ve been using cruise control, and I looked down, I didn’t realize (INAUDIBLE) and then I [ ] sat up. The minute I sat up[,] I hit the brakes and saw his truck.” Id. at ¶ 9. McGee acknowledged that the road “signs were visible if he had looked up” and “that, had he been watching the road, [he would have] had a clear and unobstructed view of the ‘t’ intersection for a ‘long distance’—at least 1,000 feet.” Id. at ¶¶ 11-12. According to McGee, “there was nothing that prevented him from acting to prevent the crash.” Id. at ¶ 10.

ii. The 2019 Tesla Model S

“The Vehicle is equipped with a set of driver assist features collectively known as ‘Autopilot,’” which the Parties agree includes “(a) [Traffic Aware Cruise Control] [“]TACC[”], an adaptive cruise control system that helps drivers maintain a safe distance behind a detected vehicle in the same lane—to maintain the posted speed limit” and “(b) Autosteer, which provides lane centering,” thereby providing “both lateral and longitudinal control of the vehicle to assist the driver.” ECF No. [325] at ¶ 13; ECF No. [351] at ¶ 100. The Vehicle was also equipped with Forward Collision Warning (“FCW”) and Automatic Emergency Braking (“AEB”). ECF No. [325] at ¶ 13; ECF No. [351] at ¶ 101. “The Autopilot suite of features is designated by the Society of Automotive Engineers (SAE) as a Level 2 [Advanced driver-assistance system] [“]ADAS[”] System.” ECF No. [325] at ¶ 14; see also ECF No. [351] at ¶ 100. When using the Level 2 system, the SAE recommends that the driver always remain in control of the vehicle, and therefore, a driver “must still brake, accelerate, and steer just as if the system is not engaged, and retains responsibility to always keep his hands on the wheel and his eyes on the road.” ECF No. [325] at ¶¶ 15-16.

“Before using Autosteer for the first time, the driver must, while parked, enable the feature on the center touchscreen.” Id. at ¶ 17. “Thereafter, each time a user engages Autosteer, the vehicle displays a message instructing him to keep his hands on the steering wheel and to remain prepared to take over.” Id. at ¶ 21. “When a driver overrides Traffic Aware Cruise Control, a visible alert is issued after six seconds of override on the dash that TACC will not brake.” Id. at ¶ 28. In addition to those warnings, Tesla provides an electronic Owner’s Manual on the Vehicle’s touchscreen. Id. ¶ 22. “The Owner’s Manual contains several pages of warnings about the capabilities and limitations of the Autopilot features,” including a warning about Autosteer, which states:

Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic . . . . Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause serious property damage, injury or death.

Id. at ¶ 27. “Despite the [O]wner’s [M]anual statements about where Autopilot should not be used, Tesla allows Autopilot to be engaged on two-lane county roads, like that of Card Sound Road, [ ] the road McGee was driving on just prior to the crash.” ECF No. [351] at ¶ 152.

iii. McGee’s Knowledge of Autopilot

Prior to McGee purchasing the Vehicle, “Tesla aired a commercial stating that ‘[t]he person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.’” ECF No. [351] at ¶ 109. Elon Musk (“Musk”), the CEO of Tesla, later “stated in a [2017] TED talk that, in about two years, Tesla owners could actually sleep while their cars drove themselves.” Id. at ¶ 111.

Although there is no evidence that McGee saw the commercial or that he heard or read any statements Elon Musk had made about the 2019 Tesla Model S, “McGee testified that his beliefs about the capabilities of Autopilot came from ‘looking at information on the [V]ehicle’ . . . [and] that he likely watched videos online or on Tesla’s website about the [V]ehicle’s features and how they work . . . [including] [o]ne video show[ing] Tesla[’s] drivers operating the vehicle without their hands.” ECF No. [351] at ¶ 39. Nevertheless, McGee does not believe Mr. Musk’s representations had any effect on his decision to purchase the Vehicle. ECF No. [325] at ¶ 39. McGee testified he purchased the Vehicle because it was equipped with the most advanced autonomous driving package, which he thought “would assist in his 100-mile drive from Boca Raton to Key Largo.” ECF No. [351] at ¶¶ 98-99.

Following the collision, however, McGee testified that “he understood that Autopilot did not make the car ‘self-driving’” and “that it was his ‘responsibility as the driver of the [V]ehicle—even with Autopilot activated—to drive safely and be in control of the [V]ehicle at all times.’” Id. at ¶¶ 30-31. McGee also acknowledged that “he understood before the crash that it was his job—not the [V]ehicle’s job—to detect and to react to a stop sign or red flashing light,” and that it was his responsibility to be aware of his surroundings, the speed limit, and other traffic control devices on the road. Id. at ¶¶ 33-35.

Furthermore, McGee understood that “the [V]ehicle would not be able to stop under all circumstances” and, in particular, the Vehicle “would not detect or stop at signs [ ] and stoplights.” Id. at ¶¶ 36, 38. Therefore, it was ultimately McGee’s responsibility to do that job and “manage the technology.” Id. at ¶¶ 37-38.

Despite McGee’s acknowledgment that he was ultimately responsible for controlling and operating the Vehicle, the Parties dispute what McGee expected the Vehicle to do when the Autopilot function was engaged. According to Plaintiffs, “McGee believed that the Vehicle’s features would ‘keep him in the lane, avoid crashes, [ ] direct him to where he needed to go’ . . . help him see any ongoing traffic[,] and would help him stop or turn or avoid collisions.” ECF No. [351] at ¶¶ 103-04. McGee also thought “that when the Vehicle was on Autopilot[,] it ‘would stop regardless of any car . . . [and i]f there was a parked car, it would stop and not hit it.’” Id. (quoting ECF No. [318-9] at 108:15-23). Therefore, because the Vehicle did not prevent the collision, McGee believes the “[V]ehicle’s [A]utopilot’s system, automatic emergency braking system, and front collision warning malfunctioned at the time of the accident.” ECF No. [351] at ¶ 120 (quoting ECF No. [318-9]) (internal quotations omitted).

iv. Causation Regarding McGee

“[W]ith a Level 2 vehicle, like the 2019 Model S, ‘the operator of the vehicle is in control of the vehicle and responsible for what occurs in the vehicle.’” ECF No. [325] at ¶ 43 (quoting ECF No. [318-2] at 24:11-20). Not only is the operator responsible for the vehicle, he or she is also “legally obligated to drive the vehicle and obey all traffic laws.” Id. at ¶ 44.

On the day of the collision, McGee did not fulfill his role as a Level 2 driver, and if he had, the Parties agree the collision “could have been avoided or mitigated.” Id. at ¶¶ 46, 48. Notwithstanding his delayed reaction and his failure to observe multiple traffic signals, “in the last five seconds, McGee was [still] in a better po[sition] than Tesla to avoid the crash.” Id. at ¶¶ 49-50 (quoting ECF No. [318-2] at 214:15-18) (internal quotations omitted).

“During McGee’s three-month ownership of his Tesla, he experienced 23 ‘strikeouts’ due to inattentive driving, many less than 10 minutes apart.” ECF No. [351] at ¶ 107 (quoting ECF No. [318-1] at 10). “When McGee received a strikeout, he frequently pulled over, placed the [V]ehicle in Park and back into Drive, then continued the trip with Autopilot back in use. On the final drive cycle, on which the collision occurred, McGee had already received one strikeout and a total of five audible warnings. He was [one] chime away from another strikeout prior to impact.” ECF No. [351] at ¶ 108 (quoting ECF No. [318-1] at 10). “Although Tesla recorded McGee’s abuse and misuse of the Autopilot system, [Tesla] did not change his behavior, provide additional training, or significantly restrict his use of the feature.” ECF No. [351] at ¶ 133.

Plaintiffs’ expert, Alan Moore, acknowledges that “drivers get distracted using cell phones all the time[,]” and “driving while using a cellphone is not limited to Tesla drivers who use Autopilot.” ECF No. [325] at ¶¶ 51-52. Moore concedes that “using cell phones or taking one’s eyes off the road while driving occurred before [the invention of] Autopilot.” Id. at ¶ 53. Therefore, “McGee could still have been distracted by his phone and crashed into the Tahoe [even] if Autopilot was unavailable” in the Vehicle. Id. at ¶¶ 55, 61. In fact, other drivers still “crash and run the stop sign at the Card Sound Road intersection without Autopilot.” Id. at ¶ 54. Moore further acknowledges he never conclusively “opined that automatic emergency braking should have triggered in this case.” Id. at ¶ 76 (quoting ECF No. [318-3] at 108:15-110:12 and citing ECF No. [318-2] at 246:23-247:6).

Plaintiffs’ experts also testified “that disabling Autopilot or designing Autopilot to not engage outside of its [Operational Design Domain] [“ODD”] would not have prevented McGee from accelerating the [V]ehicle,” and ultimately, the collision resulted from McGee’s complacency and confusion about Autopilot’s capabilities. Id. at ¶¶ 57-58. According to Plaintiffs’ expert, Dr. Mary Cummings, although McGee did not testify as such, “McGee ‘[thought] this car [wa]s gonna pick up his slack . . . And he fe[lt] like [if] he drop[ped] his phone, what’s the big deal? My car’s got it[.]’” Id. at ¶ 66 (quoting ECF No. [318-6] at 246:21-247:1, 247:16-248:15). Notwithstanding her conclusion that the collision was a result of McGee’s confusion about Autopilot, Cummings acknowledged she had “no scientific study” or “empirical data” to support the opinion that “Tesla’s Autosteer Beta message” or “Tesla’s use of the term Autopilot was confusing to users.” Id. at ¶¶ 68-69.

v. Investigation into Tesla and Recall

“In 2016, the National Transportation Safety Board (‘NTSB’) launched an investigation into a fatal crash involving a [different] Tesla [vehicle] that was operating under Autopilot at the time of the collision.” ECF No. [351] at ¶ 83. In the report, “NTSB issued various recommendations for manufacturers of vehicles equipped with Level 2 automation systems such as:

[1] Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed; and

[2] Develop applications to more effectively sense the driver‘s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.

Id. at ¶ 85. NTSB’s report and recommendations were sent to six automakers, including Tesla. However, Tesla did not respond to the report. See ECF No. [379] at ¶ 85 (citing ECF No. [380] at 2-4).

Even though Tesla contends the National Highway Traffic Safety Administration (“NHTSA”) ultimately rejected NTSB’s recommendations, NHTSA initially launched an investigation into Tesla’s Autopilot system in 2017 and eventually expanded the investigation to “understand how Tesla’s Autopilot system ‘may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.’” ECF No. [351] at ¶¶ 88-89 (quoting ECF No. [252] at 3; ECF No. [252-2] at 1-2). NHTSA concluded that Tesla’s “Autopilot controls did not sufficiently ensure driver attention and appropriate use,” but at the same time, invited “greater confidence via its higher control authority and ease of engagement.” Id. at ¶ 90 (quoting ECF No. [252-2] at 2). According to NHTSA, this “mismatch of weak usage controls and high control authority” indicated driver disengagement. Id. at ¶ 91.

NHTSA reported “that ‘drivers involved in the crashes were not sufficiently engaged in the driving task and the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task[,]’ [and a]s a result, ‘crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances.’” Id. at ¶ 92 (quoting ECF No. [252-2] at 2).

Furthermore, NHTSA determined “that the use of the term ‘Autopilot’ is itself misleading, because it ‘elicits the idea of drivers not being in control.’” Id. at ¶ 93 (quoting ECF No. [252-2] at 7). Consequently, NHTSA suggested that using the term “Autopilot” “may lead drivers to believe that the automation has greater capabilities than it does and invites drivers to overly trust the automation.” Id. Because of these risks, NHTSA noted that “non-TESLA vehicles using similar technology ‘generally use more conservative terminology like assist . . . to imply that the driver and automation are intended to work together, with the driver supervising the automation.’” Id.

In response to NHTSA‘s preliminary findings, Tesla initiated a voluntary recall of vehicles with the SAE Level 2 advanced driver-assistance feature. Id. at ¶ 94. “Despite the recall, Tesla did not restrict or ‘geofence’ Autopilot and Autosteer to the kinds of roads on which the technology was designed to operate.” Id. at ¶ 96. Following the recall, NHTSA opened a new investigation to determine the adequacy of the recall. In a letter to Tesla related to the investigation, NHTSA noted that “there had been 20 post recall crashes, 14 of which involved front end crashes or inadvertent disengagement[.]” Id. at ¶ 97.

vi. Tesla’s Representations Before and After McGee Purchased the Vehicle

Prior to the collision, Tesla and its CEO, Elon Musk, made several representations about Tesla’s Autopilot system and autonomous vehicles. Approximately three and a half years before the subject collision, Musk stated that:

The forward-facing camera [in Tesla vehicles] is able to determine where the lanes are, where the cars are ahead of it, and it’s also able to read signs. It’s been able to read speed signs for a while, for example but it’s able to read pretty much any sign. Then that’s combined with the forward radar. The radar is very good at detecting fast moving large objects, and it can actually see through fog, rain, snow, and dust. So the forward radar gives the car superhuman sensors. It can see through things that are close to the car.

ECF No. [351] at ¶ 188 (quoting ECF No. [350-2] at 10). Musk later emphasized the safety benefits of Autopilot, claiming “approximately half a million people would have been saved if the Tesla[’s] [A]utopilot was universally available.” Id. at ¶ 189. Regarding the improvements to braking technology, Musk made the following statement in September 2016:

[P]articularly anything large, or metallic, or anything that’s dense, the radar system in the car we’re confident will be able to detect that and initiate a braking event. Both when the Autopilot is active and when it is not active. When the Autopilot is not active, not activated, it will operate in an emergency braking mode. So, in that case, it’s more likely to mitigate the impact speed because if Autosteer is not on, it doesn’t know whether the driver is actually going to turn out-of-the-way of an obstacle or not. So, it will only brake at the very last second. If Autosteer is turned on, the car computer knows what its probable path is and whether it will actually turn in time or not. And so, it will be a much more comfortable braking experience as opposed to the last-minute and in that case, we think most likely we will be able to brake to a complete stop instead of simply mitigating the impact velocity. So, we think it probably works better with Autopilot on than off.

...

The exciting thing is that even if the vision system doesn’t recognize what the object is because it could be a very strange-looking vehicle, it could be a multi-car pileup, it could be a truck crossing the road, it really could be anything – an alien spaceship, a pile of junk metal that fell off the back of a truck. It actually doesn’t matter what the object is[,] it just knows that there’s something dense that it is going to hit - and it should not hit that.

ECF No. [351] at ¶ 190 (quoting ECF No. [350-2] at 12).

On October 19, 2016, Musk insisted “that Tesla vehicles driven on Autopilot were already safer than human drivers.” Id. at ¶ 193. He also claimed around the same time that he “would consider autonomous driving to be basically a solved problem.” Id. (quoting ECF No. [350-2] at 13). Musk’s proclamations continued as late as two days before the collision, when he claimed that Tesla’s Autopilot was so advanced that Autopilot did not require driver monitoring, and that in the immediate future “having a human co-pilot intervene in the driving task would decrease safety.”

In fact, I think it will become very, very quickly, maybe even towards the end of this year, but I would say, I’d be shocked if it’s not next year, at the latest, that having a human intervene will decrease safety. Decrease. Like imagine if you’re in an elevator. Now, it used to be that there were elevator operators. And you couldn’t go on an elevator by yourself and work the lever to move between floors. And now nobody wants an elevator operator because the automated elevator that stops at the floors is much safer than the elevator operator. And in fact, it would be quite dangerous to have someone with a lever that can move the elevator between floors.

....

I think [Autopilot] will require detecting hands-on wheel for at least six months or something like that from here, really, it’s a question of like, from a regulatory standpoint, what - how much safer than a person does auto-pilot need to be for it to be OK to not monitor the car?

....

If you have a system that’s at or below human level reliability, then driver monitoring makes sense. But if your system is dramatically better, more reliable than a human, then driving, monitoring is not - does not help much. And, like I said, just like you wouldn’t want someone - if you’re in an elevator, do you really want someone with a big lever, some random person operating the elevator between floors? I wouldn’t trust that. I would rather have the buttons.

ECF No. [351] at ¶ 195 (quoting ECF No. [350-2] at 6-7) (emphasis added by Plaintiffs).

In light of Musk’s statements and Tesla’s advertising, FTC Chairman Joseph Simon (“Simon”) urged the FTC to investigate “Tesla’s deceptive and unfair practices,” asserting that “[t]he marketing and advertising practices of Tesla combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of ‘self-driving.’” ECF No. [351] at ¶ 198 (emphasis removed) (quoting ECF No. [205] at ¶ 38; ECF No. [350-4]). Simon stated that such advertisements “are likely to deceive even diligent consumers, who would act reasonably in believing them, and are likely to use Autopilot differently than they would if Tesla employed more honest and transparent marketing and advertising strategies.” Id. (emphasis removed). Not only did Simon believe the advertising was confusing, “Tesla commissioned a survey in Germany about possible driver confusion with Autopilot—and [found] 47 out of 675 drivers thought the car could drive itself.” Id. at ¶ 168.

B. The Parties’ Motions

Tesla seeks to exclude Plaintiffs’ expert witnesses pursuant to Daubert, arguing that their proposed testimony is speculative and unreliable and that such testimony will not assist the jury in deciding the merits of Plaintiffs’ claims. ECF No. [318]. Plaintiffs argue that their experts rely on sound methods to reach their opinions, and that those opinions will be helpful to the jury in resolving critical issues in this case. ECF No. [347].

Plaintiffs have also filed a Daubert Motion of their own, which seeks to exclude the testing and testimony of Tesla’s expert, Ryan Harrington. ECF No. [322]. Plaintiffs argue that Harrington’s crash testing should be excluded as unduly prejudicial because the testing conditions were not substantially similar to those of the subject collision. See id. at 1. Tesla disagrees that Harrington’s first two phases of crash testing were attempts to recreate the collision. ECF No. [349] at 11. Therefore, Tesla argues, the first two phases of the crash testing and the accompanying testimony are admissible because the testing need only be relevant, not substantially similar, before it may be presented to the jury. See id. at 11-12. Tesla concedes that the third phase of testing was an attempted recreation of the crash; however, Tesla maintains the recreation conditions were substantially similar to those of the subject collision. See id. at 12-13.

In addition to the Daubert Motions, Tesla also seeks summary judgment as to Plaintiffs’ design defect, manufacturing defect, failure to warn, and negligent misrepresentation claims, as well as Plaintiffs’ request for punitive damages. See ECF No. [326]. Tesla contends all four counts and the request for punitive damages must be dismissed because there is no evidence demonstrating that Tesla proximately caused Plaintiffs’ injuries. Tesla argues Plaintiffs’ claims are also deficient as there is no evidence: (1) that Autopilot was defective; (2) that Tesla’s warnings were inadequate; (3) that Tesla owed Plaintiffs a duty to make truthful representations or warn of the product’s known or foreseeable dangers; or (4) that McGee ever read the warnings. Plaintiffs maintain there is more than sufficient evidence that Autopilot and the associated warnings were defective and that these defects proximately caused Plaintiffs’ injuries. ECF No. [352]. Plaintiffs also contend that Tesla owed a duty to warn and make truthful representations, and because Tesla’s failure to do so caused Plaintiffs’ injuries, they have viable failure to warn and negligent misrepresentation claims. See id.

II. LEGAL STANDARD

A. Daubert Standard

Federal Rule of Evidence 702 governs the admissibility of expert testimony. When a party proffers the testimony of an expert under Rule 702, the party offering the expert testimony bears the burden of laying the proper foundation, and that party must demonstrate admissibility by a preponderance of the evidence. See Rink v. Cheminova, Inc., 400 F.3d 1286, 1291-92 (11th Cir. 2005); Allison v. McGhan Med. Corp., 184 F.3d 1300, 1306 (11th Cir. 1999). To determine whether expert testimony or any report prepared by an expert may be admitted, the Court engages in a three-part inquiry, which requires the Court to consider whether: (1) the expert is qualified to testify competently regarding the matters the expert intends to address; (2) the methodology by which the expert reaches his or her conclusions is sufficiently reliable; and (3) the testimony assists the trier of fact, through the application of scientific, technical, or specialized expertise, to understand the evidence or to determine a fact in issue. See City of Tuscaloosa v. Harcros Chems., Inc., 158 F.3d 548, 562 (11th Cir. 1998) (citing Daubert v. Merrell Dow Pharms., Inc., 509 U.S. 579, 589 (1993)). The Court of Appeals for the Eleventh Circuit refers to each of these requirements as the “qualifications,” “reliability,” and “helpfulness” prongs. United States v. Frazier, 387 F.3d 1244, 1260 (11th Cir. 2004). While some overlap exists among these requirements, the Court must individually analyze each concept. See id.

Under Daubert, a district court must take on the role of gatekeeper, but this role “is not intended to supplant the adversary system or the role of the jury.” Quiet Tech. DC-8, Inc. v. Hurel-Dubois UK Ltd., 326 F.3d 1333, 1341 (11th Cir. 2003) (citations and quotation marks omitted). Consistent with this function, the district court must “ensure that speculative, unreliable expert testimony does not reach the jury.” McCorvey v. Baxter Healthcare Corp., 298 F.3d 1253, 1256 (11th Cir. 2002). “[I]t is not the role of the district court to make ultimate conclusions as to the persuasiveness of the proffered evidence.” Quiet Tech., 326 F.3d at 1341 (citations and quotation marks omitted). Thus, the district court cannot exclude an expert based on a belief that the expert lacks credibility. Rink, 400 F.3d at 1293 n.7. On the contrary, “vigorous cross-examination, presentation of contrary evidence, and careful instruction on the burden of proof are the traditional and appropriate means of attacking shaky but admissible evidence.” Quiet Tech., 326 F.3d at 1341 (quoting Daubert, 509 U.S. at 596). Thus, “[o]n cross-examination, the opposing counsel is given the opportunity to ferret out the opinion’s weaknesses to ensure the jury properly evaluates the testimony’s weight and credibility.” Vision I Homeowners Ass’n, Inc. v. Aspen Specialty Ins. Co., 674 F. Supp. 2d 1321, 1325 (S.D. Fla. 2009) (quoting Jones v. Otis Elevator Co., 861 F.2d 655, 662 (11th Cir. 1988)). Ultimately, “a district court enjoys ‘considerable leeway’ in making” evidentiary determinations such as these. Cook ex rel. Est. of Tessier v. Sheriff of Monroe Cnty., Fla., 402 F.3d 1092, 1103 (11th Cir. 2005) (quoting Frazier, 387 F.3d at 1258).

B. Rule 56(a)—Summary Judgment Standard

A court may grant a motion for summary judgment “if the movant shows that there is no genuine dispute as to any material fact and the movant is entitled to judgment as a matter of law.” Fed. R. Civ. P. 56(a). The parties may support their positions by citations to materials in the record, including depositions, documents, affidavits, or declarations. See Fed. R. Civ. P. 56(c). “A factual dispute is ‘material’ if it would affect the outcome of the suit under the governing law, and ‘genuine’ if a reasonable trier of fact could return judgment for the non-moving party.” Miccosukee Tribe of Indians of Fla. v. United States, 516 F.3d 1235, 1243 (11th Cir. 2008) (citing Anderson v. Liberty Lobby, Inc., 477 U.S. 242, 247-48 (1986)).

A court views the facts in the light most favorable to the non-moving party, draws “all reasonable inferences in favor of the nonmovant and may not weigh evidence or make credibility determinations[.]” Lewis v. City of Union City, Ga., 934 F.3d 1169, 1179 (11th Cir. 2019); see also Crocker v. Beatty, 886 F.3d 1132, 1134 (11th Cir. 2018) (“[W]e accept [the non-moving party‘s] version of the facts as true and draw all reasonable inferences in the light most favorable to him as the non-movant.” (citation omitted)).

“The mere existence of a scintilla of evidence in support of the [non-moving party‘s] position will be insufficient; there must be evidence on which a jury could reasonably find for the [non-moving party].” Anderson, 477 U.S. at 252. The moving party shoulders the initial burden of showing the absence of a genuine issue of material fact. Shiver v. Chertoff, 549 F.3d 1342, 1343 (11th Cir. 2008). Once this burden is satisfied, “the nonmoving party ‘must do more than simply show that there is some metaphysical doubt as to the material facts.‘” Ray v. Equifax Info. Servs., L.L.C., 327 F. App‘x 819, 825 (11th Cir. 2009) (quoting Matsushita Elec. Indus. Co., Ltd. v. Zenith Radio Corp., 475 U.S. 574, 586 (1986)). Instead, “the non-moving party ‘must make a sufficient showing on each essential element of the case for which he has the burden of proof.‘” Id. (quoting Celotex Corp. v. Catrett, 477 U.S. 317, 322 (1986)). Accordingly, the non-moving party must produce evidence, beyond the pleadings, and by its own affidavits, or by depositions, answers to interrogatories, and admissions on file, designate specific facts to suggest that a reasonable jury could find in the non-moving party‘s favor. Shiver, 549 F.3d at 1343. Even “where the parties agree on the basic facts but disagree about the factual inferences that should be drawn from those facts,” summary judgment may be inappropriate. Warrior Tombigbee Transp. Co., Inc. v. M/V Nan Fung, 695 F.2d 1294, 1296 (11th Cir. 1983).

“If more than one inference could be construed from the facts by a reasonable fact finder, and that inference introduces a genuine issue of material fact, then the district court should not grant summary judgment.” Bannum, Inc. v. City of Fort Lauderdale, 901 F.2d 989, 996 (11th Cir. 1990) (citation omitted).

III. DISCUSSION

A. Tesla‘s Daubert Motion

Tesla contends that Plaintiffs’ experts have not provided competent or reliable testimony that any of the alleged defects existed or that the defects proximately caused Plaintiffs’ injuries. ECF No. [318] at 6. Accordingly, Tesla seeks to preclude Alan Moore‘s and Dr. Mary Cummings’ expert testimony from being presented to the jury.

i. Admissibility of Alan Moore‘s Expert Testimony

Tesla asserts that Plaintiffs’ expert, Alan Moore, has offered the following opinions:

  1. Tesla‘s Driver Monitoring System (“DMS“) failed to sufficiently monitor driver awareness while Autopilot was engaged;
  2. Tesla failed to restrict the use of Autopilot to its “Operational Design Domain” (“ODD“);
  3. Autopilot should have warned the driver or applied the brakes when it detected obstacles;
  4. Tesla provided insufficient training on the use of Autopilot; and
  5. Tesla created a significant level of risk by using beta software.

ECF No. [318] at 10. Based on those opinions, Moore ultimately concludes that “the accident would not have occurred but for McGee‘s use of Autopilot.” Id. at 11. However, Tesla argues that Moore‘s opinions are not “supported by sound, scientific data, or methodology.” Id. According to Tesla, Moore fails to offer any facts or data to support his conclusion and fails to “outline a reliable methodology explaining how he arrived at the opinion.” Id. Instead, Tesla contends Moore relies exclusively on speculation to conclude that the Autopilot system is defective. Id. Tesla insists that Moore has never identified any facts or data that establish that “Autopilot was the deciding factor in causing this exact crash,” nor could he. Id. (emphasis removed). However, even if there is causation evidence, Tesla challenges whether Moore‘s defect opinions are reliable enough to be presented to the jury.

Plaintiffs argue that the fact that Moore “admitted it was ‘possible’ that McGee could have crashed even if his car had not been equipped with Autopilot” is rather unremarkable and does not impact the reliability of his proximate cause determination. ECF No. [347] at 14. According to Plaintiffs, as an expert, Moore need not rule out all possible explanations or causes of the injuries for his testimony to be reliable. An expert opinion will only be excluded pursuant to Daubert where the opposing party offers an alternative explanation for the cause of the injury, and the expert fails to explain why that alternative explanation is not the sole cause. Moreover, Plaintiffs argue that under Florida law, “a plaintiff is not required to prove that a particular outcome would have been different if the defendant had not committed misconduct in order to prove causation.” Id. at 16. Accordingly, Plaintiffs insist that neither proximate causation nor “‘but for’ causation [ ] require Plaintiffs [ ] to prove that McGee would have acted different if his car had not been equipped with Autopilot” as there is no requirement to prove causation with absolute certainty. Id. at 16. Furthermore, Plaintiffs contend that Moore‘s defect opinions are not only reliable but well supported by the facts in the record.

Because the admissibility of expert testimony requires the Court to determine whether (1) the expert is qualified; (2) the expert‘s methodology is sufficiently reliable; and (3) the testimony is helpful to the jury, the Court addresses each requirement in turn.

a. Moore‘s Qualifications

The Parties do not dispute that Moore is qualified to testify on the subject matter at issue, and the Court finds that Moore has the requisite qualifications.

b. Reliability of Moore‘s Methods

In determining whether an expert‘s testimony is reliable, “the trial judge must assess whether the reasoning or methodology underlying the testimony is scientifically valid and . . . whether that reasoning or methodology properly can be applied to the facts in issue.” Frazier, 387 F.3d at 1261-62 (citation and internal quotation marks omitted). To make this determination, the district court typically examines: “(1) whether the expert‘s theory can be and has been tested; (2) whether the theory has been subjected to peer review and publication; (3) the known or potential rate of error of the particular scientific technique; and (4) whether the technique is generally accepted in the scientific community.” See id. (citing Quiet Tech., 326 F.3d at 1341). The Eleventh Circuit has emphasized that these four factors are not exhaustive, and a court may need to conduct an alternative analysis to evaluate the reliability of an expert opinion depending “on the nature of the issue, the expert‘s particular expertise, and the subject of his testimony.” United States v. Brown, 415 F.3d 1257, 1268 (11th Cir. 2005); see also Frazier, 387 F.3d at 1262 (“These factors are illustrative, not exhaustive; not all of them will apply in every case, and in some cases other factors will be equally important in evaluating the reliability of proffered expert opinion.“). Consequently, trial judges are afforded “considerable leeway” in ascertaining whether a particular expert‘s testimony is reliable. See id. at 1258 (citing Kumho Tire Co., 526 U.S. at 152).

“Although an opinion from a non-scientific expert should receive the same level of scrutiny as an opinion from an expert who is a scientist, some types of expert testimony will not naturally rely on anything akin to the scientific method, and thus should be evaluated by other principles pertinent to the particular area of expertise.” Washington v. City of Waldo, Fla., No. 1:15CV73-MW/GRJ, 2016 WL 3545909, at *3 (N.D. Fla. Mar. 1, 2016) (citing Fed. R. Evid. 702, Advisory Committee Notes (2000)). However, an expert who offers opinions or testimony “based primarily on his experience, . . . must be able to explain ‘how that experience leads to the conclusion reached, why that experience is a sufficient basis for the opinion, and how that experience is reliably applied to the facts.’” Id. (quoting Fed. R. Evid. 702, Advisory Committee Notes (2000)). “An expert’s ‘unexplained assurance that [his] opinions rest on accepted principles’ is not enough.” Clena Invest., Inc. v. XL Specialty Ins. Co., 280 F.R.D. 653, 663 (S.D. Fla. 2012) (quoting Furmanite Am., Inc. v. T.D. Williamson, Inc., 506 F. Supp. 2d 1126, 1130 (M.D. Fla. 2007)); Hudgens v. Bell Helicopters/Textron, 328 F.3d 1329, 1344 (11th Cir. 2003) (“[A]n expert’s failure to explain the basis for an important inference mandates exclusion of his or her opinion.”). The Court finds that while some of Moore’s opinions rely on sound and reliable methodologies, others do not.31

i. Moore‘s Opinion Regarding the DMS Defect

Tesla argues that Moore’s opinion on the Driver Monitoring System (“DMS”) is entirely speculative and unreliable. According to Tesla, the DMS feature of Autopilot was just like every other hands-off SAE Level 2 ADAS in 2019 in that it “included technology to monitor driver attentiveness by gauging torque applied to the steering wheel” while also providing “audible and visual reminders to the driver to put their hand on the wheel if their hands were not detected.” ECF No. [318] at 13. If the driver failed to provide sufficient torque on the wheel after multiple warnings,32 Autosteer would eventually disengage (i.e., “strikeout”). Id. at 14. Once disengaged, the driver was then prevented from using the Autopilot function for the remainder of the drive cycle.

Tesla contends Moore‘s opinion is speculative because he relies on the fact that McGee had a prior strikeout during the drive to conclude that Autopilot would have been disabled if Tesla had utilized a system with a one-week suspension period for drivers misusing the Autopilot technology. However, because “the cadence of strike-outs would have been different depending upon the timing of prior strikeouts,” Tesla contends there is no reasonable method for determining or otherwise proving that the Autopilot would have been suspended leading up to the collision. See id. Tesla also disputes the foundation for Moore‘s conclusion that a longer suspension period would have altered McGee‘s behavior even if Autopilot had otherwise been operable at the time of the collision. While Moore relied on an empirical study to support this contention, Tesla argues that it is unreliable because the “‘study’ consisted of random, unidentified, unquantified, and undocumented happenstance conversations [Moore] claims to have had with Tesla owners” and otherwise lacked dates, names, vehicle identities, or any other substantive information that would allow researchers to cross-examine the evidence. Id. at 15.

Plaintiffs respond that Moore‘s opinion “that a one-week disablement of Autosteer would have prevented the subject accident makes perfect sense” because McGee was a serial abuser of Autopilot who would respond to temporary disablement by overriding the system by simply turning off the car and restarting it. ECF No. [347] at 17. Plaintiffs concede that “Moore acknowledged that the cadence of strike-outs would have been different depending upon the timing of prior strike-outs,” and therefore, could not say with certainty that Autopilot would have been disabled on the date of the crash. ECF No. [347] at 18 (quoting ECF No. [318] at 12). Nevertheless, Plaintiffs contend Moore is not required to testify with that level of certainty for his testimony to be admissible.

Plaintiffs also defend Moore’s alternative argument that “even if a suspension had not been in effect on the day of the crash,” McGee’s knowledge that a strikeout would trigger a suspension “would likely have encouraged more responsible driving on [McGee’s] part.” ECF No. [318] at 19. Plaintiffs argue that Moore need not have relied on scientific data or more formal research methods because an “expert may be qualified on the basis of experience.” Id. (quoting Committee Notes to the 2000 amendments to Fed. R. Evid. 702) (internal quotations omitted). Accordingly, Moore’s experience talking to other Tesla drivers about suspension policies and how they impacted their driving behavior is “more than sufficient” to support his opinion. Id.

Moore‘s opinion regarding the defect in Autopilot‘s DMS is reliable. In reaching the conclusion that Tesla‘s DMS fails to adequately ensure driver attentiveness, Moore reviewed the Vehicle‘s car log data, which shows that by Tesla‘s own measures, McGee was frequently distracted while using Autopilot. Moore noted that during the three months McGee owned the Vehicle, McGee experienced twenty-three total strikeouts,33 or approximately one strikeout every commute,34 and a warning about inattentive driving every 10 miles. ECF No. [318-1] at 11. Thus, there is a reasonable basis for Moore to conclude that McGee was likely more distracted while using Autopilot, given that without Autopilot, McGee would have necessarily been required to put his hands on the wheel more often or risk significantly increasing the likelihood of a lane departure or some other accident prior to the collision.

Furthermore, the methodology Moore used to form his opinion that a one-week disablement period for each strikeout would have prevented the subject collision is not “so unreliable to warrant exclusion.” Ohio State Troopers Ass‘n, Inc. v. Point Blank Enters., Inc., No. 18-CV-63130, 2020 WL 1666763, at *4 (S.D. Fla. Apr. 3, 2020). Moore relies on the Vehicle‘s log data to show that on the date of the collision, McGee had already had a strikeout and was only one warning away from having a second strikeout before the collision. See ECF No. [318-1] at 11; ECF No. [318-2] at 133:10. Based on that data, there is a reasonable basis for Moore to infer that if the strikeout earlier in the drive was accompanied by a one-week disablement of Autopilot, McGee would have been required to be more engaged during the drive, which would have meant McGee would have been more likely to be paying attention and prepared to intervene as he approached the intersection of Card Sound Road and ultimately would have avoided the crash.

The Court recognizes Tesla has a persuasive argument that a one-week disablement period would have altered “the cadence of strike-outs” depending upon the timing of prior strikeouts, and therefore, Moore cannot conclude with certainty that Autopilot would have been disabled in the moments leading up to the crash. However, certainty is not required to provide expert testimony and the Court “must remain mindful of the delicate balance between its role as a gatekeeper and the jury‘s role as the ultimate factfinder.” In re Abilify (Aripiprazole) Prods. Liab. Litig., 299 F. Supp. 3d 1291, 1305 (N.D. Fla. 2018) (citing Frazier, 387 F.3d at 1272); see Hendrix ex rel. G.P. v. Evenflo Co., 609 F.3d 1183, 1198 n.10 (11th Cir. 2010) (explaining that the trial court must remain mindful that ”Daubert does not require certainty; it requires only reliability.“). Accordingly, the Court finds the alleged deficiency in Moore‘s reasoning goes to the weight to be given to the testimony rather than its admissibility. See Carideo v. Whet Travel, Inc., No. 16-23658, 2018 WL 1367444, at *11 (S.D. Fla. Mar. 16, 2018) (“[T]he Court must be careful not to conflate questions of admissibility of expert testimony with the weight appropriately to be accorded to such testimony by the fact finder.“); Quiet Tech., 326 F.3d at 1345 (parenthetically quoting In re TMI Litig., 193 F.3d 613, 692 (3d Cir. 1999) (“So long as the expert‘s testimony rests upon ‘good grounds,’ it should be tested by the adversary process—competing expert testimony and active cross-examination—rather than excluded from jurors’ scrutiny for fear that they will not grasp its complexities or satisfactorily weigh its inadequacies.“) (additional layer of citations and quotations omitted)). Even if Moore‘s reliance on the Vehicle‘s log data was unreliable, the Court still finds that Moore has offered adequate alternative grounds for his opinion that a defect in the DMS‘s design caused the collision. 

Moore asserts that even if the implementation of a one-week disablement period would not have meant Autopilot was disabled on the date of the collision, McGee’s knowledge that a strikeout would trigger a weeklong suspension would likely have encouraged McGee to be more engaged and drive more responsibly. See ECF No. [318-2] at 134:6-25. To form his opinion, Moore relies on his training and expertise to reason that McGee clearly wanted to use the Autopilot system as much as possible, as evidenced by how often McGee used Autopilot and how quickly he would pull over to reset the Autopilot system following a strikeout.35 See Maiz v. Virani, 253 F.3d 641, 669 (11th Cir. 2001) (“Although Daubert applies to all expert testimony, not just ‘scientific’ testimony, [ ] there is no question that an expert may still properly base his testimony on ‘professional study or personal experience.’”) (quoting Kumho Tire Co., Ltd. v. Carmichael, 526 U.S. 137, 151 (1999)). Therefore, because McGee’s past conduct indicates he would not want to be without Autopilot for any significant amount of time, ECF No. [318-1] at 11, Moore reasonably concludes that McGee would likely alter his driving behavior to avoid a weeklong strikeout to ensure that Autopilot was available for as much of his commute as possible. ECF No. [318-2] at 135:10-136:25.

Moore also relied on an “empirical study of Full Self-Driving [Tesla owners],” which had similar lockout parameters to the weeklong lockout proposed here, to buttress his conclusion that a weeklong disablement of Autopilot would have altered McGee‘s behavior. ECF No. [318-2] at 137:10-12. According to Moore, his study indicated that drivers were consistently concerned about being locked out of their self-driving system and thus were more likely to stay more engaged to avoid a significant lockout. While Tesla correctly challenges Moore‘s contention that undocumented conversations with an unidentified group of Tesla owners could reasonably constitute an “empirical study,” the fact that Moore does not rely on a formal, scientific, or peer-reviewed study does not necessarily render the source, or the inference made therefrom, unreliable.

Federal courts have made clear that non-scientific opinions, such as an opinion from an engineer,36 need not be based on empirical testing. See Simmons v. Ford Motor Co., 576 F. Supp. 3d 1136, 1146 (S.D. Fla. 2021) (“To suggest that a lack of empirical testing is automatic grounds for exclusion is directly contradicted by Daubert’s treatment in the Eleventh Circuit.”). Indeed, engineers and design experts may give opinions based on their “knowledge, experience, and education” alone. Simmons, 576 F. Supp. 3d at 1146; see also Schenone v. Zimmer Holdings, Inc., No. 12-1046-J-39MCR, 2014 WL 9879924, at *5–8 (M.D. Fla. July 30, 2014) (“[T]he expert’s experience in conjunction with knowledge, skill, training or education alone may provide a sufficient basis to the reliability of the expert’s opinion.”). Here, the conversations Moore had with Tesla drivers about a seven-day suspension policy simply show that Moore has direct knowledge and experience regarding this issue and is not merely relying on his expertise to form his opinion.37 Although Moore’s “testing” and “studies” may not be as thorough as would be expected for a scientific journal,38 that is not a basis for the exclusion of Moore’s opinion given that Moore has sufficient knowledge, education, and experience to give such an opinion.39 See Daubert, 509 U.S. at 596 (“Vigorous cross-examination, presentation of contrary evidence, and careful instruction on the burden of proof are the traditional and appropriate means of attacking shaky but admissible evidence.”).

ii. Moore‘s Opinion Regarding the ODD Defect

Tesla contends that simply because a collision occurred does not render the Autopilot system defective. Moreover, even if Tesla had confined Autopilot to its ODD, rendering Autopilot unavailable at the time of the collision, Moore admitted he would be “reaching [ ] into hypotheticals and possibilities” in guessing what McGee would have done. ECF No. [318] at 14. As such, Tesla maintains there is no reasonable evidentiary basis to support Moore‘s opinion that the use of Autopilot outside of the ODD was a substantial contributing factor to the collision. See id. Plaintiffs respond that Moore‘s opinion is “unassailable,” given that the opinion is supported by Cummings’ testimony that “other manufacturers of driving-assist systems like GM and Ford do not allow their autopilot systems to operate in ODDs outside of their capabilities[.]” ECF No. [347] at 20 (quoting ECF No. [318-2] at 1-2). Plaintiffs also disagree that Moore‘s ODD opinion is inconsistent with the guidance issued by the Society of Automotive Engineers. Although the guidance states that drivers alone should determine when to use driving assist technology in Level 2 vehicles like the Model S, Plaintiffs point out that the guidance also indicates “only Level 5 operation is possible without ODD limitation.” Id. (quoting ECF No. [318-1] at 5-6) (internal quotations omitted). The Court finds Moore‘s opinion that Autopilot was defective because it could be used outside of its ODD does not depend on methods so unreliable that the opinion should be excluded. See Ohio State Troopers Ass‘n, Inc., 2020 WL 1666763, at *4. 

Moore relies on Tesla’s Owner’s Manual, Tesla’s own statements, and records from Tesla employees to show that Tesla permitted drivers to utilize Autopilot in areas where it was not designed to function.40 Tesla has stated that, in 2019, Autopilot, specifically Autosteer, was “[i]ntended for use only on highways and limited-access roads,” that “ha[d] a center divider and clear lane markings,” and was “[u]nlikely to operate as intended when driving on hills, approaching a toll booth, or driving on a road that has sharp curves.” ECF No. [318-1] at 6-7. Therefore, Moore reasonably concluded that Card Sound Road was not the type of road on which Tesla’s Autopilot system was designed to operate.41

However, Moore does not conclude there was a design defect simply because Tesla drivers were capable of misusing the product in a manner Tesla had not intended. Moore determined there was a design defect because, in 2019, the evidence indicated that Tesla had the means to easily limit the misuse of Autopilot but elected not to do so. Moore first pointed to record evidence demonstrating that, as of 2019, at least two other manufacturers had limited their ADAS technology to its ODD and, therefore, Tesla could not argue that its technology was state of the art.42 Moore further explained that in 2019:

Tesla had available technology to restrict Autopilot use outside of its ODD,43 and on the subject road. In particular, Autosteer state “Unavailable” was a possible state given available information about the subject road. Tesla chose to support [the] operation of Autopilot outside of its ODD, increasing the risk of an accident as compared to a choice of preventing use outside of its ODD.

ECF No. [318-1] at 10.44

Tesla attempts to undermine Moore’s findings by focusing on the fact that the 2019 Tesla Model S was a SAE Level 2 vehicle that lacked any ODD requirement and required that the driver be responsible for the vehicle, including determining whether the use of the vehicle’s ADAS was appropriate. ECF No. [318] at 15-16. However, the fact the driver is responsible for the vehicle does not undermine the reasonable inference that it is likely more dangerous for a driver to utilize driver assistance technology in a location where the technology is not designed to operate, particularly if the driver is not fully aware of the risks.45 As Moore sufficiently explains in his report, a user’s discretion is not, and often cannot, always be the sole means of user safety. Otherwise, Tesla would not have prevented drivers from using “Autosteer or TACC with the seatbelt unfastened, a door or trunk open, in trailer mode, over 90 mph, or with the headlights off[.]” ECF No. [318-1] at 16.

To the extent Tesla argues Moore is purely speculating that the crash would not have happened if Tesla had limited Autopilot to its ODD, the Court disagrees. Just as Moore may use his knowledge and experience to reasonably conclude that certain modifications to Tesla‘s DMS technology would have prevented the collision, he may also conclude that limiting Autopilot to its ODD would have prevented the collision as well, given that both conclusions apply the same reasoning. Simmons, 576 F. Supp. 3d at 1146; Schenone, 2014 WL 9879924, at *8 (“[T]he expert‘s experience in conjunction with knowledge, skill, training or education alone may provide a sufficient basis to the reliability of the expert‘s opinion.“). Because limiting Autopilot to its ODD would have disabled the system while McGee was driving on Card Sound Road, Moore reasonably concludes that McGee would have been manually driving his Vehicle on the date of the collision and, therefore, he would likely have been more engaged and more responsive to the issues that arose. See Hendrix ex rel. G.P., 609 F.3d at 1198 n.10 (”Daubert does not require certainty; it requires only reliability.“). Moore does not have to completely rebut alternative possibilities. He need only provide a sound reason for his conclusion that it is more likely than not Tesla‘s conduct rendered the product defective and caused Plaintiffs’ injuries. See 325 Goodrich Ave., LLC v. Sw. Water Co., 891 F. Supp. 2d 1364, 1381-82 (M.D. Ga. 2012) (explaining that the causation expert was not required to “definitely conclude” the cause of the damage for the opinion to be reliable, nor did he have to exclude all other “potential sources of causation“). Accordingly, based on the record evidence Moore relied upon, the Court finds that the grounds for Moore‘s conclusions on the ODD defect are reliable.

iii. Moore‘s Opinion Regarding the TACC Defect

Tesla argues that Moore’s opinion regarding the TACC defect is unreliable and unsupported by the record because Autopilot, and specifically the TACC system, was not designed to prevent the collision that occurred here. ECF No. [318] at 17. Plaintiffs respond that Tesla’s argument hinges on the assumption that McGee overrode the Autopilot in the seconds leading up to the collision but ignores “that Autopilot was still controlling the vehicle’s movement despite McGee’s acceleration.” ECF No. [347] at 21 (emphasis removed). Given that the Parties’ dispute appears to be largely a question of fact, Plaintiffs argue the issue should be left to the jury and cannot serve as a basis for excluding Moore’s testimony.

The Court agrees with Plaintiffs that the basis for Moore‘s opinion regarding Autopilot‘s failure to timely warn or deploy the brakes is reliable. In reaching his conclusion, Moore relied on the Vehicle‘s log data as well as the Vehicle‘s augmented video to show that notwithstanding the detection of obstacles in the roadway, there was no Automatic Emergency Braking or Forward Collision Warning issued by the Vehicle prior to the collision. ECF No. [318-1] at 11. No additional evidence is needed to reasonably infer that the triggering of either of these systems following the detection of the obstacles would have made it more likely than not that the collision would never have occurred. Tesla attempts to challenge Moore‘s conclusion by arguing that McGee overrode the Autopilot system and, therefore, Autopilot‘s failure to issue a warning or trigger the automatic brakes could not reasonably be the cause of the collision. However, whether Autopilot was activated and to what extent it was activated leading up to the collision is a material fact in dispute and is not a basis for excluding Moore‘s opinion. Cf. Rappuhn v. Primal Vantage Co., Inc., No. 23-10050, 2024 WL 2930448, at *4-5 (11th Cir. June 11, 2024) (“The inquiry under Daubert is not whether the expert‘s assessment is incontrovertible. ’Daubert does not require that a party who proffers expert testimony carry the burden of proving to the judge that the expert‘s assessment of the situation is correct.‘“) (quoting Ruiz-Troche v. Pepsi Cola of Puerto Rico Bottling Co., 161 F.3d 77, 85 (1st Cir. 1998)).

Tesla‘s argument that its Autopilot system was not designed to issue a warning or trigger the automatic braking system also does not undermine the reliability of Moore‘s opinion to the extent he is arguing there is a design defect.46 Indeed, the fact that the Autopilot system was not designed to utilize either of these safety features under the circumstances is precisely why Moore concluded the product was dangerously defective.47

Furthermore, it is not dispositive that “Moore has not, and cannot, point to another vehicle which would have done what he alleges Autopilot failed to do here.” ECF No. [318] at 17. In Florida, a plaintiff need only satisfy the consumer expectation test to prove a design defect claim. See Aubin v. Union Carbide Corp., 177 So. 3d 489, 510-11 (Fla. 2015). Neither Plaintiffs nor their experts must establish a reasonable alternative design, so long as they demonstrate that a reasonable consumer in McGee’s shoes would have expected the Vehicle to either give an audible warning of obstacles in the road or trigger the automatic brakes with sufficient time to avoid the collision. Id. (“[W]e conclude that the Third Restatement’s risk utility test and establishment of a reasonable alternative design mandate are not requirements for finding strict liability[.]”). Because Tesla fails to point to any record evidence that conclusively undermines the theory that a reasonable consumer would have expected a warning or for the Vehicle to deploy its automatic brakes under the circumstances, the Court finds no basis to exclude Moore’s testimony on this issue.

iv. Moore‘s Opinion Regarding Insufficient Training

Tesla argues that Moore‘s opinion, finding Tesla‘s Autopilot training for new owners insufficient, is an unreliable “ipse dixit” opinion. ECF No. [318] at 17. According to Tesla, Moore‘s opinion ignores the record evidence that shows not only did Tesla provide Autopilot training to customers, but McGee also confirmed that he had educated himself about the system and “was confident in how to operate Autopilot.” ECF No. [347] at 17.48 Plaintiffs contend the record reflects that McGee “received no training . . . that over 50 other Tesla customers Moore had spoken with received no training, . . . and Moore himself received no training on the Tesla his company owns.” Id. at 22 (emphasis removed). Although Tesla challenges the basis of Moore‘s opinion since he did not rely on a formal study, Plaintiffs reiterate that “expert testimony based on experience is permissible under Daubert.” Id.

The Court finds Moore‘s opinion that Tesla‘s “training and familiarization with Autopilot for new owners was insufficient” and therefore “increased the risk of the subject accident” is unreliable and not sufficiently supported by the record.

Moore‘s opinion fails to identify the matters on which McGee and other Tesla drivers were inadequately trained, the information and training that should have been provided that would have decreased the risk of injury in this case, or even generally accepted standards for such training. See Cook ex rel. Est. of Tessier v. Sheriff of Monroe Cnty., Fla., 402 F.3d 1092, 1112 (11th Cir. 2005) (finding the inadequate training opinion was “without foundation, since Dr. Maris ha[d] articulated neither a generally accepted standard for [the] training . . . nor an explanation of how or why he believe[d] the MCDC‘s training to be inadequate,” and that “th[e] opinion [w]as connected to existing data only by the ipse dixit of the expert”) (quoting Michigan Millers Mut. Ins. Corp. v. Benfield, 140 F.3d 915, 921 (11th Cir. 1998)). While Tesla may certainly have “had infinitely more information regarding the risk of Autopilot” than McGee did, Moore fails to articulate the basis for his belief that more training on the system would have decreased the possibility of the collision. See United States v. Frazier, 387 F.3d 1244, 1266 (11th Cir. 2004) (observing that an expert‘s “imprecise opinion easily could serve to confuse the jury, and might well have misled it”).

The Court therefore agrees that Moore‘s opinion constitutes the classic ipse dixit fallacy, as it fails to rely on data, evidence, or Moore‘s knowledge and experience to suggest that increased training decreases the risk of these types of events. See Cook ex rel. Est. of Tessier, 402 F.3d at 1111 (“[A] trial court may exclude expert testimony that is ‘imprecise and unspecific,’ or whose factual basis is not adequately explained.”) (quoting Frazier, 387 F.3d at 1266); Michigan Millers Mut. Ins. Corp., 140 F.3d at 921 (“[N]othing in either Daubert or the Federal Rules of Evidence requires a district court to admit opinion evidence which is connected to existing data only by the ipse dixit of the expert.”). Accordingly, Moore‘s opinion regarding inadequate training is excluded.

v. Moore‘s Opinion Regarding the Beta Software Defect

Tesla argues that Moore‘s opinion that there was significant risk associated with allowing mass-produced vehicles to utilize Beta software is unsupported by the record and is merely a “throwaway opinion, designed simply to unfairly prejudice the jury against Tesla.” ECF No. [318] at 19. Plaintiffs respond that the fact that the Autopilot system was still in “Beta” meant that “it was not fully tested for safety, and further, the system was not designed to be used on roadways with cross-traffic or intersections.” ECF No. [347] at 23. Moreover, Plaintiffs contend that there is evidence that the Beta software put consumers and the public at risk.

The Court finds that Moore‘s Beta opinion utilizes unreliable methods. In his expert report, Moore simply concludes that “Tesla allowed Mr. McGee to take a significant level of personal risk in operating a beta software in a vehicle.” ECF No. [318-1] at 3, 16. Moore fails to explain why operating in Beta renders a product inherently more dangerous than it otherwise would be, or to show that there is a universal understanding in the car industry as to what “Beta” means. Indeed, Moore fails to offer any explanation as to the risks commonly associated with a Beta version of an ADAS system or the level of increased risk in using such a system.49 While there might be a colloquial understanding of what Beta means50 and even a unique and specific meaning in certain industries, Moore offers no assurance that Tesla or other car manufacturers used the term in the same manner. Ultimately, Moore‘s conclusion fails to provide any basis even to infer that operating in Beta makes it more likely than not that the Autopilot in McGee‘s 2019 Tesla Model S was defectively designed.

The only statement in Moore‘s report that could even potentially support his Beta opinion is his broad generalization that “Beta software agreements generally require the beta tester to accept any adverse events that occur during use.” ECF No. [318-1] at 16. However, Moore fails to explain how this generalization is relevant to the facts here. Moore does not identify any evidence indicating that Tesla required users to accept certain consequences because the product was in a Beta form or that the Beta version was indeed more dangerous than any potential subsequent “finished product.” An expert whose opinion is based primarily on his experience “must explain ‘how that experience leads to the conclusion reached, why that experience is a sufficient basis for the opinion, and how that experience is reliably applied to the facts.‘” King v. Cessna Aircraft Co., No. 03-20482-CIV, 2010 WL 1980861, at *3 (S.D. Fla. May 18, 2010) (quoting Frazier, 387 F.3d at 1261-62) (additional level of citation and quotations omitted). Unexplained assurances that the opinion is reliable are insufficient. See Clena Invest., Inc., 280 F.R.D. at 663. Because Moore has failed to provide a sufficient explanation as to how he reached his conclusion that Tesla created a significant level of risk by using beta software, the opinion must be excluded.51 Cook ex rel. Est. of Tessier, 402 F.3d at 1111 (“[A] trial court may exclude expert testimony that is ‘imprecise and unspecific,’ or whose factual basis is not adequately explained.”) (quoting Frazier, 387 F.3d at 1266).

c. Helpfulness of Moore‘s Testimony

The final element, helpfulness, turns on whether the proffered testimony “concern[s] matters that are beyond the understanding of the average lay person.” Edwards v. Shanley, 580 F. App‘x 816, 823 (11th Cir. 2014) (quoting Frazier, 387 F.3d at 1262). “[A] trial court may exclude expert testimony that is ‘imprecise and unspecific,’ or whose factual basis is not adequately explained.” See id. (quoting Cook ex rel. Est. of Tessier, 402 F.3d at 1111). To be appropriate, a “fit” must exist between the offered opinion and the facts of the case. McDowell v. Brown, 392 F.3d 1283, 1299 (11th Cir. 2004) (citing Daubert, 509 U.S. at 591). Because Tesla does not appear to challenge the helpfulness of any of Moore‘s testimony that the Court has deemed reliable, the third Daubert factor is satisfied. Accordingly, to the extent Moore‘s opinions and testimony have not been excluded as unreliable, they are admissible at trial.

ii. Admissibility of Dr. Mary Cummings’ Expert Testimony

Tesla‘s Daubert motion also seeks to exclude the testimony of Dr. Mary Cummings. Tesla begins by attacking Cummings’ causation opinions. Tesla asserts there is no dispute that (1) “McGee was a cause of the crash,” (2) McGee was responsible for operating the car and obeying traffic laws, (3) “at the time of the crash, McGee had overridden [the] TACC,” and (4) there is no claim that “the lane changing feature played any role in this crash.” ECF No. [318] at 21. Because those facts are not in dispute, Tesla argues that Cummings’ opinions are only relevant if there is a reliable basis for her to conclude that “more likely than not, a claimed defect related to Autopilot was the cause of this crash.” Id. According to Tesla, the problem with Cummings’ causation opinions is that she speaks about drivers and vehicles generally, rather than addressing the evidence and the driver‘s conduct in this case. See id.

Cummings also offers the following defect opinions that Tesla argues should be excluded:

  1. Autopilot‘s design is defective because it allows drivers to engage its features outside of its ODD[;]
  2. [T]he design of the DMS is defective because it relies on steering wheel torque sensing to determine driver engagement,
  3. Autopilot detected obstacles, but failed to warn or apply the brakes [;] and
  4. Tesla failed to properly warn and train.

ECF No. [318] at 21 (citing ECF No. [318-5] at 1-2). Tesla argues that Cummings’ opinions “are not rooted in reliable principles and methods applied in a reliable manner, and in some respects are irrelevant because they do not fit the facts of this case.” Id.

Plaintiffs maintain that Cummings’ conclusions depend on reliable methods and are firmly grounded in the facts of the case. ECF No. [347] at 3. According to Plaintiffs, there is “nothing unreliable or speculative about [Cummings‘] opinions; to the contrary, they mirror the U.S. government‘s own conclusion that Autopilot is a flawed technology that lulls drivers into a false sense of security yet lacks the ability to prevent tragic crashes like the one at issue here.” Id. at 3-4.

The Court considers Plaintiffs’ expert‘s qualifications, the reliability of her methods, and the helpfulness of the testimony to determine whether such testimony is admissible.

a. Cummings’ Qualifications

Tesla does not challenge Cummings’ qualifications, and the Court finds she is qualified to provide expert testimony.

b. Reliability of Cummings’ Opinions

i. Cummings’ Opinion Regarding ODD Defect

Regarding Cummings’ opinion that Autopilot‘s design is defective because, in 2019, drivers could use Autopilot outside of its ODD, Tesla contends the opinion “is not rooted in reliable principles and methods applied in a reliable manner.” ECF No. [318] at 27. According to Tesla, the only scientific support Cummings offers for her opinion is an NTSB recommendation, which was merely a limited request for information about certain manufacturers’ “practices related to ODDs in SAE level 2 ADAS.” Id. Therefore, NTSB neither established nor recognized an industry standard because there was only one manufacturer in 2019 that had the capability to limit its system to the system‘s ODD—General Motors (“GM”).

Moreover, to the extent that GM could limit its “Super Cruise” system to its ODD in 2019, Tesla argues that is irrelevant here because GM‘s Super Cruise system was a “hands-off” system. Id. at 27. Accordingly, Tesla maintains that Cummings’ opinion is inconsistent with the recommended practices and industry standards at the time for a hands-on SAE Level 2 ADAS system and is, therefore, unreliable and irrelevant. See id. at 28.

Plaintiffs argue that Cummings’ opinion about the use of Autopilot outside its ODD is valid for largely the same reasons Moore‘s opinion is admissible—permitting use outside the Vehicle‘s ODD exceeded the product‘s stated capabilities, and such usage was not state of the art. ECF No. [347] at 28. According to Plaintiffs, in 2019, other companies such as GM prohibited their driver-assist functions from operating outside of their design domains, and the NTSB recommended that all manufacturers design system safeguards to ensure that “use of automated vehicle control systems” was limited “to those conditions for which they were designed.” Id. (quoting ECF No. [381-5] at 43). Nevertheless, Tesla elected not to incorporate those reasonable safeguards into its vehicles. Plaintiffs argue that a reasonable consumer would expect a manufacturer to limit the use of technology to locations where it was safe to do so, and therefore, the opinion is admissible. See id.

The Court finds that Cummings’ opinion that “Autopilot‘s design is defective as it allows drivers to engage Autopilot in operational design domains (ODDs) that exceed its stated capabilities” is reliable and admissible.

Cummings clearly articulates the increased risk of permitting drivers to utilize the Autopilot system outside of its operational domain. Specifically, Cummings explains that despite Tesla expecting Autopilot to be functional on two-lane roads similar to Card Sound Road, the Owner‘s Manual states that “only barriers on or near freeways and highway roads will be labeled for the computer vision system to recognize.” ECF No. [350-1] at 6. Cummings reasonably concludes that Tesla‘s decision to allow Autopilot to be used in areas where Tesla knows it has not adequately labeled potential obstacles effectively renders Autopilot blind in such circumstances, greatly increasing the chance of a collision similar to the one that occurred in this case. See id.

Cummings also effectively rebuts any argument that Tesla could not or did not need to impose such a geographic limitation on its Autopilot system. Cummings considered the testimony of Tesla‘s expert, who admitted that Tesla was capable of limiting Autopilot to its intended ODD at the time but elected not to do so. ECF No. [350-1] at 6. Cummings further explained that “[o]ther manufacturers of driving-assist systems like GM (i.e., Super Cruise) or Ford (i.e., Blue Cruise) d[id] not allow their ADAS systems to operate in ODDs outside of their capabilities, which are controlled highways.” ECF No. [350-1] at 5-6. Even more significant, Cummings refers to a 2016 investigation in which NTSB recommended that Tesla “[i]ncorporate safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed (H-17-14).” Id. at 6.

Tesla‘s sole argument for why Cummings’ ODD opinion is unreliable is that neither SAE nor the International Organization for Standardization (“ISO”) recommended a standard to support her opinion. However, a product may still be defective even if it complies with industry standards. See Jackson v. H.L. Bouton Co., Inc., 630 So. 2d 1173, 1175 (Fla. 1st DCA 1994) (explaining “compliance with industry standards is merely evidence that a product was not defective”; it does not conclusively establish the product is not defective); see also Garrison v. Sturm, Ruger & Co., Inc., 322 F. Supp. 3d 1217, 1225 n.4 (N.D. Ala. 2018) (“But, proof of compliance with ‘industry-wide practices . . . fails to eliminate conclusively . . . liability for [a] design of [an] allegedly defective product.‘” (alterations in the original) (quoting Elliot v. Brunswick Corp., 903 F.2d 1505, 1508 (11th Cir. 1990))).52 In any event, Cummings relies on sufficient evidence to opine that Tesla‘s design was not state of the art and that, even if limiting Autopilot to its ODD was not universally recognized as the industry standard, multiple other manufacturers determined it was a necessary measure to ensure the public‘s safety. Consequently, the Court will not exclude Cummings’ opinion regarding the ODD defect from the jury‘s consideration.

ii. Cummings’ Opinion Regarding DMS Defect

Tesla next challenges Cummings’ opinion as to the inadequacy of Autopilot‘s DMS. ECF No. [318] at 26. According to Tesla, “Cummings’ opinion centered on her complaint that there was too long of a delay between the time when no hands were detected on the steering wheel and an alert.” Id. However, because “McGee[‘s] hands were detected on the steering wheel during the 20 seconds before the crash,” any defect resulting from a delay in the warning system is irrelevant to the facts in the instant case. Id.

Plaintiffs respond that Cummings made clear that Tesla‘s wheel torque monitoring system was not consistent with industry standards because manufacturers now rely on facial recognition technology, which more effectively measures driver attentiveness than pressure on the steering wheel. See ECF No. [347] at 28-29. Although Tesla contends that Cummings’ opinion is inapplicable to this case because McGee had his hands on the steering wheel at the time of the collision, Plaintiffs assert that the fact McGee was distracted despite having at least one hand on the wheel demonstrates why the torque monitoring system is dangerously unsafe and why a facial recognition monitoring system would have been more effective. See id. at 29.

The Court finds Cummings’ opinion that Autopilot‘s DMS is defective because it relies on steering wheel torque sensing to determine driver engagement is reliable. Cummings’ analysis initially focuses on several NTSB investigations raising concerns about the efficacy of Tesla‘s wheel torque-based monitoring system. Cummings noted that in 2016, the “NTSB concluded that the Tesla DMS was not an ‘effective method of ensuring driver engagement’ and monitoring steering wheel torque is a ‘poor surrogate means of determining a driver‘s degree of engagement with the driving task.‘” ECF No. [318-5] at 3 (quoting NTSB/HAR-17/02). Cummings pointed out that NTSB concluded in another investigation that “Tesla‘s Autopilot design ‘permitted the driver to disengage from the driving task’ and was a causal factor in th[e] crash.” Id. (quoting NTSB/HAB-19/07). Prior to the subject collision, NTSB even recommended that Tesla “[d]evelop applications to more effectively sense the driver‘s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use (H-17-42).” ECF No. [350-1] at 6.

Cummings also explained that Tesla‘s DMS does not keep pace with ADAS DMS practices, which often rely on facial recognition technology to measure driver engagement. Id. at 9. Cummings opined that a facial recognition monitoring system is a more effective measure of ensuring driver engagement and that recent data has shown “Tesla drivers significant[ly] disengage from the driving task upon engagement of Autopilot.” ECF No. [318-6] at 39:17-24; ECF No. [350-1] at 9. The Court sees no basis to conclude that Cummings’ opinion on this issue is unreliable. Not only does she rely on her own extensive knowledge and expertise, but Cummings also depends on NTSB and NHTSA investigations and findings, along with industry standards, to opine that Tesla‘s use of steering wheel torque-sensing technology to measure driver engagement rendered the Autopilot system unreasonably dangerous. Cummings’ opinion is based on good grounds and is reliable.

iii. Cummings’ Opinion Regarding Defective Triggering of FCW and AEB

Tesla argues Cummings’ opinion that Autopilot was defective because it failed to warn McGee or trigger the automatic emergency braking system is also unreliable. Because Cummings was unable to offer a specific answer as to how the system should have been designed and testified that any questions about improving the decision-making function should be answered by Tesla‘s engineers, Tesla argues her opinion on the purported defect is unreliable. Id.

Plaintiffs respond that Cummings’ opinion is admissible because all that is required to prove a design defect under Florida law is that “the product failed to perform as safely as an ordinary consumer would expect when used as intended or in a manner reasonably foreseeable by the manufacturer.” ECF No. [347] at 30. Therefore, Plaintiffs are “not required to provide proof of a reasonable alternative design” in order to prove a defect. Id.

The Court finds no legitimate basis to exclude Cummings’ testimony that Autopilot detected obstacles but failed to warn or apply brakes. Cummings reviewed the augmented video from the Vehicle and determined that the Vehicle “clearly detected an anomalous event” and also “successfully detected” (1) “the road‘s edge,” (2) “a stop sign 61.16m in front of it,” (3) “a pedestrian,” (4) “a miniva[n],” and (5) other “possible obstacles,” yet failed to provide a frontal collision warning or activate the automatic emergency braking. ECF No. [350-1] at 7. In addition to reviewing the augmented video, Cummings relied upon McGee‘s testimony that further confirmed his Vehicle neither deployed the automatic emergency brake nor provided a warning. Thus, Cummings’ assertion that the Vehicle did not brake or provide a warning of an obstacle to McGee is supported by the record.

Tesla contends that Cummings lacks sufficient information as to how Autopilot interacts with the FCW system and the AEB system to conclude Autopilot was capable of issuing a warning or deploying the brakes under the circumstances, but failed to do so. However, such an explanation is unnecessary as an expert can reliably opine that the product at issue did not perform as a reasonable consumer would have expected. See Aubin, 177 So. 3d at 510-11. Cummings has done so here. She relies on Tesla‘s representations to the public that consumers should treat their cars like self-driving vehicles53 to show that a reasonable consumer would have expected the Vehicle to avoid the collision, and at the very least, warn of an impending collision. Because Cummings reasonably relies on evidence suggesting that Autopilot would perform as opined, the Court finds that the opinion should not be excluded.

iv. Cummings’ Opinion Regarding Tesla‘s Failure to Warn and Train

Tesla contends that Cummings’ opinion that Tesla drivers are inadequately trained and confused about Autopilot‘s capabilities is unreliable. ECF No. [318] at 29. Tesla contends there is no record evidence that McGee was confused about Autopilot‘s capabilities or that Tesla failed to provide McGee with adequate warnings about the system‘s capabilities. See id.

Although Plaintiffs appear to concede there are no studies or data to support Cummings’ opinion that Tesla failed to provide adequate warnings and training on the Autopilot system, Plaintiffs argue an expert‘s own “expertise—not to mention plain common sense—can be a sufficient basis for admission of the expert‘s testimony under Daubert.” ECF No. [347] at 30-31. Moreover, Plaintiffs argue that McGee‘s testimony that he was adequately trained does not undermine the applicability or reliability of Cummings’ opinion because McGee‘s lack of formal training meant he did not fully appreciate the complexity of the Autopilot system and therefore had unreasonable overconfidence in his ability to operate the technology. See id. According to Plaintiffs, this confusion and overconfidence was not unique to McGee, as the record reflects that the term “Autopilot” gave Tesla drivers undue confidence in the system‘s capabilities.

Unlike Moore‘s training opinion, Cummings’ opinion that Tesla failed to adequately warn and train Tesla drivers on the limitations and capabilities of Autopilot is based on record evidence and utilizes reliable methodologies. Cummings considered McGee‘s testimony that he never received training on Autopilot, nor was he required to take any mandatory tutorials once he received the Vehicle. ECF No. [350-1] at 5. Cummings also noted that McGee did not know that the Vehicle allowed him to electronically access the Owner‘s Manual, and therefore, it is unclear whether McGee ever acquired critical knowledge about Autopilot‘s operational domains or other Vehicle limitations, such as its inability to detect and avoid certain obstacles like those present in this case. See id. However, Cummings does not rely on McGee‘s testimony alone. Cummings also relied on Tesla‘s experts to confirm that Tesla does not provide any formal instruction or assistance to drivers to explain Autopilot‘s limitations or how to use its various complex features. See id. Tesla‘s experts acknowledged that drivers primarily learn how to operate the system by trial and error. See id.

Cummings’ conclusion that the lack of formal training is unsafe is supported by the record. Cummings explains that “[t]he need for additional driver training given confusing driving assist technologies is so high that the National Safety Council started the My Car Does What? program specifically to assist drivers with cars incorporating advanced technologies.” Id. Cummings also pointed out that as early as 2017, other manufacturers announced “dealerships would undergo specific training to sell [their ADAS technology].” Id. Cummings’ lack of reliance on scientific studies or comprehensive data sets does not preclude her from offering her non-scientific testimony and opinions. See Maiz, 253 F.3d at 669 (rejecting argument “that testimony [wa]s not reliable because it [wa]s based largely on [expert‘s] personal experience rather than verifiable testing or studies”); Simmons, 576 F. Supp. 3d at 1146 (“[T]he lack of empirical testing is not enough to overcome the reliability of [expert‘s] opinions based on his extensive experience in vehicle design.”). Cummings utilizes reliable information and methodologies to support the inference that Tesla‘s warnings and training regarding Autopilot‘s capabilities were inadequate. The record further supports Cummings’ opinion that the deficient warnings and training, more likely than not, were a proximate cause of Plaintiffs’ injuries.

As Cummings points out, McGee testified that he thought the car could detect, stop, and otherwise avoid obstacles and ultimately prevent collisions. See ECF No. [350-1] at 8. McGee testified further that he expected Autopilot to operate “as a ‘copilot‘” stopping “regardless of any car.” Id. Other Tesla drivers had similar misunderstandings as well. Cummings points out that, in Tesla‘s own internal survey, nearly seven percent of those surveyed believed that the term Autopilot meant that Tesla‘s vehicles could drive themselves. ECF No. [350-1] at 11. Accordingly, Cummings could reasonably infer that if McGee had known the true limitations and capabilities of the Vehicle, he would have behaved less negligently. The record supports that McGee‘s fundamental misunderstanding of Autopilot greatly increased the likelihood that he was overconfident in the Vehicle‘s capabilities, leading McGee to behave negligently, or at least more negligently than he otherwise would have if he was made aware that Autopilot was incapable of preventing the type of collision that occurred in this case. As such, Cummings’ testimony regarding Tesla‘s failure to warn, adequately train, and that the failures were a cause of Plaintiffs’ injuries is admissible.

c. Helpfulness of Cummings’ Opinions

Tesla argues that Cummings’ warning and training opinions will not assist the jury because it is a lay opinion that largely mischaracterizes and interprets McGee‘s deposition testimony. ECF No. [318] at 25. According to Tesla, McGee‘s testimony speaks for itself, and therefore, opinion testimony is unnecessary to understand or evaluate McGee‘s statements. Consequently, Tesla contends Cummings’ warning testimony should be excluded. Plaintiffs respond that Cummings’ opinion is useful and well supported by the facts in this case. See ECF No. [347] at 30-31.

Cummings’ warning and training testimony is competent expert testimony that will be helpful to the jury. Cummings’ opinion goes well beyond McGee‘s testimony to evaluate the adequacy of Tesla‘s warnings and training. Furthermore, Cummings’ knowledge, experience, and expertise will assist the jury in understanding what types of warnings and trainings are effective.54 Because Cummings is qualified, her methods are reliable, and her opinions will be helpful to the jury, the Court declines to exclude Cummings’ opinions or testimony under Daubert.55

iii. Whether Plaintiffs’ Expert Testimony is Duplicative

Tesla argues that, even if Moore‘s and Cummings’ opinions are independently admissible, they should be excluded as their testimony is duplicative. ECF No. [318] at 30. Plaintiffs respond that Tesla‘s argument fails because Moore and Cummings “have very different backgrounds, different areas of expertise, and used different methodologies to form their opinions.”56 ECF No. [347] at 31. According to Plaintiffs, when different methods and approaches are utilized to reach the same opinion, such testimony should not be considered cumulative. See id. at 31-32. However, to the extent that their opinions might ultimately be cumulative of each other, Plaintiffs argue that the decision should be made at trial once the opinions are given the appropriate context. See id. at 32-33.

While “[e]xpert testimony may be needlessly cumulative where there is ‘substantial overlap’ between the areas on which two experts will testify,” at this stage, the Court does not find Moore‘s and Cummings’ expert testimony to be cumulative notwithstanding the fact that they are expected to give testimony on many of the same overarching issues. Royal Bahamian Ass‘n, Inc. v. QBE Ins. Corp., No. 10-21511-Civ, 2010 WL 4225947, at *2 (S.D. Fla. Oct. 21, 2010).57 Although both experts may render opinions on the same or substantially similar issues, Moore and Cummings rely on distinct and separate areas of knowledge and expertise. See ECF No. [347] at 32. Just as the Court may not consider opinion testimony duplicative from both a quarterback and a physicist who explain the best way to throw a football,58 expert testimony should not be excluded simply because the experts may render opinions on the same ultimate issue.59 Kapila v. Warburg Pincus, LLC, No. 8:21-cv-2362, 2023 WL 2584469, at *6 (M.D. Fla. Mar. 21, 2023) (“[T]estimony on the same topic by different experts, however, is not necessarily cumulative where the experts will testify from different professional perspectives.”) (quoting Royal Bahamian Ass‘n, Inc., 2010 WL 4225947, at *2); see also Mendez v. Unitrin Direct Prop. & Cas. Ins. Co., No. 8:06-cv-563-SCB-MAP, 2007 WL 2696795, at *1-2 (M.D. Fla. Sept. 12, 2007) (allowing two experts to testify regarding bad faith issues where one was a claims handling expert and the other was an expert on an insurer‘s legal duties to its insured); Demeritt v. Wal-Mart Stores East, LP, No. 6:20-cv-89, 2021 WL 2828686, at *2 (M.D. Fla. May 17, 2021) (finding testimony was not cumulative notwithstanding “some overlap in testimony” because “each expert applie[d] a different scientific approach”). While Moore‘s and Cummings’ testimony must be appropriately limited to avoid being overlapping or cumulative, the precise contours of those limitations are unclear on this record. The Court therefore concludes that excluding any portions of Moore‘s or Cummings’ testimony as cumulative is premature at this juncture and will address any specific objections at trial.

B. Plaintiffs’ Daubert Motions

i. Plaintiffs’ Motion to Exclude Expert Testing and Testimony of Ryan Harrington

Plaintiffs seek to exclude both the crash testing results and testimony of Tesla‘s expert, Ryan Harrington. Plaintiffs contend Harrington‘s testing and testimony will not help the jury determine whether Autopilot was defectively designed because the testimony has “no bearing on any relevant issues at trial,” and the testing lacks “substantial similarity to the subject crash.” ECF No. [322] at 3.

According to Plaintiffs, Harrington conducted crash testing on a 2019 Mercedes Benz S560, a 2019 Subaru Legacy, and a 2019 Volvo S60 “to evaluate the performance of Forward Collision Warning (FCW) and Automatic Emergency Braking (AEB) in both daytime and nighttime conditions[.]” Id. at 6. Plaintiffs contend, however, that expert testing can only be admitted where the testing conditions in the experiment are substantially similar to those at the time of the collision. See id. at 5. Plaintiffs argue that Harrington‘s testing fails to recreate the collision conditions in any material respect. Specifically, Plaintiffs identify the following purportedly material differences: (1) none of the testing involved the subject vehicle—a 2019 Tesla Model S; (2) the layout of the test road lacked a T intersection, a cross street, guardrails on both sides, a stop sign, road markings suggesting the end of travel, and a limit line; (3) the test road used four-foot road signs instead of eight-foot road signs; (4) some of the testing was performed during the day even though the collision happened at night; and (5) none of the testing was conducted with the vehicles’ advanced driving assistance features activated. See generally ECF No. [322].

Tesla responds that Harrington‘s testing is relevant to rebut Plaintiffs’ “claim that crash avoidance and crash mitigation systems in the 2019 Tesla Model S were defective in failing to prevent or mitigate the crash.” ECF No. [349] at 1. According to Tesla, Harrington‘s testing should be admitted to help “show that [ ] no Model Year 2019 vehicle made by any manufacturer was equipped with forward collision prevention or mitigation systems that would have prevented this crash by stopping or slowing a 61 mile-per-hour passenger car before it broadsided a black SUV in the dark” because no such technology existed at the time. Id. at 1-2. Tesla argues that the substantially similar doctrine only applies where a party is attempting to recreate the circumstances under which the injury occurred. Id. at 3. Therefore, the doctrine is inapplicable where a party is simply attempting to demonstrate a scientific principle or establish what technology was in existence at the time. Id. Under such circumstances, the only test is “whether the evidence is relevant to the purpose for which it is being offered.” Id.

Generally, the doctrine of substantial similarity applies “when one party seeks to admit prior accidents or occurrences involving the opposing party,” or attempts to recreate the accident involving the defendant‘s product, “in order to show, for example, ‘notice, magnitude of the danger involved, the [party‘s] ability to correct a known defect, the lack of safety for intended uses, strength of a product, the standard of care, and causation.‘” Heath v. Suzuki Motor Corp., 126 F.3d 1391, 1396 (11th Cir. 1997) (quoting Jones v. Otis Elevator Co., 861 F.2d 655, 661 (11th Cir. 1988)); see McHale v. Crown Equip. Corp., No. 21-14005, 2022 WL 4350702, at *2 (11th Cir. Sept. 20, 2022); cf. Mitsubishi Motor Corp. v. Laliberte, 52 So. 3d 31, 38 (Fla. 4th DCA 2010) (“Generally, the doctrine of substantial similarity applies in products liability claims when a party attempts to . . . recreate the accident involving the defendant‘s product[.]“). The purpose of the doctrine is to avoid the potential for “substantial prejudice” by introducing past occurrences or accidents by the defendant. Heath, 126 F.3d at 1396. However, “[t]he substantial similarity doctrine does not apply to situations, . . . where the evidence is ‘pointedly dissimilar’ and ‘not offered to reenact the accident.‘” Tran v. Toyota Motor Corp., 420 F.3d 1310, 1316 (11th Cir. 2005) (quoting Heath, 126 F.3d at 1396-97); see also Burchfield v. CSX Transp., Inc., 636 F.3d 1330, 1334-35 (11th Cir. 2011). For instance, where the purpose of the testing is merely to demonstrate a scientific principle, the substantial similarity doctrine is inapplicable. See Fox v. Gen. Motors LLC, No. 1:17-CV-209-MHC, 2019 WL 3483171, at *18 (N.D. Ga. Feb. 4, 2019).

The substantial similarity doctrine “requires that the recreation of an accident ‘be so nearly the same in substantial particulars as to afford a fair comparison in respect to the particular issue to which the test is directed.‘” McHale, 2022 WL 4350702, at *2 (quoting Burchfield, 636 F.3d at 1336). Accordingly, the “doctrine does not require identical circumstances[ ] and allows for some play in the joints depending on the scenario presented and the desired use of the evidence.” Ree v. Royal Caribbean Cruises Ltd., 315 F.R.D. 682, 685-86 (S.D. Fla. 2016) (citing Sorrels v. NCL (Bahamas) Ltd., 796 F.3d 1275, 1287 (11th Cir. 2015); Borden, Inc. v. Florida East Coast Railway Co., 772 F.2d 750 (11th Cir. 1985)). However, there are situations where “a slight change in the conditions under which the experiment is made will so distort the result as to wholly destroy its value as evidence, and make it harmful, rather than helpful.” Gen. Motors Corp. v. Porritt, 891 So. 2d 1056, 1058-59 (Fla. 2d DCA 2004). Ultimately, though, “[t]he determination of the similarity of the circumstances and conditions is left to the sound discretion of the trial court.” Id.60

The Court finds that the substantial similarity doctrine applies to all of Harrington‘s crash testing. While Tesla insists that the purpose of the testing was not to recreate the accident, but instead, to show that the Autopilot system was state of the art, that is a distinction without a meaningful difference. In his report, Harrington explained:

A series of demonstrations were conducted to evaluate the nominal performance of FCW and AEB and its behavior in a scenario that included certain aspects of the subject collision. The demonstrations were structured in three phases:

  1. Daylight nominal FCW and AEB demonstrations:
    1. with a human driver; and
    2. with a robot driver.
  2. Nighttime FCW demonstration with broadside targets:
    1. with a human driver.
  3. Nighttime collision scenario FCW and AEB demonstration:
    1. with a robot driver.

ECF No. [322-4] at 4 (emphasis added). As Harrington acknowledged, his crash testing was nothing more than a recreation of the collision with several variables substituted. Indeed, Tesla admits that in each phase of testing, Harrington sought to show that no vehicle could have succeeded in avoiding the crash under the circumstances. See ECF No. [349] at 4, 6 (“The jury is required to consider whether, according to industry standards and the ‘state of the art’ technology in 2019, FCW and AEB systems were capable of reliably issuing a warning or stopping a vehicle that was traveling at high speed towards the side profile of a parked vehicle at night. And that is the purpose for which Harrington‘s first two categories of tests are being offered.“). Although Tesla argues the first two phases of testing are distinguishable from the third, those two stages simply removed some of the complexities of the subject collision in an attempt to isolate variables and prove just how ineffective ADAS technology was in 2019. However, the inference the jury is expected to make at each phase of testing is the same—no manufacturer‘s ADAS system was capable of avoiding the type of collision at issue in this case. Therefore, Harrington‘s testing was not pointedly dissimilar like the testing in McHale or Heath, nor does the testing merely attempt to prove a scientific principle.61 See Fox, 2019 WL 3483171, at *18. As such, the substantial similarity doctrine applies to each phase of Harrington‘s crash testing.

Turning first to phase one, Harrington refers to this phase of testing as the “Daylight nominal FCW and AEB demonstrations.” ECF No. [349-3] at 60. The first phase of testing involved a straight, in-line vehicle approaching the rear of a Global Vehicle Target (“GVT“) at 25 miles per hour during the daytime. See ECF No. [349-3] at 73. Unlike the subject collision, the phase one testing occurred during the day and involved test vehicles approaching at speeds nearly 40 miles per hour slower than McGee‘s Vehicle. Furthermore, the target vehicle was in the same lane as the test vehicles, not on the side of the road, and the test vehicles approached the target vehicle from the rear rather than broadside. See id. The Court finds these dissimilarities to be material and substantial given that the Parties’ dispute centers on whether an ordinary consumer would reasonably expect Autopilot to avoid a broadside collision with a vehicle outside of its driving lane at night. Because the first phase of testing offers no insight on that issue, the testing cannot be presented to the jury.

Harrington‘s second phase of testing—the Nighttime FCW demonstration with broadside targets—sought to “evaluate and show how three aspects of this crash were challenging for the entire automotive industry.” ECF No. [349] at 8. The second phase of testing incorporated three variables from the collision: (1) nighttime lighting; (2) a side profile target; and (3) a high speed (60 mph) approach. Id. Within phase two, Harrington conducted three test scenarios. The first scenario involved a Global Vehicle Target. Id. The second scenario used “a 2010 Chevrolet Tahoe (the same vehicle involved in this crash) as the target” vehicle. Id. In the final scenario, “Harrington partially occluded the Chevrolet Tahoe by road end signs in an orientation consistent with reconstruction conducted by Tesla‘s Accident Reconstruction expert.” Id. Each scenario incorporated additional relevant characteristics of the subject collision. One of the key differences between the subject collision and the second phase of testing was that, in the second phase, “[t]he driver was instructed to drive straight in the center of the travel lane towards the broadside of the target vehicle, and then steer to avoid the target upon reaching a cone positioned approximately 141 feet from the target (i.e., approximately 1.6 s time to collision).” ECF No. [349-3] at 77. “The driver was further instructed to not apply the brakes before the steering maneuver unless an abort was necessary.” Id. Additionally, none of the three scenarios included a flashing stop light or a stop sign despite both being detected by McGee‘s Vehicle prior to the subject collision. See ECF No. [349-3] at 11 (describing relevant circumstances about the subject collision).

The Court does not find that either the swerving maneuver or the lack of a stop light or stop signs renders the phase two testing so dissimilar that it must be excluded.62 Indeed, it appears that the phase two testing substantially replicates the subject accident in all material respects up until the final moments when the FCW system should have given a warning of an obstacle in the roadway. The testing was done at night, with a test vehicle traveling approximately 61 miles per hour headed toward a broadside vehicle. Because the phase two testing was focused on the effectiveness of the FCW system and not the AEB system, the Court does not believe that Tesla‘s failure to replicate the crash event itself precludes a finding of substantial similarity. Moreover, the key dispute between the Parties is whether FCW and AEB systems were capable of reliably issuing a warning or stopping a vehicle that was traveling at high speed towards the side profile of a parked vehicle at night. Since Harrington‘s phase two testing incorporates all of these relevant variables, the Court finds the testing to be substantially similar. See Rappuhn, 2024 WL 2930448, at *4 (“Typically, the ‘failure to include variables’ in an expert‘s testing ‘will affect the analysis[’s] probativeness, not its admissibility.‘“) (quoting Quiet Tech., 326 F.3d at 1346) (additional citations and quotation marks omitted).

Tesla argues that the third phase of Harrington‘s testing—the Nighttime Collision Scenario demonstration—also involved variables substantially similar to those involved in the subject collision. The third phase was a two-vehicle test intended to evaluate both the FCW and AEB systems under conditions similar to those in the subject collision. See ECF No. [349-3] at 82. The third phase of testing was conducted at night, recreated the signs and lights at the intersection, placed a black 2010 Chevrolet Tahoe behind the signs, and had the test vehicles approach the obstacles at the same speed as McGee did on the night of the collision.63 Although Plaintiffs identify several differences between the conditions in Harrington‘s testing and the subject collision, just as in the second phase, those environmental differences are not significant or substantial and therefore go to the weight of the evidence rather than its admissibility.64 The only potential material difference is Harrington‘s decision not to engage the test vehicles’ ADAS in any phase of testing. See ECF No. [322] at 10. According to Tesla, turning off the ADAS in the test vehicles is an accurate rendering of what took place in the subject collision because McGee overrode the Autopilot in the Vehicle by pressing the accelerator. ECF No. [349] at 14. Plaintiffs, on the other hand, argue there is no dispute that Autopilot was activated at the time of the subject collision and that the relevant safety features were still operational, notwithstanding McGee‘s increased speed. See ECF No. [322] at 9. Therefore, according to Plaintiffs, Harrington‘s decision not to activate the test vehicle‘s ADAS failed to faithfully recreate the collision in any phase of the testing. See id. However, whether all the relevant Autopilot features in McGee‘s Vehicle were operational at the time of the collision is a disputed issue of material fact. See supra note 6. As such, the Court cannot exclude the testing based on the substantial similarity doctrine since the Parties dispute what an accurate recreation of the collision entails. Indeed, if the jury determines that McGee overrode his Autopilot system in the seconds before the collision, Harrington‘s testing is likely to be helpful in resolving issues of causation. As such, the Court will likely instruct the jury that none of the vehicles in the testing had their ADAS system engaged, but it is the jury‘s job to determine whether McGee‘s Vehicle had the Autopilot system fully engaged at the time of the collision. Accordingly, the Court finds that both Harrington‘s phase two and phase three testing are substantially similar to the subject collision and therefore will be admissible at trial.

C. Tesla‘s Motion for Summary Judgment

Tesla contends that the undisputed evidence demonstrates that summary judgment is proper on each of Plaintiffs’ claims because Plaintiffs have failed to establish that Tesla proximately caused Plaintiffs’ injuries or that the Autopilot system was defectively designed.

i. Design Defect

a. Proximate Cause

Tesla argues two separate grounds for summary judgment on Plaintiffs’ design defect claim—lack of proximate cause and failure to establish a defect in the Autopilot‘s design. Tesla maintains that, to prove proximate cause for a complex and technical system like Autopilot, Plaintiffs must set forth admissible expert testimony on the issue. ECF No. [326] at 9. Because, in Tesla‘s view, Plaintiffs’ experts are excludable under Daubert, Plaintiffs cannot carry their burden. Id.

Moreover, Tesla argues that even if Plaintiffs’ experts are permitted to testify, the design defect claim still fails because McGee‘s negligence, not the Autopilot system, was the sole proximate cause of the collision and Plaintiffs’ injuries. Id. at 10. According to Tesla, while it is true that proximate cause is typically an issue that should be left to the jury, the Court may grant summary judgment where the record reflects something other than the defendant‘s conduct was the exclusive cause of the plaintiff‘s injuries. Id. Tesla maintains that the record reflects that “McGee‘s negligence was the sole proximate cause of the accident.” Id. at 11. Tesla first notes that “McGee was very familiar with Card Sound” Road and that he caused the collision. Id. at 11-12. Specifically, McGee made four notable admissions:

  1. “[H]e became distracted from his driving responsibilities when he dropped his cell phone.”
  2. “He looked down, and when he did so, ignored multiple traffic control devices including a visible stop sign, an overhead red flashing stop light, and multiple road signs, and drove straight through the intersection.”
  3. “[T]he signs were visible if he had looked up and nothing prevented him from stopping.”
  4. “[He] was driving. [He] dropped [his] phone and looked down and [he] ran the stop sign and hit the guy‘s car.”

Id. at 12. Tesla contends McGee also provided testimony that “he was aware of the risks of operating his vehicle while Autopilot was engaged[,] . . . Autopilot did not make the car ‘self-driving[,]’ . . . [and that he] was highly aware that it was still [his] responsibility to operate the vehicle safely” even with Autopilot activated. Id. at 13.

Tesla insists that not only do McGee‘s admissions demonstrate he was the sole cause of the collision, but Plaintiffs’ expert testimony supports such a finding. According to Tesla, Moore “conceded that Autosteer did not cause the accident,” and “McGee overrode TACC in the more than 30 seconds before the crash when he pressed the accelerator pedal to achieve a speed of 62 miles per hour” even though the TACC was set at 45 miles per hour. Id. at 11. Tesla claims that under those circumstances, a message would appear on the Vehicle‘s display after six seconds informing the driver that the Vehicle would no longer automatically brake. Id.

Tesla further argues that the experts all agree that:

  1. “The operator of Level 2 vehicles, like the Model S, is in control of the vehicle and responsible for what occurs in the vehicle.”
  2. “This accident was avoidable by an attentive driver.”
  3. “McGee did not fulfill his role as a Level 2 driver in supervising the driving automation system and intervening to maintain safe operation of the vehicle.”
  4. “In the last five seconds, McGee was in a better [position] than Tesla to avoid the crash.”

Id. at 13. Given those facts, Tesla contends that it is “indisputable” that “Autopilot could not be the cause of the accident” and that “this accident was solely caused by McGee‘s own conscious and deliberate decision to pick up his phone while manually operating his vehicle.” Id. at 12-14. Tesla reasons that, since no change to Autopilot‘s design would have prevented McGee from being in control of the Vehicle or kept him from using his phone or reaching down for it, Autopilot could not be considered a proximate cause of the collision.

Even assuming the Court is not persuaded that McGee was the sole proximate cause of the collision, Tesla argues it is still entitled to summary judgment on the design defect claim because Plaintiffs have not “establish[ed] through expert testimony, that a specific defect [in the Autopilot] . . . was, more likely than not, a substantial contributing factor in causing the claimed injuries.” Id. at 14 (citing Gooding v. Univ. Hosp. Bldg. Inc., 445 So. 2d 1015, 1020 (Fla. 1984)). Although Plaintiffs’ expert, Moore, opined that the collision would not have occurred had Autopilot not been available, Tesla contends “Moore‘s causation opinion is nothing more than speculation,” especially given that Moore conceded:

  1. “Drivers get distracted all the time.”
  2. “Driver distraction is not limited to Tesla drivers who use Autopilot” and “[t]he problem of using cell phones and taking one‘s eyes off the road existed before Autopilot.”
  3. “With or [w]ithout Autopilot, people run the stop sign at the Card Sound Road intersection.”
  4. “With or [w]ithout Autopilot engaged, McGee could have been using his phone[,] . . . been distracted by his phone[,] and crashed into the Tahoe.”
  5. “Disabling Autopilot or designing Autopilot to not engage outside of its ODD would not have prevented McGee from accelerating it.”

Id. at 15-16 (emphasis removed). Since Plaintiffs’ expert acknowledges that disabling Autopilot “could not have prevented McGee from using his phone and could not have prevented McGee from crashing,” Tesla argues that it “stands to reason that the use of Autopilot was not a substantial factor in causing Plaintiffs’ injuries.” Id. at 16 (emphasis removed).

Tesla maintains Cummings’ testimony that “as a result of alleged defects in Autopilot‘s design, McGee was complacent and confused as to Autopilot‘s capabilities and such [complacency] and confusion resulted in the accident” is equally speculative. Id. at 17. While Cummings opined that “individuals who use Autopilot get complacent and engage in riskier behaviors than they would without Autopilot engaged,” Tesla argues Cummings made several concessions that undermine the relevance of that opinion here. Id. Like Moore, Cummings acknowledged that “[c]ell phone usage is a distraction that is independent of Autopilot,” and that the National Highway Traffic Safety Administration‘s research on distracted driving “identifies cell phone use as a source of distraction for all drivers.” Id. Furthermore, Cummings concedes that “[t]here is no specific study to support her opinion that drivers of [T]esla with Autopilot [ ] look away and reach for dropped items with a greater frequency than drivers of vehicles without ADAS.” Id. She also admits that “[w]ith or without Autopilot engaged, McGee could have engaged in distracted behavior and that [i]t was speculative to opine as to whether McGee would have used his cell phone while approaching the intersection without Autopilot.” Id.65

Plaintiffs argue that Tesla cannot show that McGee was the sole proximate cause of the collision. They point out that Tesla‘s conduct need not be the sole or even primary cause of the injuries, only a substantial cause of Plaintiffs’ injuries. ECF No. [352]. Moreover, Tesla may still be liable even if McGee constitutes an intervening tortfeasor, as long as McGee‘s negligence was reasonably foreseeable, as “manufacturers have a duty to guard against foreseeable carelessness.” Id. at 9. Plaintiffs insist that while Tesla can seek a charge of comparative negligence, given the record in this case, Tesla may not avoid liability altogether simply by asserting that McGee was an intervening cause. Id. at 10.66 To show that McGee was not the sole cause, Plaintiffs point to their experts’ opinions that McGee‘s negligence was a predictable consequence of Tesla‘s defective design, which led to drivers being less engaged and more distracted while operating their vehicles. Id. There are also the NTSB and NHTSA determinations that “Tesla‘s Autopilot system makes drivers unduly complacent.” Id. Plaintiffs argue these facts alone are sufficient to establish that Tesla was a proximate cause of the collision, as McGee‘s conduct was a foreseeable consequence of incentivizing misuse or abuse of the Autopilot system.

Moreover, Plaintiffs dispute Tesla‘s contention that McGee accepted sole responsibility for the collision or that Plaintiffs’ experts conceded the issue of proximate cause. Plaintiffs argue that “although McGee accepted responsibility for his contribution to the crash, he also blamed [T]esla for the Vehicle‘s defective design.” Id. at 12. Plaintiffs also contend that Tesla places undue weight on specific statements of Plaintiffs’ experts while ignoring the experts’ overarching opinions. For instance, Tesla ignores Moore‘s opinion that the “crash was caused by Tesla‘s failure to do even one of a number of things that would have prevented this crash including (but not limited to)“:

  1. “[D]isabling Autosteer for a week after . . . three forced disengagements.”
  2. Informing the driver that they could reengage autosteer after disablement by simply pulling over, putting the car in park for a moment, and then continuing with their journey.
  3. Limiting use of the Autopilot system to its ODD.
  4. “[F]or every disengagement, provide a training interval for the driver. Explain to the driver why it disengaged rather than simply saying ‘Autosteer unavailable for the rest of the drive. Hold steering to drive manually.‘”

Id. at 12-13 (quoting ECF No. [351] at ¶ 140). Furthermore, Moore testified that the augmented video from the Vehicle indicates that Autopilot detected the stop sign, the end of the road, the Tahoe, and Plaintiff Angulo but failed to otherwise timely warn McGee or stop the Vehicle before the crash. Id. at 13. According to Moore, had Autopilot braked or audibly warned McGee of the obstacles once those obstacles were detected, there would have been enough time for the Vehicle “or McGee to act and avoid the accident completely, or, at a minimum, significantly reduce the speed at impact.” Id. at 13. Given these opinions, Plaintiffs maintain that their design defect claim is still viable, notwithstanding that McGee‘s negligence might have contributed to the collision. See id.

b. Whether Tesla‘s Autopilot Contains a Defect

Tesla argues that Plaintiffs also fail to establish their design defect claim because they cannot prove that Autopilot is defective or unreasonably dangerous. Tesla first addresses Plaintiffs’ experts’ claim that “the Model S was defective because it failed to prevent drivers from engaging Autopilot on roads that were not designated for its use.” ECF No. [326] at 21. Tesla asserts that the Autopilot‘s design “was consistent with the state of the art and industry standards in 2019” and with guidance issued by the Society of Automotive Engineers (“SAE“) that stated, “for Level 2 vehicles (like the Model S), the driver alone should determine when to use what features,” including when and where Autopilot should be engaged. Id. Therefore, because the state-of-the-art technology in 2019 required a driver to always be engaged and responsible for their vehicle, Tesla argues that Plaintiffs’ experts essentially conclude that “Autopilot is defective because it does not prevent drivers from being irresponsible.” Id. Since manufacturers are not insurers and are not obligated to design a foolproof product that prevents any form of misuse, Tesla contends that Plaintiffs’ experts’ opinions on the parameters of the ODD should not go before the jury.

Tesla next addresses the contention that the Driver Monitoring System (DMS) failed to adequately monitor driver awareness and thus allowed a driver to misuse and abuse Autopilot. See id. at 22. Tesla argues that by concluding that DMS should have done a better job of keeping the driver engaged, Plaintiffs’ experts are essentially insisting that Tesla should have created an error tolerant system before it put the Autopilot system on the market—a burden that Tesla is not required to satisfy. Id.

As for the alleged defects in Autopilot‘s Automatic Emergency Brake (AEB) system and Forward Collision Warning (FCW) system, Tesla argues that they were not defective simply because the AEB system did not timely apply the emergency brakes and the FCW system did not issue a warning when it detected possible obstacles. Id. at 23. According to Tesla, the systems were only designed “to prevent a collision with a lead vehicle traveling in front of and in the same lane as the Tesla, not to detect and prevent collisions with the side profile of a stationary vehicle like the instant crash.” Id. at 23. This is why neither of Plaintiffs’ experts was willing to opine that the AEB or FCW system is defective. Tesla contends not only that its Model S lacked the technology to automatically stop under the circumstances present here, but also that “there is no evidence that any other car would have been able to do so.” Id. at 23.67 Accordingly, Tesla argues that because none of the features of the Autopilot were defective, Plaintiffs cannot prove their design defect claim.

Plaintiffs maintain that they have provided sufficient evidence of a defective design in Tesla‘s Autopilot system, specifically, defects that “‘created the consumer expectation’ that Autopilot would protect them from the consequences of careless driving.” ECF No. [352] at 15.

Turning to the first alleged defect, Plaintiffs argue that permitting drivers to use Autopilot outside of its ODD was not “‘consistent with state of the art’ [technology] in 2019” and was indeed a defect in the Autopilot design given the likely risk of serious injury if not utilized as Tesla intended. Id. at 15. According to Plaintiffs, their experts testified that as of 2019, Cadillac had already implemented a “Super Cruise” system that prohibited drivers from using the system outside of its ODD, and in 2016, General Motors was already “‘geofencing‘—meaning that their Super Cruise technology could not be used outside of its ODD” either. Id. at 15. Moreover, in 2017, NTSB recommended that “Tesla and other OEMs ‘incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed (H-17-41).‘” Id. (quoting ECF No. [347-5] at 43). Since NTSB “would not have made this recommendation if it had not been technically feasible,” Plaintiffs argue that Tesla cannot prove that its Autopilot technology was “state of the art.” Id.

Plaintiffs also argue that Autopilot was defective because the DMS did not adequately ensure the driver remained engaged and attentive to the road while operating the Vehicle. Plaintiffs contend that in opining that the DMS was defective, Plaintiffs’ experts are not insisting Tesla designed an “error tolerant system” or “a perfect car,” “they simply opine that [Tesla] easily could have made a far less deadly one.” Id. at 16.

With respect to the AEB and FCW systems, Plaintiffs dispute Tesla‘s contention that Plaintiffs’ experts never opined that AEB and FCW are defective. Id. at 17. According to Plaintiffs, when asked “whether the forward collision warning and automatic braking [systems] . . . in the Tesla Model S were defective,” Cummings responded that, “based on the augmented video, it was clear that the Vehicle malfunctioned,” seeing as the Vehicle correctly detected obstacles but still failed to brake or alert McGee of the obstacles, despite “what the owner‘s manual says it will do.” Id. at 17 (quoting ECF No. [318-8] at 8:105, 33:2-8; 51:4-6). While Cummings could not definitively conclude whether the AEB system was defective, it was because “it wasn‘t clear whether ‘Autopilot was suppressing the control’ or AEB failed independently of Autopilot.” Id. (quoting ECF No. [318-8] at 64:1-3, 71:20-72:2). Cummings opined there was a defect in the Vehicle‘s design. However, because the Autopilot, AEB, and FCW systems “are highly interdependent” and the relationship between the systems is “opaque,”68 “Cummings could not—and would not—say that the crash was exclusively caused by Autopilot or by some combination of failures by Autopilot and AEB/FCW.” Id. at 17. Therefore, even though Plaintiffs’ expert could not definitively conclude there was a defect in the Autopilot‘s utilization of the AEB and FCW systems, the factual dispute is significant enough that the Court should not grant summary judgment on the issue of whether the Vehicle‘s Autopilot contributed to the collision. Id. at 18.69

“Under Florida law, a strict products liability action based upon design defect requires the plaintiff to prove that (1) a product (2) produced by a manufacturer (3) was defective or created an unreasonably dangerous condition (4) that proximately caused (5) injury.” Brosius v. Home Depot Inc., No. 6:20-cv-1136, 2022 WL 1272087, at *4 (M.D. Fla. Feb. 8, 2022) (citing McCorvey v. Baxter Healthcare Corp., 298 F.3d 1253, 1257 (11th Cir. 2002); West v. Caterpillar Tractor Co., 336 So. 2d 80, 87 (Fla. 1976)). “To prevail in a products liability case under Florida law for . . . strict liability, Plaintiff must establish a defect in the subject product” that existed “both at the time of the accident and at the time the product was within the possession of the manufacturer.” Zuccaro v. Tricam Indus., No. 9:21-cv-80867, 2022 WL 17750747, at *10 (S.D. Fla. Sep. 12, 2022); Jozwiak v. Stryker Corp., No. 6:09-cv-1985, 2010 WL 743834, at *6 (M.D. Fla. Feb. 26, 2010) (citing Jones v. Heil Co., 566 So. 2d 565, 567 (Fla. 1st DCA 1990); Builders Shoring and Scaffolding Equip. Co., Inc. v. Schmidt, 411 So. 2d 1004, 1006 (Fla. 5th DCA 1982)). “A product may be defective by virtue of a design defect, a manufacturing defect, or an inadequate warning.” Brosius, 2022 WL 1272087, at *4 (quoting Jennings v. BIC Corp., 181 F.3d 1250, 1255 (11th Cir. 1999)). “The burden to show a defective design exists is on the plaintiff.” Farias v. Mr. Heater, Inc., 757 F. Supp. 2d 1284, 1293 (S.D. Fla. 2010) (citation omitted).

Florida law generally requires a plaintiff to show a product was defectively designed under the “consumer-expectation test,” the “risk-utility test,” or both. Aubin v. Union Carbide Corp., 177 So. 3d 489, 510-11 (Fla. 2015); see also Cates v. Zeltiq Aesthetics, 73 F.4th 1342, 1351 (11th Cir. 2023) (explaining that “[t]wo different tests determine whether a product is defective: (1) the consumer expectations test and (2) the risk utility test.“). Under the consumer-expectation test, a product is defective if “the product fails to perform as safely as an ordinary consumer would expect when used as intended or in a manner reasonably foreseeable by the manufacturer.” Tillman v. C.R. Bard, Inc., 96 F. Supp. 3d 1307, 1338-39 (M.D. Fla. 2015); RESTATEMENT (SECOND) OF TORTS § 402A; see also Aubin, 177 So. 3d at 504 (“[U]nder the consumer-expectation theory[,] a product is defectively designed if the plaintiff is able to demonstrate that the product did not perform as safely as an ordinary consumer would expect when used in the intended or reasonably foreseeable manner.“). The risk-utility test requires a plaintiff to demonstrate that “the foreseeable risks of harm posed by the product could have been reduced or avoided by the adoption of a reasonable alternative design . . . and the omission of the design renders the product not reasonably safe.” Cates, 73 F.4th at 1351. Regardless of which test is applied, however, “[a] manufacturer is not under a duty in strict liability to design a product which is totally incapable of injuring those who foreseeably come in contact with the product.” Grieco v. Daiho Sangyo, Inc., 344 So. 3d 11, 19 (Fla. 4th DCA 2022) (quoting Husky Indus. v. Black, 434 So. 2d 988, 991 (Fla. 4th DCA 1983)) (additional citations and quotation marks omitted); see also West, 336 So. 2d at 87.
“Products liability does not make the manufacturer an insurer of all foreseeable accidents which involve its products [because] [v]irtually any product is capable of producing injury when put to certain uses or misuses.” Grieco, 344 So. 3d at 18-19. Therefore, a manufacturer will not be liable simply because the design is not the safest design possible, so long as the design is not unreasonably dangerous. Id. (quoting Brown v. Glade & Grove Supply, Inc., 647 So. 2d 1033, 1035 (Fla. 4th DCA 1994)).

“Design defects must be proven by expert testimony,” and the plaintiff must show that “but for the defect, the injury would not have occurred.” Alvarez v. General Wire Spring Co., No. 8:07-cv-1319-T-33TGW, 2009 WL 248264, at *4 (M.D. Fla. Feb. 1, 2009) (citing Drury v. Cardiac Pacemakers, Inc., No. 8:02-cv-933T-17MAP, 2003 WL 23319650, at *4 (M.D. Fla. June 3, 2003)); R.J. Reynolds Tobacco Co. v. Nelson, 353 So. 3d 87, 93 (Fla. 1st DCA 2022) (quoting Aubin, 177 So. 3d at 513) (internal quotations omitted). Additionally, “[t]o prove causation under a strict products liability theory, a plaintiff must prove that the product defect proximately caused his injury.” Rink v. Cheminova, Inc., 400 F.3d 1286, 1295 (11th Cir. 2005). To survive summary judgment, the plaintiff “must introduce evidence which affords a reasonable basis for the conclusion that it is more likely than not that the conduct of the defendant was a substantial factor in bringing about the result.” Guinn v. AstraZeneca Pharm. LP, 602 F.3d 1245, 1256 (11th Cir. 2010) (quoting Gooding v. Univ. Hosp. Bldg., Inc., 445 So. 2d 1015, 1018 (Fla. 1984)). “[A] mere possibility of causation is not enough.” Pierre v. Intuitive Surgical, Inc., 854 F. App‘x 316, 317 (11th Cir. 2021) (internal quotation omitted). Accordingly, where the issue of causation is “one of pure speculation or conjecture, or the probabilities are at best evenly balanced, it becomes the duty of the court to direct a verdict for the defendant.” Guinn, 602 F.3d at 1256.

The Court finds that Plaintiffs have provided evidence affording a reasonable basis to conclude that, more likely than not, a defect in the Autopilot system was a substantial factor in bringing about Plaintiffs’ injuries. As the Court discussed in its Daubert analysis, Plaintiffs offer sound, competent, and reliable expert testimony from Moore and Cummings that Autopilot has several design defects (e.g., the ODD Defect, DMS Defect, and TACC Defect) that they opined were a substantial cause of the subject collision. Accordingly, the Court rejects Tesla‘s request for summary judgment based on the absence of expert causation testimony. The Court also rejects Tesla‘s argument that McGee was the indisputable sole, substantial cause of the collision such that Tesla may avoid all liability for any purported defect in Autopilot‘s design. While McGee conceded that he was responsible for operating the Vehicle safely and failed to do so, that does not necessarily lead to the conclusion that he alone is responsible for the resulting collision, particularly given McGee‘s testimony that he expected Autopilot to avoid the collision. See ECF No. [318-9] at 46:3-5, 176:7. The law is clear that there may be multiple proximate causes of a given injury so long as each is a substantial cause of the injury. See Ruiz v. Tenet Hialeah Healthsystem, Inc., 260 So. 3d 977, 982 (Fla. 2018) (“[T]he law does not require an act to be the exclusive or even the primary cause of an injury in order for that act to be considered the proximate cause of the injury[.]“). As explained above, there is sufficient and reliable record evidence that both McGee and Tesla substantially contributed to the collision. Plaintiffs have adequately demonstrated that Autopilot‘s design led Tesla drivers, and McGee in particular, to become complacent and over-rely on Autopilot to operate their vehicles.
Additionally, Plaintiffs provided evidence that allowing Tesla drivers to utilize Autopilot outside of its ODD and without any training was unreasonably dangerous as it increased the likelihood that Tesla owners would attempt to rely on Autopilot in conditions where Tesla knew the technology was unlikely to succeed. As Plaintiffs’ experts opine, if Tesla had eliminated, or at least mitigated, these foreseeable dangers, McGee would likely have been paying more attention and would have been better prepared to avoid the subject collision.70 Because the Court has already determined that Plaintiffs’ expert opinions as to the design defects and causation are not based on mere conjecture or speculation, the Court will not grant summary judgment on Plaintiffs’ design defect claim as those issues are materially in dispute. See Jackson v. H.L. Bouton Co., 630 So. 2d 1173, 1175 (Fla. 1st DCA 1994).

ii. Failure to Warn (Strict Products Liability)

Tesla argues Plaintiffs have failed to establish each element of their failure to warn claim. First, Tesla contends that Tesla only has an obligation to warn a consumer of dangers where the consumer is unaware of the danger involved and the danger would not otherwise be obvious to a reasonable person. ECF No. [326] at 22. Therefore, Plaintiffs cannot establish a failure to warn under the circumstances because “the dangers of operating a vehicle without paying attention are both obvious and known,” even when Autopilot is engaged, and “McGee knew the dangers of operating a vehicle, whether equipped with Autopilot or not, while distracted.” Id. at 26. Indeed, McGee specifically acknowledged that he (1) “knew the vehicle was not self-driving,” (2) that “he was responsible for the vehicle,” and (3) that he “purposefully did not purchase a package that provided hardware compatible with future software development which may enable the vehicle to be ‘self-driving.‘” Id. Consequently, “[b]ecause McGee was clearly aware of the universally understood danger posed by failing to control a vehicle on the road . . . Tesla had no duty to warn him of these dangers.” Id.

Moreover, even if Tesla did owe a duty to McGee, Tesla argues its “extensive warnings” were not only “accurate, clear, and unambiguous,” they were adequate as a matter of law. Id. Tesla notes that “[t]he first time McGee used Autosteer, he was required to acknowledge and indicate acceptance of its limitations,” including that he should only use Autosteer if he was willing to keep his hands on the steering wheel, pay close attention to the road, and override the Autosteer function if necessary. Id. at 27. Tesla maintains no other warnings were necessary to ensure that McGee could safely operate the Vehicle.

Tesla further argues that, even if its warnings were inadequate, Plaintiffs have failed to establish that the alleged failures to warn proximately caused their injuries. Id. First, assuming the Court grants Tesla‘s Daubert motion, Plaintiffs will not be able to establish proximate cause with requisite expert testimony. However, even if the Court permits the expert testimony, Plaintiffs still cannot establish proximate cause because McGee admitted he did not read the existing warning, and even if he did, “there is no evidence that any particular warning would have affected McGee[‘s]” conduct on the day of the collision. Id. Given those deficiencies, Tesla maintains that summary judgment on Plaintiffs’ failure to warn claim must be granted.

Plaintiffs respond that the failure to warn claims are viable, well supported by the record evidence, and the dangers associated with the Autopilot system were not known to McGee or otherwise open and obvious to the average consumer. ECF No. [352] at 19-20. According to Plaintiffs, the dangers of Autopilot were not open and obvious given that the term “Autopilot” can be a confusing misnomer and any “obvious” risks were “contradicted by Tesla‘s own marketing of the cars as miracles of modern technology.” Id. at 19 (citing ECF No. 347-2 at 6-7). Plaintiffs also point to McGee‘s purported confusion about the Vehicle‘s Autopilot. Although he recognized he was ultimately responsible for operating the car, Plaintiffs highlight that “McGee purchased the ‘highest package’ available thinking it would ‘keep him in the lane, avoid crashes, and direct him to where he needed to go.‘” Id. at 20 (alterations adopted) (quoting ECF No. [351] at ¶¶ 99, 103-05).

Plaintiffs also disagree that Tesla‘s warnings are adequate as a matter of law. Plaintiffs argue that Tesla has made no showing that the warnings were “accurate, clear, [or] unambiguous” while Plaintiffs have established facts in the record to the contrary. Id. at 20. Plaintiffs contend that the record supports a finding that the following three warnings were inadequate and that each contributed to the collision: (1) “Tesla failed adequately to warn that the Vehicle had an Autopilot that was still in Beta;” (2) the Owner‘s Manual warnings “were inadequate due to their inaccessibility to consumers,” given that the Owner‘s Manual was only available in electronic format and difficult to access; and (3) the “Vehicle‘s DMS failed to provide sufficient warnings to ensure driver attentiveness.” Id. at 21.

Plaintiffs maintain there is also sufficient evidence to establish proximate cause for each of these alleged defective warnings. According to Plaintiffs, the issue of lack of proximate cause only goes to the issue of the warnings in the Owner‘s Manual and not the adequacy of the Beta warning or the Vehicle‘s warnings to regain the driver‘s attention. In any event, Plaintiffs dispute that a failure to read the warnings in the Owner‘s Manual is dispositive, given the fact that the Owner‘s Manual was difficult to access due to its electronic format. Therefore, even assuming the content of the Owner‘s Manual contained sufficient warnings, the accessibility of those warnings rendered them defective.71

“Under Florida law, ‘a product may be in a defective condition due to a . . . defective warning.’ Brown v. Glade & Grove Supply, Inc., 647 So. 2d 1033, 1035 (Fla. 4th DCA 1994) (internal citations omitted). Consequently, a manufacturer may be liable in tort for introducing an otherwise safe product into the stream of commerce solely by virtue of inadequate warnings.” Veliz v. Rental Servs. Corp. USA, Inc., 313 F. Supp. 2d 1317, 1324 (S.D. Fla. 2003) (citing Ferayorni v. Hyundai Motor Co., 711 So. 2d 1167, 1170 (Fla. 4th DCA 1998)). To succeed on a failure to warn claim under a theory of strict liability,72 “a plaintiff must show (1) that the product warning was inadequate;73 (2) that the inadequacy proximately caused her injury; and (3) that she in fact suffered an injury from using the product.” Eghnayem v. Bos. Sci. Corp., 873 F.3d 1304, 1321 (11th Cir. 2017) (citing Hoffmann-La Roche Inc. v. Mason, 27 So. 3d 75, 77 (Fla. 1st DCA 2009)). “‘While in many instances the adequacy of warnings . . . is a question of fact,’ the Florida Supreme Court has held that ‘it can become a question of law where the warning is accurate, clear, and unambiguous.‘” Id. (quoting Felix v. Hoffmann-LaRoche, Inc., 540 So. 2d 102, 105 (Fla. 1989)). “To warn adequately, the [warning] label must make apparent the potential harmful consequences” and “must be of such intensity as to cause a reasonable man to exercise for his own safety caution commensurate with the potential danger.” Farias v. Mr. Heater, Inc., 684 F.3d 1231, 1233 (11th Cir. 2012) (quoting Scheman-Gonzalez v. Saber Mfg. Co., 816 So. 2d 1133, 1139 (Fla. 4th DCA 2002) (internal quotations omitted)). The adequacy of a given warning is “determined by a ‘reasonable person’ standard, rather than on each particular plaintiff‘s subjective appreciation of the danger.” Byrnes v. Honda Motor Co., Ltd., 887 F. Supp. 279, 281 (S.D. Fla. 1994).

However, a failure to warn claim may still fail as a matter of law even where the warnings are deemed inadequate. Indeed, “[u]nder Florida law [ ], ‘where the person to whom the manufacturer owed a duty to warn . . . has not read the label, an inadequate warning cannot be the proximate cause of the plaintiff‘s injuries.‘” Leoncio v. Louisville Ladder, Inc., 601 F. App‘x 932, 933 (11th Cir. 2015) (quoting Lopez v. So. Coatings, Inc., 580 So. 2d 864, 865 (Fla. 3d DCA 1991) (record quotation omitted)). Still, a plaintiff‘s failure to warn claim may still survive, notwithstanding the consumer‘s failure to read the warning, where the very nature of the defendant‘s breach is such that it causes the consumer to fail to read the warning that would have prevented the injury. See Stanley Indus., Inc. v. W.M. Barr & Co., 784 F. Supp. 1570, 1575 (S.D. Fla. 1992) (“[A] plaintiff who does not read an allegedly inadequate warning cannot maintain a negligent-failure-to-adequately warn action unless the nature of the alleged inadequacy is such that it prevents him from reading it.“); Thomas v. Bombardier Recreational Prods., Inc., 682 F. Supp. 2d 1297, 1301 (M.D. Fla. 2010) (“[W]hen plaintiff cannot read the [warning] due to the fault of the manufacturer, her failure to warn theory can be viable.“).

Before deciding whether Tesla provided adequate warnings regarding its Autopilot system as a matter of law, the Court must first determine whether Tesla had a duty to provide any warnings. See Cook v. MillerCoors, LLC, 829 F. Supp. 2d 1208, 1214 (M.D. Fla. 2011) (“In order to prevail on a failure-to-warn claim, a plaintiff must establish the existence of a duty.“). “[A] manufacturer has a duty to warn where its product is inherently dangerous or has dangerous propensities,” and also has “a duty to warn of dangerous contents in its product which could damage or injure even when the product is not used for its intended purpose.” Grieco, 344 So. 3d at 20-21 (quoting Scheman-Gonzalez, 816 So. 2d at 1139 and High v. Westinghouse Elec. Corp., 610 So. 2d 1259, 1262 (Fla. 1992) (internal quotations omitted)). “However, there is no duty to warn of an obvious danger,” a commonly known danger, or “a danger that [the user] is aware of.” Rodriguez v. New Holland North America, Inc., 767 So. 2d 543, 544-45 (Fla. 3d DCA 2000) (quoting Siemens Energy & Automation, Inc. v. Medina, 719 So. 2d 312, 314 (Fla. 3d DCA 1998) (additional level of citation and quotation omitted); citing Wickham v. Baltimore Copper Paint Co., 327 So. 2d 826, 827 (Fla. 3d DCA 1976)); Garcia v. Character Tech., Inc., No. 6:24-cv-1903, 2025 WL 1461721, at *16 (M.D. Fla. May 21, 2025) (“A plaintiff‘s knowledge of the risks and possible consequences associated with a product likewise extinguishes proximate cause.“).74 “The obviousness of a danger . . . [is] determined by a ‘reasonable person’ standard,” whereas the consumer‘s knowledge of the danger is a subjective inquiry. Byrnes, 887 F. Supp. at 281. “While in many cases, the obvious and common nature of a peril will be a question for the jury,” such questions “may be determined as a matter of law in ‘plain and palpable cases’ where reasonable minds could not differ.” Marzullo v. Crosman Corp., 289 F. Supp. 2d 1337, 1346 (M.D. Fla. 2003).

Tesla maintains that the danger here, specifically the “dangers of operating a vehicle without paying attention[,] are both obvious and known.” ECF No. [326] at 25. According to Tesla, “[e]very licensed driver knows that there is no more fundamental requirement for the safe . . . operation of a vehicle than to watch the road ahead to avoid the risk of serious injury or death“—even when using Tesla‘s Autopilot system or some other Advanced Driver Assistance system. Id. Although courts have recognized the public is aware of “the inherent dangers of operating a motor vehicle,” John Morrell & Co. v. Royal Caribbean Cruises, Ltd., 534 F. Supp. 2d 1345, 1351 (S.D. Fla. 2008), the Court disagrees with Tesla‘s framing of the relevant question. The appropriate inquiry is not whether driving distracted is an obvious danger. Rather, the question is whether the dangers of using and relying on driver assistance technology, specifically Tesla‘s Autopilot system, to avoid a collision is an obvious danger or at least a danger of which McGee was already aware. See, e.g., Byrnes, 887 F. Supp. at 281 (“A corollary rule is that where the potential danger posed by a product is open and obvious . . . there is no duty on the part of the manufacturer or distributor to warn users of said danger.“) (emphasis added). The Court does not find that the danger associated with relying on the Autopilot system to avoid a collision was such an obvious danger that no reasonable minds could differ.

As Cummings points out, the term “Autopilot” itself is a potentially confusing term as it arguably implies to the user that the 2019 Model S was more autonomous or at least had more capabilities than it really did.75 Not only is that the implication, but the record also indicates that Tesla encouraged that perception itself. ECF No. [351] at ¶ 168. “Tesla leadership encourages drivers to treat their cars as if they were actual self-driving vehicles.” ECF No. [347-2] at 6. There is no dispute that prior to the subject collision, the CEO of Tesla claimed that vehicles with Tesla‘s Autopilot system were “safer than human drivers” and would have saved “approximately half a million people” if Tesla‘s Autopilot system was universally available. ECF No. [351] at ¶¶ 189, 193 (emphasis added). Tesla even “posted a Vehicle Safety Report (‘VSR‘) on its website that purported to show that its cars using Autopilot were far safer than cars operating without Autopilot.” Id. at ¶ 171. Prior to the collision, Tesla‘s CEO further stated: “I think [Autopilot] will require detecting hands-on wheel for at least six months or something like that from here, really, it‘s a question of like, from a regulatory standpoint, what—how much safer than a person does auto-pilot need to be for it to be OK to not monitor the car?” Id. at ¶ 195 (emphasis removed). Given Tesla‘s numerous representations about the capabilities of the Autopilot system, there is at least a genuine dispute as to whether it would be obvious to the average consumer76 not to trust Tesla‘s Autopilot system to largely drive itself and avoid collisions.

Tesla contends even if the danger of overreliance on Autopilot technology was not obvious to a reasonable consumer, the company is still absolved of any duty to warn under the circumstances because McGee was personally aware that: (1) the Vehicle was not self-driving; (2) “he was responsible for the [V]ehicle;” and (3) he did not purchase a Tesla model that would eventually enable the Vehicle to be “self-driving.” ECF No. [326] at 26. However, the record on this issue does not definitively resolve whether McGee had sufficient knowledge of the capabilities and limitations of the Vehicle. While McGee conceded that he had a duty and obligation to pay attention and remain in control of the Vehicle, that testimony is not necessarily indicative of whether he had a sufficient understanding that the Vehicle would not always timely brake or otherwise avoid a collision when he failed to adequately pay attention to the road. Critically, during his deposition, McGee explained that he thought of Autopilot as a “copilot” and “understood it would keep [him] in the lane, avoid crashes, and direct [him] to where [he] needed to go.” ECF No. [318-9] at 46:3-5, 176:7. Based on McGee‘s testimony, a reasonable juror might conclude that McGee thought of himself like a captain of a plane or a ship and Autopilot like a first mate or copilot. While McGee, as the captain, was ultimately responsible for ensuring a successful voyage, he thought his copilot was largely capable of stepping in and safely operating the vessel. Therefore, given McGee‘s testimony, the Court is not convinced that Tesla lacked any duty to warn McGee of the potential dangers associated with the Autopilot system.77

Because the dangers of using Autopilot are not obvious as a matter of law, and there is a material dispute as to whether McGee fully appreciated Autopilot‘s capabilities and limitations, the Court will consider the adequacy of the warnings both on the Vehicle‘s display and in the Owner‘s Manual.

Tesla‘s Owner‘s Manual contains the following relevant warnings to drivers regarding the Autopilot system and its accompanying features:

  1. Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model S. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.
  2. Although Traffic-Aware Cruise Control is capable of detecting pedestrians and cyclists, never depend on Traffic-Aware Cruise Control to adequately slow Model S down for them. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.
  3. Warning: Traffic-Aware Cruise Control cannot detect all objects and, especially in situations when you are driving over 50 mph (80 km/h), may not brake/decelerate when a vehicle or object is only partially in the driving lane or when a vehicle you are following moves out of your driving path and a stationary or slow-moving vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
  4. Warning: Traffic-Aware Cruise Control can cancel unexpectedly at any time for unforeseen reasons. Always watch the road in front of you and stay prepared to take appropriate action. It is the driver‘s responsibility to be in control of Model S at all times.
  5. Warning: Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.
  6. Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.
  7. Warning: Steering interventions are minimal and are not designed to move Model S out of its driving lane. Do not rely on steering interventions to avoid side collisions.
  1. Forward Collision Warning - provides visual and audible warnings in situations when Model S detects that there is a high risk of a frontal collision.
  2. Automatic Emergency Braking - automatically applies braking to reduce the impact of a frontal collision (see Automatic Emergency Braking on page 103). Obstacle-Aware Acceleration reduces acceleration if Model S detects an object in its immediate driving path (see Obstacle-Aware Acceleration on page 104).
  3. Warning: Forward Collision Warning is for guidance purposes only and is not a substitute for attentive driving and sound judgment. Keep your eyes on the road when driving and never depend on Forward Collision Warning to warn you of a potential collision. Several factors can reduce or impair performance, causing either unnecessary, invalid, inaccurate, or missed warnings. Depending on Forward Collision Warning to warn you of a potential collision can result in serious injury or death.
  4. Warning: Automatic Emergency Braking is not designed to prevent all collisions. In certain situations, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death.
  5. Warning: Obstacle-Aware Acceleration is not designed to prevent a collision. In certain situations, it can minimize the impact of a collision. Depending on Obstacle-Aware Acceleration to avoid a collision can result in serious injury or death.
  6. Warning: Forward Collision Warning is designed only to provide visual and audible alerts. It does not attempt to apply the brakes or decelerate Model S. When seeing and/or hearing a warning, it is the driver‘s responsibility to take corrective action immediately.
  7. Warning: Automatic Emergency Braking is designed to reduce the severity of an impact. It is not designed to avoid a collision.
  8. Warning: Several factors can affect the performance of Automatic Emergency Braking, causing either no braking or inappropriate or untimely braking, such as when a vehicle is partially in the path of travel or there is road debris. It is the driver‘s responsibility to drive safely and remain in control of the vehicle at all times. Never depend on Automatic Emergency Braking to avoid or reduce the impact of a collision.
  9. It is the driver‘s responsibility to avoid collisions by staying alert, paying attention, and taking corrective action as early as possible.

ECF No. [318-10] at 84-85, 88-91, 94, 102, 104-06. In addition to warnings in the Owner‘s Manual, Tesla also provided additional “warnings” on the Vehicle‘s display. In particular, when using the Autosteer function, the Vehicle directed the driver to only enable Autosteer “if you are willing to pay close attention to the road and be prepared to override it at any time,” and under specific driving conditions. ECF No. [326] at 27 (quoting ECF No. [325] at ¶ 19). The Vehicle also “displayed a message telling [McGee] to keep his hands on the steering wheel and to remain prepared to take over.” Id.

Turning first to the sufficiency of the warnings on the Vehicle‘s display, the Court finds that those warnings cannot be deemed adequate as a matter of law. Florida courts have made clear that in order to adequately warn of a product‘s dangers, the “warning should contain some wording directed to the significant dangers arising from failure to use the product in the prescribed manner, such as the risk of serious injury or death.” Grieco, 344 So. 3d at 20 (quoting Scheman-Gonzalez v. Saber Mfg. Co., 816 So. 2d 1133, 1139 (Fla. 4th DCA 2002) (additional level of citation and quotation omitted)); Pinchinat v. Graco Children‘s Prods. Inc., 390 F. Supp. 2d 1141, 1146 (M.D. Fla. 2005) (“To warn adequately, the product label must make apparent the potential harmful consequences . . . [and] should contain some wording directed to the significant dangers arising from failure to use the product in the prescribed manner[.]“) (quoting Scheman-Gonzalez, 816 So. 2d at 1139-40) (internal quotations omitted). The warnings on the Vehicle display do not contain any warning as to the risk associated with failing to adhere to the directives contained therein and, therefore, are more akin to instructions than warnings. See Brito v. County of Palm Beach, 753 So. 2d 109, 112 (Fla. 4th DCA 1998) (noting that the alleged “warning” provided was insufficient because it constituted “an instruction, not a warning” since it did not advise of the significant danger associated with failing to use the product properly); Dye v. Covidien LP, 470 F. Supp. 3d 1329, 1340 (S.D. Fla. 2020) (explaining that the relevant inquiry is whether the manufacturer warned of the risks associated with failing to use the product in the prescribed manner). Because mere instructions on how to properly use the product do not satisfy a manufacturer‘s duty to warn, the messages on the Vehicle‘s display do not entitle Tesla to judgment as a matter of law.
The Court will therefore turn to the warnings contained in the Vehicle‘s Owner‘s Manual.

While the warnings in the Owner‘s Manual appear to be “accurate, clear, and unambiguous,”78 and there is no material dispute that McGee failed to read the Autopilot warnings in the Manual, the Court finds summary judgment is still not appropriate under the circumstances.79

Indeed, McGee testified that he was aware there was an Owner‘s Manual and that he had not read the relevant warnings contained therein. See ECF No. [325] at ¶¶ 23, 24; ECF No. [351] at ¶ 23; see also ECF No. [318-9] at 36:21-36:23, 181:4-181:8.80 Typically, such admissions would foreclose the possibility of establishing proximate cause because, as this Court explained in Patt v. Volkswagen Group of Am., Inc., No. 22-cv-21585, 2024 WL 1675301, at *13 (S.D. Fla. Apr. 18, 2024) (citing Leoncio, 601 F. App‘x at 933), under Florida law, a consumer‘s “failure to read the warning cuts off a defendant‘s liability based on the alleged inadequacy of the warning.” See also Lopez, 580 So. 2d at 865 (“where the person to whom the manufacturer owed a duty to warn . . . has not read the label [or manual], an inadequate warning cannot be the proximate cause of the plaintiff‘s injuries.“) (citing Ashby Div. of Consol. Aluminum v. Dobkin, 458 So. 2d 335, 337 (Fla. 3d DCA 1984)).

However, as Plaintiffs correctly point out, McGee‘s failure to read the warnings in the Owner‘s Manual is not the end of the inquiry because “the mere existence of warnings in an instruction manual is not dispositive of the adequacy of the warnings . . . [given that a] warning may be defective not only by virtue of inadequate wording, but as a result of its location and the manner in which the warning was conveyed.” Brown v. Glade & Grove Supply, Inc., 647 So. 2d 1033, 1035 (Fla. 4th DCA 1994) (internal quotations omitted) (citing Salozzo v. Wagner Spray Tech Corp., 578 So. 2d 393 (Fla. 3d DCA 1991)); see Veliz, 313 F. Supp. 2d at 1324. “Evaluating the adequacy of a warning requires a court to weigh (1) the dangerousness of the product; (2) the form in which the product is used; (3) the intensity and form of the warnings given; (4) the burdens to be imposed by requiring warnings; and (5) the likelihood that the particular warning will be adequately communicated to those who will foreseeably use the product.” Stanley Indus., Inc. v. W.M. Barr & Co., Inc., 784 F. Supp. 1570, 1575 (S.D. Fla. 1992).81

Here, Plaintiffs’ expert, Cummings, testified that the Model S‘s Owner‘s Manual, and the warnings therein, were difficult to access through Tesla‘s electronic format “and that there is no way to ensure drivers obtain the information [and warnings] based upon how [the Manual] is conveyed.” ECF No. [352] at 22 (citing ECF No. [347-2] at 1, ¶ 1; ECF No. [318-6] at 196:12-15, 197:3-2; ECF No. [351] at ¶¶ 128-130, 132). Therefore, even assuming the words in the warnings were adequate, Plaintiffs’ failure to warn claim necessarily raises “a jury issue as to the placement of the [w]arning[s], which in turn impacts whether the failure to read the [w]arning[s] was due to the fault of the manufacturer.” Thomas, 682 F. Supp. 2d at 1301 (concluding that where the product was designed to accommodate two riders, but the warning was placed in a location where only the driver could readily observe it, the court could not grant summary judgment); see also Stanley, 784 F. Supp. at 1574 (concluding that “[t]he Lopez court swe[pt] too broadly in its holding that failure to read the warning label extinguishes proximate cause. A court must delve into the reasons for the failure to see if they coincide with the alleged inadequacies in the warning.“). Accordingly, the Court must deny summary judgment as to the failure to warn claim given that Plaintiffs’ theory of liability is at least in part based on a theory that the warnings in the Owner‘s Manual were inadequate due to their inaccessibility.

iii. Manufacturing Defect

Tesla argues that Plaintiffs’ manufacturing defect claim should be dismissed because, in order to sustain such a claim, “Plaintiffs must demonstrate some component of the Model S was different from its intended design and failed to perform as safely as the intended design would have performed.” ECF No. [326] at 25 (citing Fla. Std. Jury Instr. (Civ.) 4303.7(a)). Since “[t]he record is silent” as to whether McGee‘s Autopilot was manufactured as intended, Tesla contends summary judgment is appropriate on Count III. Id.

Plaintiffs do not dispute Tesla‘s argument that they have failed to offer evidence which suggests that “some component of the Model S was different from its intended design.” ECF No. [352] at 19 (quoting ECF No. [326] at 25). Nevertheless, “Plaintiffs urge the Court to reserve judgment on [the manufacturing defect issue] in light of the complex and cutting-edge nature of the technology at issue in this case.” Id. According to Plaintiffs, software development mistakes involving artificial intelligence could be considered a manufacturing defect rather than a design defect, and since experts have not yet figured out how to classify such defects, their claim should survive summary judgment.

The Court finds that as a matter of law, Plaintiffs have failed to establish a manufacturing defect. “To prove a manufacturing defect claim under Florida law, a plaintiff must prove that 1) the product was defective, 2) the defect existed at the time the product left the defendant-manufacturer‘s control, and 3) the defect proximately caused the plaintiff‘s injuries.” Salinero v. Johnson & Johnson, 400 F. Supp. 3d 1334, 1343-44 (S.D. Fla. 2019) (citing Wolicki-Gables v. Arrow Int‘l, Inc., 641 F. Supp. 2d 1270, 1285 (M.D. Fla. 2009); Colville v. Pharmacia & Upjohn Co., 565 F. Supp. 2d 1314, 1320 (N.D. Fla. 2008)). Unlike design defects, manufacturing defects are “limited to situations where something goes wrong in the manufacturing process” for the specific product at issue in the case, as opposed to a defect that occurs “throughout an entire line of products.” Benitez v. Synthes, Inc., 199 F. Supp. 2d 1339, 1344 (M.D. Fla. 2002) (citing Cassisi v. Maytag Co., 396 So. 2d 1140, 1145 (Fla. 1st DCA 1981)); Harduvel v. General Dynamics Corp., 878 F.2d 1311, 1317 (11th Cir. 1989) (internal citation omitted). Here, Plaintiffs point to no evidence or testimony in the record that would indicate the Autopilot system in McGee‘s 2019 Tesla Model S deviated from Tesla‘s intended design. Plaintiffs’ specious argument that the Court should reserve judgment because AI technology is cutting-edge and complex does not alleviate Plaintiffs’ burden to prove to the jury that there was a defect in the production of McGee‘s Autopilot system. Therefore, because there is no evidence Plaintiffs can point to that would support even the inference that their injuries were the result of a manufacturing defect, summary judgment is granted on Count III.

iv. Negligent Misrepresentation

Tesla contends that Plaintiffs cannot establish a negligent misrepresentation claim because “Plaintiffs must prove that an injury resulted from the plaintiff acting in justifiable reliance upon the misrepresentation.” ECF No. [326] at 29 (emphasis removed). Since there is no evidence that Tesla made any representations to Plaintiffs, let alone that Plaintiffs relied on those representations to their detriment, there is no basis for a negligent misrepresentation claim. See id. Moreover, to the extent Plaintiffs “attempt to step into the shoes of McGee and assert claims based on representations to McGee,” Tesla maintains those arguments should be rejected as well. See id.

Plaintiffs argue that there is no “Florida state caselaw that specifically answers the question presented by this claim—can only those who rely on the misrepresentation [ ] suffer a legally cognizable injury” for the purposes of a negligent misrepresentation claim. ECF No. [352] at 23. Therefore, because there is no binding case law on point, Plaintiffs contend that the Court should turn to federal law and persuasive authority from other jurisdictions. Id. In search of persuasive authority, Plaintiffs direct the Court to the United States Supreme Court decision in Bridge v. Phoenix Bond & Indem. Co., 553 U.S. 639 (2008), a RICO case involving mail fraud. In Bridge, the defendants argued that proximate cause may only be established for a fraud claim where the plaintiff relies on the fraudulent misrepresentation. Id. at 24. However, the Supreme Court rejected that argument, concluding that while first-party reliance is an element of a common law fraud claim, there is no general common law principle that limits the injured parties to those who relied on the misrepresentation. See id. Therefore, according to Plaintiffs, so long as someone detrimentally relied on the misrepresentation, in this case, McGee, then Plaintiffs are permitted to assert a negligent misrepresentation claim because they were injured by McGee‘s detrimental reliance.

Plaintiffs contend that the general principles outlined in Bridge were not unique to that case either. Plaintiffs argue that “[s]everal jurisdictions have recognized a cause of action for negligent misrepresentation that causes harm to third parties.” Id. at 25.82 According to Plaintiffs, “Florida law protects not just purchasers of defective products, but bystanders who are injured by [defective products].” Id. at 27. Accordingly, since there is no Florida case law that expressly forecloses the possibility of a plaintiff bringing an action based on an injury he or she suffered as a result of a misrepresentation the defendant made to a third party, Plaintiffs’ claims should survive summary judgment. Id.

Plaintiffs’ negligent misrepresentation claim fails as a matter of law. In order to establish negligent misrepresentation under Florida law, a plaintiff “is required to prove (1) a misrepresentation of material fact that the defendant believed to be true but which was in fact false; (2) that defendant should have known the representation was false; (3) the defendant intended to induce the plaintiff to rely on the misrepresentation; and (4) the plaintiff acted in justifiable reliance upon the misrepresentation, resulting in injury.” Arlington Pebble Creek, LLC v. Campus Edge Condo. Ass‘n, Inc., 232 So. 3d 502, 505-06 (Fla. 1st DCA 2017) (emphasis added) (citing Specialty Marine & Industrial Supplies, Inc. v. Venus, 66 So. 3d 306, 309 (Fla. 1st DCA 2011)); Romo v. Amedex Ins. Co., 930 So. 2d 643, 653 (Fla. 3d DCA 2006) (“In order to allege a viable cause of action for negligent misrepresentation a plaintiff must allege in his complaint that: (1) the defendant made a misrepresentation of material fact that he believed to be true but which was in fact false; (2) the defendant was negligent in making the statement because he should have known the representation was false; (3) the defendant intended to induce the plaintiff to rely . . . on the misrepresentation; and (4) injury resulted to the plaintiff acting in justifiable reliance upon the misrepresentation.”) (quoting Simon v. Celebration Co., 883 So. 2d 826, 831 (Fla. 5th DCA 2004)) (internal citations omitted and emphasis added). Despite Plaintiffs’ insistence to the contrary, the case law in Florida is clear: absent evidence the plaintiff relied on the defendant‘s alleged misrepresentation to their detriment, there can be no negligent misrepresentation claim.83 See Oakwood Ins. Co. v. N. Am. Risk Servs., Inc., No. 618CV437ORL31KRS, 2018 WL 3381284, at *2 (M.D. Fla.
July 11, 2018) (concluding that for a negligent misrepresentation claim, “injury to the plaintiff as a result of acting in justifiable reliance on the misrepresentation” is necessary) (citing Postel Indus., Inc. v. Abrams Grp. Const., L.L.C., No. 6:11-cv-1179-ORL-28, 2012 WL 4194660, at *2 (M.D. Fla. Sept. 19, 2012)); Atl. Nat. Bank of Fla. v. Vest, 480 So. 2d 1328, 1331-32 (Fla. 2d DCA 1985) (explaining for a negligent misrepresentation claim, an “injury must result to the party acting in justifiable reliance on the misrepresentation.“); Souran v. Travelers Ins. Co., 982 F.2d 1497, 1505 (11th Cir. 1993) (“The fourth element of negligent misrepresentation requires that [the plaintiff] justifiably relied on [the defendant‘s] representation.“).

While Plaintiffs ask the Court to rely on the United States Supreme Court‘s reasoning in Bridge to preserve their negligent misrepresentation claim, the Court‘s reasoning in that case is inapplicable, as the Court was considering the common law doctrine of fraudulent misrepresentation, not negligent misrepresentation. See Bridge, 553 U.S. at 656. And as the Florida Supreme Court has explained, the scope of liability for fraudulent misrepresentation reaches farther than for a negligent misrepresentation claim, because a negligent misrepresenter is less culpable than a fraudulent one. See Gilchrist Timber Co. v. ITT Rayonier, Inc., 696 So. 2d 334, 337-38 (Fla. 1997). Moreover, unlike fraudulent misrepresentation, the “ordinary rules of negligence liability apply” to negligent misrepresentation claims. Id. Accordingly, the plaintiff‘s detrimental reliance is necessary to a negligent misrepresentation claim because a defendant may only be liable for negligence, and by extension, negligent misrepresentation, if the defendant owes a duty to the plaintiff. See In re Palm Beach Fin. Partners, L.P., 517 B.R. 310, 341 (Bankr. S.D. Fla. 2013) (“Essentially, the elements of negligent misrepresentation and the elements of fraudulent misrepresentation are identical, except that [for negligent misrepresentation] the defendant need not know the statement is false and that the defendant and the plaintiff must have some kind of relationship such that the defendant owes a duty to the plaintiff to communicate accurate information.“) (emphasis added); see also Grieco, 344 So. 3d at 25 (noting “a duty of reasonable care is not owed to the world at large but arises out of a relationship between the parties“).84

Therefore, even assuming McGee detrimentally relied on Tesla‘s misrepresentation, there is no record evidence that Tesla‘s alleged misrepresentation was conveyed to Plaintiffs or that Plaintiffs otherwise relied on any misrepresentation such that Tesla would owe a duty to Plaintiffs. See Levine v. Wyeth Inc., 684 F. Supp. 2d 1338, 1347 (M.D. Fla. 2010) (“To succeed in a claim for negligent misrepresentation, the plaintiff must show the defendant owed it a duty of care.”) (citing ZP No. 54 Ltd. P‘ship v. Fidelity and Deposit Co. of Md., 917 So. 2d 368, 374 (Fla. 5th DCA 2005));85 see also Antezana v. Kimley-Horn & Associates, Inc., No. 4D2024-0486, 2025 WL 1318577, at *7 (Fla. 4th DCA May 7, 2025) (“In a negligent misrepresentation claim, the plaintiff must allege that he suffered an injury when ‘acting in justifiable reliance upon the misrepresentation.’”) (quoting Fla. Women‘s Med. Clinic, Inc. v. Sultan, 656 So. 2d 931, 933 (Fla. 4th DCA 1995)) (additional level of citation and quotation omitted). Accordingly, Plaintiffs’ negligent misrepresentation claim must be dismissed.

v. Punitive Damages

Lastly, Tesla seeks to dismiss Plaintiffs’ request for punitive damages. See ECF No. [326] at 29. According to Tesla, Plaintiffs must overcome a significant burden to prove they are entitled to punitive damages. Tesla contends that under Florida law, Plaintiffs must show that the defendant was either grossly negligent or engaged in intentional misconduct, which in the automobile product liability context, requires a showing that “the defendant ‘exhibited a reckless disregard for human life equivalent to manslaughter.’” See id. at 31 (quoting Chrysler Corp. v. Wolmer, 499 So. 2d 823, 825 (Fla. 1986)). This generally means that punitive damages are “inappropriate where the defendants’ actions were consistent with government standards.” Id.

In light of this burdensome standard, Tesla contends there is insufficient evidence to allow the issue of punitive damages to go to the jury. According to Tesla, the only evidence that would support punitive damages is the December 2023 recall; however, Tesla argues the recall is inadmissible.86

Plaintiffs respond that they have provided sufficient evidence to satisfy the punitive damages standard because Plaintiffs can prove that Tesla had “knowledge that its product is inherently dangerous to persons or property and that its continued use is likely to cause injury or death, but nevertheless continues to market the product without making feasible modifications to eliminate the danger or making adequate disclosure and warning of such danger.” ECF No. [352] at 28 (quoting Johns-Manville Sales Corp. v. Janssens, 463 So. 2d 242 (Fla. 1st DCA 1984)).

Plaintiffs contend that they rely upon numerous other pieces of evidence besides the 2023 recall to show that Tesla had actual knowledge “as early as 2016 that drivers were misusing Autopilot, yet failed to take steps to ameliorate.” Id. at 28-29. Plaintiffs argue the “Brown accident” first placed Tesla on notice that driver inattention was leading to accidents because following the investigation of the accident, the NTSB “warned Tesla that its product should not have been allowed to be used outside of its ODD.” Id. at 30. Despite being on notice of the dangers, Plaintiffs contend that Tesla intentionally avoided fixing the problems until it was forced to do so in 2023 so that its vehicles could collect more data to improve its product and ultimately increase profits. See id.

Plaintiffs further argue that the evidence shows Tesla knowingly misrepresented Autopilot‘s capabilities and produced a “Vehicle safety report on its website that in no way represented the true rates of accidents” that Tesla Autopilot users experienced relative to standard automobile drivers. Id. at 31. Additionally, Plaintiffs maintain that evidence that Tesla prevented Florida Highway Patrol from obtaining the Autopilot data in this case also supports the presentation of punitive damages here. See id.

Florida Statute § 768.72 provides in relevant part:

In any civil action, no claim for punitive damages shall be permitted unless there is a reasonable showing by evidence in the record or proffered by the claimant which would provide a reasonable basis for recovery of such damages.

. . .

[At trial, a] defendant may be held liable for punitive damages only if the trier of fact, based on clear and convincing evidence, finds that the defendant was personally guilty of intentional misconduct or gross negligence.

Under the statute, intentional misconduct requires that “the defendant had actual knowledge of the wrongfulness of the conduct and [a] high probability that injury or damage to the claimant would result and, despite that knowledge, intentionally pursued that course of conduct, resulting in injury or damage.” Id. To establish gross negligence, the plaintiff must show “the defendant‘s conduct was so reckless or wanting in care that it constituted a conscious disregard or indifference to the life, safety, or rights of persons exposed to such conduct.” Id. The reason for these high standards is that “the purpose of punitive damages is not to further compensate the plaintiff, but to punish the defendant for its wrongful conduct and deter similar misconduct by it and other actors in the future.” Owens-Corning Fiberglas Corp. v. Ballard, 749 So. 2d 483, 486 (Fla. 1999). Accordingly, the Florida Supreme Court has made it clear that in automobile products liability cases, the standard is “whether the car manufacturer exhibited a reckless disregard for human life equivalent to manslaughter by designing and marketing the vehicle.” Tesla, Inc. v. Banner, No. 4D2023-3034, 2025 WL 610816, at *3 (Fla. 4th DCA Feb. 26, 2025) (quoting Carraway v. Revell, 116 So. 2d 16, 20 (Fla. 1959)).

Here, the record reflects sufficient evidence, even ignoring the 2023 Recall, that would allow Plaintiffs to pursue punitive damages in this case. As Cummings explained in her report, as early as 2016, NTSB recommended in its crash investigation report that “Tesla incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed (H-17-41).” ECF No. [350-1] at 6. Plaintiffs also point to numerous public statements Tesla made which arguably misrepresented the risks and limitations of the Autopilot system and improperly suggested that issues present in the crash were resolved. ECF No. [351] at 14, 23-26. Therefore, because Tesla refused to geo-fence the Autopilot system despite being warned of the dangers and despite record evidence indicating that Tesla was capable of curing the issue,87 a reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing its product and maximizing profit. See Gen. Motors Corp. v. McGee, 837 So. 2d 1010, 1035 (Fla. 4th DCA 2002); Sims v. BMW of N. Am. LLC, No. 6:22-cv-1685-PGB-UAM, 2025 WL 724047, at *2 (M.D. Fla. Mar. 5, 2025) (citing Bridgestone/Firestone, Inc., 891 So. 2d 1188, 1191-92 (Fla. 4th DCA 2005) (holding actual knowledge of a danger followed by a failure to warn of that danger supports punitive damages)).

Tesla insists that the recent state appellate court decision in Tesla, Inc. v. Banner, No. 4D2023-3034, 2025 WL 610816, at *4 (Fla. 4th DCA Feb. 26, 2025), forecloses punitive damages in this case. In Banner, the Fourth District Court of Appeal reversed the trial court‘s decision to allow the plaintiff to amend the complaint to include a claim for punitive damages against Tesla because it knew of but failed to cure certain alleged defects in the Autopilot and Automatic Emergency Braking system. See id. at *1. The Banner court held that “the record d[id] not support a finding that Tesla knew, or reasonably should have known, that its SAE Level 2 driving assistance features were likely to cause death or great bodily injury.” Id. at *4. Instead, the court found that the “evidence indicate[d] Tesla‘s Autopilot features were ‘state-of-the-art’ and complied with all industry and regulatory standards.” Id. Consequently, the court ultimately concluded that Tesla could not be subject to punitive damages “for failing to provide technology that it did not advertise and that did not exist.” Id.

The Court need not accept the Banner court‘s conclusion that “Tesla‘s Autopilot features were ‘state-of-the-art,’” rendering punitive damages unavailable.88 First, the Fourth District Court of Appeal addressed the propriety of the plaintiff‘s request for punitive damages at the pleading stage as opposed to the summary judgment stage. See Banner, 2025 WL 610816, at *1. Naturally, the record was not as developed as it is here. Second, although the issues may have been similar, unlike here, there appears to have been no record evidence indicating Tesla‘s knowledge of the known dangers, other manufacturers’ demonstrated ability to avoid the purported dangers, and Tesla‘s own ability to have cured the alleged defect. See ECF No. [407]. Given this additional evidence, the Court finds it appropriate to permit Plaintiffs to pursue punitive damages at trial.

IV. CONCLUSION

Accordingly, it is ORDERED AND ADJUDGED that

  1. Tesla‘s Motion for Summary Judgment, ECF No. [326], is GRANTED in part and DENIED in part as follows:
    1. GRANTED as to Plaintiffs’ Defective Manufacture Claim (Count III) and Negligent Misrepresentation Claim (Count IV).
    2. DENIED as to Plaintiffs’ Strict Products Liability—Defective Design Claim (Count I) and Failure to Warn Claim (Count II).
  2. Tesla‘s Motion to Exclude Plaintiffs’ Experts Moore, Cummings, and Pettingill, ECF No. [318], is GRANTED in part and DENIED in part as follows:
    1. Dr. Bernard F. Pettingill‘s expert opinion and testimony is EXCLUDED.89
    2. Alan Moore‘s opinion and testimony regarding: (i) Tesla‘s insufficient training and (ii) the design defect in providing a Beta Software, is EXCLUDED.
    3. All other expert opinions and testimony provided by Plaintiffs’ experts are admissible under Daubert.
  3. Plaintiffs’ Motion to Exclude Expert Testing and Testimony of Tesla Expert Ryan Harrington, ECF No. [322], is GRANTED in part and DENIED in part as follows:
    1. The first phase of Harrington‘s crash testing—the Daylight Nominal FCW and AEB Demonstrations—as well as the accompanying testimony and opinions, are EXCLUDED.
    2. The second and third phase of Harrington‘s crash testing—the Nighttime FSC Demonstration with Broadside Targets and the Nighttime Collision Scenario Demonstration—along with the accompanying expert testimony, are admissible.

DONE AND ORDERED in Chambers at Miami, Florida, on June 26, 2025.

Image in original document

BETH BLOOM

UNITED STATES DISTRICT JUDGE

cc: counsel of record

Notes

1
In addition to the Motion for Summary Judgment, Tesla filed an accompanying Statement of Undisputed Facts, see ECF No. [325], to which Plaintiffs filed a Counter Statement and a Statement of Additional Material Facts. ECF No. [351]. Tesla also filed a Reply Statement of Material Facts in Support of its Motion for Summary Judgment. ECF No. [379].
2
Plaintiffs originally filed a Response to Tesla‘s Motion for Summary Judgment at ECF No. [350]; however, they subsequently filed a Corrected Response at ECF No. [352]. Accordingly, when the Court refers to Plaintiffs’ Response to Tesla‘s Motion for Summary Judgment, the Court is referring to the Corrected Response filed at ECF No. [352].
3
The Court will resolve the Parties’ respective Motions in Limine, including Plaintiff Angulo‘s Motion to Limit and Exclude Certain Opinions and Testimony of Dr. Barry Crown, in a separate order. See ECF Nos. [320], [329], and [344].
4
Similar to the cruise control in an ordinary vehicle, once the driver releases the accelerator, the TACC cruise control function resumes at the set speed. See ECF No. [351] at ¶ 3 (citing ECF No. [318-10] at 3 1878).
5
However, the Parties do not dispute that approximately 1.65 seconds before the collision, “Autopilot aborted with an accompanying visual alert.” ECF No. [358] at ¶ 159. There was no cause given by the vehicle for the abort. Id.
6
The Parties do not dispute that the Vehicle‘s “Autopilot remains engaged even when the accelerator pedal is depressed by 20%” or that “McGee did not depress the accelerator by more than 20% prior to the accident.” ECF No. [351] at ¶¶ 123-24; ECF No. [379] at ¶¶ 123-24. Therefore, there is no dispute that the Autopilot was still engaged, to some degree, as the Vehicle approached the intersection. There is also no dispute that McGee accelerated the vehicle to 62 miles per hour. Compare ECF No. [325] at ¶ 3 with ECF No. [351] at ¶ 3. Therefore, there is no material dispute that McGee temporarily overrode the TACC‘s 45 mph speed restriction in the 30 seconds leading up to the collision. The only dispute is whether accelerating the Vehicle past the TACC max speed temporarily disabled the Autopilot‘s longitudinal control function and the automatic emergency brake function. See ECF No. [379] at ¶ 126; see also ECF No. [351] at ¶ 3.
7
“The Vehicle detected the end of drivable space prior to the subject collision.” ECF No. [351] at ¶ 126.
8
Although both the Vehicle‘s Automatic Emergency Brake (“AEB“) system and the Forward Collision Warning (“FCW“) system are capable of operating at speeds up to 90 mph, neither system activated prior to the collision. ECF No. [358] at ¶ 159.
9
McGee also told the officers he “was driving on cruise going for – and [he] looked down [ ] – to get the phone [he] dropped ... [then he] reached down ... And then when [he] popped up and looked, [he] saw a black truck. It just happened so fast.” ECF No. [325] at ¶ 9.
10
While Tesla contends that the term “Autopilot” also includes “Forward Collision Warning (FCW) and [ ] Automatic Emergency Braking (AEB),” ECF No. [325] at ¶ 13, Plaintiffs disagree, arguing that “Autopilot” only includes TACC and Autosteer. See ECF No. [351]. However, Plaintiffs do not dispute that McGee‘s Vehicle was equipped with FCW and AEB systems. Id.
11
“The Vehicle‘s Owner Manual states that FCW monitors are designed to monitor up to 525 feet in front of the vehicle.” ECF No. [351] at ¶ 127.
12
The Parties also do not dispute that McGee purchased an Autopilot package that lacked the hardware compatible with future software development that would have enabled the Vehicle to eventually become self-driving. ECF No. [325] at ¶ 29.
13
On the day of the subject collision, McGee received a visual alert at 12:10.314 that the TACC brake system would not activate because he was exceeding the TACC speed restriction. See ECF No. [351] at ¶ 158. “Between the time of the alert about the lack of braking action at 12:10.314 and the accident at 13:16.427, McGee received no other alerts[,] and the car continued to steer while Mr. McGee pushed the accelerator pedal, as designed.” Id.
14
The Parties dispute whether McGee read the Owner‘s Manual. ECF No. [325] at ¶¶ 23-24; ECF No. [351] at ¶¶ 23-24.
15
Plaintiffs contend the Owner‘s Manual further provides that Autopilot “(1) is primarily intended for driving on dry, straight roads, such as highways and freeways; (2) should not be used on city streets; [and] (3) should not be used on winding roads with sharp curves, on icy or slippery road surfaces[.]” ECF No. [351] at ¶ 150.
16
“Segments of the commercial were staged[,] and one Tesla crashed in the making of th[e] video.” ECF No. [351] at ¶ 110.
17
The “guidance issued by the Society of Automotive Engineers (‘SAE‘) in place at the time of the Model S’ manufacture dictated [that] the driver of a vehicle equipped with a Level 2 system is expected to monitor the roadway and respond to hazards[,]” and is responsible for determining “when to use what features.” ECF No. [325] at ¶¶ 71, 73.
18
See also ECF No. [325] at ¶ 74 (Plaintiffs’ expert explaining “that drivers are responsible to make decisions consistent with their obligation to drive safely on the roadways even when ADAS features are engaged“).
19
The National Highway Traffic Safety Administration has identified “cell phone use as a source of distraction for all drivers regardless of ADAS technology.” ECF No. [325] at ¶ 60.
20
The Parties dispute whether drivers who operate vehicles with Autopilot engage in such behavior more frequently than drivers without the Autopilot function. See ECF No. [351] at ¶ 62.
21
Plaintiffs’ expert, Dr. Mary Cummings, also admitted during her deposition that there was no way she could know for certain that “if Autopilot was not available to Mr. McGee in his Model S on the date of the accident, Mr. McGee would not have used his cell phone in his hand while approaching the intersection of Card Sound Road and 905.” ECF No. [318-6] at 245:9-15. She conceded such an opinion would be “speculative.” Id. Cummings also conceded it was “possible” [McGee] would have still been “distracted” if Autopilot was not in the Vehicle. Id. at 245:24-25; see also ECF No. [325] at ¶¶ 63-65.
22
Alan Moore testified about the impacts of disengaging the Autopilot, and Dr. Mary Cummings testified that McGee‘s complacency and confusion about Autopilot resulted in the crash. See ECF No. [318-2] at 53:19-54:4; and ECF No. [318-6] at 42:19-43:1, 53:6-8.
23
Plaintiffs contend that “[a]n ODD is the ‘operating conditions under which a given driving automation system or feature thereof is specifically designed to function, including, but not limited to, environmental, geographical, and time-of-day restrictions, and/or the requisite presence or absence of certain traffic roadway characteristics.‘” ECF No. [351] at ¶ 113 (quoting ECF No. [318-1] at 4; ECF No. [318-2] at 46:5-9). Tesla argues that while this definition is accurate, it is incomplete. According to Tesla, Plaintiffs’ definition fails to acknowledge that in a Level 2 ADAS system, unlike a Level 3 system, “‘the driver determines whether/when engagement and disengagement of the driving automation system is appropriate.‘” ECF No. [379] at ¶ 113 (quoting ECF No. [318-1] at 4). Moreover, while “a given driving automation system feature may ha[ve] only one ODD, that ODD may be quite varied and multi-faceted.” Id. (internal quotations omitted).
24
While Cummings “opined that Tesla‘s Recall 23-838 ‘recall remedy’ would ‘probably’ have prevented the accident because McGee would ‘probably’ have been suspended from using Autopilot on the day of the crash,” Moore did not rely on the recall to give his opinion. ECF No. [325] at ¶¶ 67, 82 (quoting ECF No. [318-6] at 244:22-245:16 and citing ECF No. [318-2] at 146:1-148:20).
25
The chair of NTSB, Robert Sumwalt, III, also acknowledged “that highly automated systems can lead to operator or driver complacency due to the perception of high reliability.” ECF No. [351] at ¶ 86.
26
See ECF No. [379] at ¶ 85 (citing ECF No. [380-2] (6.22.20 NHTSA Response Letter to NTSB at 2)).
27
“Elon Musk [also] admitted that Tesla‘s customers are providing the data that Tesla used to improve its Autopilot software.” ECF No. [351] at ¶ 171.
28
Tesla also “posted a Vehicle Safety Report (‘VSR‘) on its website that purported to show that its cars using Autopilot were far safer than cars operating without Autopilot.” ECF No. [351] at ¶ 200.
29
Musk specifically stated:
30
As will be discussed below, Tesla further argues that even if the warnings were inadequate, McGee‘s failure to read the warnings defeats Plaintiffs’ failure to warn claim.
31
In its Daubert Motion, Tesla frequently conflates its causation and defect arguments. Therefore, the Court addresses whether the opinions as to each issue adequately support proximate cause.
32
The warnings included a large red image on the screen directing the driver to take over immediately, along with a loud, audible, escalating chime. ECF No. [318] at 14.
33
A strikeout disables Autopilot for the length of the drive cycle and will occur after three warnings to the driver. ECF No. [318-1] at 8 (citing ECF No. [285-4] at 171).
34
When the Court refers to McGee‘s “commute,” it is referring to McGee‘s regular 100-mile trip two to three times a week from Boca Raton, Florida, to Ocean Reef, Florida. See ECF No. [318-1] at 7.
35
According to the log data, “84-91% of [McGee‘s] drive time was on Autopilot, with the average reset time following a strikeout being 2.5 minutes.” ECF No. [318-1] at 11.
36
As noted earlier, Moore is a “Board Certified Forensic Engineer.” ECF No. [318-3] at 250.
37
Unlike his opinion on inadequate training, Moore explains how his interviews with other Tesla drivers supports his conclusion that a more punitive disablement policy would improve Autopilot user engagement and attentiveness.
38
The Court agrees that, while Tesla contends there is no evidence that a seven-day suspension policy would alter behavior, that argument is directly undermined by the fact that Tesla has since implemented such a policy. Tesla would be hard-pressed to argue that it implemented such a policy without any consideration of the safety benefit or that it found no evidence such a policy would have an impact on driver behavior.
39
The Court also disagrees with Tesla that Moore‘s opinions about the Autopilot‘s DMS are inherently inconsistent. See ECF No. [318] at 15. Either McGee would be deterred by the increased suspension period and improve his driving behavior, or he would continue to misuse and abuse Autopilot, significantly increasing the likelihood that Autopilot would have been disabled on the day of the collision.
40
The Operational Design Domain, or ODD, is defined as “operating conditions under which a given driving automation system or feature thereof is specifically designed to function, including, but not limited to, environmental, geographical, and time-of-day restrictions, and/or the requisite presence or absence of certain traffic or roadway characteristics.” ECF No. [318-1] at 5.
41
Moore noted that Card Sound Road has several features that placed it outside of Tesla‘s ODD at the time of the collision:

Not a limited access road[,] Contains winding roads with sharp curves[,] Sight distance limitations[,] Cross traffic[,] Vehicles stopped on road edge[,] Occasional water over roadway[,] Presence of bicyclist and pedestrians[,] No center divider[,] Hills (at overpasses)[,] Toll Booth[,] Residence along roadway.

ECF No. [318-1] at 10.
42
Tesla disputes that the other manufacturers’ designs undermine its contention that Autopilot was state of the art because, unlike Tesla cars, those vehicles used hands-off technology, which requires more safety measures. See ECF No. [318] at 27. However, this distinction goes to the weight of the evidence, not to its admissibility. For the Court to conclude that the other manufacturers’ ADAS and geofencing technology are not relevant because they involve different technology would result in the Court reaching beyond its traditional gatekeeping function.
43
Autopilot not only had the ability to determine the Vehicle‘s location, but the Vehicle‘s GPS system also could “determine road class, road curvature, lanes and exits,” “the speed limit of the current road,” and “restrict speed” based on the road class and other relevant factors. ECF No. [318-1] at 8.
44
Prior to 2019, NTSB recommended that all manufacturers of cars with Level 2 automated systems “incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.” ECF No. [318-5] at 43.
45
The Court further notes that despite Tesla‘s argument to the contrary, the SAE report in effect at the time of the collision indicated that “only Level 5 operation is possible without ODD limitation.” ECF No. [318-1] at 5-6. Therefore, Tesla‘s argument regarding the reliability of Moore‘s ODD opinion is unpersuasive.
46
While Tesla‘s argument could theoretically undermine Moore‘s opinion to the extent he relies on the failure to warn and brake to assert a manufacturing defect, the Court need not consider the issue since the Court finds that Moore has relied on reasonable methods to conclude there was a design defect.
47
Because Moore‘s TACC opinion supports a claim of defective design, it is ultimately not material for the purpose of the Court‘s Daubert analysis that Moore admitted the reason there was not an “Automatic Emergency Braking or Forward Collision Warning event” was likely because of “the target vehicle orientation, visual noise, lighting, lack of free space and lane information, and Tesla vehicle speed.” ECF No. [318-1] at 11.
48
Tesla further argues that there is insufficient record evidence and data for Moore to opine that McGee‘s self-education was inadequate. ECF No. [318] at 17. McGee could not point to which YouTube videos he watched or other resources he used to train himself, and therefore, Tesla argues that Moore‘s opinion that such training was insufficient is pure guesswork. See id. at 17-18.
49
Plaintiffs attempt to save Moore‘s Beta opinion by asserting that Beta means the Autopilot system “was not fully tested for safety, and, further, the system was not designed to be used on roadways with cross-traffic or intersections”; however, Plaintiffs’ assertion quotes from Plaintiffs’ Complaint, not Moore‘s report or deposition. See ECF No. [347] at 23 (quoting ECF No. [205] at ¶ 19). Moore never opined as to what it meant for Tesla‘s Autopilot to be operating in “Beta.” Instead, Moore simply concluded that the “opinion speaks for itself.” Id.
50
While the term “beta” has multiple meanings, Merriam-Webster‘s Dictionary defines the relevant use of the word beta to mean “a nearly complete prototype of a product (such as software)” or “a stage of development in which a product is nearly complete but not yet ready for release.” Merriam-Webster.com Dictionary, s.v. “beta,” https://www.merriam-webster.com/dictionary/beta (last visited June 14, 2025). However, Moore does not identify any record evidence regarding how Tesla used the term.
51
However, the exclusion of Moore‘s testimony regarding the use of beta software does not foreclose Cummings from testifying on the issue. While both Moore and Cummings discuss Tesla‘s use of Beta software in their reports, see ECF No. [318-1] at 16; ECF No. [350-1] at 8-9, Tesla only challenges Moore‘s testimony on this issue, not Cummings’ opinion. See generally ECF No. [318].
52
Although the Garrison and Elliot cases apply Alabama law, the principles asserted are consistent with Florida law. See Alderman v. Wysong & Miles Co., 486 So. 2d 673, 679 (Fla. 1st DCA 1986) (citing Clement v. Rousselle Corporation, 372 So. 2d 1156 (Fla. 1st DCA 1979)).
53
Cummings found the following representations by Tesla relevant: (1) “In April 2017, during a TED talk, Musk reportedly said that in about two years, Tesla owners could sleep while their cars drove themselves;” (2) “In 2016, Tesla aired a commercial commonly referred to as the ‘Paint It Black’ commercial where the statement was made that ‘The person in the driver‘s seat is only there for legal reasons. He is not doing anything. The car is driving itself;’” and (3) “Tesla publicly asserted many times prior to the McGee crash that Autopilot makes drivers safer.” ECF No. [350-1] at 10.
54
Although Tesla does not explicitly challenge the helpfulness of Cummings’ DMS opinion, for the sake of completeness, the Court addresses the helpfulness of that opinion as well. Tesla suggests that Cummings’ DMS defect opinion is not relevant because McGee had his hands on the wheel at the time of the collision. ECF No. [318] at 26. However, as Plaintiffs aptly point out, the fact that McGee had his hands on the wheel but was still not engaged is precisely the problem. Cummings’ opinion is that a facial recognition system would have detected McGee‘s lack of engagement more quickly and effectively. See ECF No. [347] at 28-29. Accordingly, her testimony is highly relevant and will help the jury better understand whether Tesla‘s torque monitoring DMS was defective.
55
While the Court finds Cummings’ opinions admissible, the Court will preclude Cummings from testifying about the specific statements she made in her deposition that are included in footnote 19 of ECF No. [318]. The Court finds those statements inflammatory, unduly prejudicial, and unhelpful to the jury.
56
“Cummings is an autonomous systems and artificial intelligence safety expert,” and “[h]er opinions in this case principally focus on the overall defects in the Vehicle‘s design, as informed by her long experience as a safety expert.” ECF No. [347] at 32. In contrast, Moore is an “accident reconstruction specialist, board-certified forensic engineer, and a university lecturer on reconstructing accidents involving ADAS technology.” Id. According to Plaintiffs, Moore‘s methods were based on a detailed analysis of the log data in the Vehicle and, therefore, are fundamentally different than those of Cummings. See id.
57
The risk of cumulative expert testimony is that a jury might decide to “resolve differences in expert opinion by ‘counting heads’ instead of by giving fair consideration to the quality and credibility of each expert‘s opinions.” Royal Bahamian Ass‘n, Inc., 2010 WL 4225947, at *2.
58
In the Court‘s hypothetical, the jury could find each expert‘s testimony useful as the quarterback might offer testimony as to the practical and intuitive reasons a particular method of throwing is best, while the physicist could offer his scientific and theoretical opinion on the issue. And in a situation where their testimony aligned, the jury would have much more compelling evidence that the proposed method of throwing was the best than if only one of the experts was permitted to testify on the issue.
59
Plaintiff Angulo has since withdrawn Dr. Bernard F. Pettingill as an expert witness in this case. ECF No. [416]. Therefore, the Court need not consider the admissibility of Pettingill‘s expert opinions and testimony.
60
While in a diversity case the federal court is to apply the federal substantially similar doctrine, the Court finds the Florida state law decisions defining the bounds of the doctrine persuasive as well. See Cerrato v. Nutribullet, LLC, No. 8:16-cv-3077, 2017 WL 3608266, at *2 (M.D. Fla. Aug. 22, 2017).
61
In McHale, the movant sought to introduce testing involving different kinds of forklift accidents to show why the forklift door design was safe. See id. at *2. The Eleventh Circuit found the testing pointedly dissimilar because it involved fundamentally different types of accidents that did not resemble the accident that took place and was not intended to prove causation. See id. In Heath, the Court found the substantially similar doctrine did not apply. See id. at 1396-97. Although the testing involved vehicles in a rollover accident, the purpose of the testing was simply to show the physics of a rollover; the testing was not intended to recreate the rollover or assist the jury in directly answering the question of causation. See id. Here, the testing involves the same general type of collision, notwithstanding that at each stage, additional similar circumstances were incorporated. Thus, Harrington‘s testing is not analogous to the testing done in McHale or Heath.
62
While the fact that the ADAS system in the test vehicles was turned off is arguably a substantial difference, the Court will address that issue later in its analysis.
63
Harrington described the noteworthy variables included in phase three as follows:

A black 2010 Chevrolet Tahoe was acquired for each test. Each exemplar Chevrolet matched the MY and paint code (U 8555) of the subject Chevrolet. The VUT was instrumented with the driving robot, GPS, IMU, data acquisition system, and cameras used in the nominal FCW and AEB demonstrations while the Chevrolet was not instrumented. The test vehicle was driven straight toward the Chevrolet with its longitudinal centerline collinear to the centerline of the inside lane of the Exponent test track. A flashing red traffic light was suspended across the travel lane at a height of approximately 19 feet from the ground and laterally offset from the centerline by 4.1 feet. Video footage recorded by the subject Tesla‘s onboard forward-facing cameras was analyzed to determine the timing of the traffic light flashes. This timing was replicated in Exponent‘s nighttime collision scenario demonstrations. A stop sign was placed 81.5 feet upstream of the traffic light and laterally offset from the centerline by 21.6 feet.

The 2010 Chevrolet Tahoe was positioned approximately in the center of the travel lane with its right rear wheel centered over the lane line. The Chevrolet‘s longitudinal axis was orientated at approximately 96 degrees relative to the lane. The Chevrolet‘s right rear wheel was positioned approximately 56 feet from the traffic light in the longitudinal direction. Five road signs were placed across the travel lane and the left adjacent lane in a row approximately 76 degrees relative to the lane. The middle sign was positioned approximately 14.8 feet in front of the Chevrolet‘s right rear wheel. The relative position and orientation of the Chevrolet Tahoe, the traffic light, and the various road signs were consistent with Mr. Walker‘s reconstruction of the pre-impact configuration of the subject crash. Testing was conducted at night as in the subject crash.

ECF No. [349-3] at 82-83.
64
On cross-examination, Plaintiffs may challenge the usefulness of Harrington‘s testing considering the identified discrepancies between the testing and the actual circumstances of the collision. See Daubert, 509 U.S. at 596 (“Vigorous cross-examination, presentation of contrary evidence, and careful instruction on the burden of proof are the traditional and appropriate means of attacking shaky but admissible evidence.“).
65
Tesla also challenges Cummings’ opinions regarding the adequacy of Tesla‘s warnings as to the limitations of the Autopilot system. The Court addresses those challenges in the failure to warn analysis.
66
According to Plaintiffs, this would only be true if McGee‘s reliance on Tesla‘s Autopilot “was so ‘highly unusual, extraordinary, or bizarre,’ that as a matter of fairness and policy[,] it should relieve Tesla of all liability without the issue ever reaching the jury.” ECF No. [352] at 11.
67
Tesla further argues that “neither Moore nor Cummings could point to a system that would have stopped the Vehicle, particularly when McGee had engaged the accelerator.” ECF No. [326] at 24. Although the Vehicle may have detected obstacles, Plaintiffs’ experts fail to provide any evidence or testimony to explain how detection would translate to action in the 2019 Model S. See id.
68
According to Plaintiffs, the reason the relationship between the three systems is opaque is “because the system is run by complex algorithms.” ECF No. [352] at 17.
69
While the Court finds this argument to be more of a proximate cause argument rather than a design defect argument, Plaintiffs also maintain that Tesla has failed to adequately prove that McGee overrode the Autopilot system by pressing the accelerator and increasing the speed to 62 mph. ECF No. [352] at 18. Because this is a material fact in dispute, Plaintiffs contend that summary judgment cannot be granted on the theory that Autopilot was not engaged and therefore could not have contributed to the collision. Id.
70
While it is true that no alteration in design could ensure that McGee would not have used his phone or reached for it in the moments prior to the collision, Plaintiffs need not disprove this possibility with certainty to survive summary judgment as the Court “must view all evidence and make all reasonable inferences in favor of the party opposing summary judgment.” Haves v. City of Miami, 52 F.3d 918, 921 (11th Cir. 1995) (citing Dibrell Bros. Int‘l, S.A. v. Banca Nazionale Del Lavoro, 38 F.3d 1571, 1578 (11th Cir. 1994)).
71
Plaintiffs also seem to argue that the warnings in the Owner‘s Manual were inadequate, given McGee‘s deposition testimony that he would likely still have expected the Vehicle to stop and prevent a collision even after reading the warnings in the Owner‘s Manual. See ECF No. [352] at 23.
72
Plaintiffs have not asserted a negligent failure to warn claim, and as such, the Court will not address that theory of liability. See generally ECF No. [205].
73
“Plaintiff must also ‘plead the content of the warning label or otherwise describe the manner in which the warning was inadequate.‘” Dye v. Covidien LP, 470 F. Supp. 3d 1329, 1338 (S.D. Fla. 2020) (quoting Tsavaris v. Pfizer, Inc., No. 1:15-cv-2182, 2016 WL 375008, at *3 (S.D. Fla. Feb. 1, 2016)).
74
The rationale is that “[w]hen a risk is obvious or generally known, the prospective addressee of a warning will or should already know of its existence[,]” and therefore, a “warning of an obvious or generally known risk in most instances will not provide an effective additional measure of safety.” Veliz, 313 F. Supp. 2d at 1323 (quoting Warren v. K Mart Corp., 765 So. 2d 235, 238 (Fla. 1st DCA 2000) (internal quotations omitted)).
75
During Cummings’ deposition, she testified that the term “Autopilot” may give the operator of the vehicle overconfidence as to the vehicle‘s capabilities, noting that “‘Autopilot’ [is] a name that is not even allowed in another set of countries.” ECF No. [318-6] at 223:15-17. Furthermore, Tesla‘s own surveys indicated a significant number of consumers mistakenly believed that the term Autopilot meant that a car could drive itself.
76
As noted above, the obvious danger inquiry is an objective analysis that does not consider “each particular plaintiff‘s subjective appreciation of the danger.” Byrnes, 887 F. Supp. at 281. Therefore, it is irrelevant whether McGee was personally aware of Tesla‘s representation or whether he personally relied on those representations. See Cates v. Zeltiq Aesthetics, Inc., 73 F.4th 1342, 1347 (11th Cir. 2023) (“To conduct this inquiry, we put ourselves in the shoes of a ‘reasonable person,’ setting aside any individual‘s ‘subjective appreciation of the danger.‘“).
77
The Court notes that Tesla does not argue that the failure to warn claims are deficient because Tesla did not owe a duty to Plaintiffs. See generally ECF No. [326]. After a review of the caselaw, it appears Tesla correctly avoided such an argument because Florida courts regularly allow plaintiffs to assert failure to warn claims even when the plaintiff is not the consumer or ultimate end user. See, e.g., Lopez v. Southern Coatings, Inc., 580 So. 2d 864, 865 (Fla. 3d DCA 1991) (plaintiff sued the paint manufacturer that supplied paint to the painters at her place of employment for failure to warn the painters that mixing the paint could cause bystanders, such as the plaintiff, respiratory issues); Hayes v. Spartan Chemical Co., Inc., 622 So. 2d 1352, 1354 (Fla. 2d DCA 1993).
78
Because of the other deficiencies the Court will discuss below, the Court need not and will not decide at this juncture whether the content of the warnings in the Owner‘s Manual is indeed adequate as a matter of law.
79
Plaintiffs do not dispute that the “Owner‘s Manual contains several pages of warnings about the capabilities and limitations of the Autopilot features.” ECF No. [325] at ¶ 26; see ECF No. [351] at ¶ 26.
80
During McGee‘s Deposition, he was asked: “The [V]ehicle has an owner‘s manual,” to which McGee responded, “Yes, sir.” ECF No. [318-9] at 36:21-36:23. He was later asked, “You testified earlier that you never looked at the manual—through the physical manual or online on the dash; is that right?” to which McGee responded “Yes, sir, I believe so.” Id. at 181:4-181:8. Given this testimony, there can be no genuine dispute that McGee did not read the Owner‘s Manual in the Vehicle.
81
Although the Court considered these factors in the context of a negligent failure to warn claim, the Court finds that these considerations equally apply to a strict liability failure to warn claim.
82
Plaintiffs cited the following cases in support: Randi W. v. Muroc Joint Unified Sch. Dist., 929 P.2d 582, 588 (Cal. 1997); Dillard v. Victoria M. Morton Enterprises, Inc., No. 08-1339 FCD/GGH, 2008 WL 11388472, at *6 (E.D. Cal. Oct. 8, 2008); In re Neurontin Mktg., Sales Practices & Products Liab. Litig., 618 F. Supp. 2d 96, 110 (D. Mass. 2009); In re Orthopedic Bone Screw Liab. Litig., 159 F.3d 817, 828-29 (3d Cir. 1998), rev‘d on other grounds, Buckman Co. v. Plaintiffs’ Legal Comm., 531 U.S. 341 (2001); Gutzan v. Altair Airlines, Inc., 766 F.2d 135, 140 (3d Cir. 1985); Golden Spread Council, Inc., v. Akins, 926 S.W.2d 287, 291 (Tex. 1996); Davis v. Bd. of Cnty. Com‘rs of Dona Ana Cnty., 987 P.2d 1172, 1178 (N.M. Ct. App. 1999); D.S.A., Inc. v. Hillsboro Ind. Sch. Dist., 973 S.W.2d 662, 664 (Tex. 1998); Brogan v. Mitchell Int‘l, Inc., 181 Ill. 2d 178, 229 Ill. Dec. 503, 692 N.E.2d 276, 278 (Ill. 1998); Sassak v. City of Park Ridge, 431 F. Supp. 2d 810, 818-19 (N.D. Ill. 2006); Bailey v. Huggins Diagnostic & Rehab. Center, Inc., 952 P.2d 768, 772 (Colo. App. 1997); State v. Purdue Pharma L.P., No. PC-2018-4555, 2019 WL 3991963, 2019 R.I. Super. LEXIS 95, at *42 (R.I. Super. Ct. Aug. 16, 2019); Univ. Sys. Of N.H. v. United States Gypsum Co., 756 F. Supp. 640, 650 (D.N.H. 1991); Bardes v. Mass. Mut. Life Ins. Co., 932 F. Supp. 2d 636, 640 (M.D.N.C. 2013). ECF No. [352] at 27.
83
Because the Court finds that Florida case law dictates the outcome of this case, the Court need not consider the persuasive authorities outlined by Plaintiffs as they are not binding on this case. Additionally, while Bridge is a United States Supreme Court case, it also does not control here as the Court was not construing Florida law and the Court sought to resolve questions regarding the requirements for mail fraud in a RICO case. See Bridge, 553 U.S. at 646 (“We granted certiorari, to resolve the conflict among the Courts of Appeals on “the substantial question” whether first-party reliance is an element of a civil RICO claim predicated on mail fraud.“) (emphasis added and internal citations omitted).
84
Other jurisdictions have similarly recognized that, at a minimum, there must be a duty owed to the plaintiff to establish a negligent misrepresentation claim. See, e.g., Mosley v. Wyeth, Inc., 719 F. Supp. 2d 1340, 1345 (S.D. Ala. 2010) (“In situations ‘involving negligent misrepresentations relied upon by third parties, or parties who were not in privity of contract with the person making the misrepresentation,’ Alabama‘s ‘Supreme Court has instructed that liability for negligent misrepresentation is predicated upon the existence of a duty.’”) (quoting Cooper v. Bristol-Myers Squibb Co., No. 07-885, 2009 WL 5206130, at *11 (D.N.J. 2009) (applying Alabama law and quoting Fisher v. Comer Plantation, Inc., 772 So. 2d 455, 461 (Ala. 2000))).
85
Moreover, as the district court explained in Pop v. Lulifama.com LLC, No. 8:22-cv-2698, 2024 WL 1194485, at *8 (M.D. Fla. Mar. 20, 2024) (quoting Hawaiian Airlines, Inc. v. AAR Aircraft Servs., Inc., 167 F. Supp. 3d 1311, 1322 (S.D. Fla. 2016)), “under Florida law, ‘the failure to disclose material information is not actionable as part of a negligent misrepresentation claim absent some fiduciary or fiduciary-like duty to disclose the information.’”
86
Tesla also attempts to argue that because Plaintiffs’ four other claims must be dismissed, their punitive damages claim must be dismissed as well, given that a punitive damages claim cannot survive as an independent cause of action. ECF No. [326] at 34-35. However, the Court need not address this argument because Plaintiffs’ first two counts have survived Tesla‘s Motion for Summary Judgment.
87
See Alan Moore‘s Expert Report, ECF No. [318-1] at 10 (“Tesla had available technology to restrict Autopilot use outside of its ODD, and on the subject road“).
88
As Plaintiffs correctly point out, the Banner decision has no greater precedential value than any other Florida appellate jurisprudence bearing on the issue.
89
As the Court noted in footnote 58, Plaintiff Angulo has withdrawn Dr. Bernard F. Pettingill as an expert witness. See ECF No. [416]. Accordingly, Pettingill‘s opinions and testimony are excluded.
