CASE NO. 4:21cv220-RH-MAF
IN THE UNITED STATES DISTRICT COURT FOR THE NORTHERN DISTRICT OF FLORIDA TALLAHASSEE DIVISION
PRELIMINARY INJUNCTION
The State of Florida has adopted legislation that imposes sweeping requirements on some but not all social-media providers. The legislation applies only to large providers, not otherwise-identical but smaller providers, and explicitly exempts providers under common ownership with any large Florida theme park. The legislation compels providers to host speech that violates their standards—speech they otherwise would not host—and forbids providers from speaking as they otherwise would. The Governor’s signing statement and numerous remarks of legislators show rather clearly that the legislation is viewpoint-based. And parts contravene a federal statute. This order preliminarily enjoins enforcement of the parts of the legislation that are preempted or violate the First Amendment.
I. The Lawsuit
The plaintiffs are NetChoice, LLC and Computer & Communications Industry Association. Both are trade associations whose members include social-media providers subject to the legislation at issue. The plaintiffs assert the rights of their affected members and have standing to do so. See, e.g., Hunt v. Wash. State Apple Advert. Comm’n, 432 U.S. 333, 342-43 (1977).
The defendants are the Attorney General of Florida, the members of the Florida Elections Commission, and a Deputy Secretary of the Florida Department of Management Services, all in their official capacities. The plaintiffs named the Deputy Secretary because the Secretary’s position was vacant. Each of the defendants has a role in enforcement of the provisions at issue and is a proper defendant under Ex parte Young, 209 U.S. 123 (1908). For convenience, this order sometimes refers to the defendants simply as “the State.”
The complaint challenges Senate Bill 7072 as adopted by the 2021 Florida Legislature (“the Act”). The Act created three new Florida statutes:
Count 1 of the complaint alleges the Act violates the First Amendment’s free-speech clause by interfering with the providers’ editorial judgment, compelling speech, and prohibiting speech. Count 2 alleges the Act is vague in violation of the Fourteenth Amendment. Count 3 alleges the Act violates the Fourteenth Amendment’s equal protection clause by impermissibly discriminating between providers that are or are not under common ownership with a large theme park and by discriminating between providers that do or do not meet the Act’s size requirements. Count 4 alleges the Act violates the Constitution’s dormant commerce clause. Count 5 alleges the Act is preempted by
The plaintiffs have moved for a preliminary injunction. The motion has been fully briefed and orally argued. Each side has submitted evidentiary material. The motion is ripe for a decision.
II. Preliminary-Injunction Standard
As a prerequisite to a preliminary injunction, a plaintiff must establish a substantial likelihood of success on the merits, that the plaintiff will suffer irreparable injury if the injunction does not issue, that the threatened injury outweighs whatever damage the proposed injunction may cause a defendant, and that the injunction will not be adverse to the public interest. See, e.g., Charles H. Wesley Educ. Found., Inc. v. Cox, 408 F.3d 1349, 1354 (11th Cir. 2005); Siegel v. LePore, 234 F.3d 1163, 1176 (11th Cir. 2000) (en banc).
This order addresses these prerequisites. The order addresses the merits because likelihood of success on the merits is one of the prerequisites. With further factual development, the analysis may change. Statements in this order about the facts should be understood to relate only to the current record and the properly considered material now available. Statements about the merits should be understood only as statements about the likelihood of success as viewed at this time.
III. The Statutes
A. Terminology
Before setting out the substance of the challenged statutes, a word is in order about terminology. This order sometimes uses the term “social-media provider” to refer to what most people on the street would probably understand that term to mean—so YouTube, Facebook, Twitter, and dozens of smaller but similar providers. The distinguishing characteristic is perhaps this: the primary function of a social-media provider, or at least a primary function, is to receive content from users and in turn to make the content available to other users. This is hardly a precise definition, but none is needed; the term is used only for purposes of this order. The term “social-media provider,” as used in this order, is not limited to providers who are covered by the challenged statutes; the term is used instead to apply to all such entities, including those smaller than the providers covered by the statutes and those under common ownership with a large theme park.
When this order uses “social media platform”—the statutory term—with or without quotation marks, the reference ordinarily will be to an entity that both meets the statutory definition and is a social-media provider as described above. This order sometimes shortens the phrase to a single word: “platform.” At least on its face, the statutory definition also applies to systems nobody would refer to as social media; the definition says nothing about sharing content with other users. The State says the definition should nonetheless be understood to be limited to providers of social media within the common understanding—the State says this comports with the statutory findings and the statutes’ obvious purpose. The State may be correct. For present purposes it makes no difference.
B. Removing Candidates
A social-media provider sometimes bars a specific user from posting on the provider’s site. This can happen, for example, when a user violates the provider’s standards by engaging in fraud, spreading a foreign government’s disinformation, inciting a riot or insurrection, providing false medical or public-health information, or attempting to entice minors for sexual encounters.
Newly enacted
C. Posts “By or About” a Candidate
A social-media provider sometimes takes down a user’s post, sometimes restricts access to a post, and sometimes adds content to a post, saying, for example, that a post has been determined not to be true or that accurate information on the subject can be found at a specified location. And a social-media provider sometimes rearranges content on its site, including, for example, by making more readily available to a user content the provider believes the user will most wish to see. Social-media providers also often elevate content—make it more readily available to chosen users—when paid by advertisers to do so. Social-media providers routinely use algorithms as part of these processes.
At least by its terms,
In any event, the statute does not explain how, if the platform cannot use an algorithm “for content” by or about a candidate, the platform can know, before it has violated the statute by using an algorithm, whether a post is by or about a candidate.
The statute has a paid-content exception to the post-prioritization ban: post-prioritization of “certain content or material” from or about a candidate based on payments from the candidate or a third party is not a violation. The statute does not specify what “certain” refers to—if it just means all such paid content, the word “certain” is superfluous. But the whole paid-content exception may be superfluous anyway; the definition of post-prioritization has its own paid-content exception. See
D. Posts by a “Journalistic Enterprise”
“Censor” includes any action taken by a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform.
The statute defines “journalistic enterprise” in a manner that covers many entities that are engaged in journalism but also many that are not; any retailer who does business in Florida, has a website of substantial size, and fills 100,000 online orders per month apparently qualifies. A small newspaper, in contrast—one with fewer than 50,000 paid subscribers and fewer than 100,000 active monthly users—does not qualify, no matter how high its journalistic standards. The definition provides:
“Journalistic enterprise” means an entity doing business in Florida that:
- Publishes in excess of 100,000 words available online with at least 50,000 paid subscribers or 100,000 monthly active users;
- Publishes 100 hours of audio or video available online with at least 100 million viewers annually;
- Operates a cable channel that provides more than 40 hours of content per week to more than 100,000 cable television subscribers; or
- Operates under a broadcast license issued by the Federal Communications Commission.
The restrictions on a platform’s treatment of posts by journalistic enterprises have two exceptions: they do not apply to obscenity or paid content.
E. Opting Out of Post-Prioritization and Shadow Banning
The State says, though, that “user” in
F. Consistent Application of Standards
The statute does not define “consistent manner.” And the statute does not address what a social media platform should do when the statute itself prohibits consistent application of the platform’s standards—for example, when a candidate engages in conduct that would appropriately lead to deplatforming any other person, or when content “by or about” a candidate, if by or about anyone else, would be post-prioritized, or when a “journalistic enterprise” posts content that would otherwise be censored.
G. Changing the Standards
H. Information
Under
Under
Under
I. Antitrust
IV. Likelihood of Success on the Merits
A. 47 U.S.C. § 230
In Stratton Oakmont, Inc. v. Prodigy Services Co., 1995 WL 323710, at *3-4 (N.Y. Sup. Ct. May 24, 1995), an anonymous user posted allegedly defamatory content on an electronic bulletin board—an earlier version of what today might be called social media. The court said that if the provider of such a bulletin board did not undertake to review posted content—much as a librarian does not undertake to review all the books in a library—the provider would not be deemed the publisher of a defamatory post, absent sufficient actual knowledge of the defamatory nature of the content at issue. On the facts of that case, though, the provider undertook to screen the posted content—to maintain a “family oriented” site. The court held this subjected the provider to liability as a publisher of the content.
At least partly in response to that decision, which was deemed a threat to development of the internet, Congress enacted
Under
The federal statute also preempts the parts of
Claims based on alleged inconsistency of a platform’s removal of some posts but not others are preempted. See Domen, 991 F.3d at 73.
In sum, the plaintiffs are likely to prevail on their challenge to the preempted provisions—to those applicable to a social media platform’s restriction of access to posted material. This does not, however, invalidate other provisions; for those, the plaintiffs’ challenge must rise or fall with their constitutional claims.
B. First Amendment
1. Application to Social-Media Providers
Although a primary function of social-media providers is to receive content from users and in turn to make the content available to other users, the providers routinely manage the content, allowing most, banning some, arranging content in ways intended to make it more useful or desirable for users, and sometimes adding the providers’ own content. The plaintiffs call this curating or moderating the content posted by users. In the absence of curation, a social-media site would soon become unacceptable—and indeed useless—to most users.
The plaintiffs say—correctly—that they use editorial judgment in making these decisions, much as more traditional media providers use editorial judgment when choosing what to put in or leave out of a publication or broadcast. The legislative record is chock full of statements by state officials supporting the view that the providers do indeed use editorial judgment. A constant theme of legislators, as well as the Governor and Lieutenant Governor, was that the providers’ decisions on what to leave in or take out and how to present the surviving material are ideologically biased and need to be reined in.
Where social media fit in traditional First Amendment jurisprudence is not settled. But three things are clear.
First, the State has asserted it is on the side of the First Amendment; the plaintiffs are not. It is perhaps a nice sound bite. But the assertion is wholly at odds with accepted constitutional principles. The First Amendment says “Congress” shall make no law abridging the freedom of speech or of the press. The Fourteenth Amendment extended this prohibition to state and local governments. The First Amendment does not restrict the rights of private entities not performing
Second, the First Amendment applies to speech over the internet, just as it applies to more traditional forms of communication. See, e.g., Reno v. ACLU, 521 U.S. 844, 870 (1997) (stating that prior cases, including those allowing greater regulation of broadcast media, “provide no basis for qualifying the level of First Amendment scrutiny that should be applied” to the internet).
Third, state authority to regulate speech has not increased even if, as Florida argued nearly 50 years ago and is again arguing today, one or a few powerful entities have gained a monopoly in the marketplace of ideas, reducing the means available to candidates or other individuals to communicate on matters of public interest. In Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974), the Court rejected just such an argument, striking down a Florida statute requiring a newspaper to print a candidate’s reply to the newspaper’s unfavorable assertions. A similar argument about undue concentration of power was commonplace as the social-media restrictions now at issue advanced through the Florida Legislature. But here, as in Tornillo, the argument is wrong on the law; the concentration of market power among large social-media providers does not change the governing First Amendment principles. And the argument is also wrong on the facts. Whatever might be said of the largest providers’ monopolistic conduct, the internet provides a greater opportunity for individuals to publish their views—and for candidates to communicate directly with voters—than existed before the internet arrived. To its credit, the State does not assert that the dominance of large providers renders the First Amendment inapplicable.
That brings us to issues about First Amendment treatment of social-media providers that are not so clearly settled. The plaintiffs say, in effect, that they should be treated like any other speaker. The State says, in contrast, that social-media providers are more like common carriers, transporting information from one person to another much as a train transports people or products from one city to another. The truth is in the middle.
More generally, the plaintiffs draw support from three Supreme Court decisions in which a state mandate for a private entity to allow unwanted speech was held unconstitutional. On the State‘s side are two Supreme Court decisions in which a state or federal mandate for a private entity to allow unwanted speech was held constitutional. Each side claims the cases on its side are dispositive, but this case again falls in the middle. On balance, the decisions favor the plaintiffs.
The plaintiffs push hardest on Tornillo, which, as set out above, held unconstitutional the Florida statute requiring a newspaper to allow a candidate to reply to the newspaper’s unfavorable statements. But newspapers, unlike social-media providers, create or select all their content, including op-eds and letters to the editor. Nothing makes it into the paper without substantive, discretionary review, including for content and viewpoint; a newspaper is not a medium invisible to the provider. Moreover, the viewpoint that would be expressed in a reply would be at odds with the newspaper’s own viewpoint. Social-media providers, in contrast, routinely use algorithms to screen all content for unacceptable material but usually not for viewpoint, and the overwhelming majority of
Similarly, in Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, 515 U.S. 557 (1995), a state court ruled that the state’s public-accommodation law required an association conducting a private parade to allow participation by an organization advocating gay rights. The parade association asserted the gay-rights group’s participation would contravene what the association was attempting to communicate. The Supreme Court held the association had a First Amendment right to exclude the gay-rights group. Again, though, the parade involved a limited number of participants, all undoubtedly approved in the association’s discretionary judgment, including for viewpoint. This was not an invisible-to-the-provider event.
The third case on the plaintiffs’ side is Pacific Gas & Electric Co. v. Public Utilities Commission of California, 475 U.S. 1 (1986). There a public utility included in its billing envelopes its own viewpoint-laden newsletters. The state directed the utility to include in its billing envelopes four times per year a private watchdog organization’s newsletters setting out a viewpoint with which the utility disagreed. The Supreme Court held this unconstitutional. The utility undoubtedly knew precisely what went into its billing envelopes and newsletters; as in Tornillo and Hurley, this was not an invisible-to-the-provider forum.
These three cases establish that a private party that creates content or uses its editorial judgment to select content for publication cannot be required by the government also to publish other content in the same manner—in each of these instances, content with which the party disagreed. But social-media providers do not use editorial judgment in quite the same way. The content on their sites is, to a large extent, invisible to the provider.
Even so, the activities of social media platforms that are the focus of the statutes now at issue are not the routine posting of material without incident or the routine exclusion without incident of plainly unacceptable content. These statutes are concerned instead primarily with the ideologically sensitive cases. Those are the very cases on which the platforms are most likely to exercise editorial judgment. Indeed, the targets of the statutes at issue are the editorial judgments themselves. The State’s announced purpose of balancing the discussion—reining in the ideology of the large social-media providers—is precisely the kind of state action held unconstitutional in Tornillo, Hurley, and PG&E.
On the other side, the State pushes hardest on Rumsfeld v. FAIR, 547 U.S. 47 (2006). There the Court upheld a federal statute conditioning law schools’ receipt of federal funds on allowing military recruiters the same access as other recruiters to the school’s facilities and students. The Court held this was, for the most part, conduct, not speech. Indeed, the schools objected not primarily because they disagreed with anything they expected the recruiters to do or say on campus, but because they disagreed with the government’s policy on gays in the military. The statute did not require the schools to say anything at all, nor did the statute prohibit the schools from saying whatever they wished whenever and however they wished. It was unlikely anyone would conclude, from the military recruiters’ presence, that the schools supported the military’s policy.
Similarly, in PruneYard Shopping Center v. Robins, 447 U.S. 74 (1980), a shopping center refused to allow individuals to solicit petition signatures from members of the public at the shopping center. The California Supreme Court held the individuals had the right, under state law, to engage in the proposed activity. The ruling did not compel the shopping center to say anything at all, and the ruling did not prohibit the center from saying anything it wished, when and how it wished. The United States Supreme Court said it was unlikely anyone would attribute the solicitation activities to the shopping center and, with no state action compelling the center to speak or restricting it from doing so, there was no violation of the
FAIR and PruneYard establish that compelling a person to allow a visitor access to the person’s property, for the purpose of speaking, is not a
In sum, it cannot be said that a social media platform, to which most content is invisible to a substantial extent, is indistinguishable for
2. Strict Scrutiny
Viewpoint- and content-based restrictions on speech are subject to strict scrutiny. See, e.g., Reed v. Town of Gilbert, Ariz., 576 U.S. 155 (2015). A law restricting speech is content-based if it “applies to particular speech because of the topic discussed or the idea or message expressed.” Id. at 163 (citing Sorrell v. IMS Health, Inc., 564 U.S. 552, 563-64 (2011), Carey v. Brown, 447 U.S. 455, 462 (1980), and Police Dep’t of Chicago v. Mosley, 408 U.S. 92, 95 (1972)). Laws that are facially content-neutral, but that cannot be justified without reference to the content of the regulated speech, or that were adopted because of disagreement with the speaker’s message, also must satisfy strict scrutiny. See Reed, 576 U.S. at 164.
These principles plainly require strict scrutiny here. The Florida statutes at issue are about as content-based as it gets. Thus, for example,
The plaintiffs assert, too, with substantial factual support, that the actual motivation for this legislation was hostility to the social media platforms’ perceived liberal viewpoint. Thus, for example, the Governor’s
Moreover, these statements are consistent with the statutory definition of “social media platform,” which extends only to, and thus makes the legislation applicable only to, large entities—those with $100 million in revenues or 100 million monthly participants. As the Supreme Court has recognized, discrimination between speakers is often a tell for content discrimination. See, e.g., Citizens United v. Fed. Election Comm’n, 558 U.S. 310, 340 (2010) (“Speech restrictions based on the identity of the speaker are all too often simply a means to control content.”). That is the case here. The State has suggested no other basis for imposing these restrictions only on the largest providers. And even without evidence of an improper motive, the application of these requirements to only a small subset of social-media entities would be sufficient, standing alone, to subject these statutes to strict scrutiny. See, e.g., Minneapolis Star & Tribune Co. v. Minnesota Comm’r of Revenue, 460 U.S. 575, 591 (1983); Arkansas Writers’ Project, Inc. v. Ragland, 481 U.S. 221, 229 (1987).
Similar analysis applies to the treatment of “journalistic enterprises” in
Finally, the same is true of the exclusion for social-media providers under common ownership with a large Florida theme park. The State asserted in its brief that the provision could survive intermediate scrutiny, but the proper level of scrutiny is strict, and in any event, when asked at oral argument, the State could suggest no theory under which the exclusion could survive even intermediate scrutiny. The State says this means only that the exclusion fails, but that is at least questionable. Despite the obvious constitutional issue posed by the exclusion, the Legislature adopted it, apparently unwilling to subject favored Florida businesses to the statutes’ onerous regulatory burdens. It is a stretch to say the severability clause allows a court to impose these burdens on the statutorily excluded entities when the Legislature has not passed, and the Governor has not signed, a statute subjecting these entities to these requirements.
To survive strict scrutiny, an infringement on speech must further a compelling state interest and must be narrowly tailored to achieve that interest. See, e.g., Reed, 576 U.S. at 171. These statutes come nowhere close. Indeed,
The plaintiffs are likely to prevail on the merits of their claim that these statutes violate the
3. Intermediate Scrutiny
The result would be the same under intermediate scrutiny—the level of scrutiny that applies to some content-neutral regulations of speech. To survive intermediate scrutiny, a restriction on speech must further an important or substantial governmental interest unrelated to the suppression of free expression, and the restriction must be no greater than essential to further that interest. The narrow tailoring requirement is satisfied so long as the governmental interest would be achieved less effectively absent the restriction. See Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 662 (1994).
The provisions at issue here do not meet the narrow-tailoring requirement. Indeed, some of the disclosure provisions seem designed not to achieve any governmental interest but to impose the maximum available burden on the social media platforms.
Intermediate scrutiny does not apply because these statutes are not content- or viewpoint-neutral. And the statutes would not survive intermediate scrutiny even if it applied.
C. Vagueness
Two provisions are especially vague. First,
This order need not and does not decide whether vagueness would provide an independent ground for a preliminary injunction.
V. Other Prerequisites
The plaintiffs easily meet the other prerequisites to a preliminary injunction. If a preliminary injunction is not issued, the plaintiffs’ members will sometimes be compelled to speak and will sometimes be forbidden from speaking, all in violation of their editorial judgment and the
VI. Conclusion
The legislation now at issue was an effort to rein in social-media providers deemed too large and too liberal. Balancing the exchange of ideas among private speakers is not a legitimate governmental interest. And even aside from the actual motivation for this legislation, it is plainly content-based and subject to strict scrutiny. It is also subject to strict scrutiny because it discriminates on its face among otherwise-identical speakers: between social-media providers that do or do not meet the legislation’s size requirements and are or are not under common ownership with a theme park. The legislation does not survive strict scrutiny. Parts also are expressly preempted by federal law.
For these reasons,
IT IS ORDERED:
1. The plaintiffs’ motion for a preliminary injunction, ECF No. 22, is granted.
2. The defendants Ashley Brooke Moody, Joni Alexis Poitier, Jason Todd Allen, John Martin Hayes, Kymberlee Curry Smith, and Patrick Gillespie must take no steps to enforce
SO ORDERED on June 30, 2021.
s/Robert L. Hinkle
United States District Judge
