89th Legislature Regular Session

SB 2637

Overall Vote Recommendation
No
Principle Criteria
Free Enterprise
Property Rights
Personal Responsibility
Limited Government
Individual Liberty
Digest
SB 2637 proposes the addition of Section 120.0521 to the Texas Business and Commerce Code, establishing disclosure requirements for social media content generated by automated "bot accounts." Under the bill, a "bot account" is defined as any account controlled by automated software that mimics or impersonates human activity by posting content online. The bill mandates that social media platforms identify and clearly disclose when content is known to have been generated by such accounts.

The required disclosure must include two key elements: (1) a notice that the content was posted by a bot account and (2) a warning that the content may contain misinformation. Platforms that fail to comply with these disclosure requirements are subject to civil penalties of up to $7,500 per violation. The Texas Attorney General is authorized to investigate complaints, determine violations, and initiate enforcement actions to collect the specified penalties.

SB 2637 is intended to enhance transparency and accountability in the digital information ecosystem, particularly in light of concerns about misinformation and automated influence operations.
Author
Nathan Johnson
Co-Author
Cesar Blanco
Fiscal Notes

According to the Legislative Budget Board (LBB), the fiscal implications of SB 2637 are currently indeterminate. The bill authorizes civil penalties of up to $7,500 per violation for social media platforms that fail to disclose when content is posted by a bot account. However, due to the unpredictable number of violations and resulting enforcement actions, the potential revenue from these penalties cannot be reliably estimated at this time.

The Office of the Attorney General (OAG), which is granted enforcement authority under the bill, anticipates that it can manage any new administrative or legal workload within its existing budget and staff resources. Thus, no new appropriations or expenditures are projected for the OAG to fulfill its responsibilities under this legislation.

Additionally, the bill is not expected to have a significant fiscal impact on the state court system, as enforcement would likely involve limited litigation or judicial oversight. Likewise, the bill is not projected to impose notable fiscal burdens on local governments, given its focus on state-level enforcement and regulation of private social media companies.

Vote Recommendation Notes

SB 2637 proposes a requirement for social media platforms to disclose when content has been posted by a bot account and to include a warning that such content may contain misinformation. While the bill is framed as a tool to enhance transparency and protect the public from digital deception, its mechanisms raise serious concerns about constitutional rights, government overreach, and regulatory burdens on private enterprise.

First and foremost, the bill introduces a form of government-compelled speech. It mandates that private companies attach language—crafted by the state—to certain posts, regardless of whether misinformation has actually occurred. This infringes on First Amendment protections by requiring businesses to make speculative statements dictated by law. While transparency is a valuable goal, compelling companies to label content in this manner veers into constitutionally questionable territory, particularly since not all automated posts are misleading or harmful.

Further, the bill lacks precision in defining what constitutes a "bot account" or how platforms are to "know" when content was generated by one. Without clear standards, enforcement could become arbitrary or politically motivated. This creates a compliance landscape in which companies must either over-label content to avoid fines or risk enforcement actions from the Attorney General's office. Such ambiguity invites regulatory abuse and legal challenges.

SB 2637 also represents a clear expansion of government power. It grants new investigatory and enforcement authority to the Office of the Attorney General, with the ability to pursue civil penalties up to $7,500 per post. While the fiscal note suggests that enforcement can be absorbed within existing budgets, the structural expansion of government into online speech oversight sets a concerning precedent, especially for those who value limited government and restraint in the state’s regulatory function.

From a free enterprise standpoint, the bill imposes nontrivial burdens on private companies, especially smaller or emerging social media platforms. Identifying and labeling bot content in compliance with the bill would require costly detection systems, legal oversight, and moderation tools. These barriers to entry could suppress innovation, entrench the market power of large incumbents, and reduce diversity in the digital marketplace.

Finally, although the bill does not impose a direct tax burden, its compliance costs and penalty risks effectively shift the burden onto private businesses and, by extension, consumers. This runs counter to the principle of minimal government interference in the market and private speech.

For these reasons—the infringement on constitutional liberties, expansion of government authority, vague enforcement standards, and unnecessary regulatory burdens—Texas Policy Research recommends that lawmakers vote NO on SB 2637. While the goal of combating online misinformation is understandable, this bill adopts an approach that risks doing more harm than good.

  • Individual Liberty: The bill directly threatens individual liberty by mandating compelled speech. Requiring private social media platforms to label certain content with state-mandated language—especially speculative statements like "may contain misinformation"—interferes with constitutional protections under the First Amendment. It sets a precedent for the state to dictate how private entities communicate or label content, even in cases where no harm or deception has been proven. This could also chill lawful speech by incentivizing over-labeling to avoid fines.
  • Personal Responsibility: The bill assumes that users cannot distinguish between human and automated content without government intervention. By forcing platforms to pre-label content and warn of its possible unreliability, it erodes the principle that individuals are responsible for evaluating the information they consume. A society that encourages critical thinking and skepticism should not default to state-managed content warnings as a proxy for truth.
  • Free Enterprise: This bill imposes a regulatory burden on private businesses, especially social media platforms and tech startups. Compliance would require costly investments in bot-detection systems, legal infrastructure, and moderation teams. Large companies may absorb this easily, but smaller platforms could be pushed out of the market or discouraged from entering it altogether. This government-imposed cost structure distorts the marketplace and deters innovation.
  • Private Property Rights: While the bill does not directly take property or interfere with physical assets, it undermines digital property rights by dictating how platforms manage and display content. The state is effectively asserting control over part of the business interface and user experience of a private company, infringing on the owner's ability to determine how their platform operates.
  • Limited Government: This bill clearly expands the size and scope of government. It gives the Texas Attorney General new investigative and enforcement powers over online speech-related issues, allowing the state to pursue penalties of up to $7,500 per violation. Even if existing staff absorb the workload (as the fiscal note claims), the policy invites further government involvement in digital content regulation—something that traditionally falls outside state authority. This contradicts the principle of keeping government power narrowly tailored and restrained.