Estimated Time to Read: 18 minutes
As digital tools continue to reshape the modern economy, the 89th Texas Legislature responded with a flurry of proposals targeting artificial intelligence (AI), cybersecurity, and social media. The resulting legislation reflects a growing desire by lawmakers to position Texas as both a national leader in tech regulation and a stronghold for digital oversight.
In this entry of our 8-part series on the 89th session, we examine what passed, what didn’t, and how the Texas Legislature is setting the pace for AI and technology policy in the years ahead.
Bills That Made It
The Texas Legislature passed a broad set of bills addressing the challenges and opportunities posed by artificial intelligence, cybersecurity, and online platforms. From creating new agencies and ethical frameworks to implementing age verification, takedown systems, and AI risk standards, lawmakers aimed to establish guardrails while encouraging innovation. The following enacted measures reflect a balance between child safety, public transparency, individual privacy, and economic development.
Creation of the Texas Cyber Command, House Bill 150
Authored by State Rep. Giovanni Capriglione (R–Southlake), House Bill 150 (HB 150) creates the Texas Cyber Command (TCC), a new state agency tasked with centralized responsibility for cybersecurity oversight and incident response. The Command absorbs key duties from the Department of Information Resources (DIR), including threat detection, digital forensics, incident response, and training across state agencies, local governments, and critical infrastructure sectors.
Established under Chapter 2063 of the Government Code, the agency will include specialized units such as a Cybersecurity Threat Intelligence Center, a Digital Forensics Laboratory, and a 24/7 cyber incident hotline. The bill requires the creation of a statewide threat reporting portal and mandates incident reporting by all governmental and critical infrastructure entities.
The TCC’s chief will be appointed by the governor (subject to Senate confirmation), and the agency is authorized to coordinate with federal authorities, private partners, and law enforcement to prevent and respond to cyberattacks. The bill also establishes a Cybersecurity Incident Response Fund for emergency support and mandates the development of best practices and training tools for public-sector cybersecurity.
The Legislative Budget Board projects a net cost of $138.7 million for the 2026–27 biennium and a staff expansion of 130 employees by 2027. Although the bill authorizes funding without appropriating it, the measure represents one of the most significant cybersecurity investments in state history. The agency will be subject to periodic review by the Texas Sunset Advisory Commission.
Governor Abbott designated the issue as an emergency item in his State of the State address and touted its passage as a major milestone in making Texas a national leader in cybersecurity preparedness.
Texas Responsible Artificial Intelligence Governance Act (TRAIGA), House Bill 149
Also authored by Capriglione, House Bill 149 (HB 149) establishes a legal and regulatory framework for artificial intelligence (AI) use and biometric data privacy in Texas. The legislation creates new chapters in the Business & Commerce Code under the heading Artificial Intelligence Protection. It is narrower than its originally filed version, which had proposed sweeping restrictions on AI.
At its core, the bill regulates how biometric identifiers, such as facial geometry, fingerprints, and voiceprints, can be used in connection with AI systems. It clarifies that public availability of biometric images does not imply consent for commercial use, but it allows limited exemptions for non-identifying AI training purposes. Once biometric data is used commercially, retention and disclosure rules apply under Section 503.001.
The bill also establishes the Texas Artificial Intelligence Council as an advisory body without regulatory powers, responsible for monitoring AI developments and coordinating best practices across state agencies. Early provisions for enforcement powers and regulatory sandboxes were removed. Instead, enforcement rests solely with the Attorney General, who may seek injunctions and impose penalties up to $100,000 per violation. No private right of action is permitted.
HB 149 carries a projected general revenue cost of nearly $25 million through the 2026–27 biennium to fund new staffing and systems at the Attorney General’s office and the Department of Information Resources. Supporters praise the bill, which avoids the overreach of earlier drafts, as a balanced, privacy-forward measure to ensure that AI technologies deployed in Texas operate ethically and transparently. Critics counter that it imposes sweeping compliance obligations on all entities, regardless of size or intent, without exemptions for small businesses or research projects, and that its vague standards may prove difficult to interpret and enforce, ultimately chilling innovation and open-source development.
TRAIGA attempts to emphasize transparency, public interest, and accountability while maintaining room for experimentation, and supporters have described it as a “landmark” model for other states seeking to balance innovation with consumer protection.
Public Sector AI Transparency and Risk Standards, Senate Bill 1964
Senate Bill 1964 (SB 1964), authored by State Sen. Tan Parker (R-Flower Mound), creates the first comprehensive framework for the use and oversight of artificial intelligence by Texas state agencies and local governments. The bill defines key AI terms and establishes a tiered classification system that includes “heightened scrutiny” AI systems that autonomously influence consequential decisions such as benefit eligibility or licensing.
All state agencies must inventory their AI systems and assess associated risks, submitting this information as part of broader IT strategic planning. Local governments must also assess their high-risk AI systems and share this information with the Department of Information Resources (DIR) upon request.
DIR is tasked with developing a statewide AI Code of Ethics and minimum governance standards, modeled after the federal NIST AI Risk Management Framework. The agency must also create educational and training materials for public sector workers and the general public. Smaller agencies are given flexibility to share data management resources and must publish datasets annually to support open data goals.
Though the final version of the bill omitted enforcement mechanisms, public complaint processes, and experimental sandbox programs from earlier drafts, it still represents a meaningful step in establishing ethical guardrails and transparency for government AI use. DIR is expected to spend over $7 million in the next biennium to implement the bill’s requirements. Critics of the bill argue that while it creates necessary baselines, it also lacks meaningful transparency, public accountability mechanisms, or enforceable protections for Texans affected by government AI decisions, leaving oversight largely to internal agency discretion.
This bill reflects Texas’s first major step toward coordinated AI governance in the public sector. By requiring inventories, strategic evaluations, and adherence to a state-led ethics framework, the bill lays the groundwork for future AI policy development while deliberately stopping short of enforcement-heavy or publicly accountable regulatory mechanisms. It positions Texas to respond to the growing use of AI in government decision-making while preserving agency flexibility and avoiding centralized bureaucratic expansion.
App Store Parental Controls and Data Privacy for Minors, Senate Bill 2420
Known as the App Store Accountability Act, Senate Bill 2420 (SB 2420), authored by State Sen. Angela Paxton (R-McKinney), creates a comprehensive regulatory framework to safeguard minors from unauthorized digital purchases and exposure to inappropriate content. The bill requires mobile app stores to implement age verification systems, obtain explicit parental consent for all app downloads and in-app purchases by minors, and ensure software developers disclose content ratings and data practices.
Users must be categorized by age range (e.g., under 13, 13–15, 16–17, or 18+), and accounts for minors must be linked to a verified parent or guardian account. Consent must be given per transaction, and app stores must provide detailed disclosures for content ratings and monetization features. Developers are obligated to provide and update app ratings and notify app stores of material changes, which in turn re-triggers the parental consent process.
Certain apps operated by nonprofit or government entities that provide emergency or educational services are exempt. The law also includes privacy safeguards, mandating minimal data collection and encryption for information used in age verification and consent processes.
Violations are enforced by the Texas Attorney General under the Deceptive Trade Practices Act, allowing for civil penalties and injunctive relief. While the bill does not create new criminal offenses or agencies, it imposes extensive compliance requirements on app stores and developers, especially smaller entities. Fiscal analysis indicates no significant cost to state or local governments, with administrative enforcement expected to be absorbed within existing resources.
SB 2420 will take effect January 1, 2026. By setting new standards for digital app marketplaces, the law marks a major step in shifting digital oversight responsibilities toward platforms, embedding parental authority into the architecture of mobile commerce and app distribution, and notably expanding Texas’s data privacy laws as they apply to children’s digital use.
Platform Takedown Requirements for Explicit Deepfake Content, House Bill 3133
House Bill 3133 (HB 3133), authored by State Rep. Salman Bhojani (D-Euless), requires social media platforms to establish formal complaint procedures for users to report nonconsensual, AI-generated sexually explicit deepfake content. The bill defines “explicit deep fake material” as synthetic media that appears to depict real individuals engaged in sexual conduct or displaying intimate parts, created with the intent to deceive.
Platforms must acknowledge flagged content within 48 hours and remove it (along with duplicates) pending review. Within seven days, they must inform the complainant of the complaint’s resolution. If the content is determined to be non-explicit, it may be reinstated. Platforms are not required to notify the user whose content is removed or offer an appeals process, raising concerns about due process and the potential for abuse of the complaint system.
Violations are classified as deceptive trade practices under the Texas Deceptive Trade Practices Act (DTPA), allowing affected users to bring civil suits for noncompliance. The bill does not create new criminal offenses or regulatory agencies and places the burden of enforcement on private action.
Critics have flagged the bill’s lack of procedural safeguards and the potential chilling effects on expression, while supporters view it as a necessary response to protect victims, especially women, from deepfake exploitation.
Criminal and Civil Penalties for AI-Generated Deepfake Pornography, Senate Bill 441
Senate Bill 441 (SB 441), authored by State Sen. Juan ‘Chuy’ Hinojosa (D-McAllen), significantly expands both criminal and civil liability in Texas for the creation, distribution, solicitation, or promotion of AI-generated sexually explicit content without the subject’s consent. The bill amends Section 21.165 of the Penal Code to criminalize nonconsensual deepfake media, redefining the offense to include any realistic visual depiction altered by AI that portrays an identifiable person in sexual scenarios. First-time offenses are Class A misdemeanors, with elevated penalties for repeat offenses or offenses involving minors.
A new Class B misdemeanor offense is also created for threats to disseminate such content with the intent to harass or coerce. Platforms and third parties may face civil liability if they knowingly profit from or facilitate distribution without proper consent. Civil remedies include actual and punitive damages, attorney’s fees, and victim anonymity protections.
The bill also mandates content takedown procedures for platforms, requiring removal of flagged content within 72 hours and public transparency about the takedown process. SB 441 provides affirmative defenses for law enforcement, legal activity, and neutral service providers, aiming to protect civil liberties while closing dangerous legal loopholes related to synthetic sexual exploitation.
Age and Consent Verification for AI-Generated Sexual Content, House Bill 581
Authored by State Rep. Mary Gonzalez (D-El Paso), House Bill 581 (HB 581) addresses the growing threat of non-consensual, AI-generated sexually explicit material, especially deepfakes involving minors. The bill defines “artificial sexual material harmful to minors” as content created using AI that realistically portrays an identifiable individual in explicit scenarios. It applies to commercial entities that operate websites or applications capable of generating such material.
The bill requires these platforms to implement reasonable age verification systems, such as government-issued ID checks, before allowing access to generative tools. It prohibits the storage of identifying data collected during verification. To ensure flexibility, platforms can be exempted from age verification if they prohibit such content in their terms of service and use effective technical safeguards to prevent abuse, such as content filters or pre-training exclusions.
HB 581 also mandates that platforms verify a person is over 18 and has given consent before using their likeness in AI-generated explicit content. Violators face civil penalties of up to $10,000 per day. The bill includes no new criminal penalties and relies on civil enforcement mechanisms only.
The Legislative Budget Board does not anticipate a significant fiscal impact, as enforcement is expected to be handled using existing agency resources. This bill reflects a narrowly tailored policy solution balancing privacy, innovation, and protection for vulnerable individuals in the AI age. Critics argue, however, that it creates significant compliance burdens for platforms and developers, particularly smaller or open-source projects, and that its broad definitions and lack of clarity could chill legitimate applications of generative AI.
Creation of the Texas Strategic Bitcoin Reserve, Senate Bill 21
Senate Bill 21 (SB 21), authored by State Sen. Charles Schwertner (R-Georgetown), creates the Texas Strategic Bitcoin Reserve, a state-managed fund overseen by the Texas Comptroller of Public Accounts and established outside the state treasury. The reserve is authorized to invest in digital assets—specifically cryptocurrencies with a market cap of at least $500 billion, effectively limiting investments to Bitcoin (and potentially Ethereum). Its purpose is to diversify the state’s financial portfolio and position Texas as a national leader in digital asset policy.
The Comptroller has broad authority to acquire, manage, secure, and liquidate cryptocurrency, including through custodians, cold storage, and liquidity partners. Earnings and proceeds may be used for fund administration, and temporary fund transfers to the treasury may be permitted with legislative approval. The law also establishes a five-member advisory committee to guide investment policy and oversight.
SB 21 amends the Government Code to exempt the reserve from traditional public fund investment limits, creating a significant shift in how the state approaches financial diversification. Biennial public reporting is required, detailing assets, transactions, and fund performance.
Changes to the bill throughout the process introduced restrictions on liquidation, improved fiscal guardrails, and clarified that no funds could be transferred without legislative approval. Still, the reserve’s existence outside the treasury reduces legislative control and opens the door to volatility-driven financial exposure. The Legislative Budget Board deemed the fiscal impact indeterminate, and oversight mechanisms like audit requirements and exposure caps were ultimately excluded.
With the passage of SB 21, Texas became the first state to establish a government-managed bitcoin reserve fund, which will accrue digital assets through legislative appropriation, investments, and private donations. While crypto proponents have praised the reserve as a visionary step, critics warn that it marks a risky expansion of government into speculative investment; its long-term success or failure will depend on cryptocurrency market performance and the Comptroller’s stewardship.
Supporters describe it as a long-term strategic hedge against inflation and a signal of Texas’s continued openness to cryptocurrency and blockchain experimentation.
Bills That Did Not Make It
Despite a surge of proposals, not all tech-related legislation cleared the finish line. Several high-profile bills—particularly those involving content provenance, youth social media restrictions, and expanded AI disclosures—failed to advance through the Senate or were sidelined during negotiations. These measures, while stalled, remain influential in shaping future conversations around digital policy in Texas.
AI-Generated Media Provenance Requirements for Social Platforms, House Bill 2874
House Bill 2874 (HB 2874), authored by State Rep. Suleman Lalani (D-Sugar Land), would have required social media platforms to attach and retain provenance metadata for all photo, video, or audio content created or uploaded on their platforms using generative artificial intelligence (AI). This metadata would indicate whether AI was used in generating or modifying the content, the specific tool involved, and the name of its provider. Users would have been able to view this data through an accessible interface.
The bill removed a prior threshold of 1.5 million Texas users, making it applicable to platforms of all sizes. While it offered liability shields for platforms that relied in good faith on third-party data, it also allowed the Attorney General to enforce compliance and grant exemptions under broad discretionary standards.
Although the bill was intended to increase transparency and combat misinformation, critics raised concerns about compliance burdens, risks to expressive freedom, and vague enforcement provisions. Though placed on a House calendar, it was never considered on the House floor before end-of-session deadlines.
Social Media Ban for Minors, House Bill 186
Authored by State Rep. Jared Patterson (R-Frisco), House Bill 186 (HB 186) would have prohibited individuals under 18 from accessing or using social media platforms in Texas. The bill passed the House and was reported favorably by the Senate State Affairs Committee, but ultimately was never considered on the Senate floor before the session ended.
The bill required platforms to implement commercially reasonable age verification procedures and defined social media platforms narrowly to exclude ISPs, email services, and platforms focused primarily on non-user-generated content or gaming. Verified parents or guardians would have been empowered to request deletion of a child’s account, and platforms would have been required to act within 10 days of such a request.
Violations, including granting access to minors or mishandling verification data, would have been treated as deceptive trade practices under the Texas Deceptive Trade Practices Act (DTPA), giving enforcement authority to the Attorney General.
Though the bill received significant attention and was poised to become a cornerstone of Texas’s approach to youth tech safety, it died due to time constraints. It remains a candidate for reintroduction in a future session, given the growing national debate around social media and adolescent mental health.
Opponents raised major privacy concerns, warning that the required age-verification methods could expose minors to data collection and security vulnerabilities. The bill also sparked intense constitutional and procedural debates, with critics highlighting potential First Amendment violations, the question of whether the state should override parental discretion, and the open-ended expansion of government authority into family and digital space. Many feared it would set a dangerous precedent for state control over youth online behavior.
Political Takeaways/Trends
Artificial intelligence and digital governance emerged as a major focus of the 89th Legislature, but the resulting policy landscape is far from uniformly positive. While lawmakers claimed to pursue innovation and digital safety, many of the bills passed expand state power, introduce new layers of bureaucracy, and regulate private platforms and developers in ways that may chill innovation and intrude on parental rights.
Though measures like TRAIGA and the Cyber Command frame Texas as a tech-forward state, they also concentrate oversight authority in unelected bodies and risk burdening small startups and open-source developers. Meanwhile, interventions like SB 2420 insert the state between parents and digital platforms, mandating age verification and individualized consent structures that could complicate family decisions rather than empower them.
Proposals like HB 186, which would have outright barred minors from using social media, reflect a paternalistic impulse that continues to resurface despite constitutional challenges and limited evidence of efficacy. While that bill ultimately failed, others like HB 581 and HB 3133 demonstrate the legislature’s growing willingness to police digital spaces, often through sweeping mandates enforced via civil penalties.
The direction of state policy suggests a more centralized role for government in technology regulation—one that often sacrifices innovation, free expression, and market flexibility in pursuit of control. While the rhetoric is pro-technology and pro-child safety, the practical outcome is increased government intervention in areas better left to families, individuals, and innovators.
Conclusion
Texas exited the 89th Legislative Session having enacted a wide range of technology-related bills, but the result is not an unequivocal advancement of liberty or innovation. While lawmakers made bold moves on cybersecurity, biometric privacy, and digital content regulation, many of the passed bills expand the role of government in ways that raise serious concerns.
Rather than foster a free and open innovation environment, several measures increase bureaucracy, burden small platforms and startups, and assert regulatory authority over areas better governed by parental judgment and private decision-making. Legislation like SB 2420 and HB 3133 inserts the state between families and digital tools, while AI-related mandates risk stifling experimentation by imposing sweeping compliance obligations without proportional safeguards.
Though framed as “tech-forward,” the session’s results often leaned toward centralized control and symbolic policymaking, rather than genuine empowerment of individuals and markets. As Texas continues to chart its course on digital policy, the priority should be recalibrating toward liberty, decentralization, and a more restrained view of government’s role in emerging technologies.