There was a time when regulation followed risk. Lawmakers held hearings, consulted experts, and tried to understand what they were voting on. That tradition has collapsed under the weight of partisanship and donor incentives. Today policy is drafted for optics rather than enforcement, leaving a digitizing society governed by deliberate ignorance.

In 2025 a budget megabill carried a proposed ten-year moratorium on state and local artificial intelligence regulation. Policy groups warned it would preempt safeguards against algorithmic discrimination, biometric surveillance, and deepfake abuse just as those systems spread into elections and public services. After public blowback, the Senate voted to strip the moratorium from the package. The episode showed how thoroughly speed and symbolism had outrun basic governance: lawmakers tried to silence a decade of state oversight before anyone had studied what that silence would cost.

California provided a cautionary tale about urgency without specificity. AB 2655 ordered platforms to remove materially deceptive election deepfakes within forty-eight hours, but courts enjoined and then invalidated key provisions on First Amendment and Section 230 grounds. Commentators warned from the start that the term "deceptive" lacked a clear definition and would collapse on contact with constitutional scrutiny. The result was performative toughness that failed in court rather than durable rules that could survive enforcement.

Data privacy completes the picture of fragmentation by design. In 2025 alone, eight more state privacy laws took effect and more than a dozen are on the books, with different rights and obligations that whipsaw compliance and leave users with uneven protections. Maryland bans sensitive data sales while Iowa permits collection subject to notice and opt out. Practitioners now track state maps, grace periods, and novel rights such as Minnesota's right to question automated decisions, each of which adds another variable to enterprise governance. The patchwork rewards lobbying and confuses users, and Washington still has no baseline.
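The compliance whipsaw is concrete enough to sketch as data. The table below is a minimal illustration covering only the three examples named above; every key and rule name is a hypothetical simplification of my own invention, not a statement of what any statute actually requires.

```python
# Illustrative sketch of the state-by-state privacy patchwork as a lookup
# table. Entries reflect only the examples in this article, and each field
# is a hypothetical simplification -- real statutes carry thresholds,
# exemptions, and cure periods this omits. Not legal advice.

STATE_RULES = {
    "MD": {"sensitive_data_sale": "prohibited"},            # Maryland: outright ban
    "IA": {"sensitive_data_sale": "notice_and_opt_out"},    # Iowa: collection with notice, opt out
    "MN": {"right_to_question_automated_decisions": True},  # Minnesota: novel right
}

def obligation(state: str, practice: str):
    """Look up the (simplified) rule for a practice in a state.

    States with no entry fall through to a sentinel, mirroring the
    absence of any federal baseline.
    """
    return STATE_RULES.get(state, {}).get(practice, "unaddressed_check_counsel")

# The same question gets a different answer in every jurisdiction.
# "XX" stands in for a state with no comprehensive privacy law.
for state in ("MD", "IA", "XX"):
    print(state, obligation(state, "sensitive_data_sale"))
```

Even at this cartoon level of detail, the answer to one question varies by jurisdiction, and the default case is "consult counsel," which is exactly the cost structure the paragraph above describes.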

There is a way to stop writing laws that fold on contact. Draft with stakeholder input, require explainability standards tied to use cases, and avoid vague terms that invite quick defeats in court. Pair any federal preemption with minimum floors rather than total silence so states retain authority to act where harms are local. Build procedures that demand public comment, open technical references, and pilot phases before mandates, and keep scope narrow enough to be testable. The best laws are boring, precise, and enforceable — not viral.