Governor Walz Signs Groundbreaking Legislation to Combat AI‑Generated CSAM


Key Takeaways

  • Minnesota became the first U.S. state to ban access to “nudification” technology that uses AI to create fake, non‑consensual nude images or videos.
  • Governor Tim Walz signed HF 1606 on Thursday; the bill passed with bipartisan support and will take effect in August 2025.
  • The law prohibits downloading, accessing, or using nudification software unless the output requires substantial human artistic or technical skill.
  • Civil penalties under the statute are not retroactive, but the legislation gives victims and the Minnesota Attorney General’s Office a legal route to pursue companies that provide the technology.
  • The measure responds to a rapid rise in AI‑generated child sexual abuse material; the Internet Watch Foundation recorded over 8,000 such images and videos in 2025.
  • Legislative sponsor Rep. Jess Hanson emphasized that stopping the technology at its source is essential to curb exploitation.
  • A recent federal case highlighted the danger: a former school employee pleaded guilty to producing over 690 AI‑altered child pornography images using photos taken on the job.
  • Victim advocate Megan Hurley described the lasting trauma of having fabricated nude images circulated without her consent.
  • The law aims to protect minors and adults alike by targeting the tools that enable deep‑fake sexual abuse rather than punishing only the end‑users.
  • Supporters hope Minnesota’s precedent will inspire similar safeguards in other states and at the federal level as AI capabilities continue to evolve.

Overview of Minnesota’s New Legislation
Minnesota has taken a pioneering step in the fight against AI‑enabled sexual exploitation by enacting a law that outright bans access to “nudification” technology. The statute, HF 1606, was signed into law by Governor Tim Walz on Thursday after receiving bipartisan backing in the state legislature. By targeting the tools that generate fabricated nude images or pornographic videos from ordinary photographs, the law seeks to address a growing threat that has outpaced existing legal frameworks. The legislation reflects a proactive stance, asserting that the state’s responsibility to protect its residents—particularly children and families—must keep pace with rapid technological advancement.

Details of HF 1606 and Its Provisions
The core of HF 1606 prohibits any person in Minnesota from accessing, downloading, or using software, websites, or applications that employ artificial intelligence to create nude or sexualized depictions of individuals without their consent. An important exception is carved out for cases where the output demands a “substantial application of technological or artistic skill by a human creator directing and controlling the output.” This carve‑out is intended to preserve legitimate artistic or educational uses that involve significant human intervention while closing the loophole that allows fully automated nudification tools to flourish. Violations of the ban are subject to civil penalties, though the statute explicitly states that these penalties will not apply retroactively to conduct occurring before the law’s effective date.

Governor Walz’s Statement and Rationale
Governor Tim Walz framed the legislation as a necessary safeguard in an era where AI capabilities are advancing at breakneck speed. In his signing statement, he declared, “Technology is moving fast, but our responsibility to protect Minnesotans, especially kids and families, moves faster.” Walz emphasized that the bill sends a clear message: creating and distributing fake, non‑consensual intimate images is unacceptable and will not be tolerated. By positioning Minnesota at the forefront of AI‑related harm mitigation, the governor hopes to deter both developers and users of nudification technology from exploiting the state’s residents.

Background on Nudification Technology and Its Harms
Nudification tools leverage deep‑learning models, often variants of generative adversarial networks (GANs) or diffusion models, to manipulate clothed photographs into realistic nude images or to splice faces onto pornographic video content. Because the output can be indistinguishable from genuine media, victims frequently suffer severe psychological trauma, reputational damage, and a loss of personal autonomy. The technology has been weaponized for harassment, extortion, and the production of child sexual abuse material (CSAM). Unlike traditional photo editing, which requires considerable skill and time, modern AI nudification apps can produce convincing results with minimal user input, dramatically lowering the barrier to misuse.

Statistics from Internet Watch Foundation
The urgency behind Minnesota’s law is underscored by data from the Internet Watch Foundation (IWF), which monitors online child sexual abuse content. In 2025, the IWF identified 8,029 AI‑generated images and videos depicting realistic child sexual abuse—a figure that represents a sharp increase from previous years. Once such material is uploaded, it can proliferate across forums, dark‑web marketplaces, and social platforms, making removal exceedingly difficult. Victims often report that the images continue to resurface long after the original upload, perpetuating trauma and complicating efforts to regain control over their digital identities.

Legislative Sponsor Rep. Jess Hanson’s Perspective
Rep. Jess Hanson, the author of HF 1606, argued that addressing the problem at its source—namely, the technology that enables the creation of fake nude imagery—is critical. In interviews, Hanson noted that many members of the public remain unaware of how pervasive and accessible nudification tools have become. She stressed that the rapid proliferation of AI‑generated CSAM over the past few years demanded immediate legislative action. Hanson believes that making it illegal to obtain or use these tools will allow Minnesota to curb the supply chain that fuels exploitation, reducing the volume of harmful content created and circulated.

Implementation Timeline and Enforcement Mechanisms
HF 1606 is set to go into effect in August 2025, giving developers, distributors, and end‑users a brief window to comply with the new restrictions. Enforcement will fall primarily to the Minnesota Attorney General’s Office, which is empowered to investigate companies that provide nudification software and to pursue civil actions against violators. While the law does not impose criminal penalties for mere possession, it allows victims to seek damages and injunctive relief through civil litigation. The non‑retroactive nature of the civil penalties means that past conduct will not be punished under this statute, but ongoing or future violations will be subject to enforcement.

Case Example: Michael Haslach’s Guilty Plea
The legislative push gained additional momentum from a concurrent federal case that illustrated the real‑world dangers of AI‑enabled abuse. On the same day Governor Walz signed the bill, the U.S. Attorney’s Office for the District of Minnesota announced that 30‑year‑old Michael Haslach, a former school district employee, pleaded guilty to attempted production of child pornography and production of an obscene visual representation of child sexual abuse. Investigators reported that Haslach had accessed clothed photographs of at least 91 children while working as a lunch monitor, traffic guard, and youth summer‑programs assistant. He then used AI nudification tools to morph those images into pornographic content, amassing more than 690 altered files. The case was initiated after Dropbox flagged suspicious uploads and tipped authorities, leading to Haslach’s arrest in Maplewood, MN.

Victim Testimony: Megan Hurley’s Experience
Megan Hurley, who testified at the state Capitol in support of HF 1606, shared a personal account that highlighted the lasting impact of non‑consensual deep‑fake imagery. Hurley described how a former acquaintance used AI to generate realistic nude images of her and dozens of other women, even though she had never taken or shared such photos. She said the fabricated images appeared “so real” that they caused severe anxiety, leading her to adjust her work schedule to avoid being alone and to live with an ongoing fear of being recognized. Hurley emphasized that once such content is disseminated online, there is virtually no way to fully erase it, leaving victims to live with the threat of perpetual exposure. Her testimony underscored the law’s focus on preventing the creation of such material in the first place, rather than relying solely on after‑the‑fact removal efforts.

Broader Implications and Future Outlook
Minnesota’s ban on nudification technology represents a landmark effort to confront the dark side of generative AI. By targeting the tools that enable non‑consensual synthetic pornography, the law aims to reduce the supply of exploitative content at its origin, complementing existing criminal statutes that punish the distribution and possession of CSAM. Legal scholars note that the statute may serve as a model for other states grappling with similar challenges, potentially influencing federal discussions on AI safety and content regulation. As AI continues to evolve, ongoing vigilance will be required to ensure that legislation keeps pace with emerging techniques, such as real‑time video deep‑fakes or more sophisticated image‑to‑video generators. Nonetheless, Minnesota’s decisive action sends a clear societal message: the protection of individuals’ dignity and safety must not be sacrificed in the pursuit of technological innovation.
