GUARD Act After Revisions: Privacy and Parental Concerns Remain High


Following widespread criticism, lawmakers have narrowed the scope of the GUARD Act, a legislative proposal originally targeting minors' access to certain AI systems. The initial version threatened to encompass nearly every AI-powered chatbot or search tool. The revised bill now focuses more narrowly on so-called “AI companions”—conversational systems designed to simulate emotional or interpersonal interactions with users. While this change addresses some of the most sweeping concerns, the updated legislation still creates serious problems for privacy, online speech, and parental choice.

Background of the GUARD Act

The GUARD Act, short for “Guarding Against Unrestricted AI Interactions for Minors,” was introduced with the stated goal of protecting children from potential harms of AI companions. Critics of the original bill argued it was overly broad, potentially restricting access to all sorts of AI tools, including education-focused chatbots and search engines. The revised version narrows the definition to systems that engage in emotional disclosures or present a persistent identity, persona, or character.

Source: www.eff.org

Key Changes in the Revised Bill

The most significant revision is the narrowed definition of “AI companion.” The original bill could have been interpreted to cover any AI-powered conversational tool. Now, it targets systems that “engage in interactions involving emotional disclosures” from the user or present a persistent identity, persona, or character. This change reduces the risk of overreach but still leaves gray areas. For instance, a general-purpose chatbot that occasionally offers casual empathy might fall under the definition, depending on interpretation.

Additionally, the revised bill allows a broader set of age-verification methods. However, all methods remain tied to users’ real-world identities—such as financial records, government IDs, or age-verified accounts with mobile operating systems or app stores. This requirement persists despite privacy concerns.

Persistent Privacy and Age-Verification Concerns

The revised GUARD Act still mandates that companies offering AI companions implement “reasonable age verification” for all users. While the permitted methods are more flexible, each still requires linking a user’s account to sensitive personal or financial information, and that requirement creates serious problems.

Many people, including parents, may choose to forgo using these services rather than compromise their privacy. As one analysis noted, “People are rightly creeped out by age-verification systems” and may simply avoid them altogether.


Unclear Definitions and Heavy Liability

The revised bill leaves important definitions unclear while sharply increasing penalties for companies that misinterpret them. For example, what exactly constitutes an “emotional disclosure”? Could a chatbot asking “How was your day?” be seen as soliciting emotional content? The vague language places developers in a difficult position, as they face heavy liability for getting judgments wrong. This chilling effect could discourage innovation and push companies to over-censor or block features to avoid legal risks.

Impact on Parental Choice and Access

Even parents who actively want their teenagers to use AI companions would face significant hurdles under the revised bill.

Both parents and their teens would still be forced through mandatory age checks tied to sensitive personal or financial information before they could access these services. This undermines parental discretion and assumes that every parent needs government-mandated barriers to technology they have already chosen for their child.

Conclusion: A Step Forward, But Serious Problems Linger

Lawmakers have made progress by narrowing the GUARD Act’s scope, but the revised bill still attempts to solve a complicated social problem with vague legal standards, heavy liability, and privacy-invasive verification systems. Until age verification can be done without compromising privacy or excluding vulnerable populations, and until definitions are crystal clear, the GUARD Act will continue to raise serious concerns. For now, citizens and organizations are urging Congress to oppose the bill in its current form.
