SB 1455
Creates provisions relating to artificial intelligence chatbots
Sponsor:
LR Number: 6193S.01I
Committee:
Last Action: 12/19/2025 - Prefiled
Journal Page:
Title:
Effective Date: August 28, 2026

Current Bill Summary

SB 1455 - The act establishes the "Guidelines for User Age-Verification and Responsible Dialogue Act of 2026" or the "GUARD Act".

The act provides that it shall be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot poses certain risks of soliciting minors to engage in sexually explicit conduct or encouraging minors to create or transmit any visual depiction of sexually explicit conduct. Any person who violates this provision shall be fined not more than $100,000 per offense.

It shall be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot encourages, promotes, or coerces suicide, self-injury, or imminent physical or sexual violence. Any person who violates this provision shall be fined not more than $100,000 per offense.

A covered entity, as defined in the act, shall require each individual accessing a chatbot to create a user account in order to use the chatbot.

For any chatbot that exists as of August 28, 2026, a covered entity shall freeze each existing user account, require the user to provide age data to restore the account, and use the age data to classify each user as a minor or an adult.

At the time an individual creates a new user account to interact with a chatbot, a covered entity shall request age data from the individual, verify the individual's age using a reasonable age verification process, and classify the user as a minor or an adult using the age data.

A covered entity shall periodically review previously verified user accounts using a reasonable age verification process.

A covered entity may contract with a third party to employ reasonable age verification measures as part of the age verification process, as described in the act.

A covered entity shall establish reasonable measures to protect personal data as described in the act.

Each artificial intelligence chatbot shall, at the start of each conversation with a user and at 30-minute intervals thereafter, disclose to the user that the chatbot is artificial intelligence and not a human being, and shall be programmed to ensure that the chatbot does not claim to be a human being.

The chatbot shall not represent that it is a licensed professional or that it provides certain professional services, as described in the act.

If the age verification process determines that an individual is a minor, a covered entity shall prohibit the minor from accessing any chatbot made available by the covered entity.

The Attorney General may bring a civil action for violations of the act. Available relief is described in the act.

The act is identical to HB 2032 (2026).

JULIA SHEVELEVA

Amendments

No Amendments Found.