[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[H.R. 6489 Introduced in House (IH)]
<DOC>
119th CONGRESS
1st Session
H. R. 6489
To ensure that providers of chatbots clearly and conspicuously disclose
to users who are minors that chatbots are artificial intelligence
systems, not natural persons, and do not provide advice from licensed
professionals, and for other purposes.
_______________________________________________________________________
IN THE HOUSE OF REPRESENTATIVES
December 5, 2025
Mrs. Houchin introduced the following bill; which was referred to the
Committee on Energy and Commerce
_______________________________________________________________________
A BILL
To ensure that providers of chatbots clearly and conspicuously disclose
to users who are minors that chatbots are artificial intelligence
systems, not natural persons, and do not provide advice from licensed
professionals, and for other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Safeguarding Adolescents From
Exploitative BOTs Act'' or the ``SAFE BOTs Act''.
SEC. 2. REQUIREMENTS FOR CHATBOTS USED BY MINORS.
(a) Certain Statements Prohibited.--A chatbot provider may not
provide to a covered user a chatbot that states to the covered user
that the chatbot is a licensed professional (unless such statement is
true).
(b) Disclosure Required.--
(1) In general.--A chatbot provider shall clearly and
conspicuously disclose, in accordance with paragraphs (2) and
(3), to each covered user of a chatbot of such provider notice
of the following:
(A) The chatbot is an artificial intelligence
system and not a natural person.
(B) Resources for contacting a suicide and crisis
intervention hotline.
(2) Timing.--
(A) AI system disclosure.--A disclosure under
paragraph (1)(A) shall be made--
(i) at the initiation of the first
interaction of a covered user with a chatbot;
and
(ii) at any point at which, during an
interaction of a covered user with a chatbot,
the covered user prompts the chatbot about
whether the chatbot is an artificial
intelligence system.
(B) Crisis resources disclosure.--A disclosure
under paragraph (1)(B) shall be made at any point at
which, during an interaction of a covered user with a
chatbot, the covered user prompts the chatbot about
suicide or suicidal ideation.
(3) Use of plain language.--A disclosure under paragraph
(1) shall be made in a clear, age-appropriate, and plain
language manner that is reasonably understandable by a minor.
(c) Policies Required.--A chatbot provider shall establish,
implement, and maintain reasonable policies, practices, and
procedures--
(1) to ensure that a chatbot of the provider advises a
covered user to take a break from the chatbot at the point at
which a continuous and uninterrupted interaction of the covered
user with the chatbot has lasted for 3 hours; and
(2) to address, with respect to covered users--
(A) sexual material harmful to minors;
(B) gambling; and
(C) the distribution, sale, or use of illegal
drugs, tobacco products, or alcohol.
(d) Effective Date.--Subsections (a), (b), and (c) shall take
effect on the date that is 1 year after the date of the enactment of
this Act.
(e) Enforcement by Federal Trade Commission.--
(1) Unfair or deceptive acts or practices.--A violation of
subsection (a), (b), or (c) shall be treated as a violation of
a regulation under section 18(a)(1)(B) of the Federal Trade
Commission Act (15 U.S.C. 57a(a)(1)(B)) regarding unfair or
deceptive acts or practices.
(2) Powers of commission.--The Federal Trade Commission
shall enforce subsections (a), (b), and (c) in the same manner,
by the same means, and with the same jurisdiction, powers, and
duties as though all applicable terms and provisions of the
Federal Trade Commission Act (15 U.S.C. 41 et seq.) were
incorporated into and made a part of this section. Any person
who violates subsection (a), (b), or (c) shall be subject to
the penalties and entitled to the privileges and immunities
provided in the Federal Trade Commission Act.
(3) Authority preserved.--Nothing in this subsection may be
construed to limit the authority of the Federal Trade
Commission under any other provision of law.
(f) Actions by States.--
(1) In general.--In any case in which the attorney general
of a State, or an official or agency of a State, has reason to
believe that an interest of the residents of such State has
been or is threatened or adversely affected by an act or
practice in violation of subsection (a), (b), or (c), the
State, as parens patriae, may bring a civil action on behalf of
the residents of the State in an appropriate State court or an
appropriate district court of the United States to--
(A) enjoin such act or practice;
(B) enforce compliance with such subsection;
(C) obtain damages, restitution, or other
compensation on behalf of residents of the State; or
(D) obtain such other legal and equitable relief as
the court may consider to be appropriate.
(2) Notice.--Before filing an action under this subsection,
the attorney general, official, or agency of the State involved
shall provide to the Federal Trade Commission a written notice
of such action and a copy of the complaint for such action. If
the attorney general, official, or agency determines that it is
not feasible to provide the notice described in this paragraph
before the filing of the action, the attorney general,
official, or agency shall provide written notice of the action
and a copy of the complaint to the Federal Trade Commission
immediately upon the filing of the action.
(3) Authority of federal trade commission.--
(A) In general.--On receiving notice under
paragraph (2) of an action under this subsection, the
Federal Trade Commission shall have the right--
(i) to intervene in the action; and
(ii) upon so intervening--
(I) to be heard on all matters
arising therein; and
(II) to file petitions for appeal.
(B) Limitation on state action while federal action
is pending.--If the Federal Trade Commission or the
Attorney General of the United States has instituted a
civil action for violation of subsection (a), (b), or
(c) (referred to in this subparagraph as the ``Federal
action''), no State attorney general, official, or
agency may bring an action under this subsection during
the pendency of the Federal action against any
defendant named in the complaint in the Federal action
for any violation of such subsection alleged in such
complaint.
(4) Rule of construction.--For purposes of bringing a civil
action under this subsection, nothing in this Act shall be
construed to prevent an attorney general, official, or agency
of a State from exercising the powers conferred on the attorney
general, official, or agency by the laws of such State to
conduct investigations, administer oaths and affirmations, or
compel the attendance of witnesses or the production of
documentary and other evidence.
(g) Study on Chatbots and Mental Health of Minors.--
(1) In general.--The Secretary of Health and Human
Services, acting through the Director of the National
Institutes of Health, shall conduct a 4-year longitudinal study
to evaluate the risks and benefits of chatbots with respect to
the mental health of minors, including with respect to
loneliness, anxiety, social skill building, social isolation,
depression, self-harm, and suicidal ideation.
(2) Consultation.--In carrying out the study under
paragraph (1), the Secretary shall consult with--
(A) the Director of the National Institute of
Mental Health;
(B) pediatric mental health experts;
(C) technologists;
(D) ethicists; and
(E) educators.
(3) Report.--Not later than 4 years after the date of the
enactment of this Act, the Secretary, acting through the
Director, shall submit to the Committee on Energy and Commerce
of the House of Representatives and the Committees on Commerce,
Science, and Transportation and Health, Education, Labor, and
Pensions of the Senate a report on the results of the study
conducted under paragraph (1) and any related recommendations.
(h) Relationship to State Laws.--No State or political subdivision
of a State may prescribe, maintain, or enforce any law, rule,
regulation, requirement, standard, or other provision having the force
and effect of law, if such law, rule, regulation, requirement,
standard, or other provision covers a matter described in subsection
(a), (b), or (c).
(i) Rule of Construction.--Nothing in this Act may be construed to
require the affirmative collection by a chatbot provider of any
personal information with respect to the age of a user that a chatbot
provider is not already collecting in the normal course of business.
(j) Severability.--If any provision of this Act or the application
of this Act to any person or circumstance is held invalid, the
remaining provisions of this Act and the application of this Act to
other persons or circumstances shall not be affected.
(k) Definitions.--In this Act:
(1) Artificial intelligence.--The term ``artificial
intelligence'' has the meaning given such term in section 5002
of the National Artificial Intelligence Initiative Act of 2020
(15 U.S.C. 9401).
(2) Chatbot.--The term ``chatbot'' means an artificial
intelligence system, marketed to and available for use by
consumers, that engages in interactive, natural-language
communication with a user and generates or selects content in
response to user inputs (including text, voice, or other
inputs) using a conversational context.
(3) Chatbot provider.--
(A) In general.--The term ``chatbot provider''
means a person that provides a chatbot directly to a
consumer for the use of the consumer, including through
a website, mobile application, or other online means.
(B) Limitation.--A person that provides a website,
mobile application, or other online service that
includes a chat function incidental to the predominant
purpose of such website, application, or service shall
not be treated as a chatbot provider solely on the
basis of such incidental chat function.
(4) Covered user.--The term ``covered user'' means a user
of a chatbot if the provider of such chatbot--
(A) has actual knowledge that such user is a minor;
or
(B) would know that such user is a minor if not for
willful disregard.
(5) Minor.--The term ``minor'' means an individual under
the age of 17 years.
(6) Sexual material harmful to minors.--The term ``sexual
material harmful to minors'' means a picture, image, graphic
image file, film, videotape, or other visual depiction that--
(A)(i) taken as a whole and with respect to minors,
appeals to the prurient interest in nudity, sex, or
excretion;
(ii) depicts, describes, or represents, in a
patently offensive way with respect to what is suitable
for minors, an actual or simulated sexual act or sexual
contact, actual or simulated normal or perverted sexual
acts, or lewd exhibition of the genitals; and
(iii) taken as a whole, lacks serious literary,
artistic, political, or scientific value as to minors;
or
(B) is child pornography.
<all>