[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[S. 1409 Reported in Senate (RS)]
Calendar No. 287
118th CONGRESS
1st Session
S. 1409
To protect the safety of children on the internet.
_______________________________________________________________________
IN THE SENATE OF THE UNITED STATES
May 2, 2023
Mr. Blumenthal (for himself, Mrs. Blackburn, Mr. Lujan, Mrs. Capito,
Ms. Baldwin, Mr. Cassidy, Ms. Klobuchar, Ms. Ernst, Mr. Peters, Mr.
Daines, Mr. Hickenlooper, Mr. Rubio, Mr. Warner, Mr. Sullivan, Mr.
Coons, Mr. Young, Mr. Schatz, Mr. Grassley, Mr. Murphy, Mr. Graham, Mr.
Welch, Mr. Marshall, Ms. Hassan, Mrs. Hyde-Smith, Mr. Durbin, Mr.
Mullin, Mr. Casey, Mr. Risch, Mr. Whitehouse, Mrs. Britt, Mr. Scott of
Florida, Ms. Lummis, Mr. Cornyn, Ms. Murkowski, Mr. Wicker, Mr. Kelly,
Mr. Manchin, Mr. Lankford, Mr. Crapo, Mr. Carper, Mr. Kaine, Mr.
Cardin, Mrs. Shaheen, Mr. Menendez, Mr. Thune, Ms. Warren, and Mr.
Hawley) introduced the following bill; which was read twice and
referred to the Committee on Commerce, Science, and Transportation
December 13, 2023
Reported by Ms. Cantwell, with an amendment
[Strike out all after the enacting clause and insert the part printed
in italic]
_______________________________________________________________________
A BILL
To protect the safety of children on the internet.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
<DELETED>SECTION 1. SHORT TITLE; TABLE OF CONTENTS.</DELETED>
<DELETED> (a) Short Title.--This Act may be cited as the ``Kids
Online Safety Act''.</DELETED>
<DELETED> (b) Table of Contents.--The table of contents for this Act
is as follows:</DELETED>
<DELETED>Sec. 1. Short title; table of contents.
<DELETED>Sec. 2. Definitions.
<DELETED>Sec. 3. Duty of care.
<DELETED>Sec. 4. Safeguards for minors.
<DELETED>Sec. 5. Disclosure.
<DELETED>Sec. 6. Transparency.
<DELETED>Sec. 7. Independent research.
<DELETED>Sec. 8. Market research.
<DELETED>Sec. 9. Age verification study and report.
<DELETED>Sec. 10. Guidance.
<DELETED>Sec. 11. Enforcement.
<DELETED>Sec. 12. Kids online safety council.
<DELETED>Sec. 13. Effective date.
<DELETED>Sec. 14. Rules of construction and other matters.
<DELETED>Sec. 15. Severability.
<DELETED>SEC. 2. DEFINITIONS.</DELETED>
<DELETED> In this Act:</DELETED>
<DELETED> (1) Child.--The term ``child'' means an individual
who is under the age of 13.</DELETED>
<DELETED> (2) Compulsive usage.--The term ``compulsive
usage'' means any response stimulated by external factors that
causes an individual to engage in repetitive behavior
reasonably likely to cause psychological distress, loss of
control, anxiety, depression, or harmful stress
responses.</DELETED>
<DELETED> (3) Covered platform.--</DELETED>
<DELETED> (A) In general.--The term ``covered
platform'' means a social media service, social
network, online video game (including educational
games), messaging application, video streaming service,
or an online platform that connects to the internet and
that is used, or is reasonably likely to be used, by a
minor.</DELETED>
<DELETED> (B) Exceptions.--The term ``covered
platform'' does not include--</DELETED>
<DELETED> (i) an entity acting in its
capacity as a provider of--</DELETED>
<DELETED> (I) a common carrier
service subject to the Communications
Act of 1934 (47 U.S.C. 151 et seq.) and
all Acts amendatory thereof and
supplementary thereto;</DELETED>
<DELETED> (II) a broadband internet
access service (as such term is defined
for purposes of section 8.1(b) of title
47, Code of Federal Regulations, or any
successor regulation);</DELETED>
<DELETED> (III) an email service;
or</DELETED>
<DELETED> (IV) a wireless messaging
service provided through the short
messaging service or multimedia
messaging service protocols;</DELETED>
<DELETED> (ii) an organization not organized
to carry on business for its own profit or that
of its members;</DELETED>
<DELETED> (iii) any public or private
preschool, elementary, or secondary school, or
any institution of vocational, professional, or
higher education; or</DELETED>
<DELETED> (iv) a product or service that
primarily functions as business-to-business
software.</DELETED>
<DELETED> (4) Mental health disorder.--The term ``mental
health disorder'' has the meaning given the term ``mental
disorder'' in the Diagnostic and Statistical Manual of Mental
Disorders, 5th Edition (or the most current successor
edition).</DELETED>
<DELETED> (5) Minor.--The term ``minor'' means an individual
who is under the age of 17.</DELETED>
<DELETED> (6) Online platform.--The term ``online platform''
means any public-facing website, online service, online
application, or mobile application that predominantly provides
a community forum for user-generated content, including sharing
videos, images, games, audio files, or other content.</DELETED>
<DELETED> (7) Parent.--The term ``parent'' includes a legal
guardian or an individual with legal custody over a
minor.</DELETED>
<DELETED> (8) Personal data.--The term ``personal data''
means information that identifies or is linked or reasonably
linkable to a particular minor, including a consumer device
identifier associated with a minor.</DELETED>
<DELETED> (9) Personalized recommendation system.--The term
``personalized recommendation system'' means a fully or
partially automated system used to suggest, promote, or rank
information based on the personal data of users.</DELETED>
<DELETED> (10) Sexual exploitation and abuse.--The term
``sexual exploitation and abuse'' means any of the
following:</DELETED>
<DELETED> (A) Coercion and enticement, as described
in section 2422 of title 18, United States
Code.</DELETED>
<DELETED> (B) Child sexual abuse material, as
described in sections 2251, 2252, 2252A, and 2260 of
title 18, United States Code.</DELETED>
<DELETED> (C) Trafficking for the production of
images, as described in section 2251A of title 18,
United States Code.</DELETED>
<DELETED> (D) Sex trafficking of children, as
described in section 1591 of title 18, United States
Code.</DELETED>
<DELETED> (11) Targeted advertising.--</DELETED>
<DELETED> (A) In general.--The term ``targeted
advertising'' means displaying an advertisement to an
individual where the advertisement is selected based on
personal data about the individual to predict the
individual's preferences and interests.</DELETED>
<DELETED> (B) Exclusions.--Such term does not
include--</DELETED>
<DELETED> (i) advertising or marketing
directed to an individual in response to the
individual's request for information or express
selection of a product or service;</DELETED>
<DELETED> (ii) contextual advertising where
an advertisement is displayed to an individual
based on the content in which the advertisement
appears and does not vary based on who the
individual is; or</DELETED>
<DELETED> (iii) processing personal data
solely to measure or report advertising
performance, reach, or frequency.</DELETED>
<DELETED>SEC. 3. DUTY OF CARE.</DELETED>
<DELETED> (a) Prevention of Harm to Minors.--A covered platform
shall act in the best interests of a user that the platform knows or
reasonably should know is a minor by taking reasonable measures in its
design and operation of products and services to prevent and mitigate
the following:</DELETED>
<DELETED> (1) Consistent with evidence-informed medical
information, the following mental health disorders: anxiety,
depression, eating disorders, substance use disorders, and
suicidal behaviors.</DELETED>
<DELETED> (2) Patterns of use that indicate or encourage
addiction-like behaviors.</DELETED>
<DELETED> (3) Physical violence, online bullying, and
harassment of the minor.</DELETED>
<DELETED> (4) Sexual exploitation and abuse.</DELETED>
<DELETED> (5) Promotion and marketing of narcotic drugs (as
defined in section 102 of the Controlled Substances Act (21
U.S.C. 802)), tobacco products, gambling, or alcohol.</DELETED>
<DELETED> (6) Predatory, unfair, or deceptive marketing
practices, or other financial harms.</DELETED>
<DELETED> (b) Limitation.--Nothing in subsection (a) shall be
construed to require a covered platform to prevent or preclude--
</DELETED>
<DELETED> (1) any minor from deliberately and independently
searching for, or specifically requesting, content;
or</DELETED>
<DELETED> (2) the covered platform or individuals on the
platform from providing resources for the prevention or
mitigation of suicidal behaviors, substance use, and other
harms, including evidence-informed information and clinical
resources.</DELETED>
<DELETED>SEC. 4. SAFEGUARDS FOR MINORS.</DELETED>
<DELETED> (a) Safeguards for Minors.--</DELETED>
<DELETED> (1) Safeguards.--A covered platform shall provide
an individual that the covered platform knows or reasonably
should know is a minor with readily accessible and easy-to-use
safeguards to, as applicable--</DELETED>
<DELETED> (A) limit the ability of other individuals
to communicate with the minor;</DELETED>
<DELETED> (B) prevent other users, whether
registered or not, from viewing the minor's personal
data collected by or shared on the covered platform, in
particular restricting public access to personal
data;</DELETED>
<DELETED> (C) limit features that increase, sustain,
or extend use of the covered platform by the minor,
such as automatic playing of media, rewards for time
spent on the platform, notifications, and other
features that result in compulsive usage of the covered
platform by the minor;</DELETED>
<DELETED> (D) control personalized recommendation
systems, including the right to--</DELETED>
<DELETED> (i) opt out of such personalized
recommendation systems, while still allowing
the display of content based on a chronological
format; or</DELETED>
<DELETED> (ii) limit types or categories of
recommendations from such systems;
and</DELETED>
<DELETED> (E) restrict the sharing of the
geolocation of the minor and provide notice regarding
the tracking of the minor's geolocation.</DELETED>
<DELETED> (2) Options.--A covered platform shall provide an
individual that the covered platform knows or reasonably should
know is a minor with readily accessible and easy-to-use options
to--</DELETED>
<DELETED> (A) delete the minor's account and delete
any personal data collected from, or shared by, the
minor on the covered platform; or</DELETED>
<DELETED> (B) limit the amount of time spent by the
minor on the covered platform.</DELETED>
<DELETED> (3) Default safeguard settings for minors.--A
covered platform shall provide that, in the case of a user that
the platform knows or reasonably should know is a minor, the
default setting for any safeguard described under paragraph (1)
shall be the option available on the platform that provides the
most protective level of control that is offered by the
platform over privacy and safety for that user.</DELETED>
<DELETED> (b) Parental Tools.--</DELETED>
<DELETED> (1) Tools.--A covered platform shall provide
readily accessible and easy-to-use settings for parents to
support an individual that the platform knows or reasonably
should know is a minor with respect to the individual's use of
the platform.</DELETED>
<DELETED> (2) Requirements.--The parental tools provided by
a covered platform shall include--</DELETED>
<DELETED> (A) the ability to manage a minor's
privacy and account settings, including the safeguards
and options established under subsection (a), in a
manner that allows parents to--</DELETED>
<DELETED> (i) view the privacy and account
settings; and</DELETED>
<DELETED> (ii) in the case of a user that
the platform knows or reasonably should know is
a child, change and control the privacy and
account settings;</DELETED>
<DELETED> (B) the ability to restrict purchases and
financial transactions by the minor, where applicable;
and</DELETED>
<DELETED> (C) the ability to view metrics of total
time spent on the platform.</DELETED>
<DELETED> (3) Notice to minors.--A covered platform shall
provide clear and conspicuous notice to an individual that the
platform knows or reasonably should know is a minor when tools
described in this subsection are in effect and what settings or
controls have been applied.</DELETED>
<DELETED> (4) Default tools.--A covered platform shall
provide that, in the case of a user that the platform knows or
reasonably should know is a child, the tools described in this
subsection shall be enabled by default.</DELETED>
<DELETED> (c) Reporting Mechanism.--</DELETED>
<DELETED> (1) Reports submitted by parents, minors, and
schools.--A covered platform shall provide--</DELETED>
<DELETED> (A) a readily accessible and easy-to-use
means to submit reports to the covered platform of
harms to minors;</DELETED>
<DELETED> (B) an electronic point of contact
specific to matters involving harms to a minor;
and</DELETED>
<DELETED> (C) confirmation of the receipt of such a
report and a means to track a submitted
report.</DELETED>
<DELETED> (2) Timing.--A covered platform shall establish an
internal process to receive and substantively respond to
reports in a reasonable and timely manner, but in no case later
than--</DELETED>
<DELETED> (A) 7 days after the receipt of a report,
if, for the most recent calendar year, the platform
averaged more than 10,000,000 active users on a monthly
basis in the United States;</DELETED>
<DELETED> (B) 21 days after the receipt of a report,
if, for the most recent calendar year, the platform
averaged less than 10,000,000 active users on a monthly
basis in the United States; and</DELETED>
<DELETED> (C) notwithstanding subparagraphs (A) and
(B), if the report involves an imminent threat to the
safety of a minor, as promptly as needed to address the
reported threat to safety.</DELETED>
<DELETED> (d) Advertising of Illegal Products.--A covered platform
shall not facilitate the advertising of narcotic drugs (as defined in
section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco
products, gambling, or alcohol to an individual that the covered
platform knows or reasonably should know is a minor.</DELETED>
<DELETED> (e) Application.--</DELETED>
<DELETED> (1) Accessibility.--With respect to safeguards and
parental controls described under subsections (a) and (b), a
covered platform shall provide--</DELETED>
<DELETED> (A) information and control options in a
clear and conspicuous manner that takes into
consideration the differing ages, capacities, and
developmental needs of the minors most likely to access
the covered platform and does not encourage minors or
parents to weaken or disable safeguards or parental
controls;</DELETED>
<DELETED> (B) readily accessible and easy-to-use
controls to enable or disable safeguards or parental
controls, as appropriate; and</DELETED>
<DELETED> (C) information and control options in the
same language, form, and manner as the covered platform
provides the product or service used by minors and
their parents.</DELETED>
<DELETED> (2) Dark patterns prohibition.--It shall be
unlawful for any covered platform to design, modify, or
manipulate a user interface of a covered platform with the
purpose or substantial effect of subverting or impairing user
autonomy, decision-making, or choice in order to weaken or
disable safeguards or parental controls required under this
section.</DELETED>
<DELETED> (3) Rules of construction.--Nothing in this
section shall be construed to--</DELETED>
<DELETED> (A) prevent a covered platform from taking
reasonable measures to--</DELETED>
<DELETED> (i) block, detect, or prevent the
distribution of unlawful, obscene, or other
harmful material to minors as described in
section 3(a); or</DELETED>
<DELETED> (ii) block or filter spam, prevent
criminal activity, or protect the security of a
platform or service; or</DELETED>
<DELETED> (B) require the disclosure of a minor's
browsing behavior, search history, messages, contact
list, or other content or metadata of their
communications.</DELETED>
<DELETED>SEC. 5. DISCLOSURE.</DELETED>
<DELETED> (a) Notice.--</DELETED>
<DELETED> (1) Registration.--Prior to registration or
purchase of a covered platform by an individual that the
platform knows or reasonably should know is a minor, the
platform shall provide clear, conspicuous, and easy-to-
understand--</DELETED>
<DELETED> (A) notice of the policies and practices
of the covered platform with respect to personal data
and safeguards for minors;</DELETED>
<DELETED> (B) information about how to access the
safeguards and parental tools required under section 4;
and</DELETED>
<DELETED> (C) notice about whether the covered
platform, including any personalized recommendation
systems used by the platform, poses any heightened risks
of harms to minors.</DELETED>
<DELETED> (2) Parental notification.--</DELETED>
<DELETED> (A) Notice and acknowledgment.--In the
case of an individual that a covered platform knows or
reasonably should know is a child, the platform shall
additionally provide information about the parental
tools and safeguards required under section 4 to a
parent of the child and obtain express affirmative
acknowledgment from the parent prior to the initial use
of the covered platform by the child.</DELETED>
<DELETED> (B) Reasonable effort.--A covered platform
shall be deemed to have satisfied the requirement
described in subparagraph (A) if the covered platform
has undertaken a reasonable effort (taking into
consideration available technology) to ensure a parent
receives the information described in such subparagraph
and to obtain a parent's express affirmative
acknowledgment.</DELETED>
<DELETED> (3) Consolidated notices.--A covered platform may
consolidate the process for providing information and (if
applicable) obtaining parental acknowledgment as required under
this subsection with its obligations to obtain consent for data
privacy practices, provided the content of the notice meets the
requirements of this subsection.</DELETED>
<DELETED> (4) Rulemaking.--The Federal Trade Commission may
issue rules pursuant to section 553 of title 5, United States
Code, to establish templates or models of short-form notices
that include the minimum level of information and labels
necessary for the disclosures required under paragraph
(1).</DELETED>
<DELETED> (b) Personalized Recommendation System.--A covered
platform that operates personalized recommendation systems shall set
out in its terms and conditions, in a clear, conspicuous, and easy-to-
understand manner--</DELETED>
<DELETED> (1) an overview of how those personalized
recommendation systems are used by the covered platform to
provide information to users of the platform who are minors,
including how such systems use the personal data of minors;
and</DELETED>
<DELETED> (2) information about options for minors or their
parents to control personalized recommendation systems
(including by opting out of such systems).</DELETED>
<DELETED> (c) Advertising and Marketing Information and Labels.--
</DELETED>
<DELETED> (1) Information and labels.--A covered platform
that facilitates advertising aimed at users that the platform
knows or reasonably should know are minors shall provide clear,
conspicuous, and easy-to-understand information and labels to
minors on advertisements regarding--</DELETED>
<DELETED> (A) the name of the product, service, or
brand and the subject matter of an
advertisement;</DELETED>
<DELETED> (B) why the minor is being targeted for a
particular advertisement if the covered platform
engages in targeted advertising, including material
information about how the minor's personal data was
used to target the advertisement; and</DELETED>
<DELETED> (C) whether particular media displayed to
the minor is an advertisement or marketing material,
including disclosure of endorsements of products,
services, or brands made for commercial consideration
by other users of the platform.</DELETED>
<DELETED> (2) Rulemaking.--The Federal Trade Commission may
issue rules pursuant to section 553 of title 5, United States
Code, to establish templates or models of short-form notices
that include the minimum level of information and labels
necessary for the disclosures required under paragraph
(1).</DELETED>
<DELETED> (d) Resources for Parents and Minors.--A covered platform
shall provide to minors and parents clear, conspicuous, easy-to-
understand, and comprehensive information in a prominent location
regarding--</DELETED>
<DELETED> (1) its policies and practices with respect to
personal data and safeguards for minors; and</DELETED>
<DELETED> (2) how to access the safeguards and tools
required under section 4.</DELETED>
<DELETED> (e) Resources in Additional Languages.--A covered platform
shall ensure, to the extent practicable, that the disclosures required
by this section are made available in the same language, form, and
manner as the covered platform provides any product or service used by
minors and their parents.</DELETED>
<DELETED>SEC. 6. TRANSPARENCY.</DELETED>
<DELETED> (a) In General.--Subject to subsection (b), not less
frequently than once a year, a covered platform shall issue a public
report identifying the reasonably foreseeable risk of material harms to
minors and describing the prevention and mitigation measures taken to
address such risk based on an independent, third-party audit conducted
through reasonable inspection of the covered platform.</DELETED>
<DELETED> (b) Scope of Application.--The requirements of this
section shall apply to a covered platform if--</DELETED>
<DELETED> (1) for the most recent calendar year, the
platform averaged more than 10,000,000 active users on a
monthly basis in the United States; and</DELETED>
<DELETED> (2) the platform predominantly provides a
community forum for user-generated content and discussion,
including sharing videos, images, games, audio files,
discussion in a virtual setting, or other content, such as
acting as a social media platform, virtual reality environment,
or a social network service.</DELETED>
<DELETED> (c) Content.--</DELETED>
<DELETED> (1) Transparency.--The public reports required of
a covered platform under this section shall include--</DELETED>
<DELETED> (A) an assessment of the extent to which
the platform is likely to be accessed by
minors;</DELETED>
<DELETED> (B) a description of the commercial
interests of the covered platform in use by
minors;</DELETED>
<DELETED> (C) an accounting, based on the data held
by the covered platform, of--</DELETED>
<DELETED> (i) the number of individuals
using the covered platform reasonably believed
to be minors in the United States,
disaggregated by the age ranges of 0-5, 6-9,
10-12, and 13-16; and</DELETED>
<DELETED> (ii) the median and mean amounts
of time spent on the platform by minors in the
United States who have accessed the platform
during the reporting year on a daily, weekly,
and monthly basis, disaggregated by the age
ranges of 0-5, 6-9, 10-12, and 13-16;</DELETED>
<DELETED> (D) an accounting of total reports
received regarding, and the prevalence (which can be
based on scientifically valid sampling methods using
the content available to the covered platform in the
normal course of business) of content related to, the
harms described in section 3(a), disaggregated by
category of harm; and</DELETED>
<DELETED> (E) a description of any material breaches
of parental tools or assurances regarding minors,
representations regarding the use of the personal data
of minors, and other matters regarding non-
compliance.</DELETED>
<DELETED> (2) Systemic risks assessment.--The public reports
required of a covered platform under this section shall
include--</DELETED>
<DELETED> (A) an assessment of the reasonably
foreseeable risk of harms to minors posed by the
covered platform, including identifying any other
physical, mental, developmental, or financial harms in
addition to those described in section 3(a);</DELETED>
<DELETED> (B) an assessment of how recommendation
systems and targeted advertising systems can contribute
to harms to minors;</DELETED>
<DELETED> (C) a description of whether and how the
covered platform uses system design features that
increase, sustain, or extend use of a product or
service by a minor, such as automatic playing of media,
rewards for time spent, and notifications;</DELETED>
<DELETED> (D) a description of whether, how, and for
what purpose the platform collects or processes
categories of personal data that may cause reasonably
foreseeable risk of harms to minors;</DELETED>
<DELETED> (E) an evaluation of the efficacy of
safeguards for minors under section 4, and any issues
in delivering such safeguards and the associated
parental tools; and</DELETED>
<DELETED> (F) an evaluation of any other relevant
matters of public concern over risk of harms to
minors.</DELETED>
<DELETED> (3) Mitigation.--The public reports required of a
covered platform under this section shall include--</DELETED>
<DELETED> (A) a description of the safeguards and
parental tools available to minors and parents on the
covered platform;</DELETED>
<DELETED> (B) a description of interventions by the
covered platform when it had or has reason to believe
that harms to minors could occur;</DELETED>
<DELETED> (C) a description of the prevention and
mitigation measures intended to be taken in response to
the known and emerging risks identified in its
assessment of systemic risks, including steps taken to--
</DELETED>
<DELETED> (i) prevent harms to minors,
including adapting or removing system design
features or addressing through parental
controls;</DELETED>
<DELETED> (ii) provide the most protective
level of control over privacy and safety by
default; and</DELETED>
<DELETED> (iii) adapt recommendation systems
to prioritize the best interests of users who
are minors, as described in section
3(a);</DELETED>
<DELETED> (D) a description of internal processes
for handling reports and automated detection mechanisms
for harms to minors, including the rate, timeliness,
and effectiveness of responses under the requirement of
section 4(c);</DELETED>
<DELETED> (E) the status of implementing prevention
and mitigation measures identified in prior
assessments; and</DELETED>
<DELETED> (F) a description of the additional
measures to be taken by the covered platform to address
the circumvention of safeguards for minors and parental
tools.</DELETED>
<DELETED> (d) Reasonable Inspection.--In conducting an inspection of
the systemic risks of harm to minors under this section, an
independent, third-party auditor shall--</DELETED>
<DELETED> (1) take into consideration the function of
recommendation systems;</DELETED>
<DELETED> (2) consult parents and youth experts, including
youth and families with relevant past or current experience,
public health and mental health nonprofit organizations, health
and development organizations, and civil society with respect
to the prevention of harms to minors;</DELETED>
<DELETED> (3) conduct research based on experiences of
minors that use the covered platform, including reports under
section 4(c) and information provided by law
enforcement;</DELETED>
<DELETED> (4) take account of research, including research
regarding system design features, marketing, or product
integrity, industry best practices, or outside research;
and</DELETED>
<DELETED> (5) consider indicia or inferences of age of
users, in addition to any self-declared information about the
age of individuals.</DELETED>
<DELETED> (e) Cooperation With Independent, Third-Party Audit.--To
facilitate the report required by subsection (c), a covered platform
shall--</DELETED>
<DELETED> (1) provide or otherwise make available to the
independent third-party conducting the audit all information
and material in its possession, custody, or control that is
relevant to the audit;</DELETED>
<DELETED> (2) provide or otherwise make available to the
independent third-party conducting the audit access to all
networks, systems, and assets relevant to the audit;
and</DELETED>
<DELETED> (3) disclose all relevant facts to the independent
third-party conducting the audit, and not misrepresent in any
manner, expressly or by implication, any relevant
fact.</DELETED>
<DELETED> (f) Privacy Safeguards.--</DELETED>
<DELETED> (1) In issuing the public reports required under
this section, a covered platform shall take steps to safeguard
the privacy of its users, including ensuring that data is
presented in a de-identified, aggregated format such that it is
reasonably impossible for the data to be linked back to any
individual user.</DELETED>
<DELETED> (2) This section shall not be construed to require
the disclosure of information that will lead to material
vulnerabilities for the privacy of users or the security of a
covered platform's service or create a significant risk of the
violation of Federal or State law.</DELETED>
<DELETED> (g) Location.--The public reports required under this
section should be posted by a covered platform in an easy-to-find
location on a publicly available website.</DELETED>
<DELETED>SEC. 7. INDEPENDENT RESEARCH.</DELETED>
<DELETED> (a) Definitions.--In this section:</DELETED>
<DELETED> (1) Assistant secretary.--The term ``Assistant
Secretary'' means the Assistant Secretary of Commerce for
Communications and Information.</DELETED>
<DELETED> (2) De-identified data.--The term ``de-identified
data'' means information--</DELETED>
<DELETED> (A) that does not identify and is not
linked or reasonably linkable to an individual or an
individual's device; and</DELETED>
<DELETED> (B) with respect to which a covered
platform or researcher takes reasonable technical and
contractual measures to ensure that the information is
not used to re-identify any individual or individual's
device.</DELETED>
<DELETED> (3) Eligible researcher.--</DELETED>
<DELETED> (A) In general.--The term ``eligible
researcher'' means an individual or group of
individuals affiliated with or employed by--</DELETED>
<DELETED> (i) an institution of higher
education (as defined in section 101 of the
Higher Education Act of 1965 (20 U.S.C. 1001));
or</DELETED>
<DELETED> (ii) a nonprofit organization
described in section 501(c)(3) of the Internal
Revenue Code of 1986.</DELETED>
<DELETED> (B) Limitation.--Such term shall not
include an individual or group of individuals that is--
</DELETED>
<DELETED> (i) not located in the United
States; or</DELETED>
<DELETED> (ii) affiliated with the
government of a foreign adversary (as defined
in section 8(c)(2) of the Secure and Trusted
Communications Networks Act of 2019 (47 U.S.C.
1607(c)(2))).</DELETED>
<DELETED> (4) Independent research.--The term ``independent
research'' means the scientific or historical analysis of
information that is performed for the primary purpose of
advancing understanding, knowledge, and remedies regarding the
harms to minors described in section 3(a).</DELETED>
<DELETED> (5) Noncommercial purpose.--The term
``noncommercial purpose'' means a purpose that does not involve
any direct or indirect use of data sets for the sale, resale,
solicitation, rental, or lease of a service, or any use by
which the user expects a profit, including the sale to the
general public of a publication containing independent
research.</DELETED>
<DELETED> (6) Program.--The term ``Program'' means the
program established under subsection (b)(1).</DELETED>
<DELETED> (7) Qualified researcher.--The term ``qualified
researcher'' means an eligible researcher who is approved by
the Assistant Secretary to conduct independent research
regarding harms to minors under the Program.</DELETED>
<DELETED> (b) Independent Research Program Relating to Identified
Harms to Minors.--</DELETED>
<DELETED> (1) Establishment.--Subject to paragraph (2), the
Assistant Secretary shall establish a program, with public
notice and an opportunity to comment, under which an eligible
researcher may apply for, and a covered platform shall provide,
access to data sets from the covered platform for the sole
purpose of conducting independent research regarding the harms
described in section 3(a).</DELETED>
<DELETED> (2) Scope of application.--The requirements of
this subsection shall apply to a covered platform if--
</DELETED>
<DELETED> (A) for the most recent calendar year, the
platform averaged more than 10,000,000 active users on
a monthly basis in the United States; and</DELETED>
<DELETED> (B) the platform predominantly provides a
community forum for user-generated content and
discussion, including sharing videos, images, games,
audio files, discussion in a virtual setting, or other
content, such as acting as a social media platform,
virtual reality environment, or social network
service.</DELETED>
<DELETED> (3) Processes, procedures, and standards.--Not
later than 1 year after the date of enactment of this Act, the
Assistant Secretary shall establish for the program established
under this subsection--</DELETED>
<DELETED> (A) definitions for data sets (related to
harms described in section 3(a)) that qualify for
disclosure to researchers under the program and
standards of access for data sets to be provided under
the program;</DELETED>
<DELETED> (B) a process by which an eligible
researcher may submit an application described in
paragraph (1);</DELETED>
<DELETED> (C) an appeals process for eligible
researchers to appeal adverse decisions on applications
described in paragraph (1) (including a decision to
grant an appeal under paragraph (4)(C));</DELETED>
<DELETED> (D) procedures for implementation of the
program, including methods for--</DELETED>
<DELETED> (i) participation by covered
platforms;</DELETED>
<DELETED> (ii) evaluation of researcher
proposals for alignment with program objectives
and scoping; and</DELETED>
<DELETED> (iii) verification by the
Assistant Secretary of the credentials of
eligible researchers, and processes for applying
for, or being disqualified from, participation
in the program;</DELETED>
<DELETED> (E) standards for privacy, security, and
confidentiality required to participate in the program,
including rules to ensure that the privacy and safety
of users is not infringed by the program;</DELETED>
<DELETED> (F) a mechanism to allow individuals to
control the use of their personal data under the
program, including the ability to opt out of the
program;</DELETED>
<DELETED> (G) standards for transparency regarding
the operation and administration of the program;
and</DELETED>
<DELETED> (H) rules to prevent requests for data
sets that present financial conflicts of interest,
including efforts by covered platforms to gain a
competitive advantage by directly funding data access
requests, the use of qualified researcher status for
commercial gain, or efforts by covered platforms to
obtain access to intellectual property that is
otherwise protected by law.</DELETED>
<DELETED> (4) Duties and rights of covered platforms.--
</DELETED>
<DELETED> (A) Access to data sets.--</DELETED>
<DELETED> (i) In general.--If the Assistant
Secretary approves an application under
paragraph (1) with respect to a covered
platform, the covered platform shall, in a
timely manner, provide the qualified researcher
with access to data sets necessary to conduct
independent research described in that
paragraph.</DELETED>
<DELETED> (ii) Limitations.--Nothing in this
section shall be construed to require a covered
platform to provide access to data sets that
are intellectual property protected by Federal
law, trade secrets, or commercial or financial
information.</DELETED>
<DELETED> (iii) Form of access.--A covered
platform shall provide to a qualified
researcher access to data sets under clause (i)
through online databases, application
programming interfaces, and data files as
appropriate.</DELETED>
<DELETED> (B) Nondisclosure agreement.--A covered
platform may require, as a condition of access to the
data sets of the covered platform, that a qualified
researcher enter into a nondisclosure agreement
restricting the release of data sets, provided that--
</DELETED>
<DELETED> (i) the agreement does not
restrict the publication or discussion
regarding the qualified researcher's findings;
and</DELETED>
<DELETED> (ii) the terms of the agreement
allow the qualified researcher to provide the
original agreement or a copy of the agreement
to the Assistant Secretary.</DELETED>
<DELETED> (C) Appeal.--</DELETED>
<DELETED> (i) Agency appeal.--A covered
platform may appeal the granting of an
application under paragraph (1) on the grounds
that, and the Assistant Secretary shall grant
such appeal if--</DELETED>
<DELETED> (I) the covered platform
does not have access to the requested
data sets or the requested data sets
are not reasonably tailored to the
application; or
<DELETED> (II) providing access to
the data sets will lead to material
vulnerabilities for the privacy of
users or the security of the covered
platform's service or create a
significant risk of the violation of
Federal or State law.</DELETED>
<DELETED> (ii) Judicial review.--A decision
of the Assistant Secretary with respect to an
appeal under clause (i) shall be considered to
be a final agency action for purposes of
judicial review under chapter 7 of title 5,
United States Code.</DELETED>
<DELETED> (iii) Alternative means of
fulfillment.--As part of an appeal under clause
(i) that is made on the basis of subclause (II)
of such clause, a covered platform shall
propose one or more alternative data sets or
means of accessing the requested data sets that
are appropriate and sufficient to fulfill the
purpose of the application, or shall explain
why there are no alternative data sets or means
of access which acceptably mitigate the
applicable privacy, security, or legal
concerns.</DELETED>
<DELETED>                    (D) Timing.--A covered platform to which
                this provision applies shall participate in the program
                established under this subsection not later than 2
                years after the date of enactment of this Act.</DELETED>
<DELETED> (5) Application requirements.--In order to be
approved to access data sets from a covered platform, an
eligible researcher shall, in the application submitted under
paragraph (1)--</DELETED>
<DELETED> (A) explain the purpose for which the
independent research is undertaken;</DELETED>
<DELETED> (B) commit to conduct the research for
noncommercial purposes;</DELETED>
<DELETED> (C) demonstrate a proven record of
expertise on the proposed research topic and related
research methodologies;</DELETED>
<DELETED> (D) if the eligible researcher is seeking
access to data sets that include personal data, explain
why the data sets are requested, and why the means through
which such data sets shall be accessed are the least
sensitive and the most privacy-protective means that
will permit completion of the research and not
compromise the privacy or safety of users;
and</DELETED>
<DELETED> (E) commit to fulfill, and demonstrate a
capacity to fulfill, the specific data security and
confidentiality requirements corresponding to the
application.</DELETED>
<DELETED> (6) Privacy and duty of confidentiality.--
</DELETED>
<DELETED> (A) Researcher confidentiality.--To
protect user privacy, a qualified researcher shall keep
data sets provided by a covered platform under the
program confidential and secure to the specifications
set forth under the program rules and the approved
application.</DELETED>
<DELETED> (B) Platform confidentiality.--A covered
platform shall use reasonable measures to enable
researcher access to data sets under the program in a
secure and privacy-protective manner, including through
the de-identification of personal data or use of other
privacy-enhancing technologies.</DELETED>
<DELETED> (C) Federal agencies.--Nothing in this
subsection shall be construed to authorize--</DELETED>
<DELETED> (i) a Federal agency to seek
access to the data of a covered platform
through the program; or</DELETED>
<DELETED> (ii) a qualified researcher to
transfer or share any data sets provided by a
covered platform under the program with a
Federal agency.</DELETED>
<DELETED> (D) Security.--Nothing in this subsection
shall be construed in a manner that would result in
data sets from a covered platform being transferred to
the Government of the People's Republic of China or the
government of another foreign adversary (as defined in
section 8(c)(2) of the Secure and Trusted
Communications Networks Act of 2019 (47 U.S.C.
1607(c)(2))).</DELETED>
<DELETED> (c) Safe Harbor for Collection of Data for Independent
Research Regarding Identified Harms to Minors.--If, in the course of
conducting independent research for noncommercial purposes regarding
harms described in section 3(a) (without regard to whether such
research is conducted under the program), an eligible researcher
collects or uses data from a covered platform in a manner that violates
the terms of service of the platform, no cause of action based on such
violation shall lie or be maintained in any court against such
researcher unless the violation relates to the failure of the
researcher to take reasonable measures to protect user privacy and
security.</DELETED>
<DELETED> (d) Rulemaking.--The Assistant Secretary, in consultation
with the Secretary of Commerce, the Director of the National Institute
of Standards and Technology, the Director of the National Science
Foundation, and the Director of the National Institutes of Health, shall
promulgate rules in accordance with section 553 of title 5, United
States Code, as necessary to implement this section.</DELETED>
<DELETED>SEC. 8. MARKET RESEARCH.</DELETED>
<DELETED> (a) Market Research by Covered Platforms.--The Federal
Trade Commission, in consultation with the Secretary of Commerce, shall
issue guidance for covered platforms seeking to conduct market- and
product-focused research on minors. Such guidance shall include--
</DELETED>
<DELETED> (1) a standard consent form that provides minors
and their parents a clear, conspicuous, and easy-to-understand
explanation of the scope and purpose of the research to be
conducted, and provides an opportunity for informed consent;
and</DELETED>
<DELETED> (2) recommendations for research practices for
studies that may include minors, disaggregated by the age
ranges of 0-5, 6-9, 10-12, and 13-16.</DELETED>
<DELETED> (b) Timing.--The Federal Trade Commission shall issue such
guidance not later than 18 months after the date of enactment of this
Act. In doing so, the Commission shall seek input from members of the
public and representatives of the Kids Online Safety Council established under
section 12.</DELETED>
<DELETED>SEC. 9. AGE VERIFICATION STUDY AND REPORT.</DELETED>
<DELETED> (a) Study.--The Director of the National Institute of
Standards and Technology, in coordination with the Federal
Communications Commission, Federal Trade Commission, and the Secretary
of Commerce, shall conduct a study evaluating the most technologically
feasible methods and options for developing systems to verify age at
the device or operating system level.</DELETED>
<DELETED> (b) Contents.--Such study shall consider--</DELETED>
<DELETED> (1) the benefits of creating a device or operating
system level age verification system;</DELETED>
<DELETED> (2) what information may need to be collected to
create this type of age verification system;</DELETED>
<DELETED>        (3) the accuracy of such systems and their impact
        on accessibility, including steps to improve accessibility for
        individuals with disabilities;</DELETED>
<DELETED> (4) how such a system or systems could verify age
while mitigating risks to user privacy and data security and
safeguarding minors' personal data, emphasizing minimizing the
amount of data collected and processed by covered platforms and
age verification providers for such a system; and</DELETED>
<DELETED> (5) the technical feasibility, including the need
for potential hardware and software changes, including for
devices currently in commerce and owned by consumers.</DELETED>
<DELETED> (c) Report.--Not later than 1 year after the date of
enactment of this Act, the agencies described in subsection (a) shall
submit a report containing the results of the study conducted under
such subsection to the Committee on Commerce, Science, and
Transportation of the Senate and the Committee on Energy and Commerce
of the House of Representatives.</DELETED>
<DELETED>SEC. 10. GUIDANCE.</DELETED>
<DELETED> (a) In General.--Not later than 18 months after the date
of enactment of this Act, the Federal Trade Commission, in consultation
with the Kids Online Safety Council established under section 12, shall
issue guidance to--</DELETED>
<DELETED> (1) provide information and examples for covered
platforms and auditors regarding--</DELETED>
<DELETED> (A) identifying features that are used to
increase, sustain, or extend use of the covered
platform by a minor;</DELETED>
<DELETED> (B) safeguarding minors against the
possible misuse of parental tools;</DELETED>
<DELETED> (C) best practices in providing minors and
parents the most protective level of control over
privacy and safety;</DELETED>
<DELETED> (D) using indicia or inferences of age of
users for assessing use of the covered platform by
minors;</DELETED>
<DELETED> (E) methods for evaluating the efficacy of
safeguards; and</DELETED>
<DELETED> (F) providing additional control options
that allow parents to address the harms described in
section 3(a); and</DELETED>
<DELETED> (2) outline conduct that does not have the purpose
or substantial effect of subverting or impairing user autonomy,
decision-making, or choice, or of causing, increasing, or
encouraging compulsive usage for a minor, such as--</DELETED>
<DELETED> (A) de minimis user interface changes
derived from testing consumer preferences, including
different styles, layouts, or text, where such changes
are not done with the purpose of weakening or disabling
safeguards or parental controls;</DELETED>
<DELETED> (B) algorithms or data outputs outside the
control of a covered platform; and</DELETED>
<DELETED> (C) establishing default settings that
provide enhanced privacy protection to users or
otherwise enhance their autonomy and decision-making
ability.</DELETED>
<DELETED> (b) Guidance to Schools.--Not later than 18 months after
the date of enactment of this Act, the Secretary of Education, in
consultation with the Federal Trade Commission and the Kids Online
Safety Council established under section 12, shall issue guidance to
assist elementary and secondary schools in using the notice,
safeguards, and tools provided under this Act and in providing information
on online safety for students and teachers.</DELETED>
<DELETED> (c) Limitation on Federal Trade Commission Guidance.--
</DELETED>
<DELETED> (1) Effect of guidance.--No guidance issued by the
Federal Trade Commission with respect to this Act shall--
</DELETED>
<DELETED> (A) confer any rights on any person,
State, or locality; or</DELETED>
<DELETED> (B) operate to bind the Federal Trade
Commission or any person to the approach recommended in
such guidance.</DELETED>
<DELETED> (2) Use in enforcement actions.--In any
enforcement action brought pursuant to this Act, the Federal
Trade Commission--</DELETED>
<DELETED> (A) shall allege a violation of a
provision of this Act; and</DELETED>
<DELETED> (B) may not base such enforcement action
on, or execute a consent order based on, practices that
are alleged to be inconsistent with guidance issued by
the Federal Trade Commission with respect to this Act,
unless the practices are alleged to violate a provision
of this Act.</DELETED>
<DELETED>SEC. 11. ENFORCEMENT.</DELETED>
<DELETED> (a) Enforcement by Federal Trade Commission.--</DELETED>
<DELETED> (1) Unfair and deceptive acts or practices.--A
violation of this Act or a regulation promulgated under this
Act shall be treated as a violation of a rule defining an
unfair or deceptive act or practice prescribed under section
18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C.
57a(a)(1)(B)).</DELETED>
<DELETED> (2) Powers of the commission.--</DELETED>
<DELETED> (A) In general.--The Federal Trade
Commission (referred to in this section as the
``Commission'') shall enforce this Act and any
regulation promulgated under this Act in the same
manner, by the same means, and with the same
jurisdiction, powers, and duties as though all
applicable terms and provisions of the Federal Trade
Commission Act (15 U.S.C. 41 et seq.) were incorporated
into and made a part of this Act.</DELETED>
<DELETED> (B) Privileges and immunities.--Any person
that violates this Act or a regulation promulgated
under this Act shall be subject to the penalties, and
entitled to the privileges and immunities, provided in
the Federal Trade Commission Act (15 U.S.C. 41 et
seq.).</DELETED>
<DELETED> (3) Authority preserved.--Nothing in this Act
shall be construed to limit the authority of the Commission
under any other provision of law.</DELETED>
<DELETED> (b) Enforcement by State Attorneys General.--</DELETED>
<DELETED> (1) In general.--</DELETED>
<DELETED> (A) Civil actions.--In any case in which
the attorney general of a State has reason to believe
that an interest of the residents of that State has
been or is threatened or adversely affected by the
engagement of any person in a practice that violates
this Act or a regulation promulgated under this Act,
the State, as parens patriae, may bring a civil action
on behalf of the residents of the State in a district
court of the United States or a State court of
appropriate jurisdiction to--</DELETED>
<DELETED> (i) enjoin that
practice;</DELETED>
<DELETED> (ii) enforce compliance with this
Act or such regulation;</DELETED>
<DELETED> (iii) on behalf of residents of
the State, obtain damages, restitution, or
other compensation, each of which shall be
distributed in accordance with State law;
or</DELETED>
<DELETED> (iv) obtain such other relief as
the court may consider to be
appropriate.</DELETED>
<DELETED> (B) Notice.--</DELETED>
<DELETED> (i) In general.--Before filing an
action under subparagraph (A), the attorney
general of the State involved shall provide to
the Commission--</DELETED>
<DELETED> (I) written notice of that
action; and</DELETED>
<DELETED> (II) a copy of the
complaint for that action.</DELETED>
<DELETED> (ii) Exemption.--</DELETED>
<DELETED> (I) In general.--Clause
(i) shall not apply with respect to the
filing of an action by an attorney
general of a State under this paragraph
if the attorney general of the State
determines that it is not feasible to
provide the notice described in that
clause before the filing of the
action.</DELETED>
<DELETED> (II) Notification.--In an
action described in subclause (I), the
attorney general of a State shall
provide notice and a copy of the
complaint to the Commission at the same
time as the attorney general files the
action.</DELETED>
<DELETED> (2) Intervention.--</DELETED>
<DELETED> (A) In general.--On receiving notice under
paragraph (1)(B), the Commission shall have the right
to intervene in the action that is the subject of the
notice.</DELETED>
<DELETED> (B) Effect of intervention.--If the
Commission intervenes in an action under paragraph (1),
it shall have the right--</DELETED>
<DELETED> (i) to be heard with respect to
any matter that arises in that action;
and</DELETED>
<DELETED> (ii) to file a petition for
appeal.</DELETED>
<DELETED> (3) Construction.--For purposes of bringing any
civil action under paragraph (1), nothing in this Act shall be
construed to prevent an attorney general of a State from
exercising the powers conferred on the attorney general by the
laws of that State to--</DELETED>
<DELETED> (A) conduct investigations;</DELETED>
<DELETED> (B) administer oaths or affirmations;
or</DELETED>
<DELETED> (C) compel the attendance of witnesses or
the production of documentary and other
evidence.</DELETED>
<DELETED> (4) Actions by the commission.--In any case in
which an action is instituted by or on behalf of the Commission
for violation of this Act or a regulation promulgated under
this Act, no State may, during the pendency of that action,
institute a separate action under paragraph (1) against any
defendant named in the complaint in the action instituted by or
on behalf of the Commission for that violation.</DELETED>
<DELETED> (5) Venue; service of process.--</DELETED>
<DELETED> (A) Venue.--Any action brought under
paragraph (1) may be brought in--</DELETED>
<DELETED> (i) the district court of the
United States that meets applicable
requirements relating to venue under section
1391 of title 28, United States Code;
or</DELETED>
<DELETED> (ii) a State court of competent
jurisdiction.</DELETED>
<DELETED> (B) Service of process.--In an action
brought under paragraph (1) in a district court of the
United States, process may be served wherever the
defendant--</DELETED>
<DELETED> (i) is an inhabitant; or</DELETED>
<DELETED> (ii) may be found.</DELETED>
<DELETED>SEC. 12. KIDS ONLINE SAFETY COUNCIL.</DELETED>
<DELETED> (a) Establishment.--Not later than 180 days after the date
of enactment of this Act, the Secretary of Commerce shall establish and
convene the Kids Online Safety Council for the purpose of providing
advice on matters related to this Act.</DELETED>
<DELETED> (b) Participation.--The Kids Online Safety Council shall
include diverse participation from--</DELETED>
<DELETED> (1) academic experts, health professionals, and
members of civil society with expertise in mental health,
substance use disorders, and the prevention of harms to
minors;</DELETED>
<DELETED> (2) representatives in academia and civil society
with specific expertise in privacy and civil
liberties;</DELETED>
<DELETED> (3) parents and youth representation;</DELETED>
<DELETED> (4) representatives of covered
platforms;</DELETED>
<DELETED> (5) representatives of the National
Telecommunications and Information Administration, the National
Institute of Standards and Technology, the Federal Trade
Commission, the Department of Justice, and the Department of
Health and Human Services;</DELETED>
<DELETED> (6) State attorneys general or their designees
acting in State or local government; and</DELETED>
<DELETED> (7) representatives of communities of socially
disadvantaged individuals (as defined in section 8 of the Small
Business Act (15 U.S.C. 637)).</DELETED>
<DELETED> (c) Activities.--The matters to be addressed by the Kids
Online Safety Council shall include--</DELETED>
<DELETED> (1) identifying emerging or current risks of harms
to minors associated with online platforms;</DELETED>
<DELETED> (2) recommending measures and methods for
assessing, preventing, and mitigating harms to minors
online;</DELETED>
<DELETED> (3) recommending methods and themes for conducting
research regarding online harms to minors; and</DELETED>
<DELETED> (4) recommending best practices and clear,
consensus-based technical standards for transparency reports
and audits, as required under this Act, including methods,
criteria, and scope to promote overall
accountability.</DELETED>
<DELETED>SEC. 13. EFFECTIVE DATE.</DELETED>
<DELETED> Except as otherwise provided in this Act, this Act shall
take effect on the date that is 18 months after the date of enactment
of this Act.</DELETED>
<DELETED>SEC. 14. RULES OF CONSTRUCTION AND OTHER MATTERS.</DELETED>
<DELETED> (a) Relationship to Other Laws.--Nothing in this Act shall
be construed to--</DELETED>
<DELETED> (1) preempt section 444 of the General Education
Provisions Act (20 U.S.C. 1232g, commonly known as the ``Family
Educational Rights and Privacy Act of 1974'') or other Federal
or State laws governing student privacy;</DELETED>
<DELETED> (2) preempt the Children's Online Privacy
Protection Act of 1998 (15 U.S.C. 6501 et seq.) or any rule or
regulation promulgated under such Act; or</DELETED>
<DELETED> (3) authorize any action that would conflict with
section 18(h) of the Federal Trade Commission Act (15 U.S.C.
57a(h)).</DELETED>
<DELETED> (b) Protections for Privacy.--Nothing in this Act shall be
construed to require--</DELETED>
<DELETED> (1) the affirmative collection of any personal
data with respect to the age of users that a covered platform
is not already collecting in the normal course of business;
or</DELETED>
<DELETED> (2) a covered platform to implement an age gating
or age verification functionality.</DELETED>
<DELETED> (c) Compliance.--Nothing in this Act shall be construed to
restrict a covered platform's ability to--</DELETED>
<DELETED> (1) cooperate with law enforcement agencies
regarding activity that the covered platform reasonably and in
good faith believes may violate Federal, State, or local laws,
rules, or regulations;</DELETED>
<DELETED> (2) comply with a civil, criminal, or regulatory
inquiry or any investigation, subpoena, or summons by Federal,
State, local, or other government authorities; or</DELETED>
<DELETED> (3) investigate, establish, exercise, respond to,
or defend against legal claims.</DELETED>
<DELETED>SEC. 15. SEVERABILITY.</DELETED>
<DELETED> If any provision of this Act, or an amendment made by this
Act, is determined to be unenforceable or invalid, the remaining
provisions of this Act and the amendments made by this Act shall not be
affected.</DELETED>
SECTION 1. SHORT TITLE; TABLE OF CONTENTS.
(a) Short Title.--This Act may be cited as the ``Kids Online Safety
Act''.
(b) Table of Contents.--The table of contents for this Act is as
follows:
Sec. 1. Short title; table of contents.
Sec. 2. Definitions.
Sec. 3. Duty of care.
Sec. 4. Safeguards for minors.
Sec. 5. Disclosure.
Sec. 6. Transparency.
Sec. 7. Independent research on social media and minors.
Sec. 8. Market research.
Sec. 9. Age verification study and report.
Sec. 10. Guidance.
Sec. 11. Enforcement.
Sec. 12. Kids online safety council.
Sec. 13. Filter bubble transparency requirements.
Sec. 14. Effective date.
Sec. 15. Rules of construction and other matters.
Sec. 16. Severability.
SEC. 2. DEFINITIONS.
In this Act:
(1) Child.--The term ``child'' means an individual who is
under the age of 13.
(2) Compulsive usage.--The term ``compulsive usage'' means
any response stimulated by external factors that causes an
individual to engage in repetitive behavior reasonably likely
to cause psychological distress, loss of control, anxiety, or
depression.
(3) Covered platform.--
(A) In general.--The term ``covered platform''
means an online platform, online video game, messaging
application, or video streaming service that connects
to the internet and that is used, or is reasonably
likely to be used, by a minor.
(B) Exceptions.--The term ``covered platform'' does
not include--
(i) an entity acting in its capacity as a
provider of--
(I) a common carrier service
subject to the Communications Act of
1934 (47 U.S.C. 151 et seq.) and all
Acts amendatory thereof and
supplementary thereto;
(II) a broadband internet access
service (as such term is defined for
purposes of section 8.1(b) of title 47,
Code of Federal Regulations, or any
successor regulation);
(III) an email service;
(IV) a teleconferencing or video
conferencing service that allows
reception and transmission of audio and
video signals for real-time
communication, provided that--
(aa) the service is not an
online platform, including a
social media service or
social network; and
(bb) the real-time
communication is initiated by
using a unique link or
identifier to facilitate access;
or
(V) a wireless messaging service,
including such a service provided
through short messaging service or
multimedia messaging service protocols,
that is not a component of or linked to
an online platform and where the
predominant or exclusive function is
direct messaging consisting of the
transmission of text, photos or videos
that are sent by electronic means,
where messages are transmitted from the
sender to a recipient, and are not
posted within an online platform or
publicly;
(ii) an organization not organized to carry
on business for its own profit or that of its
members;
(iii) any public or private preschool,
elementary, or secondary school, or any
institution of vocational, professional, or
higher education;
(iv) a library (as defined in section
213(1) of the Library Services and Technology
Act (20 U.S.C. 9122(1)));
(v) a news website or app where--
(I) the inclusion of video content
on the website or app is related to the
website or app's own gathering,
reporting, or publishing of news
content; and
(II) the website or app is not
otherwise an online platform;
(vi) a product or service that primarily
functions as business-to-business software; or
(vii) a virtual private network or similar
service that exists solely to route internet
traffic between locations.
(4) Geolocation.--The term ``geolocation'' means
information sufficient to identify street name and name of a
city or town.
(5) Individual-specific advertising to minors.--
(A) In general.--The term ``individual-specific
advertising to minors'' means advertising or any other
effort to market a product or service that is directed
to a specific minor or a device that is linked or
reasonably linkable to a minor--
(i) based on--
(I) the personal data of--
(aa) the minor; or
(bb) a group of minors who
are similar in sex, age, income
level, race, or ethnicity to
the specific minor to whom the
product or service is marketed;
(II) psychological profiling of a
minor or group of minors; or
(III) a unique identifier of the
device; or
(ii) as a result of use by the minor,
access by any device of the minor, or use by a
group of minors who are similar to the specific
minor, of more than a single--
(I) website;
(II) online service;
(III) online application;
(IV) mobile application; or
(V) connected device.
(B) Exclusions.--The term ``individual-specific
advertising to minors'' shall not include--
(i) advertising or marketing to an
individual or the device of an individual in
response to the individual's specific request
for information or feedback, such as a minor's
current search query;
(ii) contextual advertising, such as when
an advertisement is displayed based on the
content of the covered platform on which the
advertisement appears and does not vary based
on personal information related to the viewer;
(iii) processing personal information
solely for measuring or reporting advertising
or content performance, reach, or frequency,
including independent measurement.
(C) Rule of construction.--Nothing in subparagraph
(A) shall be construed to prohibit a covered platform
with actual knowledge or knowledge fairly implied on
the basis of objective circumstances that an individual
is under the age of 17 from delivering advertising or
marketing that is age-appropriate for the individual
involved and intended for a child or teen audience (as
applicable), so long as the covered platform does not
use any personal data other than whether the user is
under the age of 17 to deliver such advertising or
marketing.
(6) Know or knows.--The term ``know'' or ``knows'' means to
have actual knowledge or knowledge fairly implied on the basis
of objective circumstances.
(7) Mental health disorder.--The term ``mental health
disorder'' has the meaning given the term ``mental disorder''
in the Diagnostic and Statistical Manual of Mental Disorders, 5th
Edition (or the most current successor edition).
(8) Minor.--The term ``minor'' means an individual who is
under the age of 17.
(9) Online platform.--The term ``online platform'' means
any public-facing website, online service, online application,
or mobile application that predominantly provides a community
forum for user generated content, such as sharing videos,
images, games, audio files, or other content, including a
social media service, social network, or virtual reality
environment.
(10) Online video game.--The term ``online video game''
means a video game, including an educational video game, that
connects to the internet and that--
(A) allows a user to--
(i) create and upload content;
(ii) engage in microtransactions within the
game; or
(iii) communicate with other users; or
(B) incorporates minor-specific advertising.
(11) Parent.--The term ``parent'' includes--
(A) a natural parent;
(B) a legal guardian; or
(C) an individual with legal custody over a minor.
(12) Personal data.--The term ``personal data'' means
information that identifies or is linked or reasonably linkable
to a particular minor, including a consumer device identifier
that is linked or reasonably linkable to a minor.
(13) Personalized recommendation system.--The term
``personalized recommendation system'' means a fully or
partially automated system used to suggest, promote, or rank
content, including other users or posts, based on the personal
data of users.
(14) Sexual exploitation and abuse.--The term ``sexual
exploitation and abuse'' means any of the following:
(A) Coercion and enticement, as described in
section 2422 of title 18, United States Code.
(B) Child sexual abuse material, as described in
sections 2251, 2252, 2252A, and 2260 of title 18,
United States Code.
(C) Trafficking for the production of images, as
described in section 2251A of title 18, United States
Code.
(D) Sex trafficking of children, as described in
section 1591 of title 18, United States Code.
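The definitions above are legal tests, not specifications, but the
core distinction in paragraph (5) is mechanical enough to sketch. The
following illustrative, non-normative Python sketch (all names are
hypothetical; the statutory text controls) shows how a platform might
flag an ad decision as ``individual-specific advertising to minors''
while honoring the exclusions for specific requests and contextual
placement:

    from dataclasses import dataclass

    @dataclass
    class AdDecision:
        viewer_is_minor: bool               # known or fairly implied (paragraph (6))
        uses_personal_data: bool            # personal data of the minor or a similar group
        uses_device_identifier: bool        # unique identifier of the device
        responds_to_specific_request: bool  # e.g., the minor's current search query
        contextual_only: bool               # based only on surrounding content

    def is_individual_specific_ad_to_minor(ad: AdDecision) -> bool:
        """Rough reading of paragraph (5): targeting a known minor using
        personal data or a device identifier is covered, unless one of
        the paragraph (5)(B) exclusions applies."""
        if not ad.viewer_is_minor:
            return False
        if ad.responds_to_specific_request or ad.contextual_only:
            return False  # excluded under paragraph (5)(B)(i)-(ii)
        return ad.uses_personal_data or ad.uses_device_identifier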
SEC. 3. DUTY OF CARE.
(a) Prevention of Harm to Minors.--A covered platform shall take
reasonable measures in the design and operation of any product,
service, or feature that the covered platform knows is used by minors
to prevent and mitigate the following harms to minors:
(1) Consistent with evidence-informed medical information,
the following mental health disorders: anxiety, depression,
eating disorders, substance use disorders, and suicidal
behaviors.
(2) Patterns of use that indicate or encourage addiction-
like behaviors.
(3) Physical violence, online bullying, and harassment of
the minor.
(4) Sexual exploitation and abuse.
(5) Promotion and marketing of narcotic drugs (as defined
in section 102 of the Controlled Substances Act (21 U.S.C.
802)), tobacco products, gambling, or alcohol.
(6) Predatory, unfair, or deceptive marketing practices, or
other financial harms.
(b) Limitation.--Nothing in subsection (a) shall be construed to
require a covered platform to prevent or preclude--
(1) any minor from deliberately and independently searching
for, or specifically requesting, content; or
(2) the covered platform or individuals on the platform
from providing resources for the prevention or mitigation of
the harms described in subsection (a), including evidence-
informed information and clinical resources.
SEC. 4. SAFEGUARDS FOR MINORS.
(a) Safeguards for Minors.--
(1) Safeguards.--A covered platform shall provide an
individual that the covered platform knows is a minor with
readily-accessible and easy-to-use safeguards to, as
applicable--
(A) limit the ability of other individuals to
communicate with the minor;
(B) prevent other users, whether registered or not,
from viewing the minor's personal data collected by or
shared on the covered platform, in particular
restricting public access to personal data;
(C) limit features that increase, sustain, or
extend use of the covered platform by the minor, such
as automatic playing of media, rewards for time spent
on the platform, notifications, and other features that
result in compulsive usage of the covered platform by
the minor;
(D) control personalized recommendation systems,
including the ability for a minor to have at least 1 of
the following options--
(i) opt out of such personalized
recommendation systems, while still allowing
the display of content based on a chronological
format; or
(ii) limit types or categories of
recommendations from such systems; and
(E) restrict the sharing of the geolocation of the
minor to other users on the platform and provide notice
regarding the tracking of the minor's geolocation.
(2) Options.--A covered platform shall provide an
individual that the covered platform knows is a minor with
readily-accessible and easy-to-use options to--
(A) delete the minor's account and delete any
personal data collected from, or shared by, the minor
on the covered platform; or
(B) limit the amount of time spent by the minor on
the covered platform.
(3) Default safeguard settings for minors.--A covered
platform shall provide that, in the case of a user that the
platform knows is a minor, the default setting for any
safeguard described under paragraph (1) shall be the option
available on the platform that provides the most protective
level of control that is offered by the platform over privacy
and safety for that user.
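The default rule in paragraph (3) is effectively computable: for a
user known to be a minor, every safeguard starts at the most
protective option the platform offers. A minimal, illustrative Python
sketch follows; the option names and their ordering are assumptions,
not taken from the Act:

    PROTECTIVENESS_ORDER = ["everyone", "friends_only", "no_one"]  # least -> most protective

    def default_setting(available_options: list, user_is_minor: bool) -> str:
        """Return the most protective available option for a known minor;
        otherwise keep the platform's ordinary default. Assumes every
        option appears in PROTECTIVENESS_ORDER."""
        if user_is_minor:
            return max(available_options, key=PROTECTIVENESS_ORDER.index)
        return available_options[0]

    # e.g., default_setting(["everyone", "no_one"], user_is_minor=True) -> "no_one"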
(b) Parental Tools.--
(1) Tools.--A covered platform shall provide readily-
accessible and easy-to-use settings for parents to support an
individual that the platform knows is a minor with respect to
the individual's use of the platform.
(2) Requirements.--The parental tools provided by a covered
platform shall include--
(A) the ability to manage a minor's privacy and
account settings, including the safeguards and options
established under subsection (a), in a manner that
allows parents to--
(i) view the privacy and account settings;
and
(ii) in the case of a user that the
platform knows is a child, change and control
the privacy and account settings;
(B) the ability to restrict purchases and financial
transactions by the minor, where applicable; and
(C) the ability to view metrics of total time spent
on the platform and restrict time spent on the covered
platform by the minor.
(3) Notice to minors.--A covered platform shall provide
clear and conspicuous notice to an individual that the platform
knows is a minor when tools described in this subsection are in
effect and what settings or controls have been applied.
(4) Default tools.--A covered platform shall provide that,
in the case of a user that the platform knows is a child, the
tools described in this subsection shall be enabled by default.
(c) Reporting Mechanism.--
(1) Reports submitted by parents, minors, and schools.--A
covered platform shall provide--
(A) a readily-accessible and easy-to-use means to
submit reports to the covered platform of harms to a
minor;
(B) an electronic point of contact specific to
matters involving harms to a minor; and
(C) confirmation of the receipt of such a report
and a means to track a submitted report.
(2) Timing.--A covered platform shall establish an internal
process to receive and substantively respond to such reports in
a reasonable and timely manner, but in no case later than--
(A) 10 days after the receipt of a report, if, for
the most recent calendar year, the platform averaged
more than 10,000,000 active users on a monthly basis in
the United States;
(B) 21 days after the receipt of a report, if, for
the most recent calendar year, the platform averaged
less than 10,000,000 active users on a monthly basis in
the United States; and
(C) notwithstanding subparagraphs (A) and (B), if
the report involves an imminent threat to the safety of
a minor, as promptly as needed to address the reported
threat to safety.
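The response windows in paragraph (2) reduce to a simple tiering rule,
sketched below for illustration (function and argument names are
hypothetical; note that the ``more than''/``less than'' phrasing
leaves a platform with exactly 10,000,000 monthly active users in
neither tier, which this sketch assigns to the longer window):

    def response_deadline_days(monthly_active_users_us: int,
                               imminent_threat: bool) -> int:
        # Paragraph (2)(C): imminent threats must be addressed as promptly
        # as needed; modeled here as an immediate (0-day) deadline.
        if imminent_threat:
            return 0
        if monthly_active_users_us > 10_000_000:
            return 10  # paragraph (2)(A)
        return 21      # paragraph (2)(B)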
(d) Advertising of Illegal Products.--A covered platform shall not
facilitate the advertising of narcotic drugs (as defined in section 102
of the Controlled Substances Act (21 U.S.C. 802)), tobacco products,
gambling, or alcohol to an individual that the covered platform knows
is a minor.
(e) Application.--
(1) Accessibility.--With respect to safeguards and parental
controls described under subsections (a) and (b), a covered
platform shall provide--
(A) information and control options in a clear and
conspicuous manner that takes into consideration the
differing ages, capacities, and developmental needs of
the minors most likely to access the covered platform
and does not encourage minors or parents to weaken or
disable safeguards or parental controls;
(B) readily-accessible and easy-to-use controls to
enable or disable safeguards or parental controls, as
appropriate; and
(C) information and control options in the same
language, form, and manner as the covered platform
provides the product or service used by minors and
their parents.
(2) Dark patterns prohibition.--It shall be unlawful for
any covered platform to design, modify, or manipulate a user
interface of a covered platform with the purpose or substantial
effect of subverting or impairing user autonomy, decision-
making, or choice with respect to safeguards or parental
controls required under this section.
(3) Rules of construction.--Nothing in this section shall
be construed to--
(A) prevent a covered platform from taking
reasonable measures to--
(i) block, detect, or prevent the
distribution of unlawful, obscene, or other
harmful material to minors as described in
section 3(a); or
(ii) block or filter spam, prevent criminal
activity, or protect the security of a platform
or service;
(B) require the disclosure of a minor's browsing
behavior, search history, messages, contact list, or
other content or metadata of their communications;
(C) prevent a covered platform from using a
personalized recommendation system to display content
to a minor if the system only uses information on--
(i) the language spoken by the minor;
(ii) the city the minor is located in; or
(iii) the minor's age; or
(D) prohibit a covered platform from integrating
its products or service with controls from third-party
systems, including operating systems or gaming
consoles, to meet the requirements imposed under
subsections (a) and (b) relating to safeguards for
minors and tools for parents, provided that--
(i) the controls meet such requirements;
and
(ii) the minor or parent is provided
sufficient notice of the integration and use of
the controls.
SEC. 5. DISCLOSURE.
(a) Notice.--
(1) Registration or purchase.--Prior to registration or
purchase of a covered platform by an individual that the
platform knows is a minor, the platform shall provide clear,
conspicuous, and easy-to-understand--
(A) notice of the policies and practices of the
covered platform with respect to personal data and
safeguards for minors;
(B) information about how to access the safeguards
and parental tools required under section 4; and
(C) notice about whether the covered platform uses
or makes available to minors a product, service, or
feature, including any personalized recommendation
system, that poses any heightened risk of harm to
minors.
(2) Notification.--
(A) Notice and acknowledgment.--In the case of an
individual that a covered platform knows is a child,
the platform shall additionally provide information
about the parental tools and safeguards required under
section 4 to a parent of the child and obtain
verifiable parental consent (as defined in section
1302(9) of the Children's Online Privacy Protection Act
(15 U.S.C. 6501(9))) from the parent prior to the
initial use of the covered platform by the child.
(B) Reasonable effort.--A covered platform shall be
deemed to have satisfied the requirement described in
subparagraph (A) if the covered platform is in
compliance with the requirements of the Children's
Online Privacy Protection Act (15 U.S.C. 6501 et seq.)
to use reasonable efforts (taking into consideration
available technology) to provide a parent with the
information described in subparagraph (A) and to obtain
verifiable parental consent as required.
(3) Consolidated notices.--A covered platform may
consolidate the process for providing information under this
subsection and obtaining verifiable parental consent or the
consent of the minor involved (as applicable) as required under
this subsection with its obligations to provide relevant notice
and obtain verifiable parental consent under the Children's
Online Privacy Protection Act (15 U.S.C. 6501 et seq.).
(4) Guidance.--The Federal Trade Commission may issue
guidance to assist covered platforms in complying with the
requirements of this section.
(b) Personalized Recommendation System.--A covered platform that
operates a personalized recommendation system shall set out in its
terms and conditions, in a clear, conspicuous, and easy-to-understand
manner--
(1) an overview of how such personalized recommendation
system is used by the covered platform to provide information
to users of the platform who are minors, including how such
systems use the personal data of minors; and
(2) information about options for minors or their parents
to opt out of or control the personalized recommendation system
(as applicable).
(c) Advertising and Marketing Information and Labels.--
(1) Information and labels.--A covered platform that
facilitates advertising aimed at users that the platform knows
are minors shall provide clear, conspicuous, and easy-to-
understand information and labels to minors on advertisements
regarding--
(A) the name of the product, service, or brand and
the subject matter of an advertisement;
(B) if the covered platform engages in individual-
specific advertising to minors, why a particular
advertisement is directed to a specific minor,
including material information about how the minor's
personal data is used to direct the advertisement to
the minor; and
(C) whether particular media displayed to the minor
is an advertisement or marketing material, including
disclosure of endorsements of products, services, or
brands made for commercial consideration by other users
of the platform.
(2) Guidance.--The Federal Trade Commission may issue
guidance to assist covered platforms in complying with the
requirements of this subsection, including guidance about the
minimum level of information and labels for the disclosures
required under paragraph (1).
(d) Resources for Parents and Minors.--A covered platform shall
provide to minors and parents clear, conspicuous, easy-to-understand,
and comprehensive information in a prominent location regarding--
(1) its policies and practices with respect to personal
data and safeguards for minors; and
(2) how to access the safeguards and tools required under
section 4.
(e) Resources in Additional Languages.--A covered platform shall
ensure, to the extent practicable, that the disclosures required by
this section are made available in the same language, form, and manner
as the covered platform provides any product or service used by minors
and their parents.
SEC. 6. TRANSPARENCY.
(a) In General.--Subject to subsection (b), not less frequently
than once a year, a covered platform shall issue a public report
describing the reasonably foreseeable risks of material harms to minors
and assessing the prevention and mitigation measures taken to address
such risk based on an independent, third-party audit conducted through
reasonable inspection of the covered platform.
(b) Scope of Application.--The requirements of this section shall
apply to a covered platform if--
(1) for the most recent calendar year, the platform
averaged more than 10,000,000 active users on a monthly basis
in the United States; and
(2) the platform predominantly provides a community forum
for user-generated content and discussion, including sharing
videos, images, games, audio files, discussion in a virtual
setting, or other content, such as acting as a social media
platform, virtual reality environment, or a social network
service.
(c) Content.--
(1) Transparency.--The public reports required of a covered
platform under this section shall include--
(A) an assessment of the extent to which the
platform is likely to be accessed by minors;
(B) a description of the commercial interests of
the covered platform in use by minors;
(C) an accounting, based on the data held by the
covered platform, of--
(i) the number of individuals using the
covered platform reasonably believed to be
minors in the United States;
(ii) the median and mean amounts of time
spent on the platform by minors in the United
States who have accessed the platform during
the reporting year on a daily, weekly, and
monthly basis; and
(iii) the amount of content being accessed
by individuals that the platform knows to be
minors that is in English, and the top 5 non-
English languages used by individuals accessing
the platform in the United States;
(D) an accounting of total reports received
regarding, and the prevalence (which can be based on
scientifically valid sampling methods using the content
available to the covered platform in the normal course
of business) of content related to, the harms described
in section 3(a), disaggregated by category of harm and
language, including English and the top 5 non-English
languages used by individuals accessing the platform
from the United States (as identified under
subparagraph (C)(iii)); and
(E) a description of any material breaches of
parental tools or assurances regarding minors,
representations regarding the use of the personal data
of minors, and other matters regarding non-compliance.
(2) Reasonably foreseeable risk of harm to minors.--The
public reports required of a covered platform under this
section shall include--
(A) an assessment of the reasonably foreseeable
risk of harms to minors posed by the covered platform,
including identifying any other physical, mental,
developmental, or financial harms in addition to those
described in section 3(a);
(B) an assessment of how personalized
recommendation systems and individual-specific
advertising to minors can contribute to harms to
minors;
(C) a description of whether and how the covered
platform uses system design features that increase,
sustain, or extend use of a product or service by a
minor, such as automatic playing of media, rewards for
time spent, and notifications;
(D) a description of whether, how, and for what
purpose the platform collects or processes categories
of personal data that may cause reasonably foreseeable
risk of harms to minors;
(E) an evaluation of the efficacy of safeguards for
minors under section 4, and any issues in delivering
such safeguards and the associated parental tools;
(F) an evaluation of any other relevant matters of
public concern over risk of harms to minors; and
(G) an assessment of differences in risk of harm to
minors across different English and non-English
languages and efficacy of safeguards in those
languages.
(3) Mitigation.--The public reports required of a covered
platform under this section shall include, for English and the
top 5 non-English languages used by individuals accessing the
platform from the United States (as identified under paragraph
(1)(C)(iii))--
(A) a description of the safeguards and parental
tools available to minors and parents on the covered
platform;
(B) a description of interventions by the covered
platform when it had or has reason to believe that
harms to minors could occur;
(C) a description of the prevention and mitigation
measures intended to be taken in response to the known
and emerging risks identified in its assessment of
system risks, including steps taken to--
(i) prevent harms to minors, including
adapting or removing system design features or
addressing through parental controls;
(ii) provide the most protective level of
control over privacy and safety by default; and
(iii) adapt recommendation systems to
mitigate reasonably foreseeable risk of harms
to minors, as described in section 3(a);
(D) a description of internal processes for
handling reports and automated detection mechanisms for
harms to minors, including the rate, timeliness, and
effectiveness of responses under the requirement of
section 4(c);
(E) the status of implementing prevention and
mitigation measures identified in prior assessments;
and
(F) a description of the additional measures to be
taken by the covered platform to address the
circumvention of safeguards for minors and parental
tools.
(d) Reasonable Inspection.--In conducting an inspection of the
systemic risks of harm to minors under this section, an independent,
third-party auditor shall--
(1) take into consideration the function of personalized
recommendation systems;
(2) consult parents and youth experts, including youth and
families with relevant past or current experience, public
health and mental health nonprofit organizations, health and
development organizations, and civil society with respect to
the prevention of harms to minors;
(3) conduct research based on experiences of minors that
use the covered platform, including reports under section 4(c)
and information provided by law enforcement;
(4) take account of research, including research regarding
system design features, marketing, or product integrity,
industry best practices, or outside research;
(5) consider indicia or inferences of age of users, in
addition to any self-declared information about the age of
individuals; and
(6) take into consideration differences in risk of
reasonably foreseeable harms and effectiveness of safeguards
across English and non-English languages.
(e) Cooperation With Independent, Third-party Audit.--To facilitate
the report required by subsection (c), a covered platform shall--
(1) provide or otherwise make available to the independent
third-party conducting the audit all information and material
in its possession, custody, or control that is relevant to the
audit;
(2) provide or otherwise make available to the independent
third-party conducting the audit access to all network,
systems, and assets relevant to the audit; and
(3) disclose all relevant facts to the independent third-
party conducting the audit, and not misrepresent in any manner,
expressly or by implication, any relevant fact.
(f) Privacy Safeguards.--
(1) In general.--In issuing the public reports required
under this section, a covered platform shall take steps to
safeguard the privacy of its users, including ensuring that
data is presented in a de-identified, aggregated format such
that it is reasonably impossible for the data to be linked back
to any individual user.
(2) Rule of construction.--This section shall not be
construed to require the disclosure of information that will
lead to material vulnerabilities for the privacy of users or
the security of a covered platform's service or create a
significant risk of the violation of Federal or State law.
(3) Definition of de-identified.--As used in this
subsection, the term ``de-identified'' means data that does not
identify and is not linked or reasonably linkable to a device
that is linked or reasonably linkable to an individual,
regardless of whether the information is aggregated.
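One common way to satisfy a de-identification and aggregation
requirement like the one in paragraph (1) is to publish only
per-category counts and suppress any cell small enough to risk
re-identification. The Python sketch below is illustrative only; the
suppression threshold is an assumption, not a statutory number:

    from collections import Counter

    MIN_CELL_SIZE = 1000  # hypothetical suppression threshold

    def aggregate_for_report(records: list, key: str) -> dict:
        """Count records per category (e.g., harm category or language
        under subsection (c)) and drop any cell too small to publish
        safely in a public transparency report."""
        counts = Counter(record[key] for record in records)
        return {category: n for category, n in counts.items()
                if n >= MIN_CELL_SIZE}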
(g) Location.--The public reports required under this section
should be posted by a covered platform in an easy-to-find location on a
publicly-available website.
SEC. 7. INDEPENDENT RESEARCH ON SOCIAL MEDIA AND MINORS.
(a) Definitions.--In this section:
(1) Commission.--The term ``Commission'' means the Federal
Trade Commission.
(2) National academy.--The term ``National Academy'' means
the National Academy of Sciences.
(3) Secretary.--The term ``Secretary'' means the Secretary
of Health and Human Services.
(b) Research on Social Media Harms.--Not later than 12 months after
the date of enactment of this Act, the Commission shall seek to enter
into a contract with the National Academy, under which the National
Academy shall conduct no less than 5 scientific, comprehensive studies
and reports on the risk of harms to minors by use of social media and
other online platforms, including in English and non-English languages.
(c) Matters to Be Addressed.--In contracting with the National
Academy, the Commission, in consultation with the Secretary, shall seek
to commission separate studies and reports, using the Commission's
authority under section 6(b) of the Federal Trade Commission Act (15
U.S.C. 46(b)), on the relationship between social media and other
online platforms (as defined in this Act) and the following matters:
(1) Anxiety, depression, eating disorders, and suicidal
behaviors.
(2) Substance use disorders and the use of narcotic drugs,
tobacco products, gambling, or alcohol by minors.
(3) Sexual exploitation and abuse.
(4) Addiction-like use of social media and design factors
that lead to unhealthy and harmful overuse of social media.
(d) Additional Study.--Not earlier than 4 years after the date of enactment of this Act,
the Commission shall seek to enter into a contract with the National
Academy under which the National Academy shall conduct an additional
study and report covering the matters described in subsection (c) for
the purposes of providing additional information, considering new
research, and other matters.
(e) Content of Reports.--The comprehensive studies and reports
conducted pursuant to this section shall seek to evaluate impacts and
advance understanding, knowledge, and remedies regarding the harms to
minors posed by social media and other online platforms, and may
include recommendations related to public policy.
(f) Active Studies.--If the National Academy is engaged in any
active studies on the matters described in subsection (c) at the time
that it enters into a contract with the Commission to conduct a study
under this section, it may base the study to be conducted under this
section on the active study, so long as it otherwise incorporates the
requirements of this section.
(g) Collaboration.--In designing and conducting the studies under
this section, the Commission, the Secretary, and the National Academy
shall consult with the Surgeon General and the Kids Online Safety
Council.
(h) Access to Data.--
(1) Fact-finding authority.--The Commission may issue
orders to gather and compile information and data necessary to
conduct the studies required under this section.
(2) Scope.--The Commission may issue orders under section
6(b) of the Federal Trade Commission Act (15 U.S.C. 46(b)) to
no more than 5 covered platforms per study under this section.
(3) Confidential access.--Pursuant to subsections (b) and
(f) of section 6 of the Federal Trade Commission Act (15 U.S.C.
46), the Commission shall enter into agreements with the National
Academy to share appropriate information received from a
covered platform pursuant to an order under such subsection (b)
for a comprehensive study under this section in a confidential
and secure manner, and to prohibit the disclosure or sharing of
such information by the National Academy.
SEC. 8. MARKET RESEARCH.
(a) Market Research by Covered Platforms.--The Federal Trade
Commission, in consultation with the Secretary of Commerce, shall issue
guidance for covered platforms seeking to conduct market- and product-
focused research on minors. Such guidance shall include--
(1) a standard consent form that provides minors and their
parents a clear, conspicuous, and easy-to-understand
explanation of the scope and purpose of the research to be
conducted, and provides an opportunity for informed consent in
the language in which the parent uses the covered platform; and
(2) recommendations for research practices for studies that
may include minors, disaggregated by the age ranges of 0-5, 6-
9, 10-12, and 13-16.
(b) Timing.--The Federal Trade Commission shall issue such guidance
not later than 18 months after the date of enactment of this Act. In
doing so, the Commission shall seek input from members of the public and the
representatives of the Kids Online Safety Council established under
section 12.
SEC. 9. AGE VERIFICATION STUDY AND REPORT.
(a) Study.--The Director of the National Institute of Standards and
Technology, in coordination with the Federal Communications Commission,
Federal Trade Commission, and the Secretary of Commerce, shall conduct
a study evaluating the most technologically feasible methods and
options for developing systems to verify age at the device or operating
system level.
(b) Contents.--Such study shall consider--
(1) the benefits of creating a device or operating system
level age verification system;
(2) what information may need to be collected to create
this type of age verification system;
(3) the accuracy of such systems and their impact or steps
to improve accessibility, including for individuals with
disabilities;
(4) how such a system or systems could verify age while
mitigating risks to user privacy and data security and
safeguarding minors' personal data, emphasizing minimizing the
amount of data collected and processed by covered platforms and
age verification providers for such a system;
(5) the technical feasibility, including the need for
potential hardware and software changes, including for devices
currently in commerce and owned by consumers; and
(6) the impact of different age verification systems on
competition, particularly the risk of different age
verification systems creating barriers to entry for small
companies.
(c) Report.--Not later than 1 year after the date of enactment of
this Act, the agencies described in subsection (a) shall submit a
report containing the results of the study conducted under such
subsection to the Committee on Commerce, Science, and Transportation of
the Senate and the Committee on Energy and Commerce of the House of
Representatives.
SEC. 10. GUIDANCE.
(a) In General.--Not later than 18 months after the date of
enactment of this Act, the Federal Trade Commission, in consultation
with the Kids Online Safety Council established under section 12, shall
issue guidance to--
(1) provide information and examples for covered platforms
and auditors regarding the following, with consideration given
to differences across English and non-English languages--
(A) identifying features that are used to increase,
sustain, or extend use of the covered platform by a
minor;
(B) safeguarding minors against the possible misuse
of parental tools;
(C) best practices in providing minors and parents
the most protective level of control over privacy and
safety;
(D) using indicia or inferences of age of users for
assessing use of the covered platform by minors;
(E) methods for evaluating the efficacy of
safeguards; and
(F) providing additional control options that allow
parents to address the harms described in section 3(a);
and
(2) outline conduct that does not have the purpose or
substantial effect of subverting or impairing user autonomy,
decision-making, or choice, or of causing, increasing, or
encouraging compulsive usage for a minor, such as--
(A) de minimis user interface changes derived from
testing consumer preferences, including different
styles, layouts, or text, where such changes are not
done with the purpose of weakening or disabling
safeguards or parental controls;
(B) algorithms or data outputs outside the control
of a covered platform; and
(C) establishing default settings that provide
enhanced privacy protection to users or otherwise
enhance their autonomy and decision-making ability.
(b) Guidance to Schools.--Not later than 18 months after the date
of enactment of this Act, the Secretary of Education, in consultation
with the Federal Trade Commission and the Kids Online Safety Council
established under section 12, shall issue guidance to assist
elementary and secondary schools in using the notice, safeguards, and
tools provided under this Act and in providing information on online
safety for students and teachers.
(c) Guidance on Knowledge Standard.--Not later than 18 months after
the date of enactment of this Act, the Federal Trade Commission shall
issue guidance to provide information, including best practices and
examples, for covered platforms to understand the Commission's
determination of whether a covered platform ``had knowledge fairly
implied on the basis of objective circumstances'' for purposes of this
Act.
(d) Limitation on Federal Trade Commission Guidance.--
(1) Effect of guidance.--No guidance issued by the Federal
Trade Commission with respect to this Act shall--
(A) confer any rights on any person, State, or
locality; or
(B) operate to bind the Federal Trade Commission or
any person to the approach recommended in such
guidance.
(2) Use in enforcement actions.--In any enforcement action
brought pursuant to this Act, the Federal Trade Commission--
(A) shall allege a violation of a provision of this
Act; and
(B) may not base such enforcement action on, or
execute a consent order based on, practices that are
alleged to be inconsistent with guidance issued by the
Federal Trade Commission with respect to this Act,
unless the practices are alleged to violate a provision
of this Act.
SEC. 11. ENFORCEMENT.
(a) Enforcement by Federal Trade Commission.--
(1) Unfair and deceptive acts or practices.--A violation of
this Act shall be treated as a violation of a rule defining an
unfair or deceptive act or practice prescribed under section
18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C.
57a(a)(1)(B)).
(2) Powers of the commission.--
(A) In general.--The Federal Trade Commission
(referred to in this section as the ``Commission'')
shall enforce this Act in the same manner, by the same
means, and with the same jurisdiction, powers, and
duties as though all applicable terms and provisions of
the Federal Trade Commission Act (15 U.S.C. 41 et seq.)
were incorporated into and made a part of this Act.
(B) Privileges and immunities.--Any person that
violates this Act shall be subject to the penalties,
and entitled to the privileges and immunities, provided
in the Federal Trade Commission Act (15 U.S.C. 41 et
seq.).
(3) Authority preserved.--Nothing in this Act shall be
construed to limit the authority of the Commission under any
other provision of law.
(b) Enforcement by State Attorneys General.--
(1) In general.--
(A) Civil actions.--In any case in which the
attorney general of a State has reason to believe that
an interest of the residents of that State has been or
is threatened or adversely affected by the engagement
of any person in a practice that violates this Act, the
State, as parens patriae, may bring a civil action on
behalf of the residents of the State in a district
court of the United States or a State court of
appropriate jurisdiction to--
(i) enjoin that practice;
(ii) enforce compliance with this Act;
(iii) on behalf of residents of the State,
obtain damages, restitution, or other
compensation, each of which shall be
distributed in accordance with State law; or
(iv) obtain such other relief as the court
may consider to be appropriate.
(B) Notice.--
(i) In general.--Before filing an action
under subparagraph (A), the attorney general of
the State involved shall provide to the
Commission--
(I) written notice of that action;
and
(II) a copy of the complaint for
that action.
(ii) Exemption.--
(I) In general.--Clause (i) shall
not apply with respect to the filing of
an action by an attorney general of a
State under this paragraph if the
attorney general of the State
determines that it is not feasible to
provide the notice described in that
clause before the filing of the action.
(II) Notification.--In an action
described in subclause (I), the
attorney general of a State shall
provide notice and a copy of the
complaint to the Commission at the same
time as the attorney general files the
action.
(2) Intervention.--
(A) In general.--On receiving notice under
paragraph (1)(B), the Commission shall have the right
to intervene in the action that is the subject of the
notice.
(B) Effect of intervention.--If the Commission
intervenes in an action under paragraph (1), it shall
have the right--
(i) to be heard with respect to any matter
that arises in that action; and
(ii) to file a petition for appeal.
(3) Construction.--For purposes of bringing any civil
action under paragraph (1), nothing in this Act shall be
construed to prevent an attorney general of a State from
exercising the powers conferred on the attorney general by the
laws of that State to--
(A) conduct investigations;
(B) administer oaths or affirmations; or
(C) compel the attendance of witnesses or the
production of documentary and other evidence.
(4) Actions by the commission.--In any case in which an
action is instituted by or on behalf of the Commission for
violation of this Act, no State may, during the pendency of
that action, institute a separate action under paragraph (1)
against any defendant named in the complaint in the action
instituted by or on behalf of the Commission for that
violation.
(5) Venue; service of process.--
(A) Venue.--Any action brought under paragraph (1)
may be brought in--
(i) the district court of the United States
that meets applicable requirements relating to
venue under section 1391 of title 28, United
States Code; or
(ii) a State court of competent
jurisdiction.
(B) Service of process.--In an action brought under
paragraph (1) in a district court of the United States,
process may be served wherever defendant--
(i) is an inhabitant; or
(ii) may be found.
SEC. 12. KIDS ONLINE SAFETY COUNCIL.
(a) Establishment.--Not later than 180 days after the date of
enactment of this Act, the Secretary of Commerce shall establish and
convene the Kids Online Safety Council for the purpose of providing
advice on matters related to this Act.
(b) Participation.--The Kids Online Safety Council shall include
diverse participation from--
(1) academic experts, health professionals, and members of
civil society with expertise in mental health, substance use
disorders, and the prevention of harms to minors;
(2) representatives in academia and civil society with
specific expertise in privacy and civil liberties;
(3) parents and youth representation;
(4) representatives of covered platforms;
(5) representatives of the National Telecommunications and
Information Administration, the National Institute of Standards
and Technology, the Federal Trade Commission, the Department of
Justice, and the Department of Health and Human Services;
(6) State attorneys general or their designees acting in
State or local government;
(7) educators; and
(8) representatives of communities of socially
disadvantaged individuals (as defined in section 8 of the Small
Business Act (15 U.S.C. 637)).
(c) Activities.--The matters to be addressed by the Kids Online
Safety Council shall include--
(1) identifying emerging or current risks of harms to
minors associated with online platforms;
(2) recommending measures and methods for assessing,
preventing, and mitigating harms to minors online;
(3) recommending methods and themes for conducting research
regarding online harms to minors, including in English and non-
English languages; and
(4) recommending best practices and clear, consensus-based
technical standards for transparency reports and audits, as
required under this Act, including methods, criteria, and scope
to promote overall accountability.
SEC. 13. FILTER BUBBLE TRANSPARENCY REQUIREMENTS.
(a) Definitions.--In this section:
(1) Algorithmic ranking system.--The term ``algorithmic
ranking system'' means a computational process, including one
derived from algorithmic decision-making, machine learning,
statistical analysis, or other data processing or artificial
intelligence techniques, used to determine the selection,
order, relative prioritization, or relative prominence of
content from a set of information that is provided to a user on
a covered internet platform, including the ranking of search
results, the provision of content recommendations, the display
of social media posts, or any other method of automated content
selection.
(2) Approximate geolocation information.--The term
``approximate geolocation information'' means information that
identifies the location of an individual, but with a precision
of less than 5 miles.
(3) Commission.--The term ``Commission'' means the Federal
Trade Commission.
(4) Connected device.--The term ``connected device'' means
an electronic device that--
(A) is capable of connecting to the internet,
either directly or indirectly through a network, to
communicate information at the direction of an
individual;
(B) has computer processing capabilities for
collecting, sending, receiving, or analyzing data; and
(C) is primarily designed for or marketed to
consumers.
(5) Covered internet platform.--
(A) In general.--The term ``covered internet
platform'' means any public-facing website, internet
application, or mobile application, including a social
network site, video sharing service, search engine, or
content aggregation service.
(B) Exclusions.--Such term shall not include a
platform that--
(i) is wholly owned, controlled, and
operated by a person that--
(I) for the most recent 6-month
period, did not employ more than 500
employees;
(II) for the most recent 3-year
period, averaged less than $50,000,000
in annual gross revenue; and
(III) collects or processes on an
annual basis the user-specific data of
less than 1,000,000 users; or
(ii) is operated for the sole purpose of
conducting research that is not made for profit
either directly or indirectly.
(6) Input-transparent algorithm.--
(A) In general.--The term ``input-transparent
algorithm'' means an algorithmic ranking system that
does not use the user-specific data of a user to
determine the selection, order, relative
prioritization, or relative prominence of information
that is furnished to such user on a covered internet
platform, unless the user-specific data is expressly
provided to the platform by the user for such purpose.
(B) Data provided for express purpose of
interaction with platform.--For purposes of
subparagraph (A), user-specific data that is provided
by a user for the express purpose of determining the
selection, order, relative prioritization, or relative
prominence of information that is furnished to such
user on a covered internet platform--
(i) shall include user-supplied search
terms, filters, speech patterns (if provided
for the purpose of enabling the platform to
accept spoken input or selecting the language
in which the user interacts with the platform),
saved preferences, and the current precise
geolocation information that is supplied by the
user;
(ii) shall include the user's current
approximate geolocation information;
(iii) shall include data affirmatively
supplied to the platform by the user that
expresses the user's desire to receive
particular information, such as the social
media profiles the user follows, the video
channels the user subscribes to, or other
content or sources of content on the platform
the user has selected;
(iv) shall not include the history of the
user's connected device, including the user's
history of web searches and browsing, previous
geographical locations, physical activity,
device interaction, and financial transactions;
and
(v) shall not include inferences about the
user or the user's connected device, without
regard to whether such inferences are based on
data described in clause (i) or (iii).
(7) Opaque algorithm.--
(A) In general.--The term ``opaque algorithm''
means an algorithmic ranking system that determines the
selection, order, relative prioritization, or relative
prominence of information that is furnished to a user on a
covered internet platform based, in whole or
part, on user-specific data that was not expressly
provided by the user to the platform for such purpose.
(B) Exception for age-appropriate content
filters.--Such term shall not include an algorithmic
ranking system used by a covered internet platform if--
(i) the only user-specific data (including
inferences about the user) that the system uses
is information relating to the age of the user;
and
(ii) such information is only used to
restrict a user's access to content on the
basis that the individual is not old enough to
access such content.
(8) Precise geolocation information.--The term ``precise
geolocation information'' means geolocation information that
identifies an individual's location to within a range of 5
miles or less.
(9) Search syndication contract; upstream provider;
downstream provider.--
(A) Search syndication contract.--The term ``search
syndication contract'' means a contract or subcontract
for the sale of, license of, or other right to access
an index of web pages or search results on the internet
for the purpose of operating an internet search engine.
(B) Upstream provider.--The term ``upstream
provider'' means, with respect to a search syndication
contract, the person that grants access to an index of
web pages or search results on the internet to a
downstream provider pursuant to the contract.
(C) Downstream provider.--The term ``downstream
provider'' means, with respect to a search syndication
contract, the person that receives access to an index
of web pages on the internet from an upstream provider
under such contract.
(10) User-specific data.--The term ``user-specific data''
means information relating to an individual or a specific
connected device that would not necessarily be true of every
individual or device.
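Read together, paragraphs (6), (7), and (10) turn on a single
question: does the ranking system consume user-specific data beyond
what the user expressly supplied for ranking? An illustrative,
non-normative Python sketch of that test follows (field names are
hypothetical; the statutory text controls):

    from dataclasses import dataclass

    @dataclass
    class RankingInputs:
        search_terms: list         # expressly supplied -- permitted (para. (6)(B)(i))
        followed_sources: list     # expressly supplied -- permitted (para. (6)(B)(iii))
        approximate_location: str  # coarser than 5 miles -- permitted (para. (6)(B)(ii))
        browsing_history: list     # device history -- not permitted (para. (6)(B)(iv))
        inferred_interests: list   # inferences -- not permitted (para. (6)(B)(v))

    def is_input_transparent(inputs: RankingInputs) -> bool:
        """True if ranking relies only on data the user expressly
        provided; a system that also draws on history or inferences is
        an opaque algorithm under paragraph (7)."""
        return not inputs.browsing_history and not inputs.inferred_interests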
(b) Requirement to Allow Users to See Unmanipulated Content on
Internet Platforms.--
(1) In general.--Beginning on the date that is 1 year after
the date of enactment of this Act, it shall be unlawful--
(A) for any person to operate a covered internet
platform that uses an opaque algorithm unless the
person complies with the requirements of paragraph (2);
or
(B) for any upstream provider to grant access to an
index of web pages on the internet under a search
syndication contract that does not comply with the
requirements of paragraph (3).
(2) Opaque algorithm requirements.--
(A) In general.--The requirements of this paragraph
with respect to a person that operates a covered
internet platform that uses an opaque algorithm are the
following:
(i) The person provides notice to users of
the platform--
(I) that the platform uses an
opaque algorithm that uses user-
specific data to select the content the
user sees. Such notice shall be
presented in a clear, conspicuous
manner on the platform whenever the
user interacts with an opaque algorithm
for the first time, and may be a one-
time notice that can be dismissed by
the user; and
(II) in the terms and conditions of
the covered internet platform, in a
clear, accessible, and easily
comprehensible manner to be updated no
less frequently than once every 6
months--
(aa) the most salient
features, inputs, and
parameters used by the
algorithm;
(bb) how any user-specific
data used by the algorithm is
collected or inferred about a
user of the platform, and the
categories of such data;
(cc) any options that the
covered internet platform makes
available for a user of the
platform to opt out or exercise
options under clause (ii),
modify the profile of the user
or to influence the features,
inputs, or parameters used by
the algorithm; and
(dd) any quantities, such
as time spent using a product
or specific measures of
engagement or social
interaction, that the algorithm
is designed to optimize, as
well as a general description
of the relative importance of
each quantity for such ranking.
(ii) The person makes available a version
of the platform that uses an input-transparent
algorithm and enables users to easily switch
between the version of the platform that uses
an opaque algorithm and the version of the
platform that uses the input-transparent
algorithm.
(B) Nonapplication to certain downstream
providers.--Subparagraph (A) shall not apply with
respect to an internet search engine if--
(i) the search engine is operated by a
downstream provider with fewer than 1,000
employees; and
(ii) the search engine uses an index of web
pages on the internet to which such provider
received access under a search syndication
contract.
(3) Search syndication contract requirement.--The
requirements of this paragraph with respect to a search
syndication contract are that--
(A) as part of the contract, the upstream provider
makes available to the downstream provider the same
input-transparent algorithm used by the upstream
provider for purposes of complying with paragraph
(2)(A)(ii); and
(B) the upstream provider does not impose any
additional costs, degraded quality, reduced speed, or
other constraint on the functioning of such algorithm
when used by the downstream provider to operate an
internet search engine relative to the performance of
such algorithm when used by the upstream provider to
operate an internet search engine.
(4) Prohibition on differential pricing.--A covered
internet platform shall not deny, charge different prices or
rates for, or condition the provision of a service or product
to an individual based on the individual's election to use a
version of the platform that uses an input-transparent
algorithm as provided under paragraph (2)(A)(ii).
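In practice, paragraph (2)(A)(ii) requires a user-visible switch
between two ranking modes. A minimal, illustrative sketch of such a
switch follows; the helper personalized_rank is a hypothetical
placeholder for a platform's proprietary ranking, and a
reverse-chronological sort stands in as one possible input-transparent
ordering:

    def rank_feed(items, user, mode: str):
        if mode == "input_transparent":
            # Use only data the user expressly supplied for ranking; a
            # reverse-chronological ordering is one possible compliant
            # baseline, since it uses no user-specific data at all.
            return sorted(items, key=lambda item: item.timestamp, reverse=True)
        # Opaque mode: may also draw on history and inferences about the
        # user, subject to the notice requirements of paragraph (2)(A)(i).
        return personalized_rank(items, user)

    def personalized_rank(items, user):
        """Hypothetical stand-in for a platform's user-specific ranking."""
        return items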
(c) Enforcement by Federal Trade Commission.--
(1) Unfair or deceptive acts or practices.--A violation of
this section by an operator of a covered internet platform
shall be treated as a violation of a rule defining an unfair or
deceptive act or practice prescribed under section 18(a)(1)(B)
of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(2) Powers of commission.--
(A) In general.--Except as provided in subparagraph
(C), the Federal Trade Commission shall enforce this
section in the same manner, by the same means, and with
the same jurisdiction, powers, and duties as though all
applicable terms and provisions of the Federal Trade
Commission Act (15 U.S.C. 41 et seq.) were incorporated
into and made a part of this section.
(B) Privileges and immunities.--Except as provided
in subparagraph (C), any person who violates this Act
shall be subject to the penalties and entitled to the
privileges and immunities provided in the Federal Trade
Commission Act (15 U.S.C. 41 et seq.).
(C) Common carriers and nonprofit organizations.--
Notwithstanding section 4, 5(a)(2), or 6 of the Federal
Trade Commission Act (15 U.S.C. 44, 45(a)(2), 46) or
any jurisdictional limitation of the Commission, the
Commission shall also enforce this Act, in the same
manner provided in subparagraphs (A) and (B) of this
paragraph, with respect to--
(i) common carriers subject to the
Communications Act of 1934 (47 U.S.C. 151 et
seq.) and Acts amendatory thereof and
supplementary thereto; and
(ii) organizations not organized to carry
on business for their own profit or that of
their members.
(D) Authority preserved.--Nothing in this section
shall be construed to limit the authority of the
Commission under any other provision of law.
(3) Rule of application.--Section 11 shall not apply to
this section.
(d) Rule of Construction to Preserve Personalized Blocks.--Nothing
in this section shall be construed to limit or prohibit a covered
internet platform's ability to, at the direction of an individual user
or group of users, restrict another user from searching for, finding,
accessing, or interacting with such user's or group's account, content,
data, or online community.
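[A minimal sketch of the user-directed blocking that subsection (d) preserves. The registry below is hypothetical and purely illustrative, not a construction of the bill.]

    # Illustrative sketch only; names are hypothetical. A per-user block
    # list is consulted before any search, access, or interaction is
    # served, enforced at the direction of the blocking user per
    # subsection (d).
    class BlockRegistry:
        def __init__(self) -> None:
            self._blocks: dict[str, set[str]] = {}  # blocker -> blocked ids

        def block(self, blocker_id: str, blocked_id: str) -> None:
            self._blocks.setdefault(blocker_id, set()).add(blocked_id)

        def may_interact(self, viewer_id: str, target_id: str) -> bool:
            # A viewer blocked by the target cannot find, access, or
            # interact with the target's account, content, or community.
            return viewer_id not in self._blocks.get(target_id, set())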
SEC. 14. EFFECTIVE DATE.
Except as otherwise provided in this Act, this Act shall take
effect on the date that is 18 months after the date of enactment of
this Act.
SEC. 15. RULES OF CONSTRUCTION AND OTHER MATTERS.
(a) Relationship to Other Laws.--Nothing in this Act shall be
construed to--
(1) preempt section 444 of the General Education Provisions
Act (20 U.S.C. 1232g, commonly known as the ``Family
Educational Rights and Privacy Act of 1974'') or other Federal
or State laws governing student privacy;
(2) preempt the Children's Online Privacy Protection Act of
1998 (15 U.S.C. 6501 et seq.) or any rule or regulation
promulgated under such Act; or
(3) authorize any action that would conflict with section
18(h) of the Federal Trade Commission Act (15 U.S.C. 57a(h)).
(b) Determination of ``Fairly Implied on the Basis of Objective
Circumstances''.--For purposes of enforcing this Act, in making a
determination as to whether a covered platform has knowledge fairly
implied on the basis of objective circumstances that a user is a minor,
the Federal Trade Commission shall rely on competent and reliable
empirical evidence, taking into account the totality of the
circumstances, including consideration of whether the operator, using
available technology, exercised reasonable care.
(c) Protections for Privacy.--Nothing in this Act shall be
construed to require--
(1) the affirmative collection of any personal data with
respect to the age of users that a covered platform is not
already collecting in the normal course of business; or
(2) a covered platform to implement an age gating or age
verification functionality.
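[One way to read subsection (c): a platform may rely on age data it already holds without building new collection or verification flows. The sketch below is illustrative only; the function and its inputs are hypothetical, and it assumes the bill's "minor" threshold of under 17.]

    # Illustrative sketch only. Derives a "knows is a minor" flag from a
    # self-declared birth year the platform already collects in the
    # normal course of business; it collects nothing new and performs no
    # age verification, consistent with subsection (c).
    from datetime import date

    MINOR_AGE_THRESHOLD = 17  # assumption: the bill's "minor" cutoff

    def is_known_minor(self_declared_birth_year: int | None,
                       today: date | None = None) -> bool:
        if self_declared_birth_year is None:
            # No age data was ever collected; the Act does not compel
            # collecting it, so no minor status is established.
            return False
        today = today or date.today()
        # Conservative estimate from year alone (ignores month and day).
        return (today.year - self_declared_birth_year) < MINOR_AGE_THRESHOLD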
(d) Compliance.--Nothing in this Act shall be construed to restrict
a covered platform's ability to--
(1) cooperate with law enforcement agencies regarding
activity that the covered platform reasonably and in good faith
believes may violate Federal, State, or local laws, rules, or
regulations;
(2) comply with a civil, criminal, or regulatory inquiry or
any investigation, subpoena, or summons by Federal, State,
local, or other government authorities; or
(3) investigate, establish, exercise, respond to, or defend
against legal claims.
(e) Application to Video Streaming Services.--A video streaming
service shall be deemed to be in compliance with this Act if it
predominantly consists of news, sports, entertainment, or other video
programming content that is preselected by the provider and not user-
generated, and--
(1) any chat, comment, or interactive functionality is
provided incidental to, directly related to, or dependent on
provision of such content;
(2) if such video streaming service requires account owner
registration and is not predominantly news or sports, the
service includes the capability--
(A) to limit a minor's access to the service, which
may utilize a system of age-rating;
(B) to limit the automatic playing of on-demand
content selected by a personalized recommendation
system for an individual that the service knows is a
minor;
(C) to provide an individual that the service knows
is a minor with readily accessible and easy-to-use
options to delete an account held by the minor and
delete any personal data collected from the minor on
the service, or, in the case of a service that allows a
parent to create a profile for a minor, to allow a
parent to delete the minor's profile, and to delete any
personal data collected from the minor on the service;
(D) for a parent to manage a minor's privacy and
account settings, and restrict purchases and financial
transactions by a minor, where applicable;
(E) to provide an electronic point of contact
specific to matters described in this paragraph;
(F) to offer a clear, conspicuous, and easy-to-
understand notice of its policies and practices with
respect to personal data and the capabilities described
in this paragraph; and
(G) when providing on-demand content, to employ
measures that safeguard against serving advertising for
narcotic drugs (as defined in section 102 of the
Controlled Substances Act (21 U.S.C. 802)), tobacco
products, gambling, or alcohol directly to the account
or profile of an individual that the service knows is a
minor.
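[The capabilities enumerated in subsection (e)(2) map naturally onto a per-account settings object. The following sketch is illustrative only; all field names and defaults are hypothetical, not requirements of the bill. The notice obligation in (e)(2)(F) is a disclosure practice and is not modeled here.]

    # Illustrative sketch only; field names and defaults are hypothetical.
    from dataclasses import dataclass

    # (G): ad categories that must not be served to a known minor.
    RESTRICTED_AD_CATEGORIES = {"narcotics", "tobacco", "gambling", "alcohol"}

    @dataclass
    class MinorAccountCapabilities:
        max_age_rating: str = "TV-PG"           # (A) limit access via age-rating
        autoplay_personalized: bool = False     # (B) limit autoplay of
                                                #     recommended on-demand content
        allow_self_deletion: bool = True        # (C) delete account and data
        parental_controls_enabled: bool = True  # (D) privacy, settings, purchases
        contact_point: str = "minors@example.com"  # (E) electronic point of contact

    def filter_ads(ads: list[dict],
                   caps: MinorAccountCapabilities) -> list[dict]:
        # (G): screen restricted ad categories before serving ads to the
        # account or profile of a known minor.
        return [ad for ad in ads
                if ad.get("category") not in RESTRICTED_AD_CATEGORIES]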
SEC. 16. SEVERABILITY.
If any provision of this Act, or an amendment made by this Act, is
determined to be unenforceable or invalid, the remaining provisions of
this Act and the amendments made by this Act shall not be affected.