[Congressional Bills 117th Congress]
[From the U.S. Government Publishing Office]
[S. 5259 Introduced in Senate (IS)]
<DOC>
117th CONGRESS
2d Session
S. 5259
To require certain interactive computer services to adopt and operate
technology verification measures to ensure that users of the platform
are not minors, and for other purposes.
_______________________________________________________________________
IN THE SENATE OF THE UNITED STATES
December 14, 2022
Mr. Lee introduced the following bill; which was read twice and
referred to the Committee on Commerce, Science, and Transportation
_______________________________________________________________________
A BILL
To require certain interactive computer services to adopt and operate
technology verification measures to ensure that users of the platform
are not minors, and for other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Shielding Children's Retinas from
Egregious Exposure on the Net Act'' or the ``SCREEN Act''.
SEC. 2. FINDINGS; SENSE OF CONGRESS.
(a) Findings.--Congress finds the following:
(1) Over the 3 decades preceding the date of enactment of
this Act, Congress has passed several bills to protect minors
from access to online pornographic content, including title V
of the Telecommunications Act of 1996 (Public Law 104-104)
(commonly known as the ``Communications Decency Act''), section
231 of the Communications Act of 1934 (47 U.S.C. 231) (commonly
known as the ``Child Online Protection Act''), and the
Children's Internet Protection Act (title XVII of division B of
Public Law 106-554).
(2) With the exception of the Children's Internet
Protection Act (title XVII of division B of Public Law 106-
554), the Supreme Court of the United States has struck down
the previous efforts of Congress to shield children from
pornographic content, finding that, although such legislation
served a ``compelling government interest'', it was not the
least restrictive means to achieve such interest. In Ashcroft
v. ACLU, 542 U.S. 656 (2004), the Court suggested that
``blocking and filtering software'' could conceivably be a
``primary alternative'' to the requirements passed by Congress.
(3) In the nearly 2 decades since the Supreme Court of the
United States suggested the use of ``blocking and filtering
software'', such technology has proven to be ineffective in
protecting minors from accessing online pornographic content.
The Kaiser Family Foundation has found that filters do not work
on 1 in 10 pornography sites accessed intentionally and 1 in 3
pornography sites that are accessed unintentionally. Further,
children have proven able to bypass ``blocking and filtering''
software entirely by employing strategic searches or other
evasive measures.
(4) Additionally, Pew Research has revealed studies showing
that only 39 percent of parents use blocking or filtering
software for their minors' online activities, meaning that 61
percent of children have restrictions on their internet access
only when they are at school or at a library.
(5) 17 States have now recognized pornography as a public
health hazard that leads to a broad range of individual harms,
societal harms, and public health impacts.
(6) It is estimated that 80 percent of minors between the
ages of 12 and 17 have been exposed to pornography, with 54
percent of teenagers seeking it out. The internet is the most
common source for minors to access pornography, with
pornographic websites receiving more web traffic in the United
States than Twitter, Netflix, Pinterest, and LinkedIn combined.
(7) Exposure to online pornography has created unique
psychological effects for minors, including anxiety, addiction,
low self-esteem, body image disorders, an increase in
problematic sexual activity at younger ages, and an increased
desire among minors to engage in risky sexual behavior.
(8) The Supreme Court of the United States has recognized
on multiple occasions that Congress has a ``compelling
government interest'' in protecting the physical and
psychological well-being of minors, which includes shielding
them from ``indecent'' content that may not necessarily be
considered ``obscene'' by adult standards.
(9) Because ``blocking and filtering software'' has not
produced the results envisioned nearly 2 decades ago, it is
necessary for Congress to pursue alternative policies to enable
the protection of the physical and psychological well-being of
minors.
(10) Advances in technology have now enabled the use of
age verification technology that is cost efficient, not unduly
burdensome, and can be operated narrowly in a manner that
ensures that only adults have access to a website's online
pornographic content.
(b) Sense of Congress.--It is the sense of Congress that--
(1) shielding minors from access to online pornographic
content is a compelling government interest that protects the
physical and psychological well-being of minors; and
(2) requiring interactive computer services that are in the
business of creating, hosting, or making available pornographic
content to enact technological measures that shield minors from
accessing pornographic content on their platforms is the least
restrictive means for Congress to achieve its compelling
government interest.
SEC. 3. DEFINITIONS.
In this Act:
(1) Child pornography; minor.--The terms ``child
pornography'' and ``minor'' have the meanings given those terms
in section 2256 of title 18, United States Code.
(2) Commission.--The term ``Commission'' means the Federal
Communications Commission.
(3) Covered platform.--The term ``covered platform''--
(A) means an entity--
(i) that is an interactive computer
service;
(ii) that--
(I) is engaged in interstate or
foreign commerce; or
(II) purposefully avails itself of
the United States market or a portion
thereof; and
(iii) for which it is in the regular course
of the trade or business of the entity to
create, host, or make available content that
meets the definition of harmful to minors under
paragraph (4) and that is provided by the
entity, a user, or other information content
provider, with the objective of earning a
profit; and
(B) includes an entity described in subparagraph
(A) regardless of whether--
(i) the entity earns a profit on the
activities described in subparagraph (A)(iii);
or
(ii) creating, hosting, or making available
content that meets the definition of harmful to
minors under paragraph (4) is the sole source
of income or principal business of the entity.
(4) Harmful to minors.--The term ``harmful to minors'',
with respect to a picture, image, graphic image file, film,
videotape, or other visual depiction, means that the picture,
image, graphic image file, film, videotape, or other visual
depiction--
(A)(i) taken as a whole and with respect to minors,
appeals to the prurient interest in nudity, sex, or
excretion;
(ii) depicts, describes, or represents, in a
patently offensive way with respect to what is suitable
for minors, an actual or simulated sexual act or sexual
contact, actual or simulated normal or perverted sexual
acts, or lewd exhibition of the genitals; and
(iii) taken as a whole, lacks serious literary,
artistic, political, or scientific value as to minors;
(B) is obscene; or
(C) is child pornography.
(5) Information content provider; interactive computer
service.--The terms ``information content provider'' and
``interactive computer service'' have the meanings given those
terms in section 230(f) of the Communications Act of 1934 (47
U.S.C. 230(f)).
(6) Sexual act; sexual contact.--The terms ``sexual act''
and ``sexual contact'' have the meanings given those terms in
section 2246 of title 18, United States Code.
(7) Technology verification measure.--The term ``technology
verification measure'' means technology that--
(A) employs a system or process to determine
whether it is more likely than not that a user of a
covered platform is a minor; and
(B) prevents access by minors to any content on a
covered platform.
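The ``more likely than not'' standard in paragraph (7)(A) amounts to a
probability threshold of one half. What follows is a minimal,
non-normative sketch of such a determination in Python, assuming a
hypothetical estimator supplied by a technology verification measure;
the names, signatures, and threshold below are illustrative and are
not prescribed by this Act.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class VerificationResult:
        probability_minor: float  # estimated probability that the user is a minor
        likely_minor: bool        # True when that probability exceeds one half

    def verify_user(estimate_minor_probability: Callable[[object], float],
                    user_signal: object) -> VerificationResult:
        # 'estimate_minor_probability' stands in for whatever system or
        # process a covered platform employs under paragraph (7)(A); it is
        # an assumed component, not one specified by the Act.
        p = estimate_minor_probability(user_signal)
        return VerificationResult(probability_minor=p, likely_minor=p > 0.5)

    def allow_access(result: VerificationResult) -> bool:
        # Paragraph (7)(B): a user determined to be a minor is prevented
        # from accessing content on the covered platform.
        return not result.likely_minor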
SEC. 4. TECHNOLOGY VERIFICATION MEASURES.
(a) Rule Making.--The Commission shall--
(1) not later than 30 days after the date of enactment of
this Act, issue a notice of proposed rule making to require
covered platforms to adopt and operate technology verification
measures on the platform to ensure that--
(A) users of the covered platform are not minors;
and
(B) minors are prevented from accessing any content
on the covered platform that is harmful to minors; and
(2) not later than 1 year after issuing the notice of
proposed rule making under paragraph (1), issue the final rule.
(b) Requirements.--The rule described in subsection (a) shall--
(1) set the applicable verification standards and metrics
to which a covered platform using a technology verification
measure is required to adhere when determining whether it is
more likely than not that a user of the covered platform is not
a minor;
(2) require covered platforms to--
(A) adopt technology verification measures that
adhere to the standards and metrics set by the
Commission under paragraph (1); and
(B) make publicly available the verification
process that the covered platform is employing to
comply with the requirements under this Act;
(3) provide that requiring a user to confirm that the user
is not a minor shall not be sufficient to satisfy the
requirements under subparagraphs (A) and (B) of subsection
(a)(1);
(4) subject the Internet Protocol (IP) addresses of all
users of a covered platform, including known virtual private
network (VPN) IP addresses, to the requirements described in
subparagraphs (A) and (B) of subsection (a)(1) unless the
covered platform (or the third party described in paragraph
(6)), according to standards set by the Commission, determines,
based on available technology, that a user is not located
within the United States (an illustrative sketch of this
screening step follows this subsection);
(5) permit covered platforms to choose the technology
verification measure that ensures the verification of users in
accordance with the standards and metrics set by the Commission
under paragraph (1), provided that the technology verification
measure employed by the covered platform prohibits a minor from
accessing the platform or any information on the platform that
is obscene, child pornography, or harmful to minors;
(6) permit covered platforms to contract with a third party
to employ a technology verification measure, and provide that
use of such a third party shall not relieve the covered
platform of the requirements under subparagraphs (A) and (B) of
subsection (a)(1) or the enforcement actions described in
section 6;
(7) require the Commission to establish a process for each
covered platform to submit only such documents or other
materials as are necessary for the Commission to ensure full
compliance with the requirements of the rule; and
(8) require the Commission to--
(A) conduct regular audits of covered platforms to
ensure compliance with the requirements under this
subsection; and
(B) make public the terms and processes for the
audits conducted under subparagraph (A), including the
processes for any third-party conducting an audit on
behalf of the Commission.
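As noted in paragraph (4) of this subsection, every user IP address,
including a known virtual private network (VPN) address, remains
subject to the verification requirements unless the covered platform
(or its third party) affirmatively determines that the user is not
located within the United States. The following is a minimal,
non-normative sketch of that screening order in Python, assuming a
hypothetical geolocation lookup and a hypothetical list of known VPN
addresses, neither of which is specified by the rule or this Act.

    from typing import Callable, Optional, Set

    # Hypothetical list of known VPN or proxy exit addresses; the Act does
    # not prescribe a data source.
    KNOWN_VPN_ADDRESSES: Set[str] = set()

    def requires_verification(ip_address: str,
                              geolocate_country: Callable[[str], Optional[str]]) -> bool:
        # Known VPN addresses remain subject to verification regardless of
        # the location they appear to report.
        if ip_address in KNOWN_VPN_ADDRESSES:
            return True
        country = geolocate_country(ip_address)
        # Only an affirmative determination that the user is NOT located
        # within the United States exempts the address; an unknown location
        # does not.
        return country is None or country == "US"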
(c) Compliance.--
(1) Deadline.--Not later than 180 days after the date on
which the final rule is issued under subsection (a)(2), each
covered platform shall comply with the requirements under the
final rule.
(2) Appropriate documents, materials, and measures.--The
Commission shall prescribe the appropriate documents,
materials, or other measures required to ensure full compliance
with the requirements of the final rule issued under subsection
(a)(2).
(d) Contracting With Third Parties.--The Commission may create a
process to contract with independent third-party auditors to conduct
regular audits on behalf of the Commission under subsection (b)(8).
(e) Rule of Construction.--Nothing in this section shall be
construed to require a covered platform to submit any information that
identifies, is linked to, or is reasonably linkable to a user of the
covered platform or a device that is linked or reasonably linkable to a
user of the covered platform.
SEC. 5. CONSULTATION REQUIREMENTS.
In issuing the rule required under section 4, the Commission shall
consult with the following individuals, including with respect to the
applicable standards and metrics for making a determination on whether
it is more likely than not that a user of a covered platform is not a
minor:
(1) Individuals with experience in computer science and
software engineering.
(2) Individuals with experience in--
(A) advocating for online child safety; or
(B) providing services to minors who have been
victimized by online child exploitation.
(3) Individuals with experience in consumer protection and
online privacy.
(4) Individuals who supply technology verification measure
products or have expertise in technology verification measure
solutions.
(5) Individuals with experience in data security and
cryptography.
SEC. 6. CIVIL PENALTY FOR VIOLATIONS.
(a) Notification.--If the Commission has a sound basis to conclude
that a covered platform has violated the final rule issued under
section 4, the Commission shall notify the covered platform, providing
a brief description of the specific violation and recommended measures
to correct the violation.
(b) Penalty.--
(1) In general.--A covered platform that, within 72 hours
of receipt of a notice under subsection (a), has not provided
evidence of compliance or corrected the violation identified in
the notice shall be subject to a civil penalty in an amount
that is not more than $25,000 per violation.
(2) Separate violations.--For the purposes of paragraph
(1), each day of violation of the final rule issued under
section 4 shall constitute a separate violation.
(3) Appeal.--A covered platform may appeal any civil
penalty issued by the Commission under this subsection in an
appropriate district court of the United States.
(4) Use of amounts.--Any amounts collected under this
subsection shall be used by the Commission to carry out
enforcement of the final rule issued under section 4.
(c) Enforcement.--The Commission may--
(1) file a claim in an appropriate district court of the
United States to enforce this section;
(2) seek a temporary or permanent injunction from an
appropriate district court of the United States on such terms
as the court deems reasonable to prevent or restrain a
violation of the final rule issued under section 4;
(3) after 30 days of noncompliance with section 4 and a
demonstration that the covered platform has not made a good
faith effort to comply with section 4, seek a permanent or
temporary injunction to restrict the operation of the covered
platform within the United States; and
(4) after 30 days of noncompliance with section 4 and a
demonstration that the covered platform has not made a good
faith effort to comply with section 4, seek an order to allow
the Commission to prohibit a covered platform from engaging in
any online economic transactions within the United States.
(d) Duration.--The terms of an injunction or an order issued under
paragraph (2), (3), or (4) of subsection (c) with respect to a covered
platform shall be valid only until such time as the covered platform
demonstrates to the Commission full compliance with the requirements of
the final rule issued under section 4.
SEC. 7. GAO REPORT.
Not later than 2 years after the date on which covered platforms
are required to comply with the final rule issued under section
4(a)(2), the Comptroller General of the United States shall submit to
Congress a report that includes--
(1) an analysis of the effectiveness of the technology
verification measures required under the final rule;
(2) recommendations to the Commission for improvements to
the final rule; and
(3) recommendations to Congress on future legislative
improvements.
SEC. 8. SEVERABILITY CLAUSE.
If any provision of this Act, or the application of such a
provision to any person or circumstance, is held to be
unconstitutional, the remaining provisions of this Act, and the
application of such provisions to any other person or circumstance,
shall not be affected thereby.
<all>