[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[H.R. 3806 Introduced in House (IH)]
118th CONGRESS
1st Session
H. R. 3806
To ensure that large online platforms are addressing the needs of non-
English users.
_______________________________________________________________________
IN THE HOUSE OF REPRESENTATIVES
June 5, 2023
Mr. Cardenas (for himself, Mr. Soto, Ms. Barragan, Mr. Costa, Mr.
Espaillat, Mr. Vargas, Mr. Garcia of Illinois, and Mr. Castro of Texas)
introduced the following bill; which was referred to the Committee on
Energy and Commerce, and in addition to the Committee on Foreign
Affairs, for a period to be subsequently determined by the Speaker, in
each case for consideration of such provisions as fall within the
jurisdiction of the committee concerned
_______________________________________________________________________
A BILL
To ensure that large online platforms are addressing the needs of non-
English users.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE; TABLE OF CONTENTS.
(a) Short Title.--This Act may be cited as the ``Language-Inclusive
Support and Transparency for Online Services Act of 2023'' or the
``LISTOS Act''.
(b) Table of Contents.--The table of contents of this Act is as
follows:
Sec. 1. Short title; table of contents.
Sec. 2. Sense of Congress.
Sec. 3. Duty to ensure consistent enforcement.
Sec. 4. Disclosures on staffing and automated processes.
Sec. 5. Consistent access to tools and documentation.
Sec. 6. Advisory Group.
Sec. 7. Enforcement.
Sec. 8. Regulations.
Sec. 9. Effective dates.
Sec. 10. International online communication research activities pilot
program.
Sec. 11. Definitions.
SEC. 2. SENSE OF CONGRESS.
It is the sense of Congress that--
(1) substantial and deliberate investments across languages
are essential to protect the safety of users online and ensure
equitable access to digital spaces;
(2) online platforms have historically under-invested in
ensuring non-English content moderation and automated content
detection and filtering processes keep pace with their English
counterparts, providing little transparency into the efficacy
of efforts to detect, review, and remove content that violates
laws or platform policies across languages;
(3) this difference in enforcement for platforms' existing
policies and uneven moderation practices across both manual and
automated processes has increased the proliferation of illegal
and harmful content across many languages and the deliberate
targeting of non-English-speaking communities for fraud and
harassment; and
(4) any reform effort for online platform safety must
ensure equitable investment across languages in order to
promote economic opportunity, public health, and civil rights.
SEC. 3. DUTY TO ENSURE CONSISTENT ENFORCEMENT.
(a) In General.--The operator of a covered platform shall provide
that processes used by the platform for detecting, suppressing, and
removing illegal content, or content that otherwise violates platform
policies, are reasonably consistent for languages in which the covered
platform engages in monetization practices.
(b) Considerations.--Any entity enforcing or promulgating rules
under subsection (a) shall take into consideration factors that may
impact the covered platform's ability to enforce its policies with
respect to content in a given language, including staffing levels and
language proficiency, or the effectiveness of automated systems
designed to filter or flag content for additional review.
(c) Rule of Construction; Limitation on Regulation.--Nothing in
this section shall be construed to require, and no regulation issued by
the Commission to carry out this section may require, that a covered
platform take any particular action on a specific piece of content or
class of content.
SEC. 4. DISCLOSURES ON STAFFING AND AUTOMATED PROCESSES.
(a) In General.--The operator of a covered platform shall, not less
than annually, submit to the Commission and make available to the
public, in a machine-readable format, a clear and easily comprehensible
report on any manual and algorithmic content moderation that the
covered platform engaged in during the relevant period. Each such
report shall be in compliance with the rules established under
subsection (b).
(b) Rules.--The Commission shall, in accordance with section 8,
establish rules for reports under subsection (a). Such rules shall
require that a report include the following information:
(1) Content moderation staffing.--
(A) In general.--The number of staff employed by
the covered platform (whether directly employed by the
platform or contracted through a third party) for the
purposes of manually reviewing content for removal or
other interventions, in aggregate and broken down by--
(i) the countries in which the employees
are located;
(ii) the geographic or regional area to
which the employees are assigned; and
(iii) languages spoken by the employees
relevant to their employment and their levels
of language proficiency.
(B) Staff support.--A description of the training
and support provided to content moderation staff,
including--
(i) the training processes and guidelines
provided;
(ii) the support services, such as mental
health services, available to the employee; and
(iii) if training or support services
differ by factors such as geographic region,
languages spoken, or direct-hire versus
contracted employees, descriptions and
breakdowns of such differences.
(2) Automated content detection processes.--If the covered
platform elects to use algorithmic processes to detect content
for additional manual review or automated moderation,
information on such processes, including--
(A) performance metrics that are monitored to
ensure consistent behavior for such processes across
languages and the languages that are monitored; and
(B) other safeguards in place to ensure consistent
behavior of such systems across languages.
(3) Monetization across languages.--The list of languages
in which the covered platform engages in monetization practices
and the percentage breakdown by language of the covered
platform's revenue throughout the duration of the relevant
reporting period.
(4) In-language review.--Of all content that is manually
reviewed by staff, provide information on content that is
reviewed in the original language used to create the content
rather than being subject to automated translation before
review, including--
(A) the percentage of content reviewed in the
original language for each language in which the
covered platform engages in monetization practices; and
(B) a description of the policies governing whether
and to what extent content will be manually reviewed in
the original language or automatically translated prior
to manual review.
(5) Translation and review processes.--With respect to the
content review practices of the covered platform--
(A) the list of languages in which content is
reviewed without translation; and
(B) for languages in which automated translation is
applied prior to manual review, a description of--
(i) the process by which content is
translated; and
(ii) the process by which that content is
reviewed and how, if at all, that process
differs from the process used to review content
in the original language.
(6) Content moderation outcome measures.--
(A) Number of content takedowns.--The number of
content takedowns over the relevant reporting period
for each language in which the covered platform engages
in monetization practices.
(B) Response time.--The average response time to
user-initiated takedown or content review requests over
the relevant reporting period for each language in
which the covered platform engages in monetization
practices.
(7) Additional information.--Other information determined
appropriate by the Commission, including additional categories
or criteria relevant to the information described in paragraphs
(1), (2), and (4).
SEC. 5. CONSISTENT ACCESS TO TOOLS AND DOCUMENTATION.
The operator of a covered platform shall--
(1) provide that all user tools for reporting content for
review or automated action are accessible across all languages
in which the covered platform offers its service; and
(2) post all platform policies and other information
concerning acceptable use of the covered platform in the same
manner for all languages in which the platform offers its
service.
SEC. 6. ADVISORY GROUP.
(a) Establishment.--Not later than 360 days after the date of
enactment of this Act, the Commission shall establish a group to be
known as the ``Advisory Group on Language-Sensitive Technologies''
(referred to in this section as the ``Advisory Group'').
(b) Duties.--
(1) In general.--The Advisory Group shall provide consensus
advice and guidance to the Commission on best practices for
private enterprises or public entities using covered technology
that may have different performance outcomes depending on the
underlying language of the content being analyzed in order to
ensure the nondiscriminatory application of such technology.
(2) Covered technology.--For purposes of paragraph (1), the
term ``covered technology'' means technology used to--
(A) detect and process input language from sources,
such as analog text and audio, into a machine-readable
format, such as speech and optical character
recognition;
(B) process language stored in a machine-readable
format, such as natural language processing; and
(C) detect and process images and videos into a
machine-readable format, or process images or videos
stored in a machine-readable format.
(3) Membership.--The Commission shall appoint the members
of the Advisory Group. In making such appointments, the
Commission shall provide that the membership of the Advisory
Group--
(A) includes different points of view and
background experience; and
(B) includes both Federal employees and non-Federal
employee stakeholders, including representatives of
communities most impacted by the systemic risks of
harmful non-English language content and current or
former content moderators and employees of covered
platforms.
(4) Report.--The Commission shall make available on its
website the findings of the Advisory Group with recommendations
and best practices as reported by the Advisory Group concerning
the use of covered technology.
(c) Non-Applicability of the Federal Advisory Committee Act.--
Chapter 10 of title 5, United States Code, shall not apply to the
Advisory Group.
(d) Authorization of Appropriations.--There is authorized to be
appropriated to the Advisory Group such sums as are necessary to carry
out the requirements of this section.
SEC. 7. ENFORCEMENT.
(a) Enforcement by the Federal Trade Commission.--
(1) Unfair or deceptive acts or practices.--A violation of
section 3, 4, or 5 shall be treated as a violation of a rule
defining an unfair or a deceptive act or practice under section
18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C.
57a(a)(1)(B)).
(2) Powers of commission.--
(A) In general.--The Commission shall enforce this
Act in the same manner, by the same means, and with the
same jurisdiction, powers, and duties as though all
applicable terms and provisions of the Federal Trade
Commission Act (15 U.S.C. 41 et seq.) were incorporated
into and made a part of this Act.
(B) Privileges and immunities.--Any person who
violates section 3, 4, or 5 shall be subject to the
penalties and entitled to the privileges and immunities
provided in the Federal Trade Commission Act (15 U.S.C.
41 et seq.).
(C) Authority preserved.--Nothing in this Act shall
be construed to limit the authority of the Federal
Trade Commission under any other provision of law.
(b) Enforcement by States.--
(1) In general.--In any case in which the attorney general
of a State has reason to believe that an interest of the
residents of the State has been or is threatened or adversely
affected by the engagement of any person subject to section 3
or 5 in a practice that violates such section, the attorney
general of the State may, as parens patriae, bring a civil
action on behalf of the residents of the State in an
appropriate district court of the United States--
(A) to enjoin further violation of such section by
such person;
(B) to compel compliance with such section; and
(C) to obtain damages, restitution, or other
compensation on behalf of such residents.
(2) Scope of jurisdiction.--The attorney general of a State
may not bring a civil action under this subsection against a
person for a violation of section 3 or 5 if the Commission
would not be able to bring an enforcement action against the
person for such violation under subsection (a) because the
person is exempt from coverage under the Federal Trade
Commission Act (15 U.S.C. 41 et seq.).
(3) Rights of federal trade commission.--
(A) Notice to federal trade commission.--
(i) In general.--Except as provided in
clause (iii), the attorney general of a State
shall notify the Commission in writing that the
attorney general intends to bring a civil
action under paragraph (1) before initiating
the civil action.
(ii) Contents.--The notification required
by clause (i) with respect to a civil action
shall include a copy of the complaint to be
filed to initiate the civil action.
(iii) Exception.--If it is not feasible for
the attorney general of a State to provide the
notification required by clause (i) before
initiating a civil action under paragraph (1),
the attorney general shall notify the
Commission immediately upon instituting the
civil action.
(B) Intervention by federal trade commission.--The
Commission may--
(i) intervene in any civil action brought
by the attorney general of a State under
paragraph (1); and
(ii) upon intervening--
(I) be heard on all matters arising
in the civil action; and
(II) file petitions for appeal.
(4) Investigatory powers.--Nothing in this subsection may
be construed to prevent the attorney general of a State from
exercising the powers conferred on the attorney general by the
laws of the State to conduct investigations, to administer
oaths or affirmations, or to compel the attendance of witnesses
or the production of documentary or other evidence.
(5) Preemptive action by federal trade commission.--If the
Commission institutes a civil action or an administrative
action with respect to a violation of section 3 or 5, the
attorney general of a State may not, during the pendency of
such action, bring a civil action under paragraph (1) against
any defendant named in the complaint of the Commission for the
violation with respect to which the Commission instituted such
action.
(6) Venue; service of process.--
(A) Venue.--Any action brought under paragraph (1)
may be brought in--
(i) the district court of the United States
that meets applicable requirements relating to
venue under section 1391 of title 28, United
States Code; or
(ii) another court of competent
jurisdiction.
(B) Service of process.--In an action brought under
paragraph (1), process may be served in any district in
which the defendant--
(i) is an inhabitant; or
(ii) may be found.
(7) Actions by other state officials.--
(A) In general.--In addition to civil actions
brought by attorneys general under paragraph (1), any
other consumer protection officer of a State who is
authorized by the State to do so may bring a civil
action under paragraph (1), subject to the same
requirements and limitations that apply under this
subsection to civil actions brought by attorneys
general.
(B) Savings provision.--Nothing in this subsection
may be construed to prohibit an authorized official of
a State from initiating or continuing any proceeding in
a court of the State for a violation of any civil or
criminal law of the State.
SEC. 8. REGULATIONS.
    (a) In General.--The Commission shall, pursuant to section 553 of
title 5, United States Code, promulgate--
(1) regulations to carry out the provisions of sections 3
and 4; and
(2) such other regulations as the Commission determines
necessary to carry out the provisions of this Act.
(b) Timing.--The Commission shall begin the rulemaking process for
promulgating regulations to carry out the provisions of sections 3 and
4 not later than 120 days after the date of enactment of this Act.
SEC. 9. EFFECTIVE DATES.
The requirements of sections 3 and 4 shall take effect 120 days
after the promulgation by the Commission of regulations to carry out
such sections, and the requirements of section 5 shall take effect 120
days after the date of enactment of this Act.
SEC. 10. INTERNATIONAL ONLINE COMMUNICATION RESEARCH ACTIVITIES PILOT
PROGRAM.
(a) In General.--The Administrator of the United States Agency for
International Development (referred to in this section as ``USAID'')
shall, in coordination with the Secretary of State, evaluate and
prioritize support to select countries, from among the countries
eligible for assistance from USAID, for research and programming, such
as tool development, civil society capacity building, and other
activities, aimed at addressing the prevalence and impacts of non-
English online communication that--
(1) promotes hate, harassment, or abuse of racial, ethnic,
gender, religious, or sexual minorities;
(2) incites violence; or
(3) is false, misleading, or intended to harm--
(A) targeted individuals;
(B) public health;
(C) democratic integrity;
(D) civil rights;
(E) humanitarian response;
(F) economic integrity; or
(G) public safety.
(b) Authorized Activities.--
(1) In general.--In carrying out subsection (a), the
Administrator may--
(A) build lexicons of terms and phrases commonly
used in communications described in subsection (a);
(B) identify and improve the understanding of how
real or falsified text, videos, or imagery are being
used to spread hate, abuse, scams, fraud, and false or
misleading information in non-English languages;
(C) strengthen the capacities of civil society,
local private sector, academia, and governments to
develop and implement activities focused on preventing,
mitigating, or responding to non-English online
communication that is hateful, abusive, fraudulent,
false, or misleading; and
(D) improve awareness and the abilities of the
civil society and governments of countries that receive
support under subsection (a) to discover and interpret
non-English online communication that is hateful,
abusive, fraudulent, false, or misleading that--
(i) is perpetuated or sponsored by malign
actors or extremist organizations;
(ii) is influenced, regulated, or moderated
by governments, social media companies, and
internet service providers;
                    (iii) is perceived by, or impacts, the
                target or other consumers of online
                communications or specific communities; and
(iv) leads to economic, mental, physical,
or other harms at the individual, household,
organization, or community levels.
(2) Locally led requirement.--Recipients of not less than
50 percent of the amounts appropriated pursuant to subsection
(d) shall substantially engage with organizations led by
individuals who--
(A) are living in a place from which communication
described in subsection (a) originates or to which such
communication is targeted;
(B) are familiar with the cultural context in such
a place; and
(C) have experience researching or working to
address such digital or online communication.
(3) Intersectionality requirement.--Research funded by
amounts appropriated pursuant to subsection (d) shall focus on
better understanding how online communication that is hateful
or abusive, incites violence, violates relevant data privacy
laws, divulges personal information, or involves false or
misleading information has a disparate impact on people who are
members of racial, ethnic, gender, religious, or sexual
minorities in their communities, including women, indigenous
populations, and people who identify as lesbian, gay, bisexual,
transgender, queer, intersex, or as another sexual minority.
(c) Reporting Requirement.--
(1) In general.--Not later than 120 days after all of the
programs receiving funding appropriated pursuant to subsection
(d) are terminated, the Administrator of the United States
Agency for International Development shall provide a briefing,
and submit a report, to the Committee on Foreign Relations of
the Senate, the Committee on Commerce, Science, and
Transportation of the Senate, the Committee on Foreign Affairs
of the House of Representatives, and the Committee on Energy
and Commerce of the House of Representatives describing the
findings of the research conducted by such programs and the
outcomes of the activities carried out by such programs.
(2) Public availability.--The report required under
paragraph (1) shall be made publicly available on a text-based
and searchable internet website.
(d) Authorization of Appropriations.--There is authorized to be
appropriated $3,000,000 in each of the fiscal years 2024 and 2025 to
carry out this section.
SEC. 11. DEFINITIONS.
In this Act:
(1) Commission.--The term ``Commission'' means the Federal
Trade Commission.
(2) Covered platform.--The term ``covered platform'' means
a website, internet application, or mobile internet application
that--
(A) allows users to create, share, view, or search
for and access user-generated or third-party content,
including a social media platform, online search
engine, and a service with direct or group messaging
capabilities; and
(B) has had at least 10,000,000 monthly active
users for 3 or more of the past 12 months within the
United States.
(3) Monetization practices.--The term ``monetization
practices'' means any avenues through which a covered platform
might garner revenue, including accepting monetary, in-kind, or
other compensation--
(A) in exchange for displaying or amplifying
specific content; or
(B) from businesses or other entities to utilize
the covered platform as a means to find, charge, or
communicate with customers.
(4) Platform policies.--The term ``platform policies''
means any terms, conditions, and clauses, regardless of their
name or form, which govern the contractual relationship between
a covered platform and a user, or any community guidelines that
a covered platform maintains that govern conduct on the covered
platform.