[Congressional Bills 117th Congress]
[From the U.S. Government Publishing Office]
[H.R. 3611 Introduced in House (IH)]
117th CONGRESS
1st Session
H. R. 3611
To prohibit the discriminatory use of personal information by online
platforms in any algorithmic process, to require transparency in the
use of algorithmic processes and content moderation, and for other
purposes.
_______________________________________________________________________
IN THE HOUSE OF REPRESENTATIVES
May 28, 2021
Ms. Matsui introduced the following bill; which was referred to the
Committee on Energy and Commerce
_______________________________________________________________________
A BILL
To prohibit the discriminatory use of personal information by online
platforms in any algorithmic process, to require transparency in the
use of algorithmic processes and content moderation, and for other
purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Algorithmic Justice and Online
Platform Transparency Act''.
SEC. 2. FINDINGS.
Congress finds the following:
(1) Online platforms have become integral to individuals'
full participation in economic, democratic, and societal
processes.
(2) Online platforms employ manipulative dark patterns,
collect large amounts of personal information from their users,
and leverage that personal information for opaque algorithmic
processes in ways that create vastly different experiences for
different types of users.
(3) Algorithmic processes are often used by online
platforms without adequate testing and in the absence of
critical transparency requirements and other legally
enforceable safety and efficacy standards, which has resulted
in discrimination in housing, lending, job advertising, and
other areas of opportunity.
(4) The use of discriminatory algorithmic processes causes
disproportionate harm to populations that already experience
marginalization.
(5) Online platforms constantly engage in content
moderation decision making, resulting in highly influential
outcomes regarding what content is visible and accessible to
users.
(6) Online platforms' content moderation practices have
disproportionately significant repercussions for members of
marginalized communities, who have historically been the target
of nefarious online activity, including disinformation
campaigns.
(7) Users of online platforms should have access to
understandable information about how online platforms moderate
content and use algorithmic processes to amplify or recommend
content.
(8) Users of online platforms should be able to easily move
their data to alternative online platforms, and the importance
of this right is particularly significant given certain online
platforms' use of harmful algorithmic processes and engagement
in ineffective content moderation.
(9) In a variety of sectors, algorithmic processes also
facilitate discriminatory outcomes on online platforms that
individuals may not personally interact with, but which
nonetheless process the personal information of such
individuals and have significant, negative consequences.
(10) The people of the United States would benefit from the
convening of experts from a diverse set of governmental
positions to collectively study and report on discriminatory
algorithmic processes across the United States economy and
society, with particular attention to intersections of harm.
SEC. 3. DEFINITIONS.
In this Act, the following definitions apply:
(1) Algorithmic process.--The term ``algorithmic process''
means a computational process, including one derived from
machine learning or other artificial intelligence techniques,
that processes personal information or other data for the
purpose of determining the order or manner in which a set of
information is provided to, recommended to, or withheld from a
user of an online platform, including the provision of
commercial content, the display of social media posts, or any
other method of automated decision making, content selection,
or content amplification.
(2) Biometric information.--The term ``biometric
information''--
(A) means information regarding the physiological
or biological characteristics of an individual that may
be used, singly or in combination with each other or
with other identifying data, to establish the identity
of an individual; and
(B) includes--
(i) genetic information;
(ii) imagery of the iris, retina,
fingerprint, face, hand, palm, vein patterns,
and voice recordings, from which an identifier
template, such as a faceprint, a minutiae
template, or a voiceprint, can be extracted;
(iii) keystroke patterns or rhythms, gait
patterns or rhythms, and sleep, health, or
exercise data that contain identifying
information; and
(iv) any mathematical code, profile, or
algorithmic model derived from information
regarding the physiological or biological
characteristics of an individual.
(3) Commission.--The term ``Commission'' means the Federal
Trade Commission.
(4) Content moderation.--The term ``content moderation''
means--
(A) the intentional deletion, labeling, or editing
of user generated content or a process of purposefully
decreasing access to such content through the human
labor of any individual who is financially compensated
by an online platform, an automated process, or some
combination thereof, pursuant to the online platform's
terms of service or stated community standards; and
(B) such other practices as the Commission may
identify under regulations promulgated under section
553 of title 5, United States Code.
(5) De-identified.--The term ``de-identified'', with
respect to personal information, means information that has
been altered, anonymized, or aggregated so that it cannot
reasonably identify, relate to, describe, or be capable of
being associated with or linked to, directly or indirectly, a
particular individual or device.
(6) Demographic information.--The term ``demographic
information'' means information regarding an individual's or
class of individuals' race, color, ethnicity, sex, religion,
national origin, age, gender, gender identity, sexual
orientation, disability status, familial status, immigration
status, educational attainment, income, source of income,
occupation, employment status, biometric information, criminal
record, credit rating, or any categorization used by the online
platform derived from such information.
(7) Group.--The term ``group'' means a page or other
subdivision of an online platform that functions as a forum for
users to post or otherwise distribute content to, or
communicate with, other users of such page or other
subdivision.
(8) Non-precise geolocation information.--The term ``non-
precise geolocation information'' means information regarding a
country, State, county, city, or ZIP code.
(9) Online platform.--The term ``online platform'' means
any public-facing website, online service, online application,
or mobile application which is operated for commercial purposes
and provides a community forum for user generated content,
including a social network site, content aggregation service,
or service for sharing videos, images, games, audio files, or
other content.
(10) Personal information.--
(A) In general.--The term ``personal information''
means information that directly or indirectly
identifies, or could be reasonably linked to, a
particular individual or device.
(B) Reasonably linked.--For purposes of
subparagraph (A), information could be reasonably
linked to an individual or device if such information
can be used on its own or in combination with other
information held by, or readily accessible to, a person
to identify an individual or device.
(11) Place of public accommodation.--The term ``place of
public accommodation'' means--
(A) any entity considered a place of public
accommodation under section 201(b) of the Civil Rights
Act of 1964 (42 U.S.C. 2000a(b)) or section 301 of the
Americans with Disabilities Act of 1990 (42 U.S.C.
12181); or
(B) any commercial entity that offers goods or
services through the internet to the general public.
(12) Small business.--
(A) In general.--The term ``small business'' means
a commercial entity that establishes, with respect to
the 3 preceding calendar years (or since the inception
of such entity if such period is less than 3 calendar
years), that the entity--
(i) maintains an average annual gross
revenue of less than $25,000,000;
(ii) on average, annually processes the
personal information of fewer than 100,000
individuals, households, or devices used by
individuals or households;
(iii) on average, derives 50 percent or
less of its annual revenue from transferring
the personal information of individuals; and
(iv) has fewer than 50 workers at any time
during such period.
(B) Common control or branding.--For purposes of
subparagraph (A), the amounts at issue shall include
the activity of any person that controls, is controlled
by, is under common control with, or shares common
branding with such commercial entity.
(13) User generated content.--The term ``user generated
content'' means any content, including text, images, videos,
reviews, profiles, games, or audio content, that is made or
created (including through a form, template, or other process
provided by the online platform) and posted on an online
platform by a user of the online platform.
SEC. 4. TRANSPARENCY.
(a) Notice and Review of Algorithmic Process.--Beginning 1 year
after the date of enactment of this Act, any online platform that
employs, operates, or otherwise utilizes an algorithmic process to
withhold, amplify, recommend, or promote content (including a group) to
a user of the online platform shall comply with the following
requirements:
(1) Required notice.--
(A) In general.--With respect to each type of
algorithmic process utilized by an online platform,
such online platform shall disclose the following
information to users of the online platform in
conspicuous, accessible, and plain language that is not
misleading:
(i) The categories of personal information
the online platform collects or creates for
purposes of the type of algorithmic process.
(ii) The manner in which the online
platform collects or creates such personal
information.
(iii) How the online platform uses such
personal information in the type of algorithmic
process.
(iv) The method by which the type of
algorithmic process prioritizes, assigns weight
to, or ranks different categories of personal
information to withhold, amplify, recommend, or
promote content (including a group) to a user.
(B) Language of required notice.--Such online
platform shall make available the notice described in
subparagraph (A) in each language in which the online
platform provides services.
(C) Rulemaking.--The Commission shall conduct a
rulemaking to identify each type of algorithmic process
for which an online platform is required to disclose
the information described in subparagraph (A).
(2) Review of algorithmic process.--
(A) Record of algorithmic process.--Subject to
subparagraph (B), such online platform shall, for 5
years, retain a record that describes--
(i) the categories of personal information
used by the type of algorithmic process;
(ii) the method by which the type of
algorithmic process weighs or ranks certain
categories of personal information;
(iii) the method by which the online
platform develops its type of algorithmic
process, including--
(I) a description of any personal
information or other data used in such
development;
(II) an explanation of any personal
information or other data used to train
the type of algorithmic process on an
ongoing basis; and
(III) a description of how the type
of algorithmic process was tested for
accuracy, fairness, bias, and
discrimination; and
(iv) if the online platform (except for a
small business) utilizes an algorithmic process
that relates to opportunities for housing,
education, employment, insurance, credit, or
the access to or terms of use of any place of
public accommodation, an assessment of whether
the type of algorithmic process produces
disparate outcomes on the basis of an
individual's or class of individuals' actual or
perceived race, color, ethnicity, sex,
religion, national origin, gender, gender
identity, sexual orientation, familial status,
biometric information, or disability status.
(B) Additional requirements.--
(i) Requirement to de-identify personal
information.--The record described in
subparagraph (A) shall not include any personal
information other than de-identified personal
information.
(ii) Extension of record retention.--An
online platform shall retain the record
described in subparagraph (A) for up to an
additional 3 years if the Commission determines
that the online platform poses a reasonable
risk of engaging in repeated violations of this
Act or of unlawful discrimination as a result
of its use of an algorithmic process.
(C) Review of record.--Upon the request of the
Commission, an online platform shall make available to
the Commission the complete record described in
subparagraph (A).
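As an illustrative sketch only (not part of the bill text), a
platform preparing the disparate-outcome assessment described in
subsection (a)(2)(A)(iv) might tabulate favorable-outcome rates by
demographic group as in the following Python fragment; the group
labels, the function name, and the 0.8 flagging threshold (borrowed
from the EEOC four-fifths guideline) are assumptions, as the Act
prescribes no numeric test.

    from collections import defaultdict

    def disparate_outcome_assessment(decisions, threshold=0.8):
        # `decisions`: iterable of (group_label, favorable: bool)
        # pairs. The 0.8 threshold mirrors the EEOC four-fifths
        # guideline and is an assumption only.
        totals = defaultdict(int)
        favorable = defaultdict(int)
        for group, outcome in decisions:
            totals[group] += 1
            if outcome:
                favorable[group] += 1
        rates = {g: favorable[g] / totals[g] for g in totals}
        best = max(rates.values())
        # Flag any group whose favorable-outcome rate falls below
        # the threshold relative to the best-performing group.
        flagged = {g: r for g, r in rates.items()
                   if r < threshold * best}
        return rates, flagged

    rates, flagged = disparate_outcome_assessment([
        ("group_a", True), ("group_a", True), ("group_a", False),
        ("group_b", True), ("group_b", False), ("group_b", False),
    ])
    print(rates)    # favorable-outcome rate per group
    print(flagged)  # groups showing a potential disparate outcome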
(b) Notice of Content Moderation Practices.--
(1) Notice.--
(A) In general.--Beginning 1 year after the date of
enactment of this Act, any online platform shall
disclose to users of the online platform in
conspicuous, accessible, and plain language that is not
misleading a complete description of the online
platform's content moderation practices, including a
description of any type of automated content moderation
practices and content moderation practices that employ
human labor.
(B) Language of required notice.--Such online
platform shall make available the notice described in
subparagraph (A) in each language in which the online
platform provides services.
(2) Content moderation transparency reports.--
(A) In general.--Beginning 180 days after the date
of enactment of this Act, any online platform (except
for a small business) that engages in content
moderation shall publish, not less than annually, a
transparency report of its content moderation
practices.
(B) Requirements.--
(i) In general.--The transparency report
required under subparagraph (A) shall include,
if applicable:
(I) The total number of content
moderation decisions for the applicable
period.
(II) The number of content
moderation decisions for the applicable
period broken down by:
(aa) Relevant policy, type,
or category of content
moderation undertaken by the
online platform.
(bb) Whether the content
moderation decision occurred in
response to information
regarding organized campaigns
or other coordinated behavior.
(cc) Aggregate demographic
information of users who
created the user generated
content subjected to content
moderation.
(dd) Aggregate demographic
information of users targeted
by an algorithmic process
involving content subjected to
content moderation.
(ee) Whether the content
moderation occurred through
automated practices, human
labor by the online platform,
labor by any individual who
does not work as a paid
employee of the online
platform, or any combination
thereof.
(ff) In the case of content
moderation that occurred
through human labor by any
individual who does not work
for the online platform, the
nature of such individual's
relationship to the online
platform (such as a user,
moderator, State actor, or
representative of an external
partner organization).
(gg) The number and
percentage of content
moderation decisions subject to
appeal or other form of
secondary review.
(hh) The number and
percentage of content
moderation decisions reversed
on appeal or other form of
secondary review.
(ii) The number of content
moderation decisions occurring
in response to a government
demand or request.
(jj) The number of
government demands or requests
for content moderation broken
down by Federal agency, State,
municipality, or foreign
nation.
(kk) The types of content
moderation decisions made.
(ll) Other information that
the Commission, by regulation,
deems appropriate.
(III) The ability to cross-
reference each of the different types
of information disclosed pursuant to
subclause (II).
(ii) Accessibility of report.--The
transparency report required under subparagraph
(A) shall be--
(I) publicly available to any
individual without such individual
being required to create a user
account;
(II) conspicuous;
(III) accessible;
(IV) not misleading; and
(V) available in each language in
which the online platform provides
services.
(iii) Accessibility of report data.--The
online platform shall--
(I) provide any data in the
transparency report required under
subparagraph (A) in a machine-readable
format; and
(II) allow anyone to freely copy
and use such data.
(3) Rule of construction.--Nothing in this subsection shall
require an online platform to collect personal information that
the online platform would not otherwise collect.
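As an illustrative sketch of the machine-readable format
contemplated by paragraph (2)(B)(iii), a platform might publish its
report data as structured JSON; every field name below is an
assumption, since the Act leaves the schema to the Commission.

    import json

    # Hypothetical report skeleton; field names are illustrative.
    report = {
        "period": "2024",
        "total_decisions": 12345,                      # subclause (I)
        "by_policy": {"spam": 9000, "harassment": 3345},
        "by_method": {"automated": 11000, "human": 1345},
        "appeals": {"appealed": 120, "reversed": 15},
        "government_requests": {"federal": 3, "state": 2,
                                "foreign": 1},
    }
    print(json.dumps(report, indent=2))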
(c) Advertisement Library.--Beginning 180 days after the date of
enactment of this Act, any online platform (except for a small
business) that uses personal information in combination with an
algorithmic process to sell or publish an advertisement shall take all
reasonable steps to maintain a library of such advertisements. The
library shall--
(1) be--
(A) publicly available to any individual without
such individual being required to create a user
account;
(B) conspicuous;
(C) accessible;
(D) not misleading; and
(E) available in each language in which the online
platform provides services;
(2) present information in both human- and machine-readable
formats;
(3) allow any individual to freely copy and use the
information contained in the library;
(4) at a minimum, be searchable by date, location, topic,
cost, advertiser, keyword, information disclosed pursuant to
paragraph (6), or any other criteria that the Commission, by
regulation, deems appropriate;
(5) contain copies of all advertisements sold or published
by the online platform for 2 years following the sale or
publishing of each advertisement; and
(6) for each advertisement entry, include--
(A) the content of the advertisement;
(B) all targeting criteria selected by the
advertiser, including demographic information and non-
precise geolocation information (except in the event
that including a specific criterion would disclose
personal information);
(C) any data the online platform provided to the
advertiser regarding to whom it sold or published the
advertisement, including demographic information and
non-precise geolocation information (except in the
event that including specific data would disclose
personal information); and
(D) the name of the advertiser, the cost of the
advertisement, the dates the advertisement was
displayed on the online platform, and any other
information that the Commission, by regulation, deems
appropriate.
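As an illustrative sketch, an advertisement library entry covering
the fields in paragraph (6), with a minimal search over two of the
criteria named in paragraph (4), might resemble the following; all
class, field, and function names are assumptions, not requirements
of the Act.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class AdEntry:
        # Fields loosely track paragraph (6); names are illustrative.
        content: str
        advertiser: str
        cost_usd: float
        first_shown: date
        last_shown: date
        targeting: dict = field(default_factory=dict)

    def search(library, advertiser=None, keyword=None):
        # Minimal search over two of the paragraph (4) criteria.
        for ad in library:
            if advertiser and ad.advertiser != advertiser:
                continue
            if keyword and keyword.lower() not in ad.content.lower():
                continue
            yield ad

    library = [AdEntry("Apartments for rent downtown", "Acme Realty",
                       500.0, date(2021, 6, 1), date(2021, 6, 30),
                       {"location": "ZIP 95814"})]
    for ad in search(library, keyword="rent"):
        print(ad.advertiser, ad.cost_usd)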
(d) Certification.--Not later than 30 days after making any
disclosure required by subsection (a)(1), (b), or (c), and annually
thereafter, an online platform shall certify the accuracy and
completeness of such disclosure. Such certification shall--
(1) be signed, under oath, by the online platform's chief
executive officer, chief privacy officer, chief operating
officer, chief information security officer, or another senior
officer of equivalent stature;
(2) attest that the officer described in paragraph (1) has
personal knowledge sufficient to make such certification; and
(3) in addition to any annual certification, be issued with
any material change (which shall not include routine additions
to or maintenance of entries in the advertising library
pursuant to subsection (c)).
SEC. 5. RIGHT TO DATA PORTABILITY.
In promulgating regulations under this Act, the Commission shall
require an online platform, if the online platform retains the personal
information of a user, to provide to the user access to the personal
information retained in the form of a portable electronic table that--
(1) is in a usable and searchable format; and
(2) allows the user to transfer such personal information
from one online platform to another without hindrance.
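As an illustrative sketch of the ``portable electronic table''
described above, a platform might export retained personal
information as CSV; the field names are assumptions, and the Act
leaves the precise format to the Commission's regulations.

    import csv
    import io

    def export_user_data(records):
        # `records`: list of dicts keyed by field name. CSV is one
        # plausible "usable and searchable" table format; the choice
        # of format rests with the Commission's regulations.
        if not records:
            return ""
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(records[0]))
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()

    print(export_user_data([{"user_id": "u1",
                             "email": "u1@example.com"}]))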
SEC. 6. PROHIBITED CONDUCT.
(a) Public Accommodations.--It shall be unlawful for an online
platform to employ any proprietary online platform design features,
including an algorithmic process, or otherwise process the personal
information of an individual in a manner that segregates, discriminates
in, or otherwise makes unavailable the goods, services, facilities,
privileges, advantages, or accommodations of any place of public
accommodation on the basis of an individual's or class of individuals'
actual or perceived race, color, ethnicity, religion, national origin,
sex, gender, gender identity, sexual orientation, familial status,
biometric information, or disability status.
(b) Equal Opportunity.--It shall be unlawful for an online platform
to employ any proprietary online platform design features, including an
algorithmic process, or otherwise process the personal information of
an individual for the purpose of advertising, marketing, soliciting,
offering, selling, leasing, licensing, renting, or otherwise
commercially contracting for housing, employment, credit, insurance,
healthcare, or education opportunities in a manner that discriminates
against or otherwise makes the opportunity unavailable on the basis of
an individual's or class of individuals' actual or perceived race,
color, ethnicity, religion, national origin, sex, gender, gender
identity, sexual orientation, familial status, biometric information,
or disability status.
(c) Voting Rights.--It shall be unlawful for an online platform to
process personal information in a manner that intentionally deprives,
defrauds, or attempts to deprive or defraud any individual of their
free and fair exercise of the right to vote in a Federal, State, or
local election. Such manner includes:
(1) Intentional deception regarding--
(A) the time, place, or method of voting or
registering to vote;
(B) the eligibility requirements to vote or
register to vote;
(C) the counting of ballots;
(D) the adjudication of elections;
(E) explicit endorsements by any person or
candidate; or
(F) any other material information pertaining to
the procedures or requirements for voting or
registering to vote in a Federal, State, or local
election.
(2) Intentionally using deception, threats, intimidation,
fraud, or coercion to prevent, interfere with, retaliate
against, deter, or attempt to prevent, interfere with,
retaliate against, or deter an individual from--
(A) voting or registering to vote in a Federal,
State, or local election; or
(B) supporting or advocating for a candidate in a
Federal, State, or local election.
(d) Discriminatory Advertising.--
(1) In general.--Not later than 2 years after the date of
enactment of this Act, the Commission shall promulgate
regulations to define and prohibit unfair or deceptive acts or
practices with respect to advertising practices.
(2) Periodic review of regulations.--The Commission shall
review such regulations not less than once every 5 years and
update the regulations as appropriate.
(3) Considerations.--In promulgating regulations under this
subsection, the Commission shall consider:
(A) Established public policy, such as civil rights
laws, to prevent discrimination and promote equal
opportunity.
(B) The state of the art of advertising.
(C) Research of and methodologies for measuring
discrimination in advertising.
(D) The role of each actor in the advertising
ecosystem.
(E) Any harm caused by predatory or manipulative
advertising practices, including practices targeting
vulnerable populations.
(F) Whether, and at what age, a minor is able to
distinguish between editorial content and paid
advertisements.
(G) Methods for fairly promoting equal opportunity
in housing, employment, credit, insurance, education,
and healthcare through targeted outreach to
underrepresented populations in a fair and non-
deceptive manner.
(H) The needs of small businesses.
(I) Any other criteria the Commission deems
appropriate.
(e) Safety and Effectiveness of Algorithmic Processes.--
(1) In general.--It shall be unlawful for an online
platform to employ an algorithmic process in a manner that is
not safe and effective.
(2) Safe.--For purposes of paragraph (1), an algorithmic
process is safe--
(A) if the algorithmic process does not produce any
disparate outcome as described in the assessment
conducted under section 4(a)(2)(A)(iv); or
(B) if the algorithmic process does produce a
disparate outcome as described in the assessment
conducted under section 4(a)(2)(A)(iv), any such
disparate outcome is justified by a non-discriminatory,
compelling interest, and such interest cannot be
satisfied by less discriminatory means.
(3) Effective.--For purposes of paragraph (1), an
algorithmic process is effective if the online platform
employing or otherwise utilizing the algorithmic process has
taken reasonable steps to ensure that the algorithmic process
has the ability to produce its desired or intended result.
(f) Discrimination by Users of Online Platforms.--It shall be
unlawful for a user of an online platform to utilize an algorithmic
process on an online platform in a manner that--
(1) withholds, denies, deprives, or attempts to withhold,
deny, or deprive any individual of a right or privilege under
title II of the Civil Rights Act of 1964 (42 U.S.C. 2000a et
seq.);
(2) intimidates, threatens, coerces, or attempts to
intimidate, threaten, or coerce any individual with the purpose
of interfering with a right or privilege under title II of such
Act; or
(3) punishes or attempts to punish any individual for
exercising or attempting to exercise a right or privilege under
title II of such Act.
(g) Exceptions.--Nothing in this section shall limit an online
platform from processing personal information for the purpose of--
(1) good faith internal testing to prevent unlawful
discrimination, identify disparate outcomes or treatment, or
otherwise determine the extent or effectiveness of the online
platform's compliance with this Act; or
(2) advertising, marketing, or soliciting economic
opportunities (which shall not be of lower quality or contain
less desirable terms than similar opportunities the online
platform advertises, markets, or solicits to the general
population) to underrepresented populations in a fair and non-
deceptive manner.
(h) FTC Advisory Opinions.--An online platform may request guidance
from the Commission with respect to the online platform's potential
compliance with this Act, in accordance with the Commission's rules of
practice on advisory opinions.
(i) Preservation of Rights and Whistleblower Protections; Rules of
Construction.--
(1) No conditional service.--An online platform may not
condition or degrade the provision of a service or product to
an individual based on the individual's waiver of any right
guaranteed in this section.
(2) No arbitration agreement or waiver.--No pre-dispute
arbitration agreement or pre-dispute joint action waiver of any
right guaranteed in this section shall be valid or enforceable
with respect to a dispute arising under this Act. Any
determination as to the scope or manner of applicability of
this section shall be made by a court, rather than an
arbitrator, without regard to whether such agreement purports
to delegate such determination to an arbitrator.
(3) Whistleblower protection.--An online platform may not,
directly or indirectly, discharge, demote, suspend, threaten,
harass, or in any other manner discriminate against an
individual for reporting or attempting to report a violation of
this section.
(4) Rule of construction.--Nothing in this section shall be
construed to affect the application of section 230 of the
Communications Act of 1934 (commonly known as ``section 230 of
the Communications Decency Act of 1996'') (47 U.S.C. 230) to an
online platform or otherwise impose on an online platform legal
liability for user generated content.
SEC. 7. INTERAGENCY TASK FORCE ON ALGORITHMIC PROCESSES ON ONLINE
PLATFORMS.
(a) Establishment.--The Commission shall establish an interagency
task force on algorithmic processes on online platforms (referred to in
this section as the ``Task Force'') for the purpose of examining the
discriminatory use of personal information by online platforms in
algorithmic processes.
(b) Membership.--
(1) In general.--The Task Force established under this
section shall include representatives from--
(A) the Commission;
(B) the Department of Education;
(C) the Department of Justice;
(D) the Department of Labor;
(E) the Department of Housing and Urban
Development;
(F) the Department of Commerce;
(G) the Department of Health and Human Services;
(H) the Department of Veterans Affairs;
(I) the Equal Employment Opportunity Commission;
(J) the Consumer Financial Protection Bureau;
(K) the Federal Communications Commission;
(L) the Federal Election Commission; and
(M) the White House Office of Science and
Technology Policy.
(2) Chair.--The Task Force shall be co-chaired by 1
representative of the Commission and 1 representative of the
Department of Justice.
(3) Staff.--The Task Force shall hire such other personnel,
including individuals with expertise in the intersection of
civil rights and technology, as may be appropriate to enable
the Task Force to perform its duties.
(c) Study and Report.--
(1) Study.--The Task Force shall conduct a study on the
discriminatory use of personal information by online platforms
in algorithmic processes. Such study shall include the
following:
(A) Discriminatory use of personal information in
the advertisement of (including the withholding of an
advertisement) housing opportunities.
(B) Discriminatory use of personal information in
the advertisement of (including the withholding of an
advertisement) credit, lending, or other financial
services opportunities.
(C) Discriminatory use of personal information in
the advertisement of (including the withholding of an
advertisement) employment opportunities.
(D) Discriminatory use of personal information in
the advertisement of (including the withholding of an
advertisement) education opportunities.
(E) Discriminatory use of personal information in
the advertisement of (including the withholding of an
advertisement) insurance opportunities.
(F) Discriminatory use of personal information or
biometric information by employers in the surveillance
or monitoring of workers.
(G) Discriminatory use of personal information on
online platforms involved in hiring screening
practices.
(H) Discriminatory use of personal information or
biometric information in education, including the use
of--
(i) student personal information for
predictive forecasting on student ability or
potential for purposes of admissions decisions;
and
(ii) automated proctoring software that
monitors, analyzes, or otherwise processes
student biometric information to identify
suspicious behavior, including any
discriminatory outcomes associated with the use
of such software.
(I) Discriminatory use of user biometric
information.
(J) Use of personal information by disinformation
campaigns for the purpose of political
disenfranchisement.
(K) Any other discriminatory use of personal
information.
(2) Report.--Not later than 180 days after the date of
enactment of this Act, and biennially thereafter, the Task
Force shall submit to Congress a report containing the results
of the study conducted under paragraph (1), together with
recommendations for such legislation and administrative action
as the Task Force determines appropriate.
(d) Funding.--Out of any money in the Treasury not otherwise
appropriated, there are appropriated to the Commission such sums as are
necessary to carry out this section. Amounts appropriated under the
preceding sentence shall remain available until expended.
SEC. 8. ENFORCEMENT.
(a) Enforcement by the Commission.--
(1) Unfair or deceptive acts or practices.--A violation of
this Act or a regulation promulgated under this Act shall be
treated as a violation of a rule defining an unfair or
deceptive act or practice under section 18(a)(1)(B) of the
Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(2) Powers of the commission.--
(A) In general.--The Commission shall enforce this
Act in the same manner, by the same means, and with the
same jurisdiction, powers, and duties as though all
applicable terms and provisions of the Federal Trade
Commission Act (15 U.S.C. 41 et seq.) were incorporated
into and made a part of this section.
(B) Privileges and immunities.--Any person who
violates this Act or a regulation promulgated under
this Act shall be subject to the penalties and entitled
to the privileges and immunities provided in the
Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(C) Authority preserved.--Nothing in this Act shall
be construed to limit the authority of the Commission
under any other provision of law.
(3) Rulemaking.--The Commission shall promulgate in
accordance with section 553 of title 5, United States Code,
such rules as may be necessary to carry out this Act.
(b) Enforcement by States.--
(1) Authorization.--Subject to paragraph (2), in any case
in which the attorney general of a State has reason to believe
that an interest of the residents of the State has been or is
adversely affected by the engagement of any person in an act or
practice that violates this Act or a regulation promulgated
under this Act, the attorney general of the State may, as
parens patriae, bring a civil action on behalf of the residents
of the State in an appropriate district court of the United
States to--
(A) enjoin that act or practice;
(B) enforce compliance with this Act or the
regulation;
(C) obtain damages, civil penalties, restitution,
or other compensation on behalf of the residents of the
State; or
(D) obtain such other relief as the court may
consider to be appropriate.
(2) Rights of the commission.--
(A) Notice to the commission.--
(i) In general.--Except as provided in
clause (iii), the attorney general of a State
shall notify the Commission in writing that the
attorney general intends to bring a civil
action under paragraph (1) before initiating
the civil action against a person subject to
this Act.
(ii) Contents.--The notification required
by clause (i) with respect to a civil action
shall include a copy of the complaint to be
filed to initiate the civil action.
(iii) Exception.--If it is not feasible for
the attorney general of a State to provide the
notification required by clause (i) before
initiating a civil action under paragraph (1),
the attorney general shall notify the
Commission immediately upon instituting the
civil action.
(B) Intervention by the commission.--The Commission
may--
(i) intervene in any civil action brought
by the attorney general of a State under
paragraph (1); and
(ii) upon intervening--
(I) be heard on all matters arising
in the civil action; and
(II) file petitions for appeal of a
decision in the civil action.
(3) Investigatory powers.--Nothing in this subsection may
be construed to prevent the attorney general of a State from
exercising the powers conferred on the attorney general by the
laws of the State to conduct investigations, to administer
oaths or affirmations, or to compel the attendance of witnesses
or the production of documentary or other evidence.
(4) Action by the commission.--If the Commission institutes
a civil action with respect to a violation of this Act, the
attorney general of a State may not, during the pendency of the
action, bring a civil action under paragraph (1) against any
defendant named in the complaint of the Commission for the
violation with respect to which the Commission instituted such
action.
(5) Venue; service of process.--
(A) Venue.--Any action brought under paragraph (1)
may be brought in the district court of the United
States that meets applicable requirements relating to
venue under section 1391 of title 28, United States
Code.
(B) Service of process.--In an action brought under
paragraph (1), process may be served in any district in
which the defendant--
(i) is an inhabitant; or
(ii) may be found.
(c) Enforcement by the Department of Justice.--
(1) In general.--The Attorney General may bring a civil
action to enforce section 6(a), (b), (c), (e), (f), or (i) in
an appropriate district court of the United States.
(2) Coordination with the commission.--The Attorney General
shall, when reasonable and appropriate, consult and coordinate
with the Commission on a civil action brought under paragraph
(1).
(3) Relief.--In any civil action brought under paragraph
(1), the court may impose injunctive relief, declaratory
relief, damages, civil penalties, restitution, and any other
relief the court deems appropriate.
(d) Enforcement by Individuals.--
(1) In general.--Any individual alleging a violation of
section 6(a), (b), or (c), or a regulation promulgated
thereunder, may bring a civil action in any court of competent
jurisdiction, State or Federal.
(2) Relief.--In a civil action brought under paragraph (1)
in which the plaintiff prevails, the court may award--
(A) an amount equal to $2,500 or actual damages,
whichever is greater;
(B) punitive damages;
(C) reasonable attorney's fees and litigation
costs; and
(D) any other relief, including injunctive or
declaratory relief, that the court determines
appropriate.