[Pages H1644-H1647]
From the Congressional Record Online through the Government Publishing Office [www.gpo.gov]

   TOOLS TO ADDRESS KNOWN EXPLOITATION BY IMMOBILIZING TECHNOLOGICAL 
                 DEEPFAKES ON WEBSITES AND NETWORKS ACT

  Mr. BILIRAKIS. Mr. Speaker, I move to suspend the rules and pass the 
bill (S. 146) to require covered platforms to remove nonconsensual 
intimate visual depictions, and for other purposes.
  The Clerk read the title of the bill.
  The text of the bill is as follows:

                                 S. 146

       Be it enacted by the Senate and House of Representatives of 
     the United States of America in Congress assembled,

     SECTION 1. SHORT TITLE.

       This Act may be cited as the ``Tools to Address Known 
     Exploitation by Immobilizing Technological Deepfakes on 
     Websites and Networks Act'' or the ``TAKE IT DOWN Act''.

     SEC. 2. CRIMINAL PROHIBITION ON INTENTIONAL DISCLOSURE OF 
                   NONCONSENSUAL INTIMATE VISUAL DEPICTIONS.

       (a) In General.--Section 223 of the Communications Act of 
     1934 (47 U.S.C. 223) is amended--
       (1) by redesignating subsection (h) as subsection (i); and
       (2) by inserting after subsection (g) the following:
       ``(h) Intentional Disclosure of Nonconsensual Intimate 
     Visual Depictions.--
       ``(1) Definitions.--In this subsection:
       ``(A) Consent.--The term `consent' means an affirmative, 
     conscious, and voluntary authorization made by an individual 
     free from force, fraud, duress, misrepresentation, or 
     coercion.
       ``(B) Digital forgery.--The term `digital forgery' means 
     any intimate visual depiction of an identifiable individual 
     created through the use of software, machine learning, 
     artificial intelligence, or any other computer-generated or 
     technological means, including by adapting, modifying, 
     manipulating, or altering an authentic visual depiction, 
     that, when viewed as a whole by a reasonable person, is 
     indistinguishable from an authentic visual depiction of the 
     individual.
       ``(C) Identifiable individual.--The term `identifiable 
     individual' means an individual--
       ``(i) who appears in whole or in part in an intimate visual 
     depiction; and
       ``(ii) whose face, likeness, or other distinguishing 
     characteristic (including a unique birthmark or other 
     recognizable feature) is displayed in connection with such 
     intimate visual depiction.
       ``(D) Interactive computer service.--The term `interactive 
     computer service' has the meaning given the term in section 
     230.
       ``(E) Intimate visual depiction.--The term `intimate visual 
     depiction' has the meaning given such term in section 1309 of 
     the Consolidated Appropriations Act, 2022 (15 U.S.C. 6851).
       ``(F) Minor.--The term `minor' means any individual under 
     the age of 18 years.
       ``(2) Offense involving authentic intimate visual 
     depictions.--
       ``(A) Involving adults.--Except as provided in subparagraph 
     (C), it shall be unlawful for any person, in interstate or 
     foreign commerce, to use an interactive computer service to 
     knowingly publish an intimate visual depiction of an 
     identifiable individual who is not a minor if--
       ``(i) the intimate visual depiction was obtained or created 
     under circumstances in which the person knew or reasonably 
     should have known the identifiable individual had a 
     reasonable expectation of privacy;
       ``(ii) what is depicted was not voluntarily exposed by the 
     identifiable individual in a public or commercial setting;
       ``(iii) what is depicted is not a matter of public concern; 
     and
       ``(iv) publication of the intimate visual depiction--

       ``(I) is intended to cause harm; or
       ``(II) causes harm, including psychological, financial, or 
     reputational harm, to the identifiable individual.

       ``(B) Involving minors.--Except as provided in subparagraph 
     (C), it shall be unlawful for any person, in interstate or 
     foreign commerce, to use an interactive computer service to 
     knowingly publish an intimate visual depiction of an 
     identifiable individual who is a minor with intent to--
       ``(i) abuse, humiliate, harass, or degrade the minor; or
       ``(ii) arouse or gratify the sexual desire of any person.
       ``(C) Exceptions.--Subparagraphs (A) and (B) shall not 
     apply to--
       ``(i) a lawfully authorized investigative, protective, or 
     intelligence activity of--

       ``(I) a law enforcement agency of the United States, a 
     State, or a political subdivision of a State; or
       ``(II) an intelligence agency of the United States;

       ``(ii) a disclosure made reasonably and in good faith--

       ``(I) to a law enforcement officer or agency;
       ``(II) as part of a document production or filing 
     associated with a legal proceeding;
       ``(III) as part of medical education, diagnosis, or 
     treatment or for a legitimate medical, scientific, or 
     education purpose;
       ``(IV) in the reporting of unlawful content or unsolicited 
     or unwelcome conduct or in pursuance of a legal, 
     professional, or other lawful obligation; or
       ``(V) to seek support or help with respect to the receipt 
     of an unsolicited intimate visual depiction;

       ``(iii) a disclosure reasonably intended to assist the 
     identifiable individual;
       ``(iv) a person who possesses or publishes an intimate 
     visual depiction of himself or herself engaged in nudity or 
     sexually explicit conduct (as that term is defined in section 
     2256(2)(A) of title 18, United States Code); or
       ``(v) the publication of an intimate visual depiction that 
     constitutes--

       ``(I) child pornography (as that term is defined in section 
     2256 of title 18, United States Code); or
       ``(II) a visual depiction described in subsection (a) or 
     (b) of section 1466A of title 18, United States Code 
     (relating to obscene visual representations of the sexual 
     abuse of children).

       ``(3) Offense involving digital forgeries.--
       ``(A) Involving adults.--Except as provided in subparagraph 
     (C), it shall be unlawful for any person, in interstate or 
     foreign commerce, to use an interactive computer service to 
     knowingly publish a digital forgery of an identifiable 
     individual who is not a minor if--
       ``(i) the digital forgery was published without the consent 
     of the identifiable individual;
       ``(ii) what is depicted was not voluntarily exposed by the 
     identifiable individual in a public or commercial setting;
       ``(iii) what is depicted is not a matter of public concern; 
     and
       ``(iv) publication of the digital forgery--

       ``(I) is intended to cause harm; or
       ``(II) causes harm, including psychological, financial, or 
     reputational harm, to the identifiable individual.

       ``(B) Involving minors.--Except as provided in subparagraph 
     (C), it shall be unlawful for any person, in interstate or 
     foreign commerce, to use an interactive computer service to 
     knowingly publish a digital forgery of an identifiable 
     individual who is a minor with intent to--
       ``(i) abuse, humiliate, harass, or degrade the minor; or
       ``(ii) arouse or gratify the sexual desire of any person.
       ``(C) Exceptions.--Subparagraphs (A) and (B) shall not 
     apply to--
       ``(i) a lawfully authorized investigative, protective, or 
     intelligence activity of--

       ``(I) a law enforcement agency of the United States, a 
     State, or a political subdivision of a State; or
       ``(II) an intelligence agency of the United States;

       ``(ii) a disclosure made reasonably and in good faith--

       ``(I) to a law enforcement officer or agency;
       ``(II) as part of a document production or filing 
     associated with a legal proceeding;
       ``(III) as part of medical education, diagnosis, or 
     treatment or for a legitimate medical, scientific, or 
     education purpose;
       ``(IV) in the reporting of unlawful content or unsolicited 
     or unwelcome conduct or in pursuance of a legal, 
     professional, or other lawful obligation; or
       ``(V) to seek support or help with respect to the receipt 
     of an unsolicited intimate visual depiction;

       ``(iii) a disclosure reasonably intended to assist the 
     identifiable individual;
       ``(iv) a person who possesses or publishes a digital 
     forgery of himself or herself engaged in nudity or sexually 
     explicit conduct (as that term is defined in section 
     2256(2)(A) of title 18, United States Code); or
       ``(v) the publication of an intimate visual depiction that 
     constitutes--

       ``(I) child pornography (as that term is defined in section 
     2256 of title 18, United States Code); or
       ``(II) a visual depiction described in subsection (a) or 
     (b) of section 1466A of title 18, United States Code 
     (relating to obscene visual representations of the sexual 
     abuse of children).

       ``(4) Penalties.--
       ``(A) Offenses involving adults.--Any person who violates 
     paragraph (2)(A) or (3)(A) shall be fined under title 18, 
     United States Code, imprisoned not more than 2 years, or 
     both.
       ``(B) Offenses involving minors.--Any person who violates 
     paragraph (2)(B) or (3)(B) shall be fined under title 18, 
     United States Code, imprisoned not more than 3 years, or 
     both.
       ``(5) Rules of construction.--For purposes of paragraphs 
     (2) and (3)--
       ``(A) the fact that the identifiable individual provided 
     consent for the creation of the intimate visual depiction 
     shall not establish that the individual provided consent for 
     the publication of the intimate visual depiction; and
       ``(B) the fact that the identifiable individual disclosed 
     the intimate visual depiction to another individual shall not 
     establish that the identifiable individual provided consent 
     for the publication of the intimate visual depiction by the 
     person alleged to have violated paragraph (2) or (3), 
     respectively.
       ``(6) Threats.--
       ``(A) Threats involving authentic intimate visual 
     depictions.--Any person who intentionally threatens to commit 
     an offense under paragraph (2) for the purpose of 
     intimidation, coercion, extortion, or to create mental 
     distress shall be punished as provided in paragraph (4).
       ``(B) Threats involving digital forgeries.--
       ``(i) Threats involving adults.--Any person who 
     intentionally threatens to commit
     an offense under paragraph (3)(A) for the purpose of 
     intimidation, coercion, extortion, or to create mental 
     distress shall be fined under title 18, United States Code, 
     imprisoned not more than 18 months, or both.
       ``(ii) Threats involving minors.--Any person who 
     intentionally threatens to commit an offense under paragraph 
     (3)(B) for the purpose of intimidation, coercion, extortion, 
     or to create mental distress shall be fined under title 18, 
     United States Code, imprisoned not more than 30 months, or 
     both.
       ``(7) Forfeiture.--
       ``(A) In general.--The court, in imposing a sentence on any 
     person convicted of a violation of paragraph (2) or (3), 
     shall order, in addition to any other sentence imposed and 
     irrespective of any provision of State law, that the person 
     forfeit to the United States--
       ``(i) any material distributed in violation of that 
     paragraph;
       ``(ii) the person's interest in property, real or personal, 
     constituting or derived from any gross proceeds of the 
     violation, or any property traceable to such property, 
     obtained or retained directly or indirectly as a result of 
     the violation; and
       ``(iii) any personal property of the person used, or 
     intended to be used, in any manner or part, to commit or to 
     facilitate the commission of the violation.
       ``(B) Procedures.--Section 413 of the Controlled Substances 
     Act (21 U.S.C. 853), with the exception of subsections (a) 
     and (d), shall apply to the criminal forfeiture of property 
     under subparagraph (A).
       ``(8) Restitution.--The court shall order restitution for 
     an offense under paragraph (2) or (3) in the same manner as 
     under section 2264 of title 18, United States Code.
       ``(9) Rule of construction.--Nothing in this subsection 
     shall be construed to limit the application of any other 
     relevant law, including section 2252 of title 18, United 
     States Code.''.
       (b) Defenses.--Section 223(e)(1) of the Communications Act 
     of 1934 (47 U.S.C. 223(e)(1)) is amended by striking ``or 
     (d)'' and inserting ``, (d), or (h)''.
       (c) Technical and Conforming Amendment.--Subsection (i) of 
     section 223 of the Communications Act of 1934 (47 U.S.C. 
     223), as so redesignated by subsection (a), is amended by 
     inserting ``Definitions.--'' before ``For purposes of this 
     section''.

     SEC. 3. NOTICE AND REMOVAL OF NONCONSENSUAL INTIMATE VISUAL 
                   DEPICTIONS.

       (a) In General.--
       (1) Notice and removal process.--
       (A) Establishment.--Not later than 1 year after the date of 
     enactment of this Act, a covered platform shall establish a 
     process whereby an identifiable individual (or an authorized 
     person acting on behalf of such individual) may--
       (i) notify the covered platform of an intimate visual 
     depiction published on the covered platform that--

       (I) includes a depiction of the identifiable individual; 
     and
       (II) was published without the consent of the identifiable 
     individual; and

       (ii) submit a request for the covered platform to remove 
     such intimate visual depiction.
       (B) Requirements.--A notification and request for removal 
     of an intimate visual depiction submitted under the process 
     established under subparagraph (A) shall include, in 
     writing--
       (i) a physical or electronic signature of the identifiable 
     individual (or an authorized person acting on behalf of such 
     individual);
       (ii) an identification of, and information reasonably 
     sufficient for the covered platform to locate, the intimate 
     visual depiction of the identifiable individual;
       (iii) a brief statement that the identifiable individual 
     has a good faith belief that any intimate visual depiction 
     identified under clause (ii) is not consensual, including any 
     relevant information for the covered platform to determine 
     the intimate visual depiction was published without the 
     consent of the identifiable individual; and
       (iv) information sufficient to enable the covered platform 
     to contact the identifiable individual (or an authorized 
     person acting on behalf of such individual).
       (2) Notice of process.--A covered platform shall provide on 
     the platform a clear and conspicuous notice, which may be 
     provided through a clear and conspicuous link to another web 
     page or disclosure, of the notice and removal process 
     established under paragraph (1)(A) that--
       (A) is easy to read and in plain language; and
       (B) provides information regarding the responsibilities of 
     the covered platform under this section, including a 
     description of how an individual can submit a notification 
     and request for removal.
       (3) Removal of nonconsensual intimate visual depictions.--
     Upon receiving a valid removal request from an identifiable 
     individual (or an authorized person acting on behalf of such 
     individual) using the process described in paragraph 
     (1)(A)(ii), a covered platform shall, as soon as possible, 
     but not later than 48 hours after receiving such request--
       (A) remove the intimate visual depiction; and
       (B) make reasonable efforts to identify and remove any 
     known identical copies of such depiction.
       (4) Limitation on liability.--A covered platform shall not 
     be liable for any claim based on the covered platform's good 
     faith disabling of access to, or removal of, material claimed 
     to be a nonconsensual intimate visual depiction based on 
     facts or circumstances from which the unlawful publishing of 
     an intimate visual depiction is apparent, regardless of 
     whether the intimate visual depiction is ultimately 
     determined to be unlawful or not.
       (b) Enforcement by the Commission.--
       (1) Unfair or deceptive acts or practices.--A failure to 
     reasonably comply with the notice and takedown obligations 
     under subsection (a) shall be treated as a violation of a 
     rule defining an unfair or a deceptive act or practice under 
     section 18(a)(1)(B) of the Federal Trade Commission Act (15 
     U.S.C. 57a(a)(1)(B)).
       (2) Powers of the commission.--
       (A) In general.--Except as provided in subparagraph (D), 
     the Commission shall enforce this section in the same manner, 
     by the same means, and with the same jurisdiction, powers, 
     and duties as though all applicable terms and provisions of 
     the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were 
     incorporated into and made a part of this section.
       (B) Privileges and immunities.--Any person who violates 
     this section shall be subject to the penalties and entitled 
     to the privileges and immunities provided in the Federal 
     Trade Commission Act (15 U.S.C. 41 et seq.).
       (C) Authority preserved.--Nothing in this Act shall be 
     construed to limit the authority of the Federal Trade 
     Commission under any other provision of law.
       (D) Scope of jurisdiction.--Notwithstanding sections 4, 
     5(a)(2), or 6 of the Federal Trade Commission Act (15 U.S.C. 
     44, 45(a)(2), 46), or any jurisdictional limitation of the 
     Commission, the Commission shall also enforce this section in 
     the same manner provided in subparagraph (A), with respect to 
     organizations that are not organized to carry on business for 
     their own profit or that of their members.

     SEC. 4. DEFINITIONS.

       In this Act:
       (1) Commission.--The term ``Commission'' means the Federal 
     Trade Commission.
       (2) Consent; digital forgery; identifiable individual; 
     intimate visual depiction.--The terms ``consent'', ``digital 
     forgery'', ``identifiable individual'', ``intimate visual 
     depiction'', and ``minor'' have the meaning given such terms 
     in section 223(h) of the Communications Act of 1934 (47 
     U.S.C. 223), as added by section 2.
       (3) Covered platform.--
       (A) In general.--The term ``covered platform'' means a 
     website, online service, online application, or mobile 
     application--
       (i) that serves the public; and
       (ii)(I) that primarily provides a forum for user-generated 
     content, including messages, videos, images, games, and audio 
     files; or
       (II) for which it is in the regular course of trade or 
     business of the website, online service, online application, 
     or mobile application to publish, curate, host, or make 
     available content of nonconsensual intimate visual 
     depictions.
       (B) Exclusions.--The term ``covered platform'' shall not 
     include the following:
       (i) A provider of broadband internet access service (as 
     described in section 8.1(b) of title 47, Code of Federal 
     Regulations, or successor regulation).
       (ii) Electronic mail.
       (iii) Except as provided in subparagraph (A)(ii)(II), an 
     online service, application, or website--

       (I) that consists primarily of content that is not user 
     generated but is preselected by the provider of such online 
     service, application, or website; and
       (II) for which any chat, comment, or interactive 
     functionality is incidental to, directly related to, or 
     dependent on the provision of the content described in 
     subclause (I).

     SEC. 5. SEVERABILITY.

       If any provision of this Act, or an amendment made by this 
     Act, is determined to be unenforceable or invalid, the 
     remaining provisions of this Act and the amendments made by 
     this Act shall not be affected.

  The SPEAKER pro tempore. Pursuant to the rule, the gentleman from 
Florida (Mr. Bilirakis) and the gentleman from New Jersey (Mr. Pallone) 
each will control 20 minutes.
  The Chair recognizes the gentleman from Florida.


                             General Leave

  Mr. BILIRAKIS. Mr. Speaker, I ask unanimous consent that all Members 
may have 5 legislative days in which to revise and extend their remarks 
and insert extraneous material in the Record on the bill.
  The SPEAKER pro tempore. Is there objection to the request of the 
gentleman from Florida?
  There was no objection.
  Mr. BILIRAKIS. Mr. Speaker, I yield myself such time as I may 
consume.
  I rise today in strong support of S. 146, the TAKE IT DOWN Act by 
Senator Ted Cruz. The bill addresses a serious gap in our current law, 
a loophole that came to light in my own district.
  I had a meeting on this particular issue with the sheriff in Pasco 
County, Florida, Sheriff Nocco, and then I also talked to Senator Cruz 
about one of his constituents. This is how we get things done. The best 
ideas come from the people, and this is very necessary.
  A teacher in my district used AI to create explicit content of his 
students, incorporating real images taken from a yearbook. Under 
current law, only the use of the actual photos is illegal; the 
AI-generated, sexually explicit content is not.
  Because of this gap, law enforcement was unable to fully charge this 
particular individual, this sick individual, in my opinion, for the 
scope of the images in his possession. Had this bill been in effect, 
his actions would have been criminalized in full.
  As technology evolves, so must our laws. We need to keep pace; there 
is no question. We must continue working hand in hand with our law 
enforcement partners to stay ahead of these emerging threats and 
safeguard our most vulnerable.
  I urge my colleagues to join me in voting in favor of S. 146. The 
House sponsor is the gentlewoman from Florida (Ms. Salazar). She is a 
great friend of mine. Let's get this done. We need to be protecting our 
children.
  Mr. Speaker, I reserve the balance of my time.
  Mr. PALLONE. Mr. Speaker, I yield myself such time as I may consume.
  I rise to speak in support of S. 146, the TAKE IT DOWN Act. The 
legislation addresses the nonconsensual sharing of intimate images 
online, one of the most significant harms proliferating on the internet 
in recent years.
  Advances in generative artificial intelligence and other photo 
manipulation software have enabled the creation of digital forgeries 
that place victims in sexually explicit situations that never actually 
occurred but can still cause massive reputational and financial damage 
to those who were targeted.
  Some of the platforms hosting actual and computer-generated 
nonconsensual intimate images have promised to address such abuses 
online and protect their users. Nevertheless, victims report great 
difficulty in getting such images removed from the internet. They tell 
us they feel powerless as an image spreads or even resurfaces years 
later.
  The TAKE IT DOWN Act will require social media and other online 
public platforms to provide a mechanism for people to notify the 
platform of nonconsensual intimate images in which they are depicted. 
It also requires the platforms to take reasonable steps to remove the 
images from their platforms within 48 hours.
  I thank Representatives Dean and Dingell for their leadership on this 
issue, and I encourage my colleagues to support this bipartisan 
legislation. I reserve the balance of my time, Mr. Speaker.
  Mr. BILIRAKIS. Mr. Speaker, I yield such time as she may consume to 
the gentlewoman from Florida (Ms. Salazar), the House sponsor of this 
particular bill, who is my good friend and a very effective Member of 
Congress.
  Ms. SALAZAR. Mr. Speaker, every generation of Congress faces moments 
that test our commitment to justice, and today is one of those moments.
  I rise today to urge my colleagues to vote ``yes'' on S. 146, the 
TAKE IT DOWN Act. The Senate has already done its job. They passed this 
legislation unanimously. Now, it is our turn in the House of 
Representatives. This is our moment to stand up to protect our children 
and make this the law of the land.
  The name of the legislation is the TAKE IT DOWN Act. The mission of 
this bill is simple, profound, and long lasting. It stops cyber abuse. 
It prevents the bullying of one child by another and, even more 
importantly, prevents suicide born out of shame.
  It is outrageously sick to use images--the face, the voice, the 
likeness--of a young, vulnerable female to manipulate her, to extort 
her, and to humiliate her publicly just for fun, just for revenge. 
That is why we created this bill: to stop the abuse spreading like 
wildfire right now on social media.
  It is widely known that 99 percent of the time, the victims, most of 
them girls, don't even know their faces, their bodies, their intimate 
parts are being circulated around the internet in fake, compromising 
pornographic images. Unfortunately, in life, perception is reality. 
Even though the images are fake, the consequences are very, very real.
  Even though, as I said, the images do not belong to them, those girls 
are paying for them dearly with shame, humiliation, and the unbearable 
suffering when you are 14, 15 years old. Up until now, there was no 
recourse. Just imagine waking up one morning to find yourself trapped 
in a nightmare that you never created. This is exactly what is 
happening to our children, and this is why we must act.

                              {time}  1515

  Mr. Speaker, the bill called the TAKE IT DOWN Act finally sends a 
very loud and clear message to Big Tech. If you, Big Tech, do not 
remove these fake images within 48 hours, you are as guilty and as 
responsible as the predators who created them.
  I am talking to Snapchat, to Instagram, and to TikTok. All of them 
will have to comply within 48 hours when a victim calls and demands 
that those images be removed. There are no more excuses. You, Big 
Tech, have to take it down.
  Let's talk to the bullies, to the predators, to the perverts who are 
hiding behind a computer, the ones who created this fake material. If 
you dare to do this again to another innocent child, most of them 
girls, you are going to prison. You will be in jail for a long time. 
Don't do it, and don't dare to do it anymore. Prison is a great place 
where you can sit and ponder what you have done to another human being.
  Today, we take away the power from the aggressors and the 
accomplices, and we give it back to the victims and their families. Up 
until now, the parents of the victims found that the schools couldn't 
do anything. The police couldn't do anything. Big Tech would not even 
pay attention to them, but no more. Now they can because we are 
passing this law.
  What a great honor it is for me to be part of this initiative, one 
that the First Lady has personally championed. The President of the 
United States endorsed this legislation during his recent State of the 
Union address before Congress. The President described it as a vital 
step to defend our sons and daughters against online predators.
  My fellow colleagues, this is not about politics. This is about basic 
human dignity. This is about protecting children who are the most 
vulnerable among us. As I just mentioned, the Commander in Chief called 
on us to act on this law in this very Chamber a few weeks ago.
  Mr. Speaker, I urge the House of Representatives to vote ``yes'' on 
the TAKE IT DOWN Act. Let's make history today. Let's protect our 
children. Let's just take it down.
  Mr. PALLONE. Mr. Speaker, I yield 3 minutes to the gentlewoman from 
Pennsylvania (Ms. Dean).
  Ms. DEAN of Pennsylvania. Mr. Speaker, I thank Chairman Bilirakis and 
Ranking Member Pallone for bringing this bill forward.
  Mr. Speaker, I rise in support of the TAKE IT DOWN Act. Senate bill 
146 was my bill, and it is still my bill. I thank Congresswoman Salazar 
for her leadership on this. I thank Congresswoman Dingell for her 
longtime leadership on this and for leading this effort with me on the 
House side. I thank Senators Cruz and Klobuchar for their strong 
leadership on this.
  As we speak, the internet is awash with real and fake nonconsensual 
intimate imagery. Mr. Speaker, the consequences, as you just heard, are 
devastating for every victim, their family, and their community. It 
happens to men and boys, to women and girls. Most often, it happens to 
women and girls.
  As AI becomes more prevalent in our everyday lives, Congress must 
meet this moment. We must empower and protect victims from bad actors 
who share their intimate images, real or fake, without consent and from 
the most harmful developments of AI.
  During the 2023-2024 school year, 15 percent of high school students 
reported hearing about nonconsensual deepfake intimate images that 
depicted kids at their schools. It happened to one of my hometown
constituents, a 20-year-old, bright college student named Jack 
Sullivan. He was sextorted by two men claiming to be a woman on 
Instagram. They threatened Jack. They told him they would post intimate 
images of him unless he paid huge sums of money. He paid and he paid. 
When he could no longer pay their demands, Jack took his own life. We 
must do better for Jack and every other victim of these crimes.
  As a former educator, a mother, and a grandmother, I am sickened by 
this. As an elected official, I am moved to protect our children.
Congress must create guardrails to protect Americans' privacy and 
dignity at a time when online exploitation is easier than ever.
  That is what our bill does. The TAKE IT DOWN Act criminalizes the 
publication of nonconsensual intimate images, whether real or 
AI-generated. It requires websites to react, to respond, and to remove 
these horrifying images and videos within 48 hours of a victim's 
report. I suggest they act
even faster. Finally, we will hold online platforms and social media 
companies accountable. This cannot wait.
  Mr. Speaker, I am pleased to have bipartisan support for this bill. I 
thank Representative Salazar and Senators Cruz and Klobuchar. I thank 
the First Lady and the President for their leadership on this. I 
implore all of my colleagues to join us in supporting this important 
bill.
  Mr. BILIRAKIS. Mr. Speaker, I yield such time as he may consume to 
the gentleman from Kentucky (Mr. Guthrie), the great chairman of the 
Committee on Energy and Commerce and a good friend of mine who is doing 
an outstanding job, in my opinion.
  Mr. GUTHRIE. Mr. Speaker, I rise today in support of S. 146, the TAKE 
IT DOWN Act. I echo the sentiments of Representative Dean and my friend 
Representative Salazar. I appreciate the comments that they just made. 
I thank Congresswoman Dingell and Senator Cruz for their determination 
in combating this crisis of malicious deepfake pornography.
  Last month, I joined a bipartisan group convened by the First Lady. 
We heard from young survivors and their parents who were targeted by 
those abhorrent practices. I thank the First Lady for her leadership 
and for shining a light on this dark and destructive crisis.
  I am sad to say that this issue struck close to home with the 
heartbreaking death of my constituent, 16-year-old Elijah Heacock. He 
tragically fell victim to an online extortion scheme, showing my 
community the dangers of predators targeting our kids online.
  I sat with his mom, his dad, and his brother just this last week. We 
talked about the tragedy that happened in his life and their 
determination to see that we move forward in this Congress, not only on 
this bill but others to make sure that it doesn't happen to other 
families like theirs. We are all praying for that dear family.

  Mr. Speaker, the heart-wrenching stories we have heard tell us all we 
need to know. It is time to send the TAKE IT DOWN Act to the 
President's desk so we can give survivors and law enforcement the tools 
they need to combat this crisis. I urge my colleagues to vote in favor 
of this legislation.
  Mr. PALLONE. Mr. Speaker, I yield 3 minutes to the gentlewoman from 
Michigan (Mrs. Dingell), a member of our committee.
  Mrs. DINGELL. Mr. Speaker, I thank Chairman Guthrie. I thank Ranking 
Member Frank Pallone, who often puts up with my intensity on this 
subject. I also thank Chairman Bilirakis and my co-leads, 
Representatives Salazar, Dean, and Pfluger, as well as Senator Cruz and 
Senator Klobuchar, my compatriot many days, who works to prioritize 
violence against women.
  Mr. Speaker, I rise today as a strong and unwavering advocate for 
women, children, and survivors of abuse and in support of a bill that I 
helped lead, S. 146, the TAKE IT DOWN Act.
  The rise of deepfake pornography and nonconsensual intimate images is 
a growing crisis that demands urgent action now. We need to work 
together to protect women and children from these evolving threats. New 
generative artificial intelligence tools are being weaponized to 
humiliate, silence, and terrorize women and children.
  We have seen it used against children as young as middle schoolers as a 
tool to create revenge porn. We have seen it used against women in 
public life, on both sides of the aisle, including our own colleagues.
  None of my colleagues think this is acceptable, and it is a crisis 
that demands immediate action. We have a responsibility to act now and 
not tomorrow, not next year, not after more damage is done. The TAKE IT 
DOWN Act gives victims a clear, fast pathway to have these images 
removed from online platforms, hold perpetrators accountable, and 
ensure that tech companies do their part. They have a responsibility.
  This is just one piece of a broader fight. It is one I have been in 
for years, and I will not stop fighting. We will end violence against 
women, address coercive control, and stop the misuse of technology to 
harm survivors.
  Let me be clear. This bill should already be law. It passed the 
Senate unanimously. It was included in Congress' year-end package last 
year until it was stripped out at the end. I won't get political on 
that because I want everybody to vote on it right now. It should never 
have happened.
  Mr. Speaker, I urge my colleagues to support the TAKE IT DOWN Act. 
Let's get this across the finish line and deliver for the women and 
children who are counting on us.
  Mr. BILIRAKIS. Mr. Speaker, I am prepared to close, and I reserve the 
balance of my time.
  Mr. PALLONE. Mr. Speaker, I urge unanimous support for this 
legislation. Once again, this is another very important bill as part of 
this consumer protection agenda today.
  Mr. Speaker, I yield back the balance of my time.
  Mr. BILIRAKIS. Mr. Speaker, I yield myself the balance of my time.
  Mr. Speaker, I thank the President of the United States for shedding 
light on this particular bill at the State of the Union. I thank our 
great First Lady as well for her support on this particular bill.
  I thank Representatives Dean and Dingell, and, of course, 
Representative Salazar who was also the main sponsor of the bill in the 
House. I thank Senator Cruz who worked so very hard to get this done.
  This is a bipartisan accomplishment, and we will protect our kids if 
we pass this particular bill. It will go to the President once we pass 
this bill. Let's get it done. Let's pass it unanimously.
  Mr. Speaker, again, I encourage a ``yes'' vote on this bill, and I 
yield back the balance of my time.
  The SPEAKER pro tempore. The question is on the motion offered by the 
gentleman from Florida (Mr. Bilirakis) that the House suspend the rules 
and pass the bill, S. 146.
  The question was taken.
  The SPEAKER pro tempore. In the opinion of the Chair, two-thirds 
being in the affirmative, the ayes have it.
  Mr. BILIRAKIS. Mr. Speaker, on that I demand the yeas and nays.
  The yeas and nays were ordered.
  The SPEAKER pro tempore. Pursuant to clause 8 of rule XX, further 
proceedings on this motion will be postponed.

                          ____________________