[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 2381 Introduced in Senate (IS)]


119th CONGRESS
  1st Session
                                S. 2381

  To require the Director of the National Institute of Standards and 
    Technology to develop a framework for detecting, removing, and 
   reporting child pornography in datasets used to train artificial 
             intelligence systems, and for other purposes.


_______________________________________________________________________


                   IN THE SENATE OF THE UNITED STATES

                             July 22, 2025

  Mr. Cornyn (for himself and Mr. Kim) introduced the following bill; 
    which was read twice and referred to the Committee on Commerce, 
                      Science, and Transportation

_______________________________________________________________________

                                 A BILL


 
  To require the Director of the National Institute of Standards and 
    Technology to develop a framework for detecting, removing, and 
   reporting child pornography in datasets used to train artificial 
             intelligence systems, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the 
United States of America in Congress assembled,

SECTION 1. SHORT TITLE.

    This Act may be cited as the ``Preventing Recurring Online Abuse of 
Children Through Intentional Vetting of Artificial Intelligence Data 
Act of 2025'' or the ``PROACTIV Artificial Intelligence Data Act of 
2025''.

SEC. 2. DEFINITIONS.

    In this Act:
            (1) Artificial intelligence.--The term ``artificial 
        intelligence'' has the meaning given that term in section 
        238(g) of the John S. McCain National Defense Authorization Act 
        for Fiscal Year 2019 (Public Law 115-232; 10 U.S.C. note prec. 
        4061).
            (2) Artificial intelligence developer.--The term 
        ``artificial intelligence developer'' means a person who 
        designs, codes, or produces an artificial intelligence system 
        and makes such system commercially available, whether for 
        profit or not.
            (3) Artificial intelligence deployer.--The term 
        ``artificial intelligence deployer'' means a person who 
        integrates an artificial intelligence system into the products 
        or services of the person and makes those products or services 
        commercially available, whether for profit or not.
            (4) Artificial intelligence user.--The term ``artificial 
        intelligence user'' means a person who uses an artificial 
        intelligence system for a purpose other than personal 
        noncommercial activity.
            (5) Child pornography.--The term ``child pornography'' has 
        the meaning given that term in section 2256 of title 18, United 
        States Code.
            (6) Covered dataset.--The term ``covered dataset'' means a 
        set of data that--
                    (A) is collected for the purpose of training an 
                artificial intelligence system; and
                    (B) was created using automated data crawlers or 
                data scraping tools, whether or not directed by a human 
                operator.
            (7) Data collector.--The term ``data collector'' means any 
        person who specializes in collecting, preparing, cleaning, 
        labeling, transforming for algorithmic compatibility, and 
        organizing large amounts of data for the purpose of training an 
        artificial intelligence system.
            (8) Director.--The term ``Director'' means the Director of 
        the National Institute of Standards and Technology.

SEC. 3. DEVELOPMENT OF FRAMEWORK ON DETECTING, REMOVING, AND REPORTING 
              CHILD PORNOGRAPHY IN CERTAIN DATASETS.

    (a) In General.--Not later than 1 year after the date of the 
enactment of this Act, the Director shall, in collaboration with such 
other Federal agencies and public and private sector organizations as 
the Director considers appropriate, including the National Science 
Foundation, the National Center for Missing and Exploited Children, and 
the Department of Justice, develop and publish a voluntary framework 
for detecting, removing, and reporting child pornography in covered 
datasets.
    (b) Contents.--The Director shall ensure that the framework 
published under subsection (a) provides to artificial intelligence 
developers and to data collectors guidelines, best practices, 
methodologies, procedures, and processes--
            (1) to detect any child pornography in covered datasets;
            (2) to remove any child pornography from covered datasets; 
        and
            (3) to regularly report to Federal, State, or local law 
        enforcement and the National Center for Missing and Exploited 
        Children any child pornography detected in covered datasets.
    (c) Limitation.--The framework published under subsection (a) shall 
apply to persons who are artificial intelligence developers and to data 
collectors, and not to persons who are solely artificial intelligence 
deployers or artificial intelligence users.
    (d) Stakeholder Outreach.--In developing the framework issued under 
subsection (a), the Director shall--
            (1) solicit input from--
                    (A) institutions of higher education (as defined in 
                section 101 of the Higher Education Act of 1965 (20 
                U.S.C. 1001));
                    (B) any Federal agency the Director considers 
                relevant;
                    (C) civil society and nonprofit organizations;
                    (D) artificial intelligence developers and 
                artificial intelligence deployers;
                    (E) Federal laboratories (as defined in section 4 
                of the Stevenson-Wydler Technology Innovation Act of 
                1980 (15 U.S.C. 3703)); and
                    (F) any other such stakeholder the Director 
                considers appropriate; and
            (2) provide an opportunity for public comment on the 
        guidelines, best practices, methodologies, procedures, and 
        processes developed as part of the framework.
    (e) Research.--The Director of the National Science Foundation, in 
coordination with the heads of other relevant Federal agencies, as 
determined by such Director, shall support research into innovative 
approaches to detecting, removing, and reporting child pornography in 
covered datasets, including research conducted through the Directorate 
for Technology, Innovation, and Partnerships.

SEC. 4. LIMITED LIABILITY FOR DETECTING, REMOVING, AND REPORTING CHILD 
              PORNOGRAPHY.

    (a) In General.--Except as provided in subsection (b), no cause of 
action shall lie or be maintained in any court against an artificial 
intelligence developer or data collector, and such action shall be 
promptly dismissed, for the detecting, removing, or reporting of child 
pornography in covered datasets that is conducted in accordance with 
the framework issued by the Director under section 3(a).
    (b) Intentional, Reckless, Grossly Negligent, or Other 
Misconduct.--Subsection (a) shall not apply to a cause of action for 
detecting, removing, or reporting child pornography in covered datasets 
if the artificial intelligence developer or data collector--
            (1) engaged in intentional misconduct;
            (2) acted, or failed to act--
                    (A) with actual malice;
                    (B) with reckless disregard to a substantial risk 
                of causing injury without legal justification; or
                    (C) with gross negligence; or
            (3) engaged in any activity that violates section 2251 of 
        title 18, United States Code.
    (c) Rule of Construction.--Nothing in this Act shall be construed 
to affect the protections and obligations of section 2258A of title 18, 
United States Code.