[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[S. 4178 Reported in Senate (RS)]

<DOC>





                                                       Calendar No. 725
118th CONGRESS
  2d Session
                                S. 4178

To establish artificial intelligence standards, metrics, and evaluation 
 tools, to support artificial intelligence research, development, and 
 capacity building activities, to promote innovation in the artificial 
 intelligence industry by ensuring companies of all sizes can succeed 
                  and thrive, and for other purposes.


_______________________________________________________________________


                   IN THE SENATE OF THE UNITED STATES

                             April 18, 2024

Ms. Cantwell (for herself, Mr. Young, Mr. Hickenlooper, Mrs. Blackburn, 
    Mr. Wicker, Mr. Lujan, Ms. Sinema, Mr. Rounds, and Mr. Schumer) 
introduced the following bill; which was read twice and referred to the 
           Committee on Commerce, Science, and Transportation

            December 18 (legislative day, December 16), 2024

              Reported by Ms. Cantwell, with an amendment
 [Strike out all after the enacting clause and insert the part printed 
                               in italic]

_______________________________________________________________________

                                 A BILL


 
To establish artificial intelligence standards, metrics, and evaluation 
 tools, to support artificial intelligence research, development, and 
 capacity building activities, to promote innovation in the artificial 
 intelligence industry by ensuring companies of all sizes can succeed 
                  and thrive, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the 
United States of America in Congress assembled,

<DELETED>SECTION 1. SHORT TITLE; TABLE OF CONTENTS.</DELETED>

<DELETED>    (a) Short Title.--This Act may be cited as the ``Future of 
Artificial Intelligence Innovation Act of 2024''.</DELETED>
<DELETED>    (b) Table of Contents.--The table of contents for this Act 
is as follows:</DELETED>

<DELETED>Sec. 1. Short title; table of contents.
<DELETED>Sec. 2. Sense of Congress.
<DELETED>Sec. 3. Definitions.
<DELETED>TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
       EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION

   <DELETED>Subtitle A--Artificial Intelligence Safety Institute and 
                                Testbeds

<DELETED>Sec. 101. Artificial Intelligence Safety Institute.
<DELETED>Sec. 102. Program on artificial intelligence testbeds.
<DELETED>Sec. 103. National Institute of Standards and Technology and 
                            Department of Energy testbed to identify, 
                            test, and synthesize new materials.
<DELETED>Sec. 104. National Science Foundation and Department of Energy 
                            collaboration to make scientific 
                            discoveries through the use of artificial 
                            intelligence.
<DELETED>Sec. 105. Progress report.
             <DELETED>Subtitle B--International Cooperation

<DELETED>Sec. 111. International coalition on innovation, development, 
                            and harmonization of standards with respect 
                            to artificial intelligence.
<DELETED>Sec. 112. Requirement to support bilateral and multilateral 
                            artificial intelligence research 
                            collaborations.
   <DELETED>Subtitle C--Identifying Regulatory Barriers to Innovation

<DELETED>Sec. 121. Comptroller General of the United States 
                            identification of risks and obstacles 
                            relating to artificial intelligence and 
                            Federal agencies.
   <DELETED>TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, 
                      CAPACITY BUILDING ACTIVITIES

<DELETED>Sec. 201. Public data for artificial intelligence systems.
<DELETED>Sec. 202. Federal grand challenges in artificial intelligence.

<DELETED>SEC. 2. SENSE OF CONGRESS.</DELETED>

<DELETED>    It is the sense of Congress that policies governing 
artificial intelligence should maximize the potential and development 
of artificial intelligence to benefit all private and public 
stakeholders.</DELETED>

<DELETED>SEC. 3. DEFINITIONS.</DELETED>

<DELETED>    In this Act:</DELETED>
        <DELETED>    (1) Agency.--The term ``agency'' has the meaning 
        given such term in section 3502 of title 44, United States 
        Code, except such term shall include an independent regulatory 
        agency, as defined in such section.</DELETED>
        <DELETED>    (2) Artificial intelligence.--The term 
        ``artificial intelligence'' has the meaning given such term in 
        section 5002 of the National Artificial Intelligence Initiative 
        Act of 2020 (15 U.S.C. 9401).</DELETED>
        <DELETED>    (3) Artificial intelligence blue-teaming.--The 
        term ``artificial intelligence blue-teaming'' means an effort 
        to conduct operational network vulnerability evaluations and 
        provide mitigation techniques to entities who have a need for 
        an independent technical review of the network security posture 
        of an artificial intelligence system.</DELETED>
        <DELETED>    (4) Artificial intelligence model.--The term 
        ``artificial intelligence model'' means a component of an 
        artificial intelligence system that is a model--</DELETED>
                <DELETED>    (A) derived using mathematical, 
                computational, statistical, or machine-learning 
                techniques; and</DELETED>
                <DELETED>    (B) used as part of an artificial 
                intelligence system to produce outputs from a given set 
                of inputs.</DELETED>
        <DELETED>    (5) Artificial intelligence red-teaming.--The term 
        ``artificial intelligence red-teaming'' means structured 
        adversarial testing efforts of an artificial intelligence 
        system to identify risks, flaws, and vulnerabilities of the 
        artificial intelligence system, such as harmful outputs from 
        the system, unforeseen or undesirable system behaviors, 
        limitations, or potential risks associated with the misuse of 
        the system.</DELETED>
        <DELETED>    (6) Artificial intelligence risk management 
        framework.--The term ``Artificial Intelligence Risk Management 
        Framework'' means the most recently updated version of the 
        framework developed and updated pursuant to section 22A(c) of 
        the National Institute of Standards and Technology Act (15 
        U.S.C. 278h-1(c)).</DELETED>
        <DELETED>    (7) Artificial intelligence system.--The term 
        ``artificial intelligence system'' has the meaning given such 
        term in section 7223 of the Advancing American AI Act (40 
        U.S.C. 11301 note).</DELETED>
        <DELETED>    (8) Critical infrastructure.--The term ``critical 
        infrastructure'' has the meaning given such term in section 
        1016(e) of the Uniting and Strengthening America by Providing 
        Appropriate Tools Required to Intercept and Obstruct Terrorism 
        (USA PATRIOT ACT) Act of 2001 (42 U.S.C. 5195c(e)).</DELETED>
        <DELETED>    (9) Federal laboratory.--The term ``Federal 
        laboratory'' has the meaning given such term in section 4 of 
        the Stevenson-Wydler Technology Innovation Act of 1980 (15 
        U.S.C. 3703).</DELETED>
        <DELETED>    (10) Foundation model.--The term ``foundation 
        model'' means an artificial intelligence model that is trained 
        on broad data at scale and is adaptable to a wide range of 
        downstream tasks.</DELETED>
        <DELETED>    (11) Generative artificial intelligence.--The term 
        ``generative artificial intelligence'' means the class of 
        artificial intelligence models that utilize the structure and 
        characteristics of input data in order to generate outputs in 
        the form of derived synthetic content. Such derived synthetic 
        content can include images, videos, audio, text, software, 
        code, and other digital content.</DELETED>
        <DELETED>    (12) National laboratory.--The term ``National 
        Laboratory'' has the meaning given such term in section 2 of 
        the Energy Policy Act of 2005 (42 U.S.C. 15801).</DELETED>
        <DELETED>    (13) Synthetic content.--The term ``synthetic 
        content'' means information, such as images, videos, audio 
        clips, and text, that has been significantly modified or 
        generated by algorithms, including by artificial 
        intelligence.</DELETED>
        <DELETED>    (14) Testbed.--The term ``testbed'' means a 
        facility or mechanism equipped for conducting rigorous, 
        transparent, and replicable testing of tools and technologies, 
        including artificial intelligence systems, to help evaluate the 
        functionality, trustworthiness, usability, and performance of 
        those tools or technologies.</DELETED>
        <DELETED>    (15) TEVV.--The term ``TEVV'' means methodologies, 
        metrics, techniques, and tasks for testing, evaluating, 
        verifying, and validating artificial intelligence systems or 
        components.</DELETED>
        <DELETED>    (16) Watermarking.--The term ``watermarking'' 
        means the act of embedding information that is intended to be 
        difficult to remove, into outputs generated by artificial 
        intelligence, including outputs such as text, images, audio, 
        videos, software code, or any other digital content or data, 
        for the purposes of verifying the authenticity of the output or 
        the identity or characteristics of its provenance, 
        modifications, or conveyance.</DELETED>

<DELETED>TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
  EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION</DELETED>

   <DELETED>Subtitle A--Artificial Intelligence Safety Institute and 
                           Testbeds</DELETED>

<DELETED>SEC. 101. ARTIFICIAL INTELLIGENCE SAFETY INSTITUTE.</DELETED>

<DELETED>    (a) Establishment of Institute.--</DELETED>
        <DELETED>    (1) In general.--Not later than 1 year after the 
        date of the enactment of this Act, the Under Secretary of 
        Commerce for Standards and Technology (in this section referred 
        to as the ``Under Secretary'') shall establish an institute on 
        artificial intelligence.</DELETED>
        <DELETED>    (2) Designation.--The institute established 
        pursuant to paragraph (1) shall be known as the ``Artificial 
        Intelligence Safety Institute'' (in this section referred to as 
        the ``Institute'').</DELETED>
        <DELETED>    (3) Mission.--The mission of the Institute is as 
        follows:</DELETED>
                <DELETED>    (A) To assist the private sector and 
                agencies in developing voluntary best practices for the 
                robust assessment of artificial intelligence 
                systems.</DELETED>
                <DELETED>    (B) To provide technical assistance for 
                the adoption and use of artificial intelligence across 
                the Federal Government to improve the quality of 
                government services.</DELETED>
                <DELETED>    (C) To develop guidelines, methodologies, 
                and best practices to promote--</DELETED>
                        <DELETED>    (i) development and adoption of 
                        voluntary, consensus-based technical standards 
                        or industry standards;</DELETED>
                        <DELETED>    (ii) long-term advancements in 
                        artificial intelligence technologies; 
                        and</DELETED>
                        <DELETED>    (iii) innovation in the artificial 
                        intelligence industry by ensuring that 
                        companies of all sizes can succeed and 
                        thrive.</DELETED>
<DELETED>    (b) Director.--The Under Secretary shall appoint a 
director of the Institute, who shall be known as the ``Director of the 
Artificial Intelligence Safety Institute'' (in this section referred to 
as the ``Director'') and report directly to the Under 
Secretary.</DELETED>
<DELETED>    (c) Staff and Authorities.--</DELETED>
        <DELETED>    (1) Staff.--The Director may hire such full-time 
        employees as the Director considers appropriate to assist the 
        Director in carrying out the functions of the 
        Institute.</DELETED>
        <DELETED>    (2) Use of authority to hire critical technical 
        experts.--In addition to making appointments under paragraph 
        (1) of this subsection, the Director, in coordination with the 
        Secretary of Commerce, may make appointments of scientific, 
        engineering, and professional personnel, and fix their basic 
        pay, under subsection (b) of section 6 of the National 
        Institute of Standards and Technology Act (15 U.S.C. 275) to 
        hire critical technical experts.</DELETED>
        <DELETED>    (3) Expansion of authority to hire critical 
        technical experts.--Such subsection is amended, in the second 
        sentence, by striking ``15'' and inserting ``30''.</DELETED>
        <DELETED>    (4) Modification of sunset.--Subsection (c) of 
        such section is amended by striking ``the date that is 5 years 
        after the date of the enactment of this section'' and inserting 
        ``December 30, 2035''.</DELETED>
        <DELETED>    (5) Agreements.--The Director may enter into such 
        agreements, including contracts, grants, cooperative 
        agreements, and other transactions, as the Director considers 
        necessary to carry out the functions of the Institute and on 
        such terms as the Under Secretary considers 
        appropriate.</DELETED>
<DELETED>    (d) Consultation and Coordination.--In establishing the 
Institute, the Under Secretary shall--</DELETED>
        <DELETED>    (1) coordinate with--</DELETED>
                <DELETED>    (A) the Secretary of Energy;</DELETED>
                <DELETED>    (B) the Secretary of Homeland 
                Security;</DELETED>
                <DELETED>    (C) the Secretary of Defense;</DELETED>
                <DELETED>    (D) the Director of the National Science 
                Foundation; and</DELETED>
                <DELETED>    (E) the Director of the Office of Science 
                and Technology Policy; and</DELETED>
        <DELETED>    (2) consult with the heads of such other Federal 
        agencies as the Under Secretary considers 
        appropriate.</DELETED>
<DELETED>    (e) Functions.--The functions of the Institute, which the 
Institute shall carry out in coordination with the laboratories of the 
National Institute of Standards and Technology, are as 
follows:</DELETED>
        <DELETED>    (1) Research, evaluation, testing, and 
        standards.--The following functions relating to research, 
        evaluation, testing, and standards:</DELETED>
                <DELETED>    (A) Conducting measurement research into 
                system and model safety, validity and reliability, 
                security, capabilities and limitations, explainability, 
                interpretability, and privacy.</DELETED>
                <DELETED>    (B) Working with the Department of Energy, 
                the National Science Foundation, public-private 
                partnerships, including the Artificial Intelligence 
                Safety Institute Consortium established under 
                subsection (f), and other private sector organizations 
                to develop testing environments and perform regular 
                benchmarking and capability evaluations, including 
                artificial intelligence red-teaming as the Director 
                considers appropriate.</DELETED>
                <DELETED>    (C) Working with consensus-based, open, 
                and transparent standards development organizations 
                (SDOs) and relevant industry, Federal laboratories, 
                civil society, and academic institutions to advance 
                development and adoption of clear, implementable, 
                technically sound, and technology-neutral voluntary 
                standards and guidelines that incorporate appropriate 
                variations in approach depending on the size of the 
                entity, the potential risks and potential benefits of 
                the artificial intelligence system, and the role of the 
                entity (such as developer, deployer, or user) relating 
                to artificial intelligence systems.</DELETED>
                <DELETED>    (D) Building upon the Artificial 
                Intelligence Risk Management Framework to incorporate 
                guidelines on generative artificial intelligence 
                systems.</DELETED>
                <DELETED>    (E) Developing a companion resource to the 
                Secure Software Development Framework to incorporate 
                secure development practices for generative artificial 
                intelligence and for foundation models.</DELETED>
                <DELETED>    (F) Developing and publishing 
                cybersecurity tools, methodologies, best practices, 
                voluntary guidelines, and other supporting information 
                to assist persons who maintain systems used to create 
                or train artificial intelligence models to discover and 
                mitigate vulnerabilities and attacks.</DELETED>
                <DELETED>    (G) Coordinating or developing guidelines, 
                metrics, benchmarks, and methodologies for evaluating 
                artificial intelligence systems, including the 
                following:</DELETED>
                        <DELETED>    (i) Cataloging existing artificial 
                        intelligence metrics, benchmarks, and 
                        evaluation methodologies used in industry and 
                        academia.</DELETED>
                        <DELETED>    (ii) Testing and validating the 
                        efficacy of existing metrics, benchmarks, and 
                        evaluations, as well as TEVV tools and 
                        products.</DELETED>
                        <DELETED>    (iii) Funding and facilitating 
                        research and other activities in a transparent 
                        manner, including at institutions of higher 
                        education and other nonprofit and private 
                        sector partners, to evaluate, develop, or 
                        improve TEVV capabilities, with rigorous 
                        scientific merit, for artificial intelligence 
                        systems.</DELETED>
                        <DELETED>    (iv) Evaluating foundation models 
                        for their potential effect in downstream 
                        systems, such as when retrained or fine-
                        tuned.</DELETED>
                <DELETED>    (H) Coordinating with counterpart 
                institutions of international partners and allies to 
                promote global interoperability in the development of 
                research, evaluation, testing, and standards relating 
                to artificial intelligence.</DELETED>
                <DELETED>    (I) Developing tools, methodologies, best 
                practices, and voluntary guidelines for identifying 
                vulnerabilities in foundation models.</DELETED>
                <DELETED>    (J) Developing tools, methodologies, best 
                practices, and voluntary guidelines for relevant 
                agencies to track incidents resulting in harm caused by 
                artificial intelligence systems.</DELETED>
        <DELETED>    (2) Implementation.--The following functions 
        relating to implementation:</DELETED>
                <DELETED>    (A) Using publicly available and 
                voluntarily provided information, conducting 
                evaluations to assess the impacts of artificial 
                intelligence systems, and developing guidelines and 
                practices for safe development, deployment, and use of 
                artificial intelligence technology.</DELETED>
                <DELETED>    (B) Aligning capability evaluation and 
                red-teaming guidelines and benchmarks, sharing best 
                practices, and coordinating on building testbeds and 
                test environments with allies of the United States and 
                international partners.</DELETED>
                <DELETED>    (C) Coordinating vulnerability and 
                incident data sharing with international partners and 
                allies.</DELETED>
                <DELETED>    (D) Integrating appropriate testing 
                capabilities and infrastructure for testing of models 
                and systems.</DELETED>
                <DELETED>    (E) Establishing blue-teaming capabilities 
                to develop mitigation approaches and partner with 
                industry to address risks and negative 
                impacts.</DELETED>
                <DELETED>    (F) Developing voluntary guidelines on--
                </DELETED>
                        <DELETED>    (i) detecting synthetic content, 
                        authenticating content and tracking of the 
                        provenance of content, labeling original and 
                        synthetic content, such as by watermarking, and 
                        evaluating software and systems relating to 
                        detection and labeling of synthetic 
                        content;</DELETED>
                        <DELETED>    (ii) ensuring artificial 
                        intelligence systems do not violate privacy 
                        rights or other rights; and</DELETED>
                        <DELETED>    (iii) transparency documentation 
                        of artificial intelligence datasets and 
                        artificial intelligence models.</DELETED>
                <DELETED>    (G) Coordinating with relevant agencies to 
                develop or support, as the heads of the agencies 
                determine appropriate, sector- and application-specific 
                profiles of the Artificial Intelligence Risk Management 
                Framework for different use cases, integrating end-user 
                experience and on-going development work into a 
                continuously evolving toolkit.</DELETED>
        <DELETED>    (3) Operations and engagement.--The following 
        functions relating to operations and engagement:</DELETED>
                <DELETED>    (A) Managing the work of the Institute, 
                developing internal processes, and ensuring that the 
                Institute meets applicable goals and targets.</DELETED>
                <DELETED>    (B) Engaging with the private sector to 
                promote innovation and competitiveness.</DELETED>
                <DELETED>    (C) Engaging with international standards 
                organizations, multilateral organizations, and similar 
                institutes among allies and partners.</DELETED>
<DELETED>    (f) Artificial Intelligence Safety Institute Consortium.--
</DELETED>
        <DELETED>    (1) Establishment.--</DELETED>
                <DELETED>    (A) In general.--Not later than 180 days 
                after the date of the enactment of this Act, the Under 
                Secretary shall establish a consortium of stakeholders 
                from academic or research communities, Federal 
                laboratories, private industry, including companies of 
                all sizes with different roles in the use of artificial 
                intelligence systems, including developers, deployers, 
                and users, and civil society with expertise in matters 
                relating to artificial intelligence to support the 
                Institute in carrying out the functions set forth under 
                subsection (e).</DELETED>
                <DELETED>    (B) Designation.--The consortium 
                established pursuant to subparagraph (A) shall be known 
                as the ``Artificial Intelligence Safety Institute 
                Consortium''.</DELETED>
        <DELETED>    (2) Consultation.--The Under Secretary, acting 
        through the Director, shall consult with the consortium 
        established under this subsection not less frequently than 
        quarterly.</DELETED>
        <DELETED>    (3) Report to congress.--Not later than 2 years 
        after the date of the enactment of this Act, the Director of 
        the National Institute of Standards and Technology shall submit 
        to the Committee on Commerce, Science, and Transportation of 
        the Senate and the Committee on Science, Space, and Technology 
        of the House of Representatives a report summarizing the 
        contributions of the members of the consortium established 
        under this subsection in support of the efforts of the 
        Institute.</DELETED>
<DELETED>    (g) Artificial Intelligence System Testing.--In carrying 
out the Institute functions required by subsection (a), the Under 
Secretary shall support and contribute to the development of voluntary, 
consensus-based technical standards for testing artificial intelligence 
system components, including, as the Under Secretary considers 
appropriate, the following:</DELETED>
        <DELETED>    (1) Physical infrastructure for training or 
        developing artificial intelligence models and systems, 
        including cloud infrastructure.</DELETED>
        <DELETED>    (2) Physical infrastructure for operating 
        artificial intelligence systems, including cloud 
        infrastructure.</DELETED>
        <DELETED>    (3) Data for training artificial intelligence 
        models.</DELETED>
        <DELETED>    (4) Data for evaluating the functionality and 
        trustworthiness of trained artificial intelligence models and 
        systems.</DELETED>
        <DELETED>    (5) Trained or partially trained artificial 
        intelligence models and any resulting software systems or 
        products.</DELETED>
<DELETED>    (h) Gifts.--</DELETED>
        <DELETED>    (1) Authority.--The Director may seek, accept, 
        hold, administer, and use gifts from public and private sources 
        whenever the Director determines it would be in the interest of 
        the United States to do so.</DELETED>
        <DELETED>    (2) Regulations.--The Director, in consultation 
        with the Director of the Office of Government Ethics, shall 
        ensure that authority under this subsection is exercised 
        consistent with all relevant ethical constraints and 
        principles, including--</DELETED>
                <DELETED>    (A) the avoidance of any prohibited 
                conflict of interest or appearance of impropriety; 
                and</DELETED>
                <DELETED>    (B) a prohibition against the acceptance 
                of a gift from a foreign government or an agent of a 
                foreign government.</DELETED>
<DELETED>    (i) Rule of Construction.--Nothing in this section shall 
be construed to provide the Director of the National Institute of 
Standards and Technology any enforcement authority that was not in 
effect on the day before the date of the enactment of this 
Act.</DELETED>

<DELETED>SEC. 102. PROGRAM ON ARTIFICIAL INTELLIGENCE 
              TESTBEDS.</DELETED>

<DELETED>    (a) Definitions.--In this section:</DELETED>
        <DELETED>    (1) Appropriate committees of congress.--The term 
        ``appropriate committees of Congress'' means--</DELETED>
                <DELETED>    (A) the Committee on Commerce, Science, 
                and Transportation and the Committee on Energy and 
                Natural Resources of the Senate; and</DELETED>
                <DELETED>    (B) the Committee on Science, Space, and 
                Technology of the House of Representatives.</DELETED>
        <DELETED>    (2) Director.--The term ``Director'' means the 
        Director of the National Science Foundation.</DELETED>
        <DELETED>    (3) Institute.--The term ``Institute'' means the 
        Artificial Intelligence Safety Institute established by section 
        101.</DELETED>
        <DELETED>    (4) Secretary.--The term ``Secretary'' means the 
        Secretary of Energy.</DELETED>
        <DELETED>    (5) Under secretary.--The term ``Under Secretary'' 
        means the Under Secretary of Commerce for Standards and 
        Technology.</DELETED>
<DELETED>    (b) Program Required.--Not later than 180 days after the 
date of the enactment of this Act, the Under Secretary shall, in 
coordination with the Secretary and the Director, establish and 
commence carrying out a testbed program to encourage collaboration and 
support partnerships between the National Laboratories, the National 
Institute of Standards and Technology, the National Artificial 
Intelligence Research Resource pilot program established by the 
Director of the National Science Foundation, or any successor program, 
and public and private sector entities, including companies of all 
sizes, to conduct research and development, tests, evaluations, and 
risk assessments of artificial intelligence systems, including 
measurement methodologies developed by the Institute.</DELETED>
<DELETED>    (c) Activities.--In carrying out this program, the Under 
Secretary shall, in coordination with the Secretary--</DELETED>
        <DELETED>    (1) use the advanced computing resources, 
        testbeds, and expertise of the National Laboratories, the 
        Institute, the National Science Foundation, and private sector 
        entities to run tests and evaluations on the capabilities and 
        limitations of artificial intelligence systems;</DELETED>
        <DELETED>    (2) use existing solutions to the maximum extent 
        practicable;</DELETED>
        <DELETED>    (3) develop automated and reproducible tests, 
        evaluations, and risk assessments for artificial intelligence 
        systems to the extent that is practicable;</DELETED>
        <DELETED>    (4) assess the computational resources necessary 
        to run tests, evaluations, and risk assessments of artificial 
        intelligence systems;</DELETED>
        <DELETED>    (5) research methods to effectively minimize the 
        computational resources needed to run tests, evaluations, and 
        risk assessments of artificial intelligence systems;</DELETED>
        <DELETED>    (6) consider developing tests, evaluations, and 
        risk assessments for artificial intelligence systems that are 
        designed for high-, medium-, and low-computational intensity; 
        and</DELETED>
        <DELETED>    (7) prioritize identifying and evaluating 
        scenarios in which the artificial intelligence systems tested 
        or evaluated by a testbed could be deployed in a way that poses 
        security risks, and either establishing classified testbeds, or 
        utilizing existing classified testbeds, at the National 
        Laboratories if necessary, including with respect to--
        </DELETED>
                <DELETED>    (A) autonomous offensive cyber 
                capabilities;</DELETED>
                <DELETED>    (B) cybersecurity vulnerabilities in the 
                artificial intelligence software ecosystem and 
                beyond;</DELETED>
                <DELETED>    (C) chemical, biological, radiological, 
                nuclear, critical infrastructure, and energy-security 
                threats or hazards; and</DELETED>
                <DELETED>    (D) such other capabilities as the Under 
                Secretary determines necessary.</DELETED>
<DELETED>    (d) Consideration Given.--In carrying out the activities 
required by subsection (c), the Under Secretary shall, in coordination 
with the Secretary, take under consideration the applicability of any 
tests, evaluations, and risk assessments to artificial intelligence 
systems trained using primarily biological sequence data, including 
those systems used for gene synthesis.</DELETED>
<DELETED>    (e) Metrics.--The Under Secretary, in collaboration with 
the Secretary, shall develop metrics--</DELETED>
        <DELETED>    (1) to assess the effectiveness of the program in 
        encouraging collaboration and supporting partnerships as 
        described in subsection (b); and</DELETED>
        <DELETED>    (2) to assess the impact of the program on public 
        and private sector integration and use of artificial 
        intelligence systems.</DELETED>
<DELETED>    (f) Use of Existing Program.--In carrying out the program 
required by subsection (a), the Under Secretary may, in collaboration 
with the Secretary and the Director, use a program that was in effect 
on the day before the date of the enactment of this Act.</DELETED>
<DELETED>    (g) Evaluation and Findings.--Not later than 3 years after 
the start of this program, the Under Secretary shall, in collaboration 
with the Secretary--</DELETED>
        <DELETED>    (1) evaluate the success of the program in 
        encouraging collaboration and supporting partnerships as 
        described in subsection (b), using the metrics developed 
        pursuant to subsection (e);</DELETED>
        <DELETED>    (2) evaluate the success of the program in 
        encouraging public and private sector integration and use of 
        artificial intelligence systems by using the metrics developed 
        pursuant to subsection (e); and</DELETED>
        <DELETED>    (3) submit to the appropriate committees of 
        Congress the evaluation supported pursuant to paragraph (1) and 
        the findings of the Under Secretary, the Secretary, and the 
        Director with respect to the testbed program.</DELETED>
<DELETED>    (h) Consultation.--In carrying out subsection (b), the 
Under Secretary shall consult, as the Under Secretary considers 
appropriate, with the following:</DELETED>
        <DELETED>    (1) Industry, including private artificial 
        intelligence laboratories, companies of all sizes, and 
        representatives from the United States financial 
        sector.</DELETED>
        <DELETED>    (2) Academia and institutions of higher 
        education.</DELETED>
        <DELETED>    (3) Civil society.</DELETED>
        <DELETED>    (4) Third-party evaluators.</DELETED>
<DELETED>    (i) Establishment of Foundation Models Test Program.--In 
carrying out the program under subsection (b), the Under Secretary 
shall, acting through the Director of the Institute and in coordination 
with the Secretary of Energy, carry out a test program to provide 
vendors of foundation models the opportunity to voluntarily test 
foundation models across a range of modalities, such as models that 
ingest and output text, images, audio, video, software code, and mixed 
modalities, relative to the Artificial Intelligence Risk Management 
Framework, by--</DELETED>
        <DELETED>    (1) conducting research and regular testing to 
        improve and benchmark the accuracy, efficacy, and bias of 
        foundation models;</DELETED>
        <DELETED>    (2) conducting research to identify key 
        capabilities, limitations, and unexpected behaviors of 
        foundation models;</DELETED>
        <DELETED>    (3) identifying and evaluating scenarios in which 
        these models could pose risks;</DELETED>
        <DELETED>    (4) establishing reference use cases for 
        foundation models and performance criteria for assessing each 
        use case, including accuracy, efficacy, and bias 
        metrics;</DELETED>
        <DELETED>    (5) enabling developers and deployers of 
        foundation models to evaluate such systems for risks, 
        incidents, and vulnerabilities if deployed in such use 
        cases;</DELETED>
        <DELETED>    (6) coordinating public evaluations, which may 
        include prizes and challenges, to evaluate foundation models; 
        and</DELETED>
        <DELETED>    (7) as the Under Secretary and the Secretary 
        consider appropriate, producing public-facing reports of the 
        findings from such testing for a general audience.</DELETED>
<DELETED>    (j) Rule of Construction.--Nothing in this section shall 
be construed to require a person to disclose any information, including 
information--</DELETED>
        <DELETED>    (1) relating to a trade secret or other protected 
        intellectual property right;</DELETED>
        <DELETED>    (2) that is confidential business information; 
        or</DELETED>
        <DELETED>    (3) that is privileged.</DELETED>

<DELETED>SEC. 103. NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY AND 
              DEPARTMENT OF ENERGY TESTBED TO IDENTIFY, TEST, AND 
              SYNTHESIZE NEW MATERIALS.</DELETED>

<DELETED>    (a) Testbed Authorized.--The Secretary of Commerce, acting 
through the Director of the National Institute of Standards and 
Technology, and the Secretary of Energy shall jointly establish a 
testbed to identify, test, and synthesize new materials to advance 
materials science and to support advanced manufacturing for the benefit 
of the United States economy through the use of artificial 
intelligence, autonomous laboratories, and artificial intelligence 
integrated with emerging technologies, such as quantum hybrid computing 
and robotics.</DELETED>
<DELETED>    (b) Support for Accelerated Technologies.--The Secretary 
of Commerce and the Secretary of Energy shall ensure that technologies 
accelerated using the testbed established pursuant to subsection (a) 
are supported by advanced algorithms and models, uncertainty 
quantification, and software and workforce development tools to produce 
benchmark data, model comparison tools, and best practices 
guides.</DELETED>
<DELETED>    (c) Public-Private Partnerships.--In carrying out 
subsection (a), the Secretary of Commerce and the Secretary of Energy 
shall, in consultation with industry, civil society, and academia, 
enter into such public-private partnerships as the Secretaries jointly 
determine appropriate.</DELETED>
<DELETED>    (d) Resources.--In carrying out subsection (a), the 
Secretaries may use resources from National Laboratories and the 
private sector.</DELETED>

<DELETED>SEC. 104. NATIONAL SCIENCE FOUNDATION AND DEPARTMENT OF ENERGY 
              COLLABORATION TO MAKE SCIENTIFIC DISCOVERIES THROUGH THE 
              USE OF ARTIFICIAL INTELLIGENCE.</DELETED>

<DELETED>    (a) In General.--The Director of the National Science 
Foundation (referred to in this section as the ``Director'') and the 
Secretary of Energy (referred to in this section as the ``Secretary'') 
shall collaborate to support new translational scientific discoveries 
and advancements for the benefit of the economy of the United States 
through the use of artificial intelligence, including artificial 
intelligence integrated with emerging technologies, such as quantum 
hybrid computing and robotics.</DELETED>
<DELETED>    (b) Public-Private Partnerships.--In carrying out 
subsection (a), the Director and the Secretary shall enter into such 
public-private partnerships as the Director and the Secretary jointly 
determine appropriate.</DELETED>
<DELETED>    (c) Resources.--In carrying out subsection (a), the 
Director and the Secretary may accept and use resources from the 
National Laboratories, resources from the private sector, and academic 
resources.</DELETED>

<DELETED>SEC. 105. PROGRESS REPORT.</DELETED>

<DELETED>    Not later than 1 year after the date of the enactment of 
this Act, the Director of the Artificial Intelligence Safety Institute 
shall, in coordination with the Secretary of Commerce and the Secretary 
of Energy, submit to Congress a report on the implementation of this 
subtitle.</DELETED>

        <DELETED>Subtitle B--International Cooperation</DELETED>

<DELETED>SEC. 111. INTERNATIONAL COALITION ON INNOVATION, DEVELOPMENT, 
              AND HARMONIZATION OF STANDARDS WITH RESPECT TO ARTIFICIAL 
              INTELLIGENCE.</DELETED>

<DELETED>    (a) In General.--The Secretary of Commerce, the Secretary 
of State, and the Director of the Office of Science and Technology 
Policy (in this section referred to as the ``Director''), in 
consultation with the heads of relevant agencies, shall jointly seek to 
form an alliance or coalition with like-minded governments of foreign 
countries--</DELETED>
        <DELETED>    (1) to cooperate on approaches to innovation and 
        advancements in artificial intelligence and ecosystems for 
        artificial intelligence;</DELETED>
        <DELETED>    (2) to coordinate on development and use of 
        interoperable international standards or harmonization of 
        standards with respect to artificial intelligence;</DELETED>
        <DELETED>    (3) to promote adoption of common artificial 
        intelligence standards;</DELETED>
        <DELETED>    (4) to develop the government-to-government 
        infrastructure needed to facilitate coordination of coherent 
        global application of artificial intelligence safety standards, 
        including, where appropriate, putting in place agreements for 
        information sharing between governments; and</DELETED>
        <DELETED>    (5) to involve private-sector stakeholders from 
        partner countries to help inform coalition partners on recent 
        developments in artificial intelligence and associated 
        standards development.</DELETED>
<DELETED>    (b) Criteria for Participation.--In forming an alliance or 
coalition of like-minded governments of foreign countries under 
subsection (a), the Secretary of Commerce, the Secretary of State, and 
the Director, in consultation with the heads of relevant agencies, 
shall jointly establish technology trust criteria--</DELETED>
        <DELETED>    (1) to ensure all participating countries have a 
        high level of scientific and technological 
        advancement;</DELETED>
        <DELETED>    (2) to ensure all participating countries commit 
        to using open international standards; and</DELETED>
        <DELETED>    (3) to support the governance principles for 
        international standards as detailed in the World Trade 
        Organization Agreement on Technical Barriers to Trade, done at 
        Geneva April 12, 1979, on international standards, such as 
        transparency, openness, and consensus-based decision-
        making.</DELETED>
<DELETED>    (c) Consultation on Innovation and Advancements in 
Artificial Intelligence.--In forming an alliance or coalition under 
subsection (a), the Director, the Secretary of Commerce, and the 
Secretary of State shall consult with the Secretary of Energy and the 
Director of the National Science Foundation on approaches to innovation 
and advancements in artificial intelligence.</DELETED>
<DELETED>    (d) Security and Protection of Intellectual Property.--The 
Director, the Secretary of Commerce, and the Secretary of State shall 
jointly ensure that an alliance or coalition formed under subsection 
(a) is only formed with countries that--</DELETED>
        <DELETED>    (1) have in place sufficient intellectual property 
        protections, safety standards, and risk management approaches 
        relevant to innovation and artificial intelligence; 
        and</DELETED>
        <DELETED>    (2) develop and coordinate research security 
        measures, export controls, and intellectual property 
        protections relevant to innovation, development, and standard-
        setting relating to artificial intelligence.</DELETED>
<DELETED>    (e) Rule of Construction.--Nothing in this section shall 
be construed to prohibit anyone from participating in other 
international standards bodies.</DELETED>

<DELETED>SEC. 112. REQUIREMENT TO SUPPORT BILATERAL AND MULTILATERAL 
              ARTIFICIAL INTELLIGENCE RESEARCH 
              COLLABORATIONS.</DELETED>

<DELETED>    (a) In General.--The Director of the National Science 
Foundation shall support bilateral and multilateral collaborations to 
facilitate innovation in research and development of artificial 
intelligence.</DELETED>
<DELETED>    (b) Alignment With Priorities.--The Director shall ensure 
that collaborations supported under subsection (a) align with the 
priorities of the Foundation and United States research community and 
have the potential to benefit United States prosperity, security, 
health, and well-being.</DELETED>
<DELETED>    (c) Requirements.--The Director shall ensure that 
collaborations supported under subsection (a)--</DELETED>
        <DELETED>    (1) support innovation and advancement in research 
        on the development and use of artificial 
        intelligence;</DELETED>
        <DELETED>    (2) facilitate international collaboration on 
        innovation and advancement in artificial intelligence research 
        and development, including data sharing, expertise, and 
        resources; and</DELETED>
        <DELETED>    (3) leverage existing National Science Foundation 
        programs, such as the National Science Foundation-supported 
        National Artificial Intelligence Research Institutes and Global 
        Centers programs.</DELETED>
<DELETED>    (d) Coordination of Security Measures and Export 
Controls.--When entering into agreements in order to support 
collaborations pursuant to subsection (a), the Director shall ensure 
that participating countries have developed and coordinated security 
measures and export controls to protect intellectual property and 
research and development.</DELETED>

        <DELETED>Subtitle C--Identifying Regulatory Barriers to 
                          Innovation</DELETED>

<DELETED>SEC. 121. COMPTROLLER GENERAL OF THE UNITED STATES 
              IDENTIFICATION OF RISKS AND OBSTACLES RELATING TO 
              ARTIFICIAL INTELLIGENCE AND FEDERAL AGENCIES.</DELETED>

<DELETED>    (a) Report Required.--Not later than 1 year after the date 
of the enactment of this Act, the Comptroller General of the United 
States shall submit to Congress a report on regulatory impediments to 
innovation in artificial intelligence systems.</DELETED>
<DELETED>    (b) Contents.--The report submitted pursuant to subsection 
(a) shall include the following:</DELETED>
        <DELETED>    (1) Significant examples of Federal statutes and 
        regulations that directly affect the innovation of artificial 
        intelligence systems, including the ability of companies of all 
        sizes to compete in artificial intelligence, which should also 
        account for the effect of voluntary standards and best 
        practices developed by the Federal Government.</DELETED>
        <DELETED>    (2) An assessment of challenges that Federal 
        agencies face in the enforcement of provisions of law 
        identified pursuant to paragraph (1).</DELETED>
        <DELETED>    (3) An evaluation of the progress in government 
        adoption of artificial intelligence and use of artificial 
        intelligence to improve the quality of government 
        services.</DELETED>
        <DELETED>    (4) Based on the findings of the Comptroller 
        General with respect to paragraphs (1) through (3), such 
        recommendations as the Comptroller General may have for 
        legislative or administrative action to increase the rate of 
        innovation in artificial intelligence systems.</DELETED>

   <DELETED>TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, 
                 CAPACITY BUILDING ACTIVITIES</DELETED>

<DELETED>SEC. 201. PUBLIC DATA FOR ARTIFICIAL INTELLIGENCE 
              SYSTEMS.</DELETED>

<DELETED>    (a) List of Priorities.--</DELETED>
        <DELETED>    (1) In general.--To expedite the development of 
        artificial intelligence systems in the United States, the 
        Director of the Office of Science and Technology Policy shall, 
        acting through the National Science and Technology Council and 
        the Interagency Committee established or designated pursuant to 
        section 5103 of the National Artificial Intelligence Initiative 
        Act of 2020 (15 U.S.C. 9413), develop a list of priorities for 
        Federal investment in creating or improving curated, publicly 
        available Federal Government data for training and evaluating 
        artificial intelligence systems.</DELETED>
        <DELETED>    (2) Requirements.--</DELETED>
                <DELETED>    (A) In general.--The list developed 
                pursuant to paragraph (1) shall--</DELETED>
                        <DELETED>    (i) prioritize data that will 
                        advance novel artificial intelligence systems 
                        in the public interest; and</DELETED>
                        <DELETED>    (ii) prioritize datasets unlikely 
                        to independently receive sufficient private 
                        sector support to enable their creation, absent 
                        Federal funding.</DELETED>
                <DELETED>    (B) Datasets identified.--In carrying out 
                subparagraph (A)(ii), the Director shall identify 20 
                datasets to be prioritized.</DELETED>
        <DELETED>    (3) Considerations.--In developing the list under 
        paragraph (1), the Director shall consider the 
        following:</DELETED>
                <DELETED>    (A) Applicability to the initial list of 
                societal, national, and geostrategic challenges set 
                forth by subsection (b) of section 10387 of the 
                Research and Development, Competition, and Innovation 
                Act (42 U.S.C. 19107), or any successor list.</DELETED>
                <DELETED>    (B) Applicability to the initial list of 
                key technology focus areas set forth by subsection (c) 
                of such section, or any successor list.</DELETED>
                <DELETED>    (C) Applicability to other major United 
                States economic sectors, such as agriculture, health 
                care, transportation, manufacturing, communications, 
                weather services, and positive utility to small and 
                medium United States businesses.</DELETED>
                <DELETED>    (D) Opportunities to improve datasets in 
                effect before the date of the enactment of this 
                Act.</DELETED>
                <DELETED>    (E) Inclusion of data representative of 
                the entire population of the United States.</DELETED>
                <DELETED>    (F) Potential national security threats to 
                releasing datasets, consistent with the United States 
                Government approach to data flows.</DELETED>
                <DELETED>    (G) Requirements of laws in 
                effect.</DELETED>
                <DELETED>    (H) Applicability to the priorities listed 
                in the National Artificial Intelligence Research and 
                Development Strategic Plan of the National Science and 
                Technology Council, dated October 2016.</DELETED>
                <DELETED>    (I) Ability to use data already made 
                available to the National Artificial Intelligence 
                Research Resource Pilot program or any successor 
                program.</DELETED>
        <DELETED>    (4) Public input.--Before finalizing the list 
        required by paragraph (1), the Director shall implement public 
        comment procedures for receiving input and comment from private 
        industry, academia, civil society, and other relevant 
        stakeholders.</DELETED>
<DELETED>    (b) National Science and Technology Council Agencies.--The 
head of each agency with a representative included in the Interagency 
Committee pursuant to section 5103(c) of the National Artificial 
Intelligence Initiative Act of 2020 (15 U.S.C. 9413(c)) or the heads of 
multiple agencies with a representative included in the Interagency 
Committee working cooperatively, consistent with the missions or 
responsibilities of each Executive agency--</DELETED>
        <DELETED>    (1) subject to the availability of appropriations, 
        shall award grants or otherwise establish incentives, through 
        new or existing programs, for the creation or improvement of 
        curated datasets identified in the list developed pursuant to 
        subsection (a)(1), including methods for addressing data 
        scarcity;</DELETED>
        <DELETED>    (2) may establish or leverage existing 
        initiatives, including public-private partnerships, to 
        encourage private sector cost-sharing in the creation or 
        improvement of such datasets;</DELETED>
        <DELETED>    (3) may apply the priorities set forth in the list 
        developed pursuant to subsection (a)(1) to the enactment of 
        Federal public access and open government data 
        policies;</DELETED>
        <DELETED>    (4) in carrying out this subsection, shall ensure 
        consistency with Federal provisions of law relating to privacy, 
        including the technology and privacy standards applied to the 
        National Secure Data Service under section 10375(f) of the 
        Research and Development, Competition, and Innovation Act (42 
        U.S.C. 19085(f)); and</DELETED>
        <DELETED>    (5) in carrying out this subsection, shall ensure 
        data sharing is limited with any country that the Secretary of 
        Commerce, in consultation with the Secretary of Defense, the 
        Secretary of State, and the Director of National Intelligence, 
        determines to be engaged in conduct that is detrimental to the 
        national security or foreign policy of the United 
        States.</DELETED>
<DELETED>    (c) Availability of Datasets.--Datasets that are created 
or improved by Federal agencies may be made available to the National 
Artificial Intelligence Research Resource pilot program established by 
the Director of the National Science Foundation in accordance with 
Executive Order 14110 (88 Fed. Reg. 75191; relating to safe, secure, 
and trustworthy development and use of artificial intelligence), or any 
successor program.</DELETED>
<DELETED>    (d) Rule of Construction.--Nothing in this subsection 
shall be construed to require the Federal Government or other 
contributors to disclose any information--</DELETED>
        <DELETED>    (1) relating to a trade secret or other protected 
        intellectual property right;</DELETED>
        <DELETED>    (2) that is confidential business information; 
        or</DELETED>
        <DELETED>    (3) that is privileged.</DELETED>

<DELETED>SEC. 202. FEDERAL GRAND CHALLENGES IN ARTIFICIAL 
              INTELLIGENCE.</DELETED>

<DELETED>    (a) List of Priorities for Federal Grand Challenges in 
Artificial Intelligence.--</DELETED>
        <DELETED>    (1) List required.--Not later than 1 year after 
        the date of the enactment of this Act, the Director of the 
        Office of Science and Technology Policy shall, acting through 
        the National Science and Technology Council and the Interagency 
        Committee established or designated pursuant to section 5103 of 
        the National Artificial Intelligence Initiative Act of 2020 (15 
        U.S.C. 9413), in consultation with industry, civil society, and 
        academia, establish a list of priorities for Federal grand 
        challenges in artificial intelligence that seek--</DELETED>
                <DELETED>    (A) to expedite the development of 
                artificial intelligence systems in the United States; 
                and</DELETED>
                <DELETED>    (B) to stimulate artificial intelligence 
                research, development, and commercialization that 
                solves or advances specific, well-defined, and 
                measurable challenges.</DELETED>
        <DELETED>    (2) Contents.--The list established pursuant to 
        paragraph (1) may include the following priorities:</DELETED>
                <DELETED>    (A) To overcome challenges with 
                engineering of and applied research on 
                microelectronics, including through integration of 
                artificial intelligence with emerging technologies, 
                such as machine learning and quantum computing, or with 
                respect to the physical limits on transistors, 
                electrical interconnects, and memory 
                elements.</DELETED>
                <DELETED>    (B) To promote transformational or long-
                term advancements in computing and artificial 
                intelligence technologies through--</DELETED>
                        <DELETED>    (i) next-generation algorithm 
                        design;</DELETED>
                        <DELETED>    (ii) next-generation compute 
                        capability;</DELETED>
                        <DELETED>    (iii) generative and adaptive 
                        artificial intelligence for design 
                        applications;</DELETED>
                        <DELETED>    (iv) photonics-based 
                        microprocessors and optical communication 
                        networks, including electrophotonics;</DELETED>
                        <DELETED>    (v) the chemistry and physics of 
                        new materials;</DELETED>
                        <DELETED>    (vi) energy use or energy 
                        efficiency;</DELETED>
                        <DELETED>    (vii) techniques to establish 
                        cryptographically secure content provenance 
                        information; or</DELETED>
                        <DELETED>    (viii) safety and controls for 
                        artificial intelligence applications.</DELETED>
                <DELETED>    (C) To develop artificial intelligence 
                solutions, including through integration among emerging 
                technologies such as quantum computing and machine 
                learning, to overcome barriers relating to innovations 
                in advanced manufacturing in the United States, 
                including areas such as--</DELETED>
                        <DELETED>    (i) materials, nanomaterials, and 
                        composites;</DELETED>
                        <DELETED>    (ii) rapid, complex 
                        design;</DELETED>
                        <DELETED>    (iii) sustainability and 
                        environmental impact of manufacturing 
                        operations;</DELETED>
                        <DELETED>    (iv) predictive maintenance of 
                        machinery;</DELETED>
                        <DELETED>    (v) improved part 
                        quality;</DELETED>
                        <DELETED>    (vi) process 
                        inspections;</DELETED>
                        <DELETED>    (vii) worker safety; and</DELETED>
                        <DELETED>    (viii) robotics.</DELETED>
                <DELETED>    (D) To develop artificial intelligence 
                solutions in sectors of the economy, such as expanding 
                the use of artificial intelligence in maritime vessels, 
                including in navigation and in the design of propulsion 
                systems and fuels.</DELETED>
                <DELETED>    (E) To develop artificial intelligence 
                solutions to improve border security, including 
                solutions relevant to the detection of fentanyl, 
                illicit contraband, and other illegal 
                activities.</DELETED>
        <DELETED>    (3) Periodic updates.--The Director shall update 
        the list established pursuant to paragraph (1) periodically as 
        the Director determines necessary.</DELETED>
<DELETED>    (b) Federal Investment Initiatives Required.--Subject to 
the availability of appropriations, the head of each agency with a 
representative on the Interagency Committee pursuant to section 5103(c) 
of the National Artificial Intelligence Initiative Act of 2020 (15 
U.S.C. 9413(c)) or the heads of multiple agencies with a representative 
on the Interagency Committee working cooperatively, shall, consistent 
with the missions or responsibilities of each agency, establish 1 or 
more prize competitions under section 24 of the Stevenson-Wydler 
Technology Innovation Act of 1980 (15 U.S.C. 3719), challenge-based 
acquisitions, or other research and development investments that each 
agency head deems appropriate consistent with the list of priorities 
established pursuant to subsection (a)(1).</DELETED>
<DELETED>    (c) Timing and Announcements of Federal Investment 
Initiatives.--The President, acting through the Director, shall ensure 
that, not later than 1 year after the date on which the Director 
establishes the list required by subsection (a)(1), at least 3 prize 
competitions, challenge-based acquisitions, or other research and 
development investments are announced by heads of Federal agencies 
under subsection (b).</DELETED>
<DELETED>    (d) Requirements.--Each head of an agency carrying out an 
investment initiative under subsection (b) shall ensure that--
</DELETED>
        <DELETED>    (1) for each prize competition or investment 
        initiative carried out by the agency under such subsection, 
        there is--</DELETED>
                <DELETED>    (A) a positive impact on the economic 
                competitiveness of the United States;</DELETED>
                <DELETED>    (B) a benefit to United States 
                industry;</DELETED>
                <DELETED>    (C) to the extent possible, leveraging of 
                the resources and expertise of industry and 
                philanthropic partners in shaping the investments; 
                and</DELETED>
                <DELETED>    (D) in a case involving development and 
                manufacturing, use of advanced manufacturing in the 
                United States; and</DELETED>
        <DELETED>    (2) all research conducted for purposes of the 
        investment initiative is conducted in the United 
        States.</DELETED>

SECTION 1. SHORT TITLE; TABLE OF CONTENTS.

    (a) Short Title.--This Act may be cited as the ``Future of 
Artificial Intelligence Innovation Act of 2024''.
    (b) Table of Contents.--The table of contents for this Act is as 
follows:

Sec. 1. Short title; table of contents.
Sec. 2. Sense of Congress.

    TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
       EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION

Sec. 100. Definitions.

   Subtitle A--Artificial Intelligence Safety Institute and Testbeds

Sec. 101. Artificial Intelligence Safety Institute.
Sec. 102. Interagency coordination and program to facilitate artificial 
                            intelligence testbeds.
Sec. 103. National Institute of Standards and Technology and Department 
                            of Energy testbed to identify, test, and 
                            synthesize new materials.
Sec. 104. Coordination, reimbursement, and savings provisions.
Sec. 105. Progress report.

                 Subtitle B--International Cooperation

Sec. 111. International coalitions on innovation, development, and 
                            alignment of standards with respect to 
                            artificial intelligence.

       Subtitle C--Identifying Regulatory Barriers to Innovation

Sec. 121. Comptroller General of the United States identification of 
                            risks and obstacles relating to artificial 
                            intelligence and Federal agencies.

   TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, CAPACITY 
                          BUILDING ACTIVITIES

Sec. 201. Public data for artificial intelligence systems.
Sec. 202. Federal grand challenges in artificial intelligence.

             TITLE III--RESEARCH SECURITY AND OTHER MATTERS

Sec. 301. Research security.
Sec. 302. Expansion of authority to hire critical technical experts.
Sec. 303. Foundation for Standards and Metrology.
Sec. 304. Prohibition on certain policies relating to the use of 
                            artificial intelligence or other automated 
                            systems.
Sec. 305. Certifications and audits of temporary fellows.

SEC. 2. SENSE OF CONGRESS.

    It is the sense of Congress that policies affecting artificial 
intelligence should maximize the potential, development, and use of 
artificial intelligence to benefit all private and public stakeholders.

    TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
       EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION

SEC. 100. DEFINITIONS.

    In this title:
            (1) Artificial intelligence.--The term ``artificial 
        intelligence'' has the meaning given such term in section 5002 
        of the National Artificial Intelligence Initiative Act of 2020 
        (15 U.S.C. 9401).
            (2) Artificial intelligence model.--The term ``artificial 
        intelligence model'' means a component of an artificial 
        intelligence system that is--
                    (A) derived using mathematical, computational, 
                statistical, or machine-learning techniques; and
                    (B) used as part of an artificial intelligence 
                system to produce outputs from a given set of inputs.
            (3) Artificial intelligence system.--The term ``artificial 
        intelligence system'' means an engineered or machine-based 
        system that--
                    (A) can, for a given set of objectives, generate 
                outputs such as predictions, recommendations, or 
                decisions influencing real or virtual environments; and
                    (B) is designed to operate with varying levels of 
                autonomy.
            (4) Critical infrastructure.--The term ``critical 
        infrastructure'' has the meaning given such term in section 
        1016(e) of the Uniting and Strengthening America by Providing 
        Appropriate Tools Required to Intercept and Obstruct Terrorism 
        (USA PATRIOT ACT) Act of 2001 (42 U.S.C. 5195c(e)).
            (5) Federal laboratory.--The term ``Federal laboratory'' 
        has the meaning given such term in section 4 of the Stevenson-
        Wydler Technology Innovation Act of 1980 (15 U.S.C. 3703).
            (6) Foundation model.--The term ``foundation model'' means 
        an artificial intelligence model that is trained on broad data 
        at scale and is adaptable to a wide range of downstream tasks.
            (7) National laboratory.--The term ``National Laboratory'' 
        has the meaning given such term in section 2 of the Energy 
        Policy Act of 2005 (42 U.S.C. 15801).
            (8) Testbed.--The term ``testbed'' means a facility or 
        mechanism equipped for conducting rigorous, transparent, and 
        replicable testing of tools and technologies, including 
        artificial intelligence systems, to help evaluate the 
        functionality, trustworthiness, usability, and performance of 
        those tools or technologies.

   Subtitle A--Artificial Intelligence Safety Institute and Testbeds

SEC. 101. ARTIFICIAL INTELLIGENCE SAFETY INSTITUTE.

    The National Institute of Standards and Technology Act (15 U.S.C. 
271 et seq.) is amended by inserting after section 22A (15 U.S.C. 278h-
1) the following:

``SEC. 22B. ARTIFICIAL INTELLIGENCE SAFETY INSTITUTE.

    ``(a) Definitions.--In this section:
            ``(1) Agency.--The term `agency' has the meaning given the 
        term `Executive agency' in section 105 of title 5, United 
        States Code.
            ``(2) Artificial intelligence.--The term `artificial 
        intelligence' has the meaning given such term in section 5002 
        of the National Artificial Intelligence Initiative Act of 2020 
        (15 U.S.C. 9401).
            ``(3) Artificial intelligence blue-teaming.--The term 
        `artificial intelligence blue-teaming' means an effort to 
        conduct operational vulnerability evaluations and provide 
        mitigation techniques to entities who have a need for an 
        independent technical review of the security posture of an 
        artificial intelligence system.
            ``(4) Artificial intelligence red-teaming.--The term 
        `artificial intelligence red-teaming' means structured 
        adversarial testing efforts of an artificial intelligence 
        system.
            ``(5) Federal laboratory.--The term `Federal laboratory' 
        has the meaning given such term in section 4 of the Stevenson-
        Wydler Technology Innovation Act of 1980 (15 U.S.C. 3703).
            ``(6) Foundation model.--The term `foundation model' means 
        an artificial intelligence model that is trained on broad data 
        at scale and is adaptable to a wide range of downstream tasks.
            ``(7) Synthetic content.--The term `synthetic content' 
        means information, such as images, videos, audio clips, and 
        text, that has been significantly modified or generated by 
        algorithms, including by an artificial intelligence system.
            ``(8) Testbed.--The term `testbed' means a facility or 
        mechanism equipped for conducting rigorous, transparent, and 
        replicable testing of tools and technologies, including 
        artificial intelligence systems, to help evaluate the 
        functionality, trustworthiness, usability, and performance of 
        those tools or technologies.
            ``(9) Watermarking.--The term `watermarking' means the act 
        of embedding information that is intended to be difficult to 
        remove, into outputs generated by artificial intelligence 
        systems or in original content, including outputs such as text, 
        images, audio, videos, software code, or any other digital 
        content or data, for the purposes of verifying the authenticity 
        of the output or the identity or characteristics of its 
        provenance, modifications, or conveyance.
    ``(b) Establishment of Artificial Intelligence Safety Institute.--
            ``(1) In general.--Not later than 90 days after the date of 
        the enactment of the Future of Artificial Intelligence 
        Innovation Act of 2024, the Director shall establish an 
        institute on artificial intelligence within the Institute.
            ``(2) Designation.--The institute established pursuant to 
        paragraph (1) shall be known as the `Artificial Intelligence 
        Safety Institute'.
            ``(3) Mission.--The mission of the Artificial Intelligence 
        Safety Institute is to assist the private sector and agencies 
        in developing voluntary best practices for the robust 
        assessment of artificial intelligence systems, which may be 
        contributed to, or may inform the work on such practices in, 
        standards development organizations.
    ``(c) Functions.--
            ``(1) In general.--The functions of the Artificial 
        Intelligence Safety Institute, which the Artificial 
        Intelligence Safety Institute shall carry out in coordination 
        with the laboratories of the Institute, include the following:
                    ``(A) Using publicly available or voluntarily 
                provided information, assessing artificial intelligence 
                systems and developing best practices for reliable and 
                secure development, deployment, and use of artificial 
                intelligence technology.
                    ``(B) Supporting artificial intelligence red-
                teaming, sharing best practices, and coordinating on 
                building testbeds and test environments with allies and 
                international partners of the United States.
                    ``(C) Developing and publishing physical and 
                cybersecurity tools, methodologies, best practices, 
                voluntary guidelines, and other supporting information 
                to assist persons who maintain systems used to create 
                or train artificial intelligence models with 
                discovering and mitigating vulnerabilities and attacks, 
                including manipulation through data poisoning, 
                including those that may be exploited by foreign 
                adversaries.
                    ``(D) Establishing artificial intelligence blue-
                teaming capabilities to support mitigation approaches 
                and partnering with industry to address the reliability 
                of artificial intelligence systems.
                    ``(E) Developing tools, methodologies, best 
                practices, and voluntary guidelines for detecting 
                synthetic content, authenticating content and tracking 
                of the provenance of content, labeling original and 
                synthetic content, such as by watermarking, and 
                evaluating software and systems relating to detection 
                and labeling of synthetic content.
                    ``(F) Coordinating or developing metrics and 
                methodologies for testing artificial intelligence 
                systems, including the following:
                            ``(i) Cataloging existing artificial 
                        intelligence metrics and evaluation 
                        methodologies used in industry and academia.
                            ``(ii) Testing the efficacy of existing 
                        metrics and evaluations.
                    ``(G) Coordinating with counterpart international 
                institutions, partners, and allies, to support global 
                interoperability in the development of research and 
                testing of standards relating to artificial 
                intelligence.
    ``(d) Artificial Intelligence Safety Institute Consortium.--
            ``(1) Establishment.--
                    ``(A) In general.--Not later than 180 days after 
                the date of the enactment of the Future of Artificial 
                Intelligence Innovation Act of 2024, the Director 
                shall establish a consortium of stakeholders with 
                expertise in matters relating to artificial 
                intelligence, drawn from academic or research 
                communities, Federal laboratories, private industry 
                (including companies of all sizes with different roles 
                in the use of artificial intelligence systems, such as 
                developers, deployers, evaluators, and users), and 
                civil society, to support the Artificial Intelligence 
                Safety Institute in carrying out the functions set 
                forth under subsection (c).
                    ``(B) Designation.--The consortium established 
                pursuant to subparagraph (A) shall be known as the 
                `Artificial Intelligence Safety Institute Consortium'.
            ``(2) Consultation.--The Director shall consult with the 
        consortium established under this subsection not less 
        frequently than quarterly.
            ``(3) Annual reports to congress.--Not later than 1 year 
        after the date of the enactment of the Future of Artificial 
        Intelligence Innovation Act of 2024 and not less frequently 
        than once each year thereafter, the Director shall submit to 
        the Committee on Commerce, Science, and Transportation of the 
        Senate and the Committee on Science, Space, and Technology of 
        the House of Representatives a report summarizing the 
        contributions of the members of the consortium established 
        under this subsection in support of the efforts of the 
        Artificial Intelligence Safety Institute.
    ``(e) Voluntary Artificial Intelligence Testing Standards.--In 
carrying out the functions under subsection (c), the Director shall 
support and contribute to the development of voluntary, consensus-based 
technical standards for testing artificial intelligence system 
components, including by addressing, as the Director considers 
appropriate, the following:
            ``(1) Physical infrastructure for training or developing 
        artificial intelligence models and systems, including cloud 
        infrastructure.
            ``(2) Physical infrastructure for operating artificial 
        intelligence systems, including cloud infrastructure.
            ``(3) Data for training artificial intelligence models.
            ``(4) Data for evaluating the functionality and 
        trustworthiness of trained artificial intelligence models and 
        systems.
            ``(5) Trained or partially trained artificial intelligence 
        models and any resulting software systems or products.
            ``(6) Human-in-the-loop testing of artificial intelligence 
        models and systems.
    ``(f) Matters Relating to Disclosure and Access.--
            ``(1) FOIA exemption.--Any confidential content, as deemed 
        confidential by the contributing private sector person, shall 
        be exempt from public disclosure under section 552(b)(3) of 
        title 5, United States Code.
            ``(2) Limitation on access to content.--Access to a 
        contributing private sector person's voluntarily provided 
        confidential content, as deemed confidential by the 
        contributing private sector person shall be limited to the 
        private sector person and the Artificial Intelligence Safety 
        Institute.
            ``(3) Aggregated information.--The Director may make 
        aggregated, deidentified information available to contributing 
        companies, the public, and other agencies, as the Director 
        considers appropriate, in support of the purposes of this 
        section.
    ``(g) Rule of Construction.--Nothing in this section shall be 
construed to provide the Director any enforcement authority that was 
not in effect on the day before the date of the enactment of the Future 
of Artificial Intelligence Innovation Act of 2024.
    ``(h) Prohibition on Access to Resources for Entities Under Control 
of Certain Foreign Governments.--
            ``(1) In general.--An entity under the ownership, control, 
        or influence of the government of a covered nation may not 
        access any of the resources of the Artificial Intelligence 
        Safety Institute.
            ``(2) Criteria for identification.--The Director, working 
        with the heads of the relevant Federal agencies, shall 
        establish criteria to determine if any entity that seeks to 
        utilize the resources of the Artificial Intelligence Safety 
        Institute is under the ownership, control, or influence of the 
        government of a covered nation.
            ``(3) Definitions.--In this subsection:
                    ``(A) Covered nation.--The term `covered nation' 
                has the meaning given that term in section 4872 of 
                title 10, United States Code.
                    ``(B) Ownership, control, or influence of the 
                government of a covered nation.--The term `ownership, 
                control, or influence of the government of a covered 
                nation', with respect to an entity, means that the 
                government of a covered nation--
                            ``(i) has the power to direct or decide 
                        matters affecting the entity's management or 
                        operations in a manner that could--
                                    ``(I) result in unauthorized access 
                                to classified information; or
                                    ``(II) adversely affect performance 
                                of a contract or agreement requiring 
                                access to classified information; and
                            ``(ii) exercises that power--
                                    ``(I) directly or indirectly;
                                    ``(II) through ownership of the 
                                entity's securities, by contractual 
                                arrangements, or other similar means;
                                    ``(III) by the ability to control 
                                or influence the election or 
                                appointment of one or more members to 
                                the entity's governing board (such as 
                                the board of directors, board of 
                                managers, or board of trustees) or its 
                                equivalent; or
                                    ``(IV) prospectively (such as by 
                                not currently exercising the power but 
                                retaining the ability to do so).''.

SEC. 102. INTERAGENCY COORDINATION AND PROGRAM TO FACILITATE ARTIFICIAL 
              INTELLIGENCE TESTBEDS.

    (a) Definitions.--In this section:
            (1) Appropriate committees of congress.--The term 
        ``appropriate committees of Congress'' means--
                    (A) the Committee on Commerce, Science, and 
                Transportation and the Committee on Energy and Natural 
                Resources of the Senate; and
                    (B) the Committee on Science, Space, and Technology 
                of the House of Representatives.
            (2) Director.--The term ``Director'' means the Director of 
        the National Science Foundation.
            (3) Institute.--The term ``Institute'' means the National 
        Institute of Standards and Technology.
            (4) Secretary.--The term ``Secretary'' means the Secretary 
        of Energy.
            (5) Under secretary.--The term ``Under Secretary'' means 
        the Under Secretary of Commerce for Standards and Technology.
    (b) Program Required.--Not later than 1 year after the date of the 
enactment of this Act, the Under Secretary and the Secretary, in 
coordination with the Director, shall jointly establish a testbed 
program to encourage collaboration and support partnerships between the 
National Laboratories, Federal laboratories, the National Institute of 
Standards and Technology, the National Artificial Intelligence Research 
Resource pilot program established by the Director, or any successor 
program, and public and private sector entities, including companies of 
all sizes, to conduct tests, evaluations, and security or vulnerability 
risk assessments, and to support research and development, of 
artificial intelligence systems, including measurement methodologies 
developed by the Institute, in order to develop standards and encourage 
development of a third-party ecosystem.
    (c) Activities.--In carrying out the program required by subsection 
(b), the Under Secretary and the Secretary--
            (1) may use the advanced computing resources, testbeds, and 
        expertise of the National Laboratories, Federal laboratories, 
        the Institute, the National Science Foundation, and private 
        sector entities to run tests and evaluations on the 
        capabilities and limitations of artificial intelligence 
        systems;
            (2) shall use existing solutions to the maximum extent 
        practicable;
            (3) shall develop automated and reproducible tests and 
        evaluations for artificial intelligence systems to the extent 
        that is practicable;
            (4) shall assess the computational resources necessary to 
        run tests and evaluations of artificial intelligence systems;
            (5) shall research methods to effectively minimize the 
        computational resources needed to run tests, evaluations, and 
        security assessments of artificial intelligence systems;
            (6) shall, where practicable, develop tests and evaluations 
        for artificial intelligence systems that are designed for high-
        , medium-, and low-computational intensity; and
            (7) shall prioritize assessments by identifying security 
        vulnerabilities of artificial intelligence systems, including 
        through the establishment of classified testbeds and the 
        utilization of existing classified testbeds, at the National 
        Laboratories if necessary, including with respect to--
                    (A) autonomous offensive cyber capabilities;
                    (B) cybersecurity vulnerabilities in the artificial 
                intelligence software ecosystem and beyond;
                    (C) chemical, biological, radiological, nuclear, 
                critical infrastructure, and energy-security threats or 
                hazards; and
                    (D) such other capabilities as the Under Secretary 
                or the Secretary determines necessary.
    (d) Consideration Given.--In carrying out the activities required 
by subsection (c), the Under Secretary and the Secretary shall take 
into consideration the applicability of any tests, evaluations, and 
risk assessments to artificial intelligence systems trained using 
primarily biological sequence data that could be used to enhance an 
artificial intelligence system's ability to contribute to the creation 
of a pandemic or biological weapon, including those systems used for 
gene synthesis.
    (e) Metrics.--The Under Secretary and the Secretary shall jointly 
develop metrics to assess--
            (1) the effectiveness of the program in encouraging 
        collaboration and supporting partnerships as described in 
        subsection (b); and
            (2) the impact of the program on public and private sector 
        integration and use of artificial intelligence systems.
    (f) Use of Existing Program.--In carrying out the program required 
by subsection (b), the Under Secretary, the Secretary, and the Director 
may use a program that was in effect on the day before the date of the 
enactment of this Act.
    (g) Evaluation and Findings.--Not later than 3 years after the 
start of the program required by subsection (b), the Under Secretary 
and the Secretary shall jointly--
            (1) evaluate the success of the program in encouraging 
        collaboration and supporting partnerships as described in 
        subsection (b), using the metrics developed pursuant to 
        subsection (e);
            (2) evaluate the success of the program in encouraging 
        public and private sector integration and use of artificial 
        intelligence systems by using the metrics developed pursuant to 
        subsection (e); and
            (3) submit to the appropriate committees of Congress the 
        evaluations conducted pursuant to paragraphs (1) and (2) and 
        the findings of the Under Secretary, the Secretary, and the 
        Director with respect to the testbed program.
    (h) Consultation.--In carrying out subsection (b), the Under 
Secretary and the Secretary shall consult, as the Under Secretary and 
the Secretary consider appropriate, with the following:
            (1) Industry, including private artificial intelligence 
        laboratories, companies of all sizes, and representatives from 
        the United States financial sector.
            (2) Academia and institutions of higher education.
            (3) Civil society.
    (i) Establishment of Voluntary Foundation Models Test Program.--In 
carrying out the program under subsection (b), the Under Secretary and 
the Secretary shall jointly carry out a test program to provide 
vendors of foundation models, as well as vendors of artificial 
intelligence virtual agents and robots that incorporate foundation 
models, the opportunity to voluntarily test foundation models across a 
range of modalities, such as models that ingest and output text, 
images, audio, video, software code, and mixed modalities.
    (j) Matters Relating to Disclosure and Access.--
            (1) Limitation on access to content.--Access to a 
        contributing private sector person's voluntarily provided 
        confidential content, as deemed confidential by the 
        contributing private sector person, shall be limited to the 
        contributing private sector person and the Institute.
            (2) Aggregated information.--The Under Secretary and the 
        Secretary may make aggregated, deidentified information 
        available to contributing companies, the public, and other 
        agencies, as the Under Secretary considers appropriate, in 
        support of the purposes of this section.
            (3) FOIA exemption.--Any confidential content, as deemed 
        confidential by the contributing private sector person, shall 
        be exempt from public disclosure under section 552(b)(3) of 
        title 5, United States Code.
    (k) Rule of Construction.--Nothing in this section shall be 
construed to require a person to disclose any information, including 
information--
            (1) relating to a trade secret or other protected 
        intellectual property right;
            (2) that is confidential business information; or
            (3) that is privileged.
    (l) Sunset.--The programs required by subsections (b) and (i) and 
the requirements of this section shall terminate on the date that is 7 
years after the date of the enactment of this Act.

SEC. 103. NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY AND DEPARTMENT 
              OF ENERGY TESTBED TO IDENTIFY, TEST, AND SYNTHESIZE NEW 
              MATERIALS.

    (a) In General.--The Secretary of Commerce, acting through the 
Under Secretary of Commerce for Standards and Technology, and the 
Secretary of Energy may use the program established under section 
102(b) to advance materials science and energy storage and optimization 
and to support advanced manufacturing for the benefit of the United 
States economy through the use of artificial intelligence, autonomous 
laboratories, and artificial intelligence integrated with emerging 
technologies, such as quantum hybrid computing and robotics.
    (b) Support for Accelerated Technologies.--The Secretary of 
Commerce and the Secretary of Energy shall ensure that technologies 
accelerated under subsection (a) are supported by advanced algorithms 
and models, uncertainty quantification, and software and workforce 
development tools to produce benchmark data, model comparison tools, 
and best practices guides.
    (c) Public-private Partnerships.--In carrying out subsection (a), 
the Secretary of Commerce and the Secretary of Energy shall, in 
consultation with industry, civil society, and academia, enter into 
such public-private partnerships as the Secretaries jointly determine 
appropriate.
    (d) Resources.--In carrying out this section, the Secretaries may--
            (1) use science and technology resources from the 
        Manufacturing USA Program, the Hollings Manufacturing Extension 
        Partnership, the National Laboratories, Federal laboratories, 
        and the private sector; and
            (2) use the program established under section 102(b).

SEC. 104. COORDINATION, REIMBURSEMENT, AND SAVINGS PROVISIONS.

    (a) Coordination and Duplication.--The Secretary of Commerce shall 
take such actions as may be necessary to ensure no duplication of 
activities carried out under this subtitle with the activities of--
            (1) research entities of the Department of Energy, 
        including--
                    (A) the National Laboratories; and
                    (B) the Advanced Scientific Computing Research 
                program; and
            (2) relevant industries.
    (b) National Laboratory Resources.--Any advanced computing 
resources, testbeds, expertise, or other resources of the Department of 
Energy or the National Laboratories that are provided to the National 
Science Foundation, the National Institute of Standards and Technology, 
or any other applicable entities under this subtitle shall be 
provided--
            (1) on a reimbursable basis; and
            (2) pursuant to a reimbursable agreement.
    (c) Waiver.--The Secretary may waive the requirements set forth in 
subsection (b) if the Secretary determines the waiver is necessary or 
appropriate to carry out the missions of the Department of Commerce.
    (d) Savings Provision.--Nothing in this subtitle shall be 
construed--
            (1) to modify any requirement or authority provided under 
        section 5501 of the National Artificial Intelligence Initiative 
        Act of 2020 (15 U.S.C. 9461); or
            (2) to allow the Secretary of Commerce (including the Under 
        Secretary of Commerce for Standards and Technology or the 
        Director of the Artificial Intelligence Safety Institute) or 
        the Director of the National Science Foundation to use monetary 
        resources of the Department of Energy or any National 
        Laboratory.

SEC. 105. PROGRESS REPORT.

    (a) In General.--Not later than 1 year after the date of the 
enactment of this Act, the Under Secretary of Commerce for Standards 
and Technology shall, in coordination with the Secretary of Commerce 
and the Secretary of Energy, submit to Congress a report on the 
implementation of sections 102 and 103.
    (b) Contents.--The report submitted pursuant to subsection (a) 
shall include the following:
            (1) A description of the reimbursable agreements, 
        statements of work, and associated project schedules and 
        deliverables for the testbed program established pursuant to 
        section 102(b) and section 103(a).
            (2) Details on the total amount of reimbursable agreements 
        entered into pursuant to section 104(b).
            (3) Such additional information as the Under Secretary 
        determines appropriate.

                 Subtitle B--International Cooperation

SEC. 111. INTERNATIONAL COALITIONS ON INNOVATION, DEVELOPMENT, AND 
              ALIGNMENT OF STANDARDS WITH RESPECT TO ARTIFICIAL 
              INTELLIGENCE.

    (a) In General.--The Under Secretary of Commerce for Standards and 
Technology (in this section referred to as the ``Under Secretary'') and 
the Secretary of Energy (in this section referred to as the 
``Secretary'') shall jointly lead information exchange and coordination 
among Federal agencies and communication from Federal agencies to the 
private sector of the United States and like-minded governments of 
foreign countries to ensure effective Federal engagement in the 
development and use of international technical standards for artificial 
intelligence.
    (b) Requirements.--To support private sector-led engagement and 
ensure effective Federal engagement in the development and use of 
international technical standards for artificial intelligence, the 
Under Secretary shall seek to form alliances or coalitions with like-
minded governments of foreign countries--
            (1) to support the private sector-led development and 
        adoption of standards or alignment with respect to artificial 
        intelligence;
            (2) to encourage technical standards developed in the 
        United States to be adopted by international standards 
        organizations;
            (3) to facilitate international collaboration on 
        innovation, science, and advancement in artificial intelligence 
        research and development, including data sharing, expertise, 
        and resources; and
            (4) to develop the government-to-government infrastructure 
        to support the activities described in paragraphs (1) through 
        (3), using existing bilateral and multilateral agreements to 
        the extent practicable.
    (c) Criteria for Participation.--In forming an alliance or 
coalition of like-minded governments of foreign countries under 
subsection (b), the Secretary of Commerce, the Secretary of Energy, the 
Secretary of State, and the Director, in consultation with the heads of 
relevant agencies, shall jointly establish technology trust criteria--
            (1) to ensure all partner countries have a high level of 
        scientific and technological advancement; and
            (2) to support the principles for international standards 
        development as detailed in the Committee Decision on World 
        Trade Organization Agreement on Technical Barriers to Trade 
        (Annex 2 of Part 1 of G/TBT/1), on international standards, 
        such as transparency, openness, and consensus-based decision-
        making.
    (d) Consultation on Innovation and Advancements in Artificial 
Intelligence.--In forming an alliance or coalition under subsection 
(b), the Director, the Secretary of Commerce, and the Secretary of 
State shall consult with the Secretary of Energy and the Director of 
the National Science Foundation on approaches to innovation and 
advancements in artificial intelligence.
    (e) Security and Protection of Intellectual Property.--The 
Director, the Secretary of Commerce, the Secretary of Energy, and the 
Secretary of State shall jointly ensure that an alliance or coalition 
formed under subsection (b) is only undertaken with countries that--
            (1) have in place sufficient intellectual property 
        protections, safety standards, and risk management approaches 
        relevant to innovation and artificial intelligence; and
            (2) develop and coordinate research security measures, 
        export controls, and intellectual property protections relevant 
        to innovation, development, and standard-setting relating to 
        artificial intelligence.
    (f) Limitation on Eligibility of the People's Republic of China.--
            (1) In general.--The People's Republic of China is not 
        eligible to participate in an alliance or coalition of like-
        minded governments of foreign countries under subsection (b) 
        until the United States Trade Representative determines in a 
        report to Congress required by section 421 of the U.S.-China 
        Relations Act of 2000 (22 U.S.C. 6951) that the People's 
        Republic of China has come into compliance with the commitments 
        it made in connection with its accession to the World Trade 
        Organization.
            (2) Report required.--Upon the submission of a report 
        described in paragraph (1), the officials specified in 
        paragraph (3) shall jointly submit to Congress a report that 
        includes the following:
                    (A) A detailed justification for why government-to-
                government information exchange and coordination with 
                the Government of the People's Republic of China is in 
                the national security interests of the United States.
                    (B) An assessment of the risks and potential 
                effects of such coordination, including any potential 
                for the transfer under an alliance or coalition 
                described in paragraph (1) of technology or 
                intellectual property capable of harming the national 
                security interests of the United States.
                    (C) A detailed justification for how the officials 
                specified in paragraph (3) intend to address human 
                rights concerns in any scientific and technology 
                collaboration proposed to be conducted by such an 
                alliance or coalition.
                    (D) An assessment of the extent to which those 
                officials will be able to continuously monitor the 
                commitments made by the People's Republic of China in 
                participating in such an alliance or coalition.
                    (E) Such other information relating to such an 
                alliance or coalition as those officials consider 
                appropriate.
            (3) Officials specified.--The officials specified in this 
        paragraph are the following:
                    (A) The Director.
                    (B) The Secretary of Commerce.
                    (C) The Secretary of Energy.
                    (D) The Secretary of State.
    (g) Rule of Construction.--Nothing in this section shall be 
construed--
            (1) to prohibit a person (as defined in section 551 of 
        title 5, United States Code) from participating in an 
        international standards body; or
            (2) to constrain separate engagement with emerging 
        economies on artificial intelligence.

       Subtitle C--Identifying Regulatory Barriers to Innovation

SEC. 121. COMPTROLLER GENERAL OF THE UNITED STATES IDENTIFICATION OF 
              RISKS AND OBSTACLES RELATING TO ARTIFICIAL INTELLIGENCE 
              AND FEDERAL AGENCIES.

    (a) Report Required.--Not later than 1 year after the date of the 
enactment of this Act, the Comptroller General of the United States 
shall submit to Congress a report on regulatory impediments to 
innovation in artificial intelligence systems.
    (b) Contents.--The report submitted pursuant to subsection (a) 
shall include the following:
            (1) Significant examples of Federal statutes and 
        regulations that directly affect the innovation of artificial 
        intelligence systems, including the ability of companies of all 
        sizes to compete in artificial intelligence, which should also 
        account for the effect of voluntary standards and best 
        practices developed with contributions from the Federal 
        Government.
            (2) An evaluation of the progress in government adoption of 
        artificial intelligence and use of artificial intelligence to 
        improve the quality of government services.
            (3) Based on the findings of the Comptroller General with 
        respect to paragraphs (1) and (2), such recommendations as the 
        Comptroller General may have for legislative or administrative 
        action to increase the rate of innovation in artificial 
        intelligence systems.

   TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, CAPACITY 
                          BUILDING ACTIVITIES

SEC. 201. PUBLIC DATA FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

    (a) In General.--Title LI of the National Artificial Intelligence 
Initiative Act of 2020 (15 U.S.C. 9411 et seq.) is amended by adding at 
the end the following new section:

``SEC. 5103A. PUBLIC DATA FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

    ``(a) List of Priorities.--
            ``(1) In general.--To expedite the development of 
        artificial intelligence systems in the United States, the 
        Director of the Office of Science and Technology Policy (in 
        this section referred to as the `Director') shall, acting 
        through the National Science and Technology Council and the 
        Interagency Committee and in consultation with the Advisory 
        Committee on Data for Evidence Building established under 
        section 315 of title 5, United States Code, develop a list of 
        priorities for Federal investment in creating or improving 
        curated, publicly available Federal Government data for 
        training and evaluating artificial intelligence systems and 
        identify an appropriate location to host curated datasets.
            ``(2) Requirements.--
                    ``(A) In general.--The list developed pursuant to 
                paragraph (1) shall--
                            ``(i) prioritize data that will advance 
                        novel artificial intelligence systems in the 
                        public interest; and
                            ``(ii) prioritize datasets unlikely to 
                        independently receive sufficient private sector 
                        support to enable their creation, absent 
                        Federal funding.
                    ``(B) Datasets identified.--In carrying out 
                subparagraph (A)(ii), the Director shall identify 20 
                datasets to be prioritized.
            ``(3) Considerations.--In developing the list under 
        paragraph (1), the Director shall consider the following:
                    ``(A) Applicability to the initial list of 
                societal, national, and geostrategic challenges set 
                forth by subsection (b) of section 10387 of the 
                Research and Development, Competition, and Innovation 
                Act (42 U.S.C. 19107), or any successor list.
                    ``(B) Applicability to the initial list of key 
                technology focus areas set forth by subsection (c) of 
                such section, or any successor list.
                    ``(C) Applicability to other major United States 
                economic sectors, such as agriculture, health care, 
                transportation, manufacturing, communications, weather 
                services, and positive utility to small- and medium-
                sized United States businesses.
                    ``(D) Opportunities to improve datasets in effect 
                before the date of the enactment of the Future of 
                Artificial Intelligence Innovation Act of 2024.
                    ``(E) Inclusion of data representative of the 
                entire population of the United States.
                    ``(F) Potential national security threats to 
                releasing datasets, consistent with the United States 
                Government approach to data flows.
                    ``(G) Requirements of laws in effect.
                    ``(H) Applicability to the priorities listed in the 
                National Artificial Intelligence Research and 
                Development Strategic Plan of the National Science and 
                Technology Council, dated October 2016.
                    ``(I) Ability to use data already made available to 
                the National Artificial Intelligence Research Resource 
                Pilot program or any successor program.
                    ``(J) Coordination with other Federal open data 
                efforts, as applicable.
            ``(4) Public input.--Before finalizing the list required by 
        paragraph (1), the Director shall implement public comment 
        procedures for receiving input and comment from private 
        industry, academia, civil society, and other relevant 
        stakeholders.
    ``(b) Interagency Committee.--In carrying out this section, the 
Interagency Committee--
            ``(1) may establish or leverage existing initiatives, 
        including through public-private partnerships, for the creation 
        or improvement of curated datasets identified in the list 
        developed pursuant to subsection (a)(1), including methods for 
        addressing data scarcity;
            ``(2) may apply the priorities set forth in the list 
        developed pursuant to subsection (a)(1) to the enactment of 
        Federal public access and open government data policies;
            ``(3) shall ensure consistency with Federal provisions of 
        law relating to privacy, including the technology and privacy 
        standards applied to the National Secure Data Service under 
        section 10375(f) of the Research and Development, Competition, 
        and Innovation Act (42 U.S.C. 19085(f)); and
            ``(4) shall ensure that no data sharing is permitted with 
        any country that the Secretary of Commerce, in consultation 
        with the Secretary of Defense, the Secretary of State, the 
        Secretary of Energy, and the Director of National Intelligence, 
        determines to be engaged in conduct that is detrimental to the 
        national security or foreign policy of the United States.
    ``(c) Availability of Datasets.--Datasets that are created or 
improved pursuant to this section--
            ``(1) shall, in the case of a dataset created or improved 
        by a Federal agency, be made available to the comprehensive 
        data inventory developed and maintained by the Federal agency 
        pursuant to section 3511(a) of title 44, United States Code, in 
        accordance with all applicable regulations; and
            ``(2) may be made available to the National Artificial 
        Intelligence Research Resource pilot program established by the 
        Director of the National Science Foundation, and the applicable 
        programs established by the Department of Energy, in accordance 
        with Executive Order 14110 (88 Fed. Reg. 75191; relating to 
        safe, secure, and trustworthy development and use of artificial 
        intelligence), or any successor program.
    ``(d) Report.--Not later than 1 year after the date of the 
enactment of the Future of Artificial Intelligence Innovation Act of 
2024, the Director shall, acting through the National Science and 
Technology Council and the Interagency Committee, submit to the 
Committee on Commerce, Science, and Transportation of the Senate and 
the Committee on Science, Space, and Technology of the House of 
Representatives a report that includes--
            ``(1) best practices in developing publicly curated 
        artificial intelligence datasets;
            ``(2) lessons learned and challenges encountered in 
        developing the curated artificial intelligence datasets;
            ``(3) principles used for artificial intelligence-ready 
        data; and
            ``(4) recommendations related to artificial intelligence-
        ready data standards and potential processes for development of 
        such standards.
    ``(e) Rules of Construction.--
            ``(1) In general.--Nothing in this section shall be 
        construed to require the Federal Government or other 
        contributors to disclose any information--
                    ``(A) relating to a trade secret or other protected 
                intellectual property right;
                    ``(B) that is confidential business information; or
                    ``(C) that is privileged.
            ``(2) Disclosure to public datasets.--Except as 
        specifically provided for in this section, nothing in this 
        section shall be construed to prohibit the head of a Federal 
        agency from withholding information from a public dataset.''.
    (b) Clerical Amendments.--The table of contents at the beginning of 
section 2 of the William M. (Mac) Thornberry National Defense 
Authorization Act for Fiscal Year 2021 and the table of contents at the 
beginning of title LI of such Act are both amended by inserting after 
the items relating to section 5103 the following new item:

``5103A. Public data for artificial intelligence systems.''.

SEC. 202. FEDERAL GRAND CHALLENGES IN ARTIFICIAL INTELLIGENCE.

    (a) In General.--Title LI of the National Artificial Intelligence 
Initiative Act of 2020 (15 U.S.C. 9411 et seq.), as amended by section 
201, is further amended by adding at the end the following new section:

``SEC. 5107. FEDERAL GRAND CHALLENGES IN ARTIFICIAL INTELLIGENCE.

    ``(a) Establishment of Program.--
            ``(1) In general.--Not later than 1 year after the date of 
        the enactment of the Future of Artificial Intelligence 
        Innovation Act of 2024, the Director of the Office of Science 
        and Technology Policy (acting through the National Science and 
        Technology Council) and the Interagency Committee may establish 
        a program to award prizes, using the authorities and processes 
        established under section 24 of the Stevenson-Wydler Technology 
        Innovation Act of 1980 (15 U.S.C. 3719), to eligible 
        participants as determined by the co-chairs of the Interagency 
        Committee pursuant to subsection (e).
            ``(2) Purposes.--The purposes of the program required by 
        paragraph (1) are as follows:
                    ``(A) To expedite the development of artificial 
                intelligence systems in the United States.
                    ``(B) To stimulate artificial intelligence 
                research, development, and commercialization that 
                solves or advances specific, well-defined, and 
                measurable challenges in 1 or more of the categories 
                established pursuant to subsection (b).
    ``(b) Federal Grand Challenges in Artificial Intelligence.--
            ``(1) List of priorities.--The Director of the Office of 
        Science and Technology Policy (acting through the National 
        Science and Technology Council) and the Interagency Committee 
        shall, in consultation with industry, civil society, and 
        academia, identify, and annually review and update as the 
        Director 
        considers appropriate, a list of priorities for Federal grand 
        challenges in artificial intelligence pursuant to the purposes 
        set forth under subsection (a)(2).
            ``(2) Initial list.--
                    ``(A) Contents.--The list established pursuant to 
                paragraph (1) may include the following priorities:
                            ``(i) To overcome challenges with 
                        engineering of and applied research on 
                        microelectronics, including through integration 
                        of artificial intelligence with emerging 
                        technologies, such as neuromorphic and quantum 
                        computing, or with respect to the physical 
                        limits on transistors, advanced interconnects, 
                        and memory elements.
                            ``(ii) To promote transformational or long-
                        term advancements in computing and artificial 
                        intelligence technologies through--
                                    ``(I) next-generation algorithm 
                                design;
                                    ``(II) next-generation compute 
                                capability;
                                    ``(III) generative and adaptive 
                                artificial intelligence for design 
                                applications;
                                    ``(IV) photonics-based 
                                microprocessors and optical 
                                communication networks, including 
                                electrophotonics;
                                    ``(V) the chemistry and physics of 
                                new materials;
                                    ``(VI) energy use or energy 
                                efficiency;
                                    ``(VII) techniques to establish 
                                cryptographically secure content 
                                provenance information; or
                                    ``(VIII) safety and controls for 
                                artificial intelligence applications.
                            ``(iii) To promote explainability and 
                        mechanistic interpretability of artificial 
                        intelligence systems.
                            ``(iv) To develop artificial intelligence 
                        solutions, including through integration among 
                        emerging technologies such as neuromorphic and 
                         quantum computing, to overcome barriers relating 
                        to innovations in advanced manufacturing in the 
                        United States, including areas such as--
                                    ``(I) materials, nanomaterials, and 
                                composites;
                                    ``(II) rapid, complex design;
                                    ``(III) sustainability and 
                                environmental impact of manufacturing 
                                operations;
                                    ``(IV) predictive maintenance of 
                                machinery;
                                    ``(V) improved part quality;
                                    ``(VI) process inspections;
                                    ``(VII) worker safety; and
                                    ``(VIII) robotics.
                            ``(v) To develop artificial intelligence 
                        solutions in sectors of the economy, such as 
                        expanding the use of artificial intelligence in 
                        maritime vessels, including in navigation and 
                        in the design of propulsion systems and fuels.
                            ``(vi) To develop artificial intelligence 
                        solutions to improve border security, including 
                        solutions relevant to the detection of 
                        fentanyl, illicit contraband, and other illegal 
                        activities.
                            ``(vii) To develop artificial intelligence 
                        for science applications.
                            ``(viii) To develop cybersecurity for 
                        artificial intelligence-related intellectual 
                        property, such as artificial intelligence 
                        systems and artificial intelligence algorithms.
                            ``(ix) To develop artificial intelligence 
                        solutions to modernize code and software 
                        systems that are deployed in government 
                        agencies and critical infrastructure and are at 
                        risk of maintenance difficulties due to code 
                        obsolescence or challenges finding expertise in 
                        outdated code bases.
            ``(3) Consultation on identification and selection of grand 
        challenges.--The Director of the Office of Science and 
        Technology Policy, the Director of the National Institute of 
        Standards and Technology, the Director of the Defense Advanced 
        Research Projects Agency, such agency heads as the Director of 
        the Office of Science and Technology Policy considers relevant, 
        and the National Artificial Intelligence Advisory Committee 
        shall each identify and select artificial intelligence research 
        and development grand challenges in which eligible participants 
        will compete to solve or advance for prize awards under 
        subsection (a).
            ``(4) Public input on identification.--The Director of the 
        Office of Science and Technology Policy shall also seek public 
        input on the identification of artificial intelligence research 
        and development grand challenges under subsection (a).
            ``(5) Problem statements; success metrics.--For each 
        priority for a Federal grand challenge identified under 
        paragraph (1) and the grand challenges identified and selected 
        under paragraph (3), the Director of the Office of Science and 
        Technology Policy shall--
                    ``(A) establish a specific and well-defined grand 
                challenge problem statement and ensure that such 
                problem statement is published on a website linking out 
                to relevant prize competition listings on the website 
                Challenge.gov, or successor website, that is managed by 
                the General Services Administration; and
                    ``(B) establish and publish on the website 
                Challenge.gov, or successor website, clear targets, 
                success metrics, and validation protocols for the prize 
                competitions designed to address each grand challenge, 
                in order to provide specific benchmarks that will be 
                used to evaluate submissions to the prize competition.
    ``(c) Federal Investment Initiatives Authorized.--Subject to the 
availability of amounts appropriated for this purpose, the Secretary of 
Commerce, the Secretary of Transportation, and the Director of the National 
Science Foundation may, consistent with the missions or 
responsibilities of each Federal agency, establish 1 or more prize 
competitions under section 24 of the Stevenson-Wydler Technology 
Innovation Act of 1980 (15 U.S.C. 3719), challenge-based acquisitions, 
or other research and development investments that each agency head 
deems appropriate consistent with the list of priorities established 
pursuant to subsection (b)(1).
    ``(d) Requirements.--
            ``(1) In general.--The Director of the Office of Science 
        and Technology Policy shall develop requirements for--
                    ``(A) the process for prize competitions under 
                subsections (a) and (c), including eligibility criteria 
                for participants, consistent with the requirements 
                under paragraph (2); and
                    ``(B) testing, judging, and verification procedures 
                for submissions to receive a prize award under 
                subsection (c).
            ``(2) Eligibility requirement and judging.--
                    ``(A) Eligibility.--In accordance with the 
                requirement described in section 24(g)(3) of the 
                Stevenson-Wydler Technology Innovation Act of 1980 (15 
                U.S.C. 3719(g)(3)), a recipient of a prize award under 
                subsection (c)--
                            ``(i) that is a private entity shall be 
                        incorporated in and maintain a primary place of 
                        business in the United States; and
                            ``(ii) who is an individual, whether 
                        participating singly or in a group, shall be a 
                        citizen or permanent resident of the United 
                        States.
                    ``(B) Judges.--In accordance with section 24(k) of 
                the Stevenson-Wydler Technology Innovation Act of 1980 
                (15 U.S.C. 3719(k)), a judge of a prize competition 
                under subsection (c) may be an individual from the 
                private sector.
            ``(3) Agency leadership.--Each agency head carrying out an 
        investment initiative under subsection (c) shall ensure that--
                    ``(A) for each prize competition or investment 
                initiative carried out by the agency head under such 
                subsection, there is--
                            ``(i) a positive impact on the economic 
                        competitiveness of the United States;
                            ``(ii) a benefit to United States industry;
                            ``(iii) to the extent possible, leveraging 
                        of the resources and expertise of industry and 
                        philanthropic partners in shaping the 
                        investments; and
                            ``(iv) in a case involving development and 
                        manufacturing, use of advanced manufacturing in 
                        the United States; and
                    ``(B) all research conducted for purposes of the 
                investment initiative is conducted in the United 
                States.
    ``(e) Reports.--
            ``(1) Notification of winning submission.--Not later than 
        60 days after the date on which a prize is awarded under 
        subsection (c), the agency head awarding the prize shall submit 
        to the Committee on Commerce, Science, and Transportation of 
        the Senate, the Committee on Science, Space, and Technology of 
        the House of Representatives, and such other committees of 
        Congress as the agency head considers relevant a report that 
        describes the winning submission to the prize competition and 
        its benefits to the United States.
            ``(2) Biennial report.--
                    ``(A) In general.--Not later than 2 years after the 
                date of the enactment of the Future of Artificial 
                Intelligence Innovation Act of 2024, and biennially 
                thereafter, the heads of agencies described in 
                subsection (c) shall submit to the Committee on 
                Commerce, Science, and Transportation of the Senate, 
                the Committee on Science, Space, and Technology of the 
                House of Representatives, and such other committees of 
                Congress as the agency heads consider relevant a report 
                that includes--
                            ``(i) a description of the activities 
                        carried out by the agency heads under this 
                        section;
                            ``(ii) a description of the active 
                        competitions and the results of completed 
                        competitions under subsection (c); and
                            ``(iii) efforts to provide information to 
                        the public on active competitions under 
                        subsection (c) to encourage participation.
                    ``(B) Public accessibility.--The agency heads 
                described in subsection (c) shall make the biennial 
                report required under subparagraph (A) publicly 
                accessible, including by posting the biennial report on 
                a website in an easily accessible location, such as the 
                GovInfo website of the Government Publishing Office.
    ``(f) Accessibility.--In carrying out any competition under 
subsection (c), the head of an agency shall post the active prize 
competitions and available prize awards under subsection (b) to 
Challenge.gov, or successor website, after the grand challenges are 
selected and the prize competitions are designed pursuant to 
subsections (c) and (e) to ensure the prize competitions are widely 
accessible to eligible participants.
    ``(g) Sunset.--This section shall terminate on the date that is 5 
years after the date of the enactment of the Future of Artificial 
Intelligence Innovation Act of 2024.''.
    (b) Comptroller General of the United States Studies and Reports.--
            (1) Initial study.--
                    (A) In general.--Not later than 1 year after the 
                date of enactment of this Act, the Comptroller General 
                of the United States shall conduct a study of Federal 
                prize competitions, which shall include an assessment 
                of the efficacy and impact of prize competitions 
                generally.
                    (B) Elements.--The study conducted under 
                subparagraph (A) shall include, to the extent 
                practicable, the following:
                            (i) A survey of all existing, current, and 
                        ongoing Federal prize competitions carried out 
                        under authorities enacted before the date of 
                        the enactment of this Act.
                            (ii) An assessment of those existing, 
                        current, and ongoing Federal prize competitions 
                        that includes addressing--
                                    (I) whether and what technology or 
                                innovation would have been developed in 
                                the absence of the prize competitions;
                                    (II) whether the prize competitions 
                                shortened the timeframe for the 
                                development of the technology or 
                                innovation;
                                    (III) whether the prize competition 
                                was cost effective;
                                    (IV) what, if any, other benefits 
                                were gained from conducting the prize 
                                competitions;
                                    (V) whether the use of a more 
                                traditional policy tool such as a grant 
                                or contract would have resulted in the 
                                development of a similar technology or 
                                innovation;
                                    (VI) whether prize competitions 
                                might be designed differently in a way 
                                that would result in a more effective 
                                or revolutionary technology being 
                                developed;
                                    (VII) what are appropriate metrics 
                                that could be used for determining the 
                                success of a prize competition, and 
                                whether those metrics differ when 
                                evaluating near-term and long-term 
                                impacts of prize competitions; and
                                    (VIII) suggested best practices of 
                                prize competitions.
                    (C) Congressional briefing.--Not later than 540 
                days after the date of the enactment of this Act, the 
                Comptroller General shall provide the Committee on 
                Science, Space, and Technology and the Committee on 
                Energy and Natural Resources of the Senate and the 
                Committee on Energy and Commerce of the House of 
                Representatives a briefing on the findings of the 
                Comptroller General with respect to the study conducted 
                under subparagraph (A).
                    (D) Report.--Not later than 540 days after the date 
                of the enactment of this Act, the Comptroller General 
                shall submit to the congressional committees specified 
                in subparagraph (C) a report on the findings and 
                recommendations of the Comptroller General from the study 
                conducted under subparagraph (A).
            (2) Interim study.--
                    (A) In general.--The Comptroller General of the 
                United States shall conduct a study of the Federal 
                prize challenges implemented under section 5108 of the 
                National Artificial Intelligence Initiative Act 
                of 2020, as added by subsection (a), which shall 
                include an assessment of the efficacy and effect of 
                such prize competitions.
                    (B) Elements.--The study conducted under 
                subparagraph (A) shall include, to the extent 
                practicable, the following:
                            (i) A survey of all Federal prize 
                        competitions implemented under section 5108 of 
                        the National Artificial Intelligence 
                        Initiative Act of 2020, as added by subsection 
                        (a).
                            (ii) An assessment of the Federal prize 
                        competitions implemented under such section, which 
                        shall include addressing the same 
                        considerations as set forth under paragraph 
                        (1)(B)(ii).
                            (iii) An assessment of the efficacy, 
                        impact, and cost-effectiveness of prize 
                        competitions implemented under section 5108 of 
                        the National Artificial Intelligence 
                        Initiative Act of 2020, as added by subsection 
                        (a), compared to other Federal prize 
                        competitions.
                    (C) Congressional briefing.--Not later than 1 year 
                after completing the study required by subparagraph 
                (A), the Comptroller General shall provide the 
                Committee on Science, Space, and Technology and the 
                Committee on Energy and Natural Resources of the Senate 
                and the Committee on Energy and Commerce of the House 
                of Representatives a briefing on the findings of the 
                Comptroller General with respect to the study conducted 
                under subparagraph (A).
                    (D) Report.--Not later than 180 days after 
                completing the study required by subparagraph (A), the 
                Comptroller General 
                shall submit to the congressional committees specified 
                in subparagraph (C) a report on the findings and 
                recommendations of the Comptroller General with respect 
                to the study conducted under subparagraph (A).
    (c) Clerical Amendments.--The table of contents at the beginning of 
section 2 of the William M. (Mac) Thornberry National Defense 
Authorization Act for Fiscal Year 2021 and the table of contents at the 
beginning of title LI of such Act, as amended by section 201, are both 
amended by inserting after the items relating to section 5107 the 
following new item:

``5108. Federal grand challenges in artificial intelligence.''.

             TITLE III--RESEARCH SECURITY AND OTHER MATTERS

SEC. 301. RESEARCH SECURITY.

    The activities authorized under this Act shall be carried out in 
accordance with the provisions of subtitle D of title VI of the Research 
and Development, Competition, and Innovation Act (42 U.S.C. 19231 et 
seq.; enacted as part of division B of Public Law 117-167) and section 
223 of the William M. (Mac) Thornberry National Defense Authorization 
Act for Fiscal Year 2021 (42 U.S.C. 6605).

SEC. 302. EXPANSION OF AUTHORITY TO HIRE CRITICAL TECHNICAL EXPERTS.

    (a) In General.--Subsection (b) of section 6 of the National 
Institute of Standards and Technology Act (15 U.S.C. 275) is amended, 
in the second sentence, by striking ``15'' and inserting ``30''.
    (b) Modification of Sunset.--Subsection (c) of such section is 
amended by striking ``under section (b) shall expire on the date that 
is 5 years after the date of the enactment of this section'' and 
inserting ``under subsection (b) shall expire on December 30, 2035''.

SEC. 303. FOUNDATION FOR STANDARDS AND METROLOGY.

    (a) In General.--Subtitle B of title II of the Research and 
Development, Competition, and Innovation Act (42 U.S.C. 18931 et seq.; 
relating to measurement research of the National Institute of Standards 
and Technology for the future; enacted as part of division B of Public 
Law 117-167) is amended by adding at the end the following new section:

``SEC. 10236. FOUNDATION FOR STANDARDS AND METROLOGY.

    ``(a) Establishment.--The Secretary, acting through the Director, 
shall establish a nonprofit corporation to be known as the `Foundation 
for Standards and Metrology'.
    ``(b) Mission.--The mission of the Foundation shall be to--
            ``(1) support the Institute in carrying out its activities 
        and mission to advance measurement science, technical 
        standards, and technology in ways that enhance the economic 
        security and prosperity of the United States; and
            ``(2) advance collaboration with researchers, institutions 
        of higher education, industry, and nonprofit and philanthropic 
        organizations to accelerate the development of technical 
        standards, measurement science, and the commercialization of 
        emerging technologies in the United States.
    ``(c) Activities.--In carrying out its mission under subsection 
(b), the Foundation may carry out the following:
            ``(1) Support international metrology and technical 
        standards engagement activities.
            ``(2) Support studies, projects, and research on metrology 
        and the development of benchmarks and technical standards 
        infrastructure across the Institute's mission areas.
            ``(3) Advance collaboration between the Institute and 
        researchers, industry, nonprofit and philanthropic 
        organizations, institutions of higher education, federally 
        funded research and development centers, and State, Tribal, and 
        local governments.
            ``(4) Support the expansion and improvement of research 
        facilities and infrastructure at the Institute to advance the 
        development of emerging technologies.
            ``(5) Support the commercialization of federally funded 
        research.
            ``(6) Conduct education and outreach activities.
            ``(7) Offer direct support to NIST associates, including 
        through the provision of fellowships, grants, stipends, travel, 
        health insurance, professional development training, housing, 
        technical and administrative assistance, recognition awards for 
        outstanding performance, and occupational safety and awareness 
        training and support, and other appropriate expenditures.
            ``(8) Conduct such other activities as determined necessary 
        by the Foundation to carry out its mission.
    ``(d) Authority of the Foundation.--The Foundation shall be the 
sole entity responsible for carrying out the activities described in 
subsection (c).
    ``(e) Stakeholder Engagement.--The Foundation shall convene, and 
may consult with, representatives from the Institute, institutions of 
higher education, the private sector, non-profit organizations, and 
commercialization organizations to develop activities for the mission 
of the Foundation under subsection (b) and to advance the activities of 
the Foundation under subsection (c).
    ``(f) Limitation.--The Foundation shall not be an agency or 
instrumentality of the Federal Government.
    ``(g) Support.--The Foundation may receive, administer, solicit, 
accept, and use funds, gifts, devises, or bequests, either absolutely 
or in trust, of real or personal property or any income therefrom or 
other interest therein to support activities under subsection (c), 
except that this subsection shall not apply to any such funds, gifts, 
devises, or bequests from a foreign country of concern or a foreign 
entity of concern.
    ``(h) Tax Exempt Status.--The Board shall take all necessary and 
appropriate steps to ensure the Foundation is an organization described 
in section 501(c) of the Internal Revenue Code of 1986 and exempt from 
taxation under section 501(a) of such Code.
    ``(i) Board of Directors.--
            ``(1) Establishment.--The Foundation shall be governed by a 
        Board of Directors.
            ``(2) Composition.--
                    ``(A) In general.--The Board shall be composed of 
                the following:
                            ``(i) Eleven appointed voting members 
                        described in subparagraph (B).
                            ``(ii) Ex officio nonvoting members 
                        described in subparagraph (C).
                    ``(B) Appointed members.--
                            ``(i) Initial members.--The Secretary, 
                        acting through the Director, shall--
                                    ``(I) seek to enter into an 
                                agreement with the National Academies 
                                of Sciences, Engineering, and Medicine 
                                to develop a list of individuals to 
                                serve as members of the Board who are 
                                well qualified and will meet the 
                                requirements of clauses (ii) and (iii); 
                                and
                                    ``(II) appoint the initial members 
                                of the Board from such list, if 
                                applicable, in consultation with the 
                                National Academies of Sciences, 
                                Engineering, and Medicine.
                            ``(ii) Representation.--The appointed 
                        members of the Board shall reflect a broad 
                        cross-section of stakeholders across diverse 
                        sectors, regions and communities, including 
                        from academia, private sector entities, 
                        technical standards bodies, the investment 
                        community, the philanthropic community, and 
                        other nonprofit organizations.
                            ``(iii) Experience.--The Secretary, acting 
                        through the Director, shall ensure the 
                        appointed members of the Board have the 
                        experience and are qualified to provide advice 
                        and information to advance the Foundation's 
                        mission, including in science and technology 
                        research and development, technical standards, 
                        education, technology transfer, 
                        commercialization, or other aspects of the 
                        Foundation's mission.
                    ``(C) Nonvoting members.--
                            ``(i) Ex officio members.--The Director (or 
                        Director's designee) shall be an ex officio 
                        member of the Board.
                            ``(ii) No voting power.--The ex officio 
                        members described in clause (i) shall not have 
                        voting power on the Board.
            ``(3) Chair and vice chair.--
                    ``(A) In general.--The Board shall designate, from 
                among its members--
                            ``(i) an individual to serve as the chair 
                        of the Board; and
                            ``(ii) an individual to serve as the vice 
                        chair of the Board.
                    ``(B) Terms.--The term of service of the Chair and 
                Vice Chair of the Board shall end on the earlier of--
                            ``(i) the date that is 3 years after the 
                        date on which the Chair or Vice Chair of the 
                        Board, as applicable, is designated for the 
                        respective position; and
                            ``(ii) the last day of the term of service 
                        of the member, as determined under paragraph 
                        (4)(A), who is designated to be Chair or Vice 
                        Chair of the Board, as applicable.
                    ``(C) Representation.--The Chair and Vice Chair of 
                the Board--
                            ``(i) may not be representatives of the 
                        same area of subject matter expertise, or 
                        entity, as applicable; and
                            ``(ii) may not be representatives of any 
                        area of subject matter expertise, or entity, as 
                        applicable, represented by the immediately 
                        preceding Chair and Vice Chair of the Board.
            ``(4) Terms and vacancies.--
                    ``(A) Term limits.--Subject to subparagraph (B), 
                the term of office of each member of the Board shall be 
                not more than five years, except that a member of the 
                Board may continue to serve after the expiration of the 
                term of such member until the expiration of the 180-day 
                period beginning on the date on which the term of such 
                member expires, if no new member is appointed to 
                replace the departing board member.
                    ``(B) Initial appointed members.--Of the initial 
                members of the Board appointed under paragraph 
                (2)(B)(i), 
                half of such members shall serve for four years and 
                half of such members shall serve for five years, as 
                determined by the Chair of the Board.
                    ``(C) Vacancies.--Any vacancy in the membership of 
                the appointed members of the Board--
                            ``(i) shall be filled in accordance with 
                        the bylaws of the Foundation by an individual 
                        capable of representing the same area or 
                        entity, as applicable, as represented by the 
                        vacating board member under paragraph 
                        (2)(B)(ii);
                            ``(ii) shall not affect the power of the 
                        remaining appointed members to carry out the 
                        duties of the Board; and
                            ``(iii) shall be filled by an individual 
                        selected by the Board.
            ``(5) Quorum.--A majority of the members of the Board shall 
        constitute a quorum for the purposes of conducting the business 
        of the Board.
            ``(6) Duties.--The Board shall carry out the following:
                    ``(A) Establish bylaws for the Foundation in 
                accordance with paragraph (7).
                    ``(B) Provide overall direction for the activities 
                of the Foundation and establish priority activities.
                    ``(C) Coordinate with the Institute the activities 
                of the Foundation to ensure consistency with the 
                programs and policies of the Institute.
                    ``(D) Evaluate the performance of the Executive 
                Director of the Foundation.
                    ``(E) Actively solicit and accept funds, gifts, 
                grants, devises, or bequests of real or personal 
                property to the Foundation, including from private 
                entities.
                    ``(F) Carry out any other necessary activities of 
                the Foundation.
            ``(7) Bylaws.--The Board shall establish bylaws for the 
        Foundation. In establishing such bylaws, the Board shall ensure 
        the following:
                    ``(A) The bylaws of the Foundation include the 
                following:
                            ``(i) Policies for the selection of the 
                        Board members, officers, employees, agents, and 
                        contractors of the Foundation.
                            ``(ii) Policies, including ethical and 
                        disclosure standards, for the following:
                                    ``(I) The acceptance, solicitation, 
                                and disposition of donations and grants 
                                to the Foundation, including 
                                appropriate limits on the ability of 
                                donors to designate, by stipulation or 
                                restriction, the use or recipient of 
                                donated funds.
                                    ``(II) The disposition of assets of 
                                the Foundation.
                            ``(iii) Policies that subject all 
                        employees, fellows, trainees, and other agents 
                        of the Foundation (including appointed voting 
                        members and ex officio members of the Board) to 
                        conflict of interest standards.
                            ``(iv) The specific duties of the Executive 
                        Director of the Foundation.
                    ``(B) The bylaws of the Foundation and activities 
                carried out under such bylaws do not--
                            ``(i) reflect unfavorably upon the ability 
                        of the Foundation to carry out its 
                        responsibilities or official duties in a fair 
                        and objective manner; or
                            ``(ii) compromise, or appear to compromise, 
                        the integrity of any governmental agency or 
                        program, or any officer or employee employed 
                        by, or involved in a governmental agency or 
                        program.
            ``(8) Restrictions on membership.--
                    ``(A) Employees.--No employee of the Department of 
                Commerce may be appointed as a voting member of the 
                Board.
                    ``(B) Status.--Each voting member of the Board 
                shall be--
                            ``(i) a citizen of the United States;
                            ``(ii) a national of the United States (as 
                        such term is defined in section 101(a) of the 
                        Immigration and Nationality Act (8 U.S.C. 
                        1101(a)));
                            ``(iii) an alien admitted as a refugee 
                        under section 207 of such Act (8 U.S.C. 1157); 
                        or
                            ``(iv) an alien lawfully admitted to the 
                        United States for permanent residence.
            ``(9) Compensation.--
                    ``(A) In general.--Members of the Board may not 
                receive compensation for serving on the Board.
                    ``(B) Certain expenses.--In accordance with the 
                bylaws of the Foundation, members of the Board may be 
                reimbursed for travel expenses, including per diem in 
                lieu of subsistence, and other necessary expenses 
                incurred in carrying out the duties of the Board.
            ``(10) Liaison representatives.--The Secretary, acting 
        through the Director, shall designate representatives from 
        across the Institute to serve as the liaisons to the Board and 
        the Foundation.
            ``(11) Personal liability of board members.--The members of 
        the Board shall not be personally liable, except for 
        malfeasance.
    ``(j) Administration.--
            ``(1) Executive director.--
                    ``(A) In general.--The Foundation shall have an 
                Executive Director who shall be appointed by the Board, 
                and who shall serve at the pleasure of the Board, and 
                for whom the Board shall establish the rate of 
                compensation. Subject to the bylaws established under 
                subsection (i)(7), the Executive Director shall be 
                responsible for the daily operations of the Foundation 
                in carrying out the activities of the Foundation under 
                subsection (c).
                    ``(B) Responsibilities.--In carrying out the daily 
                operations of the Foundation, the Executive Director of 
                the Foundation shall carry out the following:
                            ``(i) Hire, promote, compensate, and 
                        discharge officers and employees of the 
                        Foundation, and define the duties of such 
                        officers and employees.
                            ``(ii) Accept and administer donations to 
                        the Foundation, and administer the assets of 
                        the Foundation.
                            ``(iii) Enter into such contracts and 
                        execute legal instruments as are appropriate in 
                        carrying out the activities of the Foundation.
                            ``(iv) Perform such other functions as 
                        necessary to operate the Foundation.
                    ``(C) Restrictions.--
                            ``(i) Executive director.--The Executive 
                        Director shall be--
                                    ``(I) a citizen of the United 
                                States;
                                    ``(II) a national of the United 
                                States (as such term is defined in 
                                section 101(a) of the Immigration and 
                                Nationality Act (8 U.S.C. 1101(a)));
                                    ``(III) an alien admitted as a 
                                refugee under section 207 of such Act 
                                (8 U.S.C. 1157); or
                                    ``(IV) an alien lawfully admitted 
                                to the United States for permanent 
                                residence.
                            ``(ii) Officers and employees.--Each 
                        officer or employee of the Foundation shall 
                        be--
                                    ``(I) a citizen of the United 
                                States;
                                    ``(II) a national of the United 
                                States (as such term is defined in 
                                section 101(a) of the Immigration and 
                                Nationality Act (8 U.S.C. 1101(a)));
                                    ``(III) an alien admitted as a 
                                refugee under section 207 of such Act 
                                (8 U.S.C. 1157); or
                                    ``(IV) an alien lawfully admitted 
                                to the United States for permanent 
                                residence.
            ``(2) Administrative control.--No member of the Board, 
        officer or employee of the Foundation or of any program 
        established by the Foundation, or participant in a program 
        established by the Foundation, may exercise administrative 
        control over any Federal employee.
            ``(3) Transfer of funds to institute.--The Foundation may 
        transfer funds and property to the Institute, which the 
        Institute may accept and use and which shall be subject to all 
        applicable Federal limitations relating to federally funded 
        research.
            ``(4) Strategic plan.--Not later than one year after the 
        establishment of the Foundation, the Foundation shall submit to 
        the Committee on Science, Space, and Technology of the House of 
        Representatives and the Committee on Commerce, Science, and 
        Transportation of the Senate a strategic plan that contains the 
        following:
                    ``(A) A plan for the Foundation to become 
                financially self-sustaining in the next five years.
                    ``(B) Short- and long-term objectives of the 
                Foundation, as identified by the Board.
                    ``(C) A description of the efforts the Foundation 
                will take to be transparent in the processes of the 
                Foundation, including processes relating to the 
                following:
                            ``(i) Grant awards, including selection, 
                        review, and notification.
                            ``(ii) Communication of past, current, and 
                        future research priorities.
                            ``(iii) Solicitation of and response to 
                        public input on the priorities identified by 
                        the Foundation.
                    ``(D) A description of the financial goals and 
                benchmarks of the Foundation for the following ten 
                years.
                    ``(E) A description of the efforts undertaken by 
                the Foundation to ensure maximum complementarity and 
                minimum redundancy with investments made by the 
                Institute.
            ``(5) Report.--
                    ``(A) In general.--Not later than 18 months after 
                the establishment of the Foundation and not later than 
                February 1 of each year thereafter, the Foundation 
                shall publish a report describing the activities of the 
                Foundation during the immediately preceding fiscal 
                year. Each such report shall include with respect to 
                such fiscal year a comprehensive statement of the 
                operations, activities, financial condition, progress, 
                and accomplishments of the Foundation.
                    ``(B) Financial condition.--With respect to the 
                financial condition of the Foundation, each report 
                under subparagraph (A) shall include the source, and a 
                description of, all support under subsection (g) 
                provided to the Foundation. Each such report shall 
                identify the persons or entities from which such 
                support is received, and include a specification of any 
                restrictions on the purposes for which such support may 
                be used.
                    ``(C) Publication.--The Foundation shall make 
                copies of each report submitted under subparagraph (A) 
                available--
                            ``(i) for public inspection, and shall upon 
                        request provide a copy of the report to any 
                        individual for a charge not to exceed the cost 
                        of providing such copy; and
                            ``(ii) to the Committee on Science, Space, 
                        and Technology of the House of Representatives 
                        and the Committee on Commerce, Science, and 
                        Transportation of the Senate.
            ``(6) Audits and disclosure.--The Foundation shall--
                    ``(A) provide for annual audits of the financial 
                condition of the Foundation, including a full list of 
                the Foundation's donors and any restrictions on the 
                purposes for which gifts to the Foundation may be used; 
                and
                    ``(B) make such audits, and all other records, 
                documents, and other papers of the Foundation, 
                available to the Secretary and the Comptroller General 
                of the United States for examination or audit.
            ``(7) Evaluation by comptroller general.--Not later than 
        five years after the date on which the Foundation is 
        established, the Comptroller General of the United States shall 
        submit to the Committee on Science, Space, and Technology of 
        the House of Representatives and the Committee on Commerce, 
        Science, and Transportation of the Senate the following:
                    ``(A) An evaluation of the following:
                            ``(i) The extent to which the Foundation is 
                        achieving the mission of the Foundation.
                            ``(ii) The operation of the Foundation.
                    ``(B) Any recommendations on how the Foundation may 
                be improved.
    ``(k) Integrity.--
            ``(1) In general.--To ensure integrity in the operations of 
        the Foundation, the Board shall develop and enforce procedures 
        relating to standards of conduct, financial disclosure 
        statements, conflicts of interest (including recusal and waiver 
        rules), audits, and any other matters determined appropriate by 
        the Board.
            ``(2) Financial conflicts of interest.--To mitigate 
        conflicts of interest and risks from malign foreign influence, 
        any individual who is an officer, employee, or member of the 
        Board is prohibited from any participation in deliberations by 
        the Foundation of a matter that would directly or predictably 
        affect any financial interest of any of the following:
                    ``(A) Such individual.
                    ``(B) A relative of such individual.
                    ``(C) A business organization or other entity in 
                which such individual or relative of such individual 
                has an interest, including an organization or other 
                entity with which such individual is negotiating 
                employment.
            ``(3) Security.--This section shall be carried out in 
        accordance with the provisions of subtitle D of title VI of the 
        Research and Development, Competition, and Innovation Act (42 
        U.S.C. 19231 et seq.; enacted as part of division B of Public 
        Law 117-167) and section 223 of the William M. (Mac) Thornberry 
        National Defense Authorization Act for Fiscal Year 2021 (42 
        U.S.C. 6605).
    ``(l) Intellectual Property.--The Board shall adopt written 
standards to govern the ownership and licensing of any intellectual 
property rights developed by the Foundation or derived from the 
collaborative efforts of the Foundation.
    ``(m) Full Faith and Credit.--The United States shall not be liable 
for any debts, defaults, acts, or omissions of the Foundation. The full 
faith and credit of the United States shall not extend to any 
obligations of the Foundation.
    ``(n) Support Services.--The Secretary, acting through the 
Director, may provide facilities, utilities, and support services to 
the Foundation if it is determined by the Director to be advantageous 
to the research programs of the Institute.
    ``(o) Nonapplicability.--Chapter 10 of title 5, United States Code, 
shall not apply to the Foundation.
    ``(p) Separate Fund Accounts.--The Board shall ensure that amounts 
received pursuant to the authorization of appropriations under 
subsection (q) are held in a separate account from any other funds 
received by the Foundation.
    ``(q) Funding; Authorization of Appropriations.--Notwithstanding 
any other provision of law, from amounts authorized to be appropriated 
for a fiscal year beginning with fiscal year 2025 to the Secretary of 
Commerce pursuant to section 10211, the Director may transfer not less 
than $500,000 and not more than $1,250,000 to the Foundation each such 
fiscal year.
    ``(r) Definitions.--In this section:
            ``(1) Board.--The term `Board' means the Board of Directors 
        of the Foundation, established pursuant to subsection (i).
            ``(2) Director.--The term `Director' means the Director of 
        the National Institute of Standards and Technology.
            ``(3) Foreign country of concern.--The term `foreign 
        country of concern' has the meaning given such term in section 
        10638 of the Research and Development, Competition, and 
        Innovation Act (42 U.S.C. 19237; enacted as part of division B 
        of Public Law 117-167).
            ``(4) Foreign entity of concern.--The term `foreign entity 
        of concern' has the meaning given such term in section 10638 of 
        the Research and Development, Competition, and Innovation Act 
        (42 U.S.C. 19237; enacted as part of division B of Public Law 
        117-167).
            ``(5) Foundation.--The term `Foundation' means the 
        Foundation for Standards and Metrology established pursuant to 
        subsection (a).
            ``(6) Institute.--The term `Institute' means the National 
        Institute of Standards and Technology.
            ``(7) Institution of higher education.--The term 
        `institution of higher education' has the meaning given such 
        term in section 101 of the Higher Education Act of 1965 (20 
        U.S.C. 1001).
            ``(8) NIST associate.--The term `NIST associate' means any 
        guest researcher, facility user, volunteer, or other 
        nonemployee of the National Institute of Standards and 
        Technology who conducts research or otherwise engages in an 
        authorized activity with National Institute of Standards and 
        Technology personnel or at a National Institute of Standards 
        and Technology facility.
            ``(9) Relative.--The term `relative' has the meaning given 
        such term in section 13101 of title 5, United States Code.
            ``(10) Secretary.--The term `Secretary' means the Secretary 
        of Commerce.
            ``(11) Technical standard.--The term `technical standard' 
        has the meaning given such term in section 12(d)(5) of the 
        National Technology Transfer and Advancement Act of 1995 (15 
        U.S.C. 272 note).''.
    (b) Clerical Amendment.--The table of contents in section 1 of 
Public Law 117-167 is amended by inserting after the item relating to 
section 10235 the following new item:

``Sec. 10236. Foundation for Standards and Metrology.''.

SEC. 304. PROHIBITION ON CERTAIN POLICIES RELATING TO THE USE OF 
              ARTIFICIAL INTELLIGENCE OR OTHER AUTOMATED SYSTEMS.

    Not later than 7 days after the date of the enactment of this Act, 
the President, acting through the Director of the Office of Science and 
Technology Policy, shall issue a technology directive with respect to 
artificial intelligence or other automated systems that prohibits any 
action, directive, rule, regulation, policy, principle, or guidance by 
a Federal agency that includes policies that require, recommend, 
promote, or encourage any of the following concepts or rules:
            (1) One race or sex is inherently superior to another race 
        or sex.
            (2) The United States is fundamentally racist or sexist.
            (3) An individual, by virtue of his or her race or sex, is 
        inherently racist, sexist, or oppressive, whether consciously 
        or unconsciously.
            (4) An individual should be discriminated against or 
        receive adverse treatment solely or partly because of his or 
        her race or sex.
            (5) Members of one race or sex cannot and should not 
        attempt to treat others without respect to race or sex.
            (6) The moral character of an individual is necessarily 
        determined by his or her race or sex.
            (7) An individual, by virtue of his or her race or sex, 
        bears responsibility for actions committed in the past by other 
        members of the same race or sex.
            (8) An individual should feel discomfort, guilt, anguish, 
        or another form of psychological distress on account of his or 
        her race or sex.
            (9) Meritocracy or traits such as a hard work ethic are 
        racist or sexist, or were created by a particular race to 
        oppress another.
            (10) Artificial intelligence, algorithms, or other 
        automated systems should be designed in an equitable way that 
        prevents disparate impacts based on a protected class or other 
        societal classification.
            (11) Input data used by designers, developers, or deployers 
        of artificial intelligence, algorithms, or other automated 
        systems should be modified to prevent disparate impacts based 
        on a protected class or other societal classification.
            (12) Designers, developers, integrators, or deployers of 
        artificial intelligence, algorithms, or other automated systems 
        should conduct disparate impact or equity impact assessments 
        prior to deployment or implementation of such technology to 
        ensure inclusivity and equity in the creation, design, or 
        development of the technology.
            (13) Federal agencies should review input data used by 
        designers, developers, or deployers of artificial intelligence, 
        algorithms, or other automated systems to ensure the 
        technology--
                    (A) meets the view of that Federal agency of what 
                constitutes bias or misinformation; and
                    (B) contains no positions contrary to the position 
                of the Federal Government.

SEC. 305. CERTIFICATIONS AND AUDITS OF TEMPORARY FELLOWS.

    (a) Definitions.--In this section:
            (1) Agency.--The term ``agency'' has the meaning given such 
        term in section 3502 of title 44, United States Code.
            (2) Committees of jurisdiction.--The term ``committees of 
        jurisdiction'' means--
                    (A) the Committee on Commerce, Science, and 
                Transportation and the Committee on Energy and Natural 
                Resources of the Senate; and
                    (B) the Committee on Energy and Commerce and the 
                Committee on Science, Space, and Technology of the 
                House of Representatives.
            (3) Critical and emerging technologies.--The term 
        ``critical and emerging technologies'' means a subset of 
        artificial intelligence and other critical and emerging 
        technologies included in the list of such technologies 
        identified and maintained by the National Science and 
        Technology Council of the Office of Science and Technology 
        Policy.
            (4) Inherently governmental function.--The term 
        ``inherently governmental function'' has the meaning given such 
        term in section 5 of the Federal Activities Inventory Reform 
        Act of 1998 (Public Law 105-270; 31 U.S.C. 501 note) and 
        includes the meaning given such term in subpart 7.5 of part 7 
        of the Federal Acquisition Regulation, or successor regulation.
            (5) Temporary fellow.--The term ``temporary fellow'', with 
        respect to an agency, means a fellow, contractor, consultant, 
        or any other person performing work for the agency who is not a 
        Federal government employee.
    (b) Certification.--
            (1) In general.--Prior to performing any work for an agency 
        under this Act relating to artificial intelligence and other 
        critical and emerging technologies, a temporary fellow and the 
        head of the agency shall sign a certification that the 
        temporary fellow will not perform any inherently governmental 
        functions.
            (2) Submittal.--Not later than 30 days after the date on 
        which the head of an agency signs a certification under 
        paragraph (1), the head of the agency shall submit a copy of 
        the certification to the Director of the Office of Management 
        and Budget and the chairpersons and ranking members of the 
        committees of jurisdiction.
    (c) Audit.--
            (1) In general.--For each agency using a temporary fellow 
        to carry out this Act, the inspector general of the agency 
        shall perform an annual audit of the use of temporary fellows 
        by the agency, which includes--
                    (A) the number of temporary fellows used by the 
                agency;
                    (B) the entities paying any temporary fellow for 
                their work for the agency;
                    (C) the work temporary fellows are performing for 
                the agency;
                    (D) the authorities under which the agency hired 
                the temporary fellows; and
                    (E) whether the temporary fellows and the agency 
                are complying with the requirements of subsection (b).
            (2) Submittal to congress.--Not later than 30 days after 
        the date on which the inspector general of an agency completes 
        an audit under paragraph (1), the head of the agency shall 
        submit to the chairpersons and ranking members of the 
        committees of jurisdiction and the Director of the Office of 
        Management and Budget a report containing the findings of the 
        inspector general with respect to the audit.