[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[S. 4664 Reported in Senate (RS)]
Calendar No. 631
118th CONGRESS
2d Session
S. 4664
To require the Secretary of Energy to establish a program to promote
the use of artificial intelligence to support the missions of the
Department of Energy, and for other purposes.
_______________________________________________________________________
IN THE SENATE OF THE UNITED STATES
July 10, 2024
Mr. Manchin (for himself and Ms. Murkowski) introduced the following
bill; which was read twice and referred to the Committee on Energy and
Natural Resources
November 21, 2024
Reported by Mr. Manchin, with an amendment
[Strike out all after the enacting clause and insert the part printed
in italic]
_______________________________________________________________________
A BILL
To require the Secretary of Energy to establish a program to promote
the use of artificial intelligence to support the missions of the
Department of Energy, and for other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
<DELETED>SECTION 1. SHORT TITLE.</DELETED>
<DELETED> This Act may be cited as the ``Department of Energy AI
Act''.</DELETED>
<DELETED>SEC. 2. FINDINGS.</DELETED>
<DELETED> Congress finds that--</DELETED>
<DELETED> (1) the Department has a leading role to play in
making the most of the potential of artificial intelligence to
advance the missions of the Department relating to national
security, science, and energy (including critical
materials);</DELETED>
<DELETED> (2) the 17 National Laboratories employ over
40,000 scientists, engineers, and researchers with decades of
experience developing world-leading advanced computational
algorithms, computer science research, experimentation, and
applications in machine learning that underlie artificial
intelligence;</DELETED>
<DELETED> (3) the NNSA manages the Stockpile Stewardship
Program established under section 4201 of the Atomic Energy
Defense Act (50 U.S.C. 2521), which includes the Advanced
Simulation and Computing program, which provides critical
classified and unclassified computing capabilities to sustain
the nuclear stockpile of the United States;</DELETED>
<DELETED> (4) for decades, the Department has led the world
in the design, construction, and operation of the preeminent
high-performance computing systems of the United States, which
benefit the scientific and economic competitiveness of the
United States across many sectors, including energy, critical
materials, biotechnology, and national security;</DELETED>
<DELETED> (5) across the network of 34 user facilities of
the Department, scientists generate tremendous volumes of high-
quality open data across diverse research areas, while the NNSA
has always generated the foremost datasets in the world on
nuclear deterrence and strategic weapons;</DELETED>
<DELETED> (6) the unrivaled quantity and quality of open and
classified scientific datasets of the Department is a unique
asset to rapidly develop frontier AI models;</DELETED>
<DELETED> (7) the Department already develops cutting-edge
AI models to execute the broad mission of the Department,
including AI models of the Department that are used to forecast
disease transmission for COVID-19, and address critical
material issues and emerging nuclear security
missions;</DELETED>
<DELETED> (8) the AI capabilities of the Department will
underpin and jumpstart a dedicated, focused, and centralized AI
program; and</DELETED>
<DELETED> (9) under section 4.1(b) of Executive Order 14110
(88 Fed. Reg. 75191 (November 1, 2023)) (relating to the safe,
secure, and trustworthy development and use of artificial
intelligence), the Secretary is tasked to lead development in
testbeds, national security protections, and assessment of
artificial intelligence applications.</DELETED>
<DELETED>SEC. 3. DEFINITIONS.</DELETED>
<DELETED> In this Act:</DELETED>
<DELETED> (1) AI; artificial intelligence.--The terms ``AI''
and ``artificial intelligence'' have the meaning given the term
``artificial intelligence'' in section 5002 of the National
Artificial Intelligence Initiative Act of 2020 (15 U.S.C.
9401).</DELETED>
<DELETED> (2) Alignment.--The term ``alignment'' means a
field of AI safety research that aims to make AI systems behave
in line with human intentions.</DELETED>
<DELETED> (3) Department.--The term ``Department'' means the
Department of Energy, including the NNSA.</DELETED>
<DELETED> (4) Foundation model.--The term ``foundation
model'' means an AI model that--</DELETED>
<DELETED> (A) is trained on broad data;</DELETED>
<DELETED> (B) generally uses self-
supervision;</DELETED>
<DELETED> (C) contains at least tens of billions of
parameters;</DELETED>
<DELETED> (D) is applicable across a wide range of
contexts; and</DELETED>
<DELETED> (E) exhibits, or could be easily modified
to exhibit, high levels of performance at tasks that
pose a serious risk to the security, national economic
security, or national public health or safety of the
United States.</DELETED>
<DELETED> (5) Frontier ai.--</DELETED>
<DELETED> (A) In general.--The term ``frontier AI''
means the leading edge of AI research that remains
unexplored and is considered to be the most
challenging, including models--</DELETED>
<DELETED> (i) that exceed the capabilities
currently present in the most advanced existing
models; and</DELETED>
<DELETED> (ii) many of which perform a wide
variety of tasks.</DELETED>
<DELETED> (B) Inclusion.--The term ``frontier AI''
includes AI models with more than 1,000,000,000,000
parameters.</DELETED>
<DELETED> (6) National laboratory.--The term ``National
Laboratory'' has the meaning given the term in section 2 of the
Energy Policy Act of 2005 (42 U.S.C. 15801).</DELETED>
<DELETED> (7) NNSA.--The term ``NNSA'' means the National
Nuclear Security Administration.</DELETED>
<DELETED> (8) Secretary.--The term ``Secretary'' means the
Secretary of Energy.</DELETED>
<DELETED> (9) Testbed.--The term ``testbed'' means any
platform, facility, or environment that enables the testing and
evaluation of scientific theories and new technologies,
including hardware, software, or field environments in which
structured frameworks can be implemented to conduct tests to
assess the performance, reliability, safety, and security of a
wide range of items, including prototypes, systems,
applications, AI models, instruments, computational tools,
devices, and other technological innovations.</DELETED>
<DELETED>SEC. 4. ARTIFICIAL INTELLIGENCE RESEARCH TO
DEPLOYMENT.</DELETED>
<DELETED> (a) Program To Develop and Deploy Frontiers in Artificial
Intelligence for Science, Security, and Technology (FASST).--</DELETED>
<DELETED> (1) Establishment.--Not later than 180 days after
the date of enactment of this Act, the Secretary shall
establish a centralized AI program to carry out research on the
development and deployment of advanced artificial intelligence
capabilities for the missions of the Department (referred to in
this subsection as the ``program''), consistent with the
program established under section 5501 of the William M. (Mac)
Thornberry National Defense Authorization Act for Fiscal Year
2021 (15 U.S.C. 9461).</DELETED>
<DELETED> (2) Program components.--</DELETED>
<DELETED> (A) In general.--The program shall advance
and support diverse activities that include the
following components:</DELETED>
<DELETED> (i) Aggregation, curation, and
distribution of AI training datasets.</DELETED>
<DELETED> (ii) Development and deployment of
next-generation computing platforms and
infrastructure.</DELETED>
<DELETED> (iii) Development and deployment
of safe and trustworthy AI models and
systems.</DELETED>
<DELETED> (iv) Tuning and adaptation of AI
models and systems for pressing scientific,
energy, and national security
applications.</DELETED>
<DELETED> (B) Aggregation, curation, and
distribution of ai training datasets.--In carrying out
the component of the program described in subparagraph
(A)(i), the Secretary shall develop methods, platforms,
protocols, and other tools required for efficient,
safe, and effective aggregation, generation, curation,
and distribution of AI training datasets, including--
</DELETED>
<DELETED> (i) assembling, aggregating, and
curating large-scale training data for advanced
AI, including outputs from research programs of
the Department and other open science data,
with the goal of developing comprehensive
scientific AI training databases and testing
and validation data;</DELETED>
<DELETED> (ii) developing and executing
an appropriate data management plan for the
ethical, responsible, and secure use of
classified and unclassified scientific
data;</DELETED>
<DELETED> (iii) identifying, curating, and
safely distributing, as appropriate based on
the application--</DELETED>
<DELETED> (I) scientific and
experimental Departmental datasets;
and</DELETED>
<DELETED> (II) sponsored research
activities that are needed for the
training of foundation and adapted
downstream AI models; and</DELETED>
<DELETED> (iv) partnering with stakeholders
to curate critical datasets that reside outside
the Department but are determined to be
critical to optimizing the capabilities of
open-science AI foundation models, national
security AI foundation models, and other AI
technologies developed under the
program.</DELETED>
<DELETED> (C) Development and deployment of next-
generation computing platforms and infrastructure.--In
carrying out the component of the program described in
subparagraph (A)(ii), the Secretary shall--</DELETED>
<DELETED> (i) develop early-stage AI
testbeds to test and evaluate new software,
hardware, algorithms, and other AI-based
technologies and applications;</DELETED>
<DELETED> (ii) develop and deploy new
energy-efficient AI computing hardware and
software infrastructure necessary for
developing and deploying trustworthy frontier
AI systems that leverage the high-performance
computing capabilities of the Department and
the National Laboratories;</DELETED>
<DELETED> (iii) facilitate the development
and deployment of unclassified and classified
high-performance computing systems and AI
platforms through Department-owned
infrastructure data and computing
facilities;</DELETED>
<DELETED> (iv) procure high-performance
computing and other resources necessary for
developing, training, evaluating, and deploying
AI foundation models and AI technologies;
and</DELETED>
<DELETED> (v) use appropriate supplier
screening tools available through the
Department to ensure that procurements under
clause (iv) are from trusted
suppliers.</DELETED>
<DELETED> (D) Development and deployment of safe and
trustworthy ai models and systems.--In carrying out the
component of the program described in subparagraph
(A)(iii), not later than 3 years after the date of
enactment of this Act, the Secretary shall--</DELETED>
<DELETED> (i) develop innovative concepts
and applied mathematics, computer science,
engineering, and other science disciplines
needed for frontier AI;</DELETED>
<DELETED> (ii) develop best-in-class AI
foundation models and other AI technologies for
open-science and national security
applications;</DELETED>
<DELETED> (iii) research and deploy counter-
adversarial artificial intelligence solutions
to predict, prevent, mitigate, and respond to
threats to critical infrastructure, energy
security, nuclear nonproliferation, and
biological and chemical threats;</DELETED>
<DELETED> (iv) establish crosscutting
research efforts on AI risks, reliability,
safety, trustworthiness, and alignment,
including the creation of unclassified and
classified data platforms across the
Department; and</DELETED>
<DELETED> (v) develop capabilities needed to
ensure the safe and responsible implementation
of AI in the private and public sectors that--
</DELETED>
<DELETED> (I) may be readily applied
across Federal agencies and private
entities to ensure that open-science
models are released responsibly,
securely, and in the national interest;
and</DELETED>
<DELETED> (II) ensure that
classified national security models are
secure, responsibly managed, and safely
implemented in the national
interest.</DELETED>
<DELETED> (E) Tuning and adaptation of ai models and
systems for pressing scientific and national security
applications.--In carrying out the component of the
program described in subparagraph (A)(iv), the
Secretary shall--</DELETED>
<DELETED> (i) use AI foundation models and
other AI technologies to develop a multitude of
tuned and adapted downstream models to solve
pressing scientific, energy, and national
security challenges;</DELETED>
<DELETED> (ii) carry out joint work,
including public-private partnerships, and
cooperative research projects with industry,
including end user companies, hardware systems
vendors, and AI software companies, to advance
AI technologies relevant to the missions of the
Department;</DELETED>
<DELETED> (iii) form partnerships with other
Federal agencies, institutions of higher
education, and international organizations
aligned with the interests of the United States
to advance frontier AI systems development and
deployment; and</DELETED>
<DELETED> (iv) increase research experiences
and workforce development, including training
for undergraduate and graduate students in
frontier AI for science, energy, and national
security.</DELETED>
<DELETED> (3) Strategic plan.--In carrying out the program,
the Secretary shall develop a strategic plan with specific
short-term and long-term goals and resource needs to advance
applications in AI for science, energy, and national security
to support the missions of the Department, consistent with--
</DELETED>
<DELETED> (A) the 2023 National Laboratory workshop
report entitled ``Advanced Research Directions on AI
for Science, Energy, and Security''; and</DELETED>
<DELETED> (B) the 2024 National Laboratory workshop
report entitled ``AI for Energy''.</DELETED>
<DELETED> (b) AI Research and Development Centers.--</DELETED>
<DELETED> (1) In general.--As part of the program
established under subsection (a), the Secretary shall select,
on a competitive, merit-reviewed basis, National Laboratories
to establish and operate not fewer than 8 multidisciplinary AI
Research and Development Centers (referred to in this
subsection as ``Centers'')--</DELETED>
<DELETED> (A) to accelerate the safe and trustworthy
deployment of AI for science, energy, and national
security missions;</DELETED>
<DELETED> (B) to demonstrate the use of AI in
addressing key challenge problems of national interest
in science, energy, and national security;
and</DELETED>
<DELETED> (C) to maintain the competitive advantage
of the United States in AI.</DELETED>
<DELETED> (2) Focus.--Each Center shall bring together
diverse teams from National Laboratories, academia, and
industry to collaboratively and concurrently deploy hardware,
software, numerical methods, data, algorithms, and applications
for AI and ensure that the frontier AI research of the
Department is well-suited for key Department missions,
including by using existing and emerging computing systems to
the maximum extent practicable.</DELETED>
<DELETED> (3) Administration.--</DELETED>
<DELETED> (A) National laboratory.--Each Center
shall be established as part of a National
Laboratory.</DELETED>
<DELETED> (B) Application.--To be eligible for
selection to establish and operate a Center under
paragraph (1), a National Laboratory shall submit to
the Secretary an application at such time, in such
manner, and containing such information as the
Secretary may require.</DELETED>
<DELETED> (C) Director.--Each Center shall be headed
by a Director, who shall be the Chief Executive Officer
of the Center and an employee of the National
Laboratory described in subparagraph (A), and
responsible for--</DELETED>
<DELETED> (i) successful execution of the
goals of the Center; and</DELETED>
<DELETED> (ii) coordinating with other
Centers.</DELETED>
<DELETED> (D) Technical roadmap.--In support of the
strategic plan developed under subsection (a)(3), each
Center shall--</DELETED>
<DELETED> (i) set a research and innovation
goal central to advancing the science, energy,
and national security mission of the
Department; and</DELETED>
<DELETED> (ii) establish a technical roadmap
to meet that goal in not more than 7
years.</DELETED>
<DELETED> (E) Coordination.--The Secretary shall
coordinate, minimize duplication, and resolve conflicts
between the Centers.</DELETED>
<DELETED> (4) Funding.--Of the amounts made available under
subsection (h), each Center shall receive not less than
$30,000,000 per year for a duration of not less than 5 years
but not more than 7 years, which yearly amount may be renewed
for an additional 5-year period.</DELETED>
<DELETED> (c) AI Risk Evaluation and Mitigation Program.--</DELETED>
<DELETED> (1) AI risk program.--As part of the program
established under subsection (a), and consistent with the
missions of the Department, the Secretary, in consultation with
the Secretary of Homeland Security, the Secretary of Defense,
the Director of National Intelligence, the Director of the
National Security Agency, and the Secretary of Commerce, shall
carry out a comprehensive program to evaluate and mitigate
safety and security risks associated with artificial
intelligence systems (referred to in this subsection as the
``AI risk program'').</DELETED>
<DELETED> (2) Risk taxonomy.--</DELETED>
<DELETED> (A) In general.--Under the AI risk
program, the Secretary shall develop a taxonomy of
safety and security risks associated with artificial
intelligence systems relevant to the missions of the
Department, including, at a minimum, the risks
described in subparagraph (B).</DELETED>
<DELETED> (B) Risks described.--The risks referred
to in subparagraph (A) are the abilities of artificial
intelligence--</DELETED>
<DELETED> (i) to generate information at a
given classification level;</DELETED>
<DELETED> (ii) to assist in generation of
nuclear weapons information;</DELETED>
<DELETED> (iii) to assist in generation of
chemical, biological, radiological, nuclear,
nonproliferation, critical infrastructure, and
energy security threats or hazards;</DELETED>
<DELETED> (iv) to assist in generation of
malware and other cyber and adversarial threats
that pose a significant national security risk,
such as threatening the stability of critical
national infrastructure;</DELETED>
<DELETED> (v) to undermine public trust in
the use of artificial intelligence technologies
or in national security;</DELETED>
<DELETED> (vi) to deceive a human operator
or computer system, or otherwise act in
opposition to the goals of a human operator or
automated systems; and</DELETED>
<DELETED> (vii) to act autonomously with
little or no human intervention in ways that
conflict with human intentions.</DELETED>
<DELETED> (d) Shared Resources for AI.--</DELETED>
<DELETED> (1) In general.--As part of the program
established under subsection (a), the Secretary shall identify,
support, and sustain shared resources and enabling tools that
have the potential to accelerate the pace of scientific
discovery and technological innovation with respect to the
missions of the Department relating to science, energy, and
national security.</DELETED>
<DELETED> (2) Consultation.--In carrying out paragraph (1),
the Secretary shall consult with relevant experts in industry,
academia, and the National Laboratories.</DELETED>
<DELETED> (3) Focus.--Shared resources and enabling tools
referred to in paragraph (1) shall include the
following:</DELETED>
<DELETED> (A) Scientific data and knowledge bases
for training AI systems.</DELETED>
<DELETED> (B) Benchmarks and competitions for
evaluating advances in AI systems.</DELETED>
<DELETED> (C) Platform technologies that lower the
cost of generating training data or enable the
generation of novel training data.</DELETED>
<DELETED> (D) High-performance computing, including
hybrid computing systems that integrate AI and high-
performance computing.</DELETED>
<DELETED> (E) The combination of AI and scientific
automation, such as cloud labs and self-driving
labs.</DELETED>
<DELETED> (F) Tools that enable AI to solve inverse
design problems.</DELETED>
<DELETED> (G) Testbeds for accelerating progress at
the intersection of AI and cyberphysical
systems.</DELETED>
<DELETED> (e) Administration.--</DELETED>
<DELETED> (1) Research security.--The activities authorized
under this section shall be applied in a manner consistent with
subtitle D of title VI of the Research and Development,
Competition, and Innovation Act (42 U.S.C. 19231 et
seq.).</DELETED>
<DELETED> (2) Cybersecurity.--The Secretary shall ensure the
integration of robust cybersecurity measures into all AI
research-to-deployment efforts authorized under this section to
protect the integrity and confidentiality of collected and
analyzed data.</DELETED>
<DELETED> (3) Partnerships with private entities.--
</DELETED>
<DELETED> (A) In general.--The Secretary shall seek
to establish partnerships with private companies and
nonprofit organizations in carrying out this Act,
including with respect to the research, development,
and deployment of each of the 4 program components
described in subsection (a)(2)(A).</DELETED>
<DELETED> (B) Requirement.--In carrying out
subparagraph (A), the Secretary shall protect any
information submitted to or shared by the Department
consistent with applicable laws (including
regulations).</DELETED>
<DELETED> (f) STEM Education and Workforce Development.--</DELETED>
<DELETED> (1) In general.--Of the amounts made available
under subsection (h), not less than 10 percent shall be used to
foster the education and training of the next-generation AI
workforce.</DELETED>
<DELETED> (2) AI talent.--As part of the program established
under subsection (a), the Secretary shall develop the required
workforce, and hire and train not fewer than 500 new
researchers to meet the rising demand for AI talent--</DELETED>
<DELETED> (A) with a particular emphasis on
expanding the number of individuals from
underrepresented groups pursuing and attaining skills
relevant to AI; and</DELETED>
<DELETED> (B) including by--</DELETED>
<DELETED> (i) providing training, grants,
and research opportunities;</DELETED>
<DELETED> (ii) carrying out public awareness
campaigns about AI career paths; and</DELETED>
<DELETED> (iii) establishing new degree and
certificate programs in AI-related disciplines
at universities and community
colleges.</DELETED>
<DELETED> (g) Annual Report.--The Secretary shall submit to Congress
an annual report describing--</DELETED>
<DELETED> (1) the progress, findings, and expenditures under
each program established under this section; and</DELETED>
<DELETED> (2) any legislative recommendations for promoting
and improving each of those programs.</DELETED>
<DELETED> (h) Authorization of Appropriations.--There is authorized
to be appropriated to carry out this section $2,400,000,000 each year
for the 5-year period following the date of enactment of this
Act.</DELETED>
<DELETED>SEC. 5. FEDERAL PERMITTING.</DELETED>
<DELETED> (a) Establishment.--Not later than 180 days after the date
of enactment of this Act, the Secretary shall establish a program to
improve Federal permitting processes for energy-related projects,
including critical materials projects, using artificial
intelligence.</DELETED>
<DELETED> (b) Program Components.--In carrying out the program
established under subsection (a), the Secretary shall carry out
activities, including activities that--</DELETED>
<DELETED> (1) analyze data and provide tools from past
environmental and other permitting reviews, including by--
</DELETED>
<DELETED> (A) extracting data from applications for
comparison with data relied on in environmental reviews
to assess the adequacy and relevance of
applications;</DELETED>
<DELETED> (B) extracting information from past site-
specific analyses in the area of a current
project;</DELETED>
<DELETED> (C) summarizing key mitigation actions
that have been successfully applied in past similar
projects; and</DELETED>
<DELETED> (D) using AI for deeper reviews of past
determinations under the National Environmental Policy
Act of 1969 (42 U.S.C. 4321 et seq.) to inform more
flexible and effective categorical exclusions;
and</DELETED>
<DELETED> (2) build tools to improve future reviews,
including--</DELETED>
<DELETED> (A) tools for project proponents that
accelerate preparation of environmental
documentation;</DELETED>
<DELETED> (B) tools for government reviewers such as
domain-specific large language models that help convert
geographic information system or tabular data on
resources potentially impacted into rough-draft
narrative documents;</DELETED>
<DELETED> (C) tools to be applied in nongovernmental
settings, such as automatic reviews of applications to
assess the completeness of information; and</DELETED>
<DELETED> (D) a strategic plan to implement and
deploy online and digital tools to improve Federal
permitting activities, developed in consultation with--
</DELETED>
<DELETED> (i) the Secretary of the
Interior;</DELETED>
<DELETED> (ii) the Secretary of Agriculture,
with respect to National Forest System
land;</DELETED>
<DELETED> (iii) the Executive Director of
the Federal Permitting Improvement Steering
Council established by section 41002(a) of the
FAST Act (42 U.S.C. 4370m-1(a)); and</DELETED>
<DELETED> (iv) the heads of any other
relevant Federal department or agency, as
determined appropriate by the
Secretary.</DELETED>
<DELETED>SEC. 6. RULEMAKING ON AI STANDARDIZATION FOR GRID
INTERCONNECTION.</DELETED>
<DELETED> Not later than 18 months after the date of enactment of
this Act, the Federal Energy Regulatory Commission shall initiate a
rulemaking to revise the pro forma Large Generator Interconnection
Procedures promulgated pursuant to section 35.28(f) of title 18, Code
of Federal Regulations (or successor regulations), to require public
utility transmission providers to share and employ, as appropriate,
queue management best practices with respect to the use of computing
technologies, such as artificial intelligence, machine learning, or
automation, in evaluating and processing interconnection requests, in
order to expedite study results with respect to those
requests.</DELETED>
<DELETED>SEC. 7. ENSURING ENERGY SECURITY FOR DATACENTERS AND COMPUTING
RESOURCES.</DELETED>
<DELETED> Not later than 1 year after the date of enactment of this
Act, the Secretary shall submit to Congress a report that--</DELETED>
<DELETED> (1) assesses--</DELETED>
<DELETED> (A) the growth of computing data centers
and advanced computing electrical power load in the
United States;</DELETED>
<DELETED> (B) potential risks to United States energy
and national security posed by growth in computing
centers or growth in the required electrical power;
and</DELETED>
<DELETED> (C) the extent to which emerging
technologies, such as artificial intelligence and
advanced computing, may impact hardware and software
systems used at data and computing centers;
and</DELETED>
<DELETED> (2) provides recommendations for--</DELETED>
<DELETED> (A) resources and capabilities that the
Department may provide to promote access to energy
resources by data centers and advanced
computing;</DELETED>
<DELETED> (B) policy changes to ensure domestic
deployment of data center and advanced computing
resources prevents offshoring of United States data and
resources; and</DELETED>
<DELETED> (C) improving the energy efficiency of
data centers, advanced computing, and AI.</DELETED>
<DELETED>SEC. 8. OFFICE OF CRITICAL AND EMERGING TECHNOLOGY.</DELETED>
<DELETED> (a) In General.--Title II of the Department of Energy
Organization Act is amended by inserting after section 215 (42 U.S.C.
7144b) the following:</DELETED>
<DELETED>``SEC. 216. OFFICE OF CRITICAL AND EMERGING
TECHNOLOGY.</DELETED>
<DELETED> ``(a) Definitions.--In this section:</DELETED>
<DELETED> ``(1) Critical and emerging technology.--The term
`critical and emerging technology' means--</DELETED>
<DELETED> ``(A) advanced technology that is
potentially significant to United States
competitiveness, energy security, or national security,
such as biotechnology, advanced computing, and advanced
manufacturing;</DELETED>
<DELETED> ``(B) technology that may address the
challenges described in subsection (b) of section 10387
of the Research and Development, Competition, and
Innovation Act (42 U.S.C. 19107); and</DELETED>
<DELETED> ``(C) technology described in the key
technology focus areas described in subsection (c) of
that section (42 U.S.C. 19107).</DELETED>
<DELETED> ``(2) Department capabilities.--The term
`Department capabilities' means--</DELETED>
<DELETED> ``(A) each of the National Laboratories
(as defined in section 2 of the Energy Policy Act of
2005 (42 U.S.C. 15801)); and</DELETED>
<DELETED> ``(B) each associated user facility of the
Department.</DELETED>
<DELETED> ``(3) Director.--The term `Director' means the
Director of Critical and Emerging Technology described in
subsection (d).</DELETED>
<DELETED> ``(4) Office.--The term `Office' means the Office
of Critical and Emerging Technology established by subsection
(b).</DELETED>
<DELETED> ``(b) Establishment.--There shall be within the Office of
the Under Secretary for Science and Innovation an Office of Critical
and Emerging Technology.</DELETED>
<DELETED> ``(c) Mission.--The mission of the Office shall be--
</DELETED>
<DELETED> ``(1) to work across the entire Department to
assess and analyze the status of and gaps in United States
competitiveness, energy security, and national security
relating to critical and emerging technologies, including
through the use of Department capabilities;</DELETED>
<DELETED> ``(2) to leverage Department capabilities to
provide for rapid response to emerging threats and
technological surprise from new emerging
technologies;</DELETED>
<DELETED> ``(3) to promote greater participation of
Department capabilities within national science policy and
international forums; and</DELETED>
<DELETED> ``(4) to inform the direction of research and
policy decisionmaking relating to potential risks of adoption
and use of emerging technologies, such as inadvertent or
deliberate misuses of technology.</DELETED>
<DELETED> ``(d) Director of Critical and Emerging Technology.--The
Office shall be headed by a director, to be known as the `Director of
Critical and Emerging Technology', who shall--</DELETED>
<DELETED> ``(1) be appointed by the Secretary; and</DELETED>
<DELETED> ``(2) be an individual who, by reason of
professional background and experience, is specially qualified
to advise the Secretary on matters pertaining to critical and
emerging technology.</DELETED>
<DELETED> ``(e) Collaboration.--In carrying out the mission and
activities of the Office, the Director shall closely collaborate with
all relevant Departmental entities, including the National Nuclear
Security Administration and the Office of Science, to maximize the
computational capabilities of the Department and minimize redundant
capabilities.</DELETED>
<DELETED> ``(f) Coordination.--In carrying out the mission and
activities of the Office, the Director--</DELETED>
<DELETED> ``(1) shall coordinate with senior leadership
across the Department and other stakeholders (such as
institutions of higher education and private
industry);</DELETED>
<DELETED> ``(2) shall ensure the coordination of the Office
of Science with the other activities of the Department relating
to critical and emerging technology, including the transfer of
knowledge, capabilities, and relevant technologies, from basic
research programs of the Department to applied research and
development programs of the Department, for the purpose of
enabling development of mission-relevant
technologies;</DELETED>
<DELETED> ``(3) shall support joint activities among the
programs of the Department;</DELETED>
<DELETED> ``(4) shall coordinate with the heads of other
relevant Federal agencies operating under existing
authorizations with subjects related to the mission of the
Office described in subsection (c) in support of advancements
in related research areas, as the Director determines to be
appropriate; and</DELETED>
<DELETED> ``(5) may form partnerships to enhance the use of,
and to ensure access to, user facilities by other Federal
agencies.</DELETED>
<DELETED> ``(g) Planning, Assessment, and Reporting.--</DELETED>
<DELETED> ``(1) In general.--Not later than 180 days after
the date of enactment of the Department of Energy AI Act, the
Secretary shall submit to Congress a critical and emerging
technology action plan and assessment, which shall include--
</DELETED>
<DELETED> ``(A) a review of current investments,
programs, activities, and science infrastructure of the
Department, including under National Laboratories, to
advance critical and emerging technologies;</DELETED>
<DELETED> ``(B) a description of any shortcomings of
the capabilities of the Department that may adversely
impact national competitiveness relating to emerging
technologies or national security; and</DELETED>
<DELETED> ``(C) a budget projection for the
subsequent 5 fiscal years of planned investments of the
Department in each critical and emerging technology,
including research and development, infrastructure,
pilots, test beds, demonstration projects, and other
relevant activities.</DELETED>
<DELETED> ``(2) Updates.--Every 2 years after the submission
of the plan and assessment under paragraph (1), the Secretary
shall submit to Congress--</DELETED>
<DELETED> ``(A) an updated emerging technology
action plan and assessment; and</DELETED>
<DELETED> ``(B) a report that describes the progress
made toward meeting the goals set forth in the emerging
technology action plan and assessment submitted
previously.''.</DELETED>
<DELETED> (b) Clerical Amendment.--The table of contents for the
Department of Energy Organization Act (Public Law 95-91; 91 Stat. 565;
119 Stat. 764; 133 Stat. 2199) is amended by inserting after the item
relating to section 215 the following:</DELETED>
<DELETED>``Sec. 216. Office of Critical and Emerging Technology.''.
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Department of Energy AI Act''.
SEC. 2. FINDINGS.
Congress finds that--
(1) the Department has a leading role to play in making the
most of the potential of artificial intelligence to advance the
missions of the Department relating to national security,
science, and energy (including critical materials);
(2) the 17 National Laboratories employ over 40,000
scientists, engineers, and researchers with decades of
experience developing world-leading advanced computational
algorithms, computer science research, experimentation, and
applications in machine learning that underlie artificial
intelligence;
(3) the NNSA manages the Stockpile Stewardship Program
established under section 4201 of the Atomic Energy Defense Act
(50 U.S.C. 2521), which includes the Advanced Simulation and
Computing program, which provides critical classified and
unclassified computing capabilities to sustain the nuclear
stockpile of the United States;
(4) for decades, the Department has led the world in the
design, construction, and operation of the preeminent high-
performance computing systems of the United States, which
benefit the scientific and economic competitiveness of the
United States across many sectors, including energy, critical
materials, biotechnology, and national security;
(5) across the Department's network of 34 user facilities,
scientists generate tremendous volumes of high-quality open
data across diverse research areas, while the NNSA has always
generated the foremost datasets in the world on nuclear
deterrence and strategic weapons;
(6) the unrivaled quantity and quality of open and
classified scientific datasets of the Department is a unique
asset to rapidly develop frontier AI models;
(7) the Department already develops cutting-edge AI models
to execute the broad mission of the Department, including AI
models developed by the Department that are used to forecast
disease transmission for COVID-19, and address critical
material issues and emerging nuclear security missions;
(8) the AI capabilities of the Department will underpin and
jumpstart a dedicated, focused, and centralized AI program; and
(9) under section 4.1(b) of Executive Order 14110 (88 Fed.
Reg. 75191 (November 1, 2023)) (relating to the safe, secure,
and trustworthy development and use of artificial
intelligence), the Secretary is tasked to lead development in
testbeds, national security protections, and assessment of
artificial intelligence applications.
SEC. 3. DEFINITIONS.
In this Act:
(1) AI; artificial intelligence.--The terms ``AI'' and
``artificial intelligence'' have the meaning given the term
``artificial intelligence'' in section 5002 of the National
Artificial Intelligence Initiative Act of 2020 (15 U.S.C.
9401).
(2) Alignment.--The term ``alignment'' means a field of AI
safety research that aims to make AI systems behave in line
with human intentions.
(3) Department.--The term ``Department'' means the
Department of Energy, including the NNSA.
(4) Foundation model.--The term ``foundation model'' means
an AI model that--
(A) is trained on broad data;
(B) generally uses self-supervision;
(C) contains at least tens of billions of
parameters;
(D) is applicable across a wide range of contexts;
and
(E) exhibits, or could be easily modified to
exhibit, high levels of performance at tasks that pose
a serious risk to the security, national economic
security, or national public health or safety of the
United States.
(5) Frontier ai.--
(A) In general.--The term ``frontier AI'' means the
leading edge of AI research that remains unexplored and
is considered to be the most challenging, including
models--
(i) that exceed the capabilities currently
present in the most advanced existing models;
and
(ii) many of which perform a wide variety
of tasks.
(B) Inclusion.--The term ``frontier AI'' includes
AI models with more than 1,000,000,000,000 parameters.
(6) National laboratory.--The term ``National Laboratory''
has the meaning given the term in section 2 of the Energy
Policy Act of 2005 (42 U.S.C. 15801).
(7) NNSA.--The term ``NNSA'' means the National Nuclear
Security Administration.
(8) Secretary.--The term ``Secretary'' means the Secretary
of Energy.
(9) Testbed.--The term ``testbed'' means any platform,
facility, or environment that enables the testing and
evaluation of scientific theories and new technologies,
including hardware, software, or field environments in which
structured frameworks can be implemented to conduct tests to
assess the performance, reliability, safety, and security of a
wide range of items, including prototypes, systems,
applications, AI models, instruments, computational tools,
devices, and other technological innovations.
SEC. 4. ARTIFICIAL INTELLIGENCE RESEARCH TO DEPLOYMENT.
(a) Program to Develop and Deploy Frontiers in Artificial
Intelligence for Science, Security, and Technology (FASST).--
(1) Establishment.--Not later than 180 days after the date
of enactment of this Act, the Secretary shall establish a
centralized AI program to carry out research on the development
and deployment of advanced artificial intelligence capabilities
for the missions of the Department (referred to in this
subsection as the ``program''), consistent with the program
established under section 5501 of the William M. (Mac)
Thornberry National Defense Authorization Act for Fiscal Year
2021 (15 U.S.C. 9461).
(2) Program components.--
(A) In general.--The program shall advance and
support diverse activities that include the following
components:
(i) Aggregation, curation, and distribution
of AI training datasets.
(ii) Development and deployment of next-
generation computing platforms and
infrastructure.
(iii) Development and deployment of safe
and trustworthy AI models and systems.
(iv) Tuning and adaptation of AI models and
systems for pressing scientific, energy, and
national security applications.
(B) Aggregation, curation, and distribution of ai
training datasets.--In carrying out the component of
the program described in subparagraph (A)(i), the
Secretary shall develop methods, platforms, protocols,
and other tools required for efficient, safe, secure,
and effective aggregation, generation, curation, and
distribution of AI training datasets, including--
(i) assembling, aggregating, and curating
large-scale training data for advanced AI,
including outputs and synthetic data from
research programs of the Department and other
open science data, with the goal of developing
comprehensive scientific AI training databases
and testing and validation data;
(ii) developing and executing an appropriate
data management plan for the ethical,
responsible, and secure use of classified and
unclassified scientific data;
(iii) identifying, restricting, securing,
curating, and safely distributing, as
appropriate based on the application--
(I) scientific and experimental
Departmental datasets; and
(II) sponsored research activities
that are needed for the training of
foundation and adapted downstream AI
models; and
(iv) partnering with stakeholders to
identify, secure, and curate critical datasets
that reside outside the Department but are
determined to be critical to optimizing the
capabilities of open-science AI foundation
models, national security AI foundation models,
applied energy AI foundation models, and other
AI technologies developed under the program.
(C) Development and deployment of next-generation
computing platforms and infrastructure.--In carrying
out the component of the program described in
subparagraph (A)(ii), the Secretary shall--
(i) develop early-stage and application-
stage AI testbeds to test and evaluate new
software, hardware, algorithms, and other AI-
based technologies and applications;
(ii) develop and deploy new energy-
efficient AI computing hardware and software
infrastructure necessary for developing and
deploying trustworthy and secure interoperable
frontier AI systems that leverage the high-
performance computing capabilities of the
Department and the National Laboratories;
(iii) facilitate the development and
deployment of unclassified and classified high-
performance computing systems and AI platforms
through Department-owned infrastructure data
and computing facilities;
(iv) procure interoperable high-performance
computing and other resources necessary for
developing, training, evaluating, and deploying
AI foundation models and AI technologies; and
(v) use appropriate supplier screening
tools available through the Department to
ensure that procurements under clause (iv) are
from trusted suppliers.
(D) Development and deployment of safe, secure, and
trustworthy ai models and systems.--In carrying out the
component of the program described in subparagraph
(A)(iii), not later than 3 years after the date of
enactment of this Act, the Secretary shall--
(i) develop innovative concepts and applied
mathematics, computer science, engineering, and
other science disciplines needed for frontier
AI;
(ii) develop best-in-class AI foundation
models and other AI technologies for open-
science, applied energy, and national security
applications;
(iii) research, develop, and deploy
counter-adversarial artificial intelligence
solutions to predict, prevent, mitigate, and
respond to threats to critical infrastructure,
energy security, nuclear nonproliferation,
biological and chemical threats, and cyber
threats;
(iv) establish crosscutting research
efforts on AI risks, reliability, safety,
cybersecurity, trustworthiness, and alignment,
including the creation of unclassified and
classified data platforms across the
Department; and
(v) develop capabilities needed to ensure
the safe, secure, and responsible
implementation of AI in the private and public
sectors that--
(I) may be readily applied across
Federal agencies and private entities
to ensure that open-science models are
released, operated, and managed
responsibly, securely, and in the
national interest; and
(II) ensure that classified
national security models are secure,
responsibly managed, and safely
implemented in the national interest.
(E) Tuning and adaptation of ai models and systems
for pressing scientific, applied energy, and national
security applications.--In carrying out the component
of the program described in subparagraph (A)(iv), the
Secretary shall--
(i) use AI foundation models and other AI
technologies to develop a multitude of tuned
and adapted downstream models to solve pressing
scientific, applied energy, and national
security challenges;
(ii) carry out joint work, including
public-private partnerships, and cooperative
research projects with industry, including end
user companies, hardware systems vendors, and
AI software companies, to advance AI
technologies relevant to the missions of the
Department;
(iii) form partnerships with other Federal
agencies, institutions of higher education, and
international organizations aligned with the
interests of the United States to advance
frontier AI systems development and deployment;
and
(iv) increase research experiences and
workforce development, including training for
undergraduate and graduate students in frontier
AI for science, energy, and national security.
(3) Strategic plan.--In carrying out the program, the
Secretary shall develop a strategic plan with specific short-
term and long-term goals and resource needs to advance
applications in AI for science, energy, and national security
to support the missions of the Department, consistent with--
(A) the 2023 National Laboratory workshop report
entitled ``Advanced Research Directions on AI for
Science, Energy, and Security''; and
(B) the 2024 National Laboratory workshop report
entitled ``AI for Energy''.
(4) AI talent.--As part of the program, the Secretary shall
develop the required workforce, and hire and train not fewer
than 500 new researchers to meet the rising demand for AI
talent--
(A) with a particular emphasis on expanding the
number of individuals from underrepresented groups
pursuing and attaining skills relevant to AI; and
(B) including by--
(i) providing training, grants, and
research opportunities;
(ii) carrying out public awareness
campaigns about AI career paths; and
(iii) establishing new degree and
certificate programs in AI-related disciplines
at universities and community colleges.
(b) AI Research and Development Centers.--
(1) In general.--As part of the program established under
subsection (a), the Secretary shall select, on a competitive,
merit-reviewed basis, National Laboratories to establish and
operate not fewer than 8 multidisciplinary AI Research and
Development Centers (referred to in this subsection as
``Centers'')--
(A) to accelerate the safe, secure, and trustworthy
deployment of AI for science, energy, and national
security missions;
(B) to demonstrate the use of AI in addressing key
challenge problems of national interest in science,
energy, and national security; and
(C) to maintain the competitive advantage of the
United States in AI.
(2) Considerations for selection.--In selecting National
Laboratories under paragraph (1), the Secretary shall, to the
maximum extent practicable--
(A) ensure that at least 1 Center focuses on
applied energy activities carried out by the Office of
Energy Efficiency and Renewable Energy, the Office of
Fossil Energy and Carbon Management, or the Office of
Nuclear Energy; and
(B) consider geographic diversity to leverage
resources and facilities of National Laboratories and
partners in different regions.
(3) Focus.--Each Center shall bring together diverse teams
from National Laboratories, Department user facilities,
academia, and industry to collaboratively and concurrently
deploy hardware, software, numerical methods, data, algorithms,
and applications for AI and ensure that the frontier AI
research of the Department is well-suited for key Department
missions, including by using existing and emerging computing
systems and datasets to the maximum extent practicable.
(4) Administration.--
(A) National laboratory.--Each Center shall be
established as part of a National Laboratory.
(B) Application.--To be eligible for selection to
establish and operate a Center under paragraph (1), a
National Laboratory shall submit to the Secretary an
application at such time, in such manner, and
containing such information as the Secretary may
require.
(C) Director.--Each Center shall be headed by a
Director, who shall be the Chief Executive Officer of
the Center and an employee of the National Laboratory
described in subparagraph (A), and responsible for--
(i) successful execution of the goals of
the Center; and
(ii) coordinating with other Centers.
(D) Technical roadmap.--In support of the strategic
plan developed under subsection (a)(3), each Center
shall--
(i) set a research and innovation goal
central to advancing the science, energy, and
national security mission of the Department;
and
(ii) establish a technical roadmap to meet
that goal in not more than 7 years.
(E) Coordination.--The Secretary shall coordinate,
minimize duplication, and resolve conflicts between the
Centers.
(c) AI Risk Evaluation and Mitigation Program.--
(1) AI risk program.--As part of the program established
under subsection (a), and consistent with the missions of the
Department, the Secretary, in consultation with the Secretary
of Homeland Security, the Secretary of Defense, the Director of
National Intelligence, the Director of the National Security
Agency, and the Secretary of Commerce, shall carry out a
comprehensive program to evaluate and mitigate safety and
security risks associated with artificial intelligence systems
(referred to in this subsection as the ``AI risk program'').
(2) Risk taxonomy.--
(A) In general.--Under the AI risk program, the
Secretary shall develop a taxonomy of safety and
security risks associated with artificial intelligence
systems and datasets relevant to the missions of the
Department, including, at a minimum, the risks
described in subparagraph (B).
(B) Risks described.--The risks referred to in
subparagraph (A) are the abilities of artificial
intelligence--
(i) to generate information at a given
classification level;
(ii) to assist in generation of nuclear
weapons information;
(iii) to assist in generation of chemical,
biological, radiological, nuclear,
nonproliferation, critical infrastructure, and
other economic, security, or energy threats;
(iv) to assist in generation of malware and
other cyber and adversarial tactics,
techniques, and procedures that pose a
significant national security risk, such as
threatening the stability of critical national
infrastructure;
(v) to undermine public trust in the use of
artificial intelligence technologies or in
national security;
(vi) to deceive a human operator or
computer system, or otherwise act in opposition
to the goals of a human operator or automated
systems;
(vii) to act autonomously with little or no
human intervention in ways that conflict with
human intentions;
(viii) to be vulnerable to data compromise
by malicious cyber actors; and
(ix) to be vulnerable to other emerging or
unforeseen risks, as determined by the
Secretary.
(d) Shared Resources for AI.--
(1) In general.--As part of the program established under
subsection (a), the Secretary shall identify, support, and
sustain shared resources and enabling tools that have the
potential to reduce cost and accelerate the pace of scientific
discovery and technological innovation with respect to the
missions of the Department relating to science, energy, and
national security.
(2) Consultation.--In carrying out paragraph (1), the
Secretary shall consult with relevant experts in industry,
academia, and the National Laboratories.
(3) Focus.--Shared resources and enabling tools referred to
in paragraph (1) shall include the following:
(A) Scientific data and knowledge bases for
training AI systems.
(B) Benchmarks and competitions for evaluating
advances in AI systems.
(C) Platform technologies that lower the cost of
generating training data or enable the generation of
novel training data.
(D) High-performance computing, including hybrid
computing systems that integrate AI and high-
performance computing.
(E) The combination of AI and scientific
automation, such as cloud labs and self-driving labs.
(F) Tools that enable AI to solve inverse design
problems.
(G) Testbeds for accelerating progress at the
intersection of AI and cyberphysical systems.
(e) Administration.--
(1) Research security.--The activities authorized under
this section shall be applied in a manner consistent with
subtitle D of title VI of the Research and Development,
Competition, and Innovation Act (42 U.S.C. 19231 et seq.).
(2) Cybersecurity.--The Secretary shall ensure the
integration of robust cybersecurity and data security measures
into all AI research-to-deployment efforts authorized under
this section to protect the integrity and confidentiality of
collected and analyzed data.
(3) Partnerships with private entities.--
(A) In general.--The Secretary shall seek to
establish partnerships with private companies and
nonprofit organizations in carrying out this Act,
including with respect to the research, development,
and deployment of each of the 4 program components
described in subsection (a)(2)(A).
(B) Requirement.--In carrying out subparagraph (A),
the Secretary shall protect any information submitted
to or shared by the Department consistent with
applicable laws (including regulations).
(4) Considerations.--In carrying out this section, the
Secretary shall, to the maximum extent practicable, consider
leveraging existing resources from the public and private sectors.
(f) Annual Report.--The Secretary shall submit to Congress an
annual report describing--
(1) the progress, findings, and expenditures under each
program established under this section; and
(2) any legislative recommendations for promoting and
improving each of those programs.
SEC. 5. FEDERAL PERMITTING.
(a) Establishment.--Not later than 180 days after the date of
enactment of this Act, the Secretary shall establish a program to
improve Federal permitting processes for energy-related projects,
including critical materials projects, using artificial intelligence.
(b) Program Components.--In carrying out the program established
under subsection (a), the Secretary shall carry out activities,
including activities that--
(1) generate, collect, and analyze data and provide tools
from past environmental and other permitting reviews, including
by--
(A) extracting data from applications for
comparison with data relied on in environmental reviews
to assess the adequacy and relevance of applications;
(B) extracting information from past site-specific
analyses in the area of a current project;
(C) summarizing key mitigation actions that have
been successfully applied in past similar projects; and
(D) using AI for deeper reviews of past
determinations under the National Environmental Policy
Act of 1969 (42 U.S.C. 4321 et seq.) to inform more
flexible and effective categorical exclusions; and
(2) build tools to improve future reviews, including--
(A) tools for project proponents that accelerate
preparation of environmental documentation;
(B) tools for government reviewers such as domain-
specific large language models that help convert
geographic information system or tabular data on
resources potentially impacted into rough-draft
narrative documents;
(C) tools to be applied in nongovernmental
settings, such as automatic reviews of applications to
assess the completeness of information; and
(D) a strategic plan to implement and deploy online
and digital tools to improve Federal permitting
activities, developed in consultation with--
(i) the Secretary of the Interior;
(ii) the Secretary of Agriculture, with
respect to National Forest System land;
(iii) the Executive Director of the Federal
Permitting Improvement Steering Council
established by section 41002(a) of the FAST Act
(42 U.S.C. 4370m-1(a)); and
(iv) the heads of any other relevant
Federal department or agency, as determined
appropriate by the Secretary.
(c) Interagency Access.--The Secretary shall make available to
Federal agencies--
(1) the code for any artificial intelligence developed in
furtherance of the program established under subsection (a);
(2) the training dataset curated under this section; and
(3) the particular environmental documents used in that
training dataset.
SEC. 6. RULEMAKING ON AI STANDARDIZATION FOR GRID INTERCONNECTION.
Not later than 18 months after the date of enactment of this Act,
the Federal Energy Regulatory Commission shall initiate a rulemaking to
revise the pro forma Large Generator Interconnection Procedures
promulgated pursuant to section 35.28(f) of title 18, Code of Federal
Regulations (or successor regulations), to require public utility
transmission providers to share and employ, as appropriate, queue
management best practices with respect to the use of computing
technologies, such as artificial intelligence, machine learning, or
automation, in evaluating and processing interconnection requests, in
order to expedite study results with respect to those requests.
SEC. 7. ENSURING ENERGY SECURITY FOR DATACENTERS AND COMPUTING
RESOURCES.
Not later than 1 year after the date of enactment of this Act, the
Secretary shall submit to Congress a report that--
(1) assesses--
(A) the growth of computing data centers and
advanced computing electrical power load in the United
States;
(B) potential risks to United States energy and
national security posed by growth in computing data
centers or by growth in the required electrical power;
(C) the national security impacts of computing data
centers being manipulated through nefarious means to
cause broad impacts to energy reliability; and
(D) the extent to which emerging technologies, such
as artificial intelligence and advanced computing, may
impact hardware and software systems used at data and
computing centers; and
(2) provides recommendations for--
(A) resources and capabilities that the Department
may provide to promote access to energy resources by
data centers and advanced computing;
(B) policy changes to ensure that domestic deployment of
data center and advanced computing resources prevents
offshoring of United States data and resources;
(C) improving the energy efficiency of data
centers, advanced computing, and AI; and
(D) enhancing collaboration and resource sharing
between National Laboratories and other applicable
entities to maximize scientific output and accelerate
AI innovation.
SEC. 8. OFFICE OF CRITICAL AND EMERGING TECHNOLOGY.
(a) In General.--Title II of the Department of Energy Organization
Act is amended by inserting after section 215 (42 U.S.C. 7144b) the
following:
``SEC. 216. OFFICE OF CRITICAL AND EMERGING TECHNOLOGY.
``(a) Definitions.--In this section:
``(1) Critical and emerging technology.--The term `critical
and emerging technology' means--
``(A) advanced technology that is potentially
significant to United States competitiveness, energy
security, or national security, such as biotechnology,
advanced computing, and advanced manufacturing;
``(B) technology that may address the challenges
described in subsection (b) of section 10387 of the
Research and Development, Competition, and Innovation
Act (42 U.S.C. 19107); and
``(C) technology included in the key technology
focus areas described in subsection (c) of that section
(42 U.S.C. 19107).
``(2) Department capabilities.--The term `Department
capabilities' means--
``(A) each of the National Laboratories (as defined
in section 2 of the Energy Policy Act of 2005 (42
U.S.C. 15801)); and
``(B) each associated user facility of the
Department.
``(3) Director.--The term `Director' means the Director of
Critical and Emerging Technology described in subsection (d).
``(4) Office.--The term `Office' means the Office of
Critical and Emerging Technology established by subsection (b).
``(b) Establishment.--There shall be within the Office of the Under
Secretary for Science and Innovation an Office of Critical and Emerging
Technology.
``(c) Mission.--The mission of the Office shall be--
``(1) to work across the entire Department to assess and
analyze the status of and gaps in United States
competitiveness, energy security, and national security
relating to critical and emerging technologies, including
through the use of Department capabilities;
``(2) to leverage Department capabilities to provide for
rapid response to emerging threats and technological surprise
from new emerging technologies;
``(3) to promote greater participation of Department
capabilities within national science policy and international
forums; and
``(4) to inform the direction of research and policy
decisionmaking relating to potential risks of adoption and use
of emerging technologies, such as inadvertent or deliberate
misuses of technology.
``(d) Director of Critical and Emerging Technology.--The Office
shall be headed by a director, to be known as the `Director of Critical
and Emerging Technology', who shall--
``(1) be appointed by the Secretary; and
``(2) be an individual who, by reason of professional
background and experience, is specially qualified to advise the
Secretary on matters pertaining to critical and emerging
technology.
``(e) Collaboration.--In carrying out the mission and activities of
the Office, the Director shall closely collaborate with all relevant
Departmental entities, including the National Nuclear Security
Administration, the applied energy offices, and the Office of Science,
to maximize the computational capabilities of the Department and
minimize redundant capabilities.
``(f) Coordination.--In carrying out the mission and activities of
the Office, the Director--
``(1) shall coordinate with senior leadership across the
Department and other stakeholders (such as institutions of
higher education and private industry);
``(2) shall ensure the coordination of the Office of
Science with the other activities of the Department relating to
critical and emerging technology, including the transfer of
knowledge, capabilities, and relevant technologies from basic
research programs of the Department to applied research and
development programs of the Department, for the purpose of
enabling development of mission-relevant technologies;
``(3) shall support joint activities among the programs of
the Department;
``(4) shall coordinate with the heads of other relevant
Federal agencies operating under existing authorizations with
subjects related to the mission of the Office described in
subsection (c) in support of advancements in related research
areas, as the Director determines to be appropriate; and
``(5) may form partnerships to enhance the use of, and to
ensure access to, user facilities by other Federal agencies.
``(g) Planning, Assessment, and Reporting.--
``(1) In general.--Not later than 180 days after the date
of enactment of the Department of Energy AI Act, the Secretary
shall submit to Congress a critical and emerging technology
action plan and assessment, which shall include--
``(A) a review of current investments, programs,
activities, and science infrastructure of the
Department, including at the National Laboratories, to
advance critical and emerging technologies;
``(B) a description of any shortcomings of the
capabilities of the Department that may adversely
impact national competitiveness relating to emerging
technologies or national security; and
``(C) a budget projection for the subsequent 5
fiscal years of planned investments of the Department
in each critical and emerging technology, including
research and development, infrastructure, pilots, test
beds, demonstration projects, and other relevant
activities.
``(2) Updates.--Every 2 years after the submission of the
plan and assessment under paragraph (1), the Secretary shall
submit to Congress--
``(A) an updated emerging technology action plan
and assessment; and
``(B) a report that describes the progress made
toward meeting the goals set forth in the emerging
technology action plan and assessment submitted
previously.''.
(b) Clerical Amendment.--The table of contents for the Department
of Energy Organization Act (Public Law 95-91; 91 Stat. 565; 119 Stat.
764; 133 Stat. 2199) is amended by inserting after the item relating to
section 215 the following:
``Sec. 216. Office of Critical and Emerging Technology.''.
SEC. 9. OFFICE OF INTELLIGENCE AND COUNTERINTELLIGENCE REVIEW OF
VISITORS AND ASSIGNEES.
(a) Definitions.--In this section:
(1) Appropriate congressional committees.--The term
``appropriate congressional committees'' means--
(A) the congressional intelligence committees;
(B) the Committee on Armed Services, the Committee
on Energy and Natural Resources, the Committee on
Foreign Relations, the Committee on the Judiciary, the
Committee on Homeland Security and Governmental
Affairs, and the Committee on Appropriations of the
Senate; and
(C) the Committee on Armed Services, the Committee
on Energy and Commerce, the Committee on Foreign
Affairs, the Committee on the Judiciary, the Committee
on Homeland Security, and the Committee on
Appropriations of the House of Representatives.
(2) Country of risk.--The term ``country of risk'' means a
country identified in the report submitted to Congress by the
Director of National Intelligence in 2024 pursuant to section
108B of the National Security Act of 1947 (50 U.S.C. 3043b)
(commonly referred to as the ``Annual Threat Assessment'').
(3) Covered assignee; covered visitor.--The terms ``covered
assignee'' and ``covered visitor'' mean a foreign national from
a country of risk that is ``engaging in competitive behavior
that directly threatens U.S. national security'', who is not an
employee of either the Department or the management and
operations contractor operating a National Laboratory on behalf
of the Department, and who has requested access to the premises,
information, or technology of a National Laboratory.
(4) Director.--The term ``Director'' means the Director of
the Office of Intelligence and Counterintelligence of the
Department (or their designee).
(5) Foreign national.--The term ``foreign national'' has
the meaning given the term ``alien'' in section 101(a) of the
Immigration and Nationality Act (8 U.S.C. 1101(a)).
(6) National laboratory.--The term ``National Laboratory''
has the meaning given the term in section 2 of the Energy
Policy Act of 2005 (42 U.S.C. 15801).
(7) Nontraditional intelligence collection threat.--The
term ``nontraditional intelligence collection threat'' means a
threat posed by an individual not employed by a foreign
intelligence service, who is seeking access to information
about a capability, research, or organizational dynamics of the
United States to inform a foreign adversary or nonstate actor.
(b) Findings.--The Senate finds the following:
(1) The National Laboratories conduct critical, cutting-
edge research across a range of scientific disciplines that
provide the United States with a technological edge over other
countries.
(2) The technologies developed in the National Laboratories
contribute to the national security of the United States,
including classified and sensitive military technology and
dual-use commercial technology.
(3) International cooperation in the field of science is
critical to the United States maintaining its leading
technological edge.
(4) The research enterprise of the Department, including
the National Laboratories, is increasingly targeted by
adversarial nations to exploit military and dual-use
technologies for military or economic gain.
(5) Approximately 40,000 citizens of foreign countries,
including more than 8,000 citizens from China and Russia, were
granted access to the premises, information, or technology of
National Laboratories in fiscal year 2023.
(6) The Office of Intelligence and Counterintelligence of
the Department is responsible for identifying
counterintelligence risks to the Department, including the
National Laboratories, and providing direction for the
mitigation of such risks.
(c) Sense of the Senate.--It is the sense of the Senate that--
(1) before being granted access to the premises,
information, or technology of a National Laboratory, citizens
of foreign countries identified in the 2024 Annual Threat
Assessment of the intelligence community as ``engaging in
competitive behavior that directly threatens U.S. national
security'' should be appropriately screened by the National
Laboratory to which they seek access, and by the Office of
Intelligence and Counterintelligence of the Department, to
identify risks associated with granting the requested access to
sensitive military or dual-use technologies; and
(2) identified risks should be mitigated.
(d) Review of Country of Risk Covered Visitor and Covered Assignee
Access Requests.--The Director shall, in consultation with the
applicable Under Secretary of the Department that oversees the National
Laboratory, or their designee, promulgate a policy to assess the
counterintelligence risk that covered visitors or covered assignees
pose to the research or activities undertaken at a National Laboratory.
(e) Advice With Respect to Covered Visitors or Covered Assignees.--
(1) In general.--The Director shall provide advice to a
National Laboratory on covered visitors and covered assignees
when 1 or more of the following conditions are present:
(A) The Director has reason to believe that a
covered visitor or covered assignee is a nontraditional
intelligence collection threat.
(B) The Director is in receipt of information
indicating that a covered visitor or covered assignee
constitutes a counterintelligence risk to a National
Laboratory.
(2) Advice described.--Advice provided to a National
Laboratory in accordance with paragraph (1) shall include a
description of the assessed risk.
(3) Risk mitigation.--When appropriate, the Director shall,
in consultation with the applicable Under Secretary of the
Department that oversees the National Laboratory, or their
designee, provide recommendations to mitigate the risk as part
of the advice provided in accordance with paragraph (1).
(f) Reports to Congress.--Not later than 90 days after the date of
enactment of this Act, and quarterly thereafter, the Secretary
shall submit to the appropriate congressional committees a report,
which shall include--
(1) the number of covered visitors or covered assignees
permitted to access the premises, information, or technology of
each National Laboratory;
(2) the number of instances in which the Director provided
advice to a National Laboratory in accordance with subsection
(e); and
(3) the number of instances in which a National Laboratory
took action inconsistent with advice provided by the Director
in accordance with subsection (e).
(g) Authorization of Appropriations.--There is authorized to be
appropriated such sums as may be necessary to carry out this section
for each of fiscal years 2024 through 2032.