There is 1 version of this bill.


Titles (2)

Short Titles

Short Titles - Senate

Short Titles as Introduced

Algorithmic Accountability Act of 2019

Official Titles

Official Titles - Senate

Official Titles as Introduced

A bill to direct the Federal Trade Commission to require entities that use, store, or share personal information to conduct automated decision system impact assessments and data protection impact assessments.


Actions Overview (1)

Date          Actions Overview
04/10/2019    Introduced in Senate

All Actions (1)

Date          All Actions
04/10/2019    Read twice and referred to the Committee on Commerce, Science, and Transportation. (Sponsor introductory remarks on measure: CR S2389)
Action By: Senate

Cosponsors (1)

* = Original cosponsor
Cosponsor                       Date Cosponsored
Sen. Booker, Cory A. [D-NJ]*    04/10/2019

Committees (1)


Committee / Subcommittee                        Date          Activity       Reports
Senate Commerce, Science, and Transportation    04/10/2019    Referred to



Latest Summary (1)

There is one summary for S.1108.

Shown Here: Introduced in Senate (04/10/2019)

Algorithmic Accountability Act of 2019

This bill requires specified commercial entities to conduct assessments of high-risk systems that involve personal information or make automated decisions, such as systems that use artificial intelligence or machine learning.

Specifically, high-risk automated decision systems include those that (1) may contribute to inaccuracy, bias, or discrimination; or (2) facilitate decision-making about sensitive aspects of consumers' lives by evaluating consumers' behavior. Further, an automated decision system, or an information system involving personal information, is considered high-risk if it (1) raises security or privacy concerns, (2) involves the personal information of a significant number of people, or (3) systematically monitors a large, publicly accessible physical location.

Assessments of high-risk automated decision systems must (1) describe the system in detail, (2) assess the relative costs and benefits of the system, (3) determine the risks to the privacy and security of personal information, and (4) explain the steps taken to minimize those risks, if discovered. Assessments of high-risk information systems involving personal information must evaluate the extent to which the system protects the privacy and security of such information.