To direct the Federal Trade Commission to require entities that use, store, or share personal information to conduct automated decision system impact assessments and data protection impact assessments.
Actions Overview (1)
Date          Actions Overview
04/10/2019    Introduced in House
All Actions (3)
Date          All Actions
04/11/2019    Referred to the Subcommittee on Consumer Protection and Commerce. Action By: Committee on Energy and Commerce
04/10/2019    Referred to the House Committee on Energy and Commerce. Action By: House of Representatives
04/10/2019    Introduced in House. Action By: House of Representatives
Committees, subcommittees and links to reports associated with this bill are listed here, as well as the nature and date of committee activity and Congressional report number.
Committee / Subcommittee                                    Date          Activity       Related Documents
House Energy and Commerce                                   04/10/2019    Referred to
    Subcommittee on Consumer Protection and Commerce        04/11/2019    Referred to
All Summaries (1)
Shown Here: Introduced in House (04/10/2019)
Algorithmic Accountability Act of 2019
This bill requires specified commercial entities to conduct assessments of high-risk systems that involve personal information or make automated decisions, such as systems that use artificial intelligence or machine learning.
Specifically, high-risk automated decision systems include those that (1) may contribute to inaccuracy, bias, or discrimination; or (2) facilitate decision-making about sensitive aspects of consumers' lives by evaluating consumers' behavior. Further, an automated decision system, or an information system involving personal data, is considered high-risk if it (1) raises security or privacy concerns, (2) involves the personal information of a significant number of people, or (3) systematically monitors a large, publicly accessible physical location.
Assessments of high-risk automated decision systems must (1) describe the system in detail, (2) assess the relative costs and benefits of the system, (3) determine the risks to the privacy and security of personal information, and (4) explain the steps taken to minimize those risks, if discovered. Assessments of high-risk information systems involving personal information must evaluate the extent to which the system protects the privacy and security of such information.
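To make the summary's structure concrete, the sketch below models the listed high-risk triggers and the four required assessment elements as simple Python data structures. The names (SystemProfile, ImpactAssessment, and their fields) are hypothetical and are an illustrative reading of this summary only, not language from the bill or an FTC-prescribed format.

# Illustrative sketch only: hypothetical names modeling the summary's criteria,
# not text from the bill or any regulatory template.
from dataclasses import dataclass, field


@dataclass
class SystemProfile:
    # Characteristics the summary lists as triggers for "high-risk" status.
    may_contribute_to_bias_or_inaccuracy: bool = False
    evaluates_sensitive_aspects_of_consumers: bool = False
    raises_security_or_privacy_concerns: bool = False
    involves_significant_volume_of_personal_info: bool = False
    monitors_large_public_physical_location: bool = False

    def is_high_risk(self) -> bool:
        # Per the summary's description, any one listed condition suffices.
        return any((
            self.may_contribute_to_bias_or_inaccuracy,
            self.evaluates_sensitive_aspects_of_consumers,
            self.raises_security_or_privacy_concerns,
            self.involves_significant_volume_of_personal_info,
            self.monitors_large_public_physical_location,
        ))


@dataclass
class ImpactAssessment:
    # The four elements the summary requires for high-risk automated decision systems.
    system_description: str
    cost_benefit_analysis: str
    privacy_and_security_risks: list[str] = field(default_factory=list)
    risk_minimization_steps: list[str] = field(default_factory=list)


if __name__ == "__main__":
    profile = SystemProfile(evaluates_sensitive_aspects_of_consumers=True)
    if profile.is_high_risk():
        assessment = ImpactAssessment(
            system_description="Resume-screening model used in hiring decisions",
            cost_benefit_analysis="Faster screening vs. risk of biased outcomes",
            privacy_and_security_risks=["Re-identification of applicants"],
            risk_minimization_steps=["Access controls", "Periodic bias audits"],
        )
        print(assessment)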