IAPP CIPP/US: Privacy concerns with Acme Student Loan Company using artificial intelligence?

Question

Acme Student Loan Company has developed an artificial intelligence algorithm that determines whether an individual is likely to pay their bill or default. A person who is determined by the algorithm to be more likely to default will receive frequent payment reminder calls, while those who are less likely to default will not receive payment reminders. Which of the following most accurately reflects the privacy concerns with Acme Student Loan Company using artificial intelligence in this manner?

A. If the algorithm uses risk factors that impact the automated decision engine, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.
B. If the algorithm makes automated decisions based on risk factors and public information, Acme need not determine if the algorithm has a disparate impact on protected classes.
C. If the algorithm’s methodology is disclosed to consumers, then it is acceptable for Acme to have a disparate impact on protected classes.
D. If the algorithm uses information about protected classes to make automated decisions, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

Answer

D. If the algorithm uses information about protected classes to make automated decisions, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

Explanation

The correct answer is D. If the algorithm uses information about protected classes to make automated decisions, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

The Equal Credit Opportunity Act (ECOA) prohibits creditors from discriminating in credit decisions on the basis of protected characteristics such as race, color, religion, national origin, sex, marital status, or age, or because an applicant receives income from a public assistance program. The Fair Credit Reporting Act (FCRA) complements these protections by requiring that consumer reports used by creditors and other businesses be accurate and used fairly.

In the case of Acme Student Loan Company, the algorithm uses information about individuals to make automated decisions about whether to send payment reminder calls. If that information includes or correlates with protected-class characteristics, the decisions could have a disparate impact on protected classes, such as people of color or recipients of public assistance income. For example, people of color might be disproportionately flagged as likely to default even when they are just as likely to repay their loans as borrowers of other races.

Acme Student Loan Company must ensure that the algorithm does not have a disparate impact on protected classes. This could be done by using a variety of methods, such as:

  • Training the algorithm on a dataset that is representative of the population of borrowers.
  • Using fairness-testing techniques, such as disparate impact analysis, to evaluate the algorithm’s effect on protected classes.
  • Taking steps to mitigate any disparities that are found.
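One common fairness-testing heuristic is the “four-fifths rule”: if a protected group’s favorable-outcome rate falls below 80% of the most favored group’s rate, the result may indicate disparate impact. The sketch below illustrates that check; the group labels and outcome data are hypothetical, not drawn from the question.

```python
# Sketch: disparate-impact check using the four-fifths rule.
# Data and group names are hypothetical examples for illustration.

def selection_rate(outcomes):
    """Fraction of a group receiving the favorable outcome (1 = favorable)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(protected_outcomes, reference_outcomes):
    """Ratio of the protected group's favorable-outcome rate to the
    reference group's. A ratio below 0.8 suggests possible disparate
    impact under the four-fifths rule."""
    return selection_rate(protected_outcomes) / selection_rate(reference_outcomes)

# Hypothetical algorithm outputs: 1 = classified "unlikely to default"
# (no reminder calls), 0 = classified "likely to default" (frequent calls).
protected_group = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]   # favorable rate = 0.3
reference_group = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]   # favorable rate = 0.7

ratio = disparate_impact_ratio(protected_group, reference_group)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.30 / 0.70 ≈ 0.43
if ratio < 0.8:
    print("Possible disparate impact; further review recommended.")
```

A check like this is only a screening step; a low ratio does not prove discrimination, and a passing ratio does not rule it out, so statistically significant disparities still warrant deeper review.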

By taking these steps, Acme Student Loan Company can help ensure that its use of artificial intelligence does not violate the ECOA, the FCRA, or other privacy and anti-discrimination laws.

Reference

IAPP Certified Information Privacy Professional/United States (CIPP/US) certification exam practice question and answer.