EstatePass
Ethics & Fraud · Medium · 17% of exam

A mortgage company's automated underwriting system consistently approves fewer applications from borrowers in ZIP codes with majority African American populations, even when controlling for credit scores and income. The company states they had no knowledge of the demographic composition when programming the system. This most likely constitutes:

Correct Answer

C) Disparate impact because the system produces discriminatory results regardless of intent

Answer Options

A) Neither disparate treatment nor impact since the system is automated and race-neutral
B) Disparate treatment because the company intentionally programmed bias into the system
C) Disparate impact because the system produces discriminatory results regardless of intent
D) Redlining because it involves geographic discrimination

Why This Is the Correct Answer

This is disparate impact under the Fair Housing Act, and disparate impact does not require proof of discriminatory intent. Even though the system appears facially neutral and the company claims no knowledge of demographics, outcomes that disproportionately harm a protected class (here, race) can still violate fair lending laws; to defend the system, the company would need to show its criteria serve a legitimate business necessity. Option D is weaker because redlining classically describes deliberately refusing to lend in particular geographic areas, whereas here the geographic pattern is an unintended result of the system's criteria.
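Because disparate impact is established from outcomes rather than intent, it is typically detected statistically. One common screening heuristic is the "four-fifths rule": compare approval rates across groups and flag a ratio below 80% for further review. A minimal sketch, with made-up group names and counts purely for illustration (not from the question):

```python
# Hypothetical approval counts by group (illustrative numbers only).
approvals = {"group_a": 300, "group_b": 180}
applications = {"group_a": 500, "group_b": 500}

# Approval rate for each group.
rates = {g: approvals[g] / applications[g] for g in approvals}

# Adverse impact ratio: lowest group rate divided by highest group rate.
air = min(rates.values()) / max(rates.values())

# Under the four-fifths (80%) rule of thumb, a ratio below 0.8
# flags potential disparate impact warranting closer analysis.
flagged = air < 0.8
print(round(air, 2), flagged)  # 0.6 True
```

A flagged ratio is a screening signal, not a legal conclusion; the lender would still have the opportunity to justify the criteria as a business necessity.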
