Online lenders and human lenders charge minorities more for mortgages

Online lenders and human lenders earn 11-17% higher profits on minority borrowers by charging African Americans and Latinos higher rates, researchers at the University of California, Berkeley have found.

It’s not just racially prejudiced bank loan officers who discriminate against black and Latino borrowers. Computer algorithms do it too.

That’s the groundbreaking finding from researchers at the University of California, Berkeley, who report that algorithmic credit scoring using big data is no better than humans at leveling the playing field when it comes to determining mortgage interest rates.

Online lenders and human lenders earn 11-17% higher profits on minority borrowers by charging African Americans and Latinos higher rates, according to the study. Black and Latino consumers pay interest rates on home purchase loans that are 5.6 to 8.6 basis points higher than those of white or Asian borrowers with similar credit profiles, whether they obtained their loans face to face or online. The effect is smaller for refinancing, with black and Latino borrowers paying 3 basis points more.

The disparity causes African Americans and Latinos together to pay up to half a billion dollars more in mortgage interest each year, according to the study.
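To put those basis points in perspective, here is a back-of-the-envelope calculation in Python. The loan size, base rate and term are assumed round numbers for illustration, not figures from the Berkeley study; the sketch only shows how an extra 8 basis points adds up over a 30-year fixed mortgage.

```python
# Illustrative arithmetic only: what an extra 8 basis points (0.08 percentage
# points) costs one borrower. Loan size, base rate and term are assumed round
# numbers, not figures taken from the Berkeley study.

def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12           # monthly interest rate
    n = years * 12                 # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 250_000                # assumed loan amount
base_rate = 0.045                  # assumed 4.5% rate for a comparable white or Asian borrower
premium = 0.0008                   # 8 basis points

base = monthly_payment(principal, base_rate)
marked_up = monthly_payment(principal, base_rate + premium)

print(f"Extra per month:     ${marked_up - base:.2f}")
print(f"Extra over 30 years: ${(marked_up - base) * 360:,.0f}")
```

On those assumed numbers the markup comes to only about twelve dollars a month, but small per-loan differences like this, spread across millions of minority borrowers, are how the aggregate cost climbs into the hundreds of millions of dollars a year.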

“Removing humans from the process should eliminate malicious forms of discrimination,” said Adair Morse, professor of finance at the Haas School of Business at Berkeley and a co-author of the study. “But we are entering an era in which variables are used to statistically discriminate against people in lending.”

The results are significant as more and more consumers take out mortgages online. Almost half of the top 2,000 mortgage lenders offer complete mortgage applications online.

Morse and her colleagues – Nancy Wallace and Richard Stanton at Haas and Robert Bartlett at Berkeley Law – focused on 30-year fixed-rate single-family home loans issued between 2008 and 2015. For the first time, they were able to link data on interest rates, loan terms, property location, income and credit scores with borrowers’ race. All loans were guaranteed by the government-sponsored enterprises Fannie Mae and Freddie Mac, allowing the researchers to eliminate credit risk as a factor in price differences.

“Even controlling for creditworthiness, we see discriminatory effects in the rates at which borrowers get mortgages,” Bartlett said.

The researchers said racial disparities could result from algorithms that use machine learning and big data to charge higher interest rates to borrowers who may be less likely to shop around. For example, algorithms can take into account a borrower’s neighborhood – noting who lives in banking deserts – or other characteristics such as their high school or college. And the consumers least likely to shop around are disproportionately black and Latino.
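The mechanism is easy to see in a toy simulation, sketched below in Python. The groups, probabilities and markup are hypothetical and are not drawn from the study’s data or models; the point is only that a pricing rule which never sees race can still produce a racial rate gap when it prices on a proxy, such as living in a banking desert, that is correlated with race.

```python
# Illustrative toy simulation of proxy pricing. Group labels, probabilities and
# the markup are invented for the example; none of this comes from the study.
import random

random.seed(0)

def simulate_borrower():
    """A hypothetical borrower: a group label plus one proxy feature."""
    group = random.choice(["A", "B"])
    # Assume group B is more likely to live in a "banking desert" with few
    # competing lenders, which the pricing rule reads as low shopping propensity.
    in_banking_desert = random.random() < (0.6 if group == "B" else 0.2)
    return group, in_banking_desert

def quoted_rate(in_banking_desert, base_rate=4.50):
    """Pricing rule that never sees the group label, only the proxy."""
    return base_rate + (0.08 if in_banking_desert else 0.0)   # 8 bp markup

borrowers = [simulate_borrower() for _ in range(100_000)]
for g in ("A", "B"):
    rates = [quoted_rate(desert) for grp, desert in borrowers if grp == g]
    print(g, round(sum(rates) / len(rates), 3))
# Group B ends up with a higher average rate even though the rule is group-blind.
```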

It’s legal to use statistical data to set prices that help maximize profits – in theory. The problem arises when the data correlates with race, regardless of credit risk. Discrimination against minority borrowers – even unintentionally – is illegal unless it is based on their creditworthiness, Bartlett said.

Homeownership and debt are key factors in racial disparities in wealth.

Bartlett said banks that increasingly use big data to determine loan approvals or rates should be audited to ensure their methods do not discriminate against minority borrowers who have the same credit scores as whites.

The researchers described a few silver linings in their study. Increased competition among lenders has resulted in less discrimination overall. And when it comes to whether to accept or reject a loan application, online lenders do not discriminate against minorities, while their human counterparts are 4% more likely to reject Latino and African American borrowers.

Instead, online lenders end up serving borrowers who were discriminated against by face-to-face lenders, according to the study.

“Rejecting those loans would be leaving money on the table for the lenders,” Morse said.
