Hartford Courant (Sunday)

Ending discrimination

Can digital mortgage platforms reduce bias?

By Jennifer Miller

In 2015, Melany Anderson’s 6-year-old daughter came home from a play date and asked her mother a heartbreaking question: Why did all her friends have their own bedrooms?

Anderson, 41, a pharmaceutical benefits consultant, was recently divorced, living with her parents in West Orange, New Jersey, and sharing a room with her daughter. She longed to buy a home, but the divorce had emptied her bank account and wrecked her credit. She was working hard to improve her financial profile, but she couldn’t imagine submitting herself to the scrutiny of a mortgage broker.

“I found the idea of going to a bank completely intimidating and impossible,” she said. “I was a divorced woman and a Black woman. And also being a contractor — I know it’s frowned upon, because it’s looked at as unstable. There were so many negatives against me.”

Then, last year, Anderson was checking her credit score online when a pop-up ad announced that she was eligible for a mortgage, listing several options. She ended up at Better.com, a digital lending platform, which promised to help Anderson secure a mortgage without ever setting foot in a bank or, if she so desired, even talking to another human.

In the end, she estimated, she conducted about 70% of the mortgage application and approval process online. Her fees totaled $4,000, about half the national average. In November 2019, she and her daughter moved into a two-bedroom home not far from her parents with a modern kitchen, a deck and a backyard.

Getting a mortgage can be a harrowing experience for anyone, but for those who don’t fit the middle-of-last-century stereotype of homeownership — white, married, heterosexual — the stress is amplified by the heightened probability of getting an unfair deal.

Digital mortgage websites and apps represent a potential improvement. Without showing their faces, prospective borrowers can upload their financial information, get a letter of preapproval, customize loan criteria (like the size of the down payment) and search for interest rates. Software processes the data and, if the numbers check out, approves a loan.

Reducing — or even removing — human brokers from the mortgage underwriting process could democratize the industry.

Last year, Better.com said, it saw significant increases in traditionally underrepresented homebuyers, including people of color, single women, LGBTQ couples and customers with student loan debt.

“Discrimination is definitely falling, and it corresponds to the rise in competition between fintech lenders and regular lenders,” said Nancy Wallace, chair in real estate capital markets at Berkeley’s Haas School of Business. A study that Wallace co-authored in 2019 found that fintech algorithms discriminated 40% less on average than face-to-face lenders in loan pricing and did not discriminate at all in accepting and rejecting loans.

Digital lenders say that they assess risk using the same financial criteria as traditional banks: borrower income, assets, credit score, debt, liabilities, cash reserves and the like.

These lenders could theoretically use additional variables to assess whether borrowers can repay a loan, such as rental or utility payment history, or even assets held by extended family. But generally, they don’t. To fund their loans, they rely on the secondary mortgage market, which includes the government-backed entities Freddie Mac and Fannie Mae, and which became more conservative after the 2008 crash. With some exceptions, if you don’t meet the standard CFPB criteria, you are likely to be considered a risk.

Fair housing advocates say that’s a problem, because the standard financial information puts minorities at a disadvantage. Credit scores are calculated based on a person’s spending and payment habits. But landlords often don’t report rental payments to credit bureaus, even though these are the largest payments that millions of people make on a regular basis, including more than half of Black Americans.

For mortgage lending, most banks rely on the credit scoring model invented by the Fair Isaac Corp., or FICO. Newer FICO models can include rental payment history, but the secondary mortgage market doesn’t require them. Neither does the Federal Housing Administration, which specializes in loans for low- and moderate-income borrowers. What’s more, systemic inequality has created salary disparities between Black and white Americans.

“We know the wealth gap is incredibly large between white households and households of color,” said Alanna McCargo, the vice president of housing finance policy at the Urban Institute. “If you are looking at income, assets and credit — your three drivers — you are excluding millions of potential Black, Latino and, in some cases, Asian minorities and immigrants from getting access to credit through your system. You are perpetuating the wealth gap.”

Software has the potential to reduce lending disparities by processing enormous amounts of personal information — far more than the CFPB guidelines require. Looking more holistically at a person’s financials as well as their spending habits and preferences, banks can make a more nuanced decision about who is likely to repay their loan.

Lisa Rice, president and chief executive of the National Fair Housing Alliance, said she was skeptical when mortgage lenders said their algorithms considered only federally sanctioned variables like credit score, income and assets. “Data scientists will say, if you’ve got 1,000 bits of information going into an algorithm, you’re not possibly only looking at three things,” she said. “If the objective is to predict how well this person will perform on a loan and to maximize profit, the algorithm is looking at every single piece of data to achieve those objectives.”

Many loan officers, of course, do their work equitably, Rice said. “Humans understand how bias is working,” she said. “There are so many examples of loan officers who make the right decisions and know how to work the system to get that borrower who really is qualified through the door.”

But as Zest AI’s former executive vice president, Kareem Saleh, put it, “Humans are the ultimate black box.” Intentionally or unintentionally, they discriminate.

These are positive steps. But fair housing advocates say government regulators and banks in the secondary mortgage market must rethink risk assessment: accept alternative credit scoring models, consider factors like rental payment history and ferret out algorithmic bias.

BRYAN ANSELM/THE NEW YORK TIMES Melany Anderson, with her daughter, Milan Wright, used a digital platform to get a mortgage for her New Jersey home.
