The changing economic environment is opening a window of opportunity for banks and credit unions to put their artificial intelligence models for loan underwriting through a credit cycle.
The digital banking evolution of the past decade has significantly enriched data on consumer profiles and behaviors. It has also incubated a group of financial technology vendors that develop AI underwriting models for banks and credit unions. Since most of those vendors were established in a benign environment, early adopters are now looking to validate their AI models under current conditions, where the downside risk of credit deterioration is high.
"We want to keep a close eye to make sure that even in these changing economic times, the model is working as we expect it to be working," said Alice Stevens, vice president of consumer lending at VyStar CU.
VyStar's exploration of AI started about four years ago. In 2019, the Jacksonville, Fla.-based credit union partnered with ZestFinance Inc., which does business as Zest AI, a fintech company building AI models for lenders. In 2021, VyStar and First National Bank of Omaha led an $18 million investment in Zest AI.
Zest AI, founded in 2009, tested its models using data from scenarios in the 2008 economic downturn. But for VyStar, this is the first chance to observe in practice how the models perform when delinquency rates are rising and inflation is changing consumers' cash flow patterns, Stevens said.
"As much as I believe in this technology, I'm always going to be prudent about how far I jump into the water before I know it's warm," Stevens said.
The use of AI models for credit underwriting was led by digital lenders such as LendingClub Corp., said Gal Krubiner, CEO of Pagaya. A key catalyst for using algorithms is the production of clean, standardized data as digital banking penetrates lenders' operations, Krubiner said. Founded in 2016, Pagaya provides a network of AI models that lenders including Upgrade Inc., Ally Financial Inc. and SoFi Technologies Inc. use to make credit decisions.
"Compared to three or four years ago, I think it's a much broader range of financial institutions that are starting to develop or utilize machine learning models for credit purposes," said Tori Shinohara, a partner at Mayer Brown LLP.
A conservative approach
Unlike technology companies that aim to make machines smarter, banks and credit unions are taking a more conservative approach that leverages human assistance.
Zest AI uses supervised machine learning, a subcategory of AI, to help banks create records that explain credit decisions. In supervised models, lenders train the algorithm on well-labeled input and output data so that it can predict outcomes.
"In our model, we can tell you what the variables are and we actually provide that documentation," said Yolanda McGill, vice president of policy and government affairs at Zest AI.
In comparison, unsupervised models do not necessarily need human intervention. But relying on the algorithms alone makes it challenging to provide visibility into the rationale behind credit decisions, industry advisers said.
"What I've seen employed today, particularly by regulated institutions, is going to be a supervised machine learning model," Mayer Brown's Shinohara said. "So it's not a full black box. You need to know what's going in, and you need to know what's coming out, in order to be able to comply with the explainability requirements under the law."
In guidance issued Sept. 19, the Consumer Financial Protection Bureau emphasized that lenders using AI or other complex models must provide specific and accurate reasons when they deny credit to consumers, in compliance with the Equal Credit Opportunity Act.
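The distinction matters because a supervised model trained on labeled loan outcomes can surface the factors behind each score. The sketch below is a hypothetical illustration in Python, not anything Zest AI or VyStar has disclosed; the feature names and data are invented. It trains a simple supervised classifier on labeled outcomes and pulls out the features that most hurt an applicant's score, the kind of documentation the explainability requirements call for.

```python
# Illustrative sketch of a supervised, documentable credit model.
# Not Zest AI's or VyStar's actual implementation; features and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

FEATURES = ["credit_utilization", "months_since_delinquency", "debt_to_income"]

# Labeled training data: each row is a past applicant, each label records
# whether the loan ultimately performed (1) or defaulted (0).
X_train = np.array([
    [0.25, 48, 0.20],
    [0.90, 3, 0.55],
    [0.40, 24, 0.35],
    [0.75, 6, 0.50],
])
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression()
model.fit(X_train, y_train)

def score_with_reasons(applicant: np.ndarray) -> tuple[float, list[str]]:
    """Return the approval probability plus the features that pushed the
    score down the most, i.e. candidate adverse-action reasons."""
    prob = model.predict_proba(applicant.reshape(1, -1))[0, 1]
    # Rough per-feature contribution to the log-odds for this applicant.
    contributions = model.coef_[0] * applicant
    worst = np.argsort(contributions)[:2]  # two most negative contributors
    return prob, [FEATURES[i] for i in worst]

prob, reasons = score_with_reasons(np.array([0.85, 4, 0.60]))
print(f"approval probability: {prob:.2f}; adverse factors: {reasons}")
```

Because the model is trained on labeled inputs and outputs, every variable and its weight can be documented and audited, in contrast to an unsupervised "black box."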
Since VyStar uses supervised models, loan officers play a key role in monitoring performance and running tests to catch possible errors. VyStar mainly uses the AI models to vet loan applications and has applied them to the vast majority of its high-volume consumer lending products, including credit cards, personal lines of credit and auto loans, Stevens said. More than 60% of applications run through the AI models can be approved instantly, compared with about 30% under the traditional digital lending solution, she said.
AI models are better at drawing correlations among data points to build a picture of an applicant's creditworthiness, rather than simply ticking off "yes" or "no" boxes the way traditional digital lending software does, Stevens explained. The credit union is also interested in exploring the use of AI for more complex tasks such as loan pricing, she said.
"Because [AI] uses the data in a relative way, we're able to trust it more," Stevens said.
Regulatory attention
After the Great Recession of 2008, fintech innovation thrived partly on low interest rates and investor interest in the space. But regulators have long been wary that fintechs have yet to prove they can survive a full credit cycle.
In a 2017 report, the Financial Stability Board wrote: "[W]ithout the benefit of a full credit cycle, it is too early to say how new models that exploit big data will perform in terms of measuring and pricing risk."
But innovation keeps coming. ChatGPT took the internet by storm after its launch in November 2022, triggering broad discussion about strengthening regulatory supervision of AI. Financial regulators in particular have underscored that AI models must comply with existing consumer protection laws.
"AI is getting a lot of attention in Washington. The tech giants have come to tell us how to shape new laws that will advance their business models. But laws already exist governing some aspects of AI," Sen. Elizabeth Warren (D-Mass.) said at a Senate banking committee hearing Sept. 20.
Just like other lending practices, the use of AI tools for credit is governed by the Equal Credit Opportunity Act and the Fair Credit Reporting Act. With AI models, bank regulators focus on lenders' ability to explain credit decisions and on fair lending, Zest AI's McGill said.
Regulated institutions are also expected to have a governance framework for their algorithmic models, Pagaya's Krubiner added. Pagaya has been working with Deloitte LLP and Charles River Associates to audit its AI models for credit decisioning.
"There is no 'AI exception' to our consumer protection laws," Warren said at the hearing.
VyStar will continue to monitor and fine-tune its AI system to make sure it adheres to regulations. If the approach succeeds, the credit union expects it to boost efficiency.
"Eventually we'll be able to have less human underwriters because we won't be looking at as many loans, but all of that is going to happen over a long period of time — a fairly long business cycle," Stevens said.