
…authorities. We will spare no resource to ensure that federal fair lending laws are vigorously enforced and that financial institutions provide equal opportunity for every American to obtain credit."

In the same vein of "modern-day redlining," Rohit Chopra, Director of the CFPB, stated in his remarks that the bureau "will also be closely watching for digital redlining, disguised through so-called neutral algorithms, that may reinforce the biases that have long existed."

Many financial institutions, including credit unions, are relying more heavily on technology, whether internally created or provided by third parties. However, without a firm understanding of the assumptions and information used to build the underlying algorithms, conscious or unconscious biases can persist. Director Chopra noted that some mortgage companies have even stated that they "do not have all the data that feeds into their algorithms or full knowledge of the algorithms." This lack of data and understanding of how the algorithms work creates a substantial blind spot, where "the algorithms are black boxes behind brick walls." With no visibility into how those decisions are made, the potential for fair lending issues is equally substantial.

While this is not a new area of interest for the CFPB, it may be a new area of consideration for credit unions. The industry increasingly turns to technology to improve efficiency and provide more services to members, which in turn can mean relying more on machine learning and algorithms. There is no shortage of companies looking to help credit unions get into the game, either. At the same time, state and federal regulators are pushing to ensure that these algorithms do not lead to continued, or even increased, discrimination. For example, state lawmakers in Colorado passed a law prohibiting life, annuity, long-term care and disability insurance issuers from using algorithms that result in "unfair discrimination based on race, color, national or ethnic origin, religion, sex, sexual orientation, disability, gender identity or gender expression." The law also prohibits "the use of external consumer data and information sources, as well as algorithms and predictive models using external consumer data and information sources, which use has the result of unfairly discriminating" based on the same set of prohibited bases. While this law does not directly address lending, one can see how similarly targeted laws could be structured and applied for fair lending purposes.

The concern is not limited to individual financial institutions or specific FinTech solution providers; there may also be fair lending implications in the underwriting software used (and often mandated) by the GSEs.
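To make concrete the kind of disparity analysis discussed next, here is a minimal, hypothetical sketch of how raw denial-rate ratios might be computed from HMDA-style records using pandas. The column names, values and reference group are assumptions for illustration only; rigorous fair lending analyses, like the one described below, also control for legitimate underwriting factors such as income, loan amount and credit characteristics.

```python
# Hypothetical illustration: raw denial-rate disparity by group.
# Column names, values and the reference group are assumptions for
# this sketch, not a prescribed fair lending methodology.
import pandas as pd

def denial_disparity(df: pd.DataFrame,
                     group_col: str = "race_ethnicity",
                     action_col: str = "action",
                     denied_value: str = "denied",
                     reference_group: str = "white") -> pd.DataFrame:
    """Compare each group's denial rate to the reference group's.

    A ratio of 1.4 corresponds to applicants in that group being denied
    40% more often than reference-group applicants, before controlling
    for any legitimate underwriting factors.
    """
    # Denial rate per group: share of applications ending in a denial.
    rates = (df[action_col].eq(denied_value)
               .groupby(df[group_col])
               .mean()
               .rename("denial_rate"))
    out = rates.to_frame()
    out["ratio_vs_reference"] = out["denial_rate"] / rates.loc[reference_group]
    return out.sort_values("ratio_vs_reference", ascending=False)

# Toy data: 10% of white applicants denied vs. 14% of Latino applicants,
# which the function reports as a 1.4x disparity ("40% more likely").
applications = pd.DataFrame({
    "race_ethnicity": ["white"] * 100 + ["latino"] * 100,
    "action": (["denied"] * 10 + ["approved"] * 90
               + ["denied"] * 14 + ["approved"] * 86),
})
print(denial_disparity(applications))
```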
The Markup, a nonprofit investigative newsroom, published a piece about "the secret bias hidden in mortgage approval algorithms," asserting that the credit scoring algorithm used by Fannie Mae and Freddie Mac is outdated and "widely considered detrimental to people of color because it rewards traditional credit, to which white Americans have more access." The article also analyzed 2019 HMDA data for conventional loans, and the results outline a nationwide trend: lenders "were 40% more likely to turn down Latino applicants for loans, 50% more likely to deny Asian/Pacific Islander applicants, and 70% more likely to deny Native American applicants than similar white applicants. Lenders were 80% more likely to reject Black applicants than similar white applicants."

Given these realities, what can be done? First, credit unions using or considering machine learning and algorithms for credit decisions need to have a firm understanding of the data, assumptions and other information used to make those decisions, to avoid the "black box behind a brick wall" issue. Next, there has been much discussion in the mortgage industry about alternative credit scoring models that could incorporate more modern information, such as complete payment histories from payday lending firms (which currently report only missed payments, not the total payment history) or rental payment history. Finally, transparency can help combat digital redlining. The more information credit unions have and can provide to their members, the better equipped members will be to put themselves in a position to obtain loans.

Detangling algorithms is no easy feat. Machine learning is complex by nature; if humans could do the work efficiently, the machines would not be needed. While technology can take in massive amounts of data and help us make sense of it all, it is important to understand the inputs to ensure that what comes out no longer perpetuates discriminatory practices. Federal regulators have indicated their intent to address redlining and similar fair lending issues, so credit unions will benefit from taking a deeper look at the technology they use and being mindful of possible unintended outcomes.
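As a concrete starting point for the "firm understanding" described above, the sketch below fits a toy scoring model and uses permutation importance, a model-agnostic inspection technique available in scikit-learn, to rank which inputs actually drive its decisions. The model, feature names and data are invented for illustration; with a vendor-supplied model, this kind of inspection requires the vendor's cooperation, which is precisely the transparency gap regulators have flagged.

```python
# Hypothetical illustration: inventorying and ranking the inputs that
# drive an automated credit decision, to avoid the "black box behind
# a brick wall" problem. Model, features and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)

# Every input (and its source) should be inventoried; note that a
# geographic field like "zip_code_median_income" can act as a proxy
# for prohibited bases -- the digital redlining risk discussed above.
feature_names = ["credit_score", "debt_to_income",
                 "loan_to_value", "zip_code_median_income"]
X = rng.normal(size=(500, len(feature_names)))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500)) > 0  # toy approvals

model = LogisticRegression().fit(X, y)

# Permutation importance: how much accuracy drops when each input is
# shuffled -- a model-agnostic view of what the model relies on.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name:>24s}: {score:.3f}")
```

An input inventory of this kind, refreshed as models change, also gives examiners and members something concrete to point to when questions about an automated decision arise.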

Rebecca Tetreau is regulatory compliance counsel for NAFCU.