Start accounting for CECL

The Current Expected Credit Loss accounting standard doesn’t kick in until 2021 for most community banks, but that doesn’t mean you can’t start preparing today.

By Katie Kuehner-Hebert

While the Current Expected Credit Loss (CECL) accounting standard won’t take effect until 2020 for SEC registrants and 2021 for all other banks, community banks can start preparing now for the sea change in how they calculate and reserve for potential losses.

The new standard, developed by the Financial Accounting Standards Board (FASB), replaces the incurred-loss impairment methodology with one that forecasts lifetime expected credit losses and requires consideration of a broader range of “reasonable and supportable” information to inform credit loss estimates.

“The whole purpose is to capture a truer picture of risk,” says Will Newcomer, vice president of product and strategy, finance, risk and reporting–Americas for consultancy Wolters Kluwer in Boston, Mass. With the outgoing incurred-loss methodology, banks manage their reserves based on losses that have happened in the immediate past or are expected to happen in the very near future. Its shortcoming, Newcomer says, is that stock analysts and other outsiders can only see what’s already happened, but not what is expected to happen.

After the financial crisis, accounting boards started examining ways to capture credit risk over the entire life of a loan to more accurately predict how much banks expected to lose on their portfolios. While there is no mandated model, FASB provides guidelines on acceptable methodology.

“Banks can buy or build models, but no matter what, they need to be able to defend their models,” Newcomer says.

Most banks feel the first CECL filing will be a “best guess” and that the entire industry will find ways to improve on the process. After all, the figures banks will use to project future losses are only estimates. The key is how they will explain and defend their methodology to analysts, examiners, board directors and shareholders.

“To do this, they’ll need to effectively communicate how they researched and decided which methodologies to use in the models,” Newcomer says. “If the reserves are adjusted up or down 5 percent, they will have to be able to explain why and defend the results.”

Brett D. Schwantes, a CPA and senior manager of the Wausau, Wis., office of Wipfli LLP, says small banks with accountants on staff could potentially develop their own models, avoiding having to pay for a third-party solution.

“But just because a bank has the resources to develop its own model doesn’t mean I recommend that it automatically does this internally. Other considerations need to go into the decision,” Schwantes cautions. “First, what happens if the CFO or other employee maintaining the model leaves or is out for extended disability? Does the bank have someone who can take over the CECL responsibilities? If not, they are going to be stuck.”

Simple methodologies

Another disadvantage of an in-house model is that a bank will likely adopt a simple methodology, which typically results in larger allowances for loan losses. A simple methodology is much less precise because it does not include all the relevant variables a more sophisticated methodology could utilize.

“For example, with a simple methodology, a bank is not going to consider probability of default for each individual loan but instead will only be looking at the entire portfolio,” Schwantes says. He adds that a simple methodology may not consider changes in risk characteristics of the portfolio over time; thus management will have to spend more time determining qualitative adjustments to the calculated CECL loss rate.

If a bank today used a five-year life to calculate the cumulative loss rate of its portfolio, it would be reaching back to 2012, when it might have been taking recession-related losses on its loans that no longer reflect the current credit quality of the portfolio.

“Banks can buy or build models, but no matter what, they need to be able to defend their models.”
—Will Newcomer,
Wolters Kluwer

“The portfolio would likely have improved a great deal after that, but a simple methodology doesn’t account for that—it’s stuck in 2012,” Schwantes says. “That would result in higher loan-loss reserves.”
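The simple cumulative loss-rate approach described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's model; the function name, the $50 million pool and the 30-basis-point downward qualitative adjustment (reflecting a lookback window that includes recession-era losses) are all invented for the example.

```python
# Minimal sketch of a simple cumulative loss-rate methodology:
# lifetime reserve = pool balance x (historical cumulative loss rate
# + qualitative adjustment). All figures below are invented.

def simple_cecl_reserve(pool_balance, historical_charge_offs,
                        historical_avg_balance, qualitative_adj=0.0):
    """Estimate a lifetime reserve from one portfolio-level loss rate.

    historical_charge_offs: net charge-offs over the lookback window
    historical_avg_balance: average portfolio balance over that window
    qualitative_adj: management overlay (e.g. -0.003 = minus 30 bps)
    """
    cumulative_loss_rate = historical_charge_offs / historical_avg_balance
    return pool_balance * (cumulative_loss_rate + qualitative_adj)

# A $50M pool with 2.4% cumulative historical losses, adjusted down
# 30 bps because recession-era losses overstate current credit quality:
reserve = simple_cecl_reserve(50_000_000, 1_200_000, 50_000_000, -0.003)
print(round(reserve))  # 1050000
```

Note that the qualitative adjustment is exactly the judgment call management would have to document and defend: here it expresses the view that the portfolio has improved since 2012.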

On the other hand, if a bank hires an outside party to develop a methodology on its behalf, a more complex and precise approach might be used, such as the probability of default methodology. This methodology uses statistical analysis to calculate the probability of each loan defaulting and then estimates how much loss the bank would incur as a result of that individual default.
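The loan-level arithmetic behind a probability-of-default approach is commonly expressed as expected loss = PD × LGD × EAD, summed over the portfolio. The sketch below assumes that standard formulation; the probabilities, severities and exposures are invented.

```python
# Hedged sketch of a loan-level probability-of-default calculation:
# expected loss = PD x LGD x EAD per loan, summed across the portfolio.
# All loans and rates below are illustrative inventions.

def expected_loss(loans):
    """Sum PD x LGD x EAD across individual loans.

    Each loan is (probability_of_default, loss_given_default, exposure).
    """
    return sum(pd_ * lgd * ead for pd_, lgd, ead in loans)

portfolio = [
    (0.02, 0.40, 250_000),   # 2% PD, 40% loss severity, $250K exposure
    (0.05, 0.35, 100_000),   # riskier loan, smaller exposure
    (0.01, 0.50, 500_000),   # low PD, higher severity, large exposure
]
print(round(expected_loss(portfolio)))  # 6250
```

Unlike the portfolio-level loss rate, each loan contributes its own estimate, which is the source of the added precision Schwantes describes.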

Another more complex methodology that might be used is the discounted cash-flow methodology, Schwantes says. This methodology schedules all of the monthly payments for each loan using scheduled payments and other factors, such as a constant default rate and an estimate for prepayments, then discounts the cash flows back to the current period using the loan’s effective rate.
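The discounted cash-flow mechanics can also be sketched. The code below is a simplified illustration under stated assumptions, not a production model: a level-payment loan that re-amortizes monthly, a constant monthly default rate with a fixed loss severity, a constant monthly prepayment rate, and discounting at the loan's effective monthly rate. The expected credit loss is the amortized cost minus the present value of expected cash flows.

```python
# Illustrative DCF sketch (all inputs invented): project monthly cash
# flows, thin them by constant default and prepayment rates, discount
# at the loan's effective rate. CECL loss = amortized cost - PV.

def dcf_expected_loss(balance, annual_rate, months,
                      monthly_default_rate, loss_severity,
                      monthly_prepay_rate):
    r = annual_rate / 12
    bal = balance            # expected performing balance
    pv = 0.0
    for m in range(1, months + 1):
        if bal <= 0:
            break
        defaulted = bal * monthly_default_rate
        recovery = defaulted * (1 - loss_severity)
        bal -= defaulted
        # re-amortize the surviving balance over the remaining term
        payment = bal * r / (1 - (1 + r) ** -(months - m + 1))
        interest = bal * r
        bal -= (payment - interest)
        prepaid = bal * monthly_prepay_rate
        bal -= prepaid
        cash = recovery + payment + prepaid
        pv += cash / (1 + r) ** m
    return balance - pv

# $100K, 6% loan over 60 months, with invented default/prepay speeds:
loss = dcf_expected_loss(100_000, 0.06, 60, 0.002, 0.40, 0.01)
```

A useful sanity check on this construction: with zero defaults and zero prepayments, the present value of the scheduled payments discounted at the loan's own rate equals the balance, so the computed loss is zero.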

Many banks may also opt to use different methodologies for different loan types. “If a particular pool is not material—say, a bank doesn’t make many auto loans—then they might not want to spend the money on a complex methodology but instead use a simpler methodology for that pool,” Schwantes explains.

Data decisions

After determining the type of methodology they’ll use, community banks then need to assess the type of data they’ll need. Basic data for all methodologies include the loan trial balance and loan-specific information, such as origination dates, origination balance, charge-off and/or recovery dates and amounts.

If further information is required for the model, Schwantes advises banks to involve their loan processing staff to determine what kinds of additional data they might have to start collecting at the time of origination, as well as throughout the life of the loan.

“For example, banks might want to start collecting data on the past-due status for each loan, including whether the loan is past due for 30 days, 60 days or 90 days,” Schwantes says. “The bank might also want to track how often each loan is past due and then develop a process to make sure the system is updated regularly.”

Bruce Richter, a CPA, CMA, CIA, CFE and senior manager at Eide Bailly LLP in Mankato, Minn., recommends community banks consider a “vintage analysis” model. This historical loss approach analyzes loans grouped by “vintage,” meaning the year they were originated.

“The vintage approach tracks a loan throughout its life cycle, and the data needed is going to be more readily available for most community banks,” Richter says. “Looking at a group of loans originated in a certain year, banks should ask themselves what kinds of losses they had for that particular vintage.

“Then they can put that in context with the economic environment that existed during the life cycle of that vintage period.”
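The vintage analysis Richter describes can be sketched as a loss rate per origination year, with a seasoned-vintage average applied to newer cohorts. The balances, charge-offs and the $20 million 2017 vintage below are invented, and the simple average stands in for whatever weighting and forward-looking adjustment a bank would actually document.

```python
# Hedged sketch of a vintage analysis: cumulative loss rate per
# origination year, with the seasoned average applied to an
# unseasoned cohort. All figures are illustrative inventions.

def vintage_loss_rates(vintages):
    """vintages: {year: (originated_balance, cumulative_charge_offs)}"""
    return {year: chargeoffs / originated
            for year, (originated, chargeoffs) in vintages.items()}

history = {
    2012: (10_000_000, 450_000),   # recession-era vintage: 4.5% lifetime loss
    2013: (12_000_000, 300_000),   # 2.5%
    2014: (15_000_000, 225_000),   # 1.5%
}
rates = vintage_loss_rates(history)
avg_rate = sum(rates.values()) / len(rates)

# Apply the seasoned average (before any forward-looking adjustment)
# to an unseasoned 2017 vintage of $20M:
reserve_2017 = 20_000_000 * avg_rate   # approx. 566,667
```

The per-vintage rates make the economic context visible: the 2012 cohort carries recession-era losses, which is exactly the kind of pattern management would weigh before settling on the rate applied to newer vintages.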

Next, banks need to incorporate forward-looking data to help determine future expected losses, Richter says. For example, on a group of ag loans, banks might choose to include price data on futures contracts on the Chicago Board of Trade. They might also take into account yield data due to better weather in certain areas, as higher yields could more than offset lower prices, leading to lower losses.

Finally, banks should divide their loan portfolios into various risk characteristics and then determine the economic factors that caused the losses for each group of loans. Was it higher unemployment rates?

“Banks should do some dry runs ahead of the compliance deadline to see if they have to find better data sources.”
—Bruce Richter,
Eide Bailly

Lower gross domestic product? Local economic factors should take precedence, as Midwest banks with large ag portfolios fared much better during the recession than many banks in other regions, Richter says. Banks may decide they want to capture additional data to more accurately forecast expected losses, which may require changing what type of information they gather at loan origination and throughout a loan’s life cycle, Richter says.

His advice? Do some early practice runs. “It’s going to take some fine-tuning, and so banks should do some dry runs ahead of the compliance deadline to see if they have to find better data sources,” he says. “But there is a certain amount of flexibility out there, and this is supposed to be scalable so smaller institutions don’t have to buy expensive models or incur time-consuming methodologies.”

Katie Kuehner-Hebert is a writer in California.
