Business World

The science of IFRS 9 and the art of Basel
Use of parametric thinking in provisioning

CHRISTIAN G. LAURON is a Partner of SGV & Co.

(Last of three parts)

IFRS 9 is an International Financial Reporting Standard (IFRS) promulgated by the International Accounting Standards Board on July 24, 2014. It addresses the accounting for financial instruments and features three main topics: classification and measurement of financial instruments; impairment of financial assets; and hedge accounting. It became effective on Jan. 1, 2018 and has replaced International Accounting Standard (IAS) 39 Financial Instruments: Recognition and Measurement and all previous versions of IFRS 9. In this article, IFRS 9 is referred to as a “science” because of its systematically organized body of information and measurements on specific topics.

Basel III (or the Third Basel Accord or Basel Standards) is a global, voluntary regulatory capital and liquidity framework agreed upon by the members of the Basel Committee on Banking Supervision (BCBS) in 2010–11. It was scheduled to be introduced from 2013 until 2015; however, the implementation has been extended to March 31, 2019. Another round of changes was agreed upon in 2016 and 2017 (informally referred to as Basel IV), and the BCBS is proposing a nine-year implementation timetable, with a “phase-in” period to commence in 2022 and full implementation expected by 2027. Basel III was developed in response to the deficiencies in financial regulation that came to light after the financial crisis of 2007–08. It is intended to strengthen banks’ capital requirements, liquidity, maturity profile, and leverage. It also introduced macroprudential elements and capital buffers designed to improve the banking sector’s ability to absorb shocks from financial and economic stress and reduce spillover effects from the financial sector to the real economy. Basel is referred to here as an “art” because fully comprehending its dynamic processes and uncertainties calls for skillful planning and creative visualization.

The timing of BSP Circular 989 and the adoption of IFRS 9 will be of equal interest to regulators as well as boards and management, who are keen on understanding the impact on financial institutions’ (FIs) loss-absorbing capacity under stressful conditions and the implications for macroprudential policy on one hand, and the strengthening of strategic plans on the other. FIs are expected to deal with expected credit loss (ECL) and time series data sets and calculation templates at granular and portfolio levels and draw upon multiple scenarios using their own expanded methodologies. They will need to achieve clarity on which would be considered base-case and stress scenarios, in order to help establish the range that would feed into the overlay mechanism of IFRS 9. When this development happens, the top-down and bottom-up approaches to adjusting the Probability of Default (PD) for the overlay mechanism will become manifest in the coming months, so it is helpful to understand these two simultaneous processes that may converge (with corporate and institutional exposures in mind).
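
To make the multiple-scenario exercise concrete, here is a minimal sketch of a probability-weighted ECL drawn from a base case, an upside and a downside. The scenario PDs, weights, loss given default (LGD) and exposure at default (EAD) are assumed figures for illustration, not prescribed inputs.

```python
# A minimal sketch of drawing upon multiple scenarios for the ECL calculation.
# All scenario PDs, weights, LGD and EAD below are illustrative assumptions.
scenarios = {
    # name: (scenario-conditioned 12-month PD, probability weight)
    "base":     (0.020, 0.60),
    "upside":   (0.012, 0.15),
    "downside": (0.045, 0.25),
}
lgd, ead = 0.40, 1_000_000  # assumed loss given default and exposure at default

# Probability-weighted ECL across the scenario range
weighted_ecl = sum(pd_ * lgd * ead * weight for pd_, weight in scenarios.values())
print(f"Probability-weighted ECL: {weighted_ecl:,.0f}")  # -> 10,020
```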

Under the bottom-up approach, the PD is determined from the base credit risk model that accounts for idiosyncratic properties before it is adjusted for industry-level factors. The final adjustment is the overlay of the macroeconomic scenarios. Methodology-wise, this process involves recalibrating the rating or scoring PD models to incorporate macroeconomic factors. In practice, and for communication purposes, it would be helpful to distinguish the base PD from the corresponding overlay adjustment, which could be illustrated as a scalar or multiplier of 1 to 1.2 given an intense view of the economy.
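
As a rough illustration of that base-PD-plus-overlay split, the sketch below applies a single macroeconomic multiplier within the 1-to-1.2 range to a vector of base PDs. The function name, the cap at 100% and the sample figures are assumptions for illustration only.

```python
import numpy as np

# Illustrative only: the multiplier of 1.15 and the sample PDs are assumed values.

def apply_overlay(base_pd: np.ndarray, macro_multiplier: float) -> np.ndarray:
    """Scale base (idiosyncratic) PDs by a macroeconomic overlay, capped at 100%."""
    return np.minimum(base_pd * macro_multiplier, 1.0)

base_pd = np.array([0.010, 0.025, 0.080])   # PDs from the recalibrated rating/scoring model
adjusted_pd = apply_overlay(base_pd, 1.15)  # a scalar within the 1-to-1.2 range cited above
print(adjusted_pd)                          # adjusted PDs: 0.0115, 0.02875, 0.092
```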

On the other hand, the top-down approach is influenced by macroeconomic modeling that may involve auto-regression and would use a combination of an underlying Basel PD model and a portfolio model associated with stress testing. This Basel PD model produces a long-run or through-the-cycle PD that requires scaling, such that the portfolio average PD matches the predicted PD from the stress testing model. Forward-looking macroeconomic factors are applied in this exercise, with a scalar derived through optimization when linking the two models. In addition to regression, single-factor models and credit index approaches may also be employed for top-down approaches.
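
The scalar-fitting step can be read as a small root-finding exercise: find the multiplier that makes the exposure-weighted average of the scaled through-the-cycle PDs match the portfolio PD predicted by the stress-testing model. In the sketch below, the PDs, exposure weights, target portfolio PD and the cap at 100% are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative derivation of a top-down scalar; all figures are assumed values.
ttc_pd = np.array([0.01, 0.02, 0.05, 0.10])   # long-run PDs from the underlying Basel model
weights = np.array([0.4, 0.3, 0.2, 0.1])      # exposure weights for the portfolio average
target_pd = 0.045                             # portfolio PD predicted by the stress-testing model

def portfolio_pd(scalar: float) -> float:
    """Exposure-weighted average PD after scaling, with each PD capped at 100%."""
    return float(np.sum(weights * np.minimum(scalar * ttc_pd, 1.0)))

# Solve for the scalar that links the two models: scaled portfolio PD = stressed prediction.
scalar = brentq(lambda s: portfolio_pd(s) - target_pd, 0.1, 20.0)
print(round(scalar, 3), round(portfolio_pd(scalar), 4))  # -> 1.5 0.045
```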

Currently, we are observing more bottom-up approaches being employed by the industry as it improves its base credit risk models and the relevant industry factors given the segment of the exposure. With the introduction of BSP Circular 989, we would expect top-down approaches to be revisited.

At some point within the next 12 to 15 months, we would expect a ‘VaR to VAR’ methodology connection between IFRS 9 and stress testing. From Value at Risk models to Vector Autoregressive models and back, this development could usher in the second generation of overlay and stress testing models that would allow economic forecasting (and potentially reduce the probability-weighting exercise to a sense check rather than the main input) and incorporate lifetime and transition criteria.

Regardless of the advancement to be implemented by financial institutions, there are helpful operational guidelines to be noted. The first point is that the exercise could result in an unintended front-loading of losses, leading to capital erosion. The second is for any stress testing methodology and overlay mechanisms to be connected to internal risk management, notwithstanding the regulatory floors that may be imposed for capital adequacy purposes. The final point is for the Board to be directly involved in the identification and evaluation of stress scenarios, the stress test interrelationship map, and oversight of the macroeconomic projections and their linkage to the institution’s resilience plan. Readers may refer to our July 26, 2010 article in this column, “Stress Testing as a governance tool,” for more guidance.

While the overlay mechanism prepares us for the foreseeing function and expansive view, let’s not lose sight of the tightening of the data, systems and processes within the base ECL model. In particular, for the PD determination for corporate and institutional exposures, we are recommending the following, which should be viewed as loops and iterations rather than as a set of finite linear steps:

1. Segmentation process — covering the traditional data processing and management, risk profiles and internal risk rating system, with a subset for emerging and unstructured data capture assessments, “Big Data Small Data” initiatives, and clustering of observed attributes and properties.

2. Credit evaluation — covering mainly the financial condition, industry assessment and outlook, and management quality of the corporate and institutional customers.

3. Assessment of factors and variables — covering both single-factor and multifactor analysis, variable binning, and other approaches used to ascertain the relationship between the data points and the intuition and judgment of experts for the model selection step.

4. Model selection — covering model runs that will result in candidate integrated models (composed of main and sub-models); these models initially start with an optimization algorithm (a set of instructions) and eventually “learn” over time; it may take another two to three learning rounds over the next 12 to 15 months to help stabilize the PD models.

5. PD transformation — covering the derivation of the through-the-cycle and point-in-time PD from the models chosen (a brief sketch follows this list).

6. Portfolio analytics — assessment of the results against the internal policies and portfolio management, which then feeds back to the segmentation process.
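
On step 5, one widely used (though by no means mandated) device for moving between through-the-cycle and point-in-time PDs is a Vasicek-style one-factor conditioning on a standardized economic state. In the sketch below, the asset correlation rho, the economic-state values z and the sample PDs are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative Vasicek-style one-factor conversion between through-the-cycle (TTC)
# and point-in-time (PIT) PDs. rho and z are assumed values, not calibrated inputs.

def pit_pd(ttc_pd: np.ndarray, z: float, rho: float = 0.12) -> np.ndarray:
    """PD conditional on a standardized systematic factor z (negative z = downturn)."""
    return norm.cdf((norm.ppf(ttc_pd) - np.sqrt(rho) * z) / np.sqrt(1.0 - rho))

ttc = np.array([0.01, 0.03, 0.08])
print(pit_pd(ttc, z=-1.5))   # downturn: point-in-time PDs rise above the long-run levels
print(pit_pd(ttc, z=+1.0))   # benign conditions: point-in-time PDs fall below them
```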

This “future-proofing” recommendation will help FIs transition from parametric thinking to the rise of coding drivers — specifically on the adoption of machine learning while monitoring any progress on artificial intelligence for risk management and provisioning calculations.

At this point, the emerging parametric thinking underlying the ECL calculation has established the boundaries of PD and Loss Given Default (LGD) to reflect both idiosyncratic properties and, to a certain extent — through the overlay mechanism — the systematic risk that the obligor, or broadly the portfolio, is exposed to. But what is this systematic risk factor? Are we still talking about the generic market or financial economy? Or should this now be expanded to include “funding the real economy” discussions?

The connection between Basel and IFRS 9 has been limited so far to excessive concentration, contagion and spillover risks. What has not been covered are the network and transmission risks that arise from stagnation. In an upcoming article, we will examine its application to the areas that have the strongest potential to break inertia and have an impact on the economy — agriculture and infrastructure.

This article is for general information only and is not a substitute for professional advice where the facts and circumstances warrant. The views and opinions expressed above are those of the author and do not necessarily represent the views of SGV & Co.
