Data Architecture – What lies under the hood of loan loss reporting

 

[Figure: Data architecture for ALLL (loan loss provisioning) reporting]

What could be the best data architecture for a loan loss provisioning reporting system?

Keeping the data unbiased is the first principle. In the loan loss reporting world, this has three implications: first, source a rich set of attributes without restricting oneself to only the current scope of reporting; second, acquire granular data, going down to transaction level where data volumes permit; and third, store historical data. This ensures that loan loss reporting is future-proofed at least to some extent, and that we are equipped to meet business users' demands for more insights from the data, furnish drilldowns, and so on.

Particularly important in the loan loss reporting scenario is enabling users to interact with the data effortlessly. This means allowing them to perform data corrections, enrichments and validations as self-sufficiently as possible. To support this, data needs to be presented in banker-friendly language, and most, if not all, data management and transformation functionality should be UI-driven.

Flexibility needs to be provided in terms of parameterization at various points, from identification of non-performing behaviour in an asset account, to definition of account pools, to specification of provisioning levels. To achieve this, the final processed metrics need to be decoupled from the data, so that rules can be changed and managed directly by users to keep pace with business and regulatory considerations without impacting the underlying data layer. Lastly, traceability and auditability need to be maintained as data flows through the various stages of transformation, user validation and correction.
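To make the decoupling concrete, here is a minimal sketch of provisioning parameters held as user-editable configuration rather than hard-wired into the data layer. All pool names, day thresholds and rates below are hypothetical illustrations, not any regulator's or vendor's actual values.

```python
# Illustrative sketch: provisioning rules as configuration, decoupled
# from account data. Thresholds, pool names and rates are hypothetical.

NPA_DPD_THRESHOLD = 90  # days past due beyond which an account is non-performing

# Pool definitions and provisioning levels a business user could maintain via a UI
PROVISION_RULES = [
    # (pool name, predicate on account, provision rate)
    ("standard",    lambda a: a["dpd"] <= NPA_DPD_THRESHOLD,       0.004),
    ("substandard", lambda a: NPA_DPD_THRESHOLD < a["dpd"] <= 455, 0.15),
    ("doubtful",    lambda a: 455 < a["dpd"] <= 820,               0.40),
    ("loss",        lambda a: a["dpd"] > 820,                      1.00),
]

def classify_and_provision(account):
    """Return (pool, provision amount); rule changes never touch the data layer."""
    for pool, predicate, rate in PROVISION_RULES:
        if predicate(account):
            return pool, account["outstanding"] * rate
    raise ValueError("no rule matched")  # surfaced to users for correction

# A regulatory change edits PROVISION_RULES only
pool, provision = classify_and_provision({"dpd": 120, "outstanding": 1_000_000})
```

The point of the sketch is the shape, not the numbers: when a provisioning norm changes, only the rules table is re-parameterized, while the acquired and stored data remains untouched and auditable.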

Hence the bank needs to choose an approach that not only automates reporting but also provides the added capabilities required to manage regulatory inspections seamlessly while keeping pace with changing regulations.

 

 

Ensuring Accuracy and Integrity in Loan Loss reporting – first time and every time!

Creating good quality data is akin to the effort required to get and stay healthy: it demands both intense short-term initiatives and long-term investments in process, technology and people. The importance of good quality data can't be overstated when it comes to computing and reporting loan loss provisions and reserves. Let's analyse this from the standpoint of the multiple dimensions of data quality:

 

[Figure: Accuracy & Integrity – dimensions of data quality]

One of the key steps in maintaining data quality is to create a separate mart for asset quality reporting and tightly control the quality of data (across all the dimensions detailed above) flowing into it. In the long term, however, it requires clear ownership, and metrics and checks built into each stage of the data journey.
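As a flavour of what such stage-gate checks might look like, here is a minimal sketch of completeness, validity and reconciliation validations gating data into the mart. Field names, rules and the tolerance are hypothetical, chosen only to illustrate the pattern.

```python
# Illustrative sketch of stage-gate data quality checks for an asset
# quality mart. Field names, rules and tolerance are hypothetical.

def check_completeness(loans):
    """Every loan must carry the attributes reporting depends on."""
    required = {"loan_id", "outstanding", "dpd", "origination_date"}
    return [l["loan_id"] for l in loans if required - l.keys()]

def check_validity(loans):
    """Values must be plausible: no negative balances, dpd within range."""
    return [l["loan_id"] for l in loans
            if l.get("outstanding", 0) < 0 or not (0 <= l.get("dpd", 0) <= 3650)]

def check_reconciliation(loans, gl_total, tolerance=1.0):
    """Mart totals must tie back to the general ledger within tolerance."""
    return abs(sum(l["outstanding"] for l in loans) - gl_total) <= tolerance

loans = [{"loan_id": "L1", "outstanding": 500.0, "dpd": 10,
          "origination_date": "2015-01-01"}]
assert not check_completeness(loans)   # no loans missing required fields
assert not check_validity(loans)       # no implausible values
assert check_reconciliation(loans, gl_total=500.0)
```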

 

Comprehensiveness – Start with Loan Loss reporting and extend the same data investments to do more

In the loan loss reporting scenario, buying for the future definitely means investing in a solution that is CECL-ready. Having said that, this mandate comes in the wake of the BCBS 239 guidelines and leads to an overall rethink at banks about the way risk and regulatory reporting in general is done.

Over the years, burgeoning compliance requirements have been accompanied by a proliferation of systems across the entire data management value chain: from data storage and integration tools, to banking data models and reporting tools, to specific regulatory reporting solutions. The result is an architecture that looks something like this:

[Figure: Proliferation of systems across the data management value chain]

Given this scenario, as banks make investments to get CECL-ready, CXOs are evaluating the extensibility and reusability of the data and reporting infrastructure they are creating. Hence, a solution that not only meets loan loss related reporting requirements but also enables reuse of that data for other compliance mandates, like CCAR or liquidity reporting, is of tremendous value.

This not only brings cost and time efficiencies but also simplifies user adoption, as users don't have to "learn" yet another solution to meet the next compliance mandate. In addition, reconciliation overheads are minimized because there is a single source of data, and as the bank's system and business landscape changes over the long term, maintenance becomes simpler.

 

Regulatory Watch – Asset Quality and Liquidity monitoring: a global phenomenon

During the global financial crisis, there was significant turmoil in financial systems worldwide, leading to the insolvency of a number of banks in several countries, which either failed or received taxpayer-funded bailouts. The shocks of the meltdown were felt across the globe, and regulatory changes were initiated at both global and local levels. While the global rules are set internationally, it is the local regulators who enforce them, with tweaks to suit local markets.

Post the financial crisis, the global banking rules can be divided into four broad themes: liquidity, risk data, capital and provisioning. New liquidity requirements (LCR and NSFR) have been put in place for banks to meet financial commitments during times of stress. BCBS 239 focusses on the importance of risk data aggregation and reporting. Basel III mandates an increase in both the quality and quantity of capital available to absorb losses. And finally, IFRS 9, published by the IASB, changes the way banks calculate provisions, from an incurred loss model to an expected loss model.
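The contrast between the two models can be stated compactly using the standard textbook formulation of expected credit loss (shown here for illustration, not taken from any specific regulation):

```latex
% Standard discounted expected credit loss, for illustration:
% PD_t  = marginal probability of default in period t
% LGD_t = loss given default
% EAD_t = exposure at default
% DF_t  = discount factor
\mathrm{ECL} = \sum_{t=1}^{T} PD_t \cdot LGD_t \cdot EAD_t \cdot DF_t
```

A 12-month ECL truncates T at one year, while lifetime ECL runs T to maturity. Under the incurred loss model, by contrast, no allowance is raised until a loss event has actually occurred, regardless of these forward-looking quantities.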

Drawing parallels in the Indian context: while the Liquidity Risk Monitoring guidelines were published by the Basel Committee in 2012-13, RBI published a lightened version for liquidity monitoring in 2014 and suggested a staggered implementation timeline (60% in 2015, increasing to 100% in 2019). In Singapore, the Monetary Authority of Singapore published a list of D-SIBs (domestic systemically important banks), assessed to have a significant impact on the stability of the financial system and the proper functioning of the broader economy, which have to comply with guidelines similar to BCBS 239. Basel III has been accepted globally, with changes that are either more stringent or lighter depending on geography: international standards require banks to maintain at least 4.5% CET1, whereas in India the minimum is 5.5%. As can be seen, a global guideline sooner or later transmits to country-specific levels, albeit with modifications to suit market intricacies. Moreover, RBI recently allowed banks to include certain items, such as property revaluation reserves and foreign currency translation reserves, to beef up their Tier-I capital base.

To tackle the "too little, too late" issue with the existing incurred loss model, the IASB published the final version of IFRS 9 in 2014. Soon after, the Basel Committee published its guidance on expected credit losses as "Guidance on Credit Risk and Accounting for Expected Credit Losses". It was always understood across the banking industry that provisioning based on incurred losses cannot create sufficient reserves for times of stress, which resulted in balance sheet and capital management issues, negative provisioning, low dividend pay-outs and more. FASB, in the US, deliberated the issue and came up with its own set of guidelines, known as CECL (current expected credit losses), a variant of IFRS 9 with certain modifications. Recently, the MCA (Ministry of Corporate Affairs) in India published Ind AS 109, relating to the reporting of financial assets and financial liabilities, with a section on expected credit losses. The report of the working group on the implementation of Ind AS by banks in India has proposed the IFRS 9 three-bucket expected credit loss model for Indian banks; however, the timelines for implementation are yet to be confirmed by RBI.

As can be seen, all these themes were introduced at the global level and have been inherited in some form or other by local regulators. In this dynamic regulatory environment, it is therefore imperative for banks not only to keep a watch on local developments, but also to stay up to date with what's happening globally.

 

Adaptability – Meet current and future regulatory requirements around CECL

The IASB and FASB have already published guidelines around expected loss modelling for reserve calculations. A working paper on ECL has also been published in India, which might soon become a guideline. It is expected that banks will be concerned about the new "expected loss" model, but they need to realize that building these complex models is not the devil; data is!

As mentioned by Jeffrey Geer, OCC Deputy Chief Accountant, banks are encouraged to consider their data collection needs; he added that "vintage," or historical, loan data may become much more important in adhering to the new accounting standard.

To transition successfully to the new accounting standards, banks must focus, at a minimum, on three aspects of data:

I. Data elements: A sample set of the data elements a financial institution might require can be seen in the attached diagram. Again, this is a laundry list; much of the information depends on the decisions the institution takes in terms of methodology (PD-LGD, cohort analysis or vintage analysis). For example, if a bank believes that the prepayment impact on pools is low, then transaction data is not mandatory. But a few data elements that do become mandatory in the ECL world are risk grades at loan level, origination and maturity dates, and in-depth information on charge-offs, recoveries and TDRs.

II. Data history: The other aspect that has been misinterpreted is the history of information: 30 years of loan history is not mandatory. Let's take an example to substantiate this. For a long-term loan such as residential real estate, once pre-payments are taken into consideration, the average maturity comes down to 7-10 years. So, for a vintage analysis on residential real estate, the historical data required would not be more than 7-10 years (see the sketch after this list). Most core banking systems archive data older than a year, so banks need to start talking to their core vendors about how this historical data can be sourced back for analysis.

III. External data: For forecasting losses under ECL, banks might be required to forecast economic conditions. If enough internal loss data is unavailable, peer data might be required to calculate loss rates. FIs have to decide on a strategy: whether to subscribe to such data from an external vendor or to create an in-house external data mart.
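To illustrate why 7-10 years of history can suffice, here is a minimal vintage-analysis sketch. The data shape and numbers are hypothetical; a real implementation would also segment by product, risk grade and other pool attributes.

```python
# Illustrative vintage analysis sketch. Cohorts and figures are
# hypothetical examples, not real loss experience.

# (origination_year, originated_balance, cumulative_charge_offs)
vintages = [
    (2008, 10_000_000, 350_000),
    (2009, 12_000_000, 300_000),
    (2010, 11_000_000, 275_000),
]

def average_lifetime_loss_rate(vintages):
    """Average cumulative loss rate across fully-seasoned cohorts."""
    rates = [chargeoffs / balance for _, balance, chargeoffs in vintages]
    return sum(rates) / len(rates)

# Apply the historical average to a current, still-open pool
expected_loss = average_lifetime_loss_rate(vintages) * 9_500_000
```

Because each cohort only needs to be observed over its effective life (7-10 years here, after prepayments), the history requirement is bounded by the seasoning horizon, not by decades of archives.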

Banks have three options to adapt to the changes: (a) develop the capability in-house, (b) buy a vendor solution, or (c) outsource the calculation to a third party. Rather than making a panic decision, banks should analyse the pros and cons of each option. A transition to ECL will require a complete re-jig of the data available for analysis.

Can this data be re-used for current strategic/regulatory requirements and can the same data be used for future purposes?

Does the bank already have a robust data ecosystem and lack only the models required for ECL?

These are a few of the questions banks need to answer before making a decision. To summarize: there will be a lot of noise around statistical models and around shifting to the new ECL models as early as possible. Banks should filter out this noise and focus on preparing the required data elements. ECL is not about the new calculation methodologies; it is about how banks prepare themselves for this transition and leverage the same data ecosystem for future requirements.

[Figure: Adaptability – sample data elements for CECL/ECL]

 

Agile Data – Extending Data Management principles to Loan Loss Provisioning and Reporting

Even as the global banking regulatory landscape witnesses rapid evolution, regional regulators have been diligently dispensing some of the most critical compliance and risk regulations. Though there has been considerable opposition from the financial services industry, most regulations have been implemented in some fashion or the other. The most important lessons learnt from the financial crisis were: (i) the accounting model for impairment waited for a loss to be incurred before requiring a loss allowance, and was criticised as a "too little, too late" approach; (ii) banks had woefully inadequate data and systems to manage and report risk. These issues, especially during turbulent times, dramatically increased the chances of failure.

To address these shortcomings, two colossal regulatory changes were introduced: BCBS 239 and IFRS 9. BCBS 239 has been a global phenomenon and a focus area for banks for three years now. It was introduced primarily to enhance risk data aggregation and reporting in banks, focussing on four areas: overarching governance and infrastructure, risk data aggregation capabilities, risk reporting practices, and supervisory review tools. The regulation requires that banks get their risk data in order and move from a fragmented business-level/entity-level/system-level approach to consolidated aggregation and reporting. Today, most G-SIBs (Global Systemically Important Banks) and D-SIBs (Domestic SIBs) are in some way or other working towards BCBS 239 compliance. IFRS 9, introduced by the IASB, was on the other hand adopted first in other geographies, and the issue was later deliberated in the US, where FASB introduced the new expected credit loss (ECL) guidelines.

 

Though one of these regulations focusses on enterprise risk management, with a deep dive into risk data aggregation, and the other pertains to one of the most important financial calculations, directly impacting the P&L, a striking resemblance can be seen: both emphasise the quality and gamut of data required for calculation and reporting. Data, as they say, is the hidden culprit, and most often banks don't realize this until it's too late; these all-encompassing regulations will force banks to look at data differently. While for BCBS 239 the fundamentals talk about risk data, for ECL (or CECL) a deeper analysis is required to understand the importance of data.

Bankers understand the issues involved in building new models to switch from a backward-looking to a forward-looking approach to accounting, but the increase in the scope of data collection (both internal and external) is still not fully accepted. In order to model future market events, firms will need large volumes of data (internal data such as charge-offs and loan detail, and external data such as macroeconomic information) that have not previously been required.

With the focus on data increasing due to the above-mentioned guidelines, banks need to build a strategy around data acquisition, data standardization and data usage. Moreover, co-ordination between the various departments involved will require clear process definition. The proverbial 'need of the hour', therefore, is foresight. With the ECL (CECL) regime on the horizon, now is the ideal time to get the fundamentals right and prepare for a wave of new regulations.

 

 

Fintellix Experience

One of the leading international banks of French origin has leveraged the Fintellix regulatory reporting platform and data model to automate regulatory reporting across multiple geographies. India, Korea, Hong Kong, Australia and Vietnam are regions where automation work has been completed or is in advanced stages, while other regions, like Japan, are on the roadmap. Apart from central bank reporting, liquidity reporting has also been automated on the Fintellix platform. In the process of automation, several specific challenges faced by the bank have been effectively addressed: collaboration among multiple users (departments) on the submission of a single return, and the ability to maintain data lineage, preserve traceability and integrate, capabilities not readily available in legacy systems.


Significant benefits in terms of cost and time economies have been realised by the bank, owing to the standardization of data and processes and a consistent, enhanced user experience.

 

Global Watch – Connected Regulators

Increasingly, multiple platforms are available for global regulators to stay connected and track each other's work. This has led to a scenario where successful compliance practices adopted in one geography rapidly spread to others.

There are multiple examples, like XBRL-based submission of returns, which was adopted by multiple regulatory bodies in the US and then spread to Europe, India, and subsequently Indonesia and Mauritius. Today, taxonomy-based submission is becoming standard practice across the globe, adopted not just by banking regulators but also by capital market regulators, industry bodies and others.
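As a flavour of what taxonomy-based submission involves, here is a minimal sketch of an XBRL instance document built with Python's standard library. The taxonomy namespace, entity scheme, concept name and values are entirely hypothetical, not from any real regulator's taxonomy.

```python
# Illustrative sketch: a tiny XBRL instance document. Namespace,
# concept names and values are hypothetical.
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
BANK = "http://example.org/taxonomy/bank"  # hypothetical taxonomy
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("bank", BANK)

root = ET.Element(f"{{{XBRLI}}}xbrl")
root.set("xmlns:iso4217", "http://www.xbrl.org/2003/iso4217")

# Context: which entity and which period the facts belong to
ctx = ET.SubElement(root, f"{{{XBRLI}}}context", id="c1")
entity = ET.SubElement(ctx, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://example.org/banks").text = "DEMOBANK"
period = ET.SubElement(ctx, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}instant").text = "2016-03-31"

# Unit for monetary facts
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="u1")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:INR"

# A single reported fact tagged against the (hypothetical) taxonomy
fact = ET.SubElement(root, f"{{{BANK}}}GrossNonPerformingAssets",
                     contextRef="c1", unitRef="u1", decimals="0")
fact.text = "125000000"

print(ET.tostring(root, encoding="unicode"))
```

The value of the approach is that every reported number is a machine-readable fact tied to a concept, context and unit, which is what lets a regulator validate and compare submissions automatically.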

Another manifestation of this is seen in the BCBS 239 guidelines, where Principle 14 talks about home/host cooperation: essentially, the exchange and sharing of information among regulators globally. In summary, demand for granular data, agility (the ability to respond to change), traceability and standardization seem to be the key regulatory themes today.

While on the one hand this translates to more regulatory requirements, on the other hand it makes it easier for banks to future-proof themselves, as they can anticipate what lies ahead by tracking international best practices.

 

Adaptability – Manage change, whether internally or externally initiated

Change – this is probably the most dreaded word for Chief Compliance and Risk Officers in banks today. A change in a guideline almost always has wider ramifications, spilling over into the domains of the COO, CFO and CIO in the bank.

The question, then, is: is there a way to build for change? Also, not all change is externally (regulator) induced; some of it can be the by-product of internal policy changes, product launches, system enhancements and so on. So how can one prepare for this?

In our opinion, it is possible to build to manage change better, and this is precisely what Principle 6 of BCBS 239 says. For this, the data aggregation process needs to be decoupled from the way data is acquired and stored. Consequently, a logic or definition change will not demand much more than a data classification or business rule change.
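A minimal sketch of this decoupling (product codes and report lines are hypothetical): the aggregation layer reads from a user-maintained mapping, so a reporting definition change edits the mapping, not the acquisition or storage layers.

```python
# Illustrative sketch: aggregation driven by a user-maintained
# classification map. Product codes and report lines are hypothetical.

# The only thing that changes when a definition changes
REPORT_LINE_MAP = {
    "HL01": "Retail - Housing",
    "PL02": "Retail - Personal",
    "CC03": "Retail - Cards",
    "TL04": "Corporate - Term Loans",
}

def aggregate(exposures):
    """Roll raw exposures up to report lines using the mapping alone."""
    totals = {}
    for product_code, amount in exposures:
        line = REPORT_LINE_MAP.get(product_code, "Unmapped")  # flagged for review
        totals[line] = totals.get(line, 0) + amount
    return totals

# A regulator reclassifying cards under a new line is a one-entry edit:
# REPORT_LINE_MAP["CC03"] = "Retail - Unsecured"
print(aggregate([("HL01", 100.0), ("CC03", 40.0), ("CC03", 10.0)]))
```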

[Figure: Degrees of separation between data owners and end consumers]

 

Similarly, the reporting infrastructure needs to be equipped with a versatile report writing tool that not only helps create new data visualizations to meet user requirements but also reduces the degrees of separation between data owners and end consumers.

Hence, investing in a solution that is designed to handle change does make change management faster and more cost effective for banks.

 

 

Comprehensiveness – One platform, multiple uses

How do I buy for the future? This is probably one of the most pressing questions as Compliance and Risk Officers work together with technology teams to make investments to meet regulatory needs.

One way to answer it is to lift the hood, dig a little deeper and understand the commonalities in features, functionality and data coverage across compliance reporting requirements. The question then becomes: can a common investment be made to meet multiple compliance requirements? If so, what kind of investment should that be?

Some insights can be derived from the BCBS 239 guidelines, which emphasize that once we get the data right, reporting is simplified. Essentially, this implies investing in a data hub and a data management platform that simplify access to the data.

[Figure: Comprehensiveness – a common data platform serving multiple compliance requirements]

As compliance requirements evolve, with Basel II making way for Basel III and liquidity reporting and new expected loss based provisioning norms being introduced, a data management platform can provide the right base on which to build the necessary additional computation engines and generate the final reports. Hence, the time and monetary investment required to comply with each new guideline is minimized, and ongoing maintenance is simplified too.
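The reuse argument can be sketched in a few lines. Everything below is a hypothetical simplification: one shared data hub exposes curated data, and each new mandate adds only a computation engine on top of it.

```python
# Illustrative sketch: one data hub, multiple compliance engines.
# The hub contents and engine logic are hypothetical simplifications.

DATA_HUB = {
    "loans": [{"id": "L1", "outstanding": 100.0, "lifetime_ecl_rate": 0.05}],
    "hqla": 250.0,             # high-quality liquid assets
    "net_outflows_30d": 200.0, # stressed 30-day net cash outflows
}

def ecl_engine(hub):
    """Provisioning mandate: allowance from the shared loan data."""
    return sum(l["outstanding"] * l["lifetime_ecl_rate"] for l in hub["loans"])

def lcr_engine(hub):
    """Liquidity mandate: LCR ratio computed from the same hub."""
    return hub["hqla"] / hub["net_outflows_30d"]

# A new mandate means a new engine, not a new data infrastructure
print(f"ECL allowance: {ecl_engine(DATA_HUB):.2f}, LCR: {lcr_engine(DATA_HUB):.2%}")
```

The design choice this illustrates is that reconciliation largely disappears: both engines read the same curated facts, so their outputs tie back to a single source of data by construction.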

 

 