APRA EFS Modernisation

APRA EFS Modernisation: An overhaul of the Australian financial services industry’s regulatory reporting landscape

  • APRA released its EFS form reviews for public consultation earlier this year. The reviews represent major changes to the existing reporting forms.

  • The changes are driven by what APRA has cited as “significant gaps between current reporting practices” and an “evolving regulatory and risk environment”. Changes are also driven by data quality issues, which have been a major source of concern in the current reporting landscape.

There are two broad categories of proposed changes:

[Figure: the two broad categories of proposed changes]

The proposed changes are poised to greatly affect reporting entities across the entire value chain of the reporting landscape. Reporting entities will have to realign their people, process and technology ecosystems to a significant extent in order to comply with the new reporting regime.

To better understand the reporting changes, the existing gaps, the impact on reporting entities and the solutions to address these challenges, watch out for our series of white papers. In the next release, we will lay out current reporting trends in Australia, draw parallels between the proposed reporting regime and regulatory reporting environments in other geographies, and analyse the changes in detail.

 
Posted in Compliance

Fintellix Experience

A leading multinational bank has leveraged the Fintellix regulatory reporting platform and data model to automate multiple regulatory reporting outcomes, including LCR reporting.


The solution architecture, illustrated below, involves sourcing and storing common data in a “compliance data hub” while only the “differential” data is stored in solution-specific marts. In this manner, data acquisition efforts are reused and data reconciliation efforts are minimized. The platform remains standard across solutions, making it easy for users to “learn” features and use them effectively across solutions.

[Figure: solution architecture]
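
As a minimal sketch of this hub-and-mart pattern (all record structures and field names here are invented for illustration, not Fintellix's actual schema), common data is loaded once into the hub and each solution's mart carries only its differential attributes:

```python
# Illustrative sketch of a "compliance data hub" with solution-specific
# marts; all names and values are hypothetical.

# Common data, acquired once from source systems and shared by all solutions.
compliance_hub = [
    {"account_id": "A001", "balance": 1_200_000, "product": "TERM_DEPOSIT"},
    {"account_id": "A002", "balance": 450_000, "product": "CORP_LOAN"},
]

# Each solution's mart stores only its "differential" attributes,
# keyed back to the hub so reconciliation effort stays minimal.
lcr_mart = {"A001": {"hqla_class": "NON_HQLA", "runoff_rate": 0.10}}
capital_mart = {"A002": {"risk_weight": 1.00}}

def solution_view(hub, mart):
    """Join hub records with one solution's differential attributes."""
    return [{**row, **mart[row["account_id"]]}
            for row in hub if row["account_id"] in mart]

print(solution_view(compliance_hub, lcr_mart))
print(solution_view(compliance_hub, capital_mart))
```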

 
Posted in Compliance

Global Watch – Connected Regulators

Liquidity Management is one of the most closely monitored areas within a bank, given its impact on critical functions such as the market operations undertaken by the bank and its product pricing strategy, to name a few.

Historically, central banks have had their own metrics and tools to monitor liquidity. MAS in Singapore, for example, has the Minimum Liquid Assets (MLA) framework, which has its own classification of assets based on riskiness. Banks had to ensure that at least 24% of overall assets were liquid. The MLA number has to be computed every day and reported on a monthly basis. The Indian central bank, on the other hand, monitors inflows and outflows across different maturity buckets, thus closely tracking any mismatches. However, it was only after the 2008 financial crisis that a globally accepted standard for liquidity monitoring emerged, in the form of LCR/NSFR reporting accompanied by a stress testing framework.
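
A worked sketch of the kind of daily check described above, using the 24% figure cited in the text; the asset categories and amounts are invented for illustration:

```python
# Hypothetical daily MLA-style check: what share of overall assets is
# liquid? Figures and categories are invented; 0.24 is the threshold
# cited in the text above.

assets = {
    "cash": 500, "government_securities": 1_800,    # treated as liquid here
    "corporate_loans": 6_000, "fixed_assets": 700,  # treated as illiquid here
}
LIQUID_CATEGORIES = {"cash", "government_securities"}
THRESHOLD = 0.24

liquid = sum(v for k, v in assets.items() if k in LIQUID_CATEGORIES)
ratio = liquid / sum(assets.values())

# Computed daily; the month's daily ratios then feed the monthly return.
print(f"liquid ratio = {ratio:.1%}", "OK" if ratio >= THRESHOLD else "BREACH")
```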

While several central banks have come up with timelines for LCR/NSFR compliance, they have also created parallel liquidity monitoring tools and metrics, like the FR2052a in the US. Across the Middle East, Asia, Europe and the Americas, regulators have already published LCR/NSFR roadmaps largely culminating in 2019. In Singapore, the transition to the LCR/NSFR regime has been mandated only for D-SIBs. However, many banks are choosing to adopt LCR/NSFR reporting owing to its global acceptability.

 
Posted in Compliance

Adaptability – Manage change both internally triggered and externally initiated

If we take the example of FR2052a reporting, banks with over $700 bn in assets have already commenced daily reporting on inflows and outflows. For smaller banks, the reporting is expected to be monthly and will kick in from 2017.

Essentially, if we analyse the FR2052a data sets, banks need to report on inflows and outflows from different kinds of assets and liabilities. A comprehensive master data framework is enforced on the data sets that need to be submitted, with the intention of making it easy for the regulator to compare and process data received from multiple banks.

From a change management standpoint, banks need to have in place a robust, GUI-based master data management framework to maintain and manage the inclusion of new codes (regulator initiated) and modifications to mappings of existing codes (bank/regulator initiated), and then go ahead and generate the final datasets. Similarly, LCR reporting enforces the “HQLA” categorisation framework on the bank's assets, which might be subject to revisions. Thus, a critical part of a smart change management strategy for liquidity reporting is a robust master data management framework and data classification engine.
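
A minimal sketch of such a master data layer (the codes and structures are invented, not the actual regulator-prescribed list): new regulator codes can be registered and product mappings modified without touching the dataset generation logic.

```python
# Illustrative master data mapping: bank product codes -> regulator codes.
# Codes are invented; the real FR2052a code list is regulator-prescribed.

regulator_codes = {"I.A.1", "I.A.2"}        # grows as the regulator adds codes
product_to_code = {"RETAIL_CASA": "I.A.1"}  # maintained by the bank

def add_regulator_code(code):
    """Regulator-initiated change: a new reportable code."""
    regulator_codes.add(code)

def map_product(product, code):
    """Bank/regulator-initiated change: (re)map a product to a code."""
    if code not in regulator_codes:
        raise ValueError(f"{code} is not a recognised regulator code")
    product_to_code[product] = code

# Absorb a change, then generate the final dataset from mapped products.
add_regulator_code("I.A.3")
map_product("CORP_TERM_DEPOSIT", "I.A.3")
positions = [{"product": "RETAIL_CASA", "amount": 100},
             {"product": "CORP_TERM_DEPOSIT", "amount": 250}]
dataset = [{"code": product_to_code[p["product"]], "amount": p["amount"]}
           for p in positions]
print(dataset)
```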

Another important piece is capturing data and having a data model that can support an exhaustive list of attributes, to ensure all aspects around counterparty, nature of transaction, currency and product are covered. From a stress testing standpoint as well, having a model that enables setting up conditions on a variety of parameters is a necessity, as this is another area which is likely to evolve.
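
As a sketch of what setting up conditions on a variety of parameters might look like in practice; the run-off rates and categories are purely illustrative:

```python
# Hypothetical stress scenario: parameterised run-off assumptions keyed
# on counterparty and product, both of which are likely to evolve.

scenario = {  # (counterparty, product) -> assumed outflow rate under stress
    ("RETAIL", "DEPOSIT"): 0.05,
    ("CORPORATE", "DEPOSIT"): 0.25,
    ("BANK", "FUNDING"): 0.40,
}

balances = [
    {"counterparty": "RETAIL", "product": "DEPOSIT", "amount": 1000},
    {"counterparty": "CORPORATE", "product": "DEPOSIT", "amount": 600},
]

stressed_outflow = sum(
    b["amount"] * scenario[(b["counterparty"], b["product"])] for b in balances
)
print(f"stressed outflow = {stressed_outflow}")  # 1000*0.05 + 600*0.25 = 200.0
```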

When it comes to liquidity reporting, a strong data foundation overlaid by a robust master data management framework can equip banks to better handle internally triggered or externally initiated changes that come their way.

 
Posted in Compliance

Comprehensiveness – The key to making data reusability a reality

How can one ensure that data investments are reusable? In order to answer this question, one has to dig a little deeper and think about the “data paradigm”. Can data be extracted once and reused multiple times? The benefits of this are immense. Savings in time and cost come to mind, as the same data no longer needs to be acquired multiple times from source systems.

Then comes the advantage of lower reconciliation costs for users and higher trust in data over a period of time. Liquidity reporting has many manifestations, depending on the guidelines at play. The US itself has LCR/NSFR reporting and FR2052a reporting, both of which can leverage the same data foundation. The case for data reusability becomes all the more significant within the liquidity reporting space. The diagram below illustrates the data overlap across standard compliance and business reporting use cases.

[Figure: data overlap across standard compliance and business reporting use cases]

Typical use cases: assets and liabilities data can be reused for Financial Reporting, Capital Adequacy reporting, Asset Quality and Concentration Risk monitoring. Achieving this objective requires having a vision to put in place a “comprehensive data strategy” as opposed to adopting a “point solution approach”.
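
A small sketch of the extract-once, reuse-many-times idea (field names are illustrative): the same asset dataset feeds two different reporting aggregations without a second acquisition from source.

```python
# One acquisition of assets data, reused by multiple reports.
# Field names and figures are illustrative.

assets = [
    {"type": "LOAN", "sector": "REAL_ESTATE", "amount": 400, "risk_weight": 1.0},
    {"type": "BOND", "sector": "SOVEREIGN", "amount": 300, "risk_weight": 0.0},
]

def financial_reporting(data):
    """Total assets for the balance-sheet view."""
    return sum(a["amount"] for a in data)

def concentration_monitoring(data):
    """Exposure by sector for concentration-risk monitoring."""
    out = {}
    for a in data:
        out[a["sector"]] = out.get(a["sector"], 0) + a["amount"]
    return out

print(financial_reporting(assets))       # 700
print(concentration_monitoring(assets))  # {'REAL_ESTATE': 400, 'SOVEREIGN': 300}
```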

 
Posted in Compliance

Accuracy and Integrity – Ensuring accuracy to the last mile

Probably the only way to ensure accuracy in Liquidity reporting is to have a powerful reporting platform that intermediates between the sources of data and the final return.

Accuracy is a pursuit that begins with the generation of data and ends only with production of the final report. In this context, a reporting platform that helps manage the data journey from source to return is critical.

The first step to accurate reporting is ensuring the data is complete and correct, by monitoring the data load process and profiling the data. Next comes correcting source system errors, and third, performing the necessary enrichments to fill data gaps.
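
A minimal sketch of such load monitoring and profiling (the checks, fields and control totals are invented): record counts and balance totals are reconciled against the source, and the data is profiled for nulls before moving downstream.

```python
# Illustrative data-load checks: completeness via control totals,
# correctness via simple profiling. All values are invented.

expected = {"row_count": 3, "balance_total": 1_750}  # from source control file

loaded = [
    {"account_id": "A1", "balance": 1_000},
    {"account_id": "A2", "balance": 750},
    {"account_id": None, "balance": 0},  # a data quality issue to flag
]

errors = []
if len(loaded) != expected["row_count"]:
    errors.append("row count mismatch with source control total")
if sum(r["balance"] for r in loaded) != expected["balance_total"]:
    errors.append("balance control total mismatch")
nulls = [i for i, r in enumerate(loaded) if r["account_id"] is None]
if nulls:
    errors.append(f"null account_id at rows {nulls}")

print(errors or "load OK")
```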

Once the data is ready, the next step is ensuring the application of correct classification rules: first to transform “product types” and “security types” into asset categories, and then on to HQLA classes. Various classification rules have to be created and maintained by users, and for this, such rules need to be easily accessible and “understandable”.
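
The two-stage rule chain might be sketched as follows; the mappings are illustrative and would in practice live in user-maintained tables rather than code:

```python
# Illustrative two-stage classification: product/security type -> asset
# category -> HQLA class. Real rules would be user-editable, not hard-coded.

product_to_category = {
    "T_BILL": "SOVEREIGN_DEBT",
    "COVERED_BOND_AA": "COVERED_BOND",
}
category_to_hqla = {
    "SOVEREIGN_DEBT": "LEVEL_1",
    "COVERED_BOND": "LEVEL_2A",
}

def classify(security_type):
    """Apply both rule stages; unmapped types surface for user review."""
    category = product_to_category.get(security_type)
    if category is None:
        return ("UNMAPPED", "REVIEW")
    return (category, category_to_hqla.get(category, "REVIEW"))

for s in ("T_BILL", "COVERED_BOND_AA", "NEW_PRODUCT"):
    print(s, "->", classify(s))
```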

Template changes and the need for new visualizations of data are frequent, and these need to be tracked and implemented, which in turn requires a versatile and easy-to-use report writer tool.

All of these features can only be provided by a powerful reporting platform which insulates the raw data from the final outcomes; hence, having this intermediate layer is critical to ensuring accurate submissions and simplifying change management.

 
Posted in Compliance

Data Architecture – Simplifying data management

Drawing upon the first edition of this newsletter, which outlined key design principles for achieving an optimal data architecture, “keeping data unbiased” and enabling “malleability” become particularly relevant in the context of liquidity reporting. Liquidity is one area that often demands reporting right down to the transaction level. At the same time, different regulations demand different classifications of assets and liabilities. In order to do this, it is essential to have all the data, plus a parameterizable framework that allows for the appropriate classification of that data.

Hence, the framework promotes malleability, which essentially means that with the same base data, different classifications are built to meet different reporting use cases.

Third, and equally important, is providing users access to the data, albeit in a controlled environment. This allows users to make adjustments to specific attributes of a transaction, governed by a review mechanism and audit trail. This in turn improves user trust and confidence both in the data and in the final report.
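
A sketch of an adjustment record with the review mechanism and audit trail described above (all fields are illustrative):

```python
# Illustrative audit-trailed adjustment: nothing overwrites the base data;
# only reviewed-and-approved adjustments are applied at read time.

from datetime import datetime, timezone

adjustments = [{
    "txn_id": "T42", "attribute": "maturity_bucket",
    "old": "8-14D", "new": "15-28D",                   # prior value kept
    "made_by": "analyst1", "approved_by": "reviewer1", # review mechanism
    "at": datetime.now(timezone.utc).isoformat(),      # audit trail
}]

base = {"T42": {"maturity_bucket": "8-14D", "amount": 500}}

def adjusted_view(base, adjustments):
    view = {k: dict(v) for k, v in base.items()}  # base data stays untouched
    for adj in adjustments:
        if adj["approved_by"]:                    # apply only approved changes
            view[adj["txn_id"]][adj["attribute"]] = adj["new"]
    return view

print(adjusted_view(base, adjustments))
```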

Also, there is greater control over the accuracy of submissions from day one, despite source system data quality issues, which would typically take longer to correct.

 
Posted in Compliance

Liquidity Reporting

Liquidity has become one of the key metrics tracked by banks globally, especially post the 2008 financial crisis. Consequently, one of the new requirements in Basel III (as compared to Basel II) is that of tracking and reporting on the short- and long-term liquidity of the bank. The Liquidity Coverage Ratio (LCR) takes into account the liquidity position of the bank over a 30-day horizon, while the Net Stable Funding Ratio (NSFR) looks at inflows and outflows over a one-year timeframe. In addition, most regulators also have region-specific liquidity monitoring returns.
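
In Basel III terms, the two ratios can be written as follows, with both required to reach at least 100% once fully phased in:

```latex
\mathrm{LCR} = \frac{\text{Stock of HQLA}}{\text{Total net cash outflows over the next 30 calendar days}} \geq 100\%

\mathrm{NSFR} = \frac{\text{Available stable funding}}{\text{Required stable funding}} \geq 100\%
```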

In the US, the regulator has recently introduced the FR2052a return, which requires that banks report inflows and outflows at the transaction level, adhering to a well laid out master data management framework. In India, on the other hand, the RBI has a return on Structural Liquidity which monitors inflows and outflows at an aggregate level, i.e. across asset/liability categories and across different maturity buckets, and also asks for a list of the top depositors, thus again drilling down to the potential source of liquidity risk.
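
An aggregate-level sketch of this structural liquidity idea (the buckets and cash flows are invented): inflows and outflows are summed per maturity bucket and the mismatch in each bucket is tracked.

```python
# Illustrative structural-liquidity aggregation: net mismatch per
# maturity bucket. Buckets and flows are invented for illustration.

flows = [  # (bucket, direction, amount)
    ("1-14D", "IN", 300), ("1-14D", "OUT", 450),
    ("15-28D", "IN", 600), ("15-28D", "OUT", 200),
]

mismatch = {}
for bucket, direction, amount in flows:
    sign = 1 if direction == "IN" else -1
    mismatch[bucket] = mismatch.get(bucket, 0) + sign * amount

# A negative number flags a funding gap in that bucket.
print(mismatch)  # {'1-14D': -150, '15-28D': 400}
```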

However, the key differences between these local liquidity standards and the Basel Committee norms on liquidity risk are:

  • Stress testing on HQLA where the bank needs to report on the HQLA level under stressed conditions
  • Specific classification of assets on the basis of riskiness into HQLA Level 1, Level 2A and Level 2B categories
  • Adherence to category wise threshold levels prescribed by the committee to ensure quality of the liquid asset stock of the bank
  • View on short and medium term liquidity

Hence, while local and global liquidity reports may be complementary, both need to be separately tracked and managed by banks, given that the change cycles, data requirements, report formats and frequencies vary.

 
Posted in Compliance

Fintellix Experience – Insights gleaned from the Fintellix client repertoire

One of the leading multinational banks headquartered in North America has leveraged the Fintellix regulatory reporting platform and data model to automate its asset quality reporting (in India). The Fintellix solution entails end-to-end data processing, starting from data integration through to final reporting:

  • Integration of the corporate and retail loans portfolio from diverse source systems and imposition of a standard data definition
  • Unification of the multiple identities of a single customer to create a single view, so that the bank can solve cross-classification challenges and perform a customer-level classification, imposing the “worst” classification across different credit facilities of the same customer (see the sketch after this list)
  • Classification of loans, using a parameterizable framework, into different “Non Performing” and “Early Warning” categories
  • Computation of the provisioning amount and generation of appropriate accounting entries as per the GL structure of the bank
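
The customer-level “worst” classification rule can be sketched as follows; the classification labels and their ordering are illustrative of typical categories:

```python
# Illustrative customer-level classification: a customer's facilities all
# inherit the "worst" status among them. Labels/ordering are invented.

SEVERITY = {"STANDARD": 0, "SUBSTANDARD": 1, "DOUBTFUL": 2, "LOSS": 3}

facilities = [  # one customer, unified across source systems
    {"facility_id": "F1", "status": "STANDARD"},
    {"facility_id": "F2", "status": "DOUBTFUL"},
    {"facility_id": "F3", "status": "STANDARD"},
]

worst = max((f["status"] for f in facilities), key=SEVERITY.get)
for f in facilities:
    f["customer_level_status"] = worst  # impose worst across all facilities

print([(f["facility_id"], f["customer_level_status"]) for f in facilities])
```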

The bank has seen significant benefits in terms of cost and time economies, as users are now engaged purely in the review and analysis of system-identified accounts rather than in the preparation of the list of accounts as per regulatory norms. In addition, users have access to the identified list of accounts and can run different types of analyses on it, to identify common attributes and potential causal factors for NPA accounts.

ALLL case study

 
Posted in Compliance

Data Architecture – What lies beneath the hood for loan loss reporting

 

[Figure: data architecture for ALLL reporting]

What could be the best data architecture for a loan loss provisioning reporting system?

Keeping the data unbiased is the first principle. In the loan loss reporting world, this has three implications: first, source a rich set of attributes without restricting oneself to only the current scope of reporting; secondly, acquire granular data, even going to the transaction level, assuming of course that data volumes permit this; and thirdly, store historical data. This will ensure that we future-proof loan loss reporting, at least to some extent, and are equipped to meet business user demands for more insights from the data, furnish drilldowns, etc.

Particularly important in the loan loss reporting scenario is enabling users to interact with the data effortlessly. This means allowing them to perform data corrections, enrichments and validations as self-sufficiently as possible. In order to support this, data needs to be presented in a banker-friendly language, and most, if not all, data management and transformation functionalities should be UI driven.

Flexibility needs to be provided in terms of parameterization at various points, from the identification of non-performing behaviour in an asset account, to the definition of account pools and the specification of provisioning levels. To achieve this, the final processed metrics need to be decoupled from the data. Rules can then be changed and managed directly by users to keep pace with business and regulatory considerations, without impacting the underlying data layer. Lastly, traceability and auditability need to be maintained as data flows through the various stages of transformation, user validation and correction.
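
A sketch of the decoupling described here (the thresholds and rates are invented): the identification rule reads its parameters from user-managed configuration, so a rule change never touches the data layer.

```python
# Illustrative parameterised NPA identification: the days-past-due
# threshold and provisioning levels live in user-managed config,
# decoupled from the underlying account data. Numbers are invented.

config = {
    "npa_dpd_threshold": 90,
    "provision_rates": {"NPA": 0.15, "PERFORMING": 0.004},
}

accounts = [
    {"id": "A1", "outstanding": 1_000, "days_past_due": 120},
    {"id": "A2", "outstanding": 2_000, "days_past_due": 10},
]

def classify_and_provision(account, cfg):
    """Classify one account and compute its provision from config."""
    status = ("NPA" if account["days_past_due"] >= cfg["npa_dpd_threshold"]
              else "PERFORMING")
    return status, account["outstanding"] * cfg["provision_rates"][status]

for a in accounts:
    print(a["id"], *classify_and_provision(a, config))

# A regulatory change (e.g. a new threshold) is a config edit, not a
# change to the data layer:
config["npa_dpd_threshold"] = 60
```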

Hence, the bank needs to choose an approach that not only allows reporting to be automated but also provides the added capabilities required to manage regulatory inspections seamlessly while keeping pace with changing regulations.

 

 
Posted in Compliance