1
19th XBRL International Conference
“Reducing regulatory burden with XBRL: a catalyst for better reporting”
June 22-25, 2009, Paris, France
Regulatory Track
FFIEC Central Data Repository: Adding Value to the Data Supply Chain
Alan Deaton, Federal Deposit Insurance Corporation
June 25, 2009

Good morning. My name is Alan Deaton. I work for the Federal Deposit Insurance Corporation, where I have been for 12 years. Originally I started out in Research, writing papers about the state of the industry. It was much harder back then because the industry was very healthy and had few problems. Now it is a much more interesting and challenging time, and there is a lot to write about, but I no longer work in that area. For the past five years I have worked on the Central Data Repository project, which is the topic of today’s presentation.
2
Agenda
Overview of the Call Report
Overview of the Central Data Repository (CDR)
Value-added functionality
Public Data Distribution
Uniform Bank Performance Report
XBRL for Bank Examinations
FDIC’s GENESYS application (Proof of Concept) for bank supervision
Other FDIC applications

By way of an agenda, we’ll talk about an overview of the Call Report; an overview of the CDR (how many of you have heard of the CDR?); the value-added functionality we have added to the CDR; and XBRL for bank examinations, a proof of concept that we would like to undertake in 2009. How many of you are from fellow bank regulatory agencies?
3
Overview of the Call Report
All U.S. Banks and Thrifts are required by the Federal Financial Institutions Examination Council (FFIEC) to file quarterly reports with their regulator as of the close of business on the last day of each calendar quarter
Federally Insured Banks file consolidated Reports of Condition and Income (Call Reports)
Federally Insured Savings Institutions file Thrift Financial Reports (TFRs)
Banks have 30 days to submit data unless they have multiple foreign offices, in which case they have 35 days

The data collected on the Call Report, or the consolidated Reports of Condition and Income, are fundamental to our business. The data are used to examine banks, conduct research on the industry, and determine assessments for individual banks and the industry as a whole. All banks are required to file the Call Report each quarter. In the US we have what is called the dual banking system, with several regulators sharing supervisory duties: the FDIC supervises state non-member banks (the majority in number, but not in assets), the FRB supervises state member banks (a smaller number of banks, but generally much larger ones), and the OCC supervises nationally chartered banks (also a smaller number of banks, but generally much larger ones). Thrifts file similar reports with the Office of Thrift Supervision. The filing deadline is 30 days, unless the bank has multiple foreign offices, in which case it gets 35 days to file.
4
Overview of the Call Report
Approximately 7,500 filing entities as of March 31, 2009
Approximately 1,200 financial concepts reported by an entity
Approximately 2,000 validation criteria used to check quality
Over 429 pages of instructions
Call Report data available electronically from as early as 1959
Since 1998, all banks have been required to file Call Report data electronically

We have a lot of banks in the US – about 7,500 as of March 31, 2009. That number generally decreases each quarter as a result of mergers, and lately because of failures. There are about 1,200 concepts that entities must report. We have two forms: the 041 for domestic banks and the 031, a larger report, for banks with multiple foreign offices. The agencies use about 2,000 validation criteria to check for quality; we’ll talk more about these validation criteria later. As good governmental regulatory bodies, we have copious amounts of instructions – over 400 pages. We’ve been collecting Call Report data for many decades. Originally there were only a few pages, and in the 1980s the report grew to about 30 pages or so. Since 1998, banks have been required to file the Call Report electronically; prior to that time, we would accept the data in paper form. So, we have come a long way.
5
Overview of the CDR
Before the CDR and XBRL:
Validation routines and formulas stored in and processed by two systems (FRB, FDIC)
Banks submit data after some minimal checks in their software – inconsistencies between preparation software packages
Software vendors receive Call Report metadata from Excel, PDF, and Word documents – cut and paste into their software
Agency analysts would check data quality once files had been submitted and contact bankers with any questions – often 1-3 weeks after initial submission

Before the development of the CDR, the banking agencies collected Call Report data much differently. The FDIC and the FRB had their own systems for collecting and processing the data, and the FDIC collected data on behalf of the OCC. The validation criteria were similar but not identical between the two systems, so banks were essentially held to different standards. A community of software vendors developed regulatory reporting software to help banks prepare their Call Report data for submission to the agencies. These vendors received Call Report metadata (concept definitions, validation criteria, and instructions) in Excel, PDF, and Word formats. This process was very inefficient and inevitably led to human errors; in addition, the vendors had to receive two different sets of validation criteria from the FDIC and the FRB. The vendors did what they could to ensure high quality data by programming the validation criteria into their software, but at best there were only minimal checks on the data before submission, and banks were not able to perform proper data validation prior to submitting the data. The analysts would then receive the data, which was submitted to the agencies in batch via a third party, sometimes several days or weeks after it was prepared by the banks. The analysts would have to follow up with the banks via a phone call to discuss any abnormalities in the data, usually after the banker had forgotten what they reported or why. Several rounds of telephone calls were usually required.
6
Overview of the CDR
Three banking agencies developed the Central Data Repository (CDR)
Used XBRL to define and transport data
Data receipt
Data validation
Storage
Distribution
CDR launched on October 1, 2005
Key policy change ~ pre-validation using XBRL
Very successful implementation

The agencies started looking into how to improve this process around the time that XBRL was becoming more viable as a data reporting standard; the agencies were at the bleeding edge of the technology at that time. In 2003, the three banking agencies (FDIC, FRB, OCC) launched a project to develop the Central Data Repository, using XBRL to define, transport, validate, store, and distribute the data. It took roughly two years to complete the project, with a one-year delay to the original schedule. Finally, on October 1, 2005, the CDR went live. Since then, we have successfully collected 15 quarters of data with the CDR. Overall, the implementation was very successful, and we have sought to build upon that success.
7
Overview of the CDR
After the CDR and XBRL:
FFIEC developed the XBRL-based CDR with Unisys Corporation as systems integrator
Metadata stored in XBRL taxonomy files now available to anyone
The same taxonomy files that contain the validation criteria the agencies use in the CDR are used in Call Report software vendor packages
Banks are required to check the quality of their data before submitting
Agencies do not accept data with quality problems
Quality assurance work is done by reporters up front, when it is more efficient

Unisys Corporation was our partner in building the CDR. The CDR is completely hosted and maintained at Unisys, outside of all the regulatory agencies; the agencies receive the data in daily extracts that are integrated into our downstream systems. If you have been around XBRL long enough, you have heard of a taxonomy. We have a taxonomy for the CDR. There is one for every quarter, and it is a closed taxonomy, meaning that reporters cannot alter or extend it. All the Call Report metadata is defined within the taxonomy: the concepts, the validation criteria, and the presentation (how to view the data). The taxonomy is publicly available and is distributed quarterly to the Call Report software vendors. We worked closely with these vendors while developing the CDR so that they would be able to transition their customers to XBRL. The vendors use the taxonomy information to prepare software that assists the banks in completing and submitting the Call Report. This is much the same as before, except that the information is standardized and can easily be used without manual intervention, which makes the process much more efficient and less prone to errors. A key policy change was that banks are now required to check the quality of their data prior to submission. Each of the validation criteria that the agencies use to validate the data is incorporated in the Call Report software, so the banks see exactly the same validation criteria failures the agencies see. If there is a data quality problem, we do not accept the data. The banks perform quality assurance prior to submission, meaning that we receive cleaner data sooner in the process. This is better for the banks because it allows them to clean up the data before submission, rather than after the fact when they have already moved on to the next quarter. The banks do not know they are using XBRL – the vendors hide that detail from their users.
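To give a feel for how vendors can consume the taxonomy programmatically rather than cutting and pasting from documents, here is a minimal sketch that lists the concepts declared in one taxonomy schema file. The file name is hypothetical and the real FFIEC taxonomy packages may be organized differently; this only illustrates the idea of reading metadata directly from the XBRL files.

```python
# Minimal sketch: list reportable concepts from a local copy of a Call Report
# taxonomy schema file. The file name is hypothetical; the real FFIEC taxonomy
# packages may be structured differently.
import xml.etree.ElementTree as ET

XSD_NS = "http://www.w3.org/2001/XMLSchema"

def list_concepts(schema_path: str) -> list[str]:
    """Return the element (concept) names declared at the top level of a schema."""
    root = ET.parse(schema_path).getroot()
    # Top-level xs:element declarations correspond to reportable concepts.
    return [el.get("name") for el in root.findall(f"{{{XSD_NS}}}element") if el.get("name")]

if __name__ == "__main__":
    for name in list_concepts("call-report-2009-06-30.xsd"):  # hypothetical file name
        print(name)
```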
8
Overview of the CDR
Validity – equations that must hold true or the data is inaccurate
Quality – data relationships that help identify anomalies
Reportability – identify what financial concepts an entity should submit based on their structural or financial characteristics

There are three types of validation criteria in the CDR (we call them edits), and they are contained in the formula linkbase. Validity edits are equations that must hold true (for example, Assets = Liabilities + Equity); if the relationship is not true, the data are rejected. Prior to the CDR we had lots of validity errors (about 70% of submissions were clean) – now we have virtually none, because the banks are aware of the errors prior to submission. Quality edits are relationships that are typically true (for example, asset growth below 5%) and are used to help identify anomalies in the data. If there are anomalies, the bank must explain them; if it does not, the data are rejected. The explanation is sent with the submission, and the analysts review the explanations to check whether they make sense. If they do not, there is still a follow-up phone conversation with the bank, and if the data need to change, the bank must resubmit – we do not manipulate the data for them. Prior to the CDR only about 66% of submissions were considered clean; now close to 95% are. Reportability edits identify which concepts a bank should report based on its structural and financial characteristics; if it does not report a concept it should, the data are rejected. Any questions so far about the CDR?
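To make the three edit types concrete, here is a small sketch of how such checks might look in code. The concept names and the 5% growth threshold are illustrative only; the actual edits are expressed in the taxonomy’s formula linkbase, not in application code like this.

```python
# Illustrative sketch of the three edit types, applied to dictionaries of
# reported values. Concept names and thresholds are made up for the example;
# the real edits live in the taxonomy's formula linkbase.
def validity_edit(report: dict) -> bool:
    # Validity: the relationship must hold or the submission is rejected.
    return report["total_assets"] == report["total_liabilities"] + report["total_equity"]

def quality_edit(current: dict, prior: dict, max_growth: float = 0.05) -> bool:
    # Quality: usually true; a failure requires an explanation from the bank.
    growth = (current["total_assets"] - prior["total_assets"]) / prior["total_assets"]
    return abs(growth) < max_growth

def reportability_edit(report: dict) -> bool:
    # Reportability: e.g., a bank with foreign offices must report foreign deposits.
    if report["has_foreign_offices"]:
        return "foreign_deposits" in report
    return True
```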
9
Value-added Functionality
Value Added Business & Performance Metrics – Uniform Bank Performance Report (UBPR)
Capital Adequacy
Asset Quality
Earnings
Liquidity
Growth Rates
Industry Standards
Regulatory
International ~ Basel II

Since 2005, the agencies have been enhancing the CDR. Many of the enhancements have been for the analysts or the software vendors, to make using the system easier; others have been value-added functionality. One example is our ongoing project to produce the Uniform Bank Performance Report (UBPR) in the CDR. The performance and composition data contained in the report can be used as an aid in evaluating the adequacy of earnings, liquidity, capital, asset and liability management, and growth management. Bankers and examiners alike can use this report to further their understanding of a bank’s financial condition and, through such understanding, perform their duties more effectively. What do we hope to achieve by putting the UBPR into the CDR? Lower costs to produce the report; faster data availability – as soon as the Call Report is submitted, with all data updated daily; improved transparency – use of a public taxonomy to express our formulas and communicate with others; and process agility – putting the keys to the system into the hands of the business owner, rather than the programmer (this is what the CDR did). By manipulating metadata, the system responds immediately.
10
Value-added Functionality
UBPR
Modify data by (+, -, /, *)
Apply functions (annualize, %change)
Consistently applied across data
Industry comparability

The Uniform Bank Performance Report (UBPR) is an analytical tool created for bank supervisory, examination, and management purposes. In a concise format, it shows the impact of management decisions and economic conditions on a bank’s performance and balance-sheet composition. It is a common analysis tool that is widely used by banking supervisors and banks, providing a common set of performance metrics for examiners and bank management to review. The formulas are complicated, but essentially use simple arithmetic operators as well as some functions (such as annualization). These functions are applied consistently across all institutions, which enables comparisons across time and across institutions. Banks are organized into pre-determined peer groups, such as by asset size, and ranked within the peer group. State reports are also available to compare institutions within a state, and analysts can organize custom peer groups if desired.
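As an illustration of what “applied consistently across all institutions” means in practice, here is a sketch of the kind of helper functions a UBPR-style calculation might use. The simple scale-by-quarters annualization convention and the return-on-assets example are assumptions for illustration, not the UBPR’s actual formula definitions.

```python
# Sketch of functions applied uniformly to every institution. The annualization
# convention (simple scaling by quarters elapsed) is an assumption for
# illustration; the real UBPR formulas are defined by the agencies.
def annualize(year_to_date: float, quarter: int) -> float:
    """Scale a year-to-date income figure to an annual rate."""
    return year_to_date * 4 / quarter

def percent_change(current: float, prior: float) -> float:
    """Period-over-period growth rate, in percent."""
    return (current - prior) / prior * 100

def roa(net_income_ytd: float, quarter: int, average_assets: float) -> float:
    """A return-on-assets style ratio computed the same way for every bank."""
    return annualize(net_income_ytd, quarter) / average_assets * 100

# Example: second-quarter year-to-date income of 10 on average assets of 1,000.
print(roa(10.0, 2, 1_000.0))  # 2.0 (percent, annualized)
```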
11
Value-added Functionality
UBPR: in CDR as early as December 2009

The UBPR consists of a summary page followed by details on several topics, including profitability, balance sheet composition, capital, securitization, and asset quality. The UBPR is shared with the other federal and state banking supervisors, and these data support onsite examinations and offsite supervisory activities. All UBPR information is available to the public through a public web site. We have recently implemented the UBPR calculation engine within the CDR, and the public system is scheduled for implementation at the end of 2009.
12
Value-added Functionality
Public Data Distribution: Leveraging XBRL

We’re also trying to improve how we distribute data to the public. This is a portion of our public site, which was launched in March 2007. Using information in the taxonomy, we are able to generate reports automatically to be placed on our Public Data Distribution (PDD) web site. These reports are available to the public 24 hours after submission in either PDF, XBRL, or SDF formats. This was a significant cost saving for the FDIC over the old process, which relied on manual steps that were prone to error.
13
Value-added Functionality
Public Data Distribution: Bulk Data

At the end of 2008, we started distributing bulk Call Report data from the CDR, replacing a variety of distribution channels that existed previously. A user can download all reports for a single period, as well as the associated taxonomy. A user can also download basic information for several periods to perform a quick time-series analysis on the balance sheet, income statement, and past-due data. These downloads are available in either tab-delimited or XBRL format.
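The tab-delimited option makes the bulk data easy to work with in ordinary tools. Here is a minimal sketch of loading one such file; the file name and the column header used in the filter are hypothetical, and the actual layout should be taken from the files published on the PDD site.

```python
# Minimal sketch of reading a tab-delimited bulk file into memory.
# The file name and column header are hypothetical; use the layout of the
# files actually published on the PDD site.
import csv

def load_bulk_file(path: str) -> list[dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter="\t"))

rows = load_bulk_file("call_report_2009-03-31.txt")  # hypothetical file name
# Hypothetical column name; filter for banks reporting more than $1 billion
# in total assets (values in thousands).
large_banks = [r for r in rows if float(r.get("TOTAL_ASSETS") or 0) > 1_000_000]
print(len(large_banks))
```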
14
Value-added Functionality
Public Data Distribution: Public Web Service Retrieve Panel of Reporters Institution identifiers, simple structure data, and indicator whether data is available Retrieve filers since a given date List of institutions Retrieve facsimile PDF SDF XBRL We have also implemented a public web service to enable electronic retrieval of Call Report data from the PDD site. There are a few simple queries that allow programmers to write programs that can automatically check for data and download it Retrieve POR Retrieve a list of filers since a given date – to check for updates since the last time it was checked Retrieve facsimile – to download the data once you know what you want to download The data are available in either PDF, SDF, or XBRL format This feature is used by data aggregators, such as SNL and ibank.net, which make the data available to their customers Any questions about the value-added functionality?
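For programmers, the usual starting point with a SOAP-style service like this is to inspect its WSDL with a generic client and then call the operations it advertises. The sketch below does only that inspection step; the WSDL address, authentication requirements, and exact operation signatures are assumptions here, and the published web-service documentation is the authoritative source.

```python
# Sketch of inspecting the public retrieval web service with a generic SOAP
# client (zeep). The WSDL location below is an assumption for illustration;
# use the address given in the PDD web-service documentation.
from zeep import Client

WSDL = "https://cdr.ffiec.gov/public/pws/webservices/retrievalservice.asmx?WSDL"

client = Client(WSDL)
client.wsdl.dump()  # print the available operations and their parameter types

# A retrieval call would then look roughly like the line below; the operation
# and argument names must match what the dump above shows (placeholders here):
# filers = client.service.RetrieveFilersSinceDate(...)
```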
15
XBRL for Bank Examinations
GENESYS Proof of Concept (POC)
General Examination System (GENESYS) – FDIC’s system for creating the Report of Examination
Current system relies on outdated technology
Current system must be manually updated each quarter
Cannot look back to prior quarters
Delivery of system is not timely

In 2009 we started planning for the development of a proof of concept that would leverage the CDR to improve the examination of banks at the FDIC. At the FDIC, we have an examination software tool called GENESYS. GENESYS (General Examination System) is an integrated software application for recording financial institution data and creating the Report of Examination (ROE). It includes spreadsheets, worksheets, and places to enter notes and comments, as well as the actual pages included in an ROE. The GENESYS application is used during an examination to automate the ROE and parts of the examination process. Examiners enter data into GENESYS before, during, and after the examination and use it to prepare the final ROE; reviewers also use GENESYS when they review completed reports. So, imagine three columns of data – one for the original Call Report data, one for manual adjustments made by the examiner, and a third for the adjusted Call Report data. The current system relies on outdated technology – it is a mainframe-based system built on Access and Visual Basic. Timely Call Report data is crucial to the examination process. Unfortunately, the current system must be manually updated each quarter to incorporate changes to the reporting requirements, and there is only one presentation and calculation of the data available at any given time – the most current – even though changes are typical and sometimes significant. There is always a delay in the development of the software, sometimes as much as three months after the quarter has begun. Significant time and resources are devoted to modifying and testing the software each quarter to ensure that it works as intended. The presentation and the formulas must be maintained manually, which introduces the time delay and the opportunity for errors, and the data are presented via templates that are maintained each quarter.
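The three-column view described above is a simple data structure. Here is a small sketch of what it might look like; the concept name and figures are illustrative, not taken from GENESYS.

```python
# Sketch of the three-column examination view: the value as filed in the
# Call Report, the examiner's manual adjustment, and the adjusted result.
# The concept name and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class ExamLineItem:
    concept: str
    reported: float        # value as filed in the Call Report
    adjustment: float = 0.0  # examiner's manual adjustment

    @property
    def adjusted(self) -> float:
        return self.reported + self.adjustment

loans = ExamLineItem("total_loans", reported=500_000.0, adjustment=-25_000.0)
print(loans.adjusted)  # 475000.0
```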
16
XBRL for Bank Examinations
GENESYS Proof of Concept: Goals
Enable web services
Retrieve taxonomy and bank data from CDR
Leverage taxonomy information internally (formulas, presentation)
Real-time data retrieval, manipulation and analysis
POC could provide the basis for future internal systems development: GENESYS Modernization Project in 2010, other internal FDIC systems

Our friends from supervision approached us earlier this year to discuss how the process could be improved. Together, we are collaborating to define a proof of concept that could prove whether we can use the information in the Call Report taxonomies to automate the manual programming functions that are so time-consuming and error prone. The proof of concept is simple: we would like to enable web services at the FDIC to retrieve data contained in the CDR – both financial institution data and metadata (formulas and presentation). Once the data are downloaded, GENESYS would be able to use the information to build a Call Report in “real time,” using the most up-to-date information available. GENESYS would use the presentation information to display the data and the formulas to build the logic required for the examiner to perform “what-if” analysis. I will be working with my colleagues (some of you may know Mark Montoya from my staff) to build the proof of concept in 2009. If it is successful, it could provide the basis for a larger GENESYS modernization effort in 2010 that could incorporate the UBPR and even provide feedback to the CDR based on the examiner’s findings (for example, if the examiner asked the bank to resubmit its Call Report, the CDR could notify the assigned analyst about the request and follow up with the bank until it has resubmitted).
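To illustrate the “what-if” idea, here is a minimal sketch in which a formula obtained from metadata is re-evaluated after the examiner adjusts an input. Representing the formula as a Python expression string is a simplification for the sketch only; in the proof of concept the formulas would come from the taxonomy’s formula linkbase retrieved over the web service, and the concept names below are illustrative.

```python
# Sketch of "what-if" analysis: a formula taken from metadata is re-evaluated
# after an examiner adjustment. eval() is used only for brevity in this sketch;
# the formula string and concept names are illustrative.
def evaluate(formula: str, values: dict) -> float:
    # Evaluate the formula against the reported values only (no builtins).
    return eval(formula, {"__builtins__": {}}, values)

formula = "total_loans / total_assets"          # illustrative ratio from metadata
values = {"total_loans": 500_000.0, "total_assets": 800_000.0}

print(evaluate(formula, values))                # ratio as reported
values["total_loans"] -= 25_000                 # examiner adjustment (what-if)
print(evaluate(formula, values))                # ratio recomputed immediately
```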
17
La Fin
Any questions?

We are also planning to work with the SEC to link the Call Report taxonomy with the GAAP taxonomy that the SEC has developed. We hope that this project will lead to better analysis and comparison of data that are submitted by institutions to the government. Hopefully we will be back to give you an update on our progress in these areas.