Office of the Superintendent of Financial Institutions
This document supports the approval framework described in the Implementation Note, 2007/2008 Approval of IRB Approaches for InstitutionsFootnote 1, which outlines the key principles, requirements and steps for the approval of Internal Risk Rating Systems for the internal ratings-based (IRB) methodology and minimum regulatory capital calculation outlined in Chapter 5 of OSFI’s Capital Adequacy Requirements (CAR) Guideline A-1. In its Implementation Note, OSFI indicated that self-assessment template(s) would be developed along with the instructions for Phase 2Footnote 2 of the approval process. Accordingly, OSFI has developed the following self-assessment documents:
These Instructions describe the self-assessment process and elaborate on the contents of the self-assessment templates, such as the AIRB RRS Scorecard and the Vendor Model Inventory, as well as other areas not explicitly provided for.
A comprehensive self-assessment process is necessary for successful implementation of AIRB. If completed accurately and appropriately, the self-assessment process makes the best use of the limited time before AIRB implementation by providing an assessment and analysis that is structured and consistent across applicant institutions.
It is expected that an institution using AIRB will have a clearly defined and robust self-assessment process that is supported by a variety of documentation. The institution will be able to provide evidence of the self-assessment work performed by delivering a self-assessment packageFootnote 3 as well as other supporting documents to OSFI (see Section 2 of this document for details).
The self-assessment package summarizes the self-assessment work performed by the institution and provides a “road map” to all self-assessment documents developed and used by the institution for AIRB approval purposes. Upon request, all documents that the institution refers to in its self-assessment package should be made available to OSFI.
The self-assessment process must determine whether the institution has:
In line with CAR Guideline A-1, the self-assessment process requires that risk and/or business management (as applicable) and internal audit provide the necessary assurances that the institution is in compliance with the AIRB minimum requirements by the implementation date of the new Basel framework (i.e., as at November 1, 2007). OSFI recognizes that some of the requirements around the scope of work for these parties are new and will take time to build during implementation of the new Basel framework.
Consequently, OSFI expects the initial assessment of adherence to the AIRB minimum requirements to be submitted by the Formal Application Date, i.e., February 1, 2006, based on a self-assessment as at October 31, 2005. The remaining work will be completed over the subsequent years of the AIRB rollout, particularly during the parallel reporting period.
The final self-assessment, including a fully completed AIRB RRS Scorecard, should be submitted to OSFI by October 31, 2007, based on a self-assessment as at July 31, 2007 (see Section 7 of this document for details).
As part of OSFI’s AIRB approval process, the institution will submit the initial self-assessment package by February 1, 2006Footnote 4 and the final self-assessment package by October 31, 2007Footnote 5. Please refer to Figure 1 for a graphical presentation of the self-assessment package.
The supporting documents to the self-assessment package should demonstrate the scope, depth, and quality of the self-assessment work performed by the institution. This self-assessment work should demonstrate that all material and relevant issues have been identified and that the institution has undertaken or is undertaking appropriate action to address them in a timely fashion.
OSFI recognizes that the implementation process is ongoing and that the self-assessment will change throughout the implementation and rollout of AIRB. Consequently, OSFI expects the institutions to update parts of the self-assessment package on a regular basis to facilitate the monitoring of AIRB implementation. Please refer to Section 7 for further details on the update timeline.
Schedules have been provided to enhance the structure of the self-assessment templates. The schedules identify distinct sections of the templates, including sections of the AIRB RRS Scorecard, Risk Parameters Mini-Scorecard, Vendor Model Inventory and RRS Summary. All schedules are listed below in Exhibit 1.
As indicated in the Implementation Note, Phase 2 of the approval process is the stage for formal application and preparation for “meaningful” parallel reporting. To commence Phase 2, the institution will need to submit a cover letter from the Chief Risk Officer, addressed to OSFI by February 1, 2006, as part of the initial self-assessment package. The letter will include the following:
In addition to the letter, the institution should submit its self-assessment package. Please refer to Sections 2.3 to 2.8 for a detailed description of this package.
OSFI recognizes that this self-assessment work will be a work-in-progress. Consequently, OSFI expects a description of all work performed to date by risk and/or business management (as applicable) in respect of the institution’s adherence to the AIRB minimum requirements. Where such work is performed, the results of the assessment must also be included in the form of assessment ratings within the self-assessment package.
The institution will need to re-submit a letter from the Chief Risk Officer to OSFI by October 31, 2007 providing an updated view on the level of adherence to the AIRB minimum requirements and the nature of any and all representations made to the Audit and Risk Committees of the Board in respect of the AIRB implementation and approval.
As part of the initial self-assessment package, the institution will need to submit a letter from the Chief Internal Auditor addressed to OSFI by March 31, 2006, providing an assessment, in the form of negative assurance, based on the following information:
In addition to the letter, the institution will submit the following information:
The institution will re-submit a letter from the Chief Internal Auditor to OSFI by October 31, 2007 providing an updated view on internal audit work performed to date as it relates to all points described above. This letter will also give an updated assessment in the form of an opinion from internal audit based on the following assessments:
The institution will provide a report describing its internal self-assessment process by February 1, 2006. Unless there is a material change to the internal self-assessment process, the institution is not required to re-submit this report. Although there is no prescribed format for this report, it will include the following information:
The institution will submit the description of all exemptions, extensions and waivers by February 1, 2006. Unless there is a material change, the institution is not required to re-submit this information; however, if at any stage material changes have occurred, the institution should inform OSFI of the changes on a timely basis.
The IRB approval process recognizes the importance of materiality and provides various mechanisms to accommodate the phased rollout of AIRB. To support this approach, the institution should provide a document with sufficient detail on the assumptions underpinning the self-assessment process, including materiality assumptions. As a part of this document, the institution should show its assessment of materiality for all IRB asset classes listed in Exhibit 2.
In addition, as a part of the self-assessment process, all exemptionsFootnote 7, extensionsFootnote 8 and waiversFootnote 9 should be listed, together with a description of the tracking process used to monitor them throughout the rollout of AIRB.
In order to minimize duplication with the rollout plan, this section should focus primarily on the self-assessment itself, and the management process and procedures supporting the self-assessment, as opposed to a description of exemptions and waivers themselves.
The AIRB RRS Scorecard was designed as a tool to assist the self-assessment process for AIRB approval. The structure of the scorecard is based on the structure of the CAR Guideline, but it also attempts to integrate other aspects, including work effort around the implementation and self-assessment process. Various schedules of the AIRB RRS Scorecard need to be updated at different times. Please refer to Section 7 on Update Timeline.
OSFI will use the self-assessment package for monitoring an institution’s progress throughout the implementation process and rollout of AIRB. OSFI recognizes that institutions will continue implementation efforts throughout the parallel reporting period and this will impact self-assessments over time. The scorecard is to be updated on a regular basis to capture this dynamic process. Refer to Exhibit 22 for the update frequencies.
The AIRB minimum requirements relate to several levels of analysis such as the consolidated banking group, the IRB asset class, and the internal risk rating system. Consequently, different types of self-assessments will be required at each level.
The institution will be required to use two approaches within the self-assessment process. The first approach is based on an assessment at the level of the RRS, whereas the second approach is based on an assessment at the level of the IRB asset classFootnote 10. See Exhibit 2 for the illustration of approaches.
Exhibit 2, above, illustrates that the institution has to populate one scorecard per RRS for corporate (including corporate SME), sovereign, bank, retail mortgages, QRE, and other retail asset classes.
Internal rating system designs vary among institutions; therefore, some institutions will have one RRS per asset class, while others will have one RRS for several asset classes. In situations where one RRS covers several asset classes, OSFI will permit the institutions to complete one scorecard for all appropriate asset classes, with the exception of retail exposures.
Institutions should clearly identify any differences between asset classes that apply to the requirements identified in the RRS Scorecard, including composition of the underlying portfolio (see Schedule 2).
In the case of retail exposures, OSFI expects institutions to complete one scorecard for each IRB asset class. For example, if the institution has one RRS for all retail exposures, the institution should populate three scorecards, for retail mortgages, qualifying revolvers and other retail exposures, respectively. Please see Section 3 for a more detailed description of the RRS Scorecard and Figure 2 for the structure of the RRS Scorecard.
The AIRB RRS Scorecard workbook contains four schedules as follows:
OSFI recognizes that AIRB approval (and self-assessment) will be performed on a consolidated basis. However, for instances where material and relevant differences arise in the application and operation of an RRS across multiple legal jurisdictions, the institution should identify these within the self-assessment, and explain how these differences were combined into the consolidated RRS self-assessment of AIRB compliance.
There are no separate scorecards explicitly designed for specialized lendingFootnote 11, purchased receivables, IRB equity, and IRB asset securitization. Institutions are encouraged to conduct a self-assessment against the relevant paragraphs of the CAR Guideline A-1 to ensure that they meet the requirements of the new Basel framework; this self-assessment should be made available to OSFI upon request.
The Risk Parameters Mini-Scorecard captures the RRS’s risk quantification details. It has only one table and is presented separately in Schedule 5.
The purpose of the Mini-Scorecard is to focus on the IRB parameters themselves, identifying supporting data and assumptions explicitly. Mini-Scorecards will be used for follow-up and review discussions on risk quantification and validation.
The Vendor Model Inventory template is presented in Schedule 6. The workbook contains two sections: Section 1 – Vendor Non-Retail Model Inventory, and Section 2 – Vendor Retail Model Inventory.
Schedule 6 captures key information relating to external vendor models used by the institutionFootnote 12 for non-retail and retail portfolios. Further detail on the Vendor Model Inventory is available in Section 5. Figure 3 shows the structure of the Vendor Model Inventory template.
The RRS Summary template is presented in Schedule 7. The purpose of Schedule 7 is to give an overview of all RRSs used by the institution as well as all exemptions, extensions and waivers. In essence, this Schedule summarizes the key information from the AIRB RRS Scorecard.
In addition, Schedule 7 asks the institutions to indicate the amount and percentage of gross credit assets and the IRB credit risk-weighted assets covered by each RRS. It also asks the institutions to compare the totals to QIS5 or BCAR and explain any differences.
The purpose of the AIRB RRS Scorecard is to create a framework for self-assessment and supervisory review purposes during the AIRB approval process. All responses and assessments provided in the RRS Scorecard are specific to the RRS identified by the institution.
Different parts of the AIRB RRS Scorecard are shaded in different colors to assist in the use of the scorecard. Risk and/or business management (as applicable) should complete all areas shaded in blue. By contrast, internal audit should complete the areas shaded in gray.
OSFI expects risk and/or business management (as applicable) to be primarily responsible for populating the scorecard. As a part of their self-assessment, risk and/or business management will identify gaps and map them to projects initiated by the institution and will provide an assessment of compliance with the AIRB minimum requirements as at the appropriate assessment date.
Risk and/or business management should have clear measures of success for every assessment, whether the institution meets the requirement or has an outstanding gap to close. In addition, risk and/or business management should assess the status of projects, where these are identified as necessary for gap closure.
OSFI expects internal audit to complete columns 12 and 13 of the supporting worksheets in Schedule 4 and assess the institution’s adherence to the AIRB minimum requirements, as set out in paragraph 443 of the CAR Guideline A-1.
On the front page, the institution will complete summary information relating to the risk rating system as follows: the institution name, the business unit that uses the RRS, the name of the RRS, the name of the portfolioFootnote 13, the IRB asset class that is covered by the RRS, and the date when the last self-assessment was done. If desired, the institutions may indicate the dates when the different schedules or ratings were updated.
Schedule 1 (AIRB RRS Scorecard Structure) illustrates the organization of the overall scorecard workbook and its associated schedules. Schedule 1 shows the three distinct sections of the AIRB RRS Scorecard: (1) Introduction & Overview, (2) Self-Assessment Summary, and (3) Supporting Schedules. Please refer to Figure 2 for a graphical presentation of the RRS Scorecard structure.
The first section includes three worksheets: (1) Front Page, (2) Schedule 1 - AIRB RRS Scorecard Template Structure, and (3) Schedule 2 - RRS Portfolio Overview.
The second section contains Schedule 3, which is a summary populated automatically from supporting schedules.
The third section contains Schedule 4, which includes seven supporting worksheets. There is one worksheet for each selected subsection of the IRB minimum requirements of the CAR Guideline A-1, including the following: (1) Overall Compliance, (2) RRS Design, (3) RRS Operations, (4) Corporate Governance, CRCU, & Internal Audit, (5) Use of RRS, (6) Risk Quantification, and (7) Validation.
Schedule 2 (RRS – Portfolio Overview) provides an overview of the RRS together with information on the portfolio to which the RRS applies. The institution is asked to provide information broken down by retail and non-retail exposures as applicable. Schedule 2 contains nine tables as listed below:
In Table 1.1 (RRS Definition), the institution will provide the name and abbreviation used internally to describe the RRS and define the RRS that is to be self-assessed. Table 1.1 has a highlighted area (shaded in blue) where the institution can input its definition.
In Table 1.2 (General Overview), the institution will provide general background information on the RRS. If the institution uses one RRS Scorecard for several asset classes, it should provide a clear explanation of any differences between the asset classes, as required in Table 1.2. Also, the institution should provide totals and the breakdowns by asset class. A brief description outlining what is required for each field is provided below in Exhibit 3.
In Table 1.3 (Criteria/Segmentation Borrower/Facility), the institution will provide information on RRS criteria/segmentation by borrower and facility rating dimension. The table is divided into two sections: non-retail and retail. For each instance, the institution will populate the sections highlighted in blue. A brief description outlining what is required for each field (by rating dimension) for the non-retail asset classes is provided in Exhibit 4.
Mapping RRS to industries (if applicable)
Mapping RRS to master scale (if applicable)
Mapping RRS to external rating agencies
Criteria used for RRS (key inputs)
A brief description outlining what is required for each field in the retail asset class table is provided below in Exhibit 5. For each instance, the institution will populate the sections highlighted in blue.
List of titles of supporting documents
Mapping RRS to credit score
In Table 1.4 (Methodology (Borrower/Facility)), the institution will provide information on its rating assignment methodology. Table 1.4 is divided into two sections: non-retail and retail. A brief description outlining what is required for each field in the non-retail and retail asset class table is provided in Exhibit 6. For each instance, the institution will populate the sections highlighted in blue.
Type of Model
Provide the name of the model by which it is known internally.
For model-based approaches, indicate whether this is a proprietary model or vendor model.
List the name of the external model and its respective vendor, as applicable.
In Table 1.5 (Responsibility), the institution will indicate the department and contact information for parties responsible for the areas listed in Exhibit 7. For each instance, the institution will populate the sections highlighted in blue.
Design of RRS
Validation of RRS
Operation of RRS
Indicate the group/division responsible for the operation of the RRS
Assignment of initial ratings
Approval of initial rating
Refreshment of the rating
Approval of refreshment of the rating
Organizational charts for each category
In Table 1.6 (Performance of RRS), the institution will comment on the performance of the RRS. A brief description outlining what is required for each field is provided in Exhibit 8. For each instance, the institution will populate the sections highlighted in blue.
List of tests performed to measure performance of RRS and their frequency of application
List of management reports on RRS performance
In Table 1.7 (Use of RRS), the institution will comment on the use of the RRS. A brief description outlining what is required for each field is provided in Exhibit 9. For each instance, the institution is asked to populate the sections highlighted in blue.
Loan/Credit approval (Yes/No)
Reporting to senior management and the Board (Yes/No)
Loan loss reserving (Yes/No)
Regulatory capital allocation (Yes/No)
Economic capital allocation (Yes/No)
Profitability analysis and pricing decisions (Yes/No)
Risk and/or business management and loan monitoring (Yes/No)
Other (Please specify)
In Table 1.8 (RRS Overrides), the institution will comment on the frequency and scope of RRS overrides. A brief description outlining what is required for each field is provided in Exhibit 10. For each instance, the institution is asked to populate the sections highlighted in blue.
Frequency of RRS overrides
Scope of overrides
In Table 1.9 (Significant Changes/Refreshments Since October, 2005), the institution is asked to provide information on any significant changes and/or refreshments to the RRS that have occurred since October 31, 2005.
A brief description outlining what is required for each field is provided below in Exhibit 11. For each instance of change, the institution should populate the sections highlighted in blue. If more than five significant changes have been experienced, then the institution should copy and paste additional rows, as necessary.
Provide a brief description indicating the size of the portfolio affected by this change.
Describe expected effects on the PD, LGD and/or EAD of the RRS.
Outline the type of change that has occurred; for example, whether it was institution-induced or externally driven (environmental).
Schedule 3 (Self-Assessment Summary) has two summary tables that are populated automatically from the supporting worksheets. The purpose of these tables is to provide a snapshot of the overall self-assessment against the AIRB minimum requirements.
Schedule 4 (Supporting Worksheets) consists of seven sections (one per worksheet):
Please refer to Section 3.6.3 for details.
Chapter 5 of the CAR Guideline A-1 structures the AIRB minimum requirements in terms of key approval areas, such as RRS design and operation. For each approval area, related CAR paragraphs have been listed for reference purposes, and brief descriptions of each paragraph have been provided. The institution is asked to assess itself against these paragraphs.
There are three rating types used in the supporting worksheets: Rating 1, Rating 2 and Rating 3. These ratings are to be completed by risk and/or business management. Internal audit indicates audit status in column 12. A summary of the rating types is outlined below.
Rating 1 (column 5) is based on definitions used for gap analysis review purposes. This rating is to be completed by risk and/or business management (as applicable), and is to be updated at least once a year. The institution will indicate the date of gap assessment and its frequency in Table 2.2 on the same spreadsheet. The institution will select one of the following ratings from a drop-down menu:
Rating 2 (column 10) indicates the degree of completion for projects/activities that have been undertaken to close any compliance gaps related to the AIRB minimum requirements. This rating is to be completed by risk and/or business management (as applicable). The rating is to be updated three times a year, and the date of the latest rating assessment should be stated in Table 2.2 on the same spreadsheet.
Risk and/or business management should be prepared to provide supporting documentation upon request to support its assessment, as part of the supervisory review process. Supporting documentation could include such things as project status dashboards, as well as other internally produced project management reports. The institution indicates the rating by selecting one of the following from a drop-down menu:
Rating 3 (column 11) indicates the institution’s assessment of its status and progress towards full implementation. This rating is to be completed by risk and/or business management (as applicable). The rating is to be updated whenever material changes occur or at least once a year, and the date of the latest rating assessment should be stated in Table 2.2 on the same spreadsheet.
It could also be used as a reference point by internal audit in its assessment of adherence to the AIRB minimum requirements. The institution indicates the rating by selecting one of the following from a drop-down menu:
Exhibit 12 (below) provides a rough guide to the mapping of Rating 3 (compliance) against the possible stages of AIRB implementation.
Exhibit 12. Mapping of Compliance vs. AIRB Implementation Stages
Audit status (column 12) should be completed by internal audit based on work performed as at the date of the self-assessment. The status is to be updated whenever material changes occur or at least once a year, and the date of the latest update should be stated in Table 2.1 on the same spreadsheet. The institution indicates audit status by selecting one of the following from a drop-down menu:
The institution will indicate Audit Date, performed and/or planned, in column 13. If the institution has done or plans to do several audits in the same area, dates of all audit work should be captured.
As indicated earlier, Schedule 4 (Supporting Worksheets) consists of seven supporting worksheets, one for each selected subsection of the key IRB minimum requirements of the CAR Guideline A-1: (1) Overall Compliance, (2) RRS Design, (3) RRS Operations, (4) Corporate Governance, CRCU, & Internal Audit, (5) Use of RRS, (6) Risk Quantification, and (7) Validation.
The structure of each supporting worksheet is similar. Each worksheet is divided into three tables as follows: (1) Self-Assessment Scorecard, (2) Description of Assessment Work Completed (two parts), and (3) Names of all Supporting Documents.
Table 1 (Self-Assessment Scorecard) has 13 columns. It contains sections that should be completed by risk and/or business management (as applicable) and sections that should be completed by internal audit. A detailed summary of what is required for each column is listed in Exhibit 13.
In Table 2.1 (Internal Audit), the institution will provide a brief description of all work completed by internal audit as at the self-assessment date. This description should include any reviews or activities.
In Table 2.2 (Other), the institution will describe the work done and rationale for assigning the rating status for current projects (column 10) and the rating status for compliance with AIRB minimum requirements (column 11). The institution should also indicate when the assessment was done for each rating type. For more information on the update frequency of the various columns, please refer to Section 7 on Update Timeline.
In Table 2.2 (Other), the institution should also record any material changes made in columns 6, 7, 8 and 9. In essence, this table can be used as a change log.
In Table 3 (Names of Supporting Documents), the institution is asked to list all relevant supporting documents, such as guidelines, policies, project plans, internal audit reviews, review reports, or other applicable documents. The institution should indicate relevant title(s), section(s), paragraph(s), etc. of the supporting documents for quick reference.
Schedule 5 (Risk Parameters Mini-Scorecard) was developed to capture the RRS’s risk quantification details. On the front page, the institution will complete summary information relating to the Mini-Scorecard as follows: the institution name, the business unit that uses the Mini-Scorecard, the name of the RRS, the name of the portfolioFootnote 16, the IRB asset class that is covered by the RRS, and the date of self-assessment.
The Mini-Scorecard should be completed for each risk parameter in the RRS. It is divided into three sections: (1) Rating Grade Description for Internal Estimates, (2) Developmental Evidence for Internal Estimates, and (3) Validation of Internal Estimates.
In Section 1 (Rating Grade Description for Internal Estimates), the institution will provide general information on individual portfolios. A summary of what is required is listed below in Exhibit 14.
In Section 2 (Developmental Evidence for Internal Estimates), the table is made up of six columns: Developmental Evidence Items, Internal Data, Mapping to ECAIs, Statistical Model, Pooled Data, and Other. The institution will populate all applicable columns. A detailed summary of what is required is listed below in Exhibit 15.
In Section 3 (Validation of Internal Estimates), the table has six columns: Developmental Evidence Items, Internal Data, Mapping to ECAI, Statistical Model, Pooled Data, and Other. The institution will populate all applicable columns for each internal estimate. A detailed summary of what is required is listed below in Exhibit 16.
This template integrates information on the Vendor Model Inventory for retail and non-retail asset classes. See Figure 3 for an overview of the Vendor Model Inventory structure.
Schedule 6 (Vendor Model Inventory) applies only to those vendor models that are material and relevant to the internal risk rating systems of the institution. On the front page, the institution will provide its name and the date when the vendor model template was populated or updated.
The Vendor Model Inventory contains two worksheets: (1) Vendor Non-Retail Models, and (2) Vendor Retail Models. The Vendor Non-Retail worksheet applies to corporate, bank, sovereign, specialized lending, and purchased receivables asset classes. The Vendor Retail worksheet applies to retail mortgages, QRE, and other retail asset classes.
The information requested in the Vendor Non-Retail Models and the Vendor Retail Models worksheets is identical. However, the respective responses should be customized for non-retail and retail asset classes.
Each worksheet contains a sufficient number of tables for analysis of three such models. If the institution requires more tables because it uses more than three vendor models, it should copy and paste additional tables to provide a complete listing overall. A description outlining the required fields is given in Exhibits 17, 18, 19, and 20, below.
Size of portfolio
Indicate the size of the portfolio in $ billion covered by the model.
Indicate the size of the portfolio in terms of the number of facilities/borrowers covered by the model.
Schedule 7 (the RRS Summary) provides an overview of all RRSs used by the institution as well as all exemptions, extensions and waivers. This is a summary table that integrates information from all AIRB Scorecards and gives a snapshot of distribution of credit assets by RRS. On the front page, the institution will complete summary information, such as the institution name and the date on which the last self-assessment was performed.
The template provides space for three RRSs. If the institution uses more than three RRSs, it should copy and paste additional rows into the table and provide the information, as applicable. The template provides additional rows for waivers, extensions and exemptions. Refer to Section 2.4 for the definitions.
At the bottom of the table, the institution will provide information on total amounts of gross credit assets and total IRB credit risk-weighted assets covered by all RRSs in Columns 9, 10, 11, and 12. If the institution adds additional rows, it should ensure that the total sums all assets listed under the various RRSs, waivers, extensions and exemptions. The institution will use QIS5 figures for 2006 submission and BCAR figures for 2007 submission.
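The reconciliation described above can be illustrated with a short sketch. This is a hypothetical illustration only, not part of the OSFI template: all RRS names and dollar figures are invented, and the comparison figure stands in for the institution's QIS5 (2006) or BCAR (2007) number.

```python
# Hypothetical Schedule 7 totals check: the sum of gross credit assets
# reported per RRS, plus rows for waivers, extensions and exemptions,
# is compared to the QIS5/BCAR figure; any residual difference must be
# explained in the submission. All names and figures are illustrative.

rows = {
    "RRS-Corporate": 120.0,                  # gross credit assets, $ billions
    "RRS-Retail-Mortgages": 85.0,
    "RRS-QRE": 30.0,
    "Waivers/Extensions/Exemptions": 10.0,
}

total = sum(rows.values())
bcar_figure = 247.5  # illustrative comparison figure (BCAR for 2007)
difference = total - bcar_figure

print(f"Total across RRSs: {total:.1f}")        # → 245.0
print(f"Difference vs. BCAR: {difference:+.1f}")  # → -2.5
if abs(difference) > 0.0:
    print("Difference must be explained in the submission.")
```

The same check applies to each of Columns 9 through 12; adding rows for additional RRSs simply extends the sum before the comparison is made.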
The RRS Summary table has 12 columns. A description outlining the required fields is given in Exhibit 21, below.
OSFI recognizes that the approval process is dynamic and that many activities will occur after the Formal Application Date (i.e., February 1, 2006). Consequently, the institution’s self-assessment package will also need to be dynamic to better reflect the current status of the institution’s implementation efforts.
To streamline the process and to identify critical areas for monitoring and updating, different sections of the self-assessment template will require update at different stages of implementation and at various frequencies to remain current. Consequently, a timeline table has been incorporated to assist the institution with this requirement. Refer to Exhibit 22 for details.
For documents that need to be updated annually, the following approach should be used. If there are any material changes to the self-assessment documents, OSFI expects the institution to submit a formal notification of these changes on a timely basis (no less than once a year). If there are no material changes to the self-assessment documents, the institution needs to confirm this by notifying OSFI. In such instances, the institution need not re-submit the related self-assessment documentation.
Non-Retail – February 1, 2006
Retail – October 31, 2006
October 31, 2007
February 1, 2006 – one set of PDs and LGDs from one non-retail risk rating system that is the most meaningful and material to the institution.
October 31, 2006 – one set of PDs and LGDs from one retail risk rating system that is the most meaningful and material to the institution.
October 31, 2007 – submission of all risk parameters for all RRSs.
February 1, 2006 (as at October 31, 2005)
October 31, 2007 (as at July 31, 2007)
Footnote 1: Banks and bank holding companies to which the Bank Act applies and federally regulated trust or loan companies to which the Trust and Loan Companies Act applies are collectively referred to as “institutions”.

Footnote 2: As defined in the Implementation Note, the various phases of approval are: Phase 1: Monitoring of institutions’ implementation efforts; Phase 2: Formal application and preparation for ‘meaningful’ parallel reporting; Phase 3: ‘Meaningful’ parallel reporting and completion of OSFI review for approval; Phase 4: Approval for Pillar 1 credit risk capital purposes; Phase 5: Monitoring of ongoing compliance (see Appendix I for further details).

Footnote 3: Submission of a signed application confirms an applicant’s consent for any information provided to be shared with other regulators for the purposes of the approvals process.

Footnote 4: All documents listed in the package are to be submitted together, except for the Chief Auditor’s Letter, which is required to be submitted to OSFI by March 31, 2006.

Footnote 5: Some of the documents, such as the description of the self-assessment process, the description of waivers, extensions and exemptions, and the vendor model inventory, should be re-submitted by October 31, 2007 if any material changes have occurred since their last submission.

Footnote 6: See OSFI’s Implementation Notes on 2007/2008 Approval of IRB Approaches for Institutions, Corporate Governance and Oversight at IRB Institutions, Risk Quantification at IRB Institutions, Collateral Management Principles for IRB Institutions, Data Maintenance at IRB Institutions, The Use of Ratings and Estimates of Default and Loss at IRB Institutions, and Validating Risk Rating Systems at IRB Institutions.

Footnote 7: Exemptions only apply to those asset classes, business units and/or legal entities that are deemed immaterial and that will therefore report under an alternative Pillar I approach to credit risk.

Footnote 8: Extensions apply to those material portfolios outside of Canada and the U.S. that are subject to a three-year transition period to roll out the IRB approach.

Footnote 9: Waivers only apply to those material portfolios that are expected to be AIRB-compliant by the start date of the new Basel framework.

Footnote 10: Institutions should follow the definitions of exposure classes in the CAR Guideline A-1, Section B1, Categorization of Exposures. Please refer to the following paragraphs for a definition of each exposure class: corporate exposures, par. 218; sovereign exposures, par. 229; bank exposures, par. 230; retail exposures, par. 231; qualifying revolving retail exposures, par. 234; equity exposures, par. 235; and eligible purchased receivables, par. 239-241.

Footnote 11: If the wholesale RRS covers several asset classes, including specialized lending (SL), and the bank populates one RRS Scorecard for several asset classes, SL should be assessed using the templates.

Footnote 12: This information should not be considered a substitute for the complete and comprehensive model documentation that the institution is required to maintain under the CAR Guideline A-1.

Footnote 13: Portfolio definition is based on the RRS coverage of asset classes. If one RRS covers one IRB asset class, the portfolio and asset class should be identical for self-assessment purposes. For non-retail exposures, if one RRS covers several IRB asset classes, the portfolio definition should clearly state which asset classes are covered by the AIRB RRS Scorecard.

Footnote 14: For example: other projects are highly dependent on the completion of this requirement; a significant work effort is required; and/or significant technological solutions are required. Refer to the extended list of examples used for the gap analysis definitions in 2003-2004.

Footnote 15: Reversible in relation to the IRB rollout plan submitted to OSFI.

Footnote 16: Portfolio definition is based on the RRS coverage of asset classes. If one RRS covers an entire IRB asset class, the portfolio and asset class should be identical for self-assessment purposes. If one RRS covers several asset classes, the portfolio should be one asset class covered by the RRS Scorecard.

Footnote 17: Institutions are expected to submit an initial self-assessment by February 1, 2006, based on a self-assessment as at October 31, 2005. Annual updates should be submitted to OSFI by the subsequent 2006 and 2007 anniversary dates. Updates required for regular gap analysis reviews should be submitted to OSFI in accordance with the respective gap analysis schedule in 2006 and 2007.