these issues to closure. It is important that your issues management
program ensures the appropriate level of management oversight. As
part of the process, the institution should be able to identify where
an issue was detected. Categories that should be considered include:
■ Self-identified—Issues that a business or department has
discovered during the normal course of business. These can
include system-generated exception reporting or line of business (LoB) quality control functions.
■ Testing—Issues found through compliance and/or LoB testing of
processes and controls.
■ Complaints—As part of your complaints program, issues that are
systemic in nature or may be considered a regulatory violation
should have a mechanism to be vetted and tracked as part of
your institution’s issues management program.
■ Risk Assessments—Deficiencies detected as part of annual
risk assessment processes. Ongoing risk assessments should
also feed the issues management program.
In addition to tracking how an issue was detected, other categories that should be collected include, but are not limited to, the:
■ nature/description of the issue;
■ impacts (customer, financial, regulatory);
■ rating scale;
■ remediation plan(s); and
■ update mechanism to ensure consistent tracking of issue closure.
Audit and examination findings may be tracked separately
from other issue types.
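The tracked fields listed above can be pictured as a simple record. The following Python sketch is purely illustrative: the field names, the numeric rating scale, and the past-due check are assumptions for this example, not a regulatory standard or a prescribed tool.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical issue record capturing the categories discussed above.
# Field names and rating scale are illustrative assumptions only.
@dataclass
class Issue:
    description: str                  # nature/description of the issue
    source: str                       # self-identified, testing, complaint, risk assessment
    impacts: list[str]                # e.g., customer, financial, regulatory
    rating: int                       # severity on the institution's own rating scale
    remediation_plan: str
    target_date: date                 # stated remediation plan date
    closed_on: Optional[date] = None  # set only once closure is validated

    def is_past_due(self, today: date) -> bool:
        # An issue is past due if it remains open beyond its target date.
        return self.closed_on is None and today > self.target_date
```

Whether the record lives in a spreadsheet row or a database table, capturing the same fields consistently is what makes the later reporting possible.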
The incidence and content of enforcement actions, civil money penalties (CMPs), and settlement agreements strongly suggest that overall compliance ratings are down.
The institution should have a documented process that demonstrates to regulators the issue life cycle, including birth, life,
and death. Documenting the life cycle helps regulators
understand the process by which issues are identified, remediated,
and closed. It also allows them to see what issues the institution has identified and
how far the institution has come in remediating them.
Regulators also want to see the closure of issues. They will ask questions such as: What are your validation requirements? Does compliance
validate? Does audit validate? Who determines when an issue is
finally closed? What controls are in place to ensure that issues are
appropriately closed?
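One way to picture these validation questions is as a gated set of status transitions, where an issue cannot reach "closed" without passing through an independent validation step. This Python sketch is a hypothetical illustration of that idea, not a prescribed workflow; the status names and transition rules are assumptions.

```python
# Hypothetical closure workflow: an issue may be marked closed only
# after an independent function (compliance or audit) validates the
# remediation. Failed validation sends the issue back to open.
ALLOWED_TRANSITIONS = {
    "open": {"remediated"},
    "remediated": {"validated", "open"},  # validation may fail and reopen
    "validated": {"closed"},
    "closed": set(),                      # terminal state
}

def transition(status: str, new_status: str) -> str:
    """Move an issue to a new status, enforcing the life-cycle gates."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"cannot move from {status!r} to {new_status!r}")
    return new_status
```

Encoding the gates this way answers the examiners' questions by construction: no path exists from "remediated" directly to "closed" without a validation step in between.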
Like a complaint program, an issues management program
should be commensurate with your institution’s size and complexity. It may be as simple as a spreadsheet or as complex as a
custom database. The tool used to track issues is not necessarily
as important as the processes that support it. Ensuring the appropriate level of management oversight through regular reporting
is important. Doing this allows management to understand key
factors, such as what the issues are, where each issue stands in the remediation
process, which issues are approaching their stated remediation plan
dates, and which issues are past due.
The question we started with was: “Do the old established
exam management principles still apply?”
The answer is a resounding “yes,” in large part because the
principles are grounded in well-established experience over prior
periods of industry challenge and crisis. What we have presented
merely suggests that specific execution under these principles can
be modified as the environment evolves. Now that’s manageable! ■
Footnotes
1 National supervisory rating data is included in the 12th District Banking
Profile, a banking conditions summary report published by the Federal
Reserve Bank of San Francisco: http://www.frbsf.org/publications/banking/index.html
2 The CAMELS rating system is used to assess the soundness of banks on a
uniform basis during financial examinations. The rating system includes
six components: Capital; Asset Quality; Management; Earnings; Liquidity;
and Sensitivity to Market Risk. The component and overall composite
rating categories are as follows: 1—Strong; 2—Satisfactory; 3—Less than
Satisfactory; 4—Unacceptable; and 5—Critically Deficient.
3 The Federal Reserve Board reports annually on compliance with consumer
protection laws by entities supervised by federal agencies. The agencies
include the 12 Federal Reserve Banks, the FFIEC member agencies, and
other federal enforcement agencies. The exam data reporting periods are
as follows: 2010 annual report—July 1, 2009, through June 30, 2010; 2011
annual report—July 1, 2010, through June 30, 2011.
ABOUT THE AUTHORS
BONITA G. JONES, president of San Francisco-based Bonita
Jones & Associates, LLC, is a retired principal in the Banking
Supervision and Regulation Division of the Federal Reserve Bank
of San Francisco. Reach her by e-mail at bonitajon@aol.com or by
telephone at (415) 297-1784.
THOMAS J. HEALY, CRCM, is senior compliance director for
Deposits, eCommerce, and Marketing at Ally Bank in Charlotte. Before
Healy joined Ally in June 2010, he was a compliance and operational
risk manager at Bank of America supporting Deposit Products,
Regulatory Complaints, Americans with Disabilities Act (ADA), and
Consumer Regulatory Relations Examination Management. Prior
to his work at Bank of America, Healy was the compliance officer
at F&M Bank in Granite Quarry, N.C., where he was responsible for
building the bank’s compliance, BSA, and security programs.
Prior to entering the financial services industry, Healy was
a senior consultant with Uniform Computer Recovery, a North
Carolina consulting firm that specialized in business continuity
planning. During his tenure there, the company expanded
its services to include assisting community banks in building
compliance programs to meet regulatory requirements.
Healy serves on the advisory boards and faculty for the ABA
Compliance Schools. Reach him at Thomas.Healy@ally.com.