May 11th, 2012 by crossi
The $2 billion loss on credit default swaps JP Morgan Chase sold on the Markit CDX North America Investment Grade Index is less about its potential to precipitate a crisis in financial markets than it is a stark reminder that even the purported best risk managers in the business can occasionally make huge mistakes. And unlike Long Term Capital Management or Amaranth LLC, where arcane derivatives transactions ultimately took those companies down, this episode of derivatives trades gone wild is more a flesh wound to JP Morgan’s once highly regarded risk management capabilities than it is a systemic risk event. Certainly it raises the question of whether other, less well-buttoned-down banks (or so we thought) have engaged in similar transactions, and it is a virtual certainty that at some point we will observe another headline-worthy derivatives fiasco at a major institution. But what exactly can we take away from the JP Morgan experience? (For some of my additional perspectives please see the pieces below.)
First, it sparks another round of debate over the Volcker Rule with regard to the extent to which banks can use derivatives in their business activities. One thing is for sure: the transaction points out the oftentimes blurred line between speculation and hedging that bedevils regulators in implementing the Volcker Rule. This will ratchet up the debate in Congress, but it still offers no real solution for distinguishing a hedging strategy from a calculated bet.
Moreover, the lack of transparency about the nature of the transactions does not help either. All we know is that the trades were supposed to be hedges against credit losses from other segments of the bank’s business, such as loans to companies, banks and sovereigns. To guard against adverse credit movements in its underlying unhedged position, the bank could have bought CDS protection against a downturn in the economy. Selling CDS, by contrast, does not appear to be a hedge at all, and the picture may be complicated by a host of factors not known at this time, including the effectiveness of the so-called hedge. It is unclear how correlated the CDS position was with the aggregate unhedged position, which could have introduced significant basis risk as well.
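To see why that correlation matters so much, a minimal sketch of a minimum-variance hedge may help. All figures below are hypothetical and bear no relation to JP Morgan's actual positions; the point is only that the risk left over after hedging (the basis risk) grows quickly as the correlation between the hedge instrument and the underlying exposure falls:

```python
# Illustrative sketch only (hypothetical numbers, not any real book):
# how basis risk erodes hedge effectiveness as the correlation between
# the hedge instrument (e.g., an index CDS) and the underlying
# exposure declines.

def min_variance_hedge(sigma_u, sigma_h, rho):
    """Minimum-variance hedge ratio and residual (basis) risk.

    sigma_u: volatility of the unhedged position's value changes
    sigma_h: volatility of the hedge instrument's value changes
    rho:     correlation between the two
    """
    h = rho * sigma_u / sigma_h               # optimal hedge ratio
    residual = sigma_u * (1 - rho**2) ** 0.5  # risk left after hedging
    return h, residual

for rho in (0.95, 0.80, 0.50):
    h, resid = min_variance_hedge(sigma_u=10.0, sigma_h=12.0, rho=rho)
    print(f"rho={rho:.2f}  hedge ratio={h:.2f}  residual risk={resid:.2f}")
```

On these assumptions, a 0.95 correlation still leaves about 31% of the position's standalone volatility unhedged, and a 0.50 correlation leaves about 87%. An index overlay against a sprawling loan book can easily sit toward the low end of that range.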
There are serious concerns regarding the level of risk and the regulatory oversight of these transactions. Since the trades have been reported to originate out of the Chief Investment Office under Bruno Iksil, it is interesting that this group had come under media scrutiny only a month earlier. So what I’d like to know is: what were the risk management group and regulators doing during that period? Why weren’t the trades in the group being dissected and analyzed well before now? It isn’t as if this was news. With the group’s positions in these particular CDS sitting at about $150 billion, representing between one-third and two-thirds of the total market notional value of that CDS, I find it odd that the company would have permitted that large an exposure. In other words, what position limits were in place for this group? That much concentration introduces all sorts of counterparty and liquidity issues and makes it extremely difficult to unwind the transactions at those levels. Again, this is a bit of déjà vu when thinking about Amaranth LLC’s situation: in that case, aggressive traders racked up impressive short-term performance by increasing positions in complex derivatives, facilitated by poor models and weak risk oversight. Sound familiar?
It is interesting that reports are circulating that JP Morgan had just moved to a new value-at-risk (VaR) model in the first quarter of 2012 (a tool risk managers use to gauge potential losses at a given confidence level), which was found to be inadequate, forcing the company to go back to the original model. So we have yet another example of risk models that were not well developed and failed to detect elevated risk levels ahead of a problem.
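For readers unfamiliar with the tool, a bare-bones historical-simulation VaR can be sketched in a few lines. This is a deliberately simplified illustration on simulated P&L; the VaR models banks actually run are vastly richer, which is precisely why a model change can swing reported risk as dramatically as described above:

```python
import numpy as np

# Minimal historical-simulation VaR sketch on simulated daily P&L.
# The data and parameters here are invented for illustration.
rng = np.random.default_rng(0)
pnl = rng.normal(0, 1e6, 500)  # 500 days of hypothetical daily P&L, in USD

def hist_var(pnl, confidence=0.99):
    """VaR = the loss not exceeded on `confidence` fraction of days."""
    return -np.percentile(pnl, 100 * (1 - confidence))

print(f"99% 1-day VaR: ${hist_var(pnl):,.0f}")
```

Everything interesting in a real model hides inside how that P&L history is constructed (positions, risk factors, lookback window), which is where model choices like JP Morgan's can materially change the reported number.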
The good news is that markets will not crumble as they did in 2008 because of this latest derivatives mess. But a few important points are worth highlighting. First, despite the misstep by JP Morgan, derivatives are essential tools for managing risk and should be available to banks under tight controls. In other words, derivatives don’t kill markets; people kill markets. Getting the governance, incentives, processes and controls in place for using derivatives is what is needed. Clearly even well-run shops have trouble with this, and so we need smart regulation that allows derivatives to be used for hedging risk but reduces the over-engineering that often takes place. Second, a firm can be burned as much by its assumptions and views of where the market may head as by the complexity of the transaction. Thus a greater appreciation for the art and the science of derivatives must occur at banks. Third, this latest derivatives problem underscores the need for strong and vigilant risk management. Finally, the need for strengthened micro-prudential regulation is clear. Much time has been spent since the crisis on enhancing macro-prudential regulation, but safety and soundness regulators face an enormous task in ferreting out risk time bombs with the staffing levels and expertise currently in place. The spectre of the 2008-2009 financial crisis has heightened our sensitivities to bank mistakes, and the attention to JP Morgan is understandably justified. We need to use this as a case study for crafting stronger risk management and regulatory support, and implement such enhancements quickly.
April 27th, 2012 by crossi
Over the last week, student loans have crept back into the national spotlight, in good measure driven by presidential politics and Congressional brinksmanship. Although the timing of politicians’ interest in student loans invites skepticism about anything but their political motivation to win the younger vote, these election-year gymnastics should not overshadow a serious set of long-term economic problems facing this country. Behind the seemingly cerebral surroundings of our colleges and universities lies the fact that private and public higher education is a big-money business that has been greatly facilitated by decades of generous federal subsidies for student loans. Most estimates place the outstanding amount of student loan debt at nearly $1 trillion, with the federal government providing 80% of that aid through its Stafford loan programs. The issue confronting our society and elected officials is how to ensure a future for college graduates that allows them the same opportunities as previous generations to purchase a home, save for retirement and fund their children’s educations, while at the same time demonstrating fiscal responsibility at a time when federal bailouts seem to be the norm for industries and homeowners, among others.
The numbers themselves paint a gloomy picture for the US as a whole and for graduates with student loans. Shockingly, Americans 60 and older owe $36 billion in outstanding student loans. At a time when this segment of the population should be solidifying its retirement plans, it remains mired in student loan debt incurred, in many cases, decades earlier. Moreover, the average student loan obligation these days is over $23,000, up more than 25% in the last decade. In 2007, Congress passed legislation that halved the interest rate on federal student loans from 6.8% to 3.4%; that reduction expires July 1st, and its loss would affect 7.4 million borrowers. In the sideshow that is election year politics, agreement that interest rates not be allowed to go back up is a bit surreal at first glance, but it importantly illustrates the gravity of the situation.
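The payment arithmetic behind the rate debate is straightforward. A quick sketch using the standard amortization formula on the $23,000 average balance cited above, over a hypothetical 10-year term, shows what is at stake for an individual borrower:

```python
# Standard amortizing-loan payment; the $23,000 balance is from the
# text, while the 10-year term is an assumption for illustration.

def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment that fully amortizes the loan."""
    r = annual_rate / 12       # monthly rate
    n = years * 12             # number of payments
    return principal * r / (1 - (1 + r) ** -n)

for rate in (0.068, 0.034):
    pmt = monthly_payment(23_000, rate, 10)
    print(f"rate={rate:.1%}  payment=${pmt:.2f}  total paid=${pmt * 120:,.0f}")
```

On these assumptions the difference works out to roughly $38 a month, or about $4,600 over the life of the loan, per borrower.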
From an economic perspective, providing a well-educated workforce is critical in an increasingly competitive global economy. Innovation and productivity depend on highly educated workers, not just in white-collar jobs but also in manufacturing and other blue-collar positions. The fact that 90% of parents want their children to go to college is notable and encouraging; however, strong vocational programs are also needed, along with the basic math and science skills in which US children have lagged for years relative to a number of other countries. The surge in 4-year college enrollments in the last decade in some sense resembles the artificial demand in housing that built up in the years preceding the financial crisis due to a variety of market and government-related factors. In the case of higher education, demand was induced in part by significant federal subsidies, which in turn facilitated major tuition increases by colleges over the last decade (a 67% increase in tuition, room and board over this period). These burdens place graduates in a precarious position; with nearly 30% of student loans 30 days or more past due, the product is a difficult one for private lenders to make acceptable margins on, and we have seen some large banks exit the market as a result, leaving the federal government as the primary student lender. But unlike housing, where personal bankruptcy can lead homeowners out from under crushing mortgage obligations, no such relief exists in general for the discharge of federal student debt, placing extraordinary financial pressure on the next generation. To be clear, bankruptcy should only be a final and extreme solution for borrowers facing financial difficulties. However, from a long-term economic growth perspective, the drag on future growth is real and worrisome.
With such burdens, an extremely fragile and still-recovering housing market stands little chance of robust growth if first-time homebuyers are unable to qualify for a mortgage due to excessive non-mortgage debt obligations. Herein lies the dilemma for Congress and the Administration: a legislative fix to the interest rate subsidy doesn’t tackle the large outstanding student debt, and it must be paid for somehow, which means an additional $6 billion in taxpayer support per year at a time when our fiscal house is already falling in on itself. Like Social Security, health care and tax reform, student loans have languished in terms of meaningful legislative reform that is both equitable and budgetarily responsible. At a minimum, Congress and the Administration need to meet halfway, extend the 3.4% cap, and do so in a budget-neutral fashion. More important, a long-term fix to the way we think about higher education subsidies, the cost of securing such an education, and the expectations of students and educators to provide a world-class, highly educated workforce for all economic activity must be a priority for the next administration.
April 12th, 2012 by crossi
This year’s 2012 Atlanta Federal Reserve Financial Markets Conference was aptly titled, “Financial Reform: The Devil’s In the Details” in light of the flurry of activity going on these days at regulatory agencies charged with implementing various aspects of Dodd-Frank.
Among the topics covered were issues related to mortgage finance, shadow banking, the role of maturity transformation in promoting the crisis, and how liquidity regulation may affect this natural role of financial institutions. At the session on maturity transformation, I took a somewhat contrarian position: maturity transformation is easy to disparage at first glance, but, in keeping with the theme of the conference, the devil is in the details on this issue. The simple answer is that excessive risk-taking and the associated growth of risky mortgage products and securities adversely impacted a maturity transformation process that has worked effectively over time. But it seems that the financial crisis has brought about a natural inclination to find fault with many of the most basic financial processes, and with it a tendency to seize on some form of corrective action. It certainly makes us all feel better when we are able to show that we have fixed a process. In many instances, however, we are treating the symptom and not the disease.
For context, in a 2009 speech, New York Fed President William Dudley acknowledged that maturity transformation, i.e., the tendency of banks to transform relatively short-term funding sources into longer-term assets, is a natural and expected part of what commercial banks do. http://www.newyorkfed.org/newsevents/speeches/2009/dud091113.html
Under normal market conditions, banks tend to use insured deposits, augmented where needed with other funding sources, for their asset generation activities. But two issues arose in the years before the crisis that ultimately brought greater risk to the maturity transformation process. First, the form and substance of funding alternatives morphed over time in ways that contributed to the liquidity crunch that ensued in 2007. An increased reliance on residential mortgages and related longer-duration securities as collateral, in place of shorter-term forms of collateral, amplified the maturity transformation process. Examples include asset-backed commercial paper (ABCP) and its use to fund longer-term assets, a departure from the more traditional funding of customer trade receivables with shorter durations. In addition, the tri-party repo market made greater use of whole loans and non-investment grade asset-backed securities that ran into liquidity problems later on. And of course the money market mutual funds (MMMFs) experienced greater volatility under stress from exposures to various investments such as Lehman commercial paper. To be sure, commercial banks bear responsibility for financing so-called shadow banking activities by allowing these entities (e.g., special purpose vehicles, ABCP conduits, MMMFs) to leverage the embedded put option of deposit insurance as well as banks’ access to Fed liquidity sources.
But along the way, the residential mortgage asset bubble accentuated the traditional maturity transformation process by embedding in it significant credit risk through product morphing and tremendous layering of asset risk. As residential mortgages and associated securities gained popularity among banks and nonbanks, the duration gap widened further (absent interest rate risk hedging activities), setting the stage for problems in maturity transformation that ultimately led to a liquidity crisis. Mortgage product morphing and the advent of various nontraditional mortgages, such as negatively amortizing option ARMs and piggyback second-lien products, along with aggressive production targets, added pressure on the maturity transformation process by requiring banks to augment insured deposits with non-deposit funding sources. These shifts in the traditional maturity transformation process facilitated problems later on as markets became unhinged and funding sources evaporated literally overnight. Banks were relatively unaware of the system-level implications of these activities and were effectively caught off guard as to how they would manage this risk.
Unfortunately, one lesson learned the hard way in the crisis is that the banking sector vastly underestimated the significance of liquidity risk exposure. We all went through the process of measuring and managing liquidity risk without fully realizing the broader systemic implications that were brewing. Some of this could be attributed to changes in funding composition, as mentioned earlier. But another lesson was that the perception of liquidity that exists during normal periods is ephemeral: gone in a New York minute at the first whiff of problems. In the aftermath of the crisis, then, it is tempting for regulators to want to impose new standards for liquidity, as we now see coming in the next incarnation of Basel. But liquidity tends to be procyclical. A number of studies point to this, such as Berger and Bouwman (2011), which suggests that liquidity builds up to abnormally high levels preceding a crisis, as it did before 2008 in the form of relaxed underwriting standards, and then crashes after the bubble pops. Moreover, there is some empirical evidence (Allen and Carletti, 2011) of liquidity hoarding by banks during a crisis, suggesting the direction of causation between liquidity and financial fragility might be reversed. (See the Longbrake and Rossi procyclicality study, 2011, referenced in an earlier posting, for the citations for these two studies.)
So what are the implications of these results? First, liquidity appears to amplify lending activity preceding a crisis; its buildup is not a precautionary move but may instead be symptomatic of underlying drivers of systemic risk generally. Thus, imposing a set of required liquidity ratios may help reduce crisis-induced hoarding of liquidity during stress periods, but it has yet to be proven that the benefit of ensuring a minimum level of liquidity across the economic cycle offsets the potential economic drag such a requirement may impose.
So where does this leave us? Increasingly, institutions need to do a much better job of assessing and integrating views of the various risks in their firms. Had the industry imposed a high enough penalty function for illiquidity, and credit risk for that matter, there is a greater likelihood that we would have seen far better results on this dimension than what was ultimately realized. I am reminded of a story about a very large depository that, in response to a serious regulatory problem years before the crisis, overhauled its liquidity risk management process significantly, making it one of the strongest such programs in the industry; the proof of this effort was that the firm weathered the liquidity crisis very well. Stronger oversight of liquidity risk management practices by regulators provides a more flexible yet effective mechanism for ensuring firms have robust liquidity risk management protocols. Finally, firms need to augment existing firm-specific risk management capabilities with activities focused on how systemic risk events could impact their operations. Stress tests certainly provide some measure of this; however, they are not sufficiently comprehensive across risk types to give a full appreciation of how systemic risk events could impact a firm's business.
March 29th, 2012 by crossi
With the November elections closing in, it should not be surprising to see continued political rhetoric surrounding a number of key administration nominations for various regulatory posts. This drama came to light again during Senate Banking Committee nomination hearings this week.
Vacancies at some of the highest levels technically still exist at the Federal Reserve, the FDIC, the OCC, the FHA, and the OFR, for example, which hampers the ability of these organizations to marshal resources and support for their various initiatives. Certainly the administration inflamed the process somewhat by forcing through the Cordray appointment during a Congressional recess. The political posturing, while making for good theatre, detracts from the more substantive issue of how we can expect the agencies to tackle such complex issues as systemic risk identification, long-term housing finance reform, and effective banking reform that ensures smart regulation without strangling markets or promoting excessive risk-taking. It is difficult, if not nearly impossible, to lead an organization through such unprecedented times as financial regulatory agencies face today without confirmed leaders in place who have the endorsement of both the administration and Congress. Further, there has been some noise about nominating individuals from some of the very banking institutions that were entwined in the crisis; however, such views do not appreciate the insight that such experience brings into how financial markets operate. We’ve seen a number of examples in Dodd-Frank of how well-intended policy staff with little real-world experience outside the DC area can devise regulations with unintended consequences. A case in point is the proposed Qualified Residential Mortgage (QRM) provisions, which for various reasons have been put on the proverbial policy backburner due to a number of glitches in the proposed framework. Avoiding conflicts of interest is clearly imperative at all times; however, recruiting the best and most experienced minds from industry has, in the long run, improved policy outcomes overall.
For the collective good, both sides of the political aisle need to ensure we have the right people in place at our regulatory agencies in a timely fashion.
March 21st, 2012 by crossi
The FHFA’s proposal for restructuring the government-sponsored enterprises (GSEs) Fannie Mae and Freddie Mac has important implications for private mortgage insurance companies (PMIs), an industry where the stakes for long-term viability could not be higher. http://www.fhfa.gov/webfiles/23344/StrategicPlanConservatorshipsFINAL.pdf A critical step in the recovery of the secondary mortgage market is shrinking the federal government’s presence in mortgage financing activities. The FHFA has put on the table a couple of alternatives for reinvigorating private capital that rely on various forms of credit enhancement, either through securitization structures or variations on traditional private mortgage insurance contracts. As the FHFA works through these options, it is important that it and other policymakers not write off private mortgage insurance as a potentially viable form of credit enhancement based on the weakened state of the industry. To be sure, there are many lessons to be learned from the mortgage crisis that could vastly improve the long-term viability of private mortgage insurers during periods of extreme credit stress. However, the concept of private mortgage insurance has merit as an effective credit enhancement that could well play a significant role in a newly designed housing finance system.
PMIs are often referred to as creatures of the GSE charters, which specifically require both Fannie and Freddie to obtain insurance for any loan above an 80% loan-to-value (LTV) ratio. In this capacity, for decades leading up to the mortgage crisis, PMI companies operated through all economic cycles, providing effective credit enhancement for a traditionally higher-risk segment of the conventional conforming mortgage market. The historically strict capital requirements imposed on PMIs (a maximum 25:1 risk-to-capital ratio) contributed largely to this long period of stable performance leading up to the crisis. Further, beyond providing a vehicle for private capital to enter the mortgage finance system, PMIs through their capital requirements provide a countercyclical buffer in the event of market downturns and thus ongoing stability during times of stress. With private capital an important criterion for restructuring the housing finance system, credit enhancements that provide countercyclical responses in down markets should be featured in any new structure.
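The 25:1 constraint mentioned above is simple to state: risk in force cannot exceed 25 times statutory capital. A toy check, with entirely hypothetical figures, looks like this:

```python
# Sketch of the risk-to-capital test discussed above (the historical
# 25:1 cap). The dollar amounts are hypothetical, not any real
# insurer's balance sheet.

def risk_to_capital(net_risk_in_force, statutory_capital):
    """Risk-to-capital ratio; regulators historically capped this at 25:1."""
    return net_risk_in_force / statutory_capital

ratio = risk_to_capital(net_risk_in_force=20e9, statutory_capital=1e9)
print(f"risk-to-capital = {ratio:.0f}:1  {'OK' if ratio <= 25 else 'breach'}")
```

The countercyclical effect follows directly from the formula: as credit losses deplete capital, the ratio rises toward the cap and forces the insurer to slow new risk writing, while in good times retained capital creates headroom.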
There is no question the mortgage insurance industry has been under extreme duress since the crisis, with several firms no longer able to write new insurance due to their weak capital positions. This situation has led many observers to call for the eventual demise of the industry as legacy PMIs implode under the weight of massive credit losses sustained after the housing bust. But while the industry is in apparent dire straits, it may be premature to pronounce dead an industry that could provide an important way forward in housing finance. And it is important to remind ourselves what brought on this problem. Fundamentally, the losses sustained by PMIs trace back to fundamental issues of moral hazard and adverse selection, compounded by competition among mortgage insurers. In a paper on the PMI industry, I trace these issues in part back to market power exerted by the GSEs and large mortgage originators through industry consolidation. It is hard to call the PMIs unwilling victims of the GSEs and large lenders when they accepted the risk; however, this points to a weakness in the system generally, not one endemic to PMIs, that must be remedied in any post-crisis housing finance system. Partly this can be addressed through stronger underwriting criteria. Conceptually, the qualified residential mortgage (QRM) provisions that are part of the Dodd-Frank risk retention requirements are a mechanism for enforcing greater discipline on credit risk-taking, although the early proposals were overly restrictive. For many years leading up to the housing boom, US mortgages could be described as fairly ordinary from a credit risk perspective, with few exotic features such as low documentation or layered risks. This well-controlled credit risk profile kept the industry out of major trouble for many years, and any replacement for the GSEs must assure a certain level of credit quality.
A number of legacy PMIs saddled with credit losses from the boom have faced the cruel reality of a truly private market and have ceased doing business. As unfortunate as this outcome is, it reminds us of why it is so important to promote private mortgage investment; namely that failure is not rewarded by government bailouts. In this regard, the PMI industry has at least shown a stark contrast between itself and the GSEs rescued by the good graces of the US taxpayer.
The FHFA’s intention to consider deeper MI arrangements as part of its proposal for restructuring the housing finance system indicates a willingness to put all viable options on the table. And capital markets also seem to be signaling, in small ways, their support for mortgage insurance through the ability of two new entrants to the market to raise capital. Those who dismiss the virtues of a viable private mortgage insurance industry because of inherent flaws in the structure of the legacy mortgage finance system reject the possibility of an effective form of credit enhancement that promotes a deep and liquid market for mortgages financed largely by private capital.
March 14th, 2012 by crossi
The much-awaited results from the Fed’s latest stress tests on large banks, formally known as the Comprehensive Capital Analysis and Review 2012 (CCAR), must be put into context given the effort’s massive dependence on the models and assumptions used to determine each firm’s capital position.
The Fed’s report implies much more precision than its analysis can support. Upon close inspection of the methodology for generating the stress scenario projections, it is clear that much work remains for the Fed to develop a robust stress testing capability. Conceptually, the approach of applying a consistent analytical methodology and set of assumptions to all banks is sound, and the Fed has made significant strides in developing this capability since its first try in 2009 with the Supervisory Capital Assessment Program (SCAP). Nevertheless, these latest stress tests are fraught with the same kind of model risk that has historically plagued banks.
The first issue relates to the Fed’s approach to assessing losses on bank accrual portfolios. An industry-level model is developed against which all bank portfolios are measured. While this establishes a common set of risk factors describing the various loan types, it tends to wash out bank-specific effects; that is, it tends to make riskier accrual portfolios appear less risky than they are, and vice versa. This occurs because data are aggregated across the industry rather than used to build separate bank-specific models, which would be even more resource intensive than the massive stress test effort already is. Technical workarounds to this problem are possible but beyond the scope of this discussion. A second issue relates to the assessment of operational risk and mortgage repurchase losses. Providing reliable estimates of these risks is notoriously difficult, as even the Fed admits in its methodology document: the lumpiness of such events and their low-frequency, high-severity nature make statistically reliable estimation extremely difficult. A third issue is what the stress test doesn’t include. Surprisingly, it does not incorporate what many claim contributed to the financial crisis of 2008 in the first place, namely liquidity risk. The absence of liquidity risk assessment from CCAR is puzzling given the amount of discussion it has received since the crisis and the focus on separate liquidity requirements in Basel III. A fourth issue with CCAR is its over-reliance on a single stress path. The stress scenario selected by the Fed features 25 macroeconomic factors tracked over the stress period, yet nowhere in the methodology is the sensitivity of various positions, revenues and expenses to this single scenario described.
For instance, in assessing counterparty risk exposure, the Fed must make assumptions about the asset correlations that drive obligor defaults. If we have learned anything about the models used before the crisis, it is that correlations are not static. How are we to know whether credit losses in some part of the portfolio, for example, are overly sensitive to one or more of these factors, thereby introducing instability into the estimates? The Fed notes that it conducted an independent model assessment using economists from across the Federal Reserve System; for such a critically important exercise, it ought also to have engaged outside scholars, economists from other regulatory agencies, and industry practitioners with the appropriate analytical and banking subject matter expertise. One of the strengths of CCAR is also its weakness, namely its complexity. The methodology is a testament to large-scale empirical modeling efforts that can easily engender the kind of overconfidence in results that befell such efforts before the crisis.
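To illustrate why static correlation assumptions are dangerous, consider a one-factor (Vasicek-style) credit model, a standard textbook setup rather than the Fed's actual machinery, with illustrative parameters. The default rate conditional on the same two-standard-deviation downturn more than doubles as the assumed asset correlation moves from 5% to 30%:

```python
from statistics import NormalDist

# One-factor (Vasicek-style) sketch of correlation sensitivity.
# All parameters are illustrative; this is not the CCAR methodology.
N = NormalDist()

def stressed_default_rate(pd, rho, z):
    """Default rate conditional on a systematic factor draw z (negative = downturn).

    pd:  unconditional probability of default
    rho: assumed asset correlation
    """
    return N.cdf((N.inv_cdf(pd) - rho**0.5 * z) / (1 - rho) ** 0.5)

for rho in (0.05, 0.15, 0.30):
    dr = stressed_default_rate(pd=0.02, rho=rho, z=-2.0)  # 2-sigma downturn
    print(f"rho={rho:.2f}  stressed default rate={dr:.1%}")
```

The same macro scenario thus produces very different loss estimates depending on a parameter that, as the crisis showed, is itself unstable.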
In light of these limitations, what should we make of this week’s CCAR results? The first takeaway is that the Fed has at least made a heroic attempt to provide a comprehensive picture of large bank vulnerabilities to significant stress events. It isn’t clear whether the Fed is running the right stress test, or even what that would mean. But the exercise does provide a common benchmark, against a single stress event, for comparing bank performance in the spirit of Vikram Pandit’s bank risk model proposal. Investors and others closely following the numbers should realize, however, that the capital results generated by CCAR are subject to variability due to inherent model sensitivities. The effort remains a black box to all but the Fed, and until the veil is lifted on the models and assumptions, its results should be viewed with a healthy dose of skepticism.
March 12th, 2012 by crossi
The financial crisis reinvigorated discussions of the importance of systemic risk analysis in warding off potential market crashes, leading to new oversight structures such as the Financial Stability Oversight Council and its research affiliate, the Office of Financial Research (OFR). Moreover, a host of other regulatory agencies in the US and abroad have begun efforts to strengthen their focus on systemic risk analysis. But risk managers at individual large financial institutions should also consider leveraging the progress made in measuring systemic risk and building it into the processes used for firm-level risk assessment.
The case for integrating systemic risk analysis into firm-level financial risk management practices is rather straightforward. First, traditional financial risk management practices are inwardly focused, which is necessary to fully understand the breadth of risk-taking across the enterprise. In doing so, however, risk management functions can adopt a “missing the forest for the trees” mentality, where systemic risks building across the industry and global markets can have, as we have painfully witnessed, catastrophic effects on the individual firm. Many firms preceding and during the crisis had access to macroeconomic indicators and specialized market performance metrics; however, such information was insufficient to provide any warning of a buildup of risk across the entire financial system. Second, a number of systemic risk measures could provide additional insight into an institution’s relative contribution to systemic risk, potential linkages between institutions that manifest in greater risk exposure, and market-specific systemic risk exposure. Such metrics could augment firm counterparty risk assessment and business line risk management, as well as help shape internal strategies to guide risk-taking and risk mitigation under varying economic conditions at the executive committee and board levels.
The degree of interconnectedness of systemically important financial institutions (SIFIs) has been well documented since the crisis, requiring that risk managers going forward adopt a much broader view of risk than traditionally exists within the firm. This is not meant to abandon or in any way diminish the activities in place for enterprise risk management (ERM), but to add an important missing link to the discipline for large institutions. But what exactly do we mean by implementing a systemic risk assessment process within traditional ERM? With so much focus on systemic risk these days, the phrase suffers from overuse as a blanket term for a broad spectrum of factors contributing to a breakdown of the overall financial system. To give a better idea of the breadth of scope in defining and measuring systemic risk, the OFR published its first working paper cataloging 31 measures of systemic risk.
The measures fall into a highly diverse typology that includes network measures, stress tests, macroeconomic measures, cross-sectional measures, and illiquidity and insolvency measures. Deciding which of these deserve a financial institution’s attention can seem daunting for those unfamiliar with the specific measures outlined in the document, but a few seem particularly appropriate for inclusion in a systemic risk assessment framework. Three areas into which firms clearly should have had better insight in the period preceding the crisis are liquidity risk, counterparty risk and capital. Understanding how individual institutions contributed to systemic risk along each of these dimensions would have helped firms better anticipate and react to conditions ahead of any problem. However, many systemic risk measures are estimated at a point in time and, while useful as benchmarks, need to be augmented with dynamic measures capable of tracking changes in liquidity and capital, among others, as conditions deteriorate. Systemic Expected Shortfall (SES) is one example of a static measure of a firm’s impact on systemic risk, while others such as Marginal Expected Shortfall (MES) and leverage (LVG) can be used as leading indicators of systemic risk for specific firms. Other measures that firm risk management teams could apply include CoVaR and Co-Risk, which capture the interconnectivity between financial institutions. While clearly not an exhaustive list, the point is that firms should be working with their economics office to determine which metrics belong in an ongoing systemic risk reporting package.
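To make one of these concrete: MES is commonly estimated as a firm’s average equity return on the market’s worst days. Below is a minimal sketch in Python; the 5% tail cutoff and the simulated returns are illustrative assumptions, not a prescribed calibration.

```python
import numpy as np

def marginal_expected_shortfall(firm_returns, market_returns, q=0.05):
    """Estimate MES: the firm's average return on the market's worst
    q-fraction of days. A more negative MES implies greater exposure
    to system-wide distress."""
    firm_returns = np.asarray(firm_returns)
    market_returns = np.asarray(market_returns)
    cutoff = np.quantile(market_returns, q)   # empirical tail threshold
    tail_days = market_returns <= cutoff      # the market's worst days
    return firm_returns[tail_days].mean()

# Illustrative use with simulated daily returns for a high-beta firm
rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 1000)
firm = 1.5 * market + rng.normal(0.0, 0.005, 1000)
print(marginal_expected_shortfall(firm, market))  # negative: losses in the tail
```

A dynamic version of this metric, estimated on a rolling window, is one way to turn a point-in-time benchmark into the kind of leading indicator described above.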
To be useful to firms, the measures need to be implementable, meaning in large part that relevant data exist on which to estimate them. In many cases data are readily available for these measures, though most carry limitations in their utility. But waiting for the best metric to come along can be a worse decision than implementing an imperfect process now to measure systemic risk and establishing a review process as part of the regular risk dialogue. Certain metrics in the working paper may also be useful to specific industries and firms, such as metrics assessing crowded trades in currency funds, equity market illiquidity metrics, and other metrics designed to measure systemic risk in hedge funds. Momentum has been building since the crisis for firms to provide greater transparency around their risk to the financial system. The release by the Fed this week of stress test results for the largest US banking institutions illustrates the new regime of periodically assessing the strength of these institutions. Forward-looking financial institutions will find it in their long-term interest to start developing a process for analyzing the systemic risk of important counterparties, industry segments and markets generally.
February 22nd, 2012 by crossi
Yesterday marked an important milestone in the mortgage market with FHFA’s announcement of what may lie ahead for the GSEs. For the first time, government officials outlined a coherent strategy for transitioning the two mortgage giants, Fannie Mae and Freddie Mac, indicating that the limbo in which secondary markets have sat ever since the two entities entered conservatorship in 2008 may be approaching an important turning point. Don’t expect vast changes in the next year, certainly not with a Presidential election heating up. But the fact that FHFA put this plan forward marks a critical moment for the mortgage market, as facilitating a transition to a sustainable mortgage finance system is no easy task.
The plan cites a set of core objectives that FHFA has used to guide the conservatorship process, including mitigating losses to the taxpayer, enhancing homeowners’ access to mortgage finance, and assisting struggling borrowers through the various Administration and other foreclosure prevention programs. Now, FHFA states, the time has come to move to the next phase of the process: building the next-generation secondary market infrastructure, reducing the GSEs’ market presence, and maintaining focus on foreclosure prevention activities. While the plan is not specific on the exact manner in which the transition of the GSEs to another form takes place, a workable approach should entail a well-choreographed set of activities that ultimately leads to a complete wind-down of both GSEs, with them replaced by one of the three options laid out by Treasury and widely supported in the market.
In its plan, FHFA lays out some of the progress it has made in aligning important aspects of the GSEs’ business, including the Servicing Alignment Initiative, the Uniform Mortgage Data Program and the alignment of the GSEs’ rep and warrant policies, among others. Clearly, FHFA believes that consistency and alignment of practices and activities between Fannie and Freddie is worthwhile. Other aspects of the current conservatorship highlighted by FHFA include the need to capture economies of scale in technology and the importance of standardization, while acknowledging the challenge of maintaining focus on the core objectives mentioned above. With these core principles in hand, it is possible to offer a blueprint for an effective transition from today’s conservatorship to a vibrant replacement for the GSEs that meets all of the objectives laid out in FHFA’s plan.
Although a number of possible transition models exist, consider as a glimpse of what could take place the following plan, on a timeframe of 2-3 years. Other than historical legacy, one could argue that maintaining both GSEs in their current form does not meet FHFA’s objectives for mortgage markets. Managing both enterprises in their current form entails a great deal of complexity that works against FHFA’s desired alignment and consistency of GSE practices and activities, despite the progress FHFA has made on this front. Thus, a possible first step would be to merge one GSE into the other, consolidating the various businesses in the process and restructuring them into a logical set of activities lined up against the objectives of FHFA’s plan. This approach has the clear advantage of improving execution of specific strategies such as winding down the retained portfolios, preventing foreclosures and maintaining a stable, liquid market for mortgage finance. It doesn’t hurt that both companies are in need of new leadership, which reduces the management frictions that can accompany such consolidations. Further, FHFA has cited challenges in retaining sufficient talent at both companies to ensure adequate focus on important initiatives, which consolidation could ease. Leaving one GSE in place to absorb the other also helps maintain market stability, since the “acquiring” GSE would honor the other’s outstanding debt. GSE consolidation would facilitate further streamlining of GSE activities, promote greater transparency for lender partners and vendors, and otherwise establish a structure better suited to FHFA’s long-term plan.
Once the consolidation is complete, the second step would be to reconfigure the entities’ businesses to allow single-point accountability for each strategic initiative outlined by FHFA. This could be done by creating four business lines: a legacy business; the current-state single-family securitization business; multifamily; and future-state infrastructure. The legacy business would comprise two parts: one housing the retained portfolios of both GSEs, the other a group focused on foreclosure prevention for legacy assets. The legacy portfolio group would focus entirely on least-cost asset disposition strategies and interest rate risk management. The aggregate single-family securitization activity of the consolidated GSE would continue as today until the future-state infrastructure and the replacement securitization conduit structures were available. However, a single security would be issued, either an MBS or a PC, allowing FHFA to realize its goal of a single mortgage-backed security sooner than its plan anticipates. The securitization entity would house all of the associated processes consolidated from both enterprises, including debt issuance, credit and counterparty risk management, credit guarantee pricing, quality control and sourcing, among others. Given the duplication across these activities at both enterprises, some efficiencies would be gained in addition to complete alignment of policies, procedures and strategy. A consolidated multifamily guarantee business would be created, focused on replacing government guarantees with private ones. The fourth group would be tasked with developing the next-generation secondary market loan systems and infrastructure, including design and deployment of the new servicing platform, pooling and servicing agreements, and data and loan documentation requirements, among other activities.
Elements from other areas deemed duplicative could be reassigned to this group, along with other resources dedicated to such efforts today. Two other important support functions, legacy IT and Operations/Administration, would round out the structure of the consolidated GSE. The existing IT systems of both entities would be used in the legacy group, since it would need to manage assets from both GSEs, while the securitization activity in theory could be conducted from a single GSE platform.
Once the new securitization infrastructure had been developed and tested, the replacement conduits, based on one of the three options laid out by Treasury, would be established and capitalized. This could be facilitated over time by the efforts FHFA mentions to reduce GSE market presence: raising g-fees, establishing loss-sharing arrangements and revitalizing mortgage insurance contracts. Greater use of MI, however, is not without financial and policy challenges, namely the extreme weakness of the MI industry and the proposed Qualified Residential Mortgage (QRM) rules under the Dodd-Frank Act’s risk retention provisions, which are overly restrictive for non-GSE loans. With the new infrastructure and capitalized replacement conduits in place, the consolidated GSE could be placed into receivership and wound down. Having first consolidated and restructured the GSEs would make this somewhat easier. Some parts of the company could conceivably be merged into Ginnie Mae or FHA if deemed valuable to those agencies; the remnants of the legacy foreclosure prevention group are one example.
This hypothetical framework is not without its challenges from an implementation and policy perspective, and it would be up to FHFA to methodically analyze what sequence of restructuring activities is needed and what it would look like before it could be undertaken. Nevertheless, the example above provides a clear way out of the mortgage financing morass that has enveloped the entire housing market since the crisis. The FHFA’s plan signals that change is needed, and the prospect for revitalizing the secondary market now seems for the first time a real possibility.
February 7th, 2012 by crossi
Over the last week there has been a lot of press and associated discussion about Freddie Mac’s use of CMO inverse floaters and whether they give Freddie Mac an incentive to slow its efforts to refinance struggling homeowners. Freddie has also been experimenting with a new debt financing structure, the Mortgage Linked Amortizing Note (MLAN), which some also see as a bet against homeowners seeking to refinance. An American Banker op-ed piece I wrote on this issue appeared today, in which I outline how these derivatives are important instruments for hedging interest rate risk at the GSE, and how the inverse floater position, at $5 billion against Freddie’s $655 billion retained portfolio, is far too small to have justified taking on enormous reputation risk.
Moreover, the MLAN provides a useful way of better matching the duration of the mortgage asset with its liability, and it lessens the reliance on accurate prepayment forecasting. Much has been speculated about Freddie’s derivatives activities as the reason for its poor refinance results to date. But as a control group of sorts, Fannie has not followed in using these types of derivatives (at least not to this author’s knowledge) and has posted equally poor refinance results, suggesting that derivatives are not the reason for the underwhelming results and that we should look to a different, more plausible cause: poor program design and execution. CMO inverse floaters and MLANs may capture our attention as potentially sinister financial instruments of mass destruction, as one oracle suggests, but in this case it looks like some are searching for an answer to a problem that has a much simpler explanation.
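For readers unfamiliar with the instrument, an inverse floater pays a coupon that moves opposite to a reference rate, which is what makes it useful for hedging a position whose value suffers when rates move. A minimal sketch follows; the strike, leverage and floor values are illustrative assumptions, not the terms of Freddie’s actual deals:

```python
def inverse_floater_coupon(reference_rate, strike=0.07, leverage=1.0, floor=0.0):
    """Coupon of a stylized inverse floater: it falls as the reference
    rate rises, and is floored so it never goes negative."""
    return max(strike - leverage * reference_rate, floor)

# The coupon rises as rates fall, and vice versa
print(inverse_floater_coupon(0.02))  # ~0.05: low rates, high coupon
print(inverse_floater_coupon(0.10))  # 0.0: high rates, coupon hits the floor
```

The payoff profile explains the hedging argument in the op-ed: the instrument gains when rates fall, offsetting losses elsewhere in a retained portfolio, rather than serving primarily as a bet on slow refinancing.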
February 4th, 2012 by crossi
This week the Chief Risk Officers of MF Global testified before the House of Representatives Committee on Financial Services Oversight and Investigations Subcommittee regarding the circumstances leading to MF Global’s bankruptcy.
The parallels between MF Global and the mortgage crisis, and the lessons to be learned from each, are uncanny; for this author, it is a bit of déjà vu.
The testimony of these CROs underscores how lapses in risk governance, and in the CEO’s and Board’s support of the CRO, can bring down an institution. A few facts about each CRO’s tenure are instructive. Mr. Roseman preceded Mr. Stockman as CRO of MF Global; he reported directly to the CEO and had direct access to the Board. As part of this responsibility, Mr. Roseman worked with executive management and the Board to implement an enterprise risk management capability in the wake of an unauthorized trading incident. At that point he appeared to enjoy the support of senior management. In 2010, Mr. Roseman reported a number of important shifts in risk coinciding with the arrival of the new CEO. Among these were requests from the business units to raise European sovereign position limits, first set at $1B and then, less than six months later, raised to $1.5-$2B. At about the same time, the Repo-to-Maturity (RTM) transactions were ramping up in size, their growth justified by their “profitability and the importance of generating earnings.” Astounding as it may sound, only a month later the positions had grown to $3.5-$4B, at which point Mr. Roseman was asked to request that the Board again raise the limits, this time to $4.75B. At a subsequent meeting with the Board in November, Mr. Roseman laid out his arguments and analysis supporting his concerns about the risk of these positions. It was at that meeting that the Board, including the CEO, contended that the CRO’s scenarios were implausible. Two months later, Mr. Roseman was informed that he was being replaced. The new CRO, moreover, would no longer report directly to the CEO, but instead was layered under the Chief Operating Officer. This change in reporting underscores the shift in MF Global’s risk culture, and in the stature of risk management within the organization, following the change in CEO.
Once again, the MF Global experience illustrates how important effective risk management is to an organization. But the authority and support of risk management start and end with the CEO and the Board, for it is their responsibility to establish a strong risk culture. In a recent study of risk governance, cognitive biases and incentives, I develop a theoretical model of how risk governance works in relation to the inherent biases of senior management and incentive compensation structures.
In that study I note that several biases can undermine the effectiveness of the CRO, including what is referred to as ambiguity bias. One of a risk manager’s major functions is to help senior management quantify the uncertainty of different outcomes for the firm. A variety of tools are at the CRO’s disposal for this, including the stress tests Mr. Roseman used to assess MF Global’s liquidity risk exposure. However, as this case illustrates, executive management and the Board may differ with the views of risk management, bolstered in part by recent market or firm performance that confirms a view they already hold (confirmation bias) and reinforced by weak governance and incentive structures. In the study I also illustrate how such attitudes toward risk among senior management can ultimately undermine the stature and credibility of risk officers in the eyes of the business, eventually leading to greater risk-taking and/or the weakening of critical risk processes and controls. Mr. Roseman’s testimony about his experience at MF Global aligns completely with this model, which has several implications for public policy.
First, regulation cannot ensure that a company adopts a strong risk culture and the associated risk management practices; that is the responsibility of the Board and CEO. One step that could be taken, however, is to elevate the stature of the CRO position by having it report directly to the Chairperson of the Board Risk Committee, much as the General Auditor typically reports to the Board Audit Committee. This should have been included among Dodd-Frank’s risk management requirements for large companies, alongside its requirements to establish board risk committees and risk expertise on the board. In addition, other players could financially incentivize firms to adopt strong risk management practices, including D&O insurers, rating agencies and even the FDIC through risk-based deposit insurance pricing. Each of these entities includes risk management in some fashion as a component of its assessment, but at this time those efforts appear to underweight the importance of the risk management function itself. Finally, the executive compensation of senior management must have a significant component tied to risk outcomes.
MF Global’s demise in October 2011 points again to the importance of risk governance of banks and nonbank financial institutions. The Dodd-Frank Act has attempted to plug a number of weaknesses in the financial sector that led to the greatest financial collapse since the Great Depression, however, for all of its scope, its attention to strengthening how risk management operates within the organization is limited. MF Global will not be the last large failure to occur based on weak risk governance.