
Wednesday, December 30, 2009
The Fourth Paradigm: Data-Intensive Scientific Discovery

Saturday, December 19, 2009
Risk, Reward and Responsibility: The Financial Sector and Society
A strong financial sector is essential to a modern economy, but private actions can impose enormous costs on taxpayers; a balance must be struck. This column explains why the UK Government believes that there is a case for increasing the costs of risk-taking to banks and their shareholders while reducing those borne by taxpayers.
A strong and competitive financial sector is essential to a productive modern economy. But when banks fail the costs are high. Following the failure of Lehman Brothers in September 2008, governments around the world acted decisively to protect retail depositors, maintain financial stability, and enable banks to continue lending. Without such action the consequences of the financial crisis would have been far worse – but the result was a significant burden on taxpayers.
To reduce the probability of a repetition of these events, G20 finance ministers and leaders have committed to implementing higher regulatory and supervisory standards to guard against excessive risks. The objective of these reforms is to ensure that the costs of failure of institutions are borne by shareholders and other creditors in an orderly way without triggering a systemic crisis. This should limit the circumstances in which any future government intervention is necessary. But in practice there may always be a risk that the potential costs to the wider economy of a systemic crisis will be sufficiently great that government intervention is appropriate. After all, systemic financial crises have been an intermittent but pervasive feature of financial systems throughout recorded economic history.
A key question is therefore how to ensure that any costs of government intervention are distributed more fairly across those in the financial sector who benefit, reflecting the risks and rewards associated with financial intermediation. This should mean ensuring that the costs of bank failure fall primarily to banks and bank investors, rather than taxpayers. A paper published by the UK government last week (HM Treasury 2009) considers in detail options for ensuring banks meet the immediate costs of interventions to prevent systemic failure. The provision of emergency liquidity facilities and of deposit insurance is already well established, so this article concentrates on the potential new proposals in the paper: contingent capital and systemic risk levies. It also discusses the case for additional taxation of the financial sector to ensure that the sector makes a fair contribution to society and broader social objectives.
Contingent capital and capital insurance
The Basel Committee is considering how to reform global capital standards, and banks are likely to have to significantly strengthen the quantity and quality of capital that they hold. In the recent crisis, existing subordinated debt and hybrid capital largely failed in their original objective of bearing losses, so the focus of regulatory capital requirements should be on capital that can absorb losses on a going-concern basis, typically common equity. More equity will enhance the resilience of banks to shocks. However, if equity is insufficient to absorb losses, banks may have to try to raise more. And in a systemic crisis the cost of raising new capital could be prohibitively high, so that banks may find it difficult to raise new capital when they need it most.
Contingent capital or capital insurance held by the private sector could help address this potential issue and supplement common equity in times of crisis. There are a variety of proposals (e.g. Raviv 2004, Flannery 2009) under which banks would issue fixed income debt that would convert into capital according to a predetermined mechanism, either bank-specific (related to levels of regulatory capital) or a more general measure of crisis. Alternatively, under capital insurance, an insurer would receive a premium for agreeing to provide an amount of capital to the bank in case of systemic crisis.
From an economic perspective, the attraction of contingent capital or insurance is clear; creating such instruments, especially if they were traded, would improve both market discipline and market information. Unfortunately, however, as we have seen, it is precisely in a crisis that markets for such instruments – which will essentially be put options sold by investors to banks – can fail. Such instruments may ultimately not be appropriate for many fixed-income investors such as insurance funds which have in the past invested in subordinated debt and hybrid capital instruments; and the systemic stability consequences would need to be explored.
Alternatively, the Government could offer a capital insurance scheme. For example, Caballero and Kurlat (2009) propose that the central bank should issue tradable insurance credits, which would allow holders to attach a central bank guarantee to their assets during a systemic crisis; or the government could set out an explicit arrangement to deliver capital and liquidity to banks in times of a systemic crisis in return for an up-front fee, as proposed by Perotti and Suarez (2009). Trigger events, fees and the equity conversion price would all be set in advance.
Such a scheme would provide greater certainty for market participants about the circumstances and limits of government intervention, while also ensuring government receives an up-front fee for implicit support. However, there would be potential for moral hazard resulting from reduced loss-sharing across subordinated debt and hybrid capital, unless such capital converted to equity ahead of any government injection. In general government-provided capital insurance would seem to be less useful and less flexible than a wider systemic levy or resolution fund, an option outlined below, although capital insurance could be an element of any wider solution.
Systemic risk levies and resolution funds
Since it is impossible to eliminate entirely the risk that some of the costs of intervention will have to be provided by government, there is a case for ensuring that in future the financial sector itself meets these residual costs through a systemic risk levy or resolution fund. A “systemic risk scheme” would have a wider scope than existing deposit-guarantee schemes in that contributions would be sought from all systemically important firms rather than just retail deposit-takers; and wider coverage in that it would fund the costs of restoring financial stability more broadly rather than just the costs of compensating retail depositors. Funds could be used towards the recapitalisation of failing or failed banks, to cover the cost of other guarantees to creditors, or potentially other costs from resolution.
Such a levy might be charged on either a pre- or post-funded basis. It could be weighted towards financial firms which, because of their size or the nature of their business, were a potential source of significant risk to the stability of the financial system. The levy would not fund insurance for any individual firm but rather would support intervention, if needed, to stabilise the system as a whole, hence avoiding some of the moral hazard problems typical of insurance schemes. Given that financial crises requiring direct government intervention are relatively rare in developed economies, any fund would have to be built up over an extended period. When financial crises do occur, however, they can be exceptionally costly to the public finances, and therefore the fund may ultimately need to be quite large.
A key consideration in designing any systemic levy or general resolution fund would be how to assess systemic significance and/or a firm’s contribution to systemic risk, and how to reflect that in any levy. In principle, a well-designed levy could achieve two objectives. It would both raise funds to cover the costs of restoring financial stability and, to the extent that the levy accurately reflected both systemic risk and institution-specific risk and charged for individual institutions’ contribution to those risks, it would embed incentives that would reduce the probability that such costs would ever materialise. For these reasons, a prefunded, rather than a “survivor pays” approach appears more attractive in principle.
Ensuring the financial sector makes a fair contribution
Going beyond the measures described above to deal with systemic risk, there might be other reasons why the financial sector should make a greater fiscal contribution:
- to defray the wider fiscal, economic and social costs of the recent crisis;
- to help correct what might be considered excessively risky or destabilising activities that may have negative externalities;
- if elements of the financial services industry were shown to be generating supernormal returns to executives or shareholders – economic rents – because of the existence of market failures, then there may be a case for increasing taxation on these returns;
- the global nature of the financial services industry and the mobility of its activities might suggest that a more internationally coordinated approach would help ensure the sector makes a fair contribution through tax, irrespective of where firms are located or where the activity takes place.
A financial transaction tax has been suggested as a potential method of ensuring that the global financial services sector makes a fair contribution. It has been argued that some financial transactions have little or even negative social value (see e.g. Krugman (2009), Turner (2009)); that even if such a tax reduces liquidity in some markets, there are likely to be diminishing marginal returns to liquidity, and so any negative impact would be minimal; and that the potential revenues could be large. Full analysis of the potential economic implications of introducing a transaction tax will help determine the desirability of such a tax, the level at which it should be set if it were introduced and the likely consequences.
As well as considering the economic impact of a financial transaction tax, there are some significant issues to explore regarding its design and implementation. Key points include:
- identifying the tax base: to protect against avoidance, a financial transactions tax would ideally have as broad a base as possible, including over-the-counter transactions;
- establishing a means of tracking transactions in order to implement the tax, given financial transactions are currently recorded through a range of exchanges and other systems;
- setting a rate, or rates, to ensure the introduction of a financial transaction tax does not have a negative economic impact, given the different margins on particular types of transactions;
- determining a means for allocating revenue raised, given the international nature of many financial transactions; and
- defining a method for monitoring and ensuring compliance, and determining action that should be taken in the event of avoidance or evasion.
Common principles and next steps
The IMF will report in April to the G20 on these issues. Any proposals should respect the following principles:
- Global – some options could realistically only be implemented at a global level, while others would require international agreement and coordination on key principles to be effective;
- Non-distortionary – avoiding measures that would damage liquidity, drive inefficient allocation of capital or lead to widespread avoidance;
- Stability enhancing – actions must support and not undermine the regulatory action already being taken. This is likely to mean any option would take several years to implement; and
- Fair and measured – financial services must be able to continue to contribute to economic growth and any additional costs should be distributed fairly across the sector. A thorough impact assessment must be conducted prior to implementation.
References
Caballero, Ricardo J and Pablo Kurlat (2009), "The 'Surprising' Origin and Nature of Financial Crises: A Macroeconomic Policy Proposal," Federal Reserve Symposium at Jackson Hole, August.
Flannery, Mark (2009), "Contingent Tools Can Fill Capital Gaps," American Banker, 174(117).
HM Treasury (2009), "Risk, Reward and Responsibility: The Financial Sector and Society," December.
Krugman, Paul (2009), "Taxing the Speculators," New York Times, 27 November.
Perotti, Enrico and Javier Suarez (2009), "Liquidity Insurance for Systemic Crises," VoxEU.org, 11 February.
Raviv, Alon (2004), "Bank Stability and Market Discipline: Debt-for-Equity Swap versus Subordinated Notes," Unpublished working paper.
Turner, Adair (2009), "Responding to the Financial Crisis: Challenging Assumptions," Speech to the British Embassy, Paris, 30 November.
Republished with permission of VoxEU.org
Thursday, December 17, 2009
Using Inflation to Erode the US Public Debt
As the US debt-to-GDP ratio rises towards 100%, policymakers will be tempted to inflate away the debt. This column examines that option and suggests that it is not far-fetched. US inflation of 6% for four years would reduce the debt-to-GDP ratio by 20%, a scenario similar to what happened following WWII.
Since the start of 2007, the financial crisis has triggered $1.62 trillion of write-downs and credit losses at US financial institutions, sending the American economy into its deepest recession since the Great Depression and the global economy into its first recession since World War II. The US Federal Reserve has responded aggressively. Fiscal policy has become expansionary as well. The US is now facing large deficits and growing public debt. If economic recovery is slow to take hold, large deficits and growing debt are likely to persist for a number of years. Not surprisingly, concerns about government deficits and public debt now dominate the policy debate (Cavallo & Cottani 2009).
Many observers worry that the debt-to-GDP ratios projected over the next ten years are unsustainable. Assuming deficits can be reined in, how might the debt/GDP ratio be reduced? There are four basic mechanisms:
1. GDP can grow rapidly enough to reduce the ratio. This scenario requires a robust economic recovery from the financial crisis.
2. Inflation can rise, eroding the real value of the debt held by creditors and the effective debt ratio. With foreign creditors holding a significant share of the dollar-denominated US federal debt, they will share the burden of any higher US inflation along with domestic creditors.
3. The government can use tax revenue to redeem some of the debt.
4. The government can default on some of its debt obligations.
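The interplay of the first three mechanisms can be sketched with the textbook debt-dynamics identity, in which next year's debt ratio is this year's ratio scaled by the nominal interest rate relative to nominal GDP growth (real growth plus inflation), minus the primary surplus. This is a standard illustration rather than the authors' model, and every parameter value below is a made-up example, not a figure from the column.

```python
# Textbook debt-dynamics sketch: b' = b * (1 + i) / ((1 + g) * (1 + pi)) - s
# b = debt/GDP ratio, i = nominal interest rate on the debt,
# g = real GDP growth, pi = inflation, s = primary surplus (share of GDP).
# All parameter values are illustrative.

def next_debt_ratio(b, i, g, pi, s):
    """One-year update of the debt/GDP ratio."""
    return b * (1 + i) / ((1 + g) * (1 + pi)) - s

b = 0.90  # start at an illustrative 90% of GDP
for year in range(10):
    b = next_debt_ratio(b, i=0.04, g=0.025, pi=0.02, s=0.0)
print(f"debt/GDP after 10 years: {b:.1%}")
```

With these made-up numbers, nominal growth slightly outpaces the interest rate, so the ratio drifts down even with no primary surplus; raising the inflation input shrinks the ratio faster, which is mechanism 2 at work.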
Over its history, the US has relied on each of these mechanisms to reduce its debt/GDP ratio. In a recent paper (Aizenman & Marion 2009), we examine the role of inflation in reducing the Federal government’s debt burden. We conclude that an inflation of 6% over four years could reduce the debt/GDP ratio by a significant 20%.
The facts
Figure 1 depicts trends in gross federal debt and federal debt held by the public, including the Federal Reserve, from 1939 to the present. In 1946, gross federal debt held by the public was 108.6% of GDP. Over the next 30 years, debt as a percentage of GDP decreased almost every year, due primarily to an expanding economy as well as inflation. By 1975, gross federal debt held by the public had fallen to 25.3% of GDP.
Figure 1. Debt as a share of GDP
The immediate post-World War II period is especially revealing. Figure 2 shows that between 1946 and 1955, the debt/GDP ratio was cut almost in half. The average maturity of the debt in 1946 was 9 years, and the average inflation rate over this period was 4.2%. Hence, inflation reduced the 1946 debt/GDP ratio by almost 40% within a decade.
Figure 2. US debt reduction, 1946-1955
In recent years, the debt/GDP ratio has increased dramatically, exacerbated by the financial crisis. In 2009, it reached a level not seen since 1955. Figure 3 shows three 10-year projections, indicating debt held by the public could be 70-100% of GDP in ten years.
Figure 3. America’s projected debt burden
A government that has lots of nominal debt denominated in its own currency has an incentive to try to inflate it away to decrease the debt burden. If foreign creditors hold a significant share of the debt, the temptation to use inflation is greater, since they will bear some of the inflation tax. Shorter average debt maturities and inflation-indexed debt limit the government’s ability to reduce its debt burden through inflation.
Figure 4 shows the share of US public debt held by foreign creditors. The foreign share was essentially zero up until the early 1960s. It has risen dramatically in recent years, particularly since the 1997-98 Asian financial crisis, and was 48.2% of publicly held debt in 2008. Thus foreign creditors would bear about half of any inflation tax should inflation be used to reduce the debt burden, with China and Japan hit hardest.
Figure 4. Foreign share of publicly held debt
Figure 5 illustrates the average maturity length for US marketable interest-bearing public debt held by private investors, along with the debt held by the public as a share of GDP. As noted by a number of authors, the US exhibits a positive relation between maturities and debt/GDP ratios in the post-World War II period. Most developed countries show little correlation between maturities and debt/GDP ratios; the US appears to be an exception. Maturity length on US public debt in the post-World War II era went from 9.4 years in 1947 to a low of 2.6 years in 1976. In June 2009, the average maturity was 3.9 years. Most of this debt is nominal. (1)
Figure 5. Average maturity length and share of debt held by the public
Inflating away some of the debt burden
Figure 6 illustrates the percentage decline in the debt/GDP ratio under various inflation scenarios. (2) Inflation yielded the most dramatic reduction in the debt/GDP ratio – and the real value of the debt – in the immediate post-World War II period. A 5% inflation increase starting in 1946, for example, would have reduced the debt/GDP ratio from 108.6% to 59.3%, a decline in the debt ratio of 45%. Not only was there a large debt overhang when the war ended, but inflation was low (2.3%) and debt maturity was high. Thus there was room to let inflation rise – and it rose to 14.4% in 1947 before dropping considerably. Average inflation over the decade was 4.2%. Moreover, long maturities allowed inflation to erode the debt burden. Maturities were over 9 years in 1945-48 and then fell gradually to 8.17 years in 1950.
Figure 6. Impact of inflation on publicly-held debt as a share of GDP
In contrast, inflation would have had little impact on reducing the debt burden in the mid-1970s. That period was characterised by a lower debt overhang, inflation was higher, and debt maturities were shorter (under 3 years). As a result, in 1975 a five-point increase in inflation would have reduced the debt/GDP ratio from 25.3% to 21.9%. The estimated impact of inflation on today’s debt/GDP ratio is larger than in the mid-1970s but not as large as in the mid-1940s. If inflation were 5% higher, the debt/GDP ratio would be about 20% lower, a debt ratio of 43.4% instead of 53.8%. Our computations of the impact of inflation on the debt overhang assume that all debt is denominated in domestic currency, none is indexed, and the maturity is invariant to inflation. Regression analysis confirms that US debt maturities over the period 1946-2008 are not responsive to inflation.
We develop a stylized model that illustrates both the costs and benefits associated with inflating away some of the debt burden. The model, inspired by Barro (1979), shows that the foreign share of the nominal debt is an important determinant of the optimal inflation rate. So are the size of the debt/GDP ratio, the share of debt indexed to inflation, and the cost of collecting taxes. A lesson to take from the model and the simulations is that eroding the debt through inflation is not far-fetched. The model predicts that inflation of 6% could reduce the debt/GDP ratio by 20% within four years. That inflation rate is only slightly higher than the average observed after World War II. Of course, inflation projections would be much higher than 6% if the share of publicly-held debt in the US were to approach the 100% range observed at the end of World War II.
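Under the simplifying assumptions the authors state for their computations (all debt nominal, none indexed, maturity invariant to inflation), the proportional erosion from extra inflation π sustained over an effective horizon of m years is roughly 1 − (1 + π)^(−m). This back-of-envelope check is my own sketch, not the authors' full maturity-weighted calculation, so it only approximates the cited figures:

```python
# Back-of-envelope erosion of the real debt burden by surprise inflation,
# assuming all debt is nominal, unindexed, and maturity-invariant.

def erosion(pi, years):
    """Fraction by which extra inflation pi sustained over `years` erodes the ratio."""
    return 1 - (1 + pi) ** (-years)

# Headline scenario: 6% inflation for four years.
print(f"{erosion(0.06, 4):.0%}")  # roughly the 20% reduction cited

# Mid-1970s scenario: a 5-point inflation rise with maturity around 3 years,
# starting from a 25.3% debt/GDP ratio.
print(f"{25.3 * (1 - erosion(0.05, 3)):.1f}")  # close to the 21.9% cited
```

The simple formula lands near both cited numbers, which suggests most of the effect comes from maturity and the size of the inflation surprise rather than finer modeling details.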
Interpretation
The current period shares two features with the immediate post-World War II period. It starts with a large debt overhang and low inflation. Both factors increase the temptation to erode the debt burden through inflation. Even so, there are two important differences between the periods. Today, a much greater share of the public debt is held by foreign creditors – 48% instead of zero. This large foreign share increases the temptation to inflate away some of the debt. Another important difference is that today’s debt maturity is less than half what it was in 1946 – 3.9 years instead of 9. Shorter maturities reduce the temptation to inflate. These two competing factors appear to offset each other, and the net result in a simple optimising model is a projected inflation rate slightly higher than that experienced after World War II, but for a shorter duration.
In the simulations, we raise a concern about the stability of some model parameters across periods, particularly the parameters that capture the cost of inflation. It may be that the cost of inflation is higher today because globalisation and the greater ease of foreign direct investment provide new options for producers to move activities away from countries with greater uncertainty. Inflation above some threshold could generate this uncertainty, reducing further the attractiveness of using inflation to erode the debt.
Moreover, history suggests that modest inflation may increase the risk of an unintended inflation acceleration to double-digit levels, as happened in 1947 and in 1979-1981. Such an outcome often results in an abrupt and costly adjustment down the road. Accelerating inflation had limited global implications at a time when the public debt was held domestically and the US was the undisputed global economic leader. In contrast, unintended acceleration of inflation to double-digit levels in the future may have unintended adverse effects, including growing tensions with global creditors and less reliance on the dollar. (3)
Footnotes
(1) Treasury inflation-protected securities, or TIPS, account for less than 10% of total debt issues.
(2) The calculation assumes that maturity is invariant to inflation. We test and overall confirm the validity of this assumption for U.S. data in the post-World War II period.
(3) For the threat to the dollar from the euro, see Chinn and Frankel (2008) and Frankel (2008).
References
Aizenman, Joshua and Nancy Marion (2009), "Using Inflation to Erode the US Public Debt,” NBER Working Paper 15562.
Barro, Robert (1979), “On the Determination of the Public Debt,” Journal of Political Economy, 87, 940–71.
Cavallo, Domingo and Joaquín Cottani (2009), “A Simpler Way to Solve the 'Dollar Problem' and Avoid a New Inflationary Cycle,” VoxEU.org, 12 May.
Chinn, Menzie and Jeffrey Frankel (2008), "Why the Euro Will Rival the Dollar,” International Finance, 11(1), 49-73.
Frankel, Jeffrey (2008), "The Euro Could Surpass the Dollar within Ten Years,” VoxEU.org, 18 March.
Republished with permission of VoxEU.org
Sunday, November 29, 2009
ModelRisk 3.0: Best in Class Solution for Excel-Based Risk Analysis
The tools and techniques made available in ModelRisk have been developed from Vose Consulting’s experience in assessing risk in a broad range of industries over many years, and go far beyond the Monte Carlo simulation tools currently available. ModelRisk has been designed to make risk analysis modeling simpler and more intuitive for the novice user, and at the same time provide access to the most advanced risk modeling techniques available. (Press release, Vose Software, May 11, 2009)

The latest version of ModelRisk is the most advanced spreadsheet-based risk-modeling platform ever developed and currently stands as the best-in-class software solution for quantitative risk analysis, forecasting, simulation, and optimization. ModelRisk enables users to build complex risk analysis models in a fraction of the time required to develop custom-coded applications. Open database connectivity further extends the business intelligence capabilities of this integrated platform by enabling access to essentially any data warehousing system in use today.
“Good risk analysis modeling doesn’t have to be hard, but the tools just weren’t there to make it easy and intuitive. So we asked: if we could start from the beginning, what would the ideal risk analysis tool be like?” says David Vose, Technical Director of Vose Software. “ModelRisk is the result. Users of competing spreadsheet risk analysis tools will find all the features they are familiar with in ModelRisk, but ModelRisk throws open the doors to a far richer world of risk analysis modeling. Better still, ModelRisk has many visual tools that really help users understand what they are modeling so they can be confident in what they do, and ModelRisk costs no more than the older tools available. We also have a training program second to none: the people teaching our courses are risk analysts with years of real-world experience, not just software trainers.”

ModelRisk 3.0 now includes:
- Over 100 distribution types
- Stochastic ‘objects’ for more powerful and intuitive modeling
- Time-series forecasting tools such as ARMA, ARCH, GARCH, and more
- Advanced correlations via copulas
- Distribution fitting of time-series data, including correlation structures
- Probability measures and reporting
- Integrated optimization using the most advanced, proven methods available
- Multiple visual interfaces for ModelRisk functions
- User library for organizing models, assumptions, references, simulation results, and more
- Direct linking to external databases
- Extreme-value modeling
- Advanced data visualization tools
- Expert elicitation tools
- Mathematical tools for numerical integration, series summation, and matrix analysis
- Comprehensive statistical analytics
- World-class help file
- Developers’ kit for programming using ModelRisk’s technology
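ModelRisk’s own spreadsheet functions are proprietary, but the basic kind of Monte Carlo analysis such tools automate can be sketched in plain Python. The compound loss model below (a Poisson number of events per year, each with lognormal severity) and every parameter in it are illustrative assumptions of mine, not ModelRisk defaults or output.

```python
# Generic Monte Carlo sketch of an annual-loss distribution: a Poisson
# number of loss events per year, each with lognormal severity.
# All parameters are illustrative, not taken from any product.
import math
import random
import statistics

random.seed(42)  # reproducible run

def poisson(lam):
    """Draw a Poisson variate (Knuth's multiplication method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def annual_loss(freq=3.0, mu=10.0, sigma=1.0):
    """Total loss in one simulated year."""
    return sum(random.lognormvariate(mu, sigma) for _ in range(poisson(freq)))

trials = sorted(annual_loss() for _ in range(20_000))
mean = statistics.fmean(trials)
var95 = trials[int(0.95 * len(trials))]
print(f"mean annual loss: {mean:,.0f}")
print(f"95th percentile:  {var95:,.0f}")
```

Dedicated tools add the pieces this sketch omits: fitted distributions, copula-based correlation between risks, sensitivity reporting, and spreadsheet integration.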
Learn More
Saturday, October 24, 2009
Working with Windows 7

Wednesday, September 02, 2009
Evidence-Based Practices in Online Learning
Students who took all or part of their class online performed better, on average, than those taking the same course through traditional face-to-face instruction.

Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction.... [Moreover,] elements such as video or online quizzes do not appear to influence the amount that students learn in online classes.

The report took care to note that more research is required before extending the implications of the study into education segments outside of adult learning, such as K-12 programs.
Educators making decisions about online learning need rigorous research examining the effectiveness of online learning for different types of students and subject matter as well as studies of the relative effectiveness of different online learning practices.

Download Report
Monday, August 31, 2009
Website Preferences of Social Networkers
In a recent article, eMarketer cites research by Anderson Analytics indicating that, across generational cohorts, a majority of social networkers use Facebook, while only a minority use Twitter and LinkedIn. Moreover, a majority of users younger than age 45 also use MySpace, apparently in conjunction with Facebook.
The summary data above speak for themselves. The generational age groupings were as follows: generation Z (ages 13-14); generation Y (ages 15-29); generation X (ages 30-44); baby boomers (ages 45-65); and the WWII generation (older than age 65). Again, Facebook appears to be in the lead, at least for now.
Wednesday, August 26, 2009
The Normality of Surprises

Extreme values in the universe (or population) of outcomes occur naturally and more frequently than many presume. Under conditions of normality, about 1 in 22 observations will deviate from the mean by more than twice the standard deviation (the square root of the variance), and about 1 in 370 – roughly 3 in 1,000 – will deviate by more than three times the standard deviation. Note especially that the normal distribution is unbounded, so outcomes arbitrarily far from the mean remain possible.
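The "1 in 22" and "1 in 370" figures follow directly from the standard normal distribution and can be checked with nothing beyond the complementary error function; this sketch assumes nothing but normality.

```python
# Two-sided normal tail probabilities: P(|Z| > k) = erfc(k / sqrt(2)).
import math

for k in (2, 3):
    p = math.erfc(k / math.sqrt(2))
    print(f"P(|Z| > {k} sd) = {p:.4f}  ->  about 1 in {1 / p:.0f}")
# prints about 1 in 22 for two standard deviations, 1 in 370 for three
```

Two standard deviations gives a probability of about 4.55%, three gives about 0.27%, matching the odds quoted above.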
We as analysts have a duty to educate decision-makers about how to use probability theory to advance the cause of modern finance in society. That includes emphasizing the counter-intuitive possibilities of extreme events in the economy. To assume away the normality of such surprises would be naïve.
Saturday, August 22, 2009
The Speed of Thinking

As communications speed up, so can “brain overload.” This raises the question of just how fast communications should go. Perhaps the speed of thinking could provide a useful governor.

Speed used to convey urgency; now we somehow think it means efficiency.... There is a paradox here, though. The Internet has provided us with an almost unlimited amount of information, but the speed at which it works—and we work through it—has deprived us of its benefits. We might work at a higher rate, but this is not working. We can store only a limited amount of information in our brains and have it at our disposal at any one time. Making decisions in this communication brownout, without complete information, we go to war hastily, go to meetings unprepared, and build relationships on the slippery gravel of false impressions. Attention is one of the most valuable modern resources. If we waste it on frivolous communication, we will have nothing left when we really need it.
Saturday, August 15, 2009
To DBA or PhD in Business Administration
Learners should realize that theories (or concepts) provide the essential framework for all business research, including research conducted in fulfillment of requirements for both the DBA and PhD. All successful doctoral dissertations are grounded in good theory and evidence. The PhD is not a “theoretical” degree devoid of empirical evidence, and the DBA is not a “practical” degree devoid of theory. Rather, both degrees are grounded in the management literature and evidentiary support. You do not get out of theory or the rules of evidence simply by choosing one or the other academic degree.
The essential difference between the DBA and PhD in business administration is that the former focuses on the application of theory rather than on the development of new theory. Because business administrators are in constant search for better concepts and methods for conducting enterprise, PhD candidates can always find new topics for basic research. Likewise, while it is true that a general theory of management has yet to emerge, it does not mean that candidate theories are not out there. Thus, DBA learners can always find research problems where the application of existing concepts and methods might prevail to solve business problems.
Both, the DBA and PhD in business administration are about good theory and evidence – in other words, scholarship.
Friday, August 14, 2009
Economic Policies in Dystopia

However, should the stimulus programs eventually fail to place the global economy on track to a robust recovery, then the looming question will become how Western governments might eventually pay for the spending spree they gleefully embarked upon. Prof Kenneth Rogoff (2009, The Confidence Game, Project Syndicate) suggests that governments may have few policy options remaining:
Within a few years, Western governments will have to sharply raise taxes, inflate, partially default, or some combination of all three.

If Prof Rogoff is correct in suggesting that Western governments may soon have to choose from these dreary options, then the economic future for society is arguably bleak. None of these policies would be popular or easy to implement. Nevertheless, there is a certain reality found in the multiple approach-avoidance content of these choices, and it may be time for policy-makers (and voters) to begin thinking about which option (or combination of options) they might prefer.
Wednesday, August 12, 2009
Too Big to Fail, or Just Too Big?
Delineating the factors that might make a financial institution systemically important is the first step towards managing the risk arising from it. Understanding why a firm might be systemically important is necessary to establish measures that reduce the number of such firms and to develop procedures for resolving the insolvency of systemically important firms at the lowest total cost (including the long-run cost) to the economy.

Dr Thompson further argues that disclosing the identity of firms that may eventually be designated “systemically important” would require “constructive ambiguity” in order to ensure the market is not misled into believing certain firms retain special dispensations in the form of government guarantees.
The choice of disclosure regime would seem to be between transparency (publication of the list of firms in each category) and some version of constructive ambiguity, where selected information is released… In the context of central banking and financial markets, the term [constructive ambiguity] refers to a policy of using ambiguous statements to signal intent while retaining policy flexibility. In the context of the federal financial safety net, many have argued for a policy of constructive ambiguity to limit expansion of the federal financial safety net. The notion here is that if market participants are uncertain whether their claim on a financial institution will be guaranteed, they will exert more risk discipline on the firm. In this context, constructive ambiguity is a regulatory tactic for limiting the extent to which de facto government guarantees are extended to the liabilities of the firms that regulators consider systemically important.

After considering Dr Thompson’s ideas, I am beset by doubts. My first concerns the dogma implied by “systemically important” (i.e., “too big to fail”). What does “systemically important” mean? What makes a company “systemically important”? Dr Thompson sidesteps the “too big to fail” proposition by coining the alternative phraseology, “systemically important,” which is equally laden with normative relativism. The entire concept of “systemically important” lacks content validity, both in rhetoric and substance. To say a firm is “systemically important” is just another way of designating the firm as “too big to fail.”

My final comment is to offer a new suggestion for dealing with firms that are either “systemically important” or “too big to fail”: treat such firms as simply too big to keep around. Firms that grow so large as to become “systemically important” or “too big to fail” should be broken up into smaller companies, thus advancing the competitive spirit of the marketplace while ensuring that no firm becomes so large that its misfortunes can threaten the financial stability of the nation.
Tuesday, August 11, 2009
More Small Businesses Needed

Friday, August 07, 2009
Statisticians in Demand
The rising stature of statisticians, who can earn $125,000 at top companies in their first year after getting a doctorate, is a byproduct of the recent explosion of digital data. In field after field, computing and the Web are creating new realms of data to explore — sensor signals, surveillance tapes, social network chatter, public records and more. And the digital data surge only promises to accelerate, rising fivefold by 2012, according to a projection by IDC, a research firm.
The demand for statisticians is consistent with a larger trend toward competing on analytics in enterprise. This trend has also given impetus to the need for other experts, especially in computer programming.
Though at the fore, statisticians are only a small part of an army of experts using modern statistical techniques for data analysis. Computing and numerical skills, experts say, matter far more than degrees. So the new data sleuths come from backgrounds like economics, computer science and mathematics.
Over the past several decades, firms have invested heavily into data management technology, including server and data-warehousing systems. These investments have created massive amounts of raw data that are begging to be analyzed by people trained and skilled in descriptive and inferential statistics, stochastic modeling, linear and non-linear forecasting, and so forth. The creation of so much raw data in recent years makes statistical analysis of that data a vital value-adding activity that enables competing on analytics.
“I keep saying that the sexy job in the next 10 years will be statisticians,” said Hal Varian, chief economist at Google. “And I’m not kidding.”
Thursday, August 06, 2009
Controlling vs Collateralizing Risk
Regulatory authorities around the world are currently discussing ways to prevent another financial crisis. One idea is to mandate higher levels of capital reserves. Japan’s banking reform shows that a comprehensive solution would work better.

Requiring banks to increase capital reserves is itself risky. For one thing, banks may not be able to raise sufficient capital in the equity markets to meet the revised capital requirements. Moreover, raising capital requirements tends to disadvantage banks that focus on traditional borrowing and lending transactions, and to advantage banks that trade and take risks with their own accounts.
A new regulatory framework must also distinguish between banks whose main business is deposit taking and lending—the vast majority of banks worldwide—and banks that trade for their own account. The recent financial crisis demonstrated that balance sheet structure matters. Trusted banks with a large retail deposit base continued to provide funds to customers even in the depths of the crisis, whereas many banks that relied heavily on market funding or largely traded for their own account effectively failed. Investment banks with inherently higher-risk businesses should be charged a higher level of capital requirement—otherwise, sound banking will not be rewarded.

That the government has undertaken to save only the largest banks under the “too big to fail” presumption is of concern to the public for a variety of reasons, not the least of which is that such an approach may actually reward the banks that have taken the biggest risks, while closing those that have played by the rules. Additionally, requiring banks to maintain excessive capital reserves may sound good, but high reserves bring reduced capital efficiency, particularly at a time when money is scarce.
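To make the capital-efficiency trade-off concrete, here is a hypothetical back-of-the-envelope sketch (all figures invented, and the capital ratio simplified to capital divided by risk-weighted assets) of the choice a bank faces when the minimum requirement rises: raise new equity, or shrink risk-weighted assets by cutting lending.

```python
def capital_shortfall(capital, rwa, min_ratio):
    """Extra capital needed to reach min_ratio, given current capital and
    risk-weighted assets (rwa). Zero if the bank is already compliant."""
    return max(0.0, min_ratio * rwa - capital)

def required_deleveraging(capital, rwa, min_ratio):
    """Reduction in risk-weighted assets needed to comply without raising
    any new capital."""
    return max(0.0, rwa - capital / min_ratio)

# A bank holding $8bn of capital against $100bn of risk-weighted assets
# meets an 8% requirement exactly. Raising the floor to 12% forces a choice:
capital, rwa = 8.0, 100.0
print(capital_shortfall(capital, rwa, 0.12))      # raise new equity, or
print(required_deleveraging(capital, rwa, 0.12))  # shed roughly a third of RWA
```

The second branch is the troubling one: if equity markets cannot supply the shortfall, compliance comes through deleveraging, i.e., reduced lending, exactly when money is scarce.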
