Wednesday, December 30, 2009

The Fourth Paradigm: Data-Intensive Scientific Discovery

I recently came across a free e-book entitled, "The Fourth Paradigm: Data-Intensive Scientific Discovery," by Microsoft Research, and wanted to commend the book to data analysts and researchers. The book makes a novel and well-illustrated case for the future of data-intensive scientific discovery. The e-book version is available as a free download from the Microsoft Research website linked below, or may be purchased from leading bookstores.

Saturday, December 19, 2009

Risk, Reward and Responsibility: The Financial Sector and Society

by Jonathan Portes © VoxEU.org

A strong financial sector is essential to a modern economy, but private actions can impose enormous costs on taxpayers; a balance must be struck. This column explains why the UK Government believes that there is a case for increasing the costs of risk-taking to banks and their shareholders while reducing those borne by taxpayers.

A strong and competitive financial sector is essential to a productive modern economy. But when banks fail the costs are high. Following the failure of Lehman Brothers in September 2008, governments around the world acted decisively to protect retail depositors, maintain financial stability, and enable banks to continue lending. Without such action the consequences of the financial crisis would have been far worse – but the result was a significant burden on taxpayers.

To reduce the probability of a repetition of these events, G20 finance ministers and leaders have committed to implementing higher regulatory and supervisory standards to guard against excessive risks. The objective of these reforms is to ensure that the costs of failure of institutions are borne by shareholders and other creditors in an orderly way without triggering a systemic crisis. This should limit the circumstances in which any future government intervention is necessary. But in practice there may always be a risk that the potential costs to the wider economy of a systemic crisis will be sufficiently great that government intervention is appropriate. After all, systemic financial crises have been an intermittent but pervasive feature of financial systems throughout recorded economic history.

A key question is therefore how to ensure that any costs of government intervention are distributed more fairly across those in the financial sector who benefit, reflecting the risks and rewards associated with financial intermediation. This should mean ensuring that the costs of bank failure fall primarily to banks and bank investors, rather than taxpayers. A paper published by the UK government last week (HM Treasury 2009) considers in detail options for ensuring banks meet the immediate costs of interventions to prevent systemic failure. The provision of emergency liquidity facilities and of deposit insurance is already well established, so this article concentrates on the potential new proposals in the paper: contingent capital and systemic risk levies. It also discusses the case for additional taxation of the financial sector to ensure that the sector makes a fair contribution to society and broader social objectives.

Contingent capital and capital insurance

The Basel Committee is considering how to reform global capital standards and banks are likely to have to significantly strengthen the quantity and quality of capital that they hold. In the recent crisis existing subordinated debt and hybrid capital largely failed in their original objective of bearing losses, so the focus of regulatory capital requirements should be on capital that can absorb losses on a going concern basis, typically common equity. More equity will enhance the resilience of banks to shocks. However, if equity is insufficient to absorb losses, banks may have to try to raise more. And in a systemic crisis the cost of raising new capital could be prohibitively high, so that banks may find it difficult to raise new capital when they need it most.

Contingent capital or capital insurance held by the private sector could help address this potential issue and supplement common equity in times of crisis. There are a variety of proposals (e.g. Raviv 2004, Flannery 2009) under which banks would issue fixed income debt that would convert into capital according to a predetermined mechanism, either bank-specific (related to levels of regulatory capital) or a more general measure of crisis. Alternatively, under capital insurance, an insurer would receive a premium for agreeing to provide an amount of capital to the bank in case of systemic crisis.
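To make the conversion mechanism concrete, the fragment below is a minimal Python sketch of a bank-specific trigger of the kind described above; the trigger level, face value, and conversion price are hypothetical, and the sketch is not drawn from any of the cited proposals.

```python
from dataclasses import dataclass

@dataclass
class ContingentConvertible:
    """Hypothetical contingent-capital note with a bank-specific trigger."""
    face_value: float        # principal of the fixed-income note
    trigger_ratio: float     # regulatory capital ratio below which conversion occurs
    conversion_price: float  # pre-agreed price per new share on conversion

    def convert_if_triggered(self, capital_ratio: float):
        """Return (new_equity_shares, remaining_debt) for a given capital ratio."""
        if capital_ratio < self.trigger_ratio:
            shares = self.face_value / self.conversion_price
            return shares, 0.0           # debt extinguished, equity created
        return 0.0, self.face_value      # no trigger: the note remains ordinary debt

# Illustrative use: a 100m note that converts if the capital ratio falls below 5%
note = ContingentConvertible(face_value=100e6, trigger_ratio=0.05, conversion_price=2.50)
print(note.convert_if_triggered(capital_ratio=0.04))  # converts: 40,000,000 shares
print(note.convert_if_triggered(capital_ratio=0.08))  # no conversion: debt stands
```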

From an economic perspective, the attraction of contingent capital or insurance is clear; creating such instruments, especially if they were traded, would improve both market discipline and market information. Unfortunately, however, as we have seen, it is precisely in a crisis that markets for such instruments – which will essentially be put options sold by investors to banks – can fail. Such instruments may ultimately not be appropriate for many fixed-income investors such as insurance funds which have in the past invested in subordinated debt and hybrid capital instruments; and the systemic stability consequences would need to be explored.

Alternatively, the Government could offer a capital insurance scheme. For example, Caballero and Kurlat (2009) propose that the central bank should issue tradable insurance credits, which would allow holders to attach a central bank guarantee to their assets during a systemic crisis; or the government could set out an explicit arrangement to deliver capital and liquidity to banks in times of a systemic crisis in return for an up-front fee, as proposed by Perotti and Suarez (2009). Trigger events, fees and the equity conversion price would all be set in advance.

Such a scheme would provide greater certainty for market participants about the circumstances and limits of government intervention, while also ensuring government receives an up-front fee for implicit support. However, there would be potential for moral hazard resulting from reduced loss-sharing across subordinated debt and hybrid capital, unless such capital converted to equity ahead of any government injection. In general government-provided capital insurance would seem to be less useful and less flexible than a wider systemic levy or resolution fund, an option outlined below, although capital insurance could be an element of any wider solution.

Systemic risk levies and resolution funds

Since it is impossible to eliminate entirely the risk that some of the costs of intervention will have to be provided by government, there is a case for ensuring that in future the financial sector itself meets these residual costs through a systemic risk levy or resolution fund. A “systemic risk scheme” would have a wider scope than existing deposit-guarantee schemes in that contributions would be sought from all systemically important firms rather than just retail deposit-takers; and wider coverage in that it would fund the costs of restoring financial stability more broadly rather than just the costs of compensating retail depositors. Funds could be used towards the recapitalisation of failing or failed banks, to cover the cost of other guarantees to creditors, or potentially other costs from resolution.

Such a levy might be charged on either a pre- or post-funded basis. It could be weighted towards financial firms which, because of their size or the nature of their business, were a potential source of significant risk to the stability of the financial system. The levy would not fund insurance for any individual firm but rather would support intervention, if needed, to stabilise the system as a whole, hence avoiding some of the moral hazard problems typical of insurance schemes. Given that financial crises requiring direct government intervention are relatively rare in developed economies, any fund would have to be built up over an extended period. When financial crises do occur, however, they can be exceptionally costly to the public finances, and therefore the fund may ultimately need to be quite large.

A key consideration in designing any systemic levy or general resolution fund would be how to assess systemic significance and/or a firm’s contribution to systemic risk, and how to reflect that in any levy. In principle, a well-designed levy could achieve two objectives. It would both raise funds to cover the costs of restoring financial stability and, to the extent that the levy accurately reflected both systemic risk and institution-specific risk and charged for individual institutions’ contribution to those risks, it would embed incentives that would reduce the probability that such costs would ever materialise. For these reasons, a prefunded, rather than a “survivor pays” approach appears more attractive in principle.
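As a purely illustrative sketch of the pre-funded approach, the fragment below accumulates a resolution fund from annual contributions weighted by each firm's systemic-risk score; the firms, risk weights, levy rate, and build-up period are all invented for the example and do not reflect any proposal in the paper.

```python
# Hypothetical pre-funded systemic risk levy: each firm contributes an amount
# proportional to its covered liabilities scaled by a systemic-risk weight.
firms = {
    # name: (covered liabilities in GBP bn, systemic-risk weight)
    "Bank A": (800.0, 1.5),  # large, complex institution: higher weight
    "Bank B": (300.0, 1.0),
    "Bank C": (50.0, 0.5),   # small, simple institution: lower weight
}

BASE_RATE = 0.0005           # 5 basis points per year (assumed)
YEARS = 10                   # build-up period for the fund (assumed)

fund = 0.0
for year in range(YEARS):
    for name, (liabilities, weight) in firms.items():
        fund += BASE_RATE * liabilities * weight

print(f"Resolution fund after {YEARS} years: GBP {fund:.2f}bn")  # about GBP 7.6bn
```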

Ensuring the financial sector makes a fair contribution

Going beyond the measures described above to deal with systemic risk, there might be other reasons why the financial sector should make a greater fiscal contribution:
  • to defray the wider fiscal, economic and social costs of the recent crisis;
  • to help correct what might be considered excessively risky or destabilising activities that may have negative externalities;
  • if elements of the financial services industry were shown to be generating supernormal returns to executives or shareholders – economic rents – because of the existence of market failures, then there may be a case for increasing taxation on these returns;
  • the global nature of the financial services industry and the mobility of its activities might suggest that a more internationally coordinated approach would help ensure the sector makes a fair contribution through tax, irrespective of where firms are located or where the activity takes place.

A financial transaction tax has been suggested as a potential method of ensuring that the global financial services sector makes a fair contribution. It has been argued that some financial transactions have little or even negative social value (see e.g. Krugman (2009), Turner (2009)); that even if such a tax reduces liquidity in some markets, there are likely to be diminishing marginal returns to liquidity, and so any negative impact would be minimal; and that the potential revenues could be large. Full analysis of the potential economic implications of introducing a transaction tax will help determine the desirability of such a tax, the level at which it should be set if it were introduced and the likely consequences.

As well as considering the economic impact of a financial transaction tax, there are some significant issues to explore regarding its design and implementation. Key points include:

  • identifying the tax base: to protect against avoidance a financial transactions tax would ideally have as broad a base as possible, including over-the-counter transactions;
  • establishing a means of tracking transactions in order to implement the tax, given financial transactions are currently recorded through a range of exchanges and other systems;
  • setting a rate, or rates, to ensure the introduction of a financial transaction tax does not have a negative economic impact, given the different margins on particular types of transactions;
  • determining a means for allocating revenue raised, given the international nature of many financial transactions; and
  • defining a method for monitoring and ensuring compliance, and determining action that should be taken in the event of avoidance or evasion.

Common principles and next steps

The IMF will report in April to the G20 on these issues. Any proposals should respect the following principles:

  • Global – some options could realistically only be implemented at a global level, while others would require international agreement and coordination on key principles to be effective;
  • Non-distortionary – avoiding measures that would damage liquidity, drive inefficient allocation of capital or lead to widespread avoidance;
  • Stability enhancing – actions must support and not undermine the regulatory action already being taken. This is likely to mean any option would take several years to implement; and
  • Fair and measured – financial services must be able to continue to contribute to economic growth and any additional costs should be distributed fairly across the sector. A thorough impact assessment must be conducted prior to implementation.

References

Caballero, Ricardo J and Pablo Kurlat (2009), "The 'Surprising' Origin and Nature of Financial Crises: A Macroeconomic Policy Proposal," Federal Reserve Symposium at Jackson Hole, August.

Flannery, Mark (2009), "Contingent Tools Can Fill Capital Gaps," American Banker, 174(117).

HM Treasury (2009), "Risk, Reward and Responsibility: The Financial Sector and Society," December.

Krugman, Paul (2009), "Taxing the Speculators," New York Times, 27 November.

Perotti, Enrico and Javier Suarez (2009), "Liquidity Insurance for Systemic Crises," VoxEU.org, 11 February.

Raviv, Alon (2004), "Bank Stability and Market Discipline: Debt-for-Equity Swap versus Subordinated Notes," Unpublished working paper.

Turner, Adair (2009), "Responding to the Financial Crisis: Challenging Assumptions," Speech to the British Embassy, Paris, 30 November.

Republished with permission of VoxEU.org

Thursday, December 17, 2009

Using Inflation to Erode the US Public Debt

by Joshua Aizenman and Nancy Marion © VoxEU.org

As the US debt-to-GDP ratio rises towards 100%, policymakers will be tempted to inflate away the debt. This column examines that option and suggests that it is not far-fetched. US inflation of 6% for four years would reduce the debt-to-GDP ratio by 20%, a scenario similar to what happened following WWII.

Since the start of 2007, the financial crisis has triggered $1.62 trillion of write-downs and credit losses at US financial institutions, sending the American economy into its deepest recession since the Great Depression and the global economy into its first recession since World War II. The US Federal Reserve has responded aggressively. Fiscal policy has become expansionary as well. The US is now facing large deficits and growing public debt. If economic recovery is slow to take hold, large deficits and growing debt are likely to persist for a number of years. Not surprisingly, concerns about government deficits and public debt now dominate the policy debate (Cavallo & Cottani 2009).

Many observers worry that the debt-to-GDP ratios projected over the next ten years are unsustainable. Assuming deficits can be reined in, how might the debt/GDP ratio be reduced? There are four basic mechanisms:

1. GDP can grow rapidly enough to reduce the ratio. This scenario requires a robust economic recovery from the financial crisis.

2. Inflation can rise, eroding the real value of the debt held by creditors and the effective debt ratio. With foreign creditors holding a significant share of the dollar-denominated US federal debt, they will share the burden of any higher US inflation along with domestic creditors.

3. The government can use tax revenue to redeem some of the debt.

4. The government can default on some of its debt obligations.

Over its history, the US has relied on each of these mechanisms to reduce its debt/GDP ratio. In a recent paper (Aizenman & Marion 2009), we examine the role of inflation in reducing the Federal government’s debt burden. We conclude that an inflation of 6% over four years could reduce the debt/GDP ratio by a significant 20%.

The facts

Figure 1 depicts trends in gross federal debt and federal debt held by the public, including the Federal Reserve, from 1939 to the present. In 1946, gross federal debt held by the public was 108.6% of GDP. Over the next 30 years, debt as a percentage of GDP decreased almost every year, due primarily to an expanding economy as well as inflation. By 1975, gross federal debt held by the public had fallen to 25.3% of GDP.


Figure 1. Debt as a share of GDP

The immediate post-World War II period is especially revealing. Figure 2 shows that between 1946 and 1955, the debt/GDP ratio was cut almost in half. The average maturity of the debt in 1946 was 9 years, and the average inflation rate over this period was 4.2%. Hence, inflation reduced the 1946 debt/GDP ratio by almost 40% within a decade.


Figure 2. US debt reduction, 1946-1955

In recent years, the debt/GDP ratio has increased dramatically, exacerbated by the financial crisis. In 2009, it reached a level not seen since 1955. Figure 3 shows three 10-year projections, indicating debt held by the public could be 70-100% of GDP in ten years.


Figure 3. America’s projected debt burden

A government that has lots of nominal debt denominated in its own currency has an incentive to try to inflate it away to decrease the debt burden. If foreign creditors hold a significant share of the debt, the temptation to use inflation is greater, since they will bear some of the inflation tax. Shorter average debt maturities and inflation-indexed debt limit the government’s ability to reduce its debt burden through inflation.

Figure 4 shows the share of US public debt held by foreign creditors. The foreign share was essentially zero up until the early 1960s. It has risen dramatically in recent years, particularly since the 1997-98 Asian financial crisis, and was 48.2% of publicly held debt in 2008. Thus foreign creditors would bear about half of any inflation tax should inflation be used to reduce the debt burden, with China and Japan hit hardest.


Figure 4. Foreign share of publicly held debt

Figure 5 illustrates the average maturity length for US marketable interest-bearing public debt held by private investors, along with the debt held by the public as a share of GDP. As noted by a number of authors, the US exhibits a positive relation between maturities and debt/GDP ratios in the post-World War II period. Most developed countries show little correlation between maturities and debt/GDP ratios. The US appears to be an exception. Maturity length on US public debt in the post-World War II era went from 9.4 years in 1947 to a low of 2.6 years in 1976. In June 2009, the average maturity was 3.9 years. Most of this debt is nominal. (1)


Figure 5. Average maturity length and share of debt held by the public

Inflating away some of the debt burden

Figure 6 illustrates the percentage decline in the debt/GDP ratio under various inflation scenarios. (2) Inflation yielded the most dramatic reduction in the debt/GDP ratio – and the real value of the debt – in the immediate post-World War II period. A 5% inflation increase starting in 1946, for example, would have reduced the debt/GDP ratio from 108.6% to 59.3%, a decline in the debt ratio of 45%. Not only was there a large debt overhang when the war ended, but inflation was low (2.3%) and debt maturity was high. Thus there was room to let inflation rise – and it rose to 14.4% in 1947 before dropping considerably. Average inflation over the decade was 4.2%. Moreover, long maturities allowed inflation to erode the debt burden. Maturities were over 9 years in 1945-48 and then fell gradually to 8.17 years in 1950.


Figure 6. Impact of inflation on publicly-held debt as a share of GDP

In contrast, inflation would have had little impact on reducing the debt burden in the mid-1970s. That period was characterised by a lower debt overhang, higher inflation, and shorter debt maturities (under 3 years). As a result, in 1975 a five-point increase in inflation would have reduced the debt/GDP ratio from 25.3% to 21.9%. The estimated impact of inflation on today’s debt/GDP ratio is larger than in the mid-1970s but not as large as in the mid-1940s. If inflation were 5% higher, the debt/GDP ratio would be about 20% lower, a debt ratio of 43.4% instead of 53.8%. Our computations of the impact of inflation on the debt overhang assume that all debt is denominated in domestic currency, none is indexed, and the maturity is invariant to inflation. Regression analysis confirms that US debt maturities over the period 1946-2008 are not responsive to inflation.

We develop a stylised model that illustrates both the costs and benefits associated with inflating away some of the debt burden. The model, inspired by Barro (1979), shows that the foreign share of the nominal debt is an important determinant of the optimal inflation rate. So are the size of the debt/GDP ratio, the share of debt indexed to inflation, and the cost of collecting taxes. A lesson to take from the model and the simulations is that eroding the debt through inflation is not far-fetched. The model predicts that inflation of 6% could reduce the debt/GDP ratio by 20% within four years. That inflation rate is only slightly higher than the average observed after World War II. Of course, inflation projections would be much higher than 6% if the share of publicly-held debt in the US were to approach the 100% range observed at the end of World War II.
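The arithmetic behind these magnitudes can be illustrated with a deliberately simplified calculation: treat the debt as a single nominal, unindexed stock and ask how much its real value falls when inflation runs at a given rate for a given number of years. The authors' own computations use the full maturity structure of the debt, so the Python sketch below (with an assumed starting ratio) only approximates their figures.

```python
def debt_ratio_after_inflation(debt_to_gdp: float, inflation: float, years: int) -> float:
    """Real value of a fixed nominal debt stock, as a share of GDP, after `years`
    of surprise inflation at rate `inflation`. Simplifications: all debt is
    nominal and unindexed, it remains outstanding over the whole period, and
    real GDP is held constant."""
    return debt_to_gdp / (1.0 + inflation) ** years

# 6% inflation for four years, starting from a 53.8% debt/GDP ratio
start = 0.538
end = debt_ratio_after_inflation(start, inflation=0.06, years=4)
print(f"debt/GDP: {start:.1%} -> {end:.1%} (a {1 - end / start:.0%} reduction)")
# prints roughly a 21% reduction, in line with the ~20% figure cited above
```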

Interpretation

The current period shares two features with the immediate post-World War II period. It starts with a large debt overhang and low inflation. Both factors increase the temptation to erode the debt burden through inflation. Even so, there are two important differences between the periods. Today, a much greater share of the public debt is held by foreign creditors – 48% instead of zero. This large foreign share increases the temptation to inflate away some of the debt. Another important difference is that today’s debt maturity is less than half what it was in 1946 – 3.9 years instead of 9. Shorter maturities reduce the temptation to inflate. These two competing factors appear to offset each other, and the net result in a simple optimising model is a projected inflation rate slightly higher than that experienced after World War II, but for a shorter duration.

In the simulations, we raise a concern about the stability of some model parameters across periods, particularly the parameters that capture the cost of inflation. It may be that the cost of inflation is higher today because globalisation and the greater ease of foreign direct investment provide new options for producers to move activities away from countries with greater uncertainty. Inflation above some threshold could generate this uncertainty, reducing further the attractiveness of using inflation to erode the debt.

Moreover, history suggests that modest inflation may increase the risk of an unintended inflation acceleration to double-digit levels, as happened in 1947 and in 1979-1981. Such an outcome often results in an abrupt and costly adjustment down the road. Accelerating inflation had limited global implications at a time when the public debt was held domestically and the US was the undisputed global economic leader. In contrast, unintended acceleration of inflation to double-digit levels in the future may have unintended adverse effects, including growing tensions with global creditors and less reliance on the dollar. (3)

Footnotes

(1) Treasury inflation-protected securities, or TIPS, account for less than 10% of total debt issues.

(2) The calculation assumes that maturity is invariant to inflation. We test and, on the whole, confirm the validity of this assumption for US data in the post-World War II period.

(3) For the threat to the dollar from the euro, see Chinn and Frankel (2008) and Frankel (2008).

References

Aizenman, Joshua and Nancy Marion (2009), "Using Inflation to Erode the US Public Debt,” NBER Working Paper 15562.

Barro, Robert (1979), “On the Determination of the Public Debt,” Journal of Political Economy, 87, 940–71.

Cavallo, Domingo and Joaquín Cottani (2009), “A Simpler Way to Solve the 'Dollar Problem' and Avoid a New Inflationary Cycle,” VoxEU.org, 12 May.

Chinn, Menzie and Jeffrey Frankel (2008), "Why the Euro Will Rival the Dollar,” International Finance, 11(1), 49-73.

Frankel, Jeffrey (2008), "The Euro Could Surpass the Dollar within Ten Years,” VoxEU.org, 18 March.

Republished with permission of VoxEU.org

Sunday, November 29, 2009

ModelRisk 3.0: Best in Class Solution for Excel-Based Risk Analysis

I have been a practicing risk modeler and analyst now for over fifteen years, and during that time, I have worked and trained with a variety of popular spreadsheet-based software tools, including Crystal Ball (Oracle) and @Risk (Palisade). Each of these applications enables users of Excel (Microsoft) to incorporate simulations and optimizations into models. However, neither Crystal Ball nor @Risk offers a comprehensive software solution that combines simulation and optimization with stochastic object modeling, time-series forecasting, and multivariate correlation. ModelRisk 3.0 from Vose Software (Ghent, Belgium) combines all of these features into a single application that works seamlessly with Excel.
The tools and techniques made available in ModelRisk have been developed from Vose Consulting’s experience in assessing risk in a broad range of industries over many years, and goes far beyond the Monte Carlo simulation tools currently available. ModelRisk has been designed to make risk analysis modeling simpler and more intuitive for the novice user, and at the same time provide access to the most advanced risk modeling techniques available. (Press release, Vose Software, May 11, 2009)
The latest version of ModelRisk is the most advanced spreadsheet-based risk-modeling platform ever developed and currently stands as the best in class software solution for quantitative risk analysis, forecasting, simulation, and optimization. ModelRisk enables users to build complex risk analysis models in a fraction of the time required to develop custom-coded applications. Open database connectivity further extends the business intelligence capabilities of this integrated platform by enabling access to essentially any data warehousing system in use today.
“Good risk analysis modeling doesn’t have to be hard, but the tools just weren’t there to make it easy and intuitive. So we asked, ‘If we could start from the beginning, what would the ideal risk analysis tool be like?’” says David Vose, Technical Director of Vose Software. “ModelRisk is the result. Users of competing spreadsheet risk analysis tools will find all the features they are familiar with in ModelRisk, but ModelRisk throws open the doors to a far richer world of risk analysis modeling. Better still, ModelRisk has many visual tools that really help the user understand what they are modeling so they can be confident in what they do, and ModelRisk costs no more than the older tools available. We also have a training program second-to-none: the people teaching our courses are risk analysts with years of real-world experience, not just software trainers.”
ModelRisk 3.0 now includes:
  • Over 100 distribution types
  • Stochastic ‘objects’ for more powerful and intuitive modeling
  • Time-series forecasting tools such as ARMA, ARCH, GARCH, and more
  • Advanced correlations via copulas (see the sketch after this list)
  • Distribution fitting of time-series data, including correlation structures
  • Probability measures and reporting
  • Integrated optimization using the most advanced, proven methods available
  • Multiple visual interfaces for ModelRisk functions
  • User library for organizing models, assumptions, references, simulation results, and more
  • Direct linking to external databases
  • Extreme-value modeling
  • Advanced data visualization tools
  • Expert elicitation tools
  • Mathematical tools for numerical integration, series summation, and matrix analysis
  • Comprehensive statistical analytics
  • World class help file
  • Developers’ kit for programming using ModelRisk’s technology
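To give a sense of what copula-based correlation (one of the items above) means in practice, here is a generic Python sketch of a Gaussian copula joining two arbitrary marginal distributions in a Monte Carlo run. It illustrates the general technique only, with made-up marginals and correlation; it does not use ModelRisk's own functions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000
rho = 0.7  # assumed correlation for the copula

# 1. Draw correlated standard normals (the Gaussian copula step)
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# 2. Map to correlated uniforms, preserving the dependence structure
u = stats.norm.cdf(z)

# 3. Apply arbitrary marginals through their inverse CDFs (hypothetical choices)
cost = stats.lognorm(s=0.5, scale=100).ppf(u[:, 0])   # e.g. a cost driver
demand = stats.gamma(a=2.0, scale=50).ppf(u[:, 1])    # e.g. a demand driver

print("Spearman correlation of simulated inputs:",
      round(stats.spearmanr(cost, demand)[0], 2))
```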
Currently, no competing software package on the market offers the same comprehensive range of features found in ModelRisk 3.0, which is now my primary risk modeling, forecasting, and business intelligence platform. For more information, follow the link below.

Learn More

Saturday, October 24, 2009

Working with Windows 7

Well, I took the plunge and upgraded to Windows 7 (Microsoft). The installation went smoothly, and yes, I can report that Windows 7 works, and by that I mean it works for me. The Windows 7 experience seems to be more a reflection of the user than of the operating system's creators, which is both empowering and inspiring. I will leave the technical review of the software to others. In the meantime, Windows 7 is working just fine for me.

Wednesday, September 02, 2009

Evidence-Based Practices in Online Learning

A new study by the US Department of Education (2009) entitled, “Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies,” reports that:
Students who took all or part of their class online performed better, on average, than those taking the same course through traditional face-to-face instruction.
The methodology for the study was a meta-analysis of 51 study effects, 44 of which were drawn from research with older learners. The study also found that:
Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction.... [Moreover,] elements such as video or online quizzes do not appear to influence the amount that students learn in online classes.
The report took care to note that more research is required before extending the implications of the study into education segments outside of adult learning, such as K-12 programs.
Educators making decisions about online learning need rigorous research examining the effectiveness of online learning for different types of students and subject matter as well as studies of the relative effectiveness of different online learning practices.
Download Report

Monday, August 31, 2009

Website Preferences of Social Networkers

The social networking phenomenon is quite interesting. Service providers such as Facebook, MySpace, LinkedIn, and Twitter are in wide demand. However, as with any new technology, we will likely see a number of successes, failures, and consolidations before a market leader emerges. In the meantime, Facebook appears to be assuming an early lead over its competitors.

In a recent article, eMarketer cites research by Anderson Analytics that indicates the majority of social networkers by generational cohort use Facebook, while a minority use Twitter and LinkedIn. Moreover, a majority of users younger than age 45 also use MySpace, apparently in conjunction with Facebook.

The summary data above speaks for itself. The generational age groupings were as follows: generation Z (ages 13-14); generation Y (ages 15-29); generation X (ages 30-44); baby boomers (ages 45-65); and the WWII generation (older than age 65). Again, Facebook appears to be in the lead, at least for now.

Wednesday, August 26, 2009

The Normality of Surprises

Rationalizing away the possibility of extreme events (or surprises) under conditions of probabilistic normality is an interesting behavioral economics phenomenon in society. Such behavior is signaled by attributions of “uncertainty,” “anomalies,” “shocks,” or some other lofty but equally pretentious notion of time-space deception. The truth is that denying the possibility of extreme outcomes defies the logic of normality, as such outcomes are not merely possible but expected under normal conditions.

Carl Friedrich Gauss (1777–1855) characterized the specific properties of what we now know as normality. The conceptual framework for describing normality is the normal frequency distribution, or “bell curve,” also known as the “Gaussian” distribution. Normality is a common assumption about many natural phenomena and a central concept in probability theory.

Extreme values in the universe (or population) of outcomes occur naturally and more frequently than many presume. Under conditions of normality, about 1 in 22 observations will deviate from the mean by more than twice the standard deviation (the standard deviation being the square root of the variance), and about 1 in 370 – roughly 3 in 1,000 – will deviate from the mean by more than three times the standard deviation. Note especially that the normal distribution places no bound on how far an extreme outcome can fall from the mean.
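These frequencies follow directly from the standard normal distribution; a quick check is shown below (a minimal sketch using SciPy).

```python
from scipy.stats import norm

# Two-sided tail probabilities of the standard normal distribution
for k in (2, 3):
    p = 2 * norm.sf(k)  # P(|Z| > k standard deviations)
    print(f"beyond {k} sd: p = {p:.4f} (about 1 in {round(1 / p)})")

# beyond 2 sd: p = 0.0455 (about 1 in 22)
# beyond 3 sd: p = 0.0027 (about 1 in 370)
```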

We as analysts have a duty to educate decision-makers about how to use probability theory to advance the cause of modern finance in society. That includes emphasizing the counter-intuitive possibilities of extreme events in the economy. To assume away the normality of such surprises would be naïve.

Saturday, August 22, 2009

The Speed of Thinking

The speed of communications in the post-modern world continues to accelerate exponentially. Yet, the speed at which the human mind can absorb and process information has changed little. As a result, society has become "queasy" from high volumes of data, and just as humans slow down when nauseated, society’s spirit may also be flagging under the bombardment of information.

According to Mr John Freeman (2009, “Not So Fast,” WSJ, Aug 21, 2009):

Speed used to convey urgency; now we somehow think it means efficiency.... There is a paradox here, though. The Internet has provided us with an almost unlimited amount of information, but the speed at which it works—and we work through it—has deprived us of its benefits. We might work at a higher rate, but this is not working. We can store a limited amount of information in our brains and have it at our disposal at any one time. Making decisions in this communication brownout, though without complete information, we go to war hastily, go to meetings unprepared, and build relationships on the slippery gravel of false impressions. Attention is one of the most valuable modern resources. If we waste it on frivolous communication, we will have nothing left when we really need it.

As communications speed up, so can “brain overload.” This raises the question of just how fast communications should go. Perhaps the speed of thinking could provide a useful governor.

Saturday, August 15, 2009

To DBA or PhD in Business Administration

One of the questions I am most frequently asked by graduate students concerns the differences between the degrees of Doctor of Business Administration (DBA) and Doctor of Philosophy (PhD) in business administration. Inevitably, these discussions lead to the matter of theory and its role in business research. What a good topic!

Learners should realize that theories (or concepts) provide the essential framework for all business research, including research conducted in fulfillment of requirements for both the DBA and PhD. All successful doctoral dissertations are grounded in good theory and evidence. The PhD is not a “theoretical” degree devoid of empirical evidence, and the DBA is not a “practical” degree devoid of theory. Rather, both degrees are grounded in the management literature and evidentiary support. You do not get out of theory or the rules of evidence simply by choosing one or the other academic degree.

The essential difference between the DBA and PhD in business administration is that the former focuses on the application of theory rather than on the development of new theory. Because business administrators are in constant search of better concepts and methods for conducting enterprise, PhD candidates can always find new topics for basic research. Likewise, while it is true that a general theory of management has yet to emerge, that does not mean candidate theories are not out there. Thus, DBA learners can always find research problems where existing concepts and methods can be applied to solve business problems.

Both the DBA and PhD in business administration are about good theory and evidence – in other words, scholarship.

Friday, August 14, 2009

Economic Policies in Dystopia

In Implications of the Financial Crisis, I previously wrote about the near-term monetary and fiscal policy options available to decision makers for responding to the then-emerging economic crisis, as framed by Dr George Cooper (2008, “The Origin of Financial Crises,” Vintage). The policy options he offered at the time included the “free market route,” which entails allowing the credit contraction and underlying asset deflation to play out. His second option was for policy-makers to continue applying fiscal and monetary stimulus in an effort to trigger a new economic expansion with enough power and momentum to negate the credit contraction. Dr Cooper’s final option was for policy-makers to “unleash the inflation monster,” which means simply “printing money” in order to negate debt through either state-funded handouts or deliberate inflationary spending policies. Of course, none of these options is particularly attractive. My conclusion at the time was that the US had probably already set out on a course towards inflation.

Since last December, Western countries have implemented an unprecedented array of fiscal and monetary initiatives designed to expand the economy and mitigate the severity of the ongoing economic crisis. Almost all of these initiatives entail heavy spending and borrowing by governments. The scale of these actions suggests that Western nations have elected to embark on programs to stimulate the global economy in an effort to restore capital flows and financial stability. The good news is that there are at least some indications that the economic crisis has bottomed out, but no one really knows at this point.

However, should the stimulus programs eventually fail to place the global economy on track to a robust recovery, then the looming question will become how Western governments might eventually pay for the spending spree they gleefully embarked upon. Prof Kenneth Rogoff (2009, The Confidence Game, Project Syndicate) suggests that governments may have few policy options remaining:
Within a few years, Western governments will have to sharply raise taxes, inflate, partially default, or some combination of all three.
If Prof Rogoff is correct in suggesting that Western governments may soon have to choose from these dreary options, then the economic future for society is arguably bleak. None of these policies would be popular or easy to implement. Nevertheless, there is a certain reality found in the multiple approach-avoidance content of these choices, and it may be time for policy-makers (and voters) to begin thinking about which option (or combination of options) they might prefer.

Wednesday, August 12, 2009

Too Big to Fail, or Just Too Big?

I just read a discussion paper by Dr James B Thomson of the Research Department of the Federal Reserve Bank of Cleveland (2009, “On Systemically Important Financial Institutions and Progressive Systemic Mitigation”), in which he proposes various criteria for identifying and supervising financial institutions that are “systemically important.” According to Dr Thomson:
Delineating the factors that might make a financial institution systemically important is the first step towards managing the risk arising from it. Understanding why a firm might be systemically important is necessary to establish measures that reduce the number of such firms and to develop procedures for resolving the insolvency of systemically important firms at the lowest total cost (including the long-run cost) to the economy.
Dr Thomson further argues that disclosing the identity of firms that may eventually be designated “systemically important” would require “constructive ambiguity” in order to ensure the market is not misled into believing certain firms retain special dispensations in the form of government guarantees.
The choice of disclosure regime would seem to be between transparency (publication of the list of firms in each category) and some version of constructive ambiguity, where selected information is released… In the context of central banking and financial markets, the term [constructive ambiguity] refers to a policy of using ambiguous statements to signal intent while retaining policy flexibility. In the context of the federal financial safety net, many have argued for a policy of constructive ambiguity to limit expansion of the federal financial safety net. The notion here is that if market participants are uncertain whether their claim on a financial institution will be guaranteed, they will exert more risk discipline on the firm. In this context, constructive ambiguity is a regulatory tactic for limiting the extent to which de facto government guarantees are extended to the liabilities of the firms that regulators consider systemically important.
After considering Dr Thomson’s ideas, I am left with serious doubts. My first doubt concerns the dogma implied by “systemically important” (i.e., “too big to fail”). What does “systemically important” mean? What makes a company “systemically important?” Dr Thomson sidesteps the “too big to fail” proposition by coining the alternative phraseology, “systemically important,” which is equally fraught with normative relativism. The entire concept of “systemically important” lacks content validity, both in rhetoric and substance. To say a firm is “systemically important” is just another way of designating the firm as “too big to fail.”

My second doubt centers on the need for “constructive ambiguity” in disclosing the identity of firms that are designated as “systemically important.” The suggestion that “constructive ambiguity” will somehow protect the markets is preposterous. What the marketplace needs today is greater transparency, not less. The very notion of “constructive ambiguity” is laced with deceit. Ambiguity can only further harm the stature and credibility of our financial markets, especially given the recent collapse of public confidence in the face of the ongoing economic crisis.

My final comment is to offer a new suggestion for dealing with firms that are either “systemically important” or “too big to fail,” and that is to treat such firms as simply too big to keep around. Firms that are so large as to become “systemically important” or “too big to fail” should be broken up into smaller companies, thus advancing the competitive spirit of the marketplace while ensuring that no firm becomes so large that its misfortunes could threaten the financial stability of our nation.

Download

Tuesday, August 11, 2009

More Small Businesses Needed

A new report by Dr John Schmitt and Nathan Lane of the Center for Economic and Policy Research in Washington, DC (2009, “An International Comparison of Small Business Employment”) dispels some misconceptions about the scale of small business employment in the US. According to the report, the US has a much smaller small-business sector (as a share of total employment) than Canada and essentially all of Europe. The authors suggest that the relatively high direct cost of health care discourages small business formation in the US. In contrast, small businesses and start-ups in other countries tend to rely on government-funded health care systems. As of 2007, the US self-employment rate was well below that of other nations.


Download

Friday, August 07, 2009

Statisticians in Demand

We live in a world where data is the raw material that analysts use to produce information and, ultimately, knowledge. Without analysis, however, data remains just raw material. This might explain the recent upsurge in the hiring of statisticians. Steve Lohr (“For Today’s Graduate, Just One Word: Statistics,” NYT, Aug 5, 2009) argues that statistics may be the password for hiring in the coming years:
The rising stature of statisticians, who can earn $125,000 at top companies in their first year after getting a doctorate, is a byproduct of the recent explosion of digital data. In field after field, computing and the Web are creating new realms of data to explore — sensor signals, surveillance tapes, social network chatter, public records and more. And the digital data surge only promises to accelerate, rising fivefold by 2012, according to a projection by IDC, a research firm.

The demand for statisticians is consistent with a larger trend toward competing on analytics in enterprise. This trend has also given impetus to the need for other experts, especially in computer programming.

Though at the fore, statisticians are only a small part of an army of experts using modern statistical techniques for data analysis. Computing and numerical skills, experts say, matter far more than degrees. So the new data sleuths come from backgrounds like economics, computer science and mathematics.

Over the past several decades, firms have invested heavily in data management technology, including server and data-warehousing systems. These investments have created massive amounts of raw data that are begging to be analyzed by people trained and skilled in descriptive and inferential statistics, stochastic modeling, linear and non-linear forecasting, and so forth. The creation of so much raw data in recent years makes statistical analysis of that data a vital value-adding activity that enables competing on analytics.

“I keep saying that the sexy job in the next 10 years will be statisticians,” said Hal Varian, chief economist at Google. “And I’m not kidding.”

Thursday, August 06, 2009

Controlling vs Collateralizing Risk

Regulatory reforms that focus on improving risk controls rather than increasing capital reserves are the better path for the future of banking, according to Katsunori Nagayasu, president of the Bank of Tokyo-Mitsubishi UFJ and chairman of the Japanese Bankers Association (“How Japan Restored Its Financial System,” WSJ, Aug 6, 2009).
Regulatory authorities around the world are currently discussing ways to prevent another financial crisis. One idea is to mandate higher levels of capital reserves. Japan’s banking reform shows that a comprehensive solution would work better.
Requiring banks to increase capital reserves is itself “risky.” For one thing, banks may not be able to raise sufficient capital in the equity markets to meet the revised capital requirements. Moreover, raising capital requirements tends to disadvantage banks that focus on traditional borrowing and lending transactions, and to advantage banks that trade and take risks with their own accounts.
A new regulatory framework must also distinguish between banks whose main business is deposit taking and lending—the vast majority of banks worldwide—and banks that trade for their own account. The recent financial crisis demonstrated that balance sheet structure matters. Trusted banks with a large retail deposit base continued to provide funds to customers even in the depths of the crisis, whereas many banks that relied heavily on market funding or largely trading for their own account effectively failed. Investment banks with higher risk businesses by nature should be charged a higher level of capital requirement—otherwise, sound banking will not be rewarded.

That the government has undertaken to save only the largest banks under the “too big to fail” presumption is of concern to the public for a variety of reasons, not the least of which is that such an approach may actually reward the banks that are taking the biggest risks, while closing those that have played by the rules. Additionally, requiring banks to maintain excessive capital reserves may sound good, but high reserves bring reduced capital efficiency, particularly at a time when money is scarce.

Regulators would be wise to consider the capital efficiency of the reforms they intend to invoke, or the current recession could extend well into the future. The US should take a lesson from the Japanese banking experience and focus on new ways to control risk, rather than simply collateralizing it.

Monday, August 03, 2009

In Defense of Financial Theories

I recently read a ridiculous critique of Value at Risk (VaR) by Pablo Triana in BusinessWeek (“The Risk Mirage at Goldman,” Aug 10, 2009). His review of this advanced financial technique is scathing:
VaR-based analysis of any firm's riskiness is useless. VaR lies. Big time. As a predictor of risk, it's an impostor. It should be consigned to the dustbin. Firms should stop reporting it. Analysts and regulators should stop using it.
Mr Triana bases his assertion on the observation that VaR is “a mathematical tool that simply reflects what happened to a portfolio of assets during a certain past period,” and that “the person supplying the data to the model can essentially select any dates.” My response to his argument is simply to ask, “Isn’t that true of any model or theory…?” Mr Triana goes on to argue that:
VaR models also tend to plug in weird assumptions that typically deliver unrealistically low risk numbers: the assumption, for instance, that markets follow a normal probability distribution, thus ruling out extreme events. Or that diversification in the portfolio will offset risk exposure.
In essence, Mr Triana seems to be saying that normally distributed results have bounds, and that portfolio diversification does not offset risk. Neither of his assertions is supported by probability theory or the empirical evidence. Yet, Mr Triana goes on to conclude, “it’s time to give up analytics so that real risk can be revealed.”
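For readers unfamiliar with the mechanics, a one-day VaR estimate by historical simulation is nothing more exotic than a percentile of past portfolio returns. The minimal Python sketch below uses simulated, fat-tailed returns purely for illustration; the portfolio value and confidence level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical history of daily portfolio returns (in practice, observed data)
returns = rng.standard_t(df=4, size=1_000) * 0.01  # fat-tailed, ~1% daily scale

portfolio_value = 10_000_000  # $10m, assumed
confidence = 0.99

# Historical-simulation VaR: the loss not exceeded on `confidence` of past days
var_return = -np.quantile(returns, 1 - confidence)
var_dollars = var_return * portfolio_value
print(f"1-day 99% VaR: {var_return:.2%} of portfolio, about ${var_dollars:,.0f}")
```

The choice of look-back window and percentile is exactly the kind of modeling judgment Mr Triana criticizes, but it is a documented assumption rather than a reason to abandon the measure.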

Mr Triana does a disservice to the financial services industry and public at large with his dramatic commentary. Yes, the discipline of finance has much to learn from the ongoing economic crisis, and of course, financial theory in general will evolve based on these recent lessons. However, just because one gets a bad meal in one restaurant does not mean that one should quit going to restaurants.

Financial theories such as VaR stand as state-of-the-art tools in the business of finance and risk management. These techniques are grounded in the same stochastic methodologies that are used by engineers in virtually every industry. To dismiss VaR so completely without considering its utility for supporting effective financial decisions is tantamount to sending financial theory back to the dark ages. Our knowledge of finance needs to advance as a result of what is happening in the economy, not go backwards.

Sunday, August 02, 2009

Government Spending and Gross Domestic Product

Data recently released by the US Department of Commerce suggests that government spending as a percentage of Gross Domestic Product (GDP) remains historically low, despite government efforts to increase its consumption significantly. According to the GDP data for the second quarter of 2009, government consumption stood at 20.7% of total US GDP (as of June 30). This compares with 23.6% for the same period in 1952, 22.0% in 1962, 21.5% in 1972, 20.7% in 1982, 20.1% in 1992, and 18.6% in 2002. The data indicate that government expenditures as a percentage of GDP have been declining since the early 1950s, and remain relatively low by this measure. Note that government expenditures in the GDP accounts include defense and non-defense spending, as well as spending by state and local governments. The current rates of government consumption as a percentage of GDP are by no means alarming, at least by historical standards. Of greater concern might be the negative impact of net exports/imports on US GDP in recent years.

Friday, July 31, 2009

Enter the Algorithm

In this occasional paper, I echo Dr David Berlinski's view that the proliferation of powerful personal computers during the late twentieth century accelerated the quest for algorithms as drivers of technological and human progress. While calculus is credited with bringing high order to the physical sciences, it is now the algorithm that thrives as the intelligent artifact of the new millennium.

Download

Thursday, July 30, 2009

The Spreadsheet Reinvented

My father was an accountant, and I vividly recall his working on spreadsheets (also known as ledger sheets) on the dining room table late at night. He once gave me a short lesson about how to make proper entries onto a spreadsheet. I recall his emphasis on neatness and penmanship, selecting the proper pencil (my father preferred the Ticonderoga No. 3), and having a serviceable gum eraser nearby for making clean erasures. In the early 1980s, he introduced me to electronic computing. At the time, VisiCalc was all the rage. Later, Lotus 1-2-3 came into vogue. Today, Excel is the most widely used spreadsheet program in the world, and has almost completely replaced ledger paper in professional practice (though some small businesses still rely on paper ledgers to this day).

Of course, the essential features of the electronic spreadsheet are inherited from its paper lineage, and the ontological purpose of all spreadsheets is the same, regardless of whether one is using an electronic or paper version of the tool. Nevertheless, it's interesting to compare the paper spreadsheet to its electronic successor in an effort to better understand why the spreadsheet remains essential to the practice of finance and accounting.

Below, you will find a short definition of the term “spreadsheet,” together with two download files – one is a facsimile of a paper spreadsheet, the other an electronic spreadsheet. Once you have downloaded both files, place them side-by-side on your monitor screen and then ask yourself the following question: Can one replace the spreadsheet without reinventing the spreadsheet? I look forward to your comments.
spread·sheet n. 1. A piece of paper with rows and columns for recording financial data for use in comparative analysis. 2. Computer Science An accounting or bookkeeping program that displays data in rows and columns on a screen.
Paper Spreadsheet

Electronic Spreadsheet

Wednesday, July 29, 2009

Small Investors Beware

High frequency (or algorithmic) trading was one of the major investment innovations to emerge in the late 20th century. Given the effectiveness and profit potential of such methods, it comes as no surprise to learn that 46 percent of daily volume originates through high frequency strategies.
Powerful computers, some housed right next to the machines that drive marketplaces like the New York Stock Exchange, enable high frequency traders to transmit millions of orders at lightning speed and, their detractors contend, reap billions at everyone else’s expense… High-frequency specialists clearly have an edge over typical traders, let alone ordinary investors… Powerful algorithms — “algos,” in industry parlance — execute millions of orders a second and scan dozens of public and private marketplaces simultaneously. They can spot trends before other investors can blink, changing orders and strategies within milliseconds… These systems are so fast they can outsmart or outrun other investors, humans and computers alike. (Duhigg, “Stock Traders Find Speed Pays, in Milliseconds,” NYT, 23 Jul 2009)
Unfortunately, the methods of high frequency trading are neither available nor accessible to the average investor. My advice to most investors is to invest in public companies only with funds that you can afford to lose. My strategy of choice for serious investors is to target companies in which you are an active owner, partner, or director, in order to ensure you have full access to the fundamental information you need to monitor your investments wisely (which, incidentally, is the same strategy apparently used by Warren Buffett, George Soros, and Carl Icahn).

Institutional investors and other major players dominate modern day investing with sophisticated methods and technologies that the average investor cannot hope to match. Unless regulators can find a way to level the playing field, small investors should beware of the markets.

Tuesday, July 28, 2009

The Ascent of Money

From Main Street to Wall Street, the ongoing financial crisis has changed the very nature of society and enterprise. The PBS production, The Ascent of Money with Niall Ferguson, can now be viewed online. The four-hour documentary seeks to trace the evolution of money and its impact on society throughout history. I commend the film to my readers.

The Ascent of Money on PBS

Wednesday, June 10, 2009

Business Intelligence and Spreadsheet Redux

A recent survey released by Nigel Pendse and the Business Application Research Center (2009, “BI Survey 8,” BARC) seems to confirm that business intelligence (BI) is less the domain of information technology (IT) than it is of "disenfranchised" spreadsheet-users. Stephen Swoyer of The Data Warehouse Institute (2009, “Report Debunks BI Myth”) offered this commentary on the BARC survey results:
Business intelligence vendors like to talk up a 20/80 split -- i.e., in any given organization, only 20 percent of users are actually consuming BI technologies; the remaining 80 percent are disenfranchised. According to "BI Survey 8," however, most shops clock in at far below the 20 percent rate. In any given BI-using organization…, just over 8 percent of employees are actually using BI tools. Even in industries that have aggressively adopted BI tools (e.g., wholesale, banking, and retail), usage barely exceeds 11 percent.
James Standon of nModal Solutions (2009, “Business Intelligence Adoption Low and Falling”) concludes that analysts tend to choose BI tools that are best able to get the job done, and more often than not, that tool is the electronic spreadsheet:
Big business intelligence seems to think that BI for the masses is a tool problem - something in how their portal works, or how many rows of data per second their appliance can process. Sure, if the tools are hard to use or learn, it's a factor, but I think more often than not business intelligence isn't used because it's not providing what is required… Often, people use Excel [Microsoft] because last week they didn't know exactly what they needed, and it is a tool that lets them build it themselves this week when the boss wants the answer and there is a decision to make. With all its flaws, it's still the most adopted business intelligence tool in the world.

Friday, June 05, 2009

Sharing vs Collaborating Organizations

Recently, I came across a dichotomy in terminology that was not simply instructive, but explanatory. The terms to be compared were “sharing” and “collaborating.” According to the Free Dictionary, sharing means “to participate in, use, enjoy, or experience jointly or in turns,” while collaborating means “to work with another or others on a joint project.” The implied word analogies are instructive: sharing is to participate, use, enjoy, and experience as collaborating is to work, produce, achieve, and attain.


So, what is my point? What I am trying to say is that sharing is not the same as collaborating, especially when it comes to work. Too often, co-workers are willing to share data and information, but their motivation in sharing is merely to participate, use, enjoy, and otherwise experience an outcome devoid of personal responsibility. Conversely, co-workers who collaborate in a joint effort are working to achieve some ends to which each collaborator extends some degree of personal responsibility.

With these distinctions in mind, how would you describe your organization of interest? Is it a sharing organization or a collaborating organization? Is the purpose of the organization lost in confounding experiences and participation, or is the purpose of your organization found in joint work endeavoring to achieve a common purpose or cause?

In my past writings, I have argued that the future of enterprise depends upon its capacity to be inclusive, transparent, and inventive. But, to achieve these ends, the enterprise must first transcend the passive sharing inclinations of its members to become instead a vibrant collaborating organization where work is defined by its combined efforts to achieve common goals. While it may be fun to share, collaboration is what advances enterprise to a higher cause.

Wednesday, May 20, 2009

The Future of Enterprise

The political changes that are sweeping our nation are also affecting the future of enterprise. While the “old guard” of society seems to be desperately seeking evidence of a return to common ideals and shared experiences, what is emerging instead is a set of demands for change, grounded in a diffusion of concepts and cultures, that nonetheless appears to be converging into a tidal wave enveloping enterprise and governments alike. The integrating vectors of this new wave of constituent thinking call on the future of enterprise to be inclusive, transparent, and inventive:

Be inclusive. Pluralism requires that every stakeholder be a supportive party to the solution. The days of elitism (where a few decide what is best for everyone) and populism (where a simple and often slim majority imposes its will on the remainder) are waning in favor of a new yearning for super-majorities of hyperactive constituents.

Be transparent. Stakeholder confidence in enterprise management regimes can only “reset” if the financial state of the firm is reported in real-time, and the underlying assumptions of financial projections are fully disclosed. Indeed, all future paths for enterprise management (including governance) should be paved in disclosure.

Be inventive. Effective enterprise will require new concepts and technologies that transcend monopolization and commoditization in order to achieve more authentic and genuine forms of competitive advantage that respect the community, environment, and security challenges of our time. Moreover, while past solutions have most often focused on ways to mitigate risk, the time has arrived for inventive thinking that actually reduces the risks our society fears most, both natural and manmade.

As a businessperson, I have begun looking for new projects that are “real,” by which I mean projects that require inclusive teamwork, non-proprietary transparency, and an inventive spirit that seeks to address business problems in ways never before considered. The future of enterprise is upon us – and it is real.

Tuesday, May 19, 2009

Spreadsheets Are Back

A recent poll of over one thousand LinkedIn members returned some interesting insights into spreadsheet usage patterns in companies. Respondents were presented with the following statement and asked to choose a single response:
I use spreadsheets _____ in my work.
· Never
· Rarely
· Monthly
· Weekly
· Daily
The results showed that 80% of respondents use spreadsheets on a daily basis, while another 11% use spreadsheets weekly. In all, over 90% of respondents are apparently using spreadsheets at least weekly in their jobs.

The other interesting finding was that spreadsheet ubiquity was greatest in enterprise-scale and large firms, where a full 85% of respondents reported using spreadsheets on a daily basis and another 10% reported weekly usage.

One respondent left a comment claiming to have selected "daily" only because "constantly" and "hourly" were not offered as options. Still another respondent voiced surprise that "daily" users were less than 95%. One apparent critic of spreadsheets commented that the poll was "a waste of time."

Results were generally even across age groups. However, males reported somewhat higher daily spreadsheet usage than females. The survey was open to all LinkedIn users between April 24 and May 19, 2009. There were 1,094 voluntary participants in the survey.

More

Monday, May 18, 2009

Challenging Commoditization and Monopolization

I recently responded to an article by Max J Pucher entitled “The End of Capitalism,” and I wanted to share that response here in order to introduce my views about how economic reform must be subordinated to political reform if we are to resolve our society’s economic woes:

I have to agree that “capitalism” is not what I see operating in the new economy. In fact, governments long ago rallied in support of two opposing themes that are closely implicated in the economic crisis. These two contradictory themes are monopolization and commoditization. Everyone knows what a monopoly is, and it is evident that enterprise-scale monopolies such as government and healthcare are thriving in the new economy. The second theme of the new economy has been the rampage of commoditization through major industries, whereby goods and services become undifferentiated, resulting in a loss of pricing flexibility. Lower prices are good news for consumers, but they often come at the cost of lower wages for workers. Two industries that have been ravaged by commoditization are financial services and automobile production. That these industries would become the leading scapegoats of the expanding economic crisis is of little surprise.

The global economic system has become a corrupt choice between supporting “big government” and supporting “big business,” and the real losers are the people and society. This polarization of the political economy is untenable. Society’s only hope is that the electorate reframe the debate from “economic” to “political” reform, whereby we the people accept that elitism (which advocates extreme commoditization of industries) and populism (with its relentless and expanding commitment to government-sponsored monopolies, including national security and healthcare) should now yield to pluralism, and that the energies and resources of government should be devoted instead to the broader needs of a more active and relevant citizenry, albeit at the expense of “enterprise”-scale designs. Indeed, it may be time to revisit the notions of republicanism and federalism as the political systems upon which to ground capitalist society. We, the people, have much work ahead of us to reinvent our world into pluralism in our time. Let’s hope it’s not too late.

Friday, May 15, 2009

Limited Purpose Banking

Prof Laurence Kotlikoff and Dr John Goodman make an interesting case for so-called "limited purpose banking," the essence of which is that "banks would let people gamble, but they would not themselves gamble." Their modest proposal has intriguing potential. Check it out...

Back to Basics

Tuesday, April 28, 2009

Analytical Forecasters Needed

The ability to forecast results is now the top concern of US, Asian, and European chief financial officers, according to a recent survey of nearly 1,300 senior finance executives by CFO Europe, Tilburg University, and Duke University (CFO, April 2009). That finance chiefs now rank their ability to forecast effectively as their top internal concern is instructive for the future of enterprise management. Other issues, such as working capital management, maintaining morale, and counterparty risk, ranked behind forecasting. As the global economic crisis continues its onslaught across the enterprise landscape, there appears to be a crying need for experts (and expert systems) in analytical forecasting around the world.

Monday, April 27, 2009

Risk Management in Demand

As the global financial disaster continues unabated, research is beginning to surface findings about some of the causes of the storm, as well as the precautionary measures that might avert future crises of this nature. In a recent survey of over 500 key financial executives conducted by MPI Europe (April 2009), several important views prevailed. One of the survey's strongest findings was the perceived need to develop a “risk management culture” in today’s financial institutions, including bolstering the relative power of risk management functions vis-à-vis their trading counterparts. Now, as good as that sounds, I am skeptical as to whether our financial services industry has it within itself to embed a new risk-aware culture without demonstrable intermediate measures to lead the way (after all, our world is inspired by capitalism). The good news is that several other findings were more specific and actionable. Over 75 percent of respondents saw a shortage of sufficiently and appropriately trained personnel as having a “high impact” on creating the crisis. Additionally, a significant majority of respondents wanted to see an improvement in their “risk management applications,” including a shift from predominantly quantitative measures toward qualitative methodologies (e.g., internal controls). Both of these latter measures are fully actionable through increased investment in risk management technologies and training. Moreover, implementing stronger spreadsheet control regimes, coupled with stricter guidelines for spreadsheet checking and auditing, is another immediate requirement. Finally, I would argue that by funding and initiating improved risk management technologies and training, executives will be taking the first vital steps toward creating the risk management culture that they seek. The recognized need for effective risk management is gaining traction in today’s financial services industry. The real question remains whether the industry’s leaders will have the courage to recognize the deficiencies of their existing risk management structures and respond by investing in the technologies and training that can address those shortcomings.

Tuesday, April 21, 2009

Valuing Business Intelligence

I have a theory I am pondering based on my reading of the emerging business intelligence literature. My hypothesis is that the more efficient and simple the analytical framework between the data and the decision, the more valuable the intelligence becomes. Said another way, if intelligence is what connects data to decisions, then the value of the intelligence increases as the analytical framework that supports and generates the intelligence is simplified. A related research question would be whether decision-makers who do their own analytical work make better decisions than those who rely on intermediaries. If I ever test the theory, I suppose the results could have implications for existing and emerging enterprise resource and risk management regimes, as well as management science in general. If you are contemplating a future research topic, this might provide you with a starting point.

Thursday, April 09, 2009

The Ten Commandments of Risk Analysis

The Ten Commandments of risk analysis according to Prof Granger Morgan and Prof Max Henrion (1990) are listed below; a small illustrative sketch of the seventh follows the list:
  • Do your homework with literature, experts, and users
  • Let the problem drive the analysis
  • Make the analysis as simple as possible, but no simpler
  • Identify all significant assumptions
  • Be explicit about decision criteria and policy strategy
  • Be explicit about uncertainties
  • Perform systematic sensitivity and uncertainty analysis
  • Iteratively refine the problem statement and analysis
  • Document clearly and completely
  • Expose to peer review
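
Commandment seven (systematic sensitivity and uncertainty analysis) is, in my experience, the one most often skipped. Below is a minimal sketch, my own illustration rather than anything from Morgan and Henrion, of what it can look like in practice: a hypothetical cost model with assumed input distributions, propagated by Monte Carlo simulation and followed by a crude one-at-a-time sensitivity ranking. The model, the distributions, and the numbers are all assumptions chosen purely for illustration.

```python
# Illustrative sketch only: a hypothetical cost model with assumed input
# distributions, propagated with Monte Carlo sampling (NumPy assumed available).
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo draws


def total_cost(unit_cost, volume, overhead):
    # Toy model: total project cost from three uncertain inputs.
    return unit_cost * volume + overhead


# Assumed input distributions (for illustration only).
unit_cost = rng.normal(10.0, 1.5, N)             # price per unit
volume = rng.lognormal(np.log(1000.0), 0.2, N)   # units sold
overhead = rng.uniform(2_000.0, 4_000.0, N)      # fixed costs

cost = total_cost(unit_cost, volume, overhead)
print(f"mean cost: {cost.mean():,.0f}")
print(f"90% interval: {np.percentile(cost, 5):,.0f} to {np.percentile(cost, 95):,.0f}")

# Crude one-at-a-time sensitivity: fix each input at its median and measure how
# much the output variance falls; larger reductions mean the input matters more.
baseline_var = cost.var()
inputs = {"unit_cost": unit_cost, "volume": volume, "overhead": overhead}
for name in inputs:
    frozen = dict(inputs)
    frozen[name] = np.full(N, np.median(inputs[name]))
    reduced_var = total_cost(frozen["unit_cost"], frozen["volume"], frozen["overhead"]).var()
    print(f"{name:>10}: approx. share of output variance = {1 - reduced_var / baseline_var:.2f}")
```

Freezing one input at its median and watching the output spread shrink is only a rough screen; a fuller treatment would use variance-based indices. Even this crude version, however, forces the assumptions into the open, which is the point of the commandment.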

Tuesday, April 07, 2009

Elevating the Roles of Risk Analysts and Managers

The ongoing economic crisis has brought with it a heightened interest in risk analysis and management. Yet something seems awry at the risk management desks of our nation’s largest banks and financial institutions. Goldman Sachs CEO Lloyd C Blankfein was recently quoted as saying, “too many financial institutions and investors simply outsourced their risk management -- rather than undertake their own analysis, they relied on the rating agencies to do the essential work of risk analysis for them.” Mr Blankfein contended that banks and financial institutions must elevate the status of risk analysis in order to alleviate the “systemic lack of skepticism” that he alleged was a precursor to the ongoing economic crisis. Mr Blankfein suggested that banks and financial institutions redefine the roles of risk managers, including giving them equal stature “with their counterparts in revenue producing divisions.” This redefinition of roles includes delegating greater responsibilities and authority to risk managers, whereby “if there is a question about a mark or a disagreement about a risk limit, the risk manager's view should prevail” (WSJ, April 7, 2009). My personal interpretation of Mr Blankfein’s statements is that those trained and charged with risk analysis and management were apparently left standing outside the boardroom as our nation’s banking and financial “executives” undertook investment policy-making in isolation. Let us hope that the regulatory changes to come might at least mandate that risk managers and analysts get a seat at the table.

Monday, March 30, 2009

It's All About Skills...

I was recently asked the following question about higher business education: "Does an MBA breed authenticity and humility, or arrogance in the name of leadership?" Well, if we can first transcend experience and degrees, and ask instead about skills (both hard and soft), then we begin to ask better questions, such as, "Who has the skills to get the job done?" Of course, skills can be acquired through both experience and higher education (or not). For this reason, I always ask management applicants about their skills, rather than their education or experience. "What skills do you bring to the table?" Responses such as, "I have years of experience," or "I have years of college," evade the question. My advice to everyone is to make your experience and higher education translate into more and more skills, and not to assume that simply being "present" in experience or education is the same thing. As for where a person acquires their skills, who cares? If we can all focus on skills, rather than experience and education, then we can be better assured that the right people are in leadership.

Tuesday, March 24, 2009

Working on Wall Street

In some ways, the financial services industry is overdue for a good house-cleaning. And no doubt, the culture of the money management business will change accordingly. Whereas excesses in staffing have been common at many financial services firms in recent years, the reality is that only those who produce value will have a place in the reworked financial services industry. Creating value in financial services is not easy. Which leads to one of my favorite interview questions for job applicants, "Explain how one creates value through finance..." Those who can respond effectively to that question have a future in financial services.

Friday, February 06, 2009

On Management "Style"

The matter of "baby-boomers" remaining in control of their destiny has significant implications for management, particularly since society has yet to send the "Bob Hope" generation packing. For example, the US Congress is still dominated by members born prior to World War II. For this reason, there may actually be a generation "skipping" tendency in favor of younger leaders from diverse backgrounds -- witness Barack Obama's rise to power. My sense is that there is a large segment of our society that views "change" as the replacement of anyone born prior to 1955 with much younger leaders. The impact of these views on management selection could be significant in the coming years. The second issue confronting management is the continuing demise of elitism, the entrenchment of populism, and the rise of pluralism. The net result could be a generation of leaders who insist on securing more than a simple majority in support of their decisions, requiring instead a level of support closer to a two-thirds majority. Imagine if the US President vetoed legislation because it did not come with two-thirds support of Congress. If the same approach took hold in corporations, it would require boards to listen carefully and in depth to all of their stakeholders. In summary, I believe that generational issues, coupled with society's move toward pluralist thinking, will require management to change its "style" of decision-making in the coming years -- probably for the best.

Tuesday, January 13, 2009

Risk Management in Review

I recently responded to a question in a public forum regarding the limitations of risk management. The specific question posed was, "What limits our ability to effectively manage risk?" This is an interesting question given the financial crisis still underway. My response follows:

Risk management continues to be a misunderstood discipline. The truth is we do have the analytics to understand and manage most kinds of risk (at least to some extent). The more serious problem confronting our society is our apparent inability to apply that knowledge. I'm reminded of the story of a Civil War soldier listening to one of his officers read from a newspaper. As the story goes, the officer quoted a report, remarking, "...it says here there were fifty percent casualties at the battle of..." The soldier responded by asking, "...wow, is that a lot?"

The point is that having the analytics to describe risk and having the knowledge and training to understand those analytics are two different things. My impression is that many (if not most) business leaders are poorly trained at understanding risk analytics beyond what might be described as layman's terms. What is most needed today is for our business leaders to become more knowledgeable about how to understand and use risk analytics in an effective and meaningful manner. The days of making guesses based on intuition are long gone, especially when those decisions can result in losses of billions of dollars, as well as suffering amongst the ranks of employees and other stakeholders who are ultimately victimized by those decisions.

My advice to business leaders at all levels is to make risk analysis a centerpiece of their training in graduate school. If you puzzle over terms such as variance, standard deviation, stochastic, optimization, and so forth, then it may be time to schedule some training in these skills as part of your lifetime learning plan.
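
For readers who want a concrete feel for a few of those terms, here is a minimal sketch in Python. It uses simulated (not real) return data; the return series, its distribution, and every parameter below are assumptions made purely for illustration, not market figures or a recommended methodology.

```python
# Illustrative sketch only: basic risk statistics on a simulated return series
# (the data and parameters below are assumptions, not market figures).
import numpy as np

rng = np.random.default_rng(7)

# Assumed history: 36 monthly portfolio returns drawn from a normal distribution.
returns = rng.normal(0.008, 0.04, 36)

mean = returns.mean()
variance = returns.var(ddof=1)   # sample variance: average squared deviation from the mean
std_dev = returns.std(ddof=1)    # standard deviation: the typical monthly swing ("volatility")
print(f"mean {mean:.4f}, variance {variance:.5f}, standard deviation {std_dev:.4f}")

# A stochastic view: simulate 10,000 possible one-year paths from those estimates
# and look at the spread of outcomes rather than a single point forecast.
paths = (1 + rng.normal(mean, std_dev, size=(10_000, 12))).prod(axis=1) - 1
print(f"5th / 95th percentile one-year return: "
      f"{np.percentile(paths, 5):.1%} / {np.percentile(paths, 95):.1%}")
```

Nothing in this snippet constitutes investment analysis; the point is simply that variance and standard deviation describe the spread of outcomes, and a stochastic simulation shows a range of possible futures rather than a single guess.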