Monday, August 31, 2009

Website Preferences of Social Networkers

The social networking phenomenon is quite interesting. Service providers such as Facebook, MySpace, LinkedIn, and Twitter are in wide demand. However, as with any new technology, we will likely see a number of successes, failures, and consolidations before a market leader emerges. In the meantime, Facebook appears to be assuming an early lead amongst its competitors.

In a recent article, eMarketer cites research by Anderson Analytics indicating that a majority of social networkers in every generational cohort use Facebook, while only a minority use Twitter and LinkedIn. Moreover, a majority of users younger than age 45 also use MySpace, apparently in conjunction with Facebook.

The summary data speak for themselves. The generational age groupings were as follows: generation Z (ages 13-14); generation Y (ages 15-29); generation X (ages 30-44); baby boomers (ages 45-65); and the WWII generation (older than age 65). Again, Facebook appears to be in the lead, at least for now.

Wednesday, August 26, 2009

The Normality of Surprises

Rationalizing away the possibility of extreme events (or surprises) under conditions of probabilistic normality is an interesting behavioral economics phenomenon in society. Such behavior is signaled by attributing outcomes to “uncertainty,” “anomalies,” “shocks,” or some other lofty but equally pretentious notion. The truth is that denying the possibility of extreme outcomes defies the logic of normality itself, as such outcomes are entirely possible (indeed, expected from time to time) under normal conditions.

Carl Friedrich Gauss (1777–1855) characterized what we now know as normality. The conceptual framework for describing normality is the normal frequency distribution, or “bell curve,” also known as the “Gaussian” distribution. Normality is a common assumption in models of many natural phenomena, and a central concept in probability theory.

Extreme values in the universe (or population) of outcomes occur naturally and more frequently than many presume. Under conditions of normality, about 1 observation in 22 will deviate from the mean by more than two standard deviations (the standard deviation being the square root of the variance), and about 1 in 370 (roughly 3 in 1,000) will deviate by more than three standard deviations. Note especially that the tails of the distribution extend to infinity, so arbitrarily extreme outcomes remain possible.
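For readers who want to check these frequencies, the two-sided tail probabilities of a normal distribution can be computed directly. The following is a minimal illustrative sketch using SciPy; it assumes nothing beyond the standard normal model.

```python
# Two-sided tail probabilities of the normal distribution:
# P(|X - mu| > k * sigma) for k = 2 and k = 3.
from scipy.stats import norm

for k in (2, 3):
    p = 2 * norm.sf(k)  # sf is the survival function, 1 - CDF
    print(f"Beyond {k} standard deviations: p = {p:.4f} (about 1 in {1 / p:.0f})")

# Approximate output:
#   Beyond 2 standard deviations: p = 0.0455 (about 1 in 22)
#   Beyond 3 standard deviations: p = 0.0027 (about 1 in 370)
```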

We as analysts have a duty to educate decision-makers about how to use probability theory to advance the cause of modern finance in society. That includes emphasizing the counter-intuitive possibilities of extreme events in the economy. To assume away the normality of such surprises would be naïve.

Saturday, August 22, 2009

The Speed of Thinking

The speed of communications in the post-modern world continues to accelerate exponentially. Yet, the speed at which the human mind can absorb and process information has changed little. As a result, society has become "queasy" from the sheer volume of data; just as humans slow down when nauseated, society’s spirit may be flagging under the bombardment of information.

According to Mr John Freeman (2009, “Not So Fast,” WSJ, Aug 21):

Speed used to convey urgency; now we somehow think it means efficiency.... There is a paradox here, though. The Internet has provided us with an almost unlimited amount of information, but the speed at which it works—and we work through it—has deprived us of its benefits. We might work at a higher rate, but this is not working. We can store a limited amount of information in our brains and have it at our disposal at any one time. Making decisions in this communication brownout, though without complete information, we go to war hastily, go to meetings unprepared, and build relationships on the slippery gravel of false impressions. Attention is one of the most valuable modern resources. If we waste it on frivolous communication, we will have nothing left when we really need it.

As communications speed up, so can “brain overload.” This raises the question of just how fast communications should go. Perhaps the speed of thinking could provide a useful governor.

Saturday, August 15, 2009

To DBA or PhD in Business Administration

One of the questions graduate students most frequently ask me concerns the differences between the degrees of Doctor of Business Administration (DBA) and Doctor of Philosophy (PhD) in business administration. Inevitably, these discussions lead to the matter of theory and its role in business research. What a good topic!

Learners should realize that theories (or concepts) provide the essential framework for all business research, including research conducted in fulfillment of requirements for both the DBA and PhD. All successful doctoral dissertations are grounded in good theory and evidence. The PhD is not a “theoretical” degree devoid of empirical evidence, and the DBA is not a “practical” degree devoid of theory. Rather, both degrees are grounded in the management literature and evidentiary support. You do not get out of theory or the rules of evidence simply by choosing one or the other academic degree.

The essential difference between the DBA and PhD in business administration is that the former focuses on the application of theory rather than on the development of new theory. Because business administrators are in constant search of better concepts and methods for conducting enterprise, PhD candidates can always find new topics for basic research. Likewise, while it is true that a general theory of management has yet to emerge, this does not mean that candidate theories are not out there. Thus, DBA learners can always find research problems where existing concepts and methods can be applied to solve business problems.

Both the DBA and the PhD in business administration are about good theory and evidence – in other words, scholarship.

Friday, August 14, 2009

Economic Policies in Dystopia

I have previously written, in Implications of the Financial Crisis, about the near-term monetary and fiscal policy options that Dr George Cooper (2008, “The Origin of Financial Crises,” Vintage) offered to decision makers for responding to the then-emerging economic crisis. The options included the “free market route,” which entails allowing the credit contraction and underlying asset deflation to play out. His second option was for policy-makers to continue applying fiscal and monetary stimulus in an effort to trigger a new economic expansion that might have the power and momentum to negate the current credit contraction. Dr Cooper’s final option was for policy-makers to “unleash the inflation monster,” which means simply “printing money” in order to negate debt through either state-funded handouts or deliberate inflationary spending policies. Of course, none of these options is particularly attractive. My conclusion at the time was that the US had probably already set out on a course towards inflation.

Since last December, Western countries have implemented an unprecedented array of fiscal and monetary initiatives designed to expand the economy and mitigate the severity of the ongoing economic crisis. Almost all of these initiatives entail heavy spending and borrowing by governments. The scale of these actions suggests that Western nations have elected to embark on programs to stimulate the global economy in an effort to restore capital flows and financial stability. The good news is that there are at least some indications that the economic crisis has bottomed out, but no one really knows at this point.

However, should the stimulus programs eventually fail to place the global economy on track to a robust recovery, then the looming question will become how Western governments might eventually pay for the spending spree they gleefully embarked upon. Prof Kenneth Rogoff (2009, The Confidence Game, Project Syndicate) suggests that governments may have few policy options remaining:
Within a few years, Western governments will have to sharply raise taxes, inflate, partially default, or some combination of all three.
If Prof Rogoff is correct in suggesting that Western governments may soon have to choose from these dreary options, then the economic future for society is arguably bleak. None of these policies would be popular or easy to implement. Nevertheless, there is a certain reality in the approach-avoidance conflict these choices present, and it may be time for policy-makers (and voters) to begin thinking about which option (or combination of options) they might prefer.

Wednesday, August 12, 2009

Too Big to Fail, or Just Too Big?

I just read a discussion paper by Dr James B Thompson of the Research Department of the Federal Reserve Bank of Cleveland (2009, “On Systemically Important Financial Institutions and Progressive Systemic Mitigation”), in which he proposes various criteria for identifying and supervising financial institutions that are “systemically important.” According to Dr Thompson:
Delineating the factors that might make a financial institution systemically important is the first step towards managing the risk arising from it. Understanding why a firm might be systemically important is necessary to establish measures that reduce the number of such firms and to develop procedures for resolving the insolvency of systemically important firms at the lowest total cost (including the long-run cost) to the economy.
Dr Thompson further argues that disclosing the identity of firms that may eventually be designated “systemically important” would require “constructive ambiguity” in order to ensure the market is not misled into believing certain firms retain special dispensations in the form of government guarantees.
The choice of disclosure regime would seem to be between transparency (publication of the list of firms in each category) and some version of constructive ambiguity, where selected information is released… In the context of central banking and financial markets, the term [constructive ambiguity] refers to a policy of using ambiguous statements to signal intent while retaining policy flexibility. In the context of the federal financial safety net, many have argued for a policy of constructive ambiguity to limit expansion of the federal financial safety net. The notion here is that if market participants are uncertain whether their claim on a financial institution will be guaranteed, they will exert more risk discipline on the firm. In this context, constructive ambiguity is a regulatory tactic for limiting the extent to which de facto government guarantees are extended to the liabilities of the firms that regulators consider systemically important.
After considering Dr Thompson’s ideas, I am left with serious doubts. My first concerns the dogma implied by “systemically important” (i.e., “too big to fail”). What does “systemically important” mean? What makes a company “systemically important”? Dr Thompson sidesteps the “too big to fail” proposition by coining the alternative phraseology, “systemically important,” which is equally laden with normative relativism. The entire concept of “systemically important” lacks content validity, both in rhetoric and substance. To say a firm is “systemically important” is just another way of designating the firm as “too big to fail.”

My second doubt centers on the need for “constructive ambiguity” in disclosing the identity of firms that are designated as “systemically important.” The suggestion that “constructive ambiguity” will somehow protect the markets is preposterous. What the marketplace needs today is greater transparency, not less. The very notion of “constructive ambiguity” is laced with deceit. Ambiguity can only further harm the stature and credibility of our financial markets, especially given the recent collapse of public confidence in the face of the ongoing economic crisis.

My final comment is to offer a new suggestion for dealing with firms that are either “systemically important” or “too big to fail”: treat such firms as simply too big to keep around. Firms that are so large as to become “systemically important” or “too big to fail” should be broken up into smaller companies, thus advancing the competitive spirit of the marketplace while ensuring that no firm becomes so large that its misfortunes can threaten the financial stability of our nation.


Tuesday, August 11, 2009

More Small Businesses Needed

A new report by Dr John Schmitt and Nathan Lane of the Center for Economic and Policy Research in Washington, DC (2009, “An International Comparison of Small Business Employment”) dispels some misconceptions about the scale of small business employment in the US. According to the report, the US has a much smaller small-business sector (as a share of total employment) than Canada and essentially all of Europe. The authors suggest that the relatively high direct cost of health care discourages small business formation in the US. In contrast, small businesses and start-ups in other countries tend to rely on government-funded health care systems. As of 2007, the US self-employment rate was well below that of other nations.



Friday, August 07, 2009

Statisticians in Demand

We live in a world where data is the raw material that analysts use to produce information and, ultimately, knowledge. Without analysis, however, data remains just that – raw data. This might explain the recent upsurge in the hiring of statisticians. Steve Lohr (“For Today’s Graduate, Just One Word: Statistics,” NYT, Aug 5, 2009) argues that statistics may be the password for hiring in the coming years:
The rising stature of statisticians, who can earn $125,000 at top companies in their first year after getting a doctorate, is a byproduct of the recent explosion of digital data. In field after field, computing and the Web are creating new realms of data to explore — sensor signals, surveillance tapes, social network chatter, public records and more. And the digital data surge only promises to accelerate, rising fivefold by 2012, according to a projection by IDC, a research firm.

The demand for statisticians is consistent with a larger trend toward competing on analytics in enterprise. This trend has also given impetus to the need for other experts, especially in computer programming.

Though at the fore, statisticians are only a small part of an army of experts using modern statistical techniques for data analysis. Computing and numerical skills, experts say, matter far more than degrees. So the new data sleuths come from backgrounds like economics, computer science and mathematics.

Over the past several decades, firms have invested heavily in data management technology, including server and data-warehousing systems. These investments have created massive amounts of raw data that are begging to be analyzed by people trained and skilled in descriptive and inferential statistics, stochastic modeling, linear and non-linear forecasting, and so forth. The creation of so much raw data in recent years makes statistical analysis of that data a vital value-adding activity that enables competing on analytics.
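To make the point concrete, here is a minimal, hypothetical sketch of the kind of value-adding analysis described above: descriptive statistics plus a simple inferential trend estimate applied to raw data. The sales figures and variable names are invented for illustration only.

```python
# A hypothetical illustration of turning raw data into information:
# descriptive statistics plus a simple linear trend forecast.
import numpy as np
from scipy import stats

# Invented monthly sales figures (raw data), for illustration only.
monthly_sales = np.array([112, 118, 121, 130, 128, 135, 141, 139, 148, 152])

# Descriptive statistics summarize the raw data.
print(f"mean = {monthly_sales.mean():.1f}, "
      f"std dev = {monthly_sales.std(ddof=1):.1f}")

# A simple inferential step: fit a linear trend and project one period ahead.
months = np.arange(len(monthly_sales))
slope, intercept, r_value, p_value, stderr = stats.linregress(months, monthly_sales)
forecast = intercept + slope * len(monthly_sales)
print(f"trend = {slope:.2f} per month (p = {p_value:.4f}), "
      f"next-month forecast = {forecast:.1f}")
```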

“I keep saying that the sexy job in the next 10 years will be statisticians,” said Hal Varian, chief economist at Google. “And I’m not kidding.”

Thursday, August 06, 2009

Controlling vs Collateralizing Risk

Regulatory reforms that focus on improving risk controls rather than increasing capital reserves are the better path for the future of banking, according to Katsunori Nagayasu, president of the Bank of Tokyo-Mitsubishi UFJ and chairman of the Japanese Bankers Association (“How Japan Restored Its Financial System,” WSJ, Aug 6, 2009).
Regulatory authorities around the world are currently discussing ways to prevent another financial crisis. One idea is to mandate higher levels of capital reserves. Japan’s banking reform shows that a comprehensive solution would work better.
Requiring banks to increase capital reserves is itself “risky.” For one thing, banks may not be able to raise sufficient capital in the equity markets to meet the revised capital requirements. Moreover, raising capital requirements tends to disadvantage banks that focus on traditional borrowing and lending transactions, and to advantage banks that trade and take risks with their own accounts.
A new regulatory framework must also distinguish between banks whose main business is deposit taking and lending—the vast majority of banks worldwide—and banks that trade for their own account. The recent financial crisis demonstrated that balance sheet structure matters. Trusted banks with a large retail deposit base continued to provide funds to customers even in the depths of the crisis, whereas many banks that relied heavily on market funding or largely trading for their own account effectively failed. Investment banks with higher risk businesses by nature should be charged a higher level of capital requirement—otherwise, sound banking will not be rewarded.

That the government has undertaken to save only the largest banks under the “too big to fail” presumption is of concern to the public for a variety of reasons, not the least of which is that such an approach may actually reward the banks that are taking the biggest risks, while closing those that have played by the rules. Additionally, requiring banks to maintain excessive capital reserves may sound good, but high reserves bring reduced capital efficiency, particularly at a time when money is scarce.

Regulators would be wise to consider the capital efficiency of the reforms they intend to invoke, or the current recession could extend well into the future. The US should take a lesson from the Japanese banking experience and focus on new ways to control risk, rather than simply collateralizing it.

Monday, August 03, 2009

In Defense of Financial Theories

I recently read a ridiculous critique of Value at Risk (VaR) by Pablo Triana in BusinessWeek (“The Risk Mirage at Goldman,” Aug 10, 2009). His review of this advanced financial technique is scathing:
VaR-based analysis of any firm's riskiness is useless. VaR lies. Big time. As a predictor of risk, it's an impostor. It should be consigned to the dustbin. Firms should stop reporting it. Analysts and regulators should stop using it.
Mr Triana bases his assertion on the observation that VaR is “a mathematical tool that simply reflects what happened to a portfolio of assets during a certain past period,” and that “the person supplying the data to the model can essentially select any dates.” My response to his argument is simply to ask, “Isn’t that true of any model or theory…?” Mr Triana goes on to argue that:
VaR models also tend to plug in weird assumptions that typically deliver unrealistically low risk numbers: the assumption, for instance, that markets follow a normal probability distribution, thus ruling out extreme events. Or that diversification in the portfolio will offset risk exposure.
In essence, Mr Triana seems to be saying that a normal distribution rules out extreme outcomes (i.e., that normally distributed results are bounded), and that portfolio diversification does not offset risk. Neither of his assertions is supported by probability theory or the empirical evidence. Yet, Mr Triana goes on to conclude, “it’s time to give up analytics so that real risk can be revealed.”

Mr Triana does a disservice to the financial services industry and public at large with his dramatic commentary. Yes, the discipline of finance has much to learn from the ongoing economic crisis, and of course, financial theory in general will evolve based on these recent lessons. However, just because one gets a bad meal in one restaurant does not mean that one should quit going to restaurants.

Financial theories such as VaR stand as state-of-the-art tools in the business of finance and risk management. These techniques are grounded in the same stochastic methodologies that are used by engineers in virtually every industry. To dismiss VaR so completely without considering its utility for supporting effective financial decisions is tantamount to sending financial theory back to the dark ages. Our knowledge of finance needs to advance as a result of what is happening in the economy, not go backwards.
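To illustrate the kind of technique at issue, here is a minimal sketch of a parametric (variance-covariance) VaR calculation under the standard normality assumption. Every input (portfolio value, weights, volatilities, correlation, and confidence level) is illustrative rather than a description of any firm's actual model; the point is simply that the normal model assigns real probability to losses beyond the VaR threshold, and that diversification measurably reduces the estimate.

```python
# A minimal sketch of parametric (variance-covariance) VaR for a two-asset
# portfolio, assuming normally distributed daily returns. All inputs are
# illustrative.
import numpy as np
from scipy.stats import norm

portfolio_value = 1_000_000          # USD, illustrative
weights = np.array([0.6, 0.4])       # asset weights
daily_vol = np.array([0.02, 0.03])   # daily return volatilities
correlation = 0.3

# Covariance matrix of daily returns.
corr = np.array([[1.0, correlation], [correlation, 1.0]])
cov = np.outer(daily_vol, daily_vol) * corr

# Portfolio volatility with and without the diversification benefit.
diversified_vol = np.sqrt(weights @ cov @ weights)
undiversified_vol = weights @ daily_vol   # perfect-correlation benchmark

z = norm.ppf(0.99)                    # one-tailed 99% normal quantile
var_diversified = portfolio_value * z * diversified_vol
var_undiversified = portfolio_value * z * undiversified_vol

print(f"99% one-day VaR (diversified):   ${var_diversified:,.0f}")
print(f"99% one-day VaR (undiversified): ${var_undiversified:,.0f}")
```

The historical-simulation variant of VaR (closer to what Mr Triana describes) replaces the normal quantile with an empirical quantile of past portfolio returns; the choice of look-back window is indeed a judgment call, but that is an argument for using the tool carefully, not for discarding it.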

Sunday, August 02, 2009

Government Spending and Gross Domestic Product

Data recently released by the US Department of Commerce suggest that government spending as a percentage of Gross Domestic Product (GDP) remains historically low, despite government efforts to increase its consumption significantly. According to the GDP data for the second quarter of 2009, government consumption stood at 20.7% of total US GDP (as of June 30). This compares with 23.6% for the same period in 1952, 22.0% in 1962, 21.5% in 1972, 20.7% in 1982, 20.1% in 1992, and 18.6% in 2002. The data indicate that government expenditures as a percentage of GDP have been declining since the early 1950s, and remain relatively low by this measure. Note that government expenditures in the GDP accounts include defense and non-defense spending, as well as spending by state and local governments. The current rate of government consumption as a percentage of GDP is by no means alarming, at least by historical standards. Of greater concern might be the negative impact of net exports/imports on US GDP in recent years.
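For reference, the share itself is simple arithmetic: government consumption divided by total GDP. The sketch below uses rounded, illustrative nominal figures (not official BEA values) that are consistent with the 20.7% share cited above.

```python
# A minimal sketch of the share calculation discussed above. The nominal
# dollar figures are rounded illustrations, not official BEA values.
gdp_q2_2009 = 14.15e12                # total US GDP, annualized (illustrative)
government_consumption = 2.93e12      # federal, state, and local consumption (illustrative)

share = government_consumption / gdp_q2_2009
print(f"Government share of GDP: {share:.1%}")   # roughly 20.7%
```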