Friday, January 29, 2010

Analytics at Work: Lessons from the 2007-2009 Financial Crisis

Prof Thomas H Davenport, Jeanne G Harris, and Robert Morison (2010) posit two major lessons learned from the 2007-2009 financial crisis:
Financial firms need to radically change their analytical focus. They need to make the assumptions behind their models much more explicit and transparent. They need to incorporate the systematic monitoring of analytical models into their businesses. They -- and their regulators -- need to be skeptical about the ability to model and manage risk in extraordinary times.
But, their "most important" lesson is directed specifically at management itself:
Financial executives need to learn much more about the models that are running their businesses. In search of outsized returns, they've taken on investment and debt securities that are bundled up in algorithmic combinations that they don't understand. Cowed by this accumulation of daunting numbers, these executives have abdicated responsibility for managing risk.
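One practical reading of the authors' call to incorporate systematic monitoring of analytical models is to track each model's live error against an explicit tolerance and escalate when it drifts. Below is a minimal sketch of that idea in Python (my own illustration, not the authors'; the error metric, window, and threshold are arbitrary, hypothetical choices):

  import numpy as np

  def monitor_model(predictions, actuals, tolerance=0.05, window=20):
      # Flag a model whose recent average absolute error exceeds a tolerance band
      errors = np.abs(np.asarray(predictions, dtype=float) - np.asarray(actuals, dtype=float))
      recent = errors[-window:] if errors.size >= window else errors
      return "REVIEW MODEL" if recent.mean() > tolerance else "OK"

  # Hypothetical example: a pricing model whose recent errors have widened
  predicted = [0.10, 0.11, 0.09, 0.30]
  observed  = [0.10, 0.10, 0.10, 0.10]
  print(monitor_model(predicted, observed))  # prints "REVIEW MODEL"

The arithmetic is trivial by design; the point is the habit it represents, in which a model's assumptions and its ongoing accuracy become something executives can see, question, and act upon.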
Reference: Davenport, T H, Harris, J G, & Morison, R (2010), Analytics at Work: Smarter Decisions, Better Results, Boston, MA: Harvard Business Press.

Monday, January 25, 2010

Probability of Global Asset Price Collapse

According to the Risks Interconnection Map posted by the World Economic Forum (2010), the probability of a severe collapse in global asset prices stands at 20 percent or greater.

View Interactive Map

Sunday, January 24, 2010

Modern Portfolio Management

I recently had reason to produce an illustration describing the essential inputs, decision processes, and outputs of modern portfolio management. What follows is the product of that effort.
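For readers who prefer the skeleton in executable form, here is a minimal sketch of the same input-decision-output loop for a mean-variance portfolio in Python (the assets, numbers, and risk-aversion parameter are hypothetical; this is an illustration accompanying the diagram, not a reproduction of it):

  import numpy as np

  # Inputs: expected returns and covariance matrix for three hypothetical assets
  mu = np.array([0.08, 0.05, 0.03])
  cov = np.array([[0.040, 0.010, 0.000],
                  [0.010, 0.020, 0.000],
                  [0.000, 0.000, 0.010]])

  # Decision process: unconstrained mean-variance trade-off with risk aversion lam,
  # followed by a simple normalization so the weights sum to one
  lam = 3.0
  w = np.linalg.solve(lam * cov, mu)
  w = w / w.sum()

  # Outputs: portfolio weights, expected return, and volatility
  print("weights:", np.round(w, 3))
  print("expected return:", round(float(mu @ w), 4))
  print("volatility:", round(float(np.sqrt(w @ cov @ w)), 4))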

The Minds Behind the Meltdown

In an effort to assign responsibility for the Wall Street financial meltdown that began in 2007, Scott Patterson (2010) makes the case that "quants" were the demons that somehow misled investment managers and the public:
On Wall Street, they were all known as "quants," traders and financial engineers who used brain-twisting math and superpowered computers to pluck billions in fleeting dollars out of the market. Instead of looking at individual companies and their performance, management and competitors, they use math formulas to make bets on which stocks were going up or down. By the early 2000s, such tech-savvy investors had come to dominate Wall Street, helped by theoretical breakthroughs in the application of mathematics to financial markets, advances that had earned their discoverers several shelves of Nobel Prizes.
Regarding the management teams that supervised these "quants," Patterson is mute. I take the alternative view that management is responsible for everything that happens or fails to happen within its domain. My question for the investment managers and policy makers is simply, "How could you be so dumb as to let your trading staffs run over you?" I point the finger directly at the investment policies and greed of the investment managers and other senior executives rather than the traders. After all, the public can hardly hold staff traders responsible for the devastating losses that ensued. The executive management teams of these investment institutions have a lot to answer for from my perspective. The minds behind the meltdown sit amongst the policy makers, not the traders.

Reference: Patterson, S (2010, January 23), The Minds Behind the Meltdown, Wall Street Journal Online.

Friday, January 22, 2010

Why Spreadsheets?

Why do financial analysts continue to prefer spreadsheets over other business intelligence tools? Consider the following scathing review of spreadsheets by Wayne Eckerson (2010, p. 13) of TDWI:
The fundamental problem is that spreadsheets are personal productivity tools, not an enterprise information delivery system. Individuals armed with spreadsheets generate their own data: they bring it into Excel, link it with other data sets, apply custom calculations, and format it according to their preferences. The resulting spreadsheet reflects their individual or departmental view of the business. It is highly unlikely that their view of the business harmonizes with other views, especially since other departments and corporate headquarters typically define basic entities and key metrics, such as customers, products, sales and profits, in entirely different ways. The use of spreadsheets in such a manner leads to the breakdown of corporate vocabulary and of the single version of the truth.
Yet the spreadsheet remains the tool of choice amongst financial analysts, arguably for the very qualities Eckerson criticizes. Interesting...

Reference: Eckerson, W (2010), Transforming Finance: How CFOs Use Business Intelligence to Turn Finance from Record Keepers to Strategic Advisors, Renton, WA: TDWI.

Analytical People

A white paper released by nGenera (2008) in collaboration with Prof Thomas Davenport identifies three levels of analytical people to consider when hiring:
1. Analytical professionals. Most successful analytical competitors have a core cadre of people who design and conduct experiments and tests, define and refine analytical algorithms, and perform data mining and statistical analyses on key data. In most cases, these individuals have advanced degrees – often Ph.D.s – in such analytical fields as statistics, operations research, logistics, economics, or econometrics.

2. Analytical semi-professionals. They can do substantial amounts of modeling and analysis using spreadsheets or visual analysis tools, but are unlikely to develop sophisticated new algorithms or models. These individuals might be, for example, quantitatively-oriented MBAs with deep knowledge and experience in the business process or function that’s the analytical focus of the enterprise.

3. Analytical amateurs. The employees who do the day-to-day work of the business also need to understand something of the analytical basis for operations and decisions. For example, if a lodging chain employs sophisticated analytics for revenue management, those who quote room prices to customers need to understand, at least to some degree, how prices are derived, and when they can be overridden.

Reference: Business Analytics: Six Questions To Ask About Information And Competition (2008), Austin, TX: nGenera Corp.

Business Analytics: Questions for Enterprise

Firms today are increasingly seeking competitive advantage through advances in business analytics and decision support systems. According to a white paper published by nGenera (2008) in collaboration with Prof Thomas Davenport:
The next wave of business reengineering is being powered by business analytics, and the potential performance breakthroughs are just as large as they were 15 or so years ago. Many of these breakthroughs will come through the ability to integrate the demand side of the house with the supply side of the house as never before. Even information-rich industries have tended to concentrate on one side or the other. With the power of business analytics, corporations can make and manage the demand-supply connections – a big step closer to the goal of optimizing the performance of the corporation as a whole.

Here are six topical questions (with supporting questions) posed by nGenera for companies seeking to compete analytically:

1. Where should we leverage business analytics?

  • What is our distinctive capability? On what basis do we choose to compete? And how clear and definitive are we about that choice?
  • What performance levels or innovations in this area would blow away the competition?
  • What information, knowledge, and insight would it take to perform that way? What are the biggest unanswered questions and biggest opportunities?
  • How would we act upon that information, knowledge, and insight?

2. Why now?

  • What are our direct competitors doing or attempting with business analytics? Is anyone in our industry jumping ahead in terms of analytical capability?
  • How are analytics changing our competitive landscape? Are we at risk from non-traditional competitors who may use analytics to encroach on our markets?
  • What emerging technologies of information integration and analysis should we be exploring more aggressively?
  • How fast can we launch a serious business analytics initiative? What’s holding us back?

3. What's the payoff?

  • What are our specific performance goals in the area where we choose to compete?
  • How well do we measure them? How might better measurement and analysis of today’s performance reveal tomorrow’s opportunities?
  • How well aligned are the organization, its management, and its stakeholders with these performance goals?
  • What’s our highest ambition? What would it mean in terms of revenue, profit, and market share if we were really to change the basis of competition?

4. What information and technology do we need?

  • Is the information we need at hand? Is the data that support our distinctive capability in one repository, with common definitions of key data elements?
  • Is this data integrated enough not only to be accessible, but also to be manipulated with analytical tools?
  • How completely and accurately does the information measure and represent our distinctive business capability and basis of competition? Is it up-to-date? What are the most glaring gaps and shortfalls?
  • Do we have the technologies in place to support business analytics in this area? Or is technology fragmentation holding us back?

5. What kinds of people do we need?

  • Do we have a critical mass of analytical professionals on staff? Are we prepared to hire them? Do we need to “rent” this talent in the short term to fill gaps?
  • Who can manage analytical professionals? Who has the necessary experience, credibility, and “bridging” skills?
  • Will we be ready to train employees to apply the analytical results and operate differently?
  • Is the organization at large oriented toward analytical decision-making, or is it wedded to yesterday’s procedures and rules of thumb? How quickly can the organization come up to speed analytically?

6. What roles must senior executives play?

  • Are we committed to competing on analytics, starting at the top of the organization? What are the CEO and executive team doing to demonstrate that commitment?
  • Is the leader of the analytical function prepared to act upon the results of the analyses? Are the roles and decision rights of other stakeholders, including the CFO and CIO, clear – especially when their roles are novel or overlap?
  • Do we have a project leader who can span the worlds of strategy, process performance, and analytics?
Reference: Business Analytics: Six Questions To Ask About Information And Competition (2008), Austin, TX: nGenera Corp.

Wednesday, January 20, 2010

Of Capital Flows

Is capital intensity properly measured as a stock or flow? Prof Frank H Knight (1885-1972) argued that “the quantities of economics are properly rates…” (1921/2002, p. 59). Knight's supposition underlies the derivatives concept to this day:
Utility, and any product yielding utility, must be thought of as a time-rate or intensity, not a thing. Even the notion of rate is unfortunate, as it suggests a quantity spread over a period of time. The dimensional relation is the inverse of this. The economic magnitude is inherently a process, something going on, not something existing. An analogy such as that of hydraulics is misleading in that water exists whether it flows or not, and its rate of flow is a quantity divided by a time interval. The correct analogy for economics is current electricity, or better still, light, which cannot be thought of as existing without flowing. A quantity of light is inherently a two-dimensional concept, an intensity multiplied by time, such as a candle-power-hour. The economic reality is of the same sort, a process going on in time. We cannot think of an instantaneous economic experience, still less plan for one. (Knight, 1967, pp. 126-127)
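Knight's dimensional point can be restated in modern notation (my own paraphrase, not Knight's). If I(t) denotes the economic intensity, the flow, then any stock-like magnitude Q arises only by integrating that intensity over an interval, exactly as a candle-power-hour is an intensity multiplied by a duration:

  Q = \int_{0}^{T} I(t)\, dt \approx I \cdot T, \qquad I(t) = \frac{dQ}{dt}

On this reading, the primitive economic object is the rate I(t); the accumulated quantity Q is derived from it, not the other way around.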
References:

Knight, F H (1967), The Economic Organization, with an Article "Notes on Cost and Utility", New York: Augustus M Kelley Publishers.

Knight, F H (1921/2002), Risk, Uncertainty and Profit, Washington, DC: BeardBooks.

Tuesday, January 19, 2010

The Beauty of Finance

Here's a quote by Prof Emanuel Derman (2004, p. 270) that every financial analyst can take to heart:
I like to think in Goethean terms of what we do in quantitative finance. We try to make as beautiful and truthful a description as we can of what we observe. We’re involved in intuiting [i.e., sensing], inventing, or concocting approximate laws and patterns. We combine both art and science in creating understanding. We use our intuition, our scientific knowledge and our pedagogical skills to paint a picture of how to think qualitatively, and then within limits, quantitatively, about the world of human affairs, and in so doing, we influence and are influenced by other people’s thoughts.
Reference: Derman, E (2004), My Life as a Quant: Reflections on Physics and Finance, Hoboken, NJ: John Wiley.

Sunday, January 17, 2010

Analytical Models in the Fog of Uncertainty

Dr Sam L Savage urges decision makers to keep an eye on the "big picture" when relying on analytical models to pilot a business (2009, p. 44):
If managing a business is like flying a plane, then analytical models are analogous to the instruments. Use them to calibrate your intuition while visibility is good. Then use them with caution if you are suddenly socked in by the fog of uncertainty.
Of course, there is a corollary to Savage's advice: Make certain the analytical instrumentation in your dashboard is active and working before the fog rolls in...

Reference: Savage, S L (2009), The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty, Hoboken, NJ: John Wiley.

Pascal on Analysis

Blaise Pascal (1623-1662) posited the following still-useful rules for constructing arguments in what he characterized as "geometric" analysis (Pascal, 1952, p. 443):
1. Do not leave undefined any terms at all obscure or ambiguous.
2. Use in definitions only terms perfectly well known or already explained.
3. Ask only that evident things be granted as axioms.
4. Prove all propositions, using for their proof only axioms that are perfectly self-evident or propositions already demonstrated or granted.
5. Never get caught in the ambiguity of terms by failing to substitute in thought the definitions which restrict or explain them.

Pascal's thinking remains instructive to this day for all forms of analysis.

Source: Pascal, B (1952), On Geometrical Demonstration (On the Geometrical Mind) (R Scofield, Trans). In R M Hutchins & M Adler (Eds), Great Books of the Western World (Vol 33, pp. 430-446), Chicago, IL: Encyclopedia Britannica.

Saturday, January 16, 2010

Toward Norms for the Development of Models

Today, modeling and data issues pervade social science research. Likewise, the discipline of finance is now confronting issues of model risk. If the nature of financial economics is similar to that of the social sciences, then perhaps the former might take a lesson from the latter. Prof W James Bradley and Prof Kurt C Schaefer offer the following "norms" to help researchers avoid the misuse of models and data (1998, pp. 145-151):
1. For each situation under analysis, practitioners benefit from a detailed understanding of the situation, from different disciplinary perspectives when possible, before the problem is modeled.

2. Practitioners should learn what is important enough to measure before trying to measure it, then decide which of the five measurement scales [i.e., nominal, ordinal, interval, ratio, absolute] is appropriate. One must then live within the bounds imposed by the characteristics of that measurement scale.

3. Random error terms convey information about the abstractions, approximations, ignorance, and measurement problems that are involved in the model we have constructed. The drawing of inferences should therefore involve a careful inspection of the residual errors between our data and our model.

4. “Probabilities” in the social and human sciences are degrees of warranted belief, not relative frequencies. But this means that the classical ratio scale of probabilities is not appropriate in the situations we are discussing. Therefore, the level of confidence in a result cannot be stated as a single number; significance involves a judgment about the reasonableness of the entire model and its data.

5. Levels of statistical significance are always somewhat arbitrary, but we should be especially skeptical in cases when (a) the social processes under study are extremely complex, with many auxiliary hypotheses complicating the primary hypotheses; (b) the entity being measured is not clearly definable, or there is a poorly developed theory of the entity and its relationship to the measureable variables, or the measurement instrument is not precise and reliable; (c) inappropriate measurement scales are used; (d) the statistical methods (and, when present, functional forms) employed are not consistent with the measurement scale; (e) the specification of the model and its functional form are not clearly justified by reference to the actual situation being modeled; (f) the error residuals are not observed and analyzed (e.g., some ANOVA and correlation studies); and (g) the quality of the data and reliability of the source are questionable. We should be particularly skeptical when the analyst does not fully disclose the relevant information on these topics. In fact, it should be a professional norm that the statement of one’s results must, as a matter of habit, discuss these details.
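Norm 3's insistence on inspecting residual errors is easy to make concrete. The following Python sketch uses fabricated data and a deliberately simple linear fit (the model, numbers, and checks are illustrative only, not Bradley and Schaefer's procedure):

  import numpy as np

  # Fabricated data: a linear signal with noise
  rng = np.random.default_rng(0)
  x = np.linspace(0, 10, 50)
  y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, size=x.size)

  # Fit a simple linear model and compute the residuals
  slope, intercept = np.polyfit(x, y, 1)
  residuals = y - (slope * x + intercept)

  # Inspect the residuals rather than relying on a single significance number
  print("mean residual:", round(float(residuals.mean()), 4))
  print("residual std: ", round(float(residuals.std(ddof=1)), 4))
  # A trend, funnel shape, or autocorrelation in the residuals signals that the
  # model, its functional form, or the measurement scale is misspecified.

The spirit of the norm is that a single significance level is no substitute for looking at where, and how systematically, the model misses.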

Reference: Bradley, W J & Schaefer, K C (1998), The Uses and Misuses of Data and Models: The Mathematization of the Human Sciences, Thousand Oaks, CA: Sage.

Wednesday, January 13, 2010

Abstractions Are Never Perfect

Dr John D Cook (2010) of The Endeavor makes a good point by arguing that it is "better to have a simple system than a complex system with a simple abstraction on top." Leonard Richardson and Sam Ruby (2007, p. xvi) reinforce Cook's point:
Abstractions are never perfect. Every new layer creates failure points, interoperability hassles, and scalability problems. New tools can hide complexity, but they can’t justify it... The more complex the system, the more difficult it is to fix when something goes wrong.
To the above, I will add a quote by Prof Robert W Grubbström (2001, p. 1133):
Simplicity is the opposite of complexity and simplicity is valuable.
References:

Cook, John D (2010), "Abstractions Are Never Perfect," The Endeavor, Retrieved on Jan 13, 2010 from http://www.johndcook.com/blog/2010/01/11/abstractions-are-never-perfect.

Grubbström, R W (2001), "Some Aspects on Modelling as a Base for Scientific Recommendations," Kybernetes, 30(9/10), 1126-1138.

Richardson, L & Ruby, S (2007), RESTful Web Services, Sebastopol, CA: O'Reilly Media.

Tuesday, January 12, 2010

Building Smarter Networks

If one assumes that "flat world" technologies have the potential to bring people together in support of a common purpose or cause, then why is the world not yet enjoying the full benefits of these new capabilities, especially given the enormous investment society is making in network technologies? Perhaps the problem is not embedded in the technology itself, but rather in the people who activate the networks. In Competing in a Flat World: Building Enterprises for a Borderless World (2008, p. 51), Prof Yoram (Jerry) Wind, Dr Victor K Fung, and Dr William K Fung argue for better orchestration of networks:
Smarter networks do not just happen. They require guidance, intelligence, a design, and an invisible or visible hand drawing together all the diverse contributions. In other words, they require orchestration. Otherwise, crowds can rapidly devolve into chaos and stupidity. Orchestration is what makes smart networks smart.
Now is the time for world-class direction in building networks that might create a better future for enterprise and society.

Reference: Fung, V K, Fung, W K, & Wind, Y (2008), Competing in a Flat World: Building Enterprises for a Borderless World, Upper Saddle River, NJ: Wharton School Publishing.

The Globalization Mindset

An emerging theme in enterprise management is that the recent setback of financial globalization portends the demise of globalization in general. I believe this assertion to be false, in part because an entire generation of executives formed during the globalization movement retains staunchly pro-globalization perspectives, and this cohort of leaders is unlikely to accept a return to the defunct business models of the past, especially vertical integration. Assuming a pro-globalization mindset has indeed taken root within this generational cohort, its growing influence is likely to drive a broad resurgence of globalization from within the organization sooner rather than later. According to Prof Harold James (2009, p. 10):
Globalization is not only a process that occurs somewhere out there - in an objective and measurable world of trade and money. It also happens in our minds, and that part of globalization is often more difficult to manage...
The extent to which the proponents of globalization have found their way into enterprise may decide the degree to which globalization's critics succeed or fail. The globalization mindset is alive and well within today's firms, and a comprehensive resurgence of globalization is likely only a matter of time.

Reference: James, H (2009), The Creation and Destruction of Value: The Globalization Cycle, Cambridge, MA: Harvard University Press.

The Perfection of Process

The emergence of process-centered organizational models during the late twentieth century accelerated globalization while warning of the obsolescence of vertical integration in corporations. As declared by Dr Michael Hammer back in 1996 (p. 13):
The time for process has come. No longer can processes be the orphans of business, toiling away without recognition, attention, and respect. They now must occupy center stage in our organizations. Processes must be at the heart, rather than the periphery of companies' organization and management. They must influence structure and systems. They must shape how people think and the attitudes they have.
The triumph of process during the twentieth century trumpets its perfection for the new millennium.

Reference: Hammer, M (1996), Beyond Reengineering, New York: HarperBusiness.

Monday, January 11, 2010

On Cosmopolitanism

The essence of cosmopolitanism holds promises for the future of humankind. According to Prof Thomas Pogge (1992, pp. 48-49):
Three elements are shared by all cosmopolitan positions. First, individualism: the ultimate units of concern are human beings, or persons – rather than, say, family lines, tribes, ethnic, cultural, or religious communities, nations, or states. The latter may be units of concern only indirectly, in virtue of their individual members or citizens. Second, universality: the status of ultimate unit of concern attaches to every living human being equally – not merely to some subset, such as men, aristocrats, Aryans, whites, or Muslims. Third, generality: this special status has global force. Persons are ultimate units of concern for everyone – not only for their compatriots, fellow religionists, or suchlike.
Cosmopolitanism resonates around the individual in a way that provincialism and nationalism do not.

Reference: Pogge, T (1992), "Cosmopolitanism and Sovereignty," Ethics, 103(1), 48-75.

The Scary Truth About Unemployment in the US

The latest employment data for December 2009 is scary in the extreme. I found this chart in CalculatedRisk comparing the percent job losses for every recession since World War II. According to CalculatedRisk (10 Jan 2010):
This graph shows the job losses from the start of the employment recession, in percentage terms (as opposed to the number of jobs lost). The current employment recession is the worst recession since WWII in percentage terms, and 2nd worst in terms of the unemployment rate (only early '80s recession with a peak of 10.8 percent was worse).
Hopefully, the economic implications thereof will not fall on deaf ears in America's halls of power.

Sunday, January 10, 2010

Navigating Trends in Globalization and Technology

I recently came across a downloadable conceptual "map" for navigating future trends in technology and globalization by blogger Richard Watson of Fast Company (7 Jan 2010). According to Watson:
If you want to explore something that's unfamiliar, it's often useful to have a map. Here's my (not totally serious) map looking at trends and technology from 2010-2050. The key mega-trends are: globalisation, urbanisation, digitalisation, personalisation, individualism, environmental change, sustainability, debt, volatility, ageing, anxiety, power shift eastwards and localism. As you'll see the map contains way too much information, so it's a bit like real life then? BTW, don't even attempt to look at this on a small screen or print out at less than A3 size. Safe travels.

Download