Friday, April 02, 2010

Stochastics Come of Age

I just read a fascinating paper by Prof David Mumford (1999) of Brown University, and I wanted to share his predictions about the future of stochastics, written at the end of the second millennium:
Stochastic methods will transform pure and applied mathematics in the beginning of the third millennium. Probability and statistics will come to be viewed as the natural tools to use in mathematical as well as scientific modeling. The intellectual world as a whole will come to view logic as a beautiful elegant idealization but to view statistics as the standard way in which we reason and think.
In fact, stochastics have come of age as we proceed into the second decade of the 21st century. You can read his entire paper linked below.

Source: Mumford, D (1999, November 18), The Dawning of the Age of Stochasticity, unpublished paper based on a lecture delivered by the author at a conference entitled "Mathematics towards the Third Millennium," held at the Accademia Nazionale dei Lincei on May 27-29, 1999.

Download

Stochastics and Optimization Are Integral to Financial Management

Just in case there is any doubt, stochastics and optimization are integral to modern finance. The chart below illustrates the high-level processes required for a complete financial management system (click image to expand).

Thursday, April 01, 2010

The Topological Landscape between Automation and Expertise

Over the ages, the topological space occupied by automation and technology has continued to expand its domain such that humanity has had no recourse but to seek refuge in discovery and innovation. Of course, the innovations of today become the technology of tomorrow, and so humans are continuously driven deeper into the unknown in search of new knowledge and ideas that might somehow ensure the survival of the human race in what is apparently a hostile universe. The chart below illustrates how automation encroaches upon expertise. Note that value and complexity are the drivers of the isoquantal relationship between automated and expert decisions.

I have previously written that the domain of expert decisions is also where polyvalence reigns supreme. The question before us is not how to thwart the growth of automation, but rather how to expand the realm of polyvalence, which has always been the refuge and safe haven for human existence.

Related Posts:

Polyvalence Defies Commoditization

Financial Management Requires Polyvalence

Wednesday, March 31, 2010

Polyvalence Defies Commoditization

One of the tacit objectives of commoditization is to disembed knowledge from humans in order to reembody that knowledge into technology (Giddens, 1990). The impact of commoditization on knowledge workers is a well-understood facet of post-modernization and the new economy.

The success of the commoditization movement has been particularly discouraging for professional services, as the public sector is increasingly unwilling to pay premium fees for services, and the private sector is too distrustful of professionals to engage them even when doing so is arguably prudent. Sentiments like those expressed by Joan Capelin (2005) are commonplace:
It is cold comfort to realize that all the professions are experiencing this race to the bottom. We tend not to see the effects of commoditization — nor even feel sympathetic — when lawyers, accountants, and management consultants find they are being squeezed for lower fees by their marketplace. They are far more rewarded for their time, to begin with. But I can tell you that there is much handwringing going on across all the business-based professions.
Nevertheless, as the forces of commoditization continue their forays into the realm of human knowledge in search of latent technologies, critical issues are emerging about how, and whether, certain human-specific functions can feasibly be transferred to machines. In particular, the expert handling of polyvalent information and data seems to defy commoditization.

Polyvalence (a synonym for multivalence) denotes something that has multiple values, meanings, or appeals. Polyvalent symbols and metaphors carry multiple (often esoteric) meanings and can fulfill expressive, transactional, and interactional functions concurrently.

Prof Daniel Cohen (2003) first used the term polyvalent to describe human activities that are difficult or impossible to delegate, let alone commoditize. Examples of polyvalent human functions include surgery, piloting an aircraft, coaching a professional sports team, and representing a client in court. Thus, the professions are often associated with polyvalence.

If technology can displace a particular human activity, then the displaced activity is no longer truly polyvalent.

More to follow about polyvalence in future posts.

Sources:

Capelin, J (2005, September 8), Confronting Commoditization, DesignIntelligence.

Cohen, D (2003), Our Modern Times: The New Nature of Companies in the Information Age (S Clay and D Cohen, Trans), Cambridge, MA: MIT Press.

Giddens, A (1990), The Consequences of Modernity, Stanford, CA: Stanford University Press.

Related Posts:

Financial Management Requires Polyvalence

The Topological Landscape between Automation and Expertise

Thursday, March 25, 2010

Financial Management Requires Polyvalence

A frequent problem in advanced financial management is the task of interpreting (and understanding) probabilistic forecasts. The following exercise illustrates the essence of the problem.

Suppose that the following (very simplistic) financial projection for the “next” period arrives in your hands:

The financial analyst who handed you the report then comments “…the projection is based on historical data…” You focus your attention on the positive net profit number and conclude that the projection looks reasonable based on your “gut.” Having reached your conclusion, you head home to enjoy your life.

Unfortunately, a forecasted numerical value is rarely accurate, because the projected “number” is inevitably “off” by some (often significant) amount.

Contrast the above with this second story. In this instance, the analyst hands you a financial projection for the same next period; the report begins again with the worksheet displayed above. However, the report continues with additional information as shown in the tables and graphs that follow:

After looking over the report, you listen as the analyst interjects that a lognormal distribution was used to model gross revenues based on a fitting of available historical data, and the analyst confides that your feedback regarding that decision would be helpful. The analyst also expresses concern about net profitability for the coming period by predicting "…there appears to be a significant chance that we won't be profitable…" (see the last histogram projecting a 30% chance of losses). With this information, you call your spouse with a message that you will be working late in the office to "deal with some problems…"
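For readers who want to reproduce this kind of analysis, here is a minimal Monte Carlo sketch in Python. The lognormal parameters, cost structure, and trial count are all hypothetical assumptions chosen only to mirror the story (a positive expected profit alongside a roughly 30% chance of loss):

```python
import random
import statistics

random.seed(42)

# Hypothetical lognormal parameters "fitted" to historical gross revenues
MU, SIGMA = 14.13, 0.35
FIXED_COSTS = 800_000      # assumed fixed operating costs per period
VARIABLE_RATE = 0.30       # assumed variable costs as a share of revenue
N_TRIALS = 100_000

profits = []
losses = 0
for _ in range(N_TRIALS):
    revenue = random.lognormvariate(MU, SIGMA)
    net_profit = revenue * (1 - VARIABLE_RATE) - FIXED_COSTS
    profits.append(net_profit)
    if net_profit < 0:
        losses += 1

print(f"Expected net profit: {statistics.mean(profits):>12,.0f}")
print(f"Chance of a loss:    {losses / N_TRIALS:>12.1%}")
```

The point of the histogram in the analyst's report is exactly this distributional view: the same simulation that yields a positive expected profit also reveals how much probability mass sits below zero.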


The moral of the story is that a “number” in itself is rarely sufficient information and justification for financial decision-making. In many ways, financial managers must be able to “hear” and “feel” the meaning of probabilistic reporting much like the conductor of a symphony orchestra must hear and feel the meaning of a musical score. Effective financial managers maintain a polyvalent sensitivity toward a complexity of factors and dependencies in order to transcend numerics as guidance for decisions. In this manner, the art of financial economics embraces its science.

Financial managers might take comfort in these words by Nobel Laureate Prof Robert J Aumann (2000, p. 141):
The best art is something that strikes a chord with the viewer or listener. It expresses something that the viewer or listener has experienced himself and it expresses it in a way that enables him to focus his feelings or ideas about it. You read a novel and it expresses some kind of idea with which you can empathize, or perhaps something that you yourself have thought about or experienced. Take a sculpture or a cubist painting. It expresses some reality, some insight, in an ideal way. That is what the best mathematical economics does. It is a way of expressing ideas, perhaps in an ideal way.
By the way, probabilistic (stochastic) reasoning is now integral to finance, so if any of the reporting above is a challenge (for you or your staff), please consider my training and software offerings here.

Source: Aumann, R J (2000). Economic Theory and Mathematical Method: An Interview. In Collected Papers (Vol I, pp. 135-144). Cambridge, MA: MIT Press.

Related Posts:

Polyvalence Defies Commoditization

The Topological Landscape between Automation and Expertise

Wednesday, March 24, 2010

Democracy Reborn

Given that healthcare reform in America is now the "law of the land," I continue to ponder the underlying philosophical themes that might explain the political changes now underway. My previous writings have argued that America is witnessing the final collapse of elitism, accompanied by the "beginning of the end" for populism, as pluralism unfolds to become the political "motif" of our time. Whether we like it or not, the old "white male" regimes of the past (both elitist and populist) are withering as a pluralist electorate and agenda take deep root across our nation's landscape. The fledgling "tea party" movement pales in comparison to the pluralist pageantry now celebrating the future. By all accounts, the pluralist political machine is grinding down the elitist and populist political structures in detail. From where I sit, American democracy is being reformed as the pundits of old retire generationally. Indeed, we are witnessing a rebirth of democracy in our time.

Republican Senators Jon Kyl, Judd Gregg, and Mitch McConnell

Related Posts

Corporate Anarchy?

From Populism to Pluralism

Saturday, March 20, 2010

A Regulatory Architecture for Cross-Border Banking Groups

by Stefano Micossi © VoxEU.org

Policymakers and commentators have suggested that large banks should be broken up. This column argues that such an idea risks the very existence of a global financial system. It outlines an alternative framework in which deposit insurance should be covered by banks not taxpayers, banks should not be guaranteed a bailout, and regulators should be mandated to step in when the warning signs begin.

Following the demise of Lehman Brothers, the debate on regulatory reform has led to the conclusion that large banking institutions must be broken up and their risk-taking activities limited by law along the lines of the ‘Volcker rule’ (Gros 2010). Not only are such actions unnecessary, they may be hard to implement and could reduce the availability of credit to the economy (if for example they reduce the ability of banks to hedge their credit positions). The main consequence of such a plan would be the disintegration of global financial markets as they break down into segregated national markets.

In recent research, my colleagues and I set out an alternative solution that can achieve a more stable and resilient financial system without renouncing the benefits of global and multi-purpose financial institutions and innovative finance (Carmassi et al 2010). These proposals are predicated on effectively curtailing moral hazard and strengthening market discipline on banks’ shareholders and managers by raising the cost of the banking charter to fully reflect its benefits for the banks, and on restoring the possibility that all – or at least most – large banks can fail without unmanageable systemic repercussions.

Back to basics

The crisis was generated by cross-border banks levering their deposit base and acting, through the wholesale interbank market, as the residual suppliers of liquidity for all the other players in financial markets. This multiplied funds for speculation and helped to sustain a gigantic inverted pyramid of securities made up of other securities and yet again other securities. Without the money-multiplying capacity of the banks, the asset price bubble and the explosion of financial intermediation and aggregate leverage would not have been possible. Extreme investment strategies by bankers obviously reflect moral hazard created by the expectation that governments would step in to bail them out in case of large losses.

The first thing that needs to be done in order to restore corrective incentives for bank managers and shareholders is to eliminate the most obvious pitfall in banking regulation, that is, reliance on capital requirements based on risk-weighted assets. This approach is flawed since asset risk cannot be assessed and measured independently of market conditions and market sentiment. As a result, the need for capital will always be underestimated under favourable market conditions, leading to balance-sheet fragility and precipitous asset sales when market sentiment turns sour. Banks need a capital buffer to overcome the massive asymmetries of information between bank managers on one side, and investors and regulators on the other. This asymmetry currently makes it easy for bankers to accumulate excessive risks in the quest for higher returns, before markets become aware. The way to do this is to set capital requirements in straight proportion to total assets or liabilities of banking groups.
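The contrast between risk-weighted and straight asset-proportional capital requirements can be illustrated with a toy balance sheet. All figures, risk weights, and ratios below are hypothetical, chosen only to show how risk weighting can shrink the measured capital need:

```python
# Sketch: risk-weighted vs. straight leverage-based capital requirements.
# All figures are hypothetical and for illustration only (in billions).

bank_assets = {
    "sovereign_bonds": 40.0,
    "residential_mortgages": 35.0,
    "corporate_loans": 25.0,
}
risk_weights = {               # Basel-style weights, illustrative
    "sovereign_bonds": 0.00,
    "residential_mortgages": 0.35,
    "corporate_loans": 1.00,
}

CAPITAL_RATIO_RWA = 0.08       # 8% of risk-weighted assets
LEVERAGE_RATIO = 0.05          # 5% of total assets, as the column proposes

total_assets = sum(bank_assets.values())
rwa = sum(v * risk_weights[k] for k, v in bank_assets.items())

required_rwa_based = CAPITAL_RATIO_RWA * rwa
required_leverage_based = LEVERAGE_RATIO * total_assets

print(f"Total assets:             {total_assets:.1f}bn")
print(f"Risk-weighted assets:     {rwa:.2f}bn")
print(f"Capital, RWA-based (8%):  {required_rwa_based:.2f}bn")
print(f"Capital, leverage-based:  {required_leverage_based:.2f}bn")
```

In this toy example the leverage-based requirement binds regardless of how the risk weights are assigned, which is precisely why it is harder to game when market sentiment misprices asset risk.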

Fixing flaws in prudential capital rules does not remove moral hazard from the banking system, whose specific sources must be tackled separately. These are:
  • the deposit-institution franchise,
  • the implicit or explicit promise of bailout in case of threatened failure, and
  • regulatory forbearance.
The problem associated with the deposit franchise is well known. If depositors have doubts on the bank’s solvency, they will run for the exit, forcing rapid liquidation of banks’ assets, possibly with large losses and contagion spreading instability to other banking institutions.

First pillar for tackling moral hazard

Deposit insurance can reassure depositors, and thus is the first pillar for tackling moral hazard, but it also mutes their incentive to monitor the management of their bank, since they no longer risk losing their money.

More importantly, deposit insurance has evolved in most countries into a system effectively protecting the bank, or the entire banking group, rather than depositors. When a bank risks becoming insolvent, supervisors step in to cover its losses and replenish its capital so as to avoid any adverse repercussions on market confidence. Moreover, most deposit insurance systems are inadequately funded by insured institutions, entailing an implicit promise that taxpayers’ money will make up the difference in case of failure of large banks.

Thus, in order to re-establish a proper price for the banking charter, banks should carry, ex ante, the full cost of deposit protection, making sure that in most circumstances the guarantee fund would be adequate to reimburse depositors when individual banks fail. Of course, no fund could ever be sufficient to meet a general banking crisis, but a fund of an appropriate size would offer adequate protection in normal circumstances, with only a predictable share of banks going bankrupt.

Deposit insurance fees are the right instrument to make banks pay for risk generated by banks. They should be determined on the basis of a careful probabilistic assessment of the likelihood of failure within the overall pool of deposits and risks of the banking system (within appropriately defined market jurisdictions). This is where the risk profile of banks’ asset and loan portfolios can be taken fully into consideration, together with, more broadly, the quality of bank management and risk control, thus creating effective penalties for riskier behaviour. Size itself could be appropriately penalised by higher fees that would incorporate a probabilistic price for the potential threat for systemic stability.
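The fee-setting logic described above can be sketched as a simple actuarial calculation. The function and every parameter below (failure probabilities, loss rates, the size surcharge) are hypothetical illustrations, not the column's calibration:

```python
def deposit_insurance_fee(insured_deposits, prob_failure, loss_given_failure,
                          size_surcharge=0.0):
    """Annual fee = expected loss to the guarantee fund, plus a surcharge
    that rises with systemic size (all inputs are illustrative)."""
    expected_loss = insured_deposits * prob_failure * loss_given_failure
    return expected_loss * (1.0 + size_surcharge)

# Two hypothetical banks with the same risk profile but different size
# (insured deposits in billions; PD and LGD are assumed, not estimated)
small_fee = deposit_insurance_fee(2.0, prob_failure=0.01,
                                  loss_given_failure=0.25)
large_fee = deposit_insurance_fee(200.0, prob_failure=0.01,
                                  loss_given_failure=0.25,
                                  size_surcharge=0.50)  # systemic-size penalty

print(f"Small bank fee: {small_fee * 1000:,.1f}m per year")
print(f"Large bank fee: {large_fee * 1000:,.1f}m per year")
```

Even with identical failure probabilities and loss rates, the surcharge makes the large bank's fee more than proportionally higher, putting a probabilistic price on the systemic threat its size poses.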

Second pillar: Not “too big to fail”

The second pillar required in order to greatly limit moral hazard in the financial system is credibly removing the promise that large banks cannot fail. To this end, all main jurisdictions should establish special resolution procedures – as already exist in the US and the UK – managed by an administrative authority, with powers to require early recapitalisation, manage required reorganisations and, once all this has failed, liquidate the bank with only limited systemic repercussions. Crisis prevention, reorganisation and liquidation would all be part of a unified consolidated resolution procedure managed, for each bank, by one administrative authority.

In order to make resolution feasible, all banks and banking groups would be required to prepare and provide to their supervisors a document detailing the full consolidated structure of legal entities that depend on the parent company for their survival, the claims on the bank and their order of priority, and a clear description of operational – as distinct from legal – responsibilities and decision-making, notably regarding functions centralised with the parent company. This "living wills" document may also comprise "segregation" arrangements to preserve certain functions of systemic relevance even during resolution: for clearing and settlement of certain transactions, netting out of certain counterparties, suspension of covenants on certain operations.

Third pillar: No regulatory forbearance

Finally, the third pillar of an effectively reformed financial system is a set of procedural arrangements that will prevent supervisory forbearance. Supervisory discretion to postpone corrective action would be strictly constrained, so that bankers, stakeholders and the public would know that mistakes would always meet early retribution. To this end it is necessary to establish a system of early mandated action by bank supervisors ensuring that, as capital falls below certain thresholds, the bank or banking group will be promptly and adequately recapitalised. Should capital continue to fall, then supervisors should be required to step in and impose all necessary reorganisation, including disposing of assets, selling or closing lines of business, changing management, ceding the entire bank to a stronger entity.
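A mandated-action regime of the kind described might be sketched as a simple threshold ladder. The thresholds and action labels below are assumptions for illustration only; the essential property is that the escalation is rule-bound rather than discretionary:

```python
def mandated_action(capital_ratio):
    """Map a bank's capital ratio to a mandated supervisory response.

    The thresholds and actions are illustrative assumptions; the point
    is that escalation is automatic, leaving no room for forbearance.
    """
    if capital_ratio >= 0.08:
        return "no action"
    if capital_ratio >= 0.05:
        return "prompt recapitalisation required"
    if capital_ratio >= 0.02:
        return "supervisor-imposed reorganisation: asset sales, new management"
    return "resolution: sound activities to a bridge bank, residual liquidated"
```

Because each rung triggers well before capital reaches zero, losses that eventually fall on the insurance fund (or taxpayers) are likely to be far smaller than under discretionary forbearance.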

Should this not work, then liquidation would commence. A bridge bank would take over deposits and other “sound” banking activities, thus ensuring their continuity. All other assets and liabilities, together with the price received for the transfer of assets to the bridge bank, would remain in the “residual” bank, which would be stripped of its banking licence. An administrator for the liquidation of the residual bank would be appointed to determine its value and satisfy creditors according to the legal order of priorities (based on the law of the parent company and other jurisdictions involved).

The attractive feature of mandated corrective action is that asset disposals and change of management will normally take place well before capital falls to zero, so that losses for the insurance fund and ultimately taxpayers are more likely to be much smaller.

References

Carmassi, J, Luchetti, E, and Micossi, S (2010), "Overcoming Too Big to Fail: A Regulatory Framework to Limit Moral Hazard and Free Riding in the Financial Sector," CEPS-Assonime Report.

Gros, D (2010), "Too interconnected to fail = too big to fail: What is in a leverage ratio?", VoxEU.org, 26 January.

Republished with permission of VoxEU.org

Friday, March 19, 2010

Corporate Anarchy?

by Sunshine Mugrabi © SunshineMug.com

In the last few posts I've been developing an idea that has been rattling around my brain for some time now. It can be summed up in the following questions: Is the social media revolution we're experiencing right now a result of new technology, or is it the other way around? In other words, is the quiet revolution sweeping through our society a response to new technology -- or is this new way of communicating something that we have called into existence? Could it be that the rise of Twitter, Facebook, and Friendfeed is a reflection of people's desire to break down the old barriers and speak directly to one another? Are we seeing the end of the "expert" era, in which all knowledge and understanding is filtered to us through a select few?

It seems to me that this is a generational thing. My parents' generation, the children of the 1960s, started some kind of revolution. It was in many ways a flawed attempt. As the writer Ken Wilber has pointed out, for all its good intentions, the boomer generation was immensely narcissistic. It had (and still has) a tendency to blow its accomplishments up out of proportion. And it was very much still stuck in an "us/them" paradigm. In fact, the whole idea of a generation gap is based on that! However, there's no denying that our parents' generation -- with their anti-war protests, long hair, and rebellion -- shook up the old order for good and all.

Then came my generation - Generation X. We were a bit lost for a time. They called us slackers, because we tended to be introspective. We couldn't exactly rebel, because our parents had already done that, so we kind of came up with our own way. When I was in college, I took to calling myself an anarchist. This was partly to annoy my parents and professors. But it was also my way of showing my dissatisfaction with the dual options being served up as my only choices -- Democrat vs. Republican, Left vs. Right, Women vs. Men, etc.

Now, there's a new generation--I believe they're calling it Generation Y. They seem to have evolved a whole new stance. It's as if they have taken the best of the boomer generation and my generation, and melded them into something entirely new.

They're not rebelling. They're talking. And, lo and behold, no one is left out. It may have started on the campus of Harvard University, but now Facebook is open to all. Even Walmart has a page (though many of the wall comments aren't terribly kind).

The new generation seems to intuitively understand something that for all its free love, the hippies never completely got. That is, we are a human family -- all of us connected to one another. When we try to deny it, we experience the opposite. Alienation. Loneliness. Anger. All of the things that seem to ail our society today.

The great marketing guru Seth Godin (who I generally like and agree with) showed himself to be stuck in the old paradigm with a recent post, "You Matter." In it, he lists all of the situations that show that you matter. They were all very heartfelt. For example:

"When you love the work you do and the people you do it with, you matter ... When kids grow up wanting to be you, you matter."

The list goes on. However, the new paradigm is as follows: "Everyone matters. Period." You could be having the crappiest day ever, and feeling no love whatsoever for your fellow man. You could be a protestor on the streets of Tehran. You could be a tech startup struggling to get noticed. Or, you could be Apple.

No one is left out of this. Everyone matters.

Republished with permission of SunshineMug.com

Thursday, March 18, 2010

VaR Methodologies Compared II

In addition to the Value at Risk (VaR) comparison chart I posted at VaR Methodologies Compared, I found a second equally insightful comparative analysis in table form, this one by Prof Philippe Jorion (2007, p. 270):

As stated in my previous post, I believe that Monte Carlo (i.e., stochastic) simulation provides the most robust approach to estimating VaR, though this approach requires more computing (and brain) power.

Source: Jorion, P (2007), Value at Risk: The New Benchmark for Managing Financial Risk (3rd ed), New York: McGraw-Hill.

See also:

VaR Methodologies Compared

"Model-less" Modeling?

One of the themes that seems to be emerging in some circles of finance is the notion of "model-free" analysis and reasoning. Here is a comment that I recently posted in response to such thinking in a popular public forum:

To all, the case for modeling probabilities and inferential statistics is a bit harder to dismiss than some are presupposing in this forum. Now, I do concede that researchers have many challenges to deal with when validating data and models. Moreover, the theoretical relationships between dependent and independent variables (including proxy measures) are often misspecified and/or misinterpreted by researchers to the point where some research is outright misleading (especially in the social sciences). Nevertheless, inferential modeling of complex phenomena cannot be simply dismissed as incredible via declaratory argument and assertions. There is a tremendous burden of proof that is assumed when one takes the path of falsifying probability theory and inferential statistics as a discipline. Thus, I fear that the notion of "model-less" reasoning is problematic as an approach to understanding what is happening in this place called "reality" around us. For the record, I am not as pessimistic about the validity and reliability of such methods as some may be here and in society. My advice to all is "be prepared" to defend your evidence if your goal is to falsify probability theory as a tool for understanding and evaluating the risks (and dangers) that seem to prevail in this universe. Thanks for the opportunity to comment...

VaR Methodologies Compared

I recently came across a succinct comparison of the three accepted approaches to computing Value at Risk (VaR). According to Boris Agranovich of GlobalRiskConsult:
VAR or Value at Risk is a summary measure of downside risk expressed in the reference currency. A general definition is: VAR is the maximum expected loss over a given period at a given level of confidence. VaR does not inform on the size of loss that might occur beyond that confidence level.

The method used to calculate VaR may be historical simulation (either based on sensitivities or full revaluation), parametric, or Monte Carlo simulation. All methodologies share both a dependency on historic data, and a set of assumptions about the liquidity of the underlying positions and the continuous nature of underlying markets. In the wake of the current crisis the weaknesses of VAR methodology became apparent and they need to be addressed....

A VAR system alone will not be effective in protecting against market risk. It needs to be used only in combination with limits both on notional amounts and exposures and, in addition, should be reinforced by vigorous stress tests.
I personally maintain that the Monte Carlo (i.e., stochastic) approach is superior to the other methods because the results provide a more "polyvalent" explanation of the financial risk in the subject investment.
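For the curious, here is a minimal sketch of the three approaches using Python's standard library and a synthetic (Gaussian) return history. The position size, confidence level, and return parameters are all illustrative assumptions:

```python
import random
import statistics

random.seed(7)

# Synthetic one-day return history for a hypothetical $10m position
returns = [random.gauss(0.0005, 0.012) for _ in range(750)]
position = 10_000_000
confidence = 0.99

# 1. Historical simulation: empirical quantile of the actual return history.
sorted_returns = sorted(returns)
hist_var = -sorted_returns[int((1 - confidence) * len(returns))] * position

# 2. Parametric (variance-covariance): assumes normally distributed returns.
mu = statistics.mean(returns)
sigma = statistics.stdev(returns)
z = statistics.NormalDist().inv_cdf(confidence)
param_var = (z * sigma - mu) * position

# 3. Monte Carlo: resimulate many outcomes from a fitted distribution.
sims = sorted(random.gauss(mu, sigma) for _ in range(100_000))
mc_var = -sims[int((1 - confidence) * len(sims))] * position

print(f"Historical  VaR(99%): {hist_var:,.0f}")
print(f"Parametric  VaR(99%): {param_var:,.0f}")
print(f"Monte Carlo VaR(99%): {mc_var:,.0f}")
```

With a Gaussian toy history the three estimates agree closely; Monte Carlo earns its extra computing (and brain) power when the fitted distribution is fat-tailed or the position is nonlinear, which is where the simpler methods break down.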

Source: GlobalRiskConsult

See also:

VaR Methodologies Compared II

Wednesday, March 17, 2010

A Global Readership

As of today, almost a third of the "hits" on The Vantage Point have come from Asia and Europe -- around half came from North America -- I guess this means that The Vantage Point has gone global!

Source: SiteMeter.com

Tuesday, March 16, 2010

Risk versus Danger

The line between “risk” and “danger” is very thin, at least according to their definitions -- is either more or less real than the other...?
__________________

risk
n
1. the possibility of incurring misfortune or loss; hazard
2. (Business / Insurance) Insurance
a. chance of a loss or other event on which a claim may be filed
b. the type of such an event, such as fire or theft
c. the amount of the claim should such an event occur
d. a person or thing considered with respect to the characteristics that may cause an insured event to occur
at risk
a. vulnerable; likely to be lost or damaged
b. (Social Welfare) Social welfare vulnerable to personal damage, to the extent that a welfare agency might take protective responsibility
no risk Austral informal an expression of assent
take or run a risk to proceed in an action without regard to the possibility of danger involved in it
vb (tr)
1. to expose to danger or loss; hazard
2. to act in spite of the possibility of (injury or loss) to risk a fall in climbing
[from French risque, from Italian risco, from rischiare to be in peril, from Greek rhiza cliff (from the hazards of sailing along rocky coasts)]
__________________

danger
n
1. the state of being vulnerable to injury, loss, or evil; risk
2. a person or thing that may cause injury, pain, etc.
3. Obsolete power
in danger of liable to
(Medicine)
on the danger list critically ill in hospital
[daunger power, hence power to inflict injury, from Old French dongier (from Latin dominium ownership) blended with Old French dam injury, from Latin damnum]
__________________

Source: Free Dictionary

Monday, March 15, 2010

Advanced Analytics Not Information Technology

I recently fielded a forum question about the cost-creation versus value-adding capabilities of information technology (IT) and advanced (i.e., bespoke) analytics in enterprise. Here is how I responded:

Regarding the linkages between information technology (IT), advanced analytics, and value, I would gently suggest that IT is a cost center, and advanced analytics are the value-adding proposition. In other words, don't go to the IT department if you are seeking to activate value-adding analytics (though I will concede that IT does have an effective role in business intelligence [BI] production, which is very different from advanced analytics in my view).

Unfortunately, IT solution providers know full well that advanced analytics is what creates value, and so IT firms will typically "bundle" various analytic offerings with a proposed IT solution in an effort to bamboozle the client into believing that scarce IT dollars can buy both transaction management and advanced analytical services together in one "big" IT installation deal. Buyers of IT solutions should therefore beware.

What is needed today is for IT managers to yield the analytics space to subject matter experts whose analytical solutions stand separate from the data warehousing infrastructure, while seeking to reduce IT costs by exploiting the economies of scale that IT solutions typically provide.

Again, IT is a cost center, while advanced analytics (separate from BI) are the value-adding activity.