Prof. Jayanth R. Varma's Financial Markets Blog

A Blog on Financial Markets and Their Regulation

© Prof. Jayanth R. Varma
jrvarma@iima.ac.in



Thu, 19 Jul 2012

Statistics for finance in a post crisis world

I made a presentation on “Statistics for finance in a post crisis world” at the Sixth Statistics Day Conference organized by the Reserve Bank of India on July 17, 2012. Bullet points from my slides are given below.

- Big Data
  - Example: US Flash Crash
  - Big Data in Finance
- Hidden Data
  - Shadow Banking and Hidden Credit
  - Hidden credit: the data challenge
  - Hidden Debt
  - Hidden Risks
  - Hidden Foreign Exchange Risks
  - Hidden Interest Rate Risk: Household Sector
- Broken Data
  - Only Traded Prices are Real
  - Enronic Accounting
  - Forensic statistics
- Broken Models
  - The Gaussian Distribution
  - Non Gaussian Copulas and Marginals
  - Gaussian Copulas and CDOs
- The Way Forward
  - Simpler finance, maybe. Complex statistics, surely.

Posted at 21:53 on Thu, 19 Jul 2012


Mon, 09 Jul 2012

What is a price?

As I keep thinking about Libor fixing (see my post last week on this), I have realized that the word “price” is used in many ways to mean many things, not all of which deserve to be called a price:

An actual traded price
This is the simplest and perhaps most unambiguous definition of a price. The only problem with this notion is illiquidity: if the asset is highly illiquid, there may be no recent traded price. More commonly and more importantly, the traded price is subject to the bid-ask bounce – a seller-initiated trade executes at the bid price, while a buyer-initiated trade executes at the ask price. If the asset is traded frequently enough and the bid-ask spread is small in relation to the desired level of accuracy, the traded price is a clean and transparent definition of price.
The mid price
Even if the asset is modestly illiquid, there is often an ask price and a bid price in the order book, and the average of these is a reasonable approximation to the true price. It is probably better, however, to use the entire bid-ask interval instead of just the mid price to communicate the range of uncertainty about the true price. Moreover, the bid and ask are valid only for small transaction sizes; it may be better to use the full information in the order book to do an impact-cost calculation and present the bid and ask for a more reasonable order size.
An average of traded prices
Quite often, closing prices on an exchange are determined as averages of prices during the last few minutes of trading – though in some cases “few” gets stretched to quite a long period. This averages out the bid-ask bounce and is a tolerable approximation if the volatility of the “true” price during the averaging period is small in relation to the impact cost of a reasonable trade. Sometimes the averaging is designed to deal with attempts to manipulate the closing price; in that case it may be reasonable, in the above comparison, to use the impact cost of the expected trade size of a potential market manipulator, which may be significantly larger than typical trade sizes. An alternative to averaging is to use a call auction to determine the closing price.
Polled or indicative prices
Libor and the well-known US Constant Maturity Treasury (CMT) rate fall in this category. The attempt here is to average market participants’ quotes about what they believe is the true price. The difference between polled prices and traded prices is like the difference between an opinion poll and an actual election. I think it is a mistake to base large derivative markets on “opinion polls”.
Model prices
In the absence of traded prices, it is common to use a pricing model to estimate prices. Of course, there are several shades of grey here: accountants talk about Level One, Level Two and Level Three assets to capture some of the greyness. Outside of finance, hedonic estimates of the prices of real goods are also model prices. For an even more extreme case, one could consider a surveyor’s real estate valuation opinion as a model price where the model is less precisely articulated. At the opposite end in terms of formalization of models, the equilibrium prices derived from general equilibrium models are also model prices, with the added twist that in many of these models the no-trade theorem is actually in force, so the model price is an estimate of the price at which nobody wishes to trade. My own view is that model prices are valuation opinions and not prices.
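To make the mid-price and impact-cost ideas above concrete, here is a minimal sketch assuming a simple order book of (price, quantity) levels; the numbers are invented for illustration:

```python
# Minimal sketch: mid price and impact-cost-adjusted prices on a limit
# order book. The levels below are invented for illustration only.

bids = [(103.10, 500), (103.00, 2000), (102.80, 5000)]  # (price, qty), best first
asks = [(103.40, 400), (103.50, 1500), (103.70, 6000)]

def mid_price(bids, asks):
    """Average of best bid and best ask -- valid only for small sizes."""
    return (bids[0][0] + asks[0][0]) / 2

def average_fill_price(levels, size):
    """Walk the book to find the volume-weighted price of executing `size`."""
    filled, cost = 0, 0.0
    for price, qty in levels:
        take = min(qty, size - filled)
        cost += take * price
        filled += take
        if filled == size:
            return cost / size
    raise ValueError("order book too thin for this size")

mid = mid_price(bids, asks)
buy_3000 = average_fill_price(asks, 3000)   # effective ask for a 3000 buy
sell_3000 = average_fill_price(bids, 3000)  # effective bid for a 3000 sell
print(f"mid = {mid:.2f}, ask(3000) = {buy_3000:.4f}, bid(3000) = {sell_3000:.4f}")
```

Walking the book shows why top-of-book quotes understate the cost of trading in size: the effective ask for 3,000 units lies above the best ask, and the effective bid below the best bid, so the interval widens as the assumed order size grows.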

Where does that leave us? I think that for liquid assets, actual traded prices (perhaps determined by a call auction) are the best way to define the price. For illiquid assets, it is best to recognize that there is no unique price and to use a price interval as the best way to communicate the range of uncertainty involved. I do not understand why physicists are quite happy to say that the gravitational constant in appropriate SI units is 6.67384 ± 0.00080, but in finance and economics we are unwilling to say that the price of an asset is 103.23 ± 0.65.
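A call auction, mentioned above as a cleaner way to arrive at a single price, can be sketched as follows: collect limit orders and pick the price that maximizes matched volume. This is a simplified illustration with invented orders; real exchanges add tie-breaking rules (such as minimizing the residual order imbalance):

```python
# Minimal sketch of a call-auction clearing price: among the limit prices
# submitted, choose the one that maximizes executable (matched) volume.
# Orders are invented for illustration; tie-breaking rules are omitted.

buy_orders = [(103.5, 1000), (103.2, 800), (103.0, 1200)]   # (limit, qty)
sell_orders = [(102.8, 900), (103.1, 700), (103.4, 1500)]

def clearing_price(buys, sells):
    candidates = sorted({p for p, _ in buys} | {p for p, _ in sells})
    best_price, best_volume = None, -1
    for p in candidates:
        demand = sum(q for price, q in buys if price >= p)   # buyers willing at p
        supply = sum(q for price, q in sells if price <= p)  # sellers willing at p
        volume = min(demand, supply)
        if volume > best_volume:
            best_price, best_volume = p, volume
    return best_price, best_volume

price, volume = clearing_price(buy_orders, sell_orders)
print(f"clearing price = {price}, matched volume = {volume}")  # 103.1 and 1600
```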

Posted at 08:57 on Mon, 09 Jul 2012


Thu, 05 Jul 2012

Libor, the Gaussian Copula and the Sociology of Finance

I have blogged about the sociology of finance several times (for example in 2010, and in 2011). Two pieces that I read (or in one case re-read) recently have reinforced my view that this literature is important for understanding modern finance.

When the penalties imposed on Barclays by the UK FSA and the US CFTC brought Libor back into the limelight, I found myself re-reading MacKenzie’s fascinating description of the Libor fixing (Donald MacKenzie, “What’s in a Number?”, London Review of Books, 30(18), 25 September 2008, pages 11-12) based on his ethnographic study carried out prior to the financial crisis.

None of the finance textbooks describes the actual mechanics of the Libor fixing as well as this piece does. Every source on Libor recites the standard definition that Libor is “The rate at which an individual contributor panel bank could borrow funds, were it to do so by asking for and then accepting interbank offers in reasonable market size, just prior to 11.00am London time.” But one has to read MacKenzie to understand how this hypothetical condition (“were it to do so”) is actually operationalized. Similarly, MacKenzie tells us very casually that a mere $50 million or so may fall short of reasonable market size, which for the major currencies would be of the order of several hundred million.

The second paper that I have been reading, also co-authored by MacKenzie, is weightier and more recent (Donald MacKenzie and Taylor Spears, “‘The Formula That Killed Wall Street’? The Gaussian Copula and the Material Cultures of Modelling”, June 2012). It discusses the well-known (and by now notorious) Gaussian copula model for pricing CDOs.

The central claim of the paper is that Gaussian copula models were and are crucial to intra- and inter-organizational co-ordination, while simultaneously being ‘othered’ by the modellers themselves. ‘Other’ might be a simple word, but it carries a complex meaning here: what is being argued is that modellers steeped in the culture of no-arbitrage modelling never ‘naturalized’ the Gaussian copula and did not even regard it as a proper model. The dissonance between actuarial models and no-arbitrage models is also brought out very well. I found myself thinking that the battle between CreditMetrics and CreditRisk+ more than a decade ago was also one between actuarial models and no-arbitrage models.

As an aside, the authors bring up the issue of counterperformativity (models being invalidated by their widespread adoption): “models used for governance are undermined by being gamed; models used to hedge derivatives are undermined by the effects of that hedging on the market for the underlying asset.” They also speculate on the possibility of ‘deliberate counterperformativity’: “the employment of a model that one knows overestimates the probability of ‘bad’ events, with a view to reducing the likelihood of those events.”

Posted at 12:32 on Thu, 05 Jul 2012