Prof. Jayanth R. Varma's Financial Markets Blog

A Blog on Financial Markets and Their Regulation

© Prof. Jayanth R. Varma
jrvarma@iima.ac.in



Fri, 25 May 2012

Does finance need 128 bit integers?

A blog post yesterday at Marginal Revolution raised the interesting question of whether stock prices should be quoted in increments of one-hundredth of a cent instead of one cent. That is a very complex question in market microstructure that I do not wish to get into now. But the comment thread on that post got into a much more interesting question: how to represent money in a computer.

It is well known that using a floating-point number to represent money is a very bad idea – every computer scientist knows that we must use integers for this purpose even if fractions are involved. If we do not have to deal with fractions of a cent, then internally everything should be stored in cents, so that a million dollars is represented by 100 million, which is well within the range of a 32-bit integer (for example, long int in ISO C++). If fractions of a cent (say milli-cents) are possible, then everything should be stored in milli-cents, and a million dollars is represented by 100 billion, which is beyond the range of a 32-bit integer but well within the range of the more modern 64-bit integer (for example, the long type in Java).
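To make this concrete, here is a minimal sketch in Java of money held as a 64-bit integer count of milli-cents. The Money class and its API are my illustrative assumptions, not any standard library:

    // Minimal sketch: money as a long count of milli-cents.
    // The Money class and its API are illustrative, not a standard library.
    public final class Money {
        // 1 dollar = 100 cents = 100,000 milli-cents
        private static final long MILLICENTS_PER_DOLLAR = 100_000L;

        private final long millicents;

        private Money(long millicents) { this.millicents = millicents; }

        public static Money ofDollars(long dollars) {
            // multiplyExact throws ArithmeticException on overflow
            // instead of silently wrapping around
            return new Money(Math.multiplyExact(dollars, MILLICENTS_PER_DOLLAR));
        }

        public Money plus(Money other) {
            return new Money(Math.addExact(millicents, other.millicents));
        }

        // Display for non-negative amounts
        @Override public String toString() {
            return String.format("$%d.%05d", millicents / MILLICENTS_PER_DOLLAR,
                    millicents % MILLICENTS_PER_DOLLAR);
        }

        public static void main(String[] args) {
            // $1 million is 10^11 milli-cents: far beyond 32 bits, trivial for 64
            System.out.println(Money.ofDollars(1_000_000));
        }
    }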

Dick King, commenting on the Marginal Revolution post, points out that if fractions of one-tenth of a micro-cent are permitted, then a 64-bit integer would allow us to represent only up to around a trillion dollars. As he says: “If AAPL ever reaches a market cap of $1 trillion and you decide you want to buy it all you will not be able to place an ordinary order on an ordinary exchange ... sorry about that.” If you really want to be precise about these things, the 64-bit integer takes us only up to a little over $922 billion if you want negative numbers as well; if only unsigned quantities are required, it could take us to $1.8 trillion. But as a rough order of magnitude, we can take one trillion dollars as the upper limit on monetary quantities that can be represented to an accuracy of one micro-cent using 64-bit integers.
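As a quick sanity check on these figures, a few lines of Java suffice; the smallest unit of 10^-7 dollars is my assumption, chosen to match the numbers above:

    // Back-of-the-envelope check of the 64-bit limits quoted above,
    // assuming a smallest unit of 10^-7 dollars
    public class RangeCheck {
        public static void main(String[] args) {
            final double unitInDollars = 1e-7;
            System.out.printf("signed   max: $%.3e%n", Long.MAX_VALUE * unitInDollars);   // ~ $9.22e11
            System.out.printf("unsigned max: $%.3e%n", Math.pow(2, 64) * unitInDollars);  // ~ $1.84e12
        }
    }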

I would think that there might be situations where much more than a trillion – perhaps even a quadrillion – might be needed. In the days before the euro, we used to joke that the word quadrillion was invented to count the Italian public debt (in lire, of course). But more seriously, the total open interest in all the global derivative markets is not far short of a quadrillion dollars. Equally, there may be situations where micro-cents or nano-cents make sense for micro-payments in internet transactions. For example, the digital currency Bitcoin (which is valued at approximately one US dollar) allows subdivision up to 8 decimal places, or one hundredth of one millionth of a Bitcoin. If we need a single representation for all monetary quantities from 10^15 (one quadrillion) down to 10^-8 (one hundredth of one millionth), the 64-bit integer is simply insufficient. Perhaps finance will at some point need a 128-bit (16-byte) long integer.

Of course, some people do argue that data types like Java’s BigInteger, which allow integers of arbitrary size, should be used. But this flexibility comes at a very heavy price: it appears that a Java BigInteger takes about 80 bytes, five times more than a 128-bit (16-byte) integer, and the performance penalties would also be substantial. While a 128-bit integer would not be sufficient to count the number of protons in the universe, it should be adequate for the full range of monetary quantities that we are likely to encounter for a long time – it will take us from 10^21 down to 10^-15 with a couple of digits to spare.
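For comparison, a minimal sketch of the BigInteger route, with amounts held in units of 10^-8 dollars (the Bitcoin-style subdivision mentioned above); it shows that a quadrillion dollars at that precision needs only 77 bits, so 128 bits leave ample headroom:

    import java.math.BigInteger;

    // Minimal sketch: arbitrary-precision money in units of 10^-8 dollars
    // (the Bitcoin-style subdivision mentioned above)
    public class BigMoney {
        static final BigInteger UNITS_PER_DOLLAR = BigInteger.TEN.pow(8);

        public static void main(String[] args) {
            // One quadrillion dollars, exactly, in 10^-8 dollar units
            BigInteger quadrillion = BigInteger.TEN.pow(15).multiply(UNITS_PER_DOLLAR);
            System.out.println(quadrillion);             // 10^23: far beyond a 64-bit long
            System.out.println(quadrillion.bitLength()); // 77 bits: fits easily in 128
        }
    }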

Posted at 14:04 on Fri, 25 May 2012


Sun, 20 May 2012

Hedging at negative cost?

I have been reading the transcript of the conference call in which JPMorgan Chase reported a $2 billion loss on a position that was intended to hedge tail risk (h/t Deus Ex Macchiato for the transcript). Much has been written about the hedge, which JPMorgan Chairman Jamie Dimon himself described as “a bad strategy ... badly executed ... poorly monitored.” I want to focus instead on another interesting statement that he made about the hedge:

It was there to deliver a positive result in a quite stressed environment and we feel we can do that and make some net income

Note the tense of that verb “feel”: he does not say “felt”, he says “feel” – after that $2 billion loss, he still thinks that you can set up a hedge which makes money! The Chairman of one of the largest banks in the world – a bank which is still well respected for highly sophisticated risk management – thinks that a tail risk hedge need not cost money, but can actually make money. In other words, there are negative cost hedges out there that can protect you against tail risk.

If you believe in the Efficient Markets Hypothesis (EMH), you know that this is not possible – there is no free lunch. Sure, you can hedge against tail risk, but that will cost you money, and in turbulent markets, it will cost you a good deal of money. The global financial crisis was in a sense the revenge of the Efficient Markets Hypothesis. Those who ignored the “no free lunch” principle and chased illusory excess returns were ruined (or would have been ruined but for their successfully persuading the state to bail them out). The biggest moral hazard of the egregious bail outs of 2008 is that the financial sector has still not internalized the “no free lunch” principle of the Efficient Markets Hypothesis. That is a tragedy for which surely the taxpayer will one day have to pay once again.

In fact, the term hedge seems to have a very different meaning in the financial sector than in the corporate sector (or perhaps I should say the old fashioned, non-financialized part of the corporate sector). If you are an airline that hedges oil price risk, chances are that you are more prudent (more risk averse) than the airline that does not hedge. This is because all airlines face somewhat similar oil price risks, and the one that hedges is probably less risky. At least, that would be the case if the airline does not use oil price hedging to justify an excessively high level of debt in its capital structure (which is why I began by confining my remarks to the old fashioned, non-financialized corporate sector).

In the financial sector (and in highly financialized industrial companies as well), things are very different. The bank that puts on a hedge does not necessarily keep its portfolio unchanged. On the contrary, it uses the hedge to take on more risks on the underlying portfolio. The total hedged portfolio is not necessarily less risky than the original unhedged portfolio. Chances are that the hedged portfolio is riskier – much riskier.

At a theoretical level, this was established more than three decades ago in a very interesting and highly readable paper by Hayne E. Leland (“Who Should Buy Portfolio Insurance?”, The Journal of Finance, 1980, 35(2), pp. 581-594). Leland started with a very simple observation: since derivatives are zero sum games, for every buyer of portfolio insurance, there must be a seller. He then asked the obvious question: which investors would buy insurance, and which would sell it?

If one were naive, one might be tempted to answer that the buyers of insurance must be either bearish on stocks or highly risk averse, while the sellers must be bullish on stocks or highly risk tolerant. Leland’s answer was totally different. He showed that the bears should be selling insurance and the bulls should be buying it. The reason is that the bulls load themselves up so heavily on stocks (possibly borrowing to buy them) that they need downside protection to maintain the position at all. On the other hand, if you are so bearish on stocks that you have put all your money in bonds, clearly you are not going to be buying portfolio insurance!

The situation regarding risk aversion is more complex. Everything depends on how risk tolerance increases with wealth and it will take too long to describe that argument here. The interested reader should read Leland’s original paper.

Anyway, the key point is that the hedge permits the underlying portfolio to become riskier and more toxic. It is like the old adage that the brakes make the car go faster. So when the banks argue that they need complex derivative products to hedge their risks, what they really mean is that they need these derivatives to create very risky asset portfolios while managing the downside risk up to the point where it can be palmed off to the taxpayer.

To quote another adage (this time from the world of financial trading itself), hedging in the financial world is nothing but speculation on the “basis”; it has little to do with risk reduction.

Posted at 16:51 on Sun, 20 May 2012


Sun, 13 May 2012

Automating financial advice

Two months back, Abnormal Returns wrote a post entitled “You are not all that unique an investor”, which linked to a short survey of the online money management space at World Beta. There are a number of websites in the US that provide personalized financial advice based on software. When one probes further, however, it is clear that this field is still evolving and has a long way to go. One website does not provide advice on asset allocation; it only compares funds that you are already holding with other funds in the same category and recommends cheaper or better-performing funds from that category. Another site emphasises its ability to give personalized advice, but it is only in the legal fine print that I could find a disclosure that the advice is based on software tools.

But today I was reading an NBER working paper by Mullainathan, Noeth and Schoar entitled “The market for financial advice: an audit study” and I realized that software does not have to be particularly good to be competitive with traditional advisors. The bar for that is so low that existing software is probably good enough and of course the software will get better. On the other hand, five years after the financial crisis, there is no evidence whatsoever that traditional financial advisors are becoming any less conflicted.

Mullainathan, Noeth and Schoar used an audit methodology: they hired trained auditors to visit financial advisers, presenting different types of portfolios, and to submit a detailed report of each interaction with the adviser (for a total of 284 client visits). They find that “advisers not only fail to de-bias their clients but they often reinforce biases that are in the interests of the advisors. Advisers encourage returns-chasing behavior and push for actively managed funds that have higher fees, even if the client starts with a well-diversified, low-fee portfolio.”

It does not even appear that the traditional adviser personalizes the advice adequately: “advisers are less likely to ask younger or female auditors some basic question about their financial situation, and it also leads to worse advice since the adviser does not have full information.”

I think that financial advice is an industry ripe for disruptive transformation through the internet and software.

Posted at 15:31 on Sun, 13 May 2012


Mon, 07 May 2012

Government cash management and liquidity squeezes

India witnesses predictable periodic liquidity squeezes caused by large outflows of money from the banking system around the dates on which advance tax instalments are due to the government. The central bank does take some offsetting action to pump liquidity into the banking system, but these actions are often not quite adequate. Sometimes, the liquidity situation is fully restored only as the government starts spending out of the tax receipts. In India, we have gotten used to this as if it were the natural and unavoidable state of affairs.

It was therefore interesting to read a nice paper from the New York Federal Reserve describing how the US has solved this problem completely. The paper, by Paul J. Santoro, is about the evolution of Treasury cash management during the financial crisis, but it is its description of the pre-crisis system that is of interest for the advance tax problem. The US Treasury’s cash balance is also “highly volatile: between January 1, 2006, and December 31, 2010, it varied from as little as $3.1 billion to as much as $188.6 billion”. But this volatility does not create any problem either for the banking system or the central bank.

The Treasury divides its cash balance between two types of accounts: a Treasury General Account (TGA) at the Federal Reserve and Treasury Tax and Loan Note accounts (TT&L accounts) at private depository institutions.

If, in the pre-crisis regime, the Treasury had deposited all of its receipts in the TGA as soon as they came in, and if it had held the funds in the TGA until they were disbursed, the supply of reserves available to the banking system – and hence the overnight federal funds rate – would have exhibited undesirable volatility. To dampen the volatility, the Fed would have had to conduct frequent and large-scale open market operations, draining reserves when TGA balances were declining and adding reserves when TGA balances were rising. A more efficient strategy, and the one used by the Treasury in its Tax and Loan program, was to seek to maintain a stable TGA balance.

Each morning Treasury cash managers and analysts at the Federal Reserve Bank of New York estimated the current day’s receipts and disbursements. During a telephone conference call at 9 a.m., they combined the estimates with the previous day’s closing TGA balance, scheduled payments of principal and interest, scheduled proceeds from sales of new securities, and other similar items to produce an estimate of the current day’s closing balance. If the estimated closing balance exceeded the target, the Treasury would invest the excess at investor institutions that had sufficient free collateral and room under their balance limits to accept additional funds. If the estimated balance was below target, the Treasury would call for funds from retainer and investor institutions to make up the shortfall.
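The decision rule in that passage is simple enough to sketch in code. A minimal, illustrative version in Java – the fixed target and the institution interface are my assumptions for exposition, not details from the Santoro paper:

    import java.util.List;

    // Illustrative sketch of the pre-crisis TGA targeting rule described above
    public class TgaTargeting {
        interface TtlInstitution {
            long accept(long amount); // take up to 'amount', bounded by limits and collateral
            long call(long amount);   // return up to 'amount' to the TGA
        }

        // Invest any excess over the target; call funds to cover any shortfall
        static void rebalance(long estimatedClosingBalance, long target,
                              List<TtlInstitution> institutions) {
            long excess = estimatedClosingBalance - target;
            if (excess > 0) {
                // Above target: place the excess with investor institutions
                // that have free collateral and room under their balance limits
                for (TtlInstitution inst : institutions) {
                    if (excess == 0) break;
                    excess -= inst.accept(excess);
                }
            } else if (excess < 0) {
                // Below target: call funds from retainer and investor institutions
                long shortfall = -excess;
                for (TtlInstitution inst : institutions) {
                    if (shortfall == 0) break;
                    shortfall -= inst.call(shortfall);
                }
            }
        }
    }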

The key role in this system is played by retainer and investor institutions with whom the Treasury maintains its TT&L balances. The Santoro paper describes their role as follows:

A retainer institution also accepted tax payments but, subject to a limit specified by the institution and pledge of sufficient collateral, retained the payments in an interest-bearing “Main Account” until called for by the Treasury. If a Main Account balance exceeded the institution’s limit, or if it exceeded the collateral value of the assets pledged by the institution, the excess was transferred promptly to the TGA.

An investor institution did everything a retainer institution did and, as described below, also accepted direct investments from the Treasury. The investments were credited to the institution’s Main Account and had to be collateralized.

During the crisis, as the Fed expanded its balance sheet and banks ended up holding vast excess reserves, the pre-crisis policy of stabilizing the TGA balance ceased to be relevant. Moreover, with the Fed paying interest on excess reserves, depositing money in TT&L accounts would have been an additional subsidy to the banks. Therefore, the Treasury moved to a policy of keeping almost all its cash in the TGA (allowing it to become volatile). As and when monetary policy normalizes, the pre-crisis system will probably come back:

Nevertheless, a significant decline in excess reserves resulting from a shift in monetary policy may once again make it necessary to target a more stable TGA, so that TGA volatility does not cause undesirable federal funds rate volatility and interfere with the implementation of monetary policy.

In short, the advance-tax-related liquidity squeezes in India are simply the outcome of faulty government cash management practices. Other countries solved this problem long ago (the late 1970s in the case of the US) and the solution is simple and effective. All that is lacking in India is the willingness to do the sensible thing.

Posted at 12:15 on Mon, 07 May 2012


Wed, 02 May 2012

Disclosure of risk factors

I have long felt that the risk factors disclosed in most offer documents are next to useless in assessing the risk of a security. In utter frustration, I have often wondered whether it would be better to replace all that legalese with a simple empirical fact embellished with a nice skull and crossbones symbol:

☠   Numerous studies covering many different countries have shown that over the long term, initial public offerings tend to underperform the rest of the stock market. Subscribing to these offerings can therefore be injurious to your wealth.

Of course, the same studies also document a large positive initial return to investors who sell immediately after listing, but that is not a risk factor!

Tom C. W. Lin has a different idea in his paper, “A Behavioral Framework for Securities Risk” (34 Seattle University Law Review 325 (2011)).

In order to better capture the advantages of disclosure-based risk regulations given the behavioral tendencies of investors, this Article proposes a behavioral framework for Risk Factors built on (1) the relative likelihood of the risks and (2) the relative impact of dynamic risks. This framework makes risk disclosures more accessible and meaningful to investors and would serve as the new default for public firms. An important feature of the new default is that firms will be able to opt out of the new framework if they believe that the existing Risk Factors requirements are more appropriate. But these firms would need to explain to investors why they opted out. This new default framework would be spatially, optically, and substantively superior to the current framework for investors.

Tom Lin phrases the entire proposal in terms of behavioural finance, but nothing in the proposal depends on behavioural finance. Classifying risks on the basis of likelihood (or frequency) and impact is perfectly rational, and is in fact standard practice in risk management. Thanks to the Basel regulations for operational risks, at least the financial sector has plenty of experience doing this. So it cannot be claimed that it is not feasible.
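To see how little machinery such a classification needs, here is a minimal sketch of a likelihood-and-impact scheme in Java; the categories and disclosure rules are my illustrative assumptions, not Lin’s:

    // Illustrative sketch of a likelihood x impact risk classification
    public class RiskMatrix {
        enum Level { LOW, MEDIUM, HIGH }

        record RiskFactor(String description, Level likelihood, Level impact) {}

        static String classify(RiskFactor r) {
            if (r.likelihood() == Level.HIGH && r.impact() == Level.HIGH)
                return "Disclose first and most prominently";
            if (r.likelihood() == Level.LOW && r.impact() == Level.LOW)
                return "Boilerplate: de-emphasize";
            return "Disclose with an explanation of likelihood and impact";
        }

        public static void main(String[] args) {
            RiskFactor r = new RiskFactor("Dependence on a single large customer",
                                          Level.HIGH, Level.MEDIUM);
            System.out.println(classify(r)); // Disclose with an explanation ...
        }
    }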

I think this is definitely worth trying out, and if it works, we may not need the skull and crossbones after all.

Posted at 13:42 on Wed, 02 May 2012