Prof. Jayanth R. Varma's Financial Markets Blog

A Blog on Financial Markets and Their Regulation

© Prof. Jayanth R. Varma


Sat, 29 Sep 2012

Accountable algorithms

Ed Felten argues that with modern cryptography it is possible to make randomized algorithms accountable (h/t Bruce Schneier). This means that the public can verify that the algorithm was executed correctly in a particular case even though the algorithm used random numbers to make it unpredictable.

Felten’s idea is to use one random number to achieve unpredictability and another to achieve randomness. The authority running the algorithm chooses the first random number (R) secretly and then commits to it (the cryptographic equivalent of putting it in a tamper-proof sealed envelope to be opened later). Then, it chooses the second random number (Q) publicly (for example, by rolling dice in public). The two random numbers are added, and the sum (R + Q) is the input to the algorithm. Note that the public cannot verify that R was chosen randomly, but this does not matter because even if R is non-random, R + Q is still random.
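The commit-then-reveal step can be sketched in a few lines. This is only an illustration of the idea, not Felten's actual protocol: a salted hash stands in for the "sealed envelope", and all the specific numbers and names are made up.

```python
import hashlib
import secrets

# Step 1: the authority secretly picks R and publishes only a commitment
# to it. A salted SHA-256 hash acts as the tamper-proof sealed envelope:
# it binds the authority to R without revealing it.
R = secrets.randbelow(2**128)
salt = secrets.token_bytes(16)
commitment = hashlib.sha256(salt + R.to_bytes(16, "big")).hexdigest()
print("published commitment:", commitment)

# Step 2: the public random number Q is generated in the open
# (dice rolls, say); here we merely simulate it.
Q = secrets.randbelow(2**128)

# Step 3: the algorithm is seeded with R + Q. Even if R was chosen with
# bias, the sum is uniformly random as long as Q is.
seed = (R + Q) % 2**128

# Step 4 (verification): the authority reveals R and the salt; anyone can
# recompute the commitment and the seed to check that nothing was swapped.
assert hashlib.sha256(salt + R.to_bytes(16, "big")).hexdigest() == commitment
print("seed checks out:", seed == (R + Q) % 2**128)
```

The commitment is what makes the scheme accountable: once it is published, the authority can no longer change R after seeing Q or the outcome.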

Felten’s examples are not from finance, but I find the finance applications quite fascinating. For example, the income tax department selects some individuals randomly for detailed scrutiny. Using Felten’s ideas, it is possible for an individual who is selected for scrutiny to verify that the scrutiny is the result of a genuine random selection and not of the assessing officer’s bias. It is possible to do this without making the selection predictable.

As a second example, suppose a stock exchange wants to look at prices at random times because if fixed times are chosen, there is greater risk of the prices being manipulated. The random time must be unpredictable to participants. But after the fact, we want to be able to verify that the time was chosen randomly and that some exchange official did not deliberately choose a specific time “after the fact” with knowledge of the actual prices. Felten’s ideas can be used to solve this problem as well.

In a comment on his second post, Felten introduces even more interesting ideas. For a financial example, consider an organization which requires certain employees to take prior approval before trading stocks on personal account. Suppose the compliance officer disallows a trade on the ground that the particular stock is on a negative list of stocks that cannot be traded. How does the employee verify that the compliance officer is not lying if the list itself is secret? Felten’s method can be used to deal with this problem. The compliance officer should publicly announce the root hash of a Merkle tree containing the restricted list of stocks. This root hash by itself reveals nothing. Now the compliance officer can reveal a single path in the Merkle tree which allows the employee to verify that the stock in question is on the list. But this would not reveal anything about what else is on the list.
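The Merkle-tree disclosure described above can be sketched as follows. This is a minimal toy implementation for illustration: the ticker names are invented, and a production scheme would need salted leaves (among other things) to prevent guessing the rest of the list by brute force.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash of a Merkle tree built over the hashed leaves."""
    level = [h(leaf.encode()) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])          # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes along the path from leaf `index` up to the root."""
    level = [h(leaf.encode()) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1                  # sibling differs only in the last bit
        proof.append((level[sibling], index % 2))
        index //= 2
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return proof

def verify(leaf, proof, root):
    node = h(leaf.encode())
    for sibling, node_is_right in proof:
        node = h(sibling + node) if node_is_right else h(node + sibling)
    return node == root

# Hypothetical restricted list; the compliance officer publishes only the root.
restricted = ["ACME", "GLOBEX", "INITECH", "HOOLI"]
root = merkle_root(restricted)

# Revealing one path convinces the employee that "INITECH" is on the list
# while disclosing nothing about the other leaves.
proof = merkle_proof(restricted, restricted.index("INITECH"))
print(verify("INITECH", proof, root))   # True
```

The proof contains one sibling hash per tree level (logarithmic in the list size), which is why it reveals nothing about the other entries.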

A lot of regulations are written assuming that the people implementing the regulation are honest. This assumption is clearly inappropriate. The Right to Information Act ensures only transparency; it does not guarantee accountability in the presence of randomization. We should require that all algorithms that are used during the implementation and enforcement of the regulations should be accountable in Felten’s sense.

Posted at 21:52 on Sat, 29 Sep 2012     View/Post Comments (1)     permanent link

Sun, 23 Sep 2012

More on Abolishing IPOs

I received many comments on my blog post regarding Pritchard’s proposal on abolishing IPOs. Several comments suggested that while this might reduce losses by retail investors on hot IPOs, it would not eliminate them. I agree completely. Another set of comments asked whether the proposal would deny investors the opportunity to earn high rates of return in IPOs. This is a more subtle issue because the average rate of return on IPOs is nothing great (for the buy-and-hold investor, it is in fact a below average rate of return). However, many IPOs are “lottery stocks” – though the expected return is low, there is a small probability of a very high return (similar to a lottery ticket).

If one assumes that retail investors are leverage constrained, these lottery stocks might be attractive to some categories of investors. I recall reading long ago that when Thai telecom tycoon Thaksin Shinawatra wanted to take his company public, he chose to launch the IPO just before the launch of the telecom satellite that was crucial for his business. By doing so, he offered investors an opportunity to gamble on the successful launch of the satellite. Both the IPO and the satellite launch were successful, and years later, he went on to become the Prime Minister of his country.

If one thinks about it carefully, Pritchard’s proposal would not rule out such IPOs, because the only requirement is a seasoning period of continuing disclosures. He does not propose that an IPO should happen only after the business model stabilizes.

Posted at 11:57 on Sun, 23 Sep 2012     View/Post Comments (2)     permanent link

Sat, 08 Sep 2012

Abolishing IPOs

Adam Pritchard has a provocative paper arguing that Initial Public Offerings (IPOs) must simply be abolished (“Revisiting ‘Truth in securities revisited’: Abolishing IPOs and harnessing markets in the public good”). He suggests that “companies bec[o]me public, with required periodic disclosures to a secondary market, before they [a]re allowed to make public offerings”.

Pritchard writes:

No one believes that IPOs reflect an efficient capital market. In fact the evidence is fairly strong that IPOs are inefficient. IPOs are bad deals.

IPOs are bad for companies, bad for insiders, and bad for investors. The only parties that clearly benefit from these deals are the individuals who service them: accountants, lawyers, and underwriters.

Despite the provocative language, what Pritchard is referring to is simply the robust empirical result of short-term underpricing (which makes IPOs bad for companies and insiders) and long-term underperformance (which makes IPOs bad for investors). Pritchard correctly attributes these problems to information asymmetry between issuers and investors.

His solution is to create separate primary and secondary markets for private and public companies, and make the transition between them depend on (a) minimum size requirements and (b) acceptance of enhanced disclosure obligations. The primary and secondary markets for private companies would exclude retail investors. Retail investors would be restricted to public companies; moreover, public companies would have been seasoned in the private market before becoming public. During the seasoning period, would-be public companies would file annual reports and quarterly reports on the same lines as public companies. Price discovery would happen in the private secondary market (markets like SecondMarket and SharesPost) on the basis of these public disclosures.

After the seasoning period is over, the company trades in public markets open to retail investors. Pritchard believes that the primary market for these companies should simply be the secondary market itself – so called “At the Market” offerings.

Overall, I like these ideas as they have the potential to make the equity markets more efficient. The only thing that I do not like is Pritchard’s idea that the private markets can be opened not only to Qualified Institutional Buyers (QIBs) but also to Accredited Investors. I have been reading Jennifer Johnson’s paper describing the accredited investor idea as a Ponzi scheme run by regulators (“Fleecing grandma: a regulatory Ponzi scheme”).

Posted at 07:22 on Sat, 08 Sep 2012     View/Post Comments (6)     permanent link

Tue, 04 Sep 2012

Resolving Central Counter Parties (CCPs) by selective tear-ups

In July 2012, the CPSS (Committee on Payment and Settlement Systems of the Bank for International Settlements) and IOSCO (International Organization of Securities Commissions) put out for consultation a report on the resolution of CCPs (Recovery and resolution of financial market infrastructures: Consultative report).

Buried deep inside the report is a proposal that would permit orderly failure of even systemically important CCPs. The idea is that the CCP could simply tear up some of its settlement guarantees and wash its hands of positions that it is unable to honour. The CPSS-IOSCO document says:

... contracts could be given a final value based on the price at which the most recent variation margin payment obligations from and to participants had been calculated. To the extent that defaulting participants with out-of-the-money positions had been unable to pay variation margin to the CCP, the CCP’s obligations and variation margin payments to all in-the-money participants could be haircut pro rata to the size of their variation margin claims. This would have the effect of allocating in full the losses that had been suffered, and limiting exposure to future losses by eliminating unmatched positions or the possibility of further obligations arising on these unmatched positions. All other contracts – probably the vast majority of the contracts cleared – could remain in force. (para 3.13)

The idea seems to be that if huge price swings and defaults in some particular segment of the CCP’s activities inflict life threatening losses on the CCP, then the resolution mechanism steps in, cuts this segment loose and allows this segment to die. The remaining segments of the CCP can continue to function unimpeded.
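The pro-rata haircut described in paragraph 3.13 is simple arithmetic, and a small sketch may make it concrete. All the numbers here are invented for illustration; they do not come from the report.

```python
# In-the-money participants are owed 150 of variation margin in total,
# but the defaulters paid in only 90. The 60 shortfall is allocated
# pro rata to the size of each winner's variation margin claim.
claims = {"A": 100.0, "B": 30.0, "C": 20.0}   # variation margin owed to winners
available = 90.0                               # what the defaulters actually paid

total = sum(claims.values())
haircut_ratio = max(0.0, (total - available) / total)   # 60 / 150 = 0.4

payouts = {name: claim * (1 - haircut_ratio) for name, claim in claims.items()}
print(payouts)   # each winner receives 60% of their claim
```

Note that the haircut falls only on the winners' claims in the affected segment; as the report says, all other contracts remain in force.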

Another way of looking at this is that all settlement guarantees provided by the CCP are loss-limited by deep out-of-the-money options that kick in when the CCP enters resolution. If I buy a future at 500, I would normally expect the CCP to honour this contract however much the asset price rises in value. Selective tear-up means that if the asset price shoots up to say 5,000 and so many sellers default that the default losses overwhelm the capital of the CCP, it (the CCP or the resolution authority) may simply haircut me and forcibly close out my position at 4,000. It is as if along with buying the future at 500, I also sold a call to the CCP at 4,000.
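The implicit short call can be verified with the numbers in the example above (the 4,000 tear-up level is, of course, hypothetical):

```python
# Long a future bought at 500, torn up at 4,000 when the spot is 5,000.
entry, tear_up_price, spot = 500.0, 4_000.0, 5_000.0

# Normal settlement: the CCP owes the full gain on the position.
full_gain = spot - entry                     # 4,500

# Selective tear-up: the position is cashed out at the tear-up price.
tear_up_gain = tear_up_price - entry         # 3,500

# The shortfall equals the payoff of a call struck at the tear-up price
# that the holder has implicitly written to the CCP.
implied_short_call = max(spot - tear_up_price, 0.0)   # 1,000
assert full_gain - implied_short_call == tear_up_gain
print(full_gain, tear_up_gain, implied_short_call)
```

Below the tear-up price the implied call expires worthless and the holder is paid in full, which is exactly the loss-limited guarantee described above.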

The big difference is that ex ante, I do not know the strike price of this call. If I had a choice of executing my buy trade at different exchanges (with different CCPs), I would clearly choose the CCP with the highest expected strike price for the call that it would wrest from me in resolution. That gives me an incentive to choose the CCP that risk manages this contract well – high margins, aggressive intra-day margin calls, and intense scrutiny of concentrated positions. Volumes in each asset class would drift to the exchange or CCP that imposes strict risk management in that asset class. Instead of a race to the bottom, there would be a race to the top. Exchanges and CCPs would try to compete on the basis of the most exacting margin requirements. Healthy competition among CCPs would be possible.

Absent any segregation of business segments, a large CCP which clears many different products has a huge incumbency advantage. It can enter a new product segment with low margins and grab market share. People would still trade there relying on the total resources of the CCP (across all segments) even if they know that on a standalone basis, this segment of the CCP is not a reliable guarantor of trades because of the inadequate margins. In effect, the established segments of the CCP would subsidize the new segment and allow it to drive new entrants out of business. The threat of selective tear-up by a resolution authority has the potential to limit such cross subsidies and make the market for CCP services more contestable and competitive.

Incidentally, the use of haircuts to provide partial insulation of different segments of a CCP from losses in other segments is nothing new. For example, LCH.Clearnet runs a SwapClear service for interest rate swaps which is structured in such a manner that other segments of LCH are partially insulated from losses in this segment. LCH.Clearnet default fund rules (especially SwapClear Default Fund Supplement rules S8-S11) provide for haircuts if the resources available in the SwapClear segment are inadequate to meet the obligations of the CCP. My memory is that when the SwapClear service was first started, the old members of LCH were worried about the potential large losses in this segment being allocated to them, and this separation of segments was worked out to allay their concerns.

The advantage of building selective tear-up into the resolution process is that this allows a carve-out of segments to happen ex post after life threatening losses have materialized. This makes a resolution (without bailout) of a CCP more credible, palatable and feasible. While the large global CCPs came out of the 2008 crisis unscathed, I fear that the next crisis will not be so kind to them. I consider it highly likely that within the next decade a prominent CCP in a G7 country would need to be resolved.

Posted at 10:08 on Tue, 04 Sep 2012     View/Post Comments (0)     permanent link