Rong Ge, Princeton CS

FAQ for Computational Complexity and Information Asymmetry in Financial Products
Why do you use a simple model? In real life, people use more complex models when thinking about CDO pricing.
This paper concerns a hardness result, not a pricing algorithm. As noted in the paper, our simple model can be embedded in industry-standard models such as the Gaussian copula, so the hardness result extends to those more complex models. In general, when proving a hardness result one should use as simple a model as possible; when exhibiting a pricing algorithm, on the other hand, one should use as complex a model as possible.
Does the paper rely upon the P vs NP conjecture?
The paper relies upon a stronger form of "P not equals NP", namely, that the planted dense subgraph problem has no efficient algorithm. (In fact, it is conjectured that there is no efficient algorithm even for computing approximate solutions to this problem. It is also believed that there is no short certificate that a dense subgraph of the specified size does not exist in the graph.) For more details on computational complexity see the first few chapters of this book.
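
To make the object concrete, here is a minimal Python sketch of how a planted dense subgraph instance is generated. It is illustrative only: the parameters n, k, p, q are hypothetical choices rather than the paper's (the paper works with a bipartite graph between asset classes and CDOs). The conjecture is, roughly, that no efficient algorithm can reliably distinguish such a graph from a purely random one, or recover the planted set.

    # Illustrative sketch, not from the paper; parameters n, k, p, q are
    # hypothetical. A "planted" instance is a random G(n, p) graph in
    # which a denser G(k, q) subgraph is hidden on k random vertices.
    import random

    def planted_dense_subgraph(n=1000, k=50, p=0.05, q=0.5, seed=0):
        """Return (edges, planted): a G(n, p) graph whose k planted
        vertices are connected with the higher probability q > p."""
        rng = random.Random(seed)
        planted = set(rng.sample(range(n), k))
        edges = set()
        for u in range(n):
            for v in range(u + 1, n):
                inside = u in planted and v in planted
                if rng.random() < (q if inside else p):
                    edges.add((u, v))
        return edges, planted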
The hardness result applies in an asymptotic sense. How should one interpret this?
The planted dense subgraph problem (with parameters as discussed in the paper) appears to have no efficient algorithm even for moderate graph sizes, so the hardness result should apply even with reasonable parameter choices. For example, say the number of asset classes is 1000 and each contains 20 mortgages. The seller groups these into 200 CDOs, each with 100 mortgages, and the threshold for the senior tranche is set to 35%. Then in a binary CDO the buyer's loss (= lemon cost) is as high as 12.5%, and in a tranched CDO the buyer's loss is as high as 1.875%.

Strictly speaking, we only need to assume that the specific pricing algorithm used by the buyers and rating agencies does not detect dense subgraphs. In real life (at least before the current crisis hit) many buyers simply trusted the rating agencies, which usually use Monte Carlo simulations. Those would certainly not detect dense subgraphs for the above parameter choices; a toy simulation along these lines is sketched below.
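
Here is a hedged Monte Carlo sketch in Python. It is a toy model, not the paper's calculation: the default probabilities P_GOOD and P_LEMON and the lemon count N_LEMONS are invented assumptions, while the pool sizes match the example above. It compares an honest random pooling against a seller who packs all lemon mortgages into a few CDOs, and estimates how often a CDO's realized defaults breach the 35% senior threshold. A buyer or rating agency simulating from the marginal default distribution would price both pools identically; the gap between the two estimates is the kind of loss the lemon cost measures.

    # Toy Monte Carlo illustration; not the paper's exact calculation.
    import random

    N_CLASSES, PER_CLASS = 1000, 20   # 1000 asset classes, 20 mortgages each
    N_CDOS, CDO_SIZE = 200, 100       # pooled into 200 CDOs of 100 mortgages
    THRESHOLD = 0.35                  # senior tranche fails above 35% defaults
    P_GOOD, P_LEMON = 0.15, 0.90      # assumed default probabilities (invented)
    N_LEMONS = 100                    # assumed number of lemon classes (invented)

    def pool(cheat, rng):
        """Build the CDOs. An honest seller shuffles all mortgages together;
        a cheating seller keeps the lemons up front so they fill the first
        few CDOs (the analogue of planting a dense subgraph)."""
        mortgages = ([P_LEMON] * (N_LEMONS * PER_CLASS) +
                     [P_GOOD] * ((N_CLASSES - N_LEMONS) * PER_CLASS))
        if not cheat:
            rng.shuffle(mortgages)
        return [mortgages[i:i + CDO_SIZE]
                for i in range(0, len(mortgages), CDO_SIZE)]

    def senior_failure_rate(cdos, rng, trials=100):
        """Monte Carlo estimate of the fraction of CDOs whose realized
        defaults breach the senior-tranche threshold."""
        fails = 0
        for _ in range(trials):
            for cdo in cdos:
                defaults = sum(rng.random() < p for p in cdo)
                fails += defaults > THRESHOLD * len(cdo)
        return fails / (trials * len(cdos))

    rng = random.Random(0)
    print("honest pooling :", senior_failure_rate(pool(False, rng), rng))
    print("cheating seller:", senior_failure_rate(pool(True, rng), rng))

Under these invented numbers the concentrated pooling pushes roughly the targeted tenth of the CDOs over the threshold almost surely, while the honest pooling breaches it only rarely; the precise lemon-cost figures quoted above come from the paper's own analysis, not from this toy.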
Are you suggesting that the phenomenon described in this paper was a major cause of the current global crisis?
The current crisis had many causes, including poor modeling, regulatory failures, etc. This paper focuses on one aspect (the difficulty of pricing, and how it worsens in the presence of asymmetric information) that will remain relevant even after the above causes are fixed.

In general, financial experts suspect that the sheer volume of derivative transactions probably allows many kinds of manipulation to go undetected. We are highlighting one particular kind of manipulation whose undetectability, as well as its financial cost, can be made precise.
Does your paper imply that CDOs and related products should be banned?
For now, that would be a hasty conclusion. One can imagine some free-market solutions to the issues raised in the paper, such as (a) turning over the design of CDOs to a neutral third party to avoid cherry-picking (but keep in mind the experience with rating agencies in recent years), or (b) using CDOs with parameters for which the planted dense subgraph problem is easy (but keep in mind that the market currently lacks full transparency, so it is unclear how one would even obtain the graph on which to solve densest subgraph!).

Of course, one must then investigate whether these fixes allow other kinds of manipulation. Our paper is only a first cut at establishing a role for computational complexity in thinking about derivatives; hitherto this aspect has been ignored.


