Sponsored statement: Trilemma
The relative nature of probability
Industrial revolutions are usually the precursors of metaphysical revolutions. Einstein’s abolition of absolute time and space and his enunciation of the relativity principle wouldn’t have occurred had the telegraph and railway networks, then becoming a global industry, not presented the problem of the unification of time and of the determination of city longitudes around the world. A whole technology developed, and the corresponding patents were filed to mechanise the co-ordination of clocks. Simultaneity became a mechanical procedure and was no longer a metaphysical concept.
Based in Switzerland, the home of the clock industry, and working as an application examiner in the Swiss patent office, Einstein was the privileged witness of this revolution. Owing to his status as a total outsider vis-à-vis the scientific establishment (unlike Henri Poincaré), he was bold enough to redefine time in terms of simultaneity, and simultaneity in terms of the procedural transmission of signals between clocks. Relativity theory was conceived as a machine, not as a metaphysical speculation or an amendment of previous theory. In fact, Einstein wrote his revolutionary 1905 special relativity article in the brisk style of a patent claim, in which reference to previous work or to similar inventions was precisely out of the question.
There is another major industrial revolution happening today, and its metaphysical consequences haven't yet been drawn – it is the market of contingent claims. In many respects, it is similar to the one that prompted Einstein's theory of relativity, with the market acting as the new globe and the synchrony of prices as a global clock.
Where patents to synchronise distant clocks were filed in Einstein's day, today the industry is filing maps of synchronous derivatives prices. When traded vanilla option prices are too scarce, data vendors extend the market and produce complete, smooth implied volatility surfaces. When the underlying price moves or time passes, the surface is recalibrated from a new intake of traded derivatives prices instead of being recomputed from an underlying probabilistic hypothesis. And, once it is observed that implied volatility is traded and stochastic, the market is solicited again for the prices of options on volatility indexes (for example, VIX options) and the latter are repackaged and redistributed in turn in a refinement of the market synchrony, instead of upgrading the theoretical probability distribution underlying the Black-Scholes-Merton model (BSM) to stochastic volatility.
We, at Trilemma, claim that an outdated, yet very entrenched, metaphysical category has to give way once the market is conceived as a machine or a technology and no longer as a theory. The medium we need to abolish, when thinking of the material relationship between a contingent payoff and its present market price, is probability. Just as there is no absolute time rigidly attaching to the ether, but only time defined relatively to the material procedure of the synchronisation of clocks, there is no absolute probability with which to distribute the underlying and value contingent claims accordingly. Rather, probability is defined relatively to the frame of reference, whereas the real, intrinsic relation is the one that prevails between the contingent claim and its market price.
The rule is to infer the probability distribution of the underlying from the market prices of contingent claims. For instance, volatility is implied from the option price in BSM. When the implied volatility differs among strikes and maturities, we change the probability assumption and we now calibrate a stochastic volatility model (or a jump-diffusion, or a mixture of the two) to the full vanilla surface. The next day we recalibrate the model to the new options prices, thus changing the distribution again (horizontal recalibration). And, when we realise that the market prices of higher-order exotic options are not explainable within the model, we upgrade it to the next level. We thus recalibrate a model of stochastic volatility of volatility, or stochastic jump sizes and intensities, etc. (vertical recalibration).
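To fix ideas, here is a minimal sketch in Python (numpy and scipy), using entirely hypothetical quotes and a deliberately crude two-lognormal mixture standing in for a genuine stochastic volatility or jump-diffusion model: the BSM volatility is implied strike by strike, the flat-volatility assumption is seen to fail, and the mixture's parameters are then calibrated to the same vanilla prices – the step that is rerun the next day against new quotes (horizontal recalibration) or replaced by a richer model when exotics disagree (vertical recalibration).

```python
import numpy as np
from scipy.optimize import brentq, least_squares
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes-Merton price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r):
    """Invert BSM against a quoted price (Brent root-finding)."""
    return brentq(lambda s: bs_call(S, K, T, r, s) - price, 1e-4, 5.0)

# Hypothetical vanilla quotes: one maturity, several strikes.
S, r, T = 100.0, 0.02, 0.5
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
quotes = np.array([21.20, 12.45, 5.85, 2.15, 0.75])

smile = [implied_vol(p, S, K, T, r) for p, K in zip(quotes, strikes)]
# The implied volatilities differ across strikes: the flat-volatility
# assumption of BSM fails, so the probability assumption is changed.

# Toy replacement distribution: a two-lognormal mixture, i.e. the
# terminal volatility is sigma1 with weight w and sigma2 otherwise.
def mixture_call(params, K):
    w, s1, s2 = params
    return w * bs_call(S, K, T, r, s1) + (1.0 - w) * bs_call(S, K, T, r, s2)

def residuals(params):
    return [mixture_call(params, K) - p for K, p in zip(strikes, quotes)]

fit = least_squares(residuals, x0=[0.5, 0.15, 0.35],
                    bounds=([0.0, 0.01, 0.01], [1.0, 2.0, 2.0]))

print("implied smile:", np.round(smile, 4))
print("calibrated (w, sigma1, sigma2):", np.round(fit.x, 4))
# Tomorrow, new quotes come in and the same fit is simply rerun
# (horizontal recalibration); when exotics price at variance with the
# calibrated model, a richer one is calibrated instead (vertical).
```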
Market prices as an invariant theory
We call ‘intrinsic non-arbitrage relations’ the ones that help us value derivatives independently of any model of the underlying process. These relations are deduced purely from the statics of the respective payoffs, or static replication – that is, from the clauses that are written when possibilities are over and the underlying dynamics are terminated (typically at the maturity of the instruments or at their knock-out barriers). Put-call parity is the classic example: it ties together the prices of a European call, the corresponding put, the underlying and the riskless bond without any assumption about the underlying process. Now, our observation is that market prices are also model-independent and, by definition, arbitrage-free. Why don't we consider their relations intrinsic too?
Might not the metaphysical revolution lie in considering that the market prices are written too and are devoid of probability dynamics, that is, written somehow ‘after’ the end of possibilities, outside chronological time? Could the market price be essentially occurring in the middle of the event – right in the heart of the terminal-contingent payoff and with no need to predict it – yet accidentally taking place ‘before’ the event, in what may look like a chronological antecedence but is in fact a taking over of the event, literally taking the place in which the event takes place? And, if probability and temporal process do not intrinsically occur between the present price and the future payoff, what does? Relativity theory is in reality a theory of invariants, so, we ask: what is our invariant?
Only because a non-deterministic phenomenon repeats itself with a few variations are we able to assemble the variations and retrospectively call them ‘possibilities’ that are open to the event. The event is the result of abstracting the differences in the same class and of subsuming the facts under the same phenomenon, which we then suppose will admit different outcomes. The ex-ante outlook therefore has no physical existence; it is a logical abstraction. Possibilities are defined after the population, not before.
When the population is blessed with statistical regularity, we call it a statistical distribution and the ex-ante stance finds further support in the belief that the next individual event will now be generated by a probability distribution with the same moments as the statistical one. Probability is also defined in retrospect. Indeed, the whole idea of a timing of the event is illusory.
When there is no such empirical population or reference class, of which the event is recognisably a member, metaphysicians can still imagine a set of possible worlds in which to measure its frequency. A less exorbitant alternative is to drop objective probability altogether and believe only in subjective probability. However, does any of this make sense once probability, as a concept, is recognised to belong to the past, not the future, and only to be misplaced when projected into the future? Think, for a moment, of what the probability, or even the possibility, of an absolute event could mean – an event so severe that it is not even identified beforehand and can only be interpreted and explained after the fact (known as a Black Swan). Is time itself not void as the medium of such an event?
Recalibration process versus stochastic process
Derivatives pricing almost kicked off as a branch of actuarial science. The event of the underlying price settling above or below a certain strike at a certain maturity was analysed as the concatenation of the very small price increments that occurred in abundance over the interval. Under the assumption that the instant probability distribution could be inferred from the statistical series, the temptation was great to compute the fair value of the derivative as an actuarial value – one at which you would break even on average. However, the non-arbitrage constraint binding the derivative, the underlying and the riskless bond quickly dispelled this temptation in favour of risk-neutral pricing, if only because of the risk premium attaching to the underlying and of the investor's expectation precisely not to break even on average. Finally, the dynamic replication argument of BSM – itself compatible with non-arbitrage and risk-neutral pricing – gave derivatives pricing a more operational turn. It turned the abstract equivalence between the real probability measure and the risk-neutral measure into a pressing and very local accounting equation.
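A minimal numerical sketch of that first temptation, with purely hypothetical parameters and the closed-form lognormal expectation: discounting the expected payoff under the real-world drift gives the actuarial value, while substituting the riskless rate for that drift gives the non-arbitrage, risk-neutral price; the gap between the two is the risk premium that dispels the temptation.

```python
import numpy as np
from scipy.stats import norm

# Purely illustrative parameters: mu is the real-world drift of the
# underlying, r the riskless rate; mu - r is the risk premium.
S, K, T, r, mu, sigma = 100.0, 100.0, 1.0, 0.02, 0.08, 0.20

def discounted_expected_call(drift):
    """Discounted E[(S_T - K)^+] when S_T is lognormal with the given drift."""
    d1 = (np.log(S / K) + (drift + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return np.exp(-r * T) * (S * np.exp(drift * T) * norm.cdf(d1) - K * norm.cdf(d2))

actuarial = discounted_expected_call(mu)    # break even on average under the real measure
risk_neutral = discounted_expected_call(r)  # the non-arbitrage (BSM) price
print(round(actuarial, 4), round(risk_neutral, 4))
```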
In reality, BSM had just consummated the thought that the market was a material procedure and not an application of probability. Nobody cared any longer whether the derivative price was sensitive to the distribution of profit and loss in the long run or to the instant random generator that caused the systematic slippage in hedge rebalancing. We all woke up in a market where derivatives and underlying were trading alongside each other and moving together. Nobody uses the BSM formula to explain the option price; everybody inverts it against the option price to compute the dynamic hedge.
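As a self-contained sketch of that inversion (hypothetical quote, with the standard call formula repeated for completeness), the BSM formula is solved backwards for the volatility implied by the market price, and the delta read off that volatility is the dynamic hedge ratio.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes-Merton price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))

# Hypothetical market quote: the formula is not used to explain this
# price; it is inverted against it to extract the hedge ratio.
S, K, T, r, quote = 100.0, 105.0, 0.25, 0.02, 2.30
iv = brentq(lambda s: bs_call(S, K, T, r, s) - quote, 1e-4, 5.0)
d1 = (np.log(S / K) + (r + 0.5 * iv ** 2) * T) / (iv * np.sqrt(T))
delta = norm.cdf(d1)  # number of underlying shares held against the option
print(f"implied volatility {iv:.4f}, dynamic hedge ratio {delta:.4f}")
```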
Implying the BSM volatility from the vanilla option price opens an endless chain: every subsequent model (stochastic volatility, jump-diffusion, etc.) is calibrated to the options market in turn and becomes virtually stochastic by recalibration. Its meta-model will be governed by the prices of higher-level exotics and, where these do not actually trade, it is only virtually that we should conceive of recalibrating against them. The market is this infinite chain of prices of contingent claims.
If the chain is virtually infinite, then the relation between any contingent payoff and its price becomes intrinsic. Any probabilistic model is an arbitrary section of this infinity and will always be relative. Incidentally, every exotic structure in the ascending ladder will trade at variance with the replication plan corresponding to that section. This means the underlying stochastic process is prevented from running its course at any level. The virtual infinity of prices, or the market, replaces the whole probabilistic hypothesis and the exchange’s ‘proper time’ – or rather, its proper place – replaces the improper and misleading time of probability.
The pricing of contingent claims is not a probability theory; it is a recalibration machine. If a patent must be claimed for the new clock industry and the new financial geodesy, it should recognise this technology.