To give you a flavour of what is to come at Global Derivatives 2014 next week, we have compiled some ‘short, sharp’ interviews for your enjoyment. Hear from Fabio Mercurio, Bloomberg; Alexander Denev, CloudRisk; and Damiano Brigo, Imperial College London. Fabio and Damiano are both presenting during Stream B on 14th May – ‘Funding, Liquidity & Portfolio Optimisation’. Fabio will also be giving a workshop on 16th May, entitled ‘Multi-Curve Modelling & Discounting’. Alexander is giving a solo talk during the exciting ‘Electronic Trading & Quantitative Investment Strategies Summit’ on 12th May. You can view the conference agenda here.
1. What do you think is the defining quality of a quantitative analyst?
FM: It is the ability to learn from market data, modeling risk factors in a way that is consistent with typical market behavior. Too often quants are tempted to introduce model assumptions purely on the basis of technical considerations such as tractability, ease of implementation, etc.
AD: Eclecticism and flexibility. I think these qualities allow a quantitative analyst to continuously evolve. The role of the quantitative analyst has certainly changed in recent years. A few years ago, what was required was a good knowledge of stochastic calculus, statistics and numerical methods. Today, the biggest asset can be additional knowledge of economics, econometrics and regulation, as well as a better understanding of the underlying business. Unfortunately, changing mindset comes at the cost of having to absorb new information and a different toolkit of quantitative techniques in order to cope with, and be creative within, the new reality. For example, recent regulatory requirements such as Basel III have opened the door to a new set of models and the problems related to them. This is why I think being a quantitative analyst is a difficult task, hence the need for flexibility and eclecticism: to be able to make intellectual jumps when required and keep the profession to a high standard.
DB: Eclecticism. These days quants need to know several aspects of policy, bank business model, strategy… I express this sometimes by saying that we are seeing the end of platonic pricing, and of platonic financial modeling more generally. Valuation is becoming aggregation-dependent and nonlinear, and depends also on the margining process, the funding policies of the bank treasury, the credit situation of the entities that are trading, and a number of other aspects that used to be ignored by most quants in the past. This is not easier; it is more difficult than the quant jobs we used to have. A quant can no longer ignore economics, econometrics, historical estimation, regulation, policy, the bank business model, etc. We need to avoid compartmentalizing methodology. Awareness of the need for a more holistic approach has been there for a while, but now it is becoming more and more indispensable.
2. What do you think is the most under-valued development in the industry of the past year?
AD: The Recovery Theorem of Stephen Ross is a recent breakthrough and, according to some, the biggest one in finance of the last 10-20 years. It shows how to deduce a unique real-world (P) distribution from a market-implied risk-neutral (Q) distribution by disentangling the representative investor's preferences. This opens new frontiers in knowing explicitly the market's real expectations. Its assumptions, such as time-separable utility and the boundedness of the state variable, are still being debated, but what is clear is that it delivers something original that has been needed for a long time. The implications of the theorem, however, are still to be fully investigated. Recent work by Peter Carr and Jiming Yu casts further light on its theoretical underpinnings and has contributed to the debate around it. Unfortunately (but necessarily!) the Recovery Theorem introduces concepts with which Q-quants are not very familiar (e.g. state-dependent utility functions), hence the difficulty of its being absorbed and applied by a large segment of the industry.
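To make the idea concrete, here is a minimal sketch of Ross recovery in a finite-state setting (not from the interview; the matrix entries are hypothetical toy numbers). Given a state-price transition matrix Q, the theorem's transition-independence assumption lets one recover the natural transition matrix F from the positive Perron-Frobenius eigenvector of Q, which encodes the representative investor's marginal utilities up to scale:

```python
import numpy as np

def ross_recovery(Q):
    """Recover the natural transition matrix F and discount factor delta
    from a state-price transition matrix Q (assumed non-negative and
    irreducible), via the Perron-Frobenius eigenvector of Q."""
    eigvals, eigvecs = np.linalg.eig(Q)
    k = np.argmax(eigvals.real)           # Perron root: the largest (real) eigenvalue
    delta = eigvals[k].real               # subjective discount factor
    z = np.abs(eigvecs[:, k].real)        # positive Perron eigenvector: z_i ~ 1/u'(c_i)
    D = np.diag(1.0 / z)                  # diagonal of marginal utilities, up to scale
    # Transition independence: Q = delta * D^{-1} F D  =>  F = (1/delta) D Q D^{-1}
    F = (1.0 / delta) * D @ Q @ np.linalg.inv(D)
    return F, delta

# Toy 3-state state-price matrix (hypothetical numbers; rows sum to bond prices < 1)
Q = np.array([[0.50, 0.28, 0.17],
              [0.25, 0.45, 0.25],
              [0.15, 0.30, 0.48]])
F, delta = ross_recovery(Q)
print(np.round(F, 3))      # recovered real-world transition probabilities
print(F.sum(axis=1))       # each row sums to 1 by construction
```

Because Q z = delta * z for the Perron pair, each row of F sums to one automatically, so F is a genuine probability transition matrix; the continuous-state case and the debated boundedness assumptions are, of course, well beyond this sketch.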
DB: Hard to say. Perhaps the recent press reports on funding costs show that, even though funding costs have been debated a lot at a high level, little rigorous research has been done on their valuation and their meaning. There has been a lot of high-level debate, and my impression is that many people argued because they thought they were speaking about the same thing when they actually meant different things.
So when you say “funding costs” you have to be very careful and very precise about what you are describing. More generally, I find we are rediscovering past works that are becoming more and more relevant today – but these are methodological tools that have been around a while and rarely used, rather than last year's technical developments. I haven't noticed many of those.
3. Algorithmic trading is the biggest threat globally to liquidity. Yes or No?
FM: I’m more concerned about new regulations requiring more and more collateral to be posted to counterparties, including central clearing ones. High-quality collateral is obviously not unlimited.
AD: Both a “Yes” and a “No” answer can be too dismissive and easily challenged, as Algo Trading (AT) comprises many concepts, each with its own specificity. AT as a market-making activity creates liquidity by definition, and AT as an arbitrage strategy can resolve market inefficiencies – both positive effects in principle. The vast majority of empirical studies point to an overall beneficial contribution of AT to liquidity and market quality. We should not forget that these conclusions can vary across the underlying strategies, and some of them, e.g. aggressive momentum strategies, can have the opposite effect on liquidity. It is up to the regulators to understand and limit market-abuse behaviour and make the answer to this question a firm “No”.
Market distress situations are another issue, and a lot of papers have been written about the Flash Crash in 2010 and the subsequent question of whether to further regulate AT-related activities. A recent paper by Andrei Kirilenko shows that AT seems to have exacerbated the volatility of the crash (a “hot potato” effect), which raises the general question of AT behaviour in conditions of distress. So, although AT provides liquidity in normal times, it can be argued that it is withdrawn in times of stress, causing liquidity to dry up. A breakthrough towards a better understanding of this type of liquidity crunch due to AT (HFT more specifically) came recently with the seminal work of David Easley, Marcos Lopez de Prado and Maureen O’Hara on their VPIN indicator. The question of whether and how to regulate AT more to prevent liquidity crunches is still being challenged and discussed and, until it is settled, “No” is not a certain answer.
DB: Yes if left mostly unregulated and in the wild, no if properly regulated.
4. What will people be discussing at Global Derivatives 2014?
FM: I think quants, maybe a bit reluctantly, will increasingly need to keep an eye on regulations, be more involved in discussions, and play a more active role in defining them.
AD: I think a lot of focus will be put on the new environment in which institutions operate. This will be a more complex and interconnected world, constrained by a tighter regulatory regime. My guess is that Big Data, CVA/DVA/FVA, and new numerical methods to deal with growing computational needs will be the hot topics this year. On the other hand, it was fashionable to talk about tail risks when they were everywhere in 2007-2012, but that topic has subsided by now and I do not think it will be one of the focuses of Global Derivatives 2014. However, I believe we should always try to be aware of what lurks beneath and not forget that tail risks, although they can be tamed, cannot be extinguished. This is why we have to keep thinking about how to model and manage them. I also expect another highly debated issue to be the essence of the quant profession and its reshaping in the new ‘normal’ of the world.
DB: Given recent news on the impact of funding costs on banks' profits, funding will be an ongoing topic. Of course, the impact of central clearing and initial margins on liquidity and deal profitability will also be key.
I think we’ll also see algo trading, as per your previous question, possibly a discussion of leverage ratios, and liquidity risk. I think reputational risk might surface sooner or later as a topic, again given recent years' news on banks, and I would be keen on an update on commodities markets and models. I also think that at some point consumer and digital finance might mix with the global derivatives agenda in a more explicit way, but perhaps not yet.
Don’t miss a post – sign up to the Global Derivatives e-newsletter for exclusive news, research and interviews from our community of expert speakers.