Lifting the Fog of Markets - CVA and the Analytical Enterprise

$600 trillion – that’s the size of the derivatives market. 40 times the size of the U.S. economy. And it has long been one of the most opaque wonders of the world. So it’s no surprise that regulators are all over it. The consequences of default can be huge. But ironically it wasn’t defaults that represented the greatest source of losses during the financial crisis.

According to the Basel Committee, roughly two thirds of counterparty credit risk losses were due to credit value adjustment (CVA) losses and only one third to actual defaults.[i] Managing CVA volatility remains one of the biggest challenges facing financial institutions to this day.

Never mind the algorithmic and compute challenges involved, the acronyms alone are a lot to handle.

CVA – credit value adjustment – is the difference between the market value of a counterparty exposure and that of a risk-free position. If derivatives are cleared by CCPs (central clearing counterparties), such contracts carry a CVA of zero. But if they remain within the bilateral structure of OTC markets – that’s a whole other ball game. CCRs, EPEs, EDEs and DVAs all add to the alphabet soup, which basically sums up to the fact that this is not a topic for the technically challenged.[ii] In fact, for much of Wall Street, despite the huge risks involved, getting on top of these exposures is still a work in progress.
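In rough terms, CVA is often approximated as loss-given-default times the discounted expected exposure, weighted by the counterparty’s marginal default probability in each time bucket. A minimal sketch of that sum – all inputs below are made-up numbers for illustration only:

```python
# Illustrative approximation: CVA ≈ (1 - R) * sum_i DF(t_i) * EE(t_i) * ΔPD(t_i)
# R = recovery rate, DF = discount factor, EE = expected exposure,
# ΔPD = marginal default probability in each time bucket.

def cva_approx(recovery, discount_factors, expected_exposures, default_probs):
    """Sum discounted expected exposure weighted by marginal default probability."""
    lgd = 1.0 - recovery  # loss given default
    return lgd * sum(df * ee * dpd
                     for df, ee, dpd in zip(discount_factors,
                                            expected_exposures,
                                            default_probs))

# Four hypothetical annual buckets for a single counterparty
dfs  = [0.98, 0.95, 0.92, 0.89]                  # discount factors
ees  = [1_000_000, 1_200_000, 900_000, 500_000]  # expected exposure profile
dpds = [0.01, 0.012, 0.011, 0.009]               # marginal default probabilities

print(round(cva_approx(0.4, dfs, ees, dpds), 2))  # → 21955.8
```

The hard part in practice is not this sum; it is producing the expected-exposure profile itself, which is where the simulation burden discussed below comes in.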

CVA calculations are typically carried out by the trading desk, and sometimes by risk and finance as well. But while finance, risk and trading may share similar models, trading rarely wants to share what part of the curve it’s on with the rest of the business. And to the extent that different lines of business use different risk methodologies, creating a single version of the truth becomes even more elusive. That makes the efficient allocation of capital across the enterprise more art than science.

By themselves, Monte Carlo simulations are renowned for their complexity; for exotics, even more so. The number of internal and external scenarios to calculate can grow exponentially with the complexity of the derivative. That requires a huge amount of compute power, which for many firms using standard technology takes too long.
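To see why the compute load balloons, even a toy expected-exposure profile for one trade means revaluing it across thousands of simulated market paths at every future date. A hypothetical sketch – the drift-free Gaussian random walk is an assumption for illustration, not a production model:

```python
import random

def simulate_epe(vol, steps, n_paths, seed=7):
    """Toy expected positive exposure (EPE): simulate Gaussian random-walk
    trade values and average max(value, 0) at each future time step."""
    rng = random.Random(seed)
    epe = [0.0] * steps
    for _ in range(n_paths):
        value = 0.0
        for t in range(steps):
            value += rng.gauss(0.0, vol)   # one step of the simulated trade value
            epe[t] += max(value, 0.0)      # only positive exposure matters for CVA
    return [e / n_paths for e in epe]

# 12 monthly steps x 5,000 paths = 60,000 revaluations for a single trade;
# a real portfolio multiplies this by trades, risk factors and CVA scenarios.
profile = simulate_epe(vol=1.0, steps=12, n_paths=5000)
print([round(e, 3) for e in profile])
```

Note how the exposure widens with the horizon: the further out you look, the more the simulated value can drift into the money, which is exactly why long-dated derivatives dominate the CVA bill.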

CVA adds a further complexity – scenarios on top of scenarios.

New simulation technology is one response – simplifying the regression by reducing the number of scenarios.[iii]

But compute power is another part of the challenge. And not just to calculate the endless scenarios. As more and more data becomes electronic and accessible in digital form, the number of inputs into a scenario can also grow substantially. In analyzing the risks in any position we can rely not only on the huge amount of market information, we can tap social data as well.

Back in the days of open outcry, traders in one pit could sense the buzz of markets by listening to the roar in other pits. With electronic trading that intelligence disappeared. To some extent it has been replaced by 24x7 news feeds, but tweets, blogs and internet activity can provide a ton of extra intelligence, anticipating potential movements in CDS spreads and CVA positions.

The pressure to deliver CVA calculations to tougher regulatory standards brings together two monolithic challenges that plague financial services today – the challenge of Big Data – and the need to access massive compute power at a moment’s notice.

The cloud is an obvious solution. It is the one place where you can access compute power at the touch of a button. But while some firms are rushing to embrace it, others remain anxious about it. They worry about its security, in particular putting customer data beyond their firewalls. Bursting to the cloud avoids this risk, keeping data on premises while accessing almost unlimited computing power at a fraction of the cost of in-house systems.

Most firms will prefer a portfolio of technology solutions: massively parallel processing (MPP) allowing many servers to come together on a single problem; Hadoop to tackle the challenges of Big Data; and a range of cloud solutions – private (on premises), hybrid and public – providing options for almost every market situation. The back office of the future has to be able to turn on a dime.

On April 10th, JP Morgan Chase, Numerix and Microsoft came together at the Harvard Club to share their perspectives on these challenging topics with a number of industry practitioners.[iv] The good news is that most of the tools we need are readily at our disposal.

The challenge, as always, comes in implementing them, and that means a true commitment to the analytical enterprise – not just within one line of business, but across the enterprise as a whole.



[i] Claudio Albanese, Giacomo Pietronero, and Steve White, “Optimal Funding Strategies for Counterparty Credit Risk Liabilities,” http://www.riskcare.com/wordpress/wp-content/uploads/2011/05/Revolvers.pdf

 

[ii] CCR – counterparty credit risk; EPE – expected positive exposure; EDE – expected negative exposure; DVA – debit value adjustment

 

[iii] American Monte Carlo developed by Numerix is one example.