The explosion of the internet, in conjunction with the success of neural networks, brought the world of finance closer to more exotic approaches. Deep learning today is one such technique that is being widely adopted to cut down losses and generate profits.
When gut instinct does not do the job, mathematical methods come into play. Differential equations, for instance, can be used to represent a dynamic model. The approximation of pricing functions is a persistent challenge in quantitative finance: by the early 1980s, researchers were already experimenting with Taylor expansions for stochastic volatility models.
For example, suppose company A wants to buy a commodity, say oil, from company B at a future date but is unsure of future prices. Company A therefore strikes a deal with B: no matter what the price of oil is in the future, B will sell it to A at the price fixed in their contract.
In the world of finance, this is a watered-down version of derivatives trading. Derivatives are securities whose value is derived from underlying assets. In the above case, company A predicts a rise in price, and company B predicts a fall. Both companies are making a bet on future prices and agree on a price that cuts down their losses or can even bring profits (if A sells after the price rises). So how do these companies arrive at a certain price, and how do they predict the future price?
Taking the same example of derivatives trading, researchers at Denmark's Danske Bank have explored the implications of differential deep learning.
Deep learning offers the much-needed analytic speed necessary for approximating volatile markets. Machine learning tools can handle the high dimensionality (many parameters) of a market and help resolve computational bottlenecks.
Understanding Differential ML Through The Lens Of Finance
Differential machine learning is an extension of supervised learning in which ML models are trained on differentials of labels with respect to inputs.
In the context of financial derivatives and risk management, pathwise differentials are commonly computed with automatic adjoint differentiation (AAD). AAD is an algorithm to calculate derivative sensitivities very quickly; nothing more, nothing less. AAD is also known in the field of machine learning as 'back-propagation', or simply backprop.
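A minimal sketch of what AAD does in this setting: a toy Monte Carlo pricer for a European call, where the pathwise delta is obtained by propagating adjoints backwards through the simulation and checked against a bumped finite difference. The market parameters (`S0`, `K`, `r`, `sigma`, `T`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative (assumed) market parameters for a European call.
rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 110.0, 0.02, 0.2, 1.0
Z = rng.standard_normal(100_000)

# Forward pass: simulate terminal prices and discounted call payoffs.
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)

# Reverse (adjoint) pass: seed the output adjoint with 1 and propagate
# d payoff / d ST, then d ST / d S0, back to the input.
payoff_bar = 1.0
ST_bar = payoff_bar * np.exp(-r * T) * (ST > K)  # d payoff / d ST per path
S0_bar = ST_bar * ST / S0                        # d ST / d S0 = ST / S0

price = payoff.mean()
delta = S0_bar.mean()                            # pathwise delta

# Sanity check: bump S0 and reprice on the same random draws.
h = 1e-4
ST_up = (S0 + h) * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
fd_delta = (np.exp(-r * T) * np.maximum(ST_up - K, 0.0).mean() - price) / h
```

The adjoint pass reuses the quantities from the forward pass, which is why AAD delivers all sensitivities at a small constant multiple of the pricing cost instead of one re-pricing per bumped input.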
Differential machine learning, combined with AAD, write the authors, provides extremely effective pricing and risk approximations: it can produce fast pricing analytics, effectively compute risk-management metrics, and even simulate hedge strategies.
This work compares differential machine learning to data augmentation in computer vision, where multiple labelled images are produced from a single one by cropping, zooming, rotating or recolouring.
Data augmentation not only extends the training set but also encourages the machine learning model to learn important invariances (features that stay the same). Similarly, derivative labels not only increase the amount of information in the training set but also encourage the model to learn the shape of the pricing function. The derivatives of a feedforward network form another neural network, efficiently computing risk sensitivities in the context of pricing approximation. Since the adjoints form a second network, one can use them for training as well and expect a significant performance gain.
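The observation that a network's derivatives form a second network can be seen in a plain NumPy sketch (a toy one-hidden-layer net with assumed random weights, not the authors' implementation): the backward pass reuses the same weight matrices, so the input gradient is itself computed by a "twin" network, verified here against a central finite difference.

```python
import numpy as np

# Toy one-hidden-layer network y = w2 . tanh(W1 x + b1) + b2
# with arbitrary (assumed) random weights.
rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
w2, b2 = rng.standard_normal(8), 0.1

def forward(x):
    h = np.tanh(W1 @ x + b1)
    return w2 @ h + b2, h

def backward(h):
    # The adjoint pass flows back through the SAME weights W1, w2,
    # so dy/dx is computed by a second ("twin") network.
    dy_dh = w2                   # d y / d h
    dh_dz = 1.0 - h**2           # tanh'(z), from the forward activations
    return W1.T @ (dy_dh * dh_dz)

x = rng.standard_normal(3)
y, h = forward(x)
grad = backward(h)               # full input gradient in one pass

# Check the first component against a central finite difference.
eps = 1e-6
e0 = np.zeros(3); e0[0] = eps
fd = (forward(x + e0)[0] - forward(x - e0)[0]) / (2 * eps)
```

Because the twin network shares its weights with the original, training on its outputs (the adjoints) propagates learning signal into the same parameters, which is where the expected performance gain comes from.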
Risk sensitivities converge considerably more slowly than values and often remain blatantly wrong, even with hundreds of thousands of examples. The researchers resolve these problems by training ML models on datasets augmented with differentials of labels with respect to inputs.
This simple idea, assert the authors, along with an adequate training algorithm, allows ML models to learn accurate approximations even from small datasets, making machine learning viable in the context of trading.
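One way to picture training on differentials is with a linear least-squares stand-in for the paper's neural networks (a sketch under assumed data, not the authors' setup): the derivative labels are stacked as extra equations alongside the value labels, so a handful of samples of a hypothetical target function constrains both the values and the shape of the fit.

```python
import numpy as np

# Tiny training set from an assumed target sin(2x), with both value
# labels y and differential labels dy = d sin(2x)/dx = 2 cos(2x).
x = np.linspace(-1.0, 1.0, 5)
y, dy = np.sin(2 * x), 2 * np.cos(2 * x)

# Polynomial features phi_j(x) = x^j and their derivatives phi_j'(x).
deg = 6
Phi = np.vander(x, deg + 1, increasing=True)
dPhi = np.hstack([np.zeros((len(x), 1)),
                  Phi[:, :-1] * np.arange(1, deg + 1)])

# Differential training: stack value equations and derivative equations
# into one least-squares problem, weighted by lam.
lam = 1.0
A = np.vstack([Phi, lam * dPhi])
b = np.concatenate([y, lam * dy])
w_diff, *_ = np.linalg.lstsq(A, b, rcond=None)

# Value-only baseline for comparison (5 equations, 7 unknowns).
w_vals, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Out-of-sample error of the differential fit against the target.
xt = np.linspace(-1.0, 1.0, 101)
Pt = np.vander(xt, deg + 1, increasing=True)
err_diff = np.max(np.abs(Pt @ w_diff - np.sin(2 * xt)))
```

The derivative rows are what pin down the shape of the approximation between the five sample points; the value-only system is underdetermined and has no such constraint. In the paper this stacking is done on a neural network's twin, with AAD supplying the derivative labels.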
Differential machine learning learns better from data alone: the vast amount of information contained in the differentials plays a role similar to, and often more effective than, manual adjustments based on contextual information.
The researchers posit that the unreasonable effectiveness of differential ML applies in situations where high-quality first-order derivatives with respect to training inputs are available, and in complex computational tasks such as the pricing and risk approximation of complex derivatives trades.
Differentials inject meaningful additional information, resulting in better approximations from smaller datasets. Learning effectively from small datasets is critical in the context of regulations, where the pricing approximation must be learned quickly and the expense of a large training set cannot be afforded.
The results from the experiments by Danske Bank's researchers show that 'learning the correct shape' from differentials is crucial to the performance of regression models, including neural networks.
Know more about differential deep learning here.