|
Automatic differentiation (AD) is a practical field of computational mathematics of growing interest across many industries, including finance. Reverse-mode AD is particularly attractive because it computes the gradient of an objective function in time comparable to that required to evaluate the function itself. However, it requires excessive memory. This memory requirement can make reverse-mode AD infeasible in some cases (depending on the function complexity and the available RAM) and, in others, slower than expected due to the use of secondary memory and non-localized memory references. On the other hand, many complex (expensive) functions in finance exhibit a natural ``substitution structure''. This paper illustrates this structure in computational finance problems arising in calibration and inverse problems, as well as in determining Greeks in a Monte Carlo setting. In these cases the required memory is a small fraction of that required by straight reverse-mode AD, while the time complexity is the same. In fact, numerical results indicate a significant realized speedup over straight reverse-mode AD.
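To make the time/memory trade-off above concrete, the following is a minimal, self-contained sketch of tape-based reverse-mode AD (illustrative only; all names here are hypothetical and this is not the method proposed in the paper). Every elementary operation is recorded on a tape, so memory grows with the number of operations in the forward evaluation, while a single reverse sweep over the tape yields all partial derivatives in time comparable to the forward pass.

```python
import math

# Global tape: one entry per recorded value, listing its parents as
# (parent_index, local_partial_derivative) pairs. Memory grows with the
# number of operations -- the cost the abstract refers to.
tape = []

class Var:
    """A scalar value that records its operations on the tape."""

    def __init__(self, value):
        self.value = value
        self.index = len(tape)
        tape.append([])              # inputs have no parents

    def _record(self, value, parents):
        out = Var(value)
        tape[out.index] = parents    # store local partial derivatives
        return out

    def __mul__(self, other):
        return self._record(self.value * other.value,
                            [(self.index, other.value),
                             (other.index, self.value)])

    def __add__(self, other):
        return self._record(self.value + other.value,
                            [(self.index, 1.0), (other.index, 1.0)])

def sin(v):
    return v._record(math.sin(v.value), [(v.index, math.cos(v.value))])

def gradient(output):
    """One reverse sweep over the tape: time comparable to the forward pass."""
    adjoint = [0.0] * len(tape)
    adjoint[output.index] = 1.0
    for i in reversed(range(len(tape))):
        for parent, partial in tape[i]:
            adjoint[parent] += adjoint[i] * partial
    return adjoint

x, y = Var(2.0), Var(3.0)
f = x * y + sin(x)                   # f(x, y) = x*y + sin(x)
adj = gradient(f)
df_dx, df_dy = adj[x.index], adj[y.index]
# Analytically: df/dx = y + cos(x), df/dy = x
```

One gradient evaluation here touches each tape entry a constant number of times, which is why the gradient costs a small multiple of the function evaluation; the tape itself, however, must hold every intermediate operation, which is precisely the memory burden that the substitution structure studied in this paper avoids.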
|
Keywords: Gradient, Automatic differentiation, Reverse-mode, Greeks, Local volatility, Calibration, Inverse problems, Algorithmic differentiation, Monte Carlo method.
|