Quadprog in Julia

As described on Wikipedia, a quadratic programming problem with \(n\) variables and \(m\) constraints is of the form $$ \min_x \left(-d^T x + \tfrac{1}{2} x^T D x\right) $$ subject to the constraints \( A^T x \geq b_0 \), where \(D\) is an \(n \times n\) real symmetric matrix, \(A\) is an \(n \times m\) real matrix, \( b_0 \) is an \(m\)-dimensional vector of constraints, \( d \) is an \(n\)-dimensional vector, and the variable \(x\) is an \(n\)-dimensional vector.
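
To make the notation concrete, here is a minimal, self-contained Julia sketch of a tiny instance of this problem. The numbers are purely illustrative (they are not taken from any of the applications below); the objective and feasibility check are written exactly as in the formula above.

```julia
# Illustrative 3-variable, 3-constraint instance of the quadratic program above.
D = [2.0 0.0 0.0;            # n x n real symmetric matrix
     0.0 2.0 0.0;
     0.0 0.0 2.0]
d = [0.0, 5.0, 0.0]          # n-dimensional vector
A = [-4.0  2.0  0.0;         # n x m matrix: each column defines one constraint
     -3.0  1.0  0.0;
      0.0  0.0  1.0]
b0 = [-8.0, 2.0, 0.0]        # m-dimensional right-hand side, A' * x >= b0

# Objective to minimize: -d' * x + 1/2 * x' * D * x
objective(x) = -d' * x + 0.5 * x' * D * x

# Evaluate the objective and check feasibility at a candidate point.
x = [0.0, 2.0, 0.0]
println("objective = ", objective(x), ", feasible = ", all(A' * x .>= b0))
```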

Solving convex quadratic programming problems happens to be useful in several areas of finance. One of the applications is to find the set of arbitrage-free option prices closest to a given set of market option prices as described in An arbitrage-free interpolation of class C2 for option prices (also published in the Journal of Derivatives). Another application is portfolio optimization.

In a blog post dating from 2018, Jherek Healy found that the sparse solve_qp solver of Scilab was the most efficient among various open-source alternatives. The underlying algorithm is actually Berwin Turlach's quadprog, originally coded in Fortran and available as an R package. I had used this algorithm to implement the techniques described in my paper (and even proposed a small improvement regarding the treatment of machine-epsilon accuracy, now included in the latest version of quadprog).

Julia, like Python, offers several good convex optimizers, but they support a much richer set of problems than the basic standard quadratic programming problem. As a consequence, they are not optimized for our simple use case. I have therefore ported the algorithm to Julia, and measured a 100x performance increase over the COSMO solver on the closest arbitrage-free option prices problem. Below is a table summarizing the results (also detailed on the github repo main page).
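
As a rough sketch of how the ported solver might be called on the tiny instance above, assuming it exposes a solveQP function mirroring the R quadprog solve.QP interface (the exact function name, argument order, keywords and return values here are assumptions, not the confirmed API; see the repository README for the actual interface):

```julia
# Sketch only: solveQP and its signature are assumed to mirror R's
# quadprog solve.QP(Dmat, dvec, Amat, bvec, meq); the return tuple is
# also assumed. Check the package README for the actual exported names.
using GoldfarbIdnaniSolver

D = [2.0 0.0 0.0; 0.0 2.0 0.0; 0.0 0.0 2.0]
d = [0.0, 5.0, 0.0]
A = [-4.0 2.0 0.0; -3.0 1.0 0.0; 0.0 0.0 1.0]
b0 = [-8.0, 2.0, 0.0]

# meq = 0: all constraints are inequalities A' * x >= b0.
sol, lagr, crval, iact, nact, iter = solveQP(D, d, A, b0, meq = 0)
println("x* = ", sol, ", objective = ", crval)
```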

| Solver               | RMSE                  | Time (ms) |
|----------------------|-----------------------|-----------|
| GoldfarbIdnaniSolver | 0.0031130349998388157 | 0.205     |
| COSMO                | 0.0031130309597602370 | 21.485    |
| SCS                  | 0.0031130429769795193 | 8.381     |
