MATLAB PROJECT
An Efficient Preconditioner for Stochastic Gradient Descent Optimization of Image Registration
Abstract:
Stochastic gradient descent (SGD) is commonly used to solve (parametric) image registration problems. For badly scaled problems, however, SGD exhibits only sublinear convergence. In this paper, we propose an efficient preconditioner estimation method to improve the convergence rate of SGD. Based on the observed distribution of voxel displacements in the registration, we estimate the diagonal entries of a preconditioning matrix, thus rescaling the optimization cost function. The preconditioner is efficient to compute and apply, and can be used for mono-modal as well as multi-modal cost functions, in combination with different transformation models such as the rigid, the affine, and the B-spline model. Experiments on different clinical datasets show that the proposed method indeed improves the convergence rate compared with SGD, with speed-ups of roughly a factor of 2 to 5 in all tested settings, while retaining the same level of registration accuracy.
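The idea of diagonal preconditioning can be sketched in a few lines. The toy example below is not the paper's displacement-based estimator; it only illustrates, on a deliberately badly scaled quadratic cost, why rescaling each parameter by a diagonal matrix P lets SGD converge at a similar rate in every coordinate. All variable names and the noise model are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Badly scaled quadratic cost f(x) = 0.5 * sum(s_i * x_i^2),
# mimicking registration parameters with very different sensitivities
# (e.g. rotations vs. translations).
scales = np.array([1.0, 100.0, 10000.0])

def stochastic_grad(x):
    # Exact gradient plus noise, standing in for a subsampled image metric.
    return scales * x + 0.01 * rng.standard_normal(3)

def sgd(x0, precond, steps=200, gamma=1e-4):
    # Preconditioned SGD update: x <- x - gamma * P * g(x),
    # with P given as a vector of diagonal entries.
    x = x0.copy()
    for _ in range(steps):
        x -= gamma * precond * stochastic_grad(x)
    return x

x0 = np.ones(3)
# Plain SGD (P = I): the step size must suit the stiffest coordinate,
# so the best-conditioned coordinate barely moves.
plain = sgd(x0, np.ones(3))
# Diagonal preconditioner ~ 1/s_i equalizes per-coordinate curvature,
# allowing a much larger effective step size.
pre = sgd(x0, 1.0 / scales, gamma=0.5)
```

After 200 iterations, plain SGD has hardly reduced the first (well-conditioned) coordinate, while the preconditioned run drives all three coordinates close to zero. In the paper's setting, the diagonal entries are instead estimated from the observed voxel-displacement distribution rather than from known curvatures.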