I found one implementation of Levenberg-Marquardt that solves Ax = b using least squares, which does not seem to make sense. To learn more about the Levenberg-Marquardt algorithm, I found a fascinating blog post
here.
Note that LM is very sensitive to the initial guess, as stated on
Wikipedia, and it cannot deal with outliers. If the data contains many outliers, LM may fail to converge.
In cases with only one minimum, an uninformed standard guess like β^T = (1, 1, ..., 1) will work fine; in cases with multiple minima, the algorithm converges to the global minimum only if the initial guess is already somewhat close to the final solution.
A simpler version of LM, without the damping parameter lambda, is the Gauss-Newton method: dropping lambda from the damped normal equations (J^T J + lambda * I) delta = J^T r leaves the plain Gauss-Newton step (J^T J) delta = J^T r.
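To make the update concrete, here is a minimal sketch of a damped LM loop for nonlinear least squares. The model (an exponential curve), the data, and all parameter names are my own illustrative assumptions, not from the blog post; the damping schedule (multiply or divide lambda by 10) is one common heuristic among several.

```python
import numpy as np

def levenberg_marquardt(f, jac, beta0, x, y, lam=1e-3, iters=50):
    """Minimal LM sketch: minimize ||y - f(x, beta)||^2 over beta."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(iters):
        r = y - f(x, beta)          # residual vector
        J = jac(x, beta)            # Jacobian of f w.r.t. beta
        A = J.T @ J
        # Damped normal equations: (J^T J + lam * I) delta = J^T r
        delta = np.linalg.solve(A + lam * np.eye(A.shape[0]), J.T @ r)
        new_beta = beta + delta
        new_r = y - f(x, new_beta)
        if new_r @ new_r < r @ r:   # step improved the fit: accept, shrink damping
            beta, lam = new_beta, lam / 10.0
        else:                       # step made things worse: reject, grow damping
            lam *= 10.0
    return beta

# Hypothetical example: fit y = a * exp(b * x)
def model(x, beta):
    a, b = beta
    return a * np.exp(b * x)

def jacobian(x, beta):
    a, b = beta
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])  # partials w.r.t. a and b

x = np.linspace(0.0, 1.0, 20)
y = model(x, [2.0, -1.5])                   # noise-free synthetic data
# Start somewhat near the solution, since LM is sensitive to the initial guess
beta = levenberg_marquardt(model, jacobian, [1.5, -1.0], x, y)
```

Note how the adaptive lambda interpolates between gradient descent (large lambda) and Gauss-Newton (lambda near zero), which is exactly why dropping lambda entirely gives the simpler but less robust Gauss-Newton method.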
If the problem is convex, then there are plenty of other methods, right...?
Can you use SGD on this kind of problem?