Module: tfg.math.optimizer.levenberg_marquardt
This module implements a Levenberg-Marquardt optimizer.
Solves \(\min_{\mathbf{x}} \sum_i \|\mathbf{r}_i(\mathbf{x})\|^2_2\), where the
\(\mathbf{r}_i(\mathbf{x})\) are the residuals. Levenberg-Marquardt is an
iterative process that linearizes the residuals around the current estimate
and, at each iteration \(t\), finds a displacement \(\Delta \mathbf{x}\) such
that the update \(\mathbf{x}_{t+1} = \mathbf{x}_{t} + \Delta \mathbf{x}\)
decreases the loss. The displacement is obtained by solving the regularized
linear least-squares problem
\(\min_{\Delta \mathbf{x}} \sum_i
\|\mathbf{J}_i(\mathbf{x}_{t})\Delta\mathbf{x} +
\mathbf{r}_i(\mathbf{x}_t)\|^2_2 + \lambda\|\Delta \mathbf{x}\|_2^2,\)
where \(\mathbf{J}_i(\mathbf{x}_{t})\) is the Jacobian of \(\mathbf{r}_i\)
evaluated at \(\mathbf{x}_t\), and \(\lambda\) is a scalar damping weight.
More details on Levenberg-Marquardt can be found on [this page](https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm).
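
As an illustration of the subproblem above, the sketch below solves the damped normal equations \((\mathbf{J}^\top\mathbf{J} + \lambda\mathbf{I})\Delta\mathbf{x} = -\mathbf{J}^\top\mathbf{r}\) for a single stacked residual. This is a minimal sketch of one update step, not the module's internal implementation; the helper name `lm_step` is hypothetical.

```python
import tensorflow as tf


def lm_step(jacobian, residuals, damping):
  """Hypothetical helper: one Levenberg-Marquardt displacement.

  Solves (J^T J + lambda I) dx = -J^T r, the closed-form solution of the
  regularized linear least-squares subproblem described above.
  """
  jt_j = tf.matmul(jacobian, jacobian, transpose_a=True)
  jt_r = tf.matmul(jacobian, residuals[..., tf.newaxis], transpose_a=True)
  damped = jt_j + damping * tf.eye(tf.shape(jt_j)[-1], dtype=jt_j.dtype)
  delta_x = tf.linalg.solve(damped, -jt_r)
  return tf.squeeze(delta_x, axis=-1)


# Example: residuals r(x) = A x - b linearize to J = A, r = A x_t - b.
a = tf.constant([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = tf.constant([1.0, 2.0, 2.0])
x_t = tf.zeros([2])
delta = lm_step(a, tf.linalg.matvec(a, x_t) - b, damping=0.1)
x_next = x_t + delta
```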
Functions
minimize(...)
: Minimizes a set of residuals in the least-squares sense.
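
A minimal usage sketch of `minimize` follows. It assumes the call pattern suggested by the summary above: residuals passed as Python callables of the variables, a tuple of initial variable values, an iteration budget, and a return value containing the final objective and optimized variables. The toy problem and parameter names are illustrative only.

```python
import tensorflow as tf
from tensorflow_graphics.math.optimizer import levenberg_marquardt

# Toy problem: minimize (x - 2)^2 + (3x - 5)^2, whose least-squares
# solution is x = 1.7. Each residual is a callable of the variables.
def residual_1(x):
  return x - 2.0

def residual_2(x):
  return 3.0 * x - 5.0

# Assumed call pattern and return structure (final objective, variables).
objective, variables = levenberg_marquardt.minimize(
    residuals=(residual_1, residual_2),
    variables=(tf.constant(0.0),),
    max_iterations=10)
print(objective, variables)
```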