Optimization Theory
The Total Least Squares (TLS) problem is a well known technique for solving over-determined linear systems of equations $$Ax\approx b, \quad A \in \mathbb{R}^{m \times n}, \ b \in \mathbb{R}^m, \quad m>n,$$
in which both the matrix $A$ and the right-hand side $b$ are affected by errors. We consider the following classical definition of the TLS problem. The Total Least Squares problem with data $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$, $m>n$, is given by \begin{align} \min \ \|(E \mid f)\|_F \quad \text{subject to} \quad b+f \in \text{Im}(A+E), \end{align}
where $E \in \mathbb{R}^{m \times n}$ and $f \in \mathbb{R}^m$. Here, $\|(E \mid f)\|_F$ denotes the Frobenius matrix norm, and $(E \mid f)$ denotes the $m \times (n + 1)$ matrix whose first $n$ columns are the columns of $E$ and whose last column is $f$. In various engineering and statistics applications where a mathematical model reduces to the solution of an over-determined, possibly inconsistent linear system $Ax \approx b$, solving that system in the TLS sense yields a more convenient approach than ordinary least squares, in which the data matrix $A$ is assumed exact and the errors are confined to the right-hand side $b$. In this project, we derived an iterative algorithm (see Algorithm 1 above) for solving the Total Least Squares problem based on randomized projection.
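For reference, the classical (non-randomized) TLS solution can be computed in closed form from the SVD of the augmented matrix $(A \mid b)$: if $v$ is the right singular vector associated with the smallest singular value and its last component is nonzero, then $x_{\mathrm{TLS}} = -v_{1:n}/v_{n+1}$. The sketch below is this standard SVD baseline, not the randomized Algorithm 1 of this project; the function name `tls_svd` and the synthetic data are illustrative assumptions.

```python
import numpy as np

def tls_svd(A, b):
    """Classical TLS solution via the SVD of the augmented matrix (A | b).

    Minimizes ||(E | f)||_F subject to b + f in Im(A + E), assuming the
    smallest singular value of (A | b) is simple and the last component
    of the corresponding right singular vector is nonzero.
    """
    m, n = A.shape
    C = np.column_stack([A, b])      # augmented m x (n+1) matrix (A | b)
    _, _, Vt = np.linalg.svd(C)      # rows of Vt are right singular vectors
    v = Vt[-1]                       # vector for the smallest singular value
    if np.isclose(v[-1], 0.0):
        raise ValueError("TLS solution does not exist (last component of v is zero)")
    return -v[:n] / v[-1]            # x_TLS = -v[1:n] / v[n+1]

# Illustrative use: an over-determined system with small noise in A and b.
rng = np.random.default_rng(0)
A_exact = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
A = A_exact + 0.01 * rng.standard_normal((20, 3))   # perturbed data matrix
b = A_exact @ x_true + 0.01 * rng.standard_normal(20)  # perturbed right-hand side
x_tls = tls_svd(A, b)
```

Because TLS perturbs both $A$ and $b$, this baseline is the natural correctness check for an iterative randomized-projection solver: both should converge to the same $x$ on consistent noisy data.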