In this paper we study how to update the solution of the linear system Ax = b after the matrix A is changed by the addition or deletion of rows or columns.
By studying the QR factorization of the system, specifically the factorization produced by the Householder reflection algorithm, we find that the algorithm can be split into two parts.
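To illustrate the kind of update step involved, here is a minimal NumPy sketch (not the paper's implementation) that restores the triangular factor after one row is appended to A, using Givens rotations; the function name `qr_append_row` is an assumption for this example:

```python
import numpy as np

def qr_append_row(R, u):
    """Return the updated triangular factor after appending row u to A = QR.

    Sketch only: Q is not updated here. Givens rotations zero out the
    appended row against the existing (n x n) triangular factor R.
    """
    n = R.shape[1]
    M = np.vstack([R, u.reshape(1, -1)]).astype(float)
    for k in range(n):
        a, b = M[k, k], M[n, k]
        r = np.hypot(a, b)
        if r == 0.0:
            continue
        c, s = a / r, b / r
        top, bot = M[k, k:].copy(), M[n, k:].copy()
        M[k, k:] = c * top + s * bot   # rotated row k stays upper triangular
        M[n, k:] = -s * top + c * bot  # appended row is zeroed column by column
    return M[:n, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
R = np.linalg.qr(A)[1]
u = np.ones(3)
R_new = qr_append_row(R, u)  # O(n^2) work instead of a full refactorization
R_ref = np.linalg.qr(np.vstack([A, u]))[1]
```

The updated factor agrees with a full refactorization of the augmented matrix up to row signs, which is the freedom the QR factorization permits.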
The regression model is a well-studied method for predicting real-valued data.
Linear least squares problems are commonly solved by QR factorization.
When multiple solutions need to be computed with only minor changes in the underlying data, knowledge of the difference between the old data set and the new can be used to update an existing factorization at reduced computational cost.
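For context, the baseline QR-based least squares solve that such updates accelerate can be sketched in NumPy (illustrative only, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 4))
b = rng.standard_normal(100)

# Thin QR factorization: A = QR with orthonormal Q (100 x 4) and
# upper triangular R (4 x 4).
Q, R = np.linalg.qr(A)

# Least squares solution from the triangular system R x = Q^T b
# (a dedicated triangular solver would be used in practice).
x = np.linalg.solve(R, Q.T @ b)
```

Recomputing the factorization from scratch costs O(mn^2); exploiting knowledge of which rows or columns changed reduces each subsequent solve well below that.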
We reduce the constrained problem to an unconstrained linear least squares problem and partition it into a smaller subproblem. We carry out an error analysis of the proposed algorithm and show that it is backward stable.
We also illustrate the implementation and accuracy of the proposed algorithm through numerical experiments, with particular emphasis on dense problems. The constrained problem arises in important applications in science and engineering, such as beam-forming in signal processing, curve fitting, the solution of inequality-constrained least squares problems, penalty function methods in nonlinear optimization, electromagnetic data processing, and the analysis of large-scale structure. Its solution can be obtained using direct elimination, the nullspace method, or the method of weighting.
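As a concrete illustration of the method of weighting mentioned above, the equality constraint is folded into the least squares problem with a large weight; the constraint Bx = d and the weight tau below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
B = np.array([[1.0, 1.0, 1.0]])  # example constraint: x1 + x2 + x3 = 1
d = np.array([1.0])

# Method of weighting: stack the constraint rows, scaled by a large
# weight tau, on top of the data rows, then solve the unconstrained
# least squares problem.
tau = 1e8
M = np.vstack([tau * B, A])
rhs = np.concatenate([tau * d, b])
x = np.linalg.lstsq(M, rhs, rcond=None)[0]
```

The larger tau is, the more nearly the constraint is satisfied, at the cost of worsening the conditioning of the stacked system.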
This QR-updating framework is then used to perform regression analysis.