The paper deals with the problem of solving overdetermined systems of linear algebraic equations. From the numerical-methods point of view, such a problem belongs, in essence, to the least squares class. A distinguishing feature of the problem statement considered here is that the system coefficients are not known exactly and are replaced with corresponding random values that are observed sequentially. The aim is to derive an iterative/recursive algorithm that forms a strongly consistent, i.e. converging with probability 1, sequence of estimates of the desired solution, and that remains stable with respect to the observation data and to the condition number of the current estimates of the system matrix.
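To make the sequential setting concrete, the following minimal Python sketch shows a classical recursive least-squares (RLS) update processing one observed row of the system at a time. It is not the strongly consistent algorithm proposed in the paper (plain RLS is generally biased when the coefficient rows themselves are noisy); the function name, the initialization constant delta, and the synthetic data are illustrative assumptions only.

```python
import numpy as np

# Illustrative only: classical recursive least squares for an overdetermined
# system A x ~ b whose rows (a_k, b_k) arrive one at a time. This is NOT the
# paper's algorithm; with noisy coefficients a_k plain RLS is generally biased.

def rls(rows, rhs, n, delta=1e3):
    """Process observations (a_k, b_k) sequentially.

    rows : iterable of length-n coefficient vectors a_k (possibly noisy)
    rhs  : iterable of scalars b_k
    n    : number of unknowns
    delta: large initial scale for the inverse-Gram approximation (assumed value)
    """
    x = np.zeros(n)            # current estimate of the solution
    P = delta * np.eye(n)      # running approximation of (A^T A)^{-1}
    for a, b in zip(rows, rhs):
        a = np.asarray(a, dtype=float)
        Pa = P @ a
        gain = Pa / (1.0 + a @ Pa)      # Kalman-type gain vector
        x = x + gain * (b - a @ x)      # correct estimate with the residual
        P = P - np.outer(gain, Pa)      # rank-one downdate of P
    return x

# Tiny usage example with synthetic data (hypothetical values):
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((200, 3))
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(rls(A, b, n=3))   # close to x_true when the coefficients are noise-free
```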