## Highlights

- each of its rows is a multiple of yᵀ and each of its column vectors is a multiple of x. (Steven J. Leon 94)
- If the linear system Ax = b is consistent and x0 is a particular solution, then a vector y will also be a solution if and only if y = x0 + z where z ∈ N(A). (Steven J. Leon 144)
- A minimal spanning set is called a basis. (Steven J. Leon 149)
- Let x1, x2, ..., xn be n vectors in Rn and let X = (x1, ..., xn). The vectors x1, x2, ..., xn will be linearly dependent if and only if X is singular. (Steven J. Leon 152)
- Any subset of fewer than n linearly independent vectors can be extended to form a basis for V. (Steven J. Leon 162)
- The vector c defined in this way is called the coordinate vector of v with respect to the ordered basis E and is denoted [v]E. (Steven J. Leon 170)
- Thus the equation Sx = 0 has only the trivial solution and hence the matrix S is nonsingular. (Steven J. Leon 172)
- The rank of a matrix A, denoted rank(A), is the dimension of the row space of A. (Steven J. Leon 175)
- A linear system Ax = b is consistent if and only if b is in the column space of A. (Steven J. Leon 175)
- The linear system Ax = b is consistent for every b ∈ Rm if and only if the column vectors of A span Rm. The system Ax = b has at most one solution for every b ∈ Rm if and only if the column vectors of A are linearly independent. (Steven J. Leon 175)
- We cannot use the column vectors from U, since, in general, U and A have different column spaces. (Steven J. Leon 178)
- The subspace Span(x1, x2, x3, x4) is the same as the column space of the matrix (Steven J. Leon 179)
- In particular, it is much easier to calculate the coordinates of a given vector v with respect to an orthonormal basis. Once these coordinates have been determined, they can be used to compute v. (Steven J. Leon 266)
- An n × n matrix Q is orthogonal if and only if QᵀQ = I. (Steven J. Leon 268)
- If the column vectors of A form an orthonormal set of vectors in Rm, then AᵀA = I and the solution to the least squares problem is x̂ = Aᵀb. (Steven J. Leon 270)
- What if the columns of A are not orthonormal? In the next section we will learn a method for finding an orthonormal basis for R(A). From this method we will obtain a factorization of A into a product QR, where Q has an orthonormal set of column vectors and R is upper triangular. With this factorization, the least squares problem is easily solved. (Steven J. Leon 271)
- However, the main difficulty with this method is that, in forming the normal equations, we may well end up transforming the problem into an ill-conditioned one. (Steven J. Leon 469)
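The page-152 highlight (n vectors in Rn are linearly dependent if and only if the matrix X built from them is singular) is easy to check numerically. A minimal sketch with NumPy; the matrix values are made up for illustration, with the third column chosen as the sum of the first two:

```python
import numpy as np

# Columns of X are the vectors x1, x2, x3.
# x3 = x1 + x2, so the columns are linearly dependent.
X = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])

# Dependent columns <=> rank(X) < n <=> X is singular (det = 0).
print(np.linalg.matrix_rank(X))   # 2, i.e. less than 3
print(np.linalg.det(X))           # 0 up to rounding
```

The two checks agree because rank(X) < n and det(X) = 0 are equivalent characterizations of singularity for a square matrix.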
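The least-squares highlights (pages 270, 271, and 469) fit together as one computation: a QR factorization gives Q with orthonormal columns (QᵀQ = I), so the least-squares solution reduces to solving R x̂ = Qᵀb, whereas the normal equations AᵀA x̂ = Aᵀb square the condition number of A. A sketch with NumPy; the data values are invented for illustration:

```python
import numpy as np

# Overdetermined system Ax = b (illustrative data: fit a line to 4 points).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.0, 2.0, 2.0, 3.0])

# Reduced QR factorization: Q (4x2) has orthonormal columns, R (2x2) is
# upper triangular, exactly as in the page-271 highlight.
Q, R = np.linalg.qr(A)
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: QtQ = I

# Least squares via QR: solve R x_hat = Qt b.
x_qr = np.linalg.solve(R, Q.T @ b)

# Normal equations give the same answer here, but cond(AtA) = cond(A)^2,
# which is the ill-conditioning the page-469 highlight warns about.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
print(x_qr, x_ne)                        # both [0.5, 0.6]
print(np.linalg.cond(A)**2, np.linalg.cond(A.T @ A))  # roughly equal
```

For this small well-conditioned example both routes agree; the QR route matters when cond(A) is large, since squaring it in AᵀA can destroy the accuracy of the computed solution.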