How to solve a system of linear equations by 'diagonalizing' the matrix.
I'll focus on lower diagonalization of a 3 x 3 matrix in the generalized case, and leave completing the solution up to you once you've gotten a sense of what is going on here.
Basically the idea here is to create a diagonal of 1s running from the top left to the bottom right of the matrix.
As a preface introducing why 'diagonalizing', or solving, the matrix would be done here, I would suggest reading up on multiplication operations on matrices at
http://en.wikipedia.org/wiki/Matrix_multiplication
The purpose here, in short, is that 'diagonalizing' is solving the matrix: when we multiply, for instance, the vector of unknowns by a given diagonalized matrix, we have the system's solutions directly. For instance, the 'diagonalized' augmented matrix of the form

[1 0 0 | d1']
[0 1 0 | d2']
[0 0 1 | d3']

leads to the solution x = d1', y = d2', z = d3', where d1', d2', d3' are whatever the right-hand side has become after the row operations. You can check the multiplication operations on the given matrix yourself.
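As a quick sanity check (a minimal Python sketch of my own; the numbers are made up), multiplying a diagonalized, identity-form coefficient matrix by the unknowns returns them unchanged, which is why the right-hand column simply is the solution:

def matvec(A, v):
    # multiply matrix A by vector v, row by row
    return [sum(a * x for a, x in zip(row, v)) for row in A]

identity = [[1, 0, 0],
            [0, 1, 0],
            [0, 0, 1]]
unknowns = [4.0, -1.0, 2.5]   # hypothetical x, y, z

print(matvec(identity, unknowns))   # [4.0, -1.0, 2.5] -- unchanged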
But let's start from the beginning and solve our given matrix, where it is of the following form:

[a11 b12 c13 | d1]
[a21 b22 c23 | d2]
[a31 b32 c33 | d3]

representing the system a11*x + b12*y + c13*z = d1, and likewise for the second and third rows.
The idea is that we use row-by-row operations to get the rows into diagonal form. Keep in mind that when operations are performed, they follow the same rules as in ordinary algebra: if a row is multiplied by a given number m, both sides of that row's equation are multiplied by m, and every column entry in that row on both sides must be multiplied by m (the distributive rule).
Keep in mind we are solving ('diagonalizing') a 3 x 3 system; we could procedurally extend this method to any n x n matrix where a solution exists, so long as the rows are independent of one another, meaning no row is a multiple of another (for example, rows [1 2 3] and [2 4 6] are dependent), which would leave the matrix without enough independent equations to yield a solution.
But I'll focus abstractly on the methodology for the solution here.
Our first objective in diagonalizing is to make the entry a21, at the (row, column) = (2, 1) position, equal to 0; to do this we need a common term with respect to a11, the entry at the (1, 1) position.
Here, for multipliers, we simply want like terms to add and subtract in the a11 and a21 positions. To do this, multiply the first row by a21 and multiply the second row by a11.
The first row becomes
[a11*a21 b12*a21 c13*a21]
and the second row becomes
[a21*a11 b22*a11 c23*a11]
Just as in the ordinary case of algebraic operations on an equation, whatever is done to the left side of our equation must be done to the right side for each row, so
on the right side for row 1 column 1 we have
a21*d1
and on the right side for row 2 column 1 we have
a11*d2
or a matrix that looks like

[a11*a21 b12*a21 c13*a21 | d1*a21]
[a21*a11 b22*a11 c23*a11 | d2*a11]
[a31 b32 c33 | d3]
Subtracting row 1 from row 2, we have

[a11*a21 b12*a21 c13*a21 | d1*a21]
[0 (b22*a11 - a21*b12) (c23*a11 - a21*c13) | (d2*a11 - a21*d1)]
[a31 b32 c33 | d3]
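Here is a minimal numeric sketch of this step in Python; the system (2x + y - z = 8 and -3x - y + 2z = -11 for the first two rows) is a made-up example of mine, not from the derivation above:

# scale row 1 by a21 and row 2 by a11, then subtract to zero out a21
row1 = [2.0, 1.0, -1.0, 8.0]     # [a11 b12 c13 | d1]
row2 = [-3.0, -1.0, 2.0, -11.0]  # [a21 b22 c23 | d2]

a11, a21 = row1[0], row2[0]
row1_scaled = [a21 * e for e in row1]
row2_scaled = [a11 * e for e in row2]
row2 = [s2 - s1 for s1, s2 in zip(row1_scaled, row2_scaled)]

print(row2)   # [0.0, 1.0, 1.0, 2.0] -- the a21 position is now zero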
We'll redivide row 1 by a21, restoring it to its original form. Then we multiply row 1 by a31, multiply row 3 by a11, and subtract row 1 from row 3 to clear the a31 position just as before. This yields a matrix of the following form:

[a11*a31 b12*a31 c13*a31 | d1*a31]
[0 (b22*a11 - a21*b12) (c23*a11 - a21*c13) | (d2*a11 - a21*d1)]
[0 (b32*a11 - a31*b12) (c33*a11 - a31*c13) | (d3*a11 - a31*d1)]
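Continuing the made-up numeric example, the same cross-multiply-and-subtract applies to rows 1 and 3 (taking -2x + y + 2z = -3 as a hypothetical third row; the sketch keeps an unscaled copy of row 1 for convenience, whereas the derivation above leaves row 1 multiplied by a31):

row1 = [2.0, 1.0, -1.0, 8.0]    # row 1 restored after redividing by a21
row3 = [-2.0, 1.0, 2.0, -3.0]   # [a31 b32 c33 | d3]

a11, a31 = row1[0], row3[0]
row1_scaled = [a31 * e for e in row1]
row3_scaled = [a11 * e for e in row3]
row3 = [s3 - s1 for s1, s3 in zip(row1_scaled, row3_scaled)]

print(row3)   # [0.0, 4.0, 2.0, 10.0] -- the a31 position is now zero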
Unfortunately, at this point the simplification isn't so pretty: matching terms, without finding common factors hidden inside the coefficients themselves, doesn't do much for reduction. But we are working toward the general 3 x 3 solution, which removes the need to re-derive the answer for each particular matrix. In this case, we can multiply row 2 by the reciprocal of the expression in the row 2, column 2 position, and likewise multiply row 3 by the reciprocal of the expression in the row 3, column 2 position, so that we have

[a11*a31 b12*a31 c13*a31 | d1*a31]
[0 1 (c23*a11 - a21*c13)/(b22*a11 - a21*b12) | (d2*a11 - a21*d1)/(b22*a11 - a21*b12)]
[0 1 (c33*a11 - a31*c13)/(b32*a11 - a31*b12) | (d3*a11 - a31*d1)/(b32*a11 - a31*b12)]

Now we can subtract row 2 from row 3, which yields a third row of

[0 0 (c33*a11 - a31*c13)/(b32*a11 - a31*b12) - (c23*a11 - a21*c13)/(b22*a11 - a21*b12)
   | (d3*a11 - a31*d1)/(b32*a11 - a31*b12) - (d2*a11 - a21*d1)/(b22*a11 - a21*b12)]
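Numerically the step is tame; on the running made-up example, rows 2 and 3 now stand at [0 1 1 | 2] and [0 4 2 | 10], and the same reciprocal-and-subtract move looks like this:

row2 = [0.0, 1.0, 1.0, 2.0]    # column 2 already holds a 1
row3 = [0.0, 4.0, 2.0, 10.0]

pivot = row3[1]
row3 = [e / pivot for e in row3]                 # -> [0.0, 1.0, 0.5, 2.5]
row3 = [e3 - e2 for e2, e3 in zip(row2, row3)]   # -> [0.0, 0.0, -0.5, 0.5]

print(row3[3] / row3[2])   # z = -1.0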
Now imagine trying to solve the 4 x 4 case! But at least here the z value is nearly solved. We could do some algebraic simplification; I'd suggest figuring out how this might occur, as it might help you down the road.
Rearranging terms, we multiply row 3 by both (b32*a11 - a31*b12) and (b22*a11 - a21*b12) to clear the two denominators, and we'll designate these expressions respectively as rt and rs, which gives a third row of

[0 0 (c33*a11 - a31*c13)*rs - (c23*a11 - a21*c13)*rt | (d3*a11 - a31*d1)*rs - (d2*a11 - a21*d1)*rt]
Now, dividing row 3 by (c33*a11 - a31*c13)*rs - (c23*a11 - a21*c13)*rt, we have

[0 0 1 | z]

with

z = ((d3*a11 - a31*d1)*rs - (d2*a11 - a21*d1)*rt) / ((c33*a11 - a31*c13)*rs - (c23*a11 - a21*c13)*rt)
Excepting the first term at the beginning, which is not in proper form, we've all but diagonalized the lower half of our matrix in this given solution. I'd leave it as an exercise to diagonalize the triangular form of the upper half of this matrix.
Basically, the idea here is to multiply one row by a given multiplier so that its factored form matches another. In the next step we need a multiplier matching the term in row 2, column 3 of this matrix. Of course, you wouldn't want this multiplier permanently applied to row 3, since that row is already solved; you apply it only so that you can subtract the scaled row 3 from row 2, leaving row 2 in the form [0 1 0]. We just proceed in reverse order now, creating multipliers of the lower rows to eliminate the terms of the upper rows. Notice you can do this once the lower rows have coefficients [0 0 1] and [0 1 0], since we needn't worry about creating new terms where we've already eliminated them when subtracting multiplier * row from these upper rows.
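Putting the forward and reverse passes together, here is a compact Python sketch of the whole procedure for an n x n augmented matrix. It divides each row by its pivot rather than cross-multiplying as above (the arithmetic is equivalent), and it assumes the rows are independent so that no pivot ends up zero:

def gauss_jordan(aug):
    # aug is an n x (n + 1) augmented matrix, one list per row
    n = len(aug)
    for i in range(n):
        pivot = aug[i][i]                     # assumed nonzero (independent rows)
        aug[i] = [e / pivot for e in aug[i]]  # put a 1 on the diagonal
        for j in range(n):
            if j != i:
                m = aug[j][i]                 # multiplier for elimination
                aug[j] = [ej - m * ei for ei, ej in zip(aug[i], aug[j])]
    return [row[-1] for row in aug]           # right-hand column = solution

system = [[2.0, 1.0, -1.0, 8.0],
          [-3.0, -1.0, 2.0, -11.0],
          [-2.0, 1.0, 2.0, -3.0]]
print(gauss_jordan(system))   # [2.0, 3.0, -1.0]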
Having come this far, I'd mention this is just one method for solving a system of linear equations. Of course, you might just as likely have the solved formulas on the right side of the equation programmed into your calculator's memory.
I'd leave it as another exercise to diagonalize the generalized 2 x 2 case, which is a lot easier and less time consuming. This also hints at why generalized solutions, outside of computer-generated ones, become magnified in terms of computational expense.
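If you want to check your 2 x 2 answer symbolically, here's a short sketch using sympy (assuming it's installed; the variable names simply follow the a11, b12, ... convention used above):

from sympy import symbols, solve

a11, b12, a21, b22, d1, d2, x, y = symbols('a11 b12 a21 b22 d1 d2 x y')

# a11*x + b12*y = d1 and a21*x + b22*y = d2
sol = solve([a11*x + b12*y - d1, a21*x + b22*y - d2], [x, y])

print(sol[x])   # (b22*d1 - b12*d2)/(a11*b22 - a21*b12), up to sympy's ordering
print(sol[y])   # (a11*d2 - a21*d1)/(a11*b22 - a21*b12), up to sympy's ordering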
I'll probably demonstrate other methods for solving such equations later, aside from the more well-known back substitution method that I showed in a previous posting involving the solution of a system of equations for a given sampling polynomial.