🧮

Matrix Algebra and its applications in Engineering and Physics

Matrix algebra is a branch of mathematics that deals with matrices, or arrays of numbers, and their operations. It is a fundamental tool for solving linear systems of equations and is widely used in engineering and physics. In this article, we will go over some of the key concepts of matrix algebra and its applications.
Simultaneous Equations


What is a Matrix?

A matrix is an array of numbers or variables arranged in rows and columns, written inside square brackets. For example, the following matrix has two rows and two columns:

\begin{bmatrix} a & b \\ c & d \end{bmatrix}

Matrix Operations

Matrices can be added, subtracted, and multiplied, much like scalar numbers, although each operation has its own rules. For two given matrices, A and B:

A = \begin{bmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \end{bmatrix} \qquad B = \begin{bmatrix} b_{1,1} & b_{1,2} \\ b_{2,1} & b_{2,2} \end{bmatrix}

Matrix Addition

A + B is computed element by element, which is very intuitive:

A + B = \begin{bmatrix} a_{1,1} + b_{1,1} & a_{1,2} + b_{1,2} \\ a_{2,1} + b_{2,1} & a_{2,2} + b_{2,2} \end{bmatrix}

Matrix Subtraction

A - B is computed the same way, element by element:

A - B = \begin{bmatrix} a_{1,1} - b_{1,1} & a_{1,2} - b_{1,2} \\ a_{2,1} - b_{2,1} & a_{2,2} - b_{2,2} \end{bmatrix}

Matrix Multiplication

Matrix multiplication produces a matrix with the same number of rows as the first matrix and the same number of columns as the second. Each element of the result is found by multiplying the elements of a row of the first matrix with the corresponding elements of a column of the second and summing the products.
The product AB can be represented as (not as intuitive as addition and subtraction):

AB = \begin{bmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \end{bmatrix} \begin{bmatrix} b_{1,1} & b_{1,2} \\ b_{2,1} & b_{2,2} \end{bmatrix} = \begin{bmatrix} a_{1,1}b_{1,1} + a_{1,2}b_{2,1} & a_{1,1}b_{1,2} + a_{1,2}b_{2,2} \\ a_{2,1}b_{1,1} + a_{2,2}b_{2,1} & a_{2,1}b_{1,2} + a_{2,2}b_{2,2} \end{bmatrix}
Note!
  1. The first matrix must have the same number of columns as the second matrix has rows; otherwise the product is not defined.
  2. This condition does not apply to addition and subtraction, where the two matrices must instead have identical dimensions.

Try for yourself!
(Interactive calculator: enter Matrix A and Matrix B to compute the addition C = A + B, subtraction D = A - B, and multiplication E = A * B.)
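The same operations can be reproduced in Python, which CalcTree pages support. A minimal NumPy sketch is shown below; the input matrices are assumptions chosen so the printed results match those displayed by the calculator above.

```python
import numpy as np

# Assumed inputs, chosen to reproduce the calculator's displayed results
A = np.array([[10, 20],
              [30, 40]])
B = np.array([[2, 4],
              [6, 8]])

C = A + B   # element-wise addition     -> [[ 12,  24], [ 36,  48]]
D = A - B   # element-wise subtraction  -> [[  8,  16], [ 24,  32]]
E = A @ B   # matrix multiplication     -> [[140, 200], [300, 440]]

print(C, D, E, sep="\n\n")
```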
The Identity Matrix

The identity matrix is square, with ones on the main diagonal and zeros elsewhere. It is the matrix equivalent of the number 1 for scalars.
The n x n identity matrix is given by:

I_{n} = \begin{bmatrix} 1 & 0 & \dots & 0 \\ 0 & 1 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 1 \end{bmatrix}
The identity matrix appears in the definition of the matrix inverse and in the characteristic equation used to find the eigenvalues and eigenvectors of a matrix.
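As a quick sketch (the 2x2 matrix here is an assumed example), multiplying any matrix by the identity leaves it unchanged:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)                     # 2 x 2 identity matrix

print(A @ I)                      # same values as A
print(np.allclose(A @ I, A))      # True
```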

Matrix Inversion

Matrices can be inverted, but the process is not as simple as taking the reciprocal of each element. Just as a number multiplied by its reciprocal gives 1, a matrix multiplied by its inverse gives the identity matrix:

A A^{-1} = I
For a 2x2 matrix, the inverse exists only when the determinant ad - bc is non-zero, and is given by:

A^{-1} = \frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
Let's try it out!
(Interactive calculator: change the values in the input matrix to see its inverse update.)
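Here is a minimal NumPy sketch of matrix inversion; the input matrix is an assumption chosen to reproduce the inverse shown in the calculator above.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

A_inv = np.linalg.inv(A)                    # [[-2. ,  1. ], [ 1.5, -0.5]]

print(A_inv)
print(np.allclose(A @ A_inv, np.eye(2)))    # True: A times its inverse is I
```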
Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are important concepts in matrix algebra. An eigenvalue of a matrix is a scalar that describes how much the matrix stretches or shrinks certain vectors. An eigenvector of a matrix is a non-zero vector that, when multiplied by the matrix, results in a scalar multiple of itself. In other words, the eigenvector is only stretched or shrunk by the matrix transformation.

Eigenvalues and eigenvectors can be found by solving the characteristic equation of the matrix, det(A - λI) = 0, where A is the matrix, λ is the eigenvalue, I is the identity matrix and det denotes the determinant function. The eigenvalues represent the magnitude of the transformation, while the eigenvectors represent the direction of the transformation.

Mathematically, for a square matrix A with eigenvector v and eigenvalue λ, this relationship is written as:

A \mathbf{v} = \lambda \mathbf{v}
Where:

A = \text{Square Matrix} \\ \mathbf{v} = \text{Eigenvector} \\ \lambda = \text{Eigenvalue}
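A short NumPy sketch of this relationship, using an assumed symmetric 2x2 matrix whose eigenvalues are 3 and 1:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvalues 3 and 1

# Each column of `eigenvectors` is an eigenvector of A.
for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    print(np.allclose(A @ v, lam * v))         # True: A v equals lambda v
```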

Diagonalization

Diagonalization transforms a matrix into a diagonal matrix, where all off-diagonal entries are zero. A matrix is diagonalizable if and only if it has n linearly independent eigenvectors, where n is the number of columns or rows in the matrix. Diagonalization is important because it simplifies the calculation of matrix powers, exponentials, and logarithms, and also provides a basis for understanding the geometry of linear transformations.

A square matrix A is diagonalizable if it can be transformed into a diagonal matrix D using an invertible matrix P:

A = P D P^{-1}
Where:

P = \text{Eigenvector Matrix} \\ D = \text{Diagonal Eigenvalue Matrix}
Note!
  1. D is an n x n matrix with eigenvalues of A along its diagonal
  2. P is an n x n matrix whose columns are the eigenvectors corresponding to the eigenvalues in D. P must also be invertible.
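Continuing with the assumed 2x2 example from above, here is a sketch of diagonalization in NumPy and of why it makes matrix powers cheap:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)    # columns of P are eigenvectors of A
D = np.diag(eigenvalues)             # eigenvalues along the diagonal

# Reconstruct A from its eigendecomposition: A = P D P^-1
print(np.allclose(A, P @ D @ np.linalg.inv(P)))             # True

# Matrix powers become simple: A^5 = P D^5 P^-1
A_to_5 = P @ np.diag(eigenvalues ** 5) @ np.linalg.inv(P)
print(np.allclose(A_to_5, np.linalg.matrix_power(A, 5)))    # True
```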

Applications of Matrix Algebra

Matrix algebra is widely used in engineering and physics for solving linear systems of equations, calculating matrix powers, exponentials, and logarithms, and understanding the geometry of linear transformations.
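As a concrete illustration of solving a linear system in matrix form, here is a minimal NumPy sketch; the coefficients are assumed for the example.

```python
import numpy as np

# Solve the assumed simultaneous equations:
#   2x + 1y = 5
#   1x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)    # preferred over inv(A) @ b for stability
print(x)                     # [1. 3.]
```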

Engineering

In engineering, matrices are commonly used in control theory, computer graphics, and signal processing. In control theory, matrices are used to model and control systems, including linear time-invariant systems, linear time-varying systems, and nonlinear systems. In computer graphics, matrices are used to transform 3D objects and perform rotations, scaling, and translations. In signal processing, matrices are used to transform signals and perform operations such as filtering, compression, and modulation.
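For instance, a rotation is just a matrix-vector product. The sketch below rotates a 2D point by 90 degrees; the angle and point are assumed for illustration.

```python
import numpy as np

theta = np.radians(90)                           # rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # 2D rotation matrix

point = np.array([1.0, 0.0])                     # point on the x-axis
print(R @ point)                                 # ~[0, 1]: rotated onto the y-axis
```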

Physics

In physics, matrix algebra is used in quantum mechanics, general relativity, and quantum field theory. In quantum mechanics, matrices are used to describe quantum states, operators, and measurements. In general relativity, matrices are used to describe spacetime geometry and objects' motion in gravitational fields. In quantum field theory, matrices are used to describe the behaviour of particles and fields in the presence of interactions.

More specific applications:
  1. Linear transformations and rotations
  2. Modelling dynamic systems in control theory
  3. Solving differential equations
  4. Representing and manipulating data in computer science and machine learning
  5. Optimization, signal processing, and cryptography


Conclusion

Matrix algebra is a fundamental tool in engineering and physics, with applications in control theory, computer graphics, signal processing, quantum mechanics, general relativity, and quantum field theory. Understanding eigenvalues and eigenvectors, diagonalization, and matrix operations is essential for solving linear systems of equations, understanding the geometry of linear transformations, and modelling and analyzing systems in engineering and physics.


CalcTree

CalcTree, the app you're reading this on, is a calculation management platform. You can sign up and build hosted, shareable web apps (complete with an API and a web publishing module) with tools like Python and Spreadsheets. Learn more here!