Matrix operations are mathematical operations performed on matrices, which are rectangular arrays of numbers. Matrices are widely used in various fields, including mathematics, physics, computer science, and engineering.
What is a matrix?
Informally, a matrix is just a rectangular array in which we place entries; this array may be square or rectangular.
Mathematically, a matrix can be viewed as a linear transformation $A$ from $\mathbb{R}^n$ to $\mathbb{R}^p$. If $p=n$ then $A$ is called a square matrix, and if $p\neq n,$ $A$ is called a rectangular matrix. Using the standard bases of $\mathbb{R}^n$ and $\mathbb{R}^p,$ the matrix $A$ takes the following form \begin{align*}A=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1n}\\ a_{21}&a_{22}& \cdots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{p1}&a_{p2}&\cdots&a_{pn}\end{pmatrix}.\end{align*}For simplicity, we write $A=(a_{ij})_{\underset{1\le j\le n}{1\le i\le p}}$ and say that $A$ is a $p\times n$ matrix: $p$ is the number of rows, while $n$ is the number of columns.
In this section, we gather the most common matrix operations, the ones that are used every day.
Matrix Addition
Two matrices of the same size can be added together by adding their corresponding elements. For example, if A and B are both m×n matrices, the sum C = A + B is obtained by adding each element of A to the corresponding element of B.
If $$ A=\begin{pmatrix} 1&1&1\\1&1&1\\1&1&1\end{pmatrix},\quad B=\begin{pmatrix} 2&2&2\\2&2&2\\2&2&2\end{pmatrix},$$ then $$C=A+B=\begin{pmatrix} 3&3&3\\3&3&3\\3&3&3\end{pmatrix}.$$
Matrix Subtraction
Similar to addition, two matrices of the same size can be subtracted by subtracting their corresponding elements. If A and B are both m×n matrices, the difference C = A - B is obtained by subtracting each element of B from the corresponding element of A.
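For instance, with the matrices $A$ and $B$ from the addition example above, $$ A-B=\begin{pmatrix} -1&-1&-1\\-1&-1&-1\\-1&-1&-1\end{pmatrix}.$$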
Matrix Scalar Multiplication
A matrix can be multiplied by a scalar, which is a single number. Each element of the matrix is multiplied by the scalar value. If A is an m×n matrix and c is a scalar, the product C = cA is obtained by multiplying each element of A by c.
If $$ A=\begin{pmatrix} 1&2\\ 3&4\end{pmatrix}$$ then $$ 3A=\begin{pmatrix} 3&6\\ 9&12\end{pmatrix}.$$
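If you want to double-check these entrywise operations numerically, here is a minimal sketch using Python with NumPy (assuming NumPy is installed); the `+`, `-` and scalar `*` operators act entrywise on arrays.

```python
import numpy as np

# Matrices from the addition and scalar multiplication examples above
A = np.ones((3, 3))          # 3x3 matrix of ones
B = 2 * np.ones((3, 3))      # 3x3 matrix of twos

print(A + B)   # entrywise sum: 3x3 matrix of threes
print(A - B)   # entrywise difference: 3x3 matrix of minus ones

M = np.array([[1, 2],
              [3, 4]])
print(3 * M)   # scalar multiplication: [[3, 6], [9, 12]]
```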
In what follows, the field $\mathbb{K}$ will denote either the set of real numbers $\mathbb{R}$ or the set of complex numbers $\mathbb{C}$. We denote by $\mathscr{M}_{np}(\mathbb{K})$ the set of all $n\times p$ matrices $A=(a_{ij})_{\underset{1\le j\le p}{1\le i\le n}}$ with entries in $\mathbb{K}$. If $n=p,$ we set $\mathscr{M}_{nn}(\mathbb{K}):=\mathscr{M}_{n}(\mathbb{K}),$ the set of square matrices of order $n$.
Let $A$ and $B$ be two matrices in $\mathscr{M}_{np}(\mathbb{K})$ with coefficients $a_{ij}$ and $b_{ij},$ respectively, and let $\lambda \in \mathbb{K}$. We define the following matrix operations \begin{align*} A+B&=(a_{ij}+b_{ij})_{\underset{1\le j\le p}{1\le i\le n}}\cr \lambda\cdot A &=(\lambda a_{ij})_{\underset{1\le j\le p}{1\le i\le n}}.\end{align*} Then $(\mathscr{M}_{np}(\mathbb{K}),+,\cdot)$ is a vector space over $\mathbb{K}$.
Let us consider the elementary matrices $E^{ij}$ for $i=1,\cdots,n$ and $j=1,\cdots,p,$ defined in the following way: all the coefficients of $E^{ij}$ are zero except the one in position $(i,j),$ which equals $1$. We recall that the set $$\{E^{ij}:i=1,\cdots,n,\;j=1,\cdots,p\}$$ is a basis of the matrix space $\mathscr{M}_{np}(\mathbb{K})$. Then we have the following important result:
The dimension of the matrix vector space
$$ \dim(\mathscr{M}_{np}(\mathbb{K}))=np.$$
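For example, for $n=p=2$ the elementary matrices are \begin{align*} E^{11}=\begin{pmatrix}1&0\\0&0\end{pmatrix},\quad E^{12}=\begin{pmatrix}0&1\\0&0\end{pmatrix},\quad E^{21}=\begin{pmatrix}0&0\\1&0\end{pmatrix},\quad E^{22}=\begin{pmatrix}0&0\\0&1\end{pmatrix},\end{align*} and indeed every matrix $\begin{pmatrix}a&b\\c&d\end{pmatrix}=aE^{11}+bE^{12}+cE^{21}+dE^{22}$, so $\dim(\mathscr{M}_{2}(\mathbb{K}))=4=2\times 2$.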
Matrix Multiplication
Matrix multiplication is a more complex operation that combines the rows of the first matrix with the columns of the second matrix to produce a new matrix. If A is an m×n matrix and B is an n×p matrix, the product C = AB is an m×p matrix. The (i, j)-th element of C is computed by taking the dot product of the i-th row of A and the j-th column of B.
Let $A=(a_{ij})\in \mathscr{M}_{np}(\mathbb{K})$ and $B=(b_{ij})\in \mathscr{M}_{pq}(\mathbb{K})$ be two matrices. Then $C=AB=(c_{ij})\in \mathscr{M}_{nq}(\mathbb{K})$, where the entries $c_{ij}$ are given by \begin{align*}c_{ij}=\sum_{k=1}^p a_{ik}b_{kj}.\end{align*}
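For instance, with $n=2,$ $p=3,$ $q=2,$ take \begin{align*} A=\begin{pmatrix}1&2&3\\4&5&6\end{pmatrix},\quad B=\begin{pmatrix}1&0\\0&1\\1&1\end{pmatrix}.\end{align*} The entry $c_{11}$ is the dot product of the first row of $A$ with the first column of $B$: $c_{11}=1\cdot 1+2\cdot 0+3\cdot 1=4$. Computing all the entries in the same way gives \begin{align*} AB=\begin{pmatrix}4&5\\10&11\end{pmatrix}.\end{align*} Note that $BA$ is a $3\times 3$ matrix, so here $AB$ and $BA$ do not even have the same size.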
For a square matrix $A,$ the $n$-th power of $A$ is the matrix $$A^n=\underbrace{A\cdot A\cdots A}_{n\;\text{times}},\quad n\in\mathbb{N}.$$
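For example, if $$A=\begin{pmatrix}1&1\\0&1\end{pmatrix},$$ then $$A^2=\begin{pmatrix}1&2\\0&1\end{pmatrix},\quad A^3=\begin{pmatrix}1&3\\0&1\end{pmatrix},$$ and more generally $A^n=\begin{pmatrix}1&n\\0&1\end{pmatrix}$, which can be proved by induction.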
When multiplying matrices you have to be very careful because, in general, $AB\neq BA$. Take, for example, the matrices \begin{align*} A=\begin{pmatrix} 1&2\\1&0\end{pmatrix},\quad B=\begin{pmatrix}0&3\\ 1&2\end{pmatrix}.\end{align*} Then $$ AB=\begin{pmatrix}2&7\\ 0&3\end{pmatrix},\quad BA=\begin{pmatrix}3&0\\3&2\end{pmatrix}.$$ In particular, we cannot immediately use the binomial expansion formula, since that formula holds in a commutative ring and the matrix space is not commutative. We can, however, still use the binomial expansion if the matrices $A$ and $B$ satisfy $AB=BA$.
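As a quick sanity check, here is a short NumPy sketch (assuming NumPy is available) confirming that the two products above are indeed different; the `@` operator performs matrix multiplication.

```python
import numpy as np

A = np.array([[1, 2],
              [1, 0]])
B = np.array([[0, 3],
              [1, 2]])

print(A @ B)                          # [[2 7], [0 3]]
print(B @ A)                          # [[3 0], [3 2]]
print(np.array_equal(A @ B, B @ A))   # False: AB != BA
```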
Matrix operations: Matrix Transposition
The transpose of a matrix is obtained by interchanging its rows and columns. If $A$ is an m×n matrix, the transpose $A^t$ is an n×m matrix. The (i, j)-th element of $A^t$ is equal to the (j, i)-th element of $A$.
If $$ A=\begin{pmatrix} 1&5&4\\3&2&7\\0&1&8\end{pmatrix},$$ then $$ A^t=\begin{pmatrix} 1&3&0\\5&2&1\\4&7&8\end{pmatrix}.$$
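In NumPy, the transpose is available through the `.T` attribute, so the example above can be checked with a one-liner (assuming the array `A` defined below):

```python
import numpy as np

A = np.array([[1, 5, 4],
              [3, 2, 7],
              [0, 1, 8]])
print(A.T)   # [[1 3 0], [5 2 1], [4 7 8]]
```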
Matrix of a linear map
Let $\varphi: E\to F$ be a linear map, where $E$ and $F$ are finite-dimensional vector spaces of dimensions $n$ and $p$, respectively. That is, $\varphi(x+\lambda y)=\varphi(x)+\lambda \varphi(y)$ for any $x,y\in E$ and $\lambda\in\mathbb{K}$. Let $(e_1,\cdots,e_n)$ and $(f_1,f_2,\cdots,f_p)$ be bases of $E$ and $F,$ respectively. The $p\times n$ matrix $A$ associated with the linear map $\varphi$ is given by \begin{align*}A=\begin{pmatrix}\varphi(e_1)&\varphi(e_2)&\cdots&\varphi(e_n)\end{pmatrix},\end{align*}where for each $i=1,2,\cdots,n,$ the column $\varphi(e_i)$ is the vector of coordinates of $\varphi(e_i)$ in the basis $(f_1,f_2,\cdots,f_p)$.
Example: Let $\mathbb{R}_2[X]$ be the vector space of polynomials of degree less than or equal to $2$. We recall that the dimension of this space is $3$. Consider the map \begin{align*} \Phi:\mathbb{R}_2[X]\to \mathbb{R}^3,\qquad \Phi(P)=(P(-1),P(0),P(1)). \end{align*} Determine the matrix associated with $\Phi$ and prove that $\Phi$ is an isomorphism.
$\bullet$ Let us first prove that $\Phi$ is a linear transformation. In fact, let $P,Q\in \mathbb{R}_2[X]$ and $\lambda\in \mathbb{R}$. We have \begin{align*} \Phi(P+\lambda Q)&=(P(-1)+\lambda Q(-1),P(0)+\lambda Q(0),P(1)+\lambda Q(1))\cr &= (P(-1),P(0),P(1))+(\lambda Q(-1),\lambda Q(0),\lambda Q(1)) \cr &= (P(-1),P(0),P(1))+\lambda (Q(-1), Q(0), Q(1))\cr &= \Phi(P)+\lambda \Phi(Q). \end{align*}
$\bullet$ Let $B=(1,X,X^2)$ be the canonical basis of $\mathbb{R}_2[X]$ and $B'=(e_1,e_2,e_3)$ the canonical basis of $\mathbb{R}^3$. We now determine the matrix $A$ which represents the linear map $\Phi$ in these bases. To do so, we first give the coordinates of the vectors $\Phi(1),\Phi(X)$ and $\Phi(X^2)$ in the basis $B'$. We recall that $e_1=(1,0,0),\;e_2=(0,1,0),$ and $e_3=(0,0,1)$. Then \begin{align*} \Phi(1)&=(1,1,1)=e_1+e_2+e_3\cr \Phi(X)&=(-1,0,1)=-e_1+e_3\cr \Phi(X^2)&=(1,0,1)=e_1+e_3. \end{align*} Then \begin{align*} A&=\begin{pmatrix} \Phi(1)&\Phi(X)&\Phi(X^2)\end{pmatrix}\cr &= \begin{pmatrix} 1&-1&1\\1&0&0\\1&1&1\end{pmatrix}. \end{align*}
$\bullet$ Finally, we prove that $\Phi$ is an isomorphism. In fact, we know that \begin{align*}\dim\left(\mathbb{R}_2[X]\right)=3=\dim\left(\mathbb{R}^3\right).\end{align*} Hence, to prove that $\Phi$ is an isomorphism it suffices to prove that $\Phi$ is injective, which is equivalent to proving that the kernel satisfies $\ker(\Phi)=\{0\}$. If $P\in \ker(\Phi),$ then $P(-1)=P(0)=P(1)=0$. This means that $P$ has three distinct roots, which is impossible for a nonzero polynomial of degree less than or equal to $2$. Hence $P$ is the zero polynomial, so $\ker(\Phi)=\{0\}$. The map $\Phi$ is therefore injective, and hence an isomorphism.
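As an illustration, one can also verify numerically that the matrix $A$ computed above is invertible, which is another way of seeing that $\Phi$ is an isomorphism. Here is a minimal sketch with NumPy; only standard NumPy functions are used.

```python
import numpy as np

# Matrix of Phi in the bases B = (1, X, X^2) and B' = (e1, e2, e3)
A = np.array([[1, -1, 1],
              [1,  0, 0],
              [1,  1, 1]])

print(np.linalg.det(A))   # prints 2.0 (up to rounding): nonzero, so A is invertible
print(np.linalg.inv(A))   # the inverse matrix of A
```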
You may also want to read about the eigenvalues of matrices.
These are some of the fundamental matrix operations. There are other advanced operations as well, such as matrix inversion, determinant computation, and eigenvalue calculations, which are important in linear algebra and various applications of matrices.