<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/40ec2a56-2306-436f-87b4-ebfd92c09728/matrix.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/40ec2a56-2306-436f-87b4-ebfd92c09728/matrix.png" width="40px" /> Matrix: an $n\times m$ matrix is as follows:
$$ A=\left [\left . \overbrace{\begin{matrix} a_{11} & a_{12} & \cdots & a_{1m} \\ a_{21} & a_{22} & \cdots & a_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nm} \end{matrix}}^{m} \right \}n \right ] $$
where $a_{ij}$ is the element in the $i$th row and $j$th column
</aside>
Addition and subtraction
$$ \begin{aligned} \bold{C}&=\bold{A}\pm\bold{B} \;\Rightarrow \; c_{ij}=a_{ij}\pm b_{ij} \\ \bold{A} + \bold{B}&= \bold{B}+ \bold{A} \quad \small{\text{(commutative)}}\\ (\bold{A}+\bold{B})+\bold{C}&=\bold{A}+(\bold{B}+\bold{C}) \quad \small{\text{(associative)}}\end{aligned} $$
Scalar multiplication
$$ \begin{aligned} \bold{C}&=\lambda\bold{A} \;\Rightarrow \; c_{ij}=\lambda a_{ij} \\ \lambda\bold{A}&= \bold{A}\lambda \quad \small{\text{(commutative)}}\\ (\lambda \mu)\bold{A}&=\lambda(\mu\bold{A}) \quad \small{\text{(associative)}}\end{aligned} $$
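A minimal NumPy sketch (not part of the original notes) of the element-wise rules above; the matrices and scalars are arbitrary illustrative values.

```python
import numpy as np

# Arbitrary 2x3 example matrices (illustrative values only)
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[6, 5, 4],
              [3, 2, 1]])

# Element-wise addition: c_ij = a_ij + b_ij
C = A + B
assert np.array_equal(A + B, B + A)                     # commutative

# Scalar multiplication: c_ij = lambda * a_ij
lam, mu = 2, 3
assert np.array_equal((lam * mu) * A, lam * (mu * A))   # associative
```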
Matrix multiplication
$$ \begin{aligned} \bold{C}&=\bold{A}\bold{B} \;\Rightarrow \; c_{ij}=\sum ^m _{k=1}a_{ik}b_{kj} \\ \bold{A} \bold{B}&\ne \bold{B} \bold{A} \quad \small{\text{(non-commutative)}}\\ \bold{A}(\bold{B}+\bold{C})&=\bold{A}\bold{B}+\bold{A}\bold{C} \quad \small{\text{(distributive)}}\\ (\bold{A}\bold{B})\bold{C}&=\bold{A}(\bold{B}\bold{C}) \quad \small{\text{(associative)}}\end{aligned} $$
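A short NumPy check of the product rule and of the properties listed above; the matrices are arbitrary illustrative choices, not taken from the notes.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
D = np.array([[2, 0],
              [1, 1]])

# c_ij = sum_k a_ik b_kj
C = A @ B
c_01 = sum(A[0, k] * B[k, 1] for k in range(A.shape[1]))
assert C[0, 1] == c_01

# Generally non-commutative: A @ B differs from B @ A for this choice
print(A @ B)
print(B @ A)

# Distributive and associative properties
assert np.array_equal(A @ (B + D), A @ B + A @ D)
assert np.array_equal((A @ B) @ D, A @ (B @ D))
```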
Special operations
Transpose
$$ \bold C ^T= (\bold{AB} )^T=\bold B^T \bold A^T \; \Rightarrow \; c^T_{ij}=\sum^m_{k=1} a_{jk}b_{ki} $$
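A quick numerical check of $(\bold{AB})^T=\bold B^T \bold A^T$, using arbitrary compatible matrices chosen for this sketch.

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])      # 2x3
B = np.array([[1, 0],
              [2, 1],
              [0, 3]])         # 3x2

# Transpose of a product reverses the order of the factors
assert np.array_equal((A @ B).T, B.T @ A.T)
```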
Adjoint or Hermitian conjugate
$$ \bold A^\dagger = \overline{(\bold{A}^T)} $$
where $\overline{\bold{C}}$ denotes the element-wise complex conjugate of $\bold{C}$
🗒️ Note: a matrix is said to be Hermitian if $\bold A^\dagger= \bold A$
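A sketch of the adjoint and the Hermitian test; the Pauli matrix $\sigma_y$ used here is a standard Hermitian example chosen for this illustration, not taken from the notes above.

```python
import numpy as np

# sigma_y: a standard Hermitian matrix with complex entries
sigma_y = np.array([[0, -1j],
                    [1j,  0]])

def dagger(M):
    """Adjoint (Hermitian conjugate): complex conjugate of the transpose."""
    return M.conj().T

# sigma_y is Hermitian: A^dagger equals A
assert np.array_equal(dagger(sigma_y), sigma_y)
```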
Trace
$$ \mathrm{Tr}(\bold A)=\sum^n_{i=1}a_{ii} $$
🗒️ Notes: the trace is only defined for square matrices; it is linear, $\mathrm{Tr}(\bold A+\bold B)=\mathrm{Tr}(\bold A)+\mathrm{Tr}(\bold B)$, and cyclic, $\mathrm{Tr}(\bold{AB})=\mathrm{Tr}(\bold{BA})$
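A quick numerical check of the trace and of the cyclic property noted above, with arbitrary example matrices.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [5, 2]])

# Tr(A) = sum of diagonal elements
assert np.trace(A) == 1 + 4

# Cyclic property: Tr(AB) = Tr(BA)
assert np.trace(A @ B) == np.trace(B @ A)
```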
Determinant
$$ \det (\bold A)=\sum_{\sigma \in S_n}\text{sgn}(\sigma) \prod_{i=1}^n a_{i\sigma(i)} $$
where $S_n$ is the set of all permutations of the set $\{1,2,\ldots , n\}$
💃 Example: determinant of a $3\times 3$ matrix:
$$ \bold{A}=\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix} $$
$$ \begin{aligned}\operatorname{det}(\mathbf{A}) & =a_{11}\left|\begin{array}{ll}a_{22} & a_{23} \\a_{32} & a_{33}\end{array}\right|+a_{12}\left|\begin{array}{ll}a_{23} & a_{21} \\a_{33} & a_{31}\end{array}\right|+a_{13}\left|\begin{array}{ll}a_{21} & a_{22} \\a_{31} & a_{32}\end{array}\right| \\& =a_{11}\left(a_{22} a_{33}-a_{23} a_{32}\right)+a_{12}\left(a_{23} a_{31}-a_{21} a_{33}\right)+a_{13}\left(a_{21} a_{32}-a_{22} a_{31}\right)\end{aligned} $$
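As an illustration of the permutation (Leibniz) formula above, a deliberately naive $O(n!)$ sketch compared against `np.linalg.det`; the helper names and the test matrix are made up for this example.

```python
import numpy as np
from itertools import permutations

def sign(sigma):
    """Sign of a permutation, computed by counting inversions."""
    inv = sum(1 for i in range(len(sigma)) for j in range(i + 1, len(sigma))
              if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def det_leibniz(A):
    """Determinant from the permutation sum -- only sensible for small n."""
    n = A.shape[0]
    return sum(sign(s) * np.prod([A[i, s[i]] for i in range(n)])
               for s in permutations(range(n)))

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

assert np.isclose(det_leibniz(A), np.linalg.det(A))
```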
Inverse
$$ \begin{aligned} \bold A^{-1} \bold A&=\bold A \bold A^{-1} = \bold I \\ \bold A^{-1}&=\frac{1}{\det(\bold A)} \bold C^T \end{aligned} $$
where $\bold C^T$ is the transpose of the matrix of cofactors $\bold C$ corresponding to $\bold A$
🗒️ Note: if $\det(\bold A)=0$ then $\bold A$ is singular and has no inverse
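A sketch building $\bold A^{-1}$ from the cofactor matrix as above and comparing with `np.linalg.inv`; the helper names and the test matrix are made up for this illustration.

```python
import numpy as np

def cofactor_matrix(A):
    """Matrix of cofactors C_ij = (-1)^(i+j) * det(minor_ij)."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C

def inverse_via_cofactors(A):
    """A^{-1} = C^T / det(A); breaks down if det(A) is (close to) zero."""
    return cofactor_matrix(A).T / np.linalg.det(A)

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

assert np.allclose(inverse_via_cofactors(A), np.linalg.inv(A))
assert np.allclose(inverse_via_cofactors(A) @ A, np.eye(3))
```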
🚀 Special relativity:
4-vector:
$$ \bold{\mathrm{x}}=\begin{pmatrix} ct \\ x \\ y \\ z \end{pmatrix} $$
To transform it from an inertial frame $S$ to another inertial frame $S'$ moving at velocity $v$ in the $x$-direction relative to $S$, we apply the Lorentz transformation
$$ \begin{aligned}\mathbf{x}^{\prime} & =\boldsymbol{\Lambda}(\beta) \mathbf{x}, \\\left(\begin{array}{c}c t^{\prime} \\x^{\prime} \\y^{\prime} \\z^{\prime}\end{array}\right) & =\left(\begin{array}{cccc}\gamma & -\beta \gamma & 0 & 0 \\-\beta \gamma & \gamma & 0 & 0 \\0 & 0 & 1 & 0 \\0 & 0 & 0 & 1\end{array}\right)\left(\begin{array}{c}c t \\x \\y \\z\end{array}\right)=\left(\begin{array}{c}\gamma(c t-\beta x) \\\gamma(x-\beta c t) \\y \\z\end{array}\right)\end{aligned} $$
where $\beta= v/c$ and $\gamma=1/\sqrt{1-\beta^2}$
🗒️ Note: $\boldsymbol{\Lambda}(-\beta)\boldsymbol{\Lambda}(\beta)=\boldsymbol{\Lambda}(\beta)\boldsymbol{\Lambda}(-\beta)=\bold{I}$
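A numerical check that boosting by $\beta$ and then by $-\beta$ returns the original 4-vector; the value $\beta=0.6$ and the event coordinates are arbitrary choices for this sketch.

```python
import numpy as np

def boost_x(beta):
    """Lorentz boost along x acting on a 4-vector (ct, x, y, z)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    return np.array([[ gamma,        -beta * gamma, 0, 0],
                     [-beta * gamma,  gamma,        0, 0],
                     [ 0,             0,            1, 0],
                     [ 0,             0,            0, 1]])

beta = 0.6                            # arbitrary illustrative speed, v = 0.6c
x = np.array([1.0, 2.0, 3.0, 4.0])    # arbitrary event (ct, x, y, z)

x_prime = boost_x(beta) @ x

# The inverse boost undoes the transformation
assert np.allclose(boost_x(-beta) @ boost_x(beta), np.eye(4))
assert np.allclose(boost_x(-beta) @ x_prime, x)
```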
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/c16ad9fc-2c24-4630-8be6-2a2b656d00e3/eigenvalues.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/c16ad9fc-2c24-4630-8be6-2a2b656d00e3/eigenvalues.png" width="40px" /> Eigenvalues $\lambda$ of a matrix $\bold A$ are
$$ \bold A \vec v_\lambda =\lambda \vec v_\lambda $$
where $\vec v_\lambda$ is the corresponding eigenvector.
To find the eigenvalues of an $n\times n$ matrix we solve the characteristic equation
$$ \det(\bold A-\lambda \bold I)=0 $$
</aside>
💃 Example:
$2 \times 2$ matrix:
$$ \bold{\sigma}_x=\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} $$
take the characteristic equation
$$ \det (\bold \sigma _x - \lambda \bold I)=\det \left ( \begin{matrix} -\lambda & 1 \\ 1 & -\lambda \end{matrix} \right )=\lambda^2-1=0 $$
which has solutions $\lambda=\pm 1$
We can then solve for the corresponding eigenvectors
$$ \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}= \pm \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} $$
which gives $v_2=\pm v_1$; therefore the eigenvectors are
$$ \bold v_\pm =v_1 \begin{pmatrix} 1 \\ \pm 1\end{pmatrix} $$
🗒️ Note: in certain physical settings there are additional requirements, such as in quantum mechanics, where eigenvectors must be normalised
The condition is as follows:
$$ \left < \bold v_\pm|\bold v_\pm\right >=1 $$
where $\left < \bold v_\pm|\bold v_\pm\right >=\overline{\bold v}_\pm\cdot \bold v_\pm$ and $\overline a\equiv a^*$ denotes complex conjugation
solving we get
$$ \bold v_\pm =\frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ \pm 1\end{pmatrix} $$
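A check of this worked example with `np.linalg.eig`; NumPy returns eigenvectors as unit-norm columns, which matches the $1/\sqrt 2$ normalisation above (possibly up to an overall sign).

```python
import numpy as np

sigma_x = np.array([[0.0, 1.0],
                    [1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(sigma_x)

# Eigenvalues are +1 and -1
assert np.allclose(sorted(eigvals), [-1.0, 1.0])

# Each column v satisfies sigma_x v = lambda v and is normalised to 1
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(sigma_x @ v, lam * v)
    assert np.isclose(np.vdot(v, v).real, 1.0)
```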