Changing the basis of a vector
💼 Case: Consider a vector space $V^N$ with two orthonormal bases $\{\left | e_j \right > \}^N_{j=1}$ and $\{\left | f_j \right > \}^N_{j=1}$.
We can write the completeness relation in each basis
$$ \begin{aligned} \hat {\bold 1}&=\sum^N_{j=1}\left |e_j \right > \left < e_j \right | \\ \hat {\bold 1}&=\sum^N_{j=1}\left |f_j \right > \left < f_j \right |
\end{aligned} $$
Using this we can write a vector $|v\rangle$ in terms of the basis $\{|e_j \rangle \}_{j=1}^N$ such that
$$ |v\rangle=\hat{\bold 1} |v \rangle =\sum^N_{j=1}\,\langle e_j | v \rangle \,| e_j \rangle $$
equally we can write
$$ |v\rangle=\hat{\bold 1} |v \rangle =\sum^N_{j=1}\,\langle f_j | v \rangle \,| f_j \rangle $$
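For reference, inserting the first completeness relation into the coefficients of the second expansion shows how the two sets of components are related:
$$ \langle f_j | v \rangle=\langle f_j |\,\hat{\bold 1}\,| v \rangle =\sum^N_{k=1}\,\langle f_j | e_k \rangle \langle e_k | v \rangle $$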
💎 Conclusion: we have written the same vector in two different coordinate systems
💃 Example: a two-dimensional vector space with orthonormal basis vectors $\{|x \rangle,|y\rangle \}$
The completeness relation is $\hat {\bold 1}=| x \rangle \langle x|+ |y\rangle\langle y|$, so we can write an arbitrary vector $|v\rangle$ in this space as
$$ |v \rangle=v_x |x\rangle+v_y |y\rangle $$
where $v_x=\langle x|v \rangle$ and $v_y=\langle y|v \rangle$
if we rotate the coordinate system by $\alpha$ we get a new basis $\{|x'\rangle,|y'\rangle\}$ which can be written
$$ |x' \rangle = \cos \alpha \, |x \rangle + \sin \alpha \, | y \rangle \qquad |y'\rangle =\cos \alpha |y\rangle- \sin \alpha |x \rangle $$
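🗒️ Note: the rotated basis is still orthonormal, e.g.
$$ \langle x'|y'\rangle =\left ( \cos \alpha \, \langle x | + \sin \alpha \, \langle y | \right ) \left ( \cos \alpha \,|y\rangle- \sin \alpha \,|x \rangle \right )=-\cos\alpha\sin\alpha+\sin\alpha\cos\alpha=0 $$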
thus we can write the same vector in the new basis
$$ |v \rangle=v_{x'} |x'\rangle+v_{y'} |y'\rangle $$
To find the coefficients we do the following:
$$ \begin{aligned} |v \rangle&=v_{x'} |x'\rangle+v_{y'} |y'\rangle \\ &=v_{x'} \left (\cos \alpha \, |x \rangle + \sin \alpha \, | y \rangle \right )+v_{y'} \left ( \cos \alpha |y\rangle- \sin \alpha |x \rangle \right ) \\ &=\underbrace{(v_{x'}\cos\alpha-v_{y'}\sin\alpha)}_{v_x}\,|x\rangle+\underbrace{(v_{x'}\sin\alpha+v_{y'}\cos\alpha)}_{v_y}\,|y\rangle \\
\end{aligned} $$
solving the system for $v_{x'}$ and $v_{y'}$ we get
$$ v_{x'}=v_x\cos\alpha+v_y\sin\alpha \qquad v_{y'}=-v_x\sin\alpha+v_y\cos\alpha $$
🗒️ Note: this is what we would expect from applying a rotation matrix, as written out below
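Written in matrix form, the coefficient transformation reads
$$ \begin{pmatrix} v_{x'} \\ v_{y'} \end{pmatrix} = \begin{pmatrix} \cos\alpha & \sin\alpha \\ -\sin\alpha & \cos\alpha \end{pmatrix} \begin{pmatrix} v_x \\ v_y \end{pmatrix} $$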
💃 Example: Consider a vector $|v \rang = |x \rang + 2 |y\rang$ in a vector space $\R^2$ which has orthonormal basis $\{|x\rang ,|y\rang\}$
Consider a new orthogonal basis (see the note below)
$$ \begin{aligned} |u \rang &=|x \rang+|y\rang \\ |x \rang &=\frac 12(|u \rang+|w\rang)
\end{aligned} \qquad \begin{aligned} |w \rang &=|x \rang-|y\rang \\ |y \rang &=\frac 12(|u \rang-|w\rang)
\end{aligned} $$
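🗒️ Note: these basis vectors are orthogonal but not normalized, since
$$ \lang u | w \rang = \lang x|x\rang - \lang y|y \rang=0 \qquad \lang u|u\rang=\lang w|w \rang =2 $$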
thus we can write $|v\rang$ as
$$ \begin{aligned} |v \rang &= |x \rang + 2 |y\rang \\ &=\frac 12(|u \rang+|w\rang)+2\frac 12(|u \rang-|w\rang) \\ &= \frac 32 |u \rang - \frac 12 |w \rang \end{aligned} $$
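Substituting the definitions of $|u\rang$ and $|w\rang$ back in confirms that this is the same vector:
$$ \frac 32 |u \rang - \frac 12 |w \rang= \frac 32 (|x \rang + |y\rang) - \frac 12 (|x \rang - |y\rang)=|x\rang+2|y\rang $$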
🗒️ Note: to check that two expansions describe the same vector, we can compare inner products.
For two orthonormal bases $\{|e_j \rang \}^N_{j=1}$ and $\{|f_j \rang \}^N_{j=1}$, expand the vector as
$$ |v \rangle=\sum^N_{j=1} v_j \, |e_j \rangle = \sum ^N_{j=1} c_j \, |f_j \rangle $$
Taking the inner product of $|v\rang$ with itself in each basis
$$ \begin{aligned} \lang v | v \rang &= \left ( \sum^N_{j=1}v_j \,|e_j \rang\right )^\dagger\left ( \sum^N_{k=1}v_k \,|e_k \rang\right )= \sum^N_{j,k=1}v_j^* v_k \lang e_j |e_k \rang=\sum^N_{j,k=1} v_j^* v_k \,\delta_{jk}=\sum^N_{j=1} |v_j|^2 \\ \lang v | v \rang &= \left ( \sum^N_{j=1}c_j \, |f_j \rang\right )^\dagger\left ( \sum^N_{k=1}c_k \,|f_k \rang\right )= \sum^N_{j,k=1}c_j^* c_k \lang f_j |f_k \rang=\sum^N_{j,k=1} c_j^* c_k \,\delta_{jk}=\sum^N_{j=1} |c_j|^2 \end{aligned} $$
The relation we want to show then follows by inserting the completeness relation:
$$ \begin{aligned} \sum^N_{j=1}|v_j|^2&=\sum^N_{j=1}|c_j|^2 \\ \sum^N_{j=1}\lang v | \underbrace{e_j \rang \lang e_j}_{\hat{\bold 1}} | v \rang &= \sum^N_{j=1}\lang v | \underbrace{f_j \rang \lang f_j}_{\hat{\bold 1}} | v \rang \\ \lang v|v \rang &=\lang v|v \rang
\end{aligned} $$
💼 Case: Consider a linear operator $\hat A$ which acts on a vector space $V^N$ with orthonormal bases $\{|e_j \rang \}^N_{j=1}$ and $\{|f_j \rang \}^N_{j=1}$.
Applying the completeness relation
$$ \begin{aligned} \hat A = \hat{\bold 1} \hat A \hat{\bold 1}&=\left ( \sum^N_{j=1}|e_j \rang \lang e_j | \right ) \hat A \left ( \sum^N_{k=1}|e_k \rang \lang e_k | \right ) \\ &=\sum^N_{j,k=1} \lang e_j | \hat A | e_k \rang \, |e_j \rang \lang e_k |
\end{aligned} $$
similarly, in the $\{|f_j \rang\}$ basis,
$$ \hat A=\sum^N_{j,k=1} \lang f_j | \hat A | f_k \rang \, |f_j \rang \lang f_k | $$
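Inserting the completeness relation for $\{|e_j \rang\}$ on both sides of $\hat A$ relates the matrix elements in the two bases:
$$ \lang f_j | \hat A | f_k \rang = \sum^N_{m,n=1} \lang f_j | e_m \rang \, \lang e_m | \hat A | e_n \rang \, \lang e_n | f_k \rang $$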