⚽ Goals:
What is entropy?
Why does it increase?
Can we predict the equations of state of a system from first principles?
For an isolated system we obtain the correct macroscopic result by taking an ensemble average over all accessible microstates, assigning each of them equal probability.
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/0cf73b33-2ce5-4d79-960b-65bd533ff934/Ensemble_average.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/0cf73b33-2ce5-4d79-960b-65bd533ff934/Ensemble_average.png" width="40px" />
Ensemble average $\langle X \rangle$: if the value of some quantity $X$ in the $i$th microstate is $X_i$ and its probability is $p_i$, then the value of $X$ in the macrostate is

$$ \langle X \rangle = \sum_i p_i X_i $$
</aside>
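As a quick numerical sketch of the definition above, here is the ensemble average of a made-up three-state system (the values $X_i$ and probabilities $p_i$ are purely illustrative):

```python
import numpy as np

# Hypothetical three-state system; the numbers are made up for illustration.
X = np.array([1.0, 2.0, 3.0])   # value of X in microstate i
p = np.array([0.5, 0.3, 0.2])   # probability p_i of microstate i (sums to 1)

avg = np.sum(p * X)             # <X> = sum_i p_i X_i
print(avg)                      # 1.7
```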
🗒️ Note: we can think of an ensemble as a collection of copies of the system, each in one of the allowed microstates.
📜 Definitions
- $\nu$: the number of copies
- $\nu_i$: the number of copies in microstate $i$
- $\Omega$: the number of accessible microstates
- $p_i = \nu_i/\nu$: the probability that a copy is in microstate $i$
- $n$, $N$: number of atoms
🗒️ Note: $\nu \gg \Omega,n, N$
If we use $\lambda=1,\ldots,\nu$ to label the copies and $i$ to label the microstates
$$ \langle X \rangle =\frac 1 \nu \sum_\lambda X_\lambda = \frac 1 \nu \sum_i \nu_i X_i = \sum_i p_i X_i $$
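The equality above just regroups one sum: averaging over copies $\lambda$ gives the same result as grouping copies by microstate and weighting by $p_i = \nu_i/\nu$. A small numerical check, with an assumed ensemble of $\nu = 100{,}000$ copies distributed at random over three microstates:

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 100_000                            # number of copies nu (assumed value)
X_states = np.array([0.0, 1.0, 2.0])    # X_i for three microstates

# Assign each copy lambda a microstate at random.
state_of_copy = rng.integers(0, 3, size=nu)

# (1/nu) sum_lambda X_lambda : average over copies
avg_copies = X_states[state_of_copy].mean()

# Regroup the same sum by microstate: nu_i copies in state i, p_i = nu_i / nu
nu_i = np.bincount(state_of_copy, minlength=3)
p_i = nu_i / nu
avg_states = np.sum(p_i * X_states)

print(np.isclose(avg_copies, avg_states))  # True: identical sum, regrouped
```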
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/44220444-3486-4b9a-8abd-ab0f8b48586a/Microcanonical_ensemble.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/44220444-3486-4b9a-8abd-ab0f8b48586a/Microcanonical_ensemble.png" width="40px" /> Microcanonical ensemble: an isolated system with fixed $N,V,E$. All microstates that the system can possibly occupy have the same energy and are equally probable.
</aside>
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/083a57c5-7991-4571-b077-06be1d6e683d/Canonical_ensemble.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/083a57c5-7991-4571-b077-06be1d6e683d/Canonical_ensemble.png" width="40px" /> Canonical ensemble: Thermal equilibrium with a heat bath at a fixed temperature $T$. $N,V,T$ fixed but not $E$.
</aside>
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/f7117bb0-7fb0-4fa8-aebe-c18284f96c09/Grand_canonical_ensemble.png" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/369dfa6b-d4d9-4cf2-a446-e369553b6347/f7117bb0-7fb0-4fa8-aebe-c18284f96c09/Grand_canonical_ensemble.png" width="40px" /> Grand canonical ensemble: in this case $T,\mu,V$ are fixed but $N,E$ are not.
</aside>
💼 Case: consider an isolated system ($E,V,N$ constant)
The probability of the system being in any one microstate is
$$ p_i=\frac 1 \Omega \qquad \sum_i p_i=\Omega \frac 1 \Omega =1 $$
🗒️ Note: this postulate of equal a priori probabilities has never been proved mathematically; it is assumed to be true and is validated by the success of its predictions.
💃 Example: assume there are $6\times 6 = 36$ particles, each of which can be in one of 2 states, represented by blue or green circles.
In this situation $\Omega=2^{6\times 6}\simeq 6.87\times10^{10}$, and each microstate has probability $p_i=(1/2)^{36}\simeq 1.46\times 10^{-11}$.
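These numbers follow directly from the counting; a one-line check for the 36-particle example:

```python
N = 36           # 6 x 6 particles, each in one of 2 states
Omega = 2 ** N   # total number of microstates
p_i = 1 / Omega  # equal a priori probability of each microstate

print(Omega)     # 68719476736, about 6.87e10
print(p_i)       # about 1.46e-11
```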
In this situation the macrostates would look like
These colours are produced by a particular mixture of blue and green. To count the combinations that give each colour, let $n$ be the number of green particles (the choice of green over blue is arbitrary) out of $N$ particles in total; the probability of the macrostate with $n$ green particles is then

$$ p(n) = \binom{N}{n}\left(\frac 1 2\right)^N $$
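This binomial counting can be checked numerically for the 36-particle example: the half-green macrostate is the most probable, and the macrostate probabilities sum to 1.

```python
from math import comb

N = 36  # total particles in the 6 x 6 example

def p_macro(n: int) -> float:
    """Probability of the macrostate with n green particles:
    p(n) = C(N, n) / 2^N, since all 2^N microstates are equally likely."""
    return comb(N, n) / 2 ** N

print(p_macro(N // 2) > p_macro(5))           # True: n = N/2 is most probable
print(sum(p_macro(n) for n in range(N + 1)))  # 1.0 up to rounding
```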