


Fluctuations in Homogeneous Systems

One Variable

We already know the theory of fluctuations of one variable; here is just a reminder.

Professor: Do you know this?

Students: No.

Professor: Did you know this last semester?

Students: No.

Professor: Were you taught this last semester?

Students: Yes.

An Illustration of the Method of Successive Iterations

Legendre Transforms and Conjugate Variables

Suppose we have one variable x. Near equilibrium we construct the thermodynamic potential A(x) for a subsystem: the free energy of the subsystem at given x. The total free energy of the subsystem plus the environment is minimal: $A(x)+A_{\text{env}}(x_{\text{env}})\to\min$. Using a Lagrange multiplier we obtain

\begin{displaymath}
G(x)=A(x)-Xx\to\min\end{displaymath}

with the conjugate variable X; G(x) is the Legendre transform of A(x).

Probability of a Fluctuation

The probability of a fluctuation is determined by the entropy of the whole system:

\begin{displaymath}
\Prob(x)\sim\exp\bigl(S(x)/k+S_{\text{env}}(x)/k\bigr)\end{displaymath}

or, after the Legendre transform:

\begin{displaymath}
\Prob(x)\sim\exp\bigl(-G(x)/kT\bigr)\end{displaymath} (1)
Expansion: G(x) has a minimum at $x_0=\left\langle x\right\rangle$. So

\begin{displaymath}
G(x)=G(x_0)+\frac12\gamma(x-x_0)^2,\quad\gamma>0\end{displaymath}

The physical meaning of $\gamma$: since G=A-Xx, and at the minimum $X=\partial A/\partial x$,

\begin{displaymath}
\gamma = \frac{\partial^2 A}{\partial x^2} = \frac{\partial X}{\partial x}\end{displaymath}

Sometimes $\gamma^{-1}$ is called the generalized susceptibility. If x is the volume (density fluctuations), $\gamma^{-1}$ is the compressibility; if x is the entropy, $\gamma^{-1}T$ is the specific heat, etc.

Gaussian Distribution

We obtained:

\begin{displaymath}
\Prob(x)\sim\exp\bigl(-\gamma(x-x_0)^2/2kT\bigr)\end{displaymath}

Normalization:

\begin{displaymath}
\int_{-\infty}^{\infty}\Prob(x)\,dx = 1\end{displaymath}

or

\begin{displaymath}
\Prob(x)\,dx = \sqrt{\frac{\gamma}{2\pi
 kT}}\exp\left(-\gamma(x-x_0)^2/2kT\right)\, dx\end{displaymath}

Let $x_0=0$ (shift the zero of the scale!); then

\begin{displaymath}
\Prob(x)\,dx = \sqrt{\frac{\gamma}{2\pi
 kT}}\exp\left(-\gamma x^2/2kT\right)\, dx\end{displaymath}

Mean Values

Averages are:

\begin{displaymath}
\begin{aligned}
 \left\langle x\right\rangle &= \int_{-\infty}^{\infty} \Prob(x)\,x\,dx = 0\\
 \left\langle x^2\right\rangle &= \int_{-\infty}^{\infty} \Prob(x)\,x^2\,dx =
 \frac{kT}{\gamma}
 \end{aligned}\end{displaymath}
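These Gaussian averages are easy to check numerically. A minimal sketch, not part of the notes: the values of kT and gamma below are arbitrary illustration choices.

```python
import numpy as np

# Sample the distribution Prob(x) ~ exp(-gamma x^2 / 2kT) and check
# <x> = 0 and <x^2> = kT/gamma. Arbitrary illustration values:
kT = 1.5
gamma = 3.0
rng = np.random.default_rng(0)

# A Gaussian with variance kT/gamma is exactly this distribution.
x = rng.normal(loc=0.0, scale=np.sqrt(kT / gamma), size=1_000_000)

print(abs(x.mean()))        # close to 0
print(x.var(), kT / gamma)  # sample variance vs. kT/gamma
```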

Example: Landau Theory for Homogeneous System

Free energy:

\begin{displaymath}
A = A_0+V\left[aM^2+dM^4-HM\right]\end{displaymath}

Let us discuss the symmetrical phase above the critical point ($H=0$, $a>0$); in Landau theory $a=\alpha(T-T_c)$. Then

\begin{displaymath}
\gamma = 2aV\end{displaymath}

and

\begin{displaymath}
\left\langle M^2\right\rangle=\frac{kT}{\gamma}=\frac{kT}{2V\alpha(T-T_c)}\end{displaymath} (2)

At the critical point the fluctuations diverge. Note the 1/V dependence.
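The divergence and the 1/V dependence of Eq. (2) can be seen numerically; a quick sketch with made-up values of $k$, $\alpha$, $T_c$ and $V$:

```python
# Illustrate <M^2> = kT / (2 V alpha (T - T_c)) with made-up constants.
k, alpha, Tc, V = 1.0, 1.0, 100.0, 1000.0

def mean_M2(T, V):
    return k * T / (2.0 * V * alpha * (T - Tc))

# Fluctuations grow without bound as T approaches Tc from above:
for T in (110.0, 101.0, 100.1, 100.01):
    print(T, mean_M2(T, V))

# 1/V dependence: doubling the volume halves <M^2>
print(mean_M2(110.0, 2 * V) / mean_M2(110.0, V))  # 0.5
```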

Case of Many Variables

Problem

Suppose we have many variables $x_i$, $i=1, 2,\dots, n$. Let the equilibrium values be

\begin{displaymath}
\left\langle x_i\right\rangle = 0\end{displaymath}

The expansion becomes  
 \begin{displaymath}
 G = G_0 + \frac12\sum_{i,j=1}^n \gamma_{ij} x_i x_j\end{displaymath} (3)
with $\gamma_{ij}=\gamma_{ji}$ (Why?). This is a quadratic form.

Matrix form

Let

\begin{displaymath}
\mathsf{x}=
 \begin{Vmatrix}
 x_1\\ x_2\\ \hdotsfor{1}\\ x_n
 \end{Vmatrix},\quad
 \Gamma =
 \begin{Vmatrix}
 \gamma_{ij}
 \end{Vmatrix}\end{displaymath}

Then

\begin{displaymath}
G = G_0 + \frac12\mathsf{x}^{\dag}\Gamma\mathsf{x}\end{displaymath}

Linear Transformations

Let us make a linear transformation:

\begin{displaymath}
x_i = \sum_{k=1}^n t_{ik} y_k\end{displaymath}

Then

\begin{displaymath}
G=G_0+\frac12\sum_{i,k=1}^n \gamma'_{ik}y_i y_k\end{displaymath}

with

\begin{displaymath}
\gamma'_{ik} = \sum_{l,m=1}^n t_{li} \gamma_{lm} t_{mk}\end{displaymath}

In matrix form:

\begin{displaymath}
\mathsf{x} = \mathsf{T}\mathsf{y}\end{displaymath}

and

\begin{displaymath}
G = G_0 + \frac12 \mathsf{y}^{\dag}\Gamma'\mathsf{y},\quad
 \Gamma'=\mathsf{T}^{\dag}\Gamma\mathsf{T} \end{displaymath}

Diagonalization

Mathematicians prove that a symmetric matrix $\Gamma$ can be diagonalized: there exists an orthogonal matrix $\mathsf{T}$ such that

\begin{displaymath}
\Gamma'=\mathsf{T}^{\dag}\Gamma\mathsf{T} =
 \begin{Vmatrix}
 \lambda_1 & 0 & \dots & 0\\
 0 & \lambda_2 & \dots & 0\\
 \hdotsfor{4}\\
 0 & 0 & \dots & \lambda_n
 \end{Vmatrix}\end{displaymath}
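In practice this diagonalization is done numerically. A sketch using NumPy's `eigh` for a made-up symmetric $\Gamma$ (the matrix entries are arbitrary):

```python
import numpy as np

# For a real symmetric Gamma, eigh returns real eigenvalues lambda_i
# and an orthogonal T such that T^T Gamma T is diagonal.
Gamma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

lam, T = np.linalg.eigh(Gamma)          # eigenvalues and eigenvectors
Gamma_prime = T.T @ Gamma @ T           # the congruence transform above

print(np.round(Gamma_prime, 10))        # diagonal matrix of lambda_i
print(np.allclose(T.T @ T, np.eye(3)))  # T is orthogonal
```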

with real $\lambda_i$.

The quadratic form (3) becomes

\begin{displaymath}
G = G_0 + \frac12\sum_{i=1}^n \lambda_i y_i^2\end{displaymath}

The variables are completely separated: each has an independent Gaussian distribution with

\begin{displaymath}
\left\langle y_i^2\right\rangle = \frac{kT}{\lambda_i},\quad
 \left\langle y_i y_j\right\rangle = 0,\, i\ne j\end{displaymath}

The variables $y_i$ are called normal coordinates.

Return to Original Coordinates

Let us calculate $\left\langle x_i x_j\right\rangle$:

\begin{displaymath}
x_i x_j = \sum_{l,m=1}^n t_{il}y_l t_{jm}y_m\end{displaymath}

Averaging this:

\begin{displaymath}
\left\langle y_i y_j\right\rangle =
 \begin{cases}
 kT/\lambda_i, & i=j\\
 0, & i\ne j
 \end{cases}\end{displaymath}

so

\begin{displaymath}
\left\langle x_i x_j\right\rangle = \sum_{l=1}^n \frac{kT\,t_{il}t_{jl}}{\lambda_l}\end{displaymath}

Recipe for Multi-Variable Problem

1. Write down the quadratic form.
2. Switch to normal coordinates.
3. Calculate the averages.
4. Switch back.
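These four steps can be sketched numerically; a minimal example with a made-up $\Gamma$ and kT, checking the result against the closed form $\left\langle x_i x_j\right\rangle = kT(\Gamma^{-1})_{ij}$:

```python
import numpy as np

kT = 1.0
Gamma = np.array([[4.0, 1.0],
                  [1.0, 3.0]])  # step 1: the quadratic form

lam, T = np.linalg.eigh(Gamma)  # step 2: normal coordinates
y2 = kT / lam                   # step 3: <y_i^2> = kT / lambda_i

# step 4: switch back, <x_i x_j> = sum_l t_{il} t_{jl} kT / lambda_l
xx = T @ np.diag(y2) @ T.T

# Equivalent closed form: <x_i x_j> = kT (Gamma^{-1})_{ij}
print(np.allclose(xx, kT * np.linalg.inv(Gamma)))
```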


© 1997 Boris Veytsman and Michael Kotelyanskii
Tue Oct 28 22:10:23 EST 1997