Next: Statistical definition of thermodynamic
Up: Boltzmann statistics
Previous: Boltzmann statistics
The foundation of statistical physics was laid towards the
end of the nineteenth century by James Clerk Maxwell, Ludwig
Boltzmann, and Josiah Willard Gibbs,
and was largely completed by Albert Einstein in 1905.
Maxwell's kinetic theory of gases can be said to represent the starting point.
Boltzmann made the argument more general and introduced the concept of ensembles.
Instead of considering a single system, he considered a large number of equivalent systems which had been prepared in the same way. He obtained probabilities for the possible states by calculating the relative frequency with which a given state occurred in the ensemble. Gibbs followed up by establishing the equivalence of statistical physics and thermodynamics. He did this by stressing an analogy with classical mechanics, which was the best understood branch of theoretical physics at the time. Finally, Einstein rounded out the picture with his theory of fluctuations, diffusion, and Brownian motion.
These developments happened before the advent of quantum mechanics.
Einstein's theory of the photoelectric effect appeared only in 1905, and a comprehensive theory of quantum mechanics became available only two decades later. However,
statistical physics becomes simpler if one can appeal to some quantum
concepts.
In classical mechanics we describe a microscopic system by specifying the
coordinates and momenta of the particles. The allowed values of these
form a continuum. The procedure of counting, however, requires discrete states. The modern way of getting around this difficulty is to consider classical mechanics as a limiting case of quantum mechanics, and we will take this approach rather than following the historical route.
Birger Bergersen
1998-09-14