Statistical physics

Statistical physics, or statistical mechanics, is a branch of physics that uses probability theory to deduce the behavior of macroscopic physical systems from hypotheses about the elements or particles that make them up. Macroscopic systems are those with a number of particles on the order of Avogadro's number, approximately 6.022 × 10²³, so the size of such systems is usually easily conceivable by humans, even though each constituent particle is of atomic scale. A glass of water is an example of a macroscopic system. Statistical techniques are essential for studying these systems because, for systems this large, it is impossible even for the most advanced computers to track the physical state of each particle and predict the behavior of the system through the laws of mechanics; moreover, it is impractical to obtain that much information about a real system. The purpose of statistical physics is to link the microscopic behavior of systems with their macroscopic behavior, so that, knowing the behavior of one, details of the behavior of the other can be found out. It allows the description of numerous stochastic phenomena such as nuclear reactions and biological, chemical, and neurological systems, among others.

Application examples

Empirically, thermodynamics has studied gases and established their macroscopic behavior with a high degree of accuracy. Thanks to statistical physics it is possible to deduce the thermodynamic laws that govern the macroscopic behavior of a gas, such as the ideal gas equation of state or the Boyle-Mariotte law, from the assumption that the particles in the gas are not subject to any potential and move freely with a kinetic energy equal to E = (1/2)mv², colliding elastically with each other and with the walls of the container. The macroscopic behavior of the gas then depends on only a few macroscopic variables (such as pressure, volume, and temperature). This particular approach to studying the behavior of gases is called kinetic theory.

To predict the behavior of a gas, mechanics would require calculating the exact trajectory of each of the particles that compose it, which is an intractable problem. Thermodynamics does something radically opposite: it establishes principles qualitatively different from those of mechanics in order to study a set of macroscopic properties without asking at all about the real nature of the underlying matter. Statistical mechanics mediates between the two approaches: it ignores the individual behavior of the particles and concerns itself instead with averages. In this way the thermodynamic properties of a gas can be calculated from generic knowledge of the molecules that compose it by applying mechanical laws.
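The link between the averaged microscopic picture and the macroscopic gas laws can be illustrated numerically. The following is a minimal sketch (the particle mass and other parameters are illustrative choices, not from the text): each velocity component of an ideal-gas particle is sampled from the Maxwell-Boltzmann (Gaussian) distribution at temperature T, and the pressure computed from the kinetic-theory relation P = Nm⟨v²⟩/(3V) is compared with the ideal gas law P = Nk_BT/V.

```python
import math
import random

# Sketch of kinetic theory: for an ideal gas, pressure follows from the
# momentum transfer of particles hitting the walls, P = N m <v^2> / (3 V).
# Sampling velocities from the Maxwell-Boltzmann distribution at
# temperature T should reproduce the ideal gas law P V = N k_B T.

k_B = 1.380649e-23   # Boltzmann constant, J/K
m   = 6.6e-27        # particle mass, kg (roughly a helium atom; illustrative)
T   = 300.0          # temperature, K
N   = 100_000        # number of sampled particles
V   = 1e-3           # volume, m^3

random.seed(0)
sigma = math.sqrt(k_B * T / m)       # std. dev. of each velocity component

# <v^2> = <vx^2> + <vy^2> + <vz^2> = 3 <vx^2>, estimated by sampling
v2_mean = sum(3 * random.gauss(0, sigma) ** 2 for _ in range(N)) / N

P_kinetic = N * m * v2_mean / (3 * V)   # pressure from kinetic theory
P_ideal   = N * k_B * T / V             # pressure from the ideal gas law

print(P_kinetic, P_ideal)  # agree to within sampling noise
```

With 100,000 samples the two pressures agree to well under a percent, showing how a macroscopic law emerges from averaging over microscopic motion.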

History

In the 18th century Daniel Bernoulli applied statistical reasoning to explain the behavior of fluid systems. The 1850s marked a milestone in the study of thermal systems. In those years thermodynamics, which had grown essentially through the experimental study of the macroscopic behavior of physical systems from the work of Nicolas Léonard Sadi Carnot, James Prescott Joule, Clausius, and Kelvin, was an established discipline of physics. The theoretical conclusions drawn from the first two laws of thermodynamics agreed with the experimental results. At the same time, the kinetic theory of gases, which until then had rested more on speculation than on calculation, began to emerge as a genuine mathematical theory. However, it was not until 1872, when Ludwig Boltzmann developed his H-theorem, that the direct link between entropy and molecular dynamics was established. At practically the same time, kinetic theory began to give birth to its sophisticated successor: ensemble theory. The power of the techniques that eventually emerged reduced thermodynamics from "essential" to a consequence of statistically treating large numbers of particles operating under the laws of classical mechanics. It was natural, therefore, that this new discipline ended up being called statistical mechanics or statistical physics.

Application in other fields

Statistical mechanics can be built on the laws of classical mechanics or of quantum mechanics, depending on the nature of the problem to be studied. Its techniques can also be applied to fields outside of physics itself, such as economics. Thus, statistical physics has been used to deduce income distributions, and the Pareto distribution for high incomes can be derived by statistical mechanics, assuming a steady-state equilibrium for them (see econophysics).

Statistical-thermodynamic relationship

The relationship between microscopic states and macroscopic states (i.e. thermodynamics) is given by Ludwig Boltzmann's famous formula for entropy:

S = k_B ln Ω

where Ω is the number of microscopic states compatible with a given energy, volume, and number of particles, and k_B is the Boltzmann constant. On the left-hand side we have thermodynamics, through the entropy defined in terms of its natural variables, which gives complete thermodynamic information about the system. On the right-hand side are the microscopic configurations that define the entropy through this formula. These configurations are obtained from the model we make of the real system through its mechanical Hamiltonian. This relationship, proposed by Ludwig Boltzmann, was not initially accepted by the scientific community, in part because it implicitly assumes the existence of atoms, which had not yet been demonstrated. That reaction from the scientific community, it is said, contributed to the despondency that led Boltzmann to take his own life. Currently this expression is not the most convenient for practical calculations; it is the so-called bridge equation of the microcanonical ensemble. There are other ensembles, such as the canonical ensemble and the grand canonical ensemble.
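Boltzmann's formula can be made concrete by explicitly counting microstates for a toy model. The sketch below (the two-level spin system and all names are our own illustration, not from the text) counts the microstates Ω = C(N, n) of N two-state particles with n of them "up", and evaluates S = k_B ln Ω for each macrostate:

```python
import math

# Toy model: N distinguishable two-state particles ("spins"), n of them up.
# The number of microstates compatible with the macrostate n is the
# binomial coefficient Omega = C(N, n); Boltzmann's formula then gives
# the entropy S = k_B * ln(Omega).

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n):
    """Boltzmann entropy of the macrostate with n 'up' spins out of N."""
    omega = math.comb(N, n)        # number of compatible microstates
    return k_B * math.log(omega)

N = 100
entropies = [entropy(N, n) for n in range(N + 1)]

# The macrostate with the most microstates (n = N/2) has maximal entropy,
# which is why it is the most probable macrostate of the isolated system.
print(max(range(N + 1), key=lambda n: entropies[n]))  # -> 50
```

The fully ordered macrostates (n = 0 or n = N) have Ω = 1 and hence S = 0, while the balanced macrostate has by far the most microstates and the largest entropy.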

Fundamental postulate

The fundamental postulate of statistical mechanics, also known as the postulate of equal a priori probabilities, is the following: given an isolated system in equilibrium, the system has the same probability of being in any of its accessible microstates. This fundamental postulate is crucial to statistical mechanics: it asserts that a system in equilibrium has no preference for any of the microstates available for that equilibrium. If Ω is the number of microstates available at a certain energy, then the probability of finding the system in any one of those microstates is p = 1/Ω. The postulate is necessary in order to affirm that, given a system in equilibrium, the thermodynamic state (macrostate) associated with the greatest number of microstates is the most probable macrostate of the system. It can be linked to the information-theoretic function

I = Σ_i ρ_i ln ρ_i

where ρ_i is the probability of microstate i. When all the ρ_i are equal, the information function I reaches a minimum. Thus the most probable macrostate is also always the one for which there is minimal information about the microstate of the system. It follows that in an isolated system in equilibrium the entropy is maximal (entropy can be considered a measure of disorder: the greater the disorder, the greater the lack of information and, therefore, the lower the value of I).

Entropy as disorder

In most thermodynamics books, entropy is interpreted as a measure of the disorder of the system. In fact, the second law of thermodynamics is sometimes stated as: the disorder of an isolated system only increases. It is important to note that this relationship comes, as we have just seen, from statistical mechanics. Thermodynamics is not capable of establishing it by itself, since it does not concern itself at all with microscopic states. In this sense, statistical mechanics is capable of deriving thermodynamics: starting from more elementary principles (namely, mechanics), it obtains the second law by statistical deduction. This was Ludwig Boltzmann's great mathematical contribution to thermodynamics.1
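The claim that the uniform distribution minimizes the information function can be checked directly. The following minimal sketch (the distributions are arbitrary illustrative choices) evaluates I = Σᵢ ρᵢ ln ρᵢ for a uniform and a biased assignment of microstate probabilities:

```python
import math

# Information function I = sum_i rho_i * ln(rho_i) over microstate
# probabilities rho_i.  For a fixed number of microstates, I is minimized
# (and the entropy S = -k_B * I is maximized) by the uniform distribution
# rho_i = 1/Omega, in line with the equal a priori probability postulate.

def info(rhos):
    """Information function I = sum rho ln rho (rho = 0 terms contribute 0)."""
    return sum(r * math.log(r) for r in rhos if r > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # equal a priori probabilities
biased  = [0.70, 0.10, 0.10, 0.10]   # some microstates preferred

print(info(uniform) < info(biased))  # -> True: the uniform case minimizes I
```

For Ω equiprobable microstates, I = ln(1/Ω), so minimizing I is the same as maximizing the entropy S = k_B ln Ω of the previous section.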

Calculation procedures

The modern formulation of the theory is based on describing the physical system by a set of ensembles, each representing the totality of possible configurations together with the probability of realizing each configuration. Each ensemble is associated with a partition function from which, by mathematical manipulation, the thermodynamic values of the system can be extracted. According to the relationship of the system with the rest of the universe, three types of ensemble are generally distinguished, in increasing order of complexity: the microcanonical ensemble describes a completely isolated system, therefore with constant energy, which exchanges neither energy nor particles with the rest of the universe; the canonical ensemble describes a system in thermal equilibrium with an external heat reservoir, so it can exchange energy, in the form of heat, with its surroundings; the grand canonical ensemble replaces the canonical ensemble for open systems that also allow the exchange of particles with the outside.
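As a concrete sketch of how a partition function yields thermodynamic values, the code below (the three energy levels are an arbitrary illustrative choice) builds the canonical partition function Z = Σᵢ exp(−Eᵢ/k_BT) for a small system in contact with a heat reservoir, then extracts the occupation probabilities and the mean energy:

```python
import math

# Canonical ensemble for a toy three-level system: each configuration i
# gets the Boltzmann weight exp(-E_i / (k_B T)); the partition function Z
# normalizes the weights, and averages such as the mean energy <E>
# follow directly from the resulting probabilities.

k_B = 1.380649e-23              # Boltzmann constant, J/K
T = 300.0                       # temperature of the external reservoir, K
energies = [0.0, 4e-21, 8e-21]  # assumed energy levels, J (illustrative)

beta = 1.0 / (k_B * T)
weights = [math.exp(-beta * E) for E in energies]
Z = sum(weights)                         # canonical partition function
probs = [w / Z for w in weights]         # probability of each level
E_mean = sum(p * E for p, E in zip(probs, energies))  # mean energy <E>

print(Z, E_mean)  # lower levels are more probable; <E> lies between 0 and E_max
```

The same pattern generalizes: once Z is known as a function of temperature and the other ensemble variables, quantities such as free energy, entropy, and heat capacity follow by differentiation.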
