QUANTUM MECHANICS, a branch of theoretical physics; a system of concepts and a mathematical apparatus needed to describe physical phenomena that are due to the existence in nature of a smallest quantum of action h (Planck's constant). The numerical value h = 6.62607∙10⁻³⁴ J∙s (and that of the other frequently used quantity ħ = h/2π = 1.05457∙10⁻³⁴ J∙s) is extremely small, but the very fact that it is finite fundamentally distinguishes quantum phenomena from all others and determines their main features. Quantum phenomena include radiation processes, the phenomena of atomic and nuclear physics, condensed matter physics, chemical bonding, etc.

The history of the creation of quantum mechanics. Historically, the first phenomenon for whose explanation the concept of the quantum of action h was introduced (in 1900) was the radiation spectrum of a perfectly black body, i.e., the dependence of the intensity of thermal radiation on its frequency ν and on the temperature T of a heated body. At first, the connection of this phenomenon with processes occurring in the atom was not clear; at that time the very idea of the atom was not generally accepted, although observations were already known that pointed to a complex intra-atomic structure.

In 1802, W. Wollaston discovered narrow spectral lines in the spectrum of solar radiation, which were described in detail by J. Fraunhofer in 1814. In 1859, G. Kirchhoff and R. Bunsen established that each chemical element has an individual set of spectral lines, and the Swiss scientist J. J. Balmer (1885), the Swedish physicist J. Rydberg (1890) and the German scientist W. Ritz (1908) found certain regularities in their arrangement. In 1896, P. Zeeman observed the splitting of spectral lines in a magnetic field (the Zeeman effect), which H. A. Lorentz explained the following year by the motion of an electron in the atom. The existence of the electron was proved experimentally in 1897 by J. J. Thomson.

Existing physical theories proved insufficient to explain the laws of the photoelectric effect: it turned out that the energy of the electrons emitted from a substance irradiated with light depends only on the frequency of the light ν, and not on its intensity (A. G. Stoletov, 1889; P. von Lenard, 1904). This fact completely contradicted the then generally accepted wave nature of light, but was naturally explained on the assumption that light propagates in the form of quanta of energy E = hν (A. Einstein, 1905), later called photons (G. Lewis, 1926).

Within 10 years after the discovery of the electron, several models of the atom were proposed, which, however, were not supported by experiments. In 1909-11, E. Rutherford, studying the scattering of α-particles on atoms, established the existence of a compact positively charged nucleus in which practically the entire mass of the atom is concentrated. These experiments became the basis of the planetary model of the atom: a positively charged nucleus around which negatively charged electrons revolve. Such a model, however, contradicted the fact of the stability of the atom, since it followed from classical electrodynamics that after a time of the order of 10⁻⁹ s a revolving electron would fall onto the nucleus, losing its energy to radiation.

In 1913, N. Bohr suggested that the stability of the planetary atom is explained by the finiteness of the quantum of action h. He postulated that there are stationary orbits in the atom, on which the electron does not radiate (Bohr's first postulate), and singled out these orbits from all possible ones by a quantization condition: 2πmυr = nh, where m is the electron mass, υ is its orbital velocity, r is the distance to the nucleus, and n = 1,2,3,... are integers. From this condition Bohr determined the energies Eₙ = -me⁴/2ħ²n² (e is the electric charge of the electron) of the stationary states, as well as the diameter of the hydrogen atom (about 10⁻⁸ cm), in full accordance with the conclusions of the kinetic theory of matter.
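As a numerical orientation, the sketch below (SI units, CODATA-style constants; the variable names are chosen only for this illustration) reproduces the familiar -13.6 eV ground-state energy and the ~10⁻⁸ cm size of the hydrogen atom from the Bohr formula:

# Numerical check of the Bohr model (SI units): E_n = -m e^4 / (8 eps0^2 h^2 n^2)
import math

h    = 6.62607015e-34    # Planck's constant, J*s
hbar = h / (2 * math.pi)
m_e  = 9.1093837015e-31  # electron mass, kg
e    = 1.602176634e-19   # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

def E_n(n):
    # energy of the n-th stationary state of hydrogen, in electronvolts
    return -m_e * e**4 / (8 * eps0**2 * h**2 * n**2) / e

a0 = 4 * math.pi * eps0 * hbar**2 / (m_e * e**2)   # Bohr radius, m

print(E_n(1))           # ~ -13.6 eV, the ground state
print(E_n(2) - E_n(1))  # ~ 10.2 eV, the first excitation energy
print(2 * a0)           # atomic diameter ~ 1e-10 m = 1e-8 cm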

Bohr's second postulate stated that radiation occurs only during transitions of electrons from one stationary orbit to another, and that the radiation frequency ν_nk of the transition from the state Eₙ to the state E_k is equal to ν_nk = (Eₙ - E_k)/h (see Atomic physics). Bohr's theory naturally explained the regularities in the spectra of atoms, but its postulates were in obvious contradiction with classical mechanics and the theory of the electromagnetic field.

In 1922, A. Compton, studying the scattering of X-rays by electrons, found that the incident and scattered X-ray quanta of energy behave like particles. In 1923, C. T. R. Wilson and D. V. Skobeltsyn observed the recoil electron in this reaction and thereby confirmed the corpuscular nature of X-rays (and of nuclear γ-radiation). This, however, contradicted the experiments of M. von Laue, who as early as 1912 had observed the diffraction of X-rays and thereby proved their wave nature.

In 1921, the German physicist C. Ramsauer discovered that at a certain energy electrons pass through gases practically without scattering, like light waves in a transparent medium. This was the first experimental evidence of the wave properties of the electron, whose reality was confirmed in 1927 by the direct experiments of C. J. Davisson, L. Germer and G. P. Thomson.

In 1923, L. de Broglie introduced the concept of matter waves: each particle with mass m and velocity υ can be associated with a wave of length λ = h/mυ, just as each wave with frequency ν = c/λ can be associated with a particle of energy E = hν. A generalization of this hypothesis, known as wave-particle duality, became the foundation and universal principle of quantum physics. Its essence is that the same objects of study manifest themselves in two ways: either as a particle or as a wave, depending on the conditions of their observation.

The relationships between the characteristics of a wave and of a particle were established even before the creation of quantum mechanics: E = hν (1900) and λ = h/mυ = h/p (1923), where the frequency ν and the wavelength λ are characteristics of the wave, while the energy E, mass m, velocity υ and momentum p = mυ are characteristics of the particle; the connection between these two types of characteristics is effected through Planck's constant h. The duality relations are expressed most clearly in terms of the circular frequency ω = 2πν and the wave vector k = 2π/λ:

E = ħω, p = ħk.
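For orientation, a short numerical sketch of the duality relations λ = h/p and E = hν (the electron speed and the light wavelength are arbitrary illustrative values):

import math

h   = 6.62607015e-34    # Planck's constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
c   = 2.99792458e8      # speed of light, m/s
eV  = 1.602176634e-19   # J

v = 1.0e6                     # an electron moving at 10^6 m/s (non-relativistic)
p = m_e * v
lam = h / p                   # de Broglie wavelength
print(lam)                    # ~ 7.3e-10 m, comparable to atomic sizes

nu = c / 500e-9               # frequency of green light, wavelength 500 nm
print(h * nu / eV)            # ~ 2.5 eV, the corresponding photon energy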

A clear illustration of the wave-particle duality is shown in Figure 1: the diffraction rings observed in the scattering of electrons and X-rays are almost identical.

Quantum mechanics, the theoretical basis of all quantum physics, was created in less than three years. In 1925, W. Heisenberg, relying on Bohr's ideas, proposed matrix mechanics, which by the end of that year had acquired the form of a complete theory in the works of M. Born, the German physicist P. Jordan and P. Dirac. The main objects of this theory are matrices of a special kind, which in quantum mechanics represent the physical quantities of classical mechanics.

In 1926, E. Schrödinger, proceeding from L. de Broglie's ideas about matter waves, proposed wave mechanics, where the main role is played by the wave function of the quantum state, which obeys a second-order differential equation with given boundary conditions. Both theories equally well explained the stability of the planetary atom and made it possible to calculate its main characteristics. In the same year, M. Born proposed the statistical interpretation of the wave function, Schrödinger (and, independently, W. Pauli and others) proved the mathematical equivalence of matrix and wave mechanics, and Born, together with N. Wiener, introduced the concept of the operator of a physical quantity.

In 1927, W. Heisenberg discovered the uncertainty relation, and N. Bohr formulated the complementarity principle. The discovery of electron spin (G. Uhlenbeck and S. Goudsmit, 1925) and the derivation of the Pauli equation, which takes the electron spin into account (1927), completed the logical and computational scheme of nonrelativistic quantum mechanics, and P. Dirac and J. von Neumann presented quantum mechanics as a conceptually complete independent theory based on a limited set of concepts and postulates, such as operator, state vector, probability amplitude, superposition of states, etc.

Basic concepts and formalism of quantum mechanics. The fundamental equation of quantum mechanics is the Schrödinger wave equation, whose role is similar to that of Newton's equations in classical mechanics and Maxwell's equations in electrodynamics. In the space of the variables x (coordinate) and t (time) it has the form

iħ ∂ψ(x,t)/∂t = Ĥψ(x,t),

where Ĥ is the Hamilton operator; its form coincides with that of the Hamiltonian of classical mechanics, in which the coordinate x and momentum p are replaced by the operators x̂ and p̂ of these variables, i.e.

Ĥ = p̂²/2m + V(x̂) = -(ħ²/2m) ∂²/∂x² + V(x),

where V(x) is the potential energy of the system.

In contrast to Newton's equation, from which one finds the observable trajectory x(t) of a material point moving in the field of forces of the potential V(x), from the Schrödinger equation one finds the unobservable wave function ψ(x) of a quantum system, with whose help, however, the values of all measurable quantities can be calculated. Immediately after the discovery of the Schrödinger equation, M. Born explained the meaning of the wave function: |ψ(x)|² is the probability density, and |ψ(x)|²∙Δx is the probability of finding the quantum system within the range Δx of values of the coordinate x.

Each physical quantity (a dynamical variable of classical mechanics) is associated in quantum mechanics with an observable a and the corresponding Hermitian operator Â, which in the chosen basis of complex functions |i⟩ = f_i(x) is represented by the matrix

A_ij = ∫ f_i*(x) Â f_j(x) dx,

where f*(x) is the function complex conjugate to the function f(x).

The orthogonal basis in this space is the set of eigenfunctions |n⟩ = f_n(x), n = 1,2,3,..., for which the action of the operator Â reduces to multiplication by a number (the eigenvalue a_n of the operator Â):

Â f_n(x) = a_n f_n(x).

The basis of functions |n⟩ is normalized by the condition

∫ f_n*(x) f_n′(x) dx = 1 at n = n′, and = 0 at n ≠ n′,

and the number of basis functions (in contrast to the basis vectors of the three-dimensional space of classical physics) is infinite, while the index n can vary both discretely and continuously. All possible values of the observable a are contained in the set {a_n} of eigenvalues of the corresponding operator Â, and only these values can be the results of measurements.

The main object of quantum mechanics is the state vector |ψ⟩, which can be expanded in the eigenfunctions |n⟩ of the chosen operator Â:

|ψ⟩ = Σ_n ψ_n |n⟩,

where ψ_n is the probability amplitude (wave function) of the state |n⟩, and |ψ_n|² is equal to the weight of the state n in the expansion of |ψ⟩, with

Σ_n |ψ_n|² = 1,

i.e., the total probability of finding the system in one of the quantum states n is equal to unity.
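A minimal numerical sketch of this formalism (a made-up three-state example; the operator matrix and the state vector are arbitrary illustrations): a Hermitian operator is diagonalized, a state is expanded over its eigenvectors, and the expansion coefficients give the probabilities of the possible measurement results a_n.

import numpy as np

# An arbitrary Hermitian operator (observable) in a 3-dimensional state space
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, -1.0]])

a, f = np.linalg.eigh(A)            # eigenvalues a_n and orthonormal eigenvectors |n> (columns of f)

psi = np.array([1.0, 1.0j, 0.5])    # some state vector |psi>
psi = psi / np.linalg.norm(psi)     # normalize it

c = f.conj().T @ psi                # amplitudes psi_n = <n|psi>
prob = np.abs(c)**2                 # probabilities of obtaining a_n in a measurement
print(prob, prob.sum())             # the probabilities sum to 1

mean_A = np.real(psi.conj() @ A @ psi)
print(mean_A, np.sum(prob * a))     # the two ways of computing the mean value agree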

In Heisenberg's quantum mechanics, the operators Â and the matrices corresponding to them obey the equation

iħ dÂ/dt = [Â, Ĥ],

where [Â,Ĥ] = ÂĤ - ĤÂ is the commutator of the operators Â and Ĥ. In contrast to the Schrödinger scheme, where the wave function ψ depends on time, in the Heisenberg scheme the time dependence is assigned to the operator Â. Both approaches are mathematically equivalent, but in numerous applications of quantum mechanics the Schrödinger approach has turned out to be preferable.

The eigenvalue of the Hamilton operator Ĥ is the total energy E of the system, independent of time, which is found as a solution of the stationary Schrödinger equation
Ĥψ(x) = Eψ(x).

Its solutions are divided into two types depending on the type of boundary conditions.

For a localized state the wave function satisfies the natural boundary condition ψ(∞) = 0. In this case the Schrödinger equation has solutions only for a discrete set of energies Eₙ, n = 1,2,3,..., to which the wave functions ψₙ(r) correspond:
Ĥψₙ(r) = Eₙψₙ(r).

An example of a localized state is the hydrogen atom. Its Hamiltonian Ĥ has the form

Ĥ = -(ħ²/2m)Δ - e²/r,

where Δ = ∂²/∂x² + ∂²/∂y² + ∂²/∂z² is the Laplace operator, e²/r is the interaction potential of the electron and the nucleus, r is the distance from the nucleus to the electron, and the energy eigenvalues Eₙ calculated from the Schrödinger equation coincide with the energy levels of the Bohr atom.

The simplest example of a non-localized state is free one-dimensional motion of an electron with momentum p. It corresponds to the Schrödinger equation

-(ħ²/2m) d²ψ(x)/dx² = Eψ(x),

whose solution is the plane wave

ψ_p(x) = C exp(ipx/ħ),

where in the general case C = |C|exp(iφ) is a complex number, |C| and φ being its modulus and phase. In this case the electron energy is E = p²/2m, and the index p of the solution ψ_p(x) runs over a continuous series of values.

The position and momentum operators (and any other pair of canonically conjugate variables) obey the permutation (commutation) relation

x̂p̂ - p̂x̂ = iħ.

There is no common basis of eigenfunctions for a pair of such operators, and the physical quantities corresponding to them cannot be determined simultaneously with arbitrary accuracy. The commutation relation for the operators x̂ and p̂ implies a restriction on the accuracies Δx and Δp with which the coordinate x and its conjugate momentum p of a quantum system can be determined (the Heisenberg uncertainty relation):

Δx∙Δp ≥ ħ/2.

From this, in particular, the conclusion about the stability of the atom follows immediately, since the relation Δx = Δp = 0, which would correspond to the electron falling onto the nucleus, is forbidden in this scheme.
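A numerical illustration of the uncertainty relation (a sketch on a finite grid with ħ set to 1; the packet width is an arbitrary choice): for a Gaussian wave packet the product Δx∙Δp comes out close to its minimum value ħ/2.

import numpy as np

hbar = 1.0
N = 2048
x = np.linspace(-20.0, 20.0, N)
dx = x[1] - x[0]

sigma = 1.5                                  # packet width (arbitrary)
psi = np.exp(-x**2 / (4 * sigma**2))         # Gaussian wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)  # normalize

# position uncertainty from |psi(x)|^2
mean_x  = np.sum(x * np.abs(psi)**2) * dx
mean_x2 = np.sum(x**2 * np.abs(psi)**2) * dx
dx_unc = np.sqrt(mean_x2 - mean_x**2)

# momentum-space wave function via FFT, momentum uncertainty from |phi(p)|^2
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = k[1] - k[0]
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
p = hbar * k
norm_p  = np.sum(np.abs(phi)**2) * dk
mean_p  = np.sum(p * np.abs(phi)**2) * dk / norm_p
mean_p2 = np.sum(p**2 * np.abs(phi)**2) * dk / norm_p
dp_unc = np.sqrt(mean_p2 - mean_p**2)

print(dx_unc * dp_unc, hbar / 2)   # ~ 0.5: a minimum-uncertainty packet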

The set of simultaneously measurable quantities characterizing a quantum system is represented by a set of operators that commute with one another; for the hydrogen atom these are Ĥ, L̂² and L̂_z, i.e. operators satisfying the relations [Ĥ,L̂²] = [Ĥ,L̂_z] = [L̂²,L̂_z] = 0, where L̂² is the operator of the square of the orbital angular momentum and L̂_z is the z-component of the momentum operator. The state vector of the atom is defined as the set of common eigenfunctions ψ_i(r) of all these operators, which are numbered by the set (i) = (nlm) of quantum numbers of the energy (n = 1,2,3,...), of the orbital momentum (l = 0,1,...,n - 1) and of its projection onto the z axis (m = -l,...,-1,0,1,...,l). The functions |ψ_i(r)|² can be conventionally regarded as the shape of the atom in the various quantum states i (the so-called White silhouettes).

The value of a physical quantity (an observable of quantum mechanics) is defined as the mean value Ā of the corresponding operator Â:
Ā = ∫ ψ*(x) Â ψ(x) dx.

This relation is valid for pure states, i.e. for isolated quantum systems. In the general case of mixed states we always deal with a large collection (statistical ensemble) of identical systems (for example, atoms), whose properties are determined by averaging over this ensemble. In this case the mean value Ā of the operator Â takes the form

Ā = Σ_nm ρ_nm A_mn,

where ρ_nm is the density matrix (L. D. Landau; J. von Neumann, 1929) with the normalization condition Σ_n ρ_nn = 1. The density-matrix formalism makes it possible to combine quantum-mechanical averaging over states with statistical averaging over the ensemble. The density matrix also plays an important role in the theory of quantum measurements, whose essence always lies in the interaction of a quantum and a classical subsystem. The concept of the density matrix is the basis of quantum statistics and of one of the alternative formulations of quantum mechanics. Another form of quantum mechanics, based on the concept of the path integral (integral over trajectories), was proposed by R. Feynman in 1948.
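A minimal sketch of ensemble averaging with a density matrix (a made-up two-level example; the ensemble weights and the observable are illustrative): for a mixed state the mean value is Ā = Tr(ρÂ), with Tr ρ = 1.

import numpy as np

# Observable: the Pauli z-matrix (a spin projection, up to a factor hbar/2)
A = np.array([[1.0, 0.0],
              [0.0, -1.0]])

# Two pure states: |0> and |+> = (|0> + |1>)/sqrt(2)
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)

# A statistical ensemble: 70% of the systems prepared in |0>, 30% in |+>
rho = 0.7 * np.outer(ket0, ket0.conj()) + 0.3 * np.outer(ketp, ketp.conj())

print(np.trace(rho))       # normalization: Tr(rho) = 1
print(np.trace(rho @ A))   # ensemble average <A> = Tr(rho A) = 0.7*1 + 0.3*0 = 0.7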

Correspondence principle. Quantum mechanics has deep roots in both classical and statistical mechanics. Already in his first work N. Bohr formulated the correspondence principle, according to which quantum relations must go over into classical ones at large quantum numbers n. P. Ehrenfest showed in 1927 that, with the equations of quantum mechanics taken into account, the mean value Ā of an operator Â satisfies the equation of motion of classical mechanics. Ehrenfest's theorem is a special case of the general correspondence principle: in the limit h → 0 the equations of quantum mechanics go over into the equations of classical mechanics. In particular, in the limit h → 0 the Schrödinger wave equation goes over into the equation of geometrical optics for the trajectory of a light ray (or of any radiation) without regard to its wave properties. Representing the solution ψ(x) of the Schrödinger equation in the form ψ(x) = exp(iS/ħ), where S = ∫p(x)dx is the analogue of the classical action integral, one can verify that in the limit ħ → 0 the function S satisfies the classical Hamilton-Jacobi equation. Moreover, in the limit h → 0 the operators x̂ and p̂ commute, and the values of the coordinate and momentum corresponding to them can be determined simultaneously, as is assumed in classical mechanics.

The most significant analogies between the relations of classical and quantum mechanics for periodic motions can be traced on the phase plane of canonically conjugate variables, for example the coordinate x and momentum p of the system. Integrals of the type ∮p(x)dx taken along a closed trajectory (Poincaré integral invariants) were known in the prehistory of quantum mechanics as the Ehrenfest adiabatic invariants. A. Sommerfeld used them to describe quantum regularities in the language of classical mechanics, in particular for the spatial quantization of the atom and the introduction of the quantum numbers l and m (it was he who introduced this term in 1915).

The dimension of the phase integral ∮pdx coincides with that of Planck's constant h, and in 1911 H. Poincaré and M. Planck proposed to regard the quantum of action h as the minimum volume of phase space, the number n of whose cells is a multiple of h: n = ∮pdx/h. In particular, when an electron moves along a circular trajectory with constant momentum p, the relation n = ∮p(x)dx/h = p∙2πr/h immediately implies the Bohr quantization condition mυr = nħ (P. Debye, 1913).

However, in the case of one-dimensional motion in the potential V(x) = mω₀²x²/2 (a harmonic oscillator with natural frequency ω₀) the quantization condition ∮p(x)dx = nh yields the series of energy values Eₙ = ħω₀n, whereas the exact solution of the quantum equations for the oscillator leads to the sequence Eₙ = ħω₀(n + 1/2). This result of quantum mechanics, first obtained by W. Heisenberg, differs fundamentally from the approximate one by the presence of the zero-point vibrational energy E₀ = ħω₀/2, which is of a purely quantum nature: the state of rest (x = 0, p = 0) is forbidden in quantum mechanics, since it contradicts the uncertainty relation Δx∙Δp ≥ ħ/2.
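The zero-point energy can be checked numerically. The sketch below (ħ = m = ω₀ = 1 for simplicity; grid size chosen arbitrarily) diagonalizes a finite-difference Hamiltonian of the harmonic oscillator and reproduces Eₙ = ħω₀(n + 1/2) rather than ħω₀n.

import numpy as np

hbar = m = omega0 = 1.0
N = 1000
x = np.linspace(-8.0, 8.0, N)
dx = x[1] - x[0]

# Kinetic energy: -hbar^2/(2m) d^2/dx^2 by a three-point finite difference
main = np.full(N, -2.0)
off  = np.ones(N - 1)
T = -(hbar**2 / (2 * m)) * (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / dx**2

V = np.diag(0.5 * m * omega0**2 * x**2)   # oscillator potential on the grid
H = T + V

E = np.linalg.eigvalsh(H)
print(E[:4])   # ~ [0.5, 1.5, 2.5, 3.5] = hbar*omega0*(n + 1/2), with zero-point energy 0.5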

The principle of superposition of states and the probabilistic interpretation. The main and most graphic contradiction between the corpuscular and wave pictures of quantum phenomena was eliminated in 1926, after M. Born proposed interpreting the complex wave function ψ_n(x) = |ψ_n(x)|exp(iφ_n) as the probability amplitude of the state n, and the square of its modulus |ψ_n(x)|² as the probability density of detecting the state n at the point x. A quantum system can be in various, including alternative, states, and its probability amplitude is equal to a linear combination of the probability amplitudes of these states: ψ = ψ₁ + ψ₂ + ...

The probability density of the resulting state is equal to the square of the sum of the probability amplitudes, and not to the sum of the squares of the amplitudes, as would be the case in statistical physics:
|ψ|² = |ψ₁ + ψ₂ + ...|².

This postulate, the principle of superposition of states, is one of the most important in the system of concepts of quantum mechanics; it has many observable consequences. One of them, namely the passage of an electron through two closely spaced slits, is discussed more often than others (Fig. 2). The electron beam falls on the partition from the left, passes through the slits, and is then recorded on a screen (or photographic plate) at the right. If each of the slits is closed in turn, then on the screen we see an image of the open slit. But if both slits are opened simultaneously, then instead of two slits we see a system of interference fringes whose intensity is described by the expression
I = |ψ₁ + ψ₂|² = |ψ₁|² + |ψ₂|² + 2|ψ₁||ψ₂|cos Δφ.

The last term in this sum represents the interference of two probability waves that have arrived at a given point of the screen from different slits in the partition, and it depends on the phase difference Δφ = φ₁ - φ₂ of the wave functions. In the case of equal amplitudes |ψ₁| = |ψ₂|:
I = 2|ψ₁|²(1 + cos Δφ) = 4|ψ₁|²cos²(Δφ/2),

i.e., the intensity of the image of the slits at different points of the screen varies from 0 to 4|ψ₁|², in accordance with the variation of the phase difference Δφ between 0 and π. In particular, it may turn out that with both slits open we detect no signal at all at the place where the image of a single slit was, which is absurd from the corpuscular point of view.
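A numerical sketch of this superposition (idealized spherical-wave amplitudes from two point slits; the geometry and wavelength are arbitrary illustrative values): the intensity on the screen is |ψ₁ + ψ₂|², not |ψ₁|² + |ψ₂|².

import numpy as np

lam = 0.05          # de Broglie wavelength (arbitrary units)
k = 2 * np.pi / lam
d = 1.0             # distance between the slits
L = 200.0           # distance from the slits to the screen

y = np.linspace(-15.0, 15.0, 1001)        # coordinate along the screen
r1 = np.sqrt(L**2 + (y - d / 2)**2)       # path from slit 1
r2 = np.sqrt(L**2 + (y + d / 2)**2)       # path from slit 2

psi1 = np.exp(1j * k * r1) / np.sqrt(r1)  # probability amplitudes of the two alternatives
psi2 = np.exp(1j * k * r2) / np.sqrt(r2)

I_both   = np.abs(psi1 + psi2)**2              # both slits open: interference fringes
I_single = np.abs(psi1)**2 + np.abs(psi2)**2   # classical sum of the two one-slit patterns

print(I_both.max() / I_single.max())      # ~ 2: the maxima reach 4|psi1|^2 instead of 2|psi1|^2
print(I_both.min())                       # ~ 0 at the dark fringes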

It is essential that this picture of the phenomenon does not depend on the intensity of the electron beam, i.e., it is not a result of the electrons interacting with one another. The interference pattern arises even in the limit when the electrons pass through the slits in the partition one at a time, i.e., each electron interferes with itself. This is impossible for a particle, but quite natural for a wave, for example when it is reflected or diffracted by an obstacle whose dimensions are comparable to its wavelength. In this experiment, wave-particle duality manifests itself in the fact that the same electron is registered as a particle but propagates as a wave of a special nature: a wave of the probability of finding the electron at some point in space. In such a picture of the scattering process the question "Through which of the slits did the electron-particle pass?" loses its meaning, since the corresponding probability wave passes through both slits at once.

Another example illustrating the probabilistic nature of quantum-mechanical phenomena is the passage of light through a semitransparent plate. By definition, the reflectance of light is equal to the ratio of the number of photons reflected from the plate to the number of incident ones. However, this is not the result of averaging over a large number of events, but a characteristic inherent in each individual photon.

The principle of superposition and the concept of probability made it possible to carry out a consistent synthesis of the concepts of "wave" and "particle": each of the quantum events and its registration are discrete, but their distribution is dictated by the law of propagation of continuous waves of probability.

Tunnel effect and resonant scattering. The tunnel effect is perhaps the most famous phenomenon in quantum physics. It is due to the wave properties of quantum objects and has received an adequate explanation only within the framework of quantum mechanics. An example of a tunnel effect is the decay of a radium nucleus into a radon nucleus and an α-particle: Ra → Rn + α.

Figure 3 shows the α-decay potential V(r) schematically: the α-particle oscillates with frequency ν in the "potential well" of the nucleus with charge Z₀, and after leaving it moves in the repulsive Coulomb potential 2Ze²/r, where Z = Z₀ - 2. In classical mechanics a particle cannot leave the potential well if its energy E is less than the height V_max of the potential barrier. In quantum mechanics, owing to the uncertainty relation, the particle penetrates with a finite probability W into the sub-barrier region r₀ < r < r₁ and can "leak" from the region r < r₀ into the region r > r₁, much as light penetrates into the region of geometrical shadow at distances comparable to the wavelength of light. Using the Schrödinger equation, one can calculate the coefficient D for the passage of the α-particle through the barrier, which in the semiclassical approximation is equal to
D = exp[-(2/ħ) ∫ √(2m(V(r) - E)) dr], the integral being taken between the classical turning points r₀ and r₁.

With time the number of radium nuclei N(t) decreases according to the law N(t) = N₀exp(-t/τ), where τ is the mean lifetime of the nucleus and N₀ is the initial number of nuclei at t = 0. The probability of α-decay W = νD is related to the lifetime by W = 1/τ, from which the Geiger-Nuttall law follows:
ln τ = A + B∙Z/υ,

where υ is the velocity of the α-particle, Z is the charge of the resulting nucleus, and A and B are constants. This dependence was discovered experimentally as early as 1909, but only in 1928 did G. Gamow (and, independently, the English physicist R. Gurney and the American physicist E. Condon) first explain it in the language of quantum mechanics. It was thus shown that quantum mechanics describes not only radiation processes and other phenomena of atomic physics, but also the phenomena of nuclear physics.
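The exponential sensitivity of the barrier factor D to the particle energy, which underlies the Geiger-Nuttall law, can be illustrated with a simple sketch (a model one-dimensional rectangular barrier for an electron, not a real nuclear Coulomb barrier; all parameters are illustrative):

import numpy as np

hbar = 1.054571817e-34   # J*s
m    = 9.1093837015e-31  # electron mass, kg (a model particle)
eV   = 1.602176634e-19   # J

def wkb_transmission(E, V_of_x, x1, x2, n=20000):
    # Semiclassical barrier factor D = exp(-(2/hbar) * integral of sqrt(2m(V(x)-E)) dx),
    # integrated between the turning points x1 and x2 (simple midpoint-style sum)
    x = np.linspace(x1, x2, n)
    dx = x[1] - x[0]
    integrand = np.sqrt(np.clip(2 * m * (V_of_x(x) - E), 0.0, None))
    return np.exp(-2 * np.sum(integrand) * dx / hbar)

V0 = 5.0 * eV      # barrier height
a  = 1.0e-9        # barrier width, 1 nm
barrier = lambda x: np.full_like(x, V0)    # model rectangular barrier

for E in (1.0 * eV, 2.0 * eV, 4.0 * eV):
    print(E / eV, wkb_transmission(E, barrier, 0.0, a))   # D grows by orders of magnitude as E rises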

In atomic physics the tunnel effect explains the phenomenon of field emission. In a uniform electric field of strength E the Coulomb potential V(r) = -e²/r of attraction between the nucleus and the electron is distorted: V(r) = -e²/r - eEr; the energy levels E_nlm of the atom are shifted, which leads to a change in the frequencies ν_nk of the transitions between them (the Stark effect). In addition, this potential becomes qualitatively similar to the α-decay potential, as a result of which there is a finite probability of the electron tunneling through the potential barrier (R. Oppenheimer, 1928). When critical values of E are reached, the barrier is lowered so much that the electron leaves the atom (so-called avalanche ionization).

Alpha decay is a particular case of the decay of a quasi-stationary state, which is closely related to the concept of a quantum-mechanical resonance and makes it possible to understand additional aspects of non-stationary processes in quantum mechanics. The time dependence of its solutions follows from the Schrödinger equation:
ψ(x,t) = ψ(x)exp(-iEt/ħ),

where E is the eigenvalue of the Hamiltonian Ĥ, which is real for the Hermitian operators of quantum mechanics, and the corresponding observable (the total energy E) does not depend on time. However, the energy of non-stationary systems does depend on time, and this fact can be taken into account formally if the energy of such a system is written in complex form: E = E₀ - iΓ/2. In this case the time dependence of the wave function has the form
ψ(x,t) = ψ(x)exp(-iE₀t/ħ)exp(-Γt/2ħ),

and the probability of finding the corresponding state decreases exponentially:
W(t) = |ψ(x,t)|² = W(0)exp(-Γt/ħ),

which coincides in form with the law of α-decay, with the lifetime τ = ħ/Γ.

In the reverse process, for example in the collision of deuterium and tritium nuclei that produces helium and a neutron (a thermonuclear fusion reaction), the concept of the reaction cross section σ is used, defined as a measure of the reaction probability per unit flux of colliding particles.

For classical particles the scattering cross section off a sphere of radius r₀ coincides with its geometric cross section and is equal to σ = πr₀². In quantum mechanics it can be represented in terms of the scattering phases δ_l(k):
σ = (4π/k²) Σ_l (2l + 1) sin²δ_l(k),

where k = p/ħ = √(2mE)/ħ is the wave number and l is the orbital angular momentum of the system. In the limit of very low collision energies the quantum scattering cross section σ = 4πr₀² is 4 times larger than the geometric cross section of the sphere. (This effect is one of the consequences of the wave nature of quantum phenomena.) In the vicinity of a resonance at E ≈ E₀ the scattering phase behaves as
tan δ(E) ≈ (Γ/2)/(E₀ - E),

and the scattering cross section is
σ(E) ≈ 4πλ² W(E),

where λ = 1/k and W(E) is the Breit-Wigner function:
W(E) = (Γ/2)²/[(E - E₀)² + (Γ/2)²].

At low scattering energies l ≈ 0, and the de Broglie wavelength λ is much larger than the size of the nuclei; therefore, at E = E₀ the resonant cross sections of nuclei, σ_res ≈ 4πλ₀², can exceed their geometric cross sections πr₀² by factors of thousands and millions. In nuclear physics the operation of nuclear and thermonuclear reactors depends on these cross sections. In atomic physics this phenomenon was first observed by J. Franck and G. Hertz (1913) in experiments on the resonant absorption of electrons by mercury atoms. In the opposite case (δ₀ = 0) the scattering cross section is anomalously small (the Ramsauer effect, 1921).

The function W(E) is known in optics as the Lorentzian profile of an emission line and has the form of a typical resonance curve with a maximum at E = E₀; the resonance width Γ = 2ΔE = 2(E - E₀) is determined from the relation W(E₀ ± ΔE) = W(E₀)/2. The function W(E) is universal: it describes both the decay of a quasi-stationary state and the resonant dependence of the scattering cross section on the collision energy E, while in radiation phenomena it determines the natural width Γ of a spectral line, which is related to the lifetime τ of the emitter by τ = ħ/Γ. This relation also determines the lifetimes of elementary particles.
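A sketch of the resonance curve (illustrative values of E₀ and Γ; W is taken in the dimensionless form used above, normalized so that W(E₀) = 1): the full width at half maximum equals Γ, and the lifetime is τ = ħ/Γ.

hbar_eVs = 6.582119569e-16        # hbar in eV*s

E0    = 1.0                        # resonance energy, eV (illustrative)
Gamma = 0.1                        # resonance width, eV (illustrative)

def W(E):
    # Breit-Wigner (Lorentzian) profile normalized to W(E0) = 1
    return (Gamma / 2)**2 / ((E - E0)**2 + (Gamma / 2)**2)

print(W(E0))                       # 1.0 at the maximum
print(W(E0 + Gamma / 2))           # 0.5: half maximum at E0 +/- Gamma/2, so the FWHM equals Gamma
print(hbar_eVs / Gamma)            # lifetime tau = hbar/Gamma ~ 6.6e-15 s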

From the definition τ = ħ/Γ, with the equality Γ = 2ΔE taken into account, the uncertainty relation for energy and time follows: ΔE∙Δt ≥ ħ/2, where Δt ≥ τ. In form it is similar to the relation Δx∙Δp ≥ ħ/2, but the ontological status of this inequality is different, since time t is not a dynamical variable in quantum mechanics. Therefore the relation ΔE∙Δt ≥ ħ/2 does not follow directly from the basic postulates of stationary quantum mechanics and, strictly speaking, makes sense only for systems whose energy changes with time. Its physical meaning is that in a time Δt the energy of the system cannot be measured more accurately than to the value ΔE determined by the relation ΔE∙Δt ≥ ħ/2. A stationary state (ΔE → 0) exists for an infinitely long time (Δt → ∞).

Spin, identity of particles and the exchange interaction. The concept of "spin" was established in physics by the work of W. Pauli, the Dutch physicist R. Kronig, S. Goudsmit and G. Uhlenbeck (1924-27), although experimental evidence of its existence had been obtained long before the creation of quantum mechanics, in the experiments of A. Einstein and W. J. de Haas (1915) and of O. Stern and the German physicist W. Gerlach (1922). The spin (the intrinsic mechanical angular momentum of a particle) of the electron is S = ħ/2. It is as important a characteristic of a quantum particle as its charge and mass, but it has no classical analogues.

The spin operator Ŝ = ħσ̂/2, where σ̂ = (σ̂_x, σ̂_y, σ̂_z) are the two-dimensional Pauli matrices, is defined in the space of two-component eigenfunctions u = (u₊, u₋) of the operator Ŝ_z of the spin projection onto the z axis: σ̂_z u = σu, σ = ±1/2. The intrinsic magnetic moment μ of a particle with mass m and spin S is μ = 2μ₀S, where μ₀ = eħ/2mc is the Bohr magneton. The operators Ŝ² and Ŝ_z commute with the set Ĥ₀, L̂², L̂_z of operators of the hydrogen atom, and together with them they form the Hamiltonian of the Pauli equation (1927), whose solutions are numbered by the set i = (nlmσ) of quantum numbers of the eigenvalues of the set of commuting operators Ĥ₀, L̂², L̂_z, Ŝ², Ŝ_z. These solutions describe the finest features of the observed spectra of atoms, in particular the splitting of spectral lines in a magnetic field (the normal and anomalous Zeeman effects), as well as their multiplet structure resulting from the interaction of the electron spin with the orbital momentum of the atom (fine structure) and with the nuclear spin (hyperfine structure).
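A small sketch of the spin formalism (ħ set to 1): the Pauli matrices, the spin operators Ŝ = σ̂/2, a check of the angular-momentum commutation relation, and the eigenvalues σ = ±1/2 of Ŝ_z.

import numpy as np

hbar = 1.0
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

Sx, Sy, Sz = (hbar / 2) * sx, (hbar / 2) * sy, (hbar / 2) * sz

# Angular-momentum commutation relation [Sx, Sy] = i*hbar*Sz
comm = Sx @ Sy - Sy @ Sx
print(np.allclose(comm, 1j * hbar * Sz))      # True

# Eigenvalues of the spin projection S_z: sigma = +1/2 and -1/2 (in units of hbar)
print(np.linalg.eigvalsh(Sz))                 # [-0.5, 0.5]

# S^2 = Sx^2 + Sy^2 + Sz^2 = s(s+1)*hbar^2 with s = 1/2
S2 = Sx @ Sx + Sy @ Sy + Sz @ Sz
print(np.allclose(S2, 0.75 * hbar**2 * np.eye(2)))   # True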

In 1924, even before the creation of quantum mechanics, W. Pauli formulated the exclusion principle: an atom cannot contain two electrons with the same set of quantum numbers i = (nlmσ). This principle made it possible to understand the structure of the periodic system of chemical elements and to explain the periodicity of the changes in their chemical properties with a monotonic increase in the charge of their nuclei.

The exclusion principle is a special case of a more general principle establishing the connection between the spin of a particle and the symmetry of its wave function. Depending on the value of the spin, all elementary particles are divided into two classes: fermions, particles with half-integer spin (the electron, proton, μ-meson, etc.), and bosons, particles with zero or integer spin (the photon, π-meson, K-meson, etc.). In 1940 Pauli proved the general theorem on the connection between spin and statistics, from which it follows that the wave function of any system of fermions has negative parity (it changes sign under a pairwise interchange of particles), while the parity of the wave function of a system of bosons is always positive. Correspondingly, there exist two types of particle energy distributions: the Fermi-Dirac distribution and the Bose-Einstein distribution, a special case of which is the Planck distribution for a system of photons.

One of the consequences of the Pauli principle is the existence of the so-called exchange interaction, which manifests itself already in a system of two electrons. In particular, it is this interaction that provides the covalent chemical bonding of atoms in the molecules H₂, N₂, O₂, etc. The exchange interaction is an exclusively quantum effect; there is no analogue of such an interaction in classical physics. Its specificity is explained by the fact that the probability density of the wave function of a system of two electrons |ψ(r₁,r₂)|² contains not only the terms |ψ_n(r₁)|²|ψ_m(r₂)|², where n and m are the quantum states of the electrons of the two atoms, but also the "exchange terms" ψ_n*(r₁)ψ_m(r₁)ψ_m*(r₂)ψ_n(r₂), which arise as a consequence of the superposition principle, which allows each electron to be simultaneously in the different quantum states n and m of both atoms. In addition, because of the Pauli principle the spin part of the wave function of the molecule must be antisymmetric with respect to the permutation of the electrons, i.e. the chemical bonding of atoms in a molecule is effected by a pair of electrons with oppositely directed spins. The wave function of complex molecules can be represented as a superposition of wave functions corresponding to various possible configurations of the molecule (resonance theory, L. Pauling, 1928).

The calculational methods developed in quantum mechanics (the Hartree-Fock method, the molecular-orbital method, etc.) make it possible to compute on modern computers all the characteristics of the stable configurations of complex molecules: the order in which the electron shells of the atom are filled, the equilibrium distances between the atoms in molecules, the energies and directions of chemical bonds, the arrangement of the atoms in space; and to construct the potential surfaces that determine the directions of chemical reactions. This approach also makes it possible to calculate the potentials of interatomic and intermolecular interactions, in particular the van der Waals forces, and to estimate the strength of hydrogen bonds, etc. The problem of chemical bonding is thereby reduced to the problem of calculating the quantum characteristics of a system of particles with Coulomb interaction, and from this point of view structural chemistry can be regarded as one of the branches of quantum mechanics.

The exchange interaction depends essentially on the type of potential interaction between the particles. In particular, in some metals it is precisely because of it that the state with pairs of electrons having parallel spins is more stable, which explains the phenomenon of ferromagnetism.

Applications of quantum mechanics. Quantum mechanics is the theoretical basis of quantum physics. It made it possible to understand the structure of the electron shells of atoms and the regularities in their radiation spectra, the structure of nuclei and the laws of their radioactive decay, the origin of the chemical elements and the evolution of stars, including the explosions of novae and supernovae, as well as the source of the Sun's energy. Quantum mechanics explained the meaning of the periodic system of the elements, the nature of the chemical bond and the structure of crystals, the heat capacity and magnetic properties of substances, the phenomena of superconductivity and superfluidity, etc. Quantum mechanics is the physical basis of numerous technical applications: spectral analysis, the laser, the transistor and the computer, the nuclear reactor and the atomic bomb, etc.

The properties of metals, dielectrics, semiconductors and other substances also receive a natural explanation within the framework of quantum mechanics. In crystals, the atoms perform small oscillations about their equilibrium positions with frequency ω; with these are associated the quanta of lattice vibrations and the corresponding quasiparticles, phonons, with energy E = ħω. The heat capacity of a crystal is largely determined by the heat capacity of its phonon gas, and its thermal conductivity can be interpreted as the thermal conductivity of the phonon gas. In metals the conduction electrons form a gas of fermions, and their scattering by phonons is the main cause of the electrical resistance of conductors and also explains the similarity of the thermal and electrical properties of metals (see the Wiedemann-Franz law). In magnetically ordered structures quasiparticles appear, the magnons, to which spin waves correspond; in quantum liquids there arise quanta of rotational excitation, the rotons; and the magnetic properties of substances are determined by the spins of the electrons and nuclei (see Magnetism). The interaction of the electron and nuclear spins with a magnetic field is the basis of the practical applications of electron paramagnetic and nuclear magnetic resonance, in particular in medical tomographs.

The ordered structure of crystals generates an additional symmetry of the Hamiltonian with respect to the shift x → x + a, where a is the period of the crystal lattice. Taking the periodic structure of a quantum system into account leads to the splitting of its energy spectrum into allowed and forbidden bands. Such a structure of the energy levels underlies the operation of transistors and of all the electronics based on them (television sets, computers, cellular telephones, etc.). At the beginning of the 21st century significant progress was made in creating crystals with prescribed properties and energy-band structure (superlattices, photonic crystals and heterostructures: quantum dots, quantum wires, nanotubes, etc.).

As the temperature is lowered, some substances pass into the state of a quantum liquid, whose energy at temperature T → 0 approaches the energy of the zero-point oscillations of the system. In some metals at low temperatures Cooper pairs are formed - systems of two electrons with opposite spins and momenta. In this case the electron gas of fermions is transformed into a gas of bosons, which entails Bose condensation, and this explains the phenomenon of superconductivity.

At low temperatures the de Broglie wavelength of the thermal motion of the atoms becomes comparable to the interatomic distances, and the phases of the wave functions of many particles become correlated, which leads to macroscopic quantum effects (the Josephson effect, magnetic flux quantization, the fractional quantum Hall effect, Andreev reflection).

On the basis of quantum phenomena, the most accurate quantum standards of various physical quantities have been created: of frequency (the helium-neon laser), of electrical voltage (the Josephson effect), of electrical resistance (the quantum Hall effect), etc., as well as devices for various precision measurements: SQUIDs, quantum clocks, quantum gyroscopes, etc.

Quantum mechanics arose as a theory for explaining the specific phenomena of atomic physics (at first it was in fact called atomic dynamics), but it gradually became clear that quantum mechanics also forms the basis of all subatomic physics and that all its basic concepts are applicable to describing the phenomena of nuclear physics and of elementary particles. The original quantum mechanics was nonrelativistic, i.e. it described the motion of systems with velocities much smaller than the speed of light. The interaction of particles in this theory was still described in classical terms. In 1928, P. Dirac found the relativistic equation of quantum mechanics (the Dirac equation), which, while retaining all its concepts, took into account the requirements of the theory of relativity. In addition, the formalism of second quantization was developed, which describes the creation and annihilation of particles, in particular the creation and absorption of photons in radiation processes. On this basis quantum electrodynamics arose, which made it possible to calculate with great accuracy all the properties of systems with electromagnetic interaction. Later it developed into quantum field theory, which unites in a single formalism the particles and the fields through which they interact.

To describe elementary particles and their interactions, all the basic concepts of quantum mechanics are used: wave-particle duality remains valid, the language of operators and quantum numbers is preserved, as is the probabilistic interpretation of observed phenomena, etc. In particular, the principle of superposition of states is used to explain the interconversion of the three kinds of neutrinos, ν_e, ν_μ and ν_τ (neutrino oscillations), as well as of the neutral K-mesons.

Interpretation of quantum mechanics. The validity of the equations and conclusions of quantum mechanics has been repeatedly confirmed by numerous experiments. The system of its concepts, created by the work of N. Bohr, his students and followers and known as the "Copenhagen interpretation", is now generally accepted, although a number of the creators of quantum mechanics (M. Planck, A. Einstein, E. Schrödinger and others) remained convinced to the end of their lives that quantum mechanics is an unfinished theory. The specific difficulty of assimilating quantum mechanics is due, in particular, to the fact that most of its basic concepts (wave, particle, observation, etc.) are taken from classical physics. In quantum mechanics their meaning and domain of applicability are limited because of the finiteness of the quantum of action h, and this in turn required a revision of established positions in the philosophy of knowledge.

First of all, the meaning of the concept of "observation" changed in quantum mechanics. In classical physics it was assumed that the perturbations of the system under study caused by the process of measurement can be correctly taken into account, after which it is possible to reconstruct the initial state of the system, independent of the means of observation. In quantum mechanics the uncertainty relation places a fundamental limit on this path, one that has nothing to do with the skill of the experimenter or the subtlety of the observation methods used. The quantum of action h defines the boundaries of quantum mechanics, like the speed of light in the theory of electromagnetic phenomena or absolute zero in thermodynamics.

A way of accepting the uncertainty relation and of overcoming the difficulties in perceiving its logical consequences was proposed by N. Bohr in the concept of complementarity (see Complementarity principle). According to Bohr, a complete and adequate description of quantum phenomena requires a pair of complementary concepts and the corresponding pair of observables. To measure these observables, two different types of devices with incompatible properties are required. For example, to measure the coordinate accurately one needs a stable, massive device, while to measure the momentum one needs, on the contrary, a light and sensitive one. These two devices are incompatible, but they are complementary in the sense that both quantities measured by them are equally necessary for a complete characterization of a quantum object or phenomenon. Bohr explained that "phenomenon" and "observation" are complementary concepts and cannot be defined separately: the process of observation is already a certain phenomenon, while without observation the phenomenon is a "thing in itself". In reality we always deal not with the phenomenon itself but with the result of observing the phenomenon, and this result depends, among other things, on the choice of the type of device used to measure the characteristics of the quantum object. Quantum mechanics explains and predicts the results of such observations without any arbitrariness.

An important difference between the quantum and the classical equations is also that the wave function of a quantum system is not itself observable, while all the quantities calculated with its help have a probabilistic meaning. Moreover, the concept of probability in quantum mechanics differs fundamentally from the usual understanding of probability as a measure of our ignorance of the details of processes. Probability in quantum mechanics is an intrinsic property of an individual quantum phenomenon, inherent in it from the outset and independently of measurements, and not a way of representing the results of measurements. Accordingly, the superposition principle in quantum mechanics refers not to probabilities but to probability amplitudes. In addition, owing to the probabilistic nature of events, a superposition of quantum states can include states that are incompatible from the classical point of view, for example the states of the reflected and transmitted photons at the boundary of a semitransparent plate, or the alternative states of an electron passing through either of the slits in the famous interference experiment.

Non-acceptance of the probabilistic interpretation of quantum mechanics gave rise to many attempts to modify its basic principles. One such attempt is the introduction of hidden parameters into quantum mechanics, which would change in accordance with strict laws of causality, the probabilistic character of the quantum-mechanical description arising as a result of averaging over these parameters. A proof of the impossibility of introducing hidden parameters into quantum mechanics without violating its system of postulates was given by J. von Neumann as early as 1929. A more detailed analysis of the system of postulates of quantum mechanics was undertaken by J. Bell in 1965. Experimental tests of the so-called Bell inequalities (1972) once again confirmed the generally accepted scheme of quantum mechanics.

Quantum mechanics is now a complete theory that always gives correct predictions within the limits of its applicability. All known attempts to modify it (about ten are known) did not change its structure, but laid the foundations of new branches of the science of quantum phenomena: quantum electrodynamics, quantum field theory, the theory of the electroweak interaction, quantum chromodynamics, the quantum theory of gravity, the theory of strings and superstrings, etc.

Quantum mechanics stands among such achievements of science as classical mechanics, the theory of electricity, the theory of relativity and kinetic theory. No other physical theory has explained such a wide range of natural phenomena: of the 94 Nobel Prizes in physics awarded in the 20th century, only 12 are not directly related to quantum physics. The significance of quantum mechanics in the whole system of knowledge about surrounding nature goes far beyond the theory of quantum phenomena proper: it created the language of communication of modern physics, chemistry and even biology, led to a revision of the philosophy of science and the theory of knowledge, and its technological consequences still determine the direction of development of modern civilization.

Lit.: Neumann J. von. Mathematical Foundations of Quantum Mechanics. Moscow, 1964; Davydov A. S. Quantum Mechanics. 2nd ed. Moscow, 1973; Dirac P. A. M. The Principles of Quantum Mechanics. 2nd ed. Moscow, 1979; Blokhintsev D. I. Fundamentals of Quantum Mechanics. 7th ed. St. Petersburg, 2004; Landau L. D., Lifshitz E. M. Quantum Mechanics: Non-Relativistic Theory. 5th ed. Moscow, 2004; Feynman R., Leighton R., Sands M. Quantum Mechanics. 3rd ed. Moscow, 2004; Ponomarev L. I. Under the Sign of the Quantum. 2nd ed. Moscow, 2007; Fock V. A. Fundamentals of Quantum Mechanics. 5th ed. Moscow, 2008.

The formation of quantum mechanics as a consistent theory with specific physical foundations is largely associated with the work of W. Heisenberg, in which he formulated the uncertainty relation (uncertainty principle). This fundamental proposition of quantum mechanics reveals the physical meaning of its equations and also determines its connection with classical mechanics.

The uncertainty principle postulates that an object of the microworld cannot be in states in which the coordinates of its center of inertia and its momentum simultaneously take definite, exact values.

Quantitatively this principle is formulated as follows. If Δx is the uncertainty in the value of the coordinate x and Δp is the uncertainty in the momentum, then the product of these uncertainties cannot, in order of magnitude, be less than Planck's constant:

Δx ∙ Δp ≥ h.

It follows from the uncertainty principle that the more accurately one of the quantities entering the inequality is determined, the less accurately the other is determined. No experiment can measure these dynamical variables simultaneously and exactly, and this is not due to the influence of the measuring instruments or to their imperfection. The uncertainty relation reflects objective properties of the microworld that stem from its particle-wave duality.

The fact that one and the same object manifests itself both as a particle and as a wave destroys traditional notions and deprives the description of processes of its usual visualizability. The concept of a particle implies an object confined to a small region of space, whereas a wave propagates through extended regions of it. It is impossible to imagine an object possessing both of these qualities at once, and one should not try. It is impossible to construct a model, illustrative for human thinking, that would be adequate to the microworld. The equations of quantum mechanics, however, do not set themselves such a goal. Their meaning lies in a mathematically adequate description of the properties of the objects of the microworld and of the processes occurring with them.

As for the connection between quantum mechanics and classical mechanics, the uncertainty relation is a quantum limitation on the applicability of classical mechanics to objects of the microworld. Strictly speaking, the uncertainty relation applies to any physical system; however, since the wave nature of macro-objects manifests itself to a negligible degree, the coordinates and momentum of such objects can be measured simultaneously with sufficiently high accuracy. This means that it is entirely sufficient to use the laws of classical mechanics to describe their motion. Recall that the situation is similar in relativistic mechanics (the special theory of relativity): at velocities much lower than the speed of light the relativistic corrections become insignificant and the Lorentz transformations go over into the Galilean transformations.

Thus, the uncertainty relation for coordinate and momentum reflects the particle-wave duality of the microworld and is not connected with the influence of measuring devices. A somewhat different meaning attaches to the analogous uncertainty relation for energy E and time t:

ΔE ∙ Δt ≥ h.

It follows from this that the energy of the system can be measured only with an accuracy not exceeding h/Δt, where Δt is the duration of the measurement. The reason for such uncertainty lies in the very process of interaction of the system (micro-object) with the measuring device. For a stationary situation, the above inequality means that the energy of interaction between the measuring device and the system can be taken into account only to within h/Δt. In the limiting case of an instantaneous measurement, the energy exchange that takes place turns out to be completely indeterminate.

If ΔE is understood as the uncertainty in the value of the energy of a non-stationary state, then Δt is the characteristic time during which the values of the physical quantities in the system change appreciably. From this, in particular, follows an important conclusion about the excited states of atoms and other microsystems: the energy of an excited level cannot be determined exactly, which indicates the presence of a natural width of this level.

The objective properties of quantum systems are reflected in another fundamental proposition of quantum mechanics, Bohr's complementarity principle, according to which obtaining information about some physical quantities describing a micro-object by any experimental means is inevitably connected with the loss of information about certain other quantities that are complementary to the first.

Mutually complementary are, in particular, the coordinate of a particle and its momentum (see above, the uncertainty principle), its kinetic and potential energy, the electric field strength and the number of photons.

The fundamental principles of quantum mechanics considered above show that, owing to the particle-wave duality of the microworld it studies, the determinism of classical physics is alien to it. The complete departure from visual modeling of processes lends special interest to the question of the physical nature of de Broglie waves. In answering it, it is customary to "start" from the behavior of photons. It is known that when a light beam is passed through a semitransparent plate S, part of the light passes through it and part is reflected (Fig. 4).

Fig. 4

What then happens to individual photons? Experiments with light beams of very low intensity using modern technique (A is a photon detector), which makes it possible to follow the behavior of each photon (the so-called photon-counting mode), show that there can be no question of an individual photon splitting (otherwise the light would change its frequency). It is reliably established that some photons pass through the plate and some are reflected from it. This means that identical particles under identical conditions can behave differently, i.e. the behavior of an individual photon when it meets the surface of the plate cannot be predicted unambiguously.

The reflection of a photon from the plate or its passage through it are random events. The quantitative regularities of such events are described with the help of probability theory. A photon can pass through the plate with probability w₁ and be reflected from it with probability w₂. The probability that one of these two alternative events happens to the photon is equal to the sum of the probabilities: w₁ + w₂ = 1.

Similar experiments with a beam of electrons or other microparticles also show the probabilistic character of the behavior of individual particles. Thus, the problem of quantum mechanics can be formulated as the prediction of the probabilities of processes in the microworld, in contrast to the problem of classical mechanics, which is to predict events in the macroworld with certainty.

It is known, however, that a probabilistic description is also used in classical statistical physics. So what is the fundamental difference? To answer this question, let us complicate the experiment on the reflection of light. Using a mirror S₂, let us turn the reflected beam and place the detector A, which registers the photons, in the zone where it overlaps the transmitted beam, i.e. let us provide the conditions for an interference experiment (Fig. 5).

Fig. 5

As a result of interference, the light intensity will, depending on the position of the mirror and the detector, vary periodically across the cross section of the region where the beams overlap, over a wide range (down to zero). How do individual photons behave in this experiment? It turns out that in this case the two optical paths to the detector are no longer alternative (mutually exclusive), and so it is impossible to say along which path the photon traveled from the source to the detector. We have to admit that it could reach the detector along two paths simultaneously, giving rise in the end to the interference pattern. Experiments with other microparticles give a similar result: particles passing through one after another create the same pattern as a flux of photons.

This is already a cardinal difference from classical notions: it is impossible to imagine the motion of a particle along two different paths at once. Quantum mechanics, however, does not pose such a problem. It simply predicts the result: the bright fringes correspond to a high probability of a photon arriving there.

Wave optics easily explains the result of the interference experiment with the help of the superposition principle, according to which light waves are added with their phase relationship taken into account. In other words, the waves are first added in amplitude, with the phase difference taken into account, a periodic amplitude distribution is formed, and then the detector registers the corresponding intensity (which corresponds to the mathematical operation of taking the squared modulus, i.e. information about the phase distribution is lost). In this case the intensity distribution is periodic:

I = I₁ + I₂ + 2A₁A₂cos(φ₁ - φ₂),

where A, φ and I = |A|² are the amplitude, phase and intensity of a wave, respectively, and the indices 1, 2 indicate to which of the two waves they belong. It is clear that for A₁ = A₂ and cos(φ₁ - φ₂) = -1 the intensity is I = 0, which corresponds to the mutual cancellation of the light waves (upon their superposition and addition in amplitude).

To interpret wave phenomena from the corpuscular point of view, the principle of superposition is carried over into quantum mechanics, i.e., the concept of the probability amplitude is introduced, by analogy with optical waves: Ψ = A·exp(iφ). The probability is then the squared modulus of this quantity, W = |Ψ|². The probability amplitude is called in quantum mechanics the wave function. This concept was introduced in 1926 by the German physicist M. Born, who thereby gave a probabilistic interpretation to de Broglie waves. Satisfying the principle of superposition means that if Ψ₁ and Ψ₂ are the probability amplitudes for the particle taking the first and the second path, then the probability amplitude for both paths is Ψ = Ψ₁ + Ψ₂. Then the statement that "the particle went both ways" formally acquires a wave meaning, and the probability W = |Ψ₁ + Ψ₂|² exhibits the properties of an interference distribution.
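
A minimal numerical sketch of this rule (all values are chosen arbitrarily for illustration): adding the complex amplitudes of the two paths, rather than their probabilities, produces the interference term.

    import numpy as np

    A1, A2 = 1.0, 1.0                       # assumed amplitudes of the two paths
    for dphi in np.linspace(0.0, 2 * np.pi, 5):   # phase difference between the paths
        psi1 = A1 * np.exp(1j * 0.0)        # probability amplitude for path 1
        psi2 = A2 * np.exp(1j * dphi)       # probability amplitude for path 2
        W = abs(psi1 + psi2) ** 2           # quantum rule: W = |psi1 + psi2|^2
        W_no_int = abs(psi1) ** 2 + abs(psi2) ** 2   # what adding probabilities would give
        print(f"phase difference {dphi:.2f}: W = {W:.2f}, without interference {W_no_int:.2f}")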

Thus, the quantity describing the state of a physical system in quantum mechanics is the wave function of the system under the assumption that the superposition principle is valid. With respect to the wave function, the basic equation of wave mechanics is written - the Schrödinger equation. Therefore, one of the main problems of quantum mechanics is to find the wave function corresponding to a given state of the system under study.

It is important that the description of the state of a particle by means of the wave function is probabilistic in nature, since the squared modulus of the wave function determines the probability of finding the particle at a given time in a certain limited volume. In this, quantum theory fundamentally differs from classical physics with its determinism.

At one time, classical mechanics owed its triumphal march to the high accuracy of predicting the behavior of macroobjects. Naturally, among scientists for a long time there was an opinion that the progress of physics and science in general would be inextricably linked with an increase in the accuracy and reliability of such predictions. The principle of uncertainty and the probabilistic nature of the description of microsystems in quantum mechanics radically changed this point of view.

Then other extremes began to appear. Since the uncertainty principle implies the impossibility of simultaneously determining position and momentum, one might conclude that the state of the system at the initial moment of time is not precisely determined and that, therefore, subsequent states cannot be predicted, i.e., that the principle of causality is violated.

However, such a statement is possible only when a classical view is applied to a non-classical reality. In quantum mechanics, the state of a particle is completely determined by the wave function: its value, specified at a certain moment of time, determines its subsequent values. Since causality is one of the manifestations of determinism, in the case of quantum mechanics it is appropriate to speak of probabilistic determinism based on statistical laws, i.e., laws whose accuracy is the higher, the more events of the same type are recorded. Therefore, the modern concept of determinism presupposes an organic combination, a dialectical unity, of necessity and chance.

The development of quantum mechanics thus had a marked influence on the progress of philosophical thought. From an epistemological point of view, of particular interest is the already mentioned correspondence principle, formulated by N. Bohr in 1923, according to which any new, more general theory that is a development of the classical one does not reject it completely, but includes the classical theory, indicating the limits of its applicability and passing into it in certain limiting cases.

It is easy to see that the correspondence principle perfectly illustrates the relationship of classical mechanics and electrodynamics with the theory of relativity and quantum mechanics.

Quantum mechanics is a fundamental physical theory that expands, refines and combines the results of classical mechanics and classical electrodynamics in the description of microscopic objects. This theory is the basis for many areas of physics and chemistry, including solid-state physics, quantum chemistry, and elementary particle physics. The term "quantum" (from the Latin quantum, "how much") refers to the discrete portions that the theory assigns to certain physical quantities, for example, the energy of an atom.

Mechanics is the science that describes the motion of bodies and the physical quantities, such as energy or momentum, associated with that motion. Quantum mechanics gives accurate and reliable results for many phenomena. This applies both to microscopic phenomena (where classical mechanics cannot explain even the existence of a stable atom) and to some macroscopic phenomena, such as superconductivity, superfluidity, or black-body radiation. Over more than a century of its existence, the predictions of quantum mechanics have never been refuted by experiment. Quantum mechanics explains at least three types of phenomena that classical mechanics and classical electrodynamics cannot describe:

1) quantization of some physical quantities;

2) corpuscular-wave dualism;

3) the existence of entangled quantum states.

Quantum mechanics can be formulated as a relativistic or non-relativistic theory. Although relativistic quantum mechanics is one of the most fundamental theories, nonrelativistic quantum mechanics is also often used for convenience.

Theoretical basis of quantum mechanics

Various formulations of quantum mechanics

One of the first formulations of quantum mechanics is the "wave mechanics" proposed by Erwin Schrödinger. In this approach, the state of the system under study is described by a "wave function", which determines the probability distributions of all measurable physical quantities of the system, such as energy, coordinates, momentum or angular momentum. From a mathematical point of view, the wave function is a complex square-integrable function of the coordinates of the system and of time.

In quantum mechanics, physical quantities are not associated with definite numerical values. Instead, predictions are made about the probability distribution of the values of the measured quantity. As a rule, these probabilities depend on the form of the state vector at the moment of measurement. To be more precise, each definite value of the measured quantity corresponds to a certain state vector, known as an "eigenstate" of the measured quantity.

Let us take a specific example. Imagine a free particle; its state vector is arbitrary. Our task is to determine the coordinate of the particle. An eigenstate of the coordinate is a state vector whose norm is concentrated at a certain point x and is zero everywhere else in space. If we now measure the coordinate of a particle in such a state, then with one-hundred-percent probability we obtain the value x itself.

Often the system we are interested in is not in an eigenstate of the physical quantity that we measure. Nevertheless, if we perform a measurement, the wave function instantly becomes an eigenstate of the quantity being measured. This process is called the collapse of the wave function. If we know the wave function at the moment before the measurement, we are able to calculate the probability of collapse into each of the possible eigenstates. For example, the free particle from the previous example will typically have a wave function that is a wave packet centered at some point x0, which is not an eigenstate of the coordinate. When we measure the particle's coordinate, it is impossible to predict what result we will get. It is likely, but not certain, that it will be close to x0, where the amplitude of the wave function is large. After the measurement, when some result x has been obtained, the wave function collapses into a coordinate eigenstate concentrated exactly at x.
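
The probabilistic character of such a coordinate measurement can be sketched numerically. Below, a Gaussian wave packet centered at an assumed point x0 is discretized on a grid, and measurement outcomes are drawn from the Born-rule distribution |ψ(x)|²; all numbers are illustrative.

    import numpy as np

    x = np.linspace(-10.0, 10.0, 2001)              # discretized coordinate axis
    x0, sigma = 1.5, 0.8                            # assumed packet center and width
    psi = np.exp(-(x - x0) ** 2 / (4 * sigma ** 2)) # Gaussian wave packet (unnormalized)
    prob = np.abs(psi) ** 2
    prob /= prob.sum()                              # Born rule: P(x) proportional to |psi(x)|^2

    rng = np.random.default_rng(0)
    outcomes = rng.choice(x, size=5, p=prob)        # five simulated position measurements
    print(outcomes)                                 # individual results scatter around x0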

State vectors are functions of time, ψ = ψ(t). The Schrödinger equation determines how the state vector changes with time.

Some state vectors give probability distributions that are constant in time. Many systems that are considered dynamic in classical mechanics are actually described by such "static" functions. For example, an electron in an unexcited atom is pictured in classical physics as a particle moving along a circular path around the nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric probability cloud around the nucleus.

The evolution of the state vector in time is deterministic in the sense that, given a definite state vector at the initial moment of time, one can predict exactly what it will be at any other moment. During the measurement process, however, the change of the state vector is probabilistic, not deterministic. The probabilistic nature of quantum mechanics thus manifests itself precisely in the measurement process.

There are several interpretations of quantum mechanics that attach different meanings to the very act of measurement. The main interpretation, generally accepted today, is the probabilistic one.

Physical foundations of quantum mechanics

The uncertainty principle states that there are fundamental obstacles to measuring two or more parameters of a system simultaneously with arbitrary accuracy. In the free-particle example, this means that it is fundamentally impossible to find a wave function that would be an eigenstate of both the momentum and the coordinate. From this it follows that the coordinate and the momentum cannot be determined simultaneously with arbitrary accuracy (their uncertainties satisfy Δx·Δp ≥ ħ/2). As the accuracy of the coordinate measurement increases, the maximum achievable accuracy of the momentum measurement decreases, and vice versa. Parameters for which such a statement holds are called canonically conjugate in classical physics.
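
For a Gaussian wave packet this bound can be checked numerically (a sketch with ħ set to 1 for convenience; the Gaussian is the state that saturates the bound):

    import numpy as np

    hbar = 1.0
    x = np.linspace(-40.0, 40.0, 8192)
    dx = x[1] - x[0]
    sigma = 0.8
    psi = (2 * np.pi * sigma ** 2) ** -0.25 * np.exp(-x ** 2 / (4 * sigma ** 2))

    # spread of the coordinate, computed from |psi(x)|^2
    px = np.abs(psi) ** 2
    mean_x = np.sum(x * px) * dx
    delta_x = np.sqrt(np.sum((x - mean_x) ** 2 * px) * dx)

    # spread of the momentum, computed from the Fourier transform of psi
    p = 2 * np.pi * hbar * np.fft.fftfreq(x.size, d=dx)
    weights = np.abs(np.fft.fft(psi)) ** 2
    weights /= weights.sum()
    delta_p = np.sqrt(np.sum(p ** 2 * weights) - np.sum(p * weights) ** 2)

    print(delta_x * delta_p, hbar / 2)   # both are close to 0.5: the product equals hbar/2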

Experimental base of quantum mechanics

There are experiments that cannot be explained without invoking quantum mechanics. The first kind of quantum effect is the quantization of certain physical quantities. If the free particle from the example considered above is localized in a rectangular potential well, a region of space of size L bounded on both sides by infinitely high potential barriers, then it turns out that the momentum of the particle can take only certain discrete values pₙ = πħn/L, where ħ is Planck's constant (divided by 2π) and n is an arbitrary natural number. Parameters that can take only discrete values are said to be quantized. Other examples of quantized parameters are the angular momentum, the total energy of a system confined in space, and the energy of electromagnetic radiation of a given frequency.
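
A short sketch of these discrete values for an electron in a well of assumed width L = 1 nm, using the standard infinite-well formulas pₙ = πħn/L and Eₙ = pₙ²/2m (numbers are only illustrative):

    import numpy as np

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    m_e = 9.109e-31          # electron mass, kg
    L = 1e-9                 # assumed well width: 1 nm
    eV = 1.602176634e-19     # joules per electronvolt

    for n in range(1, 5):
        p_n = np.pi * hbar * n / L     # allowed momentum magnitudes
        E_n = p_n ** 2 / (2 * m_e)     # corresponding energy levels
        print(n, round(E_n / eV, 2), "eV")   # roughly 0.38, 1.5, 3.4, 6.0 eV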

Another quantum effect is wave-particle duality. It can be shown that under certain conditions of the experiment, microscopic objects such as atoms or electrons acquire the properties of particles (that is, they can be localized in a certain region of space). Under other conditions, the same objects acquire the properties of waves and exhibit effects such as interference.

The next quantum effect is that of entangled quantum states. In some cases, the state vector of a system of many particles cannot be represented as a combination of individual wave functions corresponding to each of the particles. In this case, the states of the particles are said to be entangled. A measurement carried out on only one particle then leads to the collapse of the overall wave function of the system, i.e., such a measurement has an immediate effect on the wave functions of the other particles in the system, even if some of them are at a considerable distance. (This does not contradict special relativity, since information cannot be transmitted over a distance in this way.)

Mathematical apparatus of quantum mechanics

In the rigorous mathematical apparatus of quantum mechanics, which was developed by Paul Dirac and John von Neumann, the possible states of a quantum mechanical system are represented by state vectors in a complex separable Hilbert space. The evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian operator, or the Hamiltonian corresponding to the total energy of the system, determines its evolution in time.

Each measurable parameter (observable) of the system is represented by a Hermitian operator on the state space. Each eigenstate of the measured parameter corresponds to an eigenvector of the operator, and the corresponding eigenvalue is equal to the value of the measured parameter in that eigenstate. During the measurement process, the probability of the system passing into a given eigenstate is determined by the squared modulus of the scalar product of the eigenstate vector and the state vector before the measurement. The possible measurement results are the eigenvalues of the operator, which explains the choice of Hermitian operators: all their eigenvalues are real numbers. The probability distribution of the measured parameter can be obtained by calculating the spectral decomposition of the corresponding operator (here the spectrum of the operator is the set of all possible values of the corresponding physical quantity). The Heisenberg uncertainty principle corresponds to the fact that the operators of the corresponding physical quantities do not commute with each other. The details of the mathematical apparatus are set forth in the special article Mathematical Apparatus of Quantum Mechanics.
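
This recipe is easy to state concretely. In the sketch below (the operator and the state are made up purely for illustration), the possible outcomes are the eigenvalues of a Hermitian matrix and their probabilities are the squared moduli of the projections of the state vector onto the eigenvectors.

    import numpy as np

    A = np.array([[1.0, 0.5 - 0.5j],           # a made-up 2x2 Hermitian "observable"
                  [0.5 + 0.5j, -1.0]])
    psi = np.array([1.0, 1.0j]) / np.sqrt(2)   # a normalized state vector

    eigvals, eigvecs = np.linalg.eigh(A)           # real eigenvalues, orthonormal eigenvectors
    probs = np.abs(eigvecs.conj().T @ psi) ** 2    # Born rule: |<eigenstate|psi>|^2
    print(eigvals)                                 # possible measurement results
    print(probs, probs.sum())                      # their probabilities, summing to 1
    print(np.vdot(psi, A @ psi).real, probs @ eigvals)   # expectation value, computed two ways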

An analytical solution of the Schrödinger equation exists only for a small number of Hamiltonians, for example, for the harmonic oscillator and for the model of the hydrogen atom. Even the helium atom, which differs from the hydrogen atom by one electron, has no fully analytical solution of the Schrödinger equation. There are, however, methods for solving these equations approximately. For example, in the methods of perturbation theory the analytical solution of a simple quantum-mechanical model is used to obtain solutions for more complex systems by adding a certain "perturbation", for example in the form of an additional potential energy. Another method, the "semiclassical equation of motion", is applied to systems for which quantum mechanics produces only small deviations from classical behavior; such deviations can then be calculated by the methods of classical physics. This approach is important in the theory of quantum chaos, which has been developing rapidly in recent years.

Interaction with other theories

The fundamental principles of quantum mechanics are rather abstract. They state that the state space of a system is a Hilbert space and that physical quantities correspond to Hermitian operators acting in this space, but they do not specify which Hilbert space it is or which operators these are. They must be chosen appropriately in order to obtain a quantitative description of a quantum system. An important guide here is the correspondence principle, which states that quantum-mechanical effects cease to be significant, and the system acquires the features of a classical one, as its size increases. This "large system" limit is also called the classical limit or the correspondence limit. Alternatively, one can start from the classical model of a system and then try to guess the quantum model that reproduces the classical one in the correspondence limit.

When quantum mechanics was first formulated, it was applied to models that corresponded to the classical models of non-relativistic mechanics. For example, the well-known model of the harmonic oscillator uses an explicitly non-relativistic description of the kinetic energy of the oscillator, as does the corresponding quantum model.

The first attempts to connect quantum mechanics with the special theory of relativity led to the replacement of the Schrödinger equation by the Dirac equation. This theory was successful in explaining many experimental results, but ignored such facts as the relativistic creation and annihilation of elementary particles. A fully relativistic quantum theory requires the development of quantum field theory, which applies the notion of quantization to a field rather than to a fixed list of particles. The first completed quantum field theory, quantum electrodynamics, provides a fully quantum description of the processes of electromagnetic interaction.

The full apparatus of quantum field theory is often excessive for describing electromagnetic systems. A simpler approach, taken from quantum mechanics, treats charged particles as quantum-mechanical objects in a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electromagnetic field of the atom using the classical Coulomb potential (i.e., one inversely proportional to the distance). Such a "pseudo-classical" approach fails if quantum fluctuations of the electromagnetic field, such as the emission of photons by charged particles, begin to play a significant role.

Quantum field theories for the strong and weak nuclear interactions have also been developed. The quantum field theory of the strong interaction is called quantum chromodynamics and describes the interaction of subnuclear particles, quarks and gluons. The weak nuclear and electromagnetic interactions have been combined, in their quantum form, into a single quantum field theory called the electroweak theory.

So far, it has not been possible to build a quantum theory of gravity, the last of the fundamental forces. Pseudo-classical approximations work and have even predicted some effects, such as Hawking radiation. But the formulation of a complete theory of quantum gravity is complicated by contradictions between the general theory of relativity, the most accurate theory of gravity known today, and some fundamental provisions of quantum theory. The resolution of these contradictions is an area of active scientific research, and theories such as string theory are possible candidates for the title of a future theory of quantum gravity.

Applications of quantum mechanics

Quantum mechanics has had great success in explaining many phenomena of the surrounding world. The behavior of the microscopic particles that make up all forms of matter, electrons, protons, neutrons and so on, can often be satisfactorily explained only by the methods of quantum mechanics.

Quantum mechanics is important for understanding how individual atoms combine with one another to form chemical elements and compounds. The application of quantum mechanics to chemical processes is known as quantum chemistry. Quantum mechanics has provided a qualitatively new understanding of the processes of formation of chemical compounds, showing which molecules are energetically more favorable than others, and by how much. Most of the calculations performed in computational chemistry are based on quantum-mechanical principles.

Modern technology has already reached the point where quantum effects become important. Examples are lasers, transistors, electron microscopes, magnetic resonance imaging. The development of semiconductors led to the invention of the diode and transistor, which are indispensable in modern electronics.

Researchers today are in search of reliable methods for the direct manipulation of quantum states. Successful attempts have been made to create the foundations of quantum cryptography, which will allow guaranteed secret transmission of information. A more distant goal is the development of quantum computers, which are expected to be able to implement certain algorithms with much greater efficiency than classical computers. Another topic of active research is quantum teleportation, which deals with technologies for transmitting quantum states over considerable distances.

Philosophical aspect of quantum mechanics

From the very moment of the creation of quantum mechanics, its conclusions contradicted the traditional idea of ​​the world order, resulting in an active philosophical discussion and the emergence of many interpretations. Even such fundamental provisions as the rules of probability amplitudes and probability distribution formulated by Max Born waited decades to be accepted by the scientific community.

Another problem of quantum mechanics is that the nature of the object it investigates is not given directly: the coordinates of an object, or the spatial distribution of the probability of its presence, can be determined only if the object has certain properties (charge, for example) and is placed in certain external conditions (the presence of an electric potential, for example).

The Copenhagen interpretation, owed primarily to Niels Bohr, has been the basic interpretation of quantum mechanics from its inception to the present day. It argues that the probabilistic nature of quantum-mechanical predictions cannot be explained in terms of other, deterministic theories, and places limits on our knowledge of the environment. Quantum mechanics therefore provides only probabilistic results, the very nature of the universe being probabilistic, albeit deterministic in the new, quantum sense.

Albert Einstein, himself one of the founders of quantum theory, was uncomfortable with the fact that in this theory there is a departure from classical determinism in determining the values of the physical quantities of objects. He believed that the existing theory was incomplete and that there should be some additional, deeper theory. He therefore put forward a series of objections to quantum theory, the most famous of which became known as the EPR paradox. John Bell showed that this paradox leads to differences between the predictions of quantum theory and those of deterministic local theories that can actually be measured. Experiments have shown that quantum mechanics is correct, although certain loopholes in these experiments left questions that had not yet been fully answered.

Everett's many-worlds interpretation, formulated in 1956, proposes a model of the world in which all the possibilities for physical quantities to take particular values in quantum theory are realized simultaneously, in a "multiverse" assembled from largely independent parallel universes. The multiverse is deterministic, and we observe probabilistic behavior only because we cannot observe all the universes at once.

History

The foundation of quantum mechanics was laid in the first half of the 20th century by Max Planck, Albert Einstein, Werner Heisenberg, Erwin Schrödinger, Max Born, Paul Dirac, Richard Feynman and others. Some fundamental aspects of the theory still need to be studied. In 1900, Max Planck proposed the concept of energy quantization in order to obtain the correct formula for the radiation energy of a black body. In 1905, Einstein explained the nature of the photoelectric effect by postulating that the energy of light is absorbed not continuously, but in portions, which he called quanta. In 1913, Bohr explained the configuration of the spectral lines of the hydrogen atom, again using quantization. In 1924, Louis de Broglie proposed the hypothesis of wave-particle duality.

These theories, although successful, were too fragmentary and together constitute the so-called old quantum theory.

Modern quantum mechanics was born in 1925-1926, when Heisenberg developed matrix mechanics and Schrödinger proposed wave mechanics and his equation. Subsequently, John von Neumann proved that the two approaches are equivalent.

The next step came when Heisenberg formulated the uncertainty principle in 1927; around the same time the probabilistic interpretation began to take shape. In 1928 Paul Dirac combined quantum mechanics with special relativity. He was also the first to make extensive use of operator theory, including the now-popular bra-ket notation. In 1932 John von Neumann formulated the mathematical basis of quantum mechanics in the language of operator theory.

The era of quantum chemistry was started by Walter Heitler and Fritz London, who published the theory of the formation of covalent bonds in the hydrogen molecule in 1927. Subsequently, quantum chemistry was developed by a large community of scientists around the world.

Beginning in 1927, attempts were made to apply quantum mechanics to many-particle systems, resulting in the emergence of quantum field theory. Work in this direction was carried out by Dirac, Pauli, Weisskopf and Jordan. This line of research culminated in quantum electrodynamics, formulated by Feynman, Dyson, Schwinger and Tomonaga during the 1940s. Quantum electrodynamics is the quantum theory of electrons, positrons and the electromagnetic field.

The theory of quantum chromodynamics began to be formulated in the early 1960s. This theory, as we know it now, was proposed by Politzer, Gross and Wilczek in 1973. Building on the work of Schwinger, Higgs, Goldstone and others, Glashow, Weinberg and Salam independently showed that the weak nuclear interaction and quantum electrodynamics can be combined and viewed as a single electroweak interaction.

Quantization

In quantum mechanics, the quantization term is used in several close but different meanings.

Quantization is the discretization of the values of a physical quantity that is continuous in classical physics. For example, electrons in atoms can occupy only certain orbitals with definite energy values. Another example is the orbital angular momentum of a quantum-mechanical particle, which can have only quite definite values. The discretization of the energy levels of a physical system as its size decreases is called size quantization.
Quantization is also called the transition from the classical description of a physical system to a quantum one. In particular, the procedure for decomposing classical fields (for example, an electromagnetic field) into normal modes and representing them in the form of field quanta (for an electromagnetic field, these are photons) is called second quantization.

The main principles of quantum mechanics are W. Heisenberg's uncertainty principle and N. Bohr's complementarity principle.

According to the uncertainty principle, it is impossible to accurately determine the location of a particle and its momentum at the same time. The more precisely the location, or coordinate, of a particle is determined, the more uncertain its momentum becomes. Conversely, the more precisely the momentum is determined, the more uncertain its location remains.

This principle can be illustrated with the help of T. Young's experiment on interference. This experiment shows that when light passes through a system of two closely spaced small holes in an opaque screen, it behaves not like rectilinearly propagating particles, but like interacting waves, as a result of which an interference pattern appears on the surface located behind the screen in the form of alternating light and dark stripes. If, however, only one hole is left open in turn, then the interference pattern of the distribution of photons disappears.

The results of this experiment can be analyzed with the help of the following thought experiment. In order to determine the location of an electron, it must be illuminated, i.e., a photon must be directed at it. In the event of a collision of the two particles we can accurately determine the coordinates of the electron (the place where it was at the moment of the collision). However, because of the collision the electron inevitably changes its trajectory, since momentum is transferred to it from the photon. Therefore, if we determine the coordinate of the electron accurately, we at the same time lose knowledge of the trajectory of its subsequent motion. The collision of the electron with the photon is analogous to closing one of the holes in Young's experiment: the interference pattern is destroyed, or, what is the same thing, the trajectory of the electron becomes uncertain.

The meaning of the uncertainty principle. The uncertainty relation means that the principles and laws of Newton's classical dynamics cannot be used to describe processes involving micro-objects.

In essence, this principle means the rejection of determinism and the recognition of the fundamental role of randomness in processes involving micro-objects. In the classical description, the concept of randomness is used to describe the behavior of elements of statistical ensembles and is only a deliberate sacrifice of the completeness of the description for the sake of simplifying the solution of the problem. In the microworld, however, an exact prediction of the behavior of objects, giving the values of the parameters traditional for the classical description, is in general impossible. There are still lively discussions on this subject: adherents of classical determinism, without denying the possibility of using the equations of quantum mechanics for practical calculations, see in the randomness they take into account the result of our still incomplete understanding of the laws governing the behavior of micro-objects. A. Einstein was an adherent of this approach. Although he was one of the founders of modern physics, who dared to revise the seemingly unshakable positions of the classical approach, he did not consider it possible to abandon the principle of determinism in natural science. The position of A. Einstein and his supporters on this issue is expressed in his well-known and very figurative remark that it is hard to believe in a God who throws dice each time he has to decide on the behavior of micro-objects. However, no experimental facts have been found so far that indicate the existence of internal mechanisms controlling the "random" behavior of micro-objects.

It should be emphasized that the uncertainty principle is not connected with any shortcomings in the design of measuring instruments. It is fundamentally impossible to create a device that would measure the coordinate and the momentum of a microparticle with equal accuracy simultaneously. The uncertainty principle is a manifestation of the corpuscular-wave dualism of nature.

It also follows from the uncertainty principle that quantum mechanics rejects the possibility, postulated in classical natural science, of performing measurements and observations of objects and of the processes occurring with them without affecting the evolution of the system under study.

The uncertainty principle is a particular case of the more general principle of complementarity. It follows from the principle of complementarity that if in some experiment we can observe one side of a physical phenomenon, then at the same time we are deprived of the opportunity to observe the side complementary to it. Complementary properties, which manifest themselves only in different experiments carried out under mutually exclusive conditions, are, for example, the position and the momentum of a particle, or the wave and corpuscular nature of matter or of radiation.

The principle of superposition is of great importance in quantum mechanics. The superposition principle is the assumption that the resulting effect is the sum of the effects produced by each influencing phenomenon separately. One of the simplest examples is the parallelogram rule, according to which two forces acting on a body are added. In the microworld, the superposition principle is a fundamental principle, which, together with the uncertainty principle, forms the basis of the mathematical apparatus of quantum mechanics. In relativistic quantum mechanics, which assumes the mutual transformation of elementary particles, the superposition principle must be supplemented by the superselection principle. For example, in the annihilation of an electron and a positron the superposition principle is supplemented by the principle of conservation of electric charge: before and after the transformation the sum of the charges of the particles must be the same. Since the charges of the electron and the positron are equal and opposite, only uncharged particles can appear, namely the photons born in this annihilation process.

BASIC PRINCIPLES OF QUANTUM MECHANICS.


In 1900, the German physicist Max Planck suggested that the emission and absorption of light by matter occur in finite portions, quanta, and that the energy of each quantum is proportional to the frequency of the emitted radiation:

E = hν,

where ν is the frequency of the emitted (or absorbed) radiation and h is a universal constant called Planck's constant. According to modern data,

h = (6.62618 ± 0.00004)·10⁻³⁴ J·s.

Planck's hypothesis was the starting point for the emergence of quantum concepts, which formed the basis of a fundamentally new physics - the physics of the microworld, called quantum physics. The deep ideas of the Danish physicist Niels Bohr and his school played a huge role in its formation. At the root of quantum mechanics lies a consistent synthesis of corpuscular and wave properties of matter. A wave is a very extended process in space (remember waves on water), and a particle is a much more local object than a wave. Light under certain conditions behaves not like a wave, but like a stream of particles. At the same time, elementary particles sometimes exhibit wave properties. Within the framework of the classical theory, it is impossible to combine wave and corpuscular properties. For this reason, the creation of a new theory that describes the patterns of the microcosm has led to the rejection of conventional ideas that are valid for macroscopic objects.

From a quantum point of view, both light and particles are complex objects that exhibit both wave and particle properties (the so-called wave-particle duality). The creation of quantum physics was stimulated by attempts to comprehend the structure of the atom and the regularities of the emission spectra of atoms.

At the end of the 19th century it was discovered that when light falls on the surface of a metal, electrons are emitted from it. This phenomenon was called the photoelectric effect.

In 1905, Einstein explained the photoelectric effect on the basis of the quantum hypothesis. He assumed that the energy in a beam of monochromatic light consists of portions whose size is equal to hν. The physical dimension of h is time·energy = length·momentum = angular momentum. This dimension is possessed by the quantity called action, and in this connection h is called the elementary quantum of action. According to Einstein, an electron in a metal, having absorbed such a portion of energy, performs the work of escaping from the metal and acquires the kinetic energy

Eₖ = hν − A_out.

This is Einstein's equation for the photoelectric effect.
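
A trivial numerical illustration of this equation (the work function below is an assumed round value of about 2.3 eV, typical of an alkali metal):

    h = 6.626e-34        # Planck's constant, J*s
    c = 3.0e8            # speed of light, m/s
    eV = 1.602e-19       # joules per electronvolt
    A_out = 2.3 * eV     # assumed work function of the metal

    wavelength = 400e-9                 # violet light, 400 nm
    E_photon = h * c / wavelength       # photon energy h*nu
    E_k = E_photon - A_out              # Einstein's photoelectric equation
    print(E_photon / eV, E_k / eV)      # about 3.1 eV per photon, about 0.8 eV per electron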

Discrete portions of light were later (in 1926) called photons.

In science, when constructing a mathematical apparatus, one should always proceed from the nature of the observed experimental phenomena. The Austrian physicist Erwin Schrödinger achieved great success by trying a different strategy of scientific research: first the mathematics, and then an understanding of its physical meaning and, as a result, an interpretation of the nature of quantum phenomena.

It was clear that the equations of quantum mechanics must be wave-like (after all, quantum objects have wave properties) and must have discrete solutions (elements of discreteness are inherent in quantum phenomena). Equations of this kind were known in mathematics. Guided by them, Schrödinger suggested using the concept of the wave function ψ. For a particle moving freely along the x axis, the wave function is ψ = exp(−(i/ħ)(Et − px)), where p is the momentum, x the coordinate, E the energy and ħ Planck's constant (divided by 2π). The function ψ is usually called a wave function because an exponential (oscillating) function is used to describe it.

The state of a particle in quantum mechanics is described by a wave function, which makes it possible to determine only the probability of finding a particle at a given point in space. The wave function does not describe the object itself or even its potentialities. Operations with the wave function make it possible to calculate the probabilities of quantum mechanical events.

The fundamental principles of quantum physics are the principles of superposition, uncertainty, complementarity and identity.

The principle of superposition in classical physics makes it possible to obtain the resulting effect of the superposition of several independent influences as the sum of the effects produced by each influence separately. It is valid for systems or fields described by linear equations. This principle is very important in mechanics, in the theory of oscillations and in the wave theory of physical fields. In quantum mechanics, the principle of superposition refers to wave functions: if a physical system can be in states described by two or more wave functions ψ₁, ψ₂, …, ψₙ, then it can also be in a state described by any linear combination of these functions:

Ψ = c₁ψ₁ + c₂ψ₂ + … + cₙψₙ,

where c₁, c₂, …, cₙ are arbitrary complex numbers.

The principle of superposition is a refinement of the corresponding concepts of classical physics. According to the latter, in a medium that does not change its properties under the influence of perturbations, waves propagate independently of each other. Consequently, the resulting perturbation at any point in the medium when several waves propagate in it is equal to the sum of the perturbations corresponding to each of these waves:

S = S₁ + S₂ + … + Sₙ,

where S₁, S₂, …, Sₙ are the perturbations produced by each of the waves. A non-harmonic wave can itself be represented as a sum of harmonic waves.

The uncertainty principle states that it is impossible to determine simultaneously two complementary characteristics of a microparticle, for example its velocity and its coordinate. It reflects the dual corpuscular-wave nature of elementary particles. The inaccuracies in the simultaneous determination of complementary quantities in an experiment are related by the uncertainty relation established in 1927 by Werner Heisenberg. The uncertainty relation states that the product of the inaccuracies of any pair of complementary quantities (for example, the coordinate and the projection of the momentum onto it, or the energy and the time) is no smaller than a quantity of the order of Planck's constant h. The uncertainty relations show that the more definite the value of one of the parameters entering the relation, the more uncertain the value of the other parameter, and vice versa; the parameters cannot be measured simultaneously with arbitrary accuracy.

Classical physics has taught us that all parameters of objects and the processes occurring with them can be measured simultaneously with any accuracy. This position is refuted by quantum mechanics.

The Danish physicist Niels Bohr came to the conclusion that quantum objects are relative to the means of observation. The parameters of quantum phenomena can be judged only after their interaction with the means of observation, i.e., with instruments. The behavior of atomic objects cannot be sharply separated from their interaction with the measuring instruments that fix the conditions under which these phenomena occur. It must also be taken into account that the instruments used to measure the parameters are of different types. Data obtained under different experimental conditions should be considered complementary in the sense that only the totality of different measurements can give a complete picture of the properties of the object. This is the content of the complementarity principle.

In classical physics, measurement was considered not to perturb the object of study: the measurement leaves the object unchanged. According to quantum mechanics, each individual measurement destroys the state of the micro-object, and to carry out a new measurement the micro-object has to be prepared anew. This complicates the synthesis of measurements. In this connection Bohr asserted the complementarity of quantum measurements. The data of classical measurements are not complementary; they have independent meaning, independently of one another. Complementarity arises where the objects under study are indistinguishable from one another and mutually interconnected.

Bohr related the principle of complementarity not only to the physical sciences: "the wholeness of living organisms and the characteristics of people possessing consciousness, as well as human cultures, represent features of wholeness whose description requires a typically complementary mode of description." However, this idea of Bohr's did not receive proper development.

The features and specificity of the interactions between the components of complex micro- and macrosystems, as well as the external interactions between them, lead to their enormous diversity. Individuality is characteristic of micro- and macrosystems; each system is described by a set of all possible properties inherent only to it. One can name the differences between the nucleus of hydrogen and that of uranium, although both belong to microsystems. There are no fewer differences between the Earth and Mars, although these planets belong to the same solar system.

At the same time, it is possible to speak of the identity of elementary particles. Identical particles have the same physical properties: mass, electric charge and other internal characteristics. For example, all the electrons of the Universe are considered identical. Identical particles obey the identity principle, a fundamental principle of quantum mechanics, according to which the states of a system of particles obtained from one another by interchanging identical particles cannot be distinguished in any experiment.

This principle is the main difference between classical and quantum mechanics. In quantum mechanics, identical particles are devoid of individuality.

STRUCTURE OF THE ATOM AND THE NUCLEUS. ELEMENTARY PARTICLES.

The first ideas about the structure of matter arose in Ancient Greece in the 6th-4th centuries BC. Aristotle considered matter to be continuous, i.e., divisible into arbitrarily small parts, never reaching a smallest particle that could not be divided further. Democritus believed that everything in the world consists of atoms and emptiness. Atoms are the smallest particles of matter, the word meaning "indivisible", and in Democritus's picture atoms are spheres with a jagged surface.

Such a worldview existed until the end of the 19th century. In 1897, Joseph John Thomson (1856-1940) discovered an elementary particle, which was called the electron. It was found that the electron flies out of atoms and has a negative electric charge. The magnitude of the electron charge is e = 1.6·10⁻¹⁹ C (coulomb), and the electron mass is m = 9.11·10⁻³¹ kg.

After the discovery of the electron, Thomson in 1903 put forward the hypothesis that the atom is a sphere on which a positive charge is smeared, and electrons with negative charges are interspersed in the form of raisins. The positive charge is equal to the negative, in general, the atom is electrically neutral (the total charge is 0).

In 1911, in the course of his experiments, Ernest Rutherford found that the positive charge is not spread over the volume of the atom but occupies only a small part of it. After that he put forward a model of the atom which later became known as the planetary model. According to this model, an atom is indeed a sphere, at the center of which there is a positive charge occupying a small part of the sphere, about 10⁻¹³ cm across; the negative charge is located on the outer, so-called electron shell.

A more refined quantum model of the atom was proposed in 1913 by the Danish physicist N. Bohr, who worked in Rutherford's laboratory. He took Rutherford's model of the atom as a basis and supplemented it with new hypotheses that contradict classical ideas. These hypotheses are known as Bohr's postulates. They reduce to the following.

1. Each electron in an atom can perform stable orbital motion along a certain orbit, with a definite energy value, without emitting or absorbing electromagnetic radiation. In these states, atomic systems have energies forming a discrete series: E₁, E₂, …, Eₙ. Any change in energy as a result of the emission or absorption of electromagnetic radiation can occur only in a jump from one state to another.

2. When an electron passes from one stationary orbit to another, energy is emitted or absorbed. If in the transition of an electron from one orbit to another the energy of the atom changes from Eₘ to Eₙ, then hν = Eₘ − Eₙ, where ν is the radiation frequency.

Bohr used these postulates to calculate the simplest atom, the hydrogen atom.
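
A small sketch combining both postulates for hydrogen: the Bohr energy levels Eₙ = −13.6 eV / n² and the wavelengths of the Balmer transitions n → 2 (the numerical values are approximate):

    h = 6.626e-34    # Planck's constant, J*s
    c = 3.0e8        # speed of light, m/s
    eV = 1.602e-19   # joules per electronvolt
    E1 = -13.6 * eV  # ground-state energy of hydrogen in the Bohr model

    def level(n):
        return E1 / n ** 2               # E_n = E_1 / n^2

    for n in (3, 4, 5):                  # Balmer series: transitions n -> 2
        E_photon = level(n) - level(2)   # h*nu = E_m - E_n
        wavelength = h * c / E_photon
        print(n, "-> 2:", round(wavelength * 1e9), "nm")   # about 656, 486, 434 nm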

The region in which the positive charge is concentrated is called the nucleus. It was assumed that the nucleus consists of positive elementary particles. These particles, called protons (in Greek, proton means "first"), were discovered by Rutherford in 1919. Their charge is equal in magnitude to the electron charge (but positive), and the proton mass is 1.6724·10⁻²⁷ kg. The existence of the proton was confirmed by an artificial nuclear reaction converting nitrogen into oxygen: nitrogen atoms were irradiated with helium nuclei, and the result was oxygen and a proton. The proton is a stable particle.

In 1932, James Chadwick discovered a particle that has no electric charge and a mass almost equal to that of the proton. This particle was called the neutron. The mass of the neutron is 1.675·10⁻²⁷ kg. The neutron was discovered by irradiating a beryllium plate with alpha particles. The free neutron is an unstable particle. Its lack of charge explains its ability to penetrate easily into atomic nuclei.

The discovery of the proton and the neutron led to the creation of the proton-neutron model of the atom. It was proposed in 1932 by the Soviet physicists Ivanenko and Gapon and by the German physicist Heisenberg. According to this model, the nucleus of an atom consists of protons and neutrons, with the exception of the hydrogen nucleus, which consists of a single proton.

The charge of the nucleus is determined by the number of protons in it and is denoted by the symbol Z. Practically the entire mass of an atom is contained in the mass of its nucleus and is determined by the masses of the protons and neutrons entering it, since the mass of an electron is negligible compared with the masses of the proton and the neutron. The atomic number in Mendeleev's periodic table corresponds to the charge of the nucleus of the given chemical element. The mass number A of an atom is equal to the total number of neutrons and protons: A = Z + N, where Z is the number of protons and N the number of neutrons. Conventionally, any element is denoted by its symbol X with the mass number A written as a superscript and the nuclear charge Z as a subscript.

There are nuclei that contain the same number of protons but different numbers of neutrons, i.e., have different mass numbers. Such nuclei are called isotopes. For example, ¹H is ordinary hydrogen, ²H is deuterium and ³H is tritium. The most stable nuclei are those in which the number of protons is equal to the number of neutrons, or in which either of these numbers equals one of the magic numbers 2, 8, 20, 28, 50, 82, 126.

The dimensions of the atom are approximately 10⁻⁸ cm, while the nucleus is about 10⁻¹³ cm in size; between the nucleus and the boundary of the atom there is a space that is huge by the scales of the microworld. The density inside the nucleus of an atom is enormous, approximately 1.5·10⁸ t/cm³. Chemical elements with mass number A < 50 are called light, and those with A > 50 heavy. The nuclei of heavy elements are somewhat "overcrowded", i.e., an energetic precondition for their radioactive decay arises there.

The energy required to split a nucleus into its constituent nucleons is called the binding energy. (Nucleons is the generic name for protons and neutrons; the word means "nuclear particles".)

E_b = Δm·c²,

where Δm is the mass defect of the nucleus (the difference between the total mass of the nucleons forming the nucleus and the mass of the nucleus itself) and c is the speed of light.
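
For example, a rough calculation of the binding energy of the helium-4 nucleus from its mass defect (the masses below are approximate literature values in atomic mass units):

    m_p = 1.007276       # proton mass, u (approximate)
    m_n = 1.008665       # neutron mass, u (approximate)
    m_He4 = 4.001506     # mass of the He-4 nucleus, u (approximate)
    u_in_MeV = 931.494   # energy equivalent of 1 u, MeV

    delta_m = 2 * m_p + 2 * m_n - m_He4   # mass defect of the nucleus
    E_b = delta_m * u_in_MeV              # E_b = delta_m * c^2, expressed in MeV
    print(round(delta_m, 4), "u", round(E_b, 1), "MeV")   # about 0.0304 u and 28.3 MeV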

In 1928, the theoretical physicist Dirac proposed his theory of the electron. Elementary particles can behave like waves: they exhibit wave-particle duality. Dirac's theory made it possible to determine when an electron behaves like a wave and when like a particle. He concluded that there must exist an elementary particle with the same properties as the electron but with a positive charge. Such a particle was indeed discovered in 1932 and named the positron: the American physicist Anderson found in a photograph of cosmic rays the track of a particle similar to the electron but with a positive charge.

It followed from the theory that an electron and a positron, interacting with each other (the annihilation reaction), form a pair of photons, i.e., quanta of electromagnetic radiation. The reverse process is also possible, when a photon, interacting with a nucleus, turns into an electron-positron pair. Each particle is associated with a wave function, the square of whose amplitude is equal to the probability of finding the particle in a given volume.

In the 1950s, the existence of the antiproton and antineutron was proved.

Even 30 years ago it was believed that neutrons and protons are elementary particles, but experiments on the interaction of protons and electrons moving at high speeds showed that protons consist of even smaller particles. These particles were first described by Gell-Mann, who called them quarks. Several varieties of quarks are known; it is assumed that there are six flavors: the u-quark (up), d-quark (down), s-quark (strange), c-quark (charm), b-quark (beauty) and t-quark (truth).

A quark of each flavor carries one of three colors: red, green or blue. This is merely a label, since quarks are much smaller than the wavelength of visible light and therefore have no color in the ordinary sense.

Let us consider some characteristics of elementary particles. In quantum mechanics, each particle is assigned a special intrinsic mechanical moment that is associated neither with its motion in space nor with its rotation. This intrinsic mechanical moment is called spin. If you rotate an electron by 360°, you would expect it to return to its original state; in fact, the original state is reached only after another 360° rotation. That is, to return an electron to its original state it must be rotated by 720°; compared with such a particle, we perceive, so to speak, only half of the world. For example, on a double loop of wire a bead returns to its original position only after a 720° turn. Such particles have half-integer spin ½. The spin tells us what the particle looks like when viewed from different directions. For example, a particle with spin 0 looks like a point: it looks the same from all sides. A particle with spin 1 can be compared with an arrow: it looks different from different directions and returns to its former appearance after a rotation through 360°. A particle with spin 2 can be compared with an arrow sharpened at both ends: any of its positions is repeated after half a turn (180°). Particles with higher spin return to their original state after rotation through an even smaller fraction of a full revolution.
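
The "720° property" of a spin-½ particle can be seen directly from the standard rotation matrix for a spinor; the sketch below rotates about the z axis by an angle θ and checks that a 360° turn changes the sign of the state, while only a 720° turn restores it exactly.

    import numpy as np

    def spinor_rotation(theta):
        # rotation of a spin-1/2 state about the z axis by the angle theta
        return np.array([[np.exp(-1j * theta / 2), 0],
                         [0, np.exp(1j * theta / 2)]])

    I = np.eye(2)
    print(np.allclose(spinor_rotation(2 * np.pi), -I))   # True: a 360 degree rotation gives minus the state
    print(np.allclose(spinor_rotation(4 * np.pi), I))    # True: 720 degrees returns the state exactly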

Particles with half-integer spin are called fermions, and particles with integer spin are called bosons. Until recently it was believed that bosons and fermions are the only possible types of indistinguishable particles. In fact (in two-dimensional systems) there are a number of intermediate possibilities, and fermions and bosons are only two limiting cases; such particles are called anyons.

Particles of matter obey the Pauli exclusion principle, discovered in 1925 by the Austrian physicist Wolfgang Pauli. The Pauli principle states that in a system of identical particles with half-integer spin no two particles can occupy the same quantum state. There are no such restrictions for particles with integer spin. This means that two identical particles of matter cannot have the same coordinates and velocities to within the accuracy set by the uncertainty principle. If particles of matter have very close coordinates, their velocities must be different, and consequently they cannot remain at points with these coordinates for long.

In quantum mechanics, all forces and interactions between particles are assumed to be carried by particles with integer spin 0, 1 or 2. This happens as follows: a particle of matter emits a particle that is the carrier of the interaction (for example, a photon); as a result of the recoil, the velocity of the emitting particle changes. The carrier particle then "hits" another particle of matter and is absorbed by it. This collision changes the velocity of the second particle, as if a force were acting between these two particles of matter. The carrier particles exchanged between particles of matter are called virtual because, unlike real ones, they cannot be registered with a particle detector. They exist, however, because they produce an effect that can be measured.

Carrier particles can be classified into four types according to the magnitude of the interaction they carry and according to the particles with which they interact:

1) Gravitational force. Any particle is under the action of a gravitational force, the magnitude of which depends on the mass and energy of the particle. This is a weak force. Gravitational forces act over long distances and are always attractive forces. So, for example, the gravitational interaction keeps the planets in their orbits and us on Earth.

In the quantum-mechanical approach to the gravitational field, it is assumed that the force acting between particles of matter is carried by a particle with spin 2, commonly called the graviton. The graviton has no mass of its own, and consequently the force it carries is long-range. The gravitational interaction between the Sun and the Earth is explained by the fact that the particles making up the Sun and the Earth exchange gravitons. The effect of the exchange of these virtual particles is measurable, for this effect is the revolution of the Earth around the Sun.

2) The next kind of interaction is created by electromagnetic forces, which act between electrically charged particles. The electromagnetic force is much stronger than the gravitational one: the electromagnetic force acting between two electrons is about 10⁴⁰ times greater than the gravitational force. The electromagnetic interaction determines the existence of stable atoms and molecules (the interaction between electrons and protons). The carrier of the electromagnetic interaction is the photon.

3) The weak interaction. It is responsible for radioactivity and exists between all particles of matter with spin ½. The weak interaction ensures the long and steady burning of our Sun, which provides the energy for all biological processes on Earth. The carriers of the weak interaction are three particles, the W± and Z⁰ bosons. They were discovered only in 1983. The radius of the weak interaction is extremely small, and therefore its carriers must have large masses. In accordance with the uncertainty principle, the lifetime of particles with such a large mass must be extremely short, about 10⁻²⁶ s.

4) The strong interaction is the interaction that keeps quarks inside protons and neutrons, and protons and neutrons inside the atomic nucleus. The carrier of the strong interaction is considered to be a particle with spin 1, commonly called the gluon. Gluons interact only with quarks and with other gluons. Quarks, thanks to gluons, are bound in pairs or triplets. The strong force weakens at high energies, and quarks and gluons begin to behave like free particles; this property is called asymptotic freedom. In experiments on powerful accelerators, photographs have been obtained of the tracks of particles produced by quarks born in collisions of high-energy protons and antiprotons. The strong interaction ensures the relative stability and existence of atomic nuclei. The strong and weak interactions are characteristic of processes in the microworld leading to mutual transformations of particles.

The strong and weak interactions became known to man only in the first third of the 20th century, in connection with the study of radioactivity and with the interpretation of the results of the bombardment of atoms of various elements by α-particles. Alpha particles knock out both protons and neutrons. This line of reasoning led physicists to conclude that protons and neutrons sit in the nuclei of atoms, tightly bound to each other; hence there are strong interactions. On the other hand, radioactive substances emit α-, β- and γ-rays. When in 1934 Fermi created the first theory adequate to the experimental data, he had to assume the presence in atomic nuclei of interactions of negligible intensity, which came to be called weak.

Efforts are now being made to combine the electromagnetic, weak, and strong forces to form the so-called GRAND UNIFIED THEORY. This theory sheds light on our very existence. It is possible that our existence is a consequence of the formation of protons. Such a picture of the beginning of the Universe seems to be the most natural. The terrestrial matter mainly consists of protons, but there are neither antiprotons nor antineutrons in it. Experiments with cosmic rays have shown that the same is true for all matter in our galaxy.

Characteristics of strong, weak, electromagnetic and gravitational interactions are given in the table.

The order of intensity of each interaction, indicated in the table, is determined in relation to the intensity of the strong interaction, taken as 1.

Let us give a classification of the most well-known elementary particles at the present time.

PHOTON. Its rest mass and electric charge are equal to 0. The photon has integer spin and is a boson.

LEPTONS. This class of particles does not participate in the strong interaction but takes part in the electromagnetic, weak and gravitational interactions. Leptons have half-integer spin and are fermions. The elementary particles of this group are assigned a certain characteristic called the lepton charge. The lepton charge, unlike the electric charge, is not the source of any interaction, and its role has not yet been fully elucidated. The value of the lepton charge is L = 1 for leptons, L = −1 for antileptons and L = 0 for all other elementary particles.

MESONS. These are unstable particles that participate in the strong interaction. The name "mesons" means "intermediate" and is due to the fact that the first mesons discovered had masses greater than that of the electron but less than that of the proton. Today mesons are known whose masses are greater than that of the proton. All mesons have integer spin and are therefore bosons.

BARYONS. This class includes the heavy elementary particles with half-integer spin (fermions) and mass not less than that of the proton. The only stable baryon is the proton; the neutron is stable only inside the nucleus. Baryons participate in all four types of interaction. In all nuclear reactions and interactions, their total number remains unchanged.
