How does the engine of progress work?

On improving the methods of teaching physics in Russia: from the 18th to the 21st centuries.

Physics. Why did it explode, how do we calculate it, what is it, why does it happen, why this particular part, where does the energy go? Hundreds of questions. Some have long-established answers, some have only partial ones, and many have no answers at all. How has the teaching of one of the most important disciplines changed over the past three centuries?
An important feature of physics is its close connection with the development of society and its material culture: it can in no way be a "thing in itself". Physics depends on the level of development of society and is at the same time the engine of its productive forces. That is why it is precisely the science of nature and its laws that can be regarded as the "cross-section" showing a country's scientific potential and the vector of its development.

Chapter One. The 18th century

Initially, individual topics in physics (taught according to Aristotle) were studied as part of the philosophy course at the two largest Slavic-Greek-Latin academies: the Kyiv-Mohyla Academy and the Moscow Academy. Only in the early 18th century did physics emerge as an independent subject, separated from natural philosophy and acquiring its own goals and objectives, as befits a real discipline. Instruction nevertheless continued in the classical languages, Latin and Greek, which significantly limited the range of subjects studied.

Nevertheless, looking ahead, we note that work on the creation of domestic methodological literature on physics began in Russia much earlier than in the West. After all, physics was introduced as a school subject in Russia at the end of the 18th century, whereas in Europe this happened only at the end of the 19th.

In the meantime: Peter the Great. The name alone says it all: the Europeanization of education, its spread and popularization. Beards have nothing to do with it; forget about the beards. The widespread opening of new educational institutions allowed physics to reach a new level and, in the second half of the 18th century, to become a separate subject at universities.


Line UMK A. V. Peryshkin. Physics (grades 7-9)
In the revised edition of the teaching materials, a summary section has been added at the end of each chapter, with brief theoretical information and test tasks for self-assessment. The textbooks have also been supplemented with assignments of different types aimed at developing metasubject skills: comparison and classification, formulating a reasoned opinion, working with various sources of information (including electronic resources and the Internet), and solving computational, graphical and experimental problems.

Since 1757, lectures in physics at Moscow University have been accompanied by demonstrations of experiments. By the middle of the century, the equipping of universities with instruments made it possible to move from the "chalk-and-talk" stage to a more advanced one, "instrument physics"; in most cases, however, the study of physical phenomena was not merely accompanied by, but reduced to, a detailed study of instruments. The student had a clear idea of how rods, plates, thermometers and the voltaic pile worked.

Chapter Two. The 19th century

What does the success of teaching any subject depend on? On the quality of the programs, methods, material base and language of the textbooks, on the availability of physical instruments and reagents, and on the level of the teacher himself.

During the period in question, there was no unified physics program either at school or at university. What did schools do? Schools worked from materials developed within their educational district, while universities relied either on the course of an authoritative author or on an author's course approved by the council of professors.

Everything changed in the second half of the century. The physics cabinet of Moscow University grew, its collection of demonstration instruments increased, directly improving the effectiveness of teaching. And the physics program of 1872 recommended giving students thorough knowledge, "limiting the number of facts for each department of phenomena and studying them thoroughly, rather than accumulating a huge amount of superficial information." Quite logical, given that physical theory at the time was coherent and free of deeply unsettled dilemmas.

How was physics taught? Let's talk about methods.

Of the pedagogical work of Nikolai Alekseevich Lyubimov, an outstanding Russian physicist, professor and one of the founders of the Moscow Mathematical Society, it was written: “The pedagogical activity of N. A. at Moscow University undoubtedly represented a significant step forward. In organizing the teaching of physics one had to start almost from the ABC, and bringing it to the perfection it reached in the hands of N. A. required great effort and remarkable ability.” So: is the ABC a metaphor or the real state of affairs? It seems to have been quite real, and rather similar to the current state of affairs in many educational institutions.


One of the most popular methods of teaching physics in the 19th century was rote memorization of the material - at first from lecture notes, later from short textbooks. Not surprisingly, the state of students' knowledge was alarming. The same Nikolai Alekseevich spoke quite bluntly about the level of knowledge of gymnasium students:

“The greatest drawback of teaching in our country is that it provides only superficial information... We have had to listen to more than a hundred answers at examinations. The impression is always the same: the respondent does not understand what he himself is proving.”

The outstanding Russian surgeon, naturalist and teacher Nikolai Ivanovich Pirogov held the same opinion, arguing that what matters is not only the personal qualities of the teacher but also the methods of his work.

“It is time for us to understand that the duty of the gymnasium teacher does not consist only in the communication of scientific information, and that the main task of pedagogy is precisely how this information will be communicated to students.”

Understanding the fallacy of this approach made it possible to move to a method of experimental teaching fundamentally new compared with the eighteenth century. At the forefront was no longer the detailed study of instruments and the memorization of text, but the independent acquisition of new knowledge through the analysis of experiments. The inventory of Moscow University's instruments, compiled in 1854, listed 405 items: most belonged to mechanics, about 100 to electricity and magnetism, and about 50 to heat. It was the standard set of any physics cabinet, instruments whose descriptions could be found in any textbook: the Archimedean screw, siphons, the windlass, the lever, Heron's fountain, the barometer, the hygrometer.


The charter of 1864 required both real gymnasiums (where natural-science subjects had priority) and classical gymnasiums to have physics classrooms at their disposal, and the former a chemistry classroom as well. The active development of physics in the 1860s, its inseparable connection with industry and technology, the general rise in the level of students, and the growing number of those wishing to devote themselves to an applied discipline affecting the future of the fatherland led to a "scientific famine". What does that mean? An acute shortage of specialists with practical experience of scientific work. How to solve the problem? That's right: teach people how to work, and teach them how to teach.


The first generalizing work on the methodology of teaching physics was Fyodor Shvedov's book "Methodology of Physics", published in 1894. It considered the structure of the training course and the classification of methods with their psychological justification, and for the first time described the tasks of the subject.

“The task of the science of methodology is not only to develop the art, the virtuosity so to speak, of presentation, but mainly to clarify the logical foundations of science, which could serve as a starting point both for the choice of material and for the order of its arrangement in each course, in accordance with the purpose for which it is intended.”

This idea was progressive for its time and, moreover, has lost none of its significance today.

The pre-revolutionary period was characterized by a sharp increase in the number of methodological publications. If one collects the innovative ideas contained in the works of Lermanov, Glinka, Baranov and Kashin, an interesting list emerges:

  • The introduction of "fruitful" rather than "sterile" theoretical knowledge.
  • Wide use of demonstration experiments.
  • A two-stage system of instruction.
  • The development and use of home-made devices.
  • The perception of physics as a worldview-forming discipline.
  • The experimental method as one of the foundations of teaching.
  • The use of induction and deduction.
  • A creative combination of theory and experiment.

It was the expansion of scientific laboratories, the introduction of laboratory practice into gymnasium and university education, and the development of scientific research that led to the surge of scientific discoveries at the turn of the century. Many of these trends have remained unchanged to this day, ensuring continuity and constant improvement in the teaching of one of the most important disciplines for understanding the world.

Chapter Three. The 20th century


Line UMK N. S. Purysheva. Physics (grades 10-11)
The basis of the course, written according to the author's program, is an inductive approach: the path to theoretical constructions lies through everyday experience, observations of the surrounding reality and simple experiments. Great attention is given to schoolchildren's practical work and to a differentiated approach to learning. The textbooks make it possible to organize both individual and group work of high school students, developing the skills of independent activity and of teamwork alike.

The greatest breakthrough was into the microworld: quantum theory, special relativity, nuclear physics and high-energy physics. All of this had to be explained to schoolchildren and students. In half a century the picture of the world had changed, which meant that pedagogical practice had to change as well.


How was the teaching of physics organized in Russia after the 1917 revolution? The construction of a new unified labor school on socialist principles radically changed the content and methods of education:

  • The importance of physics was recognized both in the curriculum and in teaching.
  • Scientific research institutes and centers for the pedagogical sciences were created, and departments of methodology were organized in pedagogical universities.
  • Soviet physics did not discard the achievements and progressive trends of the pre-revolutionary period, BUT.
  • Its distinctive feature (how could there not be one?) was materialism: the content of research was inseparable from the needs and direction of the country. And the fight against formalism - why not, in fact.

In the middle of the 20th century the whole world experienced a scientific and technological revolution, in which the role of Soviet scientists was invaluable. The level of Soviet technical education is legendary. From the end of the 1950s until 1989, when the country entered a new period of crisis, physics developed intensively, and teaching methodology responded to a number of challenges:

  • The new course had to correspond to the latest achievements of science and technology. The textbooks of 1964 already contained information about ultrasound, artificial Earth satellites, weightlessness, polymers, the properties of semiconductors, particle accelerators (!). A new chapter was even introduced - "Physics and technical progress".
  • New manuals and textbooks for secondary school had to meet new requirements. Which ones? Material presented in an accessible, engaging way, with extensive use of experiment and a clear exposition of the laws of physics.
  • The cognitive activity of students had to reach a new level. It was then that the three functions of the lesson finally took shape: teaching, upbringing and development.
  • Technical teaching aids - how could one do without them? The system of school physics experiments had to be improved.

It was Soviet methodologists who made a significant contribution to improving the structure and methods of teaching technical disciplines. New forms of the physics lesson, still used today - the problem lesson, the lesson-conference, the lesson-seminar, the lesson-excursion, practical exercises, experimental tasks - were developed in the USSR.

“The methodology of physics must solve three problems: why teach, what to teach and how to teach?” (textbook by I. I. Sokolov).

Pay attention to the order: it is the basis of a good education.

Chapter Four. The 21st century

This chapter is still unfinished; it is a blank sheet waiting to be filled. How? By creating a subject that answers both technological progress and the tasks currently facing domestic science, as well as the goal of stimulating the scientific and inventive potential of the student.


Give a student the text of the lesson - and he will learn it.

Give a student the text of the lesson and instruments - and he will understand how they work.

Give a student the text of a lecture, instruments and a study guide - and he will learn to systematize his knowledge and to understand how the laws operate.

Give a student textbooks, lectures, instruments and a good teacher - and he will be inspired to do scientific work.

Give a student all of this plus freedom and the Internet - the ability to instantly find any article, create a 3D model, watch a video of an experiment, quickly calculate and check his conclusions, constantly learn new things - and you will get a person who learns to ask questions himself. Isn't that the most important thing in learning?

The new educational and methodological complexes of Russian Textbook* combine all four centuries: text, assignments, mandatory laboratory work, project activities and e-learning.

We want you to write the fourth chapter yourself.

Olga Davydova
*Since May 2017, the DROFA-VENTANA joint publishing group has been part of the Russian Textbook Corporation. The corporation also includes the Astrel publishing house and the LECTA digital educational platform. Alexander Brychkin, a graduate of the Financial Academy under the Government of the Russian Federation, candidate of economic sciences and head of the DROFA publishing house's innovative projects in digital education, was appointed its CEO.

Introduction

Physical representations of antiquity and the Middle Ages

The development of physics in modern times

Transition from classical to relativistic concepts in physics

Modern physics of macro- and microworld

Conclusion


Introduction

Throughout modern and contemporary history, physics has been a leader of scientific progress. Its concepts and methods served as models for other sciences; it was, as it were, the paradigm of natural-scientific knowledge in general. Only in the second half of the 20th century did the development of other fields lead physics to lose its absolute leadership. But even today, scientific and technological progress rests in many respects on basic physical concepts and on the developments of particular problems associated with them.

Generalizing physical theories quite legitimately seek to reveal the deepest basis of an ever wider range of phenomena, but the thought of physicists is not satisfied with this and, as if by inertia, rushes towards a concrete physical explanation of the structure of the world as a whole. More than once it has seemed that this goal had already been achieved - first in the form of classical mechanics, then in the form of thermodynamics, now in the form of generalizing theories of fields and elementary particles. But time and new discoveries inexorably force us to recognize that such hopes remain unfulfilled. With regard to the world as a whole, one has to make do with philosophical reflections and generalizations, with the general theory of dialectics, with qualitative assessments rather than quantitative calculations.

1. Physical representations of antiquity and the Middle Ages

The term "physics" appears in ancient philosophical and scientific thought in the 6th-5th centuries BC. "Physicists" was then the name given to those thinkers who tried to give a more or less complete picture of the world surrounding man. At the same time, they paid little attention to how, by what methods and cognitive procedures, this knowledge arises. Moreover, they considered the picture of the world they were developing to be absolute truth, in need of no further refinement. And yet, almost without appealing to real experience, they put forward a number of fundamental ideas that were subsequently developed in modern physics and even became the basis of its further progress.

The most fundamental idea in this respect was the principle of atomism, which allowed Democritus and Epicurus to explain qualitatively the emergence of diversity in the surrounding world and to show that it can be accounted for by comparatively simple models. Thus, the difference between any two things is fully explained by just three properties: the number of atoms of which they are composed, the shape of these atoms, which is adequately described by geometry, and the relations between the atoms.

Every change in a thing, both quantitative and qualitative, depends on a change in these three characteristics and in their correlation. Such an understanding of physical reality led to the idea of the infinity of the world and, at the same time, to the assertion that the basis of physical reality - the atoms - is absolutely unchangeable and therefore exists outside of time. Thus the principle of the uncreatability and indestructibility of matter was formulated. True, for the atomists matter existed in two forms: as atoms, the full, and as the void.

Thus the abstract contradiction of moving matter, formulated by Heraclitus as the unity of being and non-being, acquired a concrete physical form as the relation of the full and the empty: the full is being, the empty is non-being. At the same time, the opposites turned out to be absolutely separated, which predetermined the development of physical paradigms for a long time. Here another problem was posed as well, the problem of elementarity: atoms are absolutely elementary; they do not reveal any inner structure and are absolutely indivisible.

This model of physical reality also relied on paradigms that continued to play an important role throughout the subsequent history of physics; their radical revision took place, in essence, only in the 20th century, when the development of quantum mechanics and the study of elementary particles forced a revision of the concepts of the vacuum and of elementarity.

Although ancient thinkers developed various aspects of the understanding of physical phenomena, they did not touch the very essence of physical reality. Three conceptions were of decisive importance for the further development of physics, and indeed of natural science as a whole: the atomism of Democritus and Epicurus, the conception of the emergence of order from chaos in Empedocles and Anaxagoras, and the "Physics" of Aristotle, in which he tried to describe motion on the basis of the principles of his philosophy. Aristotle, following Plato, believed that only that which contains no contradiction can be expressed logically. But change and motion are contradictory; therefore knowledge is directed at what is the cause of motion and change. Such a cause, according to Aristotle, is the form, that is, a system of general properties. The form is at once the cause of motion and change and the goal of the process. Since the form is unchanging, the conclusion follows that motion occurs only insofar as its cause acts: the elimination of the cause eliminates the motion itself. This statement of Aristotle became dominant in medieval physics, which developed in the European universities and remained essentially within the framework of philosophy. And although attempts were made to revise this Aristotelian paradigm, it continued to dominate physical thinking until the 17th century.

2. The development of physics in modern times

Galileo dealt the first serious blow to this physical paradigm. The introduction of the principle of inertia showed that if a body moves uniformly in a straight line, it will retain this state even when no force acts on it. Thus, in relation to mechanical motion, a principle of the identity of opposites was formulated. It turned out that the state of uniform rectilinear motion and the state of rest are so alike that no mechanical experiment carried out inside a system can detect whether it is moving or at rest.

It was these paradigms that determined the first stage in the development of modern physics.


The subsequent development of physics, carried out in particular by Newton, was in many respects a development of Galileo's fundamental discovery. Nevertheless, it introduced new paradigmatic ideas into physics. Firstly, Newton essentially adopts atomism, the corpuscular conception of matter (perhaps under the influence of Boyle's work), and extends it to the theory of light, treating light as a stream of corpuscles. At the same time, explicitly or implicitly, Newton admits two very essential idealizations. Secondly, he assumes instantaneous action at a distance, at least for the forces of gravity. This introduces the assumption of a timeless process: both instantaneous action and action at a distance exclude any temporal characteristic of the interaction. Thirdly, Newton assumed that space and time are entities independent of matter: all physical processes unfold in space and time but do not interact with them.

Using these ideas about physical reality, Newton built the first cosmological model. According to this model, stars are relatively evenly distributed in infinite space, and there are also an infinite number of them. If space were finite or the number of stars were finite, then the forces of gravity would pull all the stars into a single body. The stability of the cosmos is thus based on the infinity of space, the infinite number of stars and the relative uniformity of the distribution of these stars in space.

The successes of mechanics in the 17th-18th centuries led both physicists and materialist philosophers to a methodological attitude of a paradigmatic kind: to know something means to build a mechanical model of the area under study and thereby reduce it to the laws of mechanics. These laws are the most fundamental, and any other law is merely a specification of the laws of mechanics. This attitude became so firmly established in the minds of physicists that even Maxwell, the creator of the theory of the electromagnetic field, at first tried to explain it with the help of mechanical models. As late as 1900, the generally recognized authority in physics of that time, William Thomson (Lord Kelvin), argued that fundamentally new discoveries in physics could not be expected; all such discoveries had already been made - they are the laws of mechanics.

A new paradigmatic structure in physics began to take shape in connection with the study of electromagnetic phenomena. Initially, naturally, attempts were made to consider these phenomena within the same system of paradigms to which mechanics had accustomed physicists. Instead of gravitating masses, electric charges were now considered, attracting or repelling one another according to a law similar to the law of gravitation. However, it soon became clear that electromagnetic phenomena involve regularities with which classical mechanics had never dealt. The very substance of physical phenomena therefore had to be reconsidered. The study of light showed that the corpuscular model used by Newton himself was insufficient; the wave model proved more adequate. But the propagation of waves requires a medium, and the ether was postulated as such a medium. Thus atoms and the ether were the two substances that were supposed to make it possible to reduce all physical phenomena to the laws of mechanics. However, already Maxwell in his later works abandoned mechanical models and derived the equations of the theory of the electromagnetic field. Studies of this theory showed that it does not need mechanics at all, that it stands to its own empirical material just as classical mechanics stands to its own. These are two independent theories describing qualitatively different processes.

However, the paradigm then dominant in physics required the reduction of some laws to others. Therefore, instead of a mechanical picture of the world, attempts were made to build an electromagnetic picture of the world, including an explanation of mechanical phenomena. Thus the creation of the theory of the electromagnetic field completed a process that significantly changed the paradigmatic structure of physical thinking. Electromagnetic processes unfold in any medium, including the vacuum, and therefore the vacuum in which these processes take place is no longer an absolute void.

Since electromagnetic models revealed the unity of such seemingly heterogeneous processes as electricity, magnetism and light, it was natural to expect that the same substance, the ether, underlies all of them. Meanwhile, a comparison of the experiments of Fizeau and of Michelson and Morley showed that the concept of the ether is contradictory: it would have to be simultaneously dragged along by the motion of the Earth and not dragged along. But a contradictory concept cannot serve as the basis of theoretical models. The discovery of the photoelectric effect showed that light, that is, electromagnetic oscillation, has both a wave and a corpuscular nature at once. Thus the ether turned out to be unnecessary, since it cannot explain the dual nature of electromagnetic processes.

The transition from mechanical to electromagnetic models of physical processes fundamentally changed one of the most basic paradigms, a paradigm going back to ancient atomism. In all physics from antiquity to the second half of the 19th century the dominant view was that the carriers of properties, the subjects of physical reality, are particles, corpuscles and the like. Now it turned out that the field is such a carrier and, accordingly, such a subject. But the field, unlike corpuscles, is continuous: in the mathematical sense a field, unlike a system of particles, is a system with an infinite number of degrees of freedom.

3. Transition from classical to relativistic concepts in physics

The field concept was further developed in the theory of relativity. The works of Poincaré and Einstein laid the foundations for a new understanding of physical reality. According to Poincaré, when the known physical laws can no longer explain the empirical facts, there are two ways to solve the problem: one can change either the laws themselves or the concepts of space and time; in either case we obtain identical results. It is, however, easier to transform the properties of space and time. Lorentz showed how this can be done mathematically, and Minkowski constructed for this purpose a mathematical representation of space-time that revealed their inseparable connection.

These mathematical constructions were based on a generalization of an idea that originates from Galileo's principle of relativity. As already mentioned, according to this principle, being inside the system, it is impossible to find out by means of a mechanical experiment whether this system is moving or at rest, provided that the system is inertial, that is, it is in a state of rest or uniform and rectilinear motion.

This principle identifies motion and rest only from the point of view of mechanical motion. But by the beginning of the 20th century physics was already dealing with qualitatively diverse processes. Hence the natural generalization of Galileo's principle: being inside an inertial system, no physical experiment can detect whether it is moving or at rest. Consequently, the identity of rest and motion is generalized to any physical process. To build the special theory of relativity, however, a second postulate is needed, and the result of the Michelson-Morley experiment was used as such a postulate: the speed of light in vacuum does not depend on the speed of the light source. For vacuum this quantity is constant and is, in general, the limiting speed of all physical interactions. By applying the mathematical apparatus of Lorentz and Minkowski and introducing a number of epistemological assumptions based on thought experiments, one can show the following. First, there is no universal way to establish the simultaneity of events, since this requires the exchange of signals and the speed of any signal is finite; two observers moving relative to each other will therefore obtain different results when trying to establish the simultaneity of the same events. Second, separate measurements of spatial and temporal intervals give different values depending on the frame of reference; only the space-time interval has an absolute value independent of the observer.
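For illustration (standard modern notation; the formulas are not part of the original essay), the last point can be stated compactly: separately measured time and length intervals depend on the frame, while the combined space-time interval does not.

\[
\Delta t = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}},
\qquad
s^2 = c^2\,\Delta t^2 - \Delta x^2 - \Delta y^2 - \Delta z^2 = \text{invariant},
\]

where \(\Delta t_0\) is the proper time measured in the frame in which the two events occur at the same place, and \(v\) is the relative speed of the two frames.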

Despite the revolutionary nature of this conception of space and time as the basis for understanding all physical processes, it is often classed with classical physics. The point is that the theory of relativity retained the understanding of determinism that was paradigmatic for classical physics, whereas with the creation of quantum mechanics Laplacian determinism was revised: it was replaced by the idea of probabilistic determination and of uncertainty as an integral characteristic of any physical interaction.

In the special theory of relativity the identity of rest and motion is presented in a generalized form, since it deals not only with mechanical interactions but with any physical experiments and asserts, consequently, that all physical laws are invariant in inertial systems. But even this generalization is incomplete: after all, it concerns only inertial systems.

The next step in generalizing the principle of the identity of rest and motion in physical processes was to extend it to accelerated motion as well. This was done in the general theory of relativity. It asserted that no physical experiment performed inside a system can establish whether the system is at rest or in motion, whatever that motion may be. In other words, the principle of the identity of gravitational and inertial mass was introduced.

Such a formulation of the problem of motion and of physical interaction in general led to a change in the understanding of space and time. Gravity could now be represented as the curvature of space, depending on the distribution of gravitating masses in it. The conclusion argued by Einstein and Infeld, according to which the general theory of relativity is the third and last stage in the development of the theory of motion, seemed quite natural: in it, the principle of the identity of rest and motion received its ultimate generalization.

The creation of the general theory of relativity made it possible to pose the problem of constructing cosmological models in a new way. Although until the 20th century astronomers proceeded from the Newtonian model of the Universe, it had become clear already in the 19th century that this model contradicts the observed facts. This was expressed most clearly in the so-called photometric and gravitational paradoxes. As Olbers showed, if space is infinitely and uniformly filled with stars, their light must add up, and the night sky must therefore glow with the brightness of the Sun, since the Sun is an average star in luminosity. This, however, is not observed, so something in the assumptions on which the model is built must be wrong. Seeliger later demonstrated the so-called gravitational paradox: if there are infinitely many bodies in space, the gravitational forces add up and the acceleration at any point of space under the action of these forces becomes infinitely large.
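A minimal sketch of Olbers' reasoning in modern notation (added here for clarity, not part of the original text): with a uniform number density of stars \(n\), each of luminosity \(L\), a thin spherical shell of radius \(r\) and thickness \(dr\) contributes a flux that does not depend on \(r\), so the total flux from an infinite, uniformly filled universe diverges:

\[
dF = \frac{L}{4\pi r^2}\cdot n\,4\pi r^2\,dr = nL\,dr,
\qquad
F = \int_0^{\infty} nL\,dr \to \infty .
\]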

The only way to get rid of these paradoxes while preserving this picture of the world is to assume certain relations between the distances separating stars and star systems. If these distances form a convergent series (one satisfying d'Alembert's test), the paradoxes disappear, but the amount of matter in space then tends to zero. Since the Newtonian model was built on the basis of classical mechanics, the creation of relativistic mechanics, that is, the mechanics of the theory of relativity, made it possible to build a fundamentally new cosmological model. Assuming a certain density of matter in the Universe, slightly more than 10⁻²⁹ grams per cubic centimeter, Einstein obtained a cosmological model of the Universe as a four-dimensional set of events in the form of a cylinder with a finite radius and an infinite time axis. At the same time, he considered only those solutions of the equations that described a stationary model.

As Friedmann later showed, these equations also have non-stationary solutions, in which space either contracts or expands. When the mass density is higher than the critical value, the curvature is positive and the "Universe" contracts; when the density is below the critical value, the curvature is negative and the "Universe" expands. When Hubble discovered the redshift in the spectra of distant galaxies in 1929, he interpreted it according to the Doppler principle: when a source of oscillations recedes, the frequency of the oscillations reaching us decreases, which for light means a shift towards the red end of the spectrum. This was taken as confirmation of Friedmann's conclusion that the Universe is non-stationary, or, more precisely, that it is expanding.
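For reference (modern notation, not in the original essay), the Doppler interpretation of the redshift and Hubble's empirical relation read

\[
z = \frac{\Delta\lambda}{\lambda} \approx \frac{v}{c},
\qquad
v = H_0\,d,
\]

where \(z\) is the redshift, \(v\) the recession velocity of a galaxy at distance \(d\), and \(H_0\) the Hubble constant.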

The theory of relativity made a revolution primarily in the understanding of the mega-world, and only later it became clear that the laws formulated in it also operate at the level of the micro-world.


4. Modern physics of the macro- and microworld

The most fundamental result, which changed one of the main paradigms of physics, was the conclusion that all fundamental physical laws are statistical in nature. Of decisive importance was the discovery of Heisenberg's uncertainty principle, according to which the product of the uncertainties in a particle's position and momentum cannot be smaller than a quantity of the order of Planck's constant: Δx·Δp ≥ ħ/2.

When one of these uncertainties decreases, the other grows, and thus the state of an elementary particle always turns out to be indeterminate. And if the initial state cannot be precisely determined, the subsequent state of the particle is all the more indeterminate. It is important that such uncertainty is inherent not only in the particle's position in space but also in its energy state. Consequently, from the physical point of view, uncertainty turns out to be an integral property of any physical interaction in all forms of its manifestation.
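A rough numerical illustration (added here; the figures assume an electron localized within an atom of size about \(10^{-10}\) m, a standard textbook estimate rather than anything from the original text):

\[
\Delta p \gtrsim \frac{\hbar}{2\,\Delta x} = \frac{1.05\times10^{-34}\ \text{J·s}}{2\times10^{-10}\ \text{m}} \approx 5\times10^{-25}\ \text{kg·m/s},
\qquad
\Delta v = \frac{\Delta p}{m_e} \approx 6\times10^{5}\ \text{m/s}.
\]

Even the velocity of such an electron is uncertain to hundreds of kilometers per second, which is why no classical trajectory can be assigned to it.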

The development of the theory of elementary particles and quantum mechanics made it possible to pose a number of fundamental physical and philosophical problems. First, this is the question of the inexhaustibility of physical reality in depth. Just as Einstein and Infeld proved the theorem that general relativity provides such a complete description of motion that no further qualitative progress in this area is possible, so von Neumann proved the hidden variable theorem. According to this theorem, the laws of quantum mechanics are the last step in the description of physical interactions in the microworld. There can be no deeper description. If hidden parameters exist, they cannot be revealed. Therefore, the laws of quantum mechanics can be based not on other physical laws, but only on the laws of large numbers, that is, on the mathematical structure. At this point, physics, as it were, again returned to the Pythagorean substantiation of physical reality.

Meanwhile, research in the field of elementary particles was aimed at finding a deeper level of organization of matter. For a long time it seemed that von Neumann's theorem was, in a certain sense, confirmed by experiment. Such confirmation was seen in the fact that every attempt to reveal the structure of an elementary particle, to find the particles of which it consists, led to a paradoxical situation, qualitatively different from what happens at the macrolevel. Macroscopic bodies, under a sufficiently strong external influence, break up into the parts of which they are composed. An elementary particle, by contrast, even when struck with an energy exceeding its own rest energy E = mc² (where m is the mass of the particle being struck), does not break up into constituents but generates new particles of the same level. Hence it began to be said that the inexhaustibility of physical reality at the level of the microworld consists not in the existence of some deeper, subtler level of organization, but in the fact that the variety of elementary particles forms an inexhaustible set of properties and relations.

And yet the desire to find a deeper structural level of the organization of matter persists. Along this path several theories were created and partially confirmed by experiment, for example the theory of quarks and the theory of partons, that is, of "partial" particles that do not exist outside the whole, outside their parent particles. Although hundreds of elementary particles have by now been discovered, most of them have a very short lifetime, and only a few, such as the electron, proton, photon and neutrino, are stable.

Any interaction at the level of elementary particles is carried out through virtual particles, which bind elementary particles together. For example, protons interact with neutrons through pi-mesons, which is why atomic nuclei are stable. Virtual particles remain poorly understood and rather mysterious. On the one hand, they really exist, since without them there would be no interaction: atomic nuclei would fall apart and electrons could not remain in atomic orbits. On the other hand, many theorists do not recognize their unconditional existence, since in that case the law of conservation of energy would be violated. So, following Heraclitus, one has to assert that they both exist and do not exist at the same time.

Thanks to research in the field of quantum mechanics and elementary particles, it became possible to take a fresh look at the vacuum. It turned out that particles and antiparticles constantly appear and disappear in vacuum in pairs. However, their lifetime is so short that it is impossible to detect them experimentally. The detection of such particles would be contrary to Heisenberg's uncertainty principle. It is precisely because of the brevity of the existence of vacuum particles that bodies moving in it experience practically no resistance. However, special experiments based on mathematical models can indirectly detect the appearance and disappearance of virtual pairs of particles and antiparticles. Here we also have a situation where particles are present in vacuum and at the same time they are not there.

Already in the 1930s it became clear that four types of interaction underlie all physical phenomena: the gravitational, which is of decisive importance at the macro- and mega-levels of the organization of physical reality; the electromagnetic, which manifests itself at the micro- and macro-levels; the strong, which determines the intranuclear forces; and the weak, which governs the decays of particles (for example, beta decay). The question immediately arose whether these forces could be reduced to some kind of unity, that is, the problem of creating a unified field theory appeared. Naturally, at first the path that had always given good results was tried: reducing some laws to others. Thus Einstein tried for many years to create a unified field theory, attempting to derive the other interactions from the general theory of relativity. The failure that befell him on this path was due to the fact that one theory can successfully be derived from another only when this reflects an objective connection; meanwhile, all four interactions are consequences of a more general initial interaction. Success came only when the need to explain the Big Bang forced an evolutionary approach to the problem, starting from the simplest thing - the vacuum. This is how the problem is treated in Big Bang models. Whereas it was originally assumed that the initial state of the evolution of our Universe was a special, superdense condensation of matter and energy, from which an explosive process began the synthesis of elementary particles and thereby produced the state in which the four physical interactions known to us were already acting, in modern cosmology the vacuum is taken as the initial state.

The Big Bang is viewed as a fluctuation in the vacuum, during which the relative balance of the forces of attraction and repulsion was disturbed, which led to a colossal release of energy.

Thus our Universe arises from what is simplest in modern physical reality, that is, from the vacuum, from "nothing". Hegel in his Logic argued that development proceeds from nothing through something to nothing. This statement seemed doubtful to V. I. Lenin, who remarked on this occasion that things may turn into nothing, but they do not arise from nothing. Yet from the point of view of the Big Bang model it is precisely from "nothing" that our Universe arises. After all, the very concept of "non-being" in Hegel's philosophy is relative, since it is identical to the concept of "being"; whether one starts with being or with non-being is of no fundamental importance in this philosophy. Each of these concepts passes directly into the other, giving the concept of "becoming", and becoming gives rise to determinate existence, that is, to a determinateness in which quality is already posited. The situation is approximately the same in the cosmological model of the Big Bang: an internal contradiction embedded in the vacuum gives rise to a process, and the result of this process is the determinateness of the laws of physical reality.

Conclusion

The formation of a scientific picture of the world in the era of the formation and development of classical natural science largely depended on the rapidly changing relationship between natural philosophical knowledge and knowledge based on experimental research. The ever-increasing priority of scientific knowledge and, in connection with this, the accentuated attention to methodological and epistemological issues led to the replacement of the natural-philosophical picture of the world with concepts of nature, in the center of which were the areas of natural science that were fundamental for this era.

At the same time, the process of forming a truly scientific picture of the world was quite contradictory. Thus, although natural philosophy and humanism had a destructive effect on medieval scholasticism, they were not yet able to drive the elements of scholastic Peripateticism and mysticism out of the worldview completely. Only with the emergence of classical mechanics and astronomy, based on axiomatics and advanced mathematics, did the picture of the world acquire the essential features of a scientific worldview. An outstanding role in this process was played by the new heliocentric paradigm of Copernicus, the Galilean image of science, and Newtonian methodology in constructing the system of the world. It became possible to form a scientific picture of the world based on empirically substantiated knowledge.

At the moment, science has established a huge variety of material objects representing the micro, macro and mega worlds, but the question remains whether these discoveries exhaust everything that exists in general. The variety of matter and its movement is infinite, and not only quantitatively, but also qualitatively. The principle of the qualitative infinity of nature means the recognition of the unlimited variety of structural forms of matter, which differ in the most fundamental laws of being.


Prehistory of physics. Observations of physical phenomena were already being made in ancient times. At that period the process of accumulating factual knowledge was not yet differentiated: physical, geometrical and astronomical ideas developed together.

The systematic accumulation of facts and the attempts to explain and generalize them that preceded the creation of physics (in the modern sense of the word) took place especially intensively in the era of Greco-Roman culture (6th century BC - 2nd century AD). In this era the initial ideas about the atomic structure of matter arose (Democritus, Epicurus, Lucretius), the geocentric system of the world was created (Ptolemy), the beginnings of the heliocentric system appeared (Aristarchus of Samos), some simple laws of statics were established (the rule of the lever, the center of gravity), the first results of applied optics were obtained (mirrors were made, the law of reflection of light was discovered, the phenomenon of refraction was observed), and the simplest principles of hydrostatics were discovered (Archimedes' law). The simplest phenomena of magnetism and electricity were also known in antiquity.

The doctrine of Aristotle (384 - 322 BC) summed up the knowledge of the preceding period. The teaching of Aristotle, canonized by the Church, became a brake on the further development of physical science. After a thousand years of stagnation and barrenness, physics was revived only in the 15th-16th centuries, in the struggle against scholastic philosophy. The revival of science was due mainly to the needs of production in the period of manufacture. The great geographical discoveries, in particular the discovery of America, contributed to the accumulation of many new observations and the overthrow of old prejudices. The development of crafts, navigation and artillery created incentives for scientific research. Scientific thought focused on the problems of construction, hydraulics and ballistics, and interest in mathematics increased. The development of technology created opportunities for experiment. Leonardo da Vinci posed a whole series of physical questions and tried to solve them by experiment. The saying "experience never errs; it is only our judgments that err" belongs to him.

However, in the 15th-16th centuries individual physical observations and experimental studies were of a haphazard character. Only the 17th century marked the beginning of the systematic application of the experimental method in physics and of the continuous growth of physical knowledge that has followed ever since.

The first period in the development of physics, called classical, begins with the works of Galileo Galilei (1564 - 1642). It was Galileo who created the experimental method in physics. A carefully thought-out experiment, the separation of secondary factors from the main one in the phenomenon under study, the striving to establish exact quantitative relations between the parameters of the phenomenon - such is Galileo's method. With this method Galileo laid the first foundations of dynamics. Galileo refuted the erroneous statements of Aristotle's mechanics: in particular, he was able to show that it is not velocity but acceleration that results from an external influence on a body. In his work "Discourses and Mathematical Demonstrations Relating to Two New Sciences" (1638), Galileo convincingly substantiates this conclusion, which amounts to the first formulation of the law of inertia, and removes the apparent contradictions. He proves by experiment that the acceleration of free fall does not depend on the density or the mass of a body. Considering the motion of a thrown body, Galileo finds the law of the addition of motions and in essence states the principle of the independence of the action of forces. The "Discourses" also provide information on the strength of materials. They also contain ideas about the relativity of motion (the principle of relativity) and the motion of bodies on an inclined plane (in effect, Galileo discovered Newton's first two laws).
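Galileo's "addition of motions" for a thrown body can be restated in modern notation (a sketch added for clarity, not a quotation from the "Discourses"): the horizontal and vertical motions proceed independently, and together they give a parabola:

\[
x(t) = v_0 t, \qquad y(t) = \tfrac{1}{2} g t^{2}
\quad\Longrightarrow\quad
y = \frac{g}{2 v_0^{2}}\,x^{2},
\]

where \(v_0\) is the horizontal launch speed and \(g\) the free-fall acceleration, the same for all bodies.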

In the writings of Galileo and Blaise Pascal the foundations of hydrostatics were laid. Galileo also made important discoveries in other areas of physics. He was the first to note experimentally the phenomenon of surface tension, which was studied much later. Galileo enriched applied optics with his telescope, and his thermometer initiated the quantitative study of thermal phenomena.

In the first half of the 17th century the physical study of gases arose, which was of great practical importance. Galileo's pupil E. Torricelli discovered the existence of atmospheric pressure and created the first barometer. O. Guericke invented the air pump and finally refuted the Aristotelian claim about the "fear of the void". R. Boyle and, somewhat later, E. Mariotte investigated the elasticity of gases and discovered the law known by their names. W. Snellius (Holland) and R. Descartes (France) discovered the law of refraction of light. The creation of the microscope dates from this time. Observations of magnets (in navigation) and of electrification by friction provided valuable information in the fields of electrostatics and magnetostatics, whose pioneer should be recognized to be the English naturalist W. Gilbert.

The second half of the 17th century is even richer in events. Galileo's "Discourses" marked the beginning of research into the foundations of mechanics. The study of curvilinear motion (Ch. Huygens) prepared the way for the discovery of the fundamental law of mechanics - the relation between force, mass and acceleration - first formulated by I. Newton in his "Mathematical Principles of Natural Philosophy" (1687). Newton also established the basic law of the dynamics of systems (the equality of action and reaction), in which the earlier studies of the impact of bodies (Huygens) found their generalization. For the first time the basic concepts of physics - the concepts of space and time - crystallized.

Proceeding from the laws of planetary motion established by Kepler, Newton in his "Principia" for the first time formulated the law of universal gravitation, which many scientists of the 17th century had tried to find. Newton confirmed this law by calculating the acceleration of the Moon in its orbit on the basis of the value of the acceleration of gravity measured in the 1670s. He also explained the perturbations of the Moon's motion and the cause of the ocean tides. The significance of this discovery of Newton's can hardly be overestimated: it showed contemporaries the power of science and changed the whole picture of the universe.
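A back-of-the-envelope version of that check, in modern units (the numbers are present-day values, not Newton's): the Moon is about 60 Earth radii away, so the inverse-square law predicts an acceleration of \(g/60^2\), which matches the centripetal acceleration computed from the Moon's orbit:

\[
a_{\text{Moon}} = g\left(\frac{R_E}{r}\right)^{2} \approx \frac{9.8\ \text{m/s}^2}{3600} \approx 2.7\times10^{-3}\ \text{m/s}^2,
\qquad
\frac{v^{2}}{r} = \frac{(2\pi r/T)^{2}}{r} \approx 2.7\times10^{-3}\ \text{m/s}^2,
\]

with \(r \approx 3.84\times10^{8}\) m and \(T \approx 27.3\) days.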

At the same time Ch. Huygens and G. Leibniz formulated the law of conservation of momentum (previously expressed by Descartes in an inexact form) and the law of conservation of "living forces" (vis viva). Huygens created the theory of the physical pendulum and constructed a pendulum clock. R. Hooke (England), one of the most versatile scientists of the 17th century, discovered the law of elasticity that bears his name. M. Mersenne (France) laid the foundations of physical acoustics: he studied the vibration of a string and measured the speed of sound in air.

In these years, in connection with the increasing use of telescopes, geometrical optics developed rapidly and the foundations of physical optics were laid. F. Grimaldi (Italy) discovered the diffraction of light in 1665. Newton developed his doctrine of the dispersion and interference of light and put forward the hypothesis of light corpuscles. Spectroscopy originates from Newton's optical research. O. Rømer (Denmark) measured the speed of light in 1676. Newton's contemporary Huygens developed the first foundations of wave optics, formulated the principle of the propagation of (light) waves known by his name, and investigated and explained the phenomenon of double refraction in crystals.

Thus, in the 17th century the foundations of mechanics were created and research began in the most important areas of physics: the study of electricity and magnetism, of heat, of physical optics and of acoustics.

In the 18th century all areas of physics continued to develop. Newtonian mechanics became an extensive system of knowledge covering the laws of motion of terrestrial and celestial bodies. Through the labors of L. Euler, the French scientist A. Clairaut and others, celestial mechanics was created, and it was brought to the highest perfection by P. Laplace. In its developed form, mechanics became the basis of the machine technology of the time, in particular of hydraulics.

In other branches of physics the 18th century saw a further accumulation of experimental data and the formulation of the simplest laws. B. Franklin formulated the law of conservation of charge. In the middle of the 18th century the first electric capacitor was created (the Leyden jar of P. van Musschenbroek in Holland), which made it possible to accumulate large electric charges and facilitated the study of the law of their interaction. This law, which is the basis of electrostatics, was discovered independently by H. Cavendish and J. Priestley (England) and by Ch. Coulomb (France). The theory of atmospheric electricity arose: B. Franklin in 1752 and, a year later, M. V. Lomonosov and G. V. Richman studied lightning discharges and proved the electrical nature of lightning.
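For reference (modern SI notation, not part of the original text), the law of interaction of point charges that Cavendish, Priestley and Coulomb arrived at has the same inverse-square form as Newton's law of gravitation:

\[
F = \frac{1}{4\pi\varepsilon_0}\,\frac{q_1 q_2}{r^{2}},
\qquad\text{compare}\qquad
F = G\,\frac{m_1 m_2}{r^{2}}.
\]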

In optics, photometry began to be developed; the British scientists W. Herschel and W. Wollaston discovered infrared rays, and the German scientist J. Ritter discovered ultraviolet rays. The development of chemistry and metallurgy stimulated the study of heat: the concept of heat capacity was formulated, the heat capacities of various substances were measured, and calorimetry was founded. Lomonosov predicted the existence of absolute zero. Studies of thermal conductivity, thermal radiation and the thermal expansion of bodies began. In the same period the steam engine was created and began to be improved.

True, heat was imagined as a special weightless liquid - caloric. In a similar way, the electrification of bodies was explained by the hypothesis of an electric fluid, and magnetic phenomena by a magnetic fluid. In general, during the 18th century models of weightless fluids penetrated all branches of physics, and the overwhelming majority of researchers did not doubt their existence. This was a consequence of the belief that the various physical phenomena - thermal, electrical, magnetic, optical - are not interconnected and are independent of one another; each phenomenon was thought to have its own "carrier", a special substance. Only a few advanced minds, including Euler and Lomonosov, denied the existence of weightless matter and saw in thermal phenomena and in the properties of gases a hidden but unceasing motion of the smallest particles. This difference of opinion reflected the difference between the two physical "pictures of the world" - the Newtonian and the Cartesian - which had originated in the 17th century.

The followers of Descartes (the Cartesians) regarded all physical phenomena as various motions of one and the same primary matter, whose only properties are extension and inertia. Descartes held that, as a result of various motions and collisions of parts of the primary matter, particles of matter (corpuscles) of various volumes and shapes are formed, between which particles of the most refined form of matter, the ether, move. The followers of Descartes saw the task of physics as the creation of purely mechanical models of phenomena. Universal gravitation, electrical and magnetic interactions, chemical reactions - everything was explained by various vortices in the ether, connecting or separating particles of matter.

However, this picture of the world met with objections as early as the middle of the 17th century. Its unsatisfactoriness was shown most convincingly by Newton in his Principia. Newton proved that the explanation of universal gravitation given by the Cartesians contradicts the facts: vortices in the ether, which, according to Descartes, completely fill the entire solar system and carry the planets with them, exclude the possibility of free passage of comets through the solar system without losing their motion.

Newton's picture of the world is based on the concept of atoms separated by emptiness and interacting instantly through the void by forces of attraction or repulsion (action at a distance). Forces, according to Newton, are a primary, initial property of certain kinds of particles; a force such as gravitation is inherent in all particles of matter. Unlike the Cartesians, Newton admitted that mechanical motion need not be conserved in nature. He saw the main task of physics in finding the forces of interaction between bodies. Newton did not rule out the existence of the ether, but regarded it as a thin elastic gas that fills the pores of bodies and interacts with matter.

The struggle between Newtonian and Cartesian ideas lasted for almost two centuries. The same laws of nature were interpreted differently by the supporters of the two schools. In the 18th century Newton's views triumphed in physics and had a profound influence on its further development. They contributed to the introduction of mathematical methods into physics. At the same time, they entrenched the idea of action at a distance for a hundred years. Cartesian tendencies revived in the second half of the 19th century, after the creation of the wave theory of light, the discovery of the electromagnetic field and the law of conservation of energy.

The second period of the history of physics begins in the first decade of the 19th century. In the 19th century the most important discoveries and theoretical generalizations were made, which gave physics the character of a single, integral science. The unity of various physical processes found its expression in the law of conservation of energy. The decisive role in the experimental preparation of this law was played by the discovery of the electric current and the study of its manifold actions, as well as by the study of the mutual transformations of heat and mechanical work. In 1820 H. C. Oersted (Denmark) discovered the action of an electric current on a magnetic needle. Oersted's experiment gave an impetus to the research of A. Ampère, D. Arago and others. The law of interaction of two electric currents, found by Ampère, became the basis of electrodynamics. With the lively participation of other researchers, Ampère quickly established the connection of magnetic phenomena with electrical ones, ultimately reducing magnetism to the action of currents. Thus the idea of magnetic fluids ceased to exist. In 1831 Faraday discovered electromagnetic induction, thereby realizing his plan "to turn magnetism into electricity".

At this stage of development the mutual influence of physics and technology increased significantly. The development of steam technology posed numerous problems for physics. Physical studies of the mutual transformation of mechanical energy and heat, culminating in the creation of thermodynamics, served as the basis for the improvement of heat engines. After the discovery of the electric current and its laws, electrical engineering began to develop (the invention of the telegraph, electroforming, dynamos), which in turn contributed to the progress of electrodynamics.

In the first half of the 19th century the idea of weightless substances collapsed. This process went on slowly and with great difficulty. The first breach in the then-dominant physical worldview was made by the wave theory of light (the English scientist T. Young and the French scientists A. Fresnel and D. Arago). The whole set of phenomena of interference, diffraction and polarization of light, in particular the interference of polarized rays, could not be interpreted theoretically from the corpuscular point of view, yet found a complete explanation in the wave theory, according to which light consists of transverse waves propagating in a medium (the ether). Thus the light substance was rejected as early as the second decade of the 19th century.

More tenacious than the light substance and the magnetic fluid proved to be the concept of caloric. Although the experiments of B. Rumford, which proved the possibility of obtaining an unlimited amount of heat through mechanical work, were in clear contradiction with the idea of a special thermal substance, the latter persisted until the middle of the century; it seemed that only with its help could the latent heat of melting and evaporation be explained. The merit of creating the kinetic theory of heat, whose beginnings go back to the times of Lomonosov and D. Bernoulli, belongs to the English scientists J. Joule and W. Thomson (Kelvin) and the German scientist R. Clausius.

Thus, as a result of many-sided and lengthy experiments, in the conditions of a difficult struggle with obsolete ideas, the mutual convertibility of various physical processes was proved, and thereby the unity of all physical phenomena known at that time.

Direct proof of the conservation of energy in any physical and chemical transformations was given in the works of J. R. Mayer (Germany), J. Joule and H. Helmholtz. After the law of conservation of energy won universal recognition (in the 1850s), it became the cornerstone of modern natural science. The law of conservation of energy and the principle of entropy increase [R. Clausius, W. Thomson (Kelvin)] formed the basis of thermodynamics; they are usually formulated as the first and second laws of thermodynamics.
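In modern textbook notation (added here purely as an illustration), the two laws are usually written as

\[ \Delta U = Q - A, \qquad \Delta S \ge 0 \ \text{(for an isolated system)}, \]

where U is the internal energy, Q the heat received by the system, A the work done by it, and S the entropy.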

The proof of the equivalence of heat and work confirmed the view of heat as the disordered motion of atoms and molecules. The works of Joule, Clausius, Maxwell, Boltzmann and others created the kinetic theory of gases. Already at the first stages of the development of this theory, when molecules were still treated as solid elastic balls, it was possible to reveal the kinetic meaning of such thermodynamic quantities as temperature and pressure. The kinetic theory of gases made it possible to calculate the mean free paths of molecules, the sizes of molecules and their number per unit volume.
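As a rough illustration in modern notation (not the original 19th-century form), the elastic-ball model gives for the mean free path

\[ \lambda = \frac{1}{\sqrt{2}\,\pi d^2 n}, \]

where n is the number of molecules per unit volume and d the effective molecular diameter, so that measured transport properties could be used to estimate molecular sizes and numbers.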

The idea of the unity of all physical processes led in the second half of the 19th century to a radical restructuring of physics and its consolidation into two large branches - the physics of matter and the physics of the field. The basis of the first was the kinetic theory, of the second the doctrine of the electromagnetic field.

Kinetic theory, operating with average values, introduced the methods of probability theory into physics for the first time. It served as the starting point of statistical physics, one of the most general physical theories. The foundations of statistical physics were systematized at the threshold of the 20th century by the American scientist J. W. Gibbs.

Equally fundamental was the discovery of the electromagnetic field and its laws. The creator of the doctrine of the electromagnetic field was M. Faraday. He was the first to express the idea that electric and magnetic actions are not transferred directly from one charge to another, but propagate through an intermediate medium. Faraday's views on the field were developed mathematically by Maxwell in the 1860s, who managed to give a complete system of equations of the electromagnetic field. Field theory became as consistent as Newton's mechanics.

The theory of the electromagnetic field led to the idea, expressed by Maxwell (and anticipated even earlier by Faraday), that electromagnetic actions propagate with a finite velocity. This idea enabled Maxwell to predict the existence of electromagnetic waves. Maxwell also concluded that light is electromagnetic in nature. The electromagnetic theory of light merged electromagnetism and optics.

However, the theory of the electromagnetic field became generally accepted only after the German physicist H. Hertz experimentally discovered electromagnetic waves and proved that they obey the same laws of refraction, reflection and interference as light waves.

In the second half of the 19th century the role of physics in technology grew significantly. Electricity found application not only as a means of communication (telegraph, telephone), but also as a means of transmitting and distributing energy and as a source of lighting. At the end of the 19th century electromagnetic waves were used for wireless communication (A. S. Popov, G. Marconi), which marked the beginning of radio communication. Technical thermodynamics contributed to the development of internal combustion engines. Low-temperature technology arose. In the 19th century all gases were liquefied except helium, which was obtained in the liquid state only in 1908 (the Dutch physicist H. Kamerlingh Onnes).

By the end of the 19th century physics seemed to contemporaries almost complete. The concept of Laplacian mechanistic determinism had taken hold, proceeding from the possibility of uniquely determining the behavior of a system at any time if the initial conditions are known. It seemed to many that physical phenomena could be reduced to the mechanics of molecules and of the ether, for to explain a physical phenomenon at that time meant to reduce it to a mechanical model easily grasped on the basis of everyday experience. The mechanical theory of heat and the elastic (or vortex) ether as a model of electromagnetic phenomena - this is what the physical picture of the world looked like until the end of the 19th century. The ether seemed similar to matter in a number of its properties, but, unlike matter, weightless or almost weightless (some calculations gave a mass of about 13 kg for a sphere of ether equal in volume to the Earth).

However, the more thoroughly mechanical models were developed and applied, the more contradictions they ran into. Ether vortex-tube models created to explain alternating fields were unsuitable for explaining constant electric fields; conversely, various models of constant fields could not explain the possibility of the propagation of electromagnetic waves. Finally, none of the ether models was able to explain clearly the connection of the field with discrete charges. Various mechanical models of atoms and molecules (for example, the vortex model of the atom proposed by W. Thomson) also turned out to be unsatisfactory.

The impossibility of reducing all physical processes to mechanical ones gave rise among some physicists and chemists to a desire to refuse altogether to recognize the reality of atoms and molecules and to reject the reality of the electromagnetic field. E. Mach proclaimed the task of physics to be a "pure description" of phenomena. The German scientist W. Ostwald opposed the kinetic theory and atomism in favor of so-called energetics - a universal, purely phenomenological thermodynamics - as the only possible theory of physical phenomena.

The third (modern) period in the history of physics, called non-classical or quantum-relativistic physics, begins in the last years of the 19th century. This period is characterized by the direction of research deep into matter, toward its microstructure. A new era in the history of physics begins with the discovery of the electron and the study of its actions and properties (the English scientist J. J. Thomson, the Dutch scientist H. Lorentz).

The most important role was played by the investigation of electric discharges in gases. It turned out that the electron is an elementary particle of a definite mass, carrying the smallest electric charge and forming part of the atom of every chemical element. This meant that the atom is not elementary but is a complex system. It was proved that the number of electrons in an atom and their distribution over layers and groups determine the electrical, optical, magnetic and chemical properties of the atom; the polarizability of an atom, its magnetic moment, its optical and X-ray spectra and its valence depend on the structure of the electron shell.

The creation of the most general theories of modern physics - the theory of relativity and quantum mechanics - is connected with the dynamics of electrons and their interaction with the radiation field.

The study of the motion of fast electrons in electric and magnetic fields led to the conclusion that classical Newtonian mechanics is inapplicable to them. Such a fundamental attribute of a material particle as mass turned out to be not constant but variable, depending on the state of motion of the electron. This was the collapse of ideas about the motion and properties of particles that had taken root in physics.

A way out of the contradictions was found by A. Einstein, who in 1905 created a new physical theory of space and time, the theory of relativity. Later, in 1916, Einstein created the general theory of relativity, which transformed the old doctrine of gravitation.

No less important and fruitful a generalization of physical facts and regularities was quantum mechanics, created at the end of the first quarter of the 20th century as a result of research into the interaction of radiation with particles of matter and into the states of intra-atomic electrons. The basic idea of quantum mechanics is that all microparticles have a dual corpuscular-wave nature.
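This wave-particle duality is usually illustrated by de Broglie's relation (given here in modern notation as an illustration, not as part of the original text):

\[ \lambda = \frac{h}{p}, \]

which associates with every particle of momentum p a wave of length \(\lambda\), h being Planck's constant.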

These radically new ideas about microparticles turned out to be extremely fruitful and effective. Quantum theory succeeded in explaining the properties of atoms and the processes occurring in them, the formation and properties of molecules, the properties of solids, and the laws of electromagnetic radiation.

The twentieth century was marked in physics by a powerful development of experimental research methods and measuring technology. The detection and counting of individual electrons, of nuclear and cosmic particles, the determination of the arrangement of atoms and of the electron density in crystals and in individual molecules, the measurement of time intervals of the order of 10⁻¹⁰ s, the observation of the motion of radioactive atoms in matter - all this characterizes the leap in measuring technology of recent decades.

Research and production facilities unprecedented in power and scale were directed to the study of nuclear processes. The last 25 years of nuclear physics, closely connected with cosmic rays and then with the creation of powerful accelerators, have led to a technical revolution and created new, exceptionally subtle research methods not only in physics but also in chemistry, biology, geology, agriculture and the most diverse fields of technology.

Accordingly, with the growth of physical research and its growing influence on the other natural sciences and on technology, the number of physics journals and books has increased. At the end of the 19th century in Germany, England, the USA and Russia only one physics journal, apart from the academic ones, was published in each country. At present more than two dozen such journals are published in each of these countries.

The number of research institutions and scientists has increased to an even greater extent. If in the 19th century scientific research was carried out mainly by the physics departments of universities, then in the 20th century research institutes devoted to physics or to its individual areas appeared in all countries and began to grow in number and scale. Some of these institutes, especially in the field of nuclear physics, possess equipment whose scale and cost exceed those of whole factories.

Formation of physics (before the 17th century). The physical phenomena of the surrounding world have long attracted people's attention. Attempts at a causal explanation of these phenomena preceded the creation of physics in the modern sense of the word. In the Greco-Roman world (6th century BC - 2nd century AD), ideas about the atomic structure of matter arose (Democritus, Epicurus, Lucretius), the geocentric system of the world was developed (Ptolemy), the simplest laws of statics were established (the lever rule), the law of rectilinear propagation and the law of reflection of light were discovered, the principles of hydrostatics were formulated (Archimedes' law), and the simplest manifestations of electricity and magnetism were observed.

The knowledge acquired by the 4th century BC was summed up by Aristotle. Aristotle's physics included some correct propositions, but at the same time it lacked many of the progressive ideas of its predecessors, in particular the atomic hypothesis. While recognizing the importance of experience, Aristotle did not consider it the main criterion of the reliability of knowledge, preferring speculative notions. In the Middle Ages the teaching of Aristotle, canonized by the church, slowed down the development of science for a long time.

Science revived only in the 15th and 16th centuries, in the struggle against the scholastic teaching of Aristotle. In the middle of the 16th century N. Copernicus put forward the heliocentric system of the world and laid the foundation for the liberation of natural science from theology. The needs of production, the development of crafts, navigation and artillery stimulated scientific research based on experience. However, in the 15th and 16th centuries experimental studies were mostly haphazard. Only in the 17th century did the systematic application of the experimental method in physics begin, and this led to the creation of the first fundamental physical theory - Newton's classical mechanics.

Formation of physics as a science (early 17th - late 18th centuries).

The development of physics as a science in the modern sense of the word began with the work of G. Galileo (first half of the 17th century), who realized the need for a mathematical description of motion. He showed that the action of surrounding bodies on a given body determines not its speed, as was held in Aristotelian mechanics, but its acceleration. This statement was the first formulation of the law of inertia. Galileo discovered the principle of relativity in mechanics (see Galileo's principle of relativity), proved that the acceleration of free fall of bodies does not depend on their density or mass, and substantiated the theory of Copernicus. He also obtained significant results in other areas of physics. He built a telescope with high magnification and made a number of astronomical discoveries with it (mountains on the Moon, the satellites of Jupiter, etc.). The quantitative study of thermal phenomena began after Galileo's invention of the first thermometer.

In the first half of the 17th century a successful study of gases began. Galileo's student E. Torricelli established the existence of atmospheric pressure and created the first barometer. R. Boyle and E. Mariotte investigated the elasticity of gases and formulated the first gas law, which bears their names. W. Snellius and R. Descartes discovered the law of refraction of light. At the same time the microscope was created. A significant step forward in the study of magnetic phenomena was made at the very beginning of the 17th century by W. Gilbert. He proved that the Earth is a large magnet and was the first to distinguish strictly between electrical and magnetic phenomena.

The main achievement of 17th-century physics was the creation of classical mechanics. Developing the ideas of Galileo, H. Huygens and other predecessors, I. Newton in his work "Mathematical Principles of Natural Philosophy" (1687) formulated all the basic laws of this science (see Newton's laws of mechanics). In the construction of classical mechanics the ideal of a scientific theory, which still exists today, was embodied for the first time. With the advent of Newtonian mechanics it was finally understood that the task of science is to find the most general, quantitatively formulated laws of nature.

Newtonian mechanics achieved its greatest success in explaining the motion of celestial bodies. Starting from the laws of planetary motion established by J. Kepler on the basis of the observations of T. Brahe, Newton discovered the law of universal gravitation (see Newton's law of gravity). Using this law it proved possible to calculate with remarkable accuracy the motion of the Moon, the planets and the comets of the solar system, and to explain the tides in the ocean. Newton adhered to the concept of action at a distance, according to which the interaction of bodies (particles) occurs instantaneously directly through the void; the forces of interaction must be determined experimentally. He was the first to formulate clearly the classical ideas of absolute space as a container of matter, independent of its properties and motion, and of absolute, uniformly flowing time. Until the creation of the theory of relativity these ideas did not undergo any changes.
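For reference, in modern notation (an illustration added here, not part of the source text) the law of universal gravitation is written

\[ F = G\,\frac{m_1 m_2}{r^2}, \]

where m1 and m2 are the masses of the two bodies, r the distance between them and G the gravitational constant.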

Of great importance for the development of physics was the discovery of the electric current by L. Galvani and A. Volta. The creation of powerful sources of direct current - galvanic batteries - made it possible to detect and study the manifold effects of current. The chemical action of the current was investigated (H. Davy, M. Faraday). V. V. Petrov obtained an electric arc. The discovery by H. C. Oersted (1820) of the action of an electric current on a magnetic needle proved the connection between electricity and magnetism. Proceeding from the unity of electrical and magnetic phenomena, A. Ampère concluded that all magnetic phenomena are due to moving charged particles - to electric current. Following this, Ampère experimentally established a law that determines the force of interaction of electric currents (Ampère's law).
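In today's SI notation (a later formulation, given here only as an illustration), the force per unit length between two long parallel conductors carrying currents I1 and I2 at a distance r is

\[ \frac{F}{l} = \frac{\mu_0 I_1 I_2}{2\pi r}, \]

with \(\mu_0\) the magnetic constant; Ampère's own formulation was expressed differently, in terms of current elements.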

In 1831, Faraday discovered the phenomenon of electromagnetic induction (see Electromagnetic induction) . Attempts to explain this phenomenon with the help of the concept of long-range action encountered significant difficulties. Faraday put forward a hypothesis (even before the discovery of electromagnetic induction), according to which electromagnetic interactions are carried out through an intermediate agent - an electromagnetic field (the concept of short-range interaction). This was the beginning of the formation of a new science about the properties and laws of behavior of a special form of matter - the electromagnetic field.

Even before the discovery of the law of conservation of energy, S. Carnot, in his work "Reflections on the Motive Power of Fire and on Machines Fitted to Develop that Power" (1824), obtained results that served as the basis for another fundamental law of the theory of heat - the second law of thermodynamics. This law was formulated in the works of R. Clausius (1850) and W. Thomson (1851). It is a generalization of experimental data indicating the irreversibility of thermal processes in nature, and it determines the direction of possible energy processes. A significant role in the construction of thermodynamics was played by the studies of J. L. Gay-Lussac, on the basis of which B. Clapeyron found the equation of state of an ideal gas, later generalized by D. I. Mendeleev.
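In its modern form (notation added here as an illustration), the Clapeyron-Mendeleev equation of state of an ideal gas reads

\[ pV = \frac{m}{M}\,RT, \]

where p is the pressure, V the volume, m the mass of the gas, M its molar mass, T the absolute temperature and R the universal gas constant.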

Simultaneously with the development of thermodynamics, the molecular-kinetic theory of thermal processes was developed. This made it possible to include thermal processes within the framework of the mechanical picture of the world and led to the discovery of a new type of laws - statistical ones, in which all relationships between physical quantities are of a probabilistic nature.

At the first stage of the development of the kinetic theory of the simplest medium - a gas - Joule, Clausius and others calculated the average values of various physical quantities: the speed of the molecules, the number of their collisions per second, the mean free path, and so on. The dependence of the gas pressure on the number of molecules per unit volume and on the average kinetic energy of the translational motion of the molecules was obtained. This made it possible to reveal the physical meaning of temperature as a measure of the average kinetic energy of the molecules.
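In modern notation (added here as an illustration), this result is usually written

\[ p = \tfrac{2}{3}\, n \langle \varepsilon \rangle, \qquad \langle \varepsilon \rangle = \tfrac{3}{2}\, kT, \]

where n is the number of molecules per unit volume, \(\langle \varepsilon \rangle\) the average kinetic energy of their translational motion and k Boltzmann's constant.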

The second stage in the development of the molecular-kinetic theory began with the work of J. C. Maxwell. In 1859, introducing the concept of probability into physics for the first time, he found the law of the distribution of molecules over velocities (see Maxwell distribution). After that, the possibilities of the molecular-kinetic theory expanded enormously and later led to the creation of statistical mechanics. L. Boltzmann built a kinetic theory of gases and gave a statistical justification of the laws of thermodynamics. The main problem, which Boltzmann managed to solve to a large extent, was to reconcile the time-reversible character of the motion of individual molecules with the obvious irreversibility of macroscopic processes. According to Boltzmann, the thermodynamic equilibrium of a system corresponds to the maximum probability of the given state, and the irreversibility of processes is connected with the tendency of systems toward the most probable state. Of great importance was the theorem he proved on the uniform distribution of the average kinetic energy over the degrees of freedom.
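In modern notation (an illustrative addition, not part of the source), the Maxwell distribution of molecular speeds is

\[ f(v) = 4\pi \left( \frac{m}{2\pi kT} \right)^{3/2} v^2 \exp\!\left( -\frac{m v^2}{2kT} \right), \]

where f(v)\,dv is the fraction of molecules with speeds between v and v + dv.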

Classical statistical mechanics was completed in the works of J. W. Gibbs (1902), who created a method for calculating the distribution functions of any system (not only gases) in thermodynamic equilibrium. Statistical mechanics received universal recognition in the 20th century, after A. Einstein and M. Smoluchowski (1905-06) constructed, on the basis of the molecular-kinetic theory, a quantitative theory of Brownian motion, confirmed in the experiments of J. B. Perrin.

In the second half of the 19th century the long process of studying electromagnetic phenomena was completed by Maxwell. In his main work, "A Treatise on Electricity and Magnetism" (1873), he established the equations of the electromagnetic field (which bear his name), which explained all the facts known at that time from a unified point of view and made it possible to predict new phenomena. Maxwell interpreted electromagnetic induction as the process of generation of a vortex electric field by an alternating magnetic field. Following this, he predicted the opposite effect - the generation of a magnetic field by an alternating electric field (see Displacement current). The most important result of Maxwell's theory was the conclusion that electromagnetic interactions propagate with a finite velocity equal to the speed of light. The experimental detection of electromagnetic waves by H. R. Hertz (1886-89) confirmed the validity of this conclusion. It followed from Maxwell's theory that light has an electromagnetic nature; thus optics became one of the branches of electrodynamics. At the very end of the 19th century P. N. Lebedev experimentally discovered and measured the light pressure predicted by Maxwell's theory, and A. S. Popov was the first to use electromagnetic waves for wireless communication.
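In the vector form later given to them by Heaviside and Hertz (modern SI notation, added here only as an illustration), Maxwell's equations read

\[ \nabla\!\cdot\!\mathbf{E} = \frac{\rho}{\varepsilon_0}, \quad \nabla\!\cdot\!\mathbf{B} = 0, \quad \nabla\!\times\!\mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad \nabla\!\times\!\mathbf{B} = \mu_0 \mathbf{j} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}, \]

and they yield electromagnetic waves propagating with the speed \( c = 1/\sqrt{\varepsilon_0 \mu_0} \), which coincides with the speed of light.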

Experience showed that the principle of relativity formulated by Galileo, according to which mechanical phenomena proceed identically in all inertial frames of reference, is also valid for electromagnetic phenomena. Therefore Maxwell's equations must not change their form (must be invariant) on passing from one inertial frame of reference to another. However, it turned out that this is true only if the transformations of coordinates and time in such a transition differ from the Galilean transformations valid in Newtonian mechanics. Lorentz found these transformations (the Lorentz transformations), but could not give them a correct interpretation. This was done by Einstein in his special theory of relativity.
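For a frame moving with velocity v along the x axis, the Lorentz transformations take the form (modern notation, given here as an illustration)

\[ x' = \gamma\,(x - vt), \qquad t' = \gamma\left( t - \frac{v x}{c^2} \right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

which reduce to the Galilean transformations when v is small compared with the speed of light c.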

The creation of the special theory of relativity showed the limitations of the mechanical picture of the world. Attempts to reduce electromagnetic processes to mechanical processes in a hypothetical medium - the ether - proved untenable. It became clear that the electromagnetic field is a special form of matter whose behavior does not obey the laws of mechanics.

In 1916, Einstein built the general theory of relativity - a physical theory of space, time and gravity. This theory marked a new stage in the development of the theory of gravitation.

At the turn of the 19th and 20th centuries, even before the creation of the special theory of relativity, the foundation was laid for the greatest revolution in the field of physics, associated with the emergence and development of quantum theory.

At the end of the 19th century it turned out that the distribution of the energy of thermal radiation over the spectrum, derived from the law of classical statistical physics on the equipartition of energy over the degrees of freedom, contradicts experiment. It followed from the theory that matter should radiate electromagnetic waves at any temperature, lose energy and cool to absolute zero - that is, that thermal equilibrium between matter and radiation is impossible. However, everyday experience contradicted this conclusion. A way out was found in 1900 by M. Planck, who showed that the results of the theory agree with experiment if one assumes, contrary to classical electrodynamics, that atoms emit electromagnetic energy not continuously but in separate portions - quanta. The energy of each such quantum is directly proportional to the frequency, the coefficient of proportionality being the quantum of action h = 6.6×10⁻²⁷ erg·s, later called Planck's constant.
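In the usual modern notation (an illustrative addition), Planck's relation is written

\[ \varepsilon = h\nu, \]

where \(\nu\) is the radiation frequency and h ≈ 6.63×10⁻³⁴ J·s (6.6×10⁻²⁷ erg·s) is Planck's constant.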

In 1905 Einstein extended Planck's hypothesis, assuming that an emitted portion of electromagnetic energy also propagates and is absorbed only as a whole, i.e. behaves like a particle (later called the photon). On the basis of this hypothesis Einstein explained the laws of the photoelectric effect, which do not fit into the framework of classical electrodynamics.
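Einstein's photoelectric equation, written here in its familiar modern form as an illustration, is

\[ h\nu = A + \frac{m v_{\max}^2}{2}, \]

where A is the work function of the metal and the second term is the maximum kinetic energy of the ejected electrons.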

Thus, on a new qualitative level, the corpuscular theory of light was revived. Light behaves like a stream of particles (corpuscles); at the same time, however, it also possesses wave properties, which manifest themselves, in particular, in the diffraction and interference of light. Consequently, wave and corpuscular properties, incompatible from the point of view of classical physics, are equally inherent in light (the duality of light). The "quantization" of radiation led to the conclusion that the energy of intra-atomic motions can also change only stepwise. This conclusion was drawn by N. Bohr in 1913.

In 1926 Schrödinger, seeking to obtain discrete values of the energy of the atom from an equation of wave type, formulated the basic equation of quantum mechanics, which now bears his name. W. Heisenberg and M. Born (1925) built quantum mechanics in another mathematical form - the so-called matrix mechanics.
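In its stationary form (modern notation, given here purely as an illustration), the Schrödinger equation for a particle of mass m in a potential U(\mathbf{r}) reads

\[ -\frac{\hbar^2}{2m}\,\nabla^2 \psi + U(\mathbf{r})\,\psi = E\,\psi, \]

where \(\psi\) is the wave function and E the energy; the discrete energy values of the atom appear as the eigenvalues of this equation.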

According to the Pauli principle, the energy of the entire set of free electrons in a metal is different from zero even at absolute zero. In the unexcited state all energy levels, from zero up to some maximum level (the Fermi level), are occupied by electrons. This picture allowed Sommerfeld to explain the small contribution of the electrons to the heat capacity of metals: on heating, only the electrons near the Fermi level are excited.
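Sommerfeld's result can be summarized (in modern notation, added here as an illustration) by the electronic heat capacity

\[ C_e \approx \frac{\pi^2}{2}\, N k \,\frac{T}{T_F}, \]

which, for ordinary temperatures T small compared with the Fermi temperature T_F, is far smaller than the classical value \( \tfrac{3}{2} N k \).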

In the works of F. Bloch, H. A. Bethe and L. Néel the quantum theory of ferromagnetism and antiferromagnetism was developed. The first attempts to study the structure of the atomic nucleus directly date back to 1919, when Rutherford, by bombarding stable nitrogen nuclei with alpha particles, achieved their artificial transformation into oxygen nuclei. The discovery of the neutron in 1932 by J. Chadwick led to the creation of the modern proton-neutron model of the nucleus (D. D. Ivanenko, Heisenberg). In 1934 I. and F. Joliot-Curie discovered artificial radioactivity.

The creation of charged particle accelerators made it possible to study various nuclear reactions. The most important result of this phase of physics was the discovery of nuclear fission.

In 1939-45 nuclear energy was released for the first time by means of the chain fission reaction of uranium-235, and the atomic bomb was created. The merit of using the controlled fission reaction of uranium-235 for peaceful, industrial purposes belongs to the USSR: in 1954 the first nuclear power plant was built in the USSR (in the city of Obninsk). Later, cost-effective nuclear power plants were created in many countries.

The neutrino and many new elementary particles were discovered, including extremely unstable particles - resonances, whose average lifetime is only 10⁻²² to 10⁻²⁴ s. The discovered universal mutual convertibility of elementary particles indicated that these particles are not elementary in the absolute sense of the word, but have a complex internal structure that has yet to be uncovered. The theory of elementary particles and their interactions (strong, electromagnetic and weak) is the subject of quantum field theory - a theory that is still far from complete.