Chaos theory and complexity theory took shape soon after the invention of computers powerful enough to study complex phenomena. Many believe this new field of science can change human life as fundamentally as the harnessing of electricity did in the era of Michael Faraday and Thomas Edison. Although most advances in the science of complexity have come over the past forty years, many earlier scientists did research that has been instrumental in understanding complex phenomena.

In the 1880s, King Oscar II of Sweden sponsored a mathematical competition whose main prize was to go to anyone who could express the motion of three celestial bodies with mathematical formulas. The question bears directly on calculating the motion of planets in space.

When only two bodies move relative to each other, one simply orbits the other. In that case their motion can easily be expressed with mathematical formulas using Newton's laws, and their subsequent positions predicted. But add a third body, as in the Earth, Moon, and Sun system, and calculating the motion with mathematical formulas becomes enormously complicated.

One object may, for instance, orbit a central body while a third object orbits the first. In such cases, closed-form calculations based on Newton's laws no longer suffice.

It was Henri Poincaré (1854-1912) who proved that the problem of the motion of three celestial bodies cannot be solved in general closed form. As he observed: "As soon as the earth moves, it changes its distance from the other bodies (moon and sun), which in itself changes the gravitational forces." All three bodies interact with one another in such a way that an exact general formula for their behavior is impossible.

Now bear in mind: if we cannot calculate the motion of even three objects, how can we predict the behavior of the systems we see in daily life, most of which result from the interaction of millions or billions of agents?

The field of cybernetics took shape in the 1940s. Louis Kauffman, former president of the American Society for Cybernetics, defined the new field of study as follows: cybernetics is the study of systems and processes that interact with themselves and produce themselves from themselves. Cybernetics connects many fields of study, from control systems to electrical networks and evolutionary biology. Norbert Wiener and W. Ross Ashby were its pioneers.

The popularity of cybernetics among scientists has seen many ups and downs. Because it interconnects so many different fields of knowledge, it has often been overtaken by other, faster-evolving fields.

The theory of complexity is one of those areas that is heavily inspired by cybernetics, but continues to evolve independently.

Ludwig von Bertalanffy approached the same territory from the standpoint of general systems theory, which he developed in parallel with cybernetics. He stressed that traditional closed-system models cannot describe the kinds of open systems found in the world around us.

His research had important implications for cybernetics as well as for the study of dissipative systems. Systems theory emphasizes a holistic view rather than reductionism. Von Bertalanffy devoted much of his work to social systems, applying his ideas to anthropology, economics, political science, and psychology. Margaret Mead and Gregory Bateson were likewise instrumental in bringing general systems theory into the social sciences.

In the early 1960s, Edward Lorenz was using a computer to simulate weather. One day, pressed for time but still wanting to rerun his model, he stumbled on a very strange phenomenon. To save time he re-entered the numbers from a printout, rounding off the last few decimal places, and expected the rounding to have a negligible effect on the results. Strangely, the results turned out to be wildly different from what he expected.

Lorenz realized that a very small change in the initial state of a system can produce far-reaching changes in its final outcome. He called this phenomenon sensitivity to initial conditions. Before this observation, many believed that large-scale change required large, strong forces; Lorenz saw that even tiny influences can have enormous effects. The phenomenon is also known as the butterfly effect: the literature on complexity and chaos puts it, metaphorically, as the fluttering of a butterfly in Japan causing a storm in the United States. Lorenz concluded that if a small change in the initial state of a complex system can cause dramatic changes in its final outcome, then accurate long-term weather forecasting is impossible.
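Lorenz's accident is easy to reproduce with any chaotic rule. The sketch below is a minimal illustration (it uses the logistic map, a standard textbook system, rather than Lorenz's actual weather equations): two runs start from values that differ only in the fourth decimal place, mimicking his rounding, and soon bear no resemblance to each other.

```python
def logistic(x, r=4.0):
    # One step of the logistic map, a standard example of a chaotic rule.
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    # Iterate the map, keeping the whole history of values.
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

# Two starting points that differ only in the fourth decimal place,
# mimicking the rounding Lorenz applied when re-entering his printout.
a = trajectory(0.506127, 50)
b = trajectory(0.506, 50)
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(f"initial gap: {abs(a[0] - b[0]):.6f}, largest later gap: {max_gap:.3f}")
```

The gap between the two runs starts microscopic and grows until the trajectories are effectively unrelated; rounding the input did not merely round the output.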

In the early 1970s, Robert May, who was studying how insect populations rise and fall with the food supply, came across results similar to Lorenz's. He found that at a critical point the population's stable cycle doubled in period and then settled into a new stable pattern; after several such doublings, the population dynamics become unpredictable. Period doubling is one of the most important concepts across the branches of complexity science.
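May's doubling cascade can be seen directly in the logistic map, the simple population model he worked with. The sketch below (a minimal illustration with arbitrarily chosen growth rates) iterates the map past its transient and counts how many distinct values the population keeps cycling through.

```python
def attractor_size(r, transient=1000, sample=64, x0=0.2):
    # Iterate past the transient, then count the distinct values the
    # orbit keeps visiting (rounded to absorb floating-point noise).
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1.0 - x)
        seen.add(round(x, 4))
    return len(seen)

for r in (2.9, 3.2, 3.5, 3.9):
    print(f"r = {r}: orbit settles onto {attractor_size(r)} value(s)")
```

At growth rates 2.9, 3.2, and 3.5 the orbit settles onto 1, 2, and 4 values, the successive doublings May observed; at 3.9 it never settles, which is the unpredictable regime.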

In 1971, David Ruelle and Floris Takens introduced strange attractors. By describing strange attractors mathematically, they could relate the variables of a system to the dimensions of its phase space, and in this way characterize a system and its dynamics with great precision. (Note: all of these concepts are described in later sections.)

In the 1970s and 1980s, Benoît Mandelbrot used computers to draw what he named fractals. A fractal is a self-similar shape, in which a basic form repeats at every scale. Look at a fern, for example: each branch resembles the whole plant, and each sub-branch repeats the same structure again; that repeating basic form is the fractal.
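The easiest way to get a feel for this is to draw one. The sketch below is a minimal example using the set that bears Mandelbrot's name rather than a fern: a point c is kept if iterating z -> z*z + c never escapes, and zooming into any part of the resulting boundary reveals miniature copies of the whole.

```python
def in_mandelbrot(c, max_iter=100):
    # A point c belongs to the Mandelbrot set if iterating z -> z*z + c
    # from z = 0 never escapes; once |z| > 2 escape is guaranteed.
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return False
    return True

# A coarse ASCII rendering of the set; finer grids over any boundary
# region keep revealing more structure, which is self-similarity at work.
for im in [0.8 - 0.2 * k for k in range(9)]:
    row = "".join("#" if in_mandelbrot(complex(re, im)) else "."
                  for re in [-2.0 + 0.05 * k for k in range(61)])
    print(row)
```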

Ilya Prigogine worked on dissipative systems, work for which he received the Nobel Prize. A dissipative system is one that keeps its evolving form, structure, and identity only because of a constant flow of energy in and out. Every human body is a dissipative system: it maintains its shape and identity thanks to continual flows of energy in the forms of food, water, air, environmental stimuli, and cognitive processes. Dissipative systems always operate far from equilibrium. Prigogine investigated the strange behavior of chemical dissipative systems in which the color of the mixture changes periodically; the question on his mind was how a molecule in the mixture "knows" when to change color.
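A feel for such oscillating mixtures can be had from the Brusselator, a textbook two-variable reaction model that originated in Prigogine's Brussels group (the sketch below is a simplified stand-in, not the laboratory systems themselves). Held far from equilibrium by fixed feed concentrations A and B, the concentration of the intermediate X rises and falls periodically instead of settling to a steady value.

```python
# Brusselator rate equations, integrated with a simple Euler scheme:
#   dx/dt = A + x*x*y - (B + 1)*x
#   dy/dt = B*x - x*x*y
A, B = 1.0, 3.0          # feed concentrations held fixed (far from equilibrium)
dt, steps = 0.001, 50_000
x, y = 1.0, 1.0
xs = []
for _ in range(steps):
    dx = A + x * x * y - (B + 1.0) * x
    dy = B * x - x * x * y
    x, y = x + dt * dx, y + dt * dy
    xs.append(x)

# Count how many times the concentration of X peaks over the run.
peaks = sum(1 for i in range(1, len(xs) - 1)
            if xs[i - 1] < xs[i] > xs[i + 1] and xs[i] > 1.5)
print(f"X peaked {peaks} times: the mixture oscillates instead of settling down")
```

With these parameter choices the fixed point is unstable, so the concentrations orbit a limit cycle; that sustained rhythm is the chemical analogue of the periodic color change Prigogine observed.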

Mitchell Feigenbaum also studied period doubling in the late 1970s. He showed that period doubling is one of the universal routes by which ordered behavior gives way to chaos, and he found that the ratios between successive doubling intervals converge to a constant, now known as the Feigenbaum number.
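The convergence is easy to check numerically. The sketch below uses well-known published parameter values of the logistic map at which the orbit's period doubles, and prints the ratio of each doubling interval to the next.

```python
# Well-known parameter values r_n of the logistic map x -> r*x*(1-x)
# at which the orbit's period doubles: 1->2, 2->4, 4->8, 8->16, 16->32.
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Ratio of each doubling interval to the next one.
ratios = [(r[n] - r[n - 1]) / (r[n + 1] - r[n]) for n in range(1, len(r) - 1)]
for n, ratio in enumerate(ratios, start=1):
    print(f"interval ratio #{n}: {ratio:.4f}")
# The ratios home in on Feigenbaum's constant, roughly 4.6692.
```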

René Thom developed catastrophe theory, which concerns how a complex system bifurcates. His idea was that when a system reaches a critical point through period doubling, it must either disintegrate into chaos or reach a new level of complexity through self-organization. Thom studied how a system tips over into chaos and the conditions that lead it there.

In 1984, the Santa Fe Institute was established as an independent private research and education center. Since then, the center has been at the forefront of complexity science and chaos research.

Two of the most famous researchers at the Santa Fe Institute are:

Chris Langton has done the most research on the edge of chaos. At the edge of chaos, a system has enough order to maintain its identity and form, and at the same time enough disorder to create room for innovation and learning. It is at the edge of chaos that the twin phenomena of self-organization and emergence can appear.

Stuart Kauffman began his research with randomly interconnected Boolean networks simulated in small computer programs. When the nodes of such a network interact, unexpected results appear: in most configurations the behavior stays orderly and predictable, but beyond a critical level of interconnection the system tunes its own performance through co-adaptation. His research has had very important implications for evolutionary biology.
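A toy version of such a network fits in a few lines. The sketch below is a minimal illustration with made-up sizes and a fixed random seed, not a reconstruction of Kauffman's experiments: each of N nodes reads K random neighbors through a random Boolean rule, and although the state space holds 2**N configurations, the deterministic dynamics must eventually fall onto a repeating cycle, typically far shorter than the state space.

```python
import random

# A tiny random Boolean network in the spirit of Kauffman's models
# (illustrative parameters only).
rng = random.Random(42)
N, K = 12, 2
inputs = [rng.sample(range(N), K) for _ in range(N)]            # random wiring
tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # Each node looks up its next value from its inputs' current values.
    return tuple(tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                 for i in range(N))

state = tuple(rng.randint(0, 1) for _ in range(N))
seen = {}
t = 0
while state not in seen:        # run until some state repeats
    seen[state] = t
    state = step(state)
    t += 1
cycle = t - seen[state]
print(f"out of 2**{N} = {2 ** N} possible states, the network settled "
      f"into a cycle of length {cycle}")
```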

James Gleick wrote the 1987 book Chaos: Making a New Science, which carried the subjects of chaos and complexity theory into the management sciences and many other fields. Although Gleick played no role in developing the scientific principles of chaos, his book made chaos a popular subject among scholars.