This new theory suggests the Big Bang never occurred. Instead, the Universe has simply existed forever.
A new quantum equation suggests that the Universe has no beginning or end, and it could also account for dark matter and dark energy.
Researchers have created a new model that applies our latest understanding of quantum mechanics to Einstein’s theory of general relativity - and according to their calculations, the Universe may have existed forever.
Having been a mathematical modeler all my career, I find this fascinating. I learned early on that models are made for limited purposes and will usually end up being discarded for better models.
This new one may or may not hold up, but it opens up a discussion about the relationship of hard science, especially physics, to reality, a discussion that complexity theory has been carrying on for a long time.
Read on below to learn more.
Complexity theory as expounded by the late Robert Rosen has always kept us aware that physics is a model of reality that has created a surrogate world. He made clear that physics is special and biology (à la complexity theory) is more general.
The clarification of the distinction between simple and complex scientific models became, in later years, a major goal of Rosen's published work. Rosen maintained that modeling is at the very essence of science and thought. His book Anticipatory Systems describes, in detail, what he termed the modeling relation. He showed the deep differences between a true modeling relation and a simulation, the latter not being based on such a modeling relation.
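For readers who haven't seen it, the modeling relation can be summarized as follows (this is my paraphrase of the standard four-arrow diagram, not a quote from Rosen): a natural system $N$ with its causal entailment, a formal system $F$ with its inferential entailment, and an encoding map $\varepsilon$ and a decoding map $\delta$ between them. $F$ is a model of $N$ when the diagram commutes:

$$\text{causal entailment in } N \;=\; \delta \circ (\text{inferential entailment in } F) \circ \varepsilon$$

A simulation, by contrast, merely mimics behavior; there is no such commuting encoding and decoding between the causes in the system and the inferences in the formalism.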
In mathematical biology he is known as the originator of a class of relational models of living organisms, called (M,R)-systems, that he devised to capture the minimal capabilities a material system would need in order to be one of the simplest functional organisms commonly said to be "alive". In this kind of system, M stands for the metabolic subsystem and R for the 'repair' subsystem of a simple organism, for example active 'repair' RNA molecules. Thus his mode of determining, or "defining", life in any given system is functional rather than material, although in his 1970s publications he did consider specific dynamic realizations of the simplest (M,R)-systems in terms of enzymes (M), RNA (R), and functional, duplicating DNA (his beta-mapping).
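In the usual relational-biology notation (my summary of the standard presentation; the symbols are the conventional ones, not anything from this diary), the simplest (M,R)-system consists of three mappings:

$$f : A \to B \quad \text{(metabolism, M)}$$
$$\Phi : B \to H(A,B) \quad \text{(repair, R)}$$
$$\beta : H(A,B) \to H(B, H(A,B)) \quad \text{(replication, the beta-map)}$$

Here $H(A,B)$ denotes the set of mappings from $A$ to $B$: repair rebuilds the metabolic map $f$ from metabolic output, and the beta-map rebuilds the repair component, closing the causal loop inside the system itself.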
He went even farther in this direction, claiming that when studying a complex system one "can throw away the matter and study the organization" to learn what is essential to defining an entire class of systems in general. This has been taken too literally by a few of his former students, who have not completely assimilated Robert Rosen's injunction that a theory of dynamic realizations of such abstract components in specific molecular form is needed in order to close the modeling loop for the simplest functional organisms (such as, for example, single-cell algae or microorganisms). He supported this claim (which he actually attributed to Nicolas Rashevsky) with the observation that living organisms are a class of systems with an extremely wide range of material "ingredients", different structures, different habitats, and different modes of living and reproduction, and yet we are somehow able to recognize them all as living, or functional, organisms without being vitalists.
You can peruse my diaries to find out more about complexity theory and its many insights into the limits of reductionist science. In case you wonder why I write this stuff here: it is because the whole issue is replete with political ramifications. Let me explain again.
Our political universe is replete with apparent conflicts between religion and science. The "struggle" goes back a long way. It was Descartes who made the deal with the Church that set the stage for the present situation. He did two very important things that underlie much of what some call science today. These are things that complexity theory has gone beyond and rejects.
The first is the mind/body dualism. The idea was that science could study the body, but the Church would be where the mind was to be kept.
Closely coupled was the second: the machine metaphor. Seeing all of nature, especially living things, as special forms of machines led to Cartesian reductionism. This idea is behind most of what physics is all about. The models are all based on the idea that in order to understand a complex system you need to reduce it to simpler parts, eventually atoms and molecules. The idea that crucial information is lost in the reduction has never been dealt with successfully, and it was complexity theory, especially that started by Rosen in the 1950s at Chicago, that pointed to the flaws of reductionism.
Our problems with religion ignore the fact that the ongoing "struggle" is founded on a collusion. In fact, Descartes's epistemology makes god necessary. Complexity theory has shown that organisms and other complex systems can only be realistically modeled with impredicative models, which are replete with closed loops of causality. Descartes's reduction to machines supplied a set of models that had no such loops, and in fact ruled that any discussion of such things is "unscientific".
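To make "closed loops of causality" a little more concrete, here is a deliberately crude toy in Python (my own sketch, not Rosen's formalism, and no claim that a computable toy captures genuine impredicativity): the metabolic map produces output, and that very output is what rebuilds the metabolic map, so each component is entailed from inside the loop rather than by an external cause.

```python
def make_metabolism(scale):
    """Build a toy metabolic map f: a -> b (here just multiplication)."""
    return lambda a: scale * a

def repair(b):
    """Toy repair map: rebuild the metabolism from metabolic output b.
    This closes the loop: f's output feeds repair, repair's output is f."""
    return make_metabolism(scale=b % 7 + 1)

f = make_metabolism(scale=3)
for _ in range(5):      # run the loop: f -> b -> repair -> new f
    b = f(2)
    f = repair(b)
```

The point of the toy is only the wiring: once the loop is running, nothing outside it produces f. Rosen's actual argument goes further, holding that real (M,R)-systems cannot be fully captured by machine-style simulations like this one, which is exactly the limit of reductionist models this diary is describing.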
The realm of quantum physics is one in which many of the machine-like models have been discarded or limited to special circumstances. I won't even try to elaborate on this. The new model is just an illustration of how physics is a set of models. Reductionist models are fraught with traps, and religious people can often spot them. The whole struggle is moot once complexity theory eliminates the need for god that reductionism made necessary. The need for an external causal entity is eliminated. Problem solved.