Why information?


Why is information more fundamental than matter, and why did science fail to recognize this for so long?

Contemporary physics increasingly describes reality in a language reminiscent of information theory. We speak of states, amplitudes, probabilities, entanglement, entropy, and coding. Yet information is still treated as something secondary—a tool for describing the world, not its foundation.

This is not an oversight or intellectual blindness. It is a consequence of the path science has taken.

“If you want to understand the secrets of the universe, think in terms of energy, frequency and vibration.”
— Nikola Tesla


Tesla formulated this statement thinking in terms of energy and fields, not information in the modern sense. He did not have the language of information theory or the concept of computational reality. And yet the intuition he expressed resonates surprisingly well with later conclusions: it shifts attention away from matter as "things" and toward processes, relationships, and dynamics. In this sense, Tesla's message can be read not literally but structurally—as an invitation to think about the Universe not in terms of objects, but in terms of what is happening there and how it is implemented. It is this shift in perspective that forms the starting point for the Māyā Theory.

Physics arose without the language of information

Fundamental theories of physics were born in a world that lacked the concept of information in a fundamental sense. Newton's classical mechanics described reality as a collection of material objects moving in absolute space and time. It was a continuous, deterministic, and fully material world.

At the beginning of the 20th century, the theory of relativity and quantum mechanics emerged, radically changing our understanding of the world. Time and space became relative, and the behavior of matter ceased to be deterministic. Yet, despite this revolution, the language of physics remained the language of matter, fields, and geometry.

Einstein himself, the creator of general relativity, lamented that his equations only described "how" mass curves space, but did not explain "why." He sought a deeper mechanism—a deterministic one, hidden beneath the probabilistic chaos of quantum mechanics. The famous "God does not play dice" was an expression of this anxiety: he wanted a description that was not merely mathematical, but mechanistic, causal.

For him, a quantum state was a mathematical object, not a carrier of information. Spacetime was a geometric structure, not a process. Gravity was a curvature, not a consequence of the dynamics of a system. Einstein couldn't find this mechanism because he didn't yet have the language of information processing or computers — the tools that would allow him to see reality as a computable process.

At that time, computer science did not yet exist, so there was no language that would allow us to ask how reality is processed.

When information arrived, the paradigm was already established

Shannon's information theory was not developed until 1948. Digital computers, the concepts of bits, algorithms, clock speeds, and local processing were products of the second half of the 20th century. These were technical tools, not ontological ones.
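The core quantity of Shannon's 1948 theory can be stated in a few lines. As a minimal sketch (not part of the original text), this is the entropy of a probability distribution, measured in bits:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits (Shannon, 1948)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin carries less: its outcome is more predictable.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Note that the definition says nothing about what physically carries the probabilities — which is precisely why, as the text argues, it entered physics as a descriptive tool rather than a foundation.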

By this time, the foundations of physics had already been formulated. Quantum mechanics, general relativity, and the Standard Model operated in a language that did not assume that reality computed anything. When information began to appear in physics — in the form of entropy, quantum information, or the holographic principle — it was interpreted as a property of physical systems, not their cause.

Information was added to the image of the world instead of becoming its starting point.

Matter as an assumption, not a conclusion

Historically, physics has always begun with the assumption that something exists: particles, fields, space-time. Only later was their behavior studied. Even the most abstract constructs were treated as primordial entities.

In this approach, information could only describe the state of matter. It was never treated as a mechanism that generated that matter. This reversal of logic is crucial.

If we start with matter, information will always be secondary. If we start with information, matter may turn out to be an effect.

Physical constants as a trace of a deeper level

One of the strongest signals that a material starting point is insufficient is the physical constants themselves. The speed of light, Planck's constant, and the fine-structure constant all appear in equations as inputs. Theories work because they incorporate them — not because they explain them.

Particularly telling is the fine-structure constant, a dimensionless number close to 1/137 that quantifies the strength of electromagnetic interactions. It is measured with incredible precision, yet it is not derived from the geometry, symmetry, or dynamics of the Standard Model. Without it, atoms would not be stable, chemistry would not exist, and life as we know it would be impossible.
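The point that the constant is an input rather than an output can be made concrete. The sketch below computes the fine-structure constant from the CODATA 2018 values of the elementary charge, vacuum permittivity, reduced Planck constant, and speed of light — every quantity on the right-hand side is measured, and the dimensionless result is not predicted by any of them:

```python
import math

# CODATA 2018 values (SI units)
e    = 1.602176634e-19    # elementary charge, C (exact by definition)
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J*s
c    = 299792458.0        # speed of light, m/s (exact by definition)

# alpha = e^2 / (4 * pi * eps0 * hbar * c); all units cancel.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(1 / alpha)   # ~137.036
```

The calculation only repackages measured inputs; nothing in the Standard Model tells us why the result is close to 1/137 rather than some other number.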

Dimensionless numbers do not describe matter. They describe relationships, proportions, and structure. They are code signatures, not properties of objects.

Why a synthesis of many fields was needed

There's an even deeper reason why the informational nature of reality has remained invisible for so long. Its recognition required the synthesis of multiple fields that, for most of scientific history, had developed entirely independently.

Physics, mathematics, computer science, information theory, cybernetics, and complexity science each had their own languages and their own problems. Modern theoretical physics has become a highly specialized science. This model of work is extremely effective at delving into details, but it hinders asking questions about the mechanism as a whole.

The question of the foundations of reality does not belong to any single discipline. It requires a simultaneous understanding of mathematical structures, physical processes, and information processing principles. For a long time, such a language simply did not exist.

New cognitive tools

In recent years, however, something else has changed — the very tools of thought. The development of artificial intelligence systems and symbolic analysis tools has radically accelerated the process of knowledge synthesis.

Tasks that previously required years of work — comparing formalisms, tracking the consequences of assumptions, connecting distant areas of knowledge — can now be performed on an incomparably shorter time scale.

This doesn't mean that machines "understand" reality. Their role is to enable human intuition where it was previously overwhelmed by information overload. AI isn't a source of new ideas, but a catalyst that allows them to crystallize more quickly.

This made it possible to ask a question that was previously practically impossible: can all known laws of physics be the result of a single, simple processing mechanism?

Changing the starting point

If reality is a process rather than a collection of objects, information is not a description of the world but its operational substance. Matter, energy, time, and space become effects of local processing rules, not primordial entities.

In this approach, the Planck boundary ceases to be a "wall of cognition" and becomes a code resolution. Gravity ceases to be a force and becomes the result of processing overload. Physical constants cease to be arbitrary numbers and become architectural parameters.
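The Planck boundary mentioned here is itself derived purely from the constants discussed above. As a sketch, the standard dimensional-analysis combination of ħ, G, and c yields the Planck length and time (the "code resolution" reading is the theory's interpretation, not part of the calculation):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 299792458.0       # speed of light, m/s

# Unique length and time scales constructible from hbar, G, and c alone.
planck_length = math.sqrt(hbar * G / c**3)   # ~1.616e-35 m
planck_time   = planck_length / c            # ~5.39e-44 s

print(planck_length, planck_time)
```

Below these scales, the established theories stop giving consistent answers — which is exactly why the text proposes reinterpreting the boundary as a resolution limit rather than a wall.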

This is not an opposition to science. It is a consequence of its development.

And perhaps that is why it has only now become possible to see what has long remained invisible – that reality is not based on matter, but on information.
