


What Is Randomness? A Philosophical and Scientific Inquiry into Determinism, Probability, and Uncertainty

 

Levent Özbek

Ankara University, Department of Statistics

Ankara/Turkey

ozbek@science.ankara.edu.tr

 

"Those who seek gold dig much earth and find little." – Heraclitus

Abstract

This paper explores the concept of randomness from both philosophical and scientific perspectives. Drawing from determinism, causality, and probabilistic frameworks, it examines the interplay between necessity and chance in nature, society, and scientific modeling. By tracing randomness through statistical theories, quantum uncertainty, and algorithmic incompressibility, the study reveals how randomness is not merely a product of ignorance but a fundamental component of reality. The work also discusses the limitations of classical determinism and the evolving role of probability in understanding natural phenomena.

Keywords

randomness, determinism, probability, uncertainty, statistical modeling, philosophy of science

 

Introduction


This text is not intended as a structured study with a particular aim, but rather as an attempt to communicate understanding to those capable of understanding.

In their ongoing efforts to comprehend and explain the events unfolding in the universe, humans construct various models related to the phenomena and processes they are interested in. By working with these models, they attempt to forecast potential future outcomes. A model is a representation of the structure and functioning of a real-world system, expressed within the framework of concepts and laws belonging to a specific scientific discipline. A model is a depiction, a representation, of a real-world phenomenon. It is not the phenomenon itself, and no matter how complex it may appear, it is always an incomplete portrayal of reality. In short, what we refer to as a model is a product of the modeler's own understanding of reality. Models can be classified in various ways. Among them, mathematical models are considered the most expressive and valid. The modeling approach may vary depending on the perspective of the modeler toward the real-world phenomenon.

The deterministic worldview, along with the associated concepts of causality and randomness, remains a subject of intense discussion across disciplines such as science, philosophy, and art.

Psychologists often interpret the human yearning for certainty as a desire to return to the earliest days of childhood, when a person felt safe under the protection of their parents and had yet to experience doubt or uncertainty. This desire is often reinforced by an upbringing that conditions the child to see doubt as a vice and faith as a moral imperative. Yet the quest for certainty is one of the most dangerous sources of error, for it often accompanies the claim to superior knowledge.

Everything that exists in the universe is a product of both randomness and necessity.

Philosophical Approach


A random event may, over time, evolve into a necessity. From the perspective that necessity always manifests through randomness, randomness is thus the complement of necessity; that is, a necessary event is always completed by random components.

Randomness is not simply an unknown necessity. Knowing the causes of a random event does not strip it of its randomness. Every phenomenon in the universe arises necessarily from its internal causes. However, each phenomenon is also influenced by external causes. Unlike internal causes, external causes are neither fundamental nor determinative. In both nature and society, everything is interconnected in one form or another; all things influence each other.

For a person, where they are born or the environment in which they grow up may be random; however, the need for food and drink to sustain life is a necessity. To determine whether a phenomenon is random or necessary, it is essential to understand whether it results from an internal or external cause. With appropriate interventions, events involving randomness can be reduced or even eliminated. Winning a lottery is considered chance, but purchasing a ticket is a necessary precondition. A traffic accident may appear random; yet in a society where traffic regulations are followed, roads are orderly and safe, and public transportation is used widely, the likelihood of accidents decreases significantly.

Randomness is objective; that is, it is independent of human thought or will. In some cases, randomness may temporarily obstruct the functioning of necessity. Yet necessity will carve out its own path through randomness and fulfill its role.

In nature and society, necessity is the product of lawfulness. Every law represents the manifestation of the necessity that governs objects and events. Each phenomenon is related to another, either through an essential and internal connection or through a non-essential and external one. Randomness always conceals an objective necessity, that is, a law. Declaring that the occurrence of an event is necessary implies that all of its relations have been resolved, that it has gained resistance against external influences, and thus cannot be altered or reversed.

A large number of events, the aggregate of which produces a general law, each deviate from that law to varying degrees. This deviation itself constitutes randomness. Here, the law does not fail; it becomes observable. If a law functions while allowing for certain deviations, then that law operates precisely through randomness. In this sense, a law is never absolute; it is approximate.

All scientific disciplines, in essence, rely on statistical modeling. According to the kinetic theory of gases, each gas molecule follows a highly complex path. However, thanks to the law of large numbers, observable average behavior conforms to simple laws such as those of Mariotte or Gay-Lussac. Gas molecules inside a closed container collide randomly with each other and with the container walls, but the pressure exerted on each wall remains necessarily equal. Beneath the seemingly random motion of gas molecules lies physical and chemical necessity.
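
The equal-pressure picture above can be illustrated with a small Monte Carlo sketch (a hypothetical example, not from the original text): if each molecular collision is modeled as landing on one of a container's six walls uniformly at random, the share of hits per wall evens out as the number of collisions grows, a toy instance of the law of large numbers at work.

```python
import random

random.seed(42)

def wall_hits(n_collisions, n_walls=6):
    """Count how many randomly placed collisions land on each wall."""
    counts = [0] * n_walls
    for _ in range(n_collisions):
        counts[random.randrange(n_walls)] += 1
    return counts

# The gap between the most-hit and least-hit wall shrinks as n grows.
for n in (60, 6_000, 600_000):
    shares = [c / n for c in wall_hits(n)]
    print(f"n={n:>7}  max-min share = {max(shares) - min(shares):.4f}")
```

Individually random collisions thus produce a necessarily near-uniform pressure, which is the point of the kinetic-theory example.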

Science always accounts for the phenomenon of randomness and attempts to minimize it. Within the principle of universal interconnection, no phenomenon is absolute. Everything proceeds through a constant process of emergence and disappearance, with transitions and transformations from one state to another.

Scientific and Philosophical Synthesis


Many seemingly isolated and singular events may, in fact, be integrated into a broader framework of general relationships and regularities. Once these relations become reproducible, general, and persistent, the state of randomness ceases.

A random event is one that may or may not occur under certain conditions. In the history of philosophical science, this concept has often taken a dual form and has been treated as contradictory. Some philosophers have denied randomness entirely by asserting that all events conform to necessity and laws, while others have attributed all events to pure chance.

What, then, is the intrinsic relationship between necessity and randomness in the world around us? One possible answer is this: There is nothing that must necessarily occur which could not occur otherwise. Any event, no matter how implausible, may still be possible in some way or form. From this perspective, nothing is impossible, and nothing is strictly necessary. Those who embrace this idea, who believe in the possibility of everything and ignore necessity, are known in philosophy as indeterminists.

However, science demonstrates that everything in nature is subject to laws and governed by an unyielding necessity. Nothing can occur in any way other than it does in reality. Assuming that an event could happen contrary to established laws and without necessity implies a miracle, an event without cause. Yet miracles do not occur and, in fact, cannot occur.

Some philosophers have argued that nothing in nature is random and that all events are predetermined. Those who adopt this deterministic worldview found validation in Newtonian classical mechanics. For a long time, mechanistic determinism dominated scientific and philosophical thought.

But as science began to confront more complex realities beyond the trajectories of individual objects, this mechanistic determinism began to unravel. The belief that every event is predetermined and inevitable leads directly to fatalism. Accepting that everything is unchangeably preordained is the logical consequence of mechanical determinism.

Biological and Probabilistic Perspectives


In the plant and animal kingdoms, it is evident that no organism lives forever; this is a law of nature. However, there is no law that determines the exact date and time when a growing tree will die. The same applies to all living things.

There is no ultimate theory of the universe; only models that increasingly approximate the truth. Science, by its very nature, is a progressive discipline. Every step forward reveals both new truths and new uncertainties.

Modern probability theory attempts to mathematically formalize randomness. A probability value is not merely a measure of belief but is grounded in long-run frequency and statistical regularity. Despite this, probability cannot describe the reason behind the occurrence of a single event; it only measures the likelihood across many repetitions.

The moment an individual event occurs, the probability becomes either 0 or 1. Thus, probability theory does not explain why something did happen, only how likely it was to happen.

Randomness, as expressed in mathematics and statistics, represents a way to model uncertainty, but it is always relative to known or assumed structures. As these structures evolve or are revised, so too does our concept of randomness.

Modern Scientific Perspective and Uncertainty


Modern science defines randomness not as pure chaos or ignorance, but as the product of complex interactions that cannot be fully resolved or reduced at the current level of understanding. In this context, randomness is not an epistemic deficiency, but an ontological aspect of how the universe operates.

The emergence of quantum mechanics in the 20th century drastically altered our understanding of determinism. Heisenberg's uncertainty principle showed that, at the quantum level, even perfect knowledge of one variable (like position) imposes limits on the accuracy with which another variable (like momentum) can be known. This indeterminacy is not due to technical limitations but is an inherent feature of reality at that scale.
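
In standard notation (added here for reference), the principle bounds the product of the two uncertainties from below:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the standard deviations of position and momentum, and \(\hbar\) is the reduced Planck constant. No refinement of instruments can push the product below this bound.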

This raised profound philosophical questions: If the universe at its most fundamental level is indeterminate, what does that imply for causality, predictability, and free will?

In quantum physics, probabilities do not arise from lack of information, but from the intrinsic nature of quantum events. Thus, randomness is not only real, but essential to the very structure of nature. However, this randomness exists within constraints defined by probability distributions, highlighting the paradox that randomness itself has structure.

Mathematical Perspective on Randomness


In mathematics, randomness has been rigorously formalized through the theory of probability and stochastic processes. A stochastic process is a collection of random variables indexed over time or space. Brownian motion, Markov chains, Poisson processes: all these are mathematical models of phenomena that evolve with a degree of unpredictability.
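
As a minimal illustrative sketch (added here, not from the original text), the simplest such process is the symmetric random walk, the discrete ancestor of Brownian motion: each step is individually unpredictable, yet the process as a whole has precisely characterized statistical behavior.

```python
import random

random.seed(0)

def random_walk(n_steps, step=1.0):
    """Symmetric random walk: each step is +step or -step with equal probability."""
    position = 0.0
    path = [position]
    for _ in range(n_steps):
        position += step if random.random() < 0.5 else -step
        path.append(position)
    return path

path = random_walk(1000)
print(f"final position after 1000 steps: {path[-1]:+.1f}")
```

No single step can be predicted, but the walk's spread after n steps grows like the square root of n, a lawful regularity built out of pure chance.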

Modern mathematics treats randomness as a concept that can be defined axiomatically. Kolmogorov's axioms laid the foundations of modern probability theory, treating probability as a measure on a sigma-algebra of events. However, even within this framework, the philosophical nature of randomness remains contested.
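
For reference, Kolmogorov's three axioms can be stated for a probability space \((\Omega, \mathcal{F}, P)\) as follows:

```latex
\begin{aligned}
&\text{(i)}\;\; P(A) \ge 0 \quad \text{for all } A \in \mathcal{F},\\
&\text{(ii)}\;\; P(\Omega) = 1,\\
&\text{(iii)}\;\; P\!\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{for pairwise disjoint } A_i \in \mathcal{F}.
\end{aligned}
```

Everything else in the theory is derived from these three requirements; the axioms say nothing about what probability *means*, which is why the interpretive debate survives the formalization.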

In algorithmic information theory, randomness is defined via incompressibility: a sequence is considered random if it cannot be compressed into a shorter algorithmic description. This brings randomness into contact with computation, complexity, and logic.
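
True algorithmic (Kolmogorov) complexity is uncomputable, but an ordinary compressor gives a rough practical proxy for the idea. The following hypothetical sketch (added for illustration) compresses a highly patterned byte string and an OS-supplied "random" one; the patterned data shrinks dramatically, while the random data barely shrinks at all.

```python
import os
import zlib

n = 10_000
patterned = b"ab" * (n // 2)   # highly regular, hence very compressible
random_bytes = os.urandom(n)   # no exploitable pattern for the compressor

for name, data in [("patterned", patterned), ("random", random_bytes)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name:>9}: compressed to {ratio:.2%} of original size")
```

In the incompressibility view, the patterned sequence is emphatically non-random because a short description ("repeat 'ab' 5000 times") reproduces it, whereas the random bytes admit no description much shorter than themselves.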

Thus, randomness, depending on context, can be:
- the manifestation of physical indeterminacy (as in quantum mechanics),
- an epistemological limitation (as in classical systems),
- or a formal property of sequences (as in algorithmic theory).

What unifies all these interpretations is the idea that randomness challenges our ability to make absolute predictions, and therefore lies at the heart of both scientific humility and intellectual curiosity.

Mathematical Perspective

The concept of probability is closely related to some of the most fundamental problems of knowledge. First and foremost, probability is reflected in the laws of nature. According to classical probability theory, a numerical probability value is defined as the ratio of the number of "favorable outcomes" to the number of "all possible outcomes." This definition has led to different interpretations, often categorized as subjective and objective.
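
The classical ratio definition can be computed directly by enumeration. As a hypothetical worked example (not in the original text), consider the probability that two fair dice sum to 7:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes for two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# "Favorable outcomes": those summing to 7.
favorable = [o for o in outcomes if sum(o) == 7]

# Classical definition: favorable / all possible.
p = Fraction(len(favorable), len(outcomes))
print(p)  # 1/6
```

The definition is purely combinatorial; the interpretive question (what the number 1/6 *says* about the world) is exactly where the subjective and objective readings diverge.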

For example, the proposition “the probability of rolling a 6 with a fair die is 1/6” is a numerical probability statement. The issue arises in how such a numerical probability proposition is to be interpreted. Popper argued that the subjective interpretation contains psychological elements, treating the degree of probability as a measure of confidence or uncertainty. Reichenbach, speaking from a rationalist point of view, claimed that probability, in the absence of causal knowledge, is a product of reason.

Popper, in his objective interpretation, considered every numerical probability statement as a claim about the relative frequency of certain events within a sequence of occurrences. For example, the proposition “the probability of rolling a 1 on the next die throw is 1/6” is not a claim about a single future event, but rather about the statistical properties of the entire set of throws; the individual throw is simply a member of that collective. This proposition merely asserts that, within that set, the relative frequency of “rolling a 1” is 1/6. According to this interpretation, numerical probability statements are only meaningful when they can be interpreted in terms of frequency.

A frequency-based theory of probability was proposed by Richard von Mises, whose work was later described by Reichenbach as the “empirical philosophy of probability.” Reichenbach argued that rationalist interpretations of probability should not have a place in scientific philosophy. According to von Mises, probability theory is a theory of sequences of events that contain a specific element of randomness. These sequences are governed by two axiomatic conditions: the limit-value axiom and the randomness axiom. A sequence of events that satisfies both conditions is referred to by von Mises as a collective—an idealized sequence of trials imagined to repeat infinitely. For example, the series of throws of an unworn die constitutes a collective.

Each such event is defined by a specific property—e.g., "rolling a five." For each event in the sequence, a new sequence of relative frequencies can be defined. As the sequence grows, the relative frequency of the property in question should converge to a specific limit value, according to the limit-value axiom. According to von Mises, the term “probability” refers to this limit of relative frequency within a collective. In his view, the sole purpose of probability theory is to deduce unknown probabilities from known ones.
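
The limit-value axiom can be illustrated with a simulated collective (a hypothetical sketch, added here): as the sequence of die throws grows, the relative frequency of the property "rolling a five" settles toward 1/6, which on von Mises' reading *is* the probability.

```python
import random

random.seed(7)

def relative_frequency(n_throws, face=5):
    """Relative frequency of a given face in n simulated fair-die throws."""
    hits = sum(1 for _ in range(n_throws) if random.randint(1, 6) == face)
    return hits / n_throws

# The estimate wanders for short sequences and stabilizes for long ones.
for n in (100, 10_000, 1_000_000):
    print(f"n={n:>9}: relative frequency of a five = {relative_frequency(n):.4f}")
```

A finite simulation can only suggest, never prove, convergence; the collective is an idealization of an infinitely repeated sequence of trials.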

Reichenbach pointed out a limitation of the frequency interpretation: it faces difficulties when applied to single events. It makes little sense to speak of the frequency of a one-time event.

The modern definition of probability, given by Andrey Kolmogorov using the concepts of measure theory, brought an axiomatic structure to the field. This formalization includes the classical definition as a special case. Probability theory, based on Kolmogorov’s axioms and von Mises’ frequency theory, is now used to model stochastic processes—processes that contain inherent randomness. Von Mises’ theory can be more fully understood with reference to the Law of Large Numbers, introduced by Bernoulli. This law bridges the gap between theoretical and empirical probability. Through this principle, probability theory engages with experimental science and allows theoretical results to be applied across various empirical fields—expressing the behavior of natural phenomena that cannot be captured by strict deterministic laws, but can still be modeled mathematically.
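
The weak form of Bernoulli's law, stated here for reference in modern notation, makes the bridge precise: for independent, identically distributed random variables \(X_1, X_2, \dots\) with mean \(\mu\),

```latex
\lim_{n \to \infty} P\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| \ge \varepsilon \right) = 0
\quad \text{for every } \varepsilon > 0 .
```

That is, the empirical average is overwhelmingly likely to lie close to the theoretical mean once the number of trials is large, which is what licenses the use of observed frequencies as estimates of probabilities.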

Theoretical physicist Heinz Pagels, in his attempt to answer the question “What is randomness?”, emphasized the importance of distinguishing between mathematical and physical randomness. He wrote:

“The mathematical problem is a logical one—defining what it means for numbers or functions to be randomly ordered. The physical randomness problem is determining whether actual physical events conform to the mathematical criteria of randomness. Until we have a mathematical definition of randomness, we cannot determine whether a sequence of natural events is truly random. Once we do have such a definition, we are then faced with an additional empirical problem—whether real events conform to this definition. Here we encounter the first problem: mathematicians have never succeeded in giving a precise definition of randomness, or even of the related concept of probability…”

Conclusion

Randomness is not the absence of order but its complement. It reflects the limitations of knowledge, the complexity of systems, and the structure of laws that tolerate variance. By embracing randomness as a central aspect of nature and thought, we move beyond simplistic dichotomies of determinism versus chaos. Instead, we recognize randomness as essential to scientific progress, modeling, and the very logic of inquiry.

 

References

Vasilyev, M., & Stanyukovich, K. (1989). *Matter and Man*. Onur Publishing.

Planck, M. (1987). *Modern Understanding of Nature and Introduction to Quantum Theory*. Alan Publishing.

Hançerlioğlu, O. (1982). *Dictionary of Philosophy*. Remzi Publishing.

Mengüşoğlu, T. (1988). *Introduction to Philosophy*. Remzi Publishing.

Poincaré, H. (1986). *Science and Method*. Milli Eğitim Press.

Noel, E. (1999). *Chance in Contemporary Scientific Perspectives*. Pencere Publishing.

Reichenbach, H. (1993). *The Rise of Scientific Philosophy*. Remzi Publishing.

Hızır, N. (1981). *Philosophical Writings*. Çağdaş Publishing.

Pagels, H. R. (1992). *The Cosmic Code: The Language of Nature / Quantum Physics*. Sarmal Publishing.

Ruelle, D. (1994). *Chance and Chaos*. TÜBİTAK Publishing.

Prigogine, I., & Stengers, I. (1996). *From Chaos to Order*. İz Publishing.

Popper, K. R. (1998). *The Logic of Scientific Discovery*. Yapı Kredi Publishing.

  
23 kez okundu

Yorumlar

Henüz yorum yapılmamış. İlk yorumu yapmak için tıklayın