
Exploring the Intersection of Scientific Metaphysics and Information Theory


The concept of theory defeasibility—referring to the need for revising theories based on new evidence and hypotheses—has often been mischaracterized by pessimistic meta-inductionists and instrumentalists as a weakness in scientific methodology and epistemology. In reality, this process is a fundamental aspect of scientific advancement. It can be understood through the lenses of Robert Nola and Luciano Floridi, who advocate for optimistic meta-induction, suggesting that the overall trajectory of scientific theory and practice yields significant practical benefits, including publicly verifiable experiments and reproducible results (Floridi, 2008; Nola, 2008).

Nevertheless, it does not logically or intuitively follow from optimistic meta-induction and theory defeasibility that all metaphysical frameworks hold equal epistemic value or that they share a level playing field in terms of scientific explanation. Such a notion is evidently flawed and misguided.

It is reasonable to assert that the validity of optimistic meta-induction—which arguably deserves the label of "fact" due to the consistent success of scientific endeavors—implies that our most robust philosophies of science, as well as our scientific metaphysics, should also be subject to revision based on methodological principles. Scientific metaphysics ought to evolve in response to the changing ontological commitments of our most reliable physical sciences, which delineate what is understood to exist.

Indeed, this evolution primarily pertains to physics, rather than other disciplines. For instance, positing that the universe might be a vast conscious mind, starting from psychology or neuroscience, does not constitute scientific inquiry nor does it align with metaphysics endorsed by the sciences.

Such approaches may represent a form of quasi-scientific metaphysics, which can be enjoyable but lacks the rigor of scientific or naturalistic metaphysics. Definitions in this area are somewhat fluid, but Steven French aptly suggests that any proposed ontology should first and foremost draw from established concepts within hard science and physics, and not much else. Paul Humphreys has arguably done the most to clarify the bounds of that "not much else."

That said, placing trust in physicists regarding metaphysical questions can lead us to theories like Wheeler's participatory cosmology or Barrow and Tipler’s anthropic principle, which, while entertaining, may not always provide substantive insight.

Non-Apriorist Metaphysics and the Event Horizon

Expressivists argue that metaphysics, scientific metaphysics included, is an essentially futile pursuit: in their view there is no merit in engaging with it at all, and the attempt is a total loss.

I concur with their critique of many metaphysical claims that assert scientific validity, especially those that disregard empiricism or physicalism. Yet some expressivist philosophers go further and argue against a realist interpretation of the observables and unobservables of quantum theory (Egg, 2019; Healey, 2013, 2018; Healey & Fine, 1990; Maudlin, 1998).

This perspective is particularly curious given that quantum theory remains one of the most predictive frameworks in existence. Might it not be accurately capturing some aspects of reality?

Take the neutrino, for instance. (Pauli postulated it to preserve energy conservation in beta decay, yet doubted for decades that it could ever be detected; experiment ultimately confirmed its existence.)

In many cases, I suspect that forms of expressivism reveal their own contradictions, or reduce themselves to near absurdity on their own tenets, rather than demonstrating that scientific realism about theoretical referents and unobservables is misguided.

At times, it seems philosophers might be exhausting their legitimate avenues of inquiry.

While logical positivists like A.J. Ayer and Rudolf Carnap, along with many expressivists, have rightly dismissed a-priorist metaphysics as largely irrelevant to scientific explanation, rejecting naturalistic metaphysics outright is likely a mistake. This is especially true when considering the ontological practices of physicists themselves: metaphysical speculation is a regular and significant part of the process of scientific discovery (Andersen & Becker Arenhart, 2016; Anderson et al., 1997; Arns, 2001; Beebee & Sabbarton-Leary, 2010; Chalmers et al., 2009; Crupi, 2016; Esfeld, 2010; Maudlin, 2007; Slater, 2009).

The distinction between the metaphysics of physicists and that of other scientists lies in the fact that physicists' metaphysical frameworks are usually well informed by existing scientific knowledge and evidence. A-priorist components, when present, are typically grounded in substantial, experimentally verified evidence, constrained by it, and held tentatively pending further verification through scientific methodologies (as illustrated by Wolfgang Pauli's approach to the neutrino).

This interplay contributes to the development of new hypotheses that can be tested through a hypothetico-deductive framework, a methodology advocated by Karl Popper that has garnered substantial support in various scientific fields, particularly contemporary psychology and the molecular biosciences (Barbieri, 2002; Liang et al., 2019). This form of scientific metaphysics, emerging from the practice of science, suggests that thoroughgoing expressivism about the referents of scientific theories is misguided. Ockham's Razor, or the principle of ontic parsimony, supports this assertion. Dismissing realism about any scientific theory's referents because, following A.J. Ayer, one rejects anything that is neither analytic nor empirically verifiable is a fundamentally different stance (Hanfling, 1986; Steinmann & Ayer, 1973).

Ontic Parsimony Revisited

Ontic parsimony advocates for excluding any elements in a theory's ontology that are unnecessary for explaining or "saving" phenomena (Long, 2019). "Saving the phenomena" is a traditional philosophical expression referring to the explanation of natural systems, entities, events, or processes.

However, it does not imply that the theory must be simple; it may be highly complex. Additionally, a theory with an extensive ontology may not necessarily be complex in its underlying premises and assumptions.

Utilizing our most complex scientific theory—quantum field theory (QFT)—often necessitates a temporary expansion of ontology through specific mathematical abstractions (I will set aside string theory for now due to its speculative nature). In practice, many variables associated with what physicists term mathematical singularities and infinities are temporarily introduced into QFT equations, but these are ultimately eliminated during the process of renormalization (Crowther, 2015).
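
To make the renormalization point concrete, the following is a schematic sketch of the kind of divergent quantity at issue, using a generic scalar-field one-loop correction as an illustration (the specific theory and coefficients are my assumptions, not drawn from this article):

```latex
% One-loop mass correction in a generic scalar theory (Euclidean, schematic):
% the momentum integral diverges if extended over the full continuum.
\[
  \Delta m^2 \;\propto\; \int^{\Lambda}\!\frac{d^4k}{(2\pi)^4}\,
  \frac{1}{k^2 + m^2} \;\sim\; \Lambda^2 .
\]
% Regularization caps the integral at a cutoff \Lambda; renormalization then
% absorbs the \Lambda-dependence into the bare parameters, so that measured
% quantities come out finite and cutoff-independent:
\[
  m_{\mathrm{phys}}^2 \;=\; m_0^2(\Lambda) + \Delta m^2(\Lambda),
  \qquad
  \frac{\partial\, m_{\mathrm{phys}}^2}{\partial \Lambda} \;=\; 0 .
\]
```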

In essence, even our most sophisticated scientific theories appear to reinforce the principle of ontic parsimony by suggesting that maintaining even mathematically vital ontological excess is not an effective strategy for discerning what truly exists in the world.

Claiming that such variables correspond to real entities within physical phenomena is misguided:

> [F]ield theories don’t necessarily make sense to arbitrarily small length scales. If we try to make calculations in the continuum we simply get garbage. This “garbage” is the famous infinities that surround quantum field theory. However, if we introduce some fundamental scale in the game to define our theory, some minimal length beyond which our description breaks down, suddenly everything makes sense again. And using this framework we have been able to recover all those weird experimental facts to an astounding precision. And I do mean astounding. Renormalization has the best prediction in the history of science. The g-factor of the electron is the quantity that has the best agreement between data and theoretical prediction. Period. No caveats (Melo, 2019, 4).
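
For context on the quoted g-factor claim: the leading QED correction to the electron's magnetic moment, first computed by Schwinger, is the textbook instance of such a renormalized prediction (the equation below is the standard result, added here for illustration):

```latex
% Electron anomalous magnetic moment, leading QED (Schwinger) term:
\[
  a_e \;=\; \frac{g - 2}{2} \;=\; \frac{\alpha}{2\pi} + O(\alpha^2)
  \;\approx\; 0.00116 ,
\]
% giving g \approx 2.00232 at this order; the higher-order renormalized
% series agrees with experiment to roughly one part in 10^{12}.
```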

There is an ongoing debate among physicists and philosophers of science regarding whether the infinities and singularities encountered in QFT prior to renormalization serve as mere information "sinks"—offering no genuine insight into the physical phenomena—or if they hold valuable information regarding the natural world.

From the perspective of information theory and its interpretations, the successful application of renormalization in QFT suggests that the procedure conveys information, and probably a great deal of it. Much of this information, however, pertains to the model itself, its representations, and the underlying mechanics, and the nature of any propositional information encoded within it is not immediately clear.

Expressivists often contend that such processes do not convey information. I suspect this may stem from a misunderstanding of the essence of information itself, though this concept remains somewhat open to interpretation. Shannon and Kolmogorov present the most robust applied scientific definitions of information, both of which are grounded in materialist, structural realist principles with significant statistical applications. Nonetheless, Shannon's focus on discrete models, such as those related to human communication, still recognizes any continuous, physical, stochastic process as an information source, using discrete approximations primarily for mathematical convenience (a strategy that has proven valuable for digital networks).
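
As a concrete anchor for the Shannon side of this claim, here is a minimal Python sketch (my own illustration, not the author's) of treating a continuous, stochastic signal as an information source by discretizing it and estimating its Shannon entropy:

```python
import math
import random
from collections import Counter

def shannon_entropy(samples, num_bins=16):
    """Estimate the Shannon entropy (in bits per sample) of a
    continuous-valued signal by discretizing it into equal-width bins,
    mirroring Shannon's use of discrete approximations to continuous sources."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / num_bins or 1.0  # guard against a constant signal
    counts = Counter(min(int((x - lo) / width), num_bins - 1) for x in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A toy "continuous, physical, stochastic process": a noisy oscillation.
signal = [math.sin(t / 10.0) + random.gauss(0.0, 0.1) for t in range(1000)]
print(f"Estimated entropy: {shannon_entropy(signal):.3f} bits per sample")
```

Nothing in this construction requires the signal to be human communication; any measurable stochastic process yields a source distribution in the same way.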

It seems expressivists might assume that all information is propositional. However, many natural phenomena, such as celestial X-ray emissions and DNA, possess intrinsic information independent of any cognitive interpretation. Information exists within these entities; it is not a construct we impose. We must avoid biases against non-conscious entities regarding their informational content.

Returning to the topic of renormalization in QFT, it may also reveal insights into the nature of the applied mathematics involved. Hence, within the realms of scientific inquiry, understanding the types of information sources that the "blowup" abstracta are derived from is crucial.

Complex discussions regarding the nature of mathematical abstractions arise here, often framed by the Quine-Putnam indispensability argument on one side and Hartry Field's program for mathematics without abstracta on the other, a program critics of mathematical nominalism deem the "hard road," often implying it is an impossible endeavor (Bueno, 2012; Busch & Morrison, 2016; Colyvan, 2010, 2012; Jandrić, 2020; Yablo, 2012). For the sake of this discussion, let us assume that no one takes causally efficacious Platonic entities to exist within physical fields or the phenomena modeled by QFT (though some philosophers of mathematics maintain such views). In simpler terms, most physicists would not claim that a mathematical abstractum can exert physical influence on any material system, practically or theoretically.

I am confident that even string theorist Ed Witten would not assert such a position, and I remain hopeful for string field theory to gain empirical validation.

It raises an intriguing question whether the informational "blowup" of mathematical abstractions in QFT serves as an indicator of the underlying physical phenomena or quantum systems. Nonetheless, prevailing scientific opinion tends to regard these mathematical singularities and infinities as largely devoid of informational value concerning the actual material phenomena of quantum fields.

Most physicists do not maintain that the mathematical entities involved in these "blowups" exist within the modeled physical phenomena as sources of Shannonian information. This view is further complicated by overarching concerns about the veracity of realism regarding quantum fields, which I will avoid for brevity.

This situation is intricately tied to the complexities associated with infinities and the relationships between various types of mathematical and physical infinities. The "blowups" and singularities might indeed convey information, but I hypothesize that this information is more about the model than it is about the actual phenomenon being modeled. If the information encoded by these mathematical anomalies pertains to the physical systems in question, it is likely to be indirectly indicative of something external to the phenomenon rather than a direct representation of what exists within the modeled physical system.

Science is abundant in useful explanatory fictions and abstract mathematical constructs that do not necessitate additions to our material understanding of the universe. Concepts such as frictionless planes, centers of gravity, and absolute vacuums enjoy respectable explanatory status. However, these constructs are often merely theoretical devices employed to elucidate what occurs within specific physical systems or phenomena.

The mathematical abstractions removed during QFT renormalization likely do not possess the same ontological status as centers of gravity. The extent to which they resemble such constructs or exist as mere fictions is difficult to determine. Nevertheless, physicists must eliminate them through renormalization to achieve the most accurate predictive results in the history of science, which may provide significant support for the validity of ontic parsimony. It might also affirm the practical necessity of mathematical abstractions in scientific theories, although this does not inherently imply a commitment to realism regarding such entities (Azzouni, 2015; Baker, 2005; Bangu, 2013; Hjortland, 2019; Miller, 2016; Pincock, 2012).

Given the conceptual challenges posed by infinities, we should approach the idea through the lens of the continuum. The mathematical continuum guarantees that between any two distinct real numbers on the number line, rational or irrational, there lies an infinite array of further values, and this selection process can be continued indefinitely.
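
Formally, the point is just the density of the rationals (and irrationals) in the real line, a standard fact:

```latex
% Density of the rationals in the reals:
\[
  \forall a, b \in \mathbb{R},\; a < b \;\Rightarrow\;
  \exists q \in \mathbb{Q} : a < q < b .
\]
% Re-applying the fact to (a, q) and (q, b) shows every interval contains
% infinitely many such values; the selection process never terminates.
```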

However, most applied scientists would not assert that the mathematical continuum can be realized within any physical system. Yet, ontic structural realists posit that the universe may indeed exhibit a "turtles all the way down" structure, suggesting that as physicists reach Planck-scale structures, these entities will continue to reduce to more fundamental physical structures ad infinitum.

For example, while a neural synapse can produce a range of electrical potentials, it cannot realize every potential value present within the mathematical continuum between two extremes. The inherent discretizing limits of the physical and electrochemical systems will constrain this ability.
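
A minimal sketch of that discretizing limit, using invented numbers for the potential range and resolution (the figures are illustrative assumptions, not physiological data):

```python
import math

# Hypothetical synaptic potential range and resolution (illustrative values).
V_MIN_MV, V_MAX_MV = -90.0, 40.0  # assumed extreme potentials, in millivolts
RESOLUTION_MV = 0.01              # assumed smallest distinguishable step

# A physical system supports only finitely many distinguishable states
# between its extremes, unlike the mathematical continuum.
num_states = int((V_MAX_MV - V_MIN_MV) / RESOLUTION_MV) + 1
max_info_bits = math.log2(num_states)

print(f"Distinguishable states: {num_states}")                      # 13001
print(f"Upper bound: {max_info_bits:.1f} bits per instantaneous reading")
```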

The Nature of Information and Structure

It does not follow, however, that the intrinsic information contained within any system is finite. This question is complex and hinges on the overall structure of the physical system, particularly on whether that structure has any foundational or reductive limits (the "turtles" question again) (Psillos, 2006; Saunders, 2003). The information carried by sources and states depends upon their structure. The measure may be statistical, but what is being measured should not be merely statistical. Some theorists, like Ladyman and Ross, argue that statistics are themselves physical, which presents an intriguing perspective.

The structural component is non-negotiable. In other words, information must be rooted in physical structure or derived from structure that supervenes upon or reduces to physical structure. Without structure, there is no information.

Can anyone conceive of something that exists devoid of structure? It is a challenging, if somewhat tautological, inquiry.

This may seem trivial until one considers existential questions surrounding information and the conditions required for both structure and information to exist. Is there any genuine information associated with abstract mathematical entities (if they exist)? How could this information be transmitted or generated? If this question is misguided, is it because the nature of information is inherently subjective, necessitating a consumer or recipient? (Scarantino, 2015; Stegmann, 2015). Does the existence of information depend on logic, language, or vice versa? (Sagüillo, 2014; Sequoiah-Grayson, 2008).

Stephen Hawking, a personal hero of mine, famously claimed that the information contained within a quantum system is finite. However, this assertion is made within a specific context and seems to rely on theoretical limits related to scale variance, likely confined to certain measurement scales pertinent to the modeling in question (Jentschura & Nándori, 2014; Lyre, 2012; Nozick, 2001; Van Leeuwen, 2014).
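
One standard way such finiteness claims are made precise, and plausibly the sort of limit at issue here (my gloss, not a claim about which result Hawking had in mind), is the Bekenstein bound on the information content of a bounded physical system:

```latex
% Bekenstein bound: a system of total energy E enclosed within a sphere
% of radius R can contain at most
\[
  I \;\le\; \frac{2\pi R E}{\hbar c \ln 2} \quad \text{bits},
\]
% so any finite-energy, finite-sized quantum system carries finite information.
```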

Structure serves as both a necessary and sufficient condition for information. Even if an entity possesses only boundaries without internal structure, it could serve as a symbol to convey information when arranged with other structureless entities. However, such a structureless entity would lack intrinsic information.

I must mention that pluralism regarding the nature of information predominates within the philosophy of information, as there are good reasons for this variation. Caution is required in considering the levels of explanation and abstraction relevant to a given conception of information (Bauer et al., 2008).

The sciences exhibit a broad conceptual pluralism concerning the nature of information. The definition of information in psychology often diverges from that in physics, for example. In psychology, information is frequently framed within the context of information processing theories of cognition, suggesting that its existence is contingent upon cognitive and neurological processes (though the degree of this dependency varies among theorists). There is also a distinction between cognitive and non-cognitive types of information, a topic I will explore in greater depth in my forthcoming work. Notably, the classical Shannonian statistical understanding of information is widely utilized across scientific disciplines, and metaphysical conceptions of information tend to be more stable within specific fields, such as physics (despite notable variations even within physics).

I advocate for a scientific metaphysics concerning information, suggesting that the ontology of information should be shaped by scientific applications of the concept and references to information within scientific theories. This stance aligns with a commitment to positive scientism, echoing French's notion of Viking raiders.

Some philosophers of science argue that information is not a real entity in the ontology, proposing it merely as a nominalist placeholder for varying values and concepts. While I disagree with this position, it is not incoherent, particularly in biology and molecular biosciences. However, even theorists who once held this view now endorse various forms of teleo-functional realism regarding information, suggesting that it is realized by the properties of evolved systems.

I maintain a realist stance on information, favoring a physicalist-statistical perspective regarding its nature, a position that may be less popular among philosophers but is likely more palatable to scientists.

If I were to align with the information-eliminativists, I would have to retreat to a discussion of structure. Attempting to eliminate structure from any entity is a complex challenge. If one considers concepts like total entropy—going beyond mere noise and high entropy to contemplate what absolute nothingness might entail—then it follows that nothing would lack structure. Such a concept would not accommodate Luciano Floridi's non-uniformities.

Scientific Theories as Encoded Information

Scientific theories serve as encodings of information derived from various heterogeneous sources and channels. The channels through which this information flows are often noisy and possess limited bandwidth. The processes that convert information from these sources into representations within scientific models are inherently partial, constrained by physical, causal, cognitive, epistemic, and processing limitations.
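
Taking the channel talk literally for a moment: on Shannon's account, noise and bandwidth jointly cap how much information any channel can carry, which offers one way to make the claimed partiality precise (the theorem below is standard; its application to scientific theorizing is the author's metaphor):

```latex
% Shannon–Hartley capacity of a noisy channel with bandwidth B (in Hz)
% and signal-to-noise ratio S/N:
\[
  C \;=\; B \log_2\!\left(1 + \frac{S}{N}\right)
  \quad \text{bits per second}.
\]
% Finite bandwidth and nonzero noise make C finite: information about the
% source beyond that rate simply cannot get through the channel.
```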

As new information emerges—including evidence from repeated, publicly verifiable experiments—scientific theories are inclined to adapt, updating the information encoded in their representations and models. This natural evolution arises from hypothetico-deductive cycles of information encoding, coupled with the principles governing information encoding within scientific models, which include representational partiality, loss, code conventions, and recognized sets of possible source states.

The information encoding that occurs during scientific theorization is multifaceted and partly enacted through cognitive processes. In essence, the minds of scientists are intricately involved in the processing and encoding of new information that shapes revised and updated theories. This encompasses both low-level neural encoding processes frequently discussed in psychology and higher-level cognitive encoding of complex lexical and phonological information, shaped by advanced executive brain functions and metacognitive processes (Bang et al., 2018; Brown, 1987; Dennett, 2001; Derakshan & Eysenck, 2009; Shea & Frith, 2019; Taylor, 2013; Yeung & Summerfield, 2012). The encoding of defeasible theories thus occurs across various levels of abstraction and traverses internal cognitive and external non-cognitive boundaries, necessitating the consideration of cognitive and psychological noise. This underscores the importance of public reproducibility and replication within scientific inquiry.

Defeasibility, characterized by the updating and modification of theories in response to invalidated elements and the refreshing or replacement of encoded information, constitutes a fundamental and unavoidable aspect of scientific theory development, paralleling individual and collective epistemic evolution (Adami, 2012; Brigandt, 2012; Bueno, 1999; Bueno & Da Costa, 2007; French & French, 2014; Friston et al., 2016; Ladyman, 2011; Liu & Liu, 2011; Manero, 2019; Motoura, 2017; Van Benthem, 2007; van Eijck, 2014). The inherently lossy nature of information transmission, the limited bandwidth and noise within information channels, and the restrictions on encoding processes and code sets all contribute to the foundational basis for revising scientific theories and for defeasibility itself, which stems from the removal and substitution of misinformation or incomplete information with more accurate data.

Conclusion

There remains significant work for non-a-priorist scientific metaphysics to undertake both within the scientific realm and at its intersection with philosophy. Philosophers and philosophers of science have proven essential to the advancement of scientific methodology, as evidenced by Popper’s hypothetico-deductive framework.

In this sense, philosophy continues to function as the capable handmaiden of science, as scientists operate as naturalistic philosophers whenever they posit untested hypotheses or propositions with naturalistic implications and entailments (Thomasson, 2009).

If this assertion is too robust for the naturalist-scientist audience, it may be less contentious to state that scientists embody the role of natural philosophers when their hypotheses and propositions, while difficult to test, are reasonably grounded in evidence and the constraints of natural laws. These scientific metaphysical hypotheses motivate and are driven by the ongoing updates of information within theories, as well as the cognitive, instrument-based, and experimental methodologies that encode new and improved information into the models and representations of existing theories.
