In models of cognition and biological systems, Shannon information is widely used to quantify signal transmission, entropy, and uncertainty. However, emerging frameworks such as Integrated Information Theory (IIT), Functional Systems Theory, and Dynamic Organicity Theory argue for the need to account for intrinsic information, which reflects system-internal structuring rather than external, observer-based encoding.
In the context of the Oscillatory Dynamics Transductive-Bridging Theorem (ODTBT), intrinsic information is structured through recursive oscillatory interactions and is understood as an emergent property of self-referential holons and transductive phase transitions (e.g., at TWIST thresholds). On this view, intrinsic information is not about signal transmission, but about how functional redundancy is restructured and how coherence arises within the system.
My question is: How is intrinsic information formally defined or modeled in contrast to Shannon information in biological or cognitive systems?
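To make the contrast concrete, here is a rough Python sketch of the distinction as I understand it. Shannon entropy is purely distributional (observer-side), whereas "effective information" (a precursor measure in the IIT lineage, from Tononi and Sporns) is defined over the system's own causal transition structure under a maximum-entropy perturbation. The toy transition matrices are my own illustrative examples, not from any of the theories above:

```python
import numpy as np

def shannon_entropy(p):
    """Observer-side Shannon entropy H(p) in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def effective_information(tpm):
    """EI: mutual information I(X_t; X_{t+1}) when X_t is forced to the
    uniform (maximum-entropy) distribution -- a simple cause-effect measure
    from the IIT literature, defined over the system's own transition
    probability matrix rather than over observed signals."""
    n = tpm.shape[0]
    p_in = np.full(n, 1.0 / n)                 # do(X_t ~ uniform)
    p_out = p_in @ tpm                         # resulting effect distribution
    h_out = shannon_entropy(p_out)             # H(X_{t+1})
    h_cond = np.sum(p_in * np.array(           # H(X_{t+1} | X_t)
        [shannon_entropy(row) for row in tpm]))
    return h_out - h_cond                      # I(X_t; X_{t+1})

# Toy 2-bit system (4 states): deterministic cycle vs. fully noisy dynamics.
det_tpm = np.eye(4)[[1, 2, 3, 0]]              # each state maps to the next
noisy_tpm = np.full((4, 4), 0.25)              # output independent of input

print(effective_information(det_tpm))          # 2.0 bits: full causal structure
print(effective_information(noisy_tpm))        # 0.0 bits: no intrinsic structure
```

Note that both transition matrices have the same uniform output distribution, so a purely Shannon-style description of the outputs does not distinguish them; only the causal measure over the system's own mechanism does. This is the gap between extrinsic and intrinsic information that I am trying to pin down formally.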
I'm especially interested in computational methods, modeling formalisms, and biological case studies; any input on these would be much appreciated.