Shannon’s information theory quantifies how much data a channel transmits and how accurately, but it says nothing about whether that data is relevant or valuable to the system receiving it. Complexity theorists Artemy Kolchinsky and David Wolpert extend the framework by defining semantic, or meaningful, information as the subset of the information a system has about its environment that is causally necessary for the system to maintain its existence against entropy. Operationally, this subset is identified by intervention: scramble the correlations between system and environment and measure how much the system’s expected viability drops; the portion of the mutual information whose loss degrades viability is the meaningful portion. Meaning thus emerges from system-environment mutual information when that information aids persistence, while attending to irrelevant information is energetically costly and existentially risky.

Because the definition is thermodynamic rather than biological, it applies not only to life but to physical systems generally, and it underlies Bobby Azarian’s view of cosmic evolution as a learning process driven by a thermodynamic imperative to accumulate adaptive knowledge. As entities complexify, they become better at processing meaningful information, which in turn supports greater viability, evolutionary success, and further complexity, suggesting that meaning itself evolves and complexifies over time.
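To make the intervention idea concrete, here is a minimal toy sketch in Python. It is not Kolchinsky and Wolpert’s actual formalism, and every name and parameter in it (the foraging agent, the energy values, the channel sizes) is an illustrative assumption: an agent survives by moving toward food each step, a sensory channel carries mutual information about where the food is, and scrambling that channel while preserving its marginal statistics is the counterfactual intervention that reveals how much of that information was meaningful.

```python
import math
import random
from collections import Counter

# Hypothetical toy parameters, chosen only to make the contrast visible.
N_SITES = 4        # possible food locations
STEPS = 50         # length of one agent lifetime
TRIALS = 2000      # Monte Carlo runs per condition
START_ENERGY = 10

def run_trial(scramble: bool) -> bool:
    """One agent lifetime; returns True if the agent survives.

    The agent reads a sensory channel that normally reports where food
    is. Scrambling the channel is the intervention: it severs the
    system-environment correlation while leaving the channel's marginal
    statistics (a uniform site index) untouched.
    """
    energy = START_ENERGY
    for _ in range(STEPS):
        food = random.randrange(N_SITES)
        signal = random.randrange(N_SITES) if scramble else food
        move = signal                          # policy: trust the sensor
        energy += 1 if move == food else -1    # eat, or pay a metabolic cost
        if energy <= 0:
            return False                       # viability lost
    return True

def survival_rate(scramble: bool) -> float:
    return sum(run_trial(scramble) for _ in range(TRIALS)) / TRIALS

def mutual_information(pairs) -> float:
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

if __name__ == "__main__":
    random.seed(0)
    foods = [random.randrange(N_SITES) for _ in range(5000)]
    sensor = foods[:]                                          # correlated channel
    static = [random.randrange(N_SITES) for _ in range(5000)]  # uncorrelated channel
    # Syntactic measure: the sensor carries ~2 bits about food, static ~0.
    print(f"I(sensor; food) = {mutual_information(list(zip(sensor, foods))):.2f} bits")
    print(f"I(static; food) = {mutual_information(list(zip(static, foods))):.2f} bits")
    # Semantic measure: only the sensor's bits matter for staying alive.
    print(f"survival, sensor intact:    {survival_rate(False):.2f}")
    print(f"survival, sensor scrambled: {survival_rate(True):.2f}")
```

The uncorrelated static channel supplies the contrast: it carries no information about the food, and even a channel that did carry information would be semantically empty in this scheme if the agent’s viability never depended on it, since scrambling it would leave the survival rate unchanged.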