I would like to answer this question with two different perspectives. My first perspective is to answer it simply by saying that information is what is known, and entropy is its opposite—what is not known. To explain it better, I like to use metaphors: coldness is not a thing on its own, but rather the absence of heat; similarly, darkness is the absence of light. In the same way, we can think of entropy as the absence of information. When something is full of entropy, it means that nothing is structured, ordered, or understood. It is randomness, uncertainty, and disorder. Information, then, is what reduces this randomness. It is like turning on a light in a dark room—you suddenly see shapes, objects, and structure. Information makes the unknown known.
My second perspective builds from the first. If information is the opposite of entropy, then maybe we can think of them as being on a balance. For every one bit of information gained, perhaps one bit of entropy is lost. Simply put, when we learn something new, we are reducing uncertainty in that specific area. But the more I think about it, the more I realize that this process is not always so clean or linear. Often, gaining one piece of information opens up more questions than it answers. For example, learning that "the sky is blue" gives us a known fact—but that one fact opens a whole series of unknowns. Why is it blue? What makes it appear blue to our eyes? What is color, really? How did the word "blue" come to be used? In other words, the acquisition of information might actually expand the horizon of entropy, rather than reduce it entirely.
This leads me to a third thought—perhaps information and entropy are not only opposites, but also intertwined in a continuous cycle. Every time we gain new information, we carve a path through the unknown, but that very path leads us to even deeper layers of what we don’t know. In this sense, entropy isn’t just something to be reduced, but something that grows and shifts as our understanding evolves. It challenges us to keep exploring, questioning, and discovering. So while information helps us navigate through entropy, it also shows us how much more there is to uncover. This dynamic makes the relationship between information and entropy not just one of opposition, but also one of mutual generation—each giving rise to the other in an ongoing loop of curiosity and discovery.
Information and entropy are undoubtedly intertwined—two opposing sides of the same coin. They exist in tension yet remain deeply connected within a system: information has the potential to reduce entropy, while entropy, in turn, can give rise to new information. This framing reflects a linear perspective, shaped by the inherent duality of these concepts. In logical terms, it resembles a disjunction: “A affects B or B affects A,” suggesting that at least one direction of influence exists, though not necessarily both. But is this truly the nature of their relationship? Or is there a more complex, dynamic interplay at work?
This question has become especially relevant in my PhD journey, where I find myself constantly embracing the dynamic nature of abstraction—an evolving process that reshapes how I understand and engage with knowledge. This reflection brings to mind a quote I often share with my students from Alvin Toffler: “The illiterate of the future are not those who can't read or write, but those who cannot learn, unlearn, and relearn.” This powerful statement captures the ever-changing essence of education—an adaptability that extends far beyond the classroom and into the core of our social system. I’ve come to realize that knowledge isn't solely constructed through linear, causal relationships as often emphasized in quantitative research. There is also profound value in recognizing contextual and reciprocal relationships—the kind that qualitative research, such as what we explored in COMM 391, seeks to illuminate. These perspectives highlight that meaning is not only measured, but also interpreted, experienced, and situated within a dynamic social reality.

Building on this understanding, it becomes more compelling to view entropy and information as engaged in a symbiotic or reciprocal relationship, rather than a strictly linear one. A linear view tends to oversimplify the complexity of systems in which multiple forces continuously interact. By contrast, a reciprocal model aligns with the concept of logical conjunction, where both relationships occur simultaneously—A influences B, and B influences A. This suggests a mutual interdependence, where each element shapes and reinforces the other within a dynamic system. It points to an ongoing cycle of feedback, adaptation, and contextual influence—unlike a unidirectional model, where influence flows in only one direction.
In conclusion, interpreting the relationship between information and entropy as purely linear limits our understanding of their true complexity. It ignores the possibility that order (information) and disorder (entropy) can coexist, interact, and even give rise to new meanings. Embracing a more dynamic, reciprocal perspective reminds us to approach uncertainty with openness—to feedback, transformation, and continual growth—echoing Toffler’s powerful reminder that to thrive in complexity, we must learn, unlearn, and relearn.
In the context of sustainable development and information science, understanding the relationship between information and entropy reveals the invisible forces shaping our lives and systems. From the readings in our main reference book “Sustainable Development as a Continuous Information and Communication Engagement: Towards a Theory of Societal Entropy,” reinforced through our online class discussions, I have learned to see information as more than just data; it is a stabilizing force, a driver of clarity and progress. In contrast, entropy represents the disorder, uncertainty, and breakdown of systems. In today’s world, in which information flows rapidly across borders and platforms, maintaining order through meaningful, truthful, and timely information is critical. This deeper understanding resonates with my profession as an accountant, where clarity and accuracy in financial reporting are essential to prevent disarray and mistrust.
I relate this concept to my experience in both accounting and human resource administration. In accounting, validated information is the bedrock of financial integrity; when records are accurate and complete, they provide insight, support decision-making, and help ensure compliance. Any misinformation or error, even if unintentional, can skew the analysis and mislead stakeholders, thereby increasing entropy within the system. In HR and administration, information takes another form: employee records, performance metrics, and organizational policies. These factors influence not only operational efficiency, but also the morale and productivity of human capital. If this information lacks transparency or is poorly managed, it creates confusion and distrust, again signifying increasing entropy. That is why I strongly believe in the saying, “In the currency of knowledge, information is wealth, and entropy is the debt of uncertainty.” It perfectly captures the value of reliable information in maintaining order and guiding sound decisions, especially in my field where accuracy and clarity are vital. This reflects the constant balancing act that we navigate across sectors, where information must outweigh entropy to sustain growth and credibility.
In summary, information and entropy are two sides of a dynamic relationship, one that we continuously manage in both our professional and personal lives. Information brings order, structure, and insight, whereas entropy represents the risk of collapse through misinformation, confusion, or neglect. I deeply value the insights shared by Prof. Flor and the meaningful discussions we’ve had in class, which have greatly enriched my understanding of the role of information in shaping systems and society. Our goal as responsible communicators and professionals is to use information to reduce entropy, ensuring that our systems, whether in business, education, or society, remain resilient, transparent, and functional.
Entropy can be defined as a state of disorder, randomness, or uncertainty. Examples include boiling water, hot objects cooling down, ice melting, and even salt or sugar dissolving. In ICT, randomness collected by an application can be used in encryption, for example, or in anything else that requires random data. As for information, the more possible outcomes a specific event has, the more uncertainty there is, since there are many things to consider before the event becomes certain or deterministic.
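To make the ICT point concrete, here is a minimal Python sketch (my own illustration, not taken from any reading) that estimates the Shannon entropy of a byte string. Random bytes of the kind an application might collect for encryption come out close to the 8-bits-per-byte maximum, while a repetitive pattern scores far lower:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

random_bytes = os.urandom(1024)   # unpredictable bytes, e.g. gathered for a key
repetitive   = b"AAAB" * 256      # highly predictable pattern

print(f"random bytes: {shannon_entropy(random_bytes):.2f} bits/byte")  # close to 8.0
print(f"repetitive  : {shannon_entropy(repetitive):.2f} bits/byte")    # about 0.81
```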
The movie titled “Midway” was released in 2019 and depicted major events that transpired in the Pacific theater of the Second World War, specifically the famed Battle of Midway. A particular aspect of the movie dealt with the key role that communication played in how the Allied forces were able to accurately predict the ship movements of the Japanese Imperial Navy in the Pacific Ocean. It tackled how a group of soldiers called “codebreakers” were able to decipher coded messages that to others would seem gibberish but in fact carried secret messages, including detailed ship movements.
It was shown in the movie how both the Japanese and Allied forces used communication and manipulated information to meet their goals: the Japanese used coded information to “hide” the true intent of their battle plans, while the Allied forces used information to decipher the enemy’s messages and, in turn, used communication to mislead the enemy about their own true intentions. It also showed that random words or sentences with practically no meaning have high entropy, since they are less predictable, more random, and carry raw information, compared to usual words, which have low entropy because they are redundant or formulaic. Thus, it showed how information and entropy are related, as the two forces used information to achieve their own objectives and goals.
It cannot be denied that information is a powerful tool in communications, and its power and effect are determined by how it is used, depending on the level of entropy. As stated in the example above, using high-entropy words and sentences can be beneficial, but only to a certain extent: in the same example, the Allied forces also used high-entropy words to confuse the enemy, without letting the enemy know that its coded, high-entropy messages had already been broken. In short, the use of high- and low-entropy words works depending on how these words are used, whether basic or complex, to achieve a goal or objective.
Shannon’s entropy refers to a measure of uncertainty, and it is said that the higher the entropy, the higher the disorder in the system. In the context of communication, entropy is something that negatively affects interactions and communication systems—a problem, a noise. Reflecting on associated concepts, controlling entropy depends on the amount of information available and accessible and on how information is transmitted from one place to another.
I would like to cite instances where I think information and entropy are related:
We may not think about entropy all the time, but entropy happens in our everyday lives. This made me realize that the reason I was not able to get a response from my friend is that something might have hindered the message I sent, which could pertain to problems in the communication system: there might be no signal, a failed mobile network delivery, or my friend might not have cellphone load to reply. How will I know my friend’s response if she cannot access social messaging apps due to a limited data network? How will I know the reason if my friend is in a distant location? Thus, this leads to the agony of waiting for someone’s reply. Is it really about failed communication network signals; did I say something that made her mad; or is there a misinterpretation on her end and she just did not want to reply at all? This situation demonstrates a disconnect, and to address it, I have to reach out to her in the most effective way possible to understand where the problem lies.
On the other hand, we become aware of what’s happening in our environment through media and other digital platforms. Currently, we hear news about a series of volcanic eruptions and earthquakes in Luzon, Visayas, and Mindanao, and this warns us that we have to prepare, as “The Big One” may strike our country anytime. We may not know exactly when it will happen, but because we have the information–the predictions made by responsible agencies, disseminated by news outlets and social media–we can prepare for it: educational institutions and public and private organizations can conduct earthquake drills so that people may act accordingly when the disaster occurs. Although we experience uncertainty in this scenario, information helps us know what to do once we encounter the situation. In other words, information guides people on how to act and respond, and this reduces uncertainty.
When it comes to my workplace, I think the relationship between information and entropy is evident when farmers are reluctant to adopt a technology developed by the Institute of Plant Breeding (IPB): the Site-Specific Nutrient Management-Nutrient Expert (SSNM-NE) for cassava. This technology, developed with the support of the Department of Agriculture, is a science-based software and decision-support tool that provides cost-effective fertilizer recommendations and crop management strategies suited to farmers’ locations, with the primary aim of enhancing their yield. Many farmers are already utilizing and adopting the technology, but there are also farmers, especially the older ones, who are hesitant to use it even though it can be accessed via desktop computers, tablets, and mobile phones. Because these farmers have uncertainties about using the technology, a gap is created, which also affects our institute in the achievement of its goal: the full adoption of the technology by farmers in target regions. In this case, entropy could pertain to farmers’ reluctance, and it is important for us to identify their reasons and the factors behind not supporting and engaging in this initiative (e.g., digital divide, preferences, varying perspectives, etc.).
To address entropy, the research team acknowledges the importance of intensifying its efforts to communicate the significance of the technology and what farmers can gain from it through capacity-building programs (trainings, technical advisory, and demonstrations/tutorials). Since the project involves different regions in the Philippines, it is important to tailor the software to farmers’ dialects so that they can better relate to and understand the information that generates knowledge toward good farming practices. How much information is needed to reduce farmers’ reluctance toward the technology and how frequently dialogue is held are among the considerations.
Entropy could be addressed and managed through the amount of positive, clear, and motivating information. Therefore, information and entropy are related. What counters entropy is information.
As Dr. Alexander Flor explained in class, information is the antidote to entropy.
Nerdcore rapper MC Hawking’s song “Entropy” explains the concept really well:
“Let's just say that it's a measure of disorder
In a system that is closed, like with a border
It's sorta like a well a measurement of randomness”
In a closed system, there is no choice but to break down. This is entropy. Claude Shannon showed that this principle of physics also applies to information and communication.
In communication, information is what keeps everything from breaking down and descending into chaos. The negotiation that happens in the giving, receiving, and comprehending of information is what keeps communications thriving.
This can be illustrated by the saying that when negotiations fail, they have “broken down.” There is not enough information for the parties involved to come to an understanding or an agreement.
In an organization, communication is the highly underestimated glue that keeps it functioning properly. An organization whose departments “do not talk to each other” is an organization whose operations are in disarray. Proper execution is hampered when there is not enough communication between departments to ensure smooth implementation.
But keeping communication from entropy is not just a matter of having sent a message. The way information is structured and how it is comprehended matter as well. The better a message is structured and the clearer its mode of delivery, the higher the chances that it will be comprehended in the way the sender meant it.
Thus, communication isn’t just a matter of sending a message. Effective communication is a delicate balance of matching the message and mode of delivery to its recipient for optimum comprehension and acceptance. Any deficiency in these may lessen the chances of a message being properly received, thus increasing the chances of communications “breaking down.”
Dr. Flor quoted Yeats’s “The Second Coming,” using the stanza
Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world.
to illustrate the effect of entropy on communication. I believe that this is an apt metaphor. When communication fails, it’s just a matter of time before the whole conversation/endeavor/organization falls apart.
References:
Hartnett, K. (2022, September 6). How Claude Shannon’s concept of entropy quantifies information. Quanta Magazine. https://www.quantamagazine.org/how-claude-shannons-concept-of-entropy-quantifies-information-20220906/
Allow me to discuss this question following my answer on how the power of information creates polarization between marginalized farmers and fishers and the BPO and marketing industries. Information and entropy are negatively correlated, according to Shannon’s entropy theory. Entropy is uncertainty or a state of disorganization because it erases clear distinctions, while information negates this phenomenon, hence the term negentropy.
The marginal farmers and fishers’ limited access to information results in a greater amount of entropy (uncertainty). This conceptual gap translates to real-world issues such as hunger and poverty because, according to Prof. Flor, information lies in the fabric of reality. Their entropy was caused by their limited access to real-time information on climate change, market volatility, inflation, weather forecasts, and biotechnological innovations in planting and fishing. This does not mean that the BPO and marketing industries do not suffer from entropy, but rather that they suffer a different kind of entropy, one born not from issues of the digital divide but from uncertainties brought by misinformation, data privacy violations, and the cognitive disruptions brought by AI technologies.
This negative relationship between information and entropy may be simple to map, but embedding it in the fabric of reality (I like how Prof. Flor coined the term) would affect many actants in the vast micro and macro human ecosystem. Still, being able to map out the relationship is a step toward knowledge that allows human beings to utilize information to address pressing societal issues. I believe Hilbert (2022) would agree with this assertion because it leans toward saying that “the more you know, the more you can grow”.
I find the relationship between information and entropy quite fascinating, especially in the context of communication. One of the key ideas that stuck with me in my readings on ICT4D is that “information is a difference that makes a difference.” This aligns well with Claude Shannon’s concept of entropy, which refers to the uncertainty or disorder in a communication system. The more uncertain or unpredictable a message is, the higher its entropy—meaning it contains more potential for meaningful information.
In both teaching and research, I see this relationship play out all the time. For instance, when instructions for a research paper or project are vague, my students—whether undergraduates or master’s students—often become confused and stressed. That confusion is a form of entropy. When I clarify the guidelines or provide examples, I reduce the entropy and increase their understanding. This makes the message more effective and actionable. Shannon’s entropy, then, isn’t just a mathematical theory—it’s something we experience every day in academic communication. The clearer and more structured the message, the more useful the information becomes. This also connects to the idea of the quality and contextual relevance of information, which are essential for development communication.
In a broader sense, this discussion becomes even more important in our current digital age, where information overload is common and misinformation spreads quickly. In regions like Caraga, where digital access is growing but still uneven, high entropy can mean missed opportunities, confusion during emergencies, or even social division. As a teacher and communicator, it is part of my responsibility to help reduce entropy by ensuring that the information I share is clear, truthful, and relevant. Whether I’m teaching communication theory or guiding research writing, I aim to foster an environment where information truly empowers rather than confuses. Understanding the relationship between information and entropy gives us a deeper appreciation of the power and responsibility we carry as educators and communicators.
The relationship between information and entropy may appear complex. Yet it can be drawn out through analogies from daily situations. By simplifying the context, the connection can be easily appreciated and shown to make sense. In mathematics, information and entropy may be likened to variables that are inversely proportional: when one increases, the other decreases, and vice versa.
In the school scenario, a messy and disorganized classroom is a state of entropy. The more chaotic it is, the greater the chance of losing the information on where the school supplies are. The disorder represents the lack of information on where things are kept. This is the opposite of a neat and well-organized room, where things are easily located because information on their whereabouts is clear. Thus, the presence of information decreases entropy.
This analogy shows the relationship between information and entropy. The latter represents uncertainty and disorder, while information restores orderliness and control. Whatever the situation, entropy and information define how we respond to complexities and how we utilize information to guide our everyday decisions and maintain harmony.
At first glance, information and entropy may appear to belong to completely different worlds, one that sounds like technology or communication, the other like something from a physics textbook. But surprisingly, they are deeply connected, and understanding how they connect can give you a cool insight into how everything from your brain to the universe functions. Let us start with the basics. Entropy, in physics, is a measure of disorder or randomness in a system. The more disordered something is, the higher its entropy. At the same time, information, at least in the way communication theorists such as Claude Shannon think about it, is about reducing uncertainty. When you receive new information, you are making something less random, less unpredictable. So here is the twist! Information is essentially the opposite of entropy. When you receive a clear message or a meaningful piece of data, you are reducing the "entropy" of a situation because you are adding clarity. For example, imagine flipping a coin. Before flipping it, you do not know the outcome; it has maximum uncertainty. Once you see it land on heads, you have gained one bit of information, and the uncertainty disappears. That is Shannon’s idea: information resolves uncertainty, which is similar to reducing entropy.
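To spell out the coin example in Shannon’s own terms (a standard textbook calculation, added here only for illustration), a fair coin with two equally likely outcomes carries exactly one bit of entropy:

```latex
H(X) = -\sum_{i} p_i \log_2 p_i
     = -\left(\tfrac{1}{2}\log_2 \tfrac{1}{2} + \tfrac{1}{2}\log_2 \tfrac{1}{2}\right)
     = 1 \text{ bit}
```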
Information theory is a branch of mathematics that focuses on transmitting data through a channel that is prone to noise. The fundamental concept in information theory is the measurement of the amount of information contained within a message. In general, this concept can be applied to measure the amount of information in an event and a random variable, known as entropy, and is determined using probability. Calculating information and entropy is a valuable tool in machine learning and serves as the foundation for techniques like feature selection, constructing decision trees, and, more broadly, fitting classification models. As a result, a machine learning practitioner must possess a deep understanding and intuition for information and entropy.
Information theory focuses on compressing and transmitting data, drawing upon probability and providing a foundation for machine learning. Information serves as a means to measure the level of surprise for an event, expressed in bits.
Entropy, on the other hand, quantifies the average amount of information required to represent an event selected randomly from the probability distribution of a random variable.
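In symbols (standard notation, added here for reference rather than taken from the post above), the information content of a single outcome and the entropy of a random variable are:

```latex
I(x) = -\log_2 p(x) \quad \text{(surprise of one outcome, in bits)}
H(X) = -\sum_{x} p(x)\,\log_2 p(x) \quad \text{(average information over all outcomes)}
```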
Information and entropy are directly related to each other. Information is a measure of how much uncertainty about an event is resolved when we learn its outcome. For instance, if we had to tell someone the number that came up on a die, we would only need to indicate a number from one to six. If we wanted to refer to a different number, such as seven, ten, or twelve, we would also need to specify that we are using a die with more than six sides, which requires more information.
High entropy is associated with high information content: the greater the uncertainty surrounding the outcome of a particular event, the more information is gained when it is resolved, and vice versa.
In essence, entropy is information quantified by the amount of detail required to convey a probable occurrence, and the more frequently you anticipate the event (common, routine events), the less information you need to provide.
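As a rough sketch of how this feeds into machine learning (my own example with made-up labels, not a specific library’s API), the entropy of a label set and the information gain of a candidate split can be computed as follows; decision-tree learners choose the split with the highest gain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(parent, groups):
    """Reduction in entropy achieved by splitting `parent` into `groups`."""
    total = len(parent)
    remainder = sum(len(g) / total * entropy(g) for g in groups)
    return entropy(parent) - remainder

# Hypothetical labels: does a customer buy (1) or not (0)?
parent = [1, 1, 1, 0, 0, 0, 1, 0]
left, right = [1, 1, 1, 1], [0, 0, 0, 0]        # a candidate feature split

print(entropy(parent))                          # 1.0 bit: maximally uncertain
print(information_gain(parent, [left, right]))  # 1.0 bit: the split removes all uncertainty
```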
Information and entropy are two different concepts that are inversely related. At its core, the relationship between these two concepts is a story of surprise and uncertainty. Information is not about the meaning of a message, but about how much uncertainty is resolved when the message is received. Therefore, the higher the entropy of a system (the more unpredictable or random it is), the more potential information is contained in the message that describes its state. A message with low entropy is highly predictable and provides very little information, whereas a message with maximum entropy provides the most information with each new symbol because every outcome is a complete surprise.
I can relate this to my current experience as someone learning a foreign language (Japanese) for the first time, a situation in which this abstract relationship between information and entropy becomes very concrete. Nihongo, in its entirety, is a high-entropy system. Every time I encounter a new kanji, a grammar particle like wa [は], or a new verb conjugation, it is a high-information event because the outcome is entirely unpredictable to me. Each new piece of information I successfully absorb reduces the overall entropy of the language for me. As I progress, I will start to predict what grammar follows a certain verb form or which kanji are likely to appear in a given context. The language becomes more predictable, meaning its entropy decreases, and each new word provides less "surprising" information. This is the reason why my learning materials start with simple and repetitive phrases (low entropy) to build a foundation, before I gradually expose myself to the full, high-entropy complexity of authentic Japanese media and real-life conversation.
Information and entropy, according to Shannon’s information theory, are closely linked concepts that describe the uncertainty and information content within a system or message. For Shannon, entropy measures the uncertainty or randomness in a system, specifically the average amount of information needed to predict the outcome of a random variable. The probability of each outcome is used to calculate this quantity in bits.
Meanwhile, information quantifies the reduction in uncertainty that a message or event provides.
So when you receive a message, the information it conveys is directly tied to how much it reduces the entropy (uncertainty) about the system. For instance, Mark clarifies with the community that the plastic ban addresses urgent waterway pollution. This message reduces uncertainty (entropy) about why the ban is needed. However, if the community is split between the ban and a phase-out, each message Mark sends provides information that shifts the probabilities. If the community then converges on the fast phase-out, entropy drops significantly, reflecting that little additional information is needed to confirm the decision.
This shows how information can control entropy: the higher the entropy, the more information is needed to clarify the outcome; the lower the entropy, the more predictable the outcomes, and the less information is required.
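Putting hypothetical numbers on Mark’s scenario (the probabilities below are invented purely for illustration), the entropy of the community’s eventual decision falls as opinion converges:

```python
import math

def entropy_bits(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical probabilities for the community's preferred option [ban, fast phase-out]
even_split     = [0.50, 0.50]   # no idea which option will prevail
after_messages = [0.20, 0.80]   # Mark's clarifications shift the probabilities
near_consensus = [0.05, 0.95]   # community converging on the fast phase-out

for label, dist in [("even split", even_split),
                    ("after messages", after_messages),
                    ("near consensus", near_consensus)]:
    print(f"{label:14s}: {entropy_bits(dist):.2f} bits of uncertainty")
# even split: 1.00, after messages: 0.72, near consensus: 0.29
```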
I would like to answer this question from the perspective of an ICU nurse turned HR, a researcher in workforce optimization and a Doctor in Sustainability student.
The relationship between entropy and information plays out beautifully in the ICU, where uncertainty can be the deciding factor between life and death. If I am unaware of my patient's oxygen saturation, I am in a state of high entropy: we cannot know what the future holds for their stability. But when I have a reading from the monitor, I gain something. That one data point reduces entropy by trading uncertainty for clarity, enabling me to make informed and timely clinical decisions. The same holds outside the bedside, particularly in organizational environments.
My background in HR has shown me how the "entropy" of an organization is most often paired with vagueness. For management without credible data on the well-being, morale, or productivity of employees, uncertainty becomes the dominant condition. Choices in that environment are most often reactive, random, and based on guesses rather than strategy. But tools such as employee questionnaires, feedback mechanisms, or standard performance appraisals feed helpful information into the system. That information brings organizational entropy down by lowering unpredictability and allowing leaders to set clearer, more lasting directions for staff as well as for organizational progress.
In my research on workforce optimization, I see how the relationship between information and entropy can be viewed not merely descriptively but strategically as well. Entropy within optimization can be viewed as the "spread" of possible outcomes, while information is the force that narrows that spread. A system with zero entropy, in which everything is determined, can become inflexible and unable to adapt. A system with too much entropy can collapse into chaos. The art is in managing uncertainty so that enough information is gathered and applied in the right way, while enough flexibility remains to stay resilient.
This is also what I am learning in my course on sustainability. Sustainability, in its essence, is about handling uncertainty in environmental, social, and economic systems. Climate change, resource scarcity, and community resilience all pose challenges that are high in entropy: uncertain futures, cascading risks, and multiple possible pathways. Sustainability's tools, such as systems thinking, scenario planning, and adaptive governance, mirror the way entropy is handled in clinical and organizational worlds. They strive to reduce uncertainty by gathering good data, developing models, and building feedback loops, while also accommodating the flexibility and adaptability to revise course. Just as an ICU nurse must respond quickly to new data and a workplace leader must shift strategies based on feedback, sustainability practitioners must integrate information without becoming rigid, balancing predictability and resilience.
I see entropy as a pool of possible information and disinformation. Disinformation may seem to provide a level of certainty until proven false. So we need to seek information for both clarity and reduction of uncertainty.
In the ongoing investigations in our country, we seem to be drowning in entropy - disclosures, testimonies, revelations. Fact-finding, audits and verification need to be conducted to reduce questions and uncertainty. Then hopefully, when the dust clears, we get clear, verified, undeniable and truthful information. We need vetted information we can use to change and correct the flaws of the systems and structures.
Information and entropy are intimately related because of how signals are made, sent, and understood. The more entropy there is, the more meanings or versions a message could have, which makes it less predictable. Information, in contrast, might be perceived as the alleviation of uncertainty following the reception and interpretation of a message. In this way, communication conveys information by choosing and sending one message among a number of possible ones, which lowers the entropy for the person receiving the message.
This relationship shows how communicators need to find a balance: too much entropy can make messages unclear or hard to understand, while too little entropy can make communication boring or unhelpful. In the study of communication, information is generated through the regulation of entropy, ensuring that messages are both significant and effective for their target audiences.
Information and entropy are deeply interconnected concepts. In the area of Information and Communication Technology (ICT), the link between information and entropy becomes highly practical. Digital systems, networks, and platforms constantly deal with uncertainty, errors, and disorder, which is essentially entropy. Every bit of information transmitted through technology reduces this uncertainty by clarifying meaning, enabling order, and allowing systems to function reliably. For example, error-detection and correction codes in data transmission illustrate how ICT actively uses information to counteract entropy in communication channels, ensuring that messages arrive intact and understandable.
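A minimal sketch of that idea, using a single even-parity bit (far simpler than the error-correcting codes real networks use, and written here only to illustrate the principle): redundancy added by the sender lets the receiver detect when noise has flipped a bit in transit.

```python
def add_parity(bits):
    """Append an even-parity bit so the codeword has an even number of 1s."""
    return bits + [sum(bits) % 2]

def parity_ok(received):
    """True if the received word still has even parity (no single-bit error detected)."""
    return sum(received) % 2 == 0

message  = [1, 0, 1, 1]
codeword = add_parity(message)    # [1, 0, 1, 1, 1]
print(parity_ok(codeword))        # True: arrives intact

corrupted = codeword.copy()
corrupted[2] ^= 1                 # channel noise (entropy) flips one bit
print(parity_ok(corrupted))       # False: the error is detected
```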
At a larger scale, ICT infrastructures such as the internet, mobile networks, and knowledge management systems serve as negentropic engines in society. By facilitating the flow of information across individuals, organizations, and nations, ICT reduces social and institutional entropy, helping communities adapt, innovate, and sustain growth. In sustainable development contexts, this means ICT is not just a set of tools but a vital communication process that enables societies to resist disorder (poverty, inequality, misinformation) and move toward resilience and sustainability. In essence, ICT embodies the principle that information flow reduces entropy, making it the backbone of modern progress.
Information and entropy may be distinct concepts, but they are intimately related in both information theory and thermodynamics, even though they originate from different disciplinary traditions.
In information theory, Claude Shannon (1948) defined information as the reduction of uncertainty that occurs when a message is received. Entropy, in this context, is a mathematical measure of uncertainty or unpredictability in a system of possible messages. The more uncertain or variable the message source, the higher its entropy. Conversely, when information is transmitted, entropy is reduced because uncertainty about the state of the system decreases.
Consider, for example, a press conference during a public crisis. Media practitioners and the wider public confront multiple, often conflicting, accounts of what has happened before the official spokesperson delivers a statement. This multiplicity of possible narratives creates uncertainty, which in information theory is described as entropy (Shannon, 1948). Since no single message has yet established authority or clarity, entropy here denotes the unpredictability of the communication environment. The spokesperson significantly reduces this uncertainty when they communicate verified information. Instead of navigating a field of competing possibilities, the audience now receives a coherent account of the situation. This reduction of uncertainty represents the acquisition of information—knowledge that narrows down the range of possible meanings (Cover & Thomas, 2006).
This analogy emphasizes the academic relationship between information and entropy in the field of communication. Entropy captures the level of ambiguity, noise, or unpredictability within a communicative system, while information reflects the process by which such ambiguity is resolved. Communication, therefore, may be understood as a dynamic practice that transforms entropy into information, enabling human actors to co-construct shared meanings and coordinated responses.
Thus, information and entropy are related in that entropy quantifies the potential information contained in a system, while information refers to the realized reduction of that entropy when uncertainty is resolved. High entropy implies that a system has more possible states and therefore more potential information content, whereas the act of acquiring information effectively decreases entropy by narrowing down the range of possibilities.
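In standard notation (added for reference, and consistent with Shannon, 1948), a system with N equally likely states has maximum entropy, and the information gained from an observation is simply the entropy it removes:

```latex
H_{\max} = \log_2 N, \qquad \text{information gained} = H_{\text{before}} - H_{\text{after}}
```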