What Is Information? The Answer Is a Surprise!

By Ben Brubaker [ June 2, 2025 – Fundamentals – Quanta Magazine ]

It's often said that we live in the information age. But what exactly is information? It seems a more nebulous resource than iron, steam and the other key substances that have powered technological transformations. Indeed, information didn't have a precise meaning until the work of the computer science pioneer Claude Shannon in the 1940s.

Shannon was inspired by a practical problem: What's the most efficient way to transmit a message over a communication channel like a telephone line? To answer that question, it's helpful to reframe it as a game. I choose a random number between 1 and 100, and your goal is to guess the number as quickly as possible by asking yes-or-no questions. "Is the number greater than zero?" is clearly a bad move – you already know that the answer will be yes, so there's no point in asking. Intuitively, "Is the number greater than 50?" is the best opening move. That's because the two possible answers are equally likely: Either way, you'll learn something you couldn't have predicted.

In his famous 1948 paper "A Mathematical Theory of Communication," Shannon devised a formula that translated this intuition into precise mathematical terms, and he showed how the same formula can be used to quantify the information in any message.
Roughly speaking, the formula defines information as the number of yes-or-no questions needed to determine the contents of a message. More predictable messages, by this measure, contain less information, while more surprising ones are more informative. Shannon's information theory laid the mathematical foundation for the data storage and transmission methods that are now ubiquitous (including the error correction techniques that I discussed in the August 5, 2024, issue of Fundamentals). It also has more whimsical applications. As Patrick Honner explained in a 2022 column, information theory can help you win at the online word-guessing game Wordle.

In a 2020 essay for Quanta, the electrical engineer David Tse reflected on a curious feature of information theory. Shannon developed his iconic formula to solve a real-world engineering problem, yet the underlying mathematics is so elegant and pervasive that it increasingly seems as if he hit upon something more fundamental. "It's as if he discovered the universe's laws of communication, rather than inventing them," Tse wrote. Indeed, Shannon's information theory has turned out to have unexpected connections to many different subjects in physics and biology.

The first surprising link between information theory and physics was already present in Shannon's seminal paper. Shannon had previously discussed his theory with the legendary mathematician John von Neumann, who observed that Shannon's formula for information resembled the formula for a mysterious quantity called entropy, which plays a central role in the laws of thermodynamics. Last year, Zack Savitsky traced the history of entropy from its origins in the physics of steam engines to the nanoscale "information engines" that researchers are developing today.
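The question-counting definition above corresponds to Shannon's entropy formula, H = -Σ p_i log2(p_i), which measures information in bits. A minimal sketch in Python (the function name `entropy` is my own shorthand, not anything from the article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average number of yes/no
    questions needed to pin down one outcome drawn from probs."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A number chosen uniformly from 1 to 100 carries log2(100) bits of
# information, so seven halving questions ("Is it greater than 50?")
# always suffice.
print(entropy([1 / 100] * 100))   # ~6.64 bits
print(math.ceil(math.log2(100)))  # 7 questions

# Predictable sources are less informative: a heavily biased coin
# carries far less than the fair coin's 1 bit per flip.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47
```

Up to a constant factor of k_B ln 2, the same sum appears in Gibbs' formula for thermodynamic entropy, S = -k_B Σ p_i ln(p_i) – the resemblance von Neumann pointed out to Shannon.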
Savitsky's history is a beautiful piece of science writing that also explores the philosophical implications of introducing information – an inherently subjective quantity – into the laws of physics.

Such philosophical questions are especially relevant for researchers studying quantum theory. The laws of quantum physics were devised in the 1920s to explain the behavior of atoms and molecules. But in the past few decades, researchers have realized that it's possible to derive all the same laws from principles that don't seem to have anything to do with physics – instead, they're based on information. In 2017, Philip Ball explored what researchers have learned from these attempts to rebuild quantum theory.

Physics isn't the only field influenced by ideas from information theory. Soon after Shannon's paper, information became central to the way researchers think about genetics. More recently, some researchers have brought principles from information theory to bear on some of the thorniest questions in biology. In a 2015 Q&A with Kevin Hartnett, the biologist Christoph Adami described how he uses information theory to explore the origins of life. In April, Ball wrote about a new effort to reframe biological evolution as a special case of a more fundamental "functional information theory" that drives the emergence of complexity in the universe.
This theory is still speculative, but it illustrates the striking extent of information theory's influence.

As the astrobiologist Michael Wong told Ball, "Information itself might be a vital parameter of the cosmos, similar to mass, charge and energy." One thing seems certain: Researchers studying information can surely expect more surprises in the coming years.