Meaning of information:
Is "structure" to be understood as equivalent to "information content"?
Physical information technically includes both local information and global information. Most local information is entropy, and most information is local information.
- Technically speaking, locality refers to compact support. Nonzero analytic functions never have compact support; they are nonzero over an infinite range. An analytic function which is known entirely over any finite range is thereby known over its entire infinite range. Conversely, any function which cannot be globally known from its local behavior is non-analytic. That is, only non-analytic functions can surprise you by crashing down on your head; analytic functions warn you ahead of time.
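A small sketch (not from the original discussion) of the claim that an analytic function's purely local data determine it globally: the derivatives of exp at 0 are all 1, and that local information alone reconstructs exp(x) far from 0 via its Taylor series.

```python
import math

def taylor_exp(x, terms=60):
    """Rebuild e**x using only local data at 0: the derivatives exp(n)(0) = 1."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # next Taylor term: x**(n+1) / (n+1)!
    return total

# Local knowledge at 0 predicts the value at x = 5 to machine precision.
print(taylor_exp(5.0), math.exp(5.0))
```

A non-analytic function (say, one that is identically zero on an interval and then jumps) offers no such warning: its behavior on that interval tells you nothing about what comes next.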
So a system which maximizes physical information is maximizing entropy. Needless to say, that's not the desired meaning of information. Most people understand information as excluding entropy, as referring only to global information. Structure is a subset of that information. So the definition does not use information in the sense of Shannon's theory, but in something closer to its everyday sense. Perhaps logical information? So there you go: you can either interpret structure very broadly or information very narrowly.
Would 'actively maintains its logical informational content' also apply to a computer? -- RobHarwood
Heh, you've just proved something I only suspected: that 'structure' is a strict subset of the everyday notion of information. The everyday concept of information encompasses both the physical structure of computers (which computers do not maintain) and their logical contents (which they do maintain), yet excludes entropy. Unfortunately, 'information' can get confused with physical information. Maybe order?
Tangential comment: Yes, it's exactly true that (most) computers "actively maintain" their logical contents. This is especially true of bits stored in DRAM, which fade away unless periodically refreshed many times per second.
Sorry, but what exactly is it that you are speaking about? What is "information content", "structure", or "internal order"? How do you propose to reach an agreement about the meanings of these words?
Information is a word with many meanings. There is high-level information (like in a crystal) and low-level information (usually called entropy). Too much information overloads people's signal processing (human brains) and so gets perceived as no information. A structure is an example of high-level information and entails lesser levels of low-level information. Entropy is the dominant part of low-level information and is intuitively conceived as disorder.
The complications come from having more than two levels of information. In a computer, you have a lowest level made up of atoms and molecules, you have a higher level made up of wires, and you have a highest level made up of bits and logic switches. In this case, structure refers to the physical structure (middle level), information content refers to the highest level, and internal order refers to both higher levels. The lowest level of physical information is mostly heat, which isn't conceived as information in the general sense of the word. -- RichardKulisz
Any new and meaningful signal is information. -- HarryVanDerVelde
Information is surprising. The information in a signal can be measured by how unexpected (difficult to predict) it is.
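A minimal sketch of Shannon's way of making the point above precise: the information in an outcome is its "surprisal", -log2(p), so rare (hard to predict) outcomes carry more information than expected ones, and entropy is just average surprisal. (This example is an illustration, not part of the original discussion.)

```python
import math

def surprisal(p):
    """Bits of information in observing an event of probability p."""
    return -math.log2(p)

def entropy(dist):
    """Average surprisal over a distribution: the expected information."""
    return sum(p * surprisal(p) for p in dist if p > 0)

print(surprisal(0.5))        # a fair coin flip carries 1 bit
print(surprisal(1 / 64))     # a 1-in-64 event carries 6 bits
print(entropy([0.5, 0.5]))   # fair coin: 1 bit on average
```

Note that a perfectly predictable signal (p = 1) has surprisal 0: no surprise, no information.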
I strongly recommend reading up on AlgorithmicInformationTheory. -- LucasAckerman
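A hedged illustration of the AlgorithmicInformationTheory idea: the algorithmic information in a string is the length of its shortest description, and a general-purpose compressor gives a crude upper bound on that. A highly regular string has a short description; a (pseudo)random one does not.

```python
import random
import zlib

# Highly regular data: a very short program ("repeat 'ab' 500 times") describes it.
structured = b"ab" * 500

# Pseudorandom data with a fixed seed, so the demo is repeatable.
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))

print(len(zlib.compress(structured)))  # far fewer than 1000 bytes
print(len(zlib.compress(noisy)))       # roughly 1000 bytes, or slightly more
```

The compressor is only an approximation from above; true Kolmogorov complexity is uncomputable, which is part of what makes the theory interesting.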
: Should this page be re-named "DefinitionOfInformation"?