Sunday, August 4, 2019

What Is Information?

ABSTRACT: There is a striking paradox in contemporary brain and cognitive science. Their purported fundamental category, information, is either left undefined or used in a Shannonesque sense, which cannot account for processes of regulation and control, where the content of information, not its quantity, is what matters. I try to provide a more adequate formula, one applicable to the wide range of systems commonly counted as informational systems; representative examples include a single biological cell, animals, persons, and computers. In fact, I consider information, defined here as any detectable difference of physical states, to be the determining principle of all animate systems, one which determines both their architecture and their operation. I claim that the concept of information is a realist category and that information itself is, in ontological terms, an irreal entity unable to act on its own. Three hierarchically ordered forms of information are distinguished and a number of applications of the proposed definition are discussed.

In books and papers on brain science, cognitive science, and related fields, one of the most frequently used terms is information. We are told that brains and their various subunits, down to the level of a single neuron, process information, store it, retrieve it, transmit it, and so on. They do, indeed. The point, however, is that we are not told what information is.

Perhaps information is meant to be understood in the sense first given to it by C. Shannon? If so, it would be a huge misunderstanding, for at least two reasons. First, his approach is entirely content-neutral: it concerns only the technical and economic, quantitative problems of data transmission and communication, whereas brain activity is concerned with regulation and control, where the content of information matters a great deal. Second, since on Shannon's approach information is what reduces uncertainty, the whole idea presupposes such things as knowledge of a priori probabilities (see the sketch below), a requirement which can hardly be attributed to, say, frogs and butterflies. Shannon's measure serves well the purposes of mathematicians and engineers dealing with well-specified communication problems, but it is useless with regard to systems which must cope with a variety of environmental stimuli.

I suppose that what is usually taken for granted instead is a commonsense, mentalistic connotation: information is thought to be a piece of knowledge. If this is the assumption being made, we must either flatly reject it because of its strong anthropocentric bias, or treat it figuratively, as a conventional term of art with no objective counterpart in reality.
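The Shannon-theoretic point can be made concrete with a minimal sketch in Python (an illustrative example, with hypothetical message distributions chosen here for the purpose). Shannon's measure is fixed entirely by the a priori probabilities of the symbols, so two messages with entirely different content but the same distribution carry exactly the same amount of information.

from math import log2

def shannon_entropy(probs):
    # Average information in bits: H = -sum(p * log2(p)).
    # Only the probabilities enter; the meaning of the symbols never does.
    return -sum(p * log2(p) for p in probs if p > 0)

# Two "messages" with entirely different content...
weather = {"sun": 0.5, "rain": 0.25, "snow": 0.25}
commands = {"advance": 0.5, "retreat": 0.25, "hold": 0.25}

# ...carry exactly the same amount of Shannon information (1.5 bits each),
# because the two probability distributions are identical.
print(shannon_entropy(weather.values()))   # 1.5
print(shannon_entropy(commands.values()))  # 1.5

# Likewise, the self-information of a single outcome, I(x) = -log2 p(x),
# presupposes that its a priori probability is already known.
print(-log2(weather["snow"]))              # 2.0 bits

Nothing in these calculations refers to what "rain" or "retreat" means, which is precisely why the measure is content-neutral, and nothing in them can be computed without the probabilities being given in advance, which is why the measure presupposes knowledge that organisms coping with novel stimuli cannot be assumed to have.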
