Vol. 12 No. 4, October, 2007
"The seemingly empty space around us is seething with information. Much of it we cannot be aware of because our senses do not respond to it. Much of it we ignore because we have more interesting things to attend to. But we cannot ignore it if we are seeking a general theory of information. We cannot live only by reading and writing books" (Brookes 1980: 132)
This short paper presents an approach to a theoretical framework for understanding information in the physical, biological and human domains. It suggests that not only may an understanding of information in the physical and biological domains be helpful in dealing with the concerns of library and information science (LIS), but also that insights from LIS studies may shed light on the sciences usually thought of as more fundamental. It is based on the concept of information as a form of self-organizing complexity, which manifests as complex patterns in the physical world, meaning in context in the biological realm, and understanding through knowledge in the human domain.
This short paper draws on a more extensive and more fully argued and referenced study (Organized complexity, meaning and understanding: an approach to a unified view of information for information science, Aslib Proceedings, in press 2007). This paper focuses on, and presents new ideas on, the way in which LIS scholarship and research may contribute to the understanding of information in other domains.
Three authors in particular have proposed approaches to the idea of a unified view of information, albeit in very different ways: Tom Stonier, Andrew Madden, and Marcia Bates. Such ideas had been suggested before, notably by Brookes, and these pioneering suggestions have been developed by these three authors.
Stonier (1990, 1992, 1997) made one of the first detailed attempts to unify the concept of information in the physical, biological and human domains. Starting from the concept of information as a fundamental constituent of the physical world, Stonier proposed relations between information and the basic physical quantities of energy and entropy, and suggested that a general theory of information may be possible, based on the idea that the universe is organized into a hierarchy of information levels. Stonier identified self-organizing information processing systems as the "physical roots of intelligence", based on his conception of information as a basic property of the universe.
Madden (2004) focused on the biological domain in his evolutionary treatment of information, examining information processing as a fundamental characteristic of most forms of life. He argued that Lamarckian evolution - the idea that characteristics acquired by a biological organism during its lifetime can be passed on to its descendants - while discredited in general biology, may be appropriate for understanding the evolution of human societies, including their information behaviour. Madden proposed, for the first time so far as I am aware, that insights from the information sciences may be valuable to the supposedly more 'basic' sciences, in this case the biological sciences, because of the commonality of the 'information' concept.
Bates (2005), seeking like Stonier to reconcile the physical, biological and human forms of information, took the general definition that "information is the pattern of organization of everything". All information is 'natural information', existing in the physical universe of matter and energy. 'Represented information' is either 'encoded' (having symbolic, linguistic or signal-based patterns of organization) or 'embodied' (encoded information expressed in physical form), and can only be found in association with living creatures. Beyond this, Bates defined three further forms of information: Information 1, the pattern of organization of matter and energy; Information 2, some pattern of organization of matter and energy given meaning by a living being (or its constituent parts); and Knowledge, information given meaning and integrated with other contents of understanding.
This paper builds upon these three approaches, to outline an approach to the expansion of a unified concept of information of relevance to information science. Following Stonier and Bates, it tries to account for information - perhaps of different kinds - in the physical, biological and human domains, and to allow for the ideas of meaning, understanding and knowledge. Following all three authors, it will assume an evolutionary approach, which in an information science context, evokes Karl Popper's 'evolutionary epistemology' (Popper 1979). Following Madden, it allows for the possibility that the insights of the library and information sciences may contribute to the development of the physical and biological sciences, in so far as information concepts are involved.
In recent years, the role of information in physics, and the more general adoption of an 'information perspective' in the physical sciences has become much more widely accepted (von Baeyer 2004).
There are three main areas of the physical sciences in which 'information' is widely regarded as a particularly important issue: the study of entropy; aspects of quantum mechanics; and the study of self-organizing systems. A very brief commentary on these must suffice to show the increasing recognition of the relevance of 'information concepts'.
Entropy, a concept emerging from the development of thermodynamics in the nineteenth century, is a measure of the disorder of a system (Penrose 2004, chapter 27). Given that order, or organization, is a quality generally associated with information, a qualitative link between information and entropy is evident; quantitatively, the 'information content' or 'entropy' in Shannon-Weaver information theory takes the same mathematical form as that of physical entropy (Roederer 2005, Leff and Rex 1990, 2003). Information may therefore, in this sense, be regarded as a kind of 'negative entropy', an indication that it may indeed be a fundamental physical quantity.
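The formal parallel mentioned above can be made concrete. The following sketch (not from the original paper; the function name is mine) computes the Shannon-Weaver measure H = -Σ p log p, which is term-by-term the same form as the Gibbs expression for physical entropy, S = -k Σ p ln p, differing only in the constant and the base of the logarithm:

```python
import math

def shannon_entropy(probabilities):
    """Shannon's H = -sum(p * log2(p)), in bits per symbol.

    The Gibbs entropy of statistical mechanics, S = -k * sum(p * ln(p)),
    has the identical mathematical form, up to a constant factor.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform source is maximally 'disordered' and has maximal entropy:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# A highly ordered (predictable) source has low entropy:
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))
```

The second value is well below 2.0, illustrating the qualitative link the paragraph draws: greater organization (predictability) means lower entropy.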
Quantum mechanics, devised in the first years of the twentieth century, is the most successful physical theory yet developed, in terms of its ability to account accurately for experiments and observations. Interpreting it, however, and understanding what it 'means', is notoriously difficult. Intriguingly for our purposes, many of the interpretations available make some reference to information or knowledge (Penrose 2004, chapter 29). The physicist John A. Wheeler, generally credited with initiating the trend to regard the physical world as basically made of information, with matter, energy, and even space and time, being secondary 'incidentals' (Barrow, Davies and Harper 2004), has taken this approach farther than most, in insisting that 'meaningful information' is necessarily involved, and hence that 'meaning', in the mind of a conscious observer, in effect constructs the physical world: "physics is the child of meaning even as meaning is the child of physics".
Self-organizing systems are a topic of relatively recent interest, but are proving to be of importance in a variety of areas in the physical sciences (Davies 1987, 1998). The interest in them comes from two perspectives. On the small scale, it may be observed that simple physical and chemical systems show a propensity to 'self-organize': to spontaneously move towards a mode which is both organized and also highly complex. On the large scale, science must account for the emergence of highly complex organized structures - stars, galaxies, clusters of galaxies, and so on - in a universe which theorists assure us was entirely uniform and homogeneous immediately after its creation. It is still not clear what the origins of this complexity are; it is generally assumed to come from gravitational effects, acting on very small inhomogeneities (Davies 1998, chapter 2). Gravity in the early universe can therefore be seen as "the fountainhead of all cosmic organization ... triggering a cascade of self-organizing processes" (Davies 1987, page 135).
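The small-scale propensity to self-organize can be illustrated, in a toy fashion not drawn from the paper itself, by an elementary cellular automaton: a line of cells updated by a trivially simple local rule, from which intricate organized structure nonetheless emerges. The rule number and starting state below are my own illustrative choices:

```python
def step(cells, rule=110):
    """Apply one update of an elementary cellular automaton.

    Each cell's next state depends only on itself and its two
    neighbours; 'rule' encodes the eight possible outcomes as bits.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from an almost featureless state: a single 'on' cell.
row = [0] * 31
row[15] = 1
for _ in range(8):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```

Running this prints a growing, irregular triangular pattern: organized complexity arising spontaneously from a uniform starting state and a purely local rule, in the spirit of the self-organization discussed above.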
The ubiquitousness of self-organization has led some scientists to propose that there may be 'laws of complexity', such that the universe has an 'in-built' propensity to organize itself in this way; this view is far from generally accepted, but is gaining support.
The relevance of these issues to information science is that any such complexity laws would be informational in character; that is to say they would act on the information content of the organization of matter and energy, tending to its increase. This would therefore form the basis of any unified view of information, rooted in its emergence in the physical world.
The 'informatisation' of biology has been a remarkable feature of science over the past decades, from the elucidation of the structure of DNA in 1953 to the sequencing of the human genome exactly 50 years later, and accompanied by a consequent detailed understanding of the ways in which information is passed through generations of living creatures. The concepts of information theory have been extensively applied to biological, and specifically genetic, information from a relatively early stage (Gatlin 1972). These arguments follow on from those relating to the physical world, in terms of increasing levels of organized complexity and information content, the latter generally understood in terms of Shannon's formalism and its successors (Avery 2003, Yockey 2005).
With the increasing emphasis on the understanding of genetic information has come the tendency to describe life itself as an informational phenomenon. Rather than defining living things, and their differences from non-living, in terms of arrangements of matter and energy, and of life processes - metabolism, reproduction, etc. - it is increasingly usual to refer to information concepts. Life, thought of in these terms, is the example of self-organized complexity par excellence. But with life comes a change from the organized complexity in the physical universe: with life we find the emergence of meaning and context. The genetic code, for example, allows a particular triplet of DNA bases to carry the meaning that a particular amino acid is to be added to a protein under construction; but only in the context of the cell nucleus.
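The triplet-to-amino-acid mapping just described can be sketched directly. The fragment below uses a handful of codons from the standard genetic code (it is an illustrative fragment of my own, not the full 64-codon table, and real translation proceeds via messenger RNA rather than raw DNA):

```python
# A fragment of the standard genetic code: each base triplet (codon)
# 'means' a particular amino acid - but only in the context of the
# cell's translation machinery that interprets it.
CODON_TABLE = {
    'ATG': 'Met',   # methionine; also the usual 'start' signal
    'TGG': 'Trp',   # tryptophan
    'GGC': 'Gly',   # glycine
    'AAA': 'Lys',   # lysine
    'TAA': 'STOP',  # a 'stop' signal: pure meaning, not material
}

def translate(dna):
    """Read a sequence three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == 'STOP':
            break
        protein.append(amino_acid)
    return protein

print(translate('ATGTGGGGCAAATAA'))  # ['Met', 'Trp', 'Gly', 'Lys']
```

The point of the sketch is the paragraph's point: the bases themselves are mere arrangements of matter; their meaning exists only relative to an interpreting context.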
It has also become clear that the origin of life itself may best be viewed as an 'information event': the crucial aspect is not the arrangement of materials to form the anatomy of a living creature, nor the beginning of metabolic processes; rather it is the initiation of information storage and communication between generations which marks the origin of life (Davies 1998, chapters 2 and 3).
Here we move to the sort of 'information' most familiar in the library / information sciences: communicable recorded information, produced by humans in order to convey what Popper terms 'objective knowledge'. In addition to the organized complexity and meaning in context of the physical and biological domains, we have conscious participants with an internal mental comprehension of knowledge and an ability to make it explicit, leading to patterns of information behaviour which are certainly both organized and complex.
Even in this familiar domain, there is some controversy about the best way to understand information. Information may be regarded as an entity or 'thing' (Buckland 1991), as a cognitive attribute of the individual mind (Belkin 1990), or as something created collaboratively (Talja, Tuominen and Savolainen 2006). There is a particular issue of how information is to be understood to relate to similar entities, most particularly knowledge; see Meadow and Yuan (1997) and Chaim (2006).
Floridi (2005), an exponent of a new interest in the 'philosophy of information' within the discipline of philosophy itself, recasts the idea of knowledge as 'justified, true belief' into the idea that information is 'well-formed, meaningful and truthful data'. This seems more suitable for the needs of information science, but does not reflect the rather muddled reality of the human record. Perhaps the most interesting philosophical approach is that of Kvanvig (2003), who argues that we should replace 'knowledge' with 'understanding' as a focus for interest. Understanding, for Kvanvig, requires "the grasping of explanatory and other coherence-making relationships in a large and comprehensive body of information". It allows for there to be greater or lesser degrees of understanding, rather than just having knowledge/information or not. Crucially, it allows for understanding to be present even in the presence of missing, inconsistent, incompatible, and downright incorrect, information. It is firmly based on the idea of meaning in context, basic to biological information, and therefore underlying human information, which must build on the biological foundation. This seems to be a more appropriate entity than the philosophers' traditional ideas of knowledge for LIS.
We can therefore see human information, characterised as understanding through knowledge, as a further stage in the emergence of self-organized informational complexity.
This paper argues that information may be seen in the physical domain as patterns of organized complexity of matter and energy; in the biological domain, meaning-in-context emerges from the self-organized complexity of biological organisms; in the human domain, understanding emerges from the complex interactions of Popper's World 2, the mental product of the human consciousness, with World 3, the social product of recorded human knowledge. The term 'emerges' is used deliberately, for these are emergent properties, that is to say they appear appropriate to their level: physical, biological, conscious and social.
The linking thread, and the unifying concept of information here, is self-organized complexity. The crucial events which allow the emergence of new properties are: the origin of the universe, which spawned organized complexity itself; the origin of life, which allowed meaning-in-context to emerge; and the origin of consciousness, which allows self-reflection, and the emergence of understanding, at least partly occasioned when the self reflects on the recorded knowledge created by other selves.
If, therefore, we understood these three origins fully, we would, presumably, understand information itself equally fully, and the ways in which its various forms emerged. Sadly, the beginnings of the universe, of life, and of consciousness are among the deepest and most difficult problems for science (Gleiser 2004).
The framework described above is an evolutionary one, with meaning in context and understanding through knowledge emerging from the self-organized complexity of the physical universe. If this is accepted, then it seems clear enough that an understanding of complexity processes in the physical and biological realms might be valuable in understanding the issues of LIS.
The converse - that the findings of LIS research may be valuable in understanding the emergence of complexity in the physical and biological realms - may also be true, in two ways.
Most straightforwardly, we might expect that 'complexity laws', governing the ways in which self-organized complexity emerges, and what results from it, may - at the least - take the same general form in all situations and environments. Insights gained in the realm of the communication of human information might therefore be valuable for those studying the same general phenomena in the physical and biological sciences. It may be that the added richness and levels of complexity found with human information may make the identification of laws and concepts somewhat easier than in the 'sparser' environments of the natural sciences. Whether such laws and concepts would be directly applicable and relevant at all levels, or whether they would be emergent properties applicable only at their own levels, remains to be seen; but in either case they would be a genuine contribution to the study of the supposedly more fundamental sciences.
More ambitiously, there has been a trend in science, following the so-called 'strong anthropic principle', to conjecture that the emergence of life and consciousness may, in some ill-understood way, have an effect of backward causation, so as to affect the nature of the universe which gave rise to it. The analogy for our purposes would be to allow the possibility that the emergence of human information, knowledge and understanding is in itself a force in the physical universe, which can influence the generation of complexity in all domains. This is an intriguing speculation, but it is not necessary to accept it in order to believe that LIS studies may have some value for the understanding of self-organization and complexity in other domains.
We may then want to ask the basic question: what kind of LIS studies or concepts could be of value in this way? This question has not been considered in detail, still less answered. But it seems clear that they must be studies of the emergence of patterns within the recording and communication of human knowledge. Examples might be: bibliometric, webometric and scientometric analyses of publication; studies of emergent networks of information transfer; and studies of information seeking, and more general information behaviour, with an emphasis on the kind of patterns of behaviour which may be observed.
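One classic example of such an emergent pattern is the skewed productivity distribution described by Lotka's law, in which a few authors account for most publications. The sketch below uses a small made-up author list (the data and names are mine, purely illustrative) to show the kind of rank-style regularity bibliometric analyses look for:

```python
from collections import Counter

# Hypothetical authorship data for a small bibliography: one entry
# per paper, naming its author. (Illustrative data, not from the paper.)
papers_by_author = (
    ['smith'] * 8 + ['jones'] * 4 + ['lee'] * 2
    + ['ahmed', 'berg', 'cole', 'diaz']
)

counts = Counter(papers_by_author)          # papers per author
productivity = Counter(counts.values())     # authors per output level

# A Lotka-style view: how many authors produced exactly n papers?
for n in sorted(productivity, reverse=True):
    print(f"{productivity[n]} author(s) with {n} paper(s)")
```

Even in this tiny fabricated sample, the characteristic shape appears: many one-paper authors and a long tail of prolific ones, the sort of self-organized regularity that might, on the argument above, have analogues in other domains.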
Adoption of the unifying concept of information as self-organized complexity allows for research and study linking this concept in several domains, from the physical to the social, and allows the possibility that library and information science research may provide insights for the physical and biological sciences.
I am grateful to Jack Meadows, Jutta Haider, Toni Weller, and an anonymous referee for helpful suggestions.
© The author, 2007.
Last updated: 19 July, 2007