As a building block of culture, meaning is social information in action. In genes, information is recorded in the form of nucleotide sequences. How then is information recorded in meanings? An individual meaning is made up of figurae—changes and differences. Meanings consist of both continuous and discrete figurae. Meanings are therefore similar to light, which has a wave-particle nature.
The quantity of meaning is determined by the number of figurae, that is, of changes and differences. This applies to all functions of meaning, whether making and things, communicating and communities, or thinking and symbols. The quantity of meaning is in all cases determined by the number of figurae necessary for its reproduction. It is impossible to reduce meaning only to expression, only to content, or only to a norm, since every meaning is a material and social abstraction in action, taken in the unity of all the changes and differences necessary for its reproduction. As a social and material abstraction in action, meaning is based on the expenditure of energy and time. However, energy and time are constraints on meaning, not measures of it. Its quantity is measured by the number of figurae it contains.
Information theory calls changes and differences bits and measures information in bits. A bit is a change or difference reduced to being a change or difference and nothing else: “1” or “0,” “on” or “off,” etc. The simplest element of meaning, the simplest figura, is not just a bit (“0” or “1”) but a bit that has a valuation attached to it. The cultural bit has not only a modulus (“0” or “1”) but also a sign (“+” or “–”). Information is certainty; meaning is directed, value-based information. A meaning can be visualized as a string s consisting of figurae. The criterion for defining the size or quantity of a meaning is the number of figurae in the string s.
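This distinction can be made concrete. The sketch below, in Python and with invented names (CulturalBit, modulus, sign), models a figura as a bit carrying a valuation and a meaning as a string s of such figurae; it merely illustrates the definition and implements nothing from the sources cited here.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CulturalBit:
    """A figura: an informational bit plus an attached valuation."""
    modulus: int  # the informational content: 0 or 1
    sign: int     # the valuation attached to it: +1 or -1

# A meaning visualized as a string s of figurae.
s = [CulturalBit(1, +1), CulturalBit(0, -1), CulturalBit(1, +1)]

# The quantity of the meaning is the number of figurae in s.
print(len(s))  # -> 3
```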
A string of figurae forms a meaning; a bundle of meanings forms a context. Since meanings exist in context, they function not as a discrete set but as a continuous one. There is no clearly defined, fixed boundary between figurae and meanings; this boundary is mobile and is determined by the context. The same human movement can, depending on the context, be either part of an action or an independent action with its own meaning: a gesture. Every meaning acquires its specifics in context. Meaning is always a specific action and the result of such an action. An abstract expression is a meaning only insofar as it occurs in the context of concrete social actions.
Aristotle said that “art in some cases completes what nature cannot bring to a finish, and in others imitates nature” (Aristotle 1984, vol. 1, p. 340). Completeness or perfection is the main characteristic of a meaning and of culture as a whole, as compared to figurae. A finished biface is a meaning, a stone fragment is a figura. A finished phrase is a meaning, an unfinished phrase is an assortment of figurae. A well-thought-out book is a meaning; an ill-thought-out book is an assortment of figurae. Wisdom is the highest form of completeness of an action, enabling one to begin a fundamentally new action.
The relationship between figurae and meanings in their linguistic (symbolic) forms, a kind of “sense of meaning,” is the basis of the common language of all humans, which enables us to learn new languages in adulthood and to guess the purpose of rubble found during archaeological excavations. “Such non-signs as enter into a sign system as parts of signs we shall here call figurae; this is a purely operative term, introduced simply for convenience. Thus, a language is so ordered that with the help of a handful of figurae and through ever new arrangements of them a legion of signs can be constructed” (Hjelmslev 1969, p. 44).
When we say that bits are “building blocks” of information, and figurae are “building blocks” of meaning, we imply that figurae, unlike bits, have qualitative properties and that the set of figurae can be divided into subsets, or that we can distinguish between basic types of figurae. This applies to meanings as such, but it was first noted for linguistic signs. Hjelmslev considered the identification of these types to be a necessary condition for understanding both the expression and the content of languages.
“Such an exhaustive description presupposes the possibility of explaining and describing an unlimited number of signs, in respect of their content as well, with the aid of a limited number of figurae. And the reduction requirement must be the same here as for the expression plane: the lower we can make the number of content-figurae, the better we can satisfy the empirical principle in its requirement of the simplest possible description” (Hjelmslev 1969, p. 67).
As we saw above, meanings are not reduced to signs and symbols. Meanings manifest themselves in the mental, social and physical existence of a person, but meanings are not born in this existence. Abstractions are not a product of the human intellect, either in its affirmative form of understanding or in its negative form of reason. Rather, it is understanding and reason that are the result of the evolution of social and material abstractions in action. Meanings only reproduce fundamental definitions, states, relationships, changes and directions in nature and society: “If we’re able to learn language from a few years’ worth of examples, it’s partly because of the similarity between its structure and the structure of the world” (Domingos 2015, p. 37). Hence the universality of meanings, the ability of people to understand each other and to translate each other’s languages, even after tens of thousands of years of isolated life. During the Age of Discovery, Europeans found a common language with the indigenous peoples of the Americas and Australia. All people act, talk and think in one language, the language of meaning:
“In Leibniz’s view, if we want to understand anything, we should always proceed like this: we should reduce everything that is complex to what is simple, that is, present complex ideas as configurations of very simple ones which are absolutely necessary for the expression of thoughts” (Wierzbicka 2011, p. 380). “…’Inside’ all languages we can find a small shared lexicon and a small shared grammar. Together, this panhuman lexicon and the panhuman grammar linked with it represent a minilanguage, apparently shared by the whole of humankind. … On the one hand, this mini-language is an intersection of all the languages of the world. On the other hand, it is, as we see it, the innate language of human thoughts, corresponding to what Leibniz called ‘lingua naturae’” (ibid., p. 383).
Mathematics as a domain of meaning is also a reflection of the fundamental definitions of the world. The similarity between the world and mathematics makes it possible to solve scientific problems. This similarity did not arise overnight: mathematics is a result of the evolution of meaning, from the order of the universe up to the reflection of this order in the minds of people. On the scale of millions and billions of years, the difference between Turing and Wittgenstein disappears: “Turing thought of mathematics as something that was essentially discovered, something like a science of the abstract. Wittgenstein insisted that mathematics was essentially something invented, following out a set of rules we have chosen—more like an art than a science” (Grim 2017, p. 151). In fact, both mathematics and logic in general are the result of cultural evolution that occurs through selection and choice. It could be that a logical contradiction between meanings expresses a historical and practical discrepancy of those meanings in relation to the environment and the subject, and that the resolution of such a contradiction reflects the overcoming of this discrepancy.
The simplicity of early meanings did not concern making alone. Thinking and communicating were just as simple, relying on crude motions of body and mind. Primitive making has left us its direct results: stones, bones, etc. Unfortunately, the direct products of communicating or thinking no longer exist, so we can only judge them indirectly. In the process of social and then cultural learning, as the norm of reaction expanded, first learned and then rational, and as cultural selection turned into traditional choice, the complexity of meanings and of the culture-society as a whole increased, as did the number of figurae and meanings.
The gradual complication of meanings becomes clear, for example, when we consider the evolution of stone tools: from the simplest Paleolithic choppers to the polished and drilled Neolithic axes, which are characterized by a much higher level of workmanship. Cultural evolution consists in the division of meanings, that is, in the emergence of ever new types of actions and their results. By dividing their activity and knowledge, people specialized in those types of actions in which they had a competitive advantage due to the characteristics of the environment or their active power. Hunting and gathering divided into farming, herding, crafts and trade. Not only did the complexity of making grow, but so did that of communicating and thinking. Languages became more complex. Learned actions became a more important part of self-reproduction relative to instinctive behaviors, and rational actions grew more vital relative to learned ones.
That the complexity of meanings increases as they evolve may be intuitively obvious, but it was only in the middle of the 20th century that the concepts of the quantity of information and information complexity were rigorously substantiated in the works of Claude Shannon and Andrey Kolmogorov.
Shannon introduced the concept of information entropy. According to him, entropy H is a measure of the uncertainty, unpredictability, surprise or randomness of a message, event or phenomenon. In terms of culture, such a message or event is a counterfact. Without information losses, Shannon entropy is equal to the amount of information per message symbol. The amount of information is determined by the degree of surprise inherent in a particular message:
“According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent—the vast ensemble of other possible communications that could have been sent, but weren’t. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information. Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent” (Deacon 2013, p. 379).
Thus, the average amount of information H a culture-society contains can be measured by the number of (counter)facts it generates and the probability of their occurrence. The Shannon entropy H is an indicator of the complexity of a culture-society as a whole. If we look at the history of human cultures-societies, we see that their complexity has consistently grown: from a meager set of primitive meanings (tribal community, elementary language, simple stone tools, causal mini-models, animism and fetishism) to a complex arsenal of meanings characteristic of agrarian societies (fields and livestock, agricultural and craft tools, city-states and empires, writing and literature, ancient and Arabic science, world religions).
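For a source with message probabilities p1, …, pn, Shannon’s entropy is H = −Σ pi log2 pi. The following sketch computes it for two toy distributions of counterfacts (the probabilities are invented for illustration): a uniform source is maximally unpredictable, a skewed one more predictable.

```python
from math import log2

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) over all possible messages (counterfacts)."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A toy culture-society that can generate four counterfacts.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal surprise
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits: more predictable
```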
Cultural evolution has increased the complexity of entire cultures-societies as well as of individual meanings. As mentioned earlier, the complexity of a meaning is determined by the minimum number of figurae required to reproduce it. Suppose we have a meaning s that can be represented as a string of figurae whose length is L(s). The complexity of the meaning s is then defined by the length of the shortest program s* that can describe it: K(s) = L(s*). The length of this shortest program is called the algorithmic entropy of s, or Kolmogorov complexity K(s):
“The key concept of algorithmic information theory is that of the entropy of an individual object, also called the (Kolmogorov) complexity of the object. The intuitive meaning of this concept is the minimum amount of information needed to reconstruct a given object” (Vinogradov et al. 1977-1985, vol. 1, p. 220).
We call the program s* the least or minimal action necessary to reproduce the meaning s. The complexity of the meaning s depends on the length (size) of the minimal action required to reproduce it. For example, the string asdfghjkl can be described only by itself: its length is 9 non-repeating figurae. However, if a string s has a pattern, even a non-obvious one, it can be described by a minimal action s* that is much shorter than s itself. For example, the string afjkafjkafjkafjk can be described by the much shorter instruction “repeat afjk four times.”
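Kolmogorov complexity is not computable in general, but the length of a compressed encoding gives a rough upper bound on it. The sketch below applies this proxy to the two example strings, with zlib standing in, loosely, for the “shortest program”; for strings this short the compressor’s fixed overhead dominates, so the effect is clearest when the patterned string is extended.

```python
import zlib

def approx_complexity(s: str) -> int:
    """Compressed length in bytes: a crude upper bound on K(s)."""
    return len(zlib.compress(s.encode()))

# A patternless string can only be described by itself, while a
# patterned one admits a short description ("repeat afjk N times"),
# so its compressed length grows far more slowly than its raw length.
print(approx_complexity("asdfghjkl"))    # 9 figurae, nothing to exploit
print(approx_complexity("afjk" * 4))     # the pattern is detected
print(approx_complexity("afjk" * 1000))  # 4000 figurae, tiny description
```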
“The distinction between simplicity and complexity raises considerable philosophical difficulties when applied to statements. But there seems to exist a fairly easy and adequate way to measure the degree of complexity of different kinds of abstract patterns. The minimum number of elements of which an instance of the pattern must consist in order to exhibit all the characteristic attributes of the class of patterns in question appears to provide an unambiguous criterion” (Hayek 1988-2022, vol. 15, p. 260).
The complexity of a given meaning is determined by the size of the minimal action necessary to reproduce that meaning. As a product of culture, man himself is also a meaning. In order to be able to transmit more and more cultural experiences, he must become more complex. With each generation, the minimal action required to reproduce man as a cultural being grows, and with it the complexity of learning.
The complexity of a minimal action converges to the entropy of its source, that is, the minimal subject. As we saw above, the complexity of a culture-society is determined by the number of alternative meanings (counterfacts) it can generate. At the same time, the complexity of a culture-society is defined by the size of the minimal action necessary for its reproduction. Thus, the entropy of the culture-society considered as a source of messages (counterfacts) is approximately equal to the average complexity of all possible messages from this source:
“Shannon’s entropy does not make sense for a particular string of bits. Entropy is a property of an information source. There are many possible messages, each with its own probability. Entropy measures the size of that universe of possibilities. In contrast, the algorithmic entropy makes sense for any particular string of bits. The strings themselves can have a higher or lower information content, according to whether they require longer or shorter descriptions. The two entropies are related to each other. For a source that produces binary sequences, the Shannon entropy is approximately the average of the algorithmic entropy, taking an average over all the possible sequences that the source might produce: H ≈ ave(K). Shannon entropy is a way to estimate the algorithmic entropy, on average” (Schumacher 2015, p. 231).
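The relation H ≈ ave(K) can be checked numerically, again with compression as a rough stand-in for K. In the sketch below (all parameters are invented for illustration), sequences are drawn from two sources over a four-symbol alphabet; the average compressed size per symbol sits somewhat above H because of coding overhead, but it tracks the entropy of the source.

```python
import random
import zlib
from math import log2

def shannon_H(probs):
    """Shannon entropy of the source, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def avg_K_proxy(probs, n=2000, trials=30, seed=0):
    """Average zlib-compressed size in bits per symbol: a rough ave(K)."""
    rng = random.Random(seed)
    total_bits = 0
    for _ in range(trials):
        seq = bytes(rng.choices(b"abcd", weights=probs, k=n))
        total_bits += 8 * len(zlib.compress(seq, 9))
    return total_bits / (trials * n)

for probs in ([0.25, 0.25, 0.25, 0.25], [0.7, 0.1, 0.1, 0.1]):
    print(f"H = {shannon_H(probs):.2f}  ave(K) proxy = {avg_K_proxy(probs):.2f}")
```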
In other words, the complexity of meaning, when measured in cultural bits, converges on average to the entropy of the subject, be it culture-society as a whole or an individual taken as a source of (counter)facts. As Protagoras said, “man is the measure of all things: of the things which are, that they are, and of the things which are not, that they are not” (Plato 1997, p. 169). The minimal subject is a measure of the complexity of man, that is, of the unpredictability, uncertainty, randomness and surprise of his actions performed and not performed. The minimal subject is both the source and the product of the minimal action, and together they constitute the minimal meaning.
The historical increase in the complexity of a culture-society is reflected in the increase in both the number of (counter)facts it produces and the average size of the minimal action required to reproduce an individual meaning. The transition from cultural selection based on the alternation of human generations to traditional choice based on the alternation of generations of meanings raised both the entropy of the source of (counter)facts and the complexity of the (counter)facts themselves.
When we apply the achievements of information theory to culture, we must note the difference between the terms “information” and “meaning.” Information is determinateness in general, or certainty; its measure is the reduction of uncertainty. The unit of information is the bit, “1” or “0.” In contrast to information, meaning is directed certainty, an act of change in a certain direction. Examples of directed certainty are the evolution of living beings and the evolution of meanings. Humans process information (certainty) into meaning (mediated, that is, directed certainty) by matching information with needs. The unit of meaning is the cultural bit: not just “1” or “0,” but also “+” or “–.” Meaning is information in human action that reproduces the patterns of the world.
“The orderly structures and patterns of which we are most immediately aware are those within our own minds, bodies, and behavior, but virtually all human beings have a strong conviction that corresponding to these patterns of mind and body are similar patterns in what might be called the ‘real world’” (Boulding 1985, p. 9).
At the same time, meaning is not limited to a mental act that operates with abstraction. Thinking in itself is not an interaction with meanings as with some “supramundane” entities. Meaning is a material action and the result of an action. When we compare meanings with each other, we can distinguish between their general and particular properties. A clock is a device for measuring time. But this general property of being a clock does not exist in itself. It exists in the context of human activities related to clocks. The abstraction of a clock only makes sense in action, for example when you read and think about what is written in this book.