Sovremennye Issledovaniâ Socialʹnyh Problem (Dec 2013)
SIGNS OF ENTROPY AND NEGENTROPY IN THE LANGUAGE SYSTEM
Abstract
The article is devoted to the study of entropy in language. Describing entropy in language is difficult because it is not a material factor of language development: language is not a physical object but a mental category that exists only in human thinking. For this reason, modern research devoted to studying and measuring entropy concentrates on speech and textual material. To understand the entropy of language itself (rather than of its expression in speech), it is necessary to investigate the entropy of the structure of objective language, not of sets of subjective languages, i.e. to treat language as a model. The research has shown that in the system of a natural language the signs of entropy are such phenomena as synonymy, polysemy, linguistic redundancy, variability of forms, leniency and flexibility of rules, and the existence of styles. These signs are identified by comparison with a hypothetical, absolutely systematized language system whose entropy is reduced to zero. Negentropy, as negative entropy, is opposed to entropy. Its markers are homonymy, the absence of synonymy, invariance, strict language rules, and monostyle.
DOI: http://dx.doi.org/10.12731/2218-7405-2013-9-58
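For the text-based calculations mentioned above, entropy is usually understood in the Shannon sense; the following is the standard information-theoretic formulation (given here as a general illustration, not as a formula from the article itself), computed from the relative frequencies of symbols or words in speech or text material:

$$ H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i $$

where $p_i$ is the relative frequency of the $i$-th symbol or word. $H$ is maximal for a uniform distribution and falls to zero for a fully determined sequence, which corresponds to the hypothetical absolutely systematized language system whose entropy is reduced to zero.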