Quotations
[…] the information contributed by a unit depends on its probability in the context where it occurs. The more probable a unit, the less informative it is. The probability of a unit is presented in terms of frequency. The relations between number of units or frequency of units on the one hand and information on the other hand follow, as it were, from the very definition of what information is. They are necessary and, should we say, automatic. If the frequency of an item decreases, its probability must necessarily decrease too, and nothing can prevent its information from soaring. - Martinet (1962), p. 143

We call information whatever reduces uncertainty through the elimination of certain possibilities. This means that information is not the same as meaning. If I say 'he has p'… and stop short, 'p' has, of course, no meaning, but it carries information because it excludes the possibility that the utterance may have been meant as 'he has given' or 'he has seen'. If I say 'he has pr'…, 'r' has no meaning, but again, it contributes information because it eliminates 'he has pushed' or 'he has placed'. This implies that what is going to be said about the dynamics of language applies to all linguistic units, distinctive or significant, phonemes and monemes alike. - Martinet (1962), p. 142
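The frequency-to-information relation Martinet describes can be sketched numerically. A minimal example, assuming the standard Shannon self-information measure I(x) = -log2 p(x), with probabilities estimated from frequencies as in the quote; the function name and the counts are illustrative, not from Martinet:

```python
import math

def self_information(count: int, total: int) -> float:
    """Shannon self-information -log2(p) of a unit,
    with p estimated by relative frequency (count/total)."""
    p = count / total
    return -math.log2(p)

# The more probable a unit, the less informative it is:
common = self_information(500, 1000)  # p = 0.5  -> 1 bit
rare = self_information(1, 1000)      # p = 0.001 -> ~10 bits
print(common, rare)
```

As the frequency of an item decreases, its probability decreases and its information "soars", exactly as the quote states.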