(R, S)-Norm Information Measure and a Relation Between Coding and Questionnaire Theory

In this paper, we introduce a quantity called the (R, S)-norm entropy and discuss some of its major properties in comparison with Shannon's entropy and other entropies known in the literature. We then apply the (R, S)-norm entropy in coding theory and prove a coding theorem analogous to the ordinary noiseless coding theorem; the theorem states that the proposed entropy is a lower bound on the mean code word length. Finally, we apply the (R, S)-norm entropy and the noiseless coding theorem to questionnaire theory, showing that a relationship between the noiseless coding theorem and questionnaire theory can be obtained through a charging scheme based on the resolution of questions, together with a lower bound on the measure of the charge.
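
For orientation, the ordinary noiseless coding theorem to which the new result is analogous bounds the mean code word length of a uniquely decipherable code by Shannon's entropy. A standard statement is sketched below; this is the classical Shannon form, not the paper's (R, S)-norm version, whose exact expression is developed in the body of the paper.

% Classical noiseless coding theorem (Shannon), for a D-ary uniquely
% decipherable code with code word lengths n_1, ..., n_N assigned to
% source symbols with probabilities p_1, ..., p_N:
\[
  H_D(P) \;=\; -\sum_{i=1}^{N} p_i \log_D p_i
  \;\le\;
  L \;=\; \sum_{i=1}^{N} p_i\, n_i
  \;<\; H_D(P) + 1 .
\]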