**Satish Kumar**

Department of Mathematics, Geeta Institute of Management and Technology, Kanipla-136131, Kurukshetra, Haryana, India

**Arun Choudhary**

Department of Mathematics, Geeta Institute of Management and Technology, Kanipla-136131, Kurukshetra, Haryana, India

In this communication, we first generalize the Shannon inequality and then give its application in coding theory. Finally, lower and upper bounds are derived for the new mean codeword length in terms of the generalized entropy of order α and type β.


Satish Kumar and Arun Choudhary, 2011. A Coding Theorem for the Information Measure of Order α and of Type β. *Asian Journal of Mathematics & Statistics, 4: 81-89.*

**DOI:** 10.3923/ajms.2011.81.89

**URL:** https://scialert.net/abstract/?doi=ajms.2011.81.89

Let

Δ_{n} = {P = (p_{1}, p_{2}, ..., p_{n}): p_{k} ≥ 0, Σ_{k=1}^{n} p_{k} = 1}

be the set of all complete probability distributions with n components.

For P∈Δ_{n}, the Shannon (1948) measure of information is defined as:

H(P) = -Σ_{k=1}^{n} p_{k} log_{D} p_{k}, D ≥ 2
(1.1)

The measure (1.1) has been generalized by various authors and has found applications in disciplines as diverse as economics, accounting, crime analysis and physics.
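As a concrete illustration (added here for exposition, not part of the original paper), the measure (1.1) is straightforward to compute numerically; the sketch below takes D = 2, i.e., base-2 logarithms:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon measure (1.1): H(P) = -sum_k p_k log_D p_k.

    Terms with p_k = 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(pk * math.log(pk, base) for pk in p if pk > 0)

# A fair coin carries exactly one bit of information,
# while a biased coin carries strictly less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))
```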

Sharma and Mittal (1975) generalized (1.1) in the following form:

H(P; α, β) = (2^{1-β} - 1)^{-1} [(Σ_{k=1}^{n} p_{k}^{α})^{(β-1)/(α-1)} - 1], α ≠ 1, β ≠ 1, α, β > 0
(1.2)
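For illustration (this sketch is ours, not from the paper), the standard Sharma–Mittal form H(P; α, β) = (2^{1-β} − 1)^{-1}[(Σ p_k^α)^{(β-1)/(α-1)} − 1] can be evaluated numerically and checked against the Shannon measure with base-2 logarithms as α, β → 1; the function names are our own:

```python
import math

def sharma_mittal(p, alpha, beta):
    """Sharma-Mittal entropy of order alpha and type beta (alpha, beta != 1)."""
    s = sum(pk ** alpha for pk in p if pk > 0)
    return (s ** ((beta - 1.0) / (alpha - 1.0)) - 1.0) / (2.0 ** (1.0 - beta) - 1.0)

def shannon(p):
    return -sum(pk * math.log2(pk) for pk in p if pk > 0)

p = [0.5, 0.3, 0.2]
# Near alpha = beta = 1 the generalized measure is close to Shannon entropy.
print(sharma_mittal(p, 1.0001, 1.0001), shannon(p))
```

Setting β = α gives a one-parameter entropy of Havrda–Charvát type, and letting β → 1 recovers a Renyi-type entropy of order α.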

For P, Q∈Δ_{n}, Kerridge (1961) introduced a quantity known as inaccuracy, defined as:

H(P, Q) = -Σ_{k=1}^{n} p_{k} log_{D} q_{k}
(1.3)

There is a well-known relation between H(P) and H(P, Q), given by:

H(P) ≤ H(P, Q)
(1.4)

with equality if and only if p_{k} = q_{k} for all k = 1, 2, ..., n.

The relation (1.4) is known as the Shannon inequality and its importance in coding theory is well known.
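The Shannon inequality (1.4) is easy to verify numerically; the following sketch (ours, not from the paper) checks H(P) ≤ H(P, Q) for randomly generated pairs of distributions, using base-2 logarithms:

```python
import math
import random

def shannon(p):
    """H(P) = -sum_k p_k log p_k, as in (1.1)."""
    return -sum(pk * math.log2(pk) for pk in p if pk > 0)

def inaccuracy(p, q):
    """Kerridge inaccuracy (1.3): H(P, Q) = -sum_k p_k log q_k."""
    return -sum(pk * math.log2(qk) for pk, qk in zip(p, q) if pk > 0)

random.seed(0)
for _ in range(1000):
    raw_p = [random.random() for _ in range(5)]
    raw_q = [random.random() for _ in range(5)]
    p = [x / sum(raw_p) for x in raw_p]
    q = [x / sum(raw_q) for x in raw_q]
    # Shannon inequality (1.4); the tolerance guards against rounding.
    assert shannon(p) <= inaccuracy(p, q) + 1e-9
```

When Q coincides with P the inaccuracy reduces to the entropy, which is exactly the equality case of (1.4).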

In the literature of information theory, there are many approaches to extending the relation (1.4) to other measures. Nath and Mittal (1973) extended the relation (1.4) to the case of entropy of type β. Using the method of Nath and Mittal (1973), Van der Lubbe (1978) generalized (1.4) to the case of Renyi's entropy; using the method of Campbell, Van der Lubbe (1978) also generalized (1.4) for entropy of type β. With these generalizations, the same authors proved coding theorems for the corresponding measures.

The objective of this communication is to generalize (1.4) for (1.2) and give its application in coding theory.

**GENERALIZATION OF SHANNON INEQUALITY**

For P, Q∈Δ_{n}, define the quantity H(P, Q; α, β) as:

(2.1)

Since H(P, Q; α, β) ≠ H(P; α, β), we will not interpret (2.1) as a measure of inaccuracy, although it is a generalization of the measure of inaccuracy defined in (1.3). In spite of the fact that H(P, Q; α, β) is not a measure of inaccuracy in the usual sense, its study is justified because it leads to meaningful new measures of length. In the following theorem, we determine a relation between (1.2) and (2.1) of the type (1.4).

Since (2.1) is not a measure of inaccuracy in the usual sense, we will call the generalized relation a pseudo-generalization of the Shannon inequality.

**Theorem 1:** If P, Q∈Δ_{n}, then it holds that:

(2.2)

under the condition:

(2.3)

with equality if:

**Proof:** (a) Let 0 < α < 1 < β.

By Holder's inequality (Shisha, 1967):

Σ_{k=1}^{n} x_{k} y_{k} ≥ (Σ_{k=1}^{n} x_{k}^{p})^{1/p} (Σ_{k=1}^{n} y_{k}^{q})^{1/q}
(2.4)

for all x_{k}, y_{k} > 0, k = 1, 2, ..., n, where 1/p + 1/q = 1 and p < 1 (p ≠ 0).

Equality holds if and only if there exists a positive constant c such that x_{k}^{p} = c y_{k}^{q} for every k.

Making the substitutions in (2.4), we get:

Using the condition (2.3), we get:

(2.5)

Since α < 1, (2.5) becomes:

(2.6)

Raising both sides of (2.6) to the power (β-1)/(α-1) < 0, we get:

(2.7)

Using (2.7) and the fact that β > 1, we get (2.2).

(b) In the remaining case, the proof follows along similar lines.

**APPLICATION IN CODING THEORY**

We will now give an application of Theorem 1 in coding theory. Let a finite set of n input symbols with probabilities p_{1}, p_{2}, ..., p_{n} be encoded in terms of symbols taken from a code alphabet {a_{1}, a_{2}, ..., a_{D}} of D letters.

It is then known (Feinstein, 1958) that there always exists a uniquely decipherable code with lengths N_{1}, N_{2}, ..., N_{n} if and only if:

Σ_{k=1}^{n} D^{-N_{k}} ≤ 1
(2.8)

If

L = Σ_{k=1}^{n} p_{k} N_{k}
(2.9)

is the average codeword length, then for a code satisfying (2.8) it has been shown (Feinstein, 1958) that:

L ≥ H(P)
(2.10)

with equality iff N_{k} = -log_{D} p_{k}, k = 1, 2, ..., n, and that by suitably encoding long sequences of source symbols, the average length can be made arbitrarily close to H(P). This is Shannon's noiseless coding theorem.
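The statements around (2.8)-(2.10) can be illustrated numerically. The sketch below (an illustration we add, not from the paper) builds the Shannon code lengths N_{k} = ⌈-log_{D} p_{k}⌉ for D = 2 and checks both the Kraft inequality (2.8) and the classical bounds H(P) ≤ L < H(P) + 1:

```python
import math

def shannon_code_lengths(p, D=2):
    """Shannon code lengths N_k = ceil(-log_D p_k); these always satisfy (2.8)."""
    return [math.ceil(-math.log(pk, D)) for pk in p]

def kraft_sum(lengths, D=2):
    """Left-hand side of the Kraft inequality (2.8)."""
    return sum(D ** -N for N in lengths)

p = [0.4, 0.3, 0.2, 0.1]
N = shannon_code_lengths(p)
H = -sum(pk * math.log2(pk) for pk in p)     # entropy (1.1) with D = 2
L = sum(pk * Nk for pk, Nk in zip(p, N))     # average codeword length (2.9)

assert kraft_sum(N) <= 1     # a uniquely decipherable code exists
assert H <= L < H + 1        # noiseless coding theorem bounds
```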

By considering Renyi's (1961) entropy, a coding theorem analogous to the noiseless coding theorem above was established by Campbell (1965), who obtained bounds for his exponentiated mean codeword length in terms of the Renyi entropy of order α.

Kieffer (1979) defined a class of decision rules and showed that H_{α}(P) gives the best decision rule for deciding which of two sources can be coded with the smaller expected cost for sequences of length n as n→∞, where the cost of encoding a sequence is assumed to be a function of its length only. Further, Jelinek (1980) showed that coding with respect to Campbell's (1965) mean length is useful in minimizing the problem of buffer overflow, which occurs when the source symbols are produced at a fixed rate and the codewords are stored temporarily in a finite buffer.

Hooda and Bhaker (1997) considered the following generalization of Campbell's (1965) mean length:

and proved

under the condition

where the bounding quantity is the generalized entropy of order α = 1/(1+t) and type β studied by Aczel and Daroczy (1963) and Kapur (1967). It may be seen that the mean codeword length (2.9) has been generalized parametrically and its bounds have been studied in terms of generalized measures of entropy. Here, we give another generalization of (2.9) and study its bounds in terms of the generalized entropy of order α and type β.

Longo (1976), Gurdial and Pessoa (1977), Hooda and Bhaker (1997), Singh *et al*. (2003), Parkash and Sharma (2004) and Khan *et al*. (2005) have studied generalized coding theorems by considering different generalized measures of (1.1) and (2.9) under the condition (2.8) of unique decipherability.

The mathematical theory of information is usually interested in measuring quantities related to the concept of information. Shannon's (1948) fundamental concept of entropy has been used in different directions by different authors, such as Stepak (2005), Al-Daoud (2007), Haouas *et al*. (2008), Rupsys and Petrauskas (2009), Al-Nasser *et al*. (2010) and Fatemi (2011).

In this paper, we study some coding theorems by considering a new function depending on the parameters α and β. Our motivation for studying this new function is that it generalizes some entropy functions already existing in the literature, in particular the Sharma and Mittal (1975) entropy.

We define the measure of length L(α, β) by:

(2.11)

where

Also, we have used the condition:

(2.12)

to find the bounds. It may be seen that when α = 1, (2.12) reduces to the Kraft inequality (2.8).

**Theorem 2:** If N_{k}, k = 1, 2, ..., n are the lengths of codewords satisfying (2.12), then:

(2.13)

**Proof:** In (2.2), choose Q = (q_{1}, q_{2}, ..., q_{n}), where:

(2.14)

With this choice of Q, (2.2) becomes H(P; α, β) ≤ L(α, β), which proves the first part of (2.13).

Equality holds iff:

which is equivalent to:

(2.15)

Choose all N_{k} such that:

Using the above relation, it follows that:

(2.16)

We now have two possibilities:

• If α > 1, then (2.16) gives us:

(2.17)

Now consider two cases:

(a) Let 0 < β < 1. Raising both sides of (2.17) to the power (β-1)/(α-1), we get:

(2.18)

Since 2^{1-β} - 1 > 0 for 0 < β < 1, the second part of (2.13) follows.

(b) Let β > 1. The proof follows similarly.

• If 0 < α < 1, the proof follows on the same lines.

**PARTICULAR CASES**

• Since D ≥ 2, we have

It follows that the upper bound of L(α, β) in (2.13) is greater than unity.

• If β = α, then (2.13) becomes:

where

is the Havrda and Charvat (1967) entropy, later studied also by Daroczy (1970) and Tsallis (1988), and

is the new codeword length.

• If β→1, then (2.13) becomes:

where

is the Renyi (1961) entropy and

is the new mean codeword length.

• If β = α and α→1, then (2.13) becomes:

which is Shannon's (1948) classical noiseless coding theorem.
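The particular cases above can be checked numerically. The sketch below (ours, not from the paper) uses the textbook forms of the two one-parameter measures, the Renyi (1961) entropy of order α and the Havrda and Charvat (1967) entropy; both tend to the Shannon measure as α → 1:

```python
import math

def shannon(p):
    return -sum(pk * math.log2(pk) for pk in p if pk > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha != 1): log(sum p_k^alpha) / (1 - alpha)."""
    return math.log2(sum(pk ** alpha for pk in p)) / (1.0 - alpha)

def havrda_charvat(p, alpha):
    """Havrda-Charvat entropy: (sum p_k^alpha - 1) / (2^(1-alpha) - 1)."""
    return (sum(pk ** alpha for pk in p) - 1.0) / (2.0 ** (1.0 - alpha) - 1.0)

p = [0.5, 0.3, 0.2]
# Both one-parameter measures approach Shannon entropy as alpha -> 1.
print(renyi(p, 1.0001), havrda_charvat(p, 1.0001), shannon(p))
```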

We know that an optimal code is one for which the value of L(α, β) equals its lower bound. From Theorem 2, it can be seen that the mean codeword length of the optimal code depends on the two parameters α and β, whereas in Shannon's theorem it does not depend on any parameter. The mean codeword length can therefore be reduced significantly by choosing suitable values of the parameters.

- Al-Daoud, E., 2007. Adaptive quantum lossless compression. J. Applied Sci., 7: 3567-3571.
- Al-Nasser, A.D., O.M. Eidous and L.M. Mohaidat, 2010. Multilevel linear models analysis using generalized maximum entropy. Asian J. Math. Stat., 3: 111-118.
- Fatemi, A., 2011. Entropy of stochastic intuitionistic fuzzy sets. J. Applied Sci., (In Press).
- Haouas, A., B. Djebbar and R. Mekki, 2008. A topological representation of information: A heuristic study. J. Applied Sci., 8: 3743-3747.
- Jelinek, F., 1980. Buffer overflow in variable lengths coding of fixed rate sources. Trans. Inform. Theory, 14: 490-501.
- Kieffer, J.C., 1979. Variable-lengths source coding with a cost depending only on the codeword length. Inform. Control, 41: 136-146.
- Longo, G., 1976. A noiseless coding theorem for sources having utilities. SIAM J. Applied Math., 30: 739-748.
- Rupsys, P. and E. Petrauskas, 2009. Forest harvesting problem in the light of the information measures. Trends Applied Sci. Res., 4: 25-35.
- Singh, R.P., R. Kumar and R.K. Tuteja, 2003. Application of Holder's inequality in information theory. Inform. Sci., 152: 145-154.
- Stepak, A.M., 2005. Frequency value grammar and information theory. J. Applied Sci., 5: 952-964.
- Tsallis, C., 1988. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys., 52: 479-487.