**ABSTRACT**

In this study, we consider one-parametric generalizations of the measures H (P) and H (P; Q). For the measure H (P; Q) we give three different kinds of generalizations. These generalizations are the R-norm entropy and the R-norm inaccuracies. The Shannon-Gibbs type inequality is generalized in different ways using Hölder's inequality for the R-norm information measure and three different kinds of inaccuracy.


**Received:** April 30, 2011;

**Accepted:** May 26, 2011;

**Published:** July 02, 2011

#### **How to cite this article**

*Journal of Applied Sciences, 11: 2866-2869.*

**DOI:** 10.3923/jas.2011.2866.2869

**URL:** https://scialert.net/abstract/?doi=jas.2011.2866.2869

**INTRODUCTION**

We consider the set of positive real numbers R⁺ = {R : R > 0, R ≠ 1} and the set of complete finite discrete probability distributions Δ_{N} = {P = (p_{1}, p_{2},..., p_{N}) : p_{i} ≥ 0, Σ_{i=1}^{N} p_{i} = 1}, N ≥ 2. Boekee and Van der Lubbe (1980) studied the R-norm entropy of a distribution P ∈ Δ_{N}, given by:

$$H_R(P) = \frac{R}{R-1}\left[1-\left(\sum_{i=1}^{N} p_i^{R}\right)^{1/R}\right],\qquad R>0,\ R\neq 1 \tag{1}$$

Actually, the R-norm entropy (Eq. 1) is a real function from Δ_{N}→R*, where N≥2. This measure is different from those of Shannon (1948), Renyi (1961), Havrda and Charvat (1967) and Daroczy (1970). The most interesting property of this measure is that, as R→1, the R-norm information measure (Eq. 1) approaches the Shannon (1948) entropy and, as R→∞, H_{R} (P)→(1-max p_{i}), i = 1, 2,..., N.
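The two limiting properties just stated can be checked numerically. Below is a minimal sketch (plain Python; the function names are my own) of the measure in Eq. 1, H_R(P) = (R/(R−1))[1 − (Σ p_i^R)^{1/R}], together with its R→1 and R→∞ limits; the Shannon entropy is taken in nats.

```python
import math

def r_norm_entropy(p, R):
    """R-norm entropy (Eq. 1): H_R(P) = R/(R-1) * [1 - (sum_i p_i^R)^(1/R)]."""
    return R / (R - 1.0) * (1.0 - sum(pi ** R for pi in p) ** (1.0 / R))

def shannon_entropy(p):
    """Shannon entropy in nats: H(P) = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

P = [0.5, 0.3, 0.2]

# As R -> 1, the R-norm entropy approaches the Shannon entropy.
assert abs(r_norm_entropy(P, 1.000001) - shannon_entropy(P)) < 1e-3

# As R -> infinity, it approaches 1 - max(p_i).
assert abs(r_norm_entropy(P, 500.0) - (1.0 - max(P))) < 1e-2
```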

Setting r = 1/R in Eq. 1, we get:

$$H_r(P) = \frac{1}{1-r}\left[1-\left(\sum_{i=1}^{N} p_i^{1/r}\right)^{r}\right] \tag{2}$$

which is a measure mentioned by Arimoto (1971) as an example of a generalized class of information measures. It may be noted that Eq. 2 also approaches Shannon's entropy as r→1.
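To make the substitution r = 1/R concrete, here is a small numerical check (a sketch; it assumes the Arimoto form (1/(1−r))[1 − (Σ p_i^{1/r})^r] for Eq. 2, consistent with the coefficient R/(R−1) in Eq. 1):

```python
def h_form1(p, R):
    # Eq. 1: H_R(P) = R/(R-1) * [1 - (sum_i p_i^R)^(1/R)]
    return R / (R - 1.0) * (1.0 - sum(pi ** R for pi in p) ** (1.0 / R))

def h_form2(p, r):
    # Eq. 2 with r = 1/R: (1/(1-r)) * [1 - (sum_i p_i^(1/r))^r]
    return 1.0 / (1.0 - r) * (1.0 - sum(pi ** (1.0 / r) for pi in p) ** r)

P = [0.5, 0.3, 0.2]
for R in (0.5, 2.0, 5.0):
    # The two forms coincide once r = 1/R is substituted.
    assert abs(h_form1(P, R) - h_form2(P, 1.0 / R)) < 1e-9
```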

For P ∈ Δ_{N}, the Shannon (1948) measure of information is defined as:

$$H(P) = -\sum_{i=1}^{N} p_i \log p_i \tag{3}$$

The measure (Eq. 3) has been generalized by various authors and has found applications in disciplines such as economics, accounting, crime and physics.

For P, Q ∈ Δ_{N}, Kerridge (1961) introduced a quantity known as inaccuracy, defined as:

$$H(P;Q) = -\sum_{i=1}^{N} p_i \log q_i \tag{4}$$

There is a well-known relation between H (P) and H (P; Q), given by:

$$H(P) \le H(P;Q) \tag{5}$$

Eq. 5 is known as the Shannon inequality and its importance in coding theory is well known.
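Eq. 4 and Eq. 5 are easy to verify numerically. The sketch below (plain Python; the helper names are my own) computes the Kerridge inaccuracy H(P; Q) = −Σ p_i log q_i and checks the Shannon inequality H(P) ≤ H(P; Q) on random distributions, with equality when Q = P:

```python
import math
import random

def shannon_entropy(p):
    """Shannon entropy (Eq. 3) in nats: H(P) = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kerridge_inaccuracy(p, q):
    """Kerridge inaccuracy (Eq. 4): H(P;Q) = -sum_i p_i ln q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def random_dist(n):
    """A random point of the simplex Delta_n (normalized positive weights)."""
    w = [random.random() + 1e-9 for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

random.seed(0)
for _ in range(100):
    P = random_dist(4)
    Q = random_dist(4)
    # Shannon inequality (Eq. 5): H(P) <= H(P;Q) ...
    assert shannon_entropy(P) <= kerridge_inaccuracy(P, Q) + 1e-12

# ... with equality iff Q = P.
P = random_dist(4)
assert abs(shannon_entropy(P) - kerridge_inaccuracy(P, P)) < 1e-12
```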

In the literature of information theory, there are many approaches to extending Eq. 5 to other measures. Nath and Mittal (1973) extended the relation (Eq. 5) to the case of entropy of type β.

Using the method of Nath and Mittal (1973), Van der Lubbe (1978) generalized Eq. 5 to the case of Renyi's entropy; Eq. 5 has also been generalized to the case of entropy of type β using the method of Campbell (1965). Using these generalizations, these authors proved coding theorems for the respective measures.

The mathematical theory of information is chiefly concerned with measuring quantities related to the concept of information. Shannon's (1948) fundamental concept of entropy has been developed in different directions by different authors, such as Zheng *et al*. (2008), Haouas *et al*. (2008), Yan and Zheng (2009), Kumar and Choudhary (2011) and Wang (2011).

The objective of this study is to generalize Eq. 5 for Eq. 1 and for three different kinds of R-norm inaccuracies, with the help of Hölder's inequality as given by Shisha (1967).

**GENERALIZATION OF SHANNON INEQUALITY**

**R-Norm inaccuracies:** The three different kinds of R-Norm inaccuracy measures are defined as:

(6)

α = 1, 2 and 3, where:

Now we are interested in extending the result of Eq. 5 in the following fashion:

$$H_R(P) \le {}^{\alpha}H_R(P;Q) \tag{7}$$

where, α = 1, 2 and 3.

Provided the following conditions hold:

(8)

(9)

(10)

Equality in Eq. 7 holds if and only if P = Q, i.e., p_{i} = q_{i} for all i, for α = 1 and 3.

For α = 2, equality in Eq. 7 holds if and only if Q = P^{R}, where P^{R} is given as:

$$P^{R} = \left(\frac{p_1^{R}}{\sum_{i=1}^{N} p_i^{R}},\ \frac{p_2^{R}}{\sum_{i=1}^{N} p_i^{R}},\ \ldots,\ \frac{p_N^{R}}{\sum_{i=1}^{N} p_i^{R}}\right) \tag{11}$$

Since H_{R} (P) ≠ ^{α}H_{R} (P; P) in general, we will not interpret Eq. 6 as a measure of inaccuracy. But ^{α}H_{R} (P; Q) is a generalization of the measure of inaccuracy defined in Eq. 4. In spite of the fact that ^{α}H_{R} (P; Q) is not a measure of inaccuracy in its usual sense, its study is justified because it leads to meaningful new measures of length. In the following theorem, we determine a relation between Eq. 1 and Eq. 6 of the type of Eq. 5.

Since Eq. 6 is not a measure of inaccuracy in its usual sense, we will call the generalized relation a pseudo-generalization of the Shannon inequality for R-norm entropy.

**Theorem 1:** We have:

$$H_R(P) \le {}^{\alpha}H_R(P;Q), \qquad \alpha = 1, 2, 3 \tag{12}$$

i.e., Eq. 7 holds under the conditions Eq. 8, 9 and 10.

**Proof:** We shall prove Eq. 12 for each of the values α = 1, 2 and 3.

**Case 1:** For α = 1,

**Proposition 1:** If:

(13)

then

$$H_R(P) \le {}^{1}H_R(P;Q) \tag{14}$$

with equality iff P = Q, i.e., q_{i} = p_{i} for all i.

**Proof:** We use Hölder's inequality as given by Shisha (1967):

$$\sum_{i=1}^{N} x_i y_i \ge \left(\sum_{i=1}^{N} x_i^{p}\right)^{1/p}\left(\sum_{i=1}^{N} y_i^{q}\right)^{1/q} \tag{15}$$

for all x_{i} ≥ 0, y_{i} ≥ 0, i = 1, 2,..., N, where p < 1 (p ≠ 0) and p^{-1} + q^{-1} = 1, with equality if and only if there exists a positive number c such that:

$$x_i^{p} = c\, y_i^{q}, \qquad i = 1, 2, \ldots, N \tag{16}$$
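As a quick sanity check, the direction of Eq. 15 and its equality condition Eq. 16 can be tested numerically. The sketch below assumes the standard reverse-Hölder form Σ x_i y_i ≥ (Σ x_i^p)^{1/p}(Σ y_i^q)^{1/q} for p < 1, with q computed from p^{-1} + q^{-1} = 1:

```python
def holder_lhs_rhs(x, y, p):
    """Both sides of the reverse Hoelder inequality for p < 1 (p != 0):
    sum x_i y_i >= (sum x_i^p)^(1/p) * (sum y_i^q)^(1/q)."""
    q = p / (p - 1.0)          # solve 1/p + 1/q = 1 for q
    lhs = sum(xi * yi for xi, yi in zip(x, y))
    rhs = (sum(xi ** p for xi in x) ** (1.0 / p)
           * sum(yi ** q for yi in y) ** (1.0 / q))
    return lhs, rhs

# For p < 1 the inequality is reversed relative to classical Hoelder:
# the sum dominates the product of the two norms.
x = [1.0, 2.0, 3.0]
y = [0.5, 1.5, 2.5]
lhs, rhs = holder_lhs_rhs(x, y, 0.5)
assert lhs >= rhs

# Equality case (Eq. 16): x_i^p = c * y_i^q. With p = 0.5 (so q = -1),
# choosing y_i = x_i^(-0.5) makes x_i^p = y_i^q exactly (c = 1).
lhs_eq, rhs_eq = holder_lhs_rhs(x, [xi ** -0.5 for xi in x], 0.5)
assert abs(lhs_eq - rhs_eq) < 1e-9
```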

Setting:

in Eq. 15 and using Eq. 13, we get:

(17)

Multiplying both sides of Eq. 17 by:

and raising both sides to the power 1/R, we get:

Simplification for

as R>1, gives Eq. 14.

For 0 < R < 1, Eq. 14 can be proved along similar lines.

**Case II:** For α = 2,

**Proposition 2:** If:

(18)

then

$$H_R(P) \le {}^{2}H_R(P;Q) \tag{19}$$

with equality iff Q = P^{R}, i.e.:

**Proof:** In inequality Eq. 15 take:

we get:

(20)

i.e.,

(21)

with equality iff

∀i = 1,2,..., N.

Raising both sides of Eq. 21 to the power 1/R and simplifying for

as R>1, gives Eq. 19.

For:

as 0 < R < 1, gives Eq. 19, i.e., H_{R} (P) ≤ ^{2}H_{R} (P; Q) for R > 0 (R ≠ 1).

**Case III:** For α = 3,

**Proposition 3:** If:

(22)

then:

$$H_R(P) \le {}^{3}H_R(P;Q) \tag{23}$$

with equality iff P = Q, i.e., p_{i} = q_{i}.

**Proof:** In inequality Eq. 15 take

we get:

(24)

with equality iff p_{i} = q_{i}, ∀i = 1, 2,..., N

i.e.,

(25)

with equality iff p_{i} = q_{i}, ∀ i = 1, 2,..., N.

Raising both sides of Eq. 25 to the power 1/R and simplifying for:

as R>1, gives Eq. 23.

For:

as 0<R<1, gives Eq. 23.

i.e., H_{R} (P) ≤ ^{3}H_{R} (P; Q) for R > 0 (R ≠ 1).

Propositions 1, 2 and 3 together give the proof of Theorem 1.

**Remark:**

If R→1, Eq. 7 reduces to Eq. 5, i.e., H (P) ≤ H (P; Q), which is the Shannon inequality.
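The reduction of the left-hand side rests on the limit lim_{R→1} H_R(P) = H(P), which can be seen by L'Hôpital's rule (a sketch of the standard computation):

```latex
\lim_{R\to 1} H_R(P)
  = \lim_{R\to 1}\frac{1-\bigl(\sum_{i} p_i^{R}\bigr)^{1/R}}{(R-1)/R}
  = -\left.\frac{d}{dR}\Bigl(\sum_{i} p_i^{R}\Bigr)^{1/R}\right|_{R=1}
  = -\sum_{i=1}^{N} p_i \log p_i = H(P),
```

since the denominator (R−1)/R has derivative 1/R², which equals 1 at R = 1, while the derivative of (Σ p_i^R)^{1/R} at R = 1 is Σ p_i log p_i.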

**CONCLUSION**

The measures H_{R} (P) and ^{α}H_{R} (P; Q) (α = 1, 2) are one-parametric generalizations of the Shannon entropy and of the Kerridge inaccuracy, respectively, both studied by Van der Lubbe (1981). The measures ^{α}H_{R} (P; Q) (α = 1, 2 and 3) are three different one-parametric generalizations of the Kerridge (1961) inaccuracy. Propositions 1, 2 and 3 give inequalities among these measures, which we call generalized Shannon inequalities.

#### **REFERENCES**

- Arimoto, S., 1971. Information-theoretic considerations on estimation problems. Inform. Control, 19: 181-194.

- Boekee, D.E. and J.C.A. van der Lubbe, 1980. The R-norm information measure. Inform. Control, 45: 136-155.

- Haouas, A., B. Djebbar and R. Mekki, 2008. A topological representation of information: A heuristic study. J. Applied Sci., 8: 3743-3747.

- Kumar, S. and A. Choudhary, 2011. A coding theorem for the information measure of order α and of type β. Asian J. Math. Stat., 4: 81-89.

- Renyi, A., 1961. On measures of entropy and information. Proc. 4th Berkeley Symp. Math. Stat. Prob., 1: 547-561.

- Shannon, C.E., 1948. A mathematical theory of communication. Bell Syst. Tech. J., 27: 379-423.

- Wang, Y., 2011. Generalized information theory: A review and outlook. Inform. Technol. J., 10: 461-469.

- Yan, R. and Q. Zheng, 2009. Using Renyi cross entropy to analyze traffic matrix and detect DDoS attacks. Inform. Technol. J., 8: 1180-1188.

- Zheng, Y., Z. Qin, L. Shao and X. Hou, 2008. A novel objective image quality metric for image fusion based on Renyi entropy. Inform. Technol. J., 7: 930-935.