On Tsallis extropy with an application to pattern recognition / Balakrishnan, N.; Buono, F.; Longobardi, M. - In: STATISTICS & PROBABILITY LETTERS. - ISSN 0167-7152. - 180 (2022), pp. 1-9. [DOI: 10.1016/j.spl.2021.109241]
On Tsallis extropy with an application to pattern recognition
Buono, F.
2022
Abstract
Recently, a new measure of information called extropy has been introduced by Lad, Sanfilippo and Agrò as the dual version of Shannon entropy. In the literature, Tsallis introduced a measure for a discrete random variable, named Tsallis entropy, as a generalization of Boltzmann–Gibbs statistics. In this work, a new measure of discrimination, called Tsallis extropy, is introduced and some of its properties are then discussed. The relation between Tsallis extropy and entropy is given and some bounds are also presented. Finally, an application of this extropy to pattern recognition is demonstrated.
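The two classical measures named in the abstract have standard closed forms: Tsallis entropy $S_q(p) = (1 - \sum_i p_i^q)/(q-1)$, which recovers Shannon entropy as $q \to 1$, and the discrete extropy of Lad, Sanfilippo and Agrò, $J(p) = -\sum_i (1-p_i)\ln(1-p_i)$. A minimal sketch of both (not the paper's own code; the function names are illustrative):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1) for q != 1.

    As q -> 1 this converges to the Shannon entropy -sum_i p_i ln(p_i),
    which is returned directly when q == 1.
    """
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def extropy(p):
    """Discrete extropy of Lad, Sanfilippo and Agro:
    J(p) = -sum_i (1 - p_i) ln(1 - p_i)."""
    return -sum((1.0 - pi) * math.log(1.0 - pi) for pi in p if pi < 1)

# For the uniform distribution on n points, S_q = (1 - n^(1-q)) / (q - 1);
# e.g. n = 4, q = 2 gives (1 - 1/4) / 1 = 0.75.
uniform4 = [0.25] * 4
print(tsallis_entropy(uniform4, 2))  # 0.75
```

For a two-point distribution, entropy and extropy coincide (a known identity), which gives a quick sanity check: `extropy([0.5, 0.5]) == tsallis_entropy([0.5, 0.5], 1) == ln 2`.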
| File | Type | License | Size | Format | Access |
|---|---|---|---|---|---|
| main.pdf | 2a Post-print editorial version / Version of Record | Non-public (private/restricted access) | 417.2 kB | Adobe PDF | Not available for download; request a copy |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2994636