Entropy Measures vs. Kolmogorov Complexity

Bibliographic details
Lead author: André Souto
Other authors: Luís Filipe Antunes, Andreia Teixeira, Armando Matos
Publication date: 2011
Document type: Article (published version)
Language: English
Format: application/pdf
Access: open access
Source: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Institution: Agência para a Sociedade do Conhecimento (UMIC) - FCT - Sociedade da Informação
Full text: http://repositorio.inesctec.pt/handle/123456789/2961
DOI: http://dx.doi.org/10.3390/e13030595
Abstract: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals the Shannon entropy, up to an additive constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
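As an illustrative aside (not part of the repository record or of the paper itself), the short sketch below numerically checks the standard definitions behind the abstract: the Rényi and Tsallis entropies of order α both converge to the Shannon entropy as α → 1, which is the only order for which the expected-Kolmogorov-complexity characterisation discussed above applies. The toy distribution `P`, the function names, and the natural-log (nats) convention are choices made for this example.

```python
import math

def shannon(p):
    """Shannon entropy H(P) = -sum_x P(x) ln P(x), in nats."""
    return -sum(q * math.log(q) for q in p if q > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha: ln(sum_x P(x)^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(q ** alpha for q in p)) / (1 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy of order alpha: (1 - sum_x P(x)^alpha) / (alpha - 1), alpha != 1."""
    return (1 - sum(q ** alpha for q in p)) / (alpha - 1)

if __name__ == "__main__":
    # A toy finite probability distribution (hypothetical example data).
    P = [0.5, 0.25, 0.125, 0.125]
    print(f"Shannon: {shannon(P):.4f} nats")
    # As alpha approaches 1, both generalized entropies approach the Shannon value.
    for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
        print(f"alpha={alpha:<6}  Renyi={renyi(P, alpha):.4f}  "
              f"Tsallis={tsallis(P, alpha):.4f}")
```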