kappa-Entropy Based Restricted Boltzmann Machines

Bibliographic details
Main author: Passos, Leandro Aparecido
Publication date: 2019
Other authors: Santana, Marcos Cleison [UNESP], Moreira, Thierry [UNESP], Papa, Joao Paulo [UNESP], IEEE
Document type: Conference paper
Language: eng
Source title: Repositório Institucional da UNESP
Full text: http://hdl.handle.net/11449/197766
Abstract: Restricted Boltzmann Machines have achieved considerable popularity in the scientific community over the last decade due to outstanding results in a wide range of applications, and also because they provide the required building blocks for successful deep learning models, i.e., Deep Belief Networks and Deep Boltzmann Machines. However, their main bottleneck is the learning step, which is usually time-consuming. In this paper, we introduce a sigmoid-like family of functions based on the Kaniadakis entropy formulation in the context of the RBM learning procedure. Experiments on binary image reconstruction are conducted on four public datasets to evaluate the robustness of the proposed approach. The results suggest that this family of functions is suitable for increasing the convergence rate compared to the standard functions employed by the research community.
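For readers curious about the κ-formalism the paper builds on, the Kaniadakis κ-exponential is exp_κ(x) = (sqrt(1 + κ²x²) + κx)^(1/κ), which recovers the ordinary exponential as κ → 0. The Python sketch below illustrates one plausible way such a κ-exponential could yield a sigmoid-like hidden-unit activation for an RBM, giving a family of curves parameterized by κ; the exact functional form, the value of κ, and all variable names are illustrative assumptions, not taken from the paper.

import numpy as np

def kappa_exp(x, kappa=0.25):
    # Kaniadakis kappa-exponential: exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k).
    # Recovers the ordinary exponential in the limit kappa -> 0.
    if kappa == 0.0:
        return np.exp(x)
    return (np.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

def kappa_sigmoid(x, kappa=0.25):
    # Sigmoid-like squashing function built from the kappa-exponential
    # (assumed form, analogous to 1 / (1 + exp(-x))); tends to the
    # logistic sigmoid as kappa -> 0.
    return 1.0 / (1.0 + kappa_exp(-x, kappa))

# Toy usage: hidden-unit activation probabilities in an RBM-style update,
# with the kappa-sigmoid standing in for the standard logistic function.
rng = np.random.default_rng(0)
v = rng.integers(0, 2, size=(1, 784)).astype(float)  # one binary visible vector
W = rng.normal(scale=0.01, size=(784, 128))           # visible-to-hidden weights
b = np.zeros(128)                                     # hidden biases
p_h = kappa_sigmoid(v @ W + b, kappa=0.25)            # P(h = 1 | v)
h = (rng.random(p_h.shape) < p_h).astype(float)       # sampled hidden states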
Record id: UNSP_0727ba45b1df306c71457b2ace0955bc
OAI identifier: oai:repositorio.unesp.br:11449/197766
Repository: Repositório Institucional da UNESP (UNSP, repository id 2946)
Keywords: Restricted Boltzmann Machines; Kaniadakis Entropy; Machine Learning
Affiliations: UFSCar Fed Univ Sao Carlos, Dept Comp, Sao Carlos, SP, Brazil; UNESP Sao Paulo State Univ, Sch Sci, Bauru, SP, Brazil
Funding: CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior): 001; FAPESP (Fundação de Amparo à Pesquisa do Estado de São Paulo): 2013/07375-0, 2014/12236-1, 2016/06441-7; CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico): 307066/2017-7
Publisher: IEEE
Citation: 2019 International Joint Conference on Neural Networks (IJCNN). New York: IEEE, 8 p., 2019.
ISSN: 2161-4393
Web of Science: WOS:000530893800033
Publication date: 2019-01-01
Deposited: 2020-12-11T17:02:09Z
Document type: info:eu-repo/semantics/conferenceObject (info:eu-repo/semantics/publishedVersion)
Access rights: info:eu-repo/semantics/openAccess
Language: eng
Source: Web of Science