Image Denoising using Attention-Residual Convolutional Neural Networks

Bibliographic details
Main author: Pires, Rafael G. [UNESP]
Publication date: 2020
Other authors: Santos, Daniel F. S. [UNESP], Santos, Claudio F. G., Santana, Marcos C. S. [UNESP], Papa, Joao P. [UNESP], IEEE
Document type: Conference paper
Language: eng
Source title: Repositório Institucional da UNESP
Full text: http://dx.doi.org/10.1109/SIBGRAPI51738.2020.00022
http://hdl.handle.net/11449/210333
Abstract: During the image acquisition process, noise is usually added to the data, mainly due to physical limitations of the acquisition sensor and to imprecisions during data transmission and manipulation. The resulting image therefore needs to be processed to attenuate its noise without losing details. Non-learning-based strategies, such as filter-based and noise-prior modeling, have been adopted to solve the image denoising problem. Nowadays, learning-based denoising techniques, such as Residual Convolutional Neural Networks, have shown to be much more effective and flexible. Here, we propose a new learning-based non-blind denoising technique named Attention Residual Convolutional Neural Network (ARCNN), and its extension to blind denoising named Flexible Attention Residual Convolutional Neural Network (FARCNN). The proposed methods learn the underlying noise expectation using an Attention-Residual mechanism. Experiments on public datasets corrupted by different levels of Gaussian and Poisson noise support the effectiveness of the proposed approaches against some state-of-the-art image denoising methods. ARCNN achieved overall average PSNR results of around 0.44 dB and 0.96 dB for Gaussian and Poisson denoising, respectively, while FARCNN presented very consistent results, even with slightly worse performance than ARCNN.
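The abstract only describes the approach at a high level. As a rough illustration of the kind of architecture it refers to, the sketch below combines residual noise prediction with a channel-attention block in PyTorch. The class names (ARCNNSketch, ChannelAttention), the squeeze-and-excitation style of attention, and the layer sizes are assumptions made for illustration; they are not the authors' published implementation.

# Minimal sketch (assumption-based, not the authors' code) of a denoising CNN
# that predicts the residual noise and reweights features with channel attention.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style block: reweights feature channels."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial average per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                    # per-channel weights in [0, 1]
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                         # attended features

class ARCNNSketch(nn.Module):
    """Predicts the noise map; the denoised image is input minus the prediction."""
    def __init__(self, channels: int = 64, depth: int = 8):
        super().__init__()
        layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1),
                       nn.BatchNorm2d(channels),
                       nn.ReLU(inplace=True)]
        self.features = nn.Sequential(*layers)
        self.attention = ChannelAttention(channels)
        self.head = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, noisy):
        residual = self.head(self.attention(self.features(noisy)))  # estimated noise
        return noisy - residual                                     # denoised image

# Usage: denoise a batch of grayscale patches corrupted by Gaussian noise.
if __name__ == "__main__":
    model = ARCNNSketch()
    noisy = torch.rand(4, 1, 64, 64) + 0.1 * torch.randn(4, 1, 64, 64)
    denoised = model(noisy)
    print(denoised.shape)  # torch.Size([4, 1, 64, 64])

In a residual-learning setup like this, training would minimize a loss between the network output and the clean reference image (or equivalently between the predicted residual and the true noise), which matches the abstract's description of learning the underlying noise expectation.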
id UNSP_e6c7285d98646f6adf004db9f5299623
oai_identifier_str oai:repositorio.unesp.br:11449/210333
network_acronym_str UNSP
network_name_str Repositório Institucional da UNESP
repository_id_str 2946
spelling Image Denoising using Attention-Residual Convolutional Neural Networks
Funding: Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq); Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP); Petrobras; NVIDIA
Affiliations: Sao Paulo State Univ, Dept Comp, Bauru, SP, Brazil; Univ Fed Sao Carlos, Dept Comp, Sao Carlos, SP, Brazil
Grants: CNPq 307066/2017-7; CNPq 427968/2018-6; FAPESP 2013/07375-0; FAPESP 2014/12236-1; Petrobras 2017/00285-6
dc.title.none.fl_str_mv Image Denoising using Attention-Residual Convolutional Neural Networks
title Image Denoising using Attention-Residual Convolutional Neural Networks
spellingShingle Image Denoising using Attention-Residual Convolutional Neural Networks
Pires, Rafael G. [UNESP]
title_short Image Denoising using Attention-Residual Convolutional Neural Networks
title_full Image Denoising using Attention-Residual Convolutional Neural Networks
title_fullStr Image Denoising using Attention-Residual Convolutional Neural Networks
title_full_unstemmed Image Denoising using Attention-Residual Convolutional Neural Networks
title_sort Image Denoising using Attention-Residual Convolutional Neural Networks
author Pires, Rafael G. [UNESP]
author_facet Pires, Rafael G. [UNESP]
Santos, Daniel F. S. [UNESP]
Santos, Claudio F. G.
Santana, Marcos C. S. [UNESP]
Papa, Joao P. [UNESP]
IEEE
author_role author
author2 Santos, Daniel F. S. [UNESP]
Santos, Claudio F. G.
Santana, Marcos C. S. [UNESP]
Papa, Joao P. [UNESP]
IEEE
author2_role author
author
author
author
author
dc.contributor.none.fl_str_mv Universidade Estadual Paulista (Unesp)
Universidade Federal de São Carlos (UFSCar)
dc.contributor.author.fl_str_mv Pires, Rafael G. [UNESP]
Santos, Daniel F. S. [UNESP]
Santos, Claudio F. G.
Santana, Marcos C. S. [UNESP]
Papa, Joao P. [UNESP]
IEEE
description During the image acquisition process, noise is usually added to the data, mainly due to physical limitations of the acquisition sensor and to imprecisions during data transmission and manipulation. The resulting image therefore needs to be processed to attenuate its noise without losing details. Non-learning-based strategies, such as filter-based and noise-prior modeling, have been adopted to solve the image denoising problem. Nowadays, learning-based denoising techniques, such as Residual Convolutional Neural Networks, have shown to be much more effective and flexible. Here, we propose a new learning-based non-blind denoising technique named Attention Residual Convolutional Neural Network (ARCNN), and its extension to blind denoising named Flexible Attention Residual Convolutional Neural Network (FARCNN). The proposed methods learn the underlying noise expectation using an Attention-Residual mechanism. Experiments on public datasets corrupted by different levels of Gaussian and Poisson noise support the effectiveness of the proposed approaches against some state-of-the-art image denoising methods. ARCNN achieved overall average PSNR results of around 0.44 dB and 0.96 dB for Gaussian and Poisson denoising, respectively, while FARCNN presented very consistent results, even with slightly worse performance than ARCNN.
publishDate 2020
dc.date.none.fl_str_mv 2020-01-01
2021-06-25T15:05:13Z
2021-06-25T15:05:13Z
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/conferenceObject
format conferenceObject
status_str publishedVersion
dc.identifier.uri.fl_str_mv http://dx.doi.org/10.1109/SIBGRAPI51738.2020.00022
2020 33rd Sibgrapi Conference On Graphics, Patterns And Images (sibgrapi 2020). New York: Ieee, p. 101-107, 2020.
1530-1834
http://hdl.handle.net/11449/210333
10.1109/SIBGRAPI51738.2020.00022
WOS:000651203300014
url http://dx.doi.org/10.1109/SIBGRAPI51738.2020.00022
http://hdl.handle.net/11449/210333
identifier_str_mv 2020 33rd Sibgrapi Conference On Graphics, Patterns And Images (sibgrapi 2020). New York: Ieee, p. 101-107, 2020.
1530-1834
10.1109/SIBGRAPI51738.2020.00022
WOS:000651203300014
dc.language.iso.fl_str_mv eng
language eng
dc.relation.none.fl_str_mv 2020 33rd Sibgrapi Conference On Graphics, Patterns And Images (sibgrapi 2020)
dc.rights.driver.fl_str_mv info:eu-repo/semantics/openAccess
eu_rights_str_mv openAccess
dc.format.none.fl_str_mv 101-107
dc.publisher.none.fl_str_mv Ieee
publisher.none.fl_str_mv Ieee
dc.source.none.fl_str_mv Web of Science
reponame:Repositório Institucional da UNESP
instname:Universidade Estadual Paulista (UNESP)
instacron:UNESP
instname_str Universidade Estadual Paulista (UNESP)
instacron_str UNESP
institution UNESP
reponame_str Repositório Institucional da UNESP
collection Repositório Institucional da UNESP
repository.name.fl_str_mv Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP)
repository.mail.fl_str_mv
_version_ 1808129548094537728