Selection of Regularization Parameters by Minimization of Expected Bregman Distances

Authors

  • Elias S. Helou
  • Lucas E. A. Simões
  • Sandra A. Santos

DOI:

https://doi.org/10.5540/03.2023.010.01.0094

Keywords:

Regularization, Inverse Problems, Bregman Divergences

Abstract

Regularization techniques are important tools for the numerical solution of ill-posed problems (in the sense of Hadamard). This work presents a method for selecting the regularization parameter by minimizing an unbiased estimator of the regularizer's predictive error. The proposed technique generalizes existing methods by allowing more general predictors based on Bregman divergences. The method is applicable to a wide range of linear or nonlinear regularization techniques and to several stochastic noise models.
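To illustrate the kind of parameter selection the abstract describes, consider the classical special case rather than the authors' general estimator: the squared Euclidean distance is the Bregman divergence generated by φ(u) = ‖u‖², and under Gaussian noise the corresponding unbiased estimator of predictive error is Stein's Unbiased Risk Estimate (SURE). The sketch below, on synthetic data, selects the threshold of a soft-threshold regularizer by minimizing SURE over a grid; the signal, noise level, and grid are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic denoising problem: sparse signal plus Gaussian noise.
n, sigma = 1000, 1.0
x = np.zeros(n)
x[:50] = 5.0
y = x + sigma * rng.standard_normal(n)

def sure_soft_threshold(y, t, sigma):
    """Stein's unbiased estimate of the predictive risk
    E||f_t(y) - x||^2 for the soft-threshold denoiser f_t."""
    n = y.size
    return (-n * sigma**2
            + np.minimum(y**2, t**2).sum()
            + 2 * sigma**2 * np.count_nonzero(np.abs(y) > t))

# Select the regularization parameter by minimizing the
# unbiased risk estimate over a grid, without access to x.
grid = np.linspace(0.0, 5.0, 101)
risks = [sure_soft_threshold(y, t, sigma) for t in grid]
t_star = grid[int(np.argmin(risks))]
x_hat = np.sign(y) * np.maximum(np.abs(y) - t_star, 0.0)
```

The key property exploited here, and generalized in the paper to other Bregman divergences and noise models, is that the estimate depends only on the observed data `y`, not on the unknown signal `x`.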


Author Biographies

Elias S. Helou

ICMC-USP, São Carlos, SP

Lucas E. A. Simões

IMECC-UNICAMP, Campinas, SP

Sandra A. Santos

IMECC-UNICAMP, Campinas, SP

Published

2023-12-18

Issue

Section

Full Papers