Sparsest Continuous Piecewise-Linear Representation of Data
[1] Nathan Srebro,et al. A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case , 2019, ICLR.
[2] Richard Baraniuk,et al. Mad Max: Affine Spline Insights Into Deep Learning , 2018, Proceedings of the IEEE.
[3] G. Petrova,et al. Nonlinear Approximation and (Deep) ReLU Networks , 2019, Constructive Approximation.
[4] Robert D. Nowak,et al. Neural Networks, Ridge Splines, and TV Regularization in the Radon Domain , 2020, ArXiv.
[5] Emmanuel J. Candès,et al. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information , 2004, IEEE Transactions on Information Theory.
[6] A. Pinkus. On smoothest interpolants , 1988 .
[7] M. Unser,et al. Native Banach spaces for splines and variational inverse problems , 2019, 1904.10818.
[8] Michael Unser,et al. Splines Are Universal Solutions of Linear Inverse Problems with Generalized TV Regularization , 2016, SIAM Rev..
[9] Didier Henrion,et al. Exact Solutions to Super Resolution on Semi-Algebraic Domains in Higher Dimensions , 2015, IEEE Transactions on Information Theory.
[10] P. Weiss,et al. Exact solutions of infinite dimensional total-variation regularized problems , 2017, Information and Inference: A Journal of the IMA.
[11] G. Peyré,et al. Sparse regularization on thin grids I: the Lasso , 2017 .
[12] Genady Grabarnik,et al. Sparse Modeling: Theory, Algorithms, and Applications , 2014 .
[13] Andrew R. Teel,et al. ESAIM: Control, Optimisation and Calculus of Variations , 2022 .
[14] R. Tibshirani. The Lasso Problem and Uniqueness , 2012, 1206.0313.
[15] Gabriel Peyré,et al. Support Recovery for Sparse Super-Resolution of Positive Measures , 2017 .
[16] Helmut Bölcskei,et al. Optimal Approximation with Sparsely Connected Deep Neural Networks , 2017, SIAM J. Math. Data Sci..
[17] Robert D. Nowak,et al. Minimum "Norm" Neural Networks are Splines , 2019, ArXiv.
[18] Yuejie Chi,et al. Harnessing Sparsity Over the Continuum: Atomic norm minimization for superresolution , 2019, IEEE Signal Processing Magazine.
[19] Allan Pinkus,et al. Multilayer Feedforward Networks with a Non-Polynomial Activation Function Can Approximate Any Function , 1991, Neural Networks.
[20] Jean-Baptiste Courbot,et al. Sparse analysis for mesoscale convective systems tracking , 2020, Signal Process. Image Commun..
[21] Michael Unser,et al. Continuous-Domain Solutions of Linear Inverse Problems With Tikhonov Versus Generalized TV Regularization , 2018, IEEE Transactions on Signal Processing.
[22] D. Donoho. Superresolution via sparsity constraints , 1992 .
[23] Ben Adcock,et al. Generalized Sampling and Infinite-Dimensional Compressed Sensing , 2016, Found. Comput. Math..
[24] Gabriel Peyré,et al. MultiDimensional Sparse Super-Resolution , 2017, SIAM J. Math. Anal..
[25] Benjamin Recht,et al. The alternating descent conditional gradient method for sparse inverse problems , 2015, 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP).
[26] Yonina C. Eldar,et al. Sampling and Super Resolution of Sparse Signals Beyond the Fourier Domain , 2019, IEEE Transactions on Signal Processing.
[27] Pierre Weiss,et al. On the linear convergence rates of exchange and continuous methods for total variation minimization , 2019, Mathematical Programming.
[28] Emmanuel Soubies,et al. The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy , 2018, Inverse Problems.
[29] Philipp Petersen,et al. Optimal approximation of piecewise smooth functions using deep ReLU neural networks , 2017, Neural Networks.
[30] G. Peyré,et al. Sparse spikes super-resolution on thin grids II: the continuous basis pursuit , 2017 .
[31] Bernhard Schölkopf,et al. A Generalized Representer Theorem , 2001, COLT/EuroCOLT.
[32] M. Kreĭn,et al. The Markov moment problem and extremal problems : ideas and problems of P.L. Čebyšev and A.A. Markov and their further development , 1977 .
[33] Carlos Fernandez-Granda,et al. Super-resolution of point sources via convex programming , 2015, 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP).
[34] Gabriel Peyré,et al. Support Localization and the Fisher Metric for off-the-grid Sparse Regularization , 2018, AISTATS.
[35] Michael A. Saunders,et al. Atomic Decomposition by Basis Pursuit , 1998, SIAM J. Sci. Comput..
[36] Guigang Zhang,et al. Deep Learning , 2016, Int. J. Semantic Comput..
[37] Holger Rauhut,et al. A Mathematical Introduction to Compressive Sensing , 2013, Applied and Numerical Harmonic Analysis.
[38] Matthieu Simeoni. Sparse Spline Approximation on the Hypersphere by Generalised Total Variation Basis Pursuit , 2019 .
[39] Marc Teboulle,et al. A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems , 2009, SIAM J. Imaging Sci..
[40] Michael Unser,et al. Pocket guide to solve inverse problems with GlobalBioIm , 2018, Inverse Problems.
[41] Gabriel Peyré,et al. A Low-Rank Approach to Off-The-Grid Sparse Deconvolution , 2017, ArXiv.
[42] Benjamin Recht,et al. Superresolution without separation , 2015, 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP).
[43] Philip Wolfe,et al. An algorithm for quadratic programming , 1956 .
[44] S. Geer,et al. Locally adaptive regression splines , 1997 .
[45] Ben Adcock,et al. BREAKING THE COHERENCE BARRIER: A NEW THEORY FOR COMPRESSED SENSING , 2013, Forum of Mathematics, Sigma.
[46] A. Berlinet,et al. Reproducing kernel Hilbert spaces in probability and statistics , 2004 .
[47] Pierre Baldi,et al. Learning Activation Functions to Improve Deep Neural Networks , 2014, ICLR.
[48] Gongguo Tang,et al. Near minimax line spectral estimation , 2013, 2013 47th Annual Conference on Information Sciences and Systems (CISS).
[49] Gongguo Tang,et al. Atomic Norm Denoising With Applications to Line Spectral Estimation , 2012, IEEE Transactions on Signal Processing.
[50] Nathan Srebro,et al. How do infinite width bounded norm networks look in function space? , 2019, COLT.
[51] Razvan Pascanu,et al. On the Number of Linear Regions of Deep Neural Networks , 2014, NIPS.
[52] Yonina C. Eldar. Compressed Sensing of Analog Signals , 2008, ArXiv.
[53] George Cybenko,et al. Approximation by superpositions of a sigmoidal function , 1989, Math. Control. Signals Syst..
[54] Michael Unser,et al. Hybrid-Spline Dictionaries for Continuous-Domain Inverse Problems , 2019, IEEE Transactions on Signal Processing.
[55] Joan Bruna,et al. On Sparsity in Overparametrised Shallow ReLU Networks , 2020, ArXiv.
[56] Vincent Duval,et al. An Epigraphical Approach to the Representer Theorem , 2019, 1912.13224.
[57] S. Frick,et al. Compressed Sensing , 2014, Computer Vision, A Reference Guide.
[58] Charles Soussen,et al. OMP and Continuous Dictionaries: Is k-step Recovery Possible? , 2019, ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[59] Vincent Duval. A characterization of the Non-Degenerate Source Condition in super-resolution , 2019, Information and Inference: A Journal of the IMA.
[60] Razvan Pascanu,et al. On the number of response regions of deep feed forward networks with piece-wise linear activations , 2013, 1312.6098.
[61] L. Schwartz. Théorie des distributions , 1966 .
[62] Carl de Boor,et al. On "best" interpolation , 1976 .
[63] Michael Unser,et al. A representer theorem for deep neural networks , 2018, J. Mach. Learn. Res..
[64] Toru Maruyama. On some recent developments in convex analysis , 1977 .
[66] Emmanuel J. Candès,et al. Towards a Mathematical Theory of Super‐resolution , 2012, ArXiv.
[67] Eero P. Simoncelli,et al. Recovery of Sparse Translation-Invariant Signals With Continuous Basis Pursuit , 2011, IEEE Transactions on Signal Processing.
[68] Rémi Gribonval,et al. Approximation Spaces of Deep Neural Networks , 2019, Constructive Approximation.
[69] R. Tibshirani. Regression Shrinkage and Selection via the Lasso , 1996 .
[70] Gabriel Peyré,et al. Exact Support Recovery for Sparse Spikes Deconvolution , 2013, Foundations of Computational Mathematics.
[71] F. Gamboa,et al. Spike detection from inaccurate samplings , 2013, 1301.5873.
[72] M. R. Osborne,et al. On the LASSO and its Dual , 2000 .
[73] Michael Unser,et al. Deep Neural Networks With Trainable Activations and Controlled Lipschitz Constant , 2020, IEEE Transactions on Signal Processing.
[74] Stephen P. Boyd,et al. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers , 2011, Found. Trends Mach. Learn..
[75] Trevor Hastie,et al. Statistical Learning with Sparsity: The Lasso and Generalizations , 2015 .
[76] I. Ekeland,et al. Convex analysis and variational problems , 1976 .
[77] Emmanuel J. Candès,et al. Super-Resolution from Noisy Data , 2012, Journal of Fourier Analysis and Applications.
[78] Joseph W. Jerome,et al. Spline solutions to L1 extremal problems in one and several variables , 1975 .
[79] Michael Unser,et al. B-Spline-Based Exact Discretization of Continuous-Domain Inverse Problems With Generalized TV Regularization , 2019, IEEE Transactions on Information Theory.
[80] K. Bredies,et al. Sparsity of solutions for variational inverse problems with finite-dimensional data , 2018, Calculus of Variations and Partial Differential Equations.
[82] K. Bredies,et al. Inverse problems in spaces of measures , 2013 .
[83] Dmitry Yarotsky,et al. Error bounds for approximations with deep ReLU networks , 2016, Neural Networks.
[84] Yohann de Castro,et al. Exact Reconstruction using Beurling Minimal Extrapolation , 2011, 1103.4951.
[85] Antonin Chambolle,et al. On Representer Theorems and Convex Regularization , 2018, SIAM J. Optim..
[86] Razvan Pascanu,et al. On the number of inference regions of deep feed forward networks with piece-wise linear activations , 2013, ICLR.
[87] Tomaso Poggio,et al. Notes on Hierarchical Splines, DCLNs and i-theory , 2015 .
[88] Michael Unser,et al. Periodic Splines and Gaussian Processes for the Resolution of Linear Inverse Problems , 2018, IEEE Transactions on Signal Processing.
[89] Julien Mairal,et al. Optimization with Sparsity-Inducing Penalties , 2011, Found. Trends Mach. Learn..
[90] W. Rudin. Real and complex analysis , 1968 .
[91] Pin T. Ng,et al. Quantile smoothing splines , 1994 .
[92] G. Wahba. Spline models for observational data , 1990 .
[93] Kurt Hornik,et al. Approximation capabilities of multilayer feedforward networks , 1991, Neural Networks.