In this chapter we introduce and study the max-product sampling operators, which have applications in signal theory. Because the max-product sampling operators attached to bounded functions with positive values have nice properties, all the approximation results in this chapter are stated and proved under this restriction. As already mentioned in Subsection 1.1.3, Property C, however, this restriction can easily be dropped by using the construction employed for the max-product Bernstein operator in Theorem 2.9.1. More precisely, if \(\mathcal{S}_{W,\varphi }^{(M)}\) is any max-product sampling operator defined in this chapter and \(f: \mathbb{R} \rightarrow \mathbb{R}\) is bounded and of variable sign, then it is easy to see that the new operators \(P_{W,\varphi }^{(M)}(f)(x) = \mathcal{S}_{W,\varphi }^{(M)}(f - a)(x) + a\), where \(a <\min \{ f(x);x \in \mathbb{R}\}\) and φ is the Fejér or the Whittaker kernel, retain all the approximation properties of \(\mathcal{S}_{W,\varphi }^{(M)}\): they give the same Jackson order of approximation, \(\omega _{1}(f;1/W)_{\mathbb{R}}\), keep the interpolation properties, and satisfy the same saturation, local inverse, and localization results.
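The shift construction above can be illustrated numerically. The following is a minimal sketch, not the book's implementation: it uses a truncated version of the max-product sampling operator \(\mathcal{S}_{W,\varphi }^{(M)}(f)(x) = \bigvee_{k} f(k/W)\,\varphi(Wx-k) \big/ \bigvee_{k} \varphi(Wx-k)\) with the Fejér kernel, and the truncation range `kmin..kmax`, function names, and parameter choices are all assumptions made for the illustration.

```python
import numpy as np

def fejer(t):
    """Fejér kernel F(t) = (1/2) sinc(t/2)^2; nonnegative, with F(0) = 1/2.
    (np.sinc uses the normalized convention sin(pi x)/(pi x).)"""
    return 0.5 * np.sinc(t / 2.0) ** 2

def maxprod_sampling(f, W, x, kmin=-50, kmax=50, kernel=fejer):
    """Truncated max-product sampling operator for nonnegative f:
       S_W(f)(x) = max_k f(k/W) * phi(Wx - k) / max_k phi(Wx - k)."""
    ks = np.arange(kmin, kmax + 1)
    weights = kernel(W * x - ks)                  # phi(Wx - k) >= 0
    samples = np.array([f(k / W) for k in ks])    # f(k/W)
    return np.max(samples * weights) / np.max(weights)

def maxprod_signed(f, W, x, a, **kw):
    """Extension to a bounded sign-changing f via the shift trick:
       P_W(f)(x) = S_W(f - a)(x) + a, with a < min f so that f - a > 0."""
    return maxprod_sampling(lambda t: f(t) - a, W, x, **kw) + a
```

Since the operator is positively homogeneous and monotone but not linear, the shift by \(a\) cannot simply be absorbed; the point of the construction is that this particular combination leaves the quoted approximation properties unchanged. For instance, `maxprod_signed(np.sin, 8, 0.5, a=-2.0)` applies the operator to the sign-changing function \(\sin\), and the result stays within the range of the samples of \(\sin\).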