Ayoub BELHADJI (Postdoc ENS Lyon), October 7, 2022 at 11:30 am

Monday, October 3, 2022
by Alain Celisse

Subsampling is a cornerstone of approximation theory. This paradigm has many applications in data analysis, signal processing, machine learning, and statistics. Recently, many works have tackled the use of kernel-based approximations in these fields. In a nutshell, a kernel-based approximation requires choosing nodes and weights, and it is up to the practitioner to design their configuration. We study two settings where this design is crucial to obtain good reconstruction guarantees: compressive learning and numerical integration on RKHSs.

In the first setting, we study the design of a sketching operator based on Fourier features for compressive clustering and compressive mixture modeling. We derive sufficient conditions that guarantee that the sketching operator satisfies the Restricted Isometry Property (RIP) with respect to the Maximum Mean Discrepancy (MMD). In particular, this new analysis is universal and allows us to derive theoretical guarantees for sketching under various subsampling regimes.

In the second setting, we study quadrature rules for smooth functions living in an RKHS, using nodes sampled from a projection determinantal point process (DPP) whose kernel is a truncated and saturated version of the RKHS kernel. This coupling between the two kernels leads to a fast quadrature error rate that depends on the spectrum of the RKHS kernel. This analysis gives new insight into the rates of DPP-based quadratures, especially for high-dimensional numerical integration problems.
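To fix ideas on the first setting, here is a schematic formulation of the sketching operator and of the RIP with respect to the MMD; the notation (frequencies omega_j, their law Lambda, the model set) is added here for illustration and is not necessarily that of the talk. A sketching operator built from m Fourier features maps a probability distribution pi to

\[
\mathcal{A}(\pi) \;=\; \frac{1}{\sqrt{m}} \Big( \int e^{\mathrm{i}\, \omega_j^\top x} \, \mathrm{d}\pi(x) \Big)_{j=1}^{m} \in \mathbb{C}^m ,
\]

and the RIP asks that, for all pi, pi' in the model set (e.g., mixtures of k Diracs for compressive clustering),

\[
(1-\delta)\, \mathrm{MMD}(\pi,\pi')^2 \;\le\; \| \mathcal{A}(\pi) - \mathcal{A}(\pi') \|_2^2 \;\le\; (1+\delta)\, \mathrm{MMD}(\pi,\pi')^2 ,
\]

where the MMD is the one induced by the shift-invariant kernel \(\kappa(x,y) = \mathbb{E}_{\omega \sim \Lambda}\, e^{\mathrm{i}\, \omega^\top (x-y)}\) when the frequencies \(\omega_1, \dots, \omega_m\) are drawn from \(\Lambda\).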
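As a concrete illustration, a minimal Python sketch of the Fourier-feature sketching step is given below. It assumes i.i.d. Gaussian frequencies, which corresponds to a Gaussian kernel; the function name and parameters are mine, not part of the talk, and other sampling laws for the frequencies give other subsampling regimes.

import numpy as np

def gaussian_rff_sketch(X, m, sigma=1.0, seed=0):
    """Empirical sketch of a dataset via m random Fourier features.

    Frequencies are drawn i.i.d. from N(0, sigma^{-2} I), the spectral
    measure of the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2)).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.normal(scale=1.0 / sigma, size=(d, m))  # frequencies
    # Empirical characteristic function at the sampled frequencies:
    # (1/n) sum_i exp(i * Omega^T x_i), scaled by 1/sqrt(m).
    Z = np.exp(1j * (X @ Omega))          # (n, m) complex features
    return Z.mean(axis=0) / np.sqrt(m)    # sketch in C^m

# Usage: sketch a toy two-cluster dataset down to m = 50 numbers.
X = np.vstack([np.random.randn(500, 2) - 3.0,
               np.random.randn(500, 2) + 3.0])
z = gaussian_rff_sketch(X, m=50)

The point of the compressive approach is that downstream learning (e.g., recovering the k cluster centers) then works only with the m-dimensional sketch z, never revisiting the n data points.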
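For the second setting, a schematic version of the construction reads as follows; again the notation is mine, and the exact constants and rates are those of the talk's references. Let the RKHS kernel k have the Mercer decomposition

\[
k(x,y) = \sum_{m \ge 1} \sigma_m\, e_m(x)\, e_m(y), \qquad \sigma_1 \ge \sigma_2 \ge \cdots,
\]

and sample the nodes \(x_1, \dots, x_N\) from the projection DPP with kernel

\[
K_N(x,y) = \sum_{m=1}^{N} e_m(x)\, e_m(y),
\]

i.e., the RKHS kernel truncated to its first N eigenfunctions with eigenvalues saturated to 1. With the standard optimal kernel-quadrature weights, obtained by solving the linear system \(\big(k(x_i,x_j)\big)_{i,j}\, w = \big(\int k(x_i,y)\, \mathrm{d}\mu(y)\big)_i\), the expected worst-case squared error over the unit ball of the RKHS is controlled, up to explicit factors, by the spectral tail \(\sum_{m > N} \sigma_m\). This is why the rate is fast whenever the spectrum decays quickly, and why it remains informative in high dimension.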