
Symmetric Kernels with Non-Symmetric Data: A Data-Agnostic Learnability Bound

Authors:
Lavie, Itay
Ringel, Zohar
Publication Year:
2024

Abstract

Kernel ridge regression (KRR) and Gaussian processes (GPs) are fundamental tools in statistics and machine learning, with recent applications to highly over-parameterized deep neural networks. The ability of these tools to learn a target function is directly related to the eigenvalues of their kernel sampled on the input data: targets with support on higher eigenvalues are more learnable. While kernels are often highly symmetric objects, the data often is not. Kernel symmetry therefore seems to have little to no bearing on these eigenvalues or on learnability, making spectral analysis on real-world data challenging. Here, we show that, contrary to this common lore, one may use eigenvalues and eigenfunctions associated with highly idealized data measures to bound learnability on realistic data. As a demonstration, we give a theoretical lower bound on the sample complexity of copying heads for kernels associated with generic transformers acting on natural language.
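The spectral picture the abstract refers to can be illustrated numerically. The sketch below (an illustrative toy example, not the paper's construction; the kernel choice, data distribution, and target are assumptions) diagonalizes a symmetric RBF kernel sampled on anisotropic, non-symmetric data and measures how much of a target function's power sits on the top kernel modes, i.e., the modes with the largest empirical eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-symmetric data: anisotropic Gaussian inputs, standing in for the
# realistic (non-idealized) data measures discussed in the abstract.
X = rng.normal(size=(200, 5)) * np.array([3.0, 1.0, 1.0, 0.3, 0.1])

def rbf_kernel(A, B, length_scale=1.0):
    """Rotation-invariant (highly symmetric) RBF kernel."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * length_scale**2))

K = rbf_kernel(X, X)

# Eigendecomposition of the kernel sampled on the data; in the KRR/GP
# picture these empirical eigenvalues govern learnability.
eigvals, eigvecs = np.linalg.eigh(K)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending

# Project a target function onto the empirical eigenvectors: the fraction
# of its power on large-eigenvalue modes is a rough learnability proxy.
y = np.sin(X[:, 0])
coeffs = eigvecs.T @ y
power = coeffs**2 / (coeffs**2).sum()
top_10_fraction = power[:10].sum()
print(f"fraction of target power on top 10 kernel modes: {top_10_fraction:.3f}")
```

A target concentrated on the leading modes (fraction near 1) needs fewer samples to learn than one spread over the low-eigenvalue tail; the paper's contribution is to bound such quantities using eigenfunctions of an idealized, symmetric data measure instead of the empirical one.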

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.02663
Document Type:
Working Paper