Publications

Formal limitations of sample-wise information-theoretic generalization bounds

Abstract

Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap, no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
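For context, a representative sample-wise bound of the kind the abstract refers to can be sketched as follows (assuming a training set S = (Z_1, ..., Z_n), a learned hypothesis W, and a σ-subgaussian loss; the precise form used in the paper may differ):

\[
\Big| \mathbb{E}\big[ L(W) - \hat{L}(W, S) \big] \Big|
\;\le\;
\frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)},
\]

where L(W) is the population risk, \hat{L}(W, S) the empirical risk, and I(W; Z_i) the mutual information between the hypothesis and the i-th training example. The paper's negative result is that no analogous bound in terms of the individual I(W; Z_i) can hold for the expected squared generalization gap, nor in the PAC-Bayes or single-draw settings, whereas bounds depending on information in pairs of examples do exist.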

Date
November 1, 2022
Authors
Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan
Conference
2022 IEEE Information Theory Workshop (ITW)
Pages
440–445
Publisher
IEEE