Although likelihood-based methods are theoretically appealing, deep generative models (DGMs) often produce unreliable likelihood estimates in practice, particularly for out-of-distribution (OOD) detection. We reinterpret variational autoencoders (VAEs) through the lens of fast and slow weights. Our approach is guided by the proposed Likelihood Path (LPath) Principle, which extends the classical likelihood principle. A critical decision in our method is which statistics to feed into classical density estimation algorithms: the sweet spot contains just enough information for OOD detection, but not so much that it suffers from the curse of dimensionality. The LPath principle strikes this balance by selecting the sufficient statistics that form the "path" toward the likelihood. We demonstrate that this likelihood path leads to state-of-the-art (SOTA) OOD detection performance, even when the likelihood itself is unreliable.
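To make the recipe concrete, here is a minimal sketch, not the paper's implementation: it assumes a trained VAE exposed through hypothetical encode(x) -> (mu, logvar) and decode(z) callables, uses a few illustrative per-sample summaries (latent mean norm, latent scale norm, reconstruction error) as stand-ins for the sufficient statistics selected by the LPath principle, and fits scikit-learn's GaussianMixture as the classical density estimator, so that low log-density flags an input as OOD.

import numpy as np
from sklearn.mixture import GaussianMixture

def lpath_features(x, encode, decode):
    # Collect low-dimensional statistics along the VAE's "likelihood path":
    # encoder outputs (mu, log-variance) and the reconstruction error.
    mu, logvar = encode(x)                 # amortized ("fast") inference outputs
    sigma = np.exp(0.5 * logvar)
    x_hat = decode(mu)                     # decode the posterior mean
    recon_err = np.linalg.norm(x - x_hat, axis=1)
    # Per-sample summaries: latent mean norm, latent scale norm, recon error.
    return np.stack([np.linalg.norm(mu, axis=1),
                     np.linalg.norm(sigma, axis=1),
                     recon_err], axis=1)

def fit_ood_scorer(x_train, encode, decode, n_components=5):
    # Fit a classical density estimator on in-distribution LPath statistics;
    # low log-density on new inputs flags them as OOD.
    feats = lpath_features(x_train, encode, decode)
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(feats)
    return lambda x: gmm.score_samples(lpath_features(x, encode, decode))

# Toy stand-ins for a trained VAE's encoder/decoder (illustrative only).
rng = np.random.default_rng(0)
W_mu, W_lv = rng.normal(size=(10, 4)), rng.normal(size=(10, 4))
encode = lambda x: (x @ W_mu, np.tanh(x @ W_lv))   # returns (mu, logvar)
decode = lambda z: z @ W_mu.T
score = fit_ood_scorer(rng.normal(size=(500, 10)), encode, decode)
print(score(rng.normal(size=(5, 10))))             # in-distribution: typically higher scores
print(score(rng.uniform(-5, 5, size=(5, 10))))     # shifted inputs: typically lower scores

The toy encoder/decoder above only exercise the interface; the exact statistics, density estimator, and decision thresholds in the paper may differ.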

BibTeX

@inproceedings{
  huang2024inference,
  title={Inference, Fast and Slow: Reinterpreting {VAE}s for {OOD} Detection},
  author={Sicong Huang and Jiawei He and Kry Yik-Chau Lui},
  booktitle={NeurIPS Safe Generative AI Workshop 2024},
  year={2024},
  url={https://openreview.net/forum?id=K1VpgaYPnX}
}

Related Research