Imbalanced distributions are ubiquitous in real-world data. They make it difficult for deep neural networks to represent minority labels and to avoid bias towards majority labels. The extensive body of work on imbalanced learning addresses categorical label spaces but fails to extend effectively to regression problems, where the label space is continuous. Local and global correlations among continuous labels provide valuable insight for modelling relationships in feature space. In this work, we propose ConR, a contrastive regularizer that models global and local label similarities in feature space and prevents the features of minority samples from collapsing into those of their majority neighbours. ConR discerns disagreements between the label space and the feature space and imposes a penalty on them. It addresses the continuous nature of the label space with two main strategies, applied in a contrastive manner: incorrect proximities are penalized in proportion to the label similarities, and correct ones are encouraged so as to model local similarities. ConR consolidates these considerations into a generic, easy-to-integrate, and efficient method for deep imbalanced regression. Moreover, ConR is orthogonal to existing approaches and extends smoothly to uni- and multi-dimensional label spaces. Our comprehensive experiments show that ConR significantly boosts the performance of all the state-of-the-art methods on four large-scale deep imbalanced regression benchmarks. Our code is publicly available in this https URL.
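The core idea — penalizing pairs that are close in feature space but far apart in label space, with the penalty weighted by the label distance — can be sketched as a simplified contrastive penalty. This is a hypothetical illustration, not the paper's implementation: the threshold `omega`, the temperature, and the cosine-similarity choice are all assumptions made here for concreteness.

```python
import numpy as np

def conr_style_penalty(features, labels, omega=1.0, temperature=0.5):
    """Sketch of a ConR-style contrastive penalty (simplified, hypothetical).

    Pairs whose labels differ by less than `omega` are treated as positives;
    all other pairs are negatives whose feature-space similarity is penalized,
    weighted by how dissimilar their labels are.
    """
    n = len(labels)
    # Cosine similarity between all feature pairs, scaled by a temperature.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T / temperature
    # Pairwise distances in (one-dimensional) label space.
    label_dist = np.abs(labels[:, None] - labels[None, :])
    pos = (label_dist < omega) & ~np.eye(n, dtype=bool)
    neg = label_dist >= omega
    loss, count = 0.0, 0
    for i in range(n):
        if not pos[i].any() or not neg[i].any():
            continue
        # Negatives are weighted in proportion to label distance, so
        # confusing a sample with a far-away label costs more.
        w = label_dist[i] * neg[i]
        denom = np.exp(sim[i][pos[i]]).sum() + (w * np.exp(sim[i])).sum()
        loss += -np.log(np.exp(sim[i][pos[i]]) / denom).mean()
        count += 1
    return loss / max(count, 1)
```

In a training loop this term would be added, with some weighting coefficient, to the usual regression loss (e.g. L1), so that the encoder is regularized towards feature geometry that agrees with the label geometry.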
BibTeX
@misc{keramati2023conr,
title={ConR: Contrastive Regularizer for Deep Imbalanced Regression},
author={Mahsa Keramati and Lili Meng and R. David Evans},
year={2023},
eprint={2309.06651},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
Related Research
-
NeuZip: Memory-Efficient Training and Inference with Dynamic Compression of Neural Networks
Y. Hao, Y. Cao, and L. Mou. Workshop at Conference on Neural Information Processing Systems (NeurIPS)
Publications
-
ClavaDDPM: Multi-relational Data Synthesis with Cluster-guided Diffusion Models
W. Pang, M. Shafieinejad, L. Liu, S. Hazlewood, and X. He. Conference on Neural Information Processing Systems (NeurIPS)
Publications
-
Bayesian Neural Networks
S. Prince.
Research