We investigate the unsupervised constituency parsing task, which organizes the words and phrases of a sentence into a hierarchical structure without using linguistically annotated data. We observe that existing unsupervised parsers capture different aspects of parse structures, which can be leveraged to enhance unsupervised parsing performance. To this end, we propose a notion of "tree averaging," based on which we further propose a novel ensemble method for unsupervised parsing. To improve inference efficiency, we distill the ensemble knowledge into a student model; this ensemble-then-distill process is an effective way to mitigate the over-smoothing problem that arises in common multi-teacher distillation methods. Experiments show that our method surpasses all previous approaches, demonstrating its effectiveness and robustness across different runs, with different ensemble components, and under domain-shift conditions.
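The abstract does not spell out how "tree averaging" is computed. One natural way to make the notion concrete, which we sketch here as an illustration (the function name and span-set representation are our own assumptions, not the paper's implementation), is to treat each parser's output as a set of constituent spans, vote over spans, and use a CYK-style dynamic program to find the binary tree whose spans receive the highest total vote:

```python
from collections import Counter

def average_tree(n, parses):
    """Hypothetical sketch of 'tree averaging' as span voting + CYK.

    n:       sentence length (number of words)
    parses:  list of parses, each a set of half-open spans (i, j)
             with 0 <= i < j <= n

    Returns the span set of the binary tree over n words whose
    constituents have the highest total vote count across parses.
    """
    votes = Counter(span for p in parses for span in p)
    best = {}   # (i, j) -> best achievable score for a subtree over i..j
    back = {}   # (i, j) -> split point of the best subtree
    for i in range(n):
        best[(i, i + 1)] = votes[(i, i + 1)]
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            # Pick the split point maximizing the children's scores.
            k, score = max(
                ((k, best[(i, k)] + best[(k, j)]) for k in range(i + 1, j)),
                key=lambda t: t[1],
            )
            best[(i, j)] = votes[(i, j)] + score
            back[(i, j)] = k
    # Walk back-pointers to recover the spans of the best tree.
    spans, stack = set(), [(0, n)]
    while stack:
        i, j = stack.pop()
        spans.add((i, j))
        if j - i > 1:
            k = back[(i, j)]
            stack.extend([(i, k), (k, j)])
    return spans
```

For example, if two of three component parsers bracket a three-word sentence as ((w0 w1) w2) and one as (w0 (w1 w2)), the majority structure wins the vote and is returned as the averaged tree.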
Related Research
- Unsupervised Event Outlier Detection in Continuous Time. S. Nath, K. Y. C. Lui, and S. Liu. Workshop at Conference on Neural Information Processing Systems (NeurIPS).
- LLM-TS Integrator: Integrating LLM for Enhanced Time Series Modeling. C. Chen, G. Oliveira, H. Sharifi, and T. Sylvain. Workshop at Conference on Neural Information Processing Systems (NeurIPS).
- Inference, Fast and Slow: Reinterpreting VAEs for OOD Detection. S. Huang, J. He, and K. Y. C. Lui. Workshop at Conference on Neural Information Processing Systems (NeurIPS).