ltgoslo/dual-language-models


Dual Language Models:
Balancing Training Efficiency and Overfitting Resilience


David Samuel and Lucas Georges Gabriel Charpentier

University of Oslo
Language Technology Group


Paper: https://arxiv.org/abs/2512.14549



Abstract

This paper combines autoregressive and masked-diffusion training objectives without any architectural modifications, resulting in flexible language models that outperform single-objective models. Autoregressive modeling has been a popular approach, partly because of its training efficiency; however, that comes at the cost of sensitivity to overfitting. On the other hand, masked-diffusion models are less efficient to train while being more resilient to overfitting. In this work, we demonstrate that dual-objective training achieves the best of both worlds. To derive the optimal ratio between both objectives, we train and evaluate 50 language models under varying levels of data repetition. We show that it is optimal to combine both objectives under all evaluated settings and that the optimal ratio is similar whether targeting autoregressive or masked-diffusion downstream performance.



This is the official repository for Dual Language Models.
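To make the dual-objective idea concrete, the sketch below shows one way an autoregressive (next-token) loss and a masked-diffusion-style (random masking) loss could share a single model, switching between the two objectives per training step according to a mixing ratio. This is a minimal illustration under assumptions: the toy model, the constants VOCAB_SIZE, MASK_TOKEN_ID, and AR_RATIO, and the per-step switching scheme are invented for this example and are not the training recipe from the paper.

# Illustrative sketch only: one shared model trained with both an autoregressive
# loss and a masked-diffusion-style loss, chosen per step by a mixing ratio.
# The toy model and the constants below are assumptions for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE = 1000     # assumed toy vocabulary size
MASK_TOKEN_ID = 999   # assumed id of the [MASK] token
AR_RATIO = 0.5        # assumed fraction of steps trained autoregressively

class ToyLM(nn.Module):
    """Stand-in for a transformer; any module mapping token ids -> logits works.
    (A real dual-objective model would use a causal attention mask for the
    autoregressive steps and bidirectional attention for the diffusion steps.)"""
    def __init__(self, vocab_size=VOCAB_SIZE, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, vocab_size)

    def forward(self, input_ids):
        return self.proj(self.embed(input_ids))  # (batch, seq_len, vocab)

def autoregressive_loss(model, input_ids):
    """Standard next-token prediction: predict token t+1 from tokens up to t."""
    logits = model(input_ids[:, :-1])
    targets = input_ids[:, 1:]
    return F.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))

def masked_diffusion_loss(model, input_ids):
    """Masked-diffusion-style objective: sample a masking rate per sequence,
    replace that fraction of tokens with [MASK], and predict the originals."""
    batch, seq_len = input_ids.shape
    mask_rate = torch.rand(batch, 1)                    # noise level per sequence
    is_masked = torch.rand(batch, seq_len) < mask_rate  # positions to corrupt
    corrupted = input_ids.masked_fill(is_masked, MASK_TOKEN_ID)
    logits = model(corrupted)
    per_token = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)), input_ids.reshape(-1), reduction="none"
    ).reshape(batch, seq_len)
    return per_token[is_masked].mean()                  # score only masked positions

def dual_objective_step(model, input_ids, ar_ratio=AR_RATIO):
    """Choose one objective per training step according to the mixing ratio."""
    if torch.rand(()) < ar_ratio:
        return autoregressive_loss(model, input_ids)
    return masked_diffusion_loss(model, input_ids)

if __name__ == "__main__":
    model = ToyLM()
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    batch = torch.randint(0, VOCAB_SIZE - 1, (8, 32))   # dummy token ids
    loss = dual_objective_step(model, batch)
    loss.backward()
    optimizer.step()
    print(f"loss: {loss.item():.3f}")

See the paper for the actual objective formulation and the empirically optimal mixing ratios under different levels of data repetition.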



Please cite the following publication:

@misc{samuel2025duallanguagemodelsbalancing,
      title={Dual Language Models: Balancing Training Efficiency and Overfitting Resilience}, 
      author={David Samuel and Lucas Georges Gabriel Charpentier},
      year={2025},
      eprint={2512.14549},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2512.14549}, 
}
