# masked-language-modeling

Here are 30 public repositories matching this topic...

[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"

  • Updated Feb 4, 2024
  • Python
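The prompt-based fine-tuning this paper builds on casts classification as a cloze task for a masked LM: the input is wrapped in a template containing a `[MASK]` slot, and label words score the classes. A minimal sketch of the idea, assuming a generic BERT checkpoint and illustrative template and verbalizer words (none of these choices are taken from the paper):

```python
# Cloze-style (prompt-based) classification with a masked LM.
# Model, template, and label words are illustrative assumptions.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Wrap the input in a template whose [MASK] position the model must fill;
# the label words ("great"/"terrible") act as the verbalizer.
text = "a gripping, beautifully shot film ."
prompt = f"{text} It was {tokenizer.mask_token}."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and compare only the verbalizer tokens there.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
label_ids = tokenizer.convert_tokens_to_ids(["great", "terrible"])
scores = logits[0, mask_pos, label_ids]
print("positive" if scores[0] > scores[1] else "negative")
```

Prompt-based fine-tuning then trains the model on this cloze objective instead of attaching a fresh classification head, which is why it stays close to the MLM pre-training task.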

An empirical study of Transformer adaptation techniques: pre-training from scratch with masked language modeling (MLM), classic fine-tuning, and from-scratch implementations of parameter-efficient fine-tuning (PEFT) methods (LoRA, Adapters), applied to both encoder (BERT) and decoder (OPT) models.

  • Updated Sep 4, 2025
  • Jupyter Notebook
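The MLM objective this repo implements from scratch follows the standard recipe: mask roughly 15% of tokens and train with cross-entropy over only the masked positions. A minimal sketch using Hugging Face's data collator, assuming an off-the-shelf tokenizer and checkpoint rather than the repo's own from-scratch code:

```python
# Masked language modeling loss with the standard ~15% masking recipe.
# Model and example texts are illustrative assumptions.
import torch
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

texts = ["the cat sat on the mat .", "masked language models predict tokens ."]
encodings = [tokenizer(t) for t in texts]

# The collator pads the batch, randomly corrupts ~15% of tokens (mostly
# with [MASK]), and sets labels to -100 everywhere except those positions.
batch = collator(encodings)
outputs = model(**batch)
print(float(outputs.loss))  # cross-entropy over masked positions only
```

When pre-training from scratch, the same collator is typically passed to the training loop so each batch is re-masked on the fly.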
