Using PyTorch 2 and `torch.compile` in the following way:

import torch
import transformers
from multilingual_clip import pt_multilingual_clip

text_model_name = "M-CLIP/XLM-Roberta-Large-Vit-B-32"
text_model = pt_multilingual_clip.MultilingualCLIP.from_pretrained(
    text_model_name
)
tokenizer = transformers.AutoTokenizer.from_pretrained(
    text_model_name
)
text_model = torch.compile(text_model)

I get the following warning:
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
After that, embedding text no longer works.
Any ideas for a fix?
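One of the two options the warning itself suggests can be sketched as follows. This is a workaround for the warning only, not a confirmed fix for the embedding failure; the key assumption is that the variable is set before `tokenizers` starts its thread pool, i.e. before `transformers` or the tokenizer is first used:

```python
import os

# Must run before `tokenizers` has used parallelism, so place it at the
# very top of the script, before importing transformers.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

# ... then import transformers and build the tokenizer/model as usual.
```

Setting it to "false" trades tokenizer parallelism for fork safety, which matters here because `torch.compile` can fork worker processes.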