Compatibility with torch.compile #32

@SeyedAlirezaFatemi

Description

I'm using PyTorch 2 and torch.compile as follows:

import torch
import transformers
from multilingual_clip import pt_multilingual_clip

text_model_name = "M-CLIP/XLM-Roberta-Large-Vit-B-32"
text_model = pt_multilingual_clip.MultilingualCLIP.from_pretrained(
    text_model_name
)
tokenizer = transformers.AutoTokenizer.from_pretrained(
    text_model_name
)
text_model = torch.compile(text_model)

I get the following warning:

huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
        - Avoid using `tokenizers` before the fork if possible
        - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)

and after that, embedding text no longer works.

Any ideas for a fix?
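One workaround, following the warning's own suggestion, is a sketch like the following: disable the Rust tokenizer's internal parallelism via the `TOKENIZERS_PARALLELISM` environment variable before any fork happens (e.g. the forks triggered by `torch.compile`'s worker processes). This silences the warning and avoids the thread-pool deadlock; whether it resolves the embedding failure in this setup is an assumption to verify.

```python
import os

# Disable tokenizers' internal parallelism before anything forks.
# Set this as early as possible, ideally before importing
# transformers/tokenizers or at least before first tokenizer use.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

# ... then load the model and tokenizer and compile as before:
# text_model = torch.compile(text_model)
```

Alternatively, the same variable can be set in the shell (`export TOKENIZERS_PARALLELISM=false`) so it applies before the Python process starts at all.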
