
OSError for huggingface model

I am trying to use a huggingface model (CamelBERT), but I am getting an error when loading the tokenizer: Code:

from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-ca")
model = AutoModelForMaskedLM.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-ca")

Error:

OSError: Can't load config for 'CAMeL-Lab/bert-base-arabic-camelbert-ca'. Make sure that:

- 'CAMeL-Lab/bert-base-arabic-camelbert-ca' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'CAMeL-Lab/bert-base-arabic-camelbert-ca' is the correct path to a directory containing a config.json file

I couldn't run the model because of this error.

TMN asked Apr 22 '26 05:04


1 Answer

The model_id from Hugging Face is valid and should work. What can cause a problem is a local folder named CAMeL-Lab/bert-base-arabic-camelbert-ca in your project directory. In that case, Hugging Face prioritizes it over the online version, tries to load it, and fails if it is an empty folder or an incompletely trained model.

If this is the problem in your case, avoid using the exact model_id as the output_dir in your training arguments: if you cancel training before the model is fully saved and do not manually delete the folder, it will cause exactly this error on the next load.

If this is not the problem, it might be a bug, and updating your transformers version as @dennlinger suggested is probably your best bet.
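To check whether a shadowing local folder is the cause, you can test for a directory matching the model id that lacks a config.json. This is only a diagnostic sketch (the helper name `shadows_hub_id` is my own, not part of transformers):

```python
from pathlib import Path

MODEL_ID = "CAMeL-Lab/bert-base-arabic-camelbert-ca"

def shadows_hub_id(model_id: str, base: str = ".") -> bool:
    """Return True if a local folder matching the model id exists but has
    no config.json -- i.e. an incomplete checkpoint that transformers
    would try to load instead of the Hub model, and then fail on."""
    local = Path(base) / model_id
    return local.is_dir() and not (local / "config.json").exists()

if shadows_hub_id(MODEL_ID):
    print(f"Local folder '{MODEL_ID}' shadows the Hub id; rename or delete it.")
```

If the check prints the warning, deleting or renaming that folder (or passing a different output_dir when training) should let `from_pretrained` fall back to downloading from the Hub.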

EliasK93 answered Apr 24 '26 17:04


