I am using the same app.py, but there seems to be an issue with this line:
llm = CTransformers(
    model='models/llama-2-7b.ggmlv3.q8_0.bin',
    model_type='llama',
    config={'max_new_tokens': 256, 'temperature': 0.01},
)
This gives the following error:
Repository Not Found for url: https://huggingface.co/api/models/models/llama-2-7b-chat.ggmlv3.q8_0.bin/revision/main.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
I have downloaded the model locally; how can I point CTransformers to it? @krishnaik06
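From the error URL it looks like, when no file exists at the relative path, CTransformers falls back to treating the `model` string as a Hugging Face repo id (note the 404 URL also shows the `-chat` filename, which doesn't match the path in the code). A minimal sketch of the check I tried, resolving the path to an absolute one first; the path and imports here are assumptions about my local setup:

```python
import os

# Hypothetical location of the downloaded weights; adjust to the real path.
model_path = os.path.abspath('models/llama-2-7b.ggmlv3.q8_0.bin')

if os.path.isfile(model_path):
    # Only load when the file really exists locally; otherwise the string
    # would be interpreted as a Hugging Face repo id and trigger the 404.
    from langchain.llms import CTransformers  # assumes langchain + ctransformers installed
    llm = CTransformers(
        model=model_path,
        model_type='llama',
        config={'max_new_tokens': 256, 'temperature': 0.01},
    )
else:
    print(f"Model file not found: {model_path} (cwd: {os.getcwd()})")
```
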