
Use Llama 2 7B with Python

I would like to use Llama 2 7B locally on my Windows 11 machine with Python. I have a conda environment set up with CUDA, PyTorch with CUDA support, and Python 3.10, so I am ready to go.

The files are already here locally, downloaded from Meta, in the folder llama-2-7b-chat:

  • checklist.chk
  • consolidated.00.pth
  • params.json

Now I would like to interact with the model, but I can only find code snippets that download the model from Hugging Face, which is not needed in my case.

Can someone provide me with a few lines of code to interact with the model via Python?

lutz asked Mar 16 '26 22:03


1 Answer

I know you mentioned that Hugging Face is unnecessary in your case, but to load and use the model it's much easier to go through their transformers library.

After you download the weights, you need to restructure the folder as follows (notice I moved three of the files under 7B):

├── 7B
│   ├── checklist.chk
│   ├── consolidated.00.pth
│   └── params.json
├── config.json
├── generation_config.json
├── LICENSE
├── tokenizer_checklist.chk
├── tokenizer.model
└── USE_POLICY.md
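If you prefer to script the restructuring instead of moving files by hand, here is a minimal sketch. The folder name llama-2-7b-chat matches the question; adjust the path to your setup:

```python
from pathlib import Path
import shutil

# Base folder as downloaded from Meta -- adjust to your actual path.
base = Path("llama-2-7b-chat")
target = base / "7B"
target.mkdir(parents=True, exist_ok=True)

# The per-model weight files go under 7B/; the tokenizer files
# (tokenizer.model, tokenizer_checklist.chk, ...) stay at the top level.
for name in ("checklist.chk", "consolidated.00.pth", "params.json"):
    src = base / name
    if src.exists():
        shutil.move(str(src), str(target / name))
```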

Next download the conversion script from here: https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/convert_llama_weights_to_hf.py

And finally run this script:

python convert_llama_weights_to_hf.py --input_dir llama-2-7b/ --model_size 7B --output_dir model

Once it's finished - you can import the model as follows:

from transformers import LlamaForCausalLM, LlamaTokenizer

# Point both at the --output_dir from the conversion step
tokenizer = LlamaTokenizer.from_pretrained("./model")
model = LlamaForCausalLM.from_pretrained("./model")

You can then learn more about prompting the model here: https://huggingface.co/docs/transformers/v4.31.0/en/model_doc/llama2#transformers.LlamaForCausalLM.forward.example
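Since you downloaded the chat variant, it tends to respond best when prompts are wrapped in Llama 2's [INST] instruction format. A rough sketch follows; the helper names, the sampling settings, and device_map="auto" (which requires the accelerate package) are illustrative choices, not part of the official API:

```python
def build_llama2_prompt(user_message, system_message=None):
    """Wrap a message in the [INST] format the Llama 2 chat models were trained on."""
    if system_message:
        return f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n{user_message} [/INST]"
    return f"<s>[INST] {user_message} [/INST]"

def generate_reply(model_dir, user_message, max_new_tokens=256):
    """Load the converted weights and generate one reply (sampling settings are illustrative)."""
    # Imported here so the prompt helper above works without transformers installed.
    from transformers import LlamaForCausalLM, LlamaTokenizer

    tokenizer = LlamaTokenizer.from_pretrained(model_dir)
    model = LlamaForCausalLM.from_pretrained(model_dir, device_map="auto")
    inputs = tokenizer(build_llama2_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    # Strip the prompt tokens so only the newly generated text is returned.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
```

For example, generate_reply("./model", "What is the capital of France?") would load the converted weights from ./model and return the model's answer as a string.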

KevinCoder answered Mar 19 '26 12:03

