While loading the tokenizer, I received this error:

ImportError: Using bitsandbytes 4-bit quantization requires the latest version of bitsandbytes: 
pip install -U bitsandbytes.

I am using a Jupyter notebook on a MacBook M2 Pro.

Below is the source code:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and bfloat16 compute
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
)

# Load the tokenizer and pad with the EOS token, padding on the right
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"

# Load the base model with the 4-bit quantization config
base_model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    quantization_config=quant_config,
    device_map="auto",
)

base_model.generation_config.pad_token_id = tokenizer.pad_token_id

Can someone help with this?

I updated bitsandbytes as instructed, but the error persists.
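
For reference, here is a minimal check of which versions the notebook kernel actually sees (assuming the standard torch, transformers, and bitsandbytes packages are importable in the same kernel):

import torch
import transformers
import bitsandbytes

# Print the package versions visible to this Jupyter kernel
print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
print("bitsandbytes:", bitsandbytes.__version__)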
