Commit b467577

convert.py : fix llama/llama2 conversion due to vocab_size=-1 (#5019)
PR #4818 (merged last week) reintroduced a config check for vocab_size that was addressed in PR #4258 (merged 2023-11-30). Without the fix, llama2 models can't be converted. The error is: `ValueError: The model's vocab size is set to -1 in params.json. Please update it manually. Maybe 32000?`
1 parent 3e945cc commit b467577

File tree

1 file changed: +1 −1

convert.py

+1 −1
```diff
@@ -348,7 +348,7 @@ def load_torch_params(model: LazyModel, config_path: Path) -> "Params":
         f_rope_freq_base = 1e6

     return Params(
-        n_vocab=config.get("vocab_size", model["tok_embeddings.weight"].shape[0]),
+        n_vocab=model["tok_embeddings.weight"].shape[0],
         n_embd=config["dim"],
         n_layer=config["n_layers"],
         n_ctx=n_ctx,
```
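The failure mode is worth spelling out: llama2's `params.json` ships `"vocab_size": -1` as a sentinel, and `dict.get()` only falls back to its default when the key is *missing*, so the `-1` slipped through and tripped the sanity check. A minimal, self-contained sketch (the `FakeTensor` class and config values here are stand-ins for illustration, not code from the repo):

```python
# params.json for llama2 contains a -1 sentinel rather than omitting the key.
config = {"dim": 4096, "n_layers": 32, "vocab_size": -1}

class FakeTensor:
    """Stand-in for the tok_embeddings weight, shaped (vocab_size, dim)."""
    shape = (32000, 4096)

model = {"tok_embeddings.weight": FakeTensor()}

# Old behavior: .get() falls back only when the key is absent,
# so the -1 sentinel is returned as-is.
old_n_vocab = config.get("vocab_size", model["tok_embeddings.weight"].shape[0])

# New behavior: read the vocab size straight from the embedding tensor,
# which is always the true row count.
new_n_vocab = model["tok_embeddings.weight"].shape[0]

print(old_n_vocab, new_n_vocab)  # -1 32000
```

Reading the tensor shape directly sidesteps the sentinel entirely, which is why the patch drops the config lookup rather than special-casing `-1`.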
