
lm_head weight of Llama3.2_3B_instruct model #172

Open
Watebear opened this issue Oct 7, 2024 · 1 comment

Comments


Watebear commented Oct 7, 2024

Hello, I find that there is no lm_head weight in the model checkpoints (.safetensors).
How does the model load the weight for the lm_head Linear layer?


DelinQu commented Oct 10, 2024

The config of Llama 3.2 sets tie_word_embeddings to True, which means that Llama 3.2 3B shares the weight matrix between the embeddings and lm_head. Please refer to discuss.huggingface.co and transformers for details.

{
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 128000,
  "eos_token_id": 128001,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 3072,
  "initializer_range": 0.02,
  "intermediate_size": 8192,
  "max_position_embeddings": 131072,
  "mlp_bias": false,
  "model_type": "llama",
  "num_attention_heads": 24,
  "num_hidden_layers": 28,
  "num_key_value_heads": 8,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": {
    "factor": 32.0,
    "high_freq_factor": 4.0,
    "low_freq_factor": 1.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3"
  },
  "rope_theta": 500000.0,
  "tie_word_embeddings": true,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.45.0.dev0",
  "use_cache": true,
  "vocab_size": 128256
}
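
You can check this yourself. Below is a minimal sketch (assuming torch and transformers are installed, and that "meta-llama/Llama-3.2-3B-Instruct" is the model id you are loading): because tie_word_embeddings is true, transformers ties lm_head.weight to the input embedding matrix after loading, so no separate lm_head tensor needs to exist in the .safetensors files.

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3B-Instruct",  # assumed model id
    torch_dtype=torch.bfloat16,
)

# With tied word embeddings, lm_head.weight is the very same tensor object
# as the token embedding weight, not a copy loaded from the checkpoint.
print(model.config.tie_word_embeddings)                          # True
print(model.lm_head.weight is model.model.embed_tokens.weight)   # expected: True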
