
How to finetune from a consolidated model? #699

Open
GongZhengLi opened this issue Apr 4, 2023 · 1 comment
Labels: question (Further information is requested)

Comments

@GongZhengLi

There are ways to reshard the trained model into an inference model, but how do I retrain the model from the consolidated model (like LLaMA)?

GongZhengLi added the question (Further information is requested) label on Apr 4, 2023
@zycalice
Contributor

You can convert the consolidated model offline into as many shards as you like using reshard_consolidated.py.
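
For illustration, here is a rough sketch of what such offline resharding does. This is not the actual reshard_consolidated.py: the checkpoint layout (weights under a "model" key), the output filenames, and the dim-0 chunking heuristic are all assumptions.

```python
# Rough sketch of offline resharding: split a consolidated checkpoint into
# N model-parallel shard files. NOT the real reshard_consolidated.py --
# the checkpoint layout ("model" key), the dim-0 chunking heuristic, and
# the output filenames are illustrative assumptions only.
import torch

def reshard_consolidated(consolidated_path: str, num_shards: int) -> None:
    ckpt = torch.load(consolidated_path, map_location="cpu")
    state = ckpt.get("model", ckpt)  # assumed: weights live under "model"

    shards = [{} for _ in range(num_shards)]
    for name, tensor in state.items():
        if tensor.dim() >= 2 and tensor.size(0) % num_shards == 0:
            # treat 2-D weights as partitioned along dim 0, one slice per rank
            for rank, piece in enumerate(torch.chunk(tensor, num_shards, dim=0)):
                shards[rank][name] = piece.clone()
        else:
            # biases, norms, etc. are replicated on every model-parallel rank
            for rank in range(num_shards):
                shards[rank][name] = tensor.clone()

    for rank, shard in enumerate(shards):
        torch.save({"model": shard}, f"reshard-model_part-{rank}.pt")

if __name__ == "__main__":
    reshard_consolidated("consolidated.pt", num_shards=2)  # hypothetical path
```

A real resharder also has to know, per parameter, which dimension (if any) is model-parallel, and carry optimizer state along if training is to resume exactly; this sketch handles neither.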
