I noticed that in quantizer.py there is an init_embed_ function that uses data to initialize the embedding weights. In a distributed training setup, each rank has its own data, which leads to a different initial embedding on every rank. Is this a problem?
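For concreteness, here is a minimal sketch of what I mean (the class and attribute names are hypothetical, not the repo's actual API), along with one possible fix of broadcasting rank 0's initialization so all ranks start from the same codebook:

```python
import torch
import torch.distributed as dist
import torch.nn as nn


class Quantizer(nn.Module):
    def __init__(self, num_codes: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(num_codes, dim)

    @torch.no_grad()
    def init_embed_(self, data: torch.Tensor):
        # Each rank only sees its own shard of `data`, so sampling codes
        # from it gives a different codebook per rank.
        idx = torch.randperm(data.size(0))[: self.embed.num_embeddings]
        self.embed.weight.copy_(data[idx])

        # Possible fix (assumption, not the repo's current behavior):
        # broadcast rank 0's codebook so every rank shares the same init.
        if dist.is_available() and dist.is_initialized():
            dist.broadcast(self.embed.weight, src=0)
```

Without a sync step like the broadcast above, each rank would start optimizing from a different codebook, so I'm wondering whether the current code handles this somewhere else or whether the divergence is considered harmless.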