
Inquiry about checkpoint of ORQA #173

Open
jej127 opened this issue Aug 9, 2023 · 0 comments
jej127 commented Aug 9, 2023

I'm reaching out to inquire about the pre-trained ICT models, specifically the weights of both the question encoder BERT_Q(q) and the block encoder BERT_B(b).

The repository https://github.com/google-research/language/tree/master/language/orqa points to the Cloud Storage path "gs://orqa-data/ict" as the source for these weights.
Upon checking, I found that "gs://orqa-data/ict" contains pre-trained weights only for the question encoder BERT_Q(q), together with a dense vector index of shape (13353718, 128).
To be precise, the BERT_Q(q) weights are located in "gs://orqa-data/ict/variables", and the dense vector index is in "gs://orqa-data/ict/encoded". However, I could not locate pre-trained weights for the block encoder BERT_B(b).
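For reference, the way I checked which encoders a checkpoint contains was by listing its variable names with `tf.train.list_variables`. The sketch below is self-contained: it builds a tiny stand-in checkpoint locally (the `bert_q`/`bert_b` names are hypothetical placeholders, not the actual ORQA variable scopes); for the real checkpoint you would point `list_variables` at the prefix under gs://orqa-data/ict/variables instead.

```python
import os
import tempfile

import tensorflow as tf

# Build a tiny stand-in checkpoint with two variables that mimic a
# question encoder ("q") and a block encoder ("b"). Note that TF2
# object-based checkpoints store keys as object-graph paths such as
# "q/.ATTRIBUTES/VARIABLE_VALUE", not the raw variable names.
ckpt_dir = tempfile.mkdtemp()
q_var = tf.Variable(tf.zeros([4, 4]), name="bert_q/dense/kernel")
b_var = tf.Variable(tf.zeros([4, 4]), name="bert_b/dense/kernel")
path = tf.train.Checkpoint(q=q_var, b=b_var).save(os.path.join(ckpt_dir, "ict"))

# List every entry stored in the checkpoint, then look for the scope
# of interest. Against the real ORQA checkpoint you would search the
# listed names for the block-encoder scope instead of "b/".
names = [name for name, shape in tf.train.list_variables(path)]
has_block_encoder = any("b/" in n for n in names)
print(names)
print("block encoder present:", has_block_encoder)
```

Running the same listing against the prefix in gs://orqa-data/ict/variables is what showed me only the question-encoder variables.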

Would it be possible for you to share the pre-trained weight of the block encoder BERT_B(b), if available?

I appreciate your assistance.

Thank you.
