Thank you for this excellent work. I have two questions:
- During testing and inference, the model can output a brief explanation when prompted to do so. Does the training data include the reasoning process and explanations, and does the model support reasoning? In the dataset I only saw question-answer pairs; there are no "explain" or "thought" fields.
- What about inference time? For the desktop-organization task, if the model is called to infer the next step for every operation, won't the process feel laggy? On my machine with an RTX 4090, each step takes roughly 3 to 4 seconds (measured with the timing sketch below). How long does each step take on your setup, and how can the per-step latency be reduced?
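For reference, this is a minimal sketch of how I timed a single "next step" generation, assuming a Hugging Face `transformers` causal LM; the model name and prompt are placeholders, not the repository's actual checkpoint or benchmark script:

```python
# Minimal per-step latency sketch (not the official benchmark).
# MODEL_NAME and the prompt below are placeholders; substitute the real checkpoint/prompt.
import time

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "your-org/your-gui-agent-model"  # hypothetical model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16,  # half precision, a common way to cut per-step latency
    device_map="cuda",
)

prompt = "Current screen state: ...\nNext action:"  # placeholder observation/prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Warm-up run so one-time CUDA initialization does not inflate the measurement.
model.generate(**inputs, max_new_tokens=8)

torch.cuda.synchronize()
start = time.perf_counter()
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

print(f"Per-step generation time: {elapsed:.2f}s")
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

On my side, reducing `max_new_tokens`, using half precision or quantization, or serving the model with a faster inference backend all shorten the per-step time somewhat, but 3 to 4 seconds per step is still my baseline, so I would appreciate your recommended setup.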
I'm looking forward to your reply.