Issues about integrated inference #177

Open
@mikecovlee

Description

Traceback

Traceback (most recent call last):
  File "/home/mikecovlee/work/multi-lora-fine-tune/mlora.py", line 175, in <module>
    inference(config, model, tokenizer)
  File "/home/mikecovlee/work/multi-lora-fine-tune/mlora.py", line 106, in inference
    input_data = mlora.MultiLoraBatchData(
TypeError: MultiLoraBatchData.__init__() got an unexpected keyword argument 'prompts_'

TODO

Improve inference functions. @mikecovlee
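The TypeError above means the inference path in mlora.py still passes a `prompts_` keyword, while the `MultiLoraBatchData` class no longer defines a field with that name. A minimal sketch of the failure mode and the fix, using a hypothetical stand-in dataclass (the field name `batch_tokens_` is an assumption for illustration; check the actual fields defined in the repository):

```python
from dataclasses import dataclass, fields
from typing import List

# Hypothetical stand-in for mlora.MultiLoraBatchData; the real class
# in multi-lora-fine-tune may define different fields.
@dataclass
class MultiLoraBatchData:
    batch_tokens_: List[List[int]]

# Passing a keyword the dataclass does not define raises the same kind
# of TypeError as in the traceback above.
try:
    MultiLoraBatchData(prompts_=["hello"])
except TypeError as err:
    print(err)

# Fix: inspect the fields the class actually defines and pass those.
print([f.name for f in fields(MultiLoraBatchData)])
batch = MultiLoraBatchData(batch_tokens_=[[1, 2, 3]])
```

In short, the inference function needs to be updated to construct `MultiLoraBatchData` with the keyword arguments the current class definition accepts.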

Metadata

Labels

bug (Something isn't working), enhancement (New feature or request)
