Python error with test.py #39

@lesept777

Description

Hi,
I'm trying to run the training and testing phases with a newer version of PyTorch. When running the test, I get this error message:

python test.py
Traceback (most recent call last):
  File "test.py", line 42, in <module>
    test(args)
  File "/data1/is156025/fa125436/N2D2/env/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "test.py", line 18, in test
    model = PedalNet.load_from_checkpoint(args.model)
  File "/data1/is156025/fa125436/N2D2/env/lib/python3.8/site-packages/lightning/pytorch/core/module.py", line 1537, in load_from_checkpoint
    loaded = _load_from_checkpoint(
  File "/data1/is156025/fa125436/N2D2/env/lib/python3.8/site-packages/lightning/pytorch/core/saving.py", line 91, in _load_from_checkpoint
    model = _load_state(cls, checkpoint, strict=strict, **kwargs)
  File "/data1/is156025/fa125436/N2D2/env/lib/python3.8/site-packages/lightning/pytorch/core/saving.py", line 144, in _load_state
    obj = cls(**_cls_kwargs)
TypeError: __init__() missing 1 required positional argument: 'hparams'

It's weird because I can see the hparams argument in the __init__() method:

class PedalNet(pl.LightningModule):
    def __init__(self, hparams):
        super(PedalNet, self).__init__()
        self.wavenet = WaveNet(
            num_channels=hparams["num_channels"],
            dilation_depth=hparams["dilation_depth"],
            num_repeat=hparams["num_repeat"],
            kernel_size=hparams["kernel_size"],
        )
#        self.hparams = hparams
        self.hparams.update(hparams)
        self.save_hyperparameters()

What is going wrong?
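For context on what the traceback shows: in recent Lightning versions, `load_from_checkpoint` rebuilds the module by calling `cls(**kwargs)` with whatever was stored under the checkpoint's `hyper_parameters` entry. If the checkpoint was written before `save_hyperparameters()` recorded a value for the positional `hparams` argument, that call fails with exactly this `TypeError`. One possible workaround (a sketch, not a confirmed fix for this repo) is to read the hyperparameters out of the checkpoint file yourself and pass them explicitly, since extra keyword arguments to `load_from_checkpoint` are forwarded to `__init__`. The checkpoint filename and hyperparameter values below are made up for illustration:

```python
import torch

# Simulate a checkpoint: a Lightning .ckpt is a torch-saved dict whose
# hyperparameters live under the "hyper_parameters" key.
hparams = {"num_channels": 4, "dilation_depth": 2, "num_repeat": 1, "kernel_size": 3}
torch.save({"state_dict": {}, "hyper_parameters": hparams}, "dummy.ckpt")

# Read the hyperparameters back out of the checkpoint file.
ckpt = torch.load("dummy.ckpt", map_location="cpu")
restored = ckpt.get("hyper_parameters", {})

# Then pass them explicitly, e.g.:
# model = PedalNet.load_from_checkpoint(args.model, hparams=restored)
```

If the checkpoint contains no `hyper_parameters` entry at all, `restored` comes back empty and the dict would have to be rebuilt by hand from the values used at training time.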
