role of 'param loss' in pre-train stage #12

@yjsong-alchera

Description

Hello. I love your great work.

I have a question.

When I tried to pre-train the 3DMM estimator, I found the 'param loss' (line 305 in model.py):

param_loss = 1e-3 * (torch.mean(codedict['shape'] ** 2) + 0.8 * torch.mean(codedict['exp'] ** 2))

But I couldn't understand the role of this loss term.

Could you explain more about it? (How does the param loss work?)

For the 3DMM parameters (shape, expression), how can we regularize these parameters when there is no GT (ground truth)? (I understand 'ldmk_loss' because we prepare landmark GT before training.)
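To make my question concrete, here is a minimal, runnable reproduction of the quoted line with dummy tensors. The batch size and coefficient dimensions are my own assumptions for illustration, not taken from model.py; as far as I can tell, the term is just an L2 penalty on the predicted coefficients, which needs no GT:

```python
import torch

# Hypothetical estimator output; dimensions are illustrative assumptions.
# 3DMM PCA coefficients are typically assumed to follow roughly N(0, 1),
# so penalizing their squared magnitude pulls predictions toward the
# statistical mean face / neutral expression.
codedict = {
    'shape': torch.randn(4, 100),  # identity (shape) coefficients
    'exp':   torch.randn(4, 50),   # expression coefficients
}

# The quoted line from model.py (line 305): an L2 regularizer, i.e. a
# weight-decay-style prior on the predicted codes rather than a
# supervised loss against ground truth.
param_loss = 1e-3 * (torch.mean(codedict['shape'] ** 2)
                     + 0.8 * torch.mean(codedict['exp'] ** 2))

print(param_loss)  # scalar tensor, always >= 0
```

If this reading is right, the landmark loss supplies the supervised signal, while param_loss only keeps the coefficients small so the fit stays plausible, but please correct me if I misunderstand.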

Please forgive me

Thank you.
