
About evaluation metric #18

@zyqz97

Description

Hello, I have a new question.
The metrics I obtain from the released checkpoint differ from those reported in the paper.
I ran the following scripts:

Get the refined images (resolution 448 x 256):

python -m src.main +experiment=dl3dv_mvsplat360 \
  wandb.name=dl3dv_480P_ctx5_tgt56 \
  mode=test \
  dataset/view_sampler=evaluation \
  dataset.roots=[datasets/dl3dv] \
  checkpointing.load=checkpoints/dl3dv_480p.ckpt
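
As a quick sanity check, the saved renders can be verified to be 448 x 256 with a few lines like the following (the "color" sub-folder name is only a guess at the output layout):

from pathlib import Path
from PIL import Image

# "color" is a guess at the sub-folder that holds the rendered images
out_dir = Path("outputs/test_scores/dl3dv_480P_ctx5_tgt56_download_ckpt/color")
for p in sorted(out_dir.glob("*.png"))[:5]:
    # PIL reports (width, height), so 448 x 256 renders give (448, 256)
    assert Image.open(p).size == (448, 256), f"unexpected size for {p}"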

Apply color adjustment:

python src/scripts/post_process.py --root_dir=outputs/test_scores/dl3dv_480P_ctx5_tgt56_download_ckpt
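
For context, my understanding is that this step performs a per-channel color match between each render and its ground-truth view. I am not sure of the exact implementation in post_process.py, so the snippet below is only a sketch of that kind of adjustment, with hypothetical file names:

import numpy as np
from PIL import Image

def match_colors(pred, gt):
    # Fit a per-channel affine map (scale + offset) from pred to gt by
    # least squares, then apply it to pred.
    out = pred.copy()
    for c in range(3):
        x = pred[..., c].ravel()
        A = np.stack([x, np.ones_like(x)], axis=1)
        (a, b), *_ = np.linalg.lstsq(A, gt[..., c].ravel(), rcond=None)
        out[..., c] = np.clip(a * pred[..., c] + b, 0.0, 1.0)
    return out

# hypothetical file names, for illustration only
pred = np.asarray(Image.open("color/0001.png").convert("RGB"), dtype=np.float32) / 255.0
gt = np.asarray(Image.open("gt/0001.png").convert("RGB"), dtype=np.float32) / 255.0
Image.fromarray((match_colors(pred, gt) * 255).astype(np.uint8)).save("color_pp/0001.png")

A per-channel least-squares affine fit is just the simplest variant; the actual script may use a different scheme.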

Compute the metrics:

python -m src.scripts.compute_dl3dv_metrics --use_pp

I also modified compute_dl3dv_metrics.py to read the output folder:

[screenshot of the modified code]
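
For anyone comparing numbers, a folder-based metric pass can be written roughly as below; this is only a sketch that assumes the sub-folder names and uses torchmetrics definitions, which may differ slightly from the script's own implementation:

from pathlib import Path
from torchvision.io import ImageReadMode, read_image
from torchmetrics.image import PeakSignalNoiseRatio, StructuralSimilarityIndexMeasure
from torchmetrics.image.lpip import LearnedPerceptualImagePatchSimilarity
from torchmetrics.image.fid import FrechetInceptionDistance

# assumed layout: ground truth and post-processed renders share file names
root = Path("outputs/test_scores/dl3dv_480P_ctx5_tgt56_download_ckpt")
gt_dir, pred_dir = root / "gt", root / "color_pp"   # sub-folder names are guesses

psnr = PeakSignalNoiseRatio(data_range=1.0)
ssim = StructuralSimilarityIndexMeasure(data_range=1.0)
lpips = LearnedPerceptualImagePatchSimilarity(net_type="vgg", normalize=True)
fid = FrechetInceptionDistance(normalize=True)

for gt_path in sorted(gt_dir.glob("*.png")):
    gt = read_image(str(gt_path), mode=ImageReadMode.RGB).float().unsqueeze(0) / 255.0
    pred = read_image(str(pred_dir / gt_path.name), mode=ImageReadMode.RGB).float().unsqueeze(0) / 255.0
    psnr.update(pred, gt)
    ssim.update(pred, gt)
    lpips.update(pred, gt)
    fid.update(gt, real=True)     # FID compares the full sets, not per-image pairs
    fid.update(pred, real=False)

print("PSNR ", psnr.compute().item())
print("SSIM ", ssim.compute().item())
print("LPIPS", lpips.compute().item())
print("FID  ", fid.compute().item())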

But I get the metrics below, which differ from those reported in the paper (FID: 16.43):

[screenshots of the metric results]

So, I’d like to ask if I might have missed some details when evaluating. Thanks!
