lambda_dist in tnt_eval.py is strange. #211

@Bin-Bean

Hello, your research is great.
However, I found something strange in tnt_eval.py and have a question.

```python
if not args.skip_training:
    common_args = " --test_iterations 30000 --depth_ratio 1.0 -r 2"

    for scene in tnt_360_scenes:
        source = args.TNT_data + "/" + scene
        print("python train.py -s " + source + " -m " + args.output_path + "/" + scene + common_args + ' --lambda_dist 100')
        os.system("python train.py -s " + source + " -m " + args.output_path + "/" + scene + common_args)

    for scene in tnt_large_scenes:
        source = args.TNT_data + "/" + scene
        print("python train.py -s " + source + " -m " + args.output_path + "/" + scene + common_args + ' --lambda_dist 10')
        os.system("python train.py -s " + source + " -m " + args.output_path + "/" + scene + common_args)
```

The `--lambda_dist` flag appears only in the printed command, not in the command actually executed by `os.system`, so it is never applied during training (see the sketch below for what I expected).
Is this intentional?
Was lambda_dist effectively 0 for the TnT dataset in your paper, as it is for Mip-NeRF 360?
I would appreciate your reply.
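
For reference, this is roughly what I expected the script to do, assuming the printed command reflects the intended behavior. This is only a sketch, not a proposed patch: it builds each command string once so the printed and executed commands cannot differ, using the same variables (`args`, `tnt_360_scenes`, `tnt_large_scenes`, `common_args`) as the snippet above.

```python
import os

# Sketch only: construct each training command once, print it, then run it,
# so --lambda_dist is actually passed to train.py.
# The lambda_dist values (100 for 360 scenes, 10 for large scenes) are taken
# from the print() calls in the current tnt_eval.py.
if not args.skip_training:
    common_args = " --test_iterations 30000 --depth_ratio 1.0 -r 2"

    for scene in tnt_360_scenes:
        source = args.TNT_data + "/" + scene
        cmd = ("python train.py -s " + source + " -m " + args.output_path
               + "/" + scene + common_args + " --lambda_dist 100")
        print(cmd)
        os.system(cmd)

    for scene in tnt_large_scenes:
        source = args.TNT_data + "/" + scene
        cmd = ("python train.py -s " + source + " -m " + args.output_path
               + "/" + scene + common_args + " --lambda_dist 10")
        print(cmd)
        os.system(cmd)
```

If instead lambda_dist is meant to be 0 for these scenes, then only the print() statements are misleading and the executed commands are already correct; please confirm which is the case.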
