Performance problem when fitting the FLAME model to a scan with nearly 1 million vertices #54

Description

@icewired-yy

Hello. First of all, thank you for your impressive work and beautiful code.

I'm using fit_scan.py to fit the FLAME model to my own scan. Unlike the test scan in the repository, which has about 40K vertices, my scan has almost 1M vertices (more precisely, 1,334,321). I found that the fitting process takes more than 3 hours (11,099 s).

I'm wondering whether this is caused by a mistake on my part, or whether the fitting process genuinely slows down as the number of vertices in the target scan increases. Is it possible to accelerate the process for scans with a large number of vertices?
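One workaround I've considered (not confirmed by the repository's docs, just an assumption about how the scan-to-model distance term scales) is to subsample the scan's vertices down to roughly the size of the tested scan before fitting. The helper below, `subsample_vertices`, is a hypothetical sketch, not part of fit_scan.py:

```python
import numpy as np

def subsample_vertices(vertices, target_count, seed=0):
    """Randomly subsample a scan's vertices so the per-iteration cost of
    the scan-to-model distance term stays bounded. Hypothetical helper,
    not part of fit_scan.py."""
    vertices = np.asarray(vertices)
    if len(vertices) <= target_count:
        # Already small enough; nothing to do.
        return vertices
    rng = np.random.default_rng(seed)
    # Sample without replacement so no vertex is counted twice.
    idx = rng.choice(len(vertices), size=target_count, replace=False)
    return vertices[idx]

# Example: reduce a ~1.3M-vertex scan to 40K points, matching the
# vertex count of the repository's tested scan.
scan = np.random.rand(1_334_321, 3)
small = subsample_vertices(scan, 40_000)
print(small.shape)  # (40000, 3)
```

Since the landmark term uses specific indices, this would only be safe for the dense scan-to-mesh term; a voxel-grid or curvature-aware downsampling (e.g. via Open3D) would preserve geometry better than uniform random sampling.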
