Standardize geometric data to increase robustness of polyskeleton. #38

@saeranv

Description

Currently, the straight skeleton generated by the polyskel function fails once the input geometry reaches a certain level of complexity. The majority of these failures seem to be related to how point-equivalence tolerances shift as the geometry changes size, which may interact poorly with the spatial-hashing tolerances that I use in the graph algorithms.

A possible solution is to standardize and center the coordinate data. An initial test of this approach resolved all of the failing tests I had. Since point equivalence is based on floating-point precision, a robust approach would be to normalize all coordinates between 0 and some fixed number, and then move the "origin" of the geometry to the (1, 1) coordinate. The choice of scaling target is then just a question of identifying the scale that solves the most geometries; based on initial tests, a number between 1 and 10 seems best. A sketch of this transform is shown below.
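As a rough illustration, here is a minimal sketch of the proposed standardization in Python. The function names (`standardize_points`, `destandardize_points`) and the default `scale` of 10 are hypothetical, not part of the polyskel API:

```python
def standardize_points(points, scale=10.0):
    """Normalize 2D points to fit in a scale x scale box anchored at (1, 1).

    Returns the transformed points plus the factor and offset needed to
    map skeleton results back to the original coordinate system.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    extent = max(max(xs) - min_x, max(ys) - min_y)
    factor = scale / extent if extent > 0 else 1.0

    standardized = [
        ((x - min_x) * factor + 1.0, (y - min_y) * factor + 1.0)
        for x, y in points
    ]
    return standardized, factor, (min_x, min_y)


def destandardize_points(points, factor, origin):
    """Invert standardize_points on the skeleton output."""
    min_x, min_y = origin
    return [
        ((x - 1.0) / factor + min_x, (y - 1.0) / factor + min_y)
        for x, y in points
    ]
```

The idea would be to run the skeleton on the standardized points and then map the resulting vertices back with the inverse transform, so the tolerance behavior stays consistent regardless of the original units or size of the geometry.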

Metadata
Labels

    enhancement (New feature or request)
