Description
Currently, the straight skeleton generated by the polyskel function fails once the input geometry reaches a certain level of complexity. A majority of these failures appear to be related to how point equivalence tolerances change as the geometry changes size, which may interact poorly with the spatial hashing tolerances used in the graph algorithms.
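As a minimal illustration of the scale sensitivity (not code from polyskel itself, and the `TOL` value is hypothetical): a fixed absolute tolerance for point equivalence means different things at different coordinate magnitudes, because the spacing between representable floats grows with magnitude.

```python
import math

TOL = 1e-9  # hypothetical absolute tolerance used for point equivalence


def points_equal(p, q, tol=TOL):
    """Treat two 2D points as equal when both coordinates are within tol."""
    return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol


# Near 1.0, the spacing between adjacent floats is far below TOL,
# so the tolerance is a meaningful equivalence threshold.
print(math.ulp(1.0))    # ~2.2e-16

# At large coordinate magnitudes, float spacing approaches or exceeds TOL,
# so points that "should" be equal after two different computations may
# no longer compare as equal.
print(math.ulp(1.0e8))  # ~1.5e-8, already larger than TOL
```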
A possible solution is to standardize and center the coordinate data. An initial test of this approach resolved all of the failing tests I had. Since point equivalence is based on floating-point precision, a robust approach would be to normalize all coordinates to the range between 0 and some fixed number, and then move the "origin" to the (1, 1) coordinate. Choosing the target scale then becomes a question of identifying which value solves the most geometries; based on initial tests, a number between 1 and 10 seems best.
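A rough sketch of that pre-processing step, assuming plain `(x, y)` tuples as input; the function names, the default `scale=10.0`, and the (1, 1) shift are illustrative choices, not the final implementation, and the target scale would be tuned against the failing cases.

```python
from typing import List, Tuple

Point = Tuple[float, float]


def normalize_polygon(points: List[Point], scale: float = 10.0) -> Tuple[List[Point], float, Point]:
    """Center and rescale a polygon so coordinates fall roughly in [1, 1 + scale].

    Returns the transformed points plus the factor and offset needed to map
    the resulting skeleton back to the original coordinate system.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    extent = max(max(xs) - min_x, max(ys) - min_y) or 1.0  # guard against zero extent
    factor = scale / extent
    # Shift the bounding-box corner to (1, 1) rather than (0, 0) so that no
    # coordinate sits exactly at zero, where relative precision behaves differently.
    transformed = [((x - min_x) * factor + 1.0, (y - min_y) * factor + 1.0)
                   for x, y in points]
    return transformed, factor, (min_x, min_y)


def denormalize_point(p: Point, factor: float, origin: Point) -> Point:
    """Invert the transform for a point of the computed skeleton."""
    return ((p[0] - 1.0) / factor + origin[0],
            (p[1] - 1.0) / factor + origin[1])
```

The idea is that the skeleton would be computed on the normalized polygon and its vertices mapped back with the inverse transform, so the caller still sees results in the original coordinate system.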