
[WIP] Convert all mesh data to OBJ #180

Open

wants to merge 5 commits into master

Conversation

adamkewley
Contributor

There's much more extensive 3rd-party tooling for OBJ files, and they are significantly easier to parse in a streaming fashion (i.e. read each vertex and immediately stuff it into an array of vertices).
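For illustration, a minimal streaming reader along the lines described above might look like this (a sketch, not OpenSim Creator's actual loader; it only handles the `v`/`vn`/`f` records discussed in this PR):

```python
# Sketch of a streaming OBJ parse: each record is appended to its
# output array as soon as its line is read, with no lookahead.
def parse_obj(lines):
    vertices, normals, faces = [], [], []
    for line in lines:
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blanks and comments
        if parts[0] == "v":
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "vn":
            normals.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            # keep only the vertex index of each "v/vt/vn" triplet (1-based)
            faces.append(tuple(int(p.split("/")[0]) for p in parts[1:]))
    return vertices, normals, faces
```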

Comparative disk usage etc. will be tested once I have a branch that ports this data.

@adamkewley
Contributor Author

Alright, this is now done.

I'll prototype merging it into opensim-creator/resources and see what the perf etc. impact is. Unfortunately, I renamed some files, added other models, etc., so some manual merging will be needed.

@adamkewley
Contributor Author

Note: a decision needs to be made on whether we should keep the original .vtp files in the repo, because people may have saved models that implicitly depend on the vtp files being present in the global geometry folder.

@aymanhab
Member

Thanks for getting this started @adamkewley 👍 Indeed, we need to keep all the vtp files we distribute as-is, for the multitude of distributed models contributed by others over the years OpenSim has been in circulation. We can refine/change the obj files to include textures or be high-res and use them in upcoming releases, but backward compatibility needs to be maintained.

@adamkewley
Contributor Author

I'll revert the vtp's in this PR so that the only changes are the OBJ additions and the osim file changes.

Benchmarks on my end indicate that the overall performance of OBJ parsing is actually similar to VTP parsing, which is very surprising. I'm guessing it's related to the files being a little bigger; the profiler indicates that a decent percentage of time is spent in getline.

@adamkewley
Contributor Author

Ok, everything seems to work in OSC (no missing files etc.), but there's a new problem: the vtk library isn't tracking the precision of the VTP data (the source arrays are Float32), which means that the OBJ files end up being much larger than they need to be. Here's an example:

VTP:

<?xml version="1.0"?>
<VTKFile type="PolyData" version="0.1" byte_order="LittleEndian" compressor="vtkZLibDataCompressor">
	<PolyData>
		<Piece NumberOfPoints="1914" NumberOfVerts="0" NumberOfLines="0" NumberOfStrips="0" NumberOfPolys="3824">
		<PointData Normals="Normals">
		<DataArray type="Float32" Name="Normals" NumberOfComponents="3" format="ascii">
			-0.652886 -0.435904 -0.619457
			-0.612236 -0.554563 -0.563584
			-0.765230 -0.464743 -0.445464
			-0.735151 -0.357964 -0.575686
			-0.430989 -0.504289 -0.748293
			-0.553159 -0.404916 -0.728051
			-0.837839 -0.510742 -0.192791
			-0.274672 -0.584935 -0.763155
			-0.423502 -0.650064 -0.630922
			-0.693818 -0.599999 -0.398269

OBJ:

vn -0.6528859734535217 -0.43590399622917175 -0.6194570064544678
vn -0.6122360229492188 -0.5545629858970642 -0.56358402967453
vn -0.7652300000190735 -0.4647429883480072 -0.44546398520469666
vn -0.7351509928703308 -0.35796400904655457 -0.575685977935791
vn -0.43098899722099304 -0.5042889714241028 -0.7482929825782776
vn -0.5531589984893799 -0.404915988445282 -0.7280510067939758
vn -0.8378390073776245 -0.5107420086860657 -0.19279100000858307
vn -0.2746720016002655 -0.5849350094795227 -0.7631549835205078

It wouldn't be a good idea to ship this, because a large amount of entropy has been injected into the OBJ files in the form of d64-to-string conversion noise, which ruins compressibility.
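To illustrate the noise (assuming the arrays really are Float32, as the VTP header above declares): widening a float32 to a double and printing it at full double precision produces the long strings above, whereas formatting at the data's actual precision round-trips to the identical float32 while staying short:

```python
import struct

def as_f32(x: float) -> float:
    """Round a Python float (f64) to the nearest float32, returned as f64."""
    return struct.unpack("<f", struct.pack("<f", x))[0]

short = "-0.652886"              # 6-digit value as stored in the VTP
widened = as_f32(float(short))   # what the library hands back as a double
noisy = repr(widened)            # long d64-precision string, as in the OBJ
recovered = f"{widened:.6g}"     # format at the data's real precision

# "recovered" re-parses to the identical float32, but is far shorter
assert as_f32(float(recovered)) == widened
assert len(recovered) < len(noisy)
```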

Further, many of the VTP files contain surface normals that are just the triangles' face normals. These also shouldn't be shipped, because rendering engines can compute them reliably and easily (just cross-product two triangle edges), which is much faster than storing+reading+parsing the data. The same may also be true of texture coordinates (e.g. "if none are provided, use this standard unwrapping algorithm"), but those algorithms are more complex than a couple of cross products.
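The cross-product computation mentioned above is only a few lines; a minimal sketch of what a renderer does instead of reading stored normals:

```python
import math

def face_normal(a, b, c):
    """Unit normal of triangle (a, b, c), from the cross product of two
    of its edges -- what a renderer computes when no normals are stored."""
    e1 = [b[i] - a[i] for i in range(3)]
    e2 = [c[i] - a[i] for i in range(3)]
    n = [e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0]]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]
```

For example, a triangle in the XY plane with counter-clockwise winding yields a normal along +Z.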

@adamkewley
Contributor Author

Ok, another holdup

The legacy mesh data also contains normals. Simbody hasn't historically supported loading normals, but is about to (with @aymanhab 's work: simbody/simbody#798). This means that right now is probably our last chance to drop normals from the legacy files (we know for a fact they're unused, and that the triangle-face method for computing them works fine via OpenSim GUI + Creator).

Removing them should decrease the size of the files by around 30-40 % (uncompressed; the compressed saving would be a slightly larger percentage). Because the majority of asset loading is spent reading the data and parsing the characters into floats, this will also decrease asset-loading times by around the same percentage. High-level benchmarks in OSC indicate that asset loading is around 40-60 % of loading an OpenSim model, so I would estimate that dropping these normals should make loading a model around 12-25 % faster - not bad.
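Back-of-envelope check of that estimate: the overall saving is the product of the asset-loading share and the per-asset saving, taken at both ends of each range:

```python
# If asset loading is 40-60 % of model-load time and dropping normals
# saves 30-40 % of asset-loading time, the overall saving is the
# product of the two fractions.
for asset_share in (0.40, 0.60):
    for normal_saving in (0.30, 0.40):
        print(f"{asset_share:.0%} x {normal_saving:.0%} -> "
              f"{asset_share * normal_saving:.0%} faster model load")
# spans roughly 12 % (0.40 * 0.30) to 24 % (0.60 * 0.40)
```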

I'll PR dropping the normals, which can be done before we start converting the legacy files. After that, I'll write a new VTP -> OBJ conversion script for this PR that directly extracts the vertices from the VTP file without first loading them into d64s (which is what's creating the noise).
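A hypothetical sketch of that token-copying approach (function name and structure are my own; it assumes an ascii-format VTP with a `Points` array, as in the excerpt above): the ascii number tokens are copied verbatim from the `<DataArray>` into OBJ `v` lines, so no float64 round-trip can inject extra digits.

```python
# Hypothetical converter sketch: copy the ascii coordinate tokens from a
# VTP <Points><DataArray> straight into OBJ "v" lines, never parsing
# them into doubles, so the original 6-digit precision is preserved.
import xml.etree.ElementTree as ET

def vtp_points_to_obj(vtp_path, obj_path):
    root = ET.parse(vtp_path).getroot()
    points = root.find(".//Points/DataArray")  # ascii format assumed
    tokens = points.text.split()
    with open(obj_path, "w") as out:
        for i in range(0, len(tokens), 3):
            out.write(f"v {tokens[i]} {tokens[i+1]} {tokens[i+2]}\n")
```

A real converter would also have to emit `f` lines from the connectivity/offsets arrays and handle binary/appended VTP encodings, which this sketch omits.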
