
voxelize_mesh.py consumes too much time and memory #7

Open
jewettaij opened this issue Dec 16, 2020 · 2 comments

Comments

@jewettaij
Owner

jewettaij commented Dec 16, 2020

The RAM required by this program is 25-100 times larger than the size of the original 3D image (tomogram) from which we extracted the surface mesh. This happens because I am relying on 3rd-party tools (pyvista, vtk) to handle the computation instead of writing a new program from scratch. The computation is also slow. Unfortunately, fixing this is not yet a priority for me. -A 2020-12-15
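
For context, a from-scratch voxelizer does not need to hold much more than the output array in memory. Below is a minimal, illustrative sketch (this is not the algorithm voxelize_mesh.py actually uses, and every function name here is made up): it fills the grid one (x, y) column at a time by intersecting a vertical ray with each triangle and applying the even/odd parity rule, so the only large array is the boolean result.

```python
import numpy as np

def _ray_z_hit(x, y, a, b, c):
    """Return the z where the vertical line through (x, y) pierces
    triangle (a, b, c), or None if it misses (2D barycentric test)."""
    d = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    if abs(d) < 1e-12:
        return None               # triangle is edge-on to the ray
    w0 = ((b[1] - c[1]) * (x - c[0]) + (c[0] - b[0]) * (y - c[1])) / d
    w1 = ((c[1] - a[1]) * (x - c[0]) + (a[0] - c[0]) * (y - c[1])) / d
    w2 = 1.0 - w0 - w1
    if w0 < 0 or w1 < 0 or w2 < 0:
        return None               # (x, y) falls outside the triangle
    return w0 * a[2] + w1 * b[2] + w2 * c[2]

def voxelize(verts, tris, n):
    """Voxelize a closed triangle mesh onto an n*n*n grid spanning its
    bounding box, one z-column at a time to keep memory low."""
    lo, hi = verts.min(axis=0), verts.max(axis=0)
    span = hi - lo
    # voxel-center coordinates along each axis
    xs = lo[0] + (np.arange(n) + 0.5) * span[0] / n
    ys = lo[1] + (np.arange(n) + 0.5) * span[1] / n
    zs = lo[2] + (np.arange(n) + 0.5) * span[2] / n
    tv = verts[tris]              # (T, 3, 3) triangle vertex coords
    vox = np.zeros((n, n, n), dtype=bool)
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            hits = sorted(z for z in (_ray_z_hit(x, y, *t) for t in tv)
                          if z is not None)
            # merge duplicate hits where the ray grazes a shared edge
            merged = []
            for z in hits:
                if not merged or z - merged[-1] > 1e-9:
                    merged.append(z)
            # fill voxels between alternating entry/exit depths
            for k in range(0, len(merged) - 1, 2):
                vox[i, j, (zs >= merged[k]) & (zs <= merged[k + 1])] = True
    return vox
```

The inner loop is O(columns x triangles), so this sketch would be slow in pure Python for real tomogram-sized meshes; the point is only that peak memory can stay close to the size of the output array.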

@jewettaij
Owner Author

jewettaij commented Jul 20, 2021

Here's a workaround to address the slow computation time:

The computation time can be reduced by using a mesh with a smaller number of polygons.
You can reduce the number of polygons in the mesh by:

  • opening the PLY file in meshlab,
  • using the "Filters"->"Remeshing, Simplification and Reconstruction"->"Quadric Edge Collapse Decimation" menu option,
  • reducing the number of polygons ("faces"), e.g. to fewer than 20000, and
  • exporting the mesh as a new PLY file.

This can leave the mesh with topological problems (such as holes or internal cavities), so it does not always work. To reduce the chance of these kinds of problems, you might want to smooth the mesh beforehand (for example, by opening the PLY file in meshlab and selecting the "Smoothing, Fairing and Deformation"->"HC Laplacian Smoothing" menu option).
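
For batch processing, the same "fewer polygons first" idea can also be scripted. The sketch below does not reimplement MeshLab's quadric edge collapse; it uses a much cruder technique, vertex clustering (snap vertices to a coarse grid, merge vertices sharing a cell, drop degenerate faces), with only numpy. The function name is my own invention, and quality will be noticeably worse than MeshLab's filter:

```python
import numpy as np

def cluster_decimate(verts, tris, cell):
    """Crudely reduce polygon count by vertex clustering: snap every
    vertex to a grid of spacing `cell`, merge vertices that land in
    the same grid cell, and drop faces that become degenerate."""
    keys = np.floor((verts - verts.min(axis=0)) / cell).astype(np.int64)
    # map each occupied grid cell to one representative (averaged) vertex
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    counts = np.bincount(inverse)
    new_verts = np.zeros((len(uniq), 3))
    for axis in range(3):
        new_verts[:, axis] = (np.bincount(inverse, weights=verts[:, axis])
                              / counts)
    new_tris = inverse[tris]
    # keep only faces whose three corners remain distinct
    ok = ((new_tris[:, 0] != new_tris[:, 1]) &
          (new_tris[:, 1] != new_tris[:, 2]) &
          (new_tris[:, 0] != new_tris[:, 2]))
    return new_verts, new_tris[ok]
```

Larger `cell` values merge vertices more aggressively and can introduce exactly the topological problems mentioned above, so the output should be inspected before voxelizing.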

Reducing the number of polygons in the mesh makes the "voxelize_mesh.py" program run faster, but it does not seem to reduce memory consumption. I still don't know how to solve that problem.

@jewettaij
Owner Author

jewettaij commented Aug 17, 2021

Workaround

As of 2021-8-16, computers with terabytes of RAM can be rented from Amazon EC2 for about $13 per hour.
(Learning how to use cloud services like EC2 is never a bad skill to have...)

I realize this isn't a very satisfying solution for most users.
