
interactivecomputergraphics / splashsurf

91 stars · 4 watchers · 16 forks · 21.18 MB

Surface reconstruction library and CLI for particle data from SPH simulations, written in Rust.

Home Page: https://splashsurf.physics-simulation.org/

License: MIT License

Rust 97.73% Dockerfile 0.21% Shell 2.06%
sph fluids sph-fluids smoothed-particle-hydrodynamics fluid-simulation fluid-dynamics surface-reconstruction particles surface-mesh rust

splashsurf's People

Contributors

dependabot[bot], rezural, w1th0utnam3, whiterabbit42k, yoshierahuang3456



splashsurf's Issues

meshing panics with indeterminate data

I'm seeing this panic in 0.6.1:

panicked at 'Trying to evaluate cell data with indeterminate data!', /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/marching_cubes.rs:45:17

when trying to mesh a fluid from salva; I previously had no issues, so I'm not sure what happened here. It would be great if this returned an error instead of panicking, so I could handle it and, for example, reuse an old mesh. I could also try to help figure out why this panic is being hit, but I'd need some pointers on where to start.
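As a caller-side stopgap until the library returns a `Result` here, the panic can be caught and turned into a recoverable error. A sketch; `reconstruct` below is a stand-in for the panicking library call, not the real API:

```rust
use std::panic;

// Stand-in for the library call that currently panics.
fn reconstruct() -> Vec<[f32; 3]> {
    panic!("Trying to evaluate cell data with indeterminate data!");
}

fn main() {
    // catch_unwind converts the panic into an Err, so the caller can
    // fall back to the previous frame's mesh instead of crashing.
    match panic::catch_unwind(|| reconstruct()) {
        Ok(vertices) => println!("reconstructed {} vertices", vertices.len()),
        Err(_) => println!("reconstruction panicked, reusing the old mesh"),
    }
}
```

Note that this only works when the crate is compiled with the default `panic = "unwind"` strategy, and the panic message still lands on stderr unless a panic hook suppresses it.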

optional vtkio dependency

What would be involved in making the vtkio dependency optional? I don't use any file formats for meshing; I directly pass in-memory points that salva generates, so the vtkio dependency pulls in unnecessary code.

Would you be open to a PR making this optional?
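For reference, Cargo supports exactly this pattern via an optional dependency behind a default-on feature. A sketch of what it could look like in splashsurf_lib's Cargo.toml (the feature name and version number are illustrative, not the crate's actual layout):

```toml
[features]
default = ["vtk_extras"]
# Gates all vtkio-based file IO behind a feature.
vtk_extras = ["vtkio"]

[dependencies]
vtkio = { version = "0.6", optional = true }
```

Downstream users who only pass in-memory points would then depend on the crate with `default-features = false`.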

Some kind of odd mesh corruption

Hello,

I was trying for some generative art using metaballs. The idea was to generate a 3D-grid/cube of points/atoms/particles/whatever, and carve shapes out of it like a kind of computational "clay".

I wrote this program:

use kiss3d::light::Light;
use kiss3d::ncollide3d::bounding_volume::AABB;
use kiss3d::ncollide3d::procedural::{IndexBuffer, TriMesh};
use kiss3d::window::Window;
use nalgebra::{Point3, Vector3};
use splashsurf_lib::{reconstruct_surface, Parameters, SurfaceReconstruction};

fn main() {

    let mut window = Window::new("Hello");

    window.set_light(Light::StickToCamera);

    let mut particles = Vec::new();

    for x in 0..20 {
        for y in 0..20 {
            for z in 0..20 {
                particles.push(
                    Point3::new(x as f32, z as f32, y as f32)
                );
            }
        }
    }

    let carve_shape = AABB::new(Point3::new(0.0, 1.0, 1.0),
                                Point3::new(20.0, 18.0, 18.0));

    particles.retain(|pt| {
        !carve_shape.contains_local_point(pt)
    });

    let vectorized = particles.iter().map(|pt| pt.to_owned().coords).collect::<Vec<Vector3<f32>>>();

    let surface : SurfaceReconstruction<usize,f32> = reconstruct_surface(&vectorized, &Parameters {
        particle_radius: 1.0,
        rest_density: 1.0,
        compact_support_radius: 2.0,
        cube_size: 0.2,
        iso_surface_threshold: 0.5,
        domain_aabb: None,
        enable_multi_threading: false,
        spatial_decomposition: None,
    }).unwrap();

    let trimesh = TriMesh::new(
        surface.mesh().vertices.iter().map(|v| Point3::new(v.x, v.y, v.z)).collect(),
        None,
        None,
        Some(IndexBuffer::Unified(surface.mesh().triangles.iter().map(|idx| Point3::new(idx[0] as u32, idx[1] as u32, idx[2] as u32)).collect()))
    );

    window.add_trimesh(trimesh, Vector3::new(1.0,1.0,1.0));

    while window.render() {

    }

}

However, something seems to have gone a teensy bit wrong with the mesh generation:

[screenshot of the corrupted mesh]

Here's the used crates and opt levels:

[dependencies]
kiss3d = "0.29.0"
nalgebra = "0.24.1"
splashsurf_lib = "0.6.1"
noise = "0.7.0"

[profile.dev]
opt-level = 1

vertices and indices out param support

So my use case is currently semi-realtime meshing of fluids from salva; for around 1k particles, I'm able to mesh roughly every 30-60 ms.

As part of my perf investigations, I'm wondering if we can reduce some of this overhead by providing vertices and indices as out parameters, perhaps something like:

splashsurf_lib::reconstruct_surface_with(&positions, &params, &mut vertices, &mut indices)

This should reduce some pressure on the allocator.
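Independent of the actual splashsurf_lib API, the underlying pattern is this (all names below are illustrative): clearing a `Vec` resets its length but keeps its allocated capacity, so once the buffers reach steady-state size, per-frame meshing stops allocating.

```rust
// Illustrative out-parameter pattern: the caller owns the buffers and the
// mesher clears and refills them, reusing their capacity across frames.
fn mesh_into(positions: &[[f32; 3]], vertices: &mut Vec<[f32; 3]>, indices: &mut Vec<u32>) {
    vertices.clear(); // length -> 0, allocated capacity is preserved
    indices.clear();
    // A real implementation would run the reconstruction here; this
    // copy-through only demonstrates the buffer reuse.
    for (i, p) in positions.iter().enumerate() {
        vertices.push(*p);
        indices.push(i as u32);
    }
}

fn main() {
    let positions = [[0.0f32, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]];
    let mut vertices = Vec::new();
    let mut indices = Vec::new();
    mesh_into(&positions, &mut vertices, &mut indices);
    let cap_after_first_frame = vertices.capacity();
    mesh_into(&positions, &mut vertices, &mut indices); // no new allocation
    assert_eq!(vertices.capacity(), cap_after_first_frame);
    println!("{} vertices, {} indices", vertices.len(), indices.len());
}
```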

crates release of splashsurf_lib to match salva 0.7

salva 0.7 uses nalgebra 0.29; it would be great if we had a matching crates.io release of splashsurf_lib so the nalgebra crate can be unified.

I see splashsurf_lib already uses 0.29 in the master Cargo.toml; would it be possible to get a 0.8 release with nalgebra 0.29 so the versions match salva?

Thanks!
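For reference, unification just means every crate in the tree resolves to a single nalgebra version; with a matching release the downstream dependency section would look something like this (the splashsurf_lib version is the hypothetical release being asked for; `salva3d` assumed as the 3D crate name):

```toml
[dependencies]
salva3d = "0.7"        # pulls in nalgebra 0.29
splashsurf_lib = "0.8" # hypothetical release, also on nalgebra 0.29
nalgebra = "0.29"      # resolves to one shared copy for all three
```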

How to specify the output file format of "Sequences of files"

Hi! I'm using splashsurf in my project, and it's great! 👍

In my case, I need the surface in .obj format instead of the default .vtk. I can specify the output format with -o filename.obj for a single file, but it doesn't work when I use "sequences of files"; here is my command (it still generates files in .vtk format):

splashsurf reconstruct -s "model_{}.ply" --output-dir=out -o "test_{}.obj" --mt-files=on --mt-particles=off --particle-radius=0.00187 --smoothing-length=1.2 --cube-size=0.5 --surface-threshold=0.6

For now, I wrote a for-loop in a .bat file; it works, but inefficiently. So, I'm wondering how to generate .obj directly?

About bgeo file

I used the partio library to output particles as a bgeo file. However, the following error occurred during surface reconstruction:

thread 'main' panicked at 'attempt to divide by zero', C:\Users\psdz\.cargo\registry\src\github.com-1ecc6299db9ec823\nom-7.1.2\src\multi\mod.rs:576:32

followed by the RUST_BACKTRACE hint. Is it because I didn't use a bgeo file output by SPlisHSPlasH, or is it something else?

RFC: only triangulate surface based on `TriangulationCriteria`, or triangulate_aabb

Hello,

I am using this for reconstruction of a volume of water, however I am only interested in the top surface of the volume, not the sides or the bottom. I thought passing domain_aabb in Parameters to reconstruct_surface might work, but alas this is not the right thing.

I see the internals of the marching cubes allows TriangulationCriteria. However this is not exposed.

It would be very nice to have either:

  • an optional parameter on the Parameters struct, i.e. triangulation_aabb: Option<AxisAlignedBoundingBox>, or
  • the TriangulationCriteria exposed by the crate, allowing an implementor to be passed in via the Parameters struct.

Having both would be nice for a) ease of usage of AABB based TriangulationCriteria, b) maximum flexibility going forward.

I'm willing to have a look at this, but any thoughts on changes to the public interface would be welcome.

Also, there might be some trickery regarding a global TriangulationCriteria and merging the results of subdomain TriangulationCriteria. AFAICT, TriangulationCriteria is currently used for stitching together subdomains generated in parallel?
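To make the proposal concrete, here is a self-contained sketch of what an exposed criterion plus an AABB-based implementor could look like (all names are hypothetical; the library's internal `TriangulationCriteria` may be shaped quite differently):

```rust
/// Hypothetical public trait: decide per marching-cubes cell whether
/// triangles should be emitted for it.
trait TriangulationCriterion {
    fn triangulate_cell(&self, cell_min: [f32; 3], cell_max: [f32; 3]) -> bool;
}

/// Only triangulate cells overlapping a user-supplied AABB, e.g. a thin
/// slab around the top surface of a volume of water.
struct AabbCriterion {
    min: [f32; 3],
    max: [f32; 3],
}

impl TriangulationCriterion for AabbCriterion {
    fn triangulate_cell(&self, cell_min: [f32; 3], cell_max: [f32; 3]) -> bool {
        // Standard per-axis AABB overlap test.
        (0..3).all(|i| cell_max[i] >= self.min[i] && cell_min[i] <= self.max[i])
    }
}

fn main() {
    // Slab around y in [9, 10]: keep the top surface, skip sides and bottom.
    let crit = AabbCriterion { min: [0.0, 9.0, 0.0], max: [20.0, 10.0, 20.0] };
    assert!(crit.triangulate_cell([5.0, 9.5, 5.0], [5.2, 9.7, 5.2]));
    assert!(!crit.triangulate_cell([5.0, 0.0, 5.0], [5.2, 0.2, 5.2]));
    println!("criterion works");
}
```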

panic during mesh reconstruction in neighborhood_search

I've got a new panic I'm seeing during mesh reconstruction, in the neighborhood_search file:

thread 'fluid_mesher' panicked at 'called `Option::unwrap()` on a `None` value', /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/neighborhood_search.rs:363:44
stack backtrace:
   0: rust_begin_unwind
             at /rustc/a178d0322ce20e33eac124758e837cbd80a6f633/library/std/src/panicking.rs:515:5
   1: core::panicking::panic_fmt
             at /rustc/a178d0322ce20e33eac124758e837cbd80a6f633/library/core/src/panicking.rs:92:14
   2: core::panicking::panic
             at /rustc/a178d0322ce20e33eac124758e837cbd80a6f633/library/core/src/panicking.rs:50:5
   3: core::option::Option<T>::unwrap
             at /rustc/a178d0322ce20e33eac124758e837cbd80a6f633/library/core/src/option.rs:388:21
   4: splashsurf_lib::neighborhood_search::sequential_generate_cell_to_particle_map
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/neighborhood_search.rs:363:20
   5: splashsurf_lib::neighborhood_search::sequential_search
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/neighborhood_search.rs:131:9
   6: splashsurf_lib::neighborhood_search::search_inplace
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/neighborhood_search.rs:62:9
   7: splashsurf_lib::reconstruction::compute_particle_densities_and_neighbors
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/reconstruction.rs:552:5
   8: splashsurf_lib::reconstruction::reconstruct_single_surface_append
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/reconstruction.rs:590:9
   9: splashsurf_lib::reconstruction::reconstruct_surface_global
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/reconstruction.rs:34:5
  10: splashsurf_lib::reconstruct_surface_inplace
             at /home/whiterabbit/.cargo/registry/src/github.com-1ecc6299db9ec823/splashsurf_lib-0.6.1/src/lib.rs:354:9

If you need more info, let me know; it seems to be somewhat random. This was in the context of me experimenting with rendering a faucet (so particles are deleted after some time).

perf improvements: generate_iso_surface_vertices and generate_sparse_density_map

Hello! As part of investigating potential perf improvements, I've collected some stats and identified two target areas that occupy the largest portion of the meshing budget:

Reconstructed 11790 vertices (indices=64464) from 1000 particles in 43.492334ms and pushed in 43.679657ms
reconstruct_surface: 100.00%, 43.49ms/call @ 22.99Hz
  compute minimum enclosing aabb: 0.01%, 0.01ms/call @ 22.99Hz
  neighborhood_search: 11.67%, 5.07ms/call @ 22.99Hz
    parallel_generate_cell_to_particle_map: 26.25%, 1.33ms/call @ 22.99Hz
    get_cell_neighborhoods_par: 5.06%, 0.26ms/call @ 22.99Hz
    calculate_particle_neighbors_par: 64.24%, 3.26ms/call @ 22.99Hz
  parallel_compute_particle_densities: 0.47%, 0.21ms/call @ 22.99Hz
  parallel_generate_sparse_density_map: 41.18%, 17.91ms/call @ 22.99Hz
  triangulate_density_map: 46.62%, 20.28ms/call @ 22.99Hz
    interpolate_points_to_cell_data: 91.94%, 18.64ms/call @ 22.99Hz
      generate_iso_surface_vertices: 84.61%, 15.77ms/call @ 22.99Hz
      relative_to_threshold_postprocessing: 15.36%, 2.86ms/call @ 22.99Hz
    triangulate: 8.04%, 1.63ms/call @ 22.99Hz

So meshing the 1k particles every frame takes 30-50 ms; ideally we can get this down close to 16 ms, so that we'd have only one frame of latency when generating meshes for a realtime sim at 60 fps.

As such, it looks like generate_iso_surface_vertices (15.7ms) and parallel_generate_sparse_density_map (17.9ms) are good candidates.

I don't know much about fluid simulations, so I'll defer to you on that, but I have done a lot of perf and optimization work; do you think there's anywhere to attack here, and if so, would you mind giving me a pointer so I can start taking a look? :)

I'm also wondering whether there are any data structures we don't have to recompute every frame, perhaps the density map? Or, similar to #4, perhaps we could reuse container structures to reduce allocation strain.

Thanks, and looking forward to your insights here :)

Error occurred: Unable to detect file format of particle output file

My vtk files were created from SPlisHSPlasH and they do contain the correct information. I am not sure why this error occurs, any help?

splashsurf convert --particles C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\vtk\ParticleData_Fluid_6.vtk -o  C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\Folder
[2022-10-06T17:15:22.804998-04:00][splashsurf][INFO] splashsurf v0.8.0 (splashsurf)
[2022-10-06T17:15:22.805624-04:00][splashsurf][INFO] Called with command line: splashsurf convert --particles C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\vtk\ParticleData_Fluid_6.vtk -o C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\Folder
[2022-10-06T17:15:22.806166-04:00][splashsurf::io][INFO] Reading particle dataset from "C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\vtk\ParticleData_Fluid_6.vtk"...
[2022-10-06T17:15:22.807119-04:00][splashsurf::io][INFO] Successfully read dataset with 10212 particle positions.
[2022-10-06T17:15:22.807204-04:00][splashsurf::io][INFO] Writing 10212 particles to "C:\Users\ghost\SPlisHSPlasH\bin\output\Obstacle\Folder"...
[2022-10-06T17:15:22.807546-04:00][splashsurf][ERROR] Error occurred: Unable to detect file format of particle output file (file name has to end with supported extension)

Trouble converting mesh from vtk to obj

Hi there,

I created a .vtk sequence with reconstruct and would like to convert it to .obj:

.\splashsurf.exe convert --domain-max="1000;1000;1000" --domain-min="0;0;0" --mesh out\ParticleData_Fluid_surface_151.vtk -o obj_con\Surface_151.obj

Reading mesh from "out\ParticleData_Fluid_surface_151.vtk"...
[2023-04-17T12:12:58.423520+02:00][splashsurf][ERROR] Error occurred: Failed to load surface mesh from file "out\ParticleData_Fluid_surface_151.vtk"
[2023-04-17T12:12:58.423824+02:00][splashsurf][ERROR] caused by: Expected only triangle cells. Invalid number of vertex indices (3) of cell 0

Could you please help?

Kind regards

normal support

Hello,

First of all: incredible library! I was searching for meshing solutions to use with salva3d particles and was directed here on the Discord. I'm absolutely thrilled to find the API is 99% of what I need; amazing work, thank you so much!

So I was wondering whether there is a plan to add normals to the output meshes? Or how difficult would it be; do you perhaps have an idea of the work involved?

Thanks!
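For a sense of the work involved: a common approach is to accumulate area-weighted face normals at each vertex and normalize at the end, which is cheap relative to the reconstruction itself. A minimal self-contained sketch of that technique (not splashsurf code):

```rust
/// Compute per-vertex normals by accumulating cross products (whose
/// length is proportional to triangle area) and normalizing at the end.
fn vertex_normals(vertices: &[[f32; 3]], triangles: &[[usize; 3]]) -> Vec<[f32; 3]> {
    let mut normals = vec![[0.0f32; 3]; vertices.len()];
    for tri in triangles {
        let (a, b, c) = (vertices[tri[0]], vertices[tri[1]], vertices[tri[2]]);
        let u = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
        let v = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
        // cross(u, v): area-weighted face normal (assumes CCW winding)
        let n = [
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0],
        ];
        for &i in tri {
            for k in 0..3 {
                normals[i][k] += n[k];
            }
        }
    }
    for n in &mut normals {
        let len = (n[0] * n[0] + n[1] * n[1] + n[2] * n[2]).sqrt();
        if len > 0.0 {
            for k in 0..3 {
                n[k] /= len;
            }
        }
    }
    normals
}

fn main() {
    // A single CCW triangle in the xy-plane: every vertex normal is +z.
    let verts = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]];
    let tris = [[0usize, 1, 2]];
    let normals = vertex_normals(&verts, &tris);
    assert_eq!(normals[0], [0.0, 0.0, 1.0]);
}
```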

Python wrapper

Hi,

I'm wondering whether there is any Python wrapper for this library?
Also, do you have any reference for the mathematical model you used for surface reconstruction (specifically the function for finding the surface level of the particles)?

VTK parsing error

I have particle data in XML VTU format. I've tried

  1. loading the file into ParaView and exporting it as legacy .vtk files,
  2. converting the file with ParaView's Python interface.

Both yield perfectly fine .vtk files that can be opened correctly by ParaView.
However, both produce parsing errors when I try to run them through splashsurf:

[2023-04-05T16:30:16.737039+02:00][splashsurf][ERROR]   caused by: Failed to load VTK file "test1.vtk"
[2023-04-05T16:30:16.737046+02:00][splashsurf][ERROR]   caused by: Parse error: Alt

XYZ parsing error

Hi,
I tried to load a point set in XYZ format, but splashsurf shows the wrong particle number; is there anything wrong?

The command I ran:

splashsurf reconstruct test.xyz --particle-radius 1.2 --smoothing-length 1.2 --cube-size 0.5

Here is test.xyz, containing just the coordinates. I am not sure if it is the right format.

0.0000 0.0000 0.0000
2.0250 2.0250 0.0000
2.0250 0.0000 2.0250
0.0000 2.0250 2.0250
0.0000 0.0000 4.0500
2.0250 2.0250 4.0500
2.0250 0.0000 6.0750
0.0000 2.0250 6.0750
0.0000 4.0500 0.0000
2.0250 6.0750 0.0000
2.0250 4.0500 2.0250
0.0000 6.0750 2.0250
0.0000 4.0500 4.0500
2.0250 6.0750 4.0500
2.0250 4.0500 6.0750
0.0000 6.0750 6.0750
4.0500 0.0000 0.0000
6.0750 2.0250 0.0000
6.0750 0.0000 2.0250
4.0500 2.0250 2.0250
4.0500 0.0000 4.0500
6.0750 2.0250 4.0500
6.0750 0.0000 6.0750
4.0500 2.0250 6.0750
4.0500 4.0500 0.0000
6.0750 6.0750 0.0000
6.0750 4.0500 2.0250
4.0500 6.0750 2.0250
4.0500 4.0500 4.0500
6.0750 6.0750 4.0500
6.0750 4.0500 6.0750
4.0500 6.0750 6.0750

The output says there are 58 particles, while the file contains 32.

[18:31:50.778][INFO] Using single precision (f32) for surface reconstruction.
[18:31:50.778][INFO] Reading particle dataset from "test.xyz"...
[18:31:50.778][INFO] Successfully read dataset with 58 particle positions.
[18:31:50.779][INFO] Minimal enclosing bounding box of particles was computed as: AxisAlignedBoundingBox { min: [-1.2000000, -1.2000000, -1.2000000], max: [1.2000105, 1.2000105, 1.2000105] }
[18:31:50.779][INFO] Splitting 58 particles into 4 chunks (with 16 particles each) for octree generation
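The numbers are consistent with splashsurf reading .xyz as a raw binary format rather than ASCII: 32 lines of 22 bytes each (20 characters plus CRLF) are 704 bytes, which parse as 58 complete little-endian f32 triplets (58 × 12 = 696 ≤ 704). If so, the file needs to be written in binary; a sketch (the expected layout, tightly packed f32 x/y/z triplets, is an assumption based on this arithmetic, so verify against the splashsurf documentation):

```rust
use std::io::Write;

/// Write particle positions as tightly packed little-endian f32 triplets,
/// the layout the .xyz reader appears to expect (an assumption).
fn write_binary_xyz<W: Write>(out: &mut W, particles: &[[f32; 3]]) -> std::io::Result<()> {
    for p in particles {
        for &coord in p {
            out.write_all(&coord.to_le_bytes())?;
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    let particles = vec![[0.0f32, 0.0, 0.0], [2.025, 2.025, 0.0]];
    let mut buf = Vec::new();
    write_binary_xyz(&mut buf, &particles)?;
    assert_eq!(buf.len(), particles.len() * 12); // 3 coords × 4 bytes each
    Ok(())
}
```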
