
openmaterial's People

Contributors

ludwigfriedmann, verakurz


openmaterial's Issues

References

Please add references in IEEE citation style for relevant publications to the readme. See example here

Include wavelength dependent Lambertian reflectivity

For lidar simulation, the Lambertian reflectivity can be used to calculate the intensity of the backscattered light. This is done, for example, in the models by Muckenhuber et al. or Rott et al.
Spectral reflectivity data is e.g. available from the ECOSTRESS spectral library.

Of course, a perfect diffuse Lambertian reflectivity is not a valid model for all materials, but it can be combined, e.g., with partial specular Fresnel reflection (depending on the IOR, which already exists in OpenMaterial).

I would store the Lambertian reflectivity as a wavelength-dependent parameter. This way, it can be used not only for lidar simulation but also for visible light.
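A minimal sketch of how such a wavelength-dependent Lambertian table could feed a lidar intensity model (all names, grid points, and reflectivity values below are illustrative assumptions, not part of OpenMaterial):

```python
import numpy as np

# Illustrative wavelength grid and Lambertian reflectivity values; real data
# would come from e.g. the ECOSTRESS spectral library.
wavelengths_nm = np.array([532.0, 905.0, 1550.0])
reflectivity = np.array([0.45, 0.60, 0.30])

def lambertian_backscatter(rho, incidence_angle_rad, distance_m):
    """Relative backscattered intensity of a perfectly diffuse surface:
    proportional to reflectivity * cos(theta) / R^2 (simplified, without
    atmospheric or aperture terms)."""
    return rho * np.cos(incidence_angle_rad) / distance_m**2

# Interpolate the reflectivity at a 905 nm lidar wavelength:
rho_905 = np.interp(905.0, wavelengths_nm, reflectivity)
intensity = lambertian_backscatter(rho_905, np.deg2rad(30.0), 50.0)
```

The same interpolation over the visible band would serve rendering, which is the point of storing the parameter per wavelength.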

Make error in pathtracer

Using Ubuntu 22 with CMake 3.22.1 I cannot build the pathtracer example. After executing the make command, I get the following error:

../external/doctest/doctest.h:4107:47: error: size of array ‘altStackMem’ is not an integral constant-expression

Is there any solution to this?

HDR

In Readme.md: Examples of HDR (high density range) images
Is it high dynamic range?

index of refraction of automotive materials at 77 GHz (radar)

Here is a list of the index of refraction (N = n + i k) of some automotive materials at 77 GHz (radar):

  • Iron (as example for metal)
  • Rubber
    • n = tbd
  • Glass
    • n = 2.51
    • k = 0.085
    • from [1] using Cole-Cole model, @77 GHz
    • conductivity = 1e-3 S/m
  • Plexiglass (lights, windows)
    • n = 1.60
    • k = 0.0059
    • from [1] using Cole-Cole model, @77 GHz
    • conductivity = 0 S/m
  • Polypropylene (as example for plastic)
    • n = 1.48
    • k = 0.0025
    • from [1] using Cole-Cole model, @77 GHz
    • conductivity = 0 S/m
  • Concrete with large gravel (as example for road surface)
    • n = 1.96
    • k = 0.077
    • from [1] using Cole-Cole model, @77 GHz
    • conductivity = 1.7e-3 S/m
  • Vegetation
  • Plasterboard

[1] Stanislav Stefanov Zhekov, Ondrej Franek, and Gert Frølund Pedersen; Dielectric Properties of Common Building Materials for Ultrawideband Propagation Studies; Digital Object Identifier 10.1109/MAP.2019.2955680
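As a quick sanity check on these numbers, the normal-incidence power reflectance follows directly from the complex index N = n + i k via the Fresnel formula R = |(N - 1)/(N + 1)|². A minimal sketch (the function name is mine, not an OpenMaterial API):

```python
def normal_incidence_reflectance(n, k):
    """Power reflectance at normal incidence from the complex index
    N = n + i k: R = |(N - 1) / (N + 1)|^2 (Fresnel)."""
    N = complex(n, k)
    return abs((N - 1) / (N + 1)) ** 2

# Glass at 77 GHz from the table above: n = 2.51, k = 0.085
r_glass = normal_incidence_reflectance(2.51, 0.085)  # ~0.186
```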

Remove literals

Within some of the extensions, literals are used as keys. They may be replaced by key-value pairs incorporating a keyword as key, e.g. "300" may become "temperature": 300.

Origin of the pedestrian structure not well defined

In the definition of the pedestrian structure, it says:

The position of the transform of the hip bone, which represents the root of the hierarchical bone structure, coincides with the reference coordinate frame (see below).

So the position of the hip is the origin of the reference coordinate frame.

However, it also says:

Origin (ORef): Geometric center of the bounding box of the undeflected model

This would mean that the hip is at the center of the bounding box of the undeflected model, which does not make sense to me. E.g., for a person with (relatively) short legs and a (relatively) long upper body, the hip would be below the center of the bounding box.

Could you clarify this @LudwigFriedmann ?

Add retroreflective material

Especially for lidar, retroreflective materials such as road signs and license plates are important.

I suggest either including a flag indicating whether a material is retroreflective for a certain wavelength (region), or, to be more precise, introducing a "retroreflectivity" value, because no material is perfectly retroreflective. If it were, road signs would appear dark to the human eye, so there is always a diffuse portion.
The problem is that I do not have a good idea of how to set this parameter physically.
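A hypothetical sketch of what such a parameter could look like in a material file (the key names, fractions, and wavelength range below are all assumptions for discussion, not an agreed schema):

```json
{
  "retroreflection": {
    "wavelength_range": [9.0e-7, 1.6e-6],
    "retroreflective_fraction": 0.8,
    "diffuse_fraction": 0.2
  }
}
```

Splitting the reflected energy into a retroreflective and a diffuse fraction would encode the observation above that road signs still scatter some light diffusely.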

Concept for integration of incident-angle dependent BRDFs/BSDFs

A generic concept for the integration of incident-angle dependent BRDFs (Bidirectional Reflectance Distribution Functions) / BSDFs (Bidirectional Scattering Distribution Functions) shall be introduced in OpenMATERIAL.

A global transparency Boolean distinguishes BRDF and BSDF.

Relevant inputs/dimensions for look-up tables, located in specific files, are (top to bottom):

  1. Temperature
  2. Polarization
  3. Wavelength
  4. Incident Angle (Theta, Phi)

Those dimensions shall point to entries of a 2D array of maps (dimensions: Theta/Phi scatter).
Maps shall comprise the following values:

  • Scatter probability
  • Phase shift

New extension OpenMaterial_brdf_bsdf_data required.
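A hypothetical sketch of the proposed look-up structure (all names, grid values, and the nearest-neighbour lookup are illustrative assumptions, not the final OpenMaterial_brdf_bsdf_data schema):

```python
import numpy as np

# Input axes, top to bottom as listed above:
temperatures = np.array([273.0, 300.0])      # K
polarizations = ["s", "p"]
wavelengths = np.array([905e-9, 1550e-9])    # m
theta_in = np.array([0.0, 30.0, 60.0])       # incident angle, deg

# For each input combination: a 2D map over scattering (theta, phi), each
# entry holding (scatter probability, phase shift).
# Shape: T x P x W x theta_in x theta_scatter x phi_scatter x 2
table = np.zeros((2, 2, 2, 3, 91, 360, 2))

def lookup(T, pol, wl, th, th_scatter, ph_scatter):
    """Nearest-neighbour lookup; real data would interpolate."""
    i = int(np.abs(temperatures - T).argmin())
    j = polarizations.index(pol)
    k = int(np.abs(wavelengths - wl).argmin())
    m = int(np.abs(theta_in - th).argmin())
    prob, phase = table[i, j, k, m, th_scatter, ph_scatter]
    return prob, phase
```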

anisotropic correlation length and update on surface roughness parameter documentation

Instead of the surface roughness parameter surface_correlation_length, use the two parameters surface_correlation_length_axis1 and surface_correlation_length_axis2, accounting for anisotropic roughness:

old
"surface_roughness": {
"surface_height_rms": 50,
"surface_correlation_length": 438
},

new
"surface_roughness": {
"surface_height_rms": 50e-6,
"surface_correlation_length_axis1": 438e-6,
"surface_correlation_length_axis2": 284e-6
},

Documentation for surface_roughness:
-- surface_height_rms: the root-mean-square value of the 3D surface height profile in [m]. If the 3D surface profile is not available, the mean value of the surface profiles along two orthogonal axes can be taken as a best guess.
-- surface_correlation_length_axis1: length in [m] at which the normalized auto-correlation function in axis1-direction equals 1/e (~37%)
-- surface_correlation_length_axis2: length in [m] at which the normalized auto-correlation function in axis2-direction equals 1/e (~37%)

In case the surface roughness is isotropic, identical values can be given for the *axis1 and *axis2 parameters (alternatively, surface_correlation_length_axis2 is set to NaN).
In order to apply anisotropy, the axis1 (unit) vector must be defined (in the object geometry) for each (anisotropic) polygon of the mesh. To align textures on a mesh, u- and v-vectors are defined in a geometry file. We propose to use these vectors to define axis1 and axis2.
In order to obtain the roughness for an incidence plane that contains neither the axis1- nor the axis2-vector, a linear combination of the *axis1 and *axis2 correlation lengths must be used.
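For illustration, one plausible reading of that "linear combination" is an elliptical cos²/sin² blend over the azimuth of the incidence plane (a sketch only; the spec may choose a different combination rule):

```python
import math

def correlation_length(phi_rad, l_axis1, l_axis2):
    """Effective correlation length for an incidence plane at azimuth phi,
    measured from the axis1 direction. Elliptical cos^2/sin^2 blend; this is
    an assumption, not the agreed OpenMaterial definition."""
    c, s = math.cos(phi_rad), math.sin(phi_rad)
    return l_axis1 * c * c + l_axis2 * s * s

# With the example values above (in metres):
l_45 = correlation_length(math.radians(45.0), 438e-6, 284e-6)  # ~361e-6, the midpoint
```

At phi = 0 and phi = 90 degrees this reduces to the pure axis1 and axis2 values, as required.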

Improvements on vehicle structure

This issue shall become a collection of inputs concerning improvements on the 3D vehicle structure definition. I will start with some ideas and provide further input along the discussion.

Proposed text (draft):

3D Model Structure
The structure of 3D models that may be used in simulation-based development for ADAS and automated driving shall be harmonized in a way that allows for easy interfacing by various applications.

Vehicles
Vehicles represent a large class of entities that may be included in a 3D representation of a virtual environment. They may be placed at any location and orientation, and they may consist of static and dynamic components. The following hierarchy shall facilitate the definition of vehicle models and their exchange among various applications.

Basic Vehicles
Vehicles include everything that has at least one wheel, may carry humans and/or goods, and is powered by an on-board engine, pushed, or designed to be dragged (i.e., a trailer) by another vehicle with an on-board engine. A human may be considered an on-board engine if power is exerted on the vehicle (e.g., a bicycle).

Combined Vehicles
Vehicles may come as standalone objects (e.g., a typical passenger car) or as combinations of objects (e.g., a truck or pick-up truck with a semitrailer, a passenger car or truck with a drawbar or turntable trailer, or a train with coaches). Therefore, it is necessary to describe potential connection points (hitch points) between vehicles so that they may be chained together (e.g., a tram consisting of multiple carriages, or the typical Australian "road train", a truck with more than one trailer).

Axles and Wheels
Vehicles may come as single-track (e.g., motorcycle), multi-track (e.g., passenger car), or mixed (e.g., trike) entities. The number of axles shall not be limited (e.g., for large flatbed trailers for extra-heavy loads). Each axle may bear multiple wheels, which are (potentially) steerable. An exception in terms of steering are turntable trailers, where the whole front axle turns around a center pivot point; they may be modeled by considering the turntable itself a single-axle, dual-track model with a (pseudo) semitrailer attached to the pivot point.
(Figure: Vehicle_Structure_Axles_and_Wheels)

Multiple bodies
Vehicles may consist of a single rigid body with attached wheels or of multiple bodies. A truck with a suspended driver cabin shall be considered a dual-body vehicle.

Merge emissive coefficient with complex IOR data

Temperature-related index of refraction values and emissive coefficients are currently provided by individual extensions. Those values may be merged, e.g. in an extension_ior_emissive_data.
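A hypothetical sketch of what such a merged extension could look like (the extension name and all field names below are assumptions, not an agreed schema):

```json
{
  "OpenMaterial_ior_emissive_data": {
    "data": [
      { "temperature": 300.0, "n": 1.48, "k": 0.0025, "emissive_coefficient": 0.92 }
    ]
  }
}
```

Keying both quantities by the same temperature entries would avoid the duplication the two current extensions introduce.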

naming convention

Hi Ludwig,

great work!
I noticed a minor typo: 'acoustic impedance' and 'Shear velocity' do not follow the usual convention of lowercase and underscores.

BR
Benedikt

Vehicle reference coordinate system

In model_structure, the reference coordinate system of a vehicle is placed on the front axle with x facing backwards.

  1. To my knowledge, the x-direction contradicts DIN ISO 8855, where it is specified as facing forward. Most of the defined sub-components also have the x-axis facing forward.
  2. In OSI the reference origin is typically located at the rear axle (see fig. 10). This coincides with the definition of the sensor coordinate frame here.

Why is the vehicle reference frame not also located at the rear axle with the x-axis facing forward? Wouldn't that simplify coordinate transformations?
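For illustration, converting between the two conventions amounts to a 180-degree rotation about z plus a shift by the wheelbase. A sketch under the stated assumptions (z up, right-handed frames; this is not an OpenMaterial or OSI API):

```python
def front_axle_to_rear_axle(p, wheelbase_m):
    """Convert a point from the model_structure vehicle frame (origin at the
    front axle, x pointing backwards) to an OSI-style frame (origin at the
    rear axle, x pointing forwards). Assumes both frames share the z-up axis,
    so the frames differ by a 180-degree rotation about z and a translation
    by the wheelbase (front-to-rear axle distance)."""
    x, y, z = p
    # Rotate 180 deg about z (x -> -x, y -> -y), then shift the origin.
    return (-x + wheelbase_m, -y, z)
```

With a single convention (rear axle, x forward), this transform would disappear entirely, which is the simplification being asked about.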

detection_wavelength_range for ultrasound

In the examples given, the detection_wavelength_range for typical_sensor: ultrasound is 100 to 15000. However, these values correspond to audible frequencies. Ultrasound devices operate at frequencies from 20 kHz up to several GHz. Hence, the detection_wavelength_range is 1.66e-2 to 1e-7 m.
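The conversion behind these numbers is simply wavelength = propagation speed / frequency (a sketch; the exact speed of sound depends on the medium and temperature, so the resulting bounds shift slightly):

```python
SPEED_OF_SOUND_AIR = 343.0  # m/s in air at ~20 degC (assumption)

def wavelength_m(frequency_hz, c=SPEED_OF_SOUND_AIR):
    """Wavelength in metres for a given frequency and propagation speed."""
    return c / frequency_hz

lam_low = wavelength_m(20e3)   # ~1.7e-2 m at the 20 kHz lower bound
lam_high = wavelength_m(1e9)   # ~3.4e-7 m at 1 GHz
```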
