
Light Probe interpolation using Tetrahedral Tesselations #16228

Open
bhouston opened this issue Apr 12, 2019 · 46 comments · May be fixed by #18371

@bhouston (Contributor)

Description of the problem

Now that we have a LightProbe class (#16191) and the spherical harmonics class (#16187) merged, as well as some shader support (#16152), we should explore adding a tetrahedral tessellation method for interpolating between light probes to the renderer, so that it can set the 4 SHs + weights per object.

The best reference is this presentation from Unity itself:

https://gdcvault.com/play/1015312/Light-Probe-Interpolation-Using-Tetrahedral

(Referenced from here: https://docs.unity3d.com/Manual/LightProbes-TechnicalInformation.html)

It seems that, given a set of light probes, you just do a 3D Delaunay tetrahedralization, use 3D barycentric coordinates for interpolation, and cache which tetrahedron each object is in to speed up lookups. Pretty simple: we just need a standard Delaunay algorithm implemented along with a searchable tetrahedral data structure.
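As a sketch of that interpolation step (a hypothetical standalone helper, not an existing three.js API), the barycentric weights of a point inside a tetrahedron can be computed from ratios of signed sub-tetrahedron volumes:

```javascript
// Signed volume of tetrahedron (a, b, c, d), points as [x, y, z] arrays:
// dot( d - a, cross( b - a, c - a ) ) / 6.
function tetVolume( a, b, c, d ) {
  const ab = [ b[ 0 ] - a[ 0 ], b[ 1 ] - a[ 1 ], b[ 2 ] - a[ 2 ] ];
  const ac = [ c[ 0 ] - a[ 0 ], c[ 1 ] - a[ 1 ], c[ 2 ] - a[ 2 ] ];
  const ad = [ d[ 0 ] - a[ 0 ], d[ 1 ] - a[ 1 ], d[ 2 ] - a[ 2 ] ];
  const cx = ab[ 1 ] * ac[ 2 ] - ab[ 2 ] * ac[ 1 ];
  const cy = ab[ 2 ] * ac[ 0 ] - ab[ 0 ] * ac[ 2 ];
  const cz = ab[ 0 ] * ac[ 1 ] - ab[ 1 ] * ac[ 0 ];
  return ( ad[ 0 ] * cx + ad[ 1 ] * cy + ad[ 2 ] * cz ) / 6;
}

// Barycentric weights of point p: each weight is the volume of the
// sub-tetrahedron formed by replacing one vertex with p, over the total
// volume. The four weights sum to 1 and serve as the per-probe weights.
function barycentricWeights( p, a, b, c, d ) {
  const v = tetVolume( a, b, c, d );
  return [
    tetVolume( p, b, c, d ) / v,
    tetVolume( a, p, c, d ) / v,
    tetVolume( a, b, p, d ) / v,
    tetVolume( a, b, c, p ) / v,
  ];
}
```

A point lies inside the tetrahedron exactly when all four weights are non-negative, which is also how a cached cell lookup can be validated before re-searching.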

/ping @WestLangley @donmccurdy @richardmonette

@donmccurdy (Collaborator)

Overlaps a bit with my comment in #16223 (comment), but should objects opt in/out of lookups in the light probe volume? For example, static terrain should not be affected. A material-level option, e.g. material.useLightProbes = true/false, would be sufficient.

@bhouston (Contributor, Author)

I think that first one would enable a ".dynamicGI" mode on WebGLRenderer; this builds your LightProbe tetrahedralization, etc., and keeps it maintained. And then you enable ".dynamicGI" on individual objects to have them opt into the lighting effects that result from this. I prefer it on an object basis rather than on a material basis; it just makes more sense that way. But I guess that may be hard given the Three.JS object model. If it is on a per-material basis, then yeah, it probably is .lightProbes = null or .lightProbes = [ a structure of an SH array and a weight array ]. Thus it would behave like .map = null or not, rather than having two separate variables controlling it.

@donmccurdy (Collaborator)

I prefer it on an object basis rather than on a material basis; it just makes more sense that way. But I guess that may be hard given the Three.JS object model.

Yes, I conceptually like this on an object basis but wasn't sure how that would work. Perhaps it won't be too difficult.

@donmccurdy (Collaborator) commented Apr 21, 2019

It seems that given a set of light probes you just do a 3D delaunay tetrahedralization ... Pretty simple, just need a standard delaunay algorithm implemented along with a searchable tetrahedral data structure.

This sounded simple enough, but after spending some time on every 3D Delaunay JS library I can find...

... I'm not satisfied with the results:

Libraries tried: delaunay-triangulate, incremental-delaunay, darkskyapp/delaunay.

(screenshot: wireframes of each library's tetrahedralization)

Note the near-intersecting lines crossing far corners of each volume, which I think indicate intersecting tetrahedra in the volume. For a simple cube 2x2x2 probe volume, the algorithms are generating 9-10 tetrahedra where 5-6 would be expected.

Referring back to Light probe interpolation using tetrahedral tessellations,

  • Bowyer-Watson seems to be the algorithm of choice
  • The method of finding the convex hull one dimension higher might be elegant and universal, but definitely not practical above 2D
  • Numerically robust implementations of the incircle and orientation tests available from [JShewchuk]
  • If you need a ready solution, [TetGen] by Hang Si is very decent and has some additional, potentially useful functionality, like tetrahedral mesh refinement

I cannot find a JS implementation of Bowyer-Watson. If TetGen, a C++ implementation, is any indication, it is complex – tetgen.cxx is 35,000 lines of code. I'm mildly curious about the feasibility of compiling TetGen to WASM, but would like to take a step back before putting time into something like that. In the interest of baby steps, here are some alternatives:

(a) Support a 2D Delaunay triangulation (not tetrahedralization) on the X/Z plane. For many uses this may be sufficient.
(b) Support only fixed grid layouts, similar to Blender's Irradiance Volumes.
(c) Use a simpler but more expensive interpolation method, and require users to limit the number of probes more closely.

Presumably this code will be in examples/js[m]/, rather than the core library, so while I'd like to have a universal solution, partial solutions may be a good starting point.

@donmccurdy (Collaborator)

Here is the relevant code:

dev...donmccurdy:feat-lightprobevolume

In particular, LightProbeVolume.build.

@donmccurdy (Collaborator)

A Comparison of Five Implementations of 3D Delaunay Tessellation, by Yuanxin Liu and Jack Snoeyink, 2005 (predates TetGen by a year).

@donmccurdy (Collaborator) commented Apr 21, 2019

Ok, I misinterpreted the visual debugging output I included. After configuring it to render individual tetrahedra instead of just the wireframe I can see what's going on:

(screenshot: individual tetrahedra rendered in the volume)

(only a few cells shown for clarity)

The interior tetrahedra are just fine. The coplanar points at the boundaries of the volume are creating very thin tetrahedra, which aren't helpful but can be worked around or removed.

This approach is looking feasible.

@bhouston (Contributor, Author)

Yes, the weird tetrahedra on the outside are, well, weird, but who cares. This is amazing. Are you going to pre-interpolate the SH via the CPU so that you can just pass in a single SH to each mesh for lighting? Holy crap, Three.JS is going GI. :)

@donmccurdy (Collaborator)

Are you going to pre-interpolate the SH via the CPU so that you can just pass in a single SH to each mesh for lighting?

I think so. We could pass all 4 probes from the current cell and interpolate in the vertex shader, but (1) there are other ways to handle larger meshes, and (2) I'm not confident that approach would be correct or continuous near cell boundaries.
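That CPU-side blend is a small amount of code. A minimal sketch, assuming each probe's SH is flattened to an array of 27 numbers (9 coefficients × RGB) and the function name is hypothetical:

```javascript
// Combine the SH coefficients of the four probes at the corners of the
// containing tetrahedron into one SH, using the barycentric weights.
// `shs` is an array of four flat coefficient arrays; `weights` sums to 1.
function blendSH( shs, weights ) {
  const out = new Array( shs[ 0 ].length ).fill( 0 );
  for ( let p = 0; p < shs.length; p ++ ) {
    for ( let i = 0; i < out.length; i ++ ) {
      out[ i ] += weights[ p ] * shs[ p ][ i ];
    }
  }
  return out;
}
```

Because SH evaluation is linear in the coefficients, blending coefficients first and evaluating once gives the same result as evaluating each probe and blending the results.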

@donmccurdy (Collaborator)

Several issues remain (in particular, a lot of flickering...) but this is starting to look functional:

(animated screen capture of the demo)

The API could use some consideration. I'm assuming that this code will live in examples/js[m] rather than src/, and that users' applications can decide when to update a particular mesh:

import { LightProbeVolume } from 'three/examples/jsm/probes/LightProbeVolume.js';

var volume = new LightProbeVolume()
  .addProbe( probe1 )
  .addProbe( probe2 )
  .addProbe( probe3 )
  .build();

scene.add( volume );

function render () {

  volume.update( mesh );

  renderer.render( scene, camera );

}

This has some weird facets:

  • Both LightProbe and LightProbeVolume extend Object3D. I'm not sure LightProbes should be (directly) added to the scene graph at all. But if not, and they remain in src/, what is their role there?
  • How does the interpolated SH value get passed to the renderer? mesh.diffuseProbe? mesh.diffuseSH? mesh.material.diffuseSH?
  • Do probe intensities affect interpolation? If a probe's intensity is 0, does that mean it has no weight during interpolation? Or that it has its normal weight, but its coefficients are considered 0? I'm not sure per-probe intensity and color make sense to me here.

@mrdoob mrdoob added this to the r105 milestone Apr 24, 2019
@bhouston (Contributor, Author)

Both LightProbe and LightProbeVolume extend Object3D. I'm not sure LightProbes should be (directly) added to the scene graph at all. But if not, and they remain in src/, what is their role there?

I think that LightProbe only exists in the scene graph during editing. At run-time it should generally not be there. It is a luxury convenience for designing only. Sort of like Geometry, it is a luxury you use when you do not care to be fast or scalable.

How does the interpolated SH value get passed to the renderer? mesh.diffuseProbe? mesh.diffuseSH? mesh.material.diffuseSH?

I think that diffuseProbe should exist on the mesh at least in the long term, but for now I think it isn't too important. Thus this is great.

Do probe intensities affect interpolation? If a probe's intensity is 0, does that mean it has no weight during interpolation? Or that it has its normal weight, but its coefficients are considered 0? I'm not sure per-probe intensity and color make sense to me here.

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it and I think it is unnecessary functionality. If anything, I would make intensity derived from the average intensity of the SH, and if you set it, it derives the current intensity and rescales the SH appropriately. Thus there is no intensity value outside of the SH values themselves. Otherwise it is an unnecessary degree of freedom that just complicates things.

@donmccurdy (Collaborator) commented Apr 24, 2019

I think that LightProbe only exists in the scene graph during editing. At run-time it should generally not be there.

I agree with the spirit of this, but functionally it implies...

  • LightProbe could be added directly to scene graph (extends Object3D?)
  • LightProbe could be added to LightProbeVolume (has position and SH, nothing else)
  • LightProbe could be added to a mesh, e.g. mesh.diffuseProbe

... is that too much / overloaded? Does editing necessarily require that the probe be in the scene graph? To preview the effect of a change the LightProbeVolume needs to be rebuilt anyway, so maybe having a LightProbeVolume and LightProbeVolumeHelper might be sufficient for editing purposes.

I think that diffuseProbe should exist on the mesh at least in the long term...

Ok, I'm fine with that. It feels a little unnatural that the result of interpolating among probes should be another probe (I guess I expect the output to be an SH?) but I don't feel strongly about this.

Do probe intensities affect interpolation?

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it and I think it is unnecessary functionality.

@WestLangley @mrdoob would you be OK with removing intensity and perhaps color from the LightProbe class? I think we've agreed that probes do not need to extend Light, and these properties complicate things a bit.

@mrdoob (Owner) commented Apr 24, 2019

@WestLangley @mrdoob would you be OK with removing intensity and perhaps color from the LightProbe class?

We already removed color. I'd be okay with removing intensity too.

@richardmonette (Contributor)

I'd be okay with removing intensity too.

👍

I think we've agreed that probes do not need to extend Light

👍

LightProbe could be added to a mesh, e.g. mesh.diffuseSH

👍 Agreed, I had envisioned this being set along the lines of https://github.com/mrdoob/three.js/pull/16270/files#diff-c0e88b98497597a015ecf238e91ac3a0R1442

To be clear, per mesh/object we would store a set of pre-interpolated SH coefficients, at least imho 😄

LightProbe could be added to LightProbeVolume (has position and SH, nothing else)

👍

users' applications can decide when to update a particular mesh

Would it be possible for three.js to implement this logic internally?

My sense was that the update logic is the same for most (all?) applications: if an object/mesh moves, it needs an update; if the light probes move, then all objects/meshes may need an update; and maybe there is an update queue, so each frame only some objects get updated, to not take too much time per frame. If this is the same in most cases, I would suggest it would be valuable if this feature came 'out of the box', so to speak.
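The update-queue idea could look something like this sketch (hypothetical class; volume.update( mesh ) as in the API snippet earlier in the thread):

```javascript
// Amortized probe updates: re-resolve at most `budget` meshes per frame
// against the probe volume, instead of updating everything every frame.
class ProbeUpdateQueue {
  constructor( volume, budget = 1 ) {
    this.volume = volume;
    this.budget = budget;
    this.queue = [];
  }
  // Mark a mesh as needing re-interpolation (e.g. because it moved).
  enqueue( mesh ) {
    if ( ! this.queue.includes( mesh ) ) this.queue.push( mesh );
  }
  // Call once per frame: processes up to `budget` pending meshes.
  flush() {
    for ( let i = 0; i < this.budget && this.queue.length > 0; i ++ ) {
      this.volume.update( this.queue.shift() );
    }
  }
}
```

When the probes themselves move, every mesh would simply be re-enqueued, and the budget spreads the cost over subsequent frames.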

@bhouston (Contributor, Author)

Would it be possible for three.js to implement this logic internally?

Yes it would be, but if this is in /examples/js, we should probably just do it manually for now until it moves into core.

@donmccurdy (Collaborator) commented Apr 25, 2019

I suspect it's going to take a while to have really good workflows for designing and precomputing these probe volumes. Until we get there, having to manually call volume.update( mesh ) is a comparatively minor glitch in the developer experience. And to be honest, I'm not sure I know how to do this automatically and efficiently yet; you really want at least some clear distinction between static and dynamic objects, which doesn't exist today. For now it's nice to have the flexibility of working outside the renderer.

I had envisioned this being set along the lines of https://github.com/mrdoob/three.js/pull/16270/files#diff-c0e88b98497597a015ecf238e91ac3a0R1442

Yeah this looks like the right approach. 👍

@WestLangley (Collaborator)

I'm not sure LightProbes should be (directly) added to the scene graph at all.

I wanted a representation for ambient light that is more flexible than the single-color AmbientLight. LightProbe serves that purpose.

In the current implementation, LightProbe can be added to the scene graph as a source of indirect light. It extends Light, but that can be changed when all this settles down.

We currently have the following classes in the library:

LightProbe
AmbientLightProbe	// produces the same illumination as AmbientLight
HemisphereLightProbe	// produces the same illumination as HemisphereLight

And for convenience, LightProbeHelper.

Usage:

lightProbe = new THREE.LightProbe( sh, intensity );

lightProbe = new THREE.AmbientLightProbe( 'lightskyblue', intensity );

lightProbe = new THREE.HemisphereLightProbe( 'lightskyblue', 'saddlebrown', intensity );

lightProbe = THREE.LightProbeGenerator.fromCubeTexture( cubeTexture );

In addition, LightProbe can also act as a "probe", representing the illuminance at a particular location.

How does the interpolated SH value get passed to the renderer?

Add a light probe to the scene and update it on an as-needed-basis.

If there are many objects in your scene, then for now, you would have to use Mesh.onBeforeRender() to update the LightProbe.

Do probe intensities affect interpolation?

If you are using a LightProbe as a "probe", and not a source of light, then you can bake the intensity into the SH values if you want.

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it and I think it is unnecessary functionality.

The short answer is we are building a physical model, and units are important -- especially when renderer.physicallyCorrectLights = true;.

Consider this simple shader code:

illuminance = AmbientLightColor * light.intensity;

On the left is a physical quantity that has units of lux. On the right is a Color, which is unit-less. That means intensity in this case has units of lux. All lights have units. SH and Color are unit-less.

If a LightProbe is a source of light, it must have an intensity property. Intensity is where the units come from.

LightProbe could be added to a mesh, e.g. mesh.diffuseProbe

We do not have consensus on the work-flow yet. For now, you would have to use mesh.onBeforeRender() to update the scene LightProbe parameters.

It feels a little unnatural that the result of interpolating among probes should be another probe

The interpolation work-flow can be implemented any way you want.

@WestLangley would you be OK with removing intensity and perhaps color from the LightProbe class?

No. intensity cannot be removed if the LightProbe can be used as a source of light. However, color can be removed. That is a design decision that we can address later.

Finally, DiffuseProbe is not correct terminology, IMO. I would call it IlluminanceProbe. But that is not really necessary as LightProbe seems sufficient. The other kind of probe would be ReflectionProbe.

@donmccurdy (Collaborator) commented Apr 25, 2019

I wanted a representation for ambient light that is more flexible than the single-color AmbientLight. LightProbe serves that purpose.
...
In the current implementation, LightProbe can be added to the scene graph as a source of indirect light.

We shouldn't be thinking of LightProbes as sources of light, in my opinion. Rather, they are samples – omni-directional information about light passing through a point of empty space, which happens to be encoded as SH coefficients. Proximity to a LightProbe does not necessarily mean that the probe has any effect at all on a mesh; it only takes one intervening probe to entirely eliminate another probe's influence. Or at least, that should be the case in the end.

The case could be made that AmbientLight and HemisphereLight were never actually "sources" of light in the same sense that our punctual lights were. They're very simple models of global indirect illumination. That they're named as lights is probably making this all more complicated. 😓

Could you explain why the LightProbe API should be designed to resemble AmbientLight/HemisphereLight, and not the other way around?

We do not have consensus on the work-flow yet. For now, you would have to use mesh.onBeforeRender() to update the scene LightProbe parameters.

Ok. That's fine for now, although at some later point it presents issues: given a scene with 100 objects, the onBeforeRender function must be invoked 100 times per frame regardless of how many objects are currently moving.

Finally, DiffuseProbe is not correct terminology, IMO. I would call it IlluminanceProbe. But that is not really necessary as LightProbe seems sufficient. The other kind of probe would be ReflectionProbe.

I'm not attached to the term DiffuseProbe, any of these are fine. 👍 LightProbe and ReflectionProbe sound easily distinguishable to me.

@donmccurdy (Collaborator) commented Apr 25, 2019

I'm warming up to the idea of intensity values on light probes, with a qualification – what if they were applied to the SH values at edit time? For example, if I create a layout containing 10 probes and set different intensities on each, the baking process (whether that's fromCubeTexture or something else) should use that intensity as a multiplier when writing the SH values. For runtime evaluation, the intensity can then be ignored.

Blender 2.8 gives reflection probes and irradiance probes an intensity property that is used exactly this way – modifying intensity after baking has no effect.

For example:

var probe = new LightProbe( /* intensity */ 1.0 );

LightProbeGenerator.applyCubeTexture( probe, cubeTexture );
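Concretely, "applied at edit time" would just mean scaling the coefficients once during baking, after which runtime can ignore intensity entirely (hypothetical helper, flat coefficient array as elsewhere in this thread):

```javascript
// Bake a probe's intensity into its SH coefficients. After this, the
// intensity property is purely informational and runtime evaluation
// uses the coefficients as-is, matching Blender's irradiance probes.
function bakeIntensity( shCoefficients, intensity ) {
  return shCoefficients.map( ( c ) => c * intensity );
}
```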

@bhouston (Contributor, Author)

I mostly agree with everything WestLangley said except these points....

Do probe intensities affect interpolation?

If you are using a LightProbe as a "probe", and not a source of light, then your can bake the intensity into the SH values if you want.

I would argue that LightProbes are never a source of light, but are always measures of light passing through, whether they sit at the vertices of the tetrahedral mesh or have been interpolated to a specific point in space. Just like UVs are UVs whether defined on vertices or interpolated to a specific pixel of a fragment shader -- they are always UVs. Even when using AmbientLightProbe and HemisphereLightProbe, these are not sources of light; rather, they are just statements of light passing through.

On the left is a physical quantity that has units of lux. On the right is a Color, which is unit-less. That means intensity in this case has units of lux. All lights have units. SH and Color are unit-less.

I think we should have SH in lux when used as a light probe representation. So SH is not required to be in lux, but when it is used in this situation it is in lux. Just like a Vector3 is unitless but can have units when used in a specific context. SH, when used as a light probe, is nothing more than a spherical representation of intensity for each of the RGB channels in my mind.

I really want to have fewer variables in a diffuse light probe if we can represent everything with just an SH in a single natural representation.

Remember that we generally keep color and intensity separate in Three.JS for HDR color values (emissive, lights), mostly because it is a UI problem to specify color correctly in an HDR fashion. HDR color pickers are not really a thing because we have LDR UIs. I think we can again represent color and intensity separately in the UI for ambient light probes, so it is usable on LDR displays, but I would derive them from the SH when needed, and when they are set, calculate the SH from them. Thus they are derived values rather than distinct values.

If a LightProbe is a source of light, it must have an intensity property. Intensity is where the units come from.

I think that it is never a source of light. When there is an ambient light probe it is again just a measure of light passing through -- a constant value and usually a manually set value, but it is still light passing through. I think that we should not think of them as light sources.

@richardmonette (Contributor)

We shouldn't be thinking of LightProbes as sources of light, in my opinion. Rather, they are samples – omni-directional information about light passing through a point of empty space, which happens to be encoded as SH coefficients.

👍

Proximity to a LightProbe does not necessarily mean that the probe has any effect at all on a mesh; it only takes one intervening probe to entirely eliminate another probe's influence. Or at least, that should be the case in the end.

👍

The case could be made that AmbientLight and HemisphereLight were never actually "sources" of light in the same sense that our punctual lights were. They're very simple models of global indirect illumination. That they're named as lights is probably making this all more complicated. 😓

💯💯💯👍

(An interesting project, diversion as it would be, could be to fix the way AmbientLights are handled by tetrahedrally interpolating them (instead of the current incorrect additive model). Doing this would be, in essence, order-0 SH with localized interpolation. This is, imho, an important realization. That is: LightProbes are not additive like "normal" lights; they need a very different interpolation scheme, which results in one interpolated set of SH per object.)

#16228 (comment)

I think that it is never a source of light. When there is an ambient light probe it is again just a measure of light passing through -- a constant value and usually a manually set value, but it is still light passing through. I think that we should not think of them as light sources.

👍

#16228 (comment)

@bhouston (Contributor, Author) commented Apr 25, 2019

@richardmonette wrote:

(An interesting project, diversion as it would be, could be to fix the way AmbientLights are handled by tetrahedrally interpolating them (instead of the current incorrect additive model). Doing this would be, in essence, order-0 SH with localized interpolation. This is, imho, an important realization. That is: LightProbes are not additive like "normal" lights; they need a very different interpolation scheme, which results in one interpolated set of SH per object.)

Exactly. But I would actually still represent them as 9-coefficient SHs, because it is just easier, and then we can generalize it into the LightProbeVolume instead of special-casing ambient lights as 1-coefficient SHs that fall outside the general model.
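That generalization is cheap because a constant ambient color becomes a 9-coefficient SH with only the first (L00) band populated. The sketch below assumes the common real-SH convention where evaluation multiplies each coefficient by the basis function, with Y00 = 1 / (2√π) ≈ 0.282095; conventions vary, so treat it as illustrative rather than as three.js's actual encoding:

```javascript
// Encode a constant ambient color as a flat 27-float SH array
// (9 coefficients x RGB). Only coefficient 0 (the L00 band) is non-zero,
// so the result drops into the same pipeline as a full light probe.
function ambientToSH9( r, g, b ) {
  const Y00 = 0.282095; // 1 / (2 * sqrt(pi)), the constant basis function
  const sh = new Array( 27 ).fill( 0 );
  // Divide by Y00 so that evaluating coefficient * Y00 in any direction
  // reproduces the original ambient color.
  sh[ 0 ] = r / Y00;
  sh[ 1 ] = g / Y00;
  sh[ 2 ] = b / Y00;
  return sh;
}
```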

@WestLangley (Collaborator)

Here is a scene illuminated by a single AmbientLight only.

(screenshot: scene illuminated by a single AmbientLight)

Last week, I added the capability for a scene to be illuminated only by a light probe -- with varying intensity no less.
(screenshot: scene illuminated only by a light probe)

And it can be used without specifying a light volume.

Would you like me to remove this feature?

@WestLangley (Collaborator)

@bhouston This is code you wrote ( or maybe we wrote it together ):

// accumulation
reflectedLight.indirectDiffuse = getAmbientLightIrradiance( ambientLightColor );

A light probe can have two purposes, (1) to measure irradiance and (2) to model irradiance.

In the shader, it models indirect light. That is, light from indirect sources. So-called "indirect diffuse light" is indirect light that is reflected diffusely.

In our physical model, indirect light is additive to total scene light, and in that sense, an ambient light or light probe is a "source" of light. And in a physical model, it must have units: lux in this case.

I see nothing incorrect about referring to indirect light sources as "sources".

@WestLangley (Collaborator)

Here is something you may find more appealing...

If you want, we could just enhance the capabilities of AmbientLight so it encompasses HemisphereLight and SH parameterizations. I am not sure what the API would look like, though.

Then, we would only use the term "light probe" when modeling irradiance volumes. We wouldn't add light probes to the scene, directly, and they would not have to extend Light or Object3D.

You may still get into trouble if you remove intensity from LightProbe, though. It will depend on the to-be-determined workflow.

@bhouston (Contributor, Author)

In our physical model, indirect light is additive to total scene light, and in that sense, an ambient light or light probe is a "source" of light. And in a physical model, it must have units: lux in this case.

I see your point.

If you want, we could just enhance the capabilities of AmbientLight so it encompasses HemisphereLight and SH parameterizations. I am not sure what the API would look like, though.

Maybe we can add SH parameterizations to the existing AmbientLight and HemisphereLight, and then the lighting code could sum those SHs together with the SH on the mesh.lightProbe value and pass the result into the shader as a single SH representing all indirect diffuse, e.g. an indirectDiffuseSH parameter?
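Summing the contributions works because irradiance is additive and SH is linear in its coefficients. A sketch (hypothetical helper, flat coefficient arrays as elsewhere in this thread):

```javascript
// Sum any number of SH coefficient arrays component-wise into a single
// SH, e.g. ambient + hemisphere + per-mesh probe -> indirectDiffuseSH.
// All inputs must share the same length (e.g. 27 floats for 9 x RGB).
function sumSH( ...shs ) {
  const out = new Array( shs[ 0 ].length ).fill( 0 );
  for ( const sh of shs ) {
    for ( let i = 0; i < out.length; i ++ ) out[ i ] += sh[ i ];
  }
  return out;
}
```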

The net effect is the same API for Ambient + Hemisphere lights, but simplified shader code and a bit of change in the lighting code.

Just brainstorming. I should get back to work. Discussion with smart people is a bit addictive.

@mrdoob mrdoob modified the milestones: r155, r156 Jul 27, 2023
@mrdoob mrdoob modified the milestones: r156, r157 Aug 31, 2023
@mrdoob mrdoob modified the milestones: r157, r158 Sep 28, 2023
@mrdoob mrdoob modified the milestones: r158, r159 Oct 27, 2023
@mrdoob mrdoob modified the milestones: r159, r160 Nov 30, 2023
@mrdoob mrdoob modified the milestones: r160, r161 Dec 22, 2023
@mrdoob mrdoob modified the milestones: r161, r162 Jan 31, 2024
@mrdoob mrdoob modified the milestones: r162, r163 Feb 29, 2024
@mrdoob mrdoob modified the milestones: r163, r164 Mar 29, 2024
@mrdoob mrdoob modified the milestones: r164, r165 Apr 25, 2024
@mrdoob mrdoob modified the milestones: r165, r166 May 31, 2024
@mrdoob mrdoob modified the milestones: r166, r167 Jun 28, 2024
@mrdoob mrdoob modified the milestones: r167, r168 Jul 25, 2024
@mrdoob mrdoob modified the milestones: r168, r169 Aug 30, 2024
@mrdoob mrdoob modified the milestones: r169, r170 Sep 26, 2024