
LightProbeVolume: Baked lighting for dynamic objects #18371

Draft · wants to merge 6 commits into base: dev
Conversation

donmccurdy
Copy link
Collaborator

@donmccurdy donmccurdy commented Jan 11, 2020

Summary

A LightProbeVolume samples diffuse indirect lighting in a scene at each of several LightProbe locations, then provides approximate diffuse lighting for dynamic objects at any location within that space. The method complements baked lighting and lightmaps — which only support static objects — by providing high-quality lighting for dynamic objects as they move throughout a larger scene. Like lightmaps, LightProbeVolumes can be 'baked' offline and stored, then loaded and applied at runtime very efficiently.

Fixes #16228.

Thanks to @WestLangley, @bhouston, and @richardmonette for helpful discussions and code that contributed to this PR.

API

Bake:

var cubeCamera = new THREE.CubeCamera( 0.1, 100, 256, {
    format: THREE.RGBAFormat,
    magFilter: THREE.LinearFilter,
    minFilter: THREE.LinearFilter
} );

// Renderer output is sRGB. Configure renderTarget accordingly.
cubeCamera.renderTarget.texture.encoding = THREE.sRGBEncoding;

// Generate a light probe volume and populate probes in a 10x3x10 grid. Then
// sample lighting at each probe using the CubeCamera.
var volume = new THREE.LightProbeVolume()
  .setFromBounds( new THREE.Box3().setFromObject( scene ), 10, 3, 10 )
  .build()
  .bake( renderer, scene, cubeCamera );

// Store volume and baked lighting as a small JSON object.
var data = volume.toJSON();

Apply:

// Reconstruct the volume with pre-baked lighting.
var volume = new THREE.LightProbeVolume().fromJSON( data );
scene.add( volume );

probe = new THREE.LightProbe();
scene.add( probe );

function render () {

    volume.update( mesh, probe.sh );

    renderer.render( scene, camera );

}

Lighting is sampled only at the center of each mesh, and larger objects like terrain and buildings will not receive realistic lighting from a single sample. Shadows are not cast by light probes.

Unresolved

  • Per-object SH. While per-object SH coefficients can be computed, only one "probe" is actually in the scene graph functioning as a light, so only one mesh can receive dynamic lighting. This can be resolved with an API to assign SH coefficients or a light probe to a specific object or material.
  • Optimize. Various inefficiencies left as TODO. For example, LightProbeVolumeHelper should use InstancedMesh, and volume.update() should search incrementally from the last known cell to improve update times.
  • Correct baking. I'm not convinced my "baking" method is quite right, but it looks pretty good. Maybe this could be done offline in other tools.
  • Class structure. In this demo, LightProbeVolume is in the scene graph but its probes are not (I don't want them all affecting the mesh). I'm open to other ways of structuring this. It's useful for the volume to have position in the scene.
  • Grid volumes. Should we allow arbitrarily-shaped volumes, or require grid-shaped volumes? Blender and Godot appear to be OK with grids. That would let us omit the Delaunay library (which generates more tetrahedra than needed for a grid), improving volume creation time. And if (in the future) we want to upload the whole volume's SH coefficients to a GPU texture for sampling in the fragment shader, I think a grid would be necessary. But it is less flexible / optimizable. Grids only, for now.
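As a side note on the "Optimize" and "Grid volumes" points above: one advantage of a uniform grid is that the cell containing a sample point can be computed in O(1) rather than searched. A hypothetical sketch of that lookup (names are illustrative, not the PR's code):

```javascript
// Quantize a query point into grid coordinates and clamp to the volume
// bounds, yielding a flattened cell index. p and min are [x, y, z] arrays;
// nx/ny/nz are the grid cell counts per axis.
function cellIndex( p, min, cellSize, nx, ny, nz ) {

  const clamp = ( v, n ) => Math.min( n - 1, Math.max( 0, Math.floor( v ) ) );

  const ix = clamp( ( p[ 0 ] - min[ 0 ] ) / cellSize[ 0 ], nx );
  const iy = clamp( ( p[ 1 ] - min[ 1 ] ) / cellSize[ 1 ], ny );
  const iz = clamp( ( p[ 2 ] - min[ 2 ] ) / cellSize[ 2 ], nz );

  return ix + nx * ( iy + ny * iz ); // flattened cell index

}
```

Clamping means points outside the volume fall back to the nearest boundary cell rather than failing the lookup.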

Demo

https://raw.githack.com/donmccurdy/three.js/feat-lightprobevolume-build/examples/?q=volume#webgl_lightprobe_volume

Screen Shot 2020-01-11 at 1 19 31 PM

@bhouston
Copy link
Contributor

Great job @donmccurdy , this is amazing stuff!

@donmccurdy
Copy link
Collaborator Author

To start chipping away at the unresolved parts of this PR:

Should we allow arbitrarily-shaped volumes, or require grid-shaped volumes?

Unless there's any strong argument against it, I'd like to require grid-shaped volumes for now. That should let us drop the triangulation dependency (smaller build!) and means that if we later want to represent the volume as a GPU texture, it won't require a breaking change. I can't think of a way to represent large, non-grid probe volumes efficiently as GPU textures.

@donmccurdy
Copy link
Collaborator Author

Ok, grids only for now. I've dropped the 'delaunay-triangulate' dependency, and tetrahedra are tessellated directly from the grid structure.
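For reference, tessellating tetrahedra from the grid can be done per cell. A common approach, sketched here as an illustration of the idea rather than this PR's exact implementation, splits each cube-shaped cell into six tetrahedra sharing a main diagonal:

```javascript
// Split one grid cell into six tetrahedra from its eight corner indices,
// ordered (x, y, z) -> c[x + 2 * y + 4 * z]. All six tetrahedra share the
// main diagonal c[0]-c[7]; the other vertex pairs walk the hexagonal cycle
// of cube edges around that diagonal.
function cellToTetrahedra( c ) {

  return [
    [ c[ 0 ], c[ 1 ], c[ 3 ], c[ 7 ] ],
    [ c[ 0 ], c[ 3 ], c[ 2 ], c[ 7 ] ],
    [ c[ 0 ], c[ 2 ], c[ 6 ], c[ 7 ] ],
    [ c[ 0 ], c[ 6 ], c[ 4 ], c[ 7 ] ],
    [ c[ 0 ], c[ 4 ], c[ 5 ], c[ 7 ] ],
    [ c[ 0 ], c[ 5 ], c[ 1 ], c[ 7 ] ]
  ];

}
```

Six tetrahedra per cell is the minimum-bookkeeping decomposition; a five-tetrahedra split also exists but requires alternating orientation between neighboring cells to keep shared faces consistent.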

Next question — per-object SH. This could be addressed in a separate PR, but the light volume isn't much use if only one dynamic object can receive lighting. Discussion in #16228 covered a few possible naming conventions and APIs, but mostly focused on scene-wide lights... Do any of these make sense?

  • mesh.lightProbe
  • mesh.giProbe
  • mesh.material.envSH

Or is there another way we'd prefer to have the LightProbeVolume applied to individual objects?

@mrdoob mrdoob added this to the rXXX milestone Jan 27, 2020
@donmccurdy
Copy link
Collaborator Author

donmccurdy commented Mar 22, 2020

I was hoping I could work around the issue above without core changes, by updating the single global probe with onBeforeRender on each affected mesh:

var volume = new THREE.LightProbeVolume().fromJSON( data );
var probe = new THREE.LightProbe();
scene.add( probe );

meshes.forEach( ( mesh ) => {

  mesh.onBeforeRender = () => volume.update( mesh, probe.sh );

} );

function render () {

    renderer.render( scene, camera );

}

... unfortunately, that doesn't work. The light probe's uniforms are (understandably) not getting updated between each object. Some change or addition to the current use of LightProbe in the renderer will be necessary I think.

If this code were mature and production-ready (it isn't yet...), and we wanted it in core of three.js, I would say that probe volumes should be assigned to objects very much like we do for environment maps today, for consistency:

  • scene.environment = texture is global default, use material.envMap = texture to override.
  • scene.??? = volume is global default, use material.??? = ??? to override.

Does that seem like the right direction? If so, maybe we start with supporting either a LightProbe or SH assigned to a material initially, and the probe volume — or user code — can update that?

@mrdoob
Copy link
Owner

mrdoob commented Mar 22, 2020

Does that seem like the right direction? If so, maybe we start with supporting either a LightProbe or SH assigned to a material initially, and the probe volume — or user code — can update that?

Makes sense to me 👌

@donmccurdy
Copy link
Collaborator Author

Thanks! Trying to fit the environment map pattern, then.

Attaching an Object3D subclass like LightProbe directly to a Material feels awkward, so I think I'd rather do something like this...

material.indirectDiffuseSH = new THREE.SphericalHarmonics3().zero();

... zero by default, and added to any AmbientLight, HemisphereLight, or LightProbe already in the scene for backward-compatibility. Does that make sense to you @WestLangley + @bhouston? I took the name from an older comment (#16228 (comment)) by Ben:

Thus I advocate not changing the API for AmbientLight and HemisphereLight and just adding LightProbeVolume and DiffuseProbe. And we sum up all three into a single indirectDiffuseSH variable for the shader.
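The "sum up all three" step Ben describes is just component-wise addition of the nine RGB coefficients. A minimal standalone sketch, with plain arrays standing in for THREE.SphericalHarmonics3:

```javascript
// An SH3 here is an array of 9 coefficients, each an [r, g, b] triple.
// Summing ambient, hemisphere, and probe contributions is element-wise.
function addSH( target, source ) {

  for ( let i = 0; i < 9; i ++ ) {

    for ( let c = 0; c < 3; c ++ ) target[ i ][ c ] += source[ i ][ c ];

  }

  return target;

}

function zeroSH() {

  return Array.from( { length: 9 }, () => [ 0, 0, 0 ] );

}

// e.g. const total = [ ambientSH, hemisphereSH, probeSH ].reduce( addSH, zeroSH() );
```

Because the sum is linear, the renderer could fold any number of SH-based sources into one uniform without changing the shader.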

I would also be happy with names derived from "global illumination" or "irradiance".

@donmccurdy
Copy link
Collaborator Author

Well, it works in any case. 🙂

dev...donmccurdy:feat-indirectdiffusesh

Screen Shot 2020-03-22 at 8 40 00 PM

@WestLangley
Copy link
Collaborator

scene.??? = volume is global default, use material.??? = ??? to override

scene.lightVolume makes sense to me. A light volume is a collection of light probes.

material.indirectDiffuseSH = new THREE.SphericalHarmonics3().zero();

The environment map could have been named specularProbe, or luminanceProbe. So, maybe

material.illuminanceProbe?
material.lightProbe?
material.illuminance?

I am not really happy with any of those suggestions, TBH.

@WestLangley
Copy link
Collaborator

I think this PR is excellent.

However, I also think some changes are in order... for a later PR, that is.

In spite of what I said elsewhere, a light probe is not a source of light. It is a probe of light. It measures illuminance.

A probe measures light at a given location/direction.

A Light, on the other hand, casts light - and the light it casts attenuates according to distance.

Consequently, I no longer think LightProbe should extend Light -- any more than an environment map extends light. An environment map, after all, is a representation of a luminance probe.

Furthermore, AmbientLight and HemisphereLight are not lights either. They are probes -- special cases of a LightProbe.

I think we should begin thinking in these terms, and modify the library accordingly. Exactly how we go about doing that is up for discussion.

@donmccurdy
Copy link
Collaborator Author

@WestLangley thanks for the comments!

The environment map could have been named specularProbe, or luminanceProbe. So, maybe ...

material.illuminanceProbe?.
material.lightProbe?
material.illuminance?

In spite of what I said elsewhere, a light probe is not a source of light. It is a probe of light. ... Consequently, I no longer think LightProbe should extend Light ... I think we should begin thinking in these terms, and modify the library accordingly.

I think these two issues are closely related. I agree that LightProbe does not need to extend Light, and arguably should not. Similarly, LightProbeVolume in this PR is a (compound) probe rather than a light, as it "measures light at [many] given location/direction[s]".

In practice, these locations are usually predetermined, e.g. with grid structures as I've done here. Those are the locations at which probes measure light. What is left after that is not measurement but interpolation — an object moves through a volume, and an estimation of the light at the object's location (stored as SH3) is contributed to the object's lighting calculation.
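That interpolation amounts to a barycentric blend of the four probe SH3s of the enclosing tetrahedron. An illustrative sketch, with plain arrays standing in for SphericalHarmonics3 and the weights assumed to sum to 1:

```javascript
// probeSHs: four SH3s, each an array of 9 [r, g, b] coefficients — one per
// vertex of the tetrahedron containing the sample point.
// weights: the point's barycentric weights, summing to 1.
function blendSH( probeSHs, weights ) {

  const out = Array.from( { length: 9 }, () => [ 0, 0, 0 ] );

  for ( let p = 0; p < probeSHs.length; p ++ ) {

    for ( let i = 0; i < 9; i ++ ) {

      for ( let c = 0; c < 3; c ++ ) {

        out[ i ][ c ] += weights[ p ] * probeSHs[ p ][ i ][ c ];

      }

    }

  }

  return out;

}
```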

While you could store that derived SH3 in a LightProbe again, I find that unsatisfying as the resulting probe is not being used to "measure." For the same reason, assigning a probe to a material seems unsatisfying. That is the argument I see for assigning an SH3 to the material rather than a probe — the SH3 simply represents a value.

@WestLangley
Copy link
Collaborator

Assigning the property an instance of SH3 is fine.

However, remember that scene.background can take many forms. material.illuminance can, too.

illuminance can be encoded not only as SH3, but SH4, for example, or perhaps more commonly as an equirectangular texture.

I am not a fan of using the encoding method in the material property name. So personally, I do not like using .indirectDiffuseSH as a property name.

I am having difficulty finding a name I do like, but I am leaning toward material.illuminance.

@donmccurdy
Copy link
Collaborator Author

donmccurdy commented Mar 25, 2020

Ok, that all makes sense! Without the "SH", "material.indirectDiffuse" does not sound as compelling — it may be confusing when "diffuse" is often used to refer to a material's base color in other software. Maybe that could be saved with "indirectDiffuseLighting".

For the sake of exploring, here's all I can come up with:

| material.____ | comment |
| --- | --- |
| .illuminance | |
| .irradiance | see Blender's Irradiance Volumes |
| .illumination | |
| .gi | see Godot's GI Probes |
| .globalIllumination | yikes 😅 |

I am happy with any of these except the last. I could also imagine something with an env* prefix like .envIlluminance, matching .envMap. Or just .env? The consistency is nice, but the word "environment" is imprecise.

@donmccurdy
Copy link
Collaborator Author

I think I might prefer to work with irradiance rather than illuminance. The real-time rendering resources I'm able to find, or those relevant to this topic, typically use it. See the Filament docs, for example, discussing light probes, "irradianceSH", and "irradianceEnvMap". This implies physical units of W/m² or W/cm².

@WestLangley
Copy link
Collaborator

I think the trend now is to use photometric terms (lumens, candela, lux, nits) -- not the radiometric terms based on watts, so I would use the term 'illuminance'. We already use the photometric terms in the three.js docs.

I do not know why the Filament docs are so inconsistent in terminology -- both luminance and radiance are used. Maybe it is just force-of-habit.

@donmccurdy
Copy link
Collaborator Author

I think that is true for lights — note the Filament docs use illuminance only there — but that probes and volumes are always discussed in terms of "irradiance." I don't know why that is, but will try to find out. The intro to this talk was also quite helpful: https://www.gdcvault.com/play/1026182/.

@donmccurdy
Copy link
Collaborator Author

A pairing like scene.irradianceVolume + material.irradiance might be promising...

Regardless of what we name this, I'm not completely happy with my implementation in dev...donmccurdy:feat-indirectdiffusesh. It seems odd that two sets of SH3 coefficients have to be stored as uniforms (2 x 9 x vec3), where one is a light uniform (and therefore not updated between mesh draws) and the other is a material uniform, both representing global illumination for the object. Do you think those could be combined somehow, in the renderer?

@FishOrBear
Copy link
Contributor

@donmccurdy The lighting in this scene seems to have a lot of noise.

@donmccurdy
Copy link
Collaborator Author

@FishOrBear if you mean the grain/noise on the hallway itself, that's unrelated to the light probe volume — I just baked the light there with a raycaster, and did so in a hurry, causing the noise. The light probe volume is contributing only the light on dynamic objects (the sphere) in this scene.

@mindinsomnia
Copy link

Is this patch still progressing? It seems like really promising work.

@donmccurdy
Copy link
Collaborator Author

donmccurdy commented Nov 30, 2021

Does that mean unreal interpolates between different cube maps for every dynamic object in a scene?

I had the same question. @bhouston I can see how the representation you described would be better for a single light probe, but can we smoothly interpolate among many cube maps within baked irradiance volumes? A few references I'm skimming here:

While the current implementation samples the irradiance volume once per object, we may want to do that per vertex or even per pixel someday.

@WestLangley
Copy link
Collaborator

@donmccurdy Note that MeshStandardMaterial and MeshPhysicalMaterial now automatically account for the irradiance implied by the environment map (i.e., radiance map). See #22178 (comment).

Consequently, if you use an environment map, and simultaneously store irradiance in a light probe volume, you will be double-counting the irradiance.

three.js does not currently support a PBR workflow where the user provides both a radiance map and an irradiance map.

@netpro2k
Copy link

netpro2k commented Dec 1, 2021

At the risk of derailing this thread even further... I am working on some very related changes for Hubs (still WIP), so I wanted to chime in since it's quite relevant to the current discussion, and I'm wondering if there is overlap/consolidation that should happen...

I have things set up such that you can define AABBs in which a particular environment map applies. Objects overlapping multiple environment map boxes ("reflection probes") will blend between the two most-overlapping boxes (or between one box and the scene's environment when partially overlapping a single box). This is quite similar to what Unity does for reflection probes, and the effect is quite nice (this scene has no dynamic lights, just lightmaps + "reflection probes").
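The overlap-based blending described above could be sketched as weighting each probe box by the fraction of the object's AABB it covers. This is illustrative only, not the actual Hubs code; boxes are plain `{ min: [x, y, z], max: [x, y, z] }` objects:

```javascript
// Weight a reflection probe by the fraction of the object's AABB volume
// that falls inside the probe's box. Returns 0 for disjoint boxes and 1
// when the probe box fully contains the object.
function overlapWeight( obj, probe ) {

  let overlap = 1, objVolume = 1;

  for ( let a = 0; a < 3; a ++ ) {

    const lo = Math.max( obj.min[ a ], probe.min[ a ] );
    const hi = Math.min( obj.max[ a ], probe.max[ a ] );

    if ( hi <= lo ) return 0; // no intersection on this axis

    overlap *= hi - lo;
    objVolume *= obj.max[ a ] - obj.min[ a ];

  }

  return overlap / objVolume;

}
```

Normalizing the two largest weights so they sum to 1 would then give the blend factors between the most-overlapping boxes.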

simplescreenrecorder-2021-11-30_17.11.17.mp4

Since environment maps on MeshStandardMaterials are now all PMREM, this is also sort of acting as a diffuse light probe, which is great. I think we will still want some higher-density probes for diffuse changes (e.g. across hard shadow boundaries), so I was planning to implement a solution like LightProbeVolumes in Hubs as well, but it's interesting that Unreal is apparently just using cubemaps? (I have not been able to find a clear indication of this.) Assuming they do, I wonder if we should be doing the same? And if so, does that mean reflection probes would also want to just be in a grid?

I suspect that even if we did use a grid of cubemaps for diffuse lighting, doing reflection probes as boxes might still be desirable, as you might want a few higher-resolution probes for actual reflective objects; having a position and box defined is also useful for things like box projection, though you could argue that could be handled as special cases.

Current code for this is here: Hubs-Foundation/three.js@hubs-patches-133...MozillaReality:multiple-envmap .. Note this is on top of our three-133 fork, but the only notable change as it pertains to lighting is that we do not apply the irradiance from environment maps to lightmapped objects (we assume you will have baked that in already) Hubs-Foundation@eb6297b

Workflow wise we are doing very much what is described above, just rendering out equirects in Blender. Current script I am using for this is here https://gist.github.com/netpro2k/fe5a3b1348f3644d9b39e149b7901cf4, though the plan is to integrate this more deeply into our addon.

Note that I am using Blender's ReflectionCubemap objects only for their gizmos/UI as the underlying cubemap data is not actually accessible

image

@gillesboisson
Copy link

Hi, I've come 2 years too late, but I'm working on a Blender plugin that bakes scene probes (the Eevee probe system) into cubemap sheets. It's still in progress, but advanced enough to be tested in three.js.

here is it : https://github.com/gillesboisson/blender-probes-export

It exports :

  • the irradiance probe grid as a cubemap sheet (one sheet per grid; SH not supported yet)
  • and the reflection cubemap as a cubemap sheet with multiple roughness levels.

If anyone is still working on multi-light-probe integration with three.js, I would be interested in your input, as it is a huge amount of work to implement a solution from scratch.

@mrdoob
Copy link
Owner

mrdoob commented Sep 14, 2023

@gillesboisson

That's amazing!

I feel like SH would be better for web. That way we don't have to worry about HDR file sizes.

@WestLangley
Copy link
Collaborator

@gillesboisson Just so you are aware... the three.js PBR materials automatically account for the irradiance (i.e., global illumination) implied by the envMap (radiance map).

Consequently, if you include an environment map plus an irradiance probe in your scene, you will be double-counting irradiance.

@gillesboisson
Copy link

gillesboisson commented Sep 14, 2023

Thanks for your feedback. I think it will be a bit tricky with the three.js env map, as I export a multi-level roughness map in the reflection map.

ReflectionCubemap_packed.png

Maybe for now I should prioritize supporting SH export in a texture, then I'll experiment with three.js.

Is anyone still using or working on this PR? It looks to be outdated.

@mrdoob
Copy link
Owner

mrdoob commented Sep 14, 2023

I don't think there's anyone working on this PR currently.

@gillesboisson
Copy link

Cheers,

I read this thread, looked into the changes, and looked at the current state of the three.js code. Here are a few ideas based on what I see in the Blender plugin and in three.js.

Light probe definition and standard

The plugin uses Blender's Eevee probe objects, which support irradiance grids and reflection cubemaps. It renders each probe element into an equirectangular image using Blender's Cycles rendering engine, in sRGB, though it seems I should switch to HDR for more accurate results. In a second pass, the rendered results are packed into sheets:

  • the irradiance grid is packed into an image with 9 cubemap faces. I saw UE4 uses small cubemap arrays, and some other engines use SH4 (Unity used to pack grids into 3D textures; some implementations interpolate them at the vertex level).

  • the reflection cubemap is packed into an image of roughness levels × 6 cube faces.

All of this is detailed in the plugin repo's readme.
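For illustration, the layout math for such a packed sheet might look like the following. This is an assumed tiling (probes left-to-right, the six faces of each probe side by side), not the plugin's exact format:

```javascript
// Pixel origin of one cube face within a packed probe sheet, under an
// assumed layout: probes tiled row-major, each probe occupying a strip of
// six square faces of faceSize pixels.
function tileOrigin( probeIndex, faceIndex, faceSize, probesPerRow ) {

  const col = probeIndex % probesPerRow;
  const row = Math.floor( probeIndex / probesPerRow );

  return {
    x: ( col * 6 + faceIndex ) * faceSize,
    y: row * faceSize
  };

}
```

Whatever the real layout is, documenting it with a formula like this makes the sheet consumable by loaders other than the plugin's own.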

From engine perspective

  • It's better to have a clear separation between light and probe definitions.
  • In Eevee, probes have a bounds, falloff, and clipping system, specific to probes, which defines how objects are affected and how probes are interpolated.
  • In terms of performance, I don't know how heavy per-vertex indirect lighting (SH interpolation > irradiance sampling) is; a Unity paper explained that it's quite feasible, even on mobile.

It would be interesting to have a new kind of object which defines probes, as they are currently defined as lights. It would have its own spatial segmentation (I saw an octogrid class somewhere, but don't know how it is handled in the scene) and a common clipping / falloff system for handling per-object probe influence. It would be very nice, pretty simple to implement, and compatible with future kinds of probes (I saw tetrahedron / triangle ideas for irradiance in this thread and also in an old Unity paper).

I'm not a three.js expert and don't know yet what the roadmap is for supporting more probes, but I would be interested to help on this part. Maybe first try implementing a simple solution with a custom shader material extending the PBR material, objects, and scene, and then, with your help, make a proposal based on three.js standards and good practices.

@donmccurdy
Copy link
Collaborator Author

donmccurdy commented Sep 14, 2023

It renders each probe element into an equirectangular image using Blender's Cycles rendering engine, in sRGB, though it seems I should switch to HDR for more accurate results...

Agreed – for physically-based shading using the light probe data we'll need HDR; some comments earlier in this thread came to the same conclusion. Also note that Blender v4 is moving in the direction of supporting wide-gamut color, and this might affect the color space of light data sampled from probes...

Probably the best place to start would be Linear Rec. 709 ("Linear sRGB") data in OpenEXR format. We can package the data into something more efficient and web-friendly later down the road. Or, JSON is always convenient too.


I have no strong preference on using spherical harmonics vs. low-res cubemaps. @bhouston had commented earlier that low-res cubemaps seem to be what most tools prefer these days, in #18371 (comment). Do others have a preference? We can convert cubemaps to SH, but not necessarily the other way around...

We should aim for something we can sample efficiently on the GPU. For that reason, I'm no longer really interested in supporting arbitrary tetmesh probe layouts, as I'd tried in earlier versions of this PR. Let's stick to a grid with well-defined falloff, using either SH or low-res cubemaps.


Unfortunately I haven't been actively working on this PR for a while, and I don't have bandwidth to pick it back up right now. If anyone else is interested, please do!

@WestLangley
Copy link
Collaborator

I have no strong preference on using spherical harmonics vs. low-res cubemaps. @bhouston had commented earlier low-res cubemaps seem to be what most tools prefer these days, in #18371 (comment). Do others have a preference?

Using SH to estimate irradiance from HDR radiance probes can be problematic due to "ringing". There are various work-arounds, but I would be inclined to use low-res cube maps, instead.

@gillesboisson
Copy link

Good, less work on the Blender side :) I'll stick to small cubemaps then.

OpenEXR looks good; I'll check on it later, as I don't know yet how to handle it with Blender's OpenGL API (I use Blender's internal API to compute equirectangular panos into irradiance / reflectance cubemaps). I use 16-bit-per-channel PNGs for now. I still need to check how to keep consistency between my baked light probes and Blender's object-baking render settings (as I'd like this to integrate with fully light-baked static objects).

@gillesboisson
Copy link

Just to give an update on this subject: I'm working on some integration tests in this repo: https://github.com/gillesboisson/threejs-probes-test

When I have something working, I'll need some help to integrate it better with three.js standards.

@gillesboisson
Copy link

I have an issue with my packing process. In my Blender plugin:

  • it first renders probes into an equirectangular pano
  • then it samples the result into cubemap faces and packs them into an image, for both irradiance and luminance.

In the cubemaps I get weird dots along the x axis for irradiance and the y axis for reflectance.

For reflectance, I'm pretty sure it's because of the pano sampling.

Screenshot from 2023-10-02 17-14-45

For irradiance, I really don't get why.

Screenshot from 2023-10-02 17-00-53

Screenshot from 2023-10-02 17-01-09

It's out of scope for three.js, but if anyone has a reference for this kind of issue it would be a great help.

@gillesboisson
Copy link

Hi,

Just to give a follow-up: I found some time to work on the probe volume system.

You can see a demo here: https://three-probes.dotify.eu

I'll start working on extending Material.

The roadmap is here: https://github.com/gillesboisson/threejs-probes-test#roadmap

@gillesboisson
Copy link

I finally resolved my Blender export issue and added OpenEXR export support:
https://github.com/gillesboisson/blender-probes-export

I still need to finalize global env export. Then I'll be able to adapt the three.js prototype to support both SDR and HDR probes, and maybe implement in-engine baking based on what has been done in this PR.

@gillesboisson
Copy link

Hi,

Here is a demo of the probe volume structure: https://three-probes.dotify.eu/

It displays volume influences and probes, and has a sphere following the camera which displays interpolated cubemaps.

I could solve most of my compatibility issues, and made a lot of improvements on the Blender side.

I've moved SH to low priority.

I'll switch to material integration.

@gillesboisson
Copy link

Just to give a quick update:

I have an almost-working prototype here: https://three-probes.dotify.eu/
I found a way to extend the standard, physical, Phong, and Lambert materials without touching the three.js base code.
I have one remaining issue with the shader cache and uniforms; I asked the community for help here: https://discourse.threejs.org/t/extending-material-for-probes-volume/57729

If anyone has docs or references on how shader caching and uniforms work, it would be a great help.

Cheers.

@gillesboisson
Copy link

gillesboisson commented Nov 8, 2023

Hi,

Quick report: I found some time to fix issues and do some cleaning.

The prototype works and supports most of the targeted features. It would be helpful for me to have feedback from three.js experts, as I'm still learning the engine's low-level parts. I'll show the prototype to the community, to see whether it makes sense for them to use it and to add it to three.js.

The docs explain the state of the prototype quite well. It's all there:

demo: https://three-probes.dotify.eu/
prototype repo: https://github.com/gillesboisson/threejs-probes-test
blender plugin repo: https://github.com/gillesboisson/blender-probes-export

I'll switch to the Blender side now, to make the plugin more user-friendly.
I'm also thinking of adding object lightmap baking to the plugin, as existing plugins don't make things easy.

Cheers

@donmccurdy
Copy link
Collaborator Author

@gillesboisson thanks for your patience with our delayed replies here. The demo is outstanding — I'm really thrilled with how this is looking! A few comments, questions, and ideas. Please feel free to respond or ignore them as you prefer. :)

  • I noted that the runtime "Sun" directional light is not visible in the reflection probes. Do you feel that's a user-specific decision, and that users could optionally include a sun lamp when baking in Blender? Or is there more to it? I know matching light intensities across Blender and three.js is not always easy...

  • Do you have a preference about whether the runtime components of this should become part of the three.js repository, or would you prefer to manage a repository yourself? Personally I am happy with either. If it became part of three.js, I think I would mainly want to document the final structure of the probe data sufficiently that it could (at least theoretically) be produced by other tools in addition to Blender someday.

  • Do you know what portion of the demo causes the warnings, THREE.WebGLTextures: Trying to use 27 texture units while this GPU supports only 16? Is this caused by having 1 texture per irradiance probe? If so, I have some work in progress to support array textures in KTX2 format ((WIP) KTX2Loader: Add support for u8, f16, and f32 array and cube textures #26642) which may help. Probably not urgent right now, but I'm happy to help with this step later if it's useful.

  • Currently we only support scene.environment (a single IBL for irradiance and reflections shared by an entire scene) in MeshStandardMaterial and MeshPhysicalMaterial. I suspect that if support for probe volumes or other global illumination were added to three.js core someday, it may be limited to only those two material types. In case this makes it easier for you to maintain in the meantime.

@gillesboisson
Copy link

Thanks for your feedback. Here are a few thoughts:

I noted that the runtime "Sun" directed light is not visible in the reflection probes. Do you feel that's a user-specific decision, and that users could optionally include a sun lamp when baking in Blender? Or is there more to it? I know matching light intensities across Blender and three.js is not always easy...

Blender's probe system has a pretty robust solution: for each probe volume you can set a specific visibility collection, which decides what is baked. In my demo I had to hack around the sun, as the glTF importer seems to not recognize directional lights and in some cases ignores point lights.

Do you have a preference about whether the runtime components of this should become part of the three.js repository, or would you prefer to manage a repository yourself? Personally I am happy with either. If it became part of three.js, I think I would mainly want to document the final structure of the probe data sufficiently that it could (at least theoretically) be produced by other tools in addition to Blender someday.

Ideally I would like this to be merged into three.js, but for now I don't have a robust solution. If you check the material extension part of the code, it gives an approach for how we could implement this in the engine (the current solution didn't require modifying the three.js renderers, materials, or shaders).

Do you know what portion of the demo causes the warnings

I still need to check on this, but it's certainly because of using individual cubemaps: the probe handler gets the probes around each object and assigns cubemaps. Your solution looks interesting. Does WebGL2 support cubemap arrays? I was thinking about using SH for irradiance and a pano texture array for radiance.

Currently we only support scene.environment (a single IBL for irradiance and reflections shared by an entire scene) in MeshStandardMaterial and MeshPhysicalMaterial. I suspect that if support for probe volumes or other global illumination were added to three.js core someday, it may be limited to only those two material types. In case this makes it easier for you to maintain in the meantime.

I actually tried to support all materials that support envMap. If you check my implementation, it was pretty straightforward for both the physical- and basic-based materials; I only had issues with materials with "ifdef USE_ENV" in their root shader code (like the basic material).

Thanks for your feedback,

I'm definitely interested in a robust solution for handling the large amounts of textures / uniforms (for now, instanced rendering is not supported).

@donmccurdy
Copy link
Collaborator Author

... gltf importer seems to not recognize directionnal light and in some case ignore point lights.

THREE.GLTFLoader does handle both, but (1) the punctual lights option must be enabled in the Blender glTF addon settings to export, and (2) getting visual appearance of lighting intensities to match (not just the units!) is not trivial.

... Does webgl2 support cubemap arrays?

Unfortunately not, only texture 2D arrays. In WebGPU this would be possible.
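Since WebGL2 only offers `TEXTURE_2D_ARRAY`, one common workaround is to flatten every probe's six cube faces into a single 2D texture array and address faces by layer index, selecting the face from the sample direction's major axis. A small sketch of the indexing math (the face ordering follows the WebGL cubemap convention; the helper names are illustrative, not from this PR):

```javascript
// Pack probe cubemaps into a TEXTURE_2D_ARRAY: each probe contributes six
// consecutive layers, in WebGL cubemap face order +X, -X, +Y, -Y, +Z, -Z.
const FACES = 6;

function faceLayer( probeIndex, faceIndex ) {
	return probeIndex * FACES + faceIndex;
}

function layerToProbeFace( layer ) {
	return { probe: Math.floor( layer / FACES ), face: layer % FACES };
}

// Pick the face a direction vector falls on, mirroring the cubemap
// major-axis rule: the axis with the largest absolute component wins.
function selectFace( x, y, z ) {
	const ax = Math.abs( x ), ay = Math.abs( y ), az = Math.abs( z );
	if ( ax >= ay && ax >= az ) return x >= 0 ? 0 : 1; // +X / -X
	if ( ay >= az ) return y >= 0 ? 2 : 3;             // +Y / -Y
	return z >= 0 ? 4 : 5;                             // +Z / -Z
}
```

The cost of this approach is that filtering no longer blends across face edges, which native cubemap sampling would handle automatically.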

Ideally I would like this to be merged into three js but for now I don't have a robust solution. If you check the material extension part of the code, it gives some approach on how we could implement these into the engine...

Ok – we'll need to look closer at the implementation to decide on this I think. I'm not able to do that right now, but either someone else will, or I will get to it in the future!

@gillesboisson
Copy link

@donmccurdy: I updated the data schema on the Blender plugin side, and the plugin props now explain how volumes are baked and how they are represented in the prototype implementation.

I'm also working on supporting light baking into material textures:

  • In Blender, the object / light props will evolve toward a common approach: Static | Dynamic object extra data in glTF
  • The three.js implementation will use render layers to split light and object rendering depending on whether they are static or not
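For context, three.js render layers are a 32-bit mask: an object is drawn by a camera only when the two masks share a bit. A minimal plain-JS sketch of the static/dynamic split idea (the layer numbers here are arbitrary, not from this PR's code):

```javascript
// three.js Layers semantics: each layer is one bit of a 32-bit mask, and an
// object is rendered only when ( objectMask & cameraMask ) !== 0.
const STATIC_LAYER = 1;  // lightmapped geometry (walls, pillars, ...)
const DYNAMIC_LAYER = 2; // probe-lit, moving objects

const mask = ( layer ) => 1 << layer;

function visible( objectMask, cameraMask ) {
	return ( objectMask & cameraMask ) !== 0;
}

// During a bake pass, the capture camera would enable only STATIC_LAYER so
// that dynamic objects do not contaminate the stored lighting.
const bakeCameraMask = mask( STATIC_LAYER );
```

In actual three.js code this corresponds to `object.layers.set( n )` on meshes and `camera.layers` on the capture camera.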

Some quick explanation of the probes.json data structure here:
Some specs on the Blender props: https://github.com/gillesboisson/blender-probes-export/blob/main/doc/props.md

@gillesboisson
Copy link

Quick update: I'm still working on this; I have been focused on the Blender side. I added lightmap baking support to my plugin and worked on the exported data format.

Here is a simple integration with active objects using probes for GI, and static objects (walls, pillars, etc.) using lightmaps for GI.

https://three-probes-small.dotify.eu/


@gillesboisson
Copy link

gillesboisson commented Feb 13, 2024

A little update: I did some code optimisation and added features like:

  • manual probes
  • nearest-probe interpolation
  • an auto-update function
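Nearest-probe interpolation is the simplest fallback: light the receiver with whichever probe is closest. A brute-force sketch (plain JS; a real implementation would use the volume's grid instead of a linear scan, and the `position` field is illustrative):

```javascript
// Return the index of the probe whose position is closest to `point`.
// `probes` is an array of { position: [x, y, z] } objects.
function nearestProbe( probes, [ px, py, pz ] ) {
	let best = - 1, bestDistSq = Infinity;
	for ( let i = 0; i < probes.length; i ++ ) {
		const [ x, y, z ] = probes[ i ].position;
		const d = ( x - px ) ** 2 + ( y - py ) ** 2 + ( z - pz ) ** 2;
		if ( d < bestDistSq ) { bestDistSq = d; best = i; }
	}
	return best;
}
```

Squared distances avoid the `Math.sqrt` per probe; since only the ordering matters, the result is identical.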

I updated the demo scene in the repo; the OpenEXR files are pretty heavy.

(Two screenshots of the updated demo scene, 2024-02-13.)

repo : https://github.com/gillesboisson/threejs-probes-test
demo : https://three-probes.dotify.eu/

I worked a lot on the Blender plugin; I will sell it on Blender Market under a GPL license to finance part of its development. It will come quite soon and will support lightmap / probe authoring and baking.

The three.js side of the project is built with the idea of having its own baking method. I don't know how it could become a proposal for a three.js feature, but it would be helpful to get some feedback on how it could go in that direction.

@mrdoob
Copy link
Owner

mrdoob commented Feb 14, 2024

the open exr files are pretty heavy

Are you aware of hdr jpg?
https://threejs.org/examples/?q=hdr#webgl_loader_texture_hdrjpg

Those ~70MB in exrs could become ~7MB in jpgs...
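The "HDR JPEG" format referenced above stores an ordinary SDR JPEG plus a small gain map, and reconstructs HDR by scaling the SDR value by an interpolated power of two. A rough plain-JS sketch of just the core recovery formula (the exact constants, offsets, and per-channel handling vary per encoder; this is the idea, not a spec-accurate decoder):

```javascript
// Gain-map HDR recovery: hdr = sdr * 2^( lerp( minLog2, maxLog2, g ) ),
// where g in [0, 1] is the gain-map sample and minLog2 / maxLog2 come from
// the image metadata. Values here are linear-light, not gamma-encoded.
function applyGain( sdr, g, minLog2 = 0, maxLog2 = 4 ) {
	const log2Gain = minLog2 + ( maxLog2 - minLog2 ) * g;
	return sdr * Math.pow( 2, log2Gain );
}
```

Because the gain map is low-frequency and low-bit-depth, it compresses extremely well, which is where the ~10x size win over EXR comes from.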

@gillesboisson
Copy link

gillesboisson commented Feb 14, 2024

Thanks for your feedback.

I had heard about HDR JPEG but haven't looked much into it.

I have a general concern about data handling, as my way of packing and loading glTF and textures is far from optimal.

I'll work on a better packing solution for textures and glTF (glTF-Transform looks good for this). I'll support more texture formats in my plugin's export; an SH texture pack for the irradiance grid and HDR JPEG should be among the supported formats.

I'll need to improve the loading part, as right now I'm not using GLTFLoader properly (I saw there is a way to customize the parser).

Development

Successfully merging this pull request may close these issues.

Light Probe interpolation using Tetrahedral Tesselations