
Scale parameter #55

Closed

skia opened this issue Nov 17, 2023 · 9 comments

skia commented Nov 17, 2023

Hi @mkkellogg!

I was wondering what your opinion would be on adding another parameter that could be passed to the viewer and that would tie together MaximumDistanceToSort/MaximumDistanceToRender/SplatBufferBucketBlockSize and Camera.far.

Is there any drawback?
Did I miss some other variable?

mkkellogg (Owner) commented:

I'm a bit confused as to what you mean exactly; MaximumDistanceToSort and MaximumDistanceToRender are used prior to sorting when deciding which splats should be filtered out, SplatBufferBucketBlockSize is used when saving or loading a SplatBuffer object, and Camera.far isn't really used other than for calculating the projection matrix. How were you thinking they should be tied together?

mkkellogg (Owner) commented:

Going to close this for now; if you need more help with this, let me know.


skia commented Dec 4, 2023

Sorry, I lost track of this! Apologies, quite rude since I was the one asking :)

I will try to explain myself better:
I believe you are currently relying on the scene being between -10 and 10, as these are the default coordinates for COLMAP when it doesn't have any geolocation information, but you can easily generate bigger scenes, with the rest of the procedure explained in the paper being the same.

In this case, you would need to adjust all of the mentioned parameters (ok, I don't know about Camera.far, but I would assume so).
The tricky one is SplatBufferBucketBlockSize: I had a scene that started at 300 MB as a .ply and grew to 1.2 GB when exported from the page as a (k?)splat, and I was unable to view it. This was because the block size was quite small (5?), so the result ended up with no full buckets but a lot of half-empty buckets that had to be filled with padding.
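For a rough sense of why that happens, here is a back-of-the-envelope illustration (all numbers are invented, not the library's actual bucket capacity or per-splat size): when the block size is small relative to the scene, splats scatter across many sparsely filled buckets, and the fixed per-bucket padding dominates the file size.

```js
// Illustration only: assumed bucket capacity and per-slot storage size.
const bucketCapacity = 256;      // splats stored per bucket (assumed)
const bytesPerSplatSlot = 24;    // assumed storage per slot, padding included

function compressedSize(splatCount, averageSplatsPerBucket) {
  const buckets = Math.ceil(splatCount / averageSplatsPerBucket);
  return buckets * bucketCapacity * bytesPerSplatSlot; // empty slots still cost space
}

const splatCount = 5_000_000;
console.log(compressedSize(splatCount, 20));  // sparse buckets: ~1.5 GB
console.log(compressedSize(splatCount, 250)); // well-filled buckets: ~0.12 GB
```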

If there were a "scale" factor (no idea what to call it) to multiply all the current values by, you could easily support bigger scenes. After all, I don't think it will be long before people find other ways to generate their own splats, maybe with an approach completely different from Inria's or from ML.
Being constrained to these coordinates feels more like a constraint of the approach used to generate the splats rather than of the "file format".
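As a purely illustrative version of that idea, a single scene-scale factor could simply multiply the distance-related defaults; none of these option names or numbers are the library's actual values.

```js
const SCENE_SCALE = 10; // e.g. a scene spanning roughly -100..100 instead of -10..10

// Placeholder defaults, not the library's real ones.
const defaults = {
  maximumDistanceToSort: 50,
  maximumDistanceToRender: 125,
  splatBufferBucketBlockSize: 5,
  cameraFar: 500
};

// One knob scales every distance-based value together.
const scaled = Object.fromEntries(
  Object.entries(defaults).map(([key, value]) => [key, value * SCENE_SCALE])
);
console.log(scaled); // { maximumDistanceToSort: 500, maximumDistanceToRender: 1250, ... }
```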

mkkellogg reopened this Dec 4, 2023
mkkellogg (Owner) commented:

I admit that the compression functionality I implemented was fairly quick and dirty and still needs work. I'm hesitant to put too much work into it, though, because we're still hashing out a proper universal .splat format & compression scheme here. In the meantime, would exposing the SplatBufferBucketBlockSize variable as a parameter to the compression code be helpful? The user could simply adjust it to a value that is good for the spatial density of their scene. As an immediate workaround, you can also try converting your .ply file with compression off (compression level = 0). That should help.


skia commented Dec 5, 2023

Thank you for your reply. Sorry about the closing/reopening; I was on my phone and misclicked.

I think the discussion has focused more on the ksplat format than on the original question: which values currently rely on the coordinates being relatively small, and should there be a single variable that modifies all of them together (since they are related to each other)?

I think if you want to expose SplatBufferBucketBlockSize, it would be a good idea to expose all the other values as well in a similar fashion. Thinking about it, it would make sense to have all the current numbers exposed (i.e. the SplatTree parameters in SplatMesh), so there would be no need to rebuild the lib in order to change some of these variables. That would also be an alternative to having the mentioned scale parameter.
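From the caller's side, "exposing the current numbers" might look like an options object handed to the viewer at construction time. The class and option names below are stand-ins for illustration, not the library's real API.

```js
// Minimal stand-in class so the sketch runs; a real viewer would thread these
// values through to SplatTree / SplatBuffer construction instead of hardcoding them.
class HypotheticalViewer {
  constructor(options = {}) {
    this.options = { ...options };
  }
}

const viewer = new HypotheticalViewer({
  maximumDistanceToSort: 500,
  maximumDistanceToRender: 1250,
  splatTreeMaxDepth: 8,
  splatTreeMaxCentersPerNode: 50000,
  splatBufferBucketSize: 256,
  splatBufferBucketBlockSize: 50
});
console.log(viewer.options);
```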

Regarding the ksplat: I appreciate that you are trying to have a single format to represent Gaussians on the web, and I don't know how you intend to proceed once that is decided, but if you are thinking about keeping the same internal representation as now (SplatBuffer), then the SplatBufferBucketBlockSize problem would come back again.
So, in some way, this value has to be taken into consideration regardless of what the final .splat format will be.
I guess one way to make it somewhat flexible is to calculate some sort of density (i.e. number of points per area) and then derive SplatBufferBucketBlockSize as SplatBufferBucketSize/density. The main problem with this approach is that the density is not uniform and depends on the amount of detail in a specific area, but if all the variables are exposed, then the numbers could be tweaked depending on the case.
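A sketch of that heuristic (my own interpretation, not library code): estimate an average density from the bounding box of the splat centers, then size a block so that, at that density, it holds roughly one bucket's worth of splats. The cube root is only there so the result comes out as a length.

```js
function estimateBlockSize(centers, bucketSize) {
  // Axis-aligned bounding box of all splat centers.
  const min = [Infinity, Infinity, Infinity];
  const max = [-Infinity, -Infinity, -Infinity];
  for (const c of centers) {
    for (let i = 0; i < 3; i++) {
      min[i] = Math.min(min[i], c[i]);
      max[i] = Math.max(max[i], c[i]);
    }
  }
  const volume = (max[0] - min[0]) * (max[1] - min[1]) * (max[2] - min[2]);
  const density = centers.length / volume; // average splats per unit volume (non-uniform in practice)
  return Math.cbrt(bucketSize / density);  // block edge length holding ~bucketSize splats on average
}

// Usage with made-up data:
console.log(estimateBlockSize([[0, 0, 0], [100, 50, 20], [-80, 10, -40]], 256));
```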

skia closed this as completed Dec 5, 2023
skia reopened this Dec 5, 2023
mkkellogg (Owner) commented:

So just to clear things up: the current method of compression that utilizes the SplatBufferBucketSize and SplatBufferBucketBlockSize parameters, along with the current internal format (SplatBuffer), are all meant to be temporary while we sort out the final universal format. Once we nail that down, I will switch over to using it in some form or another as both the file storage format and the internal format for storing splat data. So as a quick workaround to the problems you have been experiencing, I was thinking I could expose SplatBufferBucketSize and SplatBufferBucketBlockSize as tunable parameters when performing a conversion.

I understand that even with those parameters exposed, there are still issues around the hardcoded parameters for splat tree construction (maxDepth and maxCentersPerNode) as well as parameters like MaximumDistanceToRender. I agree that it would be nice to have all of these variables be tunable as well, but I'm not sure about a single variable that ties all these parameters together yet, since so much of the design is still in flux. Maybe they just need to all be exposed as individual parameters, but there would need to be a way to persist those values in the splat file, and I'm not sure what data will ultimately be supported there. (Maybe as long as we allow for a "custom data" section of the file it will be fine.)

As for calculating spatial density in an effort to derive these parameters automatically -- that thought had occurred to me, but like you mentioned, the density will most likely be non-uniform, so even if SplatBufferBucketBlockSize is customizable, it still may not work that well for some scenes. One nice thing about the splat tree is that it can deal with variable density: given a target maxCentersPerNode, it will create a tree with leaf nodes of variable dimensions in order to respect that maximum. The maxDepth is not meant to be a tuning parameter so much as a safeguard against splat trees becoming too deep.
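For readers unfamiliar with that behaviour, here is a tiny sketch of the idea (not the actual SplatTree implementation): a node is subdivided only while it holds more than maxCentersPerNode centers, so leaf sizes adapt to local density, and maxDepth simply stops the recursion.

```js
function buildNode(centers, min, max, depth, maxCentersPerNode, maxDepth) {
  const node = { min, max, children: [], centers: [] };
  if (centers.length <= maxCentersPerNode || depth >= maxDepth) {
    node.centers = centers; // leaf: its extent already matches the local density
    return node;
  }
  const mid = [0, 1, 2].map(i => (min[i] + max[i]) / 2);
  for (let octant = 0; octant < 8; octant++) {
    const childMin = [0, 1, 2].map(i => ((octant >> i) & 1) ? mid[i] : min[i]);
    const childMax = [0, 1, 2].map(i => ((octant >> i) & 1) ? max[i] : mid[i]);
    const childCenters = centers.filter(c =>
      [0, 1, 2].every(i => c[i] >= childMin[i] && (c[i] < childMax[i] || childMax[i] === max[i]))
    );
    if (childCenters.length > 0) {
      node.children.push(buildNode(childCenters, childMin, childMax, depth + 1, maxCentersPerNode, maxDepth));
    }
  }
  return node;
}

// Usage with random points in a 100-unit cube:
const points = Array.from({ length: 10000 }, () => [0, 1, 2].map(() => Math.random() * 100));
const root = buildNode(points, [0, 0, 0], [100, 100, 100], 0, 512, 8);
```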


skia commented Dec 5, 2023

> So just to clear things up: the current method of compression that utilizes the SplatBufferBucketSize and SplatBufferBucketBlockSize parameters, along with the current internal format (SplatBuffer), are all meant to be temporary while we sort out the final universal format. Once we nail that down, I will switch over to using it in some form or another as both the file storage format and the internal format for storing splat data. So as a quick workaround to the problems you have been experiencing, I was thinking I could expose SplatBufferBucketSize and SplatBufferBucketBlockSize as tunable parameters when performing a conversion.

That makes complete sense. I thought your idea was to use SplatBuffer internally since honestly it's quite fast and optimised, hence my comment on it 😅
But I get your point: you might very well use SplatBuffer internally in the future as well, but at the moment you want to have things more settled before restructuring everything to accommodate it.

> I understand that even with those parameters exposed, there are still issues around the hardcoded parameters for splat tree construction (maxDepth and maxCentersPerNode) as well as parameters like MaximumDistanceToRender. I agree that it would be nice to have all of these variables be tunable as well, but I'm not sure about a single variable that ties all these parameters together yet, since so much of the design is still in flux. Maybe they just need to all be exposed as individual parameters, but there would need to be a way to persist those values in the splat file, and I'm not sure what data will ultimately be supported there. (Maybe as long as we allow for a "custom data" section of the file it will be fine.)

I can see how maxDepth and maxCentersPerNode can be different per splat file. If we think about it as a normal ThreeJS object, the best analogy I can think of is how many segments a SphereGeometry should have, or how big a texture should be: parameters you cannot really tweak in a file you are importing.
I don't think the same applies to MaximumDistanceToRender; that one would normally be related to Camera.far, wouldn't you agree?

One option to keep things flexible while the final format still needs to be decided could be to have an extra file where these variables are stored, so you would have your .splat and your .params, in the same way you have your .obj and your .mtl.

mkkellogg (Owner) commented:

> I don't think the same applies to MaximumDistanceToRender; that one would normally be related to Camera.far, wouldn't you agree?

Yes, I definitely agree that MaximumDistanceToRender should be tied to Camera.far. It's a little tricky, though, because I think we'd also want to consider splat density when determining this value, but maybe if we just make Camera.far easily configurable and have MaximumDistanceToRender based off of it, that will be good enough.
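A minimal sketch of what "based off of it" could mean in practice; the 0.9 factor is an arbitrary placeholder, not a library default.

```js
import * as THREE from 'three';

const camera = new THREE.PerspectiveCamera(65, 16 / 9, 0.1, 2000);
// Derive the render/sort cutoffs from the camera instead of hardcoding them.
const maximumDistanceToRender = camera.far * 0.9;
const maximumDistanceToSort = maximumDistanceToRender;
console.log(maximumDistanceToRender, maximumDistanceToSort);
```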

> One option to keep things flexible while the final format still needs to be decided could be to have an extra file where these variables are stored, so you would have your .splat and your .params, in the same way you have your .obj and your .mtl.

That's an interesting idea -- we could just use defaults in the absence of a .params file, and override them if one is present. Easy enough :)
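A sketch of that defaults-plus-override flow, assuming a JSON sidecar named like the scene file; the file name convention, keys, and default values here are all hypothetical.

```js
// Hypothetical defaults; the real values live in the library.
const DEFAULT_PARAMS = {
  maximumDistanceToSort: 50,
  maximumDistanceToRender: 125,
  splatBufferBucketBlockSize: 5
};

// e.g. "scenes/garden.ksplat" -> "scenes/garden.params"
async function loadParams(splatUrl) {
  const paramsUrl = splatUrl.replace(/\.[^./]+$/, '.params');
  try {
    const response = await fetch(paramsUrl);
    if (!response.ok) return { ...DEFAULT_PARAMS };           // no sidecar: use defaults
    return { ...DEFAULT_PARAMS, ...(await response.json()) }; // sidecar overrides defaults
  } catch {
    return { ...DEFAULT_PARAMS };
  }
}
```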


skia commented Dec 5, 2023

> Yes, I definitely agree that MaximumDistanceToRender should be tied to Camera.far. It's a little tricky, though, because I think we'd also want to consider splat density when determining this value, but maybe if we just make Camera.far easily configurable and have MaximumDistanceToRender based off of it, that will be good enough.

I saw how you are using MaximumDistanceToRender and I understand your point, but if you apply the same logic as for maxCentersPerNode and SplatBufferBucketBlockSize, it's a property that depends on how good you want the result to be for that splat file. If you were to save the SplatTree directly, it should be one of the exposed parameters deciding how good you want the model to be (and it also depends on the scale of the model, same as the other parameters). But since at the moment you are not, I don't think the viewer is the right place to make this decision.

> That's an interesting idea -- we could just use defaults in the absence of a .params file, and override them if one is present. Easy enough :)

Yep :) The main problem with it is that you would need to be careful the custom section in the header isn't disregarded just because you found another solution. I still think it should be part of the .splat specs (a couple of strings to describe the keys and a couple of ints surely will not impact the size in any significant way).
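To put a number on that size claim, here is an invented (not proposed) byte layout for such a custom section: a one-byte key length, the key string, and a 32-bit integer per entry.

```js
function encodeCustomSection(entries) {
  const encoder = new TextEncoder();
  const chunks = [];
  for (const [key, value] of Object.entries(entries)) {
    const keyBytes = encoder.encode(key);
    const chunk = new Uint8Array(1 + keyBytes.length + 4); // keyLength | key | int32 value
    chunk[0] = keyBytes.length;
    chunk.set(keyBytes, 1);
    new DataView(chunk.buffer).setInt32(1 + keyBytes.length, value, true);
    chunks.push(chunk);
  }
  const out = new Uint8Array(chunks.reduce((sum, c) => sum + c.length, 0));
  let offset = 0;
  for (const c of chunks) { out.set(c, offset); offset += c.length; }
  return out;
}

console.log(encodeCustomSection({ blockSize: 50, bucketSize: 256 }).length); // 29 bytes
```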
