Scale of data display on the LT change mapper #5
Comments
Hi, I noticed the same type of artifacts in the LandTrendr output images, for instance when converting the results of ltgee.getSegmentData into images. Did you happen to find an explanation for this? What is the resolution of the LandTrendr output?
Hi @JodiNorris and @agataelia, for best results you need to export the image using one of the […]
One thing that I believe would help remove artifacts in these cases is to not use bicubic resampling. I'll look at updating the code so that it does not use it, or at least make not using it an option.
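For context, Earth Engine uses nearest-neighbor resampling by default, which preserves hard 30 m pixel edges; explicitly requesting bicubic smooths values during reprojection and can produce sub-pixel gradients that look like finer-than-30 m detail. A minimal sketch of the difference, where `ltResult` is an illustrative stand-in for a LandTrendr output image:

```javascript
// Hypothetical sketch (Earth Engine JavaScript); `ltResult` is illustrative.
// Bicubic resampling interpolates between 30 m pixels, which is what can
// create the apparent sub-pixel detail described in this thread.
var smoothed = ltResult.resample('bicubic');

// Omitting .resample() keeps the default nearest-neighbor behavior,
// so displayed pixels stay crisp at the native 30 m resolution.
var crisp = ltResult;

Map.addLayer(crisp, {}, 'Year of detection (nearest-neighbor)');
Map.addLayer(smoothed, {}, 'Year of detection (bicubic)');
```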
Hi @jdbcode, thank you for the prompt reply! I looked into reprojecting the LT output, and a solution like the one below does provide a map output with a consistent resolution.
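The snippet originally attached to this comment was lost in the thread; a minimal sketch of such a reprojection, with illustrative names (`segData` standing in for the ltgee.getSegmentData result, and an assumed CRS), might look like:

```javascript
// Hypothetical sketch: pin the LandTrendr output to a fixed 30 m grid so
// on-the-fly display pyramiding doesn't mix resolutions across zoom levels.
// `segData` and the CRS choice are illustrative assumptions.
var fixed = segData.reproject({
  crs: 'EPSG:5070',  // an equal-area projection for CONUS (assumed)
  scale: 30          // native Landsat pixel size in meters
});
Map.addLayer(fixed, {}, 'Segment data at 30 m');
```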
I am still having GEE memory troubles when running my code over very large areas, but probably just running an export would solve that. Thanks!
Hi @jdbcode, I am back to ask for some advice on this. I am having trouble exporting the final output of my GEE code, based on the LT functions, as an asset. The area I am working with is definitely very large (the whole USA, masked to polygons of interest), so my first question is whether running such an algorithm over areas this large is advisable at all. My code performs the following tasks:
My desired export is the final raster (approximately 8 bands) with the extent of the USA, masked to the polygons of interest (still an extensive area). Exporting this data as an asset takes a long time (> 4 days) and usually crashes with a user memory limit error. Do you have any suggestions?
Any recommendations would be very much appreciated, thank you!
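For a large masked mosaic like the one described above, an asset export typically needs an explicit `maxPixels` ceiling, a fixed scale and CRS, and ideally a compact pixel type. A hedged sketch, in which `finalImage`, `usaGeometry`, and the asset path are all illustrative names:

```javascript
// Hypothetical sketch of an asset export for a large masked raster.
// finalImage, usaGeometry, and the assetId are illustrative assumptions.
Export.image.toAsset({
  image: finalImage.toInt16(),  // a smaller pixel type reduces memory use
  description: 'lt_usa_export',
  assetId: 'users/your_account/lt_usa_export',
  region: usaGeometry,
  scale: 30,
  crs: 'EPSG:5070',             // equal-area CRS for CONUS (assumed)
  maxPixels: 1e13,              // raise the default pixel-count limit
  pyramidingPolicy: {'.default': 'sample'}  // avoid interpolating categories
});
```

If a single country-wide task still hits memory limits, a common workaround is to split the region into tiles, export each tile as its own asset, and mosaic them afterwards.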
Hi @agataelia
Hi @jdbcode, great, thank you!
Note on allowing the user to choose the resampling method (don't resample in the change mapping app):
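One way to implement the note above is to make resampling opt-in rather than always applied. A minimal sketch, where `params.resample` is a hypothetical option name:

```javascript
// Hypothetical sketch: expose resampling as an option instead of always
// applying bicubic. The `params.resample` option name is illustrative.
function prepareForDisplay(img, params) {
  // Only resample when explicitly requested; otherwise keep the default
  // nearest-neighbor behavior, which avoids the smoothing artifacts
  // discussed in this thread.
  if (params && params.resample) {
    img = img.resample(params.resample);  // e.g. 'bilinear' or 'bicubic'
  }
  return img;
}
```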
The features on both the change mapper and the time series pages are amazing, and they have been useful to a resource manager who is interested in tracking change over the last 30 years. He has asked me what the scale of these datasets is. On the time series page, the pixel size appears to be 30 meters, but on the change mapper site, the Year of Detection layer appears to have better than one-foot precision (in fact, all three layers show this effect). I think this is some kind of artifact but don't know what's going on. Could that explanation be added to the "click here for information" link that is already on the right-hand side?