
track_sensor() with multi_pulse = TRUE #392

Closed
lucas-johnson opened this issue Nov 23, 2020 · 32 comments
Labels: Bug (a bug in the package)


lucas-johnson commented Nov 23, 2020

With UserData properly populated and multi_pulse = TRUE, I don't think this should raise an error.

track_sensor(ctg, Roussel2020(), multi_pulse = T, thin_pulse_with_time = 0)
#> Processing [===========================================] 100% (27/27) eta:  0s
#> An error occurred when processing the chunk 27. Try to load this chunk with:
#> chunk <- readRDS("/tmp/RtmpjzsRxH/chunk27.rds")
#> las <- readLAS(chunk)
#> After keeping only first and last returns of multiple returns pulses, 1 pulses still have more than 2 points. This dataset is corrupted and gpstime is likely to be invalid.
#> Error in .bboxCoords(coords) : nrow(coords) > 0 is not TRUE
chunk <- readRDS("/tmp/RtmpjzsRxH/chunk27.rds")
las <- readLAS(chunk)
las@data[, problem := .N > NumberOfReturns, by = "gpstime"]
las@data
#>           X       Y     Z   gpstime ReturnNumber NumberOfReturns ScanAngle
#> 1: 559503.7 4876494 13.98 114978726            1               2     0.996
#> 2: 559503.5 4876494  0.05 114978726            2               2     0.996
#> 3: 559509.7 4876413 21.37 114978726            1               3     0.996
#> 4: 559509.6 4876413 14.51 114978726            2               3     0.996
#> 5: 559509.5 4876413  8.08 114978726            3               3     0.996
#>    UserData PointSourceID buffer problem
#> 1:        0           214      0    TRUE
#> 2:        0           214      0    TRUE
#> 3:        1           214      0    TRUE
#> 4:        1           214      0    TRUE
#> 5:        1           214      0    TRUE
@Jean-Romain (Collaborator)

The chunk has only 5 points?


lucas-johnson commented Nov 23, 2020

I guess so... That's the chunk that the error message pointed me to. If I read the 27th tile in the catalog (chunk size is 0) there are actually 12.51 million points. So I'm not sure what's going on here.

ctg$filename
#>  [1] "/tmp/u_3585072950_2018.las" "/tmp/u_3585073100_2018.las"
#>  [3] "/tmp/u_3600072800_2018.las" "/tmp/u_3600072950_2018.las"
#>  [5] "/tmp/u_3600073100_2018.las" "/tmp/u_3615072650_2018.las"
#>  [7] "/tmp/u_3615072800_2018.las" "/tmp/u_3615072950_2018.las"
#>  [9] "/tmp/u_3615073100_2018.las" "/tmp/u_3630072050_2018.las"
#> [11] "/tmp/u_3630072200_2018.las" "/tmp/u_3630072350_2018.las"
#> [13] "/tmp/u_3630072500_2018.las" "/tmp/u_3630072650_2018.las"
#> [15] "/tmp/u_3630072800_2018.las" "/tmp/u_3630072950_2018.las"
#> [17] "/tmp/u_3630073100_2018.las" "/tmp/u_3645071900_2018.las"
#> [19] "/tmp/u_3645072050_2018.las" "/tmp/u_3645072200_2018.las"
#> [21] "/tmp/u_3645072350_2018.las" "/tmp/u_3645072500_2018.las"
#> [23] "/tmp/u_3645072650_2018.las" "/tmp/u_3645072800_2018.las"
#> [25] "/tmp/u_3645072950_2018.las" "/tmp/u_3645073100_2018.las"
#> [27] "/tmp/u_5595087500_2015.las"
readLAS(ctg$filename[27])
#> class        : LAS (v1.4 format 6)
#> memory       : 1.1 Gb 
#> extent       : 559500, 561000, 4875000, 4876500 (xmin, xmax, ymin, ymax)
#> coord. ref.  : +proj=utm +zone=18 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs 
#> area         : 2.25 km²
#> points       : 12.51 million points
#> density      : 5.56 points/m²
opt_chunk_size(ctg)
#> [1] 0


lucas-johnson commented Nov 23, 2020

las_27 <- readLAS(ctg$filename[27])
las_27@data[, problem := .N > NumberOfReturns, by = c("gpstime", "UserData")]
las_27@data[problem == T, ]
#> Empty data.table (0 rows and 19 cols): X,Y,Z,gpstime,Intensity,ReturnNumber...

@Jean-Romain (Collaborator)

Can you show me the gpstime of the 5 points with more digits?


lucas-johnson commented Nov 23, 2020

options(digits = 22)
las@data
#>                          X                       Y                          Z
#> 1: 559503.7399999999906868 4876493.730000000447035 13.98000000000000042632564
#> 2: 559503.5300000000279397 4876494.030000000260770  0.05000000000000000277556
#> 3: 559509.6900000000605360 4876412.950000000186265 21.37000000000000099475983
#> 4: 559509.6199999999953434 4876412.820000000298023 14.50999999999999978683718
#> 5: 559509.5400000000372529 4876412.709999999962747  8.08000000000000007105427
#>                    gpstime ReturnNumber NumberOfReturns
#> 1: 114978725.6651494503021            1               2
#> 2: 114978725.6651494503021            2               2
#> 3: 114978725.6651494503021            1               3
#> 4: 114978725.6651494503021            2               3
#> 5: 114978725.6651494503021            3               3
#>                   ScanAngle UserData PointSourceID buffer problem
#> 1: 0.9959999918937683105469        0           214      0    TRUE
#> 2: 0.9959999918937683105469        0           214      0    TRUE
#> 3: 0.9959999918937683105469        1           214      0    TRUE
#> 4: 0.9959999918937683105469        1           214      0    TRUE
#> 5: 0.9959999918937683105469        1           214      0    TRUE

@Jean-Romain (Collaborator)

OK, I think I got it. You have two different pulses, but both have the exact same gpstime. Internally, pulse IDs are assigned by gpstime, but that does not work in your case, where two different pulses legitimately have the same gpstime. multi_pulse was added when you reported an issue with this kind of sensor, but it was never tested because I don't have such data.

I will fix that tomorrow.
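The failure mode described above can be sketched in a language-neutral way. The following is a hypothetical Python illustration (not lidR's actual R internals; the function names and the `channel` field standing in for UserData are invented) of why keying pulses on gpstime alone collapses two simultaneous pulses from a dual-channel sensor into one bogus pulse:

```python
# Hypothetical sketch (not lidR's code): assigning pulse IDs by gpstime alone
# vs. by (gpstime, channel) for a dual-channel (multi-pulse) sensor.

def pulse_ids_by_time(points):
    """One pulse ID per distinct gpstime."""
    ids, mapping = [], {}
    for p in points:
        mapping.setdefault(p["gpstime"], len(mapping))
        ids.append(mapping[p["gpstime"]])
    return ids

def pulse_ids_by_time_and_channel(points):
    """One pulse ID per (gpstime, channel) pair."""
    ids, mapping = [], {}
    for p in points:
        key = (p["gpstime"], p["channel"])
        mapping.setdefault(key, len(mapping))
        ids.append(mapping[key])
    return ids

# Two pulses fired at the exact same time by two channels (UserData 0 and 1),
# mirroring the 5-point chunk in this issue:
points = [
    {"gpstime": 114978725.665, "channel": 0},  # pulse A, return 1/2
    {"gpstime": 114978725.665, "channel": 0},  # pulse A, return 2/2
    {"gpstime": 114978725.665, "channel": 1},  # pulse B, return 1/3
    {"gpstime": 114978725.665, "channel": 1},  # pulse B, return 2/3
    {"gpstime": 114978725.665, "channel": 1},  # pulse B, return 3/3
]

print(pulse_ids_by_time(points))              # [0, 0, 0, 0, 0] -> one bogus 5-return pulse
print(pulse_ids_by_time_and_channel(points))  # [0, 0, 1, 1, 1] -> two valid pulses
```

Grouping by time alone yields a single "pulse" with 5 returns, which is exactly the "more than 2 points after keeping first and last returns" error seen here.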

@Jean-Romain Jean-Romain self-assigned this Nov 23, 2020
@Jean-Romain Jean-Romain added the Bug A bug in the package label Nov 23, 2020
@lucas-johnson (Author)

Thank you so much!


Jean-Romain commented Nov 23, 2020

I just read the code. Actually my code is correct and handles multiple pulses properly, but you found an edge case. The points are reordered, so sensor 1 and sensor 2 are grouped in memory and there is a time gap between the two groups. But with only 5 points, the two groups are no longer separated by a time gap, leading to a false pulse ID.
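The gap-based grouping described above can be sketched as follows. This is a hypothetical Python illustration (not lidR's actual code; `group_by_time_gap` and the threshold value are invented): after sorting, a new group starts whenever the time jump between consecutive points exceeds a threshold, which fails when the two sensors' runs share the same gpstime:

```python
# Hypothetical sketch of gap-based pulse grouping (not lidR's implementation).

def group_by_time_gap(times, gap=1e-6):
    """Start a new group whenever consecutive times differ by more than `gap`."""
    ids, current = [], 0
    for i, t in enumerate(times):
        if i > 0 and t - times[i - 1] > gap:
            current += 1
        ids.append(current)
    return ids

# Normal case: the two sensors' runs are separated by a clear time gap.
ok = [10.0, 10.0, 37.5, 37.5, 37.5]
print(group_by_time_gap(ok))   # [0, 0, 1, 1, 1] -> two pulses, correct

# Edge case from this issue: two pulses with identical gpstime, no gap at all.
bad = [10.0, 10.0, 10.0, 10.0, 10.0]
print(group_by_time_gap(bad))  # [0, 0, 0, 0, 0] -> one false pulse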


lucas-johnson commented Nov 23, 2020

OK, I'm not sure I understand entirely. You're saying that you sort on UserData and assume that there will be a separation in gpstime between the two groups? And in this case, with 5 points, there is no separation in gpstime between the groups?


Jean-Romain commented Nov 23, 2020

Fixed. But maybe you should investigate why you have so few pulses with multiple returns. It may be a bug with thin_pulse_with_time = 0

@Jean-Romain (Collaborator)

I also fixed a bug with thin_pulse_with_time = 0 for LAScatalog

@lucas-johnson (Author)

It seems the same issue is still present:

ctg <- readLAScatalog("/tmp")
opt_chunk_buffer(ctg) <- 100
opt_filter(ctg) <- "-drop_user_data 2 -drop_class 7 18 -drop_withheld"
sensor_positions <- track_sensor(ctg, Roussel2020(), multi_pulse = T, thin_pulse_with_time=0)
#> Processing [=>------------------------------------------]   4% (1/26) eta: 27s
#> Error: After keeping only first and last returns of multiple returns pulses, 2 pulses still have more than 2 points. This dataset is corrupted and gpstime is likely to be invalid.
las <- readLAS(ctg$filename[1])
any(las@data[UserData != 2, .N > NumberOfReturns, by = c("gpstime", "UserData")]$V1)
#> [1] FALSE

@Jean-Romain Jean-Romain reopened this Nov 23, 2020
@Jean-Romain (Collaborator)

Read chunk 1 (not file 1) and show me what it contains. Only multiple returns should be loaded, so it should not contain too many points.

@lucas-johnson (Author)

check <- unlist(lapply(ctg$filename, function(x) {
       las <- readLAS(x)
       any(las@data[UserData != 2, .N > NumberOfReturns, by = c("gpstime", "UserData")]$V1)
}))
any(check)
#> [1] FALSE
packageVersion("lidR")
#> [1] ‘3.0.5’

@Jean-Romain (Collaborator)

Indeed, when the first chunk fails, it is assumed to be a problem with the source code rather than with a specific chunk exhibiting a weird pattern.

Use

chunks = catalog_makechunks(ctg)
las = readLAS(chunks[[1]])

@lucas-johnson (Author)

chunks = catalog_makechunks(ctg)
las = readLAS(chunks[[1]])
las@data
#>                X       Y      Z   gpstime Intensity ReturnNumber
#>      1: 359940.4 4730365 116.23 208539889       106            1
#>      2: 359711.7 4730417 116.19 208539889       136            1
#>      3: 359711.6 4730417 116.31 208539889       561            1
#>      4: 359814.2 4730590 116.31 208539892       210            1
#>      5: 359824.8 4730592 116.27 208539892       264            1
#>     ---                                                         
#> 118078: 360096.8 4731019 116.26 208539900       681            1
#> 118079: 360097.6 4731019 116.24 208539900       137            1
#> 118080: 360098.4 4731019 116.24 208539900       213            1
#> 118081: 360099.1 4731019 116.25 208539900       322            1
#> 118082: 360099.9 4731019 116.29 208539900       238            1
#>         NumberOfReturns ScanDirectionFlag EdgeOfFlightline Classification
#>      1:               1                 0                1              9
#>      2:               1                 0                1              1
#>      3:               1                 0                1              9
#>      4:               1                 0                1              9
#>      5:               1                 0                1              9
#>     ---                                                                  
#> 118078:               1                 0                0              9
#> 118079:               1                 0                0              9
#> 118080:               1                 0                0              9
#> 118081:               1                 0                0              9
#> 118082:               1                 0                0              9
#>         ScannerChannel Synthetic_flag Keypoint_flag Withheld_flag Overlap_flag
#>      1:              1          FALSE         FALSE         FALSE        FALSE
#>      2:              1          FALSE         FALSE         FALSE        FALSE
#>      3:              1          FALSE         FALSE         FALSE        FALSE
#>      4:              1          FALSE         FALSE         FALSE        FALSE
#>      5:              1          FALSE         FALSE         FALSE        FALSE
#>     ---                                                                       
#> 118078:              2          FALSE         FALSE         FALSE        FALSE
#> 118079:              2          FALSE         FALSE         FALSE        FALSE
#> 118080:              2          FALSE         FALSE         FALSE        FALSE
#> 118081:              2          FALSE         FALSE         FALSE        FALSE
#> 118082:              2          FALSE         FALSE         FALSE        FALSE
#>         ScanAngle UserData PointSourceID buffer
#>      1:   -12.996        0            48      0
#>      2:   -19.998        0            48      0
#>      3:   -19.998        0            48      0
#>      4:   -16.998        0            48      0
#>      5:   -16.998        0            48      0
#>     ---                                        
#> 118078:    -7.998        0            48      4
#> 118079:    -7.998        0            48      4
#> 118080:    -7.998        0            48      4
#> 118081:    -7.998        0            48      4
#> 118082:    -7.998        0            48      4
las@data[, problem := .N > NumberOfReturns, by = c("gpstime", "UserData")]
options(digits = 22)
las@data[problem == T & UserData != 2][order(gpstime)]
#>                          X                       Y                       Z
#> 1: 360000.0000000000000000 4730642.030000000260770 117.4300000000000068212
#> 2: 360000.0000000000000000 4730642.030000000260770 117.4300000000000068212
#> 3: 360000.0000000000000000 4730677.089999999850988 117.7900000000000062528
#> 4: 360002.5900000000256114 4730676.259999999776483 130.4499999999999886313
#> 5: 360000.0000000000000000 4730677.089999999850988 117.7900000000000062528
#> 6: 360000.0000000000000000 4730707.320000000298023 118.0199999999999960210
#> 7: 360002.2500000000000000 4730706.610000000335276 128.9900000000000090949
#> 8: 360001.9899999999906868 4730706.690000000409782 127.7100000000000079581
#> 9: 360000.0000000000000000 4730707.320000000298023 118.0199999999999960210
#>10: 360000.0000000000000000 4730642.700000000186265 117.4099999999999965894
#>11: 360000.0000000000000000 4730642.700000000186265 117.4099999999999965894
#>12: 360000.0000000000000000 4730643.980000000447035 117.3900000000000005684
#>13: 360000.0000000000000000 4730643.980000000447035 117.3900000000000005684
#>14: 360000.0000000000000000 4730713.870000000111759 117.6800000000000068212
#>15: 360000.0000000000000000 4730713.870000000111759 117.6800000000000068212
#>16: 360087.5700000000069849 4731000.000000000000000 116.2400000000000090949
#>17: 360087.5700000000069849 4731000.000000000000000 116.2400000000000090949
#>                    gpstime Intensity ReturnNumber NumberOfReturns
#> 1: 208539892.9715285599232      1679            1               1
#> 2: 208539892.9715285599232      1679            1               1
#> 3: 208539893.3646641373634       982            2               2
#> 4: 208539893.3646641373634       434            1               2
#> 5: 208539893.3646641373634       982            2               2
#> 6: 208539893.7471247911453      1213            3               3
#> 7: 208539893.7471247911453       503            1               3
#> 8: 208539893.7471247911453       443            2               3
#> 9: 208539893.7471247911453      1213            3               3
#>10: 208539895.1821788847446      1645            1               1
#>11: 208539895.1821788847446      1645            1               1
#>12: 208539895.2034289836884      1595            1               1
#>13: 208539895.2034289836884      1595            1               1
#>14: 208539896.1703459322453      1560            1               1
#>15: 208539896.1703459322453      1560            1               1
#>16: 208539897.7633522152901       152            1               1
#>17: 208539897.7633522152901       152            1               1
#>    ScanDirectionFlag EdgeOfFlightline Classification ScannerChannel
#> 1:                 0                0              2              1
#> 2:                 0                0              2              1
#> 3:                 0                0              2              1
#> 4:                 0                0              1              1
#> 5:                 0                0              2              1
#> 6:                 0                0              2              1
#> 7:                 0                0              1              1
#> 8:                 0                0              1              1
#> 9:                 0                0              2              1
#>10:                 0                0              2              2
#>11:                 0                0              2              2
#>12:                 0                0              2              2
#>13:                 0                0              2              2
#>14:                 0                0              2              2
#>15:                 0                0              2              2
#>16:                 0                0              9              1
#>17:                 0                0              9              1
#>    Synthetic_flag Keypoint_flag Withheld_flag Overlap_flag
#> 1:          FALSE         FALSE         FALSE        FALSE
#> 2:          FALSE         FALSE         FALSE        FALSE
#> 3:          FALSE         FALSE         FALSE        FALSE
#> 4:          FALSE         FALSE         FALSE        FALSE
#> 5:          FALSE         FALSE         FALSE        FALSE
#> 6:          FALSE         FALSE         FALSE        FALSE
#> 7:          FALSE         FALSE         FALSE        FALSE
#> 8:          FALSE         FALSE         FALSE        FALSE
#> 9:          FALSE         FALSE         FALSE        FALSE
#>10:          FALSE         FALSE         FALSE        FALSE
#>11:          FALSE         FALSE         FALSE        FALSE
#>12:          FALSE         FALSE         FALSE        FALSE
#>13:          FALSE         FALSE         FALSE        FALSE
#>14:          FALSE         FALSE         FALSE        FALSE
#>15:          FALSE         FALSE         FALSE        FALSE
#>16:          FALSE         FALSE         FALSE        FALSE
#>17:          FALSE         FALSE         FALSE        FALSE
#>                   ScanAngle UserData PointSourceID buffer problem
#> 1: -11.99400043487548828125        0            48      4    TRUE
#> 2: -11.99400043487548828125        0            48      4    TRUE
#> 3: -11.99400043487548828125        0            48      4    TRUE
#> 4: -11.99400043487548828125        0            48      4    TRUE
#> 5: -11.99400043487548828125        0            48      4    TRUE
#> 6: -11.99400043487548828125        0            48      4    TRUE
#> 7: -11.99400043487548828125        0            48      4    TRUE
#> 8: -11.99400043487548828125        0            48      4    TRUE
#> 9: -11.99400043487548828125        0            48      4    TRUE
#>10: -11.99400043487548828125        0            48      4    TRUE
#>11: -11.99400043487548828125        0            48      4    TRUE
#>12: -11.99400043487548828125        0            48      4    TRUE
#>13: -11.99400043487548828125        0            48      4    TRUE
#>14: -11.99400043487548828125        0            48      4    TRUE
#>15: -11.99400043487548828125        0            48      4    TRUE
#>16:  -8.99400043487548828125        0            48      4    TRUE
#>17:  -8.99400043487548828125        0            48      4    TRUE

@lucas-johnson (Author)

Well, it seems like UserData doesn't accurately identify a pulse.


lucas-johnson commented Nov 23, 2020

Sorry, I think this is a problem with the data. The code appears to have correctly produced an error. I should have checked the chunk.

Though I'm not sure why this test wouldn't have flagged something...


Jean-Romain commented Nov 23, 2020

Redo the same but with "-drop_single". This filter is added automatically in track_sensor() but is missing here:

opt_filter(ctg) <- "-drop_single -drop_user_data 2 -drop_class 7 18 -drop_withheld"
chunks = catalog_makechunks(ctg)
las = readLAS(chunks[[1]])
las@data[, .(X, Y, Z, gpstime, UserData, PointSourceID, ReturnNumber, NumberOfReturns)]

There is only one pulse that looks incorrect: four points with the same gpstime but NumberOfReturns = 3, and the same UserData = 0.

#>                    gpstime Intensity ReturnNumber NumberOfReturns
#> 6: 208539893.7471247911453      1213            3               3
#> 7: 208539893.7471247911453       503            1               3
#> 8: 208539893.7471247911453       443            2               3
#> 9: 208539893.7471247911453      1213            3               3

*Edit: two are incorrect actually:

#>                    gpstime Intensity ReturnNumber NumberOfReturns
#> 3: 208539893.3646641373634       982            2               2
#> 4: 208539893.3646641373634       434            1               2
#> 5: 208539893.3646641373634       982            2               2

And all the single returns are duplicated. This is not an issue because they are dropped on the fly, but they look to be invalid points.
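The duplicate pattern being discussed can be sketched as follows; this is a hypothetical Python illustration (not lidR's implementation, and `duplicated_xyz` is an invented helper) of flagging points that share exact XYZ coordinates, the kind of check las_check() reports:

```python
# Hypothetical sketch: flag points whose exact (X, Y, Z) occurs more than once.
from collections import Counter

def duplicated_xyz(points):
    """Return a boolean per point: True if its (X, Y, Z) is not unique."""
    counts = Counter((p[0], p[1], p[2]) for p in points)
    return [counts[(p[0], p[1], p[2])] > 1 for p in points]

# Values mirroring the problematic points shown earlier in this thread:
pts = [
    (360000.00, 4730642.03, 117.43),
    (360000.00, 4730642.03, 117.43),   # exact duplicate of the previous point
    (360002.59, 4730676.26, 130.45),
]
print(duplicated_xyz(pts))  # [True, True, False]
```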


Jean-Romain commented Nov 23, 2020

Actually, point 6 is also a duplicate of point 9, and point 3 is a duplicate of point 5.

In conclusion, you have duplicates. I'll let you investigate, and I'm closing because my tests played their role perfectly 😉

@lucas-johnson (Author)

Ok thanks so much for looking. Agreed your tests were correct and did their job!


Jean-Romain commented Nov 23, 2020

Can you please check that las_check() reports pulse issues for your data?

@lucas-johnson (Author)

las <- readLAS(chunks[[1]])
las_check(las)

#> Checking the data
#>  - Checking coordinates... ✓
#>  - Checking coordinates type... ✓
#>  - Checking coordinates quantization... ✓
#>  - Checking attributes type... ✓
#>  - Checking ReturnNumber validity... ✓
#>  - Checking NumberOfReturns validity... ✓
#>  - Checking ReturnNumber vs. NumberOfReturns... ✓
#>  - Checking RGB validity... ✓
#>  - Checking absence of NAs... ✓
#>  - Checking duplicated points...
#>   ⚠ 8 points are duplicated and share XYZ coordinates with other points
#>  - Checking degenerated ground points...
#>   ⚠ There were 6 degenerated ground points. Some X Y Z coordinates were repeated.
#>  - Checking attribute population...
#>   ⚠ 'ScanDirectionFlag' attribute is not populated.
#>  - Checking gpstime incoherances
#>    ✗ 7 pulses (points with the same gpstime) have points with identical ReturnNumber
#>  - Checking flag attributes... ✓
#>  - Checking user data attribute... ✓
#> Checking the header
#>  - Checking header completeness... ✓
#>  - Checking scale factor validity... ✓
#>  - Checking point data format ID validity... ✓
#>  - Checking extra bytes attributes validity... ✓
#>  - Checking the bounding box validity... ✓
#>  - Checking coordinate reference sytem... ✓
#> Checking header vs data adequacy
#>  - Checking attributes vs. point format... ✓
#>  - Checking header bbox vs. actual content... ✓
#>  - Checking header number of points vs. actual content... ✓
#>  - Checking header return number vs. actual content... ✓
#> Checking preprocessing already done 
#>  - Checking ground classification... yes
#>  - Checking normalization... no
#>  - Checking negative outliers... ✓
#>  - Checking flightline classification... yes

@Jean-Romain (Collaborator)

So las_check() already contains some pulse error diagnostics:

✗ 7 pulses (points with the same gpstime) have points with identical ReturnNumber

But the diagnostic may be incorrect for multi-pulse systems.


lucas-johnson commented Nov 23, 2020

Very nice. Still one question: why would this check fail to show problematic points? It seems there is no problem within each LAS file, but when we chunk the catalog we find problems. Can you explain this?


Jean-Romain commented Nov 23, 2020

This is indeed weird. I can see two explanations:

  1. A bug when reading and merging multiple files. This seems unlikely to me, but who knows...
  2. Your point cloud contains duplicated points, but they are separated into different files.

The second explanation seems more likely to me because all your problematic points are located in the buffer (buffer = 4, i.e. on the right) very close to the edge of the processed region.


lucas-johnson commented Nov 23, 2020

OK, based on our analysis above, it's clear that the duplicated points are the problem. And if we assume the second explanation is correct, how does a point with the same X, Y, Z end up in 2 different files? The files must have spatial overlap then, but I get no warning about that when I read in the catalog. Or is the spatial overlap so small that it is not registered?

In any case, if 2 points with the same X, Y, Z end up in different files, this is a problem with the data and not with lidR. Thanks again for all of your help. I really, really appreciate it.


Jean-Romain commented Nov 23, 2020

You are right, and I don't know what to say.

I noticed that some points have round X coordinates (360000.00). Those points could be duplicated across two tiles without a warning. But there remain 5 points that I can't explain.

Also, you should check whether your tiles overlap. If the overlaps are very small, it is possible that no warning is triggered because of the numerical tolerance introduced to avoid false positives.

@lucas-johnson (Author)

No overlap.

is.overlapping(ctg)
#> [1] FALSE
spdf <- as.spatial(ctg)
countour <- rgeos::gUnaryUnion(spdf)
actual_area <- countour@polygons[[1]]@area
measured_area <- area(ctg)
measured_area
#> [1] 56786635.590801797807
actual_area
#> [1] 56786635.590801879764
actual_area - measured_area
#> [1] 0.000000081956386566162109375

@Jean-Romain (Collaborator)

Yes, there is overlap, because actual_area > measured_area. Not a lot. Try to plot the contour.

@lucas-johnson (Author)

I was basing this on the code here:
https://github.com/Jean-Romain/lidR/blob/16f7540e282a1b218e35e1959d762c01cf7863f0/R/utils_is.R#L84
which says overlap is when actual_area < measured_area, right?
Anyway:

spdf <- as.spatial(ctg)
countour <- rgeos::gUnaryUnion(spdf)
plot(countour)

[contour plot image]


Jean-Romain commented Nov 23, 2020

The image shows that you have overlaps. I think some of them may be 0 cm overlaps. In theory you shouldn't have overlaps: if we assume the tiles were cut at 360000.00, the bounding box of the tile on the left is something like 359999.99 and on the right 360000.01, because it is unlikely to have a point exactly at 360000.00. And even if you do have a point at exactly 360000.00, the cut should be computed with a half-open interval so that points at 360000.00 are put either on the right or on the left, but not both. In your case I think this is not respected.

But maybe some overlaps are bigger.
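The half-open tiling rule described above can be sketched like this. It is a hypothetical Python illustration (an assumption about how a tiling tool should behave, not code from any specific tool; `tile_index` and the origin/size values are invented): a point exactly on a tile boundary lands in exactly one tile, never in both:

```python
# Hypothetical sketch: half-open tiling [origin + k*size, origin + (k+1)*size),
# so a boundary point is assigned to exactly one tile.

def tile_index(x, origin, tile_size):
    """Index of the tile containing x under half-open intervals."""
    return int((x - origin) // tile_size)

# Boundary point at exactly 360000.0 with 1500 m tiles starting at 358500.0:
print(tile_index(359999.99, 358500.0, 1500.0))  # 0 -> left tile
print(tile_index(360000.00, 358500.0, 1500.0))  # 1 -> right tile only
print(tile_index(360000.01, 358500.0, 1500.0))  # 1
```

If a tiling tool instead used closed intervals on both sides, the point at exactly 360000.00 would be written to both tiles, producing precisely the cross-file XYZ duplicates observed in this issue.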
