Rename aloft bucket to aloftdata #622

Merged · 2 commits · Jul 21, 2023
Changes from all commits
12 changes: 6 additions & 6 deletions NEWS.md
@@ -14,19 +14,19 @@ bioRad 0.7 includes a major backend overhaul that deprecates the use of Docker.

* Faster implementations of functions previously dependent on Docker, such as `calculate_vp()`, `apply_mistnet()` and `read_pvolfile()`.

* Support for reading [VPTS CSV](https://aloftdata.eu/vpts-csv/) format through updated function `read_vpts()`. VPTS CSV table schema included to allow offline parsing of VPTS CSV files as a [frictionless](https://CRAN.R-project.org/package=frictionless) data package (#551, #590)
* Support for reading [VPTS CSV](https://aloftdata.eu/vpts-csv/) format through updated function `read_vpts()`. VPTS CSV table schema included to allow offline parsing of VPTS CSV files as a [frictionless](https://CRAN.R-project.org/package=frictionless) data package (#551, #590).

* Updated function `read_vpts()` supports reading `vp`/`vpts` data in ODIM HDF, [VPTS CSV](https://aloftdata.eu/vpts-csv/) format (#551, #590)
* Updated function `read_vpts()` supports reading `vp`/`vpts` data in ODIM HDF and [VPTS CSV](https://aloftdata.eu/vpts-csv/) format (#551, #590).

* New function `list_vpts_aloft()` produces a list of [aloft](https://aloftdata.eu/browse/) archive URLs for time series of vertical profiles (`vpts`). This list of URLs can then be used to bulk download data using any number of external tools (#553).

* New function `read_stdout()` replaces previous functionality of `read_vpts()` to read vol2bird stdout format. It also has a new `sep` argument (#536) to support both fixed-delimited and comma-separated stdout data.

* New function `as.vpts` converts a data.frame originating from a VPTS CSV file into a vpts object (#555). Inverse operation of as.data.frame.vpts
* New function `as.vpts` converts a data.frame originating from a VPTS CSV file into a vpts object (#555). Inverse operation of `as.data.frame.vpts()`.

* `read_pvolfiles()` now allows ODIM H5 files with missing `source` attribute. The functionality is similar to `read_vpfiles()`, i.e. extracting the NOD, RAD or WMO identifier, otherwise using `unknown` (2f6935c).

* `bind_into_vpts()` now works for vp and vpts objects with different heights (#343)
* `bind_into_vpts()` now works for vp and vpts objects with different heights (#343).

* Faster parallel mistnet runs (https://github.com/adokter/vol2birdR/issues/16).

@@ -42,9 +42,9 @@ bioRad 0.7 includes a major backend overhaul that deprecates the use of Docker.

* Function `vol2bird_version()` has been migrated to package vol2birdR and can be accessed by `vol2birdR::vol2bird_version()`.

* Dependency `maptools` has been replaced with [suntools](https://github.com/adokter/suntools), `rgdal` has been removed in accordance with the evolution of `sp` and the [imminent archiving](https://r-spatial.org/r/2023/05/15/evolution4.html) of `rgdal`
* Dependency `maptools` has been replaced with [suntools](https://github.com/adokter/suntools), `rgdal` has been removed in accordance with the evolution of `sp` and the [imminent archiving](https://r-spatial.org/r/2023/05/15/evolution4.html) of `rgdal`.

* Function `as.data.frame.vpts()` has output column names `lat`, `lon`, `antenna_height` renamed to `radar_latitude`, `radar_longitude`, `radar_height` for compatibility with the [VPTS CSV](https://aloftdata.eu/vpts-csv/) data format. The function also outputs an additional column `radar_wavelength` (#609)
* Function `as.data.frame.vpts()` has output column names `lat`, `lon`, `antenna_height` renamed to `radar_latitude`, `radar_longitude`, `radar_height` for compatibility with the [VPTS CSV](https://aloftdata.eu/vpts-csv/) data format. The function also outputs an additional column `radar_wavelength` (#609).

# bioRad 0.6.1

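The NEWS entry above describes `list_vpts_aloft()` only in prose. As a minimal sketch of the idea — assuming the monthly VPTS CSV key layout visible in this PR's test URLs (`{source}/monthly/{radar}/{year}/{radar}_vpts_{yyyymm}.csv.gz` on the renamed `aloftdata` bucket) — URL enumeration for a date range could look like this. The helper name is hypothetical, not the bioRad API:

```python
# Hypothetical sketch, not bioRad code: enumerate monthly VPTS CSV archive
# URLs on the renamed "aloftdata" bucket for one radar and date range.
# Bucket name, region, and key layout are taken from the changed lines in
# this PR; the function itself is illustrative.
from datetime import date


def monthly_vpts_urls(radar: str, start: date, end: date,
                      source: str = "baltrad") -> list[str]:
    """List monthly archive URLs between two dates (months inclusive)."""
    base = "https://aloftdata.s3-eu-west-1.amazonaws.com"
    urls = []
    y, m = start.year, start.month
    while (y, m) <= (end.year, end.month):
        urls.append(
            f"{base}/{source}/monthly/{radar}/{y}/{radar}_vpts_{y}{m:02d}.csv.gz"
        )
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    return urls


urls = monthly_vpts_urls("bejab", date(2023, 3, 1), date(2023, 4, 30))
```

A list like this can then be fed to any external bulk-download tool (e.g. `wget -i urls.txt`), which is the workflow the NEWS entry refers to.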
6 changes: 3 additions & 3 deletions R/list_vpts_aloft.R
@@ -94,7 +94,7 @@ list_vpts_aloft <- function(date_min = NULL,

## set static urls --------------------------------------------------------
# Set base URL
base_url <- "https://aloft.s3-eu-west-1.amazonaws.com"
base_url <- "https://aloftdata.s3-eu-west-1.amazonaws.com"

# format csv --------------------------------------------------------------
if (format == "csv") {
@@ -104,7 +104,7 @@

found_vpts_aloft <-
aws.s3::get_bucket_df(
bucket = "s3://aloft",
bucket = "s3://aloftdata",
prefix = glue::glue("{source}/monthly"),
region = "eu-west-1",
max = Inf
@@ -124,7 +124,7 @@
} else {
# hdf5 files
# TODO: create file paths of form
# https://aloft.s3-eu-west-1.amazonaws.com/baltrad/hdf5/bejab/2023/05/02/bejab_vp_20230502T000000Z_0x9.h5
# https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/hdf5/bejab/2023/05/02/bejab_vp_20230502T000000Z_0x9.h5
}

# format found data -------------------------------------------------------
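The TODO in the hunk above leaves the HDF5 file paths unimplemented. A minimal sketch, assuming the key layout shown in the TODO's example URL (`{source}/hdf5/{radar}/{yyyy}/{mm}/{dd}/{radar}_vp_{timestamp}Z_{suffix}.h5`); the helper names are illustrative, not bioRad code:

```python
# Hypothetical sketch of the HDF5 object path the TODO describes, under the
# renamed "aloftdata" bucket. Only the layout of the example URL in the TODO
# is assumed; the suffix ("0x9") varies per file in practice.
def aloft_https_url(key: str,
                    bucket: str = "aloftdata",
                    region: str = "eu-west-1") -> str:
    """Build the HTTPS URL for an object in the aloft data bucket."""
    return f"https://{bucket}.s3-{region}.amazonaws.com/{key}"


def hdf5_key(radar: str, yyyymmdd: str, hhmmss: str,
             suffix: str = "0x9", source: str = "baltrad") -> str:
    """Key for a single vertical-profile HDF5 file: source/hdf5/radar/YYYY/MM/DD/..."""
    yyyy, mm, dd = yyyymmdd[:4], yyyymmdd[4:6], yyyymmdd[6:8]
    return (f"{source}/hdf5/{radar}/{yyyy}/{mm}/{dd}/"
            f"{radar}_vp_{yyyymmdd}T{hhmmss}Z_{suffix}.h5")


url = aloft_https_url(hdf5_key("bejab", "20230502", "000000"))
```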
21 changes: 10 additions & 11 deletions tests/testthat/test-read_vpts.R
@@ -1,14 +1,13 @@
# Define the URLs of test files
urls <- c(
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T000000Z_0xb.h5",
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T000500Z_0xb.h5",
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T001000Z_0xb.h5",
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/monthly/bejab/2023/bejab_vpts_202303.csv.gz",
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/monthly/bejab/2023/bejab_vpts_202304.csv.gz",
"https://aloft.s3-eu-west-1.amazonaws.com/baltrad/monthly/bewid/2023/bewid_vpts_202303.csv.gz"
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T000000Z_0xb.h5",
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T000500Z_0xb.h5",
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/hdf5/czbrd/2023/06/01/czbrd_vp_20230601T001000Z_0xb.h5",
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/monthly/bejab/2023/bejab_vpts_202303.csv.gz",
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/monthly/bejab/2023/bejab_vpts_202304.csv.gz",
"https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/monthly/bewid/2023/bewid_vpts_202303.csv.gz"
)


# Define the path to the new temporary directory
temp_dir <- tempdir()

@@ -111,7 +110,7 @@ test_that("read_vpts() can read local vp hdf5 files", {

test_that("read_vpts() returns error on multiple radars in vp hdf5 files", {
# add eehar h5
eehar <- "https://aloft.s3-eu-west-1.amazonaws.com/baltrad/hdf5/eehar/2023/06/01/eehar_vp_20230601T001000Z_0xb.h5"
eehar <- "https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/hdf5/eehar/2023/06/01/eehar_vp_20230601T001000Z_0xb.h5"
download_test_file(eehar, temp_dir, h5_dir, csv_dir)

h5_files <- list.files(temp_h5_dir, pattern = "*.h5", full.names = TRUE)
@@ -194,7 +193,7 @@ test_that("read_vpts() returns equal summaries from h5 and csv files from 1 day

# Get the files for the current prefix
h5_files <- aws.s3::get_bucket_df(
bucket = "s3://aloft/",
bucket = "s3://aloftdata/",
prefix = prefix,
region = "eu-west-1"
)
@@ -204,7 +203,7 @@ test_that("read_vpts() returns equal summaries from h5 and csv files from 1 day
aws.s3::save_object(
file = paste0(h5_dir, "/", basename(file_name)),
object = file_name,
bucket = "s3://aloft/",
bucket = "s3://aloftdata/",
region = "eu-west-1"
)
})
@@ -216,7 +215,7 @@ test_that("read_vpts() returns equal summaries from h5 and csv files from 1 day

# VPTS CSV

urls <- c("https://aloft.s3-eu-west-1.amazonaws.com/baltrad/daily/bewid/2023/bewid_vpts_20230414.csv")
urls <- c("https://aloftdata.s3-eu-west-1.amazonaws.com/baltrad/daily/bewid/2023/bewid_vpts_20230414.csv")

# Use lapply to download each file to a temporary location
csv_files <- lapply(urls, function(url) {
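For reference, the two address forms appearing in this PR — the `s3://aloftdata` bucket URI passed to `aws.s3`, and the `https://aloftdata.s3-eu-west-1.amazonaws.com` URLs in the tests — point at the same objects. A small sketch of that mapping, using the daily VPTS CSV from the test above; the helper is hypothetical:

```python
# Hypothetical sketch: map the s3:// bucket URI used with aws.s3 to the
# region-specific HTTPS endpoint used in the test URLs. Bucket name and
# region ("eu-west-1") are taken from this PR's changed lines.
def https_endpoint(s3_uri: str, region: str) -> str:
    """Turn an s3://bucket URI into its virtual-hosted HTTPS endpoint."""
    bucket = s3_uri.removeprefix("s3://").strip("/")
    return f"https://{bucket}.s3-{region}.amazonaws.com"


base = https_endpoint("s3://aloftdata", "eu-west-1")
# Daily VPTS CSV key layout, as used in the test file above:
url = f"{base}/baltrad/daily/bewid/2023/bewid_vpts_20230414.csv"
```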