Title: | Tools for Synoptic Environmental Spatial Data |
---|---|
Description: | Tools for reading, plotting and manipulating spatial data, particularly from remote sensing and model output. |
Authors: | Michael D. Sumner [aut, cre], Ben Raymond [ctb] |
Maintainer: | Michael D. Sumner <[email protected]> |
License: | GPL-3 |
Version: | 0.7.0.5 |
Built: | 2024-11-13 03:21:04 UTC |
Source: | https://github.com/AustralianAntarcticDivision/raadtools |
Tools in R for reading, plotting and manipulating spatial data, originally used at the Australian Antarctic Division (AAD).
Read functions such as readsst
will read a data set by date-time vector, with a
set of shared arguments that work the same way and are documented against this dummy function.
Michael D. Sumner [email protected]
Options
There is an option 'raadtools.check.file.exists', TRUE by default; setting it to FALSE might make very long time series requests a bit faster.
options(raadtools.check.file.exists = FALSE)
Maintainer: Michael D. Sumner [email protected]
Raw list of all files.
allfiles(...)
... | reserved for future use, currently ignored
data.frame with column fullname containing all file paths
AMPS files
amps_d1_icefiles(data.source = "", time.resolution = "12hourly", ...)
data.source | ignored, reserved for future use
time.resolution | time resolution data to read, 6hourly only for now
... | reserved for future use, currently ignored
GRIB format metadata from the Antarctic Mesoscale Prediction System (AMPS) files.
amps_metadata
A data frame with 8 columns. The columns represent:
Band | the band number
GRIB_COMMENT | the data measured
GRIB_ELEMENT | the data name
GRIB_FORECAST_SECONDS | forecast seconds
GRIB_REF_TIME | text reference time
GRIB_SHORT_NAME | short name
GRIB_UNIT | text unit
GRIB_VALID_TIME | text valid time
print(amps_metadata)
## u_wind_at_900 <- read  ## unfinished
Voyage track data from the Aurora Australis
aurora
A data frame with 3 columns. The columns represent:
LONGITUDE_DEGEAST | Longitude values
LATITUDE_DEGNORTH | Latitude values
DATE_TIME_UTC | Date-time values (POSIXct)
This is a sample of the "Aurora Australis Voyage 3 2012/13 Track and Underway Data".
## Not run: 
## These data were obtained like this
base <- "http://gcmd.gsfc.nasa.gov/KeywordSearch/RedirectAction.do?target"
b1 <- "=F4t70bSf87FLsT1TNxR9TSPS74xbHAdheLQcH5Z5PMmgzJ9t%2Bi%2FEs1e8Fl61"
b2 <- "MPhKjo9qxb2f9wyA%0D%0AoE1kjJ8AMcpFlMMRH7Z6umgNLsGMnWPeQdU7mZHMp%2"
b3 <- "FtqMpahIrde%2F%2B9%2FZWAkIFrh2bhIiNfl4I9J%0D%0A5KBX9g5Wf7I9JdOgqY"
b4 <- "bDdpj0iM1K%2BA%3D%3D"
aurora2013 <- read.csv(paste(base, b1, b2, b3, b4, collapse = ""),
                       stringsAsFactors = FALSE)
aurora2013$DATE_TIME_UTC <- as.POSIXct(aurora2013$DATE_TIME_UTC, tz = "GMT")
## get a daily sample
aurora <- aurora2013[, c("LONGITUDE_DEGEAST", "LATITUDE_DEGNORTH", "DATE_TIME_UTC")]
aurora <- aurora[!duplicated(format(aurora$DATE_TIME_UTC, "%Y-%j")), ]
aurora <- aurora[order(aurora$DATE_TIME_UTC), ]
save(aurora, file = "aurora.rda")
## End(Not run)
CCMP files
ccmp_files(time.resolution = "6hourly", ...)
time.resolution | time resolution data to read, daily or monthly
... | passed in to brick, primarily for
data frame of file paths
ccmp_files()
Ocean colour palette for chlorophyll-a.
chl.pal(x, palette = FALSE, alpha = 1)
x | a vector of data values or a single number
palette | logical, if
alpha | value in 0,1 to specify opacity
Flexible control of the chlorophyll-a palette. If x is a single number, the function returns that many colours evenly spaced from the palette. If x is a vector of multiple values, the palette is queried for colours matching those values, and these are returned. If x is missing and palette is FALSE, then a function is returned that will generate n evenly spaced colours from the palette, as per colorRampPalette.
colours, palette, or function, see Details
Derived from http://oceancolor.gsfc.nasa.gov/DOCS/palette_chl_etc.txt.
## Not run: 
chl <- readchla(xylim = c(100, 110, -50, -40))
## just get a small number of evenly spaced colours
plot(chl, col = chl.pal(10))
## store the full palette and work with values and colours
pal <- chl.pal()  ## the standard full palette
plot(chl, breaks = pal$breaks, col = pal$cols)
## a custom set of values with matching colours
plot(chl, col = chl.pal(pal$breaks[seq(1, length(pal$breaks), length = 10)]))
## any number of colours stored as a function
myfun <- chl.pal()
plot(chl, col = myfun(18))
## just n colours
plot(chl, col = chl.pal(18))
## End(Not run)
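The missing-x, palette = FALSE behaviour is the same idea as base R's colorRampPalette; a self-contained illustration with an arbitrary set of anchor colours (not the chlorophyll palette itself):

```r
## colorRampPalette (grDevices, attached by default) returns a function
## that generates n evenly spaced colours between the anchor colours
pal_fun <- colorRampPalette(c("purple", "blue", "green", "yellow"))
cols <- pal_fun(5)
length(cols)  ## 5 hex colour strings
cols[1]       ## the first anchor colour, "purple", i.e. "#A020F0"
```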
Chlorophyll-a for the Southern Ocean
chlafiles( time.resolution = c("weekly", "monthly"), product = c("johnson", "oceancolor"), platform = c("MODISA", "SeaWiFS"), ... )
time.resolution | weekly (8day) or monthly
product | choice of chla product, see
platform | (modisa or seawifs)
... | reserved for future use, currently ignored
This function generates a list of available chlorophyll-a files, including SeaWiFS and MODIS.
data.frame
Coastline data set as SpatialPolygons, currently not supported.
coastmap( map = c("world", "world2", "ant_coast", "ant_coast01", "ant_coast10", "Countries_hires", "world1", "world360", "cst00_polygon", "cst01_polygon", "cst10_polygon"), ... )
map | A named map source
... | arguments passed to worker functions
This function reads and returns a coastline polygon data set, either from the source or serialized cache (.Rdata). Data source locations are controlled by options.
The following named data sets are available.
"world", "world2" - Atlantic and Pacific versions of maptools wrld_simpl ("world1", "world360" are aliases)
"ant_coast", "ant_coast01", "ant_coast10" - AAD Antarctic coast in full, "1 mill", and "10 mill" resolution ("cst00_polygon", "cst01_polygon", and "cst10_polygon" are aliases)
"Countries_hires" - the "CIA" world map exported from the Manifold GIS data set
SpatialPolygonsDataFrame or SpatialPolygons (world1/2)
Each element can be looked up by name, see Examples
commonprojections
An object of class list of length 4.
This should be used only as a convenient reference to look up the projection strings commonly in use. There's no guarantee that these are appropriate, and you should seek cartographic expertise.
http://www.spatialreference.org
names(commonprojections)
commonprojections[["polar"]]
Circumpolar ROMS files.
cpolarfiles(...)
... | ignored
data frame of fullname, date
cpolarfiles()
Load file names and dates of AVISO current data
currentsfiles(time.resolution = c("daily", "weekly"), ...)
time.resolution | time resolution to load
... | reserved for future use, currently ignored
A data.frame of file names and dates
data.frame of file names and dates
This function loads the latest cache of stored files for these products, which are available from http://webdav.data.aad.gov.au/data/environmental/derived/antarctic/
derivaadcfiles(products, ...)
products | which derived product
... | reserved for future use, currently ignored
data.frame of file and date
readderivaadc and derivaadcproducts
(prods <- derivaadcproducts())
readderivaadc(sample(prods, 1L))
readderivaadc
List all derived data products available through readderivaadc
derivaadcproducts()
character vector of product names
readderivaadc and derivaadcfiles
This function loads the latest cache of stored files for ice products, currently only daily NSIDC southern hemisphere is available.
derivicefiles(product = "time_since_melt", ...)
product | which derived product
... | reserved for future use, currently ignored
data.frame of file and date
Calculate the shortest distance (metres) to a threshold sea ice contour. If in doubt use distance_to_ice_edge; the definition of the edge is not straightforward, especially for the higher resolution products and near the coast.
distance_to_ice_edge computes a single "main" edge at continental scale
distance_to_ice computes all distances to any ice at threshold concentration
distance_to_ice_edge(date, threshold = 15, xylim = NULL, hemisphere = "south", returnfiles = FALSE, inputfiles = NULL, latest = TRUE, ...)
distance_to_ice(date, threshold = 15, xylim = NULL, hemisphere = "south", returnfiles = FALSE, inputfiles = NULL, latest = TRUE, ...)
date | date or dates of data to read, see Details
threshold | the sea ice concentration threshold to contour at
xylim | spatial extents to crop from source data, can be anything accepted by
hemisphere | "north" or "south", default is "south"
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
... | passed to brick, primarily for
The distance is always positive; use readice in the usual way to determine whether a location is inside or outside the ice field itself. (If inside means zero distance to ice for you, then set it explicitly based on the concentration a point is in.)
Future work may generalize this to other data sources.
raster layer with distances to this date's sea ice edge
Beware that any queried location outside of this layer's range will be undetermined, that the external boundary of this layer is not constant with respect to the pole, and that in general a location may be closer to ice in the opposite hemisphere.
The argument hemisphere may be "north" or "south" (default is "south"), but this will only work if your locations are on the actual map, so it's not possible to request the distance to ice at both poles for any point.
plot(distance_to_ice(latest = TRUE))
plot(distance_to_ice_edge(latest = TRUE))
a <- extract(distance_to_ice, aurora[17:25, ])
extract(distance_to_ice, aurora[17:25, ], hemisphere = "south")
# library(trip)
# extract(distance_to_ice_edge, walrus818[seq(50, 400, by = 20), ], hemisphere = "north")
Extract methods for raadtools read functions
## S4 method for signature 'function,data.frame' extract(x, y, ctstime = FALSE, fact = NULL, verbose = TRUE, ...)
x | A raadtools read function.
y | Object to use for querying from the raadtools read functions, such as a vector of character, Date, or POSIXt values, data.frame, trip, etc.
ctstime | specify whether to find the nearest value in time (
fact | integer. Aggregation factor expressed as number of cells in each direction (horizontally and vertically). Or two integers (horizontal and vertical aggregation factor). See Details in
verbose | report on progress or keep quiet
... | Additional arguments passed to the read function.
Extract data from read functions in various ways.
data values extracted by the read functions
a <- structure(list(x = c(174, 168, 156, 111, 99, 64, 52, 46, -4, -15,
                          -30, -38, -47, -62, -87, -127, -145, -160, -161),
                    y = c(-72, -39, -50, -58, -35, -38, -48, -60, -48, -35,
                          -37, -51, -68, -72, -69, -54, -40, -49, -54)),
               .Names = c("x", "y"), row.names = c(NA, -19L), class = "data.frame")
a$time <- structure(c(5479, 5479, 5479, 5479, 5479, 5479, 5479, 5479, 5479,
                      5479, 5479, 5489, 5529, 5529, 5529, 5579, 5579, 5579, 5579),
                    class = "Date")
extract(readsst, a)
extract(readsst, a, method = "bilinear")
a$time <- sort(as.Date("2005-01-01") + sample(c(0, 0, 0, 8, 20, 50), nrow(a), replace = TRUE))
extract(readsst, a)
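The nearest-in-time matching that this method performs can be sketched in self-contained base R; `nearest_date` is a hypothetical helper for illustration, not the method's actual internals:

```r
## hypothetical: index of the nearest available date for each query date
nearest_date <- function(query, available) {
  vapply(seq_along(query),
         function(i) which.min(abs(as.numeric(difftime(available, query[i],
                                                       units = "days")))),
         integer(1))
}
avail <- as.Date("2005-01-01") + c(0, 8, 16)
## 2005-01-02 is nearest the 1st file, 2005-01-12 the 2nd
idx <- nearest_date(as.Date(c("2005-01-02", "2005-01-12")), avail)
idx  ## 1 2
```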
Provided by Alex Fraser
fraser_fasticefiles(...)
... | ignored
This function wraps a raadfiles function fasticefiles, but expands the file list out to individual bands.
RasterLayer
fraser_fasticefiles()
Load spatial map of fronts data.
frontsmap(map = c("park", "orsi"))
map | name of map to load
Currently "park" and "orsi" are the only supported layers.
"park" - Southern Ocean fronts as determined by Park and Durand, doi: 10.17882/59800
"orsi" - the ORSI fronts derived from the files provided by the WOCE Atlas, see References
SpatialLinesDataFrame
http://woceatlas.tamu.edu/Sites/html/atlas/SOA_DATABASE_DOWNLOAD.html
Data frame of file names and dates.
ghrsstfiles()
data.frame
This function loads the latest cache of stored files for ice products.
icefiles( time.resolution = "daily", product = "nsidc", hemisphere = c("south", "north"), ... )
time.resolution | daily or monthly files?
product | choice of sea ice product, see
hemisphere | north or south
... | reserved for future use, currently ignored
The 'fullname' is the path to the raw NSIDC binary file, 'vrt_dsn' a VRT string describing the fullname as a GDAL DSN string.
data.frame of file and date
## Not run: 
icf <- icefiles()
icf[nrow(icf), ]
## End(Not run)
Read map images.
imagemap(map = c("ibcso_background_hq"), fact = 1L)
map | name of the map to load
fact | resize factor, see
The ancient lost art of cartography.
ibcso_background: The IBCSO RGB-map rasterlayer in high resolution
RasterBrick, with R G B bands
http://www.ibcso.org/data.html
Voyage track data from the Nuyina
nuyina
A data frame with 9 columns. The columns represent:
longitude, latitude, date_time_utc, air_pressure, port_longwave_irradiance, port_air_temperature, precipitation_rate, sea_water_salinity, sea_water_temperature
This is a sample of the Nuyina underway in 2023.
See data-raw for extraction.
range(nuyina$date_time_utc)
For L3 bin derived products.
oc_sochla_files(time.resolution = c("daily"), product = c("MODISA", "SeaWiFS"))
time.resolution | temporal resolution (base is daily)
product | which product
data frame
This function loads the latest cache of stored files for NASA ocean colour products.
ocfiles( time.resolution = c("daily", "weekly", "monthly", "weekly32"), product = c("MODISA", "SeaWiFS", "VIIRS"), varname = c("CHL", "RRS", "POC", "KD490", "NPP_PAR", "SNPP_CHL", "SNPP_RRS"), type = c("L3m", "L3b"), bz2.rm = TRUE, ext = c("nc", "main"), ... )
time.resolution | daily or monthly files?
product | choice of ocean colour sensor
varname | which variable (or set of variables)
type | which level of data
bz2.rm | ignore files that are compressed
ext | which extension, e.g. "nc" or "main"
... | reserved for future use, currently ignored
data.frame of file and date
Spatial lines in the standard (71S) South Polar projection. Data is the 'rnaturalearth ne_coastline' at 'scale = 10'.
polar_map(crs = commonprojections$polar)
crs | PROJ.4 string
Use 'crs' to modify the projection, or set to 'NA' to leave in longlat.
SpatialLinesDataFrame
plot(polar_map())
Create a grid of query values of longitude, latitude, and date.
query_grid(lon, lat, date)
lon | vector of longitudes in the grid
lat | vector of latitudes in the grid
date | vector of dates in the grid
This can be used as the second argument to extract() for general 'point-in-time' value extraction.
The output is a data frame of the expanded grid, with a row for every combination of longitude, latitude, and date - so it can be quite large (!); be careful with very fine resolution or long series.
lon <- c(100, 110, 120)
lat <- seq(-60, -40, by = 5)
dts <- seq(as.Date("2002-03-03"), by = "5 days", length.out = 10)
qd <- query_grid(lon = lon, lat = lat, date = dts)
## use bilinear to ensure we get an interpolated value in the grid
qd$sst <- extract(readsst, qd, method = "bilinear")
## ssha
qd$ssha <- extract(readssh, qd, ssha = TRUE, method = "bilinear")
qd$wind_mag <- extract(readwind, qd, magonly = TRUE, method = "bilinear")
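The expansion itself is essentially expand.grid over the three vectors; a minimal base-R sketch (the column names x, y, date are assumptions for illustration, not necessarily those used by raadtools):

```r
## hypothetical re-implementation of the grid expansion: a row for every
## combination of longitude, latitude and date
query_grid_sketch <- function(lon, lat, date) {
  expand.grid(x = lon, y = lat, date = date)
}
qd <- query_grid_sketch(lon = c(100, 110, 120),
                        lat = seq(-60, -40, by = 5),
                        date = as.Date("2002-03-03") + c(0, 5))
nrow(qd)  ## 3 lon * 5 lat * 2 dates = 30 rows
```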
Dummy function
raadtools( date, time.resolution = "daily", lon180 = TRUE, setNA = TRUE, latest = FALSE, returnfiles = FALSE, ... )
date | date or dates of data to read,
time.resolution | time resolution data to read, daily or monthly
lon180 | defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180
setNA | mask out land values (only applies to monthly time.resolution)
latest | if TRUE (and date not supplied) return the latest time available
returnfiles | ignore options and just return the file names and dates
... | passed in to brick, primarily for
xylim | spatial extents to crop from source data, can be anything accepted by
inputfiles | input the files data base to speed up initialization
This function loads the latest cache of stored files for rapid response products.
rapid_responsefiles(product = c("aqua", "terra"), ...)
... | reserved for future use, currently ignored
We acknowledge the use of Rapid Response imagery from the Land Atmosphere Near-real time Capability for EOS (LANCE) system operated by the NASA/GSFC/Earth Science Data and Information System (ESDIS) with funding provided by NASA/HQ.
data.frame of file and date
## Not run: 
rf <- rapid_responsefiles()
range(rf$date)
## End(Not run)
Functions read_sla_daily and so on for "ugosa, adt, ugos, sla, vgos, vgosa, err".
read_ugosa_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
read_ugos_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
read_sla_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
read_vgos_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
read_vgosa_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
read_err_daily(date, xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
date | date or dates of data to read,
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE (and date not supplied) return the latest time available
returnfiles | ignore options and just return the file names and dates
... | passed in to brick, primarily for
inputfiles | input the files data base to speed up initialization
The labels 'u', 'v' are used by another function readcurr() which reads the 'ugos' and 'vgos' components and corresponds to an older but analogous dataset.
'ugos' is Absolute geostrophic velocity: zonal component, 'surface_geostrophic_eastward_sea_water_velocity' in m/s.
'vgos' is Absolute geostrophic velocity: meridian component, 'surface_geostrophic_northward_sea_water_velocity' in m/s.
'ugosa' is Geostrophic velocity anomalies: zonal component, 'surface_geostrophic_eastward_sea_water_velocity_assuming_sea_level_for_geoid' in m/s.
'vgosa' is Geostrophic velocity anomalies: meridian component, 'surface_geostrophic_northward_sea_water_velocity_assuming_sea_level_for_geoid' in m/s.
'adt' is Absolute dynamic topography, 'sea_surface_height_above_geoid' in metres.
'sla' is Sea level anomaly, 'sea_surface_height_above_sea_level' in metres.
For the raw files see raadfiles::altimetry_daily_files, and for example do 'files <- read_sla_daily(returnfiles = TRUE); ncdf4::nc_open(files$fullname1)' which will give a full NetCDF metadata printout of the source files (all the variables are in each file).
a raster layer
a <- read_adt_daily(sort(Sys.Date() - 1:50),
                    xylim = extent(100, 150, -70, -40))
## Not run: 
animate(a, pause = 0,
        col = colorRampPalette(c("dodgerblue", "white", "firebrick"))(21),
        breaks = c(seq(min(cellStats(a, min)), 0, length = 11),
                   seq(0.001, max(cellStats(a, max)), length = 10)))
## End(Not run)
Both eras are available in the same series; scaling is applied to the first series to ensure the data are always percentages (0, 100).
read_amsr_ice( date, xylim = NULL, latest = TRUE, ..., returnfiles = FALSE, inputfiles = NULL )
date | date or dates of data to read, see Details
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
... | passed to brick, primarily for
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
This function relies on the file-listing of raadfiles::amsr_daily_files().
raadfiles::amsr_daily_files, read_cersat_ice, readice
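The cross-era rescaling described above can be sketched in base R; the fraction flag and factor of 100 here are assumptions for illustration, not the package's actual code:

```r
## hypothetical: bring an early era stored as fractions (0, 1) onto the
## percentage (0, 100) scale used by the later era
scale_to_percent <- function(x, is_fraction) {
  if (is_fraction) x * 100 else x
}
scale_to_percent(c(0, 0.15, 0.87), is_fraction = TRUE)   ## rescaled to 0..100
scale_to_percent(c(0, 15, 87), is_fraction = FALSE)      ## already percent, unchanged
```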
Using the TIF files (these are 2012 onwards)
read_amsr2_3k_ice( date, xylim = NULL, latest = TRUE, ..., setNA = TRUE, returnfiles = FALSE, inputfiles = NULL )
date | date or dates of data to read, see Details
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
... | passed to brick, primarily for
setNA | mask zero and values greater than 100 as NA
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
This function relies on the file-listing of raadfiles::amsr2_3k_daily_files().
raadfiles::amsr2_3k_daily_files, read_amsr2_ice
Using the TIF files (these are 2012 onwards)
read_amsr2_ice( date, xylim = NULL, latest = TRUE, ..., setNA = TRUE, returnfiles = FALSE, inputfiles = NULL )
date | date or dates of data to read, see Details
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
... | passed to brick, primarily for
setNA | mask zero and values greater than 100 as NA
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
This function relies on the file-listing of raadfiles::amsr2_daily_files().
raadfiles::amsr2_daily_files
Using the TIF files (these are 2012 onwards)
read_amsre_ice( date, xylim = NULL, latest = TRUE, ..., returnfiles = FALSE, inputfiles = NULL )
date | date or dates of data to read, see Details
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
... | passed to brick, primarily for
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
This function relies on the file-listing of raadfiles::amsre_daily_files().
raadfiles::amsre_daily_files, read_amsr2_ice
RSS CCMP_RT V2.1 derived surface winds (Level 3.0)
read_ccmp( date, time.resolution = c("6hourly"), xylim = NULL, lon180 = FALSE, magonly = FALSE, dironly = FALSE, uonly = FALSE, vonly = FALSE, nobsonly = FALSE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date | date or dates of data to read, see Details
time.resolution | time resolution (6hourly)
xylim | spatial extents to crop from source data, can be anything accepted by
magonly | return just the magnitude from the U and V components
dironly | return just the direction from the U and V, in degrees N=0, E=90, S=180, W=270
uonly | return just the U component of velocity
vonly | return just the V component of velocity
latest | if TRUE (and date not supplied) return the latest time available, otherwise the earliest
returnfiles | ignore options and just return the file names and dates
... | passed to brick, primarily for
CCMP wind data is read from files managed by ccmp_files. Dates are matched to file names by finding the nearest match in time within a short duration. By default only one time step is returned, with both U and V components. Multiple dates can be returned for magnitude or direction, U or V only, or N-obs only.
This is the "RSS VAM 6-hour analyses starting from the NCEP GFS wind analyses". See References.
icefiles for details on the repository of data files, raster for the return value
## read a single time slice, direction only
x <- read_ccmp(dironly = TRUE)
## get a local extent for a zoom plot and plot the directions [0,360) as an image with arrows
e <- extent(projectExtent(raster(extent(130, 150, -50, -30), crs = "+proj=longlat"), projection(x)))
x <- crop(read_ccmp(), e)
crds <- coordinates(x)
scale <- 0.05
vlen <- function(x) sqrt(x[[1]]^2 + x[[2]]^2)
plot(vlen(crop(x, e)))
x1 <- crds[, 1]
y1 <- crds[, 2]
x2 <- crds[, 1] + values(x[[1]]) * scale
y2 <- crds[, 2] + values(x[[2]]) * scale
arrows(x1, y1, x2, y2, length = 0.02)
## faster if we get the file list first
ccfiles <- ccmp_files()
earliest <- read_ccmp(ccfiles$date[1:16], xylim = e, magonly = TRUE, inputfiles = ccfiles)
plot(earliest, col = hcl.colors(64), zlim = c(0, 20))
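A minimal base-R sketch of the magonly/dironly conversions from the U and V components; the compass convention (N = 0, E = 90) follows the argument description, but these helpers are hypothetical and whether the direction is "toward" or "from" the flow is an assumption here:

```r
## hypothetical helpers, not read_ccmp()'s internals
wind_mag <- function(u, v) sqrt(u^2 + v^2)
## degrees clockwise from north: N = 0, E = 90, S = 180, W = 270
wind_dir <- function(u, v) (90 - atan2(v, u) * 180 / pi) %% 360
wind_mag(3, 4)  ## 5
wind_dir(1, 1)  ## 45, i.e. toward the north-east
```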
This is southern hemisphere daily 12.5km SSM/I since 1991.
read_cersat_ice( date, xylim = NULL, latest = TRUE, setNA = TRUE, ..., returnfiles = FALSE, inputfiles = NULL )
date | date or dates of data to read, see Details
xylim | spatial extents to crop from source data, can be anything accepted by
latest | if TRUE and date input is missing, return the latest time available otherwise the earliest
setNA | mask zero and values greater than 100 as NA
... | passed to brick, primarily for
returnfiles | ignore options and just return the file names and dates
inputfiles | input the files data base to speed up initialization
This function relies on the file-listing of raadfiles::cersat_daily_files()
raadfiles::cersat_daily_files read_amsr_ice
Data is read as daily files in a way consistent with other read functions that can
be used by extract()
. (readchla
for example cannot be used this way as it returns a mean value for the period asked for).
read_chla_daily(date, time.resolution = "daily", xylim = NULL, lon180 = TRUE, varname = c("chlor_a"), setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)

read_chla_weekly(date, xylim = NULL, lon180 = TRUE, varname = c("chlor_a"), setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)

read_chla_monthly(date, xylim = NULL, lon180 = TRUE, varname = c("chlor_a"), setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
date |
date or dates of data to read, |
time.resolution |
time resolution of data to read, daily or monthly |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
varname |
variable to return from the data files, default is "chlor_a" |
setNA |
mask out land values (only applies to monthly time.resolution) |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
passed in to brick, primarily for |
inputfiles |
input the files data base to speed up initialization |
This function will read from the entire era available to SeaWiFS, MODISA, and VIIRS.
To read from particular eras use the output of ocfiles()
for inputfiles argument.
raster object
read_chla_daily(latest = FALSE) ## we should see SeaWiFS
read_chla_daily()               ## we should see MODISA (or VIIRS)
Global 2.5 Minute Geoid Undulations, a big raster from multiple files. We just create a temporary virtual raster from the source files, and return as a RasterLayer.
read_geoid(xylim = NULL, force = FALSE)
xylim |
an extent, in longlat, or an object that provides one |
force |
if TRUE ignore cached virtual saved file, and recompute (try this if it otherwise fails) |
Each file is an ESRI GRID raster data set of 2.5-minute geoid undulation values covering a 45 x 45 degree area. Each raster file has a 2.5-minute cell size and is a subset of the global 2.5 x 2.5-minute grid of pre-computed geoid undulation point values found on the EGM2008-WGS 84 Version web page. This ESRI GRID format represents a continuous surface of geoid undulation values where each 2.5-minute raster cell derives its value from the original pre-computed geoid undulation point value located at the SW corner of each cell.
RasterLayer, longitude-latitude grid of geoid level
## Not run:
geoid <- read_geoid()
## End(Not run)
Average lead frequency for the Arctic for winter months November-April 2002/03-2018/19, based on daily lead composites as derived from MOD/MYD-29 IST 5 min granules.
read_leads_clim_south(xylim = NULL)

read_leads_clim_north(xylim = NULL)

read_leads_clim(hemisphere = c("south", "north"), xylim = NULL, ...)
F. Reiser, S. Willmes, G. Heinemann (2020), A new algorithm for daily sea ice lead identification in the Arctic and Antarctic winter from thermal-infrared satellite imagery, Data subm.
read_leads_clim()
read_leads_clim_north(xylim = extent(c(-1, 1, -1, 1) * 50000))
south <- read_leads_clim_south()
## hone in on Mawson
pt <- reproj::reproj_xy(cbind(62 + 52/60, -(67 + 36/60)), projection(south), source = "+proj=longlat")
lead <- read_leads_clim_south(xylim = extent(pt[1] + c(-1, 1) * 250000, pt[2] + c(-1, 1) * 250000))
plot(lead, col = grey.colors(100))
abline(v = pt[1], h = pt[2])
Read Southern Ocean Chlorophyll-a in L3 bin form.
read_oc_sochla( date, time.resolution = c("daily"), bins = NULL, product = c("MODISA", "SeaWiFS"), latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
date/s to read |
time.resolution |
daily for now |
latest |
return only the latest day if date input is missing, otherwise the earliest |
returnfiles |
give the files instead of the data |
... |
ignored |
inputfiles |
speed up the read by inputting the files |
xylim |
extent in longitude latitude using 'raster::extent' |
relies on processing done here https://github.com/AustralianAntarcticDivision/ocean_colour
tibble of the values in L3 bin form
Currently using MODISA 4km Mapped L2m, see raadfiles::par_files()
for the actual source files.
read_par( date, time.resolution = "8D", xylim = NULL, lon180 = FALSE, nobsonly = FALSE, latest = TRUE, returnfiles = FALSE, inputfiles = NULL, ... )
date |
date or dates of data to read, |
time.resolution |
'8D' currently, use the NASA tokens for time period |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to FALSE; if TRUE, "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
inputfiles |
input the files data base to speed up initialization |
... |
passed in to brick, primarily for |
raster object
read_par(latest = FALSE)
Model data read from files managed by
sose_monthly_files
. Dates are matched to file names by finding
the nearest match in time within a short duration. If date
is greater than length 1 then the sorted set of unique matches is
returned.
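The nearest-match rule described above can be sketched in base R. This is an illustration of the behaviour, not the package's internal code; `filedates` is a hypothetical stand-in for the dates parsed from a monthly file listing.

```r
## Illustrative sketch of nearest-in-time date matching (not raadtools internals).
## 'filedates' stands in for the dates parsed from a monthly file listing.
filedates <- seq(as.POSIXct("2020-01-15", tz = "UTC"), by = "1 month", length.out = 6)

match_nearest <- function(date, filedates, max_gap = 31 * 24 * 3600) {
  date <- as.POSIXct(date, tz = "UTC")
  idx <- vapply(date, function(d) which.min(abs(as.numeric(filedates) - as.numeric(d))), integer(1))
  gap <- abs(as.numeric(filedates[idx]) - as.numeric(date))
  idx <- idx[gap <= max_gap]   ## drop dates with no file within the allowed duration
  sort(unique(idx))            ## multiple dates give the sorted set of unique matches
}

match_nearest(c("2020-02-20", "2020-03-01", "2020-03-02"), filedates)
```

Two of the query dates fall nearest the same monthly file, so only the unique, sorted file indices come back.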
read_sose( date, time.resolution = c("monthly"), varname = "", level = 1L, setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
varname |
variable to return from the data files, default is whatever first value
from |
level |
defaults to 1L - there are 52 |
This function doesn't support xylim, lon180, or more than one date being read.
raster
object
## Not run:
read_sose(varname = "Uvel", level = 1)
read_sose(varname = "SeaIceArea", level = 1, latest = FALSE)
read_sose(varname = "Chl", level = 10, latest = FALSE)
## End(Not run)
Read from The Antarctic Mesoscale Prediction System (AMPS) files.
readamps_d1wind
reads the "d1" level
wind (defaults to 1).
readamps_d1wind( date, time.resolution = "4hourly", xylim = NULL, magonly = FALSE, dironly = FALSE, uonly = FALSE, vonly = FALSE, latest = TRUE, returnfiles = FALSE, level = 1, ..., inputfiles = NULL )
readamps
reads the band
(defaults to 1).
See amps_metadata
for a description of the bands.
Data come from http://www2.mmm.ucar.edu/rt/amps/wrf_grib/, and contain gridded forecast data from the AMPS-WRF model. The 'd1' (domain 1) files are on a 30km grid, while the 'd2' (domain 2) files are on a 10km grid. The grid domains are shown at http://polarmet.osu.edu/AMPS/ - d1 extends out to southern Australia and the tip of South America, and d2 covers just Antarctica (including Davis) and parts of the Southern Ocean. There are two forecast cycles, starting at 00UT and 12UT daily. Forecasts are for +00, +03, +06, +09, +12 and +15 hours. These steps can be used to see how the wind field is forecast to change, and to obtain two different estimates for each 3-hour timestep (e.g. the +15h forecast from the 00UT run is at the same time as the +03h forecast from the 12UT run). The 00UT and 12UT runs contain fewer parameters than the +03h and later forecasts.
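The overlap between the two forecast cycles can be checked with simple date arithmetic; this is a sketch using the cycle times and forecast steps described above.

```r
## Sketch of the AMPS forecast-cycle overlap described above.
cycles <- as.POSIXct(c("2015-10-25 00:00:00", "2015-10-25 12:00:00"), tz = "UTC")
steps <- c(0, 3, 6, 9, 12, 15) * 3600   ## forecast steps in seconds

valid_00 <- cycles[1] + steps           ## valid times of the 00UT run
valid_12 <- cycles[2] + steps           ## valid times of the 12UT run

## the +15h step of the 00UT run has the same valid time as the
## +03h step of the 12UT run, giving two estimates for 15UT
identical(valid_00[6], valid_12[2])
```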
An example file date is "2015-10-25 UTC" (gdalinfo 2015102512_WRF_d1_f000.grb).

uonly is Band 5:

Band 5 Block=329x1 Type=Float64, ColorInterp=Undefined
Description = 10[m] HTGL (Specified height level above ground)
Metadata:
GRIB_COMMENT=u-component of wind [m/s]
GRIB_ELEMENT=UGRD
GRIB_FORECAST_SECONDS=0 sec
GRIB_REF_TIME=1445774400 sec UTC
GRIB_SHORT_NAME=10-HTGL
GRIB_UNIT=[m/s]
GRIB_VALID_TIME=1445774400 sec UTC

vonly is Band 27:

Band 27 Block=329x1 Type=Float64, ColorInterp=Undefined
Description = 10[m] HTGL (Specified height level above ground)
Metadata:
GRIB_COMMENT=v-component of wind [m/s]
GRIB_ELEMENT=VGRD
GRIB_FORECAST_SECONDS=0 sec
GRIB_REF_TIME=1445774400 sec UTC
GRIB_SHORT_NAME=10-HTGL
GRIB_UNIT=[m/s]
GRIB_VALID_TIME=1445774400 sec UTC
Raster
af <- amps_d1files()
w <- readamps_d1wind(latest = TRUE, inputfiles = af)
#w <- readamps_d2wind(latest = TRUE)
vlen <- function(a, b) sqrt(a * a + b * b)
plot(vlen(w[[1]], w[[2]]))
## arrow jigger
arr <- function(x, sub = seq(ncell(x[[1]])), scale = 1) {
  cr <- coordinates(x[[1]])
  cr1 <- cr
  cr1[, 1] <- cr[, 1] + values(x[[1]]) * scale
  cr1[, 2] <- cr1[, 2] + values(x[[2]]) * scale
  segments(cr[sub, 1], cr[sub, 2], cr1[sub, 1], cr1[sub, 2])
}
arr(w, sample(ncell(w), 10000), scale = 15000)
I don't remember what these are but they were at https://orca.science.oregonstate.edu/
readcafe( date, time.resolution = c("monthly"), xylim = NULL, lon180 = TRUE, setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
date or dates of data to read, |
time.resolution |
time resolution of data to read, daily or monthly |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
setNA |
mask out land values (only applies to monthly time.resolution) |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
Arguments passed on to |
inputfiles |
input the files data base to speed up initialization |
Files are found with raadfiles::cafe_monthly_files()
.
readcafe(date = "2010-01-15")
Ocean colour Chlorophyll-a data, provide an input of daily dates and these will be averaged into one layer.
readCHL_month(date, xylim = NULL, returnfiles = FALSE, ..., inputfiles = NULL, latest = TRUE)

readchla(date, product = c("any", "MODISA", "SeaWiFS", "VIIRS"), xylim = NULL, algorithm = c("nasa"), latest = TRUE, grid = NULL, ..., returnfiles = FALSE, inputfiles = NULL)
date |
date or dates of data to read, see Details |
xylim |
spatial extents to crop from source data, can be anything accepted by |
returnfiles |
ignore options and just return the file names and dates |
... |
currently ignored. Note that readCHL_month reads the NASA algorithm L3m products, and that this function cannot be used with 'extract()'; for that use |
inputfiles |
input the files data base to speed up initialization |
latest |
if TRUE (and date not supplied) return the latest time available, otherwise the earliest |
product |
choice of product, see Details |
algorithm |
nasa only |
grid |
template raster object for output |
raster
object
readCHL_month
chlafiles
for details on the repository of
data files, raster
for the return value
readCHL_month()
## Not run:
d <- readchla(c("2003-01-01", "2003-06-01"), xylim = extent(100, 150, -70, -30))
## End(Not run)
Current data is read from files managed by
currentsfiles
. Dates are matched to file names by
finding the nearest match in time within a short duration. By
default only one time step is returned with both U and V
components. Multiple dates can be returned for magnitude or
direction, U or V only.
readcurr( date, time.resolution = c("daily"), xylim = NULL, lon180 = TRUE, magonly = FALSE, dironly = FALSE, uonly = FALSE, vonly = FALSE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
date or dates of data to read, see Details |
time.resolution |
time resolution to read |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
magonly |
return just the magnitude from the U and V components |
dironly |
return just the direction from the U and V, in degrees N=0, E=90, S=180, W=270 |
uonly |
return just the U component of velocity |
vonly |
return just the V component of velocity |
latest |
if TRUE (and date not supplied) return the latest time available, otherwise the earliest |
returnfiles |
ignore options and just return the file names and dates |
... |
passed to brick, primarily for |
These labels 'u', 'v', 'mag', and 'dir' correspond to arguments 'uonly', 'vonly', 'magonly', and 'dironly' which exist to have the function return only the 'u' or 'v' layer (by default both are returned which can only work for a single time step). 'magonly' and 'dironly' are special cases calculated from 'u' and 'v', the magnitude and direction of the field respectively in m/s and degrees from north.
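The relationship between the components and the derived layers can be sketched for a single cell. This is illustrative only; the direction convention (degrees from north: N = 0, E = 90, S = 180, W = 270) is as documented for 'dironly'.

```r
## Sketch of deriving magnitude and direction from U (eastward) and
## V (northward) velocity components for one cell.
u <- 0.3   ## eastward velocity (m/s)
v <- 0.4   ## northward velocity (m/s)

mag <- sqrt(u^2 + v^2)                   ## speed in m/s
dir <- (atan2(u, v) * 180 / pi) %% 360   ## bearing from north, in [0, 360)

c(mag = mag, dir = dir)
```

Note atan2() is called with the eastward component first, so the angle is measured clockwise from north rather than anticlockwise from east.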
This is the "DT merged all satellites Global Ocean Gridded SSALTO/DUACS Sea Surface Height L4 product and derived variables" See References.
raster
object with the "U"
(zonal/horizontal/X) and "V" (meridional/vertical/Y) components of velocity in
m/s. Setting either of the (mutually exclusive) magonly
and dironly
arguments returns the magnitude (in m/s) or
direction (in degrees relative to North) of the velocity vectors.
These data for daily files are stored in longitude/latitude projection on the sphere, with longitudes in the Pacific view [0, 360]; the default behaviour is to reset this to Atlantic view [-180, 180] with lon180.
currentsfiles
for details on the repository of
data files, raster
for the return value
## read a single time slice, and plot the directions [0, 360) as an image with arrows
x <- readcurr(dironly = TRUE)
## get a local extent for a zoom plot
e <- extent(projectExtent(raster(extent(130, 150, -50, -30), crs = "+proj=longlat"), projection(x)))
x <- crop(readcurr(), e)
crds <- coordinates(x)
scale <- 1.5
vlen <- function(x) sqrt(x[[1]]^2 + x[[2]]^2)
plot(vlen(crop(x, e)))
x1 <- crds[,1]
y1 <- crds[,2]
x2 <- crds[,1] + values(x[[1]]) * scale
y2 <- crds[,2] + values(x[[2]]) * scale
arrows(x1, y1, x2, y2, length = 0.03)
Derived data are read from files managed by derivaadcfiles
.
readderivaadc(products, xylim = NULL, returnfiles = FALSE, ...)
products |
choice of products, see |
xylim |
spatial extents to crop from source data, can be anything accepted by |
returnfiles |
ignore options and just return the file names |
... |
passed to brick, primarily for |
raster
object
http://data.aad.gov.au/aadc/metadata/metadata.cfm?entry_id=Polar_Environmental_Data
derivaadcfiles
for details on the repository of
data files, raster
for the return value
## Not run:
prods <- c("bathymetry", "chl_summer_climatology")
x <- readderivaadc(prods)
## End(Not run)
Derived sea ice data is read from files managed by derivicefiles
.
readderivice( date, time.resolution = c("daily"), product = c("time_since_melt"), xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
date or dates of data to read, see Details |
time.resolution |
time resolution of data to read, daily or monthly |
product |
choice of sea ice product, see Details |
xylim |
spatial extents to crop from source data, can be anything accepted by |
latest |
if TRUE and date input missing, return the latest time available, otherwise the earliest |
returnfiles |
ignore options and just return the file names and dates |
... |
passed to brick, primarily for |
inputfiles |
input the file set to avoid rescanning that (for extract point-in-time) |
Currently available products are
Dates are matched to file names by finding the nearest match in
time within a short duration. If date
is greater than
length 1 then the sorted set of unique matches is returned.
time_since_melt
32767 (treated as missing data)
32766 = land
32765 = open-ocean zone
-32768 = ice that hasn't melted during the data period
Missing data are coded as 32767 in the nc file (so should be NA once read into R): these are either open water in the sea ice zone that hasn't re-frozen during the data period, or missing sea ice data that couldn't be interpolated.
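The special codes above can be masked on raw values like so; this is a sketch on a plain numeric vector standing in for raw cell values, not the package's own masking.

```r
## Sketch of masking the special time_since_melt codes listed above,
## on a plain numeric vector standing in for raw cell values.
vals <- c(120, 32767, 32766, 32765, -32768, 45)

vals[vals == 32767] <- NA   ## missing data
## 32766 (land), 32765 (open-ocean zone) and -32768 (never melted) are
## meaningful codes; mask or keep them depending on the analysis
vals
```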
raster
object
derivicefiles
for details on the repository of
data files, raster
for the return value
Fast ice data
readfastice( date, product = c("circum_fast_ice", "binary_fast_ice"), xylim = NULL, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
date or dates to read (can be character, POSIXt, or Date) |
product |
'circum_fast_ice' or 'binary_fast_ice' |
xylim |
extent in native space of the grid |
returnfiles |
return the file details only |
... |
reserved for future use, currently ignored |
time.resolution |
fixed, the underlying time step is 15 days |
High-resolution mapping of circum-Antarctic landfast sea ice distribution, 2000–2018.
Fast ice data on original polar stereographic grid, the product "circum_fast_ice" is 1000m resolution published in Alex Fraser et al. (2020).
RasterBrick see Details
Classified surface type ("circum_fast_ice"):
0: pack ice or ocean
1: continent
2: islands
3: ice shelf
4: fast ice
5: manual fast ice edge
6: auto fast ice edge
For the "binary_fast_ice" product the classified values are:

0: Southern Ocean, pack ice or icebergs, corresponding to light blue in the PNG files.
1: Antarctic continent (including ice shelves), as defined using the Mosaic of Antarctica product, corresponding to white in the PNG files.
2: Fast ice, as classified from a single 20-day MODIS composite image, corresponding to dark blue in the PNG files
3: Fast ice, as classified using a single 20-day AMSR-E composite image, corresponding to yellow in the PNG files
4: Fast ice, as classified using the previous or next 20-day MODIS composite images, corresponding to red in the PNG files
http://data.aad.gov.au/aadc/metadata/metadata.cfm?entry_id=modis_20day_fast_ice
## read a particular date; it's a circumpolar grid with 7 discrete numeric classes
fice <- readfastice("2015-10-01")
## hone in on Davis
ex <- c(1742836L, 3315135L, 129376L, 1136611L)
davis_ice <- crop(fice, extent(ex))
plot(davis_ice >= 4) #, col = c("brown", "white", grey(c(0.2, 0.5, 0.8))), breaks = c(0, 1, 3, 4, 5, 6))
## compare 5 years of change
davis_ice2 <- crop(readfastice("2010-10-01"), extent(ex))
par(mfrow = c(2, 1))
plot(davis_ice >= 4)
plot(davis_ice2 >= 4)
Sokolov Rintoul
readfronts( date, time.resolution = c("weekly"), product = c("sokolov"), xylim = NULL, lon180 = TRUE, setNA = FALSE, trim = TRUE, returnfiles = FALSE, RAT = TRUE, ... )
date |
date or dates of data to read, see Details |
time.resolution |
time resolution of data to read, daily or monthly |
product |
choice of product, see Details |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
if TRUE, data originally in Pacific view will be returned in Atlantic view (-180:180) |
setNA |
is |
trim |
if |
returnfiles |
ignore options and just return the file names and dates |
RAT |
if |
... |
reserved for future use, currently ignored |
raster
object
## Not run:
b <- readfronts(c("1993-01-01", "2005-01-02"), lon180 = FALSE)
## End(Not run)
Daily files This dataset contains Backward-in-time FSLE product deduced from DT merged Global Ocean Gridded Absolute Geostrophic Velocities SSALTO/Duacs L4 product (version 2018).
readfsle( date, time.resolution = c("daily"), xylim = NULL, latest = TRUE, varname = c("fsle_max", "theta_max"), returnfiles = FALSE, verbose = TRUE, ..., inputfiles = NULL )
date |
date or dates of data to read, see Details |
time.resolution |
time resolution to read (only daily) |
xylim |
spatial extents to crop from source data, can be anything accepted by |
latest |
if TRUE and input date is missing return the latest time available, otherwise the earliest |
varname |
either |
returnfiles |
ignore options and just return the file names and dates |
verbose |
print messages on progress etc. |
... |
passed to brick, primarily for |
inputfiles |
input the files data base to speed up initialization |
data.frame
SST in Kelvin
readghrsst( date, time.resolution = c("daily"), xylim = NULL, lon180 = TRUE, varname = c("analysed_sst", "analysis_error", "mask", "sea_ice_fraction"), setNA = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL )
date |
datetime |
time.resolution |
daily |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
currently ignored |
varname |
one of "analysed_sst", "analysis_error", "mask", "sea_ice_fraction" |
setNA |
ignored |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
arguments passed to raster brick |
inputfiles |
input the files data base to speed up initialization |
RasterStack or RasterLayer
Sea ice at 25km for either hemisphere.
readice_daily(date, time.resolution = "daily", product = "nsidc", hemisphere = c("south", "north", "both"), xylim = NULL, setNA = TRUE, rescale = FALSE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL, resample = "bilinear")

readice(date, time.resolution = "daily", product = "nsidc", hemisphere = c("south", "north", "both"), xylim = NULL, setNA = TRUE, rescale = FALSE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL, resample = "bilinear")

readice_monthly(date, time.resolution = "monthly", product = "nsidc", hemisphere = c("south", "north"), xylim = NULL, setNA = TRUE, rescale = TRUE, latest = TRUE, returnfiles = FALSE, ..., inputfiles = NULL)
date |
date or dates of data to read, see Details |
time.resolution |
time resolution of data to read, daily or monthly |
product |
choice of sea ice product, see Details |
hemisphere |
north or south (or both, if 'both' xylim should be an actual raster or terra grid) |
xylim |
spatial extents to crop from source data, can be anything accepted by |
setNA |
mask zero and values greater than 100 as NA |
rescale |
rescale values from integer range? |
latest |
if TRUE and date input is missing, return the latest time available otherwise the earliest |
returnfiles |
ignore options and just return the file names and dates |
... |
passed to brick, primarily for |
inputfiles |
input the files data base to speed up initialization |
resample |
warper resampling method used when 'xylim' is a full grid |
extension |
default for product "amsr" is "hdf" but can be "tif", e.g. extension = "hdf" |
This function relies on the file-listing of icefiles()
.
Currently available products are
daily or monthly NSIDC concentration data, processed by the SMMR/SSMI NASA Team
Dates are matched to file names by finding the nearest match in
time within a short duration. If date
is greater than
length 1 then the sorted set of unique matches is returned.
For NSIDC data a ratify
ied raster is returned if setNA
and
rescale
are both set to FALSE
. Use levels(x)
to return the data.frame of values
and levels (there's no straight-through rule, all numeric values are explicit along with special
values like "Unused").
The values used are documented here http://nsidc.org/data/docs/daac/nsidc0051_gsfc_seaice.gd.html
If 'both' is specified for hemisphere, or if 'xylim' is a full raster grid, the warper is applied to VRT versions of the NSIDC files, which allows them to be combined in one reprojection step. In this case 'xylim' can be specified to give a projected grid of any form. If not supplied (when hemisphere = 'both') then a longlat raster at 0.25 degrees is assumed. ('xylim' can be specified as a target grid with only the north or south hemisphere applied.) When the warper is used, 'setNA' and 'resample' behave the same way, though exact results will differ depending on the value of 'resample'.
raster
object
icefiles
for details on the repository of
data files, raster
for the return value
library(raadtools)
ice <- readice(latest = TRUE)
## can read one or other hemisphere in native projection
readice(hemisphere = "south")
readice(hemisphere = "north")
## or we can read both, and get longlat by default
readice(hemisphere = "both")
## or set our own grid and read to that
## spex::buffer_extent(extent(c(-.5, .5, -1, 1) * rad * pi), 25000)
tm_ex <- c(-.5, .5, -1, 1) * 20025000
tm_template <- raster(extent(tm_ex), res = 25000, crs = "+proj=tmerc")
readice(hemisphere = "both", xylim = tm_template)
## this means we can run extract on global ice, and get 0 in the middle
## extract(readice, data.frame(176, c(-72, 84), as.Date("2020-04-03") + c(0, 100)))
## [1] 80 NA
## extract(readice, data.frame(176, c(-72, 84), as.Date("2020-04-03") + c(0, 100)), hemisphere = "both")
## [1] 78.0 94.4
## it's interpolated from the original data
Read the NSIDC pixel-area files for either hemisphere. Only the 25km product is supported. The source files are "psn25area_v3.dat" and "pss25area_v3.dat": http://nsidc.org/data/polar-stereo/tools_geo_pixel.html#pixel_area.
readice_area(product = "nsidc", hemisphere = "south", ...)
product |
"nsidc" the 25km NSIDC passive microwave |
hemisphere |
south (default) or north |
... |
ignored |
raster with the area of the cells in m^2
readice
readice_area()
readice_area(hemisphere = "north")
Read the Sallée mixed layer depth climatology.
readmld(date, xylim = NULL, returnfiles = FALSE, ...)
date |
date to extract, can only be a climatology month |
xylim |
extent specification |
returnfiles |
return the file details only |
... |
ignored |
Information from JB Sallee (2013-06-06) is below, along with an example of reading and displaying the data in R. This is a recently updated dataset using the most recent Southern Ocean observations, with an updated gridding analysis, so it should be better and have an appropriate land mask: ftp://ftp.nerc-bas.ac.uk/jbsall/MLfitted_SO.mat

This is a compilation of ocean observations: 225389 profiles from the Argo program and 106682 profiles from many different ships and PIs. It is not a model evaluation. It is likely the best estimate of monthly MLD, because it uses the largest number of observations. There is no observational product with time variability, because the Southern Ocean only started to be observed at basin scale in the last decade; no observational product can give a temporal evolution at basin scale. Model estimates could be used, but models are known to perform particularly badly in the surface layer, so the mixed layer is always problematic in models. Another possibility is a reanalysis product such as the Met Office EN3 product (http://www.metoffice.gov.uk/hadobs/en3/data/EN3_v2a/download_EN3_v2a.html), but this is an objective analysis of available observations, so in the Southern Ocean before 2002 (when Argo started) there is not much data and the results will be strongly dominated by the climatology (World Ocean Atlas).

Strong suggestion: use Bluelink for MLD instead, see https://gist.github.com/mdsumner/32d718dbb5667904a605cea92757b335
RasterBrick
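A minimal sketch of reading a climatology month (assuming the repository holds the Sallée climatology and that any date in the chosen month selects that climatology layer):

```r
library(raadtools)
mld <- readmld(as.Date("2013-01-01"))  ## January climatology layer
plot(mld)
```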
Read MODIS Rapid Response RGB images
readrapid_response(
  date,
  product = c("aqua", "terra"),
  latest = TRUE,
  returnfiles = FALSE,
  ...
)
date |
date of image to load |
latest |
if TRUE and date input is missing return the latest time available, otherwise the earliest |
returnfiles |
return just the list of files |
... |
other arguments for |
MODIS Rapid Response, Antarctica
SMAP surface salinity data read from files managed by
salfiles
.
readsal(
  date,
  time.resolution = c("daily"),
  xylim = NULL,
  lon180 = TRUE,
  varname = c("sss_smap", "nobs", "sss_ref", "gland", "gice", "surtep"),
  setNA = TRUE,
  latest = TRUE,
  returnfiles = FALSE,
  ...,
  inputfiles = NULL
)
date |
date or dates of data to read, |
time.resolution |
time resolution data to read, daily only for now |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
varname |
variable to return from the data files, default is "sss_smap", also available is "nobs", "sss_ref", "gland", "gice", "surtep" |
setNA |
mask out land values (only applies to monthly time.resolution) |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
Arguments passed on to |
inputfiles |
input the files data base to speed up initialization |
readall |
FALSE by default |
raster
object
Sea surface height/anomaly
readssh(
  date,
  time.resolution = c("daily"),
  xylim = NULL,
  lon180 = TRUE,
  ssha = FALSE,
  latest = TRUE,
  returnfiles = FALSE,
  verbose = TRUE,
  ...,
  inputfiles = NULL
)
date |
date or dates of data to read, see Details |
time.resolution |
time resolution to read |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
ssha |
logical, to optionally return anomaly or height |
latest |
if TRUE and date input is missing return the latest time available, otherwise the earliest |
returnfiles |
ignore options and just return the file names and dates |
verbose |
print messages on progress etc. |
... |
passed to brick, primarily for |
inputfiles |
input the files data base to speed up initialization |
Details
data.frame
SST data read from files managed by
sstfiles
. Dates are matched to file names by finding
the nearest match in time within a short duration. If date
is greater than length 1 then the sorted set of unique matches is
returned.
see Details in raadtools
readsst(
  date,
  time.resolution = c("daily", "monthly"),
  xylim = NULL,
  lon180 = TRUE,
  varname = c("sst", "anom", "err", "ice"),
  setNA = TRUE,
  latest = TRUE,
  returnfiles = FALSE,
  ...,
  inputfiles = NULL
)
date |
date or dates of data to read, |
time.resolution |
time resolution data to read, daily or monthly |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
varname |
variable to return from the data files, default is "sst"; also available are "anom", "err", "ice" |
setNA |
mask out land values (only applies to monthly time.resolution) |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
Arguments passed on to |
inputfiles |
input the files data base to speed up initialization |
readall |
FALSE by default |
raster
object
icefiles
for details on the repository of
data files, raster
for the return value
## Not run:
## read one time slice and plot it up in preparation for reading a time series
d <- readsst()
plot(d)
## this step is interactive, draw a boundary on the plot
ext <- drawExtent()
## these can be created manually with xmin,xmax,ymin,ymax
## ext <- extent(-100, 150, -75, -30)
## now read a big chunk of data for this small region
dts <- seq(as.Date("2001-01-03"), by = "1 week", length = 100)
sst <- readsst(dts, xylim = ext)
## End(Not run)
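The nearest-in-time matching described above can be sketched as follows (a sketch assuming the data repository is available; actual file times depend on the local collection):

```r
library(raadtools)
## a time between two daily files is matched to the nearest file in time
readsst(as.POSIXct("2010-01-01 20:00:00", tz = "UTC"))
## duplicate matches collapse to the sorted unique set, so fewer layers
## than input dates may be returned
readsst(c("2010-01-01", "2010-01-01", "2010-01-05"))
```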
Functions to provide topographic (bathymetry and/or topography) data.
readtopo(
  topo = c("gebco_23", "gebco_08", "ibcso", "etopo1", "etopo2", "kerguelen",
    "smith_sandwell", "gebco_14", "macrie1100m", "macrie2100m", "cryosat2",
    "lake_superior", "ramp", "ibcso_is", "ibcso_bed", "rema_1km", "rema_200m",
    "rema_100m", "rema_8m", "gebco_19", "gebco_21"),
  polar = FALSE,
  lon180 = TRUE,
  xylim = NULL,
  returnfiles = FALSE,
  ...,
  resample = "bilinear"
)

readbathy(
  topo = c("gebco_23", "gebco_08", "ibcso", "etopo1", "etopo2", "kerguelen",
    "smith_sandwell", "gebco_14", "macrie1100m", "macrie2100m", "cryosat2",
    "lake_superior", "ramp", "ibcso_is", "ibcso_bed", "rema_1km", "rema_200m",
    "rema_100m", "rema_8m", "gebco_19", "gebco_21"),
  polar = FALSE,
  lon180 = TRUE,
  xylim = NULL,
  returnfiles = FALSE,
  ...,
  resample = "bilinear"
)

topofile(
  topo = c("gebco_23", "gebco_08", "ibcso", "etopo1", "etopo2", "kerguelen",
    "smith_sandwell", "gebco_14", "macrie1100m", "macrie2100m", "cryosat2",
    "lake_superior", "ramp", "ibcso_is", "ibcso_bed", "rema_1km", "rema_200m",
    "rema_100m", "rema_8m", "gebco_19", "gebco_21"),
  polar = FALSE,
  lon180 = TRUE,
  ...
)

read_rema_tiles(...)
topo |
Data source, see Details. |
polar |
Flag for returning the polar version of the IBCSO data. |
lon180 |
Flag for returning data in Atlantic -180, 180 rather than Pacific 0, 360 view. |
xylim |
spatial extents or raster grid to crop from source data, see Details |
returnfiles |
Return just the relevant file name |
... |
reserved for future use, ignored currently |
resample |
method to use for GDAL warper resampling (currently via terra), the 'method' argument to 'project()' |
Use readtopo
(or its alias readbathy
) to read data
from the chosen data set. The function topofile
is used to
find the full file name.
readtopo()
and readbathy()
accept terra::ext
or
raster::extent
or just numeric objects for 'xylim'. Alternatively these can be
a terra SpatRaster
or a raster BasicRaster
(RasterLayer, RasterBrick, or RasterStack) used as a template target raster for crop and resize, or for reprojection to a new raster grid. The 'resample' argument controls the kind of sampling used when the data are regridded or warped (i.e. if xylim is an extent you get a crop(); if it is a grid, the data are remodelled to that grid; an extent does not need a crs, a grid does).
The following data sets are available using the argument topo
.
The GEBCO grid, 2023 version "ice surface"
The GEBCO_08 Grid, a global 30 arc-second grid largely generated by combining quality-controlled ship depth soundings with interpolation between sounding points guided by satellite-derived gravity data. http://www.gebco.net/data_and_products/gridded_bathymetry_data/
IBCSO bathymetry data, resolution 1min, use argument polar = TRUE
to return the projected version (polar stereographic with true scale at 71S, WGS84), 500m resolution. http://www.ibcso.org/data.html. Default is Ice Surface 'ibcso_is', use 'ibcso_bed' for Bedrock.
ETOPO1 is a 1 arc-minute global relief model of Earth's surface that integrates land topography and ocean bathymetry. http://www.ngdc.noaa.gov/mgg/global/global.html
Historic and deprecated prior version of ETOPO1. http://www.ngdc.noaa.gov/mgg/global/etopo2.html
Kerguelen Plateau Bathymetric Grid, https://doi.org/10.26186/147703
A bathymetric Digital Elevation Model (DEM) of the George V and Terre Adelie continental shelf and margin - 100, 250, and 500 metre resolution. http://data.aad.gov.au/aadc/metadata/metadata_redirect.cfm?md=AMD/AU/GVdem_2008
Global seafloor topography from satellite altimetry and ship depth soundings. http://topex.ucsd.edu/WWW_html/mar_topo.html
Antarctica CryoSat-2 Digital Elevation Model (DEM). https://earth.esa.int/web/guest/missions/esa-operational-eo-missions/cryosat
Bathymetry of Lake Superior https://www.ngdc.noaa.gov/mgg/greatlakes/superior.html
Radarsat Antarctic digital elevation model V2 https://github.com/AustralianAntarcticDivision/blueant#radarsat-antarctic-digital-elevation-model-v2
Digital Elevation Model (DEM) of Australia with 1 Second Grid
Reference Elevation Model of Antarctica (REMA) for the peninsula, see read_rema_tiles
for the index
GEBCO_2019 Grid https://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_2019/gebco_2019_info.html
topofile
returns a character string of the full path to a file name
readtopo
and readbathy
return the requested data as a RasterLayer (these are aliases)
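The two xylim modes described in the Details can be sketched as follows (a sketch assuming the data repository is available; the polar grid here is illustrative only):

```r
library(raadtools)
library(raster)
## 1. an extent: readtopo() crops in the source grid (no crs needed)
topo_crop <- readtopo("etopo2", xylim = extent(100, 160, -70, -40))
## 2. a grid with a crs: the source is warped/resampled to that grid
polar_grid <- raster(extent(-4e6, 4e6, -4e6, 4e6), res = 10000,
                     crs = "+proj=stere +lat_0=-90 +lat_ts=-71 +datum=WGS84")
topo_polar <- readtopo("etopo2", xylim = polar_grid, resample = "bilinear")
```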
Read wind
readwind(
  date,
  time.resolution = c("6hourly"),
  xylim = NULL,
  lon180 = TRUE,
  magonly = FALSE,
  dironly = FALSE,
  uonly = FALSE,
  vonly = FALSE,
  latest = TRUE,
  returnfiles = FALSE,
  ...,
  inputfiles = NULL
)
date |
date or dates of data to read, |
time.resolution |
time resolution data to read, 6hourly only for now |
xylim |
spatial extents to crop from source data, can be anything accepted by |
lon180 |
defaults to TRUE, to "rotate" Pacific view 0, 360 data to Atlantic view -180, 180 |
magonly |
return just the magnitude from the U and V components |
dironly |
return just the direction from the U and V, in degrees N=0, E=90, S=180, W=270 |
uonly |
return just the horizontal component of velocity, U |
vonly |
return just the vertical component of velocity, V |
latest |
if TRUE (and date not supplied) return the latest time available |
returnfiles |
ignore options and just return the file names and dates |
... |
Arguments passed on to
|
inputfiles |
input the files data base to speed up initialization |
Read wind data
The inputfiles
argument may be used to speed up individual reads, see the examples. Note that
this must then ignore the time.resolution
argument, which is also set by windfiles
- and no
warning is given. If using this argument you must give the same time.resolution
as was used to create the files
data.frame
.
raster object
# Speed up individual read calls.
ff <- windfiles(time.resolution = "6hourly")
t1 <- system.time({readwind(max(ff$date))})
t2 <- system.time({readwind(max(ff$date), inputfiles = ff)})
## Not run:
dts <- seq(as.Date("2000-01-01"), by = "1 days", length = 350)
library(animation)
ani.start(ani.width = 800, ani.height = 800)
for (i in seq_along(dts)) {
  x <- readwind(dts[i])
  if (i == 1L) crds <- coordinates(x[[1]])
  plot(sqrt(x[[1]]^2 + x[[2]]^2), xlim = c(40, 180), ylim = c(-90, -20))
  x1 <- crds[, 1]
  y1 <- crds[, 2]
  x2 <- crds[, 1] + values(x[[1]])/4
  y2 <- crds[, 2] + values(x[[2]])/4
  arrows(x1, y1, x2, y2, length = 0.06)
  plot(m, add = TRUE)  ## 'm' is assumed to be a coastline/map layer defined elsewhere
}
ani.stop()
## End(Not run)
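How magonly and dironly relate to the U and V components can be sketched in base R (an assumed formula, consistent with the documented convention N=0, E=90):

```r
u <- 3; v <- 4                          ## eastward and northward components, m/s
mag <- sqrt(u^2 + v^2)                  ## magnitude: 5
dir <- (atan2(u, v) * 180 / pi) %% 360  ## direction toward roughly 37 degrees (NE-ish)
```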
Load file names and dates of SMAP sea surface salinity
salfiles(time.resolution = c("daily"), ...)
time.resolution |
time resolution read |
... |
reserved for future use, currently ignored |
A data frame of file names and dates.
Only a short period of daily data is available, from '2015-03-27' to '2018-10-21'.
data.frame of file names and dates
This is a convenience wrapper around the corresponding admin functions in the raadfiles
package, so that most users will not need to use those functions directly.
This function tells raadtools
where to look for its data files, (by default) will build the necessary file list cache if it does not exist, and loads that cache into memory so that raadtools
can use it.
set_data_roots(..., build_cache_if_missing = TRUE, refresh_cache = FALSE)

get_data_roots()
... |
strings: one or more paths to directories containing data |
build_cache_if_missing |
logical: |
refresh_cache |
logical: for each path, should the file collection be re-scanned and the cache rebuilt? (May be slow) |
TRUE (invisibly) on success
## Not run:
## assume that we have downloaded some data files to c:/my/data
library(raadtools)
set_data_roots("c:/my/data")
## see the README/vignette for more detail
## End(Not run)
Available variable names from SOSE.
sose_monthly_varnames(iteration = "")
These varnames can be used in read_sose(varname = )
.
character vector
sose_monthly_varnames()
Load file names and dates of AVISO SSH/SSHA data
sshfiles(time.resolution = c("daily"), ...)
time.resolution |
set to daily only |
... |
reserved for future use, currently ignored |
A data.frame of file names and dates
data.frame of file names and dates
SST colours
sst.pal(x, palette = FALSE, alpha = 1)
x |
a vector of data values or a single number |
palette |
logical, if |
alpha |
value in 0,1 to specify opacity |
colours, palette, or function, see Details
Derived from http://oceancolor.gsfc.nasa.gov/DOCS/palette_sst.txt.
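A minimal usage sketch (assuming a single number asks for that many colours, as the arguments above suggest, and that the data repository is available for readsst):

```r
library(raadtools)
cols <- sst.pal(64)   ## 64 colours from the SST palette
d <- readsst()
plot(d, col = cols)
```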
Load file names and dates of OISST sea surface temperature data
sstfiles(time.resolution = c("daily", "monthly"), ...)
time.resolution |
time resolution read |
... |
reserved for future use, currently ignored |
A data frame of file names and dates
data.frame of file names and dates
Station locations in polar stereographic coordinates, with longitude, latitude and name as attributes.
Obtained with antanym
plot(readice()) plot(stations, add = TRUE)
plot(readice()) plot(stations, add = TRUE)
Read meridional and zonal components of surface currents (m/s) from altimetry products. The date
and xylim
inputs are optional. As with the grid-reading functions read_ugos_daily and read_vgos_daily,
this by default returns the latest data available, for the entire world.
table_uvgos(
  date,
  xylim = NULL,
  ...,
  xy = TRUE,
  cell = FALSE,
  na.rm = TRUE,
  latest = TRUE,
  res = NULL
)
date |
dates to read |
xylim |
extent of data to read |
... |
arguments passed to read functions (only |
xy |
include coordinates of cell in the output, |
cell |
include cell index in the output, |
na.rm |
by default missing values are removed, set to |
Please note that the coordinates are in longitude latitude, but the velocity components are in m/s. You cannot meaningfully transform the x,y coordinates and use the velocity components without taking into account rotation in the transformation (we might write some helpers for this ...).
Argument lon180
may be used to specify Pacific or Atlantic orientation.
data frame of u, v, x,y (longitude,latitude), cell, and date
uv <- table_uvgos("2001-01-01", xylim = extent(60, 120, -60, -10))
plot(range(uv$x), range(uv$y), type = "n", asp = 1.1)
scal <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))
nn <- 56
arrows(uv$x, uv$y, uv$x + uv$u, uv$y + uv$v,
       col = grey.colors(nn)[scal(sqrt(uv$u^2 + uv$v^2)) * (nn - 1) + 1],
       length = 0)
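One way to honour the rotation caveat above is a finite-difference sketch: displace each point a small step along its velocity in longitude-latitude, transform both ends, and take the direction from the projected offset. This helper is an assumption (not a raadtools function) and uses the reproj package for the transform:

```r
## rough sketch only: direction via finite differences, original speed preserved
rot_uv <- function(x, y, u, v, proj, eps = 0.001) {
  ## scale the longitude step by cos(lat) so the step is locally isotropic
  p0 <- reproj::reproj_xy(cbind(x, y), proj, source = "EPSG:4326")
  p1 <- reproj::reproj_xy(cbind(x + eps * u / cos(y * pi / 180), y + eps * v),
                          proj, source = "EPSG:4326")
  du <- p1[, 1] - p0[, 1]
  dv <- p1[, 2] - p0[, 2]
  len <- sqrt(du^2 + dv^2)
  spd <- sqrt(u^2 + v^2)
  cbind(u = spd * du / len, v = spd * dv / len)
}
```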
Conversion to POSIXct ensuring no local time zone applied. Currently supported is character, Date and
anything understood by as.POSIXct
.
timedateFrom(x, ...)
x |
input date-time stamp, character, Date or other supported type. |
... |
ignored |
the vector x
converted (if necessary) to POSIXct
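A short sketch of the behaviour (assuming the raadtools package is attached):

```r
library(raadtools)
## both resolve to midnight UTC, regardless of the session's local time zone
timedateFrom("2020-01-01")
timedateFrom(as.Date("2020-01-01"))
```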
NCEP2 wind files
windfiles(data.source = "", time.resolution = c("6hourly"), ...)
data.source |
ignored, reserved for future use |
time.resolution |
time resolution data to read, 6hourly only for now |
... |
reserved for future use, currently ignored |
Files containing NCEP2 wind vector data
data.frame
of file names and dates