Last modified on 2019-03-14 14:18:58

There's a lot of dimension restructuring, global attribute adding, and feature type defining that goes on for each file. At this point the conversion happens with a series of Python programs that are documented in the wiki of the GitHub development site at: https://github.com/USGS-CMG/usgs-cmg-portal/wiki/Procedure-to-update-time-series-data-from-CMG.

There is also a comparison between the elements in EPIC- and CF-compliant files in the Conversion to Climate-Forecast (CF) Convention Compliance section of OFR-2007-1194. Two big differences between the two kinds of files are the details of the dimensions and the requirement of some new global attributes. Another big difference in CF is the addition of a 'standard_name' variable attribute containing a CF-1.6 standard name. With the added standard_name, that variable can be discovered by clients that use CF names, like the CMGP portal.
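As a minimal sketch of the standard_name addition described above (the helper function and the example attributes here are illustrative, not code from our conversion programs; a real file would use netCDF4.Variable attributes rather than a plain dict):

```python
# Hypothetical sketch: attach a CF-1.6 standard_name to a variable's
# attributes so that CF-aware clients (like the CMGP portal) can
# discover it. A dict stands in for a netCDF variable's attributes.

def add_standard_name(var_attrs, standard_name):
    """Return a copy of the attribute dict with a CF standard_name added."""
    out = dict(var_attrs)                  # do not mutate the input
    out["standard_name"] = standard_name
    return out

# Example: an EPIC-style temperature variable gains a CF standard_name.
epic_attrs = {"long_name": "TEMPERATURE (C)", "units": "C", "epic_code": 20}
cf_attrs = add_standard_name(epic_attrs, "sea_water_temperature")
```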

Notes on the use of "summary" and "project_summary" global attributes

Our pressure data up until EPR is largely uncorrected for atmospheric pressure, so we map all our P_ variables except P_1ac to sea_water_pressure. Pressure data that has had the atmospheric component removed is stored in P_1ac (Chandeleur, EPR Barnegat and Chincoteague, and Cal coastal wetlands data) and will be mapped to sea_water_pressure_due_to_sea_water.
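The mapping rule above can be sketched as a small function (this is an illustration of the rule, not code from our conversion programs):

```python
# Hypothetical sketch of the pressure standard_name rule: every P_
# variable maps to sea_water_pressure, except P_1ac, which holds
# atmospherically corrected data and maps to
# sea_water_pressure_due_to_sea_water.

def pressure_standard_name(var_name):
    """Return the CF standard_name for a pressure variable, else None."""
    if var_name == "P_1ac":
        return "sea_water_pressure_due_to_sea_water"
    if var_name.startswith("P_"):
        return "sea_water_pressure"
    return None  # not a pressure variable
```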

In the CF conversion, collect.py imports epic2cf (search GitHub to find the code), and this module contains a dictionary of EPIC codes that is used to determine the standard_name. This means that collect.py runs fine on both 'BPR_915' and 'BP_915', because both have EPIC code 915. Do not remove the epic_code attribute from future netCDF files!
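The lookup works on the numeric epic_code attribute, not the variable name, which is why differently named variables with the same code resolve identically. A hedged sketch (the two-entry table below is a tiny illustrative subset, not the real epic2cf dictionary):

```python
# Hypothetical epic2cf-style lookup: standard_name is chosen by the
# epic_code attribute, so 'BPR_915' and 'BP_915' map the same way.
EPIC_TO_CF = {
    915: "sea_water_pressure",      # illustrative entries only
    20: "sea_water_temperature",
}

def standard_name_from_epic(epic_code):
    """Return the CF standard_name for an EPIC code, else None."""
    return EPIC_TO_CF.get(epic_code)

# Both variables carry epic_code 915 and resolve to the same name.
variables = {
    "BPR_915": {"epic_code": 915},
    "BP_915": {"epic_code": 915},
}
names = {v: standard_name_from_epic(a["epic_code"])
         for v, a in variables.items()}
```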

In the CF output, we have several kinds of attribution metadata. The PROJECT attribute is from EPIC and contains "USGS Coastal and Marine Geology Program" in all but the oldest files. A selection of the other attribution metadata is available on a linked page.

One of the things having our data in CF buys us is that it can then be ingested into the portal. The most recent status report shows what needs to happen to ingest waves data.