A beginner's guide to accessing Argo data
Argo collects salinity and temperature profiles from a sparse (average 3° x 3° spacing) array of robotic floats that populate the ice-free oceans deeper than about 2000 m. The floats also give information on surface and subsurface currents. Each profile is made up of about 200 data points. The first Argo floats were deployed in 2000 and the array contains 3000 floats as of late 2007. Argo data are made available to users quickly and free of restriction.
Complete documentation of the Argo data system is contained in the "Argo Data Management Handbook" and "Argo Real-time Quality Control Tests Procedures", both of which are available at the Argo Data Management Team website.
GTS data stream (for operational center use)
Data that go out on the GTS are subject to a number of quality checks in national data assembly centers and only those measurements within a profile that pass all tests (see real time quality control tests) are inserted onto the GTS. The checks are also used to set quality flags in the data that go on to enter other data streams detailed below.
Currently the data are in the TESAC format, and temperature and salinity are truncated to two decimal places. The vertical co-ordinate is depth rather than pressure (which is what the floats measure). No corrections are made to salinities in the GTS data stream.
An Argo Grey list of floats is maintained at the GDACs. The Grey list contains floats that may have problems with one or more sensors. Data from floats on the Grey list are therefore not sent out on the GTS and should be treated with caution.
Profile data, quality control flags and probably trajectory data will soon become available using the BUFR format that is less restrictive than TESAC.
GDAC data stream (for general public)
For most users, the GDACs should be the route to access Argo data. The data on the GDACs are held in netCDF format with files that contain profile, trajectory, meta and technical data for each float. It is important to understand the naming system of the files as well as the variable names and quality control flags within each data file.
The GDACs provide a number of means of selecting data:
- By selecting a latitude/longitude and time range
- By DAC (Data Assembly Centre)
- By ocean basin

Which GDAC should I use?
This depends on a number of factors but the first issue is location. North American users will likely use the Monterey site and European users the Coriolis site. Beyond this, the choice is a matter of which GDAC has a mode of access that best suits the user's needs.
Both GDACs offer access to the complete Argo data collection, including float metadata, detailed trajectory data, and geographic and float-specific multi-profile collections, via DODS, LAS, HTTP and FTP. These access modes are described below:
Image of webpage interface
The available selection and display tools are:
- A profile location plot for all profiles returned by the query (may be plotted with or without float ID for queries returning many profiles)
- Download of selected profiles (in netCDF Multi-Profile format) as a TAR file
- Plots of T-P and S-P for individual profiles
- Plots of float tracks for individual floats
Image of webpage interface
A subsetting tool allows selection by:
- profile type
- time and lat/long windows
- measured parameter
- platform type
- real-time or delayed mode QC data

Access to additional non-Argo data (XBT, CTD, drifters, moorings, thermosalinograph, ADCP) is available from Coriolis through the same interface.
GADR data at US NODC
The US NODC maintains the Global Argo Data Repository (GADR), which operates as a long-term archive for Argo data. The GADR is responsible for managing updates to Argo data that are reanalyzed some time later and for which corrections may be applied.
While the GDACs are the main source of Argo data to users with high speed internet access, there will be some who cannot get data in this manner. The GADR can provide alternate means for users to get Argo data.
Real time data
Real time data are the first form of Argo data available to the public. Because of the requirement to deliver data to users within 24 hours of a float transmitting its profile data, the real time quality control tests are limited and automatic. If data from a float fail these tests, the data will not be distributed onto the GTS but will be on the GDACs as netCDF files. Real time files on the GDACs all start with "R" (e.g. R5900400_001.nc).
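The file naming convention lends itself to simple parsing. As a minimal Python sketch (the function name is the author's illustration, and it assumes only the mode-floatID-cycle pattern shown above, not the full GDAC naming rules):

```python
import re

def parse_profile_filename(name):
    """Split a GDAC profile file name such as 'R5900400_001.nc' into its
    data mode ('R' real time, 'D' delayed mode), float ID and cycle number.
    Handles only the <mode><floatID>_<cycle>.nc pattern described above."""
    m = re.fullmatch(r"([RD])(\d+)_(\d+)\.nc", name)
    if m is None:
        raise ValueError("not a recognised Argo profile file name: " + name)
    mode, float_id, cycle = m.groups()
    return mode, float_id, int(cycle)

print(parse_profile_filename("R5900400_001.nc"))  # ('R', '5900400', 1)
```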
The real time data should be free from gross errors in position, temperature and pressure. The uncalibrated salinity values are available on both the GTS and at the GDACs. If a salinity offset is known, it may appear as an "adjusted salinity" (PSAL_ADJUSTED) variable on the GDACs. These data are identified with R in the "Data mode" variable if no adjustments were made and are identified with A if an adjustment was made.
In general these data should be consistent with ocean climatologies even though no climatology tests have been performed. For science applications sensitive to small pressure biases (e.g. calculations of global ocean heat content or mixed layer depth), it is not recommended to use "R" files.
The tests described below are not in the order of implementation, but all DACs apply the tests in the same order to the profile data.
Test 17 is not mandatory in real time. Tests marked * are also applied to trajectory data.
- Platform ID *
- Impossible date *
- Impossible location *
- Position on land *
- Impossible speed *
- Global range test *
- Regional parameter range *
- Pressure increasing
- Spike test
- Top - bottom spike - obsolete
- Gradient test
- Digit rollover
- Stuck value
- Density inversion
- Grey list
- Gross salinity or temperature drift
- Visual QC - not mandatory in real time
- Frozen profile
- Pressure not greater than Deepest_Pressure +10%
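As a sketch of how the simpler of these checks work, the global range and pressure-increasing tests can be expressed in a few lines of Python. The numerical limits below are illustrative placeholders, not the official thresholds; the authoritative values are given in the "Argo Real-time Quality Control Tests Procedures" document.

```python
def global_range_test(values, lo, hi):
    """Global range test: True where a measurement falls inside broad
    physical limits (i.e. the point passes the test)."""
    return [lo <= v <= hi for v in values]

def pressure_increasing_test(pressures):
    """Pressure increasing test: True if the pressure levels strictly
    increase down the profile."""
    return all(a < b for a, b in zip(pressures, pressures[1:]))

# One spurious salinity of 55 fails the (placeholder) range 2-41.
sal = [34.8, 35.1, 55.0, 34.9]
print(global_range_test(sal, 2.0, 41.0))            # [True, True, False, True]
print(pressure_increasing_test([5.0, 10.0, 20.0]))  # True
```

Points that fail a test are not deleted; as described above, they are excluded from the GTS and flagged in the netCDF files.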
Delayed mode data
Delayed mode data profiles have been subjected to detailed scrutiny by oceanographic experts, and the adjusted salinity has been estimated by comparison with high quality ship-based CTD data and climatologies using the process described by OW1, WJO, or Böhme and Send. This process is carried out on a 1-year-long data window, and so no delayed mode observations can be less than 1 year old. These data are appropriate for applications sensitive to small pressure biases. Read below to learn which variables to use and how to interpret the quality control flags, which are vital to pulling the best quality data from the files.
Variable names
It is important to understand the basic variable naming structure within the profile files. There are two versions of temperature, salinity and pressure in each file: a "real time" version (TEMP, PSAL, PRES) and a "delayed mode" version (TEMP_ADJUSTED, PSAL_ADJUSTED, PRES_ADJUSTED). It is recommended that users work with the delayed mode, or adjusted version, if it is filled.
In the "D" files, the adjusted variables are filled with data that have been corrected after examination by the oceanographic experts, making them the highest quality data for that profile.
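The recommendation to prefer the adjusted variables when they are filled can be sketched in Python as follows. The fill value and function name here are hypothetical illustrations; real Argo files declare their own _FillValue in the netCDF variable attributes.

```python
# Hypothetical fill value for illustration; real files declare _FillValue.
FILL = 99999.0

def choose_salinity(data_mode, psal, psal_adjusted):
    """Return the recommended salinity profile: the adjusted variable when
    the data mode ('A' adjusted, 'D' delayed mode) indicates a correction
    exists and the variable is filled, otherwise the real-time variable."""
    if data_mode in ("A", "D") and any(v != FILL for v in psal_adjusted):
        return psal_adjusted
    return psal

print(choose_salinity("D", [35.1, 35.0], [35.05, 34.97]))  # [35.05, 34.97]
print(choose_salinity("R", [35.1, 35.0], [FILL, FILL]))    # [35.1, 35.0]
```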
Quality control flags
The qc flags are:
- 0 No QC tests have been performed
- 1 Observation good
- 2 Observation probably good (implies some uncertainty)
- 3 Observation thought to be bad but may be recoverable
- 4 Observation thought to be bad and irrecoverable
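A minimal Python sketch of how these flags are typically applied, keeping only the good (1) and probably good (2) points; the function name and the choice of accepted flags are the author's illustration, not part of the Argo specification:

```python
def keep_good(values, qc_flags, accept=(1, 2)):
    """Keep only measurements flagged good (1) or probably good (2);
    flags 3 and 4 are discarded, and flag 0 (untested) is excluded
    by default."""
    return [v for v, q in zip(values, qc_flags) if q in accept]

temps = [12.3, 12.1, 45.0, 11.8]
flags = [1, 2, 4, 1]
print(keep_good(temps, flags))  # [12.3, 12.1, 11.8]
```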
Argo Regional Centers (ARCs)
Argo has a number of regional centers whose functions include:
- Performing regional analysis of all the Argo data in the region to assess its internal consistency as well as its consistency with recent shipboard CTD data.
- Providing feedback to PIs about the results of the regional analysis and possible outliers.
- Facilitating development of a reference database for delayed mode quality control. This includes assembling the most recent CTD data in their region.
- Preparing and distributing Argo data products on a regular basis. The main data product will be a consistent Argo delayed mode dataset for their region, but other products might include weekly analyses of temperature, salinity and currents calculated from floats. Documentation of these products will also be provided.

The centers are identified as follows:
Pacific ARC: http://apdrc.soest.hawaii.edu/argo/
North Atlantic ARC: http://www.coriolis.eu.org/cdc/Argo-NA-ARC.htm
South Atlantic ARC: http://www.aoml.noaa.gov/phod/sardac/index.php
Indian ARC: http://www.incois.gov.in/Incois/argo/argo_dataregional.jsp
Southern ARC: http://www.bodc.ac.uk/projects/international/argo/southern_ocean/
Tools for assisting with Argo data handling
Some people have difficulty working with netCDF format files on the Argo GDAC servers. Information on netCDF can be found on the UCAR website.
A simple Matlab program is available to read in a netCDF Argo file.
Users are encouraged to share the tools they develop with the rest of the Argo Community.
1 Owens, W.B. and A.P.S. Wong, 2009: An improved calibration method for the drift of the conductivity sensor on autonomous CTD profiling floats by theta-S climatology. Deep Sea Research Part I: Oceanographic Research Papers, 56, 450-457.
Böhme, L. and U. Send, 2005: Objective analyses of hydrographic data for referencing profiling float salinities in highly variable environments. Deep Sea Research Part II: Topical Studies in Oceanography, 52, 651-664.
Wong, A.P.S., G.C. Johnson and W.B. Owens, 2003: Delayed-mode calibration of Autonomous CTD profiling float salinity data by Theta-S climatology. Journal of Atmospheric and Oceanic Technology, 20, 308-318.