A beginner's guide to accessing Argo data
Argo collects salinity and temperature profiles from a sparse (average 3° x 3°
spacing) array of robotic floats that populate the ice-free oceans deeper
than about 2000 m. The floats also give information on surface and subsurface currents.
Most profiles contain about 200 data points, but floats with high-speed communications
can return many more. The first Argo floats
were deployed in late 1999, and the array contains more than 3600 active floats as of 2014.
Argo data are made available to users quickly and free of restriction.
Complete documentation of the Argo data system is on the
Documentation section of the ADMT website. In particular, the "Argo user's manual"
and the "Argo quality control manual" may be helpful.
GTS data stream (for operational center use)
Data that go out on the GTS are subject to a number of quality checks in national data assembly
centers and only those measurements within a profile that pass all tests (see
real time quality control tests) are inserted onto the GTS. The checks are also used
to set quality flags in the data that go on to enter other data streams detailed below.
Currently the data are distributed in both the TESAC and BUFR formats, with the expectation of switching
entirely to BUFR in the future. Temperature and salinity data are truncated to two decimal places.
The vertical co-ordinate is depth, not pressure (as measured). No corrections are made to
salinities in the GTS data stream. The BUFR format is less restrictive than TESAC and allows
quality flags to be included.
An Argo Grey list of floats is maintained at the GDACs.
The Grey list contains floats that may have problems with one or more sensors. Data
from floats on the Grey list are therefore not sent out on the GTS and should be treated with caution.
GDAC data stream (for NetCDF Argo files)
For users interested in manipulating the actual Argo NetCDF files, the GDACs should be the route
to access Argo data. Both GDACs offer access to the complete Argo data collection, including
float metadata, detailed trajectory data, profile data and technical data all in NetCDF format.
The data are organized either geographically (by ocean basin) or by Data Assembly Centre (DAC)
and are provided via HTTP and FTP as well as through data browsers and other portals
such as OpenDAP, OGC-WCS, etc. It
is important to understand the
naming system of the files as well as the variable names and quality control flags within each
data file. See the ADMT
Documentation page for the Argo User's Manual and the Argo quality control manual for more details.
Which GDAC should I use?
This depends on a number of factors but the first issue is location. North American users will
likely use the Monterey site and European users the Coriolis site. Beyond this, the choice is
a matter of which GDAC has a mode of access that best suits the user's needs.
HTTP and FTP Access
The http and ftp sites are identical and are
organized into three main folders: a "dac" folder which sorts the data by Data Assembly Centre
(DAC), a "geo" folder
which sorts the data by ocean basin, and a "latest_data" folder which includes the most recent data.
There are also several index files in the top directory containing a list of metadata on each type
of data file (meta, prof, tech and traj) contained in the "dac" and "geo" folders. It is possible to download these lists and search them
for floats in specific regions, times, DACs, etc. There is also a grey list which contains a list of
floats that likely have sensor problems.
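The index files are plain comma-separated text with a comment header, so searching them needs no special tools. A minimal sketch of filtering the profile index by region (the column layout follows the GDAC profile index convention; the sample rows and the bounding box are illustrative, not real queries):

```python
import csv
import io

# A small sample in the format of the GDAC profile index
# (file,date,latitude,longitude,ocean,profiler_type,institution,date_update).
SAMPLE_INDEX = """\
# Title : Profile directory file of the Argo GDAC
file,date,latitude,longitude,ocean,profiler_type,institution,date_update
aoml/13857/profiles/R13857_001.nc,19970729200300,0.267,-16.032,A,845,AO,20080918131927
coriolis/69001/profiles/R69001_001.nc,20050101000000,45.100,-20.500,A,846,IF,20080918131927
csio/2900313/profiles/R2900313_001.nc,20050101000000,10.000,140.000,P,846,HZ,20080918131927
"""

def profiles_in_box(index_text, lat_min, lat_max, lon_min, lon_max):
    """Return the file paths of index rows whose position falls in the box."""
    lines = [ln for ln in index_text.splitlines() if not ln.startswith("#")]
    reader = csv.DictReader(io.StringIO("\n".join(lines)))
    hits = []
    for row in reader:
        try:
            lat, lon = float(row["latitude"]), float(row["longitude"])
        except (TypeError, ValueError):
            continue  # skip rows with missing positions
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            hits.append(row["file"])
    return hits

# A North Atlantic box picks out only the Coriolis float above.
print(profiles_in_box(SAMPLE_INDEX, 40.0, 50.0, -30.0, -10.0))
```

The same approach extends to filtering by date, DAC, or profiler type, since each is just another column in the index.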
Monthly Snapshots via DOI
Each month since the start of 2014, a snapshot of the GDACs is taken and assigned a DOI. Users can simply choose the latest month's snapshot and access it via the DOI link, making the data easy to cite in publications.
Via the GDAC synchronization service
The rsync server "vdmzrs.ifremer.fr" provides a synchronization service between the "dac" directory of
the GDAC and a user mirror. See the
ADMT website for more details. From the user side, the rsync service:
- Downloads the new files
- Downloads the updated files
- Removes the files that have been removed from the GDAC
- Compresses/uncompresses the files during the transfer
- Preserves the files creation/update dates
- Lists all the files that have been transferred (easy to use for a user side post-processing)
Synchronization of a particular float:
- rsync -avzh --delete vdmzrs.ifremer.fr::argo/coriolis/69001 /home/mydirectory/...
Synchronization of the whole dac directory of Argo GDAC:
- rsync -avzh --delete vdmzrs.ifremer.fr::argo/ /home/mydirectory/...
Both GDACs also provide data selection tools which allow users to enter different search criteria
and then select different ways to display and receive the data. Regardless of the GDAC, users
can search for data using the following means of selecting data:
- By selecting a latitude/longitude and time range
- By DAC (Data Assembly Centre)
- By Delayed Mode or Real Time quality
After providing selection criteria at the
US GDAC Data Browser, one can expect the following:
- A profile location plot for all profiles returned by the query (may be plotted with or
without float ID for queries returning many profiles)
- Download of selected profiles (in netCDF Multi-Profile format) as a TAR file
- Plots of T-P and S-P for individual profiles
- Plots of float tracks for individual floats
Image of US GDAC webpage interface
The Coriolis GDAC interface allows one to search by a few additional criteria:
- profile or trajectory file
- measured parameter
- platform type: access to additional non-Argo data (XBT, CTD, drifters, moorings, thermosalinograph, ADCP) is
available from Coriolis through the same interface
The results can be downloaded as either NetCDF or ASCII data and a map is displayed with the locations
of all the resultant floats.
Image of Coriolis webpage interface
Argo float data and metadata are also available via a variety of web services such as OpenDAP, OGC-WCS, etc.
See the GDAC websites for more information.
Gridded files based on Argo NetCDF files at the GDACs
The Argo website maintains a list of
gridded fields based on Argo data. These gridded files can be based exclusively on Argo data or can include
other data sources as well. They are all freely available and are usually NetCDF, but some
are ASCII. The gridded fields can sometimes provide an easier method to examine some water
properties and are an extension of the Argo dataset.
While each grid producer makes every effort to remove errors from their grid, some might
still exist. Therefore, users are reminded that these grids are to be used cautiously. Please
notify the grid producer if an error is discovered.
Data Viewers that incorporate Argo data
There are a variety of data viewers
that allow users to look at Argo data. The Global Marine
Argo Atlas is a graphical user interface (GUI) that helps users easily look at Argo data and
compare it to other global
gridded data sets like Reynolds SST and Aviso altimetry. Users can choose from maps, sections, time
series, line drawings and a few products and can enter geographic specifications, time limits,
depths, etc. to customize the plot. Postscript and JPG output is available, as well as ASCII or NetCDF
output of the data plotted. The Atlas is updated monthly for the PC version and is static for
the Mac version, although users can manually update it each month.
For users interested in the Indian Ocean, there is the Indian Ocean Argo
Data Viewing Application. This GUI offers the option to look at profile data and value added products
like temperature, salinity, currents, etc. at various depths in the water column. This viewer works
on Windows and covers 2001 - March 2014.
Ocean Data View (ODV) and Java Ocean Atlas (
JOA) also support Argo profile data.
There is an Argo Google Earth layer
produced by the Argo Information Centre that shows the positions of all active and inactive floats
and makes it possible to plot all the trajectories in a certain region and time range. The
balloons that appear when a user clicks on the float location show profile data from the float
and make it possible to quickly look at profiles from floats in various areas of the world ocean.
Both GDACs offer data selection tools described in the Data Browsers section above.
GADR data at US NODC
The US NODC maintains the Global Argo Data Repository
(GADR) which operates as a
long-term archive for Argo data. The GADR has the responsibility to manage updates to Argo data
that are reanalyzed some time later and for which corrections may be applied.
While the GDACs are the main source of Argo data to users with high speed internet access, there
will be some who cannot get data in this manner. The GADR can provide alternate means for users
to get Argo data.
Real time data is the first form of Argo data available to the public. Because of the requirement
for delivering data to users within 24 hours of the float transmitting its profile data, the real
time quality control tests are limited and automatic. If data from a float fails these tests,
the data will not be distributed onto the GTS but will be on the GDACs as NetCDF files. Real
time files on the GDACs all start with "R" (e.g. R5900400_001.nc).
The real time data should be free from gross errors in position, temperature and pressure.
The uncalibrated salinity values are available on both the GTS and at the GDACs. If a salinity
offset is known, it may appear as an "adjusted salinity" (PSAL_ADJUSTED) variable on the GDACs.
These data are identified with 'R' in the "Data mode" variable if no adjustments were made and
are identified with 'A' if an adjustment was made.
In general these data should be consistent with ocean climatologies even though no climatology
tests have been performed. For science applications sensitive to small pressure biases
(e.g. calculations of global ocean heat content or mixed layer depth), it is not recommended
to use "R" files.
The tests described below are not listed in the order of implementation, but all DACs apply the tests
in the same order to the profile data.
Test 17 is not mandatory in real time. Tests marked * are also applied to trajectory data.
- Platform ID *
- Impossible date *
- Impossible location *
- Position on land *
- Impossible speed *
- Global range test *
- Regional parameter range *
- Pressure increasing
- Spike test
- Top - bottom spike - obsolete
- Gradient test
- Digit rollover
- Stuck value
- Density inversion
- Grey list
- Gross salinity or temperature drift
- Visual QC - not mandatory in real time
- Frozen profile
- Pressure not greater than Deepest_Pressure +10%
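To make the mechanics concrete, here is a sketch of the spike test for temperature. The test statistic, |V2 - (V3 + V1)/2| - |(V3 - V1)/2|, and the thresholds (6.0 °C shallower than 500 dbar, 2.0 °C at or below) follow the real-time quality control manual, but treat them as illustrative and check the current manual before relying on them:

```python
def spike_test_value(v_prev, v, v_next):
    """Spike test statistic for the middle of three consecutive
    measurements: |V2 - (V3 + V1)/2| - |(V3 - V1)/2|."""
    return abs(v - (v_next + v_prev) / 2.0) - abs((v_next - v_prev) / 2.0)

def flag_temperature_spikes(pressures, temps):
    """Flag interior points that fail the spike test.

    Thresholds (6.0 C above 500 dbar, 2.0 C at or below) are taken
    from the real-time QC manual and should be verified against the
    current version before use.
    """
    flags = [1] * len(temps)  # start every point at QC flag 1 (good)
    for i in range(1, len(temps) - 1):
        threshold = 6.0 if pressures[i] < 500.0 else 2.0
        if spike_test_value(temps[i - 1], temps[i], temps[i + 1]) > threshold:
            flags[i] = 4  # bad and irrecoverable
    return flags

# A 10-degree excursion at 100 dbar is flagged; its neighbours are not.
print(flag_temperature_spikes([50, 100, 150], [10.0, 20.5, 10.2]))
```

Failing points receive QC flag 4 rather than being deleted, which is why the flags described later in this guide matter when reading the files.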
Delayed mode data
Delayed mode data profiles have been subjected to detailed scrutiny by oceanographic experts
and the adjusted salinity has been estimated by comparison with high quality ship-based CTD
data and climatologies using the process described by OW1, WJO, or Böhme and
Send. This process is carried out on a one year long data window and so no Delayed Mode
observations can be less than one year old. These data are appropriate for applications
sensitive to small pressure biases. Read below to learn about which variables to use and how
to interpret the quality control flags, which are vital to pulling the best quality data from the files.
It is important to understand the basic variable naming structure within the profile files.
There are two versions of temperature, salinity and pressure in each file: a "real time"
version (TEMP, PSAL, PRES) and a "delayed mode" version (TEMP_ADJUSTED, PSAL_ADJUSTED,
PRES_ADJUSTED). It is recommended that users work with the delayed mode, or adjusted version,
if it is filled.
In the "D" files, the adjusted variables are filled with the data that have been corrected
after examination by the oceanographic experts, making them the highest quality data for that profile.
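The choice between the raw and adjusted variables can be reduced to a check on the DATA_MODE variable described above ('R' real time, 'A' real time with adjustment, 'D' delayed mode). A minimal sketch of that selection logic, written as a plain function so it applies whatever netCDF reader you use:

```python
def choose_profile_values(data_mode, raw, adjusted):
    """Return the recommended science variable for one profile.

    data_mode: 'R', 'A', or 'D', as in the DATA_MODE variable of the
    profile file. raw/adjusted are the paired variables, e.g. PSAL and
    PSAL_ADJUSTED. In 'R' mode the adjusted array is typically fill
    values, so the raw variable is returned instead.
    """
    if data_mode in ("A", "D"):
        return adjusted
    return raw

# A delayed-mode profile yields the adjusted salinities.
print(choose_profile_values("D", [35.01, 35.02], [35.05, 35.06]))
```

The same function works for TEMP/TEMP_ADJUSTED and PRES/PRES_ADJUSTED, since all three pairs follow the same naming convention.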
Quality control flags
The qc flags are:
- 0 No QC tests have been performed
- 1 Observation good
- 2 Observation probably good (implies some uncertainty)
- 3 Observation thought to be bad but may be recoverable
- 4 Observation thought to be bad and irrecoverable
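In practice, most users keep only measurements flagged 1 or 2 and discard the rest. A small sketch of that filtering step (the example values are made up):

```python
def good_values(values, qc_flags, accept=(1, 2)):
    """Keep measurements whose QC flag is 1 (good) or 2 (probably good)."""
    return [v for v, q in zip(values, qc_flags) if q in accept]

# The point flagged 4 (bad and irrecoverable) is dropped.
print(good_values([3.5, 3.6, 99.9, 3.7], [1, 2, 4, 1]))
```

Each variable in the profile file carries its own QC array (e.g. PSAL_QC alongside PSAL), so the filter is applied per variable, not per profile.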
Argo Regional Centers (ARCs)
Argo has a number of regional centers (ARCs) whose functions include:
- Performing regional analysis of all the Argo data in the region
to assess its internal consistency as well as its consistency with
recent shipboard CTD data.
- Providing feedback to PIs about the results of the regional
analysis and possible outliers.
- Facilitating development of a reference database for delayed mode
quality control. This includes assembling the most recent CTD
data in their region.
- Preparing and distributing Argo data products on a regular basis.
The main data product will be a consistent Argo delayed mode
dataset for their region, but other products might include weekly
analyses of temperature, salinity and currents calculated from
floats. Documentation of these products will also be provided.
- North Atlantic ARC
- South Atlantic ARC
Tools for assisting with Argo data handling
Some people have difficulty working with NetCDF format files on the Argo GDAC servers.
Information on NetCDF can be found on the UCAR website.
Here is a
simple Matlab program to read in a netCDF Argo file.
Users are encouraged to share the tools they develop with the rest of the Argo Community.
1 Owens, W.B. and A.P.S. Wong, 2009: An improved calibration method for the drift
of the conductivity sensor on autonomous CTD profiling floats by theta-S climatology. Deep
Sea Research Part I: Oceanographic Research Papers, 56, 450-457.
Böhme, L. and U. Send, 2005: Objective analyses of hydrographic data for referencing
profiling float salinities in highly variable environments. Deep Sea Research Part II:
Topical Studies in Oceanography, 52, 651-664.
Wong, A.P.S., G.C. Johnson and W.B. Owens, 2003: Delayed-mode calibration of Autonomous CTD
profiling float salinity data by Theta-S climatology. Journal of Atmospheric and Oceanic
Technology, 20, 308-318.