Argo world 

A beginner's guide to accessing Argo data


Argo collects salinity and temperature profiles from a sparse (average 3° x 3° spacing) array of robotic floats that populate the ice-free oceans deeper than about 2000 m. The floats also provide information on surface and subsurface currents. Most profiles contain about 200 data points, though floats with high-speed communications can return many more. The first Argo floats were deployed in late 1999, and the array contains more than 3600 active floats as of 2014. Argo data are made available to users quickly and free of restriction.

Complete documentation of the Argo data system is on the Documentation section of the ADMT website. In particular, the "Argo user's manual" and the "Argo quality control manual" may be helpful.

Accessing Argo data
via GTS for operational centers
via DOIs on the GDACs
via the NetCDF files on the GDACs
via Data selection tools on the GDACs
via Gridded files based on Argo NetCDF files at GDACs
via Data Viewers that incorporate Argo data
via Individual float data and plots at Coriolis GDAC
via GADR for archived and offline data
Quality of Argo data
Real time data
Delayed mode data
Citing Argo data
Choosing the correct DOI
Online resources
Argo Data Management Team website
Argo Regional Centers
GitHub repository for OW delayed mode quality control tools in Matlab
Tools for netCDF files
GTS data stream (for operational center use)

Data that go out on the GTS are subject to a number of quality checks in national data assembly centers and only those measurements within a profile that pass all tests (see real time quality control tests) are inserted onto the GTS. The checks are also used to set quality flags in the data that go on to enter other data streams detailed below.

Currently the data are distributed in both the TESAC and the BUFR formats, with the expectation of switching entirely to BUFR in the future. Temperature and salinity data are truncated to two decimal places. The vertical co-ordinate is depth, not pressure (as measured). No corrections are applied to salinities in the GTS data stream. The BUFR format is less restrictive than TESAC and allows quality flags to be included.

An Argo grey list of floats is maintained at the GDACs (here and here). The grey list contains floats that may have problems with one or more sensors. Data from grey-listed floats are therefore not sent out on the GTS and should be treated with caution.

DOIs on the GDACs

For users who want to use the actual Argo NetCDF files in a scientific paper, an Argo DOI is the best option to ensure reproducibility of data and scientific results. There are several DOIs associated with Argo data, all of which are described on the Argo Data Management page. Each month, a snapshot of the GDAC is taken (learn more about the GDACs) and assigned a DOI. When it is time to write a paper, obtain the most recent monthly DOI, a zipped tarball, and use that as the starting point for all Argo data analysis. There is no need to mirror the Argo data via ftp; the snapshot delivers the entire archive as one file.

GDAC data stream (for NetCDF Argo files)

For users interested in manipulating the actual Argo NetCDF files, the GDACs are the best route to access Argo data. Both GDACs offer access to the complete Argo data collection, including float metadata, detailed trajectory data, profile data and technical data, all in NetCDF format. The data are organized either geographically (by ocean basin) or by Data Assembly Centre (DAC), and are provided via DOIs, HTTP and FTP as well as through data browsers and other portals such as OPeNDAP, OGC-WCS, etc. It is important to understand the naming system of the files as well as the variable names and quality control flags within each data file. See the ADMT Documentation page for the Argo User's Manual and the Argo quality control manual for more information.
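As a small illustration of the file-naming system, the sketch below parses the common profile file pattern, in which a leading "R" or "D" marks real time or delayed mode, followed by the float's WMO number and cycle number, with a trailing "D" marking a descending profile. The `parse_profile_name` helper and the exact pattern are assumptions for illustration; the Argo User's Manual is the authoritative reference.

```python
import re

# Hypothetical helper: parse an Argo profile file name such as
# "R1901234_012.nc" (real time) or "D1901234_012.nc" (delayed mode).
# The pattern is an assumption based on the common GDAC convention;
# consult the Argo User's Manual for the authoritative rules.
PROFILE_NAME = re.compile(
    r"^(?P<mode>[RD])"       # R = real time, D = delayed mode
    r"(?P<float_id>\d+)"     # WMO platform number
    r"_(?P<cycle>\d+)"       # cycle number
    r"(?P<direction>D?)"     # trailing D marks a descending profile
    r"\.nc$"
)

def parse_profile_name(name):
    """Return a dict describing the file, or None if it does not match."""
    m = PROFILE_NAME.match(name)
    if m is None:
        return None
    return {
        "mode": "delayed" if m.group("mode") == "D" else "real time",
        "float_id": m.group("float_id"),
        "cycle": int(m.group("cycle")),
        "descending": m.group("direction") == "D",
    }

print(parse_profile_name("R1901234_012.nc"))
```

Parsing the name before opening the file makes it easy to, for example, skip real time files when a delayed mode version of the same cycle exists.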

Which GDAC should I use?

This depends on a number of factors but the first issue is location. North American users will likely use the Monterey site and European users the Coriolis site. Beyond this, the choice is a matter of which GDAC has a mode of access that best suits the user's needs.

HTTP and FTP Access

The http and ftp sites are identical and are organized into three main folders: a "dac" folder, which sorts the data by Data Assembly Centre (DAC); a "geo" folder, which sorts the data by ocean basin; and a "latest_data" folder, which contains the most recent data. There are also several index files in the top directory, each listing metadata for one type of Argo data file (meta, prof, tech and traj) contained in the "dac" and "geo" folders. It is possible to download these index files and search them for floats in specific regions, times, DACs, etc. There is also a grey list containing floats that likely have sensor problems.
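As an illustration of searching the index files, the sketch below filters a profile index by geographic box. The comma-separated column layout (file, date, latitude, longitude, ocean, profiler_type, institution, date_update) is an assumption drawn from the profile index convention, and the sample rows and `profiles_in_box` helper are hypothetical; check the header lines of the index file you actually download.

```python
import csv
import io

# Illustrative snippet of a GDAC profile index file. Lines starting
# with '#' are header comments; the column layout below is an
# assumption based on the ar_index_global_prof.txt convention.
SAMPLE = """\
# Title : Profile directory file of the Argo GDAC
file,date,latitude,longitude,ocean,profiler_type,institution,date_update
aoml/13857/profiles/R13857_001.nc,19970729200300,0.267,-16.032,A,845,AO,20080918131927
coriolis/6901004/profiles/D6901004_010.nc,20120301050000,41.500,5.100,M,846,IF,20140101000000
"""

def profiles_in_box(text, lat_min, lat_max, lon_min, lon_max):
    """Return index rows whose profile position falls inside the box."""
    lines = [l for l in text.splitlines() if l and not l.startswith("#")]
    reader = csv.DictReader(io.StringIO("\n".join(lines)))
    hits = []
    for row in reader:
        lat, lon = float(row["latitude"]), float(row["longitude"])
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            hits.append(row["file"])
    return hits

print(profiles_in_box(SAMPLE, 30, 50, -10, 20))
```

The paths returned can then be appended to the GDAC's http or ftp root to download just the matching profile files.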

via DOI

Since the start of 2014, a snapshot of the GDACs has been taken each month and assigned a DOI. Users can simply choose the latest month's snapshot and access it via the DOI link, making it easy to cite in publications.

via the GDAC synchronization service

The rsync server "" provides a synchronization service between the "dac" directory of the GDAC and a user mirror. See the ADMT website for more details. From the user side, the rsync service:

  • Downloads the new files
  • Downloads the updated files
  • Removes the files that have been removed from the GDAC
  • Compresses/uncompresses the files during the transfer
  • Preserves the files' creation/update dates
  • Lists all the files that have been transferred (convenient for user-side post-processing)

Synchronization of a particular float:

  • rsync -avzh --delete /home/mydirectory/...

Synchronization of the whole dac directory of Argo GDAC:

  • rsync -avzh --delete /home/mydirectory/...

Other Portals

Argo float data and metadata are also available via a variety of web services such as OPeNDAP, OGC-WCS, etc. See the ADMT website for more information.

Data selection tools on the GDACs

Both GDACs also provide data selection tools which allow users to enter different search criteria and then select different ways to display and receive the data. Regardless of the GDAC, users can search for data using the following means of selecting data:

  • By selecting a latitude/longitude and time range
  • By DAC (Data Assembly Centre)
  • By Delayed Mode or Real Time quality

After providing selection criteria at the US GDAC Data Browser, one can expect the following:

  • A profile location plot for all profiles returned by the query (may be plotted with or without float ID for queries returning many profiles)
  • Download of selected profiles (in netCDF Multi-Profile format) as a TAR file
  • Plots of T-P and S-P for individual profiles
  • Plots of float tracks for individual floats

Image of US GDAC webpage interface

The Coriolis GDAC interface allows one to search by a few additional criteria:

  • profile or trajectory file
  • measured parameter
  • platform type: access to additional non-Argo data (XBT, CTD, drifters, moorings, thermosalinograph, ADCP) is available from Coriolis through the same interface

The results can be downloaded as either NetCDF or ASCII data and a map is displayed with the locations of all the resultant floats.

Gridded files based on Argo NetCDF files at the GDACs

The Argo website maintains a list of gridded fields based on Argo data. These gridded files can be based exclusively on Argo data or can include other data sources as well. They are all freely available and are usually NetCDF, but some are ASCII. The gridded fields can sometimes provide an easier method to examine some water properties and are an extension of the Argo dataset.

While each grid producer makes every effort to remove errors from their grid, some might still exist. Therefore, users are reminded that these grids are to be used cautiously. Please notify the grid producer if an error is discovered.

Data Viewers that incorporate Argo data

There are a variety of data viewers that allow users to look at Argo data. The Global Marine Argo Atlas is a graphical user interface (GUI) that helps users easily look at Argo data and compare it to other global gridded data sets like Reynolds SST and Aviso altimetry. Users can choose from maps, sections, time series, line drawings and a few products, and can enter geographic specifications, time limits, depths, etc. to customize the plot. Postscript and JPG output is available, as well as ASCII or NetCDF output of the plotted data. The Atlas is updated monthly for the PC version; the Mac version is static, although users can manually update it each month.

For users interested in the Indian Ocean, there is the Indian Ocean Argo Data Viewing Application. This GUI offers the option to look at profile data and value added products like temperature, salinity, currents, etc. at various depths in the water column. This viewer works on Windows and covers 2001 - March 2014.

Ocean Data View (ODV) and Java Ocean Atlas (JOA) also support Argo profile data.

There is an Argo Google Earth layer produced by the Argo Information Centre that shows the positions of all active and inactive floats and makes it possible to plot all the trajectories in a certain region and time range. The balloons that appear when a user clicks on a float location show profile data from that float, making it possible to quickly look at profiles from floats in various areas of the world ocean.

Both GDACs offer data selection tools described in the Data selection tools on the GDACs section above.

Individual float data and plots at Coriolis GDAC

For users interested in quickly getting information about individual floats, the Coriolis GDAC offers a description of all floats by WMO ID#. The description includes some meta data about the float, plots of the float's trajectory and data, and links to download individual files in NetCDF or ASCII.

GADR data at US NCEI

The US NCEI maintains the Global Argo Data Repository (GADR), which serves as a long-term archive for Argo data. The GADR is responsible for managing updates to Argo data that are reanalyzed at a later date and to which corrections may be applied.

While the GDACs are the main source of Argo data to users with high speed internet access, there will be some who cannot get data in this manner. The GADR can provide alternate means for users to get Argo data.

Real-time data

Real time data are the first form of Argo data available to the public. Because of the requirement to deliver data to users within 24 hours of the float transmitting its profile, the real time quality control tests are limited and automatic. If data from a float fail these tests, the data will not be distributed on the GTS but will still appear at the GDACs as NetCDF files. Real time files on the GDACs all start with "R".

The real time data should be free from gross errors in position, temperature and pressure. The uncalibrated salinity values are available on both the GTS and at the GDACs. If a salinity offset is known, it may appear as an "adjusted salinity" (PSAL_ADJUSTED) variable on the GDACs. These data are identified with 'R' in the "Data mode" variable if no adjustments were made and are identified with 'A' if an adjustment was made.
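The selection logic described above can be sketched in a few lines. The `best_salinity` helper and the dict-based profile are hypothetical stand-ins for variables read from a GDAC NetCDF file; the 'R'/'A'/'D' data-mode convention is the one described in this section.

```python
# Hypothetical sketch: use the adjusted variable when the profile's
# data mode is 'A' (adjusted) or 'D' (delayed mode), and the raw
# variable when it is 'R'. A plain dict stands in for the variables
# that would be read from a NetCDF profile file.
def best_salinity(profile):
    if profile["DATA_MODE"] in ("A", "D"):
        return profile["PSAL_ADJUSTED"]
    return profile["PSAL"]

raw = {"DATA_MODE": "R", "PSAL": [35.12, 35.10], "PSAL_ADJUSTED": None}
adj = {"DATA_MODE": "A", "PSAL": [35.12, 35.10],
       "PSAL_ADJUSTED": [35.02, 35.00]}

print(best_salinity(raw))   # falls back to the raw PSAL values
print(best_salinity(adj))   # uses the offset-corrected values
```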

In general these data should be consistent with ocean climatologies even though no climatology tests have been performed. For science applications sensitive to small pressure biases (e.g. calculations of global ocean heat content or mixed layer depth), it is not recommended to use "R" files.

The tests described below are not listed in the order of implementation, but all DACs apply the tests in the same order to the profile data.

  1. Platform ID *
  2. Impossible date *
  3. Impossible location *
  4. Position on land *
  5. Impossible speed *
  6. Global range test *
  7. Regional parameter range*
  8. Pressure increasing
  9. Spike test
  10. Top - bottom spike - obsolete
  11. Gradient test
  12. Digit rollover
  13. Stuck value
  14. Density inversion
  15. Grey list
  16. Gross salinity or temperature drift
  17. Visual QC - not mandatory in real time
  18. Frozen profile
  19. Pressure not greater than Deepest_Pressure +10%
Test 17 is not mandatory in real time. Tests marked * are also applied to trajectory data.
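As an example of how simple and automatic these checks are, here is a sketch of the spike test (test 9). The formula follows the Argo quality control manual, but the single 6.0 degC threshold and the `spike_flags` helper are illustrative simplifications: the manual specifies thresholds that depend on the parameter and on depth.

```python
# Sketch of the spike test: for each interior point V2 with
# neighbours V1 and V3, the test value is
#     |V2 - (V3 + V1)/2| - |(V3 - V1)/2|
# The 6.0 degC threshold used here is an illustrative value for
# shallow temperature data; the QC manual defines parameter- and
# depth-dependent thresholds.
def spike_flags(values, threshold=6.0):
    """Return a QC flag per point: 4 (bad) for spikes, 1 (good) otherwise."""
    flags = [1] * len(values)
    for i in range(1, len(values) - 1):
        v1, v2, v3 = values[i - 1], values[i], values[i + 1]
        test_value = abs(v2 - (v3 + v1) / 2.0) - abs((v3 - v1) / 2.0)
        if test_value > threshold:
            flags[i] = 4
    return flags

temps = [20.1, 19.8, 28.5, 19.5, 19.2]   # 28.5 is an obvious spike
print(spike_flags(temps))
```

The subtraction of |(V3 - V1)/2| keeps the test from flagging points that sit on a steep but smooth gradient, which is why it performs better than a simple difference from the neighbours.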

Delayed mode data

Delayed mode data profiles have been subjected to detailed scrutiny by oceanographic experts, and the adjusted salinity has been estimated by comparison with high quality ship-based CTD data and climatologies using the process described by OW1, WJO, or Böhme and Send. This process is carried out over a one-year data window, so no delayed mode observations can be less than one year old. These data are appropriate for applications sensitive to small pressure biases. Read below to learn which variables to use and how to interpret the quality control flags, which are vital to extracting the best quality data from the files.

Variable names

It is important to understand the basic variable naming structure within the profile files. There are two versions of temperature, salinity and pressure in each file: a "real time" version (TEMP, PSAL, PRES) and a "delayed mode" version (TEMP_ADJUSTED, PSAL_ADJUSTED, PRES_ADJUSTED). It is recommended that users work with the delayed mode, or adjusted version, if it is filled.

In the "D" files, the adjusted variables are filled with data that have been corrected after examination by oceanographic experts, making them the highest quality data for that profile.

Quality control flags

The qc flags are:
  • 0 No QC tests have been performed
  • 1 Observation good
  • 2 Observation probably good (implies some uncertainty)
  • 3 Observation thought to be bad but may be recoverable
  • 4 Observation thought to be bad and irrecoverable
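A minimal sketch of applying these flags when reading a profile: keep only points flagged 1 or 2. The `keep_good` helper and the sample values are hypothetical; Argo stores the per-point flags as a character string, one digit per level.

```python
# Hypothetical sketch: keep only the measurements flagged 1 (good)
# or 2 (probably good). The flag string is shown the way Argo stores
# it, one character per vertical level.
def keep_good(values, qc_string, accept=("1", "2")):
    return [v for v, f in zip(values, qc_string) if f in accept]

psal = [35.10, 35.12, 37.90, 35.15, 35.16]
psal_qc = "11412"   # point 3 failed a test; point 5 is "probably good"
print(keep_good(psal, psal_qc))
```

Passing `accept=("1",)` instead restricts the result to points that passed every test, which may be preferable for applications sensitive to small errors.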

Citing Argo Data with a DOI

Argo data are freely available without restriction. However, to track uptake and impact, we ask that where Argo data are used in a publication or product, an acknowledgement be given. The Argo Data Management Team assigns Digital Object Identifiers (DOIs) to Argo documents and datasets. These can easily be included in publications and keep a direct and permanent link to the Argo document or data set used (ensuring reproducibility). The data sets are archived in monthly snapshots, each with its own DOI.

To acknowledge Argo, please use the following sentence and place the appropriate Argo DOI afterwards as described below.

"These data were collected and made freely available by the International Argo Program and the national programs that contribute to it. The Argo Program is part of the Global Ocean Observing System."

There are two options when picking an Argo DOI:

(1) Pick the one associated with when you obtained the data from the Argo GDAC. Please refer to the ADMT's DOI page to find the correct DOI to include in your publication or product.


(2) If you did not obtain the Argo data from the GDAC, but from an Argo data product or other source without a date for the Argo data attached to it, use the general one with no time associated with it:

Argo (2000). Argo float data and metadata from Global Data Assembly Centre (Argo GDAC). SEANOE.

Argo Regional Centers (ARCs)

Argo has a number of regional centers whose functions include:

  • Performing regional analysis of all the Argo data in the region to assess its internal consistency as well as its consistency with recent shipboard CTD data.
  • Providing feedback to PIs about the results of the regional analysis and possible outliers.
  • Facilitating development of a reference database for delayed mode quality control. This includes assembling the most recent CTD data in their region.
  • Preparing and distributing Argo data products on a regular basis. The main data product will be a consistent Argo delayed mode dataset for their region, but other products might include weekly analyses of temperature, salinity and currents calculated from floats. Documentation of these products will also be provided.
The centers are identified as follows:

Pacific ARC:

North Atlantic ARC:

South Atlantic ARC:

Indian ARC:

Southern ARC:

MedArgo ARC:

GitHub repository for OW delayed mode quality control tools in Matlab

Cecile Cabanes has set up a GitHub repository, matlabow, which contains the OW software used for calibrating profiling float conductivity sensor drift. It sits under a general DMQC repository for all tools that aid in the DMQC process. If you have tools you'd like to add to this repository, email for more information.

This new release of OW in Matlab includes the updates proposed in Cabanes et al. (Deep Sea Research Part I, 2016). Details of the updates can be found in the commits tab and in the doc/README.DOC file.

Tools for assisting with Argo data handling

Some people have difficulty working with NetCDF format files on the Argo GDAC servers. Information on NetCDF can be found on the UCAR website.

Here are some simple Matlab programs to read in a netCDF Argo profile or trajectory file. There are also programs to plot the CTD data from the profile file and the trajectory locations.

Users are encouraged to share the tools they develop with the rest of the Argo Community.

1 Owens, W.B. and A.P.S. Wong, 2009: An improved calibration method for the drift of the conductivity sensor on autonomous CTD profiling floats by theta-S climatology. Deep Sea Research Part I: Oceanographic Research Papers, 56, 450-457.

Böhme, L. and U. Send, 2005: Objective analyses of hydrographic data for referencing profiling float salinities in highly variable environments. Deep Sea Research Part II: Topical Studies in Oceanography, 52, 651-664.

Wong, A.P.S., G.C. Johnson and W.B. Owens, 2003: Delayed-mode calibration of Autonomous CTD profiling float salinity data by Theta-S climatology. Journal of Atmospheric and Oceanic Technology, 20, 308-318.