OceanObs'09 - Additional Contributions

 
Session: Information Synthesis and Delivery (04C)


Data Tools and Services at Physical Oceanography DAAC
Bingham, Andrew; Thompson, C; Stough, T; Henderson, M; Pan, L; Mattmann, C
Jet Propulsion Laboratory, UNITED STATES

PO.DAAC Overview
The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archives and distributes NASA's satellite data and associated information pertaining to the state of Earth's oceans.
PO.DAAC supports a diverse community of over 12,000 users that includes ocean and climate researchers, operational agencies, ocean resource managers, educators and the general public. PO.DAAC has developed a set of tools and services for searching and acquiring data from its holdings, which exceed 50 TB. Moreover, these tools and services are continually evolving to keep pace with technological advancements, especially as they relate to web services.

Existing Tools & Services
POET (http://poet.jpl.nasa.gov)
The PO.DAAC Ocean ESIP Tool (POET) provides interactive, on-line subsetting and visualization for many of PO.DAAC's gridded (Level-3) data products. Viewing options include latitude-longitude maps, animations, time series plots, and space-time profiles. In addition, this tool can handle WMS/WCS requests.

SCCOOS Portal (http://sccoos.jpl.nasa.gov)
As part of the Southern California Coastal Ocean Observing System, this portal serves out high-resolution, near real-time images and data that support several coastal resource management applications.

FTP/HEFT (ftp://podaac.jpl.nasa.gov)
All PO.DAAC data are freely available via the PO.DAAC FTP site. The site is laid out in a standardized and logical directory structure, which helps users quickly navigate to the data of interest. Each data set is accompanied by a README file, links to documentation and sample software to read the data. The High Efficiency File Transfer (HEFT) service requires users to download a client, in return offering transfer speeds on the order of 1000 times that of standard FTP.

Datacasting (http://podaac.jpl.nasa.gov/datacasting)
Datacasting uses RSS feeds to notify users when a new data granule (data file) is made available. With the Datacasting Feed Reader, users are able to subscribe to feeds and download granules immediately to their computer. Moreover, they can create filters based on metadata tags in the feed to limit which files get downloaded: for example, only granules that pass through a specified region or contain data related to a specific event.
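
As a concrete illustration, the sketch below filters a Datacasting-style RSS feed by a geographic region on the client side. The feed URL and bounding-box tag names are hypothetical placeholders rather than the actual Datacasting schema; only the standard RSS item and enclosure elements are assumed.

    # Hedged sketch: subscribe to a Datacasting-style RSS feed and download
    # only granules whose (assumed) bounding-box tags overlap a region.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://podaac.jpl.nasa.gov/datacasting/feed.xml"  # hypothetical
    REGION = (-130.0, 20.0, -110.0, 40.0)  # lon_min, lat_min, lon_max, lat_max

    def overlaps(a, b):
        """True if two (lon_min, lat_min, lon_max, lat_max) boxes intersect."""
        return not (a[2] < b[0] or a[0] > b[2] or a[3] < b[1] or a[1] > b[3])

    with urllib.request.urlopen(FEED_URL) as f:
        root = ET.parse(f).getroot()

    for item in root.iter("item"):
        # "lonMin" etc. are assumed metadata tags; the real feed may differ.
        bbox = tuple(float(item.findtext(t)) for t in
                     ("lonMin", "latMin", "lonMax", "latMax"))
        if overlaps(bbox, REGION):
            print("would download:", item.find("enclosure").get("url"))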

Hurricane/Typhoon Tracker (http://podaac.jpl.nasa.gov/hurricanes)
This tool tracks the location of historical and on-going hurricanes and provides overlays of ultra high-resolution wind images (from QuikSCAT) and optimally interpolated 5 km sea surface temperature.

Tools and Services under Development
Granule-based Searches
Using the OpenSearch protocol, this search feature will provide a free-text or machine-to-machine query interface to quickly identify granules based on the full set of metadata maintained in the PO.DAAC inventory.
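
A machine-to-machine query against such an interface might look like the following sketch; the endpoint URL and parameter names are hypothetical stand-ins for whatever the production OpenSearch description document advertises.

    # Hedged sketch of an OpenSearch-style granule query (hypothetical
    # endpoint and parameter names; consult the service's OpenSearch
    # description document for the real URL template).
    import urllib.parse
    import urllib.request

    ENDPOINT = "http://podaac.jpl.nasa.gov/ws/search/granule"  # hypothetical
    params = {
        "q": "QuikSCAT wind speed",            # free-text search terms
        "startTime": "2009-01-01T00:00:00Z",
        "endTime": "2009-01-31T23:59:59Z",
        "bbox": "-130,20,-110,40",             # spatial constraint
        "format": "atom",                      # Atom response feed
    }
    url = ENDPOINT + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as f:
        print(f.read(500))  # first bytes of the Atom feed of matching granules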

Level-2 (Swath-based) Subsetter
This capability will give users the ability to subset swath-based data granules by time, space and parameter, and output the data in a standardized NetCDF file format, as well as other common image formats and standards, such as GeoTIFF and KML.
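
The core of such a subsetter can be sketched in a few lines; the file and variable names here are illustrative, since real Level-2 granules differ by mission.

    # Hedged sketch of time/space/parameter subsetting of a Level-2 swath
    # granule (illustrative file and variable names).
    import numpy as np
    from netCDF4 import Dataset

    src = Dataset("swath_granule.nc")  # hypothetical input granule
    lat, lon = src["lat"][:], src["lon"][:]
    sst = src["sea_surface_temperature"][:]  # the requested parameter

    # Keep along-track scan lines that intersect the region of interest.
    keep = np.any((lat > 20) & (lat < 40) & (lon > -130) & (lon < -110), axis=1)

    out = Dataset("subset.nc", "w")
    out.createDimension("scan", int(keep.sum()))
    out.createDimension("cell", sst.shape[1])
    for name, data in (("lat", lat), ("lon", lon),
                       ("sea_surface_temperature", sst)):
        out.createVariable(name, "f4", ("scan", "cell"))[:] = data[keep]
    out.close()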

Cutting Edge Technologies
PO.DAAC is partnering with several research and development teams funded under the NASA ACCESS program to infuse cutting-edge technologies into an operational setting. The Virtual Oceanographic Data Center (Mattmann et al.) will utilize modern search technologies from the Apache Software Foundation, including Lucene and Solr, to create web services and a common portal allowing free-text and facet-based searching of ocean data and metadata from NASA ocean missions (OSTM, GHRSST), NOAA, and the National Virtual Ocean Data System (NVODS). The Web-Based Altimetry Service (Callahan et al.) uses the SciFlo technology to give users the capability to select different altimeter processing algorithms and create altimeter products tailored to a localized region.

 
Web-based Altimeter Service
Callahan, Philip S.; Xing, Zhangfan; Raskin, Robert G.; Oslund, Kenneth A.; Wilson, Brian D.
Jet Propulsion Laboratory, UNITED STATES

We are developing a web-based system to allow updating and subsetting of altimeter data. This is crucial to the expanded use and improvement of altimeter data. The service aspect is necessary for altimetry because the result of most interest (sea surface height anomaly, SSHA) is composed of several components which are updated individually and irregularly by specialized experts. This makes it difficult for projects to provide the most up-to-date products. Some components are the subject of ongoing research, so the ability for investigators to make products for comparison or sharing is important. The service will allow investigators/producers to get their component models or processing into widespread use much more quickly. For coastal altimetry, the ability to subset the data to the area of interest and insert specialized models or data processing results is crucial.

A key part of the Altimeter Service is having data producers provide updated or local models and data. In order for this to succeed, producers need to register their products with the Altimeter Service and agree to provide the product either on demand or in a way that can be integrated into the basic altimeter data record structure.

We will describe the basic structure of the web service and the steps toward implementation. We will integrate the web and Grid workflow features of SciFlo with algorithms developed for the Ocean Surface Topography Science Team work to produce improved Geophysical Data Records (GDRs) with retracking (RGDRs) and other improved data elements. TOPEX RGDRs in a netCDF format that has been coordinated with Jason data will be the initial basis of the service. The goal is to allow individual users to produce their own GDRs and/or SSHA data sets using data components that they select from known sources or supply themselves. In particular, we will enable for the first time customized and easily repeatable regional studies by allowing users to swap in accurate, high-resolution, local models (tides and other corrections) and update the SSH and SSHA for regions of interest. In addition to time and space subsetting, we will provide the ability to select variables of interest as the data will be in netCDF, allowing straightforward extraction of data elements.
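
To make the component-swapping idea concrete, the sketch below recomputes SSHA from a hypothetical RGDR granule by exchanging the global ocean-tide correction for a user-supplied regional model. The file and variable names are illustrative, not the actual RGDR schema.

    # Hedged sketch: swap one correction component and update SSHA.
    # Adding back the old correction and subtracting the new one updates
    # the anomaly without touching the other components.
    import numpy as np
    from netCDF4 import Dataset

    gdr = Dataset("topex_rgdr_cycle021.nc")   # hypothetical RGDR granule
    ssha_old = gdr["ssha"][:]                  # anomaly with global tide removed
    global_tide = gdr["ocean_tide"][:]         # correction used originally

    local_tide = np.load("regional_tide_model.npy")  # user-supplied correction

    ssha_new = ssha_old + global_tide - local_tide
    print("mean SSHA change (m):", float((ssha_new - ssha_old).mean()))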

The research described here was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

 
Operational Quality Control Monitoring of Envisat RA-2 Data
Cotton, David1; Nogueira Loddo, Carolina2; Féménias, Pierre3; Morabito, Bruno4; Pinori, Sabrina2
1SatOC, UNITED KINGDOM;
2SERCO, ITALY;
3ESA, ITALY;
4VEGA, ITALY

Operational quality control monitoring of data streams from all of ESA's space-borne Earth Observation instruments is carried out under the IDEAS contract. IDEAS started operations on 1 August 2008, supporting the ESA Sensor Performance Products and Algorithms (SPPA) team and replacing the previous DPQC (Data Processing and Quality Control) activity.

IDEAS provides the following services:

  • Handling of user requests
  • Operational Quality Control of ESA and Third Party Mission products
  • Support to CAL/VAL teams where appropriate
  • Maintenance of consortium QC / analysis tools
  • Provision of adequate hardware and servers for service access

    At present, IDEAS activities are managed in four instrument "families":

  • Atmospheric Chemistry (SCIAMACHY, GOMOS, MIPAS, GOME)
  • Optical (MERIS, AATSR, LANDSAT, PRISM, AVNIR)
  • Altimetry (RA, RA-2, CRYOSAT)
  • SAR (SAR, ASAR, SCAT, PALSAR)

    In this presentation we focus on data from the ENVISAT RA-2 instrument, and highlight key issues that have been identified in the past year through this important monitoring activity. These include a study into the possible impacts of the new IF filters acquired under a revised procedure put in place after cycle 66, which addresses the problems caused by the anomalous IF masks acquired up until that time.

    We also summarise data handling recommendations, provide information on how users can access Quality Reports, and invite users to comment on the usefulness of the information that is made available.

 
    GODAE Ocean Data Quality Control Intercomparison Project
    Cummings, James1; Keeley, Robert2; Martin, Matthew3; Carval, Thierry4; Brassington, Gary5
    1Naval Research Laboratory, UNITED STATES;
    2Department of Fisheries and Oceans, CANADA;
    3UK MetOffice, UNITED KINGDOM;
    4IFREMER, FRANCE;
    5CAWCR, Bureau of Meteorology, AUSTRALIA

    A workshop was organized prior to the Biarritz GODAE symposium to discuss the potential and priorities for the exchange of information and collaboration on the quality control of ocean observations. The workshop was the initial step in a process that has evolved into a comprehensive ocean data quality control intercomparison project. Currently, outcomes of profile data quality control procedures from 5 oceanographic centers are available on the US GODAE server: http://www.usgodae.org/ftp/outgoing/godae_qc. The contributing centers include: (1) U.S. Navy Fleet Numerical Meteorology and Oceanography Center (FNMOC); (2) U.K. Met Office (UKMO); (3) Marine Environmental Data Service of Canada (MEDS); (4) Australian Bureau of Meteorology (BMRC); and (5) French Coriolis Data Center (Coriolis). Daily inputs of profile QC data from the centers are matched and used to create NetCDF formatted WMO call sign data files. A WMO call sign file contains the entire time history of the reporting platform and all of the QC information used by the center to determine profile data quality. WMO call sign data files exist for the time period 2004 to the present and are updated daily as new profile data QC information is received from the centers.
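
    As an illustration of how such a file might be used, the sketch below opens one call sign file and summarises its QC flags; the file name, variable names and flag scale are hypothetical, since the actual schema is defined by the project.

        # Hedged sketch: inspect the QC time history in a WMO call sign
        # NetCDF file (hypothetical names; see the US GODAE server for the
        # real files and their conventions).
        from netCDF4 import Dataset

        cs = Dataset("13001_profiles.nc")  # hypothetical call sign 13001
        time = cs["juld"][:]               # assumed observation-date variable
        temp_qc = cs["temp_qc"][:]         # assumed per-level QC flags

        print("profiles in time history:", len(time))
        print("fraction flagged suspect/bad:", float((temp_qc >= 3).mean()))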

    The WMO-based call sign files have a variety of applications. First, the call sign files allow GOOS data providers access to information about the fate of their data in GODAE analysis/forecast systems. Oceanographic centers are in the best position to operate ocean data quality control systems, and the call sign data files provide a way to relay real-time QC information back to program managers and operators regarding utilization of their buoy, XBT, and profiling float observing networks. Second, the WMO call sign files provide a way for the oceanographic centers to compare their ocean data quality control systems. The quality control procedures used at the centers are expected to vary substantially depending upon the type of data being considered and whether extensive use is made of ocean model first guess fields or whether more specific tools (e.g., instrumentation error checks), manual checks, and comparisons with climatology are used at the center. Finally, the time history aspect of the call sign files provides a natural way to look at systematic problems (biases) in the reporting platform, such as sensor drift or calibration errors.

    In this paper we review the development and future of the GODAE Ocean Data Quality Control Intercomparison Project. We describe the design and process of generating the WMO call sign data files using daily outputs of profile data quality control information from the oceanographic centers. The WMO call sign data files represent the starting point of all follow-on analysis and intercomparison of ocean data QC outcomes. The project is initially focusing on the ocean profile data, but the system can easily be expanded to include additional ocean data types, QC variables, and analysis/application tools.

     
    Argo and Synthesis Products Developed and Served at the Asia-Pacific Data-Research Center
    Hacker, Peter; Maximenko, Nikolai; Potemra, Jim; Lebedev, Konstantin; DeCarlo, Sharon; Shen, Yingshuo
    University of Hawaii, IPRC, UNITED STATES

    The Asia-Pacific Data-Research Center (APDRC) within the International Pacific Research Center (IPRC) at the University of Hawaii offers a web-based data and product server system, which provides access to a range of in situ, model-based and satellite-based products. Initiated in 2001, the center's primary motivation has been to give the broad user community easy access to the wide range of climate data and products that are often underutilized because they are difficult to obtain. Working closely with our NOAA/PMEL partners, the center has implemented a data server system using the OPeNDAP protocol in order to provide web-based access to atmospheric, oceanic, and air/sea flux products, which can be directly accessed via client-based software such as GrADS, Matlab, Ferret, and FORTRAN code. The system uses a range of servers including LAS and OPeNDAP (THREDDS and GDS) for gridded products, and DAPPER/DChart for in situ data.
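
    By way of example, the same server-side subsetting that GrADS or Ferret perform can be exercised from any OPeNDAP-aware client; the sketch below uses Python's netCDF4 library against a hypothetical APDRC dataset path (browse the catalog at http://apdrc.soest.hawaii.edu for real URLs).

        # Hedged sketch of client-side OPeNDAP access (hypothetical dataset
        # path; netCDF4-python opens OPeNDAP URLs like local files).
        from netCDF4 import Dataset

        url = "http://apdrc.soest.hawaii.edu/dods/public_data/example"  # hypothetical
        ds = Dataset(url)
        print(list(ds.variables))
        # Only the requested slab crosses the network (server-side subsetting).
        slab = ds["sst"][0, 100:120, 200:240]  # assumed variable name "sst"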

    Recently and into the future, the APDRC is shifting focus from data server infrastructure to the production of value-added products using the new observing system data sets such as Argo and satellite-based products. Global Argo products under development and available on our servers include: surface and deep velocities from float trajectories; profile data interpolated on standard depth levels and isopycnals; mixed layer, isothermal layer and barrier layer depths; and statistics, climatologies, and monthly and annual averages. Map products include information on data coverage, and are available as both gridded/interpolated products and spatial bin-averaged products. Synthesis products under development include absolute dynamic topography computed from Argo floats, drifters, satellite wind and altimetry data. Mean surface dynamic topography is computed from drifter, wind and altimetry data. Instantaneous surface dynamic topography is obtained from Mean Dynamic Ocean Topography (MDOT). Absolute dynamic topography at depth is calculated from Argo T/S profiles by integrating surface topography downward. The horizontal gradient of absolute dynamic topography at Argo float parking-depths is assessed from float velocities and geostrophy. At the present time, these products are updated monthly.

    Several collaborative activities are envisaged for the future. Since a variety of Argo products are currently being produced by several centers and by individual researchers, we propose hosting an 'open work space' on our website for product demonstration, evaluation and intercomparison, and for comments and discussion. The goal would be to increase product quality and utility.

    Future product development at the APDRC will include the combined use of Aquarius sea surface salinity data with Argo data to provide global products on the space/time variability of oceanic salinity fields and regional salinity fronts.

    In order to provide increasing utility in the future to the broad user community including applications, management, and the general public (in addition to the traditional research communities), we plan to make products available via Google Earth, Google Maps and Geographic Information System (GIS) formats.

    The IPRC/APDRC server address is: http://apdrc.soest.hawaii.edu with Argo products available at http://apdrc.soest.hawaii.edu/projects/argo/.

     
    The GENESI-DR infrastructure: an opportunity for the ocean science community
    Kaiser-Weiss, Andrea1; Migliorini, Stefano1; Manzella, Giuseppe2; Cossu, Roberto3; Hosford, Steven4; Li Santi, Eliana3; Fusco, Luigi3
    1University of Reading, UNITED KINGDOM;
    2ENEA, ITALY;
    3ESA, ITALY;
    4CNES, FRANCE

    Ground European Network for Earth Science Interoperations - Digital Repositories (GENESI-DR) (http://www.genesi-dr.eu/) is an ESA-led, European Commission funded two-year project, aimed at providing reliable, easy, long-term access to historical and recently acquired Earth science data from space, airborne and in-situ sensors archived in large distributed repositories. The specific strength of GENESI-DR lies in the concept of offering a single access point to the petabytes of heterogeneous data located at a variety of individual data repositories. Here we will show how the already deployed infrastructure, which currently involves 9 different digital repositories (from ESA, CNES, DLR, KSAT, ASI, NILU, Infoterra, JRC, ENEA), allows scientists to easily discover, access, and even process heterogeneous and scattered data from a single access point. We will discuss how the GENESI-DR e-Infrastructure can inter-operate with other infrastructures (SeaDataNet) and how it is being validated against an ocean-related application (ENEA and CNR ISAC subset of SeaDataNet distributed databases). As an example, we will demonstrate the following data: (a) daily generated sea surface temperature (SST) maps archived at the Italian National Council of Research; (b) vertical profiles of sea temperature measured by Volunteer Opportunity Ships (VOS) and archived at ENEA; (c) SST and chlorophyll maps, generated on-the-fly from satellite data stored at ESA using computational resources federated to GENESI-DR, and based on the parameters set by the user. Finally, we will provide training for scientists interested in using GENESI-DR for data access and processing. Training will also be available to data repository holders who would like to "genesi-fy" their data, i.e., to link their own data (or data repositories) to GENESI-DR.

     
    Design of Future Altimeter Missions: The End-to-End Thematic Simulator
    Lombard, A.1; Auge, E.1; Lamouroux, J.2; Lambin, J.1; Lyard, F.3; De Mey, P.3; Pénard, C.2; Lalanne, T.2; Jeansou, E.2; Roblou, L.3
    1CNES, FRANCE;
    2NOVELTIS, FRANCE;
    3LEGOS, FRANCE

    In the current frame of debates on future altimetry constellation design, the need for a decision-making tool has been highlighted by CNES and realised through the development of an end-to-end altimeter thematic simulator. This simple, flexible and extensible tool aims at examining the merits of various observing configurations and discriminating among them.

    The present study describes the current prototype of this end-to-end mission simulator for altimetry. Based on a simplified version of the recently published Ensemble Twin Experiments methodology (Mourre et al., 2006), the simulator aims at quantifying the potential of an altimetry observing system by estimating its ability to reduce the statistical error of a storm surge model as well as a tide model, in the Bay of Biscay, English Channel and Celtic Sea region. A relative performance score helps discriminate among the various observing scenarios (number of satellites, orbits, instrument type, etc.).
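
    The scoring idea can be illustrated schematically: an observing scenario is ranked by how much assimilating its simulated observations shrinks the ensemble's statistical error relative to a free run. The sketch below is purely illustrative of that score, with synthetic numbers standing in for the simulator's model ensembles; it is not the simulator's actual algorithm.

        # Hedged sketch of a relative performance score for an observing
        # scenario in an ensemble twin experiment (synthetic data).
        import numpy as np

        def ensemble_error(ens):
            """RMS spread of ensemble members about the ensemble mean."""
            return float(np.sqrt(((ens - ens.mean(axis=0)) ** 2).mean()))

        rng = np.random.default_rng(0)
        free_run = rng.normal(0.0, 0.10, (50, 100, 100))  # no assimilation
        analysed = rng.normal(0.0, 0.06, (50, 100, 100))  # scenario assimilated

        score = 1.0 - ensemble_error(analysed) / ensemble_error(free_run)
        print(f"error reduction: {score:.0%}")  # higher = better scenario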

    Some validation and application case results are presented.

     
    Integrating ncWMS into the THREDDS Data Server
    Mak, P1; Blower, J2; Caron, J3; Davis, E3; Santokhee, A2; Bindoff, N4
    1Australian Research Collaboration Services (ARCS), AUSTRALIA;
    2Reading eScience Centre (ReSC), Environmental Systems Science Centre, University of Reading, UNITED KINGDOM;
    3Unidata, University Corporation for Atmospheric Research (UCAR), UNITED STATES;
    4TPAC, ACECRC, CMAR, CAWCR, IASOS, AUSTRALIA

    The THREDDS Data Server (TDS) is a framework for serving and cataloguing heterogeneous data types through common protocols over HTTP. It is middleware that simplifies the publication of and access to scientific data (Caron, John, Davis, E. R., Ho, Y. and Kambic, R. P., 2006). It has a significant global user base, with many ocean, climate and modelling communities using it to share data. The main advantage of the TDS server (and also of other OPeNDAP servers) is its use of the Data Access Protocol (DAP) to harmonise the delivery across the internet of a whole suite of self-describing file formats (currently 20 types) commonly used in these communities. Interoperability is enhanced with the use of the NetCDF Markup Language (ncML) (Nativi, Stefano, Caron, J., Davis, E., Domenico, B., 2005), where metadata views can be added to conform to a naming convention while the underlying files remain unchanged. Additionally, ncML offers aggregation of datasets, where large datasets spanning multiple files can be seen as a logical volume. These capabilities give TDS an enormous amount of flexibility to deliver heterogeneous files from legacy data sets and from diverse applications and sources across the internet through a uniform interface with simple client applications.

    However, sharing data across disciplines, such as with the GIS community, has been difficult, as the underlying protocol, DAP, does not allow data to be referenced in geospatial coordinates. This protocol depends on the structure of the underlying objects and uses indexes exclusively for referencing elements, and thus can be used for almost any indexed data (James Gallagher, N. Potter, T. Sgouros, S. Hankin, G. Flierl, 2007). The gap in protocols for geo-referenced data sets is being filled by the specification of a suite of web services from the Open Geospatial Consortium (OGC). This suite includes data access - most commonly Web Feature Service (WFS) and Web Coverage Service (WCS) - and visualisation - Web Map Service (WMS). WCS has already been integrated into the TDS framework (Nativi, S. and Domenico, B. and Caron, J. and Davis, E. and Bigagli, L, 2006). Adding WMS is a logical progression of features for TDS. Instead of implementing another WMS server from scratch, an existing server, ncWMS, was chosen for integration into TDS. The ncWMS server was developed by the Reading eScience Centre (ReSC) as part of the UK e-Science initiative to enable commonly used meteorological and oceanographic data sets available in the NetCDF file type to be delivered to the geographical information systems community using internationally recognised standards, such as WMS. This application allows visualisation of NetCDF data through this standard protocol, thus creating a bridge from NetCDF data types to the WMS standard.
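
    For readers unfamiliar with WMS, a GetMap request against an ncWMS/TDS endpoint reduces to a single parameterised URL returning a rendered image; in the sketch below the server URL and layer name are hypothetical, while the request parameters follow standard WMS 1.3.0.

        # Hedged sketch of a WMS 1.3.0 GetMap request (hypothetical server
        # and layer; parameter names follow the WMS standard).
        import urllib.parse
        import urllib.request

        base = "http://example.org/thredds/wms/ocean/sst.nc"  # hypothetical
        params = {
            "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
            "LAYERS": "sea_surface_temperature",  # assumed layer name
            "STYLES": "", "CRS": "CRS:84",
            "BBOX": "-130,20,-110,40", "WIDTH": "512", "HEIGHT": "512",
            "FORMAT": "image/png",
        }
        url = base + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as r:
            open("sst_map.png", "wb").write(r.read())  # rendered map image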

    The TDS server (Version 3.17) has the capacity to deliver data in the WCS and OPeNDAP protocols across the internet. The server is built on top of the core NetCDF-Java library - an implementation of Unidata's Common Data Model (CDM). CDM creates an abstraction layer over file formats and metadata conventions, such that it is possible to access data using temporal-spatial referencing systems through a single interface. ncWMS is a visualisation server that also uses the same NetCDF-Java library. It contains a user interface, Godiva2, that allows users to select and view configured layers. As TDS and ncWMS share many common libraries, integration could proceed without major changes to the code. Datasets are typically served through OPeNDAP using TDS, with additional servers installed and configured to enable visualisation; this requires managing multiple servers and essentially doubles the administration workload. The tight integration of ncWMS allows the visualisation service to be toggled like any other service in TDS. It also means that only a single server has to be administered.

    The work is now included as part of the TDS 4.0 stable release and is expected to form part of the infrastructure for the MyOcean project (http://www.myocean.eu.org). It is also used by the eMarine Information Infrastructure (eMII) to serve Integrated Marine Information System (IMOS) datasets (http://www.imos.org.au/).

    This project is made possible through the support of the NERC Knowledge Exchange Funding Scheme, Unidata, the Australian Research Collaboration Services (ARCS) and Australian National Data Services (ANDS).

     
    Unified Access to Distributed Data Sets: SeaDataNet - Pan-European Infrastructure for Marine and Ocean Data Management
    Manzella, G.1; Schaap, D.2; Rickards, L.3; Nast, F.4; Iona, S.5; Piessersen, P.6; Schlitzer, R.7; Beckers, J.M.8; Barale, V.9; Tonani, M.10; Maudire, G.11
    1ENEA, ITALY;
    2MARIS, NETHERLANDS;
    3NERC BODC, UNITED KINGDOM;
    4BSH, GERMANY;
    5HCMR, GREECE;
    6IOC IODE, NETHERLANDS;
    7AWI, GERMANY;
    8ULG, BELGIUM;
    9CEC, ITALY;
    10INGV, ITALY;
    11Ifremer, FRANCE

    Multidisciplinary oceanographic and marine data are collected by more than a thousand research institutes, governmental organizations and private companies in the countries bordering the European seas, using various heterogeneous observing sensors installed on research vessels, submarines, aircraft, moorings, drifting buoys and satellites. The various sensors measure physical parameters (temperature, salinity, current, sea level, optical properties, magnetic field, gravity), chemistry, biology, seabed characteristics, seabed depth, etc. The data are collected at very considerable cost and are of prime value because they are the reference for any study and, if lost, cannot be remade.

    These data and information are important not only for research, but also for monitoring, predicting and managing the marine environment, assessing fish stocks and biodiversity, offshore engineering, controlling any hazard or disaster, and the tourist industry. They support the execution of international protocols, conventions and agreements which have been signed by coastal states for protection of the seas, such as OSPAR, HELCOM and the Bucharest and Barcelona conventions. They are essential for implementation of Europe's environmental policy concerning Integrated Coastal Zone Management (ICZM), the Water Framework Directive, and the new Marine Strategy Directive. Overall there are many thousands of users, based in the research sector, government and industry.

    SeaDataNet is an Integrated Research Infrastructure Initiative (I3) in EU FP6 to provide a Pan-European data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. The SeaDataNet project started in 2006, but builds upon earlier data management infrastructure projects undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, but its primary objective is the provision of easy data access and generic data products.

    The SeaDataNet project has the following objectives:

  • To set up and operate an efficient Pan-European distributed infrastructure for managing marine and ocean data by connecting 40 National Oceanographic Data Centres (NODCs), national oceanographic focal points, and ocean satellite data centres in Europe. These Data Centres are mostly divisions of major national marine research institutes and are based in 35 countries surrounding the European seas.
  • To ensure consistent dataset quality and to provide on-line trans-national access to marine metadata, data, products and services through a unique portal, while the base data and information are stored and managed at the distributed data centres.
  • To secure the long-term archiving of the large volumes of multidisciplinary data.
  • To develop added value regional data products like gridded climatologies and trends, in partnership with scientific research laboratories.

 
    Utilization of Ocean Reanalysis Data for Climate Variability Analysis of the North Pacific Intermediate Water
    Matsumoto, Satoshi1; Fujii, Yosuke1; Yasuda, Tamaki1; Kamachi, Masafumi1; Nakano, Toshiya2
    1JMA/Meteorological Research Institute, JAPAN;
    2JMA, JAPAN

    Recently, ocean observations have become denser in time and space. Historical observation data are, however, not sufficient from the climate analysis point of view. Numerical ocean models have also improved, but they are necessarily affected by defects in the parameterization schemes of sub-grid scale phenomena and sea surface fluxes. Ocean reanalysis, on the other hand, gives more realistic, four-dimensional gridded historical data by synthesizing information from observations and the model. Therefore, ocean reanalysis data sets are beneficial to climate variability analyses of the historical ocean.

    We conducted ocean analysis/reanalysis experiments for the global ocean and the North Pacific. The MRI Multivariate Ocean Variational Estimation (MOVE) System was applied for these experiments. The system adopts a multivariate 3DVAR scheme, which uses a coupled temperature-salinity empirical orthogonal function decomposition in the vertical and a horizontal Gaussian structure for the background error covariance matrix. The periods of the analyses/reanalyses are 1948-2007 for the global ocean and 1955-2005 for the North Pacific. Resolutions in the global and North Pacific analyses are 1 degree (0.3 degree in the meridional direction in the tropical region) and 0.5 degree, respectively. The sea surface boundary condition for these analyses is the NCEP-R1 atmospheric reanalysis. Assimilated observation data are in situ observations of temperature and salinity profiles (World Ocean Database 2001, Global Temperature and Salinity Profile Project) and satellite altimetry sea surface height anomaly data (AVISO).
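
    For reference, a multivariate 3DVAR scheme of this kind minimises the standard cost function below (textbook form, not necessarily MOVE's exact formulation); MOVE's coupled T-S EOF modes and horizontal Gaussian correlations enter through the background error covariance B:

        % Generic 3DVAR cost function (LaTeX); x_b: background state,
        % y: observations, H: observation operator, B, R: background and
        % observation error covariances.
        J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                      + \tfrac{1}{2}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathrm{T}}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)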

    We have investigated ocean climate variability (e.g., subsurface ocean heat content) and water mass variability (e.g., the North Pacific Tropical Water and Intermediate Water) using these analysis/reanalysis datasets. In this paper, we report on climate change of the North Pacific Intermediate Water (NPIW). Freshening of NPIW over recent decades has been shown by observation-based analyses. We obtained the three-dimensional distribution of the freshening trend in the North Pacific over the last 40 years from ocean reanalysis data (MOVE_G_RA_2007). The trend is consistent with the observation-based analyses. The trend is large in the western sub-tropical region and in the upstream region of NPIW, i.e., the confluence zone of the western boundary currents of the sub-polar and sub-tropical gyres. The freshening upstream is caused by an increasing sub-polar water ratio in the mixed water and cannot be explained by changes in the characteristics of the sub-polar and sub-tropical waters themselves.

     
    Global Ocean and Sea Ice State Estimation in the Presence of Eddies
    Menemenlis, Dimitris1; Heimbach, Patrick2; Hill, Christopher N.2; Campin, Jean-Michel2; Forget, Gael2; Losch, Martin3; Nguyen, An T.1; Schodlok, Michael1; Zhang, Hong1
    1Jet Propulsion Laboratory, California Institute of Technology, UNITED STATES;
    2Massachusetts Institute of Technology, UNITED STATES;
    3Alfred Wegener Institute for Polar and Marine Research, GERMANY

    The Estimating the Circulation and Climate of the Ocean, Phase II (ECCO2) project aims to produce a best-possible, global, time-evolving synthesis of most available ocean and sea-ice data at a resolution that admits ocean eddies. A first ECCO2 synthesis for the period 1992-2002 has been obtained using a Green's Function approach to estimate initial temperature and salinity conditions, surface boundary conditions, and several empirical ocean and sea ice model parameters. Data constraints include altimetry, gravity, drifter, hydrography, and observations of sea ice. Although the control space is small (~80 parameters have been adjusted), this first global-ocean and sea ice data synthesis substantially reduces large-scale biases and drifts of the model relative to observations and to the baseline integration. A second ECCO2 synthesis is being obtained for the Argo-rich period (2004-present) using the adjoint method (Lagrange multipliers), which permits a much larger number of control parameters to be estimated. This paper compares and contrasts the two estimation methodologies, with emphasis on the particular challenges posed by ocean eddies and by sea ice processes; it evaluates the Green's-function-based solution relative to a wide range of satellite and in-situ observations, and it presents early results from the adjoint-method-based solution.

     
    Towards an operational ecosystem approach - European Marine Ecosystem Observatory
    Mills, David K1; Laane, Remi2; Malcolm, Stephen J1; Rees, Jon M1; Baretta-Bekker, J G3; van Ruiten, Kees2; Colijn, Franciscus4; Petersen, Willi4; Schroeder, Friedhelm4; Wehde, Henning5; Svendsen, Einar6; Hackett, Bruce7; Ridderinkhof, Hermann8; Edwards, Martin9; Gohin, Francis10; Forster, Rodney1; Keeble, Kathryn1; Hydes, David11; Nolan, Glen12
    1Cefas, UNITED KINGDOM;
    2Deltares, NETHERLANDS;
    3Waterdienst, NETHERLANDS;
    4GKSS, GERMANY;
    5NIVA, NORWAY;
    6IMR, NORWAY;
    7Met Office, NORWAY;
    8NIOZ, NETHERLANDS;
    9SAHFOS, UNITED KINGDOM;
    10IFREMER, FRANCE;
    11NOC, UNITED KINGDOM;
    12Marine Institute, IRELAND

    European policies on the sea, such as the new Marine Strategy Framework Directive, require a wide range of marine scientific data and information to support the ecosystem-based approach to the management of human activities. The evidence required will be within 'regions' which cross national boundaries and will be based on observations from physics to fish over wide time and space scales. Additional information is also necessary to understand the causal relations between natural and human pressures and environmental status. A European Marine Ecosystem Observatory (EMECO) has been developed that initially focuses on observations in the North Sea. The aim of EMECO is to facilitate networking between European research and monitoring communities with complementary interests, focussed on innovative monitoring methods and strategies that integrate modelling and data-model integration (e.g. field measurements and remote sensing). EMECO builds on existing international cooperation, including on-going research and monitoring projects as well as current networks (e.g. EuroGOOS, NOOS, ECOOP, MyOcean, GMES). Many of the component platforms, such as Ferrybox and SmartBuoy, are mature technologies funded by the EU or its member states which have been delivering in situ data for nearly a decade. Methods for integrating and interpreting spatial and temporal multinational data sets from satellites and models are under development, and a prototype application using web-based tools including Google Earth has been implemented, allowing users to manipulate and visualise integrated data products. EMECO has started dialogues with marine policy makers at European and national levels, and is addressing an urgent need for integrated international initiatives that are essential in supporting sustainable development of the coastal seas at regional scales.

     
    Cyberinfrastructure for the U.S. NSF Ocean Observatories Initiative: A Modern Virtual Observatory
    Orcutt, J.1; Peach, C.2; Arrott, M.3; Farcas, C.3; Farcas, E.3; Krueger, I.3; Meisinger, M.3; Chave, A.4; Schofield, O.5; Kleinert, J.6; Vernon, F.2
    1University of California, San Diego, UNITED STATES;
    2Scripps Institution of Oceanography, UNITED STATES;
    3California Institute of Telecommunications & Information Technology, UNITED STATES;
    4Woods Hole Oceanographic Institution, UNITED STATES;
    5Rutgers University, UNITED STATES;
    6Raytheon Intelligence and Information Systems, UNITED STATES

    The Ocean Observatories Initiative (OOI) is an environmental observatory covering a diversity of oceanic environments, ranging from the coastal to the deep ocean. Construction will begin in summer 2009 with deployment phased over five years. A comprehensive cyberinfrastructure is the key integrating element of the OOI and is based on a design utilizing loosely coupled, distributed services with components throughout the observatories, from seafloor instruments to deep sea moorings to shore facilities to computing and archiving infrastructure. The OOI cyberinfrastructure itself can be viewed as an example (instantiation) of a grid or cloud of sensors, networks and other resources. At the same time, the multi-institutional organization can be thought of as a Virtual Organization; in fact, the cyberinfrastructure and organization can both be viewed as Virtual Organizations in which there is flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions and resources. An earlier realization of such a Virtual Organization on an equally large scale is the Large Hadron Collider (LHC) at CERN, which is connected to more than 750 physicists at 130 sites. The OOI design includes fifty different instrument types with more than a thousand sensors, actuators and autonomous vehicles. While the LHC has been delayed by hardware startup problems, the delivery of its data to analysis sites throughout the world relies upon many of the same technologies that are being incorporated into the OOI cyberinfrastructure, as well as new approaches which expand technologies beyond grids to distributed computing and storage clouds, both academic and commercial. Developments in information technology, message passing and social networking, as examples, advance so rapidly that the Virtual Organization must be able to adapt to these changes through evolution of the system architecture even during construction. To meet this need, we have adopted a spiral development strategy and a modular design that can be adapted during both the construction and the operations and maintenance phases. In order to meet NSF requirements and provide a basis for integrated planning, the OOI as a whole has relied heavily on systems engineering, including user and system design requirements derived through small, intense elicitation workshops bringing together experts in information technology and domain science, and formal testing, verification and validation procedures. In addition, work breakdown structures, program execution plans, risk assessment and mitigation tools and other formal planning methods have been adopted. Developing the OOI cyberinfrastructure has required years of planning, including conceptual, preliminary and final designs, relying not only on person-to-person meetings throughout the US and abroad, but also on electronic means supporting teleconferencing, videoconferencing, wikis, e-mail, web sites and social networking. We will review each of the approaches in use to build a viable Virtual Organization and offer an evaluation of the relative importance of each. Plans and existing activities for integration of the OOI with other environmental sensing networks globally will be discussed.

     
    Information Infrastructure for the Australian Integrated Marine Observing System
    Proctor, Roger1,2; Roberts, K.1; Bohm, P.1; Cameron, S.1; Hope, J.1; Jones, C.1; Mancini, S.1; Pepper, K.1; Tattersall, K.1; Ward, B.1; Williams, G.1; Mak, P.3; Goessmann, F.3
    1University of Tasmania, AUSTRALIA;
    2Proudman Oceanographic Laboratory, UNITED KINGDOM;
    3Australian Research Collaboration Service, AUSTRALIA

    Marine data and information are the main products of the Integrated Marine Observing System (IMOS, www.imos.org.au) and data management is therefore a central element to the project's success. The eMarine Information Infrastructure (eMII) provides a single integrative framework for data and information management that will allow discovery and access of the data by scientists, managers and the public. The initial strategy has focussed on defining specific data streams and developing end-to-end protocols, standards and systems to join the related observing systems into a unified data storage and access framework.

    IMOS data streams can be categorized in four ways:
    1) gridded data from satellites and HF radar systems
    2) timeseries data from moorings, Argo floats, gliders and ships of opportunity
    3) image data from Autonomous Underwater Vehicles
    4) biological data from continuous plankton recorders and acoustic tagging

    1) and 2) provide real-time and delayed-mode data sets whereas 3) and 4) are delayed mode delivery only.

    The IMOS data management infrastructure employs Open Geospatial Consortium (OGC) standards wherever possible. The main components of the system are:

  • OPeNDAP/THREDDS servers hosting CF-compliant netCDF, HDF or GeoTIFF data
  • The opensource GeoNetwork (http://geonetwork-opensource.org/) Metadata Entry and Search Tool (MEST) for metadata cataloguing. Much of the development work for this tool was carried out by the BlueNet project (www.bluenet.org.au).
  • SensorML, which provides standard models and an XML encoding for describing sensors and measurement processes
  • the opensource DataTurbine (www.dataturbine.org), data streaming middleware providing the foundation for reliable data acquisition and instrument management services
  • A web portal using the opensource ZK Ajax framework (www.zkoss.org) and the OpenLayers geospatial framework (http://openlayers.org/) incorporates access to Web Services.

    Additional storage formats and database protocols (e.g. WOCE exchange format, Oracle) accommodate the data sets not readily converted to netCDF.

    A distributed network of OPeNDAP/THREDDS servers around Australia forms the primary data storage. This complements the regional nodal structure of IMOS and allows rapid access to data by the local research community. Each local server also supports the GeoNetwork catalog with, wherever possible, automatic harvesting of metadata from the OPeNDAP/THREDDS system. An IMOS netCDF standard ensures that all necessary metadata to comply with ISO 19115 can be automatically extracted from the netCDF files. Automation of metadata creation from non-netCDF datasets is also being investigated. A master GeoNetwork catalog at the University of Tasmania routinely harvests new metadata records from the regional catalogs to maintain a central registry.
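
    The automatic harvest can be sketched as reading a file's global attributes into a catalog record; the attribute names below follow common netCDF discovery conventions and are assumptions rather than quotations from the IMOS standard.

        # Hedged sketch: extract ISO 19115-style discovery metadata from an
        # IMOS-style netCDF file's global attributes (assumed names).
        from netCDF4 import Dataset

        nc = Dataset("imos_granule.nc")  # hypothetical file
        record = {
            "title": getattr(nc, "title", None),
            "west":  getattr(nc, "geospatial_lon_min", None),
            "east":  getattr(nc, "geospatial_lon_max", None),
            "south": getattr(nc, "geospatial_lat_min", None),
            "north": getattr(nc, "geospatial_lat_max", None),
            "start": getattr(nc, "time_coverage_start", None),
            "end":   getattr(nc, "time_coverage_end", None),
        }
        print(record)  # would be mapped into a GeoNetwork catalog entry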

    The IMOS Facility for Automated Intelligent Monitoring of Marine Systems (FAIMMS) uses DataTurbine streaming middleware to deliver real-time data from a sensor network across the Great Barrier Reef. However, the software is also being used to provide a real-time view (through the portal) of all IMOS timeseries data collected within the preceding month or two.

    The portal acts as a shop-window to view IMOS data and as a data search engine utilising the GeoNetwork catalog tool. At present three views of IMOS data are being developed: the real-time view through DataTurbine; a Facilities view, whereby all data from an IMOS facility, e.g. gliders, can be explored; and a Node view, whereby all data within an IMOS regional node, e.g. Southern Australia, can be explored. Through the GeoNetwork MEST the search engine allows simple and complex data searches, both of IMOS data and of other national and international datasets. Accompanying the different views of IMOS data will be a software toolbox. All IMOS data are freely available without constraints and are obtainable through a simple self-registration process.

    Data storage and retrieval in IMOS is designed to be interoperable with other national and international programs. Thus, it will be possible to integrate data from sources outside IMOS into IMOS data products, and IMOS data will also be exported to international programs such as Argo and OceanSITES. Also, most of the real-time physical parameter data will be exported to the World Meteorological Organisation's Global Telecommunication System (GTS).

    As the IMOS program gains momentum, the concept of data sharing and its value is spreading across Australia. The long-term view of the data management infrastructure developed for IMOS is that it will become the infrastructure of the Australian Oceans Data Network.

 
    Multi-altimeter Sea Level Assimilation in MFS Model : Impact on Mesoscale Structures
    Pujol, M.-I.1; Dobricic, S.2; Pinardi, N.3; Adani, M.2
    1INGV, ITALY;
    2CMCC, ITALY;
    3Alma Mater Studiorum Università di Bologna, ITALY

    The impact of assimilating multi-satellite altimeter observations in a high-resolution Mediterranean model was analysed. Four different altimeter missions (Jason-1, Envisat, Topex/Poseidon interleaved and Geosat Follow-On) were used over a 7-month period (September 2004 to March 2005) to study the impact of assimilating one to four satellites on analysis quality. The study highlighted three important results. First, it showed the positive impact of the altimeter data on the analyses. The corrected fields capture missing structures of the circulation, and eddies are modified in shape, position and intensity with respect to the model simulation. Secondly, the study demonstrated the improvement in the analyses induced by each satellite. The impact of adding a second satellite is almost equivalent to the improvement given by the introduction of the first satellite: the second satellite's data brings a 12% reduction of the root mean square error (rmse) for the Sea Level Anomaly (SLA). The third and fourth satellites also significantly improve the rmse, with more than a 3% reduction for each of them. Finally, it was shown that adding Envisat and Geosat Follow-On to Jason-1 impacts the analyses more than adding Topex/Poseidon, suggesting that across-track spatial resolution is still one of the important aspects of a multi-mission satellite observing system. This result could support the concept of multi-mission altimetric monitoring performed by satellite orbits with complementary horizontal resolution.

    Comparison of the model analyses with independent temperature and salinity profiles confirmed these results showing a positive impact of the sea level assimilation on the subsurface salinity and temperature estimates.

     
    Arctic Regional Ocean Observing System: Arctic ROOS
    Sandven, Stein1; Bertino, Laurent1; Dahlin, Hans2; Johannessen, Ola M1
    1Nansen Environmental and Remote Sensing Center, NORWAY;
    2EuroGOOS, SWEDEN

    An Arctic Regional Ocean Observing System (Arctic ROOS) has been established by a group of 14 institutions from nine European countries working actively with ocean observation and modelling systems in Arctic and sub-Arctic seas. The background for Arctic ROOS is the growing demand for operational monitoring and forecasting services in Arctic and sub-Arctic seas as a consequence of climate change and increasing human activities in these areas. The Arctic regions offer vast areas of hydrocarbon resources that have just started to be exploited. The Arctic Ocean is surrounded by continental shelves, in particular the huge Siberian shelf, which covers the eastern hemisphere, extending from the Barents Sea to the Chukchi Sea. There is growing political interest in the Arctic Ocean, and several countries have started investigations of the continental shelves. Sea ice is a major obstacle to accessing the Arctic shelf areas where large potential petroleum resources are located. Operations in sea ice require specialized vessels and constructions designed to withstand the forcing from ice pressure. The observed and predicted sea ice reduction has stimulated interest in oil and gas exploration in Arctic areas that were previously considered inaccessible due to sea ice. Polar waters represent a significantly higher degree of risk to shipping and offshore operations than most other waters, due to the presence of ice fields, wind and waves, icing of vessels and darkness in the winter. The risk of oil spills and other pollution in Arctic waters is a serious issue because of potential damage to the environment. The presence of sea ice makes cleanup techniques normally employed in more temperate climates useless in ice-covered areas. The safety and efficiency of sea transportation, off-shore operations, fisheries and other marine activities have been the motivation to establish operational sea ice monitoring and forecasting services in many countries, in addition to the weather services. These services are usually limited to national areas of interest and leave large parts of the Arctic without daily monitoring and forecasting services. With support from the Global Monitoring for Environment and Security programme (GMES) and other international programmes, satellite observations and modelling systems covering the whole Arctic and sub-Arctic regions are being developed, and several operational services are presently delivering information on sea ice and ocean variables. The main components of Arctic ROOS are (1) satellite observations from polar orbiting satellites using active and passive microwave, optical and infrared instruments, (2) numerical modelling including data assimilation, nowcasting, short term forecasting, model comparison and validation, and (3) in situ observation systems based on ship-borne instruments, moored instruments, ice buoys, floats and drifters. Satellite observations of sea ice, wind, waves, oil spills and ocean colour parameters have been developed extensively in recent years with support from GMES projects funded by ESA and the EU as well as national programmes. Modelling and forecasting systems have been developed through several EU-funded projects, in particular MERSEA IP, which was completed in 2008 (http://www.mersea.eu.org/). The in situ component of the Arctic ocean observing system is the least developed. In a few places, such as the Fram Strait, moorings have been deployed for more than ten years, measuring ocean and sea ice parameters.
Hydrographical surveys from ships have been performed in ice-free waters for many years, but large parts of the interior of the ocean are not observed by any in situ system at all. During IPY 2007-2009 there are, however, several research projects developing new observing systems for ice-covered areas (Dickson, 2007). A key project is DAMOCLES IP, funded by FP6, where testing of new instruments and platforms for under-ice operations is a main activity (http://www.damocles-eu.org/). More information about Arctic ROOS is found at http://arctic-roos.org.

     
    GlobWave: Providing Global Harmonized Wave Data
    Snaith, Helen1; Busswell, Geoff2; Sheera, Harjit2; Collard, Fabrice3; Piollé, Jean-François4; Queffeulou, Pierre4; Quilfen, Yves4; Ash, Ellis5; Cotton, David5; Carter, David5; Poulter, David1; Williams, Ivan2
    1National Oceanography Centre, Southampton, UNITED KINGDOM;
    2Logica, UNITED KINGDOM;
    3CLS, FRANCE;
    4IFREMER, FRANCE;
    5SatOC, UNITED KINGDOM

    The primary objective of the GlobWave project is to improve the uptake of satellite-derived wind-wave and swell data by the scientific, operational and commercial user community. The project is a 3-year initiative funded by the European Space Agency, which aims to develop, operate and maintain an integrated set of information services based on satellite wave data.

    Wave data are available from in-situ measurements, satellite altimeter and SAR instruments, and are generated by an increasing number of wave models used in forecasting wave conditions. However, the use of wave data in commercial, scientific and operational environments has been hampered by the lack of harmonized and integrated wave data; users are often confused about what wave data are available, the data quality, and a lack of data standardization. Merging and analysis of complementary satellite and in-situ measurements can deliver wave products with enhanced accuracy and spatial and temporal coverage, together with new types of higher-level products. This requires the development of methodologies for the complementary use of wave data from these different sources.

    This concept has been pioneered in the GHRSST initiative (including its ESA component Medspiration), which has clearly shown the benefits of a user-centric scientific approach. The GlobWave project proposes to transfer this successful approach into the wave domain and build on it with new achievements:

  • Standardized wave data products and formats to provide a uniform, harmonized set of satellite wave data and ancillary information, in a common format.
  • Reliable wave data based on multiple sensors and sources, which have been quality controlled, calibrated and validated, with consistent characterization of errors and biases.
  • Easy access to wave data products via a web portal, regularly updated including processed near-real-time data, and based on an integrated set of information services that are continuously updated and improved based on user feedback and ongoing process improvement.
  • Improved uptake of satellite-derived wind-wave and swell data by the scientific, operational and commercial user community.
  • A sustainable service that users can rely upon to meet their needs in the long term, not just for the duration of the ESA-funded project.

    The project will build on the knowledge and contacts of the consortium members, led by Logica UK, with support from CLS, IFREMER, SatOC and NOCS, to increase the value provided to GlobWave by existing projects. The project User and Steering Groups will provide direction and focus for the project, ensuring that the widest range of activities is included and that user expectations are met.

 
    A High-Quality Global Historic Hydrographic Data Set
    Stammer, Detlef1; Fahrbach, Eberhard2; Nast, Friedrich3; Grobe, Hartmut4; Gouretski, Viktor1
    1University of Hamburg, GERMANY;
    2AWI, GERMANY;
    3BSH, GERMANY;
    4AWI/PANGEA, GERMANY

    There is a general need in the oceanographic community for a historic hydrographic data product providing quality-controlled temperature and salinity information as far back in time as possible. In a cooperative effort between the KlimaCampus of the University of Hamburg, the German Oceanographic Data Centre (DOD, Hamburg), the PANGEA Publishing Network for Geoscientific & Environmental Data and the Alfred Wegener Institut für Polarforschung (AWI), we combine all available global historic hydrographic data into a new quality-controlled product in support of ocean state estimation, which provides an estimate of the time-varying ocean circulation by combining all available ocean data with ocean models. Such a data set will provide a description of the two most important characteristics of sea water, and applications will include water mass analysis and ocean modeling, besides ocean syntheses. In a first step we create an up-to-date hydrographic profile data set, which includes temperature and salinity measurements obtained by means of the old Nansen hydrographic casts and by modern Conductivity/Temperature/Depth (CTD) instruments. These two kinds of data are by far the most accurate compared to other instrument types. Efforts are spent to include many German data sets not previously included in the historic data archives, as well as other data obtained in the past over the global ocean. We extend the quality-control procedure of the World Ocean Database 2005 in several ways. The T and S quality checks will be conducted in T/S-space and inter-cruise offsets will be calculated wherever possible. As shown by Johnson et al. (2001) and by Gouretski and Jancke (2001), systematic offsets exist between quality-controlled data from different cruises (e.g. the WOCE data set). Such inter-cruise offsets will be determined and documented on a cruise-by-cruise basis. The metadata, most important for the quality assessment of the temperature and salinity data, will also be provided (if available) along with profile data on a cruise-by-cruise basis. Data processing methods developed during the initial stage of the project will be used for the analysis of other types of hydrographic subsurface data, such as those from mechanical and expendable bathythermographs and profiling floats. The hydrographic cast data will be used as a reference for the quality assessment of data from other instruments. Data will be available worldwide on the data server of the KlimaCampus of the University of Hamburg (www.klimacampus.de).

     
    The CLIVAR and Carbon Hydrographic Data Office
    Swift, J.; Diggs, S.; Fields, J.; Kappa, J.; Kinkade, D.; Berys, C.; Anderson, S.; Barna, A.; Lee, R.; Morison, J.; Muus, D.; Piercy, S.; Shen, M.
    UCSD Scripps Institution of Oceanography, UNITED STATES

    The CCHDO's primary mission is to be a distribution center - to data users - of CTD and hydrographic data sets of the highest possible quality. These data are a product of WOCE, CLIVAR, IOCCP and other oceanographic research programs - past, present and to come. Whenever possible the CCHDO provides these data in three widely-used formats: WHP-Exchange (recommended for data submissions to the CCHDO), WOCE, and netCDF. The CCHDO acquires data through contacts with scientists, data teams, and national data centers. All files are checked for consistency, and formats and headers are adjusted as needed. The CCHDO also merges bottle data parameters from multiple data originators. The CCHDO produces data files which are up-to-date, properly attributed, well-documented, and with a data history that is available to users. Files are posted on a public web site along with extensive documentation. The CCHDO stands ready to assist the oceanographic community with distribution of the next generation of CTD, hydrographic, ocean carbon, and tracer data.

     
    Enhancements to a Digital Library Web Portal for Ocean and Climate Data
    Blain, Peter1; Williams, Raymond2; Mak, Pauline3; Petrelli, Paola3; Bindoff, Nathan3
    1Tasmanian Partnership for Advanced Computing, University of Tasmania., AUSTRALIA;
    2School of Computing and Information Systems, Tasmanian Partnership for Advanced Computing, AUSTRALIA;
    3Tasmanian Partnership for Advanced Computing, University of Tasmania, Australian Research Collaboration Services, AUSTRALIA

    The Tasmanian Partnership for Advanced Computing (TPAC) currently hosts a digital library web portal providing the marine and climate scientific communities with ready access to a wide variety of ocean and climate datasets. The portal uses the OPeNDAP framework for delivery of files and deals with a large number of heterogeneous and geographically distributed datasets, some huge in scale. It employs an associated data harvester that regularly checks specified locations for updated or modified datasets, and updates the portal with the current state of each dataset. The harvester is also capable of discovering new datasets and automatically adding them to the portal. A significant problem hampering data discovery by the harvester is the non-compliance of the datasets with a common standard and the lack of comprehensive metadata to accompany each dataset. The strategy in developing the TPAC Digital Library has been to accept all datasets and collect whatever metadata is associated with each one. This strategy has enabled the Digital Library to expand quickly to become a useful facility, but, as it expands further, a lack of critical metadata, particularly information on the geospatial extent of each dataset, is limiting its capacity to guide researchers to the datasets they require. To address this problem, the database employed by the portal has been restructured to accommodate spatial data, and the harvester modified to retrieve geospatial extents using the OGC Web Coverage Service standard. This enables users to perform spatial searches on some datasets within the Digital Library. Further modifications are currently being made to the harvester to allow spatial searches on almost all of the datasets within the library. Another problem has been that, although the harvester is capable of handling datasets with tens of thousands of files, in some cases datasets include over a million files and it takes a very long time to harvest the required metadata. In order to improve its performance on datasets with very large numbers of files, strategies have been identified and implemented to improve the speed of the harvester. The strategies implemented so far have achieved a three-fold increase in the speed of the harvester in the test environment. This poster outlines the facilities that the TPAC Digital Library portal currently offers, and discusses recent enhancements (including the two described above) that increase the portal's usefulness for the marine and climate scientific communities.
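
    The extent retrieval can be sketched as a WCS DescribeCoverage call; the endpoint and coverage name below are hypothetical, while the lonLatEnvelope element is part of the WCS 1.0.0 response schema.

        # Hedged sketch: harvest a dataset's geospatial extent via OGC WCS
        # 1.0.0 DescribeCoverage (hypothetical endpoint and coverage name).
        import urllib.request
        import xml.etree.ElementTree as ET

        url = ("http://example.org/thredds/wcs/ocean/sst.nc"
               "?service=WCS&version=1.0.0&request=DescribeCoverage"
               "&coverage=sst")
        ns = {"wcs": "http://www.opengis.net/wcs",
              "gml": "http://www.opengis.net/gml"}

        root = ET.parse(urllib.request.urlopen(url)).getroot()
        env = root.find(".//wcs:lonLatEnvelope", ns)
        corners = [tuple(map(float, pos.text.split()))
                   for pos in env.findall("gml:pos", ns)]
        print("bbox (lon/lat corners):", corners)  # stored for spatial search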