In this comparison, no single algorithm is expected to perform best everywhere and under all conditions. The round robin exercise therefore has the following goals:
- selecting modules which perform best for several instruments
- selecting one of several algorithms (or combinations of them) for the same instrument; identifying the stronger elements among competing retrievals for one sensor, also by substituting harmonized elements for standard elements, to explore sensitivities to:
- aerosol model (a priori assumptions/choices regarding size and composition)
- surface reflectance characterization (especially for retrievals over land)
- cloud (and near-cloud) pixel removal procedure
- ancillary data
- selecting algorithms which can provide additional information (e.g. coverage, parameters)
- identifying the best elements for preferred retrieval approaches with enough similarities (element harmonization) between retrievals for different sensors to allow data combination (e.g. AATSR and MERIS)
- identifying the added value of, or need for, complementary retrieval approaches
In its first year, the round robin exercise compares a larger set of algorithms to find the best-performing combinations or modules among them. These will then be implemented in the prototype algorithms, which are in turn used for ECV production in the second year of the project.
The round robin exercise will be based on 4 months of global data (March, June, September, December 2008). The ECV product will comprise the full-year 2008 dataset.
This page will report on results of comparing different algorithms in the round robin exercise. It will explain the rationale behind the decision to choose final modules.
The round robin protocol is contained in the product validation plan:
Prior to the round robin, activities are ongoing to understand algorithm differences and uncertainties and to harmonize modules where feasible. The intended sequence of test datasets (September 2008) is summarized in this table:
The round robin exercise was open not only to the aerosol-CCI retrieval groups but also to external participants at their own expense, provided the required format and naming conventions were followed and the complete requested dataset was delivered in time for validation. However, no external partner participated; one project partner submitted one additional retrieval dataset.
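As a minimal sketch of how such a format-and-naming compliance check could be automated, the snippet below screens submitted filenames against a naming pattern and the four round-robin test months. The pattern itself (algorithm_sensor_date_AOD.nc) is a hypothetical placeholder; the actual aerosol-CCI naming convention is defined in the round robin protocol and may differ.

```python
import re

# Hypothetical naming convention for round-robin submissions:
# <algorithm>_<sensor>_<YYYYMMDD>_AOD.nc, with dates restricted to the
# four test months of 2008 (March, June, September, December).
FILENAME_PATTERN = re.compile(
    r"^(?P<algorithm>[A-Za-z0-9]+)_(?P<sensor>[A-Za-z0-9]+)_"
    r"(?P<date>2008(?:03|06|09|12)\d{2})_AOD\.nc$"
)

def check_submission(filenames):
    """Split submitted filenames into conforming and non-conforming lists."""
    accepted, rejected = [], []
    for name in filenames:
        (accepted if FILENAME_PATTERN.match(name) else rejected).append(name)
    return accepted, rejected

accepted, rejected = check_submission([
    "ADV_AATSR_20080915_AOD.nc",   # conforms to the assumed pattern
    "ORAC_AATSR_20080101_AOD.nc",  # January is outside the four test months
    "notes.txt",                   # not a data file at all
])
```

A real compliance check would additionally open each NetCDF file and verify the required variables and metadata, but the filename screen above illustrates the gatekeeping step.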
The results of the first round robin analysis are summarized in the Product Validation and Algorithm Selection Report, version 1: