Calibration and Construction of Microwave Sensor Temperature Records in the Lower Stratosphere from 2001 to 2015 Using Global Positioning System Radio Occultation Data

Shu-Peng Ho, UCAR/COSMIC
Liang Peng, UCAR/COSMIC
Poster
In this study we use Global Positioning System (GPS) Radio Occultation (RO)-simulated Advanced Microwave Sounding Unit (AMSU) Channel 9 brightness temperatures (i.e., the temperature in the lower stratosphere, TLS) as benchmark calibration references to calibrate observations from multiple microwave sounders from 2001 to 2010. The RO-simulated TLS from the CHAllenging Minisatellite Payload (CHAMP), the Gravity Recovery And Climate Experiment (GRACE), and the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) are collocated with AMSU TLS measurements from National Oceanic and Atmospheric Administration (NOAA) 15, 16, 18, and 19, Aqua, and Metop-A within the same month. Results show that the RO-simulated TLS are highly useful for identifying and correcting location- and seasonally dependent intersatellite AMSU biases resulting from possible warm-target temperature drift from orbit to orbit. The mean differences of the RO-calibrated TLS time series for NOAA-16, NOAA-18, NOAA-19, Aqua, and Metop-A relative to that for NOAA-15 are 0.01, 0.00, 0.06, 0.09, and 0.03 K, respectively. The calibrated TLS data from the multiple AMSU missions are then merged to generate a climate data record containing 10 years of TLS. The TLS anomaly trends for the globe, the northern high latitudes, the northern midlatitudes, the Tropics, the southern midlatitudes, and the southern high latitudes are -0.17, 0.01, 0.025, 0.01, -0.179, and -1.037 K/decade, respectively. Although not traceable to the International System of Units, the RO-simulated TLS preserve the high precision and long-term stability of the RO raw data.
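The sketch below is a minimal, illustrative outline (not the authors' code) of the two computations the abstract describes: estimating a mean AMSU-minus-RO offset from monthly collocations so each sounder can be aligned with the RO benchmark, and fitting a linear trend (in K/decade) to a merged monthly TLS anomaly series. All function names, array shapes, and numerical values here are hypothetical placeholders, assumed only for demonstration.

```python
import numpy as np


def intersatellite_bias(amsu_tls, ro_tls):
    """Mean AMSU-minus-RO TLS offset (K) from one month of collocations.

    amsu_tls, ro_tls : 1-D arrays of collocated brightness temperatures (K)
    for a single satellite; the RO-simulated TLS acts as the common
    calibration reference, so subtracting this offset aligns the sounder
    with the RO benchmark.
    """
    return np.nanmean(amsu_tls - ro_tls)


def anomaly_trend(monthly_anomaly_k, years):
    """Least-squares linear trend of a monthly anomaly series, in K/decade."""
    slope_per_year = np.polyfit(years, monthly_anomaly_k, 1)[0]
    return 10.0 * slope_per_year


# Synthetic demonstration only: fabricated collocations with a +0.08 K offset,
# and a fabricated 10-year anomaly series with a small imposed cooling trend.
rng = np.random.default_rng(0)
ro = 210.0 + rng.normal(0.0, 0.5, size=500)            # RO-simulated TLS (K)
amsu = ro + 0.08 + rng.normal(0.0, 0.3, size=500)      # AMSU TLS with a warm bias
print(f"estimated bias: {intersatellite_bias(amsu, ro):+.3f} K")

years = 2001 + np.arange(120) / 12.0                   # 120 monthly values
anom = -0.017 * (years - 2001) + rng.normal(0.0, 0.2, size=120)
print(f"trend: {anomaly_trend(anom, years):+.3f} K/decade")
```

In the study itself the offsets are resolved by location and season rather than as single global means, but the same subtract-the-reference logic applies month by month before the records are merged and the regional trends are computed.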
OSTS session
Regional and Global CAL/VAL for Assembling a Climate Data Record