Svendsen, Peter Limkilde (3); Andersen, Ole Baltazar (1); Nielsen, Allan Aasbjerg (3)
1 National Space Institute, Technical University of Denmark
2 Geodesy, National Space Institute, Technical University of Denmark
3 Department of Applied Mathematics and Computer Science, Technical University of Denmark
4 Image Analysis & Computer Graphics, Department of Applied Mathematics and Computer Science, Technical University of Denmark
Sea-level reconstructions spanning several decades have been examined in numerous studies for most of the world's ocean areas, where satellite missions such as TOPEX/Poseidon and Jason-1 and -2 have provided much-improved knowledge of variability and long-term changes in sea level. However, these dedicated oceanographic missions are limited in coverage to between ±66° latitude, and satellite altimeter data at higher latitudes are of substantially worse quality. Following the approach of Church et al. (2004), we apply a model based on empirical orthogonal functions (EOFs) to the Arctic Ocean, constrained by tide gauge records. A major challenge in this area is that both satellite and tide gauge data are too sparse to be bridged by interpolation alone, necessitating a time-variable model and careful data preprocessing, including selection of appropriate tide gauges. To ensure that a reasonable amount of tide gauge data is available, we restrict the reconstruction to the last five decades, and we validate the implementation of the model by applying it to global sea-level data. We examine the influence of the individual tide gauges on the resulting solution and the ability of the model to reconstruct known data, in addition to the effects of regularization techniques and the relationship with climatological indices such as the Arctic Oscillation (AO). In a preliminary analysis, EOFs are obtained from existing ocean models such as DRAKKAR and from satellite data (from the ERS-1 and -2 and Envisat missions). In addition to EOFs, we also implement an alternative decomposition technique known as minimum/maximum autocorrelation factors (MAF), which is based on the spatial or temporal autocorrelation within the calibration period rather than on explained variance.
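The reconstruction scheme described above can be illustrated with a minimal sketch: EOFs are extracted from a gridded calibration field via SVD, and the field at a target epoch is then reconstructed by fitting the EOF amplitudes to a sparse set of point observations (the role played by tide gauges) with ridge-type regularization. All data, array names, and the choice of regularization here are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Synthetic "calibration" sea-level field: T time steps, P grid points.
# (Illustrative stand-in for a gridded altimetry or ocean-model field.)
rng = np.random.default_rng(0)
T, P = 120, 200
field = rng.standard_normal((T, P)).cumsum(axis=0) / 10.0

# EOFs via SVD of the anomaly (mean-removed) data matrix:
# rows of Vt are the spatial patterns, ordered by explained variance.
anom = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
n_modes = 5
eofs = Vt[:n_modes]            # (n_modes, P) spatial patterns

# "Tide gauges": a sparse subset of grid points observed at one epoch.
gauge_idx = rng.choice(P, size=15, replace=False)
truth = anom[-1]               # epoch to be reconstructed
obs = truth[gauge_idx]

# Ridge-regularized least squares for the EOF amplitudes at that epoch;
# the penalty lam stands in for the regularization techniques mentioned.
A = eofs[:, gauge_idx].T       # (n_gauges, n_modes) design matrix
lam = 1e-2
alpha = np.linalg.solve(A.T @ A + lam * np.eye(n_modes), A.T @ obs)
recon = alpha @ eofs           # reconstructed field at all grid points
```

The same reconstruction step applies unchanged if the EOF patterns are replaced by MAF patterns; only the decomposition that produces the spatial modes differs.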