Several methods for the optimization of model parameters, uncertainty quantification, and uncertainty reduction by optimal experimental design are studied and applied to climate research models of varying computational complexity.

The generalized least squares estimator and its special cases, the weighted and the ordinary least squares estimators, are described in detail together with their statistical properties. They are applied to several models using the SQP algorithm, a derivative-based local optimization algorithm, in combination with the OQNLP algorithm, a globalization algorithm. This combination is shown to find model parameters that fit the measurement data well using few function evaluations, which is especially important for computationally expensive models.

The uncertainty in the estimated model parameters implied by the uncertainty in the measurement data, as well as the resulting uncertainty in the model output, is quantified in several ways using the first and second derivatives of the model with respect to its parameters. The advantages and disadvantages of the different methods are highlighted.

The reduction of uncertainty through additional measurements is predicted using optimal experimental design methods. It is determined how many measurements are advisable and how their conditions, such as time, location, and the process to be measured, should be chosen for optimal uncertainty reduction. Robustification approaches, such as sequential optimal experimental design and approximate worst-case experimental designs, are used to mitigate the dependence of the predictions on the model parameter estimate.

A detailed statistical description of the measurements is important for the applied methods. Therefore, a statistical analysis of millions of marine measurements is carried out. Climatological means, variabilities, split into climatological and short-scale variabilities, and correlations are estimated from the data. The associated probability distributions are examined for normality and log-normality using statistical tests and visual inspection.

To determine the correlations, an algorithm was developed that generates valid correlation matrices, i.e., positive semidefinite matrices with ones on the diagonal, from estimated correlation matrices. The algorithm keeps the changes as small as possible while achieving a matrix with a low condition number. Its worst-case execution time and memory consumption are asymptotically equal to those of the fastest algorithms for checking positive semidefiniteness, making it applicable to large matrices. It is also suitable for sparse matrices because it preserves sparsity patterns. Besides statistics, it can also be useful in numerical optimization.

In the context of this thesis, several software packages were developed or extended; they are freely available as open source and extensively tested.

The results obtained from the models and data help to improve the understanding of the underlying processes. The applied methods are not limited to the application examples used here and can be applied to many other data sets and models in climate research and beyond.
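To illustrate the correlation matrix repair task described above, a minimal spectral-clipping sketch in Python/NumPy is given below. This is not the algorithm developed in the thesis, which additionally bounds the condition number, matches the asymptotic cost of a positive semidefiniteness check, and preserves sparsity patterns; the function name nearest_valid_correlation and the min_eig threshold are hypothetical.

```python
import numpy as np

def nearest_valid_correlation(C, min_eig=1e-8):
    """Turn an estimated correlation matrix into a valid one
    (symmetric, positive semidefinite, ones on the diagonal)
    by clipping small or negative eigenvalues and rescaling.

    Illustrative baseline only, not the thesis's algorithm.
    """
    C = 0.5 * (C + C.T)              # enforce symmetry
    w, V = np.linalg.eigh(C)         # eigendecomposition of the symmetric matrix
    w = np.clip(w, min_eig, None)    # clip eigenvalues below the threshold
    C_psd = (V * w) @ V.T            # reassemble a positive definite matrix
    d = np.sqrt(np.diag(C_psd))
    C_corr = C_psd / np.outer(d, d)  # rescale so the diagonal is exactly one
    np.fill_diagonal(C_corr, 1.0)
    return C_corr
```

Note that this dense eigendecomposition is more expensive than a Cholesky-based positive semidefiniteness check and destroys sparsity, which is precisely the limitation the algorithm developed in the thesis is designed to avoid.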