Department of Physics, Technical University of Denmark
Density functional theory (DFT) emerged almost 50 years ago. Since then, DFT has established itself as the central electronic structure methodology for simulating atomic-scale systems ranging from a few atoms to a few hundred atoms. This success is due to DFT's very favorable ratio of accuracy to computational cost. In a wide range of applications, in fields as diverse as condensed matter physics, geophysics, chemistry, chemical engineering, and molecular biology, DFT now routinely provides the theoretical basis for interpreting experiments and forms an essential component in establishing reliable insight and generating knowledge. In spite of these great successes, fundamental challenges remain for the theory. Systematic improvement of its central ingredient, the approximate exchange–correlation functional, is by no means a straightforward task: some materials and chemical properties are simply not treated acceptably well, errors are unknown and often difficult to control, and it is difficult to judge whether a highly optimized, empirically fitted functional is truly a "good" functional or whether it is merely over-parametrized and overfitted.

In this thesis I address these problems systematically. I analyze carefully which ingredients must be included to establish a functional that simultaneously performs well for a range of important materials and chemical properties. A general methodology is defined for the multi-objective fitting of exchange–correlation functionals in a language that naturally allows for error prediction within DFT. A central ingredient of the optimization methodology is the establishment, via statistical resampling techniques, of a compromise between high transferability (measured by the smoothness of the resulting functional) and low prediction error, thereby systematically avoiding overfitting.
The result is the first density functional to offer both reliable accuracy and convincing error estimation. The methodology is general enough to be applied to more complex functional forms with higher-dimensional fitting and resampling. This is illustrated by searching for meta-GGA-type functionals that outperform current meta-GGAs while still allowing for error estimation.
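The interplay of smoothness regularization and resampling-based error estimation described above can be sketched in a minimal, generic form. The following Python snippet is illustrative only and is not the thesis methodology: it fits a simple polynomial model (standing in for a basis expansion of an exchange–correlation functional) with a Tikhonov penalty that suppresses high-order, unsmooth terms, and uses bootstrap resampling of the training data to attach an error bar to each prediction. All data, function names, and the specific form of the penalty are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reference data: a smooth target plus noise, standing in for
# a database of materials/chemical properties used to fit a functional.
def design_matrix(x, order=8):
    # Polynomial basis, a stand-in for a functional-form expansion.
    return np.vander(x, order + 1, increasing=True)

x_train = np.linspace(0.0, 1.0, 40)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(x_train.size)
A = design_matrix(x_train)

# Smoothness prior: penalize high-order coefficients more strongly
# (Tikhonov/ridge regularization), trading fit quality for transferability.
orders = np.arange(A.shape[1], dtype=float)
Gamma = np.diag(orders**2)

def fit(A, y, omega):
    # Solve the regularized normal equations:
    # (A^T A + omega^2 Gamma^T Gamma) c = A^T y
    return np.linalg.solve(A.T @ A + omega**2 * Gamma.T @ Gamma, A.T @ y)

def bootstrap_ensemble(A, y, omega, n_boot=200):
    # Refit on resampled data to obtain an ensemble of models; the spread
    # of ensemble predictions serves as a prediction-error estimate.
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, y.size, y.size)
        coefs.append(fit(A[idx], y[idx], omega))
    return np.array(coefs)

omega = 0.001                      # regularization strength (illustrative)
ensemble = bootstrap_ensemble(A, y_train, omega)

x_test = np.array([0.25, 0.5, 0.75])
preds = design_matrix(x_test) @ ensemble.T   # shape (n_test, n_boot)
mean = preds.mean(axis=1)                    # ensemble-mean prediction
err = preds.std(axis=1)                      # per-point error bar
print(np.round(mean, 3), np.round(err, 3))
```

Increasing `omega` pushes the fit toward smoother (more transferable) models at the cost of a larger training misfit; scanning `omega` and monitoring the resampled prediction error is one generic way to locate the compromise point without overfitting.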