Date of Award

May 2020

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Chemistry

Committee Member

Steven J Stuart

Committee Member

Brian Dominy

Committee Member

Leah Casabianca

Committee Member

Robert Latour

Abstract

This dissertation presents applications of statistical mechanics to improve classical simulated annealing and machine learning-based interatomic potentials.

Classical simulated annealing is among the most robust global optimization methods, and many variations of it have been developed over the last few decades. This dissertation introduces simulated annealing with adaptive cooling and demonstrates its efficiency relative to classical simulated annealing. Adaptive cooling simulated annealing uses on-the-fly evaluation of statistical mechanical properties to adjust the cooling rate: the rate is adapted based on instantaneous estimates of the heat capacity, with a possible future extension to the density of states. Results are presented for Lennard-Jones clusters optimized by both adaptive cooling simulated annealing and classical simulated annealing; the adaptive cooling approach proved to be more efficient.
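As a minimal illustration of the adaptive cooling idea, the Python sketch below anneals a small Lennard-Jones cluster and rescales the cooling rate each block by a heat capacity estimated on the fly from energy fluctuations (C = Var(E)/T^2 in reduced units, k_B = 1). The move scheme, block length, and rate formula are illustrative assumptions, not the exact scheme developed in the dissertation.

import numpy as np

def lennard_jones_energy(coords):
    """Total Lennard-Jones energy (reduced units) for an N x 3 coordinate array."""
    energy = 0.0
    for i in range(len(coords) - 1):
        r = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
        energy += np.sum(4.0 * (r ** -12 - r ** -6))
    return energy

def adaptive_cooling_sa(coords, t_initial=2.0, t_final=0.05,
                        base_rate=0.05, block=200, step_size=0.1, seed=0):
    """Simulated annealing whose cooling rate is rescaled each block by the
    instantaneous heat capacity estimated from energy fluctuations."""
    rng = np.random.default_rng(seed)
    temp = t_initial
    energy = lennard_jones_energy(coords)
    best_coords, best_energy = coords.copy(), energy

    while temp > t_final:
        energies = []
        for _ in range(block):
            # single-particle Metropolis move
            trial = coords.copy()
            trial[rng.integers(len(coords))] += rng.normal(scale=step_size, size=3)
            trial_energy = lennard_jones_energy(trial)
            if rng.random() < np.exp(min(0.0, -(trial_energy - energy) / temp)):
                coords, energy = trial, trial_energy
                if energy < best_energy:
                    best_coords, best_energy = coords.copy(), energy
            energies.append(energy)

        # heat capacity from energy fluctuations: C = Var(E) / T^2  (k_B = 1)
        heat_capacity = np.var(energies) / temp ** 2
        # cool slowly where C is large (e.g. near the cluster "melting" region),
        # quickly where it is small
        temp *= 1.0 - base_rate / (1.0 + heat_capacity)

    return best_coords, best_energy

# Example: anneal a 7-atom cluster from a random start (illustrative only)
start = np.random.default_rng(1).normal(scale=0.8, size=(7, 3))
_, e_min = adaptive_cooling_sa(start)
print(f"lowest energy found: {e_min:.4f}")

Blocks with large energy fluctuations (high heat capacity) are cooled slowly, while low-fluctuation blocks are cooled quickly, which is the qualitative behavior the adaptive cooling method is designed to produce.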

Statistical mechanics was also used to improve the quality and transferability of machine learning-based interatomic potentials. Machine learning (ML)-based interatomic potentials are currently garnering a lot of attention as they strive to achieve the accuracy of electronic structure methods at the computational cost of empirical potentials. Given their generic functional forms, the transferability of these potentials depends strongly on the quality of the training set, the generation of which is a highly labor-intensive activity. A good training set should contain a very diverse set of configurations while avoiding redundancies that add cost without providing benefit. We formalize these requirements in a local entropy maximization framework and propose an automated scheme to sample from this objective function. We show that this approach generates much more diverse training sets than unbiased sampling and is competitive with hand-crafted training sets [1].
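As a rough sketch of the training-set idea (not the dissertation's actual local-entropy objective or sampling scheme), the Python snippet below greedily selects configurations, each summarized here by a hypothetical fixed-length descriptor vector, so that a nearest-neighbor entropy estimate of the selected set increases at every step; diverse configurations are favored and near-duplicates are avoided.

import numpy as np

def knn_entropy(descriptors, k=3):
    """Nearest-neighbor (Kozachenko-Leonenko style) entropy estimate, up to
    constants: larger average log k-th neighbor distances mean a more
    spread-out, diverse set of descriptors."""
    diff = descriptors[:, None, :] - descriptors[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)
    kth = np.sort(dist, axis=1)[:, k - 1]
    return np.mean(np.log(kth + 1e-12))

def greedy_entropy_selection(pool, n_select, k=3, seed=0):
    """Greedily build a training subset whose descriptor-space entropy
    estimate is maximized at each step."""
    rng = np.random.default_rng(seed)
    selected = [int(rng.integers(len(pool)))]          # random first pick
    remaining = set(range(len(pool))) - {selected[0]}
    while len(selected) < n_select:
        best_gain, best_idx = -np.inf, None
        for idx in remaining:
            trial = pool[selected + [idx]]
            gain = knn_entropy(trial, k=min(k, len(trial) - 1))
            if gain > best_gain:
                best_gain, best_idx = gain, idx
        selected.append(best_idx)
        remaining.remove(best_idx)
    return selected

# Example: pick 10 diverse configurations out of 200 candidates, each
# summarized by a 6-dimensional (hypothetical) descriptor vector
pool = np.random.default_rng(1).normal(size=(200, 6))
print("selected configuration indices:", greedy_entropy_selection(pool, n_select=10))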
