Improving Wavefunction Efficiency by Tessellating Correlation Factors and Coupled State-Specific Optimization
 Van Der Goetz, Beatrice Weston
Advisor(s): Head-Gordon, Martin P.
Abstract
Rearranging chemical bonds is chemistry. Simulating chemical reactions is an expensive and complex process, yet it is necessary for understanding the photochemical reactions that drive processes like chemical light-harvesting. The electronic many-body physics describing the bonds that participate in these processes becomes complicated and expensive even for modestly sized molecules, and computationally affordable approximations can qualitatively fail. One approach to this problem relies on devising compact and expressive wavefunction forms that are simple enough to be efficiently computed yet flexible enough to capture the subtleties of many-electron physics.
Due to their relaxed integrability conditions, position-space Quantum Monte Carlo methods permit the use of flexible wavefunction components like Jastrow correlation factors, which can exactly express wavefunction cusps that are otherwise difficult to describe. As many existing factors used in these calculations focus primarily on short-range or general correlation effects, we aim to augment the library of real-space correlation factors by developing one designed to handle the strong electronic correlation of bond-breaking. These factors do so by accounting for correlations between populations of electrons in different pockets of space using a set of indicator-like functions fashioned into a tessellation of Voronoi cells. These Voronoi cells can be automatically constructed around atomic coordinates or further tailored to the chemical system, as the mathematics describing them is flexible enough to allow more subtle intra-atomic subdivisions and curved interfaces. In simple test systems, this factor correctly cleaves chemical bonds when applied to a single-determinant reference, and the resulting wavefunction is competitive with highly accurate but expensive methods.
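The idea of indicator-like functions over a Voronoi tessellation can be illustrated with a small sketch. This is not the counting-factor functional form defined in the thesis; it is a minimal, hypothetical construction in which a softmax over negative squared distances to a set of cell centers (here, atomic coordinates) produces smooth functions that approach a hard Voronoi partition as a sharpness parameter grows.

```python
import numpy as np

def voronoi_indicators(r, centers, beta=4.0):
    """Soft indicator-like functions over a Voronoi tessellation.

    r: (3,) electron position; centers: (M, 3) cell centers, e.g. atomic
    coordinates. A softmax over negative squared distances yields smooth
    functions that sum to one everywhere and sharpen toward a hard
    Voronoi partition as beta grows. (Illustrative sketch only.)
    """
    d2 = np.sum((centers - r) ** 2, axis=1)
    w = np.exp(-beta * (d2 - d2.min()))  # shift exponent for stability
    return w / w.sum()

# Example: two "atoms" on the z-axis; an electron nearer the first center
centers = np.array([[0.0, 0.0, 0.0],
                    [0.0, 0.0, 2.0]])
n = voronoi_indicators(np.array([0.0, 0.0, 0.4]), centers)
# n sums to one, with most weight on the nearer cell
```

Summing such indicators over all electrons gives per-cell electron counts, and correlating those counts is what lets a factor of this type shift charge between pockets of space.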
Just as wavefunctions can be limited by their functional form, so too can they be limited by the definition of their reference point. Many excited state methods are based on a linear response formalism, in which excited states are generated in the language of perturbations applied to a fixed ground state. When the wavefunction qualitatively changes upon electronic excitation, as it does in charge-transfer states and core excitations, these methods can fail to predict excitation energies, with errors on the scale of several electron-volts. Optimizing molecular orbitals for individual excited states is one particularly efficient way to make the necessary zeroth-order changes to capture these states, but such state-specific methods can suffer from instabilities in their optimization. One particular benefit of this state-specific approach is that the tailored orbitals naturally compress the wavefunction, but in certain cases the ground state can interfere when minimal representations of the excited state are sought, ultimately causing variational collapse. We show that this collapse occurs in two different state-specific approaches, demonstrate how it arises from an inadequately modeled avoided crossing, and argue that orbital-CI coupling plays a key role in its prevention.
Newer excited state methods have also seen the use of target functions, in which a function of the energy, such as the square-gradient magnitude, is minimized in place of the energy itself; this stabilizes state-specific optimization by transforming saddle points on the energy surface into minima on the target function surface. While pilot implementations of square-gradient-based optimizers for the Excited-State Mean-Field (ESMF) wavefunction are able to obtain state-specific orbitals at low cost, these are still new and have yet to benefit from numerical accelerations, limiting their use. For instance, the quasi-Newton optimizer in the ESMF-GVP implementation uses no Hessian preconditioner, and while it resists variational collapse thanks to its inclusion of orbital-CI coupling, it is much slower than the occasionally unstable ESMF-SCF implementation. Using the exact Hessian and full Newton-Raphson optimization as a benchmark, we explore a variety of Hessian approximations and find that an approximate diagonal Hessian can accelerate the ESMF-GVP square-gradient minimization to match the speed of the gradient-only ESMF-SCF at mean-field cost scaling while resisting variational collapse.
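The saddle-point-to-minimum mechanism behind square-gradient minimization can be demonstrated on a toy surface (this is a generic illustration, not the ESMF equations). For E(x, y) = x² − y², the origin is a saddle point where ∇E = 0, so plain descent on E runs away from it; but the target function L = |∇E|² = 4x² + 4y² has a minimum there, so descent on L converges to the stationary point.

```python
import numpy as np

# Toy energy surface E(x, y) = x^2 - y^2 with a saddle point at the origin.
def grad_E(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

# Target function L = |grad E|^2; its gradient is 2 * H(E) @ grad E,
# where the Hessian of E is the constant matrix diag(2, -2).
def grad_L(p):
    g = grad_E(p)
    H = np.diag([2.0, -2.0])
    return 2.0 * H @ g

# Plain gradient descent on L homes in on the saddle point of E.
p = np.array([0.7, -0.3])
for _ in range(200):
    p -= 0.05 * grad_L(p)
# p is now essentially at the origin, the stationary point of E
```

The same logic motivates preconditioning in this setting: the Hessian of E enters the gradient of L directly, so an approximate (e.g. diagonal) Hessian gives the optimizer curvature information at far lower cost than the exact one.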