Description
This book series traditionally includes reviews of current topics in computational chemistry and provides minitutorials for novices initiating new directions in their own research along with critical literature reviews highlighting advanced applications. Volume 29 is no exception to that long‐standing tradition. While each chapter has a unique focus, several themes thread through the chapters in this volume, including noncovalent interactions in Chapters 1 and 2, machine learning methods and their applications in Chapters 4 and 5, periodic systems in Chapters 5 and 6, and visualization in Chapters 6 and 7.
The first chapter focuses on a critical shortcoming of standard density functional theory (DFT) methods, demonstrating the failure of popular functionals to correctly model noncovalent interactions such as dispersion, π–π stacking, and hydrogen bonding. Gino A. DiLabio and Alberto Otero‐de‐la‐Roza guide the reader through different corrections that have been incorporated into DFT methods to address this deficiency and then provide valuable benchmark results for 50 combinations of functionals and dispersion correction methods. Anyone planning to investigate questions influenced by noncovalent interactions using DFT methods is well‐advised to use these benchmark data to select the most appropriate method for their system of interest.
Chapter 2 offers an alternative approach to the modeling challenge of intermolecular interactions, namely, the use of quantum electrodynamics (QED) to accurately model electron–photon couplings and thereby properly account for the finite speed of electromagnetic signal propagation. Quantum treatment of light–matter interactions is particularly critical in a variety of spectroscopic techniques, including single‐photon methods. Akbar Salam provides an informative introduction to the essentials of the QED approach, contrasting it with the more typical semiclassical multipolar expansion approach to treating long‐range interactions between charged species. Derivations of electrostatic interactions, resonance energy transfer, and the dispersion potential from molecular QED theory are provided. The chapter concludes with the QED theory of macroscopic systems.
In Chapter 3, Joshua Pottel and Nicolas Moitessier review approaches to transition state modeling using the classical methods of molecular mechanics force fields. Their review presents the treatment of transition states using both ground‐state and transition‐state force fields. An extensive set of force fields is discussed, with emphasis on both theory and validation studies. Of particular interest to new practitioners is information on force field availability, along with identification of whether those force fields are implemented as stand‐alone software packages, as options within commonly known modeling software packages, or through web server interfaces.
Machine learning methods have been spreading into a variety of application areas over the last several decades. Such methods can be instrumental in identifying trends in large quantities of data or in developing predictive models from an initial training set to guide subsequent experiments in the most productive directions. Chapter 4 focuses on machine learning methods that have been applied to research in materials science. The underlying mathematical tools used in supervised learning are introduced, and algorithms used in both supervised and unsupervised learning are discussed. The chapter concludes with specific applications of machine learning to a variety of materials problems, including determining phase diagrams, predicting materials properties, developing interatomic potentials, predicting crystal structures, and automatically analyzing 2D sectional micrographs to derive information about material structure. Tim Mueller, Aaron Gilad Kusne, and Rampi Ramprasad conclude with references to both open source and commercial software packages in which the algorithms described are implemented.