no code implementations • 28 Sep 2021 • Onur Kara, Arijit Sehanobish, Hector H. Corzo
Transformers are state-of-the-art deep learning models composed of stacked attention and point-wise, fully connected layers, designed for handling sequential data.
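The two sublayers named above (attention and a point-wise feed-forward network) can be sketched as a minimal single-head transformer block in NumPy; this is an illustrative simplification, not the paper's implementation, and it omits multi-head projections, layer normalization, and masking.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention over a sequence
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def transformer_block(x, Wq, Wk, Wv, W1, W2):
    # attention sublayer with a residual connection
    h = x + attention(x @ Wq, x @ Wk, x @ Wv)
    # point-wise (position-wise) feed-forward sublayer,
    # applied identically to every sequence position
    return h + np.maximum(h @ W1, 0.0) @ W2

# example: a sequence of 5 tokens with model dimension 8
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) * 0.1 for _ in range(3))
W1 = rng.normal(size=(8, 16)) * 0.1
W2 = rng.normal(size=(16, 8)) * 0.1
out = transformer_block(x, Wq, Wk, Wv, W1, W2)
print(out.shape)  # the block preserves the (sequence, model-dim) shape
```

All weight matrices here (`Wq`, `Wk`, `Wv`, `W1`, `W2`) are hypothetical random parameters used only to show the data flow; in practice they are learned.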
1 code implementation • 8 Jun 2021 • Hector H. Corzo, Arijit Sehanobish, Onur Kara
In this report, we present a deep learning framework termed the Electron Correlation Potential Neural Network (eCPNN) that can learn succinct and compact potential functions.
1 code implementation • 23 Jun 2020 • Arijit Sehanobish, Hector H. Corzo, Onur Kara, David van Dijk
Attempts to apply Neural Networks (NNs) to a wide range of research problems have been ubiquitous in recent literature.