Browsing by Author "Wang, Junjie"
Now showing 1 - 3 of 3
- General-purpose machine-learned potential for 16 elemental metals and their alloys
A1 Original research article in a scientific journal (2024-12)
Song, Keke; Zhao, Rui; Liu, Jiahui; Wang, Yanzhou; Lindgren, Eric; Wang, Yong; Chen, Shunda; Xu, Ke; Liang, Ting; Ying, Penghua; Xu, Nan; Zhao, Zhiqiang; Shi, Jiuyang; Wang, Junjie; Lyu, Shuang; Zeng, Zezhu; Liang, Shirong; Dong, Haikuan; Sun, Ligang; Chen, Yue; Zhang, Zhuhua; Guo, Wanlin; Qian, Ping; Sun, Jian; Erhart, Paul; Ala-Nissila, Tapio; Su, Yanjing; Fan, Zheyong
Machine-learned potentials (MLPs) have exhibited remarkable accuracy, yet the lack of general-purpose MLPs covering a broad spectrum of elements and their alloys limits their applicability. Here, we present a promising approach for constructing a unified general-purpose MLP for numerous elements, demonstrated through a model (UNEP-v1) for 16 elemental metals and their alloys. To achieve a complete representation of the chemical space, we show, via principal component analysis and diverse test datasets, that employing one-component and two-component systems suffices. Our unified UNEP-v1 model exhibits superior performance across various physical properties compared to a widely used embedded-atom method potential, while maintaining remarkable efficiency. We demonstrate the effectiveness of our approach by reproducing experimentally observed chemical order and stable phases, and through large-scale simulations of plasticity and primary radiation damage in MoTaVW alloys.
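The coverage check described in this abstract (projecting per-atom descriptors onto principal components to see whether one- and two-component systems span the chemical space) can be illustrated with a short script. The sketch below is a hypothetical illustration, not the authors' code: the file names `descriptors_train.npy` and `descriptors_alloy.npy` and the descriptor-extraction step are assumptions, and only the PCA projection itself is shown.

```python
# Hypothetical sketch: check whether descriptors of multi-component alloys
# fall inside the region spanned by one- and two-component training data.
# File names and the descriptor-extraction step are assumptions.
import numpy as np
from sklearn.decomposition import PCA

# Per-atom descriptor vectors, shape (n_atoms, n_descriptor_components)
train = np.load("descriptors_train.npy")   # unary and binary structures
alloy = np.load("descriptors_alloy.npy")   # e.g. multi-component configurations

pca = PCA(n_components=2).fit(train)       # fit on the training set only
train_2d = pca.transform(train)
alloy_2d = pca.transform(alloy)

# If the alloy points lie within the region occupied by the training points,
# the unary/binary data already cover that part of descriptor space.
for label, pts in [("train", train_2d), ("alloy", alloy_2d)]:
    print(label, "PC1 range:", pts[:, 0].min(), pts[:, 0].max(),
          "PC2 range:", pts[:, 1].min(), pts[:, 1].max())
```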
- GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations
A1 Original research article in a scientific journal (2022-09-21)
Fan, Zheyong; Wang, Yanzhou; Ying, Penghua; Song, Keke; Wang, Junjie; Wang, Yong; Zeng, Zezhu; Xu, Ke; Lindgren, Eric; Rahm, J. Magnus; Gabourie, Alexander J.; Liu, Jiahui; Dong, Haikuan; Wu, Jianyang; Chen, Yue; Zhong, Zheng; Sun, Jian; Erhart, Paul; Su, Yanjing; Ala-Nissila, Tapio
We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in Fan et al. [Phys. Rev. B 104, 104309 (2021)] and their implementation in the open-source package GPUMD. We increase the accuracy of NEP models both by improving the radial functions in the atomic-environment descriptor using a linear combination of Chebyshev basis functions and by extending the angular descriptor with some four-body and five-body contributions, as in the atomic cluster expansion approach. We also detail our efficient implementation of the NEP approach on graphics processing units as well as our workflow for the construction of NEP models, and we demonstrate their application in large-scale atomistic simulations. By comparing to state-of-the-art MLPs, we show that the NEP approach not only achieves above-average accuracy but is also far more computationally efficient. These results demonstrate that the GPUMD package is a promising tool for solving challenging problems requiring highly accurate, large-scale atomistic simulations. To enable the construction of MLPs using a minimal training set, we propose an active-learning scheme based on the latent space of a pre-trained NEP model. Finally, we introduce three separate Python packages, viz., gpyumd, calorine, and pynep, that enable the integration of GPUMD into Python workflows.
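The latent-space active-learning idea mentioned in this abstract can be sketched with a simple selection rule. The snippet below is a minimal sketch under assumptions, not the paper's exact scheme: it presumes latent-space vectors for existing training structures and candidate structures have already been extracted from a pre-trained model (that step is not shown), and it greedily selects the candidates farthest, in Euclidean distance, from everything already covered. The function name and array shapes are made up for illustration.

```python
# Minimal sketch of latent-space candidate selection (farthest-point style).
# The latent vectors, names, and selection rule are assumptions; the paper's
# exact active-learning procedure may differ.
import numpy as np

def select_candidates(train_latent, cand_latent, n_select=10):
    """Greedily pick candidates that are farthest (Euclidean) from the
    current training set plus previously selected candidates."""
    selected = []
    reference = train_latent.copy()
    for _ in range(n_select):
        # Distance of each candidate to its nearest reference point
        d = np.min(np.linalg.norm(
            cand_latent[:, None, :] - reference[None, :, :], axis=-1), axis=1)
        d[selected] = -np.inf          # do not pick the same candidate twice
        idx = int(np.argmax(d))
        selected.append(idx)
        reference = np.vstack([reference, cand_latent[idx]])
    return selected

# Example with random stand-in data (replace with real latent vectors)
rng = np.random.default_rng(0)
train_latent = rng.normal(size=(200, 64))
cand_latent = rng.normal(size=(500, 64))
print(select_candidates(train_latent, cand_latent, n_select=5))
```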
- Nonlinear optical response of strain-mediated gallium arsenide microwire in the near-infrared region
A1 Original research article in a scientific journal (2024-05-03)
Cui, Xiangpeng; Huo, Wenjun; Qiu, Linlu; Zhao, Likang; Wang, Junjie; Lou, Fei; Zhang, Shuaiyi; Khayrudinov, Vladislav; Tam, Wing Yim; Lipsanen, Harri; Yang, He; Wang, Xia