Using device 0 (rank 0, local rank 0, local size 3) : Tesla P100-PCIE-16GB
Using device 1 (rank 1, local rank 1, local size 3) : Quadro GP100
Using device 2 (rank 2, local rank 2, local size 3) : Tesla P100-PCIE-16GB
 running on    3 total cores
 distrk:  each k-point on    3 cores,    1 groups
 distr:  one band on    1 cores,    3 groups

*******************************************************************************
 You are running the GPU port of VASP! When publishing results obtained with
 this version, please cite:
  - M. Hacene et al., http://dx.doi.org/10.1002/jcc.23096
  - M. Hutchinson and M. Widom, http://dx.doi.org/10.1016/j.cpc.2012.02.017
 in addition to the usual required citations (see manual).
 GPU developers: A. Anciaux-Sedrakian, C. Angerer, and M. Hutchinson.
*******************************************************************************

 -----------------------------------------------------------------------------
|                            W A R N I N G !!!                                |
|                                                                             |
|   Please note that VASP has recently been ported to GPU by means of         |
|   OpenACC. You are running the CUDA-C GPU-port of VASP, which is            |
|   deprecated and no longer actively developed, maintained, or               |
|   supported. In the near future, the CUDA-C GPU-port of VASP will be        |
|   dropped completely. We encourage you to switch to the OpenACC             |
|   GPU-port of VASP as soon as possible.                                     |
 -----------------------------------------------------------------------------

 vasp.6.2.1 16May21 (build Apr 11 2022 11:03:26) complex
 MD_VERSION_INFO: Compiled 2022-04-11T18:25:55-UTC in
   devlin.sd.materialsdesign.com:/home/medea2/data/build/vasp6.2.1/16685/x86_64/src/src/build/gpu
   from svn 16685
 This VASP executable licensed from Materials Design, Inc.
 POSCAR found type information on POSCAR  Si O Cl H
 POSCAR found :  4 types and      75 ions

 -----------------------------------------------------------------------------
|                            W A R N I N G !!!                                |
|                                                                             |
|   You use a magnetic or noncollinear calculation, but did not specify       |
|   the initial magnetic moment with the MAGMOM tag. Note that a              |
|   default of 1 will be used for all atoms. This ferromagnetic setup         |
|   may break the symmetry of the crystal, in particular it may rule          |
|   out finding an antiferromagnetic solution. Thence, we recommend           |
|   setting the initial magnetic moment manually or verifying carefully       |
|   that this magnetic setup is desired.                                      |
 -----------------------------------------------------------------------------

 NWRITE =  1
 NWRITE =  1
 NWRITE =  1
 LDA part: xc-table for Pade appr. of Perdew

 WARNING: The GPU port of VASP has been extensively tested for:
 ALGO=Normal, Fast, and VeryFast. Other algorithms may produce incorrect
 results or yield suboptimal performance. Handle with care!

 -----------------------------------------------------------------------------
|                            W A R N I N G !!!                                |
|                                                                             |
|   The distance between some ions is very small. Please check the            |
|   nearest-neighbor list in the OUTCAR file.                                 |
|   I HOPE YOU KNOW WHAT YOU ARE DOING!                                       |
 -----------------------------------------------------------------------------

 POSCAR, INCAR and KPOINTS ok, starting setup
 creating 32 CUDA streams...
 creating 32 CUDA streams...
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
 creating 32 CUDA streams...
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
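The MAGMOM warning above can be addressed by giving every ion an explicit starting moment in the INCAR. A minimal sketch for the 75-ion cell in this run (the values below are illustrative placeholders, not taken from the original job):

```
ISPIN  = 2        ! spin-polarized run
MAGMOM = 75*0.0   ! one initial moment per ion; 75 ions found in POSCAR
```

VASP's `n*value` shorthand expands to n repeated values, so `75*0.0` starts all ions non-magnetic; give different values per sublattice if an antiferromagnetic starting guess is wanted.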
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
 FFT: planning ...
 WAVECAR not read
 entering main loop
       N       E                     dE             d eps       ncg     rms          rms(c)
DAV:   1     0.286197767867E+04    0.28620E+04   -0.10154E+05  3735   0.851E+02
DAV:   2     0.784268933998E+03   -0.20777E+04   -0.19761E+04  4815   0.228E+02
DAV:   3     0.473206042543E+03   -0.31106E+03   -0.30505E+03  4410   0.919E+01
DAV:   4     0.463415062408E+03   -0.97910E+01   -0.97360E+01  4536   0.176E+01
DAV:   5     0.463152113935E+03   -0.26295E+00   -0.26273E+00  4815   0.269E+00    0.355E+03
DAV:   6     0.654841911148E+03    0.19169E+03   -0.36475E+02  4014   0.442E+01    0.166E+03
DAV:   7     0.481231199476E+03   -0.17361E+03   -0.58115E+02  4671   0.557E+01    0.871E+02
DAV:   8    -0.270307155360E+03   -0.75154E+03   -0.17901E+03  4914   0.172E+02    0.129E+03
DAV:   9    -0.129040352320E+04   -0.10201E+04   -0.16530E+03  5076   0.673E+01    0.111E+03
DAV:  10     0.190543241191E+03    0.14809E+04   -0.10102E+04  6039   0.129E+02    0.784E+02
DAV:  11    -0.104973229770E+03   -0.29552E+03   -0.27236E+03  5274   0.827E+01    0.672E+02
DAV:  12    -0.215808781951E+02    0.83392E+02   -0.31681E+02  4698   0.313E+01    0.628E+02
DAV:  13     0.696986000047E+02    0.91279E+02   -0.14805E+02  4482   0.256E+01    0.552E+02
DAV:  14     0.384601049374E+03    0.31490E+03   -0.24583E+02  4923   0.271E+01    0.484E+02
DAV:  15     0.663322858284E+03    0.27872E+03   -0.35124E+02  5274   0.245E+01    0.302E+02
DAV:  16     0.656394603866E+03   -0.69283E+01   -0.17456E+02  5256   0.179E+01    0.178E+02
DAV:  17     0.638525064114E+03   -0.17870E+02   -0.53038E+01  4599   0.157E+01    0.330E+02
DAV:  18     0.642331137422E+03    0.38061E+01   -0.23828E+01  4473   0.819E+00    0.350E+02
DAV:  19     0.640040977213E+03   -0.22902E+01   -0.18714E+01  4284   0.536E+00    0.360E+02
DAV:  20     0.642425648841E+03    0.23847E+01   -0.91325E+00  4302   0.261E+00    0.355E+02
DAV:  21     0.660250585343E+03    0.17825E+02   -0.91250E+01  3897   0.123E+01    0.218E+02
DAV:  22     0.667271255630E+03    0.70207E+01   -0.34140E+01  4356   0.103E+01    0.276E+02
DAV:  23     0.665847446385E+03   -0.14238E+01   -0.63056E+00  4662   0.426E+00    0.525E+02
DAV:  24     0.654329039629E+03   -0.11518E+02   -0.33548E+00  4167   0.273E+00    0.523E+02
DAV:  25     0.676455132599E+03    0.22126E+02   -0.16753E+01  4203   0.655E+00    0.605E+02

 -----------------------------------------------------------------------------
|                              E R R O R   ###                                |
|                                                                             |
|     Error EDDDAV: Call to ZHEGV failed. Returncode = 8 1 9                  |
|                                                                             |
|     ----> I REFUSE TO CONTINUE WITH THIS SICK JOB ... BYE!!! <----          |
 -----------------------------------------------------------------------------

*****************************
Error running VASP parallel with MPI
#!/bin/bash
cd "/home/user/MD/TaskServer/Tasks/172.16.0.39-32000-task47763"
export PATH="/home/user/MD/Linux-x86_64/IntelMPI5/bin:$PATH"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/user/MD/Linux-x86_64/IntelMPI5/lib:/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64"
"/home/user/MD/Linux-x86_64/IntelMPI5/bin/mpirun" -r ssh -np 3 "/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64/vasp_gpu" 1 1 1
*****************************
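The total energies in the DAV: lines above swing by hundreds of eV between consecutive steps before the ZHEGV failure, i.e. the electronic minimization never converged. A minimal sketch in Python for pulling the energy column out of such a log to check this (the embedded excerpt reuses a few lines from the run above; a real script would read the job's stdout file instead):

```python
# Sketch: extract the total-energy column from VASP "DAV:" stdout lines
# to spot a diverging SCF loop. The log string is a short excerpt from
# the run above, embedded here only to keep the example self-contained.
import re

log = """\
DAV:   1     0.286197767867E+04    0.28620E+04   -0.10154E+05  3735   0.851E+02
DAV:   5     0.463152113935E+03   -0.26295E+00   -0.26273E+00  4815   0.269E+00    0.355E+03
DAV:  10     0.190543241191E+03    0.14809E+04   -0.10102E+04  6039   0.129E+02    0.784E+02
DAV:  25     0.676455132599E+03    0.22126E+02   -0.16753E+01  4203   0.655E+00    0.605E+02
"""

def scf_energies(text):
    """Return the total energy E (second field) of every DAV: line."""
    energies = []
    for line in text.splitlines():
        m = re.match(r"DAV:\s+\d+\s+(-?\d\.\d+E[+-]\d+)", line)
        if m:
            energies.append(float(m.group(1)))
    return energies

e = scf_energies(log)
# In a healthy run the step-to-step change shrinks steadily; here the
# energy jumps by hundreds of eV, so the loop diverged well before the
# EDDDAV/ZHEGV error aborted the job.
print(e)
```

Plotting or eyeballing this column is often enough to see that the instability set in long before the hard crash, which points back at the earlier warnings (very small interatomic distances, default ferromagnetic MAGMOM) rather than at the diagonalization itself.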