Using device 1 (rank 1, local rank 1, local size 3) : Quadro GP100
Using device 0 (rank 0, local rank 0, local size 3) : Tesla P100-PCIE-16GB
Using device 2 (rank 2, local rank 2, local size 3) : Tesla P100-PCIE-16GB
 running on    3 total cores
 distrk:  each k-point on    3 cores,    1 groups
 distr:  one band on    1 cores,    3 groups

*******************************************************************************
 You are running the GPU port of VASP! When publishing results obtained with
 this version, please cite:
  - M. Hacene et al., http://dx.doi.org/10.1002/jcc.23096
  - M. Hutchinson and M. Widom, http://dx.doi.org/10.1016/j.cpc.2012.02.017
 in addition to the usual required citations (see manual).
 GPU developers: A. Anciaux-Sedrakian, C. Angerer, and M. Hutchinson.
*******************************************************************************

 -----------------------------------------------------------------------------
                              W A R N I N G !!!

     Please note that VASP has recently been ported to GPU by means of
     OpenACC. You are running the CUDA-C GPU-port of VASP, which is
     deprecated and no longer actively developed, maintained, or
     supported. In the near future, the CUDA-C GPU-port of VASP will be
     dropped completely. We encourage you to switch to the OpenACC
     GPU-port of VASP as soon as possible.
 -----------------------------------------------------------------------------

 vasp.6.2.1 16May21 (build Apr 11 2022 11:03:26) complex

 MD_VERSION_INFO: Compiled 2022-04-11T18:25:55-UTC in devlin.sd.materialsdesign.com:/home/medea2/data/build/vasp6.2.1/16685/x86_64/src/src/build/gpu from svn 16685
 This VASP executable licensed from Materials Design, Inc.
 POSCAR found type information on POSCAR  Si O Cl H
 POSCAR found :  4 types and      75 ions

 NWRITE = 1

 -----------------------------------------------------------------------------
                              W A R N I N G !!!

     You use a magnetic or noncollinear calculation, but did not specify
     the initial magnetic moment with the MAGMOM tag. Note that a
     default of 1 will be used for all atoms. This ferromagnetic setup
     may break the symmetry of the crystal, in particular it may rule
     out finding an antiferromagnetic solution. Thence, we recommend
     setting the initial magnetic moment manually or verifying carefully
     that this magnetic setup is desired.
 -----------------------------------------------------------------------------

 NWRITE = 1
 NWRITE = 1

 LDA part: xc-table for Pade appr. of Perdew

 WARNING: The GPU port of VASP has been extensively tested for:
 ALGO=Normal, Fast, and VeryFast.
 Other algorithms may produce incorrect results or yield suboptimal
 performance. Handle with care!

 -----------------------------------------------------------------------------
                              W A R N I N G !!!

     The distance between some ions is very small. Please check the
     nearest-neighbor list in the OUTCAR file.
     I HOPE YOU KNOW WHAT YOU ARE DOING!
 -----------------------------------------------------------------------------

 POSCAR, INCAR and KPOINTS ok, starting setup
 creating 32 CUDA streams...
 creating 32 CUDA streams...
 creating 32 CUDA streams...
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
 creating 32 CUFFT plans with grid size 54 x 126 x 40...
 FFT: planning ...
 WAVECAR not read
 entering main loop
       N       E                     dE             d eps       ncg     rms          rms(c)
DAV:   1     0.218868792435E+04    0.21887E+04   -0.99105E+04  3726   0.851E+02
DAV:   2     0.418563421855E+03   -0.17701E+04   -0.16789E+04  4770   0.207E+02
DAV:   3     0.128365417353E+03   -0.29020E+03   -0.27813E+03  4572   0.837E+01
DAV:   4     0.116439815292E+03   -0.11926E+02   -0.11864E+02  4509   0.181E+01
DAV:   5     0.116133511521E+03   -0.30630E+00   -0.30604E+00  4896   0.275E+00    0.146E+03
DAV:   6     0.261652560140E+03    0.14552E+03   -0.32447E+02  4365   0.420E+01    0.462E+02
DAV:   7     0.819939786485E+02   -0.17966E+03   -0.59491E+02  4662   0.543E+01    0.701E+02
DAV:   8    -0.103306059564E+04   -0.11151E+04   -0.12164E+04  5427   0.643E+01    0.550E+02
DAV:   9     0.242330585052E+03    0.12754E+04   -0.34519E+03  4797   0.609E+01    0.284E+02
DAV:  10     0.225731383517E+03   -0.16599E+02   -0.16605E+02  4743   0.174E+01    0.183E+02
DAV:  11     0.215562512406E+03   -0.10169E+02   -0.68109E+01  5031   0.265E+01    0.184E+02
DAV:  12     0.233022337111E+03    0.17460E+02   -0.81465E+01  4248   0.190E+01    0.119E+02
DAV:  13     0.229818717949E+03   -0.32036E+01   -0.20481E+01  4329   0.881E+00    0.137E+02
DAV:  14     0.217530325227E+03   -0.12288E+02   -0.18301E+02  4320   0.751E+00    0.785E+01
DAV:  15     0.211514096246E+03   -0.60162E+01   -0.10308E+01  4401   0.565E+00    0.947E+01
DAV:  16     0.202328808921E+03   -0.91853E+01   -0.57462E+00  4419   0.570E+00    0.104E+02
DAV:  17     0.198233141061E+03   -0.40957E+01   -0.38042E+00  4302   0.374E+00    0.972E+01
DAV:  18     0.193337861845E+03   -0.48953E+01   -0.51209E+00  4302   0.444E+00    0.116E+02
DAV:  19     0.190647265107E+03   -0.26906E+01   -0.77403E-01  4482   0.167E+00    0.125E+02
DAV:  20     0.190029456607E+03   -0.61781E+00   -0.30981E-01  4545   0.127E+00    0.137E+02

 -----------------------------------------------------------------------------
                                E R R O R
     Error EDDDAV: Call to ZHEGV failed. Returncode = 8 1 9

     ----> I REFUSE TO CONTINUE WITH THIS SICK JOB ... BYE!!! <----
 -----------------------------------------------------------------------------

 *****************************
 Error running VASP parallel with MPI

 #!/bin/bash
 cd "/home/user/MD/TaskServer/Tasks/172.16.0.39-32000-task47739"
 export PATH="/home/user/MD/Linux-x86_64/IntelMPI5/bin:$PATH"
 export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/user/MD/Linux-x86_64/IntelMPI5/lib:/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64"
 "/home/user/MD/Linux-x86_64/IntelMPI5/bin/mpirun" -r ssh -np 3 "/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64/vasp_gpu" 1 1 1
 *****************************
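Two of the warnings in the log above point at INCAR tags: MAGMOM was left unset (so a default moment of 1 was applied to all 75 ions), and the CUDA-C GPU port is only validated for ALGO=Normal, Fast, and VeryFast. A minimal sketch of how those warnings could be addressed; the values below are illustrative placeholders, not taken from this job's actual INCAR:

```text
! Hypothetical INCAR fragment -- placeholder values, adapt to the real system.
MAGMOM = 75*0.6    ! explicit initial moment for each of the 75 ions (0.6 is a guess)
ALGO   = Normal    ! one of the algorithms the CUDA-C GPU port has been tested with
```

Whether this resolves the ZHEGV diagonalization failure is a separate question; the warning about very small interatomic distances suggests the geometry itself should also be checked before rerunning.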