Using device 0 (rank 0, local rank 0, local size 3) : Tesla P100-PCIE-16GB
Using device 1 (rank 1, local rank 1, local size 3) : Quadro GP100
Using device 2 (rank 2, local rank 2, local size 3) : Tesla P100-PCIE-16GB

 running on    3 total cores
 distrk:  each k-point on    3 cores,    1 groups
 distr:  one band on    1 cores,    3 groups

*******************************************************************************
 You are running the GPU port of VASP! When publishing results obtained with
 this version, please cite:
  - M. Hacene et al., http://dx.doi.org/10.1002/jcc.23096
  - M. Hutchinson and M. Widom, http://dx.doi.org/10.1016/j.cpc.2012.02.017
 in addition to the usual required citations (see manual).
 GPU developers: A. Anciaux-Sedrakian, C. Angerer, and M. Hutchinson.
*******************************************************************************

 WARNING: Please note that VASP has recently been ported to GPU by means of
 OpenACC. You are running the CUDA-C GPU-port of VASP, which is deprecated
 and no longer actively developed, maintained, or supported. In the near
 future, the CUDA-C GPU-port of VASP will be dropped completely. We encourage
 you to switch to the OpenACC GPU-port of VASP as soon as possible.

 vasp.6.2.1 16May21 (build Apr 11 2022 11:03:26) complex
 MD_VERSION_INFO: Compiled 2022-04-11T18:25:55-UTC in
 devlin.sd.materialsdesign.com:/home/medea2/data/build/vasp6.2.1/16685/x86_64/src/src/build/gpu
 from svn 16685
 This VASP executable licensed from Materials Design, Inc.
 POSCAR found type information on POSCAR  Si O H
 POSCAR found :  3 types and      36 ions

 WARNING: You use a magnetic or noncollinear calculation, but did not specify
 the initial magnetic moment with the MAGMOM tag. Note that a default of 1
 will be used for all atoms. This ferromagnetic setup may break the symmetry
 of the crystal, in particular it may rule out finding an antiferromagnetic
 solution. Thence, we recommend setting the initial magnetic moment manually
 or verifying carefully that this magnetic setup is desired.

 NWRITE = 1
 LDA part: xc-table for Pade appr. of Perdew

 WARNING: The GPU port of VASP has been extensively tested for: ALGO=Normal,
 Fast, and VeryFast. Other algorithms may produce incorrect results or yield
 suboptimal performance. Handle with care!

 WARNING: The distance between some ions is very small. Please check the
 nearest-neighbor list in the OUTCAR file.
 I HOPE YOU KNOW WHAT YOU ARE DOING!

 POSCAR, INCAR and KPOINTS ok, starting setup
 creating 32 CUDA streams...
 creating 32 CUFFT plans with grid size 54 x 96 x 40...
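The MAGMOM warning above can be silenced by setting the initial moments explicitly in the INCAR instead of relying on the ferromagnetic default of 1 per atom. A minimal, hypothetical sketch for a 36-ion Si/O/H cell follows; the per-species counts (12 Si, 20 O, 4 H) and the nonmagnetic starting moments are illustrative assumptions, not values taken from this job's POSCAR:

```
# Hypothetical INCAR fragment: one initial moment per ion, in POSCAR order.
# Counts 12*/20*/4* are assumed for illustration -- use your POSCAR's counts.
ISPIN  = 2
MAGMOM = 12*0.0 20*0.0 4*0.0
```

For a genuinely nonmagnetic system, setting ISPIN = 1 (and omitting MAGMOM) avoids the spin-polarized setup altogether.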
 FFT: planning ...
 WAVECAR not read
 entering main loop
       N       E                     dE             d eps       ncg     rms          rms(c)
DAV:   1     0.890389258378E+03    0.89039E+03   -0.41041E+04  3465   0.799E+02
DAV:   2     0.224438775616E+03   -0.66595E+03   -0.62681E+03  5094   0.159E+02
DAV:   3     0.102878738230E+03   -0.12156E+03   -0.11727E+03  5211   0.633E+01
DAV:   4     0.960042387775E+02   -0.68745E+01   -0.68096E+01  5031   0.165E+01
DAV:   5     0.957347382600E+02   -0.26950E+00   -0.26927E+00  5130   0.329E+00    0.631E+02
DAV:   6     0.153729035680E+03    0.57994E+02   -0.35613E+02  5382   0.495E+01    0.121E+02
DAV:   7     0.846398115944E+02   -0.69089E+02   -0.30662E+02  4815   0.457E+01    0.185E+02
DAV:   8     0.144222931530E+03    0.59583E+02   -0.50945E+02  5229   0.421E+01    0.974E+01
DAV:   9     0.188282979477E+03    0.44060E+02   -0.14883E+02  4545   0.247E+01    0.677E+01
DAV:  10     0.186212516802E+03   -0.20705E+01   -0.22210E+01  4653   0.862E+00    0.897E+01
DAV:  11     0.191310843222E+03    0.50983E+01   -0.10780E+01  5292   0.817E+00    0.115E+02
DAV:  12     0.187461816222E+03   -0.38490E+01   -0.15987E+01  4059   0.849E+00    0.145E+02
DAV:  13     0.187391187166E+03   -0.70629E-01   -0.20606E+00  4662   0.295E+00    0.153E+02
DAV:  14     0.188701232238E+03    0.13100E+01   -0.32440E+00  3960   0.360E+00    0.143E+02
DAV:  15     0.187799461279E+03   -0.90177E+00   -0.47228E+01  4185   0.145E+01    0.144E+02
DAV:  16     0.208416788664E+03    0.20617E+02   -0.85596E+01  3699   0.182E+01    0.186E+02
DAV:  17     0.221492839659E+03    0.13076E+02   -0.44994E+01  4716   0.156E+01    0.261E+02
DAV:  18     0.220217098599E+03   -0.12757E+01   -0.43349E+01  5058   0.737E+00    0.259E+02
DAV:  19     0.217423281666E+03   -0.27938E+01   -0.26511E+00  3996   0.341E+00    0.258E+02
DAV:  20     0.213113833337E+03   -0.43094E+01   -0.52901E+00  3717   0.485E+00    0.266E+02
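The SCF history above is not converging: the total energy dips to roughly 96 eV by step 4-5, then climbs and oscillates up to roughly 213 eV by step 20 while the charge residual rms(c) keeps growing, before the diagonalizer finally fails. To inspect such a history numerically (e.g. to plot it), the DAV lines can be parsed from stdout or OSZICAR; the regex and helper name below are my own illustrative sketch, not part of VASP:

```python
import re

# Match "DAV:  n   E   dE ..." lines; only the first three numeric fields
# (step, total energy, energy change) are captured here.
DAV_RE = re.compile(r"DAV:\s*(\d+)\s+([-0-9.E+]+)\s+([-0-9.E+]+)")

def parse_dav(text):
    """Return a list of (step, total_energy, delta_energy) tuples."""
    return [(int(m.group(1)), float(m.group(2)), float(m.group(3)))
            for m in DAV_RE.finditer(text)]

# A few lines copied from the log above, for demonstration.
log = """
DAV:   1     0.890389258378E+03    0.89039E+03   -0.41041E+04  3465   0.799E+02
DAV:   5     0.957347382600E+02   -0.26950E+00   -0.26927E+00  5130   0.329E+00
DAV:  20     0.213113833337E+03   -0.43094E+01   -0.52901E+00  3717   0.485E+00
"""

for step, energy, de in parse_dav(log):
    print(f"step {step:3d}  E = {energy:14.6f}  dE = {de:+.5e}")
```

A monotonically shrinking |dE| is what a healthy SCF loop looks like; here the energy rising between steps 5 and 20 already signals trouble.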
 -----------------------------------------------------------------------------
 ERROR: EDDDAV: Call to ZHEGV failed. Returncode = 8 1 9
 ----> I REFUSE TO CONTINUE WITH THIS SICK JOB ... BYE!!! <----
 -----------------------------------------------------------------------------

*****************************
Error running VASP parallel with MPI

#!/bin/bash
cd "/home/user/MD/TaskServer/Tasks/172.16.0.39-32000-task39821"
export PATH="/home/user/MD/Linux-x86_64/IntelMPI5/bin:$PATH"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/user/MD/Linux-x86_64/IntelMPI5/lib:/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64"
"/home/user/MD/Linux-x86_64/IntelMPI5/bin/mpirun" -r ssh -np 3 "/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64/vasp_gpu" 1 1 1
*****************************
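The failing routine is LAPACK's ZHEGV, the generalized Hermitian eigensolver VASP uses in the Davidson (EDDDAV) step. Per the LAPACK convention, a positive return code INFO with INFO <= N means the eigenvalue iteration did not converge, while INFO > N means the leading minor of order INFO-N of the overlap matrix is not positive definite; a near-singular overlap is exactly what the "distance between some ions is very small" warning above would produce. The sketch below reproduces the second failure mode with SciPy's generalized eigensolver (which calls *hegv/*sygv under the hood); the matrix sizes and values are illustrative, not taken from this run:

```python
import numpy as np
from scipy.linalg import eigh, LinAlgError

# Well-posed generalized eigenproblem H c = e S c with S positive definite.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2                      # Hermitian "Hamiltonian"
S = np.eye(4) + 0.1 * (A @ A.T)        # positive-definite "overlap"
w, v = eigh(H, S)                      # LAPACK *hegv/*sygv under the hood
print("eigenvalues:", w)

# Non-positive-definite overlap: the Cholesky factorization inside *hegv
# fails -- the same class of failure VASP reports as "Call to ZHEGV failed".
S_bad = np.eye(4)
S_bad[2, 2] = -1.0                     # breaks positive definiteness
try:
    eigh(H, S_bad)
except LinAlgError as exc:
    print("generalized eigensolver failed:", exc)
```

In practice the usual remedies are fixing the near-coincident ions in POSCAR, starting from a cleaner charge density, or trying a different ALGO, rather than anything at the eigensolver level.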