Using device 1 (rank 1, local rank 1, local size 3) : Tesla P100-PCIE-12GB
Using device 2 (rank 2, local rank 2, local size 3) : Quadro GP100
Using device 0 (rank 0, local rank 0, local size 3) : Tesla P100-PCIE-12GB
 running on    3 total cores
 distrk:  each k-point on    3 cores,    1 groups
 distr:  one band on    1 cores,    3 groups

*******************************************************************************
 You are running the GPU port of VASP! When publishing results obtained with
 this version, please cite:
  - M. Hacene et al., http://dx.doi.org/10.1002/jcc.23096
  - M. Hutchinson and M. Widom, http://dx.doi.org/10.1016/j.cpc.2012.02.017
 in addition to the usual required citations (see manual).
 GPU developers: A. Anciaux-Sedrakian, C. Angerer, and M. Hutchinson.
*******************************************************************************

 -----------------------------------------------------------------------------
|                               W A R N I N G !                               |
|                                                                             |
|     Please note that VASP has recently been ported to GPU by means of       |
|     OpenACC. You are running the CUDA-C GPU-port of VASP, which is          |
|     deprecated and no longer actively developed, maintained, or             |
|     supported. In the near future, the CUDA-C GPU-port of VASP will be      |
|     dropped completely. We encourage you to switch to the OpenACC           |
|     GPU-port of VASP as soon as possible.                                   |
 -----------------------------------------------------------------------------

 vasp.6.2.1 16May21 (build Apr 11 2022 11:03:26) complex
 MD_VERSION_INFO: Compiled 2022-04-11T18:25:55-UTC in
  devlin.sd.materialsdesign.com:/home/medea2/data/build/vasp6.2.1/16685/x86_64/src/src/build/gpu
  from svn 16685
 This VASP executable licensed from Materials Design, Inc.
 POSCAR found type information on POSCAR  Ti N  H  Si O  C
 POSCAR found :  6 types and     154 ions
 NWRITE =  1
 LDA part: xc-table for Pade appr. of Perdew
 WARNING: The GPU port of VASP has been extensively tested for:
 ALGO=Normal, Fast, and VeryFast. Other algorithms may produce incorrect
 results or yield suboptimal performance. Handle with care!

 -----------------------------------------------------------------------------
|                               W A R N I N G !                               |
|                                                                             |
|     The distance between some ions is very small. Please check the          |
|     nearest-neighbor list in the OUTCAR file.                               |
|     I HOPE YOU KNOW WHAT YOU ARE DOING!                                     |
 -----------------------------------------------------------------------------

 POSCAR, INCAR and KPOINTS ok, starting setup
 creating 32 CUDA streams...
 creating 32 CUFFT plans with grid size 140 x 48 x 48...
 FFT: planning ...
 WAVECAR not read
 entering main loop
       N       E                     dE             d eps       ncg     rms          rms(c)
RMM:   1    -0.786314918631E+04   -0.78631E+04   -0.87496E+05  2460   0.166E+03
RMM:   2     0.957667422701E+04    0.17440E+05   -0.17050E+05  2460   0.546E+02
RMM:   3     0.961904995653E+04    0.42376E+02   -0.38436E+04  2460   0.361E+02
RMM:   4     0.820457470134E+04   -0.14145E+04   -0.22392E+04  2460   0.309E+02
RMM:   5     0.657645228603E+04   -0.16281E+04   -0.30088E+04  2460   0.304E+02
RMM:   6     0.639727100629E+04   -0.17918E+03   -0.53801E+04  2460   0.386E+02
RMM:   7     0.368103735793E+04   -0.27162E+04   -0.66960E+04  2460   0.391E+02
RMM:   8     0.664425463479E+04    0.29632E+04   -0.83880E+04  2460   0.479E+02
RMM:   9     0.763764913042E+04    0.99339E+03   -0.31179E+04  9703   0.404E+02
RMM:  10     0.662112990564E+04   -0.10165E+04   -0.53742E+03  9827   0.248E+02
RMM:  11     0.654028488367E+04   -0.80845E+02   -0.19539E+03  9807   0.153E+02
RMM:  12     0.644066623608E+04   -0.99619E+02   -0.10918E+03  9835   0.100E+02   0.236E+03
RMM:  13     0.757663106328E+04    0.11360E+04   -0.52633E+03  9748   0.303E+02   0.112E+03
RMM:  14     0.579915787120E+04   -0.17775E+04   -0.60589E+03  9785   0.236E+02   0.948E+02
RMM:  15     0.828674296646E+04    0.24876E+04   -0.42334E+03  9780   0.166E+02   0.591E+02
RMM:  16     0.827448579889E+04   -0.12257E+02   -0.24108E+03  9836   0.122E+02   0.508E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  17     0.823978667921E+04   -0.34699E+02   -0.24241E+03  9782   0.987E+01   0.402E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  18     0.823927800630E+04   -0.50867E+00   -0.17823E+03  9703   0.834E+01   0.296E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  19     0.838483744898E+04    0.14556E+03   -0.14835E+03  9687   0.703E+01   0.468E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  20     0.829113885590E+04   -0.93699E+02   -0.89677E+02  9798   0.552E+01   0.253E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  21     0.825030086658E+04   -0.40838E+02   -0.67675E+02  9813   0.468E+01   0.222E+02
RMM:  22     0.810659688193E+04   -0.14370E+03   -0.43838E+02  9782   0.474E+01   0.359E+02
RMM:  23     0.806202243943E+04   -0.44574E+02   -0.53890E+02  9799   0.459E+01   0.383E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
RMM:  24     0.791568170258E+04   -0.14634E+03   -0.90722E+02  9744   0.474E+01   0.267E+02
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1
WARNING in EDDRMM: call to ZHEGV failed, returncode =   6  3     1

 -----------------------------------------------------------------------------
|                            E R R O R   ### ### ###                          |
|                                                                             |
|     LAPACK: Routine ZPOTRF failed!   615  1  1                              |
|                                                                             |
|       ---->  I REFUSE TO CONTINUE WITH THIS SICK JOB ... BYE!!!  <----      |
 -----------------------------------------------------------------------------

*****************************
Error running VASP parallel with MPI
#!/bin/bash
cd "/home/user/MD/TaskServer/Tasks/172.16.0.59-32000-task09329"
export PATH="/home/user/MD/Linux-x86_64/IntelMPI5/bin:$PATH"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/user/MD/Linux-x86_64/IntelMPI5/lib:/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64"
"/home/user/MD/Linux-x86_64/IntelMPI5/bin/mpirun" -r ssh -np 3 "/home/user/MD/TaskServer/Tools/vasp-gpu6.2.1/Linux-x86_64/vasp_gpu" 1 1 1
*****************************