Run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

Queries about input and output files, running specific calculations, etc.



hszhao.cn@gmail.com
Full Member
Posts: 189
Joined: Tue Oct 13, 2020 11:32 pm

Run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

#1 Post by hszhao.cn@gmail.com » Sun Mar 06, 2022 8:36 am

I tried to validate the example discussed here and found that it runs only in serial mode. Below are the test steps and the error messages produced when running VASP in parallel:
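For context, an SCDM-based wannierization of this kind is driven by a handful of INCAR tags. The following is only a sketch for readers without access to the attachment; the actual INCAR may differ, and the smearing settings and NUM_WANN value here are assumptions:

Code:

SYSTEM     = SrTiO3
ISMEAR     = 0         ! Gaussian smearing (assumed)
SIGMA      = 0.05      ! smearing width in eV (assumed)
LWANNIER90 = .TRUE.    ! call the wannier90 library from within VASP
LSCDM      = .TRUE.    ! generate initial projections with SCDM
NUM_WANN   = 12        ! number of Wannier functions (assumed value)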

Code:

$ cd ~/Public/hpc/vasp/release/PAW_PBE_54
$ cat Sr_sv/POTCAR Ti_sv/POTCAR O/POTCAR > ~/Public/hpc/vasp/learning-research/WannierFunctions/LSCDM/POTCAR
$ cd ~/Public/hpc/vasp/learning-research/WannierFunctions/LSCDM
$ module load vasp/6.3.0

# The wannier90 library can only be called serially from within VASP.
# The following command works smoothly:
$ mpirun -np 1 vasp_std

# The following command will fail:
$ mpirun -np 8 vasp_std
 running on    8 total cores
 distrk:  each k-point on    8 cores,    1 groups
 distr:  one band on    1 cores,    8 groups
 vasp.6.3.0 20Jan22 (build Feb 23 2022 17:43:54) complex                        
  
 POSCAR found type information on POSCAR SrTiO 
 POSCAR found :  3 types and       5 ions
 Reading from existing POTCAR
 scaLAPACK will be used
 -----------------------------------------------------------------------------
|                                                                             |
|           W    W    AA    RRRRR   N    N  II  N    N   GGGG   !!!           |
|           W    W   A  A   R    R  NN   N  II  NN   N  G    G  !!!           |
|           W    W  A    A  R    R  N N  N  II  N N  N  G       !!!           |
|           W WW W  AAAAAA  RRRRR   N  N N  II  N  N N  G  GGG   !            |
|           WW  WW  A    A  R   R   N   NN  II  N   NN  G    G                |
|           W    W  A    A  R    R  N    N  II  N    N   GGGG   !!!           |
|                                                                             |
|     For optimal performance we recommend to set                             |
|       NCORE = 2 up to number-of-cores-per-socket                            |
|     NCORE specifies how many cores store one orbital (NPAR=cpu/NCORE).      |
|     This setting can greatly improve the performance of VASP for DFT.       |
|     The default, NCORE=1 might be grossly inefficient on modern             |
|     multi-core architectures or massively parallel machines. Do your        |
|     own testing! More info at https://www.vasp.at/wiki/index.php/NCORE      |
|     Unfortunately you need to use the default for GW and RPA                |
|     calculations (for HF NCORE is supported but not extensively tested      |
|     yet).                                                                   |
|                                                                             |
 -----------------------------------------------------------------------------

 Reading from existing POTCAR
 LDA part: xc-table for Pade appr. of Perdew
 POSCAR, INCAR and KPOINTS ok, starting setup
 FFT: planning ... GRIDC
 FFT: planning ... GRID_SOFT
 FFT: planning ... GRID
 WAVECAR not read
 entering main loop
       N       E                     dE             d eps       ncg     rms          rms(c)
DAV:   1     0.488935775524E+03    0.48894E+03   -0.23732E+04   384   0.205E+03
DAV:   2    -0.444415239670E+01   -0.49338E+03   -0.48519E+03   472   0.537E+02
DAV:   3    -0.481495850408E+02   -0.43705E+02   -0.43662E+02   536   0.158E+02
DAV:   4    -0.502452052623E+02   -0.20956E+01   -0.20946E+01   568   0.306E+01
DAV:   5    -0.503227780552E+02   -0.77573E-01   -0.77569E-01   528   0.450E+00    0.268E+01
DAV:   6    -0.399861167993E+02    0.10337E+02   -0.66280E+01   560   0.751E+01    0.162E+01
DAV:   7    -0.400119889366E+02   -0.25872E-01   -0.73698E+00   504   0.212E+01    0.103E+01
DAV:   8    -0.400631143510E+02   -0.51125E-01   -0.17655E+00   496   0.146E+01    0.289E+00
DAV:   9    -0.399712800056E+02    0.91834E-01   -0.42197E-01   520   0.692E+00    0.818E-01
DAV:  10    -0.399730572130E+02   -0.17772E-02   -0.11147E-01   480   0.236E+00    0.682E-01
DAV:  11    -0.399667980961E+02    0.62591E-02   -0.39695E-02   656   0.159E+00    0.125E-01
DAV:  12    -0.399668310387E+02   -0.32943E-04   -0.13947E-03   480   0.402E-01    0.594E-02
DAV:  13    -0.399668870109E+02   -0.55972E-04   -0.20052E-04   568   0.104E-01    0.389E-02
DAV:  14    -0.399668780151E+02    0.89958E-05   -0.51866E-05   528   0.588E-02    0.537E-03
DAV:  15    -0.399668797851E+02   -0.17700E-05   -0.45718E-06   504   0.200E-02    0.692E-03
DAV:  16    -0.399668795643E+02    0.22077E-06   -0.54401E-06   504   0.274E-02    0.230E-03
DAV:  17    -0.399668795185E+02    0.45846E-07   -0.32344E-07   504   0.621E-03    0.961E-04
DAV:  18    -0.399668794561E+02    0.62413E-07   -0.78952E-08   488   0.239E-03    0.384E-04
DAV:  19    -0.399668794223E+02    0.33802E-07   -0.73545E-09   240   0.764E-04    0.153E-04
DAV:  20    -0.399668794270E+02   -0.47080E-08   -0.13054E-09   240   0.353E-04
 Calling wannier_setup of wannier90 in library mode
 SCDM mode
 Computing MMN (overlap matrix elements)
 Calling wannier_run of wannier90 in library mode (check wannier90.wout)
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source             
vasp_std           0000000001F29B2A  Unknown               Unknown  Unknown
libpthread-2.31.s  000014CA2FB153C0  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E37C212  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E6634E8  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E5AA9D0  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E05C617  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E0316AA  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E11B3BD  Unknown               Unknown  Unknown
libmpi.so.12.0.0   000014CA2E5AB697  MPI_Scatterv          Unknown  Unknown
libmpifort.so.12.  000014CA2F68AF70  PMPI_SCATTERV         Unknown  Unknown
vasp_std           0000000001EC5CEE  Unknown               Unknown  Unknown
vasp_std           0000000001D94196  Unknown               Unknown  Unknown
vasp_std           000000000151CFC1  mlwf_mp_mlwf_wann        1624  mlwf.F
vasp_std           0000000001500086  mlwf_mp_mlwf_main         575  mlwf.F
vasp_std           0000000001D131D3  MAIN__                   3175  main.F
vasp_std           000000000041BAE2  Unknown               Unknown  Unknown
libc-2.31.so       000014CA2DCF30B3  __libc_start_main     Unknown  Unknown
vasp_std           000000000041B9EE  Unknown               Unknown  Unknown

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   RANK 1 PID 2664453 RUNNING AT X10DAi-00
=   KILLED BY SIGNAL: 9 (Killed)
===================================================================================

(analogous BAD TERMINATION messages follow for ranks 2-7, each killed by signal 9)
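As an aside, the NCORE warning near the top of the log concerns performance only and is unrelated to the crash. Following its advice on an eight-core run, one might add a line like the following to the INCAR (the value 4 is an assumption and should match the cores per socket of the actual machine):

Code:

NCORE = 4   ! assumption: 4 cores per socket; do your own testing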
Please find attached all the files for the above test. I wonder whether I can run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

Regards,
HZ

marie-therese.huebsch
Full Member
Posts: 211
Joined: Tue Jan 19, 2021 12:01 am

Re: Run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

#2 Post by marie-therese.huebsch » Mon Mar 07, 2022 12:58 pm

Hi HZ,

I saw another related post by you on how to compile Wannier90 here.
If I see it correctly, you compiled the Wannier90 library with COMMS=mpi, right? That means it is the parallelized version, not the serial one.

To use Wannier90 with VASP, you need to remove COMMS=mpi from Wannier90's make.inc file and build the library by running make lib. This is described on the VASP Wiki as well: wiki/index.php/Makefile.include#Wannier ... ptional.29
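In practice, Wannier90's make.inc then looks roughly like this before running make lib (a minimal sketch assuming an Intel toolchain, to match your log; compiler names and flags depend on your setup):

Code:

# make.inc for a serial Wannier90 library build (sketch)
F90    = ifort      # plain compiler, not an MPI wrapper
FCOPTS = -O2
LDOPTS =
# note: no "COMMS = mpi" line, so the library is built serially

VASP is then linked against the resulting libwannier.a through its makefile.include, roughly as follows (the path is a placeholder):

Code:

# excerpt from VASP's makefile.include (sketch; adjust WANNIER90_ROOT)
CPP_OPTIONS    += -DVASP2WANNIER90
WANNIER90_ROOT ?= /path/to/wannier90
LLIBS          += -L$(WANNIER90_ROOT)/lib -lwannier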

Afterwards,

Code:

mpirun -np 8 vasp_std
should also run smoothly for your calculation. Let me know if this solves the problem.
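If you are ever unsure how an existing libwannier.a was built, a quick diagnostic (a sketch, not an official procedure; the path is a placeholder) is to look for undefined MPI symbols in the archive:

Code:

$ nm /path/to/wannier90/libwannier.a | grep -i ' U .*mpi'
# undefined mpi_* symbols indicate a COMMS=mpi (parallel) build;
# no output means the library was built serially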

Best regards,
Marie-Therese

PS: When you upload a zip, please try to include only the essential files. Thank you for understanding.

hszhao.cn@gmail.com
Full Member
Posts: 189
Joined: Tue Oct 13, 2020 11:32 pm

Re: Run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

#3 Post by hszhao.cn@gmail.com » Tue Mar 08, 2022 12:15 am

Great. It does the trick. Thank you very much for the tip.

By saying "the essential files" in the following comment:
PS: When you upload a zip, please try to only include the essential files. Thank you for understanding.
Do you mean to exclude the POTCAR file?

Regards,
HZ

marie-therese.huebsch
Full Member
Posts: 211
Joined: Tue Jan 19, 2021 12:01 am

Re: Run VASP in parallel but call the wannier90 library serially from within one VASP calculation.

#4 Post by marie-therese.huebsch » Tue Mar 08, 2022 6:48 am

Glad to help!

The essential files are all input files (depending on the calculation, that is mostly INCAR, KPOINTS, POTCAR, and POSCAR, but possibly also ICONST, KPOINTS_OPT, etc.) and the main output files (OUTCAR, stdout, and, for molecular dynamics, REPORT). So, unless requested, please do not upload vaspout.h5, PCDAT, CHG, CHGCAR, etc. It is also good to include information on how you ran the job, as you did. For your reference, here are the forum guidelines. Thanks for asking.
