
Weird memory requirement

Posted: Fri Nov 08, 2013 8:46 pm
by chelman
There is a line in the OUTCAR file which says:
"total amount of memory used by VASP on root node xxx Kb"
where xxx is a number.
If I run in parallel on 2 quad-cores (8 cores for VASP) with NPAR = 8, then xxx = 1341434.
But if I run the SAME SYSTEM in parallel on 4 quad-cores (16 cores for VASP) with NPAR = 16, then xxx = 2085985.
NPAR is the ONLY difference between these two runs!
What I don't understand is: if I have more cores to run on, why is the memory requirement bigger? It seems like more cores need more memory. Shouldn't it be the other way around?

Thanks!!

Weird memory requirement

Posted: Mon Nov 11, 2013 3:35 am
by ledssiul
I think this behavior is normal, and it is already explained in the VASP guide. Take a look at the meaning of the NPAR and LPLANE variables.

http://cms.mpi.univie.ac.at/vasp/guide/node138.html

You will find that an optimum setting of these two variables depends a lot on the type of machine you are running.
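As a quick sanity check on the numbers in the first post, here is a minimal sketch. It only uses the two root-node figures reported above; the interpretation (that part of the allocation is duplicated per band group while the rest is distributed) is an assumption, not something VASP reports:

```python
# Root-node memory reported in OUTCAR for the two runs above (kB).
mem_npar8 = 1_341_434   # 8 cores, NPAR = 8
mem_npar16 = 2_085_985  # 16 cores, NPAR = 16

# Doubling NPAR grew root-node memory by ~1.56x, not 2x. A plausible
# (assumed) reading: only part of the allocation is duplicated per
# band group, while the rest is genuinely distributed over cores.
ratio = mem_npar16 / mem_npar8
print(f"root-node memory grew by {ratio:.2f}x when NPAR doubled")
```

So the growth is sublinear in NPAR, which is consistent with Luis's point that the NPAR/LPLANE trade-off between communication and per-core memory is machine dependent.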

Hope it helps,

Regards,

Luis

Weird memory requirement

Posted: Thu Nov 14, 2013 10:31 pm
by chelman
Thanks for your help, but still nothing.
Now I'm asking the community (especially the administrator):
Is there any way to predict (estimate) the memory requirement per node before running a job?
Is there any calculation we can do to estimate the memory?

Perhaps this question is too general; for the particular case I'm working on, it can be narrowed down to:

I have already converged a spin-polarized calculation and I want to perform a spin-orbit one. Knowing the memory footprint of the spin-polarized calculation, can I predict that of the SO one?
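One rough way to frame such an estimate: in a noncollinear spin-orbit run the wavefunctions become two-component spinors and NBANDS is typically roughly doubled, so wavefunction-like arrays can grow by about a factor of four relative to the collinear run, while other arrays grow less. The sketch below is a back-of-the-envelope guess under those assumptions; the scaling factors and the `wavefunction_fraction` parameter are illustrative, not values reported by VASP:

```python
# Hypothetical estimate of spin-orbit (noncollinear) memory from a
# converged collinear spin-polarized run. The scaling factors below
# are assumptions, not VASP output.

def estimate_soc_memory_kb(collinear_kb: float,
                           wavefunction_fraction: float = 0.7) -> float:
    """Guess root-node memory for a spin-orbit run.

    collinear_kb: memory reported in OUTCAR for the collinear run.
    wavefunction_fraction: assumed fraction of that memory held in
        wavefunction-like arrays (a guess; it is system dependent).

    Spinor wavefunctions have 2 components and NBANDS is typically
    doubled, so the wavefunction part is scaled by ~4x here; the
    remainder (charge density, projectors, work space) is left as-is.
    """
    wf = collinear_kb * wavefunction_fraction
    rest = collinear_kb - wf
    return 4.0 * wf + rest

# Applied to the collinear figure from the first post:
print(estimate_soc_memory_kb(1341434))
```

This gives only an order-of-magnitude bound; the honest check is still to read the "total amount of memory used by VASP on root node" line from a short test run of the SO job itself.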