[CP2K-user] [CP2K:12240] Running Cp2k in parallel using thread in a PC
Nikhil Maroli
scin... at gmail.com
Fri Sep 20 15:54:08 UTC 2019
Hello,
I have been using GROMACS with GPU acceleration since 2014. Running CP2K doesn't
show any information about the GPU. I'm using Ubuntu 16 LTS.
On Fri, Sep 20, 2019 at 9:17 PM Pierre Cazade <pierre.a... at gmail.com>
wrote:
> Hi Nikhil,
>
> This is an excellent question. I have not tried the GPU version of CP2K yet.
> I am actually trying to compile it on the cluster that I am using.
>
> Normally, you only need to install the CUDA libraries and set up the
> environment variables properly. The executable then detects the presence
> of the GPU automatically, provided you have installed the NVIDIA driver.
> At least, this is how GROMACS behaves, for example. Which Linux
> distribution are you using?
>
> If you use the GPU, avoid launching too many MPI processes; ideally, use one per GPU.
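>
> A minimal sketch of a single-GPU run (assuming bash, one GPU, and a
> CUDA-enabled cp2k.psmp build; file names and the thread count are only
> placeholders):
>
> nvidia-smi                      # check that the driver sees the GPU
> export CUDA_VISIBLE_DEVICES=0   # restrict the run to the first GPU
> export OMP_NUM_THREADS=12
> mpirun -np 1 cp2k.psmp -i inp.inp -o out.out
>
> You can watch GPU utilisation with nvidia-smi in a second terminal while
> the job runs to confirm that the GPU is actually being used.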
>
> Regards,
> Pierre
>
> PS: Regarding your previous post: rather than "mpirun -n 2", try "mpirun
> -np 2". Finally, for a multi-node calculation on a cluster, you can use
> "mpirun -np 8 -ppn 2". The "-np" option tells mpirun the total number of MPI
> processes requested and "-ppn" tells it how many processes per node you want.
> In the present example, I am using 4 nodes and I want 2 MPI processes on
> each of them, so a total of 8. Of course, don't forget to set
> OMP_NUM_THREADS as well.
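>
> A minimal sketch of such a 4-node job (assuming bash, a cp2k.psmp binary,
> and an mpirun that accepts -ppn, e.g. Intel MPI or MPICH; the thread count
> is only a placeholder):
>
> export OMP_NUM_THREADS=12    # OpenMP threads per MPI process, adjust to your node
> mpirun -np 8 -ppn 2 cp2k.psmp -i inp.inp -o out.out
>
> i.e. 4 nodes x 2 MPI processes per node, each process using 12 OpenMP threads.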
>
>
>
> On 20/09/2019 16:29, Nikhil Maroli wrote:
>
> Thank you very much for your reply.
> Could you please tell me how to use the GPU in CP2K?
> I have installed all the libraries and compiled with CUDA. I couldn't find
> any instructions on how to assign the GPU to the calculations.
>
> On Fri, Sep 20, 2019, 8:15 PM Pierre Cazade <pierre.a... at gmail.com>
> wrote:
>
>> Hello Nikhil,
>>
>> With the command "mpirun -n 42 cp2k.popt -i inp.inp -o out.out", you are
>> requesting 42 MPI processes, not 42 OpenMP threads. MPI usually relies on
>> replicated data, which means that, for poorly programmed software, it will
>> request a total amount of memory equal to the memory required by a serial
>> execution times the number of processes. This can very quickly become
>> problematic, in particular for QM calculations. OpenMP, however, relies on
>> shared memory: the data is normally not replicated but shared between
>> threads, and therefore, in an ideal scenario, the amount of memory needed
>> for 42 OpenMP threads is the same as for a single thread.
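>>
>> As a rough illustration (the 4 GB figure is only an assumed example): if a
>> serial run of your input needs about 4 GB, then 42 MPI processes with fully
>> replicated data could need up to 42 x 4 GB = 168 GB, more than your 128 GB
>> of RAM, whereas 42 OpenMP threads would ideally still need only about 4 GB.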
>>
>> This might explain why your calculation freezes: you are running out of memory.
>> On your workstation, you should only use the executable "cp2k.ssmp", which
>> is the OpenMP-only version. Then you don't need the mpirun command:
>>
>> cp2k.ssmp -i inp.inp -o out.out
>>
>> To control the number of OpenMP threads, set the environment variable
>> OMP_NUM_THREADS, e.g. in bash: export OMP_NUM_THREADS=48
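>>
>> For instance, for the 42-thread run you described:
>>
>> export OMP_NUM_THREADS=42
>> cp2k.ssmp -i inp.inp -o out.out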
>>
>> Now, if you need to balance between MPI and OpenMP, you should use the
>> executable named cp2k.psmp. Here is such an example:
>>
>> export OMP_NUM_THREADS=24
>> mpirun -n 2 cp2k.psmp -i inp.inp -o out.out
>>
>> In this example, I am requesting two MPI processes, and each of them can use
>> up to 24 OpenMP threads.
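>>
>> (For the 42 threads you mentioned, an equivalent split would be, for example,
>> 6 MPI processes with 7 OpenMP threads each: export OMP_NUM_THREADS=7 followed
>> by mpirun -n 6 cp2k.psmp -i inp.inp -o out.out. Just keep processes x threads
>> at or below the 48 hardware threads of your machine.)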
>>
>> Hope this clarifies things for you.
>>
>> Regards,
>> Pierre
>>
>> On 20/09/2019 14:09, Nikhil Maroli wrote:
>>
>> Dear all,
>>
>> I have installed all the versions of CP2K on my workstation, which has
>> 2 x 12-core processors, 48 threads in total.
>>
>> I want to run CP2K in parallel using 42 threads. Can anyone share the
>> commands I should use?
>>
>> I have tried
>>
>> mpirun -n 42 cp2k.popt -i inp.inp -o out.out
>>
>> After this command, memory usage rises to 100% and the whole system
>> freezes (I have 128 GB of RAM).
>>
>> Any suggestion will be greatly appreciated,
>>
>>
>> --
>> Dr Pierre Cazade, PhD
>> AD3-023, Bernal Institute,
>> University of Limerick,
>> Plassey Park Road,
>> Castletroy, co. Limerick,
>> Ireland
>>
>>
>
>
> --
> Dr Pierre Cazade, PhD
> AD3-023, Bernal Institute,
> University of Limerick,
> Plassey Park Road,
> Castletroy, co. Limerick,
> Ireland
>
>
--
Regards,
Nikhil Maroli