[CP2K-user] [CP2K:19595] Re: Error: Parameter ‘mpi_comm_null’ at (1) has not been declared

Mikhail Povarnitsyn povarnitsynme at gmail.com
Wed Nov 29 18:14:24 UTC 2023


Dear Frederick,

Thank you for this. Following your suggestion, I ran the toolchain job on a 
node of interest, and upon successful completion, I observed that the 
'local.psmp' arch file is identical to one obtained by compilation on the 
head node. In both cases, the -march=native option is present.

Did I correctly understand your idea of achieving node-dependent 
compilation?  Where I can find the actual values of -march, -mtune, and 
other relevant parameters during the toolchain step? 

Best regards, 
Mikhail

On Wednesday, November 29, 2023 at 1:05:39 PM UTC+1 Frederick Stein wrote:

> Dear Mikhail,
>
> Did you try to compile the code as part of a job on the given machine? 
> Then, the compiler should be able grep the correct flags.
>
> Best,
> Frederick 
>
> Mikhail Povarnitsyn schrieb am Mittwoch, 29. November 2023 um 13:00:52 
> UTC+1:
>
>> Dear Frederick,
>>
>> I appreciate your continued assistance. Given that we have a mixture of 
>> processor types (Intel Xeon and AMD EPYC), determining the optimal -march 
>> and -mtune options (now 'native', by default) is currently not 
>> straightforward.
>>
>> Best regards,
>> Mikhail
>> On Wednesday, November 29, 2023 at 9:52:35 AM UTC+1 Frederick Stein wrote:
>>
>>> Dear Mikhail,
>>>
>>> I am not quite the expert for these machine-dependent options. I 
>>> personally ignore this error message. Machine-related optimizations 
>>> dependent on the actual setup (CPU, cross-compilation, ...) and the 
>>> compiler. If you compile for a supercomputing cluster, it is recommended to 
>>> make use of the center-provided compiler wrappers as they may have better 
>>> optimized MPI libraries or setup for their machine. You may check the 
>>> manual of your compiler for further information on machine-dependent 
>>> options in case you know the CPU and its instructions set (For Gfortran: 
>>> https://gcc.gnu.org/onlinedocs/gcc/x86-Options.html).
>>>
>>> Maybe, some of the other folks are of better help here.
>>>
>>> Best,
>>> Frederick
>>> Mikhail Povarnitsyn schrieb am Mittwoch, 29. November 2023 um 00:41:41 
>>> UTC+1:
>>>
>>>> Dear Frederick,
>>>>
>>>> I wanted to express my gratitude for your advice on removing 
>>>> -D__MPI_F08; it was immensely helpful.
>>>>
>>>> Upon comparing the performance of the 'cp2k.popt' code across versions 
>>>> 2023.2, 7.1, and 9.1, I observed a consistent runtime of approximately 25 
>>>> minutes, with minor variations within a few seconds for all versions. 
>>>> However, in the output of version 2023.2, I noticed a new message:
>>>>
>>>> *** HINT in environment.F:904 :: The compiler target flags (generic) 
>>>> used *** *** to build this binary cannot exploit all extensions of this CPU 
>>>> model *** *** (x86_avx2). Consider compiler target flags as part of FCFLAGS 
>>>> and *** *** CFLAGS (ARCH file). ***
>>>>
>>>> I would greatly appreciate your advice on how I can enhance the 
>>>> performance of the parallel version.
>>>>
>>>> Thank you in advance for your assistance.
>>>>
>>>> Best regards, Mikhail
>>>>
>>>>
>>>> On Tuesday, November 28, 2023 at 9:43:14 PM UTC+1 Frederick Stein wrote:
>>>>
>>>>> Dear Mikhail,
>>>>>
>>>>> Can you remove the -D__MPI_F08 and recompile. I think it might be 
>>>>> related to an unsufficient support of the mpi_f08 module which CP2K uses by 
>>>>> default with OpenMPI and which is not tested with older versions of 
>>>>> compiler and library. Alternatively, if you have access to a later version 
>>>>> of the library (also with the CP2K Toolchain, add the flags 
>>>>> --with-openmpi=install or --with-mpich=install or additionally with 
>>>>> --with-gcc=install to install GCC 13).
>>>>>
>>>>> Best regards,
>>>>> Frederick
>>>>>
>>>>> Mikhail Povarnitsyn schrieb am Dienstag, 28. November 2023 um 18:45:51 
>>>>> UTC+1:
>>>>>
>>>>>> Dear Frederick,
>>>>>>
>>>>>> Thank you very much for the reply!
>>>>>>
>>>>>> 1) Yes, I mean OpenMPI 3.1.6.
>>>>>>
>>>>>> 2) 'local.psmp' file is attached, hope that is what you asked for.
>>>>>>
>>>>>> 3) Yes, I did the command  'source 
>>>>>> /user/povar/cp2k-2023.2/tools/toolchain/install/setup' after the toolchain.
>>>>>>
>>>>>> 4)  mpifort --version
>>>>>> GNU Fortran (GCC) 9.3.0
>>>>>> Copyright (C) 2019 Free Software Foundation, Inc.
>>>>>> This is free software; see the source for copying conditions.  There 
>>>>>> is NO
>>>>>> warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR 
>>>>>> PURPOSE.
>>>>>>
>>>>>> Best regards,
>>>>>> Mikhail
>>>>>>
>>>>>> On Tuesday, November 28, 2023 at 1:08:25 PM UTC+1 Frederick Stein 
>>>>>> wrote:
>>>>>>
>>>>>>> Dear Mikhail,
>>>>>>>
>>>>>>> I suppose you mean OpenMPI 3.1.6 (MPI is just the standard defining 
>>>>>>> an interface for parallel programming). Could you post your arch file of 
>>>>>>> the parallel build? Did you source the setup file after the toolchain 
>>>>>>> script finished? Could you also post the output of `mpifort --version`?
>>>>>>>
>>>>>>> Best,
>>>>>>> Frederick
>>>>>>>
>>>>>>> Mikhail Povarnitsyn schrieb am Dienstag, 28. November 2023 um 
>>>>>>> 11:06:14 UTC+1:
>>>>>>>
>>>>>>>> Dear Developers and Users,
>>>>>>>>
>>>>>>>> I am attempting to install the latest version, 2023.2, using the 
>>>>>>>> GNU compiler (gcc 9.3.0) along with MPI 3.1.6. I employed the toolchain 
>>>>>>>> script as follows: './install_cp2k_toolchain.sh'.
>>>>>>>>
>>>>>>>> The serial version 'ssmp' has been successfully compiled. However, 
>>>>>>>> the compilation of the parallel version 'psmp' failed with the following 
>>>>>>>> error: 
>>>>>>>>
>>>>>>>> /user/povar/cp2k-2023.2/exts/dbcsr/src/mpi/dbcsr_mpiwrap.F:106:53:
>>>>>>>>
>>>>>>>>   106 |    MPI_COMM_TYPE, PARAMETER :: mp_comm_null_handle = 
>>>>>>>> MPI_COMM_NULL
>>>>>>>>       |                                                     1
>>>>>>>> Error: Parameter ‘mpi_comm_null’ at (1) has not been declared or is 
>>>>>>>> a variable, which does not reduce to a constant expression
>>>>>>>> /user/povar/cp2k-2023.2/exts/dbcsr/src/mpi/dbcsr_mpiwrap.F:107:53:
>>>>>>>>
>>>>>>>>   107 |    MPI_COMM_TYPE, PARAMETER :: mp_comm_self_handle = 
>>>>>>>> MPI_COMM_SELF
>>>>>>>>       |                                                     1
>>>>>>>> Error: Parameter ‘mpi_comm_self’ at (1) has not been declared or is 
>>>>>>>> a variable, which does not reduce to a constant expression
>>>>>>>> /user/povar/cp2k-2023.2/exts/dbcsr/src/mpi/dbcsr_mpiwrap.F:108:54:
>>>>>>>>
>>>>>>>>   108 |    MPI_COMM_TYPE, PARAMETER :: mp_comm_world_handle = 
>>>>>>>> MPI_COMM_WORLD
>>>>>>>>
>>>>>>>> and other similar errors. 
>>>>>>>>
>>>>>>>> Could you please help?
>>>>>>>>
>>>>>>>> Best regards
>>>>>>>> Mikhail
>>>>>>>>
>>>>>>>>

-- 
You received this message because you are subscribed to the Google Groups "cp2k" group.
To unsubscribe from this group and stop receiving emails from it, send an email to cp2k+unsubscribe at googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/cp2k/ae84e905-793b-4f3d-8887-d107e0fbb87dn%40googlegroups.com.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <https://lists.cp2k.org/archives/cp2k-user/attachments/20231129/8b1ad0ae/attachment.htm>


More information about the CP2K-user mailing list