Running CDFT Tutorial Calculation on Cluster

Nico Holmberg holmbe... at gmail.com
Tue Jul 31 06:35:45 UTC 2018


Hi Brian,

Thanks for the information. Just to confirm, if you run
ifort --version

does the command return ifort (IFORT) 17.0.4? I need to use a different 
machine than usual to compile with that version of the Intel compiler, 
so it will take me a while to familiarize myself with the proper build 
process on that machine. Hopefully I'll find the time later this week.
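
For reference, the output should look roughly like the following (the 
build date shown here is just illustrative):

ifort --version
ifort (IFORT) 17.0.4 20170411
Copyright (C) 1985-2017 Intel Corporation.  All rights reserved.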

By the way, the error message you posted is quite cryptic and does not 
point to anything related to writing the cube file. Any chance you could 
post the full output log file for the crashing simulation? Are you able to 
reproduce the crash if you decrease the number of MPI tasks from 14 to, 
say, 2 or 4?
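
For example, a minimal SLURM batch script along these lines (input and 
output file names are placeholders) would let you test with 4 tasks:

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=4

# Same launch line as in your crashing run, just with fewer MPI tasks
mpirun -np $SLURM_NTASKS cp2k.popt -i input.inp -o output.out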


BR,

Nico




On Monday, July 30, 2018 at 23:00:41 UTC+3, Brian Day wrote:
>
> Hi Nico,
>
> I load the following modules for the Intel compilers and CP2K:
>
> module load intel/2017.3.196 intel-mpi/2017.3.196 cp2k/6.1
>
> Summarized below are the conditions I used and the results I got:
>
> nodes = 1, tasks = 14, executable = cp2k.popt -i *.inp -o *.out
> ---> Cube files generated without issue
>
> nodes = 1, tasks = 14, executable = mpirun -np $SLURM_NTASKS cp2k.popt -i *.inp -o *.out
> --->
>
> forrtl: severe (174): SIGSEGV, segmentation fault occurred
>
> Image              PC                Routine            Line        Source
> cp2k.popt          000000000D730E14  Unknown               Unknown  Unknown
> libpthread-2.17.s  00002ABEB44735E0  Unknown               Unknown  Unknown
> libc-2.17.so       00002ABEB61E74DC  cfree                 Unknown  Unknown
> cp2k.popt          000000000D767FA8  Unknown               Unknown  Unknown
> cp2k.popt          000000000109F105  qs_dispersion_typ         135  qs_dispersion_types.F
> cp2k.popt          0000000000B88BFF  qs_environment_ty        1476  qs_environment_types.F
> cp2k.popt          0000000000B13020  force_env_types_m         232  force_env_types.F
> cp2k.popt          0000000000EE66E0  f77_interface_mp_         335  f77_interface.F
> cp2k.popt          000000000043BEF8  cp2k_runs_mp_run_         405  cp2k_runs.F
> cp2k.popt          0000000000432814  MAIN__                    281  cp2k.F
> cp2k.popt          000000000043151E  Unknown               Unknown  Unknown
> libc-2.17.so       00002ABEB6188C05  __libc_start_main     Unknown  Unknown
> cp2k.popt          0000000000431429  Unknown               Unknown  Unknown
> Note that I had to run these simulations on a different cluster, as the 
> one I was using previously requires the submission file to declare a 
> minimum of 2 nodes. I can talk to our computing center and see if they 
> can test this themselves.
>
> To (hopefully) clarify my earlier message: when I ran with 2 nodes, the 
> last lines in the cp2k output file would be:
>
> The sum of alpha and beta density is written in cube file format to the 
> file:
>
> /scratch/slurm-1244788/water-dimer-frag-b-pbe-energy-ELECTRON_DENSITY-1_0.cube
>
> and the electron density file would appear in the submission directory, 
> but it would be empty. If I re-ran the simulation, passing this blank 
> file in on the cluster I was running on, it would run successfully, but 
> opening the file in another program such as Avogadro, or using it in a 
> subsequent simulation, would not work. Maybe this is because the file 
> is only partially written, as you pointed out.
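>
> A quick way to confirm that the cube file is actually empty (using the 
> file name from the run above) is, for example:
>
> ls -l /scratch/slurm-1244788/water-dimer-frag-b-pbe-energy-ELECTRON_DENSITY-1_0.cube
> head /scratch/slurm-1244788/water-dimer-frag-b-pbe-energy-ELECTRON_DENSITY-1_0.cube
>
> A valid Gaussian cube file starts with two comment lines followed by 
> the atom count and grid dimensions; an empty or truncated file will be 
> missing these.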
>
> I will try to update this thread with the compilation information and 
> a detailed debugging output. (Sorry if any of the above does not make 
> sense; I am still fairly new to computational work.)
>
> Thanks again.
>
> -Brian
>