<div dir="ltr">Hi Brian,<div><br></div><div>Sorry for the long delay in replying; I had a couple of tight deadlines that required my full attention.</div><div><br></div><div>I compiled CP2K using version 17.0.4 20170411 of the Intel Fortran compiler, Intel MPI, and MKL. You can find my arch file below. I ran the tutorial files with 1, 2, and 24 MPI processes and did not encounter any issues.</div><div><br></div><div>Looking at the stack trace you included in your last post, the calculation appears to be crashing inside the MPI I/O routine that CP2K calls. This looks like a library issue to me. Are you able to provide any more information about how your binary was compiled?</div><div><br></div><div>By the way, if you have access to the latest development version of CP2K (dated yesterday), you can disable MPI I/O to force CP2K to use the serial versions of the cube writer/reader. This works around the crash without fixing the underlying problem. See the discussion in <a href="https://groups.google.com/forum/#!topic/cp2k/RgsNKmQtVXw">this post</a> for more information.</div><div><br></div><div><div class="prettyprint" style="background-color: rgb(250, 250, 250); border-color: rgb(187, 187, 187); border-style: solid; border-width: 1px; overflow-wrap: break-word;"><code class="prettyprint"><div class="subprettyprint"><font color="#660066"><div class="subprettyprint"># Bare bones arch file for building CP2K with the Intel compilation suite<br># Tested with ifort (IFORT) + Intel MPI + MKL version 17.0.4 20170411</div></font><font color="#660066"><div class="subprettyprint"><br></div><div class="subprettyprint"># Build tools</div><div class="subprettyprint">CC       = icc</div><div class="subprettyprint">CPP      =</div><div class="subprettyprint">FC       = 
mpiifort</div><div class="subprettyprint">LD       = mpiifort</div><div class="subprettyprint">AR       = ar -r</div><div class="subprettyprint"><br></div><div class="subprettyprint"># Flags and libraries</div><div class="subprettyprint">CPPFLAGS =</div><div class="subprettyprint"><br></div><div class="subprettyprint">DFLAGS   = -D__BLACS -D__INTEL -D__MKL -D__FFTW3 -D__parallel -D__SCALAPACK  \</div><div class="subprettyprint">           -D__HAS_NO_SHARED_GLIBC</div><div class="subprettyprint"><br></div><div class="subprettyprint">CFLAGS   = $(DFLAGS)</div><div class="subprettyprint"><br></div><div class="subprettyprint">FCFLAGS  = $(DFLAGS) -O2 -g -traceback -fp-model precise -fp-model source -free  \</div><div class="subprettyprint">           -I$(MKLROOT)/include -I$(MKLROOT)/include/fftw</div><div class="subprettyprint"><br></div><div class="subprettyprint">LDFLAGS  = $(FCFLAGS)</div><div class="subprettyprint"><br></div><div class="subprettyprint">LDFLAGS_C = $(FCFLAGS) -nofor_main</div><div class="subprettyprint"><br></div><div class="subprettyprint">LIBS     = -Wl,--start-group \</div><div class="subprettyprint">           $(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a \</div><div class="subprettyprint">           $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a \</div><div class="subprettyprint">           $(MKLROOT)/lib/intel64/libmkl_sequential.a \</div><div class="subprettyprint">           $(MKLROOT)/lib/intel64/libmkl_core.a \</div><div class="subprettyprint">           $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a \</div><div class="subprettyprint">           -Wl,--end-group \</div><div class="subprettyprint">           -lpthread -lm -ldl</div><div class="subprettyprint"><br></div><div class="subprettyprint"># Required due to memory leak that occurs if high optimisations are used</div><div class="subprettyprint">mp2_optimize_ri_basis.o: mp2_optimize_ri_basis.F</div><div class="subprettyprint"><span style="white-space:pre">                    </span> 
$(FC) -c $(subst O2,O0,$(FCFLAGS)) $<</div></font></div></code></div><br></div><div><br>On Thursday, August 9, 2018 at 19:51:08 UTC+3, Brian Day wrote:<blockquote class="gmail_quote" style="margin: 0;margin-left: 0.8ex;border-left: 1px #ccc solid;padding-left: 1ex;"><div dir="ltr">Actually, the error message is slightly different; see below:<div><br></div><div>
<p><span>forrtl: severe (174): SIGSEGV, segmentation fault occurred</span></p>
<p><span>Image<span>              </span>PC<span>                </span>Routine<span>            </span>Line<span>        </span>Source</span></p>
<p><span>cp2k.popt<span>          </span>000000000D730E14<span>  </span>Unknown <span>              </span>Unknown<span>  </span>Unknown</span></p>
<p><span>libpthread-2.17.s<span>  </span>00002ABEBC47F5E0<span>  </span>Unknown <span>              </span>Unknown<span>  </span>Unknown</span></p>
<p><span>libmpi.so.12<span>   </span> <span>  </span>00002ABEBD7DA1BA<span>  </span>PMPI_File_write_a <span>    </span>Unknown<span>  </span>Unknown</span></p>
<p><span>libmpifort.so.12.<span>  </span>00002ABEBCF2F1AE<span>  </span>pmpi_file_write_a <span>    </span>Unknown<span>  </span>Unknown</span></p>
<p><span>cp2k.popt<span>          </span>0000000002F17BCC<span>  </span>message_passing_m<span>        </span>3315<span>  </span>message_passing.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000002A33AFC<span>  </span>realspace_grid_cu <span>        </span>698<span>  </span>realspace_grid_cube.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000002A31F4D<span>  </span>realspace_grid_cu <span>        </span>211<span>  </span>realspace_grid_cube.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000A06F9D<span>  </span>cp_realspace_grid<span>          </span>64<span>  </span>cp_realspace_grid_cube.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000A9F32B<span>  </span>qs_scf_post_gpw_m<span>        </span>2651<span>  </span>qs_scf_post_gpw.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000A883B1<span>  </span>qs_scf_post_gpw_m<span>        </span>2001<span>  </span>qs_scf_post_gpw.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000EB6610<span>  </span>qs_scf_post_scf_m<span>          </span>70<span>  </span>qs_scf_post_scf.F</span></p>
<p><span>cp2k.popt<span>          </span>00000000017A267F<span>  </span>qs_scf_mp_scf_<span>            </span>285<span>  </span>qs_scf.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000BA7709<span>  </span>qs_energy_mp_qs_e<span>          </span>86<span>  </span>qs_energy.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000C52681<span>  </span>qs_force_mp_qs_ca <span>        </span>115<span>  </span>qs_force.F</span></p>
<p><span>cp2k.popt<span>          </span>000000000096F4AA<span>  </span>force_env_methods <span>        </span>242<span>  </span>force_env_methods.F</span></p>
<p><span>cp2k.popt<span>          </span>000000000043BCAC<span>  </span>cp2k_runs_mp_run_ <span>        </span>323<span>  </span>cp2k_runs.F</span></p>
<p><span>cp2k.popt<span>          </span>0000000000432814<span>  </span>MAIN__<span>                    </span>281<span>  </span>cp2k.F</span></p>
<p><span>cp2k.popt<span>          </span>000000000043151E<span>  </span>Unknown <span>              </span>Unknown<span>  </span>Unknown</span></p>
<p><span>libc-2.17.so<span>      </span> <span>  </span>00002ABEBE194C05<span>  </span>__libc_start_main <span>    </span>Unknown<span>  </span>Unknown</span></p>
<p><span>cp2k.popt<span>          </span>0000000000431429<span>  </span>Unknown <span>              </span>Unknown<span>  </span>Unknown</span></p><div><br></div>Thanks again for all your help so far!</div><div><br></div><div>-Brian</div><div><br>On Thursday, August 9, 2018 at 12:49:45 PM UTC-4, Brian Day wrote:<blockquote class="gmail_quote" style="margin:0;margin-left:0.8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Hi Nico,<div><br></div><div>Sorry for the long-delayed reply; I had forgotten to check this thread for some time!</div><div><br></div><div>ifort --version returns: ifort (IFORT) 17.0.04 20170411.</div><div>Additionally, I get the same error message when I reduce the number of MPI tasks to 4 (2 per node, 2 nodes).</div><div><br></div><div>Best,</div><div>     Brian</div></div></blockquote></div></div>
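P.S. A side note on the last rule in the arch file above: GNU make's $(subst O2,O0,$(FCFLAGS)) rewrites the optimisation flag for that single object file, while the rest of the build keeps -O2. A minimal, self-contained sketch of how the substitution behaves (the Makefile.demo file and its echo recipe are hypothetical, just for illustration — not part of CP2K):

```shell
# Throwaway makefile demonstrating $(subst from,to,text); the one-line
# "target: ; recipe" form avoids needing a literal tab character here.
cat > Makefile.demo <<'EOF'
FCFLAGS = -O2 -g -traceback
all: ; @echo "global flags:   $(FCFLAGS)"; echo "per-file flags: $(subst O2,O0,$(FCFLAGS))"
EOF
make -f Makefile.demo
# prints:
#   global flags:   -O2 -g -traceback
#   per-file flags: -O0 -g -traceback
```

This is the same pattern the mp2_optimize_ri_basis.o rule uses to compile just that one file at -O0 and sidestep the memory leak triggered by higher optimisation levels.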