<div dir="ltr">Hi all,<br><br>We could finally compile and install CP2K with OpenMPI 1.6.5, but it is very slow because we had to compile with -O0. Only then does the benchmark run through without exiting with the error:<br><br>*** cp_dbcsr_cholesky:cp_dbcsr_cholesky_decompose ***<br>*** processor 0 :: err=-300 condition FAILED at line 123 ***<br><br>We have since successfully installed with OpenMPI 1.7.2 at -O1, and now everything works.<br>Thanks to all!<br><br>Cristiano<br><br>On Monday, 30 September 2013 at 11:37:38 UTC+2, sassy wrote:<blockquote class="gmail_quote" style="margin: 0;margin-left: 0.8ex;border-left: 1px #ccc solid;padding-left: 1ex;">Hi Cristiano,
<br>
<br>I got version 2.4 compiled with OpenMPI 1.6.5 and gfortran 4.7.2 and ATLAS
<br>3.10.0.
<br>
<br>However, I also ran into the same cholesky_decompose problem that you had.
<br>
<br>In the end, if I remember correctly, what I did was either simply change the number of
<br>processors/cores I was using, after which the problem vanished, or use 'random' for the
<br>initial guess. It was one of those two.
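<br>
<br>For what it's worth, the 'random' guess is a one-keyword change in the &SCF section; a minimal sketch only, reusing the keywords from the input quoted below:
<br>

```
&SCF
  SCF_GUESS RANDOM   ! instead of SCF_GUESS ATOMIC
  CHOLESKY OFF
  MAX_SCF 40
&END SCF
```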
<br>
<br>In any case, yes, it is possible to use OpenMPI 1.6.5, and I am using it in
<br>production as well. I concluded that the cholesky_decompose issue had nothing to do
<br>with the OpenMPI version I was using.
<br>
<br>I hope that helps a bit.
<br>
<br>All the best from a mild but grey London
<br>
<br>Jörg
<br>
<br>On Monday 30 September 2013 09:41:44 crist...@gmail.com wrote:
<br>> Hi!
<br>> thanks for your suggestion.
<br>> I'll try with the 1.7.2 release and then I'll post whether everything goes
<br>> the right way.
<br>> Cristiano
<br>>
<br>> On Friday, 27 September 2013 16:30:22 UTC+2, Ari Paavo Seitsonen wrote:
<br>> > Dear Cristiano,
<br>> >
<br>> > This might not really help you, but I remember that on the "big"
<br>> >
<br>> > computer at our university OpenMPI 1.6.5 gave some error (I don't
<br>> > remember what the error was though), and thus I still use the version
<br>> > 1.4.5 when compiling. Did you try with for example version 1.7.2?
<br>> >
<br>> > Greetings,
<br>> >
<br>> > apsi
<br>> >
<br>> > 2013/9/27 <crist...@gmail.com>
<br>> >
<br>> >> Hi all,
<br>> >>
<br>> >> I've successfully compiled cp2k.popt with intel openmpi/1.6.5,
<br>> >> mkl/11.0.1, libint/1.1.4.
<br>> >>
<br>> >> my arch file (Linux-x86-64-intel.popt):
<br>> >>
<br>> >>
<br>> >> INTEL_LIB = $(INTEL_MKL)/lib/intel64
<br>> >> INTEL_INC = -I$(INTEL_MKL)/include -I$(INTEL_MKL)/include/fftw
<br>> >> INTEL_SCALAPACK = -L$(INTEL_LIB) -lmkl_scalapack_lp64
<br>> >> -lmkl_blacs_openmpi_lp64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core
<br>> >>
<br>> >> LIBINT_PATH = <my_path to libint-1.1.4>
<br>> >> LIBINT_LIB = <my_path to
<br>> >> cp2k-2.4.0>/tools/hfx_tools/libint_tools/libint_cpp_wrapper.o
<br>> >> $(LIBINT_PATH)/lib/libderiv.a $(LIBINT_PATH)/lib/libint.a -lstdc++
<br>> >>
<br>> >>
<br>> >> CC = icc
<br>> >> CPP =
<br>> >> FC = mpif90
<br>> >> LD = mpif90
<br>> >> AR = ar -r
<br>> >> DFLAGS = -D__HAS_NO_ISO_C_BINDING -D__INTEL -D__FFTSG -D__parallel
<br>> >> -D__BLACS -D__SCALAPACK -D__FFTW3 -D__LIBINT -D__FFTMKL
<br>> >> CPPFLAGS = -C -traditional $(DFLAGS) -I$(INTEL_INC)
<br>> >> FCFLAGS = $(DFLAGS) -I$(INTEL_INC) -O1 -heap-arrays 64 -fpp -free
<br>> >> FCFLAGS2 = $(DFLAGS) -I$(INTEL_INC) -O1 -heap-arrays 64 -fpp -free
<br>> >> LDFLAGS = $(FCFLAGS)
<br>> >> LIBS = -L$(INTEL_LIB) -mkl $(LIBINT_LIB) $(INTEL_SCALAPACK)
<br>> >>
<br>> >> OBJECTS_ARCHITECTURE = machine_intel.o
<br>> >>
<br>> >>
<br>> >> graphcon.o: graphcon.F
<br>> >>
<br>> >> $(FC) -c $(FCFLAGS2) $<
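<br>> >>
<br>> >> (As an aside, the graphcon.o rule above shows how the arch file can give a single file its own flags; the same pattern could in principle be used to compile only the failing file at -O0 while keeping -O1 for the rest. This is a sketch only; the file name cp_dbcsr_cholesky.F and the variable FCFLAGS0 are assumptions, not taken from this thread:)

```makefile
# Hypothetical per-file override, mirroring the graphcon.o rule above:
# build only the Cholesky module at -O0, keep -O1 everywhere else.
FCFLAGS0 = $(DFLAGS) -I$(INTEL_INC) -O0 -heap-arrays 64 -fpp -free

cp_dbcsr_cholesky.o: cp_dbcsr_cholesky.F

	$(FC) -c $(FCFLAGS0) $<
```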
<br>> >>
<br>> >> but when I run my test with the following input file:
<br>> >>
<br>> >> &GLOBAL
<br>> >>
<br>> >> PROJECT test01
<br>> >> RUN_TYPE MD
<br>> >> PRINT_LEVEL LOW
<br>> >>
<br>> >> &END GLOBAL
<br>> >> &FORCE_EVAL
<br>> >>
<br>> >> METHOD QS
<br>> >> &DFT
<br>> >>
<br>> >> BASIS_SET_FILE_NAME ../../BASIS_MOLOPT
<br>> >> POTENTIAL_FILE_NAME ../../GTH_POTENTIALS
<br>> >> &MGRID
<br>> >>
<br>> >> CUTOFF 400
<br>> >> NGRIDS 5
<br>> >>
<br>> >> &END MGRID
<br>> >> &QS
<br>> >> &END QS
<br>> >> &SCF
<br>> >>
<br>> >> CHOLESKY OFF
<br>> >> MAX_SCF 40
<br>> >> SCF_GUESS ATOMIC
<br>> >> &OT
<br>> >>
<br>> >> ENERGY_GAP 0.001
<br>> >> MINIMIZER CG
<br>> >> PRECONDITIONER FULL_ALL
<br>> >>
<br>> >> &END OT
<br>> >> &OUTER_SCF ON
<br>> >>
<br>> >> EPS_SCF 1.0E-6
<br>> >> MAX_SCF 10
<br>> >>
<br>> >> &END OUTER_SCF
<br>> >> &PRINT
<br>> >>
<br>> >> &RESTART
<br>> >> &END
<br>> >> &RESTART_HISTORY
<br>> >> &END
<br>> >>
<br>> >> &END PRINT
<br>> >>
<br>> >> &END SCF
<br>> >> &XC
<br>> >>
<br>> >> &XC_FUNCTIONAL PBE
<br>> >> &END XC_FUNCTIONAL
<br>> >> &VDW_POTENTIAL
<br>> >>
<br>> >> POTENTIAL_TYPE PAIR_POTENTIAL
<br>> >> &PAIR_POTENTIAL
<br>> >>
<br>> >> REFERENCE_FUNCTIONAL PBE
<br>> >> PARAMETER_FILE_NAME my_parameter.dat
<br>> >> TYPE DFTD3
<br>> >> R_CUTOFF [angstrom] 30
<br>> >>
<br>> >> &END PAIR_POTENTIAL
<br>> >>
<br>> >> &END VDW_POTENTIAL
<br>> >>
<br>> >> &END XC
<br>> >>
<br>> >> &END DFT
<br>> >>
<br>> >> &SUBSYS
<br>> >>
<br>> >> &CELL
<br>> >>
<br>> >> ABC 16.826 16.136 30.0
<br>> >>
<br>> >> &END CELL
<br>> >> &TOPOLOGY
<br>> >>
<br>> >> COORDINATE XYZ
<br>> >> COORD_FILE_NAME ./test01_ini.xyz
<br>> >> CONNECTIVITY OFF
<br>> >>
<br>> >> &END TOPOLOGY
<br>> >>
<br>> >> ...
<br>> >> ...
<br>> >> ...
<br>> >>
<br>> >> &END SUBSYS
<br>> >>
<br>> >> &END FORCE_EVAL
<br>> >> &MOTION
<br>> >>
<br>> >> &MD
<br>> >>
<br>> >> ENSEMBLE NVE
<br>> >> STEPS 10000
<br>> >> TIMESTEP 0.5
<br>> >> TEMPERATURE 310.0
<br>> >> TEMP_TOL 50.0
<br>> >>
<br>> >> &END MD
<br>> >> &PRINT
<br>> >>
<br>> >> &RESTART
<br>> >>
<br>> >> &EACH
<br>> >>
<br>> >> MD 1
<br>> >>
<br>> >> &END
<br>> >>
<br>> >> &END
<br>> >> &TRAJECTORY
<br>> >>
<br>> >> &EACH
<br>> >>
<br>> >> MD 1
<br>> >>
<br>> >> &END EACH
<br>> >>
<br>> >> &END TRAJECTORY
<br>> >>
<br>> >> &END PRINT
<br>> >>
<br>> >> &END MOTION
<br>> >>
<br>> >> cp2k.popt exits with:
<br>> >> ************************************************************************
<br>> >> *** 11:26:10 ERRORL2 in
<br>> >> cp_dbcsr_cholesky:cp_dbcsr_cholesky_decompose *** *** processor 0 ::
<br>> >> err=-300 condition FAILED at line 123 ***
<br>> >> ************************************************************************
<br>> >>
<br>> >>
<br>> >> ===== Routine Calling Stack =====
<br>> >>
<br>> >> 11 cp_dbcsr_cholesky_decompose
<br>> >> 10 qs_ot_get_derivative
<br>> >>
<br>> >> 9 ot_mini
<br>> >> 8 ot_scf_mini
<br>> >> 7 qs_scf_loop_do_ot
<br>> >> 6 scf_env_do_scf_inner_loop
<br>> >> 5 scf_env_do_scf
<br>> >> 4 qs_energies_scf
<br>> >> 3 qs_forces
<br>> >> 2 qs_mol_dyn_low
<br>> >> 1 CP2K
<br>> >>
<br>> >> but the same run goes fine with a cp2k.popt compiled with intel
<br>> >> openmpi/1.4.4, mkl/10.2.2, libint/1.1.4
<br>> >>
<br>> >> does anyone know whether cp2k/2.4 doesn't work with openmpi/1.6.5?
<br>> >> any suggestions?
<br>> >>
<br>> >> thanks for the help!
<br>> >>
<br>> >> Cristiano
<br>> >
<br>> > -=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-
<br>> >
<br>> > Ari P Seitsonen / Ari.P...@iki.fi /
<br>> >
<br>> > <a href="http://www.iki.fi/~apsi/" target="_blank">http://www.iki.fi/~apsi/</a>
<br>> >
<br>> > Physikalisch-Chemisches Institut der Universität Zürich
<br>> > Tel: +41 44 63 55 44 97 / Mobile: +41 79 71 90 935
<br>
<br>--
<br>***************************************************************
<br>Jörg Saßmannshausen
<br>University College London
<br>Department of Chemistry
<br>Gordon Street
<br>London
<br>WC1H 0AJ
<br>
<br>email: j.sas...@ucl.ac.uk
<br>web: <a href="http://sassy.formativ.net" target="_blank">http://sassy.formativ.net</a>
<br>
<br>Please avoid sending me Word or PowerPoint attachments.
<br>See <a href="http://www.gnu.org/philosophy/no-word-attachments.html" target="_blank">http://www.gnu.org/philosophy/no-word-attachments.html</a>
<br>
<br></blockquote></div>