Block after "Adding QM/MM electrostatic....."

Luca bellu... at unisi.it
Wed Nov 7 16:22:45 UTC 2007


 Hi all,
 many thanks for the tips...
 I resolved the blocking problem after a proper reinstallation of the libraries.

 Brief summary of the problem
 CP2K problem:
 Random hangs (blocks), without any warning, during the SCF procedure of
 QM/MM molecular dynamics runs.

 Machine:
 Intel, 2 processors, quad core, 2.667 GHz, with 8 GB of RAM
 Compiler: icc, ifort (Intel 10.0.23)
 Libraries: MKL 9.1.023,
                mpich2-1.0.5p4 -upgraded-> mpich2-1.0.6p1
                (www-unix.mcs.anl.gov/mpi/mpich/)
                BLACS (www.netlib.org/blacs/)
                SCALAPACK (www.netlib.org/scalapack/)
                and the Ubuntu 7.10 system libraries

 First, I tested my PC with memtest86... my RAM is OK!
 Then I upgraded mpich2 and reinstalled it
 in /usr/local/mpich2-1.0.6p1 with the following commands:
    cd /usr/local/mpich2-1.0.6p1/
    F77=ifort
    F90=ifort
    CC=icc
    CXX=icpc
    export F77 F90 CC CXX
    ./configure --with-device=ch3:shm
    make ; make install
 where shm stands for shared memory, which is appropriate for a single SMP
 machine. See the mpich2 documentation.
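
 As a quick sanity check (just a sketch; adjust the bin path to wherever
 "make install" actually put the wrappers on your system), verify that the
 mpif90/mpicc found on the PATH are the freshly built MPICH2 wrappers and
 that they call the Intel compilers underneath:

    export PATH=/usr/local/bin:$PATH   # assumption: default install prefix
    which mpif90 mpicc                 # should point to the new MPICH2
    mpif90 -show                       # should report ifort underneath
    mpicc -show                        # should report icc underneath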

 With the new MPI I recompiled BLACS and SCALAPACK.
  Compile BLACS:
  Download BLACS and the patch!
  You have to modify Bmake.inc for your own setup.
  I created a LIB dir in my home directory where BLACS and SCALAPACK live.
  This is my Bmake.inc
  (comments marked ##?? are mine):
 
 #=============================================================================
 #====================== SECTION 1: PATHS AND LIBRARIES =======================
 #=============================================================================
 #  The following macros specify the name and location of libraries required by
 #  the BLACS and its tester.
 #=============================================================================

 #  --------------------------------------
 #  Make sure we've got a consistent shell
 #  --------------------------------------
    SHELL = /bin/sh

 #  -----------------------------
 #  The top level BLACS directory
 #  -----------------------------
    BTOPdir = $(HOME)/LIB/BLACS         ##?? directory LIB

 #  ---------------------------------------------------------------------------
 #  The communication library your BLACS have been written for.
 #  Known choices (and the machines they run on) are:
 #
 #     COMMLIB   MACHINE
 #     .......   ..............................................................
 #     CMMD      Thinking Machine's CM-5
 #     MPI       Wide variety of systems
 #     MPL       IBM's SP series (SP1 and SP2)
 #     NX        Intel's supercomputer series (iPSC2, iPSC/860, DELTA, PARAGON)
 #     PVM       Most unix machines; See PVM User's Guide for details
 #  ---------------------------------------------------------------------------
    COMMLIB = MPI

 #  -------------------------------------------------------------
 #  The platform identifier to suffix to the end of library names
 #  -------------------------------------------------------------
    PLAT = LINUX

 #  ----------------------------------------------------------
 #  Name and location of the BLACS library.  See section 2 for
 #  details on BLACS debug level (BLACSDBGLVL).
 #  ----------------------------------------------------------
    BLACSdir    = $(BTOPdir)/LIB
    BLACSDBGLVL = 0
    BLACSFINIT  = $(BLACSdir)/blacsF77init_$(COMMLIB)-$(PLAT)-$(BLACSDBGLVL).a
    BLACSCINIT  = $(BLACSdir)/blacsCinit_$(COMMLIB)-$(PLAT)-$(BLACSDBGLVL).a
    BLACSLIB    = $(BLACSdir)/blacs_$(COMMLIB)-$(PLAT)-$(BLACSDBGLVL).a

 #  -------------------------------------
 #  Name and location of the MPI library.
 #  -------------------------------------
    MPIdir = /usr/local/mpich2-1.0.6p1/            ##?? my MPI library path
    MPILIBdir =
    MPIINCdir = $(MPIdir)/src/include
    MPILIB =

 #  -------------------------------------
 #  All libraries required by the tester.
 #  -------------------------------------
    BTLIBS = $(BLACSFINIT) $(BLACSLIB) $(BLACSFINIT) $(MPILIB)

 #  ----------------------------------------------------------------
 #  The directory to put the installation help routines' executables
 #  ----------------------------------------------------------------
    INSTdir = $(BTOPdir)/INSTALL/EXE

 #  ------------------------------------------------
 #  The name and location of the tester's executable
 #  ------------------------------------------------
    TESTdir = $(BTOPdir)/TESTING/EXE
    FTESTexe = $(TESTdir)/xFbtest_$(COMMLIB)-$(PLAT)-$(BLACSDBGLVL)
    CTESTexe = $(TESTdir)/xCbtest_$(COMMLIB)-$(PLAT)-$(BLACSDBGLVL)
 
 #=============================================================================
 #=============================== End SECTION 1 ===============================
 #=============================================================================


 
 #=============================================================================
 #========================= SECTION 2: BLACS INTERNALS ========================
 #=============================================================================
 #  The following macro definitions set preprocessor values for the BLACS.
 #  The file Bconfig.h sets these values if they are not set by the makefile.
 #  Users compiling only the tester can skip this entire section.
 #  NOTE: The MPI defaults have been set for MPICH.
 #=============================================================================

 #  -----------------------------------------------------------------------
 #  The directory to find the required communication library include files,
 #  if they are required by your system.
 #  -----------------------------------------------------------------------
    SYSINC = -I$(MPIINCdir)

 #  ---------------------------------------------------------------------------
 #  The Fortran 77 to C interface to be used.  If you are unsure of the correct
 #  setting for your platform, compile and run BLACS/INSTALL/xintface.
 #  Choices are: Add_, NoChange, UpCase, or f77IsF2C.
 #  ---------------------------------------------------------------------------
    INTFACE = -DAdd_                ##?? it works for me

 #  ------------------------------------------------------------------------
 #  Allows the user to vary the topologies that the BLACS default topologies
 #  (TOP = ' ') correspond to.  If you wish to use a particular topology
 #  (as opposed to letting the BLACS make the choice), uncomment the
 #  following macros, and replace the character in single quotes with the
 #  topology of your choice.
 #  ------------------------------------------------------------------------
 #  DEFBSTOP   = -DDefBSTop="'1'"
 #  DEFCOMBTOP = -DDefCombTop="'1'"

 #  -------------------------------------------------------------------
 #  If your MPI_Send is locally-blocking, substitute the following line
 #  for the empty macro definition below.
 #  SENDIS = -DSndIsLocBlk
 #  -------------------------------------------------------------------
    SENDIS =

 #  --------------------------------------------------------------------
 #  If your MPI handles packing of non-contiguous messages by copying to
 #  another buffer or sending extra bytes, better performance may be
 #  obtained by replacing the empty macro definition below with the
 #  macro definition on the following line.
 #  BUFF = -DNoMpiBuff
 #  --------------------------------------------------------------------
    BUFF =

 #  -----------------------------------------------------------------------
 #  If you know something about your system, you may make it easier for the
 #  BLACS to translate between C and fortran communicators.  If the empty
 #  macro definition is left alone, this translation will cause the C
 #  BLACS to globally block for MPI_COMM_WORLD on calls to BLACS_GRIDINIT
 #  and BLACS_GRIDMAP.  If you choose one of the options for translating
 #  the context, neither the C or fortran calls will globally block.
 #  If you are using MPICH, or a derivative system, you can replace the
 #  empty macro definition below with the following (note that if you let
 #  MPICH do the translation between C and fortran, you must also indicate
 #  here if your system has pointers that are longer than integers.  If so,
 #  define -DPOINTER_64_BITS=1.)  For help on setting TRANSCOMM, you can
 #  run BLACS/INSTALL/xtc_CsameF77 and BLACS/INSTALL/xtc_UseMpich as
 #  explained in BLACS/INSTALL/README.
    TRANSCOMM = -DPOINTER_64_BITS=1 -DUseMpi2     ##?? because I use MPI2
 #
 #  If you know that your MPI uses the same handles for fortran and C
 #  communicators, you can replace the empty macro definition below with
 #  the macro definition on the following line.
 #  TRANSCOMM = -DCSameF77
 #  -----------------------------------------------------------------------
 #  TRANSCOMM =

 #  --------------------------------------------------------------------------
 #  You may choose to have the BLACS internally call either the C or Fortran77
 #  interface to MPI by varying the following macro.  If TRANSCOMM is left
 #  empty, the C interface BLACS_GRIDMAP/BLACS_GRIDINIT will globally-block if
 #  you choose to use the fortran internals, and the fortran interface will
 #  block if you choose to use the C internals.  It is recommended that the
 #  user leave this macro definition blank, unless there is a strong reason
 #  to prefer one MPI interface over the other.
 #  WHATMPI = -DUseF77Mpi
 #  WHATMPI = -DUseCMpi
 #  --------------------------------------------------------------------------
    WHATMPI =

 #  ---------------------------------------------------------------------------
 #  Some early versions of MPICH and its derivatives cannot handle user defined
 #  zero byte data types.  If your system has this problem (compile and run
 #  BLACS/INSTALL/xsyserrors to check if unsure), replace the empty macro
 #  definition below with the macro definition on the following line.
 #  SYSERRORS = -DZeroByteTypeBug
 #  ---------------------------------------------------------------------------
    SYSERRORS =

 #  ------------------------------------------------------------------
 #  These macros set the debug level for the BLACS.  The fastest
 #  code is produced by BlacsDebugLvl 0.  Higher levels provide
 #  more debug information at the cost of performance.  Present levels
 #  of debug are:
 #  0 : No debug information
 #  1 : Mainly parameter checking.
 #  ------------------------------------------------------------------
    DEBUGLVL = -DBlacsDebugLvl=$(BLACSDBGLVL)

 #  -------------------------------------------------------------------------
 #  All BLACS definitions needed for compile (DEFS1 contains definitions used
 #  by all BLACS versions).
 #  -------------------------------------------------------------------------
    DEFS1 = -DSYSINC $(SYSINC) $(INTFACE) $(DEFBSTOP) $(DEFCOMBTOP) $(DEBUGLVL)
    BLACSDEFS = $(DEFS1) $(SENDIS) $(BUFF) $(TRANSCOMM) $(WHATMPI) $(SYSERRORS)
 
 #=============================================================================
 #=============================== End SECTION 2 ===============================
 #=============================================================================


 
 #=============================================================================
 #=========================== SECTION 3: COMPILERS ============================
 #=============================================================================
 #  The following macros specify compilers, linker/loaders, the archiver,
 #  and their options.  Some of the fortran files need to be compiled with no
 #  optimization.  This is the F77NO_OPTFLAG.  The usage of the remaining
 #  macros should be obvious from the names.
 #=============================================================================
    F77            = mpif90
    F77NO_OPTFLAGS = -fp-port
    F77FLAGS       = $(F77NO_OPTFLAGS) -O2   ##?? Is aggressive optimization reliable?
    F77LOADER      = $(F77) -i-static
    F77LOADFLAGS   = -i-static
    CC             = mpicc
    CCFLAGS        = -O2  ##??Is aggressive optimization reliable?
    CCLOADER       = $(CC)
    CCLOADFLAGS    =

 #  --------------------------------------------------------------------------
 #  The archiver and the flag(s) to use when building an archive (library).
 #  Also the ranlib routine.  If your system has no ranlib, set RANLIB = echo.
 #  --------------------------------------------------------------------------
    ARCH      = ar
    ARCHFLAGS = r
    RANLIB    = ranlib

 
 #=============================================================================
 #=============================== End SECTION 3 ===============================
 #=============================================================================

 In my previous Bmake.inc I had replaced the -O2 optimization flag with -O3...
 perhaps this was the cause of my hangs?

 Compile BLACS: cd into BLACS/SRC/MPI and type make (see the documentation for
 more info); a sketch of the full command sequence is given below. The BLACS
 libraries end up in ~/LIB/BLACS/LIB/.
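
 Roughly, something like the following should work (a sketch, assuming
 Bmake.inc is set up as above; check the BLACS README for the exact targets,
 and note that MPICH2 1.0.x needs its mpd process manager running before
 mpiexec will work):

    cd ~/LIB/BLACS
    make mpi                             # builds the BLACS libraries into LIB/
    make tester                          # builds xCbtest/xFbtest in TESTING/EXE
    cd TESTING/EXE
    mpiexec -n 4 ./xCbtest_MPI-LINUX-0   # run the C interface tester
    mpiexec -n 4 ./xFbtest_MPI-LINUX-0   # run the Fortran interface tester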

 Compile SCALAPACK:
 Download scalapack-1.8.0 and untar it in the LIB directory.
 You have to rewrite SLmake.inc; mine follows below.
 Comments marked ##?? are mine.
 
 ############################################################################
 #
 #  Program:         ScaLAPACK
 #
 #  Module:          SLmake.inc
 #
 #  Purpose:         Top-level Definitions
 #
 #  Creation date:   February 15, 2000
 #
 #  Modified:
 #
 #  Send bug reports, comments or suggestions to scal... at cs.utk.edu
 #
 
 ############################################################################
 #
 SHELL         = /bin/sh
 #
 #  The complete path to the top level of ScaLAPACK directory, usually
 #  $(HOME)/SCALAPACK
 #
 home          = $(HOME)/LIB/scalapack-1.8.0   ##?? set home variable for my LIB dir
 #
 #  The platform identifier to suffix to the end of library names
 #
 PLAT          = LINUX
 #
 #  BLACS setup.  All versions need the debug level (0 or 1),
 #  and the directory where the BLACS libraries are
 #
 BLACSDBGLVL   = 0
 BLACSdir      = $(HOME)/LIB/BLACS/LIB
 #
 #  MPI setup; tailor to your system if using MPIBLACS
 #  Will need to comment out these 6 lines if using PVM
 #
 USEMPI        = -DUsingMpiBlacs
 SMPLIB        =
 BLACSFINIT    = $(BLACSdir)/blacsF77init_MPI-LINUX-0.a
 BLACSCINIT    = $(BLACSdir)/blacsCinit_MPI-LINUX-0.a
 BLACSLIB      = $(BLACSdir)/blacs_MPI-LINUX-0.a
 TESTINGdir    = $(home)/TESTING

 #
 #  PVMBLACS setup, uncomment next 6 lines if using PVM
 #
 #USEMPI        =
 #SMPLIB        = $(PVM_ROOT)/lib/$(PLAT)/libpvm3.a
 #BLACSFINIT    =
 #BLACSCINIT    =
 #BLACSLIB      = $(BLACSdir)/blacs_PVM-$(PLAT)-$(BLACSDBGLVL).a
 #TESTINGdir    = $(HOME)/pvm3/bin/$(PLAT)

 CBLACSLIB     = $(BLACSCINIT) $(BLACSLIB) $(BLACSCINIT)
 FBLACSLIB     = $(BLACSFINIT) $(BLACSLIB) $(BLACSFINIT)

 #
 #  The directories to find the various pieces of ScaLapack
 #
 PBLASdir      = $(home)/PBLAS
 SRCdir        = $(home)/SRC
 TESTdir       = $(home)/TESTING
 PBLASTSTdir   = $(TESTINGdir)
 TOOLSdir      = $(home)/TOOLS
 REDISTdir     = $(home)/REDIST
 REDISTTSTdir  = $(TESTINGdir)
 #
 #  The fortran and C compilers, loaders, and their flags
 #
 F77           = mpif90  ##?? rename it if necessary
 CC            = mpicc
 NOOPT         = -fp-port
 F77FLAGS      = $(NOOPT) -O2 ## ?? Standard optimization
 CCFLAGS       = -O2                  ## ?? Same comments
 SRCFLAG       =                         ## ?? in BLACS optimization
 F77LOADER     = $(F77)
 CCLOADER      = $(CC)
 F77LOADFLAGS  = -i-static
 CCLOADFLAGS   =
 #
 #  C preprocessor defs for compilation
 #  (-DNoChange, -DAdd_, -DUpCase, or -Df77IsF2C)
 #
 CDEFS         =-DAdd_ -DNO_IEEE $(USEMPI)
 #
 #  The archiver and the flag(s) to use when building an archive (library).
 #  Also the ranlib routine.  If your system has no ranlib, set RANLIB = echo.
 #
 ARCH          = ar
 ARCHFLAGS     = cr
 RANLIB        = ranlib
 #
 #  The name of the libraries to be created/linked to
 #
 SCALAPACKLIB  = $(home)/libscalapack.a   ##?? new library created
 BLASLIB       = $(HOME)/LIB/libblas.a    ##?? but I do not have it in the LIB dir
 LAPACKLIB     = $(HOME)/LIB/liblapack.a  ##?? but I do not have it in the LIB dir. Can I delete it?
 #
 PBLIBS        = $(SCALAPACKLIB) $(FBLACSLIB) $(LAPACKLIB) $(BLASLIB) $(SMPLIB)
 PRLIBS        = $(SCALAPACKLIB) $(CBLACSLIB) $(SMPLIB)
 RLIBS         = $(SCALAPACKLIB) $(FBLACSLIB) $(CBLACSLIB) $(LAPACKLIB) $(BLASLIB) $(SMPLIB)
 LIBS          = $(PBLIBS)

 
 ##############################################################################
 ##############################################################################
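
 A note on the ##?? question about BLASLIB/LAPACKLIB above: libscalapack.a
 itself compiles without BLAS or LAPACK (they are only needed at link time,
 e.g. for the testers and later for cp2k), so one option is simply to point
 those two variables at real libraries. A sketch, reusing the MKL libraries
 that cp2k links against below (paths are from my installation, adjust to
 yours):

    BLASLIB       = /opt/intel/mkl/9.1.023/lib/32/libmkl_ia32.a \
                    /opt/intel/mkl/9.1.023/lib/32/libguide.a -lpthread
    LAPACKLIB     = /opt/intel/mkl/9.1.023/lib/32/libmkl_lapack.a

 After that, the build is driven from the top of the ScaLAPACK tree (check the
 top-level Makefile for the exact targets):

    cd ~/LIB/scalapack-1.8.0
    make lib                             # build only libscalapack.a; use plain
                                         # "make" to also build the testers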

 With these libraries I recompiled cp2k.
 Below is my Linux-i686.popt arch file:

 
 ##############################################################################
 PERL     = perl
 CC       = cc
 CPP      = cpp
 FC       = mpif90 -FR
 LD       = mpif90 -i-static -O3 -xT -Vaxlib
 AR       = ar -r
 DFLAGS   = -D__INTEL -D__FFTSG \
            -D__parallel -D__BLACS -D__SCALAPACK \
            -D__FFTW3
 CPPFLAGS = -traditional -C $(DFLAGS) -P
 FCFLAGS  = $(DFLAGS) -O3 -xT -Vaxlib
 MLPATH   = /usr/local/LIB
 MKLPATH  = /opt/intel/mkl/9.1.023/lib/32
 LDFLAGS  = $(FCFLAGS) -i-static  -O3 -xT -Vaxlib
 LIBS     = $(MLPATH)/libscalapack.a \
            $(MLPATH)/libblacsCinit_MPI-LINUX-0.a \
            $(MLPATH)/libblacsF77init_MPI-LINUX-0.a \
            $(MLPATH)/libblacs_MPI-LINUX-0.a \
            $(MKLPATH)/libmkl_lapack.a \
            $(MKLPATH)/libmkl_ia32.a \
            $(MKLPATH)/libguide.a \
            /usr/lib/libfftw3.a -lpthread

 OBJECTS_ARCHITECTURE = machine_intel.o

 ifposix.int:
       touch ifposix.int

 
 ##############################################################################
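
 For completeness, a sketch of how the arch file is used (assuming the usual
 cp2k CVS layout, with arch files in cp2k/arch and the build driven from
 cp2k/makefiles):

    cp Linux-i686.popt ~/cp2k/arch/
    cd ~/cp2k/makefiles
    make ARCH=Linux-i686 VERSION=popt    # the executable should end up in
                                         # ~/cp2k/exe/Linux-i686/cp2k.popt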

 I hope these notes are useful.
 Any suggestions for improving performance are welcome.

 ... it is not always the fault of cp2k :-)

 Ciao
 Luca



