Dear Experts,<div><br></div><div>I am new to CP2K. I have compiled it successfully, but I get the following error message when I run it. I would appreciate your help. Thank you in advance.</div><div><br></div><div>Sincerely,</div><div>Bharat</div><div><br></div><div>############################################################################################################</div><div><div>tset: standard error: Inappropriate ioctl for device</div><div><br></div><div>cd /RQusagers/sharmabh/test2/test2</div><div>eval `/share/apps/Modules/$MODULE_VERSION/bin/modulecmd tcsh load mkl/11.1.073`</div><div>/share/apps/Modules/3.2.5/bin/modulecmd tcsh load mkl/11.1.073</div><div>setenv CPATH /share/apps/intel/Compiler/11.1/073/mkl/include</div><div>setenv FPATH /share/apps/intel/Compiler/11.1/073/mkl/include:/etc/ksh.fpath:/etc/ksh.fpath</div><div>setenv LD_LIBRARY_PATH /share/apps/intel/Compiler/11.1/073/mkl/lib/em64t:/share/apps/intel/Compiler/11.0/083/lib/intel64:/opt/intel/fce/11.0.069/lib:/opt/intel/cce/11.0.069/lib</div><div>setenv LIBRARY_PATH /share/apps/intel/Compiler/11.1/073/mkl/lib/em64t</div><div>setenv LOADEDMODULES intel-compilers/11.0.083:modules:mkl/11.1.073</div><div>setenv MKLROOT /share/apps/intel/Compiler/11.1/073/mkl</div><div>setenv _LMFILES_ /share/apps/Modules/modulefiles/intel-compilers/11.0.083:/share/apps/Modules/3.2.5/modulefiles/modules:/share/apps/Modules/modulefiles/mkl/11.1.073</div><div>set _exit=0</div><div>test 0 = 0</div><div>eval `/share/apps/Modules/$MODULE_VERSION/bin/modulecmd tcsh load openmpi_intel64`</div><div>/share/apps/Modules/3.2.5/bin/modulecmd tcsh load openmpi_intel64</div><div>setenv CPATH /share/apps/openmpi/intel/1.4.1/include:/share/apps/intel/Compiler/11.1/073/mkl/include</div><div>setenv FPATH /share/apps/openmpi/intel/1.4.1/include:/share/apps/intel/Compiler/11.1/073/mkl/include:/etc/ksh.fpath:/etc/ksh.fpath</div><div>setenv LD_LIBRARY_PATH 
/opt/torque/lib64:/share/apps/openmpi/intel/1.4.1/lib:/share/apps/intel/Compiler/11.1/073/mkl/lib/em64t:/share/apps/intel/Compiler/11.0/083/lib/intel64:/opt/intel/fce/11.0.069/lib:/opt/intel/cce/11.0.069/lib</div><div>setenv LIBRARY_PATH /opt/torque/lib64:/share/apps/openmpi/intel/1.4.1/lib:/share/apps/intel/Compiler/11.1/073/mkl/lib/em64t</div><div>setenv LOADEDMODULES intel-compilers/11.0.083:modules:mkl/11.1.073:openmpi_intel64/1.4.1</div><div>setenv MANPATH /share/apps/openmpi/intel/1.4.1/share/man:/share/apps/Modules/3.2.5/man:/share/apps/intel/Compiler/11.0/083/man::/opt/intel/fce/11.0.069/man:/opt/intel/cce/11.0.069/man:/opt/intel/clck/1.3/doc/man:/usr/share/man:/usr/local/share/man:/opt/rocks/man:/opt/torque/man:/opt/ganglia/man:/opt/mpich/gnu/man</div><div>setenv OMPI_LDFLAGS -shared-intel</div><div>setenv PATH /share/apps/openmpi/intel/1.4.1/bin:/usr/java/latest/bin:/share/apps/Modules/3.2.5/bin:/share/apps/intel/Compiler/11.0/083/bin/intel64:/usr/kerberos/bin:/usr/java/latest/bin:/opt/intel/fce/11.0.069/bin:/opt/intel/cce/11.0.069/bin:/opt/intel/clck/1.3:/bin:/usr/bin:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/openmpi/bin/:/opt/maui/bin:/opt/torque/bin:/opt/torque/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/toolworks/totalview.8.6.0-2/bin:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/openmpi/bin/:/opt/maui/bin:/opt/torque/bin:/opt/torque/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/toolworks/totalview.8.6.0-2/bin</div><div>setenv _LMFILES_ /share/apps/Modules/modulefiles/intel-compilers/11.0.083:/share/apps/Modules/3.2.5/modulefiles/modules:/share/apps/Modules/modulefiles/mkl/11.1.073:/share/apps/Modules/modulefiles/openmpi_intel64/1.4.1</div><div>set _exit=0</div><div>test 0 = 0</div><div>eval `/share/apps/Modules/$MODULE_VERSION/bin/modulecmd tcsh load fftw/3.2.2-double`</div><div>/share/apps/Modules/3.2.5/bin/modulecmd tcsh load fftw/3.2.2-double</div><div>setenv CPATH 
/share/apps/fftw/fftw-3.2.2/double/include:/share/apps/openmpi/intel/1.4.1/include:/share/apps/intel/Compiler/11.1/073/mkl/include</div><div>setenv FPATH /share/apps/fftw/fftw-3.2.2/double/include:/share/apps/openmpi/intel/1.4.1/include:/share/apps/intel/Compiler/11.1/073/mkl/include:/etc/ksh.fpath:/etc/ksh.fpath</div><div>setenv LD_LIBRARY_PATH /share/apps/fftw/fftw-3.2.2/double/lib:/opt/torque/lib64:/share/apps/openmpi/intel/1.4.1/lib:/share/apps/intel/Compiler/11.1/073/mkl/lib/em64t:/share/apps/intel/Compiler/11.0/083/lib/intel64:/opt/intel/fce/11.0.069/lib:/opt/intel/cce/11.0.069/lib</div><div>setenv LOADEDMODULES intel-compilers/11.0.083:modules:mkl/11.1.073:openmpi_intel64/1.4.1:fftw/3.2.2-double</div><div>setenv PATH /share/apps/fftw/fftw-3.2.2/double/bin:/share/apps/openmpi/intel/1.4.1/bin:/usr/java/latest/bin:/share/apps/Modules/3.2.5/bin:/share/apps/intel/Compiler/11.0/083/bin/intel64:/usr/kerberos/bin:/usr/java/latest/bin:/opt/intel/fce/11.0.069/bin:/opt/intel/cce/11.0.069/bin:/opt/intel/clck/1.3:/bin:/usr/bin:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/openmpi/bin/:/opt/maui/bin:/opt/torque/bin:/opt/torque/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/toolworks/totalview.8.6.0-2/bin:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/openmpi/bin/:/opt/maui/bin:/opt/torque/bin:/opt/torque/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/toolworks/totalview.8.6.0-2/bin</div><div>setenv _LMFILES_ /share/apps/Modules/modulefiles/intel-compilers/11.0.083:/share/apps/Modules/3.2.5/modulefiles/modules:/share/apps/Modules/modulefiles/mkl/11.1.073:/share/apps/Modules/modulefiles/openmpi_intel64/1.4.1:/share/apps/Modules/modulefiles/fftw/3.2.2-double</div><div>set _exit=0</div><div>test 0 = 0</div><div>mpirun -np 2 /RQusagers/sharmabh/backup/exe/Linux-x86-64-intel/cp2k.popt -i input.inp</div><div>--------------------------------------------------------------------------</div><div>MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD</div><div>with errorcode 
1.</div><div><br></div><div>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.</div><div>You may or may not see output from other processes, depending on</div><div>exactly when Open MPI kills them.</div><div>--------------------------------------------------------------------------</div><div>--------------------------------------------------------------------------</div><div>mpirun has exited due to process rank 0 with PID 29138 on</div><div>node compute-0-73.local exiting without calling "finalize". This may</div><div>have caused other processes in the application to be</div><div>terminated by signals sent by mpirun (as reported here).</div><div>--------------------------------------------------------------------------</div></div><div>#########################################################################################################</div>
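<div><br></div><div>In short, condensed from the trace above (same modules and paths as in the log), the job script does the equivalent of:</div><div><br></div><div>module load mkl/11.1.073</div><div>module load openmpi_intel64</div><div>module load fftw/3.2.2-double</div><div>mpirun -np 2 /RQusagers/sharmabh/backup/exe/Linux-x86-64-intel/cp2k.popt -i input.inp</div>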