gcc 4.8.1 (need to install GCC)
clang 3.3
intel 15 (need to install Intel)
NOTE: the Intel compiler alone does not fully support C++11 in an environment without gcc 4.8 or newer. Installing an external GCC is a solution, but it may cause errors when running Plumed due to cross-compiling.
# Download Plumed:
https://github.com/plumed/plumed2/releases/tag/v2.5.2
tar xvzf plumed2-2.5.2.tar.gz
# download branch v2.5
git clone https://github.com/plumed/plumed2.git --branch=v2.5 plumed2-2.5.x
cd plumed2-2.5.x
git pull origin v2.5 # or git checkout v2.5
# download branch with PYCV
git clone --branch v2.6-pycv-devel https://github.com/giorginolab/plumed2-pycv.git plumed2-2.6pycv
cd plumed2-2.6pycv
# download branch hack-the-tree
git clone --branch hack-the-tree https://github.com/plumed/plumed2.git plumed2-2.7htt
cd plumed2-2.7htt
git pull origin hack-the-tree
git pull origin master
# or clone a specific tag with git clone: https://git-scm.com/docs/git-clone
git clone <url> --branch=<tag_name>
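The tag syntax can be exercised without network access using a throwaway local repository (demo-src/demo-v2.5.2 are made-up names; the same --branch flag works with the GitHub URL above):

```shell
# Create a local repo with a tag, then clone that tag via --branch
git init -q demo-src
git -C demo-src -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "init"
git -C demo-src tag v2.5.2
git -c advice.detachedHead=false clone -q --branch=v2.5.2 demo-src demo-v2.5.2
git -C demo-v2.5.2 describe --tags   # prints: v2.5.2
```

Note that cloning a tag leaves the working tree in detached-HEAD state, which is fine for building a release.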
I. OMPI + Intel
1. USC1
module load mpi/openmpi4.0.2-Intel2019xe
module load intel/mkl-2019xe
check: mpiicpc -v (Intel C++)
mpicxx --version (GCC C++)
Notes: openMPI must be compiled with gcc 4.8 or newer (load gcc/gcc-7.4.0 when compiling openMPI)
# Install PLUMED
(to compile with MPI enabled, use the MPI compiler wrappers: CXX=mpic++ CC=mpicc)
choose modules to install: https://www.plumed.org/doc-v2.5/user-doc/html/mymodules.html
# Configuring PLUMED
enable/disable modules:
./configure --enable-modules=+crystallization-colvar
./configure --enable-modules=all:-colvar-multicolvar
BLAS and LAPACK Libs
a. separately compile Blas & Lapack
b. use Blas & Lapack from intel_mkl
LIBS="-mkl"
c. or use internal link: (blas & lapack is automatically built, need FORTRAN compiler)
--disable-external-blas --disable-external-lapack \
VMD trajectory plugins
https://www.plumed.org/doc-master/user-doc/html/_installation.html
Making lepton library faster
--enable-asmjit
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6htt \
CXX=mpic++ LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit
# or
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6 \
CXX=mpic++ --disable-external-blas --disable-external-lapack \
--enable-openmp --enable-modules=all --enable-asmjit
# create Module file + test
prepend-path PATH $topdir/bin
prepend-path LD_LIBRARY_PATH $topdir/lib
prepend-path INCLUDE $topdir/include
prepend-path PKG_CONFIG_PATH $topdir/lib/pkgconfig # required so that LAMMPS can find Plumed
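The prepend-path lines above are the body of an environment-modules (Tcl) modulefile; a complete sketch, assuming a hypothetical modulefile location and the install prefix used above:

```shell
# Write a Tcl modulefile for Plumed (topdir is an assumed install prefix)
mkdir -p ~/modulefiles/plumed2
cat > ~/modulefiles/plumed2/2.6.0 <<'EOF'
#%Module1.0
set topdir /uhome/p001cao/local/app/plumed2/2.6htt
prepend-path PATH            $topdir/bin
prepend-path LD_LIBRARY_PATH $topdir/lib
prepend-path INCLUDE         $topdir/include
# PKG_CONFIG_PATH is required so that LAMMPS can find Plumed
prepend-path PKG_CONFIG_PATH $topdir/lib/pkgconfig
EOF
# then: module use ~/modulefiles && module load plumed2/2.6.0
```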
# test:
module load plumed2/2.6.0
plumed help
# USC2:
module load mpi/ompi4.0.3-intel19u5
module load intel/compiler-xe19u5
module load intel/mkl-xe19u5
./configure CXX=mpic++ CC=mpicc LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt
make -j 8
make install
II. Install PLUMED using IMPI-2019xe
# 1. USC 1:
(use this, because these compilers are available on all clusters)
NOTE: intelMPI on eagle does not work, due to a wrong path
1. Module load:
module load intel/compiler-xe19u5
module load mpi/impi-xe19u5
module load intel/mkl-xe19u5
module load compiler/gcc/9.1.0
module load conda/py37
Configure
./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/uhome/p001cao/local/app/plumed2/2.6httIMPI
# 2. USC 2:
# impi2016 (does not support OpenMP)
module load mpi/intel-xe2016/impi-xe2016u4
module load compiler/intel/xe2016u4
check: $ which mpiicpc
# or impi2019
(source /home1/p001cao/local/app/intel/xe19u5/compilers_and_libraries_2019.5.281/linux/mpi/intel64/bin/mpivars.sh)
NOTE: https://software.intel.com/en-us/articles/using-environment-modules-with-the-intel-development-tools
module load mpi/impi-xe19u5
module load intel/mkl-xe19u5
module load compiler/gcc-9.2.0
source mpivars.sh
Configure
./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-impi
III. OMPI + GCC
don't use external "mkl" --> it will fail to compile
# USC 1
module load mpi/ompi4.0.3-gcc9.2.0
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/uhome/p001cao/local/app/plumed2/2.7htt-gcc
###############################
# USC 2
module load mpi/ompi4.0.3-gcc9.2.0
module load tool_dev/binutils-2.32 # gold linker --> must use the same linker as the MPI build
./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc
################
module load compiler/gcc/7.4.0
module load mpi/gcc-7.4.0/ompi/3.1.4
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc3
IV. Install PLUMED using openmpi-4.0.1 + GCC-7.4.0 (CAN)
1. Module load:
module load mpi/openmpi4.0.1-gcc7.4.0
module load gcc/gcc-7.4.0
check: mpic++ --version ( gcc C++)
2. Install PLUMED
unzip plumed2-hack-the-tree.zip
cd plumed2-hack-the-tree
./configure --prefix=/home/thang/local/app/plumed2/2.6.0-gcc \
CXX=mpic++ --disable-external-blas --disable-external-lapack \
--enable-openmp --enable-modules=all
V. Misc commands
1. See the test input files of an action
go to the "regtest" folder of the plumed source code and type:
grep <ACTION_NAME> */*/plumed.dat
Ex: grep DISTANCE */*/plumed.dat
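As a self-contained illustration (mocking the regtest layout with made-up test directories, since grep only prefixes file names when the glob matches more than one file):

```shell
# Mock the regtest layout, then run the same grep pattern
mkdir -p regtest/basic/rt-distance regtest/basic/rt-angle
printf 'd1: DISTANCE ATOMS=1,2\nPRINT ARG=d1 FILE=COLVAR\n' \
    > regtest/basic/rt-distance/plumed.dat
printf 'a1: ANGLE ATOMS=1,2,3\n' > regtest/basic/rt-angle/plumed.dat
(cd regtest && grep DISTANCE */*/plumed.dat)
# prints: basic/rt-distance/plumed.dat:d1: DISTANCE ATOMS=1,2
```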
Ref:
https://www.plumed.org/doc-master/user-doc/html/_installation.html
https://groups.google.com/forum/#!topic/plumed-users/x3YKcbDA-AE
B. Plumed on USC2
I. Use OMPI-clang
don't use external "mkl" --> it will fail to compile
module load mpi/ompi4.0.3-clang10
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-clang
II. Use MVAPICH2-GCC
module load mpi/mvapich2-2.3.2-gcc9.2.0
module load conda/py37mvapichSupp
Configure
./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-mva