Tuesday, September 10, 2019

Install PLUMED

compilers: PLUMED 2.4 and later require a compiler that supports C++11 (one of the following, or newer):
gcc 4.8.1      (need to install GCC)
clang 3.3
intel 15       (need to install Intel)

NOTE: the Intel compiler alone does not fully support C++11 in an environment without gcc 4.8 or newer, since it relies on the GCC headers and libstdc++. Installing an external GCC is a workaround, but mixing toolchains this way may cause errors when running PLUMED.
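
To verify that the compiler in your environment really supports C++11 before configuring, a quick test compile helps (a minimal sketch; the test file name and the lambda are arbitrary):

# check the compiler version
g++ --version
# try compiling a C++11 feature; failure means the compiler is too old
cat > cxx11_test.cpp << 'EOF'
int main() { auto f = [](int x) { return x + 1; }; return f(0) - 1; }
EOF
g++ -std=c++11 cxx11_test.cpp -o cxx11_test && echo "C++11 OK"
rm -f cxx11_test.cpp cxx11_test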

# extract a downloaded release tarball
tar xvzf plumed2-2.5.2.tar.gz

# download branch v2.5
git clone https://github.com/plumed/plumed2.git --branch=v2.5   plumed2-2.5.x
cd plumed2-2.5.x
git pull origin v2.5                           # or    git checkout v2.5

# download branch with PYCV
git clone --branch v2.6-pycv-devel  https://github.com/giorginolab/plumed2-pycv.git   plumed2-2.6pycv
cd    plumed2-2.6pycv

# download branch hack-the-tree      
git clone   --branch hack-the-tree    https://github.com/plumed/plumed2.git    plumed2-2.7htt
cd plumed2-2.7htt
git pull origin hack-the-tree

# update an existing clone against master
git pull origin master

# or clone a specific tag using git clone: https://git-scm.com/docs/git-clone
git clone <url> --branch=<tag_name>
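
e.g., to clone the v2.5.2 release tag that matches the tarball above:
git clone https://github.com/plumed/plumed2.git --branch=v2.5.2 plumed2-2.5.2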

I. OMPI + Intel

1. USC1

module load mpi/openmpi4.0.2-Intel2019xe
module load intel/mkl-2019xe

check:  mpiicpc -v            (Intel C++)
        mpicxx --version      (GCC C++)

Note: OpenMPI must be compiled with gcc 4.8 or newer (load gcc/gcc-7.4.0 when compiling OpenMPI).

# Install PLUMED

(to compile with MPI enabled, use the MPI compiler wrappers: CXX=mpic++   CC=mpicc)
choose which modules to install: https://www.plumed.org/doc-v2.5/user-doc/html/mymodules.html
enable/disable modules:
./configure --enable-modules=+crystallization-colvar
./configure --enable-modules=all:-colvar-multicolvar 
BLAS and LAPACK libs
a. compile BLAS & LAPACK separately (see the sketch below)
b. use BLAS & LAPACK from intel_mkl:
LIBS="-mkl"
c. or use the internal versions (BLAS & LAPACK are built automatically; requires a Fortran compiler):
--disable-external-blas --disable-external-lapack \
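
For option (a), a sketch of pointing configure at a separately compiled BLAS/LAPACK; the install path is a placeholder, and -llapack/-lblas are the conventional library names:
./configure CXX=mpic++ \
LDFLAGS="-L$HOME/local/lapack/lib" LIBS="-llapack -lblas" \
--enable-openmp --enable-modules=all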

VMD trajectory plugins
https://www.plumed.org/doc-master/user-doc/html/_installation.html

Making lepton library faster
--enable-asmjit

# Configuring PLUMED
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6htt \
CXX=mpic++ LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit

# or
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6 \
CXX=mpic++ --disable-external-blas --disable-external-lapack \
--enable-openmp --enable-modules=all --enable-asmjit

# create module file + test
# minimal Tcl modulefile; set topdir to the install prefix chosen above
#%Module1.0
set topdir /uhome/p001cao/local/app/plumed2/2.6

prepend-path   PATH                $topdir/bin
prepend-path   LD_LIBRARY_PATH     $topdir/lib
prepend-path   INCLUDE             $topdir/include
# PKG_CONFIG_PATH is required so that LAMMPS can find PLUMED
prepend-path   PKG_CONFIG_PATH     $topdir/lib/pkgconfig

# test: 
module load plumed2/2.6.0
plumed help
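
A couple of extra sanity checks (assuming the info and config subcommands available in recent PLUMED versions):
plumed info --version        # print the installed PLUMED version
plumed config has mpi        # check whether MPI support was compiled in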

2. USC2

module load mpi/ompi4.0.3-intel19u5
module load intel/compiler-xe19u5
module load intel/mkl-xe19u5
Configure

./configure CXX=mpic++ CC=mpicc  LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt 

make -j 8
make install

II. Install PLUMED using IntelMPI-2019xe

# 1. USC 1:
(use this, because these compilers are available on all clusters)
NOTE: IntelMPI on eagle does not work, due to a wrong path
1. Module load:
module load intel/compiler-xe19u5
module load mpi/impi-xe19u5
module load intel/mkl-xe19u5
module load compiler/gcc/9.1.0
module load conda/py37
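
After loading, verify that the Intel MPI wrappers resolve to the expected paths (this catches the wrong-path problem noted above); -show prints the underlying compiler command of the wrapper:
which mpiicpc mpiicc
mpiicpc -show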

Configure
./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/uhome/p001cao/local/app/plumed2/2.6httIMPI

# 2. USC 2:
# impi2016 (does not support OpenMP)
module load mpi/intel-xe2016/impi-xe2016u4
module load compiler/intel/xe2016u4
check: $ which mpiicpc

# or impi2019
(source /home1/p001cao/local/app/intel/xe19u5/compilers_and_libraries_2019.5.281/linux/mpi/intel64/bin/mpivars.sh)
module load mpi/impi-xe19u5
module load intel/mkl-xe19u5
module load compiler/gcc-9.2.0
source mpivars.sh
Configure
./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-impi

III. OMPI + GCC

NOTE: don't use the external "mkl" libs here --> compilation will fail
# USC 1
module load mpi/ompi4.0.3-gcc9.2.0
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/uhome/p001cao/local/app/plumed2/2.7htt-gcc
###############################


# USC 2

module load mpi/ompi4.0.3-gcc9.2.0
module load tool_dev/binutils-2.32       # gold linker --> must use the same linker as the MPI build
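
To confirm that gold is actually the linker being used, a quick check (a sketch; -Wl,--version makes the selected linker print its identity during a trivial link):
ld.gold --version
echo 'int main(){return 0;}' > t.c
gcc -fuse-ld=gold -Wl,--version t.c | head -1     # should report "GNU gold"
rm -f t.c a.out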
Configure
./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc

################
module load compiler/gcc/7.4.0
module load mpi/gcc-7.4.0/ompi/3.1.4
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc3

IV. Install PLUMED using openmpi-4.0.1 + GCC-7.4.0 (CAN)

1. Module load:

module load mpi/openmpi4.0.1-gcc7.4.0
module load gcc/gcc-7.4.0

check:  mpic++ --version      (GCC C++)

2. Install PLUMED

unzip plumed2-hack-the-tree.zip
cd plumed2-hack-the-tree

Configuring PLUMED:
./configure --prefix=/home/thang/local/app/plumed2/2.6.0-gcc \
CXX=mpic++ --disable-external-blas --disable-external-lapack \
--enable-openmp --enable-modules=all

V. Useful commands

1. See the test input files of an action
Go to the "regtest" folder of the PLUMED source code and type:

grep <ACTION_NAME> */*/plumed.dat

Ex: grep DISTANCE */*/plumed.dat
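
To list only the files that contain a given action and then inspect one of them (the directory name below is a hypothetical example):
grep -l DISTANCE */*/plumed.dat
less basic/rt11/plumed.dat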

VI. OMPI + Clang

NOTE: don't use the external "mkl" libs here --> compilation will fail

module load mpi/ompi4.0.3-clang10
Configure
./configure CXX=mpic++ CC=mpicc \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-clang

VII. Use MVAPICH2 + GCC
module load mpi/mvapich2-2.3.2-gcc9.2.0
module load conda/py37mvapichSupp
Configure
./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt-mva