
Tuesday, September 10, 2019

Install PLUMED

compilers: PLUMED 2.4 requires a compiler that supports C++11 (one of the following is enough):
gcc 4.8.1      (may need to install GCC)
clang 3.3
intel 15        (may need to install Intel)

NOTE: the Intel compiler alone does not fully support C++11 in an environment without gcc 4.8 or newer. Installing an external GCC is one solution, but it may cause errors when running PLUMED due to cross-compiling.
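The version requirement above can be checked mechanically; below is a minimal sketch (the `version_ge` helper is my own, not part of PLUMED) that compares the active g++ against 4.8.1 using `sort -V`:

```shell
# version_ge A B: succeeds when version A >= version B (hypothetical helper)
version_ge() {
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# compare the active g++ (if any) against the C++11 minimum
current="$(g++ -dumpversion 2>/dev/null || echo 0)"
if version_ge "$current" 4.8.1; then
    echo "g++ $current is new enough for C++11"
else
    echo "g++ $current is too old (need >= 4.8.1)"
fi
```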

# extract a release tarball
tar xvzf plumed2-2.5.2.tar.gz

# download branch v2.5
git clone https://github.com/plumed/plumed2.git --branch=v2.5   plumed2-2.5.x
cd plumed2-2.5.x
git pull origin v2.5                           # or    git checkout v2.5

# download branch with PYCV
git clone --branch v2.6-pycv-devel  https://github.com/giorginolab/plumed2-pycv.git   plumed2-2.6pycv
cd    plumed2-2.6pycv

# download branch hack-the-tree      
git clone   --branch hack-the-tree    https://github.com/plumed/plumed2.git    plumed2-2.7htt
cd plumed2-2.7htt
git pull origin hack-the-tree

git pull origin master  

# or Clone a specific tag name using git clone: https://git-scm.com/docs/git-clone
git clone <url> --branch=<tag_name>

I. OMPI + Intel

1. USC1

module load mpi/openmpi4.0.2-Intel2019xe
module load intel/mkl-2019xe

check:  mpiicpc -v                      (Intel C++)
        mpicxx --version                (GNU C++)

Notes: OpenMPI must be compiled with gcc 4.8 or newer (load gcc/gcc-7.4.0 when compiling OpenMPI)

# Install PLUMED

(to compile with MPI enabled, use the wrapper compilers: CXX=mpic++   CC=mpicc)
choose the modules to install: https://www.plumed.org/doc-v2.5/user-doc/html/mymodules.html
enable/disable modules:
./configure --enable-modules=+crystallization-colvar
./configure --enable-modules=all:-colvar-multicolvar 
BLAS and LAPACK libs
a. compile BLAS & LAPACK separately
b. use BLAS & LAPACK from Intel MKL:
LIBS="-mkl"
c. or use the internal ones (BLAS & LAPACK are built automatically; needs a Fortran compiler):
--disable-external-blas --disable-external-lapack
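The choices above only change which extra arguments go to ./configure; here is a sketch assembling (not running) the command lines for options b and c, with a prefix path taken from the sections below:

```shell
# common options used throughout this post
PREFIX=/uhome/p001cao/local/app/plumed2/2.6
COMMON="--enable-openmp --enable-modules=all --enable-asmjit"

# option b: BLAS/LAPACK from Intel MKL
mkl_cmd="./configure --prefix=$PREFIX CXX=mpic++ LIBS=-mkl $COMMON"

# option c: internal BLAS/LAPACK (needs a Fortran compiler)
int_cmd="./configure --prefix=$PREFIX CXX=mpic++ --disable-external-blas --disable-external-lapack $COMMON"

echo "$mkl_cmd"
echo "$int_cmd"
```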

VMD trajectory plugins
https://www.plumed.org/doc-master/user-doc/html/_installation.html

Making lepton library faster
--enable-asmjit

# Configuring PLUMED
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6htt \
CXX=mpic++ LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit

# or
./configure --prefix=/uhome/p001cao/local/app/plumed2/2.6 \
CXX=mpic++ --disable-external-blas --disable-external-lapack \
--enable-openmp --enable-modules=all --enable-asmjit

# create Module file + test
prepend-path   PATH                $topdir/bin
prepend-path   LD_LIBRARY_PATH     $topdir/lib
prepend-path   INCLUDE             $topdir/include
prepend-path   PKG_CONFIG_PATH     $topdir/lib/pkgconfig          # required so that LAMMPS can find PLUMED
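Put together, a complete modulefile sketch (Environment Modules Tcl syntax; the prefix path is assumed from the configure step above):

```tcl
#%Module1.0
set topdir /uhome/p001cao/local/app/plumed2/2.6htt

prepend-path   PATH                $topdir/bin
prepend-path   LD_LIBRARY_PATH     $topdir/lib
prepend-path   INCLUDE             $topdir/include
prepend-path   PKG_CONFIG_PATH     $topdir/lib/pkgconfig
```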

# test: 
module load plumed2/2.6.0
plumed help
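Beyond `plumed help`, a quick functional check is to parse a minimal input; the sketch below uses the standard DISTANCE and PRINT actions (save as plumed.dat and feed it to e.g. `plumed driver` together with a trajectory):

```
# minimal plumed.dat sketch
d: DISTANCE ATOMS=1,2
PRINT ARG=d FILE=COLVAR STRIDE=100
```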

# USC2:

module load mpi/ompi4.0.3-intel19u5
module load intel/compiler-xe19u5
module load intel/mkl-xe19u5
Configure

./configure CXX=mpic++ CC=mpicc  LIBS="-mkl" \
--enable-openmp --enable-modules=all --enable-asmjit \
--prefix=/home1/p001cao/local/app/plumed2/2.7htt 

make -j 8
make install

II. Install PLUMED using IMPI-2019xe

# 1. USC 1:
(use this, because these compilers are available on all clusters)
NOTE: intelMPI on eagle does not work, due to a wrong path
    1. Module load:
    module load intel/compiler-xe19u5
    module load mpi/impi-xe19u5
    module load intel/mkl-xe19u5
    module load compiler/gcc/9.1.0
    module load conda/py37

    Configure
    ./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/uhome/p001cao/local/app/plumed2/2.6httIMPI

    # 2. USC 2:
    # impi2016 (does not support OpenMP)
    module load mpi/intel-xe2016/impi-xe2016u4
    module load compiler/intel/xe2016u4
    check: $ which mpiicpc

    # or impi2019
    (source /home1/p001cao/local/app/intel/xe19u5/compilers_and_libraries_2019.5.281/linux/mpi/intel64/bin/mpivars.sh)
    module load mpi/impi-xe19u5
    module load intel/mkl-xe19u5
    module load compiler/gcc-9.2.0 
    source mpivars.sh
    Configure
    ./configure CXX=mpiicpc CC=mpiicc LIBS="-mkl" \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/home1/p001cao/local/app/plumed2/2.7htt-impi

    III. OMPI + GCC

    don't use the external "mkl" --> compilation will fail
    # USC 1
    module load mpi/ompi4.0.3-gcc9.2.0
    Configure
    ./configure CXX=mpic++ CC=mpicc \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/uhome/p001cao/local/app/plumed2/2.7htt-gcc
    ###############################


    # USC 2

    module load mpi/ompi4.0.3-gcc9.2.0
    module load tool_dev/binutils-2.32              # gold linker --> must use the same linker as the MPI
    Configure 
    ./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc

    ################
    module load compiler/gcc/7.4.0
    module load mpi/gcc-7.4.0/ompi/3.1.4
    Configure 
    ./configure CXX=mpic++ CC=mpicc \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/home1/p001cao/local/app/plumed2/2.7htt-gcc3

    IV. Install PLUMED using openmpi-4.0.1 + GCC-7.4.0 (CAN)

    1. Module load:

    module load mpi/openmpi4.0.1-gcc7.4.0
    module load gcc/gcc-7.4.0

    check:  mpic++ --version             ( gcc C++)

    2. Install PLUMED

    unzip plumed2-hack-the-tree.zip
    cd plumed2-hack-the-tree

    Configuring PLUMED: 
    ./configure --prefix=/home/thang/local/app/plumed2/2.6.0-gcc \
    CXX=mpic++ --disable-external-blas --disable-external-lapack \
    --enable-openmp --enable-modules=all

    V. Useful commands

    1. See the test input files of an action
    go to the "regtest" folder of the plumed source code and type:

    grep <ACTION_NAME> */*/plumed.dat

    Ex: grep DISTANCE */*/plumed.dat
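A self-contained version of the same lookup, with a throwaway regtest-like tree built on the spot (directory names here are made up):

```shell
# build a fake regtest layout and search it with the same grep pattern
tmp=$(mktemp -d)
mkdir -p "$tmp/regtest/basic/rt-distance"
printf 'd: DISTANCE ATOMS=1,2\nPRINT ARG=d\n' > "$tmp/regtest/basic/rt-distance/plumed.dat"
( cd "$tmp/regtest" && grep -l DISTANCE */*/plumed.dat )   # prints the matching file
```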

    VI. OMPI + Clang

    don't use the external "mkl" --> compilation will fail
    module load mpi/ompi4.0.3-clang10
    Configure
    ./configure CXX=mpic++ CC=mpicc \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/home1/p001cao/local/app/plumed2/2.7htt-clang

    VII. MVAPICH2 + GCC
    module load mpi/mvapich2-2.3.2-gcc9.2.0
    module load conda/py37mvapichSupp
    Configure 
    ./configure CXX=mpic++ CC=mpicc LDFLAGS="-fuse-ld=gold -lrt" \
    --enable-openmp --enable-modules=all --enable-asmjit \
    --prefix=/home1/p001cao/local/app/plumed2/2.7htt-mva

    Sunday, July 28, 2019

    Compiling GCC 10

    NOTE:
    - Some applications require C++11, which is only supported on GCC 4.8 or newer
    - intel 2018 supports gcc versions 4.3 - 6.3
    https://software.intel.com/en-us/articles/intel-c-compiler-180-for-linux-release-notes-for-intel-parallel-studio-xe-2018

    1. Download:
    https://gcc.gnu.org/releases.html

    check all available GCC versions:
    svn ls svn://gcc.gnu.org/svn/gcc/tags | grep gcc | grep release
    #or http://ftp.tsukuba.wide.ad.jp/software/gcc/releases

    wget http://ftp.tsukuba.wide.ad.jp/software/gcc/releases/gcc-10.3.0/gcc-10.3.0.tar.gz
    tar xvf gcc-10.3.0.tar.gz

    2. Install
    a. download prerequisites:
    cd gcc-10.3.0
    ./contrib/download_prerequisites

    b. Configure:
    note: build GCC outside the source dir, so a failed build does not modify the source tree


    ## USC1: (eagle)
    ## (if any) error: Couldn't resolve host 'github.com' while ....
    git config --global --unset http.proxy    
    git config --global --unset https.proxy 
    ####
    git clone -b releases/gcc-11.2.0 https://github.com/gcc-mirror/gcc gcc-11.2.0
    cd gcc-11.2.0                          # the clone already checked out releases/gcc-11.2.0
    ./contrib/download_prerequisites
    mkdir build && cd build
    module load compiler/gcc-10.3         # to avoid:  uint64_t or int64_t not found 
    ../configure --enable-languages=c,c++,objc,obj-c++,fortran \
    --enable-checking=release --enable-shared --disable-multilib --with-system-zlib \
    --prefix=/uhome/p001cao/local/app/compiler/gcc-11.2
    make        # build without -j first, so errors are easier to spot

    ## USC 2
    --prefix=/home1/p001cao/local/app/compiler/gcc-10.3
    ## CAN
    --prefix=/home/thang/local/app/compiler/gcc-10.3
    ## CAN_GPU
    --prefix=/home/thang/local/app/compiler/gcc-10.3
    #configure: error: uint64_t or int64_t not found       --> need at least gcc-4.5
    module load compiler/gcc-7.4

    # if the error "gnu/stubs-32.h: No such file or directory" appears, see:
    https://stackoverflow.com/questions/7412548/error-gnu-stubs-32-h-no-such-file-or-directory-while-compiling-nachos-source

    c. Installing:
    make -j 8
    make install

    check: g++ -v

    3. Make module file
    at directory: /uhome/p001cao/local/share/lmodfiles/GCC
    create file "gcc-10.3"
    # Tcl modulefile
    set topdir /home1/p001cao/local/app/compiler/gcc-10.3
    set version gcc-10.3.0
    setenv gcc $topdir

    # gcc compiler

    prepend-path    PATH                    $topdir/bin
    prepend-path    INCLUDE         $topdir/include
    prepend-path    LD_LIBRARY_PATH         $topdir/lib
    prepend-path    LD_LIBRARY_PATH         $topdir/lib64
    prepend-path    LD_LIBRARY_PATH         $topdir/libexec


    ref:
    https://m.blog.naver.com/PostView.nhn?blogId=shumin&logNo=220823075261&proxyReferer=https%3A%2F%2Fwww.google.com%2F
    https://gcc.gnu.org/wiki/InstallingGCC
    https://serverkurma.com/linux/how-to-install-newer-version-of-gcc-on-centos-6-x/
    https://trilinos.github.io/install_gcc.html

    Saturday, July 27, 2019

    Install FFTW, BLAS & LAPACK



    II. BLAS & LAPACK
    https://ahmadzareei.github.io/azareei/linux/2016/04/08/configuring-blas-lapack.html
    1. Download LAPACK
    https://thelinuxcluster.com/2012/04/09/building-lapack-3-4-with-intel-and-gnu-compiler/
    wget http://www.netlib.org/lapack/lapack-3.9.0.tgz
    tar -zxvf lapack-3.9.0.tgz
    cd lapack-3.9.0
    ## use the 64-bit gfortran compiler
    module load compiler/gcc-10.2
    ##  copy and create make.inc
    cp INSTALL/make.inc.gfortran make.inc

    Edit the make.inc
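The fields that usually need editing in make.inc are the compiler and its flags; an illustrative sketch based on the gfortran template (values are assumptions, adapt to your toolchain):

```makefile
# make.inc sketch for a gfortran build
FC      = gfortran
FFLAGS  = -O3 -m64 -fPIC
AR      = ar
ARFLAGS = cr
RANLIB  = ranlib
```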

    make -j 12
    ## copy lib to new place
    mkdir -p /uhome/p001cao/local/app/lapack-3.9
    cp liblapack.a /uhome/p001cao/local/app/lapack-3.9/liblapack.a
    ## Use:
    export LAPACK=/uhome/p001cao/local/app/lapack-3.9/liblapack.a


    2. Install BLAS
    - should use the BLAS source included in the LAPACK source
    cd lapack-3.9.0/BLAS
    ## use the 64-bit gfortran compiler
    module load compiler/gcc-10.2
    gfortran -O3 -std=legacy -m64 -fno-second-underscore -fPIC -c *.f
    ar r libfblas.a *.o   # creates libfblas.a
    ranlib libfblas.a
    ## copy lib to new place
    mkdir -p /uhome/p001cao/local/app/blas
    cp libfblas.a /uhome/p001cao/local/app/blas/libfblas.a  
    ## Use:
    export BLAS=/uhome/p001cao/local/app/blas/libfblas.a
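When linking a program against these archives, LAPACK must come before BLAS on the link line, because static archives are resolved left to right and LAPACK calls into BLAS. A sketch (the program name is hypothetical; the command is only echoed here):

```shell
LAPACK=/uhome/p001cao/local/app/lapack-3.9/liblapack.a
BLAS=/uhome/p001cao/local/app/blas/libfblas.a

# LAPACK first, then BLAS: later archives satisfy symbols of earlier ones
link_cmd="gfortran myprog.f90 $LAPACK $BLAS -o myprog"
echo "$link_cmd"
```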