GROMACS

Last Updated: Nov 03, 2017

GROMACS overview


GROMACS (GROningen MAchine for Chemical Simulations) is a general-purpose software package used to simulate the molecular dynamics of systems with millions of particles based on the Newtonian equations of motion. GROMACS is mainly used for biochemical molecules such as proteins, lipids, and nucleic acids, which have a variety of complex bonded interactions. Because GROMACS is highly efficient at the calculations typical of such simulations, such as nonbonded interactions, many researchers also use it to study non-biological systems, such as polymers.

GROMACS supports all the common algorithms used in modern molecular dynamics, and its code is maintained by developers throughout the world. For more information, visit the official website.

Prerequisites

For the following examples, you must install the GROMACS software package during cluster creation.


Note: To run the gromacs-gpu example, you must select a GPU instance type for the compute nodes during cluster creation. Otherwise, you cannot run gromacs-gpu as described in the second example.

You must also select the MPI library that the software depends on.


Run GROMACS examples

Note: Before running the GROMACS examples, you must complete the prerequisites described in Submit jobs.

GROMACS Example 1: Lysozyme in water

This example guides you through setting up a simulation system that contains a protein (lysozyme) and ions in a box of water. See the official tutorial.

Download address

Click here to download.

Procedure

  • Serial version:

    $ ./serial_run.sh

  • Parallel version:

    $ ./parallel_run.sh
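Both scripts wrap the standard tutorial workflow. For orientation, the sketch below shows the preparation and energy-minimization steps such a script typically performs, following the official lysozyme tutorial; the input files (1AKI.pdb, ions.mdp, minim.mdp) and the gmx binary name are assumptions taken from that tutorial and may differ from the exact contents of serial_run.sh.

    #!/bin/sh
    # Sketch: prepare and minimize the lysozyme-in-water system (tutorial inputs assumed present).
    gmx pdb2gmx -f 1AKI.pdb -o processed.gro -water spce                 # build topology (prompts for a force field)
    gmx editconf -f processed.gro -o newbox.gro -c -d 1.0 -bt cubic      # center the protein in a cubic box
    gmx solvate -cp newbox.gro -cs spc216.gro -o solv.gro -p topol.top   # fill the box with water
    gmx grompp -f ions.mdp -c solv.gro -p topol.top -o ions.tpr          # preprocess for ion placement
    echo SOL | gmx genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL -neutral   # replace solvent with ions
    gmx grompp -f minim.mdp -c solv_ions.gro -p topol.top -o em.tpr      # preprocess for minimization
    gmx mdrun -v -deffnm em                                              # run energy minimization

The parallel version typically performs the same preparation steps and then launches mdrun through MPI instead, for example mpirun -np 4 gmx_mpi mdrun -v -deffnm em.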

GROMACS Example 2: Water molecule motion

This example simulates the motion of a large number of water molecules in a given space at a given temperature. A GPU instance is required.

Procedure

  • Set the environment variables, and then run module avail to check whether the GROMACS software has been installed.

    $ export MODULEPATH=/opt/ehpcmodulefiles/  # the environment variable needed by the module command
    $ module avail
    ------------------------------ /opt/ehpcmodulefiles -------------------------------------
    gromacs-gpu/2016.3    openmpi/1.10.7
  • Run module load to load GROMACS and OpenMPI.

    $ module load openmpi
    $ module load gromacs-gpu
    $ which gmx_mpi
    /opt/gromacs-gpu/2016.3/bin/gmx_mpi
  • Download the water example.

    Assume that the current working directory is the $HOME directory of the current user.

    $ pwd
    /home/<current_user_name>
    $ wget http://public-ehs.oss-cn-hangzhou.aliyuncs.com/packages/water_GMX50_bare.tar.gz
    $ tar xzvf water_GMX50_bare.tar.gz
  • Write a PBS job script to run the water example.

    • PBS job script for a high-configuration compute node (>32 CPU cores, two GPU cards):

      $ cat > gromacs_single_node.pbs
      #!/bin/sh
      #PBS -l ncpus=32,mem=4gb
      #PBS -l walltime=00:20:00
      #PBS -o gromacs_gpu_pbs.log
      #PBS -j oe
      cd $HOME/water-cut1.0_GMX50_bare/1536
      /opt/gromacs-gpu/2016.3/bin/gmx_mpi grompp -f pme.mdp -c conf.gro -p topol.top -o topol_pme.tpr
      /opt/openmpi/1.10.7/bin/mpirun -np 4 /opt/gromacs-gpu/2016.3/bin/gmx_mpi mdrun -ntomp 8 -resethway -noconfout -nsteps 8000 -v -pin on -nb gpu -gpu_id 0011 -s topol_pme.tpr

    • PBS job script for a low-configuration compute node:

      $ cat > gromacs_single_node.pbs
      #!/bin/sh
      #PBS -l ncpus=4,mem=4gb
      #PBS -l walltime=00:20:00
      #PBS -o gromacs_gpu_pbs.log
      #PBS -j oe
      cd $HOME/water-cut1.0_GMX50_bare/1536
      /opt/gromacs-gpu/2016.3/bin/gmx_mpi grompp -f pme.mdp -c conf.gro -p topol.top -o topol_pme.tpr
      /opt/openmpi/1.10.7/bin/mpirun -np 1 /opt/gromacs-gpu/2016.3/bin/gmx_mpi mdrun -ntomp 4 -resethway -noconfout -nsteps 8000 -v -pin on -nb gpu -s topol_pme.tpr
  • Submit the job using the PBS job script.

    $ qsub gromacs_single_node.pbs
    1.iZ2zedptfv8e8dc9c2zt0tZ
    $ qstat
                                                                Req'd  Req'd   Elap
    Job ID          Username Queue    Jobname    SessID NDS TSK Memory Time  S Time
    --------------- -------- -------- ---------- ------ --- --- ------ ----- - -----
    1.iZ2zedptfv8e8 mingying workq    gromacs_si  20775   1   4    4gb 00:20 R 00:03
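When the job finishes, its standard output and error are merged (#PBS -j oe) into gromacs_gpu_pbs.log, the file named by the #PBS -o directive. As a quick check, you can watch the job state and follow the log; the exact state codes reported by qstat depend on your PBS flavor.

    $ qstat                        # S column: R = running; finished jobs drop from the default list
    $ tail -f gromacs_gpu_pbs.log  # follow mdrun progress and the final performance summary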