
GROMACS

Last Updated: Aug 01, 2018

GROMACS (GROningen MAchine for Chemical Simulations) is a general-purpose software package for simulating the molecular dynamics of systems with millions of particles, based on Newtonian equations of motion. GROMACS is designed primarily for biochemical molecules such as proteins, lipids, and nucleic acids, which have many complex bonded interactions. Because GROMACS is also highly efficient at computing the non-bonded interactions that dominate typical simulations, many researchers use it to study non-biological systems as well, such as polymers.

GROMACS supports all the common algorithms used in modern molecular dynamics, and its code is maintained by developers around the world. For more information, visit the GROMACS official website.

Prerequisites

Select and install the GROMACS software package when you create the cluster.

(Screenshot: selecting the GROMACS software package during cluster creation)

Note:

  • To run the gromacs-gpu example, select a GPU series instance for the compute node when you create the cluster. Otherwise, you may not be able to run the gromacs-gpu example described in Example 2.

  • Before running the GROMACS examples, you must complete the prerequisites described in the Submit jobs article.

In addition, select the MPI library on which the software depends.

(Screenshot: selecting the MPI library)

Run GROMACS examples

Example 1: Lysozyme in water

This example walks you through setting up a simulation system that contains a protein (lysozyme) and ions in a box of water. See the official tutorial for details and to download the example files.

Procedure

  1. Run the serial version:

     ```bash
     $ ./serial_run.sh
     ```

  2. Run the parallel version:

     ```bash
     $ ./parallel_run.sh
     ```
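The contents of the run scripts are not reproduced in this article. As a rough sketch of what a serial run involves, the official lysozyme tutorial chains the standard GROMACS preparation steps before mdrun; the file names below (1AKI.pdb, ions.mdp, minim.mdp) follow the tutorial and are assumptions, not the exact contents of serial_run.sh:

```bash
#!/bin/sh
# Hypothetical outline of a serial lysozyme run, following the official
# tutorial; the downloaded serial_run.sh may differ in detail. On this
# cluster the binary may be gmx_mpi rather than gmx.
gmx pdb2gmx -f 1AKI.pdb -o processed.gro -water spce -ff oplsaa  # generate topology
gmx editconf -f processed.gro -o boxed.gro -c -d 1.0 -bt cubic   # define simulation box
gmx solvate -cp boxed.gro -cs spc216.gro -o solvated.gro -p topol.top  # add water
gmx grompp -f ions.mdp -c solvated.gro -p topol.top -o ions.tpr  # preprocess for ion placement
echo SOL | gmx genion -s ions.tpr -o solvated_ions.gro -p topol.top \
    -pname NA -nname CL -neutral   # replace solvent molecules with neutralizing ions
gmx grompp -f minim.mdp -c solvated_ions.gro -p topol.top -o em.tpr
gmx mdrun -v -deffnm em            # run energy minimization
```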

Example 2: Water molecule motion

This example simulates the motion of a large number of water molecules in a given space and at a given temperature. A GPU instance is required.

Procedure

  1. Set the environment variables, and then run the module command to check whether the GROMACS software has been installed.

     ```bash
     $ export MODULEPATH=/opt/ehpcmodulefiles/  # environment variable required by the module command
     $ module avail
     ------------------------------ /opt/ehpcmodulefiles -------------------------------------
     gromacs-gpu/2016.3    openmpi/1.10.7
     ```
  2. Run module load to load GROMACS and OpenMPI.

     ```bash
     $ module load openmpi
     $ module load gromacs-gpu
     $ which gmx_mpi
     /opt/gromacs-gpu/2016.3/bin/gmx_mpi
     ```
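     As a quick sanity check, you can also print the version of the loaded build:

     ```bash
     $ gmx_mpi --version
     ```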
  3. Download the water example.

     Assume that the current directory is the $HOME directory of the current user.

     ```bash
     $ pwd
     /home/<current_user_name>
     $ wget http://public-ehs.oss-cn-hangzhou.aliyuncs.com/packages/water_GMX50_bare.tar.gz
     $ tar xzvf water_GMX50_bare.tar.gz
     ```
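     The archive extracts to water-cut1.0_GMX50_bare, which contains the benchmark systems. You can confirm that the 1536 system referenced by the job scripts below is present:

     ```bash
     $ ls water-cut1.0_GMX50_bare/
     ```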
  4. Submit a PBS job to run the water example.

     • PBS job script for a high-configuration compute node (at least 32 CPU cores and two GPU cards). The -gpu_id 0011 option maps the four MPI ranks to GPU IDs 0, 0, 1, and 1, so each of the two GPU cards is shared by two ranks.

       ```bash
       $ cat > gromacs_single_node.pbs <<'EOF'
       #!/bin/sh
       #PBS -l ncpus=32,mem=4gb
       #PBS -l walltime=00:20:00
       #PBS -o gromacs_gpu_pbs.log
       #PBS -j oe
       cd $HOME/water-cut1.0_GMX50_bare/1536
       /opt/gromacs-gpu/2016.3/bin/gmx_mpi grompp -f pme.mdp -c conf.gro -p topol.top -o topol_pme.tpr
       /opt/openmpi/1.10.7/bin/mpirun -np 4 /opt/gromacs-gpu/2016.3/bin/gmx_mpi mdrun -ntomp 8 -resethway -noconfout -nsteps 8000 -v -pin on -nb gpu -gpu_id 0011 -s topol_pme.tpr
       EOF
       ```
     • PBS job script for a low-configuration compute node:

       ```bash
       $ cat > gromacs_single_node.pbs <<'EOF'
       #!/bin/sh
       #PBS -l ncpus=4,mem=4gb
       #PBS -l walltime=00:20:00
       #PBS -o gromacs_gpu_pbs.log
       #PBS -j oe
       cd $HOME/water-cut1.0_GMX50_bare/1536
       /opt/gromacs-gpu/2016.3/bin/gmx_mpi grompp -f pme.mdp -c conf.gro -p topol.top -o topol_pme.tpr
       /opt/openmpi/1.10.7/bin/mpirun -np 1 /opt/gromacs-gpu/2016.3/bin/gmx_mpi mdrun -ntomp 4 -resethway -noconfout -nsteps 8000 -v -pin on -nb gpu -s topol_pme.tpr
       EOF
       ```
  5. Submit the job using the PBS job script, and then check its status with qstat.

     ```bash
     $ qsub gromacs_single_node.pbs
     1.iZ2zedptfv8e8dc9c2zt0tZ
     $ qstat
                                                                 Req'd  Req'd   Elap
     Job ID          Username Queue    Jobname    SessID NDS TSK Memory Time  S Time
     --------------- -------- -------- ---------- ------ --- --- ------ ----- - -----
     1.iZ2zedptfv8e8 mingying workq    gromacs_si  20775   1   4    4gb 00:20 R 00:03
     ```
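     When the job completes, its standard output and error streams are merged (#PBS -j oe) into the log file named by #PBS -o. Assuming you submitted the job from your home directory, you can inspect the mdrun output there, including the performance summary at the end of the log:

     ```bash
     $ tail gromacs_gpu_pbs.log
     ```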