Use WRF to perform high-performance computing

Last Updated: Sep 10, 2021

This topic uses Weather Research and Forecasting Model (WRF) as an example to show how to perform high-performance computing by using an Elastic High Performance Computing (E-HPC) cluster.

Background information

WRF is a next-generation mesoscale numerical weather prediction system designed for both atmospheric research and operational forecasting applications. WRF can produce simulations based on actual atmospheric conditions or idealized conditions. The model serves a wide range of meteorological applications. It features a software architecture that allows for parallel computation and system extensibility. For more information, visit the WRF official website.

Step 1: Create a cluster and user

  1. Log on to the E-HPC console.

  2. Create a cluster named wrf-test.

    For more information, see Create a cluster. Set the following parameters:

    • Scheduler: Select slurm.

    • Other Software: Select wrf-mpich 3.8.1, wrf-openmpi 3.8.1, mpich 3.2, and openmpi 1.10.7.

    • VNC: Turn on the VNC switch so that you can use the E-HPC console to remotely log on to the cloud desktop or application of the cluster.

  3. Create a sudo user named wrftest.

    For more information, see Create a user.

Step 2: Run geogrid.exe

The geogrid.exe file defines the model horizontal domain and horizontally interpolates static geographical data to the model domain. Before you run geogrid.exe, perform the following operations:

  • Install NCAR Command Language (NCL) on the logon node. For more information, visit the NCL official website.

  • Configure the namelist.wps file. For information about the parameters and parameter descriptions of the namelist.wps file, visit the WRF official website.

Note

In this example, namelist.wps resides in the /home/wrftest/WPS directory.
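
The following is a minimal single-domain sketch of namelist.wps. The start and end dates roughly correspond to the Hurricane Katrina case that is used later in this topic; the domain size, grid spacing, map projection, and geog_data_path values are illustrative assumptions that you must adjust to your own simulation and to the directory into which you extract the geographical data in Step 2:

    &share
     wrf_core = 'ARW',
     max_dom = 1,
     start_date = '2005-08-28_00:00:00',
     end_date   = '2005-08-29_00:00:00',
     interval_seconds = 21600,
    /

    &geogrid
     parent_id         = 1,
     parent_grid_ratio = 1,
     i_parent_start    = 1,
     j_parent_start    = 1,
     e_we              = 91,
     e_sn              = 82,
     geog_data_res     = 'default',
     dx = 30000,
     dy = 30000,
     map_proj  = 'lambert',
     ref_lat   = 28.00,
     ref_lon   = -89.00,
     truelat1  = 30.0,
     truelat2  = 60.0,
     stand_lon = -89.0,
     geog_data_path = '/home/wrftest/WPS/geog'
    /

    &ungrib
     out_format = 'WPS',
     prefix = 'FILE',
    /

    &metgrid
     fg_name = 'FILE',
     io_form_metgrid = 2,
    /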

  1. On the Cluster page of the E-HPC console, find wrf-test, and click Connect.

  2. In the Connect panel, set Cluster User to wrftest, and specify a password and port number for wrf-test. Then, click Connect via SSH.

  3. Check whether the software required by WRF is installed in the cluster.

    export MODULEPATH=/opt/ehpcmodulefiles/
    module avail
  4. Load the WRF software environment.

    module load wrf-mpich/3.8.1 mpich/3.2
    echo $WPSHOME $WRFHOME
  5. Copy the installed WRF Preprocessing System (WPS) and WRF software to the working directory.

    cp -r $WPSHOME $WPSCOPYHOME
    cp -r $WRFHOME $WRFCOPYHOME
    Note

    Replace $WPSCOPYHOME and $WRFCOPYHOME in the preceding code with the actual working directory, for example, /home/wrftest/WPS.
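
    For example, with the working directories that are used in the rest of this topic (/home/wrftest/WPS for WPS and /home/wrftest/WRFV3 for WRF, both assumptions of this example), the commands look like the following:

    cp -r $WPSHOME /home/wrftest/WPS
    cp -r $WRFHOME /home/wrftest/WRFV3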

  6. Download and decompress the geographical data file.

    Note

    In this example, the geog_complete.tar.gz file is used. You can also download a different geographical data file based on your needs. For more information, visit the WRF official website.

    cd /home/wrftest/WPS
    wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_complete.tar.gz
    tar -zxvf geog_complete.tar.gz
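
    The archive extracts into a geographical data directory whose name depends on the archive version (commonly geog or WPS_GEOG). Make sure that the geog_data_path parameter in the &geogrid section of namelist.wps points to that directory, for example:

    ls -d */                             # check the name of the extracted directory
    grep geog_data_path namelist.wps     # confirm that namelist.wps points to it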
    
    
  7. Link the GEOGRID.TBL file.

    The GEOGRID.TBL file determines which fields are interpolated by geogrid.

    ln -s geogrid/GEOGRID.TBL GEOGRID.TBL
  8. Interpolate static geographical data to the model domain.

    ./geogrid.exe

    After geogrid.exe is run, one geo_em.d0N.nc file is generated for each domain in the WPS directory.
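
    If the netCDF command-line utilities are available on the node (an assumption; WRF itself depends on the netCDF library), you can inspect the header of a generated domain file to confirm the grid dimensions:

    ncdump -h geo_em.d01.nc | head -n 20    # print the dimensions and the start of the file header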

Step 3: Run ungrib.exe

The ungrib.exe file extracts meteorological fields from GRIB files.

  1. Download and decompress the meteorological data file of Hurricane Katrina.

    Note

    In this example, the Katrina.tar.gz file is used. To download Katrina.tar.gz, click Katrina.tar.gz example file. You can also download a different meteorological data file based on your needs. For more information, visit the WRF official website.

    wget http://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/Katrina.tar.gz
    tar -zxvf Katrina.tar.gz
  2. Link the data file to the WPS directory.

    ./link_grib.csh /home/wrftest/wrfdata/Katrina/avn
  3. Link the Vtable of the data file.

    In this example, Vtable.GFS is used. You can also select a different Vtable based on your needs.

    ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
  4. Extract the required meteorological fields.

    ./ungrib.exe

    After ungrib.exe is run, intermediate files named in the format FILE:YYYY-MM-DD_hh are generated in the WPS directory.
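
    To verify the result, list the intermediate files. The FILE prefix is the value of the prefix parameter in the &ungrib section of namelist.wps:

    ls -l FILE:*    # one intermediate file per time interval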

Step 4: Run metgrid.exe

The metgrid.exe file horizontally interpolates meteorological fields to the model domain determined by geogrid.exe.

  1. Link the METGRID.TBL file.

    The METGRID.TBL file defines how metgrid.exe horizontally interpolates meteorological fields to the model domain.

    ln -s metgrid/METGRID.TBL.ARW METGRID.TBL
  2. Horizontally interpolate meteorological fields to the model domain determined by geogrid.exe.

    ./metgrid.exe

    After metgrid.exe is run, files named in the format met_em.d0N.yyyy-mm-dd_hh:mm:ss.nc are generated in the WPS directory.
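
    To verify the result, list the met_em files:

    ls -l met_em.d01.*    # one file per time interval for domain 1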

Step 5: Run wrf.exe

The wrf.exe file generates the forecast data. Before you run wrf.exe, configure the namelist.input file. The &time_control and &domains settings in the namelist.input file must be consistent with those in the namelist.wps file. For more information, visit the WRF official website.

Note

In this example, namelist.input resides in the /home/wrftest/WRFV3/run directory.
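
The following is a minimal sketch of the &time_control and &domains sections of namelist.input. The values are illustrative assumptions that mirror the namelist.wps sketch earlier in this topic; interval_seconds, e_we, e_sn, dx, and dy must match namelist.wps, and the omitted settings (such as num_metgrid_levels in &domains and the &physics and &dynamics sections) must also be configured:

    &time_control
     run_days   = 1,
     start_year = 2005, start_month = 08, start_day = 28, start_hour = 00,
     end_year   = 2005, end_month   = 08, end_day   = 29, end_hour   = 00,
     interval_seconds = 21600,
     input_from_file  = .true.,
     history_interval = 60,
    /

    &domains
     time_step = 180,
     max_dom   = 1,
     e_we      = 91,
     e_sn      = 82,
     e_vert    = 30,
     dx        = 30000,
     dy        = 30000,
    /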

  1. Go to the WRF run directory.

    cd /home/wrftest/WRFV3/run
  2. Import the processing result of WPS.

    ln -s /home/wrftest/WPS/met_em* /home/wrftest/WRFV3/run/
  3. Initialize simulation data.

    ./real.exe

    After real.exe is run, the wrfinput_d0N and wrfbdy_d0N files are generated in the /home/wrftest/WRFV3/run directory.
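
    You can check the run directory and, if WRF was built for distributed-memory runs, the end of the rsl.out.0000 log file to confirm that real.exe completed successfully:

    ls -l wrfinput_d* wrfbdy_d*    # initial and lateral boundary condition files
    tail rsl.out.0000              # should end with a success message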

  4. Generate the forecast data.

    mpirun -np 10 /home/wrftest/WRFV3/run/wrf.exe

    After wrf.exe is run, the wrfout_d0N_[date] file is generated in the /home/wrftest/WRFV3/run directory. You can use NCL or NCAR Graphics to create images from the WRF output.
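
    Because the wrf-test cluster uses the Slurm scheduler, you can also submit wrf.exe as a batch job instead of running mpirun interactively on the logon node. The following job script is a minimal sketch; the job name, node count, and task count are assumptions that you should adapt to your cluster:

    #!/bin/bash
    #SBATCH --job-name=wrf-katrina
    #SBATCH --nodes=1
    #SBATCH --ntasks=10

    # Load the same WRF environment that was used for the WPS steps.
    export MODULEPATH=/opt/ehpcmodulefiles/
    module load wrf-mpich/3.8.1 mpich/3.2

    # Run wrf.exe from the run directory so that it finds namelist.input.
    cd /home/wrftest/WRFV3/run
    mpirun -np 10 ./wrf.exe

    Save the script as wrf.slurm, for example, and submit it as the wrftest user by running sbatch wrf.slurm.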