This topic uses the Weather Research and Forecasting (WRF) Model as an example to show how to perform high-performance computing by using an Elastic High Performance Computing (E-HPC) cluster.
WRF is a next-generation mesoscale numerical weather prediction system designed for both atmospheric research and operational forecasting applications. WRF can produce simulations based on actual atmospheric conditions or idealized conditions. The model serves a wide range of meteorological applications. It features a software architecture that allows for parallel computation and system extensibility. For more information, visit the WRF official website.
Step 1: Create a cluster and user
Log on to the E-HPC console.
Create a cluster named wrf-test.
For more information, see Create a cluster. Set the following parameters:
Scheduler: Select slurm.
Other Software: Select wrf-mpich 3.8.1, wrf-openmpi 3.8.1, mpich 3.2, and openmpi 1.10.7.
VNC: Turn on the VNC switch. Then, you can remotely log on to the cloud desktop or app of E-HPC by using the E-HPC console.
Create a sudo user named wrftest.
For more information, see Create a user.
Step 2: Run geogrid.exe
The geogrid.exe file defines the model horizontal domain and horizontally interpolates static geographical data to the model domain. Before you run geogrid.exe, perform the following operations:
Install NCAR Command Language (NCL) on the logon node. For more information, visit the NCL official website.
Configure the namelist.wps file. For information about the parameters and parameter descriptions of the namelist.wps file, visit the WRF official website.
In this example, namelist.wps resides in the /home/wrftest/WPS directory.
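A minimal sketch of a namelist.wps file is shown below. The dates, grid dimensions, and geographical data path are placeholder assumptions for illustration, not values prescribed by this tutorial; set them to match your own case.

```
&share
 wrf_core = 'ARW',
 max_dom  = 1,
 start_date = '2005-08-28_00:00:00',
 end_date   = '2005-08-29_00:00:00',
 interval_seconds = 21600,
/

&geogrid
 e_we = 91,
 e_sn = 100,
 dx = 27000,
 dy = 27000,
 map_proj = 'mercator',
 geog_data_path = '/home/wrftest/WPS/geog',
/
```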
On the Cluster page of the E-HPC console, find wrf-test, and click Connect.
In the Connect panel, set Cluster User to wrftest, and specify a password and port number for wrf-test. Then, click Connect via SSH.
Check whether the software required by WRF is installed in the cluster.
export MODULEPATH=/opt/ehpcmodulefiles/
module avail
Load the WRF software environment.
module load wrf-mpich/3.8.1 mpich/3.2
echo $WPSHOME $WRFHOME
Copy the installed WRF Preprocessing System (WPS) and WRF software to the working directory.
cp -r $WPSHOME $WPSCOPYHOME
cp -r $WRFHOME $WRFCOPYHOME
Note
Replace $WPSCOPYHOME and $WRFCOPYHOME in the preceding code with the actual working directory, for example, /home/wrftest/WPS.
Download and decompress the geographical data file.
Note
In this example, the geog_complete.tar.gz file is used. You can also download a different geographical data file based on your needs. For more information, visit the WRF official website.
cd /home/wrftest/WPS
wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_complete.tar.gz
tar -zxvf geog_complete.tar.gz
Link the GEOGRID.TBL file.
The GEOGRID.TBL file determines which fields are interpolated by geogrid.
ln -s geogrid/GEOGRID.TBL GEOGRID.TBL
Interpolate static geographical data to the model domain.
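The command for this step is not shown above; a minimal sketch, assuming geogrid.exe is run from the copied WPS working directory used in this example:

```shell
# Run geogrid from the WPS working directory (example path from this tutorial).
cd /home/wrftest/WPS
./geogrid.exe
```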
After geogrid.exe is run, the geo_em.d0N.nc file is generated in the WPS directory. The following result is returned.
Step 3: Run ungrib.exe
The ungrib.exe file extracts meteorological fields from GRIB files.
Download and decompress the meteorological data file of Hurricane Katrina.
wget http://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/Katrina.tar.gz
tar -zxvf Katrina.tar.gz
Link the data file to the WPS directory.
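WPS ships a link_grib.csh script for this purpose. The following is a sketch, assuming the Katrina archive was extracted into a Katrina subdirectory of /home/wrftest/WPS; adjust the path to where your GRIB files actually reside:

```shell
# Link the GRIB data files into the WPS working directory as GRIBFILE.AAA, GRIBFILE.AAB, ...
cd /home/wrftest/WPS
./link_grib.csh Katrina/*
```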
Link the Vtable of the data file.
In this example, Vtable.GFS is used. You can also select a different Vtable based on your needs.
ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
Extract the required meteorological fields.
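ungrib.exe runs serially; a sketch, assuming the WPS working directory used in this example:

```shell
# Run ungrib from the WPS working directory to extract fields from the linked GRIB files.
cd /home/wrftest/WPS
./ungrib.exe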
After ungrib.exe is run, files named in the FILE:YYYY-MM-DD_hh format are generated in the WPS directory. The following result is returned.
Step 4: Run metgrid.exe
The metgrid.exe file horizontally interpolates meteorological fields to the model domain determined by geogrid.exe.
Link the METGRID.TBL file.
The METGRID.TBL file defines how metgrid.exe horizontally interpolates meteorological fields to the model domain.
ln -s metgrid/METGRID.TBL.ARW METGRID.TBL
Horizontally interpolate meteorological fields to the model domain determined by geogrid.exe.
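A sketch of this step, assuming metgrid.exe is run from the WPS working directory used in this example:

```shell
# Run metgrid from the WPS working directory to interpolate the ungrib output
# onto the domain defined by geogrid.
cd /home/wrftest/WPS
./metgrid.exe
```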
After metgrid.exe is run, files named in the met_em.d0N.yyyy-mm-dd_hh:mm:ss.nc format are generated in the WPS directory. The following result is returned.
Step 5: Run wrf.exe
The wrf.exe file runs the numerical integration and generates the forecast data. Before you run wrf.exe, define the namelist.input file. The &time_control and &domains parameters in the namelist.input file must be consistent with those in the namelist.wps file. For more information, visit the WRF official website.
In this example, namelist.input resides in the /home/wrftest/WRFV3/run directory.
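As a hedged illustration, the &time_control and &domains sections of namelist.input might look like the following. All values are placeholders, not values prescribed by this tutorial, and must be kept consistent with the corresponding settings in namelist.wps:

```
&time_control
 start_year = 2005, start_month = 08, start_day = 28, start_hour = 00,
 end_year   = 2005, end_month   = 08, end_day   = 29, end_hour   = 00,
 interval_seconds = 21600,
/

&domains
 time_step = 150,
 max_dom   = 1,
 e_we = 91,
 e_sn = 100,
 dx = 27000,
 dy = 27000,
/
```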
Go to the WRFV3 directory.
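Assuming the example paths used in this tutorial, this step is:

```shell
# Change to the WRF run directory, where namelist.input resides in this example.
cd /home/wrftest/WRFV3/run
```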
Import the WPS processing results into the run directory.
ln -s /home/wrftest/WPS/met_em* /home/wrftest/WRFV3/run/
Initialize simulation data.
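The initialization is performed by real.exe. The following is a sketch, assuming a single-process run from the example run directory; the process count is an assumption, not a value from this tutorial:

```shell
# Run real.exe to generate the initial and boundary condition files from the met_em data.
cd /home/wrftest/WRFV3/run
mpirun -np 1 ./real.exe
```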
After real.exe is run, the wrfinput_d0N and wrfbdy_d01 files are generated in the /home/wrftest/WRFV3/run directory.
Export the forecast data.
mpirun -np 10 /home/wrftest/WRFV3/run/wrf.exe
After wrf.exe is run, the wrfout_d0N_[date] file is generated in the WRF run directory. Create NCAR Graphics images for the WRF result, as shown in the following figure.