Friday 17 July 2015

WRF3.7 on gaia

Downloaded and installed on gaia: sweeneyc/Code/WRF3.7

module list

Currently Loaded Modulefiles:
  1) gcc/4.9.2
  2) openmpi/1.8.4
  3) netcdf/2014_10
  4) mpich2/1.5
  5) netcdf-fortran/4.4.1

export JASPERLIB=/gaia/software/src/grib2/lib
export JASPERINC=/gaia/software/src/grib2/include
export NETCDF=/gaia/software/src/netcdf-fortran-4.4.1/build
export PHDF5=/gaia/software/src/hdf5-1.8.13/build
export NCARG_ROOT=/gaia/software/src/ncl_ncarg-6.2.1

./clean -a

./configure
#35
#1

./compile em_real

This should produce the following in main/*.exe:
wrf.exe -- model executable
real.exe -- real-data initialization
ndown.exe -- off-line, one-way nesting
tc.exe -- TC bogussing, serial only

cd ../WPS

./clean -a

./configure
#1

vim configure.wps
# WRF_DIR                 =       ../WRFV3p7
# NETCDF                  =       /gaia/software/src/netcdf-c/build
# NETCDFF                 =       /gaia/software/src/netcdf-fortran-4.4.1/build
#WRF_INCLUDE         =   ...
                        -I$(NETCDF)/include\
                        -I$(NETCDFF)/include
#WRF_LIB             =   ...
                        -L$(NETCDF)/lib -lnetcdf \
                        -L$(NETCDFF)/lib -lnetcdff -lgomp
# DM_FC               = mpif90 #-f90=gfortran
# DM_CC               = mpicc #-cc=gcc
#Append -lgomp to LDFLAGS

./compile >& log.compile

This should produce:

geogrid.exe -> geogrid/src/geogrid.exe
ungrib.exe -> ungrib/src/ungrib.exe
metgrid.exe -> metgrid/src/metgrid.exe

TEST RUN

In WPS directory, edit namelist.wps:
max_dom = 1,
geog_data_path = '/gaia/software/src/WPS_geog/'

Check domain:
module load ncl_ncarg
export NCARG_ROOT=/gaia/software/src/ncl_ncarg-6.2.1
ncl util/plotgrids_new.ncl

Run ./geogrid.exe, then make sure that the static file has been created:

geo_em.d01.nc

Download the Tutorial case study data

Link in the AWIP Vtable:
ln -sf ungrib/Variable_Tables/Vtable.AWIP Vtable

Link-in the GRIB data by making use of the script link_grib.csh

./link_grib.csh ../../../DATA/TESTWRF/2000012

Edit namelist.wps, and set the following:

start_date = '2000-01-24_12:00:00',
end_date = '2000-01-25_12:00:00',
interval_seconds = 21600,

Run ungrib to create the intermediate files:

./ungrib.exe >& ungrib_data.log

Run metgrid.exe to interpolate the input data on our model domain:

# module load netcdf-fortran/4.4.1 
./metgrid.exe

Change into WRF directory.
cd ../WRFV3p7/run

Link-in the met_em files created with metgrid.exe
ln -sf ../../WPS/met_em.d01.2000-01* .

namelist.input should already have default settings. Run real.exe. Check that the following two files have been created:
wrfinput_d01
wrfbdy_d01

Run wrf.exe. Check that the following file has been created:
wrfout_d01_2000-01-24_12:00:00


Move the output file to my Mac, and plot something with Python:
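Here's a minimal sketch of the sort of plot I mean, assuming the netCDF4 and matplotlib Python packages are installed (T2, XLONG and XLAT are standard wrfout variables; the local path is illustrative):

from netCDF4 import Dataset
import matplotlib.pyplot as plt

# Open the wrfout file copied over from gaia (path is illustrative)
nc = Dataset('/Users/conor/DATA/TESTWRF/wrfout_d01_2000-01-24_12:00:00')

# 2-m temperature and coordinates at the first output time
t2 = nc.variables['T2'][0, :, :]
lon = nc.variables['XLONG'][0, :, :]
lat = nc.variables['XLAT'][0, :, :]

# Quick-look plot on the model grid (no map projection)
plt.pcolormesh(lon, lat, t2)
plt.colorbar(label='2-m temperature (K)')
plt.savefig('t2m.png')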

Thursday 16 July 2015

LFR reforecasts using 20CR and ERA-20C


20CR

The latest (recommended) version of the NOAA/CIRES Twentieth Century Global Reanalysis is Version 2c:
http://rda.ucar.edu/datasets/ds131.2/

I registered for an account, and downloaded data using "Data Access" > "Get a Subset"

Land cover and geometric height were downloaded from a different page:
http://rda.ucar.edu/datasets/ds131.1/index.html#sfol-wl-/data/ds131.1?g=2

Recommended data for WRF:

3D: Temp, U, V, HGT, RH
2D: Surface Pressure, MSLP, Skin Temp, 2mT, 2mRH, 10mU, 10mV


If the WRF model is going to be run with the Noah LSM, then at least 2 levels of Soil Temperature and Soil Moisture are required.

Soil Height is recommended but not required. If Soil Height is supplied, then Soil Temperatures and TSK can be adjusted to the WRF model terrain height.

Water equivalent snow depth (SNOW) is a nice field to have, but not required.

SEAICE is good to have for a high latitude winter case, but it is not required.

http://www2.mmm.ucar.edu/wrf/OnLineTutorial/Basics/UNGRIB/ungrib_req_fields.htm


----

ERA-20C

ERA-20C is ECMWF's first atmospheric reanalysis of the 20th century, from 1900-2010. It assimilates observations of surface pressure and surface marine winds only.
http://www.ecmwf.int/en/research/climate-reanalysis/era-20c

----

WRF

Latest version for WRF-ARW and WPS is V3.7:
http://www2.mmm.ucar.edu/wrf/users/wrfv3.7/wrf_model.html

----

Run WRF with 20CR

Change into the WPS directory and clean up any previous data:
rm geo_em* GRIBFILE.* met_em.* FILE\:* *.log

# module load netcdf/2014_10
Run geogrid.exe.


Link in the Vtable. I've written one called Vtable.20CRv2c, based on Vtable.NARR. Differences:

use SPECHUMD at 2m
PMSL GRIB = 2
ST GRIB = 11
SOILHGT GRIB = 8

ln -sf ungrib/Variable_Tables/Vtable.20CRv2c Vtable

Link-in the GRIB data.
./link_grib.csh ../../../DATA/20CRv2c/

Edit namelist.wps.
start_date = '1910-05-20_06:00:00',
end_date   = '1910-05-21_06:00:00',
...
 e_we              =  67,
 e_sn              =  54,
 geog_data_res     = '10m',
 dx = 50000,
 dy = 50000,
 map_proj = 'lambert',
 ref_lat   =  48.049,
 ref_lon   =  8.07,
 truelat1  =  50.0,
 truelat2  =  46.0,
 stand_lon =  8.0,
...

Run
./ungrib.exe

Run
# module load netcdf-fortran/4.4.1 
./metgrid.exe

Change into WRF directory. Clean up any old data.
cd ../WRFV3p7/run
rm met_em.* wrfinput_* wrfbdy_* namelist.output wrfout_* rsl.*

Link-in the met_em files created with metgrid.exe
ln -sf ../../WPS/met_em.* .

check namelist.input. Run real.exe. Check that the following two files have been created:
wrfinput_d01
wrfbdy_d01

Run wrf.exe. Check that the following file has been created:
wrfout_d01_1910-05-20_06:00:00

xxxx

I should check out my old work files on metstore:

/store/CONOR/BACKUP/Code/WRF/WRF-20CRV2

xxxx

Friday 10 July 2015

Python for WRF

I am using NCL at the moment to generate the images for UCD MCC GFS/WRF forecast:

http://mathsci.ucd.ie/met/mcc-forecast.html

It all works fine... so I want to change it! It would be nice to run the forecast on two domains - a higher-resolution one over Ireland and the UK. I'd also like to have meteograms for a number of locations, which would show forecasts and (historical) observations.

Can I generate these images using Python? I've had a quick Google around, and a promising place to start seems to be with the UK Met Office SciTools packages:

http://scitools.org.uk/

I've installed it on my mac using conda:

conda install -c scitools iris

Then I had a look at the user guide (http://scitools.org.uk/iris/docs/latest/userguide/index.html) and saw this:


If you are reading this user guide for the first time it is strongly recommended that you read the user guide fully before experimenting with your own data files.

Hmmm. I'm usually too impatient. Let's see how far I get...
- Iris data structures are "cubes", which contain data and metadata
- To load a single file into a list of Iris cubes the iris.load() function is used:

import iris
filename = '/Users/conor/DATA/2015071000/wrfout_d01_2015-07-12_02:00:00'
cubes = iris.load(filename)
print cubes
t2m = cubes[165]
print t2m

Wow! That worked! Loading multiple files, and specific fields, is easy too:

filenames = '/Users/conor/DATA/2015071000/wrfout_d01_2015-07-12*'
t2ms = iris.load(filenames, 'T2')
print t2ms

- The iris.save() function saves one or more cubes to a file.
- Iris is able to save to CF NetCDF (1.5) and GRIB (edition 2).

So, can I plot something?

No, there's a problem: WRF output is not CF-compliant, so it doesn't store data in the format that Iris expects. NCL works with WRF because it was specifically coded to handle the WRF format. Possibly useful links if I want to continue with this:

https://groups.google.com/forum/#!topic/scitools-iris/O7IRIhMomWc


I tried using wrfncxnj-0.1_r2183. It did produce a NetCDF output file, but when I read that in, I still couldn't plot it in Iris.

I did, however, get Scott Hosking's python code (link above) to work, after one change due to a .join error:

timestr = ''.join(times[time])

Now I can plot WRF using Python!
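For reference, the join is needed because WRF stores its Times variable as an array of single characters rather than as strings. A small sketch of the idea using netCDF4 (file name illustrative):

from netCDF4 import Dataset

nc = Dataset('wrfout_d01_2000-01-24_12:00:00')
times = nc.variables['Times'][:]  # shape (n_times, 19), one character per element
timestr = ''.join(times[0])       # e.g. '2000-01-24_12:00:00'
print timestr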


A more comprehensive set of functions is available on this page:

http://www.atmos.washington.edu/~lmadaus/pyscripts.html

I wget-ed these files:

coltbls.py
plot_wrf_maps.py

And ran this:
./plot_wrf_maps.py -e -f ../../../DATA/TESTWRF/wrfout_d01_2000-01-24_12\:00\:00 

This produced lots of plots! Here's one:



This may be a good starting point for MCC plots.

Thursday 14 May 2015

Continuity equation

Consider a control volume of fixed mass $\delta M$ that moves with the fluid. Letting $\delta V = \delta x \delta y \delta z$ be the volume, we can write: $$ \frac{1}{\delta M} \frac{D}{Dt} (\delta M) = \frac{1}{\rho \delta V} \frac{D}{Dt} (\rho \delta V) = \frac{1}{\rho} \frac{D \rho}{Dt} + \frac{1}{\delta V} \frac{D \delta V}{Dt} = 0 $$

So, following the motion of a control volume, the mass is conserved, and the fractional rate of change of density is equal and opposite to the fractional rate of change of volume.
 

 We can rewrite the volume term: $$ \frac{1}{\delta V} \frac{D \delta V}{Dt} = \frac{1}{\delta x \delta y \delta z} \frac{D}{Dt} (\delta x \delta y \delta z) = \frac{1}{\delta x} \frac{D \delta x}{Dt} + \frac{1}{\delta y} \frac{D \delta y}{Dt} + \frac{1}{\delta z} \frac{D \delta z}{Dt} $$ What does $\frac{D \delta x}{Dt}$ represent? Consider the $u$ velocity at the left ($u_1$) and the right ($u_2$) of our control volume: $$ u_1 = \frac{D x}{D t} \\ u_2 = \frac{D (x + \delta x)}{D t} \\ u_2 - u_1 = \delta u = \frac{D \delta x}{D t} $$
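So $\frac{1}{\delta x} \frac{D \delta x}{Dt} = \frac{\delta u}{\delta x}$, and similarly for $y$ and $z$. Letting the control volume shrink to a point, the fractional rate of change of volume becomes the divergence of the velocity, and mass conservation takes its familiar form: $$ \frac{1}{\rho} \frac{D \rho}{Dt} + \nabla \cdot \mathbf{u} = 0 $$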

Thursday 12 March 2015

VAPOR on HIWINDS

Install VAPOR:
https://www.vapor.ucar.edu/docs/vapor-installation/vapor-binary-installation
Register for VAPOR.
username: conor.sweeney_656814
password: namewedyr
Cite VAPOR if used:
https://www.vapor.ucar.edu/citation
Download: Version 2.4.0 (Stable Release) Release Date: March 10, 2015 Linux - x86_64 (64 bit)
Required packages:
sudo apt-get install mesa-utils
sudo apt-get install csh

Install:
 cd /home/conor/Code/VAPOR
wget https://www.vapor.ucar.edu/sites/default/files/Installers/vapor-2.4.0-Linux_x86_64.tar.gz
tar -xzf vapor-2.4.0-Linux_x86_64.tar.gz
./vapor-install.csh /home/conor/Code/VAPOR
cd /home/conor/Code/VAPOR/vapor-2.4.0/bin
chmod +x vapor-setup.sh
./vapor-setup.sh
./vaporgui 
./vaporgui: error while loading shared libraries: libexpat.so.0: failed to map segment from shared object: Cannot allocate memory

Wednesday 11 March 2015

Metview and OpenIFS

Information on using MetView with OpenIFS:

https://software.ecmwf.int/wiki/display/OIFS/Using+MetView+with+OpenIFS

wget https://software.ecmwf.int/wiki/download/attachments/24314215/metview_openifs.tar.gz

Unpack the compressed tarfile inside your $HOME/metview directory.


Monday 9 March 2015

Metview on HIWINDS

packages
sudo apt-get install libproj-dev
sudo apt-get install libxt-dev
sudo apt-get install libxmu-dev
sudo apt-get install qt4-qmake
sudo apt-get install libqt4-dev
NETCDF
tar -xzf netcdf-4.3.3.1.tar.gz
cd netcdf-4.3.3.1
./configure --disable-dap
make
make check > makecheck.log
sudo make install
tar -xzf netcdf-cxx-4.2.tar.gz
cd netcdf-cxx-4.2
export LD_LIBRARY_PATH=/usr/local/lib
CPPFLAGS=-I/usr/local/include  ./configure 
make >& make.log
make check >& makecheck.log
sudo make install
GRIB API
tar -xzf grib_api-1.13.0.tar.gz
cd grib_api-1.13.0
CPPFLAGS=-I/usr/local/include  ./configure CFLAGS="-fPIC -O2" --with-netcdf=/usr/local
make >& make.log
make check >& makecheck.log
sudo make install
EMOSLIB
wget https://software.ecmwf.int/wiki/download/attachments/44242086/libemos-4.0.0-Source.tar.gz
tar -xzf libemos-4.0.0-Source.tar.gz
mkdir build 
cd build
cmake /home/conor/Code/ECMWF/EMOSLIB/libemos-4.0.0-Source 
make >& make.log
make check >& makecheck.log
sudo make install
MAGICS
tar -xzf Magics-2.24.1-Source.tar.gz 
mkdir build
cd build
cmake /home/conor/Code/ECMWF/MAGICS/Magics-2.24.1-Source -DENABLE_METVIEW=ON
make
MAGPLUS_HOME=/home/conor/Code/ECMWF/MAGICS/Magics-2.24.1-Source make test
sudo make install
METVIEW
tar -xzf Metview-4.5.2-Source.tar.gz
mkdir build
cd build
cmake /home/conor/Code/ECMWF/METVIEW/Metview-4.5.2-Source >& cmake.log
make >& make.log
make test
sudo make install

Monday 26 January 2015

WRF mpi on gaia

WRF compiled and ran first time following these instructions:

http://www2.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php

Check speedup using test case here:

/gaia/home/sweeneyc/Code/WRFMPI/run



gaia has 24 nodes, each with two Intel Xeon E5-2660 v2 CPUs:

http://gaia.ucd.ie/technicalDetails.php

These CPUs have 10 cores and 20 threads:

http://ark.intel.com/products/75272/Intel-Xeon-Processor-E5-2660-v2-25M-Cache-2_20-GHz

Hyperthreading just adds virtual cores, which can speed things up when, for example, there is a cache miss, but I won't use it here. So, each node has 2 CPUs, and each CPU has 10 cores.

Test speedup by submitting jobs with different numbers of cores:

#PBS -l nodes=1:ppn=40
  • time mpirun -np 4 --map-by ppr:1:core wrf.exe: 550s
  • time mpirun -np 9 --map-by ppr:1:core wrf.exe: 331s
  • time mpirun -np 15 --map-by ppr:1:core wrf.exe: 299s (????)
  • time mpirun -np 16 --map-by ppr:1:core wrf.exe: 303s (????)
  • time mpirun -np 20 wrf.exe: 166s (no flags)
  • time mpirun -np 20 --map-by ppr:1:core wrf.exe: 166s (map by core)
  • time mpirun -np 30 wrf.exe: 258s
  • time mpirun -np 30 --map-by ppr:1:core wrf.exe: doesn't run. more procs than available (good).
#PBS -l nodes=2:ppn=40
  • time mpirun -np 30 --map-by ppr:1:core wrf.exe: 126s
  • time mpirun -np 36 --map-by ppr:1:core wrf.exe: 118s
  • time mpirun -np 40 --map-by ppr:1:core wrf.exe: 110s
#PBS -l nodes=3:ppn=40
  • time mpirun -np 42 --map-by ppr:1:core wrf.exe: 107s
  • time mpirun -np 49 --map-by ppr:1:core wrf.exe: 98s
  • time mpirun -np 56 --map-by ppr:1:core wrf.exe: 94s
  • time mpirun -np 60 --map-by ppr:1:core wrf.exe: 90s
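As a quick check on the scaling, a short sketch that computes speedup and parallel efficiency from the single-node times above, taking the 4-core run as the baseline:

# Wall-clock times (s) from the nodes=1 runs above
times = {4: 550, 9: 331, 15: 299, 16: 303, 20: 166}
base_np, base_t = 4, times[4]
for n in sorted(times):
    speedup = float(base_t) / times[n]
    efficiency = speedup * base_np / n  # relative to the 4-core baseline
    print '%2d cores: speedup %.2f, efficiency %.2f' % (n, speedup, efficiency)

The 20-core run comes out at roughly 3.3x the 4-core time, i.e. about two-thirds efficiency.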


This speedup was initially horrible. The problem was with Infiniband: it wasn't working, so the nodes were communicating over standard ethernet, hence the horribly slow times. Eamonn fixed this by using the correct port for Infiniband (ib1, not ib0) and installing the official Mellanox drivers. He then recompiled WRF with openmpi/1.8.4 and gcc/4.9.2. The new module is WRF/3.6.1openmpi_ib.

Doing longer run tests: Run WRF for 7 days:

WPS/3.6.1 and WRF/3.6.1openmpi_ib

#PBS -l nodes=4:ppn=40
- time mpirun -np 80 --map-by ppr:1:core wrf.exe: real    46m20s

#PBS -l nodes=6:ppn=40
- time mpirun -np 120 --map-by ppr:1:core wrf.exe: real    39m43s

WPS/3.7 and WRF/3.7

#PBS -l nodes=4:ppn=40
- time mpirun -np 80 --map-by ppr:1:core wrf.exe: real    

#PBS -l nodes=6:ppn=40
- time mpirun -np 120 --map-by ppr:1:core wrf.exe: real  40m50s

WRF crashes with 7 or more nodes, over-decomposition of domain:
http://forum.wrfforum.com/viewtopic.php?f=6&t=4930
Rough guideline there suggests at least 15x15 per tile.

SMALLER DOMAINS

#PBS -l nodes=4:ppn=40
- time mpirun -np 80 --map-by ppr:1:core wrf.exe: real  40m09s

#PBS -l nodes=5:ppn=40
- time mpirun -np 100 --map-by ppr:1:core wrf.exe: real  37m19s

WRF crashes with 6 or more nodes, over-decomposition of domain.

WRF/3.7_dm_sm
#PBS -l nodes=5:ppn=40
export OMP_NUM_THREADS=20
- time mpirun -np 5 --map-by ppr:1:core wrf.exe: time out (40m) at 12-02-06:00

export OMP_NUM_THREADS=4
- time mpirun -np 25 --map-by ppr:1:core wrf.exe: time out (40m) at 12-02-18:00

export OMP_NUM_THREADS=4
- time mpirun -np 25 wrf.exe: time out (40m) at 12-01-00:00 (!)


Wednesday 14 January 2015

Download ERA-Interim data

ERA-Interim dataset (January 1979 to present).
http://www.ecmwf.int/en/research/climate-reanalysis/era-interim

Global atmospheric and surface parameters from 1 January 1979 to present, at T255 spectral resolution (~80 km) on 60 vertical levels. 6-hourly atmospheric fields.

Retrieve data using batch scripts:
https://software.ecmwf.int/wiki/display/WEBAPI/Accessing+ECMWF+data+servers+in+batch

To do this on gaia:

1) Create directory for ECMWF retrieval and change into that directory.
/gaia/home/sweeneyc/Code/ECMWF

2) Download ecmwf-api-client-python.tgz
wget -O ecmwf-api-client-python.tgz "https://software.ecmwf.int/wiki/download/attachments/23694554/ecmwf-api-client-python.tgz?version=4&modificationDate=1394123817025&api=v2"
(The URL needs quoting so the shell doesn't treat the & characters as job control; -O keeps the saved file name tidy.)

3) Unpack ecmwf-api-client-python.tgz
tar -xzf ecmwf-api-client-python.tgz

4) set environment variable to point to that folder.
export PYTHONPATH=/gaia/home/sweeneyc/Code/ECMWF

5) Install your API key.
To access ECMWF you will need an API key. For that, you first need to log in at https://apps.ecmwf.int/auth/login/
and then retrieve your key at
https://api.ecmwf.int/v1/key/
For this, you will need an account on the ECMWF web site. If you don't have one, please self-register at
https://apps.ecmwf.int/registration/

Copy the information on that page and paste it into the file $HOME/.ecmwfapirc

6) Agree with conditions.
To access this dataset, you need to agree to the corresponding terms and conditions, which can be found under the "Licence" link in the table on the batch access web page. For ERA-Interim, it's this link:
http://apps.ecmwf.int/datasets/licences/general

7) Use web page to generate MARS request.
Test by getting MSLP. Surface level data are here:
http://apps.ecmwf.int/datasets/data/interim_full_daily/?levtype=sfc

retrieve,
class=ei,
dataset=interim,
date=1979-01-01/to/1979-01-31,
grid=0.75/0.75,
levtype=sfc,
param=151.128,
step=0,
stream=oper,
time=00/06/12/18,
type=an

8) Create python script.
Use the MARS request from step 7 to create a file called GetERAI.py (or any name you like):

#!/usr/bin/env python
from ecmwfapi import ECMWFDataServer
server = ECMWFDataServer()
server.retrieve({
'dataset' : "interim",
'step' : "0",
'levtype' : "sfc",
'date' : "19790101/to/19790131",
'time' : "00/06/12/18",
'type' : "an",
'param' : "151.128",
'area' : "80/-30/30/30",
'grid' : "0.75/0.75",
'target' : "data.grib"
})

9) Get the data!

Run the script:
python GetERAI.py

This will (hopefully) result in a file called data.grib in your directory. You can check the file:

module load WPS
g1print.exe data.grib | head

Copen: File = data.grib                                                                                                               
Fortran Unit = 0
UNIX File descriptor: 3

----------------------------------------------------
 rec GRIB GRIB  Lvl  Lvl  Lvl         Time      Fcst
 Num Code name  Code one  two                   hour
----------------------------------------------------
   1 151 MSL      1    0    0  1979-01-01_00:00 + 00
   2 151 MSL      1    0    0  1979-01-01_06:00 + 00

Check that the GRIB Code and Lvl numbers are the same as expected by Vtable.ERA-interim.ml

Here are the python files I used to download ERA-Interim data to drive WRF:

Invariant Fields:

GetERAI-sfc.py (only need to retrieve these once)
#!/usr/bin/env python
from ecmwfapi import ECMWFDataServer
server = ECMWFDataServer()
server.retrieve({
'class' : "ei",
'dataset' : "interim",
'step' : "0",
'levtype' : "sfc",
'date' : "19890101",
'time' : "12",
'type' : "an",
'param' : "129.128/172.128",
'area' : "80/-30/30/20",
'grid' : "0.75/0.75",
'stream' : "oper",
'RESOL' : "AV",
'target' : "data-invariant.grb"
})

Surface Fields:

GetERAI-sfc.py
#!/usr/bin/env python
from ecmwfapi import ECMWFDataServer
server = ECMWFDataServer()
server.retrieve({
'class' : "ei",
'dataset' : "interim",
'step' : "0",
'levtype' : "sfc",
'date' : "19790101/to/19790131",
'time' : "00/06/12/18",
'type' : "an",
'param' : "134.128/139.128/141.128/151.128/165.128/166.128/167.128/168.128/170.128/183.128/235.128/236.128/31.128/33.128/34.128/39.128/40.128/41.128/42.128",
'area' : "80/-30/30/20",
'grid' : "0.75/0.75",
'stream' : "oper",
'RESOL' : "AV",
'target' : "data.grib"
})

Model Level Fields:

GetERAI-ml.py
#!/usr/bin/env python
from ecmwfapi import ECMWFDataServer
server = ECMWFDataServer()
server.retrieve({
'class' : "ei",
'dataset' : "interim",
'step' : "0",
'levtype' : "ml",
'levelist' : "1/to/60",
'date' : "19790101/to/19790131",
'time' : "00/06/12/18",
'type' : "an",
'param' : "130.128/131.128/132.128/133.128",
'area' : "80/-30/30/20",
'grid' : "0.75/0.75",
'stream' : "oper",
'RESOL' : "AV",
'target' : "data.grib"
})

Data retrieval can be automated for different dates by using a bash script like the one below. (Note: this assumes the GetERAI-sfc.py and GetERAI-ml.py templates contain the placeholders NDATE1N and NDATE2N in their date strings and write their output to ERAI-NDATE1N-NDATE2N-sfc.grb and ERAI-NDATE1N-NDATE2N-ml.grb, rather than the literal dates and data.grib targets shown above.)

GetERAI.bash

#!/bin/bash -l

CODEDIR=/gaia/home/sweeneyc/Code/ECMWF
DATADIR=/gaia/home/sweeneyc/DATA/ERAI

export PYTHONPATH=/gaia/home/sweeneyc/Code/ECMWF

cd $CODEDIR

DATE1=19790101
DATE2=19790103

YY1=`echo $DATE1 | cut -c1-4`
MM1=`echo $DATE1 | cut -c5-6`
DD1=`echo $DATE1 | cut -c7-8`

YY2=`echo $DATE2 | cut -c1-4`
MM2=`echo $DATE2 | cut -c5-6`
DD2=`echo $DATE2 | cut -c7-8`

sed -e s/NDATE1N/${DATE1}/g -e s/NDATE2N/${DATE2}/g GetERAI-sfc.py > GetERAI-${DATE1}-${DATE2}-sfc.py

python GetERAI-${DATE1}-${DATE2}-sfc.py

sed -e s/NDATE1N/${DATE1}/g -e s/NDATE2N/${DATE2}/g GetERAI-ml.py > GetERAI-${DATE1}-${DATE2}-ml.py

python GetERAI-${DATE1}-${DATE2}-ml.py

mkdir -p ${DATADIR}/$YY1

mv ERAI-${DATE1}-${DATE2}-sfc.grb ERAI-${DATE1}-${DATE2}-ml.grb ${DATADIR}/$YY1/

exit 0
----

How to run WRF driven by ERA-Interim data

Get data

Download data as explained here:
http://conorsweeneyucd.blogspot.com/2015/01/download-era-interim-data.html

Test run on gaia

Working directory:
/gaia/home/sweeneyc/Code/WRF/ERAI

- load the WPS module:
module load WPS

- Edit namelist.wps:
Notes:
- For domain 2, set end time = start time in namelist.wps if you don't want updating. We do want to update surface fields (SST), so start and end dates should be the same for both domains.
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '1979-01-01_00:00:00','2006-08-16_12:00:00',
 end_date   = '1979-01-01_06:00:00','2006-08-16_12:00:00',
 interval_seconds = 21600
 io_form_geogrid = 2,
/

&geogrid
 parent_id         =   1,   1,
 parent_grid_ratio =   1,   3,
 i_parent_start    =   1,  31,
 j_parent_start    =   1,  17,
 e_we              =  74, 112,
 e_sn              =  61,  97,
 geog_data_res     = '10m','2m',
 dx = 27000,
 dy = 27000,
 map_proj = 'lambert',
 ref_lat   =  52.,
 ref_lon   = -23.5,
 truelat1  =  50.0,
 truelat2  =  55.0,
 stand_lon = -20.0,
 geog_data_path = '/gaia/software/src/WPS_geog/'
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

&metgrid
 fg_name = 'FILE'
 io_form_metgrid = 2, 
/

- Check the domain:
module load ncl_ncarg
export NCARG_ROOT=/gaia/software/src/ncl_ncarg-6.2.1
ncl util/plotgrids_new.ncl

- Make sure you have geogrid/GEOGRID.TBL then run geogrid.exe

- Copy in the Vtable for ERA-Interim:
cp /gaia/software/src/WPS-3.6.1/ungrib/Variable_Tables/Vtable.ERA-interim.ml .
cp Vtable.ERA-interim.ml Vtable

- Link in the grib data:
link_grib.csh ~/DATA/ERAI/1979/ERAI-

- Run ungrib.exe
This should produce the files:
FILE:1979-01-01_00
FILE:1979-01-01_06

- create ecmwf_coeffs file:

    0         0.000000    0.00000000
    1        20.000000    0.00000000
    2        38.425343    0.00000000
    3        63.647804    0.00000000
    4        95.636963    0.00000000
    5       134.483307    0.00000000
    6       180.584351    0.00000000
    7       234.779053    0.00000000
    8       298.495789    0.00000000
    9       373.971924    0.00000000
   10       464.618134    0.00000000
   11       575.651001    0.00000000
   12       713.218079    0.00000000
   13       883.660522    0.00000000
   14      1094.834717    0.00000000
   15      1356.474609    0.00000000
   16      1680.640259    0.00000000
   17      2082.273926    0.00000000
   18      2579.888672    0.00000000
   19      3196.421631    0.00000000
   20      3960.291504    0.00000000
   21      4906.708496    0.00000000
   22      6018.019531    0.00000000
   23      7306.631348    0.00000000
   24      8765.053711    0.00007582
   25     10376.126953    0.00046139
   26     12077.446289    0.00181516
   27     13775.325195    0.00508112
   28     15379.805664    0.01114291
   29     16819.474609    0.02067788
   30     18045.183594    0.03412116
   31     19027.695313    0.05169041
   32     19755.109375    0.07353383
   33     20222.205078    0.09967469
   34     20429.863281    0.13002251
   35     20384.480469    0.16438432
   36     20097.402344    0.20247594
   37     19584.330078    0.24393314
   38     18864.750000    0.28832296
   39     17961.357422    0.33515489
   40     16899.468750    0.38389215
   41     15706.447266    0.43396294
   42     14411.124023    0.48477158
   43     13043.218750    0.53570992
   44     11632.758789    0.58616841
   45     10209.500977    0.63554746
   46      8802.356445    0.68326861
   47      7438.803223    0.72878581
   48      6144.314941    0.77159661
   49      4941.778320    0.81125343
   50      3850.913330    0.84737492
   51      2887.696533    0.87965691
   52      2063.779785    0.90788388
   53      1385.912598    0.93194032
   54       855.361755    0.95182151
   55       467.333588    0.96764523
   56       210.393890    0.97966272
   57        65.889244    0.98827010
   58         7.367743    0.99401945
   59         0.000000    0.99763012
   60         0.000000    1.00000000

- Run calc_ecmwf_p.exe
This should produce the files:
PRES:1979-01-01_00
PRES:1979-01-01_06

- Make sure you have your metgrid/METGRID.TBL pointing to METGRID.TBL.ARW

- Run metgrid.exe
This should produce the files:
met_em.d01.1979-01-01_00:00:00.nc
met_em.d01.1979-01-01_06:00:00.nc

Notes:
- I have checked, and the met files do contain varying values for SST and SEAICE.
- metgrid.exe failed at first because my ERAI data didn't cover my WRF domain!

- Change into WRF/run directory
/gaia/home/sweeneyc/Code/WRF/WRFV3/run.test

- module load WRF

- link in the met_em files:
ln -s ../../../WRF/ERAI/met_em.d01.1979-01-01_0* .

- edit namelist.input
It should agree with namelist.wps.
Use ncdump -h on a met_em file to check the number of metgrid levels:
num_metgrid_levels = 61
num_metgrid_soil_levels = 4
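As an alternative to ncdump, the same dimension can be read with the netCDF4 Python package (a sketch; the file name matches the met_em files above):

from netCDF4 import Dataset

nc = Dataset('met_em.d01.1979-01-01_00:00:00.nc')
# This value should match num_metgrid_levels in namelist.input
print 'num_metgrid_levels =', len(nc.dimensions['num_metgrid_levels'])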

- Run real.exe
This should produce:
wrfinput_d01
wrfbdy_d01

- Run WRF
mpirun -np 4 wrf.exe


- >>>>STOPPED HERE<<<<

calc_ecmwf_p.exe

In the course of vertically interpolating meteorological fields, the real program requires 3-d pressure and geopotential height fields on the same levels as the other atmospheric fields. The calc_ecmwf_p.exe utility may be used to create these fields for use with ECMWF sigma-level data sets. Given a surface pressure field (or log of surface pressure field) and a list of coefficients A and B, calc_ecmwf_p.exe computes the pressure at an ECMWF sigma level $k$ at grid point $(i,j)$ as $P_{ijk} = A_k + B_k \, Psfc_{ij}$. The list of coefficients used in the pressure computation can be copied from a table appropriate to the number of sigma levels in the data set from http://www.ecmwf.int/products/data/technical/model_levels/index.html. This table should be written in plain text to a file, ecmwf_coeffs, in the current working directory.
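As an illustration of that formula, a small sketch that reads an ecmwf_coeffs file (format as above: level index, A, B) and prints the pressure on each level for one illustrative surface pressure:

# Pressure on ECMWF model levels: P_k = A_k + B_k * Psfc
psfc = 101325.0  # illustrative surface pressure (Pa)
with open('ecmwf_coeffs') as f:
    for line in f:
        k, a, b = line.split()
        print 'level %3s: %10.2f Pa' % (k, float(a) + float(b) * psfc)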

If soil height (or soil geopotential), 3-d temperature, and 3-d specific humidity fields are available, calc_ecmwf_p.exe computes a 3-d geopotential height field, which is required to obtain an accurate vertical interpolation in the real program. Given a set of intermediate files produced by ungrib and the file ecmwf_coeffs, calc_ecmwf_p loops over all time periods in namelist.wps, and produces an additional intermediate file, PRES:YYYY-MM-DD_HH, for each time, which contains pressure and geopotential height data for each full sigma level, as well as a 3-d relative humidity field. This intermediate file should be specified to metgrid, along with the intermediate data produced by ungrib, by adding 'PRES' to the list of prefixes in the fg_name namelist variable.

Input variables
WRF comes with a variable table for ERA-Interim model-level data, Vtable.ERA-interim.ml:
GRIB | Level| Level| Level| metgrid  |  metgrid | metgrid                                  |
Code | Code |   1  |   2  | Name     |  Units   | Description                              |
-----+------+------+------+----------+----------+------------------------------------------+
 130 | 109  |   *  |      | TT       | K        | Temperature                              |
 131 | 109  |   *  |      | UU       | m s-1    | U                                        |
 132 | 109  |   *  |      | VV       | m s-1    | V                                        |
 133 | 109  |   *  |      | SPECHUMD | kg kg-1  | Specific humidity                        |
 165 |  1   |   0  |      | UU       | m s-1    | U                                        | At 10 m
 166 |  1   |   0  |      | VV       | m s-1    | V                                        | At 10 m
 167 |  1   |   0  |      | TT       | K        | Temperature                              | At 2 m
 168 |  1   |   0  |      | DEWPT    | K        |                                          | At 2 m
     |  1   |   0  |      | RH       | %        | Relative Humidity at 2 m                 | At 2 m
 172 |  1   |   0  |      | LANDSEA  | 0/1 Flag | Land/Sea flag                            |
 129 |  1   |   0  |      | SOILGEO  | m2 s-2   |                                          |
     |  1   |   0  |      | SOILHGT  | m        | Terrain field of source analysis         |
 134 |  1   |   0  |      | PSFC     | Pa       | Surface Pressure                         |
 151 |  1   |   0  |      | PMSL     | Pa       | Sea-level Pressure                       |
 235 |  1   |   0  |      | SKINTEMP | K        | Sea-Surface Temperature                  |
  31 |  1   |   0  |      | SEAICE   | fraction | Sea-Ice-Fraction                         |
  34 |  1   |   0  |      | SST      | K        | Sea-Surface Temperature                  |
  33 |  1   |   0  |      | SNOW_DEN | kg m-3   |                                          |
 141 |  1   |   0  |      | SNOW_EC  | m        |                                          |
     |  1   |   0  |      | SNOW     | kg m-2   |Water Equivalent of Accumulated Snow Depth|
     |  1   |   0  |      | SNOWH    | m        | Physical Snow Depth                      |
 139 | 112  |   0  |   7  | ST000007 | K        | T of 0-7 cm ground layer                 |
 170 | 112  |   7  |  28  | ST007028 | K        | T of 7-28 cm ground layer                |
 183 | 112  |  28  | 100  | ST028100 | K        | T of 28-100 cm ground layer              |
 236 | 112  | 100  | 255  | ST100255 | K        | T of 100-255 cm ground layer             |
  39 | 112  |   0  |   7  | SM000007 | m3 m-3   | Soil moisture of 0-7 cm ground layer     |
  40 | 112  |   7  |  28  | SM007028 | m3 m-3   | Soil moisture of 7-28 cm ground layer    |
  41 | 112  |  28  | 100  | SM028100 | m3 m-3   | Soil moisture of 28-100 cm ground layer  |
  42 | 112  | 100  | 255  | SM100255 | m3 m-3   | Soil moisture of 100-255 cm ground layer |
-----+------+------+------+----------+----------+------------------------------------------+

Lines which have a metgrid name but no GRIB codes are generated by ungrib.exe.

namelist.input

parent_grid_ratio: integer parent-to-nest domain grid size ratio. Typically an odd number ratio is used in real-data applications.

feedback: To define a two-way nested run, feedback is on. The values of the coarse domain are overwritten by the values of the variables in the nest at the coincident points.
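For example, the relevant fragment of the &domains record for a two-way nested run might look like this (values illustrative):

&domains
 max_dom           = 2,
 parent_id         = 1, 1,
 parent_grid_ratio = 1, 3,
 feedback          = 1,
/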

Sea Surface Temperature
The WRF model physics does not predict sea-surface temperature, vegetation fraction, albedo or sea ice. For long simulations, the model provides an alternative that reads in time-varying data and updates these fields. In order to use this option, one must have access to time-varying SST and sea-ice fields. Twelve monthly values of vegetation fraction and albedo are available from the geogrid program. Once these fields are processed via WPS, one may activate the following options in the namelist record &time_control before running real.exe and wrf.exe:

io_form_auxinput4 = 2
auxinput4_inname = "wrflowinp_d<domain>" (created by real.exe)
auxinput4_interval = 360, 360, 360,

and in &physics

sst_update = 1

Note that this option doesn't work with the sf_ocean_physics options.

Restart Run

The results at the end of one or more restart runs should be identical to a single run without any restart. In order to do a restart run, one must first create a restart file. This is done by setting the namelist variable restart_interval (in minutes) to be equal to or less than the simulation length in the first model run, as specified by the run_* variables or the start_* and end_* times. When the model reaches the time to write a restart file, a file named wrfrst_d<domain>_<date> will be written; the date string represents the time when the restart file is valid.
When one starts the restart run, edit the namelist.input file so that the start_* time is set to the restart time (which is the time the restart file was written). The other namelist variable one must set is restart; it should be set to .true. for a restart run.

WRF may fail to write a restart file, because the basic netCDF file format only supports files up to 2 GB. One solution is to simply set the namelist option io_form_restart = 102 (instead of 2), which forces the restart file to be written in multiple pieces, one per processor.
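Putting those settings together, a minimal &time_control fragment for a run that writes a restart file once per day and is itself started from a restart file (values illustrative):

&time_control
 restart          = .true.,
 restart_interval = 1440,
 io_form_restart  = 102,
/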

One-way Nested Run

Step 1: Make a coarse grid run. Hourly output.

Step 2: Run geogrid.exe and metgrid.exe for two domains (as if you are making a 2-way nested run).

Step 3.
- Edit the namelist.input file, changing max_dom = 2, and making sure columns 1 and 2 are set up for a 2-domain run, with the correct start time and grid dimensions.
- Run real.exe for 2 domains.
- Rename the wrfinput_d02 file to wrfndi_d02.

Step 4.
- Add io_form_auxinput2 = 2 in the &time_control section of namelist.input in order to run ndown.exe.
- Set namelist variable interval_seconds to reflect the history output interval from the coarse domain model run.
- Do not change physics options until after running ndown program.
- Run ndown.exe, which uses input from the coarse grid wrfout file(s), and the wrfndi_d02 file generated from Step 3 above. This will produce a wrfinput_d02 and wrfbdy_d02 file.

Step 5.
- Rename wrfinput_d02 and wrfbdy_d02 to wrfinput_d01 and wrfbdy_d01, respectively.
- Rename (or move) the original wrfout_d01* files to something else (or another directory) so as to not overwrite them.
- Edit namelist.input, moving all of the fine-grid domain data from column 2 to column 1, so that this run will be for the fine-grid domain only. Make sure that time_step is set to comply with the fine-grid domain (typically 6*DX, with DX in km). It may be beneficial to save namelist.input as something else prior to this step in case you need to repeat this process in the future.
- Users may take advantage of a feature that allows both the initial and lateral boundaries to use the moist and scalar arrays (have_bcs_moist and have_bcs_scalar, respectively). This option is only to be used during the WRF model run that follows the ndown processing. With this option, a user must keep the microphysics options the same between forecasts. The advantage is that the previous WRF model run provides realistic lateral boundary tendencies for all of the microphysical variables, instead of a simple "zero inflow" or "zero gradient outflow".
- Run WRF for this grid.

Time Series Output
To activate this option, a file called "tslist" must be present in the WRF run directory. The tslist file contains a list of locations, defined by their latitude and longitude, along with a short description and an abbreviation for each location. A sample file looks something like this:

#-----------------------------------------------#
# 24 characters for name | pfx | LAT | LON |
#-----------------------------------------------#
Cape Hallett hallt -72.330 170.250
McMurdo Station mcm -77.851 166.713

The maximum number of time series locations is controlled by the namelist variable max_ts_locs in the namelist record &domains; the default value is 5. The time series output contains selected variables at the surface, including 2-m temperature, vapor mixing ratio, and the 10-m wind components u and v, rotated to earth coordinates. More information on time series output can be found in WRFV3/run/README.tslist.

Starting in V3.5, in addition to surface variables, vertical profiles of earth-relative U and V, potential temperature, water vapor, and geopotential height are also output. The default number of levels in the output is 15, but it can be changed with the namelist variable max_ts_level.
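Both limits live in the &domains namelist record, e.g. (values illustrative):

&domains
 max_ts_locs  = 10,
 max_ts_level = 20,
/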

Examples of namelists for various applications

a. 1 to 4 km grid distances, convection-permitting runs for a 1- 3 day run: Page 5-30
b. 10 to 20 km grid distances: Page 5-31
e. Regional climate case at 10 to 30 km grid sizes: Page 5-32

Tuesday 13 January 2015

Atmospheric Reanalysis Datasets

https://climatedataguide.ucar.edu/climate-data/atmospheric-reanalysis-overview-comparison-tables

Potential datasets for downscaling:

NCEP Reanalysis V2:
- 1979 - present
- 2.5 degree
- 28 sigma levels
- 6-hourly

20CR V2:
- 1871 - 2012
- 2.0 degree
- 24 pressure levels
- 6-hourly

ERA-Interim:
- 1979 - present
- 0.75 degree
- 60 levels
- 6-hourly

ERA-20C:
- 1900 - 2010
- 125 km
- 91 levels
- 3-hourly

CFSR:
- 1979 - present
- 0.5 degree
- 64 levels
- hourly

JRA-55:
- 1958 - present
- 0.5 degree
- 60 levels
- 6-hourly

NASA MERRA:
- 1979 - 2013
- 0.5 degree
- 72 levels
- 6-hourly

Friday 9 January 2015

WRF on gaia

Eamonn has installed WPS and WRF as modules:

WPS/3.6.1
WRF/3.6.1

module load WPS
module load WRF

He has also put the geog data in a central location:

/gaia/software/src/WPS_geog

To specify the number of cores for WRF:

export OMP_NUM_THREADS=9

Then just run wrf.exe.

I found a post that suggests the domain be decomposed into squares of no less than 15*15. I don't know if this relates to MPI or OpenMP, but I'll use it as a guideline anyway.

Friday 2 January 2015

Divergence Theorem

The divergence theorem, also known as Gauss's theorem or Ostrogradsky's theorem, states that the outward flux of a vector field through a closed surface is equal to the volume integral of the divergence over the region inside the surface. Intuitively, it states that the sum of all sources minus the sum of all sinks gives the net flow out of a region. $$ \iiint_V ( \nabla \cdot \mathbf{F} ) \, dV = \oint_S ( \mathbf{F} \cdot \mathbf{n} ) \, dS $$
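A quick sanity check with $\mathbf{F} = (x, y, z)$ on the unit sphere: $\nabla \cdot \mathbf{F} = 3$ everywhere, and $\mathbf{F} \cdot \mathbf{n} = 1$ on the surface, so both sides agree: $$ \iiint_V 3 \, dV = 3 \cdot \tfrac{4}{3} \pi = 4 \pi = \oint_S 1 \, dS $$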