Obtaining and Processing NARR Surface Fields

Downloading the Files

Obtain the 9 fields detailed in Table 6 using the Perl script get-httpsubset.pl. This will create a set of subdirectories, e.g.

...
198001
198002
...
198011
198012
198101
...
200805
200806
An example is:

./get-httpsubset.pl 1999010100 1999013121 UGRD 10_m_above_gnd ./

which obtains the UGRD u-velocity component field 10 meters above the ground for the entire month of January 1999. This example will create the subdirectory 199901 containing the following GRIB files:
narr-a_221_19990101_0000_000.sub.grb
narr-a_221_19990101_0300_000.sub.grb
narr-a_221_19990101_0600_000.sub.grb
...
narr-a_221_19990101_1800_000.sub.grb
narr-a_221_19990101_2100_000.sub.grb
narr-a_221_19990102_0000_000.sub.grb
narr-a_221_19990102_0300_000.sub.grb
...
narr-a_221_19990131_1800_000.sub.grb
narr-a_221_19990131_2100_000.sub.grb
A complete set of the fields required for running ROMS with an active ocean-atmosphere boundary layer (from Table 6) for January 1999 can be obtained by running all of the following commands. They obtain, respectively, the u-wind component, v-wind component, temperature, pressure, relative humidity, total cloud cover, precipitation rate, downward SW radiation flux, and upward SW radiation flux.

./get-httpsubset.pl 1999010100 1999013121 UGRD 10_m_above_gnd ./
./get-httpsubset.pl 1999010100 1999013121 VGRD 10_m_above_gnd ./
./get-httpsubset.pl 1999010100 1999013121 TMP sfc ./
./get-httpsubset.pl 1999010100 1999013121 PRES sfc ./
./get-httpsubset.pl 1999010100 1999013121 RH 2_m_above_gnd ./
./get-httpsubset.pl 1999010100 1999013121 TCDC atmos_col ./
./get-httpsubset.pl 1999010100 1999013121 PRATE sfc ./
./get-httpsubset.pl 1999010100 1999013121 DSWRF sfc ./
./get-httpsubset.pl 1999010100 1999013121 USWRF sfc ./

The first command will create the initial set of GRIB files (one per 3-hour time), and each subsequent command will append its field to those files.
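The nine downloads above can be driven from a short Python loop rather than typed by hand. This is a sketch only; it assumes get-httpsubset.pl sits in the current directory, and it includes the PRATE precipitation-rate entry from Table 6, which is easy to overlook:

```python
import subprocess

# (GRIB parameter, level) pairs from Table 6, one per required field.
fields = [('UGRD',  '10_m_above_gnd'), ('VGRD',  '10_m_above_gnd'),
          ('TMP',   'sfc'),            ('PRES',  'sfc'),
          ('RH',    '2_m_above_gnd'),  ('TCDC',  'atmos_col'),
          ('PRATE', 'sfc'),            ('DSWRF', 'sfc'),
          ('USWRF', 'sfc')]

commands = [['./get-httpsubset.pl', '1999010100', '1999013121',
             param, level, './'] for param, level in fields]

for cmd in commands:
    print(' '.join(cmd))
    # subprocess.call(cmd)  # uncomment to actually run the downloads
```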

Converting from GRIB to NETCDF4 Format

The GRIB files need to be converted to NetCDF 4 files via the program grib2nc4, e.g.


grib2nc4 narr-a_221_19990101_0000_000.sub.grb

which will create, by default, the NetCDF 4 file:
narr-a_221_19990101_0000_000.sub.grb.nc
that is, the original file with .nc appended.

The Python script makecdf.py automates this for every GRIB file in a specified subdirectory; it is used, e.g., as:

makecdf.py 199001

The program code for makecdf.py is:

Listing #1 - makecdf.py

#!/usr/bin/python
#
import sys, os, fnmatch

#  Get name of present working directory.
pwd = os.getcwd()

#for arg in sys.argv:
#       print arg
#  Get name of directory specified on command line.
directory = pwd + "/" + sys.argv[1]

#  Set alias for grib2nc4 program.

grib2nc4 = '/usr/bin/grib2nc4'

#  Define all_files as per p. 89 in Python Cookbook (2nd Ed.).

def all_files(root, patterns='*', single_level=False, yield_folders=False):
        # Expand patterns from semicolon-separated string to list.
        patterns = patterns.split(';')
        for path, subdirs, files in os.walk(root):
                if yield_folders:
                        files.extend(subdirs)
                files.sort()
                for name in files:
                        for pattern in patterns:
                                if fnmatch.fnmatch(name, pattern):
                                        yield os.path.join(path,name)
                                        break
                if single_level:
                        break

#  Convert all *.grb files into *.nc files.

for filename in all_files(directory, '*.grb'):
        command = grib2nc4 + ' ' + filename
        os.system(command)
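The '*.grb' pattern used by all_files is ordinary fnmatch globbing, so only the bare GRIB files are converted and the already-converted .grb.nc files are skipped. A quick standalone check with hypothetical filenames:

```python
import fnmatch

# Hypothetical directory contents: only the bare .grb file should match.
names = ['narr-a_221_19990101_0000_000.sub.grb',
         'narr-a_221_19990101_0000_000.sub.grb.nc',
         'README']
grbs = [n for n in names if fnmatch.fnmatch(n, '*.grb')]
print(grbs)
```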

Extracting the Gulf of Mexico Subset

The Gulf of Mexico geographic subset must be extracted from the much larger geographic coverage contained within each file. This is done by choosing an appropriate rectangular lon/lat box and picking out the points within its limits.

import scipy,numpy,netCDF4

grid = netCDF4.Dataset('../grid.nc','r+')

lon = grid.variables['gridlon_221']
lat = grid.variables['gridlat_221']

lats = grid.variables['gridlat_221'][:]
lats.shape
(277, 349)
lats
array([[  1.        ,   1.10430562,   1.20829403, ...,   1.10619318,
          1.00189328,   0.89727974],
       [  1.18484223,   1.28965878,   1.39415729, ...,   1.29155564,
          1.18674481,   1.08161962],
       [  1.36998236,   1.47531199,   1.58032298, ...,   1.47721815,
          1.37189424,   1.26625526],
       ...,
       [ 46.52961349,  46.815979  ,  47.1023674 , ...,  46.82117081,
         46.5348053 ,  46.24846649],
       [ 46.58289719,  46.8695755 ,  47.1562767 , ...,  46.87477112,
         46.5880928 ,  46.30144119],
       [ 46.63458252,  46.92156601,  47.20857239, ...,  46.92676926,
         46.63978195,  46.35282898]], dtype=float32)

lons = grid.variables['gridlon_221'][:]
lons.shape
(277, 349)
lons
array([[-145.5       , -145.3150177 , -145.12953186, ...,  -68.68833923,
         -68.5033493 ,  -68.31887817],
       [-145.60473633, -145.41944885, -145.23364258, ...,  -68.58391571,
         -68.39861298,  -68.21383667],
       [-145.70999146, -145.5243988 , -145.33827209, ...,  -68.47897339,
         -68.29336548,  -68.10827637],
       ...,
       [ 149.47387695,  149.55291748,  149.63285828, ...,   -3.55436254,
          -3.47530246,   -3.39711761],
       [ 149.05722046,  149.13401794,  149.21166992, ...,   -3.13541102,
          -3.058604  ,   -2.98264885],
       [ 148.63970947,  148.7142334 ,  148.78959656, ...,   -2.71559358,
          -2.64105368,   -2.56734204]], dtype=float32)

#  Flatten the 2-D lon/lat arrays to 1-D.

lonflat = lons[:,:].ravel()
latflat = lats[:,:].ravel()
lonflat.shape
(96673,)
latflat.shape
(96673,)

#  Create boolean arrays for each lon/lat min/max conditional.

lonmin = lonflat > -98
lonmax = lonflat < -80
latmin = latflat > 18
latmax = latflat < 31

#  Create a single boolean array containing all four conditionals.

lonlatrange = lonmin & lonmax & latmin & latmax

#  Apply the boolean array to the original flattened lon/lat arrays
#  to obtain the subset within the lon/lat min/max ranges.

lonmid = lonflat[lonlatrange]
latmid = latflat[lonlatrange]

#  Apply the same mask to any other flattened field array, e.g. a
#  u-velocity array Uwind read from one of the data files.

u = Uwind[:,:].ravel()
umid = u[lonlatrange]
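The masking logic above can be verified on a tiny hand-made grid, without numpy or a NetCDF file. This is a pure-Python sketch with made-up coordinates; the element-by-element test is exactly what the numpy boolean arrays do in bulk:

```python
# Made-up 1-D (already flattened) coordinates for six grid points.
lonflat = [-100.0, -90.0, -70.0, -97.0, -85.0, -79.0]
latflat = [  17.0,  25.0,  20.0,  30.0,  19.0,  33.0]

# The four conditionals combined, as with the numpy boolean arrays
# above, but element by element.
lonlatrange = [(-98 < lon < -80) and (18 < lat < 31)
               for lon, lat in zip(lonflat, latflat)]

lonmid = [lon for lon, keep in zip(lonflat, lonlatrange) if keep]
latmid = [lat for lat, keep in zip(latflat, lonlatrange) if keep]
print(lonmid, latmid)  # only the second, fourth, and fifth points survive
```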

Converting from NARR- to ROMS-Compatible NetCDF4 Format

The NetCDF 4 files need to be munged for several reasons. Using ncdump4, a typical NetCDF 4 file created from a GRIB file looks like:

netcdf narr-a_221_19990101_0000_000.sub.grb {
dimensions:
        gridy_221 = 349 ;
        gridx_221 = 277 ;
variables:
        float gridrot_221(gridx_221, gridy_221) ;
                gridrot_221:least_significant_digit = 7L ;
...
        float gridlon_221(gridx_221, gridy_221) ;
                gridlon_221:least_significant_digit = 7L ;
...
        float gridlat_221(gridx_221, gridy_221) ;
                gridlat_221:least_significant_digit = 7L ;
...
        float U_GRD_221_HTGL(gridx_221, gridy_221) ;
                U_GRD_221_HTGL:least_significant_digit = 7L ;
...
data:

 gridrot_221 =
  -0.5147449, -0.5122718, -0.5097917, -0.5073047, -0.5048109, -0.5023101,
...
 gridlon_221 =
  -145.5, -145.315, -145.1295, -144.9435, -144.757, -144.5699, -144.3824,
...
 gridlat_221 =
  1, 1.104306, 1.208294, 1.311961, 1.415304, 1.518318, 1.621, 1.723346,
...
 U_GRD_221_HTGL =
  -4.603577, -4.603577, -4.681702, -4.681702, -4.681702, -4.791077,

Transmogrifying this into a NetCDF 4 or NetCDF 3 file acceptable to ROMS requires several actions: creating an unlimited time dimension and a corresponding time variable, renaming the NARR variables to their ROMS names (e.g. U_GRD_221_HTGL to Uwind), and converting the initial_time attribute into days since 1970-01-01.

A sample Python session to accomplish this follows:

#  Import the needed modules.

import scipy,numpy,netCDF4
import time,datetime
from netCDF4 import num2date, date2num

#  Open an existing file as object g for modification.

g = netCDF4.Dataset('test.nc','r+')

#  Create an unlimited time dimension.

g.createDimension('wind_time',None)

#  Create a variable wind_time.

times = g.createVariable('wind_time','f8',('wind_time',))

#  Create the variable attributes for wind_time.

times.units = 'days since 1970-01-01 0:00:00 0:00'
times.long_name = 'wind time since initialization'
times.standard_name = 'time'

#  Rename the variable U_GRD_221_HTGL to Uwind.  The file object must
#  be closed and reopened to see the changes externally.

g.renameVariable('U_GRD_221_HTGL','Uwind')
g.close()
g = netCDF4.Dataset('test.nc','r+')

#  Extract the variable Uwind into the variable object u.

u = g.variables['Uwind']

#  Extract the time from the variable attribute initial_time.

datestring = u.initial_time
datestring
'01/01/1999 (03:00)'

#  Extract a usable datetime string and then find the days since midnight 1970-01-01.

dtstring = datetime.datetime(*time.strptime(datestring, "%m/%d/%Y (%H:%M)")[0:5])
dtstring
datetime.datetime(1999, 1, 1, 3, 0)
days = date2num(dtstring,units='days since 1970-01-01 00:00:00')
days
10592.125
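For reference, the same 10592.125 can be computed with only the standard library, without date2num:

```python
import time, datetime

# The initial_time attribute as stored in the file.
datestring = '01/01/1999 (03:00)'

# Note the format is (%H:%M) -- hours and minutes.
dt = datetime.datetime(*time.strptime(datestring, '%m/%d/%Y (%H:%M)')[0:5])

epoch = datetime.datetime(1970, 1, 1)
days = (dt - epoch).total_seconds() / 86400.0
print(days)  # 10592.125, matching the date2num result above
```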

#  Write the wind time variable to the NetCDF file.  The file object
#  must be closed and reopened to see the changes externally.

wind_time = g.variables['wind_time']
wind_time[0] = days
g.close()
g = netCDF4.Dataset('test.nc','r+')

#  Close the file and move on to the next file.

g.close() 

Listing #2 - cdfconv.py

#!/usr/bin/python
#
import sys, os, fnmatch
import scipy,numpy,netCDF4
import time,datetime
from netCDF4 import num2date, date2num

#prefix = 'narr-u-'

#  Get name of present working directory.
pwd = os.getcwd()

#for arg in sys.argv:
#	print arg
#  Get name of directory specified on command line.
directory = pwd + "/" + sys.argv[1]
print directory

#  Set alias for grib2nc4 program.

#grib2nc4 = '/usr/bin/grib2nc4'

#  Define all_files as per p. 89 in Python Cookbook (2nd Ed.).

def all_files(root, patterns='*', single_level=False, yield_folders=False):
	# Expand patterns from semicolon-separated string to list.
	patterns = patterns.split(';')
	for path, subdirs, files in os.walk(root):
		if yield_folders:
			files.extend(subdirs)
		files.sort()
		for name in files:
			for pattern in patterns:
				if fnmatch.fnmatch(name, pattern):
					yield os.path.join(path,name)
					break
		if single_level:
			break

#  Convert all *.grb files into *.nc files.

prefix = "narr"
vname = "UGRD-10m"

for filename in all_files(directory, 'narr-a_*.nc'):

	print filename
	#  Open the old file.
	ncifile = netCDF4.Dataset(filename,'r+')
	#  Parse the old filename, e.g. narr-a_221_19900101_0000_000.sub.grb.nc
	filestr = filename.split('_')
	print filestr
	hrst = str(filestr[3][0:2])
	#  Create a new filename and open the new file.
	filenew = prefix + "-" + vname + "-" + filestr[2] + hrst + ".nc"
	print filenew
	ncofile = netCDF4.Dataset(filenew,'w')
	print ncofile

	#  Create an unlimited time dimension "time"
	ncofile.createDimension('time',None)
	#  Create a variable "time"
	times = ncofile.createVariable('time','f8',('time',))
	#  Create the variable attributes for "time"
	times.units = 'days since 1970-01-01 0:00:00 0:00'
	times.long_name = 'time since initialization'
	times.standard_name = 'time'

	#  Extract the variable U_GRD_221_HTGL into the variable object u.
	u = ncifile.variables['U_GRD_221_HTGL']
	#  Extract the initial time attribute.
	initial_time = u.initial_time
	print initial_time
	#  Parse the date/time string and calculate days since 1970-01-01.
	dtstring = datetime.datetime(*time.strptime(initial_time, "%m/%d/%Y (%H:%M)")[0:5])
	days = date2num(dtstring,units='days since 1970-01-01 00:00:00')
	print days
	#  Write the time value into the output file via the variable
	#  object "times"; naming it "time" would shadow the time module.
	times[0] = days

	#  Find and extract the GOM-located subsets of lon and lat.  
	lon = ncifile.variables['gridlon_221']
	lat = ncifile.variables['gridlat_221']
	lons = ncifile.variables['gridlon_221'][:]
	lats = ncifile.variables['gridlat_221'][:]
	lonflat = lons[:,:].ravel()
	print " lonflat = ",lonflat
	latflat = lats[:,:].ravel()
	lonmin = lonflat > -98
	lonmax = lonflat < -80
	latmin = latflat > 18
	latmax = latflat < 31
	lonlatrange = lonmin & lonmax & latmin & latmax
	lonmid = lonflat[lonlatrange]
	lonmidsize = len(lonmid)
	print " lonmidsize = ",lonmidsize
	latmid = latflat[lonlatrange]
	latmidsize = len(latmid)
	print " latmidsize = ",latmidsize
	nsize = len(lonmid)
	print " nsize = ",nsize
	
	#  Extract GOM-located subset of U.
	uwind = ncifile.variables['U_GRD_221_HTGL'][:]
	uflat = uwind[:,:].ravel()
	umid = uflat[lonlatrange]	

	#  Create the output file dimensions "lon" and "lat"
	ncofile.createDimension('points', nsize)
	#  Create the output file variables "lon" and "lat"
	lons = ncofile.createVariable('longitude','f4',('points',))
	lats = ncofile.createVariable('latitude','f4',('points',))
	#  Create the output file variable "Uwind"
	Uwind = ncofile.createVariable('Uwind','f4',('time','points',))

	lons.long_name = "longitude"
	lons.standard_name = "longitude"
	lons.units = "degrees_east"

	lats.long_name = "latitude"
	lats.standard_name = "latitude"
	lats.units = "degrees_north"

	from time import strftime
	timestamp = strftime("%Y-%m-%d %H:%M:%S")
	history = "Subset created on " + timestamp

	ncofile.title = "NARR Project Data Subset for Gulf of Mexico ROMS Simulations"
	ncofile.institution = "US National Weather Service - NCEP (WMC)"
	ncofile.source = "North American Regional Reanalysis (NARR) Project"
	ncofile.history = history
	ncofile.note1 = "U and V components of vector quantities are resolved relative to earth NOT grid"
	
	Uwind.units = "m s-1"
	Uwind.long_name = "u-component of wind"
	Uwind.standard_name = "eastward_wind"
	Uwind.center = "PONG Group, TAMU Dept. of Oceanography"
	Uwind.origin = "The North American Regional Reanalysis (NARR) Project"
	Uwind.initial_time = initial_time
	Uwind.model = "North American Regional Reanalysis (NARR)"
	Uwind.NARR_GRIB_parameter_name = "UGRD"
	Uwind.NARR_GRIB_parameter_level = "10_m_above_ground"
	Uwind.geographical_subset = "98W to 80W, 18N to 31N"

	lons[:] = lonmid
	lats[:] = latmid
	Uwind[0,:] = umid

	ncofile.close()
	ncifile.close()
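The filename-parsing step inside the loop can be tried in isolation. A standalone sketch (the input filename is hypothetical; UGRD-10m is the variable tag set in the listing):

```python
# A hypothetical input filename of the form produced by grib2nc4.
filename = 'narr-a_221_19990101_0000_000.sub.grb.nc'

prefix = 'narr'
vname = 'UGRD-10m'   # variable tag chosen in the listing

filestr = filename.split('_')   # ['narr-a', '221', '19990101', '0000', '000.sub.grb.nc']
hrst = filestr[3][0:2]          # hour of day, '00'
filenew = prefix + '-' + vname + '-' + filestr[2] + hrst + '.nc'
print(filenew)  # narr-UGRD-10m-1999010100.nc
```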


Two steps remain to be done:

  • Converting the units of fields from GRIB units to ROMS units.

  • Compiling the 3-hourly files into daily/weekly/? length files.



    GRIDS AND GRIB

    NCEP NARR (North American Regional Reanalysis)


    ROMS Forcing Package

    If the ocean-atmosphere boundary layer is activated, the user needs to provide the following atmospheric fields. The required fields, along with the corresponding NARR field parameters, are given in the following table:

    Table 6 - ROMS Variables Required for Active Ocean-Atmosphere Boundary Layer

    Field                          ROMS var.  ROMS units  NARR #  WGRIB abbrev.  Name & NARR units                    Level/layer type
    surface u-wind component       Uair       m/s         293     UGRD           u wind [m/s]                         10 m above gnd
    surface v-wind component       Vair       m/s         294     VGRD           v wind [m/s]                         10 m above gnd
    surface air temperature        Tair       C           267     TMP            Temperature [K]                      sfc
    surface air pressure           Pair       mb          265     PRES           Pressure [Pa]                        sfc
    surface air relative humidity  Qair       %           291     RH             Relative humidity [%]                2 m above gnd
    cloud fraction                 cloud      0 to 1      417     TCDC           Total cloud cover [%]                atmos col
    rain fall rate                 rain       kg/m^2/s    381     PRATE          Precipitation rate [kg/m^2/s]        sfc
    shortwave radiation flux       swrad      W/m^2       420     DSWRF          Downward SW radiation flux [W/m^2]   sfc
    shortwave radiation flux       swrad      W/m^2       422     USWRF          Upward SW radiation flux [W/m^2]     sfc

    Conversions:

    1 mb = 100 Pa
    1 deg. C = 274.15 deg. K

    Needed Conversions

    cloud = TCDC/100.
    Tair(C) = TMP(K) - 273.15
    Pair(mb) = PRES(Pa)/100.
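    These conversions are simple enough to collect into one helper. A sketch; the function name narr_to_roms is mine, not part of any package:

```python
def narr_to_roms(tcdc_percent, tmp_kelvin, pres_pa):
    """Apply the NARR-to-ROMS unit conversions listed above.
    (Hypothetical helper; not part of any package.)"""
    cloud = tcdc_percent / 100.0   # total cloud cover, % -> fraction (0 to 1)
    tair = tmp_kelvin - 273.15     # temperature, K -> deg C
    pair = pres_pa / 100.0         # pressure, Pa -> mb
    return cloud, tair, pair

print(narr_to_roms(75.0, 288.15, 101325.0))  # cloud 0.75, Tair ~15 C, Pair 1013.25 mb
```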



    Introduction to proj4 and Related Lambert Projection Information

    Useful proj4 links:

    To obtain information about the Lambert Conformal Conic projection:

    proj -l=lcc
          lcc : Lambert Conformal Conic
            Conic, Sph&Ell
            lat_1= and lat_2= or lat_0
    

    To check the corner points of GRIB grid #221:

    proj +proj=lcc +lat_1=50.0 +lat_2=50.0 +units=km +lon_0=-145.5 +lat_0=1.0
    -145.5 1
    0.00    0.00
    148.639 46.635
    -4423.64        7779.44
    -2.566 46.352
    5437.68 13357.10
    -68.318 0.897
    9825.53 5557.50
    
    invproj +proj=lcc +lat_1=50.0 +lat_2=50.0 +units=km +lon_0=-145.5 +lat_0=1.0
    0.0 0.0
    145d30'W        1dN
    -4423.64 7779.44
    148d38'20.337"E 46d38'6.196"N
    5437.68 13357.10
    2d33'57.438"W   46d21'7.298"N
    9825.53 5557.50
    68d19'4.831"W   0d53'49.07"N
    



    GMT

    The GMT commands are:

    −Jllon0/lat0/lat1/lat2/scale or −JLlon0/lat0/lat1/lat2/width 
    (Lambert [C])
        Give origin lon0/lat0, two standard parallels lat1/lat2, and
        scale along these (1:xxxx or UNIT/degree).
    

    psbasemap -Jl-145.5/1.0/50.0/50.0 -Rr-145.5/1.0/-2.6/40.4
    



    Getting the NARR Files Ready for the TABS/TGLO System

    The step by step instructions on how to get from here to there.

    Obtaining the Grid Point Information

    The GRIB files containing the reanalyzed model output do not contain the longitudes and latitudes of the grid points. The GRIB grid number is given, and you're apparently supposed to use whatever information you can find about the given projection and its parameters to generate the lon/lat information. Fortunately, the NARR folks have supplied another, separate GRIB file containing the lon/lat information.

    A GRIB file containing grid point information can be found at:

    http://wwwt.emc.ncep.noaa.gov/mmb/rreanl/index.html

    with the direct link being:

    http://wwwt.emc.ncep.noaa.gov/mmb/rreanl/AWIP32.fixed

    This GRIB file can be converted to a NetCDF 4 file using a program from the netcdf4-python package available at:

    http://code.google.com/p/netcdf4-python/.

    The command to do this is:

    grib2nc4 AWIP32.fixed

    which will create the NetCDF 4 file:

    AWIP32.fixed.nc

    The contents of this file can be perused in ASCII format via the ncdump command from the latest version of NetCDF4, which I've aliased to ncdump4, e.g.

    ncdump4 AWIP32.fixed.nc | more

    The lon/lat grid meta-info found this way is as follows:

    dimensions:
            gridy_221 = 349 ;
            gridx_221 = 277 ;
    ...
            float gridlat_221(gridx_221, gridy_221) ;
                    gridlat_221:least_significant_digit = 7L ;
                    gridlat_221:corners = 1.f, 0.8972797f, 46.35283f, 46.63458f ;
                    gridlat_221:grid_description = "AWIPS - Regional - NOAMHI - High Resolution North American Master Grid (Lambert Conformal)" ;
                    gridlat_221:mpProjection = "LAMBERTCONFORMAL" ;
                    gridlat_221:mpLambertParallel2F = 50.f ;
                    gridlat_221:long_name = "latitude" ;
                    gridlat_221:Lo1 = 214.5f ;
                    gridlat_221:Lov = 253.f ;
                    gridlat_221:mpLambertMeridianF = 253.f ;
                    gridlat_221:Dx = 32.46341f ;
                    gridlat_221:mpLambertParallel1F = 50.f ;
                    gridlat_221:units = "degrees_north" ;
                    gridlat_221:Dy = 32.46341f ;
                    gridlat_221:La1 = 1.f ;
            float gridlon_221(gridx_221, gridy_221) ;
                    gridlon_221:least_significant_digit = 7L ;
                    gridlon_221:Latin1 = 50.f ;
                    gridlon_221:Latin2 = 50.f ;
                    gridlon_221:corners = -145.5f, -68.31888f, -2.567342f, 148.6397f ;
                    gridlon_221:grid_description = "AWIPS - Regional - NOAMHI - High Resolution North American Master Grid (Lambert Conformal)" ;
                    gridlon_221:mpProjection = "LAMBERTCONFORMAL" ;
                    gridlon_221:mpLambertParallel2F = 50.f ;
                    gridlon_221:long_name = "longitude" ;
                    gridlon_221:Lo1 = 214.5f ;
                    gridlon_221:Lov = 253.f ;
                    gridlon_221:mpLambertMeridianF = 253.f ;
                    gridlon_221:Dx = 32.46341f ;
                    gridlon_221:mpLambertParallel1F = 50.f ;
                    gridlon_221:units = "degrees_east" ;
                    gridlon_221:Dy = 32.46341f ;
                    gridlon_221:La1 = 1.f ;
    

    Obtaining the NARR GRIB Files

    Getting the NARR data from NCDC can be easily accomplished via the Perl script get-httpsubset.pl which can be obtained from:

    http://nomads.ncdc.noaa.gov/ncdc-ui/get-httpsubset.pl

    This requires two additional Perl scripts - get_inv.pl and get_grib.pl - which can be obtained from the page:

    http://www.cpc.ncep.noaa.gov/products/wesley/fast_downloading_grib.html

    When get-httpsubset.pl is run with no arguments, the following information is obtained:

    Started : ./get-httpsubset.pl Version v 1.3.5  Feb 13, 2007
    
        *> Inadequate Input Parameters <*
    
    Usage:
    ./get-httpsubset.pl interactive         // Prompts user for input parameters
            OR:
    ./get-httpsubset.pl      [NARR set] [OPTIONS]
    
            YYYYMMDDHH      Date-Time group of start and end of desired date range
                    ex: 2000120100 2001013121
                                 - Available Range: 1979010100 -> 2003123121
                                   HH can be (00 03 06 09 12 15 18 21)
    
            GRIB Parameter name     List of parameters to subset
                    ex: TMP      - ALL for all fields, same for Levels
                                   VAR1-VARn for multiple fields, dash to seperate
    
            Levels                  List of vertical levels to subset
                    ex: 500_mb   - substitute underscores for spaces
    
            Output path             Absolute path to dump the results
                    /home/user/data/narr
    
      [Opt] Data set = currently either: narr-a, narr-b narrmon-a narrmon-b
                                         narrmonhr-a narrmonhr-b
                                         nam gfs namanl gfsanl ruc20
                    Default = narr-a
    
        [OPTIONS]
            -nocheck        Do not check for dependancy programs and files
                            (not recommended)
            --output-name-scheme=complex
                            Name output files with descriptive, long filenames
                            Also creates /YYYYMM/YYYYMMDD heirachy
    
    
            *Note*  Avoid using ALL for both Levels and Params,
                doing so will retrieve the entire file (54 Mb for narr-a)
    

    An example of the use of get-httpsubset.pl, wherein we want to obtain the UGRD variable (from Table 6) at 10 m above the ground on Jan. 1, 1999 at 0000 hours UTC, would be:

    ./get-httpsubset.pl 1999010100 1999010100 UGRD 10_m_above_gnd ./

    This will create a subdirectory 199901 containing the GRIB file:

    narr-a_221_19990101_0000_000.sub.grb
    
    containing our desired field. To obtain all eight available times from that day (00 through 21 UTC, every 3 hours), we would use:

    ./get-httpsubset.pl 1999010100 1999010121 UGRD 10_m_above_gnd ./

    which would give, in subdirectory 199901:

    narr-a_221_19990101_0000_000.sub.grb
    narr-a_221_19990101_0300_000.sub.grb
    narr-a_221_19990101_0600_000.sub.grb
    narr-a_221_19990101_0900_000.sub.grb
    narr-a_221_19990101_1200_000.sub.grb
    narr-a_221_19990101_1500_000.sub.grb
    narr-a_221_19990101_1800_000.sub.grb
    narr-a_221_19990101_2100_000.sub.grb
    

    Using the netcdf-python Module

    Create a test netCDF file test.nc via:

    cd /home/baum/NOMADS/199901
    cp narr-a_221_19990101_0000_000.sub.grb.nc test.nc
    
    Start Python and load the netCDF4 module:
    [baum@rossby 199901]$ python
    Python 2.4.2 (#1, Apr  9 2006, 11:50:00)
    [GCC 4.1.0 20060304 (Red Hat 4.1.0-3)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import netCDF4
    
    Open the existing file test.nc as object g using r+ for appending new items, and then print the dimensions that exist in the file.
    >>> g = netCDF4.Dataset('test.nc','r+')
    >>> print g.dimensions
    {'gridy_221': <netCDF4.Dimension object at 0x...>, 'gridx_221': <netCDF4.Dimension object at 0x...>}
    
    Create a new unlimited dimension (via None) time, and then print the list of dimensions in the file.
    >>> g.createDimension('time',None)
    >>> print g.dimensions
    {'gridy_221': <netCDF4.Dimension object at 0x...>, 'gridx_221': <netCDF4.Dimension object at 0x...>, 'time': <netCDF4.Dimension object at 0x...>}
    
    Print the list of variables:
    >>> print g.variables
    {'gridrot_221': <netCDF4.Variable object at 0x...>, 'gridlon_221': <netCDF4.Variable object at 0x...>, 'gridlat_221': <netCDF4.Variable object at 0x...>, 'U_GRD_221_HTGL': <netCDF4.Variable object at 0x...>}
    
    Create a new variable time and then print the list of variables again:
    >>> times = g.createVariable('time','f8',('time',))
    >>> print g.variables
    {'gridrot_221': <netCDF4.Variable object at 0x...>, 'gridlon_221': <netCDF4.Variable object at 0x...>, 'time': <netCDF4.Variable object at 0x...>, 'gridlat_221': <netCDF4.Variable object at 0x...>, 'U_GRD_221_HTGL': <netCDF4.Variable object at 0x...>}
    
    Add a global attribute to the netCDF file, and a variable attribute to time, and then print the global attribute name/value pairs:
    >>> g.description = 'bogus example'
    >>> times.units = 'seconds'
    >>> print g.__dict__
    {'description': 'bogus example'}
    
    Delete the global attribute.
    >>> del g.description
    >>> print g.__dict__
    {}
    
    Delete the time variable attribute units.
    >>> del times.units
    
    Find the value of a dimension.
    >>> ngridy = len(g.dimensions['gridy_221'])
    >>> ngridx = len(g.dimensions['gridx_221'])
    >>> ngridx,ngridy
    (277, 349)
    
    Read an array into a numpy variable from a NetCDF4 variable:
    >>> gridlon = g.variables['gridlon_221'][:]
    >>> gridlat = g.variables['gridlat_221'][:]
    >>> u = g.variables['U_GRD_221_HTGL'][:]
    >>> u
    array([[  -4.60357666,   -4.60357666,   -4.68170166, ..., -999.        ,
            -999.        , -999.        ],
           [  -4.67388916,   -4.52545166,   -4.52545166, ..., -999.        ,
            -999.        , -999.        ],
           [  -4.62701416,   -4.62701416,   -4.40045166, ..., -999.        ,
            -999.        , -999.        ],
           ...,
           [-999.        , -999.        , -999.        , ..., -999.        ,
            -999.        , -999.        ],
           [-999.        , -999.        , -999.        , ..., -999.        ,
            -999.        , -999.        ],
           [-999.        , -999.        , -999.        , ..., -999.        ,
            -999.        , -999.        ]], dtype=float32)
    
    Deal with dates, times, seconds since..., etc.
    >>> from datetime import datetime, timedelta
    >>> from netCDF4 import num2date, date2num
    >>> times.units = 'hours since January 1, 0001'
    >>> times.calendar = 'gregorian'
    >>> dates = [datetime(2001,3,1)+n*timedelta(hours=12) for n in range(5)]
    >>> times[:] = date2num(dates,units=times.units,calendar=times.calendar)
    >>> print 'time values (in units %s): ' % times.units+'\n',times[:]
    time values (in units hours since January 1, 0001): 
    [ 17533056.  17533068.  17533080.  17533092.  17533104.]
    >>> dates = num2date(times[:],units=times.units,calendar=times.calendar)
    >>> print 'dates corresponding to time values:\n',dates
    dates corresponding to time values:
    [2001-03-01 00:00:00 2001-03-01 12:00:00 2001-03-02 00:00:00
     2001-03-02 12:00:00 2001-03-03 00:00:00]
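    The date2num/num2date round trip can be mimicked for a simpler 'hours since 1970-01-01' case using only the standard library, which makes the bookkeeping explicit:

```python
from datetime import datetime, timedelta

# Five 12-hourly dates, as in the session above.
dates = [datetime(2001, 3, 1) + n * timedelta(hours=12) for n in range(5)]

# date2num equivalent for 'hours since 1970-01-01 00:00:00'.
epoch = datetime(1970, 1, 1)
nums = [(d - epoch).total_seconds() / 3600.0 for d in dates]

# num2date equivalent: add the hours back onto the epoch.
back = [epoch + timedelta(hours=h) for h in nums]
print(back == dates)  # True -- the round trip is exact here
```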