FLEXPART Model/MODIShotspotdata

From mn/geo/geoit

Acronyms:

 lc - land cover
 hs - hot spots
 ba - burnt area
 MRT - MODIS Reprojection Tool


Terminology:

 Subdomain - the geographical area of interest, represented by a rectangle in lat-lon coordinate space, over which emission estimates will be calculated.
             It is used instead of the global domain because MRT has difficulties when operating with large (>2Gb) files. 

Directories:

 Working directory - /amos_vol2/nilu2/home/zva. This is the default directory for running all scripts and subroutines.
 MODIS raw data directory - /xnilu_wrk/data.
 MODIS data working directory - /amos_vol2/nilu2/home/zva/MODIS_DATA.
 MODIS Reprojection Tool directory - /amos_vol2/nilu2/home/zva/MODIS_MRT/bin.
 HDF4 home directory - /amos_vol2/nilu2/home/zva/HDF4/bin.

Links:

 Main webpage - http://modis-fire.umd.edu/data.asp - Links to ftp sites and user's guides for the data sets are provided here.
                However, links are occasionally invalid.
 Hot spots data from Terra: ftp://e4ftl01u.ecs.nasa.gov/MOLT/MOD14A1.005/
 Hot spots data from Aqua:  ftp://e4ftl01u.ecs.nasa.gov/MOLA/MYD14A1.005/
 Burned area data: ftp://ba1.geog.umd.edu   or   ftp://e4ftl01u.ecs.nasa.gov/MOTA/MCD45A1.005/
 These data links also change from time to time, but they usually work.



I. Prepare MODIS hot spots and burnt area data. For hs, two datasets are needed: the first from the Terra satellite, the second from Aqua.


1) Download hs data tiles for the 8-day period of interest into the directories:

      /xnilu_wrk/data/modis_hotspots_terra ('terra' directory)
      /xnilu_wrk/data/modis_hotspots_aqua  ('aqua'  directory)
 To download data, modify and run the ./gethdf script in each of the directories. The directory structure on the MODIS ftp server changes
 from time to time, so the script has to be modified regularly. 
 As an example, suppose we are interested in the 8-day period starting on 28.07.2007. The hs files are then located in the ./2007.07.28
 subdirectories of the 'terra' and 'aqua' directories.
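The directory date maps onto the day-of-year tag embedded in standard MODIS file names (the AYYYYDDD part of names like MOD14A1.A2007209.h20v03.005.*.hdf). A minimal sketch of that date arithmetic; the helper name is ours, only the naming convention itself is assumed:

```python
from datetime import date

def modis_doy_tag(d: date) -> str:
    """Return the MODIS 'AYYYYDDD' acquisition-date tag for a calendar date."""
    return "A%04d%03d" % (d.year, d.timetuple().tm_yday)

# The 8-day period starting 28.07.2007 corresponds to day-of-year 209,
# so its tiles carry the date tag A2007209 in their file names.
tag = modis_doy_tag(date(2007, 7, 28))
```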


2) Merge the data tiles for the region of interest into a single file, using the 'mrtmosaic' tool. Suppose we are interested in the Russian region;
 we then need to mosaic the tiles with coordinates (h19-h28, v01-v05) in the MODIS tile grid system (see the MODIS Fire Users Guide for information
 about tile numbering).

2.1 Prepare the list of files for mosaicking. Use the 'lp' script:

  cd /xnilu_wrk/data/modis_hotspots_terra/2007.07.28
 In the ./lp script, set the range of tile numbers (h1=19, h2=28, v1=1, v2=5). Then run the script and copy the resulting ./list_of_paths file
 into the MRT directory:
  ./lp
  cp list_of_paths /amos_vol2/nilu2/home/zva/MODIS_MRT/bin/list_of_paths
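The ./lp script itself is not reproduced here; the selection it performs can be sketched as follows. This is a hypothetical stand-in that filters file names by their hNNvNN tile id, not the actual script:

```python
import re

def select_tiles(filenames, h1=19, h2=28, v1=1, v2=5):
    """Keep only the HDF tiles whose grid coordinates fall in [h1,h2] x [v1,v2]."""
    keep = []
    for name in sorted(filenames):
        # MODIS tile ids are embedded in the file name as ".hNNvNN."
        m = re.search(r"\.h(\d\d)v(\d\d)\.", name)
        if m and h1 <= int(m.group(1)) <= h2 and v1 <= int(m.group(2)) <= v2:
            keep.append(name)
    return keep

tiles = select_tiles(["MOD14A1.A2007209.h19v03.005.hdf",
                      "MOD14A1.A2007209.h10v09.005.hdf"])
```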

2.2 Run the mosaicking script from the MRT directory:

  cd /amos_vol2/nilu2/home/zva/MODIS_MRT/bin
  nohup ./mrtmosaic -i list_of_paths -o hstm.hdf -s "1 1 1 1 1 1 1 1" > & mrtmosaic.log1 &
 The script merges the files from list_of_paths into a single file, hstm.hdf. The -s flag and the following string indicate that only the 8 FireMask
 fields will be merged. So, the output file hstm.hdf contains only the fire masks for 8 consecutive days, and no other data.


3) The MODIS data is originally in the Sinusoidal (SIN) projection, so you need to reproject it into simple lat-lon coordinates (geographical
 projection). Use the 'resample' tool.

3.1 In the MRT directory, open the geo.prm file. It should contain the following text:

  INPUT_FILENAME = hstm.hdf
  SPECTRAL_SUBSET = ( 1 1 1 1 1 1 1 1 )
  SPATIAL_SUBSET_TYPE = INPUT_LAT_LONG
  SPATIAL_SUBSET_UL_CORNER = ( 80.0 10.0 )
  SPATIAL_SUBSET_LR_CORNER = ( 30.0 180.0 )
  OUTPUT_FILENAME = hstr.hdf
  OUTPUT_PROJECTION_TYPE = GEO
  OUTPUT_PROJECTION_PARAMETERS = ( 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 )
  OUTPUT_PIXEL_SIZE = 0.00833333334
 Two parameters - SPATIAL_SUBSET_UL_CORNER and SPATIAL_SUBSET_LR_CORNER - specify the bounding rectangle (in geographical coordinates) for the output
 data set. As the data set in the hstm.hdf file actually covers a parallelogram-like figure in geographical coordinate space, there will be empty areas
 in the reprojected data set. They will be automatically filled with zeroes. Note that OUTPUT_PIXEL_SIZE must conform with the pixel size in the land
 cover data. For Hansen's (2000) dataset it is 1 km, or 0.00833333334 degrees.
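The subdomain grid dimensions quoted later in Sec. I.5 follow directly from the corner coordinates and OUTPUT_PIXEL_SIZE. A quick sketch of the arithmetic; the function is ours, for illustration only:

```python
def grid_shape(ul, lr, pixel):
    """Rows and columns of the reprojected lat-lon grid.

    ul and lr are the (lat, lon) upper-left / lower-right corners."""
    rows = round((ul[0] - lr[0]) / pixel)
    cols = round((lr[1] - ul[1]) / pixel)
    return rows, cols

# Hot spots grid for the Russian subdomain defined in geo.prm:
hs_shape = grid_shape((80.0, 10.0), (30.0, 180.0), 0.00833333334)
# Burnt area grid, at half the pixel size:
ba_shape = grid_shape((80.0, 10.0), (30.0, 180.0), 0.004166667)
```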

3.2 In the MRT directory, run

  nohup ./resample -p geo.prm -i hstm.hdf -o hstr.hdf > & resample.log1 &
 As a result, you will have the hstr.hdf file containing the hot spots seen by Terra, for 8 consecutive days, over the rectangular subdomain in
 lat-lon coordinates. After resampling, copy the resulting file to the MODIS data working directory. All further operations on the data take place
 in this directory.

4. Convert the data into raw binary format. Use the 'hdp' utility. You only need to modify and run the following scripts in the working directory:

  hdfdump_hs.bs - to dump hot spots data
  hdfdump_ba.bs - to dump burnt area data
 The first script should take the hstr.hdf file as input and dump its contents into 8 files named hstr_0x.dat, where 0x=01,02,...,08 is the number of the day.
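The dumped files are headerless byte streams, which makes them easy to read back for checking. A sketch, assuming one unsigned byte per pixel (matching the 8-bit MOD14A1 FireMask); the demo file and its values are synthetic:

```python
import array, os, tempfile

def read_fire_mask(path, rows, cols):
    """Read a headerless raw-binary fire mask dump, one unsigned byte per pixel."""
    data = array.array("B")
    with open(path, "rb") as f:
        data.fromfile(f, rows * cols)
    return data

# Synthetic 2x3 "dump". In the MOD14 fire mask, classes 7-9 are fire pixels
# (7 = low, 8 = nominal, 9 = high confidence); 5 is non-fire land.
path = os.path.join(tempfile.mkdtemp(), "demo.dat")
with open(path, "wb") as f:
    f.write(bytes([5, 5, 8, 9, 5, 7]))
mask = read_fire_mask(path, 2, 3)
n_fire = sum(1 for v in mask if v >= 7)
```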

5. Repeat steps 1-4 for the Aqua hot spots and the Terra burnt area data sets. Note that OUTPUT_PIXEL_SIZE in the geo.prm file for the ba data should
 be equal to 0.004166667 degrees, i.e. half the pixel size of the land cover data.
 As a result, the MODIS data working directory should contain the following files in raw binary format:
  hstr_01.dat        -|
  hstr_02.dat         |
  ...                 |   - hot spots by Terra
  hstr_08.dat        -|
  hsar_01.dat        -|
  hsar_02.dat         |
  ...                 |   - hot spots by Aqua
  hsar_08.dat        -|
  bar.dat             |   - burned area for the month of interest by Terra


 For the selected spatial subdomain, each hs* file contains 6000 rows and 20400 columns of data. The bar.dat file contains 12000 rows and 40800 columns,
 as the spatial resolution of the ba data is two times higher than that of the hs data. All the data files are in raw binary format without any header.


II. Prepare the land cover data. For efficiency reasons, it makes sense to cut the subdomain out of the land cover data. Use the 'select_lc_subdom' module:

  vi select_lc_subdom.f90
  make -f makeselectlcsubdom
  ./select_lc_subdom.exe
  The subdomain boundaries are specified with the following parameters in the select_lc_subdom.f90 file:
   blat,tlat - bottom and top latitudes,
   llon,rlon - left and right longitudes of the subdomain rectangle.
 For the Russian subregion, the parameter values are blat=30.d0, tlat=80.d0, llon=10.d0, rlon=180.d0. The resulting raw binary file is written to the MODIS data working directory:
  lc_subdom.dat
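The cut-out amounts to converting the lat-lon bounds into row/column indices of the global land cover grid. A sketch of that conversion, assuming the global grid starts at (90N, 180W) with row indices growing southward; check this layout assumption against what select_lc_subdom.f90 actually does:

```python
def subdomain_indices(blat, tlat, llon, rlon, pixel, lat0=90.0, lon0=-180.0):
    """0-based first/last row and column of the subdomain in the global grid.

    Assumes the global grid origin (lat0, lon0) is the north-west corner."""
    r0 = round((lat0 - tlat) / pixel)          # top edge
    r1 = round((lat0 - blat) / pixel) - 1      # bottom edge
    c0 = round((llon - lon0) / pixel)          # left edge
    c1 = round((rlon - lon0) / pixel) - 1      # right edge
    return r0, r1, c0, c1

# Russian subregion in the 1-km (0.00833333334 deg) land cover grid:
idx = subdomain_indices(30.0, 80.0, 10.0, 180.0, 0.00833333334)
```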


III. Merge the hs and lc data. Use the 'merge_hs_and_lc' module:

  vi merge_hs_and_lc.f90
  make -f makemergehsandlc
  ./merge_hs_and_lc.exe
 Input files are the 16 hs data files listed in Sec. I.5, plus the land cover subdomain file:
  hstr_01.dat              
  ...                
  hstr_08.dat        
  hsar_01.dat               
  ...                
  hsar_08.dat    
  lc_subdom.dat 
 As output, the module generates 8 files with land cover data for the grid cells where fire hot spots with nominal or high confidence were detected
 by the Terra OR Aqua satellite. Output files:
  hslc_01.dat
  ...
  hslc_08.dat
 All input and output files are in the MODIS data working directory.
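The merging criterion can be sketched per pixel as follows. Flat lists stand in for the real 6000x20400 arrays, the NODATA fill value is our assumption, and the thresholds rely on MOD14 fire mask classes 8/9 being nominal/high confidence:

```python
FIRE_NOMINAL = 8   # MOD14 fire mask: 8 = nominal, 9 = high confidence fire
NODATA = 0

def merge_hs_and_lc(hs_terra, hs_aqua, lc):
    """Keep the land cover value only where Terra OR Aqua saw a fire with
    nominal or high confidence; write NODATA everywhere else."""
    out = []
    for t, a, c in zip(hs_terra, hs_aqua, lc):
        fire = t >= FIRE_NOMINAL or a >= FIRE_NOMINAL
        out.append(c if fire else NODATA)
    return out

# 4-pixel demo: pixel 3 has only a low-confidence (7) Aqua detection.
hslc = merge_hs_and_lc([5, 8, 5, 9], [5, 5, 7, 5], [1, 2, 3, 4])
```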


IV. All the data needed to perform the emission estimates is now prepared. Use the 'femis' module:

  make -f makefemis
  ./femis.exe
 Input files:
  hslc_01.dat
  ...
  hslc_08.dat
  bar.dat
 Output files contain CO emission fluxes [kg/(m^2 s)] for the 8 consecutive days:
  co_01.dat
  ...
  co_08.dat
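The internal formulation of femis is not reproduced here; estimates of this kind commonly follow the Seiler and Crutzen (1980) form M = A * B * beta * EF (burned area, fuel load, combustion completeness, emission factor). A hypothetical sketch with purely illustrative parameter values, not the module's actual constants:

```python
def co_flux(burned_area_m2, fuel_load_kg_m2, combustion_completeness,
            ef_co_g_per_kg, cell_area_m2, seconds=86400.0):
    """CO emission flux [kg/(m^2 s)] for one grid cell over one day, from a
    generic Seiler-Crutzen-type estimate M = A * B * beta * EF."""
    emitted_kg = (burned_area_m2 * fuel_load_kg_m2 * combustion_completeness
                  * ef_co_g_per_kg / 1000.0)
    return emitted_kg / (cell_area_m2 * seconds)

# Illustrative values: 1 km^2 burned, 2.5 kg/m^2 fuel, 40% combusted,
# EF(CO) = 100 g/kg, in a 1 km^2 grid cell, over one day.
flux = co_flux(1.0e6, 2.5, 0.4, 100.0, 1.0e6)
```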