MODIS

From GRASS-Wiki
* [http://modis.gsfc.nasa.gov/ About MODIS]


There are two satellites, Aqua and Terra, which carry the MODIS sensor as payload. The Moderate Resolution Imaging Spectroradiometer (MODIS) is a 36-channel sensor covering the visible to thermal-infrared range; it was launched as part of the Terra satellite payload in December 1999 and of the Aqua satellite payload in May 2002. The Terra satellite passes twice a day (at about 10:30 and 22:30 local time), as does the Aqua satellite (at about 01:30 and 13:30 local time).


== MODIS Aqua ==
'''Details about the sensor'''
(compiled from various sources)
 
MODIS (MODerate resolution Imaging Spectroradiometer) is a whiskbroom multi-spectral sensor (across-track scanner) on board the satellites Terra and Aqua that measures 36 spectral bands of light. Its swath is 2330 km (across track) by 10° of latitude (along track at nadir). Its circular, sun-synchronous, near-polar orbit takes about 99 minutes, which means that the same region is acquired several times during daylight. However, at nadir, observations of the same region are taken every 16 days. A complete earth scan is achieved every 1.2 days. Due to its large swath, geometric distortions occur at the scan edges. The pixel size grows with the scan angle, by a factor of 8 at the edge of scan, making the shape of each scan-stripe look like a bow-tie. Consequently, the scan-stripes, being wider at the edges of the scan, overlap each other. This overlap creates a double-vision effect at the edges of the scan, as each point of the Earth's surface appears in two adjacent scans. This distortion emphasises the line where scan-stripes meet and thus magnifies small differences between the sides of the scan mirror.
 
MODIS Tile Calculator: https://landweb.modaps.eosdis.nasa.gov/cgi-bin/developer/tilemap.cgi
 
== MODIS Land ==
 
MODIS Land provides many data products including:
* Daily Surface Reflectance
* Snow coverage
* NDVI / EVI Vegetation index
* Land Surface Temperature (LST)
* Atmosphere Daily Global Products
* many more
 
Most products are created separately for the different MODIS sensors onboard the TERRA (MOD products) and AQUA (MYD products) satellites. Many products are also available as a combined MODIS product (MCD products).
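The prefix convention can be decoded mechanically. A small helper sketch (the function name is ours, not part of any GRASS tool):

```bash
# Map a MODIS product name to its source satellite(s) by prefix.
satellite_of() {
  case "$1" in
    MOD*) echo "Terra" ;;
    MYD*) echo "Aqua" ;;
    MCD*) echo "Combined Terra and Aqua" ;;
    *)    echo "unknown" ;;
  esac
}

satellite_of MOD13Q1   # Terra
satellite_of MYD11A1   # Aqua
```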
 
See [http://modis-land.gsfc.nasa.gov/ MODIS Land Team Home Page]
 
=== MODIS data download: installation of convenient GRASS GIS addon i.modis ===
 
For a convenient download addon, see {{AddonCmd|i.modis}}, a toolset for downloading and processing of MODIS products using pyModis.
 
Installation of pyModis:
 
<source lang="bash">
# either
pip install pyModis
 
# or
sudo pip install pyModis
</source>
 
Installation of i.modis toolset:
 
<source lang="bash">
# in a GRASS GIS session
g.extension i.modis
</source>
 
Now {{AddonCmd|i.modis}}, {{AddonCmd|i.modis.download}}, and {{AddonCmd|i.modis.import}} are locally installed.
 
==== Download of MODIS tiles: i.modis.download ====
 
For obtaining the needed NASA credentials, see [http://www.pymodis.org/info.html#requirements here].
 
Example: download of Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua):
 
<source lang="bash">
##### preparation: first time only
# store NASA username and password in two lines
echo "myusername
xxxxxx" > ~/.imodis
 
# create new space-time cube
t.create type=strds temporaltype=absolute output=ndvi_16_5600m title="Global NDVI 16 days MOD13C1" \
        description="MOD13C1 Global NDVI 16 days" semantictype=mean
##### END first time only
 
MODIS_TMPFOLDER=/path/to/modis/ndvi_MOD13C1.006/
 
# fetch list of available MODIS data up to today
i.modis.download settings=~/.imodis product=ndvi_terra_sixteen_5600 \
startday=2015-01-01 endday=`date +"%Y-%m-%d"` folder=$MODIS_TMPFOLDER
</source>
 
The downloaded HDF files can then be imported with {{AddonCmd|i.modis.import}}.
 
==== Download of MODIS tiles: modis_download.py ====
 
As an alternative to {{AddonCmd|i.modis.download}}, the pyModis tool [http://www.pymodis.org/scripts/modis_download.html modis_download.py] can be used directly. Be sure to have created the needed <tt>.netrc</tt> file with your NASA credentials (see [http://www.pymodis.org/info.html#requirements here] for details):
 
<source lang="bash">
# example: Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua)
mkdir -p ndvi_MOD13C1.006
 
modis_download.py -e 2015-10-01 -f 2016-01-31 --product=MOD13C1.006 ndvi_MOD13C1.006/
</source>
 
The downloaded HDF files can then be imported with {{AddonCmd|i.modis.import}}.
 
=== MODIS Land Products Quality Control ===
 
A number of MODIS Land products include a quality subdataset that is bit-encoded: different bit ranges represent different quality assessments.
 
The general formula to extract a specific bit range, for start bit = sb (the first start bit is 0) and number of bits = nb, is: divide the quality value by 2 to the power of sb, then take the remainder modulo 2 to the power of nb. As an r.mapcalc expression using standard math:
 
bitfield = (quality / (2 ^ sb)) % (2 ^ nb)
 
or in bit operation notation:
 
bitfield = (quality >> sb) & ((1 << nb) - 1)
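To convince yourself that the two notations are equivalent, they can be checked with plain shell arithmetic (the quality value below is invented; sb=6 and nb=2 correspond to a two-bit field starting at bit 6):

```bash
# Both forms of the bit-range extraction, on an invented quality value.
quality=133   # binary 10000101 (example only)
sb=6          # start bit
nb=2          # number of bits from start bit

arith=$(( (quality / (2 ** sb)) % (2 ** nb) ))    # division/modulus form
bitop=$(( (quality >> sb) & ((1 << nb) - 1) ))    # shift/mask form

echo "$arith $bitop"   # both print 2
```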
 
==== MOD09A1/MYD09A1 version 6 ====
 
Surface reflectance of MODIS bands 1 - 7 at 500 meter resolution for every 8 days
 
The name of the quality control subdataset is "sur_refl_qc_500m". The type is unsigned int 32 bit, and the contents are
* bits 0 - 1: MODLAND QA bits
* bits 2 - 5: band 1 data quality
* bits 6 - 9: band 2 data quality
* bits 10 - 13: band 3 data quality
* bits 14 - 17: band 4 data quality
* bits 18 - 21: band 5 data quality
* bits 22 - 25: band 6 data quality
* bits 26 - 29: band 7 data quality
* bit 30: atmospheric correction performed
* bit 31: adjacency correction performed
 
Here, the quality subdataset is of type unsigned int 32 bit, therefore you need to use '>>>' instead of '>>' in r.mapcalc expressions. For example, to extract the MODLAND QA bits, use
 
qa = quality & 3
 
To extract the quality of band 2, use
 
band2_quality = (quality >>> 6) & 15
 
* See the user guide at https://lpdaac.usgs.gov/sites/default/files/public/product_documentation/mod09_user_guide_v1.4.pdf
 
The band quality might indicate lower quality (7 = noisy detector or 8 = dead detector), even though the MODLAND QA bits indicate ideal quality on all bands. On the other hand, MODLAND QA bits sometimes indicate pixel "... not produced for other reasons, some or all bands ...". In this case, the band-specific data quality needs to be checked. That means pixels should only be discarded if both MODLAND QA > 1 and band quality > 8 (maybe band quality > 12 for high tolerance).
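The combined rule can be sketched in plain shell arithmetic (the QC value below is invented; note that bash integers are 64-bit, so a plain right shift is safe here, unlike in r.mapcalc):

```bash
# Discard a pixel only if both MODLAND QA > 1 and the band quality > 8.
quality=963    # invented QC word: MODLAND QA = 3, band 2 quality = 15

modland=$(( quality & 3 ))          # bits 0 - 1
band2=$(( (quality >> 6) & 15 ))    # bits 6 - 9

if [ "$modland" -gt 1 ] && [ "$band2" -gt 8 ] ; then
  verdict=discard
else
  verdict=keep
fi
echo "$verdict"   # discard
```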
 
==== MOD11A1/MYD11A1 version 6 ====
 
Land Surface Temperature (LST) at 1 km resolution, daily
 
The names of the quality control subdatasets are "QC_Day" and "QC_Night". The type is unsigned int 8 bit, and the contents are
* bits 0 - 1: Mandatory QA flag
* bits 2 - 3: Data quality flag
* bits 4 - 5: Emissivity error flag
* bits 6 - 7: LST error flag
 
The nodata (NULL) value is set to 0 (zero) which is not mentioned in the documentation, but set in the hdf file and recognized by GDAL. Unfortunately, the value 0 (zero) also indicates highest quality pixels. In order to preserve pixels with value 0 (zero), the nodata value needs to be changed with "gdalbuildvrt -srcnodata 255 -vrtnodata 255" or with "gdal_translate -a_nodata 255" or after import into GRASS with "r.null null=255". After that, pixels with a QC value of 0 and a valid LST value are of highest quality, while pixels with a QC value of 0 and an invalid LST value can be set to NULL.
 
The most useful bit fields are the "Mandatory QA flag", extracted with
 
qa = quality & 3
 
and the "LST error flag", extracted with
 
lst_error = (quality >> 6) & 3
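Decoding both fields from a sample 8-bit QC value (invented for illustration) with shell arithmetic:

```bash
quality=65    # invented QC_Day value, binary 01000001

qa=$(( quality & 3 ))                 # mandatory QA flag (bits 0 - 1) -> 1
lst_error=$(( (quality >> 6) & 3 ))   # LST error flag (bits 6 - 7)   -> 1

echo "$qa $lst_error"
```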
 
==== MOD13Q1/MYD13Q1 version 6 ====
 
Vegetation indices (NDVI and EVI) at 250 meter resolution for every 16 days
 
The name of the quality control subdataset is "250m 16 days VI Quality". The type is unsigned int 16 bit, and the contents are
* bits 0 - 1: MODLAND QA bits
* bits 2 - 5: VI usefulness
* bits 6 - 7: Aerosol Quantity
* bit 8: Adjacent cloud detected
* bit 9: Atmosphere BRDF Correction
* bit 10: Mixed Clouds
* bits 11 - 13: Land/Water Mask
* bit 14: Possible snow/ice
* bit 15: Possible shadow
 
The VI usefulness bits sometimes indicate 1111 = "Not useful for any other reason/not processed", even for a reasonable VI value and MODLAND QA bits indicating good quality. Therefore the VI usefulness should be used with care, or not at all. The MODLAND QA value extracted with
 
qa = quality & 3
 
is sufficient to filter out bad quality pixels with qa > 1.
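That filter, applied to an invented 16-bit VI Quality value with shell arithmetic:

```bash
quality=2057   # invented VI Quality word, binary 100000001001

qa=$(( quality & 3 ))   # MODLAND QA bits -> 1
if [ "$qa" -le 1 ] ; then
  verdict=keep
else
  verdict=discard
fi
echo "$verdict"   # keep
```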
 
=== Advanced MODIS LST time series reconstruction ===
 
* See http://www.geodati.fmach.it/eurolst.html (LST dataset download)
 
== MODIS Ocean ==


Sea surface color, temperature, and derivative images:
* http://oceancolor.gsfc.nasa.gov/PRODUCTS/


The following examples assume you want to import multiple images at once, and so use shell scripting loops. They also assume input files are SMI (Standard Mapped Image) format, as produced by the Ocean Color group (http://oceancolor.gsfc.nasa.gov).
 


=== SST (Level 3) ===

==== Get the data ====


* Get data in HDF4 format from http://oceancolor.gsfc.nasa.gov/cgi/l3
: You'll probably want the Standard Mapped Image (SMI) file version, not the Binned (BIN) one.


File names look like: <tt>A20023532002360.L3m_8D_SST_4</tt>
  * 4: 4.6km pixel size (8640x4320 image, 2.5 minute resolution)
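The period covered by a file can be pulled out of the name with shell substring expansion (the leading letter is the sensor, A for Aqua, followed by the start and end dates as YYYYDDD; this is the usual OceanColor convention, so verify it against your own files):

```bash
file=A20023532002360.L3m_8D_SST_4

sensor=${file:0:1}   # A = Aqua
start=${file:1:7}    # start of composite period, YYYYDDD -> 2002353
end=${file:8:7}      # end of composite period, YYYYDDD   -> 2002360

echo "$sensor $start $end"
```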


==== Decompress ====

Run this command in the command line:
  bzip2 -d A20023532002360.L3m_8D_SST_4.bz2


==== Check ====

Use [http://www.gdal.org GDAL's] <tt>gdalinfo</tt> tool to view the HDF file's meta-data. Run this command in the command line:
  gdalinfo A20023532002360.L3m_8D_SST_4


   SUBDATASET_2_DESC=[4320x8640] l3m_qual (8-bit unsigned integer)


==== Import ====

GRASS can be quite strict about moving maps around between differing map projections. Because GDAL does not automatically georeference HDF4 images, we need to apply the georeferencing information manually. This is a slight chore, but means fewer errors due to projection mis-matches later on. Everything we need to know is given in the data file's meta-data, as seen with <tt>gdalinfo</tt>.

===== Method 0 =====

The USGS/NASA's [https://lpdaac.usgs.gov/tools/modis_reprojection_tool MODIS Reprojection Tool] (MRT) supports reprojection for Level 3 data, or the [https://lpdaac.usgs.gov/tools/modis_reprojection_tool_swath MODIS Swath Reprojection Tool] for Level 1B or 2 data, and can be used to create a GeoTiff. This requires the [ftp://ladsweb.nascom.nasa.gov/allData/5/MOD03/ geolocation angles].


===== Method 1 =====
: Use "<tt>+nadgrids=@null</tt>" trick?? http://proj.maptools.org/faq.html#sphere_as_wgs84


* Use <tt>gdal_translate</tt> to extract the desired data array from the HDF4 file and convert into a GeoTiff. While creating the GeoTiff specify the map projection, bounding box, and no-data information. Then import this GeoTiff using the {{cmd|r.in.gdal}} module. In this example we set the projection to be lat/lon WGS84 (EPSG code 4326), even though this may not be entirely correct.


Note: this shell script is best stored in a text file and then executed:
  for file in A*SST_4 ; do
   echo "map: $file"
===== Method 2 =====

Import raw HDF4 data into a <strike>XY location, clean it up, then copy into a lat/lon location</strike> lat/lon location using the new {{cmd|r.in.gdal|version=65}} <tt>-l</tt> flag in GRASS 6.5+. This is more work than method 1 but demonstrates using GRASS tools to do the manipulation instead of GDAL tools (the result should be the same).


====== Extract ======

Because the HDF4 data file contains multiple data arrays we need to extract the one we are interested in. In this example we extract the "data" layer but not the "quality" layer, and output a GeoTIFF file using GDAL's <tt>gdal_translate</tt> tool, using the appropriate SUBDATASET NAME:

  for file in *SST_4 ; do
   gdal_translate HDF4_SDS:UNKNOWN:"$file":0 ${file}_unproj.tif
====== Import ======

''Update: in GRASS 6.5 and newer you can now use the {{cmd|r.in.gdal|version=65}} `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run {{cmd|r.region}} to correct the map's region bounds.''

* Create a <strike>simple XY</strike> lat/lon GRASS location and mapset
* Import the imagery and set image bounds and resolution:
  for file in *unproj.tif ; do
   BASE=`basename $file _unproj.tif`
   echo "map: $file"
   r.in.gdal -l in=$file out=$BASE
   r.region $BASE n=90 s=-90 w=-180 e=180
  done

  r.info A20023532002360.L3m_8D_SST_4


<strike>
* Convert the simple XY location into a Lat/Lon location
: ''Plate carrée, or WGS84? -- Does it matter with data at the km scale?''
: Modify the location's projection settings with g.setproj? (run that from the PERMANENT mapset)
''Hack'': after import move the mapset dir into a world_ll location, then edit the mapset's WIND file and change the <tt>proj:</tt> line from 0 to 3. (0 is unprojected, 3 is LL) You also have to edit each file header in the $MAPSET/cellhd/ directory.
</strike>


* Remove NULLs
   #r.colors "${map}.degC" color=bcyr  # fairly nice
  done


* Check:


The processed color rules file is here: [http://trac.osgeo.org/grass/browser/grass-addons/raster/r.colors.tools/palettes]


* Apply color rules to imported maps:
   r.colors "${map}.degC" rules=palette_sst.gcolors
  done
==== Apply a quality mask ====

If there is cloud cover, the satellite will show the temperature at the top of the cloud, not at the sea surface. The HDF data comes with a quality layer and we can use it to restrict the data only to areas with the highest confidence.

Select the file:
 file=T2010235.L3m_DAY_SST_4km
 echo "map: $file"

Extract the quality layer from the HDF data file:
 gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
   -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
   HDF4_SDS:UNKNOWN:"$file":1  ${file}.qual_prep.tif

Import the extracted GeoTiff with {{cmd|r.in.gdal}}:
 r.in.gdal in=${file}.qual_prep.tif out=$file.qual

Set up some color rules to cover the quality status categories using {{cmd|r.colors}}:
 r.colors $file.qual col=rules << EOF
   0 green
   1 yellow
   2 red
   255 black
 EOF

Assign quality map category labels with {{cmd|r.category}}:
 r.category $file.qual rules=- << EOF
 0:best quality
 1:not as good quality
 2:rather bad quality
 3:pretty bad quality
 4:254:really bad quality
 255:unusable quality
 EOF

Set up a MASK map to only show the best quality data using a {{cmd|r.reclass}} virtual raster:
 r.reclass in=$file.qual out=MASK --o << EOF
 0 = 1    good quality
 * = NULL
 EOF

With the ''MASK'' map in place, redrawing the map display (or any other raster processing) will only occur for the good data cells.


=== Chlorophyll-a (Level 3) ===


==== Decompress ====

Run this command in the command line:
  bzip2 -d A20023352002365.L3m_MO_CHLO_4.bz2


==== Check ====

Use [http://www.gdal.org GDAL's] <tt>gdalinfo</tt> tool to view the HDF file's meta-data. Run this command in the command line:
  gdalinfo A20023352002365.L3m_MO_CHLO_4


===== Method 2 =====


''Update: in GRASS 6.5 and newer you can now use the {{cmd|r.in.gdal|version=65}} `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run {{cmd|r.region}} to correct the map's region bounds.''

* Create a <strike>simple XY</strike> lat/lon GRASS location and mapset
* Import the imagery and set image bounds and resolution:
  for file in A*_MO_CHLO_4 ; do
   echo "map: $file"
   r.in.gdal -l in=$file out=$file
   r.region $file n=90 s=-90 w=-180 e=180
  done

  r.info A20023352002365.L3m_MO_CHLO_4


<strike>
* Convert the simple XY location into a Lat/Lon location
: ''Plate carrée, or WGS84? -- Does it matter with data at the km scale?''
: Modify the location's projection settings with g.setproj? (run that from the PERMANENT mapset)
''Hack'': after import move the mapset dir into a world_ll location, then edit the mapset's WIND file and change the <tt>proj:</tt> line from 0 to 3. (0 is unprojected, 3 is LL) You also have to edit each file header in the $MAPSET/cellhd/ directory.
</strike>


* Remove NULLs
   r.null $map setnull=65535
  done


===== Method 3 =====

Like method 2 above, but instead of moving the mapset directory in the file system use {{cmd|i.rectify}} to move the maps into the target Lat/Lon location. After importing the images, setting their bounds, and removing NULLs, add all images to an imagery group with the {{cmd|i.group}} module. Run {{cmd|i.target}} and select a lat/lon location. The idea here is to run <tt>i.rectify</tt> with a first-order transform, where the transform uses a scaling factor of 1.0 and rotation of 0.0, i.e. no change at all. The trick is to set those. Usually you would use {{cmd|i.points}} or the GUI georeferencing tool to set those, and here you can add a few points by hand if you like. But the POINTS file in the group's data directory should use identical coordinates for both source and destination. ''Hack:'' edit the POINTS file by hand to make it so, using the four corners and 0,0 at the center of the image. Finally run <tt>i.rectify</tt> to push the image into the target location.
 


==== Processing ====


The processed color rules file is here: [http://trac.osgeo.org/grass/browser/grass-addons/raster/r.colors.tools/palettes]


* Apply color rules to imported maps:
  done


=== Removing holes ===

* If it's land cover classes, {{cmd|r.fillnulls}} will produce nonsense data: floating point. Rather use {{cmd|r.neighbors}} with a modal filter, then {{cmd|r.patch}} the original with the filtered map. This will replace NULLs with the most common surrounding land cover type.

=== Rolling composite ===

Create a rolling 28 day composite.

Say you have weekly satellite data, but there are a lot of NULL cells where there was cloud cover that day. You can fill these holes by looking for valid data in the previous week's image; if you again find clouds, look in the week prior to that, and so on, until 4 weeks have passed, after which point any remaining holes are left unfilled. You can do this quite simply with {{cmd|r.patch}}:

 r.patch output=rolling28day.date input=newest,last_week,2weeks_ago,3weeks_ago

NULL data cells are filled using the maps in the given '''input=''' order.

=== MODIS SWATH ===

Here is an alternative to the MODIS Swath Reprojection Tool, using GDAL:
* https://gis.stackexchange.com/questions/81361/how-to-reproject-modis-swath-data-to-wgs84?rq=1

== SeaWiFS ==

From the same [http://oceancolor.gsfc.nasa.gov/PRODUCTS/ OceanColor website as above] (gsfc.nasa.gov) you can download [http://en.wikipedia.org/wiki/Seawifs SeaWiFS] data, a particularly good source for chlorophyll-''a'' at sea and NDVI on land.
* the SeaWiFS Standard Mapped Products are already geocoded in WGS84, so they should be easy to load into a lat/lon location with r.in.gdal (at least this is true for PAR maps)
* Processing is very similar to the MODIS examples above.

== See also ==

* [[SeaWiFS]]
* [[AVHRR]]
* [http://acc6.its.brooklyn.cuny.edu/~gurwitz/core5/nav2tool.html Decimal/Binary Conversion Tool] for MODIS QA bitpatterns

== Bibliography ==

* Alexandris, N. (2011). Burned area mapping via non-centered Principal Components Analysis using public domain data and free open source software. Department of Remote Sensing & Landscape Information Systems, Faculty of Forest and Environmental Sciences, Albert-Ludwigs-University, Freiburg im Breisgau, Germany. Dissertation [http://www.freidok.uni-freiburg.de/volltexte/8399 Entry at freidok.uni-freiburg.de]
* Carpi G., Cagnacci F., Neteler M., Rizzoli A, 2008: ''Tick infestation on roe deer in relation to geographic and remotely-sensed climatic variables in a tick-borne encephalitis endemic area''. Epidemiology and Infection, 136, pp. 1416-1424. ([http://www.ncbi.nlm.nih.gov/pubmed/18081949 PubMed])
* Metz, M., Rocchini, D., Neteler, M., 2014. Surface Temperatures at the Continental Scale: Tracking Changes with Remote Sensing at Unprecedented Detail. Remote Sensing 6, 3822–3840 ([http://dx.doi.org/10.3390/rs6053822 DOI], [http://www.mdpi.com/2072-4292/6/5/3822 Abstract], [http://www.mdpi.com/2072-4292/6/5/3822/pdf PDF]). Dataset download: http://gis.cri.fmach.it/eurolst
* Neteler, M., 2005: ''Time series processing of MODIS satellite data for landscape epidemiological applications''. International Journal of Geoinformatics, 1(1), pp. 133-138 ([http://creativecity.gscc.osaka-cu.ac.jp/IJG/article/download/325/326 PDF])
* Neteler, M., 2010: ''Estimating daily Land Surface Temperatures in mountainous environments by reconstructed MODIS LST data''. Remote Sensing 2(1), 333-351. ([http://dx.doi.org/10.3390/rs1020333 DOI], [http://www.mdpi.com/2072-4292/2/1/333 Abstract], [http://www.mdpi.com/2072-4292/2/1/333/pdf PDF])
* Neteler, M., Roiz, D., Rocchini, D., Castellani, C. and Rizzoli, A. (2011). Terra and Aqua satellites track tiger mosquito invasion: modeling the potential distribution of Aedes albopictus in north-eastern Italy. International Journal of Health Geographics, 10:49 ([http://dx.doi.org/10.1186/1476-072X-10-49 DOI] | [http://www.ij-healthgeographics.com/content/pdf/1476-072X-10-49.pdf PDF])
* Neteler, M., Metz, M., Rocchini, D., Rizzoli, A., Flacio, E., Engeler, L., Guidi, V., Lüthy, P., Tonolla, M. (2013): Is Switzerland suitable for the invasion of Aedes albopictus? PLoS ONE 8(12): e82090. ([http://dx.doi.org/10.1371/journal.pone.0082090 DOI] | [http://www.plosone.org/article/fetchObject.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0082090&representation=PDF PDF])
* Roiz D., Neteler M., Castellani C., Arnoldi D., Rizzoli A., 2011: ''Climatic Factors Driving Invasion of the Tiger Mosquito (Aedes albopictus) into New Areas of Trentino, Northern Italy''. PLoS ONE. 6(4): e14800. ([http://dx.plos.org/10.1371/journal.pone.0014800 DOI] - [http://www.plosone.org/article/fetchObjectAttachment.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0014800&representation=PDF PDF])
* Zorer, R., Rocchini, D., Metz, M., Delucchi, L., Zottele, F., Meggio, F., Neteler, M. (2013): Daily MODIS Land Surface Temperature Data for the Analysis of the Heat Requirements of Grapevine Varieties. IEEE Transactions on Geoscience and Remote Sensing 51, 2128–2135. ([http://dx.doi.org/10.1109/TGRS.2012.2226465 DOI])

[[Category: Geodata]]
[[Category:Image processing]]

== Pathfinder SST ==

=== About ===

AVHRR sea surface temperature data v5.0 at 4km resolution is available for 1985-present.

* See the [http://www.nodc.noaa.gov/sog/pathfinder4km/ 4km AVHRR Pathfinder homepage] at NOAA's NODC (especially the [http://www.nodc.noaa.gov/SatelliteData/pathfinder4km/userguide.html user's guide]).
* The data can be downloaded from their FTP site:
: ftp://data.nodc.noaa.gov/pub/data.nodc/pathfinder/
* The filesize for a 4km resolution data file is approx 16 MB.
* You will probably want to download both the raw sst data and the quality mask layer ("*-sst" and "*-qual" files). During processing we will use the quality layer to mask the raw data for valid pixels.

File names look like: <tt>2000301-2000305.s0451pfv50-sst-16b.hdf</tt>

Key:
* 2000: Year at start
* 301: Julian day at start
* 2000: Year at end
* 305: Julian day at end
* s: 16-bit data
* 04: 4km resolution (8192x4096 image, approx 2.637 minute resolution)
* 5: 5 day ensemble
* 1: nighttime (3 is daytime)
* pfv50: Pathfinder version 5.0
* sst: Product: Raw SST (including cloud hits)
* 16b: 16 bits (only present in pre-2003 data)

=== Download ===

Here's a little shell script to download SST and Quality maps for a given set of days.

 #### 8-day
 DAYS="\
 2000097-2000104
 2000145-2000152
 2000241-2000248
 2000313-2000320"
 BASE_URL="<nowiki>http://data.nodc.noaa.gov/pathfinder/Version5.0/8day/2000/</nowiki>"
 for TIME in 1 3 ; do
   for JULIAN in $DAYS ; do
     wget "${BASE_URL}${JULIAN}.s048${TIME}pfv50-sst-16b.hdf"
     wget "${BASE_URL}${JULIAN}.m048${TIME}pfv50-qual.hdf"
   done
 done

=== Check ===

* Use GDAL's <tt>gdalinfo</tt> tool to view the HDF file's meta-data, as above for MODIS HDF4 data.

=== Import ===

* This data is already in Lat/Lon WGS84, so things are made a bit easier.
* Open a lat/lon WGS84 location (EPSG code 4326)
* Use gdal_translate to convert the HDF4 file into a GeoTiff containing map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module.
 
# Convert and import SST maps
for file in *pfv50-sst*.hdf ; do
  map=`basename "$file" .hdf`
  echo "map: $map"
  gdal_translate -a_srs "+init=epsg:4326" -a_nodata 0 \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    HDF4_SDS:UNKNOWN:"$file":0  ${map}_prep.tif
  r.in.gdal in=${map}_prep.tif out="$map"
done
 
# Convert and import Overall Quality layers
for file in *pfv50-qual.hdf ; do
  map=`basename "$file" .hdf`
  echo "map: $map"
  gdal_translate -a_srs "+init=epsg:4326" \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    HDF4_SDS:UNKNOWN:"$file":0  ${map}_prep.tif
  r.in.gdal in=${map}_prep.tif out="$map"
  r.support "$map" title="SST Overall Quality" history=""
  r.support "$map" history="key: 0 is worst, 7 is best"
  r.colors -r "$map"  # use standard colors
done
 
=== Processing ===
 
* Convert values to temperature in degrees C. Note slope and intercept are taken from the image's metadata using gdalinfo:
  scale_factor=0.075
  add_off=-3
 
 
Given those coefficients we can now process the data. In this example we will only accept pixels with a quality of 4 or greater.
Note that special care is taken to quote the map names in r.mapcalc as they contain minus signs.
 
for map in `g.mlist rast pat="*pfv50-sst*" | grep -v '\.degC$'` ; do
  echo "$map"
  g.region rast=$map
  #r.info -r $map
  Scale=0.075
  Intercept=-3
  qual_map="`echo "$map" | sed  -e 's/\.s/.m/' -e 's/-sst.*$//'`-qual"
  r.mapcalc MASK="if(\"$qual_map\" >= 4)"
  r.mapcalc "\"${map}.degC.q4\" = ($Scale * \"$map\") + $Intercept"
  g.remove rast=MASK
  r.support "${map}.degC.q4" units="deg-C"
done
 
* Check:
r.info 2000301-2000305.s0451pfv50-sst-16b.degC.q4
r.univar 2000301-2000305.s0451pfv50-sst-16b.degC.q4
 
 
* With r.univar (or d.histogram) we see some spurious data points (125deg C water!). We can check with r.univar's "-e percentile=99" that the majority of the data really does not go above 30deg C. If you like, you can use r.mapcalc to remove anything above 40deg C:
r.mapcalc "clean_map = if( orig_map > 40, null(), orig_map )"
 
=== Set colors ===
 
For a very nice looking image we can use Goddard's OceanColor palettes from
: http://oceancolor.gsfc.nasa.gov/PRODUCTS/colorbars.html
For more information on how to use those with GRASS see the above section regarding MODIS data. To apply the [http://trac.osgeo.org/grass/browser/grass-addons/raster/r.colors.tools/palettes converted rules]:
 
r.colors "${map}.degC.q4" rules=palette_sst.gcolors
 
== Removing holes ==
 
=== Interpolation ===
Cloud cover or technical problems often cause gaps in the data. GRASS has a [http://grass.osgeo.org/grass63/manuals/html63_user/r.fillnulls.html r.fillnulls] module which uses spline interpolation to fill these small gaps. The holes are left in the data for a reason, so check whether filling them is appropriate for your analysis before removing them. Also note that the areas around the holes may be of low quality, and using them as the starting point for filling the holes just spreads the error wider.
 
The first step is to create a mask of the sea, i.e. the area which it is valid to interpolate over. To do this we overlay a time series of satellite images and see which areas are permanently no data.
 
* Overlay all maps; the data will be mixed up and bogus but the spatial coverage will be informative
r.patch in=`g.mlist -r rast pat='A200.*4$' sep=,` out=all_sat_images_stacked
 
* Create a 0/1 mask layer showing areas of sea, and label categories
r.mapcalc "sea_mask = if( isnull(all_sat_images_stacked), 0, 1)"
r.category sea_mask rules=- << EOF
  0:Land
  1:Sea
EOF
g.remove rast=all_sat_images_stacked
 
The next step is to perform the filling operation. To speed up processing, we first temporarily fill known land areas with a value as there is no point interpolating over those gaps. To avoid interpolation overshoots, smear coastal values slightly inland before interpolation. After the holes are filled we reapply the mask to again remove those areas. As the r.fillnulls step can take some time, it is advisable to use g.region to zoom into an area of interest first.
 
for map in `g.mlist rast pat=*SST_4.degC` ; do
    r.grow in=$map out=${map}.grown radius=5
    r.mapcalc "tmp_landfill = if( isnull($map) && sea_mask == 1, \
      null(), if( isnull(${map}.grown), 0, ${map}.grown) )"
    r.fillnulls in=tmp_landfill out=${map}.filled_tmp
    g.copy rast=sea_mask,MASK
    r.mapcalc "${map}.filled = ${map}.filled_tmp"
    g.remove rast=tmp_landfill,${map}.grown,${map}.filled_tmp,MASK
done
 

Latest revision as of 16:51, 10 May 2018

Introduction

There are two satellites, Aqua and Terra, which carry the MODIS sensor as payload. The Moderate Resolution Imaging Spectroradiometer (MODIS) is a 36-channel sensor, covering the visible to the thermal infrared, that was launched as part of the Terra satellite payload in December 1999 and of the Aqua satellite payload in May 2002. The Terra satellite passes twice a day (at about 10:30 and 22:30 local time), and the Aqua satellite also passes twice a day (at about 01:30 and 13:30 local time).

Details about the sensor (compiled from various sources)

MODIS (MODerate resolution Imaging Spectroradiometer) is a whiskbroom multi-spectral sensor (across-track scanner) on board the Terra and Aqua satellites that measures 36 spectral bands. Its swath is 2330 km (across track) by 10° of latitude (along track at nadir). The circular, sun-synchronous, near-polar orbit takes about 99 minutes, which means that the same region is acquired several times during daylight; nadir observations of the same region, however, repeat only every 16 days. A complete Earth scan is achieved every 1.2 days. Due to the large swath, geometric distortions occur at the scan edges. The pixel size grows with the scan angle, by a factor of 8 at the edge of the scan, making the shape of each scan stripe look like a bow tie. Consequently the scan stripes, being wider at the edges of the scan, overlap each other. This overlap creates a double-vision effect at the edges of the scan, as each point of the Earth's surface appears in two adjacent scans. This distortion emphasises the line where scan stripes meet and thus magnifies small differences between the sides of the scan mirror.

MODIS Tile Calculator: https://landweb.modaps.eosdis.nasa.gov/cgi-bin/developer/tilemap.cgi

MODIS Land

MODIS Land provides many data products including:

  • Daily Surface Reflectance
  • Snow coverage
  • NDVI / EVI Vegetation index
  • Land Surface Temperature (LST)
  • Atmosphere Daily Global Products
  • many more

Most products are created separately for the different MODIS sensors onboard the TERRA (MOD products) and AQUA (MYD products) satellites. Many products are also available as a combined MODIS product (MCD products).

See MODIS Land Team Home Page

MODIS data download: installation of convenient GRASS GIS addon i.modis

For a convenient download addon, see i.modis, a toolset for downloading and processing of MODIS products using pyModis.

Installation of pyModis:

# either
pip install pyModis

# or
sudo pip install pyModis

Installation of i.modis toolset:

# in a GRASS GIS session
g.extension i.modis

Now i.modis, i.modis.download, and i.modis.import are locally installed.

Download of MODIS tiles: i.modis.download

For obtaining the needed NASA credentials, see here.

Example: download of Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua):

##### preparation: first time only
# store NASA username and password in two lines
echo "myusername
xxxxxx" > ~/.imodis

# create new space-time cube
t.create type=strds temporaltype=absolute output=ndvi_16_5600m title="Global NDVI 16 days MOD13C1" \
         description="MOD13C1 Global NDVI 16 days" semantictype=mean
##### END first time only

MODIS_TMPFOLDER=/path/to/modis/ndvi_MOD13C1.006/

# fetch list of available MODIS data up to today
i.modis.download settings=~/.imodis product=ndvi_terra_sixteen_5600 \
	startday=2015-01-01 endday=`date +"%Y-%m-%d"` folder=$MODIS_TMPFOLDER

The downloaded HDF files can then be imported with i.modis.import.

Download of MODIS tiles: modis_download.py

As an alternative to i.modis.download, the pyModis tool modis_download.py can be used directly. Be sure to have created the needed .netrc file with your NASA credentials (see here for details):

# example: Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua)
mkdir -p ndvi_MOD13C1.006

modis_download.py -e 2015-10-01 -f 2016-01-31 --product=MOD13C1.006 ndvi_MOD13C1.006/

The downloaded HDF files can then be imported with i.modis.import.

MODIS Land Products Quality Control

A number of MODIS Land products include a quality subdataset that is bit-encoded: different bit ranges represent different quality assessments.

The general formula to extract a bit range starting at bit sb (the first bit is 0) and spanning nb bits is: divide quality by 2 to the power of sb, then take the result modulo 2 to the power of nb. As an r.mapcalc expression using standard math:

bitfield = (quality / (2 ^ sb)) % (2 ^ nb)

or in bit operation notation:

bitfield = (quality >> sb) & ((1 << nb) - 1)
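A quick sanity check of the formula with shell arithmetic (the example value and bit range are arbitrary illustrations):

```shell
# extract bits 2 - 5 (e.g. band 1 data quality in MOD09A1) from a test value
quality=97       # binary 01100001, an arbitrary example
sb=2             # start bit
nb=4             # number of bits
bitfield=$(( (quality >> sb) & ((1 << nb) - 1) ))
echo $bitfield   # 01100001 >> 2 = 011000 = 24; 24 & 15 = 8
```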

MOD09A1/MYD09A1 version 6

Surface reflectance of MODIS bands 1 - 7 at 500 meter resolution for every 8 days

The name of the quality control subdataset is "sur_refl_qc_500m". The type is unsigned int 32 bit, and the contents are

  • bits 0 - 1: MODLAND QA bits
  • bits 2 - 5: band 1 data quality
  • bits 6 - 9: band 2 data quality
  • bits 10 - 13: band 3 data quality
  • bits 14 - 17: band 4 data quality
  • bits 18 - 21: band 5 data quality
  • bits 22 - 25: band 6 data quality
  • bits 26 - 29 band 7 data quality
  • bit 30: atmospheric correction performed
  • bit 31: adjacency correction performed

Here, the quality subdataset is of type unsigned int 32 bit, therefore you need to use '>>>' instead of '>>' in r.mapcalc expressions. For example, to extract the MODLAND QA bits, use

qa = quality & 3

To extract the quality of band 2, use

band2_quality = (quality >>> 6) & 15

The band quality might indicate lower quality (7 = noisy detector or 8 = dead detector), even though the MODLAND QA bits indicate ideal quality on all bands. On the other hand, MODLAND QA bits sometimes indicate pixel "... not produced for other reasons, some or all bands ...". In this case, the band-specific data quality needs to be checked. That means pixels should only be discarded if both MODLAND QA > 1 and band quality > 8 (maybe band quality > 12 for high tolerance).
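A sketch of that rule in shell arithmetic (the keep_pixel helper is hypothetical; the example QC values are arbitrary):

```shell
# keep_pixel QC BAND: apply the rule "discard only if MODLAND QA > 1
# AND the band-specific data quality > 8" to a sur_refl_qc_500m value
keep_pixel() {
  qc=$1; band=$2
  qa=$(( qc & 3 ))                # bits 0 - 1: MODLAND QA
  sb=$(( 2 + (band - 1) * 4 ))   # band quality fields start at bit 2, 4 bits each
  band_quality=$(( (qc >> sb) & 15 ))
  if [ $qa -le 1 ] || [ $band_quality -le 8 ]; then
    echo keep
  else
    echo discard
  fi
}

keep_pixel 0 1                   # all bits zero, ideal quality: keep
keep_pixel $(( 3 | (9 << 2) )) 1 # QA = 3 and band 1 quality = 9: discard
```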

MOD11A1/MYD11A1 version 6

Land Surface Temperature (LST) at 1 km resolution, daily

The name of the quality control subdatasets are "QC_Day" and "QC_Night". The type is unsigned int 8 bit, and the contents are

  • bits 0 - 1: Mandatory QA flag
  • bits 2 - 3: Data quality flag
  • bits 4 - 5: Emissivity error flag
  • bits 6 - 7: LST error flag

The nodata (NULL) value is set to 0 (zero), which is not mentioned in the documentation but is set in the HDF file and recognized by GDAL. Unfortunately, the value 0 (zero) also indicates highest quality pixels. In order to preserve pixels with value 0 (zero), the nodata value needs to be changed with "gdalbuildvrt -srcnodata 255 -vrtnodata 255", with "gdal_translate -a_nodata 255", or after import into GRASS with "r.null null=255". After that, pixels with a QC value of 0 and a valid LST value are of highest quality, while pixels with a QC value of 0 and an invalid LST value can be set to NULL.

The most useful bit fields are the "Mandatory QA flag", extracted with

qa = quality & 3

and the "LST error flag", extracted with

lst_error = (quality >> 6) & 3
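A worked example decoding a single 8-bit QC value with shell arithmetic (the value 65 is arbitrary):

```shell
qc=65                            # binary 01000001
qa=$(( qc & 3 ))                 # Mandatory QA flag (bits 0 - 1)
lst_error=$(( (qc >> 6) & 3 ))   # LST error flag (bits 6 - 7)
echo "$qa $lst_error"            # prints "1 1"
```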

MOD13Q1/MYD13Q1 version 6

Vegetation indices (NDVI and EVI) at 250 meter resolution for every 16 days

The name of the quality control subdataset is "250m 16 days VI Quality". The type is unsigned int 16 bit, and the contents are

  • bits 0 - 1: MODLAND QA bits
  • bits 2 - 5: VI usefulness
  • bits 6 - 7: Aerosol Quantity
  • bit 8: Adjacent cloud detected
  • bit 9: Atmosphere BRDF Correction
  • bit 10: Mixed Clouds
  • bits 11 - 13: Land/Water Mask
  • bit 14: Possible snow/ice
  • bit 15: Possible shadow

The VI usefulness bits sometimes indicate 1111 = "Not useful for any other reason/not processed", even for a reasonable VI value and MODLAND QA bits indicating good quality. Therefore the VI usefulness should be used with care, or not at all. The MODLAND QA bits extracted with

qa = quality & 3

are sufficient to filter out bad quality pixels by discarding those with qa > 1.
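For instance, with plain shell arithmetic (the example value is arbitrary):

```shell
vi_quality=$(( 16#0842 ))   # an arbitrary 16-bit VI Quality value
qa=$(( vi_quality & 3 ))    # MODLAND QA bits: here 2
if [ $qa -le 1 ]; then echo keep; else echo discard; fi   # prints "discard"
```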

Advanced MODIS LST time series reconstruction

MODIS Ocean

Sea surface color, temperature, and derivative images

The following examples assume you want to import multiple images at once, and so use shell scripting loops. They also assume input files are SMI (Standard Mapped Image) format, as produced by the Ocean Color group (http://oceancolor.gsfc.nasa.gov).

SST (Level 3)

Get the data

You'll probably want the Standard Mapped Image (SMI) file version, not the Binned (BIN) one.

File names look like: A20023532002360.L3m_8D_SST_4

key:
* A: MODIS/Aqua
* 2002: Year at start
* 353: Julian day at start
* 2002: Year at end
* 360: Julian day at end
* L3m: Level 3 data, mapped (Plate carrée)
* 8D: 8 day ensemble
* SST: Sea Surface Temperature product
* 4: 4.6km pixel size (8640x4320 image, 2.5 minute resolution)
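The fields can be pulled apart with shell parameter expansion, following the key above (a sketch; it assumes the fixed-width name layout shown):

```shell
f=A20023532002360.L3m_8D_SST_4
sat=${f:0:1}      # satellite (A = MODIS/Aqua)
start=${f:1:7}    # YYYYDDD at start
end=${f:8:7}      # YYYYDDD at end
echo "$sat $start $end"   # prints "A 2002353 2002360"
```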

Decompress

Run this command in the command line:

bzip2 -d A20023532002360.L3m_8D_SST_4.bz2

Check

Use GDAL's gdalinfo tool to view the HDF file's meta-data. Run this command in the command line:

gdalinfo A20023532002360.L3m_8D_SST_4

Example of SST metadata from gdalinfo:

 Parameter=Sea Surface Temperature
 Measure=Mean
 Units=deg-C
 Scaling Equation=(Slope*l3m_data) + Intercept = Parameter value
 Slope=0.000717185
 Intercept=-2
 Scaled Data Minimum=-2
 Scaled Data Maximum=45
 Data Minimum=-1.999999
 Data Maximum=37.06
Subdatasets:
 SUBDATASET_1_NAME=HDF4_SDS:UNKNOWN:"A20023532002360.L3m_8D_SST_4":0
 SUBDATASET_1_DESC=[4320x8640] l3m_data (16-bit unsigned integer)
 SUBDATASET_2_NAME=HDF4_SDS:UNKNOWN:"A20023532002360.L3m_8D_SST_4":1
 SUBDATASET_2_DESC=[4320x8640] l3m_qual (8-bit unsigned integer)
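The scaling equation can be sanity-checked against this metadata with awk (here 65534 is assumed to be the largest valid 16-bit value, since 65535 is used for no-data):

```shell
# (Slope*l3m_data) + Intercept should reproduce the scaled data range
awk 'BEGIN {
  Slope = 0.000717185; Intercept = -2
  printf "%.1f %.1f\n", Slope*0 + Intercept, Slope*65534 + Intercept
}'
# prints "-2.0 45.0", matching Scaled Data Minimum/Maximum above
```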

Import

GRASS can be quite strict about moving maps around between differing map projections. Because GDAL does not automatically georeference HDF4 images, we need to apply the georeferencing information manually. This is a slight chore, but means fewer errors due to projection mis-matches later on. Everything we need to know is given in the data file's meta-data, as seen with gdalinfo.

Method 0

The USGS/NASA MODIS Reprojection Tool (MRT) supports reprojection of Level 3 data, while the MODIS Swath Reprojection Tool handles Level 1B or Level 2 data; either can be used to create a GeoTiff. This requires the geolocation angles.

Method 1

Use GDAL tools to prepare the dataset and create a GeoTiff which can be directly imported into GRASS.

  • Create a Lat/Lon GRASS location and mapset
Plate carrée, or WGS84? -- Does it matter with data at the km scale?
-- Encode with +ellps=sphere (?) or epsg:32662?, see http://lists.osgeo.org/pipermail/gdal-dev/2007-September/014134.html or http://thread.gmane.org/gmane.comp.gis.gdal.devel/12666
FIXME: gdalinfo: " Map Projection=Equidistant Cylindrical"
Use "+nadgrids=@null" trick?? http://proj.maptools.org/faq.html#sphere_as_wgs84
  • Use gdal_translate to extract the desired data array from the HDF4 file and convert into a GeoTiff. While creating the GeoTiff specify the map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module. In this example we set the projection to be lat/lon WGS84 (EPSG code 4326), even though this may not be entirely correct.

Note: this shell script is best stored in a text file and then executed:

for file in A*SST_4 ; do
  echo "map: $file"

  gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    HDF4_SDS:UNKNOWN:"$file":0  ${file}_prep.tif

  r.in.gdal in=${file}_prep.tif out=$file
done
Method 2

Import raw HDF4 data into an XY location, clean it up, then copy it into a lat/lon location using the new r.in.gdal -l flag in GRASS 6.5+. This is more work than method 1 but demonstrates using GRASS tools to do the manipulation instead of GDAL tools (the result should be the same).

Extract

Because the HDF4 data file contains multiple data arrays we need to extract the one we are interested in. In this example we extract the "data" layer but not the "quality" layer, and output a GeoTIFF file using GDAL's gdal_translate tool, using the appropriate SUBDATASET NAME:

for file in *SST_4 ; do
  gdal_translate HDF4_SDS:UNKNOWN:"$file":0 ${file}_unproj.tif
done
Import

Update: in GRASS 6.5 and newer you can now use the r.in.gdal `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run r.region to correct the map's region bounds.

  • Create a simple XY (unprojected) GRASS location and mapset
  • Import the imagery and set image bounds and resolution:
for file in *unproj.tif ; do
  BASE=`basename $file _unproj.tif`
  echo "map: $file"
  r.in.gdal -l in=$file out=$BASE
  r.region $BASE n=90 s=-90 w=-180 e=180
done
  • Check:
g.list rast
r.info A20023532002360.L3m_8D_SST_4

  • Convert the simple XY location into a Lat/Lon location
Plate carrée, or WGS84? -- Does it matter with data at the km scale?
Modify the location's projection settings with g.setproj? (run that from the PERMANENT mapset)

Hack: after import move the mapset dir into a world_ll location, then edit the mapset's WIND file and change the proj: line from 0 to 3. (0 is unprojected, 3 is LL) You also have to edit each file header in the $MAPSET/cellhd/ directory.
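The hack can be scripted; a sketch (the fix_proj helper is hypothetical; back up the mapset before trying it):

```shell
# fix_proj MAPSETDIR: rewrite the "proj:" code from 0 (unprojected) to
# 3 (lat/lon) in the mapset's WIND file and in every raster header
fix_proj() {
  for f in "$1/WIND" "$1"/cellhd/* ; do
    sed -i 's/^proj:[[:space:]]*0/proj: 3/' "$f"
  done
}
```

Call it on the mapset directory after moving it into the world_ll location.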

  • Remove NULLs
for map in `g.mlist rast pat=*SST_4` ; do
  r.null $map setnull=65535
done
Method 3

Like method 2 above, but using i.rectify instead of moving the mapset manually. See the more detailed explanation in the Chlorophyll section below.

Processing

  • Convert to temperature in degrees C. Note slope and intercept are taken from the image's metadata using gdalinfo.
for map in `g.mlist rast pat=*SST_4` ; do
  echo "$map"
  g.region rast=$map
  #r.info -r $map
  Slope=0.000717185
  Intercept=-2
  r.mapcalc "${map}.degC = ($Slope * $map) + $Intercept"
  r.support "${map}.degC" units="deg-C"
  #r.colors "${map}.degC" color=bcyr  # fairly nice
done
  • Check:
r.info A20023532002360.L3m_8D_SST_4.degC
r.univar A20023532002360.L3m_8D_SST_4.degC

Set colors

Although the standard bcyr color map looks nice, for a very nice image we can use Goddard's OceanColor palettes from http://oceancolor.gsfc.nasa.gov/PRODUCTS/colorbars.html

Those are given in 0-255 range, the imported HDF4 data is 0-65535, and the converted temperatures range from -2 to 45 deg C.

  • Use the UNIX awk tool to convert each color rule with the same scaling function we used to convert the data, and save it to a rules file:
# scale 0-255 to 0-65535 and then convert to temperature values
echo "# Color rules for MODIS SST" > palette_sst.gcolors
Slope=0.000717185
Intercept=-2
cat palette_sst.txt | grep -v '^#' | \
  awk -v Slope=$Slope -v Intercept=$Intercept \
    '{ printf("%f %d:%d:%d\n", \
        (Slope * (($1 +1)^2 -1) + Intercept), $2, $3, $4)}' \
  >> palette_sst.gcolors
echo "nv black" >> palette_sst.gcolors
# better: edit last rule to be 45.000719 206:206:206


The processed color rules file is here: [1]

  • Apply color rules to imported maps:
for map in `g.mlist rast pat=*SST_4` ; do
  r.colors "${map}.degC" rules=palette_sst.gcolors
done

Apply a quality mask

If there is cloud cover, the satellite will show the temperature at the top of the cloud, not at the sea surface. The HDF data comes with a quality layer and we can use it to restrict the data only to areas with the highest confidence.

Select the file:

file=T2010235.L3m_DAY_SST_4km
echo "map: $file"

Extract the quality layer from the HDF data file:

gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
  -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
  HDF4_SDS:UNKNOWN:"$file":1  ${file}.qual_prep.tif

Import the extracted GeoTiff with r.in.gdal:

r.in.gdal in=${file}.qual_prep.tif out=$file.qual

Set up some color rules to cover the quality status categories using r.colors:

r.colors $file.qual col=rules << EOF
  0 green
  1 yellow 
  2 red
  255 black
EOF

Assign quality map category labels with r.category:

r.category $file.qual rules=- << EOF
0:best quality
1:not as good quality
2:rather bad quality
3:pretty bad quality
4:254:really bad quality
255:unusable quality
EOF

Set up a MASK map to show only the best quality data, using an r.reclass virtual raster:

r.reclass in=$file.qual out=MASK --o << EOF
  0 = 1     good quality
  * = NULL
EOF

With the MASK map in place, redrawing the map display (or any other raster processing) will only occur for the good data cells.

Chlorophyll-a (Level 3)

Get the data

File names look like: A20023352002365.L3m_MO_CHLO_4

key:
* A: MODIS/Aqua
* 2002: Year at start
* 335: Julian day at start
* 2002: Year at end
* 365: Julian day at end
* L3m: Level 3 data, mapped (Plate carrée)
* MO: One month ensemble
* CHLO: Chlorophyll a concentration product
* 4: 4.6km pixel size (8640x4320 image, 2.5 minute resolution)


Decompress

Run this command in the command line:

bzip2 -d A20023352002365.L3m_MO_CHLO_4.bz2

Check

Use GDAL's gdalinfo tool to view the HDF file's meta-data. Run this command in the command line:

gdalinfo A20023352002365.L3m_MO_CHLO_4

Example of CHLO metadata from gdalinfo:

 Parameter=Chlorophyll a concentration
 Measure=Mean
 Units=mg m^-3
 Scaling=logarithmic
 Scaling Equation=Base**((Slope*l3m_data) + Intercept) = Parameter value
 Base=10
 Slope=5.813776e-05
 Intercept=-2
 Scaled Data Minimum=0.01
 Scaled Data Maximum=64.5654
 Data Minimum=0.002637
 Data Maximum=99.99774
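The logarithmic scaling equation can be sanity-checked against this metadata with awk (here 65534 is assumed to be the largest valid 16-bit value):

```shell
# Base^((Slope*l3m_data) + Intercept) should reproduce the scaled data range
awk 'BEGIN {
  Slope = 5.813776e-05; Intercept = -2
  printf "%.2f %.2f\n", 10^(Slope*0 + Intercept), 10^(Slope*65534 + Intercept)
}'
# prints "0.01 64.57", close to the Scaled Data Minimum/Maximum above
```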

Import

Method 1
  • Create a Lat/Lon GRASS location and mapset
Plate carrée, or WGS84? -- Does it matter with data at the km scale?
  • Use gdal_translate to convert the HDF4 data into a GeoTiff, and specify map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module. In this example we set the projection to be lat/lon WGS84 (EPSG code 4326), even though this may not be entirely correct.
for file in A*_MO_CHLO_4 ; do
  echo "map: $file"

  gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    $file ${file}_prep.tif

  r.in.gdal in=${file}_prep.tif out=$file
done
Method 2

Update: in GRASS 6.5 and newer you can now use the r.in.gdal `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run r.region to correct the map's region bounds.

  • Create a simple XY (unprojected) GRASS location and mapset
  • Import the imagery and set image bounds and resolution:
for file in A*_MO_CHLO_4 ; do
  echo "map: $file"
  r.in.gdal -l in=$file out=$file
  r.region $file n=90 s=-90 w=-180 e=180
done
  • Check:
g.list rast
r.info A20023352002365.L3m_MO_CHLO_4

  • Convert the simple XY location into a Lat/Lon location
Plate carrée, or WGS84? -- Does it matter with data at the km scale?
Modify the location's projection settings with g.setproj? (run that from the PERMANENT mapset)

Hack: after import move the mapset dir into a world_ll location, then edit the mapset's WIND file and change the proj: line from 0 to 3. (0 is unprojected, 3 is LL) You also have to edit each file header in the $MAPSET/cellhd/ directory.

  • Remove NULLs
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  r.null $map setnull=65535
done
Method 3

Like method 2 above, but instead of moving the mapset directory in the file system, use i.rectify to move the maps into the target Lat/Lon location. After importing the images, setting their bounds, and removing NULLs, add all images to an imagery group with the i.group module. Run i.target and select a lat/lon location. The idea is to run i.rectify with a first-order transform that uses a scaling factor of 1.0 and a rotation of 0.0, i.e. no change at all. Usually you would use i.points or the GUI georeferencing tool to set the control points, and here you can add a few points by hand if you like, but the POINTS file in the group's data directory should use identical coordinates for both source and destination. Hack: edit the POINTS file by hand to make it so, using the four corners and 0,0 at the center of the image. Finally run i.rectify to push the image into the target location.
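An identity POINTS file for the 360 x 180 degree image set up above might look like this (a sketch; the exact header layout is normally written by i.points, so treat it as an assumption):

```
#       source          target          status
#       east    north   east    north   (1=ok)
-180.0   90.0   -180.0   90.0   1
 180.0   90.0    180.0   90.0   1
 180.0  -90.0    180.0  -90.0   1
-180.0  -90.0   -180.0  -90.0   1
   0.0    0.0      0.0    0.0   1
```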

Processing

  • Convert to chlorophyll a concentration. Note slope and intercept are taken from the image's metadata using gdalinfo.
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  echo "$map"
  g.region rast=$map
  #r.info -r $map
  Slope=5.813776e-05
  Intercept=-2
  r.mapcalc "${map}.chlor_A = 10^(($Slope * $map) + $Intercept)"
  r.support "${map}.chlor_A" units="mg m^-3"
  #r.colors -e "${map}.chlor_A" color=bcyr  # :-(
done
  • Check:
r.info A20023352002365.L3m_MO_CHLO_4.chlor_A
r.univar A20023352002365.L3m_MO_CHLO_4.chlor_A

Set colors

The chlorophyll maps are logarithmically scaled, which poses some challenges to rendering nice colors (the unconverted import image displays nicely with a linear color map). We can use Goddard's OceanColor palettes from http://oceancolor.gsfc.nasa.gov/PRODUCTS/colorbars.html

Those are given in 0-255 range, the imported HDF4 data is 0-65535, and the converted chlorophyll a concentration ranges from 0 to 65 mg/m^3.

  • Use the UNIX awk tool to convert each color rule with the same exponential function we used to convert the data, and save it to a rules file:
# scale 0-255 to 0-65535 and then convert to chlor-a values
echo "# Color rules for MODIS Chlorophyll-a" > palette_chl_etc.gcolors
Slope=5.813776e-05
Intercept=-2
cat palette_chl_etc.txt | grep -v '^#' | \
  awk -v Slope=$Slope -v Intercept=$Intercept \
    '{ printf("%f %d:%d:%d\n", \
        10^((Slope * (($1 +1)^2 -1)) + Intercept), $2, $3, $4)}' \
  >> palette_chl_etc.gcolors
echo "nv black" >> palette_chl_etc.gcolors
# better: edit last rule to be 64.574061 100:0:0

The processed color rules file is here: [2]

  • Apply color rules to imported maps:
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  r.colors "${map}.chlor_A" rules=palette_chl_etc.gcolors
done

Removing holes

See AVHRR.

  • If the data are land cover classes, r.fillnulls will produce nonsense (floating point values). Instead use r.neighbors with a modal filter, then r.patch the original with the filtered map. This will replace NULLs with the most common surrounding land cover type.

Rolling composite

Create a rolling 28 day composite:

Say you have weekly satellite data, but there are many NULL cells where there was cloud cover that day. You can fill these holes by looking for valid data in the previous week's image; if you again find clouds, look in the week prior to that, and so on, until 4 weeks have passed, after which point any remaining holes are given up on and left unfilled. You can do this quite simply with r.patch:

r.patch output=rolling28day.date input=newest,last_week,2weeks_ago,3weeks_ago

NULL data cells are filled using the maps in the given input= order.
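If the weekly maps follow a date-based naming scheme, the input= list can be built with date arithmetic; a sketch assuming GNU date and hypothetical map names like sst_2002-12-26:

```shell
end=2002-12-26    # date of the newest map
maps=""
for w in 0 7 14 21 ; do
  d=`date -d "$end -$w days" +%Y-%m-%d`
  maps="$maps${maps:+,}sst_$d"
done
echo "$maps"
# prints "sst_2002-12-26,sst_2002-12-19,sst_2002-12-12,sst_2002-12-05"
```

The resulting list can be passed straight to r.patch as input=.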

MODIS SWATH

Here is an alternative to the MODIS Swath Tool using GDAL:

See also

Bibliography

  • Alexandris, N. (2011). Burned area mapping via non-centered Principal Components Analysis using public domain data and free open source software. Department of Remote Sensing & Landscape Information Systems, Faculty of Forest and Environmental Sciences, Albert-Ludwigs-University, Freiburg im Breisgau, Germany. Dissertation Entry at freidok.uni-freiburg.de
  • Carpi G., Cagnacci F., Neteler M., Rizzoli A, 2008: Tick infestation on roe deer in relation to geographic and remotely-sensed climatic variables in a tick-borne encephalitis endemic area. Epidemiology and Infection, 136, pp. 1416-1424. (PubMed)
  • Metz, M., Rocchini, D., Neteler, M., 2014. Surface Temperatures at the Continental Scale: Tracking Changes with Remote Sensing at Unprecedented Detail. Remote Sensing 6, 3822–3840 (DOI, Abstract, PDF). Dataset download: http://gis.cri.fmach.it/eurolst
  • Neteler, M., 2005: Time series processing of MODIS satellite data for landscape epidemiological applications. International Journal of Geoinformatics, 1(1), pp. 133-138 (PDF)
  • Neteler, M., 2010: Estimating daily Land Surface Temperatures in mountainous environments by reconstructed MODIS LST data. Remote Sensing 2(1), 333-351. (DOI, Abstract, PDF)
  • Neteler, M., Roiz, D., Rocchini, D., Castellani, C. and Rizzoli, A. (2011). Terra and Aqua satellites track tiger mosquito invasion: modeling the potential distribution of Aedes albopictus in north-eastern Italy. International Journal of Health Geographics, 10:49 (DOI | PDF)
  • Neteler, M., Metz, M., Rocchini, D., Rizzoli, A., Flacio, E., Engeler, L., Guidi, V., Lüthy, P., Tonolla, M. (2013): Is Switzerland suitable for the invasion of Aedes albopictus? PLoS ONE 8(12): e82090. [DOI | PDF]
  • Roiz D., Neteler M., Castellani C., Arnoldi D., Rizzoli A., 2011: Climatic Factors Driving Invasion of the Tiger Mosquito (Aedes albopictus) into New Areas of Trentino, Northern Italy. PLoS ONE. 6(4): e14800. (DOI - PDF)
  • Zorer, R., Rocchini, D., Metz, M., Delucchi, L., Zottele, F., Meggio, F., Neteler, M. (2013): Daily MODIS Land Surface Temperature Data for the Analysis of the Heat Requirements of Grapevine Varieties. IEEE Transactions on Geoscience and Remote Sensing 51, 2128–2135. (DOI)