MODIS
Introduction
- About MODIS: http://modis.gsfc.nasa.gov/

Two satellites, Terra and Aqua, carry the MODIS sensor as part of their payload. The Moderate Resolution Imaging Spectroradiometer (MODIS) is a 36-band sensor covering the visible to thermal-infrared range; it was launched on Terra in December 1999 and on Aqua in May 2002. Each satellite passes twice a day: Terra at about 10:30 and 22:30 local time, Aqua at about 01:30 and 13:30 local time.
Details about the sensor (compiled from various sources)
MODIS (MODerate resolution Imaging Spectroradiometer) is a whiskbroom (across-track scanning) multi-spectral sensor on board the Terra and Aqua satellites that measures 36 spectral bands from the visible to the thermal infrared. Its swath is 2330 km (across track) by 10° of latitude (along track at nadir). The satellites' circular, sun-synchronous, near-polar orbits take about 99 minutes, which means that the same region is acquired several times during daylight; exact nadir observations of the same region, however, repeat only every 16 days. A complete scan of the Earth is achieved every 1 to 2 days. Due to the large swath, geometric distortions occur at the scan edges. The pixel size grows with the scan angle, by a factor of 8 at the edge of the scan, making the shape of each scan stripe look like a bow-tie. Because the scan stripes are wider at their edges, successive stripes overlap each other. This overlap creates a double-vision effect at the edges of the scan, as each point of the Earth's surface appears in two adjacent scans. The distortion emphasises the line where scan stripes meet and thus magnifies small differences between the sides of the scan mirror.
MODIS Tile Calculator: https://landweb.modaps.eosdis.nasa.gov/cgi-bin/developer/tilemap.cgi
MODIS Land
MODIS Land provides many data products including:
- Daily Surface Reflectance
- Snow coverage
- NDVI / EVI Vegetation index
- Land Surface Temperature (LST)
- Atmosphere Daily Global Products
- many more
Most products are created separately for the MODIS sensors onboard the Terra (MOD products) and Aqua (MYD products) satellites. Many products are also available as a combined MODIS product (MCD products). See the MODIS Land Team Home Page: http://modis-land.gsfc.nasa.gov/
MODIS data download: installation of the convenient GRASS GIS addon i.modis
For a convenient download addon, see i.modis, a toolset for downloading and processing of MODIS products using pyModis.
Installation of pyModis:
# either
pip install pyModis
# or
sudo pip install pyModis
Installation of i.modis toolset:
# in a GRASS GIS session
g.extension i.modis
Now i.modis, i.modis.download, and i.modis.import are locally installed.
Download of MODIS tiles: i.modis.download
To obtain the needed NASA credentials, see http://www.pymodis.org/info.html#requirements
Example: download of Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua):
##### preparation: first time only
# store NASA username and password in two lines
echo "myusername
xxxxxx" > ~/.imodis
# create new space-time cube
t.create type=strds temporaltype=absolute output=ndvi_16_5600m title="Global NDVI 16 days MOD13C1" \
description="MOD13C1 Global NDVI 16 days" semantictype=mean
##### END first time only
MODIS_TMPFOLDER=/path/to/modis/ndvi_MOD13C1.006/
# fetch list of available MODIS data up to today
i.modis.download settings=~/.imodis product=ndvi_terra_sixteen_5600 \
startday=2015-01-01 endday=`date +"%Y-%m-%d"` folder=$MODIS_TMPFOLDER
The downloaded HDF files can then be imported with i.modis.import.
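Once the tiles have been imported as GRASS rasters, they can also be registered into the space-time dataset created above. A minimal sketch (GRASS 7 syntax; the MOD13C1* map name pattern and the 16-day increment are assumptions, adjust them to your data):

# register the imported NDVI maps into the STRDS created earlier
t.register -i input=ndvi_16_5600m \
  maps=$(g.list type=raster pattern="MOD13C1*" separator=comma) \
  start="2015-01-01" increment="16 days"
# check the result
t.rast.list input=ndvi_16_5600m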
Download of MODIS tiles: modis_download.py
As an alternative to i.modis.download, the pyModis tool modis_download.py (http://www.pymodis.org/scripts/modis_download.html) can be used directly. Be sure to have created the needed .netrc file with your NASA credentials (see http://www.pymodis.org/info.html#requirements for details):
# example: Global MODIS Vegetation Index, sixteen days Global 0.05Deg CMG (Terra/Aqua)
mkdir -p ndvi_MOD13C1.006
modis_download.py -e 2015-10-01 -f 2016-01-31 --product=MOD13C1.006 ndvi_MOD13C1.006/
The downloaded HDF files can then be imported with i.modis.import.
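For reference, the .netrc mentioned above is a plain text file in your home directory. A minimal sketch is shown below; replace the placeholder credentials, and note that the urs.earthdata.nasa.gov machine entry is the usual NASA Earthdata login endpoint (verify against the pyModis requirements page linked above):

# create ~/.netrc with restrictive permissions for the NASA Earthdata login
cat > ~/.netrc << EOF
machine urs.earthdata.nasa.gov
    login myusername
    password xxxxxx
EOF
chmod 600 ~/.netrc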
MODIS Land Products Quality Control
A number of MODIS Land products include a quality subdataset that is bit-encoded: different bit ranges represent different quality assessments.
The general formula to extract a specific bit range, for start bit sb (the first bit is bit 0) and number of bits nb, is: divide the quality value by 2 to the power of sb (integer division), then take the remainder modulo 2 to the power of nb. As an r.mapcalc expression using standard math:
bitfield = (quality / (2 ^ sb)) % (2 ^ nb)
or in bit operation notation:
bitfield = (quality >> sb) & ((1 << nb) - 1)
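As a worked example of the formula, the sketch below extracts the VI usefulness field of the MOD13Q1 quality layer described further down (bits 2-5, so sb=2 and nb=4); "quality" stands for whatever name the imported QC map has in your mapset:

# bit-shift form: (1 << 4) - 1 = 15
r.mapcalc "vi_usefulness = (quality >> 2) & 15"
# equivalent arithmetic form: 2^2 = 4, 2^4 = 16
r.mapcalc "vi_usefulness_check = (quality / 4) % 16"

For the 32-bit MOD09A1 quality layer the unsigned shift '>>>' has to be used instead, as noted in the next section.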
MOD09A1/MYD09A1 version 6
Surface reflectance of MODIS bands 1 - 7 at 500 meter resolution for every 8 days
The name of the quality control subdataset is "sur_refl_qc_500m". The type is unsigned int 32 bit, and the contents are
- bits 0 - 1: MODLAND QA bits
- bits 2 - 5: band 1 data quality
- bits 6 - 9: band 2 data quality
- bits 10 - 13: band 3 data quality
- bits 14 - 17: band 4 data quality
- bits 18 - 21: band 5 data quality
- bits 22 - 25: band 6 data quality
- bits 26 - 29: band 7 data quality
- bit 30: atmospheric correction performed
- bit 31: adjacency correction performed
Here, the quality subdataset is of type unsigned int 32 bit, therefore you need to use '>>>' instead of '>>' in r.mapcalc expressions. For example, to extract the MODLAND QA bits, use
qa = quality & 3
To extract the quality of band 2, use
band2_quality = (quality >>> 6) & 15
- See the user guide at https://lpdaac.usgs.gov/sites/default/files/public/product_documentation/mod09_user_guide_v1.4.pdf
The band quality might indicate lower quality (7 = noisy detector or 8 = dead detector) even though the MODLAND QA bits indicate ideal quality on all bands. On the other hand, the MODLAND QA bits sometimes indicate a pixel "... not produced for other reasons, some or all bands ..."; in this case the band-specific data quality needs to be checked. That means pixels should only be discarded if both MODLAND QA > 1 and band quality > 8 (or band quality > 12 for a higher tolerance).
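A minimal r.mapcalc sketch of that rule, assuming the imported maps are called "qc" (sur_refl_qc_500m) and "b2" (band 2 surface reflectance); a pixel is kept if MODLAND QA <= 1 or if the band 2 quality is <= 8:

# note the unsigned shift '>>>' because the QC layer is unsigned 32 bit
r.mapcalc "b2_clean = if((qc & 3) <= 1 || ((qc >>> 6) & 15) <= 8, b2, null())"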
MOD11A1/MYD11A1 version 6
Land Surface Temperature (LST) at 1 km resolution, daily
The quality control subdatasets are named "QC_Day" and "QC_Night". The type is unsigned int 8 bit, and the contents are
- bits 0 - 1: Mandatory QA flag
- bits 2 - 3: Data quality flag
- bits 4 - 5: Emissivity error flag
- bits 6 - 7: LST error flag
The nodata (NULL) value is set to 0 (zero), which is not mentioned in the documentation but is set in the HDF file and recognized by GDAL. Unfortunately, the value 0 also indicates highest-quality pixels. In order to preserve pixels with value 0, the nodata value needs to be changed with "gdalbuildvrt -srcnodata 255 -vrtnodata 255", with "gdal_translate -a_nodata 255", or after import into GRASS with "r.null null=255". After that, pixels with a QC value of 0 and a valid LST value are of highest quality, while pixels with a QC value of 0 and an invalid LST value can be set to NULL.
The most useful bit fields are the "Mandatory QA flag", extracted with
qa = quality & 3
and the "LST error flag", extracted with
lst_error = (quality >> 6) & 3
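Putting the pieces together, a sketch for the daytime LST band; "qc_day" and "lst_day" are assumed names for the imported QC_Day and LST_Day_1km layers:

# move the nodata value away from 0 so that best-quality pixels survive (see above)
r.null map=qc_day null=255
# keep LST where the mandatory QA flag is 0 or 1 and the LST error flag is <= 1
r.mapcalc "lst_day_clean = if((qc_day & 3) <= 1 && ((qc_day >> 6) & 3) <= 1, lst_day, null())"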
MOD13Q1/MYD13Q1 version 6
Vegetation indices (NDVI and EVI) at 250 meter resolution for every 16 days
The name of the quality control subdataset is "250m 16 days VI Quality". The type is unsigned int 16 bit, and the contents are
- bits 0 - 1: MODLAND QA bits
- bits 2 - 5: VI usefulness
- bits 6 - 7: Aerosol Quantity
- bit 8: Adjacent cloud detected
- bit 9: Atmosphere BRDF Correction
- bit 10: Mixed Clouds
- bits 11 - 13: Land/Water Mask
- bit 14: Possible snow/ice
- bit 15: Possible shadow
The VI usefulness bits sometimes indicate 1111 = "Not useful for any other reason/not processed" even for a reasonable VI value and MODLAND QA bits indicating good quality. Therefore the VI usefulness should be used with care, or not at all. The MODLAND QA bits, extracted with
qa = quality & 3
are sufficient to filter out bad quality pixels with qa > 1.
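A corresponding sketch, with "vi_quality" and "ndvi" as assumed names for the imported quality and NDVI layers:

# discard pixels whose MODLAND QA bits are 2 or 3
r.mapcalc "ndvi_clean = if((vi_quality & 3) <= 1, ndvi, null())"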
Advanced MODIS LST time series reconstruction
- See http://www.geodati.fmach.it/eurolst.html (LST dataset download)
MODIS Ocean
Sea surface color, temperature, and derivative images
The following examples assume you want to import multiple images at once, and so use shell scripting loops. They also assume input files are SMI (Standard Mapped Image) format, as produced by the Ocean Color group (http://oceancolor.gsfc.nasa.gov).
SST (Level 3)
Get the data
- Get data in HDF4 format from http://oceancolor.gsfc.nasa.gov/cgi/l3
- You'll probably want the Standard Mapped Image (SMI) file version, not the Binned (BIN) one.
File names look like: A20023532002360.L3m_8D_SST_4
- key:
- A: MODIS/Aqua
- 2002: Year at start
- 353: Julian day at start
- 2002: Year at end
- 360: Julian day at end
- L3m: Level 3 data, mapped (Plate carrée)
- 8D: 8 day ensemble
- SST: Sea Surface Temperature product
- 4: 4.6km pixel size (8640x4320 image, 2.5 minute resolution)
Decompress
Run this command in the command line:
bzip2 -d A20023532002360.L3m_8D_SST_4.bz2
Check
Use GDAL's gdalinfo tool to view the HDF file's meta-data. Run this command in the command line:
gdalinfo A20023532002360.L3m_8D_SST_4
Example of SST metadata from gdalinfo:
Parameter=Sea Surface Temperature
Measure=Mean
Units=deg-C
Scaling Equation=(Slope*l3m_data) + Intercept = Parameter value
Slope=0.000717185
Intercept=-2
Scaled Data Minimum=-2
Scaled Data Maximum=45
Data Minimum=-1.999999
Data Maximum=37.06
Subdatasets:
SUBDATASET_1_NAME=HDF4_SDS:UNKNOWN:"A20023532002360.L3m_8D_SST_4":0
SUBDATASET_1_DESC=[4320x8640] l3m_data (16-bit unsigned integer)
SUBDATASET_2_NAME=HDF4_SDS:UNKNOWN:"A20023532002360.L3m_8D_SST_4":1
SUBDATASET_2_DESC=[4320x8640] l3m_qual (8-bit unsigned integer)
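The Slope and Intercept values shown here are needed again in the Processing step below. Rather than copying them by hand, a small sketch like the following can pull them out of the gdalinfo output (the grep patterns assume the metadata lines look exactly as above):

file=A20023532002360.L3m_8D_SST_4
Slope=$(gdalinfo "$file" | grep -E '^[[:space:]]*Slope=' | cut -d= -f2)
Intercept=$(gdalinfo "$file" | grep -E '^[[:space:]]*Intercept=' | cut -d= -f2)
echo "Slope=$Slope Intercept=$Intercept"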
Import
GRASS can be quite strict about moving maps around between differing map projections. Because GDAL does not automatically georeference HDF4 images, we need to apply the georeferencing information manually. This is a slight chore, but means fewer errors due to projection mis-matches later on. Everything we need to know is given in the data file's meta-data, as seen with gdalinfo.
Method 0
The USGS/NASA MODIS Reprojection Tool (MRT, https://lpdaac.usgs.gov/tools/modis_reprojection_tool) supports reprojection of Level 3 data, while the MODIS Swath Reprojection Tool (https://lpdaac.usgs.gov/tools/modis_reprojection_tool_swath) handles Level 1B or Level 2 data; both can be used to create a GeoTIFF. The swath tool requires the geolocation angles (ftp://ladsweb.nascom.nasa.gov/allData/5/MOD03/).
Method 1
Use GDAL tools to prepare the dataset and create a GeoTiff which can be directly imported into GRASS.
- Create a Lat/Lon GRASS location and mapset
- Plate carrée, or WGS84? -- Does it matter with data at the km scale?
- -- Encode with +ellps=sphere (?) or epsg:32662?, see http://lists.osgeo.org/pipermail/gdal-dev/2007-September/014134.html or http://thread.gmane.org/gmane.comp.gis.gdal.devel/12666
- FIXME: gdalinfo: " Map Projection=Equidistant Cylindrical"
- Use "+nadgrids=@null" trick?? http://proj.maptools.org/faq.html#sphere_as_wgs84
- Use gdal_translate to extract the desired data array from the HDF4 file and convert into a GeoTiff. While creating the GeoTiff specify the map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module. In this example we set the projection to be lat/lon WGS84 (EPSG code 4326), even though this may not be entirely correct.
Note: this shell script is best stored in a text file and then executed:
for file in A*SST_4 ; do
  echo "map: $file"
  gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    HDF4_SDS:UNKNOWN:"$file":0 ${file}_prep.tif
  r.in.gdal in=${file}_prep.tif out=$file
done
Method 2
Import raw HDF4 data into a lat/lon location using the new r.in.gdal -l flag in GRASS 6.5+. This is more work than Method 1 but demonstrates using GRASS tools to do the manipulation instead of GDAL tools (the result should be the same).
Extract
Because the HDF4 data file contains multiple data arrays we need to extract the one we are interested in. In this example we extract the "data" layer but not the "quality" layer, and output a GeoTIFF file using GDAL's gdal_translate tool, using the appropriate SUBDATASET NAME:
for file in *SST_4 ; do
  gdal_translate HDF4_SDS:UNKNOWN:"$file":0 ${file}_unproj.tif
done
Import
Update: in GRASS 6.5 and newer you can now use the r.in.gdal `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run r.region to correct the map's region bounds.
- Create a lat/lon GRASS location and mapset
- Import the imagery and set image bounds and resolution:
for file in *unproj.tif ; do
  BASE=`basename $file _unproj.tif`
  echo "map: $file"
  r.in.gdal -l in=$file out=$BASE
  r.region $BASE n=90 s=-90 w=-180 e=180
done
- Check:
g.list rast
r.info A20023532002360.L3m_8D_SST_4
Obsolete pre-6.5 alternative (kept for reference): convert a simple XY location into a lat/lon location.
- Plate carrée, or WGS84? -- Does it matter with data at the km scale?
- Modify the location's projection settings with g.setproj (run from the PERMANENT mapset).
- Hack: after import, move the mapset directory into a world_ll location, then edit the mapset's WIND file and change the proj: line from 0 to 3 (0 is unprojected, 3 is lat/lon). You also have to edit each file header in the $MAPSET/cellhd/ directory.
- Remove NULLs
for map in `g.mlist rast pat=*SST_4` ; do
  r.null $map setnull=65535
done
Method 3
Like method 2 above, but using i.rectify instead of moving the mapset manually. See the more detailed explanation in the Chlorophyll section below.
Processing
- Convert to temperature in degrees C. Note slope and intercept are taken from the image's metadata using gdalinfo.
for map in `g.mlist rast pat=*SST_4` ; do
  echo "$map"
  g.region rast=$map
  #r.info -r $map
  Slope=0.000717185
  Intercept=-2
  r.mapcalc "${map}.degC = ($Slope * $map) + $Intercept"
  r.support "${map}.degC" units="deg-C"
  #r.colors "${map}.degC" color=bcyr # fairly nice
done
- Check:
r.info A20023532002360.L3m_8D_SST_4.degC
r.univar A20023532002360.L3m_8D_SST_4.degC
Set colors
Although the standard bcyr color map looks nice, for a very nice image we can use Goddard's OceanColor palettes from http://oceancolor.gsfc.nasa.gov/PRODUCTS/colorbars.html
Those are given in the 0-255 range, the imported HDF4 data is 0-65535, and the converted temperatures are -2 to 45 deg C.
- Use the UNIX awk tool to convert each color rule with the same scaling function we used to convert the data, and save it to a rules file:
# scale 0-255 to 0-65535 and then convert to temperature values
echo "# Color rules for MODIS SST" > palette_sst.gcolors
Slope=0.000717185
Intercept=-2
cat palette_sst.txt | grep -v '^#' | \
  awk -v Slope=$Slope -v Intercept=$Intercept \
  '{ printf("%f %d:%d:%d\n", \
   (Slope * (($1 +1)^2 -1) + Intercept), $2, $3, $4)}' \
  >> palette_sst.gcolors
echo "nv black" >> palette_sst.gcolors
# better: edit last rule to be 45.000719 206:206:206
The processed color rules file is here: http://trac.osgeo.org/grass/browser/grass-addons/raster/r.colors.tools/palettes
- Apply color rules to imported maps:
for map in `g.mlist rast pat=*SST_4` ; do
  r.colors "${map}.degC" rules=palette_sst.gcolors
done
Apply a quality mask
If there is cloud cover, the satellite will show the temperature at the top of the cloud, not at the sea surface. The HDF data comes with a quality layer and we can use it to restrict the data only to areas with the highest confidence.
Select the file:
file=T2010235.L3m_DAY_SST_4km
echo "map: $file"
Extract the quality layer from the HDF data file:
gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
  -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
  HDF4_SDS:UNKNOWN:"$file":1 ${file}.qual_prep.tif
Import the extracted GeoTiff with r.in.gdal:
r.in.gdal in=${file}.qual_prep.tif out=$file.qual
Set up some color rules to cover the quality status categories using r.colors:
r.colors $file.qual col=rules << EOF
0 green
1 yellow
2 red
255 black
EOF
Assign quality map category labels with r.category:
r.category $file.qual rules=- << EOF
0:best quality
1:not as good quality
2:rather bad quality
3:pretty bad quality
4:254:really bad quality
255:unusable quality
EOF
Setup a MASK map to only show the best quality data using a r.reclass virtual raster:
r.reclass in=$file.qual out=MASK --o << EOF
0 = 1 good quality
* = NULL
EOF
With the MASK map in place, redrawing the map display (or any other raster processing) will only occur for the good data cells.
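For example, with the MASK in place you can compute statistics for the best-quality cells only and then remove the mask when done. A short sketch, assuming the corresponding .degC map was created in the Processing step above (GRASS 6 syntax, matching the commands used elsewhere on this page; in GRASS 7 the mask is removed with r.mask -r instead):

# statistics restricted to best-quality cells
r.univar "${file}.degC"
# remove the mask when finished
g.remove rast=MASK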
Chlorophyll-a (Level 3)
Get the data
- Get data in HDF4 format from http://oceancolor.gsfc.nasa.gov/PRODUCTS/L3_chlo.html
File names look like: A20023352002365.L3m_MO_CHLO_4
- key:
- A: MODIS/Aqua
- 2002: Year at start
- 335: Julian day at start
- 2002: Year at end
- 365: Julian day at end
- L3m: Level 3 data, mapped (Plate carrée)
- MO: One month ensemble
- CHLO: Chlorophyll a concentration product
- 4: 4.6km pixel size (8640x4320 image, 2.5 minute resolution)
Decompress
Run this command in the command line:
bzip2 -d A20023352002365.L3m_MO_CHLO_4.bz2
Check
Use GDAL's gdalinfo tool to view the HDF file's meta-data. Run this command in the command line:
gdalinfo A20023352002365.L3m_MO_CHLO_4
Example of CHLO metadata from gdalinfo:
Parameter=Chlorophyll a concentration
Measure=Mean
Units=mg m^-3
Scaling=logarithmic
Scaling Equation=Base**((Slope*l3m_data) + Intercept) = Parameter value
Base=10
Slope=5.813776e-05
Intercept=-2
Scaled Data Minimum=0.01
Scaled Data Maximum=64.5654
Data Minimum=0.002637
Data Maximum=99.99774
Import
Method 1
- Create a Lat/Lon GRASS location and mapset
- Plate carrée, or WGS84? -- Does it matter with data at the km scale?
- Use gdal_translate to convert the HDF4 data into a GeoTiff, and specify map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module. In this example we set the projection to be lat/lon WGS84 (EPSG code 4326), even though this may not be entirely correct.
for file in A*_MO_CHLO_4 ; do
  echo "map: $file"
  gdal_translate -a_srs "+init=epsg:4326" -a_nodata 65535 \
    -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
    $file ${file}_prep.tif
  r.in.gdal in=${file}_prep.tif out=$file
done
Method 2
Update: in GRASS 6.5 and newer you can now use the r.in.gdal `-l` flag to import unreferenced maps directly into a lat/lon location. You must then immediately run r.region to correct the map's region bounds.
- Create a lat/lon GRASS location and mapset
- Import the imagery and set image bounds and resolution:
for file in A*_MO_CHLO_4 ; do
  echo "map: $file"
  r.in.gdal -l in=$file out=$file
  r.region $file n=90 s=-90 w=-180 e=180
done
- Check:
g.list rast
r.info A20023352002365.L3m_MO_CHLO_4
Obsolete pre-6.5 alternative (kept for reference): convert a simple XY location into a lat/lon location.
- Plate carrée, or WGS84? -- Does it matter with data at the km scale?
- Modify the location's projection settings with g.setproj (run from the PERMANENT mapset).
- Hack: after import, move the mapset directory into a world_ll location, then edit the mapset's WIND file and change the proj: line from 0 to 3 (0 is unprojected, 3 is lat/lon). You also have to edit each file header in the $MAPSET/cellhd/ directory.
- Remove NULLs
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  r.null $map setnull=65535
done
Method 3
Like Method 2 above, but instead of moving the mapset directory in the file system, use i.rectify to move the maps into the target lat/lon location. After importing the images, setting their bounds, and removing NULLs, add all images to an imagery group with the i.group module. Run i.target and select a lat/lon location. The idea is to run i.rectify with a first-order transform that uses a scaling factor of 1.0 and a rotation of 0.0, i.e. no change at all. The trick is setting up the ground control points accordingly: usually you would use i.points or the GUI georeferencing tool, and you can add a few points by hand if you like, but the POINTS file in the group's data directory should use identical coordinates for both source and destination. Hack: edit the POINTS file by hand to make it so, using the four corners and 0,0 at the center of the image. Finally run i.rectify to push the image into the target location.
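For illustration only, a hand-edited POINTS file along those lines might look like the sketch below (identical source and target coordinates for the four corners and the centre of the global image). The exact column layout is an assumption here; compare with a POINTS file written by i.points before editing by hand:

#       source        target        status
#       east  north   east  north   (1=ok)
  -180    90   -180    90    1
   180    90    180    90    1
   180   -90    180   -90    1
  -180   -90   -180   -90    1
     0     0      0     0    1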
Processing
- Convert to chlorophyll a concentration. Note slope and intercept are taken from the image's metadata using gdalinfo.
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  echo "$map"
  g.region rast=$map
  #r.info -r $map
  Slope=5.813776e-05
  Intercept=-2
  r.mapcalc "${map}.chlor_A = 10^(($Slope * $map) + $Intercept)"
  r.support "${map}.chlor_A" units="mg m^-3"
  #r.colors -e "${map}.chlor_A" color=bcyr # :-(
done
- Check:
r.info A20023352002365.L3m_MO_CHLO_4.chlor_A
r.univar A20023352002365.L3m_MO_CHLO_4.chlor_A
Set colors
The chlorophyll maps are logarithmic, which poses some challenges for rendering nice colors (the unconverted imported image displays nicely with a linear color map). We can use Goddard's OceanColor palettes from http://oceancolor.gsfc.nasa.gov/PRODUCTS/colorbars.html
Those are given in the 0-255 range, the imported HDF4 data is 0-65535, and the converted chlorophyll a concentration is 0-65 mg/m^3.
- Use the UNIX awk tool to convert each color rule with the same exponential function we used to convert the data, and save it to a rules file:
# scale 0-255 to 0-65535 and then convert to chlor-a values
echo "# Color rules for MODIS Chlorophyll-a" > palette_chl_etc.gcolors
Slope=5.813776e-05
Intercept=-2
cat palette_chl_etc.txt | grep -v '^#' | \
  awk -v Slope=$Slope -v Intercept=$Intercept \
  '{ printf("%f %d:%d:%d\n", \
   10^((Slope * (($1 +1)^2 -1)) + Intercept), $2, $3, $4)}' \
  >> palette_chl_etc.gcolors
echo "nv black" >> palette_chl_etc.gcolors
# better: edit last rule to be 64.574061 100:0:0
The processed color rules file is here: http://trac.osgeo.org/grass/browser/grass-addons/raster/r.colors.tools/palettes
- Apply color rules to imported maps:
for map in `g.mlist rast pat=*MO_CHLO_4` ; do
  r.colors "${map}.chlor_A" rules=palette_chl_etc.gcolors
done
Removing holes
See AVHRR.
- If the data are land cover classes, r.fillnulls will produce nonsense (floating point) values. Instead, use r.neighbors with a modal filter and then r.patch the original with the filtered map; this replaces NULLs with the most common surrounding land cover type (see the sketch below).
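A sketch of that approach for a hypothetical "landcover" raster:

# most common class within a 5x5 neighbourhood
r.neighbors input=landcover output=landcover_mode method=mode size=5
# original values win; NULL cells are filled from the filtered map
r.patch input=landcover,landcover_mode output=landcover_filled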
Rolling composite
Create a rolling 28 day composite:
Say you have weekly satellite data, but there are a lot of NULL cells where there was cloud cover that day. You can fill these holes by looking for valid data in the previous week's image; if you again find clouds, look in the week prior to that, and so on, until 4 weeks have passed, after which point any remaining holes are left unfilled. You can do this quite simply with r.patch:
r.patch output=rolling28day.date input=newest,last_week,2weeks_ago,3weeks_ago
NULL data cells are filled using the maps in the given input= order.
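As a sketch, the same idea applied to a whole series of weekly maps (the sst_weekNN names are hypothetical; adjust the sequence to your data):

# build a 28-day rolling composite for each week from week 4 onwards
for n in $(seq 4 52) ; do
  maps=$(printf "sst_week%02d,sst_week%02d,sst_week%02d,sst_week%02d" $n $((n-1)) $((n-2)) $((n-3)))
  r.patch input=$maps output=rolling28day.week$(printf "%02d" $n)
done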
MODIS SWATH
Here is an alternative to the MODIS Swath Tool using GDAL:
- https://gis.stackexchange.com/questions/81361/how-to-reproject-modis-swath-data-to-wgs84?rq=1
See also
- SeaWiFS
- AVHRR
- Decimal/Binary Conversion Tool for MODIS QA bit patterns: http://acc6.its.brooklyn.cuny.edu/~gurwitz/core5/nav2tool.html
Bibliography
- Alexandris, N. (2011). Burned area mapping via non-centered Principal Components Analysis using public domain data and free open source software. Department of Remote Sensing & Landscape Information Systems, Faculty of Forest and Environmental Sciences, Albert-Ludwigs-University, Freiburg im Breisgau, Germany. Dissertation Entry at freidok.uni-freiburg.de
- Carpi G., Cagnacci F., Neteler M., Rizzoli A, 2008: Tick infestation on roe deer in relation to geographic and remotely-sensed climatic variables in a tick-borne encephalitis endemic area. Epidemiology and Infection, 136, pp. 1416-1424. (PubMed)
- Metz, M., Rocchini, D., Neteler, M., 2014. Surface Temperatures at the Continental Scale: Tracking Changes with Remote Sensing at Unprecedented Detail. Remote Sensing 6, 3822–3840 (DOI, Abstract, PDF). Dataset download: http://gis.cri.fmach.it/eurolst
- Neteler, M., 2005: Time series processing of MODIS satellite data for landscape epidemiological applications. International Journal of Geoinformatics, 1(1), pp. 133-138 (PDF)
- Neteler, M., 2010: Estimating daily Land Surface Temperatures in mountainous environments by reconstructed MODIS LST data. Remote Sensing 2(1), 333-351. (DOI, Abstract, PDF)
- Neteler, M., Roiz, D., Rocchini, D., Castellani, C. and Rizzoli, A. (2011). Terra and Aqua satellites track tiger mosquito invasion: modeling the potential distribution of Aedes albopictus in north-eastern Italy. International Journal of Health Geographics, 10:49 (DOI | PDF)
- Neteler, M., Metz, M., Rocchini, D., Rizzoli, A., Flacio, E., Engeler, L., Guidi, V., Lüthy, P., Tonolla, M. (2013): Is Switzerland suitable for the invasion of Aedes albopictus? PLoS ONE 8(12): e82090. [DOI | PDF]
- Roiz D., Neteler M., Castellani C., Arnoldi D., Rizzoli A., 2011: Climatic Factors Driving Invasion of the Tiger Mosquito (Aedes albopictus) into New Areas of Trentino, Northern Italy. PLoS ONE. 6(4): e14800. (DOI - PDF)
- Zorer, R., Rocchini, D., Metz, M., Delucchi, L., Zottele, F., Meggio, F., Neteler, M. (2013): Daily MODIS Land Surface Temperature Data for the Analysis of the Heat Requirements of Grapevine Varieties. IEEE Transactions on Geoscience and Remote Sensing 51, 2128–2135. (DOI)