Latest revision as of 15:04, 20 December 2012
AVHRR Pathfinder SST
About
AVHRR Pathfinder sea surface temperature data (v5.0) at 4 km resolution is available from 1985 to the present. (Interim data is available up to the end of last year; at the time of writing, 2006 was the most recent year with complete 4 km data fully available.)
- See the 4km AVHRR Pathfinder homepage at NOAA's NODC. (especially the user's guide)
- The data can be downloaded from their FTP site:
- The file size for a 4 km resolution data file is approx. 16 MB.
- You will probably want to download both the raw sst data and the quality mask layer ("*-sst" and "*-qual" files). During processing we will use the quality layer to mask the raw data for valid pixels.
File names look like: 2000301-2000305.s0451pfv50-sst-16b.hdf
Key:
* 2000: Year at start
* 301: Julian day at start
* 2000: Year at end
* 305: Julian day at end
* s: 16-bit data
* 04: 4 km resolution (8192x4096 image, approx. 2.637 minute resolution)
* 5: 5-day ensemble
* 1: nighttime (3 is daytime)
* pfv50: Pathfinder version 5.0
* sst: Product: Raw SST (including cloud hits)
* 16b: 16 bits (only present in pre-2003 data)
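As an illustration of the key above, the date fields can be pulled apart with plain POSIX parameter expansion; a small sketch (the variable names are illustrative, not part of any tool):

```shell
# Sketch: decode the date fields of a Pathfinder file name with
# POSIX parameter expansion (variable names are illustrative)
fname="2000301-2000305.s0451pfv50-sst-16b.hdf"
range="${fname%%.*}"        # -> 2000301-2000305
start="${range%-*}"         # -> 2000301
end="${range#*-}"           # -> 2000305
echo "start: year ${start%???}, Julian day ${start#????}"
echo "end:   year ${end%???}, Julian day ${end#????}"
```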
Download
Here is a small shell script to download SST and quality maps for a given set of days.
#### 8-day
DAYS="\
  2000097-2000104
  2000145-2000152
  2000241-2000248
  2000313-2000320"
BASE_URL="http://data.nodc.noaa.gov/pathfinder/Version5.0/8day/2000/"

for TIME in 1 3 ; do
   for JULIAN in $DAYS ; do
      wget "${BASE_URL}${JULIAN}.s048${TIME}pfv50-sst-16b.hdf"
      wget "${BASE_URL}${JULIAN}.m048${TIME}pfv50-qual.hdf"
   done
done
Check
- Use GDAL's gdalinfo tool to view the HDF file's meta-data, as above for MODIS HDF4 data.
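As a sketch, using the example file name from above (and assuming a GDAL build with HDF4 support; the guard just skips the call if the tool or file is absent):

```shell
# Inspect the HDF4 file's metadata: subdatasets, global attributes,
# and the scale/offset values used later during processing
f="2000301-2000305.s0451pfv50-sst-16b.hdf"
if command -v gdalinfo >/dev/null 2>&1 && [ -f "$f" ]; then
   gdalinfo "$f"
fi
```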
Import
- This data is already in Lat/Lon WGS84, so things are made a bit easier.
- Open a lat/lon WGS84 location (EPSG code 4326)
- Use gdal_translate to convert the HDF4 file into a GeoTiff containing map projection, bounding box, and no-data information. Then import this GeoTiff using the r.in.gdal module.
# Convert and import SST maps
for file in *pfv50-sst*.hdf ; do
   map=`basename "$file" .hdf`
   echo "map: $map"
   gdal_translate -a_srs "+init=epsg:4326" -a_nodata 0 \
      -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
      HDF4_SDS:UNKNOWN:"$file":0 ${map}_prep.tif
   r.in.gdal in=${map}_prep.tif out="$map"
done
# Convert and import Overall Quality layers
for file in *pfv50-qual.hdf ; do
   map=`basename "$file" .hdf`
   echo "map: $map"
   gdal_translate -a_srs "+init=epsg:4326" \
      -a_ullr -180 90 180 -90 -co "COMPRESS=PACKBITS" \
      HDF4_SDS:UNKNOWN:"$file":0 ${map}_prep.tif
   r.in.gdal in=${map}_prep.tif out="$map"
   r.support "$map" title="SST Overall Quality" history=""
   r.support "$map" history="key: 0 is worst, 7 is best"
   r.colors -r "$map"   # use standard colors
done
Processing
- Convert values to temperature in degrees C. Note slope and intercept are taken from the image's metadata using gdalinfo:
scale_factor=0.075
add_off=-3
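As a quick sanity check of the linear conversion (degC = scale_factor * DN + add_off), run the arithmetic on a sample raw value; the DN of 400 here is just an illustrative number:

```shell
# A raw DN of 400 should map to 0.075 * 400 - 3 = 27 deg C
awk 'BEGIN { dn=400; print 0.075 * dn - 3 }'
```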
Given those coefficients we can now process the data. In this example we will only accept pixels with a quality of 4 or greater.
Note that special care is taken to quote the map names in r.mapcalc as they contain minus signs.
for map in `g.mlist rast pat="*pfv50-sst*" | grep -v '\.degC$'` ; do
   echo "$map"
   g.region rast=$map
   #r.info -r $map
   Scale=0.075
   Intercept=-3
   qual_map="`echo "$map" | sed -e 's/\.s/.m/' -e 's/-sst.*$//'`-qual"
   r.mapcalc MASK="if(\"$qual_map\" >= 4)"
   r.mapcalc "\"${map}.degC.q4\" = ($Scale * \"$map\") + $Intercept"
   g.remove MASK
   r.support "${map}.degC.q4" units="deg-C"
done
- Check:
r.info 2000301-2000305.s0451pfv50-sst-16b.degC.q4
r.univar 2000301-2000305.s0451pfv50-sst-16b.degC.q4
- With r.univar (or d.histogram) we see some spurious data points (125 deg C water!). We can check with r.univar's "-e percentile=99" flag that the majority of the data really does not go above 30 deg C. If you like, you can use r.mapcalc to remove anything above 40 deg C:
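Spelled out, the percentile check looks like this (run inside a GRASS session; the map name is taken from the processing step above, and the guard just skips the call outside GRASS):

```shell
# Extended statistics: the 99th percentile should sit well below 40 deg C,
# confirming that values above that threshold are outliers
map="2000301-2000305.s0451pfv50-sst-16b.degC.q4"
if command -v r.univar >/dev/null 2>&1; then
   r.univar -e percentile=99 map="$map"
fi
```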
r.mapcalc "clean_map = if( orig_map > 40, null(), orig_map )"
Set colors
For a very nice looking image we can use Goddard's OceanColor palettes from
For more information on how to use those with GRASS see the above section regarding MODIS data. To apply the converted rules:
r.colors "${map}.degC.q4" rules=palette_sst.gcolors
Removing holes
Interpolation
Cloud cover or technical problems often cause gaps in the data. GRASS has a r.fillnulls module which uses spline interpolation to fill these small gaps. The holes are left in the data for a reason, so check whether filling them is appropriate for your analysis before removing them. Also note that the areas around the holes may be of low quality, and using them as the starting point for filling the holes just spreads the error wider.
The first step is to create a mask of the sea, i.e. the area over which it is valid to interpolate. To do this we overlay a time series of satellite images and see which areas are permanently no data.
- Overlay all maps; the data will be mixed up and bogus but the spatial coverage will be informative
r.patch in=`g.mlist -r rast pat='A200.*4$' sep=,` out=all_sat_images_stacked
- Create a 0/1 mask layer showing areas of sea, and label categories
r.mapcalc "sea_mask = if( isnull(all_sat_images_stacked), 0, 1)"
r.category sea_mask rules=- << EOF
0:Land
1:Sea
EOF
g.remove all_sat_images_stacked
The next step is to perform the filling operation. To speed up processing, we first temporarily fill known land areas with a value as there is no point interpolating over those gaps. To avoid interpolation overshoots, smear coastal values slightly inland before interpolation. After the holes are filled we reapply the mask to again remove those areas. As the r.fillnulls step can take some time, it is advisable to use g.region to zoom into an area of interest first.
for map in `g.mlist rast pat=*SST_4.degC` ; do
   r.grow in=$map out=${map}.grown radius=5
   r.mapcalc "tmp_landfill = if( isnull($map) && sea_mask == 1, \
      null(), if( isnull(${map}.grown), 0, ${map}.grown) )"
   r.fillnulls in=tmp_landfill out=${map}.filled_tmp
   g.copy sea_mask,MASK
   r.mapcalc "${map}.filled = ${map}.filled_tmp"
   g.remove tmp_landfill,${map}.grown,${map}.filled_tmp,MASK
done
NOAA HRPT
High Resolution Picture Transmission (HRPT) - imagery and atmospheric sounding
See:
- http://noaasis.noaa.gov/NOAASIS/ml/status.html
- http://www2.ncdc.noaa.gov/docs/klm/html/c4/sec4-1.htm
- http://www.hffax.de/html/hauptteil_hrpt.html