Mapping percentage tree cover from Envisat MERIS data using linear and nonlinear techniques


Berberoğlu S., Şatır O., Atkinson P. M.

INTERNATIONAL JOURNAL OF REMOTE SENSING, vol.30, no.18, pp.4747-4766, 2009 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 30 Issue: 18
  • Publication Date: 2009
  • DOI: 10.1080/01431160802660554
  • Journal Name: INTERNATIONAL JOURNAL OF REMOTE SENSING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.4747-4766
  • Van Yüzüncü Yıl University Affiliated: No

Abstract

The aim of this study was to predict percentage tree cover from Envisat Medium Resolution Imaging Spectrometer (MERIS) imagery with a spatial resolution of 300 m by comparing four common models: a multiple linear regression (MLR) model, a linear mixture model (LMM), an artificial neural network (ANN) model and a regression tree (RT) model. The training data set was derived from a fine spatial resolution land cover classification of IKONOS imagery. Specifically, this classification was aggregated to predict percentage tree cover at the MERIS spatial resolution. The predictor variables included the MERIS wavebands plus biophysical variables (the normalized difference vegetation index (NDVI), leaf area index (LAI), fraction of photosynthetically active radiation (fPAR), fraction of green vegetation covering a unit area of horizontal soil (fCover) and MERIS terrestrial chlorophyll index (MTCI)) estimated from the MERIS data. An RT algorithm was the most accurate model to predict percentage tree cover based on the Envisat MERIS bands and vegetation biophysical variables. This study showed that Envisat MERIS data can be used to predict percentage tree cover with considerable spatial detail. Inclusion of the biophysical variables led to greater accuracy in predicting percentage tree cover. This finer-scale depiction should be useful for environmental monitoring purposes at the regional scale.
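The training-data derivation step described above (aggregating a fine spatial resolution tree/non-tree classification to percentage tree cover at a coarser pixel size) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy 6x6 binary grid and the 3x3 aggregation block are invented for the example, whereas the study used an IKONOS-derived classification aggregated to the 300 m MERIS grid.

```python
# Sketch of aggregating a fine-resolution binary land-cover map
# (1 = tree, 0 = non-tree) to percentage tree cover per coarse pixel.
# The grid values and block size below are illustrative only.

def aggregate_tree_cover(fine, block):
    """Return percentage tree cover for each block x block window of `fine`.

    `fine` is a list of equal-length rows whose dimensions are assumed
    to be exact multiples of `block`.
    """
    rows, cols = len(fine), len(fine[0])
    coarse = []
    for r in range(0, rows, block):
        row_out = []
        for c in range(0, cols, block):
            cells = [fine[i][j]
                     for i in range(r, r + block)
                     for j in range(c, c + block)]
            # Fraction of tree pixels in the window, as a percentage.
            row_out.append(100.0 * sum(cells) / len(cells))
        coarse.append(row_out)
    return coarse

# Toy fine-resolution classification: 6x6 pixels, aggregated 3x3
# so each coarse pixel summarizes nine fine pixels.
fine = [
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 1, 1, 0, 1, 0],
    [0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 1],
]
coarse = aggregate_tree_cover(fine, 3)
print(coarse)  # 2x2 grid of percentage tree cover values
```

These continuous percentage values (rather than discrete class labels) are what a model such as a regression tree is then trained to predict from the coarse-pixel spectral bands and biophysical variables.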