Accurate estimation of forest aboveground biomass is important for global carbon budgets and ecosystem change studies.
Most algorithms for regional or global aboveground biomass estimation from optical and microwave remote sensing
data rely on empirical regression or non-parametric training methods, which require large amounts of ground
measurements for training and lack an explicit description of the interaction between electromagnetic waves and
vegetation. In this study, we propose an optical/microwave synergy method based on a coherent polarimetric SAR
model to estimate woody biomass. The study area is a sparse deciduous forest dominated by birch, with an understory of
shrubs and herbs in Daxing’anling, China. HJ-1 and Radarsat-2 images and field LAI were collected from May to August
2013; tree biophysical parameters were measured during a field campaign from August to September 2012. The
effects of the understory and wet ground were evaluated by introducing the NDVI derived from the HJ-1 imagery and the rain rate.
Field-measured LAI was used as an input to the SAR model to define the contribution of green-canopy scattering and
attenuation to the total backscatter. Finally, a logarithmic equation relating the backscatter coefficient of the direct forest
scattering mechanism to woody biomass was derived (R2 = 0.582). The retrieval results were validated against ground biomass
measurements (RMSE = 29.01 t/ha). The results indicate that the synergy of optical and microwave remote sensing data
based on a SAR model has the potential to improve the accuracy of woody biomass estimation.
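The retrieval chain summarized above (fit a logarithmic relation between the direct-forest backscatter coefficient and woody biomass, invert it to retrieve biomass, and validate against field measurements with RMSE) can be sketched as follows. The functional form sigma0 = a*ln(B) + b and all names are illustrative assumptions; the abstract does not give the actual equation or coefficients.

```python
import numpy as np

def fit_log_model(biomass, sigma0_db):
    """Least-squares fit of the assumed form sigma0 = a*ln(B) + b.

    `biomass` in t/ha, `sigma0_db` in dB; returns (a, b).
    """
    a, b = np.polyfit(np.log(biomass), sigma0_db, 1)
    return a, b

def retrieve_biomass(sigma0_db, a, b):
    """Invert the fitted relation to estimate woody biomass (t/ha)."""
    return np.exp((np.asarray(sigma0_db) - b) / a)

def rmse(estimated, measured):
    """Root-mean-square error against ground biomass measurements."""
    est, meas = np.asarray(estimated), np.asarray(measured)
    return float(np.sqrt(np.mean((est - meas) ** 2)))
```

With paired plot-level biomass and backscatter samples, `fit_log_model` gives the regression, `retrieve_biomass` maps new backscatter observations to biomass, and `rmse` reproduces the style of validation statistic reported in the abstract.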
Leaf Area Index (LAI) is a key vegetation biophysical variable. To use remote sensing LAI
products effectively across disciplines, it is critical to understand their accuracy. The common method for validating
LAI products is first to establish an empirical relationship between field data and high-resolution imagery to
derive LAI maps, and then to aggregate the high-resolution LAI maps to match moderate-resolution LAI products. This method
suits only small regions, and the frequency of its measurements is limited. Therefore, continuous LAI
datasets observed by ground station networks are important for validating multi-temporal LAI products. However, owing to
the scale mismatch between the point observation at a ground station and the pixel-scale observation of a product, direct
comparison introduces scale error. The representativeness of station measurements at the product pixel scale must
therefore be evaluated for a reasonable validation. In this paper, a case study with Chinese Ecosystem Research Network
(CERN) in situ data was conducted to introduce a methodology for estimating the representativeness of LAI station observations for
validating LAI products. We first analyzed indicators for evaluating observation representativeness and then graded
the station measurement data. Finally, the LAI measurements that represent the pixel scale were used to
validate the MODIS, GLASS, and GEOV1 LAI products. The results show that the best agreement is reached between
GLASS and GEOV1, while the lowest uncertainty is achieved by GEOV1, followed by GLASS and MODIS. We
conclude that, screened with these indicators of observation representativeness, ground station measurements can
validate multi-temporal LAI products objectively, improving the reliability of the validation
of remote sensing products.
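The two scale-matching steps above, aggregating a high-resolution LAI map to the product pixel and grading whether a station observation represents its pixel, can be sketched as follows. The block-average operator, the spread-based grading rule, and the tolerance value are illustrative assumptions; the paper's actual representativeness indicators are not given in the abstract.

```python
import numpy as np

def aggregate_lai(hires_lai, factor):
    """Aggregate a high-resolution LAI map to a coarser product grid by
    averaging non-overlapping `factor` x `factor` blocks (a common, but
    here assumed, upscaling operator)."""
    h, w = hires_lai.shape
    h2, w2 = (h // factor) * factor, (w // factor) * factor
    blocks = hires_lai[:h2, :w2].reshape(h2 // factor, factor,
                                         w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

def is_representative(station_lai, pixel_lai_samples, tol=0.5):
    """Grade a station observation: accept it as pixel-representative if
    it is close to the within-pixel mean and the pixel is homogeneous
    (small spread). The threshold `tol` is illustrative only."""
    samples = np.asarray(pixel_lai_samples, dtype=float)
    close = abs(station_lai - samples.mean()) <= tol
    homogeneous = samples.std() <= tol
    return bool(close and homogeneous)
```

Only station records passing a check like `is_representative` would then be compared against the collocated MODIS, GLASS, or GEOV1 pixel values.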