Article

Use of Multi-Date and Multi-Spectral UAS Imagery to Classify Dominant Tree Species in the Wet Miombo Woodlands of Zambia

1 Forest Science Postgraduate Programme, Department of Plant and Soil Sciences, University of Pretoria, Private Bag X20, Hatfield, Pretoria 0028, South Africa
2 Department of Urban and Regional Planning, Copperbelt University, Kitwe 21692, Zambia
3 Department of Forest Engineering, Advanced Teachers Training School for Technical Education, University of Douala, P.O. Box 1872, Douala, Cameroon
4 Oliver R Tambo Africa Research Chair Initiative (ORTARChI), Chair of Environment and Development, Department of Environmental and Plant Sciences, Copperbelt University, Kitwe 21692, Zambia
5 Centre for Environmental Studies (CFES), Department of Geography, Geoinformatics and Meteorology, University of Pretoria, Private Bag X20, Hatfield, Pretoria 0028, South Africa
6 USDA Forest Service, Rocky Mountain Research Station, Forestry Sciences Laboratory, 1221 South Main St., Moscow, ID 83843, USA
7 Department of Geography, Environment and Climate Change, Mukuba University, Kitwe 50100, Zambia
* Author to whom correspondence should be addressed.
Sensors 2023, 23(4), 2241; https://doi.org/10.3390/s23042241
Submission received: 19 January 2023 / Revised: 12 February 2023 / Accepted: 14 February 2023 / Published: 16 February 2023
(This article belongs to the Section Remote Sensors)

Abstract:
Accurate maps of tree species distributions are necessary for the sustainable management of forests with desired ecological functions. However, image classification methods to produce species distribution maps for supporting sustainable forest management are still lacking in the Miombo woodland ecoregion. This study used multi-date multispectral Unmanned Aerial Systems (UAS) imagery collected at key phenological stages (leaf maturity, transition to senescence, and leaf flushing) to classify five dominant canopy species of the wet Miombo woodlands in the Copperbelt Province of Zambia. Object-based image analysis (OBIA) with a random forest algorithm was applied to single-date, multi-date, and multi-feature UAS imagery for classifying the dominant canopy tree species of the wet Miombo woodlands. Classification accuracy was found to vary with both the image dates and the features used. For example, the August image yielded the best single-date overall accuracy (OA; 80.12%, 0.68 kappa), compared to October (73.25% OA, 0.59 kappa) and May (76.64% OA, 0.63 kappa). The use of a three-date image combination improved the classification accuracy to 84.25% OA and 0.72 kappa. Adding spectral indices to the multi-date image combination further improved the accuracy to 87.07% OA and 0.83 kappa. The results highlight the potential of using multispectral UAS imagery and phenology for mapping individual tree species in the Miombo ecoregion. They also provide guidance for future studies using multispectral UAS imagery for the sustainable management of Miombo tree species.

1. Introduction

The Miombo woodlands are the most extensive dry forest type in southern Africa, with an estimated area of about 2.7 million km² covering Angola, Malawi, Mozambique, Tanzania, Zambia, Zimbabwe, and most of the southern parts of the Democratic Republic of Congo [1]. The woodlands have an estimated 8500 plant species, more than 54 percent of which are endemic. They comprise one of the most important ecosystems in Africa because of their ecological, biological, and socioeconomic significance [2,3,4]. In addition, the Miombo woodlands contribute to the livelihoods of millions of rural and urban dwellers [5]. Some of the local ecosystem goods and services include fuelwood, charcoal, timber, fruit, beekeeping, mushrooms, and medicines [3]. These forest ecosystems provide valuable timber resources that support regional economic development, but their ecosystem services have been threatened by climate change and increasing disturbances from deforestation, fragmentation, degradation, and other stressors [2,6]. Trees are the foundational component of the forest ecosystem, and their species composition has an important influence on forest biodiversity [7]. Furthermore, tree species composition and spatial distribution are critical information needed to address ecological problems in tropical ecosystems [8]. As a result, accurate information on the spatial distribution of dominant tree species in tropical natural mixed forests with complex distributions and structures, such as the Miombo woodlands, is critical for understanding the dynamics of forest ecosystems. Furthermore, precise mapping of dominant tree species is required for effective management of the Miombo woodlands, as well as for characterizing ecosystem services and climate feedbacks on forests [9]. Researchers have mapped tree species composition and distributions to assess biodiversity in other African savanna ecosystems [7,10].
Up-to-date species distribution maps, obtained either from traditional surveys or from remote sensing, are critical for sustainable forest resource management [11]. Traditional forest surveys can produce detailed and accurate maps of tree species distributions. However, they are time-consuming, labor-intensive, expensive, and prone to errors that may go undetected [10,12]. Given the difficulties of conducting traditional species mapping surveys [13], remote sensing has emerged as one of the tools for tree species mapping at scales ranging from landscape [14,15] to regional [16,17,18]. The understanding that species have unique spectral signatures associated with characteristic biochemical and biophysical properties can be exploited for mapping plant species using remote sensing [19,20]. Freely available multispectral imagery, such as Landsat and Sentinel, has low spectral resolution [21], making it unsuitable for identifying plant species, especially in heterogeneous landscapes such as the Miombo woodlands, although it can be used for regional species mapping in homogeneous landscapes dominated by planted forests [22]. Hyperspectral imagery, on the other hand, has high spectral resolution, with hundreds of contiguous bands across the electromagnetic spectrum. This makes it more suitable than multispectral imagery for capturing plant biochemical properties, which are closely linked to species identity [19,20], as has been demonstrated in many tree species classification studies across different vegetation formations at the landscape scale [10,22,23,24]. However, hyperspectral data are not widely available and remain prohibitively expensive in most Sub-Saharan African countries [25].
To compensate for the low spectral resolution that is common to high-resolution imagery (e.g., QuickBird, GeoEye, RapidEye, Pléiades, and WorldView), some studies have investigated multi-seasonal imagery for tree species classification [25,26]. One study [25] used two-date WorldView-2 imagery (maximum leaf foliage and transition to senescence) to classify tree species in the South African savannas. It compensated for the low spectral resolution of WorldView-2 imagery by combining the two dates, achieving an overall accuracy (OA) of 80.4%, compared to OAs of 76.4% and 72% for the maximum leaf foliage and transition-to-senescence imagery, respectively. Another study [26] investigated the use of multi-season (winter, spring, summer, and autumn) RapidEye imagery for classifying wetland and dryland vegetation communities in iSimangaliso Wetland Park, in South Africa's subtropical coastal region. According to its findings, the four-season imagery combination produced the highest overall classification accuracy (OA = 86 ± 2.8%), followed by the spring (80 ± 2.9%), summer (80 ± 3.1%), autumn (79 ± 3.4%), and winter (66 ± 3.1%) imagery. Though the preceding studies demonstrated the ability of high spatial resolution, multi-date imagery to discriminate tree species in other African savanna vegetation formations, none of them were conducted within the Miombo ecoregion, which has a unique forest structure, species composition, and phenology [27]. Furthermore, very high-resolution spaceborne imagery, such as RapidEye and WorldView, is not flexible enough to capture the phenological events that are important for classifying tree species, since cloud cover is a persistent challenge in the tropics where these species occur. Additionally, the data sets used in these studies are expensive and out of reach for most African savanna researchers and forest managers.
Unmanned Aerial Systems (UAS) have the flexibility to acquire data almost anytime, anywhere, with limited logistics, making them an essential tool for gathering ultra-high spatial resolution imagery (under 10 cm) of forests for detailed characterization of canopies. This is in contrast to manned aircraft and satellite platforms, which are less flexible or have fixed acquisition constraints. As a result, using multispectral UAS imagery to classify forest tree species is becoming a popular forestry application [28,29].
The application of UAS imagery for tree species discrimination has shown promising results, as demonstrated in many studies [30,31,32,33,34]. However, these studies were all conducted in different ecosystems with different tree species, forest structures, and compositions, and their findings therefore cannot be generalized to the Miombo ecoregion. Furthermore, [32] observed that the application of UAS imagery for deciduous tree species classification is still at a rudimentary level and recommended that more tests are needed to ascertain its reliability and accuracy. As already stated, species distribution maps are still lacking in the Miombo ecoregion, and remote sensing methods for classifying tree species have not been explored there. This study aims to evaluate the potential of multi-spectral and multi-date UAS imagery for classifying the dominant wet Miombo species in Zambia. It was designed to answer the following research questions:
(i) What is the optimal single-season window for acquiring imagery to discriminate tree species in the Miombo ecoregion?
(ii) Could multi-season imagery improve the discrimination of tree species in the Miombo ecoregion?
(iii) What other image features can improve Miombo species classification?

2. Materials and Methods

The workflow containing the methodological steps of this study is shown in Figure 1. Within the framework of this study, we acquired multi-date and multi-spectral imagery from a multi-rotor UAS and combined it with individual tree crown delineation algorithms and a machine-learning classifier to identify the dominant tree species in the Miombo woodland of the Mwekera area in Zambia.

2.1. Study Area

The study area is 22 hectares of wet Miombo woodland (12.860977° S, 28.357049° E; Figure 2) in Mwekera National Forest No. 6, about 15 km southeast of the central business district of the City of Kitwe, in the Copperbelt Province of Zambia. The average human population density in the Copperbelt Province is 63.0 persons per km², with an average annual population growth rate of 2.2% (Central Statistical Office, 2012). Mwekera Forest covers about 111 km², and its elevation ranges from 1210 to 1240 m above mean sea level. Annual rainfall ranges between 1000 and 1500 mm, and the temperature ranges between 25 °C and 32 °C. The Miombo woodlands, which cover approximately 45% of Zambia, are the predominant vegetation in Mwekera [35]. Mwekera Forest was classified as a National Forest to protect the Mwekera stream catchment, which is part of the Kafue River system.

2.2. Field Data Collection

The fieldwork was conducted in May 2021, just before the first flight. Considering the accessibility of the field site and the heterogeneity of tree species, twenty plots of 20 m radius were set up every 200 m, with additional plots in areas with sudden changes in tree cover. In each plot, all trees (Appendix A) with a diameter at breast height (DBH) greater than 5 cm were sampled (N = 688). The tree attributes collected included individual tree positions, DBH, tree height, and species name. The positions of all the sampled trees were measured using a CHC LT700H real-time kinematic (RTK) Global Navigation Satellite System (GNSS) receiver. DBH was measured using a diameter tape, and tree height was measured using a Nikon Forest Pro hypsometer. In this study, we conducted our classification experiments on the dominant tree species, which were Julbernardia paniculata (JP; 18.5%), Isoberlinia angolensis (IA; 16.6%), Marquesia macroura (MM; 15.7%), Brachystegia longifolia (BL; 9.3%), and Brachystegia spiciformis (BS; 7.4%) (Table 1; Appendix A). The remaining species were recorded in less than 5% of the samples and were, therefore, not considered for classification. Furthermore, the dominant species found in Mwekera (Table 1), except for Marquesia macroura, are preferred charcoal species [36], which makes the site vulnerable to over-exploitation.

2.3. UAS Image Data Acquisition

The three UAS images used to classify tree species were acquired on 25 May 2021 at full leaf maturity, 15 August 2021 at senescence for the majority of dominant canopy tree species and early flushing for the BL and BS species, and 24 October 2021 at greening of flushed leaves for the majority of dominant species [1,37]. The DJI Phantom 4 RTK Multispectral multi-rotor UAS, equipped with one RGB camera and a multispectral camera array of five cameras covering blue (450 nm ± 16 nm), green (560 nm ± 16 nm), red (650 nm ± 16 nm), red edge (730 nm ± 16 nm), and near-infrared (840 nm ± 26 nm), as well as a D-RTK 2 mobile Global Navigation Satellite System (GNSS) base station [38], was used to capture the imagery for this study. This UAS was chosen for two capabilities: (i) Real-Time Kinematic GNSS, which enabled direct image georeferencing for easy processing and comparison of multi-date images, and (ii) an integrated sunlight sensor for consistency between images collected at different times of the day. All our flights were undertaken between 11:30 a.m. and 12:30 p.m. local time to minimize shadowing in the images. To ensure consistent comparisons between the multi-date UAS imagery, the same UAS flight parameters were applied on all dates (Table 2).

2.4. UAS Data Pre-Processing

The UAS images from the three dates were processed using the Structure from Motion (SfM) approach [39] following the workflow in Agisoft Metashape software version 1.7 [40], which can be summarized as follows: (i) photos were uploaded with the multi-camera system selected, and bands were arranged according to the image metadata; (ii) reflectance was calibrated based on the sun sensor; (iii) photos were aligned by estimating the camera positions of the multi-camera system, and sparse point clouds were generated consisting of tie points and the estimated interior orientation parameters for each sensor; (iv) a dense point cloud was generated based on the calculated exterior and interior orientation parameters, using dense stereo matching to densify the point clouds; (v) a Digital Surface Model (DSM) was generated based on the dense point cloud and resolution; (vi) an orthophoto mosaic was generated based on the DSM; and (vii) the orthophoto mosaic was exported in GeoTIFF format. Metashape was also used to classify ground points and generate a digital terrain model (DTM), which was exported together with the DSM for further processing in the calculation of the canopy height model (CHM). To save storage space and processing time, the orthophoto mosaic, DSM, and DTM were exported at a degraded resolution of 0.30 m, which was tested and found suitable for segmenting tree crowns of mature deciduous trees [41].

2.5. Computation of the CHM

The CHM was computed based on recommendations from [42], who found that the combination of UASs with non-radiometric RGB sensors and the SfM approach (UAS-SfM) generates better DTMs in open woodlands than in closed woodlands, owing to the inability of optical UAS imagery to capture the ground beneath closed canopies. Similar observations were made by [43], who used leaf-off UAS-SfM-derived DTMs as ground reference for supporting teak plantation inventory in the dry forests of the coastal region of Ecuador. A study by [44] assessed tree damage in a West Virginia research forest using a leaf-on UAS-SfM DSM and a leaf-off DTM. Therefore, we took advantage of our multi-date data set to generate the best possible CHM from the available data by subtracting the leaf-off (15 August 2021) DTM from the leaf-on (25 May 2021) DSM. The computed CHM was resampled to 0.3 m resolution to match the orthophoto and used as an input in the tree species classification process.
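The CHM computation described above reduces to a per-pixel difference between the two surface models. The following is a minimal sketch of that step in Python with NumPy, assuming the DSM and DTM have already been read into aligned arrays sharing a common nodata value; the array values and the nodata convention are illustrative, not taken from the study.

```python
import numpy as np

def compute_chm(dsm, dtm, nodata=-9999.0):
    """Canopy height model: leaf-on DSM minus leaf-off DTM.

    Pixels flagged as nodata in either input remain nodata; small negative
    differences (noise over bare ground) are clipped to zero.
    """
    chm = np.where((dsm == nodata) | (dtm == nodata), nodata, dsm - dtm)
    valid = chm != nodata
    chm[valid] = np.clip(chm[valid], 0.0, None)
    return chm

# Toy 2 x 2 example: a 15 m tree crown, two ground pixels, one nodata pixel.
dsm = np.array([[1225.0, 1210.5], [1210.2, -9999.0]])
dtm = np.array([[1210.0, 1210.6], [1210.2, -9999.0]])
chm = compute_chm(dsm, dtm)
```

In practice, the two rasters would first be loaded with a GIS library and the leaf-off DTM resampled onto the DSM grid so that the subtraction is pixel-aligned.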

2.6. Tree Species Classification

The tree species were classified using object-based image analysis (OBIA) [45,46]. This method outperforms pixel-based methods for classifying tree species from high-resolution imagery [47]. Therefore, OBIA was used in this study, and it was performed in three steps: image segmentation, feature extraction, and image classification.

2.6.1. Image Segmentation

The orthophoto images were processed into homogeneous segments that closely correspond to individual tree crowns using the multiresolution segmentation (MRS) algorithm [48] implemented in eCognition Developer version 9.0 (Trimble) software. This algorithm grows image objects by merging pixels with neighboring pixels based on spectral and/or shape similarity criteria. A combination of the orthophoto and the CHM was assessed in this study, as the CHM was found to improve individual tree segmentation in other studies [49,50]. The UAS imagery captured in May (leaf maturity) was used for segmentation, since all Miombo trees have well-defined crown shapes at this stage of the year. Multiple iterations were performed via trial and error by varying the shape, compactness, and scale parameters and comparing the resulting tree crowns. Furthermore, the effect of combining the orthophoto and the CHM on the segmentation result was also assessed. The results of the segmentation were polygons of homogeneous objects, each representing a tree crown or a group of similar tree crowns. The image object polygons generated were used as a basis for segmenting the August UAS orthophoto (senescence for most of the Miombo tree species) and the October UAS orthophoto (leaf flushing for Miombo tree species). This ensured that the same tree objects were used when comparing the accuracy of the classification results from the three image dates.

Segmentation Accuracy Assessment

The accuracy of OBIA is based on the accuracy of the segmentation process, so it is important to assess the quality of the segmentation before proceeding to the subsequent processes of feature extraction and image classification. In this study, the area estimation technique described in [51] was used to assess the segmentation accuracy of tree crowns. Three measures were computed to assess the accuracy of the tree crown segmentation using the following equations:
Oversegmentation (OS) = 1 − area(ARP ∩ ADP) / area(ARP)
Undersegmentation (US) = 1 − area(ARP ∩ ADP) / area(ADP)
Segmentation error (SE) = √((OS² + US²) / 2)
where ARP is the area of a detected object segmented by the MRS algorithm that matches one-to-one with a reference polygon; ADP is the area of the reference polygon (tree crown), which was manually digitized in ArcMap (ArcGIS Desktop version 10.7.1) [52]; and area(ARP ∩ ADP) is the area of the manually delineated polygons correctly identified by the MRS algorithm. The ideal value of the oversegmentation, undersegmentation, and total detection error is 0. The reference polygons (tree crowns) were manually digitized in ArcGIS for two forest stands and then used to quantify the segmentation error.
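As a sketch of how the three measures combine, the following Python function evaluates them for a single reference-segment pair. The area values in the example are hypothetical, and the "1 −" form of the over- and undersegmentation measures is assumed from the stated ideal value of 0.

```python
import math

def segmentation_errors(area_ref, area_seg, area_overlap):
    """Over-/under-segmentation and combined error for one matched pair.

    area_seg     -- area of the detected object (ARP)
    area_ref     -- area of the reference crown polygon (ADP)
    area_overlap -- area of the intersection of ARP and ADP
    All three measures are 0 for a perfect segmentation.
    """
    os_err = 1.0 - area_overlap / area_seg   # oversegmentation (OS)
    us_err = 1.0 - area_overlap / area_ref   # undersegmentation (US)
    se = math.sqrt((os_err ** 2 + us_err ** 2) / 2.0)  # segmentation error (SE)
    return os_err, us_err, se

# Hypothetical crown: 100 m2 reference, 80 m2 segment, 70 m2 overlap.
os_err, us_err, se = segmentation_errors(area_ref=100.0, area_seg=80.0,
                                         area_overlap=70.0)
```

The quadratic mean in SE penalizes a pair that is badly wrong on either measure more than one that is moderately wrong on both.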

2.6.2. Feature Extraction

Before tree species classification, it is essential to extract features from the segmented tree objects that can be used to discriminate different tree species in the subsequent classification process [50]. The first step in our feature extraction process was to mask non-canopy objects out from canopy tree objects so that only features related to canopy tree objects were considered for the subsequent tree species classification. This was done by applying a CHM height threshold of 3 m to identify canopy tree objects.
Non-canopy objects taller than 3 m were then separated using a normalized difference vegetation index (NDVI) value of less than 0.1. We explored the use of a combination of spectral, texture, and vegetation index features, because the use of multiple features has been found to improve tree species discrimination in other studies [15,33,53]. All the canopy tree object features for the three dates were extracted in eCognition Developer software before being exported to ArcGIS for tree species classification. The extracted features built into eCognition Developer software [54] included: spectral features (mean blue, mean green, mean red, mean red-edge, and mean near-infrared (NIR)), grey level co-occurrence matrix (GLCM) textural features (contrast, correlation, dissimilarity, and standard deviation), and band metrics (mean brightness and maximum difference). The vegetation indices included the green chromatic coordinate (GCC), the red chromatic coordinate (RCC), and NDVI, which were computed and extracted within eCognition software using the equations in Table 3.
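The three vegetation indices follow standard definitions: NDVI from the red and NIR bands, and GCC and RCC as the green and red shares of total visible reflectance. A minimal Python sketch, assuming these standard forms (Table 3 in the article gives the exact equations used) and using hypothetical per-object mean reflectances:

```python
def vegetation_indices(red, green, blue, nir):
    """NDVI, GCC, and RCC from per-object mean band reflectances.

    NDVI = (NIR - Red) / (NIR + Red)
    GCC  = Green / (Red + Green + Blue)
    RCC  = Red / (Red + Green + Blue)
    """
    ndvi = (nir - red) / (nir + red)
    visible_total = red + green + blue
    gcc = green / visible_total
    rcc = red / visible_total
    return ndvi, gcc, rcc

# Hypothetical mean reflectances for a sunlit canopy object.
ndvi, gcc, rcc = vegetation_indices(red=0.05, green=0.10, blue=0.03, nir=0.45)
```

Objects with NDVI below the 0.1 threshold described above would be treated as non-canopy and excluded before classification.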
The tree objects were exported from eCognition as shapefiles with all the extracted features as attributes. The attributes of the exported object features were rescaled by normalizing them to a common scale in order to prevent attributes with high-range values from dominating those with low-range values during the classification process [57]. All feature values were rescaled to a range of 0 to 1 in ArcMap using the attribute table field calculator (Equation (1)). The shapefiles were then converted to raster in ArcMap, with each feature being used to create a single-band raster image.
rescaled value = (feature value − minimum value) / (maximum value − minimum value)    (1)
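Equation (1) is a standard min-max rescaling applied per feature column. A small illustration (the input values are hypothetical):

```python
import numpy as np

def minmax_rescale(values):
    """Rescale a feature column to the 0-1 range, as in Equation (1)."""
    v = np.asarray(values, dtype=float)
    return (v - v.min()) / (v.max() - v.min())

# Example: three hypothetical GLCM contrast values for three objects.
scaled = minmax_rescale([10.0, 15.0, 20.0])
```

Each column is rescaled independently, so the minimum and maximum are those of that feature across all objects.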

2.6.3. Species Classification

The tree species classification was done using Random Forest (RF), a non-parametric machine learning classifier that has been widely used in tree species classification with very high resolution imagery [26,31,32,50,58]. RF builds an ensemble of decision trees from the training samples and assigns an object to the class that receives the majority vote across the trees. In the current study, RF was implemented in ArcMap. The training and validation sample image objects were collected using the training sample manager in ArcMap, guided by the field sample crowns; only sunlit objects were collected to represent a pure sample for each tree species, and a shadow class was added to classify shadowed areas. A total of 344 training samples were collected for the six classes, divided as follows: JP (89), IA (80), MM (76), BL (45), BS (45), and shadow (19). The sample data were randomly split into training (70%) and validation (30%) sets. The same training and validation samples were used to train and validate the classification results for the single-date imagery and the multi-date and multi-feature image combinations to find the optimal solution for discriminating tree species within the Miombo woodland study area.
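The study ran RF inside ArcMap; an equivalent open-source sketch of the same train/validate setup with scikit-learn is shown below. The feature matrix, the number of features, and the hyperparameters (e.g., 500 trees) are all synthetic assumptions for illustration, not the study's configuration; only the sample size, the six class labels, and the 70/30 split come from the text.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for the per-object feature table: one row per tree
# object, one column per rescaled spectral/texture/index feature.
n_objects, n_features = 344, 14
X = rng.random((n_objects, n_features))
classes = ["JP", "IA", "MM", "BL", "BS", "shadow"]
y = rng.choice(classes, size=n_objects)

# 70/30 stratified split, mirroring the study's training/validation split.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, train_size=0.7, random_state=42, stratify=y)

rf = RandomForestClassifier(n_estimators=500, random_state=42)
rf.fit(X_train, y_train)
predicted = rf.predict(X_val)
```

With real data, `X` would hold the rescaled feature rasters sampled at each object and `y` the field-verified species labels.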

Class Separability

The separability of the six classes was summarized by collecting the mean statistics of the training data for each class in the ArcMap Training Sample Manager and exporting them to Excel for plotting and visualization. The variability of the spectral, vegetation index, and texture features across dates and image combinations was visualized to assess the separability of the different species.

Classification Accuracy Assessment

The effectiveness of the different image date combinations in discriminating tree species was assessed using a confusion matrix. For each classification result, the producer's accuracy, user's accuracy, overall accuracy, and kappa statistic were computed to assess the ability to discriminate species.
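All four measures can be derived from the confusion matrix. A minimal NumPy sketch, using a hypothetical two-class matrix rather than the study's results:

```python
import numpy as np

def accuracy_metrics(cm):
    """Producer's/user's accuracies, overall accuracy, and kappa from a
    confusion matrix with rows = reference classes, columns = predicted."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    producers = np.diag(cm) / cm.sum(axis=1)  # complement of omission error
    users = np.diag(cm) / cm.sum(axis=0)      # complement of commission error
    oa = np.diag(cm).sum() / n
    # Expected chance agreement for kappa, from the row and column marginals.
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n ** 2
    kappa = (oa - pe) / (1.0 - pe)
    return producers, users, oa, kappa

# Hypothetical 2-class confusion matrix (not from the study).
producers, users, oa, kappa = accuracy_metrics([[40, 5], [10, 45]])
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy throughout the article.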

3. Results

3.1. Identifying Segmentation Parameters

In this study, after a systematic trial-and-error process, the suitable segmentation parameter combination for delineating tree crowns was scale (90), shape (0.8), and compactness (0.9). Scale was found to be the most sensitive parameter, and the effect of changing the scale while keeping the other parameters fixed was evaluated by visual comparison. This showed that when the scale factor was 50, tree crowns were over-segmented; when the scale factor was 150, tree crowns were under-segmented; and when the scale factor was 90, tree crowns were best segmented (Figure 3). We also compared the CHM's contribution to segmentation visually in Figure 4 and quantitatively in Table 4.

3.2. Discrimination of Dominant Tree Species

The investigated image features (mean spectral bands, mean spectral indices, and GLCM textural features) used for discriminating tree species revealed that the spectral indices performed better than the other image features (Appendix B). The performance of each image feature in discriminating the tree species on each of the image dates is indicated below.
Figure 5 shows the variability in the mean spectra across the three image dates. In Figure 5a (May image): in the blue band, JP, BL, and shadow were mixed, while IA, BS, and MM were discriminable; in the green band, only JP stood out with relatively high reflectance, and all the other species were mixed with shadow; in the red band, BL was discriminable, JP, IA, and shadow were somewhat mixed, and BS and MM were mixed; and in the red-edge and NIR bands, only MM was discriminable, with all other species mixed with shadow. In Figure 5b (August image): the shadow was discriminable from all the species across the five bands; all the dominant species were clearly discriminable in the red and red-edge bands; in the blue band, JP and IA were discriminable, while BS, BL, and MM were somewhat mixed; in the green band, BS and BL were discriminable, while JP, IA, and MM were somewhat mixed; and in the NIR band, JP, MM, and BS were discriminable, while IA and BL were somewhat mixed. In Figure 5c (October image): the shadow was discriminable from all the tree species in all the bands except the blue band, where it was somewhat mixed with BS; in the blue band, BL, JP, IA, and MM were mixed; in the green band, all the species were mixed; in the red band, only MM was discriminable, with the rest of the species somewhat mixed; and in the red-edge and NIR bands, MM, BL, and BS were mixed, while IA and JP were discriminable (description summarized in Appendix B).
Figure 6 shows the variability in the extracted spectral index features across the three image dates, which revealed improved species separability compared to the raw spectral band data. In Figure 6a (May image): in the brightness band, only the shadow was discriminable, with all the species mixed due to uniform brightness across species at leaf maturity; in the maximum difference band, all the species were mixed with shadow; in the NDVI band, BS was discriminable, while all other species were mixed with shadow; in the GCC band, shadow, JP, and BS were discriminable, while MM, IA, and BL were mixed; and in the RCC band, only BS was discriminable, while the rest of the species were mixed with shadow. In Figure 6b (August image): the shadow was discriminable from all the species across all spectral metric bands except GCC, where it was mixed with IA; all the dominant tree species were discriminable in the NDVI, RCC, and maximum difference bands; and in the GCC band, all species were discriminable except IA, which was mixed with shadow. In Figure 6c (October image): only MM was discriminable in the brightness band, with the rest of the species somewhat mixed with shadow; in the maximum difference band, IA, BS, and MM were discriminable, while JP and BL were somewhat mixed with shadow; in the NDVI band, JP, IA, BS, and MM were discriminable, while BL was somewhat mixed with shadow; in the GCC band, BL and IA were discriminable, while JP was mixed with shadow and BS was mixed with MM; and in the RCC band, BL, BS, and MM were discriminable, while shadow, JP, and IA were mixed (description summarized in Appendix C).
Figure 7 shows the variability in the extracted GLCM texture features across the three image dates, which exhibited more mixing among species than the other considered features. In Figure 7a (May image): the shadow was discriminable in the contrast and standard deviation bands, BS was discriminable in the entropy band, and the rest of the species were mixed in the rest of the bands. In Figure 7b (August image): the shadow and JP were discriminable in the entropy band, while in the rest of the bands the classes were mixed. In Figure 7c (October image): shadow was discriminable in all bands except the standard deviation band; BS, BL, and MM were discriminable in the entropy band; JP was discriminable in the correlation band; and in the rest of the bands, the classes were mixed (description summarized in Appendix D).

3.3. Tree Species Classification

Figure 8 presents the results of the tree species classification using the Random Forest algorithm. Visual inspection indicated that JP had the widest distribution across the study area. Figure 8b depicts the results of canopy species and herbaceous layer discrimination using the fusion of the UAS CHM and the multi-spectral orthophoto mosaic, while Figure 8c–e show the classification results from the May, August, and October images, respectively. Figure 8f shows the classification results of the best combination of multi-date and multi-feature images considered in the study.
The confusion matrices of the five dominant tree species using the three groups of metrics are shown in Table 5. In general, using single-date data, the accuracy of the tree species classification, except for Marquesia macroura, was highest in the August image (overall accuracy: 80.12%, kappa: 0.68), followed by the May image, with the October image being the least accurate. In addition, the average producer's accuracy (PA) and user's accuracy (UA) for all the dominant species were above 75% in the August image, which points to good spectral discrimination among species when JP is in senescence while BS and BL are flushing with a distinctive reddish color. Furthermore, the species were poorly separable in the October image, with BS, BL, and MM mixing across all bands and yielding an average PA and UA of less than 60%. Using multi-date images improved the tree species classification accuracy by about 4%, to 84.25% OA and 0.72 kappa. Additionally, combining multi-date images, spectral indices, and texture features improved the classification accuracy to 87.07% OA and 0.83 kappa.

4. Discussion

4.1. Segmentation of Tree Crowns

The segmentation of tree crowns in this study was completed using the MRS algorithm iteratively, via a trial-and-error method, by varying the scale, shape, and compactness parameters. The suitable parameters for delineating tree crowns in this study were 90, 0.8, and 0.9 for scale, shape, and compactness, respectively. Among these parameters, the scale parameter was found to be the most sensitive, and it substantially affected the segmentation results. This observation is consistent with the findings of studies by [41] in a mixed forest in Amstelveen, the Netherlands, and [50] in a mixed forest in Xugongqing, Dêqên, Yunnan Province, China. The combination of the multi-spectral orthophoto and the CHM improved the segmentation accuracy by 6% compared with using only the multi-spectral orthophoto (Table 4). This improvement in segmentation accuracy can be attributed to the three-dimensional structural information on the trees added by the CHM. Similar observations have been made in Arizona, United States of America (USA) [59] and on Qi'ao Island, China [15], both of which demonstrated the importance of tree height for improving segmentation accuracy in natural forest stands.
The tree crown segmentation accuracy obtained in this study is within the range (60% to 95%) reported in other deciduous forests [41,50]. However, the accuracy of tree crown segmentation may depend on many factors, including the image acquisition date and the stand structure at different sites. For example, [60] applied a local maxima method to a UAS-derived CHM to delineate individual tree crowns across a boreal forest, achieving accuracies between 40% and 95%, depending on the characteristics of the site. Another study [61] combined spectral and point cloud UAS data through sub-crown k-means clustering, with 48% of individual tree crowns correctly detected and segmented across a complex forest ecosystem. They also experimented with the same technique using the CHM only and observed an accuracy degradation of 4.1%, confirming the observation elsewhere [50] that the synergy between the CHM and spectral information gives superior results compared to a single data set approach.
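The local-maxima approach of [60] can be sketched in a few lines: treetops are detected as local maxima of the CHM, and crowns are then grown downhill from each treetop. This is an illustration of that family of methods, not the eCognition MRS workflow used in this study; the function name and parameter values are hypothetical:

```python
import numpy as np
from scipy import ndimage as ndi

def delineate_crowns(chm, min_height=2.0, window=7):
    """Local-maxima crown delineation on a canopy height model (CHM):
    treetops are pixels equal to the maximum within a moving window,
    and crowns are grown from each treetop by a watershed on the
    inverted CHM."""
    canopy = chm > min_height                    # exclude ground and shrubs
    treetops = (ndi.maximum_filter(chm, size=window) == chm) & canopy
    markers, n_trees = ndi.label(treetops)       # one integer id per treetop
    markers = markers.astype(np.int16)
    markers[~canopy] = -1                        # negative marker = background
    # Watershed cost image: invert heights so treetops have the lowest cost
    cost = np.interp(chm, (chm.min(), chm.max()), (255.0, 0.0)).astype(np.uint8)
    crowns = ndi.watershed_ift(cost, markers)
    crowns[crowns == -1] = 0                     # relabel background as 0
    return crowns, n_trees
```

The `min_height` threshold plays the same role as a minimum canopy height filter, and `window` controls how close two detected treetops may be, which is why such methods are sensitive to crown size and stand structure.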

4.2. Optimal Single Date Imagery

The August image (Figure 5b) was identified as the best single date image for discriminating tree species in the wet Miombo woodlands. August–September coincides with the transition to senescence for most of the dominant wet Miombo tree species and early flushing for some species in the Brachystegia genus [1]. Moreover, interspecies phenological differences are more pronounced during this period, which maximizes interspecies spectral variability, a key feature for separating tree species [62]. JP was strongly separable across all spectral bands in the August image, resulting in high producer’s and user’s accuracies compared to other species, and exhibited characteristics of a species in senescence, with high reflectance in the visible part of the spectrum and low reflectance in the red-edge and NIR part of the spectrum. In contrast, MM and BS exhibited the characteristics of species at leaf flushing, with low reflectance in the pigment absorption bands (blue and red) and high reflectance in the red-edge and NIR bands. These results are consistent with findings in the study by [25], who also reported better classification accuracy in the image acquired during transition periods from full green canopy to senescence in the South African savannah. These findings corroborate earlier works in other regions by [63] in West Virginia, USA, [62] in Monks Wood, Cambridgeshire, eastern England, and [64] in Hawai’i Volcanoes National Park, Hawai’i, USA. The October image, which coincided with the period when newly flushed leaves turn green in the wet Miombo woodlands [1,65], resulted in the lowest accuracy (Table 5) due to low interspecies spectral variability at this phenological stage. These results contrast with the findings by [31], who found early summer to be the optimal single date imagery for discriminating deciduous tree species in Grand-Leez municipality, Belgium. The differences in findings could be attributed to differences in species composition in the two regions.

4.3. Improved Accuracy with Multi-Date Image

The high accuracy achieved with the multi-date image compared to single date images (Figure 6 and Table 5) suggests that multi-date imagery takes advantage of interspecies differences in phenology: tree species exhibit different spectral characteristics on different dates, which compensates for the low spectral resolution [63] of the UAS imagery used in this study. It also demonstrates that using a single date image misses important information that can be used for tree species discrimination. The improvement in the classification results using multi-date imagery agrees with observations in other studies [25,26,62], which found that multi-date image data improves the spectral variability among species because of the differences in the phenological development of different species across the seasons. Additionally, [31] captured multispectral UAS imagery at strategic dates of phenological development for 130 hectares of broadleaved forest in Grand-Leez, Belgium. They used the Random Forest (RF) classification approach to classify five deciduous species groups using single-date, two-date, and three-date multispectral image combinations and observed that the three-date combination yielded superior results compared to the others.
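The multi-date RF setup described above amounts to stacking per-crown features from each acquisition date into one feature vector before training. The sketch below illustrates this with synthetic data; the feature values, class structure, and dimensions are entirely hypothetical (real inputs would be the per-object band means from the segmented imagery), and it uses scikit-learn rather than the eCognition implementation applied in this study:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical per-crown mean reflectance: 5 bands x 3 dates = 15 features
n_crowns, n_bands, n_dates = 300, 5, 3
species = rng.integers(0, 5, n_crowns)              # 5 dominant species classes
single_date = rng.normal(species[:, None], 1.0, (n_crowns, n_bands))
multi_date = np.hstack([single_date + rng.normal(0, 0.5, single_date.shape)
                        for _ in range(n_dates)])   # date-stacked feature vector

X_train, X_test, y_train, y_test = train_test_split(
    multi_date, species, test_size=0.3, random_state=0, stratify=species)
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
print(f"OA on held-out crowns: {rf.score(X_test, y_test):.2f}")
```

Because each date contributes its own block of columns, phenological differences between dates become additional axes along which the RF trees can split, which is the mechanism behind the multi-date gain discussed above.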

4.4. Image Indices Improve Classification Accuracy

The addition of spectral indices increased the separability of the different classes compared with using the raw spectral information alone. For example, the BS and shadow classes, which were difficult to separate using raw spectral information in the May image (Figure 5), became clearly separable using the spectral indices (Figure 6), demonstrating that a combination of raw spectral bands and spectral indices, even for a single date image, has the potential to improve classification accuracy. These findings corroborate work by [50] in China and [66] in Brazil showing that spectral indices improve the classification accuracy of tree species. This highlights the importance of combining raw spectral data with derived features, such as texture and spectral indices, when classifying tree species, especially when using images of lower spectral resolution. This contrasts with the findings of [26], who reported no improvement in vegetation community classification when spectral indices were used. Our study shows that the mixing of species when texture features are used for tree species classification (Figure 6) results in low classification accuracies. This is in line with the study by [67], who found that GLCM textural features, when combined with spectral features, did not improve the classification accuracy of tree species at two sites in China (a homogeneous park forest and a heterogeneous managed forest). However, our study contradicts studies [33,68,69,70] that observed texture features improving tree species discrimination. The difference in results could be attributed to the similar appearance of the Miombo woodland species [1], which translates to similar texture.
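The index features listed in Appendix C can be computed directly from the band values. The sketch below uses the standard formulations of NDVI and the green/red chromatic coordinates; treating brightness as the mean of the four bands is an assumption for illustration (eCognition computes brightness as the mean of the selected image layers, which may differ in the bands included):

```python
import numpy as np

def spectral_indices(blue, green, red, nir, eps=1e-9):
    """Standard formulations of index features; inputs are per-band
    reflectance arrays (or per-crown band means)."""
    ndvi = (nir - red) / (nir + red + eps)           # normalized difference VI
    gcc = green / (blue + green + red + eps)         # green chromatic coordinate
    rcc = red / (blue + green + red + eps)           # red chromatic coordinate
    brightness = (blue + green + red + nir) / 4.0    # assumed: mean band value
    return ndvi, gcc, rcc, brightness
```

Ratio-based features such as GCC and RCC normalize out overall illumination, which is one reason a dark class such as shadow separates from a flushing species once indices are added to the raw bands.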
The methods proposed add a new technique for mapping Miombo woodland tree species targeted for various products at a local scale. For instance, all the dominant Miombo species identified in this study are targeted for fuelwood production because of their burning qualities [5], Isoberlinia angolensis is targeted for timber, and Brachystegia longifolia is targeted for its bark rope; these uses qualify them as candidates for conservation and sustainable utilization [36]. The classification results attained using multi-date UAS imagery for the dominant Miombo species unlock the potential for mapping and monitoring their distribution, as well as for informing decision making for better management and conservation. Although the study was limited to a small site and a few species, site-specific studies confined to one or a small group of species are important for updating existing information, and thus help the sustainable use and management of forest resources [2]. The approach used here can therefore serve as a starting point for species distribution mapping in the Miombo, as well as in other regions and ecosystems, supplementing existing methods used in the conservation of tree species that provide desirable goods and ecosystem services.

5. Conclusions

This study investigated the potential of multi-spectral UAS imagery for classifying the dominant tree species of the wet Miombo woodlands. Single dates, combinations of dates, and combinations of features were used in the classification of tree species, as all of these influence the classification accuracy. The August image achieved the best single date accuracy (80.12% OA, 0.68 kappa), compared to October (73.25% OA, 0.59 kappa) and May (76.64% OA, 0.63 kappa). The use of a multi-date image combination improved the classification accuracy to 84.25% OA and 0.72 kappa. After the addition of spectral indices, the accuracy was further improved to 87.07% OA and 0.83 kappa. Multi-date imagery was found to be very useful in capturing the interspecies phenological differences that help identify different tree species in the Miombo woodlands. The study has demonstrated the applicability of multi-spectral UAS imagery and OBIA for classifying tree species in the Miombo woodlands.
The results have implications for the choice of image acquisition dates for natural resource managers using multi-spectral UAS imagery to map tree species in the Miombo woodlands and elsewhere. Judging by the variation in species separability across dates, imagery should be acquired on seasonally separated dates so as to capture all of the phenological traits that are important for separating tree species using spectral information. Specifically, denser image acquisition should be concentrated around July–September for the Miombo woodland, as this is when most of the dominant tree species are in transition from mature leaves through senescence to flushing. Due to the phenological variation of the Miombo woodland tree species, no single date imagery can outperform a broadly spread multi-date imagery combination in capturing the information required for separating different tree species.

Author Contributions

All of the authors made substantial contributions towards the successful completion of this manuscript: conceptualization, H.S. and S.S.; methodology and validation, H.S. and S.S.; field data collection protocol and fieldwork, H.S. and F.H.; writing—original draft preparation, H.S.; writing—review and editing, J.C.Z., P.W.C., A.T.H. and A.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the United States Agency for International Development through the Partnerships for Enhanced Engagement in Research (PEER) program (2000009945). Additional funding was provided by the Oliver R Tambo Africa Research Chair Initiative (ORTARChI) project, an initiative of Canada's International Development Research Centre (IDRC), South Africa's National Research Foundation (NRF), and the Department of Science and Innovation (DSI), in partnership with the Oliver & Adelaide Tambo Foundation (OATF) and the National Science and Technology Council, Zambia.

Data Availability Statement

The data are available on request from the corresponding author.

Acknowledgments

This research was supported by the USDA Forest Service, Rocky Mountain Research Station and the Oliver R Tambo Africa Research Chair Initiative (ORTARChI) project. ORTARChI is an initiative of South Africa's National Research Foundation (NRF) and the Department of Science and Innovation (DSI), in partnership with the Oliver & Adelaide Tambo Foundation (OATF), Canada's International Development Research Centre (IDRC), and the National Science and Technology Council, Zambia. The findings and conclusions in this publication are those of the authors and should not be construed to represent any official position of the organizations that funded the study, nor any official USDA or U.S. Government determination or policy.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Sampled Tree Species in the Study Area

Tree Species | N | % | DBH (cm) Mean | DBH (cm) Range | TH (m) Mean | TH (m) Range
Julbernardia paniculata | 127 | 18.5 | 31.03 | 13.5–59.90 | 17.79 | 8.50–25.00
Isoberlinia angolensis | 114 | 16.6 | 23.92 | 9.90–44.70 | 14.55 | 5.00–20.50
Marquesia macroura | 108 | 15.7 | 29.21 | 5.30–70.00 | 15.10 | 3.25–25.00
Brachystegia longifolia | 64 | 9.3 | 20.65 | 11.8–64.00 | 11.27 | 8.50–23.00
Brachystegia spiciformis | 51 | 7.4 | 18.55 | 5.00–64.20 | 9.97 | 5.80–20.50
Parinari curatellifolia | 18 | 2.6 | 23.48 | 6.00–53.50 | 13.67 | 6.00–24.00
Ochna pulchra | 17 | 2.5 | 7.62 | 5.20–10.90 | 5.70 | 4.50–8.00
Baphia bequaertii | 16 | 2.3 | 11.63 | 5.80–23.70 | 6.95 | 3.00–15.00
Pericopsis angolensis | 16 | 2.3 | 24.42 | 10.3–70.00 | 14.01 | 5.00–25.10
Diplorhynchus condylocarpon | 14 | 2.0 | 8.94 | 5.00–18.00 | 7.64 | 4.50–10.00
Anisophyllea boehmii | 11 | 1.6 | 18.77 | 5.10–44.90 | 11.74 | 3.75–19.50
Erythrina abyssinica | 11 | 1.6 | 18.05 | 8.60–33.30 | 10.21 | 5.30–20.50
Hymenocardia ulmoides | 8 | 1.2 | 24.05 | 5.40–9.90 | 19.94 | 4.50–7.00
Pseudolachnostylis maprouneifolia | 7 | 1.0 | 22.04 | 7.00–20.80 | 11.64 | 5.00–10.00
Syzygium cordatum | 7 | 1.0 | 21.20 | 9.10–19.20 | 11.21 | 5.25–10.00
Hexalobus monopetalus | 7 | 1.0 | 14.13 | 5.80–57.30 | 7.94 | 4.75–22.00
Pterocarpus angolensis | 7 | 1.0 | 12.29 | 5.30–28.10 | 8.22 | 5.30–15.00
Swartzia madagascariensis | 7 | 1.0 | 8.16 | 5.50–10.80 | 5.34 | 3.30–8.75
Diospyros batocana | 4 | 0.6 | 10.75 | 9.00–11.60 | 10.13 | 7.00–17.50
Burkia africana | 4 | 0.6 | 8.55 | 7.80–9.30 | 6.38 | 6.25–6.50
Albizia adianthifolia | 4 | 0.6 | 14.15 | 12.3–18.00 | 13.00 | 10.75–16.50
Uapaca sansibarica | 4 | 0.6 | 15.23 | 8.90–22.00 | 10.44 | 6.00–15.75
Lannea discolor | 4 | 0.6 | 13.73 | 5.50–23.50 | 9.58 | 5.00–14.50
Diospyros mespiliformis | 4 | 0.6 | 19.73 | 19.1–20.90 | 12.30 | 12.30–12.30
Brachystegia floribunda | 4 | 0.6 | 34.45 | 25.7–44.50 | 19.00 | 17.50–20.00
Mapraunea africana | 3 | 0.4 | 8.90 | 6.80–10.60 | 5.83 | 4.25–6.75
Bobgunnia madagascariensis | 3 | 0.4 | 7.90 | 7.50–8.70 | 4.50 | 4.25–5.00
Dalbergia nitidula | 3 | 0.4 | 27.93 | 22.0–30.90 | 13.17 | 13.00–13.25
Strychnos innocua | 3 | 0.4 | 7.27 | 6.40–7.70 | 6.78 | 5.35–7.50
Pseudochnostylis maprouneifolia | 3 | 0.4 | 7.77 | 5.80–11.60 | 5.87 | 5.30–7.00
Maprounea africana | 3 | 0.4 | 8.90 | 6.80–10.60 | 5.83 | 4.25–6.75
Rhus longipes | 3 | 0.4 | 9.43 | 8.80–9.90 | 5.50 | 5.00–6.00
Albizya adiansfolia | 3 | 0.4 | 18.43 | 7.80–26.70 | 13.08 | 6.75–17.50
Combretum zeyheri | 2 | 0.3 | 23.65 | 17.7–29.60 | 12.00 | 9.00–15.00
Faurea speciosa | 2 | 0.3 | 8.90 | 8.90–8.90 | 5.75 | 5.75–5.75
Magnistipula butayei | 2 | 0.3 | 15.90 | 15.9–15.90 | 8.00 | 8.00–8.00
Erythropeleum africanum | 2 | 0.3 | 30.10 | 30.1–30.10 | 17.75 | 17.75–17.75
Ochna schweinfurthiana | 2 | 0.3 | 6.80 | 6.60–7.00 | 5.95 | 5.00–6.90
Albizia antunesiana | 2 | 0.3 | 32.40 | 21.6–43.20 | 17.90 | 17.50–18.30
Albizia versicolor | 2 | 0.3 | 33.50 | 33.5–33.50 | 11.25 | 11.25–11.25
Phyllocosmos lemaireanus | 2 | 0.3 | 5.75 | 5.70–5.80 | 6.13 | 5.75–6.50
Uapaca kirkiana | 2 | 0.3 | 14.35 | 8.90–19.80 | 9.50 | 5.50–13.50
Harungana madagascariensis | 1 | 0.1 | 5.70 | 5.70–5.70 | 4.50 | 4.50–4.50
Canthium crassum | 1 | 0.1 | 37.00 | 37.0–37.00 | 22.00 | 22.00–22.00
Oxtenanthera abyssinica | 1 | 0.1 | 9.20 | 9.20–9.20 | 11.00 | 11.00–11.00
Dallbegiella nyasae | 1 | 0.1 | 33.30 | 33.3–33.30 | 17.25 | 17.25–17.25
Monotes africanus | 1 | 0.1 | 7.20 | 7.20–7.20 | 10.75 | 10.75–10.75
Syzygium guineense | 1 | 0.1 | 5.90 | 5.90–5.90 | 6.70 | 6.70–6.70
Uapaca nitida | 1 | 0.1 | 14.60 | 14.6–14.60 | 6.00 | 6.00–6.00
Albizya atunizyana | 1 | 0.1 | 7.50 | 7.50–7.50 | 7.75 | 7.75–7.75
Total | 688 | 100 | | | |

Appendix B. Summary of Class Separability Using Mean Spectral Features across the 3 Sampled Dates

Band | Separable Classes | Mixed Classes | Date
Blue | IA, BS, MM | BL, JP and shadow | 25.05.21
Green | JP | IA, BS, BL, MM, shadow |
Red | BL | JP, IA and shadow / BS, MM |
Red-edge | MM | IA, BS, BL, JP, shadow |
Near infrared | MM | IA, BS, BL, JP, shadow |
Blue | Shadow, JP, IA | BL, BS, MM | 15.08.21
Green | Shadow, BS, BL | IA, MM, JP |
Red | Shadow and all species | |
Red-edge | Shadow and all species | |
Near infrared | Shadow, JP, MM, BS | IA and BL |
Blue | Shadow, BS | BL, BS, MM | 24.10.21
Green | Shadow | All species |
Red | Shadow, BS | JP, BL, MM, IA |
Red-edge | Shadow, JP, IA | BS, BL, MM |
Near infrared | Shadow, JP, IA, BL | BS, BL, MM |

Appendix C. Summary of Class Separability Using Mean Spectral Indices Features across the 3 Sampled Dates

Feature | Separable Classes | Mixed Classes | Date
Brightness | Shadow | All species | 25.05.21
Maximum difference | | All species, shadow |
NDVI | BS | JP, IA, BL, MM, shadow |
GCC | Shadow, MM, JP, BS | IA, BL |
RCC | BS | IA, MM, BL, JP, shadow |
Brightness | Shadow, JP, IA | BL, BS and MM | 15.08.21
Maximum difference | All species and shadow | |
NDVI | All species and shadow | |
GCC | BL, BS, JP, MM | Shadow, IA |
RCC | All species and shadow | |
Brightness | MM | BL, BS, IA, JP, shadow | 24.10.21
Maximum difference | IA, BS, MM | Shadow, BL, JP |
NDVI | BS, MM, JP, IA | BL, shadow |
GCC | BL, IA | JP, shadow / BS, MM |
RCC | BS, MM, BL | Shadow, JP, IA |

Appendix D. Summary of Class Separability Using Mean Textural Features across the 3 Sampled Dates

Feature | Separable Classes | Mixed Classes | Date
Contrast | Shadow | All species | 25.05.21
Correlation | | JP, shadow, BS, MM / BL, IA |
Dissimilarity | | All classes |
Entropy | BS | IA, BL, MM, JP, shadow |
Standard deviation | Shadow | All species |
Contrast | IA | BS, BL, MM, JP, shadow | 15.08.21
Correlation | | All classes |
Dissimilarity | | All classes |
Entropy | Shadow, JP, BS | IA, MM, BL |
Standard deviation | | JP, shadow / MM, BS, BL, IA |
Contrast | Shadow | All species | 24.10.21
Correlation | Shadow, JP | MM, IA, BS, BL |
Dissimilarity | Shadow | All species |
Entropy | Shadow, BS, BL, MM | JP, IA |
Standard deviation | | All classes |

References

  1. Frost, P. The Ecology of Miombo Woodlands. In The Miombo in Transition: Woodlands and Welfare in Africa; Campbell, B.M., Ed.; Center for International Forestry Research (CIFOR): Jakarta, Indonesia, 1996; pp. 11–57. [Google Scholar]
  2. Syampungani, S.; Chirwa, P.W.; Akinnifesi, F.K.; Sileshi, G.; Ajayi, O.C. The miombo woodlands at the cross roads: Potential threats, sustainable livelihoods, policy gaps and challenges. In Natural Resources Forum; Blackwell Publishing Ltd.: Oxford, UK, 2009; Volume 33, pp. 150–159. [Google Scholar]
  3. Chirwa, P.W.; Syampungani, S.; Geldenhuys, C.J. The ecology and management of the Miombo woodlands for sustainable livelihoods in southern Africa: The case for non-timber forest products. South. For. 2016, 70, 237–245. [Google Scholar] [CrossRef]
  4. Kapinga, K.; Syampungani, S.; Kasubika, R.; Yambayamba, A.M.; Shamaoma, H. 2018 Forest Ecology and Management Species-speci fi c allometric models for estimation of the above-ground carbon stock in miombo woodlands of Copperbelt Province of Zambia. For. Ecol. Manag. 2018, 417, 184–196. [Google Scholar] [CrossRef]
  5. Campbell, B. The Miombo in Transition: Woodlands and Welfare in Africa; Campbell, B.M., Ed.; Center for International Forestry Research: Bogor, Indonesia, 1996. [Google Scholar]
  6. Luoga, E.J.; Witkowski ET, F.; Balkwill, K. Harvested and standing wood stocks in protected and communal miombo woodlands of eastern Tanzania. For. Ecol. Manag. 2002, 164, 15–30. [Google Scholar] [CrossRef]
  7. Madonsela, S.; Azong, M.; Ramoelo, A.; Mutanga, O. Estimating tree species diversity in the savannah using NDVI and woody canopy cover. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 106–115. [Google Scholar] [CrossRef] [Green Version]
  8. He, C.; Jia, S.; Luo, Y.; Hao, Z.; Yin, Q. Spatial Distribution and Species Association of Dominant Tree Species in Huangguan Plot of Qinling Mountains, China. Forests 2022, 13, 866. [Google Scholar] [CrossRef]
  9. Ribeiro, N.S.; Syampungani, S.; Matakala, N.M.; Nangoma, D.; Isabel, R.A. Miombo Woodlands Research towards the Sustainable Use of Ecosystem Services in Southern Africa; Books on Demand: Norderstedt, Germany, 2015. [Google Scholar]
  10. Cho, A.M.; Mathieu, R.; Asner, G.P.; Naidoo, L.; Van Aardt, J.; Ramoelo, A.; Debba, P.; Wessels, K.; Main, R.; Smit, I.P.J.; et al. Remote Sensing of Environment Mapping tree species composition in South African savannas using an integrated airborne spectral and LiDAR system. Remote Sens. Environ. 2012, 125, 214–226. [Google Scholar] [CrossRef]
  11. Turner, W.; Spector, S.; Gardiner, N.; Fladeland, M.; Sterling, E.; Steininger, M. Remote sensing for biodiversity science and conservation. Trends Ecol. Evol. 2003, 18, 306–314. [Google Scholar] [CrossRef]
  12. Xie, Y.; Sha, Z.; Yu, M. Remote sensing imagery in vegetation mapping: A review. J. Plant Ecol. 2008, 1, 9–23. [Google Scholar] [CrossRef]
  13. Day, M.; Gumbo, D.; Moombe, K.B.; Wijaya, A.; Sunderland, T. Zambia Country Profile Monitoring, Reporting and Verification for REDD+; CIFOR: Bogor, Indonesia, 2014. [Google Scholar]
  14. Hologa, R.; Scheffczyk, K.; Dreiser, C.; Gärtner, S. Tree species classification in a temperate mixed mountain forest landscape using random forest and multiple datasets. Remote Sens. 2021, 13, 4657. [Google Scholar] [CrossRef]
  15. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z. Object-Based Mangrove Species Classification Using Unmanned Aerial Vehicle Hyperspectral Images and Digital Surface Models. Remote Sens. 2018, 10, 89. [Google Scholar] [CrossRef] [Green Version]
  16. Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87. [Google Scholar] [CrossRef]
  17. Lim, J.; Kim, K.M.; Jin, R. Tree species classification using hyperion and sentinel-2 data with machine learning in South Korea and China. ISPRS Int. J. Geo-Inf. 2019, 8, 150. [Google Scholar] [CrossRef] [Green Version]
  18. Kollert, A.; Bremer, M.; Löw, M.; Rutzinger, M. Exploring the potential of land surface phenology and seasonal cloud free composites of one year of Sentinel-2 imagery for tree species mapping in a mountainous region. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 102208. [Google Scholar] [CrossRef]
  19. Asner, G.P.; Martin, R.E. Airborne spectranomics: Mapping canopy chemical and taxonomic diversity in tropical forests. Front. Ecol. Environ. 2009, 7, 269–276. [Google Scholar] [CrossRef] [Green Version]
  20. Cho, M.A.; Debba, P.; Mathieu, R.; Naidoo, L.; Van Aardt, J.; Asner, G.P. Improving Discrimination of Savanna Tree Species Through a Multiple-Endmember Spectral Angle Mapper Approach: Canopy-Level Analysis. IEEE Trans. Geosci. Remote Sens. 2010, 48, 4133–4142. [Google Scholar] [CrossRef]
  21. Nagendra, H.; Rocchini, D. High resolution satellite imagery for tropical biodiversity studies: The devil is in the detail. Biodivers. Conserv. 2008, 17, 3431–3442. [Google Scholar] [CrossRef]
  22. Naidoo, L.; Cho, M.A.; Mathieu, R.; Asner, G. Classification of savanna tree species, in the Greater Kruger National Park region, by integrating hyperspectral and LiDAR data in a Random Forest data mining environment. ISPRS J. Photogramm. Remote Sens. 2012, 69, 167–179. [Google Scholar] [CrossRef]
  23. Cao, J.; Liu, K.; Zhuo, L.; Liu, L.; Zhu, Y.; Peng, L. Combining UAV-based hyperspectral and LiDAR data for mangrove species classification using the rotation forest algorithm. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102414. [Google Scholar] [CrossRef]
  24. Mäyrä, J.; Keski-Saari, S.; Kivinen, S.; Tanhuanpää, T.; Hurskainen, P.; Kullberg, P.; Poikolainen, L.; Viinikka, A.; Tuominen, S.; Kumpula, T.; et al. Tree species classification from airborne hyperspectral and LiDAR data using 3D convolutional neural networks. Remote Sens. Environ. 2021, 256, 112322. [Google Scholar] [CrossRef]
  25. Madonsela, S.; Azong, M.; Mathieu, R.; Mutanga, O.; Ramoelo, A.; Van De Kerchove, R.; Wolff, E. International Journal of Applied Earth Observation and Geoinformation Multi-phenology WorldView-2 imagery improves remote sensing of savannah tree species. Int. J. Appl. Earth Obs. Geoinf. 2017, 58, 65–73. [Google Scholar]
  26. Van Deventer, H.; Azong, M.; Mutanga, O. Multi-season RapidEye imagery improves the classification of wetland and dryland communities in a subtropical coastal region. ISPRS J. Photogramm. Remote Sens. 2019, 157, 171–187. [Google Scholar] [CrossRef]
  27. White, F. The Vegetaion of Frica; Natural Resources Research; UNESCO: Paris, France, 1983. [Google Scholar]
  28. Fassnacht, F.E.; Neumann, C.; Förster, M.; Buddenbaum, H.; Ghosh, A.; Clasen, A.; Joshi, P.K.; Koch, B. Comparison of Feature Reduction Algorithms for Classifying Tree Species With Hyperspectral Data on Three Central European Test Sites. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2547–2561. [Google Scholar] [CrossRef]
  29. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447. [Google Scholar] [CrossRef]
  30. Feng, Q.; Liu, J.; Gong, J. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef] [Green Version]
  31. Lisein, J.; Michez, A.; Claessens, H.; Lejeune, P. Discrimination of Deciduous Tree Species from Time Series of Unmanned Aerial System Imagery. PLoS ONE 2015, 10, e0141006. [Google Scholar] [CrossRef]
  32. Franklin, S.E.; Ahmed, O.S. 2017 Deciduous tree species classification using object-based analysis and machine learning with unmanned aerial vehicle multispectral data multispectral data. Int. J. Remote Sens. 2018, 39, 5236–5245. [Google Scholar] [CrossRef]
  33. Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving Tree Species Classification Using UAS Multispectral Images and Texture Measures. Int. J. Geo-Information 2018, 7, 315. [Google Scholar]
  34. Feng, X.; Li, P. A Tree Species Mapping Method from UAV Images over Urban Area Using Similarity in Tree-Crown Object Histograms. Remote Sens. 2019, 11, 1982. [Google Scholar] [CrossRef] [Green Version]
  35. Stringer, L.C.; Dougill, A.J.; Mkwambisi, D.D.; Dyer, J.C.; Kalaba, F.K.; Mngoli, M. Challenges and opportunities for carbon management in Malawi and Zambia. Carbon Manag. 2012, 3, 159–173. [Google Scholar] [CrossRef] [Green Version]
  36. Syampungani, S.; Geldenhuys, C.J.; Chirwa, P.W. Miombo Woodland Utilization and Management, and Impact Perception among Stakeholders in Zambia: A Call for Policy Change in Southern Africa. J. Nat. Resour. Policy Res. 2011, 3, 163–181. [Google Scholar] [CrossRef]
  37. Shamaoma, H.; Chirwa, P.W.; Ramoelo, A.; Hudak, A.T.; Syampungani, S. The Application of UASs in Forest Management and Monitoring: Challenges and Opportunities for Use in the Miombo Woodland. Forests 2022, 13, 1812. [Google Scholar] [CrossRef]
  38. DJI. P4 Multispectral User Manual v1.0 2019.12; DJI: Shenzhen, China, 2019. [Google Scholar]
  39. Snavely, N.; Seitz, S.M.; Szeliski, R. Modeling the World from Internet Photo Collections. Int. J. Comput. Vis. 2007, 6, 245–255. [Google Scholar] [CrossRef] [Green Version]
  40. Agisoft LLC. Agisoft Metashape User Manual; Agisoft LLC: St. Petersburg, Russia, 2019. [Google Scholar]
  41. Effiom, A.E.; Van Leeuwen, L.M.; Nyktas, P.; Okojie, J.A.; Erdbrügger, J. Combining unmanned aerial vehicle and multispectral Pleiades data for tree species identification, a prerequisite for accurate carbon estimation. J. Appl. Remote Sens. 2019, 13, 034530. [Google Scholar] [CrossRef]
  42. Mlambo, R.; Woodhouse, I.H.; Gerard, F.; Anderson, K. Structure from Motion (SfM) Photogrammetry with Drone Data: A Low Cost Structure from Motion (SfM) Photogrammetry with Drone Data: A Low Cost Method for Monitoring Greenhouse Gas Emissions from Forests in Developing Countries. Forests 2017, 8, 68. [Google Scholar] [CrossRef] [Green Version]
  43. Aguilar, F.J.; Rivas, J.R.; Nemmaoui, A.; Peñalver, A.; Aguilar, M.A. UAV-Based Digital Terrain Model Generation under Leaf-Off Conditions to Support Teak Plantations Inventories in Tropical Dry Forests. A Case of the Coastal Region of Ecuador. Sensors 2019, 19, 1934. [Google Scholar] [CrossRef] [Green Version]
  44. Hentz, K.M.Â.; Strager, M.P. Cicada (Magicicada) Tree Damage Detection Based on UAV Spectral and 3D Data. Nat. Sci. 2018, 10, 31–44. [Google Scholar]
  45. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  46. Shamaoma, H.; Kerle, N.; Alkema, D. Extraction of Flood-Modelling Related Base-Data From Multi-Source Remote Sensing Imagery. In Commission VII, WG/7: Problem Solving Methodologies for Less Developed Countries; Kerle, N., Skidmore, A., Eds.; The International Society for Photogrammetry and Remote Sensing: Enschede, The Netherlands, 2006. [Google Scholar]
  47. Franklin, S.E. Pixel- and object-based multispectral classification of forest tree species from small unmanned aerial vehicles. J. Unmmaned Veh. Syst. 2017, 6, 195–211. [Google Scholar] [CrossRef] [Green Version]
  48. Benz, U.C.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogramm. Remote Sens. 2004, 58, 239–258. [Google Scholar] [CrossRef]
  49. Jakubowski, M.K.; Li, W.; Guo, Q.; Kelly, M. Delineating individual trees from lidar data: A comparison of vector- and raster-based segmentation approaches. Remote Sens. 2013, 5, 4163–4186. [Google Scholar] [CrossRef] [Green Version]
  50. Xu, Z.; Shen, X.; Cao, L.; Coops, N.C.; Goodbody, T.R.H.; Zhong, T.; Zhao, W.; Sun, Q.; Ba, S.; Zhang, Z.; et al. Tree species classi fi cation using UAS-based digital aerial photogrammetry point clouds and multispectral imageries in subtropical natural forests. Int. J. Appl. Earth Obs. Geoinf. 2020, 92, 102173. [Google Scholar]
  51. Clinton, N.; Holt, A.; Scarborough, J.; Yan, L.; Gong, P. Accuracy assessment measures for object-based image segmentation goodness. Photogramm. Eng. Remote Sens. 2010, 76, 289–299. [Google Scholar] [CrossRef]
  52. ESRI. ArcGIS Desktop: Release 10.7.1; Environmental Systems Research: Redlands, CA, USA, 2019. [Google Scholar]
  53. Shen, X.; Cao, L.; Yang, B.; Xu, Z.; Wang, G. Estimation of Forest Structural Attributes Using Spectral Indices and Point Clouds from UAS-Based. Remote Sens. 2019, 11, 800. [Google Scholar] [CrossRef] [Green Version]
  54. Trimble. eCognition Developer User Guide; Trimble Germany GmbH: Munic, Germany, 2018. [Google Scholar]
  55. Fuller, D.O.; George, T. Canopy phenology of some mopane and miombo woodlands in eastern Zambia. Glob. Ecol. Biogeogr. 1999, 8, 199–209. [Google Scholar] [CrossRef]
  56. Park, J.Y.; Muller-landau, H.C.; Lichstein, J.W.; Rifai, S.W.; Dandois, J.P.; Bohlman, S.A. Quantifying Leaf Phenology of Individual Trees and Species in a Tropical Forest Using Unmanned Aerial Vehicle (UAV) Images. Remote Sens. 2019, 11, 1534. [Google Scholar] [CrossRef] [Green Version]
  57. Hsu, C.; Chang, C.; Lin, C. A Practical Guide to Support Vector Classification; National Taiwan University: New Taipei, Taiwan, 2010. [Google Scholar]
  58. Immitzer, M.; Atzberger, C.; Koukal, T. Tree Species Classification with Random Forest Using Very High Spatial Resolution 8-Band WorldView-2 Satellite Data. Remote Sens. 2012, 4, 2661–2693. [Google Scholar] [CrossRef] [Green Version]
  59. Sankey, T.; Donager, J.; McVay, J.; Sankey, J.B. UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA. Remote Sens. Environ. 2017, 195, 30–43. [Google Scholar] [CrossRef]
  60. Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual tree detection and classification with UAV-Based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2017, 9, 185. [Google Scholar] [CrossRef] [Green Version]
  61. Yancho, J.M.M.; Coops, N.C.; Tompalski, P.; Goodbody TR, H.; Plowright, A. Fine-Scale Spatial and Spectral Clustering of UAV-Acquired Digital Aerial Photogrammetric (DAP) Point Clouds for Individual Tree Crown Detection and Segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 4131–4148. [Google Scholar] [CrossRef]
  62. Hill, R.A.; Wilson, A.K.; George, M.; Hinsley, S.A. Mapping tree species in temperate deciduous woodland using time-series multi-spectral data. Appl. Veg. Sci. 2010, 13, 86–99. [Google Scholar] [CrossRef]
  63. Key, T.; Warner, T.A.; Mcgraw, J.B.; Fajvan, M.A. A Comparison of Multispectral and Multitemporal Information in High Spatial Resolution Imagery for Classification of Individual Tree Species in a Temperate Hardwood Forest. Remote Sens. Environ. 2001, 75, 100–112. [Google Scholar] [CrossRef]
  64. Somers, B.; Asner, G.P. Multi-temporal hyperspectral mixture analysis and feature selection for invasive species mapping in rainforests. Remote Sens. Environ. 2013, 136, 14–27. [Google Scholar] [CrossRef]
  65. Ribeiro, N.S.; de Miranda, P.L.S.; Timberlake, J. Miombo Woodlands in a Changing Sustainability of People the Resilience and Environment: Securing and Woodlands; Ribeiro, N.S., Katerere, Y., Chirwa, P.W., Grundy, I.M., Eds.; Springer Nature: Cham, Switzerland, 2020; Volume 38, pp. 188–189. [Google Scholar]
  66. Ferreira, M.P.; Zortea, M.; Zanotta, D.C.; Shimabukuro, Y.E.; de Souza Filho, C.R. Mapping tree species in tropical seasonal semi-deciduous forests with hyperspectral and multispectral data. Remote Sens. Environ. 2016, 179, 66–78. [Google Scholar] [CrossRef]
  67. Yang, G.; Zhao, Y.; Li, B.; Ma, Y.; Li, R.; Jing, J.; Dian, Y. Tree species classification by employing multiple features acquired from integrated sensors. J. Sens. 2019, 2019, 3247946. [Google Scholar] [CrossRef]
  68. Deur, M.; Gašparović, M.; Balenović, I. Tree species classification in mixed deciduous forests using very high spatial resolution satellite imagery and machine learning methods. Remote Sens. 2020, 12, 3926. [Google Scholar] [CrossRef]
  69. Ferreira, M.P.; Wagner, F.H.; Aragão, L.E.O.C.; Shimabukuro, Y.E.; de Souza Filho, C.R. Tree species classification in tropical forests using visible to shortwave infrared WorldView-3 images and texture analysis. ISPRS J. Photogramm. Remote Sens. 2019, 149, 119–131. [Google Scholar] [CrossRef]
  70. Xie, Z.; Chen, Y.; Lu, D.; Li, G.; Chen, E. Classification of land cover, forest, and tree species classes with Ziyuan-3 multispectral and stereo data. Remote Sens. 2019, 11, 164. [Google Scholar] [CrossRef] [Green Version]
Figure 1. General UAS image acquisition, processing, and classification workflow.
Figure 2. Study area location.
Figure 3. Visual comparison of segmentation using different scale parameters: (a) 50 (oversegmentation); (b) 80 (correct segmentation); and (c) 150 (undersegmentation).
Figure 4. Visual comparison of segmentation using orthophoto alone vs. orthophoto with CHM at highlighted sites 1–3: (a) Original orthophoto; (b) using only the orthophoto, over-segmentation with irregular outlines for tree crowns; and (c) using orthophoto and CHM, tree crowns are well segmented with smoother outlines.
Figure 5. Species separability in different bands (1, blue; 2, green; 3, red; 4, red-edge; 5, near infrared): (a) 25.05.21 image, (b) 15.08.21 image; and (c) 24.10.21 image. S = shadow.
Figure 6. Species separability in spectral metric bands (1, brightness; 2, maximum difference; 3, NDVI; 4, GCC; 5, RCC): (a) 25.05.21 image; (b) 15.08.21 image; and (c) 24.10.21 image. S = shadow.
Figure 7. Species separability in GLCM texture bands (1, contrast; 2, correlation; 3, dissimilarity; 4, entropy; and 5, standard deviation): (a) 25.05.21 image; (b) 15.08.21 image; and (c) 24.10.21 image. S = shadow.
Figure 8. Classification of dominant tree species: (a) orthophoto mosaic at leaf maturity; (b) level 1 classification to separate trees from non-tree objects; (c) species classification at leaf maturity (May image); (d) species classification at transition to senescence (August image); (e) species classification at flushing of new leaves; and (f) species classification using multi-date and multi-feature image combination.
Table 1. Sampled dominant tree species in the area.

| Species Code | Tree Species | Common Local Uses | Trees Sampled | Training Samples | Validation Samples |
|---|---|---|---|---|---|
| JP | Julbernardia paniculata | Charcoal, pole, timber | 127 | 89 | 38 |
| IA | Isoberlinia angolensis | Charcoal, timber, pole | 114 | 80 | 34 |
| MM | Marquesia macroura | Poles, charcoal | 108 | 76 | 32 |
| BL | Brachystegia longifolia | Charcoal, bark rope | 64 | 45 | 19 |
| BS | Brachystegia spiciformis | Charcoal, bark rope | 51 | 36 | 15 |
Table 2. Imagery acquisition parameters.

| UAS Flight Parameter | Value |
|---|---|
| Camera model | DJI P4 Multispectral |
| Flight height (m) | 100 |
| Flight speed (m/s) | 5 |
| Forward overlap (%) | 85 |
| Side overlap (%) | 75 |
| Ground resolution (m) | 0.05 |
| Spectral bands | Blue, green, red, red-edge, near infrared |
| Time of flight | 11:30 a.m.–12:30 p.m. |
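The 0.05 m ground resolution in Table 2 follows from the standard photogrammetric ground-sample-distance relation. A minimal sketch is below; the sensor width, focal length, and image width used are illustrative assumptions chosen to reproduce the tabled value, not confirmed DJI P4 Multispectral specifications.

```python
def ground_sample_distance(flight_height_m, sensor_width_mm,
                           focal_length_mm, image_width_px):
    """Ground sample distance (m/pixel) from the standard relation:
    GSD = (sensor width x flight height) / (focal length x image width)."""
    return (sensor_width_mm * flight_height_m) / (focal_length_mm * image_width_px)

# Hypothetical sensor constants for illustration only: a 4.96 mm wide sensor
# behind a 5.74 mm lens producing 1600 px wide images, flown at the 100 m
# height in Table 2, yields roughly the tabled 0.05 m ground resolution.
gsd = ground_sample_distance(100, 4.96, 5.74, 1600)
```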
Table 3. Equations of vegetation indices used.

| Vegetation Index | Equation | Source |
|---|---|---|
| NDVI | NDVI = (nir − red)/(nir + red) | [55] |
| GCC | GCC = green/(blue + green + red) | [56] |
| RCC | RCC = red/(blue + green + red) | [56] |
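As a minimal sketch, the Table 3 indices can be computed per pixel from co-registered band rasters, assuming each band is available as a NumPy reflectance array (the function name and the epsilon guard are illustrative, not from the study):

```python
import numpy as np

def vegetation_indices(blue, green, red, nir):
    """Per-pixel indices from Table 3 on co-registered reflectance arrays.

    Band arguments are 2-D NumPy arrays; `eps` guards against division
    by zero in shadowed or no-data pixels.
    """
    eps = 1e-10
    ndvi = (nir - red) / (nir + red + eps)    # Normalized Difference Vegetation Index
    gcc = green / (blue + green + red + eps)  # Green Chromatic Coordinate
    rcc = red / (blue + green + red + eps)    # Red Chromatic Coordinate
    return ndvi, gcc, rcc
```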
Table 4. Segmentation accuracy using the UAS orthophoto alone and the UAS orthophoto combined with the CHM.

| Image Source | OS | US | SE | Accuracy (%) |
|---|---|---|---|---|
| Orthophoto | 0.26 | 0.17 | 0.22 | 78 |
| Orthophoto and CHM | 0.17 | 0.14 | 0.16 | 84 |

OS = oversegmentation, US = undersegmentation, SE = segmentation error.
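The SE column in Table 4 is consistent with the root-mean-square combination of the over- and undersegmentation errors (cf. Clinton et al.'s segmentation goodness measures), with accuracy reported as (1 − SE) × 100. A sketch under that assumption:

```python
import math

def segmentation_error(oversegmentation, undersegmentation):
    # Root mean square of the two error components (assumed formulation;
    # it reproduces the SE values tabulated in Table 4).
    return math.sqrt((oversegmentation ** 2 + undersegmentation ** 2) / 2)

# First row of Table 4: OS = 0.26, US = 0.17
se = segmentation_error(0.26, 0.17)
accuracy_pct = (1 - se) * 100
```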
Table 5. Comparison of classification accuracies of tree species for single-date, multi-date, and multi-feature imagery.

| Classes | 25.05.21 Spectral PA% | UA% | 15.08.21 Spectral PA% | UA% | 24.10.21 Spectral PA% | UA% | Multi-Date Spectral PA% | UA% | Multi-Date Selection (Spectral and Indices) PA% | UA% |
|---|---|---|---|---|---|---|---|---|---|---|
| JP | 61.42 | 53.56 | 93.21 | 84.74 | 79.61 | 72.00 | 95.11 | 93.17 | 96.50 | 96.03 |
| IA | 73.34 | 80.05 | 77.23 | 80.41 | 65.20 | 76.24 | 84.05 | 92.50 | 87.17 | 85.22 |
| MM | 82.44 | 88.25 | 70.08 | 67.45 | 54.17 | 60.58 | 93.86 | 84.35 | 94.88 | 86.24 |
| BL | 58.22 | 67.45 | 86.08 | 79.44 | 57.28 | 44.56 | 86.75 | 72.04 | 92.15 | 85.36 |
| BS | 74.31 | 71.25 | 75.41 | 81.98 | 52.5 | 65.05 | 91.15 | 82.15 | 95.04 | 81.26 |
| S | 65.62 | 67.15 | 98.20 | 100 | 88.75 | 86.30 | 90.52 | 96.01 | 97.42 | 100 |
| OA% | 74.64 | | 80.12 | | 68.25 | | 84.25 | | 87.07 | |
| Kappa | 0.63 | | 0.68 | | 0.59 | | 0.72 | | 0.83 | |

Abbreviations: JP = Julbernardia paniculata, IA = Isoberlinia angolensis, MM = Marquesia macroura, BL = Brachystegia longifolia, BS = Brachystegia spiciformis, S = shadow.
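The producer's accuracy (PA), user's accuracy (UA), overall accuracy (OA), and kappa reported in Table 5 are standard confusion-matrix summaries. The sketch below shows how they are derived; the function name and the small example matrix are illustrative, not the study's actual validation data.

```python
import numpy as np

def accuracy_metrics(cm):
    """PA, UA, OA, and Cohen's kappa from a square confusion matrix,
    where cm[i, j] counts reference-class-i samples assigned to
    predicted class j (classes in the same order on both axes)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    diag = np.diag(cm)
    pa = diag / cm.sum(axis=1)  # producer's accuracy: per-class recall
    ua = diag / cm.sum(axis=0)  # user's accuracy: per-class precision
    oa = diag.sum() / total     # overall accuracy
    # Chance agreement expected from the row/column marginals
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return pa, ua, oa, kappa
```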
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Shamaoma, H.; Chirwa, P.W.; Zekeng, J.C.; Ramoelo, A.; Hudak, A.T.; Handavu, F.; Syampungani, S. Use of Multi-Date and Multi-Spectral UAS Imagery to Classify Dominant Tree Species in the Wet Miombo Woodlands of Zambia. Sensors 2023, 23, 2241. https://doi.org/10.3390/s23042241
