Rapid, accurate extraction of rural residential areas is of great significance to rural planning and urbanization. Building on an improved YOLOv8 object detection algorithm, this paper proposes a method for accurately extracting rural residential areas from multi-scale remote sensing images. Rapid extraction of rural blocks is achieved by improving the YOLOv8 detection pipeline in three ways. First, a feature extraction module based on the ECA local cross-channel interaction attention mechanism is designed to mine detailed features at inconsistent scales in residential-area detection; efficient channel interaction focuses attention on positive-sample features in the feature map while reducing model complexity. Second, the Swish activation function is adopted to avoid vanishing gradients and the poor activation caused by over-fitting. Third, DIoU loss is introduced to accurately reflect the true distance error between two predicted residential areas and to enhance multi-target detection performance. Finally, ablation and comparison experiments are conducted on the CBDV1.0 building dataset. The experimental results show that the method can extract rural residential areas from multi-scale remote sensing images, providing support for large-scale remote sensing mapping of rural residential areas.
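The Swish activation and DIoU loss named above are standard, well-documented formulations. As a minimal NumPy sketch of both (an illustration of the general definitions, not the paper's YOLOv8 implementation):

```python
import numpy as np

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x). Smooth and non-monotonic,
    which mitigates the dead-activation problem of hard saturating units."""
    return x / (1.0 + np.exp(-beta * x))

def diou_loss(box_a, box_b):
    """Distance-IoU loss for axis-aligned boxes (x1, y1, x2, y2):
    1 - IoU + d^2 / c^2, where d is the distance between box centers and
    c is the diagonal of the smallest box enclosing both."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # intersection area
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)
    # squared distance between box centers
    d2 = ((ax1 + ax2 - bx1 - bx2) / 2) ** 2 + ((ay1 + ay2 - by1 - by2) / 2) ** 2
    # squared diagonal of the smallest enclosing box
    c2 = (max(ax2, bx2) - min(ax1, bx1)) ** 2 + (max(ay2, by2) - min(ay1, by1)) ** 2
    return 1.0 - iou + d2 / c2
```

Unlike plain IoU loss, the center-distance penalty term keeps the gradient informative even when two predicted boxes do not overlap, which is the property the abstract relies on for multi-target detection.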
To establish a detection network suited to buildings in remote sensing images and to reduce the poor detection performance, missed detections, and false detections caused by a lack of detailed features, this paper designs an improved network on the basis of Segformer: transposed convolutional layers are coupled into the decoder stage, and the loss of feature semantics is addressed by zero insertion ("holes") and padding. Multiple normalization and activation layers are cascaded after each convolution layer to avert over-fitting, regularize the representation, and stabilize the learned feature parameters, further improving inter-class discrimination. Ablation and comparison experiments were conducted on the AISD, MBD, and WHU remote sensing image datasets. The control groups of the ablation experiments demonstrate the robustness and effectiveness of the improved mechanisms; in comparison experiments with HRNet, PSPNet, UNet, DeepLabv3+, and the original detection algorithm, mIoU on AISD, MBD, and WHU improved by up to 12.83%, 28.82%, and 14.26%, respectively. The experimental results indicate that the improved method outperforms the comparison methods such as UNet, with better integrity of detected building edges and fewer missed and false detections.
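Reading the "holes and fillings" in the decoder as the usual mechanics of a transposed convolution (zero insertion between samples, then an ordinary convolution) — an assumption on our part, since the abstract does not spell this out — the upsampling step can be sketched in 1-D NumPy:

```python
import numpy as np

def transposed_conv1d(x, kernel, stride=2):
    """Transposed convolution as zero insertion ("holes") followed by an
    ordinary full convolution; with stride 2 and a length-2 kernel the
    spatial size doubles, as in a decoder upsampling stage."""
    x = np.asarray(x, dtype=float)
    kernel = np.asarray(kernel, dtype=float)
    # insert (stride - 1) zeros between input samples
    up = np.zeros(len(x) * stride - (stride - 1))
    up[::stride] = x
    # pad so every output position sees a full kernel window (full convolution)
    pad = len(kernel) - 1
    up = np.pad(up, pad)
    # correlate with the flipped kernel, i.e. convolve
    k = kernel[::-1]
    return np.array([up[i:i + len(k)] @ k for i in range(len(up) - len(k) + 1)])
```

For example, `transposed_conv1d([1, 2], [1, 1])` yields `[1, 1, 2, 2]`: each input sample is spread over a doubled grid, which is how the decoder recovers spatial resolution lost in the encoder. A real decoder block would cascade normalization and activation layers after this, as the abstract describes.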
In recent years, deep-learning-based hyperspectral image (HSI) processing and analysis have made significant progress. However, high-performing models require sufficient training samples, and scarce labeled samples limit their generalization ability. To solve this problem, we adopt a self-supervised learning strategy and self-train a neural network model by obtaining different views of the same sample (positive pairs). As a result, the network can learn representative features for classification from unlabeled samples. In addition, to enlarge the spatial receptive field beyond that of conventional convolutions, we use a transformer to capture long-distance dependencies for feature enhancement, adequately combining the advantages of both. Experimental results on two publicly available HSI datasets demonstrate that the proposed method extracts robust features through self-training on unlabeled samples and adapts well to HSI classification under small-sample conditions.
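Self-training on positive pairs of this kind is typically driven by a contrastive objective that pulls two views of the same sample together and pushes all other samples apart. As a hedged NumPy sketch of one standard choice, the normalized temperature-scaled cross-entropy (NT-Xent) loss — an illustration of the general technique, not necessarily the exact objective used in the paper:

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss. z1[i] and z2[i] are embeddings of two
    views (a positive pair) of sample i; every other embedding in the
    batch serves as a negative."""
    z = np.concatenate([z1, z2])                        # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)    # cosine similarity
    sim = z @ z.T / tau                                 # temperature-scaled
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                      # drop self-similarity
    # index of each embedding's positive partner in the concatenated batch
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy of the positive against all non-self pairs
    logprob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -logprob[np.arange(2 * n), pos].mean()
```

Minimizing this loss over unlabeled HSI patches is what lets the encoder learn representative features before the (small) labeled set is used for classification.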