Article

Multipopulation Particle Swarm Optimization for Evolutionary Multitasking Sparse Unmixing

1 School of Communications and Information Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
2 School of Electronic Engineering, The Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, Xidian University, Xi’an 710071, China
3 School of Cyber Engineering, Xidian University, Xi’an 710071, China
* Author to whom correspondence should be addressed.
Submission received: 22 October 2021 / Revised: 3 December 2021 / Accepted: 4 December 2021 / Published: 5 December 2021
(This article belongs to the Special Issue Applications of Computational Intelligence)

Abstract

Recently, multiobjective evolutionary algorithms (MOEAs) have been designed to cope with the sparse unmixing problem. Owing to their excellent performance in solving NP-hard optimization problems, MOEAs have also achieved good results for sparse unmixing. However, most of these MOEA-based methods unmix only a single pixel at a time, which makes them inefficient and time-consuming. In fact, sparse unmixing can naturally be seen as a multitasking problem when the hyperspectral imagery is clustered into several homogeneous regions, so that evolutionary multitasking can be employed to take advantage of the implicit parallelism between different regions. In this paper, a novel evolutionary multitasking multipopulation particle swarm optimization framework is proposed to solve the hyperspectral sparse unmixing problem. First, we resort to evolutionary multitasking optimization to cluster the hyperspectral image into multiple homogeneous regions and directly process the entire spectral matrix across the regions, thereby avoiding the curse of dimensionality. In addition, we design a novel multipopulation particle swarm optimization method as the main evolutionary engine. Furthermore, an intra-task and inter-task transfer strategy and a local exploration strategy are designed to balance the exchange of useful information during the multitasking evolutionary process. Experimental results on two benchmark hyperspectral datasets demonstrate the effectiveness of the proposed method compared with state-of-the-art sparse unmixing algorithms.

1. Introduction

With the progress of remote sensing technology, hyperspectral imagery, which captures hundreds of contiguous spectral bands, has been widely applied in both civilian and military scenarios, for example, land-cover classification [1,2,3], environmental monitoring [4,5,6] and target detection [7,8]. However, mixed pixels remain a problem due to the low spatial resolution of sensors and the mixture of surface materials [9,10]. Spectral unmixing therefore aims at extracting the collection of constituent spectra (called endmembers) from the mixed pixels and calculating the fractional abundances of these endmembers [11,12]. Spectral unmixing methods can be divided into three categories, that is, geometrical-based, statistical-based and sparse-regression-based approaches. Traditional geometrical-based and statistical-based methods are widely used because they are easy and flexible to apply, but they suffer from poor performance on highly mixed scenes and from high time consumption, respectively [13]. Sparse unmixing, an emerging spectral unmixing technology in recent years, seeks the optimal subset of a spectral library known in advance that best represents each pixel of the hyperspectral image. Among these algorithms, sparse unmixing via variable splitting and augmented Lagrangian (SUnSAL), based on the alternating direction method of multipliers, was proposed to relax the $\ell_0$ norm [14]. To overcome the disadvantage that SUnSAL only utilizes spectral information without considering spatial-contextual information, Iordache et al. proposed collaborative SUnSAL (CLSUnSAL), which improves the unmixing results by solving a joint sparse regression problem in which sparsity is simultaneously imposed on all pixels in the dataset [15,16].
Mathematically, sparse unmixing is an NP-hard problem. Multiobjective evolutionary algorithms (MOEAs), which can optimize several conflicting objectives and acquire a set of nondominated solutions called the Pareto-optimal front, are suitable for solving NP-hard problems and overcoming the aforementioned difficulty in sparse unmixing [17]. A multiobjective sparse unmixing (MOSU) model was first proposed by Gong et al. [18] to deal with sparse unmixing for hyperspectral imagery. Xu et al. [19] developed a multiobjective-optimization-based sparse unmixing method (SMoSU) to take full advantage of the spectral characteristics of hyperspectral images under the framework of the multiobjective evolutionary algorithm based on decomposition (MOEA/D). In [20], SMoSU was further improved and a classification-based model called CM-MoSU was designed, in which the estimation of distribution algorithm is modified to pay more attention to the high-quality feasible space.
However, the existing MOEA-based sparse unmixing algorithms are limited to pixel-based unmixing, which leads to low efficiency and a lack of spatial structure information [21]. In some recent studies [22,23,24], a hyperspectral image is clustered into multiple homogeneous regions based on the assumption that the active endmember sets of pixels in a homogeneous region are likely to be the same, which not only reduces the complexity of unmixing, but also further enhances the spatial correlation of pixels in the same category. Interestingly, this coincides with the idea of the evolutionary multitasking framework that has emerged in recent years. Evolutionary multitasking [25] aims to solve different multiobjective optimization problems simultaneously to take advantage of the implicit parallelism between different tasks. Therefore, it is promising to employ the evolutionary multitasking multiobjective framework to efficiently solve the sparse unmixing problem. Besides, the particle swarm optimization (PSO) algorithm, which simulates the flocking behavior of birds, has proved effective in solving multiobjective endmember extraction problems [26,27,28]. Building on this, the current multitasking paradigm can be further explored and applied to sparse unmixing problems.
In this paper, we propose a novel evolutionary multitasking multipopulation particle swarm optimization (EMMPSO) framework for sparse unmixing. In the proposed method, a hyperspectral image is first clustered into multiple homogeneous regions, then multipopulation particle swarm optimization is employed to explore each sparsity level. Finally, multiobjective optimization is applied to each task simultaneously to obtain a compromise between the reconstruction error and the endmember sparsity. Significantly, unlike traditional MOEA-based algorithms, which are aimed at pixel-based unmixing only, EMMPSO can process the entire abundance matrix thanks to the decomposition strategy of evolutionary multitasking. In addition, we design a novel intra-task and inter-task transfer strategy to overcome the impact of negative transfer in multitasking. It not only utilizes the effective information within the same task to speed up the convergence of each sub-particle swarm, but also explores the similarities between different tasks to improve the overall convergence performance. Finally, the Pareto-optimal solution in each task is used to invert the final endmember abundances.
The contributions of the proposed EMMPSO algorithm are summarized as follows:
(1)
A novel evolutionary multitasking multipopulation particle swarm optimization framework is proposed to solve the sparse unmixing problem. With the decomposition of the evolutionary multitasking, multiple homogeneous regions of a hyperspectral image can be processed simultaneously, which can accelerate the convergence by exploring the relevance of all the tasks. In addition, the Pareto optimal solution between the reconstruction error and the endmember sparsity can be obtained with the multiobjective optimization.
(2)
A multipopulation particle swarm optimization is designed within the multitasking framework as the main evolutionary engine. In addition, an intra-task and inter-task transfer strategy is proposed to balance exploration and exploitation during the evolutionary process. An efficient local exploration strategy with MOEA is designed to facilitate the search for the optimal points.
(3)
The superiority of EMMPSO in convergence speed, global optimization performance and unmixing accuracy is substantiated in comparison with classical mathematical-based and MOEA-based sparse unmixing algorithms.
The remainder of this paper is structured as follows: Section 2 briefly reviews some related work on sparse unmixing. In Section 3, our method is introduced in detail. Section 4 gives the experimental settings and the analysis of the experimental results. Finally, the conclusions and future works are described in Section 5.

2. Related Work

Generally, mixed pixels are unmixed under the linear mixing model. A single mixed pixel $\mathbf{y} \in \mathbb{R}^{L \times 1}$ with $L$ spectral bands can be expressed as:
$\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{n}, \qquad (1)$
where $\mathbf{A} \in \mathbb{R}^{L \times D}$ is the spectral library, in which all spectral signatures are known in advance. In addition, $\mathbf{x} \in \mathbb{R}^{D \times 1}$ is the corresponding fractional abundance vector, that is, the proportion of each endmember, and $\mathbf{n} \in \mathbb{R}^{L \times 1}$ represents the noise term for the mixed pixel. For a hyperspectral image $\mathbf{Y} \in \mathbb{R}^{L \times n}$ containing $n$ pixels, the matrix form of (1) can be formulated as:
$\mathbf{Y} = \mathbf{A}\mathbf{X} + \mathbf{N}. \qquad (2)$
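As a point of reference, the following minimal numpy sketch generates synthetic data according to the linear mixing model in (2). The library, sparsity level and noise level are arbitrary illustrative choices, not the settings used in the experiments.

```python
import numpy as np

# Synthetic data following Y = AX + N (Eq. (2)); sizes and sparsity are illustrative only.
rng = np.random.default_rng(0)
L, D, n, k = 224, 240, 100 * 100, 5   # bands, library size, pixels, active endmembers

A = rng.random((L, D))                              # spectral library known in advance
X = np.zeros((D, n))                                # fractional abundances (row sparse)
active = rng.choice(D, size=k, replace=False)       # indices of the true endmembers
X[active] = rng.dirichlet(np.ones(k), size=n).T     # abundances of each pixel sum to one
N = rng.normal(0.0, 0.01, size=(L, n))              # additive noise term
Y = A @ X + N                                       # observed hyperspectral image
```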
Therefore, the purpose of sparse unmixing is to obtain, from the huge spectral library, the most suitable set of endmembers for reconstructing the remote sensing image. Mathematically, this is an NP-hard optimization problem, which can be expressed as:
$\min_{\mathbf{x}} \|\mathbf{x}\|_0 \quad \mathrm{s.t.} \quad \|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2 \leq \delta. \qquad (3)$
Many studies employ relaxation methods to solve the $\ell_0$-norm problem. SUnSAL [14] relaxes the $\ell_0$-norm with the $\ell_1$-norm, and the mathematical optimization formula is as follows:
$\min_{\mathbf{x}} \frac{1}{2}\|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2 + \lambda\|\mathbf{x}\|_1 + \iota_{\mathbb{R}_+}(\mathbf{x}) + \iota_{\{1\}}(\mathbf{1}^T\mathbf{x}), \qquad (4)$
where $\lambda$ is a regularization parameter that controls the relative weight between the sparsity term and the error term. In [15], CLSUnSAL takes spatial information into account and directly processes the whole matrix, as follows:
$\min_{\mathbf{X}} \|\mathbf{Y} - \mathbf{A}\mathbf{X}\|_F^2 + \lambda\|\mathbf{X}\|_{2,1} + \iota_{\mathbb{R}_+}(\mathbf{X}). \qquad (5)$
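For concreteness, the sketch below evaluates the CLSUnSAL objective in (5) for a candidate abundance matrix. The function name and the handling of the nonnegativity indicator (treated as a hard constraint on X) are illustrative assumptions, not part of the original solver.

```python
import numpy as np

def clsunsal_objective(Y, A, X, lam):
    """Value of the CLSUnSAL objective (5) for a nonnegative candidate matrix X.

    The nonnegativity indicator is treated as a hard constraint, i.e. the
    caller is assumed to pass an elementwise nonnegative X.
    """
    residual = np.linalg.norm(Y - A @ X, ord="fro") ** 2
    row_l2 = np.linalg.norm(X, axis=1)      # l2 norm of each library row of X
    return residual + lam * row_l2.sum()    # the l2,1 term promotes row sparsity
```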
Considering the excellent performance of MOEAs in solving NP-hard optimization problems, many studies have turned to MOEAs to solve the sparse unmixing problem in recent years. Gong et al. [18] proposed a novel multiobjective cooperative coevolutionary algorithm to optimize the reconstruction term, the sparsity term and the total variation regularization term simultaneously, which can be expressed as:
$\min_{\mathbf{x}} \left( \|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2,\ \|\mathbf{x}\|_0,\ \sum_{j \in \varepsilon} \|\mathbf{x} - \mathbf{x}_j\|_1 \right), \qquad (6)$
where $\varepsilon$ stands for the set of horizontal and vertical neighbors in $\mathbf{X}$. Jiang et al. [29] decomposed the sparse unmixing problem into two stages and employed MOEAs to solve them separately. The first phase is mainly aimed at endmember extraction, with the optimized formula $\min_{\mathbf{M}} (RSE_1, SP_1)$, where $RSE_1$ is the residual of the measured hyperspectral image and $SP_1$ represents the size of the estimated endmember set $\mathbf{M}$. In the second phase, abundance estimation becomes the focus, which can be expressed as $\min (RSE_2, SP_2)$, where $RSE_2$ is the residual of the hyperspectral unmixing and $SP_2$ represents the favorable abundance matrix obtained by incorporating the spatial-contextual information. In addition, Jiang et al. [30] improved Tp-MoSU to address its limited performance in identifying real endmembers from high-noise data in the first phase and its inability to effectively use spatial-contextual information in the second phase due to the similarity metric used. Besides, many sparse unmixing algorithms based on evolutionary multiobjective decomposition [19,20,31] have also been explored.
Recently, evolutionary multitasking optimization [25,32] has attracted growing attention in the field of evolutionary computing. In a nutshell, evolutionary multitasking aims to deal with multiple optimization problems at the same time and to promote the optimization of each task by exploiting the hidden relationships between these problems. Many evolutionary multitasking algorithms have been explored and applied in various fields, such as feature selection [33], reinforcement learning [34] and sparse regression [22]. In sparse unmixing, a hyperspectral image can be clustered into multiple homogeneous regions according to spatial information, which coincides with the concept of evolutionary multitasking. It is therefore promising to model each homogeneous region as an optimization task, since the decomposition into multiple tasks can effectively mitigate the curse of dimensionality.

3. EMMPSO Framework

The pseudo code of EMMPSO is shown in Algorithm 1. In this section, the proposed framework is introduced in detail in terms of initialization, multipopulation particle swarm optimization and decision making with MOEA.
Algorithm 1 The EMMPSO Framework
 1: %Initialization
 2: Set $t = 0$, $G = \emptyset$.
 3: for $j = 1$ to $K$ do
 4:   for $s = 1$ to $S$ do
 5:     for $i = 1$ to $N/KS$ do
 6:       $\{X_{i,s}^t\}_j = \{(x_1, x_2, \ldots, x_n) \mid x_i \in \{0,1\},\ \|X_{i,s}^t\|_0 = s\}$.
 7:     end for
 8:   end for
 9:   Evaluate the fitness of each particle in task $T_j$.
10:   Assign the skill factor $\tau_i$.
11:   Initialize $\{\mathrm{pbest}_s^t\}_j$ and $\{\mathrm{gbest}_s^t\}_j$.
12:   $\{V_{i,s}^t\}_j = (\{\mathrm{pbest}_s^t\}_j - \{X_{i,s}^t\}_j) + (\{\mathrm{gbest}_s^t\}_j - \{X_{i,s}^t\}_j)$.
13:   $G_j^t = \bigcup_{s=1}^{S} \{\mathrm{gbest}_s^t\}_j$.
14: end for
15: %Evolution
16: while $t < \mathrm{Maxt}$ do
17:   for $j = 1$ to $K$ do
18:     Update $\{X_{i,s}^{t+1}\}_j$ and $\{V_{i,s}^{t+1}\}_j$ based on (9).
19:   end for
20:   Update the particles according to Algorithm 2.
21:   Evaluate the fitness of each particle.
22:   Update $\{\mathrm{pbest}_s^t\}_j$, $\{\mathrm{gbest}_s^t\}_j$ and $G_j^t$ with the Local Exploration Strategy.
23:   $t = t + 1$.
24: end while
25: %Decision Making
26: Obtain the optimal point in each task from $G_j$.

3.1. Initialization and Representation

In sparse unmixing, the spectral library known in advance and a hyperspectral image are input for processing, and the endmember set selected from the library and the corresponding abundance map are output. In the proposed EMMPSO, a hyperspectral image is first clustered into $K$ homogeneous regions, and each homogeneous region is processed as a task [22], as shown in Figure 1. The spectra of the entire spectral library are coded into each particle in order, that is, the length of each particle is equal to the number of spectra in the library. Considering that the sparsity of particles remains unchanged during the evolution in most current discrete particle swarm optimization algorithms, the population in each task is divided into multiple subpopulations according to sparsity, so that there are particles exploring each sparsity level. For the $s$-th subpopulation in the $j$-th task, the position of each particle is initialized as follows:
$\{X_{i,s}^t\}_j = \{(x_1, x_2, \ldots, x_n) \mid x_i \in \{0,1\},\ \|X_{i,s}^t\|_0 = s\}, \qquad (7)$
where each $x_i$ takes the value 0 or 1. If $x_i$ equals 1, the spectrum at the corresponding position in the spectral library is selected, and vice versa.
Then, each particle is evaluated with the reconstruction error $\|\mathbf{Y} - \mathbf{A}_v \mathbf{X}_v\|_F$ in the corresponding task, where $\mathbf{Y}$ is the hyperspectral image, $v$ represents the endmember set encoded by the particle $\{X_{i,s}^t\}_j$, and $\mathbf{A}_v$ and $\mathbf{X}_v$ are the corresponding subset of the spectral library $\mathbf{A}$ and the abundances of the selected endmembers, respectively. After the evaluation is completed, the skill factor $\tau_{i,s}$, defined as the task on which the subpopulation with sparsity $s$ performs best among all tasks, is assigned to each particle. Besides, $\{\mathrm{pbest}_s\}_j$ and $\{\mathrm{gbest}_s\}_j$ for the subpopulation with sparsity $s$ in the $j$-th task can be obtained. The velocity of each particle is initialized as:
$\{V_{i,s}^t\}_j = (\{\mathrm{pbest}_s\}_j - \{X_{i,s}^t\}_j) + (\{\mathrm{gbest}_s\}_j - \{X_{i,s}^t\}_j). \qquad (8)$
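The following sketch illustrates, under our own simplifying assumptions, how a particle with a prescribed sparsity can be initialized and how its reconstruction-error fitness can be evaluated. Here the abundances of the selected endmembers are estimated by plain least squares, whereas the paper may use a constrained solver; the function names are illustrative.

```python
import numpy as np

def init_particle(D, s, rng):
    """Random binary particle of length D (library size) with exactly s active positions."""
    x = np.zeros(D, dtype=int)
    x[rng.choice(D, size=s, replace=False)] = 1
    return x

def fitness(Y_j, A, particle):
    """Reconstruction error ||Y_j - A_v X_v||_F for the endmembers selected by the particle."""
    A_v = A[:, particle.astype(bool)]                  # selected columns of the library
    X_v, *_ = np.linalg.lstsq(A_v, Y_j, rcond=None)    # abundance estimate for region j
    return np.linalg.norm(Y_j - A_v @ X_v, ord="fro")
```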

3.2. Multipopulation Particle Swarm Optimization for Knowledge Transfer

Considering the discreteness of the decision variables in sparse unmixing, the population in each task is divided into multiple subpopulations according to sparsity during initialization. In the particle swarm evolution, the position and velocity of the particles with sparsity $s$ in the $j$-th task are updated as follows:
$\{X_{i,s}^{t+1}\}_j = \{X_{i,s}^t\}_j + \{V_{i,s}^t\}_j, \qquad \{V_{i,s}^{t+1}\}_j = \begin{cases} T(\{V_{i,s}^{t+1}\}_j), & \text{if } \mathrm{any}(\{V_{i,s}^t\}_j) \neq 0, \\ R(\{X_{i,s}^{t+1}\}_j), & \text{otherwise}, \end{cases} \qquad (9)$
where $T$ and $R$ are both selection functions [35].
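Since the exact definitions of $T$ and $R$ come from [35] and are not reproduced here, the sketch below only illustrates the general idea of a sparsity-preserving discrete update; the top-$s$ selection and the random repair are stand-ins for $T$ and $R$, not the operators actually used in the paper.

```python
import numpy as np

def update_particle(x, pbest, gbest, s, rng):
    """One sparsity-preserving discrete PSO step (illustrative stand-in for T and R)."""
    v = (pbest - x) + (gbest - x)            # velocity, as in Eq. (8)
    new_x = np.zeros_like(x)
    if np.any(v != 0):
        score = x + v                        # tentative position, as in Eq. (9)
        new_x[np.argsort(score)[-s:]] = 1    # keep the s highest-scoring positions ("T")
    else:
        idx = rng.choice(x.size, size=s, replace=False)
        new_x[idx] = 1                       # random repair when the velocity is uninformative ("R")
    return new_x
```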
After updating the positions and velocities of all the particles, we design an efficient intra-task and inter-task knowledge transfer to exploit useful information, as shown in Algorithm 2 (a minimal sketch of this transfer is given after the algorithm). Firstly, two particles are randomly selected from the current generation. In the intra-task transfer, the common positions of the particles are the focus of exploitation: $\cap(p_a, p_b)$ denotes the positions where the elements of both $p_a$ and $p_b$ are 1. The new particles directly inherit the positions in $\cap(p_a, p_b)$, and the remaining positions are randomly inherited from the original particles. On the contrary, the inter-task transfer focuses on random exploration: $\cup(p_a, p_b)$ denotes the positions where the element is 1 in $p_a$ or $p_b$. For the new particles $p_a'$ and $p_b'$, $\|p_a\|_0$ and $\|p_b\|_0$ positions are directly selected from $\cup(p_a, p_b)$, respectively. Then $p_a$ and $p_b$ are updated with the particles of better fitness. To illustrate Algorithm 2 more intuitively, Figure 2 shows a simple example of the genetic knowledge transfer. Two particles with sparsity 3 and 4 are first selected from the current generation. In the intra-task transfer, the new particles inherit all the common positions, which correspond to the positions in $p_c$, and the remaining positions are then randomly set to 1 so that the sparsity of each new particle equals that of the original particle. Similar operations are performed in the inter-task transfer, the difference being that the new particles are built by randomly selecting, from the positions set to 1 in the union of the original particles, enough positions to keep the same sparsity.
Algorithm 2 Genetic Knowledge Transfer
Input: $P^t$: the current generation of particles.
 1: for $g = 1$ to $N/2$ do
 2:   Randomly select two particles $p_a$ and $p_b$ in $P^t$.
 3:   if $\tau_a = \tau_b$ then
 4:     %Intra-task Transfer
 5:     $p_c \leftarrow \cap(p_a, p_b)$.
 6:     $p_a' \leftarrow$ Inherit all positions of $p_c$ and randomly set $(\|p_a\|_0 - \|p_c\|_0)$ positions to 1.
 7:     $p_b' \leftarrow$ Inherit all positions of $p_c$ and randomly set $(\|p_b\|_0 - \|p_c\|_0)$ positions to 1.
 8:   else
 9:     %Inter-task Transfer
10:     $p_c \leftarrow \cup(p_a, p_b)$.
11:     $p_a' \leftarrow$ Randomly select $\|p_a\|_0$ positions in $p_c$.
12:     $p_b' \leftarrow$ Randomly select $\|p_b\|_0$ positions in $p_c$.
13:   end if
14:   Evaluate the fitness of $p_a'$ and $p_b'$.
15:   Update $p_a$ and $p_b$.
16:   $g = g + 1$.
17: end for
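A minimal Python sketch of the two transfer operators in Algorithm 2 is given below. The array representation (binary numpy vectors) and the function names are our own assumptions; the random padding of the intra-task children follows the wording of Algorithm 2.

```python
import numpy as np

def intra_task_transfer(pa, pb, rng):
    """Children inherit the common active positions of pa and pb, then random
    positions are added until each child matches the sparsity of its parent."""
    common = pa & pb                                       # p_c: positions set in both parents
    def rebuild(parent):
        child = common.copy()
        deficit = int(parent.sum() - common.sum())
        free = np.flatnonzero(child == 0)
        child[rng.choice(free, size=deficit, replace=False)] = 1
        return child
    return rebuild(pa), rebuild(pb)

def inter_task_transfer(pa, pb, rng):
    """Children are drawn from the union of the parents' active positions."""
    union = np.flatnonzero(pa | pb)                        # p_c: positions set in either parent
    def rebuild(parent):
        child = np.zeros_like(parent)
        child[rng.choice(union, size=int(parent.sum()), replace=False)] = 1
        return child
    return rebuild(pa), rebuild(pb)
```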

3.3. An Efficient Local Exploration Strategy with MOEA

After the optimization of the multipopulation particle swarms, the set of globally optimal particles at all sparsity levels in each task, $G = \bigcup_{j=1}^{K} \bigcup_{s=1}^{S} \{\mathrm{gbest}_s\}_j$, can be obtained. Two conflicting objectives are associated with each particle, that is, the endmember sparsity and the reconstruction error. Therefore, we employ a multiobjective optimization algorithm to facilitate the search for the optimal points in each task. In the evolutionary multitasking multiobjective framework, the optimized function is expressed as follows:
$\{X_1^*, X_2^*, \ldots, X_K^*\} = \arg\min \{F(X_1), F(X_2), \ldots, F(X_K)\}, \quad F(X_j) = \min_{X_j} \left( \|X_j\|_0,\ \|Y_j - A X_j\|_F \right), \qquad (10)$
where $Y_j$ and $X_j$ represent the original image and the inverted abundances in the $j$-th task, respectively.
The local exploration strategy is illustrated in Figure 3. First, the globally optimal particles are transcoded into the initial population of the NSGA-II framework. Roulette-wheel selection, single-point crossover and bitwise mutation operators are employed in the multiobjective evolution. Then, the generated offspring are evaluated to update the Pareto front in each task according to nondominated sorting and crowding distance, and the nondominated solutions are transcoded back into the globally optimal particles.
With the above design, the optimal point in each task can finally be obtained by $X_{jv}^* = \arg\min \|Y_j - A_v X_{jv}\|_F$, which can be solved simply with the least squares method. Finally, the optimal abundance maps obtained from all tasks constitute the final inverted abundance map.
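The decision-making step can be sketched as follows, assuming that the optimal point of each task is simply the Pareto solution with the smallest reconstruction error and that abundances are inverted by unconstrained least squares; the actual selection rule may differ.

```python
import numpy as np

def decide_optimal(Y_j, A, pareto_particles):
    """Pick, among the Pareto-optimal particles of task j, the one with the
    smallest reconstruction error, and invert its abundances by least squares."""
    best = (np.inf, None, None)
    for p in pareto_particles:
        v = p.astype(bool)                                    # selected endmember set
        X_jv, *_ = np.linalg.lstsq(A[:, v], Y_j, rcond=None)  # abundance inversion
        err = np.linalg.norm(Y_j - A[:, v] @ X_jv, ord="fro")
        if err < best[0]:
            best = (err, v, X_jv)
    return best[1], best[2]                                   # endmember set and abundances
```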

4. Experimental Results

4.1. Data Sets

Data 1, provided by Iordache et al. [36], is an image containing 100 × 100 pixels with 224 bands per pixel; the related abundance maps of its nine endmembers are shown in Figure 4. It contains nine randomly selected signatures from a sublibrary of 230 spectral signals, and the fractional abundances are piecewise smooth. Data 2, provided by Tang et al. [37], is an image containing 64 × 64 pixels with 224 bands per pixel; the related abundance maps of its five endmembers are shown in Figure 5. It includes five endmembers from a sublibrary of 498 spectral signals, and the fractional abundances are also homogeneous. These two benchmark datasets were tested at different levels of white noise, that is, SNR = 20, 30 and 40 dB. The number of tasks was set to three on both datasets, as recommended in [22]. To maintain the fairness of the experiments, all experimental results are averaged over 20 independent runs, consistent with the compared methods.
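For reproducibility, the sketch below shows one common way to add white Gaussian noise at a prescribed SNR. The exact noise-generation procedure used in the original experiments is not specified, so this is only an assumption.

```python
import numpy as np

def add_white_noise(Y, snr_db, rng):
    """Add white Gaussian noise so that the resulting SNR (in dB) equals snr_db."""
    signal_power = np.mean(Y ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    return Y + rng.normal(0.0, np.sqrt(noise_power), size=Y.shape)
```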

4.2. Performance Analysis of EMMPSO

In this section, ablation experiments were performed to demonstrate the effectiveness of the knowledge transfer and the local exploration strategy. The hypervolume indicator was used to compare the evolution and convergence of EMMPSO and EMMPSO without transfer. The hypervolume was calculated using a reference point 1% larger in every component than the corresponding nadir point [38]. As an important indicator of the quality of the Pareto-optimal front (PF), the larger the hypervolume value, or the faster the convergence of the hypervolume, the better the PF obtained by the algorithm. The evolution of the hypervolume indicator is shown in Figure 6. It is clear that, after a few iterations, our method obtains higher hypervolume values with the help of the intra-task and inter-task transfer strategy. When several related tasks are optimized simultaneously under the evolutionary multitasking framework, the convergence rate improves significantly.
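For a two-objective minimization front, the hypervolume with the reference point defined above can be computed with a simple sweep, as sketched below (assuming a nondominated front and positive objective values, so that scaling the nadir point by 1.01 moves the reference point outward).

```python
import numpy as np

def hypervolume_2d(front, nadir):
    """Hypervolume of a two-objective minimization front (list of (f1, f2) pairs).

    The reference point is 1% beyond the nadir point in every component,
    which assumes positive objective values, as is the case here.
    """
    ref = np.asarray(nadir) * 1.01
    pts = sorted(map(tuple, front))           # sort by the first objective (f2 then decreases)
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)  # slab dominated only up to this point
        prev_f2 = f2
    return hv
```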
Secondly, to test the efficiency of the local exploration, the performance of EMMPSO and EMMPSO without local exploration was compared. The signal-to-reconstruction error (SRE) is commonly used to measure the quality of the reconstruction of a signal. Table 1 shows the SRE (dB) at different noise levels of the proposed method and of EMMPSO without local exploration on the simulated data. It can be observed that our method achieves higher SRE (dB) values than EMMPSO without local exploration on both simulated datasets, which shows that the local exploration is useful for facilitating the search for the optimal points.
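SRE is commonly defined as the ratio between the signal power and the reconstruction-error power, expressed in dB; a short implementation under this standard definition is:

```python
import numpy as np

def sre_db(X_true, X_est):
    """Signal-to-reconstruction error in dB; higher values indicate better abundance estimates."""
    err = np.linalg.norm(X_true - X_est) ** 2
    return 10 * np.log10(np.linalg.norm(X_true) ** 2 / err)
```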

4.3. Comparison with State-of-the-Art Algorithms

To demonstrate the superiority of the proposed algorithm, EMMPSO is compared with state-of-the-art algorithms, including SUnSAL, CLSUnSAL, two-phase multiobjective sparse unmixing (Tp-MOSU) and evolutionary multitasking sparse reconstruction (MTSR). Among them, SUnSAL and CLSUnSAL are traditional pixel-based and matrix-based processing algorithms, while Tp-MOSU and MTSR are based on multiobjective optimization and multitasking optimization, respectively. Figure 7 and Figure 8 depict the estimated abundance maps for endmembers 2, 5 and 8 on data 1 and endmembers 1, 3 and 5 on data 2, respectively. The rightmost column shows the abundance maps of the real endmembers. The closer the inverted abundance map is to the real abundance map, the better the unmixing performance of the corresponding algorithm. It can be seen that Tp-MOSU, MTSR and EMMPSO exhibit better performance than the other two methods in terms of similarity to the original abundance maps. Although the abundance maps obtained by Tp-MOSU, MTSR and EMMPSO are similar, the abundance maps of EMMPSO contain much less noise. Table 2 shows the SRE (dB) results obtained by the five methods on data 1 and data 2. At all noise levels, the proposed EMMPSO achieves the highest SRE (dB) values on both simulated datasets. The experimental results on the two datasets prove that the proposed EMMPSO achieves competitive performance through evolutionary multitasking and the local exploration strategy.

5. Conclusions

In this paper, we propose a novel evolutionary multitasking multipopulation particle swarm optimization framework called EMMPSO to solve the sparse unmixing problem. By processing multiple homogeneous regions of a hyperspectral image simultaneously, the convergence of the evolution is accelerated. A local exploration strategy with MOEA is also designed to obtain the optimal solution. As a case study, the proposed EMMPSO is compared with some state-of-the-art methods on benchmark simulated datasets, and the results demonstrate the superiority of EMMPSO.
In future work, we will focus on reducing the time complexity of EMMPSO, and design an efficient multiobjective particle swarm optimization paradigm for the sparse unmixing problem.

Author Contributions

Conceptualization, D.F. and M.Z.; methodology, D.F.; validation, M.Z.; investigation, D.F.; writing—original draft preparation, D.F.; writing—review and editing, D.F., M.Z. and S.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 61906147 and Grant 61806153, the Fundamental Research Funds for the Central Universities (Grant no. XJS200216) and China Post-Doctoral Science Foundation (Grant no. 2021T140528).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SUnSAL      sparse unmixing via variable splitting and augmented Lagrangian
CLSUnSAL    collaborative SUnSAL
MOEAs       multiobjective evolutionary algorithms
MOSU        multiobjective sparse unmixing
MOEA/D      multiobjective evolutionary algorithm based on decomposition
PSO         particle swarm optimization
EMMPSO      evolutionary multitasking multipopulation particle swarm optimization
Tp-MOSU     two-phase multiobjective sparse unmixing
MTSR        multitasking sparse reconstruction
SRE         signal-to-reconstruction error

References

  1. Lv, Z.Y.; Liu, T.F.; Zhang, P.; Benediktsson, J.A.; Lei, T.; Zhang, X. Novel Adaptive Histogram Trend Similarity Approach for Land Cover Change Detection by Using Bitemporal Very-High-Resolution Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2019, 57, 9554–9574.
  2. Maniatis, D.; Dionisio, D.; Guarnieri, L.; Marchi, G.; Mollicone, D.; Morales, C.; Sanchez-Paus Díaz, A. Toward a More Representative Monitoring of Land-Use and Land-Cover Dynamics: The Use of a Sample-Based Assessment through Augmented Visual Interpretation Using Open Foris Collect Earth. Remote Sens. 2021, 13, 4197.
  3. Wu, Y.; Ma, W.P.; Gong, M.G.; Su, L.Z.; Jiao, L.C. A novel point-matching algorithm based on fast sample consensus for image registration. IEEE Geosci. Remote Sens. Lett. 2014, 12, 43–47.
  4. Di Biase, V.; Hanssen, R.F. Environmental Strain on Beach Environments Retrieved and Monitored by Spaceborne Synthetic Aperture Radar. Remote Sens. 2021, 13, 4208.
  5. Jackisch, R.; Lorenz, S.; Zimmermann, R.; Möckel, R.; Gloaguen, R. Drone-Borne Hyperspectral Monitoring of Acid Mine Drainage: An Example from the Sokolov Lignite District. Remote Sens. 2018, 10, 385.
  6. Wu, Y.; Xiao, Z.; Liu, S.; Miao, Q.; Ma, W.; Gong, M.; Xie, F.; Zhang, Y. A Two-Step Method for Remote Sensing Images Registration Based on Local and Global Constraints. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 5194–5206.
  7. Guo, Y.; Du, L.; Lyu, G. SAR Target Detection Based on Domain Adaptive Faster R-CNN with Small Training Data Size. Remote Sens. 2021, 13, 4202.
  8. Li, H.; Li, J.; Zhao, Y.; Gong, M.; Zhang, Y.; Liu, T. Cost-Sensitive Self-Paced Learning with Adaptive Regularization for Classification of Image Time Series. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 11713–11727.
  9. Zhang, J.; Zhang, X.; Jiao, L. Sparse Nonnegative Matrix Factorization for Hyperspectral Unmixing Based on Endmember Independence and Spatial Weighted Abundance. Remote Sens. 2021, 13, 2348.
  10. Lv, Z.; Liu, T.; Wan, Y.; Benediktsson, J.A.; Zhang, X. Post-Processing Approach for Refining Raw Land Cover Change Detection of Very High-Resolution Remote Sensing Images. Remote Sens. 2018, 10, 472.
  11. Feng, R.; Wang, L.; Zhong, Y. Joint Local Block Grouping with Noise-Adjusted Principal Component Analysis for Hyperspectral Remote-Sensing Imagery Sparse Unmixing. Remote Sens. 2019, 11, 1223.
  12. Wang, Z.; Wei, J.; Li, J.; Li, P.; Xie, F. Evolutionary Multiobjective Optimization with Endmember Priori Strategy for Large-Scale Hyperspectral Sparse Unmixing. Electronics 2021, 10, 2079.
  13. Qi, L.; Li, J.; Wang, Y.; Gao, X. Region-Based Multiview Sparse Hyperspectral Unmixing Incorporating Spectral Library. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1140–1144.
  14. Bioucas-Dias, J.M.; Figueiredo, M.A. Alternating direction algorithms for constrained sparse regression: Application to hyperspectral unmixing. In Proceedings of the 2nd Workshop Hyperspectral Image Signal Process, Reykjavik, Iceland, 14–16 June 2010; pp. 1–4.
  15. Iordache, M.D.; Bioucas-Dias, J.M.; Plaza, A. Collaborative sparse regression for hyperspectral unmixing. IEEE Trans. Geosci. Remote Sens. 2013, 52, 341–354.
  16. Wu, Y.; Li, J.H.; Yuan, Y.Z.; Qin, A.K.; Miao, Q.G.; Gong, M.G. Commonality Autoencoder: Learning Common Features for Change Detection From Heterogeneous Images. IEEE Trans. Neural Netw. Learn. Syst. 2021.
  17. Orosz, T.; Rassõlkin, A.; Kallaste, A.; Arsénio, P.; Pánek, D.; Kaska, J.; Karban, P. Robust design optimization and emerging technologies for electrical machines: Challenges and open problems. Appl. Sci. 2020, 10, 6653.
  18. Gong, M.; Li, H.; Luo, E.; Liu, J.; Liu, J. A multiobjective cooperative coevolutionary algorithm for hyperspectral sparse unmixing. IEEE Trans. Evol. Comput. 2016, 21, 234–248.
  19. Xu, X.; Shi, Z.; Pan, B. l0-based sparse hyperspectral unmixing using spectral information and a multi-objectives formulation. ISPRS J. Photogramm. Remote Sens. 2018, 141, 46–58.
  20. Xu, X.; Shi, Z.; Pan, B.; Li, X. A classification-based model for multi-objective hyperspectral sparse unmixing. IEEE Trans. Geosci. Remote Sens. 2019, 57, 9612–9625.
  21. Li, J.; Li, H.; Liu, Y.; Gong, M. Multi-fidelity evolutionary multitasking optimization for hyperspectral endmember extraction. Appl. Soft Comput. 2021, 111, 107713.
  22. Li, H.; Ong, Y.S.; Gong, M.; Wang, Z. Evolutionary Multitasking Sparse Reconstruction: Framework and Case Study. IEEE Trans. Evol. Comput. 2019, 23, 733–747.
  23. Li, J.; Du, Q.; Li, Y. Region-based collaborative sparse unmixing of hyperspectral imagery. Proc. Remotely Sens. Data Compression Commun. Process. 2016, 9874, 127–132.
  24. Martin, G.; Plaza, A. Region-Based Spatial Preprocessing for Endmember Extraction and Spectral Unmixing. IEEE Geosci. Remote Sens. Lett. 2011, 8, 745–749.
  25. Gupta, A.; Ong, Y.S.; Feng, L.; Tan, K.C. Multiobjective Multifactorial Optimization in Evolutionary Multitasking. IEEE Trans. Evol. Comput. 2017, 47, 1652–1665.
  26. Zhang, B.; Sun, X.; Gao, L.; Yang, L. Endmember Extraction of Hyperspectral Remote Sensing Images Based on the Discrete Particle Swarm Optimization Algorithm. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4173–4176.
  27. Liu, R.; Zhang, L.; Du, B. A Novel Endmember Extraction Method for Hyperspectral Imagery Based on Quantum-Behaved Particle Swarm Optimization. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1610–1631.
  28. Xu, M.; Zhang, L.; Du, B.; Zhang, L.; Fan, Y.; Song, D. A Mutation Operator Accelerated Quantum-Behaved Particle Swarm Optimization Algorithm for Hyperspectral Endmember Extraction. Remote Sens. 2017, 9, 197.
  29. Jiang, X.; Gong, M.; Li, H.; Zhang, M.; Li, J. A two-phase multiobjective sparse unmixing approach for hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2017, 56, 508–523.
  30. Jiang, X.; Gong, M.; Zhan, T.; Sheng, K.; Xu, M. Efficient Two-Phase Multiobjective Sparse Unmixing Approach for Hyperspectral Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 2418–2431.
  31. Pan, B.; Shi, Z.; Xu, X. Multiobjective-Based Sparse Representation Classifier for Hyperspectral Imagery Using Limited Samples. IEEE Trans. Geosci. Remote Sens. 2019, 57, 239–249.
  32. Tuysuzoglu, G.; Birant, D.; Pala, A. Majority voting based multi-task clustering of air quality monitoring network in Turkey. Appl. Sci. 2019, 9, 1610.
  33. Zhang, N.; Gupta, A.; Chen, Z.; Ong, Y.S. Evolutionary Machine Learning with Minions: A Case Study in Feature Selection. IEEE Trans. Evol. Comput. 2021.
  34. Martinez, A.D.; Del Ser, J.; Osaba, E.; Herrera, F. Adaptive Multi-factorial Evolutionary Optimization for Multi-task Reinforcement Learning. IEEE Trans. Evol. Comput. 2021.
  35. Tong, L.; Du, B.; Liu, R.; Zhang, L. An Improved Multiobjective Discrete Particle Swarm Optimization for Hyperspectral Endmember Extraction. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7872–7882.
  36. Iordache, M.D.; Bioucas-Dias, J.M.; Plaza, A. Total variation spatial regularization for sparse hyperspectral unmixing. IEEE Trans. Geosci. Remote Sens. 2012, 50, 4484–4502.
  37. Tang, W.; Shi, Z.; Wu, Y.; Zhang, C. Sparse unmixing of hyperspectral data using spectral a priori information. IEEE Trans. Geosci. Remote Sens. 2014, 53, 770–783.
  38. Seada, H.; Deb, K. A unified evolutionary optimization procedure for single, multiple, and many objectives. IEEE Trans. Evol. Comput. 2015, 20, 358–369.
Figure 1. The evolutionary multitasking optimization framework for hyperspectral sparse unmixing.
Figure 2. An example of the knowledge transfer.
Figure 3. The illustration of the Local Exploration Strategy with MOEA.
Figure 4. True abundance maps of the nine endmembers in data 1.
Figure 5. True abundance maps of the five endmembers in data 2.
Figure 6. Comparison of the hypervolume indicator for EMMPSO and EMMPSO without transfer. (a) task 1 on data 1, (b) task 2 on data 1, (c) task 3 on data 1, (d) task 1 on data 2, (e) task 2 on data 2, (f) task 3 on data 2.
Figure 7. The fractional abundance maps of endmembers 2, 5, 8 by SUnSAL, CLSUnSAL, Tp-MOSU, MTSR and EMMPSO on data 1.
Figure 8. The fractional abundance maps of endmembers 1, 3, 5 by SUnSAL, CLSUnSAL, Tp-MOSU, MTSR and EMMPSO on data 2.
Table 1. Comparison of EMMPSO and EMMPSO without Local Exploration (LE) on data 1 and data 2, in terms of SRE (dB).

Data 1                 SNR = 20 dB    SNR = 30 dB    SNR = 40 dB
EMMPSO without LE      7.9435         13.3654        22.7536
EMMPSO                 8.2783         15.8039        25.2174

Data 2                 SNR = 20 dB    SNR = 30 dB    SNR = 40 dB
EMMPSO without LE      10.7025        14.7089        17.0224
EMMPSO                 12.3572        20.5891        25.7204
Table 2. Comparison of EMMPSO and other methods on data 1 and data 2, in terms of SRE (dB).

Data 1                 SNR = 20 dB    SNR = 30 dB    SNR = 40 dB
SUnSAL                 4.5568         8.5833         12.9890
CLSUnSAL               5.5164         11.4842        18.7935
Tp-MOSU                8.4083         14.4070        22.5478
MTSR                   7.0496         13.7802        22.7329
EMMPSO                 8.5783         15.2039        24.2174

Data 2                 SNR = 20 dB    SNR = 30 dB    SNR = 40 dB
SUnSAL                 3.5823         8.0323         12.9896
CLSUnSAL               8.2382         13.0988        14.3502
Tp-MOSU                11.3578        15.7132        18.0457
MTSR                   10.7254        14.6143        17.6775
EMMPSO                 12.3572        20.5891        25.7204