
OTSU Multi-Threshold Image Segmentation Based on Improved Particle Swarm Algorithm

1 School of Mechanical Engineering and Rail Transit, Changzhou University, Changzhou 213164, China
2 Jiangsu Province Engineering Research Center of High-Level Energy and Power Equipment, Changzhou University, Changzhou 213164, China
3 Key Laboratory of Noise and Vibration, Institute of Acoustics, Chinese Academy of Sciences, Beijing 100190, China
* Author to whom correspondence should be addressed.
Submission received: 21 September 2022 / Revised: 7 November 2022 / Accepted: 9 November 2022 / Published: 13 November 2022
(This article belongs to the Special Issue Scale Space and Variational Methods in Computer Vision)

Abstract: To address the slow convergence of the traditional particle swarm optimization algorithm and its tendency to fall into local optima, this paper proposes OTSU multi-threshold image segmentation based on an improved particle swarm optimization (IPSO) algorithm. After the swarm completes each iterative update of velocity and position, a particle contribution degree is calculated to estimate the approximate position and direction of the optimum, which narrows the particle search range. At the same time, an asynchronously monotonically increasing social learning factor and an asynchronously monotonically decreasing individual learning factor are used to balance global and local search. Finally, chaos optimization is introduced to increase population diversity, yielding OTSU multi-threshold image segmentation based on IPSO. Twelve benchmark functions were selected to test the performance of the algorithm against traditional meta-heuristic algorithms; the results show the robustness and superiority of the proposed method. Multi-threshold segmentation experiments on standard dataset images compare computational efficiency, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), feature similarity (FSIM), and fitness value. The results show that the proposed method generally runs about 30% faster than the other algorithms while also achieving better accuracy. Experiments confirm that the proposed algorithm achieves higher segmentation accuracy and efficiency.

1. Introduction

Image segmentation is widely used as the basis of computer vision. It refers to describing an image as a collection of connected regions such that image features differ between regions. At present, image segmentation methods mainly include the threshold method, the edge detection method, the region method, the morphological watershed method, and so on [1]. The threshold method is at the core of image segmentation applications because of its simple implementation and fast calculation speed [2]. Thresholding takes two forms: bi-level thresholding (BT) and multilevel thresholding (MT). Bi-level thresholding uses a single threshold to divide the image into two classes, while multilevel thresholding uses multiple thresholds to divide the image into more than two uniform segments [3]. The threshold can be determined by Kapur entropy [4], Tsallis entropy [5], fuzzy entropy [6], or OTSU variance [7]. These methods use the information in the histogram and require no ground truth to classify pixels. However, threshold selection is time-consuming: as the number of thresholds increases, the running time grows greatly, which has become a problem that must be solved in future research.
Recently, Houssein et al. proposed an improved equilibrium optimizer to address the imbalance between the exploration and exploitation stages of the equilibrium optimizer; it resolves the tendency of the optimization to fall into local optima, and the algorithm shows excellent performance [8]. Sharma et al. improved the original butterfly optimization algorithm by combining it with the mutualism and parasitism stages of the symbiotic organisms search algorithm. Using Kapur entropy as the fitness function, they selected a group of benchmark images to find the optimal thresholds; the results show that the improved butterfly algorithm is superior to other algorithms on all evaluation indicators [9]. In addition, Elaziz et al., in view of the shortcomings of the Harris Hawks Optimizer, proposed an improved version of the HHO algorithm, which addressed the poor search ability of the traditional algorithm and its tendency to fall into local optima [10]. Zhang et al. proposed an improved PSO algorithm to solve the premature convergence problem of traditional PSO and effectively realize adaptive image segmentation [11]. Zhao et al. proposed a cross-strategy-based ant colony algorithm (CCACO) to solve the continuity problem of the ant colony algorithm, using Kapur entropy as the objective function for image segmentation. Experiments show that the proposed CCACO achieved excellent segmentation results at both low and high threshold numbers [12].
Of course, in image segmentation it is far from enough to rely on thresholding alone. Because of the low processing speed and limited accuracy of exhaustive threshold search, optimization algorithms are commonly used to optimize the segmentation process. Among them, meta-heuristic optimization algorithms are widely used because of their low cost, high accuracy, and fast speed. So far, various optimization algorithms have been introduced to deal with nonlinear and practical applications, such as the genetic algorithm (GA) [13], particle swarm optimization (PSO) [14], the whale optimization algorithm (WOA) [15], the butterfly optimization algorithm (BOA) [9], the sine cosine algorithm (SCA) [16], the crow search algorithm (CSA) [17], the gray wolf optimizer (GWO) [18], and the artificial bee colony algorithm (ABC) [19]. In recent years, Raj et al. applied the whale optimization algorithm (WOA) to optimize TCSC and SVC reactive power planning against transmission loss and high operating costs; the results show that this method needs fewer iterations, does not fall into local minima, and has good convergence characteristics [15]. Shiva and Gudadappanavar proposed an oppositional crow search algorithm for transmission loss, which performs better in reducing active power loss and system operating cost [16]. To address insufficient reactive power and unstable voltage in transmission lines, Babu and Kumar proposed an improved sine cosine optimization approach that uses the sine cosine algorithm (SCA) and the quasi-oppositional sine cosine algorithm (QOSCA) to minimize transmission losses and operating costs [17]. Xu et al. proposed an improved hunger games search algorithm (IHGS) for solar photovoltaic system parameter identification, which improved the stability of the algorithm when solving for the global optimum.
The results demonstrate the feasibility and effectiveness of the improved HGS algorithm [20]. In addition, Trojovsky et al. proposed a new population-based optimization algorithm, the pelican optimization algorithm (POA), which has strong exploitation and search ability [21]. Shabani and Asgarian put forward the search and rescue optimization algorithm (SAR) for single-objective continuous optimization and demonstrated its feasibility through experiments [22]. Oliva et al. put forward a new solution to the problem that traditional algorithms use small populations and easily fall into local optima: they used chaos mapping and opposition-based learning to initialize the solutions of a given problem and improved population diversity by repeatedly perturbing the initial population positions with an interference operator. The experimental results show that the proposed method is highly efficient in finding optimal solutions [23].
In order to improve the performance of PSO on image threshold segmentation, a new OTSU multi-threshold segmentation method based on improved PSO is proposed. The threshold selection process was optimized and compared with many classical PSO methods. The results show that our algorithm reduces the running time and improves the accuracy of threshold segmentation. The main contributions of this paper can be summarized as follows:
(1)
An improved PSO algorithm is proposed, in which (a) chaos optimization is added to reduce premature convergence; (b) an elite particle search strategy is introduced to reduce optimization time and improve efficiency; and (c) the learning factors are improved to balance local and global search.
(2)
Combining PSO with the OTSU algorithm, a gray-scale image segmentation algorithm based on improved particle swarm optimization is proposed. The proposed segmentation algorithm can search for a more accurate threshold combination, thus enabling better partitioning of gray-scale image components.
(3)
Some classical test functions are selected to verify the robustness and exploitation ability of the algorithm on unimodal, multimodal, and fixed-dimension multimodal functions.
(4)
The performance of the improved PSO segmentation algorithm was verified against the multi-threshold segmentation of several standard algorithms, and its effectiveness was verified through multi-threshold image segmentation experiments on PASCAL 2012 dataset images. Experiments show that the proposed method is faster than other meta-heuristic algorithms in OTSU threshold segmentation, and the PSNR, SSIM, and FSIM performance indicators verify that it achieves higher segmentation accuracy.
The rest of this paper is organized as follows: Section 2 summarizes the multi-threshold OTSU segmentation model. Section 3 describes the PSO algorithm. Section 4 proposes a multi-threshold segmentation algorithm based on the Improved PSO algorithm. Section 5 describes the experimental results of the segmentation method based on the improved particle swarm optimization. Section 6 presents conclusions and future work.

2. Multi-Threshold OTSU Segmentation Model

The OTSU image segmentation method is the maximum between-class variance method, which aims to determine the optimal threshold for image segmentation. In OTSU, the image is divided into foreground and background by a threshold, and the threshold that maximizes the between-class variance of the foreground and background regions is taken as the best segmentation threshold. Assume that the gray-scale range is $\{0, 1, 2, \ldots, L-1\}$, representing $L$ different gray levels in a digital image of size $M \times N$ pixels. Let $B(i)$ be the proportion of gray level $i$ in the entire image, calculated as

$$B(i) = a(i) / (M \times N),$$

where $a(i)$ is the number of pixels with gray level $i$. Let the threshold $t$ divide the image into two parts, foreground and background:

$$h_0 = \sum_{i=0}^{t} B(i), \qquad h_1 = \sum_{i=t+1}^{L-1} B(i)$$

$$\rho_0 = \sum_{i=0}^{t} i \cdot B(i) / h_0, \qquad \rho_1 = \sum_{i=t+1}^{L-1} i \cdot B(i) / h_1$$

where $\rho_T$ is the average gray level of the image, which can be expressed as:

$$\rho_T = \sum_{i=0}^{L-1} i \cdot B(i)$$

where $h_0$ and $h_1$ are the proportions of the foreground and background in the image, respectively, and $\rho_0$ and $\rho_1$ are the average gray levels of the foreground and background, respectively. $\sigma_B^2$ is the between-class variance, which can be expressed as:

$$\sigma_B^2 = h_0 (\rho_0 - \rho_T)^2 + h_1 (\rho_1 - \rho_T)^2$$

When the between-class variance $\sigma_B^2$ reaches its maximum, $t$ is the optimal threshold. In the multi-threshold case, a threshold set $\{t_1, t_2, \ldots, t_{K-1}\}$ is sought as the optimal combination of thresholds. The optimization problem can be expressed as:

$$T(t_1, t_2, \ldots, t_{K-1}) = \arg\max \sigma_B^2(t_1, t_2, \ldots, t_{K-1})$$
The OTSU multi-threshold method uses exhaustive search to solve for the optimal threshold combination, with a total computational cost of $O(L^K)$. The computational cost therefore increases exponentially with the number of thresholds. In order to improve the efficiency of image segmentation, this paper proposes an improved particle swarm optimization algorithm to find the optimal threshold combination and then perform multi-threshold image segmentation.
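As a concrete illustration of the model above, the between-class variance for an arbitrary threshold set can be evaluated directly from the gray-level histogram. The following is a minimal NumPy sketch; the function name and the 256-bin histogram input are our own choices, not from the paper:

```python
import numpy as np

def otsu_multithreshold_variance(hist, thresholds):
    """Between-class variance sigma_B^2 for a set of thresholds.

    hist: length-256 array of pixel counts per gray level.
    thresholds: gray-level cut points t_1 < ... < t_{K-1}.
    Class k covers levels (t_{k-1}, t_k]; the fitness to maximize
    is sum_k h_k * (rho_k - rho_T)^2, generalizing Eq. for sigma_B^2.
    """
    B = hist / hist.sum()                  # B(i): proportion of level i
    levels = np.arange(len(B))
    rho_T = np.sum(levels * B)             # global mean gray level
    edges = [-1] + sorted(thresholds) + [len(B) - 1]
    sigma2 = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (levels > lo) & (levels <= hi)
        h = B[mask].sum()                  # class proportion h_k
        if h > 0:
            rho = np.sum(levels[mask] * B[mask]) / h   # class mean rho_k
            sigma2 += h * (rho - rho_T) ** 2
    return sigma2
```

For a bimodal histogram, a threshold placed between the two modes yields a larger variance than one placed inside a single mode, which is exactly the property the optimizer exploits.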

3. Particle Swarm Algorithm

Particle swarm optimization (PSO) is a popular intelligent optimization algorithm [24] that originated from research on the predatory behavior of bird flocks. The PSO algorithm is similar to the genetic algorithm in that candidate solutions of the objective function are initialized randomly and optimized by iteration, but without chromosome crossover and mutation. Each particle is abstracted as a bird in the search space and represents a solution of the optimization problem. A particle has two attributes, velocity and position: the former represents the speed of movement and the latter the direction of movement. Substituting a particle's position into the fitness function to be optimized yields its fitness value. During optimization, each particle determines its next move through its own flight experience (individual extremum) and the group's experience (global extremum).
In a $D$-dimensional search space, the particle population size $m$, particle positions $x_i = (x_{i1}, x_{i2}, \ldots, x_{iD})$, and velocities $v_i = (v_{i1}, v_{i2}, \ldots, v_{iD})$ are initialized, where $D = 1$, $m = 50$, the position range is $[0, 255]$, and the velocity range is $[-10, 10]$. Particle $i$ has an individual extremum $p_i = (p_{i1}, p_{i2}, \ldots, p_{id}, \ldots, p_{iD})$, and the global extremum is $p_g = (p_{g1}, p_{g2}, \ldots, p_{gd}, \ldots, p_{gD})$; $p_i$ is the best position found so far by particle $i$, and $p_g$ is the best position found so far by the whole swarm. In each iteration, particles update their velocities and positions according to the following equations:

$$v_{id}^{k+1} = \omega v_{id}^{k} + c_1 r_1 (p_{id} - x_{id}^{k}) + c_2 r_2 (p_{gd} - x_{id}^{k})$$

$$x_{id}^{k+1} = x_{id}^{k} + v_{id}^{k+1}$$

where $i = 1, 2, \ldots, m$, $d = 1, 2, \ldots, D$, $k$ is the iteration number, and $r_1$ and $r_2$ are random numbers in $[0, 1]$; these two parameters are used to maintain the diversity of the population. Furthermore, $c_1$ and $c_2$ are learning factors, here $c_1 = c_2 = 2$. The inertia weight $\omega$ reflects the ability of particles to inherit their previous velocity and is decreased linearly as follows:

$$\omega = \omega_{max} - \frac{\omega_{max} - \omega_{min}}{k_{max}} \times k$$

where $\omega_{max} = 0.9$, $\omega_{min} = 0.4$, $k$ is the current iteration number, and $k_{max} = 100$ is the total number of iterations.
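The update rules above can be sketched as a single vectorized step. The following is an illustrative NumPy sketch using the parameter values stated in the text ($m = 50$, $D = 1$, positions in [0, 255], velocities in [−10, 10], $c_1 = c_2 = 2$); the function name pso_step and the velocity/position clipping are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters from the text: m = 50 particles, D = 1 dimension,
# x in [0, 255], v in [-10, 10], c1 = c2 = 2,
# inertia weight decaying linearly from 0.9 to 0.4 over 100 iterations.
m, D, k_max = 50, 1, 100
w_max, w_min, c1, c2 = 0.9, 0.4, 2.0, 2.0

x = rng.uniform(0, 255, (m, D))    # positions
v = rng.uniform(-10, 10, (m, D))   # velocities
p_i = x.copy()                     # individual best positions
p_g = x[0].copy()                  # global best position (placeholder)

def pso_step(x, v, p_i, p_g, k):
    """One PSO iteration: inertia-weight decay + velocity/position update."""
    w = w_max - (w_max - w_min) / k_max * k                     # linear decay
    r1, r2 = rng.random((m, D)), rng.random((m, D))
    v_new = w * v + c1 * r1 * (p_i - x) + c2 * r2 * (p_g - x)   # velocity update
    v_new = np.clip(v_new, -10, 10)                             # keep v in range
    x_new = np.clip(x + v_new, 0, 255)                          # position update
    return x_new, v_new
```

In a full optimizer, each step would be followed by fitness evaluation and updates of p_i and p_g before the next call.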

4. Multi-Threshold Segmentation Algorithm Based on Improved PSO

The classical PSO algorithm is an effective method for the optimization problem of threshold segmentation. However, due to its mechanism, when used in image processing the large number of calculations lengthens the processing time, and premature convergence causes it to fall into local optima, reducing overall efficiency and affecting the accuracy of threshold segmentation. This paper proposes an improved particle swarm optimization algorithm to optimize OTSU multi-threshold image segmentation. The speed and accuracy of multi-threshold segmentation are improved by optimizing the basic parameters of the classical particle swarm algorithm and by proposing an elite particle search strategy on top of it.

4.1. Linear Optimization Learning Factor

To address the low segmentation accuracy of the classical particle swarm algorithm, a linear function is introduced to update the particles' learning factors during parameter optimization; changes in the learning factors $c_1$ and $c_2$ govern how individual particles and the swarm, respectively, approach the optimal solution. The search proceeds from global search to local search, and the focus of the particle search differs at different stages. To account for this, this paper uses an asynchronously linearly decreasing and an asynchronously linearly increasing learning factor to balance global and local search, so that the learning factors change over the course of the iterations. According to the characteristics of the PSO algorithm, early iterations rely more on the movement experience of individual particles for global search, while later iterations rely mainly on the movement experience of the group for local search. Taking a larger value of $c_1$ and a smaller value of $c_2$ in the early iterations enhances the global search ability, while a smaller $c_1$ and larger $c_2$ in the later stages enhance the local search ability. In order to better exercise the global and local search abilities at different stages and improve segmentation accuracy, we use asynchronously changing learning factors. The formulas are:
$$c_1(k) = c_{min} + (c_{max} - c_{min})(1 - k / k_{max})$$

$$c_2(k) = c_{min} + (c_{max} - c_{min}) \, k / k_{max}$$

where $c_{max}$ and $c_{min}$ are the maximum and minimum values of the learning factors, $c_{max} = 2.5$, $c_{min} = 0.5$; $k$ and $k_{max}$ are the current and maximum numbers of iterations, respectively. The improved velocity update formula is:

$$v_{id}^{k+1} = \omega v_{id}^{k} + c_1(k) r_1 (p_{id} - x_{id}^{k}) + c_2(k) r_2 (p_{gd} - x_{id}^{k})$$
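The asynchronous factors are straightforward to compute. The following sketch (the function name is our own) shows how the individual factor $c_1$ falls from $c_{max}$ to $c_{min}$ while the social factor $c_2$ rises symmetrically over the iterations:

```python
def learning_factors(k, k_max, c_min=0.5, c_max=2.5):
    """Asynchronous learning factors: c1 decreases and c2 increases with k."""
    c1 = c_min + (c_max - c_min) * (1 - k / k_max)   # individual factor
    c2 = c_min + (c_max - c_min) * (k / k_max)       # social factor
    return c1, c2
```

At k = 0 this gives (c1, c2) = (2.5, 0.5), favoring global search; at k = k_max it gives (0.5, 2.5), favoring local search around the swarm's best position.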

4.2. Elite Particle Search Strategy

In view of the low efficiency and slow speed of classical particle swarm optimization in multi-threshold segmentation, this paper improves how the particle velocities and positions are updated and proposes an elite particle search strategy, which consists of three steps. First, the threshold segmentation contribution of each particle is calculated; second, the top 20% of particles are selected as elite particles according to this contribution; finally, the optimal solution of the current iteration is found through chaos optimization.
In this paper, the top 20% of particles are selected as elite particles according to their contribution. The contribution of each particle is obtained by comparing its position with that of the global extremum in the current iteration: we calculate the distance between the location of each particle and the location of the global extremum particle. The closer a particle is to the optimal threshold, the greater its contribution to the optimization. Assuming that the global extremum particle is at $(e_{gbest}, s_{gbest})$ in each iteration, the distance between a single particle and the global extremum particle, and the threshold segmentation contribution of that particle, are obtained according to the following formulas:
$$z = \sqrt{(e_{gbest} - e_i)^2 + (s_{gbest} - s_i)^2}$$

$$ACD = 1 / z$$

where $z$ is the distance between the $i$-th particle and the particle corresponding to the global extremum; $e_{gbest}$ and $s_{gbest}$ are the x- and y-coordinates of the global extremum particle in the target search space; $e_i$ and $s_i$ are the x- and y-coordinates of the $i$-th particle; and $ACD$ (actual contribution degree) is the threshold segmentation contribution of each particle, which is inversely proportional to the distance $z$. The closer the $i$-th particle is to the global extremum particle, the greater its contribution to the threshold segmentation in this iteration. The top 20% of particles, ranked by contribution from largest to smallest, are selected as elite particles, reducing optimization time. For details, see Figure 1, where the arrows represent the change in the number of thresholds, and the three circles indicate the optimization when the threshold takes different values.
Taking the gray value range of the segmented image as the coordinate range of the x- and y-axes, with the black dots as all particles and the central red dot as the global optimal solution $g_{best}$ of the current iteration, the search range can be reduced to the black circular region, that is, the region where the elite particles are located; $k$ is the number of thresholds. Each threshold is determined by a corresponding particle swarm, so $n$ thresholds require $n$ particle swarms. In the same iteration, the global optimal particle of each of the $n$ particle swarms is determined in turn.
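The contribution-based elite selection above can be sketched as follows, assuming particle positions are stored as $(e_i, s_i)$ pairs. The helper name select_elite and the small epsilon guarding against division by zero are our own additions:

```python
import numpy as np

def select_elite(positions, gbest, frac=0.2):
    """Return indices of the top-`frac` particles ranked by ACD = 1/z.

    positions: (m, 2) array of (e_i, s_i) coordinates.
    gbest: (e_gbest, s_gbest) of the current global extremum particle.
    """
    z = np.linalg.norm(positions - np.asarray(gbest), axis=1)  # distance z
    acd = 1.0 / (z + 1e-12)          # contribution; epsilon avoids 1/0
    n_elite = max(1, int(frac * len(positions)))
    return np.argsort(acd)[::-1][:n_elite]   # largest contribution first
```

Only these elite indices are then passed on to the chaos-optimization stage, which is what reduces the per-iteration search cost.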
The selected elite particles are then passed to chaos optimization. During each iteration, $g_{best}$ is perturbed and used as the updated position of the particles, making them search locally around the global optimal solution. The purpose is to increase population diversity, enhance the local optimization ability of the algorithm, and realize a local depth search of the particle swarm, thereby mitigating premature convergence and the tendency to fall into local optima and improving segmentation accuracy. The specific steps of chaos optimization of the elite particles are as follows:
Step 1. Set the number of chaotic iterations to $M$ and map $g_{best}$ through Equation (15) into $[0, 1]$, the domain of the logistic map:

$$y_1^k = \frac{g_{best} - R_{min}^k}{R_{max}^k - R_{min}^k}$$

where $y_1^k$ is the sequence element obtained by the first iteration, $g_{best}$ is the global extremum of the whole population, and $R_{min}^k$ and $R_{max}^k$ are the minimum and maximum values of the particle position.
Step 2. Starting from $y_1^k$, iterate Equation (16) $M$ times to obtain the chaotic sequence $y^k = (y_1^k, y_2^k, \ldots, y_M^k)$:

$$y_{n+1}^k = \mu y_n^k (1 - y_n^k)$$

where $y_{n+1}^k$ is the sequence element obtained by the $(n+1)$-th iteration, $\mu$ is the control parameter, and $y_M^k$ is the sequence element obtained by the $M$-th iteration.
Step 3. The chaotic sequence is inversely mapped back to the original solution space through Equation (17) to obtain the feasible solution sequence of chaotic variables $g_{best}^k = (g_{best,1}^k, g_{best,2}^k, \ldots, g_{best,M}^k)$:

$$g_{best,m}^k = R_{min}^k + (R_{max}^k - R_{min}^k) \, y_m^k$$

where $y_m^k$ is the sequence element obtained by the $m$-th iteration, $m = 1, 2, \ldots, M$.
Step 4. Calculate the fitness value of each feasible solution vector in the feasible solution sequence and retain the optimal vector, denoted as g b e s t k .
Step 5. Randomly select a particle from the current particle swarm and replace the position vector of the selected particle with $g_{best}^k$.
Step 6. Iterate until the maximum number of iterations is reached or a sufficiently satisfactory solution is obtained.
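Steps 1–4 above can be sketched as a single routine. This is an illustrative implementation under the assumption that the fitness (the OTSU between-class variance) is to be maximized; the function name chaos_refine, the choice $\mu = 4$ (the fully chaotic regime of the logistic map), and the clamp away from the map's fixed points are our own assumptions:

```python
def chaos_refine(gbest, r_min, r_max, fitness, M=20, mu=4.0):
    """Logistic-map chaos search around gbest (Steps 1-4).

    Maps gbest into [0, 1], iterates the logistic map M times, maps each
    element back to the solution space, and keeps the best candidate.
    """
    y = (gbest - r_min) / (r_max - r_min)   # Step 1: map to [0, 1]
    y = min(max(y, 1e-6), 1 - 1e-6)         # avoid fixed points 0 and 1
    best_x, best_f = gbest, fitness(gbest)
    for _ in range(M):
        y = mu * y * (1.0 - y)              # Step 2: logistic map
        x = r_min + (r_max - r_min) * y     # Step 3: back to solution space
        f = fitness(x)
        if f > best_f:                      # Step 4: retain the best vector
            best_x, best_f = x, f
    return best_x, best_f
```

In Step 5, best_x would replace the position of a randomly chosen particle in the swarm.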

4.3. Our Method and Process

Compared with the classical particle swarm optimization algorithm, our improved algorithm is superior in both speed and accuracy. At the same time, it mitigates the tendency of particle swarm optimization to fall into local optima, greatly improving the efficiency of image segmentation. The flow chart of our improved particle swarm optimization algorithm is shown in Figure 2.
Linear learning factors and the elite particle search strategy are added to the classical particle swarm algorithm, which mitigates its slow optimization speed, low precision, low efficiency, and tendency to fall into local optima. The detailed process of the improved particle swarm algorithm is as follows:
Step 1. Set the parameters, which include the population size (m), dimension ( D ) , velocity range ( v ) , position range ( x ) , and maximum number of iterations ( k m a x ) .
Step 2. Initialize the population. Initialize the dimensions and vectors of the individual particles according to the threshold number ( k ) . The range of the position vectors is [0, 255].
Step 3. Update c 1 and c 2 using the asynchronously monotonically decreasing individual learning factor c 1 ( k ) and the asynchronously monotonically increasing social learning factor c 2 ( k ) from Equations (10) and (11), thereby improving the accuracy of particle optimization. Calculate the fitness value of each particle, updating the individual extrema of the particles and the global extremum.
Step 4. Iteratively update the position of the particle individual and the speed of the particle individual.
Step 5. Select the top 20% of particles as elite particles according to their threshold segmentation contributions.
Step 6. Use chaotic iterative optimization to find the best fitness value for elite particles by comparing the chaotic optimization results.
Step 7. Determine whether the current number of chaotic iterations has reached the set maximum or whether the chaos optimization has reached the preset accuracy. If so, go to Step 8; otherwise, go to Step 2.
Step 8. Map the optimal fitness value inversely back into the particle swarm, calculate the fitness values of all particles in the swarm, and output the optimal solution if the stopping conditions are met; otherwise, go to Step 1.
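The overall procedure of Steps 1–8 can be condensed into the following illustrative loop over a generic fitness function to be maximized. This is a sketch, not the paper's implementation: the elite-selection and chaos steps of Section 4.2 are omitted for brevity, and all names and defaults are our own:

```python
import numpy as np

rng = np.random.default_rng(1)

def ipso(fitness, m=50, D=2, k_max=100, x_lo=0.0, x_hi=255.0, v_max=10.0,
         c_min=0.5, c_max=2.5, w_max=0.9, w_min=0.4):
    """Condensed IPSO loop: init, asynchronous learning factors, update."""
    x = rng.uniform(x_lo, x_hi, (m, D))            # Step 2: init positions
    v = rng.uniform(-v_max, v_max, (m, D))
    f = np.apply_along_axis(fitness, 1, x)
    p, pf = x.copy(), f.copy()                     # individual extrema
    g = x[np.argmax(f)].copy()                     # global extremum
    for k in range(k_max):
        c1 = c_min + (c_max - c_min) * (1 - k / k_max)   # Step 3: Eq. (10)
        c2 = c_min + (c_max - c_min) * (k / k_max)       #         Eq. (11)
        w = w_max - (w_max - w_min) * k / k_max          # inertia decay
        r1, r2 = rng.random((m, D)), rng.random((m, D))
        v = np.clip(w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x),
                    -v_max, v_max)
        x = np.clip(x + v, x_lo, x_hi)             # Step 4: move particles
        f = np.apply_along_axis(fitness, 1, x)
        better = f > pf                            # update extrema
        p[better], pf[better] = x[better], f[better]
        g = p[np.argmax(pf)].copy()
        # Steps 5-7 (elite selection and chaos refinement of g, Section 4.2)
        # are omitted from this condensed sketch.
    return g, fitness(g)
```

For OTSU segmentation, fitness would be the between-class variance of the thresholds encoded in each particle, and D would equal the number of thresholds.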

5. Experimental Analysis

5.1. Experimental Environment

Experiments demonstrate the effectiveness of our improved multi-threshold image segmentation algorithm. The test image is taken from the PASCAL 2012 data set [25]. The experimental environment consists of MATLAB 2018a, Windows 10, and an Intel(R) Core(TM) i5-5600H @ 2.4GHz CPU with 8GB of RAM. We set population size m = 50 , maximum number of iterations k m a x = 100 , particle position range x i [ 0 , 255 ] , velocity range v i [ 10 , 10 ] , individual learning factor c 1 [ 0.5 , 2.5 ] , and social learning factor c 2 [ 0.5 , 2.5 ] .

5.2. Benchmark Function

This paper selects 12 standard test functions for comparative experiments to test the robustness, exploration, and exploitation performance of the algorithm. These functions were derived from the CEC benchmark set [26], covering unimodal functions (F1–F4), multimodal functions (F5–F8), and fixed-dimension multimodal functions (F9–F12), all of which have known minima. The algorithms compared against are traditional meta-heuristics: the whale optimization algorithm (WOA) [27], the butterfly optimization algorithm (BOA) [28], and the particle swarm optimization algorithm (PSO) [14]. For a fair evaluation, 30 independent runs of each of the four optimization algorithms were conducted, and experimental results smaller than $10^{-8}$ are reported as 0.
Table 1 shows the selected test functions: Dim is the number of variables and N\A means no analytic solution. Table 2 shows the average value and standard deviation of each benchmark function obtained by our method and by the traditional meta-heuristic algorithms. According to the data in Table 2, our method is robust across the benchmark functions, whether unimodal, multimodal, or fixed-dimension multimodal. From Table 2, we can see that our method performs better than WOA, BOA, and PSO. In particular, it shows a smaller error on the multimodal functions, indicating better exploration capability. Our method did not find the global minimum on several functions, but its error was smaller than that of the other algorithms. In summary, our method was more exploitative and exploratory than the other algorithms. The specific experiments are as follows:

5.3. Segmentation Experiment and Results

We used optimization time and image segmentation quality to compare the performance of the algorithms. Segmentation quality was evaluated with the peak signal-to-noise ratio (PSNR) [29,30], structural similarity (SSIM) [31,32], and feature similarity (FSIM) [33,34], and the fitness values of all algorithms were calculated. PSNR evaluates the degree of image distortion; the higher its value, the smaller the distortion. SSIM evaluates segmentation performance from image contrast and structural information; the larger the value, the better the performance. FSIM evaluates the feature similarity between the original and segmented images; the larger the value of $FSIM \in [0, 1]$, the better the segmentation quality.
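For reference, PSNR is computed from the mean squared error between the original and segmented images; a minimal sketch (the function name is our own) is:

```python
import numpy as np

def psnr(original, segmented, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    diff = original.astype(np.float64) - segmented.astype(np.float64)
    mse = np.mean(diff ** 2)                    # mean squared error
    if mse == 0:
        return float('inf')                     # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```

SSIM and FSIM are more involved (they compare local luminance/contrast/structure and phase-congruency/gradient features, respectively) and are typically taken from an image-processing library rather than reimplemented.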
In this paper, images from PASCAL 2012 [25] were selected as the experimental dataset. Figure 3a–f (numbered 000019, 000436, 001478, 003579, 004423, 006404) are typical images with relatively concentrated gray distributions and small inter-class variance, while Figure 3g–l (numbered 001236, 001876, 002036, 004231, 004610, 006946) are images with clearly distinct gray distributions, that is, relatively large inter-class variance, as shown in Figure 3. Theoretically, the greater the number of segmentation thresholds, the greater the accuracy of the segmentation. To quantitatively analyze segmentation performance, four groups of threshold numbers, k = 2, 4, 6, and 8, were used for index analysis. Our improved particle swarm optimization algorithm was compared with the classical particle swarm optimization algorithm [14], the whale optimization algorithm [27], and the butterfly optimization algorithm [28] on several evaluation indicators. The whale optimization algorithm and the butterfly optimization algorithm are both bionic algorithms; the difference is that the whale optimization algorithm searches by encircling, chasing, and attacking prey, while the butterfly algorithm locates the target source through its own perception. In this way, we can better understand the segmentation effect of the improved particle swarm OTSU image segmentation method relative to the other algorithms under different thresholds. Table 3, Table 4, Table 5, Table 6, Table 7, Table 8, Table 9 and Table 10 show the comparison results for PSNR, SSIM, FSIM, and fitness values, respectively. It can be seen that as the number of thresholds k increases, the distortion after segmentation gradually decreases and the result comes closer to the original image.
SSIM and FSIM gradually increase and segmentation performance improves, which shows the advantage of multi-threshold image segmentation. To verify the universality of the proposed algorithm, we randomly selected 100 images from the PASCAL 2012 dataset for the same experiment and list the averages of the experimental parameters in Table 11. In addition, the best-performing algorithm on each evaluation criterion is marked in bold.
When the number of thresholds is small (k = 2 or k = 4), the PSNR, SSIM, and FSIM values of the improved particle swarm optimization algorithm are only slightly better than those of the other algorithms. Once the number of thresholds increases (k = 6 or k = 8), however, its advantages become apparent: the computed PSNR, SSIM, and FSIM values are the highest among the four algorithms, indicating that the segmentation obtained from the optimal threshold combination found by the improved particle swarm algorithm has the highest similarity to the original image, which reflects the optimization accuracy of the algorithm. The algorithm running times are shown in Figure 4 and Figure 5.
From Table 3, Table 4, Table 5, Table 6, Table 7, Table 8, Table 9 and Table 10, it can be seen that the segmentation effect of the proposed algorithm is better. Taking Image 1 (000019) as an example, its PSNR value at k = 8 was 21.3771, significantly better than the comparison algorithms, and its SSIM value was 0.7333, also slightly better than the other algorithms, indicating low distortion and preserved image quality after segmentation. While maintaining this segmentation quality, Figure 4 shows that the running time of the proposed scheme was shorter than that of the other algorithms, especially for larger threshold counts: at k = 8, the segmentation running time was shortened from 3.7 s to 2.8 s, a decrease of more than 30% as reported, which greatly improves the running speed. To verify the universality of the algorithm, we randomly selected 100 images from the PASCAL 2012 dataset for comparison tests; the results show that our method is faster and more accurate than the other algorithms in terms of PSNR, SSIM, FSIM, and running time. Table 6 and Table 10 show the fitness values obtained by the different algorithms under different threshold counts, and the segmented images are shown in Figure 6 and Figure 7.
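The PSNR metric used throughout these comparisons has a standard closed form: it is the mean squared error between the original and segmented image expressed on a logarithmic scale. A minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

def psnr(original, segmented, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    diff = original.astype(np.float64) - segmented.astype(np.float64)
    mse = np.mean(diff ** 2)          # mean squared error
    if mse == 0:
        return float("inf")           # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```

SSIM and FSIM are computed analogously from local luminance/contrast/structure statistics [31] and phase-congruency/gradient features [34], respectively, typically via a library implementation.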

6. Conclusions

This paper proposes an improved particle swarm optimization algorithm and applies it to multi-threshold image segmentation. A new elite particle search strategy narrows the scope of the particle search; an asynchronously, monotonically increasing social learning factor and an asynchronously, monotonically decreasing individual learning factor balance global and local search; and chaos optimization increases population diversity. The improved particle swarm optimization algorithm is then used to find the optimal multi-threshold combination for OTSU image segmentation, realizing multi-threshold segmentation. The benchmark-function tests show that the algorithm has good exploitation and exploration ability, and the experimental results show that the proposed algorithm achieves higher segmentation accuracy and efficiency.
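The two ingredients named above can be sketched in a few lines. The linear schedule, the bounds 0.5 and 2.5, and the logistic map with mu = 4 are illustrative assumptions standing in for the paper's exact formulas:

```python
def learning_factors(t, t_max, c_min=0.5, c_max=2.5):
    """Asynchronously varying PSO learning factors (illustrative schedule).

    c1 (individual/cognitive) decreases monotonically while c2 (social)
    increases, shifting the swarm from exploration toward exploitation.
    Bounds and the linear form are assumptions, not the paper's formulas.
    """
    frac = t / t_max
    c1 = c_max - (c_max - c_min) * frac   # individual factor: decreasing
    c2 = c_min + (c_max - c_min) * frac   # social factor: increasing
    return c1, c2

def logistic_chaos(x, mu=4.0):
    """Logistic map, a common chaotic sequence used to perturb particles."""
    return mu * x * (1.0 - x)
```

Early in the run the large c1 lets each particle trust its own best position (global search); late in the run the large c2 pulls the swarm toward the global best (local refinement), while chaotic perturbations keep the population diverse.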
Although our method offers faster speed and higher accuracy in image segmentation, compared with threshold segmentation methods such as fuzzy entropy it gains considerable speed at a slight cost in segmentation accuracy, which is one of the directions we will study in the future. We also plan to apply the improved particle swarm optimization algorithm to image segmentation and image processing in other fields (such as new-material image processing, target detection, and lightweight embedded robots), to study the algorithm's ability to solve practical problems and further enhance its practicability.

Author Contributions

Conceptualization, J.Z. (Jianfeng Zheng) and Y.G.; methodology, J.Z. (Jianfeng Zheng), Y.G. and Y.L.; software, Y.G.; validation, J.Z. (Jianfeng Zheng), Y.G. and J.Z. (Ji Zhang); formal analysis, Y.G. and H.Z.; investigation, Y.G. and Y.L.; resources, J.Z. (Jianfeng Zheng) and J.Z. (Ji Zhang); data curation, Y.G., Y.L. and H.Z.; writing—original draft preparation, Y.G.; writing—review and editing, J.Z. (Jianfeng Zheng) and Y.G.; visualization, Y.G.; supervision, J.Z. (Jianfeng Zheng) and J.Z. (Ji Zhang). All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Postgraduate Research & Practice Innovation Program of Jiangsu Province under grant number SJCX21_1282.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
PSO	Particle swarm optimization
PSNR	Peak signal-to-noise ratio
FSIM	Feature similarity
SSIM	Structural similarity

References

1. Cheng, Y.; Li, B. Image Segmentation Technology and Its Application in Digital Image Processing. In Proceedings of the 2021 IEEE Asia-Pacific Conference on Image Processing, Electronics and Computers (IPEC), Dalian, China, 14–16 April 2021; pp. 1174–1177.
2. Chakraborty, F.; Roy, P.K.; Nandi, D. Oppositional elephant herding optimization with dynamic Cauchy mutation for multilevel image thresholding. Evol. Intell. 2019, 12, 445–467.
3. Abd Elaziz, M.; Oliva, D.; Ewees, A.A.; Xiong, S. Multi-level thresholding-based grey scale image segmentation using multi-objective multi-verse optimizer. Expert Syst. Appl. 2019, 125, 112–129.
4. Upadhyay, P.; Chhabra, J.K. Kapur's entropy based optimal multilevel image segmentation using crow search algorithm. Appl. Soft Comput. 2020, 97, 105522.
5. Yazid, H.; Basah, S.N.; Rahim, S.A.; Safar, M.J.A.; Basaruddin, K.S. Performance analysis of entropy thresholding for successful image segmentation. Multimed. Tools Appl. 2022, 81, 6433–6450.
6. Mahajan, S.; Mittal, N.; Pandit, A.K. Image segmentation using multilevel thresholding based on type II fuzzy entropy and marine predators algorithm. Multimed. Tools Appl. 2021, 80, 19335–19359.
7. Wu, B.; Zhou, J.; Ji, X.; Yin, Y.; Shen, X. An ameliorated teaching–learning-based optimization algorithm based study of image segmentation for multilevel thresholding using Kapur's entropy and Otsu's between class variance. Inf. Sci. 2020, 533, 72–107.
8. Houssein, E.H.; Helmy, B.E.d.; Oliva, D.; Jangir, P.; Premkumar, M.; Elngar, A.A.; Shaban, H. An efficient multi-thresholding based COVID-19 CT images segmentation approach using an improved equilibrium optimizer. Biomed. Signal Process. Control 2022, 73, 103401.
9. Sharma, S.; Saha, A.K.; Majumder, A.; Nama, S. MPBOA—A novel hybrid butterfly optimization algorithm with symbiosis organisms search for global optimization and image segmentation. Multimed. Tools Appl. 2021, 80, 12035–12076.
10. Elaziz, M.E.A.; Heidari, A.A.; Fujita, H.; Moayedi, H. A competitive chain-based Harris Hawks Optimizer for global optimization and multi-level image thresholding problems. Appl. Soft Comput. 2020, 95, 106347.
11. Zhang, L.; Wang, J.; An, Z. FCM fuzzy clustering image segmentation algorithm based on fractional particle swarm optimization. J. Intell. Fuzzy Syst. 2020, 38, 3575–3584.
12. Zhao, D.; Liu, L.; Yu, F.; Heidari, A.A.; Wang, M.; Oliva, D.; Muhammad, K.; Chen, H. Ant colony optimization with horizontal and vertical crossover search: Fundamental visions for multi-threshold image segmentation. Expert Syst. Appl. 2021, 167, 114122.
13. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; MIT Press: Cambridge, MA, USA, 1992.
14. Shami, T.M.; El-Saleh, A.A.; Alswaitti, M.; Al-Tashi, Q.; Summakieh, M.A.; Mirjalili, S. Particle swarm optimization: A comprehensive survey. IEEE Access 2022, 10, 10031–10061.
15. Raj, S.; Bhattacharyya, B. Optimal placement of TCSC and SVC for reactive power planning using Whale optimization algorithm. Swarm Evol. Comput. 2018, 40, 131–143.
16. Shiva, C.K.; Gudadappanavar, S.S.; Vedik, B.; Babu, R.; Raj, S.; Bhattacharyya, B. Fuzzy-Based Shunt VAR Source Placement and Sizing by Oppositional Crow Search Algorithm. J. Control. Autom. Electr. Syst. 2022, 33, 1576–1591.
17. Babu, R.; Kumar, V.; Shiva, C.K.; Raj, S.; Bhattacharyya, B. Application of Sine–Cosine Optimization Algorithm for Minimization of Transmission Loss. Technol. Econ. Smart Grids Sustain. Energy 2022, 7, 6.
18. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
19. Su, H.; Zhao, D.; Yu, F.; Heidari, A.A.; Zhang, Y.; Chen, H.; Li, C.; Pan, J.; Quan, S. Horizontal and vertical search artificial bee colony for image segmentation of COVID-19 X-ray images. Comput. Biol. Med. 2022, 142, 105181.
20. Xu, B.; Heidari, A.A.; Kuang, F.; Zhang, S.; Chen, H.; Cai, Z. Quantum Nelder-Mead Hunger Games Search for optimizing photovoltaic solar cells. Int. J. Energy Res. 2022, 46, 12417–12466.
21. Trojovský, P.; Dehghani, M. Pelican optimization algorithm: A novel nature-inspired algorithm for engineering applications. Sensors 2022, 22, 855.
22. Shabani, A.; Asgarian, B.; Gharebaghi, S.A.; Salido, M.A.; Giret, A. A new optimization algorithm based on search and rescue operations. Math. Probl. Eng. 2019, 2019, 2482543.
23. Oliva, D.; Elaziz, M.A. An improved brainstorm optimization using chaotic opposite-based learning with disruption operator for global optimization and feature selection. Soft Comput. 2020, 24, 14051–14072.
24. Sun, Y.; Yang, Y. An Adaptive Bi-Mutation-Based Differential Evolution Algorithm for Multi-Threshold Image Segmentation. Appl. Sci. 2022, 12, 5759.
25. Shetty, S. Application of convolutional neural network for image classification on Pascal VOC challenge 2012 dataset. arXiv 2016, arXiv:1607.03785.
26. Liang, J.J.; Suganthan, P.N.; Deb, K. Novel composition test functions for numerical global optimization. In Proceedings of the 2005 IEEE Swarm Intelligence Symposium, SIS 2005, Pasadena, CA, USA, 8–10 June 2005; pp. 68–75.
27. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
28. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734.
29. Huynh-Thu, Q.; Ghanbari, M. Scope of validity of PSNR in image/video quality assessment. Electron. Lett. 2008, 44, 800–801.
30. Hore, A.; Ziou, D. Image quality metrics: PSNR vs. SSIM. In Proceedings of the 2010 IEEE 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010; pp. 2366–2369.
31. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
32. Brunet, D.; Vrscay, E.R.; Wang, Z. On the mathematical properties of the structural similarity index. IEEE Trans. Image Process. 2011, 21, 1488–1499.
33. Sara, U.; Akter, M.; Uddin, M.S. Image quality assessment through FSIM, SSIM, MSE and PSNR—A comparative study. J. Comput. Commun. 2019, 7, 8–18.
34. Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A feature similarity index for image quality assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386.
Figure 1. Particle swarm narrows the search range according to the contribution of particles.
Figure 2. Improved particle swarm algorithm flowchart.
Figure 3. Histogram distribution.
Figure 4. Algorithm running time (Images a–f in Figure 3).
Figure 5. Algorithm running time (Images g–l in Figure 3).
Figure 6. When threshold k = 8, the original image and the segmented images of the different algorithms are shown (images a–f in Figure 3).
Figure 7. When threshold k = 8, the original image and the segmented images of the different algorithms are shown (images g–l in Figure 3).
Table 1. Benchmark Function.
Test Function | Type | Dim | Range | $f_{min}$
$F_1(x)=\sum_{i=1}^{n} x_i^2$ | US | 30 | $[-100, 100]$ | 0
$F_2(x)=\sum_{i=1}^{n}|x_i|+\prod_{i=1}^{n}|x_i|$ | US | 30 | $[-10, 10]$ | 0
$F_3(x)=\sum_{i=1}^{n}(|x_i+0.5|)^2$ | US | 30 | $[-100, 100]$ | 0
$F_4(x)=\sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0,1)$ | US | 30 | $[-1.28, 1.28]$ | 0
$F_5(x)=-20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right)-\exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right)+20+e$ | UN | 30 | $[-32, 32]$ | 0
$F_6(x)=\frac{1}{4000}\sum_{i=1}^{n} x_i^2-\prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1$ | UN | 30 | $[-600, 600]$ | 0
$F_7(x)=\frac{\pi}{n}\left(10\sin(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2\left[1+10\sin^2(\pi y_{i+1})\right]+(y_n-1)^2\right)+\sum_{i=1}^{n} u(x_i,10,100,4)$ | UN | 30 | $[-50, 50]$ | 0
$F_8(x)=0.1\left(\sin^2(3\pi x_1)+\sum_{i=1}^{n}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_n-1)^2\left[1+\sin^2(2\pi x_n)\right]\right)+\sum_{i=1}^{n} u(x_i,5,100,4)$ | UN | 30 | $[-50, 50]$ | 0
$F_9(x)=4x_1^2-2.1x_1^4+\frac{1}{3}x_1^6+x_1x_2-4x_2^2+4x_2^4$ | FDM | 2 | $[-5, 5]$ | $-1.0316$
$F_{10}(x)=\left(x_2-\frac{5.1}{4\pi^2}x_1^2+\frac{5}{\pi}x_1-6\right)^2+10\left(1-\frac{1}{8\pi}\right)\cos x_1+10$ | FDM | 2 | $[-5, 5]$ | 0.398
$F_{11}(x)=\left[1+(x_1+x_2+1)^2(19-14x_1+3x_1^2-14x_2+6x_1x_2+3x_2^2)\right]\times\left[30+(2x_1-3x_2)^2(18-32x_1+12x_1^2+48x_2-36x_1x_2+27x_2^2)\right]$ | FDM | 2 | $[-2, 2]$ | 3
$F_{12}(x)=-\sum_{i=1}^{10}\left[(X-a_i)(X-a_i)^T+c_i\right]^{-1}$ | FDM | 4 | $[0, 10]$ | $-10.5363$
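A few of these benchmark functions, written out as code for concreteness (an illustrative NumPy sketch; function names are ours):

```python
import numpy as np

def sphere(x):
    """F1: unimodal, separable; global minimum 0 at x = 0."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

def ackley(x):
    """F5 (Ackley): multimodal; global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20.0 + np.e)

def griewank(x):
    """F6 (Griewank): multimodal; global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000.0
                 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)
```

Running each optimizer on such functions with known minima is what allows the mean and standard deviation comparison in Table 2.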
Table 2. Compare the mean and variance of different algorithms through benchmark functions.
Function | WOA Mean | WOA Std | BOA Mean | BOA Std | PSO Mean | PSO Std | OURS Mean | OURS Std
$f_1$ | 0 | 0 | 0 | 0 | $7.23\times10^{-5}$ | $6.30\times10^{-5}$ | 0 | 0
$f_2$ | 0 | 0 | 0 | 0 | 0.23241 | $8.23\times10^{-2}$ | 0 | 0
$f_3$ | 0 | 0 | 0.43910 | 0.18334 | 5.61222 | 0.73845 | 0 | 0
$f_4$ | 0.00511 | 0.00429 | 0.00935 | 0.00903 | 0.69190 | 0.21219 | 0.00149 | 0.00065
$f_5$ | 0 | 0 | 0 | 0 | 2.10830 | 0.27572 | 0 | 0
$f_6$ | 0 | 0 | 0 | 0 | 0.00105 | 0.00168 | 0 | 0
$f_7$ | 0.02203 | 0.01070 | 0.69756 | 0.08152 | 4.41429 | 1.81612 | 0.03898 | 0.07815
$f_8$ | 0 | 0 | 0.01205 | 0.01832 | 0.50179 | 0.24452 | 0 | 0
$f_9$ | $-1.0316$ | 0 | N/A | N/A | $-1.0316$ | 0 | $-1.0316$ | 0
$f_{10}$ | 0.39790 | $9.22\times10^{-6}$ | 0.54290 | 0.30599 | 0.39789 | 0 | 0.39789 | $2.48\times10^{-5}$
$f_{11}$ | 3.00001 | $3\times10^{-5}$ | 3.04893 | 0.05603 | 3 | 0 | 3.00017 | 0.00025
$f_{12}$ | $-7.25931$ | 3.34554 | $-4.01282$ | 0.38772 | $-8.91402$ | 3.24476 | $-10.49534$ | 0.12228
Table 3. Comparison of PSNR in images a–f.
Image | k | WOA | BOA | PSO | OURS
Image1 (000019) | 2 | 11.4924 | 12.6732 | 13.0284 | 13.7823
Image1 (000019) | 4 | 12.5075 | 13.7829 | 14.1673 | 14.7642
Image1 (000019) | 6 | 16.5389 | 17.2659 | 17.8454 | 19.6037
Image1 (000019) | 8 | 19.8595 | 20.4863 | 20.5269 | 21.3771
Image2 (000436) | 2 | 13.1415 | 13.7986 | 14.5376 | 14.1383
Image2 (000436) | 4 | 14.1361 | 14.5284 | 15.3228 | 15.6705
Image2 (000436) | 6 | 17.9243 | 17.9988 | 18.3650 | 19.3411
Image2 (000436) | 8 | 18.5029 | 19.2229 | 20.9130 | 21.9703
Image3 (001478) | 2 | 13.3784 | 13.7194 | 13.7913 | 14.2379
Image3 (001478) | 4 | 13.6647 | 14.1434 | 14.9181 | 15.4509
Image3 (001478) | 6 | 15.8758 | 16.4661 | 17.4344 | 18.6340
Image3 (001478) | 8 | 17.3109 | 17.6060 | 19.4388 | 20.3651
Image4 (003579) | 2 | 13.6386 | 13.5726 | 14.4871 | 15.9936
Image4 (003579) | 4 | 14.3408 | 14.7052 | 15.6358 | 15.7378
Image4 (003579) | 6 | 15.9297 | 16.6595 | 17.2260 | 18.2881
Image4 (003579) | 8 | 16.6926 | 16.8525 | 17.4432 | 19.6017
Image5 (004423) | 2 | 15.1932 | 15.8189 | 15.5466 | 15.7275
Image5 (004423) | 4 | 15.5115 | 16.2379 | 16.9289 | 16.8156
Image5 (004423) | 6 | 17.5728 | 18.8819 | 20.8917 | 21.7370
Image5 (004423) | 8 | 18.6900 | 19.9407 | 21.9854 | 23.3966
Image6 (006404) | 2 | 13.8768 | 13.8550 | 14.9768 | 14.9706
Image6 (006404) | 4 | 14.6173 | 14.7452 | 14.8361 | 15.7539
Image6 (006404) | 6 | 16.5043 | 17.5922 | 18.7453 | 19.1893
Image6 (006404) | 8 | 17.8543 | 18.5252 | 20.7008 | 21.1228
Table 4. Comparison of SSIM in images a–f.
Image | k | WOA | BOA | PSO | OURS
Image1 (000019) | 2 | 0.4186 | 0.4628 | 0.4521 | 0.4955
Image1 (000019) | 4 | 0.4447 | 0.4559 | 0.5443 | 0.5968
Image1 (000019) | 6 | 0.4921 | 0.5161 | 0.6220 | 0.6973
Image1 (000019) | 8 | 0.6899 | 0.6930 | 0.7011 | 0.7333
Image2 (000436) | 2 | 0.4336 | 0.4922 | 0.5550 | 0.5806
Image2 (000436) | 4 | 0.4781 | 0.5615 | 0.5975 | 0.6381
Image2 (000436) | 6 | 0.5482 | 0.6102 | 0.6238 | 0.6676
Image2 (000436) | 8 | 0.5720 | 0.6267 | 0.6541 | 0.7895
Image3 (001478) | 2 | 0.4299 | 0.4474 | 0.5120 | 0.5506
Image3 (001478) | 4 | 0.5219 | 0.5774 | 0.6484 | 0.6386
Image3 (001478) | 6 | 0.5634 | 0.5971 | 0.6960 | 0.7235
Image3 (001478) | 8 | 0.6563 | 0.6857 | 0.7436 | 0.8154
Image4 (003579) | 2 | 0.5610 | 0.5961 | 0.5969 | 0.5904
Image4 (003579) | 4 | 0.5796 | 0.6057 | 0.6172 | 0.6714
Image4 (003579) | 6 | 0.6286 | 0.6411 | 0.6484 | 0.7172
Image4 (003579) | 8 | 0.6764 | 0.6938 | 0.7036 | 0.7297
Image5 (004423) | 2 | 0.3469 | 0.3755 | 0.3758 | 0.4954
Image5 (004423) | 4 | 0.4443 | 0.4570 | 0.5714 | 0.5978
Image5 (004423) | 6 | 0.5396 | 0.5680 | 0.6687 | 0.7178
Image5 (004423) | 8 | 0.6817 | 0.6638 | 0.7430 | 0.7746
Image6 (006404) | 2 | 0.4216 | 0.4419 | 0.4675 | 0.4990
Image6 (006404) | 4 | 0.4429 | 0.4723 | 0.5482 | 0.5745
Image6 (006404) | 6 | 0.4859 | 0.5186 | 0.5744 | 0.6482
Image6 (006404) | 8 | 0.5632 | 0.6114 | 0.6367 | 0.6861
Table 5. Comparison of FSIM in images a–f.
Image | k | WOA | BOA | PSO | OURS
Image1 (000019) | 2 | 0.7053 | 0.7214 | 0.7423 | 0.7711
Image1 (000019) | 4 | 0.7278 | 0.7535 | 0.7861 | 0.8026
Image1 (000019) | 6 | 0.7596 | 0.7974 | 0.8013 | 0.8250
Image1 (000019) | 8 | 0.8058 | 0.8144 | 0.8224 | 0.8476
Image2 (000436) | 2 | 0.6525 | 0.6934 | 0.7571 | 0.7918
Image2 (000436) | 4 | 0.7120 | 0.7235 | 0.7610 | 0.8132
Image2 (000436) | 6 | 0.7350 | 0.7441 | 0.7832 | 0.8317
Image2 (000436) | 8 | 0.7582 | 0.7655 | 0.8086 | 0.8594
Image3 (001478) | 2 | 0.6592 | 0.6763 | 0.7084 | 0.7135
Image3 (001478) | 4 | 0.6795 | 0.6940 | 0.7288 | 0.7402
Image3 (001478) | 6 | 0.7110 | 0.7417 | 0.7643 | 0.7732
Image3 (001478) | 8 | 0.7390 | 0.7652 | 0.7864 | 0.8277
Image4 (003579) | 2 | 0.7495 | 0.7634 | 0.7770 | 0.7953
Image4 (003579) | 4 | 0.7679 | 0.7855 | 0.7921 | 0.8113
Image4 (003579) | 6 | 0.7869 | 0.8024 | 0.8275 | 0.8322
Image4 (003579) | 8 | 0.8139 | 0.8289 | 0.8411 | 0.8672
Image5 (004423) | 2 | 0.7552 | 0.7695 | 0.7892 | 0.7859
Image5 (004423) | 4 | 0.7723 | 0.7820 | 0.8118 | 0.8382
Image5 (004423) | 6 | 0.7965 | 0.7998 | 0.8367 | 0.8547
Image5 (004423) | 8 | 0.8185 | 0.8247 | 0.8559 | 0.8712
Image6 (006404) | 2 | 0.6292 | 0.6467 | 0.6638 | 0.6779
Image6 (006404) | 4 | 0.6482 | 0.6614 | 0.6735 | 0.7066
Image6 (006404) | 6 | 0.6684 | 0.6876 | 0.6910 | 0.7134
Image6 (006404) | 8 | 0.6835 | 0.7155 | 0.7334 | 0.7780
Table 6. Comparison of Fitness in images a–f.
Image | k | WOA | BOA | PSO | OURS
Image1 (000019) | 2 | 1704.8574 | 1704.5298 | 1704.3760 | 1704.8259
Image1 (000019) | 4 | 2539.6813 | 2539.4239 | 2539.2458 | 2539.7578
Image1 (000019) | 6 | 2765.9773 | 2765.2783 | 2764.2456 | 2764.2568
Image1 (000019) | 8 | 3858.8308 | 3858.1415 | 3857.7672 | 3858.3303
Image2 (000436) | 2 | 2213.5065 | 2213.1738 | 2213.3147 | 2213.1353
Image2 (000436) | 4 | 3450.9044 | 3450.6474 | 3450.6818 | 3450.8937
Image2 (000436) | 6 | 4033.8449 | 4033.1927 | 4033.1099 | 4033.8446
Image2 (000436) | 8 | 4905.7727 | 4905.6214 | 4904.3492 | 4905.7649
Image3 (001478) | 2 | 2451.9890 | 2451.7517 | 2451.1717 | 2451.9397
Image3 (001478) | 4 | 3375.8902 | 3375.4263 | 3375.1152 | 3375.8437
Image3 (001478) | 6 | 3766.7624 | 3766.2709 | 3766.2389 | 3766.4151
Image3 (001478) | 8 | 3986.8089 | 3986.3177 | 3984.3164 | 3986.7295
Image4 (003579) | 2 | 1765.9298 | 1765.3507 | 1765.1458 | 1765.4344
Image4 (003579) | 4 | 2276.9954 | 2276.8173 | 2276.3680 | 2276.9576
Image4 (003579) | 6 | 3877.9674 | 3877.1473 | 3877.1218 | 3877.2627
Image4 (003579) | 8 | 5090.7653 | 5090.5032 | 5090.2623 | 5090.6338
Image5 (004423) | 2 | 1721.8701 | 1721.1817 | 1721.1124 | 1721.2721
Image5 (004423) | 4 | 2551.9400 | 2551.7442 | 2551.4594 | 2551.8431
Image5 (004423) | 6 | 2968.9682 | 2968.7638 | 2968.4520 | 2968.7339
Image5 (004423) | 8 | 3859.5790 | 3859.1403 | 3857.1300 | 3859.3798
Image6 (006404) | 2 | 2972.7388 | 2972.2542 | 2972.1044 | 2972.4020
Image6 (006404) | 4 | 3387.7543 | 3387.3168 | 3387.2231 | 3387.6870
Image6 (006404) | 6 | 4142.6126 | 4142.4051 | 4142.3022 | 4142.4612
Image6 (006404) | 8 | 5984.9055 | 5984.7659 | 5984.3470 | 5984.8045
Table 7. Comparison of PSNR in Images g–l.
Image | k | WOA | BOA | PSO | OURS
Image7 (001236) | 2 | 13.8293 | 14.5503 | 15.7167 | 15.7977
Image7 (001236) | 4 | 14.1970 | 14.6582 | 15.6621 | 15.6767
Image7 (001236) | 6 | 17.4146 | 17.5859 | 18.3812 | 20.5188
Image7 (001236) | 8 | 18.2980 | 18.1377 | 19.1016 | 21.6592
Image8 (001876) | 2 | 13.4009 | 14.5790 | 14.8465 | 14.7751
Image8 (001876) | 4 | 14.5690 | 15.6121 | 15.6524 | 16.4362
Image8 (001876) | 6 | 16.4996 | 16.7998 | 17.5997 | 19.7598
Image8 (001876) | 8 | 17.9384 | 18.6226 | 19.5498 | 21.5051
Image9 (002036) | 2 | 13.5557 | 14.4696 | 14.6034 | 14.7418
Image9 (002036) | 4 | 15.5611 | 15.1427 | 16.4177 | 16.3583
Image9 (002036) | 6 | 16.5306 | 17.3650 | 18.3565 | 19.1878
Image9 (002036) | 8 | 18.4281 | 18.5254 | 20.5951 | 22.6777
Image10 (004231) | 2 | 13.6632 | 14.3337 | 15.2033 | 15.9469
Image10 (004231) | 4 | 14.2989 | 14.6908 | 15.5551 | 15.2502
Image10 (004231) | 6 | 15.1205 | 16.5671 | 16.7517 | 17.4736
Image10 (004231) | 8 | 15.7529 | 16.7325 | 17.2486 | 18.2805
Image11 (004610) | 2 | 14.7697 | 14.9273 | 15.6025 | 16.4122
Image11 (004610) | 4 | 14.8004 | 15.4323 | 15.8120 | 16.9533
Image11 (004610) | 6 | 16.1080 | 17.5817 | 19.8235 | 20.4539
Image11 (004610) | 8 | 18.6965 | 19.8851 | 20.8712 | 21.7442
Image12 (006946) | 2 | 11.6113 | 12.9387 | 13.8889 | 13.3067
Image12 (006946) | 4 | 12.3074 | 12.4025 | 14.6690 | 14.1064
Image12 (006946) | 6 | 15.1121 | 15.2394 | 16.3323 | 17.7174
Image12 (006946) | 8 | 16.7195 | 16.8900 | 17.9451 | 18.3580
Table 8. Comparison of SSIM in Images g–l.
Image | k | WOA | BOA | PSO | OURS
Image7 (001236) | 2 | 0.5571 | 0.5861 | 0.5875 | 0.6273
Image7 (001236) | 4 | 0.6012 | 0.6302 | 0.6561 | 0.6567
Image7 (001236) | 6 | 0.6231 | 0.6519 | 0.7253 | 0.7461
Image7 (001236) | 8 | 0.6685 | 0.6918 | 0.7335 | 0.7824
Image8 (001876) | 2 | 0.4474 | 0.4610 | 0.4619 | 0.4849
Image8 (001876) | 4 | 0.4971 | 0.5300 | 0.5719 | 0.5826
Image8 (001876) | 6 | 0.5496 | 0.5826 | 0.6138 | 0.6611
Image8 (001876) | 8 | 0.6215 | 0.6291 | 0.6736 | 0.6884
Image9 (002036) | 2 | 0.3660 | 0.3920 | 0.4136 | 0.4650
Image9 (002036) | 4 | 0.3878 | 0.4012 | 0.4896 | 0.5109
Image9 (002036) | 6 | 0.4127 | 0.4322 | 0.5483 | 0.5705
Image9 (002036) | 8 | 0.4818 | 0.4845 | 0.5912 | 0.6318
Image10 (004231) | 2 | 0.3309 | 0.3387 | 0.3567 | 0.3671
Image10 (004231) | 4 | 0.3726 | 0.3862 | 0.4269 | 0.4262
Image10 (004231) | 6 | 0.4245 | 0.4527 | 0.4565 | 0.5049
Image10 (004231) | 8 | 0.4708 | 0.5428 | 0.5565 | 0.5778
Image11 (004610) | 2 | 0.6188 | 0.6238 | 0.6454 | 0.6840
Image11 (004610) | 4 | 0.6297 | 0.6632 | 0.6925 | 0.6918
Image11 (004610) | 6 | 0.7249 | 0.7650 | 0.7693 | 0.7891
Image11 (004610) | 8 | 0.7368 | 0.7728 | 0.7881 | 0.8017
Image12 (006946) | 2 | 0.4310 | 0.4429 | 0.4523 | 0.5013
Image12 (006946) | 4 | 0.4984 | 0.5204 | 0.5363 | 0.5699
Image12 (006946) | 6 | 0.5377 | 0.6239 | 0.6737 | 0.6875
Image12 (006946) | 8 | 0.6507 | 0.6918 | 0.7303 | 0.7471
Table 9. Comparison of FSIM in Images g–l.
Image | k | WOA | BOA | PSO | OURS
Image7 (001236) | 2 | 0.7066 | 0.7277 | 0.7510 | 0.7889
Image7 (001236) | 4 | 0.7397 | 0.7645 | 0.7916 | 0.8246
Image7 (001236) | 6 | 0.7560 | 0.8089 | 0.8433 | 0.8760
Image7 (001236) | 8 | 0.8030 | 0.8279 | 0.8660 | 0.8982
Image8 (001876) | 2 | 0.6461 | 0.6516 | 0.6747 | 0.6850
Image8 (001876) | 4 | 0.6642 | 0.6737 | 0.6927 | 0.7071
Image8 (001876) | 6 | 0.6823 | 0.6957 | 0.7182 | 0.7419
Image8 (001876) | 8 | 0.7128 | 0.7246 | 0.7584 | 0.8142
Image9 (002036) | 2 | 0.6726 | 0.6949 | 0.7191 | 0.7347
Image9 (002036) | 4 | 0.6938 | 0.7144 | 0.7341 | 0.7694
Image9 (002036) | 6 | 0.7220 | 0.7379 | 0.7550 | 0.7977
Image9 (002036) | 8 | 0.7421 | 0.7518 | 0.7796 | 0.8084
Image10 (004231) | 2 | 0.6381 | 0.6679 | 0.6980 | 0.7130
Image10 (004231) | 4 | 0.6526 | 0.6729 | 0.7111 | 0.7368
Image10 (004231) | 6 | 0.6932 | 0.7274 | 0.7510 | 0.7787
Image10 (004231) | 8 | 0.7349 | 0.7564 | 0.7860 | 0.8072
Image11 (004610) | 2 | 0.7283 | 0.7484 | 0.7585 | 0.7467
Image11 (004610) | 4 | 0.7379 | 0.7513 | 0.7685 | 0.7760
Image11 (004610) | 6 | 0.7451 | 0.7710 | 0.7819 | 0.8029
Image11 (004610) | 8 | 0.7670 | 0.7850 | 0.8129 | 0.8558
Image12 (006946) | 2 | 0.7375 | 0.7582 | 0.7955 | 0.8250
Image12 (006946) | 4 | 0.7541 | 0.7763 | 0.8152 | 0.8430
Image12 (006946) | 6 | 0.7796 | 0.8097 | 0.8421 | 0.8654
Image12 (006946) | 8 | 0.8016 | 0.8217 | 0.8427 | 0.8693
Table 10. Comparison of Fitness in Images g–l.
Image | k | WOA | BOA | PSO | OURS
Image7 (001236) | 2 | 2750.8992 | 2750.7616 | 2750.1240 | 2750.7641
Image7 (001236) | 4 | 3517.7437 | 3517.7256 | 3517.3212 | 3517.7290
Image7 (001236) | 6 | 4509.9853 | 4509.7814 | 4509.5764 | 4509.7829
Image7 (001236) | 8 | 6495.5661 | 6495.5018 | 6493.2953 | 6495.5538
Image8 (001876) | 2 | 2338.4961 | 2338.3209 | 2338.2888 | 2338.4402
Image8 (001876) | 4 | 3251.9846 | 3251.4460 | 3251.2548 | 3251.7075
Image8 (001876) | 6 | 3637.9811 | 3637.4199 | 3637.1219 | 3637.6122
Image8 (001876) | 8 | 4735.8050 | 4735.3475 | 4734.1491 | 4735.5701
Image9 (002036) | 2 | 2777.9633 | 2777.8345 | 2777.1929 | 2777.9091
Image9 (002036) | 4 | 3709.8962 | 3709.2028 | 3709.1977 | 3709.6104
Image9 (002036) | 6 | 4837.6270 | 4837.2012 | 4837.1285 | 4837.5268
Image9 (002036) | 8 | 5453.6888 | 5453.5749 | 5453.3442 | 5453.6073
Image10 (004231) | 2 | 1699.8490 | 1699.3038 | 1699.1601 | 1699.7288
Image10 (004231) | 4 | 2878.5953 | 2878.2350 | 2878.1299 | 2878.2976
Image10 (004231) | 6 | 3708.6870 | 3708.4114 | 3708.4016 | 3708.4181
Image10 (004231) | 8 | 4828.7905 | 4828.5773 | 4825.4886 | 4828.9096
Image11 (004610) | 2 | 1343.9485 | 1343.6048 | 1343.4410 | 1343.7570
Image11 (004610) | 4 | 2561.8768 | 2561.4219 | 2561.3565 | 2561.2494
Image11 (004610) | 6 | 2905.9755 | 2905.5645 | 2905.5418 | 2905.7108
Image11 (004610) | 8 | 4101.8497 | 4101.5807 | 4100.4235 | 4101.6487
Image12 (006946) | 2 | 2200.6262 | 2200.1469 | 2200.1288 | 2200.5102
Image12 (006946) | 4 | 3164.7962 | 3164.2352 | 3164.2936 | 3164.7918
Image12 (006946) | 6 | 3932.8209 | 3932.2591 | 3932.2245 | 3932.3805
Image12 (006946) | 8 | 4459.9945 | 4459.8215 | 4458.1837 | 4459.9166
Table 11. Average of 100 image segmentation results.
Metric | k | WOA | BOA | PSO | OURS
PSNR | 2 | 13.2343 | 14.7236 | 14.9630 | 15.8237
PSNR | 4 | 14.5273 | 14.9263 | 15.4328 | 15.9417
PSNR | 6 | 16.3862 | 16.9732 | 17.7237 | 19.6872
PSNR | 8 | 18.4687 | 19.2433 | 20.2934 | 21.6278
SSIM | 2 | 0.4327 | 0.4938 | 0.5019 | 0.5823
SSIM | 4 | 0.4825 | 0.5349 | 0.5835 | 0.6132
SSIM | 6 | 0.5926 | 0.6419 | 0.6723 | 0.6835
SSIM | 8 | 0.6324 | 0.6638 | 0.6924 | 0.7247
FSIM | 2 | 0.6842 | 0.7154 | 0.7369 | 0.7437
FSIM | 4 | 0.7147 | 0.7519 | 0.7830 | 0.8039
FSIM | 6 | 0.7422 | 0.7751 | 0.7984 | 0.8241
FSIM | 8 | 0.7645 | 0.7957 | 0.8226 | 0.8469
Running time (s) | 2 | 1.35 | 1.35 | 1.35 | 1.28
Running time (s) | 4 | 1.64 | 1.65 | 1.64 | 1.52
Running time (s) | 6 | 2.85 | 3.12 | 2.97 | 2.25
Running time (s) | 8 | 3.82 | 4.83 | 3.94 | 2.46

Share and Cite

MDPI and ACS Style

Zheng, J.; Gao, Y.; Zhang, H.; Lei, Y.; Zhang, J. OTSU Multi-Threshold Image Segmentation Based on Improved Particle Swarm Algorithm. Appl. Sci. 2022, 12, 11514. https://0-doi-org.brum.beds.ac.uk/10.3390/app122211514
