An ACO-Based Clustering Algorithm With Chaotic Function Mapping

Lei Yang, Xin Hu, Hui Wang, Wensheng Zhang, Kang Huang, Dongya Wang
DOI: 10.4018/IJCINI.20211001.oa20

Abstract

To overcome the shortcomings of the ant colony optimization clustering algorithm (ACOC) when dealing with the clustering problem, this paper introduces a novel ant colony optimization clustering algorithm with chaos. The main idea of the algorithm is to apply a chaotic mapping function in two stages of ant colony optimization: pheromone initialization and pheromone update. Applying the chaotic mapping function in the pheromone initialization phase encourages the ants to be distributed over as many different initial states as possible. Applying the chaotic mapping function in the pheromone update stage adds a disturbance factor to the algorithm, prompting the ants to explore new paths and avoiding premature convergence to suboptimal solutions. Extensive experiments with the traditional and proposed algorithms on four widely used benchmarks are conducted to investigate the performance of the new algorithm. The experimental results demonstrate the competitive efficiency, effectiveness, and stability of the proposed algorithm.
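The chaotic component described above can be illustrated with the logistic map, a common choice for generating chaotic sequences. The sketch below is an assumption for illustration only (function names, parameters such as mu, rho, and eps, and the disturbance scheme are hypothetical, not the exact formulation of the paper): it shows how such a map could seed a pheromone matrix with diverse values and inject a small disturbance into the pheromone update.

```python
import numpy as np

def logistic_map(x, mu=4.0):
    """One step of the logistic map; mu = 4.0 gives fully chaotic behaviour."""
    return mu * x * (1.0 - x)

def init_pheromone(n_objects, n_clusters, x0=0.3):
    """Hypothetical chaotic pheromone initialization: fill the N x K pheromone
    matrix with successive values of a chaotic sequence instead of a constant."""
    tau = np.empty((n_objects, n_clusters))
    x = x0
    for i in range(n_objects):
        for k in range(n_clusters):
            x = logistic_map(x)
            tau[i, k] = x
    return tau

def update_pheromone(tau, best_solution, best_cost, rho=0.1, x=0.7, eps=0.05):
    """Hypothetical chaotic pheromone update: standard evaporation and deposit
    plus a small chaotic disturbance term to keep ants exploring new paths."""
    tau *= (1.0 - rho)                      # evaporation
    for i, k in enumerate(best_solution):   # deposit on the best assignment
        tau[i, k] += 1.0 / (1.0 + best_cost)
    x = logistic_map(x)
    tau += eps * x                          # chaotic disturbance factor
    return tau, x
```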

Introduction

Data mining is one of the most critical tasks in the era of big data, and cluster analysis is one of the most basic tasks of data mining (Kao & Cheng, 2006); it divides a set of data objects into multiple groups. Data objects placed in the same group have close similarities; otherwise, they belong to different groups (Ding et al., 2016; Yang et al., 2004). By analyzing the similarity and dissimilarity between data in the data set, data objects are grouped or clustered (Hidayat, Fatichah, & Ginardi, 2016; Jabbar, Ku-Mahamud, & Sagban, 2018). Cluster analysis is also called unsupervised learning because the class labels, and even the number of classes, of the data objects are unknown before the data are analyzed (Gonzalez-Pardo, Jung, & Camacho, 2017; Han, Pei, & Kamber, 2011). Although cluster analysis and classification prediction are not equivalent tasks, cluster analysis can be used as a prerequisite for classification (Baig, Shahzad, & Khan, 2013). That is, when it is unknown what class labels a set of data objects can be divided into, cluster analysis can first be used to place similar data objects into the same groups. Then, according to certain principles, class labels are assigned to those groups. If the data are sufficient, the class labels generated from the data set can be used for data classification, and the data set can serve as a training data set for the classification task.

Over the past two decades, swarm intelligence has attracted a great deal of interest among researchers because of its dynamic and flexible capabilities and its advantages in solving real-world nonlinear problems with high efficiency, and many swarm-intelligence-based algorithms have been introduced for optimization in various areas of computer science (Anand Nayyar & Nayyar, 2018). The ant colony optimization algorithm is a swarm intelligence algorithm developed on the basis of natural genetics and the natural evolution of biological communities (Gonzalez-Pardo et al., 2017). As part of swarm intelligence, it solves complex combinatorial optimization problems by mimicking cooperative behavior among ants (Anand Nayyar, 2018). The algorithm has strong global search ability and does not depend on the form of the objective function, so it has been applied to the clustering problem (Menéndez, Otero, & Camacho, 2016; Monmarché, Slimane, & Venturini, 1999). At the same time, it is particularly good at solving discrete, stochastic, and dynamic problems (A. Nayyar & Singh, 2016), as well as routing issues in sensor networks (Anand Nayyar & Singh, 2014).

The basic ant colony optimization clustering algorithm (ACOC) aims to assign N data objects to K groups by minimizing the squared Euclidean distance between each data object and the center of its corresponding group (Zhang Jianhua & Jiang He, 2006). ACOC uses artificial ants (agents) to construct paths: each artificial ant starts with an empty string of length N, in which each element represents a data object in the data set, and the value of an element represents the group to which the corresponding data object is assigned (Gao, Wang, Cheng, Inazumi, & Tang, 2016; Pei Zhenkui & Li Hua, 2008). To improve the convergence rate, a principle of direct allocation is adopted in the initial stage of the ACOC algorithm, placing the ants on data points at random and generating a random global memory (Wang & Luo, 2019). To further improve the convergence and search ability of ACOC, the mutation operator of the genetic algorithm has been combined with the ant colony algorithm, enabling the ant colony algorithm to generate genetic-algorithm initial data in each iteration; this improves population diversity, expands the search scope of the solution, and avoids getting trapped in local optima (Wu, Yan, Zhang, & Shen, 2018). A hybrid ACO-clustering approach for big data preprocessing has also been proposed, which can increase search speed by optimizing the process; because the method combines ant colony optimization with a clustering algorithm, it also helps to reduce preprocessing time and to increase analytical accuracy and efficiency (Singh, Singh, & Pant, 2019). A minimal sketch of the basic ACOC construction step is given below.
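The sketch assumes an N x K pheromone matrix tau and purely illustrative names (construct_solution, clustering_cost, and the random data are assumptions, not the paper's implementation). Each ant builds a string of length N by sampling a cluster index for every data object in proportion to its pheromone values, and the resulting partition is scored by the sum of squared Euclidean distances to the cluster centers.

```python
import numpy as np

def construct_solution(tau, rng):
    """Each ant builds a length-N string: element i is the cluster assigned
    to data object i, sampled in proportion to the pheromone row tau[i, :]."""
    n_objects, n_clusters = tau.shape
    solution = np.empty(n_objects, dtype=int)
    for i in range(n_objects):
        probs = tau[i] / tau[i].sum()
        solution[i] = rng.choice(n_clusters, p=probs)
    return solution

def clustering_cost(data, solution, n_clusters):
    """Sum of squared Euclidean distances between each data object and the
    center (mean) of the group it was assigned to."""
    cost = 0.0
    for k in range(n_clusters):
        members = data[solution == k]
        if len(members) == 0:
            continue
        center = members.mean(axis=0)
        cost += ((members - center) ** 2).sum()
    return cost

# Illustrative usage with random data and a uniform pheromone matrix.
rng = np.random.default_rng(0)
data = rng.random((150, 4))        # e.g. an Iris-sized data set (assumed)
tau = np.ones((150, 3))            # N x K pheromone matrix
sol = construct_solution(tau, rng)
print(clustering_cost(data, sol, 3))
```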
