Some Information Measures for Fuzzy Rough Sets

Omdutt Sharma, Pratiksha Tiwari, Priti Gupta
Copyright: © 2021 |Pages: 21
DOI: 10.4018/IJFSA.2021040105
Abstract

Information theory is a tool for measuring uncertainty; it is now used to solve various challenging problems that involve hybridizing information theory with fuzzy sets, rough sets, vague sets, etc. To solve challenging problems in scientific data analysis and visualization, various authors have recently been working on hybrid measures of information theory. In this paper, using the relationships between information measures, some measures are proposed for fuzzy rough sets. First, an entropy measure is derived using a fuzzy rough similarity measure; then, corresponding to this entropy measure, some other measures, such as a mutual information measure, a joint entropy measure, and a conditional entropy measure, are also proposed. Some properties of these measures are studied. The proposed measures are then compared with some existing measures to demonstrate their efficiency. Further, the proposed measures are applied to pattern recognition, medical diagnosis, and a real-life decision-making problem of incorporating software into the curriculum at a Department of Statistics.

Introduction

In recent years, there has been an emerging trend of using the principles of information theory to solve uncertainty problems. Information theory provides a theoretical foundation for quantifying the information content, or uncertainty, of a random variable represented as a distribution. Uncertainty has not always been welcomed by the scientific community: in the conventional view of science, uncertainty characterizes an unfavorable situation, one that must be prevented at all costs. To manage uncertainty, it is important to quantify its degree. Shannon (1948) used "entropy" to quantify the degree of randomness in a probability distribution $P = (p_1, p_2, \ldots, p_n)$. The information contained in such an experiment is given by $H(P) = -\sum_{i=1}^{n} p_i \log p_i$, which is known as Shannon entropy. Later, Zadeh (1965) introduced the concept of the fuzzy set, for which an entropy was defined that is distinct from classical Shannon entropy, as no probabilistic notion is required to describe it. Fuzzy entropy is a measure of the fuzziness of a set that arises from the inherent vagueness or imprecision carried by the fuzzy set.
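The two entropies above can be sketched numerically. The short Python sketch below (function names are my own) computes Shannon entropy for a probability distribution, and, as one common non-probabilistic fuzzy entropy, the classical De Luca and Termini form, which applies the Shannon expression to each membership degree and its complement:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i * log2(p_i) of a probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy: the Shannon form applied to each
    membership degree mu and its complement 1 - mu, normalized by n."""
    n = len(memberships)
    h = 0.0
    for mu in memberships:
        for v in (mu, 1.0 - mu):
            if v > 0:
                h -= v * math.log2(v)
    return h / n

# A uniform distribution over four outcomes carries 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# Memberships at 0.5 are maximally fuzzy; crisp memberships give 0.
print(fuzzy_entropy([0.5, 0.5]))                  # 1.0
print(fuzzy_entropy([1.0, 0.0]))                  # 0.0
```

Note that the fuzzy entropy needs no probabilities: it depends only on how far each membership degree sits from the crisp values 0 and 1.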

In contrast to fuzzy set theory, there is another theory, known as rough set theory, proposed by Pawlak (1982), which is also exploited to deal with imprecise and unclear information, but its origin and emphasis are distinct from those of fuzzy set theory. From this perspective, the two theories may complement each other. Combining fuzzy sets with rough sets, Nanda and Majumdar (1992) introduced the fuzzy rough set. A fuzzy rough set unifies the related but distinct notions of the vagueness of a fuzzy set and the indiscernibility of a rough set, both of which are met in real-life problems. A fuzzy rough set has an advantage over a rough set in that it diminishes the information loss in real-valued data sets caused by discretization in the rough set model.
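As a rough intuition for how a fuzzy rough set approximates a fuzzy set from both sides, the widely used min-max formulation in the style of Dubois and Prade can be sketched as follows. This is an illustrative helper only, not the Nanda-Majumdar construction the paper builds on:

```python
def fuzzy_rough_approximations(A, R, universe):
    """Lower and upper approximations of a fuzzy set A under a fuzzy
    similarity relation R (min-max sketch in the style of Dubois and Prade).
    A: dict mapping element -> membership degree in [0, 1].
    R: dict mapping (x, y) -> similarity degree in [0, 1]."""
    lower = {x: min(max(1.0 - R[(x, y)], A[y]) for y in universe) for x in universe}
    upper = {x: max(min(R[(x, y)], A[y]) for y in universe) for x in universe}
    return lower, upper

# Two elements that are 0.6-similar; A concentrates on 'a'.
U = ["a", "b"]
R = {("a", "a"): 1.0, ("b", "b"): 1.0, ("a", "b"): 0.6, ("b", "a"): 0.6}
A = {"a": 0.8, "b": 0.3}
lo, up = fuzzy_rough_approximations(A, R, U)
print(lo)  # lower approximation, e.g. {'a': 0.4, 'b': 0.3}
print(up)  # upper approximation, e.g. {'a': 0.8, 'b': 0.6}
```

The lower approximation never exceeds the set's own membership and the upper never falls below it, which is exactly the "rough" sandwich that the discretization-free, real-valued setting preserves.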

The entropy of a system, introduced by Shannon (1948), defines a measure of uncertainty about its actual structure. It has been a helpful tool for characterizing the information contained in many forms of data, with applications in various distinct fields. Authors such as Gupta and Sheoran (2014) and Sharma et al. (2017) have used Shannon's concept and its variants to calculate uncertainty in rough set and fuzzy set theory. Others, such as Chengyi et al. (2004), Qi and Chengyi (2008), Gupta et al. (2016), and Sharma et al. (2017), used similarity measures to deal with uncertainty and vagueness. Liu (1992) systematically gave axiomatic definitions of entropy, distance measure, and similarity measure for fuzzy sets and discussed some basic relations between these measures. Li and Ren (2015) discussed a multi-attribute decision-making method that considers the amount and reliability of intuitionistic fuzzy information. Zhu and Li (2016) described a new definition and formula of entropy for intuitionistic fuzzy sets. Qamar and Hassan (2018, 2019) discussed an approach to the Q-neutrosophic soft set and some information measures based on that approach, with their applications. Yu et al. (2018, 2019) worked on heterogeneous multi-attribute group decision making with preference deviation and on a compromise-typed variable-weight decision method for hybrid multi-attribute decision making, respectively. Wei et al. (2019) discussed a novel generalized exponential entropy for intuitionistic fuzzy sets and interval-valued intuitionistic fuzzy sets.

Information measures quantify, as a single number, the uncertainty and vagueness in information and data. It is widely accepted that hybridization is used to improve on an existing approach and to overcome its difficulties or limitations. In the literature, various information measures exist for different theories, such as the probabilistic, fuzzy, soft set, and rough set approaches. This paper introduces some information measures for a fuzzy rough set and its elements, together with their axiomatic definitions. The relationship between similarity measures and entropy measures is used to derive the proposed entropy measures, and from the proposed entropy measures some further information measures are also derived. The rest of the paper is organized as follows:
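The derivation strategy, inducing an entropy from a similarity measure, can be illustrated with ordinary fuzzy sets. One well-known construction of this kind, related to the relations Liu (1992) studied, defines the entropy of a fuzzy set as its similarity to its own complement. This is a sketch under assumed definitions, not the paper's fuzzy rough measure:

```python
def similarity(A, B):
    """Linear similarity S(A, B) = 1 - (1/n) * sum |mu_A(x) - mu_B(x)|
    for fuzzy sets A, B on the same finite universe (dicts of memberships)."""
    n = len(A)
    return 1.0 - sum(abs(A[x] - B[x]) for x in A) / n

def entropy_from_similarity(A):
    """Entropy induced by similarity to the complement: E(A) = S(A, A^c).
    Crisp sets get entropy 0; memberships at 0.5 get entropy 1."""
    complement = {x: 1.0 - mu for x, mu in A.items()}
    return similarity(A, complement)

print(entropy_from_similarity({"a": 0.5, "b": 0.5}))  # 1.0 (maximally fuzzy)
print(entropy_from_similarity({"a": 1.0, "b": 0.0}))  # 0.0 (crisp)
```

The appeal of this route is that the axioms of the similarity measure transfer directly to the induced entropy, which is the same economy the paper exploits in the fuzzy rough setting.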
