by Zadeh (29), and belongs to a class of intersection operators (min, product, bold intersection) known as triangular norms or t-norms. A t-norm is a mapping t : [0, 1]² → [0, 1]. The s-norm (t-conorm, also called a triangular co-norm), a mapping s : [0, 1]² → [0, 1], is commonly used for the union of fuzzy sets. The properties of triangular norms are presented in (82).

Fuzzy sets exploit imprecision in conventional systems in an attempt to make system complexity manageable. It has been observed that fuzzy set theory offers a new model of vagueness (13). Many examples of fuzzy systems are given in Pedrycz (23), and in Kruse, Gebhardt, and Klawonn (24).
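For concreteness, the following minimal Python sketch (the function names are invented for illustration and do not come from the cited references) evaluates the three t-norms named above, together with their dual s-norms, on a pair of membership grades.

# Minimal sketch of common t-norms (fuzzy intersection) and their dual
# s-norms (fuzzy union); the function names are illustrative only.

def t_min(a, b):            # Zadeh's original intersection
    return min(a, b)

def t_product(a, b):        # algebraic product
    return a * b

def t_bold(a, b):           # bold intersection (Lukasiewicz t-norm)
    return max(0.0, a + b - 1.0)

def s_max(a, b):            # dual of min: Zadeh's union
    return max(a, b)

def s_probabilistic(a, b):  # dual of the product
    return a + b - a * b

def s_bounded(a, b):        # dual of the bold intersection
    return min(1.0, a + b)

if __name__ == "__main__":
    a, b = 0.7, 0.4         # membership grades of one element in fuzzy sets A and B
    for name, t, s in [("min/max", t_min, s_max),
                       ("product/probabilistic", t_product, s_probabilistic),
                       ("bold/bounded", t_bold, s_bounded)]:
        print(f"{name}: intersection={t(a, b):.2f}, union={s(a, b):.2f}")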
NEURAL COMPUTING
Neural networks offer a powerful and distributed computing architecture equipped with significant learning abilities (predominantly as far as parametric learning is concerned). They help represent highly nonlinear and multivariable relationships between system variables. Starting from the pioneering research of McCulloch and Pitts (25), and others (26, 27), neural networks have undergone a significant metamorphosis and have become an important reservoir of various learning methods (28) as well as an extension of conventional techniques in statistical pattern recognition (29). Artificial Neural Networks (ANNs) were introduced to model features of the human nervous system (25). An artificial neural network is a collection of highly interconnected processing elements called neurons. In ANNs, a neuron is a threshold device, which aggregates (sums) its weighted inputs and applies an activation function to each aggregation to produce a response. The summing part of a neuron in an ANN is called an Adaptive Linear Combiner (ALC) in (30, 31). A McCulloch-Pitts neuron n_i is a binary threshold unit with an ALC that computes a weighted sum net = Σ_{j=0}^{n} w_j x_j. A weight w_j associated with x_j represents the strength of connection of the input to a neuron. Input x_0 represents a bias, which can be thought of as an input with weight 1. The response of a neuron can be computed in a number of ways. For example, the response of neuron n_i can be computed using sgn(net), where sgn(net) = 1 for net > 0, sgn(net) = 0 for net = 0, and sgn(net) = -1 for net < 0. A neuron comes with adaptive capabilities that could be fully exploited assuming that an effective procedure is introduced to modify the strengths of connections so that a correct response is obtained for a given input. A good discussion of learning algorithms for various forms of neural networks can be found in Freeman and Skapura (32) and Bishop (29). Various forms of neural networks have been successfully used in system modeling, pattern recognition, robotics, and process control applications (46,50,51,54,75,76).
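The short sketch below, written in Python purely for illustration (the class and variable names are not taken from the cited literature), shows a McCulloch-Pitts-style neuron whose ALC computes the weighted sum net and whose response is produced by the sgn activation described above.

# Illustrative McCulloch-Pitts-style neuron: an adaptive linear combiner (ALC)
# followed by a sign activation, as described in the text.

class MPNeuron:
    def __init__(self, weights, bias=0.0):
        self.weights = list(weights)   # w_1 .. w_n for the inputs x_1 .. x_n
        self.bias = bias               # bias input x_0, taken with weight w_0 = 1

    def net(self, inputs):
        # ALC: net = w_0 x_0 + w_1 x_1 + ... + w_n x_n, with w_0 = 1 and x_0 = bias
        return self.bias + sum(w * x for w, x in zip(self.weights, inputs))

    def response(self, inputs):
        # sgn activation: +1 if net > 0, 0 if net == 0, -1 if net < 0
        n = self.net(inputs)
        return 1 if n > 0 else (0 if n == 0 else -1)

if __name__ == "__main__":
    # Hypothetical two-input neuron acting like a logical AND on {0, 1} inputs
    neuron = MPNeuron(weights=[1.0, 1.0], bias=-1.5)
    for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(pattern, "->", neuron.response(pattern))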
ROUGH SETS

Rough sets introduced by Pawlak in 1981 (77, 78) and elaborated in (13,33,34,67,68,74,79-81) offer another approach to CI by drawing attention to the importance of set approximation in knowledge discovery and information granulation. Rough set theory also offers a model for approximation of vague concepts (69, 83).

In particular, rough set methods provide a means of approximating a set by other sets (33, 34). For computational reasons, a syntactic representation of knowledge is provided by rough sets in the form of data tables. In general, an information system IS is represented by a pair (U, F), where U is a non-empty set of objects and F is a non-empty, countable set of probe functions that are a source of measurements associated with object features. For example, a feature of an image may be color, with probe functions that measure tristimulus values received from three primary color sensors, brightness (luminous flux), hue (dominant wavelength in a mixture of light waves), and saturation (amount of white light mixed with a hue). Each f ∈ F maps an object to some value. In effect, we have f : U → V_f for every f ∈ F.

The notions of equivalence and equivalence class are fundamental in rough set theory. A binary relation R ⊆ X × X is an equivalence relation if it is reflexive, symmetric, and transitive. A relation R is reflexive if every object x ∈ X has relation R to itself; that is, we can assert xRx. The symmetric property holds for relation R if xRy implies yRx for every x, y ∈ X. The relation R is transitive if, for every x, y, z ∈ X, xRy and yRz imply xRz. The equivalence class of an object x ∈ X consists of all objects y ∈ X such that xRy. For each B ⊆ A, there is an associated equivalence relation Ind_A(B) = {(x, x′) | ∀f ∈ B, f(x) = f(x′)} (indiscernibility relation). If (x, x′) ∈ Ind_A(B), we say that objects x and x′ are indiscernible from each other relative to attributes from B. This is a fundamental concept in rough sets. The notation [x]_B is a commonly used shorthand that denotes the equivalence class defined by x relative to a feature set B. In effect, [x]_B = {y ∈ U | x Ind_A(B) y}. Further, the partition U/Ind_A(B) denotes the family of all equivalence classes of the relation Ind_A(B) on U. Equivalence classes of the indiscernibility relation (called B-granules generated by the set of features B (13)) represent granules of an elementary portion of knowledge we are able to perceive relative to available data. Such a view of knowledge has led to the study of concept approximation (40) and pattern extraction (41). For X ⊆ U, the set X can be approximated only from information contained in B by constructing a B-lower and a B-upper approximation, denoted by B_*X = {x ∈ U | [x]_B ⊆ X} and B*X = {x ∈ U | [x]_B ∩ X ≠ ∅}, respectively. In other words, a lower approximation B_*X of a set X is a collection of objects that can be classified with full certainty as members of X using the knowledge represented by features in B. By contrast, an upper approximation B*X of a set X is a collection of objects representing both certain and possibly uncertain knowledge. In the case where B_*X is a proper subset of B*X, the objects in X cannot be classified with certainty, and the set X is rough. It has recently been observed by Pawlak (13) that this is exactly the idea of vagueness proposed by Frege (41). That is, the vagueness of a set stems from its borderline region.
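To make the lower- and upper-approximation construction concrete, here is a small self-contained Python sketch (the data table, feature names, and helper functions are invented for illustration and are not taken from the cited references): it partitions a toy universe into B-granules via the indiscernibility relation and then builds B_*X, B*X, and the boundary region.

# Illustrative rough-set approximation over a toy information system.
# The objects, features, and values below are made up for the example.

table = {                       # U = {0,...,5}; features: color, size
    0: {"color": "red",  "size": "small"},
    1: {"color": "red",  "size": "small"},
    2: {"color": "blue", "size": "small"},
    3: {"color": "blue", "size": "large"},
    4: {"color": "red",  "size": "large"},
    5: {"color": "blue", "size": "large"},
}

def granules(U, B):
    """Partition U into equivalence classes of Ind(B) (the B-granules)."""
    classes = {}
    for x in U:
        key = tuple(table[x][f] for f in B)    # objects with equal feature values
        classes.setdefault(key, set()).add(x)  # fall into the same class
    return list(classes.values())

def approximations(U, B, X):
    """Return the (lower, upper) B-approximations of X."""
    lower, upper = set(), set()
    for g in granules(U, B):
        if g <= X:      # [x]_B contained in X: certainly in X
            lower |= g
        if g & X:       # [x]_B meets X: possibly in X
            upper |= g
    return lower, upper

if __name__ == "__main__":
    U = set(table)
    X = {0, 1, 2, 4}                # concept to approximate
    B = ["color"]                   # available features
    low, up = approximations(U, B, X)
    print("lower:", low, "upper:", up, "boundary:", up - low)
    # X is rough with respect to B exactly when the boundary region is non-empty.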
The size of the difference between lower and upper approximations of a set (i.e., the boundary region) provides a basis for the roughness of an approximation. This is important because vagueness is allocated to some regions of what is known as the universe of discourse (space) rather than to the whole space, as encountered in fuzzy sets. The study of what it means to be "a part of" provides a basis for what is known as mereology, introduced by Lesniewski in 1927 (36). More recently, the study of what it means to be "a part of to a degree" has led to a calculus of granules (8,37-39,71,73). In effect, granular computing allows us to quantify uncertainty and take advantage of uncertainty rather than blindly discarding it.

Approximation spaces introduced by Pawlak (77), elaborated by (33,34,66,69,70-73), and applied in (68,40,46,59,64), serve as a formal counterpart of our perception ability or observation (69), and provide a framework for approximate reasoning about vague concepts. In its simplest form, an approximation space is any pair (U, R), where U is a non-empty set of objects (called a universe of discourse) and R is an equivalence relation on U (called an indiscernibility relation). Equivalence classes of an indiscernibility relation are called elementary sets (or information granules) determined by R. Given an approximation space S = (U, R), a subset X of U is definable if it can be represented as the union of some of the elementary sets determined by R. It was originally observed that not all subsets of U are definable in S (69). Given a non-definable subset X of U, our observation restricted by R causes X to be perceived as a vague object. An upper approximation B*X is the least definable subset of U containing X, and the lower approximation B_*X is the greatest definable subset of U contained in X.
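The notions of definability and roughness can be phrased directly in terms of the elementary sets of an approximation space. The sketch below (Python; the partition and the sets X are invented for illustration) checks whether a subset X is definable in S = (U, R) and computes one commonly used indicator, Pawlak's accuracy of approximation |B_*X| / |B*X|, which is not spelled out in the text above but follows the boundary-region idea it describes.

# Illustrative approximation space S = (U, R): R is represented here by its
# partition of U into elementary sets (information granules).

elementary_sets = [{0, 1}, {2, 3}, {4}, {5, 6, 7}]   # made-up granules

def lower_upper(X):
    lower, upper = set(), set()
    for g in elementary_sets:
        if g <= X:
            lower |= g
        if g & X:
            upper |= g
    return lower, upper

def is_definable(X):
    # X is definable in S iff it is a union of elementary sets,
    # i.e., its lower and upper approximations both coincide with X.
    lower, upper = lower_upper(X)
    return lower == upper == X

def accuracy(X):
    # Pawlak's accuracy of approximation; roughness is 1 - accuracy.
    lower, upper = lower_upper(X)
    return len(lower) / len(upper) if upper else 1.0

if __name__ == "__main__":
    for X in ({0, 1, 4}, {0, 1, 2}):
        low, up = lower_upper(X)
        print(X, "definable:", is_definable(X),
              "lower:", low, "upper:", up, "accuracy:", round(accuracy(X), 2))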
Fuzzy set theory and rough set theory taken singly and in combination pave the way for a variety of approximate reasoning systems and applications representing a synergy of technologies from computational intelligence. This synergy can be found, for example, in recent work on the relation between fuzzy sets and rough sets (13,35,46,60,65), rough mereology (37-39,65,66), rough control (42, 43), fuzzy-rough-evolutionary control (44), machine learning (34,45,59), fuzzy neurocomputing (3), rough neurocomputing (46), diagnostic systems (34, 47), multi-agent systems (8,9,48), real-time decision-making (12, 49), robotics and unmanned vehicles (50-53), signal analysis (55), and software engineering (4,55-58).

BIBLIOGRAPHY
1. J. C. Bezdek, On the relationship between neural networks, pattern recognition and intelligence, Int. J. Approximate Reasoning, 6, 1992, 85-107.
2. J. C. Bezdek, What is Computational Intelligence? In: J. Zurada, R. Marks, C. Robinson (Eds.), Computational Intelligence: Imitating Life. Piscataway: IEEE Press, 1994, 1-12.
3. W. Pedrycz, Computational Intelligence: An Introduction. Boca Raton, FL: CRC Press, 1998.
4. W. Pedrycz, J. F. Peters (Eds.), Computational Intelligence in Software Engineering, Advances in Fuzzy Systems Applications and Theory, vol. 16. Singapore: World Scientific, 1998.
5. D. Poole, A. Mackworth, R. Goebel, Computational Intelligence: A Logical Approach. Oxford: Oxford University Press, 1998.
6. N. Cercone, A. Skowron, N. Zhong (Eds.), Rough Sets, Fuzzy Sets, Data Mining, and Granular-Soft Computing Special Issue. Computational Intelligence: An International Journal, vol. 17, no. 3, 2001, 399-603.
7. A. Skowron, S. K. Pal (Eds.), Rough Sets, Pattern Recognition and Data Mining Special Issue. Pattern Recognition Letters, vol. 24, no. 6, 2003, 829-933.
8. A. Skowron, Toward intelligent systems: Calculi of information granules. In: T. Terano, T. Nishida, A. Namatane, S. Tsumoto, Y. Ohsawa, T. Washio (Eds.), New Frontiers in Artificial Intelligence, Lecture Notes in Artificial Intelligence 2253. Berlin: Springer-Verlag, 2001, 28-39.
9. J. F. Peters, A. Skowron, J. Stepaniuk, S. Ramanna, Towards an ontology of approximate reason, Fundamenta Informaticae, vol. 51, nos. 1-2, 2002, 157-173.
10. R. Marks, Intelligence: Computational versus Artificial, IEEE Trans. on Neural Networks, 4, 1993, 737-739.
11. D. Fogel, Review of Computational Intelligence: Imitating Life, IEEE Trans. on Neural Networks, 6, 1995, 1562-1565.
12. J. F. Peters, Time and Clock Information Systems: Concepts and Roughly Fuzzy Petri Net Models. In: J. Kacprzyk (Ed.), Knowledge Discovery and Rough Sets. Berlin: Physica-Verlag, a division of Springer-Verlag, 1998.
13. Z. Pawlak, A. Skowron, Rudiments of rough sets, Information Sciences, 177, 2006, 3-27. See also J. F. Peters, A. Skowron, Zdzislaw Pawlak life and work (1926-2006), Information Sciences, 177, 1-2; Z. Pawlak, A. Skowron, Rough sets: Some extensions, Information Sciences, 177, 28-40; and Z. Pawlak, A. Skowron, Rough sets and Boolean reasoning, Information Sciences, 177, 41-73.
14. J. H. Holland, Adaptive plans optimal for payoff-only environments, Proc. of the Second Hawaii Int. Conf. on System Sciences, 1969, 917-920.
15. J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection. Cambridge, MA: MIT Press, 1993.
16. C. Darwin, On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. London: John Murray, 1859.
17. L. Chambers, Practical Handbook of Genetic Algorithms, vol. I. Boca Raton, FL: CRC Press, 1995.
18. L. J. Fogel, A. J. Owens, M. J. Walsh, Artificial Intelligence through Simulated Evolution. Chichester: J. Wiley, 1966.
19. L. J. Fogel, On the organization of the intellect. Ph.D. diss., UCLA, 1964.
20. R. R. Yager and D. P. Filev, Essentials of Fuzzy Modeling and Control. NY: John Wiley & Sons, Inc., 1994.
21. L. A. Zadeh, Fuzzy sets, Information and Control, 8, 1965, 338-353.
22. L. A. Zadeh, Outline of a new approach to the analysis of complex systems and decision processes, IEEE Trans. on Systems, Man, and Cybernetics, 2, 1973, 28-44.
23. W. Pedrycz, Fuzzy Control and Fuzzy Systems. NY: John Wiley & Sons, Inc., 1993.
24. R. Kruse, J. Gebhardt, F. Klawonn, Foundations of Fuzzy Systems. NY: John Wiley & Sons, Inc., 1994.
25. W. S. McCulloch, W. Pitts, A logical calculus of ideas immanent in nervous activity, Bulletin of Mathematical Biophysics, 5, 1943, 115-133.
26. F. Rosenblatt, Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington: Spartan Press, 1961.
27. M. Minsky, S. Papert, Perceptrons: An Introduction to Computational Geometry. Cambridge: MIT Press, 1969.
28. E. Fiesler, R. Beale (Eds.), Handbook on Neural Computation. UK: Institute of Physics Publishing and Oxford University Press, 1997.
29. C. M. Bishop, Neural Networks for Pattern Recognition. Oxford: Oxford University Press, 1995.
30. B. Widrow, M. E. Hoff, Adaptive switching circuits, Proc. IRE WESCON Convention Record, Part 4, 1960, 96-104.
31. B. Widrow, Generalization and information storage in networks of adaline neurons. In: M. C. Yovits, G. T. Jacobi, G. D. Goldstein (Eds.), Self-Organizing Systems. Washington: Spartan, 1962.
32. J. A. Freeman and D. M. Skapura, Neural Networks: Algorithms, Applications and Programming Techniques. Reading, MA: Addison-Wesley, 1991.
33. Z. Pawlak, Rough sets, Int. J. of Information and Computer Sciences, vol. 11, no. 5, 1982, 341-356.
34. Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data. Dordrecht: Kluwer Academic Publishers, 1991.
35. W. Pedrycz, Shadowed sets: Representing and processing fuzzy sets, IEEE Trans. on Systems, Man, and Cybernetics, Part B: Cybernetics, 28/1, Feb. 1998, 103-108.
36. S. Lesniewski, O podstawach matematyki (in Polish), Przeglad Filozoficzny, vol. 30, 164-206, vol. 31, 261-291, vol. 32, 60-101, and vol. 33, 142-170, 1927.
37. L. Polkowski and A. Skowron, Implementing fuzzy containment via rough inclusions: Rough mereological approach to distributed problem solving, Proc. Fifth IEEE Int. Conf. on Fuzzy Systems, vol. 2, New Orleans, Sept. 8-11, 1996, 1147-1153.
38. L. Polkowski, A. Skowron, Rough mereology: A new paradigm for approximate reasoning, International Journal of Approximate Reasoning, vol. 15, no. 4, 1996, 333-365.
39. L. Polkowski, A. Skowron, Rough mereological calculi of granules: A rough set approach to computation, Computational Intelligence: An International Journal, vol. 17, no. 3, 2001, 472-492.
40. J. Bazan, H. S. Nguyen, A. Skowron, M. Szczuka, A view on rough set concept approximation. In: G. Wang, Q. Liu, Y. Y. Yao, A. Skowron (Eds.), Proceedings of the Ninth International Conference on Rough Sets, Fuzzy Sets, Data Mining and Granular Computing (RSFDGrC 2003), Chongqing, China, 2003, LNAI 2639, 181-188.
41. J. Bazan, H. S. Nguyen, J. F. Peters, A. Skowron, M. Szczuka, Rough set approach to pattern extraction from classifiers. Proceedings of the Workshop on Rough Sets in Knowledge Discovery and Soft Computing at ETAPS 2003, April 12-13, 2003, Warsaw University; electronic version in Electronic Notes in Computer Science, Elsevier, 20-29.
G. Frege, Grundlagen der Arithmetik, 2, Verlag von Herman Pohle, Jena, 1893.
42. T. Munakata, Z. Pawlak, Rough control: Application of rough set theory to control, Proc. Eur. Congr. Fuzzy Intell. Technol. EUFIT'96, 1996, 209-218.
43. J. F. Peters, A. Skowron, Z. Suraj, An application of rough set methods to automatic concurrent control design, Fundamenta Informaticae, 43(1-4), 2000, 269-290.
44. T. Y. Lin, Fuzzy controllers: An integrated approach based on fuzzy logic, rough sets, and evolutionary computing. In: T. Y. Lin, N. Cercone (Eds.), Rough Sets and Data Mining: Analysis for Imprecise Data. Norwell, MA: Kluwer Academic Publishers, 1997, 109-122.
45. J. Grzymala-Busse, S. Y. Sedelow, W. A. Sedelow, Machine learning & knowledge acquisition, rough sets, and the English semantic code. In: T. Y. Lin, N. Cercone (Eds.), Rough Sets and Data Mining: Analysis for Imprecise Data. Norwell, MA: Kluwer Academic Publishers, 1997, 91-108.
46. S. K. Pal, L. Polkowski, A. Skowron (Eds.), Rough-Neuro Computing: Techniques for Computing with Words. Berlin: Springer-Verlag, 2003.
47. R. Hashemi, B. Pearce, R. Arani, W. Hinson, M. Paule, A fusion of rough sets, modified rough sets, and genetic algorithms for hybrid diagnostic systems. In: T. Y. Lin, N. Cercone (Eds.), Rough Sets and Data Mining: Analysis for Imprecise Data. Norwell, MA: Kluwer Academic Publishers, 1997, 149-176.
48. R. Ras, Resolving queries through cooperation in multi-agent systems. In: T. Y. Lin, N. Cercone (Eds.), Rough Sets and Data Mining: Analysis for Imprecise Data. Norwell, MA: Kluwer Academic Publishers, 1997, 239-258.
49. A. Skowron, Z. Suraj, A parallel algorithm for real-time decision making: A rough set approach. J. of Intelligent Systems, vol. 7, 1996, 5-28.
50. J. F. Peters, T. C. Ahn, M. Borkowski, V. Degtyaryov, S. Ramanna, Line-crawling robot navigation: A rough neurocomputing approach. In: C. Zhou, D. Maravall, D. Ruan (Eds.), Autonomous Robotic Systems. Berlin: Physica-Verlag, 2003, 141-164.
51. J. F. Peters, T. C. Ahn, M. Borkowski, Object-classification by a line-crawling robot: A rough neurocomputing approach. In: J. J. Alpigini, J. F. Peters, A. Skowron, N. Zhong (Eds.), Rough Sets and Current Trends in Computing, LNAI 2475. Berlin: Springer-Verlag, 2002, 595-601.
52. M. S. Szczuka, N. H. Son, Analysis of image sequences for unmanned aerial vehicles. In: M. Inuiguchi, S. Hirano, S. Tsumoto (Eds.), Rough Set Theory and Granular Computing. Berlin: Springer-Verlag, 2003, 291-300.
53. H. S. Son, A. Skowron, M. Szczuka, Situation identification by unmanned aerial vehicle. In: Proc. of CS&P 2000, Informatik Berichte, Humboldt-Universität zu Berlin, 2000, 177-188.
54. J. F. Peters, L. Han, S. Ramanna, Rough neural computing in signal analysis, Computational Intelligence, vol. 17, no. 3, 2001, 493-513.
55. J. F. Peters, S. Ramanna, Towards a software change classification system: A rough set approach, Software Quality Journal, vol. 11, no. 2, 2003, 87-120.
56. M. Reformat, W. Pedrycz, N. J. Pizzi, Software quality analysis with the use of computational intelligence, Information and Software Technology, 45, 2003, 405-417.
57. J. F. Peters, S. Ramanna, A rough sets approach to assessing software quality: Concepts and rough Petri net models. In: S. K. Pal and A. Skowron (Eds.), Rough-Fuzzy Hybridization: New Trends in Decision Making. Berlin: Springer-Verlag, 1999, 349-380.
58. W. Pedrycz, L. Han, J. F. Peters, S. Ramanna, R. Zhai, Calibration of software quality: Fuzzy neural and rough neural approaches. Neurocomputing, vol. 36, 2001, 149-170.
59. J. F. Peters, C. Henry, Reinforcement learning with approximation spaces. Fundamenta Informaticae, 71(2-3), 2006, 323-349.
60. W. Pedrycz, Granular computing with shadowed sets. In: D. Slezak, G. Wang, M. Szczuka, I. Duntsch, Y. Yao (Eds.), Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing, LNAI 3641. Berlin: Springer, 2005, 23-31.
61. W. Pedrycz, Granular computing in knowledge integration and reuse. In: D. Zhang, T. M. Khoshgoftaar, M.-L. Shyu (Eds.), IEEE Int. Conf. on Information Reuse and Integration. Las Vegas, NV, USA, 15-17 Aug. 2005.