<ul>
  <li>Foreword</li>
  <li>Preface
    <ul>
      <li>Acknowledgements</li>
    </ul>
  </li>
  <li>Chapter 1: Introduction
    <ul>
      <li>1.1 THE NAME OF THE GAME</li>
      <li>1.2 OVERVIEW OF MACHINE LEARNING METHODS</li>
      <li>1.3 HISTORY OF MACHINE LEARNING</li>
      <li>1.4 SOME EARLY SUCCESSES</li>
      <li>1.5 APPLICATIONS OF MACHINE LEARNING</li>
      <li>1.6 DATA MINING TOOLS AND STANDARDS</li>
      <li>1.7 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 2: Learning and Intelligence
    <ul>
      <li>2.1 WHAT IS LEARNING</li>
      <li>2.2 NATURAL LEARNING</li>
      <li>2.3 LEARNING, INTELLIGENCE, CONSCIOUSNESS</li>
      <li>2.4 WHY MACHINE LEARNING</li>
      <li>2.5 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 3: Machine Learning Basics
    <ul>
      <li>3.1 BASIC PRINCIPLES</li>
      <li>3.2 MEASURES FOR PERFORMANCE EVALUATION</li>
      <li>3.3 ESTIMATING PERFORMANCE</li>
      <li>3.4 *COMPARING PERFORMANCE OF MACHINE LEARNING ALGORITHMS</li>
      <li>3.5 COMBINING SEVERAL MACHINE LEARNING ALGORITHMS</li>
      <li>3.6 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 4: Knowledge Representation
    <ul>
      <li>4.1 PROPOSITIONAL CALCULUS</li>
      <li>4.2 *FIRST ORDER PREDICATE CALCULUS</li>
      <li>4.3 DISCRIMINANT AND REGRESSION FUNCTIONS</li>
      <li>4.4 PROBABILITY DISTRIBUTIONS</li>
      <li>4.5 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 5: Learning as Search
    <ul>
      <li>5.1 EXHAUSTIVE SEARCH</li>
      <li>5.2 BOUNDED EXHAUSTIVE SEARCH (BRANCH AND BOUND)</li>
      <li>5.3 BEST-FIRST SEARCH</li>
      <li>5.4 GREEDY SEARCH</li>
      <li>5.5 BEAM SEARCH</li>
      <li>5.6 LOCAL OPTIMIZATION</li>
      <li>5.7 GRADIENT SEARCH</li>
      <li>5.8 SIMULATED ANNEALING</li>
      <li>5.9 GENETIC ALGORITHMS</li>
      <li>5.10 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 6: Measures for Evaluating the Quality of Attributes
    <ul>
      <li>6.1 MEASURES FOR CLASSIFICATION AND RELATIONAL PROBLEMS</li>
      <li>6.2 MEASURES FOR REGRESSION</li>
      <li>6.3 **FORMAL DERIVATIONS AND PROOFS</li>
      <li>6.4 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 7: Data Preprocessing
    <ul>
      <li>7.1 REPRESENTATION OF COMPLEX STRUCTURES</li>
      <li>7.2 DISCRETIZATION OF CONTINUOUS ATTRIBUTES</li>
      <li>7.3 ATTRIBUTE BINARIZATION</li>
      <li>7.4 TRANSFORMING DISCRETE ATTRIBUTES INTO CONTINUOUS</li>
      <li>7.5 DEALING WITH MISSING VALUES</li>
      <li>7.6 VISUALIZATION</li>
      <li>7.7 DIMENSIONALITY REDUCTION</li>
      <li>7.8 **FORMAL DERIVATIONS AND PROOFS</li>
      <li>7.9 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 8: *Constructive Induction
    <ul>
      <li>8.1 DEPENDENCE OF ATTRIBUTES</li>
      <li>8.2 CONSTRUCTIVE INDUCTION WITH PRE-DEFINED OPERATORS</li>
      <li>8.3 CONSTRUCTIVE INDUCTION WITHOUT PRE-DEFINED OPERATORS</li>
      <li>8.4 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 9: Symbolic Learning
    <ul>
      <li>9.1 LEARNING OF DECISION TREES</li>
      <li>9.2 LEARNING OF DECISION RULES</li>
      <li>9.3 LEARNING OF ASSOCIATION RULES</li>
      <li>9.4 LEARNING OF REGRESSION TREES</li>
      <li>9.5 *INDUCTIVE LOGIC PROGRAMMING</li>
      <li>9.6 NAIVE AND SEMI-NAIVE BAYESIAN CLASSIFIER</li>
      <li>9.7 BAYESIAN BELIEF NETWORKS</li>
      <li>9.8 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 10: Statistical Learning
    <ul>
      <li>10.1 NEAREST NEIGHBORS</li>
      <li>10.2 DISCRIMINANT ANALYSIS</li>
      <li>10.3 LINEAR REGRESSION</li>
      <li>10.4 LOGISTIC REGRESSION</li>
      <li>10.5 *SUPPORT VECTOR MACHINES</li>
      <li>10.6 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 11: Artificial Neural Networks
    <ul>
      <li>11.1 INTRODUCTION</li>
      <li>11.2 TYPES OF ARTIFICIAL NEURAL NETWORKS</li>
      <li>11.3 *HOPFIELD’S NEURAL NETWORK</li>
      <li>11.4 *BAYESIAN NEURAL NETWORK</li>
      <li>11.5 PERCEPTRON</li>
      <li>11.6 RADIAL BASIS FUNCTION NETWORKS</li>
      <li>11.7 **FORMAL DERIVATIONS AND PROOFS</li>
      <li>11.8 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 12: Cluster Analysis
    <ul>
      <li>12.1 INTRODUCTION</li>
      <li>12.2 MEASURES OF DISSIMILARITY</li>
      <li>12.3 HIERARCHICAL CLUSTERING</li>
      <li>12.4 PARTITIONAL CLUSTERING</li>
      <li>12.5 MODEL-BASED CLUSTERING</li>
      <li>12.6 OTHER CLUSTERING METHODS</li>
      <li>12.7 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 13: **Learning Theory
    <ul>
      <li>13.1 COMPUTABILITY THEORY AND RECURSIVE FUNCTIONS</li>
      <li>13.2 FORMAL LEARNING THEORY</li>
      <li>13.3 PROPERTIES OF LEARNING FUNCTIONS</li>
      <li>13.4 PROPERTIES OF INPUT DATA</li>
      <li>13.5 CONVERGENCE CRITERIA</li>
      <li>13.6 IMPLICATIONS FOR MACHINE LEARNING</li>
      <li>13.7 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Chapter 14: **Computational Learning Theory
    <ul>
      <li>14.1 INTRODUCTION</li>
      <li>14.2 GENERAL FRAMEWORK FOR CONCEPT LEARNING</li>
      <li>14.3 PAC LEARNING MODEL</li>
      <li>14.4 VAPNIK-CHERVONENKIS DIMENSION</li>
      <li>14.5 LEARNING IN THE PRESENCE OF NOISE</li>
      <li>14.6 EXACT AND MISTAKE BOUNDED LEARNING MODELS</li>
      <li>14.7 INHERENT UNPREDICTABILITY AND PAC-REDUCTIONS</li>
      <li>14.8 WEAK AND STRONG LEARNING</li>
      <li>14.9 SUMMARY AND FURTHER READING</li>
    </ul>
  </li>
  <li>Appendix A: *Definitions of some lesser known terms
    <ul>
      <li>A.1 COMPUTATIONAL COMPLEXITY CLASSES</li>
      <li>A.2 ASYMPTOTIC NOTATION</li>
      <li>A.3 SOME BOUNDS FOR PROBABILISTIC ANALYSIS</li>
      <li>A.4 COVARIANCE MATRIX</li>
    </ul>
  </li>
  <li>References</li>
  <li>Index</li>
</ul>