Machine Learning

Price: 745.00 INR

ISBN: 9780190127275

Publication date: 06/08/2021

Paperback | 496 pages | 246x189mm

First Edition

S. Sridhar and M. Vijayalakshmi

Description

This book on Machine Learning is designed as a textbook for undergraduate and postgraduate students of engineering. It provides comprehensive coverage of the fundamentals of machine learning.

Spread over 16 chapters, the book starts with an overview of machine learning and discusses the need for understanding data and the necessary mathematics. It goes on to explain the basics of learning theory, regression analysis, and decision tree and decision rule-based classification algorithms. The book provides an introduction to Bayesian learning and probabilistic graphical models. Important topics such as support vector machines, artificial neural networks, ensemble learning, clustering algorithms, reinforcement learning, and genetic algorithms are discussed in depth. It ends with the latest developments in deep learning.

Theoretical and mathematical exposition is balanced with numerous numerical examples, review questions, and Python programs. The book will also be useful for engineering and IT professionals who want to learn the basics of the subject.

Rights: World Rights

About the author

Dr. S. Sridhar is Professor at the Department of Information Science and Technology, College of Engineering, Guindy Campus, Anna University, Chennai.

Dr. M. Vijayalakshmi is Associate Professor at the Department of Information Science and Technology, College of Engineering, Guindy Campus, Anna University, Chennai.

Table of contents

Chapter 1 introduces the Basic Concepts of Machine Learning and its relationships with other domains. The chapter also surveys the types of machine learning and its applications.

Chapter 2 of this book is about Understanding Data, which is crucial for data-driven machine learning algorithms. The mathematics necessary for understanding data, such as linear algebra and univariate, bivariate, and multivariate statistics, is introduced in this chapter. The chapter also includes feature engineering and dimensionality reduction techniques.
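
As a flavour of dimensionality reduction, here is a minimal sketch (illustrative only, not taken from the book) using principal component analysis, assuming scikit-learn and its bundled Iris data:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X, _ = load_iris(return_X_y=True)        # 150 samples, 4 features
    pca = PCA(n_components=2)                # keep the two strongest directions
    X_reduced = pca.fit_transform(X)
    print(X_reduced.shape)                   # (150, 2)
    print(pca.explained_variance_ratio_)     # share of variance each component retains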

Chapter 3 covers the Basic Concepts of Learning. This chapter discusses theoretical aspects of learning such as concept learning, version spaces, hypotheses, and hypothesis spaces. It also introduces learning frameworks such as PAC learning, the mistake bound model, and the VC dimension.

Chapter 4 is about Similarity Learning. It discusses instance-based learning, nearest-neighbor learning, weighted k-nearest neighbor algorithms, the nearest centroid classifier, and locally weighted regression (LWR).
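
For illustration (not an excerpt from the book), a minimal weighted k-nearest neighbor classifier in Python; scikit-learn, the Iris data, and the choice of k are assumptions of this sketch:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # 'distance' weighting makes closer neighbors count more, as in weighted k-NN
    knn = KNeighborsClassifier(n_neighbors=3, weights='distance')
    knn.fit(X_train, y_train)
    print(knn.score(X_test, y_test))         # test-set accuracy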

Chapter 5 introduces the basics of Regression. The concepts of linear regression and non-linear regression are discussed in this chapter. It also covers logistic regression. Finally, the chapter outlines regularized methods such as Ridge, Lasso, and Elastic Net regression.
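
An illustrative sketch (not from the book) comparing ordinary least squares with Ridge, Lasso, and Elastic Net on synthetic data; scikit-learn and the penalty strengths are assumptions:

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

    # Synthetic data: y = 3x + noise (values invented for illustration)
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(100, 1))
    y = 3 * X.ravel() + rng.normal(scale=1.0, size=100)

    for model in (LinearRegression(),
                  Ridge(alpha=1.0),                    # L2 penalty
                  Lasso(alpha=0.1),                    # L1 penalty
                  ElasticNet(alpha=0.1, l1_ratio=0.5)):  # mix of L1 and L2
        model.fit(X, y)
        print(type(model).__name__, model.coef_)       # estimated slope under each penalty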

Chapter 6 covers Decision Tree Learning. The concepts of information theory, entropy, and information gain are discussed in this chapter. The basics of tree construction algorithms such as ID3, C4.5, CART, and regression trees are included, along with illustrations. Decision tree evaluation is also introduced here.
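
A minimal decision tree sketch (illustrative only), assuming scikit-learn and the Iris data; criterion='entropy' selects splits by information gain, in the spirit of ID3/C4.5:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(criterion='entropy', max_depth=3, random_state=0)
    tree.fit(X, y)
    print(export_text(tree))                 # textual view of the learned tree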

Chapter 7 discusses Rule-based Learning. This chapter illustrates rule generation. Sequential covering algorithms such as PRISM and FOIL are introduced here. This chapter also discusses analytical learning, explanation-based learning, and active learning mechanisms. An outline of association rule mining is also provided in this chapter.

Chapter 8 introduces the basics of the Bayesian model. The chapter covers the concepts of classification using the Bayesian principle. The naïve Bayesian classifier and classification with continuous features are introduced in this chapter. Variants of the Bayesian classifier are also discussed.
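
As a sketch of Bayesian classification with continuous features (illustrative only; scikit-learn's Gaussian naïve Bayes estimator and the Iris data are assumptions, not the book's own example):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # GaussianNB models each continuous feature with a per-class normal distribution
    nb = GaussianNB()
    nb.fit(X_train, y_train)
    print(nb.score(X_test, y_test))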

Chapter 9 discusses Probabilistic Graphical Models. Bayesian belief network construction and its inference mechanism are covered in this chapter. Markov chains and the Hidden Markov Model (HMM) are also introduced along with the associated algorithms.

Chapter 10 introduces the basics of Artificial Neural Networks (ANN). The chapter introduces the concepts of neural networks such as neurons, activation functions, and ANN types. The perceptron, back-propagation neural networks, the Radial Basis Function Neural Network (RBFNN), and the Self-Organizing Feature Map (SOFM) are covered here. The chapter ends with challenges and some applications of ANN.
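
A minimal perceptron sketch in plain NumPy (illustrative only; the OR-gate data, learning rate, and epoch count are arbitrary assumptions, not the book's example):

    import numpy as np

    # Learn the logical OR function with the classic perceptron update rule
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 1])

    w, b, lr = np.zeros(2), 0.0, 0.1
    for _ in range(20):                          # a few passes over the data
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0    # step activation
            w += lr * (target - pred) * xi       # perceptron weight update
            b += lr * (target - pred)

    print([(1 if xi @ w + b > 0 else 0) for xi in X])   # expected: [0, 1, 1, 1]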

Chapter 11 covers Support Vector Machines (SVM). This chapter begins with a gentle introduction to linear discriminant analysis and then covers the concepts of SVMs such as margins, kernels, and the associated optimization theory. Hard margin and soft margin SVMs are introduced here. Finally, this chapter ends with support vector regression.
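
An illustrative soft margin SVM sketch with an RBF kernel, assuming scikit-learn; the data set and the value of C are arbitrary choices:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # C controls the soft margin: smaller C tolerates more margin violations
    svm = SVC(kernel='rbf', C=1.0, gamma='scale')
    svm.fit(X_train, y_train)
    print(svm.score(X_test, y_test))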

Chapter 12 introduces Ensemble Learning. It covers meta-classifiers, the concept of voting, bootstrap resampling, bagging, random forest, and stacking algorithms. This chapter ends with the popular AdaBoost algorithm.
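
A sketch contrasting a bagging-style ensemble (random forest) with boosting (AdaBoost), assuming scikit-learn; the data and ensemble sizes are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)

    for model in (RandomForestClassifier(n_estimators=100, random_state=0),  # bagging of trees
                  AdaBoostClassifier(n_estimators=50, random_state=0)):      # boosting of stumps
        scores = cross_val_score(model, X, y, cv=5)
        print(type(model).__name__, scores.mean())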

Chapter 13 discusses Cluster Analysis. Hierarchical clustering algorithms and partitional clustering algorithms such as k-means are introduced in this chapter. In addition, density-based, grid-based, and probability-based approaches such as fuzzy clustering and the EM algorithm are outlined. This chapter ends with the evaluation of clustering algorithms.
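
A minimal k-means sketch together with one common internal evaluation measure (the silhouette score), assuming scikit-learn; the data and number of clusters are illustrative assumptions:

    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.metrics import silhouette_score

    X, _ = load_iris(return_X_y=True)

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
    labels = kmeans.fit_predict(X)
    print(kmeans.cluster_centers_)           # learned centroids
    print(silhouette_score(X, labels))       # internal clustering quality measure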

Chapter 14 covers the concept of Reinforcement Learning. This chapter starts with the idea of reinforcement learning, the multi-armed bandit problem, and the Markov Decision Process (MDP). It then introduces model-based (passive learning) and model-free methods. Q-learning and SARSA are also covered in this chapter.
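
A minimal tabular Q-learning sketch on a hypothetical five-state corridor; the environment, rewards, and hyperparameters below are invented purely for illustration:

    import numpy as np

    # Toy MDP: states 0..4 in a row; action 0 moves left, action 1 moves right.
    # Reaching state 4 yields reward +1 and ends the episode.
    n_states, n_actions = 5, 2
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, epsilon = 0.1, 0.9, 0.3    # learning rate, discount, exploration

    rng = np.random.default_rng(0)
    for episode in range(300):
        s = 0
        while s != 4:
            # epsilon-greedy action selection
            a = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(Q[s]))
            s_next = max(s - 1, 0) if a == 0 else min(s + 1, 4)
            r = 1.0 if s_next == 4 else 0.0
            # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
            Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
            s = s_next

    print(np.argmax(Q[:4], axis=1))          # greedy policy for states 0-3: all 1 (move right)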

Chapter 15 is about Genetic Algorithms. The concepts of genetic algorithms and their components are presented in this chapter along with simple examples. Evolutionary computation techniques such as simulated annealing and genetic programming are outlined at the end of the chapter.

Chapter 16 discusses Deep Learning. Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are explained in this chapter. Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU) are outlined here. Additional web content is provided for a thorough understanding of deep learning.
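
A minimal recurrent model sketch with an LSTM layer, assuming the Keras API shipped with TensorFlow; the data here is random and the layer sizes are arbitrary, purely for illustration:

    import numpy as np
    from tensorflow.keras.layers import Dense, Input, LSTM
    from tensorflow.keras.models import Sequential

    # Dummy sequence data: 32 sequences, 10 time steps, 1 feature
    X = np.random.rand(32, 10, 1)
    y = np.random.rand(32, 1)

    model = Sequential([
        Input(shape=(10, 1)),
        LSTM(16),                            # gated recurrent layer (LSTM cell)
        Dense(1),                            # single regression output
    ])
    model.compile(optimizer='adam', loss='mse')
    model.fit(X, y, epochs=3, verbose=0)
    model.summary()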

Appendix A discusses Python basics.

Appendix B covers the Python packages that are necessary to implement machine learning algorithms. Packages such as NumPy, Pandas, Matplotlib, scikit-learn, and Keras are outlined in this appendix.
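
As a taste of how these packages fit together (an illustrative sketch, not taken from the appendix), the snippet below loads data with scikit-learn, summarizes it with Pandas, fits a classifier, and plots two features with Matplotlib:

    import matplotlib.pyplot as plt
    import pandas as pd
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    data = load_iris(as_frame=True)
    df = data.frame                                       # Pandas DataFrame of features and target
    print(df.describe())                                  # quick statistical summary

    X = df.drop(columns='target').to_numpy()              # NumPy array for the learner
    y = df['target'].to_numpy()
    model = LogisticRegression(max_iter=1000).fit(X, y)   # scikit-learn estimator
    print(model.score(X, y))

    plt.scatter(df['sepal length (cm)'], df['sepal width (cm)'], c=y)
    plt.xlabel('sepal length (cm)')
    plt.ylabel('sepal width (cm)')
    plt.show()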

Appendix C offers 25 laboratory experiments covering the concepts of the textbook.

Features

  • Algorithmic Approach: Illustrates machine learning concepts using simple language and includes 100+ numerical problems.
  • Minimal Mathematics Strategy: Focuses on building a strong foundational understanding of machine learning with minimal reliance on complex mathematics.
  • Comprehensive Coverage: Covers all essential machine learning topics with 100+ figures and Python code examples.
  • Simple Explanations: Clearly explains advanced topics such as:
    • Clustering
    • Support Vector Machines
    • Genetic Algorithms
    • Artificial Neural Networks
    • Ensemble Learning
    • Deep Learning
  • Appendices: Provide foundational knowledge of Python and key packages like:
    • NumPy
    • Pandas
    • Scikit-learn
    • Matplotlib
    • SciPy
    • Keras
  • Laboratory Manual: Includes practical examples implemented using Python and its libraries.
  • Useful Pedagogical Features: Engages learners with tools like Crossword Puzzles and Word Search activities.
