Feature Selection in Machine Learning with Python
Over 20 methods to select the most predictive features and build simpler, faster, and more reliable machine learning models.
About the Book
Feature selection is the process of choosing a subset of the variables in a data set to train machine learning algorithms. It is an important aspect of data mining and predictive modelling.
Feature selection is key for developing simpler, faster, and highly performant machine learning models and can help to avoid overfitting. The aim of any feature selection algorithm is to create classifiers or regression models that run faster and whose outputs are easier for their users to understand.
In this book, you will find the most widely used feature selection methods to select the best subsets of predictor variables from your data. You will learn about filter, wrapper, and embedded methods for feature selection. Then, you will discover methods designed by computer science professionals or used in data science competitions that are faster or more scalable.
First, we will discuss the use of statistical and univariate algorithms in the context of artificial intelligence. Next, we will cover methods that select features through optimization of the model performance. We will move on to feature selection algorithms that are baked into the machine learning techniques. And finally, we will discuss additional methods designed by data scientists specifically for applied predictive modeling.
In this book, you will find out how to:
- Remove useless and redundant features by examining variability and correlation.
- Choose features based on statistical tests such as ANOVA, chi-square, and mutual information.
- Select features by using Lasso regularization or decision tree based feature importance, which are embedded in the machine learning modeling process.
- Select features by recursive feature elimination, addition, or value permutation.
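As a taste of what these methods look like in practice, here is a minimal sketch, using scikit-learn (which the book relies on) and a synthetic dataset, of two of the techniques listed above: dropping a constant feature by examining variability, and keeping the features with the strongest ANOVA F-scores. The dataset and parameter choices are illustrative, not taken from the book.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_classif

# Toy classification dataset, plus one deliberately constant (useless) column.
X, y = make_classification(n_samples=200, n_features=8, n_informative=4,
                           random_state=0)
X = np.hstack([X, np.zeros((200, 1))])

# 1) Remove constant features by examining variability.
vt = VarianceThreshold(threshold=0.0)
X_var = vt.fit_transform(X)  # the zero-variance column is dropped

# 2) Keep the 4 features with the strongest ANOVA F-score against the target.
skb = SelectKBest(score_func=f_classif, k=4)
X_best = skb.fit_transform(X_var, y)

print(X.shape, X_var.shape, X_best.shape)  # (200, 9) (200, 8) (200, 4)
```

Each step returns a reduced copy of the data while leaving the surviving columns untouched, which is the defining property of feature selection.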
Each chapter fleshes out various methods for feature selection that share common characteristics. First, you will learn the fundamentals of the feature selection method, and next you will find a Python implementation.
The book comes with an accompanying GitHub repository with the full source code that you can download, modify, and use in your own data science projects and case studies.
Feature selection methods differ from dimensionality reduction methods in that feature selection techniques do not alter the original representation of the variables, but merely select a reduced number of features from the training data that produce performant machine learning models.
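This distinction is easy to see in code. The sketch below, assuming the iris dataset purely for illustration, contrasts a feature selector (which keeps two of the original columns unchanged) with PCA (a dimensionality reduction method that replaces the columns with new, transformed variables):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True, as_frame=True)

# Feature selection: retains a subset of the original columns, unaltered.
selector = SelectKBest(f_classif, k=2).fit(X, y)
kept = X.columns[selector.get_support()].tolist()
print(kept)  # names of two original iris measurements

# Dimensionality reduction: creates brand-new variables that are
# combinations of all the originals, so the columns lose their meaning.
pca_components = PCA(n_components=2).fit_transform(X)
print(pca_components.shape)  # (150, 2), but no original column survives
```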
Using the Python libraries Scikit-learn, MLXtend, and Feature-engine, you’ll learn how to select the best numerical and categorical features for regression and classification models in just a few lines of code. You will also learn how to make feature selection part of your machine learning workflow.
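As a sketch of what "part of your machine learning workflow" means, the snippet below wires an embedded selection step, Lasso regularization inside scikit-learn's SelectFromModel, into a Pipeline. The dataset and the alpha value are illustrative assumptions, not the book's own examples.

```python
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)

# Embedded selection: features whose Lasso coefficient shrinks to zero
# are dropped, and only the surviving subset feeds the final model.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectFromModel(Lasso(alpha=1.0))),
    ("model", LinearRegression()),
])
pipe.fit(X, y)

n_kept = pipe.named_steps["select"].get_support().sum()
print(f"{n_kept} of {X.shape[1]} features retained")
```

Because selection lives inside the pipeline, it is refit on each training fold during cross-validation, avoiding leakage from the held-out data.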
- Who is this book for
- What this book covers
- Technical requirements
- Download the code files
- Get in touch
Chapter 1: Feature Selection Overview
- What is feature selection?
- Why do we select features?
- Feature selection methods
- Filter methods
- Wrapper methods
- Embedded methods
- Other methods
Chapter 2: Basic Feature Selection Methods
- Constant features
- Quasi-constant features
- Duplicated features
Chapter 3: Correlation of Predictors
- Correlation coefficients
- Visualizing correlated features
- Remove correlated features: retain first, remove the rest
- Remove correlated features: retain best feature, remove the rest
- Correlation of categorical variables
Chapter 4: Filter Methods
- Mutual information
Chapter 5: Univariate Feature Selection
- Single feature model
- Target encoding
Chapter 6: Wrapper Methods
- Exhaustive search
- Forward feature selection
- Backward feature elimination
Chapter 7: Embedded Methods
- Feature importance from decision trees
- Recursive feature elimination by feature importance
Chapter 8: Other Methods
- Recursive feature addition
- Recursive feature elimination
- Feature shuffling
- Other books by the author
- Online courses by the author