Feature Selection in Machine Learning

By Soledad Galli, PhD

About the Book

Feature selection is the process of choosing a subset of variables from a data set with which to train machine learning algorithms. It is an important aspect of data mining and predictive modeling.

Feature selection is key for developing simpler, faster, and highly performant machine learning models, and can help to avoid overfitting. The aim of any feature selection algorithm is to create classifiers or regression models that run faster and whose outputs are easier for their users to understand.

In this book, you will find the most widely used feature selection methods to select the best subsets of predictor variables from your data. You will learn about filter, wrapper, and embedded methods for feature selection. Then, you will discover methods designed by computer science professionals or used in data science competitions that are faster or more scalable.

First, we will discuss the use of statistical and univariate algorithms in the context of artificial intelligence. Next, we will cover methods that select features through optimization of the model performance. We will move on to feature selection algorithms that are baked into the machine learning techniques. And finally, we will discuss additional methods designed by data scientists specifically for applied predictive modeling.

In this book, you will find out how to:

  • Remove useless and redundant features by examining variability and correlation.
  • Choose features based on statistical tests such as ANOVA, chi-square, and mutual information.
  • Select features by using Lasso regularization or decision-tree-based feature importance, which are embedded in the machine learning modeling process.
  • Select features by recursive feature elimination, addition, or value permutation.
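The first bullet, removing useless and redundant features, can be sketched in plain Python. This is an illustrative, hand-rolled version only (the book relies on libraries such as scikit-learn and Feature-engine for this); the data set, feature names, and the 0.95 correlation threshold are made up for the example:

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length columns."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (len(x) * pstdev(x) * pstdev(y))

# Toy data set (hypothetical values): feature name -> column of samples.
X = {
    "age":      [25, 32, 47, 51, 38, 29],
    "age_copy": [25, 32, 47, 51, 38, 29],  # redundant duplicate of "age"
    "constant": [1, 1, 1, 1, 1, 1],        # zero variability, carries no signal
    "income":   [40, 90, 55, 45, 80, 60],
}

def filter_features(X, corr_threshold=0.95):
    # Step 1: drop constant features (no variability, hence useless).
    candidates = [f for f in X if pstdev(X[f]) > 0]
    # Step 2: greedily drop any feature highly correlated with one already kept.
    selected = []
    for f in candidates:
        if all(abs(pearson(X[f], X[g])) < corr_threshold for g in selected):
            selected.append(f)
    return selected

print(filter_features(X))  # -> ['age', 'income']
```

Here "constant" is removed for lack of variability and "age_copy" for its perfect correlation with "age", leaving only the informative, non-redundant columns.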

Each chapter fleshes out various methods for feature selection that share common characteristics. First, you will learn the fundamentals of the feature selection method, and next you will find a Python implementation.

The book comes with an accompanying GitHub repository with the full source code that you can download, modify, and use in your own data science projects and case studies.

Feature selection methods differ from dimensionality reduction methods in that feature selection techniques do not alter the original representation of the variables, but merely select a reduced number of features from the training data that produce performant machine learning models.
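That distinction can be seen in a tiny sketch. The numbers below are made up, and a fixed linear combination stands in for a learned projection such as a PCA component; it is not a real dimensionality reduction algorithm:

```python
# Toy columns (hypothetical values).
height = [160, 172, 181, 168]
weight = [55, 70, 85, 62]
shoe   = [38, 41, 44, 39]

# Feature selection keeps a subset of the ORIGINAL columns, unchanged:
selected = {"height": height, "weight": weight}  # "shoe" is simply dropped

# Dimensionality reduction instead builds NEW variables as combinations of
# the originals; this fixed linear combination is only a stand-in for a
# learned principal component.
component = [0.6 * h + 0.8 * w for h, w in zip(height, weight)]

print(selected["height"] == height)  # True: original representation intact
print(component)                     # transformed values, not an original column
```

The selected features remain interpretable in their original units, whereas the transformed variable mixes heights and weights into a new quantity.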

Using the Python libraries Scikit-learn, MLXtend, and Feature-engine, you’ll learn how to select the best numerical and categorical features for regression and classification models in just a few lines of code. You will also learn how to make feature selection part of your machine learning workflow.
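As an illustration of that workflow, here is a minimal scikit-learn sketch (assuming scikit-learn is installed; the data set is synthetic and the choices of `f_classif` and `k=3` are arbitrary, not a recipe from the book):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Synthetic classification data: 10 features, of which only 3 are informative.
X, y = make_classification(
    n_samples=200, n_features=10, n_informative=3, random_state=0
)

# Feature selection as one step of the modeling pipeline: keep the 3 features
# with the highest ANOVA F-statistic, then fit a classifier on them.
pipe = Pipeline([
    ("select", SelectKBest(score_func=f_classif, k=3)),
    ("model", LogisticRegression(max_iter=1000)),
])
pipe.fit(X, y)

# Boolean mask indicating which of the 10 original features were kept.
print(pipe.named_steps["select"].get_support())
```

Wrapping the selector in a `Pipeline` ensures the same features chosen during training are applied at prediction time, which is what "making feature selection part of your workflow" amounts to in practice.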


About the Author

Soledad Galli, PhD

@Soledad_Galli

Soledad Galli is a seasoned data scientist, instructor, and software developer with over a decade of experience across esteemed academic institutions and renowned businesses. She specializes in developing and deploying machine learning models for assessing insurance claims, credit risk, and fraud prevention.


Sole is the lead instructor at Train in Data, where she shares her wealth of knowledge through online courses on machine learning that have enrolled over 50,000 students worldwide and consistently earn high praise. She is also the driving force behind the open-source Python library Feature-engine, which currently sees more than 150,000 downloads per month.


In 2018, Sole was honored with the Data Science Leaders' award, and in 2019, she gained recognition as one of LinkedIn's influential voices in data science and analytics. Her passion for disseminating machine learning knowledge extends to speaking engagements at data science conferences and numerous publications on the subject, including a notable piece addressing the misuse of data and artificial intelligence.

