#29 Machine Learning Specialization [Course 1, Week 2, Lesson 2]
Choosing the Right Features for Your Learning Algorithm
In this section, we will learn how to choose or engineer the most appropriate features for your learning algorithm. We will revisit the example of predicting the price of a house and explore feature engineering.
Example: Predicting House Prices
- The choice of features can have a huge impact on your learning algorithm's performance.
- Choosing or engineering the right features is a critical step in making the algorithm work well.
- Two features for each house are X1 (the width, or frontage, of the lot) and X2 (the depth of the lot).
- A model can be built using f(x) = W1X1 + W2X2 + b.
- Another option is to use area as a new feature by defining X3 as X1 times X2.
- With this new feature, the model becomes f(x) = W1X1 + W2X2 + W3X3 + b.
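The feature-engineering step above can be sketched in code. This is a minimal illustration, not the course's own code: the lot dimensions and prices below are made-up values, and scikit-learn's LinearRegression stands in for the gradient-descent implementation developed in the course.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up lot dimensions (feet) and prices (thousands of dollars),
# purely for illustration.
x1 = np.array([50.0, 60.0, 40.0, 80.0])     # frontage (width) of lot
x2 = np.array([100.0, 90.0, 120.0, 150.0])  # depth of lot
price = np.array([300.0, 320.0, 280.0, 500.0])

# Engineered feature: lot area = frontage * depth.
x3 = x1 * x2

# Fit f(x) = W1*X1 + W2*X2 + W3*X3 + b on all three features.
X = np.column_stack([x1, x2, x3])
model = LinearRegression().fit(X, price)

print("weights:", model.coef_, "bias:", model.intercept_)
```

Because X3 captures area directly, the model no longer has to learn how width and depth combine; the learning algorithm can simply weight the area feature.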
Feature Engineering
In this section, we will learn about feature engineering and how it allows you to fit not just straight lines but curves, that is, non-linear functions, to your data.
Feature Engineering
- Feature engineering involves using knowledge or intuition about the problem to design new features.
- New features are usually created by transforming or combining original features in order to make it easier for the learning algorithm to make accurate predictions.
- By defining new features, you might be able to get a much better model than just taking the original set of features.
- One flavor of feature engineering allows you to fit not just straight lines but curves, that is, non-linear functions, to your data.
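The last bullet can be illustrated with polynomial features: by adding powers of a single feature (size, size squared, size cubed), a linear model can fit a curve. This sketch uses made-up house-size data and scikit-learn in place of the course's own implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: house size (sq. ft.) and price (thousands of dollars).
size = np.array([650.0, 800.0, 1000.0, 1200.0, 1500.0, 2000.0])
price = np.array([150.0, 180.0, 210.0, 240.0, 280.0, 330.0])

# Engineered polynomial features: size, size^2, size^3.
# The model f(x) = W1*x + W2*x^2 + W3*x^3 + b is still linear in the
# weights, but traces a curve as a function of size.
X = np.column_stack([size, size**2, size**3])

model = LinearRegression().fit(X, price)
predictions = model.predict(X)
```

Note that features like size squared and size cubed take on very different ranges, which is why this technique pairs naturally with feature scaling, covered earlier in this week.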