Decline of Feature Engineering

Coming up with features is difficult, time-consuming, and requires expert knowledge. When working on applications of machine learning, we spend a lot of time tuning the features. However, these features can be learned from data.

Having a large number of features is critical. The specific learning algorithm matters, but algorithms that can scale to many features have a big advantage.

Often, it’s not who has the best algorithm that wins; it’s who has the most data.

Arguably, most algorithms ‘win’ over others by scaling the model and learning more features.
Andrew Ng

    <iframe
        width="400"
        height="300"
        src="https://www.youtube.com/embed/n1ViNeWhC24"
        frameborder="0"
        allowfullscreen
    ></iframe>
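
To make that claim concrete, here is a minimal sketch (my own illustration, not code from the talk) of unsupervised feature learning: a tiny NumPy autoencoder that learns a 2-dimensional representation of 8-dimensional raw inputs by minimizing reconstruction error, instead of relying on hand-designed features. The data, layer sizes, and hyperparameters are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy raw data: 200 samples in 8 dimensions that actually lie near a 2-D subspace.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 8))
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize for stable training

n_in, n_hidden = X.shape[1], 2
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))  # encoder weights
W2 = rng.normal(scale=0.1, size=(n_hidden, n_in))  # decoder weights
lr = 0.05

for step in range(5000):
    H = np.tanh(X @ W1)        # learned features (hidden representation)
    X_hat = H @ W2             # reconstruction from those features
    err = X_hat - X
    # Gradient descent on mean squared reconstruction error
    gW2 = H.T @ err / len(X)
    gH = (err @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ gH / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

print("reconstruction MSE:", float(np.mean((np.tanh(X @ W1) @ W2 - X) ** 2)))
print("learned 2-D features for first sample:", np.tanh(X[:1] @ W1))
```

The hidden activations `H` play the role that hand-engineered features used to: they can be fed into any downstream classifier, but they were discovered by the model itself rather than designed by an expert.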