No Free Lunch Theorem
NFL Theorem: informally, no single learning algorithm can produce accurate learners in every domain; when performance is averaged over all possible problems, every learning algorithm has the same expected performance.
NFL: Specific Statements
- Averaged over all possible objective functions, every algorithm has the same expected "off-training-set error";
- Averaged over all possible objective functions for any fixed training set, every algorithm has the same expected "off-training-set error";
- Averaged over all possible priors (over objective functions), every algorithm has the same expected "off-training-set error";
- Averaged over all possible priors for any fixed training set, every algorithm has the same expected "off-training-set error".
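The first statement above is often written in a notation like Wolpert's (a common formalization, not the only one): for any two learning algorithms $A_1$ and $A_2$ and a fixed training set $d$, summing the expected off-training-set error $E_{\text{ots}}$ over all objective functions $f$ gives the same value,

\[
\sum_f E\!\left(E_{\text{ots}} \mid f, d, A_1\right) \;=\; \sum_f E\!\left(E_{\text{ots}} \mid f, d, A_2\right),
\]

so no choice of algorithm can lower the expected off-training-set error once all objective functions are weighted equally.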
The NFL theorem leads to a kind of "conservation law": for any feasible learning algorithm, performance above random guessing on some objective functions is exactly offset by performance below random guessing on others, so the sum of its performance deviations from chance over all possible objective functions is zero.
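The averaging claims above can be checked by brute force on a toy problem. The sketch below (the domain size, the train/test split, and both learners are illustrative assumptions, not part of the theorem's statement) enumerates every Boolean objective function on a 4-point domain, trains two very different rules on the same 2 points, and averages their accuracy on the 2 unseen, off-training-set points. Both come out to exactly 0.5, i.e. chance level, so their deviations from chance sum to zero:

```python
from itertools import product

# Toy NFL check: a 4-point domain admits 2^4 = 16 Boolean objective
# functions. Train on the first 2 points; score accuracy ONLY on the
# 2 unseen ("off-training-set") points.
X = list(range(4))            # input points 0..3
train, test = X[:2], X[2:]    # fixed training / off-training-set split

def learner_constant_zero(train_pairs):
    # hypothetical learner A: ignores the data, always predicts 0
    return lambda x: 0

def learner_majority(train_pairs):
    # hypothetical learner B: predicts the majority training label
    ones = sum(y for _, y in train_pairs)
    maj = 1 if 2 * ones >= len(train_pairs) else 0
    return lambda x: maj

def avg_ots_accuracy(learner):
    # average off-training-set accuracy over ALL objective functions
    accs = []
    for f in product([0, 1], repeat=len(X)):   # one tuple per objective
        h = learner([(x, f[x]) for x in train])
        correct = sum(h(x) == f[x] for x in test)
        accs.append(correct / len(test))
    return sum(accs) / len(accs)

acc_a = avg_ots_accuracy(learner_constant_zero)
acc_b = avg_ots_accuracy(learner_majority)
print(acc_a, acc_b)   # both 0.5: neither learner beats chance on average
```

Replacing either learner with any other deterministic rule gives the same 0.5, because the test labels are independent of the training labels once all objective functions are weighted equally.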