Instance-based learning is called lazy learning because:
- It does not build an explicit model during training.
- Instead, it simply stores the training data and waits until a query/test instance is received.
- When a test instance arrives, it computes the prediction directly from the stored training data (e.g., by finding the nearest neighbors).
- Because it defers generalization until prediction time, it is termed lazy (see the k-NN sketch after the examples below).
Examples:
- k-Nearest Neighbors (k-NN)
- Locally Weighted Regression (LWR)
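The following is a minimal, illustrative k-NN classifier (not from the original notes) that makes the "lazy" behavior concrete: `fit` only stores the data, and all distance computation and voting happen at query time. The class name and helper structure are assumptions for this sketch.

```python
import numpy as np
from collections import Counter

class KNNClassifier:
    """Minimal k-NN sketch illustrating lazy (instance-based) learning."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # Lazy "training": just store the training data; no model is built.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        preds = []
        for x in X:
            # All the real work happens here, at query time:
            # compute distances to every stored training instance.
            dists = np.linalg.norm(self.X_train - x, axis=1)
            nearest = np.argsort(dists)[: self.k]
            # Majority vote among the k nearest neighbors.
            label = Counter(self.y_train[nearest]).most_common(1)[0][0]
            preds.append(label)
        return np.array(preds)

# Usage: fit() returns almost instantly; predict() does the computation.
X = [[1, 1], [1, 2], [5, 5], [6, 5]]
y = [0, 0, 1, 1]
print(KNNClassifier(k=3).fit(X, y).predict([[1.5, 1.5], [5.5, 5.0]]))  # -> [0 1]
```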
Comparison: Instance-based vs. Model-based Learning
| Aspect | Instance-Based Learning (Lazy) | Model-Based Learning (Eager) |
|---|---|---|
| Model creation | No explicit model is created | Learns a model during training |
| Training phase | Fast (just stores data) | Slow (training and optimization required) |
| Prediction phase | Slow (computation occurs at query time) | Fast (uses trained model) |
| Memory usage | High (stores all training data) | Low to medium (stores model parameters only) |
| Flexibility | More flexible (adapts locally to the data) | Less flexible; constrained by the assumptions of the chosen model |
| Examples | k-NN, LWR | Decision Trees, SVM, Neural Networks |
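To see the training-time vs. prediction-time trade-off from the table, the sketch below times a lazy learner against an eager one. It assumes scikit-learn is installed (scikit-learn is not mentioned in the notes), and the exact timings will vary by machine and dataset size; note that scikit-learn's k-NN may also build a small search index during `fit`.

```python
import time
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier    # lazy / instance-based
from sklearn.tree import DecisionTreeClassifier       # eager / model-based

# Synthetic data, just for the timing comparison.
X, y = make_classification(n_samples=20000, n_features=20, random_state=0)

for name, model in [("k-NN (lazy)", KNeighborsClassifier(n_neighbors=5)),
                    ("Decision tree (eager)", DecisionTreeClassifier(random_state=0))]:
    t0 = time.perf_counter()
    model.fit(X, y)       # lazy: mostly stores data; eager: builds the tree
    t1 = time.perf_counter()
    model.predict(X)      # lazy: heavy neighbor search; eager: fast tree lookups
    t2 = time.perf_counter()
    print(f"{name:22s} train {t1 - t0:.3f}s  predict {t2 - t1:.3f}s")
```

Typically the lazy learner trains faster but predicts slower, while the eager learner shows the opposite pattern, matching the comparison table above.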
