Basis functions are the mathematical building blocks used to transform input data into a form a model can learn from. Think of them as a mathematical “lens” that represents the data in a new way.
Let’s go through the common models and their implicit or explicit basis functions in a compact way.
Linear Regression / Logistic Regression
- Basis Function: $f(x) = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n$, a linear combination of input features. No transformation unless you add polynomial or other features manually.
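As a hedged illustration of that last point, here is a minimal sketch (assuming NumPy and scikit-learn are available; the data is made up) that fits a plain linear basis and then manually adds polynomial basis functions:

```python
# Minimal sketch (assumes NumPy and scikit-learn, toy data): fit a plain
# linear basis, then add polynomial basis functions by hand.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0], [2.0], [3.0], [4.0]])  # single raw feature x
y = np.array([1.0, 4.0, 9.0, 16.0])         # target roughly follows x^2

# Linear basis only: f(x) = beta_0 + beta_1 * x
linear_model = LinearRegression().fit(X, y)

# Manually added polynomial basis: columns [x, x^2]
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
poly_model = LinearRegression().fit(X_poly, y)

print(linear_model.predict([[5.0]]))        # linear extrapolation
print(poly_model.predict([[5.0, 25.0]]))    # captures the curvature
```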
Support Vector Machine (SVM)
- Basis Function:
Depends on the kernel you choose (a short sketch follows the list below).
- Linear SVM: same as logistic regression (linear basis).
- RBF kernel: Uses radial basis functions: $\phi(x, x') = \exp(-\gamma \|x - x'\|^2)$
- Polynomial kernel: Applies polynomial basis functions.
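To make the kernel choice concrete, here is a small sketch (assuming NumPy and scikit-learn; the data, labels, and gamma value are made up) that evaluates the RBF formula above for one pair of points and then trains an SVM with each kernel:

```python
# Minimal sketch (assumes NumPy and scikit-learn, toy data and gamma):
# evaluate the RBF formula directly, then let an SVM pick its implicit
# basis via the kernel argument.
import numpy as np
from sklearn.svm import SVC

def rbf(x, x_prime, gamma=0.5):
    # phi(x, x') = exp(-gamma * ||x - x'||^2)
    return np.exp(-gamma * np.sum((x - x_prime) ** 2))

print(rbf(np.array([1.0, 2.0]), np.array([2.0, 0.0])))  # similarity in (0, 1]

# Toy two-class problem; each kernel implies a different basis expansion.
X = np.array([[0, 0], [1, 1], [2, 2], [3, 3]])
y = np.array([0, 0, 1, 1])
for kernel in ("linear", "rbf", "poly"):
    clf = SVC(kernel=kernel, gamma=0.5).fit(X, y)
    print(kernel, clf.predict([[1.2, 1.2]]))
```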
Decision Tree
The decision tree is different from models like linear regression or SVM.
It uses non-continuous, step-wise functions based on thresholds—you can think of them as indicator functions.
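To see that indicator-function view in code, here is a minimal sketch (assuming NumPy and scikit-learn, with made-up data) where a depth-1 regression tree learns one threshold and predicts a constant on each side of it:

```python
# Minimal sketch (assumes NumPy and scikit-learn, toy data): a depth-1
# regression tree learns a single threshold t, so its prediction is a
# step function: c1 * 1[x <= t] + c2 * 1[x > t].
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]])
y = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0])

tree = DecisionTreeRegressor(max_depth=1).fit(X, y)
print(export_text(tree, feature_names=["x"]))  # shows the learned threshold
print(tree.predict([[2.5], [6.0], [11.5]]))    # constant on each side of it
```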