Basis Functions

Basis functions are the mathematical building blocks used to transform input data into a form that a model can learn from. Think of them as a mathematical “lens” that represents the data in a new way.
Let’s go through the common models and their implicit or explicit basis functions in a compact way.

Linear Regression / Logistic Regression

  • Basis Function: f(x) = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n. A linear combination of the input features. No transformation unless you add polynomial or other features manually.
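
A minimal sketch of that last point (the sine data and the degree-5 polynomial basis are my own illustration, not from the post): the same linear model is fit once on the raw feature and once after a manual basis expansion.

```python
# Sketch: linear regression with and without a manual polynomial basis expansion.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(x).ravel() + 0.1 * rng.normal(size=200)

# Linear basis: f(x) = beta_0 + beta_1 * x
linear = LinearRegression().fit(x, y)

# Manual basis expansion: phi(x) = [x, x^2, ..., x^5] (intercept added by the model)
phi = PolynomialFeatures(degree=5, include_bias=False)
x_poly = phi.fit_transform(x)
poly = LinearRegression().fit(x_poly, y)

print("R^2 with raw feature:      ", round(linear.score(x, y), 3))
print("R^2 with polynomial basis: ", round(poly.score(x_poly, y), 3))
```

The model itself stays linear; only the basis it sees changes.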

Support Vector Machine (SVM)

  • Basis Function:
    Depends on the kernel you choose.
    • Linear SVM: same as logistic regression (linear basis).
    • RBF kernel: Uses radial basis functions: \phi(x, x') = \exp(-\gamma \|x - x'\|^2)
    • Polynomial kernel: Applies polynomial basis functions.
    🔑 The SVM doesn’t transform the data directly; it uses the kernel trick to compute similarity in a high-dimensional space implicitly.
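
A rough sketch of that trick (the two points, gamma = 0.5, and the make_moons dataset are illustrative assumptions, not from the post): the RBF similarity is computed straight from the squared distance, and the SVM only ever needs these pairwise values.

```python
# Sketch: the RBF kernel by hand vs. scikit-learn, then an SVC that uses it.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC
from sklearn.datasets import make_moons

gamma = 0.5
a = np.array([[1.0, 2.0]])
b = np.array([[2.0, 0.0]])

# phi(x, x') = exp(-gamma * ||x - x'||^2), evaluated without ever mapping
# the points into the high-dimensional feature space explicitly.
manual = np.exp(-gamma * np.sum((a - b) ** 2))
print(manual, rbf_kernel(a, b, gamma=gamma)[0, 0])  # same value

# The SVM only ever consumes these pairwise similarities (the kernel trick).
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)
print("training accuracy:", clf.score(X, y))
```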

Decision Tree
The decision tree is different from models like linear regression or SVM.

It uses non-continuous, step-wise functions based on thresholds—you can think of them as indicator functions.
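A small sketch of that behavior (the sine target and the depth-2 tree are assumptions for illustration): a shallow tree on a single feature produces a piecewise-constant prediction, i.e. a sum of indicator functions of threshold splits.

```python
# Sketch: a shallow decision tree is a step function built from threshold rules.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

x = np.linspace(0, 10, 100).reshape(-1, 1)
y = np.sin(x).ravel()

tree = DecisionTreeRegressor(max_depth=2).fit(x, y)
print(export_text(tree, feature_names=["x"]))  # prints the threshold rules

# Predictions take only a handful of distinct values: one per leaf region.
print("distinct predicted values:", np.unique(tree.predict(x).round(3)))
```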
