
Linear Discriminant Analysis


Linear Discriminant Analysis (线性判别分析) is also abbreviated LDA. Its core is the Fisher criterion

$$J(w) = \frac{(\mu_2 - \mu_1)^2}{s_1^2 + s_2^2}$$

i.e., after projection the two class means should be as far apart as possible, while each class's variance should be as small as possible.
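Maximizing this criterion has a well-known closed-form solution, which can be sketched in a few lines of NumPy (the synthetic data and variable names below are illustrative, not from the original notes): the optimal direction is $w \propto S_W^{-1}(\mu_2 - \mu_1)$, where $S_W$ is the within-class scatter matrix.

```python
import numpy as np

# Illustrative sketch: maximizing the Fisher criterion
# J(w) = (mu2 - mu1)^2 / (s1^2 + s2^2) has the closed-form solution
# w ∝ S_W^{-1} (mu2 - mu1), with S_W the within-class scatter matrix.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))  # class 1
X2 = rng.normal(loc=[3.0, 2.0], scale=1.0, size=(100, 2))  # class 2

mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter: sum of the two classes' centered outer products.
S_W = (X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)
w = np.linalg.solve(S_W, mu2 - mu1)  # Fisher direction (up to scale)

# After projecting onto w, the class means should be far apart
# relative to the within-class spread.
p1, p2 = X1 @ w, X2 @ w
J = (p2.mean() - p1.mean()) ** 2 / (p1.var() + p2.var())
print("Fisher criterion J(w):", J)
```

Note that $J(w)$ is invariant to rescaling $w$, which is why only the direction of $S_W^{-1}(\mu_2 - \mu_1)$ matters.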

kernel LDA

$$y = W^T X + \lambda \| W \|$$

Looking at LDA from the Bayes angle is important, because QDA's train and predict are generally implemented from that angle, and the connection to the Gaussian Naive Bayes classifier within Naive Bayes is also drawn from this perspective.

LDA assumes all classes share the same covariance matrix; QDA drops that requirement and is therefore more flexible.

Usually one looks at the probability of y = 1, but the log-probability ratio can be used as well. (This is not the focus here, since classification is generally evaluated with AUC.)

This expression is also a first-order (linear) function of x, which is why the discriminant is linear.
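The linearity in x is easy to verify numerically. The sketch below (Gaussian parameters chosen here purely for illustration) shows that with a covariance matrix shared by the two classes, the log-probability ratio reduces to $w^\top x + b$, because the quadratic terms $x^\top \Sigma^{-1} x$ cancel.

```python
import numpy as np

# Sketch with illustrative parameters: a covariance shared across classes
# makes the log-probability ratio log p(y=1|x) - log p(y=0|x) linear in x.
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])  # shared covariance (LDA assumption)
mu0, mu1 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
pi0, pi1 = 0.5, 0.5                         # class priors
Sinv = np.linalg.inv(Sigma)

# The quadratic terms x^T Sigma^{-1} x cancel in the ratio, leaving w·x + b.
w = Sinv @ (mu1 - mu0)
b = -0.5 * (mu1 @ Sinv @ mu1 - mu0 @ Sinv @ mu0) + np.log(pi1 / pi0)

def log_ratio_direct(x):
    """Log-probability ratio computed from the full Gaussian log-densities."""
    def log_gauss(x, mu):
        d = x - mu
        return -0.5 * d @ Sinv @ d  # shared normalizer cancels in the ratio
    return log_gauss(x, mu1) + np.log(pi1) - log_gauss(x, mu0) - np.log(pi0)

x = np.array([1.0, -0.5])
print(log_ratio_direct(x), w @ x + b)  # the two agree: the ratio is linear in x
```

Under QDA each class keeps its own covariance, the quadratic terms no longer cancel, and the decision boundary becomes quadratic in x.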

Further reading

http://www.doc88.com/p-147667786401.html
Linear Discriminant Analysis (Part 2): LDA from the perspective of the Bayes-optimal classifier
The Gaussian Discriminant Analysis model
MLAPP reading notes, Ch. 04: Gaussian models
Linear Discriminant Analysis (线性判别分析)
https://mp.weixin.qq.com/s/AeLwfmM0N-b1dfxt3v4C-A
Linear Discriminant Analysis (LDA) explained in detail
Mathematics in Machine Learning (4): Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA)
A regularized least-squares Linear Discriminant Analysis algorithm
https://blog.csdn.net/VictoriaW/article/details/78275394