Diffie–Hellman key exchange

Let $p$ be a prime and let $g$ and $x$ be integers. Computing $y = g^x \bmod p$ is fast, but the inverse problem of recovering $x$ from $y$ (the discrete logarithm problem) is computationally hard.
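This asymmetry can be illustrated with a toy example: Python's three-argument `pow` computes $g^x \bmod p$ instantly, while a naive attacker has to search exponents one by one. The concrete values ($p = 23$, $g = 5$, $x = 6$) are illustrative assumptions, not from the original text:

```python
p, g = 23, 5
x = 6
y = pow(g, x, p)   # forward direction: fast modular exponentiation

def dlog(y, g, p):
    """Naive discrete log: brute-force search over exponents (toy sizes only)."""
    for k in range(p):
        if pow(g, k, p) == y:
            return k

assert dlog(y, g, p) == x   # feasible here only because p is tiny
```

For a real-world $p$ of 2048 bits, the forward `pow` is still fast, but this brute-force loop becomes utterly infeasible.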

The DH protocol works as follows:

1. Alice and Bob agree on $p$ and $g$ and publish them, so the eavesdropper Eve also learns their values.
2. Alice picks a secret integer $a$, known to no one else, and sends Bob the result $A = g^a \bmod p$. Eve sees the value of $A$.
3. Similarly, Bob picks a secret integer $b$ and sends Alice the result $B = g^b \bmod p$. Eve likewise sees $B$.
4. Alice computes $S = B^a \bmod p = (g^b)^a \bmod p = g^{ab} \bmod p$.
5. Bob likewise computes $S = A^b \bmod p = (g^a)^b \bmod p = g^{ab} \bmod p$.
6. Alice and Bob now share a common secret key $S$.
7. Although Eve has seen $p$, $g$, $A$ and $B$, the hardness of the discrete logarithm problem keeps her from recovering the concrete values of $a$ and $b$, so she cannot learn the key $S$.
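The steps above can be sketched in a few lines of Python. The parameters ($p = 23$, $g = 5$) and the secret exponents are toy illustrative choices, far too small for real security:

```python
p, g = 23, 5             # step 1: public prime p and generator g

a = 6                    # step 2: Alice's secret exponent
b = 15                   # step 3: Bob's secret exponent

A = pow(g, a, p)         # Alice sends A = g^a mod p (Eve sees A)
B = pow(g, b, p)         # Bob sends B = g^b mod p (Eve sees B)

S_alice = pow(B, a, p)   # step 4: Alice computes S = B^a mod p
S_bob   = pow(A, b, p)   # step 5: Bob computes   S = A^b mod p

assert S_alice == S_bob  # steps 6: both sides hold the same key S
```

Note that each party exponentiates only the *public* value it received with its own *private* exponent; neither secret ever crosses the channel.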

Proof that $(g^b \bmod p)^a \equiv g^{ab} \pmod{p}$:

Write $g^b = kp + r$, where $r = g^b \bmod p$. Then

$$(g^b \bmod p)^a = r^a \equiv (kp + r)^a = (g^b)^a = g^{ab} \pmod{p},$$

because in the binomial expansion of $(kp + r)^a$ every term except $r^a$ carries a factor of $p$. Taking both sides mod $p$ gives $(g^b \bmod p)^a \bmod p = g^{ab} \bmod p$, which is exactly why steps 4 and 5 produce the same key $S$.
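The identity can also be checked numerically; the values below are the same toy assumptions as before ($p = 23$, $g = 5$), not part of the original text:

```python
p, g, a, b = 23, 5, 6, 15

# Reducing mod p before exponentiating does not change the result mod p:
lhs = pow(pow(g, b, p), a, p)   # (g^b mod p)^a mod p
rhs = pow(g, a * b, p)          # g^(ab) mod p
assert lhs == rhs
```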