File name: bayes_multiclass
Description: the download content is collected from the internet; please study and use it at your own discretion.
z = bayes_classifier(m, S, P, X). This function applies the Bayesian classification rule for M classes, each modeled by a Gaussian distribution, and returns the predicted class labels.
where,
• M: the number of classes.
• l: the number of features (per feature vector).
• N: the number of data vectors.
• m: l x M matrix whose j-th column is the mean vector of the j-th class.
• S: l x l x M matrix; S(:,:,j) is the covariance matrix of the j-th normal distribution.
• P: M-dimensional vector whose j-th component is the a priori probability of the j-th class.
• X: N x l data matrix whose rows are the feature vectors, i.e., the data matrix in scikit-learn convention.
• y: N-dimensional vector containing the known class labels, i.e., the ground truth or target vector in scikit-learn convention.
• z: N-dimensional vector containing the predicted class labels, i.e., the vector of predicted class labels in scikit-learn convention.
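The rule assigns each feature vector x to the class j that maximizes the posterior, i.e., the product of the prior P(j) and the Gaussian density N(x; m(:,j), S(:,:,j)). Below is a minimal Python sketch of such a classifier, matching the argument shapes listed above; it is a hypothetical reimplementation for illustration, and the actual code in bayes_multiclass.ipynb may differ.

import numpy as np
from scipy.stats import multivariate_normal

def bayes_classifier(m, S, P, X):
    """Bayesian classifier for M Gaussian classes (illustrative sketch).

    m : (l, M) array, class means (one column per class)
    S : (l, l, M) array, class covariance matrices
    P : (M,) array, a priori class probabilities
    X : (N, l) array, data matrix (rows are feature vectors)

    Returns
    -------
    z : (N,) array of predicted class labels (0-based class indices)
    """
    M = m.shape[1]
    # Log-posterior (up to a constant): log prior + log Gaussian density,
    # evaluated for every sample and every class; one column per class.
    log_post = np.column_stack([
        np.log(P[j]) + multivariate_normal.logpdf(X, mean=m[:, j], cov=S[:, :, j])
        for j in range(M)
    ])
    # Pick the class with the largest log-posterior for each sample.
    return np.argmax(log_post, axis=1)

For example, with two 2-D classes one would pass m of shape 2x2, S of shape 2x2x2, P = [0.5, 0.5], and an N x 2 data matrix X; the returned z holds the predicted label of each row of X and can be compared against y to estimate the classification error.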
Download file list
bayes_multiclass.ipynb