Search resource list
-
0 downloads:
A Java toolkit for training, testing, and applying Bayesian network classifiers. The implemented classifiers have been shown to perform well in a variety of artificial intelligence, machine learning, and data mining applications.
-
-
0 downloads:
BP neural network classifier.
The program has two modes: learning and working. In learning mode, type "bp learn" at the DOS prompt to start training; the learned weights are saved to weight.dat. In working mode, type "bp work" to classify the data in classfyme.dat; when recognition finishes, the results are saved to results.dat. While bp is running in either mode, none of weight.dat, Sample.dat, classfyme.dat, or results.dat may be opened manually.
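The learn/work cycle described above can be sketched in Python (a minimal stand-in for illustration only, not the uploaded bp program; the 2-2-1 network size, learning rate, and method names are assumptions):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyBP:
    """A 2-2-1 sigmoid network trained by backpropagation (illustrative sizes)."""

    def __init__(self, seed=0):
        rnd = random.Random(seed)
        self.w1 = [[rnd.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
        self.w2 = [rnd.uniform(-1, 1) for _ in range(2)]                      # hidden -> output

    def work(self, x):
        """Forward pass: classify one input vector (the 'bp work' mode)."""
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in self.w1]
        self.o = sigmoid(sum(w * hi for w, hi in zip(self.w2, self.h)))
        return self.o

    def learn(self, x, target, lr=0.5):
        """One backpropagation step (the 'bp learn' mode); returns pre-update error."""
        o = self.work(x)
        delta_o = (o - target) * o * (1 - o)          # output-layer error term
        for j in range(2):
            delta_h = delta_o * self.w2[j] * self.h[j] * (1 - self.h[j])
            self.w2[j] -= lr * delta_o * self.h[j]    # hidden -> output update
            for i in range(2):
                self.w1[j][i] -= lr * delta_h * x[i]  # input -> hidden update
        return 0.5 * (o - target) ** 2
```

A full version of this cycle would persist the weights to a file between the two modes, as the description says the original program does with weight.dat.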
-
-
0 downloads:
Boosting is a meta-learning approach that aims at combining an ensemble of weak classifiers to form a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the training examples that the current ensemble misclassifies.
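The greedy re-weighting scheme described here can be sketched with one-dimensional decision stumps (a hedged illustration, not this upload's code; the stump form and threshold grid are assumptions):

```python
import math

def stump_predict(t, p, x):
    """Decision stump: predict p if x > t, otherwise -p (p is +1 or -1)."""
    return p if x > t else -p

def adaboost(xs, ys, rounds):
    n = len(xs)
    w = [1.0 / n] * n                                  # uniform initial example weights
    uniq = sorted(set(xs))
    thresholds = [uniq[0] - 1] + [(a + b) / 2 for a, b in zip(uniq, uniq[1:])]
    ensemble = []                                      # (alpha, threshold, polarity) triples
    for _ in range(rounds):
        # greedy step: pick the stump with the lowest weighted error
        err, t, p = min(
            (sum(wi for wi, x, y in zip(w, xs, ys) if stump_predict(t, p, x) != y), t, p)
            for t in thresholds for p in (1, -1))
        err = max(err, 1e-12)                          # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)        # stump weight in the combination
        ensemble.append((alpha, t, p))
        # overweight the examples this stump misclassified, then renormalize
        w = [wi * math.exp(-alpha * y * stump_predict(t, p, x))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def adaboost_predict(ensemble, x):
    score = sum(a * stump_predict(t, p, x) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

On the labels `+ + - - - +` below, no single stump is perfect, but three boosted stumps fit the training set exactly.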
-
-
0 downloads:
A program for multiple-classifier ensemble selection based on self-organizing data mining (GMDH).
-
-
2 downloads:
Decomposing a multi-class problem into multiple binary problems is a common way to solve multi-class classification. The performance of the traditional one-against-all (OAA) decomposition depends more on the accuracy of the individual classifiers than on their diversity. This upload presents an ensemble-learning-based neural network ensemble model for multi-class problems, whose basic module combines an OAA binary classifier with a complementary multi-class classifier. Tests show that the model achieves higher accuracy on multi-class problems than other classic ensemble algorithms, while using less storage space and computation time.
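The OAA decomposition itself (without this entry's complementary multi-class module) can be sketched as follows; the logistic-regression base learner, learning rate, and toy data are assumptions for illustration:

```python
import math

def sigmoid(z):
    if z < -30:
        return 0.0
    if z > 30:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def train_binary(points, labels, epochs=300, lr=0.5):
    """Logistic-regression base learner for one 'class k vs. rest' subproblem."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):        # y is 1 (class k) or 0 (rest)
            err = y - sigmoid(w[0] * x1 + w[1] * x2 + b)
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def train_oaa(points, classes):
    """One binary classifier per class: its class relabeled 1, all others 0."""
    models = {}
    for k in set(classes):
        labels = [1 if c == k else 0 for c in classes]
        models[k] = train_binary(points, labels)
    return models

def predict_oaa(models, point):
    """Pick the class whose binary classifier is most confident."""
    x1, x2 = point
    return max(models,
               key=lambda k: sigmoid(models[k][0][0] * x1 +
                                     models[k][0][1] * x2 + models[k][1]))
```

The model described in this entry differs precisely where this sketch is weakest: when the OAA scores disagree or are all low, it consults an additional multi-class classifier rather than trusting the raw argmax.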
-
-
0 downloads:
Several typical classification algorithms in machine learning: SVM, ML, Gaussian mixture models, etc.
-
-
0 downloads:
AdaBoost code; fairly simple, but handy for understanding the basic principles of the AdaBoost algorithm. The aim of the project is to provide a source of the meta-learning algorithm known as AdaBoost to improve the performance of user-defined classifiers.
-
-
0 downloads:
K-nearest neighbor (KNN) classification algorithm. KNN is a non-parametric classifier (it makes no assumption about the form of the distribution and estimates the probability density directly from the data) and is a memory-based learning method. KNN is not suitable for high-dimensional data (curse of dimensionality).
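The memory-based idea fits in a few lines (an illustrative sketch, not this upload; k and the toy data are arbitrary). Note that every query scans the whole stored training set, and that the distance computation is exactly what degrades in high dimensions:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Memory-based, non-parametric: store the data, vote among the k nearest.

    train is a list of (point, label) pairs; no model is fitted in advance.
    """
    neighbors = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```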
-
-
0 downloads:
Machine learning algorithms: a comparison of support vector machines against other classifiers.
-
-
0 downloads:
MATLAB code for three commonly used machine learning classifiers: 1) a PCA classifier; 2) an LDA classifier; 3) a naive Bayes classifier. The implementations of the three algorithms follow "Introduction to Machine Learning". Besides the three classifiers, the code also contains a main.m driver program for testing and a brief report on an experiment run on the well-known acoustic_train_data dataset.
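Of the three classifiers listed, naive Bayes is the simplest to sketch. Below is a hedged Gaussian-naive-Bayes illustration in Python (not the MATLAB upload; the function names and the variance floor are assumptions):

```python
import math

def fit_gnb(points, labels):
    """Per class: a prior plus per-feature mean/variance (features assumed independent)."""
    model = {}
    for k in set(labels):
        rows = [p for p, c in zip(points, labels) if c == k]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        varis = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)  # variance floor
                 for col, m in zip(zip(*rows), means)]
        model[k] = (len(rows) / len(points), means, varis)
    return model

def predict_gnb(model, x):
    """Argmax over classes of log prior + summed Gaussian log-likelihoods."""
    def log_post(k):
        prior, means, varis = model[k]
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll
    return max(model, key=log_post)
```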
-
-
0 downloads:
The package includes the MATLAB code of the multi-instance learning algorithm miFV, an efficient and scalable MIL algorithm. miFV maps the original MIL bags into a new feature vector representation, which can capture bag-level information.
-
-
1 download:
Pattern Recognition: courseware for a Peking University undergraduate course (covering Bayesian models, nearest neighbor, SVM, linear and non-linear classifiers, boosting, statistical learning, unsupervised learning, etc.).
-
-
1 download:
Self-taught learning combining a sparse autoencoder with a classifier. First, a sparse autoencoder is trained on unlabeled handwritten digits 5-9 to obtain the optimal parameters; then forward propagation extracts features for the training and test sets; a softmax model is trained on the labeled 0-4 training set; finally, the test set is fed into the classification model to perform classification.
-
-
0 downloads:
An online SVM classifier compiled in C, suited to fast learning on big data; it has low memory requirements and handles large amounts of data well.
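The low memory footprint comes from updating the model one example at a time instead of solving a quadratic program over the whole dataset. A hedged sketch of a linear online SVM trained by stochastic subgradient steps on the hinge loss (an illustration, not the uploaded C code; the learning rate and regularization constant are assumptions):

```python
def train_online_svm(stream, lr=0.1, lam=0.01, epochs=200):
    """Hinge-loss SGD: each example updates (w, b) once, so memory stays
    O(dim) no matter how many examples flow past."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in stream:                     # one example at a time
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:    # inside the margin: hinge active
                w[0] += lr * (y * x1 - lam * w[0])
                w[1] += lr * (y * x2 - lam * w[1])
                b += lr * y
            else:                                      # outside: only weight decay
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def svm_predict(model, point):
    (w, b), (x1, x2) = model, point
    return 1 if w[0] * x1 + w[1] * x2 + b >= 0 else -1
```

A production version would make a single pass (or few passes) over a true data stream rather than looping over an in-memory list as this toy does.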
-
-
0 downloads:
Presents dictionary-pair-classifier-driven CNNs for object detection, where dictionary pair back propagation (DPBP) is proposed for the end-to-end learning of dictionary pair classifiers and CNN representation, and sample weighting is adopted.
-
-
0 downloads:
Ensemble learning combines the predictions of several base classifiers; this includes the Bagging and AdaBoost algorithms, as well as random forest, a classifier that uses multiple trees to train on and predict the samples.
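The Bagging half of this description can be sketched with bootstrap-resampled decision stumps voting by majority (an illustration, not this upload; the stump base learner and ensemble size are assumptions):

```python
import random
from collections import Counter

def train_stump(xs, ys):
    """Fit the best 1-D threshold/polarity stump on one (bootstrap) sample."""
    best = None
    for t in sorted(set(xs)):
        for p in (1, -1):
            err = sum(1 for x, y in zip(xs, ys) if (p if x > t else -p) != y)
            if best is None or err < best[0]:
                best = (err, t, p)
    return best[1], best[2]

def bagging(xs, ys, n_estimators=15, seed=0):
    """Train each base stump on a bootstrap resample (drawn with replacement)."""
    rnd = random.Random(seed)
    stumps = []
    for _ in range(n_estimators):
        idx = [rnd.randrange(len(xs)) for _ in xs]
        stumps.append(train_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return stumps

def bag_predict(stumps, x):
    """Aggregate the base predictions by majority vote."""
    votes = Counter(p if x > t else -p for t, p in stumps)
    return votes.most_common(1)[0][0]
```

Random forest extends the same recipe by also subsampling the features at each split of a full decision tree, and AdaBoost replaces the uniform bootstrap with the adaptive re-weighting described above.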
-
-
1 download:
Training and testing code for the AdaBoost algorithm; a simple example. (The aim of the project is to provide a source of the meta-learning algorithm known as AdaBoost to improve the performance of user-defined classifiers.)
-
-
1 download:
A boosting algorithm for ensemble learning, containing many weak classifiers.
-
-
0 downloads:
Learning Kernel Classifiers: Theory and Algorithms (introduction).
-