Search resource list
-
0 downloads:
A Java-based program providing word segmentation, N-gram statistics, and paragraph and sentence splitting, with support for multiple languages.
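The N-gram statistics mentioned in this entry amount to a sliding-window count over a token sequence. A minimal illustrative sketch in Python (not the tool's actual code):

```python
from collections import Counter

def ngram_counts(tokens, n):
    """Count every contiguous n-gram in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

tokens = ["the", "cat", "sat", "on", "the", "cat"]
bigrams = ngram_counts(tokens, 2)
print(bigrams[("the", "cat")])  # 2: the bigram appears twice
```

The same function works for characters instead of word tokens, which is how such counts are usually gathered for Chinese text.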
-
-
0 downloads:
Character processing: a Java-based program providing word segmentation, N-gram statistics, and paragraph and sentence splitting, with support for multiple languages.
-
-
0 downloads:
A Chinese word segmentation system built on ICTCLAS from the Chinese Academy of Sciences, developed in Java. Testing shows good results.
-
-
0 downloads:
A Chinese word segmentation component based on a dictionary and the maximum matching algorithm, achieving good segmentation accuracy.
-
-
0 downloads:
A Python-based Chinese word segmentation program; highly usable, and callable directly as an interface from Python programs.
-
-
0 downloads:
A Chinese word segmentation and keyword extraction system developed in VB.NET, implemented with bidirectional maximum matching, word frequency statistics, quicksort, and other algorithms.
-
-
0 downloads:
A forward maximum matching word segmentation system using a hashmap-based first-character hash lookup. Written in C++; the system implements the segmentation function well.
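Forward maximum matching with a first-character index, as this entry describes, can be sketched briefly. This is an illustrative Python sketch (the actual package is in C++); the tiny dictionary is invented:

```python
def forward_max_match(text, dictionary, max_len=4):
    """Greedy forward maximum matching: at each position take the longest
    dictionary word starting there, falling back to a single character."""
    # First-character index, mirroring the first-character hash lookup
    # described above: it narrows the candidate words at each position.
    by_first = {}
    for w in dictionary:
        by_first.setdefault(w[0], set()).add(w)
    words, i = [], 0
    while i < len(text):
        candidates = by_first.get(text[i], set())
        for length in range(min(max_len, len(text) - i), 0, -1):
            piece = text[i:i + length]
            if length == 1 or piece in candidates:
                words.append(piece)
                i += length
                break
    return words

print(forward_max_match("研究生命科学", {"研究生", "研究", "生命", "科学", "生命科学"}))
# ['研究生', '命', '科学']
```

The example also shows the classic weakness of the greedy left-to-right pass: it mis-splits 研究/生命科学, which is one reason bidirectional matching (as in the VB.NET entry above) is often preferred.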
-
-
0 downloads:
VIPS, a vision-based web page segmentation algorithm.
-
-
0 downloads:
An automatic Chinese word segmentation system based on conditional random fields; it splits a sentence into individual words.
-
-
0 downloads:
The random walker algorithm was introduced in the paper:
Leo Grady and Gareth Funka-Lea, "Multi-Label Image Segmentation for Medical Applications Based on Graph-Theoretic Electrical Potentials", in Proceedings of the 8th ECCV04, Workshop on Compute
-
-
0 downloads:
A simple dictionary-based Chinese and English word segmentation algorithm developed by KaiToo Search.
-
-
0 downloads:
Statistics-based word segmentation using a hidden Markov model, with an accompanying experiment report.
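HMM-based segmenters of this kind typically tag each character with one of the states B/M/E/S (begin, middle, end of a word, or single-character word) and decode with Viterbi. A toy sketch with hand-made probabilities; every number and the tiny vocabulary are invented for illustration, not taken from the package:

```python
import math

STATES = "BMES"  # begin / middle / end of a word, or single-character word

def viterbi(obs, start_p, trans_p, emit_p):
    """Standard Viterbi decoding in log space; missing entries act as -inf."""
    NEG = -1e9
    V = [{s: start_p.get(s, NEG) + emit_p[s].get(obs[0], NEG) for s in STATES}]
    path = {s: [s] for s in STATES}
    for ch in obs[1:]:
        V.append({})
        new_path = {}
        for s in STATES:
            prob, prev = max(
                (V[-2][p] + trans_p[p].get(s, NEG) + emit_p[s].get(ch, NEG), p)
                for p in STATES)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    return path[max(STATES, key=lambda s: V[-1][s])]

def tags_to_words(text, tags):
    """Convert a BMES tag sequence back into words."""
    words, buf = [], ""
    for ch, t in zip(text, tags):
        buf += ch
        if t in "ES":          # E or S closes a word
            words.append(buf)
            buf = ""
    return words + ([buf] if buf else [])

lg = math.log
start = {"B": lg(.6), "S": lg(.4)}
trans = {"B": {"M": lg(.3), "E": lg(.7)}, "M": {"M": lg(.4), "E": lg(.6)},
         "E": {"B": lg(.5), "S": lg(.5)}, "S": {"B": lg(.5), "S": lg(.5)}}
emit = {"B": {"中": lg(.6), "人": lg(.3)}, "M": {"国": lg(.5)},
        "E": {"国": lg(.5), "人": lg(.4)}, "S": {"人": lg(.5), "中": lg(.2)}}

tags = viterbi("中国人", start, trans, emit)
print(tags_to_words("中国人", tags))   # ['中国', '人']
```

In a real system the three probability tables are estimated by counting over a segmented training corpus rather than written by hand.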
-
-
0 downloads:
A Chinese word segmentation algorithm based on Bayesian network probabilities; a probability-driven segmentation algorithm.
-
-
1 download:
imdict-chinese-analyzer is the intelligent Chinese word segmentation module of the imdict smart dictionary. Its algorithm is based on a hidden Markov model (HMM); it is a Java reimplementation of the ICTCLAS Chinese word segmentation program from the Institute of Computing Technology, Chinese Academy of Sciences, and can directly provide Simplified Chinese segmentation support for the Lucene search engine.
-
-
0 downloads:
A Chinese analyzer built on the Lucene interface; it uses a bidirectional segmentation method.
-
-
0 downloads:
Chinese word segmentation code based on mechanical (dictionary) matching, using the maximum probability method to resolve ambiguity. The dictionary is very comprehensive.
-
-
0 downloads:
Chinese word segmentation code in VC++. The code is essentially correct; it implements the two segmentation strategies mentioned in the program design, and the segmentation results are as expected.
-
-
0 downloads:
IKAnalyzer2012, a very easy-to-use Lucene-based Chinese tokenizer with two segmentation modes: smart segmentation and finest-grained segmentation.
-
-
0 downloads:
Obtains training data from a POS-tagged training set, then segments the input according to that training data, taking the segmentation with the highest probability as the final result.
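The "highest-probability segmentation" step this entry describes is usually a dynamic program over word probabilities estimated from the training set. A minimal unigram sketch (the word list and probabilities here are invented for illustration):

```python
import math

def max_prob_segment(text, word_prob, max_len=4):
    """best[i] holds the log-probability of the best split of text[:i]."""
    n = len(text)
    best = [0.0] + [-math.inf] * n
    back = [0] * (n + 1)   # back[i]: start index of the last word in text[:i]
    for i in range(1, n + 1):
        for j in range(max(0, i - max_len), i):
            w = text[j:i]
            # Out-of-vocabulary single characters get a tiny floor probability
            # so every position stays reachable; longer OOV strings are skipped.
            p = word_prob.get(w, 1e-8 if len(w) == 1 else 0.0)
            if p > 0 and best[j] + math.log(p) > best[i]:
                best[i] = best[j] + math.log(p)
                back[i] = j
    words, i = [], n
    while i > 0:
        words.append(text[back[i]:i])
        i = back[i]
    return words[::-1]

probs = {"结婚": 0.01, "的": 0.1, "和": 0.05, "尚未": 0.005, "和尚": 0.002, "未": 0.001}
print(max_prob_segment("结婚的和尚未", probs))
# ['结婚', '的', '和', '尚未'] — beats the competing split 结婚/的/和尚/未
```

In practice the probabilities come from word counts over the tagged corpus, and the same dynamic program extends to bigram models by conditioning each word on its predecessor.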
-