List of search resources
ChineseTokenizer20060426
- A word-segmentation dictionary file; a fairly simple lexicon. Simply put, it powers search with word segmentation.
ictclas4j
- Source code of the Chinese tokenizer ictclas4j, including the segmentation algorithm source as well as example cases.
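Both resources revolve around dictionary-based Chinese word segmentation. As an illustrative sketch only (not the actual API of ChineseTokenizer20060426 or ictclas4j, and the dictionary words below are made up for the demo), forward maximum matching shows the basic idea of segmenting text against a lexicon:

```java
import java.util.*;

// Sketch of forward maximum matching (FMM), a simple dictionary-based
// segmentation technique; the word list here is an assumed toy example.
public class FmmSegmenter {
    private final Set<String> dict;
    private final int maxLen; // length of the longest dictionary word

    public FmmSegmenter(Collection<String> words) {
        this.dict = new HashSet<>(words);
        int m = 1;
        for (String w : words) m = Math.max(m, w.length());
        this.maxLen = m;
    }

    // Greedily match the longest dictionary word at each position;
    // fall back to a single character when nothing matches.
    public List<String> segment(String text) {
        List<String> out = new ArrayList<>();
        int i = 0;
        while (i < text.length()) {
            int end = Math.min(i + maxLen, text.length());
            String token = null;
            for (int j = end; j > i; j--) {
                String cand = text.substring(i, j);
                if (dict.contains(cand)) { token = cand; break; }
            }
            if (token == null) token = text.substring(i, i + 1);
            out.add(token);
            i += token.length();
        }
        return out;
    }

    public static void main(String[] args) {
        FmmSegmenter seg = new FmmSegmenter(
            Arrays.asList("中文", "分词", "分词器", "源码"));
        // Longest match wins: "分词器" is preferred over "分词".
        System.out.println(seg.segment("中文分词器源码"));
    }
}
```

Real tokenizers such as ictclas4j go further (statistical models, ambiguity resolution, unknown-word handling), but the lexicon lookup above is the common starting point.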