Search Resource List
je-analysis-1[1].4.0.jar
- Source package for the JE (极易, "JEasy") word segmenter; the latest version, easy to use, with bidirectional maximum-matching segmentation.
je-analysis-1[1].5.0
- Word segmentation package built on Lucene; includes many features and is simple and convenient to use.
je-analysis-1.4.0
- Java-based Chinese word segmentation system; just change the file extension to .jar to use it. No source code is provided yet.
je-analysis-1.5.3
- je-analysis-1.5.3, a Chinese word segmentation component for open-source full-text retrieval with Java Lucene.
lucene-1.4.3
- Java tokenization technology; it implements English tokenization only, but the algorithm is a classic (from Apache). A minimal usage sketch follows below.
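
As a quick illustration of the English-only analysis the entry above refers to, here is a minimal sketch that runs a sentence through Lucene's StandardAnalyzer using the pre-3.0 TokenStream API (a Token returned from next()). The field name "content" and the sample sentence are placeholders, not part of the original package.

    import java.io.StringReader;
    import org.apache.lucene.analysis.Token;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;

    public class EnglishTokenizeDemo {
        public static void main(String[] args) throws Exception {
            // StandardAnalyzer ships with Lucene and handles English text:
            // it lowercases, splits on punctuation/whitespace and drops stop words.
            StandardAnalyzer analyzer = new StandardAnalyzer();
            TokenStream stream = analyzer.tokenStream("content",
                    new StringReader("Lucene is a full-text search library from Apache."));

            // The pre-3.0 TokenStream API returns one Token per call to next().
            Token token;
            while ((token = stream.next()) != null) {
                System.out.println(token.termText());
            }
            stream.close();
        }
    }
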
lucenesegment
- Lucene Chinese word segmentation source code; useful material for building a search engine.
je-analysis1.40
- Chinese word segmentation system; provides a free interface for segmenting Chinese text.
lucene+mysql+eclipe
- Development example combining Lucene, MySQL, and Eclipse; implements result paging and Chinese word segmentation (see the paging sketch below). Contact QQ 276367673 to discuss.
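
The entry above describes paging over Lucene search results. A minimal sketch of that idea is shown below, assuming the Lucene 2.x Hits API and the JE-Analysis MMAnalyzer for Chinese queries; the index path, field names, and query text are placeholders, and the MySQL side of the original example is not covered here.

    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.queryParser.QueryParser;
    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;
    import jeasy.analysis.MMAnalyzer;

    public class PagedSearchDemo {
        public static void main(String[] args) throws Exception {
            int page = 1, pageSize = 10;          // which page to show, hits per page

            Analyzer analyzer = new MMAnalyzer(); // JE-Analysis Chinese analyzer
            IndexSearcher searcher = new IndexSearcher("/path/to/index"); // placeholder path
            Query query = new QueryParser("content", analyzer).parse("中文 分词");

            // Hits is the pre-3.0 lazy result collection; slice it for paging.
            Hits hits = searcher.search(query);
            int start = (page - 1) * pageSize;
            int end = Math.min(hits.length(), start + pageSize);
            for (int i = start; i < end; i++) {
                Document doc = hits.doc(i);
                System.out.println(doc.get("title"));  // assumes a stored "title" field
            }
            searcher.close();
        }
    }
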
luceneheritrixCDROM
- Source code from the companion CD-ROM of "Developing Your Own Search Engine: Lucene 2.0 + Heritrix".
Segment
- Word segmentation example using Lucene components together with the JE-Analysis 1.5.1 segmentation component (indexing sketch below).
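
For the combination described above, a minimal indexing sketch is shown below: it passes JE-Analysis' MMAnalyzer to a Lucene 2.x IndexWriter so Chinese content is segmented at index time. The index path, field name, and sample text are placeholders.

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;
    import jeasy.analysis.MMAnalyzer;

    public class JeIndexDemo {
        public static void main(String[] args) throws Exception {
            // Hand the JE-Analysis analyzer to IndexWriter so Chinese text is
            // segmented into words instead of being split character by character.
            IndexWriter writer = new IndexWriter("/path/to/index", new MMAnalyzer(), true);

            Document doc = new Document();
            // Field.Index.ANALYZED is the Lucene 2.4 name; older releases call it TOKENIZED.
            doc.add(new Field("content", "基于Lucene的中文全文检索示例",
                    Field.Store.YES, Field.Index.ANALYZED));
            writer.addDocument(doc);

            writer.optimize();
            writer.close();
        }
    }
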
Jena
- Jena inference engine, used to parse OWL documents and extract ontology concepts (sketch below).
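
A minimal sketch of the OWL-parsing use described above, assuming the classic com.hp.hpl.jena packages of Jena 2.x; the ontology file name is a placeholder.

    import com.hp.hpl.jena.ontology.OntClass;
    import com.hp.hpl.jena.ontology.OntModel;
    import com.hp.hpl.jena.rdf.model.ModelFactory;
    import com.hp.hpl.jena.util.iterator.ExtendedIterator;

    public class OwlConceptDemo {
        public static void main(String[] args) {
            // Load an OWL document into an ontology model with the default configuration.
            OntModel model = ModelFactory.createOntologyModel();
            model.read("file:ontology.owl");   // placeholder local OWL file

            // Walk the class list and print every named concept (ontology class).
            ExtendedIterator it = model.listClasses();
            while (it.hasNext()) {
                OntClass cls = (OntClass) it.next();
                if (!cls.isAnon()) {
                    System.out.println(cls.getURI());
                }
            }
        }
    }
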
je-analysis-1.5.3
- je-analysis-1.5.3.jar, the source package for version 1.5.3 of the JE Chinese word segmenter.
GoogleExtract
- This crawler automatically collects specified information from designated websites, stores it in a MySQL database, and can download it to disk; the default search query is "武汉大学" (Wuhan University).
je-analysis-1.5.3.jar
- The JE tokenizer, a Chinese word segmentation package for search engine development; a must-have for developers.
je-analysis-1.5.1
- The je-analysis-1.5.1.jar package.
lucene-core-2.4.1.jar je-analysis-1.5.3.jar
- Jar packages that, via import jeasy.analysis.MMAnalyzer, provide Chinese word segmentation of sentences (usage sketch below).
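
The import shown above suggests the usual JE-Analysis usage. Below is a minimal sketch assuming the segment(text, separator) convenience method that JE-Analysis examples commonly rely on; the sample sentence and separator are arbitrary.

    import jeasy.analysis.MMAnalyzer;

    public class SegmentSentenceDemo {
        public static void main(String[] args) throws Exception {
            // MMAnalyzer is JE-Analysis' bidirectional maximum-matching analyzer.
            MMAnalyzer analyzer = new MMAnalyzer();

            // segment(text, separator) returns the segmented sentence as one string,
            // with the given separator inserted between words.
            String result = analyzer.segment("基于Java的中文分词组件示例", " | ");
            System.out.println(result);
        }
    }
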