Search Resource List
jieba
- 超级解霸 (STHVCD) V2.0 source code. Machines at the time were still 386s, so it had no encryption protection; it was very hard to get hold of.
jieba
- A Street Fighter game. The game is very realistic, and it is a highly playable mobile game.
jieba
- Street Fighter running on a mobile phone. It is decent, and it is also good for learning; take a look and study it.
jieba
- A J2ME game: Street Fighter running on a mobile phone.
Jieba
- Comments welcome. This is the source code of a common Street Fighter game; simple, but suitable for beginners. Please give credit when reposting; this upload is itself a repost.
The-Super-mathematical-Jieba-V1.0
- 龙风奇 Super Mathematical Jieba (解霸) V1.0, an original work by 龙风奇 Studio.
jieba
- jieba word segmentation software: an open-source Chinese segmentation package for Python. It ships with usage examples and is simple to use; a minimal usage sketch follows below.
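A minimal sketch of the basic usage described in the entry above, assuming jieba is installed (pip install jieba); the sample sentence is illustrative rather than taken from the package's bundled examples:

    # Precise-mode segmentation with jieba; the sentence is a made-up example.
    import jieba

    sentence = "结巴分词是一个开源的中文分词工具"
    words = jieba.lcut(sentence)      # lcut returns a list instead of a generator
    print("/".join(words))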
jieba分词
- A Java word segmentation package for jieba. jieba is normally distributed as a Python package; this one lets jieba segmentation be used from Java.
jieba for Python
- How to use jieba for word segmentation in Python.
jieba
- Splits sentences into small independent words to extract information, then checks them against a data dictionary to obtain the useful key terms for intelligently filtering questions or answering them; see the keyword-extraction sketch below.
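The entry above describes matching segmented words against a dictionary to pull out key terms. One way to approximate that step with jieba itself is its built-in TF-IDF keyword extraction; the question text and topK value below are assumptions made for illustration:

    # Extract candidate key terms from a question with jieba's TF-IDF helper.
    import jieba.analyse

    question = "请问明天北京的天气怎么样，会不会下雨？"
    keywords = jieba.analyse.extract_tags(question, topK=5)  # top TF-IDF terms
    print(keywords)  # terms that could then be looked up in a data dictionary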
jieba分词
- A jieba word segmentation package.
jieba.NET.0.38.2
- jieba segmentation, used to segment documents into words; the output can serve as material for text analysis.
Downloads
- The Java packages used by the jieba and ansj word segmenters.
jieba-jieba3k
- A MATLAB toolkit for jieba (结巴) word segmentation, used in many Chinese word-segmentation and pattern-recognition programs. Using the ready-made function toolkit improves productivity; installation instructions are included.
jieba
- Precise mode tries to cut the sentence as accurately as possible and is suited to text analysis. Full mode scans out every token in the sentence that could form a word; it is very fast but cannot resolve ambiguity. Search-engine mode builds on precise mode and re-segments long words to improve recall, which suits search-engine tokenization. The three modes are shown in the sketch below.
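A short sketch of the three modes just described; the sample sentence is illustrative:

    # Compare jieba's precise, full, and search-engine segmentation modes.
    import jieba

    s = "小明硕士毕业于中国科学院计算所"

    print("/".join(jieba.cut(s)))                # precise mode (default)
    print("/".join(jieba.cut(s, cut_all=True)))  # full mode: every possible word
    print("/".join(jieba.cut_for_search(s)))     # search-engine mode: re-splits long words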
jieba-0.38
- jieba segmentation, used in Python to segment Chinese text.
jieba
- Uses jieba for preliminary natural-language processing, including a custom stop-word list, a custom dictionary, and word-frequency counting, demonstrated with a practical example; a minimal sketch of these steps follows below.
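A minimal sketch of those preprocessing steps; the file names userdict.txt and stopwords.txt and the sample text are assumptions, not part of the uploaded package:

    # Custom dictionary, stop-word filtering, and word-frequency counting with jieba.
    from collections import Counter
    import jieba

    jieba.load_userdict("userdict.txt")                  # one custom word per line (assumed file)

    with open("stopwords.txt", encoding="utf-8") as f:   # assumed stop-word file
        stopwords = {line.strip() for line in f}

    text = "自然语言处理入门：用结巴分词统计词频，并演示停用词过滤。"
    tokens = [w for w in jieba.lcut(text) if w.strip() and w not in stopwords]
    print(Counter(tokens).most_common(10))               # ten most frequent remaining words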
chatbot
- Chatbot. Principle: strictly speaking, an "open-domain generative dialogue model based on deep learning". The framework is Keras (a high-level wrapper over TensorFlow); the approach is the mainstream LSTM (long short-term memory) variant of the RNN (recurrent neural network), combined with seq2seq (sequence-to-sequence) plus an attention mechanism. The word segmentation tool is jieba, the UI is Tkinter, and it is trained on the "青云" (Qingyun) corpus (100k+ chit-chat dialogues). Runtime environment: Python 3.6 or above, TensorFlow, pan…
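The chatbot entry only names its components, so as one illustration of where jieba sits in such a pipeline, the sketch below tokenizes question/answer pairs and builds the vocabulary a seq2seq model would consume. The dialogue pairs, special tokens, and helper names are assumptions, not the project's actual code:

    # jieba-based preprocessing for a seq2seq chatbot corpus (illustrative only).
    import jieba

    PAD, START, END, UNK = "<pad>", "<start>", "<end>", "<unk>"

    pairs = [
        ("你好", "你好，很高兴见到你"),
        ("今天天气怎么样", "今天天气不错"),
    ]

    def tokenize(sentence):
        # jieba precise-mode tokens, wrapped with markers for the decoder
        return [START] + jieba.lcut(sentence) + [END]

    tokenized = [(tokenize(q), tokenize(a)) for q, a in pairs]

    # Shared word -> index vocabulary for encoder and decoder
    vocab = {PAD: 0, UNK: 1}
    for q, a in tokenized:
        for w in q + a:
            vocab.setdefault(w, len(vocab))

    def encode(tokens, max_len=12):
        ids = [vocab.get(w, vocab[UNK]) for w in tokens][:max_len]
        return ids + [vocab[PAD]] * (max_len - len(ids))   # pad to a fixed length

    encoder_input = [encode(q) for q, _ in tokenized]
    decoder_input = [encode(a) for _, a in tokenized]
    print(encoder_input[0], decoder_input[0])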