Chinese Word Segmentation in C#: ChineseAnalyzer


1. First, reference two DLL libraries: Lucene.Net.dll and Lucene.China.dll.

using System.Text;           // StringBuilder
using System.IO;             // StringReader
using Lucene.Net;
using Lucene.Net.Analysis;   // Analyzer, TokenStream, Token
using Lucene.China;          // ChineseAnalyzer

2. In addition, a data folder needs to be placed under C:\Program Files (x86)\Common Files\microsoft shared\DevServer\10.0 (the directory the ASP.NET Development Server runs from).

It contains three files:

(1) sDict.txt

(2) sDict.txt.bak

(3) sNoise.txt

These files supply the vocabulary the analyzer segments against: sDict.txt is the word dictionary and sNoise.txt is the noise (stop) word list. A quick check that the files are in place is sketched below.
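A minimal, self-contained sketch for verifying that the data folder and its three files are where the analyzer expects them; the path and file names come from step 2 above, everything else is illustrative:

using System;
using System.IO;

class DataFolderCheck
{
    static void Main()
    {
        // Path from step 2; adjust if your data folder lives elsewhere.
        string dataDir = @"C:\Program Files (x86)\Common Files\microsoft shared\DevServer\10.0\data";
        string[] required = { "sDict.txt", "sDict.txt.bak", "sNoise.txt" };

        foreach (string name in required)
        {
            string path = Path.Combine(dataDir, name);
            Console.WriteLine("{0}: {1}", path, File.Exists(path) ? "found" : "MISSING");
        }
    }
}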

Code example:

protected void Button1_Click(object sender, EventArgs e)
{
    StringBuilder sb = new StringBuilder();
    string t1 = "";
    int i = 0;
    Analyzer analyzer = new Lucene.China.ChineseAnalyzer();
    StringReader sr = new StringReader(TextBox1.Text);
    TokenStream stream = analyzer.TokenStream(null, sr);

    long begin = System.DateTime.Now.Ticks;
    Token t = stream.Next();
    while (t != null)
    {
        // ToString() returns "(term,start,end)", so strip the opening
        // parenthesis and keep only the term before the first comma.
        t1 = t.ToString();
        t1 = t1.Replace("(", "");
        char[] separator = { ',' };
        t1 = t1.Split(separator)[0];

        sb.Append(i + ":" + t1 + "\r\n");
        t = stream.Next();
        i++;
    }
    TextBox2.Text = sb.ToString();
    long end = System.DateTime.Now.Ticks;     // ticks are 100-nanosecond units
    int time = (int)((end - begin) / 10000);  // convert ticks to milliseconds

    TextBox2.Text += "Elapsed: " + time + " ms\r\n=====\r\n";
}
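Parsing Token.ToString() with Replace/Split works, but it depends on the "(term,start,end)" display format. If the Lucene.Net build you are using still exposes Token.TermText() (the older 1.9/2.x ports do), the loop can read the term directly; a hedged sketch of that variant:

// Variant of the token loop above, assuming Token.TermText() is available
// in your Lucene.Net build; otherwise keep the ToString() parsing shown earlier.
Token t = stream.Next();
while (t != null)
{
    sb.Append(i + ":" + t.TermText() + "\r\n");  // term text only, no offsets
    t = stream.Next();
    i++;
}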

Running result (screenshot omitted): TextBox2 lists each segmented term on its own numbered line, followed by the elapsed time in milliseconds.

Reposted from: https://www.cnblogs.com/hww9011/archive/2013/04/18/3029207.html
