Class LimitTokenCountFilter
This TokenFilter limits the number of tokens while indexing. It is a replacement for the maximum field length setting inside IndexWriter.
By default, this filter ignores any tokens in the wrapped TokenStream once the limit has been reached, which can result in Reset() being called prior to IncrementToken() returning false. For most TokenStream implementations this should be acceptable, and faster than consuming the full stream. If you are wrapping a TokenStream which requires that the full stream of tokens be exhausted in order to function properly, use the LimitTokenCountFilter(TokenStream, Int32, Boolean) consumeAllTokens option.
Assembly: Lucene.Net.Analysis.Common.dll
Syntax
[Serializable]
public sealed class LimitTokenCountFilter : TokenFilter, IDisposable
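Examples
A minimal sketch of using this filter inside a custom Analyzer. The LimitingAnalyzer class, the StandardTokenizer source, and the cap of 100 tokens are illustrative assumptions, not part of this API; namespaces follow Lucene.NET 4.8.
```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Miscellaneous;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Util;

// Illustrative analyzer that indexes at most 100 tokens per field.
public sealed class LimitingAnalyzer : Analyzer
{
    protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
    {
        var tokenizer = new StandardTokenizer(LuceneVersion.LUCENE_48, reader);
        // Stop emitting tokens after the first 100; by default the rest of
        // the wrapped stream is left unconsumed.
        TokenStream stream = new LimitTokenCountFilter(tokenizer, 100);
        return new TokenStreamComponents(tokenizer, stream);
    }
}
```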
Constructors
Name | Description |
---|---|
LimitTokenCountFilter(TokenStream, Int32) | Build a filter that only accepts tokens up to a maximum number. This filter will not consume any tokens beyond the maxTokenCount limit. |
LimitTokenCountFilter(TokenStream, Int32, Boolean) | Build a filter that limits the maximum number of tokens per field. |
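A hedged sketch of the consumeAllTokens option taken by the second constructor; the CapToFifty helper, its source parameter, and the limit of 50 are placeholders, not part of this API.
```csharp
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Miscellaneous;

static class TokenStreamCaps
{
    // Hypothetical helper: cap 'source' at 50 tokens but still read it to the
    // end, for wrapped streams that must be fully exhausted to work correctly.
    public static TokenStream CapToFifty(TokenStream source) =>
        new LimitTokenCountFilter(source, 50, consumeAllTokens: true);
}
```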
Methods
Name | Description |
---|---|
IncrementToken() | |
Reset() | |