Class ParallelCompositeReader
A CompositeReader which reads multiple, parallel indexes. Each index added must have the same number of documents, and exactly the same hierarchical subreader structure, but typically each contains different fields. Deletions are taken from the first reader. Each document contains the union of the fields of all documents with the same document number. When searching, matches for a query term are from the first index added that has the field.
This is useful, e.g., with collections that have large fields which change rarely and small fields that change more frequently. The smaller fields may be re-indexed in a new index and both indexes may be searched together.
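Below is a minimal sketch of that setup, assuming the standard Lucene.NET API surface (FSDirectory, DirectoryReader, IndexSearcher); the directory paths and field names are illustrative placeholders, not part of this class's API.

```csharp
using Lucene.Net.Index;
using Lucene.Net.Search;
using Lucene.Net.Store;

// Open the two parallel indexes: one holding the large, rarely changing
// fields and one holding the small, frequently re-indexed fields.
// Paths and field names are illustrative.
Directory largeDir = FSDirectory.Open("/data/index-large");
Directory smallDir = FSDirectory.Open("/data/index-small");

CompositeReader largeReader = DirectoryReader.Open(largeDir);
CompositeReader smallReader = DirectoryReader.Open(smallDir);

// The parallel reader exposes the union of the fields of both indexes and,
// with this constructor, disposes the sub-readers when it is disposed.
using (var reader = new ParallelCompositeReader(largeReader, smallReader))
{
    var searcher = new IndexSearcher(reader);

    // "title" might live in the large index and "price" in the small one;
    // both are searchable through the same reader.
    TopDocs hits = searcher.Search(new TermQuery(new Term("title", "lucene")), 10);
}
```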
Warning: It is up to you to make sure all indexes are created and modified the same way. For example, if you add documents to one index, you must add the same documents in the same order to the other indexes. Failure to do so will result in undefined behavior. A good strategy for creating suitable indexes with IndexWriter is to use LogDocMergePolicy, as it does not reorder documents during merging (unlike TieredMergePolicy) and triggers merges by the number of documents per segment. If you use different MergePolicy implementations, the segment structure of your indexes may no longer be predictable.
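A sketch of that strategy follows, assuming the standard Lucene.NET writer API; the paths and the StandardAnalyzer choice are illustrative placeholders.

```csharp
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Index;
using Lucene.Net.Store;
using Lucene.Net.Util;

// Build a writer whose merge policy keeps the segment structure predictable.
IndexWriter CreateParallelWriter(string path)
{
    var analyzer = new StandardAnalyzer(LuceneVersion.LUCENE_48);
    var config = new IndexWriterConfig(LuceneVersion.LUCENE_48, analyzer)
    {
        // LogDocMergePolicy merges by document count and does not reorder
        // documents, so identically written parallel indexes stay aligned.
        MergePolicy = new LogDocMergePolicy()
    };
    return new IndexWriter(FSDirectory.Open(path), config);
}

// Documents must be added to every parallel index in the same order.
using (IndexWriter largeWriter = CreateParallelWriter("/data/index-large"))
using (IndexWriter smallWriter = CreateParallelWriter("/data/index-small"))
{
    // largeWriter.AddDocument(...); smallWriter.AddDocument(...); etc.
}
```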
Assembly: DistributedLucene.Net.dll
Syntax
public class ParallelCompositeReader : BaseCompositeReader<IndexReader>, IIdentifiableSurrogate
Constructors
Name | Description |
---|---|
ParallelCompositeReader(CompositeReader[]) | Create a ParallelCompositeReader based on the provided readers; auto-disposes the given readers on Dispose(). |
ParallelCompositeReader(Boolean, CompositeReader[]) | Create a ParallelCompositeReader based on the provided readers; the Boolean flag controls whether the given readers are disposed when this reader is disposed (see the sketch after this table). |
ParallelCompositeReader(Boolean, CompositeReader[], CompositeReader[]) | Expert: create a ParallelCompositeReader based on the provided readers and stored-fields readers; when a document's stored fields are loaded, only the stored-fields readers are used. |
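As an illustration of the Boolean overload, the sketch below passes false so that the caller keeps ownership of the sub-readers; the readers and directories shown are assumed placeholders, not part of this class's API.

```csharp
using Lucene.Net.Index;
using Lucene.Net.Search;
using Lucene.Net.Store;

// reader1 and reader2 stand for any two already-open, parallel CompositeReaders;
// the directory paths are illustrative placeholders.
CompositeReader reader1 = DirectoryReader.Open(FSDirectory.Open("/data/index-a"));
CompositeReader reader2 = DirectoryReader.Open(FSDirectory.Open("/data/index-b"));

// Passing false: disposing the parallel reader leaves the sub-readers open,
// so the caller stays responsible for disposing them.
using (var parallel = new ParallelCompositeReader(false, reader1, reader2))
{
    var searcher = new IndexSearcher(parallel);
    // ... run queries against the combined fields ...
}

// Still valid here because the sub-readers were not auto-disposed.
reader1.Dispose();
reader2.Dispose();
```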
Methods
Name | Description |
---|---|
DoClose() | |