LUCENE-3453: fix up MIGRATE.txt
git-svn-id: https://svn.apache.org/repos/asf/lucene/dev/branches/lucene3453@1231388 13f79535-47bb-0310-9956-ffa450edef68
diff --git a/lucene/MIGRATE.txt b/lucene/MIGRATE.txt
index 35b8b0f..fe47605 100644
--- a/lucene/MIGRATE.txt
+++ b/lucene/MIGRATE.txt
@@ -422,13 +422,13 @@
-* LUCENE-2308: Separate IndexableFieldType from Field instances
+* LUCENE-2308,LUCENE-3453: Separate IndexableFieldType from Field instances
With this change, the indexing details (indexed, tokenized, norms,
indexOptions, stored, etc.) are moved into a separate FieldType
instance (rather than being stored directly on the Field).
-This means you can create the IndexableFieldType instance once, up front,
+This means you can create the FieldType instance once, up front,
for a given field, and then re-use that instance whenever you instantiate
the Field.
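The create-once, re-use pattern described above can be sketched as follows (field names and the exact setter set are illustrative; the `Field(String, String, FieldType)` constructor is assumed from this branch's API):

```java
import org.apache.lucene.document.Field;
import org.apache.lucene.document.FieldType;

public class TitleFieldFactory {

  // Configure the FieldType once, up front:
  private static final FieldType TITLE_TYPE = new FieldType();
  static {
    TITLE_TYPE.setIndexed(true);
    TITLE_TYPE.setTokenized(true);
    TITLE_TYPE.setStored(true);
    TITLE_TYPE.freeze();  // prevent any further changes
  }

  // Re-use the same FieldType instance for every Field:
  public static Field newTitleField(String value) {
    return new Field("title", value, TITLE_TYPE);
  }
}
```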
@@ -439,15 +439,21 @@
IDS (does not index term frequency nor positions). This field
does not store its value, but exposes TYPE_STORED as well.
- * BinaryField: a byte[] value that's only stored.
-
* TextField: indexes and tokenizes a String, Reader or TokenStream
value, without term vectors. This field does not store its value,
but exposes TYPE_STORED as well.
+ * StoredField: a field that only stores its value.
+
+ * DocValuesField: indexes the value as a DocValues field
+
+ * NumericField: indexes the numeric value so that NumericRangeQuery
+ can be used at search time.
+
If your usage fits one of those common cases you can simply
-instantiate the above class. To use the TYPE_STORED variant, do this
-instead:
+instantiate the above class. If you need to store the value, you can
+add a separate StoredField to the document, or you can use
+TYPE_STORED for the field:
Field f = new Field("field", "value", StringField.TYPE_STORED);
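The separate-StoredField alternative mentioned above can be sketched like this (the two-argument `StringField` and `StoredField(String, String)` constructor forms are assumed from this branch's API):

```java
import org.apache.lucene.document.Document;
import org.apache.lucene.document.StoredField;
import org.apache.lucene.document.StringField;

public class StoredSeparately {

  public static Document build() {
    Document doc = new Document();
    // Index the value (this field does not store it):
    doc.add(new StringField("field", "value"));
    // Store the same value in a separate, stored-only field:
    doc.add(new StoredField("field", "value"));
    return doc;
  }
}
```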
@@ -465,9 +471,14 @@
t.setStored(true);
t.setOmitNorms(true);
t.setIndexOptions(IndexOptions.DOCS_AND_FREQS);
+ t.freeze();
FieldType has a freeze() method to prevent further changes.
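A sketch of the freeze() behavior (the assumption here, not stated above, is that a frozen FieldType rejects setters with an IllegalStateException):

```java
import org.apache.lucene.document.FieldType;
import org.apache.lucene.index.FieldInfo.IndexOptions;

public class FreezeDemo {

  public static void main(String[] args) {
    FieldType t = new FieldType();
    t.setIndexed(true);
    t.setStored(true);
    t.setOmitNorms(true);
    t.setIndexOptions(IndexOptions.DOCS_AND_FREQS);
    t.freeze();
    try {
      t.setStored(false);  // too late: the type is frozen
    } catch (IllegalStateException e) {
      System.out.println("FieldType is frozen: " + e.getMessage());
    }
  }
}
```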
+There is also a deprecated transition API that provides the same Index,
+Store and TermVector enums from 3.x, along with Field constructors
+taking these enums.
+
When migrating from the 3.x API, if you did this before:
new Field("field", "value", Field.Store.NO, Field.Index.NOT_ANALYZED_NO_NORMS)
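In the new API that combination (un-analyzed, no norms, not stored) maps naturally onto StringField; a sketch, assuming the two-argument constructor from this branch:

```java
import org.apache.lucene.document.Document;
import org.apache.lucene.document.StringField;

public class MigratedField {

  public static Document build() {
    Document doc = new Document();
    // Store.NO + Index.NOT_ANALYZED_NO_NORMS becomes a plain StringField:
    doc.add(new StringField("field", "value"));
    return doc;
  }
}
```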
@@ -528,7 +539,7 @@
you can now do this:
- new BinaryField("field", bytes)
+ new StoredField("field", bytes)
* LUCENE-3396: Analyzer.tokenStream() and .reusableTokenStream() have been made final.
It is now necessary to use Analyzer.TokenStreamComponents to define an analysis process.