[SPARK-34093][ML] param maxDepth should check upper bound
### What changes were proposed in this pull request?
Update the `ParamValidators` of `maxDepth` to enforce an upper bound in addition to the existing lower bound.
### Why are the changes needed?
The current implementation of tree models only supports `maxDepth` <= 30, so larger values should be rejected up front rather than failing later during training.
### Does this PR introduce _any_ user-facing change?
Yes: if `maxDepth` > 30, fitting now fails fast at parameter validation.
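The new check uses an inclusive range on both ends. A minimal standalone sketch of that semantics (a hypothetical mirror of what `ParamValidators.inRange(0, 30)` checks with its default inclusive bounds, not the actual Spark code):

```python
def in_range(lower, upper):
    """Return a validator accepting values in the closed interval [lower, upper]."""
    def check(value):
        # Both endpoints are allowed, matching inRange's default inclusivity.
        return lower <= value <= upper
    return check

# Validator corresponding to the new maxDepth constraint.
valid_max_depth = in_range(0, 30)

print(valid_max_depth(30))  # True: the upper bound 30 is still allowed
print(valid_max_depth(31))  # False: rejected at param-set time, before training
```

With the old `ParamValidators.gtEq(0)` check, a value like 31 would only surface as an error deep inside training; the range validator rejects it immediately when the param is set.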
### How was this patch tested?
Existing test suites.
Closes #31163 from zhengruifeng/param_maxDepth_upbound.
Authored-by: Ruifeng Zheng <ruifengz@foxmail.com>
Signed-off-by: Sean Owen <srowen@gmail.com>
diff --git a/mllib/src/main/scala/org/apache/spark/ml/tree/treeParams.scala b/mllib/src/main/scala/org/apache/spark/ml/tree/treeParams.scala
index 19ea8ae..768e14f 100644
--- a/mllib/src/main/scala/org/apache/spark/ml/tree/treeParams.scala
+++ b/mllib/src/main/scala/org/apache/spark/ml/tree/treeParams.scala
@@ -60,8 +60,9 @@
*/
final val maxDepth: IntParam =
new IntParam(this, "maxDepth", "Maximum depth of the tree. (Nonnegative)" +
- " E.g., depth 0 means 1 leaf node; depth 1 means 1 internal node + 2 leaf nodes.",
- ParamValidators.gtEq(0))
+ " E.g., depth 0 means 1 leaf node; depth 1 means 1 internal node + 2 leaf nodes." +
+ " Must be in range [0, 30].",
+ ParamValidators.inRange(0, 30))
/**
* Maximum number of bins used for discretizing continuous features and for choosing how to split
diff --git a/python/pyspark/ml/tree.py b/python/pyspark/ml/tree.py
index dfb24a22..7ddeb09 100644
--- a/python/pyspark/ml/tree.py
+++ b/python/pyspark/ml/tree.py
@@ -67,7 +67,8 @@
typeConverter=TypeConverters.toString)
maxDepth = Param(Params._dummy(), "maxDepth", "Maximum depth of the tree. (>= 0) E.g., " +
- "depth 0 means 1 leaf node; depth 1 means 1 internal node + 2 leaf nodes.",
+ "depth 0 means 1 leaf node; depth 1 means 1 internal node + 2 leaf nodes. " +
+ "Must be in range [0, 30].",
typeConverter=TypeConverters.toInt)
maxBins = Param(Params._dummy(), "maxBins", "Max number of bins for discretizing continuous " +