[SPARK-42631][CONNECT][FOLLOW-UP] Expose Column.expr to extensions

### What changes were proposed in this pull request?
This PR is a follow-up to https://github.com/apache/spark/pull/40234, which makes it possible for extensions to create custom `Dataset`s and `Column`s. That PR exposes `Dataset.plan`, but unfortunately it does not expose `Column.expr`. As a result, extensions cannot build custom `Column`s that take a user-provided `Column` as input. This PR annotates `Column.expr` with `@DeveloperApi` so that extensions can read it.
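
For illustration, a minimal sketch of how an extension might use the newly exposed `Column.expr`. The object `MyExtension`, the helper `customFunctionExpr`, and the function name `my_custom_function` are hypothetical and not part of this PR; only the read of `input.expr` relies on the change here.

```scala
import org.apache.spark.connect.proto
import org.apache.spark.sql.Column

object MyExtension {
  // Hypothetical helper: builds a proto expression that invokes a made-up
  // function "my_custom_function" with the user-provided Column as argument.
  // Reading `input.expr` is what this PR makes possible for extensions.
  def customFunctionExpr(input: Column): proto.Expression = {
    proto.Expression
      .newBuilder()
      .setUnresolvedFunction(
        proto.Expression.UnresolvedFunction
          .newBuilder()
          .setFunctionName("my_custom_function")
          .addArguments(input.expr))
      .build()
  }
}
```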

### Why are the changes needed?
See above: without access to `Column.expr`, extensions cannot compose custom expressions from user-provided `Column`s.

### Does this PR introduce _any_ user-facing change?
No. This change only affects a Developer API.

### How was this patch tested?
Existing tests, to make sure nothing breaks.

Closes #40590 from tomvanbussel/SPARK-42631.

Authored-by: Tom van Bussel <tom.vanbussel@databricks.com>
Signed-off-by: Herman van Hovell <herman@databricks.com>
(cherry picked from commit c3716c4ec68c2dea07e8cd896d79bd7175517a31)
Signed-off-by: Herman van Hovell <herman@databricks.com>
diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Column.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Column.scala
index 4212747..6a660a7 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Column.scala
+++ b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Column.scala
@@ -51,7 +51,7 @@
  *
  * @since 3.4.0
  */
-class Column private[sql] (private[sql] val expr: proto.Expression) extends Logging {
+class Column private[sql] (@DeveloperApi val expr: proto.Expression) extends Logging {
 
   private[sql] def this(name: String, planId: Option[Long]) =
     this(Column.nameToExpression(name, planId))