[SPARK-49609][PYTHON][TESTS][FOLLOW-UP] Skip Spark Connect tests if dependencies are not found
### What changes were proposed in this pull request?
This PR is a follow-up of https://github.com/apache/spark/pull/48085; it skips the compatibility tests when the Spark Connect dependencies are not installed.
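For context, the skip relies on the standard `unittest.skipIf` decorator combined with the `should_test_connect` flag and `connect_requirement_message` exposed by `pyspark.testing.connectutils`. Below is a minimal, self-contained sketch of the same guard pattern; the flag and message names here are hypothetical stand-ins, not the actual connectutils internals:

```python
import unittest

# Hypothetical stand-ins for should_test_connect / connect_requirement_message:
# the flag is False when an optional dependency (e.g. PyArrow) cannot be
# imported, and the message explains why the suite is skipped.
try:
    import pyarrow  # noqa: F401

    _requirement_message = None
except ImportError as e:
    _requirement_message = str(e)

_should_test = _requirement_message is None


@unittest.skipIf(not _should_test, _requirement_message)
class CompatibilityTests(unittest.TestCase):
    def test_something(self):
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```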
### Why are the changes needed?
To recover the PyPy3 build (https://github.com/apache/spark/actions/runs/11016544408/job/30592416115), which does not have PyArrow installed.
### Does this PR introduce _any_ user-facing change?
No, test-only.
### How was this patch tested?
Manually.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #48239 from HyukjinKwon/SPARK-49609-followup.
Authored-by: Hyukjin Kwon <gurwls223@apache.org>
Signed-off-by: Haejoon Lee <haejoon.lee@databricks.com>
diff --git a/python/pyspark/sql/tests/test_connect_compatibility.py b/python/pyspark/sql/tests/test_connect_compatibility.py
index ca1f828..8f3e86f 100644
--- a/python/pyspark/sql/tests/test_connect_compatibility.py
+++ b/python/pyspark/sql/tests/test_connect_compatibility.py
@@ -18,6 +18,7 @@
import unittest
import inspect
+from pyspark.testing.connectutils import should_test_connect, connect_requirement_message
from pyspark.testing.sqlutils import ReusedSQLTestCase
from pyspark.sql.classic.dataframe import DataFrame as ClassicDataFrame
from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
@@ -172,6 +173,7 @@
)
+@unittest.skipIf(not should_test_connect, connect_requirement_message)
class ConnectCompatibilityTests(ConnectCompatibilityTestsMixin, ReusedSQLTestCase):
pass
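For completeness, here is a small, hypothetical way to observe the effect of the decorator from Python. It is not part of the patch; it simply runs a guarded suite through unittest with the flag forced off, simulating an environment such as the PyPy3 build where PyArrow is unavailable, and checks that the tests are reported as skipped rather than failing:

```python
import io
import unittest

# Guard flag forced off to simulate missing Spark Connect dependencies;
# the names are illustrative, not the real connectutils values.
_should_test = False
_requirement_message = "pyarrow is not installed"


@unittest.skipIf(not _should_test, _requirement_message)
class GuardedTests(unittest.TestCase):
    def test_noop(self):
        self.assertTrue(True)


suite = unittest.defaultTestLoader.loadTestsFromTestCase(GuardedTests)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)

# With the guard active, the test is skipped instead of executed or failed,
# and the skip reason is the requirement message.
assert len(result.skipped) == 1
assert result.skipped[0][1] == "pyarrow is not installed"
```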